WorldWideScience

Sample records for automatic differentiation tools

  1. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography that originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
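As a concrete illustration of the chain-rule propagation the bibliography refers to, forward-mode AD can be written as arithmetic on (value, derivative) pairs. This is a minimal sketch of the general idea, not taken from any work in the bibliography:

```python
import math

# Forward-mode AD sketch: propagate (value, derivative) pairs through
# each elementary operation, applying the chain rule at every step.
def ad_exp(p):
    v, d = p
    return (math.exp(v), math.exp(v) * d)

def ad_sin(p):
    v, d = p
    return (math.sin(v), math.cos(v) * d)

def ad_mul(p, q):
    (u, du), (v, dv) = p, q
    return (u * v, du * v + u * dv)  # product rule

# f(x) = sin(x) * exp(x); seed dx/dx = 1 at x = 0.5
x = (0.5, 1.0)
val, deriv = ad_mul(ad_sin(x), ad_exp(x))

# Agrees with the hand-derived f'(x) = (sin x + cos x) e^x
expected = (math.sin(0.5) + math.cos(0.5)) * math.exp(0.5)
assert abs(deriv - expected) < 1e-12
```

The derivative is exact to roundoff, with no truncation error, which is the "neither symbolic nor numeric" point made above.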

  2. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

Automatic differentiation is a relatively recent technique for differentiating functions, applied directly to the source code, written in standard programming languages, that computes the function. The technique automates the differentiation step, which is crucial for the dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of automatic differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code, and the incorporation of AD tools into consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.

  3. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are computed automatically. The method is illustrated by simple examples, and source code in FORTRAN is provided.
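The "derivatives to any order" idea can be sketched with truncated Taylor-series arithmetic; the Python below is an illustration of the technique only, not the FORTRAN source the record refers to:

```python
import math

# Truncated Taylor arithmetic: carry coefficients [f, f', f''/2!, ...]
# of each quantity; the k-th derivative is k! times the k-th coefficient.
N = 5  # coefficients up to order 4

def t_var(x0):
    # Taylor coefficients of the independent variable x at x0
    return [x0, 1.0] + [0.0] * (N - 2)

def t_add(a, b):
    return [ai + bi for ai, bi in zip(a, b)]

def t_mul(a, b):
    # Cauchy product of truncated series
    return [sum(a[i] * b[k - i] for i in range(k + 1)) for k in range(N)]

# f(x) = x^3 + x at x0 = 2.0
x = t_var(2.0)
f = t_add(t_mul(t_mul(x, x), x), x)

derivs = [math.factorial(k) * f[k] for k in range(N)]
# f(2) = 10, f'(2) = 13, f''(2) = 12, f'''(2) = 6, f''''(2) = 0
assert derivs == [10.0, 13.0, 12.0, 6.0, 0.0]
```

Because only arithmetic on coefficients is involved, the derivatives at the chosen point carry no truncation error, exactly as the record states.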

  4. AUTO_DERIV: Tool for automatic differentiation of a Fortran code

    Science.gov (United States)

    Stamatiadis, S.; Farantos, S. C.

    2010-10-01

Restrictions: The restrictions of available memory and the capabilities of the compiler are the same as in the original version. Additional comments: The program has been tested using the following compilers: Intel ifort, GNU gfortran, NAGWare f95, g95. Running time: The typical running time for the program depends on the compiler and the complexity of the differentiated function. A rough estimate is that AUTO_DERIV is ten times slower than the evaluation of the analytical ('by hand') function value and derivatives (if they are available). References: S. Stamatiadis, R. Prosmiti, S.C. Farantos, AUTO_DERIV: tool for automatic differentiation of a Fortran code, Comput. Phys. Comm. 127 (2000) 343.

  5. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    Science.gov (United States)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to the model parameters. Typically, the sensitivity computation involves differentiating the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each tool to handle modern Fortran 90-95 elements such as structures and pointers, which either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator-overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, source transformation tools appear to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled, although they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the
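The adjoint (reverse-mode) differentiation that all four tools generate can be sketched in a few lines. This is a toy tape-based illustration of the mechanism, not output from any of the compared tools:

```python
import math

# Reverse-mode (adjoint) sketch: record each operation and its local
# partial derivatives on a tape during the forward sweep, then sweep the
# tape backwards, accumulating adjoints via the chain rule.
tape = []    # entries: (output_index, [(input_index, local_partial), ...])
values = []  # values of all intermediates

def new_var(v, deps=()):
    values.append(v)
    idx = len(values) - 1
    tape.append((idx, list(deps)))
    return idx

def mul(a, b):
    return new_var(values[a] * values[b], [(a, values[b]), (b, values[a])])

def sin(a):
    return new_var(math.sin(values[a]), [(a, math.cos(values[a]))])

def grad(out, n_inputs):
    # One backward sweep yields the gradient w.r.t. all inputs at once,
    # which is why adjoints suit high-dimensional assimilation problems.
    adj = [0.0] * len(values)
    adj[out] = 1.0
    for idx, deps in reversed(tape):
        for i, p in deps:
            adj[i] += adj[idx] * p
    return adj[:n_inputs]  # inputs were created first

# f(x, y) = sin(x) * (x * y) at (1.0, 2.0)
x, y = new_var(1.0), new_var(2.0)
f = mul(sin(x), mul(x, y))
gx, gy = grad(f, 2)
# df/dx = cos(x)*x*y + sin(x)*y,  df/dy = sin(x)*x
```

Source transformation tools emit code equivalent to this backward sweep at compile time, while operator-overloading tools build the tape at run time, which is the efficiency trade-off the study quantifies.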

  6. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

Statistical learning has been attracting more and more interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: automatic differentiation tools, which quickly build DAGs of computation that are fully differentiable (we shall focus on one such tool, PyTorch); easy deployment of trained neural networks into large systems with many constraints, for example deploying a model at the reconstruction phase, where the neural network has to be integrated into CERN's bulk data-processing C++-only environment; and some recent models in deep learning for segmentation and generation that might be useful for particle-physics problems.

  7. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms.

  8. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

Software produces cutter-location files for numerically-controlled machine tools. APT, an acronym for Automatically Programmed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programming NC machine tools. The APT system includes a specification of the APT programming language and a language processor, which executes APT statements and generates the NC machine-tool motions they specify.

  9. Automatic Differentiation and its Program Realization

    Czech Academy of Sciences Publication Activity Database

    Hartman, J.; Lukšan, Ladislav; Zítko, J.

    2009-01-01

Vol. 45, No. 5 (2009), pp. 865-883 ISSN 0023-5954 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords: automatic differentiation * modeling languages * systems of optimization Subject RIV: BA - General Mathematics Impact factor: 0.445, year: 2009 http://dml.cz/handle/10338.dmlcz/140037

  10. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

and is demonstrated on two separate, previously published types of problems in topology optimization. Two separate software packages for automatic differentiation, CoDiPack and Tapenade, are considered, and their performance and usability trade-offs are discussed and compared to a hand-coded adjoint gradient...

  11. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus of the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. The reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  12. TMB: Automatic differentiation and laplace approximation

    DEFF Research Database (Denmark)

    Kristensen, Kasper; Nielsen, Anders; Berg, Casper Willestofte

    2016-01-01

The user defines the joint likelihood for the data and the random effects as a C++ template function, while all the other operations are done in R, e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood, where the random effects are automatically integrated out. This approximation, and its derivatives, are obtained using automatic differentiation (up to order three) of the joint likelihood. The computations are designed to be fast for problems with many random effects (approximately 10^6) and parameters (approximately 10^3). Computation times using ADMB and TMB are compared on a suite of examples ranging from simple models to large spatial models where the random effects are a Gaussian random field. Speedups ranging from 1.5 to about 100 are obtained, with increasing gains for large problems.
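The Laplace approximation at the heart of TMB can be illustrated on a toy Gaussian random-effects model. The model below is my own example, chosen because the marginal likelihood is known in closed form and the Laplace approximation happens to be exact for it:

```python
import math

# Toy model (assumed for illustration): y | u ~ N(u, sigma2), u ~ N(0, tau2),
# so the exact marginal is y ~ N(0, sigma2 + tau2). Laplace integrates the
# random effect u out by a second-order expansion around the mode of the
# joint log-likelihood.
y, sigma2, tau2 = 1.3, 0.5, 2.0

def joint_loglik(u):
    return (-0.5 * math.log(2 * math.pi * sigma2) - (y - u) ** 2 / (2 * sigma2)
            - 0.5 * math.log(2 * math.pi * tau2) - u ** 2 / (2 * tau2))

# Inner problem: maximize over u (Newton steps on finite-difference
# gradient/Hessian; TMB would use AD derivatives here instead).
u = 0.0
for _ in range(50):
    h = 1e-5
    g = (joint_loglik(u + h) - joint_loglik(u - h)) / (2 * h)
    H = (joint_loglik(u + h) - 2 * joint_loglik(u) + joint_loglik(u - h)) / h**2
    u -= g / H

# Laplace: log L(y) ~ joint at mode + 0.5*log(2*pi) - 0.5*log(-H)
H = (joint_loglik(u + 1e-4) - 2 * joint_loglik(u) + joint_loglik(u - 1e-4)) / 1e-8
laplace = joint_loglik(u) + 0.5 * math.log(2 * math.pi) - 0.5 * math.log(-H)

exact = -0.5 * math.log(2 * math.pi * (sigma2 + tau2)) - y**2 / (2 * (sigma2 + tau2))
assert abs(laplace - exact) < 1e-4
```

In TMB the inner optimization and the Hessian come from automatic differentiation of the user's C++ template, which is what makes the approach fast for 10^6 random effects.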

  13. Automatic Tool for Local Assembly Structures

    Energy Technology Data Exchange (ETDEWEB)

    2016-10-11

Whole-community shotgun sequencing of total DNA (i.e. metagenomics) and total RNA (i.e. metatranscriptomics) has provided a wealth of information on microbial community structure, predicted functions, and metabolic networks, and can even reconstruct complete genomes directly. Here we present ATLAS (Automatic Tool for Local Assembly Structures), a comprehensive pipeline for the assembly, annotation, and genomic binning of metagenomic and metatranscriptomic data with an integrated framework for Multi-Omics. This will provide an open-source tool for the Multi-Omic community at large.

  14. Automatic Search for Differential Trails in ARX Ciphers

    OpenAIRE

    Biryukov, Alex; Velichkov, Vesselin

    2014-01-01

    We propose a tool for automatic search for differential trails in ARX ciphers. By introducing the concept of a partial difference distribution table (pDDT) we extend Matsui's algorithm, originally proposed for DES-like ciphers, to the class of ARX ciphers. To the best of our knowledge this is the first application of Matsui's algorithm to ciphers that do not have S-boxes. The tool is applied to the block ciphers TEA, XTEA, SPECK and RAIDEN. For RAIDEN we find an iterative characteristic on al...
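The pDDT idea can be sketched for 4-bit modular addition, the keyless ARX component that replaces an S-box. This is a toy illustration only; the paper works with full cipher word sizes and a Matsui-style search over trails:

```python
# Partial difference distribution table (pDDT) sketch for 4-bit modular
# addition: exhaustively count, for each pair of input XOR differences
# (da, db), how often each output difference dc occurs, and keep only
# entries whose differential probability meets a threshold.
N_BITS, MASK = 4, 0xF
THRESH = 0.25

pddt = {}
for da in range(16):
    for db in range(16):
        counts = {}
        for a in range(16):
            for b in range(16):
                dc = ((a + b) & MASK) ^ (((a ^ da) + (b ^ db)) & MASK)
                counts[dc] = counts.get(dc, 0) + 1
        for dc, c in counts.items():
            p = c / 256
            if p >= THRESH:
                pddt[(da, db, dc)] = p

# Sanity checks: the zero differential holds with probability 1, and
# flipping only the most significant bit of both inputs never changes
# the sum modulo 2^4, so (8, 8, 0) also has probability 1.
assert pddt[(0, 0, 0)] == 1.0
assert pddt[(8, 8, 0)] == 1.0
```

Keeping only high-probability entries is what makes the table "partial": the full DDT of n-bit addition has 2^{3n} entries and cannot be stored for realistic word sizes.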

  15. PASTEC: an automatic transposable element classification tool.

    Science.gov (United States)

    Hoede, Claire; Arnoux, Sandie; Moisset, Mark; Chaumier, Timothée; Inizan, Olivier; Jamilloux, Véronique; Quesneville, Hadi

    2014-01-01

The classification of transposable elements (TEs) is a key step towards deciphering their potential impact on the genome. However, this process is often based on manual sequence inspection by TE experts. With the wealth of genomic sequences now available, this task requires automation, making it accessible to most scientists. We propose a new tool, PASTEC, which classifies TEs by searching for structural features and similarities. This tool outperforms currently available software for TE classification. The main innovation of PASTEC is the search for HMM profiles, which is useful for inferring the classification of unknown TEs on the basis of conserved functional domains of the proteins. In addition, PASTEC is the only tool providing an exhaustive spectrum of possible classifications to the order level of the Wicker hierarchical TE classification system. It can also automatically classify other repeated elements, such as SSRs (Simple Sequence Repeats), rDNA or potential repeated host genes. Finally, the output of this new tool is designed to facilitate manual curation by providing biologists with all the evidence accumulated for each TE consensus. PASTEC is available as a REPET module or as standalone software (http://urgi.versailles.inra.fr/download/repet/REPET_linux-x64-2.2.tar.gz). It requires a Unix-like system. There are two standalone versions, one of which is parallelized (requiring Sun Grid Engine or Torque) and the other of which is not.

  16. PASTEC: an automatic transposable element classification tool.

    Directory of Open Access Journals (Sweden)

    Claire Hoede

SUMMARY: The classification of transposable elements (TEs) is a key step towards deciphering their potential impact on the genome. However, this process is often based on manual sequence inspection by TE experts. With the wealth of genomic sequences now available, this task requires automation, making it accessible to most scientists. We propose a new tool, PASTEC, which classifies TEs by searching for structural features and similarities. This tool outperforms currently available software for TE classification. The main innovation of PASTEC is the search for HMM profiles, which is useful for inferring the classification of unknown TEs on the basis of conserved functional domains of the proteins. In addition, PASTEC is the only tool providing an exhaustive spectrum of possible classifications to the order level of the Wicker hierarchical TE classification system. It can also automatically classify other repeated elements, such as SSRs (Simple Sequence Repeats), rDNA or potential repeated host genes. Finally, the output of this new tool is designed to facilitate manual curation by providing biologists with all the evidence accumulated for each TE consensus. AVAILABILITY: PASTEC is available as a REPET module or as standalone software (http://urgi.versailles.inra.fr/download/repet/REPET_linux-x64-2.2.tar.gz). It requires a Unix-like system. There are two standalone versions, one of which is parallelized (requiring Sun Grid Engine or Torque) and the other of which is not.

  17. Automatic differentiation for gradient-based optimization of radiatively heated microelectronics manufacturing equipment

    Energy Technology Data Exchange (ETDEWEB)

    Moen, C.D.; Spence, P.A.; Meza, J.C.; Plantenga, T.D.

    1996-12-31

    Automatic differentiation is applied to the optimal design of microelectronic manufacturing equipment. The performance of nonlinear, least-squares optimization methods is compared between numerical and analytical gradient approaches. The optimization calculations are performed by running large finite-element codes in an object-oriented optimization environment. The Adifor automatic differentiation tool is used to generate analytic derivatives for the finite-element codes. The performance results support previous observations that automatic differentiation becomes beneficial as the number of optimization parameters increases. The increase in speed, relative to numerical differences, has a limited value and results are reported for two different analysis codes.

  18. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, known as automatic clustering. This study aims to acquire the cluster number automatically using forced-strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm, and the result is compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than, or competitive with, other existing automatic clustering algorithms.
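For reference, the classic DE/rand/1/bin scheme that AC-FSDE builds on can be sketched as follows. This is a generic illustration on a simple minimization problem; the AC-FSDE mutation parameters and its cluster-number encoding are not reproduced here:

```python
import random

# Classic differential evolution (DE/rand/1/bin): mutate with
# v = a + F*(b - c), apply binomial crossover, keep the trial vector
# if it is no worse than the parent. Minimizes a 2-D sphere function.
random.seed(1)
F, CR, NP, DIM = 0.8, 0.9, 20, 2

def fitness(x):
    return sum(xi * xi for xi in x)

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(NP)]
for _ in range(200):
    for i in range(NP):
        # three distinct population members, none equal to the target i
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(DIM)]
        trial = [mutant[d] if random.random() < CR else pop[i][d]
                 for d in range(DIM)]
        if fitness(trial) <= fitness(pop[i]):  # greedy selection
            pop[i] = trial

best = min(pop, key=fitness)
```

Automatic clustering variants replace the sphere function with a cluster-validity index and encode candidate cluster counts and centroids in each population vector.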

  19. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
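The source-code-transformation approach can be illustrated by the kind of derivative function such a tool emits. The generated code below is hand-written for this sketch and is not Tangent's actual output:

```python
# Input function, written in a simple subset of Python.
def f(x):
    y = x * x
    z = y + 3.0 * x
    return z

# An SCT tool emits a new function that mirrors f statement by
# statement, carrying a derivative alongside each intermediate.
def df(x):
    y = x * x
    dy = 2.0 * x      # d(x*x)/dx
    z = y + 3.0 * x
    dz = dy + 3.0     # d(y + 3x)/dx
    return dz

assert f(2.0) == 10.0
assert df(2.0) == 7.0   # f'(x) = 2x + 3
```

Because the derivative exists as ordinary readable source code, it can be inspected, optimized, and differentiated again, which is the advantage SCT has over purely tracing-based approaches.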

  20. Operator overloading as an enabling technology for automatic differentiation

    International Nuclear Information System (INIS)

    Corliss, G.F.; Griewank, A.

    1993-01-01

We present an example of the science that is enabled by object-oriented programming techniques. Scientific computation often needs derivatives for solving nonlinear systems such as those arising in many PDE algorithms, optimization, parameter identification, stiff ordinary differential equations, or sensitivity analysis. Automatic differentiation computes derivatives accurately and efficiently by applying the chain rule to each arithmetic operation or elementary function. Operator overloading enables the techniques of either the forward or the reverse mode of automatic differentiation to be applied to real-world scientific problems. We illustrate automatic differentiation with an example drawn from a model of unsaturated flow in a porous medium. The problem arises from planning for the long-term storage of radioactive waste.
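The operator-overloading mechanism can be sketched with a minimal dual-number class. This is an illustration of the enabling idea only; the paper's porous-medium model is not reproduced, and `residual` below is a hypothetical stand-in for unmodified application code:

```python
# Operator overloading: a Dual number type whose arithmetic operators
# apply the chain rule, so numeric code written for plain floats
# computes derivatives when called with Dual arguments.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)  # product rule

    __radd__, __rmul__ = __add__, __mul__

def residual(k):
    # Hypothetical model code, written with no knowledge of AD
    return k * k * k + 2.0 * k + 1.0

x = Dual(3.0, 1.0)   # seed dx/dx = 1
out = residual(x)
# residual(3) = 34; residual'(k) = 3k^2 + 2, so 29 at k = 3
assert out.val == 34.0 and out.der == 29.0
```

This is precisely why object orientation is the "enabling technology": the model source is reused unchanged, with only the numeric type swapped.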

  1. Applications of automatic differentiation in computational fluid dynamics

    Science.gov (United States)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs, including a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.

  2. Automatic Calibration Of Manual Machine Tools

    Science.gov (United States)

    Gurney, Rex D.

    1990-01-01

    Modified scheme uses data from multiple positions and eliminates tedious positioning. Modification of computer program adapts calibration system for convenient use with manually-controlled machine tools. Developed for use on computer-controlled tools. Option added to calibration program allows data on random tool-axis positions to be entered manually into computer for reduction. Instead of setting axis to predetermined positions, operator merely sets it at variety of arbitrary positions.

  3. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  4. A new design of automatic vertical drilling tool

    Directory of Open Access Journals (Sweden)

    Yanfeng Ma

    2015-09-01

In order to effectively improve penetration rates and enhance wellbore quality in vertical wells, a new Automatic Vertical Drilling Tool (AVDT) based on an Eccentric Braced Structure (EBS) is designed. Applying the operating principle of rotary steerable drilling, the AVDT adds an offset-gravity-block automatic inclination-sensing mechanism. When the hole deviates, the tool uses the eccentric moment produced by the gravity of the offset gravity block to control the bearing of the guide force, so that well straightening is achieved. The nominal size of the AVDT is 215.9 mm; the sizes of the other major components, including the offset angle of the EBS, are worked out from theoretical analysis. This paper introduces the structure, operating principle, and theoretical analysis of the AVDT and describes the parameter settings of its key components.

  5. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

In this presentation the automated material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; the tools for integral material and radioactivity flow are among the basic tools for computer optimisation of decommissioning waste processing; all the calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega is an open modular system that can be improved; and improvement of the module for optimisation of decommissioning waste processing will be performed in the frame of improving material procedures and scenarios.

  6. CASAS: A tool for composing automatically and semantically astrophysical services

    Science.gov (United States)

    Louge, T.; Karray, M. H.; Archimède, B.; Knödlseder, J.

    2017-07-01

Multiple astronomical datasets are available through the internet and the astrophysical Distributed Computing Infrastructure (DCI) called the Virtual Observatory (VO). Some scientific workflow technologies exist for retrieving and combining data from those sources. However, the selection of relevant services, the automation of workflow composition, and the lack of user-friendly platforms remain concerns. This paper presents CASAS, a tool for semantic web-service composition in astrophysics. This tool proposes automatic composition of astrophysical web services and brings a semantics-based, automatic composition of workflows. It widens the choice of services and eases the use of heterogeneous services. Semantic web-service composition relies on ontologies for elaborating the composition; this work is based on the Astrophysical Services ONtology (ASON). ASON has its structure mostly inherited from the capacities of the VO services. Nevertheless, our approach is not limited to the VO and brings VO and non-VO services together without the need for premade recipes. CASAS is available for use through a simple web interface.

  7. Developing an Automatic Generation Tool for Cryptographic Pairing Functions

    OpenAIRE

    Dominguez Perez, Luis Julian

    2011-01-01

    Pairing-Based Cryptography is receiving steadily more attention from industry, mainly because of the increasing interest in Identity-Based protocols. Although there are plenty of applications, efficiently implementing the pairing functions is often difficult as it requires more knowledge than previous cryptographic primitives. The author presents a tool for automatically generating optimized code for the pairing functions which can be used in the construction of such cryptograp...

  8. Automatic differential analysis of NMR experiments in complex samples.

    Science.gov (United States)

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2017-11-20

Liquid-state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species. Such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal-processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but it can also be applied in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Post-convergence automatic differentiation of iterative schemes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1997-01-01

A new approach for performing automatic differentiation (AD) of computer codes that embody an iterative procedure, based on differentiating a single additional iteration upon achieving convergence, is described and implemented. This post-convergence automatic differentiation (PAD) technique results in better accuracy of the computed derivatives, as it eliminates part of the derivatives' convergence error, and a large reduction in execution time, especially when many iterations are required to achieve convergence. In addition, it provides a way to compute derivatives of the converged solution without having to repeat the entire iterative process every time new parameters are considered. These advantages are demonstrated and the PAD technique is validated via a set of three linear and nonlinear codes used to solve neutron transport and fluid flow problems. The PAD technique reduces the execution time over direct AD by a factor of up to 30 and improves the accuracy of the derivatives by up to two orders of magnitude. The PAD technique's biggest disadvantage lies in the necessity of computing the iterative map's Jacobian, which for large problems can be prohibitive. Methods are discussed to alleviate this difficulty.
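The post-convergence idea can be illustrated on a toy fixed-point problem. Note that this sketch iterates the differentiated map to its own fixed point at the converged solution, whereas PAD as described above differentiates a single additional iteration; both avoid dragging derivative computations through the nonlinear iterations:

```python
import math

# Toy fixed point x = g(x, a) = cos(a*x): first iterate the plain map
# to convergence, then apply the differentiated map afterwards.
a = 0.9

# Plain fixed-point iteration, no derivative bookkeeping.
x = 1.0
for _ in range(200):
    x = math.cos(a * x)

# Differentiated map evaluated at the converged x:
# d_{k+1} = g_x * d_k + g_a, with g(x, a) = cos(a*x).
g_x = -math.sin(a * x) * a   # partial of g w.r.t. x
g_a = -math.sin(a * x) * x   # partial of g w.r.t. a
d = 0.0
for _ in range(200):
    d = g_x * d + g_a        # converges since |g_x| < 1

# Implicit-function-theorem reference: dx/da = g_a / (1 - g_x)
assert abs(d - g_a / (1 - g_x)) < 1e-12
```

The derivative recurrence is linear and cheap compared with re-running the nonlinear solve, which is where PAD's reported speedups over direct AD come from.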

  10. Parallel computation of automatic differentiation applied to magnetic field calculations

    International Nuclear Information System (INIS)

    Hinkins, R.L.; Lawrence Berkeley Lab., CA

    1994-09-01

The author presents a parallelization of an accelerator physics application to simulate magnetic fields in three dimensions. The problem involves the evaluation of high-order derivatives with respect to two variables of a multivariate function. Automatic differentiation software had been used with some success, but the computation time was prohibitive. The implementation runs on several platforms, including a network of workstations using PVM, a MasPar using MPFortran, and a CM-5 using CMFortran. A careful examination of the code led to several optimizations that improved its serial performance by a factor of 8.7. The parallelization produced further improvements, especially on the MasPar, with a speedup factor of 620. As a result, a problem that took six days on a SPARC 10/41 now runs in minutes on the MasPar, making it feasible for physicists at Lawrence Berkeley Laboratory to simulate larger magnets.

  11. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the movement of the spindle unit is among the most frequent and important actions in parts such as the automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  12. A Domain Specific Embedded Language in C++ for Automatic Differentiation, Projection, Integration and Variational Formulations

    Directory of Open Access Journals (Sweden)

    Christophe Prud'homme

    2006-01-01

In this article, we present a domain-specific embedded language in C++ that can be used in various contexts such as numerical projection onto a functional space, numerical integration, variational formulations and automatic differentiation. Albeit these tools operate in different ways, the language overcomes this difficulty by decoupling expression construction from evaluation. The language is implemented using expression templates and meta-programming techniques and uses various Boost libraries. The language is exercised on a number of non-trivial examples, and a benchmark presents the performance behavior on a few test problems.

  13. AUTOMATIC TOOL-CHANGING WITHIN THE RECONFIGURABLE MANUFACTURING SYSTEMS PARADIGM

    Directory of Open Access Journals (Sweden)

    J.E.T. Collins

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer requirements present in today’s global market. The systems are designed to offer adaptability in machining functions and processes. This adaptive capability requires access to a selection of tools. The development of reconfigurable manufacturing systems has mainly been focused on the machine tools themselves. Methods of supplying tools to these machines need to be researched. This paper does so, presenting a tool-changing unit that offers a solution to this need. It then discusses the enabling technologies that would allow for automatic integration and diagnostic abilities of the unit.

    AFRIKAANSE OPSOMMING (translated from Afrikaans): Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer needs of today's global market. The systems were developed to offer adaptability with respect to machining functions and processes. These adaptive capabilities, however, require access to a variety of tools. The development of reconfigurable manufacturing systems has, however, focused mainly on the tools themselves. The manner in which these tools are made available to the machinery still needs to be researched. This article does exactly that and proposes a unit for the exchange of tools. Furthermore, the technologies that enable automatic integration and provide diagnostic capabilities are discussed.

  14. An automatic detection software for differential reflection spectroscopy

    Science.gov (United States)

    Yuksel, Seniha Esen; Dubroca, Thierry; Hummel, Rolf E.; Gader, Paul D.

    2012-06-01

    Recent terrorist attacks have spurred the need for a large-scale explosive detector. Our group has developed differential reflection spectroscopy, which can detect explosive residue on surfaces such as parcels, cargo and luggage. In short, broadband ultraviolet and visible light is shone onto a material (such as a parcel) moving on a conveyor belt. Upon reflection off the surface, the light intensity is recorded with a spectrograph (a spectrometer in combination with a CCD camera). This reflected light intensity is then subtracted and normalized with the next data point collected, resulting in differential reflection spectra in the 200-500 nm range. Explosives show spectral fingerprints at specific wavelengths; for example, the spectrum of 2,4,6-trinitrotoluene (TNT) shows an absorption edge at 420 nm. Additionally, we have developed automated software which detects the characteristic features of explosives. One of the biggest challenges for the algorithm is to reach a practical limit of detection. In this study, we introduce our automatic detection software, which is a combination of principal component analysis and support vector machines. Finally, we present the sensitivity and selectivity response of our algorithm as a function of the amount of explosive detected on a given surface.
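
    The differential-spectrum step described above (each reflected-light reading subtracted from, and normalized by, the next) might be sketched as follows; normalizing by the mean of the pair is an assumption for illustration, not necessarily the authors' exact formula:

```python
def differential_reflectance(intensities):
    """Differential spectra from consecutive reflected-light readings:
    subtract each reading from the next and normalize by their mean.
    (Illustrative reading of the procedure, not the published formula.)"""
    spectra = []
    for r1, r2 in zip(intensities, intensities[1:]):
        spectra.append((r2 - r1) / ((r2 + r1) / 2.0))
    return spectra

# Three successive intensity readings at one wavelength bin
print(differential_reflectance([100.0, 110.0, 99.0]))
```

Because each spectrum is a ratio of neighboring readings, slow drifts in source intensity largely cancel, which is the practical appeal of the differential approach.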

  15. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Full Text Available Performance growth of single-core processors came to a halt in the past decade, and further gains were enabled by the introduction of parallelism in processors. Multicore frameworks along with graphics processing units have broadened the scope for parallelism, and several compilers have been updated to address the resulting synchronization and threading challenges. Appropriate program and algorithm classification can greatly help software engineers identify opportunities for effective parallelization. In the present work we investigated current species for the classification of algorithms, discussing related work on classification and comparing the issues that challenge it. A set of algorithms was chosen whose structures match different issues while performing a given task. We tested these algorithms using existing automatic species-extraction tools along with the Bones compiler, and added functionality to the existing tool to provide a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and increment statements, user-defined types, constants and mathematical functions. With this, we can retain significant information that is not captured by the original algorithm species. We implemented these extensions in the tool, enabling automatic characterization of program code.

  16. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master thesis was performed at CERN, in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms that are based on the CODESYS development tool into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the Siemens SCADA system, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and software for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  17. A Thermo-Hydraulic Tool for Automatic Virtual Hazop Evaluation

    Directory of Open Access Journals (Sweden)

    Pugi L.

    2014-12-01

    Full Text Available Development of complex lubrication systems in the Oil & Gas industry has reached high levels of competitiveness in terms of requested performance and reliability. In particular, the use of HazOp analysis (Hazard and Operability analysis) represents a decisive factor in evaluating the safety and reliability of plants. A HazOp analysis is a structured and systematic examination of a planned or existing operation in order to identify and evaluate problems that may represent risks to personnel or equipment. In particular, P&ID schemes (Piping and Instrument Diagrams), according to the standard in force, ISO 14617, are used to evaluate the design of the plant in order to increase its safety and reliability in different operating conditions. The use of a simulation tool can drastically increase the speed, efficiency and reliability of the design process. In this work, a tool called TTH lib (Transient Thermal Hydraulic library) for the 1-D simulation of thermal hydraulic plants is presented. The proposed tool is applied to the analysis of safety-relevant components of compressor and pumping units, such as lubrication circuits. As opposed to known commercial products, TTH lib has been customized to ease the simulation of complex interactions with digital logic components and plant controllers, including their sensors and measurement systems. In particular, the proposed tool is optimized for fixed-step execution and fast prototyping of real-time code, both for testing and production purposes. TTH lib can be used as a standard SimScape-Simulink library of components optimized and specifically designed in accordance with the P&ID definitions. Finally, an automatic code generation procedure has been developed, so TTH simulation models can be assembled directly from the P&ID schemes and technical documentation, including detailed information on sensors and measurement systems.

  18. Preliminary Design Through Graphs: A Tool for Automatic Layout Distribution

    Directory of Open Access Journals (Sweden)

    Carlo Biagini

    2015-02-01

    Full Text Available Diagrams are essential in the preliminary stages of design for understanding distributive aspects and assisting the decision-making process. By drawing a schematic graph, designers can visualize in a synthetic way the relationships between many aspects: functions and spaces, distribution of layouts, space adjacency, influence of traffic flows within a facility layout, and so on. This process can be automated through the use of modern Information and Communication Technology (ICT) tools that allow designers to manage a large quantity of information. The work that we present is part of an on-going research project into how modern parametric software influences decision-making on the basis of automatic and optimized layout distribution. The method involves two phases: the first aims to define the ontological relations between spaces, with particular reference to a specific building typology (rules of aggregation of spaces); the second entails the implementation of these rules through the use of specialist software. The generation of ontological relations begins with the collection of data from historical manuals and analyses of case studies. These analyses aim to generate a “relationship matrix” based on preferences of space adjacency. The phase of implementing the previously defined rules is based on the use of Grasshopper to analyse and visualize different layout configurations. The layout is generated by simulating a process involving the collision of spheres, which represent specific functions of the design program. The spheres are attracted or repelled as a function of the relationship matrix defined above. The layout thus obtained remains in a sort of abstract state, independent of information about the exterior form, but still provides a useful tool for the decision-making process. In addition, preliminary results gathered through the analysis of case studies will be presented. These results provide a good variety
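
    The sphere-collision process can be pictured as a toy force-directed layout in which spaces attract or repel according to the relationship matrix; everything below (unit target spacing, rate, step count, the two-space example) is an illustrative assumption, not the paper's Grasshopper implementation:

```python
import math

# Toy force-directed layout: spaces attract or repel according to an
# adjacency-preference matrix, loosely mirroring the sphere-collision
# process described above.
def layout(positions, preference, steps=200, rate=0.05):
    pts = [list(p) for p in positions]
    n = len(pts)
    for _ in range(steps):
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                dx = pts[j][0] - pts[i][0]
                dy = pts[j][1] - pts[i][1]
                d = math.hypot(dx, dy) or 1e-9
                # positive preference pulls toward unit spacing;
                # negative preference pushes the spaces apart
                force = preference[i][j] * (d - 1.0)
                pts[i][0] += rate * force * dx / d
                pts[i][1] += rate * force * dy / d
    return pts

pref = [[0, 1], [1, 0]]            # two spaces that prefer adjacency
final = layout([(0.0, 0.0), (3.0, 0.0)], pref)
print(final)  # the two spaces settle roughly one unit apart
```

The abstract layout that emerges encodes only adjacency relations, matching the paper's point that the result stays independent of the exterior form.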

  19. Towards an automatic tool for resolution evaluation of mammographic images

    Energy Technology Data Exchange (ETDEWEB)

    De Oliveira, J. E. E. [FUMEC, Av. Alfonso Pena 3880, CEP 30130-009 Belo Horizonte - MG (Brazil); Nogueira, M. S., E-mail: juliae@fumec.br [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Pte. Antonio Carlos 6627, 31270-901, Belo Horizonte - MG (Brazil)

    2014-08-15

    Quality of Mammographies from the Public and Private Services of the State. With an essentially educational character, an evaluation of image quality is held monthly using a breast phantom on each mammography unit. In view of this, this work proposes to develop a protocol for the automatic evaluation of the image quality of mammograms, so that the radiological protection and image quality requirements are met in the early detection of breast cancer. Specifically, image resolution is addressed and evaluated as part of the image quality evaluation program. Results show that, for the fourth resolution and using 28 phantom images with the ground truth settled, the computer analysis of resolution is promising and may be used as a tool for the assessment of image quality. (Author)

  20. Towards an automatic tool for resolution evaluation of mammographic images

    International Nuclear Information System (INIS)

    De Oliveira, J. E. E.; Nogueira, M. S.

    2014-08-01

    Quality of Mammographies from the Public and Private Services of the State. With an essentially educational character, an evaluation of image quality is held monthly using a breast phantom on each mammography unit. In view of this, this work proposes to develop a protocol for the automatic evaluation of the image quality of mammograms, so that the radiological protection and image quality requirements are met in the early detection of breast cancer. Specifically, image resolution is addressed and evaluated as part of the image quality evaluation program. Results show that, for the fourth resolution and using 28 phantom images with the ground truth settled, the computer analysis of resolution is promising and may be used as a tool for the assessment of image quality. (Author)

  1. CRISPR Recognition Tool (CRT): a tool for automatic detection ofclustered regularly interspaced palindromic repeats

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Charles; Ramsey, Teresa L.; Sabree, Fareedah; Lowe,Micheal; Brown, Kyndall; Kyrpides, Nikos C.; Hugenholtz, Philip

    2007-05-01

    Clustered Regularly Interspaced Palindromic Repeats (CRISPRs) are a novel type of direct repeat found in a wide range of bacteria and archaea. CRISPRs are beginning to attract attention because of their proposed mechanism; that is, defending their hosts against invading extrachromosomal elements such as viruses. Existing repeat detection tools do a poor job of identifying CRISPRs due to the presence of unique spacer sequences separating the repeats. In this study, a new tool, CRT, is introduced that rapidly and accurately identifies CRISPRs in large DNA strings, such as genomes and metagenomes. CRT was compared to the CRISPR detection tools Patscan and Pilercr. In terms of correctness, CRT was shown to be very reliable, demonstrating significant improvements over Patscan for the measures of precision, recall and quality. When compared to Pilercr, CRT showed improved performance for recall and quality. In terms of speed, CRT also demonstrated superior performance, especially for genomes containing large numbers of repeats. In this paper a new tool was introduced for the automatic detection of CRISPR elements. This tool, CRT, was shown to be a significant improvement over current techniques for CRISPR identification. CRT's approach to detecting repetitive sequences is straightforward: it uses a simple sequential scan of a DNA sequence and detects repeats directly, without any major conversion or preprocessing of the input. This leads to a program that is easy to describe and understand, yet very accurate, fast and memory efficient, being O(n) in space and O(nm/l) in time.
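
    The sequential-scan idea behind CRT, finding a short repeat that recurs downstream beyond a spacer-sized gap, can be sketched as follows (the seed length, gap bounds and sequence are illustrative, not CRT's published settings):

```python
# Simplified sequential scan for CRISPR-like direct repeats: find a short
# seed that recurs downstream within a plausible spacer distance.
# This is only a sketch of the idea, not the published CRT algorithm.
def find_repeats(dna, k=8, min_gap=10, max_gap=40):
    hits = []
    for i in range(len(dna) - k):
        seed = dna[i:i + k]
        # search only the window where a spacer-separated copy could start
        window = dna[i + k + min_gap : i + k + max_gap]
        j = window.find(seed)
        if j != -1:
            hits.append((i, i + k + min_gap + j))  # positions of the pair
    return hits

# Two copies of "GATTACAG" separated by a 16-base spacer
seq = "AAACCCGGGTTT" + "GATTACAG" + "ACGTACGTACGTACGT" + "GATTACAG" + "TTT"
print(find_repeats(seq))
```

A production tool would extend each seed pair to the full repeat, merge overlapping hits, and validate spacer uniqueness; the single linear pass is what keeps the time roughly O(n) for fixed window sizes.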

  2. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit makes it one of the most frequently used and important parts, as in the automatic tool changer. The vibration detection system includes the development of hardware and software, such as a vibration meter, a signal acquisition card, a data processing platform, and a machine control program. Meanwhile, owing to differences between mechanical configurations and the desired characteristics, it is difficult for a vibration detection system to rely directly on commercially available kits. For this reason, the system was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine's characteristics and states. We also launched the development of the functional parts of the system simultaneously. Finally, we entered the conditions and parameters generated from both the states and the characteristics into the developed system to verify its feasibility.

  3. Managed PACS operation with an automatic monitoring tool

    Science.gov (United States)

    Zhang, Jianguo; Han, Ruolin; Wu, Dongqing; Zhang, Xiaoyan; Zhuang, Jun; Feng, Jie; Wang, Mingpeng; Zhang, Guozhen; Wang, Cuanfu

    2002-05-01

    Huadong Hospital in Shanghai, with 800 beds, provides health care services for inpatients and outpatients, as well as special senior and VIP patients. In order to move to a digital-imaging-based radiology practice, and also to provide better intra-hospital consultation services for senior and VIP patients, we started to implement a hospital-wide PACS in 1999, and also designed and developed an automatic monitoring system (AMS) to monitor and control PACS operation and dataflow and to decrease the total cost of ownership of PACS operation. We installed the AMS on top of the Huadong Hospital PACS in May 2001. The installation was painless, did not interrupt normal PACS operation, and took only one month. With the AMS, the PACS administrators can now monitor and control the entire PACS operation in real time, and also track patient and image data flow automatically. These features let administrators take proper action even before users complain when a failure occurs in any PACS component or process; they also reduce the size of the management team and decrease the total cost of PACS ownership.

  4. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input, and performs a registration over scale and in-plane rotation fully automatically.

  5. Automatic design optimization tool for passive structural control systems

    Science.gov (United States)

    Mojolic, Cristian; Hulea, Radu; Parv, Bianca Roxana

    2017-07-01

    The present paper proposes an automatic dynamic process to find the parameters of seismic isolation systems applied to large-span structures. Three seismic isolation solutions are proposed for the model of the new Slatina Sport Hall. The first case uses a friction pendulum system (FP), the second one uses High Damping Rubber Bearings (HDRB), while Lead Rubber Bearings (LRB) are used for the last case of isolation. The isolation level is placed at the top end of the roof-supporting columns. The aim is to calculate the parameters of each isolation system so that the whole structure's first vibration period is the one desired by the user. The model is computed with the SAP2000 software. In order to find the best solution to the optimization problem, an optimization process based on Genetic Algorithms (GA) has been developed in Matlab. With the use of the API (Application Programming Interface) libraries, a two-way link is created between the two programs in order to exchange results and link parameters. The main goal is to find the best seismic isolation method for each desired modal period so that the bending moment in the supporting columns is minimized.
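
    For intuition, a single-degree-of-freedom idealization shows how a target first period constrains the isolator stiffness via T = 2π√(m/k); the paper instead tunes the full SAP2000 model with a GA, so the relation and numbers below are a textbook simplification, not the authors' procedure:

```python
import math

# Single-degree-of-freedom idealization of an isolated structure:
# the target first period T fixes the required total isolator stiffness,
#   T = 2*pi*sqrt(m/k)  =>  k = m * (2*pi/T)**2
def required_stiffness(mass, target_period):
    """Total lateral stiffness (N/m) giving the desired period (s)."""
    return mass * (2.0 * math.pi / target_period) ** 2

# Illustrative numbers only: 2000-tonne roof mass, 3 s target period
k = required_stiffness(mass=2.0e6, target_period=3.0)
print(round(k), "N/m")
```

In the real multi-degree-of-freedom model no such closed form exists, which is why the paper resorts to a GA driving SAP2000 through its API.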

  6. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    According to the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used that is developed independently of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  7. Semi-automatic tool to ease the creation and optimization of GPU programs

    DEFF Research Database (Denmark)

    Jepsen, Jacob

    2014-01-01

    We present a tool that reduces the development time of GPU-executable code. We implement a catalogue of common optimizations specific to the GPU architecture. Through the tool, the programmer can semi-automatically transform a computationally intensive code section into GPU-executable form and apply optimizations thereto. Based on experiments, the code generated by the tool can be 3-256X faster than code generated by an OpenACC compiler, 4-37X faster than optimized CPU code, and attain up to 25% of peak performance of the GPU. We found that by using pattern-matching rules, many of the transformations can be performed automatically, which makes the tool usable for both novices and experts in GPU programming.

  8. ANISOMAT+: An automatic tool to retrieve seismic anisotropy from local earthquakes

    Science.gov (United States)

    Piccinini, Davide; Pastori, Marina; Margheriti, Lucia

    2013-07-01

    An automatic analysis code called ANISOMAT+ has been developed and improved to automatically retrieve the crustal anisotropic parameters, fast polarization direction (ϕ) and delay time (δt), related to the shear wave splitting phenomena affecting seismic S-waves. The code is composed of a set of MatLab scripts and functions able to evaluate the anisotropic parameters from three-component seismic recordings of local earthquakes using the cross-correlation method. Because the aim of the code is a fully automatic evaluation of the anisotropic parameters, during its development we focused on devising several automatic checks intended to guarantee the quality and stability of the results. The basic idea behind this automatic code is to build a tool able to work on a huge amount of data in a short time, obtaining stable results and minimizing the errors due to subjectivity. These features, coupled with a three-component digital seismic network and a monitoring system that performs automatic picking and location, are what is required for real-time monitoring of the anisotropic parameters.
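
    The cross-correlation measurement at the heart of the method can be sketched as a search for the lag that best aligns two S-wave components (a bare-bones illustration; ANISOMAT+ additionally grid-searches the polarization direction and applies the automatic quality checks described above):

```python
# Cross-correlate two S-wave components to estimate the splitting delay.
def best_lag(a, b, max_lag):
    """Lag (in samples) at which trace b best matches trace a."""
    def corr(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        return sum(x * y for x, y in pairs)
    return max(range(-max_lag, max_lag + 1), key=corr)

fast = [0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0]
slow = [0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0]  # same pulse, delayed 2 samples
print(best_lag(fast, slow, max_lag=3))  # 2
```

Multiplying the best lag by the sampling interval gives the delay time δt; real traces would also be normalized and windowed around the S arrival.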

  9. DAF: differential ACE filtering image quality assessment by automatic color equalization

    Science.gov (United States)

    Ouni, S.; Chambah, M.; Saint-Jean, C.; Rizzi, A.

    2008-01-01

    Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. But in reality, objective quality metrics do not necessarily correlate well with perceived quality [1]. Moreover, some measures assume that there exists a reference in the form of an "original" to compare to, which prevents their use in the digital restoration field, where often there is no reference to compare to. That is why subjective evaluation has been the most used and most efficient approach up to now. But subjective assessment is expensive, time consuming and hence does not meet economic requirements [2,3]. Thus, reliable automatic methods for visual quality assessment are needed in the field of digital film restoration. The ACE method, for Automatic Color Equalization [4,6], is an algorithm for the unsupervised enhancement of digital images. It is based on a new computational approach that tries to model the perceptual response of our vision system, merging the Gray World and White Patch equalization mechanisms in a global and local way. Like our vision system, ACE is able to adapt to widely varying lighting conditions and to extract visual information from the environment efficaciously. Moreover, ACE can be run in an unsupervised manner; hence it is very useful as a digital film restoration tool, since no a priori information is available. In this paper we deepen the investigation of using the ACE algorithm as a basis for reference-free image quality evaluation. This new metric, called DAF for Differential ACE Filtering [7], is an objective quality measure that can be used in several image restoration and image quality assessment systems. In this paper, we compare, on different image databases, the results obtained with DAF and with some subjective image quality assessments (Mean Opinion Score, MOS, as a measure of perceived image quality). We also study the correlation between the objective measure and MOS. In our experiments, we have used for the first image

  10. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generating unique filenames. Selected extracted parameters can be retrieved as variables that can be included in custom code for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages, including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
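
    The kind of repetitive parsing ExcelAutomat automates, extracting one scalar from each output file and compiling the results into rows, might look like this outside a spreadsheet; the line format and regular expression below are illustrative assumptions, not Gaussian's exact output or ExcelAutomat's VBA code:

```python
import re

# Pull a scalar result out of each output text and compile rows,
# mirroring the "parsing and compiling data from output files" task.
ENERGY_RE = re.compile(r"SCF Done.*?=\s*(-?\d+\.\d+)")

def extract_energy(text):
    """Return the first matched energy value, or None if absent."""
    match = ENERGY_RE.search(text)
    return float(match.group(1)) if match else None

outputs = {
    "job1.log": "... SCF Done:  E(RHF) =  -76.0267  a.u. ...",
    "job2.log": "no convergence line in this file",
}
rows = [(name, extract_energy(text)) for name, text in outputs.items()]
print(rows)
```

Each row would then become one spreadsheet line, with the `None` entries flagging jobs that need attention.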

  11. Depfix, a Tool for Automatic Rule-based Post-editing of SMT

    Directory of Open Access Journals (Sweden)

    Rudolf Rosa

    2014-09-01

    Full Text Available We present Depfix, an open-source system for automatic post-editing of phrase-based machine translation outputs. Depfix employs a range of natural language processing tools to obtain analyses of the input sentences, and uses a set of rules to correct common or serious errors in machine translation outputs. Depfix is currently implemented only for the English-to-Czech translation direction, but extending it to other languages is planned.

  12. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    Science.gov (United States)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially if the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and an operating system. AGT implements an MVC architecture and uses open-source software such as the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting C/C++ source code and compiling it. The test results show that the AGT application runs well.
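
    The grading step itself, comparing a submission's captured output against the expected answer for each test case, can be sketched as follows (the compile-and-run stage, e.g. invoking a C/C++ compiler in a sandbox, is deliberately omitted; the scoring rule is an illustrative assumption, not AGT's):

```python
# Core of an automatic grader: normalize and compare a program's captured
# stdout against the expected answer for each test case.
def normalize(output):
    """Ignore trailing whitespace and blank edges so formatting noise
    does not fail an otherwise correct submission."""
    return [line.rstrip() for line in output.strip().splitlines()]

def grade(actual_outputs, expected_outputs):
    """Percentage of test cases whose normalized output matches."""
    passed = sum(normalize(a) == normalize(e)
                 for a, e in zip(actual_outputs, expected_outputs))
    return 100.0 * passed / len(expected_outputs)

score = grade(["3\n", "7 \n", "11\n"], ["3", "7", "10"])
print(score)  # two of three cases pass
```

A real grader would add per-case time and memory limits and isolate the compiled binary from the host system.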

  13. DiffNet: automatic differential functional summarization of dE-MAP networks.

    Science.gov (United States)

    Seah, Boon-Siew; Bhowmick, Sourav S; Dewey, C Forbes

    2014-10-01

    The study of genetic interaction networks that respond to changing conditions is an emerging research problem. Recently, Bandyopadhyay et al. (2010) proposed a technique to construct a differential network (dE-MAP network) from two static gene interaction networks in order to map the interaction differences between them under environment or condition change (e.g., a DNA-damaging agent). This differential network is then manually analyzed to conclude that DNA repair is differentially affected by the condition change. Unfortunately, manual construction of a differential functional summary from a dE-MAP network that summarizes all pertinent functional responses is time-consuming, laborious and error-prone, impeding large-scale analysis. To this end, we propose DiffNet, a novel data-driven algorithm that leverages Gene Ontology (GO) annotations to automatically summarize a dE-MAP network and obtain a high-level map of functional responses due to condition change. We tested DiffNet on the dynamic interaction networks following MMS treatment and demonstrated the superiority of our approach in generating differential functional summaries compared to state-of-the-art graph clustering methods. We studied the effects of the parameters in DiffNet in controlling the quality of the summary. We also performed a case study that illustrates its utility. Copyright © 2014 Elsevier Inc. All rights reserved.
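
    A dE-MAP-style differential network can be sketched as the per-edge difference of interaction scores between two conditions, thresholded to keep the strongest changes (toy gene names and threshold; DiffNet's contribution is the subsequent GO-based summarization, not this construction):

```python
# Differential network: per-edge score change between two conditions,
# keeping only edges whose change exceeds a threshold.
def differential_network(untreated, treated, threshold=0.5):
    edges = set(untreated) | set(treated)
    diff = {}
    for edge in edges:
        change = treated.get(edge, 0.0) - untreated.get(edge, 0.0)
        if abs(change) >= threshold:
            diff[edge] = change
    return diff

# Toy interaction scores under two conditions
untreated = {("rad52", "rad51"): 1.2, ("msh2", "mlh1"): 0.9}
treated   = {("rad52", "rad51"): 0.1, ("msh2", "mlh1"): 0.8}
print(differential_network(untreated, treated))
```

The surviving edges are exactly the condition-responsive interactions that a functional summarizer would then group by shared GO annotations.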

  14. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high-reliability. In SFTA, experts predict failures of system through HA-ZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees about the failures. Quality and cost of the software fault tree, therefore, depend on knowledge and experience of the experts. This paper proposes a CASE tool NuFTA in order to assist experts of safety analysis. The NuFTA automatically generate software fault trees from NuSCR formal requirements specification. NuSCR is a formal specification language used for specifying software requirements of KNICS RPS (Reactor Protection System) in Korea. We used the SFTA templates proposed by in order to generate SFTA automatically. The NuFTA also generates logical formulae summarizing the failure's cause, and we have a plan to use the formulae usefully through formal verification techniques

  15. Parameter optimization of differential evolution algorithm for automatic playlist generation problem

    Science.gov (United States)

    Alamag, Kaye Melina Natividad B.; Addawe, Joel M.

    2017-11-01

    With the digitalization of music, music collections have grown very large, and there is a need to create lists of music that filter a collection according to user preferences, giving rise to the Automatic Playlist Generation Problem (APGP). Previous attempts to solve this problem include the use of search and optimization algorithms. If a music database is very large, the algorithm used must be able to search the lists thoroughly, taking into account the quality of the playlist given a set of user constraints. In this paper we run an evolutionary metaheuristic optimization algorithm, Differential Evolution (DE), with different combinations of parameter values and select the best-performing set when used to solve four standard test functions. The performance of the proposed algorithm is then compared with a standard Genetic Algorithm (GA) and a hybrid GA with Tabu Search. Numerical simulations are carried out to show the better results obtained with the Differential Evolution approach using the optimized parameter values.
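
    A minimal DE/rand/1/bin implementation makes the two parameters under study, the differential weight F and crossover rate CR, concrete; the values below are common defaults exercised on a standard test function, not the paper's tuned set:

```python
import random

# Minimal DE/rand/1/bin differential evolution on the sphere test function.
def de(objective, bounds, pop_size=20, F=0.8, CR=0.9, iters=200, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # mutation: three distinct individuals other than i
            a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantee one mutated component
            trial = [a[d] + F * (b[d] - c[d])
                     if (rng.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            s = objective(trial)
            if s <= scores[i]:          # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

sphere = lambda x: sum(v * v for v in x)
best, score = de(sphere, [(-5.0, 5.0)] * 3)
print(best, score)  # best lands near the optimum at the origin
```

For the APGP, `objective` would instead score a candidate playlist against the user's constraints, with a decoding step mapping the real-valued vector to track choices.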

  16. Evaluation of user-guided semi-automatic decomposition tool for hexahedral mesh generation

    Directory of Open Access Journals (Sweden)

    Jean Hsiang-Chun Lu

    2017-10-01

    Full Text Available Volumetric decomposition is essential for all-hexahedral mesh generation. Because fully automatic decomposition methods that can generate high-quality hexahedral meshes for arbitrary volumes have yet to be realized, manual decomposition is still frequently required. Manual decomposition is a laborious process and requires a high level of user expertise. A user-guided semi-automatic tool that reduces human effort and lowers the required expertise is therefore necessary. To date, only a few such approaches have been proposed, and the lack of user evaluation makes them difficult to improve. Building on our previous work, we present a user evaluation of a user-guided semi-automatic tool that provides visual guidance to assist users in determining decomposition solutions, accepts sketch-based inputs to create decomposition surfaces, and simplifies the decomposition commands. This user evaluation investigated (1) the usability of the visual guidance, (2) the types of visual guidance essential for decomposition, (3) the effectiveness of the sketch-based decomposition, and (4) the performance differences between beginner and experienced users of the sketch-based decomposition. The results and user feedback indicate that the tool enables users with limited prior experience or familiarity with computer-aided engineering software to perform volumetric decomposition more efficiently. The visual guidance increases the success rate of the user's decomposition solutions by 28%, and the sketch-based decomposition reduces the time users spend creating decomposition surfaces and setting up decomposition commands by 46%.

  17. Data Quality Monitoring: Automatic MOnitoRing Environment (AMORE) Web Administration Tool in ALICE Experiment

    CERN Document Server

    Nagi, Imre

    2013-01-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The quality of the acquired data evolves over time depending on the status of the detectors, their components and the operating environment. To achieve excellent detector performance, all detector configurations have to be set correctly so that data-taking can proceed in an optimal way. This report describes a new implementation, based on web technologies, of the administration tools of ALICE's DQM framework AMORE (Automatic MonitoRing Environment).

  18. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  19. Automatic generation of bioinformatics tools for predicting protein–ligand binding sites

    Science.gov (United States)

    Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-01-01

    Motivation: Predictive tools that model protein–ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein–ligand binding predictive tools would be useful. Results: We developed a system for automatically generating protein–ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5–1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. Availability and implementation: The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26545824

  20. A new fully automatic PIM tool to replicate two component tungsten DEMO divertor parts

    International Nuclear Information System (INIS)

    Antusch, Steffen; Commin, Lorelei; Heneka, Jochen; Piotter, Volker; Plewa, Klaus; Walter, Heinz

    2013-01-01

    Highlights: • Development of a fully automatic 2C-PIM tool. • Replication of fusion-relevant components in one step without additional brazing. • No cracks or gaps visible in the seam of the joining zone. • For both material combinations a solid bond at the material interface was achieved. • PIM is a powerful process for mass production as well as for joining even complex-shaped parts. -- Abstract: At Karlsruhe Institute of Technology (KIT), divertor design concepts for future nuclear fusion power plants beyond ITER are being intensively investigated. One promising KIT divertor design concept for the future DEMO power reactor is based on modular He-cooled finger units. Manufacturing such parts by mechanical machining such as milling and turning, however, is extremely cost- and time-intensive because tungsten is very hard and brittle. Powder Injection Molding (PIM) has been adapted to tungsten processing at KIT for several years. This production method is promising for large-scale production of tungsten parts with high near-net-shape precision and hence offers a cost-saving alternative to conventional machining. The successfully manufactured divertor tile, consisting only of pure tungsten, exhibits a crack-free microstructure and a high density (>98% T.D.). Based on these results, a new fully automatic multicomponent PIM tool was developed; it allows the replication and joining of fusion-relevant components of different materials in one step, without brazing, and the creation of composite materials. This contribution describes the process route to design and engineer the new fully automatic 2C-PIM tool, including the filling simulation and the commissioning of the tool. The complete fabrication process of tungsten 2C-PIM, including material and feedstock (powder and binder) development, injection molding, and heat treatment of real DEMO divertor parts, is outlined.

  1. getDEG: A Versatile Matlab Tool for Identifying Differentially Expressed Genes from High-Throughput Biomedical Data

    Directory of Open Access Journals (Sweden)

    Hua Tan

    2017-10-01

    Full Text Available The identification of differentially expressed genes (DEGs) is an important initial step for characterizing critical regulators and associated signaling profiles under specific conditions. Yet a sophisticated computational tool that detects DEGs in a fully automatic manner is still lacking. Here we describe getDEG, a versatile Matlab program that fills this gap by offering efficient implementations of the most needed functions. In particular, getDEG adopts a user-designated statistical test and ranking method to prioritize the probes/genes assayed. Furthermore, getDEG allows preliminary filtering by the machine detection p-value and collapsing of multiple probes to their associated gene. Taken together, getDEG is a powerful automatic tool that satisfies basic and advanced needs in searching for the most relevant candidates from microarray assays or other high-throughput screens. The tool getDEG and test examples are freely available online at https://sites.google.com/site/differentiallyexpressedgene.
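getDEG itself is a Matlab program; purely as an illustration of the statistic-and-rank idea at its core, a sketch in Python might use Welch's t statistic as one example of a user-designated test (the gene names and expression values below are made up):

```python
import math
from statistics import mean, variance

def t_statistic(case, ctrl):
    """Welch's t statistic between two sets of expression replicates."""
    se = math.sqrt(variance(case) / len(case) + variance(ctrl) / len(ctrl))
    return (mean(case) - mean(ctrl)) / se

def rank_genes(expr_case, expr_ctrl):
    """Rank genes by |t|, largest first; expr_* map gene -> replicate values.

    A real tool would convert the statistic to a p-value and correct for
    multiple testing; ranking by |t| keeps the sketch dependency-free."""
    stats = {g: t_statistic(expr_case[g], expr_ctrl[g]) for g in expr_case}
    return sorted(stats, key=lambda g: abs(stats[g]), reverse=True)

# Toy data: geneA is clearly shifted between conditions, geneB is not.
case = {"geneA": [9.1, 9.3, 9.0], "geneB": [5.0, 5.2, 4.9]}
ctrl = {"geneA": [5.0, 5.1, 4.9], "geneB": [5.1, 4.8, 5.0]}
ranked = rank_genes(case, ctrl)  # geneA should come out on top
```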

  2. Four-bar linkage-based automatic tool changer: Dynamic modeling and torque optimization

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sangho; Seo, TaeWon [Yeungnam University, Gyeongsan (Korea, Republic of); Kim, Jong-Won; Kim, Jongwon [Seoul National University, Seoul (Korea, Republic of)

    2017-05-15

    An automatic tool changer (ATC) is a device used in a tapping machine to reduce process time. This paper presents the optimization of a peak torque reduction mechanism (PTRM) for an ATC. It is necessary to reduce the fatigue load and the energy consumed, both of which are related to the peak torque. The PTRM uses a torsion spring to reduce the peak torque and was applied to a novel ATC mechanism, which was modeled using inverse dynamics. Optimization of the PTRM is required to minimize the peak torque. The design parameters are the initial angle and stiffness of the torsion spring, and the objective function is the peak torque of the input link. The torque was simulated, and the peak torque was decreased by 10%. The energy consumed was reduced by the optimization.
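The paper obtains the load torque from inverse dynamics of the four-bar linkage; the sketch below substitutes an assumed sinusoidal load profile and a coarse grid search over the two design parameters (spring stiffness and initial angle), purely to illustrate the structure of the optimization:

```python
import math

def peak_motor_torque(k, theta0, load_profile):
    """Peak absolute torque the motor must supply when a torsion spring
    (stiffness k, free angle theta0) assists against the load torque."""
    return max(abs(tau - k * (theta0 - theta)) for theta, tau in load_profile)

# Assumed load profile over the input-link rotation (angle, torque demand).
# The real ATC would obtain this from the linkage's inverse dynamics.
profile = [(i * math.pi / 50, 10.0 * math.sin(i * math.pi / 50))
           for i in range(51)]

# Coarse grid search over stiffness k and initial (free) angle theta0.
candidates = ((0.5 * k, math.pi * t / 20)
              for k in range(21) for t in range(21))
best_k, best_theta0 = min(
    candidates, key=lambda p: peak_motor_torque(p[0], p[1], profile))

baseline = peak_motor_torque(0.0, 0.0, profile)   # no spring fitted
optimised = peak_motor_torque(best_k, best_theta0, profile)
```

Any properly chosen spring flattens the torque demand, so the optimized peak comes out below the no-spring baseline; a gradient-based optimizer would replace the grid search in practice.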

  3. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
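TAALES computes its frequency indices against established corpus norms; as a toy illustration only (the frequency values below are invented, not real norms, and the real tool reports hundreds of such indices), a mean log word frequency index can be computed like this:

```python
import math

# Hypothetical frequency norms in counts per million. TAALES draws on
# established corpora; these particular numbers are made up for the sketch.
FREQ_NORMS = {"the": 50000.0, "cat": 120.0, "sat": 80.0,
              "obfuscate": 1.5, "on": 30000.0, "mat": 40.0}

def mean_log_frequency(text):
    """Mean log10 word frequency, a basic lexical sophistication index.
    Lower values indicate rarer (more sophisticated) vocabulary.
    Words missing from the norms are simply skipped."""
    words = [w for w in text.lower().split() if w in FREQ_NORMS]
    return sum(math.log10(FREQ_NORMS[w]) for w in words) / len(words)

simple = mean_log_frequency("the cat sat on the mat")
fancy = mean_log_frequency("obfuscate the mat")  # rarer vocabulary
```

The text with the rarer word scores lower, which is the direction such indices feed into the proficiency models described above.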

  4. AlignTool: The automatic temporal alignment of spoken utterances in German, Dutch, and British English for psycholinguistic purposes.

    Science.gov (United States)

    Schillingmann, Lars; Ernst, Jessica; Keite, Verena; Wrede, Britta; Meyer, Antje S; Belke, Eva

    2018-01-29

    In language production research, the latency with which speakers produce a spoken response to a stimulus and the onset and offset times of words in longer utterances are key dependent variables. Measuring these variables automatically often yields partially incorrect results. However, exact measurement through visual inspection of the recordings is extremely time-consuming. We present AlignTool, an open-source alignment tool that preliminarily establishes the onset and offset times of words and phonemes in spoken utterances using Praat, and subsequently performs a forced alignment of the spoken utterances and their orthographic transcriptions in the automatic speech recognition system MAUS. AlignTool creates a Praat TextGrid file for inspection and manual correction by the user, if necessary. We evaluated AlignTool's performance with recordings of single-word and four-word utterances as well as semi-spontaneous speech. AlignTool performs well with audio signals with an excellent signal-to-noise ratio, requiring virtually no corrections. For audio signals of lesser quality, AlignTool is still highly functional, but its results may require more frequent manual corrections. We also found that audio recordings including long silent intervals tended to pose greater difficulties for AlignTool than recordings filled with speech, which AlignTool analyzed well overall. We expect that by semi-automating the temporal analysis of complex utterances, AlignTool will open new avenues in language production research.

  5. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it covers only company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach, examining whether the AutoMealRecord data could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted five major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses: BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at the measurement booth in the cafeteria, and negatively correlated with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that BMI is sufficiently predictable from AutoMealRecord data. We conclude that the AutoMealRecord system is worth further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
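The reported model is a multiple linear regression over several predictors, validated by leave-one-out cross-validation; as a toy single-predictor illustration (all data values below are made up, not from the study), the validation scheme looks like this:

```python
def fit_simple(xs, ys):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

def loo_accuracy(xs, ys, cutoff=23.0):
    """Leave-one-out CV: refit without sample i, then check whether the
    predicted BMI puts sample i on the correct side of the cutoff."""
    hits = 0
    for i in range(len(xs)):
        a, b = fit_simple(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        hits += (a + b * xs[i] >= cutoff) == (ys[i] >= cutoff)
    return hits / len(xs)

# Toy data: mean lunch energy intake (kcal) vs BMI -- invented values.
energy = [450, 520, 600, 700, 750, 820, 880, 950]
bmi = [19.0, 20.5, 21.0, 22.5, 23.5, 24.0, 25.5, 27.0]
acc = loo_accuracy(energy, bmi)
```

The study's actual model adds gender, age, dietary-pattern scores and other predictors, but the refit-and-hold-out loop is the same.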

  6. Dosimetric evaluation of an automatic segmentation tool of pelvic structures from MRI images for prostate cancer radiotherapy

    International Nuclear Information System (INIS)

    Pasquier, D.; Lacornerie, T.; Lartigau, E.; Pasquier, D.; Pasquier, D.; Betrouni, N.; Vermandel, M.; Rousseau, J.

    2008-01-01

    Purpose: An automatic segmentation tool of pelvic structures from MRI images for prostate cancer radiotherapy was developed; a dosimetric evaluation of the differences in delineation (automatic versus human) is presented here. Materials and methods: The CTV (clinical target volume), rectum and bladder were delineated automatically and by a physician in 20 patients. Treatment plans based on the 'automatic' volumes were transferred onto the 'manual' volumes and vice versa. Dosimetric characteristics of the PTV (V95, minimal, maximal and mean doses), rectum (V50, V70, maximal and mean doses) and bladder (V70, maximal and mean doses) were compared. Results: Automatic delineation of the CTV did not significantly influence the dosimetric characteristics of the 'manual' PTV (planning target volume). Rectal V50 and V70 were not significantly different; the mean rectal dose was slightly higher (43.2 versus 44.4 Gy, p = 0.02, Student's t-test). Bladder V70 was also significantly higher (19.3 versus 21.6, p = 0.004). Automatic delineation of the organs at risk (OARs) had little influence on their dosimetric characteristics; rectal V70 was slightly underestimated (20 versus 18.5, p = 0.001). Conclusion: Automatic delineation of the CTV and OARs had little influence on dosimetric characteristics. Software developments are ongoing to enable routine use, and an interobserver evaluation is needed. (authors)

  7. Low rank approach to computing first and higher order derivatives using automatic differentiation

    International Nuclear Information System (INIS)

    Reed, J. A.; Abdel-Khalik, H. S.; Utke, J.

    2012-01-01

    This manuscript outlines a new approach for increasing the efficiency of applying automatic differentiation (AD) to large-scale computational models. Using the principles of the Efficient Subspace Method (ESM), low-rank approximations of the first- and higher-order derivatives can be calculated with greatly reduced computational resources. The output obtained from nuclear reactor calculations typically has a much smaller numerical rank than the number of inputs and outputs. This rank deficiency can be exploited to reduce the number of derivatives that need to be calculated with AD. According to ESM, the effective rank can be determined by computing derivatives with AD at random inputs. Reduced (pseudo) variables are then defined, and new derivatives are calculated with respect to the pseudo variables. Two different AD packages are used: OpenAD and Rapsodia. OpenAD is used to determine the effective rank and the subspace that contains the derivatives; Rapsodia is then used to calculate derivatives with respect to the pseudo variables for the desired order. The overall approach is applied to two simple problems and to MATWS, a safety code for sodium-cooled reactors. (authors)
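The paper computes derivatives exactly with AD (OpenAD/Rapsodia); the sketch below substitutes a forward-difference directional derivative for the tangent-mode AD sweep, and probes a toy model whose Jacobian has rank 2 by construction, to show how derivatives at random inputs reveal the effective rank:

```python
import math
import random

def model(x):
    """Toy model with 6 inputs whose 4 outputs all depend on only two
    hidden combinations of the inputs, so the Jacobian has rank 2."""
    s1 = x[0] + 2 * x[1] + x[2]
    s2 = x[3] - x[4] + 0.5 * x[5]
    return [s1 + s2, 2 * s1, s1 - 3 * s2, 0.1 * s2]

def directional_derivative(f, x, d, h=1e-6):
    """Forward-difference stand-in for one tangent-mode AD sweep."""
    fx = f(x)
    fxh = f([xi + h * di for xi, di in zip(x, d)])
    return [(a - b) / h for a, b in zip(fxh, fx)]

def effective_rank(f, n, probes=8, tol=1e-4, seed=0):
    """ESM-style rank probing: collect derivatives along random input
    directions and count the independent directions they span,
    via modified Gram-Schmidt."""
    rng = random.Random(seed)
    x0 = [0.0] * n
    basis = []
    for _ in range(probes):
        d = [rng.gauss(0, 1) for _ in range(n)]
        v = directional_derivative(f, x0, d)
        for b in basis:  # project out the span found so far
            dot = sum(vi * bi for vi, bi in zip(v, b))
            v = [vi - dot * bi for vi, bi in zip(v, b)]
        norm = math.sqrt(sum(vi * vi for vi in v))
        if norm > tol:
            basis.append([vi / norm for vi in v])
    return len(basis)

rank = effective_rank(model, n=6)
```

Once the rank-2 subspace is identified, only derivatives with respect to two pseudo variables spanning it need be computed, instead of all six inputs; that is the saving the paper exploits.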

  8. Attentional Bias for Pain and Sex, and Automatic Appraisals of Sexual Penetration: Differential Patterns in Dyspareunia vs Vaginismus?

    Science.gov (United States)

    Melles, Reinhilde J; Dewitte, Marieke D; Ter Kuile, Moniek M; Peters, Madelon M L; de Jong, Peter J

    2016-08-01

    Current information processing models propose that heightened attention bias for sex-related threats (eg, pain) and lowered automatic incentive processes ("wanting") may play an important role in the impairment of sexual arousal and the development of sexual dysfunctions such as genitopelvic pain/penetration disorder (GPPPD). Differential threat and incentive processing may also help explain the stronger persistence of coital avoidance in women with vaginismus compared to women with dyspareunia. As the first aim, we tested if women with GPPPD show (1) heightened attention for pain and sex, and (2) heightened threat and lower incentive associations with sexual penetration. Second, we examined whether the stronger persistence of coital avoidance in vaginismus vs dyspareunia might be explained by a stronger attentional bias or more dysfunctional automatic threat/incentive associations. Women with lifelong vaginismus (n = 37), dyspareunia (n = 29), and a no-symptoms comparison group (n = 51) completed a visual search task to assess attentional bias, and single target implicit-association tests to measure automatic sex-threat and sex-wanting associations. There were no group differences in attentional bias or automatic associations. Correlational analysis showed that slowed detection of sex stimuli and stronger automatic threat associations were related to lowered sexual arousal. The findings do not corroborate the view that attentional bias for pain or sex contributes to coital pain, or that differences in coital avoidance may be explained by differences in attentional bias or automatic threat/incentive associations. However, the correlational findings are consistent with the view that automatic threat associations and impaired attention for sex stimuli may interfere with the generation of sexual arousal. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  9. MAISTAS: a tool for automatic structural evaluation of alternative splicing products.

    Science.gov (United States)

    Floris, Matteo; Raimondo, Domenico; Leoni, Guido; Orsini, Massimiliano; Marcatili, Paolo; Tramontano, Anna

    2011-06-15

    Analysis of the human genome revealed that the amount of transcribed sequence is an order of magnitude greater than the number of predicted and well-characterized genes. A sizeable fraction of these transcripts is related to alternatively spliced forms of known protein coding genes. Inspection of the alternatively spliced transcripts identified in the pilot phase of the ENCODE project has clearly shown that their structure often differs substantially from that of other isoforms of the same gene, and therefore that they might perform unrelated functions, or might not even correspond to a functional protein. Identifying these cases is obviously relevant for the functional assignment of gene products and for the interpretation of the effect of variations in the corresponding proteins. Here we describe a publicly available tool that, given a gene or a protein, retrieves and analyses all its annotated isoforms, provides users with three-dimensional models of the isoform(s) of interest whenever possible, and automatically assesses whether the homology-derived structural models correspond to plausible structures. This information is clearly relevant: when the homology model of some isoforms of a gene does not seem structurally plausible, the implication is that either they assume a structure unrelated to that of the other isoforms of the same gene, with presumably significant functional differences, or they do not correspond to functional products. We provide indications that the second hypothesis is likely to be true for a substantial fraction of the cases. http://maistas.bioinformatica.crs4.it/.

  10. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Full Text Available Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery to replace the natural lens with an artificial intraocular lens (IOL), which can be monofocal or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure high patient satisfaction. Thus, studies of the visual behavior of these patients may be an important tool for determining the best type of IOL to implant. This study proposes an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient's visual routine in order to obtain additional information about the frequency of distant, intermediate, and near viewing. Results: The results provided estimated frequency percentages, suggesting that visual analysis of routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing a visual management strategy after cataract surgery, while also stimulating interest in customized IOL manufacturing according to individual needs.

  11. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    Science.gov (United States)

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-11-04

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate for detecting errors introduced during the migration. In this work we investigate the use of an automatic plan comparison tool to verify that plan data have been correctly migrated from an older version of a TMS database to a new one. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS versions is translated into an XML schema. A plan comparison module takes the two XML schemas as input and, by applying a schema mapping, reports any differences in parameters between the two versions of the same plan. A console application queries the database to obtain a list of active or in-preparation plans to be tested, then runs in batch mode to compare all the plans, and a report of the success or failure of each comparison is saved for review. This software tool was used as part of the software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods, which rely on repeating pretreatment QA measurements or on labor-intensive and fallible hand comparisons.
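The element names below are a toy schema, not Varian's actual plan format; a minimal sketch of the flatten-and-diff idea (translate each plan export to a flat parameter map, then report mismatches) might look like this:

```python
import xml.etree.ElementTree as ET

def plan_parameters(xml_text):
    """Flatten a plan document into {path: value} for field-by-field
    diffing. Assumes tag paths are unique; a real tool would key
    repeated elements (e.g. multiple beams) by an identifier."""
    def walk(node, path, out):
        p = f"{path}/{node.tag}"
        if node.text and node.text.strip():
            out[p] = node.text.strip()
        for child in node:
            walk(child, p, out)
        return out
    return walk(ET.fromstring(xml_text), "", {})

def diff_plans(old_xml, new_xml):
    """Report parameters that are missing or changed between exports."""
    old, new = plan_parameters(old_xml), plan_parameters(new_xml)
    return [(key, old.get(key), new.get(key))
            for key in sorted(set(old) | set(new))
            if old.get(key) != new.get(key)]

# Hypothetical pre- and post-migration exports of the same plan.
old = "<plan><beam><mu>120.5</mu><gantry>180</gantry></beam></plan>"
new = "<plan><beam><mu>120.5</mu><gantry>179</gantry></beam></plan>"
diffs = diff_plans(old, new)  # flags the changed gantry angle
```

Run over every active plan in batch, an empty diff list per plan is the "migration succeeded" signal; any non-empty list goes into the review report.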

  12. An open source automatic quality assurance (OSAQA) tool for the ACR MRI phantom.

    Science.gov (United States)

    Sun, Jidi; Barnes, Michael; Dowling, Jason; Menk, Fred; Stanwell, Peter; Greer, Peter B

    2015-03-01

    Routine quality assurance (QA) is essential to ensure MR scanner performance. It covers geometric distortion, slice positioning and thickness accuracy, high-contrast spatial resolution, intensity uniformity, ghosting artefacts and low-contrast object detectability. However, this manual process can be very time-consuming. This paper describes the development and validation of an open source tool to automate the MR QA process, which aims to increase physicist efficiency and improve the consistency of QA results by reducing human error. The OSAQA software was developed in Matlab and the source code is available for download from http://jidisun.wix.com/osaqa-project/. During program execution, QA results are logged for immediate review and are also exported to a spreadsheet for long-term machine performance reporting. For the automatic contrast QA test, a user-specific contrast evaluation was designed to improve accuracy for individuals on different display monitors. American College of Radiology QA images were acquired over a period of two months to compare manual QA with the results from the proposed OSAQA software. OSAQA was found to reduce the QA time from approximately 45 min to 2 min. The manual and OSAQA results agreed with regard to the recommended criteria, and the differences were insignificant compared to the criteria. The intensity homogeneity filter is necessary to obtain an image of acceptable quality while keeping the high-contrast spatial resolution within the recommended criterion. The OSAQA tool has been validated on scanners of different field strengths and from different manufacturers. A number of suggestions have been made to improve both the phantom design and the QA protocol in the future.

  13. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    International Nuclear Information System (INIS)

    Zaffino, Paolo; Spadea, Maria Francesca; Raudaschl, Patrik; Fritscher, Karl; Sharp, Gregory C.

    2016-01-01

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Because of its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, a software package with no restrictions on anatomical district or image modality has been missing. In this paper we introduce PLASTIMATCH MABS, an open source software that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch using the parameters identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy, and magnetic resonance brain image labeling for neuroscience investigation. Results: For the neurological study, the minimum Dice coefficient was 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, the minimum Dice was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). The time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both commercial and research) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparing against other segmentation algorithms.
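PLASTIMATCH MABS supports several voting criteria; the simplest of these, per-voxel majority voting over the registered atlas labels, together with the Dice overlap metric used for evaluation, can be sketched on toy one-dimensional label maps as follows:

```python
from collections import Counter

def majority_vote(label_maps):
    """Fuse per-atlas label maps (same shape, here flattened to lists)
    by per-voxel majority vote, the simplest MABS voting criterion."""
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*label_maps)]

def dice(a, b, label):
    """Dice overlap for one label, the evaluation metric quoted above."""
    inter = sum(1 for x, y in zip(a, b) if x == y == label)
    return 2 * inter / (sum(1 for x in a if x == label) +
                        sum(1 for y in b if y == label))

# Three registered atlases propose labels for five voxels (toy data).
atlas_a = [0, 1, 1, 2, 0]
atlas_b = [0, 1, 2, 2, 0]
atlas_c = [1, 1, 1, 2, 2]
fused = majority_vote([atlas_a, atlas_b, atlas_c])
```

In the real tool the label maps are 3-D volumes produced by deformable registration of each atlas onto the new patient, and weighted or locally adaptive voting can replace the plain majority rule.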

  14. MAISTAS: a tool for automatic structural evaluation of alternative splicing products.

    KAUST Repository

    Floris, Matteo

    2011-04-15

    MOTIVATION: Analysis of the human genome revealed that the amount of transcribed sequence is an order of magnitude greater than the number of predicted and well-characterized genes. A sizeable fraction of these transcripts is related to alternatively spliced forms of known protein coding genes. Inspection of the alternatively spliced transcripts identified in the pilot phase of the ENCODE project has clearly shown that their structure often differs substantially from that of other isoforms of the same gene, and therefore that they might perform unrelated functions, or might not even correspond to a functional protein. Identifying these cases is obviously relevant for the functional assignment of gene products and for the interpretation of the effect of variations in the corresponding proteins. RESULTS: Here we describe a publicly available tool that, given a gene or a protein, retrieves and analyses all its annotated isoforms, provides users with three-dimensional models of the isoform(s) of interest whenever possible, and automatically assesses whether the homology-derived structural models correspond to plausible structures. This information is clearly relevant: when the homology model of some isoforms of a gene does not seem structurally plausible, the implication is that either they assume a structure unrelated to that of the other isoforms of the same gene, with presumably significant functional differences, or they do not correspond to functional products. We provide indications that the second hypothesis is likely to be true for a substantial fraction of the cases. AVAILABILITY: http://maistas.bioinformatica.crs4.it/.

  15. Technical Note: plastimatch mabs, an open source tool for automatic image segmentation.

    Science.gov (United States)

    Zaffino, Paolo; Raudaschl, Patrik; Fritscher, Karl; Sharp, Gregory C; Spadea, Maria Francesca

    2016-09-01

Multiatlas based segmentation (MABS) is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, a software tool without restrictions on the anatomical district and image modality is still missing. In this paper we introduce plastimatch mabs, an open source software that can be used with any image modality for automatic segmentation. The plastimatch mabs workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch using the parameters identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy, and magnetic resonance brain labeling for neuroscience investigation. For the neurological study, the minimum Dice coefficient was 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, the minimum Dice was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the others (mandible, brainstem, and parotid glands). The time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min). The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both commercial and research) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparing against other segmentation algorithms.
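
    The label-fusion (voting) step of a MABS pipeline can be illustrated with a minimal sketch. The per-voxel majority vote below is a generic fusion rule, not plastimatch mabs's actual implementation, and the atlas label lists are made-up:

```python
from collections import Counter

def majority_vote(label_maps):
    """Fuse per-voxel labels from several registered atlases by majority vote."""
    fused = []
    for votes in zip(*label_maps):               # one tuple of candidate labels per voxel
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused

# three hypothetical atlases, each proposing a label per voxel (0 = background)
atlas_a = [0, 1, 1, 2]
atlas_b = [0, 1, 2, 2]
atlas_c = [1, 1, 1, 2]
print(majority_vote([atlas_a, atlas_b, atlas_c]))  # [0, 1, 1, 2]
```

    Weighted variants (e.g. by registration similarity) replace the plain count with per-atlas weights.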

  16. Aerodynamic design applying automatic differentiation and using robust variable fidelity optimization

    Science.gov (United States)

    Takemiya, Tetsushi

, and that (2) the AMF terminates optimization erroneously when the optimization problems have constraints. The first problem is due to inaccuracy in computing derivatives in the AMF, and the second is due to erroneous treatment of the trust region ratio, which sets the size of the domain for an optimization in the AMF. In order to solve the first problem, the automatic differentiation (AD) technique, which reads the code of an analysis model and automatically generates new derivative code based on mathematical rules, is applied. If derivatives are computed with the generated derivative code, they are analytical, and the required computational time is independent of the number of design variables, which is very advantageous for realistic aerospace engineering problems. However, if analysis models implement iterative computations such as computational fluid dynamics (CFD), which solves systems of partial differential equations iteratively, computing derivatives through AD requires massive amounts of memory. The author addressed this deficiency by modifying the AD approach and developing a more efficient implementation for CFD, and successfully applied AD to general CFD software. In order to solve the second problem, the governing equation of the trust region ratio, which is very strict against the violation of constraints, is modified so that it accepts violations of constraints within some tolerance. By accepting violations of constraints during the optimization process, the AMF can continue optimization without terminating prematurely and eventually find the true optimum design point. With these modifications, the AMF is referred to as "Robust AMF," and it is applied to airfoil and wing aerodynamic design problems using Euler CFD software. The former problem has 21 design variables, and the latter 64. In both problems, derivatives computed with the proposed AD method are first compared with those computed with the finite
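
    Forward-mode AD, the chain-rule propagation this abstract relies on, can be sketched with dual numbers. This is a generic illustration of the technique, not the author's CFD-scale implementation:

```python
import math

class Dual:
    """Forward-mode AD value: (val, der) propagated together by the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule for an elementary function: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

x = Dual(2.0, 1.0)        # seed the independent variable with dx/dx = 1
y = x * x + 3 * x         # f(x) = x^2 + 3x, evaluated at x = 2
print(y.der)              # f'(2) = 2*2 + 3 = 7.0
```

    The derivative is exact to roundoff: no symbolic expression is built and no finite-difference step size is chosen.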

  17. An evolutionary computation based algorithm for calculating solar differential rotation by automatic tracking of coronal bright points

    Science.gov (United States)

    Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.

    2016-03-01

Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as the Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantage of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population-based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image and video analysis. The proposed tool, denoted the PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm for calculating the sidereal rotational angular velocity of the solar corona. To validate the results, we compare them with published manual measurements performed by an expert.
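
    The PSO core of such a hybrid tracker can be sketched in one dimension. This is a generic PSO, not the paper's PSO-Snake code; the swarm size and coefficients are illustrative assumptions:

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=100, seed=1):
    """Minimal particle swarm: particles move toward personal and global bests."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]   # positions
    vs = [0.0] * n_particles                                 # velocities
    pb = xs[:]                                               # personal bests
    pb_val = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pb_val[i])
    gb, gb_val = pb[g], pb_val[g]                            # global best
    w, c1, c2 = 0.7, 1.5, 1.5                                # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = w * vs[i] + c1 * r1 * (pb[i] - xs[i]) + c2 * r2 * (gb - xs[i])
            xs[i] += vs[i]
            v = f(xs[i])
            if v < pb_val[i]:
                pb[i], pb_val[i] = xs[i], v
                if v < gb_val:
                    gb, gb_val = xs[i], v
    return gb, gb_val

best, best_val = pso_minimize(lambda x: (x - 3.0) ** 2, -10.0, 10.0)
print(round(best, 2))   # close to the true minimizer x = 3
```

    In PSO-Snake-style use, the objective would instead score a snake's energy around a candidate feature boundary.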

  18. Deep Learning as a Tool for Automatic Segmentation of Corneal Endothelium Images

    Directory of Open Access Journals (Sweden)

    Karolina Nurzynska

    2018-03-01

Full Text Available The automatic analysis of the state of the corneal endothelium is of much interest in ophthalmology. Until now, several manual and semi-automatic methods have been introduced, but fully automatic segmentation of cells in the endothelium remains an open problem. This work addresses the problem of automatic delineation of cells in corneal endothelium images and proposes using a convolutional neural network (CNN) to classify between cell center, cell body, and cell border in order to achieve precise segmentation. Additionally, a method to automatically select and split merged cells is given. In order to skeletonize the result, the best-fit method is used. The achieved outcomes are compared to manual annotations in order to measure their mutual overlap. The Dice index, Jaccard coefficient, modified Hausdorff distance, and several other metrics for mosaic overlap are used. As a final check-up, a visual inspection is performed. The experiments revealed the best architecture for the CNN. The correctness and precision of the segmentation were evaluated on the Endothelial Cell “Alizarine” dataset. According to the Dice index and Jaccard coefficient, the automatically achieved cell delineation overlaps the original one with 93% precision, while the modified Hausdorff distance of 0.14 pixels indicates very high accuracy. These findings are confirmed by the other metrics and supported by the presented visual inspection of the achieved segmentations. To conclude, a methodology to achieve fully automatic delineation of cell boundaries in corneal endothelium images was presented. The segmentation obtained as a result of pixel classification with the CNN proved very precise.
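
    The overlap metrics named above are easy to state precisely; a minimal sketch on flat binary masks (the masks are made-up, not the study's data):

```python
def dice_jaccard(a, b):
    """Dice and Jaccard overlap of two binary masks given as flat 0/1 lists."""
    inter = sum(x & y for x, y in zip(a, b))     # |A ∩ B|
    sa, sb = sum(a), sum(b)
    dice = 2 * inter / (sa + sb)                 # 2|A∩B| / (|A| + |B|)
    jaccard = inter / (sa + sb - inter)          # |A∩B| / |A∪B|
    return dice, jaccard

automatic = [1, 1, 1, 0, 0, 1]
manual    = [1, 1, 0, 0, 1, 1]
d, j = dice_jaccard(automatic, manual)
print(round(d, 3), round(j, 3))   # 0.75 0.6
```

    Dice is always at least as large as Jaccard; the two are related by D = 2J / (1 + J).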

  19. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    International Nuclear Information System (INIS)

    Wu, C

    2016-01-01

Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed electronic treatment plan reporting software that enables fully automatic PDF reporting from Pinnacle TPS to external EMR programs such as MOSAIQ. The tool, named “plan2pdf”, is implemented using Pinnacle scripts, Java and UNIX shell scripts, with no external program needed. plan2pdf supports a full auto mode and a manual reporting mode. In full auto mode, with a single mouse click, plan2pdf generates a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, Pinnacle plan summary, orthogonal views through each plan POI and the maximum dose point, a DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and DVH and scorecard windows. The final PDF report is automatically bookmarked for each section above for convenient plan review. It can either be saved in a user-specified folder on Pinnacle, or automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients’ plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  20. Automatic detection and treatment of oscillatory and/or stiff ordinary differential equations

    Energy Technology Data Exchange (ETDEWEB)

    Gear, C. W.

    1980-06-01

    The next generation of ODE software can be expected to detect special problems and to adapt to their needs. The low-cost, automatic detection of oscillatory behavior, the determination of its period, and methods for its subsequent efficient integration are addressed here, along with stiffness detection. In the first phase, the method for oscillatory problems discussed examines the output of any integrator to determine if the output is nearly periodic. At the point this answer is positive, the second phase is entered and an automatic, nonstiff, multirevolutionary method is invoked. This requires the occasional solution of a nearly periodic initial-value problem over one period by a standard method and the re-determination of its period. Because the multirevolutionary method uses a very large step, the problem has a high probability of being stiff in this second phase. Hence, it is important to detect if stiffness is present so that an appropriate stiff, multirevolutionary method can be selected. 6 figures.
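
    Phase one, deciding whether an integrator's output is nearly periodic and estimating its period, can be sketched by scanning for the lag at which the signal best matches a shifted copy of itself. This is a toy illustration of the idea, not Gear's actual detector:

```python
import math

def estimate_period(x, min_lag, max_lag):
    """Return (lag, mismatch): the lag minimizing the mean squared
    difference between x and x shifted by that lag."""
    best_lag, best_err = None, float("inf")
    for lag in range(min_lag, max_lag + 1):
        n = len(x) - lag
        err = sum((x[i] - x[i + lag]) ** 2 for i in range(n)) / n
        if err < best_err:
            best_lag, best_err = lag, err
    return best_lag, best_err

# a signal with an exact period of 50 samples
signal = [math.sin(2 * math.pi * i / 50) for i in range(400)]
lag, err = estimate_period(signal, 10, 80)
print(lag)   # 50; err near zero signals near-periodicity
```

    A small residual mismatch at the best lag is the "nearly periodic" test; the lag itself seeds the multirevolutionary method's period.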

  1. Modeling and monitoring of pipelines and networks advanced tools for automatic monitoring and supervision of pipelines

    CERN Document Server

    Torres, Lizeth

    2017-01-01

    This book focuses on the analysis and design of advanced techniques for on-line automatic computational monitoring of pipelines and pipe networks. It discusses how to improve the systems’ security considering mathematical models of the flow, historical flow rate and pressure data, with the main goal of reducing the number of sensors installed along a pipeline. The techniques presented in the book have been implemented in digital systems to enhance the abilities of the pipeline network’s operators in recognizing anomalies. A real leak scenario in a Mexican water pipeline is used to illustrate the benefits of these techniques in locating the position of a leak. Intended for an interdisciplinary audience, the book addresses researchers and professionals in the areas of mechanical, civil and control engineering. It covers topics on fluid mechanics, instrumentation, automatic control, signal processing, computing, construction and diagnostic technologies.

  2. New sonorities for early jazz recordings using sound source separation and automatic mixing tools

    OpenAIRE

    Matz, Daniel; Cano, Estefanía; Abeßer, Jakob

    2015-01-01

    In this paper, a framework for automatic mixing of early jazz recordings is presented. In particular, we propose the use of sound source separation techniques as a preprocessing step of the mixing process. In addition to an initial solo and accompaniment separation step, the proposed mixing framework is composed of six processing blocks: harmonic-percussive separation (HPS), cross-adaptive multi-track scaling (CAMTS), cross-adaptive equalizer (CAEQ), cross-adaptive dynamic spectral panning (C...

  3. A tool to automatically analyze electromagnetic tracking data from high dose rate brachytherapy of breast cancer patients

    Science.gov (United States)

    Lahmer, G.; Strnad, V.; Bert, Ch.; Hensel, B.; Tomé, A. M.; Lang, E. W.

    2017-01-01

During High Dose Rate Brachytherapy (HDR-BT), the spatial position of the radiation source inside catheters implanted into a female breast is determined via electromagnetic tracking (EMT). Dwell positions and dwell times of the radiation source are established, relative to the patient’s anatomy, from an initial X-ray CT image. During the irradiation treatment, catheter displacements can occur due to patient movements. The current study develops an automatic analysis tool for EMT data sets recorded with a solenoid sensor, to assure concordance of the source movement with the treatment plan. The tool combines machine learning techniques such as multi-dimensional scaling (MDS), ensemble empirical mode decomposition (EEMD), singular spectrum analysis (SSA) and particle filters (PF) to precisely detect and quantify any mismatch between the treatment plan and actual EMT measurements. We demonstrate that movement artifacts as well as technical signal distortions can be removed automatically and reliably, resulting in artifact-free reconstructed signals. This is a prerequisite for a highly accurate determination of any deviations of dwell positions from the treatment plan. PMID:28934238

  4. Automatic differentiation of u- and n-serrated patterns in direct immunofluorescence images

    NARCIS (Netherlands)

    Shi, Chenyu; Guo, Jiapan; Azzopardi, George; Meijer, Joost; Jonkman, Marcel F.; Petkov, Nicolai

    2015-01-01

    Epidermolysis bullosa acquisita (EBA) is a subepidermal autoimmune blistering disease of the skin. Manual u- and n-serrated patterns analysis in direct immunofluorescence (DIF) images is used in medical practice to differentiate EBA from other forms of pemphigoid. The manual analysis of serration

  5. Integrating the Revised Bloom's Taxonomy With Multiple Intelligences: A Planning Tool for Curriculum Differentiation

    Science.gov (United States)

    Noble, Toni

    2004-01-01

    Both the special education and gifted education literature call for a differentiated curriculum to cater for the wide range of student differences in any classroom. Gardner's theory of multiple intelligences was integrated with the revised Bloom's taxonomy to provide a planning tool for curriculum differentiation. Teachers' progress in using the…

  6. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    Science.gov (United States)

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. An integrated self-acting calibration function allows fast, parallel calibration for several metabolites. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
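
    The calibration idea, fitting a line from standards and applying it to unknowns, can be sketched as follows. The peak areas and concentrations are made-up; MetaQuant's actual routine is not shown here:

```python
def fit_calibration(areas, concs):
    """Least-squares line conc = m * area + b from calibration standards."""
    n = len(areas)
    ma = sum(areas) / n
    mc = sum(concs) / n
    m = sum((a - ma) * (c - mc) for a, c in zip(areas, concs)) / \
        sum((a - ma) ** 2 for a in areas)
    b = mc - m * ma
    return m, b

# hypothetical standards: GC/MS peak area -> known concentration (µM)
areas = [100.0, 200.0, 400.0, 800.0]
concs = [5.0, 10.0, 20.0, 40.0]
m, b = fit_calibration(areas, concs)
print(round(m, 3))                 # 0.05
print(round(m * 300.0 + b, 1))     # 15.0 µM for an unknown with peak area 300
```

    Repeating this fit per metabolite is what "parallel calibration for several metabolites" amounts to.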

  7. A New Internet Tool for Automatic Evaluation in Control Systems and Programming

    Science.gov (United States)

    Munoz de la Pena, D.; Gomez-Estern, F.; Dormido, S.

    2012-01-01

    In this paper we present a web-based innovative education tool designed for automating the collection, evaluation and error detection in practical exercises assigned to computer programming and control engineering students. By using a student/instructor code-fusion architecture, the conceptual limits of multiple-choice tests are overcome by far.…

  8. JUST (Java User Segmentation Tool) for semi-automatic segmentation of tomographic maps.

    Science.gov (United States)

    Salvi, Eleonora; Cantele, Francesca; Zampighi, Lorenzo; Fain, Nick; Pigino, Gaia; Zampighi, Guido; Lanzavecchia, Salvatore

    2008-03-01

We present a program for interactive segmentation of tomographic maps, based on objective criteria so as to yield reproducible results. The strategy starts with the automatic segmentation of the entire volume with the watershed algorithm in 3D. The watershed regions are clustered successively by supervised classification, allowing the segmentation of known organelles, such as membranes, vesicles and microtubules. These organelles are processed with topological models and input parameters manually derived from the tomograms. After known organelles are extracted from the volume, all other watershed regions can be organized into homogeneous assemblies on the basis of their densities. To complete the process, all voxels in the volume are assigned either to the background or to individual structures, which can then be extracted for visualization with any rendering technique. The user interface of the program is written in Java, and the computational routines are written in C. For some operations involving the visualization of the tomogram, we refer to existing software, either open or commercial. While the program runs, a history file is created that allows all parameters and other data to be saved for the purposes of comparison or exchange. Initially, the program was developed for the segmentation of synapses, and organelles belonging to these structures have thus far been the principal targets modeled with JUST. Since each organelle is clustered independently from the rest of the volume, however, the program can accommodate new models of different organelles as well as tomograms of other types of tissue preparations, such as cytoskeletal components in vitreous ice.

  9. Structured illumination microscopy and automatized image processing as a rapid diagnostic tool for podocyte effacement.

    Science.gov (United States)

    Siegerist, Florian; Ribback, Silvia; Dombrowski, Frank; Amann, Kerstin; Zimmermann, Uwe; Endlich, Karlhans; Endlich, Nicole

    2017-09-13

The morphology of podocyte foot processes is obligatory for renal function. Here we describe a method for the superresolution visualization of podocyte foot processes using structured illumination microscopy of the slit diaphragm, which before has only been achieved by electron microscopy. As a proof of principle, we measured a mean foot process width of 0.249 ± 0.068 µm in healthy kidneys and a significantly higher mean foot process width of 0.675 ± 0.256 µm in minimal change disease patients, indicating effacement of foot processes. We then hypothesized that the slit length per glomerular capillary surface area (slit diaphragm density) could be used as an equivalent measure for the diagnosis of effacement. Using custom-made software we measured a mean value of 3.10 ± 0.27 µm⁻¹ in healthy subjects and 1.83 ± 0.49 µm⁻¹ in the minimal change disease patients. As foot process width was highly correlated with slit diaphragm density (R² = 0.91), we concluded that our approach is a valid method for the diagnosis of foot process effacement. In summary, we present a new technique to quantify podocyte damage, which combines superresolution microscopy with automatized image processing. Due to its diverse advantages, we propose that this technique be included in routine diagnostics of glomerular histopathology.
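
    The two quantities and their correlation are straightforward to compute; a sketch with made-up per-capillary numbers (not the study's data):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical per-capillary measurements
slit_length_um = [310.0, 280.0, 150.0, 120.0]   # total slit length (µm)
area_um2       = [100.0, 90.0, 80.0, 70.0]      # capillary surface area (µm²)
density = [l / a for l, a in zip(slit_length_um, area_um2)]   # slit diaphragm density, µm⁻¹
fp_width = [0.25, 0.27, 0.60, 0.68]             # foot process width (µm)
r = pearson_r(fp_width, density)
print([round(d, 2) for d in density])
print(round(r, 2))   # strongly negative: wider processes, lower density
```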

  10. Validation of a Novel Digital Tool in Automatic Scoring of an Online ECG Examination at an International Cardiology Meeting.

    Science.gov (United States)

    Quinn, Kieran L; Crystal, Eugene; Lashevsky, Ilan; Arouny, Banafsheh; Baranchuk, Adrian

    2016-07-01

We have previously developed a novel digital tool capable of automatically recognizing correct electrocardiography (ECG) diagnoses in an online exam and demonstrated a significant improvement in diagnostic accuracy when utilizing an inductive-deductive reasoning strategy over a pattern recognition strategy. In this study, we sought to validate these findings with participants at the International Winter Arrhythmia School meeting, one of the foremost electrophysiology events in Canada. Preregistration for the event was sent by e-mail. The exam was administered on day 1 of the conference, and results and analysis were presented to participants the following morning. Twenty-five attendees completed the exam, providing a total of 500 responses to be marked. The online tool accurately identified 195 of a total of 395 correct responses (49%). In total, 305 responses required secondary manual review, of which 200 were added to the correct responses pool. The overall accuracy of correct ECG diagnosis for all participants was 69% when using the pattern recognition strategy and 84% when using the inductive-deductive strategy. Utilization of a novel digital tool to evaluate ECG competency can be set up as a workshop at international meetings or educational events, and results can be presented during the sessions to ensure immediate feedback. © 2015 Wiley Periodicals, Inc.

  11. Application of automatic change of interval to de Vogelaere's method of the solution of the differential equation y'' = f (x, y)

    International Nuclear Information System (INIS)

    Rogers, M.H.

    1960-11-01

    The paper gives an extension to de Vogelaere's method for the solution of systems of second order differential equations from which first derivatives are absent. The extension is a description of the way in which automatic change in step-length can be made to give a prescribed accuracy at each step. (author)
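
    The automatic change of interval can be illustrated with step doubling: compare one full step against two half steps, and halve or double the step-length to meet a prescribed per-step accuracy. The sketch below uses a velocity-Verlet step for y'' = f(x, y) rather than de Vogelaere's formulas, so only the step-control idea is shown:

```python
import math

def integrate_y2(f, x0, y0, v0, x_end, h=0.1, tol=1e-8):
    """Integrate y'' = f(x, y) with automatic step-length control."""
    def step(x, y, v, h):
        a = f(x, y)
        y1 = y + h * v + 0.5 * h * h * a
        v1 = v + 0.5 * h * (a + f(x + h, y1))
        return y1, v1

    x, y, v = x0, y0, v0
    while x_end - x > 1e-12:
        h = min(h, x_end - x)
        y_full, v_full = step(x, y, v, h)            # one full step
        ym, vm = step(x, y, v, h / 2)                # two half steps
        y_half, v_half = step(x + h / 2, ym, vm, h / 2)
        if abs(y_half - y_full) > tol:
            h *= 0.5                                 # reject: refine the interval
            continue
        x, y, v = x + h, y_half, v_half              # accept the finer result
        if abs(y_half - y_full) < tol / 10:
            h *= 2.0                                 # comfortably accurate: grow
    return y, v

# y'' = -y with y(0) = 1, y'(0) = 0 has the exact solution y = cos(x)
y_pi, _ = integrate_y2(lambda x, y: -y, 0.0, 1.0, 0.0, math.pi)
print(abs(y_pi - math.cos(math.pi)) < 1e-3)   # True
```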

  12. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, automatic discrimination is performed between local earthquakes occurring within the network and regional/teleseismic events occurring outside it. Finally, for the largest events, if a consistent number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis, and it may represent a relevant tool not only for seismologists, but also for non
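
    The moment-magnitude step can be illustrated with the standard Hanks-Kanamori relation Mw = (2/3)(log₁₀ M₀ − 9.1) for M₀ in N·m. This is the textbook formula; the spectral inversion that produces M₀ in the tool is not shown:

```python
import math

def moment_magnitude(m0_newton_meters):
    """Moment magnitude from seismic moment via the Hanks-Kanamori relation."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

print(round(moment_magnitude(1.0e18), 2))   # 5.93
```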

  13. Strategy proposed by Electricite de France in the development of automatic tools

    Energy Technology Data Exchange (ETDEWEB)

    Castaing, C.; Cazin, B. [Electricite de France, Noisy le grand (France)

    1995-03-01

The strategy proposed by EDF in the development of means to limit personal and collective dosimetry is recent. It follows a policy that consisted of developing remote operation means for inspection and maintenance activities on the reactor, pool bottoms, steam generators (SGs) and reactor building valves: target activities because of their high dosimetric cost. One of the main duties of the UTO (Technical Support Department) within EDF is the maintenance of Pressurized Water Reactors in French Nuclear Power Plant Operations (consisting of 54 units) and the development and monitoring of specialized tools. To achieve this, the UTO has started a national think-tank on the implementation of the ALARA process in its field of activity and created an ALARA Committee responsible for running and monitoring it, as well as a policy for developing tools. This point will be illustrated in the second part, on reactor vessel heads.

  14. Extending a User Interface Prototyping Tool with Automatic MISRA C Code Generation

    Directory of Open Access Journals (Sweden)

    Gioacchino Mauro

    2017-01-01

    Full Text Available We are concerned with systems, particularly safety-critical systems, that involve interaction between users and devices, such as the user interface of medical devices. We therefore developed a MISRA C code generator for formal models expressed in the PVSio-web prototyping toolkit. PVSio-web allows developers to rapidly generate realistic interactive prototypes for verifying usability and safety requirements in human-machine interfaces. The visual appearance of the prototypes is based on a picture of a physical device, and the behaviour of the prototype is defined by an executable formal model. Our approach transforms the PVSio-web prototyping tool into a model-based engineering toolkit that, starting from a formally verified user interface design model, will produce MISRA C code that can be compiled and linked into a final product. An initial validation of our tool is presented for the data entry system of an actual medical device.

  15. Automatic Tool Selection in V-bending Processes by Using an Intelligent Collision Detection Algorithm

    Science.gov (United States)

    Salem, A. A.

    2017-09-01

V-bending is widely used to produce sheet metal components. Global changes in the shape of a sheet metal component occur during progressive bending processes; accordingly, collisions may occur between the part and the tool during bending. Being collision-free is one of the feasibility conditions of V-bending process planning, in which a tool selection is verified by the absence of collisions. This paper proposes an intelligent collision detection algorithm which has the ability to distinguish between 2D bent parts and other bent parts. Due to this ability, separate 2D and 3D collision detection subroutines have been developed in the proposed algorithm. This division of the algorithm’s subroutines reduces the computational operations required during collision detection.
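
    A broad-phase test between axis-aligned bounding boxes shows the flavor of such checks; a generic sketch via the separating-axis idea, not the paper's intelligent algorithm:

```python
def boxes_collide(a, b):
    """a, b: axis-aligned boxes as (xmin, ymin, xmax, ymax).
    They collide unless they are separated along some axis."""
    return not (a[2] <= b[0] or b[2] <= a[0] or   # separated along x
                a[3] <= b[1] or b[3] <= a[1])     # separated along y

# hypothetical tool and part footprints in the press plane
tool       = (0.0, 0.0, 2.0, 2.0)
part_clear = (3.0, 0.0, 5.0, 2.0)
part_hit   = (1.5, 1.0, 4.0, 3.0)
print(boxes_collide(tool, part_clear), boxes_collide(tool, part_hit))  # False True
```

    A planner would run this cheap test first and fall back to exact 2D or 3D geometry checks only for candidate pairs that overlap.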

  16. A tool for automatic generation of RTL-level VHDL description of RNS FIR filters

    DEFF Research Database (Denmark)

    Re, Andrea Del; Nannarelli, Alberto; Re, Marco

    2004-01-01

Although digital filters based on the Residue Number System (RNS) show high performance and low power dissipation, RNS filters are not widely used in DSP systems because of the complexity of the algorithms involved. We present a tool to design RNS FIR filters which hides the RNS algorithms from the designer and generates a synthesizable VHDL description of the filter, taking into account several design constraints such as delay, area and energy.
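
    The arithmetic such a tool hides can be sketched in a few lines: channel-wise, carry-free operations on residues, with a Chinese Remainder Theorem conversion back to binary. The moduli are chosen for illustration only:

```python
MODULI = (7, 11, 13)                 # pairwise coprime; dynamic range 7*11*13 = 1001

def to_rns(x):
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    """Channel-wise multiply with no carries between channels —
    the core advantage exploited by RNS FIR filters."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(r):
    """Chinese Remainder Theorem reconstruction of the binary value."""
    M = 1
    for m in MODULI:
        M *= m
    total = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        total += ri * Mi * pow(Mi, -1, mi)   # modular inverse of Mi mod mi
    return total % M

a, b = 23, 41
print(from_rns(rns_mul(to_rns(a), to_rns(b))))   # 943 = 23 * 41
```

    In a filter, each multiply-accumulate runs independently per modulus, so the channels are short-word-length and carry-free; only the final output is converted back.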

  17. A Tool for Automatic Verification of Real-Time Expert Systems

    Science.gov (United States)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  18. GIS (Geographic Information Systems) based automatic tool for selection of gas pipeline corridors

    Energy Technology Data Exchange (ETDEWEB)

    Matos, Denise F.; Menezes, Paulo Cesar P.; Paz, Luciana R.L.; Garcia, Katia C.; Cruz, Cristiane B.; Pires, Silvia H.M.; Damazio, Jorge M.; Medeiros, Alexandre M.

    2009-07-01

This paper describes a methodology developed to build total accumulated surfaces in order to better select gas pipeline corridor alternatives. The methodology is based on the minimization of negative impacts and the use of Geographic Information Systems (GIS), allowing an automatic method for the construction, evaluation and selection of alternatives that will contribute to the decision making process. It is important to emphasize that this paper follows the assumptions presented in the research reports of a project sponsored by the Ministry of Mines and Energy (MME) and elaborated at the Electric Power Research Center (CEPEL), called 'Development of a Geographic Information System for the Oil and Gas Sectors in Brazil', as well as the studies of the GTW (Gas to Wire) Project. Gas pipelines, due to their linear characteristic, may cross a variety of habitats and settlements, increasing the complexity of their environmental management. Considering this reality, this paper presents a methodology that takes into account different environmental criteria (layers), according to the area impacted. The total accumulated surface is obtained from the synthesis of these criteria. An example is shown of a hypothetical gas pipeline connection between two points using the total accumulated surface. To select the 'impact scores' of the features, the gas pipeline was considered as a linear feature, but the result is a region formed by pixels, each with an accumulated impact score lower than some arbitrary measure. This region is called a 'corridor', and it is the final result obtained using the proposed methodology. (author)
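
    Selecting a cheap route over a total accumulated surface is, at its core, a least-cost path problem; a grid sketch using Dijkstra's algorithm (the impact scores are made-up, not the paper's criteria):

```python
import heapq

def min_cost_path(grid, start, goal):
    """Dijkstra on a 4-connected grid; path cost = sum of impact scores
    of the cells entered (the start cell is free)."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist[(r, c)]:
            continue                         # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return None

impact = [[1, 9, 1],
          [1, 9, 1],
          [1, 1, 1]]
print(min_cost_path(impact, (0, 0), (0, 2)))   # 6: the cheap route skirts the 9s
```

    A corridor, rather than a single path, is obtained by keeping every cell whose accumulated score stays below a chosen threshold.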

  19. Finger tapping movements of Parkinson's disease patients automatically rated using nonlinear delay differential equations.

    Science.gov (United States)

    Lainscsek, C; Rowat, P; Schettino, L; Lee, D; Song, D; Letellier, C; Poizner, H

    2012-03-01

    Parkinson's disease is a degenerative condition whose severity is assessed by clinical observations of motor behaviors. These are performed by a neurological specialist through subjective ratings of a variety of movements including 10-s bouts of repetitive finger-tapping movements. We present here an algorithmic rating of these movements which may be beneficial for uniformly assessing the progression of the disease. Finger-tapping movements were digitally recorded from Parkinson's patients and controls, obtaining one time series for every 10-s bout. A nonlinear delay differential equation, whose structure was selected using a genetic algorithm, was fitted to each time series and its coefficients were used as a six-dimensional numerical descriptor. The algorithm was applied to time series from two different groups of Parkinson's patients and controls. The algorithmic scores compared favorably with the unified Parkinson's disease rating scale scores, at least when the latter adequately matched with ratings from the Hoehn and Yahr scale. Moreover, when the two sets of mean scores for all patients are compared, there is a strong (r = 0.785) and significant (p<0.0015) correlation between them.
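The fitting step can be illustrated with a least-squares estimate of DDE coefficients. The six-term model below is an assumed example (the paper's actual structure was chosen by a genetic algorithm and is not given in the abstract); only the idea of fitting a model that is linear in its coefficients and using those coefficients as a descriptor is taken from the text.

```python
import numpy as np

def fit_dde_coefficients(x, dt, tau1, tau2):
    """Least-squares fit of a DDE that is linear in its coefficients
    (hypothetical form):
        x'(t) = a1*x(t-tau1) + a2*x(t-tau2) + a3*x(t-tau1)**2
              + a4*x(t-tau1)*x(t-tau2) + a5*x(t-tau2)**2 + a6
    Returns the six coefficients, usable as a numerical descriptor."""
    d1, d2 = round(tau1 / dt), round(tau2 / dt)   # delays in samples
    lag = max(d1, d2)
    # central-difference estimate of the derivative; dx[i] is x'(i+1)
    dx = (x[2:] - x[:-2]) / (2 * dt)
    t = np.arange(lag, len(x) - 1)                # valid sample indices
    x1, x2 = x[t - d1], x[t - d2]
    A = np.column_stack([x1, x2, x1**2, x1 * x2, x2**2, np.ones_like(x1)])
    coef, *_ = np.linalg.lstsq(A, dx[t - 1], rcond=None)
    return coef
```

Each recorded 10-s bout would yield one such six-dimensional coefficient vector, which can then be compared across patients and controls.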

  20. HClass: Automatic classification tool for health pathologies using artificial intelligence techniques.

    Science.gov (United States)

    Garcia-Chimeno, Yolanda; Garcia-Zapirain, Begonya

    2015-01-01

    The classification of subjects' pathologies enables rigour to be applied to the treatment of certain pathologies, as doctors on occasion juggle so many variables that they can end up confusing some illnesses with others. Thanks to Machine Learning techniques applied to a health-record database, it is possible to make such classifications using our algorithm, hClass. hClass performs a non-linear classification of either a supervised, non-supervised or semi-supervised type. The machine is configured using other techniques such as validation of the set to be classified (cross-validation), reduction in features (PCA) and committees for assessing the various classifiers. The tool is easy to use: the sample matrix and features that one wishes to classify, the number of iterations and the subjects who are going to be used to train the machine all need to be introduced as inputs. As a result, the success rate is shown either via a classifier or via a committee if one has been formed. A 90% success rate is obtained with the AdaBoost classifier and 89.7% in the case of a committee (comprising three classifiers) when PCA is applied. This tool can be expanded to allow the user to fully characterise the classifiers by adjusting them to each classification use.
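The workflow described, feature reduction with PCA, cross-validation, and a committee of classifiers, can be sketched with scikit-learn. This is a hypothetical stand-in, not the hClass code; the synthetic data and the particular classifier choices are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in for a health-record feature matrix
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

committee = VotingClassifier([          # a three-classifier committee
    ("ada", AdaBoostClassifier(random_state=0)),
    ("svm", SVC(probability=True, random_state=0)),
    ("tree", DecisionTreeClassifier(random_state=0)),
], voting="soft")

# feature reduction (PCA) feeding the committee, scored by cross-validation
model = make_pipeline(PCA(n_components=10), committee)
scores = cross_val_score(model, X, y, cv=5)   # success rate per fold
print(round(scores.mean(), 3))
```

The mean cross-validated score plays the role of the "success rate" reported in the abstract, either for a single classifier or for the committee as a whole.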

  1. An Interactive Tool for Automatic Predimensioning and Numerical Modeling of Arch Dams

    Directory of Open Access Journals (Sweden)

    D. J. Vicente

    2017-01-01

    Full Text Available The construction of double-curvature arch dams is an attractive solution from an economic viewpoint due to the reduced volume of concrete necessary for their construction as compared to conventional gravity dams. Due to their complex geometry, many criteria have arisen for their design. However, the most widespread methods are based on recommendations in traditional technical documents that do not take into account the possibilities of computer-aided design. In this paper, an innovative software tool to design FEM models of double-curvature arch dams is presented. Several capabilities are provided: simplified geometry creation (interesting for academic purposes), preliminary geometrical design, high-detail model construction, and stochastic calculation (introducing uncertainty associated with material properties and other parameters). This paper focuses especially on geometrical issues, describing the functionalities of the tool and the fundamentals of the design procedure with regard to the following aspects: topography, reference cylinder, excavation depth, crown cantilever thickness and curvature, horizontal arch curvature, excavation and concrete mass volume, and additional elements such as joints or spillways. Examples of application to two Spanish dams are presented and the results obtained are analyzed.

  2. A pre-clinical assessment of an atlas-based automatic segmentation tool for the head and neck.

    Science.gov (United States)

    Sims, Richard; Isambert, Aurelie; Grégoire, Vincent; Bidault, François; Fresco, Lydia; Sage, John; Mills, John; Bourhis, Jean; Lefkopoulos, Dimitri; Commowick, Olivier; Benkebil, Mehdi; Malandain, Grégoire

    2009-12-01

    Accurate conformal radiotherapy treatment requires manual delineation of target volumes and organs at risk (OAR) that is both time-consuming and subject to large inter-user variability. One solution is atlas-based automatic segmentation (ABAS) where a priori information is used to delineate various organs of interest. The aim of the present study is to establish the accuracy of one such tool for the head and neck (H&N) using two different evaluation methods. Two radiotherapy centres were provided with an ABAS tool that was used to outline the brainstem, parotids and mandible on several patients. The results were compared to manual delineations for the first centre (EM1) and reviewed/edited for the second centre (EM2), both of which were deemed equally valid gold standards. The contours were compared in terms of their volume, sensitivity and specificity, with the results being interpreted using the Dice similarity coefficient and a receiver operator characteristic (ROC) curve. Automatic segmentation typically took approximately 7 min for each patient on a standard PC. The results indicated that the atlas contour volume was generally within +/-1SD of each gold standard apart from the parotids for EM1 and brainstem for EM2, which were over- and under-estimated, respectively (within +/-2SD). The similarity of the atlas contours with their respective gold standard was satisfactory, with an average Dice coefficient for all OAR of 0.68+/-0.25 for EM1 and 0.82+/-0.13 for EM2. All data had satisfactory sensitivity and specificity, resulting in a favourable position in ROC space. These tests have shown that the ABAS tool exhibits satisfactory sensitivity and specificity for the OAR investigated. There is, however, a systematic over-segmentation of the parotids (EM1) and under-segmentation of the brainstem (EM2) that require careful review and editing in the majority of cases.
Such issues have been discussed with the software manufacturer and a revised version is due for release.
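The evaluation metrics used in this study, the Dice similarity coefficient plus per-voxel sensitivity and specificity, can be computed as follows. This is a generic sketch of the standard definitions, not the authors' scripts.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    2|A ∩ B| / (|A| + |B|); 1.0 means identical contours."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def sensitivity_specificity(pred, truth):
    """Per-voxel sensitivity (true-positive rate) and specificity
    (true-negative rate) of a predicted mask against a gold standard."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = np.logical_and(pred, truth).sum()
    tn = np.logical_and(~pred, ~truth).sum()
    return tp / truth.sum(), tn / (~truth).sum()
```

Plotting sensitivity against (1 - specificity) for each organ gives the position in ROC space used to judge the atlas contours.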

  3. A pre-clinical assessment of an atlas-based automatic segmentation tool for the head and neck

    International Nuclear Information System (INIS)

    Sims, Richard; Isambert, Aurelie; Gregoire, Vincent; Bidault, Francois; Fresco, Lydia; Sage, John; Mills, John; Bourhis, Jean; Lefkopoulos, Dimitri; Commowick, Olivier; Benkebil, Mehdi; Malandain, Gregoire

    2009-01-01

    Background and purpose: Accurate conformal radiotherapy treatment requires manual delineation of target volumes and organs at risk (OAR) that is both time-consuming and subject to large inter-user variability. One solution is atlas-based automatic segmentation (ABAS) where a priori information is used to delineate various organs of interest. The aim of the present study is to establish the accuracy of one such tool for the head and neck (H and N) using two different evaluation methods. Materials and methods: Two radiotherapy centres were provided with an ABAS tool that was used to outline the brainstem, parotids and mandible on several patients. The results were compared to manual delineations for the first centre (EM1) and reviewed/edited for the second centre (EM2), both of which were deemed equally valid gold standards. The contours were compared in terms of their volume, sensitivity and specificity, with the results being interpreted using the Dice similarity coefficient and a receiver operator characteristic (ROC) curve. Results: Automatic segmentation typically took ∼7 min for each patient on a standard PC. The results indicated that the atlas contour volume was generally within ±1SD of each gold standard apart from the parotids for EM1 and brainstem for EM2, which were over- and under-estimated, respectively (within ±2SD). The similarity of the atlas contours with their respective gold standard was satisfactory, with an average Dice coefficient for all OAR of 0.68 ± 0.25 for EM1 and 0.82 ± 0.13 for EM2. All data had satisfactory sensitivity and specificity, resulting in a favourable position in ROC space. Conclusions: These tests have shown that the ABAS tool exhibits satisfactory sensitivity and specificity for the OAR investigated. There is, however, a systematic over-segmentation of the parotids (EM1) and under-segmentation of the brainstem (EM2) that require careful review and editing in the majority of cases. Such issues have been discussed with the software manufacturer and a revised version is due for release.

  4. Design and Development of an Automatic Tool Changer for an Articulated Robot Arm

    Science.gov (United States)

    Ambrosio, H.; Karamanoglu, M.

    2014-07-01

    In the creative industries, the length of time between the ideation stage and the making of physical objects is decreasing due to the use of CAD/CAM systems and additive manufacturing. Natural anisotropic materials such as solid wood can also be transformed using CAD/CAM systems, but only with subtractive processes such as machining with CNC routers. Whilst some 3-axis CNC routing machines are affordable and widely available, more flexible 5-axis routing machines still represent too big an investment for small companies. Small refurbished articulated robots can be a cheaper alternative, but they require a light end-effector. This paper presents a new lightweight tool changer that converts a small 3 kg payload 6-DOF robot into a robot apprentice able to machine wood and similar soft materials.

  5. ProMoIJ: A new tool for automatic three-dimensional analysis of microglial process motility.

    Science.gov (United States)

    Paris, Iñaki; Savage, Julie C; Escobar, Laura; Abiega, Oihane; Gagnon, Steven; Hui, Chin-Wai; Tremblay, Marie-Ève; Sierra, Amanda; Valero, Jorge

    2018-04-01

    Microglia, the immune cells of the central nervous system, continuously survey the brain to detect alterations and maintain tissue homeostasis. The motility of microglial processes is indicative of their surveying capacity in normal and pathological conditions. The gold standard technique to study motility involves the use of two-photon microscopy to obtain time-lapse images from brain slices or the cortex of living animals. This technique generates four dimensionally-coded images which are analyzed manually using time-consuming, non-standardized protocols. Microglial process motility analysis is frequently performed using Z-stack projections with the consequent loss of three-dimensional (3D) information. To overcome these limitations, we developed ProMoIJ, a pack of ImageJ macros that perform automatic motility analysis of cellular processes in 3D. The main core of ProMoIJ is formed by two macros that assist the selection of processes, automatically reconstruct their 3D skeleton, and analyze their motility (process and tip velocity). Our results show that ProMoIJ presents several key advantages compared with conventional manual analysis: (1) reduces the time required for analysis, (2) is less sensitive to experimenter bias, and (3) is more robust to varying numbers of processes analyzed. In addition, we used ProMoIJ to demonstrate that commonly performed 2D analysis underestimates microglial process motility, to reveal that only cells adjacent to a laser injured area extend their processes toward the lesion site, and to demonstrate that systemic inflammation reduces microglial process motility. ProMoIJ is a novel, open-source, freely-available tool which standardizes and accelerates the time-consuming labor of 3D analysis of microglial process motility. © 2017 Wiley Periodicals, Inc.
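The core motility measurement, tip velocity from time-lapse 3D coordinates, and the reason Z-projections underestimate it can be sketched as follows. This is an illustrative computation, not ProMoIJ's ImageJ macro code.

```python
import numpy as np

def tip_velocity(tips, dt):
    """Mean tip velocity from time-lapse tip coordinates.

    tips : (n_frames, d) array of positions, d = 3 for x, y, z (e.g. in µm)
    dt   : time between frames (e.g. in min)
    """
    tips = np.asarray(tips, float)
    # Euclidean displacement of the tip between consecutive frames
    steps = np.linalg.norm(np.diff(tips, axis=0), axis=1)
    return steps.mean() / dt

def projected_velocity(tips, dt):
    """The same measurement on the Z-projection (x, y only). Dropping the
    z component can only shorten each step, so 2D analysis systematically
    underestimates 3D process motility."""
    return tip_velocity(np.asarray(tips, float)[:, :2], dt)
```

A process tip moving purely along z, for example, has zero projected velocity but nonzero 3D velocity, which is the bias the abstract quantifies.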

  6. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    Directory of Open Access Journals (Sweden)

    Baraka D. Sija

    2018-01-01

    Full Text Available A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Enough knowledge on undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools towards Protocol Reverse Engineering (PRE) and classifies them into four divisions: approaches that reverse engineer protocol finite state machines, protocol formats, or both protocol finite state machines and protocol formats, and approaches that focus directly on neither reverse engineering protocol formats nor protocol finite state machines. The efficiency of all approaches’ outputs based on their selected inputs is analyzed in general, along with appropriate reverse engineering input formats. Additionally, we present discussion and extended classification in terms of automated to manual approaches, known and novel categories of reverse engineered protocols, and a literature of reverse engineered protocols in relation to the seven-layer OSI (Open Systems Interconnection) model.

  7. I-AbACUS: a Reliable Software Tool for the Semi-Automatic Analysis of Invasion and Migration Transwell Assays.

    Science.gov (United States)

    Cortesi, Marilisa; Llamosas, Estelle; Henry, Claire E; Kumaran, Raani-Yogeeta A; Ng, Benedict; Youkhana, Janet; Ford, Caroline E

    2018-02-28

    The quantification of invasion and migration is an important aspect of cancer research, used both in the study of the molecular processes involved in this collection of diseases and the evaluation of the efficacy of new potential treatments. The transwell assay, while being one of the most widely used techniques for the evaluation of these characteristics, shows a high dependence on the operator's ability to correctly identify the cells and a low protocol standardization. Here we present I-AbACUS, a software tool specifically designed to aid the analysis of transwell assays that automatically and specifically recognizes cells in images of stained membranes and provides the user with a suggested cell count. A complete description of this instrument, together with its validation against the standard analysis technique for this assay is presented. Furthermore, we show that I-AbACUS is versatile and able to elaborate images containing cells with different morphologies and that the obtained results are less dependent on the operator and their experience. We anticipate that this instrument, freely available (Gnu Public Licence GPL v2) at www.marilisacortesi.com as a standalone application, could significantly improve the quantification of invasion and migration of cancer cells.

  8. SplitRacer - a semi-automatic tool for the analysis and interpretation of teleseismic shear-wave splitting

    Science.gov (United States)

    Reiss, Miriam Christina; Rümpker, Georg

    2017-04-01

    We present a semi-automatic, graphical user interface tool for the analysis and interpretation of teleseismic shear-wave splitting in MATLAB. Shear-wave splitting analysis is a standard tool to infer seismic anisotropy, which is often interpreted as due to lattice-preferred orientation of e.g. mantle minerals or shape-preferred orientation caused by cracks or alternating layers in the lithosphere, and hence provides a direct link to the earth's kinematic processes. The increasing number of permanent stations and temporary experiments results in comprehensive studies of seismic anisotropy world-wide. Their successive comparison with a growing number of global models of mantle flow further advances our understanding of the earth's interior. However, increasingly large data sets pose the inevitable question as to how to process them. Well-established routines and programs are accurate but often slow and impractical for analyzing a large amount of data. Additionally, shear-wave splitting results are seldom evaluated using the same quality criteria, which complicates a straightforward comparison. SplitRacer consists of several processing steps: i) download of data via FDSNWS; ii) direct reading of miniSEED files and an initial screening and categorizing of XKS-waveforms using a pre-set SNR threshold; iii) an analysis of the particle motion of selected phases and successive correction of the sensor misalignment based on the long axis of the particle motion; iv) splitting analysis of selected events: seismograms are first rotated into radial and transverse components, then the energy-minimization method is applied, which provides the polarization and delay time of the phase. To estimate errors, the analysis is done for different randomly-chosen time windows; v) joint-splitting analysis of all events for one station, where the energy content of all phases is inverted simultaneously. This allows one to decrease the influence of noise and to increase the robustness of the measurement.
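The energy-minimization step (iv) can be sketched as a grid search over fast-axis angle and delay time: the splitting is tentatively undone for each candidate pair, and the pair that minimizes the remaining transverse energy is kept. This is a simplified single-window sketch in the style of the Silver-and-Chan method, not SplitRacer's MATLAB code; angle step and shift range are arbitrary choices.

```python
import numpy as np

def splitting_parameters(north, east, baz_deg, dt_samp, max_shift=40):
    """Grid search for the fast-axis angle (degrees from north) and delay
    time that minimize transverse energy after reversing the splitting."""
    bb = np.deg2rad(baz_deg)
    best = (0, 0, np.inf)
    for phi_deg in range(0, 180, 2):
        phi = np.deg2rad(phi_deg)
        # rotate the horizontals into the candidate fast/slow frame
        fast = north * np.cos(phi) + east * np.sin(phi)
        slow = -north * np.sin(phi) + east * np.cos(phi)
        for shift in range(max_shift):
            s = np.roll(slow, -shift)          # advance the slow wave
            # rotate back to N/E and project onto the transverse direction
            n = fast * np.cos(phi) - s * np.sin(phi)
            e = fast * np.sin(phi) + s * np.cos(phi)
            energy = np.sum((n * np.sin(bb) - e * np.cos(bb)) ** 2)
            if energy < best[2]:
                best = (phi_deg, shift * dt_samp, energy)
    return best[0], best[1]
```

For an XKS phase polarized along the backazimuth, the correct (φ, δt) pair removes essentially all transverse energy, which is why the minimum identifies the splitting parameters.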

  9. SU-C-202-03: A Tool for Automatic Calculation of Delivered Dose Variation for Off-Line Adaptive Therapy Using Cone Beam CT

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, B; Lee, S; Chen, S; Zhou, J; Prado, K; D’Souza, W; Yi, B [University of Maryland School of Medicine, Baltimore, MD (United States)

    2016-06-15

    Purpose: Monitoring the delivered dose is an important task for adaptive radiotherapy (ART) and for determining the time to re-plan. A software tool which enables automatic delivered-dose calculation using cone-beam CT (CBCT) has been developed and tested. Methods: The tool consists of four components: a CBCT Collecting Module (CCM), a Plan Registration Module (PRM), a Dose Calculation Module (DCM), and an Evaluation and Action Module (EAM). The CCM is triggered periodically (e.g. every day at 1:00 AM) to search for newly acquired CBCTs of patients of interest and then export the DICOM files of the images and related registrations defined in ARIA, followed by triggering the PRM. The PRM imports the DICOM images and registrations and links the CBCTs to the related treatment plan of the patient in the planning system (RayStation V4.5, RaySearch, Stockholm, Sweden). A pre-determined CT-to-density table is automatically generated for dose calculation. The current version of the DCM uses a rigid registration which regards the treatment isocenter of the CBCT as the isocenter of the treatment plan. It then starts the dose calculation automatically. The EAM evaluates the plan using pre-determined plan evaluation parameters: PTV dose-volume metrics and critical organ doses. The tool has been tested for 10 patients. Results: Automatic plans are generated and saved in the order of the treatment dates in the Adaptive Planning module of the RayStation planning system, without any manual intervention. If the CTV dose deviates by more than 3%, both email and page alerts are sent to the patient's physician and physicist so that the case can be examined closely. Conclusion: The tool is capable of performing automatic dose tracking and of alerting clinicians when an action is needed. It is clinically useful for off-line adaptive therapy to catch any gross error. A practical way of determining the alarm level for OARs is under development.
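The 3% CTV action level described can be sketched as a simple relative-deviation check over the recalculated fractions. The helper names below are hypothetical, and the actual tool also evaluates PTV dose-volume metrics and critical organ doses.

```python
def check_dose_deviation(planned_ctv_dose, delivered_ctv_dose, threshold=0.03):
    """True if the recalculated dose deviates from the planned dose by more
    than the action level (3% in the tool described). Both values are any
    consistent CTV dose metric, e.g. in Gy."""
    deviation = abs(delivered_ctv_dose - planned_ctv_dose) / planned_ctv_dose
    return deviation > threshold

def alerts(planned, delivered_by_fraction, threshold=0.03):
    """1-based fraction numbers whose recalculated CTV dose would trigger
    an email/page alert."""
    return [i for i, d in enumerate(delivered_by_fraction, start=1)
            if check_dose_deviation(planned, d, threshold)]
```

In the tool itself this check runs after each nightly batch of CBCT-based dose recalculations, so clinicians only review the flagged cases.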

  10. Exosomes as biomimetic tools for stem cell differentiation: Applications in dental pulp tissue regeneration.

    Science.gov (United States)

    Huang, Chun-Chieh; Narayanan, Raghuvaran; Alapati, Satish; Ravindran, Sriram

    2016-12-01

    Achieving and maintaining safe and reliable lineage-specific differentiation of stem cells is important for clinical translation of tissue engineering strategies. In an effort to circumvent the multitude of problems arising from the usage of growth factors and growth factor delivery systems, we have explored the use of exosomes as biomimetic tools to induce stem cell differentiation. Working on the hypothesis that cell-type-specific exosomes can trigger lineage-specific differentiation of stem cells, we have evaluated the potential of exosomes derived from dental pulp cells cultured under growth and odontogenic differentiation conditions to induce odontogenic differentiation of naïve human dental pulp stem cells (DPSCs) and human bone marrow derived stromal cells (HMSCs) in vitro and in vivo. Results indicate that the exosomes can bind to matrix proteins such as type I collagen and fibronectin, enabling them to be tethered to biomaterials. The exosomes are endocytosed by both DPSCs and HMSCs in a dose-dependent and saturable manner via the caveolar endocytic mechanism and trigger the P38 mitogen-activated protein kinase (MAPK) pathway. In addition, the exosomes also trigger the increased expression of genes required for odontogenic differentiation. When tested in vivo in a tooth root slice model with DPSCs, the exosomes triggered regeneration of dental pulp-like tissue. However, our results indicate that exosomes isolated under odontogenic conditions are better inducers of stem cell differentiation and tissue regeneration. Overall, our results highlight the potential of exosomes as biomimetic tools to induce lineage-specific differentiation of stem cells. Our results also show the importance of considering the source and state of exosome donor cells before a choice is made for therapeutic applications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Design and evaluation of a slave manipulator with roll-pitch-roll wrist and automatic tool loading mechanism in telerobotic surgery.

    Science.gov (United States)

    Kim, Ki-Young; Lee, Jung-Ju

    2012-12-01

    As there is a shortage of scrub nurses in many hospitals, an automatic surgical tool exchanging mechanism that requires no human labour has been studied. Minimally invasive robotic surgeries (MIRS) also require scrub nurses. A surgical tool loading mechanism without a scrub nurse's assistance for MIRS is proposed. Many researchers have developed minimally invasive surgical instruments with a wrist joint that can be movable inside the abdomen. However, implementation of a distal rolling joint on a gripper is rare. To implement surgical tool exchange without a scrub nurse's assistance, a slave manipulator and a tool loader were developed to load and unload a surgical tool unit. A surgical tool unit with a roll-pitch-roll wrist was developed. Several experiments were performed to validate the effectiveness of the slave manipulator and the surgical tool unit. The slave manipulator and the tool loader were able to successfully unload and load the surgical tool unit without human assistance. The total duration of unloading and loading the surgical tool unit was 97 s. Motion-tracking experiments of the distal rolling joint were performed. The maximum positioning error of the step-input response was 2°. The advantage of the proposed slave manipulator and tool loader is that other robotic systems or human labour are not needed for surgical tool loading. The feasibility of the distal rolling joint in MIS is verified. Copyright © 2012 John Wiley & Sons, Ltd.

  12. A Simple Tool for Integration and Differentiation of Tabular Values in Microsoft Excel

    Science.gov (United States)

    Haugland, Ole Anton

    2011-12-01

    There are many software alternatives for analyzing experimental data in our physics teaching. I prefer to use Excel® because of its flexibility and widespread use elsewhere in our society. Whatever our students will work with in their future career, they almost certainly will have access to a spreadsheet. For a long time I have missed a tool for integrating and differentiating tabular values in Excel. For every new version I thought it would appear, but it did not. Such a tool could also be useful if you analyze data from sources other than your own experiment, for example, data from the Internet. Therefore, I have written a simple tool that can be integrated seamlessly into Excel as an add-in. It is written in Excel's powerful macro language, Microsoft Visual Basic for Applications. The tool can be downloaded online, and there are two versions of it: one for Excel 2003 and one for Excel 2007/2010.
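The two operations such an add-in performs on tabular values are, presumably, cumulative trapezoidal integration and finite-difference differentiation. A plain-Python sketch of those numerics (the add-in itself is written in VBA):

```python
def integrate_table(x, y):
    """Cumulative trapezoidal integral of tabular (x, y) values:
    out[i] is the integral of y from x[0] to x[i]."""
    area, out = 0.0, [0.0]
    for i in range(1, len(x)):
        area += 0.5 * (y[i] + y[i - 1]) * (x[i] - x[i - 1])
        out.append(area)
    return out

def differentiate_table(x, y):
    """Derivative estimates for tabular values: central differences at
    interior points, one-sided differences at the two ends."""
    n = len(x)
    d = [(y[1] - y[0]) / (x[1] - x[0])]
    d += [(y[i + 1] - y[i - 1]) / (x[i + 1] - x[i - 1]) for i in range(1, n - 1)]
    d.append((y[-1] - y[-2]) / (x[-1] - x[-2]))
    return d
```

Neither formula assumes equal spacing, which matters for experimental data logged at irregular intervals.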

  13. ALICE: A tool for automatic localization of intra-cranial electrodes for clinical and high-density grids.

    Science.gov (United States)

    Branco, Mariana P; Gaglianese, Anna; Glen, Daniel; Hermes, Dora; Saad, Ziad S; Petridou, Natalia; Ramsey, Nick F

    2017-10-31

    Electrocorticographic (ECoG) measurements require the accurate localization of implanted electrodes with respect to the subject's neuroanatomy. Electrode localization is particularly relevant to associate structure with function. Several procedures have attempted to solve this problem, namely by co-registering a post-operative computed tomography (CT) scan with a pre-operative magnetic resonance imaging (MRI) anatomy scan. However, this type of procedure requires a manual and time-consuming detection and transcription of the electrode coordinates from the CT volume scan and restricts the extraction of smaller high-resolution ECoG grid electrodes due to the downsampling of the CT. ALICE automatically detects electrodes on the post-operative high-resolution CT scan, visualizes them in a combined 2D and 3D volume space using AFNI and SUMA software, and then projects the electrodes on the individual's cortical surface rendering. The pipeline integrates the multiple-step method into a user-friendly GUI in Matlab®, thus providing an easy, automated and standard tool for ECoG electrode localization. ALICE was validated in 13 subjects implanted with clinical ECoG grids by comparing the calculated electrode center-of-mass coordinates with those computed using a commonly used method. A novel aspect of ALICE is the combined 2D-3D visualization of the electrodes on the CT scan and the option to also detect high-density ECoG grids. Feasibility was shown in 5 subjects and validated for 2 subjects. The ALICE pipeline provides fast and accurate detection, discrimination and localization of ECoG electrodes spaced as little as 4 mm apart. Copyright © 2017 Elsevier B.V. All rights reserved.
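The general detection idea, finding bright metal voxels on the post-operative CT and taking a center of mass per connected cluster, can be sketched with scipy.ndimage. This shows the generic approach, not the ALICE pipeline; the threshold value and synthetic volume are assumptions.

```python
import numpy as np
from scipy import ndimage

def locate_electrodes(ct, threshold):
    """Candidate electrode centers from a CT volume: threshold the metal
    voxels (far brighter than tissue), label connected components, and
    return each component's intensity-weighted center of mass."""
    mask = ct > threshold
    labels, n = ndimage.label(mask)        # one label per candidate electrode
    return ndimage.center_of_mass(ct, labels, range(1, n + 1))
```

Working on the full-resolution CT, as ALICE does, is what preserves sub-voxel centers for closely spaced high-density grids.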

  14. SplitRacer - a new Semi-Automatic Tool to Quantify And Interpret Teleseismic Shear-Wave Splitting

    Science.gov (United States)

    Reiss, M. C.; Rumpker, G.

    2017-12-01

    We have developed a semi-automatic, MATLAB-based GUI that combines standard seismological tasks for the analysis and interpretation of teleseismic shear-wave splitting. Shear-wave splitting analysis is widely used to infer seismic anisotropy, which can be interpreted in terms of lattice-preferred orientation of mantle minerals or shape-preferred orientation caused by fluid-filled cracks or alternating layers. Seismic anisotropy provides a unique link between directly observable surface structures and the more elusive dynamic processes in the mantle below. Thus, resolving the seismic anisotropy of the lithosphere/asthenosphere is of particular importance for geodynamic modeling and interpretations. The increasing number of seismic stations from temporary experiments and permanent installations creates a new basis for comprehensive studies of seismic anisotropy world-wide. However, the increasingly large data sets pose new challenges for the rapid and reliable analysis of teleseismic waveforms and for the interpretation of the measurements. Well-established routines and programs are available but are often impractical for analyzing large data sets from hundreds of stations. Additionally, shear-wave splitting results are seldom evaluated using the same well-defined quality criteria, which may complicate comparison with results from different studies. SplitRacer has been designed to overcome these challenges by incorporating the following processing steps: i) downloading of waveform data from multiple stations in mseed format using FDSNWS tools; ii) automated initial screening and categorizing of XKS-waveforms using a pre-set SNR threshold; iii) particle-motion analysis of selected phases at longer periods to detect and correct for sensor misalignment; iv) splitting analysis of selected phases based on transverse-energy minimization for multiple, randomly-selected, relevant time windows; v) one- and two-layer joint-splitting analysis for all phases at one station by inverting the energy content of all phases simultaneously.

  15. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and expensive natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, moreover offering a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in the presence of flood, requires modelling the behavior of different objects in the scene in order to associate them with flood or no-flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information, or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are thus particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps.
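The Bayesian fusion of per-pixel evidence layers (e.g. SAR signature, terrain elevation, distance from the river) can be illustrated with a naive-Bayes simplification. The authors use full Bayesian networks, so the conditional independence assumed here is a deliberate simplification, and all likelihood values are hypothetical.

```python
import numpy as np

def flood_posterior(likelihoods, prior=0.5):
    """Naive-Bayes fusion of evidence layers into a per-pixel flood
    probability. Each layer is a pair (P(obs | flood), P(obs | no flood)),
    given as arrays over the pixels; layers are treated as conditionally
    independent given the flood state."""
    p_f = np.full(np.asarray(likelihoods[0][0]).shape, prior)
    p_n = 1.0 - p_f
    for lf, ln in likelihoods:
        p_f, p_n = p_f * lf, p_n * ln   # accumulate evidence per hypothesis
    return p_f / (p_f + p_n)            # normalize to a posterior
```

Thresholding the posterior (e.g. at 0.5) would yield the binary flood map; a full BN additionally models dependencies between the layers.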

  16. A comprehensive comparison of tools for differential ChIP-seq analysis.

    Science.gov (United States)

    Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland; Herrmann, Carl

    2016-11-01

    ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Beside detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups, and also in the range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. © The Author 2016. Published by Oxford University Press.

  17. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    Science.gov (United States)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous application of excitations. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  18. Western Blotting Is an Efficient Tool for Differential Diagnosis of Paracoccidioidomycosis and Pulmonary Tuberculosis

    Science.gov (United States)

    Bertoni, Thâmara Aline; Perenha-Viana, Maysa Cláudia Zolin; Patussi, Eliana Valéria; Cardoso, Rosilene Fressatti

    2012-01-01

    Sputum and sera from 134 patients screened for tuberculosis (TB) were analyzed to investigate TB and paracoccidioidomycosis (PCM). Of these patients, 11 (8.2%) were confirmed to have TB, but six (4.5%) were positive only for PCM. All patients with PCM presented anti-43-kDa-component antibodies in Western blotting (WB) assays, while in the TB-positive patients these antibodies did not appear. This preliminary study suggests WB as a potential tool for differential laboratory diagnosis between TB and PCM. PMID:22971781

  19. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    Energy Technology Data Exchange (ETDEWEB)

    Vega, J., E-mail: jesus.vega@ciemat.e [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion. Avda. Complutense, 22, 28040 Madrid (Spain); Murari, A. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Associazione EURATOM-ENEA per la Fusione, Consorzio RFX, 4-35127 Padova (Italy); Ratta, G.A.; Gonzalez, S. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion. Avda. Complutense, 22, 28040 Madrid (Spain); Dormido-Canto, S. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Dpto. Informatica y Automatica, UNED, Madrid (Spain)

    2010-07-15

    Nowadays, processing all information of a fusion database is a much more important issue than acquiring more data. Although typically fusion devices produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time consuming and unreliable activity. The development of automatic methods to create specialized databases ensures first, the reduction of human efforts to identify and locate physical events, second, the standardization of criteria (reducing the vulnerability to human errors) and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (that can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.

  20. Nouns referring to tools and natural objects differentially modulate the motor system.

    Science.gov (United States)

    Gough, Patricia M; Riggio, Lucia; Chersi, Fabian; Sato, Marc; Fogassi, Leonardo; Buccino, Giovanni

    2012-01-01

    While increasing evidence points to a critical role for the motor system in language processing, the focus of previous work has been on the linguistic category of verbs. Here we tested whether nouns are effective in modulating the motor system and further whether different kinds of nouns - those referring to artifacts or natural items, and items that are graspable or ungraspable - would differentially modulate the system. A Transcranial Magnetic Stimulation (TMS) study was carried out to compare modulation of the motor system when subjects read nouns referring to objects which are Artificial or Natural and which are Graspable or Ungraspable. TMS was applied to the primary motor cortex representation of the first dorsal interosseous (FDI) muscle of the right hand at 150 ms after noun presentation. Analyses of Motor Evoked Potentials (MEPs) revealed that across the duration of the task, nouns referring to graspable artifacts (tools) were associated with significantly greater MEP areas. Analyses of the initial presentation of items revealed a main effect of graspability. The findings are in line with an embodied view of nouns, with MEP measures modulated according to whether nouns referred to natural objects or artifacts (tools), confirming tools as a special class of items in motor terms. Additionally our data support a difference for graspable versus non graspable objects, an effect which for natural objects is restricted to initial presentation of items. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Evaluation of a new software tool for the automatic volume calculation of hepatic tumors. First results; Evaluation eines neuen Softwareassistenten zur automatischen Volumenbestimmung von intrahepatischen Tumoren. Erste Ergebnisse

    Energy Technology Data Exchange (ETDEWEB)

    Meier, S.; Mildenberger, P.; Pitton, M.; Thelen, M. [Klinik und Poliklinik fuer Radiologie, Univ. Mainz (Germany); Schenk, A.; Bourquain, H. [MeVis, Bremen (Germany)

    2004-02-01

    Purpose: Computed tomography has become the preferred method for detecting liver carcinomas. The introduction of spiral CT added volumetric assessment of intrahepatic tumors, which was unattainable in the clinical routine with incremental CT due to complex planimetric revisions and excessive computing time. In an ongoing clinical study, a new software tool was tested for the automatic detection of tumor volume and the time needed for this procedure. Materials and methods: We analyzed patients suffering from hepatocellular carcinoma (HCC). All patients underwent treatment with repeated transcatheter chemoembolization of the hepatic artery. The volumes of the HCC lesions detected in CT were measured with the new software tool in HepaVison (MeVis, Germany). The results were compared with manual planimetric calculation of the volume performed by three independent radiologists. Results: Our first results in 16 patients show a correlation between the automatically and the manually calculated volumes (up to a difference of 2 ml) of 96.8%. While the manual method of analyzing the volume of a lesion requires 2.5 minutes on average, the automatic method requires only about 30 seconds of user interaction time. Conclusion: These preliminary results show a good correlation between automatic and manual calculations of the tumor volume. The new software tool requires less time for accurate determination of the tumor volume and can be applied in the daily clinical routine. (orig.) [German] Purpose: Computed tomography has become established as an important method in the follow-up of liver tumors. The method also permits reporting of the tumor volume, which has so far not been practicable in routine use owing to the laborious planimetric processing involved. In an ongoing clinical study, a new software assistant is being tested for automatic volume measurement of liver tumors, and the time required for this procedure is recorded.
    Material and methods:

  2. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.

    2010-01-01

    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  3. Occupational self-coding and automatic recording (OSCAR): a novel web-based tool to collect and code lifetime job histories in large population-based studies.

    Science.gov (United States)

    De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul

    2017-03-01

    Objectives The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
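
    The agreement statistic used in the study, Cohen's kappa, can be computed directly from paired code assignments. A minimal sketch with invented SOC-like codes (toy data, not the UK Biobank sample):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # Chance agreement: probability both raters pick the same code at random
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical 4-digit SOC-like codes from the automatic and manual coders
auto_codes   = ["2314", "5223", "2314", "9233", "5223"]
manual_codes = ["2314", "5223", "1121", "9233", "5223"]
print(round(cohens_kappa(auto_codes, manual_codes), 3))
```

    Truncating codes to fewer digits before comparison reproduces the paper's observation that agreement rises at broader (1-digit) category levels.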

  4. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón and the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA1 to Infor EAM2, the computerized maintenance management system at CERN. The main objective of my project was to enable the automatic updates of meters in Infor EAM fetching data from SCADA so as to automatize a process which was done manually before and consumed resources in terms of having to consult the meter physically, import this information to Infor EAM by hand and detecting and correcting the errors that can occur when doing all of this manually. This problem is shared by several other teams at CERN apart from the Lift Maintenance team and for this reason the main target I had when developing my solution was flexibility and scalability so as to make...

  5. Differential Arc expression in the hippocampus and striatum during the transition from attentive to automatic navigation on a plus maze

    Science.gov (United States)

    Gardner, Robert S.; Suarez, Daniel F.; Robinson-Burton, Nadira K.; Rudnicky, Christopher J.; Gulati, Asish; Ascoli, Giorgio A.; Dumas, Theodore C.

    2016-01-01

    The strategies utilized to effectively perform a given task change with practice and experience. During a spatial navigation task, with relatively little training, performance is typically attentive enabling an individual to locate the position of a goal by relying on spatial landmarks. These (place) strategies require an intact hippocampus. With task repetition, performance becomes automatic; the same goal is reached using a fixed response or sequence of actions. These (response) strategies require an intact striatum. The current work aims to understand the activation patterns across these neural structures during this experience-dependent strategy transition. This was accomplished by region-specific measurement of activity-dependent immediate early gene expression among rats trained to different degrees on a dual-solution task (i.e., a task that can be solved using either place or response navigation). As expected, rats increased their reliance on response navigation with extended task experience. In addition, dorsal hippocampal expression of the immediate early gene Arc was considerably reduced in rats that used a response strategy late in training (as compared with hippocampal expression in rats that used a place strategy early in training). In line with these data, vicarious trial and error, a behavior linked to hippocampal function, also decreased with task repetition. Although Arc mRNA expression in dorsal medial or lateral striatum alone did not correlate with training stage, the ratio of expression in the medial striatum to that in the lateral striatum was relatively high among rats that used a place strategy early in training as compared with the ratio among over-trained response rats. Altogether, these results identify specific changes in the activation of dissociated neural systems that may underlie the experience-dependent emergence of response-based automatic navigation. PMID:26976088

  6. Fourier Transform Infrared Spectroscopy (FTIR) as a Tool for the Identification and Differentiation of Pathogenic Bacteria.

    Science.gov (United States)

    Zarnowiec, Paulina; Lechowicz, Łukasz; Czerwonka, Grzegorz; Kaca, Wiesław

    2015-01-01

    Methods of human bacterial pathogen identification need to be fast, reliable, inexpensive, and time efficient. These requirements may be met by vibrational spectroscopic techniques. The method that is most often used for bacterial detection and identification is Fourier transform infrared spectroscopy (FTIR). It enables biochemical scans of whole bacterial cells or parts thereof at infrared frequencies (4,000-600 cm(-1)). The recorded spectra must be subsequently transformed in order to minimize data variability and to amplify the chemically-based spectral differences in order to facilitate spectra interpretation and analysis. In the next step, the transformed spectra are analyzed by data reduction tools, regression techniques, and classification methods. Chemometric analysis of FTIR spectra is a basic technique for discriminating between bacteria at the genus, species, and clonal levels. Examples of bacterial pathogen identification and methods of differentiation up to the clonal level, based on infrared spectroscopy, are presented below.
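
    The pipeline described above (spectral transformation, then classification) can be sketched with a toy example: a crude second-derivative preprocessing step followed by a nearest-centroid classifier. The "spectra" and class structure below are invented; real chemometric workflows use Savitzky-Golay derivatives and richer classifiers.

```python
def second_derivative(spectrum):
    """Finite-difference stand-in for the derivative preprocessing step."""
    return [spectrum[i - 1] - 2 * spectrum[i] + spectrum[i + 1]
            for i in range(1, len(spectrum) - 1)]

def nearest_centroid(train, labels, query):
    """Classify a preprocessed spectrum by Euclidean distance to the mean
    spectrum of each class, one of the simplest chemometric classifiers."""
    groups = {}
    for spec, lab in zip(train, labels):
        groups.setdefault(lab, []).append(spec)
    centroids = {lab: [sum(col) / len(col) for col in zip(*specs)]
                 for lab, specs in groups.items()}
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda lab: dist(centroids[lab], query))

# Invented toy "spectra": genus A peaks near channel 3, genus B near channel 7
train = [[0, 0, 1, 3.0, 1, 0, 0, 0, 0, 0],
         [0, 0, 1, 2.5, 1, 0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0, 0, 1, 3.0, 1, 0],
         [0, 0, 0, 0, 0, 0, 1, 2.8, 1, 0]]
labels = ["A", "A", "B", "B"]
query = [0, 0, 1, 2.9, 1, 0, 0, 0, 0, 0]

pre = [second_derivative(s) for s in train]
print(nearest_centroid(pre, labels, second_derivative(query)))  # → A
```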

  7. Foodomics: A new tool to differentiate between organic and conventional foods.

    Science.gov (United States)

    Vallverdú-Queralt, Anna; Lamuela-Raventós, Rosa Maria

    2016-07-01

    The demand for organic food is increasing annually due to the growing consumer trend for more natural products that have simpler ingredient lists, involve less processing and are grown free of pesticides. However, there is still not enough nutritional evidence in favor of organic food consumption. Classical chemical analysis of macro- and micronutrients has demonstrated that organic crops are poorer in nitrogen, but clear evidence for other nutrients is lacking. Omics technologies forming part of the new discipline of foodomics have allowed the detection of possible nutritional differences between organic and conventional production, although many results remain controversial and contradictory. The main focus of this review is to provide an overview of the studies that use foodomics techniques as a tool to differentiate between organic and conventional production. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Ulnar sensory-motor amplitude ratio: a new tool to differentiate ganglionopathy from polyneuropathy

    Directory of Open Access Journals (Sweden)

    Raphael Ubirajara Garcia

    2013-07-01

    The objective of this study was to evaluate whether the ratio of ulnar sensory nerve action potential (SNAP) over compound muscle action potential (CMAP) amplitudes (USMAR) would help in the distinction between ganglionopathy (GNP) and polyneuropathy (PNP). Methods: We reviewed the nerve conduction studies and electromyography (EMG) of 18 GNP patients, 33 diabetic PNP patients and 56 controls. GNP was defined by simultaneous nerve conduction study (NCS) and magnetic resonance imaging (MRI) abnormalities. PNP was defined by usual clinical and NCS criteria. We used ANOVA with post-hoc Tukey test and ROC curve analysis to compare ulnar SNAP and CMAP, as well as USMAR, in the groups. Results: Ulnar CMAP amplitudes were similar between GNP × PNP × controls (p=0.253), but ulnar SNAP amplitudes (1.6±3.2 × 11.9±9.1 × 45.7±24.7) and USMAR values (0.3±0.3 × 1.5±0.9 × 4.6±2.2) were significantly different. A USMAR threshold of 0.71 was able to differentiate GNP and PNP (94.4% sensitivity and 90.9% specificity). Conclusions: USMAR is a practical and reliable tool for the differentiation between GNP and PNP.
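
    A minimal sketch of how the reported 0.71 USMAR cut-off works as a classifier, with sensitivity and specificity computed as in a ROC analysis. The ratios below are toy values loosely centered on the reported group means (0.3 for GNP, 1.5 for PNP), not the study's measurements:

```python
def usmar_classify(snap, cmap, threshold=0.71):
    """Classify from the ulnar SNAP/CMAP amplitude ratio (USMAR).
    Ratios below the threshold suggest ganglionopathy (GNP)."""
    return "GNP" if snap / cmap < threshold else "PNP"

def sensitivity_specificity(ratios, labels, threshold=0.71):
    """Treat GNP as the positive class, mirroring the ROC analysis."""
    tp = sum(r < threshold and l == "GNP" for r, l in zip(ratios, labels))
    fn = sum(r >= threshold and l == "GNP" for r, l in zip(ratios, labels))
    tn = sum(r >= threshold and l == "PNP" for r, l in zip(ratios, labels))
    fp = sum(r < threshold and l == "PNP" for r, l in zip(ratios, labels))
    return tp / (tp + fn), tn / (tn + fp)

ratios = [0.1, 0.3, 0.5, 0.9, 1.2, 1.8]
labels = ["GNP", "GNP", "GNP", "PNP", "PNP", "PNP"]
print(sensitivity_specificity(ratios, labels))  # perfectly separable toy set
```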

  9. Neuroimaging in Parkinsonism: a study with magnetic resonance and spectroscopy as tools in the differential diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcellos, Luiz Felipe Rocha [1Hospital dos Servidores do Estado, Rio de Janeiro RJ (Brazil)], e-mail: luizneurol@terra.com.br; Novis, Sergio A. Pereira; Rosso, Ana Lucia Z. [Hospital Universitario Clementino Fraga Filho (HUCFF), Rio de Janeiro, RJ (Brazil); Moreira, Denise Madeira [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Neurologia Deolindo Couto; Leite, Ana Claudia C.B. [Fundacao Oswaldo Cruz (FIOCRUZ), Rio de Janeiro, RJ (Brazil)

    2009-03-15

    The differential diagnosis of Parkinsonism based on clinical features may sometimes be difficult. Diagnostic tests in these cases might be useful, especially magnetic resonance imaging, a noninvasive exam that is less expensive than positron emission tomography and provides a good basis for anatomical analysis. Magnetic resonance spectroscopy analyzes cerebral metabolism, yielding inconsistent results in parkinsonian disorders. We selected 40 individuals for magnetic resonance imaging and spectroscopy analysis: 12 with Parkinson's disease, 11 with progressive supranuclear palsy, 7 with multiple system atrophy (parkinsonian type), and 10 individuals without any psychiatric or neurological disorders (controls). Clinical scales included Hoehn and Yahr, the unified Parkinson's disease rating scale, and the mini mental status examination. The results showed that patients with Parkinson's disease and controls presented the same aspects on neuroimaging, with few or no abnormalities, whereas progressive supranuclear palsy and multiple system atrophy showed abnormalities, some of which were statistically significant. Thus, magnetic resonance imaging and spectroscopy could be useful tools in the differential diagnosis of Parkinsonism. (author)

  10. Solving ordinary differential equations by electrical analogy: a multidisciplinary teaching tool

    Science.gov (United States)

    Sanchez Perez, J. F.; Conesa, M.; Alhama, I.

    2016-11-01

    Ordinary differential equations are the mathematical formulation of a great variety of problems in science and engineering, and frequently two different problems are equivalent from a mathematical point of view when they are formulated by the same equations. Students acquire the knowledge of how to solve these equations (at least some types of them) using protocols and strict algorithms of mathematical calculation, without thinking about the meaning of the equation. The aim of this work is for students to learn to design network models or circuits: with simple knowledge of electric circuits, students can establish the formal equivalence between circuits and differential equations, which allows them to connect knowledge from two disciplines and promotes the use of this interdisciplinary approach to address complex problems. They thus learn to use a multidisciplinary tool for solving these kinds of equations, accessible even to first-year engineering students, whatever the order, grade or type of non-linearity. This methodology has been implemented in numerous final degree projects in engineering and science, e.g., chemical engineering, building engineering, industrial engineering, mechanical engineering, architecture, etc. Applications are presented to illustrate the subject of this manuscript.
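
    As a minimal example of the electrical analogy (component values are illustrative, not from the article): the first-order ODE dy/dt = (y_inf - y)/tau is formally the charging equation of an RC circuit, C dV/dt = (Vs - V)/R with tau = R*C, so "simulating the circuit" and "solving the ODE" are the same computation:

```python
import math

R, C, Vs = 2.0, 0.5, 10.0   # ohms, farads, volts (illustrative values)
tau = R * C                 # circuit time constant = ODE time constant

def simulate(t_end, dt=1e-4):
    """Explicit Euler integration of C dV/dt = (Vs - V)/R from V(0) = 0."""
    v = 0.0
    for _ in range(int(t_end / dt)):
        v += dt * (Vs - v) / tau
    return v

t = 2.0
numeric = simulate(t)
analytic = Vs * (1.0 - math.exp(-t / tau))   # closed-form solution
print(f"numeric={numeric:.4f} V  analytic={analytic:.4f} V")
```

    The same mapping extends to higher-order and nonlinear equations, which is what lets circuit simulators act as general ODE solvers.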

  11. A Global Multi-Objective Optimization Tool for Design of Mechatronic Components using Generalized Differential Evolution

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Nørgård, Christian; Roemer, Daniel Beck

    2016-01-01

    This paper illustrates how the relatively simple constrained multi-objective optimization algorithm Generalized Differential Evolution 3 (GDE3) can assist with the practical sizing of mechatronic components used in e.g. digital displacement fluid power machinery. The studied bi- and tri-objective problems, having 10+ design variables, are both highly constrained, nonlinear and non-smooth, but nevertheless the algorithm converges to the Pareto front within hours of computation (20k function evaluations). Additionally, the robustness and convergence speed of the algorithm are investigated using different optimization control parameter settings, and it is concluded that GDE3 is a reliable optimization tool that can assist mechatronic engineers in the design and decision-making process.
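
    GDE3 builds on the classic differential evolution scheme. A minimal single-objective DE/rand/1/bin sketch (not the authors' GDE3 implementation; population size and control parameters are chosen for illustration) conveys the core mutation/crossover/selection loop that GDE3 extends with Pareto-based selection:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    """Single-objective DE/rand/1/bin on box-constrained variables."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)      # force at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    lo, hi = bounds[j]
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, lo), hi))   # clamp to bounds
                else:
                    trial.append(pop[i][j])
            f_trial = f(trial)
            if f_trial <= fit[i]:            # greedy one-to-one survival
                pop[i], fit[i] = trial, f_trial
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Minimize the 3-D sphere function as a stand-in objective
x_best, f_best = differential_evolution(lambda x: sum(v * v for v in x),
                                        [(-5.0, 5.0)] * 3)
print(x_best, f_best)
```

    For multiple objectives, GDE3 replaces the scalar comparison `f_trial <= fit[i]` with dominance checks plus non-dominated sorting of the combined population.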

  12. Neuroimaging in Parkinsonism: a study with magnetic resonance and spectroscopy as tools in the differential diagnosis

    International Nuclear Information System (INIS)

    Vasconcellos, Luiz Felipe Rocha; Novis, Sergio A. Pereira; Rosso, Ana Lucia Z.; Moreira, Denise Madeira

    2009-01-01

    The differential diagnosis of Parkinsonism based on clinical features may sometimes be difficult. Diagnostic tests in these cases might be useful, especially magnetic resonance imaging, a noninvasive exam that is less expensive than positron emission tomography and provides a good basis for anatomical analysis. Magnetic resonance spectroscopy analyzes cerebral metabolism, yielding inconsistent results in parkinsonian disorders. We selected 40 individuals for magnetic resonance imaging and spectroscopy analysis: 12 with Parkinson's disease, 11 with progressive supranuclear palsy, 7 with multiple system atrophy (parkinsonian type), and 10 individuals without any psychiatric or neurological disorders (controls). Clinical scales included Hoehn and Yahr, the unified Parkinson's disease rating scale, and the mini mental status examination. The results showed that patients with Parkinson's disease and controls presented the same aspects on neuroimaging, with few or no abnormalities, whereas progressive supranuclear palsy and multiple system atrophy showed abnormalities, some of which were statistically significant. Thus, magnetic resonance imaging and spectroscopy could be useful tools in the differential diagnosis of Parkinsonism. (author)

  13. Development of an artificial vision system for the automatic evaluation of the cutting angles of worn tools

    Directory of Open Access Journals (Sweden)

    Gianni Campatelli

    2016-03-01

    This article presents a new method to evaluate the geometry of dull cutting tools in order to verify the necessity of tool re-sharpening and to decrease the tool grinding machine setup time, based on a laser scanning approach. The developed method consists of the definition of a system architecture and the programming of all the algorithms needed to analyze the data and provide, as output, the cutting angles of the worn tool. These angles are usually difficult to be measured and are needed to set up the grinding machine. The main challenges that have been dealt with in this application are related to the treatment of data acquired by the system’s cameras, which must be specific for the milling tools, usually characterized by the presence of undercuts and sharp edges. Starting from the architecture of the system, an industrial product has been designed, with the support of a grinding machine manufacturer. The basic idea has been to develop a low-cost system that could be integrated on a tool sharpening machine and interfaced with its numeric control. The article reports the developed algorithms and an example of application.

  14. Automatic speech recognition (ASR) and its use as a tool for assessment or therapy of voice, speech, and language disorders.

    Science.gov (United States)

    Kitzing, Peter; Maier, Andreas; Ahlander, Viveka Lyberg

    2009-01-01

    Computerized automatic speech recognition (ASR) is generally regarded merely as a method for transcribing spoken language into written text, and as such as rather unreliable and cumbersome. However, due to great advances in computer technology and informatics methodology, ASR has nowadays become quite dependable and easier to handle, and the number of applications has increased considerably. After some introductory background information on ASR, a number of applications of great interest to professionals in voice, speech, and language therapy are pointed out. In the foreseeable future, the keyboard and mouse will, by means of ASR technology, be replaced in many functions by a microphone as the human-computer interface, and the computer will talk back via its loudspeaker. It seems important that professionals engaged in the care of oral communication disorders take part in this development so their clients may get the optimal benefit from this new technology.

  15. epiModel: a system to build automatically systems of differential equations of compartmental type-epidemiological models.

    Science.gov (United States)

    Cortés, Juan-C; Sánchez-Sánchez, Almudena; Santonja, Francisco-J; Villanueva, Rafael-J

    2011-11-01

    In this paper we describe epiModel, a code developed in Mathematica that facilitates the building of systems of differential equations corresponding to compartmental-type epidemiological models with linear or quadratic terms, whose characteristics are defined in text files following an easy syntax. It includes the possibility of obtaining the equations of models involving age and/or sex groups. Copyright © 2011. Published by Elsevier Ltd.
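
    A minimal example of the kind of compartmental model whose equations such a tool generates: the standard SIR system integrated with explicit Euler. Parameter values are illustrative, and this is a hand-written sketch, not epiModel's Mathematica output:

```python
def sir(beta, gamma, s0, i0, r0, days, dt=0.01):
    """Classic SIR compartmental model, explicit Euler integration:
       S' = -beta*S*I,  I' = beta*S*I - gamma*I,  R' = gamma*I
    State variables are population fractions, so S + I + R stays 1."""
    s, i, r = s0, i0, r0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt   # the quadratic S*I term
        new_recoveries = gamma * i * dt      # the linear I term
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

# Basic reproduction number R0 = beta/gamma = 3 (illustrative values)
s, i, r = sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, r0=0.0, days=100)
print(f"day 100: S={s:.3f} I={i:.3f} R={r:.3f}")
```

    Age or sex structure, as supported by epiModel, amounts to replicating these compartments per group and adding a contact matrix between them.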

  16. [Late life depression or prodromal Alzheimer's disease: Which tools for the differential diagnosis?]

    Science.gov (United States)

    Gasser, A-I; Salamin, V; Zumbach, S

    2018-02-01

    executive functions could not differentiate between patients with late life depression and patients with prodromal Alzheimer's disease. A measure of global cognitive decline does not seem to be helpful in differentiating early Alzheimer's disease and depression, unlike an analysis of the neuropsychological profile on several composite scales, such as the Mini Mental State Examination. Furthermore, recent work has investigated the utility of olfactory or gustative markers with promising results and convenient tools for clinical practice. Concerning morphological brain imaging, only detailed volumetric analysis could show differences between the two diseases, but these techniques are not always available for clinical practice. It is the same for other recent techniques, such as quantitative electroencephalography, Near InfraRed Spectroscopy, Single Photon Emission Computed Tomography, or Transcranial Doppler Ultrasonography, which have received little attention so far as differential diagnostic tools. Finally, cerebrospinal fluid analysis could be useful, including beta amyloid levels. Despite numerous efforts in recent years, differential diagnosis of dementia from depression in the elderly remains difficult. Results of this review highlight the necessity of conducting more research in this area, with multi-method studies, using not only cognitive analysis but also cerebral imaging techniques. Copyright © 2017 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  17. A Comparison of Automatic Parallelization Tools/Compilers on the SGI Origin 2000 Using the NAS Benchmarks

    Science.gov (United States)

    Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry

    1998-01-01

    Porting applications to new high performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting codes would ideally be automated by using parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools, an interactive computer-aided parallelization tool that generates message passing code; 2) the Portland Group's HPF compiler; and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin2000.

  18. A Modern Automatic Chamber Technique as a Powerful Tool for CH4 and CO2 Flux Monitoring

    Science.gov (United States)

    Mastepanov, M.; Christensen, T. R.; Lund, M.; Pirk, N.

    2014-12-01

    A number of similar systems were used for monitoring of CH4 and CO2 exchange by the automatic chamber method in a range of different ecosystems. The measurements were carried out in northern Sweden (mountain birch forest near Abisko, 68°N, 2004-2010), southern Sweden (forest bog near Hässleholm, 56°N, 2007-2014), northeastern Greenland (arctic fen in Zackenberg valley, 74°N, 2005-2014), southwestern Greenland (fen near Nuuk, 64°N, 2007-2014), and central Svalbard (arctic fen near Longyearbyen, 78°N, 2011-2014). These 37 seasons of measurements delivered not only a large amount of valuable flux data, including a few novel findings (Mastepanov et al., Nature, 2008; Mastepanov et al., Biogeosciences, 2013), but also valuable experience with implementation of the automatic chamber technique using modern analytical instruments and computer technologies. A range of high-resolution CH4 analyzers (DLT-100, FMA, FGGA - Los Gatos Research) and CO2 analyzers (EGM-4, SBA-4 - PP Systems; Li-820 - Li-Cor Biosciences), as well as a Methane Carbon Isotope Analyzer (Los Gatos Research), proved suitable for precise measurements of fluxes, from as low as 0.1 mg CH4 m-2 d-1 (wintertime measurements at Zackenberg, unpublished) to as high as 2.4 g CH4 m-2 d-1 (autumn burst 2007 at Zackenberg; Mastepanov et al., Nature, 2008). Some of these instruments had to be customized for 24/7 operation in harsh arctic conditions. In this presentation we explain some of these customizations. The high frequency of concentration measurements (1 Hz in most cases) provides a unique opportunity for quality control of flux calculations; on the other hand, this enormous amount of data can be analyzed only with highly automated algorithms. A specialized software package was developed and improved through the years of measurements and data processing. This software automates the data flow from raw concentration data of different instruments and sensors and various status records
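
    The basic flux computation behind closed-chamber systems, a least-squares slope of the concentration time series scaled by chamber geometry, can be sketched as follows. The chamber height, molar volume and synthetic record are assumed illustrative values, not taken from the presentation:

```python
def chamber_flux(times_s, conc_ppm, chamber_height_m=0.3,
                 molar_volume_m3=0.0224):
    """Closed-chamber flux estimate: ordinary least-squares slope of the
    concentration record (ppm/s), converted to mol m-3 s-1 and scaled by
    the chamber volume-to-area ratio (equal to the chamber height)."""
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_c = sum(conc_ppm) / n
    slope = (sum((t - mean_t) * (c - mean_c)
                 for t, c in zip(times_s, conc_ppm))
             / sum((t - mean_t) ** 2 for t in times_s))       # ppm / s
    return slope * 1e-6 / molar_volume_m3 * chamber_height_m  # mol m-2 s-1

# Synthetic 1 Hz chamber record: concentration rising 0.05 ppm per second
times = list(range(120))
conc = [1.9 + 0.05 * t for t in times]
flux = chamber_flux(times, conc)
print(f"{flux:.3e} mol m-2 s-1")
```

    Automated quality control typically inspects the regression residuals of each closure period to reject records disturbed by chamber leaks or nonlinearity.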

  19. TeraStitcher - A tool for fast automatic 3D-stitching of teravoxel-sized microscopy images

    Directory of Open Access Journals (Sweden)

    Bria Alessandro

    2012-11-01

    Full Text Available Abstract Background Further advances in modern microscopy are leading to teravoxel-sized tiled 3D images at high resolution, thus increasing the dimension of the stitching problem by at least two orders of magnitude. The existing software solutions do not seem adequate to address the additional requirements arising from these datasets, such as the minimization of memory usage and the need to process just a small portion of data. Results We propose a free and fully automated 3D stitching tool designed to match the special requirements of teravoxel-sized tiled microscopy images and able to stitch them in a reasonable time even on workstations with limited resources. The tool was tested on teravoxel-sized whole mouse brain images with micrometer resolution and was also compared with state-of-the-art stitching tools on megavoxel-sized publicly available datasets. This comparison confirmed that the solutions we adopted are suited for stitching very large images and also perform well on datasets with different characteristics. Indeed, some of the algorithms embedded in other stitching tools could be easily integrated in our framework if they turned out to be more effective on other classes of images. To this purpose, we designed a software architecture which separates the strategies that use memory resources efficiently from the algorithms which may depend on the characteristics of the acquired images. Conclusions TeraStitcher is a free tool that enables the stitching of teravoxel-sized tiled microscopy images even on workstations with relatively limited resources of memory (

  20. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    Energy Technology Data Exchange (ETDEWEB)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.; Overall, Christopher C.; Lee, Joon-Yong; Zucker, Jeremy D.; Glaesemann, Kurt R.; Jansson, Georg C.; Jansson, Janet K.

    2017-03-02

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomics and metatranscriptomics data, and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
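The contig-level taxonomy roll-up by majority voting over ORF-level lineages, as described above, can be sketched as follows. This is an illustrative simplification, not ATLAS's exact implementation; the lineage representation (root-to-leaf lists) and the majority threshold are assumptions:

```python
from collections import Counter

def contig_taxonomy(orf_lineages, threshold=0.5):
    """Roll up ORF-level lineages to a contig-level taxonomy.

    Each lineage is a list of taxa from root to leaf, e.g.
    ["Bacteria", "Proteobacteria", "Escherichia"]. At each rank we keep
    the taxon supported by more than `threshold` of the ORFs (majority
    vote) and stop at the first rank without a majority, mimicking a
    lowest-common-ancestor style roll-up.
    """
    result = []
    n = len(orf_lineages)
    rank = 0
    while True:
        taxa = [lin[rank] for lin in orf_lineages if len(lin) > rank]
        if not taxa:
            break
        taxon, count = Counter(taxa).most_common(1)[0]
        if count / n <= threshold:
            break
        result.append(taxon)
        rank += 1
    return result
```

For three ORFs assigned to two Proteobacteria genera and one Firmicutes lineage, the contig resolves to ["Bacteria", "Proteobacteria"]: the genus rank lacks a majority, so the roll-up stops there.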

  1. Revisiting the dose-effect correlations in irradiated head and neck cancer using automatic segmentation tools of the dental structures, mandible and maxilla

    International Nuclear Information System (INIS)

    Thariat, J.; Ramus, L.; Odin, G.; Vincent, S.; Orlanducci, M.H.; Dassonville, O.; Darcourt, V.; Lacout, A.; Marcy, P.Y.; Cagnol, G.; Malandain, G.

    2011-01-01

    Purpose. - Manual delineation of dental structures is too time-consuming to be feasible in routine practice. Yet information on dose risk levels is crucial for dentists treating patients after head and neck irradiation, in order to avoid post-extraction osteoradionecrosis on the basis of empirical dose-effect data established on bidimensional radiation therapy plans. Material and methods. - We present an automatic atlas-based segmentation framework for the dental structures, called Dentalmaps, constructed from a patient image-segmentation database. Results. - This framework is accurate (to within 2 Gy) and relevant for routine use. It has the potential to guide dental care in the context of new irradiation techniques. Conclusion. - This tool provides a user-friendly interface for dentists and radiation oncologists in the context of irradiated head and neck cancer patients. It will likely improve the knowledge of dose-effect correlations for dental complications and osteoradionecrosis. (authors)

  2. Fuchsia : A tool for reducing differential equations for Feynman master integrals to epsilon form

    Science.gov (United States)

    Gituliar, Oleksandr; Magerya, Vitaly

    2017-10-01

    We present Fuchsia - an implementation of the Lee algorithm, which for a given system of ordinary differential equations with rational coefficients ∂x J(x, ɛ) = A(x, ɛ) J(x, ɛ) finds a basis transformation T(x, ɛ), i.e., J(x, ɛ) = T(x, ɛ) J‧(x, ɛ), such that the system turns into the epsilon form: ∂x J‧(x, ɛ) = ɛ S(x) J‧(x, ɛ), where S(x) is a Fuchsian matrix. A system of this form can be trivially solved in terms of polylogarithms as a Laurent series in the dimensional regulator ɛ. That makes the construction of the transformation T(x, ɛ) crucial for obtaining solutions of the initial system. In principle, Fuchsia can deal with any regular system; however, its primary task is to reduce differential equations for Feynman master integrals. It ensures that solutions contain only regular singularities, owing to the properties of Feynman integrals. Program Files doi: http://dx.doi.org/10.17632/zj6zn9vfkh.1 Licensing provisions: MIT Programming language: Python 2.7 Nature of problem: Feynman master integrals may be calculated from solutions of a linear system of differential equations with rational coefficients. Such a system can be easily solved as an ɛ-series when its epsilon form is known. Hence, a tool which is able to find the epsilon form transformations can be used to evaluate Feynman master integrals. Solution method: The solution method is based on the Lee algorithm (Lee, 2015), which consists of three main steps: fuchsification, normalization, and factorization. During the fuchsification step a given system of differential equations is transformed into the Fuchsian form with the help of the Moser method (Moser, 1959). Next, during the normalization step the system is transformed to the form where eigenvalues of all residues are proportional to the dimensional regulator ɛ. Finally, the system is factorized to the epsilon form by finding an unknown transformation which satisfies a system of linear equations. Additional comments

  3. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables import of coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell el-Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the coding of point names) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of point-name coding or more complex attribute table creation).

  4. Validation of the ICU-DaMa tool for automatically extracting variables for minimum dataset and quality indicators: The importance of data quality assessment.

    Science.gov (United States)

    Sirgo, Gonzalo; Esteban, Federico; Gómez, Josep; Moreno, Gerard; Rodríguez, Alejandro; Blanch, Lluis; Guardiola, Juan José; Gracia, Rafael; De Haro, Lluis; Bodí, María

    2018-04-01

    Big data analytics promise insights into healthcare processes and management, improving outcomes while reducing costs. However, data quality is a major challenge for reliable results. Business process discovery techniques and an associated data model were used to develop a data management tool, ICU-DaMa, for extracting variables essential for overseeing the quality of care in the intensive care unit (ICU). The objective was to determine the feasibility of using ICU-DaMa to automatically extract variables for the minimum dataset and ICU quality indicators from the clinical information system (CIS). The Wilcoxon signed-rank test and Fisher's exact test were used to compare the values extracted from the CIS with ICU-DaMa for 25 variables from all patients attended in a polyvalent ICU during a two-month period against the gold standard of values manually extracted by two trained physicians. Discrepancies with the gold standard were classified into plausibility, conformance, and completeness errors. Data from 149 patients were included. Although there were no significant differences between the automatic method and the manual method, we detected differences in values for five variables: one plausibility error, two conformance errors, and two completeness errors. Plausibility: 1) Sex: ICU-DaMa incorrectly classified one male patient as female (error generated by the Hospital's Admissions Department). Conformance: 2) Reason for isolation: ICU-DaMa failed to detect a human error in which a professional misclassified a patient's isolation. 3) Brain death: ICU-DaMa failed to detect another human error in which a professional likely entered two mutually exclusive values related to the death of the patient (brain death and controlled donation after circulatory death). Completeness: 4) Destination at ICU discharge: ICU-DaMa incorrectly classified two patients because a professional failed to fill out the patient discharge form when the patients died. 5) Length of continuous renal replacement

  5. Multi-objective optimum design of fast tool servo based on improved differential evolution algorithm

    International Nuclear Information System (INIS)

    Zhu, Zhiwei; Zhou, Xiaoqin; Liu, Qiang; Zhao, Shaoxin

    2011-01-01

    The flexure-based mechanism is a promising realization of the fast tool servo (FTS), and the optimum determination of flexure hinge parameters is one of the most important elements in FTS design. This paper presents a multi-objective optimization approach to optimizing the dimension and position parameters of the flexure-based mechanism, based on an improved differential evolution algorithm embedding chaos and a nonlinear simulated annealing algorithm. The results of the optimum design show that the proposed algorithm has excellent performance, and a well-balanced compromise is made between the two conflicting objectives, the stroke and the natural frequency of the FTS mechanism. The validation tests based on finite element analysis (FEA) show good agreement with the results obtained by the proposed theoretical algorithm of this paper. Finally, a series of experimental tests are conducted to validate the design process and assess the performance of the FTS mechanism. The designed FTS reaches a stroke of 10.25 μm with at least 2 kHz bandwidth. Both the FEA and experimental results demonstrate that the parameters of the flexure-based mechanism determined by the proposed approach achieve the specified performance, and that the approach is suitable for the optimum design of FTS mechanisms with excellent performance
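The improved algorithm above embeds chaos and simulated annealing into differential evolution; the classic DE/rand/1/bin scheme it builds on can be sketched as follows (function name and parameter defaults are illustrative, and the chaos/annealing extensions are omitted):

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           iters=200, seed=1):
    """Minimal DE/rand/1/bin sketch: mutate each target vector with a
    scaled difference of two random population members added to a third,
    apply binomial crossover, and keep the trial only if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clamp to the search box
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```

On a simple convex objective such as the 2D sphere function, this sketch converges to the optimum within a few hundred generations; the paper's multi-objective variant additionally balances stroke against natural frequency.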

  6. Fractal Dimension as a Diagnostic Tool of Complex Endometrial Hyperplasia and Well-differentiated Endometrioid Carcinoma.

    Science.gov (United States)

    Bikou, Olga; Delides, Alexander; Drougou, Aggeliki; Nonni, Afroditi; Patsouris, Efstratios; Pavlakis, Kitty

    Fractal dimension (FD) is widely used in medicine and biology as a tool for characterizing structural features. This study aimed to compare pathological endometrium (simple and complex hyperplasia and endometrial carcinoma), as well as endometrial changes during the phases of the menstrual cycle. The main goal was to measure fractal dimension objectively, avoiding subjective evaluation. Two thousand cases of endometrial tissue from patients who underwent dilatation and curettage (D&C) were reviewed. Of these, 137 cases were eligible for the study. In each case, immunohistochemistry with cytokeratin AE1/AE3 was performed in order to simplify the evaluation of the FD. Endometria with carcinoma, simple or complex hyperplasia showed significant differences only in the immunohistochemically stained fractal dimensions. As expected, significant differences were also found between atrophic and secretory endometrium and carcinoma. FD is an objective, rapid and simple procedure for the differential diagnosis between complex hyperplasia and endometrial adenocarcinoma. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.
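Fractal dimension is commonly estimated by box counting; a minimal sketch on a set of pixel coordinates is below (the study's exact image-analysis pipeline is not described in the abstract, so the box sizes and point-set representation here are assumptions):

```python
import math

def box_counting_dimension(points, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary point set by box
    counting: count the occupied boxes N(s) at several box sizes s,
    then fit the slope of log N(s) against log(1/s) by least squares."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope
```

A straight line of points yields a dimension near 1 and a filled square near 2; irregular tissue boundaries fall in between, which is what makes FD usable as a diagnostic feature.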

  7. GAPscreener: An automatic tool for screening human genetic association literature in PubMed using the support vector machine technique

    Directory of Open Access Journals (Sweden)

    Khoury Muin J

    2008-04-01

    Full Text Available Abstract Background Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM, a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. Results The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. Conclusion GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge.
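The abstract mentions keyword selection by a two-way z score method feeding the weighted SVM. A simplified stand-in for that keyword-scoring step can be sketched as a two-proportion z statistic on document frequencies (this is our own illustration, not GAPscreener's exact formula):

```python
import math
from collections import Counter

def keyword_z_scores(pos_docs, neg_docs):
    """Score each word by a two-proportion z statistic comparing its
    document frequency in relevant vs. irrelevant abstracts. Strongly
    positive scores mark candidate keywords for the relevant class,
    strongly negative scores for the irrelevant class."""
    def doc_freq(docs):
        c = Counter()
        for d in docs:
            c.update(set(d.lower().split()))  # count each word once per doc
        return c
    fp, fn = doc_freq(pos_docs), doc_freq(neg_docs)
    n1, n2 = len(pos_docs), len(neg_docs)
    scores = {}
    for w in set(fp) | set(fn):
        p1, p2 = fp[w] / n1, fn[w] / n2
        p = (fp[w] + fn[w]) / (n1 + n2)  # pooled proportion
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        scores[w] = 0.0 if se == 0 else (p1 - p2) / se
    return scores
```

Words with the highest absolute scores would then be retained as SVM features, which is the spirit of the feature-selection step described above.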

  8. Sentiment Analysis and Social Cognition Engine (SEANCE): An automatic tool for sentiment, social cognition, and social-order analysis.

    Science.gov (United States)

    Crossley, Scott A; Kyle, Kristopher; McNamara, Danielle S

    2017-06-01

    This study introduces the Sentiment Analysis and Cognition Engine (SEANCE), a freely available text analysis tool that is easy to use, works on most operating systems (Windows, Mac, Linux), is housed on a user's hard drive (as compared to being accessed via an Internet interface), allows for batch processing of text files, includes negation and part-of-speech (POS) features, and reports on thousands of lexical categories and 20 component scores related to sentiment, social cognition, and social order. In the study, we validated SEANCE by investigating whether its indices and related component scores can be used to classify positive and negative reviews in two well-known sentiment analysis test corpora. We contrasted the results of SEANCE with those from Linguistic Inquiry and Word Count (LIWC), a similar tool that is popular in sentiment analysis, but is pay-to-use and does not include negation or POS features. The results demonstrated that both the SEANCE indices and component scores outperformed LIWC on the categorization tasks.

  9. Algorithmic differentiation of Java programs

    OpenAIRE

    Slusanschi, Emil-Ioan

    2008-01-01

    Derivatives are a crucial component in many areas of science and engineering, and their accurate evaluation is often required in various scientific applications. One technique widely used to obtain computer derivatives is Automatic Differentiation (AD). The fact that to date no usable AD tool implementation exists for Java motivated the development of an AD tool for the Java language. Because of the portability and simplicity in terms of standardization provided by the Java bytecode, our ADiJ...

  10. Development and application of an automatic tool for the selection of control variables based on the self-optimizing control methodology

    Directory of Open Access Journals (Sweden)

    S. K. Silva

    Full Text Available Abstract Rules for control structure design for industrial processes have been extensively proposed in the literature. Some model-based methodologies have a sound mathematical basis, such as the self-optimizing control technology. The procedure can be applied with the aid of available commercial simulators, e.g., PRO/II™ and AspenPlus®, from which converged results are obtained in a form well suited to industrial applications, lessening the effort needed to build an appropriate mathematical model of the plant. Motivated by this context, this work explores the development and application of a tool designed to automatically generate near-optimal control structures for process plants based on the self-optimizing control technology. The goal is to provide a means to facilitate the generation of possible arrangements of controlled variables. Using the local minimum singular value rule supported by a modified version of a branch-and-bound algorithm, the best sets of candidate controlled variables can be identified that minimize the loss between truly optimal operation and operation under a constant set-point policy. A case study consisting of a deethanizer is considered to show the main features of the proposed tool. The conclusion indicates the feasibility of merging complex theoretical contents within the framework of a user-friendly interface simple enough to generate control structures suitable for real-world implementation.

  11. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses due to the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. Amount and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to support analysis of CSR recombination junctions sequenced with a HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.
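The junction structures named above (blunt junctions, untemplated insertions, microhomology) can be distinguished directly from the read coordinates of the donor and acceptor S-region alignments. A minimal sketch follows; the 0-based, half-open coordinate convention is our own assumption, not CSReport's documented interface:

```python
def classify_junction(mu_end, sx_start):
    """Classify a CSR junction from alignment coordinates on the read.

    `mu_end` is the position just past the last read base aligned to the
    donor S region (e.g. Smu); `sx_start` is the first read position
    aligned to the acceptor S region (e.g. Salpha or Sgamma1), both
    0-based and half-open. Overlapping alignments indicate microhomology,
    a gap indicates an untemplated insertion, and adjacency a blunt
    junction. Returns (kind, length).
    """
    if sx_start == mu_end:
        return ("blunt", 0)
    if sx_start < mu_end:
        return ("microhomology", mu_end - sx_start)
    return ("insertion", sx_start - mu_end)
```

Counting the three categories across a high-throughput sequencing library gives the junction-structure profile that the tool reports.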

  12. Molecular polymorphism as a tool for differentiating ground beetles (Carabus species): application of ubiquitin PCR/SSCP analysis.

    Science.gov (United States)

    Boge, A; Gerstmeier, R; Einspanier, R

    1994-11-01

    Differentiation between Carabus species (ground beetles) and subspecies is difficult, despite extensive study. To address this problem we applied PCR in combination with SSCP analysis, focussing on the evolutionarily conserved ubiquitin gene, to elaborate a new approach to molecular differentiation between species. We report that Carabidae possess a ubiquitin gene and that this gene has a multimeric structure. Differential SSCP analysis was performed with the monomeric form of the gene to generate a clear SSCP pattern. This PCR/SSCP approach yielded reproducible patterns throughout our experiments. Comparing different Carabus species (Carabus granulatus, C. irregularis, C. violaceus and C. auronitens), we observed clear interspecies differences but no differences between genders. Some species showed remarkable differences between individuals. We suggest that the ubiquitin PCR-SSCP technique might be an additional tool for the differentiation of ground beetles.

  13. Retrieval interval mapping, a tool to optimize the spectral retrieval range in differential optical absorption spectroscopy

    Science.gov (United States)

    Vogel, L.; Sihler, H.; Lampel, J.; Wagner, T.; Platt, U.

    2012-06-01

    Remote sensing via differential optical absorption spectroscopy (DOAS) has become a standard technique to identify and quantify trace gases in the atmosphere. The technique is applied in a variety of configurations, commonly classified into active and passive instruments using artificial and natural light sources, respectively. Platforms range from ground-based to satellite instruments, and trace gases are studied in all kinds of different environments. Due to the wide range of measurement conditions, atmospheric compositions and instruments used, a specific challenge of a DOAS retrieval is to optimize the parameters for each specific case and particular trace gas of interest. This becomes especially important when measuring close to the detection limit. A well-chosen evaluation wavelength range is crucial to the DOAS technique. It should encompass strong absorption bands of the trace gas of interest in order to maximize the sensitivity of the retrieval, while at the same time minimizing absorption structures of other trace gases and thus potential interferences. Also, instrumental limitations and wavelength-dependent sources of errors (e.g. insufficient corrections for the Ring effect and cross correlations between trace gas cross sections) need to be taken into account. Most often, not all of these requirements can be fulfilled simultaneously and a compromise needs to be found depending on the conditions at hand. Although for many trace gases the overall dependence of a common DOAS retrieval on the evaluation wavelength interval is known, a systematic approach to finding the optimal retrieval wavelength range, with a qualitative assessment, has been missing. Here we present a novel tool to determine the optimal evaluation wavelength range. It is based on mapping retrieved values in the retrieval wavelength space, thus visualizing the consequences of different choices of retrieval spectral ranges, e.g. caused by slightly erroneous absorption cross sections, cross correlations and
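The idea of mapping retrieved values over the retrieval wavelength space can be illustrated with a toy one-species fit: for every candidate window, the slant column that best explains the measured optical depth has a closed-form least-squares solution. This is a drastic simplification of a real DOAS fit (no polynomial, no interfering absorbers), intended only to show the window-scanning structure:

```python
def retrieval_interval_map(wavelengths, optical_depth, cross_section,
                           min_width=5):
    """For every candidate retrieval window [i, j), fit the slant column
    S minimizing sum_k (tau_k - S * sigma_k)^2 over the window, using the
    closed form S = sum(tau*sigma) / sum(sigma^2), and store it in a map
    indexed by window start and end."""
    n = len(wavelengths)
    result = {}
    for i in range(n):
        for j in range(i + min_width, n + 1):
            num = sum(optical_depth[k] * cross_section[k] for k in range(i, j))
            den = sum(cross_section[k] ** 2 for k in range(i, j))
            if den > 0:
                result[(i, j)] = num / den
    return result
```

On clean synthetic data every window retrieves the same column; in practice, systematic errors make the retrieved value drift with the window choice, and that drift is precisely what the mapping visualizes.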

  14. Advanced Differential Radar Interferometry (A-DInSAR) as integrative tool for a structural geological analysis

    Science.gov (United States)

    Crippa, B.; Calcagni, L.; Rossi, G.; Sternai, P.

    2009-04-01

    Advanced Differential SAR interferometry (A-DInSAR) is a technique for monitoring large-coverage surface deformations using a stack of interferograms generated from several complex SLC SAR images acquired over the same target area at different times. In this work we describe the results of a procedure to calculate terrain motion velocity on highly correlated pixels (E. Biescas, M. Crosetto, M. Agudo, O. Monserrat and B. Crippa: Two Radar Interferometric Approaches to Monitor Slow and Fast Land Deformation, 2007) in two areas, Gemona (Friuli, Northern Italy) and Pollino (Calabria, Southern Italy), and, furthermore, present some considerations based on successful examples of the present analysis. The choice of the pixels whose displacement velocity is calculated depends on the dispersion index value (DA) or on coherence values along the stack of interferograms. The A-DInSAR technique allows highly reliable velocity values of the vertical displacement to be obtained. These values concern the movement of minimum surfaces of about 80 m2 at the maximum resolution, and the minimum velocity that can be recognized is of the order of mm/y. Because of the high versatility of the technology, the large dimensions of the area that can be analyzed (about 10,000 km2), and the high precision and reliability of the results obtained, we think it is possible to exploit radar interferometry to obtain important information about the structural context of the studied area that is otherwise very difficult to recognize. Therefore we propose radar interferometry as a valid investigation tool whose results should be considered an important complement to the data collected in fieldwork.

  15. Automatization of the special library as a tool of the provision of the quality services for the readers

    International Nuclear Information System (INIS)

    Zendulkova, D.

    2004-01-01

    The article is concerned with the basic principles of library automation. It deals with selecting the activities to be automated, with regard to the character of the delivered library services as well as user requirements. It analyzes the current local situation regarding the range of library software on offer. It also shows that many criteria should be taken into account when identifying the requirements for a library system. The article briefly characterizes the latest trends in library co-operation, the interchange formats currently used in data processing, and some new legislative documents related to the processing of library collections, all of which influence the properties of library software. It then analyzes the applications typical of a smaller library. These applications, for example the cataloguing of books and periodicals and a circulation (borrowing) system, are administered by the database and retrieval system WinISIS. It deals with the options for publishing library databases produced by this system on the Internet, as well as the possibilities of hypertext linking of library databases to online accessible external information sources. The conclusion describes the services that the Centre of Scientific and Technical Information offers to users and to those interested in software tools for library automation. (author)

  16. Automatization of the special library as a tool of the provision of the quality services for the readers

    International Nuclear Information System (INIS)

    Zendulkova, D.

    2004-01-01

    The presentation is concerned with the basic principles of library automation. It deals with selecting the activities to be automated, with regard to the character of the delivered library services as well as user requirements. It analyzes the current local situation regarding the range of library software on offer. It also shows that many criteria should be taken into account when identifying the requirements for a library system. The presentation briefly characterizes the latest trends in library co-operation, the interchange formats currently used in data processing, and some new legislative documents related to the processing of library collections, all of which influence the properties of library software. It then analyzes the applications typical of a smaller library. These applications, for example the cataloguing of books and periodicals and a circulation (borrowing) system, are administered by the database and retrieval system WinISIS. It deals with the options for publishing library databases produced by this system on the Internet, as well as the possibilities of hypertext linking of library databases to online accessible external information sources. The conclusion describes the services that the Centre of Scientific and Technical Information offers to users and to those interested in software tools for library automation. (author)

  17. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated, in the sense of not being ultimately periodic; they may also look complicated, in the sense that it may not be easy to name the rule by which the sequence is generated; however, there exists a rule which generates the sequence. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
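A standard example of a sequence produced by a finite automaton is the Thue-Morse sequence: a two-state automaton reads the binary digits of n, toggling its state on each 1, and the final state gives the n-th term. A minimal sketch:

```python
def thue_morse(n):
    """n-th term of the Thue-Morse sequence, computed by a two-state
    finite automaton reading the binary digits of n: the final state is
    the parity of the number of 1 bits."""
    state = 0
    for bit in bin(n)[2:]:
        if bit == "1":
            state ^= 1  # toggle on input digit 1
    return state
```

The first eight terms are 0, 1, 1, 0, 1, 0, 0, 1: the sequence never becomes periodic, yet the generating rule is as simple as a two-state automaton, which is exactly the tension the text describes.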

  18. PatternLab for proteomics: a tool for differential shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Yates John R

    2008-07-01

    Full Text Available Abstract Background A goal of proteomics is to distinguish between states of a biological system by identifying protein expression differences. Liu et al. demonstrated a method to perform semi-relative protein quantitation in shotgun proteomics data by correlating the number of tandem mass spectra obtained for each protein, or "spectral count", with its abundance in a mixture; however, two issues have remained open: how to normalize spectral counting data and how to efficiently pinpoint differences between profiles. Moreover, Chen et al. recently showed how to increase the number of identified proteins in shotgun proteomics by analyzing samples with different MS-compatible detergents while performing proteolytic digestion. The latter introduced new challenges from the data analysis perspective, since replicate readings are not acquired. Results To address the open issues above, we present a program termed PatternLab for proteomics. This program implements existing strategies and adds two new methods to pinpoint differences in protein profiles. The first method, ACFold, addresses experiments with fewer than three replicates from each state or with assays acquired by different protocols as described by Chen et al. ACFold uses a combined criterion based on expression fold changes, the AC test, and the false-discovery rate, and can supply a "bird's-eye view" of differentially expressed proteins. The other method addresses experimental designs having multiple readings from each state and is referred to as nSVM (natural support vector machine), so named because of its roots in evolutionary computing and in statistical learning theory. Our observations suggest that nSVM's niche comprises projects that select a minimum set of proteins for classification purposes; for example, the development of an early detection kit for a given pathology. We demonstrate the effectiveness of each method on experimental data and confront them with existing strategies
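The normalization question raised above can be illustrated with the simplest scheme: dividing each protein's spectral count by the run's total count before comparing states. The sketch below is a toy version of that comparison, not PatternLab's ACFold; the pseudocount used to handle proteins absent from one state is our own simplification:

```python
def normalized_fold_changes(counts_a, counts_b, pseudocount=1):
    """Normalize spectral counts by the total counts of each run, then
    report the per-protein fold change between two states. A pseudocount
    avoids division by zero for proteins absent from one state."""
    total_a = sum(counts_a.values()) or 1
    total_b = sum(counts_b.values()) or 1
    proteins = set(counts_a) | set(counts_b)
    fold = {}
    for p in proteins:
        na = (counts_a.get(p, 0) + pseudocount) / total_a
        nb = (counts_b.get(p, 0) + pseudocount) / total_b
        fold[p] = nb / na  # >1 means enriched in state B
    return fold
```

Total-count normalization compensates for runs of different depth; methods like ACFold then combine such fold changes with a statistical test and false-discovery-rate control before calling a protein differentially expressed.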

  19. Automatic flow-through dynamic extraction: A fast tool to evaluate char-based remediation of multi-element contaminated mine soils.

    Science.gov (United States)

    Rosende, María; Beesley, Luke; Moreno-Jimenez, Eduardo; Miró, Manuel

    2016-02-01

    An automatic in-vitro bioaccessibility test based upon dynamic microcolumn extraction in a programmable flow setup is herein proposed as a screening tool to evaluate biochar-based remediation of mine soils contaminated with trace elements, as a compelling alternative to conventional phyto-availability tests. The feasibility of the proposed system was evaluated by extracting the readily bioaccessible pools of As, Pb and Zn in two contaminated mine soils before and after the addition of two biochars (9% (w:w)) of diverse source origin (pine and olive). Bioaccessible fractions under worst-case scenarios were measured using 0.001 mol L(-1) CaCl2 as extractant for mimicking plant uptake, followed by analysis of the extracts by inductively coupled plasma optical emission spectrometry. A t-test comparing means revealed efficient metal (mostly Pb and Zn) immobilization by the olive pruning-based biochar relative to the bare (control) soil at the 0.05 significance level. In-vitro flow-through bioaccessibility tests are compared for the first time with in-vivo phyto-toxicity assays in a microcosm soil study. By assessing seed germination and shoot elongation of Lolium perenne in contaminated soils with and without biochar amendments, the dynamic flow-based bioaccessibility data proved to be in good agreement with the phyto-availability tests. Experimental results indicate that the dynamic extraction method is a viable and economical in-vitro tool in risk assessment explorations to evaluate the feasibility of a given biochar amendment for revegetation and remediation of metal contaminated soils in a mere 10 min versus 4 days for phyto-toxicity assays. Copyright © 2015 Elsevier B.V. All rights reserved.
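
    The comparison-of-means step can be illustrated with SciPy's two-sample t-test. The replicate concentrations below are made up for illustration, not the paper's data:

    ```python
    from scipy import stats

    # Illustrative replicate extraction yields (mg/kg) of a metal for a
    # control soil versus a biochar-amended soil (hypothetical numbers).
    control = [125.0, 131.0, 128.0, 127.0]
    amended = [96.0, 101.0, 99.0, 94.0]

    t, p = stats.ttest_ind(control, amended)
    immobilized = p < 0.05   # means differ at the 0.05 significance level
    ```

    A significant positive t statistic here indicates that the amended soil releases less of the metal, i.e. immobilization by the biochar.
    
    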

  20. Ultraprecise parabolic interpolator for numerically controlled machine tools. [Digital differential analyzer circuit]

    Energy Technology Data Exchange (ETDEWEB)

    Davenport, C. M.

    1977-02-01

    The mathematical basis for an ultraprecise digital differential analyzer circuit for use as a parabolic interpolator on numerically controlled machines has been established, and scaling and other error-reduction techniques have been developed. An exact computer model is included, along with typical results showing tracking to within an accuracy of one part per million.
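
    The additions-only, incremental evaluation at the heart of such a digital differential analyzer can be modeled in software with forward differences of the parabola. This is an illustrative sketch of the principle, not the actual scaled integer circuit:

    ```python
    def parabola_dda(a, b, c, x0, step, n):
        """Incrementally generate points of y = a*x^2 + b*x + c using only
        additions, as a digital differential analyzer does (illustrative
        model; the real circuit works with scaled integer registers)."""
        y = a * x0 * x0 + b * x0 + c
        # First forward difference and the constant second difference.
        d1 = a * (2 * x0 * step + step * step) + b * step
        d2 = 2 * a * step * step
        points = [y]
        for _ in range(n):
            y += d1       # advance one step along the parabola
            d1 += d2      # update the slope increment
            points.append(y)
        return points

    pts = parabola_dda(1, -2, 3, 0.0, 0.5, 4)   # y = x^2 - 2x + 3 on [0, 2]
    ```

    Because the second difference of a quadratic is constant, each new point costs two additions; scaling these increments to integer registers is what bounds the tracking error of the hardware interpolator.
    
    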

  1. Delay differential equations via the matrix Lambert W function and bifurcation analysis: application to machine tool chatter.

    Science.gov (United States)

    Yi, Sun; Nelson, Patrick W; Ulsoy, A Galip

    2007-04-01

    In a turning process modeled using delay differential equations (DDEs), we investigate the stability of the regenerative machine tool chatter problem. An approach using the matrix Lambert W function for the analytical solution to systems of delay differential equations is applied to this problem and compared with the result obtained using a bifurcation analysis. The Lambert W function, known to be useful for solving scalar first-order DDEs, has recently been extended to a matrix Lambert W function approach to solve systems of DDEs. The essential advantages of the matrix Lambert W approach are not only the similarity to the concept of the state transition matrix in linear ordinary differential equations, enabling its use for general classes of linear delay differential equations, but also the observation that we need only the principal branch among an infinite number of roots to determine the stability of a system of DDEs. The bifurcation method combined with Sturm sequences provides an algorithm for determining the stability of DDEs without restrictive geometric analysis. With this approach, one can obtain the critical values of delay, which determine the stability of a system and hence the preferred operating spindle speed without chatter. We apply both the matrix Lambert W function and the bifurcation analysis approach to the problem of chatter stability in turning, and compare the results obtained to existing methods. The two new approaches show excellent accuracy and certain other advantages, when compared to traditional graphical, computational and approximate methods.
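
    In the scalar analogue of this approach, the characteristic roots of x'(t) = a x(t) + b x(t − τ) are s_k = a + W_k(b τ e^(−aτ))/τ, and the principal branch (k = 0) gives the rightmost root, whose real part decides stability. A minimal sketch using SciPy, with illustrative parameter values:

    ```python
    import numpy as np
    from scipy.special import lambertw

    def rightmost_root(a, b, tau):
        """Principal-branch characteristic root of the scalar DDE
        x'(t) = a*x(t) + b*x(t - tau), via s = a + W0(b*tau*exp(-a*tau))/tau
        (scalar analogue of the matrix Lambert W approach)."""
        return a + lambertw(b * tau * np.exp(-a * tau), k=0) / tau

    s = rightmost_root(a=-1.0, b=-0.5, tau=1.0)
    stable = s.real < 0   # principal branch decides stability
    ```

    By construction s satisfies the characteristic equation s = a + b e^(−sτ); in the matrix case the same structure holds with matrix-valued coefficients and a matrix Lambert W function.
    
    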

  2. Fuchsia. A tool for reducing differential equations for Feynman master integrals to epsilon form

    International Nuclear Information System (INIS)

    Gituliar, Oleksandr; Magerya, Vitaly

    2017-01-01

    We present Fuchsia, an implementation of the Lee algorithm, which for a given system of ordinary differential equations with rational coefficients ∂_x f(x,ε) = A(x,ε) f(x,ε) finds a basis transformation T(x,ε), i.e., f(x,ε) = T(x,ε) g(x,ε), such that the system takes the epsilon form ∂_x g(x,ε) = ε S(x) g(x,ε), where S(x) is a Fuchsian matrix. A system of this form can be trivially solved in terms of polylogarithms as a Laurent series in the dimensional regulator ε. That makes the construction of the transformation T(x,ε) crucial for obtaining solutions of the initial system. In principle, Fuchsia can deal with any regular system; however, its primary task is to reduce differential equations for Feynman master integrals, whose solutions contain only regular singularities due to the properties of Feynman integrals.
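
    The claim that an epsilon-form system can be solved order by order in ε is easy to illustrate on a toy Fuchsian example (S(x) = 1/x, g(1) = 1; not a Feynman integral). Each order is an iterated integral of the previous one, producing logarithms, as this SymPy sketch shows:

    ```python
    import sympy as sp

    x, t, eps = sp.symbols('x t epsilon', positive=True)

    # Toy epsilon-form system: dg/dx = eps * g / x with g(1) = 1.
    # Order-by-order integration: g_n(x) = integral_1^x g_{n-1}(t)/t dt.
    g = [sp.Integer(1)]
    for n in range(1, 4):
        g.append(sp.integrate(g[-1].subs(x, t) / t, (t, 1, x)))

    series = sum(eps**n * g[n] for n in range(4))
    # The coefficients are log(x)**n / n!, i.e. the expansion of x**eps.
    ```

    For a genuine Feynman system, S(x) has several Fuchsian poles and the iterated integrals become multiple polylogarithms, but the order-by-order structure is identical.
    
    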

  3. The relevance of clinical balance assessment tools to differentiate balance deficits

    OpenAIRE

    Mancini, Martina; Horak, Fay B

    2010-01-01

    Control of balance is complex and involves maintaining postures, facilitating movement, and recovering equilibrium. Balance control consists of controlling the body center of mass over its limits of stability. Clinical balance assessment can help assess fall risk and/or determine the underlying reasons for balance disorders. Most functional balance assessment scales assess fall risk and the need for balance rehabilitation but do not differentiate types of balance deficits. A system approach t...

  4. Adipose-derived stem cell differentiation as a basic tool for vascularized adipose tissue engineering.

    Science.gov (United States)

    Volz, Ann-Cathrin; Huber, Birgit; Kluger, Petra J

    2016-01-01

    The development of in vitro adipose tissue constructs is highly desired to cope with the increased demand for substitutes to replace damaged soft tissue after high graded burns, deformities or tumor removal. To achieve clinically relevant dimensions, vascularization of soft tissue constructs becomes inevitable but still poses a challenge. Adipose-derived stem cells (ASCs) represent a promising cell source for the setup of vascularized fatty tissue constructs as they can be differentiated into adipocytes and endothelial cells in vitro and are thereby available in sufficiently high cell numbers. This review summarizes the currently known characteristics of ASCs and achievements in adipogenic and endothelial differentiation in vitro. Further, the interdependency of adipogenesis and angiogenesis based on the crosstalk of endothelial cells, stem cells and adipocytes is addressed at the molecular level. Finally, achievements and limitations of current co-culture conditions for the construction of vascularized adipose tissue are evaluated. Copyright © 2016 International Society of Differentiation. Published by Elsevier B.V. All rights reserved.

  5. Differential Diagnosis Tool for Parkinsonian Syndrome Using Multiple Structural Brain Measures

    Directory of Open Access Journals (Sweden)

    Miho Ota

    2013-01-01

    Full Text Available Clinical differentiation of parkinsonian syndromes such as the Parkinson variant of multiple system atrophy (MSA-P) and the cerebellar subtype (MSA-C) from Parkinson's disease is difficult in the early stage of the disease. To identify the correlative pattern of brain changes for differentiating parkinsonian syndromes, we applied discriminant analysis techniques to magnetic resonance imaging (MRI) data. T1-weighted volume data and diffusion tensor images were obtained by MRI in 18 patients with MSA-C, 12 patients with MSA-P, 21 patients with Parkinson's disease, and 21 healthy controls. They were evaluated using voxel-based morphometry and tract-based spatial statistics, respectively. Discriminant functions derived by stepwise methods resulted in a correct classification rate of 0.89. When differentiating these diseases with the use of three independent variables together, the correct classification rate was the same as that obtained with stepwise methods. These findings support the view that each parkinsonian syndrome has structural deviations in multiple brain areas and that a combination of structural brain measures can help to distinguish parkinsonian syndromes.

  6. Advances in geospatial analysis platforms and tools: Creating space for differentiated policy and investment responses

    CSIR Research Space (South Africa)

    Maritz, Johan

    2010-09-01

    Full Text Available Over the last 5 years a set of incremental advances within geospatial analysis platforms and tools developed by the CSIR's Planning Support Systems in collaboration with key stakeholders such as The Presidency, enabled a more nuanced regional level...

  7. 'Feeling good' unpacked : Developing design tools to facilitate a differentiated understanding of positive emotions

    NARCIS (Netherlands)

    Yoon, J.; Pohlmeyer, A.E.; Desmet, P.M.A.; Desmet, P.M.A.; Fokkinga, S.F.; Ludden, G.D.S.; Cila, N.; Van Zuthem, H.

    2016-01-01

    The range of positive emotions experienced in human-product interactions is diverse, and understanding the differences and similarities between these positive emotions can support emotion-driven design. Yet, there is little knowledge about what kind of tool would be effective to leverage

  8. Heart Rate Variability – a Tool to Differentiate Positive and Negative Affective States in Pigs?

    Science.gov (United States)

    The causal neurophysiological processes, such as autonomic nervous system activity, that mediate behavioral and physiological reactivity to an environment have largely been ignored. Heart rate variability (HRV) analysis is a clinical diagnostic tool used to assess affective states (stressful and ple...

  9. Dementia Apraxia Test (DATE): A Brief Tool to Differentiate Behavioral Variant Frontotemporal Dementia from Alzheimer's Dementia Based on Apraxia Profiles.

    Science.gov (United States)

    Johnen, Andreas; Frommeyer, Jana; Modes, Fenja; Wiendl, Heinz; Duning, Thomas; Lohmann, Hubertus

    2016-01-01

    Standardized praxis assessments with modern, empirically validated screening tests have substantially improved clinical evaluation of apraxia in patients with stroke. Although apraxia may contribute to early differential diagnosis of Alzheimer's dementia (AD) and behavioral variant frontotemporal dementia (bvFTD), no comparable test is readily available to clinicians for this purpose to date. To design a clinically useful apraxia test for the differentiation of AD and bvFTD, 84 test items pertaining to twelve praxis subdomains were evaluated for their efficacy to discriminate between patients with bvFTD (n = 24), AD (n = 28), and elderly healthy controls (HC; n = 35). Items were then selected based on discriminative value and psychometric properties. Items indicative of mild AD comprised spatially complex imitation of hand and finger postures and, to a lesser degree, pantomime of common object use. Buccofacial apraxia, including imitation of face postures, emblematic face postures, and repetition of multisyllabic pseudowords, differentiated bvFTD from HC and AD. The final test version, consisting of 20 items, proved highly efficient for the discrimination of biologically confirmed dementia patients from HC (sensitivity 91%, specificity 71%) but also for differential diagnosis of bvFTD and AD (sensitivity 74%, specificity 93%). Assessment of praxis profiles effectively contributes to diagnosis and differential diagnosis of AD and bvFTD. The Dementia Apraxia Test (DATE) is a brief and easy-to-administer cognitive tool for dementia assessment, has a high inter-rater reliability (Cohen's κ = 0.885), and demonstrates content validity.

  10. A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kieseler, Jan [CERN, Geneva (Switzerland)

    2017-11-15

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, this information is only rarely publicly available. In the absence of this information, the most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases as shown in this article. The method discussed here provides a solution for this problem. It relies only on the public result and its covariance or Hessian, and is validated against the combined-likelihood approach. A dedicated software package implementing this method is also presented. It provides a text-based user interface alongside a C++ interface. The latter also interfaces to ROOT classes for simple combination of binned measurements such as differential cross sections. (orig.)
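
    A simplified stand-in for such a combination is the textbook best linear unbiased estimate (BLUE), which likewise needs only the measurements and their covariance. The numbers below are illustrative; the paper's method generalizes this idea to measurements with fitted nuisance parameters:

    ```python
    import numpy as np

    def combine(values, cov):
        """Best linear unbiased combination of correlated measurements of a
        single quantity (textbook BLUE/generalized least squares; a
        simplified stand-in for the likelihood-based method of the paper)."""
        values = np.asarray(values, dtype=float)
        cov = np.asarray(cov, dtype=float)
        w = np.linalg.solve(cov, np.ones(len(values)))
        w /= w.sum()                      # weights sum to one (unbiasedness)
        combined = w @ values
        variance = w @ cov @ w
        return combined, np.sqrt(variance)

    # Two measurements of the same observable with correlated uncertainties.
    val, err = combine([10.0, 11.0], [[1.0, 0.3], [0.3, 0.8]])
    ```

    The combined uncertainty is smaller than either input uncertainty, and the off-diagonal covariance term is what encodes the shared systematic constraints.
    
    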

  11. Volatile fraction composition and physicochemical parameters as tools for the differentiation of lemon blossom honey and orange blossom honey.

    Science.gov (United States)

    Kadar, Melinda; Juan-Borrás, Marisol; Carot, Jose M; Domenech, Eva; Escriche, Isabel

    2011-12-01

    Volatile fraction profile and physicochemical parameters were studied with the aim of evaluating their effectiveness for the differentiation between lemon blossom honey (Citrus limon L.) and orange blossom honey (Citrus spp.). They would be useful complementary tools to the traditional analysis based on the percentage of pollen. A stepwise discriminant analysis constructed using 37 volatile compounds (extracted by purge and trap and analysed by gas chromatography-mass spectrometry), and physicochemical and colour parameters (diastase, conductivity, Pfund colour and CIE L*a*b*) together provided a model that permitted the correct classification of 98.3% of the original and 96.6% of the cross-validated cases, indicating its efficiency and robustness. This model proved its effectiveness in the differentiation of both types of honey with another set of batches from the following year. This model, developed from the volatile compounds, physicochemical and colour parameters, has been useful for the differentiation of lemon and orange blossom honeys. Furthermore, it may be of particular interest for the attainment of a suitable classification of orange honey in which the pollen count is very low. These capabilities imply an evident marketing advantage for the beekeeping sector, since lemon blossom honey could be commercialized as unifloral honey and not as generic citrus honey and orange blossom honey could be correctly characterized. Copyright © 2011 Society of Chemical Industry.
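
    A minimal two-class Fisher discriminant on synthetic data illustrates the kind of linear classification rule such a stepwise discriminant analysis produces. The feature values below are hypothetical, not the honey measurements:

    ```python
    import numpy as np

    def fisher_lda(X1, X2):
        """Two-class Fisher linear discriminant (a minimal stand-in for the
        stepwise discriminant model built from volatile and physicochemical
        variables)."""
        X1, X2 = np.asarray(X1, float), np.asarray(X2, float)
        m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
        # Pooled within-class scatter matrix.
        Sw = np.cov(X1.T) * (len(X1) - 1) + np.cov(X2.T) * (len(X2) - 1)
        w = np.linalg.solve(Sw, m1 - m2)     # discriminant direction
        threshold = w @ (m1 + m2) / 2        # midpoint decision boundary
        return w, threshold

    # Synthetic two-feature "honeys": two well-separated clusters.
    rng = np.random.default_rng(0)
    lemon = rng.normal([5.0, 2.0], 0.3, size=(20, 2))
    orange = rng.normal([3.0, 3.5], 0.3, size=(20, 2))
    w, thr = fisher_lda(lemon, orange)
    pred_lemon = lemon @ w > thr   # True for samples classified as lemon
    ```

    The stepwise variant simply adds or removes variables one at a time, keeping those that most improve the cross-validated classification rate.
    
    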

  12. Decision support tool for early differential diagnosis of acute lung injury and cardiogenic pulmonary edema in medical critically ill patients.

    Science.gov (United States)

    Schmickl, Christopher N; Shahjehan, Khurram; Li, Guangxi; Dhokarh, Rajanigandha; Kashyap, Rahul; Janish, Christopher; Alsara, Anas; Jaffe, Allan S; Hubmayr, Rolf D; Gajic, Ognjen

    2012-01-01

    At the onset of acute hypoxic respiratory failure, critically ill patients with acute lung injury (ALI) may be difficult to distinguish from those with cardiogenic pulmonary edema (CPE). No single clinical parameter provides satisfying prediction. We hypothesized that a combination of those will facilitate early differential diagnosis. In a population-based retrospective development cohort, validated electronic surveillance identified critically ill adult patients with acute pulmonary edema. Recursive partitioning and logistic regression were used to develop a decision support tool based on routine clinical information to differentiate ALI from CPE. Performance of the score was validated in an independent cohort of referral patients. Blinded post hoc expert review served as gold standard. Of 332 patients in a development cohort, expert reviewers (κ, 0.86) classified 156 as having ALI and 176 as having CPE. The validation cohort had 161 patients (ALI = 113, CPE = 48). The score was based on risk factors for ALI and CPE, age, alcohol abuse, chemotherapy, and peripheral oxygen saturation/FiO2 ratio. It demonstrated good discrimination (area under curve [AUC] = 0.81; 95% CI, 0.77-0.86) and calibration (Hosmer-Lemeshow [HL] P = .16). Similar performance was obtained in the validation cohort (AUC = 0.80; 95% CI, 0.72-0.88; HL P = .13). A simple decision support tool accurately classifies acute pulmonary edema, reserving advanced testing for a subset of patients in whom satisfying prediction cannot be made. This novel tool may facilitate early inclusion of patients with ALI and CPE into research studies as well as improve and rationalize clinical management and resource use.
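
    The discrimination figure quoted (AUC) can be computed directly from model scores via the Mann-Whitney formulation, as in this generic sketch. The risk scores below are hypothetical, not the study's model outputs:

    ```python
    import numpy as np

    def auc_mann_whitney(scores_pos, scores_neg):
        """Area under the ROC curve via the Mann-Whitney statistic: the
        probability that a randomly chosen positive case (here, ALI) scores
        higher than a randomly chosen negative case (here, CPE)."""
        pos = np.asarray(scores_pos, float)[:, None]
        neg = np.asarray(scores_neg, float)[None, :]
        # Ties contribute half a point, per the standard convention.
        return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

    # Hypothetical risk scores from a fitted logistic model.
    auc = auc_mann_whitney([0.9, 0.8, 0.7, 0.55], [0.6, 0.4, 0.3, 0.2])
    ```

    An AUC of 0.81, as reported for the development cohort, means a randomly chosen ALI patient outscores a randomly chosen CPE patient 81% of the time.
    
    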

  13. Metaproteomics analyses as diagnostic tool for differentiation of Escherichia coli strains in outbreaks

    Science.gov (United States)

    Jabbour, Rabih E.; Wright, James D.; Deshpande, Samir V.; Wade, Mary; McCubbin, Patrick; Bevilacqua, Vicky

    2013-05-01

    The secreted proteins of the enterohemorrhagic and enteropathogenic E. coli (EHEC and EPEC) are the most common cause of hemorrhagic colitis, a bloody diarrhea accompanying EHEC infection, which often can lead to life-threatening hemolytic-uremic syndrome (HUS). We are employing a metaproteomic approach as an effective and complementary technique to the current genomic-based approaches. This metaproteomic approach will evaluate the secreted proteins associated with pathogenicity and utilize their signatures as differentiation biomarkers between EHEC and EPEC strains. The results showed that the identified tryptic peptides of the secreted proteins extracted from different EHEC and EPEC growths differ in their amino acid sequences and could potentially be utilized as biomarkers for the studied E. coli strains. Analysis of an extract from EHEC O104:H4 resulted in identification of a multidrug efflux protein, which belongs to the family of fusion proteins responsible for cellular transport. The experimentally identified peptides lie in the region of the HlyD haemolysin secretion protein D that is responsible for transporting the haemolysin A toxin. Moreover, the taxonomic classification of EHEC O104:H4 showed the closest match with E. coli E55989, which is in agreement with the extensive genomic sequencing studies of that strain. The taxonomic results showed strain-level classification for the studied strains and distinctive separation among the strains. Comparative proteomic calculations showed separation between EHEC O157:H7 and O104:H4 in replicate samples using cluster analysis. There are no reported studies addressing the characterization of secreted proteins in various enhanced growth media and utilizing them as biomarkers for strain differentiation.
The FY-2012 results are promising enough to pursue further experimentation to statistically validate them and to further explore the impact of environmental conditions on the nature of the secreted

  14. Star-Shaped Fluid Flow Tool for Use in Making Differential Measurements

    Science.gov (United States)

    England, John Dwight (Inventor); Kelley, Anthony R. (Inventor); Cronise, Raymond J. (Inventor)

    2014-01-01

    A fluid flow tool's plate-like structure has a ring portion defining a flow hole, a support portion extending radially away from the ring portion and adapted to be coupled to conduit wall, and extensions extending radially away from the ring portion such that a periphery of the plate-like structure is defined by the extensions and trough regions between adjacent extensions. One or more ports formed in the ring portion are in fluid communication with the flow hole. A first manifold in the plate-like structure is in fluid communication with each port communicating with the flow hole. One or more ports are formed in the periphery of the plate-like structure. A second manifold in the plate-like structure is in fluid communication with each port formed in the periphery. The first and second manifolds extend through the plate-like structure to terminate and be accessible at the conduit wall.

  15. Combinatorial hexapeptide ligand libraries (ProteoMiner): an innovative fractionation tool for differential quantitative clinical proteomics.

    Science.gov (United States)

    Hartwig, Sonja; Czibere, Akos; Kotzka, Jorg; Passlack, Waltraud; Haas, Rainer; Eckel, Jürgen; Lehr, Stefan

    2009-07-01

    Blood serum samples are the major source for clinical proteomics approaches, which aim to identify diagnostically relevant or treatment-response-related proteins. However, the presence of very high-abundance proteins and the enormous dynamic range of protein distribution hinder whole-serum analysis. An innovative tool to overcome these limitations utilizes combinatorial hexapeptide ligand libraries (ProteoMiner). Here, we demonstrate that ProteoMiner can be used for comparative and quantitative analysis of complex proteomes. We spiked serum samples with increasing amounts (3 microg to 300 microg) of whole E. coli lysate, processed them with ProteoMiner, and performed quantitative analyses of 2D gels. We found that the concentration of the spiked bacterial proteome, reflected by the maintained proportional spot intensities, was not altered by ProteoMiner treatment. Therefore, we conclude that the ProteoMiner technology can be used for quantitative analysis of low-abundance proteins in complex biological samples.

  16. Automatic tracking of the constancy of the imaging chain of radiographic equipment using an integrated tool consisting of a phantom and evaluation software; Seguimiento automatico de la constancia de la cadena de imagen de Equipos Radiograficos mediante herramienta integrada por maniqui y software de evaluacion

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, P.; Rodenas, F.; Marin, B.; Alcaraz, D.; Verdu, G.

    2011-07-01

    This paper presents a tool, innovative at the national level, for the automatic analysis of the constancy of the imaging chain of digital radiographic equipment, both computed radiography (CR) and direct digital radiography (DR).

  17. Recognizing and differentiating uncommon body fluids: Considerations and tools for a proper practical approach.

    Science.gov (United States)

    Janssens, Pim M W

    2017-08-01

    Clinical laboratories are regularly requested to inspect uncommon body fluids obtained from patients because clinicians are uncertain as to the origin of the collected material. They may need this information for the actual diagnosis, to confirm a supposition, or for guiding treatment and invasive operations like draining and puncturing. Often there is also a need to know more precisely what is going on in the cavity that gave rise to the fluid, for instance a local infection or metastasis, or whether the cavity is connected to nearby organs or fluid compartments. The results of the laboratory investigations often have direct consequences. As the investigation of uncommon body fluids is distinct from routine laboratory analyses, it requires special attention. This paper presents an overview of the characteristics of uncommon human body fluids, constituents useful as markers for recognizing and differentiating fluids, and considerations that have to be taken into account when interpreting the results of analyses. In addition, a number of practical recommendations for approaching the task of identifying uncommon body fluids are given. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Embryonic hybrid cells: a powerful tool for studying pluripotency and reprogramming of the differentiated cell chromosomes

    Directory of Open Access Journals (Sweden)

    SEROV OLEG

    2001-01-01

    Full Text Available The properties of embryonic hybrid cells obtained by fusion of embryonic stem (ES) or teratocarcinoma (TC) cells with differentiated cells are reviewed. Usually, ES-somatic or TC-somatic hybrids retain pluripotent capacity at high levels quite comparable or nearly identical to those of the pluripotent partner. When cultured in vitro, ES-somatic and TC-somatic hybrid cell clones, as a rule, lose the chromosomes derived from the somatic partner; however, in some clones the autosomes from the ES cell partner were also eliminated, i.e., the parental chromosomes segregated bilaterally in the ES-somatic cell hybrids. This opens up ways to search for correlations between the pluripotent status of the hybrid cells and chromosome segregation patterns, and therefore to identify the particular chromosomes involved in the maintenance of pluripotency. Use of a selective medium makes it possible to isolate in vitro clones of ES-somatic hybrid cells in which "the pluripotent" chromosome has been replaced by "the somatic" counterpart carrying the selectable gene. Unlike the TC-somatic cell hybrids, the ES-somatic hybrids with a near-diploid complement of chromosomes are able to contribute to various tissues of chimeric animals after injection into the blastocoel cavity. Analysis of the chimeric animals showed that the "somatic" chromosome undergoes reprogramming during development. The prospects for the identification of the chromosomes that are involved in the maintenance of pluripotency and its cis- and trans-regulation in the hybrid cell genome are discussed.

  19. Bayesian nonparametric variable selection as an exploratory tool for discovering differentially expressed genes.

    Science.gov (United States)

    Shahbaba, Babak; Johnson, Wesley O

    2013-05-30

    High-throughput scientific studies involving no clear a priori hypothesis are common. For example, a large-scale genomic study of a disease may examine thousands of genes without hypothesizing that any specific gene is responsible for the disease. In these studies, the objective is to explore a large number of possible factors (e.g., genes) in order to identify a small number that will be considered in follow-up studies that tend to be more thorough and on smaller scales. A simple, hierarchical, linear regression model with random coefficients is assumed for case-control data that correspond to each gene. The specific model used will be seen to be related to a standard Bayesian variable selection model. Relatively large regression coefficients correspond to potential differences in responses for cases versus controls and thus to genes that might 'matter'. For large-scale studies, and using a Dirichlet process mixture model for the regression coefficients, we are able to find clusters of regression effects of genes with increasing potential effect or 'relevance', in relation to the outcome of interest. One cluster will always correspond to genes whose coefficients are in a neighborhood that is relatively close to zero and will be deemed least relevant. Other clusters will correspond to increasing magnitudes of the random/latent regression coefficients. Using simulated data, we demonstrate that our approach could be quite effective in finding relevant genes compared with several alternative methods. We apply our model to two large-scale studies. The first study involves transcriptome analysis of infection by human cytomegalovirus. The second study's objective is to identify differentially expressed genes between two types of leukemia. Copyright © 2012 John Wiley & Sons, Ltd.
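
    The Dirichlet process prior underlying this clustering of regression coefficients can be illustrated with its stick-breaking construction. This is a generic sketch of how DP mixture weights are drawn, not the paper's sampler:

    ```python
    import numpy as np

    def stick_breaking(alpha, n_sticks, rng):
        """Stick-breaking draw of (truncated) Dirichlet process mixture
        weights: break off a Beta(1, alpha) fraction of the remaining stick
        at each step. Larger alpha spreads mass over more clusters."""
        betas = rng.beta(1.0, alpha, size=n_sticks)
        # Length of stick remaining before each break.
        remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
        return betas * remaining

    rng = np.random.default_rng(42)
    weights = stick_breaking(alpha=2.0, n_sticks=500, rng=rng)
    ```

    In the model above, each cluster weight multiplies a component for the latent regression coefficients; the cluster whose coefficients sit near zero collects the "least relevant" genes, while the remaining clusters rank genes by increasing effect size.
    
    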

  20. "NeuroStem Chip": a novel highly specialized tool to study neural differentiation pathways in human stem cells

    Directory of Open Access Journals (Sweden)

    Li Jia-Yi

    2007-02-01

    Full Text Available Abstract Background Human stem cells are viewed as a possible source of neurons for a cell-based therapy of neurodegenerative disorders, such as Parkinson's disease. Several protocols that generate different types of neurons from human stem cells (hSCs) have been developed. Nevertheless, the cellular mechanisms that underlie the development of neurons in vitro as they are subjected to the specific differentiation protocols are often poorly understood. Results We have designed a focused DNA (oligonucleotide)-based large-scale microarray platform (named "NeuroStem Chip") and used it to study gene expression patterns in hSCs as they differentiate into neurons. We have selected genes that are relevant to cells (i) being stem cells, (ii) becoming neurons, and (iii) being neurons. The NeuroStem Chip has over 1,300 pre-selected gene targets and multiple controls spotted in quadruplicates (~46,000 spots total). In this study, we present the NeuroStem Chip in detail and describe the special advantages it offers to the fields of experimental neurology and stem cell biology. To illustrate the utility of the NeuroStem Chip platform, we have characterized an undifferentiated population of pluripotent human embryonic stem cells (hESCs), cell line SA02. In addition, we have performed a comparative gene expression analysis of those cells versus a heterogeneous population of hESC-derived cells committed towards the neuronal/dopaminergic differentiation pathway by co-culturing with PA6 stromal cells for 16 days and containing a few tyrosine hydroxylase-positive dopaminergic neurons. Conclusion We characterized the gene expression profiles of undifferentiated and dopaminergic lineage-committed hESC-derived cells using a highly focused custom microarray platform (NeuroStem Chip) that can become an important research tool in human stem cell biology. We propose that the areas of application for the NeuroStem microarray platform could be the following: (i) characterization of the

  1. Airfoil-Shaped Fluid Flow Tool for Use in Making Differential Measurements

    Science.gov (United States)

    England, John Dwight (Inventor); Kelley, Anthony R. (Inventor); Cronise, Raymond J. (Inventor)

    2014-01-01

    A fluid flow tool includes an airfoil structure and a support arm. The airfoil structure's high-pressure side and low-pressure side are positioned in a conduit by the support arm coupled to the conduit. The high-pressure and low-pressure sides substantially face opposing walls of the conduit. At least one measurement port is formed in the airfoil structure at each of its high-pressure side and low-pressure side. A first manifold, formed in the airfoil structure and in fluid communication with each measurement port so-formed at the high-pressure side, extends through the airfoil structure and support arm to terminate and be accessible at the exterior wall of the conduit. A second manifold, formed in the airfoil structure and in fluid communication with each measurement port so-formed at the low-pressure side, extends through the airfoil structure and support arm to terminate and be accessible at the exterior wall of the conduit.

  2. Development of computational tools for automatic modeling and FE (Finite Element) analysis of corroded pipelines; Desenvolvimento de ferramentas computacionais para modelagem e analise automatica de defeitos de corrosao em dutos via MEF (Metodo de Elemento Finito)

    Energy Technology Data Exchange (ETDEWEB)

    Cabral, Helder Lima Dias; Willmersdorf, Ramiro Brito; Lyra, Paulo Roberto Maciel [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Engenharia Mecanica], e-mail: hldcabral@yahoo.com.br, e-mail: ramiro@willmersdorf.net, e-mail: prmlyra@ufpe.br; Silva, Silvana Maria Bastos Afonso da [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Engenharia Civil], e-mail: smb@ufpe.br

    2008-06-15

    Corrosion is one of the most common causes of accidents involving oil and gas pipelines. Computational simulation using the finite element method (FEM) is one of the most efficient tools to reliably quantify the remaining strength of corroded pipes. However, the modeling process demands intense manual engineering labor and is slow and extremely repetitive, making it prone to errors. The main purpose of this work is to present the PIPEFLAW program, which provides tools for automatically generating FE pipe models with corrosion defects, ready to be analyzed with commercial FEM programs. PIPEFLAW's computational tools are based on the MSC.Patran pre- and post-processing program and were written in PCL (Patran Command Language). The program has a user-friendly customized graphical interface, which allows the user to provide the main parameters of the pipe and defect (or a series of defects). PIPEFLAW allows the user to automatically generate FE pipe models with rectangular or elliptical corrosion defects located on the internal or external pipe surface. Defects generated by PIPEFLAW can be configured as an isolated (single) defect or as multiple defects (aligned or located at arbitrary positions). These tools were validated by comparing the results of numerical simulations, made with the PIPEFLAW tools, with the numerical, experimental and semi-empirical results available in the literature. The results confirmed the robustness of the PIPEFLAW tools, which proved to be a rapid way of generating reliable FE models ready to be used in the structural evaluation of corroded pipelines. (author)

  3. Not proper ROC curves as new tool for the analysis of differentially expressed genes in microarray experiments

    Directory of Open Access Journals (Sweden)

    Pistoia Vito

    2008-10-01

    Full Text Available Abstract Background Most microarray experiments are carried out with the purpose of identifying genes whose expression varies in relation to specific conditions or in response to environmental stimuli. In such studies, genes showing similar mean expression values between two or more groups are considered as not differentially expressed, even if hidden subclasses with different expression values may exist. In this paper we propose a new method for identifying differentially expressed genes, based on the area between the ROC curve and the rising diagonal (ABCR). ABCR represents a more general approach than the standard area under the ROC curve (AUC), because it can identify both proper (i.e., concave) and not proper ROC curves (NPRC). In particular, NPRC may correspond to those genes that tend to escape standard selection methods. Results We assessed the performance of our method using data from a publicly available database of 4026 genes, including 14 normal B cell samples (NBC) and 20 heterogeneous lymphomas (9 follicular lymphomas and 11 chronic lymphocytic leukemias). Moreover, NBC also included two sub-classes, i.e., 6 heavily stimulated and 8 slightly or not stimulated samples. We identified 1607 differentially expressed genes with an estimated False Discovery Rate of 15%. Among them, 16 corresponded to NPRC and all escaped standard selection procedures based on AUC and t statistics. Moreover, simple inspection of the shape of such plots allowed identification of the two subclasses within a single class in 13 cases (81%). Conclusion NPRCs represent a useful new tool for the analysis of microarray data.
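    A minimal numeric sketch of the ABCR idea (not the authors' implementation): for a gene whose "disease" samples split into an over- and an under-expressed subclass, the ROC curve crosses the diagonal, so the AUC stays near 0.5 while the ABCR remains clearly positive.

```python
import numpy as np

def roc_points(scores, labels):
    """ROC (FPR, TPR) points obtained by sweeping a threshold over the scores."""
    order = np.argsort(-np.asarray(scores, float))
    y = np.asarray(labels, float)[order]
    tpr = np.concatenate([[0.0], np.cumsum(y) / y.sum()])
    fpr = np.concatenate([[0.0], np.cumsum(1 - y) / (1 - y).sum()])
    return fpr, tpr

def auc_and_abcr(scores, labels):
    """AUC and ABCR via trapezoidal integration; ABCR integrates |TPR - FPR|,
    so 'not proper' curves that dip below the diagonal still accumulate area."""
    fpr, tpr = roc_points(scores, labels)
    dx = np.diff(fpr)
    diff = np.abs(tpr - fpr)
    auc = float(np.sum((tpr[1:] + tpr[:-1]) / 2 * dx))
    abcr = float(np.sum((diff[1:] + diff[:-1]) / 2 * dx))
    return auc, abcr

# A gene with two hidden subclasses: half strongly over-, half under-expressed.
case = [10, 9, 8, -1, -2, -3]   # "disease" samples (two hidden subclasses)
ctrl = [5, 4, 3, 2, 1, 0]       # controls, intermediate expression
auc_val, abcr_val = auc_and_abcr(case + ctrl, [1] * 6 + [0] * 6)
```

    On this toy example the AUC is exactly 0.5 (the gene "escapes" an AUC-based filter) while the ABCR is 0.25, flagging the gene as interesting.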

  4. Automatic Picking of Foraminifera: Design of the Foraminifera Image Recognition and Sorting Tool (FIRST) Prototype and Results of the Image Classification Scheme

    Science.gov (United States)

    de Garidel-Thoron, T.; Marchant, R.; Soto, E.; Gally, Y.; Beaufort, L.; Bolton, C. T.; Bouslama, M.; Licari, L.; Mazur, J. C.; Brutti, J. M.; Norsa, F.

    2017-12-01

    Foraminifera tests are the main proxy carriers for paleoceanographic reconstructions. Both geochemical and taxonomical studies require large numbers of tests to achieve statistical relevance. To date, the extraction of foraminifera from the sediment coarse fraction is still done by hand and is thus time-consuming. Moreover, the recognition of ecologically relevant morphotypes requires taxonomical skills that are not easily taught. The automatic recognition and extraction of foraminifera would therefore greatly help paleoceanographers overcome these issues. Recent advances in automatic image classification using machine learning open the way to automatic extraction of foraminifera. Here we detail progress on the design of an automatic picking machine as part of the FIRST project. The machine handles 30 pre-sieved samples (100-1000 µm), separating them into individual particles (including foraminifera) and imaging each in pseudo-3D. The particles are classified and specimens of interest are sorted either for Individual Foraminifera Analyses (44 per slide) and/or for classical multiple analyses (8 morphological classes per slide, up to 1000 individuals per hole). The classification is based on machine learning using Convolutional Neural Networks (CNNs), similar to the approach used in the coccolithophorid imaging system SYRACO. To prove its feasibility, we built two training image datasets of modern planktonic foraminifera containing approximately 2000 and 5000 images, corresponding to 15 and 25 morphological classes respectively. Using a CNN with a residual topology (ResNet) we achieve over 95% correct classification for each dataset. We tested the network on 160,000 images from 45 depths of a sediment core from the Pacific Ocean, for which we have human counts. The current algorithm is able to reproduce the downcore variability in both Globigerinoides ruber and the fragmentation index (r² = 0.58 and 0.88, respectively). The FIRST prototype yields some promising results for high

  5. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    Science.gov (United States)

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed the Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment used were a patient monitor, an anaesthesia machine, and a bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, a regional oximeter, and a rapid infusion device were added as required. The automatic recording option was used exclusively, and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) of cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.

  6. Cytokine and chemokine profiles in fibromyalgia, rheumatoid arthritis and systemic lupus erythematosus: a potentially useful tool in differential diagnosis.

    Science.gov (United States)

    Wallace, Daniel J; Gavin, Igor M; Karpenko, Oleksly; Barkhordar, Farnaz; Gillis, Bruce S

    2015-06-01

    Making a correct diagnosis is pivotal in the practice of clinical rheumatology. Occasionally, the consultation fails to provide the desired clarity in labeling an individual as having fibromyalgia (FM), systemic lupus erythematosus (SLE) or rheumatoid arthritis (RA). A chemokine and cytokine multiplex assay was developed and tested with the goal of achieving an accurate differential diagnosis. 160 patients with FM, 98 with RA and 100 with SLE fulfilling accepted criteria were recruited and compared to 119 controls. Supernatant cytokine concentrations for IL-6, IL-8, MIP-1 alpha and MIP-1 beta were determined using the Luminex multiplex immunoassay bead array technology after mitogenic stimulation of cultured peripheral blood mononuclear cells. Each patient's profile was scored using a logistic regression model to achieve statistically determined weighting for each chemokine and cytokine. Among the 477 patients evaluated, the mean scores were: FM, 1.7 ± 1.2 (1.52 to 1.89); controls, -3.56 ± 5.7 (-4.59 to -2.54); RA, -0.68 ± 2.26 (-1.12 to -0.23); and SLE, -1.45 ± 3.34 (-2.1 to -0.79). Ninety-three percent of patients with FM scored positive, compared to only 11% of healthy controls; 69% of RA and 71% of SLE patients had negative scores. The sensitivity, specificity, positive predictive value and negative predictive value for having FM compared to controls were 93, 89, 92 and 91%, respectively (p < 2.2 × 10⁻¹⁶). Evaluating cytokine and chemokine profiles in stimulated cells reveals patterns that are uniquely present in patients with FM. This assay can be a useful tool in assisting clinicians in differentiating systemic inflammatory autoimmune processes from FM and its related syndromes and from healthy individuals.
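    The scoring step can be illustrated with a toy logistic model. The weights and intercept below are invented for illustration only; the paper's fitted coefficients are not published in this record:

```python
import math

# Hypothetical weights and intercept for the four analytes (illustrative only).
WEIGHTS = {"IL-6": 0.8, "IL-8": 0.6, "MIP-1a": 0.5, "MIP-1b": 0.4}
INTERCEPT = -3.0

def fm_score(profile):
    """Weighted linear score of stimulated-cell cytokine/chemokine levels.

    Following the record's convention, positive scores point towards an
    FM-like profile and negative scores away from it.
    """
    return INTERCEPT + sum(WEIGHTS[name] * level for name, level in profile.items())

def fm_probability(profile):
    """Logistic transform of the linear score into a class probability."""
    return 1.0 / (1.0 + math.exp(-fm_score(profile)))

flat = {"IL-6": 0, "IL-8": 0, "MIP-1a": 0, "MIP-1b": 0}      # unstimulated-like
elevated = {"IL-6": 2, "IL-8": 2, "MIP-1a": 2, "MIP-1b": 2}  # FM-like response
```

    With these made-up coefficients, the flat profile scores negative (non-FM) and the elevated one positive, mirroring how the published model separates the groups.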

  7. Automatic centering device of a tool with regard to an aperture. Dispositif de centrage automatique d'un outil par rapport a un orifice

    Energy Technology Data Exchange (ETDEWEB)

    Delevalee, A.

    1993-01-08

    The manipulator arm carries a fixed support and a mobile support that can move perpendicularly to the axis of the tube. The mobile support can be held in any position by a brake arrangement. An index device allows the positioning of the mobile support in an initial predetermined position. A conical centering device can be placed coaxial with the tool and as it enters the tube ensures the alignment of the axis of the tool with the axis of the tube.

  8. Automatic Localization of Vertebral Levels in X-Ray Fluoroscopy Using 3D-2D Registration: A Tool to Reduce Wrong-Site Surgery

    Science.gov (United States)

    Otake, Y.; Schafer, S.; Stayman, J. W.; Zbijewski, W.; Kleinszig, G.; Graumann, R.; Khanna, A. J.; Siewerdsen, J. H.

    2012-01-01

    Surgical targeting of the incorrect vertebral level (“wrong-level” surgery) is among the more common wrong-site surgical errors, attributed primarily to a lack of uniquely identifiable radiographic landmarks in the mid-thoracic spine. The conventional localization method involves manual counting of vertebral bodies under fluoroscopy, is prone to human error, and carries additional time and dose. We propose an image registration and visualization system (referred to as LevelCheck) for decision support in spine surgery that automatically labels vertebral levels in fluoroscopy using a GPU-accelerated, intensity-based 3D-2D (viz., CT-to-fluoroscopy) registration. A gradient information (GI) similarity metric and a CMA-ES optimizer were chosen for their robustness and inherent suitability for parallelization. Simulation studies involved 10 patient CT datasets from which 50,000 simulated fluoroscopic images were generated from C-arm poses selected to approximate C-arm operator and positioning variability. Physical experiments used an anthropomorphic chest phantom imaged under real fluoroscopy. The registration accuracy was evaluated as the mean projection distance (mPD) between the estimated and true center of vertebral levels. Trials were defined as successful if the estimated position was within the projection of the vertebral body (viz., mPD fluoroscopy in near-real-time could be valuable in reducing the occurrence of wrong-site surgery while helping to reduce radiation exposure. The method is applicable beyond the specific case of vertebral labeling, since any structure defined in pre-operative (or intra-operative) CT or cone-beam CT can be automatically registered to the fluoroscopic scene. PMID:22864366

  9. Automatic localization of vertebral levels in x-ray fluoroscopy using 3D-2D registration: a tool to reduce wrong-site surgery

    Science.gov (United States)

    Otake, Y.; Schafer, S.; Stayman, J. W.; Zbijewski, W.; Kleinszig, G.; Graumann, R.; Khanna, A. J.; Siewerdsen, J. H.

    2012-09-01

    Surgical targeting of the incorrect vertebral level (wrong-level surgery) is among the more common wrong-site surgical errors, attributed primarily to the lack of uniquely identifiable radiographic landmarks in the mid-thoracic spine. The conventional localization method involves manual counting of vertebral bodies under fluoroscopy, is prone to human error and carries additional time and dose. We propose an image registration and visualization system (referred to as LevelCheck), for decision support in spine surgery by automatically labeling vertebral levels in fluoroscopy using a GPU-accelerated, intensity-based 3D-2D (namely CT-to-fluoroscopy) registration. A gradient information (GI) similarity metric and a CMA-ES optimizer were chosen due to their robustness and inherent suitability for parallelization. Simulation studies involved ten patient CT datasets from which 50 000 simulated fluoroscopic images were generated from C-arm poses selected to approximate the C-arm operator and positioning variability. Physical experiments used an anthropomorphic chest phantom imaged under real fluoroscopy. The registration accuracy was evaluated as the mean projection distance (mPD) between the estimated and true center of vertebral levels. Trials were defined as successful if the estimated position was within the projection of the vertebral body (namely mPD anatomy in fluoroscopy in near-real-time could be valuable in reducing the occurrence of wrong-site surgery while helping to reduce radiation exposure. The method is applicable beyond the specific case of vertebral labeling, since any structure defined in pre-operative (or intra-operative) CT or cone-beam CT can be automatically registered to the fluoroscopic scene.
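    A possible sketch of a gradient-information (GI) similarity metric in the spirit of the one used by LevelCheck; the exact formulation, weighting and constants here are assumptions (loosely following Pluim-style GI), not the authors' implementation. Each pixel contributes the smaller of the two gradient magnitudes, weighted so that parallel or antiparallel gradient directions count fully and orthogonal ones not at all:

```python
import numpy as np

def gradient_information(fixed, moving, eps=1e-12):
    """Gradient-information similarity between two same-sized 2D images.

    weight(angle) = (cos(2*angle) + 1) / 2 is 1 for (anti)parallel gradients
    and 0 for orthogonal ones; each pixel contributes min(|g1|, |g2|) * weight.
    """
    gy1, gx1 = np.gradient(np.asarray(fixed, float))
    gy2, gx2 = np.gradient(np.asarray(moving, float))
    m1 = np.hypot(gx1, gy1)
    m2 = np.hypot(gx2, gy2)
    cos_a = (gx1 * gx2 + gy1 * gy2) / (m1 * m2 + eps)
    angle = np.arccos(np.clip(cos_a, -1.0, 1.0))
    weight = (np.cos(2 * angle) + 1) / 2
    return float(np.sum(weight * np.minimum(m1, m2)))

ramp = np.outer(np.arange(20.0), np.ones(20))  # purely vertical gradient
gi_same = gradient_information(ramp, ramp)     # aligned gradients score high
gi_orth = gradient_information(ramp, ramp.T)   # orthogonal gradients score ~0
```

    In a registration loop, an optimizer such as CMA-ES would perturb the C-arm pose, reproject the CT, and keep poses that increase this score.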

  10. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    Science.gov (United States)

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using the HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
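    The F-measure used to compare the pipelines is the harmonic mean of precision and recall. A quick sketch of the computation from raw counts (the counts in the example call are hypothetical, not the study's):

```python
def f_measure(tp, fp, fn):
    """F-measure (F1): harmonic mean of precision and recall.

    tp: true positives, fp: false positives, fn: false negatives.
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

    For instance, a pipeline that finds 8 of 10 true terms while also emitting 2 spurious ones has precision 0.8 and recall 0.8, hence F1 = 0.8; the 0.89 vs 0.80 gap reported above corresponds to a meaningfully better precision/recall trade-off.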

  11. Traduction automatique et terminologie automatique (Automatic Translation and Automatic Terminology

    Science.gov (United States)

    Dansereau, Jules

    1978-01-01

    An exposition of reasons why a system of automatic translation could not use a terminology bank except as a source of information. The fundamental difference between the two tools is explained and examples of translation and mistranslation are given as evidence of the limits and possibilities of each process. (Text is in French.) (AMH)

  12. SU-G-TeP4-07: Automatic EPID-Based 2D Measurement of MLC Leaf Offset as a Quality Control Tool

    Energy Technology Data Exchange (ETDEWEB)

    Ritter, T; Moran, J [The University of Michigan, Ann Arbor, MI (United States); Schultz, B [University of Michigan, Ann Arbor, MI (United States); Kim, G [University of California, San Diego, La Jolla, CA (United States); Barnes, M [Calvary Mater Hospital Newcastle, Warratah, NSW (Australia); Perez, M [North Sydney Cancer Center, Sydney (Australia); Farrey, K [University of Chicago, Chicago, IL (United States); Popple, R [University Alabama Birmingham, Birmingham, AL (United States); Greer, P [Calvary Mater Newcastle, Newcastle (Australia)

    2016-06-15

    Purpose: The MLC dosimetric leaf gap (DLG) and transmission are measured parameters which impact the dosimetric accuracy of IMRT and VMAT plans. This investigation aims to develop an efficient and accurate routine constancy check of the physical DLG in two dimensions. Methods: The manufacturer’s recommended DLG measurement method was modified by using 5 fields instead of 11 and by utilizing the Electronic Portal Imaging Device (EPID). Validations were accomplished using an ion chamber (IC) in solid water and a 2D IC array. EPID data were collected for 6 months on multiple TrueBeam linacs using both Millennium and HD MLCs at 5 different clinics in an international consortium. MATLAB code was written to automatically analyze the images and calculate the 2D results. Sensitivity was investigated by introducing deliberate leaf position errors. MLC calibration and initialization history was recorded to allow quantification of their impact. Results were analyzed using statistical process control (SPC). Results: The EPID method took approximately 5 minutes. Due to detector response, the EPID-measured DLG and transmission differed from the IC values but were reproducible and consistent with changes measured using the ICs. For the Millennium MLC, the EPID-measured DLG and transmission were both consistently lower than the IC results. The EPID method was implemented as leaf offset and transmission constancy tests (LOC and TC). Based on 6 months of measurements, the initial leaf-specific action thresholds for changes from baseline were set to 0.1 mm. Upper and lower control limits for variation were developed for each machine. Conclusion: Leaf offset and transmission constancy tests were implemented on Varian HD and Millennium MLCs using an EPID and found to be efficient and accurate. The test is effective for monitoring MLC performance using dynamic delivery and performing process control on the DLG in 2D, thus enhancing dosimetric accuracy. This work was supported by a grant
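    The per-machine control limits mentioned above can be sketched with a standard individuals (X-mR) control chart. The data and the use of the conventional 2.66 constant are illustrative assumptions, not taken from the consortium's actual analysis:

```python
import numpy as np

def xmr_limits(values):
    """Individuals (X-mR) control-chart limits.

    Centre line = mean of the individual measurements; the upper/lower control
    limits are centre +/- 2.66 * mean moving range (the usual SPC recipe for
    charts of individual values).
    """
    x = np.asarray(values, float)
    mr_bar = float(np.mean(np.abs(np.diff(x))))  # mean moving range
    centre = float(x.mean())
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

# Hypothetical daily leaf-offset changes from baseline, in mm.
offsets = [0.00, 0.02, 0.00, 0.02, 0.00, 0.02, 0.00, 0.02, 0.00, 0.02]
lcl, centre, ucl = xmr_limits(offsets)
```

    A new measurement outside [lcl, ucl], or beyond the clinic's fixed 0.1 mm action threshold, would then trigger investigation of the MLC.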

  13. On the development of conjunctival hyperemia computer-assisted diagnosis tools: Influence of feature selection and class imbalance in automatic gradings.

    Science.gov (United States)

    Sánchez Brea, María Luisa; Barreira Rodríguez, Noelia; Sánchez Maroño, Noelia; Mosquera González, Antonio; García-Resúa, Carlos; Giráldez Fernández, María Jesús

    2016-07-01

    The sudden increase of blood flow in the bulbar conjunctiva, known as hyperemia, is associated with a red hue of variable intensity. Experts measure hyperemia using levels on a grading scale, a procedure that is subjective, non-repeatable and time-consuming, thus creating a need for its automation. However, the task is far from straightforward due to data issues such as class imbalance or correlated features. In this paper, we study the specific features of hyperemia and propose various approaches to address these problems in the context of an automatic framework for hyperemia grading. Oversampling, undersampling and SMOTE approaches were applied in order to tackle the problem of class imbalance. 25 features were computed for each image and regression methods were then used to transform them into a value on the grading scale. The values of and relationships among features and experts' values were analysed, and five feature selection techniques were subsequently studied. The lowest mean square error (MSE) for the regression systems trained with individual features is below 0.1 for both scales. The multi-layer perceptron (MLP) obtains the best values, but is less consistent than the random forest (RF) method. When all features are combined, the best results for both scales are achieved with MLP. Correlation-based feature selection (CFS) and M5 provide the best results, MSE = 0.108 and MSE = 0.061 respectively. Finally, the class imbalance problem is minimised with the SMOTE approach for both scales. Machine learning methods are able to perform an objective assessment of hyperemia grading, removing both intra- and inter-expert subjectivity while providing a gain in computation time. SMOTE and oversampling approaches minimise the class imbalance problem, while feature selection reduces the number of features from 25 to 3-5 without worsening the MSE. As the differences between the system and a human expert are similar to the differences between experts, we can therefore conclude that
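    A minimal SMOTE sketch (not the implementation used in the paper) shows the interpolation idea behind this oversampling approach: each synthetic sample is placed at a random point on the segment between a minority sample and one of its k nearest minority neighbours:

```python
import numpy as np

def smote(minority, n_new, k=3, seed=0):
    """Minimal SMOTE: synthesize n_new samples for the minority class.

    For each synthetic sample, pick a random minority point, find its k
    nearest minority neighbours, pick one, and interpolate a random fraction
    of the way towards it.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(minority, float)
    synthetic = []
    for _ in range(n_new):
        i = int(rng.integers(len(X)))
        dists = np.linalg.norm(X - X[i], axis=1)
        neighbours = np.argsort(dists)[1:k + 1]  # skip the sample itself
        j = int(rng.choice(neighbours))
        synthetic.append(X[i] + rng.random() * (X[j] - X[i]))
    return np.array(synthetic)

# Toy 2-feature minority class (e.g. rare high-grade hyperemia images).
minority = np.array([[0.1, 0.2], [0.3, 0.1], [0.2, 0.4], [0.4, 0.3], [0.25, 0.25]])
extra = smote(minority, 20)
```

    Because every synthetic point lies between two real minority points, the new samples stay inside the minority class's region of feature space rather than being mere duplicates, which is what distinguishes SMOTE from plain oversampling.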

  14. Dosimetric evaluation of an automatic segmentation tool of pelvic structures from MRI images for prostate cancer radiotherapy; Evaluation dosimetrique d'un outil de delineation automatique des organes pelviens a partir d'images IRM pour la radiotherapie du cancer prostatique

    Energy Technology Data Exchange (ETDEWEB)

    Pasquier, D.; Lacornerie, T.; Lartigau, E. [Centre Oscar-Lambret, Dept. Universitaire de Radiotherapie, 59 - Lille (France); Pasquier, D. [Centre Galilee, Polyclinique de la Louviere, 59 - Lille (France); Pasquier, D.; Betrouni, N.; Vermandel, M.; Rousseau, J. [Lille-2 Univ., U703 Thiais, Inserm, Lab. de Biophysique EA 1049, Institut de Technologie Medicale, CHU de Lille, 59 (France)

    2008-09-15

    Purpose: An automatic segmentation tool for pelvic structures from MRI images for prostate cancer radiotherapy was developed, and a dosimetric evaluation of the differences in delineation (automatic versus human) is presented here. Materials and methods: The C.T.V. (clinical target volume), rectum and bladder were delineated automatically and by a physician in 20 patients. Treatment plans based on 'automatic' volumes were transferred onto 'manual' volumes and vice versa. Dosimetric characteristics of the P.T.V. (V.95, minimal, maximal and mean doses), rectum (V.50, V.70, maximal and mean doses) and bladder (V.70, maximal and mean doses) were compared. Results: Automatic delineation of the C.T.V. did not significantly influence the dosimetric characteristics of the 'manual' P.T.V. (planning target volume). Rectal V.50 and V.70 were not significantly different; the mean rectal dose was slightly higher (43.2 versus 44.4 Gy, p = 0.02, Student test). Bladder V.70 was also significantly higher (19.3 versus 21.6, p = 0.004). Organ-at-risk (O.A.R.) automatic delineation had little influence on dosimetric characteristics; rectal V.70 was slightly underestimated (20 versus 18.5 Gy, p = 0.001). Conclusion: C.T.V. and O.A.R. automatic delineation had little influence on dosimetric characteristics. Software developments are ongoing to enable routine use, and an interobserver evaluation is needed. (authors)

  15. Knickzone Extraction Tool (KET) - A new ArcGIS toolset for automatic extraction of knickzones from a DEM based on multi-scale stream gradients

    Science.gov (United States)

    Zahra, Tuba; Paudel, Uttam; Hayakawa, Yuichi S.; Oguchi, Takashi

    2017-04-01

    Extraction of knickpoints or knickzones from a Digital Elevation Model (DEM) has gained immense significance owing to the increasing implications of knickzones on landform development. However, existing methods for knickzone extraction tend to be subjective or require time-intensive data processing. This paper describes the proposed Knickzone Extraction Tool (KET), a new raster-based Python script deployed in the form of an ArcGIS toolset that automates the process of knickzone extraction and is both faster and more user-friendly. The KET is based on multi-scale analysis of slope gradients along a river course, where any locally steep segment (knickzone) can be extracted as an anomalously high local gradient. We also conducted a comparative analysis of the KET and other contemporary knickzone identification techniques. The relationship between knickzone distribution and morphometric characteristics is also examined through a case study of a mountainous watershed in Japan.
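    A single-scale sketch of the gradient-anomaly idea behind the KET. The real toolset compares stream gradients across multiple window scales on DEM-derived profiles; this toy version uses one moving window and flags points whose local gradient is anomalously high relative to the reach average (the window size and factor are illustrative):

```python
import numpy as np

def knickzone_candidates(distance, elevation, window=5, factor=2.0):
    """Flag profile points whose local channel gradient exceeds `factor` times
    the gradient averaged over a moving window of `window` points."""
    d = np.asarray(distance, float)
    z = np.asarray(elevation, float)
    local = np.abs(np.gradient(z, d))                 # point-wise gradient
    kernel = np.ones(window) / window
    regional = np.convolve(local, kernel, mode="same")  # reach-averaged gradient
    return np.where(local > factor * regional)[0]

# Gentle synthetic profile with one abrupt step (a knickzone) 50 m downstream.
d = np.arange(100.0)
z = -0.01 * d
z[50:] -= 5.0
candidates = knickzone_candidates(d, z)
```

    On this synthetic profile the only flagged indices sit at the step, while the uniform background slope is never flagged; the multi-scale version repeats this comparison over several window lengths to separate short steep riffles from genuine knickzones.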

  16. A Web-Based Tool for Automatic Data Collection, Curation, and Visualization of Complex Healthcare Survey Studies including Social Network Analysis

    Directory of Open Access Journals (Sweden)

    José Alberto Benítez

    2017-01-01

    Full Text Available There is great concern nowadays regarding alcohol consumption and drug abuse, especially among young people. By analyzing the social environment in which these adolescents are immersed, together with a series of measures determining alcohol abuse risk or personal situation and perception obtained from questionnaires such as AUDIT, FAS, and KIDSCREEN, it is possible to gain insight into the current situation of a given individual regarding his/her consumption behavior. Achieving this analysis, however, requires tools that ease the processes of questionnaire creation, data gathering, curation and representation, and later analysis and visualization for the user. This research presents the design and construction of a web-based platform able to facilitate each of these processes by integrating the different phases into an intuitive system with a graphical user interface that hides the complexity underlying each of the questionnaires and techniques used, presenting the results in a flexible and visual way and avoiding any manual handling of data during the process. The advantages of this approach are shown and compared to the previous situation, in which some of the tasks were accomplished through time-consuming and error-prone manipulations of data.

  17. IsoBED: a tool for automatic calculation of biologically equivalent fractionation schedules in radiotherapy using IMRT with a simultaneous integrated boost (SIB) technique

    Directory of Open Access Journals (Sweden)

    Benassi Marcello

    2011-05-01

    Full Text Available Abstract Background An advantage of the Intensity Modulated Radiotherapy (IMRT) technique is the feasibility of delivering different therapeutic dose levels to PTVs in a single treatment session using the Simultaneous Integrated Boost (SIB) technique. This paper describes an automated tool to calculate the dose to be delivered with the SIB-IMRT technique in different anatomical regions such that it has the same Biological Equivalent Dose (BED), i.e. IsoBED, as the standard fractionation. Methods Based on the Linear Quadratic Model (LQM), we developed software that allows treatment schedules biologically equivalent to standard fractionations to be calculated. The main radiobiological parameters from the literature are included in a database inside the software, which can be updated according to the clinical experience of each institute. In particular, the BED for each target volume is computed based on the alpha/beta ratio, total dose and dose per fraction (generally 2 Gy for a standard fractionation). Then, after selecting the reference target, i.e. the PTV that controls the fractionation, a new total dose and dose per fraction providing the same IsoBED will be calculated for each target volume. Results The IsoBED software developed allows: (1) the calculation of new IsoBED treatment schedules derived from standard prescriptions and based on the LQM; (2) the conversion of the dose-volume histograms (DVHs) for each target and OAR to a nominal standard dose at 2 Gy per fraction, so that they can be shown together with the DV-constraints from the literature, based on the LQM and radiobiological parameters; and (3) the calculation of Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP) curves versus the prescribed dose to the reference target.
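    The core IsoBED computation follows directly from the LQ model: equate the BED of the standard schedule to that of the new schedule and solve the resulting quadratic for the dose per fraction. A sketch under the usual definitions (BED = n·d·(1 + d/(α/β)); the function names are mine, not the software's):

```python
import math

def bed(total_dose, dose_per_fraction, alpha_beta):
    """Biologically effective dose under the LQ model: BED = D * (1 + d/(a/b))."""
    return total_dose * (1 + dose_per_fraction / alpha_beta)

def isobed_dose_per_fraction(standard_total, alpha_beta, n_fractions, standard_dpf=2.0):
    """Dose per fraction that delivers, in n_fractions, the same BED as the
    standard 2 Gy/fraction schedule.

    Solves n*d*(1 + d/ab) = BED_standard, a quadratic in d, taking the
    positive root.
    """
    target_bed = bed(standard_total, standard_dpf, alpha_beta)
    return alpha_beta / 2 * (-1 + math.sqrt(1 + 4 * target_bed / (n_fractions * alpha_beta)))
```

    For example, a target prescribed 60 Gy in 2 Gy fractions (α/β = 10 Gy) requires about 2.33 Gy per fraction if the SIB reference target fixes the course at 25 fractions, and exactly 2 Gy if 30 fractions are kept.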

  18. High-frequency ultrasonography (HFUS) as a useful tool in differentiating between plaque morphea and extragenital lichen sclerosus lesions

    Directory of Open Access Journals (Sweden)

    Rafał Białynicki-Birula

    2017-10-01

    Full Text Available Introduction: Morphea and lichen sclerosus (LS) are chronic inflammatory diseases that may pose a diagnostic challenge for a physician. High-frequency ultrasonography (HFUS) is a versatile diagnostic method utilized in dermatologic practice, allowing monitoring of the course of the disease and treatment response, and differentiation between certain skin disorders. Aim: To demonstrate the usefulness of HFUS in differentiating between plaque morphea and extragenital LS lesions. Material and methods: We examined 16 patients with plaque morphea and 4 patients with extragenital LS using a 20 MHz taberna pro medicum™ (Germany) device. Results: Investigations revealed a hyperechogenic entrance echo in both morphea and LS lesions, whereas a distinct polycyclic surface of the entrance echo was detected exclusively in LS. Conclusions: High-frequency ultrasonography is a current diagnostic modality that may prove useful in differentiating between morphea and LS lesions.

  19. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Full Text Available Procedural modeling has proven to be a very valuable tool in the field of architecture. In recent years, research has increasingly sought to create procedural models automatically from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, determining the building style has remained a manual task. In this paper, we propose an algorithm that automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  20. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  1. New trends in diffusion-weighted magnetic resonance imaging as a tool in differentiation of serous cystadenoma and mucinous cystic tumor: a prospective study.

    Science.gov (United States)

    Schraibman, Vladimir; Goldman, Suzan Menasce; Ardengh, José Celso; Goldenberg, Alberto; Lobo, Edson; Linhares, Marcelo Moura; Gonzales, Adriano Mizziara; Abdala, Nitamar; Abud, Thiago Giansante; Ajzen, Sérgio Aron; Jackowsky, Andrea; Szejnfeld, Jacob

    2011-01-01

    Pancreatic cystic lesions are increasingly being recognized. Magnetic resonance imaging (MRI) is the method that brings the greatest amount of information about the morphologic features of pancreatic cystic lesions. To establish if diffusion-weighted MRI (DW-MRI) can be used as a tool to differentiate mucinous from nonmucinous lesions. Fifty-six patients with pancreatic cystic lesions (benign, n = 46; malignant, n = 10) were prospectively evaluated with DW-MRI in order to differentiate mucinous from nonmucinous lesions. Final diagnosis was obtained by follow-up (n = 31), surgery (n = 16) or endoscopic ultrasound-guided fine needle aspiration (n = 9). Serous cystadenoma was identified in 32 (57%) patients. The threshold value established for the differentiation of mucinous from nonmucinous lesions was 2,230.06 s/mm² for ADC of 700. DW-MRI behavior between mucinous and nonmucinous groups revealed sensitivity, specificity, positive predictive value, negative predictive value and accuracy to be 80, 98, 92, 93 and 93%, respectively (p < 0.01, power of sample = 1.0). In the comparison of the diffusion behavior between mucinous (n = 13) and serous (n = 32) lesions, the sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 100, 97, 92, 100 and 98%, respectively (p < 0.01, power of sample = 1.0). The results of endoscopic ultrasound-guided fine needle aspiration were similar to those of DW-MRI. DW-MRI can be included as part of the array of tools to differentiate mucinous from nonmucinous lesions and can help in the management of pancreatic cystic lesions. Copyright © 2011 S. Karger AG, Basel, and IAP.

  2. Energy dispersive x-ray diffractometry as a tool alternative to differential scanning calorimetry for investigating polymer phase transitions

    Science.gov (United States)

    Rossi-Albertini, V.; Isopo, A.; Caminiti, R.; Tentolini, U.

    2002-02-01

    Recently, a technique based on energy dispersive x-ray diffraction has been proposed to follow polymer phase transitions. However, neither the potential of this method nor the experimental conditions in which it is more convenient than differential scanning calorimetry, generally used for the same purpose, were clear. In the present letter, the answer to this question is provided. It is shown that the two methods are complementary rather than equivalent, the heating rate being the relevant parameter for establishing which is preferable. The demonstration of this statement is given through the observation of the complex thermal properties of a reference sample studied in both ways at progressively lower heating rates. The connection between this unusual application of x-ray diffraction and differential scanning calorimetry is discussed in terms of the two possible definitions of entropy.

  3. Automatic Assessment of 3D Modeling Exams

    Science.gov (United States)

    Sanna, A.; Lamberti, F.; Paravati, G.; Demartini, C.

    2012-01-01

    Computer-based assessment of exams provides teachers and students with two main benefits: fairness and effectiveness in the evaluation process. This paper proposes a fully automatic evaluation tool for the Graphic and Virtual Design (GVD) curriculum at the First School of Architecture of the Politecnico di Torino, Italy. In particular, the tool is…

  4. Automatic Synthesis of Robust and Optimal Controllers

    DEFF Research Database (Denmark)

    Cassez, Franck; Jessen, Jan Jacob; Larsen, Kim Guldstrand

    2009-01-01

    In this paper, we show how to apply recent tools for the automatic synthesis of robust and near-optimal controllers for a real industrial case study. We show how to use three different classes of models and their supporting existing tools, Uppaal-TiGA for synthesis, phaver for verification...

  5. A reminder of peristalsis as a useful tool in the prenatal differential diagnosis of abdominal cystic masses

    OpenAIRE

    O. Gerscovich, Eugenio; Sekhon, Simran; W. Loehfelm, Thomas; L. Wootton-Gorges, Sandra; Greenspan, Adam

    2017-01-01

    With routine antenatal ultrasound and recent advances in ultrasound technology, fetal intraabdominal cystic masses are recognized more often and are better characterized than in the past. They may be classified as solid and cystic, and may originate from multiple structures. When considering the extensive differential diagnosis of cystic masses, the observation of peristalsis narrows the possibilities to the gastrointestinal tract. To find this feature on ultrasound, the examin...

  6. MATLAB automatic differentiation using source transformation

    OpenAIRE

    Kharche, R V

    2012-01-01

    This thesis presents our work on compiler techniques to implement Algorithmic Differentiation (AD) using source transformation in MATLAB. AD is concerned with the accurate and efficient computation of derivatives of complicated mathematical functions represented by computer programs. Source transformation techniques for AD, whilst complicated to implement, are known to yield derivative code with better run-time efficiency than methods using overloading support of the underlyi...
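The thesis contrasts source transformation with operator overloading. The core idea of forward-mode AD that both approaches implement can be sketched with a minimal dual-number class (an illustration only, not the thesis's MATLAB source-transformation tool; all names below are made up):

```python
import math

class Dual:
    """Dual number val + der*eps with eps**2 == 0; 'der' carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(float(other))

    def __add__(self, other):
        other = self._lift(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # Chain rule: (sin u)' = cos(u) * u'
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)
    return math.sin(x)

def derivative(f, x):
    """Seed x with a unit dual part; the dual part of f's output is f'(x)."""
    return f(Dual(x, 1.0)).der

# d/dx [x * sin(x)] at x = 2 equals sin(2) + 2*cos(2), exact to roundoff.
val = derivative(lambda x: x * sin(x), 2.0)
assert abs(val - (math.sin(2.0) + 2.0 * math.cos(2.0))) < 1e-12
```

A source-transformation tool like the one described would instead emit new program text computing the same `der` values, avoiding the per-operation dispatch cost of the overloading shown here.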

  7. Automatic fluid dispenser

    Science.gov (United States)

    Sakellaris, P. C. (Inventor)

    1977-01-01

    Fluid automatically flows to individual dispensing units at predetermined times from a fluid supply and is available only for a predetermined interval of time after which an automatic control causes the fluid to drain from the individual dispensing units. Fluid deprivation continues until the beginning of a new cycle when the fluid is once again automatically made available at the individual dispensing units.

  8. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    , using this method has been developed. (ADIODES is an abbreviation of ``Automatic Differentiation Interval Ordinary Differential Equation Solver''). ADIODES is used to prove existence and uniqueness of periodic solutions to specific ordinary differential equations occurring in dynamical systems theory....... These proofs of existence and uniqueness are difficult or impossible to obtain using other known methods. Also, a method for solving boundary value problems is described. Finally, a method for enclosing solutions to a class of integral equations is described. This method is based on the mean value enclosure...... of an integral operator and uses interval Bernstein polynomials for enclosing the solution. Two numerical examples are given, using two orders of approximation and using different numbers of discretization points....
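The interval-enclosure idea behind such validated solvers can be illustrated with a minimal interval arithmetic sketch (hypothetical code, not ADIODES; real validated computation also needs outward-directed rounding, which is omitted here):

```python
class Interval:
    """Closed interval [lo, hi] with just the arithmetic needed below."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a,b] * [c,d]: min/max over all endpoint products
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

    def contains(self, x):
        return self.lo <= x <= self.hi

def f(x):
    # Works on floats and Intervals alike: f(x) = x*x + x
    return x * x + x

# Fundamental enclosure property: evaluating f on an interval box
# bounds every point evaluation inside that box (possibly loosely).
box = f(Interval(-1.0, 1.0))
assert all(box.contains(f(i / 10.0)) for i in range(-10, 11))
```

Note the overestimation: the true range of f over [-1, 1] is [-0.25, 2], while naive interval evaluation yields [-2, 2]; tighter forms such as the mean value enclosure mentioned in the abstract reduce exactly this pessimism.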

  9. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre

    1975-11-01

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the ''linguistic filter'', which facilitates the construction of statistical algorithms. Certain special filters, of prepositions, conjunctions, negatives, logical implication, compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed [fr

  10. A reminder of peristalsis as a useful tool in the prenatal differential diagnosis of abdominal cystic masses.

    Science.gov (United States)

    Gerscovich, Eugenio O; Sekhon, Simran; Loehfelm, Thomas W; Wootton-Gorges, Sandra L; Greenspan, Adam

    2017-06-01

    With routine antenatal ultrasound and recent advances in ultrasound technology, fetal intraabdominal cystic masses are recognized more often and are better characterized than in the past. They may be classified as solid and cystic, and may originate from multiple structures. When considering the extensive differential diagnosis of cystic masses, the observation of peristalsis narrows the possibilities to the gastrointestinal tract. To find this feature on ultrasound, the examiner must expressly think and look for it, otherwise it may be missed. Our case report illustrates one of those cases.

  11. A reminder of peristalsis as a useful tool in the prenatal differential diagnosis of abdominal cystic masses

    Directory of Open Access Journals (Sweden)

    Eugenio O. Gerscovich

    2017-06-01

    Full Text Available With routine antenatal ultrasound and recent advances in ultrasound technology, fetal intraabdominal cystic masses are recognized more often and are better characterized than in the past. They may be classified as solid and cystic, and may originate from multiple structures. When considering the extensive differential diagnosis of cystic masses, the observation of peristalsis narrows the possibilities to the gastrointestinal tract. To find this feature on ultrasound, the examiner must expressly think and look for it, otherwise it may be missed. Our case report illustrates one of those cases.

  12. Apparent diffusion coefficient measurement by diffusion weighted magnetic resonance imaging is a useful tool in differentiating renal tumors.

    Science.gov (United States)

    Liu, Jing-Hong; Tian, Shi-Feng; Ju, Ye; Li, Ye; Chen, An-Liang; Chen, Li-Hua; Liu, Ai-Lian

    2015-04-16

    To determine the clinical value of apparent diffusion coefficient (ADC) measurement by diffusion weighted magnetic resonance imaging (DW-MRI) in differentiating renal tumors. Electronic databases were searched using combinations of keywords and free words relating to renal tumor, ADC and DW-MRI. Based on carefully selected inclusion and exclusion criteria, relevant case-control studies were identified and the related clinical data were acquired. Statistical analyses were performed using STATA 12.0 (Stata Corporation, College Station, TX). Sixteen case-control studies were ultimately included in the present meta-analysis. These 16 high quality studies contained a combined total of 438 normal renal tissues and 832 renal tumor lesions (597 malignant and 235 benign). The results revealed that ADC values of malignant renal tumor tissues were markedly lower than those of normal renal tissues and benign renal tumor tissues. ADC values of benign renal tumor tissues were also significantly lower than those of normal renal tissue. ADC measurement by DW-MRI provided clinically useful information on the internal structure of renal tumors and could be an important radiographic index for differentiation of malignant renal tumors from benign renal tumors.

  13. Neural differentiation of human embryonic stem cells as an in vitro tool for the study of the expression patterns of the neuronal cytoskeleton during neurogenesis.

    Science.gov (United States)

    Liu, Chao; Zhong, Yongwang; Apostolou, Andria; Fang, Shengyun

    2013-09-13

    The neural differentiation of human embryonic stem cells (ESCs) is a potential tool for elucidating the key mechanisms involved in human neurogenesis. Nestin and β-III-tubulin, which are cytoskeleton proteins, are marker proteins of neural stem cells (NSCs) and neurons, respectively. However, the expression patterns of nestin and β-III-tubulin in neural derivatives from human ESCs remain unclear. In this study, we found that neural progenitor cells (NPCs) derived from H9 cells express high levels of nestin and musashi-1. In contrast, β-III-tubulin was weakly expressed in a few NPCs. Moreover, in these cells, nestin formed filament networks, whereas β-III-tubulin was distributed randomly as small particles. As the differentiation proceeded, the nestin filament networks and the β-III-tubulin particles were found in both the cell soma and the cellular processes. Moreover, the colocalization of nestin and β-III-tubulin was found mainly in the cell processes and neurite-like structures and not in the cell soma. These results may aid our understanding of the expression patterns of nestin and β-III-tubulin during the neural differentiation of H9 cells. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Stochastic differential equations as a tool to regularize the parameter estimation problem for continuous time dynamical systems given discrete time measurements.

    Science.gov (United States)

    Leander, Jacob; Lundh, Torbjörn; Jirstrand, Mats

    2014-05-01

    In this paper we consider the problem of estimating parameters in ordinary differential equations given discrete time experimental data. The impact of going from an ordinary to a stochastic differential equation setting is investigated as a tool to overcome the problem of local minima in the objective function. Using two different models, it is demonstrated that by allowing noise in the underlying model itself, the objective functions to be minimized in the parameter estimation procedures are regularized in the sense that the number of local minima is reduced and better convergence is achieved. The advantage of using stochastic differential equations is that the actual states in the model are predicted from data, which allows the prediction to stay close to data even when the parameters in the model are incorrect. The extended Kalman filter is used as a state estimator and sensitivity equations are provided to give an accurate calculation of the gradient of the objective function. The method is illustrated using in silico data from the FitzHugh-Nagumo model for excitable media and the Lotka-Volterra predator-prey system. The proposed method performs well on the models considered, and is able to regularize the objective function in both models. This leads to parameter estimation problems with fewer local minima which can be solved by efficient gradient-based methods. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
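The deterministic starting point of this approach, a least-squares objective comparing an ODE trajectory to discrete-time data, can be sketched with a toy logistic model (an illustrative stand-in for the FitzHugh-Nagumo and Lotka-Volterra systems used in the paper; the noise injection and extended Kalman filter are not shown):

```python
def simulate(r, x0=0.1, dt=0.1, steps=50):
    """Forward-Euler integration of the logistic ODE dx/dt = r*x*(1 - x)."""
    xs, x = [], x0
    for _ in range(steps):
        xs.append(x)
        x += dt * r * x * (1 - x)
    return xs

def sse(r, data):
    """Sum-of-squares objective between the model trajectory and measurements."""
    return sum((m - d) ** 2 for m, d in zip(simulate(r), data))

# Synthetic "measurements" generated with the true parameter r = 0.8;
# a coarse scan of the objective over r in (0.1, 2.0) recovers it.
data = simulate(0.8)
best_sse, best_r = min((sse(r / 100, data), r / 100) for r in range(10, 200))
assert best_r == 0.8 and best_sse == 0.0
```

With noisy data and stiffer models this objective develops the local minima the paper targets, which is where the stochastic reformulation and gradient-based optimization come in.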

  15. Fluorescence In Situ Hybridization for MDM2 Amplification as a Routine Ancillary Diagnostic Tool for Suspected Well-Differentiated and Dedifferentiated Liposarcomas: Experience at a Tertiary Center

    Directory of Open Access Journals (Sweden)

    Khin Thway

    2015-01-01

    Full Text Available Background. The assessment of MDM2 gene amplification by fluorescence in situ hybridization (FISH) has become a routine ancillary tool for diagnosing atypical lipomatous tumor (ALT)/well-differentiated liposarcoma and dedifferentiated liposarcoma (WDL/DDL) in specialist sarcoma units. We describe our experience of its utility at our tertiary institute. Methods. All routine histology samples in which MDM2 amplification was assessed with FISH over a 2-year period were included, and FISH results were correlated with clinical and histologic findings. Results. 365 samples from 347 patients had FISH for MDM2 gene amplification. 170 were positive (i.e., showed MDM2 gene amplification), 192 were negative, and 3 were technically unsatisfactory. There were 122 histologically benign cases showing a histology:FISH concordance rate of 92.6%, 142 WDL/DDL (concordance 96.5%), and 34 cases histologically equivocal for WDL (concordance 50%). Of 64 spindle cell/pleomorphic neoplasms (in which DDL was a differential diagnosis), 21.9% showed MDM2 amplification. Of the cases with discrepant histology and FISH, all but 3 had diagnoses amended following FISH results. For discrepancies of benign histology but positive FISH, lesions were on average larger, more frequently in “classical” (intra-abdominal or inguinal) sites for WDL/DDL and more frequently core biopsies. Discrepancies of malignant histology but negative FISH were smaller, less frequently in “classical” sites but again more frequently core biopsies. Conclusions. FISH has a high correlation rate with histology for cases with firm histologic diagnoses of lipoma or WDL/DDL. It is a useful ancillary diagnostic tool in histologically equivocal cases, particularly in WDL lacking significant histologic atypia or DDL without corresponding WDL component, especially in larger tumors, those from intra-abdominal or inguinal sites or core biopsies. There is a significant group of well-differentiated adipocytic neoplasms

  16. Isoenzyme patterns: a valuable molecular tool for the differentiation of Zygosaccharomyces species and detection of misidentified isolates.

    Science.gov (United States)

    Duarte, Filomena L; Pais, Célia; Spencer-Martins, Isabel; Leão, Cecília

    2004-08-01

    Electrophoretic analysis of esterase, acid phosphatase, lactate dehydrogenase, glucose-6-phosphate dehydrogenase and alcohol dehydrogenase isoenzymes was performed in 39 strains classified into six species of the yeast genus Zygosaccharomyces. The electrophoretic profiles obtained allowed the clear separation of Z. bailii, Z. bisporus, Z. florentinus, Z. lentus, Z. mellis and Z. rouxii, strains of the latter species clustering into two subgroups. Furthermore, this methodology enabled the detection of misidentified strains, as subsequently confirmed by DNA-DNA reassociation and sequencing of the D1/D2 domain of the 26S rRNA gene. Cluster analysis of the global electrophoretic data and those obtained using only two of the isoenzyme systems, esterase and lactate dehydrogenase, yielded similar grouping of the strains examined, indicating that these enzymes are good markers for the differentiation of Zygosaccharomyces species.

  17. arXiv A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    CERN Document Server

    Kieseler, Jan

    2017-11-22

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters, representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution for this problem. It relies only on the public result and its covariance or Hessian, and is validated against the combined-likelihood approach. A d...

  18. The Monte Carlo method as a tool for statistical characterisation of differential and additive phase shifting algorithms

    International Nuclear Information System (INIS)

    Miranda, M; Dorrio, B V; Blanco, J; Diz-Bugarin, J; Ribas, F

    2011-01-01

    Several metrological applications base their measurement principle on the phase sum or difference between two patterns, one original s(r,φ) and another modified t(r,φ+Δφ). Additive or differential phase shifting algorithms directly recover the sum 2φ+Δφ or the difference Δφ of phases without requiring prior calculation of the individual phases. These algorithms can be constructed, for example, from a suitable combination of known phase shifting algorithms. Little has been written on the design, analysis and error compensation of these new two-stage algorithms. Previously we have used computer simulation to study, in a linear approach or with a filter process in reciprocal space, the response of several families of them to the main error sources. In this work we present an error analysis that uses Monte Carlo simulation to achieve results in good agreement with those obtained with spatial and temporal methods.
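As a generic illustration of this kind of Monte Carlo error characterisation (using the standard four-step phase shifting algorithm rather than the two-stage additive/differential algorithms studied in the paper; the background and modulation values are arbitrary assumptions):

```python
import math
import random

def four_step_phase(I0, I1, I2, I3):
    """Classic four-step phase shifting algorithm (phase shifts of pi/2).
    With I_k = a + b*cos(phi + k*pi/2): I3 - I1 = 2b*sin(phi), I0 - I2 = 2b*cos(phi)."""
    return math.atan2(I3 - I1, I0 - I2)

def rms_phase_error(true_phase, noise_sigma, trials=5000, seed=1):
    """Monte Carlo estimate of the RMS phase error under additive Gaussian intensity noise."""
    rng = random.Random(seed)
    a, b = 1.0, 0.5  # assumed background and modulation amplitudes
    total = 0.0
    for _ in range(trials):
        frames = [a + b * math.cos(true_phase + k * math.pi / 2)
                  + rng.gauss(0.0, noise_sigma)
                  for k in range(4)]
        err = four_step_phase(*frames) - true_phase
        err = (err + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
        total += err * err
    return math.sqrt(total / trials)

# Noise-free recovery is exact up to rounding; noise produces a measurable spread.
assert rms_phase_error(0.7, 0.0, trials=10) < 1e-9
assert 0.0 < rms_phase_error(0.7, 0.02, trials=2000) < 0.1
```

Sweeping `noise_sigma` (or injecting calibration errors into the phase shifts instead) and plotting the resulting RMS error is the basic workflow such a Monte Carlo characterisation automates for more elaborate algorithms.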

  19. Differentiating Delirium From Sedative/Hypnotic-Related Iatrogenic Withdrawal Syndrome: Lack of Specificity in Pediatric Critical Care Assessment Tools.

    Science.gov (United States)

    Madden, Kate; Burns, Michele M; Tasker, Robert C

    2017-06-01

    To identify available assessment tools for sedative/hypnotic iatrogenic withdrawal syndrome and delirium in PICU patients, the evidence supporting their use, and describe areas of overlap between the components of these tools and the symptoms of anticholinergic burden in children. Studies were identified using PubMed and EMBASE from the earliest available date until July 3, 2016, using a combination of MeSH terms "delirium," "substance withdrawal syndrome," and key words "opioids," "benzodiazepines," "critical illness," "ICU," and "intensive care." Review article references were also searched. Human studies reporting assessment of delirium or iatrogenic withdrawal syndrome in children 0-18 years undergoing critical care. Non-English language, exclusively adult, and neonatal intensive care studies were excluded. References cataloged by study type, population, and screening process. Iatrogenic withdrawal syndrome and delirium are both prevalent in the PICU population. Commonly used scales for delirium and iatrogenic withdrawal syndrome assess signs and symptoms in the motor, behavior, and state domains, and exhibit considerable overlap. In addition, signs and symptoms of an anticholinergic toxidrome (a risk associated with some common PICU medications) overlap with components of these scales, specifically in motor, cardiovascular, and psychiatric domains. Although important studies have demonstrated apparent high prevalence of iatrogenic withdrawal syndrome and delirium in the PICU population, the overlap in these scoring systems presents potential difficulty in distinguishing syndromes, both clinically and for research purposes.

  20. Value of sagittal color Doppler ultrasonography as a supplementary tool in the differential diagnosis of fetal cleft lip and palate

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Seok [Dept. of Radiology, Seoul Metropolitan Government-Seoul National University Boramae Medical Center, Seoul (Korea, Republic of); Cho, Jeong Yeon; Kim, Sang Youn; Kim, Seung Hyup [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of); Park, Joong Shin; Jun, Jong Kwan [College of Medicine, Seoul National University, Seoul (Korea, Republic of)

    2017-01-15

    The purpose of this study was to evaluate the feasibility and usefulness of sagittal color Doppler ultrasonography (CDUS) for the diagnosis of fetal cleft lip (CL) and cleft palate (CP). We performed targeted ultrasonography on 25 fetuses with CL and CP, taking coronal and axial images of the upper lip and maxillary alveolar arch in each case. The existence of defects in and malalignment of the alveolus on the axial image, hard palate defects on the midsagittal image, and flow-through defects on CDUS taken during fetal breathing or swallowing were assessed. We compared the ultrasonography findings with postnatal findings in all fetuses. Alveolar defects were detected in 16 out of 17 cases with CP and four out of eight cases with CL. Alveolar malalignment and hard palate defects were detected in 11 out of 17 cases and 14 out of 17 cases with CP, respectively, but were not detected in any cases with CL. Communicating flow through the palate defect was detected in 11 out of 17 cases of CL with CP. The accuracy of detection in axial scans of an alveolar defect and malalignment was 80% and 76%, respectively. The accuracy of detection in mid-sagittal images of a hard palate defect and of flow was 80% and 86%, respectively. The overall diagnostic accuracy of combined axial and sagittal images with sagittal CDUS was 92%. Sagittal CDUS of the fetal hard palate is a feasible method to directly reveal hard palate bony defects and flow-through defects, which may have additional value in the differential diagnosis of fetal CL and CP.

  1. Value of sagittal color Doppler ultrasonography as a supplementary tool in the differential diagnosis of fetal cleft lip and palate

    International Nuclear Information System (INIS)

    Lee, Myoung Seok; Cho, Jeong Yeon; Kim, Sang Youn; Kim, Seung Hyup; Park, Joong Shin; Jun, Jong Kwan

    2017-01-01

    The purpose of this study was to evaluate the feasibility and usefulness of sagittal color Doppler ultrasonography (CDUS) for the diagnosis of fetal cleft lip (CL) and cleft palate (CP). We performed targeted ultrasonography on 25 fetuses with CL and CP, taking coronal and axial images of the upper lip and maxillary alveolar arch in each case. The existence of defects in and malalignment of the alveolus on the axial image, hard palate defects on the midsagittal image, and flow-through defects on CDUS taken during fetal breathing or swallowing were assessed. We compared the ultrasonography findings with postnatal findings in all fetuses. Alveolar defects were detected in 16 out of 17 cases with CP and four out of eight cases with CL. Alveolar malalignment and hard palate defects were detected in 11 out of 17 cases and 14 out of 17 cases with CP, respectively, but were not detected in any cases with CL. Communicating flow through the palate defect was detected in 11 out of 17 cases of CL with CP. The accuracy of detection in axial scans of an alveolar defect and malalignment was 80% and 76%, respectively. The accuracy of detection in mid-sagittal images of a hard palate defect and of flow was 80% and 86%, respectively. The overall diagnostic accuracy of combined axial and sagittal images with sagittal CDUS was 92%. Sagittal CDUS of the fetal hard palate is a feasible method to directly reveal hard palate bony defects and flow-through defects, which may have additional value in the differential diagnosis of fetal CL and CP.

  2. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  3. In situ Raman spectroelectrochemistry as a tool for the differentiation of inner tubes of double-wall carbon nanotubes and thin single-wall carbon nanotubes.

    Science.gov (United States)

    Kalbác, Martin; Kavan, Ladislav; Dunsch, Lothar

    2007-12-01

    In situ Raman spectroelectrochemistry has been used to distinguish between thin single-wall carbon nanotubes (SWCNT) and the inner tubes of double-wall carbon nanotubes (DWCNT). The spectroelectrochemical method is based on the different changes in the electronic structure of the inner tubes of DWCNT and of SWCNT during electrochemical charging, which are reflected in the Raman spectra. During electrochemical charging, the inner tubes of DWCNT exhibit a delayed attenuation of the intensities of their Raman modes compared to the behavior of SWCNT of similar diameter. The changes are pronounced for the radial breathing mode (RBM), and thus these modes are diagnostic for distinguishing the inner tubes of DWCNT from thin SWCNT. The different sensitivities of inner and outer tubes to the applied electrochemical charging provide a simple analytical tool for the differentiation of SWCNT and DWCNT in a mixture. The significance of the proposed method is demonstrated on a commercial DWCNT sample.

  4. The differentiation of fibre- and drug type Cannabis seedlings by gas chromatography/mass spectrometry and chemometric tools.

    Science.gov (United States)

    Broséus, Julian; Anglada, Frédéric; Esseiva, Pierre

    2010-07-15

    Cannabis cultivation for drug production is forbidden in Switzerland. Thus, law enforcement authorities regularly ask forensic laboratories to determine the chemotype of cannabis plants from seized material in order to ascertain whether a plantation is legal. As required by the EU official analysis protocol, the THC content of cannabis is measured from the flowers at maturity. When laboratories are confronted with seedlings, they have to grow the plants to maturity, a time-consuming and costly procedure. This study investigated the discrimination of fibre-type from drug-type Cannabis seedlings by analysing the compounds found in their leaves and using chemometric tools. 11 legal varieties allowed by the Swiss Federal Office for Agriculture and 13 illegal ones were greenhouse grown and analysed using a gas chromatograph interfaced with a mass spectrometer. Compounds that show high discrimination capabilities in the seedlings were identified, and a support vector machines (SVMs) analysis was used to classify the cannabis samples. The overall set of samples shows a classification rate above 99% with false positive rates less than 2%. This model then allows discrimination between fibre- and drug-type Cannabis at an early stage of growth. It is therefore not necessary to wait for the plants to mature and quantify their THC content in order to determine their chemotype. This procedure could be used for the control of legal (fibre type) and illegal (drug type) Cannabis production. (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  5. Stable isotopes as a tool to differentiate eggs laid by caged, barn, free range, and organic hens.

    Science.gov (United States)

    Rogers, Karyne M

    2009-05-27

    Stable carbon and nitrogen isotope values of whole yolk, delipidized yolk, albumen, and egg membrane were analyzed from 18 different brands of chicken eggs laid under caged, barn, free range, and organic farming regimes. In general, free range and organic egg components showed enrichment of ¹⁵N values up to 4‰ relative to caged and barn laid eggs, suggesting a higher animal protein (trophic) contribution to the chicken's diet than pure plant-based foods and/or that the feed was organically manufactured. One sample of free range and two samples of organic eggs had δ¹⁵N values within the range of caged or barn laid eggs, suggesting either that these eggs were mislabeled (the hens were raised under "battery" or "barn" conditions, and not permitted to forage outside) or that there was insufficient animal protein gained by foraging to shift the δ¹⁵N values of their primary food source. δ¹³C values of potential food sources are discussed with respect to dietary intake and contribution to the isotopic signature of the eggs to determine mixing of C₃ and C₄ diets, although they did not elucidate laying regimen. The study finds that stable nitrogen isotope analysis of egg components is potentially a useful technique to unravel dietary differences between caged or barn hens and free range hens (both conventional and organic) and could be further developed as an authentication tool in the egg industry.

  6. Molecular diagnostic tools for detection and differentiation of phytoplasmas based on chaperonin-60 reveal differences in host plant infection patterns.

    Directory of Open Access Journals (Sweden)

    Tim J Dumonceaux

    Full Text Available Phytoplasmas ('Candidatus Phytoplasma' spp.) are insect-vectored bacteria that infect a wide variety of plants, including many agriculturally important species. The infections can cause devastating yield losses by inducing morphological changes that dramatically alter inflorescence development. Detection of phytoplasma infection typically utilizes sequences located within the 16S-23S rRNA-encoding locus, and these sequences are necessary for strain identification by currently accepted standards for phytoplasma classification. However, these methods can generate PCR products >1400 bp that are less divergent in sequence than protein-encoding genes, limiting strain resolution in certain cases. We describe a method for accessing the chaperonin-60 (cpn60) gene sequence from a diverse array of 'Ca. Phytoplasma' spp. Two degenerate primer sets were designed based on the known sequence diversity of cpn60 from 'Ca. Phytoplasma' spp. and used to amplify cpn60 gene fragments from various reference samples and infected plant tissues. Forty-three cpn60 sequences were thereby determined. The cpn60 PCR-gel electrophoresis method was highly sensitive compared to 16S-23S-targeted PCR-gel electrophoresis. The topology of a phylogenetic tree generated using cpn60 sequences was congruent with that reported for 16S rRNA-encoding genes. The cpn60 sequences were used to design a hybridization array using oligonucleotide-coupled fluorescent microspheres, providing rapid diagnosis and typing of phytoplasma infections. The oligonucleotide-coupled fluorescent microsphere assay revealed samples that were infected simultaneously with two subtypes of phytoplasma. These tools were applied to show that two host plants, Brassica napus and Camelina sativa, displayed different phytoplasma infection patterns.

  7. Analogiebildung und Differenzierung als Grundoperationen im Sprachlernprozess und ihre Beruecksichtigung bei der Festigung und Automatisierung grammatischer Kenntnisse (Analogy Formation and Differentiation as Basic Operations in the Language-Learning Process: Their Place in Solidifying and Automatizing Grammatical Knowledge)

    Science.gov (United States)

    Pohl, Lothar

    1975-01-01

    A theoretical discussion of analogy formation and differentiation in language learning, followed by teaching hints on the German nominative case, stressing progressively increasing difficulty in the "analogical" and "differentiating" exercises. (Text is in German.) (IFS/WGA)

  8. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  9. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that, across practice, processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  10. AUTOMATIC INTRAVENOUS DRIP CONTROLLER*

    African Journals Online (AJOL)

    Both the nursing staff shortage and the need for precise control in the administration of dangerous drugs intravenously have led to the development of various devices to achieve an automatic system. The continuous automatic control of the drip rate eliminates errors due to any physical effect such as movement of the ...

  11. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  12. Automatic feed system for ultrasonic machining

    Science.gov (United States)

    Calkins, Noel C.

    1994-01-01

    Method and apparatus for ultrasonic machining in which feeding of a tool assembly holding a machining tool toward a workpiece is accomplished automatically. In ultrasonic machining, a tool located just above a workpiece and vibrating in a vertical direction imparts vertical movement to particles of abrasive material which then remove material from the workpiece. The tool does not contact the workpiece. Apparatus for moving the tool assembly vertically is provided such that it operates with a relatively small amount of friction. Adjustable counterbalance means is provided which allows the tool to be immobilized in its vertical travel. A downward force, termed overbalance force, is applied to the tool assembly. The overbalance force causes the tool to move toward the workpiece as material is removed from the workpiece.

  13. Automatic solar feature detection using image processing and pattern recognition techniques

    Science.gov (United States)

    Qu, Ming

    The objective of the research in this dissertation is to develop a software system to automatically detect and characterize solar flares, filaments and Coronal Mass Ejections (CMEs), the core of so-called solar activity. These tools will assist us in predicting space weather caused by violent solar activity. Image processing and pattern recognition techniques are applied in this system. For automatic flare detection, advanced pattern recognition techniques such as the Multi-Layer Perceptron (MLP), Radial Basis Function (RBF) networks, and the Support Vector Machine (SVM) are used. By tracking the entire process of flares, the motion properties of two-ribbon flares are derived automatically. For solar filament detection, the Stabilized Inverse Diffusion Equation (SIDE) is used to enhance and sharpen filaments; a new method for automatic threshold selection is proposed to extract filaments from the background; and an SVM classifier with nine input features is used to differentiate between sunspots and filaments. Once a filament is identified, morphological thinning, pruning, and adaptive edge linking methods are applied to determine filament properties. Furthermore, a filament matching method is proposed to detect filament disappearance. The automatic detection and characterization of flares and filaments have been successfully applied to Halpha full-disk images that are continuously obtained at Big Bear Solar Observatory (BBSO). For automatically detecting and classifying CMEs, image enhancement, segmentation, and pattern recognition techniques are applied to Large Angle and Spectrometric Coronagraph (LASCO) C2 and C3 images. The processed LASCO and BBSO images are saved to a file archive, and the physical properties of detected solar features such as intensity and speed are recorded in our database. Researchers are able to access the solar feature database and analyze the solar data efficiently and effectively. The detection and characterization system greatly improves...
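
    A minimal sketch of the sunspot-versus-filament discrimination step described above: an SVM trained on feature vectors of nine inputs. The feature values and class separation here are synthetic assumptions for illustration, not the dissertation's actual features or data.

```python
# Hedged sketch: an SVM separating two classes of 9-dimensional feature
# vectors, in the spirit of the filament-vs-sunspot classifier above.
# The synthetic features below are invented for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
filaments = rng.normal(loc=1.0, scale=0.5, size=(n, 9))   # hypothetical class 1
sunspots = rng.normal(loc=-1.0, scale=0.5, size=(n, 9))   # hypothetical class 0
X = np.vstack([filaments, sunspots])
y = np.array([1] * n + [0] * n)

clf = SVC(kernel="rbf").fit(X, y)    # train the RBF-kernel SVM
print(clf.score(X, y))               # training accuracy on the synthetic set
```

    With such well-separated synthetic classes the classifier is near-perfect; real solar features would of course overlap far more.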

  14. Automatic Test Systems Acquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  15. Automatic requirements traceability

    OpenAIRE

    Andžiulytė, Justė

    2017-01-01

    This paper focuses on automatic requirements traceability and algorithms that automatically find recommendation links for requirements. The main objective of this paper is the evaluation of these algorithms and preparation of the method defining algorithms to be used in different cases. This paper presents and examines probabilistic, vector space and latent semantic indexing models of information retrieval and association rule mining using authors own implementations of these algorithms and o...

  16. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers methods of position determination and their characteristics, control methods for position determination and key design points, sensor selection for position detectors, position determination in digital control systems, the application of clutches and brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid, stop-position control of automatic guided vehicles, stacker cranes, and automatic transfer control.

  17. Automatic identification of species with neural networks

    Directory of Open Access Journals (Sweden)

    Andrés Hernández-Serna

    2014-11-01

    Full Text Available A new automatic identification system using photographic images has been designed to recognize fish, plant, and butterfly species from Europe and South America. The automatic classification system integrates multiple image processing tools to extract the geometry, morphology, and texture of the images. Artificial neural networks (ANNs) were used as the pattern recognition method. We tested a data set that included 740 species and 11,198 individuals. Our results show that the system performed with high accuracy, reaching 91.65% true positive identifications for fish, 92.87% for plants and 93.25% for butterflies. Our results highlight how neural networks are complementary to species identification.

  18. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.
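
    The end product of the approach above is a set of indexing rules that map textual evidence to recommended headings. A hedged, minimal sketch of what applying such rules might look like; the rules and headings below are invented examples, not those learned by the ILP system.

```python
# Hedged sketch of rule-based indexing recommendations, loosely in the
# spirit of the ILP-inferred rules described above. The rules and MeSH
# headings here are invented for illustration.
RULES = [
    # (condition over the abstract text, recommended heading)
    (lambda text: "phytoplasma" in text, "Phytoplasma"),
    (lambda text: "magnetic resonance" in text, "Magnetic Resonance Imaging"),
    (lambda text: "sql injection" in text, "Computer Security"),
]

def recommend_headings(abstract: str) -> list[str]:
    """Return the headings whose rule fires on the (lowercased) abstract."""
    text = abstract.lower()
    return [heading for cond, heading in RULES if cond(text)]

print(recommend_headings("Detection of phytoplasma infection by PCR"))
# → ['Phytoplasma']
```

    Real learned rules combine several conditions (terms, journal, existing annotations); the point here is only the rule-application pattern.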

  19. Automatic inference of indexing rules for MEDLINE.

    Science.gov (United States)

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent

    2008-11-19

    Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  20. Development of a versatile tool for the simultaneous differential detection of Pseudomonas savastanoi pathovars by End Point and Real-Time PCR.

    Science.gov (United States)

    Tegli, Stefania; Cerboneschi, Matteo; Libelli, Ilaria Marsili; Santilli, Elena

    2010-05-28

    Pseudomonas savastanoi pv. savastanoi is the causal agent of olive knot disease. The strains isolated from oleander and ash belong to the pathovars nerii and fraxini, respectively. When artificially inoculated, pv. savastanoi also causes disease on ash, and pv. nerii also attacks olive and ash. Surprisingly, nothing is yet known about their distribution in nature on these hosts or whether spontaneous cross-infections occur. On the other hand, sanitary certification programs for olive plants, also covering P. savastanoi, have been launched in many countries. The aim of this work was to develop several PCR-based tools for the rapid, simultaneous, differential and quantitative detection of these P. savastanoi pathovars, in multiplex and in planta. Specific PCR primers and probes for the pathovars savastanoi, nerii and fraxini of P. savastanoi were designed to be used in End Point and Real-Time PCR, with either SYBR Green or TaqMan chemistries. The specificity of all these assays was 100%, as assessed by testing forty-four P. savastanoi strains belonging to the three pathovars and having different geographical origins. For comparison, strains from the pathovars phaseolicola and glycinea of P. savastanoi and bacterial epiphytes from P. savastanoi host plants were also assayed, and all of them always tested negative. The analytical detection limits were about 5 - 0.5 pg of pure genomic DNA and about 10^2 genome equivalents per reaction. Similar analytical thresholds were achieved in Multiplex Real-Time PCR experiments, even on artificially inoculated olive plants. Here for the first time a complex of PCR-based assays was developed for the simultaneous discrimination and detection of P. savastanoi pv. savastanoi, pv. nerii and pv. fraxini. These tests were shown to be highly reliable, pathovar-specific, sensitive, rapid and able to quantify these pathogens, both in multiplex reactions and in vivo. Compared with the other methods already available for P. savastanoi, the identification...

  1. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations behind, and the different algorithms for, automatic document summarization (ADS), and presents a recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed...

  2. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    … on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings … on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image…

  3. Magnetic Resonance Imaging and conformal radiotherapy: Characterization of MRI alone simulation for conformal radiotherapy. Development and evaluation of an automatic volumes of interest segmentation tool for prostate cancer radiotherapy

    International Nuclear Information System (INIS)

    Pasquier, David

    2006-01-01

    Radiotherapy is a curative treatment for malignant tumours. Radiotherapy techniques have evolved considerably in recent years with the increasing integration of medical images in conformal radiotherapy. This technique makes it possible to elaborate a complex ballistics conforming to the target volume while sparing healthy tissues. The examination currently used to delineate volumes of interest is Computed Tomography (CT), on account of its geometrical precision and the information it provides on the electronic densities needed for dose calculation. Magnetic Resonance Imaging (MRI) ensures a more precise delineation of target volumes in many locations, such as the pelvis and brain. For pelvic tumours, the use of MRI requires image registration, which complicates treatment planning and poses the problem of the lack of an in vivo standard method of validation. The obstacles to the use of MRI alone in treatment planning were evaluated. Neither geometrical distortion linked to the system and the patient nor the lack of information on electronic densities represents a stumbling block. Distortion remained low even at the edge of a large field of view on modern machines. The assignment of electronic densities to bone structures and soft tissues in MR images made it possible to obtain dosimetry equivalent to that carried out on the original CT, with good reproducibility and a homogeneous distribution within the target volume. The assignment of electronic densities could not be carried out using 20 MV photons and suitable ballistics. The development of Image Guided Radiotherapy could facilitate the use of MRI alone in treatment planning. Target volume and organ-at-risk delineation is a time-consuming task in radiotherapy planning. We took part in the development of, and evaluated, a method for automatic and semi-automatic delineation of volumes of interest from MRI images for prostate cancer radiotherapy. For automatic delineation of the prostate and organs at risk, an organ model-based method and a seeded region growing method...

  4. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.

    1993-01-01

    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software oriented workshop SWORD (which stands for 'Software Workshop Oriented towards Research and Development') designed in the ADA language including integrated CAD system and software tools for automatic generation of simulation software and man-machine interface in order to operate run-time simulation; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configuration generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  5. Research and implementation of software automatic test

    Science.gov (United States)

    Li-hong, LIAN

    2017-06-01

    With the fast development of IT technology, software is increasingly complex and large. Development teams of hundreds of people, thousands of modules and interfaces, and users spread across geographies and systems are no longer unusual. All of these put forward higher requirements for software testing. Due to its low implementation cost and the advantage of effective inheritance and accumulation of test assets, automated software testing has gradually become one of the important means by which IT enterprises ensure software quality. This paper analyzes the advantages of automatic testing and common misconceptions; identifies unsuitable application scenarios and the best time to introduce automation; focuses on analyzing the feasibility of interface automation testing; sets out the functions and elements that interface automation test tools should have; and provides a reference for selecting or custom-developing automated testing tools for large-scale project interfaces.
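
    Interface automation testing, as discussed above, boils down to calling an interface with fixed inputs and asserting on its outputs so the check can be replayed on every build. A hedged sketch of the pattern; the interface and its behaviour below are invented for illustration.

```python
# Hedged sketch of an automated interface test: exercise a
# function-under-test with known inputs and assert on the outputs.
# The create_order interface here is a hypothetical stand-in.
def create_order(item: str, qty: int) -> dict:
    """Hypothetical interface under test."""
    if qty <= 0:
        raise ValueError("qty must be positive")
    return {"item": item, "qty": qty, "status": "created"}

def test_create_order():
    # happy path: a valid order is created with the requested quantity
    result = create_order("widget", 3)
    assert result["status"] == "created"
    assert result["qty"] == 3
    # error path: invalid input must be rejected, not silently accepted
    try:
        create_order("widget", 0)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for qty=0")

test_create_order()
print("all interface checks passed")
```

    Tools in this space mostly generate, parameterize, and schedule such checks; the assertion pattern itself stays this simple.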

  6. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  7. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which was successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  8. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstrac…
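
    A toy illustration of the "time bound function" idea above: count the steps a program actually takes and compare against a derived bound T(n). The step-counting convention and the program are invented for illustration, not the abstract's system.

```python
# Hedged sketch: a time bound function maps input size n to an upper
# bound on computation steps. Here we instrument a toy list traversal
# and check it against the derived bound T(n) = n + 1.
def steps_sum(xs):
    count = 1            # one step for setup/return
    total = 0
    for x in xs:
        total += x
        count += 1       # one step per element
    return count

def time_bound(n):
    """Derived worst-case bound T(n) = n + 1 for the loop above."""
    return n + 1

# The measured step count never exceeds the derived bound.
assert all(steps_sum(list(range(n))) <= time_bound(n) for n in range(50))
print("bound holds")
```

    An automatic analyser derives `time_bound` from the program text instead of measuring it, but the correctness claim it must establish is exactly this inequality.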

  9. A framework for evaluating automatic indexing or classification in the context of retrieval

    DEFF Research Database (Denmark)

    Golub, Korajlka; Soergel, Dagobert; Buchanan, George

    2016-01-01

    Tools for automatic subject assignment help deal with scale and sustainability in creating and enriching metadata, establishing more connections across and between resources and enhancing consistency. While some software vendors and experimental researchers claim the tools can replace manual...

  10. Automatic Planning of External Search Engine Optimization

    Directory of Open Access Journals (Sweden)

    Vita Jasevičiūtė

    2015-07-01

    Full Text Available This paper describes an investigation of an external search engine optimization (SEO) action planning tool, designed to automatically extract a small set of the most important keywords for each month over a whole-year period. The keywords in the set are extracted according to externally measured parameters, such as the average number of searches during the year and for every month individually. Additionally, the position of the optimized web site for each keyword is taken into account. The generated optimization plan is similar to the optimization plans prepared manually by SEO professionals and can be successfully used as a support tool for web site search engine optimization.
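
    A minimal sketch of the per-month keyword selection described above: for each month, pick the keyword with the highest measured search volume. The keywords, volumes, and single-criterion scoring are invented simplifications; the actual tool also weighs yearly averages and current site position.

```python
# Hedged sketch: choose, for each month, the keyword with the most
# searches that month. Data and scoring are invented for illustration.
monthly_searches = {
    "solar panel": {"Jan": 900, "Feb": 1100, "Mar": 1500},
    "inverter":    {"Jan": 1200, "Feb": 800, "Mar": 700},
}

def plan(months):
    # For each month, take the keyword maximizing that month's volume.
    return {m: max(monthly_searches, key=lambda kw: monthly_searches[kw][m])
            for m in months}

print(plan(["Jan", "Feb", "Mar"]))
# → {'Jan': 'inverter', 'Feb': 'solar panel', 'Mar': 'solar panel'}
```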

  11. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    Directory of Open Access Journals (Sweden)

    Raymond Mui

    2010-09-01

    Full Text Available Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation is used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.
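
    The core transformation such a sanitization tool performs is replacing string-concatenated queries with parameterized ones. This hand-written Python/sqlite3 analogue shows the before/after pattern; it is not ASSIST itself, which instruments Java code automatically.

```python
# Hedged sketch: parameterized queries neutralize SQL injection, which
# is the effect ASSIST's automatic instrumentation achieves for Java.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Vulnerable pattern (what the static analysis would flag):
#   query = "SELECT role FROM users WHERE name = '" + user_input + "'"
# The attacker's quote breaks out of the string literal and the OR
# clause matches every row.

# Sanitized pattern: the input is bound as a parameter and can never
# be parsed as SQL, so the injection string matches no user.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)
# → []
```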

  12. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  13. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
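
    The numerical-experiment method described above can be sketched in a few lines: generate artificial time series with a known trend, run an estimator on each realization, and measure its accuracy. The linear trend and least-squares estimator below are illustrative choices, not the book's specific algorithms.

```python
# Hedged sketch of a Monte Carlo accuracy check for a trend estimator:
# artificial series with a known slope, least-squares estimate, error.
import numpy as np

rng = np.random.default_rng(1)
true_slope = 0.5
t = np.arange(200)

errors = []
for _ in range(100):  # Monte Carlo repetitions
    # artificial series = known linear trend + Gaussian noise
    series = true_slope * t + rng.normal(0.0, 5.0, size=t.size)
    est_slope = np.polyfit(t, series, 1)[0]  # least-squares trend estimate
    errors.append(abs(est_slope - true_slope))

# With 200 samples the estimator recovers the trend well within tolerance.
print(max(errors) < 0.1)
# → True
```

    The book's experiments follow the same logic with more realistic noise models and with its own trend-estimation and partitioning algorithms in place of `np.polyfit`.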

  14. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers … a renewed stimulus for continuing and deepening Bob's research visions. A familiar touch is given to the book by some pictures kindly provided to us by his wife Nieba, the personal recollections of his brother Gary and some of his colleagues and friends.

  15. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychological aspects of the construct, such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty and, respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses.

  16. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  17. Automatic Language Identification

    Science.gov (United States)

    2000-08-01

    …guish one language from another. The reader is referred to the linguistics literature… hundreds of input languages would need to be supported, the cost of… eventually obtained bet… [Figure: training diagram — training utterances for French, German, and Spanish are fed to the training algorithm, producing a set of models, one model per language.] …(i.e. vowels) for each speech utterance are located automatically. Next, feature vectors… …malized to be insensitive to overall amplitude, pitch and…

  18. Inter Genre Similarity Modelling For Automatic Music Genre Classification

    OpenAIRE

    Bagci, Ulas; Erzin, Engin

    2009-01-01

    Music genre classification is an essential tool for music information retrieval systems and it has been finding critical applications in various media platforms. Two important problems of the automatic music genre classification are feature extraction and classifier design. This paper investigates inter-genre similarity modelling (IGS) to improve the performance of automatic music genre classification. Inter-genre similarity information is extracted over the mis-classified feature population....

  19. Automatic Classification of Marine Mammals with Speaker Classification Methods.

    Science.gov (United States)

    Kreimeyer, Roman; Ludwig, Stefan

    2016-01-01

    We present an automatic acoustic classifier for marine mammals based on human speaker classification methods as an element of a passive acoustic monitoring (PAM) tool. This work is part of the Protection of Marine Mammals (PoMM) project under the framework of the European Defense Agency (EDA) and joined by the Research Department for Underwater Acoustics and Geophysics (FWG), Bundeswehr Technical Centre (WTD 71) and Kiel University. The automatic classification should support sonar operators in the risk mitigation process before and during sonar exercises with a reliable automatic classification result.

  20. Toward a non-invasive screening tool for differentiation of pancreatic lesions based on intra-voxel incoherent motion derived parameters

    International Nuclear Information System (INIS)

    Graf, Markus; Simon, Dirk; Mang, Sarah; Lemke, Andreas; Gruenberg, Katharina

    2013-01-01

    Early recognition of, and differential diagnosis between, pancreatic cancer and chronic pancreatitis are important steps in successful therapy. Parameters of the IVIM (intra-voxel incoherent motion) theory can be used to differentiate between those lesions. The objective of this work is to evaluate the effects of rigid image registration on IVIM-derived parameters for the differentiation of pancreatic lesions such as pancreatic cancer and solid mass-forming pancreatitis. The effects of linear image registration methods on the reproducibility and accuracy of IVIM-derived parameters were quantified on MR images of ten volunteers. For this purpose, they were evaluated statistically by comparison of registered and unregistered parameter data. Further, the perfusion fraction f was used to differentiate pancreatic lesions on eleven previously diagnosed patient data sets. Its diagnostic power with and without rigid registration was evaluated using receiver operating characteristic (ROC) analysis. The pancreas was segmented manually on MR data sets of the healthy volunteers as well as the patients showing solid pancreatic lesions. Diffusion weighted imaging was performed in 10 blocks of breath-hold phases. Linear registration of the weighted image stack leads to a 3.7% decrease in the variability of the IVIM-derived parameter f due to an improved anatomical overlap of 5%. Consequently, after registration the area under the curve in the ROC analysis for the differentiation approach increased by 2.7%. In conclusion, rigid registration improves the differentiation process based on f-values. (orig.)
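
    The diagnostic-power evaluation above reduces to ROC analysis of a scalar marker (the perfusion fraction f) across two lesion groups. A hedged sketch of that computation; the f values below are invented for illustration, and only the rank-based AUC formula is generic.

```python
# Hedged sketch: ROC AUC of a scalar marker separating two groups,
# as used above to grade the perfusion fraction f. The values are
# invented, not the study's patient data.
import numpy as np

f_cancer = np.array([0.08, 0.10, 0.12, 0.09, 0.11])        # hypothetical
f_pancreatitis = np.array([0.16, 0.20, 0.18, 0.22, 0.15])  # hypothetical

def auc(neg, pos):
    """Probability that a random positive scores above a random
    negative (ties count half), i.e. the area under the ROC curve."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc(f_cancer, f_pancreatitis))
# → 1.0 (the two hypothetical groups separate perfectly)
```

    Registration improves the AUC in the study by tightening the per-patient f estimates; the AUC computation itself is unchanged.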

  1. Toward a non-invasive screening tool for differentiation of pancreatic lesions based on intra-voxel incoherent motion derived parameters

    Energy Technology Data Exchange (ETDEWEB)

    Graf, Markus; Simon, Dirk; Mang, Sarah [Deutsches Krebsforschungszentrum (DKFZ), Heidelberg (Germany). Software Development for Integrated Therapy and Diagnostics; Lemke, Andreas [Heidelberg Univ., Mannheim (Germany). Dept. of Computer Assisted Clinical Medicine; Gruenberg, Katharina [Deutsches Krebsforschungszentrum (DKFZ), Heidelberg (Germany). Dept. of Radiology

    2013-03-01

    Early recognition of and differential diagnosis between pancreatic cancer and chronic pancreatitis is an important step in successful therapy. Parameters of the IVIM (intra-voxel incoherent motion) theory can be used to differentiate between those lesions. The objective of this work is to evaluate the effects of rigid image registration on IVIM derived parameters for differentiation of pancreatic lesions such as pancreatic cancer and solid mass forming pancreatitis. The effects of linear image registration methods on reproducibility and accuracy of IVIM derived parameters were quantified on MR images of ten volunteers. For this purpose, they were evaluated statistically by comparison of registered and unregistered parameter data. Further, the perfusion fraction f was used to differentiate pancreatic lesions on eleven previously diagnosed patient data sets. Its diagnostic power with and without rigid registration was evaluated using receiver operating curves (ROC) analysis. The pancreas was segmented manually on MR data sets of healthy volunteers as well as the patients showing solid pancreatic lesions. Diffusion weighted imaging was performed in 10 blocks of breath-hold phases. Linear registration of the weighted image stack leads to a 3.7% decrease in variability of the IVIM derived parameter f due to an improved anatomical overlap of 5%. Consequently, after registration the area under the curve in the ROC-analysis for the differentiation approach increased by 2.7%. In conclusion, rigid registration improves the differentiation process based on f-values. (orig.)
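
The perfusion fraction f used above comes from the bi-exponential IVIM signal model S(b) = S0·(f·exp(-b·D*) + (1-f)·exp(-b·D)). As a rough illustration, a simplified segmented fit on synthetic data can recover f; the b-value threshold, diffusion coefficients, and signal values below are illustrative assumptions, not taken from the study.

```python
import math

def segmented_ivim_f(bvals, signals, b_thresh=200.0):
    """Estimate the IVIM perfusion fraction f by a simplified segmented fit.

    For b-values above b_thresh the perfusion compartment is assumed fully
    attenuated, so ln(S) is linear in b with slope -D.  Extrapolating that
    line back to b=0 gives the tissue-only intercept; f is the fraction of
    the measured S(0) not explained by it.
    """
    # Linear least-squares fit of ln(S) vs b on the high-b points.
    pts = [(b, math.log(s)) for b, s in zip(bvals, signals) if b >= b_thresh]
    n = len(pts)
    sx = sum(b for b, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(b * b for b, _ in pts)
    sxy = sum(b * y for b, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    s0_tissue = math.exp(intercept)      # extrapolated tissue signal at b=0
    s0 = signals[bvals.index(0.0)]       # measured signal at b=0
    return 1.0 - s0_tissue / s0          # perfusion fraction f

# Synthetic bi-exponential decay: S(b) = S0*(f*exp(-b*D*) + (1-f)*exp(-b*D)),
# with hypothetical parameter values.
f_true, D, Dstar, S0 = 0.15, 1.2e-3, 20e-3, 1000.0
bvals = [0.0, 10.0, 50.0, 200.0, 400.0, 600.0, 800.0]
signals = [S0 * (f_true * math.exp(-b * Dstar) + (1 - f_true) * math.exp(-b * D))
           for b in bvals]
print(round(segmented_ivim_f(bvals, signals), 3))
```

The small residual perfusion signal at b = 200 biases the estimate slightly low, which is the usual trade-off of the segmented approach versus a full bi-exponential fit.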

  2. Automatic TLI recognition system, general description

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system capabilities.

  3. In vitro differentiated adult human liver progenitor cells display mature hepatic metabolic functions: a potential tool for in vitro pharmacotoxicological testing.

    Science.gov (United States)

    Khuu, Dung Ngoc; Scheers, Isabelle; Ehnert, Sabrina; Jazouli, Nawal; Nyabi, Omar; Buc-Calderon, Pedro; Meulemans, Ann; Nussler, Andreas; Sokal, Etienne; Najimi, Mustapha

    2011-01-01

    The potential use of stem/progenitor cells as alternative cell sources to mature hepatocytes remains basically dependent on their ability to exhibit some, if not all, of the metabolic liver functions. In the current study, four major liver functions were investigated in adult-derived human liver stem/progenitor cell (ADHLSC) populations submitted to in vitro hepatogenic differentiation: gluconeogenesis, ammonia detoxification, and activity of phase I and phase II drug-metabolizing enzymes. These acquired hepatic activities were compared to those of primary adult human hepatocytes, the standard reference. Amino acid content was also investigated after hepatogenic differentiation. Differentiated ADHLSCs display higher de novo synthesis of glucose, correlated with an increased activity of glucose-6 phosphatase and mRNA expression of key related enzymes. Differentiated ADHLSCs are also able to metabolize ammonium chloride and to produce urea. This was correlated with an increase in the mRNA expression of relevant key enzymes such as arginase. With respect to drug metabolism, differentiated ADHLSCs express mRNAs of all the major cytochromes investigated, among which is the CYP3A4 isoform (the most important drug-metabolizing enzyme). Such increased expression is correlated with an enhanced phase I activity, as independently demonstrated using fluorescence-based assays. Phase II enzyme activity and amino acid levels also show a significant enhancement in differentiated ADHLSCs. The current study, in accordance with data independently obtained in different labs, demonstrates that in vitro differentiated ADHLSCs are able to display advanced liver metabolic functions, supporting the possibility of developing them as alternatives to primary hepatocytes for in vitro settings. © 2011 Cognizant Comm. Corp.

  4. Revisiting the dose-effect correlations in irradiated head and neck cancer using automatic segmentation tools of the dental structures, mandible and maxilla; Dentalmaps: a practical tool for dental surgeons and radiation oncologists to estimate the dose received by the teeth, mandible and maxilla, and the risk of post-radiation complications in case of dental care

    Energy Technology Data Exchange (ETDEWEB)

    Thariat, J. [Departement de radiotherapie, centre Antoine-Lacassagne, 33, avenue de Valombrose, 06189 Nice cedex 2 (France); IBDC CNRS UMR 6543, Parc Valrose, 06108 Nice cedex 2 (France); Universite de Nice Sophia-Antipolis, 33, avenue de Valombrose, 06189 Nice cedex 2 (France); Ramus, L. [Dosisoft, 45/47, avenue Carnot, 94230 Cachan (France); equipe de recherche Asclepios, Inria, 2004, route des Lucioles, BP 93, 06902 Sophia-Antipolis (France); Odin, G. [Departement d' odontologie, hopital Saint-Roch, CHU de Nice, 5, rue Pierre-Devoluy, 06006 Nice (France); Vincent, S.; Orlanducci, M.H.; Dassonville, O. [Institut universitaire de la face et du cou, 33, avenue de Valombrose, 06189 Nice cedex 2 (France); Departement de chirurgie, centre Antoine-Lacassagne, 33, avenue de Valombrose, 06189 Nice cedex 2 (France); Darcourt, V. [Departement de radiotherapie, centre Antoine-Lacassagne, 33, avenue de Valombrose, 06189 Nice cedex 2 (France); Lacout, A.; Marcy, P.Y. [Departement of radiologie, centre d' imagerie medicale, 83, avenue Charles-de-Gaulle, 15000 Aurillac (France); Cagnol, G. [Departement de chirurgie cervicofaciale, clinique de l' Esperance, 122, avenue du Docteur-Maurice-Donat, BP 1250, 06254 Mougins (France); Malandain, G. [equipe de recherche Asclepios, Inria, 2004, route des Lucioles, BP 93, 06902 Sophia-Antipolis (France)

    2011-12-15

    Purpose. - Manual delineation of dental structures is too time-consuming to be feasible in routine practice, yet information on dose risk levels is crucial for dentists treating patients after head and neck irradiation, to avoid post-extraction osteoradionecrosis; empirical dose-effect data were previously established on two-dimensional radiation therapy plans. Material and methods. - We present an automatic atlas-based segmentation framework for the dental structures, called Dentalmaps, constructed from a patient image-segmentation database. Results. - This framework is accurate (within 2 Gy) and relevant for routine use. It has the potential to guide dental care in the context of new irradiation techniques. Conclusion. - This tool provides a user-friendly interface for dentists and radiation oncologists managing irradiated head and neck cancer patients. It will likely improve the knowledge of dose-effect correlations for dental complications and osteoradionecrosis. (authors)

  5. Automatic Evaluation Of Interferograms

    Science.gov (United States)

    Becker, Friedhelm; Meier, Gerd E. A.; Wegner, Horst

    1983-03-01

    A system for the automatic evaluation of interference patterns has been developed. After digitizing the interferograms from classical and holographic interferometers with a television digitizer and performing different picture enhancement operations, the fringe loci are extracted by use of a floating-threshold method. The fringes are numbered using a special scheme after the removal of any fringe disconnections, which might appear if there was insufficient contrast in the interferograms. The reconstruction of the object function from the numbered fringe field is achieved by a local polynomial least-squares approximation. Applications are given, demonstrating the evaluation of interferograms of supersonic flow fields and the analysis of holographic interferograms of car tyres.
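
The final reconstruction step above is a local polynomial least-squares approximation. The sketch below shows the idea on a single scan line, fitting a quadratic to hypothetical fringe-order samples via the normal equations; the data and polynomial degree are illustrative, not taken from the paper.

```python
def polyfit_ls(xs, ys, degree=2):
    """Least-squares polynomial fit via the normal equations.

    Returns coefficients c[0..degree] of c0 + c1*x + ..., solved with
    Gaussian elimination; a toy stand-in for the local polynomial
    approximation used to reconstruct the object function from
    numbered fringes.
    """
    n = degree + 1
    # Build the normal-equation matrix A and right-hand side b.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    # Back substitution.
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, n))) / A[r][r]
    return coeffs

# Hypothetical fringe orders sampled along one scan line,
# exact samples of 1 + 2x + 0.5x^2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.0, 3.5, 7.0, 11.5, 17.0, 23.5]
coeffs = polyfit_ls(xs, ys)
print([round(c, 3) for c in coeffs])   # → [1.0, 2.0, 0.5]
```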

  6. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fractions of each kidney with respect to the injected dose, taking into account the kidney depth, with the results referred to normal values; and edition of the results. Automation minimizes parameter scatter and, by its simplification, is a great asset in routine work [fr
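
The activity-fraction step of the computing sequence can be sketched as follows. This is a toy calculation assuming a simple exponential depth correction; the attenuation coefficient mu, the counts, and the depths are hypothetical values, not taken from the abstract.

```python
import math

def kidney_uptake_fraction(counts, background, depth_cm, injected_counts,
                           mu=0.12):
    """Fraction of the injected dose taken up by one kidney.

    Background-corrected counts are compensated for tissue attenuation
    with a simple exponential depth correction exp(mu * depth); mu is a
    hypothetical effective attenuation coefficient in 1/cm.
    """
    corrected = (counts - background) * math.exp(mu * depth_cm)
    return corrected / injected_counts

# Hypothetical acquisition: counts per kidney ROI, shared background,
# kidney depths, and a calibrated injected-dose count.
left = kidney_uptake_fraction(52000, 7000, 6.5, 1_000_000)
right = kidney_uptake_fraction(48000, 7000, 7.0, 1_000_000)
print(round(left, 3), round(right, 3))
```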

  7. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads

    2011-01-01

    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical manner. In a first phase, a coarse estimate of the overall spine alignment and the vertebra locations is computed using a shape model sampling scheme. These samples are used to initialize a second phase of active shape model search, under a nonlinear model of vertebra appearance. The search is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  8. The Characterization Tool: A knowledge-based stem cell, differentiated cell, and tissue database with a web-based analysis front-end.

    NARCIS (Netherlands)

    I. Wohlers (Inken); H. Stachelscheid; J. Borstlap; K. Zeilinger; J.C. Gerlach

    2009-01-01

    In the rapidly growing field of stem cell research, there is a need for universal databases and web-based applications that provide a common knowledge base on the characteristics of stem cells, differentiated cells, and tissues by collecting, processing, and making available diverse

  9. Pigmented Nodular Basal Cell Carcinomas in Differential Diagnosis with Nodular Melanomas: Confocal Microscopy as a Reliable Tool for In Vivo Histologic Diagnosis

    International Nuclear Information System (INIS)

    Casari, A.; Pellacani, G.; Seidenari, S.; Pepe, P.; Longo, C.; Cesinaro, A. M.; Beretti, F.

    2011-01-01

    Nodular basal cell carcinoma, especially when pigmented, can enter into differential diagnosis with nodular melanoma, both clinically and dermoscopically. Reflectance confocal microscopy is a relatively new imaging technique that permits in vivo evaluation of skin tumors at nearly histological resolution. Here, we present four cases of challenging nodular lesions in which confocal microscopy was able to clarify the diagnosis.

  10. The Guide-based Automatic Creation of Verified Test Scenarios

    Directory of Open Access Journals (Sweden)

    P. D. Drobintsev

    2013-01-01

    Full Text Available This paper presents an overview of a technology for the automated generation of test scenarios based on guides. The use of this technology can significantly improve the quality of developed software products. To motivate the creation of the technology, the main problems that occur during the development and testing of large industrial systems are described, as well as methodologies for verifying software conformity to product requirements. The capabilities of tools for automatic and semi-automatic generation of a test suite from a formal model in UCM notation are demonstrated, as well as tools for verification and test automation.

  11. Automatic readout micrometer

    Science.gov (United States)

    Lauritzen, T.

    A measuring system is described for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems, which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibility of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment, without having the fine adjustment outrun the coarse adjustment, until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  12. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand-and-shoe monitors. In addition, this prototype system also has a number of new features, including: microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand-and-shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  13. Do Automatic Self-Associations Relate to Suicidal Ideation?

    NARCIS (Netherlands)

    Glashouwer, Klaske A.; de Jong, Peter J.; Penninx, Brenda W. J. H.; Kerkhof, Ad J. F. M.; van Dyck, Richard; Ormel, Johan

    Dysfunctional self-schemas are assumed to play an important role in suicidal ideation. According to recent information-processing models, it is important to differentiate between 'explicit' beliefs and automatic associations. Explicit beliefs stem from the weighting of propositions and their

  14. Reachability Games on Automatic Graphs

    Science.gov (United States)

    Neider, Daniel

    In this work we study two-person reachability games on finite and infinite automatic graphs. For the finite case we empirically show that automatic game encodings are competitive with well-known symbolic techniques such as BDDs, SAT and QBF formulas. For the infinite case we present a novel algorithm utilizing algorithmic learning techniques, which makes it possible to solve large classes of automatic reachability games.
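
For finite graphs, the basic solver behind such reachability games is the classical attractor construction, sketched here on a toy game (this is the textbook backward fixed-point algorithm, not the learning-based method of the paper):

```python
from collections import defaultdict

def attractor(vertices, edges, owner, target):
    """Compute the set of vertices from which player 0 can force a visit
    to `target` in a two-player reachability game on a finite graph.

    owner[v] is 0 or 1.  Standard backward fixed-point: a player-0 vertex
    joins the attractor if SOME successor is in it, a player-1 vertex
    only once ALL of its successors are in it.
    """
    succ, pred = defaultdict(list), defaultdict(list)
    for u, v in edges:
        succ[u].append(v)
        pred[v].append(u)
    attr = set(target)
    count = {v: len(succ[v]) for v in vertices}   # successors not yet in attr
    queue = list(target)
    while queue:
        v = queue.pop()
        for u in pred[v]:
            if u in attr:
                continue
            count[u] -= 1
            if owner[u] == 0 or count[u] == 0:
                attr.add(u)
                queue.append(u)
    return attr

# Toy game: player 0 owns the even vertices, player 1 the odd ones;
# player 0 wants to reach vertex 3.
vertices = [0, 1, 2, 3]
edges = [(0, 1), (0, 2), (1, 0), (1, 3), (2, 3), (3, 3)]
owner = {0: 0, 1: 1, 2: 0, 3: 1}
print(sorted(attractor(vertices, edges, owner, {3})))   # → [0, 1, 2, 3]
```

Here even vertex 1 is winning for player 0: player 1 can only move to 0 or 3, and from 0 player 0 goes via 2 straight to the target.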

  15. Automatic reactor protection system tester

    International Nuclear Information System (INIS)

    Deliant, J.D.; Jahnke, S.; Raimondo, E.

    1988-01-01

    The object of this paper is to present the automatic tester of reactor protection systems designed and developed by EDF and Framatome. The following points are discussed in order: the necessity for reactor protection system testing; the drawbacks of manual testing; the description and use of the Framatome automatic tester; on-site installation of this system; and the positive results obtained using the Framatome automatic tester in France.

  16. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. But for industrial applications, a tool for fast data analysis is also required, and it should be easy to handle. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. In practice, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument which covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.
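
As a toy illustration of GA-based spectrum fitting, the sketch below estimates the centre of a single Lorentzian line with a minimal genetic algorithm. All parameters (line shape, population size, mutation schedule) are illustrative assumptions, far simpler than a real Moessbauer analysis pipeline.

```python
import random

def lorentzian(x, x0, gamma=0.3, amp=1.0):
    """Single Lorentzian line, a toy model of one Moessbauer resonance."""
    return amp * gamma ** 2 / ((x - x0) ** 2 + gamma ** 2)

def fit_center_ga(xs, ys, lo=-5.0, hi=5.0, pop_size=30, generations=60,
                  seed=0):
    """Tiny genetic algorithm estimating the line centre x0 by minimizing
    the sum of squared residuals: elitist selection plus Gaussian mutation
    with a shrinking step size."""
    rng = random.Random(seed)
    sse = lambda c: sum((lorentzian(x, c) - y) ** 2 for x, y in zip(xs, ys))
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for gen in range(generations):
        pop.sort(key=sse)
        elite = pop[: pop_size // 3]                 # keep the best third
        sigma = (hi - lo) / (2.0 ** (gen / 10 + 2))  # shrinking mutation step
        pop = elite + [rng.gauss(rng.choice(elite), sigma)
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=sse)

# Synthetic, noise-free "spectrum" with the line centred at 1.7.
xs = [i / 10.0 for i in range(-50, 51)]
ys = [lorentzian(x, 1.7) for x in xs]
print(round(fit_center_ga(xs, ys), 2))
```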

  17. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples
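
The two-dimensional Thue-Morse sequence mentioned above is easy to generate, and its 2-automaticity (finitely many decimations) can be checked directly; a minimal sketch:

```python
def thue_morse_2d(m, n):
    """Two-dimensional Thue-Morse value: parity of the total number of
    1-bits in the binary expansions of m and n.  The set of points with
    value 0 is a 2-automatic subset of Z^2 (restricted here to N^2)."""
    return (bin(m).count("1") + bin(n).count("1")) % 2

# 2-automaticity in action: the decimation t(2m+a, 2n+b) depends only on
# t(m, n) and the digits (a, b), so there are finitely many such maps.
for m in range(8):
    for n in range(8):
        for a in (0, 1):
            for b in (0, 1):
                assert thue_morse_2d(2 * m + a, 2 * n + b) == \
                       (thue_morse_2d(m, n) + a + b) % 2

print([[thue_morse_2d(m, n) for n in range(4)] for m in range(4)])
# → [[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]]
```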

  18. Combination of cyst fluid CEA and CA 125 is an accurate diagnostic tool for differentiating mucinous cystic neoplasms from intraductal papillary mucinous neoplasms.

    Science.gov (United States)

    Nagashio, Yoshikuni; Hijioka, Susumu; Mizuno, Nobumasa; Hara, Kazuo; Imaoka, Hiroshi; Bhatia, Vikram; Niwa, Yasumasa; Tajika, Masahiro; Tanaka, Tsutomu; Ishihara, Makoto; Shimizu, Yasuhiro; Hosoda, Waki; Yatabe, Yasushi; Yamao, Kenji

    2014-01-01

    Despite advances in imaging techniques, the diagnosis and management of pancreatic cystic lesions still remain challenging. The objective of this study was to determine the utility of cyst fluid analysis (CEA, CA 19-9, CA 125, amylase, and cytology) in categorizing pancreatic cystic lesions, and in differentiating malignant from benign cystic lesions. A retrospective analysis of 68 patients with histologically and clinically confirmed cystic lesions was performed. Cyst fluid was obtained by surgical resection (n = 45) or endoscopic ultrasound-guided fine needle aspiration (EUS-FNA) (n = 23). Cyst fluid tumor markers and amylase were measured and compared between the cyst types. Receiver operating characteristic (ROC) curve analysis of the tumor markers demonstrated that cyst fluid CEA provided the greatest area under the ROC curve (AUC) (0.884) for differentiating mucinous versus non-mucinous cystic lesions. When the CEA cutoff value was set at 67.3 ng/ml, the sensitivity, specificity and accuracy for diagnosing mucinous cysts were 89.2%, 77.8%, and 84.4%, respectively. The combination of cyst fluid CEA content >67.3 ng/ml and cyst fluid CA 125 content >10.0 U/ml segregated 77.8% (14/18) of mucinous cystic neoplasms (MCNs) from other cyst subtypes. On the other hand, no fluid marker was useful for differentiating malignant versus benign cystic lesions. Although cytology (accuracy 83.3%) diagnosed malignant cysts more accurately than CEA (accuracy 65.6%), it lacked sensitivity (35.3%). Our results demonstrate that cyst fluid CEA can be a helpful marker in differentiating mucinous from non-mucinous, but not malignant from benign, cystic lesions. A combined CEA and CA 125 approach may help segregate MCNs from intraductal papillary mucinous neoplasms (IPMNs). Copyright © 2014 IAP and EPC. Published by Elsevier B.V. All rights reserved.
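
The sensitivity/specificity/accuracy figures above come from applying a fixed cutoff to the measured marker values. A minimal sketch of that computation, on hypothetical CEA values rather than the study's data:

```python
def diagnostic_stats(values, labels, cutoff):
    """Sensitivity, specificity and accuracy of a 'value > cutoff means
    positive' rule; labels are True for the condition (mucinous cyst)."""
    tp = sum(1 for v, l in zip(values, labels) if l and v > cutoff)
    fn = sum(1 for v, l in zip(values, labels) if l and v <= cutoff)
    tn = sum(1 for v, l in zip(values, labels) if not l and v <= cutoff)
    fp = sum(1 for v, l in zip(values, labels) if not l and v > cutoff)
    return (tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(values))

# Hypothetical cyst fluid CEA measurements (ng/ml); mucinous cysts high.
cea    = [120.0, 85.0, 70.1, 40.0, 300.0, 12.0, 5.0, 66.0, 2.0, 90.0]
is_muc = [True,  True, True, True, True,  False, False, False, False, False]
sens, spec, acc = diagnostic_stats(cea, is_muc, cutoff=67.3)
print(sens, spec, acc)   # → 0.8 0.8 0.8
```

Sweeping the cutoff and plotting sensitivity against (1 - specificity) is exactly the ROC curve whose area the abstract reports.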

  19. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electrical machines, from generators to motors: the motor as a power source of the machine tool, and electrical equipment for machine tools such as switches in the main circuit, automatic machines, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part covers wiring diagrams, including the basic electrical circuits of machine tools and the wiring diagrams of machines such as milling machines, planers and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions according to the diagnosis, together with diagnostic methods based on voltage and resistance measurements with a tester.

  20. Norms concerning the programmable automatic control devices

    International Nuclear Information System (INIS)

    Fourmentraux, G.

    1995-01-01

    This presentation is a report of the studies carried out by the Work Group on Functioning Safety of Programmable Automatic Control Devices and by the Group for Prevention Studies (GEP) of the CEA. The objective of these groups is to evaluate the methods which could be used to estimate the functioning safety of control and instrumentation systems involved in the Important Elements for Safety (EIS) of the Basic Nuclear Installations (INB) of the CEA, and also to carry out a qualification of automatic control devices. Norms, protocols and tools for the evaluation are presented. The problem comprises two aspects: the evaluation of fault avoidance techniques, and the evaluation of fault control techniques used during design. For the fault avoidance techniques, the quality assurance organization, the environment tests, and the software quality plans are considered. For the fault control techniques, the different available tools and fault injection models are analysed. The results of an analysis carried out with the DEF.I tool from the National Institute for Research and Safety (INRS) are reported. (J.S.). 23 refs

  1. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through a description of the Micro-Videomat automatic image analysis system and its application to the volumetric percentage of perlite in nodular cast irons, the porosity and average grain size in high-density sintered pellets of UO2, and the grain size of ferritic steel. The techniques adopted are described, and the results obtained are compared with the corresponding ones from the direct counting process: counting of systematic points (grid) to measure volume, and the intersection method, using a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron resulted from the small difference in optical reflectivity between graphite and perlite. Porosity evaluation of sintered UO2 pellets is also analyzed [pt

  2. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  3. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  4. Automatic alkaloid removal system.

    Science.gov (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant: scientific studies have shown that its tubers contain a toxic alkaloid constituent, dioscorine. The tubers can only be consumed after the poison is removed. In this experiment, the tubers need to be blended into powder form before being inserted into the machine basket. The user pushes the START button on the machine controller to switch the water pump ON, creating a turbulent wave of water in the machine tank. The water stops automatically by triggering of the outlet solenoid valve. The tuber powder is washed for 10 minutes while 1 liter of water contaminated with the toxin mixture flows out. At this time, the controller automatically triggers the inlet solenoid valve, and new water flows into the machine tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, and a positive result is achieved, shown to be significant according to several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate or time. These parameters show results near or identical to those of the control water; the assumption was made that the toxin is fully removed when the pH of the DH powder wash is near that of the control water. For the control water, the pH is about 5.3, while water from this experimental process is 6.0; before running the machine, the pH of the contaminated water is about 3.8, which is too acidic. This automated machine can save time in removing toxicity from DH compared with the traditional method, while requiring less observation by the user.
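
The wash/drain/refill schedule described above can be sketched as a simple controller simulation. The 10-minute wash and 7-hour total come from the abstract; the drain and fill durations are assumptions.

```python
def wash_cycles(total_minutes=420, wash_minutes=10, fill_minutes=2,
                drain_minutes=2):
    """Simulate the wash/drain/refill schedule described for the alkaloid
    removal machine: repeated 10-minute washes over a 7-hour (420 min) run.
    Drain and fill durations are hypothetical."""
    t, cycles, log = 0, 0, []
    while t + wash_minutes + drain_minutes + fill_minutes <= total_minutes:
        log.append(("wash", t)); t += wash_minutes     # pump creates turbulence
        log.append(("drain", t)); t += drain_minutes   # outlet solenoid opens
        log.append(("fill", t)); t += fill_minutes     # inlet valve, level sensor
        cycles += 1
    return cycles, log

cycles, log = wash_cycles()
print(cycles)   # complete wash cycles in a 7-hour run → 30
```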

  5. Automatic Visualization of Software Requirements: Reactive Systems

    International Nuclear Information System (INIS)

    Castello, R.; Mili, R.; Tollis, I.G.; Winter, V.

    1999-01-01

    In this paper we present an approach that facilitates the validation of high consequence system requirements. This approach consists of automatically generating a graphical representation from an informal document. Our choice of a graphical notation is statecharts. We proceed in two steps: we first extract a hierarchical decomposition tree from a textual description, then we draw a graph that models the statechart in a hierarchical fashion. The resulting drawing is an effective requirements assessment tool that allows the end user to easily pinpoint inconsistencies and incompleteness
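
The first step, extracting a hierarchical decomposition tree from text, can be illustrated on a toy indented requirements fragment; the parser and example below are hypothetical, not the paper's method.

```python
def parse_hierarchy(text):
    """Parse an indented outline into nested (name, children) tuples,
    a minimal stand-in for the hierarchical decomposition tree that
    seeds the statechart drawing."""
    root = ("root", [])
    stack = [(-1, root)]                      # (indent depth, node)
    for line in text.splitlines():
        if not line.strip():
            continue
        depth = len(line) - len(line.lstrip())
        node = (line.strip(), [])
        while stack[-1][0] >= depth:          # pop siblings and deeper nodes
            stack.pop()
        stack[-1][1][1].append(node)          # attach to current parent
        stack.append((depth, node))
    return root

# Hypothetical requirements fragment describing nested states.
requirements = """\
Controller
  Idle
  Running
    Heating
    Cooling
"""
tree = parse_hierarchy(requirements)
print(tree)
```

Each tuple would become one (possibly composite) state in the generated statechart, with children drawn nested inside their parent.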

  6. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi

  7. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids the overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.

  8. Length Scales in Bayesian Automatic Adaptive Quadrature

    Science.gov (United States)

    Adam, Gh.; Adam, S.

    2016-02-01

    Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1-16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids the overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
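
The three-class scheme can be sketched as a dispatch on the integration-domain length. For simplicity the macroscopic rule below is 5-point Gauss-Legendre rather than Clenshaw-Curtis, and the length thresholds are illustrative, not the ones derived in the paper:

```python
import math

def adaptive_rule(f, a, b, micro=1e-3, meso=1e-1):
    """Dispatch on the integration-domain length, in the spirit of the
    three-class scheme: trapezoidal rule on microscopic ranges, Simpson
    rule on mesoscopic ones, and a high-degree rule (5-point
    Gauss-Legendre, standing in for Clenshaw-Curtis) on macroscopic ones.
    The thresholds micro/meso are illustrative."""
    h = b - a
    if h <= micro:                       # microscopic: trapezoidal rule
        return h * (f(a) + f(b)) / 2.0
    if h <= meso:                        # mesoscopic: Simpson rule
        return h * (f(a) + 4.0 * f((a + b) / 2.0) + f(b)) / 6.0
    # Macroscopic: 5-point Gauss-Legendre, exact up to degree 9.
    nodes = [0.0, 0.5384693101056831, -0.5384693101056831,
             0.9061798459386640, -0.9061798459386640]
    weights = [0.5688888888888889, 0.4786286704993665, 0.4786286704993665,
               0.2369268850561891, 0.2369268850561891]
    mid, half = (a + b) / 2.0, h / 2.0
    return half * sum(w * f(mid + half * x) for x, w in zip(nodes, weights))

# Quick check on a macroscopic range: the exact integral of sin on [0, pi] is 2.
print(round(adaptive_rule(math.sin, 0.0, math.pi), 6))
```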

  9. Automatic Creation of quality multi-word Lexica from noisy text data

    OpenAIRE

    Frontini, Francesca; Quochi, Valeria; Rubino, Francesco

    2012-01-01

    This paper describes the design of a tool for the automatic creation of multi-word lexica that is deployed as a web service and runs on automatically web-crawled data within the framework of the PANACEA platform. The main purpose of our task is to provide a (computationally "light") tool that creates a full high quality lexical resource of multi-word items. Within the platform, this tool is typically inserted in a work flow whose first step is automatic web-crawling. Therefore, the input data...

  10. Classifying visemes for automatic lipreading

    NARCIS (Netherlands)

    Visser, Michiel; Poel, Mannes; Nijholt, Antinus; Matousek, Vaclav; Mautner, Pavel; Ocelikovi, Jana; Sojka, Petr

    1999-01-01

    Automatic lipreading is automatic speech recognition that uses only visual information. The relevant data in a video signal is isolated and features are extracted from it. From a sequence of feature vectors, where every vector represents one video image, a sequence of higher level semantic elements

  11. The combination of urinary IL - 6 and renal biometry as useful diagnostic tools to differentiate acute pyelonephritis from lower urinary tract infection

    Directory of Open Access Journals (Sweden)

    Sherif Azab

    Full Text Available ABSTRACT Objective: To evaluate the role of renal ultrasound (RUS) and urinary IL-6 in the differentiation between acute pyelonephritis (APN) and lower urinary tract infection (LUTI). Patients and methods: This prospective study was carried out at the pediatric and urology outpatient and inpatient departments of Cairo University Children's Hospital as well as October 6 University Hospital, and it included 155 children between one month and fourteen years old with culture-positive UTI. Patients were categorized into APN and LUTI based on their clinical features and laboratory parameters. Thirty healthy, age- and sex-matched children constituted the control group. Children with positive urine cultures were treated with appropriate antibiotics. Before treatment, urinary IL-6 was measured by enzyme immunoassay (ELISA), and renal ultrasound (RUS) was done. CRP (C-reactive protein), IL-6 and RUS were repeated on the 14th day of antibiotic treatment to evaluate the changes in their levels in response to treatment. Results: UIL-6 levels were significantly higher in patients with APN than in patients with LUTI (24.3±19.3pg/mL for APN vs. 7.3±2.7pg/mL for LUTI; 95% CI: 2.6-27.4; p<0.001). UIL-6 >20pg/mL and serum CRP >20μg/mL were highly reliable markers of APN. Mean renal volume and mean volume difference between the two kidneys in the APN group were greater than those of the LUTI and control groups (P<0.001). A renal volume between 120-130% of normal was the best cut-off for differentiating APN from LUTI. Conclusions: RUS and urinary IL-6 levels have a highly dependable role in the differentiation between APN and LUTI, especially in places where other investigations are not available and/or affordable.

  12. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

    Full Text Available Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) A user describes requirements of target embedded network systems by logical property-based constraints using SENS. (2) Given SENS specifications, test cases are automatically generated using a SAT-based solver. Filtering mechanisms to select efficient test cases are also available in our tool. (3) In addition, given a testing goal by the user, test sequences are automatically extracted from exhaustive test cases. We’ve implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in design and test generation of real embedded air-conditioning network systems.

  13. Coherence measures in automatic time-migration velocity analysis

    International Nuclear Information System (INIS)

    Maciel, Jonathas S; Costa, Jessé C; Schleicher, Jörg

    2012-01-01

    Time-migration velocity analysis can be carried out automatically by evaluating the coherence of migrated seismic events in common-image gathers (CIGs). The performance of gradient methods for automatic time-migration velocity analysis depends on the coherence measure used as the objective function. We compare the results of four different coherence measures: conventional semblance, differential semblance, an extended differential semblance using differences of more distant image traces, and the product of the latter with conventional semblance. In our numerical experiments, the objective functions based on conventional semblance and on the product of conventional semblance with extended differential semblance provided the best velocity models, as evaluated by the flatness of the resulting CIGs. The method can be easily extended to anisotropic media. (paper)
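Conventional semblance, the first of the four measures, is the classic stack-power coherence ratio; a minimal NumPy sketch (toy data, not the authors' implementation) shows it peaking at 1 for a perfectly flat event in a gather:

```python
import numpy as np

def semblance(gather):
    """Conventional semblance of a gather: rows = time samples, cols = traces.

    Ratio of stack power to total power; returns a value in [0, 1],
    where 1 means all traces are identical (a perfectly flat event).
    """
    n_traces = gather.shape[1]
    num = (gather.sum(axis=1) ** 2).sum()
    den = n_traces * (gather ** 2).sum()
    return num / den

rng = np.random.default_rng(0)
flat = np.tile(rng.normal(size=(50, 1)), (1, 12))   # flat event: 12 identical traces
noisy = rng.normal(size=(50, 12))                   # incoherent traces
```

For incoherent noise, semblance drops toward 1/N (here 1/12), which is why it works as an objective function for flattening CIGs.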

  14. Automatic Recognition of Road Signs

    Science.gov (United States)

    Inoue, Yasuo; Kohashi, Yuuichirou; Ishikawa, Naoto; Nakajima, Masato

    2002-11-01

    The increase in traffic accidents is becoming a serious social problem with the recent rapid growth in traffic. In many cases the driver's carelessness is the primary factor in traffic accidents, and driver assistance systems are in demand to support driver safety. In this research, we propose a new method for the automatic detection and recognition of road signs by image processing. The purpose of this research is to prevent accidents caused by a driver's carelessness and to call attention to the driver when a traffic regulation is violated. High accuracy and efficient sign detection are realized by removing unnecessary information except for road signs from an image and detecting road signs using shape features. First, color information that is not used in road signs is removed from the image. Next, edges other than circular and triangular ones are removed to select sign shapes. In the recognition process, a normalized cross-correlation operation is carried out on the two-dimensional differentiation pattern of a sign, realizing an accurate and efficient method for detecting road signs. Moreover, real-time operation in software was achieved by holding down the calculation cost while maintaining highly precise sign detection and recognition. Specifically, processing at 0.1 sec/frame is possible using a general-purpose PC (CPU: Pentium4 1.7GHz). In-vehicle experimentation confirmed that our system processes in real time and performs detection and recognition of signs correctly.

  15. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, yet the task is quite challenging in US images due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach that improves the boundary-searching ability of the live-wire method, reducing the necessary user interaction while keeping the segmentation performance. Based on the results of segmenting 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy with live-wire segmentation.

  16. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze p...

  17. Automatic web site authoring with SiteGuide

    NARCIS (Netherlands)

    de Boer, V.; Hollink, V.; van Someren, M.W.; Kłopotek, M.A.; Przepiórkowski, A.; Wierzchoń, S.T.; Trojanowski, K.

    2009-01-01

    An important step in the design process for a web site is to determine which information is to be included and how the information should be organized on the web site’s pages. In this paper we describe ’SiteGuide’, a tool that automatically produces an information architecture for a web site that a

  18. Robust automatic intelligibility assessment techniques evaluated on speakers treated for head and neck cancer

    NARCIS (Netherlands)

    Middag, C.; Clapham, R.; van Son, R.; Martens, J-P.

    2014-01-01

    It is generally acknowledged that an unbiased and objective assessment of the communication deficiency caused by a speech disorder calls for automatic speech processing tools. In this paper, a new automatic intelligibility assessment method is presented. The method can predict running speech

  19. Automatic Inspection During Machining

    Science.gov (United States)

    Ransom, Clyde L.

    1988-01-01

    In an experimental manufacturing process, a numerically controlled machine tool is temporarily converted into an inspection machine by installing electronic touch probes and specially developed numerical-control software. The software drives the probes in paths to and on newly machined parts and collects data on the dimensions of the parts.

  20. Automatic exposure for xeromammography

    International Nuclear Information System (INIS)

    Aichinger, H.

    1977-01-01

    During mammography without intensifying screens, exposure measurements are carried out behind the film. It is, however, difficult to construct an absolutely shadow-free ionization chamber of adequate sensitivity working in the necessary range of 25 to 50 kV. Repeated attempts have been made to utilize the advantages of automatic exposure for xeromammography. In this case also, the ionization chamber was placed behind the Xerox plate. Depending on tube filtration, object thickness and tube voltage, more than 80%, sometimes even 90%, of the radiation is absorbed by the Xerox plate. In particular, the characteristic Mo radiation of 17.4 keV and 19.6 keV is almost totally absorbed by the plate and cannot therefore be registered by the ionization chamber. This results in a considerable dependence of the exposure on kV and object thickness. The dependence on tube voltage and object thickness has been examined dosimetrically and spectroscopically with a Ge(Li) spectrometer. Finally, the successful use of a shadow-free chamber is described; this has been particularly adapted for xeromammography and is placed in front of the plate. (orig.)

  1. Historical Review and Perspective on Automatic Journalizing

    OpenAIRE

    Kato, Masaki

    2017-01-01

    Contents: Introduction; 1. EDP Accounting and Automatic Journalizing; 2. Learning System of Automatic Journalizing; 3. Automatic Journalizing by Artificial Intelligence; 4. Direction of the Progress of the Accounting Information System

  2. The combination of urinary IL - 6 and renal biometry as useful diagnostic tools to differentiate acute pyelonephritis from lower urinary tract infection.

    Science.gov (United States)

    Azab, Sherif; Zakaria, Mostafa; Raafat, Mona; Seief, Hadeel

    2016-01-01

    To evaluate the role of renal ultrasound (RUS) and urinary IL-6 in the differentiation between acute pyelonephritis (APN) and lower urinary tract infection (LUTI). This prospective study was carried out at the pediatric and urology outpatient and inpatient departments of Cairo University Children's Hospital as well as October 6 University Hospital, and it included 155 children between one month and fourteen years old with culture-positive UTI. Patients were categorized into APN and LUTI based on their clinical features and laboratory parameters. Thirty healthy, age- and sex-matched children constituted the control group. Children with positive urine cultures were treated with appropriate antibiotics. Before treatment, urinary IL-6 was measured by enzyme immunoassay (ELISA), and renal ultrasound (RUS) was done. CRP (C-reactive protein), IL-6 and RUS were repeated on the 14th day of antibiotic treatment to evaluate the changes in their levels in response to treatment. UIL-6 levels were significantly higher in patients with APN than in patients with LUTI (24.3±19.3pg/mL for APN vs. 7.3±2.7pg/mL for LUTI; 95% CI: 2.6-27.4; p<0.001). UIL-6 >20pg/mL and serum CRP >20μg/mL were highly reliable markers of APN. Mean renal volume and mean volume difference between the two kidneys in the APN group were greater than those of the LUTI and control groups (P<0.001). RUS and urinary IL-6 levels have a highly dependable role in the differentiation between APN and LUTI, especially in places where other investigations are not available and/or affordable. Copyright© by the International Brazilian Journal of Urology.

  3. Thyroid scintigraphy and perchlorate test after recombinant human TSH: a new tool for the differential diagnosis of congenital hypothyroidism during infancy

    Energy Technology Data Exchange (ETDEWEB)

    Fugazzola, Laura; Vannucchi, Guia; Mannavola, Deborah; Beck-Peccoz, Paolo [University of Milan and Fondazione Policlinico IRCCS, Department of Medical Sciences, Milan (Italy); Persani, Luca [University of Milan and Istituto Auxologico Italiano, Department of Medical Sciences, Via Zucchi, Cusano, Milan (Italy); Carletto, Marco; Longari, Virgilio [Fondazione Policlinico IRCCS, Department of Nuclear Medicine, Milan (Italy); Vigone, Maria C.; Cortinovis, Francesca; Weber, Giovanna [Universita Vita-Salute S. Raffaele, Centro di Endocrinologia dell' Infanzia e dell' Adolescenza, Milan (Italy); Beccaria, Luciano [A. Manzoni Hospital, Paediatric Unit, Lecco (Italy)

    2007-09-15

    Prompt initiation of l-thyroxine therapy in neonates with congenital hypothyroidism (CH) often prevents the performance of functional studies. Aetiological diagnosis is thus postponed until after infancy, when the required investigations are performed after l-thyroxine withdrawal. The aim of this study was to verify the efficacy and safety of new protocols for rhTSH (Thyrogen) testing during l-thyroxine replacement in the differential diagnosis of CH. Ten CH patients (15-144 months old) were studied. Seven had neonatal evidence of gland in situ at the ultrasound examination performed at enrolment and received two rhTSH injections (4 μg/kg daily, i.m.) with ¹²³I scintigraphy and perchlorate test on day 3. Three patients with an ultrasound diagnosis of thyroid dysgenesis received three rhTSH injections with ¹²³I scintigraphy on days 3 and 4. TSH and thyroglobulin (Tg) determinations were performed on days 1, 3 and 4, and neck ultrasound on day 1. rhTSH stimulation caused Tg levels to increase in eight cases. Blunted Tg responses were seen in two patients with ectopia and hypoplasia. Interestingly, in two cases the association of different developmental defects was demonstrated. Perchlorate test revealed a total iodide organification defect in two patients, including one with a neonatal diagnosis of Pendred's syndrome, who were subsequently found to harbour TPO mutations. rhTSH did not cause notable side-effects. These new rhTSH protocols always resulted in accurate disease characterisation, allowing specific management and targeted genetic analyses. Thus, rhTSH represents a valid and safe alternative to l-thyroxine withdrawal in the differential diagnosis of CH in paediatric patients. (orig.)

  4. Image based automatic water meter reader

    Science.gov (United States)

    Jawas, N.; Indrianto

    2018-01-01

    A water meter is used to measure water consumption. It works by utilizing the water flow and shows the result on a mechanical digit counter. In everyday use, an operator manually checks the digit counter periodically and logs the number shown by the water meter to track water consumption. This manual operation is time consuming and prone to human error. Therefore, in this paper we propose an automatic water meter digit reader that works from a digital image. The digit sequence is detected by utilizing contour information from the water meter's front panel. Then an OCR method is used to recognize each digit character. Digit-sequence detection is an important part of the overall process, as it determines the success of the overall system. The experiments show promising results, especially in sequence detection.

  5. Sexual Modes Questionnaire (SMQ): Translation and Psychometric Properties of the Italian Version of the Automatic Thought Scale.

    Science.gov (United States)

    Nimbi, Filippo Maria; Tripodi, Francesca; Simonelli, Chiara; Nobre, Pedro

    2018-03-01

    The Sexual Modes Questionnaire (SMQ) is a validated and widely used tool to assess the association among negative automatic thoughts, emotions, and sexual response during sexual activity in men and women. To test the psychometric characteristics of the Italian version of the SMQ, focusing on the Automatic Thoughts subscale (SMQ-AT). After linguistic translation, the psychometric properties (internal consistency, construct, and discriminant validity) were evaluated. 1,051 participants (425 men and 626 women; 776 healthy and 275 in clinical groups complaining of sexual problems) participated in the present study. 2 confirmatory factor analyses were conducted to test the fit of the original factor structures of the SMQ versions. In addition, 2 principal component analyses were performed to highlight 2 new factorial structures that were further validated with confirmatory factor analyses. Cronbach α and composite reliability were used as internal consistency measures, and differences between clinical and control groups were tested to assess the discriminant validity of the male and female versions. The associations with emotions and sexual functioning measures also are reported. Principal component analyses identified 5 factors in the male version: erection concerns thoughts, lack of erotic thoughts, age- and body-related thoughts, negative thoughts toward sex, and worries about partner's evaluation and failure anticipation thoughts. In the female version 6 factors were found: sexual abuse thoughts, lack of erotic thoughts, low self-body image thoughts, failure and disengagement thoughts, sexual passivity and control, and partner's lack of affection. Confirmatory factor analysis supported the adequacy of the factor structure for men and women. Moreover, the SMQ showed a strong association with emotional response and sexual functioning, differentiating between clinical and control groups. This measure is useful to evaluate patients and design interventions focused on...

  6. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity.This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  7. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture of aggregates adversely affects the concrete mixture homogeneity and, accordingly, the strength of building structures. A new control method and an automatic control system for the concrete mixture homogeneity in the technological process of mixing components are proposed, since the task of providing a homogeneous concrete mixture is performed by the automatic control system of the kneading-and-mixing machinery with operational automatic control of homogeneity. Theoretical underpinnings of the control of mixture homogeneity are presented, which are related to a change in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the supply of water is determined depending on the change in the concrete mixture homogeneity during the continuous mixing of components. The following technical means for establishing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, a block diagram with transfer functions is provided, which determines the operation of the automatic control system in the transient dynamic mode.

  8. Theory and applications of differential algebra

    International Nuclear Information System (INIS)

    Pusch, G.D.

    1992-01-01

    Differential algebra (DA) is a new method of automatic differentiation. DA can rapidly and efficiently calculate the values of derivatives of arbitrarily complicated functions, in arbitrarily many variables, to arbitrary order, via its definition of multiplication. I provide a brief introduction to DA, and enumerate some of its recent applications. (author). 6 refs
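The multiplication rule the abstract refers to is the dual-number product familiar from forward-mode automatic differentiation. A minimal first-order sketch in Python (illustrative only; DA generalizes this to arbitrary derivative order and many variables):

```python
import math

class Dual:
    """Forward-mode automatic differentiation via dual numbers:
    val carries f(x), dot carries f'(x), propagated by the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # Chain rule for an elementary function: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot) if isinstance(x, Dual) else math.sin(x)

def derivative(f, x):
    """Evaluate f at the dual number x + 1*eps; the dot part is f'(x)."""
    return f(Dual(x, 1.0)).dot

# d/dx [x * sin(x) + x] at x = 2 equals sin(2) + 2*cos(2) + 1
df = derivative(lambda x: x * sin(x) + x, 2.0)
```

The derivative is exact to roundoff, with no symbolic expression and no finite-difference step, which is precisely the "neither symbolic nor numeric" property of automatic differentiation.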

  9. A Clustering-Based Automatic Transfer Function Design for Volume Visualization

    Directory of Open Access Journals (Sweden)

    Tianjin Zhang

    2016-01-01

    Full Text Available The two-dimensional transfer functions (TFs) designed based on intensity-gradient magnitude (IGM) histograms are effective tools for the visualization and exploration of 3D volume data. However, traditional design methods usually depend on multiple rounds of trial-and-error. We propose a novel method for the automatic generation of transfer functions by performing the affinity propagation (AP) clustering algorithm on the IGM histogram. Compared with previous clustering algorithms that were employed in volume visualization, the AP clustering algorithm has much faster convergence speed and can achieve more accurate clustering results. In order to obtain meaningful clustering results, we introduce two similarity measurements: IGM similarity and spatial similarity. These two similarity measurements can effectively bring the voxels of the same tissue together and differentiate the voxels of different tissues so that the generated TFs can assign different optical properties to different tissues. Before performing the clustering algorithm on the IGM histogram, we propose to remove noisy voxels based on the spatial information of voxels. Our method does not require users to input the number of clusters, and the classification and visualization process is automatic and efficient. Experiments on various datasets demonstrate the effectiveness of the proposed method.
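Affinity propagation itself is compact enough to sketch. The following is a minimal NumPy implementation of the standard Frey-Dueck message-passing updates, run on toy 2D points rather than the paper's voxel similarities (a generic sketch, not the authors' code):

```python
import numpy as np

def affinity_propagation(S, damping=0.5, iters=200):
    """Minimal affinity propagation on a similarity matrix S.

    The diagonal of S holds each point's 'preference' to become an exemplar;
    returns, for every point, the index of its chosen exemplar."""
    n = S.shape[0]
    R = np.zeros((n, n))          # responsibilities r(i, k)
    A = np.zeros((n, n))          # availabilities  a(i, k)
    I = np.arange(n)
    for _ in range(iters):
        # r(i,k) <- s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        top = AS.argmax(axis=1)
        first = AS[I, top].copy()
        AS[I, top] = -np.inf
        second = AS.max(axis=1)
        Rn = S - first[:, None]
        Rn[I, top] = S[I, top] - second
        R = damping * R + (1 - damping) * Rn
        # a(i,k) <- min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        Rp[I, I] = R[I, I]
        An = Rp.sum(axis=0)[None, :] - Rp
        diag = An[I, I].copy()
        An = np.minimum(An, 0)
        An[I, I] = diag
        A = damping * A + (1 - damping) * An
    return (A + R).argmax(axis=1)

# Two well-separated point groups; similarity = negative squared distance,
# preference = median similarity (the usual default).
pts = np.array([[0., 0.], [0., 1.], [1., 0.], [10., 10.], [10., 11.], [11., 10.]])
S = -((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
S[np.arange(6), np.arange(6)] = np.median(S[~np.eye(6, dtype=bool)])
labels = affinity_propagation(S)
```

Note that the number of clusters is never specified: it emerges from the preferences, which is the property the paper exploits for fully automatic TF generation.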

  10. Wavelet-based feature extraction applied to small-angle x-ray scattering patterns from breast tissue: a tool for differentiating between tissue types

    International Nuclear Information System (INIS)

    Falzon, G; Pearson, S; Murison, R; Hall, C; Siu, K; Evans, A; Rogers, K; Lewis, R

    2006-01-01

    This paper reports on the application of wavelet decomposition to small-angle x-ray scattering (SAXS) patterns from human breast tissue produced by a synchrotron source. The pixel intensities of SAXS patterns of normal, benign and malignant tissue types were transformed into wavelet coefficients. Statistical analysis found significant differences between the wavelet coefficients describing the patterns produced by different tissue types. These differences were then correlated with position in the image and have been linked to the supra-molecular structural changes that occur in breast tissue in the presence of disease. Specifically, results indicate that there are significant differences between healthy and diseased tissues in the wavelet coefficients that describe the peaks produced by the axial d-spacing of collagen. These differences suggest that a useful classification tool could be based upon the spectral information within the axial peaks
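Wavelet coefficients of the kind analyzed here make compact texture features. A minimal single-level Haar decomposition in NumPy (a generic 1D sketch on synthetic signals; the study worked on 2D SAXS patterns with a full decomposition):

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; even-length input assumed."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # local averages
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # local differences
    return approx, detail

# Detail-coefficient energy as a simple "texture" feature:
t = np.linspace(0.0, 1.0, 256)
smooth = np.sin(2 * np.pi * t)                                  # slowly varying signal
rough = smooth + 0.5 * np.random.default_rng(1).normal(size=256)
feature = lambda s: float((haar_dwt(s)[1] ** 2).sum())
```

Statistics computed over such coefficients (here, just the detail energy) are the kind of quantities one can test for significant differences between signal classes.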

  11. Differential voltage analysis as a tool for analyzing inhomogeneous aging: A case study for LiFePO4|Graphite cylindrical cells

    Science.gov (United States)

    Lewerenz, Meinert; Marongiu, Andrea; Warnecke, Alexander; Sauer, Dirk Uwe

    2017-11-01

    In this work the differential voltage analysis (DVA) is evaluated for LiFePO4|Graphite cylindrical cells aged in calendaric and cyclic tests. The homogeneity of the active lithium distribution and the loss of anode active material (LAAM) are measured by the characteristic shape and peaks of the DVA. The results from this analysis exhibit an increasing homogeneity of the lithium-ion distribution during aging for all cells subjected to calendaric aging. At 60 °C, LAAM is found additionally and can be associated with the deposition of dissolved Fe from the cathode on the anode, where it finally leads to the clogging of pores. For cells aged under cyclic conditions, several phenomena are correlated to degradation, such as loss of active lithium and local LAAM for 100% DOD. Moreover, the deactivation of certain parts of anode and cathode due to a lithium-impermeable covering layer on top of the anode is observed for some cells. While the 100% DOD cycling is featured by a continuous LAAM, the LAAM due to deactivation by a covering layer of both electrodes starts suddenly. The homogeneity of the active lithium distribution within the cycled cells is successively reduced with deposited passivation layers and with LAAM that is lost locally at positions with lower external pressure on the electrode.
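The characteristic DVA shape comes from differentiating the voltage curve with respect to charge throughput. A minimal sketch on a synthetic discharge curve (toy data and parameters of my own, not measurements from the cells in the study):

```python
import numpy as np

def differential_voltage(capacity, voltage):
    """dV/dQ curve used in differential voltage analysis (DVA).

    Peaks and their positions along the capacity axis are the
    characteristic features tracked during aging."""
    return np.gradient(voltage, capacity)

# Synthetic discharge curve: a gentle slope with a step near 1.0 Ah.
q = np.linspace(0.0, 2.0, 400)                              # capacity, Ah
v = 3.3 - 0.05 * q - 0.15 * np.tanh((q - 1.0) / 0.05)       # voltage, V
dvdq = differential_voltage(q, v)
peak_q = q[np.argmin(dvdq)]                                 # steepest voltage drop
```

Shifts of such features between check-ups are what allow loss of active lithium and loss of anode active material to be separated.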

  12. Optical methods and differential scanning calorimetry as a potential tool for discrimination of olive oils (extra virgin and mix with vegetable oils)

    Science.gov (United States)

    Nikolova, Kr.; Yovcheva, T.; Marudova, M.; Eftimov, T.; Bodurov, I.; Viraneva, A.; Vlaeva, I.

    2016-03-01

    Eleven olive oil samples have been investigated using four physical methods - refractive index measurement, fluorescence spectra, color parameters and differential scanning calorimetry. In pomace olive oil (POO) and extra virgin olive oil (EVOO), oleic acid (65.24%-78.40%) predominates over palmitic (10.47%-15.07%) and linoleic (5.26%-13.92%) acids. The fluorescence spectra contain three peaks: one related to oxidation products at about λ = (500-540) nm, one to chlorophyll content at about λ = (675-680) nm, and one to undetermined pigments at λ = (700-750) nm. The melting point for EVOO and POO is between -1 °C and -6 °C. In contrast, the salad olive oils melt between -24 °C and -30 °C. The refractive index for EVOO is lower than that for mixed olive oils. The proposed physical methods could be used for fast and simple detection of vegetable oils in EVOO without the use of chemical substances. The experimental results are in accordance with those obtained by chemical analysis.

  13. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    The Augmented Multi-party Interaction (AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings are covered, together with some example applications based on these systems.

  14. AVID: Automatic Visualization Interface Designer

    National Research Council Canada - National Science Library

    Chuah, Mei

    2000-01-01

    .... Automatic generation offers great flexibility in performing data and information analysis tasks, because new designs are generated on a case by case basis to suit current and changing future needs...

  15. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  16. An automatic image recognition approach

    Directory of Open Access Journals (Sweden)

    Tudor Barbu

    2007-07-01

    Full Text Available Our paper focuses on the graphical analysis domain. We propose an automatic image recognition technique. This approach consists of two main pattern recognition steps. First, it performs an image feature extraction operation on an input image set, using statistical dispersion features. Then, an unsupervised classification process is performed on the previously obtained graphical feature vectors. An automatic region-growing based clustering procedure is proposed and utilized in the classification stage.

  17. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces, in connection with programming of a CNC lathe to scan different feeds, speeds and cutting depths. An evaluation of the system based on a total of 1671 experiments has shown that unfavourable snarled chips can be detected with 98% certainty, which indeed makes the system a valuable tool in chip breakability tests. Using the system, chip breaking diagrams can be elaborated with a previously...
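Chip-breaking detection by frequency analysis of cutting forces can be sketched as a simple FFT peak-pick. The signal below is synthetic, with an assumed 85 Hz breaking frequency and a 2 kHz sampling rate, purely for illustration:

```python
import numpy as np

def dominant_frequency(force, fs):
    """Dominant nonzero frequency (Hz) of a cutting-force signal via the FFT.

    Periodic chip breaking shows up as a strong spectral line; the mean is
    removed so the static cutting force does not dominate the spectrum."""
    spectrum = np.abs(np.fft.rfft(force - force.mean()))
    freqs = np.fft.rfftfreq(force.size, d=1.0 / fs)
    return freqs[spectrum.argmax()]

fs = 2000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
# Toy force signal: steady cutting force plus an 85 Hz chip-breaking component.
force = (200.0 + 5.0 * np.sin(2 * np.pi * 85.0 * t)
         + np.random.default_rng(2).normal(scale=1.0, size=t.size))
```

A detector of this kind can flag the absence of a clear spectral line as a sign of unbroken, snarled chips.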

  18. Child vocalization composition as discriminant information for automatic autism detection.

    Science.gov (United States)

    Xu, Dongxin; Gilkerson, Jill; Richards, Jeffrey; Yapanel, Umit; Gray, Sharmi

    2009-01-01

    Early identification is crucial for young children with autism to access early intervention. The existing screens require either a parent-report questionnaire and/or direct observation by a trained practitioner. Although an automatic tool would benefit parents, clinicians and children, there is no automatic screening tool in clinical use. This study reports a fully automatic mechanism for autism detection/screening for young children. This is a direct extension of the LENA (Language ENvironment Analysis) system, which utilizes speech signal processing technology to analyze and monitor a child's natural language environment and the vocalizations/speech of the child. It is discovered that child vocalization composition contains rich discriminant information for autism detection. By applying pattern recognition and machine learning approaches to child vocalization composition data, accuracy rates of 85% to 90% in cross-validation tests for autism detection have been achieved at the equal-error-rate (EER) point on a data set with 34 children with autism, 30 language delayed children and 76 typically developing children. Due to its easy and automatic procedure, it is believed that this new tool can serve a significant role in childhood autism screening, especially in regards to population-based or universal screening.

  19. Automatic surveillance systems

    International Nuclear Information System (INIS)

    Bruschi, R.; Pallottelli, R.

    1985-01-01

    This paper presents the study and realization of a special tool for supporting the console operator during normal or abnormal plant conditions. It was realized by means of real-time simulation techniques and is able: a) to diagnose plant faults in real-time mode and to allow the operator to detect the locations of the causes of the malfunctions; b) to indicate the conditions toward which the plant is evolving, whether in normal or accidental evolution.

  20. Differentiation of control and ALS mutant human iPSCs into functional skeletal muscle cells, a tool for the study of neuromuscular diseases

    Directory of Open Access Journals (Sweden)

    Jessica Lenzi

    2016-07-01

    Full Text Available Amyotrophic Lateral Sclerosis (ALS) is a severe and fatal neurodegenerative disease characterized by progressive loss of motoneurons, muscle atrophy and paralysis. Recent evidence suggests that ALS should be considered a multi-systemic disease, in which several cell types contribute to motoneuron degeneration. In this view, mutations in ALS-linked genes in other neural and non-neural cell types may exert non-cell-autonomous effects on motoneuron survival and function. Induced Pluripotent Stem Cells (iPSCs) have recently been derived from several patients with ALS mutations, and it has been shown that they can generate motoneurons in vitro, providing a valuable tool to study ALS. However, the potential of iPSCs could be further exploited by generating other cell types that may be relevant to the pathology. In this paper, by taking advantage of a novel inducible system for MyoD expression, we show that both control iPSCs and iPSCs carrying mutations in ALS genes can generate skeletal muscle cells. We provide evidence that both control and mutant iPSC-derived myotubes are functionally active. This in vitro system will be instrumental in dissecting the molecular and cellular pathways impairing the complex motoneuron microenvironment in ALS.

  1. Differential Laser Doppler based Non-Contact Sensor for Dimensional Inspection with Error Propagation Evaluation

    Directory of Open Access Journals (Sweden)

    Ketsaya Vacharanukul

    2006-06-01

    Full Text Available To achieve dynamic error compensation in CNC machine tools, a non-contact laser probe capable of dimensional measurement of a workpiece while it is being machined has been developed and is presented in this paper. The measurements are automatically fed back to the machine controller for intelligent error compensation. Based on a well-resolved laser Doppler technique and real-time data acquisition, the probe delivers very promising dimensional accuracy of a few microns over a range of 100 mm. The developed optical measuring apparatus employs a differential laser Doppler arrangement allowing acquisition of information from the workpiece surface. In addition, the measurements are traceable to standards of frequency, allowing higher precision.

  2. Sequence History Update Tool

    Science.gov (United States)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is seamlessly formatted into a dynamic Web page. This tool replaces a previously tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there are also considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the result is a more accurate archival record of the sequence commanding for MRO.

  3. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we demonstrate that our framework also retains support for conventional interactive constraints such as waypoint regions. We apply our approach to the open access, high quality Human Connectome Project data, as well as a dataset acquired on a typical clinical scanner. Our results show that the use of a learned...

  4. CHLOE: A tool for automatic detection of peculiar galaxies

    Science.gov (United States)

    Shamir, Lior; Manning, Saundra; Wallin, John

    2014-09-01

    CHLOE is an image analysis unsupervised learning algorithm that detects peculiar galaxies in datasets of galaxy images. The algorithm first computes a large set of numerical descriptors reflecting different aspects of the visual content, and then weighs them based on the standard deviation of the values computed from the galaxy images. The weighted Euclidean distance of each galaxy image from the median is measured, and the peculiarity of each galaxy is determined based on that distance.
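
    The weighting-and-distance scheme described above reads roughly as below. The summary does not state whether the weights grow or shrink with the standard deviation, so the inverse-deviation weighting here, like the toy descriptor data, is an assumption:

```python
import numpy as np

def peculiarity_scores(descriptors):
    """Weight each numerical descriptor by the spread of its values
    across the galaxy set, then measure each galaxy's weighted
    Euclidean distance from the median descriptor vector."""
    X = np.asarray(descriptors, dtype=float)
    std = X.std(axis=0)
    # Assumption: inverse-deviation weighting, so every descriptor
    # contributes on a comparable scale.
    weights = np.where(std > 0, 1.0 / std, 0.0)
    median = np.median(X, axis=0)
    return np.sqrt((((X - median) * weights) ** 2).sum(axis=1))

# Toy descriptor set: nine ordinary galaxies and one strong outlier.
rng = np.random.default_rng(1)
ordinary = rng.normal(0.0, 1.0, size=(9, 4))
outlier = np.array([[8.0, -8.0, 8.0, -8.0]])
scores = peculiarity_scores(np.vstack([ordinary, outlier]))
most_peculiar = int(np.argmax(scores))  # the outlier, index 9
```

    The galaxy farthest (in the weighted sense) from the median descriptor vector is ranked most peculiar, matching the scheme in the abstract.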

  5. Image processing tool for automatic feature recognition and quantification

    Science.gov (United States)

    Chen, Xing; Stoddard, Ryan J.

    2017-05-02

    A system for defining structures within an image is described. The system includes reading of an input file, preprocessing the input file while preserving metadata such as scale information and then detecting features of the input file. In one version the detection first uses an edge detector followed by identification of features using a Hough transform. The output of the process is identified elements within the image.
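
    The edge-detection-plus-Hough-transform step can be illustrated with a minimal sketch. The actual detector and transform variants used by the system are not specified, so a finite-difference edge detector and a classical line-detecting Hough accumulator are assumed here:

```python
import numpy as np

def edge_map(img, thresh):
    """Crude edge detector: finite-difference gradient magnitude
    (a stand-in for whatever detector the described system uses)."""
    gy, gx = np.gradient(np.asarray(img, dtype=float))
    return np.hypot(gx, gy) > thresh

def hough_lines(edges, n_theta=180):
    """Classical Hough transform for straight lines: each edge pixel
    votes for every (rho, theta) line passing through it."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for y, x in zip(*np.nonzero(edges)):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    return acc, thetas, diag

# Synthetic image containing one vertical bright line at x = 10.
img = np.zeros((32, 32))
img[:, 10] = 1.0
acc, thetas, diag = hough_lines(edge_map(img, 0.2))
rho_idx, theta_idx = np.unravel_index(int(np.argmax(acc)), acc.shape)
# Accumulator peak at theta = 0 (a vertical line), rho next to x = 10.
```

    The identified element is the (rho, theta) cell with the most votes; a real system would then threshold the accumulator to report all salient features.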

  6. Automatic design of conformal cooling channels in injection molding tooling

    Science.gov (United States)

    Zhang, Yingming; Hou, Binkui; Wang, Qian; Li, Yang; Huang, Zhigao

    2018-02-01

    The generation of the cooling system plays an important role in injection molding design. A conformal cooling system can effectively improve molding efficiency and product quality. This paper provides a generic approach for building conformal cooling channels. The centrelines of these channels are generated in two steps. First, we extract conformal loops based on the geometric information of the product. Second, centrelines in a spiral shape are built by blending these loops. We devise algorithms to implement the entire design process. A case study verifies the feasibility of this approach.

  7. Computer Aided Model Development for Automatic Tool Wear ...

    African Journals Online (AJOL)

    The pre-processing operations on the images (taken on photographic cards) included scanning, in order to transfer them onto a computer and convert them to digital images. Thresholding and segmentation were done in order to convert the altered background of the scanned images to a pure white background; the images were ...
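
    The thresholding step can be sketched as below, assuming 8-bit grayscale arrays and a fixed cutoff (the article does not give the actual threshold value):

```python
import numpy as np

def whiten_background(img, thresh=128):
    """Force every pixel at or above `thresh` to pure white (255),
    keeping darker pixels (the tool silhouette) unchanged -- the
    thresholding step described in the abstract. The cutoff of 128
    is an assumed value for illustration."""
    out = np.asarray(img, dtype=np.uint8).copy()
    out[out >= thresh] = 255
    return out

# Toy 'scanned' image: a dark tool region on a grey, uneven background.
scan = np.array([[200, 190, 210],
                 [195,  30,  40],
                 [205,  35, 220]], dtype=np.uint8)
clean = whiten_background(scan)
```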

  8. An analysis of tools for automatic software development and automatic code generation

    Directory of Open Access Journals (Sweden)

    Viviana Yarel Rosales-Morales

    2015-01-01

    Full Text Available Software development is an important area of software engineering, and techniques, approaches and methods have emerged to automate it. This paper presents an analysis of tools for automatic software development and automatic source code generation, in order to evaluate them and determine whether or not they meet a set of characteristics and functionalities in terms of quality. These characteristics include efficacy, productivity, safety and satisfaction, assessed through both qualitative and quantitative evaluation. The tools are (1) CASE tools, (2) frameworks and (3) integrated development environments (IDEs). The evaluation was carried out to measure not only usability, but also the support they provide for automatic software development and automatic source code generation. The aim of this work is to provide a methodology and a brief review of the most important works, so as to identify their main characteristics and present a comparative evaluation in qualitative and quantitative terms, providing the information a software developer needs to make decisions when considering tools that may be useful.

  9. Automatic design of digital synthetic gene circuits.

    Directory of Open Access Journals (Sweden)

    Mario A Marchisio

    2011-02-01

    Full Text Available De novo computational design of synthetic gene circuits that achieve well-defined target functions is a hard task. Existing, brute-force approaches run optimization algorithms on the structure and on the kinetic parameter values of the network. However, more direct rational methods for automatic circuit design are lacking. Focusing on digital synthetic gene circuits, we developed a methodology and a corresponding tool for in silico automatic design. For a given truth table that specifies a circuit's input-output relations, our algorithm generates and ranks several possible circuit schemes without the need for any optimization. Logic behavior is reproduced by the action of regulatory factors and chemicals on the promoters and on the ribosome binding sites of biological Boolean gates. Simulations of circuits with up to four inputs show a faithful and unequivocal truth table representation, even under parametric perturbations and stochastic noise. A comparison with already implemented circuits, in addition, reveals the potential for simpler designs with the same function. Therefore, we expect the method to help both in devising new circuits and in simplifying existing solutions.

  10. Automatic approach tendencies toward high and low caloric food in restrained eaters : Influence of task-relevance and mood

    NARCIS (Netherlands)

    Neimeijer, Renate A. M.; Roefs, Anne; Ostafin, Brian D.; Jong, de Peter

    2017-01-01

    Objective: Although restrained eaters are motivated to control their weight by dieting, they are often unsuccessful in these attempts. Dual process models emphasize the importance of differentiating between controlled and automatic tendencies to approach food. This study investigated the hypothesis

  11. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  12. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip in the scaler. A counter integrated with the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power-consumption solution. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  13. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  14. Differential equations for dummies

    CERN Document Server

    Holzner, Steven

    2008-01-01

    The fun and easy way to understand and solve complex equations Many of the fundamental laws of physics, chemistry, biology, and economics can be formulated as differential equations. This plain-English guide explores the many applications of this mathematical tool and shows how differential equations can help us understand the world around us. Differential Equations For Dummies is the perfect companion for a college differential equations course and is an ideal supplemental resource for other calculus classes as well as science and engineering courses. It offers step-by-step techniques, practical tips, numerous exercises, and clear, concise examples to help readers improve their differential equation-solving skills and boost their test scores.

  15. Real time automatic discriminating of ultrasonic flaws

    International Nuclear Information System (INIS)

    Suhairy Sani; Mohd Hanif Md Saad; Marzuki Mustafa; Mohd Redzwan Rosli

    2009-01-01

    This paper is concerned with the real-time automatic discrimination of flaws of two categories: i) cracks (planar defects) and ii) non-cracks (volumetric defects such as cluster porosity and slag) using pulse-echo ultrasound. The raw ultrasonic flaw signals were collected from a computerized robotic plane scanning system over the whole of each reflector as the primary source of data. The signal is then filtered, and analyses in both the time and frequency domains were executed to obtain the selected features. The real-time feature analysis techniques measured the number of peaks, maximum index, pulse duration, rise time and fall time. The obtained features could be used to distinguish between quantitatively classified flaws using various artificial intelligence tools such as neural networks. The proposed algorithm and complete system were implemented in computer software developed using Microsoft Visual BASIC 6.0 (author)
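
    The five time-domain features named above can be computed from a rectified echo envelope as sketched below. The precise definitions (peak threshold, where rise and fall are measured) are assumptions, since the abstract does not define them:

```python
import numpy as np

def flaw_features(signal, level=0.1):
    """Time-domain features of a rectified ultrasonic echo envelope.
    Assumed definitions: peaks are local maxima above `level`; pulse
    duration is the span (in samples) where the envelope exceeds
    `level`; rise/fall times run from that crossing to/from the
    global maximum."""
    s = np.asarray(signal, dtype=float)
    above = np.nonzero(s > level)[0]
    peaks = [i for i in range(1, len(s) - 1)
             if s[i] > level and s[i] > s[i - 1] and s[i] >= s[i + 1]]
    max_index = int(np.argmax(s))
    return {
        "n_peaks": len(peaks),
        "max_index": max_index,
        "pulse_duration": int(above[-1] - above[0] + 1) if above.size else 0,
        "rise_time": int(max_index - above[0]) if above.size else 0,
        "fall_time": int(above[-1] - max_index) if above.size else 0,
    }

# Toy envelope: a single sharp echo (crack-like in this sketch).
echo = np.array([0.0, 0.05, 0.3, 1.0, 0.4, 0.15, 0.05, 0.0])
feats = flaw_features(echo)
```

    Feature vectors of this kind are what would be fed to a neural network or another classifier to separate cracks from volumetric defects.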

  16. Automatic Phonetic Transcription for Danish Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    ...to acquire and expensive to create. For languages with productive compounding or agglutinative languages like German and Finnish, respectively, phonetic dictionaries are also hard to maintain. For this reason, automatic phonetic transcription tools have been produced for many languages. The quality...... for English and now extended to cover 50 languages. Due to the nature of open source software, the quality of language support depends greatly on who encoded them. The Danish version was created by a Danish native speaker and contains more than 8,600 spelling-to-phoneme rules and more than 11,000 rules for particular words and word classes in addition. In comparison, English has 5,852 spelling-to-phoneme rules and 4,133 additional rules and 8,278 rules and 3,829 additional rules. Phonix applies deep morphological analysis as a preprocessing step. Should the analysis fail, several fallback strategies......

  17. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    OpenAIRE

    Zhang, Jing; Lipp, Ottmar V.; Hu, Ping

    2017-01-01

    The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study was a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between subject design. Participants were assigned to two groups according to their performance on an emotion regulation-IAT (differentiating automatic emotion control tend...

  18. Automatic assessment of average diaphragm motion trajectory from 4DCT images through machine learning.

    Science.gov (United States)

    Li, Guang; Wei, Jie; Huang, Hailiang; Gaebler, Carl Philipp; Yuan, Amy; Deasy, Joseph O

    2015-12-01

    To automatically estimate the average diaphragm motion trajectory (ADMT) based on four-dimensional computed tomography (4DCT), facilitating clinical assessment of respiratory motion and motion variation and retrospective motion study. We have developed an effective motion extraction approach and a machine-learning-based algorithm to estimate the ADMT. Eleven patients with 22 sets of 4DCT images (4DCT1 at simulation and 4DCT2 at treatment) were studied. After automatically segmenting the lungs, the differential volume-per-slice (dVPS) curves of the left and right lungs were calculated as a function of slice number for each phase with respect to the full-exhalation. After a 5-slice moving average was performed, the discrete cosine transform (DCT) was applied to analyze the dVPS curves in the frequency domain. The dimensionality of the spectrum data was reduced by using several of the lowest frequency coefficients f_v to account for most of the spectrum energy (Σ f_v²). The multiple linear regression (MLR) method was then applied to determine the weights of these frequencies by fitting the ground truth, the measured ADMT, which is represented by three pivot points of the diaphragm on each side. The 'leave-one-out' cross-validation method was employed to analyze the statistical performance of the prediction results in three image sets: 4DCT1, 4DCT2, and 4DCT1 + 4DCT2. The seven lowest frequencies in the DCT domain were found to be sufficient to approximate the patient dVPS curves (R = 91%-96% in MLR fitting). The mean error in the predicted ADMT using the leave-one-out method was 0.3 ± 1.9 mm for the left-side diaphragm and 0.0 ± 1.4 mm for the right-side diaphragm. The prediction error is lower in 4DCT2 than in 4DCT1, and is lowest in 4DCT1 and 4DCT2 combined. This frequency-analysis-based machine learning technique was employed to predict the ADMT automatically with an acceptable error (0.2 ± 1.6 mm). This volumetric approach is not affected by the presence of lung tumors.
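
    The processing chain above (moving average, DCT, truncation to the lowest frequencies, multiple linear regression) can be sketched as follows. The un-normalized DCT-II, the absence of an intercept term, and the synthetic training data are assumptions for illustration:

```python
import numpy as np

def dct_ii(x):
    """Plain DCT-II of a 1-D sequence (no normalization), from its
    definition: X_k = sum_n x_n * cos[pi/N * (n + 1/2) * k]."""
    x = np.asarray(x, dtype=float)
    n = np.arange(len(x))
    return np.array([np.sum(x * np.cos(np.pi * (n + 0.5) * k / len(x)))
                     for k in range(len(x))])

def low_freq_features(dvps, n_coeff=7, window=5):
    """5-slice moving average of a dVPS curve, then its lowest
    `n_coeff` DCT coefficients (the paper keeps seven)."""
    kernel = np.ones(window) / window
    smooth = np.convolve(dvps, kernel, mode="valid")
    return dct_ii(smooth)[:n_coeff]

# Synthetic training set: the 'ADMT' ground truth is a hidden linear
# function of the low-frequency content (for illustration only).
rng = np.random.default_rng(2)
curves = rng.normal(size=(20, 60)).cumsum(axis=1)  # smooth-ish curves
X = np.array([low_freq_features(c) for c in curves])
true_w = np.array([0.05, -0.2, 0.1, 0.0, 0.3, -0.1, 0.02])
y = X @ true_w                                     # synthetic ADMT values

# Multiple linear regression: one weight per retained frequency.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
```

    On this noiseless synthetic data the regression recovers the hidden weights exactly; with real ADMT measurements the fit quality would instead be judged by R and leave-one-out error, as in the paper.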

  19. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
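
    The contrast between the unsafe pattern and its PREPARE-style replacement can be illustrated in Python with sqlite3 (a stand-in for the paper's PHP/SQL setting):

```python
import sqlite3

# Unsafe legacy pattern (what the transformation removes):
#     query = "SELECT role FROM users WHERE name = '" + name + "'"
# Safe replacement: a prepared/parameterized statement, in which user
# input travels as bound data and is never parsed as SQL text.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "alice' OR '1'='1"  # classic injection payload

rows = conn.execute("SELECT role FROM users WHERE name = ?",
                    (malicious,)).fetchall()  # payload matches nothing
ok = conn.execute("SELECT role FROM users WHERE name = ?",
                  ("alice",)).fetchall()      # legitimate lookup still works
```

    With the concatenated query the payload would rewrite the WHERE clause and leak every row; with the bound parameter it is just an odd user name that matches nothing.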

  20. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  1. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The used strategy is a combination of two methods: the maximum correlation coefficient and the correlation in the subpixel range...... interactive software is also part of a computer-assisted learning program on digital photogrammetry....

  2. Automatic Error Analysis Using Intervals

    Science.gov (United States)

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
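
    A toy version of interval-based error propagation, in the spirit of (but far simpler than) INTLAB, looks like this. Outward rounding, division, and many other essentials of a real interval library are omitted:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A closed interval [lo, hi]. Arithmetic returns the tightest
    interval containing every possible result; a production library
    such as INTLAB additionally controls floating-point rounding."""
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def width(self):
        return self.hi - self.lo

# Measurements 2.0 +/- 0.1 and 3.0 +/- 0.2 as intervals:
a = Interval(1.9, 2.1)
b = Interval(2.8, 3.2)
area = a * b       # every value the product can take: [5.32, 6.72]
total = a + b
```

    The width of `area` is a guaranteed bound on the propagated error, obtained without differentiating any error-propagation formula by hand.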

  3. Rapid Differentiation of Haemophilus influenzae and Haemophilus haemolyticus by Use of Matrix-Assisted Laser Desorption Ionization-Time of Flight Mass Spectrometry with ClinProTools Mass Spectrum Analysis.

    Science.gov (United States)

    Chen, Jonathan H K; Cheng, Vincent C C; Wong, Chun-Pong; Wong, Sally C Y; Yam, Wing-Cheong; Yuen, Kwok-Yung

    2017-09-01

    Haemophilus influenzae is associated with severe invasive disease, while Haemophilus haemolyticus is considered part of the commensal flora in the human respiratory tract. Although the addition of a custom mass spectrum library into the matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) system could improve identification of these two species, the establishment of such a custom database is technically complicated and requires a large amount of resources, which most clinical laboratories cannot afford. In this study, we developed a mass spectrum analysis model with 7 mass peak biomarkers for the identification of H. influenzae and H. haemolyticus using the ClinProTools software. We evaluated the diagnostic performance of this model using 408 H. influenzae and H. haemolyticus isolates from clinical respiratory specimens from 363 hospitalized patients and compared the identification results with those obtained with the Bruker IVD MALDI Biotyper. The IVD MALDI Biotyper identified only 86.9% of H. influenzae (311/358) and 98.0% of H. haemolyticus (49/50) clinical isolates to the species level. In comparison, the ClinProTools mass spectrum model could identify 100% of H. influenzae (358/358) and H. haemolyticus (50/50) clinical strains to the species level and significantly improved the species identification rate (McNemar's test). This demonstrates the flexibility of MALDI-TOF mass spectrometry to handle closely related bacterial species when the proprietary spectrum library fails, and the approach should be useful for the differentiation of other closely related bacterial species. Copyright © 2017 American Society for Microbiology.

  4. Automatic Addition of Genre Information in a Japanese Dictionary

    Directory of Open Access Journals (Sweden)

    Raoul BLIN

    2012-10-01

    Full Text Available This article presents the method used for the automatic addition of genre information to the Japanese entries in a Japanese-French dictionary. The dictionary is intended for a wide audience, ranging from learners of Japanese as a second language to researchers. The genre characterization is based on the statistical analysis of corpora representing different genres. We will discuss the selection of genres and corpora, the tool and method of analysis, the difficulties encountered during this analysis and their solutions.

  5. Automatic annotation of head velocity and acceleration in Anvil

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2012-01-01

    We describe an automatic face tracker plugin for the ANVIL annotation tool. The face tracker produces data for velocity and for acceleration in two dimensions. We compare the annotations generated by the face tracking algorithm with independently made manual annotations for head movements....... The annotations are a useful supplement to manual annotations and may help human annotators to quickly and reliably determine onset of head movements and to suggest which kind of head movement is taking place....

  6. WebQuests: Tools for Differentiation

    Science.gov (United States)

    Schweizer, Heidi; Kossow, Ben

    2007-01-01

    This article features the WebQuest, an inquiry-oriented activity in which some or all of the information that learners interact with comes from resources on the Internet. WebQuests, when properly constructed, are activities, usually authentic in nature, that require the student to use Internet-based resources to deepen their understanding and…

  7. Microcontroller based automatic temperature control for oyster mushroom plants

    Science.gov (United States)

    Sihombing, P.; Astuti, T. P.; Herriyance; Sitompul, D.

    2018-03-01

    The cultivation of oyster mushrooms requires special care because oyster mushrooms are susceptible to disease. Mushroom growth will be inhibited if temperature and humidity are not well controlled, since both directly affect it. Oyster mushroom growth is usually optimal at temperatures of around 22-28°C and humidity of around 70-90%, and maintaining these conditions is a common problem in cultivation. It is therefore very important to control the temperature and humidity of the oyster mushroom growing room. In this paper, we developed an Arduino Uno microcontroller-based automatic temperature monitoring tool for oyster mushroom cultivation. We designed a tool that controls temperature and humidity automatically via an Android smartphone. If the temperature in the growing room rises above 28°C, the tool automatically turns on a pump to run water and lower the room temperature; if the temperature falls below 22°C, a light is turned on to heat the room. The temperature in the oyster mushroom room thus remains stable, so that the mushrooms can grow with good quality.
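
    The control rule described above amounts to a simple two-setpoint thermostat. A minimal sketch of that rule, in Python rather than Arduino C and with hypothetical actuator names:

```python
# Hypothetical actuator states for illustration; on the real device
# these would be digital output pins driving a pump relay and a lamp.
PUMP_ON, PUMP_OFF = "pump on", "pump off"
LAMP_ON, LAMP_OFF = "lamp on", "lamp off"

def control_step(temp_c, t_low=22.0, t_high=28.0):
    """Decide actuator states for one sensor reading, following the
    rule in the abstract: above 28 C run the water pump to cool the
    room, below 22 C switch on the lamp to heat it, otherwise idle."""
    if temp_c > t_high:
        return PUMP_ON, LAMP_OFF
    if temp_c < t_low:
        return PUMP_OFF, LAMP_ON
    return PUMP_OFF, LAMP_OFF

# One reading above, inside, and below the 22-28 C comfort band.
readings = [30.5, 25.0, 20.0]
actions = [control_step(t) for t in readings]
```

    A production controller would add hysteresis around each setpoint so the pump and lamp do not chatter when the temperature hovers near 22°C or 28°C.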

  8. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  9. Quantification of γ- and α-tocopherol isomers in combination with pattern recognition model as a tool for differentiating dry-cured shoulders of Iberian pigs raised on different feeding systems.

    Science.gov (United States)

    Rey, Ana I; Amazan, Daniel; López-Bote, Clemente J; García-Casco, Juan M

    2014-10-01

    Quantification of γ- and α-tocopherol in dry-cured shoulders of Iberian pigs was evaluated as a tool for differentiating feeding backgrounds or regimens. Samples (n = 115) were obtained over two different seasons from the four categories of pigs described in the Industry Quality Policy, i.e. pigs fed in free-range conditions (FREE-RANGE), pigs fed in free-range conditions and provided feed supplements (FREE-FEED), pigs fed outdoors with feed and with access to grass (FEED-OUT) and pigs fed in intensive conditions with feed (FEED). Linear discriminant functions were calculated and validated. The validation results showed that 20% of the muscle samples were not correctly classified into the four feeding categories, giving an 80% success rate. The FEED group had the lowest proportion of errors, with 100% of samples correctly classified. For the FREE-RANGE group, 87% of samples were assigned to the correct feeding system by cross-validation; however, 13% were considered as FREE-FEED. A higher rate of correct classification can be obtained when using three categories or by calculating the weight gain in free-range conditions using regression equations. Taking into account the high variability of the samples and the high success in classification, these results are of interest and may be applied in practical situations. © 2014 Society of Chemical Industry.

  10. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  11. SU-E-J-272: Auto-Segmentation of Regions with Differentiating CT Numbers for Treatment Response Assessment

    International Nuclear Information System (INIS)

    Yang, C; Noid, G; Dalah, E; Paulson, E; Li, X; Gilat-Schmidt, T

    2015-01-01

    Purpose: It has been reported recently that changes in CT number (CTN) during and after radiation therapy (RT) may be used to assess RT response. The purpose of this work is to develop a tool to automatically segment regions with differentiating CTN and/or with changing CTN in a series of CTs. Methods: A software tool was developed to identify regions with differentiating CTN using k-means clustering of CT numbers and to automatically delineate these regions using a convex-hull enclosure method. Pre- and post-RT CT, PET, and MRI images acquired for sample lung and pancreatic cancer cases were used to test the software tool. K-means clusters of CT numbers within the gross tumor volumes (GTVs), delineated based on PET SUV (standardized uptake value of fludeoxyglucose) and/or the MRI ADC (apparent diffusion coefficient) map, were analyzed. Clusters with higher center values were considered active tumor volumes (ATVs). Convex-hull contours enclosing preset clusters were used to delineate these ATVs with color-washed displays. The CTN-defined ATVs were compared with the SUV- or ADC-defined ATVs. Results: The CTN stability of the CT scanner used to acquire the CTs in this work is less than 1.5 Hounsfield units (HU) of variation annually. For the lung cancer cases studied, k-means cluster centers in the GTV differed by ∼20 HU, much larger than the variation due to CTN stability. The Dice coefficient between the ATVs delineated based on convex-hull enclosure of high-CTN clusters and the PET-defined GTVs based on an SUV cutoff value of 2.5 was 90(±5)%. Conclusion: A software tool was developed using k-means clustering and convex-hull contours to automatically segment high-CTN regions that may not be identifiable using a simple threshold method. These CTN regions overlapped reasonably well with the PET- or MRI-defined GTVs.
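    A minimal sketch of the clustering idea, assuming synthetic HU values and a made-up PET-defined reference mask: a 1-D k-means on CT numbers selects the high-CTN cluster as the active tumor volume (ATV), and a Dice coefficient measures overlap (the convex-hull delineation step is omitted for brevity).

    ```python
    import numpy as np

    def kmeans_1d(values, k=2, iters=50):
        """Simple 1-D k-means on CT numbers (HU)."""
        centers = np.linspace(values.min(), values.max(), k)
        for _ in range(iters):
            labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = values[labels == j].mean()
        return labels, centers

    def dice(a, b):
        """Dice overlap between two boolean masks."""
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    rng = np.random.default_rng(1)
    # Hypothetical GTV voxels: surrounding tissue ~30 HU, active tumor ~55 HU
    gtv_hu = np.concatenate([rng.normal(30, 3, 500), rng.normal(55, 3, 300)])
    labels, centers = kmeans_1d(gtv_hu)
    atv_mask = labels == np.argmax(centers)      # high-CTN cluster = ATV
    ref_mask = np.arange(gtv_hu.size) >= 500     # synthetic "PET-defined" reference
    overlap = dice(atv_mask, ref_mask)
    ```
    
    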

  12. Self-Compassion and Automatic Thoughts

    Science.gov (United States)

    Akin, Ahmet

    2012-01-01

    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  13. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  14. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  15. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and its objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case in which multiple data sets generated with many different class versions must be analyzed in the same session. ROOT also supports the automatic generation of C++ code describing the data objects in a file.

  16. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  17. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provides dose-response curves that become linear under the logistic transformation (Rodbard). This transformation, applicable to radioimmunoassay, should be useful for computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double-antibody technique was written by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double-antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
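    The logistic (logit) transformation attributed to Rodbard can be sketched as follows. Assuming a hypothetical four-parameter logistic standard curve with known zero-dose and infinite-dose responses, the logit of the normalized response is exactly linear in log dose, which is what makes a straight-line fit on the transformed data possible:

    ```python
    import numpy as np

    def four_param_logistic(dose, a, b, c, d):
        """4PL dose-response curve: a = response at zero dose,
        d = response at infinite dose, c = ED50, b = slope factor."""
        return d + (a - d) / (1.0 + (dose / c) ** b)

    doses = np.array([1, 2, 5, 10, 20, 50, 100.0])   # hypothetical standard doses
    counts = four_param_logistic(doses, a=10000, b=1.2, c=15.0, d=500)

    # Logit transformation: normalize bound counts, then take log-odds
    y = (counts - 500) / (10000 - 500)
    logit = np.log(y / (1 - y))
    x = np.log(doses)

    # A straight line now fits the transformed standard curve exactly:
    # logit(y) = -b*ln(dose) + b*ln(c)
    slope, intercept = np.polyfit(x, logit, 1)
    residuals = logit - (slope * x + intercept)
    ```
    
    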

  18. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies being used in this multidisciplinary field are part of this exposition. The issues of modeling target signatures in various spectral modalities, LADAR, IR, SAR, high-resolution radar, acoustic, seismic, visible, and hyperspectral, in diverse geometric aspects, are addressed. The methods for signal processing and classification cover concepts such as sensor-adaptive and artificial neural networks, time reversal filt...

  19. Program Code Generator for Cardiac Electrophysiology Simulation with Automatic PDE Boundary Condition Handling.

    Directory of Open Access Journals (Sweden)

    Florencio Rusty Punzalan

    Clinical and experimental studies involving human hearts can have certain limitations. Methods such as computer simulation can be an important alternative or supplemental tool. Physiological simulation at the tissue or organ level typically involves the handling of partial differential equations (PDEs). Boundary conditions and distributed parameters, such as those used in pharmacokinetics simulation, add to the complexity of the PDE solution. These factors can tailor PDE solutions and their corresponding program code to specific problems, and boundary-condition and parameter changes in the customized code are usually error-prone and time-consuming. We propose a general approach for handling PDEs and boundary conditions in computational models using a replacement scheme for discretization. This study is an extension of a program generator that we introduced in a previous publication. The program generator can generate code for multi-cell simulations of cardiac electrophysiology. Improvements to the system allow it to handle simultaneous equations in the biological function model as well as implicit PDE numerical schemes. The replacement scheme involves substituting all partial-differential terms with numerical solution equations. Once the model and boundary equations are discretized with the numerical solution scheme, instances of the equations are generated to undergo dependency analysis. The result of the dependency analysis is then used to generate the program code. The resulting program code is in the Java or C programming language. To validate the automatic handling of boundary conditions in the program code generator, we generated simulation code using the FHN, Luo-Rudy 1, and Hund-Rudy cell models and ran cell-to-cell coupling and action potential propagation simulations. One of the simulations is based on a published experiment, and its results are compared with the experimental data. We conclude that the proposed program code
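    The replacement idea can be illustrated with a hand-written sketch (not the generator's output): a second-order spatial derivative term is replaced by its finite-difference equation, with a no-flux boundary handled by mirrored ghost nodes, and used in an explicit step of a pure-diffusion toy equation. The constants are illustrative, not from the paper.

    ```python
    import numpy as np

    def laplacian_1d(V, dx):
        """Replace the partial-derivative term d2V/dx2 with its finite-difference
        approximation. No-flux (Neumann) boundaries are handled by mirroring the
        edge nodes, the kind of substitution a code generator performs at the
        domain boundary."""
        Vpad = np.concatenate([[V[1]], V, [V[-2]]])   # ghost nodes mirror the interior
        return (Vpad[2:] - 2 * Vpad[1:-1] + Vpad[:-2]) / dx**2

    # Explicit Euler step of a pure-diffusion "cable" equation dV/dt = D * d2V/dx2
    D, dx, dt = 1e-3, 0.1, 0.5       # dt*D/dx^2 = 0.05, well inside the stability limit
    V = np.zeros(50)
    V[:5] = 1.0                       # stimulated region
    for _ in range(200):
        V = V + dt * D * laplacian_1d(V, dx)
    ```

    After the loop, the initial square pulse has diffused rightward while staying bounded, as the discretized no-flux boundary requires.
    
    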

  20. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

    Automatic tuning circuits adjust frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for an integrated circuit. It is a feedback system that compares the input phase with the output phase, and it can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter, and a relaxation oscillator. On PSPICE simulation...

  1. Automatic controller at associated memory

    International Nuclear Information System (INIS)

    Courty, P.

    1977-06-01

    Organized around an A2-type controller, this CAMAC device allows, on command from the associated computer, the reading of 64K 16-bit words into an external memory. This memory is fully controlled by the computer. In the automatic mode, which operates at 10⁶ words/sec, the computer can access any other module of the same crate by cycle stealing [fr]

  2. Automatic Guidance for Remote Manipulator

    Science.gov (United States)

    Johnston, A. R.

    1986-01-01

    Position sensor and mirror guide manipulator toward object. Grasping becomes automatic when sensor begins to receive signal from reflector on object to be manipulated. Light-emitting diodes on manipulator produce light signals for reflector, which is composite of plane and corner reflectors. Proposed scheme especially useful when manipulator arm tends to flex or when object is moving. Sensor and microprocessor designed to compensate for manipulator-arm oscillation.

  3. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and Their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of the Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; and ALGOL 68. Other chapters discuss a general-purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming languages is also presented.

  4. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
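    As an illustration of computing a transfer function from a netlist-like matrix description, the sketch below (not the patented method) assembles a modified nodal analysis system for a hypothetical RC low-pass filter and evaluates H(s) numerically:

    ```python
    import numpy as np

    def transfer_function(s, R=1e3, C=1e-6):
        """Modified nodal analysis of a tiny netlist: a resistor from node 1 to
        node 2 and a capacitor from node 2 to ground, driven by a unit voltage
        source at node 1. Unknowns: [v1, v2, i_source]."""
        G = 1.0 / R
        A = np.array([[G,     -G,       1.0],    # KCL at node 1 (includes source current)
                      [-G, G + s * C,   0.0],    # KCL at node 2
                      [1.0,  0.0,       0.0]],   # source constraint: v1 = 1
                     dtype=complex)
        b = np.array([0.0, 0.0, 1.0], dtype=complex)
        v1, v2, i_src = np.linalg.solve(A, b)
        return v2                                # H(s) = Vout / Vin with Vin = 1

    w = 2 * np.pi * 159.15                       # near the corner frequency 1/(2*pi*R*C)
    H = transfer_function(1j * w)
    analytic = 1.0 / (1.0 + 1j * w * 1e3 * 1e-6)
    ```

    At the corner frequency the magnitude drops to 1/sqrt(2), matching the analytic RC low-pass expression H(s) = 1/(1 + sRC).
    
    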

  5. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with the Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundaries and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...
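    The frequency-analysis idea can be sketched on a synthetic image: parallel "rows" with a known spacing and orientation are generated, and the dominant 2-D Fourier peak recovers the interrow width and row orientation. The paper's actual processing also uses Gabor filters and recursion, which are omitted here; the spacing and angle are illustrative values.

    ```python
    import numpy as np

    # Synthetic "vine plot": parallel rows with an 8-pixel interrow spacing,
    # oriented at 30 degrees (hypothetical values for illustration)
    n = 128
    period, theta = 8.0, np.deg2rad(30)
    yy, xx = np.mgrid[0:n, 0:n]
    coord = xx * np.cos(theta) + yy * np.sin(theta)
    img = np.cos(2 * np.pi * coord / period)

    # The dominant peak of the 2-D Fourier spectrum encodes spacing and orientation
    F = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    F[n // 2, n // 2] = 0.0                          # suppress the DC component
    ky, kx = np.unravel_index(np.argmax(F), F.shape)
    fy, fx = (ky - n // 2) / n, (kx - n // 2) / n    # cycles per pixel
    spacing = 1.0 / np.hypot(fx, fy)                 # estimated interrow width
    orientation = np.degrees(np.arctan2(fy, fx)) % 180
    ```
    
    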

  6. Automatic segmentation of clinical texts.

    Science.gov (United States)

    Apostolova, Emilia; Channin, David S; Demner-Fushman, Dina; Furst, Jacob; Lytinen, Steven; Raicu, Daniela

    2009-01-01

    Clinical narratives, such as radiology and pathology reports, are commonly available in electronic form. However, they are also commonly entered and stored as free text. Knowledge of the structure of clinical narratives is necessary for enhancing the productivity of healthcare departments and facilitating research. This study attempts to automatically segment medical reports into semantic sections. Our goal is to develop a robust and scalable medical report segmentation system requiring minimum user input for efficient retrieval and extraction of information from free-text clinical narratives. Hand-crafted rules were used to automatically identify a high-confidence training set. This automatically created training dataset was later used to develop metrics and an algorithm that determines the semantic structure of the medical reports. A word-vector cosine similarity metric combined with several heuristics was used to classify each report sentence into one of several pre-defined semantic sections. This baseline algorithm achieved 79% accuracy. A Support Vector Machine (SVM) classifier trained on additional formatting and contextual features was able to achieve 90% accuracy. Plans for future work include developing a configurable system that could accommodate various medical report formatting and content standards.
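    The baseline word-vector cosine-similarity idea can be sketched as below. The section names, prototype vocabularies, and sentences are hypothetical; the study derived its training data automatically from hand-crafted rules rather than from fixed prototypes, and added an SVM on top.

    ```python
    import numpy as np
    from collections import Counter

    SECTIONS = {   # hypothetical prototype vocabulary per semantic section
        "history":    "patient presents history symptoms prior complaint",
        "findings":   "image shows opacity lesion measuring signal mass",
        "impression": "impression consistent with recommend follow up",
    }

    def bow(text, vocab):
        """Bag-of-words vector over a fixed vocabulary."""
        counts = Counter(text.lower().split())
        return np.array([counts[w] for w in vocab], dtype=float)

    def cosine(a, b):
        na, nb = np.linalg.norm(a), np.linalg.norm(b)
        return 0.0 if na == 0 or nb == 0 else float(a @ b / (na * nb))

    vocab = sorted({w for s in SECTIONS.values() for w in s.split()})
    prototypes = {name: bow(text, vocab) for name, text in SECTIONS.items()}

    def classify_sentence(sentence):
        """Assign a sentence to the semantic section with the most similar vector."""
        v = bow(sentence, vocab)
        return max(prototypes, key=lambda name: cosine(v, prototypes[name]))

    label = classify_sentence("The image shows a mass measuring 2 cm")
    ```
    
    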

  7. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  8. Automatic Recognition of Object Names in Literature

    Science.gov (United States)

    Bonnin, C.; Lesteven, S.; Derriere, S.; Oberto, A.

    2008-08-01

    SIMBAD is a database of astronomical objects that provides (among other things) their bibliographic references in a large number of journals. Currently, these references have to be entered manually by librarians who read each paper. To cope with the increasing number of papers, CDS is developing a tool to assist the librarians in their work, taking advantage of the Dictionary of Nomenclature of Celestial Objects, which keeps track of object acronyms and of their origin. The program searches for object names directly in PDF documents by comparing the words with all the formats stored in the Dictionary of Nomenclature. It also searches for variable star names based on constellation names, and for a large list of usual names such as Aldebaran or the Crab. Object names found in the documents often correspond to several astronomical objects. The system retrieves all possible matches, displays them with their object type given by SIMBAD, and lets the librarian make the final choice. The bibliographic reference can then be automatically added to the object identifiers in the database. Moreover, the systematic use of the Dictionary of Nomenclature, which is updated manually, made it possible to check it automatically and to detect errors and inconsistencies. Last but not least, the program collects additional information such as the position of the object names in the document (in the title, subtitle, abstract, table, figure caption...) and their number of occurrences. In the future, this will make it possible to calculate the 'weight' of an object in a reference and to provide SIMBAD users with important new information, which will help them find the most relevant papers in the object's reference list.

  9. AUTOMATIC EXTRACTION OF BUILDING OUTLINE FROM HIGH RESOLUTION AERIAL IMAGERY

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2016-06-01

    In this paper, a new approach for automated extraction of building boundaries from high-resolution imagery is proposed. The proposed approach uses both geometric and spectral properties of a building to detect and locate buildings accurately. It consists of automatic generation of a high-quality point cloud from the imagery, building detection from the point cloud, classification of building roofs, and generation of building outlines. The point cloud is generated from the imagery automatically using semi-global image matching technology. Buildings are detected from the differential surface generated from the point cloud. Further classification of building roofs is performed in order to generate accurate building outlines. Finally, the classified building roofs are converted into vector format. Numerous tests have been performed on images of different locations, and the results are presented in the paper.

  10. Signal Compression in Automatic Ultrasonic testing of Rails

    Directory of Open Access Journals (Sweden)

    Tomasz Ciszewski

    2007-01-01

    Full recording of the most important information carried by ultrasonic signals makes statistical analysis of the measurement data possible. Statistical analysis of the results gathered during automatic ultrasonic tests, combined with features of the measuring method, differential lossy coding, and traditional lossless compression methods (Huffman coding, dictionary coding), leads to a comprehensive, efficient data-compression algorithm. The subject of the article is to present the algorithm and the benefits of using it in comparison to alternative compression methods. Storage of large amounts of data allows an electronic catalogue of ultrasonic defects to be created. Once such a catalogue exists, training the future qualification system on the new solutions of the automated rail-testing equipment will be possible.
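    A minimal sketch of the coding chain, assuming a hypothetical slowly varying amplitude trace: differential coding concentrates the symbol distribution around zero, and a Huffman code then shortens the frequent symbols (the dictionary-coding stage is omitted).

    ```python
    import heapq
    from collections import Counter

    def delta_encode(samples):
        """Differential coding: store the first sample, then successive
        differences, which cluster near zero for slowly varying signals."""
        return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

    def huffman_code(symbols):
        """Build a Huffman code table {symbol: bitstring} from symbol frequencies."""
        freq = Counter(symbols)
        if len(freq) == 1:
            return {next(iter(freq)): "0"}
        # Heap entries: [weight, unique_id, [symbol, code], ...]
        heap = [[w, i, [s, ""]] for i, (s, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            lo, hi = heapq.heappop(heap), heapq.heappop(heap)
            for pair in lo[2:]:
                pair[1] = "0" + pair[1]
            for pair in hi[2:]:
                pair[1] = "1" + pair[1]
            heapq.heappush(heap, [lo[0] + hi[0], next_id] + lo[2:] + hi[2:])
            next_id += 1
        return {s: code for s, code in heap[0][2:]}

    # Hypothetical slowly varying ultrasonic amplitude trace
    signal = [100, 101, 101, 102, 101, 101, 100, 100, 101, 102]
    deltas = delta_encode(signal)
    table = huffman_code(deltas)
    encoded_bits = sum(len(table[d]) for d in deltas)
    ```

    The delta step is lossless and invertible by cumulative summation, so the chain compresses without losing the recorded signal.
    
    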

  11. Uav-Based Automatic Tree Growth Measurement for Biomass Estimation

    Science.gov (United States)

    Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.

    2016-06-01

    Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically ineffective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aerial Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data was collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week, and the in-situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal flight conditions and parameter settings for image acquisition. The collected images are processed in a state-of-the-art tool, resulting in the generation of dense 3D point clouds. An algorithm was developed to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of the automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.

  12. Training shortest-path tractography: Automatic learning of spatial priors.

    Science.gov (United States)

    Kasenburg, Niklas; Liptrot, Matthew; Reislev, Nina Linde; Ørting, Silas N; Nielsen, Mads; Garde, Ellen; Feragen, Aasa

    2016-04-15

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we demonstrate that our framework also retains support for conventional interactive constraints such as waypoint regions. We apply our approach to the open access, high quality Human Connectome Project data, as well as a dataset acquired on a typical clinical scanner. Our results show that the use of a learned prior substantially increases the overlap of tractography output with a reference atlas on both populations, and this is confirmed by visual inspection. Furthermore, we demonstrate how a prior learned on the high quality dataset significantly increases the overlap with the reference for the more typical yet lower quality data acquired on a clinical scanner. We hope that such automatic incorporation of prior knowledge and the obviation of expert interactive tract delineation on every subject, will improve the feasibility of large clinical tractography studies. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Validation of semi-automatic segmentation of the left atrium

    Science.gov (United States)

    Rettmann, M. E.; Holmes, D. R., III; Camp, J. J.; Packer, D. L.; Robb, R. A.

    2008-03-01

    Catheter ablation therapy has become increasingly popular for the treatment of left atrial fibrillation. The effect of this treatment on left atrial morphology, however, has not yet been completely quantified. Initial studies have indicated a decrease in left atrial size with a concomitant decrease in pulmonary vein diameter. In order to effectively study whether catheter-based therapies affect left atrial geometry, robust segmentations with minimal user interaction are required. In this work, we validate a method to semi-automatically segment the left atrium from computed-tomography scans. The first step of the technique utilizes seeded region growing to extract the entire blood pool, including the four chambers of the heart, the pulmonary veins, aorta, superior vena cava, inferior vena cava, and other surrounding structures. Next, the left atrium and pulmonary veins are separated from the rest of the blood pool using an algorithm that searches for thin connections between user-defined points in the volumetric data or on a surface rendering. Finally, the pulmonary veins are separated from the left atrium using a three-dimensional tracing tool. A single user segmented three datasets three times using both the semi-automatic technique as well as manual tracing. The user interaction time for the semi-automatic technique was approximately forty-five minutes per dataset, and the manual tracing required between four and eight hours per dataset depending on the number of slices. A truth model was generated using a simple voting scheme on the repeated manual segmentations. A second user segmented each of the nine datasets using the semi-automatic technique only. Several metrics were computed to assess the agreement between the semi-automatic technique and the truth model, including percent differences in left atrial volume, DICE overlap, and mean distance between the boundaries of the segmented left atria. Overall, the semi-automatic approach was demonstrated to be repeatable within
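    The seeded region-growing step can be sketched on a synthetic 2-D slice. The tolerance, intensity values, and 4-connectivity below are illustrative choices, not the paper's parameters:

    ```python
    import numpy as np
    from collections import deque

    def region_grow(volume, seed, tol):
        """Seeded region growing on a 2-D intensity array: starting from the seed
        pixel, add 4-connected neighbours whose intensity lies within `tol` of
        the seed intensity."""
        grown = np.zeros(volume.shape, dtype=bool)
        seed_val = volume[seed]
        queue = deque([seed])
        grown[seed] = True
        while queue:
            y, x = queue.popleft()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < volume.shape[0] and 0 <= nx < volume.shape[1]
                        and not grown[ny, nx]
                        and abs(volume[ny, nx] - seed_val) <= tol):
                    grown[ny, nx] = True
                    queue.append((ny, nx))
        return grown

    # Synthetic slice: bright blood pool (value 300) inside darker tissue (value 50)
    img = np.full((32, 32), 50.0)
    img[8:24, 8:24] = 300.0
    mask = region_grow(img, seed=(16, 16), tol=100.0)
    ```

    The grown mask stops at the intensity boundary, which is why a separate step is then needed to cut thin connections between chambers.
    
    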

  14. A tool for rapid manual translation

    OpenAIRE

    Nordström, Magnus; Pettersson, Paul

    1993-01-01

    There have been several attempts to realize the idea of a fully automatic translation system for text translation to replace human translators. By contrast, little work has been put into building tools to aid human translators. This report describes the ideas behind such a tool. The tool is intended to aid human translators in achieving higher productivity and better quality, by presenting terminological information extracted from previous translations. The report documents the i...

  15. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm’s leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  16. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to

  17. Constraint Differentiation

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander; Basin, David; Viganò, Luca

    2010-01-01

    We introduce constraint differentiation, a powerful technique for reducing search when model-checking security protocols using constraint-based methods. Constraint differentiation works by eliminating certain kinds of redundancies that arise in the search space when using constraints to represent...... results show that constraint differentiation substantially reduces search and considerably improves the performance of OFMC, enabling its application to a wider class of problems....

  18. Differential manifolds

    CERN Document Server

    Kosinski, Antoni A

    2007-01-01

    The concepts of differential topology form the center of many mathematical disciplines such as differential geometry and Lie group theory. Differential Manifolds presents to advanced undergraduates and graduate students the systematic study of the topological structure of smooth manifolds. Author Antoni A. Kosinski, Professor Emeritus of Mathematics at Rutgers University, offers an accessible approach to both the h-cobordism theorem and the classification of differential structures on spheres. "How useful it is," noted the Bulletin of the American Mathematical Society, "to have a single, sho

  19. Introduction to differential equations

    CERN Document Server

    Taylor, Michael E

    2011-01-01

    The mathematical formulations of problems in physics, economics, biology, and other sciences are usually embodied in differential equations. The analysis of the resulting equations then provides new insight into the original problems. This book describes the tools for performing that analysis. The first chapter treats single differential equations, emphasizing linear and nonlinear first order equations, linear second order equations, and a class of nonlinear second order equations arising from Newton's laws. The first order linear theory starts with a self-contained presentation of the exponen

  20. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis between the variation of the patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) the automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through a search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different

  1. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic association analysis between the variation of the patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) the automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through a search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different case studies regarding the analysis of

  2. Automatic Genre Classification of Musical Signals

    Science.gov (United States)

    Barbedo, Jayme Garcia Arnal; Lopes, Amauri

    2006-12-01

    We present a strategy to perform automatic genre classification of musical signals. The technique divides the signals into 21.3-millisecond frames, from which four features are extracted. The values of each feature are treated over 1-second analysis segments. Some statistical results of the features along each analysis segment are used to determine a vector of summary features that characterizes the respective segment. Next, a classification procedure uses those vectors to differentiate between genres. The classification procedure has two main characteristics: (1) a very wide and deep taxonomy, which allows a very meticulous comparison between different genres, and (2) a wide pairwise comparison of genres, which allows emphasizing the differences between each pair of genres. The procedure points out the genre that best fits the characteristics of each segment. The final classification of the signal is given by the genre that appears most often across all signal segments. The approach has shown very good accuracy even for the lowest layers of the hierarchical structure.
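The decision rule described above (per-segment summary features, then a majority vote over segments) can be sketched as follows; the function names and toy data are illustrative, not taken from the paper:

```python
# Minimal sketch of the decision rule described above; function names and
# the toy data are illustrative, not taken from the paper.

from collections import Counter
from statistics import mean, stdev

def summary_vector(frame_features):
    """Collapse per-frame feature values over one 1-second analysis
    segment into summary statistics (here: mean and std of each feature)."""
    vec = []
    for column in zip(*frame_features):
        vec.append(mean(column))
        vec.append(stdev(column))
    return vec

def classify_signal(segment_votes):
    """Final label: the genre assigned to the largest number of segments."""
    return Counter(segment_votes).most_common(1)[0][0]

frames = [[0.10, 5.0], [0.30, 6.0], [0.20, 7.0]]  # 3 frames x 2 features
print(summary_vector(frames))
print(classify_signal(["jazz", "rock", "jazz", "jazz", "blues"]))  # jazz
```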

  3. Unification of automatic target tracking and automatic target recognition

    Science.gov (United States)

    Schachter, Bruce J.

    2014-06-01

    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms operating at varying time scales. A framework is provided for unifying ATT and ATR.

  4. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  5. On automatic machine translation evaluation

    Directory of Open Access Journals (Sweden)

    Darinka Verdonik

    2013-05-01

    Full Text Available An important task in developing machine translation (MT) is evaluating system performance. Automatic measures are most commonly used for this task, as manual evaluation is time-consuming and costly. However, performing an objective evaluation is not a trivial task. Automatic measures, such as BLEU, TER, NIST, METEOR etc., have their own weaknesses, while manual evaluations are also problematic, since they are always to some extent subjective. In this paper we test the influence of the test set on the results of automatic MT evaluation for the subtitling domain. Translating subtitles is a rather specific task for MT, since subtitles are a sort of summarization of spoken text rather than a direct translation of (written) text. An additional problem when translating a language pair that does not include English (in our example, Slovene-Serbian) is that the translations are commonly done from English to Serbian and from English to Slovene, not directly, since most TV production is originally filmed in English. All this poses additional challenges to MT and consequently to MT evaluation. Automatic evaluation is based on a reference translation, which is usually taken from an existing parallel corpus and marked as a test set. In our experiments, we compare the evaluation results for the same MT system output using three types of test set. In the first round, the test set consists of 4000 subtitles from the SUMAT parallel corpus of subtitles. These subtitles are not direct translations from Serbian to Slovene or vice versa, but are based on an English original. In the second round, the test set consists of 1000 subtitles randomly extracted from the first test set and translated anew, from Serbian to Slovene, based solely on the Serbian written subtitles. In the third round, the test set consists of the same 1000 subtitles; however, this time the Slovene translations were obtained by manually correcting the Slovene MT outputs so that they are correct translations of the
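As a rough illustration of the automatic measures mentioned above, the following is a bare-bones sentence-level BLEU (modified n-gram precision with a brevity penalty). Real evaluations use established tooling; this sketch only shows the mechanics:

```python
# Bare-bones sentence-level BLEU (modified n-gram precision up to max_n,
# with a brevity penalty), shown only to illustrate the mechanics of the
# automatic measures discussed above.

import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts, ref_counts = ngrams(cand, n), ngrams(ref, n)
        overlap = sum((cand_counts & ref_counts).values())  # clipped matches
        precisions.append(overlap / max(sum(cand_counts.values()), 1))
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    brevity = min(1.0, math.exp(1.0 - len(ref) / len(cand)))
    return brevity * geo_mean

print(bleu("the cat sat on the mat", "the cat sat on the mat"))  # 1.0
print(round(bleu("a cat sat on the mat", "the cat sat on the mat"), 3))
```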

  6. Automatic Evaluation of Machine Translation

    DEFF Research Database (Denmark)

    Martinez, Mercedes Garcia; Koglin, Arlene; Mesa-Lao, Bartolomé

    2015-01-01

    of quality criteria in as few edits as possible. The quality of MT systems is generally measured by automatic metrics, producing scores that should correlate with human evaluation.In this study, we investigate correlations between one of such metrics, i.e. Translation Edit Rate (TER), and actual post...... of post-editing effort, namely i) temporal (time), ii) cognitive (mental processes) and iii) technical (keyboard activity). For the purposes of this research, TER scores were correlated with two different indicators of post-editing effort as computed in the CRITT Translation Process Database (TPR...
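The correlation analysis described above can be sketched as follows; the effort indicator and all data values are invented for illustration:

```python
# Illustrative-only sketch of the correlation analysis described above:
# Pearson correlation between per-segment TER scores and a post-editing
# effort indicator (here, editing time). The data below is invented.

from statistics import mean

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ter_scores = [0.10, 0.25, 0.40, 0.55, 0.70]
edit_seconds = [12, 30, 45, 80, 95]
print(round(pearson(ter_scores, edit_seconds), 3))
```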

  7. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  8. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1, where J is the number of users helping that user.
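The diversity-gain claim above can be checked numerically; the formula is taken directly from the abstract, and the loop below simply evaluates it:

```python
# Numeric illustration of the diversity-gain formula quoted above: with a
# maximum of M retransmissions, single transmit/receive antennas, and J
# helping users, coordination raises the diversity gain from M to
# (J + 1)(M - 1) + 1. (Formula from the abstract; this just evaluates it.)

def coordinated_diversity_gain(M, J):
    return (J + 1) * (M - 1) + 1

for J in (0, 1, 2):
    print(f"M=3, J={J}: diversity gain {coordinated_diversity_gain(3, J)}")
# Note: J = 0 helpers recovers the uncoordinated gain M.
```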

  9. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor deficits in PD associated with impaired motor automaticity, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of measures of automaticity in the early diagnosis of PD would be valuable. PMID:26102020

  10. Automatic detection and severity measurement of eczema using image processing.

    Science.gov (United States)

    Alam, Md Nafiul; Munia, Tamanna Tabassum Khan; Tavakolian, Kouhyar; Vasefi, Fartash; MacKinnon, Nick; Fazel-Rezai, Reza

    2016-08-01

    Chronic skin diseases like eczema may lead to severe health and financial consequences for patients if not detected and controlled early. Early measurement of disease severity, combined with a recommendation for skin protection and use of appropriate medication, can prevent the disease from worsening. Current diagnosis can be costly and time-consuming. In this paper, an automatic eczema detection and severity measurement model is presented using modern image processing and computer algorithms. The system can successfully detect regions of eczema and classify the identified regions as mild or severe based on image color and texture features. The model then automatically measures skin parameters used in the most common assessment tool, the Eczema Area and Severity Index (EASI), by computing an eczema affected area score, an eczema intensity score, and a body region score, allowing both patients and physicians to accurately assess the affected skin.
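The EASI arithmetic referenced above combines, for each of four body regions, an area score, an intensity sum, and a standard region weight. A minimal sketch follows; the region weights are the standard adult values, and the score inputs are assumed placeholders rather than outputs of the paper's image pipeline:

```python
# Sketch of the EASI arithmetic referenced above, using the standard adult
# region weights; the inputs (area score 0-6, intensity sum 0-12) are
# assumed placeholders, not outputs of the paper's image pipeline.

REGION_WEIGHTS = {"head_neck": 0.1, "upper_limbs": 0.2,
                  "trunk": 0.3, "lower_limbs": 0.4}

def easi(scores):
    """scores: region -> (area_score 0-6, intensity_sum 0-12); returns 0-72."""
    return sum(area * intensity * REGION_WEIGHTS[region]
               for region, (area, intensity) in scores.items())

example = {"head_neck": (2, 4), "upper_limbs": (3, 5),
           "trunk": (1, 3), "lower_limbs": (4, 6)}
print(round(easi(example), 2))  # 14.3
```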

  11. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    Full Text Available There are increasing applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chessboard, which is found automatically in each image, even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the algorithm proposed, as a valuable tool for camera calibration.

  12. Production ready feature recognition based automatic group technology part coding

    Energy Technology Data Exchange (ETDEWEB)

    Ames, A.L.

    1990-01-01

    During the past four years, a feature recognition based expert system for automatically performing group technology part coding from solid model data has been under development. The system has become a production quality tool, capable of quickly generating the geometry-based portions of a part code with no human intervention. It has been tested on over 200 solid models, half of which are models of production Sandia designs. Its performance rivals that of humans performing the same task, often surpassing them in speed and uniformity. The feature recognition capability developed for part coding is being extended to support other applications, such as manufacturability analysis, automatic decomposition (for finite element meshing and machining), and assembly planning. Initial surveys of these applications indicate that the current capability will provide a strong basis for other applications and that extensions toward more global geometric reasoning and tighter coupling with solid modeler functionality will be necessary.

  13. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software program developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
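The 4-pixel intervention rule described above is easy to state in code. This is a hedged sketch, with made-up coordinates, of how the manual-intervention rate could be tallied:

```python
# Hedged sketch of the intervention rule described above: the automatic
# track is rejected (triggering manual correction) whenever it drifts more
# than 4 pixels from the reference marker position. Data is made up.

import math

THRESHOLD_PX = 4.0

def needs_intervention(tracked, reference):
    return math.dist(tracked, reference) > THRESHOLD_PX

def intervention_rate(tracked_positions, reference_positions):
    flags = [needs_intervention(t, r)
             for t, r in zip(tracked_positions, reference_positions)]
    return 100.0 * sum(flags) / len(flags)

tracked = [(10, 10), (15, 12), (30, 30), (42, 40)]
reference = [(10, 11), (15, 12), (36, 30), (42, 45)]
print(intervention_rate(tracked, reference))  # 2 of 4 exceed 4 px -> 50.0
```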

  14. Stiffness and the automatic selection of ODE codes

    International Nuclear Information System (INIS)

    Shampine, L.F.

    1984-01-01

    The author describes the basic ideas behind the most popular methods for the numerical solution of ordinary differential equations (ODEs). He takes up the qualitative behavior of solutions of ODEs and its relation to the propagation of numerical error. Codes for ODEs are intended either for stiff problems or for non-stiff problems. The difference is explained. Users of codes do not have the information needed to recognize stiffness. A code, DEASY, which automatically recognizes stiffness and selects a suitable method, is described

  15. Automatic computation and solution of generalized harmonic balance equations

    Science.gov (United States)

    Peyton Jones, J. C.; Yaser, K. S. A.; Stevenson, J.

    2018-02-01

    Generalized methods are presented for generating and solving the harmonic balance equations for a broad class of nonlinear differential or difference equations and for a general set of harmonics chosen by the user. In particular, a new algorithm for automatically generating the Jacobian of the balance equations enables efficient solution of these equations using continuation methods. Efficient numeric validation techniques are also presented, and the combined algorithm is applied to the analysis of dc, fundamental, second and third harmonic response of a nonlinear automotive damper.
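A one-harmonic toy version of the harmonic balance idea, applied to the Duffing oscillator, illustrates the kind of balance equation being generated; this is not the paper's generalized algorithm, and the solver below is a plain bisection rather than the continuation methods it describes:

```python
# One-harmonic toy version of the harmonic balance idea above (not the
# paper's generalized algorithm). For the Duffing oscillator
#     x'' + x + eps*x**3 = F*cos(w*t),
# substituting x = A*cos(w*t) and matching cos(w*t) terms gives the scalar
# balance equation (1 - w**2)*A + 0.75*eps*A**3 - F = 0, solved for A here
# by bisection.

def balance_residual(A, w, eps, F):
    return (1.0 - w * w) * A + 0.75 * eps * A ** 3 - F

def solve_amplitude(w, eps, F, lo=0.0, hi=10.0):
    # Bisection; the residual is monotonic in A for w < 1 and eps >= 0.
    a, b = lo, hi
    while b - a > 1e-12:
        mid = 0.5 * (a + b)
        if balance_residual(a, w, eps, F) * balance_residual(mid, w, eps, F) <= 0:
            b = mid
        else:
            a = mid
    return 0.5 * (a + b)

A = solve_amplitude(w=0.5, eps=0.1, F=1.0)
print(round(A, 4))
```

With eps = 0 the balance equation reduces to the linear frequency response A = F / (1 - w**2), a handy sanity check on the solver.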

  16. Differential games.

    Science.gov (United States)

    Varaiya, P. P.

    1972-01-01

    General discussion of the theory of differential games with two players and zero sum. Games starting at a fixed initial state and ending at a fixed final time are analyzed. Strategies for the games are defined. The existence of saddle values and saddle points is considered. A stochastic version of a differential game is used to examine the synthesis problem.

  17. Automatic detection of tooth cracks in optical coherence tomography images.

    Science.gov (United States)

    Kim, Jun-Min; Kang, Se-Ryong; Yi, Won-Jin

    2017-02-01

    The aims of the present study were to compare the image quality and visibility of tooth cracks between conventional methods and swept-source optical coherence tomography (SS-OCT) and to develop an automatic detection technique for tooth cracks by SS-OCT imaging. We evaluated SS-OCT with a near-infrared wavelength centered at 1,310 nm over a spectral bandwidth of 100 nm at a rate of 50 kHz as a new diagnostic tool for the detection of tooth cracks. The reliability of the SS-OCT images was verified by comparing the crack lines with those detected using conventional methods. After performing preprocessing of the obtained SS-OCT images to emphasize cracks, an algorithm was developed and verified to detect tooth cracks automatically. The detection capability of SS-OCT was superior or comparable to that of trans-illumination, which did not discriminate among the cracks according to depth. Other conventional methods for the detection of tooth cracks did not sense initial cracks with a width of less than 100 μm. However, SS-OCT detected cracks of all sizes, ranging from craze lines to split teeth, and the crack lines were automatically detected in images using the Hough transform. We were able to distinguish structural cracks, craze lines, and split lines in tooth cracks using SS-OCT images, and to automatically detect the position of various cracks in the OCT images. Therefore, the detection capability of SS-OCT images provides a useful diagnostic tool for cracked tooth syndrome.

  18. An automatic holographic adaptive phoropter

    Science.gov (United States)

    Amirsolaimani, Babak; Peyghambarian, N.; Schwiegerling, Jim; Bablumyan, Arkady; Savidis, Nickolaos; Peyman, Gholam

    2017-08-01

    Phoropters are the most common instrument used to detect refractive errors. During a refractive exam, lenses are flipped in front of the patient, who looks at the eye chart and tries to read the symbols. The procedure is fully dependent on the cooperation of the patient to read the eye chart, provides only a subjective measurement of visual acuity, and can at best provide a rough estimate of the patient's vision. Phoropters are difficult to use for mass screenings, as they require a skilled examiner, and it is hard to screen young children and the elderly. We have developed a simplified, lightweight automatic phoropter that can measure the optical error of the eye objectively without requiring the patient's input. The automatic holographic adaptive phoropter is based on a Shack-Hartmann wavefront sensor and three computer-controlled fluidic lenses. The fluidic lens system is designed to provide power and astigmatic corrections over a large range without the need for verbal feedback from the patient, in less than 20 seconds.

  19. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.

    1978-01-01

    A remotely controlled automatic special welding machine for piping was developed. This machine is utilized for long distance pipe lines, chemical plants, thermal power generating plants and nuclear power plants effectively from the viewpoint of good quality control, reduction of labor and good controllability. The function of this welding machine is to inspect the shape and dimensions of edge preparation before welding work by the sense of touch, to detect the temperature of melt pool, inspect the bead form by the sense of touch, and check the welding state by ITV during welding work, and to grind the bead surface and inspect the weld metal by ultrasonic test automatically after welding work. The construction of this welding system, the main specification of the apparatus, the welding procedure in detail, the electrical source of this welding machine, the cooling system, the structure and handling of guide ring, the central control system and the operating characteristics are explained. The working procedure and the effect by using this welding machine, and the application to nuclear power plants and the other industrial field are outlined. The HIDIC 08 is used as the controlling computer. This welding machine is useful for welding SUS piping as well as carbon steel piping. (Nakai, Y.)

  20. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

    Automatic handling device for the steam relief valves (SRVs) is developed in order to achieve a decrease in exposure of workers, increase in availability factor, improvement in reliability, improvement in safety of operation, and labor saving. A survey is made during a periodical inspection to examine the actual SRV handling operation. An SRV automatic handling device consists of four components: conveyor, armed conveyor, lifting machine, and control/monitoring system. The conveyor is so designed that the existing I-rail installed in the containment vessel can be used without any modification. This is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, operation panel, TV monitor and annunciator. The SRV handling device is operated by remote control from a control room. Trial equipment is constructed and performance/function testing is carried out using actual SRVs. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  1. Automatic scanning of emulsion films

    International Nuclear Information System (INIS)

    D'Ambrosio, N.; Mandrioli, G.; Sirrib, G.

    2003-01-01

    The use of nuclear emulsions in recent large neutrino experiments is largely due to significant developments in this detection technique. In the emulsion films, trajectories of through-going particles are permanently recorded: thus, the emulsion target can be considered not only a tracking device but also a storage device. If the data readout is performed by automatic scanning systems interfaced to an acquisition computer equipped with a fast frame grabber, nuclear emulsions can be used as a very large target detector and quickly analyzed in particle physics experiments. Techniques for automatic scanning of nuclear emulsions have been under development for decades; the effort was initiated by Niwa at Nagoya (Japan) in the late 70s. The first large-scale application was the CHORUS experiment; emulsions were then used to search for τ neutrinos in a high-track-density environment in DONUT. In order to measure with high accuracy and high speed, very strict constraints must be satisfied in terms of mechanical precision, camera speed, and image processing power. Recent improvements in this technique are briefly reported

  2. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization.
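    The route-aware simplification can be illustrated by its simplest variant: keep only the street segments that lie on some POI-to-POI shortest path. This sketch is a minimal stand-in for the paper's energy-minimization formulation; the graph layout and names are hypothetical.

```python
import heapq
from itertools import combinations

def dijkstra(graph, src):
    # graph: {node: [(neighbor, segment_length), ...]}
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return dist, prev

def simplify_street_graph(graph, pois):
    """Keep only street segments lying on some POI-to-POI shortest path."""
    kept = set()
    for a, b in combinations(pois, 2):
        dist, prev = dijkstra(graph, a)
        if b not in dist:
            continue
        node = b
        while node != a:  # walk back along the shortest path to a
            kept.add(frozenset((prev[node], node)))
            node = prev[node]
    return kept
```

A real implementation would additionally penalize clutter and detours rather than keep shortest paths exactly.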

  3. Individuals with fear of blushing explicitly and automatically associate blushing with social costs

    NARCIS (Netherlands)

    Glashouwer, K.A.; de Jong, P.J.; Dijk, C.; Buwalda, F.M.

    2011-01-01

    To explain fear of blushing, it has been proposed that individuals with fear of blushing overestimate the social costs of their blushing. Current information-processing models emphasize the relevance of differentiating between more automatic and more explicit cognitions, as both types of cognitions

  5. The Masculinity of Money: Automatic Stereotypes Predict Gender Differences in Estimated Salaries

    Science.gov (United States)

    Williams, Melissa J.; Paluck, Elizabeth Levy; Spencer-Rodgers, Julie

    2010-01-01

    We present the first empirical investigation of why men are assumed to earn higher salaries than women (the "salary estimation effect"). Although this phenomenon is typically attributed to conscious consideration of the national wage gap (i.e., real inequities in salary), we hypothesize instead that it reflects differential, automatic economic…

  6. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    International Nuclear Information System (INIS)

    Qiu, J; Li, H. Harlod; Zhang, T; Yang, D; Ma, F

    2015-01-01

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, which is inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip-limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools
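    The pipeline (high-pass boost by subtracting a Gaussian-smoothed copy, histogram equalization, entropy-maximizing parameter choice) can be sketched in NumPy. This is a simplified stand-in: global histogram equalization replaces CLAHE, and a grid search over the boost weight replaces the interior-point optimizer; all names are illustrative.

```python
import numpy as np

def shannon_entropy(img, bins=64):
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def smooth(img, sigma=2.0):
    # separable Gaussian smoothing, rows then columns
    k = gaussian_kernel(sigma, radius=int(3 * sigma))
    rows = np.apply_along_axis(np.convolve, 1, img, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, k, mode="same")

def equalize(img, bins=256):
    # global histogram equalization (stand-in for CLAHE)
    hist, edges = np.histogram(img, bins=bins, range=(0.0, 1.0))
    cdf = hist.cumsum() / hist.sum()
    return np.interp(img, edges[1:], cdf)

def enhance(img, weights=(0.0, 0.5, 1.0, 2.0), sigma=2.0):
    """Pick the high-pass boost weight whose equalized result maximizes entropy."""
    sm = smooth(img, sigma)
    candidates = [equalize(np.clip(img + w * (img - sm), 0.0, 1.0))
                  for w in weights]
    return max(candidates, key=shannon_entropy)
```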

  7. Automatic creation of specialised multilingual dictionaries in new subject areas

    Directory of Open Access Journals (Sweden)

    Joaquim Moré

    2009-05-01

    This article presents a tool to automatically generate specialised dictionaries of multilingual equivalents in new subject areas. The tool uses resources that are available on the web to search for equivalents and verify their reliability. These resources are, on the one hand, the Wikipedias, which can be freely downloaded and processed, and, on the other, the materials that terminological institutions of reference make available. This tool is of use to teachers producing teaching materials and researchers preparing theses, articles or reference manuals. It is also of use to translators and terminologists working on terminological standardisation in a new subject area in a given language, as it helps them in their work to pinpoint concepts that have yet to receive a standardised denomination.
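    The core idea can be sketched as a filter over pre-extracted Wikipedia interlanguage-link records: keep only concepts attested in every target language, as a crude reliability check. The data layout is hypothetical; the actual tool's Wikipedia processing and verification are more involved.

```python
def build_glossary(records, langs):
    """records: one dict per concept mapping language code -> article title
    (e.g. harvested from Wikipedia interlanguage links). Keeps only the
    concepts attested in every requested language."""
    return [{l: links[l] for l in langs}
            for links in records
            if all(l in links for l in langs)]
```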

  8. Automaticity in reading isiZulu

    OpenAIRE

    Sandra Land

    2016-01-01

    Automaticity, or instant recognition of combinations of letters as units of language, is essential for proficient reading in any language. The article explores automaticity amongst competent adult first-language readers of isiZulu, and the factors associated with it or its opposite - active decoding. Whilst the transparent spelling patterns of isiZulu aid learner readers, some of its orthographical features may militate against their gaining automaticity. These features are agglutination; a c...

  9. Advanced differential quadrature methods

    CERN Document Server

    Zong, Zhi

    2009-01-01

    Modern Tools to Perform Numerical Differentiation. The original direct differential quadrature (DQ) method has been known to fail for problems with strong nonlinearity and material discontinuity as well as for problems involving singularity, irregularity, and multiple scales. But now researchers in applied mathematics, computational mechanics, and engineering have developed a range of innovative DQ-based methods to overcome these shortcomings. Advanced Differential Quadrature Methods explores new DQ methods and uses these methods to solve problems beyond the capabilities of the direct DQ method. After a basic introduction to the direct DQ method, the book presents a number of DQ methods, including complex DQ, triangular DQ, multi-scale DQ, variable order DQ, multi-domain DQ, and localized DQ. It also provides a mathematical compendium that summarizes Gauss elimination, the Runge-Kutta method, complex analysis, and more. The final chapter contains three codes written in the FORTRAN language, enabling readers to q...

  10. Automatic Generation of Validated Specific Epitope Sets

    Directory of Open Access Journals (Sweden)

    Sebastian Carrasco Pro

    2015-01-01

    Accurate measurement of B and T cell responses is a valuable tool to study autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions and assist in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to elucidate a method to select validated reference sets of epitopes to allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria and dengue) were experimentally validated by detection of T cell reactivity ex vivo from human donors. Furthermore, a web application that will potentially be implemented in the IEDB was created to allow users the capacity to generate customized epitope sets.

  11. Nasal pressure recordings for automatic snoring detection.

    Science.gov (United States)

    Lee, Hyo-Ki; Kim, Hojoong; Lee, Kyoung-Joung

    2015-11-01

    This study presents a rule-based method for automated, real-time snoring detection using nasal pressure recordings during overnight sleep. Although nasal pressure recordings provide information regarding nocturnal breathing abnormalities in a polysomnography (PSG) study or continuous positive airway pressure (CPAP) system, an objective assessment of snoring detection using these nasal pressure recordings has not yet been reported in the literature. Nasal pressure recordings were obtained from 55 patients with obstructive sleep apnea. The PSG data were also recorded simultaneously to evaluate the proposed method. This rule-based method for automatic, real-time snoring detection employed preprocessing, short-time energy and the central difference method. Using this methodology, a sensitivity of 85.4% and a positive predictive value of 92.0% were achieved in all patients. Therefore, we concluded that the proposed method is a simple, portable and cost-effective tool for real-time snoring detection in PSG and CPAP systems that does not require acoustic analysis using a microphone.
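    A minimal sketch of the short-time-energy component of such a rule-based detector: frame the signal, compute per-frame energy, and flag frames whose energy rises sharply above a baseline. The thresholding rule here is illustrative, not the published one.

```python
import numpy as np

def short_time_energy(x, frame_len, hop):
    # sum of squared samples per frame
    n = 1 + max(0, (len(x) - frame_len) // hop)
    return np.array([(x[i*hop : i*hop + frame_len] ** 2).sum() for i in range(n)])

def detect_snores(signal, fs, frame_s=0.05, thresh_ratio=5.0):
    """Flag frames whose energy exceeds thresh_ratio times the median energy."""
    frame = int(fs * frame_s)
    e = short_time_energy(signal, frame, frame)
    baseline = np.median(e) + 1e-12
    return e > thresh_ratio * baseline
```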

  12. Automatic Generation of Minimal Cut Sets

    Directory of Open Access Journals (Sweden)

    Sentot Kromodimoeljo

    2015-06-01

    A cut set is a collection of component failure modes that could lead to a system failure. Cut Set Analysis (CSA) is applied to critical systems to identify and rank system vulnerabilities at design time. Model checking tools have been used to automate the generation of minimal cut sets but are generally based on checking reachability of system failure states. This paper describes a new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT Analyser that supports the generation of multiple counterexamples. The approach enables a broader class of system failures to be analysed, by generalising from failure state formulae to failure behaviours expressed in LTL. The traditional approach to CSA using model checking requires the model or system failure to be modified, usually by hand, to eliminate already-discovered cut sets, and the model checker to be rerun, at each step. By contrast, the new approach works incrementally and fully automatically, thereby removing the tedious and error-prone manual process and resulting in significantly reduced computation time. This in turn enables larger models to be checked. Two different strategies for using BT Analyser for CSA are presented. There is generally no single best strategy for model checking: their relative efficiency depends on the model and property being analysed. Comparative results are given for the A320 hydraulics case study in the Behavior Tree modelling language.
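    Independent of the model-checking machinery, the notion of a minimal cut set can be illustrated with a brute-force enumeration over a monotone failure predicate: grow candidate sets by size and skip any superset of an already-found cut set. This is feasible only for small component counts and is not the BT Analyser algorithm.

```python
from itertools import combinations

def minimal_cut_sets(components, system_fails):
    """Enumerate minimal cut sets of a monotone failure predicate.

    system_fails(frozenset_of_failed_components) -> bool
    """
    minimal = []
    for k in range(1, len(components) + 1):
        for combo in combinations(components, k):
            s = frozenset(combo)
            if any(m <= s for m in minimal):
                continue  # a strict subset already fails the system
            if system_fails(s):
                minimal.append(s)
    return minimal
```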

  13. Automatic Adjustments of a Trans-oesophageal Ultrasound Robot for Monitoring Intra-operative Catheters

    Science.gov (United States)

    Wang, Shuangyi; Housden, James; Singh, Davinder; Rhode, Kawal

    2017-12-01

    3D trans-oesophageal echocardiography (TOE) has become a powerful tool for monitoring intra-operative catheters used during cardiac procedures in recent years. However, the control of the TOE probe remains a manual task, and therefore the operator has to hold the probe for a long period of time, sometimes in a radiation environment. To solve this problem, an add-on robotic system has been developed for holding and manipulating a commercial TOE probe. This paper focuses on the application of making automatic adjustments to the probe pose in order to accurately monitor the moving catheters. The positioning strategy is divided into an initialization step based on a pre-planning method and a localized adjustments step based on the robotic differential kinematics and related image servoing techniques. Both steps are described in the paper along with simulation experiments performed to validate the concept. The results indicate an error of less than 0.5 mm for the initialization step and an error of less than 2 mm for the localized adjustments step. Compared to the much larger live 3D image volume, it is concluded that the methods are promising. Future work will focus on evaluating the method in the real TOE scanning scenario.
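    The localized-adjustments step rests on resolved-rate control: joint increments are obtained from the task-space error through the (pseudo-)inverse of the Jacobian. A sketch using a planar two-link arm as a stand-in for the probe kinematics; all model details here are illustrative.

```python
import numpy as np

def fk_2link(q, l1=1.0, l2=1.0):
    # planar two-link forward kinematics (illustrative stand-in)
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jac_2link(q, l1=1.0, l2=1.0):
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def servo_step(q, target, gain=0.5):
    """One resolved-rate step: dq = gain * pinv(J) @ (target - x)."""
    err = target - fk_2link(q)
    dq = gain * np.linalg.pinv(jac_2link(q)) @ err
    return q + dq
```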

  14. Automatic Thermal Infrared Panoramic Imaging Sensor

    National Research Council Canada - National Science Library

    Gutin, Mikhail; Tsui, Eddy K; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-01-01

    .... Automatic detection, location, and tracking of targets outside protected area ensures maximum protection and at the same time reduces the workload on personnel, increases reliability and confidence...

  15. Machine tool

    International Nuclear Information System (INIS)

    Kang, Myeong Sun

    1981-01-01

    This book covers machine tools, including the cutting process and machining by cutting, the theory of cutting (such as tool angle and chip formation), cutting tools such as milling cutters and drills, and an overview and introduction of the following machine elements and machines: spindle drives and feed drives, pivots and pivot bearings, frames, guide ways and tables, drilling machines, boring machines, shapers and planers, milling machines, machine tools for precision finishing such as lapping machines and superfinishing machines, and gear cutters.

  16. Differential games

    CERN Document Server

    Friedman, Avner

    2006-01-01

    This volume lays the mathematical foundations for the theory of differential games, developing a rigorous mathematical framework with existence theorems. It begins with a precise definition of a differential game and advances to considerations of games of fixed duration, games of pursuit and evasion, the computation of saddle points, games of survival, and games with restricted phase coordinates. Final chapters cover selected topics (including capturability and games with delayed information) and N-person games.Geared toward graduate students, Differential Games will be of particular interest

  17. AuTom: a novel automatic platform for electron tomography reconstruction

    KAUST Repository

    Han, Renmin

    2017-07-26

    We have developed a software package towards automatic electron tomography (ET): Automatic Tomography (AuTom). The presented package has the following characteristics: accurate alignment modules for marker-free datasets containing substantial biological structures; fully automatic alignment modules for datasets with fiducial markers; wide coverage of reconstruction methods including a new iterative method based on compressed-sensing theory that suppresses the "missing wedge" effect; and multi-platform acceleration solutions that support faster iterative algebraic reconstruction. AuTom aims to achieve fully automatic alignment and reconstruction for electron tomography and has already been successful for a variety of datasets. AuTom also offers a user-friendly interface and auxiliary designs for file management and workflow management, in which fiducial marker-based datasets and marker-free datasets are addressed with totally different subprocesses. With all of these features, AuTom can serve as a convenient and effective tool for processing in electron tomography.
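    Iterative algebraic reconstruction of the kind AuTom accelerates can be illustrated with a Kaczmarz-style ART sweep on a toy linear system Ax = b (real ET reconstruction uses projection operators far larger than this; names and parameters are illustrative).

```python
import numpy as np

def art_reconstruct(A, b, iters=200, relax=0.5):
    """Kaczmarz-style ART: sweep the rows, relaxing x toward each hyperplane."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(iters):
        for i in range(A.shape[0]):
            if row_norms[i] == 0:
                continue
            r = b[i] - A[i] @ x          # residual of row i
            x += relax * r / row_norms[i] * A[i]
    return x
```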

  18. Automatic Tagging as a Support Strategy for Creating Knowledge Maps

    Directory of Open Access Journals (Sweden)

    Leonardo Moura De Araújo

    2017-06-01

    Graph organizers are powerful tools for both structuring and transmitting knowledge. Because of their unique characteristics, these organizers are valuable for cultural institutions, which own large amounts of information assets and need to constantly make sense of them. On one hand, graph organizers are tools for connecting numerous chunks of data. On the other hand, because they are visual media, they offer a bird's-eye view of complexity, which is digestible to the human eye. They are effective tools for information synthesis, and are capable of providing valuable insights on data. Information synthesis is essential for Heritage Interpretation, since institutions depend on constant generation of new content to preserve relevance among their audiences. While Mind Maps are simpler to structure and comprehend, Knowledge Maps offer challenges that require new methods to minimize the difficulties encountered during their assembly. This paper presents strategies based on manual and automatic tagging as an answer to this problem. In addition, we describe the results of a usability test and qualitative analysis performed to compare the workflows employed to construct both Mind Maps and Knowledge Maps. Furthermore, we also discuss how well concepts can be communicated through the visual representation of trees and networks. Depending on the employed method, different results can be achieved, because of their unique topological characteristics. Our findings suggest that automatic tagging supports and accelerates the construction of graphs.

  19. Differential Geometry

    CERN Document Server

    Stoker, J J

    2011-01-01

    This classic work is now available in an unabridged paperback edition. Stoker makes this fertile branch of mathematics accessible to the nonspecialist by the use of three different notations: vector algebra and calculus, tensor calculus, and the notation devised by Cartan, which employs invariant differential forms as elements in an algebra due to Grassmann, combined with an operation called exterior differentiation. Assumed are a passing acquaintance with linear algebra and the basic elements of analysis.

  20. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  1. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.

    1980-01-01

    Antares is a 24-beam-line CO2 laser system for controlled fusion research, under construction at Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experiment shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the center of large copper mirrors, and remotely illuminated to reduce heating effects
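    The heart of a video centroid tracker is an intensity-weighted centroid whose offset from the optical axis is the error to be nulled by the mirror servos. A minimal sketch (the threshold choice is illustrative):

```python
import numpy as np

def spot_centroid(frame, threshold=0.5):
    """Intensity-weighted centroid (row, col) of pixels above a relative threshold."""
    mask = frame >= threshold * frame.max()
    w = np.where(mask, frame, 0.0)
    total = w.sum()
    rows, cols = np.indices(frame.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total

def offaxis_error(frame, axis_rc):
    # signed (row, col) offset of the spot from the optical axis
    r, c = spot_centroid(frame)
    return r - axis_rc[0], c - axis_rc[1]
```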

  2. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.

    2012-01-01

    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified, including the pupil, iris, sclera, and eyelashes. Based on this segmentation, the [...] regions. The result is a blue-brown ratio for each eye. Furthermore, an image clustering approach has been used with promising results. The approach is based on using a sparse dictionary of feature vectors learned from a training set of iris regions. The feature vectors contain both local structural [...] is completely data driven and it can divide a group of eye images into classes based on structure, colour or a combination of the two. The methods have been tested on a large set of photos with promising results.

  3. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.

    1984-01-01

    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and long operation time required. The system presented in this paper realizes the automatic operation of the TIP system by monitoring and driving it with a process computer. This system significantly reduces the burden on customer operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives equipment by microprocessor control. The process computer contains such components as the CRT/KB unit, the printer plotter, the hard copier, and the message typers required for efficient man-machine communications. Its operation and interface properties are described

  4. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A good number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved—even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  5. Automatic Approach Tendencies toward High and Low Caloric Food in Restrained Eaters: Influence of Task-Relevance and Mood

    OpenAIRE

    Neimeijer, Renate A. M.; Roefs, Anne; Ostafin, Brian D.; de Jong, Peter J.

    2017-01-01

    Objective: Although restrained eaters are motivated to control their weight by dieting, they are often unsuccessful in these attempts. Dual process models emphasize the importance of differentiating between controlled and automatic tendencies to approach food. This study investigated the hypothesis that heightened automatic approach tendencies in restrained eaters would be especially prominent in contexts where food is irrelevant for their current tasks. Additionally, we examined the influenc...

  6. MODULEWRITER: a program for automatic generation of database interfaces.

    Science.gov (United States)

    Zheng, Christina L; Fana, Fariba; Udupi, Poornaprajna V; Gribskov, Michael

    2003-05-01

    MODULEWRITER is a PERL object relational mapping (ORM) tool that automatically generates database specific application programming interfaces (APIs) for SQL databases. The APIs consist of a package of modules providing access to each table row and column. Methods for retrieving, updating and saving entries are provided, as well as other generally useful methods (such as retrieval of the highest numbered entry in a table). MODULEWRITER provides for the inclusion of user-written code, which can be preserved across multiple runs of the MODULEWRITER program.
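    The ORM-generation idea, emitting per-table accessor classes from schema metadata, can be sketched as follows. MODULEWRITER itself is Perl and its generated API differs; this Python miniature and its output format are hypothetical.

```python
def generate_module(table, columns):
    """Emit source code for a tiny row-accessor class for one table."""
    lines = [
        f"class {table.capitalize()}:",
        f"    _table = {table!r}",
        f"    _columns = {tuple(columns)!r}",
        "    def __init__(self, **values):",
        "        for col in self._columns:",
        "            setattr(self, col, values.get(col))",
    ]
    return "\n".join(lines)
```

The generated source can be written to a module file or exec'd directly; a real tool would also emit retrieve/update/save methods backed by SQL.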

  7. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  8. Particle swarm optimization applied to automatic lens design

    Science.gov (United States)

    Qin, Hua

    2011-06-01

    This paper describes a novel application of the Particle Swarm Optimization (PSO) technique to lens design. A mathematical model is constructed, and merit functions of an optical system are employed as fitness functions, combining the radii of curvature, the thicknesses between lens surfaces, and the refractive indices of the optical system. Using this function, aberration correction is carried out. A design example using PSO is given. Results show that PSO is a practical and powerful optical design tool; the method is no longer dependent on the initial lens structure and can arbitrarily create search ranges for the structural parameters of a lens system, which is an important step towards automatic design with artificial intelligence.
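    A basic global-best PSO of the kind applied here can be sketched as follows. The merit function used in the test is a simple quadratic stand-in for a real lens merit function; parameter choices are conventional defaults, not the paper's.

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, iters=150,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic global-best PSO minimizing f over box bounds [(lo, hi), ...]."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)       # keep particles inside the search box
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(pbest_f.min())
```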

  9. Machine Learning Algorithms for Automatic Classification of Marmoset Vocalizations.

    Directory of Open Access Journals (Sweden)

    Hjalmar K Turesson

    Automatic classification of vocalization type could potentially become a useful tool for the acoustic monitoring of captive colonies of highly vocal primates. However, for classification to be useful in practice, a reliable algorithm that can be successfully trained on small datasets is necessary. In this work, we consider seven different classification algorithms with the goal of finding a robust classifier that can be successfully trained on small datasets. We found good classification performance (accuracy > 0.83 and F1-score > 0.84 using the Optimum Path Forest classifier. Dataset and algorithms are made publicly available.

  10. Comparison of automatic and visual methods used for image segmentation in Endodontics: a microCT study.

    Science.gov (United States)

    Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz

    2017-01-01

    To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required, which can be determined by visual or automatic methods. Visual determination is influenced by the operator's visual acuity, while the automatic method is done entirely by computer algorithms. The aims were to compare visual and automatic segmentation, and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between visual and automatic segmentation methods regarding root canal volume measurements (p=0.93) and root canal surface (p=0.79). Although visual and automatic segmentation methods can be used to determine the threshold and calculate root canal volume and surface, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.
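    Automatic threshold selection of this kind is commonly done with Otsu's method, which picks the threshold maximizing the between-class variance of the gray-level histogram. The abstract does not specify the scanner software's algorithm, so this NumPy sketch is an illustrative stand-in:

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Threshold maximizing between-class variance of the intensity histogram."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                      # class-0 (below threshold) weight
    w1 = 1.0 - w0
    cum_mu = np.cumsum(p * centers)
    mu_t = cum_mu[-1]
    mu0 = cum_mu / np.where(w0 > 0, w0, 1)
    mu1 = (mu_t - cum_mu) / np.where(w1 > 0, w1, 1)
    sigma_b = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
    return centers[np.argmax(sigma_b)]
```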

  11. Automatic segmentation of diatom images for classification

    NARCIS (Netherlands)

    Jalba, Andrei C.; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    A general framework for automatic segmentation of diatom images is presented. This segmentation is a critical first step in contour-based methods for automatic identification of diatoms by computerized image analysis. We review existing results, adapt popular segmentation methods to this difficult

  12. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Automation has brought many revolutions to existing technologies. One technology that has seen considerable development is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis while reducing manpower through automatic operation. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and runs as a continuous process. The magnetic contactor acts as a switch connected to the 10-hour timer, controlling the activation or termination of electrical loads; the system is powered by a solar panel outputting electrical power and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  13. Towards Automatic Trunk Classification on Young Conifers

    DEFF Research Database (Denmark)

    Petri, Stig; Immerkær, John

    2009-01-01

    In the garden nursery industry providing young Nordmann firs for Christmas tree plantations, there is a rising interest in automatic classification of their products to ensure consistently high quality and reduce the cost of manual labor. This paper describes a fully automatic single-view algorithm...

  14. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore the invention enables automatic marking of films in radiographic inspection with regard to a ticketing of the test piece and of that part of it where testing took place. (RW) [de

  15. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  16. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  17. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink

    2017-01-01

    Full Text Available High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient, and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
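
As a rough illustration of the detection step, the sketch below band-limits a signal to the ripple band and thresholds its rectified envelope. The brick-wall FFT filter and the 3-SD threshold are simplifying assumptions, not the optimized detector used in the study.

```python
import numpy as np

def detect_ripples(signal, fs, band=(80.0, 250.0), thresh_sd=3.0):
    """Zero all spectral content outside `band` (crude brick-wall filter),
    rectify the filtered trace as a cheap envelope, and flag samples more
    than thresh_sd standard deviations above the mean envelope."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    spec = np.fft.rfft(signal)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0
    filtered = np.fft.irfft(spec, n)
    env = np.abs(filtered)
    mask = env > env.mean() + thresh_sd * env.std()
    return filtered, mask
```

On a noisy trace with an embedded 150 Hz burst, the mask concentrates on the burst; grouping flagged samples into events and enforcing a minimum duration would be the next step in a real detector.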

  18. Automatic transmission parameters measurement and radiation pattern simulation for an RF photonic integrated beamformer

    NARCIS (Netherlands)

    Burla, M.; Lavabre, E.; Roeloffzen, C.G.H.; Marpaung, D.A.I.; Zhuang, L.; Khan, M.R.H.; van Etten, Wim

    2011-01-01

    We present the implementation and demonstration of a software tool for the performance characterization of integrated N-by-1 photonic beamformers for phased array antennas. The software operates the automatic measurement of the transmission parameters of an equivalent N+1 ports microwave network,

  19. Towards the Availability of the Distributed Cluster Rendering System: Automatic Modeling and Verification

    DEFF Research Database (Denmark)

    Wang, Kemin; Jiang, Zhengtao; Wang, Yongbin

    2012-01-01

    In this study, we proposed a Continuous Time Markov Chain model for the availability of n-node clusters of a Distributed Rendering System. The model is infinite; we formalized it and, based on the model, implemented software which can automatically generate models in the PRISM language. With the tool, whe...
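
A sketch of what "automatically generate models in the PRISM language" can look like: emitting a generic birth-death CTMC for an n-node cluster. The failure/repair structure and the rates are illustrative assumptions, not the paper's actual model.

```python
def prism_cluster_model(n, fail_rate=0.01, repair_rate=0.5):
    """Emit PRISM source for an n-node cluster: each up node fails at
    fail_rate, and a single repair facility restores nodes at repair_rate."""
    lines = [
        "ctmc",
        "",
        "module cluster",
        f"  up : [0..{n}] init {n};",
        f"  [fail]   up > 0 -> up * {fail_rate} : (up' = up - 1);",
        f"  [repair] up < {n} -> {repair_rate} : (up' = up + 1);",
        "endmodule",
    ]
    return "\n".join(lines)

print(prism_cluster_model(3))
```

The emitted text can be fed to the PRISM model checker to query steady-state availability, e.g. the long-run probability that at least one node is up.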

  20. Automatic Atrial Fibrillation Detection: A Novel Approach Using Discrete Wavelet Transform and Heart Rate Variability

    DEFF Research Database (Denmark)

    Bruun, Iben H.; Hissabu, Semira M. S.; Poulsen, Erik S.

    2017-01-01

    be used as a screening tool for patients suspected to have AF. The method includes an automatic peak detection prior to the feature extraction, as well as a noise cancellation technique followed by a bagged tree classification. Simulation studies on the MIT-BIH Atrial Fibrillation database was performed...

  1. Fourier transform infrared microspectroscopy identifies early lineage commitment in differentiating human embryonic stem cells.

    Science.gov (United States)

    Heraud, Philip; Ng, Elizabeth S; Caine, Sally; Yu, Qing C; Hirst, Claire; Mayberry, Robyn; Bruce, Amanda; Wood, Bayden R; McNaughton, Don; Stanley, Edouard G; Elefanty, Andrew G

    2010-03-01

    Human ESCs (hESCs) are a valuable tool for the study of early human development and represent a source of normal differentiated cells for pharmaceutical and biotechnology applications and ultimately for cell replacement therapies. For all applications, it will be necessary to develop assays to validate the efficacy of hESC differentiation. We explored the capacity for FTIR spectroscopy, a technique that rapidly characterises cellular macromolecular composition, to discriminate mesendoderm or ectoderm committed cells from undifferentiated hESCs. Distinct infrared spectroscopic "signatures" readily distinguished hESCs from these early differentiated progeny, with bioinformatic models able to correctly classify over 97% of spectra. These data identify a role for FTIR spectroscopy as a new modality to complement conventional analyses of hESCs and their derivatives. FTIR spectroscopy has the potential to provide low-cost, automatable measurements for the quality control of stem and differentiated cells to be used in industry and regenerative medicine. Crown Copyright 2009. Published by Elsevier B.V. All rights reserved.
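
The classification idea, spectra separated by their macromolecular "signatures", can be sketched with a simple nearest-centroid model on synthetic spectra. The real study used more elaborate bioinformatic models, so this is only a schematic stand-in.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Mean spectrum per class: rows of X are spectra, y holds class labels."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(X, classes, centroids):
    """Assign each spectrum to the class with the closest mean spectrum."""
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]
```

With two synthetic classes whose "absorption peaks" sit at different wavenumber indices, held-out spectra are classified almost perfectly, mirroring the >97% figure reported for the real models.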

  2. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    NXT tool for annotating dialogue acts in a multiparty conversation. designer, marketing expert and interface designer), and the team participated in a series of four meetings. The meetings took place over 3–4 hours: about half of this time was spent in meetings, the remainder was spent in preparation, with each participant ...

  3. Automatic locking orthotic knee device

    Science.gov (United States)

    Weddendorf, Bruce C. (Inventor)

    1993-01-01

    An articulated tang in clevis joint for incorporation in newly manufactured conventional strap-on orthotic knee devices or for replacing such joints in conventional strap-on orthotic knee devices is discussed. The instant tang in clevis joint allows the user the freedom to extend and bend the knee normally when no load (weight) is applied to the knee and to automatically lock the knee when the user transfers weight to the knee, thus preventing a damaged knee from bending uncontrollably when weight is applied to the knee. The tang in clevis joint of the present invention includes first and second clevis plates, a tang assembly and a spacer plate secured between the clevis plates. Each clevis plate includes a bevelled serrated upper section. A bevelled shoe is secured to the tank in close proximity to the bevelled serrated upper section of the clevis plates. A coiled spring mounted within an oblong bore of the tang normally urges the shoes secured to the tang out of engagement with the serrated upper section of each clevic plate to allow rotation of the tang relative to the clevis plate. When weight is applied to the joint, the load compresses the coiled spring, the serrations on each clevis plate dig into the bevelled shoes secured to the tang to prevent relative movement between the tang and clevis plates. A shoulder is provided on the tang and the spacer plate to prevent overextension of the joint.

  4. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Full Text Available Liquid nitrogen is one of the major substances used as a chiller in industries such as ice cream factories, milk dairies, blood-sample storage and blood banks. It helps to maintain the required product at a lower temperature for preservation purposes. We cannot fully utilise the LN2: practically, if we use 3.75 litres of LN2 in a single day, then around 12% of the LN2 (450 ml) is wasted due to vaporisation. A pressure relief valve is provided to create a pressure difference. If there is no pressure difference between the cylinder carrying LN2 and its surroundings, it will result in damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is carried out manually, so care must be taken during the transmission of LN2 in order to avoid wastage. With this project concept, the transmission of LN2 will be carried out automatically so as to reduce the wastage incurred in manual operation.
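
The wastage figure quoted above is simple arithmetic: 450 ml lost out of 3.75 litres per day.

```python
daily_volume_l = 3.75   # LN2 used per day (from the abstract)
wasted_l = 0.450        # lost to vaporisation (from the abstract)
wasted_pct = 100.0 * wasted_l / daily_volume_l
print(round(wasted_pct))  # -> 12
```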

  5. Automatic segmentation of the colon

    Science.gov (United States)

    Wyatt, Christopher L.; Ge, Yaorong; Vining, David J.

    1999-05-01

    Virtual colonoscopy is a minimally invasive technique that enables detection of colorectal polyps and cancer. Normally, a patient's bowel is prepared with colonic lavage and gas insufflation prior to computed tomography (CT) scanning. An important step for 3D analysis of the image volume is segmentation of the colon. The high-contrast gas/tissue interface that exists in the colon lumen makes segmentation of the majority of the colon relatively easy; however, two factors inhibit automatic segmentation of the entire colon. First, the colon is not the only gas-filled organ in the data volume: lungs, small bowel, and stomach also meet this criterion. User-defined seed points placed in the colon lumen have previously been required to spatially isolate only the colon. Second, portions of the colon lumen may be obstructed by peristalsis, large masses, and/or residual feces. These complicating factors require increased user interaction during the segmentation process to isolate additional colon segments. To automate the segmentation of the colon, we have developed a method to locate seed points and segment the gas-filled lumen with no user supervision. We have also developed an automated approach to improve lumen segmentation by digitally removing residual contrast-enhanced fluid resulting from a new bowel preparation that liquefies and opacifies any residual feces.
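
The unsupervised seed-finding step can be illustrated by labelling gas-filled voxels and keeping the largest connected component, whose centroid can then serve as an automatic seed. The -800 HU threshold and the flood-fill labelling below are illustrative choices, not the authors' exact method.

```python
import numpy as np
from collections import deque

def largest_gas_region(volume, air_threshold=-800):
    """Return a boolean mask of the largest face-connected component of
    voxels below air_threshold (gas), found by breadth-first flood fill."""
    gas = volume < air_threshold
    labels = np.zeros(volume.shape, dtype=int)
    best_mask, best_size, current = None, 0, 0
    for start in zip(*np.nonzero(gas)):
        if labels[start]:
            continue                      # already labelled
        current += 1
        q = deque([start])
        labels[start] = current
        size = 0
        while q:
            p = q.popleft()
            size += 1
            for axis in range(volume.ndim):   # face neighbours only
                for step in (-1, 1):
                    nb = list(p)
                    nb[axis] += step
                    nb = tuple(nb)
                    if all(0 <= nb[i] < volume.shape[i] for i in range(volume.ndim)) \
                            and gas[nb] and not labels[nb]:
                        labels[nb] = current
                        q.append(nb)
        if size > best_size:
            best_size = size
            best_mask = labels == current
    return best_mask
```

In a real pipeline one would exclude lung-sized components by anatomical position before accepting the largest remaining region as colon.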

  6. Automatic panoramic thermal integrated sensor

    Science.gov (United States)

    Gutin, Mikhail A.; Tsui, Eddy K.; Gutin, Olga N.

    2005-05-01

    Historically, the US Army has recognized the advantages of panoramic imagers with high image resolution: increased area coverage with fewer cameras, instantaneous full horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The novel ViperViewTM high-resolution panoramic thermal imager is the heart of the Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC) in support of the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to improve situational awareness (SA) in many defense and offensive operations, as well as serve as a sensor node in tactical Intelligence Surveillance Reconnaissance (ISR). The ViperView is an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640x480-pixel IR camera with improved image quality for longer range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS sensor suite include ancillary sensors, advanced power management, and wakeup capability. This paper describes the development status of the APTIS system.

  7. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation; in practice, scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light based on the skin's Tyndall effect is applied during imaging to eliminate reflection, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to extract texture and color features. In this step, an image-roughness feature is defined so that scaling can easily be separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the data set offered by Union Hospital, more than 90% of images can be segmented accurately.
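
The "image roughness" feature mentioned above is not specified in the abstract; a plausible minimal stand-in is the local standard deviation in a sliding window, which scores textured scaling higher than smooth skin. The window size is an illustrative choice.

```python
import numpy as np

def roughness(image, win=5):
    """Per-pixel 'roughness': standard deviation of a win x win neighbourhood,
    computed on a reflect-padded copy so the output matches the input shape."""
    pad = win // 2
    padded = np.pad(image.astype(float), pad, mode="reflect")
    h, w = image.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + win, j:j + win].std()
    return out
```

Thresholding this map (or feeding it to the random forest alongside color features) separates high-texture scaling from flat skin regions.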

  8. Differential forms theory and practice

    CERN Document Server

    Weintraub, Steven H

    2014-01-01

    Differential forms are utilized as a mathematical technique to help students, researchers, and engineers analyze and interpret problems where abstract spaces and structures are concerned, and when questions of shape, size, and relative positions are involved. Differential Forms has gained high recognition in the mathematical and scientific community as a powerful computational tool in solving research problems and simplifying very abstract problems through mathematical analysis on a computer. Differential Forms, 2nd Edition, is a solid resource for students and professionals needing a solid g

  9. Automatic Creation of Stepper Job Files

    Science.gov (United States)

    Dailor, David S.

    1989-07-01

    Application Specific Integrated Circuit (ASIC) manufacturing, characterized by fast cycle time, small lot sizes and an abundance of products and process flows, creates challenges not found in a dedicated product line. The continuous addition of new devices into the wafer fab creates problems such as reticle storage, critical dimension specifications and stepper set-up. One of the stepper set-up problems is stepper job file generation. Each device run on a stepper requires several job files to be created. The number of job files created for each device is dependent on the manufacturing environment, the alignment scheme, process flow and stepper used. There can be as many job files as there are layers for each device. Data from the reticle layout engineer is entered into the stepper by the process engineer to define the stepping array, location of alignment marks, framing blade settings and other exposure related information. This process is time consuming and removes a machine from production while job files are being written and tested. In an ASIC environment where several new devices start each week, creating stepper job files is a major problem impacting productivity, cycle time, process yield and the rework rate. A solution to all of these problems is to automate the stepper job file generation process. Our stepper job files are created automatically by computer when the reticles are tooled. Stepper job files are ready before the masks arrive and they are always correct. The result is improved productivity and process yields with reduced reworks and prototype cycle times. A description of this automated approach and the necessary items to implement this technique is presented.
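
A toy version of automatic job-file generation: rendering stepping-array and framing-blade data into a text job file. The key/value layout here is hypothetical; real job-file formats are stepper-vendor specific.

```python
def stepper_job_file(device, layer, step_x_mm, step_y_mm, rows, cols,
                     blade=(0.0, 0.0, 0.0, 0.0)):
    """Render a job-file fragment from reticle-layout data (hypothetical
    format): array geometry, step pitch in mm, and four blade settings."""
    lines = [
        f"JOB {device}_{layer}",
        f"ARRAY {rows} x {cols} STEP {step_x_mm:.3f} {step_y_mm:.3f}",
        "BLADES " + " ".join(f"{b:.2f}" for b in blade),
        "END",
    ]
    return "\n".join(lines)

print(stepper_job_file("DEV1", "METAL1", 14.2, 14.2, 5, 5))
```

Generating one such file per layer directly from the reticle layout database is what removes the manual, machine-occupying entry step described above.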

  10. Automatic Road Pavement Assessment with Image Processing: Review and Comparison

    Directory of Open Access Journals (Sweden)

    Sylvie Chambon

    2011-01-01

    Full Text Available In the field of noninvasive sensing techniques for civil infrastructures monitoring, this paper addresses the problem of crack detection, in the surface of the French national roads, by automatic analysis of optical images. The first contribution is a state of the art of the image-processing tools applied to civil engineering. The second contribution is about fine-defect detection in pavement surface. The approach is based on a multi-scale extraction and a Markovian segmentation. Third, an evaluation and comparison protocol which has been designed for evaluating this difficult task—the road pavement crack detection—is introduced. Finally, the proposed method is validated, analysed, and compared to a detection approach based on morphological tools.
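
A crude sketch of multi-scale fine-defect extraction: flag pixels darker than the local mean at several window radii (cracks are thin, dark structures on a brighter pavement). The box-filter scales and the contrast factor are assumptions, and the Markovian refinement step is omitted.

```python
import numpy as np

def box_mean(img, r):
    """Mean filter of radius r via a summed-area table on an edge-padded copy."""
    p = np.pad(img.astype(float), r + 1, mode="edge")
    c = p.cumsum(0).cumsum(1)
    k = 2 * r + 1
    h, w = img.shape
    s = (c[k:k + h, k:k + w] - c[:h, k:k + w]
         - c[k:k + h, :w] + c[:h, :w])
    return s / (k * k)

def crack_mask(img, scales=(1, 2, 4), contrast=1.5):
    """A pixel is a crack candidate if it is darker than the local mean by
    contrast * global std at every scale (a simple multi-scale AND vote)."""
    votes = np.ones(img.shape, dtype=bool)
    for r in scales:
        votes &= img < box_mean(img, r) - contrast * img.std()
    return votes
```

On a synthetic bright road with one dark crack row, the mask recovers the crack with essentially no false positives; real pavement images would need the texture modelling and Markovian segmentation the paper describes.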

  11. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  12. Differential belongings

    DEFF Research Database (Denmark)

    Oldrup, Helene

    2014-01-01

    This paper explores suburban middle-class residents’ narratives about housing choice, everyday life and belonging in residential areas of Greater Copenhagen, Denmark, to understand how residential processes of social differentiation are constituted. Using Savage et al.’s concepts of discursive...

  13. AUTO-LAY: automatic layout generation for procedure flow diagrams

    International Nuclear Information System (INIS)

    Forzano, P.; Castagna, P.

    1995-01-01

    Nuclear Power Plant procedures can be seen from essentially two viewpoints: the process and the information management. From the first point of view, it is important to supply the knowledge apt to solve problems connected with the control of the process; from the second one, the focus of attention is on the knowledge representation, its structure, elicitation and maintenance, and formal quality assurance. These two aspects of procedure representation can be considered and solved separately. In particular, methodological, formal and management issues require long and tedious activities that in most cases constitute a great barrier to procedure development and upgrade. To solve these problems, Ansaldo is developing DIAM, a wide integrated tool for procedure management to support procedure writing, updating, usage and documentation. One of the most challenging features of DIAM is AUTO-LAY, a CASE sub-tool that, in a completely automatic way, structures parts of or complete flow diagrams. This is a feature that is partially present in some other CASE products, which, anyway, do not allow complex graph handling and isomorphism between video and paper representation. AUTO-LAY has the unique prerogative to draw graphs of any complexity, to section them into pages, and to automatically compose a document. This has been recognized in the literature as the most important second-generation CASE improvement. (author). 5 refs., 9 figs

  14. Semi-automatic Data Integration using Karma

    Science.gov (United States)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of

  15. Automatic adjustment of astrochronologic correlations

    Science.gov (United States)

    Zeeden, Christian; Kaboth, Stefanie; Hilgen, Frederik; Laskar, Jacques

    2017-04-01

    Here we present an algorithm for the automated adjustment and optimisation of correlations between proxy data and an orbital tuning target (or similar datasets, e.g. ice models) for the R environment (R Development Core Team 2008), building on the 'astrochron' package (Meyers 2014). The basis of this approach is an initial tuning on orbital (precession, obliquity, eccentricity) scale. We use filters of orbital frequency ranges related to e.g. precession, obliquity or eccentricity of data and compare these filters to an ensemble of target data, which may consist of e.g. different combinations of obliquity and precession, different phases of precession and obliquity, a mix of orbital and other data (e.g. ice models), or different orbital solutions. This approach allows for the identification of an ideal mix of precession and obliquity to be used as tuning target. In addition, the uncertainty related to different tuning tie points (and also precession and obliquity contributions of the tuning target) can easily be assessed. Our message is to suggest an initial tuning and then obtain a reproducible tuned time scale, avoiding arbitrarily chosen tie points and replacing these by automatically chosen ones, representing filter maxima (or minima). We present and discuss the above outlined approach and apply it to artificial and geological data. Artificial data are assessed to find optimal filter settings; real datasets are used to demonstrate the possibilities of such an approach. References: Meyers, S.R. (2014). Astrochron: An R Package for Astrochronology. http://cran.r-project.org/package=astrochron R Development Core Team (2008). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0, URL http://www.R-project.org.
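
The idea of scanning mixes of obliquity and precession for the best-matching tuning target can be sketched as a one-parameter search. Idealized 41-kyr and 21-kyr sinusoids stand in here for real orbital solutions, and the correlation criterion is a simplification of the filter-based comparison described above.

```python
import numpy as np

def best_mix(proxy, t, ratios=np.linspace(0.0, 1.0, 21)):
    """Correlate the proxy with candidate targets a*obliquity + (1-a)*precession
    (periods 41 and 21 kyr, t in kyr) and return the ratio a maximizing the
    Pearson correlation."""
    obliquity = np.sin(2 * np.pi * t / 41.0)
    precession = np.sin(2 * np.pi * t / 21.0)
    return max(ratios, key=lambda a: np.corrcoef(
        proxy, a * obliquity + (1 - a) * precession)[0, 1])
```

For a proxy built as 70% obliquity plus 30% precession plus noise, the search recovers a mixing ratio near 0.7.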

  16. Quantification of aortic annulus in computed tomography angiography: Validation of a fully automatic methodology.

    Science.gov (United States)

    Gao, Xinpei; Boccalini, Sara; Kitslaar, Pieter H; Budde, Ricardo P J; Attrach, Mohamed; Tu, Shengxian; de Graaf, Michiel A; Ondrus, Tomas; Penicka, Martin; Scholte, Arthur J H A; Lelieveldt, Boudewijn P F; Dijkstra, Jouke; Reiber, Johan H C

    2017-08-01

    Automatic accurate measuring of the aortic annulus and determination of the optimal angulation of X-ray projection are important for the trans-catheter aortic valve replacement (TAVR) procedure. The objective of this study was to present a novel fully automatic methodology for the quantification of the aortic annulus in computed tomography angiography (CTA) images. CTA datasets of 26 patients were analyzed retrospectively with the proposed methodology, which consists of a knowledge-based segmentation of the aortic root and detection of the orientation and size of the aortic annulus. The accuracy of the methodology was determined by comparing the automatically derived results with the reference standard obtained by semi-automatic delineation of the aortic root and manual definition of the annulus plane. The difference between the automatic annulus diameter and the reference standard by observer 1 was 0.2±1.0mm, with an inter-observer variability of 1.2±0.6mm. The Pearson correlation coefficient for the diameter was good (0.92 for observer 1). For the first time, a fully automatic tool to assess the optimal projection curves was presented and validated. The mean difference between the optimal projection curves calculated based on the automatically defined annulus plane and the reference standard was 6.4° in the cranial/caudal (CRA/CAU) direction. The mean computation time was short with around 60s per dataset. The new fully automatic and fast methodology described in this manuscript not only provided precise measurements about the aortic annulus size with results comparable to experienced observers, but also predicted optimal X-ray projection curves from CTA images. Copyright © 2017 Elsevier B.V. All rights reserved.
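
Comparisons like the 0.2±1.0 mm diameter difference above require reducing a non-circular annulus contour to a single number; one common convention is the area-derived diameter. The shoelace-formula sketch below is a generic illustration, not the authors' pipeline.

```python
import numpy as np

def area_derived_diameter(contour):
    """Diameter of the circle whose area equals the shoelace area of a
    closed 2-D contour given as an (n, 2) array of points."""
    x, y = contour[:, 0], contour[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    return 2.0 * np.sqrt(area / np.pi)
```

For a circular contour of radius 12 mm the function returns a diameter of 24 mm to within polygonal error, so it reduces to the geometric diameter in the circular case.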

  17. Automatic twin vessel recrystallizer. Effective purification of acetaminophen by successive automatic recrystallization and absolute determination of purity by DSC.

    Science.gov (United States)

    Nara, Osamu

    2011-01-24

    I describe an interchangeable twin vessel (J, N) automatic glass recrystallizer that eliminates the time-consuming recovery and recycling of crystals for repeated recrystallization. The sample goes in the dissolution vessel J containing a magnetic stir-bar K; J is clamped to the upper joint H of recrystallizer body D. Empty crystallization vessel N is clamped to the lower joint M. Pure solvent is delivered to the dissolution vessel and the crystallization vessel via the head of the condenser A. The crystallization vessel is heated (P). The dissolution reservoir is stirred and heated by the solvent vapor (F). Continuous outflow of filtrate E out of J keeps N at a stable boiling temperature. This results in efficient dissolution, evaporation and separation of pure crystals Q. Pure solvent in the dissolution reservoir is recovered by suction. The empty dissolution and crystallization vessels are detached. The stirrer magnet is transferred to the crystallization vessel and the roles of the vessels are then reversed. Evacuating mother liquor out of the upper twin vessel, the apparatus unit is ready for the next automatic recrystallization by refilling the twin vessels with pure solvent. I show successive automatic recrystallization of acetaminophen from diethyl ether, obtaining acetaminophen of higher melting temperatures than USP and JP reference standards by 8× automatic recrystallization, with 96% yield at each stage. Also, I demonstrate a novel approach to the determination of absolute purity by combining the successive automatic recrystallization with differential scanning calorimetry (DSC) measurement, requiring no reference standards. This involves the measurement of the criterial melting temperature T(0) corresponding to the 100% pure material and quantitative ΔT in DSC based on the van't Hoff law of melting point depression. The purity of six commercial acetaminophen samples and reference standards and an eight times recrystallized product evaluated were 98.8 mol%, 97.9 mol%, 99
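
The van't Hoff relation underlying the purity determination gives, at fraction melted F = 1, an impurity mole fraction x₂ = ΔH_f·ΔT/(R·T₀²). The numbers below (T₀, ΔH_f, ΔT) are nominal illustrative values for an acetaminophen-like sample, not the paper's measurements.

```python
R = 8.314  # gas constant, J/(mol*K)

def vant_hoff_purity(t0_k, delta_t_k, dh_fus_j_mol):
    """Mole-fraction purity from melting-point depression at F = 1:
    x2 = dH_f * dT / (R * T0^2); purity = 1 - x2."""
    x2 = dh_fus_j_mol * delta_t_k / (R * t0_k ** 2)
    return 1.0 - x2

# Illustrative values: T0 ~ 443 K, dH_f ~ 27.1 kJ/mol, observed dT ~ 0.72 K
purity = vant_hoff_purity(t0_k=443.0, delta_t_k=0.72, dh_fus_j_mol=27100.0)
print(f"{100 * purity:.1f} mol%")  # -> 98.8 mol%
```

With these inputs the depression of 0.72 K corresponds to roughly 1.2 mol% impurity, i.e. a purity in the range the abstract reports for commercial samples.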

  18. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016, the 12th Portuguese Conference on Automatic Control, held in Guimarães, Portugal, September 14th to 16th, was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how automatic control can be used to increase the well-being of people.

  19. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  20. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  1. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  2. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. Differential discriminator

    International Nuclear Information System (INIS)

    Dukhanov, V.I.; Mazurov, I.B.

    1981-01-01

    A principal flowsheet of a differential discriminator intended for operation in a spectrometric circuit with statistical time distribution of pulses is described. The differential discriminator includes four integrated discriminators and a channel of piled-up signal rejection. The presence of the rejection channel enables the discriminator to operate effectively at loads of 14×10³ pulses/s. The temperature instability of the discrimination thresholds equals 250 μV/°C. The discrimination level is adjustable within 0.1–5 V; the level shift constitutes 0.5% for a filling ratio of 1:10. The rejection coefficient is not less than 90%. An alpha spectrum of a ²²⁸Th source is presented to evaluate the discriminator operation with the rejector. The rejector provides 50 ns time resolution
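
In software terms, the window-plus-rejection behaviour this record describes can be sketched as follows; the function, thresholds and pulse list are illustrative inventions, not the published circuit:

```python
# Illustrative sketch (not the published circuit): a differential (window)
# discriminator accepts only pulses whose amplitude lies between a lower and
# an upper threshold, while a rejection channel drops piled-up pulses.

def window_discriminate(pulses, lower, upper):
    """pulses: list of (amplitude_volts, piled_up) tuples.
    Returns the amplitudes accepted by the window discriminator."""
    return [amp for amp, piled_up in pulses
            if lower <= amp <= upper and not piled_up]

# Amplitudes in volts; the 0.1-5 V window mirrors the range quoted above.
pulses = [(0.05, False), (0.8, False), (1.2, False),
          (0.9, True),   # piled-up pulse: rejected regardless of amplitude
          (4.9, False), (5.2, False)]
print(window_discriminate(pulses, lower=0.1, upper=5.0))  # [0.8, 1.2, 4.9]
```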

  5. Differential topology

    CERN Document Server

    Margalef-Roig, J

    1992-01-01

    ...there are reasons enough to warrant a coherent treatment of the main body of differential topology in the realm of Banach manifolds, which is at the same time correct and complete. This book fills the gap: whenever possible the manifolds treated are Banach manifolds with corners. Corners add to the complications and the authors have carefully fathomed the validity of all main results at corners. Even in finite dimensions some results at corners are more complete and better thought out here than elsewhere in the literature. The proofs are correct and with all details. I see this book as a reliable monograph of a well-defined subject; the possibility to fall back to it adds to the feeling of security when climbing in the more dangerous realms of infinite dimensional differential geometry. Peter W. Michor

  6. Partial differential equations

    CERN Document Server

    Sloan, D; Süli, E

    2001-01-01

    Over the second half of the 20th century the subject area loosely referred to as numerical analysis of partial differential equations (PDEs) has undergone unprecedented development. At its practical end, the vigorous growth and steady diversification of the field were stimulated by the demand for accurate and reliable tools for computational modelling in physical sciences and engineering, and by the rapid development of computer hardware and architecture. At the more theoretical end, the analytical insight in

  7. Automatic safety rod for reactors. [LMFBR

    Science.gov (United States)

    Germer, J.H.

    1982-03-23

    An automatic safety rod for a nuclear reactor containing neutron-absorbing material and designed to be inserted into the reactor core after a loss of flow. Actuation occurs either upon a sudden decrease in core pressure drop or when the pressure drop falls below a predetermined minimum value. The automatic control rod includes a pressure-regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.
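
The actuation logic described (drop the rod on a sudden decrease in pressure drop, or when it falls below a minimum) can be sketched as a simple condition; the function name and threshold values are invented for illustration, not taken from the patent:

```python
# Illustrative sketch of the actuation condition, not the patented mechanism.
# The rod is inserted when the core pressure drop falls below a minimum, or
# when it is decreasing suddenly (large negative rate of change); a slow,
# controlled decrease trips neither condition.

def should_insert_rod(dp, dp_rate, dp_min=50.0, max_drop_rate=-20.0):
    """dp: core pressure drop (kPa); dp_rate: its rate of change (kPa/s).
    dp_min and max_drop_rate are hypothetical setpoints."""
    return dp < dp_min or dp_rate < max_drop_rate

print(should_insert_rod(dp=120.0, dp_rate=-5.0))   # False: normal operation
print(should_insert_rod(dp=40.0, dp_rate=-5.0))    # True: below minimum
print(should_insert_rod(dp=120.0, dp_rate=-30.0))  # True: sudden decrease
```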

  8. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  9. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious, are subject to capacity limitations, and can easily be adapted to situational circumstances. Both the perception of advertising and the way advertising influences brand evaluation involve both kinds of process. Automatic processes govern the recognition of advertising stimuli and the relevance decision which determines further higher-level processing...

  10. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  11. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  12. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    The way of automatic control of commercial computer programs is presented. The developed connection of the EXAFS spectrometer automatic system (which is managed by PC for DOS) is taken with the commercial program for the CCD detector control (which is managed by PC for Windows). The described complex system is used for the automation of intermediate amplitude spectra processing in EXAFS spectrum measurements at Kurchatov SR source

  13. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing alternative to classical continuation methods. Automatic continuation also generally obtains better designs than the classical formulation, using a reduced number of iterations.

  14. Hepatocyte differentiation.

    Science.gov (United States)

    Olsavsky Goyak, Katy M; Laurenzana, Elizabeth M; Omiecinski, Curtis J

    2010-01-01

    Increasingly, research suggests that for certain systems, animal models are insufficient for human toxicology testing. The development of robust, in vitro models of human toxicity is required to decrease our dependence on potentially misleading in vivo animal studies. A critical development in human toxicology testing is the use of human primary hepatocytes to model processes that occur in the intact liver. However, in order to serve as an appropriate model, primary hepatocytes must be maintained in such a way that they persist in their differentiated state. While many hepatocyte culture methods exist, the two-dimensional collagen "sandwich" system combined with a serum-free medium, supplemented with physiological glucocorticoid concentrations, appears to robustly maintain hepatocyte character. Studies in rat and human hepatocytes have shown that when cultured under these conditions, hepatocytes maintain many markers of differentiation, including morphology, expression of plasma proteins, hepatic nuclear factors, and phase I and II metabolic enzymes. Functionally, these culture conditions also preserve hepatic stress response pathways, such as the SAPK and MAPK pathways, as well as prototypical xenobiotic induction responses. This chapter will briefly review culture methodologies but will primarily focus on the hallmark hepatocyte structural, expression and functional markers that characterize the differentiation status of the hepatocyte.

  15. MM98.19 An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; Chiffre, Leonardo De

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces in connection with programming of a CNC lathe to scan different feeds, speeds and cutting depths. An evaluation of the system based on a total of 1671 experiments has shown that unfavourable snarled chips can be detected with 98% certainty, which indeed makes the system a valuable tool in chip breakability tests. Using the system, chip breaking diagrams can be elaborated with a previously...

  16. A bottom-up approach to automatically configured Tango control systems

    International Nuclear Information System (INIS)

    Rubio-Manrique, S.; Beltran, D.; Costa, I.; Fernandez-Carreiras, D.; Gigante, J.V.; Klora, J.; Matilla, O.; Ranz, R.; Ribas, J.; Sanchez, O.

    2012-01-01

    Alba is the first synchrotron light source built in Spain. Most of the Alba control system has been developed on top of the Tango control system. A total of 5531 devices are controlled in the Alba accelerators (linac, booster and storage ring) using 150 Linux PCs. Alba maintains a central repository, the so-called 'Cabling and Controls database' (CCDB), which keeps the inventory of equipment, cables, connections, and their configuration and technical specifications. The valuable information kept in this MySQL database enables tools to automatically create and configure Tango devices and other software components of the control systems of the accelerators, beamlines and laboratories. This paper describes the process involved in this automatic setup

  17. Automatic Implementation of TTEthernet-Based Time-Triggered Avionics Applications

    Science.gov (United States)

    Gorcitz, Raul Adrian; Carle, Thomas; Lesens, David; Monchaux, David; Potop-Butucaruy, Dumitru; Sorel, Yves

    2015-09-01

    The design of safety-critical embedded systems such as those used in avionics still involves largely manual phases. But in avionics the definition of standard interfaces embodied in standards such as ARINC 653 or TTEthernet should allow the definition of fully automatic code generation flows that reduce the costs while improving the quality of the generated code, much like compilers have done when replacing manual assembly coding. In this paper, we briefly present such a fully automatic implementation tool, called Lopht, for ARINC653-based time-triggered systems, and then explain how it is currently extended to include support for TTEthernet networks.

  18. Automatic lymphoma classification with sentence subgraph mining from pathology reports.

    Science.gov (United States)

    Luo, Yuan; Sohani, Aliyah R; Hochberg, Ephraim P; Szolovits, Peter

    2014-01-01

    Pathology reports are rich in narrative statements that encode a complex web of relations among medical concepts. These relations are routinely used by doctors to reason on diagnoses, but often require hand-crafted rules or supervised learning to extract into prespecified forms for computational disease modeling. We aim to automatically capture relations from narrative text without supervision. We design a novel framework that translates sentences into graph representations, automatically mines sentence subgraphs, reduces redundancy in mined subgraphs, and automatically generates subgraph features for subsequent classification tasks. To ensure meaningful interpretations over the sentence graphs, we use the Unified Medical Language System Metathesaurus to map token subsequences to concepts, and in turn sentence graph nodes. We test our system with multiple lymphoma classification tasks that together mimic the differential diagnosis by a pathologist. To this end, we prevent our classifiers from looking at explicit mentions or synonyms of lymphomas in the text. We compare our system with three baseline classifiers using standard n-grams, full MetaMap concepts, and filtered MetaMap concepts. Our system achieves high F-measures on multiple binary classifications of lymphoma (Burkitt lymphoma, 0.8; diffuse large B-cell lymphoma, 0.909; follicular lymphoma, 0.84; Hodgkin lymphoma, 0.912). Significance tests show that our system outperforms all three baselines. Moreover, feature analysis identifies subgraph features that contribute to improved performance; these features agree with the state-of-the-art knowledge about lymphoma classification. We also highlight how these unsupervised relation features may provide meaningful insights into lymphoma classification. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
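
As a much-simplified sketch of the sentence-graph idea (the concept table and report sentences below are invented stand-ins for the UMLS Metathesaurus mapping and the real subgraph mining):

```python
# Toy sketch: map tokens to concept IDs, link consecutive concepts within a
# sentence, and use the resulting edge set as graph features for a later
# classifier. The concept table is a hypothetical stand-in for a UMLS mapping.

TOY_CONCEPTS = {
    "diffuse": "C_DIFFUSE",
    "large": "C_LARGE_CELL", "cells": "C_LARGE_CELL",
    "cd20": "C_CD20", "positive": "C_POSITIVE",
}

def sentence_edges(sentence):
    """Return the set of concept-pair edges found in one sentence."""
    concepts = [TOY_CONCEPTS[w] for w in sentence.lower().split()
                if w in TOY_CONCEPTS]
    # Consecutive, distinct concepts become graph edges.
    return {(a, b) for a, b in zip(concepts, concepts[1:]) if a != b}

report = ["Diffuse large cells noted", "CD20 positive staining"]
features = set()
for s in report:
    features |= sentence_edges(s)
print(sorted(features))  # [('C_CD20', 'C_POSITIVE'), ('C_DIFFUSE', 'C_LARGE_CELL')]
```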

  19. Automatic grade classification of Barrett's Esophagus through feature enhancement

    Science.gov (United States)

    Ghatwary, Noha; Ahmed, Amr; Ye, Xujiong; Jalab, Hamid

    2017-03-01

    Barrett's Esophagus (BE) is a precancerous condition that affects the esophagus tube and carries the risk of developing esophageal adenocarcinoma. BE is the process of developing metaplastic intestinal epithelium that replaces the normal cells in the esophageal area. The detection of BE is considered difficult due to its appearance and properties. The diagnosis is usually done through both endoscopy and biopsy. Recently, Computer Aided Diagnosis systems have been developed to support physicians' opinions when facing difficulty in detection/classification of different types of diseases. In this paper, an automatic classification of the Barrett's Esophagus condition is introduced. The presented method enhances the internal features of a Confocal Laser Endomicroscopy (CLE) image by utilizing a proposed enhancement filter. This filter depends on fractional differentiation and integration to improve the features in the discrete wavelet transform of an image. Later on, various features are extracted from each enhanced image at different levels for the multi-classification process. Our approach is validated on a dataset that consists of a group of 32 patients with 262 images of different histology grades. The experimental results demonstrated the efficiency of the proposed technique. Our method helps clinicians achieve a more accurate classification. This potentially helps to reduce the number of biopsies needed for diagnosis, facilitates the regular monitoring of treatment and the development of the patient's case, and can help train doctors with the new endoscopy technology. Accurate automatic classification is particularly important for the Intestinal Metaplasia (IM) type, which can develop into deadly cancer. Hence, this work contributes to automatic classification that facilitates early intervention/treatment and decreases the biopsy samples needed.

  20. Automatic visual tracking and social behaviour analysis with multiple mice.

    Directory of Open Access Journals (Sweden)

    Luca Giancardo

    Social interactions are made of complex behavioural actions that might be found in all mammals, including humans and rodents. Recently, mouse models are increasingly being used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts for each frame and for each mouse in the cage one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple-mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to ours, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57BL/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different

  1. [OISO, automatic treatment of patients management in oncogenetics].

    Science.gov (United States)

    Guien, Céline; Fabre, Aurélie; Lagarde, Arnaud; Salgado, David; Gensollen-Thiriez, Catherine; Zattara, Hélène; Beroud, Christophe; Olschwang, Sylviane

    Oncogenetics is a long-term process which requires a close relationship between patients and medical teams, and good familial links allowing lifetime follow-up. Numerous documents are exchanged within the medical team, which has to interact frequently. We present here a new tool that has been conceived specifically for this management. The tool has been developed according to a model-view-controller approach with the relational system PostgreSQL 9.3. The web site uses the PHP 5.3, HTML5 and CSS3 languages, completed with JavaScript and jQuery-AJAX functions and two additional modules, FPDF and PHPMailer. The tool allows multiple interactions, clinical data management, mailing and emailing, and follow-up planning. Queries make it possible to follow all patients and plannings automatically, to send information to a large number of patients or physicians, and to report activity. The tool has been designed for oncogenetics and adapted to its different aspects. The CNIL delivered an authorization for use. Secured web access allows management at a regional level. Its simple concept makes it evolutive according to the constant updates of the genetic and clinical management of patients. Copyright © 2017 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  2. What differentiates a differential psychopharmacology?

    OpenAIRE

    Krüger, Hans-Peter

    2010-01-01

    The methodological implications of a differential psychopharmacology are discussed. It is shown that the technique of stratifying subjects with personality scores depends on one basic assumption: the personality score is not affected by the other experimental factors. Two experiments are reported in which pre- and posttest (after the experiment) scores were measured. The pre-post-differences showed themselves to be affected by the medication. It is argued that in psychopharmacological experim...

  3. Implantation of automatic cardioverter-defibrillators via median sternotomy.

    Science.gov (United States)

    Brodman, R; Fisher, J D; Furman, S; Johnston, D R; Kim, S G; Matos, J A; Waspe, L E

    1984-11-01

    Fifteen AICD (automatic implantable cardioverter-defibrillator) Model B units were implanted in 10 patients. The median sternotomy is our preferred surgical approach, using a right atrial patch electrode, a left ventricular apex patch electrode, and two closely placed epicardial sensing electrodes. Follow-up is 109 patient-months and all patients are alive. AICD units discharged for ventricular tachycardia, ventricular flutter, and ventricular fibrillation. Discharges also occurred for sinus tachycardia and atrial fibrillation above the rate limit in three units. Premature pulse generator depletion has occurred in four AICD-B units 3 to 18 months postimplant and appears due to a defect in the original battery design. Discharge of the AICD for supraventricular tachycardia is a problem that will remain until a better means of differentiating supraventricular tachycardia from ventricular tachyarrhythmias is found. The AICD appears to prevent sudden death from ventricular tachyarrhythmias.

  4. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    Title 46 (Shipping), § 63.25-1 (2010-10-01), under Requirements for Specific Types of Automatic Auxiliary Boilers: Small automatic auxiliary boilers, defined as having heat-input ratings of 400,000 Btu/hr...

  5. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    Title 30 (Mineral Resources), § 77.314 (2010-07-01), under UNDERGROUND COAL MINES, Thermal Dryers: Automatic temperature control instruments. (a) Automatic temperature control instruments for the thermal dryer system shall be of the recording type. (b) Automatic...

  6. Validation tools for image segmentation

    Science.gov (United States)

    Padfield, Dirk; Ross, James

    2009-02-01

    A large variety of image analysis tasks require the segmentation of various regions in an image. For example, segmentation is required to generate accurate models of brain pathology that are important components of modern diagnosis and therapy. While the manual delineation of such structures gives accurate information, the automatic segmentation of regions such as the brain and tumors from such images greatly enhances the speed and repeatability of quantifying such structures. The ubiquitous need for such algorithms has led to a wide range of image segmentation algorithms with various assumptions, parameters, and robustness. The evaluation of such algorithms is an important step in determining their effectiveness. Therefore, rather than developing new segmentation algorithms, we here describe validation methods for segmentation algorithms. Using similarity metrics comparing the automatic to manual segmentations, we demonstrate methods for optimizing the parameter settings for individual cases and across a collection of datasets using the Design of Experiment framework. We then employ statistical analysis methods to compare the effectiveness of various algorithms. We investigate several region-growing algorithms from the Insight Toolkit and compare their accuracy to that of a separate statistical segmentation algorithm. The segmentation algorithms are used with their optimized parameters to automatically segment the brain and tumor regions in MRI images of 10 patients. The validation tools indicate that none of the ITK algorithms studied outperforms the statistical segmentation algorithm with statistical significance, although they perform reasonably well considering their simplicity.
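
The overlap metrics such a validation study relies on can be sketched directly; the Dice and Jaccard definitions below are standard, while the pixel sets are invented toy data:

```python
# Standard overlap metrics for comparing an automatic segmentation against a
# manual one. Masks are represented as sets of pixel coordinates for clarity;
# real pipelines would use boolean image arrays instead.

def dice(a, b):
    """Dice coefficient: 2|A∩B| / (|A| + |B|); 1.0 means perfect overlap."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

def jaccard(a, b):
    """Jaccard index: |A∩B| / |A∪B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

manual = {(0, 0), (0, 1), (1, 0), (1, 1)}   # toy manual delineation
auto = {(0, 1), (1, 0), (1, 1), (2, 1)}     # toy automatic segmentation
print(dice(manual, auto))     # 0.75
print(jaccard(manual, auto))  # 0.6
```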

  7. Differential geometry

    CERN Document Server

    Ciarlet, Philippe G

    2007-01-01

    This book gives the basic notions of differential geometry, such as the metric tensor, the Riemann curvature tensor, the fundamental forms of a surface, covariant derivatives, and the fundamental theorem of surface theory in a selfcontained and accessible manner. Although the field is often considered a classical one, it has recently been rejuvenated, thanks to the manifold applications where it plays an essential role. The book presents some important applications to shells, such as the theory of linearly and nonlinearly elastic shells, the implementation of numerical methods for shells, and

  8. Differential equations

    CERN Document Server

    Tricomi, FG

    2013-01-01

    Based on his extensive experience as an educator, F. G. Tricomi wrote this practical and concise teaching text to offer a clear idea of the problems and methods of the theory of differential equations. The treatment is geared toward advanced undergraduates and graduate students and addresses only questions that can be resolved with rigor and simplicity.Starting with a consideration of the existence and uniqueness theorem, the text advances to the behavior of the characteristics of a first-order equation, boundary problems for second-order linear equations, asymptotic methods, and diff

  9. Differential topology

    CERN Document Server

    Guillemin, Victor

    2010-01-01

    Differential Topology provides an elementary and intuitive introduction to the study of smooth manifolds. In the years since its first publication, Guillemin and Pollack's book has become a standard text on the subject. It is a jewel of mathematical exposition, judiciously picking exactly the right mixture of detail and generality to display the richness within. The text is mostly self-contained, requiring only undergraduate analysis and linear algebra. By relying on a unifying idea-transversality-the authors are able to avoid the use of big machinery or ad hoc techniques to establish the main

  10. Theory of control systems described by differential inclusions

    CERN Document Server

    Han, Zhengzhi; Huang, Jun

    2016-01-01

    This book provides a brief introduction to the theory of finite dimensional differential inclusions, and deals in depth with control of three kinds of differential inclusion systems. The authors introduce the algebraic decomposition of convex processes, the stabilization of polytopic systems, and observation of Lur'e systems. They also introduce the elemental theory of finite dimensional differential inclusions, and the properties and designs of the control systems described by differential inclusions. Addressing the material with clarity and simplicity, the book includes recent research achievements and spans all concepts, concluding with a critical mathematical framework. This book is intended for researchers, teachers and postgraduate students in the area of automatic control engineering.

  11. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness; the primary carbides provide resistance against abrasive wear and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements serves primarily two purposes: (i) to improve the hardenability and (ii) to provide harder and thermally more stable carbides than cementite. Assuming proper heat treatment, the properties of a tool steel depend on which alloying elements are added and their respective concentrations.

  12. Perceptions about Implementation of Differentiated Instruction

    Science.gov (United States)

    Robinson, Lora; Maldonado, Nancy; Whaley, Jerita

    2014-01-01

    The absence of differentiated instruction in many classrooms stifles success for students who do not learn the same way as their peers. Providing teachers with the knowledge and tools to differentiate in their classrooms may increase test scores and help low achieving students find success, while expanding the learning growth of gifted and…

  13. Hyperbolic differential operators and related problems

    CERN Document Server

    Ancona, Vincenzo

    2003-01-01

    Presenting research from more than 30 international authorities, this reference provides a complete arsenal of tools and theorems to analyze systems of hyperbolic partial differential equations. The authors investigate a wide variety of problems in areas such as thermodynamics, electromagnetics, fluid dynamics, differential geometry, and topology. Renewing thought in the field of mathematical physics, Hyperbolic Differential Operators defines the notion of pseudosymmetry for matrix symbols of order zero as well as the notion of time function. Surpassing previously published material on the top

  14. Automatic analog IC sizing and optimization constrained with PVT corners and layout effects

    CERN Document Server

    Lourenço, Nuno; Horta, Nuno

    2017-01-01

    This book introduces readers to a variety of tools for automatic analog integrated circuit (IC) sizing and optimization. The authors provide a historical perspective on the early methods proposed to tackle automatic analog circuit sizing, with emphasis on the methodologies to size and optimize the circuit, and on the methodologies to estimate the circuit's performance. The discussion also includes robust circuit design and optimization and the most recent advances in layout-aware analog sizing approaches. The authors describe a methodology for an automatic flow for analog IC design, including details of the inputs and interfaces, multi-objective optimization techniques, and the enhancements made in the base implementation by using machine learning techniques. The Gradient model is discussed in detail, along with the methods to include layout effects in the circuit sizing. The concepts and algorithms of all the modules are thoroughly described, enabling readers to reproduce the methodologies, improve the qual...

  15. Automatic digital photo-book making system

    Science.gov (United States)

    Wang, Wiley; Teo, Patrick; Muzzolini, Russ

    2010-02-01

    The diversity of photo products has grown more than ever before. A group of photos is not only printed individually, but can also be arranged in a specific order to tell a story, such as in a photo book, a calendar, or a poster collage. Similar to making a traditional scrapbook, digital photo book tools allow the user to choose a book style/theme, page layouts, backgrounds, and the way the pictures are arranged. This process is often time consuming for users, given the number of images and the choices of layout/background combinations. In this paper, we developed a system to automatically generate photo books with only a few initial selections required. The system utilizes time stamps, color indices, orientations, and other image properties to best fit pictures into a final photo book. The common way of telling a story is to lay the pictures out in chronological order. Pictures that are proximate in time are often logically related, so they naturally cluster along a time line. Breaks between clusters can be used as a guide to separate pages or spreads; thus, pictures that are logically related can stay close on the same page or spread. When people are making a photo book, it is helpful to start with chronologically grouped images, but time alone won't be enough to complete the process. Each page is limited by the number of layouts available. Many aesthetic rules also apply, such as emphasis of preferred pictures, consistency of local image density throughout the whole book, matching a background to the content of the images, and variety in adjacent page layouts. We developed an algorithm to group images onto pages under the constraints of these aesthetic rules. We also apply content analysis based on the color and blurriness of each picture to match backgrounds and to adjust page layouts. Some of our aesthetic rules are fixed and given by designers. Other aesthetic rules are statistical models trained by using
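    The time-line clustering described in the abstract can be sketched as a gap-based grouping of timestamps: a new page cluster starts wherever the interval between consecutive shots exceeds a threshold. The function name and the two-hour gap below are illustrative assumptions, not values from the paper.

    ```python
    from datetime import datetime, timedelta

    def cluster_by_time(timestamps, gap=timedelta(hours=2)):
        """Group sorted photo timestamps into page clusters, starting a new
        cluster wherever the gap between consecutive shots exceeds the
        threshold (the 2-hour default is a hypothetical choice)."""
        clusters = [[timestamps[0]]]
        for prev, curr in zip(timestamps, timestamps[1:]):
            if curr - prev > gap:
                clusters.append([])
            clusters[-1].append(curr)
        return clusters

    # Three morning shots and one afternoon shot fall into two clusters,
    # suggesting a page (or spread) break between them.
    shots = [datetime(2010, 2, 1, h) for h in (9, 9, 10, 15)]
    pages = cluster_by_time(sorted(shots))
    ```

    The real system layers aesthetic constraints (layout availability, image density, background matching) on top of this initial chronological grouping.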

  16. Automatic Power Line Inspection Using UAV Images

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-08-01

    Full Text Available Power line inspection ensures the safe operation of a power transmission grid. Using unmanned aerial vehicle (UAV) images of power line corridors is an effective way to carry out these vital inspections. In this paper, we propose an automatic inspection method for power lines using UAV images. This method, known as the power line automatic measurement method based on epipolar constraints (PLAMEC), acquires the spatial position of the power lines. Then, the semi patch matching based on epipolar constraints (SPMEC) dense matching method is applied to automatically extract dense point clouds within the power line corridor. Obstacles can then be automatically detected by calculating the spatial distance between a power line and the point cloud representing the ground. Experimental results show that the PLAMEC automatically measures power lines effectively, with a measurement accuracy consistent with that of manual stereo measurements. The height root mean square (RMS) error of the point cloud was 0.233 m, and the RMS error of the power line was 0.205 m. In addition, we verified the detected obstacles in the field and measured the distance between the canopy and power line using a laser range finder. The results show that the difference between these two distances was within ±0.5 m.
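    The obstacle-detection step reduces to a geometric test: flag any point in the corridor cloud whose 3-D distance to a power-line segment is below a clearance threshold. A minimal sketch of that test follows; the 4 m clearance and function names are assumptions for illustration, not values from the paper.

    ```python
    import math

    def point_to_segment_distance(p, a, b):
        """3-D distance from point p to the line segment a-b."""
        ab = tuple(bi - ai for ai, bi in zip(a, b))
        ap = tuple(pi - ai for ai, pi in zip(a, p))
        denom = sum(c * c for c in ab)
        # Clamp the projection parameter to [0, 1] to stay on the segment.
        t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(u * v for u, v in zip(ap, ab)) / denom))
        closest = tuple(ai + t * c for ai, c in zip(a, ab))
        return math.dist(p, closest)

    def find_obstacles(cloud, line_a, line_b, clearance=4.0):
        """Return cloud points closer to the line segment than the clearance
        (the 4 m threshold is hypothetical)."""
        return [p for p in cloud if point_to_segment_distance(p, line_a, line_b) < clearance]
    ```

    In the actual method, the segment endpoints come from the PLAMEC power-line measurement and the cloud from SPMEC dense matching; this sketch only shows the distance check itself.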

  17. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers
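    The discriminating power of activity ratios comes from the very different half-lives of the four isotopes: because each activity decays as 0.5^(t/T½), the ratio of two isotopes drifts predictably with time since release. A minimal sketch using the half-lives quoted in the abstract (the initial ratio of 1.0 is purely illustrative):

    ```python
    # Half-lives from the abstract, converted to hours.
    HALF_LIVES_H = {
        "Xe-131m": 11.9 * 24,
        "Xe-133m": 2.19 * 24,
        "Xe-133": 5.24 * 24,
        "Xe-135": 9.10,
    }

    def decay_factor(isotope, hours):
        """Fraction of the initial activity remaining after `hours` of decay."""
        return 0.5 ** (hours / HALF_LIVES_H[isotope])

    def ratio_after(iso_a, iso_b, hours, initial_ratio=1.0):
        """A/B activity ratio after decay; the initial ratio is a hypothetical
        starting value, not a real source term."""
        return initial_ratio * decay_factor(iso_a, hours) / decay_factor(iso_b, hours)
    ```

    For example, after one 135Xe half-life (9.10 h), a 133Xe/135Xe ratio nearly doubles, since 133Xe barely decays over that interval.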

  18. Design tools

    Science.gov (United States)

    Anton TenWolde; Mark T. Bomberg

    2009-01-01

    Overall, despite the lack of exact input data, the use of design tools, including models, is far superior to simply following rules of thumb, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can only be made for buildings in the same climate, with similar occupancy, and similar envelope construction. This chapter...

  19. Multibody simulation of vehicles equipped with an automatic transmission

    Science.gov (United States)

    Olivier, B.; Kouroussis, G.

    2016-09-01

    Automotive vehicles remain one of the most widely used modes of transportation. Furthermore, automatic transmissions are increasingly used to provide better driving comfort and a potential optimization of engine performance (by placing the gear shifts at specific engine and vehicle speeds). This paper presents an effective modeling of the vehicle using the multibody methodology (numerically computed under EasyDyn, an open-source, in-house library dedicated to multibody simulations). However, the transmission part of the vehicle is described by the usual equations of motion computed using a systematic matrix approach: del Castillo's methodology for planetary gear trains. By coupling the analytic equations of the transmission with the equations computed by the multibody methodology, the performance of any vehicle can be obtained if the characteristics of each element in the vehicle are known. The multibody methodology makes it possible to extend the vehicle model from 1D motion to 3D motion by taking rotations into account and implementing tire models. The modeling presented in this paper remains very efficient and provides an easy and quick vehicle simulation tool which could be used to calibrate the automatic transmission.
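    The abstract cites del Castillo's matrix methodology for planetary gear trains; the simplest scalar instance of the kinematics involved is the Willis relation for a single planetary set, (ω_sun − ω_c)/(ω_ring − ω_c) = −Z_ring/Z_sun. The sketch below solves it for the carrier speed; the tooth counts in the usage example are hypothetical.

    ```python
    def carrier_speed(omega_sun, omega_ring, z_sun, z_ring):
        """Carrier angular speed of a simple planetary gear set from the
        Willis relation:
            (omega_sun - omega_c) / (omega_ring - omega_c) = -z_ring / z_sun
        Solving for omega_c gives a tooth-count-weighted average."""
        return (z_sun * omega_sun + z_ring * omega_ring) / (z_sun + z_ring)

    # Hypothetical first-gear configuration: ring gear held fixed,
    # 30-tooth sun driven at 4000 rpm, 90-tooth ring -> 4:1 reduction.
    omega_c = carrier_speed(4000.0, 0.0, 30, 90)
    ```

    An automatic transmission chains several such sets, with clutches and brakes selecting which member is held or driven in each gear; del Castillo's approach assembles the corresponding relations into a single matrix system.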

  20. Automatic sentence extraction for the detection of scientific paper relations

    Science.gov (United States)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnections between scientific papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, performance improvements achieved to date, and tools or data typically used in research in specific fields. So far, methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, while the paper relations are identified based on the citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
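    Since the paper identifies relations from citation sentences, the core extraction step amounts to splitting a document into sentences and keeping those that contain a citation marker. A minimal sketch under simplifying assumptions (naive punctuation-based sentence splitting, and only numeric-bracket and author-year citation patterns):

    ```python
    import re

    # Matches "[3]"-style numeric citations and "(Smith, 2004)"-style
    # author-year citations; real citation styles are more varied.
    CITATION = re.compile(r"\[\d+\]|\(\w+(?: et al\.)?,? \d{4}\)")

    def citation_sentences(text):
        """Split text into sentences on terminal punctuation (a deliberate
        simplification) and keep sentences containing a citation marker."""
        sentences = re.split(r"(?<=[.!?])\s+", text)
        return [s for s in sentences if CITATION.search(s)]
    ```

    A production system would replace both regexes with a proper sentence segmenter and a citation parser, but the filtering structure stays the same.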