WorldWideScience

Sample records for automatic differentiation tools

  1. Automatic differentiation as a tool in engineering design

    Science.gov (United States)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
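As a minimal illustration of the two modes discussed in this record (our sketch in Python, not the paper's tools), consider forward and reverse accumulation of the chain rule for f(x1, x2) = sin(x1 * x2):

```python
# Minimal illustration of the two AD modes (a generic sketch, not the
# paper's tools) for f(x1, x2) = sin(x1 * x2).
import math

x1, x2 = 2.0, 3.0

# Shared evaluation trace: v1 = x1 * x2, y = sin(v1)
v1 = x1 * x2
y = math.sin(v1)

# Forward mode: propagate a tangent alongside the trace; one sweep is
# needed per input, so cost grows with the number of inputs.
dv1 = 1.0 * x2                 # seed dx1 = 1, dx2 = 0
dy_dx1 = math.cos(v1) * dv1

# Reverse mode: one backward sweep yields derivatives with respect to
# all inputs at once; cost grows with the number of outputs, at the
# price of storing the trace (the storage issue noted in the abstract).
y_bar = 1.0
v1_bar = y_bar * math.cos(v1)
x1_bar, x2_bar = v1_bar * x2, v1_bar * x1

print(dy_dx1, x1_bar, x2_bar)  # dy/dx1 twice, then dy/dx2
```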

  2. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.]

    1992-07-01

This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  3. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

Automatic differentiation is a relatively recent technique for the differentiation of functions; it is applied directly to the source code that computes the function, written in standard programming languages. The technique automates the differentiation step, which is crucial for dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (up to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but applies them to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of automatic differentiation on the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code, and the incorporation of AD tools into consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.

  4. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

Automatic differentiation is a method of computing the derivatives of functions, to any order, in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are found automatically. The method is illustrated by simple examples. Source code in FORTRAN is provided.

  5. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

Statistical learning has been attracting increasing interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: automatic differentiation tools, which quickly build DAGs of computation that are fully differentiable, focusing on one such tool, PyTorch; easy deployment of trained neural networks into large systems with many constraints, for example deploying a model at the reconstruction phase, where the neural network has to be integrated into CERN's bulk data-processing C++-only environment; and some recent models in deep learning for segmentation and generation that might be useful for particle-physics problems.
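As a brief illustration of the kind of tool discussed in the talk (our sketch, not from the talk itself), the following builds a small differentiable computation with PyTorch's autograd and runs a reverse-mode sweep:

```python
# Sketch of reverse-mode AD with PyTorch's autograd: each operation
# extends a DAG of computation behind the scenes.
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
w = torch.tensor([0.5, -0.5, 1.0], requires_grad=True)

y = torch.tanh(w @ x).sum()   # forward pass records the graph

y.backward()                  # reverse-mode sweep through the DAG
print(x.grad)                 # dy/dx
print(w.grad)                 # dy/dw
```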

  6. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie the successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms.

  7. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods...

  8. Automatic differentiation in geophysical inverse problems

    Science.gov (United States)

    Sambridge, M.; Rickwood, P.; Rawlinson, N.; Sommacal, S.

    2007-07-01

Automatic differentiation (AD) is the technique whereby output variables of a computer code evaluating any complicated function (e.g. the solution to a differential equation) can be differentiated with respect to the input variables. Often AD tools take the form of source-to-source translators and produce computer code without the need for deriving and hand-coding explicit mathematical formulae by the user. The power of AD lies in the fact that it combines the generality of finite-difference techniques with the accuracy and efficiency of analytical derivatives, while at the same time eliminating 'human' coding errors. It also provides the possibility of accurate, efficient derivative calculation from complex 'forward' codes where no analytical derivatives are possible and finite-difference techniques are too cumbersome. AD is already having a major impact in areas such as optimization, meteorology and oceanography. Similarly, it has considerable potential for use in non-linear inverse problems in geophysics where linearization is desirable, or for sensitivity analysis of large numerical simulation codes, for example, wave propagation and geodynamic modelling. At present, however, AD tools appear to be little used in the geosciences. Here we report on experiments using a state-of-the-art AD tool to perform source-to-source code translation in a range of geoscience problems. These include calculating derivatives for Gibbs free energy minimization, seismic receiver function inversion, and seismic ray tracing. Issues of accuracy and efficiency are discussed.

  9. Higher-order automatic differentiation of mathematical functions

    Science.gov (United States)

    Charpentier, Isabelle; Dal Cappello, Claude

    2015-04-01

Functions of mathematical physics such as the Bessel functions, the Chebyshev polynomials, the Gauss hypergeometric function, and so forth have practical applications in many scientific domains. On the one hand, the differentiation formulas provided in reference books apply to real or complex variables and do not account for the chain rule. On the other hand, automatic differentiation, which is based on the chain rule, has become a natural tool in numerical modeling. Nevertheless, automatic differentiation tools do not deal with many of these mathematical functions. This paper describes formulas and provides codes for the higher-order automatic differentiation of mathematical functions. The first method is based on Faà di Bruno's formula, which generalizes the chain rule. The second makes use of the second-order differential equations these functions satisfy. Both methods are exemplified with the aforementioned functions.
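The flavor of higher-order forward AD can be sketched with truncated Taylor-coefficient arithmetic (our generic illustration in Python, not the paper's codes); products use the Leibniz convolution, and exp uses the standard recurrence derived from the differential equation e' = u'e:

```python
# Higher-order forward AD by propagating truncated Taylor coefficients
# (an illustrative sketch, not the paper's method).
import math

N = 5  # number of Taylor coefficients kept: f(a), f'(a), f''(a)/2!, ...

def taylor_var(a):
    # coefficients of x expanded about a: [a, 1, 0, 0, ...]
    return [a, 1.0] + [0.0] * (N - 2)

def tmul(u, v):
    # Leibniz/Cauchy product of truncated series
    return [sum(u[j] * v[k - j] for j in range(k + 1)) for k in range(N)]

def texp(u):
    # Recurrence from e' = u'e:  k*e_k = sum_{j=1..k} j*u_j*e_{k-j}
    e = [math.exp(u[0])] + [0.0] * (N - 1)
    for k in range(1, N):
        e[k] = sum(j * u[j] * e[k - j] for j in range(1, k + 1)) / k
    return e

x = taylor_var(0.5)
y = texp(tmul(x, x))                              # f(x) = exp(x^2)
print([math.factorial(k) * y[k] for k in range(N)])  # f(a), f'(a), ...
```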

  10. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

The goal of this article is to demonstrate the applicability, and to discuss the advantages and disadvantages, of automatic differentiation in topology optimization. The technique makes it possible to wholly or partially automate the evaluation of derivatives for optimization problems and is demonstrated...

  11. Automatic Differentiation and its Program Realization

    Czech Academy of Sciences Publication Activity Database

    Hartman, J.; Lukšan, Ladislav; Zítko, J.

    2009-01-01

    Roč. 45, č. 5 (2009), s. 865-883 ISSN 0023-5954 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : automatic differentiation * modeling languages * systems of optimization Subject RIV: BA - General Mathematics Impact factor: 0.445, year: 2009 http://dml.cz/handle/10338.dmlcz/140037

  12. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

...and is demonstrated on two separate, previously published types of problems in topology optimization. Two separate software packages for automatic differentiation, CoDiPack and Tapenade, are considered, and their performance and usability trade-offs are discussed and compared to a hand-coded adjoint gradient...

  13. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  14. TMB: Automatic Differentiation and Laplace Approximation

    Directory of Open Access Journals (Sweden)

    Kasper Kristensen

    2016-04-01

TMB is an open source R package that enables quick implementation of complex nonlinear random effects (latent variable) models in a manner similar to the established AD Model Builder package (ADMB, http://admb-project.org/; Fournier et al. 2011). In addition, it offers easy access to parallel computations. The user defines the joint likelihood for the data and the random effects as a C++ template function, while all the other operations are done in R, e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood, where the random effects are automatically integrated out. This approximation, and its derivatives, are obtained using automatic differentiation (up to order three) of the joint likelihood. The computations are designed to be fast for problems with many random effects (≈ 10^6) and parameters (≈ 10^3). Computation times using ADMB and TMB are compared on a suite of examples ranging from simple models to large spatial models where the random effects are a Gaussian random field. Speedups ranging from 1.5 to about 100 are obtained, with increasing gains for large problems. The package and examples are available at http://tmb-project.org/.

  15. PASTEC: an automatic transposable element classification tool.

    Directory of Open Access Journals (Sweden)

    Claire Hoede

SUMMARY: The classification of transposable elements (TEs) is a key step towards deciphering their potential impact on the genome. However, this process is often based on manual sequence inspection by TE experts. With the wealth of genomic sequences now available, this task requires automation, making it accessible to most scientists. We propose a new tool, PASTEC, which classifies TEs by searching for structural features and similarities. This tool outperforms currently available software for TE classification. The main innovation of PASTEC is the search for HMM profiles, which is useful for inferring the classification of unknown TEs on the basis of conserved functional domains of the proteins. In addition, PASTEC is the only tool providing an exhaustive spectrum of possible classifications to the order level of the Wicker hierarchical TE classification system. It can also automatically classify other repeated elements, such as SSRs (Simple Sequence Repeats), rDNA, or potential repeated host genes. Finally, the output of this new tool is designed to facilitate manual curation by providing biologists with all the evidence accumulated for each TE consensus. AVAILABILITY: PASTEC is available as a REPET module or as standalone software (http://urgi.versailles.inra.fr/download/repet/REPET_linux-x64-2.2.tar.gz). It requires a Unix-like system. There are two standalone versions, one of which is parallelized (requiring Sun Grid Engine or Torque) and the other of which is not.

  16. Automatic differentiation for gradient-based optimization of radiatively heated microelectronics manufacturing equipment

    Energy Technology Data Exchange (ETDEWEB)

    Moen, C.D.; Spence, P.A.; Meza, J.C.; Plantenga, T.D.

    1996-12-31

    Automatic differentiation is applied to the optimal design of microelectronic manufacturing equipment. The performance of nonlinear, least-squares optimization methods is compared between numerical and analytical gradient approaches. The optimization calculations are performed by running large finite-element codes in an object-oriented optimization environment. The Adifor automatic differentiation tool is used to generate analytic derivatives for the finite-element codes. The performance results support previous observations that automatic differentiation becomes beneficial as the number of optimization parameters increases. The increase in speed, relative to numerical differences, has a limited value and results are reported for two different analysis codes.

  17. FORSIM-6, Automatic Solution of Coupled Differential Equation System

    International Nuclear Information System (INIS)

    Carver, M.B.; Stewart, D.G.; Blair, J.M.; Selander, W.N.

    1983-01-01

1 - Description of problem or function: The FORSIM program is a versatile package which automates the solution of coupled differential equation systems. The independent variables are time, and up to three space coordinates, and the equations may be any mixture of partial and/or ordinary differential equations. The philosophy of the program is to provide a tool which will solve a system of differential equations for a user who has basic but unspecialized knowledge of numerical analysis and FORTRAN. The equations to be solved, together with the initial conditions and any special instructions, may be specified by the user in a single FORTRAN subroutine, although he may write a number of routines if this is more suitable. These are then loaded with the control routines, which perform the solution and any requested input and output. 2 - Method of solution: Partial differential equations are automatically converted into sets of coupled ordinary differential equations by variable order discretization in the spatial dimensions. These and other ordinary differential equations are integrated continuously in time using efficient variable order, variable step, error-controlled algorithms.

  18. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims at acquiring the cluster number automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the result is compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than, or competitive with, the other existing automatic clustering algorithms.
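For reference, one generation of the standard DE/rand/1/bin scheme is sketched below (our illustration, not the paper's AC-FSDE variant; F and CR stand in for its constant and variable mutation parameters):

```python
# Generic differential-evolution step (standard DE/rand/1/bin sketch,
# not the AC-FSDE algorithm of the record above).
import random

def de_step(pop, fitness, F=0.5, CR=0.9):
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        # Mutation: v = a + F * (b - c) with three distinct partners
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[k] + F * (b[k] - c[k]) for k in range(dim)]
        # Binomial crossover, guaranteeing at least one mutant component
        j_rand = random.randrange(dim)
        trial = [m if (random.random() < CR or k == j_rand) else t
                 for k, (m, t) in enumerate(zip(mutant, target))]
        # Greedy selection: keep the better of trial and target
        new_pop.append(trial if fitness(trial) < fitness(target) else target)
    return new_pop
```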

  19. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages are...
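A usage sketch based on the library's documented entry point, tangent.grad (hedged; consult the Tangent README for the current API):

```python
# Source-code-transformation AD with Tangent: tangent.grad generates a
# new, readable Python function for the derivative.
import tangent

def f(x):
    return x * x * x

df = tangent.grad(f)   # SCT: derivative source is generated and compiled
print(df(2.0))         # 12.0, i.e. d/dx x^3 at x = 2
```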

  20. Operator overloading as an enabling technology for automatic differentiation

    International Nuclear Information System (INIS)

    Corliss, G.F.; Griewank, A.

    1993-01-01

We present an example of the science that is enabled by object-oriented programming techniques. Scientific computation often needs derivatives for solving nonlinear systems such as those arising in many PDE algorithms, optimization, parameter identification, stiff ordinary differential equations, or sensitivity analysis. Automatic differentiation computes derivatives accurately and efficiently by applying the chain rule to each arithmetic operation or elementary function. Operator overloading enables the techniques of either the forward or the reverse mode of automatic differentiation to be applied to real-world scientific problems. We illustrate automatic differentiation with an example drawn from a model of unsaturated flow in a porous medium. The problem arises from planning for the long-term storage of radioactive waste.
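A minimal Python sketch of the idea (ours, not the chapter's code): overloading arithmetic on a dual-number type makes every operation propagate a derivative by the chain rule.

```python
# Forward-mode AD via operator overloading (generic illustration).
import math

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def exp(x):
    if isinstance(x, Dual):
        return Dual(math.exp(x.val), math.exp(x.val) * x.der)
    return math.exp(x)

# Differentiate f(K) = K * exp(K) at K = 1 with a unit tangent seed.
K = Dual(1.0, 1.0)
print((K * exp(K)).der)   # exact 2e, no truncation error
```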

  1. Applications of automatic differentiation in computational fluid dynamics

    Science.gov (United States)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.

  2. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  3. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    Science.gov (United States)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

An automatic differentiation tool (ADIFOR) is incorporated into a finite-element-based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high-performance computation. Small-scale examples to verify the accuracy of the proposed program and a medium-scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  4. A new design of automatic vertical drilling tool

    Directory of Open Access Journals (Sweden)

    Yanfeng Ma

    2015-09-01

In order to effectively improve penetration rates and enhance wellbore quality for vertical wells, a new Automatic Vertical Drilling Tool (AVDT) based on an Eccentric Braced Structure (EBS) is designed. Applying the operating principle of rotary steerable drilling, the AVDT adds an offset gravity block automatic inclination-sensing mechanism. When the hole begins to deviate, the tool uses the eccentric moment produced by the gravity of the offset block to control the bearing of the guide force, so that the well is straightened. The nominal size of the AVDT is designed as 215.9 mm; the sizes of the other major components, including the offset angle of the EBS, are worked out by theoretical analysis. This paper introduces the structure, operating principle, and theoretical analysis of the AVDT, and describes the parameter settings of its key components.

  5. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

In this presentation the automatized material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; the tools of integral material and radioactivity flow are among the basic tools for computer optimization of decommissioning waste processing; all the calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega is an open modular system, which can be improved; improvement of the module for optimization of decommissioning waste processing will be performed within the framework of improving material procedures and scenarios.

  6. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges, such as frequent occlusions, changes in food appearance, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error-free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools, and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  7. Post-convergence automatic differentiation of iterative schemes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1997-01-01

A new approach for performing automatic differentiation (AD) of computer codes that embody an iterative procedure, based on differentiating a single additional iteration upon achieving convergence, is described and implemented. This post-convergence automatic differentiation (PAD) technique results in better accuracy of the computed derivatives, as it eliminates part of the derivatives' convergence error, and a large reduction in execution time, especially when many iterations are required to achieve convergence. In addition, it provides a way to compute derivatives of the converged solution without having to repeat the entire iterative process every time new parameters are considered. These advantages are demonstrated and the PAD technique is validated via a set of three linear and nonlinear codes used to solve neutron transport and fluid flow problems. The PAD technique reduces the execution time over direct AD by a factor of up to 30 and improves the accuracy of the derivatives by up to two orders of magnitude. The PAD technique's biggest disadvantage lies in the necessity to compute the iterative map's Jacobian, which for large problems can be prohibitive. Methods are discussed to alleviate this difficulty.
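A scalar sketch of the post-convergence idea (our illustration, not the paper's codes): converge x = g(x, p) without derivatives, then differentiate the fixed-point equation once; the division by 1 - dg/dx is the scalar analogue of the Jacobian solve mentioned above.

```python
# Post-convergence differentiation of a fixed point (toy illustration):
#   x* = g(x*, p)  =>  dx/dp = (dg/dp) / (1 - dg/dx).
import math

def g(x, p):
    return math.cos(p * x)          # a contraction in x for p = 0.8

p = 0.8
x = 0.5
for _ in range(200):                 # plain iteration, no derivatives
    x = g(x, p)

# Partials of one iteration at the converged point (analytic here;
# an AD tool would supply them automatically).
dg_dx = -p * math.sin(p * x)
dg_dp = -x * math.sin(p * x)
dxdp = dg_dp / (1.0 - dg_dx)        # the (scalar) Jacobian solve
print(x, dxdp)
```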

  8. Parallel computation of automatic differentiation applied to magnetic field calculations

    International Nuclear Information System (INIS)

    Hinkins, R.L.; Lawrence Berkeley Lab., CA

    1994-09-01

The author presents a parallelization of an accelerator physics application to simulate magnetic fields in three dimensions. The problem involves the evaluation of high-order derivatives with respect to two variables of a multivariate function. Automatic differentiation software had been used with some success, but the computation time was prohibitive. The implementation runs on several platforms, including a network of workstations using PVM, a MasPar using MPFortran, and a CM-5 using CMFortran. A careful examination of the code led to several optimizations that improved its serial performance by a factor of 8.7. The parallelization produced further improvements, especially on the MasPar, with a speedup factor of 620. As a result, a problem that took six days on a SPARC 10/41 now runs in minutes on the MasPar, making it feasible for physicists at Lawrence Berkeley Laboratory to simulate larger magnets.

  9. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the mobility of the spindle unit determines the most frequent and important part, such as the automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  10. A Domain Specific Embedded Language in C++ for Automatic Differentiation, Projection, Integration and Variational Formulations

    Directory of Open Access Journals (Sweden)

    Christophe Prud'homme

    2006-01-01

In this article, we present a domain-specific embedded language in C++ that can be used in various contexts, such as numerical projection onto a functional space, numerical integration, variational formulations, and automatic differentiation. Although these tools operate in different ways, the language accommodates them by decoupling expression construction from evaluation. The language is implemented using expression templates and meta-programming techniques and uses various Boost libraries. The language is exercised on a number of non-trivial examples, and a benchmark presents the performance behavior on a few test problems.

  11. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with Graphical Processing Units, have broadly empowered parallelism. Compilers are being updated to address the challenges of synchronization and threading. Appropriate program and algorithm classification can greatly help software engineers identify opportunities for effective parallelization. In the present work we investigate current species for the classification of algorithms; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen that matches the structure with different issues and performs a given task. We tested these algorithms utilizing existing automatic species extraction tools along with the Bones compiler. We added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants, and mathematical functions. With this, we can retain significant data which is not captured by the original species of algorithms. We implemented these new extensions into the tool, enabling automatic characterization of program code.

  12. Automatic welding detection by an intelligent tool pipe inspection

    Science.gov (United States)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

This work provides a model based on machine learning techniques for weld recognition, based on signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal-noise-reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross-validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.

  13. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

This Master's thesis was performed at CERN, more specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software for automatic generation of the PLC code based on this library, called UAB. The integration aimed to give a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  14. A Thermo-Hydraulic Tool for Automatic Virtual Hazop Evaluation

    Directory of Open Access Journals (Sweden)

    Pugi L.

    2014-12-01

Development of complex lubrication systems in the Oil&Gas industry has reached high levels of competitiveness in terms of requested performance and reliability. In particular, the use of HazOp (acronym of Hazard and Operability analysis) represents a decisive factor in evaluating the safety and reliability of plants. The HazOp analysis is a structured and systematic examination of a planned or existing operation in order to identify and evaluate problems that may represent risks to personnel or equipment. In particular, P&ID schemes (acronym of Piping and Instrument Diagram), according to the regulation in force, ISO 14617, are used to evaluate the design of the plant in order to increase its safety and reliability in different operating conditions. The use of a simulation tool can drastically increase the speed, efficiency and reliability of the design process. In this work, a tool called TTH lib (acronym of Transient Thermal Hydraulic Library) for the 1-D simulation of thermal hydraulic plants is presented. The proposed tool is applied to the analysis of safety-relevant components of compressor and pumping units, such as lubrication circuits. As opposed to known commercial products, TTH lib has been customized to ease the simulation of complex interactions with digital logic components and plant controllers, including their sensors and measurement systems. In particular, the proposed tool is optimized for fixed-step execution and fast prototyping of real-time code, both for testing and production purposes. TTH lib can be used as a standard SimScape-Simulink library of components optimized and specifically designed in accordance with the P&ID definitions. Finally, an automatic code generation procedure has been developed, so TTH simulation models can be assembled directly from the P&ID schemes and technical documentation, including detailed information on sensors and measurement systems.

  15. High-order space charge effects using automatic differentiation

    International Nuclear Information System (INIS)

    Reusch, Michael F.; Bruhwiler, David L.

    1997-01-01

The Northrop Grumman TOPKARK code has been upgraded to Fortran 90, making use of operator overloading, so the same code can be used either to track an array of particles or to construct a Taylor map representation of the accelerator lattice. We review beam optics and beam dynamics simulations conducted with TOPKARK in the past, and we present a new method for modeling space charge forces to high order with automatic differentiation. This method generates an accurate, high-order, 6-D Taylor map of the phase-space variable trajectories for a bunched, high-current beam. The spatial distribution is modeled as the product of a Taylor series times a Gaussian. The variables in the argument of the Gaussian are normalized to the respective second moments of the distribution. This form allows for accurate representation of a wide range of realistic distributions, including any asymmetries, and allows for rapid calculation of the space charge fields with free-space boundary conditions. An example problem is presented to illustrate our approach.

  16. High-order space charge effects using automatic differentiation

    International Nuclear Information System (INIS)

    Reusch, M.F.; Bruhwiler, D.L. (Computer Accelerator Physics Conference, Williamsburg, Virginia, 1996)

    1997-01-01

The Northrop Grumman TOPKARK code has been upgraded to Fortran 90, making use of operator overloading, so the same code can be used either to track an array of particles or to construct a Taylor map representation of the accelerator lattice. We review beam optics and beam dynamics simulations conducted with TOPKARK in the past, and we present a new method for modeling space charge forces to high order with automatic differentiation. This method generates an accurate, high-order, 6-D Taylor map of the phase-space variable trajectories for a bunched, high-current beam. The spatial distribution is modeled as the product of a Taylor series times a Gaussian. The variables in the argument of the Gaussian are normalized to the respective second moments of the distribution. This form allows for accurate representation of a wide range of realistic distributions, including any asymmetries, and allows for rapid calculation of the space charge fields with free-space boundary conditions. An example problem is presented to illustrate our approach. Copyright 1997 American Institute of Physics.

  17. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  18. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  19. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2016-03-01

Tool change is one of the most frequently performed machining processes, and if there is improper percussion as the tool's position is changed, the spindle bearing can be damaged. A spindle malfunction can cause problems, such as a dropped tool or bias in a machined hole. The measures currently available on machine tools only involve determining whether the tool-clamping state is correct, using the spindle and the air adhesion method, which is also used to satisfy the high precision required of mechanical components. Therefore, it cannot be used with every type of machine tool; in addition, improper tapping of the spindle during an automatic tool change cannot be detected. This study therefore proposes a new type of diagnostic framework that combines cloud computing and vibration sensors, in which tool changes are automatically diagnosed using an architecture that identifies abnormalities, thereby enhancing the reliability and productivity of the machine and equipment.

  20. Preliminary Design Through Graphs: A Tool for Automatic Layout Distribution

    Directory of Open Access Journals (Sweden)

    Carlo Biagini

    2015-02-01

Diagrams are essential in the preliminary stages of design for understanding distributive aspects and assisting the decision-making process. By drawing a schematic graph, designers can visualize in a synthetic way the relationships between many aspects: functions and spaces, distribution of layouts, space adjacency, influence of traffic flows within a facility layout, and so on. This process can be automated through the use of modern Information and Communication Technology (ICT) tools that allow designers to manage a large quantity of information. The work we present is part of an ongoing research project into how modern parametric software influences decision-making on the basis of automatic and optimized layout distribution. The method involves two phases: the first aims to define the ontological relations between spaces, with particular reference to a specific building typology (rules of aggregation of spaces); the second entails the implementation of these rules through the use of specialist software. The generation of ontological relations begins with the collection of data from historical manuals and analyses of case studies. These analyses aim to generate a “relationship matrix” based on preferences of space adjacency. The phase of implementing the previously defined rules is based on the use of Grasshopper to analyse and visualize different layout configurations. The layout is generated by simulating a process involving the collision of spheres, which represent specific functions of the design program. The spheres are attracted or rejected as a function of the relationship matrix defined above. The layout thus obtained remains in a sort of abstract state, independent of information about the exterior form, but still provides a useful tool for the decision-making process. In addition, preliminary results gathered through the analysis of case studies will be presented. These results provide a good variety...

  1. Towards an automatic tool for resolution evaluation of mammographic images

    Energy Technology Data Exchange (ETDEWEB)

    De Oliveira, J. E. E. [FUMEC, Av. Alfonso Pena 3880, CEP 30130-009 Belo Horizonte - MG (Brazil); Nogueira, M. S., E-mail: juliae@fumec.br [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Pte. Antonio Carlos 6627, 31270-901, Belo Horizonte - MG (Brazil)

    2014-08-15

Quality of Mammographies from the Public and Private Services of the State. With an essentially educational character, an evaluation of image quality is held monthly using a breast phantom on each mammography unit. In view of this, this work proposes to develop a protocol for the automatic evaluation of the image quality of mammograms, so that radiological protection and image quality requirements are met in the early detection of breast cancer. Specifically, image resolution is addressed and evaluated as part of the image quality evaluation program. Results show that, for the fourth resolution and using 28 phantom images with the ground truth settled, the computer analysis of the resolution is promising and may be used as a tool for the assessment of image quality. (Author)

  2. Towards an automatic tool for resolution evaluation of mammographic images

    International Nuclear Information System (INIS)

    De Oliveira, J. E. E.; Nogueira, M. S.

    2014-08-01

Quality of Mammographies from the Public and Private Services of the State. With an essentially educational character, an evaluation of image quality is held monthly using a breast phantom on each mammography unit. In view of this, this work proposes to develop a protocol for the automatic evaluation of the image quality of mammograms, so that radiological protection and image quality requirements are met in the early detection of breast cancer. Specifically, image resolution is addressed and evaluated as part of the image quality evaluation program. Results show that, for the fourth resolution and using 28 phantom images with the ground truth settled, the computer analysis of the resolution is promising and may be used as a tool for the assessment of image quality. (Author)

  3. Differential Forms: A New Tool in Economics

    Science.gov (United States)

    Mimkes, Jürgen

Econophysics is the transfer of methods from the natural to the socio-economic sciences. This concept was first applied to finance [1], but it is now also used in various applications of economics and the social sciences [2,3]. The present paper focuses on problems in macroeconomics and growth. 1. Neoclassical theory [4,5] neglects the “ex post” property of income and growth. Income Y(K, L) is assumed to be a function of capital and labor, but functions cannot model the “ex post” character of income. 2. Neoclassical theory is based on a Cobb-Douglas function [6] with variable elasticity α, which may be fitted to economic data, but an undefined elasticity α leads to a descriptive rather than a predictive economic theory. The present paper introduces a new tool to macroeconomics: differential forms and path-dependent integrals. This solves the problems above: 1. The integral of a non-exact differential form is path dependent and can only be calculated “ex post”, like income and economic growth. 2. A non-exact differential form can be made exact by an integrating factor; this leads to a new, well-defined, unique production function F and a predictive economic theory.
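A minimal worked example of a non-exact form, path dependence, and an integrating factor (ours, not the paper's):

```latex
% The one-form \omega = y\,dx is not exact, since
% d\omega = dy \wedge dx \neq 0; its integral from (0,0) to (1,1)
% is therefore path dependent:
\int_{(0,0)\to(1,0)\to(1,1)} y\,dx = 0,
\qquad
\int_{(0,0)\to(0,1)\to(1,1)} y\,dx = 1 .
% The integrating factor \mu = 1/y (for y > 0) makes it exact:
\mu\,\omega = \frac{1}{y}\, y\,dx = dx = d(x).
```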

  4. Automatic loading pattern optimization tool for Loviisa VVER-440 reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kuopanportti, Jaakko [Fortum Power and Heat, Fortum (Finland). Nuclear Competence Center

    2013-09-15

An automatic loading pattern optimization tool called ALPOT has been developed for Loviisa VVER-440 reactors. The ALPOT code utilizes a combination of three different optimization methods. The first method is the imitation of the equilibrium pattern, i.e., the optimized pattern in the case where the cycle length and the operating conditions are constant and the same shuffling pattern is repeated from cycle to cycle. In practice, the algorithm stochastically imitates the distribution of the assemblies' years of operation in the equilibrium pattern. The function of the imitation algorithm is to quickly provide initial patterns for the next optimization phase, which is performed either with the stochastic guided binary search algorithm or with the deterministic burnup kernel method, depending on the choice of the user. The former is a modified version of the standard binary search. The standard version goes through all possible swaps of the assemblies and chooses the best swap at each iteration round. The guided version chooses one assembly, tries to swap it with every other possible assembly, and performs the best swap at each iteration round. The search is guided so that the algorithm chooses the assemblies at or near the most restrictive fuel assembly first. The kernel method creates burnup kernel functions to estimate the burnup variations required to achieve desired changes in the power distribution of the reactor. The idea of the kernel method is to first determine the optimal burnup distribution that minimizes the maximum relative assembly power, using the created kernel functions and a common solver routine. Then, the burnups of the available fuel assemblies are matched with the obtained burnup distribution. (orig.)

  5. Automatic loading pattern optimization tool for Loviisa VVER-440 reactors

    International Nuclear Information System (INIS)

    Kuopanportti, Jaakko

    2013-01-01

An automatic loading pattern optimization tool called ALPOT has been developed for Loviisa VVER-440 reactors. The ALPOT code utilizes a combination of three different optimization methods. The first method is the imitation of the equilibrium pattern, i.e., the optimized pattern in the case where the cycle length and the operating conditions are constant and the same shuffling pattern is repeated from cycle to cycle. In practice, the algorithm stochastically imitates the distribution of the assemblies' years of operation in the equilibrium pattern. The function of the imitation algorithm is to quickly provide initial patterns for the next optimization phase, which is performed either with the stochastic guided binary search algorithm or with the deterministic burnup kernel method, depending on the choice of the user. The former is a modified version of the standard binary search. The standard version goes through all possible swaps of the assemblies and chooses the best swap at each iteration round. The guided version chooses one assembly, tries to swap it with every other possible assembly, and performs the best swap at each iteration round. The search is guided so that the algorithm chooses the assemblies at or near the most restrictive fuel assembly first. The kernel method creates burnup kernel functions to estimate the burnup variations required to achieve desired changes in the power distribution of the reactor. The idea of the kernel method is to first determine the optimal burnup distribution that minimizes the maximum relative assembly power, using the created kernel functions and a common solver routine. Then, the burnups of the available fuel assemblies are matched with the obtained burnup distribution. (orig.)
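The guided swap search described in this record can be sketched schematically as follows (our reading of the abstract, not Fortum's code; evaluate is a hypothetical core-simulator callback returning the peak relative power and the index of the most limiting assembly):

```python
# Schematic sketch of the guided binary search (illustrative only).
def guided_search(pattern, evaluate, max_rounds=50):
    best_peak, hot = evaluate(pattern)
    for _ in range(max_rounds):
        improved = False
        # Try swapping the most limiting assembly with every other one.
        for j in range(len(pattern)):
            if j == hot:
                continue
            trial = list(pattern)
            trial[hot], trial[j] = trial[j], trial[hot]
            peak, idx = evaluate(trial)
            if peak < best_peak:
                best_peak, best_trial, best_idx = peak, trial, idx
                improved = True
        if not improved:
            break                               # no swap lowers the peak
        pattern, hot = best_trial, best_idx     # perform the best swap
    return pattern, best_peak
```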

  6. Semi-automatic tool to ease the creation and optimization of GPU programs

    DEFF Research Database (Denmark)

    Jepsen, Jacob

    2014-01-01

We present a tool that reduces the development time of GPU-executable code. We implement a catalogue of common optimizations specific to the GPU architecture. Through the tool, the programmer can semi-automatically transform a computationally-intensive code section into GPU-executable form... ...of the transformations can be performed automatically, which makes the tool usable for both novices and experts in GPU programming.

  7. An inverse method for non linear ablative thermics with experimentation of automatic differentiation

    Energy Technology Data Exchange (ETDEWEB)

    Alestra, S [Simulation Information Technology and Systems Engineering, EADS IW Toulouse (France); Collinet, J [Re-entry Systems and Technologies, EADS ASTRIUM ST, Les Mureaux (France); Dubois, F [Professor of Applied Mathematics, Conservatoire National des Arts et Metiers Paris (France)], E-mail: stephane.alestra@eads.net, E-mail: jean.collinet@astrium.eads.net, E-mail: fdubois@cnam.fr

    2008-11-01

Thermal Protection System is a key element for atmospheric re-entry missions of aerospace vehicles. The high level of heat fluxes encountered in such missions has a direct effect on the mass balance of the heat shield. Consequently, the identification of heat fluxes is of great industrial interest, but in flight it is only available through indirect methods based on temperature measurements. This paper is concerned with inverse analyses of highly evolving heat fluxes. An inverse problem is used to estimate transient surface heat fluxes (convection coefficient) for a degradable thermal material (ablation and pyrolysis), using time-domain temperature measurements on the thermal protection. The inverse problem is formulated as the minimization of an objective functional through an optimization loop. An optimal control formulation (Lagrangian, adjoint, and gradient steepest-descent method combined with quasi-Newton computations) is then developed and applied, using Monopyro, a transient one-dimensional thermal model with one moving boundary (the ablative surface) that has been developed over many years by ASTRIUM-ST. To compute numerically the adjoint and gradient quantities for the inverse problem in the heat convection coefficient, we have used both analytical manual differentiation and an Automatic Differentiation (AD) engine, Tapenade, developed at INRIA Sophia-Antipolis by the TROPICS team. Several validation test cases using synthetic temperature measurements are carried out by applying the results of the inverse method with the minimization algorithm. Accurate identification results on high-flux test cases, and good agreement for temperature restitution, are obtained, with and without ablation and pyrolysis, even from poor initial guesses for the fluxes. First encouraging results with an automatic differentiation procedure are also presented in this paper.
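The shape of such an identification loop can be illustrated with a toy lumped-capacitance model (ours, not the Monopyro code; here the gradient is hand-derived, where an AD tool such as Tapenade would generate the equivalent adjoint code automatically):

```python
# Toy inverse problem: recover a convection coefficient h from
# temperature measurements, T(t) = Tinf + (T0 - Tinf) exp(-h t),
# by steepest descent on the least-squares functional J(h).
import math

T0, Tinf = 300.0, 1000.0
times = [0.5 * k for k in range(1, 11)]
h_true = 0.4
meas = [Tinf + (T0 - Tinf) * math.exp(-h_true * t) for t in times]

def model(h, t):
    return Tinf + (T0 - Tinf) * math.exp(-h * t)

def grad_J(h):
    # Hand-derived dJ/dh; an adjoint code would compute the same value.
    return sum(2.0 * (model(h, t) - m) * (-(T0 - Tinf)) * t * math.exp(-h * t)
               for t, m in zip(times, meas))

h = 0.1                          # deliberately poor initial guess
for _ in range(500):
    h -= 1e-8 * grad_J(h)        # steepest-descent update
print(h)                         # approaches h_true = 0.4
```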

  8. Automatic Differentiation in Quantum Chemistry with Applications to Fully Variational Hartree-Fock.

    Science.gov (United States)

    Tamayo-Mendoza, Teresa; Kreisbeck, Christoph; Lindh, Roland; Aspuru-Guzik, Alán

    2018-05-23

Automatic differentiation (AD) is a powerful tool that allows calculating derivatives of implemented algorithms with respect to all of their parameters up to machine precision, without the need to explicitly add any additional functions. Thus, AD has great potential in quantum chemistry, where gradients are omnipresent but also difficult to obtain, and researchers typically spend a considerable amount of time finding suitable analytical forms when implementing derivatives. Here, we demonstrate that AD can be used to compute gradients with respect to any parameter throughout a complete quantum chemistry method. We present DiffiQult, a Hartree-Fock implementation entirely differentiated with the use of AD tools. DiffiQult is a software package written in plain Python with minimal deviation from standard code, which illustrates the capability of AD to save human effort and time in implementations of exact gradients in quantum chemistry. We leverage the obtained gradients to optimize the parameters of one-particle basis sets in the context of the floating Gaussian framework.

  9. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the mobility of the spindle unit determines the most frequent and important part, such as the automatic tool changer. The vibration detection system includes the development of hardware and software, such as the vibration meter, the signal acquisition card, the data processing platform, and the machine control program. Meanwhile, owing to differences between the mechanical configuration and the desired characteristics, it is difficult for a vibration detection system to directly use commercially available kits. For this reason, it was selected as an item for self-development research, along with the exploration of a significant parametric study sufficient to represent the machine characteristics and states. However, we also launched the development of the functional parts of the system simultaneously. Finally, we entered the conditions and the parameters generated from both the states and the characteristics into the developed system to verify its feasibility.

  10. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

    Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number … of such a feature is the generic implementation of Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility …

  11. Automatic tools for enhancing the collaborative experience in large projects

    International Nuclear Information System (INIS)

    Bourilkov, D; Rodriquez, J L

    2014-01-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of data analysis gains in importance. A key feature of collaboration in large-scale projects is keeping a log of what is being done and how - for private use, for reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. It is even better if the log is created automatically, on the fly, while the scientist or software developer is working in a habitual way, without any extra effort. This saves time and enables a team to do more with the same resources. The CODESH (COllaborative DEvelopment SHell) and CAVES (Collaborative Analysis Versioning Environment System) projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach to any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for the analysis of petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.

  12. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input, and performs a registration over scale and in-plane rotation fully automatically.

  13. Development of tools for automatic generation of PLC code

    OpenAIRE

    Koutli, Maria; Chasapis, Georgios; Rochez, Jacques

    2014-01-01

    This Master's thesis was performed at CERN, specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, which are based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS compatible P...

  14. Facilitating coronary artery evaluation in MDCT using a 3D automatic vessel segmentation tool

    International Nuclear Information System (INIS)

    Fawad Khan, M.; Gurung, Jessen; Maataoui, Adel; Brehmer, Boris; Herzog, Christopher; Vogl, Thomas J.; Wesarg, Stefan; Dogan, Selami; Ackermann, Hanns; Assmus, Birgit

    2006-01-01

    The purpose of this study was to investigate a 3D coronary artery segmentation algorithm using 16-row MDCT data sets. Fifty patients underwent cardiac CT (Sensation 16, Siemens) and coronary angiography. Automatic and manual detection of coronary artery stenosis was performed. A 3D coronary artery segmentation algorithm (Fraunhofer Institute for Computer Graphics, Darmstadt) was used for automatic evaluation. All significant stenoses (>50%) in vessels >1.5 mm in diameter were recorded. Each detection tool was used by one reader who was blinded to the results of the other detection method and to the results of coronary angiography. Sensitivity and specificity were determined for automatic and manual detection, as was the time required for both CT-based evaluation methods. The overall sensitivity and specificity of the automatic and manual approaches were 93.1% vs. 95.83% and 86.1% vs. 81.9%, respectively. The time required for automatic evaluation was significantly shorter than with the manual approach, i.e., 246.04±43.17 s for the automatic approach and 526.88±45.71 s for the manual approach (P<0.0001). In 94% of the coronary artery branches, automatic detection required less time than the manual approach. Automatic coronary vessel evaluation is feasible. It reduces the time required for cardiac CT evaluation with similar sensitivity and specificity, and facilitates the evaluation of MDCT coronary angiography in a standardized fashion. (orig.)

  15. Automatic design optimization tool for passive structural control systems

    Science.gov (United States)

    Mojolic, Cristian; Hulea, Radu; Parv, Bianca Roxana

    2017-07-01

    The present paper proposes an automatic dynamic process to find the parameters of seismic isolation systems applied to large-span structures. Three seismic isolation solutions are proposed for the model of the new Slatina Sport Hall. The first case uses a friction pendulum system (FP), the second uses High Damping Rubber Bearings (HDRB), and Lead Rubber Bearings (LRB) are used in the last case. The isolation level is placed at the top end of the roof-supporting columns. The aim is to calculate the parameters of each isolation system so that the structure's first vibration period matches the one desired by the user. The model is computed with the SAP2000 software. To find the best solution to the optimization problem, an optimization process based on Genetic Algorithms (GA) has been developed in Matlab. With the use of the API (Application Programming Interface) libraries, a two-way link is created between the two programs to exchange results and link parameters. The main goal is to find the best seismic isolation method for each desired modal period so that the bending moment in the supporting columns is minimized.
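
    A minimal sketch of the optimization loop's logic, with a closed-form single-degree-of-freedom model (T = 2*pi*sqrt(m/k)) standing in for the SAP2000 model and a bare-bones genetic algorithm standing in for the Matlab GA; all numbers are illustrative assumptions:

        import math, random

        m, T_target = 5.0e5, 3.0           # mass [kg], desired first period [s]

        def period(k):                     # SDOF stand-in for the FE model
            return 2 * math.pi * math.sqrt(m / k)

        def fitness(k):                    # period error to be minimized
            return abs(period(k) - T_target)

        pop = [random.uniform(1e5, 1e7) for _ in range(30)]   # stiffness [N/m]
        for gen in range(100):
            pop.sort(key=fitness)
            parents = pop[:10]                       # elitist selection
            children = []
            while len(children) < 20:
                a, b = random.sample(parents, 2)
                child = 0.5 * (a + b)                # arithmetic crossover
                child *= random.uniform(0.9, 1.1)    # multiplicative mutation
                children.append(child)
            pop = parents + children
        best = min(pop, key=fitness)
        print(best, period(best))   # k ~ 2.19e6 N/m gives T ~ 3.0 s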

  16. AutoFACT: An Automatic Functional Annotation and Classification Tool

    Directory of Open Access Journals (Sweden)

    Lang B Franz

    2005-06-01

    Background: Assignment of function to new molecular sequence data is an essential step in genomics projects. The usual process involves similarity searches of a given sequence against one or more databases, an arduous process for large datasets. Results: We present AutoFACT, a fully automated and customizable annotation tool that assigns biologically informative functions to a sequence. Key features of this tool are that it (1) analyzes nucleotide and protein sequence data; (2) determines the most informative functional description by combining multiple BLAST reports from several user-selected databases; (3) assigns putative metabolic pathways, functional classes, enzyme classes, Gene Ontology terms and locus names; and (4) generates output in HTML, text and GFF formats for the user's convenience. We have compared AutoFACT to four well-established annotation pipelines. The error rate of functional annotation is estimated to be only between 1–2%. Comparison of AutoFACT to the traditional top-BLAST-hit annotation method shows that our procedure increases the number of functionally informative annotations by approximately 50%. Conclusion: AutoFACT will serve as a useful annotation tool for smaller sequencing groups lacking dedicated bioinformatics staff. It is implemented in PERL and runs on LINUX/UNIX platforms. AutoFACT is available at http://megasun.bch.umontreal.ca/Software/AutoFACT.htm.

  17. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    According to the relevant technical standards (e.g., IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used that is developed independently of the development of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  18. DAF: differential ACE filtering image quality assessment by automatic color equalization

    Science.gov (United States)

    Ouni, S.; Chambah, M.; Saint-Jean, C.; Rizzi, A.

    2008-01-01

    Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. In reality, however, objective quality metrics do not necessarily correlate well with perceived quality [1]. Moreover, some measures assume that there exists a reference in the form of an "original" to compare to, which prevents their use in the digital restoration field, where often there is no reference available. That is why subjective evaluation has been the most used and most efficient approach up to now. But subjective assessment is expensive and time consuming, and hence does not meet economic requirements [2,3]. Thus, reliable automatic methods for visual quality assessment are needed in the field of digital film restoration. The ACE method, for Automatic Color Equalization [4,6], is an algorithm for the unsupervised enhancement of digital images. It is based on a new computational approach that tries to model the perceptual response of our vision system, merging the Gray World and White Patch equalization mechanisms in a global and local way. Like our vision system, ACE is able to adapt to widely varying lighting conditions and to extract visual information from the environment effectively. Moreover, ACE can be run in an unsupervised manner. Hence it is very useful as a digital film restoration tool, since no a priori information is available. In this paper we deepen the investigation of using the ACE algorithm as a basis for reference-free image quality evaluation. This new metric, called DAF for Differential ACE Filtering [7], is an objective quality measure that can be used in several image restoration and image quality assessment systems. In this paper, we compare, on different image databases, the results obtained with DAF and with some subjective image quality assessments (Mean Opinion Score, MOS, as a measure of perceived image quality). We also study the correlation between the objective measure and MOS. In our experiments, we have used for the first image …

  19. Evaluation of a new software tool for the automatic volume calculation of hepatic tumors. First results

    International Nuclear Information System (INIS)

    Meier, S.; Mildenberger, P.; Pitton, M.; Thelen, M.; Schenk, A.; Bourquain, H.

    2004-01-01

    Purpose: Computed tomography has become the preferred method for detecting liver carcinomas. The introduction of spiral CT added volumetric assessment of intrahepatic tumors, which was unattainable in the clinical routine with incremental CT due to complex planimetric revisions and excessive computing time. In an ongoing clinical study, a new software tool was tested for the automatic detection of tumor volume and the time needed for this procedure. Materials and methods: We analyzed patients suffering from hepatocellular carcinoma (HCC). All patients underwent treatment with repeated transcatheter chemoembolization of the hepatic artery. The volumes of the HCC lesions detected in CT were measured with the new software tool in HepaVision (MeVis, Germany). The results were compared with manual planimetric calculation of the volume performed by three independent radiologists. Results: Our first results in 16 patients show a correlation of 96.8% between the automatically and the manually calculated volumes (up to a difference of 2 ml). While the manual method of analyzing the volume of a lesion requires 2.5 minutes on average, the automatic method merely requires about 30 seconds of user interaction time. Conclusion: These preliminary results show a good correlation between automatic and manual calculations of the tumor volume. The new software tool requires less time for accurate determination of the tumor volume and can be applied in the daily clinical routine. (orig.)

  20. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files; splitting, parsing, and compiling data from output files; and generating unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom code for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages, including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  1. Multislice CT coronary angiography: evaluation of an automatic vessel detection tool

    International Nuclear Information System (INIS)

    Dewey, M.; Schnapauff, D.; Lembcke, A.; Hamm, B.; Rogalla, P.; Laule, M.; Borges, A.C.; Rutsch, W.

    2004-01-01

    Purpose: To investigate the potential of a new detection tool for multislice CT (MSCT) coronary angiography with automatic display of curved multiplanar reformations and orthogonal cross-sections. Materials and Methods: Thirty-five patients were consecutively enrolled in a prospective intention-to-diagnose study and examined using an MSCT scanner with 16 x 0.5 mm detector collimation and 400 ms gantry rotation time (Aquilion, Toshiba). A multisegment algorithm using up to four segments was applied for ECG-gated reconstruction. Automatic and manual detection of coronary arteries was conducted using the coronary artery CT protocol of a workstation (Vitrea 2, Version 3.3, Vital Images) to detect significant stenoses (≥50%) in all segments of ≥1.5 mm in diameter. Each detection tool was used by one reader who was blinded to the results of the other detection method and the results of conventional coronary angiography. Results: The overall sensitivity, specificity, nondiagnostic rate, and accuracy of the automatic and manual approaches were 90 vs. 94%, 89 vs. 84%, 6 vs. 6%, and 89 vs. 88%, respectively (p=n.s.). The vessel lengths detected with the automatic and manual approaches were highly correlated for the left main/left anterior descending (143±30 vs. 146±24 mm, r=0.923, p …)

  2. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    Science.gov (United States)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their effort on creating the simulation models needed to verify the design. This paper proposes a method and a supporting tool to navigate designers through the engineering process and to generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions using distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  3. Depfix, a Tool for Automatic Rule-based Post-editing of SMT

    Directory of Open Access Journals (Sweden)

    Rudolf Rosa

    2014-09-01

    We present Depfix, an open-source system for automatic post-editing of phrase-based machine translation outputs. Depfix employs a range of natural language processing tools to obtain analyses of the input sentences, and uses a set of rules to correct common or serious errors in machine translation outputs. Depfix is currently implemented only for the English-to-Czech translation direction, but extending it to other languages is planned.

  4. Health smart home for elders - a tool for automatic recognition of activities of daily living.

    Science.gov (United States)

    Le, Xuan Hoa Binh; Di Mascolo, Maria; Gouin, Alexia; Noury, Norbert

    2008-01-01

    Elders prefer to live in their own homes, but with aging comes a loss of autonomy and associated risks. In order to help them live longer in safe conditions, we need a tool that automatically detects their loss of autonomy by assessing the degree of performance of activities of daily living. This article presents an approach enabling activity recognition for an elder living alone in a home equipped with noninvasive sensors.

  5. Health smart home: towards an assistant tool for automatic assessment of the dependence of elders.

    Science.gov (United States)

    Le, Xuan Hoa Binh; Di Mascolo, Maria; Gouin, Alexia; Noury, Norbert

    2007-01-01

    In order to help elders living alone to age in place independently and safely, it can be useful to have an assistant tool that automatically assesses their dependence and issues an alert if there is any loss of autonomy. Dependence can be assessed from the degree to which elders perform activities of daily living. This article presents an approach enabling activity recognition for an elder living alone in a Health Smart Home equipped with noninvasive sensors.

  6. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    Science.gov (United States)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially if the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. AGT implements the MVC architecture and uses open-source software such as the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.
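
    The compile-run-compare core of such a grader can be sketched in a few lines (the paper's AGT is a Laravel web application; this standalone Python harness and its hypothetical test data are illustrative assumptions, not the authors' code):

        import subprocess, tempfile, os

        def grade(source_path, test_cases, timeout=5):
            """Compile a C/C++ submission and diff its output per test case."""
            exe = os.path.join(tempfile.mkdtemp(), "submission")
            compiler = "g++" if source_path.endswith(".cpp") else "gcc"
            build = subprocess.run([compiler, source_path, "-o", exe],
                                   capture_output=True, text=True)
            if build.returncode != 0:
                return {"verdict": "compile error", "log": build.stderr}
            passed = 0
            for stdin_data, expected in test_cases:
                try:
                    run = subprocess.run([exe], input=stdin_data, text=True,
                                         capture_output=True, timeout=timeout)
                except subprocess.TimeoutExpired:
                    continue                      # time limit exceeded
                if run.stdout.strip() == expected.strip():
                    passed += 1
            return {"verdict": f"{passed}/{len(test_cases)} tests passed"}

        # Hypothetical usage with a submission that reads two ints and prints their sum:
        # print(grade("sum.c", [("1 2\n", "3"), ("5 7\n", "12")]))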

  7. Reducing the memory requirement in reverse mode automatic differentiation by solving TBR flow equations

    International Nuclear Information System (INIS)

    Naumann, U.

    2002-01-01

    The fast computation of gradients in reverse mode Automatic Differentiation (AD) requires the generation of adjoint versions of every statement in the original code. Due to the resulting reversal of the control flow, certain intermediate values have to be made available in reverse order to compute the local partial derivatives. This can be achieved by storing these values or by recomputing them when they become required. In either case one is interested in minimizing the size of this set. Following an extensive introduction to the "To-Be-Recorded" (TBR) problem, the authors present flow equations for propagating the TBR status of variables in the context of reverse mode AD of structured programs.
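
    A minimal operator-overloading sketch of what "to be recorded" means (the paper works in the source-transformation setting, but the storage question is the same): each multiplication must store its operand values for the reverse sweep, while additions need not store anything:

        class Var:
            """Scalar reverse-mode AD value with an explicit tape."""
            def __init__(self, value):
                self.value, self.adj = value, 0.0

            def __add__(self, other):
                out = Var(self.value + other.value)
                # Adjoint of + needs no recorded values: d(out)/d(in) = 1.
                tape.append(lambda: (_bump(self, out.adj), _bump(other, out.adj)))
                return out

            def __mul__(self, other):
                out = Var(self.value * other.value)
                # TBR: the adjoint needs the operand *values*, so they are
                # stored (captured here) for use during the reverse sweep.
                a, b = self.value, other.value
                tape.append(lambda: (_bump(self, out.adj * b),
                                     _bump(other, out.adj * a)))
                return out

        def _bump(v, amount):
            v.adj += amount

        tape = []
        x1, x2 = Var(3.0), Var(4.0)
        y = x1 * x2 + x1
        y.adj = 1.0
        for step in reversed(tape):   # control-flow reversal
            step()
        print(x1.adj, x2.adj)         # -> 5.0 (= x2 + 1) and 3.0 (= x1)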

  8. Parameter optimization of differential evolution algorithm for automatic playlist generation problem

    Science.gov (United States)

    Alamag, Kaye Melina Natividad B.; Addawe, Joel M.

    2017-11-01

    With the digitalization of music, collections have grown enormously, and there is a need to create lists of music that filter a collection according to user preferences, giving rise to the Automatic Playlist Generation Problem (APGP). Previous attempts to solve this problem include the use of search and optimization algorithms. If a music database is very large, the algorithm used must be able to search the lists thoroughly, taking into account the quality of the playlist given a set of user constraints. In this paper we apply an evolutionary metaheuristic optimization algorithm, Differential Evolution (DE), with different combinations of parameter values and select the best-performing set on four standard test functions. The performance of the proposed algorithm is then compared with a standard Genetic Algorithm (GA) and a hybrid GA with Tabu Search. Numerical simulations are carried out to show the better results obtained with the Differential Evolution approach using the optimized parameter values.
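
    A minimal sketch of the DE/rand/1/bin scheme whose parameters (population size NP, differential weight F, crossover rate CR) are the quantities being tuned; the sphere function stands in for the paper's four standard test functions:

        import random

        def de(objective, dim, bounds, NP=20, F=0.8, CR=0.9, generations=200):
            lo, hi = bounds
            pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(NP)]
            for _ in range(generations):
                for i in range(NP):
                    a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
                    j_rand = random.randrange(dim)
                    trial = [a[j] + F * (b[j] - c[j])            # mutation
                             if random.random() < CR or j == j_rand
                             else pop[i][j]                      # binomial crossover
                             for j in range(dim)]
                    if objective(trial) < objective(pop[i]):     # greedy selection
                        pop[i] = trial
            return min(pop, key=objective)

        sphere = lambda x: sum(v * v for v in x)
        best = de(sphere, dim=5, bounds=(-5.0, 5.0))
        print(best, sphere(best))   # should approach the origin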

  9. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. The quality and cost of the software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist safety analysis experts. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate the fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to use these formulae through formal verification techniques.

  10. Automatic registration method for multisensor datasets adopted for dimensional measurements on cutting tools

    International Nuclear Information System (INIS)

    Shaw, L; Mehari, F; Weckenmann, A; Ettl, S; Häusler, G

    2013-01-01

    Multisensor systems with optical 3D sensors are frequently employed to capture complete surface information by measuring workpieces from different views. During coarse and fine registration the resulting datasets are afterward transformed into one common coordinate system. Automatic fine registration methods are well established in dimensional metrology, whereas there is a deficit in automatic coarse registration methods. The advantage of a fully automatic registration procedure is twofold: it enables fast and contact-free alignment and, further, flexible application to datasets from any kind of optical 3D sensor. In this paper, an algorithm adapted for robust automatic coarse registration is presented. The method was originally developed for the field of object reconstruction or localization. It is based on a segmentation of planes in the datasets to calculate the transformation parameters. The rotation is defined by the normals of three corresponding segmented planes of two overlapping datasets, while the translation is calculated via the intersection point of the segmented planes. First results have shown that the translation is strongly shape dependent: 3D data of objects with non-orthogonal planar flanks cannot be registered with the current method. In the novel supplement to the algorithm, the translation is additionally calculated via the distance between centroids of corresponding segmented planes, which results in more than one option for the transformation. A newly introduced measure considering the distance between the datasets after coarse registration selects the best possible transformation. Results of the robust automatic registration method are presented on the example of datasets taken from a cutting tool with a fringe-projection system and a focus-variation system. The successful application in dimensional metrology is proven with evaluations of shape parameters based on the registered datasets of a calibrated workpiece. (paper)
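
    A minimal sketch of the geometric core described above (rotation from three corresponding segmented-plane normals, translation from the planes' intersection points), using synthetic planes in place of real segmentation output:

        import numpy as np

        def rotation_from_normals(n_src, n_dst):
            """Kabsch: rotation R with R @ n_src[i] ~ n_dst[i] (rows are normals)."""
            H = n_src.T @ n_dst
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            return Vt.T @ D @ U.T

        def plane_intersection(normals, offsets):
            """Point p with normals[i] . p = offsets[i] for three planes."""
            return np.linalg.solve(normals, offsets)

        # Synthetic example: dataset 2 is dataset 1 rotated by R_true, shifted by t_true.
        rng = np.random.default_rng(0)
        n1 = np.linalg.qr(rng.normal(size=(3, 3)))[0]        # three unit normals
        d1 = rng.normal(size=3)                              # plane offsets n.p = d
        R_true = np.linalg.qr(rng.normal(size=(3, 3)))[0]
        if np.linalg.det(R_true) < 0:
            R_true[:, 0] *= -1                               # make it a proper rotation
        t_true = np.array([5.0, -2.0, 1.0])

        n2 = (R_true @ n1.T).T                               # transformed normals
        d2 = d1 + n2 @ t_true                                # transformed offsets

        R = rotation_from_normals(n1, n2)
        p1 = plane_intersection(n1, d1)                      # corner point, set 1
        p2 = plane_intersection(n2, d2)                      # corner point, set 2
        t = p2 - R @ p1
        print(np.allclose(R, R_true), np.allclose(t, t_true))  # -> True True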

  11. Data Quality Monitoring : Automatic MOnitoRing Environment (AMORE ) Web Administration Tool in ALICE Experiment

    CERN Document Server

    Nagi, Imre

    2013-01-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The quality of the acquired data evolves over time depending on the status of the detectors, their components, and the operating environment. To get excellent detector performance, all detector configurations have to be set correctly so that data-taking can be done in an optimal way. This report describes a new implementation of the administration tools of ALICE's DQM framework, called AMORE (Automatic MonitoRing Environment), using web technologies.

  12. Evaluation of Semi-Automatic Metadata Generation Tools: A Survey of the Current State of the Art

    Directory of Open Access Journals (Sweden)

    Jung-ran Park

    2015-09-01

    Assessment of the current landscape of semi-automatic metadata generation tools is particularly important considering the rapid development of digital repositories and the recent explosion of big data. Utilization of (semi)automatic metadata generation is critical in addressing these environmental changes and may be unavoidable in the future, considering the costly and complex operation of manual metadata creation. To address such needs, this study examines the range of semi-automatic metadata generation tools (n=39) while providing an analysis of their techniques, features, and functions. The study focuses on open-source tools that can be readily utilized in libraries and other memory institutions. The challenges and current barriers to implementation of these tools were identified. The greatest area of difficulty lies in the fact that the piecemeal development of most semi-automatic generation tools only addresses part of the issue of semi-automatic metadata generation, providing solutions to one or a few metadata elements but not the full range of elements. This indicates that significant local effort will be required to integrate the various tools into a coherent working whole. Suggestions toward such efforts are presented for future developments that may assist information professionals with the incorporation of semi-automatic tools within their daily workflows.

  13. NASCENT: an automatic protein interaction network generation tool for non-model organisms.

    Science.gov (United States)

    Banky, Daniel; Ordog, Rafael; Grolmusz, Vince

    2009-04-24

    Large quantities of reliable protein interaction data are available for model organisms in public repositories (e.g., MINT, DIP, HPRD, INTERACT). Most data correspond to experiments with the proteins of Saccharomyces cerevisiae, Drosophila melanogaster, Homo sapiens, Caenorhabditis elegans, Escherichia coli and Mus musculus. For other important organisms the data availability is poor or non-existent. Here we present NASCENT, a completely automatic web-based tool and also a downloadable Java program, capable of modeling and generating protein interaction networks even for non-model organisms. The tool performs protein interaction network modeling through gene-name mapping, and outputs the resulting network in graphical form and also in computer-readable graph formats, directly usable by popular network modeling software. http://nascent.pitgroup.org.

  14. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools allows users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  15. Reproducing the internal and external anatomy of fossil bones: Two new automatic digital tools.

    Science.gov (United States)

    Profico, Antonio; Schlager, Stefan; Valoriani, Veronica; Buzi, Costantino; Melchionna, Marina; Veneziano, Alessio; Raia, Pasquale; Moggi-Cecchi, Jacopo; Manzi, Giorgio

    2018-04-21

    We present two new automatic tools, developed in the R environment, to reproduce the internal and external structures of bony elements. The first method, Computer-Aided Laser Scanner Emulator (CA-LSE), provides the reconstruction of the external portions of a 3D mesh by simulating the action of a laser scanner. The second method, Automatic Segmentation Tool for 3D objects (AST-3D), performs the digital reconstruction of anatomical cavities. We present the application of the CA-LSE and AST-3D methods to different anatomical remains, highly variable in terms of shape, size and structure: a modern human skull, a malleus bone, and a Neanderthal deciduous tooth. Both methods are developed in the R environment and embedded in the packages "Arothron" and "Morpho," where both the code and the data are fully available. The application of CA-LSE and AST-3D allows the isolation and manipulation of the internal and external components of the 3D virtual representation of complex bony elements. In particular, we present the output of the four case studies: a complete modern human endocast and the right maxillary sinus, the dental pulp of the Neanderthal tooth, and the inner network of blood vessels of the malleus. Both methods proved to be much faster, cheaper, and more accurate than other conventional approaches. The tools we presented are available as add-ons in existing software within the R platform. Because of their ease of application and the unrestricted availability of the methods proposed, these tools can be widely used by paleoanthropologists, paleontologists and anatomists. © 2018 Wiley Periodicals, Inc.

  16. A new fully automatic PIM tool to replicate two component tungsten DEMO divertor parts

    International Nuclear Information System (INIS)

    Antusch, Steffen; Commin, Lorelei; Heneka, Jochen; Piotter, Volker; Plewa, Klaus; Walter, Heinz

    2013-01-01

    Highlights: • Development of a fully automatic 2C-PIM tool. • Replication of fusion-relevant components in one step without additional brazing. • No cracks or gaps visible in the seam of the joining zone. • For both material combinations a solid bond of the material interface was achieved. • PIM is a powerful process for mass production as well as for joining even complex-shaped parts. -- Abstract: At Karlsruhe Institute of Technology (KIT), divertor design concepts for future nuclear fusion power plants beyond ITER are intensively investigated. One promising KIT divertor design concept for the future DEMO power reactor is based on modular He-cooled finger units. The manufacturing of such parts by mechanical machining such as milling and turning, however, is extremely cost- and time-intensive because tungsten is very hard and brittle. Powder Injection Molding (PIM) has been adapted to tungsten processing at KIT for several years. This production method is deemed promising for the large-scale production of tungsten parts with high near-net-shape precision, hence offering a cost-saving advantage over conventional machining. The successfully manufactured divertor tile, consisting only of pure tungsten, exhibits a crack-free microstructure and a high density (>98% T.D.). Based on these results, a new fully automatic multicomponent PIM tool was developed, which allows the replication and joining, without brazing, of fusion-relevant components made of different materials, and the creation of composite materials, in one step. This contribution describes the process route to design and engineer the new fully automatic 2C-PIM tool, including the filling simulation and the implementation of the tool. The complete technological fabrication process of tungsten 2C-PIM, including material and feedstock (powder and binder) development, injection molding, and heat treatment of real DEMO divertor parts, is outlined.

  17. ModelMage: a tool for automatic model generation, selection and management.

    Science.gov (United States)

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML-model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating models the software can automatically fit all these models to the data and provides a ranking for model selection, in case data is available. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine. Thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software.

  18. Four-bar linkage-based automatic tool changer: Dynamic modeling and torque optimization

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sangho; Seo, TaeWon [Yeungnam University, Gyeongsan (Korea, Republic of); Kim, Jong-Won; Kim, Jongwon [Seoul National University, Seoul (Korea, Republic of)

    2017-05-15

    An automatic tool changer (ATC) is a device used in a tapping machine to reduce process time. This paper presents the optimization of a peak torque reduction mechanism (PTRM) for an ATC. It is necessary to reduce the fatigue load and the energy consumed, which are related to the peak torque. The PTRM uses a torsion spring to reduce the peak torque and was applied to a novel ATC mechanism, which was modeled using inverse dynamics. Optimization of the PTRM is required to minimize the peak torque. The design parameters are the initial angle and stiffness of the torsion spring, and the objective function is the peak torque of the input link. The torque was simulated, and the peak torque was decreased by 10%. The energy consumed was reduced by the optimization.
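
    A minimal sketch of the stated optimization (choose the torsion spring's stiffness k and initial angle theta0 to minimize the peak input torque over one motion cycle); the sinusoidal load profile and the coarse grid search are illustrative assumptions standing in for the paper's inverse-dynamics model:

        import math

        def peak_torque(k, theta0, steps=360):
            """Max |input torque| over one cycle for a toy load profile."""
            peak = 0.0
            for i in range(steps):
                theta = 2 * math.pi * i / steps
                load = 12.0 * math.sin(theta)        # assumed load torque [Nm]
                spring = -k * (theta - theta0)       # torsion spring assist
                peak = max(peak, abs(load + spring))
            return peak

        # Coarse grid search over the two design parameters.
        best = min(((k, t0) for k in [x * 0.5 for x in range(0, 21)]
                            for t0 in [x * 0.1 for x in range(0, 63)]),
                   key=lambda p: peak_torque(*p))
        print(best, peak_torque(*best))   # reduced peak vs. peak_torque(0, 0)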

  19. Four-bar linkage-based automatic tool changer: Dynamic modeling and torque optimization

    International Nuclear Information System (INIS)

    Lee, Sangho; Seo, TaeWon; Kim, Jong-Won; Kim, Jongwon

    2017-01-01

    An automatic tool changer (ATC) is a device used in a tapping machine to reduce process time. This paper presents the optimization of a peak torque reduction mechanism (PTRM) for an ATC. It is necessary to reduce the fatigue load and the energy consumed, which are related to the peak torque. The PTRM uses a torsion spring to reduce the peak torque and was applied to a novel ATC mechanism, which was modeled using inverse dynamics. Optimization of the PTRM is required to minimize the peak torque. The design parameters are the initial angle and stiffness of the torsion spring, and the objective function is the peak torque of the input link. The torque was simulated, and the peak torque was decreased by 10%. The energy consumed was reduced by the optimization.

  20. DDT: A Research Tool for Automatic Data Distribution in High Performance Fortran

    Directory of Open Access Journals (Sweden)

    Eduard Ayguadé

    1997-01-01

    This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, and executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner-computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.

  1. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.

  2. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine if they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted five major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.

  3. High-Order Automatic Differentiation of Unmodified Linear Algebra Routines via Nilpotent Matrices

    Science.gov (United States)

    Dunham, Benjamin Z.

    This work presents a new automatic differentiation method, Nilpotent Matrix Differentiation (NMD), capable of propagating any order of mixed or univariate derivative through common linear algebra functions--most notably third-party sparse solvers and decomposition routines, in addition to basic matrix arithmetic operations and power series--without changing data-type or modifying code line by line; this allows differentiation across sequences of arbitrarily many such functions with minimal implementation effort. NMD works by enlarging the matrices and vectors passed to the routines, replacing each original scalar with a matrix block augmented by derivative data; these blocks are constructed with special sparsity structures, termed "stencils," each designed to be isomorphic to a particular multidimensional hypercomplex algebra. The algebras are in turn designed such that Taylor expansions of hypercomplex function evaluations are finite in length and thus exactly track derivatives without approximation error. Although this use of the method in the "forward mode" is unique in its own right, it is also possible to apply it to existing implementations of the (first-order) discrete adjoint method to find high-order derivatives with lowered cost complexity; for example, for a problem with N inputs and an adjoint solver whose cost is independent of N--i.e., O(1)--the N x N Hessian can be found in O(N) time, which is comparable to existing second-order adjoint methods that require far more problem-specific implementation effort. Higher derivatives are likewise less expensive--e.g., an N x N x N rank-three tensor can be found in O(N^2). Alternatively, a Hessian-vector product can be found in O(1) time, which may open up many matrix-based simulations to a range of existing optimization or surrogate modeling approaches. As a final corollary in parallel to the NMD-adjoint hybrid method, the existing complex-step differentiation (CD) technique is also shown to be capable of
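
    A first-order sketch of the block idea (each scalar a with derivative da becomes the 2x2 block a*I + da*N with N nilpotent, so an unmodified linear solver propagates exact derivatives); this illustrates the principle on np.linalg.solve and is not code from the dissertation:

        import numpy as np

        I2 = np.eye(2)
        N = np.array([[0.0, 1.0], [0.0, 0.0]])   # nilpotent "stencil": N @ N = 0

        def enlarge_matrix(A, dA):
            # Each scalar A[i, j] becomes the block A[i, j]*I2 + dA[i, j]*N.
            return np.kron(A, I2) + np.kron(dA, N)

        def enlarge_vector(b, db):
            # Each scalar b[i] becomes the 2-block [db[i], b[i]].
            return np.kron(b, [0.0, 1.0]) + np.kron(db, [1.0, 0.0])

        rng = np.random.default_rng(1)
        A  = rng.normal(size=(4, 4)); dA = rng.normal(size=(4, 4))
        b  = rng.normal(size=4);      db = rng.normal(size=4)

        # Unmodified third-party routine applied to the enlarged system:
        z = np.linalg.solve(enlarge_matrix(A, dA), enlarge_vector(b, db))
        x, dx = z[1::2], z[0::2]

        # Check against the analytic derivative of x(A, b) = inv(A) @ b:
        x_ref  = np.linalg.solve(A, b)
        dx_ref = np.linalg.solve(A, db - dA @ x_ref)
        print(np.allclose(x, x_ref), np.allclose(dx, dx_ref))   # -> True True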

  4. Low rank approach to computing first and higher order derivatives using automatic differentiation

    International Nuclear Information System (INIS)

    Reed, J. A.; Abdel-Khalik, H. S.; Utke, J.

    2012-01-01

    This manuscript outlines a new approach for increasing the efficiency of applying automatic differentiation (AD) to large-scale computational models. By using the principles of the Efficient Subspace Method (ESM), low-rank approximations of the derivatives for first and higher orders can be calculated with reduced computational resources. The output obtained from nuclear reactor calculations typically has a much smaller numerical rank compared to the number of inputs and outputs. This rank deficiency can be exploited to reduce the number of derivatives that need to be calculated using AD. The effective rank can be determined according to ESM by computing derivatives with AD at random inputs. Reduced or pseudo variables are then defined and new derivatives are calculated with respect to the pseudo variables. Two different AD packages are used: OpenAD and Rapsodia. OpenAD is used to determine the effective rank and the subspace that contains the derivatives. Rapsodia is then used to calculate derivatives with respect to the pseudo variables for the desired order. The overall approach is applied to two simple problems and to MATWS, a safety code for sodium-cooled reactors. (authors)
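
    A minimal sketch of the rank-detection step (random input probes reveal the effective rank of the derivative operator, so derivatives are then needed only with respect to that many pseudo variables); a synthetic low-rank model and finite-difference directional derivatives stand in for the reactor code and the AD engines:

        import numpy as np

        rng = np.random.default_rng(2)
        n_in, n_out, true_rank = 200, 100, 7
        U = rng.normal(size=(n_out, true_rank))
        V = rng.normal(size=(n_in, true_rank))
        f = lambda x: np.tanh(0.01 * (U @ (V.T @ x)))   # outputs span a rank-7 subspace

        def jacobian_action(x0, v, eps=1e-6):
            # Stand-in for an AD directional derivative J(x0) @ v.
            return (f(x0 + eps * v) - f(x0)) / eps

        # Probe the derivative with a handful of random directions.
        x0 = rng.normal(size=n_in)
        probes = rng.normal(size=(n_in, 20))
        Y = np.column_stack([jacobian_action(x0, probes[:, j]) for j in range(20)])

        # Effective rank from the singular-value decay of the probe responses;
        # the tolerance sits between the signal and the finite-difference noise.
        s = np.linalg.svd(Y, compute_uv=False)
        rank = int(np.sum(s > 1e-4 * s[0]))
        print(rank)   # prints 7 for this synthetic model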

  5. LOOP- SIMULATION OF THE AUTOMATIC FREQUENCY CONTROL SUBSYSTEM OF A DIFFERENTIAL MINIMUM SHIFT KEYING RECEIVER

    Science.gov (United States)

    Davarian, F.

    1994-01-01

    The LOOP computer program was written to simulate the Automatic Frequency Control (AFC) subsystem of a Differential Minimum Shift Keying (DMSK) receiver with a bit rate of 2400 baud. The AFC simulated by LOOP is a first-order loop configuration with a first-order R-C filter. NASA has been investigating the concept of mobile communications based on low-cost, low-power terminals linked via geostationary satellites. Studies have indicated that low bit rate transmission is suitable for this application, particularly from the frequency and power conservation point of view. A bit rate of 2400 BPS is attractive due to its applicability to the linear predictive coding of speech. Input to LOOP includes the following: 1) the initial frequency error; 2) the double-sided loop noise bandwidth; 3) the filter time constants; 4) the amount of intersymbol interference; and 5) the bit energy to noise spectral density. LOOP output includes: 1) the bit number and the frequency error of that bit; 2) the computed mean of the frequency error; and 3) the standard deviation of the frequency error. LOOP is written in MS SuperSoft FORTRAN 77 for interactive execution and has been implemented on an IBM PC operating under PC DOS with a memory requirement of approximately 40K of 8-bit bytes. This program was developed in 1986.
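
    A minimal sketch of the kind of first-order tracking loop LOOP simulates (a noisy frequency discriminator, a first-order R-C style filter, a per-bit correction, and the mean and standard deviation of the residual frequency error as outputs); the gains and noise level are illustrative assumptions, not LOOP's parameters:

        import random, statistics

        bit_rate = 2400                 # baud, as in LOOP
        gain = 0.05                     # first-order loop gain (assumed)
        alpha = 0.2                     # R-C filter smoothing factor (assumed)

        f_err, filt = 400.0, 0.0        # initial frequency error [Hz]
        errors = []
        for bit in range(2000):
            detector = f_err + random.gauss(0.0, 20.0)   # noisy discriminator
            filt += alpha * (detector - filt)            # first-order R-C filter
            f_err -= gain * filt                         # loop correction per bit
            errors.append(f_err)

        tail = errors[500:]             # steady state, transient discarded
        print(statistics.mean(tail), statistics.stdev(tail))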

  6. Differential evolution algorithm based automatic generation control for interconnected power systems with

    Directory of Open Access Journals (Sweden)

    Banaja Mohanty

    2014-09-01

    This paper presents the design and performance analysis of Differential Evolution (DE) algorithm based Proportional–Integral (PI) and Proportional–Integral–Derivative (PID) controllers for Automatic Generation Control (AGC) of an interconnected power system. Initially, a two-area thermal system with governor dead-band nonlinearity is considered for the design and analysis. In the proposed approach, the design problem is formulated as an optimization problem, and DE is employed to search for the optimal controller parameters. Three different objective functions are used for the design. The superiority of the proposed approach is shown by comparing the results with a recently published Craziness-based Particle Swarm Optimization (CPSO) technique for the same interconnected power system. It is observed that the dynamic performance of the DE-optimized PI controller is better than that of the CPSO-optimized PI controller. Additionally, the controller parameters are tuned at different loading conditions so that an adaptive gain-scheduling control strategy can be employed. The study is further extended to a more realistic network of a two-area six-unit system with different power generating units such as thermal, hydro, wind and diesel units, considering boiler dynamics for the thermal plants, Generation Rate Constraint (GRC), and Governor Dead Band (GDB) nonlinearity.

  7. Spaceborne Differential SAR Interferometry: Data Analysis Tools for Deformation Measurement

    Directory of Open Access Journals (Sweden)

    Michele Crosetto

    2011-02-01

    This paper is focused on spaceborne Differential Interferometric SAR (DInSAR) for land deformation measurement and monitoring. In the last two decades several DInSAR data analysis procedures have been proposed. The objective of this paper is to describe the DInSAR data processing and analysis tools developed at the Institute of Geomatics in almost ten years of research activities. Four main DInSAR analysis procedures are described, which range from standard DInSAR analysis based on a single interferogram to more advanced Persistent Scatterer Interferometry (PSI) approaches. These different procedures guarantee sufficient flexibility in DInSAR data processing. In order to provide a technical insight into these analysis procedures, a whole section discusses their main data processing and analysis steps, especially those needed in PSI analyses. A specific section is devoted to the core of our PSI analysis tools: the so-called 2+1D phase unwrapping procedure, which couples a 2D phase unwrapping, performed interferogram-wise, with a 1D phase unwrapping along time, performed pixel-wise. In the last part of the paper, some examples of DInSAR results are discussed, which were derived by standard DInSAR or PSI analyses. Most of these results were derived from X-band SAR data coming from the TerraSAR-X and CosmoSkyMed sensors.
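
    A minimal sketch of the 2+1D unwrapping idea on synthetic data (2D spatial unwrapping per interferogram, here via scikit-image, followed by 1D unwrapping along time, pixel-wise); real PSI processing adds pixel selection, atmospheric filtering, and model fitting on top of this:

        import numpy as np
        from skimage.restoration import unwrap_phase

        # Synthetic stack: a deformation ramp growing linearly over 8 epochs.
        ny, nx, epochs = 64, 64, 8
        ramp = np.linspace(0, 12, nx)[None, :] * np.ones((ny, 1))
        stack = np.array([(k + 1) * ramp / epochs for k in range(epochs)])
        wrapped = np.angle(np.exp(1j * stack))          # what the sensor delivers

        # Step 1: 2D phase unwrapping, interferogram-wise.
        spatial = np.array([unwrap_phase(w) for w in wrapped])

        # Step 2: 1D phase unwrapping along time, pixel-wise.
        unwrapped = np.unwrap(spatial, axis=0)

        # ~0: the relative deformation history is recovered.
        print(np.abs((unwrapped[-1] - unwrapped[0]) - (stack[-1] - stack[0])).max())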

  8. Attentional Bias for Pain and Sex, and Automatic Appraisals of Sexual Penetration: Differential Patterns in Dyspareunia vs Vaginismus?

    Science.gov (United States)

    Melles, Reinhilde J; Dewitte, Marieke D; Ter Kuile, Moniek M; Peters, Madelon M L; de Jong, Peter J

    2016-08-01

    Current information processing models propose that heightened attention bias for sex-related threats (e.g., pain) and lowered automatic incentive processes ("wanting") may play an important role in the impairment of sexual arousal and the development of sexual dysfunctions such as genitopelvic pain/penetration disorder (GPPPD). Differential threat and incentive processing may also help explain the stronger persistence of coital avoidance in women with vaginismus compared to women with dyspareunia. As the first aim, we tested whether women with GPPPD show (1) heightened attention for pain and sex, and (2) heightened threat and lower incentive associations with sexual penetration. Second, we examined whether the stronger persistence of coital avoidance in vaginismus vs dyspareunia might be explained by a stronger attentional bias or more dysfunctional automatic threat/incentive associations. Women with lifelong vaginismus (n = 37), dyspareunia (n = 29), and a no-symptoms comparison group (n = 51) completed a visual search task to assess attentional bias, and single-target implicit-association tests to measure automatic sex-threat and sex-wanting associations. There were no group differences in attentional bias or automatic associations. Correlational analysis showed that slowed detection of sex stimuli and stronger automatic threat associations were related to lowered sexual arousal. The findings do not corroborate the view that attentional bias for pain or sex contributes to coital pain, or that differences in coital avoidance may be explained by differences in attentional bias or automatic threat/incentive associations. However, the correlational findings are consistent with the view that automatic threat associations and impaired attention for sex stimuli may interfere with the generation of sexual arousal. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  9. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery to replace the natural lens with an artificial intraocular lens (IOL), which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure high patient satisfaction. Thus, studies on the visual behavior of these patients may be an important tool to determine the best type of IOL to implant. This study proposed an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient's visual routine in order to obtain additional information about the frequency of distant, intermediate, and near sights. Results: The results indicated an estimated frequency percentage, suggesting that visual analysis of routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing a visual management strategy after cataract surgery, simultaneously stimulating interest in customized IOL manufacturing according to individual needs.

  10. AUTOMATIC WINDING GENERATION USING MATRIX REPRESENTATION - ANFRACTUS TOOL 1.0

    Directory of Open Access Journals (Sweden)

    Daoud Ouamara

    2018-02-01

    This paper describes an original approach to AC/DC winding design in electrical machines. A research software package called "ANFRACTUS Tool 1.0", allowing automatic generation of all windings in multiphase electrical machines, has been developed using the matrix representation. Unlike existing methods, where the aim is to synthesize a winding with higher performance, the proposed method provides the opportunity to choose among all feasible windings. The specificity of this approach is that it takes only the numbers of slots, phases and layers as input parameters. The number of poles is not required to run the generation process. Winding generation by matrix representation may be applied for any number of slots, phases and layers. The software does not deal with the manner in which the coils are connected, only with the placement of the coils in each slot and their current sense. The waveform and the harmonic spectrum of the total magnetomotive force (MMF) are given as results.

  11. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  12. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
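
    The abstract's parameterized-trajectory idea can be made concrete with a short sketch. The following Python fragment samples a circular "virtual isocenter" path as a list of per-control-point axis values; the axis names, units, and dictionary layout are illustrative assumptions, not the actual Developer Mode XML schema.

    ```python
    import math

    def circle_trajectory(radius_cm, n_points, mu_total):
        """Sample a parameterized circular 'virtual isocenter' path, one
        control point per row; axis names and units are illustrative, not
        the Developer Mode XML schema."""
        points = []
        for i in range(n_points):
            t = i / (n_points - 1)               # normalized progress, 0..1
            angle = 2.0 * math.pi * t            # one full revolution
            points.append({
                "mu": mu_total * t,              # MU delivered so far
                "gantry_deg": 360.0 * t,         # gantry sweeps with the circle
                "couch_lat_cm": radius_cm * math.cos(angle),
                "couch_lng_cm": radius_cm * math.sin(angle),
            })
        return points

    for point in circle_trajectory(radius_cm=5.0, n_points=5, mu_total=100.0):
        print(point)
    ```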

  13. An open source automatic quality assurance (OSAQA) tool for the ACR MRI phantom.

    Science.gov (United States)

    Sun, Jidi; Barnes, Michael; Dowling, Jason; Menk, Fred; Stanwell, Peter; Greer, Peter B

    2015-03-01

    Routine quality assurance (QA) is essential to ensure MR scanner performance. This includes geometric distortion, slice positioning and thickness accuracy, high contrast spatial resolution, intensity uniformity, ghosting artefact and low contrast object detectability. However, this manual process can be very time consuming. This paper describes the development and validation of an open source tool to automate the MR QA process, which aims to increase physicist efficiency and improve the consistency of QA results by reducing human error. The OSAQA software was developed in Matlab and the source code is available for download from http://jidisun.wix.com/osaqa-project/. During program execution QA results are logged for immediate review and are also exported to a spreadsheet for long-term machine performance reporting. For the automatic contrast QA test, a user-specific contrast evaluation was designed to improve accuracy for individuals on different display monitors. American College of Radiology QA images were acquired over a period of 2 months to compare manual QA with the results from the proposed OSAQA software. OSAQA was found to significantly reduce the QA time from approximately 45 min to 2 min. Both the manual and OSAQA results were found to agree with regard to the recommended criteria, and the differences were insignificant compared to the criteria. The intensity homogeneity filter is necessary to obtain an image with acceptable quality while keeping the high contrast spatial resolution within the recommended criterion. The OSAQA tool has been validated on scanners with different field strengths and manufacturers. A number of suggestions have been made to improve both the phantom design and QA protocol in the future.
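
    As an illustration of one of the listed tests, the sketch below computes an ACR-style percent integral uniformity (PIU) metric on a synthetic phantom slice. It is a minimal stand-in, not OSAQA's implementation: the ACR protocol measures the hottest and coldest small sub-ROIs, for which percentiles substitute here.

    ```python
    import numpy as np

    def percent_integral_uniformity(roi):
        """ACR-style PIU = 100 * (1 - (high - low) / (high + low)). The ACR
        protocol takes high/low as means of small sub-ROIs; percentiles
        stand in here for brevity."""
        high = np.percentile(roi, 99)
        low = np.percentile(roi, 1)
        return 100.0 * (1.0 - (high - low) / (high + low))

    roi = np.random.normal(1000.0, 20.0, size=(64, 64))      # synthetic phantom slice
    print(f"PIU = {percent_integral_uniformity(roi):.1f}%")  # ACR asks >= 87.5% at 1.5 T
    ```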

  14. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    Science.gov (United States)

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-11-04

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS systems is translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database to obtain a list of active or in-preparation plans to be tested. It then runs in batch mode to compare all the plans, and a report of success or failure of the comparison is saved for review. This software tool was used as part of a software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that rely on repeating pretreatment QA measurements or on labor-intensive and fallible hand comparisons.
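
    A minimal sketch of the flatten-and-compare idea follows, assuming a toy plan XML; it is not the authors' schema mapping. Sibling elements with identical tags would need index-qualified paths in a real tool.

    ```python
    import xml.etree.ElementTree as ET

    def plan_to_dict(xml_text):
        """Flatten a plan document into {path: text}; sibling elements with
        identical tags would need index-qualified paths in a real tool."""
        def walk(elem, path, out):
            out[path] = (elem.text or "").strip()
            for child in elem:
                walk(child, path + "/" + child.tag, out)
        root = ET.fromstring(xml_text)
        flat = {}
        walk(root, root.tag, flat)
        return flat

    def compare_plans(old_xml, new_xml):
        """Report every parameter that differs or exists on only one side."""
        old, new = plan_to_dict(old_xml), plan_to_dict(new_xml)
        return [(k, old.get(k, "<absent>"), new.get(k, "<absent>"))
                for k in sorted(set(old) | set(new)) if old.get(k) != new.get(k)]

    old = "<Plan><Beam><MU>120.0</MU><Energy>6X</Energy></Beam></Plan>"
    new = "<Plan><Beam><MU>120.0</MU><Energy>10X</Energy></Beam></Plan>"
    for path, before, after in compare_plans(old, new):
        print(f"MISMATCH {path}: {before!r} -> {after!r}")
    ```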

  15. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomography (SPECT) images without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation

  16. Aerodynamic design applying automatic differentiation and using robust variable fidelity optimization

    Science.gov (United States)

    Takemiya, Tetsushi

    , and that (2) the AMF terminates optimization erroneously when the optimization problems have constraints. The first problem is due to inaccuracy in computing derivatives in the AMF, and the second problem is due to erroneous treatment of the trust region ratio, which sets the size of the domain for an optimization in the AMF. In order to solve the first problem of the AMF, the automatic differentiation (AD) technique, which reads the code of analysis models and automatically generates new derivative code based on mathematical rules, is applied. If derivatives are computed with the generated derivative code, they are analytical, and the required computational time is independent of the number of design variables, which is very advantageous for realistic aerospace engineering problems. However, if analysis models implement iterative computations such as computational fluid dynamics (CFD), which solves systems of partial differential equations iteratively, computing derivatives through AD requires a massive amount of memory. The author addressed this deficiency by modifying the AD approach and developing a more efficient implementation with CFD, and successfully applied AD to general CFD software. In order to solve the second problem of the AMF, the governing equation of the trust region ratio, which is very strict against the violation of constraints, is modified so that it can accept the violation of constraints within some tolerance. By accepting violations of constraints during the optimization process, the AMF can continue optimization without terminating prematurely and eventually find the true optimum design point. With these modifications, the AMF is referred to as "Robust AMF," and it is applied to airfoil and wing aerodynamic design problems using Euler CFD software. The former problem has 21 design variables, and the latter 64. In both problems, derivatives computed with the proposed AD method are first compared with those computed with the finite
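
    The AD technique described in this record is source transformation: new derivative code is generated from the analysis code. The sketch below instead shows the other classic flavor, forward-mode AD by operator overloading, which conveys the same chain-rule propagation in a few lines of Python.

    ```python
    class Dual:
        """Minimal forward-mode AD value: carries f and df/dx together,
        propagating exact derivatives through the chain rule."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)
        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1   # any code built from overloaded ops works

    x = Dual(2.0, 1.0)                 # seed dx/dx = 1
    y = f(x)
    print(y.val, y.dot)                # 17.0 and the exact derivative 14.0
    ```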

  17. MAISTAS: a tool for automatic structural evaluation of alternative splicing products.

    KAUST Repository

    Floris, Matteo; Raimondo, Domenico; Leoni, Guido; Orsini, Massimiliano; Marcatili, Paolo; Tramontano, Anna

    2011-01-01

    MOTIVATION: Analysis of the human genome revealed that the amount of transcribed sequence is an order of magnitude greater than the number of predicted and well-characterized genes. A sizeable fraction of these transcripts is related to alternatively spliced forms of known protein coding genes. Inspection of the alternatively spliced transcripts identified in the pilot phase of the ENCODE project has clearly shown that often their structure might substantially differ from that of other isoforms of the same gene, and therefore that they might perform unrelated functions, or that they might even not correspond to a functional protein. Identifying these cases is obviously relevant for the functional assignment of gene products and for the interpretation of the effect of variations in the corresponding proteins. RESULTS: Here we describe a publicly available tool that, given a gene or a protein, retrieves and analyses all its annotated isoforms, provides users with three-dimensional models of the isoform(s) of his/her interest whenever possible and automatically assesses whether homology derived structural models correspond to plausible structures. This information is clearly relevant. When the homology model of some isoforms of a gene does not seem structurally plausible, the implications are that either they assume a structure unrelated to that of the other isoforms of the same gene with presumably significant functional differences, or do not correspond to functional products. We provide indications that the second hypothesis is likely to be true for a substantial fraction of the cases. AVAILABILITY: http://maistas.bioinformatica.crs4.it/.

  18. MAISTAS: a tool for automatic structural evaluation of alternative splicing products.

    KAUST Repository

    Floris, Matteo

    2011-04-15

    MOTIVATION: Analysis of the human genome revealed that the amount of transcribed sequence is an order of magnitude greater than the number of predicted and well-characterized genes. A sizeable fraction of these transcripts is related to alternatively spliced forms of known protein coding genes. Inspection of the alternatively spliced transcripts identified in the pilot phase of the ENCODE project has clearly shown that often their structure might substantially differ from that of other isoforms of the same gene, and therefore that they might perform unrelated functions, or that they might even not correspond to a functional protein. Identifying these cases is obviously relevant for the functional assignment of gene products and for the interpretation of the effect of variations in the corresponding proteins. RESULTS: Here we describe a publicly available tool that, given a gene or a protein, retrieves and analyses all its annotated isoforms, provides users with three-dimensional models of the isoform(s) of his/her interest whenever possible and automatically assesses whether homology derived structural models correspond to plausible structures. This information is clearly relevant. When the homology model of some isoforms of a gene does not seem structurally plausible, the implications are that either they assume a structure unrelated to that of the other isoforms of the same gene with presumably significant functional differences, or do not correspond to functional products. We provide indications that the second hypothesis is likely to be true for a substantial fraction of the cases. AVAILABILITY: http://maistas.bioinformatica.crs4.it/.

  19. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    International Nuclear Information System (INIS)

    Zaffino, Paolo; Spadea, Maria Francesca; Raudaschl, Patrik; Fritscher, Karl; Sharp, Gregory C.

    2016-01-01

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, software without restrictions on anatomical district or image modality has been missing. In this paper we introduce PLASTIMATCH MABS, an open source software package that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch by using the same parameters as identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy, and magnetic resonance image brain labeling for neuroscience investigation. Results: For the neurological study, minimum Dice was equal to 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, minimum Dice was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). Time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min, respectively). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both commercial and research) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparing against

  20. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Zaffino, Paolo; Spadea, Maria Francesca [Department of Experimental and Clinical Medicine, Magna Graecia University of Catanzaro, Catanzaro 88100 (Italy); Raudaschl, Patrik; Fritscher, Karl [Institute for Biomedical Image Analysis, Private University of Health Sciences, Medical Informatics and Technology, Hall in Tirol 6060 (Austria); Sharp, Gregory C. [Department for Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States)

    2016-09-15

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, software without restrictions on anatomical district or image modality has been missing. In this paper we introduce PLASTIMATCH MABS, an open source software package that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch by using the same parameters as identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy, and magnetic resonance image brain labeling for neuroscience investigation. Results: For the neurological study, minimum Dice was equal to 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, minimum Dice was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). Time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min, respectively). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both commercial and research) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparing against
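
    One of the selectable "voting criteria" can be illustrated with the simplest case, per-voxel majority voting across registered atlas label maps; this is a generic sketch, not PLASTIMATCH MABS code.

    ```python
    import numpy as np

    def majority_vote(label_maps):
        """Fuse atlas segmentations by per-voxel majority vote, the simplest
        instance of a label-fusion voting criterion."""
        stack = np.stack(label_maps)                 # (n_atlases, *image_shape)
        labels = np.unique(stack)
        votes = np.stack([(stack == lab).sum(axis=0) for lab in labels])
        return labels[np.argmax(votes, axis=0)]

    # Three toy 2x3 atlas label maps (0 = background, 1 = organ).
    a = np.array([[0, 1, 1], [0, 0, 1]])
    b = np.array([[0, 1, 0], [0, 1, 1]])
    c = np.array([[1, 1, 1], [0, 0, 1]])
    print(majority_vote([a, b, c]))
    # [[0 1 1]
    #  [0 0 1]]
    ```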

  1. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    Science.gov (United States)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U.S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature which allows concave and convex polygon shapes of up to 40 points including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user-friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  2. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    OpenAIRE

    Baraka D. Sija; Young-Hoon Goo; Kyu-Seok Shim; Huru Hasanova; Myung-Sup Kim

    2018-01-01

    A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Enough knowledge on undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools towards ...

  3. A novel image toggle tool for comparison of serial mammograms: automatic density normalization and alignment-development of the tool and initial experience.

    Science.gov (United States)

    Honda, Satoshi; Tsunoda, Hiroko; Fukuda, Wataru; Saida, Yukihisa

    2014-12-01

    The purpose is to develop a new image toggle tool with automatic density normalization (ADN) and automatic alignment (AA) for comparing serial digital mammograms (DMGs). We developed an ADN and AA process to compare the images of serial DMGs. In image density normalization, a linear interpolation was applied by taking two points from high- and low-brightness areas. The alignment was calculated by determining the point of greatest correlation while shifting the alignment between the current and prior images. These processes were performed on a PC with a 3.20-GHz Xeon processor and 8 GB of main memory. We selected 12 suspected breast cancer patients who had undergone screening DMGs in the past. Automatic processing was retrospectively performed on these images. Two radiologists subjectively evaluated them. The processing of the developed algorithm took approximately 1 s per image. In our preliminary experience, two images could not be aligned appropriately. When they were aligned, image toggling allowed differences between examinations to be detected easily. We developed a new tool to facilitate comparative reading of DMGs on a mammography viewing system. Using this tool for toggling comparisons might improve the interpretation efficiency of serial DMGs.
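
    Both steps, linear density normalization through two anchor intensities and a correlation-maximizing shift search, can be sketched in a few lines of numpy. Percentile anchors and an exhaustive integer-shift search are simplifying assumptions, not the authors' exact procedure.

    ```python
    import numpy as np

    def normalize_density(img, low_pct=5, high_pct=95):
        """Linear density normalization through two anchor points taken from
        low- and high-brightness areas (percentiles stand in here)."""
        lo, hi = np.percentile(img, [low_pct, high_pct])
        return (img - lo) / (hi - lo)

    def best_shift(current, prior, max_shift=10):
        """Exhaustive search for the integer (dy, dx) shift of the prior image
        that maximizes correlation with the current image."""
        best, best_score = (0, 0), -np.inf
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(prior, dy, axis=0), dx, axis=1)
                score = np.corrcoef(current.ravel(), shifted.ravel())[0, 1]
                if score > best_score:
                    best, best_score = (dy, dx), score
        return best

    rng = np.random.default_rng(0)
    prior = rng.random((64, 64))
    current = np.roll(prior, (3, -2), axis=(0, 1))     # simulate positioning shift
    print(best_shift(normalize_density(current), normalize_density(prior)))
    # (3, -2)
    ```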

  4. RDFBuilder: a tool to automatically build RDF-based interfaces for MAGE-OM microarray data sources.

    Science.gov (United States)

    Anguita, Alberto; Martin, Luis; Garcia-Remesal, Miguel; Maojo, Victor

    2013-07-01

    This paper presents RDFBuilder, a tool that enables RDF-based access to MAGE-ML-compliant microarray databases. We have developed a system that automatically transforms the MAGE-OM model and microarray data stored in the ArrayExpress database into RDF format. Additionally, the system automatically enables a SPARQL endpoint. This allows users to execute SPARQL queries for retrieving microarray data, either from specific experiments or from more than one experiment at a time. Our system optimizes response times by caching and reusing information from previous queries. In this paper, we describe our methods for achieving this transformation. We show that our approach is complementary to other existing initiatives, such as Bio2RDF, for accessing and retrieving data from the ArrayExpress database. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
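
    The kind of SPARQL access the tool enables can be mimicked locally with rdflib; the namespace and property names below are hypothetical stand-ins for the MAGE-OM derived vocabulary, which the abstract does not spell out.

    ```python
    from rdflib import Graph, Literal, Namespace, URIRef

    # Hypothetical namespace and properties standing in for the MAGE-OM
    # derived vocabulary; the real predicate names are defined by the tool.
    MAGE = Namespace("http://example.org/mage-om#")

    g = Graph()
    exp = URIRef("http://example.org/experiment/E-TOY-1")
    g.add((exp, MAGE.name, Literal("Hypoxia time course")))
    g.add((exp, MAGE.species, Literal("Homo sapiens")))

    # The kind of query the generated SPARQL endpoint would answer.
    results = g.query("""
        PREFIX mage: <http://example.org/mage-om#>
        SELECT ?exp ?name WHERE { ?exp mage:name ?name . }
    """)
    for exp_uri, name in results:
        print(exp_uri, name)
    ```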

  5. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    Energy Technology Data Exchange (ETDEWEB)

    Wu, C [Sutter Medical Foundation, Roseville, CA (United States)

    2016-06-15

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed electronic treatment plan reporting software that enables fully automatic PDF reporting from the Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf will generate a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, Pinnacle plan summary, orthogonal views through each plan POI and maximum dose point, DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window or rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients’ plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  6. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    International Nuclear Information System (INIS)

    Wu, C

    2016-01-01

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed electronic treatment plan reporting software that enables fully automatic PDF reporting from the Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf will generate a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, Pinnacle plan summary, orthogonal views through each plan POI and maximum dose point, DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window or rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients’ plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  7. Modeling and monitoring of pipelines and networks advanced tools for automatic monitoring and supervision of pipelines

    CERN Document Server

    Torres, Lizeth

    2017-01-01

    This book focuses on the analysis and design of advanced techniques for on-line automatic computational monitoring of pipelines and pipe networks. It discusses how to improve the systems’ security considering mathematical models of the flow, historical flow rate and pressure data, with the main goal of reducing the number of sensors installed along a pipeline. The techniques presented in the book have been implemented in digital systems to enhance the abilities of the pipeline network’s operators in recognizing anomalies. A real leak scenario in a Mexican water pipeline is used to illustrate the benefits of these techniques in locating the position of a leak. Intended for an interdisciplinary audience, the book addresses researchers and professionals in the areas of mechanical, civil and control engineering. It covers topics on fluid mechanics, instrumentation, automatic control, signal processing, computing, construction and diagnostic technologies.

  8. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    Science.gov (United States)

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the use of the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method described, which was applied to 54 CT images from a sample of outpatients affected by cognitive impairment, enabled us to generate a model that overlaps the original image with good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. © The Author(s) 2015.

  9. flowAI: automatic and interactive anomaly discerning tools for flow cytometry data.

    Science.gov (United States)

    Monaco, Gianni; Chen, Hao; Poidinger, Michael; Chen, Jinmiao; de Magalhães, João Pedro; Larbi, Anis

    2016-08-15

    Flow cytometry (FCM) is widely used in both clinical and basic research to characterize cell phenotypes and functions. The latest FCM instruments analyze up to 20 markers of individual cells, producing high-dimensional data. This requires the use of the latest clustering and dimensionality reduction techniques to automatically segregate cell sub-populations in an unbiased manner. However, automated analyses may lead to false discoveries due to inter-sample differences in quality and properties. We present an R package, flowAI, containing two methods to clean FCM files from unwanted events: (i) an automatic method that adopts algorithms for the detection of anomalies and (ii) an interactive method with a graphical user interface implemented into an R shiny application. The general approach behind the two methods consists of three key steps to check and remove suspected anomalies that derive from (i) abrupt changes in the flow rate, (ii) instability of signal acquisition and (iii) outliers in the lower limit and margin events in the upper limit of the dynamic range. For each file analyzed, our software generates a summary of the quality assessment from the aforementioned steps. The software presented is an intuitive solution that seeks to improve the results not only of manual but, in particular, of automatic analysis of FCM data. R source code available through Bioconductor: http://bioconductor.org/packages/flowAI/ CONTACTS: mongianni1@gmail.com or Anis_Larbi@immunol.a-star.edu.sg Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
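
    The first of the three cleaning steps, removing abrupt changes in flow rate, can be approximated by binning event times and flagging bins whose rate departs from the median; this is a simplified stand-in for flowAI's algorithm, with illustrative thresholds.

    ```python
    import numpy as np

    def flag_flow_rate_anomalies(event_times, bin_seconds=0.1, z_cut=3.0):
        """Flag acquisition bins whose event rate departs abruptly from the
        median rate, a simplified stand-in for flowAI's flow-rate check."""
        bins = np.arange(0, event_times.max() + bin_seconds, bin_seconds)
        counts, _ = np.histogram(event_times, bins=bins)
        med = np.median(counts)
        mad = np.median(np.abs(counts - med)) or 1.0   # robust spread estimate
        z = 0.6745 * (counts - med) / mad              # modified z-score
        return bins[:-1][np.abs(z) > z_cut]            # start times of flagged bins

    rng = np.random.default_rng(1)
    times = rng.uniform(0, 10, 5000)                               # steady acquisition
    times = np.sort(np.concatenate([times, np.full(400, 4.05)]))   # brief clog/burst
    print(flag_flow_rate_anomalies(times))   # flags the burst bin near t = 4.0 s
    ```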

  10. DEEP--a tool for differential expression effector prediction.

    Science.gov (United States)

    Degenhardt, Jost; Haubrock, Martin; Dönitz, Jürgen; Wingender, Edgar; Crass, Torsten

    2007-07-01

    High-throughput methods for measuring transcript abundance, like SAGE or microarrays, are widely used for determining differences in gene expression between different tissue types, dignities (normal/malignant) or time points. Further analysis of such data frequently aims at the identification of gene interaction networks that form the causal basis for the observed properties of the systems under examination. To this end, it is usually not sufficient to rely on the measured gene expression levels alone; rather, additional biological knowledge has to be taken into account in order to generate useful hypotheses about the molecular mechanism leading to the realization of a certain phenotype. We present a method that combines gene expression data with biological expert knowledge on molecular interaction networks, as described by the TRANSPATH database on signal transduction, to predict additional--and not necessarily differentially expressed--genes or gene products which might participate in processes specific for either of the examined tissues or conditions. In a first step, significance values for over-expression in tissue/condition A or B are assigned to all genes in the expression data set. Genes with a significance value exceeding a certain threshold are used as starting points for the reconstruction of a graph with signaling components as nodes and signaling events as edges. In a subsequent graph traversal process, again starting from the previously identified differentially expressed genes, all encountered nodes 'inherit' all their starting nodes' significance values. In a final step, the graph is visualized, the nodes being colored according to a weighted average of their inherited significance values. Each node's, or sub-network's, predominant color, ranging from green (significant for tissue/condition A) over yellow (not significant for either tissue/condition) to red (significant for tissue/condition B), thus gives an immediate visual clue on which molecules--differentially
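
    The inheritance step can be sketched as a breadth-first traversal in which every node reachable from a seed collects that seed's significance value, and the per-node average stands in for the weighted color cue; the gene names and scores below are made up for illustration.

    ```python
    from collections import defaultdict, deque

    def propagate_significance(edges, seed_scores):
        """Every node reachable from a seed inherits that seed's significance
        value; the per-node mean then drives the green-to-red color cue."""
        adjacency = defaultdict(list)
        for src, dst in edges:
            adjacency[src].append(dst)
        inherited = defaultdict(list)
        for seed, score in seed_scores.items():
            seen, queue = {seed}, deque([seed])
            while queue:
                node = queue.popleft()
                inherited[node].append(score)
                for nxt in adjacency[node]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
        return {node: sum(s) / len(s) for node, s in inherited.items()}

    # Toy signaling edges and seed scores (negative = condition A, positive = B).
    edges = [("EGF", "EGFR"), ("EGFR", "RAS"), ("TNF", "TNFR"), ("TNFR", "RAS")]
    print(propagate_significance(edges, {"EGF": -0.9, "TNF": 0.8}))
    # RAS averages to -0.05: reached from seeds of both conditions, so "yellow"
    ```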

  11. Automatic centering device of a tool with regard to an aperture

    International Nuclear Information System (INIS)

    Delevalee, A.

    1993-01-01

    The manipulator arm carries a fixed support and a mobile support that can move perpendicularly to the axis of the tube. The mobile support can be held in any position by a brake arrangement. An index device allows the positioning of the mobile support in an initial predetermined position. A conical centering device can be placed coaxial with the tool and as it enters the tube ensures the alignment of the axis of the tool with the axis of the tube

  12. Development of a clinical applicable graphical user interface to automatically detect exercise oscillatory ventilation: The VOdEX-tool.

    Science.gov (United States)

    Cornelis, Justien; Denis, Tim; Beckers, Paul; Vrints, Christiaan; Vissers, Dirk; Goossens, Maggy

    2017-08-01

    Cardiopulmonary exercise testing (CPET) has gained importance in the prognostic assessment of patients, especially those with heart failure (HF). A meaningful prognostic parameter for early mortality in HF is exercise oscillatory ventilation (EOV). This abnormal respiratory pattern is recognized by hypo- and hyperventilation during CPET. Until now, assessment of EOV has mainly been done by visual agreement or manual calculation. The purpose of this research was to automate the interpretation of EOV so that this prognostic parameter could be readily investigated during CPET. Preliminarily, four definitions describing the original characteristics of EOV were selected and integrated in the "Ventilatory Oscillations during Exercise-tool" (VOdEX-tool), a graphical user interface that allows automated calculation of EOV. A Discrete Meyer level-2 wavelet transformation appeared to be the optimal filter to apply to the collected breath-by-breath minute ventilation CPET data. Diverse aspects of the definitions, i.e., cycle length, amplitude, regularity and total duration of EOV, were combined and calculated. The oscillations meeting the criteria were visualised. Filter methods and cut-off criteria were made adjustable for clinical application and research. The VOdEX-tool was connected to a database. The VOdEX-tool provides the possibility to calculate EOV automatically and to present the clinician an overview of the presence of EOV at a glance. The computerized analysis of EOV can be made readily available in clinical practice by integrating the tool in the manufacturers' existing CPET software. The VOdEX-tool enhances assessment of EOV and therefore contributes to the estimation of prognosis, especially in patients with HF. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
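
    The filtering step can be reproduced with PyWavelets' discrete Meyer wavelet ('dmey'): decompose the minute-ventilation series to level 2, discard the detail bands, and reconstruct. The peak-based cycle counting that follows is an illustrative criterion, not the tool's configured definitions.

    ```python
    import numpy as np
    import pywt                                   # PyWavelets
    from scipy.signal import find_peaks

    def smooth_ventilation(ve, wavelet="dmey", level=2):
        """Discrete Meyer wavelet filter: decompose to the given level, zero
        the detail bands, and reconstruct the smooth approximation."""
        coeffs = pywt.wavedec(ve, wavelet, level=level)
        coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(ve)]

    # Synthetic 10-min bout, one breath every ~2 s, with a 60-s oscillation
    # riding on breath-to-breath noise. The prominence cutoff is illustrative.
    t = np.arange(0.0, 600.0, 2.0)
    ve = 25 + 6 * np.sin(2 * np.pi * t / 60) + np.random.normal(0, 1.5, t.size)
    ve_smooth = smooth_ventilation(ve)
    peaks, _ = find_peaks(ve_smooth, prominence=5.0)   # hyperventilation crests
    print(len(peaks), "cycles, mean cycle length",
          round(float(np.diff(t[peaks]).mean()), 1), "s")
    ```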

  13. Nintendo WII remotes provide a reconfigurable tool-changing unit with an automatic calibration capability

    Directory of Open Access Journals (Sweden)

    Collins, James

    2014-08-01

    Full Text Available Modular machines within the reconfigurable manufacturing paradigm require auxiliary modules to enhance the system’s capability. A tool-changing unit was developed as one of these auxiliary modules. The unit had to be able to adapt itself efficiently to changes in the configuration of the machine it was servicing. This necessitated the development of a real- time 3D tracking system in order for the unit to sense alterations in the position of the spindle to which it was delivering tools. An economic positioning system was produced using Nintendo Wii remotes. This paper presents the development, implementation, and testing of this positioning system.

  14. A brief tool to differentiate factors contributing to insomnia complaints.

    Science.gov (United States)

    Townsend, Donald; Kazaglis, Louis; Savik, Kay; Smerud, Adam; Iber, Conrad

    2017-03-01

    A complaint of insomnia may have many causes. A brief tool examining contributing factors may be useful for nonsleep specialists. This study describes the development of the Insomnia Symptoms Assessment (ISA) for examining insomnia complaints. ISA questions were designed to identify symptoms that may represent 1 of 8 possible factors contributing to insomnia symptoms, including delayed sleep phase syndrome (DSPS), shift work sleep disorder (SWSD), obstructive sleep apnea (OSA), mental health, chronic pain, restless leg syndrome (RLS), poor sleep hygiene, and psychophysiological insomnia (PI). The ISA was completed by 346 new patients. Patients met with a sleep specialist who determined primary and secondary diagnoses. Mean age was 45 (18-85) years and 51% were male. Exploratory factor analysis (n = 217) and confirmatory factor analysis (n = 129) supported 5 factors with good internal consistency (Cronbach's alpha), including RLS (.72), OSA (.60), SWSD (.67), DSPS (.64), and PI (.80). Thirty percent had 1 sleep diagnosis with a mean of 2.2 diagnoses per patient. No diagnosis was entered for 1.2% of patients. The receiver operating characteristics were examined and the area under the curves calculated as an indication of convergent validity for the primary diagnosis (N = 346) were .97 for SWSD, .78 for OSA, .67 for DSPS, .54 for PI, and .80 for RLS. The ISA demonstrated good internal consistency and corresponds well to expert diagnoses. Next steps include setting sensitivity/specificity cutoffs to suggest initial treatment recommendations for use in other settings. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. A New Internet Tool for Automatic Evaluation in Control Systems and Programming

    Science.gov (United States)

    Munoz de la Pena, D.; Gomez-Estern, F.; Dormido, S.

    2012-01-01

    In this paper we present a web-based innovative education tool designed for automating the collection, evaluation and error detection in practical exercises assigned to computer programming and control engineering students. By using a student/instructor code-fusion architecture, the conceptual limits of multiple-choice tests are overcome by far.…

  16. A tool for automatic generation of RTL-level VHDL description of RNS FIR filters

    DEFF Research Database (Denmark)

    Re, Andrea Del; Nannarelli, Alberto; Re, Marco

    2004-01-01

    Although digital filters based on the Residue Number System (RNS) show high performance and low power dissipation, RNS filters are not widely used in DSP systems because of the complexity of the algorithms involved. We present a tool to design RNS FIR filters which hides the complexity of the RNS algorithms from the designer.
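
    Both the appeal and the complexity of RNS come from representing each value by its residues modulo pairwise coprime channels, so additions (and the multiply-accumulates of a FIR filter) proceed per channel without carries. A minimal sketch, with arbitrarily chosen moduli:

    ```python
    from math import prod

    MODULI = (7, 11, 13, 15)          # pairwise coprime; dynamic range = 15015

    def to_rns(x):
        """Represent an integer as its residues modulo each channel."""
        return tuple(x % m for m in MODULI)

    def rns_add(a, b):
        """Addition works independently per channel, with no carries between
        channels: the property that makes RNS datapaths fast and low power."""
        return tuple((ra + rb) % m for ra, rb, m in zip(a, b, MODULI))

    def from_rns(r):
        """Chinese Remainder Theorem reconstruction back to an integer."""
        M = prod(MODULI)
        x = 0
        for ri, mi in zip(r, MODULI):
            Mi = M // mi
            x += ri * Mi * pow(Mi, -1, mi)   # modular inverse (Python 3.8+)
        return x % M

    a, b = 1234, 4321
    print(from_rns(rns_add(to_rns(a), to_rns(b))))   # 5555
    ```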

  17. Application of automatic change of interval to de Vogelaere's method of the solution of the differential equation y'' = f (x, y)

    International Nuclear Information System (INIS)

    Rogers, M.H.

    1960-11-01

    The paper gives an extension to de Vogelaere's method for the solution of systems of second order differential equations from which first derivatives are absent. The extension is a description of the way in which automatic change in step-length can be made to give a prescribed accuracy at each step. (author)
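
    The automatic step-length idea can be illustrated with step doubling: advance with one step of h and with two steps of h/2, then halve or double h according to the discrepancy. For brevity the sketch uses a one-step velocity-Verlet integrator for y'' = f(x, y) rather than de Vogelaere's own formulas, whose coefficients are not reproduced here.

    ```python
    import math

    def verlet_step(f, x, y, v, h):
        """One velocity-Verlet step for y'' = f(x, y); returns (y, y') at x + h.
        A one-step stand-in for de Vogelaere's formulas, so the step length
        may change freely between steps."""
        a0 = f(x, y)
        y1 = y + h * v + 0.5 * h * h * a0
        v1 = v + 0.5 * h * (a0 + f(x + h, y1))
        return y1, v1

    def integrate_adaptive(f, x, y, v, h, x_end, tol=1e-8):
        """Automatic change of interval by step doubling: compare one step of h
        with two steps of h/2; halve h when the discrepancy exceeds the
        prescribed accuracy, double it when comfortably below."""
        while x < x_end:
            h = min(h, x_end - x)                    # land exactly on x_end
            y_big, _ = verlet_step(f, x, y, v, h)
            y_mid, v_mid = verlet_step(f, x, y, v, h / 2)
            y_two, v_two = verlet_step(f, x + h / 2, y_mid, v_mid, h / 2)
            err = abs(y_big - y_two)
            if err > tol:
                h /= 2                               # accuracy violated: refine
                continue
            x, y, v = x + h, y_two, v_two
            if err < tol / 50:
                h *= 2                               # comfortably accurate: relax
        return y

    # y'' = -y with y(0) = 0, y'(0) = 1 has solution sin(x); y(pi) should be ~0.
    print(integrate_adaptive(lambda x, y: -y, 0.0, 0.0, 1.0, 0.1, math.pi))
    ```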

  18. Validation of a Novel Digital Tool in Automatic Scoring of an Online ECG Examination at an International Cardiology Meeting.

    Science.gov (United States)

    Quinn, Kieran L; Crystal, Eugene; Lashevsky, Ilan; Arouny, Banafsheh; Baranchuk, Adrian

    2016-07-01

    We have previously developed a novel digital tool capable of automatically recognizing correct electrocardiography (ECG) diagnoses in an online exam and demonstrated a significant improvement in diagnostic accuracy when utilizing an inductive-deductive reasoning strategy over a pattern recognition strategy. In this study, we sought to validate these findings with participants at the International Winter Arrhythmia School meeting, one of the foremost electrophysiology events in Canada. Preregistration to the event was sent by e-mail. The exam was administered on day 1 of the conference. Results and analysis were presented the following morning to participants. Twenty-five attendees completed the exam, providing a total of 500 responses to be marked. The online tool accurately identified 195 of a total of 395 correct responses (49%). In total, 305 responses required secondary manual review, of which 200 were added to the correct responses pool. The overall accuracy of correct ECG diagnosis for all participants was 69% and 84% when using pattern recognition or inductive-deductive strategies, respectively. Utilization of a novel digital tool to evaluate ECG competency can be set up as a workshop at international meetings or educational events. Results can be presented during the sessions to ensure immediate feedback. © 2015 Wiley Periodicals, Inc.

  19. Survey on the differentiation of consumption in various types of automatic wood burners

    International Nuclear Information System (INIS)

    Primas, A.; Kistler, M.; Kessler, F.

    2006-01-01

    This final report published by the Swiss Federal Office of Energy (SFOE) takes a look at how statistics on wood-consumption can be differentiated to take various types of wood-fired heating systems into consideration. The approach used, which involved the taking of 1200 random samples from a total of 5200 installations, is described. Figures are presented on the return-quotients reached. The questionnaires returned were sorted according to the types of installation, such as industrial/commercial, farming, services and household. As a result of the high return-rate, the accuracy of the estimates based on the data is also considered to be high. The paper describes how the survey was made and how the results were obtained from the data collected. Details on operation, types of fuel, specific consumption and factors influencing operation are presented in graphical form. An appendix presents the data collected in tabular form

  20. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the automatic discrimination between local earthquakes occurring within the network and regional/teleseismic events occurring outside the network is performed. Finally, for the largest events, if a sufficient number of P-wave polarity readings is available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with event location and stations, as well as a table listing all the events, with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis, and it may represent a relevant tool not only for seismologists, but also for non

  1. Clinical score to differentiate scrub typhus and dengue: A tool to differentiate scrub typhus and dengue

    Directory of Open Access Journals (Sweden)

    Shubhanker Mitra

    2017-01-01

    Full Text Available Background: Dengue and scrub typhus share similar clinical and epidemiological features, and are difficult to differentiate at initial presentation. Many places are endemic to both these infections, where they comprise the majority of acute undifferentiated febrile illnesses. Materials and Methods: We aimed to develop a score that can differentiate scrub typhus from dengue. In this cross-sectional study, 188 cases of scrub typhus and 201 cases of dengue infection who presented to the emergency department or medicine outpatient clinic from September 2012 to April 2013 were included. Univariate followed by multivariate logistic regression analysis was performed to identify clinical features and laboratory results that were significantly different between the two groups. Each variable was assigned scores based on the strength of association, and the receiver operating characteristics area under the curve (ROC-AUC) was generated and compared. Six scoring models were explored to ascertain the model with the best fit. Results: Model 2 was developed using the following six variables: oxygen saturation (>90%, ≤90%), total white blood cell count (<7000, ≥7000 cells/cumm), hemoglobin (≤14 and >14 g/dL), total bilirubin (<200 and ≥200 IU/dL), and altered sensorium (present or absent). Each variable was assigned scores based on its strength of association. The AUC-ROC curve (95% confidence interval) for model 2 was 0.84 (0.79–0.89). At the cut-off score of 13, the sensitivity and specificity were 85% and 77% respectively, with a higher score favoring dengue. Conclusion: In areas with a high burden of ST and dengue, model 2 (the “clinical score to differentiate scrub typhus and dengue fever”) is a simple and rapid clinical scoring system that may be used to differentiate scrub typhus and dengue at initial presentation.

  2. Clinical Score to Differentiate Scrub Typhus and Dengue: A Tool to Differentiate Scrub Typhus and Dengue.

    Science.gov (United States)

    Mitra, Shubhanker; Gautam, Ira; Jambugulam, Mohan; Abhilash, Kundavaram Paul Prabhakar; Jayaseeelan, Vishalakshi

    2017-01-01

    Dengue and scrub typhus share similar clinical and epidemiological features, and are difficult to differentiate at initial presentation. Many places are endemic to both these infections, where they comprise the majority of acute undifferentiated febrile illnesses. We aimed to develop a score that can differentiate scrub typhus from dengue. In this cross-sectional study, 188 cases of scrub typhus and 201 cases of dengue infection who presented to the emergency department or medicine outpatient clinic from September 2012 to April 2013 were included. Univariate followed by multivariate logistic regression analysis was performed to identify clinical features and laboratory results that were significantly different between the two groups. Each variable was assigned scores based on the strength of association, and the receiver operating characteristics area under the curve (ROC-AUC) was generated and compared. Six scoring models were explored to ascertain the model with the best fit. Model 2 was developed using the following six variables: oxygen saturation (>90%, ≤90%), total white blood cell count (<7000, ≥7000 cells/cumm), hemoglobin (≤14 and >14 g/dL), total bilirubin (<200 and ≥200 IU/dL), and altered sensorium (present or absent). Each variable was assigned scores based on its strength of association. The AUC-ROC curve (95% confidence interval) for model 2 was 0.84 (0.79-0.89). At the cut-off score of 13, the sensitivity and specificity were 85% and 77% respectively, with a higher score favoring dengue. In areas with a high burden of ST and dengue, model 2 (the "clinical score to differentiate scrub typhus and dengue fever") is a simple and rapid clinical scoring system that may be used to differentiate scrub typhus and dengue at initial presentation.
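
    Operationally, such a score is just a weighted sum of dichotomized findings compared against the cutoff of 13. The sketch below shows the mechanics only; the point values are placeholders, not the published weights.

    ```python
    def dengue_vs_scrub_score(patient):
        """Weighted sum of dichotomized findings against the cutoff of 13
        (higher favors dengue). Point values are PLACEHOLDERS for
        illustration, not the published model weights."""
        score = 0
        if patient["spo2_pct"] > 90:
            score += 4        # hypothetical weight
        if patient["wbc_per_cumm"] < 7000:
            score += 4        # hypothetical weight
        if patient["hemoglobin_g_dl"] > 14:
            score += 4        # hypothetical weight
        if not patient["altered_sensorium"]:
            score += 4        # hypothetical weight
        return score, ("dengue" if score >= 13 else "scrub typhus")

    print(dengue_vs_scrub_score({
        "spo2_pct": 96, "wbc_per_cumm": 4200,
        "hemoglobin_g_dl": 15.1, "altered_sensorium": False,
    }))  # (16, 'dengue')
    ```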

  3. Strategy proposed by Electricite de France in the development of automatic tools

    Energy Technology Data Exchange (ETDEWEB)

    Castaing, C.; Cazin, B. [Electricite de France, Noisy le grand (France)

    1995-03-01

    The strategy proposed by EDF for the development of means to limit personal and collective dosimetry is recent. It follows on from a policy that consisted of developing remote operation means for activities of inspection and maintenance on the reactor, pool bottoms, steam generators (SGs) and reactor building valves, activities targeted because of their high dosimetric cost. One of the main duties of the UTO (Technical Support Department) within EDF is the maintenance of Pressurized Water Reactors in French Nuclear Power Plant operations (consisting of 54 units) and the development and monitoring of specialized tools. To achieve this, the UTO has started a national think-tank on the implementation of the ALARA process in its field of activity and created an ALARA Committee responsible for running and monitoring it, as well as a policy for developing tools. This point will be illustrated in the second part, on reactor vessel heads.

  4. Computer Tool for Automatically Generated 3D Illustration in Real Time from Archaeological Scanned Pieces

    OpenAIRE

    Luis López; Germán Arroyo; Domingo Martín

    2012-01-01

    The graphical documentation process of archaeological pieces requires the active involvement of a professional artist to recreate beautiful illustrations using a wide variety of expressive techniques. Frequently, the artist’s work is limited by the inconvenience of working only with the photographs of the pieces he is going to illustrate. This paper presents a software tool that allows the easy generation of illustrations in real time from 3D scanned models. The developed interface allows the...

  5. GIS (Geographic Information Systems) based automatic tool for selection of gas pipeline corridors

    Energy Technology Data Exchange (ETDEWEB)

    Matos, Denise F.; Menezes, Paulo Cesar P.; Paz, Luciana R.L.; Garcia, Katia C.; Cruz, Cristiane B.; Pires, Silvia H.M.; Damazio, Jorge M.; Medeiros, Alexandre M.

    2009-07-01

    This paper describes a methodology developed to build total accumulated surfaces in order to better select gas pipeline corridor alternatives. The methodology is based on the minimization of negative impacts and the use of Geographic Information Systems (GIS), allowing an automatic method of construction, evaluation and selection of alternatives that will contribute to the decision-making process. It is important to emphasize that this paper follows the assumptions presented in the research reports of a project sponsored by the Ministry of Mines and Energy (MME) and elaborated at the Electric Power Research Center (CEPEL), called 'Development of a Geographic Information System for the Oil and Gas Sectors in Brazil', and also the studies of the GTW Project (Gas to Wire). Gas pipelines, owing to their linear character, may cross a variety of habitats and settlements, increasing the complexity of their environmental management. Considering this reality, this paper presents a methodology that takes into account different environmental criteria (layers), according to the area impacted. From the synthesis of the criteria, the total accumulated surface is presented. An example is shown of a hypothetical gas pipeline connection between two points using the total accumulated surface. To select the 'impact scores' of the features, the gas pipeline was considered as a linear feature, but the result is a region formed by pixels, each pixel with an accumulated impact score lower than some arbitrary measure. This region is called a 'corridor', and it is the final result obtained using the proposed methodology. (author)
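
    A total accumulated surface of this kind can be built with Dijkstra's algorithm over the per-pixel impact raster; thresholding the combined surfaces from the two endpoints yields the pixel 'corridor'. The grid and scores below are toy values.

    ```python
    import heapq
    import numpy as np

    def accumulated_surface(impact, start):
        """Dijkstra over a raster of per-pixel impact scores: acc[r, c] is the
        least total impact of any 4-connected path from `start` to (r, c)."""
        rows, cols = impact.shape
        acc = np.full(impact.shape, np.inf)
        acc[start] = impact[start]
        heap = [(acc[start], start)]
        while heap:
            cost, (r, c) = heapq.heappop(heap)
            if cost > acc[r, c]:
                continue                              # stale heap entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and cost + impact[nr, nc] < acc[nr, nc]:
                    acc[nr, nc] = cost + impact[nr, nc]
                    heapq.heappush(heap, (acc[nr, nc], (nr, nc)))
        return acc

    impact = np.array([[1, 5, 1, 1],
                       [1, 9, 9, 1],
                       [1, 1, 1, 1]], dtype=float)
    acc_a = accumulated_surface(impact, start=(0, 0))   # from the origin point
    acc_b = accumulated_surface(impact, start=(2, 3))   # from the destination
    total = acc_a + acc_b - impact      # best route cost forced through each pixel
    print(total <= total.min() + 1.0)   # True cells form the low-impact corridor
    ```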

  6. Finger tapping movements of Parkinson's disease patients automatically rated using nonlinear delay differential equations.

    Science.gov (United States)

    Lainscsek, C; Rowat, P; Schettino, L; Lee, D; Song, D; Letellier, C; Poizner, H

    2012-03-01

    Parkinson's disease is a degenerative condition whose severity is assessed by clinical observations of motor behaviors. These are performed by a neurological specialist through subjective ratings of a variety of movements, including 10-s bouts of repetitive finger-tapping movements. We present here an algorithmic rating of these movements which may be beneficial for uniformly assessing the progression of the disease. Finger-tapping movements were digitally recorded from Parkinson's patients and controls, obtaining one time series for every 10-s bout. A nonlinear delay differential equation, whose structure was selected using a genetic algorithm, was fitted to each time series and its coefficients were used as a six-dimensional numerical descriptor. The algorithm was applied to time series from two different groups of Parkinson's patients and controls. The algorithmic scores compared favorably with the unified Parkinson's disease rating scale scores, at least when the latter adequately matched ratings from the Hoehn and Yahr scale. Moreover, when the two sets of mean scores for all patients are compared, there is a strong (r = 0.785) and significant (p < 0.0015) correlation between them.
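
    The descriptor construction can be sketched in a few lines: choose delays, build a polynomial regressor matrix from the delayed signal, estimate the derivative numerically, and solve for the coefficients by least squares. In the paper the model structure is selected by a genetic algorithm; the fixed delays and monomials below are illustrative assumptions.

    ```python
    import numpy as np

    def dde_descriptor(x, dt, tau1=5, tau2=10):
        """Least-squares fit of dx/dt = a1*x1 + a2*x2 + a3*x1^2 + a4*x1*x2 + a5*x2^2 + a6,
        where x1 = x(t - tau1) and x2 = x(t - tau2), delays given in samples.
        The six fitted coefficients serve as the movement descriptor."""
        idx = np.arange(max(tau1, tau2), len(x))
        x1, x2 = x[idx - tau1], x[idx - tau2]
        dxdt = np.gradient(x, dt)[idx]
        A = np.column_stack([x1, x2, x1**2, x1 * x2, x2**2, np.ones_like(x1)])
        coeffs, *_ = np.linalg.lstsq(A, dxdt, rcond=None)
        return coeffs

    # A synthetic 10-s "tapping" trace sampled at 100 Hz.
    t = np.arange(0.0, 10.0, 0.01)
    trace = np.sin(2 * np.pi * 2.5 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
    print(dde_descriptor(trace, dt=0.01))
    ```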

  7. HClass: Automatic classification tool for health pathologies using artificial intelligence techniques.

    Science.gov (United States)

    Garcia-Chimeno, Yolanda; Garcia-Zapirain, Begonya

    2015-01-01

    The classification of subjects' pathologies brings rigor to the treatment of certain pathologies, as doctors on occasion juggle so many variables that they can end up confusing some illnesses with others. Thanks to Machine Learning techniques applied to a health-record database, such a classification can be made using our algorithm, hClass. hClass provides non-linear classification of either a supervised, non-supervised or semi-supervised type. The machine is configured using other techniques such as validation of the set to be classified (cross-validation), reduction in features (PCA) and committees for assessing the various classifiers. The tool is easy to use: the sample matrix and the features one wishes to classify, the number of iterations and the subjects who are going to be used to train the machine are introduced as inputs. As a result, the success rate is shown either via a classifier or via a committee if one has been formed. A 90% success rate is obtained with the AdaBoost classifier and 89.7% in the case of a committee (comprising three classifiers) when PCA is applied. This tool can be expanded to allow the user to totally characterise the classifiers by adjusting them to each classification use.
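
    The described configuration (feature reduction, cross-validation and a classifier committee) maps directly onto standard open-source components. A minimal sketch with scikit-learn, using synthetic data and an arbitrary three-member committee rather than hClass's actual internals:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    # Stand-in health-record matrix: 300 subjects, 40 features, 2 pathologies.
    X, y = make_classification(n_samples=300, n_features=40, n_informative=8, random_state=0)

    committee = VotingClassifier(
        estimators=[
            ("ada", AdaBoostClassifier(random_state=0)),
            ("svm", SVC(probability=True, random_state=0)),
            ("knn", KNeighborsClassifier()),
        ],
        voting="soft",  # the committee averages the members' class probabilities
    )

    # PCA feature reduction feeding the committee, scored by cross-validation.
    model = make_pipeline(PCA(n_components=10), committee)
    print(f"cross-validated success rate: {cross_val_score(model, X, y, cv=5).mean():.3f}")
    ```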

  8. An Interactive Tool for Automatic Predimensioning and Numerical Modeling of Arch Dams

    Directory of Open Access Journals (Sweden)

    D. J. Vicente

    2017-01-01

    Full Text Available The construction of double-curvature arch dams is an attractive solution from an economic viewpoint due to the reduced volume of concrete necessary for their construction as compared to conventional gravity dams. Due to their complex geometry, many criteria have arisen for their design. However, the most widespread methods are based on recommendations of traditional technical documents, without taking into account the possibilities of computer-aided design. In this paper, an innovative software tool to design FEM models of double-curvature arch dams is presented. Several capabilities are provided: simplified geometry creation (interesting for academic purposes), preliminary geometrical design, highly detailed model construction, and performance of stochastic calculations (introducing uncertainty associated with material properties and other parameters). This paper focuses especially on geometrical issues, describing the functionalities of the tool and the fundamentals of the design procedure with regard to the following aspects: topography, reference cylinder, excavation depth, crown cantilever thickness and curvature, horizontal arch curvature, excavation and concrete mass volume, and additional elements such as joints or spillways. Examples of application on two Spanish dams are presented and the results obtained are analyzed.

  9. Design and Development of an Automatic Tool Changer for an Articulated Robot Arm

    International Nuclear Information System (INIS)

    Ambrosio, H; Karamanoglu, M

    2014-01-01

    In the creative industries, the length of time between the ideation stage and the making of physical objects is decreasing due to the use of CAD/CAM systems and additive manufacturing. Natural anisotropic materials, such as solid wood, can also be transformed using CAD/CAM systems, but only with subtractive processes such as machining with CNC routers. Whilst some 3-axis CNC routing machines are affordable and widely available, more flexible 5-axis routing machines still present too big an investment for small companies. Small refurbished articulated robots can be a cheaper alternative, but they require a light end-effector. This paper presents a new lightweight tool changer that converts a small 3 kg payload, 6-DOF robot into a robot apprentice able to machine wood and similar soft materials.

  10. Design and Development of an Automatic Tool Changer for an Articulated Robot Arm

    Science.gov (United States)

    Ambrosio, H.; Karamanoglu, M.

    2014-07-01

    In the creative industries, the length of time between the ideation stage and the making of physical objects is decreasing due to the use of CAD/CAM systems and additive manufacturing. Natural anisotropic materials, such as solid wood, can also be transformed using CAD/CAM systems, but only with subtractive processes such as machining with CNC routers. Whilst some 3-axis CNC routing machines are affordable and widely available, more flexible 5-axis routing machines still present too big an investment for small companies. Small refurbished articulated robots can be a cheaper alternative, but they require a light end-effector. This paper presents a new lightweight tool changer that converts a small 3 kg payload, 6-DOF robot into a robot apprentice able to machine wood and similar soft materials.

  11. Computer Tool for Automatically Generated 3D Illustration in Real Time from Archaeological Scanned Pieces

    Directory of Open Access Journals (Sweden)

    Luis López

    2012-11-01

    Full Text Available The graphical documentation process of archaeological pieces requires the active involvement of a professional artist to recreate beautiful illustrations using a wide variety of expressive techniques. Frequently, the artist's work is limited by the inconvenience of working only with photographs of the pieces to be illustrated. This paper presents a software tool that allows the easy generation of illustrations in real time from 3D scanned models. The developed interface allows the user to simulate very elaborate artistic styles through the creation of diagrams by using the available virtual lights. The software processes the diagrams to render an illustration from any given angle or position. Among the available virtual lights are well-known techniques such as silhouette enhancement, hatching, and toon shading.
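
    Toon shading, one of the virtual-light styles named above, is easy to state precisely: quantize the Lambertian diffuse term into a few flat bands. A small sketch (the tool's actual renderer is not described here; the sphere normals are just a test input):

    ```python
    import numpy as np

    def toon_shade(normals, light_dir, bands=4):
        """Quantize the Lambertian term N.L into flat bands (cel/toon shading)."""
        light = np.asarray(light_dir, dtype=float)
        light /= np.linalg.norm(light)
        lambert = np.clip(normals @ light, 0.0, 1.0)  # per-pixel diffuse intensity
        return np.ceil(lambert * bands) / bands       # discrete colour steps

    # Normals of a unit sphere on a 128x128 pixel grid (background stays dark).
    y, x = np.mgrid[-1:1:128j, -1:1:128j]
    inside = x**2 + y**2 < 1
    normals = np.zeros((128, 128, 3))
    normals[inside] = np.stack(
        [x[inside], y[inside], np.sqrt(1 - x[inside]**2 - y[inside]**2)], axis=1
    )
    shaded = toon_shade(normals, light_dir=(0.5, -0.5, 1.0))
    print(np.unique(shaded))  # a handful of flat intensity levels
    ```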

  12. A software tool for automatic classification and segmentation of 2D/3D medical images

    International Nuclear Information System (INIS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-01-01

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.

  13. A software tool for automatic classification and segmentation of 2D/3D medical images

    Energy Technology Data Exchange (ETDEWEB)

    Strzelecki, Michal, E-mail: michal.strzelecki@p.lodz.pl [Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, 90-924 Lodz (Poland); Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur [Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, 90-924 Lodz (Poland)

    2013-02-21

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.
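
    A typical texture-quantification step of the kind MaZda automates is the computation of grey-level co-occurrence matrix (GLCM) attributes over a region of interest. A minimal sketch with scikit-image (assuming version 0.19+ for the graycomatrix spelling), on a stand-in ROI:

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    roi = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)  # stand-in tissue ROI

    # Co-occurrence matrices at distance 1 for four directions, normalized.
    glcm = graycomatrix(
        roi, distances=[1], angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
        levels=64, symmetric=True, normed=True,
    )

    # Direction-averaged scalar texture attributes, usable as classifier features.
    features = {p: graycoprops(glcm, p).mean()
                for p in ("contrast", "homogeneity", "energy", "correlation")}
    print(features)
    ```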

  14. Automatic alternative phase-shift mask CAD layout tool for gate shrinkage of embedded DRAM in logic below 0.18 μm

    Science.gov (United States)

    Ohnuma, Hidetoshi; Kawahira, Hiroichi

    1998-09-01

    An automatic alternative phase-shift mask (PSM) pattern layout tool has been newly developed. The tool is dedicated to embedded-DRAM-in-logic devices, shrinking gate line width while improving line-width controllability in the lithography process at design rules below 0.18 micrometers with KrF excimer laser exposure. The tool can create Levenson-type PSMs to be used coupled with a binary mask in a double-exposure method for positive photoresist. Using graph representations of the layout, the tool automatically creates alternative PSM patterns without introducing any phase conflicts. By applying it to actual embedded-DRAM-in-logic cells, we have produced 0.16 micrometer gate resist patterns at both the random logic and DRAM areas. The patterns were fabricated using two masks with the double-exposure method. Gate line width has been well controlled under a practical exposure-focus window.
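
    The "no phase conflicts" guarantee has a clean graph interpretation: adjacent critical patterns must receive opposite phases (0°/180°), which is achievable exactly when the pattern-conflict graph is bipartite, i.e., 2-colorable. A minimal sketch of that check on a made-up conflict graph (the tool's actual graph construction is not described in the abstract):

    ```python
    from collections import deque

    def assign_phases(adjacency):
        """BFS 2-coloring of the conflict graph; adjacent patterns get opposite
        phases. Returns None when an odd cycle forces a phase conflict."""
        phase = {}
        for start in adjacency:
            if start in phase:
                continue
            phase[start] = 0
            queue = deque([start])
            while queue:
                u = queue.popleft()
                for v in adjacency[u]:
                    if v not in phase:
                        phase[v] = phase[u] ^ 1  # opposite phase (0 <-> 180 deg)
                        queue.append(v)
                    elif phase[v] == phase[u]:
                        return None              # odd cycle: unavoidable conflict
        return phase

    # Four gate patterns whose neighbouring critical features must alternate.
    conflicts = {"g1": ["g2"], "g2": ["g1", "g3"], "g3": ["g2", "g4"], "g4": ["g3"]}
    print(assign_phases(conflicts))  # {'g1': 0, 'g2': 1, 'g3': 0, 'g4': 1}
    ```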

  15. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    Directory of Open Access Journals (Sweden)

    Baraka D. Sija

    2018-01-01

    Full Text Available A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Sufficient knowledge of undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools for Protocol Reverse Engineering (PRE) and classifies them into four divisions: approaches that reverse engineer protocol finite state machines, approaches that reverse engineer protocol formats, approaches that reverse engineer both, and approaches that focus directly on neither protocol formats nor protocol finite state machines. The efficiency of all approaches' outputs, based on their selected inputs, is analyzed in general, along with the appropriate reverse engineering input formats. Additionally, we present a discussion and an extended classification in terms of automated versus manual approaches, known and novel categories of reverse-engineered protocols, and a literature review of reverse-engineered protocols in relation to the seven-layer OSI (Open Systems Interconnection) model.

  16. Assessing hippocampal development and language in early childhood: Evidence from a new application of the Automatic Segmentation Adapter Tool.

    Science.gov (United States)

    Lee, Joshua K; Nordahl, Christine W; Amaral, David G; Lee, Aaron; Solomon, Marjorie; Ghetti, Simona

    2015-11-01

    Volumetric assessments of the hippocampus and other brain structures during childhood provide useful indices of brain development and correlates of cognitive functioning in typically and atypically developing children. Automated methods such as FreeSurfer promise efficient and replicable segmentation, but may include errors which are avoided by trained manual tracers. A recently devised automated correction tool that uses a machine learning algorithm to remove systematic errors, the Automatic Segmentation Adapter Tool (ASAT), was capable of substantially improving the accuracy of FreeSurfer segmentations in an adult sample [Wang et al., 2011], but the utility of ASAT has not been examined in pediatric samples. In Study 1, the validity of FreeSurfer and ASAT-corrected hippocampal segmentations was examined in 20 typically developing children and 20 children with autism spectrum disorder aged 2 and 3 years. We showed that while neither FreeSurfer nor ASAT accuracy differed by disorder or age, the accuracy of ASAT-corrected segmentations was substantially better than that of FreeSurfer segmentations in every case, using as few as 10 training examples. In Study 2, we applied ASAT to 89 typically developing children aged 2 to 4 years to examine relations between hippocampal volume, age, sex, and expressive language. Girls had smaller hippocampi overall, and in the left hippocampus this difference was larger in older than in younger girls. Expressive language ability was greater in older children, and this difference was larger in those with larger hippocampi, bilaterally. Overall, this research shows that ASAT is highly reliable and useful for examinations relating behavior to hippocampal structure. © 2015 Wiley Periodicals, Inc.

  17. SplitRacer - a semi-automatic tool for the analysis and interpretation of teleseismic shear-wave splitting

    Science.gov (United States)

    Reiss, Miriam Christina; Rümpker, Georg

    2017-04-01

    We present a semi-automatic, graphical user interface tool for the analysis and interpretation of teleseismic shear-wave splitting in MATLAB. Shear-wave splitting analysis is a standard tool to infer seismic anisotropy, which is often interpreted as due to lattice-preferred orientation of e.g. mantle minerals or shape-preferred orientation caused by cracks or alternating layers in the lithosphere, and hence provides a direct link to the earth's kinematic processes. The increasing number of permanent stations and temporary experiments results in comprehensive studies of seismic anisotropy world-wide. Their successive comparison with a growing number of global models of mantle flow further advances our understanding of the earth's interior. However, increasingly large data sets pose the inevitable question as to how to process them. Well-established routines and programs are accurate but often slow and impractical for analyzing a large amount of data. Additionally, shear-wave splitting results are seldom evaluated using the same quality criteria, which complicates a straightforward comparison. SplitRacer consists of several processing steps: i) download of data via FDSNWS; ii) direct reading of miniSEED files and an initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) analysis of the particle motion of selected phases and successive correction of the sensor misalignment based on the long axis of the particle motion; iv) splitting analysis of selected events: seismograms are first rotated into radial and transverse components, then the energy-minimization method is applied, which provides the polarization and delay time of the phase. To estimate errors, the analysis is done for different randomly chosen time windows; v) joint splitting analysis of all events for one station, where the energy content of all phases is inverted simultaneously. This decreases the influence of noise and increases the robustness of the measurement
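
    Step iv is, at its core, a two-parameter grid search. The sketch below implements the closely related eigenvalue-minimization variant (Silver & Chan, 1991): rotate into a candidate fast/slow frame, undo a candidate delay, and keep the pair that best linearizes the corrected particle motion. SplitRacer's own energy-minimization formulation additionally uses the radial/transverse geometry, which is omitted here for brevity.

    ```python
    import numpy as np

    def splitting_parameters(north, east, dt, max_delay=4.0):
        """Grid search over fast-axis angle (deg) and delay time (s) that
        minimizes the smaller eigenvalue of the corrected-motion covariance."""
        best = (np.inf, None, None)
        for phi_deg in range(-90, 90, 2):
            phi = np.radians(phi_deg)
            fast = np.cos(phi) * north + np.sin(phi) * east
            slow = -np.sin(phi) * north + np.cos(phi) * east
            for shift in range(1, int(max_delay / dt)):
                f, s = fast[:-shift], slow[shift:]        # advance the slow trace
                lam_min = np.linalg.eigvalsh(np.cov(f, s))[0]
                if lam_min < best[0]:                     # most linear motion so far
                    best = (lam_min, phi_deg, shift * dt)
        return best[1], best[2]

    # Synthetic split wave: fast axis at 30 deg, slow component delayed by 1 s.
    dt = 0.05
    t = np.arange(0.0, 60.0, dt)
    f0 = np.exp(-((t - 30) / 3) ** 2)   # fast-axis waveform
    s0 = np.exp(-((t - 31) / 3) ** 2)   # same pulse, delayed by 1 s
    phi0 = np.radians(30.0)
    north = np.cos(phi0) * f0 - np.sin(phi0) * s0
    east = np.sin(phi0) * f0 + np.cos(phi0) * s0
    print(splitting_parameters(north, east, dt))  # close to (30, 1.0)
    ```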

  18. SU-C-202-03: A Tool for Automatic Calculation of Delivered Dose Variation for Off-Line Adaptive Therapy Using Cone Beam CT

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, B; Lee, S; Chen, S; Zhou, J; Prado, K; D’Souza, W; Yi, B [University of Maryland School of Medicine, Baltimore, MD (United States)

    2016-06-15

    Purpose: Monitoring the delivered dose is an important task for adaptive radiotherapy (ART) and for determining when to re-plan. A software tool which enables automatic delivered dose calculation using cone-beam CT (CBCT) has been developed and tested. Methods: The tool consists of four components: a CBCT Collecting Module (CCM), a Plan Registration Module (PRM), a Dose Calculation Module (DCM), and an Evaluation and Action Module (EAM). The CCM is triggered periodically (e.g., every day at 1:00 AM) to search for newly acquired CBCTs of patients of interest and then export the DICOM files of the images and related registrations defined in ARIA, followed by triggering of the PRM. The PRM imports the DICOM images and registrations and links the CBCTs to the related treatment plan of the patient in the planning system (RayStation V4.5, RaySearch, Stockholm, Sweden). A pre-determined CT-to-density table is automatically generated for dose calculation. The current version of the DCM uses a rigid registration which takes the treatment isocenter of the CBCT to be the isocenter of the treatment plan. It then starts the dose calculation automatically. The EAM evaluates the plan using pre-determined plan evaluation parameters: PTV dose-volume metrics and critical organ doses. The tool has been tested for 10 patients. Results: Automatic plans are generated and saved in the order of the treatment dates in the Adaptive Planning module of the RayStation planning system, without any manual intervention. Once the CTV dose deviates by more than 3%, both email and page alerts are sent to the physician and the physicist of the patient so that the case can be examined closely. Conclusion: The tool is capable of performing automatic dose tracking and of alerting clinicians when an action is needed. It is clinically useful for off-line adaptive therapy to catch any gross error. A practical way of determining alarm levels for OARs is under development.

  19. Autonomic Differentiation Map: A Novel Statistical Tool for Interpretation of Heart Rate Variability

    Directory of Open Access Journals (Sweden)

    Daniela Lucini

    2018-04-01

    Full Text Available In spite of the large body of evidence suggesting Heart Rate Variability (HRV), alone or combined with blood pressure variability (providing an estimate of baroreflex gain), as a useful technique to assess the autonomic regulation of the cardiovascular system, there is still an ongoing debate about methodology, interpretation, and clinical applications. In the present investigation, we hypothesize that non-parametric and multivariate exploratory statistical manipulation of HRV data could provide a novel informational tool useful to differentiate normal controls from clinical groups, such as athletes, or subjects affected by obesity, hypertension, or stress. With a data-driven protocol in 1,352 ambulant subjects, we compute HRV and baroreflex indices from short-term data series as proxies of autonomic (ANS) regulation. We apply a three-step statistical procedure, first removing age and gender effects. Subsequently, by factor analysis, we extract four ANS latent domains that retain the large majority of information (86.94%), subdivided into oscillatory (40.84%), amplitude (18.04%), pressure (16.48%), and pulse (11.58%) domains. Finally, we test the overall capacity to differentiate clinical groups vs. controls. To give more practical value and improve readability, statistical results concerning individual discriminant ANS proxies and ANS differentiation profiles are displayed through peculiar graphical tools, i.e., a significance diagram and an ANS differentiation map, respectively. This approach, which simultaneously uses all available information about the system, shows which domains make up the difference in ANS discrimination: e.g., athletes differ from controls in all domains, but with a graded strength: maximal in the (normalized) oscillatory and in the pulse domains, slightly less in the pressure domain and minimal in the amplitude domain. The application of multiple (non-parametric and exploratory) statistical and graphical tools to ANS proxies defines

  20. SplitRacer - a new Semi-Automatic Tool to Quantify And Interpret Teleseismic Shear-Wave Splitting

    Science.gov (United States)

    Reiss, M. C.; Rumpker, G.

    2017-12-01

    We have developed a semi-automatic, MATLAB-based GUI that combines standard seismological tasks for the analysis and interpretation of teleseismic shear-wave splitting. Shear-wave splitting analysis is widely used to infer seismic anisotropy, which can be interpreted in terms of lattice-preferred orientation of mantle minerals or shape-preferred orientation caused by fluid-filled cracks or alternating layers. Seismic anisotropy provides a unique link between directly observable surface structures and the more elusive dynamic processes in the mantle below. Thus, resolving the seismic anisotropy of the lithosphere/asthenosphere is of particular importance for geodynamic modeling and interpretations. The increasing number of seismic stations from temporary experiments and permanent installations creates a new basis for comprehensive studies of seismic anisotropy world-wide. However, the increasingly large data sets pose new challenges for the rapid and reliable analysis of teleseismic waveforms and for the interpretation of the measurements. Well-established routines and programs are available but are often impractical for analyzing large data sets from hundreds of stations. Additionally, shear-wave splitting results are seldom evaluated using the same well-defined quality criteria, which may complicate comparison with results from different studies. SplitRacer has been designed to overcome these challenges by incorporating the following processing steps: i) downloading of waveform data from multiple stations in mseed format using FDSNWS tools; ii) automated initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) particle-motion analysis of selected phases at longer periods to detect and correct for sensor misalignment; iv) splitting analysis of selected phases based on transverse-energy minimization for multiple, randomly selected, relevant time windows; v) one- and two-layer joint splitting analysis for all phases at one station by

  1. A Global Multi-Objective Optimization Tool for Design of Mechatronic Components using Generalized Differential Evolution

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Nørgård, Christian; Roemer, Daniel Beck

    2016-01-01

    This paper illustrates how the relatively simple constrained multi-objective optimization algorithm Generalized Differential Evolution 3 (GDE3) can assist with the practical sizing of mechatronic components used in e.g. digital displacement fluid power machinery. The studied bi- and tri-objective problems are solved under different optimization control parameter settings, and it is concluded that GDE3 is a reliable optimization tool that can assist mechatronic engineers in the design and decision-making process.
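
    A full GDE3 maintains a Pareto-ranked population; as a rough stand-in, one can trace a bi-objective front by scalarizing with a weight sweep and minimizing each scalarization with SciPy's single-objective differential evolution. The toy sizing objectives below are invented for illustration and are not from the paper:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Toy component sizing x = (radius, turns): response time vs. power loss conflict.
    def f1(x):  # favours a large actuator
        return 1.0 / (x[0] * x[1])

    def f2(x):  # favours a small actuator
        return x[0] ** 2 * x[1]

    bounds = [(0.1, 2.0), (1.0, 50.0)]
    front = []
    for w in np.linspace(0.05, 0.95, 10):
        # Each weighted sum yields one (locally) Pareto-optimal trade-off point.
        res = differential_evolution(lambda x: w * f1(x) + (1 - w) * f2(x), bounds, seed=0)
        front.append((f1(res.x), f2(res.x)))

    for a, b in front:
        print(f"f1 = {a:7.3f}   f2 = {b:7.3f}")
    ```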

  2. Attentional bias for pain and sex, and automatic appraisals of sexual penetration : Differential patterns in dyspareunia versus vaginismus?

    NARCIS (Netherlands)

    Melles, Reinhilde J.; Dewitte, Marieke D.; ter Kuile, Moniek M.; Peters, Madelon M.L.; Jong, de Peter J.

    Introduction: Current information processing models propose that heightened attention bias for sex-related threats (e.g., pain) and lowered automatic incentive processes ("wanting") may play an important role in the impairment of sexual arousal and the development of sexual dysfunctions such as

  3. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and costly natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events and for the practical purpose of precisely assessing inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, which moreover offer a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide an accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in the presence of flood, requires modelling the behavior of different objects in the scene in order to associate them with flood or no-flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
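
    The simplest Bayesian network for this kind of fusion is a naive-Bayes star: a hidden flood state with one edge to each discretized evidence layer (SAR backscatter, elevation, distance from the river). The sketch below is only a toy; the conditional probability tables are invented, and the paper's actual network structure and layers may differ.

    ```python
    import numpy as np

    # P(evidence state | class); rows: [no-flood, flood].
    p_sar = np.array([[0.7, 0.2, 0.1],   # backscatter: high / medium / low
                      [0.1, 0.3, 0.6]])
    p_elev = np.array([[0.2, 0.8],       # elevation: low / high
                       [0.9, 0.1]])
    p_dist = np.array([[0.3, 0.7],       # distance to river: near / far
                       [0.8, 0.2]])
    prior = np.array([0.9, 0.1])         # P(no-flood), P(flood)

    def flood_posterior(sar, elev, dist):
        """Posterior P(flood | evidence) by Bayes' rule under naive-Bayes fusion."""
        joint = prior * p_sar[:, sar] * p_elev[:, elev] * p_dist[:, dist]
        return joint[1] / joint.sum()

    # A low-backscatter, low-lying pixel near the river:
    print(f"P(flood | evidence) = {flood_posterior(2, 0, 0):.2f}")
    ```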

  4. The INECO Frontal Screening tool differentiates behavioral variant frontotemporal dementia (bv-FTD) from major depression

    Directory of Open Access Journals (Sweden)

    Natalia Fiorentino

    Full Text Available ABSTRACT Executive dysfunction may result from prefrontal circuitry involvement occurring in both neurodegenerative diseases and psychiatric disorders. Moreover, multiple neuropsychiatric conditions may present with overlapping behavioral and cognitive symptoms, making differential diagnosis challenging, especially during earlier stages. In this sense, cognitive assessment may contribute to the differential diagnosis by providing an objective and quantifiable set of measures that has the potential to distinguish clinical conditions otherwise perceived in everyday clinical settings as quite similar. Objective: The goal of this study was to investigate the utility of the INECO Frontal Screening (IFS) for differentiating bv-FTD patients from patients with Major Depression. Methods: We studied 49 patients with a bv-FTD diagnosis and 30 patients diagnosed with unipolar depression, compared to a control group of 26 healthy controls, using the INECO Frontal Screening (IFS), the Mini Mental State Examination (MMSE) and the Addenbrooke's Cognitive Examination-Revised (ACE-R). Results: Patient groups differed significantly on the motor inhibitory control (U=437.0, p<0.01), verbal working memory (U=298.0, p<0.001), spatial working memory (U=300.5, p<0.001), proverbs (U=341.5, p<0.001) and verbal inhibitory control (U=316.0, p<0.001) subtests, with bv-FTD patients scoring significantly lower than patients with depression. Conclusion: Our results suggest the IFS can be considered a useful tool for detecting executive dysfunction in both depression and bv-FTD patients and, perhaps more importantly, that it has the potential to help differentiate these two conditions.

  5. Nouns referring to tools and natural objects differentially modulate the motor system.

    Science.gov (United States)

    Gough, Patricia M; Riggio, Lucia; Chersi, Fabian; Sato, Marc; Fogassi, Leonardo; Buccino, Giovanni

    2012-01-01

    While increasing evidence points to a critical role for the motor system in language processing, the focus of previous work has been on the linguistic category of verbs. Here we tested whether nouns are effective in modulating the motor system and further whether different kinds of nouns - those referring to artifacts or natural items, and items that are graspable or ungraspable - would differentially modulate the system. A Transcranial Magnetic Stimulation (TMS) study was carried out to compare modulation of the motor system when subjects read nouns referring to objects which are Artificial or Natural and which are Graspable or Ungraspable. TMS was applied to the primary motor cortex representation of the first dorsal interosseous (FDI) muscle of the right hand at 150 ms after noun presentation. Analyses of Motor Evoked Potentials (MEPs) revealed that across the duration of the task, nouns referring to graspable artifacts (tools) were associated with significantly greater MEP areas. Analyses of the initial presentation of items revealed a main effect of graspability. The findings are in line with an embodied view of nouns, with MEP measures modulated according to whether nouns referred to natural objects or artifacts (tools), confirming tools as a special class of items in motor terms. Additionally our data support a difference for graspable versus non graspable objects, an effect which for natural objects is restricted to initial presentation of items. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    International Nuclear Information System (INIS)

    Vega, J.; Murari, A.; Ratta, G.A.; Gonzalez, S.; Dormido-Canto, S.

    2010-01-01

    Nowadays, processing all information of a fusion database is a much more important issue than acquiring more data. Although typically fusion devices produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time consuming and unreliable activity. The development of automatic methods to create specialized databases ensures first, the reduction of human efforts to identify and locate physical events, second, the standardization of criteria (reducing the vulnerability to human errors) and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (that can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.

  7. Automatic earthquake detection and classification with continuous hidden Markov models: a possible tool for monitoring Las Canadas caldera in Tenerife

    Energy Technology Data Exchange (ETDEWEB)

    Beyreuther, Moritz; Wassermann, Joachim [Department of Earth and Environmental Sciences (Geophys. Observatory), Ludwig Maximilians Universitaet Muenchen, D-80333 (Germany); Carniel, Roberto [Dipartimento di Georisorse e Territorio Universitat Degli Studi di Udine, I-33100 (Italy)], E-mail: roberto.carniel@uniud.it

    2008-10-01

    A possible interaction of (volcano-)tectonic earthquakes with the continuous seismic noise recorded on the volcanic island of Tenerife was recently suggested, but existing catalogues seem to be far from self-consistent, calling for the development of automatic detection and classification algorithms. In this work we propose the adoption of a methodology based on Hidden Markov Models (HMMs), already widely used in other fields, such as speech classification.
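
    The speech-recognition recipe carries over directly: train one HMM per event class on feature sequences, then classify an unseen record with the model of highest likelihood. A minimal sketch assuming the hmmlearn package and synthetic two-dimensional features in place of real seismic parametrizations:

    ```python
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(0)

    def synth(mean, n=200):
        """Stand-in per-frame feature sequence (e.g. spectral parameters)."""
        return rng.normal(mean, 1.0, size=(n, 2))

    train = {"tectonic": synth(0.0), "noise": synth(3.0)}

    # One Gaussian HMM per event class.
    models = {}
    for label, X in train.items():
        m = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
        m.fit(X)
        models[label] = m

    # Classification: maximum log-likelihood over the class-conditional models.
    test = synth(0.2, n=150)
    print(max(models, key=lambda k: models[k].score(test)))  # -> "tectonic"
    ```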

  8. Fourier Transform Infrared Spectroscopy (FTIR) as a Tool for the Identification and Differentiation of Pathogenic Bacteria.

    Science.gov (United States)

    Zarnowiec, Paulina; Lechowicz, Łukasz; Czerwonka, Grzegorz; Kaca, Wiesław

    2015-01-01

    Methods of human bacterial pathogen identification need to be fast, reliable, inexpensive, and time efficient. These requirements may be met by vibrational spectroscopic techniques. The method most often used for bacterial detection and identification is Fourier transform infrared spectroscopy (FTIR). It enables biochemical scans of whole bacterial cells or parts thereof at infrared frequencies (4,000-600 cm^-1). The recorded spectra must subsequently be transformed to minimize data variability and to amplify the chemically based spectral differences, facilitating spectral interpretation and analysis. In the next step, the transformed spectra are analyzed by data reduction tools, regression techniques, and classification methods. Chemometric analysis of FTIR spectra is a basic technique for discriminating between bacteria at the genus, species, and clonal levels. Examples of bacterial pathogen identification and methods of differentiation up to the clonal level, based on infrared spectroscopy, are presented below.
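
    The transform-then-classify chain described here is straightforward to prototype. A hedged sketch with SciPy/scikit-learn on fake spectra: a Savitzky-Golay second derivative as the variability-reducing transform, PCA for data reduction, and linear discriminant analysis as the classifier (window length, component counts and the synthetic amide-I-like band are all assumptions):

    ```python
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import FunctionTransformer, StandardScaler

    rng = np.random.default_rng(0)
    wavenumbers = np.linspace(4000, 600, 1700)

    # Fake absorbance spectra for two strains; class 1 gets an extra band near 1650 cm-1.
    X = rng.normal(size=(60, wavenumbers.size)).cumsum(axis=1)
    y = np.repeat([0, 1], 30)
    X[y == 1] += 5 * np.exp(-((wavenumbers - 1650) / 30) ** 2)

    # 2nd-derivative spectra -> scaling -> PCA -> LDA, scored by cross-validation.
    deriv = FunctionTransformer(
        lambda A: savgol_filter(A, window_length=15, polyorder=3, deriv=2, axis=1)
    )
    model = make_pipeline(deriv, StandardScaler(), PCA(n_components=10),
                          LinearDiscriminantAnalysis())
    print(f"cross-validated accuracy: {cross_val_score(model, X, y, cv=5).mean():.2f}")
    ```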

  9. Foodomics: A new tool to differentiate between organic and conventional foods.

    Science.gov (United States)

    Vallverdú-Queralt, Anna; Lamuela-Raventós, Rosa Maria

    2016-07-01

    The demand for organic food is increasing annually due to the growing consumer trend for more natural products that have simpler ingredient lists, involve less processing and are grown free of pesticides. However, there is still not enough nutritional evidence in favor of organic food consumption. Classical chemical analysis of macro- and micronutrients has demonstrated that organic crops are poorer in nitrogen, but clear evidence for other nutrients is lacking. Omics technologies forming part of the new discipline of foodomics have allowed the detection of possible nutritional differences between organic and conventional production, although many results remain controversial and contradictory. The main focus of this review is to provide an overview of the studies that use foodomics techniques as a tool to differentiate between organic and conventional production. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Neuroimaging in Parkinsonism: a study with magnetic resonance and spectroscopy as tools in the differential diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcellos, Luiz Felipe Rocha [Hospital dos Servidores do Estado, Rio de Janeiro RJ (Brazil)], e-mail: luizneurol@terra.com.br; Novis, Sergio A. Pereira; Rosso, Ana Lucia Z. [Hospital Universitario Clementino Fraga Filho (HUCFF), Rio de Janeiro, RJ (Brazil); Moreira, Denise Madeira [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Neurologia Deolindo Couto]; Leite, Ana Claudia C.B. [Fundacao Oswaldo Cruz (FIOCRUZ), Rio de Janeiro, RJ (Brazil)

    2009-03-15

    The differential diagnosis of Parkinsonism based on clinical features may sometimes be difficult. Diagnostic tests in these cases might be useful, especially magnetic resonance imaging, a noninvasive exam that is not as expensive as positron emission tomography and provides a good basis for anatomical analysis. Magnetic resonance spectroscopy analyzes cerebral metabolism, yielding inconsistent results in parkinsonian disorders. We selected 40 individuals for magnetic resonance imaging and spectroscopy analysis: 12 with Parkinson's disease, 11 with progressive supranuclear palsy, 7 with multiple system atrophy (parkinsonian type), and 10 individuals without any psychiatric or neurological disorders (controls). Clinical scales included Hoehn and Yahr, the unified Parkinson's disease rating scale and the mini mental status examination. The results showed that patients with Parkinson's disease and controls presented the same aspects on neuroimaging, with few or no abnormalities, whereas progressive supranuclear palsy and multiple system atrophy showed abnormalities, some of which were statistically significant. Thus, magnetic resonance imaging and spectroscopy could be useful as tools in the differential diagnosis of Parkinsonism. (author)

  11. Neuroimaging in Parkinsonism: a study with magnetic resonance and spectroscopy as tools in the differential diagnosis

    International Nuclear Information System (INIS)

    Vasconcellos, Luiz Felipe Rocha; Novis, Sergio A. Pereira; Rosso, Ana Lucia Z.; Moreira, Denise Madeira

    2009-01-01

    The differential diagnosis of Parkinsonism based on clinical features may sometimes be difficult. Diagnostic tests in these cases might be useful, especially magnetic resonance imaging, a noninvasive exam that is not as expensive as positron emission tomography and provides a good basis for anatomical analysis. Magnetic resonance spectroscopy analyzes cerebral metabolism, yielding inconsistent results in parkinsonian disorders. We selected 40 individuals for magnetic resonance imaging and spectroscopy analysis: 12 with Parkinson's disease, 11 with progressive supranuclear palsy, 7 with multiple system atrophy (parkinsonian type), and 10 individuals without any psychiatric or neurological disorders (controls). Clinical scales included Hoehn and Yahr, the unified Parkinson's disease rating scale and the mini mental status examination. The results showed that patients with Parkinson's disease and controls presented the same aspects on neuroimaging, with few or no abnormalities, whereas progressive supranuclear palsy and multiple system atrophy showed abnormalities, some of which were statistically significant. Thus, magnetic resonance imaging and spectroscopy could be useful as tools in the differential diagnosis of Parkinsonism. (author)

  12. Solving ordinary differential equations by electrical analogy: a multidisciplinary teaching tool

    Science.gov (United States)

    Sanchez Perez, J. F.; Conesa, M.; Alhama, I.

    2016-11-01

    Ordinary differential equations are the mathematical formulation for a great variety of problems in science and engineering, and frequently, two different problems are equivalent from a mathematical point of view when they are formulated by the same equations. Students acquire the knowledge of how to solve these equations (at least some types of them) using protocols and strict algorithms of mathematical calculation, without thinking about the meaning of the equation. The aim of this work is for students to learn to design network models or circuits in this way; with simple knowledge of them, students can establish the formal equivalence between electric circuits and differential equations, which allows them to connect knowledge from two disciplines and promotes the use of this interdisciplinary approach to address complex problems. Thereby they learn to use a multidisciplinary tool that allows them to solve these kinds of equations, even first-year engineering students, whatever the order, degree or type of non-linearity. This methodology has been implemented in numerous final degree projects in engineering and science, e.g., chemical engineering, building engineering, industrial engineering, mechanical engineering, architecture, etc. Applications are presented to illustrate the subject of this manuscript.
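
    The core of the analogy is a parameter dictionary: a series RLC circuit obeys L q'' + R q' + q/C = V(t), formally identical to the mechanical oscillator m x'' + c x' + k x = F(t) under L<->m, R<->c, 1/C<->k, V<->F, so solving one problem solves the other. A short numerical sketch (the paper builds physical network models rather than simulating them in Python; values are illustrative):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Series RLC driven by V(t) = sin(2t); equivalently m=1, c=0.5, k=4, F=sin(2t).
    L, R, C = 1.0, 0.5, 0.25

    def rlc(t, state):
        q, i = state            # charge <-> displacement, current <-> velocity
        didt = (np.sin(2.0 * t) - R * i - q / C) / L
        return [i, didt]

    sol = solve_ivp(rlc, (0.0, 20.0), [0.0, 0.0], max_step=0.01)
    print(f"q(20) = {sol.y[0, -1]:+.4f}  (also x(20) of the mechanical twin)")
    ```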

  13. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer.

    Science.gov (United States)

    Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E

    2013-06-07

    In radiation oncology, recurrence analysis is an important part of the evaluation process and clinical quality assurance of treatment concepts. With the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose-volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border/outfield. With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm radius; however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
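
    The central dosimetric quantity, the fraction of the recurrence volume inside an isodose volume, reduces to boolean-mask arithmetic once dose and recurrence are on the same grid. A toy sketch with a synthetic dose distribution (grid size, dose model and recurrence sphere are invented; clinical code would read registered DICOM dose and structure data):

    ```python
    import numpy as np

    shape = (64, 64, 64)                 # toy dose grid (voxels)
    z, y, x = np.indices(shape)

    # Synthetic dose peaking at the boost centre, plus a spherical recurrence.
    dose = 60.0 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) / 400.0)
    recurrence = (x - 36) ** 2 + (y - 30) ** 2 + (z - 33) ** 2 <= 6 ** 2

    # Fraction of the recurrence volume lying inside the 80%-isodose volume.
    isodose80 = dose >= 0.8 * dose.max()
    overlap = np.logical_and(recurrence, isodose80).sum() / recurrence.sum()
    print(f"{overlap:.1%} of the recurrence volume is within the 80% isodose")
    ```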

  14. Differential Arc expression in the hippocampus and striatum during the transition from attentive to automatic navigation on a plus maze

    Science.gov (United States)

    Gardner, Robert S.; Suarez, Daniel F.; Robinson-Burton, Nadira K.; Rudnicky, Christopher J.; Gulati, Asish; Ascoli, Giorgio A.; Dumas, Theodore C.

    2016-01-01

    The strategies utilized to effectively perform a given task change with practice and experience. During a spatial navigation task, with relatively little training, performance is typically attentive enabling an individual to locate the position of a goal by relying on spatial landmarks. These (place) strategies require an intact hippocampus. With task repetition, performance becomes automatic; the same goal is reached using a fixed response or sequence of actions. These (response) strategies require an intact striatum. The current work aims to understand the activation patterns across these neural structures during this experience-dependent strategy transition. This was accomplished by region-specific measurement of activity-dependent immediate early gene expression among rats trained to different degrees on a dual-solution task (i.e., a task that can be solved using either place or response navigation). As expected, rats increased their reliance on response navigation with extended task experience. In addition, dorsal hippocampal expression of the immediate early gene Arc was considerably reduced in rats that used a response strategy late in training (as compared with hippocampal expression in rats that used a place strategy early in training). In line with these data, vicarious trial and error, a behavior linked to hippocampal function, also decreased with task repetition. Although Arc mRNA expression in dorsal medial or lateral striatum alone did not correlate with training stage, the ratio of expression in the medial striatum to that in the lateral striatum was relatively high among rats that used a place strategy early in training as compared with the ratio among over-trained response rats. Altogether, these results identify specific changes in the activation of dissociated neural systems that may underlie the experience-dependent emergence of response-based automatic navigation. PMID:26976088

  15. Species and tissues specific differentiation of processed animal proteins in aquafeeds using proteomics tools.

    Science.gov (United States)

    Rasinger, J D; Marbaix, H; Dieu, M; Fumière, O; Mauro, S; Palmblad, M; Raes, M; Berntssen, M H G

    2016-09-16

    The rapidly growing aquaculture industry drives the search for sustainable protein sources in fish feed. In the European Union (EU), non-ruminant processed animal proteins (PAP) have been permitted again in aquafeeds since 2013. To ensure that commercial fish feeds do not contain PAP from prohibited species, EU reference methods were established. However, due to the heterogeneous and complex nature of PAP, complementary methods are required to guarantee the safe use of this fish feed ingredient. In addition, there is a need for tissue-specific PAP detection to identify the sources (i.e. bovine carcass, blood, or meat) of illegal PAP use. In the present study, we investigated and compared different protein extraction, solubilisation and digestion protocols on different proteomics platforms for the detection and differentiation of prohibited PAP. In addition, we assessed whether tissue-specific PAP detection was feasible using proteomics tools. All work was performed independently in two different laboratories. We found that, irrespective of sample preparation, gel-based proteomics tools were inappropriate when working with PAP. Gel-free shotgun proteomics approaches in combination with direct spectral comparison were able to provide quality species- and tissue-specific data to complement and refine current methods of PAP detection and identification. To guarantee the safe use of processed animal protein (PAP) in aquafeeds, efficient PAP detection and monitoring tools are required. The present study investigated and compared various proteomics workflows and shows that the application of shotgun proteomics in combination with direct comparison of spectral libraries provides the desired species- and tissue-specific classification of this heat-sterilized and pressure-treated (≥133 °C at 3 bar for 20 min) protein feed ingredient. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón, and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources: the meter had to be consulted physically, the information imported into Infor EAM by hand, and the errors that can occur when doing all of this manually had to be detected and corrected. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main target I had when developing my solution was flexibility and scalability so as to make...

  17. Scan-Less Line Field Optical Coherence Tomography, with Automatic Image Segmentation, as a Measurement Tool for Automotive Coatings

    Directory of Open Access Journals (Sweden)

    Samuel Lawman

    2017-04-01

    Full Text Available The measurement of the thicknesses of layers is important for the quality assurance of industrial coating systems. Current measurement techniques only provide a limited amount of information. Here, we show that spectral domain Line Field (LF) Optical Coherence Tomography (OCT) is able to return to the user a cross-sectional B-Scan image in a single shot, with no mechanical moving parts. To reliably extract layer thicknesses from such images of automotive paint systems, we present an automatic graph-search image segmentation algorithm. To show that the algorithm works independently of the OCT device, the measurements are repeated with a separate time domain Full Field (FF) OCT system. This gives matching mean thickness values within the standard deviations of the measured thicknesses across each B-Scan image. The combination of LF-OCT with graph-search segmentation is potentially a powerful technique for the quality assurance of non-opaque industrial coating layers.
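
    A graph-search layer segmentation of this kind can be sketched as a dynamic-programming shortest path across the B-scan columns, with a cost that is low on strong vertical intensity gradients (layer interfaces). The sketch below is a generic seam tracer on a synthetic B-scan, not the paper's algorithm:

    ```python
    import numpy as np

    def trace_layer(bscan):
        """Minimum-cost left-to-right path through the B-scan; moves are limited
        to the three neighbouring rows of the previous column."""
        cost = 1.0 / (1e-6 + np.abs(np.gradient(bscan, axis=0)))  # edges are cheap
        rows, cols = cost.shape
        acc = cost.copy()
        for c in range(1, cols):
            up = np.roll(acc[:, c - 1], 1)
            up[0] = np.inf
            down = np.roll(acc[:, c - 1], -1)
            down[-1] = np.inf
            acc[:, c] += np.minimum(acc[:, c - 1], np.minimum(up, down))
        # Greedy backtrack of the cheapest row, column by column.
        path = [int(np.argmin(acc[:, -1]))]
        for c in range(cols - 2, -1, -1):
            r = path[-1]
            lo = max(r - 1, 0)
            path.append(lo + int(np.argmin(acc[lo:min(r + 2, rows), c])))
        return np.array(path[::-1])

    # Toy B-scan: a bright coating interface near row 40, buried in speckle.
    rng = np.random.default_rng(0)
    img = rng.normal(0.0, 0.1, size=(128, 256))
    img[40:43, :] += 1.0
    print(trace_layer(img)[:10])  # row indices hugging the interface
    ```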

  18. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    Science.gov (United States)

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step for presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by the FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system used and import it into statistical software like SAS software or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After the file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined, and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data is only stored in the application while the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.

  19. Fuchsia : A tool for reducing differential equations for Feynman master integrals to epsilon form

    Science.gov (United States)

    Gituliar, Oleksandr; Magerya, Vitaly

    2017-10-01

    We present Fuchsia - an implementation of the Lee algorithm, which for a given system of ordinary differential equations with rational coefficients ∂_x J(x, ε) = A(x, ε) J(x, ε) finds a basis transformation T(x, ε), i.e., J(x, ε) = T(x, ε) J′(x, ε), such that the system turns into the epsilon form: ∂_x J′(x, ε) = ε S(x) J′(x, ε), where S(x) is a Fuchsian matrix. A system of this form can be trivially solved in terms of polylogarithms as a Laurent series in the dimensional regulator ε. That makes the construction of the transformation T(x, ε) crucial for obtaining solutions of the initial system. In principle, Fuchsia can deal with any regular system; however, its primary task is to reduce differential equations for Feynman master integrals. It ensures that solutions contain only regular singularities due to the properties of Feynman integrals. Program Files doi: http://dx.doi.org/10.17632/zj6zn9vfkh.1 Licensing provisions: MIT. Programming language: Python 2.7. Nature of problem: Feynman master integrals may be calculated from solutions of a linear system of differential equations with rational coefficients. Such a system can be easily solved as an ε-series when its epsilon form is known. Hence, a tool which is able to find the epsilon form transformations can be used to evaluate Feynman master integrals. Solution method: The solution method is based on the Lee algorithm (Lee, 2015), which consists of three main steps: fuchsification, normalization, and factorization. During the fuchsification step a given system of differential equations is transformed into the Fuchsian form with the help of the Moser method (Moser, 1959). Next, during the normalization step the system is transformed to the form where eigenvalues of all residues are proportional to the dimensional regulator ε. Finally, the system is factorized to the epsilon form by finding an unknown transformation which satisfies a system of linear equations. Additional comments
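
    The reason the epsilon form is "trivially" solvable is worth one line of algebra: expanding the solution in powers of ε turns the system into a recursion of iterated integrals. A sketch in the abstract's notation, with x_0 an arbitrary base point:

    ```latex
    \partial_x J'(x,\varepsilon) = \varepsilon\, S(x)\, J'(x,\varepsilon),
    \qquad
    J'(x,\varepsilon) = \sum_{n \ge 0} \varepsilon^{n} J_{n}(x)
    \;\Longrightarrow\;
    J_{n}(x) = J_{n}(x_{0}) + \int_{x_{0}}^{x} S(t)\, J_{n-1}(t)\,\mathrm{d}t .
    ```

    Each order is an iterated integral over the rational, simple-pole kernel S(x), which is precisely the class of functions expressible through polylogarithms.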

  20. A Modern Automatic Chamber Technique as a Powerful Tool for CH4 and CO2 Flux Monitoring

    Science.gov (United States)

    Mastepanov, M.; Christensen, T. R.; Lund, M.; Pirk, N.

    2014-12-01

    A number of similar systems were used for monitoring of CH4 and CO2 exchange by the automatic chamber method in a range of different ecosystems. The measurements were carried out in northern Sweden (mountain birch forest near Abisko, 68°N, 2004-2010), southern Sweden (forest bog near Hässleholm, 56°N, 2007-2014), northeastern Greenland (arctic fen in Zackenberg valley, 74°N, 2005-2014), southwestern Greenland (fen near Nuuk, 64°N, 2007-2014), and central Svalbard (arctic fen near Longyearbyen, 78°N, 2011-2014). These in total 37 seasons of measurements delivered not only a large amount of valuable flux data, including a few novel findings (Mastepanov et al., Nature, 2008; Mastepanov et al., Biogeosciences, 2013), but also valuable experience with implementation of the automatic chamber technique using modern analytical instruments and computer technologies. A range of high-resolution CH4 analysers (DLT-100, FMA, FGGA - Los Gatos Research), CO2 analyzers (EGM-4, SBA-4 - PP Systems; Li-820 - Li-Cor Biosciences), as well as a Methane Carbon Isotope Analyzer (Los Gatos Research), have proven suitable for precise measurements of fluxes, from as low as 0.1 mg CH4 m-2 d-1 (wintertime measurements at Zackenberg, unpublished) to as high as 2.4 g CH4 m-2 d-1 (autumn burst 2007 at Zackenberg, Mastepanov et al., Nature, 2008). Some of these instruments had to be customized to accommodate 24/7 operation in harsh arctic conditions. In this presentation we will explain some of these customizations. The high frequency of concentration measurements (1 Hz in most cases) provides a unique opportunity for quality control of flux calculations; on the other hand, this enormous amount of data can be analyzed only using highly automated algorithms. A specialized software package was developed and improved through the years of measurements and data processing. This software automates the data flow from raw concentration data of different instruments and sensors and various status records
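
    The flux calculation behind such systems is compact: regress the chamber concentration against time during closure, then convert the slope to an areal flux with the ideal gas law. A hedged sketch (chamber geometry, temperature and pressure are made-up inputs; the quality control of the fitting window is omitted):

    ```python
    import numpy as np

    def chamber_flux(conc_ppm, t_s, volume_m3, area_m2, temp_k=283.0, press_pa=101325.0):
        """CH4 flux from a closed-chamber time series: slope (ppm/s) -> mol m-2 s-1
        via n = pV/RT, then converted to mg CH4 m-2 d-1 (16.04 g/mol)."""
        slope = np.polyfit(t_s, conc_ppm, 1)[0]            # ppm per second
        mol_air = press_pa * volume_m3 / (8.314 * temp_k)  # moles of air in chamber
        mol_flux = slope * 1e-6 * mol_air / area_m2        # mol CH4 m-2 s-1
        return mol_flux * 16.04e3 * 86400.0                # mg CH4 m-2 d-1

    # Five-minute closure sampled at 1 Hz with a gentle concentration rise.
    t = np.arange(0.0, 300.0, 1.0)
    conc = 1.9 + 2e-4 * t + np.random.default_rng(0).normal(0.0, 0.002, t.size)
    print(f"{chamber_flux(conc, t, volume_m3=0.05, area_m2=0.14):.1f} mg CH4 m-2 d-1")
    ```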

  1. Differentially pumped spray deposition as a rapid screening tool for organic and perovskite solar cells

    Science.gov (United States)

    Jung, Yen-Sook; Hwang, Kyeongil; Scholes, Fiona H.; Watkins, Scott E.; Kim, Dong-Yu; Vak, Doojin

    2016-01-01

    We report a spray deposition technique as a screening tool for solution-processed solar cells. A dual-feed spray nozzle is introduced to deposit donor and acceptor materials separately and to form blended films on substrates in situ. Using a differential pump system with a motorised spray nozzle, the effect of film thickness, solution flow rates and the blend ratio of donor and acceptor materials on device performance can be found in a single experiment. Using this method, polymer solar cells based on poly(3-hexylthiophene) (P3HT):(6,6)-phenyl C61 butyric acid methyl ester (PC61BM) are fabricated with numerous combinations of thicknesses and blend ratios. Results obtained from this technique show that the optimum ratio of materials is consistent with previously reported values, confirming that this technique is a very useful and effective screening method. This high-throughput screening method is also used in a single-feed configuration. In the single-feed mode, methylammonium iodide solution is deposited on lead iodide films to create the photoactive layer of perovskite solar cells. Devices featuring a perovskite layer fabricated by this spray process demonstrated power conversion efficiencies of up to 7.9%. PMID:26853266

  2. Multi-objective optimum design of fast tool servo based on improved differential evolution algorithm

    International Nuclear Information System (INIS)

    Zhu, Zhiwei; Zhou, Xiaoqin; Liu, Qiang; Zhao, Shaoxin

    2011-01-01

    The flexure-based mechanism is a promising realization of a fast tool servo (FTS), and the optimum determination of flexure hinge parameters is one of the most important elements in FTS design. This paper presents a multi-objective optimization approach to optimizing the dimension and position parameters of the flexure-based mechanism, based on an improved differential evolution algorithm embedding chaos and a nonlinear simulated annealing algorithm. The results of the optimum design show that the proposed algorithm has excellent performance, and a well-balanced compromise is made between two conflicting objectives, the stroke and the natural frequency of the FTS mechanism. Validation tests based on finite element analysis (FEA) show good agreement with the results obtained by the proposed theoretical algorithm of this paper. Finally, a series of experimental tests were conducted to validate the design process and assess the performance of the FTS mechanism. The designed FTS achieves a stroke of up to 10.25 μm with a bandwidth of at least 2 kHz. Both the FEA and the experimental results demonstrate that the parameters of the flexure-based mechanism determined by the proposed approach achieve the specified performance, and that the approach is suitable for the optimum design of FTS mechanisms.
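
    For readers unfamiliar with the base algorithm, a minimal sketch of the classic DE/rand/1/bin scheme follows (single-objective, and without the paper's chaos embedding or simulated-annealing hybridization; the objective and bounds are placeholders):

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, iters=200, seed=0):
    """Classic DE/rand/1/bin minimizer. bounds: one (low, high) pair per dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, lo.size))
    cost = np.array([f(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # differential mutation
            cross = rng.random(lo.size) < CR            # binomial crossover mask
            cross[rng.integers(lo.size)] = True         # guarantee one mutant gene
            trial = np.where(cross, mutant, pop[i])
            if (fc := f(trial)) < cost[i]:              # greedy one-to-one selection
                pop[i], cost[i] = trial, fc
    return pop[cost.argmin()], cost.min()

# toy objective standing in for the stroke/natural-frequency trade-off
best_x, best_f = differential_evolution(lambda x: (x**2).sum(), [(-5, 5)] * 4)
print(best_x, best_f)
```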

  3. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
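
    As a toy illustration of the core differential-testing step only (not the full workflow, which in practice uses dedicated packages such as limma, DESeq2, or edgeR), assume a log-scale expression matrix and two sample groups:

```python
import numpy as np
from scipy import stats

def differential_features(expr, group_a, group_b, fdr=0.05):
    """Per-feature two-sample t-tests with Benjamini-Hochberg correction.

    expr: features x samples array of log-scale values;
    group_a / group_b: column indices of the two sample groups.
    """
    _, p = stats.ttest_ind(expr[:, group_a], expr[:, group_b], axis=1)
    order = np.argsort(p)
    ranked = p[order] * p.size / np.arange(1, p.size + 1)      # BH adjustment
    q_sorted = np.minimum.accumulate(ranked[::-1])[::-1]       # enforce monotonicity
    q = np.empty_like(q_sorted)
    q[order] = q_sorted
    return np.flatnonzero(q < fdr), q

expr = np.random.default_rng(1).normal(size=(1000, 12))
expr[:20, 6:] += 2.0                                           # spike 20 "regulated" features
hits, _ = differential_features(expr, list(range(6)), list(range(6, 12)))
print(len(hits), "features pass FDR")
```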

  4. Differentially pumped spray deposition as a rapid screening tool for organic and perovskite solar cells.

    Science.gov (United States)

    Jung, Yen-Sook; Hwang, Kyeongil; Scholes, Fiona H; Watkins, Scott E; Kim, Dong-Yu; Vak, Doojin

    2016-02-08

    We report a spray deposition technique as a screening tool for solution processed solar cells. A dual-feed spray nozzle is introduced to deposit donor and acceptor materials separately and to form blended films on substrates in situ. Using a differential pump system with a motorised spray nozzle, the effect of film thickness, solution flow rates and the blend ratio of donor and acceptor materials on device performance can be found in a single experiment. Using this method, polymer solar cells based on poly(3-hexylthiophene) (P3HT):(6,6)-phenyl C61 butyric acid methyl ester (PC61BM) are fabricated with numerous combinations of thicknesses and blend ratios. Results obtained from this technique show that the optimum ratio of materials is consistent with previously reported values, confirming this technique is a very useful and effective screening method. This high throughput screening method is also used in a single-feed configuration. In the single-feed mode, methylammonium iodide solution is deposited on lead iodide films to create a photoactive layer of perovskite solar cells. Devices featuring a perovskite layer fabricated by this spray process demonstrated power conversion efficiencies of up to 7.9%.

  5. Revisiting the dose-effect correlations in irradiated head and neck cancer using automatic segmentation tools of the dental structures, mandible and maxilla

    International Nuclear Information System (INIS)

    Thariat, J.; Ramus, L.; Odin, G.; Vincent, S.; Orlanducci, M.H.; Dassonville, O.; Darcourt, V.; Lacout, A.; Marcy, P.Y.; Cagnol, G.; Malandain, G.

    2011-01-01

    Purpose. - Manual delineation of dental structures is too time-consuming to be feasible in routine practice. Information on dose risk levels is crucial for dentists treating patients after head and neck irradiation, in order to avoid post-extraction osteoradionecrosis; empirical dose-effect data were established on two-dimensional radiation therapy plans. Material and methods. - We present an automatic atlas-based segmentation framework for the dental structures, called Dentalmaps, constructed from a patient image-segmentation database. Results. - The framework is accurate (within 2 Gy) and relevant for routine use. It has the potential to guide dental care in the context of new irradiation techniques. Conclusion. - This tool provides a user-friendly interface for dentists and radiation oncologists in the context of irradiated head and neck cancer patients. It will likely improve the knowledge of dose-effect correlations for dental complications and osteoradionecrosis. (authors)

  6. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    Energy Technology Data Exchange (ETDEWEB)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.; Overall, Christopher C.; Lee, Joon-Yong; Zucker, Jeremy D.; Glaesemann, Kurt R.; Jansson, Georg C.; Jansson, Janet K.

    2017-03-02

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomics and metatranscriptomics data, and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using a modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
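
    The contig-level taxonomy rollup can be pictured as a majority vote over ORF-level lineage assignments. A schematic sketch of that idea (names and thresholds are illustrative, not the ATLAS implementation):

```python
from collections import Counter

def contig_taxonomy(orf_taxa, min_fraction=0.5):
    """Majority-vote, LCA-style rollup of ORF lineages to a contig lineage.

    orf_taxa: list of lineage tuples, e.g. ('Bacteria', 'Proteobacteria', ...).
    Descends rank by rank while a single taxon is supported by more than
    min_fraction of all ORFs on the contig.
    """
    total = len(orf_taxa)
    lineage, candidates, depth = [], orf_taxa, 0
    while True:
        votes = Counter(t[depth] for t in candidates if len(t) > depth)
        if not votes:
            break
        taxon, n = votes.most_common(1)[0]
        if n / total <= min_fraction:
            break
        lineage.append(taxon)
        candidates = [t for t in candidates if len(t) > depth and t[depth] == taxon]
        depth += 1
    return tuple(lineage)

orfs = [('Bacteria', 'Firmicutes'), ('Bacteria', 'Firmicutes'),
        ('Bacteria', 'Proteobacteria'), ('Archaea',)]
print(contig_taxonomy(orfs))   # ('Bacteria',): Firmicutes holds only 2/4 votes
```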

  7. Survey on the differentiation of consumption in various types of automatic wood burners; Erhebung Verbrauchssplitting bei automatischen Holzfeuerungen

    Energy Technology Data Exchange (ETDEWEB)

    Primas, A.; Kistler, M.; Kessler, F.

    2006-07-01

    This final report published by the Swiss Federal Office of Energy (SFOE) takes a look at how wood consumption statistics can be differentiated to take the various types of wood-fired heating systems into consideration. The approach used, which involved taking 1200 random samples from a total of 5200 installations, is described. Figures are presented on the return rates achieved. The questionnaires returned were sorted according to the type of installation, such as industrial/commercial, farming, services and household. As a result of the high return rate, the accuracy of the estimates based on the data is also considered to be high. The paper describes how the survey was made and how the results were obtained from the data collected. Details on operation, types of fuel, specific consumption and factors influencing operation are presented in graphical form. An appendix presents the data collected in tabular form.

  8. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables import of coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell el-Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the points’ names coding) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of points’ names coding or more complex attribute table creation).

  9. Validation of the ICU-DaMa tool for automatically extracting variables for minimum dataset and quality indicators: The importance of data quality assessment.

    Science.gov (United States)

    Sirgo, Gonzalo; Esteban, Federico; Gómez, Josep; Moreno, Gerard; Rodríguez, Alejandro; Blanch, Lluis; Guardiola, Juan José; Gracia, Rafael; De Haro, Lluis; Bodí, María

    2018-04-01

    Big data analytics promise insights into healthcare processes and management, improving outcomes while reducing costs. However, data quality is a major challenge for reliable results. Business process discovery techniques and an associated data model were used to develop a data management tool, ICU-DaMa, for extracting variables essential for overseeing the quality of care in the intensive care unit (ICU). To determine the feasibility of using ICU-DaMa to automatically extract variables for the minimum dataset and ICU quality indicators from the clinical information system (CIS). The Wilcoxon signed-rank test and Fisher's exact test were used to compare the values extracted from the CIS with ICU-DaMa for 25 variables from all patients attended in a polyvalent ICU during a two-month period against the gold standard of values manually extracted by two trained physicians. Discrepancies with the gold standard were classified into plausibility, conformance, and completeness errors. Data from 149 patients were included. Although there were no significant differences between the automatic method and the manual method, we detected differences in values for five variables, including one plausibility error and two conformance and completeness errors. Plausibility: 1) Sex: ICU-DaMa incorrectly classified one male patient as female (error generated by the Hospital's Admissions Department). Conformance: 2) Reason for isolation: ICU-DaMa failed to detect a human error in which a professional misclassified a patient's isolation. 3) Brain death: ICU-DaMa failed to detect another human error in which a professional likely entered two mutually exclusive values related to the death of the patient (brain death and controlled donation after circulatory death). Completeness: 4) Destination at ICU discharge: ICU-DaMa incorrectly classified two patients due to a professional failing to fill out the patient discharge form when the patients died. 5) Length of continuous renal replacement

  10. Molecular polymorphism as a tool for differentiating ground beetles (Carabus species): application of ubiquitin PCR/SSCP analysis.

    Science.gov (United States)

    Boge, A; Gerstmeier, R; Einspanier, R

    1994-11-01

    Differentiation between Carabus species (ground beetles) and subspecies is difficult, although there have been extensive studies. To address this problem we have applied PCR in combination with SSCP analysis, focusing on the evolutionarily conserved ubiquitin gene, to elaborate a new approach to molecular differentiation between species. We report that Carabidae possess a ubiquitin gene and that this gene has a multimeric structure. Differential SSCP analysis was performed with the monomeric form of the gene to generate a clear SSCP pattern. Such PCR/SSCP resulted in reproducible patterns throughout our experiments. Comparing different Carabus species (Carabus granulatus, C. irregularis, C. violaceus and C. auronitens) we could observe clear interspecies differences but no differences between genders. Some species showed remarkable differences between individuals. We suggest that the ubiquitin PCR-SSCP technique might be an additional tool for the differentiation of ground beetles.

  11. Retrieval interval mapping, a tool to optimize the spectral retrieval range in differential optical absorption spectroscopy

    Science.gov (United States)

    Vogel, L.; Sihler, H.; Lampel, J.; Wagner, T.; Platt, U.

    2012-06-01

    Remote sensing via differential optical absorption spectroscopy (DOAS) has become a standard technique to identify and quantify trace gases in the atmosphere. The technique is applied in a variety of configurations, commonly classified into active and passive instruments using artificial and natural light sources, respectively. Platforms range from ground-based to satellite instruments, and trace gases are studied in all kinds of different environments. Due to the wide range of measurement conditions, atmospheric compositions and instruments used, a specific challenge of a DOAS retrieval is to optimize the parameters for each specific case and particular trace gas of interest. This becomes especially important when measuring close to the detection limit. A well-chosen evaluation wavelength range is crucial to the DOAS technique. It should encompass strong absorption bands of the trace gas of interest in order to maximize the sensitivity of the retrieval, while at the same time minimizing absorption structures of other trace gases and thus potential interferences. Instrumental limitations and wavelength-dependent sources of error (e.g. insufficient corrections for the Ring effect and cross correlations between trace gas cross sections) also need to be taken into account. Most often, not all of these requirements can be fulfilled simultaneously, and a compromise needs to be found depending on the conditions at hand. Although for many trace gases the overall dependence of a common DOAS retrieval on the evaluation wavelength interval is known, a systematic approach to finding the optimal retrieval wavelength range and assessing it qualitatively has been missing. Here we present a novel tool to determine the optimal evaluation wavelength range. It is based on mapping retrieved values in the retrieval wavelength space, thus visualizing the consequences of different choices of retrieval spectral ranges, e.g. caused by slightly erroneous absorption cross sections, cross correlations and
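
    The mapping idea can be sketched as a brute-force scan over candidate fit windows: for every (lower, upper) wavelength pair, run the DOAS least-squares fit and record the retrieved column, producing a 2-D map whose structure exposes problematic window choices. A schematic single-absorber sketch (names and the synthetic data are illustrative, not the authors' code):

```python
import numpy as np

def doas_fit(od, cross_sections):
    """Least-squares DOAS fit: optical depth = sum_i c_i * sigma_i + closure polynomial."""
    poly = np.vander(np.linspace(-1, 1, od.size), 3)   # low-order broadband polynomial
    design = np.column_stack([cross_sections.T, poly])
    coef, *_ = np.linalg.lstsq(design, od, rcond=None)
    return coef[:cross_sections.shape[0]]              # retrieved columns

def retrieval_interval_map(wl, od, xs, min_width=40):
    """Map the retrieved column of absorber 0 over all evaluation windows."""
    n = wl.size
    result = np.full((n, n), np.nan)
    for lo in range(n - min_width):
        for hi in range(lo + min_width, n):
            result[lo, hi] = doas_fit(od[lo:hi], xs[:, lo:hi])[0]
    return result   # rows: window start index, columns: window end index

# synthetic example: one structured absorber plus broadband background and noise
wl = np.linspace(300, 330, 120)
sigma = np.exp(-((wl - 315) / 3) ** 2) * np.cos(2 * np.pi * wl)
od = 2.0 * sigma + 0.01 * (wl - 300) + np.random.normal(0, 0.05, wl.size)
m = retrieval_interval_map(wl, od, sigma[None, :])
```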

  12. GAPscreener: An automatic tool for screening human genetic association literature in PubMed using the support vector machine technique

    Directory of Open Access Journals (Sweden)

    Khoury Muin J

    2008-04-01

    Full Text Available Abstract Background Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM, a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. Results The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. Conclusion GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge.
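
    A present-day equivalent of the screening step is easy to sketch with scikit-learn (GAPscreener itself predates this stack; the pipeline below is illustrative of SVM-based abstract classification, not the tool's actual implementation or feature weighting):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# tiny stand-in corpus; in practice: many thousands of labeled PubMed abstracts
abstracts = [
    "polymorphism rs123 associated with increased risk of preterm birth",
    "association of APOE genotype with Alzheimer disease in a cohort study",
    "surgical technique for repair of rotator cuff tears",
    "randomized trial of antibiotic duration in pneumonia",
]
labels = [1, 1, 0, 0]   # 1 = human genetic association study

screener = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1), LinearSVC())
screener.fit(abstracts, labels)
print(screener.predict(["variant in IL6 gene linked to susceptibility to sepsis"]))
```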

  13. Advanced Differential Radar Interferometry (A-DInSAR) as integrative tool for a structural geological analysis

    Science.gov (United States)

    Crippa, B.; Calcagni, L.; Rossi, G.; Sternai, P.

    2009-04-01

    Advanced Differential SAR interferometry (A-DInSAR) is a technique for monitoring large-coverage surface deformations using a stack of interferograms generated from several complex SLC SAR images acquired over the same target area at different times. In this work we describe the results of a procedure to calculate terrain motion velocity on highly correlated pixels (E. Biescas, M. Crosetto, M. Agudo, O. Monserrat and B. Crippa: Two Radar Interferometric Approaches to Monitor Slow and Fast Land Deformation, 2007) in two areas, Gemona (Friuli, Northern Italy) and Pollino (Calabria, Southern Italy), and present some considerations based on successful examples of the present analysis. The choice of the pixels whose displacement velocity is calculated depends on the dispersion index value (DA) or on coherence values along the stack of interferograms. The A-DInSAR technique yields highly reliable velocity values of the vertical displacement. These values concern the movement of minimum surfaces of about 80 m2 at the maximum resolution, and the minimum velocity that can be recognized is of the order of mm/y. Because of the high versatility of the technology, the large dimensions of the area that can be analyzed (about 10 000 km2), and the high precision and reliability of the results obtained, we think it is possible to exploit radar interferometry to obtain important information about the structural context of the studied area that is otherwise very difficult to recognize. We therefore propose radar interferometry as a valid investigation tool whose results should be considered an important complement to the data collected in fieldwork.

  14. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses through the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. The amount and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to supporting analysis of CSR recombination junctions sequenced with an HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.
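
    One structural feature such a report derives, junction microhomology, reduces to a small computation once the breakpoints on the two S-region references are known: count how many bases around the joint are compatible with both references. A simplified sketch (names are illustrative, and the upstream alignment step that finds the breakpoints is assumed already done):

```python
def microhomology(up_ref, up_break, down_ref, down_break):
    """Length of sequence at the joint attributable to BOTH S regions.

    The junction reads up_ref[:up_break] + down_ref[down_break:]. A base just
    left of the joint is ambiguous if down_ref carries it too (right before its
    breakpoint); a base just right is ambiguous if up_ref continues identically.
    0 = blunt junction; >0 = microhomology-mediated joint.
    """
    left = 0
    while (left < min(up_break, down_break)
           and up_ref[up_break - left - 1] == down_ref[down_break - left - 1]):
        left += 1
    right = 0
    while (up_break + right < len(up_ref) and down_break + right < len(down_ref)
           and up_ref[up_break + right] == down_ref[down_break + right]):
        right += 1
    return left + right

# toy example: the references share 'GAGT' around the breakpoints -> prints 4
print(microhomology("ACGTGAGTT", 7, "CCGAGTACG", 5))
```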

  15. The MIMIC Model as a Tool for Differential Bundle Functioning Detection

    Science.gov (United States)

    Finch, W. Holmes

    2012-01-01

    Increasingly, researchers interested in identifying potentially biased test items are encouraged to use a confirmatory, rather than exploratory, approach. One such method for confirmatory testing is rooted in differential bundle functioning (DBF), where hypotheses regarding potential differential item functioning (DIF) for sets of items (bundles)…

  16. Automatization of the special library as a tool of the provision of the quality services for the readers

    International Nuclear Information System (INIS)

    Zendulkova, D.

    2004-01-01

    The presentation is concerned with the basic principles of automating library activities. It deals with the problem of selecting the activities to be automated, with regard to the character of the library services delivered as well as to user requirements. It analyzes the current local situation in the field of library software on offer. It also shows that, when identifying the requirements on a library system, there are many criteria that should be taken into account. The presentation briefly characterizes the latest trends in the field of library cooperation, the interchange formats currently used in data processing, and some new legislative documents related to the processing of library collections, all of which shape the properties of library software. It then analyzes the applications that are typical for a smaller library and are administered by the database and retrieval system WinISIS, including, for example, cataloguing of books and periodicals and a circulation (borrowing) system. It deals with the available ways of exposing the library databases produced by this system on the Internet, as well as with the possibilities of a hypertext interface between library databases and online accessible external information sources. The conclusion lists the services that the Centre of Scientific and Technical Information offers to users and to those interested in software tools for library automation. (author)

  18. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  19. PatternLab for proteomics: a tool for differential shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Yates John R

    2008-07-01

    Full Text Available Abstract Background A goal of proteomics is to distinguish between states of a biological system by identifying protein expression differences. Liu et al. demonstrated a method to perform semi-relative protein quantitation in shotgun proteomics data by correlating the number of tandem mass spectra obtained for each protein, or "spectral count", with its abundance in a mixture; however, two issues have remained open: how to normalize spectral counting data and how to efficiently pinpoint differences between profiles. Moreover, Chen et al. recently showed how to increase the number of identified proteins in shotgun proteomics by analyzing samples with different MS-compatible detergents while performing proteolytic digestion. The latter introduced new challenges as seen from the data analysis perspective, since replicate readings are not acquired. Results To address the open issues above, we present a program termed PatternLab for proteomics. This program implements existing strategies and adds two new methods to pinpoint differences in protein profiles. The first method, ACFold, addresses experiments with less than three replicates from each state or having assays acquired by different protocols as described by Chen et al. ACFold uses a combined criterion based on expression fold changes, the AC test, and the false-discovery rate, and can supply a "bird's-eye view" of differentially expressed proteins. The other method addresses experimental designs having multiple readings from each state and is referred to as nSVM (natural support vector machine) because of its roots in evolutionary computing and in statistical learning theory. Our observations suggest that nSVM's niche comprises projects that select a minimum set of proteins for classification purposes; for example, the development of an early detection kit for a given pathology. We demonstrate the effectiveness of each method on experimental data and confront them with existing strategies.
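
    The spectral-counting arithmetic at the heart of such comparisons is small. A sketch using NSAF-style length normalization and a plain log2 fold change (illustrative only; PatternLab's ACFold additionally combines fold change with its AC test and false-discovery-rate control):

```python
import numpy as np

def nsaf(counts, lengths):
    """Normalized spectral abundance factor per protein for one run."""
    saf = counts / lengths          # spectral counts scaled by protein length
    return saf / saf.sum()          # normalize so the run sums to 1

# spectral counts for 4 proteins in two states (one run each) and their lengths
state_a = np.array([120.0, 30.0, 8.0, 45.0])
state_b = np.array([60.0, 35.0, 30.0, 50.0])
lengths = np.array([500.0, 300.0, 250.0, 400.0])

pseudo = 0.5                        # pseudocount to avoid log of zero counts
fold = np.log2(nsaf(state_b + pseudo, lengths) / nsaf(state_a + pseudo, lengths))
for i, f in enumerate(fold):
    print(f"protein {i}: log2 fold change {f:+.2f}")
```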

  20. Neural differentiation of mouse embryonic stem cells as a tool to assess developmental neurotoxicity in vitro.

    Science.gov (United States)

    Visan, Anke; Hayess, Katrin; Sittner, Dana; Pohl, Elena E; Riebeling, Christian; Slawik, Birgitta; Gulich, Konrad; Oelgeschläger, Michael; Luch, Andreas; Seiler, Andrea E M

    2012-10-01

    Mouse embryonic stem cells (mESCs) represent an attractive cellular system for in vitro studies in developmental biology as well as toxicology because of their potential to differentiate into all fetal cell lineages. The present study aims to establish an in vitro system for developmental neurotoxicity testing employing mESCs. We developed a robust and reproducible protocol for fast and efficient differentiation of the mESC line D3 into neural cells, optimized with regard to chemical testing. Morphological examination and immunocytochemical staining confirmed the presence of different neural cell types, including neural progenitors, neurons, astrocytes, oligodendrocytes, and radial glial cells. Neurons derived from D3 cells expressed the synaptic proteins PSD95 and synaptophysin, and the neurotransmitters serotonin and γ-aminobutyric acid. Calcium ion imaging revealed the presence of functionally active glutamate and dopamine receptors. In addition, flow cytometry analysis of the neuron-specific marker protein MAP2 on day 12 after induction of differentiation demonstrated a concentration dependent effect of the neurodevelopmental toxicants methylmercury chloride, chlorpyrifos, and lead acetate on neuronal differentiation. The current study shows that D3 mESCs differentiate efficiently into neural cells involving a neurosphere-like state and that this system is suitable to detect adverse effects of neurodevelopmental toxicants. Therefore, we propose that the protocol for differentiation of mESCs into neural cells described here could constitute one component of an in vitro testing strategy for developmental neurotoxicity. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Automatic flow-through dynamic extraction: A fast tool to evaluate char-based remediation of multi-element contaminated mine soils.

    Science.gov (United States)

    Rosende, María; Beesley, Luke; Moreno-Jimenez, Eduardo; Miró, Manuel

    2016-02-01

    An automatic in-vitro bioaccessibility test based upon dynamic microcolumn extraction in a programmable flow setup is herein proposed as a screening tool to evaluate biochar-based remediation of mine soils contaminated with trace elements, as a compelling alternative to conventional phyto-availability tests. The feasibility of the proposed system was evaluated by extracting the readily bioaccessible pools of As, Pb and Zn in two contaminated mine soils before and after the addition of two biochars (9% (w:w)) of diverse source origin (pine and olive). Bioaccessible fractions under worst-case scenarios were measured using 0.001 mol L-1 CaCl2 as extractant for mimicking plant uptake, followed by analysis of the extracts by inductively coupled plasma optical emission spectrometry. A t-test comparing means revealed efficient metal (mostly Pb and Zn) immobilization by the action of the olive pruning-based biochar relative to the bare (control) soil at the 0.05 significance level. In-vitro flow-through bioaccessibility tests are compared for the first time with in-vivo phyto-toxicity assays in a microcosm soil study. By assessing seed germination and shoot elongation of Lolium perenne in contaminated soils with and without biochar amendments, the dynamic flow-based bioaccessibility data proved to be in good agreement with the phyto-availability tests. Experimental results indicate that the dynamic extraction method is a viable and economical in-vitro tool in risk assessment explorations to evaluate the feasibility of a given biochar amendment for revegetation and remediation of metal-contaminated soils in a mere 10 min, versus 4 days in the case of phyto-toxicity assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. ODMSummary: A Tool for Automatic Structured Comparison of Multiple Medical Forms Based on Semantic Annotation with the Unified Medical Language System.

    Science.gov (United States)

    Storck, Michael; Krumm, Rainer; Dugas, Martin

    2016-01-01

    Medical documentation is applied in various settings including patient care and clinical research. Since procedures of medical documentation are heterogeneous and continually being developed further, secondary use of medical data is complicated. Development of medical forms, merging of data from different sources and meta-analyses of different data sets are currently predominantly manual processes and therefore difficult and cumbersome. Available applications to automate these processes are limited. In particular, tools to compare multiple documentation forms are missing. The objective of this work is to design, implement and evaluate the new system ODMSummary for comparison of multiple forms with a high number of semantically annotated data elements and a high level of usability. System requirements include the capability to summarize and compare a set of forms, to estimate the documentation effort, to track changes in different versions of forms and to find comparable items in different forms. Forms are provided in Operational Data Model format with semantic annotations from the Unified Medical Language System. 12 medical experts were invited to participate in a 3-phase evaluation of the tool regarding usability. ODMSummary (available at https://odmtoolbox.uni-muenster.de/summary/summary.html) provides a structured overview of multiple forms and their documentation fields. This comparison enables medical experts to assess multiple forms or whole datasets for secondary use. System usability was optimized based on expert feedback. The evaluation demonstrates that feedback from domain experts is needed to identify usability issues. In conclusion, this work shows that automatic comparison of multiple forms is feasible and the results are usable for medical experts.
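
    Finding comparable items across forms is, at its core, a set comparison over semantic codes. A minimal sketch, assuming each item has already been annotated with UMLS concept identifiers (the CUIs and item names below are illustrative placeholders, not data from the paper):

```python
def comparable_items(form_a, form_b):
    """Pair items of two forms whose UMLS annotations overlap.

    form_a / form_b: dict mapping item name -> set of UMLS CUIs.
    Returns (item_a, item_b, Jaccard similarity) tuples, best matches first.
    """
    matches = []
    for name_a, cuis_a in form_a.items():
        for name_b, cuis_b in form_b.items():
            overlap = cuis_a & cuis_b
            if overlap:
                jaccard = len(overlap) / len(cuis_a | cuis_b)
                matches.append((name_a, name_b, jaccard))
    return sorted(matches, key=lambda m: -m[2])

# illustrative CUIs only
registry_form = {"Body weight": {"C0005910"}, "Systolic BP": {"C0871470"}}
trial_form = {"Weight at admission": {"C0005910"}, "Heart rate": {"C0018810"}}
print(comparable_items(registry_form, trial_form))
```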

  3. KeyGenes, a Tool to Probe Tissue Differentiation Using a Human Fetal Transcriptional Atlas

    NARCIS (Netherlands)

    Roost, Matthias S; van Iperen, Liesbeth; Ariyurek, Yavuz; Buermans, Henk P; Arindrarto, Wibowo; Devalla, Harsha D; Passier, Robert; Mummery, Christine L; Carlotti, Françoise; de Koning, Eelco J P; van Zwet, Erik W; Goeman, Jelle J; Chuva de Sousa Lopes, Susana M

    2015-01-01

    Differentiated derivatives of human pluripotent stem cells in culture are generally phenotypically immature compared to their adult counterparts. Their identity is often difficult to determine with certainty because little is known about their human fetal equivalents in vivo. Cellular identity and

  4. Ultraprecise parabolic interpolator for numerically controlled machine tools. [Digital differential analyzer circuit

    Energy Technology Data Exchange (ETDEWEB)

    Davenport, C. M.

    1977-02-01

    The mathematical basis for an ultraprecise digital differential analyzer circuit for use as a parabolic interpolator on numerically controlled machines has been established, and scaling and other error-reduction techniques have been developed. An exact computer model is included, along with typical results showing tracking to within an accuracy of one part per million.
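
    The principle behind a digital differential analyzer for a parabola is that the curve's second differences are constant, so every output point costs only additions. A minimal floating-point sketch of that idea (the actual circuit works with scaled integer registers, which is what the report's scaling techniques address):

```python
def parabola_dda(a, dx, n):
    """Generate points of y = a*x^2 by accumulating differences only.

    First difference: dy_k = y((k+1)*dx) - y(k*dx) = a*(2k+1)*dx^2,
    which itself grows by the constant second difference 2*a*dx^2.
    """
    points = [(0.0, 0.0)]
    x = y = 0.0
    dy = a * dx * dx            # first difference at k = 0
    d2y = 2 * a * dx * dx       # constant second difference
    for _ in range(n):
        x += dx
        y += dy                 # one addition per axis step...
        dy += d2y               # ...plus one to update the difference
        points.append((x, y))
    return points

for x, y in parabola_dda(a=0.5, dx=0.25, n=4):
    print(f"x={x:.2f}  y={y:.4f}  exact={0.5 * x * x:.4f}")   # matches exactly
```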

  5. Delay differential equations via the matrix Lambert W function and bifurcation analysis: application to machine tool chatter.

    Science.gov (United States)

    Yi, Sun; Nelson, Patrick W; Ulsoy, A Galip

    2007-04-01

    In a turning process modeled using delay differential equations (DDEs), we investigate the stability of the regenerative machine tool chatter problem. An approach using the matrix Lambert W function for the analytical solution to systems of delay differential equations is applied to this problem and compared with the result obtained using a bifurcation analysis. The Lambert W function, known to be useful for solving scalar first-order DDEs, has recently been extended to a matrix Lambert W function approach to solve systems of DDEs. The essential advantages of the matrix Lambert W approach are not only the similarity to the concept of the state transition matrix in linear ordinary differential equations, enabling its use for general classes of linear delay differential equations, but also the observation that we need only the principal branch among an infinite number of roots to determine the stability of a system of DDEs. The bifurcation method combined with Sturm sequences provides an algorithm for determining the stability of DDEs without restrictive geometric analysis. With this approach, one can obtain the critical values of delay, which determine the stability of a system and hence the preferred operating spindle speed without chatter. We apply both the matrix Lambert W function and the bifurcation analysis approach to the problem of chatter stability in turning, and compare the results obtained to existing methods. The two new approaches show excellent accuracy and certain other advantages, when compared to traditional graphical, computational and approximate methods.
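
    For the scalar first-order case the recipe is compact: the characteristic roots of x'(t) = a x(t) + b x(t - tau) are s_k = a + W_k(b tau e^(-a tau))/tau, and the principal branch k = 0 is known to give the rightmost root, hence stability. A toy SciPy sketch of this scalar stand-in for the matrix case treated in the paper:

```python
import numpy as np
from scipy.special import lambertw

def rightmost_root(a, b, tau):
    """Rightmost characteristic root of x'(t) = a*x(t) + b*x(t - tau).

    From s = a + b*exp(-s*tau): (s - a)*tau*exp((s - a)*tau) = b*tau*exp(-a*tau),
    so s_k = a + W_k(b*tau*exp(-a*tau))/tau; the k = 0 branch is rightmost.
    """
    return a + lambertw(b * tau * np.exp(-a * tau), k=0) / tau

s0 = rightmost_root(a=-1.0, b=0.5, tau=1.0)
print(s0, "stable" if s0.real < 0 else "unstable")
```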

  6. Differentiation/Purification Protocol for Retinal Pigment Epithelium from Mouse Induced Pluripotent Stem Cells as a Research Tool.

    Directory of Open Access Journals (Sweden)

    Yuko Iwasaki

    Full Text Available To establish a novel protocol for differentiation of retinal pigment epithelium (RPE) with high purity from mouse induced pluripotent stem cells (iPSC). Retinal progenitor cells were differentiated from mouse iPSC, and RPE differentiation was then enhanced by activation of the Wnt signaling pathway, inhibition of the fibroblast growth factor signaling pathway, and inhibition of the Rho-associated, coiled-coil containing protein kinase signaling pathway. Expanded pigmented cells were purified by plate adhesion after Accutase® treatment. Enriched cells were cultured until they developed a cobblestone appearance with cuboidal shape. The characteristics of iPS-RPE were confirmed by gene expression, immunocytochemistry, and electron microscopy. Functions and immunologic features of the iPS-RPE were also evaluated. We obtained iPS-RPE at high purity (approximately 98%). The iPS-RPE showed apical-basal polarity and cellular structure characteristic of RPE. Expression levels of several RPE markers were lower than those of freshly isolated mouse RPE but comparable to those of primary cultured RPE. The iPS-RPE could form tight junctions, phagocytose photoreceptor outer segments, express immune antigens, and suppress lymphocyte proliferation. We successfully developed a differentiation/purification protocol to obtain mouse iPS-RPE. The mouse iPS-RPE can serve as an attractive tool for functional and morphological studies of RPE.

  7. Fuchsia. A tool for reducing differential equations for Feynman master integrals to epsilon form

    International Nuclear Information System (INIS)

    Gituliar, Oleksandr; Magerya, Vitaly

    2017-01-01

    We present Fuchsia - an implementation of the Lee algorithm, which for a given system of ordinary differential equations with rational coefficients ∂_x f(x,ε) = A(x,ε) f(x,ε) finds a basis transformation T(x,ε), i.e., f(x,ε) = T(x,ε) g(x,ε), such that the system turns into the epsilon form: ∂_x g(x,ε) = ε S(x) g(x,ε), where S(x) is a Fuchsian matrix. A system of this form can be trivially solved in terms of polylogarithms as a Laurent series in the dimensional regulator ε. That makes the construction of the transformation T(x,ε) crucial for obtaining solutions of the initial system. In principle, Fuchsia can deal with any regular system; however, its primary task is to reduce differential equations for Feynman master integrals. It ensures that solutions contain only regular singularities due to the properties of Feynman integrals.
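
    Once a system is in epsilon form, the series solution is mechanical: writing g = Σ_n ε^n g_n gives g_n' = S(x) g_(n-1), solved by iterated integration (which is where the polylogarithms arise). A small sympy sketch of this last step with a toy Fuchsian matrix; it illustrates why the form is convenient, and is not Fuchsia's reduction algorithm itself:

```python
import sympy as sp

x, eps = sp.symbols('x epsilon')

def epsilon_form_series(S, g0, x0, order):
    """Solve d/dx g = eps*S(x)*g as a series g = sum_n eps^n * g_n.

    Each order obeys g_n' = S*g_(n-1) with g_n(x0) = 0 for n >= 1,
    so g_n is an iterated integral of the previous order.
    """
    t = sp.Dummy('t')
    def integrate_from_x0(f):
        return sp.integrate(f.subs(x, t), (t, x0, x))
    terms = [sp.Matrix(g0)]                  # g_0: the boundary constants
    for _ in range(order):
        terms.append((S * terms[-1]).applyfunc(integrate_from_x0))
    return sum((eps**n * g for n, g in enumerate(terms)), sp.zeros(len(g0), 1))

# toy 2x2 Fuchsian matrix with singular points at x = 0 and x = 1
S = sp.Matrix([[1 / x, 0], [1 / (x - 1), -1 / x]])
print(epsilon_form_series(S, [1, 0], sp.Rational(1, 2), order=2))
```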

  8. The relevance of clinical balance assessment tools to differentiate balance deficits

    OpenAIRE

    Mancini, Martina; Horak, Fay B

    2010-01-01

    Control of balance is complex and involves maintaining postures, facilitating movement, and recovering equilibrium. Balance control consists of controlling the body center of mass over its limits of stability. Clinical balance assessment can help assess fall risk and/or determine the underlying reasons for balance disorders. Most functional balance assessment scales assess fall risk and the need for balance rehabilitation but do not differentiate types of balance deficits. A system approach t...

  9. Some operational tools for solving fractional and higher integer order differential equations: A survey on their mutual relations

    Science.gov (United States)

    Kiryakova, Virginia S.

    2012-11-01

    The Laplace Transform (LT) serves as a basis of the Operational Calculus (OC), widely explored by engineers and applied scientists in solving mathematical models for their practical needs. This transform is closely related to the exponential and trigonometric functions (exp, cos, sin) and to the classical differentiation and integration operators, reducing them to simple algebraic operations. Thus, the classical LT and the OC give a useful tool for handling differential equations and systems with constant coefficients. Several generalizations of the LT have been introduced to allow solving, in a similar way, differential equations with variable coefficients and of higher integer orders, as well as of fractional (arbitrary non-integer) orders. Fractional-order mathematical models are now widely used to better describe various systems and phenomena of the real world. This paper surveys briefly some of our results on classes of such integral transforms, which can be obtained from the LT by means of "transmutations", i.e. operators of the generalized fractional calculus (GFC). On the list of these Laplace-type integral transforms, we consider the Borel-Dzrbashjan, Meijer, Krätzel, Obrechkoff, and generalized Obrechkoff (multi-index Borel-Dzrbashjan) transforms, etc. All of them are G- and H-integral transforms of convolutional type, having as kernels Meijer's G- or Fox's H-functions. Besides, some special functions (also being G- and H-functions), among them the generalized Bessel-type and Mittag-Leffler (M-L) type functions, generate Gel'fond-Leontiev (G-L) operators of generalized differentiation and integration, which also happen to be operators of the GFC. Our integral transforms have operational properties analogous to those of the LT - they algebrize the G-L generalized integrations and differentiations, and thus can serve for solving wide classes of differential equations with variable coefficients of arbitrary, including non-integer, order.
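
    As a one-line reminder of how the LT "algebrizes" differentiation, consider the simplest initial value problem (a standard textbook example, added here for illustration):

```latex
% Differentiation becomes multiplication by s (minus the initial value):
\mathcal{L}\{f'\}(s) = s\,F(s) - f(0), \qquad F = \mathcal{L}\{f\}.
% So the IVP  y' + a y = 0,\; y(0) = 1  turns into algebra:
sY(s) - 1 + aY(s) = 0 \;\Longrightarrow\; Y(s) = \frac{1}{s+a}
\;\Longrightarrow\; y(t) = e^{-at}.
```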

  10. PCR melting profile (PCR MP - a new tool for differentiation of Candida albicans strains

    Directory of Open Access Journals (Sweden)

    Nowak Magdalena

    2009-11-01

    Full Text Available Abstract Background We have previously reported the use of the PCR Melting Profile (PCR MP) technique, based on using low denaturation temperatures during ligation-mediated PCR (LM PCR), for bacterial strain differentiation. The aim of the current study was to evaluate this method for intra-species differentiation of Candida albicans strains. Methods In total 123 Candida albicans strains (including 7 reference strains, 11 unrelated clinical isolates, and 105 isolates from patients of two hospitals in Poland) were examined using three genotyping methods: PCR MP, macrorestriction analysis of the chromosomal DNA by pulsed-field gel electrophoresis (REA-PFGE), and RAPD techniques. Results The genotyping results of the PCR MP were compared with results from the REA-PFGE and RAPD techniques, giving 27, 26 and 25 unique types, respectively. The results showed that the PCR MP technique has at least the same discriminatory power as REA-PFGE and RAPD. Conclusion The data presented here show for the first time the evaluation of the PCR MP technique for candidal strain differentiation, and we propose that it can be used as a relatively simple and cheap technique for epidemiological studies over a short period of time in the hospital setting.

  11. Differential Diagnosis Tool for Parkinsonian Syndrome Using Multiple Structural Brain Measures

    Directory of Open Access Journals (Sweden)

    Miho Ota

    2013-01-01

    Full Text Available Clinical differentiation of parkinsonian syndromes such as the Parkinson variant of multiple system atrophy (MSA-P) and its cerebellar subtype (MSA-C) from Parkinson's disease is difficult in the early stage of the disease. To identify the correlative pattern of brain changes for differentiating parkinsonian syndromes, we applied discriminant analysis techniques to magnetic resonance imaging (MRI) data. T1-weighted volume data and diffusion tensor images were obtained by MRI in eighteen patients with MSA-C, 12 patients with MSA-P, 21 patients with Parkinson's disease, and 21 healthy controls. They were evaluated using voxel-based morphometry and tract-based spatial statistics, respectively. Discriminant functions derived by stepwise methods resulted in correct classification rates of 0.89. When differentiating these diseases with the use of three independent variables together, the correct classification rate was the same as that obtained with stepwise methods. These findings support the view that each parkinsonian syndrome has structural deviations in multiple brain areas and that a combination of structural brain measures can help to distinguish parkinsonian syndromes.

  12. Fuchsia. A tool for reducing differential equations for Feynman master integrals to epsilon form

    Energy Technology Data Exchange (ETDEWEB)

    Gituliar, Oleksandr [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Magerya, Vitaly

    2017-01-15

    We present Fuchsia - an implementation of the Lee algorithm, which for a given system of ordinary differential equations with rational coefficients ∂_x f(x,ε) = A(x,ε) f(x,ε) finds a basis transformation T(x,ε), i.e., f(x,ε) = T(x,ε) g(x,ε), such that the system turns into the epsilon form: ∂_x g(x,ε) = ε S(x) g(x,ε), where S(x) is a Fuchsian matrix. A system of this form can be trivially solved in terms of polylogarithms as a Laurent series in the dimensional regulator ε. That makes the construction of the transformation T(x,ε) crucial for obtaining solutions of the initial system. In principle, Fuchsia can deal with any regular system; however, its primary task is to reduce differential equations for Feynman master integrals. It ensures that solutions contain only regular singularities due to the properties of Feynman integrals.

  13. A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    Science.gov (United States)

    Kieseler, Jan

    2017-11-01

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters, representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution for this problem. It relies only on the public result and its covariance or Hessian, and is validated against the combined-likelihood approach. A dedicated software package implementing this method is also presented. It provides a text-based user interface alongside a C++ interface. The latter also interfaces to ROOT classes for simple combination of binned measurements such as differential cross sections.
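
    The combination itself is a generalized least-squares problem: stack the measured values y with their full covariance C (including the correlations induced by shared, simultaneously fitted nuisance parameters) and a design matrix J mapping the true values to the measurements. A minimal numpy sketch of the method, not of the package's actual interface:

```python
import numpy as np

def combine(y, C, J):
    """Generalized least squares: minimize (y - J x)^T C^-1 (y - J x).

    y: stacked measured values; C: their full covariance; J: design matrix.
    Returns the best-fit x and its covariance (J^T C^-1 J)^-1.
    """
    Cinv = np.linalg.inv(C)
    cov_x = np.linalg.inv(J.T @ Cinv @ J)
    return cov_x @ J.T @ Cinv @ y, cov_x

# two measurements of one quantity, 80% correlated through a shared systematic
y = np.array([10.0, 11.0])
C = np.array([[1.0, 0.8], [0.8, 1.0]])
J = np.ones((2, 1))
x, cov = combine(y, C, J)
print(f"combined: {x[0]:.2f} +- {np.sqrt(cov[0, 0]):.2f}")
```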

  14. A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kieseler, Jan [CERN, Geneva (Switzerland)

    2017-11-15

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters, representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution for this problem. It relies only on the public result and its covariance or Hessian, and is validated against the combined-likelihood approach. A dedicated software package implementing this method is also presented. It provides a text-based user interface alongside a C++ interface. The latter also interfaces to ROOT classes for simple combination of binned measurements such as differential cross sections. (orig.)

  15. Volatile fraction composition and physicochemical parameters as tools for the differentiation of lemon blossom honey and orange blossom honey.

    Science.gov (United States)

    Kadar, Melinda; Juan-Borrás, Marisol; Carot, Jose M; Domenech, Eva; Escriche, Isabel

    2011-12-01

    Volatile fraction profile and physicochemical parameters were studied with the aim of evaluating their effectiveness for the differentiation between lemon blossom honey (Citrus limon L.) and orange blossom honey (Citrus spp.). They would be useful complementary tools to the traditional analysis based on the percentage of pollen. A stepwise discriminant analysis constructed using 37 volatile compounds (extracted by purge and trap and analysed by gas chromatography-mass spectrometry), and physicochemical and colour parameters (diastase, conductivity, Pfund colour and CIE L*a*b*) together provided a model that permitted the correct classification of 98.3% of the original and 96.6% of the cross-validated cases, indicating its efficiency and robustness. This model proved its effectiveness in the differentiation of both types of honey with another set of batches from the following year. This model, developed from the volatile compounds, physicochemical and colour parameters, has been useful for the differentiation of lemon and orange blossom honeys. Furthermore, it may be of particular interest for the attainment of a suitable classification of orange honey in which the pollen count is very low. These capabilities imply an evident marketing advantage for the beekeeping sector, since lemon blossom honey could be commercialized as unifloral honey and not as generic citrus honey and orange blossom honey could be correctly characterized. Copyright © 2011 Society of Chemical Industry.

  16. Decision support tool for early differential diagnosis of acute lung injury and cardiogenic pulmonary edema in medical critically ill patients.

    Science.gov (United States)

    Schmickl, Christopher N; Shahjehan, Khurram; Li, Guangxi; Dhokarh, Rajanigandha; Kashyap, Rahul; Janish, Christopher; Alsara, Anas; Jaffe, Allan S; Hubmayr, Rolf D; Gajic, Ognjen

    2012-01-01

    At the onset of acute hypoxic respiratory failure, critically ill patients with acute lung injury (ALI) may be difficult to distinguish from those with cardiogenic pulmonary edema (CPE). No single clinical parameter provides satisfying prediction. We hypothesized that a combination of such parameters would facilitate early differential diagnosis. In a population-based retrospective development cohort, validated electronic surveillance identified critically ill adult patients with acute pulmonary edema. Recursive partitioning and logistic regression were used to develop a decision support tool based on routine clinical information to differentiate ALI from CPE. Performance of the score was validated in an independent cohort of referral patients. Blinded post hoc expert review served as gold standard. Of 332 patients in a development cohort, expert reviewers (κ, 0.86) classified 156 as having ALI and 176 as having CPE. The validation cohort had 161 patients (ALI = 113, CPE = 48). The score was based on risk factors for ALI and CPE, age, alcohol abuse, chemotherapy, and peripheral oxygen saturation/FiO2 ratio. It demonstrated good discrimination (area under curve [AUC] = 0.81; 95% CI, 0.77-0.86) and calibration (Hosmer-Lemeshow [HL] P = .16). Similar performance was obtained in the validation cohort (AUC = 0.80; 95% CI, 0.72-0.88; HL P = .13). A simple decision support tool accurately classifies acute pulmonary edema, reserving advanced testing for a subset of patients in whom satisfying prediction cannot be made. This novel tool may facilitate early inclusion of patients with ALI and CPE into research studies as well as improve and rationalize clinical management and resource use.

  17. CSF lactate level: a useful diagnostic tool to differentiate acute bacterial and viral meningitis.

    Science.gov (United States)

    Abro, Ali Hassan; Abdou, Ahmed Saheh; Ustadi, Abdulla M; Saleh, Ahmed Alhaj; Younis, Nadeem Javeed; Doleh, Wafa F

    2009-08-01

    To evaluate the potential role of CSF lactate level in the diagnosis of acute bacterial meningitis and in the differentiation between viral and bacterial meningitis. This was a hospital-based observational study, conducted at the Infectious Diseases Unit, Rashid Hospital Dubai, United Arab Emirates, from July 2004 to June 2007. The study included patients with a clinical diagnosis of acute bacterial meningitis who had a positive CSF Gram stain/culture, or CSF analysis suggestive of bacterial meningitis with negative Gram stain and culture but blood culture positive for bacteria, as well as patients with a clinical diagnosis suggestive of viral meningitis supported by CSF chemical analysis with negative Gram stain and culture and negative blood culture for bacteria. CT scan of the brain was done for all patients before lumbar puncture, and CSF and blood samples were collected immediately after admission. CSF chemical analysis including lactate level was done on the first spinal tap. The CSF lactate level was tested by an enzymatic colorimetric method. A total of 95 adult patients with acute meningitis (53 bacterial and 42 viral) fulfilled the inclusion criteria. Among the 53 bacterial meningitis patients, Neisseria meningitidis was isolated in 29 (54.7%), Strept. pneumoniae in 18 (33.96%), Staph. aureus in 2 (3.77%), Klebsiella pneumoniae in 2 (3.77%), Strept. agalactiae in 1 (1.8%) and E. coli in 1 (1.8%). All the patients with bacterial meningitis had CSF lactate > 3.8 mmol/l except one, whereas none of the patients with viral meningitis had a lactate level > 3.8 mmol/l. The mean CSF lactate level in bacterial meningitis cases amounted to 16.51 +/- 6.14 mmol/l, whereas it was significantly lower in the viral group (2.36 +/- 0.6 mmol/l, p < .0001). The CSF lactate level was significantly higher in bacterial than in viral meningitis, and it can provide pertinent, rapid and reliable diagnostic information. Furthermore, the CSF lactate level can also differentiate bacterial meningitis from viral meningitis in a quick

  18. Combinatorial hexapeptide ligand libraries (ProteoMiner): an innovative fractionation tool for differential quantitative clinical proteomics.

    Science.gov (United States)

    Hartwig, Sonja; Czibere, Akos; Kotzka, Jorg; Passlack, Waltraud; Haas, Rainer; Eckel, Jürgen; Lehr, Stefan

    2009-07-01

    Blood serum samples are the major source for clinical proteomics approaches, which aim to identify diagnostically relevant or treatment-response-related proteins. However, the presence of very high-abundance proteins and the enormous dynamic range of protein distribution hinder whole-serum analysis. An innovative tool to overcome these limitations utilizes combinatorial hexapeptide ligand libraries (ProteoMiner). Here, we demonstrate that ProteoMiner can be used for comparative and quantitative analysis of complex proteomes. We spiked serum samples with increasing amounts (3 microg to 300 microg) of whole E. coli lysate, processed them with ProteoMiner and performed quantitative analyses of 2D gels. We found that the concentration of the spiked bacterial proteome, reflected by the maintained proportional spot intensities, was not altered by ProteoMiner treatment. Therefore, we conclude that the ProteoMiner technology can be used for quantitative analysis of low-abundance proteins in complex biological samples.

  19. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives...... are described: The forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages...
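
    The forward mode named above can be illustrated without any package: the following sketch propagates dual numbers (a value paired with its derivative) through the chain rule. It is a hedged illustration of the principle only; the class and function names are hypothetical and are not the FADBAD/TADIFF API.

```python
import math

class Dual:
    """Minimal forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def sin(x):
    # chain rule for an elementary function
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# Differentiate f(x) = x*sin(x) + 2x at x = 1.5 by seeding dx/dx = 1.
x = Dual(1.5, 1.0)
f = x * sin(x) + 2 * x
print(f.val, f.dot)   # value and exact derivative sin(x) + x*cos(x) + 2
```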

  20. Embryonic hybrid cells: a powerful tool for studying pluripotency and reprogramming of the differentiated cell chromosomes

    Directory of Open Access Journals (Sweden)

    SEROV OLEG

    2001-01-01

    The properties of embryonic hybrid cells obtained by fusion of embryonic stem (ES) or teratocarcinoma (TC) cells with differentiated cells are reviewed. Usually, ES-somatic or TC-somatic hybrids retain pluripotent capacity at high levels quite comparable or nearly identical to those of the pluripotent partner. When cultured in vitro, ES-somatic and TC-somatic hybrid cell clones, as a rule, lose the chromosomes derived from the somatic partner; however, in some clones the autosomes from the ES cell partner were also eliminated, i.e. the parental chromosomes segregated bilaterally in the ES-somatic cell hybrids. This opens up ways to search for correlations between the pluripotent status of the hybrid cells and chromosome segregation patterns, and therefore to identify the particular chromosomes involved in the maintenance of pluripotency. Use of selective medium allows one to isolate in vitro clones of ES-somatic hybrid cells in which "the pluripotent" chromosome can be replaced by "the somatic" counterpart carrying the selectable gene. Unlike the TC-somatic cell hybrids, the ES-somatic hybrids with a near-diploid complement of chromosomes are able to contribute to various tissues of chimeric animals after injection into the blastocoel cavity. Analysis of the chimeric animals showed that the "somatic" chromosome undergoes reprogramming during development. The prospects for the identification of the chromosomes that are involved in the maintenance of pluripotency and its cis- and trans-regulation in the hybrid cell genome are discussed.

  1. The use of surface electromyography as a tool in differentiating temporomandibular disorders from neck disorders.

    Science.gov (United States)

    Ferrario, Virgilio F; Tartaglia, Gianluca M; Luraghi, Francesca E; Sforza, Chiarella

    2007-11-01

    The aim of this study was to assess the electromyographic characteristics of the masticatory muscles (masseter and temporalis) of patients with either "temporomandibular joint disorder" or "neck pain". Surface electromyography of the right and left masseter and temporalis muscles was performed during maximum teeth clenching in 38 patients aged 21-67 years who had either (a) temporomandibular joint disorder (24 patients) or (b) neck pain (13 patients). Ninety-five healthy control subjects were also examined. During clenching, standardized total muscle activities (electromyographic potentials over time) were significantly different in the three groups: 75 microV/microVs% in the temporomandibular joint disorder patients, 124 microV/microVs% in the neck pain patients, and 95 microV/microVs% in the control subjects (analysis of variance, P < 0.05). The temporomandibular joint disorder patients also had significantly (P < 0.05) lower values than the neck pain patients (87%) or control subjects (92%). A linear discriminant function analysis allowed a significant separation between the two patient groups, with a single-patient error of 18.2%. Surface electromyographic analysis during clenching allowed differentiation between patients with a temporomandibular joint disorder and patients with a neck pain problem.

  2. Bayesian nonparametric variable selection as an exploratory tool for discovering differentially expressed genes.

    Science.gov (United States)

    Shahbaba, Babak; Johnson, Wesley O

    2013-05-30

    High-throughput scientific studies involving no clear a priori hypothesis are common. For example, a large-scale genomic study of a disease may examine thousands of genes without hypothesizing that any specific gene is responsible for the disease. In these studies, the objective is to explore a large number of possible factors (e.g., genes) in order to identify a small number that will be considered in follow-up studies that tend to be more thorough and on smaller scales. A simple, hierarchical, linear regression model with random coefficients is assumed for case-control data that correspond to each gene. The specific model used will be seen to be related to a standard Bayesian variable selection model. Relatively large regression coefficients correspond to potential differences in responses for cases versus controls and thus to genes that might 'matter'. For large-scale studies, and using a Dirichlet process mixture model for the regression coefficients, we are able to find clusters of regression effects of genes with increasing potential effect or 'relevance', in relation to the outcome of interest. One cluster will always correspond to genes whose coefficients are in a neighborhood that is relatively close to zero and will be deemed least relevant. Other clusters will correspond to increasing magnitudes of the random/latent regression coefficients. Using simulated data, we demonstrate that our approach could be quite effective in finding relevant genes compared with several alternative methods. We apply our model to two large-scale studies. The first study involves transcriptome analysis of infection by human cytomegalovirus. The second study's objective is to identify differentially expressed genes between two types of leukemia. Copyright © 2012 John Wiley & Sons, Ltd.
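
    A hedged sketch of the clustering idea described above, using scikit-learn's truncated Dirichlet process mixture (BayesianGaussianMixture) as a stand-in for the paper's full hierarchical regression model; the coefficient data are synthetic and the variable names are illustrative.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Hypothetical per-gene case-vs-control regression coefficients: most near
# zero (irrelevant), plus a small cluster clearly shifted away (relevant).
beta = np.concatenate([rng.normal(0.0, 0.1, 950),
                       rng.normal(1.5, 0.2, 50)])

# Truncated Dirichlet process mixture over the coefficients; the number of
# occupied clusters is inferred from the data rather than fixed in advance.
dpm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(beta.reshape(-1, 1))
labels = dpm.predict(beta.reshape(-1, 1))

# The cluster whose mean sits near zero is "least relevant"; genes falling
# in clusters far from zero are candidates for follow-up studies.
for k in np.unique(labels):
    print(f"cluster {k}: n={np.sum(labels == k)}, "
          f"mean coefficient={dpm.means_[k, 0]:+.2f}")
```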

  3. "NeuroStem Chip": a novel highly specialized tool to study neural differentiation pathways in human stem cells

    Directory of Open Access Journals (Sweden)

    Li Jia-Yi

    2007-02-01

    Background: Human stem cells are viewed as a possible source of neurons for cell-based therapy of neurodegenerative disorders, such as Parkinson's disease. Several protocols that generate different types of neurons from human stem cells (hSCs) have been developed. Nevertheless, the cellular mechanisms that underlie the development of neurons in vitro as they are subjected to the specific differentiation protocols are often poorly understood. Results: We have designed a focused DNA (oligonucleotide-based) large-scale microarray platform (named "NeuroStem Chip") and used it to study gene expression patterns in hSCs as they differentiate into neurons. We have selected genes that are relevant to cells (i) being stem cells, (ii) becoming neurons, and (iii) being neurons. The NeuroStem Chip has over 1,300 pre-selected gene targets and multiple controls spotted in quadruplicate (~46,000 spots total). In this study, we present the NeuroStem Chip in detail and describe the special advantages it offers to the fields of experimental neurology and stem cell biology. To illustrate the utility of the NeuroStem Chip platform, we have characterized an undifferentiated population of pluripotent human embryonic stem cells (hESCs; cell line SA02). In addition, we have performed a comparative gene expression analysis of those cells versus a heterogeneous population of hESC-derived cells committed towards the neuronal/dopaminergic differentiation pathway by co-culturing with PA6 stromal cells for 16 days and containing a few tyrosine hydroxylase-positive dopaminergic neurons. Conclusion: We characterized the gene expression profiles of undifferentiated and dopaminergic lineage-committed hESC-derived cells using a highly focused custom microarray platform (NeuroStem Chip) that can become an important research tool in human stem cell biology. We propose that the areas of application for the NeuroStem microarray platform could be the following: (i) characterization of the...

  4. Differential contributions of the superior and inferior parietal cortex to feedback versus feedforward control of tools.

    Science.gov (United States)

    Macuga, Kristen L; Frey, Scott H

    2014-05-15

    Damage to the superior and/or inferior parietal lobules (SPL, IPL) (Sirigu et al., 1996) or cerebellum (Grealy and Lee, 2011) can selectively disrupt motor imagery, motivating the hypothesis that these regions participate in predictive (i.e., feedforward) control. If so, then the SPL, IPL, and cerebellum should show greater activity as the demands on feedforward control increase from visually-guided execution (closed-loop) to execution without visual feedback (open-loop) to motor imagery. Using fMRI and a Fitts' reciprocal aiming task with tools directed at targets in far space, we found that the SPL and cerebellum exhibited greater activity during closed-loop control. Conversely, open-loop and imagery conditions were associated with increased activity within the IPL and prefrontal areas. These results are consistent with a superior-to-inferior gradient in the representation of feedback-to-feedforward control within the posterior parietal cortex. Additionally, the anterior SPL displayed greater activity when aiming movements were performed with a stick vs. laser pointer. This may suggest that it is involved in the remapping of far into near (reachable) space (Maravita and Iriki, 2004), or in distalization of the end-effector from hand to stick (Arbib et al., 2009). Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Not proper ROC curves as new tool for the analysis of differentially expressed genes in microarray experiments

    Directory of Open Access Journals (Sweden)

    Pistoia Vito

    2008-10-01

    Background: Most microarray experiments are carried out with the purpose of identifying genes whose expression varies in relation to specific conditions or in response to environmental stimuli. In such studies, genes showing similar mean expression values between two or more groups are considered not differentially expressed, even if hidden subclasses with different expression values may exist. In this paper we propose a new method for identifying differentially expressed genes, based on the area between the ROC curve and the rising diagonal (ABCR). ABCR represents a more general approach than the standard area under the ROC curve (AUC), because it can identify both proper (i.e., concave) and not proper ROC curves (NPRC). In particular, NPRC may correspond to those genes that tend to escape standard selection methods. Results: We assessed the performance of our method using data from a publicly available database of 4026 genes, including 14 normal B cell samples (NBC) and 20 heterogeneous lymphomas (namely, 9 follicular lymphomas and 11 chronic lymphocytic leukemias). Moreover, NBC also included two subclasses, i.e., 6 heavily stimulated and 8 slightly or not stimulated samples. We identified 1607 differentially expressed genes with an estimated False Discovery Rate of 15%. Among them, 16 corresponded to NPRC and all escaped standard selection procedures based on AUC and t statistics. Moreover, a simple inspection of the shape of such plots allowed identification of the two subclasses in one of the classes in 13 cases (81%). Conclusion: NPRC represent a new useful tool for the analysis of microarray data.
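
    The ABCR statistic described above is straightforward to compute: AUC integrates the ROC curve itself, while ABCR integrates the absolute distance between the curve and the rising diagonal, so a curve that crosses the diagonal (AUC near 0.5) can still score well. A hedged numpy sketch with synthetic "genes", not the authors' code:

```python
import numpy as np

def roc_points(scores, labels):
    """Empirical ROC curve (FPR, TPR) swept over all score thresholds."""
    order = np.argsort(-np.asarray(scores))
    labels = np.asarray(labels)[order]
    tpr = np.concatenate([[0.0], np.cumsum(labels) / labels.sum()])
    fpr = np.concatenate([[0.0], np.cumsum(1 - labels) / (1 - labels).sum()])
    return fpr, tpr

def trapezoid(y, x):
    return float(np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2.0))

def auc_and_abcr(scores, labels):
    fpr, tpr = roc_points(scores, labels)
    auc = trapezoid(tpr, fpr)                  # area under the ROC curve
    abcr = trapezoid(np.abs(tpr - fpr), fpr)   # area between curve and diagonal
    return auc, abcr

# Two synthetic "genes": a mean shift gives a proper (concave) ROC curve;
# cases split into low/high subclasses give a not-proper curve that crosses
# the diagonal, so AUC is near 0.5 while ABCR remains clearly positive.
rng = np.random.default_rng(1)
y = np.repeat([0, 1], 100)
shifted = np.concatenate([rng.normal(0, 1, 100), rng.normal(1.5, 1, 100)])
split = np.concatenate([rng.normal(0, 0.3, 100),
                        rng.normal(-1.5, 0.3, 50), rng.normal(1.5, 0.3, 50)])
print("shifted gene: AUC=%.2f ABCR=%.2f" % auc_and_abcr(shifted, y))
print("split gene:   AUC=%.2f ABCR=%.2f" % auc_and_abcr(split, y))
```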

  6. Automatic tracking of the constancy of the imaging chain of radiographic equipment using an integrated phantom and evaluation software tool

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, P.; Rodenas, F.; Marin, B.; Alcaraz, D.; Verdu, G.

    2011-07-01

    This paper presents a tool, innovative at the national level, for automatic analysis of the constancy of the imaging chain of digital radiographic equipment, both computed radiography (CR) and direct digital radiography (DR).

  7. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
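
    The core task, generating finite models of a set of equations, can be illustrated by brute-force enumeration of operation tables; the paper's tool uses far more sophisticated search, so this is only a sketch of the problem statement.

```python
from itertools import product

def models(n, equations):
    """Enumerate all binary operations on {0, ..., n-1} (as n*n tables)
    satisfying every equation; equations are predicates over the table."""
    dom = range(n)
    for flat in product(dom, repeat=n * n):
        op = [list(flat[i * n:(i + 1) * n]) for i in dom]
        if all(eq(op, n) for eq in equations):
            yield op

# Example equations: associativity and commutativity.
def associative(op, n):
    r = range(n)
    return all(op[x][op[y][z]] == op[op[x][y]][z]
               for x in r for y in r for z in r)

def commutative(op, n):
    r = range(n)
    return all(op[x][y] == op[y][x] for x in r for y in r)

# All commutative semigroups on a 2-element domain, found by brute force.
for table in models(2, [associative, commutative]):
    print(table)
```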

  8. Fiscal Drag as an Automatic Stability Tool in the Case of the New Regulation with Price Criteria in the Automotive Sector's Special Consumption Tax (SCT)

    Directory of Open Access Journals (Sweden)

    Abdurrahman TARAKTAŞ

    2017-12-01

    Fiscal drag is a result of a real or nominally expanding economy and progressive taxation. In general, individuals are forced into upper tax brackets as their income or expenditure increases. A higher tax burden can result in less consumption; fiscal drag, through lack of spending or excessive taxation, can cause the economy to slow down. The traditional view suggests that fiscal drag may serve as a natural automatic stabilizer to cool the economy. However, this view ignores the supply side and, in particular, the potential effects of a high tax burden on the economy. This study examines the extent to which the expected automatic stabilization function can be performed, and the possible side effects of fiscal drag on economic balances and income distribution in our country, in the case of the new regulation with price criteria in the automotive sector's Special Consumption Tax (SCT).

  9. Automatic Picking of Foraminifera: Design of the Foraminifera Image Recognition and Sorting Tool (FIRST) Prototype and Results of the Image Classification Scheme

    Science.gov (United States)

    de Garidel-Thoron, T.; Marchant, R.; Soto, E.; Gally, Y.; Beaufort, L.; Bolton, C. T.; Bouslama, M.; Licari, L.; Mazur, J. C.; Brutti, J. M.; Norsa, F.

    2017-12-01

    Foraminifera tests are the main proxy carriers for paleoceanographic reconstructions. Both geochemical and taxonomical studies require large numbers of tests to achieve statistical relevance. To date, the extraction of foraminifera from the sediment coarse fraction is still done by hand and is thus time-consuming. Moreover, the recognition of ecologically relevant morphotypes requires taxonomical skills that are not easily taught. Automatic recognition and extraction of foraminifera would greatly help paleoceanographers overcome these issues. Recent advances in automatic image classification using machine learning open the way to automatic extraction of foraminifera. Here we detail progress on the design of an automatic picking machine as part of the FIRST project. The machine handles 30 pre-sieved samples (100-1000 µm), separating them into individual particles (including foraminifera) and imaging each in pseudo-3D. The particles are classified, and specimens of interest are sorted either for Individual Foraminifera Analyses (44 per slide) and/or for classical multiple analyses (8 morphological classes per slide, up to 1000 individuals per hole). The classification is based on machine learning using Convolutional Neural Networks (CNNs), similar to the approach used in the coccolithophorid imaging system SYRACO. To prove its feasibility, we built two training image datasets of modern planktonic foraminifera containing approximately 2000 and 5000 images, corresponding to 15 and 25 morphological classes. Using a CNN with a residual topology (ResNet), we achieve over 95% correct classification for each dataset. We tested the network on 160,000 images from 45 depths of a sediment core from the Pacific Ocean, for which we have human counts. The current algorithm is able to reproduce the downcore variability in both Globigerinoides ruber and the fragmentation index (r2 = 0.58 and 0.88, respectively). The FIRST prototype yields some promising results for high...
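
    A hedged sketch of the classification step in a modern framework: a ResNet-style network adapted to single-channel particle images with 15 output classes, trained here on one random batch standing in for real data. PyTorch is an assumption; the record does not state which framework FIRST uses.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical setup: grayscale pseudo-3D particle images and the 15
# morphological classes of the smaller training set described above.
num_classes = 15
net = models.resnet18(num_classes=num_classes)   # residual topology (ResNet)
# Adapt the stem to single-channel (grayscale) input images.
net.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)

# One illustrative training step on a random batch standing in for real data.
images = torch.randn(8, 1, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = criterion(net(images), labels)
loss.backward()
optimizer.step()
print(f"batch loss: {loss.item():.3f}")
```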

  10. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    Science.gov (United States)

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed the Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment used were a patient monitor, an anaesthesia machine, and a bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, a regional oximeter, and a rapid infusion device were added as required. The automatic recording option was used exclusively, and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) of cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.

  11. Evaluation of tumour markers as differential diagnostic tool in patients with suspicion of liver metastases from breast cancer.

    Science.gov (United States)

    Liska, Vaclav; Holubec, Lubos; Treska, Vladislav; Vrzalova, Jindra; Skalicky, Tomas; Sutnar, Alan; Kormunda, Stanislav; Bruha, Jan; Vycital, Ondrej; Finek, Jindrich; Pesta, Martin; Pecen, Ladislav; Topolcan, Ondrej

    2011-04-01

    The liver is the site of breast cancer metastasis in 50% of patients with advanced disease. Tumour markers have been demonstrated to be useful in the follow-up of patients with breast cancer, in early detection of recurrence after radical surgical treatment, and in assessing the effect of oncologic therapy, but no study has examined their usefulness in distinguishing benign liver lesions from breast cancer metastases. The aim of this study was therefore to evaluate the importance of the tumour markers carcinoembryonic antigen (CEA), carbohydrate antigen CA19-9 (CA19-9), thymidine kinase (TK), tissue polypeptide antigen (TPA), tissue polypeptide-specific antigen (TPS) and cytokeratin 19 fragment (CYFRA 21-1) in the differential diagnosis between benign liver lesions and liver metastases of breast cancer. The study included three groups: 22 patients with liver metastases of breast cancer; 39 patients with benign liver lesions (hemangioma, focal nodular hyperplasia, liver cyst, hepatocellular adenoma); and, as a control group, 21 patients without any liver disease or lesion who were operated on for benign extrahepatic diseases (groin hernia, varices of the lower limbs). The serum levels of tumour markers were assessed by immunoanalytical methods. Preoperative serum levels of CYFRA 21-1, TPA, TPS and CEA were significantly higher in patients with liver metastases of breast cancer than in healthy controls and patients with benign liver lesions (p-value < 0.05). Serum levels of CA19-9 and TK were higher in patients with malignancy than in those with benign liver disease and healthy controls, but these differences were not statistically significant. The tumour markers CEA, CYFRA 21-1, TPA and TPS can be recommended as a good tool for differential diagnosis between liver metastases of breast cancer and benign liver lesions.

  12. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. More than 10 years of the authors' experience in the development and design of interactive tools dedicated to the study of automatic control concepts are also presented. The second part of the paper summarizes the main features of the "Automatic Control with Interactive Tools" text that has been recently published by Pea...

  13. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    Science.gov (United States)

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode narrative concepts and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using the HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p < 0.05) for generating entry-level interoperable clinical documents. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  14. Re-targeting the Graze performance debugging tool for Java threads and analyzing the re-targeting to automatically parallelized (FORTRAN) code

    OpenAIRE

    Tsai, Pedro T. H.

    2000-01-01

    Approved for public release; distribution is unlimited. This research focuses on the design of a language-independent concept, Glimpse, for performance debugging of multi-threaded programs. This research extends previous work on Graze, a tool designed and implemented for performance debugging of C++ programs. Not only is Glimpse easily portable among different programming languages, it is also useful in many different paradigms ranging from a few long-lived threads to many short-lived...

  15. SU-G-TeP4-07: Automatic EPID-Based 2D Measurement of MLC Leaf Offset as a Quality Control Tool

    Energy Technology Data Exchange (ETDEWEB)

    Ritter, T; Moran, J [The University of Michigan, Ann Arbor, MI (United States); Schultz, B [University of Michigan, Ann Arbor, MI (United States); Kim, G [University of California, San Diego, La Jolla, CA (United States); Barnes, M [Calvary Mater Hospital Newcastle, Warratah, NSW (Australia); Perez, M [North Sydney Cancer Center, Sydney (Australia); Farrey, K [University of Chicago, Chicago, IL (United States); Popple, R [University Alabama Birmingham, Birmingham, AL (United States); Greer, P [Calvary Mater Newcastle, Newcastle (Australia)

    2016-06-15

    Purpose: The MLC dosimetric leaf gap (DLG) and transmission are measured parameters which impact the dosimetric accuracy of IMRT and VMAT plans. This investigation aims to develop an efficient and accurate routine constancy check of the physical DLG in two dimensions. Methods: The manufacturer's recommended DLG measurement method was modified by using 5 fields instead of 11 and by utilizing the Electronic Portal Imaging Device (EPID). Validations were accomplished using an ion chamber (IC) in solid water and a 2D IC array. EPID data were collected for 6 months on multiple TrueBeam linacs using both Millennium and HD MLCs at 5 different clinics in an international consortium. Matlab code was written to automatically analyze the images and calculate the 2D results. Sensitivity was investigated by introducing deliberate leaf position errors. MLC calibration and initialization history were recorded to allow quantification of their impact. Results were analyzed using statistical process control (SPC). Results: The EPID method took approximately 5 minutes. Due to detector response, the EPID-measured DLG and transmission differed from the IC values but were reproducible and consistent with changes measured using the ICs. For the Millennium MLC, the EPID-measured DLG and transmission were both consistently lower than the IC results. The EPID method was implemented as leaf offset and transmission constancy tests (LOC and TC). Based on 6 months of measurements, the initial leaf-specific action thresholds for changes from baseline were set to 0.1 mm. Upper and lower control limits for variation were developed for each machine. Conclusion: Leaf offset and transmission constancy tests were implemented on Varian HD and Millennium MLCs using an EPID and found to be efficient and accurate. The test is effective for monitoring MLC performance using dynamic delivery and performing process control on the DLG in 2D, thus enhancing dosimetric accuracy. This work was supported by a grant...

  16. Differential equations

    CERN Document Server

    Barbu, Viorel

    2016-01-01

    This textbook is a comprehensive treatment of ordinary differential equations, concisely presenting basic and essential results in a rigorous manner. Including various examples from physics, mechanics, natural sciences, engineering and automatic control theory, Differential Equations is a bridge between the abstract theory of differential equations and applied systems theory. Particular attention is given to the existence and uniqueness of the Cauchy problem, linear differential systems, stability theory and applications to first-order partial differential equations. Upper undergraduate students and researchers in applied mathematics and systems theory with a background in advanced calculus will find this book particularly useful. Supplementary topics are covered in an appendix, enabling the book to be completely self-contained.

  17. IRSS: a web-based tool for automatic layout and analysis of IRES secondary structure prediction and searching system in silico

    Directory of Open Access Journals (Sweden)

    Hong Jun-Jie

    2009-05-01

    Background: Internal ribosomal entry sites (IRESs) provide alternative, cap-independent translation initiation sites in eukaryotic cells. IRES elements are important factors in viral genomes and are also useful tools for bi-cistronic expression vectors. Most existing RNA structure prediction programs are unable to deal with IRES elements. Results: We designed an IRES search system, named IRSS, to obtain better results for IRES prediction. RNA secondary structure prediction and comparison software programs were implemented to construct our two-stage strategy for IRSS. Two software programs formed the backbone of IRSS: the RNAL fold program, used to predict local RNA secondary structures by the minimum free energy method; and the RNA Align program, used to compare the predicted structures. After a complete viral genome database search, IRSS had a low error rate and up to 72.3% sensitivity with appropriate parameters. Conclusion: IRSS is freely available at http://140.135.61.9/ires/. In addition, all source codes, precompiled binaries, examples and documentation are downloadable for local execution. This new search approach for IRES elements will provide a useful research tool for IRES-related studies.
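
    A hedged sketch of the two-stage idea (fold, then compare structures). It assumes the ViennaRNA Python bindings for minimum-free-energy folding in place of RNAL fold, and uses a crude shared-base-pair score in place of RNA Align; the sequences are illustrative and must be of equal length for this toy comparison.

```python
import RNA  # ViennaRNA Python bindings; stands in for the RNAL fold step

def pairs(db):
    """Base pairs (i, j) extracted from a dot-bracket string."""
    stack, out = [], set()
    for i, c in enumerate(db):
        if c == "(":
            stack.append(i)
        elif c == ")":
            out.add((stack.pop(), i))
    return out

def structure_similarity(seq_a, seq_b):
    """Stage 1: MFE folding; stage 2: crude structural comparison
    (fraction of shared base pairs), standing in for RNA Align."""
    (db_a, _), (db_b, _) = RNA.fold(seq_a), RNA.fold(seq_b)
    pa, pb = pairs(db_a), pairs(db_b)
    return len(pa & pb) / max(1, len(pa | pb))

# Hypothetical query against a known IRES-like reference fragment.
ref = "GGGAAACGUCCUUCGGGACGUUUCCC"
query = "GGGAAACGUCAUUCGGGACGUUUCCC"
print(f"shared-pair similarity: {structure_similarity(ref, query):.2f}")
```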

  18. Knickzone Extraction Tool (KET) – A new ArcGIS toolset for automatic extraction of knickzones from a DEM based on multi-scale stream gradients

    Directory of Open Access Journals (Sweden)

    Zahra Tuba

    2017-04-01

    Extraction of knickpoints or knickzones from a Digital Elevation Model (DEM) has gained immense significance owing to the increasing implications of knickzones for landform development. However, existing methods for knickzone extraction tend to be subjective or require time-intensive data processing. This paper describes the proposed Knickzone Extraction Tool (KET), a new raster-based Python script deployed in the form of an ArcGIS toolset that automates the process of knickzone extraction and is both fast and user-friendly. The KET is based on multi-scale analysis of slope gradients along a river course, where any locally steep segment (knickzone) can be extracted as an anomalously high local gradient. We also conducted a comparative analysis of the KET and other contemporary knickzone identification techniques. The relationship between knickzone distribution and its morphometric characteristics is also examined through a case study of a mountainous watershed in Japan.
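
    The multi-scale gradient idea translates directly into code: compute the local slope of the long profile over several window lengths and flag points whose gradient is anomalously steep at any scale. A hedged numpy sketch on a synthetic profile, not the KET source; the thresholds and window sizes are illustrative.

```python
import numpy as np

def knickzones(distance, elevation, windows=(100, 200, 400), z=2.0):
    """Flag anomalously steep stream segments: at each measurement scale
    (window length in metres), fit a local gradient and mark points whose
    gradient exceeds the profile mean by z standard deviations."""
    flagged = np.zeros(len(distance), dtype=bool)
    for w in windows:
        half = w / 2.0
        grads = np.empty(len(distance))
        for i, d in enumerate(distance):
            m = (distance >= d - half) & (distance <= d + half)
            # slope of the elevation profile inside the window
            grads[i] = np.polyfit(distance[m], elevation[m], 1)[0]
        # downstream slope is negative, so "steep" means well below the mean
        flagged |= grads < grads.mean() - z * grads.std()
    return flagged

# Synthetic long profile with one oversteepened reach (a knickzone).
d = np.linspace(0, 5000, 501)
e = 1000 - 0.05 * d - 30 / (1 + np.exp(-(d - 2500) / 50))
print("knickzone detected near:", d[knickzones(d, e)])
```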

  19. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  20. High-frequency ultrasonography (HFUS) as a useful tool in differentiating between plaque morphea and extragenital lichen sclerosus lesions

    Directory of Open Access Journals (Sweden)

    Rafał Białynicki-Birula

    2017-10-01

    Introduction: Morphea and lichen sclerosus (LS) are chronic inflammatory diseases that may pose a diagnostic challenge for the physician. High-frequency ultrasonography (HFUS) is a versatile diagnostic method utilized in dermatologic practice, allowing monitoring of the course of the disease and treatment response, and differentiation between certain skin disorders. Aim: To demonstrate the usefulness of HFUS in differentiating between plaque morphea and extragenital LS lesions. Material and methods: We examined 16 patients with plaque morphea and 4 patients with extragenital LS using a 20 MHz taberna pro medicum (Germany) device. Results: Investigations revealed a hyperechogenic entrance echo in both morphea and LS lesions, whereas a distinct polycyclic surface of the entrance echo was detected exclusively in LS. Conclusions: High-frequency ultrasonography is a current diagnostic modality that may prove useful in differentiating between morphea and LS lesions.

  1. A Web-Based Tool for Automatic Data Collection, Curation, and Visualization of Complex Healthcare Survey Studies including Social Network Analysis

    Directory of Open Access Journals (Sweden)

    José Alberto Benítez

    2017-01-01

    There is great concern nowadays regarding alcohol consumption and drug abuse, especially among young people. By analyzing the social environment in which these adolescents are immersed, as well as a series of measures determining alcohol abuse risk or personal situation and perception, using questionnaires such as AUDIT, FAS, KIDSCREEN, and others, it is possible to gain insight into the current situation of a given individual regarding his/her consumption behavior. This analysis, however, requires tools that ease the process of questionnaire creation, data gathering, curation and representation, and later analysis and visualization. This research presents the design and construction of a web-based platform able to facilitate each of these processes by integrating the different phases into an intuitive system with a graphical user interface that hides the complexity underlying each of the questionnaires and techniques used, presenting the results in a flexible and visual way and avoiding any manual handling of data during the process. The advantages of this approach are shown and compared with the previous situation, in which some of the tasks were accomplished by time-consuming and error-prone manipulations of data.

  2. A Web-Based Tool for Automatic Data Collection, Curation, and Visualization of Complex Healthcare Survey Studies including Social Network Analysis.

    Science.gov (United States)

    Benítez, José Alberto; Labra, José Emilio; Quiroga, Enedina; Martín, Vicente; García, Isaías; Marqués-Sánchez, Pilar; Benavides, Carmen

    2017-01-01

    There is great concern nowadays regarding alcohol consumption and drug abuse, especially among young people. By analyzing the social environment in which these adolescents are immersed, as well as a series of measures determining alcohol abuse risk or personal situation and perception, using questionnaires such as AUDIT, FAS, KIDSCREEN, and others, it is possible to gain insight into the current situation of a given individual regarding his/her consumption behavior. This analysis, however, requires tools that ease the process of questionnaire creation, data gathering, curation and representation, and later analysis and visualization. This research presents the design and construction of a web-based platform able to facilitate each of these processes by integrating the different phases into an intuitive system with a graphical user interface that hides the complexity underlying each of the questionnaires and techniques used, presenting the results in a flexible and visual way and avoiding any manual handling of data during the process. The advantages of this approach are shown and compared with the previous situation, in which some of the tasks were accomplished by time-consuming and error-prone manipulations of data.

  3. Carbon Stable-Isotope and Physicochemical Data as a Possible Tool to Differentiate between Honey-Production Environments in Uruguay

    Directory of Open Access Journals (Sweden)

    Verónica Berriel

    2018-06-01

    The allocation of honey origin is an increasingly important issue worldwide, as it is closely related to product quality and consumer preference. In South America, honeys produced in grasslands and in eucalyptus or native forests are preferred at the regional level, so their differentiation is essential to assure consumers of their authenticity according to the productive system. The objective of this study was to differentiate honeys produced in three environments: a monoculture system based on eucalyptus forest, and two natural environments based on grasslands and native forests. To do this, the honeys' physicochemical and isotopic variables (pH, free acidity, lactic acid content, moisture, total sugar content, and the 13C isotopic composition of honey and extracted protein) were analysed. Discriminant analysis applied to the data revealed that, based on the selected variables, it was impossible to differentiate the three groups of honeys due to the superposition of those produced in grasslands and native forests. For this reason, a group of honeys derived from native and polyfloral environments (grasslands and native forests) was formed and subjected to discriminant analysis (DA) together with the group of honeys derived from a commercial eucalyptus plantation. The model obtained in this case achieved 100% correct allocation both at the training stage and at the cross-validation stage.
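
    A hedged sketch of the discriminant-analysis step with scikit-learn on synthetic stand-in data; the feature layout mirrors the seven measured variables, but the numbers are invented and cross-validation stands in for the paper's allocation-validation stage.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical feature table: rows = honey samples, columns = the measured
# variables (pH, free acidity, lactic acid, moisture, sugars, d13C of honey
# and of extracted protein); y = 0 eucalyptus, 1 native/polyfloral.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1.0, (30, 7)),
               rng.normal(0.8, 1.0, (30, 7))])
y = np.repeat([0, 1], 30)

lda = LinearDiscriminantAnalysis()
# Cross-validated allocation rate, analogous to the paper's validation stage.
scores = cross_val_score(lda, X, y, cv=5)
print(f"cross-validated allocation rate: {scores.mean():.2f}")
```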

  4. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    Science.gov (United States)

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.
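
    The kind of conclusion DAISY automates can be shown on a toy model with sympy: for x' = -a*b*x with output y = x, the output depends on a and b only through their product, so the pair (a, b) is not globally identifiable while a*b is. A hedged sketch of the idea, not DAISY's differential algebra algorithm:

```python
import sympy as sp

# Toy model: x' = -a*b*x, observed output y = x. A differential-algebra
# style check asks which parameter combinations the output pins down.
t = sp.symbols("t")
a, b, k = sp.symbols("a b k", positive=True)
x = sp.Function("x")

# Solve the ODE symbolically: x(t) = C1 * exp(-a*b*t).
sol = sp.dsolve(sp.Eq(x(t).diff(t), -a * b * x(t)), x(t)).rhs

# Substituting a = k/b leaves the output unchanged: b drops out entirely,
# so only the product a*b is identifiable from y(t).
print(sp.simplify(sol.subs({a: k / b})))
```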

  5. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has increasingly aimed to create procedural models from images automatically. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  6. TMB: Automatic differentiation and Laplace approximation

    DEFF Research Database (Denmark)

    Kristensen, Kasper; Nielsen, Anders; Berg, Casper Willestofte

    2016-01-01

    TMB is an open source R package that enables quick implementation of complex nonlinear random effects (latent variable) models in a manner similar to the established AD Model Builder package (ADMB, http://admb-project.org/; Fournier et al. 2011). In addition, it offers easy access to parallel...... computations. The user defines the joint likelihood for the data and the random effects as a C++ template function, while all the other operations are done in R; e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood where the random effects...
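
    The Laplace device TMB applies to large latent-variable models can be sketched for a single random effect: maximize the joint log-density over the random effect and correct with the curvature at the mode. A hedged Python translation of the idea (TMB itself uses C++ templates, R, and automatic differentiation rather than the finite differences used here):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_joint(u, y, mu, sigma):
    """Negative joint log-density of data and random effect:
    y ~ Poisson(exp(u)), u ~ N(mu, sigma^2). (log(y!) constant omitted.)"""
    loglik = np.sum(y * u - np.exp(u))
    logprior = -0.5 * ((u - mu) / sigma) ** 2 - np.log(sigma)
    return -(loglik + logprior)

def laplace_marginal(y, mu, sigma):
    """log p(y | mu, sigma) via the Laplace approximation: value at the
    mode plus a curvature correction, the device TMB automates for large
    latent vectors."""
    res = minimize_scalar(neg_joint, args=(y, mu, sigma))
    u_hat, h = res.x, 1e-5
    # finite-difference second derivative at the mode
    hess = (neg_joint(u_hat + h, y, mu, sigma)
            - 2 * neg_joint(u_hat, y, mu, sigma)
            + neg_joint(u_hat - h, y, mu, sigma)) / h**2
    return -res.fun + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(hess)

y = np.array([3, 5, 4])
print("Laplace log-marginal:", laplace_marginal(y, mu=1.0, sigma=0.5))
```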

  7. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday-access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  8. Stochastic differential equations as a tool to regularize the parameter estimation problem for continuous time dynamical systems given discrete time measurements.

    Science.gov (United States)

    Leander, Jacob; Lundh, Torbjörn; Jirstrand, Mats

    2014-05-01

    In this paper we consider the problem of estimating parameters in ordinary differential equations given discrete-time experimental data. The impact of going from an ordinary to a stochastic differential equation setting is investigated as a tool to overcome the problem of local minima in the objective function. Using two different models, it is demonstrated that by allowing noise in the underlying model itself, the objective functions to be minimized in the parameter estimation procedures are regularized in the sense that the number of local minima is reduced and better convergence is achieved. The advantage of using stochastic differential equations is that the actual states in the model are predicted from data, which allows the prediction to stay close to the data even when the parameters in the model are incorrect. The extended Kalman filter is used as a state estimator, and sensitivity equations are provided to give an accurate calculation of the gradient of the objective function. The method is illustrated using in silico data from the FitzHugh-Nagumo model for excitable media and the Lotka-Volterra predator-prey system. The proposed method performs well on the models considered, and is able to regularize the objective function in both models. This leads to parameter estimation problems with fewer local minima, which can be solved by efficient gradient-based methods. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
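
    The likelihood machinery is easiest to see on a linear special case, where the extended Kalman filter reduces to the exact Kalman filter: an Ornstein-Uhlenbeck process observed with noise. A hedged sketch of filter-based parameter estimation, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize

def kalman_neg_loglik(params, y, dt, obs_var=0.1):
    """Exact Kalman-filter likelihood for a discretely observed
    Ornstein-Uhlenbeck process dx = -theta*x dt + s dW, y = x + noise.
    (For the paper's nonlinear models the extended KF replaces this.)"""
    theta, s = np.exp(params)                  # keep parameters positive
    a = np.exp(-theta * dt)                    # exact OU transition
    q = s**2 / (2 * theta) * (1 - a**2)        # exact transition variance
    m, p, nll = 0.0, 1.0, 0.0
    for obs in y:
        m, p = a * m, a * a * p + q            # predict
        v, srs = obs - m, p + obs_var          # innovation and its variance
        nll += 0.5 * (np.log(2 * np.pi * srs) + v * v / srs)
        k = p / srs                            # update
        m, p = m + k * v, (1 - k) * p
    return nll

# Simulate data and recover parameters by gradient-based minimization.
rng = np.random.default_rng(4)
theta_true, s_true, dt = 1.0, 0.5, 0.1
x, ys = 0.0, []
for _ in range(500):
    a = np.exp(-theta_true * dt)
    x = a * x + rng.normal(0, np.sqrt(s_true**2 / (2 * theta_true) * (1 - a**2)))
    ys.append(x + rng.normal(0, np.sqrt(0.1)))
fit = minimize(kalman_neg_loglik, np.log([0.5, 1.0]), args=(np.array(ys), dt))
print("estimated (theta, s):", np.exp(fit.x))
```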

  9. Fluorescence In Situ Hybridization for MDM2 Amplification as a Routine Ancillary Diagnostic Tool for Suspected Well-Differentiated and Dedifferentiated Liposarcomas: Experience at a Tertiary Center

    Directory of Open Access Journals (Sweden)

    Khin Thway

    2015-01-01

    Background: The assessment of MDM2 gene amplification by fluorescence in situ hybridization (FISH) has become a routine ancillary tool for diagnosing atypical lipomatous tumor/well-differentiated liposarcoma (ALT/WDL) and dedifferentiated liposarcoma (DDL) in specialist sarcoma units. We describe our experience of its utility at our tertiary institute. Methods: All routine histology samples in which MDM2 amplification was assessed with FISH over a 2-year period were included, and FISH results were correlated with clinical and histologic findings. Results: 365 samples from 347 patients had FISH for MDM2 gene amplification. 170 were positive (i.e., showed MDM2 gene amplification), 192 were negative, and 3 were technically unsatisfactory. There were 122 histologically benign cases showing a histology:FISH concordance rate of 92.6%, 142 WDL/DDL (concordance 96.5%), and 34 cases histologically equivocal for WDL (concordance 50%). Of 64 spindle cell/pleomorphic neoplasms (in which DDL was a differential diagnosis), 21.9% showed MDM2 amplification. Of the cases with discrepant histology and FISH, all but 3 had diagnoses amended following the FISH results. For discrepancies of benign histology but positive FISH, lesions were on average larger, more frequently in "classical" (intra-abdominal or inguinal) sites for WDL/DDL, and more frequently core biopsies. Discrepancies of malignant histology but negative FISH were smaller, less frequently in "classical" sites, but again more frequently core biopsies. Conclusions: FISH has a high correlation rate with histology for cases with firm histologic diagnoses of lipoma or WDL/DDL. It is a useful ancillary diagnostic tool in histologically equivocal cases, particularly in WDL lacking significant histologic atypia or DDL without a corresponding WDL component, especially in larger tumors, those from intra-abdominal or inguinal sites, or core biopsies. There is a significant group of well-differentiated adipocytic neoplasms...

  10. Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization

    Science.gov (United States)

    Green, Lawrence; Carle, Alan; Fagan, Mike

    1999-01-01

    Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem, based upon the current objective function, constraints, and gradient values. The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop
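
    The payoff of the reverse (adjoint) mode described above is that one gradient evaluation costs a small constant multiple of one function evaluation, independent of the number of design variables. A hedged modern illustration using JAX with a stand-in objective; the real objective is a CFD-computed lift-to-drag ratio, not this toy function.

```python
import jax
import jax.numpy as jnp

def lift_to_drag(shape_params):
    """Stand-in objective: any smooth scalar function of many design
    variables; the real case is a CFD-computed lift-to-drag ratio."""
    lift = jnp.sum(jnp.sin(shape_params))
    drag = 1.0 + jnp.sum(shape_params ** 2)
    return lift / drag

x = jnp.linspace(0.0, 1.0, 1000)        # a thousand design variables
grad_fn = jax.grad(lift_to_drag)        # reverse-mode (adjoint) differentiation

# One gradient evaluation yields all 1000 sensitivities at a cost that is a
# small constant multiple of one function evaluation -- the property that
# made the adjoint CFL3D code attractive for shape optimization.
g = grad_fn(x)
print(g.shape, float(lift_to_drag(x)))
```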

  11. Collimated-hole structures as efficient differential pumping barrier, one-way valve and tool for aligning Penning traps

    International Nuclear Information System (INIS)

    Kluge, H.-Jürgen; Block, Michael; Herfurth, Frank

    2011-01-01

    A collimated-hole structure consists of a very large number of parallel channels, each with a very small diameter, closely packed together. Such devices, installed in vacuum systems, allow one to separate regions of very different gas pressures. A collimated-hole structure has high transmission for a directed ion beam with low emittance but a very low conductance for residual gas atoms or molecules undergoing a random walk. It is therefore proposed to use such a structure as a one-way valve and/or an efficient differential pumping barrier in set-ups using Penning traps. Furthermore, these devices might be very useful for aligning the axis of a Penning trap with the direction of the magnetic field lines, which is essential to avoid systematic uncertainties in high-accuracy mass spectrometry.

  12. arXiv A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    CERN Document Server

    Kieseler, Jan

    2017-11-22

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters representing sources of systematic uncertainty. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, this information is publicly available only in rare cases. In the absence of this information, the most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution for this problem. It relies only on the public result and its covariance or Hessian, and is validated against the combined-likelihood approach. A d...
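
    When only the published central values and their covariance are available, the covariance-weighted (BLUE) combination is the building block such a method rests on. A hedged sketch with invented numbers; the article's method additionally reconstructs the constrained nuisance-parameter correlations rather than stopping at plain BLUE.

```python
import numpy as np

def combine(measurements, cov):
    """Best linear unbiased combination of measurements of one quantity,
    given their full covariance (the public covariance/Hessian is exactly
    the input the method described above requires)."""
    m = np.asarray(measurements, dtype=float)
    w = np.linalg.solve(cov, np.ones_like(m))
    w /= w.sum()                               # BLUE weights
    return w @ m, np.sqrt(w @ cov @ w)

# Two hypothetical measurements whose systematic uncertainties are
# correlated because both fitted the same nuisance parameters.
vals = [172.4, 173.1]
cov = np.array([[0.49, 0.30],
                [0.30, 0.64]])
value, err = combine(vals, cov)
print(f"combined: {value:.2f} +/- {err:.2f}")
```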

  13. The Monte Carlo method as a tool for statistical characterisation of differential and additive phase shifting algorithms

    International Nuclear Information System (INIS)

    Miranda, M; Dorrio, B V; Blanco, J; Diz-Bugarin, J; Ribas, F

    2011-01-01

    Several metrological applications base their measurement principle on the phase sum or difference between two patterns, one original s(r,φ) and another modified t(r,φ+Δφ). Additive or differential phase-shifting algorithms directly recover the sum 2φ+Δφ or the difference Δφ of the phases without requiring prior calculation of the individual phases. These algorithms can be constructed, for example, from a suitable combination of known phase-shifting algorithms. Little has been written on the design, analysis and error compensation of these new two-stage algorithms. Previously we have used computer simulation to study, in a linear approach or with a filter process in reciprocal space, the response of several families of them to the main error sources. In this work we present an error analysis that uses Monte Carlo simulation to achieve results in good agreement with those obtained with spatial and temporal methods.
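
    The Monte Carlo machinery is simple to reproduce on the basic building block of such algorithms: perturb the recorded intensity patterns with noise, run the phase-recovery formula, and collect the empirical error statistics. A hedged sketch using the classic four-frame algorithm rather than the paper's two-stage combinations; all numeric values are illustrative.

```python
import numpy as np

def four_step_phase(intensities):
    """Classic 4-frame phase-shifting algorithm (shifts 0, 90, 180, 270 deg):
    I_k = A + B*cos(phi + delta_k), so tan(phi) = (I4 - I2) / (I1 - I3)."""
    i1, i2, i3, i4 = intensities
    return np.arctan2(i4 - i2, i1 - i3)

def monte_carlo_phase_error(phi_true, noise_sd, trials=10000, seed=5):
    """Propagate additive intensity noise through the algorithm and return
    the empirical bias and spread of the recovered phase."""
    rng = np.random.default_rng(seed)
    shifts = np.array([0.0, 0.5, 1.0, 1.5]) * np.pi
    errs = np.empty(trials)
    for t in range(trials):
        frames = 1.0 + 0.8 * np.cos(phi_true + shifts)
        frames += rng.normal(0.0, noise_sd, 4)
        # wrap the phase error into (-pi, pi]
        errs[t] = np.angle(np.exp(1j * (four_step_phase(frames) - phi_true)))
    return errs.mean(), errs.std()

bias, sd = monte_carlo_phase_error(phi_true=0.7, noise_sd=0.02)
print(f"phase bias {bias:+.4f} rad, spread {sd:.4f} rad")
```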

  14. Differentiating Delirium From Sedative/Hypnotic-Related Iatrogenic Withdrawal Syndrome: Lack of Specificity in Pediatric Critical Care Assessment Tools.

    Science.gov (United States)

    Madden, Kate; Burns, Michele M; Tasker, Robert C

    2017-06-01

    To identify available assessment tools for sedative/hypnotic iatrogenic withdrawal syndrome and delirium in PICU patients, the evidence supporting their use, and areas of overlap between the components of these tools and the symptoms of anticholinergic burden in children. Studies were identified using PubMed and EMBASE from the earliest available date until July 3, 2016, using a combination of the MeSH terms "delirium" and "substance withdrawal syndrome" and the key words "opioids," "benzodiazepines," "critical illness," "ICU," and "intensive care." Review article references were also searched. Human studies reporting assessment of delirium or iatrogenic withdrawal syndrome in children 0-18 years of age undergoing critical care were included; non-English language, exclusively adult, and neonatal intensive care studies were excluded. References were cataloged by study type, population, and screening process. Iatrogenic withdrawal syndrome and delirium are both prevalent in the PICU population. Commonly used scales for delirium and iatrogenic withdrawal syndrome assess signs and symptoms in the motor, behavior, and state domains and exhibit considerable overlap. In addition, signs and symptoms of an anticholinergic toxidrome (a risk associated with some common PICU medications) overlap with components of these scales, specifically in the motor, cardiovascular, and psychiatric domains. Although important studies have demonstrated an apparently high prevalence of iatrogenic withdrawal syndrome and delirium in the PICU population, the overlap in these scoring systems presents potential difficulty in distinguishing the syndromes, both clinically and for research purposes.

  15. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility compared with partially automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose to the test personnel. (orig.) [de]

  16. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

    Since the 1970s, advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and are still the most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-to-peak amplitude and duration. Wavelet parameters are well suited to AESD but need to be combined with other parameters to achieve the desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.
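
    A hedged sketch of the oldest approach mentioned above, an expert-parameter detector thresholding amplitude and duration; all values are illustrative, and a clinical detector would combine many more of the listed parameters.

```python
import numpy as np

def detect_spikes(signal, fs, amp_uv=80.0, dur_ms=(20, 70)):
    """Parameter-threshold spike detector of the classic kind: flag runs of
    samples whose deviation exceeds half the peak-to-peak amplitude threshold
    and whose duration falls within the spike range."""
    above = np.abs(signal - np.median(signal)) > amp_uv / 2
    events, start = [], None
    for i, a in enumerate(np.append(above, False)):
        if a and start is None:
            start = i
        elif not a and start is not None:
            dur = (i - start) / fs * 1000.0
            if dur_ms[0] <= dur <= dur_ms[1]:
                events.append((start, i))
            start = None
    return events

# Synthetic trace: background noise plus one ~40 ms, 120 uV transient.
fs = 500
t = np.arange(0, 2.0, 1 / fs)
eeg = np.random.default_rng(6).normal(0, 10, t.size)
spike = (t > 1.0) & (t < 1.04)
eeg[spike] += 120 * np.sin(np.linspace(0, np.pi, spike.sum()))
print("detected spike sample ranges:", detect_spikes(eeg, fs))
```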

  17. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre

    1975-11-01

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the "linguistic filter", which facilitates the construction of statistical algorithms. Certain special filters, of prepositions, conjunctions, negatives, logical implication, compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed. [fr]

  18. Culturing of respiratory viruses in well-differentiated pseudostratified human airway epithelium as a tool to detect unknown viruses

    Science.gov (United States)

    Jazaeri Farsani, Seyed Mohammad; Deijs, Martin; Dijkman, Ronald; Molenkamp, Richard; Jeeninga, Rienk E; Ieven, Margareta; Goossens, Herman; van der Hoek, Lia

    2015-01-01

    Background: Currently, virus discovery is mainly based on molecular techniques. Here, we propose a method that relies on virus culturing combined with state-of-the-art sequencing techniques. The most natural ex vivo culture system was used to enable replication of respiratory viruses. Method: Three respiratory clinical samples were tested on well-differentiated pseudostratified tracheobronchial human airway epithelial (HAE) cultures grown at an air–liquid interface, which resemble the airway epithelium. Cells were stained with convalescent serum of the patients to identify infected cells, and apical washes were analyzed by VIDISCA-454, a next-generation sequencing virus discovery technique. Results: Infected cells were observed for all three samples. Sequencing subsequently indicated that the cells were infected by either human coronavirus OC43, influenzavirus B, or influenzavirus A. The sequence reads covered a large part of the genome (52%, 82%, and 57%, respectively). Conclusion: We present here a new method for virus discovery that requires a virus culture on primary cells and an antibody detection. The virus in the harvest can be used to characterize the viral genome sequence and cell tropism, but also provides progeny virus to initiate experiments to fulfill Koch's postulates. PMID:25482367

  19. Value of sagittal color Doppler ultrasonography as a supplementary tool in the differential diagnosis of fetal cleft lip and palate

    International Nuclear Information System (INIS)

    Lee, Myoung Seok; Cho, Jeong Yeon; Kim, Sang Youn; Kim, Seung Hyup; Park, Joong Shin; Jun, Jong Kwan

    2017-01-01

    The purpose of this study was to evaluate the feasibility and usefulness of sagittal color Doppler ultrasonography (CDUS) for the diagnosis of fetal cleft lip (CL) and cleft palate (CP). We performed targeted ultrasonography on 25 fetuses with CL and CP, taking coronal and axial images of the upper lip and maxillary alveolar arch in each case. The existence of defects in and malalignment of the alveolus on the axial image, hard palate defects on the midsagittal image, and flow-through defects on CDUS taken during fetal breathing or swallowing were assessed. We compared the ultrasonography findings with postnatal findings in all fetuses. Alveolar defects were detected in 16 out of 17 cases with CP and four out of eight cases with CL. Alveolar malalignment and hard palate defects were detected in 11 out of 17 cases and 14 out of 17 cases with CP, respectively, but were not detected in any cases with CL. Communicating flow through the palate defect was detected in 11 out of 17 cases of CL with CP. The accuracy of detection in axial scans of alveolar defects and malalignment was 80% and 76%, respectively. The accuracy of detection in midsagittal images of hard palate defects and flow was 80% and 86%, respectively. The overall diagnostic accuracy of combined axial and sagittal images with sagittal CDUS was 92%. Sagittal CDUS of the fetal hard palate is a feasible method to directly reveal hard palate bony defects and flow-through defects, which may have additional value in the differential diagnosis of fetal CL and CP.

  20. Value of sagittal color Doppler ultrasonography as a supplementary tool in the differential diagnosis of fetal cleft lip and palate

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Seok [Dept. of Radiology, Seoul Metropolitan Government-Seoul National University Boramae Medical Center, Seoul (Korea, Republic of); Cho, Jeong Yeon; Kim, Sang Youn; Kim, Seung Hyup [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of); Park, Joong Shin; Jun, Jong Kwan [College of Medicine, Seoul National University, Seoul (Korea, Republic of)

    2017-01-15

    The purpose of this study was to evaluate the feasibility and usefulness of sagittal color Doppler ultrasonography (CDUS) for the diagnosis of fetal cleft lip (CL) and cleft palate (CP). We performed targeted ultrasonography on 25 fetuses with CL and CP, taking coronal and axial images of the upper lip and maxillary alveolar arch in each case. The existence of defects in and malalignment of the alveolus on the axial image, hard palate defects on the midsagittal image, and flow-through defects on CDUS taken during fetal breathing or swallowing were assessed. We compared the ultrasonography findings with postnatal findings in all fetuses. Alveolar defects were detected in 16 out of 17 cases with CP and four out of eight cases with CL. Alveolar malalignment and hard palate defects were detected in 11 out of 17 cases and 14 out of 17 cases with CP, respectively, but were not detected in any cases with CL. Communicating flow through the palate defect was detected in 11 out of 17 cases of CL with CP. The accuracy of detection in axial scans of alveolar defects and malalignment was 80% and 76%, respectively. The accuracy of detection in midsagittal images of hard palate defects and flow was 80% and 86%, respectively. The overall diagnostic accuracy of combined axial and sagittal images with sagittal CDUS was 92%. Sagittal CDUS of the fetal hard palate is a feasible method to directly reveal hard palate bony defects and flow-through defects, which may have additional value in the differential diagnosis of fetal CL and CP.

  1. The differentiation of fibre- and drug type Cannabis seedlings by gas chromatography/mass spectrometry and chemometric tools.

    Science.gov (United States)

    Broséus, Julian; Anglada, Frédéric; Esseiva, Pierre

    2010-07-15

    Cannabis cultivation for drug production is forbidden in Switzerland. Law enforcement authorities therefore regularly ask forensic laboratories to determine the chemotype of cannabis plants from seized material in order to ascertain whether a plantation is legal. As required by the EU official analysis protocol, the THC content of cannabis is measured from the flowers at maturity. When laboratories are confronted with seedlings, they have to grow the plants to maturity, a time-consuming and costly procedure. This study investigated the discrimination of fibre-type from drug-type Cannabis seedlings by analysing the compounds found in their leaves and using chemometric tools. Eleven legal varieties allowed by the Swiss Federal Office for Agriculture and 13 illegal ones were greenhouse-grown and analysed using a gas chromatograph interfaced with a mass spectrometer. Compounds that show high discrimination capabilities in the seedlings were identified, and a support vector machines (SVM) analysis was used to classify the cannabis samples. The overall set of samples shows a classification rate above 99% with false positive rates below 2%. This model thus allows discrimination between fibre- and drug-type Cannabis at an early stage of growth. It is therefore not necessary to wait for the plants to mature and quantify their THC content in order to determine their chemotype. This procedure could be used for the control of legal (fibre-type) and illegal (drug-type) Cannabis production. (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Stable isotopes as a tool to differentiate eggs laid by caged, barn, free range, and organic hens.

    Science.gov (United States)

    Rogers, Karyne M

    2009-05-27

    Stable carbon and nitrogen isotope values of whole yolk, delipidized yolk, albumen, and egg membrane were analyzed from 18 different brands of chicken eggs laid under caged, barn, free range, and organic farming regimes. In general, free range and organic egg components showed enrichment of ¹⁵N values up to 4‰ relative to caged and barn laid eggs, suggesting a higher animal protein (trophic) contribution to the chicken's diet than pure plant-based foods and/or that the feed was organically manufactured. One sample of free range and two samples of organic eggs had δ¹⁵N values within the range of caged or barn laid eggs, suggesting either that these eggs were mislabeled (the hens were raised under "battery" or "barn" conditions, and not permitted to forage outside) or that there was insufficient animal protein gained by foraging to shift the δ¹⁵N values of their primary food source. δ¹³C values of potential food sources are discussed with respect to dietary intake and contribution to the isotopic signature of the eggs to determine mixing of C₃ and C₄ diets, although they did not elucidate laying regimen. The study finds that stable nitrogen isotope analysis of egg components is potentially a useful technique to unravel dietary differences between caged or barn hens and free range hens (both conventional and organic) and could be further developed as an authentication tool in the egg industry.

  3. Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models

    Science.gov (United States)

    Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.

    2012-04-01

    The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies, each of which suffers from its own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code, and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise this separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. The discretisation of the continuous adjoint is not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high-level language UFL (Unified Form Language) and to automatically generate the model using the software of the FEniCS project. In this approach it is the high-level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However, since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy, is managed by libadjoint [1]. In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation
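    As a generic illustration of the adjoint principle (JAX on a toy decay model; this is not FEniCS/libadjoint code), reverse-mode AD delivers the gradient of a scalar functional of the final state in a single backward sweep, however many forward time steps were taken:

```python
# Minimal sketch of reverse-mode (adjoint) differentiation of a
# time-stepping model; the toy decay equation is an assumption here.
import jax
import jax.numpy as jnp

def run_model(kappa, u0, n_steps=100, dt=0.01):
    """Forward model: explicit time stepping of du/dt = -kappa * u."""
    def step(u, _):
        return u - dt * kappa * u, None
    u_final, _ = jax.lax.scan(step, u0, None, length=n_steps)
    return u_final

def objective(kappa, u0):
    """Scalar functional of the final state (a toy 'energy')."""
    return jnp.sum(run_model(kappa, u0) ** 2)

u0 = jnp.ones(10)
# jax.grad replays the computation backwards: one adjoint sweep yields
# dJ/dkappa regardless of the number of forward time steps.
dJ_dkappa = jax.grad(objective, argnums=0)(0.5, u0)
print(dJ_dkappa)
```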

  4. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  5. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  6. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

    The GRACE system is an excellent tool for calculating cross sections and for generating events of elementary processes automatically. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed so as to be a user-friendly system. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by e+e- interactions are going to be calculated and summarized in a catalogue. (author)

  7. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice, processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  8. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, the state space, state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  9. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e., automatically controlling the virtual...

  10. Thai Automatic Speech Recognition

    National Research Council Canada - National Science Library

    Suebvisai, Sinaporn; Charoenpornsawat, Paisarn; Black, Alan; Woszczyna, Monika; Schultz, Tanja

    2005-01-01

    .... We focus on the discussion of the rapid deployment of ASR for Thai under limited time and data resources, including rapid data collection issues, acoustic model bootstrap, and automatic generation of pronunciations...

  11. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  12. Automatic Test Systems Acquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  13. Development of a versatile tool for the simultaneous differential detection of Pseudomonas savastanoi pathovars by End Point and Real-Time PCR.

    Science.gov (United States)

    Tegli, Stefania; Cerboneschi, Matteo; Libelli, Ilaria Marsili; Santilli, Elena

    2010-05-28

    Pseudomonas savastanoi pv. savastanoi is the causal agent of olive knot disease. The strains isolated from oleander and ash belong to the pathovars nerii and fraxini, respectively. When artificially inoculated, pv. savastanoi causes disease also on ash, and pv. nerii attacks also olive and ash. Surprisingly, nothing is known yet about their distribution in nature on these hosts or whether spontaneous cross-infections occur. On the other hand, sanitary certification programs for olive plants, also including P. savastanoi, have been launched in many countries. The aim of this work was to develop several PCR-based tools for the rapid, simultaneous, differential and quantitative detection of these P. savastanoi pathovars, in multiplex and in planta. Specific PCR primers and probes for the pathovars savastanoi, nerii and fraxini of P. savastanoi were designed to be used in End Point and Real-Time PCR, both with SYBR Green and TaqMan chemistries. The specificity of all these assays was 100%, as assessed by testing forty-four P. savastanoi strains belonging to the three pathovars and having different geographical origins. For comparison, strains from the pathovars phaseolicola and glycinea of P. savastanoi and bacterial epiphytes from P. savastanoi host plants were also assayed, and all of them always tested negative. The analytical detection limits were about 5-0.5 pg of pure genomic DNA and about 10² genome equivalents per reaction. Similar analytical thresholds were achieved in Multiplex Real-Time PCR experiments, even on artificially inoculated olive plants. Here, for the first time, a set of PCR-based assays was developed for the simultaneous discrimination and detection of P. savastanoi pv. savastanoi, pv. nerii and pv. fraxini. These tests were shown to be highly reliable, pathovar-specific, sensitive, rapid and able to quantify these pathogens, both in multiplex reactions and in vivo. Compared with the other methods already available for P. savastanoi, the identification

  14. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  15. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutches and brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutches and brakes, air cylinders, cams and solenoids, stop-position control of automatic guided vehicles, stacker cranes, and automatic transfer control.

  16. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving—namely, make the driver feel like they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but it might also reduce the workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  17. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads

    2011-01-01

    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical...... is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  18. Development of automatic laser welding system

    International Nuclear Information System (INIS)

    Ohwaki, Katsura

    2002-01-01

    Lasers are a new production tool for high-speed, low-distortion welding, and their applications to automatic welding lines are increasing. IHI has long experience of laser processing for the preservation of nuclear power plants, the welding of airplane engines, and so on. Moreover, YAG laser oscillators and various kinds of hardware have been developed for laser welding and automation. Combining these welding technologies and laser hardware technologies produces the automatic laser welding system. In this paper, the component technologies are described, including combined optics intended to improve welding stability, laser oscillators, a monitoring system, a seam tracking system, and so on. (author)

  19. Automatic Synthesis of Robust and Optimal Controllers

    DEFF Research Database (Denmark)

    Cassez, Franck; Jessen, Jan Jacob; Larsen, Kim Guldstrand

    2009-01-01

    In this paper, we show how to apply recent tools for the automatic synthesis of robust and near-optimal controllers for a real industrial case study. We show how to use three different classes of models and their supporting existing tools, Uppaal-TiGA for synthesis, phaver for verification......, and Simulink for simulation, in a complementary way. We believe that this case study shows that our tools have reached a level of maturity that allows us to tackle interesting and relevant industrial control problems....

  20. Solving Linear Differential Equations

    NARCIS (Netherlands)

    Nguyen, K.A.; Put, M. van der

    2010-01-01

    The theme of this paper is to 'solve' an absolutely irreducible differential module explicitly in terms of modules of lower dimension and finite extensions of the differential field K. Representations of semi-simple Lie algebras and differential Galois theory are the main tools. The results extend

  1. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television optical type automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light-sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.

  2. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI, a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  3. Magnetic Resonance Imaging and conformal radiotherapy: Characterization of MRI alone simulation for conformal radiotherapy. Development and evaluation of an automatic volumes of interest segmentation tool for prostate cancer radiotherapy

    International Nuclear Information System (INIS)

    Pasquier, David

    2006-01-01

    Radiotherapy is a curative treatment of malignant tumours. Radiotherapy techniques have evolved considerably in recent years with the increasing integration of medical images into conformal radiotherapy. This technique makes it possible to elaborate a complex ballistics conforming to the target volume while sparing healthy tissues. The examination currently used to delineate volumes of interest is Computed Tomography (CT), on account of its geometrical precision and the information it provides on the electronic densities needed for dose calculation. Magnetic Resonance Imaging (MRI) ensures a more precise delineation of target volumes in many locations, such as the pelvis and brain. For pelvic tumours, the use of MRI requires image registration, which complicates treatment planning and poses the problem of the lack of a standard in vivo method of validation. The obstacles to the use of MRI alone in treatment planning were evaluated. Neither geometrical distortion linked to the system and the patient nor the lack of information on electronic densities represents a stumbling block. Distortion remained low, even at the edge of a large field of view, on modern machines. The assignment of electronic densities to bone structures and soft tissues in MR images made it possible to obtain dosimetry equivalent to that carried out on the original CT, with good reproducibility and a homogeneous distribution within the target volume. The assignment of electronic densities could not be carried out using 20 MV photons and suitable ballistics. The development of Image-Guided Radiotherapy could facilitate the use of MRI alone in treatment planning. Target volume and organ-at-risk delineation is a time-consuming task in radiotherapy planning. We took part in the development of, and evaluated, a method of automatic and semi-automatic delineation of volumes of interest from MR images for prostate cancer radiotherapy. For prostate and organ-at-risk automatic delineation, an organ model-based method and a seeded region growing method
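    The seeded region growing mentioned above lends itself to a compact illustration. The following minimal sketch (not the thesis implementation; the tolerance and test image are hypothetical) grows a 2D region from a seed, accepting 4-connected neighbours whose intensity stays close to the running region mean:

```python
# Minimal sketch of 2D seeded region growing (hypothetical parameters).
import numpy as np
from collections import deque

def region_grow(image, seed, tol=10.0):
    """Grow a region from `seed`, adding 4-connected neighbours whose
    intensity stays within `tol` of the running region mean."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    total, count = float(image[seed]), 1
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(image[ny, nx]) - total / count) <= tol:
                    mask[ny, nx] = True
                    total += float(image[ny, nx])
                    count += 1
                    queue.append((ny, nx))
    return mask

img = np.random.default_rng(0).normal(100, 2, (64, 64))  # toy image
print(region_grow(img, (32, 32)).sum())
```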

  4. ACIR: automatic cochlea image registration

    Science.gov (United States)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

    Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. To obtain these measurements, a segmentation method for cochlea medical images is needed. An important pre-processing step for good cochlea segmentation is efficient image registration. The cochlea's small size and complex structure, in addition to the different resolutions and head positions during imaging, pose a major challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multi-modal human cochlea images is proposed. This method is based on using small areas that have clear structures from both input images instead of registering the complete image. It uses the Adaptive Stochastic Gradient Descent Optimizer (ASGD) and Mattes's Mutual Information metric (MMI) to estimate 3D rigid transform parameters. State-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to put all the modalities in the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset which can be downloaded for free from a public XNAT server.
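    A minimal sketch of a comparable registration setup, assuming SimpleITK rather than the elastix-based ACIR code (SimpleITK's gradient-descent optimizer stands in for ASGD, and the file names are hypothetical):

```python
# Minimal sketch: rigid multi-modal registration with Mattes mutual
# information in SimpleITK (not the ACIR/elastix implementation).
import SimpleITK as sitk

fixed = sitk.ReadImage("cochlea_ct.nii.gz", sitk.sitkFloat32)    # hypothetical
moving = sitk.ReadImage("cochlea_mri.nii.gz", sitk.sitkFloat32)  # hypothetical

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY))
reg.SetInterpolator(sitk.sitkLinear)

rigid = reg.Execute(fixed, moving)  # 6-parameter 3D rigid transform
aligned = sitk.Resample(moving, fixed, rigid, sitk.sitkLinear, 0.0)
```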

  5. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his...... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers...... by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  6. Usage of aids monitoring in automatic braking systems of modern cars

    OpenAIRE

    Dembitskyi V.; Mazylyuk P.; Sitovskyi O.

    2016-01-01

    Safety can be increased by installing on vehicles automatic braking systems that monitor the traffic situation and the actions of the driver. In this paper, the advantages and disadvantages of automatic braking systems are considered, and the modern tracking tools used in automatic braking systems are analyzed. Based on statistical accident data, the main hazards that automatic braking systems can reduce are identified. In order to ensure the acc...

  7. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.

    1993-01-01

    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software oriented workshop SWORD (which stands for 'Software Workshop Oriented towards Research and Development') designed in the ADA language including integrated CAD system and software tools for automatic generation of simulation software and man-machine interface in order to operate run-time simulation; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configuration generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  8. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS), and surveys the recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been develop

  9. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  10. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k₀ and Q₀ factors, and the f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)
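    For context, a saturation activity is the activity approached asymptotically during long irradiation; in textbook form (general NAA background, not taken from this record):

```latex
A(t_{\mathrm{irr}}) \;=\; A_{\mathrm{sat}}\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr),
\qquad
A_{\mathrm{sat}} \;=\; \frac{N_A\,\theta\,\sigma_{\mathrm{eff}}\,\varphi\,m}{M},
```

    where λ is the decay constant of the product nuclide, N_A Avogadro's number, θ the isotopic abundance, σ_eff the effective activation cross section, φ the neutron flux, m the element mass, and M its molar mass. Tabulating A_sat per unit mass for a fixed position and geometry is what allows concentrations to be read off from measured activities.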

  11. Cliff: the automatized zipper

    NARCIS (Netherlands)

    Baharom, M.Z.; Toeters, M.J.; Delbressine, F.L.M.; Bangaru, C.; Feijs, L.M.G.

    2016-01-01

    It is our strong belief that fashion - more specifically apparel - can support us so much more in our daily life than it currently does. The Cliff project takes the opportunity to create a generic automatized zipper. It is a response to the struggle of the elderly, people with physical disabilities, and

  12. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...
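    As a hedged illustration of what such a derived time bound looks like (an example of ours, not one from the paper), consider naive list reversal, where reversing n elements performs an append of cost proportional to n:

```latex
T_{\mathrm{rev}}(0) = c_0, \qquad
T_{\mathrm{rev}}(n) = T_{\mathrm{rev}}(n-1) + c_1 n
\;\;\Longrightarrow\;\;
T_{\mathrm{rev}}(n) = c_0 + c_1\,\frac{n(n+1)}{2} \in O(n^2).
```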

  13. Automatic Oscillating Turret.

    Science.gov (United States)

    1981-03-01

    Final Report, February 1978: Automatic Oscillating Turret System (September 1980). Appendix: Oscillating Bumper Turret — Description: Turret Controls. Other criteria requirements were: 1. Turret controls inside the cab. 2. Automatic oscillation with fixed elevation ranging from 20° below the horizontal to

  14. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  15. Automatic sweep circuit

    International Nuclear Information System (INIS)

    Keefe, D.J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input is described. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found, then a different time window is examined until the signal is found.

  16. Automatic sweep circuit

    Science.gov (United States)

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.

  17. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  18. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which was successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  19. Toward a non-invasive screening tool for differentiation of pancreatic lesions based on intra-voxel incoherent motion derived parameters

    Energy Technology Data Exchange (ETDEWEB)

    Graf, Markus; Simon, Dirk; Mang, Sarah [Deutsches Krebsforschungszentrum (DKFZ), Heidelberg (Germany). Software Development for Integrated Therapy and Diagnostics; Lemke, Andreas [Heidelberg Univ., Mannheim (Germany). Dept. of Computer Assisted Clinical Medicine; Gruenberg, Katharina [Deutsches Krebsforschungszentrum (DKFZ), Heidelberg (Germany). Dept. of Radiology

    2013-03-01

    Early recognition of and differential diagnosis between pancreatic cancer and chronic pancreatitis are important steps in successful therapy. Parameters of the IVIM (intra-voxel incoherent motion) theory can be used to differentiate between these lesions. The objective of this work is to evaluate the effects of rigid image registration on IVIM-derived parameters for the differentiation of pancreatic lesions such as pancreatic cancer and solid mass-forming pancreatitis. The effects of linear image registration methods on the reproducibility and accuracy of IVIM-derived parameters were quantified on MR images of ten volunteers. For this purpose, they were evaluated statistically by comparison of registered and unregistered parameter data. Further, the perfusion fraction f was used to differentiate pancreatic lesions on eleven previously diagnosed patient data sets. Its diagnostic power with and without rigid registration was evaluated using receiver operating characteristic (ROC) analysis. The pancreas was segmented manually on MR data sets of healthy volunteers as well as of the patients showing solid pancreatic lesions. Diffusion-weighted imaging was performed in 10 blocks of breath-hold phases. Linear registration of the weighted image stack leads to a 3.7% decrease in the variability of the IVIM-derived parameter f due to an improved anatomical overlap of 5%. Consequently, after registration the area under the curve in the ROC analysis for the differentiation approach increased by 2.7%. In conclusion, rigid registration improves the differentiation process based on f-values. (orig.)

  20. Toward a non-invasive screening tool for differentiation of pancreatic lesions based on intra-voxel incoherent motion derived parameters

    International Nuclear Information System (INIS)

    Graf, Markus; Simon, Dirk; Mang, Sarah; Lemke, Andreas; Gruenberg, Katharina

    2013-01-01

    Early recognition of and differential diagnosis between pancreatic cancer and chronic pancreatitis are important steps in successful therapy. Parameters of the IVIM (intra-voxel incoherent motion) theory can be used to differentiate between these lesions. The objective of this work is to evaluate the effects of rigid image registration on IVIM-derived parameters for the differentiation of pancreatic lesions such as pancreatic cancer and solid mass-forming pancreatitis. The effects of linear image registration methods on the reproducibility and accuracy of IVIM-derived parameters were quantified on MR images of ten volunteers. For this purpose, they were evaluated statistically by comparison of registered and unregistered parameter data. Further, the perfusion fraction f was used to differentiate pancreatic lesions on eleven previously diagnosed patient data sets. Its diagnostic power with and without rigid registration was evaluated using receiver operating characteristic (ROC) analysis. The pancreas was segmented manually on MR data sets of healthy volunteers as well as of the patients showing solid pancreatic lesions. Diffusion-weighted imaging was performed in 10 blocks of breath-hold phases. Linear registration of the weighted image stack leads to a 3.7% decrease in the variability of the IVIM-derived parameter f due to an improved anatomical overlap of 5%. Consequently, after registration the area under the curve in the ROC analysis for the differentiation approach increased by 2.7%. In conclusion, rigid registration improves the differentiation process based on f-values. (orig.)

  1. ADC as a useful diagnostic tool for differentiating benign and malignant vertebral bone marrow lesions and compression fractures: a systematic review and meta-analysis.

    Science.gov (United States)

    Suh, Chong Hyun; Yun, Seong Jong; Jin, Wook; Lee, Sun Hwa; Park, So Young; Ryu, Chang-Woo

    2018-07-01

    To assess the sensitivity and specificity of quantitative assessment of the apparent diffusion coefficient (ADC) for differentiating benign and malignant vertebral bone marrow lesions (BMLs) and compression fractures (CFs). METHODS: An electronic literature search of MEDLINE and EMBASE was conducted. Bivariate modelling and hierarchical summary receiver operating characteristic modelling were performed to evaluate the diagnostic performance of ADC for differentiating vertebral BMLs. Subgroup analysis was performed for differentiating benign and malignant vertebral CFs. Meta-regression analyses according to subject, study and diffusion-weighted imaging (DWI) characteristics were performed. Twelve eligible studies (748 lesions, 661 patients) were included. The ADC exhibited a pooled sensitivity of 0.89 (95% confidence interval [CI] 0.80-0.94) and a pooled specificity of 0.87 (95% CI 0.78-0.93) for differentiating benign and malignant vertebral BMLs. In addition, the pooled sensitivity and specificity for differentiating benign and malignant CFs were 0.92 (95% CI 0.82-0.97) and 0.91 (95% CI 0.87-0.94), respectively. In the meta-regression analysis, the DWI slice thickness was a significant factor affecting heterogeneity. Quantitative assessment of ADC is thus useful for differentiating benign and malignant vertebral BMLs and CFs. • Quantitative assessment of ADC is useful in differentiating vertebral BMLs. • Quantitative ADC assessment for BMLs had sensitivity of 89%, specificity of 87%. • Quantitative ADC assessment for CFs had sensitivity of 92%, specificity of 91%. • The specificity is highest (95%) with thinner (< 5 mm) DWI slice thickness.
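    To make the pooled statistics concrete, here is a minimal sketch (made-up counts, not data from the meta-analysis) of the per-study sensitivity and specificity that such a bivariate model pools:

```python
# Minimal sketch: per-study sensitivity/specificity of an ADC cutoff
# (hypothetical 2x2 counts, not taken from the meta-analysis).
tp, fn = 46, 4   # malignant lesions correctly flagged / missed
tn, fp = 52, 8   # benign lesions correctly cleared / false alarms

sensitivity = tp / (tp + fn)   # 0.92, cf. the pooled 92% for CFs above
specificity = tn / (tn + fp)   # ~0.87
print(f"sens={sensitivity:.2f}, spec={specificity:.2f}")
```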

  2. Automatic Planning of External Search Engine Optimization

    Directory of Open Access Journals (Sweden)

    Vita Jasevičiūtė

    2015-07-01

    Full Text Available This paper describes an investigation of an external search engine optimization (SEO) action planning tool, dedicated to automatically extracting a small set of the most important keywords for each month over a whole-year period. The keywords in the set are extracted according to externally measured parameters, such as the average number of searches during the year and for every month individually. Additionally, the position of the optimized web site for each keyword is taken into account. The generated optimization plan is similar to the optimization plans prepared manually by SEO professionals and can be successfully used as a support tool for web site search engine optimization.
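    A minimal sketch of the planning idea, with hypothetical data and an assumed scoring rule (the paper does not publish its algorithm):

```python
# Minimal sketch: pick the k most promising keywords per month, weighting
# monthly search volume by how far the site currently ranks (assumed rule).
monthly_searches = {                       # keyword -> searches per month
    "buy shoes":   {"jan": 900, "feb": 700},
    "cheap boots": {"jan": 400, "feb": 1200},
    "sandals":     {"jan": 100, "feb": 300},
}
site_position = {"buy shoes": 2, "cheap boots": 35, "sandals": 15}

def plan(month, k=2):
    def score(kw):
        # keywords already ranking near the top need less optimization
        return monthly_searches[kw][month] * min(site_position[kw], 30) / 30
    return sorted(monthly_searches, key=score, reverse=True)[:k]

print(plan("feb"))  # -> ['cheap boots', 'sandals'] (toy data)
```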

  3. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    Directory of Open Access Journals (Sweden)

    Raymond Mui

    2010-09-01

    Full Text Available Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation is used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.
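    ASSIST itself targets Java; the following Python/sqlite3 sketch merely illustrates the before/after of the rewrite such a tool automates, replacing string-built SQL with a parameterized query:

```python
# Minimal sketch of the sanitization target (illustrative, not ASSIST).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")

def login_vulnerable(name, pw):
    # attacker-controlled input is spliced into the SQL text
    q = f"SELECT * FROM users WHERE name = '{name}' AND pw = '{pw}'"
    return conn.execute(q).fetchall()

def login_sanitized(name, pw):
    # placeholders keep user input in the data channel, not the code channel
    q = "SELECT * FROM users WHERE name = ? AND pw = ?"
    return conn.execute(q, (name, pw)).fetchall()

# a classic injection payload is treated as a literal string after the rewrite
print(login_sanitized("admin' --", "x"))  # -> []
```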

  4. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet staff from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and automatic classification are examined [fr

  5. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.
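    As a toy illustration of such local algorithm replacement (the chunking scheme is an assumption of this sketch, not the article's transformation), a recognized sequential reduction is swapped for a parallel equivalent:

```python
# Minimal sketch: a recognized reduction pattern replaced by a parallel
# equivalent (illustrative only; chunking scheme is assumed).
from multiprocessing import Pool

def dot_sequential(a, b):
    s = 0.0
    for x, y in zip(a, b):   # recognizable "reduction" pattern
        s += x * y
    return s

def _partial(args):
    a, b = args
    return sum(x * y for x, y in zip(a, b))

def dot_parallel(a, b, workers=4):
    n = len(a)
    bounds = [(i * n // workers, (i + 1) * n // workers) for i in range(workers)]
    chunks = [(a[lo:hi], b[lo:hi]) for lo, hi in bounds]
    with Pool(workers) as pool:
        return sum(pool.map(_partial, chunks))

if __name__ == "__main__":
    a = list(range(10000)); b = list(range(10000))
    assert dot_sequential(a, b) == dot_parallel(a, b)
```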

  6. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for adapting to the work environment at the plant site. The latest automatic welders in practical use for welding nuclear power apparatus at Toshiba and IHI factories, those for pipes and lining tanks, are described here. The pipe welder performs buttering welding on the inside of the pipe end, as the so-called IGSCC countermeasure, and the succeeding butt welding through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance at the shops as well as at the plant site. (author)

  7. Automatic structural scene digitalization.

    Science.gov (United States)

    Tang, Rui; Wang, Yuhan; Cosker, Darren; Li, Wenbin

    2017-01-01

    In this paper, we present an automatic system for the analysis and labeling of structural scenes: floor plan drawings in Computer-Aided Design (CAD) format. The proposed system applies a fusion strategy to detect and recognize various components of CAD floor plans, such as walls, doors, windows and other ambiguous assets. Technically, a general rule-based filter parsing method is first adopted to extract effective information from the original floor plan. Then, an image-processing based recovery method is employed to correct the information extracted in the first step. Our proposed method is fully automatic and real-time. This analysis system provides high accuracy and has also been evaluated on a public website that, on average, achieves more than ten thousand effective uses per day and reaches a relatively high satisfaction rate.

  8. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
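    The numerical-experiment method can be sketched in a few lines (illustrative only; the AR(1) noise settings and the linear estimator are our choices, not the book's algorithms):

```python
# Minimal sketch: Monte Carlo evaluation of a trend estimator on
# artificial time series with a known trend (assumed settings).
import numpy as np

rng = np.random.default_rng(42)

def artificial_series(n=500, slope=0.01, phi=0.6, sigma=1.0):
    noise = np.zeros(n)
    for t in range(1, n):              # AR(1) noise, as in real series
        noise[t] = phi * noise[t - 1] + rng.normal(0.0, sigma)
    trend = slope * np.arange(n)
    return trend, trend + noise

errors = []
for _ in range(200):                   # Monte Carlo repetitions
    trend, series = artificial_series()
    x = np.arange(len(series))
    est = np.polyval(np.polyfit(x, series, 1), x)  # linear trend fit
    errors.append(np.sqrt(np.mean((est - trend) ** 2)))
print(f"mean RMSE of linear fit: {np.mean(errors):.3f}")
```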

  9. Inter Genre Similarity Modelling For Automatic Music Genre Classification

    OpenAIRE

    Bagci, Ulas; Erzin, Engin

    2009-01-01

    Music genre classification is an essential tool for music information retrieval systems and it has been finding critical applications in various media platforms. Two important problems of the automatic music genre classification are feature extraction and classifier design. This paper investigates inter-genre similarity modelling (IGS) to improve the performance of automatic music genre classification. Inter-genre similarity information is extracted over the mis-classified feature population....

  10. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior...... knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we...

  11. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze...... programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers...

  12. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  13. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper, a method to automatically generate transition distances for LOD, improving image stability and performance, is presented. Three different methods were tested, all measuring the change between two levels of detail using the spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time, both view-direction-based selection and the furthest distance for each direction was ...

  14. Pigmented Nodular Basal Cell Carcinomas in Differential Diagnosis with Nodular Melanomas: Confocal Microscopy as a Reliable Tool for In Vivo Histologic Diagnosis

    International Nuclear Information System (INIS)

    Casari, A.; Pellacani, G.; Seidenari, S.; Pepe, P.; Longo, C.; Cesinaro, A. M.; Beretti, F.

    2011-01-01

    Nodular basal cell carcinoma, especially when pigmented, can enter the differential diagnosis with nodular melanoma, both clinically and dermoscopically. Reflectance confocal microscopy is a relatively new imaging technique that permits in vivo evaluation of skin tumors at nearly histological resolution. Here, we present four cases of challenging nodular lesions in which confocal microscopy was able to clarify the diagnosis.

  15. The Characterization Tool: A knowledge-based stem cell, differentiated cell, and tissue database with a web-based analysis front-end.

    NARCIS (Netherlands)

    I. Wohlers (Inken); H. Stachelscheid; J. Borstlap; K. Zeilinger; J.C. Gerlach

    2009-01-01

    In the rapidly growing field of stem cell research, there is a need for universal databases and web-based applications that provide a common knowledge base on the characteristics of stem cells, differentiated cells, and tissues by collecting, processing, and making available diverse

  16. Criticality in cell differentiation

    Indian Academy of Sciences (India)

    Indrani Bose

    2017-11-09

    Differentiation is mostly based on binary decisions, with the progenitor cells ... accounts for the dominant part of the remaining variation ... significant loss of information ... making in vitro: emerging concepts and novel tools.

  17. Combination of cyst fluid CEA and CA 125 is an accurate diagnostic tool for differentiating mucinous cystic neoplasms from intraductal papillary mucinous neoplasms.

    Science.gov (United States)

    Nagashio, Yoshikuni; Hijioka, Susumu; Mizuno, Nobumasa; Hara, Kazuo; Imaoka, Hiroshi; Bhatia, Vikram; Niwa, Yasumasa; Tajika, Masahiro; Tanaka, Tsutomu; Ishihara, Makoto; Shimizu, Yasuhiro; Hosoda, Waki; Yatabe, Yasushi; Yamao, Kenji

    2014-01-01

    Despite advances in imaging techniques, the diagnosis and management of pancreatic cystic lesions remain challenging. The objective of this study was to determine the utility of cyst fluid analysis (CEA, CA 19-9, CA 125, amylase, and cytology) in categorizing pancreatic cystic lesions and in differentiating malignant from benign cystic lesions. A retrospective analysis of 68 patients with histologically and clinically confirmed cystic lesions was performed. Cyst fluid was obtained by surgical resection (n = 45) or endoscopic ultrasound-guided fine needle aspiration (EUS-FNA) (n = 23). Cyst fluid tumor markers and amylase were measured and compared between the cyst types. Receiver operating characteristic (ROC) curve analysis of the tumor markers demonstrated that cyst fluid CEA provided the greatest area under the ROC curve (AUC) (0.884) for differentiating mucinous versus non-mucinous cystic lesions. When the CEA cutoff value was set at 67.3 ng/ml, the sensitivity, specificity and accuracy for diagnosing mucinous cysts were 89.2%, 77.8%, and 84.4%, respectively. The combination of cyst fluid CEA content >67.3 ng/ml and cyst fluid CA 125 content >10.0 U/ml segregated 77.8% (14/18) of mucinous cystic neoplasms (MCNs) from other cyst subtypes. On the other hand, no fluid marker was useful for differentiating malignant versus benign cystic lesions. Although cytology (accuracy 83.3%) more accurately diagnosed malignant cysts than CEA (accuracy 65.6%), it lacked sensitivity (35.3%). Our results demonstrate that cyst fluid CEA can be a helpful marker in differentiating mucinous from non-mucinous, but not malignant from benign, cystic lesions. A combined CEA and CA 125 approach may help segregate MCNs from IPMNs. Copyright © 2014 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  18. Reliability of cone beam computed tomography as a biopsy-independent tool in differential diagnosis of periapical cysts and granulomas: An In vivo Study.

    Science.gov (United States)

    Chanani, Ankit; Adhikari, Haridas Das

    2017-01-01

    Differential diagnosis of periapical cysts and granulomas is required because their treatment modalities are different. The aim of this study was to evaluate the efficacy of cone beam computed tomography (CBCT) in the differential diagnosis of periapical cysts from granulomas. A single-centered observational study was carried out in the Department of Conservative Dentistry and Endodontics, Dr. R. Ahmed Dental College and Hospital, using CBCT and a dental operating microscope. Forty-five lesions were analyzed using CBCT scans. One evaluator analyzed each CBCT scan for the presence of the following six characteristic cyst-like radiological features: location, shape, periphery, internal structure, effect on the surrounding structures, and cortical plate perforation. Another independent evaluator analyzed the CBCT scans. This process was repeated after 6 months, and the inter- and intrarater reliability of the CBCT diagnoses was evaluated. Periapical surgeries were performed and tissue samples were obtained for histopathological analysis. To evaluate the efficacy, CBCT diagnoses were compared with histopathological diagnoses, and six receiver operating characteristic (ROC) curve analyses were conducted. The ROC curve, Cronbach's alpha (α) test, and Cohen's kappa (κ) test were used for statistical analysis. Both inter- and intrarater reliability were excellent (α = 0.94, κ = 0.75 and 0.77, respectively). The ROC curve with regard to ≥4 positive findings revealed the highest area under the curve (0.66). CBCT is moderately accurate in the differential diagnosis of periapical cysts and granulomas.
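
    The agreement statistics used in this study (Cohen's kappa) are available in standard libraries. A minimal sketch with invented ratings for two evaluators:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical diagnoses from two evaluators for 12 lesions
# (1 = cyst, 0 = granuloma); the values are illustrative only.
rater_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
rater_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 1, 0]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"inter-rater Cohen's kappa = {kappa:.2f}")
```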

  19. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fraction of each kidney with respect to the injected dose, taking into account the kidney depth, with the results referred to normal values; and output of the results. Automation minimizes the scatter of the parameters and, through this simplification, is a great asset in routine work

  20. AUTOMATIC FREQUENCY CONTROL SYSTEM

    Science.gov (United States)

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity, and a secondary oscillator is actuated in which the cavity is the frequency-determining element. A low frequency is mixed with the output of the driving oscillator, and the resultant lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal to adjust a tuning element in the cavity and match the cavity frequency to that of the driving oscillator. The driving oscillator may then be connected to the cavity.

  1. Automatic dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.

    2008-01-01

    The Catani-Seymour dipole subtraction is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. We have automated the procedure in a computer code. The code is useful especially for processes with many parton legs. In this talk, we first explain the algorithm of the dipole subtraction and the overall structure of our code. We then show results for several processes in which the infrared divergences of the real emission contributions are subtracted. (author)

  2. Automatic programmable air ozonizer

    International Nuclear Information System (INIS)

    Gubarev, S.P.; Klosovsky, A.V.; Opaleva, G.P.; Taran, V.S.; Zolototrubova, M.I.

    2015-01-01

    In this paper we describe a compact, economical, easy-to-operate automatic air ozonizer developed at the Institute of Plasma Physics of the NSC KIPT. It is designed for the sanitation and disinfection of premises and for cleaning the air of foreign odors. A distinctive feature of the device is the generation of a given ozone concentration, approximately 0.7 of the maximum allowable concentration (MAC), and the automatic maintenance of this level, which allows people to remain inside the treated premises during operation. A microprocessor controller was developed to control the operation of the ozonizer

  3. PACS quality control and automatic problem notifier

    Science.gov (United States)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert to a film-based system if components fail. System failures range from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent daily to all technologists acquiring PACS images, used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to ensure that studies can be moved from each acquisition system and that contrast adjustment works. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described, and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established
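
    A notifier of the kind described, with periodic health checks that page out a diagnostic code on failure, can be sketched in a few lines. Everything below (the check commands, the codes, and the pager stub) is an illustrative assumption, not the University of Florida system:

```python
import subprocess
import time

# Hypothetical health checks: each maps a diagnostic code to a shell
# command whose nonzero exit status signals a problem.
CHECKS = {
    "E01-archive-process": ["pgrep", "-f", "archive_daemon"],
    "E02-disk-space": ["sh", "-c",
                       "test $(df --output=pcent /data | tail -1 | tr -dc 0-9) -lt 95"],
}

def send_page(code):
    # Stand-in for the pager hook; a real system would call a paging API.
    print(f"PAGE -> on-call PACS admin, diagnostic code {code}")

while True:
    for code, cmd in CHECKS.items():
        if subprocess.run(cmd, capture_output=True).returncode != 0:
            send_page(code)
    time.sleep(300)  # poll every five minutes
```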

  4. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electrical machines, from generators to motors, the motor as the power source of a machine tool, and the electrical equipment of machine tools, such as switches in the main circuit, automatic switches, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part covers wiring diagrams, including the basic electrical circuits of machine tools and the wiring diagrams of machines such as milling machines, planers, and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions for each diagnosis and describing diagnostic methods based on voltage and resistance measurements with a tester.

  5. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer-Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for the control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  6. Do Automatic Self-Associations Relate to Suicidal Ideation?

    NARCIS (Netherlands)

    Glashouwer, Klaske A.; de Jong, Peter J.; Penninx, Brenda W. J. H.; Kerkhof, Ad J. F. M.; van Dyck, Richard; Ormel, Johan

    Dysfunctional self-schemas are assumed to play an important role in suicidal ideation. According to recent information-processing models, it is important to differentiate between 'explicit' beliefs and automatic associations. Explicit beliefs stem from the weighting of propositions and their

  7. Conditioned craving cues elicit an automatic approach tendency

    NARCIS (Netherlands)

    van Gucht, D.; Vansteenwegen, D.; Van den Bergh, O.; Beckers, T.

    2008-01-01

    In two experiments, we used a Pavlovian differential conditioning procedure to induce craving for chocolate. As a result of repeated pairing with chocolate intake, initially neutral cues came to elicit an automatic approach tendency in a speeded stimulus-response compatibility reaction time task.

  8. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand-and-shoe monitors. In addition, this prototype system has a number of new features, including microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card-reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company-controlled boundaries. Investigation of the commercially available portal and hand-and-shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face, and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but also makes it difficult to bypass. (author)

  9. The combination of urinary IL - 6 and renal biometry as useful diagnostic tools to differentiate acute pyelonephritis from lower urinary tract infection

    Directory of Open Access Journals (Sweden)

    Sherif Azab

    Full Text Available ABSTRACT Objective: To evaluate the role of renal ultrasound (RUS) and urinary IL-6 in the differentiation between acute pyelonephritis (APN) and lower urinary tract infection (LUTI). Patients and methods: This prospective study was carried out at the pediatric and urology outpatient and inpatient departments of Cairo University Children's Hospital as well as October 6 University Hospital, and it included 155 children between one month and fourteen years old with culture-positive UTI. Patients were categorized into APN and LUTI based on their clinical features and laboratory parameters. Thirty healthy, age- and sex-matched children constituted the control group. Children with positive urine cultures were treated with appropriate antibiotics. Before treatment, urinary IL-6 was measured by enzyme immunoassay (ELISA), and renal ultrasound (RUS) was done. CRP (C-reactive protein), IL-6 and RUS were repeated on the 14th day of antibiotic treatment to evaluate the changes in their levels in response to treatment. Results: UIL-6 levels were significantly higher in patients with APN than in patients with LUTI (24.3±19.3 pg/mL for APN vs. 7.3±2.7 pg/mL for LUTI; 95% CI: 2.6-27.4). UIL-6 >20 pg/mL and serum CRP >20 μg/mL were highly reliable markers of APN. Mean renal volume and the mean volume difference between the two kidneys in the APN group were greater than those of the LUTI and control groups (P<0.001). A renal volume between 120-130% of normal was the best criterion for differentiating APN from LUTI. Conclusions: RUS and urinary IL-6 levels have a highly dependable role in the differentiation between APN and LUTI, especially in places where other investigations are not available and/or affordable.

  10. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. For industrial applications, however, a tool for fast data analysis is also required, and it should be easy to handle. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. In our experience, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument that covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  11. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  12. Norms concerning the programmable automatic control devices

    International Nuclear Information System (INIS)

    Fourmentraux, G.

    1995-01-01

    This presentation reports on the studies carried out by the Working Group on the Functional Safety of Programmable Automatic Control Devices and by the Group for Prevention Studies (GEP) of the CEA. The objective of these groups is to evaluate the methods that could be used to estimate the functional safety of control and instrumentation systems involved in the Important Elements for Safety (EIS) of the Basic Nuclear Installations (INB) of the CEA, and also to carry out a qualification of automatic control devices. Norms, protocols and tools for the evaluation are presented. The problem comprises two aspects: the evaluation of fault avoidance techniques and the evaluation of fault control techniques used during design. For the fault avoidance techniques, the quality assurance organization, the environment tests, and the software quality plans are considered. For the fault control techniques, the different available tools and fault injection models are analysed. The results of an analysis carried out with the DEF.I tool from the National Institute for Research and Safety (INRS) are reported. (J.S.). 23 refs

  13. SELFADJUSTING AUTOMATIC CONTROL OF SOWING UNIT

    Directory of Open Access Journals (Sweden)

    A. Yu. Izmaylov

    2015-01-01

    Full Text Available Self-adjusting automatic control of the sowing unit and differentiated application of mineral fertilizer doses according to agrochemical indicators of the soil (precision agriculture) are used ever more widely nowadays. The main requirement for differentiated seeding and fertilizing is the accuracy and duration of the transition from one norm to another. It was established that at a unit speed of 10 km/h the machine moves about 1.5 m or more in 0.5 s, yet the radio-channel differential correction is updated only every 10 s, and in RTK mode every 0.5-2 s, which limits the accuracy of seed and fertilizer placement. A block diagram was worked out for the automatic control of the seeding and mineral fertilizing process, using navigation means for orienting machine-tractor aggregates in the field and technical means for implementing precision-agriculture technology at sowing and fertilizer application, based on electronic soil fertility maps and navigation satellite systems. It was noted that, to regulate the fertilizing dose, the unit must be completed with an electric drive, and that, to reduce errors, GLONASS, GPS and Galileo navigation receivers should be used. A 32-channel receiver tracking the four leading navigation systems GPS/GLONASS/Galileo/Compass, developed by the domestic firm «KB NAVIS», was suggested. It was established that the automated device created by the All-Russia Research Institute of Mechanization for Agriculture, based on the NAVSTAR and GLONASS/GPS systems, successfully operates seeding and makes differentiated fertilizing possible.

  14. Design of tool monitor simulator

    International Nuclear Information System (INIS)

    Yao Yonggang; Deng Changming; Zhang Jia; Meng Dan; Zhang Lu; Wang Zhi'ai; Shen Yang

    2011-01-01

    Taking the tool monitor at the Qinshan Nuclear Power Plant as the object of study, a tool monitor simulator was manufactured. The device is designed to automatically simulate monitoring the contamination level of objects for training students: when the tool monitor reports contamination, the students can practice handling it properly. A brief introduction to the main functions and the system design of the simulator is presented in the paper. (authors)

  15. Automatic identification in mining

    Energy Technology Data Exchange (ETDEWEB)

    Puckett, D; Patrick, C [Mine Computers and Electronics Inc., Morehead, KY (United States)

    1998-06-01

    The feasibility of monitoring the locations and vital statistics of equipment and personnel in surface and underground mining operations has increased with advancements in radio frequency identification (RFID) technology. This paper addresses the use of RFID technology, which is relatively new to the mining industry, to track surface equipment in mine pits, loading points and processing facilities. Specific applications are discussed, including both simplified and complex truck tracking systems and an automatic pit ticket system. This paper concludes with a discussion of the future possibilities of using RFID technology in mining, including monitoring heart and respiration rates, body temperatures and exertion levels; monitoring repetitious movements for the study of work habits; and logging air quality via personnel sensors. 10 refs., 5 figs.

  16. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through a description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of pearlite in nodular cast irons, to porosity and average grain size in high-density sintered UO2 pellets, and to the grain size of ferritic steel. The techniques adopted are described, and the results obtained are compared with those of the corresponding direct counting processes: systematic point counting (grid) to measure volume, and the intersection method, utilizing a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron follows from the small difference in optical reflectivity between graphite and pearlite. The porosity evaluation of sintered UO2 pellets is also analyzed

  17. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  18. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  19. Thyroid scintigraphy and perchlorate test after recombinant human TSH: a new tool for the differential diagnosis of congenital hypothyroidism during infancy

    Energy Technology Data Exchange (ETDEWEB)

    Fugazzola, Laura; Vannucchi, Guia; Mannavola, Deborah; Beck-Peccoz, Paolo [University of Milan and Fondazione Policlinico IRCCS, Department of Medical Sciences, Milan (Italy); Persani, Luca [University of Milan and Istituto Auxologico Italiano, Department of Medical Sciences, Via Zucchi, Cusano, Milan (Italy); Carletto, Marco; Longari, Virgilio [Fondazione Policlinico IRCCS, Department of Nuclear Medicine, Milan (Italy); Vigone, Maria C.; Cortinovis, Francesca; Weber, Giovanna [Universita Vita-Salute S. Raffaele, Centro di Endocrinologia dell' Infanzia e dell' Adolescenza, Milan (Italy); Beccaria, Luciano [A. Manzoni Hospital, Paediatric Unit, Lecco (Italy)

    2007-09-15

    Prompt initiation of l-thyroxine therapy in neonates with congenital hypothyroidism (CH) often prevents the performance of functional studies. Aetiological diagnosis is thus postponed until after infancy, when the required investigations are performed after l-thyroxine withdrawal. The aim of this study was to verify the efficacy and safety of new protocols for rhTSH (Thyrogen) testing during l-thyroxine replacement in the differential diagnosis of CH. Ten CH patients (15-144 months old) were studied. Seven had neonatal evidence of gland in situ at the ultrasound examination performed at enrolment and received two rhTSH injections (4 μg/kg daily, i.m.) with 123I scintigraphy and perchlorate test on day 3. Three patients with an ultrasound diagnosis of thyroid dysgenesis received three rhTSH injections with 123I scintigraphy on days 3 and 4. TSH and thyroglobulin (Tg) determinations were performed on days 1, 3 and 4, and neck ultrasound on day 1. rhTSH stimulation caused Tg levels to increase in eight cases. Blunted Tg responses were seen in two patients with ectopia and hypoplasia. Interestingly, in two cases the association of different developmental defects was demonstrated. Perchlorate test revealed a total iodide organification defect in two patients, including one with a neonatal diagnosis of Pendred's syndrome, who were subsequently found to harbour TPO mutations. rhTSH did not cause notable side-effects. These new rhTSH protocols always resulted in accurate disease characterisation, allowing specific management and targeted genetic analyses. Thus, rhTSH represents a valid and safe alternative to l-thyroxine withdrawal in the differential diagnosis of CH in paediatric patients. (orig.)

  20. Thyroid scintigraphy and perchlorate test after recombinant human TSH: a new tool for the differential diagnosis of congenital hypothyroidism during infancy

    International Nuclear Information System (INIS)

    Fugazzola, Laura; Vannucchi, Guia; Mannavola, Deborah; Beck-Peccoz, Paolo; Persani, Luca; Carletto, Marco; Longari, Virgilio; Vigone, Maria C.; Cortinovis, Francesca; Weber, Giovanna; Beccaria, Luciano

    2007-01-01

    Prompt initiation of l-thyroxine therapy in neonates with congenital hypothyroidism (CH) often prevents the performance of functional studies. Aetiological diagnosis is thus postponed until after infancy, when the required investigations are performed after l-thyroxine withdrawal. The aim of this study was to verify the efficacy and safety of new protocols for rhTSH (Thyrogen) testing during l-thyroxine replacement in the differential diagnosis of CH. Ten CH patients (15-144 months old) were studied. Seven had neonatal evidence of gland in situ at the ultrasound examination performed at enrolment and received two rhTSH injections (4 μg/kg daily, i.m.) with 123I scintigraphy and perchlorate test on day 3. Three patients with an ultrasound diagnosis of thyroid dysgenesis received three rhTSH injections with 123I scintigraphy on days 3 and 4. TSH and thyroglobulin (Tg) determinations were performed on days 1, 3 and 4, and neck ultrasound on day 1. rhTSH stimulation caused Tg levels to increase in eight cases. Blunted Tg responses were seen in two patients with ectopia and hypoplasia. Interestingly, in two cases the association of different developmental defects was demonstrated. Perchlorate test revealed a total iodide organification defect in two patients, including one with a neonatal diagnosis of Pendred's syndrome, who were subsequently found to harbour TPO mutations. rhTSH did not cause notable side-effects. These new rhTSH protocols always resulted in accurate disease characterisation, allowing specific management and targeted genetic analyses. Thus, rhTSH represents a valid and safe alternative to l-thyroxine withdrawal in the differential diagnosis of CH in paediatric patients. (orig.)

  1. The combination of urinary IL - 6 and renal biometry as useful diagnostic tools to differentiate acute pyelonephritis from lower urinary tract infection.

    Science.gov (United States)

    Azab, Sherif; Zakaria, Mostafa; Raafat, Mona; Seief, Hadeel

    2016-01-01

    To evaluate the role of renal ultrasound (RUS) and urinary IL-6 in the differentiation between acute pyelonephritis (APN) and lower urinary tract infection (LUTI). This prospective study was carried out at the pediatric and urology outpatient and inpatient departments of Cairo University Children's Hospital as well as October 6 University Hospital, and it included 155 children between one month and fourteen years old with culture-positive UTI. Patients were categorized into APN and LUTI based on their clinical features and laboratory parameters. Thirty healthy, age- and sex-matched children constituted the control group. Children with positive urine cultures were treated with appropriate antibiotics. Before treatment, urinary IL-6 was measured by enzyme immunoassay (ELISA), and renal ultrasound (RUS) was done. CRP (C-reactive protein), IL-6 and RUS were repeated on the 14th day of antibiotic treatment to evaluate the changes in their levels in response to treatment. UIL-6 levels were significantly higher in patients with APN than in patients with LUTI (24.3±19.3 pg/mL for APN vs. 7.3±2.7 pg/mL for LUTI; 95% CI: 2.6-27.4). UIL-6 >20 pg/mL and serum CRP >20 μg/mL were highly reliable markers of APN. Mean renal volume and the mean volume difference between the two kidneys in the APN group were greater than those of the LUTI and control groups (P<0.001). RUS and urinary IL-6 levels have a highly dependable role in the differentiation between APN and LUTI, especially in places where other investigations are not available and/or affordable. Copyright© by the International Brazilian Journal of Urology.

  2. Automatic Visualization of Software Requirements: Reactive Systems

    International Nuclear Information System (INIS)

    Castello, R.; Mili, R.; Tollis, I.G.; Winter, V.

    1999-01-01

    In this paper we present an approach that facilitates the validation of high-consequence system requirements. This approach consists of automatically generating a graphical representation from an informal document. Our chosen graphical notation is statecharts. We proceed in two steps: we first extract a hierarchical decomposition tree from the textual description, and then we draw a graph that models the statechart in a hierarchical fashion. The resulting drawing is an effective requirements assessment tool that allows the end user to easily pinpoint inconsistencies and incompleteness

  3. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1-16 (2012)] are reported. First, it is shown that a numerical quadrature which avoids overcomputing and minimizes the hidden floating-point loss of precision requires the consideration of three classes of integration domain lengths, each endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
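
    The three length classes and their quadrature sums can be illustrated with a simple dispatcher. The thresholds below are invented, and a 16-point Gauss-Legendre sum stands in for the high-degree macroscopic rule (the paper's diagnostics use Clenshaw-Curtis quadrature):

```python
import numpy as np

def quad_by_length(f, a, b, micro=1e-6, meso=1e-2):
    """Pick a quadrature sum from the length of [a, b], echoing the three
    length classes described above (the thresholds are invented)."""
    h = b - a
    if h < micro:                      # microscopic: trapezoidal rule
        return 0.5 * h * (f(a) + f(b))
    if h < meso:                       # mesoscopic: Simpson rule
        return h / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))
    # macroscopic: quadrature sum of high algebraic degree of precision
    x, w = np.polynomial.legendre.leggauss(16)
    return 0.5 * h * np.sum(w * f(0.5 * h * x + 0.5 * (a + b)))

print(quad_by_length(np.sin, 0.0, np.pi))  # ~2.0
```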

  4. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state of the art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains the major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi
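
    To give a flavour of the greedy methods the book surveys, here is a small sketch of pairwise (2-way) covering-array generation. It is a generic illustration, not an algorithm taken from the book:

```python
from itertools import combinations, product

def pairwise_tests(params):
    """Greedily build tests until every pair of parameter values is covered."""
    names = list(params)
    # All (parameter, value) pairs that must appear together at least once.
    uncovered = {((i, v1), (j, v2))
                 for i, j in combinations(range(len(names)), 2)
                 for v1 in params[names[i]] for v2 in params[names[j]]}
    tests = []
    while uncovered:
        # Pick the candidate test covering the most still-uncovered pairs.
        best = max(product(*params.values()),
                   key=lambda t: sum(((i, t[i]), (j, t[j])) in uncovered
                                     for i, j in combinations(range(len(t)), 2)))
        tests.append(dict(zip(names, best)))
        uncovered -= {((i, best[i]), (j, best[j]))
                      for i, j in combinations(range(len(best)), 2)}
    return tests

cfg = {"os": ["linux", "win"], "db": ["pg", "mysql"], "browser": ["ff", "chrome"]}
for t in pairwise_tests(cfg):
    print(t)
```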

  5. Coherence measures in automatic time-migration velocity analysis

    International Nuclear Information System (INIS)

    Maciel, Jonathas S; Costa, Jessé C; Schleicher, Jörg

    2012-01-01

    Time-migration velocity analysis can be carried out automatically by evaluating the coherence of migrated seismic events in common-image gathers (CIGs). The performance of gradient methods for automatic time-migration velocity analysis depends on the coherence measure used as the objective function. We compare the results of four different coherence measures: conventional semblance, differential semblance, an extended differential semblance using differences of more distant image traces, and the product of the latter with conventional semblance. In our numerical experiments, the objective functions based on conventional semblance and on the product of conventional semblance with extended differential semblance provided the best velocity models, as evaluated by the flatness of the resulting CIGs. The method can be easily extended to anisotropic media. (paper)
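
    Two of the four coherence measures compared, conventional and differential semblance, have simple closed forms. A minimal numpy sketch on a synthetic, perfectly flat gather follows; the normalizations are common textbook definitions and may differ in detail from the paper's:

```python
import numpy as np

def semblance(cig):
    """Conventional semblance of a common-image gather
    (rows = time samples, columns = image traces)."""
    num = np.sum(np.sum(cig, axis=1) ** 2)
    den = cig.shape[1] * np.sum(cig ** 2)
    return num / den  # 1.0 for perfectly flat (identical) traces

def differential_semblance(cig):
    """Differential semblance: energy of differences between
    neighbouring traces; 0.0 for perfectly flat events."""
    return np.sum(np.diff(cig, axis=1) ** 2)

# Synthetic gather: one event repeated identically across 12 traces.
flat = np.tile(np.sin(np.linspace(0, 6, 200))[:, None], (1, 12))
print(semblance(flat), differential_semblance(flat))  # ~1.0, ~0.0
```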

  6. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

    Full Text Available Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) a user describes the requirements of the target embedded network systems by logical property-based constraints using SENS; (2) given SENS specifications, test cases are automatically generated using a SAT-based solver, and filtering mechanisms to select efficient test cases are also available in our tool; (3) in addition, given a testing goal by the user, test sequences are automatically extracted from the exhaustive test cases. We implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in the design and test generation of real embedded air-conditioning network systems.

  7. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    Science.gov (United States)

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining whether the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test the identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking the identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with completely automated software, requiring minimal prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of the use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.
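
    DAISY itself is implemented in the REDUCE computer algebra system. Purely as a hedged illustration of the question it answers, not of its differential-algebra algorithm, the toy model below exhibits one identifiable and one non-identifiable parameter:

```python
import sympy as sp

# Toy model: dx/dt = -a*x, y = b*x, so y(t) = b*x0*exp(-a*t).
a, b, c, x0, t = sp.symbols('a b c x0 t', positive=True)
y = b * x0 * sp.exp(-a * t)

# b and x0 enter the output only through their product, so replacing
# b by c/x0 leaves y unchanged: b alone is NOT globally identifiable.
print(sp.simplify(y.subs(b, c / x0) - c * sp.exp(-a * t)))  # -> 0

# a, by contrast, is identifiable: dy/dt / y = -a for any observed y.
print(sp.simplify(sp.diff(y, t) / y))  # -> -a
```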

  8. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application-dependent, while manual tracing methods are extremely time-consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary-searching ability of the live-wire method, reducing the necessary user interaction while keeping the segmentation performance. Based on the results of segmenting 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  9. Wavelet-based feature extraction applied to small-angle x-ray scattering patterns from breast tissue: a tool for differentiating between tissue types

    International Nuclear Information System (INIS)

    Falzon, G; Pearson, S; Murison, R; Hall, C; Siu, K; Evans, A; Rogers, K; Lewis, R

    2006-01-01

    This paper reports on the application of wavelet decomposition to small-angle x-ray scattering (SAXS) patterns from human breast tissue produced by a synchrotron source. The pixel intensities of SAXS patterns of normal, benign and malignant tissue types were transformed into wavelet coefficients. Statistical analysis found significant differences between the wavelet coefficients describing the patterns produced by different tissue types. These differences were then correlated with position in the image and have been linked to the supra-molecular structural changes that occur in breast tissue in the presence of disease. Specifically, results indicate that there are significant differences between healthy and diseased tissues in the wavelet coefficients that describe the peaks produced by the axial d-spacing of collagen. These differences suggest that a useful classification tool could be based upon the spectral information within the axial peaks
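
    A hedged sketch of the general idea, summarizing an image by the energies of its 2-D wavelet subbands, can be written with PyWavelets; the wavelet, decomposition level, and random stand-in image are illustrative choices, not the paper's protocol:

```python
import numpy as np
import pywt

def wavelet_features(image, wavelet="db2", level=3):
    """Energy of each 2-D wavelet subband, a simple stand-in for the
    coefficient statistics compared between tissue types."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    feats = [np.sum(coeffs[0] ** 2)]              # approximation energy
    for cH, cV, cD in coeffs[1:]:                 # detail subbands per level
        feats += [np.sum(cH ** 2), np.sum(cV ** 2), np.sum(cD ** 2)]
    return np.array(feats)

pattern = np.random.default_rng(0).random((128, 128))  # stand-in SAXS image
print(wavelet_features(pattern).round(1))
```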

  10. Theory and applications of differential algebra

    International Nuclear Information System (INIS)

    Pusch, G.D.

    1992-01-01

    Differential algebra (DA) is a new method of automatic differentiation. DA can rapidly and efficiently calculate the values of derivatives of arbitrarily complicated functions, in arbitrarily many variables, to arbitrary order, via its definition of multiplication. I provide a brief introduction to DA, and enumerate some of its recent applications. (author). 6 refs
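
    The core mechanism, propagating derivative values through a redefined multiplication, can be illustrated with first-order dual numbers. This minimal sketch is not the DA package itself, which handles arbitrary orders and many variables:

```python
class Dual:
    """Minimal forward-mode AD via dual numbers: carrying (value, derivative)
    through arithmetic implements the chain rule automatically."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u*v)' = u'*v + u*v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

y = f(Dual(2.0, 1.0))              # seed dx/dx = 1
print(y.val, y.der)                # 17.0 14.0
```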

  11. HDL cholesterol as a diagnostic tool for clinical differentiation of GCK-MODY from HNF1A-MODY and type 1 diabetes in children and young adults.

    Science.gov (United States)

    Fendler, Wojciech; Borowiec, Maciej; Antosik, Karolina; Szadkowska, Agnieszka; Deja, Grazyna; Jarosz-Chobot, Przemyslawa; Mysliwiec, Malgorzata; Wyka, Krystyna; Pietrzak, Iwona; Skupien, Jan; Malecki, Maciej T; Mlynarski, Wojciech

    2011-09-01

    Confirmation of monogenic diabetes caused by glucokinase mutations (GCK-MODY) allows pharmacogenetic intervention in the form of insulin discontinuation. This is especially important among paediatric and young adult populations, where GCK-MODY is most prevalent. The study evaluated the utility of lipid parameters in screening for patients with GCK-MODY. Eighty-nine children with type 1 diabetes and 68 with GCK-MODY were screened for triglyceride (TG), total and HDL cholesterol levels. Standardization against a control group of 171 healthy children was applied to eliminate the effect of development. Clinical applicability and cut-off values were evaluated in all available patients with GCK-MODY (n = 148), hepatocyte nuclear factor 1-alpha MODY (HNF1A-MODY) (n = 37) or type 1 diabetes (n = 221). Lower lipid parameter values were observed in GCK-MODY than in patients with type 1 diabetes; standard deviation scores were -0.22 ± 2.24 vs 1.31 ± 2.17 for HDL cholesterol. HDL cholesterol proved useful for GCK-MODY selection [sensitivity 87%, specificity 54%, negative predictive value (NPV) 86%, positive PV 56%]. A threshold HDL concentration of 1.56 mmol/l offered significantly better diagnostic efficiency than total cholesterol (cut-off value 4.51 mmol/l; NPV 80%; PPV 38%) for the selection of GCK-MODY and its differentiation from T1DM and HNF1A-MODY, regardless of treatment or metabolic control. © 2011 Blackwell Publishing Ltd.

  12. Optical methods and differential scanning calorimetry as a potential tool for discrimination of olive oils (extra virgin and mix with vegetable oils)

    Science.gov (United States)

    Nikolova, Kr.; Yovcheva, T.; Marudova, M.; Eftimov, T.; Bodurov, I.; Viraneva, A.; Vlaeva, I.

    2016-03-01

    Eleven olive oil samples have been investigated using four physical methods: refractive index measurement, fluorescence spectra, color parameters and differential scanning calorimetry. In pomace olive oil (POO) and extra virgin olive oil (EVOO), oleic acid (65.24%-78.40%) predominates over palmitic (10.47%-15.07%) and linoleic (5.26%-13.92%) acids. The fluorescence spectra contain three peaks, related to oxidation products at about λ = (500-540) nm, to the chlorophyll content at about λ = (675-680) nm and to undetermined pigments at λ = (700-750) nm. The melting point of EVOO and POO lies between -1 °C and -6 °C. In contrast, the salad olive oils melt between -24 °C and -30 °C. The refractive index of EVOO is lower than that of mixed olive oils. The proposed physical methods could be used for fast and simple detection of vegetable oils in EVOO without the use of chemical substances. The experimental results are in accordance with those obtained by chemical analysis.

  13. Differential voltage analysis as a tool for analyzing inhomogeneous aging: A case study for LiFePO4|Graphite cylindrical cells

    Science.gov (United States)

    Lewerenz, Meinert; Marongiu, Andrea; Warnecke, Alexander; Sauer, Dirk Uwe

    2017-11-01

    In this work, differential voltage analysis (DVA) is evaluated for LiFePO4|graphite cylindrical cells aged in calendar and cycling tests. The homogeneity of the active lithium distribution and the loss of anode active material (LAAM) are measured via the characteristic shape and peaks of the DVA. The results of this analysis exhibit an increasing homogeneity of the lithium-ion distribution during aging for all cells subjected to calendar aging. At 60 °C, LAAM is additionally found and can be associated with the deposition of dissolved Fe from the cathode on the anode, where it finally leads to the clogging of pores. For cells aged under cycling conditions, several phenomena are correlated with degradation, such as loss of active lithium and local LAAM for 100% DOD. Moreover, the deactivation of certain parts of the anode and cathode due to a lithium-impermeable covering layer on top of the anode is observed for some cells. While 100% DOD cycling is characterized by continuous LAAM, the LAAM due to deactivation by a covering layer on both electrodes starts suddenly. The homogeneity of the active lithium distribution within the cycled cells is successively reduced with deposited passivation layers and with LAAM that occurs locally at positions with lower external pressure on the electrode.
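
    The DVA curve itself is just the derivative dV/dQ of a low-current voltage-versus-capacity record. A minimal sketch on synthetic data follows; the smoothing window and the synthetic curve are illustrative assumptions:

```python
import numpy as np

def differential_voltage(capacity_ah, voltage_v, smooth=25):
    """dV/dQ curve from a low-current charge/discharge record; peaks in
    this curve are the DVA features tracked during aging (the smoothing
    window is an illustrative choice)."""
    v = np.convolve(voltage_v, np.ones(smooth) / smooth, mode="same")
    return np.gradient(v, capacity_ah)

# Synthetic stand-in data: voltage vs. charge throughput.
q = np.linspace(0.0, 2.5, 500)                       # Ah
v = 3.2 + 0.12 * q + 0.05 * np.sin(4.0 * np.pi * q)  # V
dvdq = differential_voltage(q, v)
print(dvdq[:5].round(4))
```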

  14. Automatic web site authoring with SiteGuide

    NARCIS (Netherlands)

    de Boer, V.; Hollink, V.; van Someren, M.W.; Kłopotek, M.A.; Przepiórkowski, A.; Wierzchoń, S.T.; Trojanowski, K.

    2009-01-01

    An important step in the design process for a web site is to determine which information is to be included and how the information should be organized on the web site’s pages. In this paper we describe ’SiteGuide’, a tool that automatically produces an information architecture for a web site that a

  15. Hybrid integral-differential simulator of EM force interactions/scenario-assessment tool with pre-computed influence matrix in applications to ITER

    Science.gov (United States)

    Rozov, V.; Alekseev, A.

    2015-08-01

    A necessity to address a wide spectrum of engineering problems in ITER determined the need for efficient tools for modeling of the magnetic environment and force interactions between the main components of the magnet system. The assessment of the operating window for the machine, determined by the electro-magnetic (EM) forces, and the check of feasibility of particular scenarios play an important role for ensuring the safety of exploitation. Such analysis-powered prevention of damages forms an element of the Machine Operations and Investment Protection strategy. The corresponding analysis is a necessary step in preparation of the commissioning, which finalizes the construction phase. It shall be supported by the development of the efficient and robust simulators and multi-physics/multi-system integration of models. The developed numerical model of interactions in the ITER magnetic system, based on the use of pre-computed influence matrices, facilitated immediate and complete assessment and systematic specification of EM loads on magnets in all foreseen operating regimes, their maximum values, envelopes and the most critical scenarios. The common principles of interaction in typical bilateral configurations have been generalized for asymmetry conditions, inspired by the plasma and by the hardware, including asymmetric plasma event and magnetic system fault cases. The specification of loads is supported by the technology of functional approximation of nodal and distributed data by continuous patterns/analytical interpolants. The global model of interactions together with the mesh-independent analytical format of output provides the source of self-consistent and transferable data on the spatial distribution of the system of forces for assessments of structural performance of the components, assemblies and supporting structures. The numerical model used is fully parametrized, which makes it very suitable for multi-variant and sensitivity studies (positioning, off
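
    The pre-computed influence-matrix idea can be caricatured in a few lines: once the matrix is tabulated, each current scenario costs one matrix-vector product instead of a new field computation. The bilinear force form, the shapes, and the numbers below are illustrative assumptions, not ITER data:

```python
import numpy as np

# Stand-in influence matrix M, tabulated once; the force on coil i for a
# current scenario I is then evaluated cheaply as F_i = I_i * sum_j M_ij I_j.
rng = np.random.default_rng(1)
n_coils = 5
M = rng.normal(size=(n_coils, n_coils))   # pre-computed (invented values)
M = 0.5 * (M + M.T)                       # assume symmetric mutual coupling

def coil_forces(currents):
    # One matrix-vector product per scenario assessment.
    return currents * (M @ currents)

print(coil_forces(np.array([10.0, -5.0, 2.5, 0.0, 7.5])).round(2))
```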

  16. Automatic generation of anatomic characteristics from cerebral aneurysm surface models.

    Science.gov (United States)

    Neugebauer, M; Lawonn, K; Beuing, O; Preim, B

    2013-03-01

    Computer-aided research on cerebral aneurysms often depends on a polygonal mesh representation of the vessel lumen. To support a differentiated, anatomy-aware analysis, it is necessary to derive anatomic descriptors from the surface model. We present an approach on automatic decomposition of the adjacent vessels into near- and far-vessel regions and computation of the axial plane. We also exemplarily present two applications of the geometric descriptors: automatic computation of a unique vessel order and automatic viewpoint selection. Approximation methods are employed to analyze vessel cross-sections and the vessel area profile along the centerline. The resulting transition zones between near- and far- vessel regions are used as input for an optimization process to compute the axial plane. The unique vessel order is defined via projection into the plane space of the axial plane. The viewing direction for the automatic viewpoint selection is derived from the normal vector of the axial plane. The approach was successfully applied to representative data sets exhibiting a broad variability with respect to the configuration of their adjacent vessels. A robustness analysis showed that the automatic decomposition is stable against noise. A survey with 4 medical experts showed a broad agreement with the automatically defined transition zones. Due to the general nature of the underlying algorithms, this approach is applicable to most of the likely aneurysm configurations in the cerebral vasculature. Additional geometric information obtained during automatic decomposition can support correction in case the automatic approach fails. The resulting descriptors can be used for various applications in the field of visualization, exploration and analysis of cerebral aneurysms.

  17. Specificity enhancement by electrospray ionization multistage mass spectrometry--a valuable tool for differentiation and identification of 'V'-type chemical warfare agents.

    Science.gov (United States)

    Weissberg, Avi; Tzanani, Nitzan; Dagan, Shai

    2013-12-01

    The use of chemical warfare agents has become an issue of emerging concern. One of the challenges in analytical monitoring of the extremely toxic 'V'-type chemical weapons [O-alkyl S-(2-dialkylamino)ethyl alkylphosphonothiolates] is to distinguish and identify compounds of similar structure. MS analysis of these compounds reveals mostly fragment/product ions representing the amine-containing residue. Hence, isomers or derivatives with the same amine residue exhibit similar mass spectral patterns in both classical EI/MS and electrospray ionization-MS, leading to unavoidable ambiguity in the identification of the phosphonate moiety. A set of five 'V'-type agents, including O-ethyl S-(2-diisopropylamino)ethyl methylphosphonothiolate (VX), O-isobutyl S-(2-diethylamino)ethyl methylphosphonothiolate (RVX) and O-ethyl S-(2-diethylamino)ethyl methylphosphonothiolate (VM) were studied by liquid chromatography/electrospray ionization/MS, utilizing a QTRAP mass detector. MS/MS enhanced product ion scans and multistage MS(3) experiments were carried out. Based on the results, possible fragmentation pathways were proposed, and a method for the differentiation and identification of structural isomers and derivatives of 'V'-type chemical warfare agents was obtained. MS/MS enhanced product ion scans at various collision energies provided information-rich spectra, although many of the product ions obtained were at low abundance. Employing MS(3) experiments enhanced the selectivity for those low abundance product ions and provided spectra indicative of the different phosphonate groups. Study of the fragmentation pathways, revealing some less expected structures, was carried out and allowed the formulation of mechanistic rules and the determination of sets of ions typical of specific groups, for example, methylphosphonothiolates versus ethylphosphonothiolates. The new group-specific ions elucidated in this work are also useful for screening unknown 'V'-type agents and related

  18. Sexual Modes Questionnaire (SMQ): Translation and Psychometric Properties of the Italian Version of the Automatic Thought Scale.

    Science.gov (United States)

    Nimbi, Filippo Maria; Tripodi, Francesca; Simonelli, Chiara; Nobre, Pedro

    2018-03-01

    The Sexual Modes Questionnaire (SMQ) is a validated and widely used tool to assess the association among negative automatic thoughts, emotions, and sexual response during sexual activity in men and women. The aim was to test the psychometric characteristics of the Italian version of the SMQ, focusing on the Automatic Thoughts subscale (SMQ-AT). After linguistic translation, the psychometric properties (internal consistency, construct validity, and discriminant validity) were evaluated. 1,051 participants (425 men and 626 women; 776 healthy and 275 in clinical groups complaining of sexual problems) took part in the study. Two confirmatory factor analyses were conducted to test the fit of the original factor structures of the SMQ versions. In addition, two principal component analyses were performed to identify two new factorial structures, which were further validated with confirmatory factor analyses. Cronbach's α and composite reliability were used as internal consistency measures, and differences between clinical and control groups were tested to assess the discriminant validity of the male and female versions. The associations with emotions and sexual functioning measures are also reported. The principal component analyses identified 5 factors in the male version: erection concerns thoughts, lack of erotic thoughts, age- and body-related thoughts, negative thoughts toward sex, and worries about partner's evaluation and failure anticipation thoughts. In the female version, 6 factors were found: sexual abuse thoughts, lack of erotic thoughts, low self-body image thoughts, failure and disengagement thoughts, sexual passivity and control, and partner's lack of affection. Confirmatory factor analysis supported the adequacy of the factor structure for men and women. Moreover, the SMQ showed a strong association with emotional response and sexual functioning, differentiating between clinical and control groups. This measure is useful to evaluate patients and design interventions focused on

  19. A Clustering-Based Automatic Transfer Function Design for Volume Visualization

    Directory of Open Access Journals (Sweden)

    Tianjin Zhang

    2016-01-01

    Full Text Available Two-dimensional transfer functions (TFs) designed based on the intensity-gradient magnitude (IGM) histogram are effective tools for the visualization and exploration of 3D volume data. However, traditional design methods usually depend on multiple rounds of trial-and-error. We propose a novel method for the automatic generation of transfer functions by performing the affinity propagation (AP) clustering algorithm on the IGM histogram. Compared with previous clustering algorithms employed in volume visualization, the AP clustering algorithm has much faster convergence and can achieve more accurate clustering results. In order to obtain meaningful clustering results, we introduce two similarity measurements: IGM similarity and spatial similarity. These two similarity measurements can effectively bring the voxels of the same tissue together and differentiate the voxels of different tissues, so that the generated TFs can assign different optical properties to different tissues. Before performing the clustering algorithm on the IGM histogram, we propose to remove noisy voxels based on the spatial information of voxels. Our method does not require users to input the number of clusters, and the classification and visualization process is automatic and efficient. Experiments on various datasets demonstrate the effectiveness of the proposed method.
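
    Affinity propagation is available in standard libraries. The sketch below clusters a stand-in for occupied IGM-histogram bins using plain Euclidean affinities; it omits the paper's IGM and spatial similarity measures and its noise-removal step:

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

# Illustrative stand-in for the IGM histogram: each row is one occupied
# 2-D bin described by (intensity, gradient magnitude).
rng = np.random.default_rng(0)
tissue_a = rng.normal([60, 10], 3, size=(40, 2))
tissue_b = rng.normal([140, 45], 5, size=(40, 2))
bins = np.vstack([tissue_a, tissue_b])

# AP needs no preset cluster count; exemplars emerge from the data.
ap = AffinityPropagation(random_state=0).fit(bins)
print("clusters:", len(ap.cluster_centers_))
print("labels:", ap.labels_[:5], "...", ap.labels_[-5:])
```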

  20. Fuzzy-Neural Automatic Daylight Control System

    Directory of Open Access Journals (Sweden)

    Grif H. Şt.

    2011-12-01

    Full Text Available The paper presents the design and tuning of a CMAC (Cerebellar Model Articulation Controller) implemented in an automatic daylight control application. After the tuning process, the authors studied the behavior of the automatic lighting control system (ALCS) in the presence of luminance disturbances. The luminance disturbances were produced by the authors in both night and day conditions. During the night conditions, the luminance disturbances were produced by turning a halogen desk lamp on and off. During the day conditions, the luminance disturbances were produced in two ways: by changes in the daylight contribution, achieved by covering and uncovering part of the office window, and by turning the halogen desk lamp on and off. During the day conditions, the luminance disturbances produced by the halogen lamp have a smaller amplitude than those produced during the night conditions. The luminance disturbance during the night conditions was a helpful tool for selecting the proper values of the learning rate of the CMAC controller, while the luminance disturbances during the day conditions were a helpful tool for demonstrating the correct setting of the CMAC controller.

  1. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm with commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase differences for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in certain coding sequences. To verify the performance experimentally, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface generating the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly and elliptically shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of automatic design by software. The proposed method provides a smart tool for realizing various functional devices and systems automatically.

  2. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed: the task of producing a consistent concrete mixture is handled by an automatic control system for the kneading-and-mixing machinery with operational automatic control of homogeneity. The theoretical underpinnings of homogeneity control are presented; they relate homogeneity to a change in the frequency of the vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during the continuous mixing of components. The following technical means for implementing automatic control were chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To characterize the quality of the automatic control, a structure flowchart with transfer functions that determine the operation of the ACS in the transient dynamic mode is offered.

  3. Bioanalytical assessment of adaptive stress responses in drinking water: A predictive tool to differentiate between micropollutants and disinfection by-products.

    Science.gov (United States)

    Hebert, Armelle; Feliers, Cedric; Lecarpentier, Caroline; Neale, Peta A; Schlichting, Rita; Thibert, Sylvie; Escher, Beate I

    2018-04-01

    Drinking water can contain low levels of micropollutants, as well as disinfection by-products (DBPs) that form from the reaction of disinfectants with organic and inorganic matter in water. Due to the complex mixture of trace chemicals in drinking water, targeted chemical analysis alone is not sufficient for monitoring. The current study aimed to apply in vitro bioassays indicative of adaptive stress responses to monitor the toxicological profiles and the formation of DBPs in three drinking water distribution systems in France. Bioanalysis was complemented with chemical analysis of forty DBPs. All water samples were active in the oxidative stress response assay, but only after considerable sample enrichment. As both micropollutants in source water and DBPs formed during treatment can contribute to the effect, the bioanalytical equivalent concentration (BEQ) approach was applied for the first time to determine the contribution of DBPs, with DBPs found to contribute between 17 and 58% of the oxidative stress response. Further, the BEQ approach was also used to assess the contribution of volatile DBPs to the observed effect, with detected volatile DBPs found to have only a minor contribution as compared to the measured effects of the non-volatile chemicals enriched by solid-phase extraction. The observed effects in the distribution systems were below any level of concern, quantifiable only at high enrichment and not different from bottled mineral water. Integrating bioanalytical tools and the BEQ mixture model for monitoring drinking water quality is an additional assurance that chemical monitoring is not overlooking any unknown chemicals or transformation products and can help to ensure chemically safe drinking water. Copyright © 2017. Published by Elsevier Ltd.

  4. ASA24 enables multiple automatically coded self-administered 24-hour recalls and food records

    Science.gov (United States)

    A freely available web-based tool for epidemiologic, interventional, behavioral, or clinical research from NCI that enables multiple automatically coded self-administered 24-hour recalls and food records.

  5. Differential Laser Doppler based Non-Contact Sensor for Dimensional Inspection with Error Propagation Evaluation

    Directory of Open Access Journals (Sweden)

    Ketsaya Vacharanukul

    2006-06-01

    Full Text Available To achieve dynamic error compensation in CNC machine tools, a non-contact laser probe capable of dimensional measurement of a workpiece while it is being machined has been developed and is presented in this paper. The measurements are automatically fed back to the machine controller for intelligent error compensation. Based on a well resolved laser Doppler technique and real-time data acquisition, the probe delivers a very promising dimensional accuracy of a few microns over a range of 100 mm. The developed optical measuring apparatus employs a differential laser Doppler arrangement allowing acquisition of information from the workpiece surface. In addition, the measurements are traceable to standards of frequency, allowing higher precision.

  6. Sequence History Update Tool

    Science.gov (United States)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for the Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is seamlessly formatted into a dynamic Web page. This tool replaces a previously tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable saving of time and effort. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the result is a more accurate archival record of the sequence commanding for MRO.

  7. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask later acts as a template for a considerable number of dies on the wafer. Defects on the initial blank substrate, and on the subsequently cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and of the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for the inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help defect avoidance tools to select appropriate job-decks to be written on the mask [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages of mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark or bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment inherent in the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
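
    To make the decision-tree step concrete, the toy sketch below trains a small tree on the kinds of per-defect features the text names (size, signal polarity in transmitted and reflected images, signal-to-noise). The feature set, values and class labels are hypothetical illustrations, not Calibre ADC's actual model.

      from sklearn.tree import DecisionTreeClassifier

      # Columns: size_nm, polarity_transmitted, polarity_reflected, snr
      X = [[120, 1, 1, 9.5], [15, 0, 1, 2.1], [300, 1, 0, 12.0], [10, 0, 0, 1.2]]
      y = ["critical", "non-critical", "critical", "false"]

      clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
      print(clf.predict([[80, 1, 1, 7.0]]))  # -> a defect classification code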

  8. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led to automatic exchange of information becoming the global standard, and the issues that remain to be solved. First, the various competing models for exchanging information, such as Double Tax Treaties (DTTs), TIEAs, FATCA or EU Directives, are described with a view to showing how they interact. Second, the so-called Rubik strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  9. Child vocalization composition as discriminant information for automatic autism detection.

    Science.gov (United States)

    Xu, Dongxin; Gilkerson, Jill; Richards, Jeffrey; Yapanel, Umit; Gray, Sharmi

    2009-01-01

    Early identification is crucial for young children with autism to access early intervention. Existing screens require a parent-report questionnaire and/or direct observation by a trained practitioner. Although an automatic tool would benefit parents, clinicians and children, there is no automatic screening tool in clinical use. This study reports a fully automatic mechanism for autism detection/screening in young children. It is a direct extension of the LENA (Language ENvironment Analysis) system, which uses speech signal processing technology to analyze and monitor a child's natural language environment and the vocalizations/speech of the child. It was discovered that child vocalization composition contains rich discriminant information for autism detection. By applying pattern recognition and machine learning approaches to child vocalization composition data, accuracy rates of 85% to 90% in cross-validation tests for autism detection were achieved at the equal-error-rate (EER) point on a data set with 34 children with autism, 30 language-delayed children and 76 typically developing children. Due to its easy and automatic procedure, it is believed that this new tool can play a significant role in childhood autism screening, especially in population-based or universal screening.
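
    As a reminder of how the equal-error-rate operating point quoted above is obtained, the sketch below reads the EER off a ROC curve; the scores and labels are synthetic stand-ins for a classifier's output, not the study's data.

      import numpy as np
      from sklearn.metrics import roc_curve

      labels = np.array([1, 1, 0, 0, 1, 0, 1, 0])   # 1 = positive class (synthetic)
      scores = np.array([0.9, 0.8, 0.3, 0.4, 0.7, 0.2, 0.6, 0.5])

      fpr, tpr, _ = roc_curve(labels, scores)
      i = np.argmin(np.abs(fpr - (1.0 - tpr)))      # point where FPR == FNR
      print("EER ~", (fpr[i] + (1.0 - tpr[i])) / 2.0)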

  10. Automatic surveillance systems

    International Nuclear Information System (INIS)

    Bruschi, R.; Pallottelli, R.

    1985-01-01

    This paper presents the study and realization of a special tool for supporting the console operator during normal or abnormal plant conditions. It has been realized by means of real-time simulation techniques and is able: a) to diagnose plant faults in real time and to allow the operator to locate the causes of the malfunctions; b) to indicate the conditions towards which the plant is evolving, whether normal or accidental

  11. Automatic associations with the sensory aspects of smoking: Positive in habitual smokers but negative in non-smokers

    OpenAIRE

    Huijding, Jorg; Jong, Peter

    2006-01-01

    To test whether pictorial stimuli that focus on the sensory aspects of smoking elicit different automatic affective associations in smokers than in non-smokers, 31 smoking and 33 non-smoking students completed a single-target IAT. Explicit attitudes were assessed using a semantic differential. Automatic affective associations were positive in smokers but negative in non-smokers. Only automatic affective associations but not self-reported attitudes were significantly correlated wit...

  12. Differential equations for dummies

    CERN Document Server

    Holzner, Steven

    2008-01-01

    The fun and easy way to understand and solve complex equations Many of the fundamental laws of physics, chemistry, biology, and economics can be formulated as differential equations. This plain-English guide explores the many applications of this mathematical tool and shows how differential equations can help us understand the world around us. Differential Equations For Dummies is the perfect companion for a college differential equations course and is an ideal supplemental resource for other calculus classes as well as science and engineering courses. It offers step-by-step techniques, practical tips, numerous exercises, and clear, concise examples to help readers improve their differential equation-solving skills and boost their test scores.

  13. Automatic charge control system for satellites

    Science.gov (United States)

    Shuman, B. M.; Cohen, H. A.

    1985-01-01

    The SCATHA and the ATS-5 and ATS-6 spacecraft provided insights into the problem of spacecraft charging at geosynchronous altitudes. The emission of low-energy neutral plasma was shown to reduce the levels of both absolute and differential charging. It is therefore appropriate to complete the transition from experimental results to the development of a system that senses the state of charge of a spacecraft and, when a predetermined threshold is reached, responds automatically to reduce it. A development program was initiated utilizing sensors comparable to the proton electrostatic analyzer, the surface potential monitor, and the transient pulse monitor that flew on SCATHA, combining their outputs through a microprocessor controller to operate a rapid-start, low-energy plasma source.

  14. CHLOE: A tool for automatic detection of peculiar galaxies

    Science.gov (United States)

    Shamir, Lior; Manning, Saundra; Wallin, John

    2014-09-01

    CHLOE is an unsupervised image-analysis learning algorithm that detects peculiar galaxies in datasets of galaxy images. The algorithm first computes a large set of numerical descriptors reflecting different aspects of the visual content, and then weights them based on the standard deviation of the values computed from the galaxy images. The weighted Euclidean distance of each galaxy image from the median is measured, and the peculiarity of each galaxy is determined from that distance.
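
    The scoring rule lends itself to a compact restatement. The sketch below assumes a precomputed matrix of numerical descriptors (one row per galaxy) and reads "weights them based on the standard deviation" as standardizing each descriptor, which is one plausible interpretation of the description above.

      import numpy as np

      def peculiarity_scores(descriptors):
          """descriptors: (n_galaxies, n_features) array of image descriptors."""
          sd = descriptors.std(axis=0)
          sd[sd == 0] = 1.0                    # guard against constant descriptors
          weighted = descriptors / sd          # weight by standard deviation
          median = np.median(weighted, axis=0)
          # Large distances from the median flag peculiar galaxies.
          return np.linalg.norm(weighted - median, axis=1)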

  15. Moving towards a Fully Automatic Knowledge Assessment Tool

    Directory of Open Access Journals (Sweden)

    Christian Gütl

    2008-03-01

    Full Text Available Information about a student's level or state of knowledge is a key aspect of efficient, personalized learning activities. E-learning systems gain such information in two ways: directly, by examining users' self-assessments and administering predefined tests, and indirectly, by making inferences from observed user behavior. However, most current solution approaches either demand excessive manpower or lack the required reliability. To overcome these problems, we have developed the e-Examiner, an assessment tool that supports the assessment process by automatically creating test items, assessing students' answers and providing feedback. In this paper, we first give an overview of a variety of computer-assisted and computer-based assessment systems and methods that support formative assessment activities. Secondly, we introduce the overall concept and architecture of the e-Examiner. Thirdly, we outline implementation details and evaluation results of our prototype implementation. Our solution approach is based on the set of statistical similarity measures defined by the ROUGE toolset for automatic summary evaluation. This paper is an extended version of the IMCL 2007 paper.
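
    To give a flavor of the ROUGE-style similarity measures mentioned above, here is a minimal ROUGE-1 F-score in plain Python; the naive whitespace tokenization and the example sentences are illustrative simplifications, not the e-Examiner's actual scoring code.

      from collections import Counter

      def rouge1_f(reference, answer):
          ref = Counter(reference.lower().split())
          ans = Counter(answer.lower().split())
          overlap = sum((ref & ans).values())   # clipped unigram matches
          if overlap == 0:
              return 0.0
          recall = overlap / sum(ref.values())
          precision = overlap / sum(ans.values())
          return 2 * precision * recall / (precision + recall)

      print(rouge1_f("the heart pumps blood", "blood is pumped by the heart"))  # 0.6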

  16. An analysis of tools for automatic software development and automatic code generation

    OpenAIRE

    Viviana Yarel Rosales-Morales; Giner Alor-Hernández; Jorge Luis García-Alcaráz; Ramón Zatarain-Cabada; María Lucía Barrón-Estrada

    2015-01-01

    Software development is an important area of software engineering, and for that reason techniques, approaches and methods have emerged that allow it to be automated. This paper presents an analysis of tools for automatic software development and automatic source code generation, in order to evaluate them and determine whether or not they satisfy a set of characteristics and functionalities in terms of quality. These characteristic...

  17. An analysis of tools for automatic software development and automatic code generation

    Directory of Open Access Journals (Sweden)

    Viviana Yarel Rosales-Morales

    2015-01-01

    Full Text Available Software development is an important area of software engineering, and for that reason techniques, approaches and methods have emerged that allow it to be automated. This paper presents an analysis of tools for automatic software development and automatic source code generation, in order to evaluate them and determine whether or not they satisfy a set of characteristics and functionalities in terms of quality. These characteristics include effectiveness, productivity, safety and satisfaction, all assessed through a qualitative and quantitative evaluation. The tools considered are 1) CASE tools, 2) frameworks and 3) integrated development environments (IDEs). The evaluation was carried out to measure not only usability, but also the support the tools provide for automatic software development and automatic source code generation. The aim of this work is to provide a methodology and a brief review of the most important works, so as to identify their main characteristics and present a comparative evaluation in qualitative and quantitative terms, providing the information a software developer needs to facilitate decision-making when considering tools that may be useful.

  18. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Image processing and analysis for the characterization of RBC aggregation have frequently been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted as a routine in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  19. Volume Ray Casting with Peak Finding and Differential Sampling

    KAUST Repository

    Knoll, A.; Hijazi, Y.; Westerteiger, R.; Schott, M.; Hansen, C.; Hagen, H.

    2009-01-01

    classification. In this paper, we introduce a method for rendering such features by explicitly solving for isovalues within the volume rendering integral. In addition, we present a sampling strategy inspired by ray differentials that automatically matches

  20. Automatic design of digital synthetic gene circuits.

    Directory of Open Access Journals (Sweden)

    Mario A Marchisio

    2011-02-01

    Full Text Available De novo computational design of synthetic gene circuits that achieve well-defined target functions is a hard task. Existing, brute-force approaches run optimization algorithms on the structure and on the kinetic parameter values of the network. However, more direct rational methods for automatic circuit design are lacking. Focusing on digital synthetic gene circuits, we developed a methodology and a corresponding tool for in silico automatic design. For a given truth table that specifies a circuit's input-output relations, our algorithm generates and ranks several possible circuit schemes without the need for any optimization. Logic behavior is reproduced by the action of regulatory factors and chemicals on the promoters and on the ribosome binding sites of biological Boolean gates. Simulations of circuits with up to four inputs show a faithful and unequivocal truth table representation, even under parametric perturbations and stochastic noise. A comparison with already implemented circuits, in addition, reveals the potential for simpler designs with the same function. Therefore, we expect the method to help both in devising new circuits and in simplifying existing solutions.
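
    The paper's generator works directly from a truth table. As a loose analogy (not the authors' algorithm), the step from a truth table to a gate-level Boolean structure can be sketched with sympy, which derives a minimal sum-of-products expression from the rows whose output is 1:

      from sympy import symbols
      from sympy.logic import SOPform

      a, b = symbols("a b")
      minterms = [[0, 1], [1, 0]]      # input rows for which the output is 1 (XOR)
      expr = SOPform([a, b], minterms)
      print(expr)                      # minimal sum-of-products, e.g. (a & ~b) | (b & ~a)

    In the gene-circuit setting, each term of such an expression would then be realized by regulatory factors acting on promoters and ribosome binding sites, as described above.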

  1. Rapid Differentiation of Haemophilus influenzae and Haemophilus haemolyticus by Use of Matrix-Assisted Laser Desorption Ionization-Time of Flight Mass Spectrometry with ClinProTools Mass Spectrum Analysis.

    Science.gov (United States)

    Chen, Jonathan H K; Cheng, Vincent C C; Wong, Chun-Pong; Wong, Sally C Y; Yam, Wing-Cheong; Yuen, Kwok-Yung

    2017-09-01

    Haemophilus influenzae is associated with severe invasive disease, while Haemophilus haemolyticus is considered part of the commensal flora of the human respiratory tract. Although the addition of a custom mass spectrum library to the matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) system can improve the identification of these two species, establishing such a custom database is technically complicated and requires a large amount of resources, which most clinical laboratories cannot afford. In this study, we developed a mass spectrum analysis model with 7 mass peak biomarkers for the identification of H. influenzae and H. haemolyticus using the ClinProTools software. We evaluated the diagnostic performance of this model using 408 H. influenzae and H. haemolyticus isolates from clinical respiratory specimens from 363 hospitalized patients and compared the identification results with those obtained with the Bruker IVD MALDI Biotyper. The IVD MALDI Biotyper identified only 86.9% of H. influenzae (311/358) and 98.0% of H. haemolyticus (49/50) clinical isolates to the species level. In comparison, the ClinProTools mass spectrum model identified 100% of H. influenzae (358/358) and H. haemolyticus (50/50) clinical strains to the species level and significantly improved the species identification rate (McNemar's test). This shows the ability of ClinProTools mass spectrum analysis to let MALDI-TOF mass spectrometry handle closely related bacterial species when the proprietary spectrum library fails. The approach should be useful for the differentiation of other closely related bacterial species. Copyright © 2017 American Society for Microbiology.

  2. Automatic interpretation and writing report of the adult waking electroencephalogram.

    Science.gov (United States)

    Shibasaki, Hiroshi; Nakamura, Masatoshi; Sugi, Takenao; Nishida, Shigeto; Nagamine, Takashi; Ikeda, Akio

    2014-06-01

    Automatic interpretation of the EEG has so far faced significant difficulties because of the large amount of spatial and temporal information contained in the EEG, the continuous fluctuation of the background activity depending on changes in the subject's vigilance and attention level, the occurrence of paroxysmal activities such as spikes and spike-and-slow-waves, the contamination of the EEG with a variety of artefacts, and the use of different recording electrodes and montages. Therefore, previous attempts at automatic EEG interpretation have focussed only on specific EEG features such as paroxysmal abnormalities, delta waves, sleep stages and artefact detection. As a result of a long-standing cooperation between clinical neurophysiologists and system engineers, we report for the first time on a comprehensive, computer-assisted, automatic interpretation of the adult waking EEG. This system analyses the background activity, intermittent abnormalities, artefacts and the level of vigilance and attention of the subject, and automatically presents its report in written form. In addition, it detects paroxysmal abnormalities and evaluates the effects of intermittent photic stimulation and hyperventilation on the EEG. The system was formed by adopting the strategy that qualified EEGers employ for systematic visual inspection. It can be used as a supplementary tool for the EEGer's visual inspection, and for educating EEG trainees and EEG technicians. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Automatic digitization. Experience of magnum 8000 in automatic digitization in EA; Digitalizacion automatica. Experiencias obtenidas durante la utilizacion del sistema magnus 8000 para la digitalizacion automatica en EA

    Energy Technology Data Exchange (ETDEWEB)

    Munoz Garcia, M.

    1995-12-31

    The paper describes the life cycle to be followed for the automatic digitization of files containing rasterised (scanned) images and their conversion into vector files (processable using CAD tools). The main characteristics of each of the five phases of the life cycle (capture, cleaning, conversion, revision and post-processing) are described. Lastly, the paper gives a comparative analysis of the results obtained using the automatic digitization process and those obtained with other, more conventional methods. (Author)

  4. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics of the automatic beta counting system developed by Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier and then coupled to an amplifier. The amplifier output is discriminated and forms the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, the scaler data and other information are punched out on a data card. The next sample to be counted is then automatically selected; the beta counter uses the same electronics as for the prior count, the only difference being the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and punching the needed data on an 80-column data card

  5. WebQuests: Tools for Differentiation

    Science.gov (United States)

    Schweizer, Heidi; Kossow, Ben

    2007-01-01

    This article features the WebQuest, an inquiry-oriented activity in which some or all of the information that learners interact with comes from resources on the Internet. WebQuests, when properly constructed, are activities, usually authentic in nature, that require the student to use Internet-based resources to deepen their understanding and…

  6. Real time automatic discriminating of ultrasonic flaws

    International Nuclear Information System (INIS)

    Suhairy Sani; Mohd Hanif Md Saad; Marzuki Mustafa; Mohd Redzwan Rosli

    2009-01-01

    This paper is concerned with the real-time automatic discrimination of flaws of two categories: i) cracks (planar defects) and ii) non-cracks (volumetric defects such as cluster porosity and slag), using pulse-echo ultrasound. The raw ultrasonic flaw signals were collected by a computerized robotic plane-scanning system over the whole of each reflector as the primary source of data. The signals were then filtered, and analyses in both the time and frequency domains were executed to obtain the selected features. The real-time feature analysis measured the number of peaks, the maximum index, the pulse duration, the rise time and the fall time. The obtained features can be used to distinguish between the flaw classes with various artificial intelligence tools such as neural networks. The proposed algorithm and the complete system were implemented in computer software developed using Microsoft Visual BASIC 6.0 (author)
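
    A sketch of the listed time-domain features, computed from a digitized A-scan envelope with NumPy/SciPy; the 10%-of-peak threshold that defines the pulse extent is an assumption for illustration, not the paper's criterion.

      import numpy as np
      from scipy.signal import find_peaks

      def flaw_features(envelope, fs):
          """envelope: rectified/smoothed A-scan amplitudes; fs: sampling rate (Hz)."""
          thresh = 0.1 * envelope.max()
          peaks, _ = find_peaks(envelope, height=thresh)
          i_max = int(np.argmax(envelope))
          above = np.nonzero(envelope >= thresh)[0]
          t0, t1 = above[0], above[-1]          # first/last threshold crossings
          return {
              "n_peaks": len(peaks),
              "max_index": i_max,
              "pulse_duration": (t1 - t0) / fs,
              "rise_time": (i_max - t0) / fs,   # threshold crossing to peak
              "fall_time": (t1 - i_max) / fs,   # peak to threshold crossing
          }

    Feature vectors of this kind are what a neural network or other classifier would then consume to separate cracks from volumetric flaws.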

  7. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. An LPC936 microcontroller is used as the master chip of the scaler, and a counter integrated with the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed for applications requiring low cost and low power consumption. The automatic scaler has already been applied in a surface contamination instrument. (authors)

  8. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  9. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    OpenAIRE

    Zhang, Jing; Lipp, Ottmar V.; Hu, Ping

    2017-01-01

    The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study was a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between-subjects design. Participants were assigned to two groups according to their performance on an emotion regulation IAT (differentiating automatic emotion control tend...

  10. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform, publicly available methodology to help develop applications that aim to improve the capability of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions that reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as Python code that accesses the simulation functions of the Siemens PSS/E planning tool. It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic events and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) as well as automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  11. Automatic assessment of average diaphragm motion trajectory from 4DCT images through machine learning.

    Science.gov (United States)

    Li, Guang; Wei, Jie; Huang, Hailiang; Gaebler, Carl Philipp; Yuan, Amy; Deasy, Joseph O

    2015-12-01

    To automatically estimate the average diaphragm motion trajectory (ADMT) based on four-dimensional computed tomography (4DCT), facilitating clinical assessment of respiratory motion and motion variation as well as retrospective motion studies, we have developed an effective motion extraction approach and a machine-learning-based algorithm. Eleven patients with 22 sets of 4DCT images (4DCT1 at simulation and 4DCT2 at treatment) were studied. After automatically segmenting the lungs, the differential volume-per-slice (dVPS) curves of the left and right lungs were calculated as a function of slice number for each phase with respect to full exhalation. After a 5-slice moving average was performed, the discrete cosine transform (DCT) was applied to analyze the dVPS curves in the frequency domain. The dimensionality of the spectral data was reduced by using the several lowest-frequency coefficients f_v that account for most of the spectral energy (Σ f_v²). The multiple linear regression (MLR) method was then applied to determine the weights of these frequencies by fitting the ground truth: the measured ADMT, represented by three pivot points of the diaphragm on each side. The 'leave-one-out' cross-validation method was employed to analyze the statistical performance of the prediction results on three image sets: 4DCT1, 4DCT2, and 4DCT1 + 4DCT2. The seven lowest frequencies in the DCT domain were found to be sufficient to approximate the patient dVPS curves (R = 91%-96% in the MLR fit). The mean error in the predicted ADMT using the leave-one-out method was 0.3 ± 1.9 mm for the left-side diaphragm and 0.0 ± 1.4 mm for the right-side diaphragm. The prediction error is lower for 4DCT2 than for 4DCT1, and lowest for 4DCT1 and 4DCT2 combined. This frequency-analysis-based machine learning technique predicts the ADMT automatically with an acceptable error (0.2 ± 1.6 mm). This volumetric approach is not affected by the presence of the lung tumors
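
    A schematic of the regression described above, assuming one smoothed dVPS curve per case and, for brevity, a single scalar diaphragm position as the target (the study fits three pivot points per side). The 5-slice smoothing and the seven retained DCT coefficients follow the text; the arrays themselves are placeholders.

      import numpy as np
      from scipy.fft import dct

      def fit_admt_weights(dvps_curves, measured_admt, n_coeffs=7):
          """dvps_curves: (n_cases, n_slices); measured_admt: (n_cases,)."""
          smooth = np.apply_along_axis(
              lambda c: np.convolve(c, np.ones(5) / 5, mode="same"), 1, dvps_curves)
          X = dct(smooth, norm="ortho", axis=1)[:, :n_coeffs]  # low-frequency terms
          X = np.column_stack([np.ones(len(X)), X])            # add intercept
          weights, *_ = np.linalg.lstsq(X, measured_admt, rcond=None)
          return weights   # multiple-linear-regression coefficients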

  12. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has arisen for tools that can aid in their design. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been...

  13. The automatic lumber planing mill

    Science.gov (United States)

    Peter Koch

    1957-01-01

    It is probable that a truly automatic planing operation could be devised if some of the variables commonly present in mill-run lumber were eliminated and the remaining variables kept under close control. This paper will deal with the more general situation faced by most lumber manufacturing plants. In other words, it will be assumed that the incoming lumber has...

  14. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  15. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
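
    The general idea of the transformation is shown below with Python's sqlite3 rather than the authors' legacy-web-application setting: the tainted value moves out of the query string and into a bound parameter, so it is parsed as data rather than SQL.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
      user_input = "alice' OR '1'='1"   # would subvert a concatenated query

      # Unsafe: conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'")
      # Safe, PREPARE-style equivalent with a bound parameter:
      rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
      print(rows)   # [] -- the payload matches no user instead of dumping the table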

  16. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range... The interactive software is also part of a computer-assisted learning program on digital photogrammetry.

  17. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic, self-contained data processing system, transportable on site, able to produce images such as A-scans and B-scans and to present the results of the inspection very quickly. It can be used for pressure vessel inspection [fr]

  18. Status of GRACE system - automatic computation of cross sections

    International Nuclear Information System (INIS)

    Fujimoto, J.; Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kato, K.; Nakazawa, N.; Kaneko, T.; Tanaka, H.

    1995-01-01

    An automated system is an essential tool for high-energy physics, and the GRACE system for tree processes makes it possible to calculate cross sections for complicated processes exactly. To check the output of the automatic system, we compare the 't Hooft-Feynman gauge with the unitary gauge, exchange the external particles, and check the independence of the result from the UV divergence parameter and from the IR divergence parameter

  19. Arraycount, an algorithm for automatic cell counting in microwell arrays

    OpenAIRE

    Kachouie, Nezamoddin N.; Kang, Lifeng; Khademhosseini, Ali

    2009-01-01

    Microscale technologies have emerged as a powerful tool for studying and manipulating biological systems and miniaturizing experiments. However, the lack of software complementing these techniques has made it difficult to apply them for many high-throughput experiments. This work establishes Arraycount, an approach to automatically count cells in microwell arrays. The procedure consists of fluorescent microscope imaging of cells that are seeded in microwells of a microarray system and then an...

  20. Recent advances in Automatic Speech Recognition for Vietnamese

    OpenAIRE

    Le , Viet-Bac; Besacier , Laurent; Seng , Sopheap; Bigi , Brigitte; Do , Thi-Ngoc-Diep

    2008-01-01

    This paper presents our recent activities for automatic speech recognition for Vietnamese. First, our text data collection and processing methods and tools are described. For language modeling, we investigate word, sub-word and also hybrid word/sub-word models. For acoustic modeling, when only limited speech data are available for Vietnamese, we propose some crosslingual acoustic modeling techniques. Furthermore, since the use of sub-word units can reduce the high out-...

  1. Exogenous (automatic) attention to emotional stimuli: a review

    OpenAIRE

    Carretié, Luis

    2014-01-01

    Current knowledge on the architecture of exogenous attention (also called automatic, bottom-up, or stimulus-driven attention, among other terms) has been mainly obtained from studies employing neutral, anodyne stimuli. Since, from an evolutionary perspective, exogenous attention can be understood as an adaptive tool for rapidly detecting salient events, reorienting processing resources to them, and enhancing processing mechanisms, emotional events (which are, by definition, salient for the in...

  2. Automatic annotation of head velocity and acceleration in Anvil

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2012-01-01

    We describe an automatic face tracker plugin for the ANVIL annotation tool. The face tracker produces data for velocity and for acceleration in two dimensions. We compare the annotations generated by the face tracking algorithm with independently made manual annotations for head movements. The annotations are a useful supplement to manual annotations and may help human annotators to quickly and reliably determine the onset of head movements and to suggest which kind of head movement is taking place.

  3. SU-E-J-272: Auto-Segmentation of Regions with Differentiating CT Numbers for Treatment Response Assessment

    International Nuclear Information System (INIS)

    Yang, C; Noid, G; Dalah, E; Paulson, E; Li, X; Gilat-Schmidt, T

    2015-01-01

    Purpose: It has been reported recently that the change of CT number (CTN) during and after radiation therapy (RT) may be used to assess RT response. The purpose of this work is to develop a tool to automatically segment regions with differentiating CTN and/or with a change of CTN in a series of CTs. Methods: A software tool was developed to identify regions with differentiating CTN using K-means clustering of CT numbers and to automatically delineate these regions using a convex hull enclosing method. Pre- and post-RT CT, PET, or MRI images acquired for sample lung and pancreatic cancer cases were used to test the software tool. The K-means clusters of CT numbers within the gross tumor volumes (GTVs), delineated based on the PET SUV (standard uptake value of fludeoxyglucose) and/or the MRI ADC (apparent diffusion coefficient) map, were analyzed. The cluster centers with higher values were considered active tumor volumes (ATVs). The convex hull contours enclosing preset clusters were used to delineate these ATVs with color-washed displays. The CTN-defined ATVs were compared with the SUV- or ADC-defined ATVs. Results: The CTN stability of the CT scanner used to acquire the CTs in this work is less than 1.5 Hounsfield Units (HU) of variation annually. The K-means cluster centers in the GTV differ by ∼20 HU, much larger than the variation due to CTN stability, for the lung cancer cases studied. The Dice coefficient between the ATVs delineated based on the convex hull enclosure of high-CTN centers and the PET-defined GTVs based on an SUV cutoff value of 2.5 was 90(±5)%. Conclusion: A software tool was developed using K-means clustering and convex hull contours to automatically segment high-CTN regions which may not be identifiable using a simple threshold method. These CTN regions overlapped reasonably with the PET- or MRI-defined GTVs
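
    The two steps named in the Methods can be sketched as follows, with assumed inputs: ctn holds the CT numbers of the voxels inside a GTV and xyz their coordinates; the choice of two clusters is illustrative, not the tool's actual setting.

      import numpy as np
      from sklearn.cluster import KMeans
      from scipy.spatial import ConvexHull

      def segment_high_ctn(ctn, xyz, n_clusters=2):
          """ctn: (n,) CT numbers in the GTV; xyz: (n, 3) voxel coordinates."""
          km = KMeans(n_clusters=n_clusters, n_init=10).fit(ctn.reshape(-1, 1))
          high = int(np.argmax(km.cluster_centers_))   # cluster with the higher CTN
          pts = xyz[km.labels_ == high]                # voxels of the candidate ATV
          return ConvexHull(pts)                       # enclosing contour of the ATV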

  4. Automatic Model-Based Generation of Parameterized Test Cases Using Data Abstraction

    NARCIS (Netherlands)

    Calamé, Jens R.; Ioustinova, Natalia; Romijn, J.M.T.; Smith, G.; van de Pol, Jan Cornelis

    2007-01-01

    Developing test suites is a costly and error-prone process. Model-based test generation tools facilitate this process by automatically generating test cases from system models. The applicability of these tools, however, depends on the size of the target systems. Here, we propose an approach to

  5. Teaching with technology: automatically receiving information from the internet and web.

    Science.gov (United States)

    Wink, Diane M

    2010-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools, social networking and social bookmarking sites, virtual worlds, and Web-based teaching and learning programs. This article presents information and tools related to automatically receiving information from the Internet and Web.

  6. Microcontroller based automatic temperature control for oyster mushroom plants

    Science.gov (United States)

    Sihombing, P.; Astuti, T. P.; Herriyance; Sitompul, D.

    2018-03-01

    In the cultivation of oyster mushrooms, special treatment is needed because oyster mushrooms are susceptible to disease. Mushroom growth will be inhibited if the temperature and humidity are not well controlled, since both affect mushroom growth. Oyster mushroom growth is usually optimal at temperatures around 22-28°C and humidity around 70-90%. This problem is often encountered in the cultivation of oyster mushrooms, so it is very important to control the temperature and humidity of the room where the mushrooms are grown. In this paper, we developed an automatic temperature monitoring tool for oyster mushroom cultivation based on the Arduino Uno microcontroller. We designed a tool that controls the temperature and humidity automatically via an Android smartphone. If the temperature in the growing room rises above 28°C, the tool automatically turns on a pump to run water in order to lower the room temperature; and if the room temperature falls below 22°C, a lamp is turned on to heat the room. In this way the temperature in the oyster mushroom room remains stable, so that the mushrooms can grow with good quality.
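
    The control rule described above reduces to a simple threshold loop. In the sketch below, read_temperature, pump and lamp are hypothetical stand-ins for the Arduino sensor and actuator interface, which the abstract does not detail:

      T_HIGH, T_LOW = 28.0, 22.0   # optimal range for oyster mushrooms (deg C)

      def control_step(read_temperature, pump, lamp):
          t = read_temperature()
          pump(on=t > T_HIGH)      # run water to cool the growing room
          lamp(on=t < T_LOW)       # heat the room when it is too cold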

  7. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  9. Automatic Migration from PARMACS to MPI in Parallel Fortran Applications

    Directory of Open Access Journals (Sweden)

    Rolf Hempel

    1999-01-01

    Full Text Available The PARMACS message passing interface has been in widespread use by application projects, especially in Europe. With the new MPI standard for message passing, many projects face the problem of replacing PARMACS with MPI. An automatic translation tool has been developed which replaces all PARMACS 6.0 calls in an application program with their corresponding MPI calls. In this paper we describe the mapping of the PARMACS programming model onto MPI. We then present some implementation details of the converter tool.

  10. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm's leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  11. Constraint Differentiation

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander; Basin, David; Viganò, Luca

    2010-01-01

    We introduce constraint differentiation, a powerful technique for reducing search when model-checking security protocols using constraint-based methods. Constraint differentiation works by eliminating certain kinds of redundancies that arise in the search space when using constraints to represent ... Our results show that constraint differentiation substantially reduces search and considerably improves the performance of OFMC, enabling its application to a wider class of problems.

  12. Differential manifolds

    CERN Document Server

    Kosinski, Antoni A

    2007-01-01

    The concepts of differential topology form the center of many mathematical disciplines such as differential geometry and Lie group theory. Differential Manifolds presents to advanced undergraduates and graduate students the systematic study of the topological structure of smooth manifolds. Author Antoni A. Kosinski, Professor Emeritus of Mathematics at Rutgers University, offers an accessible approach to both the h-cobordism theorem and the classification of differential structures on spheres. "How useful it is," noted the Bulletin of the American Mathematical Society, "to have a single, sho

  13. Introduction to differential equations

    CERN Document Server

    Taylor, Michael E

    2011-01-01

    The mathematical formulations of problems in physics, economics, biology, and other sciences are usually embodied in differential equations. The analysis of the resulting equations then provides new insight into the original problems. This book describes the tools for performing that analysis. The first chapter treats single differential equations, emphasizing linear and nonlinear first order equations, linear second order equations, and a class of nonlinear second order equations arising from Newton's laws. The first order linear theory starts with a self-contained presentation of the exponen

  14. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  15. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  16. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  17. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT also supports the automatic generation of C++ code describing the data objects in a file

  18. Automatic digitization of SMA data

    Science.gov (United States)

    Väänänen, Mika; Tanskanen, Eija

    2017-04-01

    In the 1970s and 1980s the Scandinavian Magnetometer Array produced large amounts of excellent data from over 30 stations in Norway, Sweden and Finland. 620 film reels and 20 kilometers of film have been preserved, and the longest time series produced in the campaign spans almost five years nearly uninterrupted, but the data has never seen widespread use because of the choice of medium: film is difficult to digitize efficiently. Previously, events of interest were searched for by hand, and digitization was done by projecting the film on paper and plotting it by hand. We propose a method of automatically digitizing geomagnetic data stored on film and extracting the numerical values from the digitized data. The automatic digitization process helps preserve old, valuable data that might otherwise go unused.

  19. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provides dose-response curves that become linear under the logistic transformation of Rodbard. This transformation, being applicable to radioimmunoassay in general, is useful for computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptide. A program for use in combination with the double-antibody technique was written by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double-antibody assays of insulin and C-peptide. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
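
    The logistic transformation underlying this kind of processing is today usually expressed as the four-parameter logistic (4PL) model. As a hedged illustration only (the paper's own program is not available), the sketch below fits a 4PL standard curve to hypothetical calibrator data with SciPy and inverts it to read concentrations of unknowns; the variable names and all numbers are invented for the example.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, a, b, c, d):
            # Rodbard-style four-parameter logistic: response vs. dose x.
            # a = response at zero dose, d = response at infinite dose,
            # c = dose giving half-maximal response (ED50), b = slope factor.
            return d + (a - d) / (1.0 + (x / c) ** b)

        # Hypothetical calibrator doses and bound counts (invented data).
        dose = np.array([0.5, 1, 2, 5, 10, 20, 50, 100])
        counts = np.array([9200, 8700, 7900, 6300, 4800, 3300, 1900, 1200])

        params, _ = curve_fit(four_pl, dose, counts, p0=[9500, 1.0, 10.0, 1000])
        a, b, c, d = params

        def dose_from_counts(y):
            # Invert the fitted curve to compute dose from a measured response.
            return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

        print("ED50 = %.2f, unknown at 5000 counts = %.2f" % (c, dose_from_counts(5000.0)))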

  20. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies being used in this multidisciplinary field will be part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  1. Automatic Conflict Detection on Contracts

    Science.gov (United States)

    Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo

    Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts, can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.

  2. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundaries and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...
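
    The core frequency-analysis idea, estimating row orientation and interrow spacing from the dominant peak of a 2D Fourier spectrum, can be sketched as follows. This is a minimal illustration assuming a grayscale image patch in which vine rows form a roughly periodic stripe pattern; it is not the authors' recursive Gabor-based pipeline.

        import numpy as np

        def row_orientation_and_spacing(patch, pixel_size_m):
            # 2D FFT of the zero-mean patch; periodic rows show up as a
            # symmetric pair of peaks in the magnitude spectrum.
            spec = np.abs(np.fft.fftshift(np.fft.fft2(patch - patch.mean())))
            h, w = spec.shape
            cy, cx = h // 2, w // 2
            spec[cy - 2:cy + 3, cx - 2:cx + 3] = 0.0   # suppress the DC area

            py, px = np.unravel_index(np.argmax(spec), spec.shape)
            fy = (py - cy) / h                          # cycles per pixel, vertical
            fx = (px - cx) / w                          # cycles per pixel, horizontal
            freq = np.hypot(fx, fy)                     # cycles per pixel
            spacing_m = pixel_size_m / freq             # interrow distance
            # Rows run perpendicular to the spatial-frequency vector.
            orientation_deg = np.degrees(np.arctan2(fy, fx)) + 90.0
            return orientation_deg, spacing_m

        # Synthetic check: rows every 8 pixels at 0.5 m/pixel -> 4 m spacing.
        y, x = np.mgrid[0:256, 0:256]
        patch = np.sin(2 * np.pi * y / 8.0)
        print(row_orientation_and_spacing(patch, 0.5))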

  3. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

    Abstract: Automatic tuning circuits adjust frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for an integrated circuit. It is a feedback system that compares the input phase with the output phase, and it can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter and a relaxation oscillator. On PSPICE simulation...

  4. The Mark II Automatic Diflux

    Directory of Open Access Journals (Sweden)

    Jean L Rasson

    2011-07-01

    We report here on the new realization of an automatic fluxgate theodolite able to perform unattended absolute geomagnetic declination and inclination measurements: the AUTODIF MKII. The main changes of this version compared with the former one are presented, as well as the improved specifications we now expect. We also explain the absolute orientation procedure by means of a laser beam and a corner cube, and the method for leveling the fluxgate sensor, which differs from that of a conventional DIflux theodolite.

  5. CLG for Automatic Image Segmentation

    OpenAIRE

    Christo Ananth; S.Santhana Priya; S.Manisha; T.Ezhil Jothi; M.S.Ramasubhaeswari

    2017-01-01

    This paper proposes an automatic segmentation method which effectively combines the Active Contour Model, the Live Wire method and the Graph Cut approach (CLG). The Live Wire method gives the user control over the segmentation process during execution. The Active Contour Model fits a statistical model of object shape and appearance, built during a training phase, to a new image. In the graph cut technique, each pixel is represented as a node and the distance between those nodes is rep...

  6. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of the Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; and ALGOL 68. Other chapters discuss a general-purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming languages is also presented.

  7. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
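
    As a hedged sketch of the underlying idea (not the patented system itself), the example below builds the nodal-analysis matrix of a simple RC low-pass network from a netlist-like element list and solves it symbolically for the transfer function; the netlist format and function names are invented for illustration.

        import sympy as sp

        s = sp.symbols('s')

        def mna_transfer_function(netlist, n_nodes, in_node, out_node):
            """Stamp resistors and capacitors into the nodal admittance matrix
            and return V_out/V_in for an ideal source driving in_node.
            netlist entries: ('R'|'C', value, node_a, node_b); node 0 is ground."""
            Y = sp.zeros(n_nodes, n_nodes)
            for kind, val, a, b in netlist:
                y = 1 / val if kind == 'R' else s * val
                for i, j in ((a, a), (b, b)):
                    if i:
                        Y[i - 1, j - 1] += y
                if a and b:
                    Y[a - 1, b - 1] -= y
                    Y[b - 1, a - 1] -= y
            Vin = sp.symbols('V_in')
            # Eliminate the driven node by moving its known-voltage terms
            # to the right-hand side of the remaining nodal equations.
            keep = [i for i in range(n_nodes) if i != in_node - 1]
            A = Y[keep, keep]
            rhs = sp.Matrix([-Y[i, in_node - 1] * Vin for i in keep])
            v = A.LUsolve(rhs)
            return sp.simplify(v[keep.index(out_node - 1)] / Vin)

        R, C = sp.symbols('R C', positive=True)
        # V_in -- R -- node 2 -- C -- ground (node 1 is the driven input node)
        net = [('R', R, 1, 2), ('C', C, 2, 0)]
        print(mna_transfer_function(net, 2, 1, 2))   # -> 1/(C*R*s + 1)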

  8. Automatic wipers with mist control

    OpenAIRE

    Ashik K.P; A.N.Basavaraju

    2016-01-01

    This paper illustrates automatic wipers with mist control. In modern days, accidents are most common in commercial vehicles. One of the reasons for these accidents is the formation of mist inside the vehicle due to heavy rain. In rainy seasons, the wiper on the windshield of a commercial vehicle has to be controlled by the driver himself, which distracts his concentration from driving. Also, when the rain lasts for a longer time (say about 15 minutes) the formation of mist on t...

  9. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    ...the economy. Most types of revenues (mainly personal, corporate, and social insurance taxes) are sensitive to the business cycle and account for most of... Medicare taxes for self-employed people, taxes on production and imports, and unemployment insurance taxes. Those six categories account for the bulk of... federal tax revenues. Individual taxes account for most of the automatic stabilizers from revenues, followed by Social Security plus Medicare

  10. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  11. Group Dynamics in Automatic Imitation.

    Science.gov (United States)

    Gleibs, Ilka H; Wilson, Neil; Reddy, Geetha; Catmur, Caroline

    Imitation, matching the configural body movements of another individual, plays a crucial part in social interaction. We investigated whether automatic imitation is influenced not only by whom we imitate (ingroup vs. outgroup member) but also by the nature of an expected interaction situation (competitive vs. cooperative). In line with assumptions from Social Identity Theory, we predicted that both social group membership and the expected situation impact the level of automatic imitation. We adopted a 2 (group membership target: ingroup, outgroup) x 2 (situation: cooperative, competitive) design. The dependent variable was the degree to which participants imitated the target in a reaction-time automatic imitation task. 99 female students from two British universities participated. We found a significant two-way interaction on the imitation effect. When interacting in expectation of cooperation, imitation was stronger for an ingroup target compared to an outgroup target. However, this was not the case in the competitive condition, where imitation did not differ between ingroup and outgroup targets. This demonstrates that the goal structure of an expected interaction determines the extent to which intergroup relations influence imitation, supporting a social identity approach.

  12. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Execution performance is usually an expected requirement in a software development process. Unfortunately, verification and maintenance are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  13. Signal Compression in Automatic Ultrasonic testing of Rails

    Directory of Open Access Journals (Sweden)

    Tomasz Ciszewski

    2007-01-01

    Full recording of the most important information carried by ultrasonic signals makes statistical analysis of the measurement data possible. Statistical analysis of the results gathered during automatic ultrasonic tests, combined with features of the measuring method, differential lossy coding, and traditional lossless compression methods (Huffman coding, dictionary coding), leads to a comprehensive, efficient data compression algorithm. The subject of the article is to present the algorithm and the benefits gained by using it in comparison to alternative compression methods. Storage of large amounts of data allows the creation of an electronic catalogue of ultrasonic defects. Once such a catalogue is created, training of future qualification systems on new solutions of the automat for rail testing will be possible.
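
    A minimal sketch of the compression pipeline described, differential (delta) coding followed by dictionary and Huffman coding, is shown below; zlib's DEFLATE is used here as a stand-in for the dictionary and Huffman stages, and the quantization step and sample data are invented for illustration.

        import zlib
        import numpy as np

        def compress_ultrasonic(samples, step=4):
            # Lossy stage: coarse quantization of the A-scan amplitudes.
            q = np.round(samples / step).astype(np.int16)
            # Differential coding: successive samples are highly correlated,
            # so their differences have a much narrower distribution.
            delta = np.diff(q, prepend=0)
            # Lossless stage: DEFLATE = LZ77 dictionary coding + Huffman coding.
            return zlib.compress(delta.tobytes(), level=9)

        def decompress_ultrasonic(blob, step=4):
            delta = np.frombuffer(zlib.decompress(blob), dtype=np.int16)
            return np.cumsum(delta) * step

        rng = np.random.default_rng(0)
        signal = 1000 * np.sin(np.linspace(0, 40, 8000)) + rng.normal(0, 5, 8000)
        blob = compress_ultrasonic(signal)
        print(len(blob), "compressed bytes for", signal.nbytes, "raw bytes")
        restored = decompress_ultrasonic(blob)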

  14. Automatic Recognition of Object Names in Literature

    Science.gov (United States)

    Bonnin, C.; Lesteven, S.; Derriere, S.; Oberto, A.

    2008-08-01

    SIMBAD is a database of astronomical objects that provides (among other things) their bibliographic references in a large number of journals. Currently, these references have to be entered manually by librarians who read each paper. To cope with the increasing number of papers, CDS develops a tool to assist the librarians in their work, taking advantage of the Dictionary of Nomenclature of Celestial Objects, which keeps track of object acronyms and of their origin. The program searches for object names directly in PDF documents by comparing the words with all the formats stored in the Dictionary of Nomenclature. It also searches for variable star names based on constellation names and for a large list of usual names such as Aldebaran or the Crab. Object names found in the documents often correspond to several astronomical objects. The system retrieves all possible matches, displays them with their object type given by SIMBAD, and lets the librarian make the final choice. The bibliographic reference can then be automatically added to the object identifiers in the database. Besides, the systematic usage of the Dictionary of Nomenclature, which is updated manually, made it possible to check it automatically and to detect errors and inconsistencies. Last but not least, the program collects some additional information, such as the position of the object names in the document (in the title, subtitle, abstract, table, figure caption...) and their number of occurrences. In the future, this will make it possible to calculate the 'weight' of an object in a reference and to provide SIMBAD users with an important new piece of information, which will help them find the most relevant papers in the object reference list.

  15. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  16. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    Directory of Open Access Journals (Sweden)

    Ping Hu

    2017-04-01

    The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study was a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between-subjects design. Participants were assigned to two groups according to their performance on an emotion regulation IAT (differentiating automatic emotion control tendency and automatic emotion expression tendency). Participants of the two groups were then randomly assigned to two emotion regulation priming conditions (emotion control priming or emotion expression priming). Anger was provoked by blaming participants for slow performance during a subsequent backward subtraction task. During anger provocation, the SCL of individuals with automatic emotion control tendencies in the control priming condition was lower than that of those with automatic emotion control tendencies in the expression priming condition. However, the SCL of individuals with automatic emotion expression tendencies did not differ between the automatic emotion control priming and the automatic emotion expression priming conditions. Heart rate during anger provocation was higher in individuals with automatic emotion expression tendencies than in individuals with automatic emotion control tendencies, regardless of priming condition. This pattern indicates an interactive effect of individual differences in AER and emotion regulation priming on SCL, which is an index of emotional arousal. Heart rate was only sensitive to the individual differences in AER and did not reflect this interaction. This finding has implications for clinical studies of emotion regulation strategy training, suggesting that different practices are optimal for individuals who differ in AER tendencies.

  17. Advanced differential quadrature methods

    CERN Document Server

    Zong, Zhi

    2009-01-01

    Modern Tools to Perform Numerical Differentiation. The original direct differential quadrature (DQ) method has been known to fail for problems with strong nonlinearity and material discontinuity as well as for problems involving singularity, irregularity, and multiple scales. But now researchers in applied mathematics, computational mechanics, and engineering have developed a range of innovative DQ-based methods to overcome these shortcomings. Advanced Differential Quadrature Methods explores new DQ methods and uses these methods to solve problems beyond the capabilities of the direct DQ method. After a basic introduction to the direct DQ method, the book presents a number of DQ methods, including complex DQ, triangular DQ, multi-scale DQ, variable order DQ, multi-domain DQ, and localized DQ. It also provides a mathematical compendium that summarizes Gauss elimination, the Runge-Kutta method, complex analysis, and more. The final chapter contains three codes written in the FORTRAN language, enabling readers to q...
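
    For readers unfamiliar with the direct DQ method the book builds on: the derivative of a function at each grid point is approximated as a weighted sum of the function values at all grid points, with weights obtained from Lagrange interpolation. The sketch below computes the classic first-order DQ weighting matrix on an arbitrary grid and checks it on a known function; it is an illustrative reconstruction, not code from the book (which ships FORTRAN codes).

        import numpy as np

        def dq_weights_first_order(x):
            """First-derivative DQ weights A, so f'(x_i) ~ sum_j A[i, j] f(x_j).
            Off-diagonal entries follow the explicit Lagrange-interpolation
            formula; diagonal entries make each row sum to zero."""
            n = len(x)
            # M[i] = product over k != i of (x_i - x_k)
            diff = x[:, None] - x[None, :]
            np.fill_diagonal(diff, 1.0)
            M = np.prod(diff, axis=1)
            A = np.zeros((n, n))
            for i in range(n):
                for j in range(n):
                    if i != j:
                        A[i, j] = M[i] / ((x[i] - x[j]) * M[j])
                A[i, i] = -A[i].sum()
            return A

        # Chebyshev-Gauss-Lobatto points cluster near the interval ends,
        # which keeps the interpolation (and hence the weights) well behaved.
        n = 16
        x = np.cos(np.pi * np.arange(n) / (n - 1))
        A = dq_weights_first_order(x)
        # Should be tiny (spectral accuracy): derivative of sin is cos.
        print(np.max(np.abs(A @ np.sin(x) - np.cos(x))))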

  18. Differential games

    CERN Document Server

    Friedman, Avner

    2006-01-01

    This volume lays the mathematical foundations for the theory of differential games, developing a rigorous mathematical framework with existence theorems. It begins with a precise definition of a differential game and advances to considerations of games of fixed duration, games of pursuit and evasion, the computation of saddle points, games of survival, and games with restricted phase coordinates. Final chapters cover selected topics (including capturability and games with delayed information) and N-person games.Geared toward graduate students, Differential Games will be of particular interest

  19. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus, the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variations in patient genomes and drug metabolism by detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may make it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic association analysis between variations in patient genomes and the clinical conditions of patients, i.e. their different responses to drugs. The proposed system allows: (i) automation of the DMET-SNP data analysis workflow, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and searches in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through searches in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET-Analyzer is demonstrated through different case studies regarding the analysis of

  20. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Abstract. Background: Clinical bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus, the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variations in patient genomes and drug metabolism by detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may make it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results: We developed DMET-Analyzer, a tool for the automatic association analysis between variations in patient genomes and the clinical conditions of patients, i.e. their different responses to drugs. The proposed system allows: (i) automation of the DMET-SNP data analysis workflow, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and searches in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through searches in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET-Analyzer are demonstrated through different

  1. Automatic Genre Classification of Musical Signals

    Science.gov (United States)

    Barbedo, Jayme Garcia sArnal; Lopes, Amauri

    2006-12-01

    We present a strategy to perform automatic genre classification of musical signals. The technique divides the signals into 21.3-millisecond frames, from which 4 features are extracted. The values of each feature are treated over 1-second analysis segments. Some statistical results of the features along each analysis segment are used to determine a vector of summary features that characterizes the respective segment. Next, a classification procedure uses those vectors to differentiate between genres. The classification procedure has two main characteristics: (1) a very wide and deep taxonomy, which allows a very meticulous comparison between different genres, and (2) a wide pairwise comparison of genres, which allows emphasizing the differences between each pair of genres. The procedure points out the genre that best fits the characteristics of each segment. The final classification of the signal is given by the genre that appears most often across all signal segments. The approach has shown very good accuracy even for the lowest layers of the hierarchical structure.
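
    The frame/segment feature pipeline described above can be sketched in a few lines. The sketch below uses the spectral centroid as a stand-in for the paper's four features and mean/variance as the summary statistics; the frame and segment sizes follow the abstract, everything else is an assumption.

        import numpy as np

        def summary_features(signal, sr=44100, frame_ms=21.3, segment_s=1.0):
            """Per-segment summary feature vectors: mean and variance of the
            spectral centroid of every frame inside each 1-second segment."""
            frame_len = int(sr * frame_ms / 1000.0)
            frames_per_seg = int(segment_s * 1000.0 / frame_ms)
            n_frames = len(signal) // frame_len
            freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)

            centroids = []
            for k in range(n_frames):
                frame = signal[k * frame_len:(k + 1) * frame_len]
                mag = np.abs(np.fft.rfft(frame))
                centroids.append((freqs * mag).sum() / (mag.sum() + 1e-12))
            centroids = np.array(centroids)

            vectors = []
            for k in range(0, len(centroids) - frames_per_seg + 1, frames_per_seg):
                seg = centroids[k:k + frames_per_seg]
                vectors.append([seg.mean(), seg.var()])
            return np.array(vectors)   # one row per analysis segment

        # Each row would then be fed to the pairwise genre classifiers; the
        # whole signal is labeled with the genre that wins over most segments.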

  2. Automatic synthesis of sequential control schemes

    International Nuclear Information System (INIS)

    Klein, I.

    1993-01-01

    Of all hard- and software developed for industrial control purposes, the majority is devoted to sequential, or binary valued, control and only a minor part to classical linear control. Typically, the sequential parts of the controller are invoked during startup and shut-down to bring the system into its normal operating region and into some safe standby region, respectively. Despite its importance, fairly little theoretical research has been devoted to this area, and sequential control programs are therefore still created manually without much theoretical support for a systematic approach. We propose a method to create sequential control programs automatically. The main idea is to spend some effort off-line modelling the plant, and from this model generate the control strategy, that is, the plan. The plant is modelled using action structures, thereby concentrating on the actions instead of the states of the plant. In general, the planning problem shows exponential complexity in the number of state variables. However, by focusing on the actions, we can identify problem classes as well as algorithms such that the planning complexity is reduced to polynomial complexity. We prove that these algorithms are sound, i.e., the generated solution will solve the stated problem, and complete, i.e., if the algorithms fail, then no solution exists. The algorithms generate a plan as a set of actions and a partial order on this set specifying the execution order. The generated plan is proven to be minimal and maximally parallel. For a larger class of problems we propose a method to split the original problem into a number of simple problems that can each be solved using one of the presented algorithms. It is also shown how a plan can be translated into a GRAFCET chart, and to illustrate these ideas we have implemented a planning tool, i.e., a system that is able to automatically create control schemes. Such a tool can of course also be used on-line if it is fast enough. This

  3. Automatic positioning control device for automatic control rod exchanger

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To attain accurate positioning for a control rod exchanger. Constitution: The present position of an automatic control rod exchanger is detected by a synchro generator. An aimed stopping position for the exchanger, a stop instruction range depending on the operation delay in the control system and the inertia-running distance of the mechanical system, and a coincidence confirmation range depending on the required positioning accuracy are set in advance. If there is a difference between the present position and the aimed stopping position, the automatic exchanger is caused to run toward the aimed stopping position. A stop instruction is generated upon arrival at a position within said stop instruction range, and a coincidence confirmation signal is generated upon arrival at a position within the coincidence confirmation range. Since uncertain factors, such as the operation delay in the control system and the inertia-running distance of the mechanical system, that influence the positioning accuracy are made definite by actual measurement or similar methods, and the stop instruction range and the coincidence confirmation range are set based on the measured data, the positioning accuracy can be improved. (Ikeda, J.)

  4. Stiffness and the automatic selection of ODE codes

    International Nuclear Information System (INIS)

    Shampine, L.F.

    1984-01-01

    The author describes the basic ideas behind the most popular methods for the numerical solution of ordinary differential equations (ODEs). He takes up the qualitative behavior of solutions of ODEs and its relation to the propagation of numerical error. Codes for ODEs are intended either for stiff problems or for non-stiff problems; the difference is explained. Users of codes do not have the information needed to recognize stiffness. A code, DEASY, which automatically recognizes stiffness and selects a suitable method, is described
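
    Automatic stiffness detection of this kind survives in modern solver suites. As a hedged illustration (LSODA is a descendant of this line of work, not the DEASY code described here), the sketch below lets SciPy's LSODA method switch between non-stiff (Adams) and stiff (BDF) integrators on the Van der Pol oscillator, which changes character as the stiffness parameter mu grows.

        import numpy as np
        from scipy.integrate import solve_ivp

        def van_der_pol(t, y, mu):
            # Mildly nonlinear for small mu; very stiff for large mu.
            return [y[1], mu * (1 - y[0] ** 2) * y[1] - y[0]]

        for mu in (1.0, 1000.0):
            sol = solve_ivp(van_der_pol, (0.0, 3000.0), [2.0, 0.0],
                            method="LSODA", args=(mu,), rtol=1e-6, atol=1e-8)
            # LSODA monitors the integration and switches between the Adams
            # (non-stiff) and BDF (stiff) families without user intervention.
            print(f"mu={mu:g}: {sol.nfev} RHS evaluations, success={sol.success}")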

  5. Automatic computation and solution of generalized harmonic balance equations

    Science.gov (United States)

    Peyton Jones, J. C.; Yaser, K. S. A.; Stevenson, J.

    2018-02-01

    Generalized methods are presented for generating and solving the harmonic balance equations for a broad class of nonlinear differential or difference equations and for a general set of harmonics chosen by the user. In particular, a new algorithm for automatically generating the Jacobian of the balance equations enables efficient solution of these equations using continuation methods. Efficient numeric validation techniques are also presented, and the combined algorithm is applied to the analysis of dc, fundamental, second and third harmonic response of a nonlinear automotive damper.
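
    To make the idea concrete: in its simplest single-harmonic form, harmonic balance assumes x(t) = A cos(wt) + B sin(wt), substitutes this into the equation of motion, and balances the cos and sin coefficients. The sketch below applies this to a Duffing oscillator (a common stand-in for a nonlinear damper model; all parameter values are invented) and solves the resulting algebraic system with SciPy, here using a numerically approximated Jacobian rather than the paper's automatically generated one.

        import numpy as np
        from scipy.optimize import fsolve

        # Duffing oscillator: x'' + delta*x' + alpha*x + beta*x**3 = F*cos(w*t)
        alpha, beta, delta, F = 1.0, 0.5, 0.1, 0.3

        def balance_residual(ab, w):
            A, B = ab
            r2 = A * A + B * B
            # The fundamental component of x**3 is (3/4)*r2*(A cos + B sin);
            # the generated third harmonic is truncated in this one-harmonic ansatz.
            r_cos = -w * w * A + delta * w * B + alpha * A + 0.75 * beta * r2 * A - F
            r_sin = -w * w * B - delta * w * A + alpha * B + 0.75 * beta * r2 * B
            return [r_cos, r_sin]

        # Naive continuation: sweep the frequency and reuse the previous
        # solution as the starting guess for the next one.
        ab = np.array([0.1, 0.0])
        for w in np.linspace(0.2, 2.0, 10):
            ab = fsolve(balance_residual, ab, args=(w,))
            print(f"w={w:.2f}  amplitude={np.hypot(*ab):.4f}")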

  6. Automatic Strain-Rate Controller,

    Science.gov (United States)

    1976-12-01

    Rome Air Development Center, Griffiss AFB, NY. Automatic Strain-Rate Controller (U), Dec 76, R. L. Huntsinger, J. A. Adamski. ...goes to zero. Controller: Leeds and Northrup Series 80 CAT with proportional band, rate, reset, and approach controls. Input from deviation output... (8) through (16). (8) Move the set-point slowly up to 3 or 4. (9) If the recorder pointer hunts, adjust the function controls on the Ser...

  7. Commutated automatic gain control system

    Science.gov (United States)

    Yost, S. R.

    1982-01-01

    A commutated automatic gain control (AGC) system was designed and built for a prototype Loran C receiver. The receiver uses a microcomputer to control a memory-aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The circuit designed for the AGC is described, and bench and flight test results are presented. The AGC circuit described actually samples starting at a point 40 microseconds after a zero crossing, as determined by the software lock pulse that is ultimately generated by a 30-microsecond delay-and-add network in the receiver front-end envelope detector.

  8. Automatic liquid nitrogen feeding device

    International Nuclear Information System (INIS)

    Gillardeau, J.; Bona, F.; Dejachy, G.

    1963-01-01

    An automatic liquid nitrogen feeding device has been developed and used in the framework of corrosion tests carried out with continuously renewed uranium hexafluoride. The task was to feed liquid nitrogen to a large-capacity metallic trap in order to condense uranium hexafluoride at the exit of the corrosion chambers. After a study of the various available devices, a feeding device was specifically designed to be robust, secure and autonomous, as well as to ensure a high liquid nitrogen flow rate and a high feeding frequency. The device, made of standard materials, has been used for 4000 hours without any problem. [fr]

  9. Automatic alignment of radionuclide images

    International Nuclear Information System (INIS)

    Barber, D.C.

    1982-01-01

    The variability of the position, dimensions and orientation of a radionuclide image within the field of view of a gamma camera hampers attempts to analyse the image numerically. This paper describes a method of using a set of training images of a particular type, in this case right lateral brain images, to define the likely variations in the position, dimensions and orientation for that type of image and to provide alignment data for a program that automatically aligns new images of the specified type to a standard position, size and orientation. Examples are given of the use of this method on three types of radionuclide image. (author)
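
    A minimal sketch of this kind of alignment step, translating an image so its intensity centroid sits at a standard position and rotating it by its principal axis, is given below with NumPy and SciPy; the training-based definition of likely variations from the paper is not reproduced, and the normalization target is an assumption.

        import numpy as np
        from scipy import ndimage

        def align_to_standard(img):
            """Rigidly normalize an image: centroid to the image center,
            principal intensity axis toward the horizontal (sign follows
            scipy's row/column axis convention)."""
            cy, cx = ndimage.center_of_mass(img)
            h, w = img.shape
            shifted = ndimage.shift(img, (h / 2.0 - cy, w / 2.0 - cx))

            # Orientation from the second central moments of the intensity.
            y, x = np.mgrid[0:h, 0:w]
            yc, xc = y - h / 2.0, x - w / 2.0
            m = shifted.sum()
            mxx = (shifted * xc * xc).sum() / m
            myy = (shifted * yc * yc).sum() / m
            mxy = (shifted * xc * yc).sum() / m
            angle = 0.5 * np.degrees(np.arctan2(2 * mxy, mxx - myy))
            return ndimage.rotate(shifted, angle, reshape=False)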

  10. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business-oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury Autocode in terms of a phrase-structure language, covering the source language, the target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  11. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1 where J is the number of users helping that user.

  12. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine, including running ALGOL pro

  13. Differential Geometry

    CERN Document Server

    Stoker, J J

    2011-01-01

    This classic work is now available in an unabridged paperback edition. Stoker makes this fertile branch of mathematics accessible to the nonspecialist by the use of three different notations: vector algebra and calculus, tensor calculus, and the notation devised by Cartan, which employs invariant differential forms as elements in an algebra due to Grassmann, combined with an operation called exterior differentiation. Assumed are a passing acquaintance with linear algebra and the basic elements of analysis.

  14. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    There are increasing numbers of applications that require precise camera calibration to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chessboard, which is found automatically in each image even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the algorithm proposed as a valuable tool for camera calibration.
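
    For comparison, the widely used baseline that the authors improve upon can be sketched with OpenCV; note that, unlike the method in the article, cv2.findChessboardCorners must be told the inner-corner grid size in advance, and the file names here are placeholders.

        import glob
        import numpy as np
        import cv2

        pattern = (9, 6)                      # inner corners per row, column
        # 3D reference points of the chessboard corners on the z = 0 plane.
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

        obj_points, img_points = [], []
        for name in glob.glob("calib_*.png"):  # placeholder file names
            gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                corners = cv2.cornerSubPix(
                    gray, corners, (11, 11), (-1, -1),
                    (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
                obj_points.append(objp)
                img_points.append(corners)

        # Intrinsic matrix, distortion coefficients, per-view extrinsics.
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None)
        print("RMS reprojection error:", rms)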

  15. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  16. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software package developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi (KLT) feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
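
    The Kanade-Lucas-Tomasi tracker on which DVP is based is available in OpenCV, so the core of such a marker-tracking loop can be sketched as below; the video path, initial marker positions, and window parameters are placeholders, and this is not the DVP software itself.

        import numpy as np
        import cv2

        cap = cv2.VideoCapture("underwater.mp4")   # placeholder path
        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        # Initial marker positions, e.g. clicked by the operator (placeholders).
        points = np.array([[[120.0, 340.0]], [[205.0, 410.0]]], dtype=np.float32)

        lk_params = dict(winSize=(21, 21), maxLevel=3,
                         criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                                   30, 0.01))

        trajectory = [points.reshape(-1, 2).copy()]
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Pyramidal KLT: find where each marker moved between frames.
            points, status, err = cv2.calcOpticalFlowPyrLK(
                prev_gray, gray, points, None, **lk_params)
            trajectory.append(points.reshape(-1, 2).copy())
            prev_gray = gray

        cap.release()
        print(len(trajectory), "frames tracked")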

  17. The Masculinity of Money: Automatic Stereotypes Predict Gender Differences in Estimated Salaries

    Science.gov (United States)

    Williams, Melissa J.; Paluck, Elizabeth Levy; Spencer-Rodgers, Julie

    2010-01-01

    We present the first empirical investigation of why men are assumed to earn higher salaries than women (the "salary estimation effect"). Although this phenomenon is typically attributed to conscious consideration of the national wage gap (i.e., real inequities in salary), we hypothesize instead that it reflects differential, automatic economic…

  18. Automatic Affective Appraisal of Sexual Penetration Stimuli in Women with Vaginismus or Dyspareunia

    NARCIS (Netherlands)

    Huijding, Jorg; Borg, Charmaine; Weijmar-Schultz, Willibrord; de Jong, Peter J.

    Introduction. Current psychological views are that negative appraisals of sexual stimuli lie at the core of sexual dysfunctions. It is important to differentiate between deliberate appraisals and more automatic appraisals, as research has shown that the former are most relevant to controllable

  19. Individuals with fear of blushing explicitly and automatically associate blushing with social costs

    NARCIS (Netherlands)

    Glashouwer, K.A.; de Jong, P.J.; Dijk, C.; Buwalda, F.M.

    2011-01-01

    To explain fear of blushing, it has been proposed that individuals with fear of blushing overestimate the social costs of their blushing. Current information-processing models emphasize the relevance of differentiating between more automatic and more explicit cognitions, as both types of cognitions

  20. Individuals with Fear of Blushing Explicitly and Automatically Associate Blushing with Social Costs

    NARCIS (Netherlands)

    Glashouwer, Klaske A.; de Jong, Peter J.; Dijk, Corine; Buwalda, Femke M.

    2011-01-01

    To explain fear of blushing, it has been proposed that individuals with fear of blushing overestimate the social costs of their blushing. Current information-processing models emphasize the relevance of differentiating between more automatic and more explicit cognitions, as both types of cognitions

  1. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A good number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  2. ON DIFFERENTIAL EQUATIONS, INTEGRABLE SYSTEMS, AND GEOMETRY

    OpenAIRE

    Enrique Gonzalo Reyes Garcia

    2004-01-01

    Partial differential equations appeared in the 18th century as essential tools for the analytic study of physical models and, later, they proved to be fundamental for the progress of mathematics. For example, fundamental results of modern differential geometry are based on deep theorems on differential equations. Reciprocally, it is possible to study differential equations through geometrical means, just like it was done by o...

  3. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

    An automatic handling device for steam relief valves (SRVs) has been developed in order to achieve a decrease in worker exposure, an increase in the availability factor, improved reliability, improved safety of operation, and labor saving. A survey was made during a periodical inspection to examine the actual SRV handling operation. The SRV automatic handling device consists of four components: a conveyor, an armed conveyor, a lifting machine, and a control/monitoring system. The conveyor is designed so that the existing I-rail installed in the containment vessel can be used without any modification; it is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that it can be transferred by the conveyor. The control/monitoring system consists of a control computer, an operation panel, a TV monitor and an annunciator. The SRV handling device is operated by remote control from a control room. A trial equipment was constructed and performance/function testing was carried out using actual SRVs. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  4. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It can also vary the FIA operation modes, giving the instrument the functionality of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in aqueous solution with the addition of cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput rate of 30-90 h⁻¹ and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant

  5. An automatic holographic adaptive phoropter

    Science.gov (United States)

    Amirsolaimani, Babak; Peyghambarian, N.; Schwiegerling, Jim; Bablumyan, Arkady; Savidis, Nickolaos; Peyman, Gholam

    2017-08-01

    Phoropters are the most common instrument used to detect refractive errors. During a refractive exam, lenses are flipped in front of the patient, who looks at the eye chart and tries to read the symbols. The procedure is fully dependent on the cooperation of the patient to read the eye chart, provides only a subjective measurement of visual acuity, and can at best provide a rough estimate of the patient's vision. Phoropters are difficult to use for mass screenings because they require a skilled examiner, and it is hard to screen young children, the elderly, etc. We have developed a simplified, lightweight automatic phoropter that can measure the optical error of the eye objectively without requiring the patient's input. The automatic holographic adaptive phoropter is based on a Shack-Hartmann wavefront sensor and three computer-controlled fluidic lenses. The fluidic lens system is designed to provide power and astigmatic corrections over a large range of corrections, without the need for verbal feedback from the patient, in less than 20 seconds.

  6. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.

    1978-01-01

    A remotely controlled automatic special welding machine for piping was developed. This machine is used effectively for long-distance pipelines, chemical plants, thermal power generating plants and nuclear power plants, from the viewpoint of good quality control, reduction of labor and good controllability. The function of this welding machine is to inspect the shape and dimensions of the edge preparation before welding by the sense of touch, to detect the temperature of the melt pool, inspect the bead form by the sense of touch, and check the welding state by ITV during welding, and to grind the bead surface and inspect the weld metal by ultrasonic test automatically after welding. The construction of this welding system, the main specifications of the apparatus, the welding procedure in detail, the electrical source of this welding machine, the cooling system, the structure and handling of the guide ring, the central control system and the operating characteristics are explained. The working procedure and the effect of using this welding machine, and the application to nuclear power plants and other industrial fields, are outlined. The HIDIC 08 is used as the controlling computer. This welding machine is useful for welding SUS piping as well as carbon steel piping. (Nakai, Y.)

  7. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization.

  8. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film, has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  9. Automatic referral to cardiac rehabilitation.

    Science.gov (United States)

    Fischer, Jane P

    2008-01-01

    The pervasive negative impact of cardiovascular disease in the United States is well documented. Although advances have been made, the campaign to reduce the occurrence, progression, and mortality continues. Determining evidence-based data is only half the battle. Implementing new and updated clinical guidelines into daily practice is a challenging task. Cardiac rehabilitation is an example of a proven intervention whose benefit is hindered through erratic implementation. The American Association of Cardiovascular and Pulmonary Rehabilitation (AACVPR), the American College of Cardiology (ACC), and the American Heart Association (AHA) have responded to this problem by publishing the AACVPR/ACC/AHA 2007 Performance Measures on Cardiac Rehabilitation for Referral to and Delivery of Cardiac Rehabilitation/Secondary Prevention Services. This new national guideline recommends automatic referral to cardiac rehabilitation for every eligible patient (performance measure A-1). This article offers guidance for the initiation of an automatic referral system, including individualizing your protocol with regard to electronic or paper-based order entry structures.

  10. Automatic creation of specialised multilingual dictionaries in new subject areas

    Directory of Open Access Journals (Sweden)

    Joaquim Moré

    2009-05-01

    This article presents a tool to automatically generate specialised dictionaries of multilingual equivalents in new subject areas. The tool uses resources that are available on the web to search for equivalents and verify their reliability. These resources are, on the one hand, the Wikipedias, which can be freely downloaded and processed, and, on the other, the materials that terminological institutions of reference make available. This tool is of use to teachers producing teaching materials and researchers preparing theses, articles or reference manuals. It is also of use to translators and terminologists working on terminological standardisation in a new subject area in a given language, as it helps them in their work to pinpoint concepts that have yet to receive a standardised denomination.

  11. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Taishan Medical University, Taian, Shandong (China); Washington University in St Louis, St Louis, MO (United States); Li, H. Harlod; Zhang, T; Yang, D [Washington University in St Louis, St Louis, MO (United States); Ma, F [Taishan Medical University, Taian, Shandong (China)

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, which is inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed image, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit. The goal of the optimization is to maximize the entropy of the processed image. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and significantly outperforms the basic CLAHE algorithm and the manual window-level adjustment process currently used in clinical 2D image review software tools.
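
    The pipeline described above maps onto off-the-shelf tools; the Python sketch below uses scikit-image's CLAHE and a simple entropy-maximizing grid search as a stand-in for the paper's interior-point optimizer. The parameter grids, Gaussian sigma, and test image are illustrative assumptions.

      import numpy as np
      from scipy.ndimage import gaussian_filter
      from skimage import data, exposure

      def entropy(img, bins=256):
          # Shannon entropy of the grey-level histogram
          hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
          p = hist[hist > 0] / hist.sum()
          return -np.sum(p * np.log2(p))

      img = data.camera() / 255.0                    # stand-in for a 2D setup image
      best = None
      for w in (0.3, 0.5, 0.7):                      # Gaussian high-pass weighting
          hp = np.clip(img - w * gaussian_filter(img, sigma=8) + w * img.mean(), 0, 1)
          for clip in (0.01, 0.02, 0.04):            # CLAHE clip limit
              for block in (32, 64):                 # CLAHE block (kernel) size
                  out = exposure.equalize_adapthist(hp, kernel_size=block, clip_limit=clip)
                  e = entropy(out)
                  if best is None or e > best[0]:
                      best = (e, w, clip, block)
      print("max entropy %.3f at weight=%.1f clip=%.2f block=%d" % best)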

  13. Automatic Adjustments of a Trans-oesophageal Ultrasound Robot for Monitoring Intra-operative Catheters

    Science.gov (United States)

    Wang, Shuangyi; Housden, James; Singh, Davinder; Rhode, Kawal

    2017-12-01

    3D trans-oesophageal echocardiography (TOE) has in recent years become a powerful tool for monitoring intra-operative catheters used during cardiac procedures. However, the control of the TOE probe remains a manual task; the operator has to hold the probe for long periods, sometimes in a radiation environment. To solve this problem, an add-on robotic system has been developed for holding and manipulating a commercial TOE probe. This paper focuses on automatically adjusting the probe pose in order to accurately monitor moving catheters. The positioning strategy is divided into an initialization step based on a pre-planning method and a localized adjustment step based on robotic differential kinematics and related image-servoing techniques. Both steps are described in the paper, along with simulation experiments performed to validate the concept. The results indicate an error of less than 0.5 mm for the initialization step and less than 2 mm for the localized adjustment step. As these errors are small compared to the much bigger live 3D image volume, the methods are considered promising. Future work will focus on evaluating the method in a real TOE scanning scenario.
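
    The localized-adjustment step rests on differential kinematics; the Python sketch below shows the generic resolved-rate update this implies, with an invented planar two-joint model standing in for the real TOE robot. The link length, gain, and target are hypothetical.

      import numpy as np

      LINK = 60.0                                   # invented link length (mm)

      def tip(q):
          # forward kinematics of a planar 2-joint toy probe model
          return LINK * np.array([np.cos(q[0]) + np.cos(q[0] + q[1]),
                                  np.sin(q[0]) + np.sin(q[0] + q[1])])

      def jacobian(q):
          s1, c1 = np.sin(q[0]), np.cos(q[0])
          s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
          return LINK * np.array([[-s1 - s12, -s12],
                                  [ c1 + c12,  c12]])

      q = np.array([0.3, 0.4])                      # current joint angles (rad)
      target = np.array([70.0, 80.0])               # catheter tip from image servoing
      for _ in range(50):                           # resolved-rate iterations
          err = target - tip(q)
          if np.linalg.norm(err) < 0.5:             # stop inside a 0.5 mm band
              break
          q += 0.1 * np.linalg.pinv(jacobian(q)) @ err
      print("residual error (mm):", round(float(np.linalg.norm(target - tip(q))), 3))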

  14. Automatic caption generation for news images.

    Science.gov (United States)

    Feng, Yansong; Lapata, Mirella

    2013-04-01

    This paper is concerned with the task of automatically generating captions for images, which is important for many image-related applications. Examples include video and image retrieval as well as the development of tools that aid visually impaired individuals to access pictorial information. Our approach leverages the vast resource of pictures available on the web and the fact that many of them are captioned and colocated with thematically related documents. Our model learns to create captions from a database of news articles, the pictures embedded in them, and their captions, and consists of two stages. Content selection identifies what the image and accompanying article are about, whereas surface realization determines how to verbalize the chosen content. We approximate content selection with a probabilistic image annotation model that suggests keywords for an image. The model postulates that images and their textual descriptions are generated by a shared set of latent variables (topics) and is trained on a weakly labeled dataset (which treats the captions and associated news articles as image labels). Inspired by recent work in summarization, we propose extractive and abstractive surface realization models. Experimental results show that it is viable to generate captions that are pertinent to the specific content of an image and its associated article, while permitting creativity in the description. Indeed, the output of our abstractive model compares favorably to handwritten captions and is often superior to extractive methods.

  15. Automatic Generation of Validated Specific Epitope Sets

    Directory of Open Access Journals (Sweden)

    Sebastian Carrasco Pro

    2015-01-01

    Accurate measurement of B and T cell responses is a valuable tool to study autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions, and to assist in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to have a method to select validated reference sets of epitopes that allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria and dengue) were experimentally validated by detection of T cell reactivity ex vivo in human donors. Furthermore, a web application that will potentially be implemented in the IEDB was created to allow users to generate customized epitope sets.

  16. Automatic Deduction in Dynamic Geometry using Sage

    Directory of Open Access Journals (Sweden)

    Francisco Botana

    2012-02-01

    We present a symbolic tool that provides robust algebraic methods to handle automatic deduction tasks for a dynamic geometry construction. The main prototype has been developed as two different worksheets for the open source computer algebra system Sage, corresponding to two different ways of coding a geometric construction. In one worksheet, diagrams constructed with the open source dynamic geometry system GeoGebra are accepted. In this worksheet, Groebner bases are used to either compute the equation of a geometric locus in the case of a locus construction or to determine the truth of a general geometric statement included in the GeoGebra construction as a boolean variable. In the second worksheet, locus constructions coded using the common file format for dynamic geometry developed by the Intergeo project are accepted for computation. The prototype and several examples are provided for testing. Moreover, a third Sage worksheet is presented in which a novel algorithm to eliminate extraneous parts in symbolically computed loci has been implemented. The algorithm, based on a recent work on the Groebner cover of parametric systems, identifies degenerate components and extraneous adherence points in loci, both natural byproducts of general polynomial algebraic methods. Detailed examples are discussed.
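
    The locus computations described here rest on eliminating construction parameters with Groebner bases. The following Python sketch reproduces the idea with SymPy rather than Sage; the construction, a point moving on a unit circle, is a toy example invented for illustration.

      from sympy import groebner, symbols

      c, s, x, y = symbols('c s x y')
      # Construction: tracer point (x, y) = (c, s) with constraint c^2 + s^2 = 1.
      # A lex order listing c, s first eliminates the parameters.
      G = groebner([x - c, y - s, c**2 + s**2 - 1], c, s, x, y, order='lex')
      locus = [g for g in G.exprs if not g.has(c) and not g.has(s)]
      print(locus)   # [x**2 + y**2 - 1]: the locus is the unit circle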

  17. Automatic Generation of Minimal Cut Sets

    Directory of Open Access Journals (Sweden)

    Sentot Kromodimoeljo

    2015-06-01

    A cut set is a collection of component failure modes that could lead to a system failure. Cut Set Analysis (CSA) is applied to critical systems to identify and rank system vulnerabilities at design time. Model checking tools have been used to automate the generation of minimal cut sets but are generally based on checking reachability of system failure states. This paper describes a new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT Analyser that supports the generation of multiple counterexamples. The approach enables a broader class of system failures to be analysed, by generalising from failure state formulae to failure behaviours expressed in LTL. The traditional approach to CSA using model checking requires the model or system failure to be modified, usually by hand, to eliminate already-discovered cut sets, and the model checker to be rerun at each step. By contrast, the new approach works incrementally and fully automatically, thereby removing the tedious and error-prone manual process and resulting in significantly reduced computation time. This in turn enables larger models to be checked. Two different strategies for using BT Analyser for CSA are presented. There is generally no single best strategy for model checking: their relative efficiency depends on the model and property being analysed. Comparative results are given for the A320 hydraulics case study in the Behavior Tree modelling language.
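
    Independent of BT Analyser's LTL machinery, the core notion can be illustrated with a toy Python enumeration in which already-found cut sets block their supersets, mirroring the incremental blocking idea described above. The components and failure logic are invented for the example.

      from itertools import combinations

      COMPONENTS = ["pump_A", "pump_B", "valve", "controller"]

      def system_fails(failed):
          # invented failure logic: both pumps down, or valve and controller down
          return ({"pump_A", "pump_B"} <= failed) or ({"valve", "controller"} <= failed)

      minimal_cut_sets = []
      for size in range(1, len(COMPONENTS) + 1):    # smallest sets first
          for combo in combinations(COMPONENTS, size):
              failed = set(combo)
              if any(mcs <= failed for mcs in minimal_cut_sets):
                  continue                          # a known cut set already covers this
              if system_fails(failed):
                  minimal_cut_sets.append(failed)
      print(minimal_cut_sets)   # [{'pump_A', 'pump_B'}, {'valve', 'controller'}]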

  18. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented...

  19. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test questions generation using discourse connectives. ... Journal of Computer Science and Its Application.

  20. Automatic Thermal Infrared Panoramic Imaging Sensor

    National Research Council Canada - National Science Library

    Gutin, Mikhail; Tsui, Eddy K; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-01-01

    .... Automatic detection, location, and tracking of targets outside the protected area ensures maximum protection and at the same time reduces the workload on personnel, increases reliability and confidence...

  1. AuTom: a novel automatic platform for electron tomography reconstruction

    KAUST Repository

    Han, Renmin

    2017-07-26

    We have developed a software package towards automatic electron tomography (ET): Automatic Tomography (AuTom). The presented package has the following characteristics: accurate alignment modules for marker-free datasets containing substantial biological structures; fully automatic alignment modules for datasets with fiducial markers; wide coverage of reconstruction methods, including a new iterative method based on compressed-sensing theory that suppresses the "missing wedge" effect; and multi-platform acceleration solutions that support faster iterative algebraic reconstruction. AuTom aims to achieve fully automatic alignment and reconstruction for electron tomography and has already been successful for a variety of datasets. AuTom also offers a user-friendly interface and auxiliary designs for file and workflow management, in which fiducial marker-based datasets and marker-free datasets are addressed with totally different subprocesses. With all of these features, AuTom can serve as a convenient and effective tool for processing in electron tomography.

  2. Usage of aids monitoring in automatic braking systems of modern cars

    Directory of Open Access Journals (Sweden)

    Dembitskyi V.

    2016-08-01

    Safety can be increased by installing automatic braking systems on vehicles, which monitor the traffic situation and the actions of the driver. This paper considers the advantages and disadvantages of automatic braking systems and analyzes the modern tracking tools used in them. Based on statistical accident data, the main hazards that automatic braking systems can reduce are identified. To ensure the accuracy of the information, research was conducted to determine the optimal combination of different sensors that provides an adequate perception of road conditions. The tracking system should be equipped with a combination of sensors such that, when an obstacle or danger is detected, a signal is transmitted to the information processing and decision-making system. Information from the monitoring system should include data for identifying the object, its condition, and its speed.
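
    As a toy illustration of combining several tracking sensors into a braking decision, the Python sketch below fuses weighted distance readings and triggers on a time-to-collision test. All sensor values, weights, and thresholds are invented for the example.

      def fuse(readings):
          # confidence-weighted average of per-sensor distance estimates (m)
          total = sum(w for _, w in readings)
          return sum(d * w for d, w in readings) / total

      def brake_request(distance_m, closing_speed_ms, ttc_threshold_s=1.5):
          if closing_speed_ms <= 0:
              return False                          # obstacle is not getting closer
          return distance_m / closing_speed_ms < ttc_threshold_s

      # (distance, weight) readings from radar, camera and lidar trackers
      readings = [(24.0, 0.5), (22.5, 0.3), (23.4, 0.2)]
      d = fuse(readings)
      print(round(d, 2), brake_request(d, closing_speed_ms=18.0))   # TTC ~1.3 s -> True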

  3. Automatic Tagging as a Support Strategy for Creating Knowledge Maps

    Directory of Open Access Journals (Sweden)

    Leonardo Moura De Araújo

    2017-06-01

    Graph organizers are powerful tools for both structuring and transmitting knowledge. Because of their unique characteristics, these organizers are valuable for cultural institutions, which own large amounts of information assets and need to constantly make sense of them. On the one hand, graph organizers are tools for connecting numerous chunks of data. On the other hand, because they are visual media, they offer a bird's-eye view of complexity that is digestible to the human eye. They are effective tools for information synthesis and are capable of providing valuable insights into data. Information synthesis is essential for Heritage Interpretation, since institutions depend on the constant generation of new content to preserve relevance among their audiences. While Mind Maps are simpler to structure and comprehend, Knowledge Maps pose challenges that require new methods to minimize the difficulties encountered during their assembly. This paper presents strategies based on manual and automatic tagging as an answer to this problem. In addition, we describe the results of a usability test and a qualitative analysis performed to compare the workflows employed to construct both Mind Maps and Knowledge Maps. Furthermore, we also discuss how well concepts can be communicated through the visual representation of trees and networks. Depending on the employed method, different results can be achieved because of their unique topological characteristics. Our findings suggest that automatic tagging supports and accelerates the construction of graphs.
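
    One simple realization of automatic tagging is to take each document's highest-scoring TF-IDF terms as tags. The Python sketch below, with an invented mini-corpus, shows the idea as a plausible stand-in rather than the authors' exact method.

      from sklearn.feature_extraction.text import TfidfVectorizer

      docs = [
          "baroque painting collection restoration pigments",
          "restoration of medieval manuscripts and pigments",
          "oral history archive of the harbour district",
      ]
      vec = TfidfVectorizer(stop_words="english")
      X = vec.fit_transform(docs)
      terms = vec.get_feature_names_out()

      for i in range(len(docs)):
          row = X[i].toarray().ravel()
          tags = [terms[j] for j in row.argsort()[::-1][:3]]   # top-3 terms as tags
          print("doc", i, "tags:", tags)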

  4. Differential equations a dynamical systems approach ordinary differential equations

    CERN Document Server

    Hubbard, John H

    1991-01-01

    This is a corrected third printing of the first part of the text Differential Equations: A Dynamical Systems Approach written by John Hubbard and Beverly West. The authors' main emphasis in this book is on ordinary differential equations. The book is most appropriate for upper-level undergraduate and graduate students in the fields of mathematics, engineering, and applied mathematics, as well as the life sciences, physics and economics. Traditional courses on differential equations focus on techniques leading to solutions. Yet most differential equations do not admit solutions which can be written in elementary terms. The authors have taken the view that a differential equation defines functions; the object of the theory is to understand the behavior of these functions. The tools the authors use include qualitative and numerical methods besides the traditional analytic methods. The companion software, MacMath, is designed to bring these notions to life.

  5. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.

    1980-01-01

    Antares is a 24-beam-line CO2 laser system for controlled fusion research, under construction at Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experiment shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the center of large copper mirrors, and remotely illuminated to reduce heating effects.

  6. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.

    1984-01-01

    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and long operation time required. The system presented in this paper realizes the automatic operation of the TIP system by monitoring and driving it with a process computer. This system significantly reduces the burden on customer operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives equipment by microprocessor control. The process computer contains such components as the CRT/KB unit, the printer plotter, the hard copier, and the message typers required for efficient man-machine communications. Its operation and interface properties are described

  7. Automatic Detection of Terminology Evolution

    Science.gov (United States)

    Tahmasebi, Nina

    As archives contain documents that span a long period of time, the language used to create these documents and the language used for querying the archive can differ. This difference is due to evolution in both terminology and semantics and will cause a significant number of relevant documents to be omitted. A static solution is to use query expansion based on explicit knowledge banks such as thesauri or ontologies. However, as we are able to archive resources with more varied terminology, it will be infeasible to use only explicit knowledge for this purpose. There exist few or no thesauri covering very domain-specific terminologies or slang as used in blogs etc. In this Ph.D. thesis we focus on automatically detecting terminology evolution in a completely unsupervised manner, as described in this technical paper.

  8. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector whose sensitive part is located inside the containment, and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn, one by one, onto the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr]

  9. Automatic Regulation of Wastewater Discharge

    Directory of Open Access Journals (Sweden)

    Bolea Yolanda

    2017-01-01

    Wastewater plants, mainly those with only secondary treatment, discharge water that is still too polluted to be used in any human activity. When the discharge is into the sea, most of the biological pollutants are expected to die off or almost disappear before the water reaches areas of human activity. This natural removal of bacteria, viruses and other pathogens is due to conditions such as the salinity of seawater and the effect of sunlight, and discharge areas are calculated taking these conditions into account. However, under certain meteorological phenomena, water reaches the coast without the full disappearance of pollutant elements. In the Mediterranean Sea there are periods of adverse climatic conditions that pollute the coast near the wastewater discharge. In this paper, the authors present an automatic control system that prevents such pollution episodes using two mathematical models, one for pollutant transport and the other for pollutant removal in wastewater spills.

  10. Automated Assessment in a Programming Tools Course

    Science.gov (United States)

    Fernandez Aleman, J. L.

    2011-01-01

    Automated assessment systems can be useful for both students and instructors. Ranking and immediate feedback can have a strongly positive effect on student learning. This paper presents an experience using automatic assessment in a programming tools course. The proposal aims at extending the traditional use of an online judging system with a…

  11. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  12. Particle swarm optimization applied to automatic lens design

    Science.gov (United States)

    Qin, Hua

    2011-06-01

    This paper describes a novel application of the Particle Swarm Optimization (PSO) technique to lens design. A mathematical model is constructed, and merit functions of an optical system are employed as fitness functions, combining the radii of curvature, the thicknesses between lens surfaces, and the refractive indices of the optical system. Using this function, aberration correction is carried out. A design example using PSO is given. Results show that PSO is a practical and powerful optical design tool; the method no longer depends on the initial lens structure and can arbitrarily set the search ranges of the structural parameters of a lens system, which is an important step towards automatic design with artificial intelligence.
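
    A bare-bones particle swarm optimizer is only a few lines of Python; the sketch below minimizes an invented placeholder merit function, where a real lens-design run would instead score aberrations over curvatures, thicknesses, and indices. Swarm size, coefficients, and bounds are illustrative.

      import numpy as np

      def merit(x):                                 # invented placeholder merit function
          return np.sum((x - 1.5) ** 2, axis=-1)

      rng = np.random.default_rng(0)
      n, dim, w, c1, c2 = 30, 4, 0.7, 1.5, 1.5
      pos = rng.uniform(0.0, 3.0, (n, dim))         # arbitrary search range per parameter
      vel = np.zeros((n, dim))
      pbest, pbest_val = pos.copy(), merit(pos)
      gbest = pos[pbest_val.argmin()].copy()

      for _ in range(100):
          r1, r2 = rng.random((n, dim)), rng.random((n, dim))
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos += vel
          val = merit(pos)
          improved = val < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], val[improved]
          gbest = pbest[pbest_val.argmin()].copy()
      print("best merit:", float(merit(gbest)), "at", gbest.round(3))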

  13. ASTRA - an automatic system for transport analysis in a tokamak

    International Nuclear Information System (INIS)

    Pereverzev, G.V.; Yushmanov, P.N.; Dnestrovskii, A.Yu.; Polevoi, A.R.; Tarasjan, K.N.; Zakharov, L.E.

    1991-08-01

    The set of codes described here - ASTRA (Automatic System of Transport Analysis) - is a flexible and effective tool for the study of transport mechanisms in reactor-oriented facilities of the tokamak type. Flexibility is provided within the ASTRA system by a wide choice of standard relationships, functions and subroutines representing various transport coefficients, methods of auxiliary heating and other physical processes in the tokamak plasma, as well as by the possibility of pre-setting transport equations and variables for data output in a simple and conceptually transparent form. The transport code produced by the ASTRA system provides an adequate representation of the discharges for present experimental conditions. (orig.)

  14. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  15. Automatic scoring of the severity of psoriasis scaling

    DEFF Research Database (Denmark)

    Gomez, David Delgado; Ersbøll, Bjarne Kjær; Carstensen, Jens Michael

    2004-01-01

    In this work, a combined statistical and image analysis method to automatically evaluate the severity of scaling in psoriasis lesions is proposed. The method separates the different regions of the disease in the image and scores the degree of scaling based on the properties of these areas. The pr...... with scores made by doctors. This and the fact that the obtained measures are continuous indicate the proposed method is a suitable tool to evaluate the lesion and to track the evolution of dermatological diseases....

  16. Differential forms theory and practice

    CERN Document Server

    Weintraub, Steven H

    2014-01-01

    Differential forms are utilized as a mathematical technique to help students, researchers, and engineers analyze and interpret problems where abstract spaces and structures are concerned, and when questions of shape, size, and relative positions are involved. Differential Forms has gained high recognition in the mathematical and scientific community as a powerful computational tool in solving research problems and simplifying very abstract problems through mathematical analysis on a computer. Differential Forms, 2nd Edition, is a solid resource for students and professionals needing a solid g

  17. Fourier transform infrared microspectroscopy identifies early lineage commitment in differentiating human embryonic stem cells.

    Science.gov (United States)

    Heraud, Philip; Ng, Elizabeth S; Caine, Sally; Yu, Qing C; Hirst, Claire; Mayberry, Robyn; Bruce, Amanda; Wood, Bayden R; McNaughton, Don; Stanley, Edouard G; Elefanty, Andrew G

    2010-03-01

    Human ESCs (hESCs) are a valuable tool for the study of early human development and represent a source of normal differentiated cells for pharmaceutical and biotechnology applications and ultimately for cell replacement therapies. For all applications, it will be necessary to develop assays to validate the efficacy of hESC differentiation. We explored the capacity for FTIR spectroscopy, a technique that rapidly characterises cellular macromolecular composition, to discriminate mesendoderm or ectoderm committed cells from undifferentiated hESCs. Distinct infrared spectroscopic "signatures" readily distinguished hESCs from these early differentiated progeny, with bioinformatic models able to correctly classify over 97% of spectra. These data identify a role for FTIR spectroscopy as a new modality to complement conventional analyses of hESCs and their derivatives. FTIR spectroscopy has the potential to provide low-cost, automatable measurements for the quality control of stem and differentiated cells to be used in industry and regenerative medicine.

  18. Comparison of automatic and visual methods used for image segmentation in Endodontics: a microCT study.

    Science.gov (United States)

    Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz

    2017-01-01

    To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required; the thresholds can be determined by visual or automatic methods. Visual determination is influenced by the operator's visual acuity, while the automatic method is done entirely by computer algorithms. The aim was to compare visual and automatic segmentation, and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between the visual and automatic segmentation methods regarding root canal volume (p=0.93) and root canal surface area (p=0.79). Although both visual and automatic segmentation methods can be used to determine the threshold and calculate root canal volume and surface area, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.
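
    Automatic threshold selection of the kind performed by the scanner software's "Automatic Threshold Tool" can be imitated with Otsu's method. The Python sketch below applies scikit-image's implementation to a synthetic volume; the grey levels and voxel size are assumptions, not the study's data.

      import numpy as np
      from skimage.filters import threshold_otsu

      rng = np.random.default_rng(1)
      vol = rng.normal(60, 10, (100, 100, 100))     # background grey values
      vol[40:60, 40:60, 40:60] = rng.normal(140, 10, (20, 20, 20))  # "canal" voxels

      t = threshold_otsu(vol)                       # automatic threshold selection
      canal = vol > t
      voxel_mm3 = 0.014 ** 3                        # assumed isotropic voxel size (mm)
      print("threshold %.1f, volume %.4f mm^3" % (t, canal.sum() * voxel_mm3))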

  19. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Automation has brought many revolutions to existing technologies. One such technology is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis, and using it in an automatic manner reduces manpower. The researchers believe an automatic shrimp feeding system may help solve the problems of manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer, set to intervals preferred by the user, that undergoes a continuous process. A magnetic contactor, connected to the 10-hour timer, acts as a switch controlling the activation or termination of the electrical loads; the system is powered by a solar panel, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  20. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore, the invention enables automatic marking of the films in radiographic inspection, identifying the test piece and the part of it where testing took place. (RW) [de]

  1. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters entitled: the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  2. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  3. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors) [fr

  4. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  5. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  6. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Routine use of quantitative three-dimensional analysis of material microstructure by, in particular, focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice or the quality of manual and automatic...... segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to capture the phase boundaries. Vector fields derived from...
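
    scikit-image ships a morphological variant of the Chan-Vese level-set evolution that conveys the flavour of such PDE-driven segmentation. The Python sketch below runs it on a single 2D test image, whereas the framework above targets full 3D FIB volumes; iteration count and smoothing are illustrative.

      from skimage import data, img_as_float
      from skimage.segmentation import morphological_chan_vese

      image = img_as_float(data.camera())           # stand-in for one image slice
      # evolve an implicit contour for 35 iterations from a checkerboard init
      seg = morphological_chan_vese(image, 35, init_level_set="checkerboard",
                                    smoothing=3)
      print(seg.shape, seg.mean())                  # binary phase map of the slice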

  7. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink

    2017-01-01

    High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient, and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
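
    A minimal ripple detector in this spirit band-passes one virtual-sensor trace to 80-250 Hz and thresholds its analytic envelope. In the Python sketch below the sampling rate, injected ripple, and threshold rule are all illustrative assumptions, not the paper's optimized algorithm.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      fs = 1200.0                                   # assumed sampling rate (Hz)
      t = np.arange(0, 10, 1 / fs)
      x = np.random.default_rng(2).normal(0, 1, t.size)
      x[6000:6120] += 4 * np.sin(2 * np.pi * 140 * t[6000:6120])  # injected ripple

      b, a = butter(4, [80, 250], btype="bandpass", fs=fs)
      env = np.abs(hilbert(filtfilt(b, a, x)))      # analytic amplitude envelope
      mad = np.median(np.abs(env - np.median(env)))
      events = env > np.median(env) + 5 * mad       # robust amplitude threshold
      print("supra-threshold samples:", int(events.sum()))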

  8. Automatic Atrial Fibrillation Detection: A Novel Approach Using Discrete Wavelet Transform and Heart Rate Variabilit

    DEFF Research Database (Denmark)

    Bruun, Iben H.; Hissabu, Semira M. S.; Poulsen, Erik S.

    2017-01-01

    be used as a screening tool for patients suspected to have AF. The method includes an automatic peak detection prior to the feature extraction, as well as a noise cancellation technique followed by a bagged-tree classification. Simulation studies on the MIT-BIH Atrial Fibrillation database were performed...

  9. Differential discriminator

    International Nuclear Information System (INIS)

    Dukhanov, V.I.; Mazurov, I.B.

    1981-01-01

    A principal flowsheet of a differential discriminator intended for operation in a spectrometric circuit with statistical time distribution of pulses is described. The differential discriminator includes four integrated discriminators and a channel of piled-up signal rejection. The presence of the rejection channel enables the discriminator to operate effectively at loads of 14×10³ pulse/s. The temperature instability of the discrimination thresholds equals 250 μV/°C. The discrimination level changes within 0.1-5 V, the level shift constitutes 0.5% for the filling ratio of 1:10. The rejection coefficient is not less than 90%. The alpha spectrum of the ²²⁸Th source is presented to evaluate the discriminator operation with the rejector. The rejector provides 50 ns time resolution.

  10. Differential topology

    CERN Document Server

    Margalef-Roig, J

    1992-01-01

    ...there are reasons enough to warrant a coherent treatment of the main body of differential topology in the realm of Banach manifolds, which is at the same time correct and complete. This book fills the gap: whenever possible the manifolds treated are Banach manifolds with corners. Corners add to the complications and the authors have carefully fathomed the validity of all main results at corners. Even in finite dimensions some results at corners are more complete and better thought out here than elsewhere in the literature. The proofs are correct and with all details. I see this book as a reliable monograph of a well-defined subject; the possibility to fall back to it adds to the feeling of security when climbing in the more dangerous realms of infinite dimensional differential geometry. Peter W. Michor

  11. Differential belongings

    DEFF Research Database (Denmark)

    Oldrup, Helene

    2014-01-01

    This paper explores suburban middle-class residents’ narratives about housing choice, everyday life and belonging in residential areas of Greater Copenhagen, Denmark, to understand how residential processes of social differentiation are constituted. Using Savage et al.’s concepts of discursive...... and not only to the area itself. In addition, rather than seeing suburban residential areas as homogenous, greater attention should be paid to differences within such areas....

  12. Partial differential equations

    CERN Document Server

    Sloan, D; Süli, E

    2001-01-01

    Over the second half of the 20th century the subject area loosely referred to as numerical analysis of partial differential equations (PDEs) has undergone unprecedented development. At its practical end, the vigorous growth and steady diversification of the field were stimulated by the demand for accurate and reliable tools for computational modelling in physical sciences and engineering, and by the rapid development of computer hardware and architecture. At the more theoretical end, the analytical insight in

  13. An FMS Dynamic Production Scheduling Algorithm Considering Cutting Tool Failure and Cutting Tool Life

    International Nuclear Information System (INIS)

    Setiawan, A; Wangsaputra, R; Halim, A H; Martawirya, Y Y

    2016-01-01

    This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to the unavailability of cutting tools, caused either by cutting tool failure or by reaching the tool life limit. The FMS consists of parallel identical machines integrated with an automatic material handling system, and it runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. Each job usually takes two stages. Each stage has sequential operations allocated to machines considering the cutting tool life. In a real situation, a cutting tool can fail before its life limit is reached. The objective of this paper is to develop a dynamic scheduling algorithm for when a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps: the first step generates the initial schedule; the second step determines the cutting tool failure time; the third step determines the system status at the cutting tool failure time; and the fourth step reschedules the unfinished jobs. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules differ from the initial schedule in the starting and completion times of each operation. (paper)
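
    The fourth step of such an algorithm can be caricatured in a few lines of Python: when a tool copy fails, pending operations needing that tool type are reassigned to machines with enough remaining tool life. Machines, tools, and durations below are invented for the example.

      def reschedule(pending_ops, tool_life, failed):
          # pending_ops: (name, tool_type, minutes); tool_life: machine -> {tool: min}
          machine_f, tool_f = failed
          tool_life[machine_f][tool_f] = 0          # the broken tool copy is unusable
          plan = []
          for name, tool, minutes in pending_ops:
              # choose the machine whose copy of this tool has most remaining life
              machine = max(tool_life, key=lambda m: tool_life[m].get(tool, 0))
              if tool_life[machine].get(tool, 0) < minutes:
                  plan.append((name, None))         # no copy can finish this operation
              else:
                  tool_life[machine][tool] -= minutes
                  plan.append((name, machine))
          return plan

      life = {"M1": {"drill": 30, "mill": 50}, "M2": {"drill": 80, "mill": 10}}
      ops = [("op1", "drill", 25), ("op2", "mill", 40), ("op3", "drill", 20)]
      print(reschedule(ops, life, failed=("M2", "drill")))
      # -> [('op1', 'M1'), ('op2', 'M1'), ('op3', None)]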

  14. Automatic affective appraisal of sexual penetration stimuli in women with vaginismus or dyspareunia.

    Science.gov (United States)

    Huijding, Jorg; Borg, Charmaine; Weijmar-Schultz, Willibrord; de Jong, Peter J

    2011-03-01

    Current psychological views are that negative appraisals of sexual stimuli lie at the core of sexual dysfunctions. It is important to differentiate between deliberate appraisals and more automatic appraisals, as research has shown that the former are most relevant to controllable behaviors, and the latter are most relevant to reflexive behaviors. Accordingly, it can be hypothesized that in women with vaginismus, the persistent difficulty to allow vaginal entry is due to global negative automatic affective appraisals that trigger reflexive pelvic floor muscle contraction at the prospect of penetration. To test whether sexual penetration pictures elicited global negative automatic affective appraisals in women with vaginismus or dyspareunia and to examine whether deliberate appraisals and automatic appraisals differed between the two patient groups. Women with persistent vaginismus (N = 24), dyspareunia (N = 23), or no sexual complaints (N = 30) completed a pictorial Extrinsic Affective Simon Task (EAST), and then made a global affective assessment of the EAST stimuli using visual analogue scales (VAS). The EAST assessed global automatic affective appraisals of sexual penetration stimuli, while the VAS assessed global deliberate affective appraisals of these stimuli. Automatic affective appraisals of sexual penetration stimuli tended to be positive, independent of the presence of sexual complaints. Deliberate appraisals of the same stimuli were significantly more negative in the women with vaginismus than in the dyspareunia group and control group, while the latter two groups did not differ in their appraisals. Unexpectedly, deliberate appraisals seemed to be most important in vaginismus, whereas dyspareunia did not seem to implicate negative deliberate or automatic affective appraisals. These findings dispute the view that global automatic affect lies at the core of vaginismus and indicate that a useful element in therapeutic interventions may be the modification of

  15. Automatic calibration of gamma spectrometers

    International Nuclear Information System (INIS)

    Tluchor, D.; Jiranek, V.

    1989-01-01

    The principle is described of energy calibration of the spectrometric path based on the measurement of the standard of one radionuclide or a set of them. The entire computer-aided process is divided into three main steps, viz.: the insertion of the calibration standard by the operator; the start of the calibration program; energy calibration by the computer. The program was selected such that the spectrum identification should not depend on adjustment of the digital or analog elements of the gamma spectrometric measuring path. The ECL program for automatic energy calibration is described, as are its control, the organization of the data file ECL.DAT, and the necessary hardware support. The computer-multichannel analyzer communication was provided using an interface pair of Canberra 8673V and Canberra 8573 operating in the RS-422 standard. All subroutines for communication with the multichannel analyzer were written in MACRO 11, while the main program and the other subroutines were written in FORTRAN-77. (E.J.). 1 tab., 4 refs

  16. Automatic locking orthotic knee device

    Science.gov (United States)

    Weddendorf, Bruce C. (Inventor)

    1993-01-01

    An articulated tang-in-clevis joint for incorporation in newly manufactured conventional strap-on orthotic knee devices, or for replacing such joints in conventional strap-on orthotic knee devices, is discussed. The instant tang-in-clevis joint allows the user the freedom to extend and bend the knee normally when no load (weight) is applied to the knee, and automatically locks the knee when the user transfers weight to it, thus preventing a damaged knee from bending uncontrollably when weight is applied. The tang-in-clevis joint of the present invention includes first and second clevis plates, a tang assembly, and a spacer plate secured between the clevis plates. Each clevis plate includes a bevelled serrated upper section. A bevelled shoe is secured to the tang in close proximity to the bevelled serrated upper section of the clevis plates. A coiled spring mounted within an oblong bore of the tang normally urges the shoes secured to the tang out of engagement with the serrated upper section of each clevis plate to allow rotation of the tang relative to the clevis plates. When weight is applied to the joint, the load compresses the coiled spring, and the serrations on each clevis plate dig into the bevelled shoes secured to the tang to prevent relative movement between the tang and clevis plates. A shoulder is provided on the tang and the spacer plate to prevent overextension of the joint.

  17. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Liquid nitrogen is one of the major substances used as a chiller in industries such as ice cream factories, milk dairies, blood sample storage, and blood banks. It helps to maintain the required product at a lower temperature for preservation purposes. We cannot fully utilise the LN2: practically, if we use 3.75 litres of LN2 in a single day, then around 12% of the LN2 (450 ml) is wasted due to vaporisation. A pressure relief valve is provided to create a pressure difference. If there is no pressure difference between the cylinder carrying LN2 and its surroundings, it will result in damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is carried out manually, so care must be taken during the transmission of LN2 in order to avoid wastage. With this project concept, the transmission of LN2 will be carried out automatically, reducing the wastage of LN2 that occurs with manual operation.

  18. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the assessment of lesions. Currently, those algorithms can only handle single erythema or only deal with scaling segmentation. In practice, scaling and erythema are often mixed together. In order to segment the whole lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied during imaging, exploiting the skin's Tyndall effect, to eliminate reflections, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to extract textural and color features. In this step, an image roughness feature has been defined so that scaling can be easily separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. This algorithm can give reliable segmentation results even when images have different lighting conditions and skin types. On the data set offered by Union Hospital, more than 90% of images can be segmented accurately.
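
    The final classification stage maps naturally onto scikit-learn. The Python sketch below trains a random forest on synthetic colour-plus-roughness features for the three classes named above; the feature layout, class means, and labels are invented stand-ins for the paper's descriptors.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(3)
      # invented 4-D features: L, a, b colour channels plus a "roughness" measure
      skin   = rng.normal([65, 15, 18, 0.1], 3, (200, 4))
      eryth  = rng.normal([50, 35, 20, 0.2], 3, (200, 4))
      scale_ = rng.normal([80,  8, 12, 0.8], 3, (200, 4))
      X = np.vstack([skin, eryth, scale_])
      y = np.repeat([0, 1, 2], 200)                 # 0=skin, 1=erythema, 2=scaling

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print(clf.predict([[78, 9, 11, 0.75]]))       # bright, rough pixel -> [2]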

  19. Automatic Road Pavement Assessment with Image Processing: Review and Comparison

    Directory of Open Access Journals (Sweden)

    Sylvie Chambon

    2011-01-01

    In the field of noninvasive sensing techniques for civil infrastructure monitoring, this paper addresses the problem of crack detection in the surface of French national roads by automatic analysis of optical images. The first contribution is a state of the art of the image-processing tools applied to civil engineering. The second contribution concerns fine-defect detection in the pavement surface. The approach is based on multi-scale extraction and Markovian segmentation. Third, an evaluation and comparison protocol designed for evaluating this difficult task, road pavement crack detection, is introduced. Finally, the proposed method is validated, analysed, and compared to a detection approach based on morphological tools.
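
    A classical morphological baseline for fine dark defects is the black top-hat transform. The Python sketch below applies it to a synthetic pavement patch as a simple stand-in for the paper's multi-scale Markovian approach; image statistics and the threshold are invented.

      import numpy as np
      from skimage.morphology import black_tophat, disk

      rng = np.random.default_rng(4)
      pavement = rng.normal(0.6, 0.05, (200, 200))  # textured grey background
      pavement[100, 20:180] -= 0.25                 # a thin dark "crack"

      enhanced = black_tophat(pavement, disk(5))    # dark details on bright ground
      crack_mask = enhanced > 0.15                  # simple global threshold
      print("crack pixels:", int(crack_mask.sum()))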

  20. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the basal ganglia (BG) in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.