WorldWideScience

Sample records for automatic differentiation tools

  1. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic Differentiation (AD) is a relatively recent technique for differentiating functions that is applied directly to the source code that computes the function, written in standard programming languages. The technique automates the differentiation step, which is crucial for dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of Automatic Differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code, and the incorporation of AD tools into consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
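
    A minimal forward-mode sketch in Python (an illustration of the principle, not the paper's code): each value carries its derivative, every overloaded operation applies the corresponding rule of calculus, and the result is exact to roundoff.

        import math

        class Dual:
            """Value plus derivative; overloaded arithmetic applies calculus rules."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.val + other.val, self.der + other.der)
            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                # product rule: (uv)' = u'v + uv'
                return Dual(self.val * other.val,
                            self.der * other.val + self.val * other.der)
            __rmul__ = __mul__

        def sin(x):
            # chain rule for an elementary function
            return Dual(math.sin(x.val), math.cos(x.val) * x.der)

        x = Dual(2.0, 1.0)          # seed dx/dx = 1
        f = x * sin(x)              # f(x) = x sin(x)
        print(f.val, f.der)         # f'(2) = sin(2) + 2 cos(2), exact to roundoff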

  2. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  3. Automatic differentiation as a tool in engineering design

    Science.gov (United States)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
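
    The forward/reverse distinction can be made concrete with a toy reverse-mode "tape" in Python (a sketch of the general idea, not ADIFOR or the tools assessed here): forward-mode cost grows with the number of inputs, whereas one reverse sweep yields all partials of a single output.

        class Var:
            def __init__(self, val):
                self.val, self.grad, self._parents = val, 0.0, []

            def __add__(self, other):
                out = Var(self.val + other.val)
                out._parents = [(self, 1.0), (other, 1.0)]
                return out

            def __mul__(self, other):
                out = Var(self.val * other.val)
                # local partials: d(uv)/du = v, d(uv)/dv = u
                out._parents = [(self, other.val), (other, self.val)]
                return out

            def backward(self, seed=1.0):
                # accumulate the adjoint, then push it to the parents
                # (a real tape would traverse in topological order)
                self.grad += seed
                for parent, local in self._parents:
                    parent.backward(seed * local)

        x, y = Var(3.0), Var(4.0)
        z = x * y + x            # z = xy + x
        z.backward()             # one reverse sweep
        print(x.grad, y.grad)    # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0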

  4. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms.

  5. Higher-order automatic differentiation of mathematical functions

    Science.gov (United States)

    Charpentier, Isabelle; Dal Cappello, Claude

    2015-04-01

    Functions of mathematical physics such as the Bessel functions, the Chebyshev polynomials, the Gauss hypergeometric function, and so forth have practical applications in many scientific domains. On the one hand, the differentiation formulas provided in reference books apply to real or complex variables; they do not account for the chain rule. On the other hand, automatic differentiation, which is based on the chain rule, has become a natural tool in numerical modeling. Nevertheless, automatic differentiation tools do not handle many of these mathematical functions. This paper describes formulas and provides codes for the higher-order automatic differentiation of mathematical functions. The first method is based on Faà di Bruno's formula, which generalizes the chain rule. The second makes use of the second-order differential equations these functions satisfy. Both methods are exemplified with the aforementioned functions.
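
    A hedged Python sketch of the recurrence idea behind higher-order AD via truncated Taylor arithmetic (a simplified illustration; the paper's formulas for Bessel, Chebyshev, and hypergeometric functions are more involved). Coefficient k of a series stores f^(k)(x0)/k!.

        import math

        def taylor_exp(g):
            # f = exp(g):  k*f_k = sum_{j=1..k} j*g_j*f_{k-j}
            f = [0.0] * len(g)
            f[0] = math.exp(g[0])
            for k in range(1, len(g)):
                f[k] = sum(j * g[j] * f[k - j] for j in range(1, k + 1)) / k
            return f

        def taylor_sin_cos(g):
            # s = sin(g), c = cos(g) from the joint recurrences s' = g'c, c' = -g's
            K = len(g)
            s, c = [0.0] * K, [0.0] * K
            s[0], c[0] = math.sin(g[0]), math.cos(g[0])
            for k in range(1, K):
                s[k] = sum(j * g[j] * c[k - j] for j in range(1, k + 1)) / k
                c[k] = -sum(j * g[j] * s[k - j] for j in range(1, k + 1)) / k
            return s, c

        x0, K = 0.5, 6
        x = [x0, 1.0] + [0.0] * (K - 2)     # the independent variable as a series
        s, _ = taylor_sin_cos(x)
        f = taylor_exp(s)                   # f(x) = exp(sin(x))
        print([math.factorial(k) * f[k] for k in range(K)])   # f^(k)(x0), k = 0..5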

  6. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Statistical learning has been attracting growing interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: automatic differentiation tools, which quickly build DAGs of computation that are fully differentiable (we shall focus on one such tool, PyTorch); easy deployment of trained neural networks into large systems with many constraints, for example deploying a model at the reconstruction phase, where the neural network has to be integrated into CERN's bulk data-processing, C++-only environment; and some recent models in deep learning for segmentation and generation that might be useful for particle-physics problems.
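
    A minimal PyTorch fragment illustrating the point about differentiable computation DAGs (an example in the spirit of the talk, not taken from it):

        import torch

        x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
        y = (x ** 2).sum()      # the forward pass records a DAG of operations
        y.backward()            # reverse-mode AD through the recorded graph
        print(x.grad)           # tensor([2., 4., 6.]) == dy/dx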

  7. Automatic differentiation for gradient-based optimization of radiatively heated microelectronics manufacturing equipment

    Energy Technology Data Exchange (ETDEWEB)

    Moen, C.D.; Spence, P.A.; Meza, J.C.; Plantenga, T.D.

    1996-12-31

    Automatic differentiation is applied to the optimal design of microelectronic manufacturing equipment. The performance of nonlinear least-squares optimization methods is compared between numerical and analytical gradient approaches. The optimization calculations are performed by running large finite-element codes in an object-oriented optimization environment. The ADIFOR automatic differentiation tool is used to generate analytic derivatives for the finite-element codes. The performance results support previous observations that automatic differentiation becomes beneficial as the number of optimization parameters increases. The increase in speed relative to numerical differences approaches a limiting value; results are reported for two different analysis codes.

  8. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  9. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are found automatically. The method is illustrated by simple examples. Source code in FORTRAN is provided.

  10. Automatic differentiation in geophysical inverse problems

    Science.gov (United States)

    Sambridge, M.; Rickwood, P.; Rawlinson, N.; Sommacal, S.

    2007-07-01

    Automatic differentiation (AD) is the technique whereby output variables of a computer code evaluating any complicated function (e.g. the solution to a differential equation) can be differentiated with respect to the input variables. Often AD tools take the form of source-to-source translators and produce computer code without the need for deriving and hand-coding explicit mathematical formulae. The power of AD lies in the fact that it combines the generality of finite-difference techniques with the accuracy and efficiency of analytical derivatives, while at the same time eliminating `human' coding errors. It also provides the possibility of accurate, efficient derivative calculation from complex `forward' codes where no analytical derivatives are possible and finite-difference techniques are too cumbersome. AD is already having a major impact in areas such as optimization, meteorology and oceanography. Similarly, it has considerable potential for use in non-linear inverse problems in geophysics where linearization is desirable, or for sensitivity analysis of large numerical simulation codes, for example, wave propagation and geodynamic modelling. At present, however, AD tools appear to be little used in the geosciences. Here we report on experiments using a state-of-the-art AD tool to perform source-to-source code translation in a range of geoscience problems. These include calculating derivatives for Gibbs free-energy minimization, seismic receiver function inversion, and seismic ray tracing. Issues of accuracy and efficiency are discussed.

  11. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One important task is determining the number of clusters without user involvement, which is known as automatic clustering. This study addresses acquiring the cluster number automatically using forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm, and the results were compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than, or competitive with, other existing automatic clustering algorithms.

  12. FORSIM-6, Automatic Solution of Coupled Differential Equation System

    International Nuclear Information System (INIS)

    Carver, M.B.; Stewart, D.G.; Blair, J.M.; Selander, W.N.

    1983-01-01

    1 - Description of problem or function: The FORSIM program is a versatile package which automates the solution of coupled differential equation systems. The independent variables are time and up to three space coordinates, and the equations may be any mixture of partial and/or ordinary differential equations. The philosophy of the program is to provide a tool which will solve a system of differential equations for a user who has basic but unspecialized knowledge of numerical analysis and FORTRAN. The equations to be solved, together with the initial conditions and any special instructions, may be specified by the user in a single FORTRAN subroutine, although a number of routines may be written if this is more suitable. These are then loaded with the control routines, which perform the solution and any requested input and output. 2 - Method of solution: Partial differential equations are automatically converted into sets of coupled ordinary differential equations by variable-order discretization in the spatial dimensions. These and other ordinary differential equations are integrated continuously in time using efficient variable-order, variable-step, error-controlled algorithms.
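
    The approach — discretize in space so the PDE becomes coupled ODEs, then integrate with a variable-step, error-controlled solver — is the classical method of lines. A hedged sketch of the same idea in Python with SciPy (FORSIM itself is a FORTRAN package):

        import numpy as np
        from scipy.integrate import solve_ivp

        N, L = 50, 1.0
        x = np.linspace(0.0, L, N)
        dx = x[1] - x[0]

        def heat_rhs(t, u):
            # u_t = u_xx with u = 0 at both ends, central differences in space
            du = np.zeros_like(u)
            du[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
            return du

        u0 = np.sin(np.pi * x)                     # initial condition
        sol = solve_ivp(heat_rhs, (0.0, 0.1), u0, method="LSODA", rtol=1e-8)
        print(sol.y[:, -1].max())                  # ≈ exp(-pi**2 * 0.1) ≈ 0.37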

  13. Operator overloading as an enabling technology for automatic differentiation

    International Nuclear Information System (INIS)

    Corliss, G.F.; Griewank, A.

    1993-01-01

    We present an example of the science that is enabled by object-oriented programming techniques. Scientific computation often needs derivatives for solving nonlinear systems such as those arising in many PDE algorithms, optimization, parameter identification, stiff ordinary differential equations, or sensitivity analysis. Automatic differentiation computes derivatives accurately and efficiently by applying the chain rule to each arithmetic operation or elementary function. Operator overloading enables the techniques of either the forward or the reverse mode of automatic differentiation to be applied to real-world scientific problems. We illustrate automatic differentiation with an example drawn from a model of unsaturated flow in a porous medium. The problem arises from planning for the long-term storage of radioactive waste.
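
    A present-day sketch of the use case described, with AD supplying the Jacobian for a Newton solve of a nonlinear system (JAX is used here as a stand-in for the operator-overloading tools the paper discusses; the toy system is ours):

        import jax
        import jax.numpy as jnp

        def F(u):
            # small nonlinear system F(u) = 0 with a root at u = (1, 1)
            return jnp.array([u[0] ** 2 + u[1] - 2.0,
                              u[0] + u[1] ** 2 - 2.0])

        J = jax.jacfwd(F)                  # exact Jacobian via forward-mode AD

        u = jnp.array([2.0, 0.5])
        for _ in range(10):                # plain Newton iteration
            u = u - jnp.linalg.solve(J(u), F(u))
        print(u)                           # converges to [1., 1.]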

  14. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    Science.gov (United States)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  15. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for the automatic phonological analysis of different corpora.

  16. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...

  17. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application that guides users in preparing dishes relevant to their dietary profiles and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges, such as frequent occlusions and food appearance changes. Manually annotating the videos is a time-consuming, tedious, and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error-free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools, and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  18. Automatic Differentiation in Quantum Chemistry with Applications to Fully Variational Hartree-Fock.

    Science.gov (United States)

    Tamayo-Mendoza, Teresa; Kreisbeck, Christoph; Lindh, Roland; Aspuru-Guzik, Alán

    2018-05-23

    Automatic differentiation (AD) is a powerful tool that allows calculating derivatives of implemented algorithms with respect to all of their parameters up to machine precision, without the need to explicitly add any additional functions. Thus, AD has great potential in quantum chemistry, where gradients are omnipresent but also difficult to obtain, and researchers typically spend a considerable amount of time finding suitable analytical forms when implementing derivatives. Here, we demonstrate that AD can be used to compute gradients with respect to any parameter throughout a complete quantum chemistry method. We present DiffiQult, a Hartree-Fock implementation entirely differentiated with the use of AD tools. DiffiQult is a software package written in plain Python with minimal deviation from standard code, illustrating the capability of AD to save human effort and time in implementing exact gradients in quantum chemistry. We leverage the obtained gradients to optimize the parameters of one-particle basis sets in the context of the floating Gaussian framework.

  19. Applications of automatic differentiation in computational fluid dynamics

    Science.gov (United States)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

    Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.

  20. Automatic Differentiation and its Program Realization

    Czech Academy of Sciences Publication Activity Database

    Hartman, J.; Lukšan, Ladislav; Zítko, J.

    2009-01-01

    Roč. 45, č. 5 (2009), s. 865-883 ISSN 0023-5954 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : automatic differentiation * modeling languages * systems of optimization Subject RIV: BA - General Mathematics Impact factor: 0.445, year: 2009 http://dml.cz/handle/10338.dmlcz/140037

  1. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

    The goal of this article is to demonstrate the applicability and to discuss the advantages and disadvantages of automatic differentiation in topology optimization. The technique makes it possible to wholly or partially automate the evaluation of derivatives for optimization problems and is demons...

  2. Post-convergence automatic differentiation of iterative schemes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1997-01-01

    A new approach for performing automatic differentiation (AD) of computer codes that embody an iterative procedure, based on differentiating a single additional iteration upon achieving convergence, is described and implemented. This post-convergence automatic differentiation (PAD) technique results in better accuracy of the computed derivatives, as it eliminates part of the derivatives' convergence error, and a large reduction in execution time, especially when many iterations are required to achieve convergence. In addition, it provides a way to compute derivatives of the converged solution without having to repeat the entire iterative process every time new parameters are considered. These advantages are demonstrated and the PAD technique is validated via a set of three linear and nonlinear codes used to solve neutron transport and fluid flow problems. The PAD technique reduces the execution time over direct AD by a factor of up to 30 and improves the accuracy of the derivatives by up to two orders of magnitude. The PAD technique's biggest disadvantage lies in the necessity of computing the iterative map's Jacobian, which for large problems can be prohibitive. Methods are discussed to alleviate this difficulty.
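
    The core of PAD is visible in a scalar toy problem (a numerical sketch, not the paper's codes): iterate to convergence, then differentiate a single additional iteration, which amounts to solving (I - dF/dx) dx = dF/dp at the fixed point.

        import math

        def F(x, p):
            return math.cos(p * x)       # toy contraction with fixed point x* = cos(p x*)

        p, x = 1.0, 0.5
        for _ in range(200):             # plain fixed-point iteration
            x = F(x, p)

        dFdx = -p * math.sin(p * x)      # Jacobian of the iterative map at x*
        dFdp = -x * math.sin(p * x)
        dxdp = dFdp / (1.0 - dFdx)       # post-convergence derivative

        # check against a finite difference of the whole iteration
        eps, xe = 1e-6, 0.5
        for _ in range(200):
            xe = F(xe, p + eps)
        print(dxdp, (xe - x) / eps)      # the two values agree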

  3. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

    and is demonstrated on two separate, previously published types of problems in topology optimization. Two separate software packages for automatic differentiation, CoDiPack and Tapenade are considered, and their performance and usability trade-offs are discussed and compared to a hand coded adjoint gradient...

  4. A new design of automatic vertical drilling tool

    Directory of Open Access Journals (Sweden)

    Yanfeng Ma

    2015-09-01

    Full Text Available In order to effectively improve penetration rates and enhance wellbore quality for vertical wells, a new Automatic Vertical Drilling Tool (AVDT) based on an Eccentric Braced Structure (EBS) is designed. Applying the operating principle of rotary steerable drilling, the AVDT adds an offset gravity block whose inclination is sensed automatically. When the hole deviates, the tool uses the eccentric moment produced by the gravity of the offset block to control the bearing of the guide force, so that well straightening is achieved. The nominal size of the AVDT is designed as 215.9 mm; the sizes of the other major components, including the offset angle of the EBS, are worked out by theoretical analysis. This paper aims to introduce the structure, operating principle, and theoretical analysis of the AVDT and to describe the parameter settings of its key components.

  5. Semi-automatic tool to ease the creation and optimization of GPU programs

    DEFF Research Database (Denmark)

    Jepsen, Jacob

    2014-01-01

    We present a tool that reduces the development time of GPU-executable code. We implement a catalogue of common optimizations specific to the GPU architecture. Through the tool, the programmer can semi-automatically transform a computationally-intensive code section into GPU-executable form...... of the transformations can be performed automatically, which makes the tool usable for both novices and experts in GPU programming....

  6. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2016-03-01

    Full Text Available Tool change is one of the most frequently performed machining processes, and if there is improper percussion as the tool's position is changed, the spindle bearing can be damaged. A spindle malfunction can cause problems, such as a dropped tool or bias in a machined hole. The measures currently available on machine tools only involve determining whether the tool-clamping state is correct, using the spindle and the air adhesion method, which is also used to satisfy the high precision required of mechanical components. These measures therefore cannot be used with every type of machine tool; in addition, improper tapping of the spindle during an automatic tool change cannot be detected. This study therefore proposes a new type of diagnostic framework that combines cloud computing and vibration sensors, in which tool change is automatically diagnosed using an architecture that identifies abnormalities, thereby enhancing the reliability and productivity of the machine and equipment.

  7. A Domain Specific Embedded Language in C++ for Automatic Differentiation, Projection, Integration and Variational Formulations

    Directory of Open Access Journals (Sweden)

    Christophe Prud'homme

    2006-01-01

    Full Text Available In this article, we present a domain-specific embedded language in C++ that can be used in various contexts such as numerical projection onto a functional space, numerical integration, variational formulations, and automatic differentiation. Although these tools operate in different ways, the language accommodates them all by decoupling expression construction from evaluation. The language is implemented using expression templates and meta-programming techniques and uses various Boost libraries. The language is exercised on a number of non-trivial examples, and a benchmark presents the performance behavior on a few test problems.

  8. An inverse method for non linear ablative thermics with experimentation of automatic differentiation

    Energy Technology Data Exchange (ETDEWEB)

    Alestra, S [Simulation Information Technology and Systems Engineering, EADS IW Toulouse (France); Collinet, J [Re-entry Systems and Technologies, EADS ASTRIUM ST, Les Mureaux (France); Dubois, F [Professor of Applied Mathematics, Conservatoire National des Arts et Metiers Paris (France)], E-mail: stephane.alestra@eads.net, E-mail: jean.collinet@astrium.eads.net, E-mail: fdubois@cnam.fr

    2008-11-01

    Thermal Protection System is a key element for atmospheric re-entry missions of aerospace vehicles. The high level of heat fluxes encountered in such missions has a direct effect on the mass balance of the heat shield. Consequently, the identification of heat fluxes is of great industrial interest, but in flight it is available only through indirect methods based on temperature measurements. This paper is concerned with inverse analyses of highly evolutive heat fluxes. An inverse problem is used to estimate transient surface heat fluxes (convection coefficient) for a degradable thermal material (ablation and pyrolysis), using time-domain temperature measurements on the thermal protection. The inverse problem is formulated as a minimization problem involving an objective functional, through an optimization loop. An optimal control formulation (Lagrangian, adjoint, and gradient steepest-descent method combined with quasi-Newton computations) is then developed and applied, using Monopyro, a transient one-dimensional thermal model with one moving boundary (the ablative surface) that has been developed over many years by ASTRIUM-ST. To compute the adjoint and gradient quantities numerically for the inverse problem in the heat convection coefficient, we have used both analytical manual differentiation and an Automatic Differentiation (AD) engine tool, Tapenade, developed at INRIA Sophia-Antipolis by the TROPICS team. Several validation test cases, using synthetic temperature measurements, are carried out by applying the inverse method with the minimization algorithm. Accurate identification results on high-flux test cases and good agreement for the reconstructed temperatures are obtained, with and without ablation and pyrolysis, even from poor initial guesses for the fluxes. First encouraging results with an automatic differentiation procedure are also presented in this paper.

  9. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

    Recently, the intelligent systems of technology have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the mobility act of the spindle unit determines the most frequent and important part such as automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  10. TMB: Automatic Differentiation and Laplace Approximation

    Directory of Open Access Journals (Sweden)

    Kasper Kristensen

    2016-04-01

    Full Text Available TMB is an open source R package that enables quick implementation of complex nonlinear random effects (latent variable) models in a manner similar to the established AD Model Builder package (ADMB, http://admb-project.org/; Fournier et al. 2011). In addition, it offers easy access to parallel computations. The user defines the joint likelihood for the data and the random effects as a C++ template function, while all the other operations are done in R, e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood, where the random effects are automatically integrated out. This approximation, and its derivatives, are obtained using automatic differentiation (up to order three) of the joint likelihood. The computations are designed to be fast for problems with many random effects (≈ 10^6) and parameters (≈ 10^3). Computation times using ADMB and TMB are compared on a suite of examples ranging from simple models to large spatial models where the random effects are a Gaussian random field. Speedups ranging from 1.5 to about 100 are obtained, with increasing gains for large problems. The package and examples are available at http://tmb-project.org/.
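
    For reference, the Laplace approximation being maximized has the standard form (notation ours, sketching the textbook formula rather than TMB's internals):

        L(\theta) = \int e^{f(u,\theta)} \, du
                  \;\approx\; (2\pi)^{n/2} \, \det H(\theta)^{-1/2} \, e^{f(\hat{u}(\theta),\,\theta)},
        \qquad \hat{u}(\theta) = \arg\max_u f(u,\theta),
        \quad  H(\theta) = -\partial_u^2 f(u,\theta)\big|_{u=\hat{u}(\theta)},

    where f is the joint log-likelihood, u the random effects, and n = dim(u). Differentiating log L(\theta) through the inner maximization is what brings in third derivatives of f, consistent with the abstract's note that AD up to order three is applied.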

  11. Evaluation of Semi-Automatic Metadata Generation Tools: A Survey of the Current State of the Art

    Directory of Open Access Journals (Sweden)

    Jung-ran Park

    2015-09-01

    Full Text Available Assessment of the current landscape of semi-automatic metadata generation tools is particularly important considering the rapid development of digital repositories and the recent explosion of big data. Utilization of (semi)automatic metadata generation is critical in addressing these environmental changes and may be unavoidable in the future considering the costly and complex operation of manual metadata creation. To address such needs, this study examines the range of semi-automatic metadata generation tools (n=39) while providing an analysis of their techniques, features, and functions. The study focuses on open-source tools that can be readily utilized in libraries and other memory institutions. The challenges and current barriers to implementation of these tools were identified. The greatest area of difficulty lies in the fact that the piecemeal development of most semi-automatic generation tools only addresses part of the issue of semi-automatic metadata generation, providing solutions to one or a few metadata elements but not the full range of elements. This indicates that significant local effort will be required to integrate the various tools into a coherent working whole. Suggestions toward such efforts are presented for future developments that may assist information professionals with the incorporation of semi-automatic tools within their daily workflows.

  12. Multislice CT coronary angiography: evaluation of an automatic vessel detection tool

    International Nuclear Information System (INIS)

    Dewey, M.; Schnapauff, D.; Lembcke, A.; Hamm, B.; Rogalla, P.; Laule, M.; Borges, A.C.; Rutsch, W.

    2004-01-01

    Purpose: To investigate the potential of a new detection tool for multislice CT (MSCT) coronary angiography with automatic display of curved multiplanar reformations and orthogonal cross-sections. Materials and Methods: Thirty-five patients were consecutively enrolled in a prospective intention-to-diagnose study and examined using an MSCT scanner with 16 x 0.5 mm detector collimation and 400 ms gantry rotation time (Aquilion, Toshiba). A multisegment algorithm using up to four segments was applied for ECG-gated reconstruction. Automatic and manual detection of coronary arteries was conducted using the coronary artery CT protocol of a workstation (Vitrea 2, Version 3.3, Vital Images) to detect significant stenoses (≥50%) in all segments ≥1.5 mm in diameter. Each detection tool was used by one reader who was blinded to the results of the other detection method and to the results of conventional coronary angiography. Results: The overall sensitivity, specificity, nondiagnostic rate, and accuracy of the automatic and manual approaches were 90 vs. 94%, 89 vs. 84%, 6 vs. 6%, and 89 vs. 88%, respectively (p=n.s.). The vessel lengths detected with the automatic and manual approaches were highly correlated for the left main/left anterior descending (143±30 vs. 146±24 mm, r=0.923, p ...

  13. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  14. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input and performs a registration over scale and in-plane rotation fully automatically.

  15. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

    Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number ... of such a feature is the generic implementation of Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility ...

  16. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    Science.gov (United States)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially when there are many students. Given these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. AGT implements the MVC architecture and uses open-source software such as the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.

  17. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

    In this presentation the automatized material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; the tools of integral material and radioactivity flow are among the basic tools for computer optimisation of decommissioning waste processing; all calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega represents an open modular system, which can be improved; and improvement of the module for optimisation of decommissioning waste processing will be performed in the frame of the improvement of material procedures and scenarios.

  18. PASTEC: an automatic transposable element classification tool.

    Directory of Open Access Journals (Sweden)

    Claire Hoede

    Full Text Available SUMMARY: The classification of transposable elements (TEs) is a key step towards deciphering their potential impact on the genome. However, this process is often based on manual sequence inspection by TE experts. With the wealth of genomic sequences now available, this task requires automation, making it accessible to most scientists. We propose a new tool, PASTEC, which classifies TEs by searching for structural features and similarities. This tool outperforms currently available software for TE classification. The main innovation of PASTEC is the search for HMM profiles, which is useful for inferring the classification of unknown TEs on the basis of conserved functional domains of the proteins. In addition, PASTEC is the only tool providing an exhaustive spectrum of possible classifications to the order level of the Wicker hierarchical TE classification system. It can also automatically classify other repeated elements, such as SSRs (Simple Sequence Repeats), rDNA, or potential repeated host genes. Finally, the output of this new tool is designed to facilitate manual curation by providing biologists with all the evidence accumulated for each TE consensus. AVAILABILITY: PASTEC is available as a REPET module or standalone software (http://urgi.versailles.inra.fr/download/repet/REPET_linux-x64-2.2.tar.gz). It requires a Unix-like system. There are two standalone versions: one of which is parallelized (requiring Sun Grid Engine or Torque), and the other of which is not.

  19. A novel image toggle tool for comparison of serial mammograms: automatic density normalization and alignment-development of the tool and initial experience.

    Science.gov (United States)

    Honda, Satoshi; Tsunoda, Hiroko; Fukuda, Wataru; Saida, Yukihisa

    2014-12-01

    The purpose is to develop a new image toggle tool with automatic density normalization (ADN) and automatic alignment (AA) for comparing serial digital mammograms (DMGs). We developed an ADN and AA process to compare the images of serial DMGs. For image density normalization, a linear interpolation was applied by taking two points in high- and low-brightness areas. The alignment was calculated by determining the point of greatest correlation while shifting the alignment between the current and prior images. These processes were performed on a PC with a 3.20-GHz Xeon processor and 8 GB of main memory. We selected 12 suspected breast cancer patients who had undergone screening DMGs in the past. Automatic processing was retrospectively performed on these images, and two radiologists evaluated the results subjectively. The developed algorithm took approximately 1 s per image. In our preliminary experience, two images could not be aligned properly. When images were aligned, toggling allowed differences between examinations to be detected easily. We developed a new tool to facilitate comparative reading of DMGs on a mammography viewing system. Using this tool for toggling comparisons might improve the interpretation efficiency of serial DMGs.
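
    A hedged Python sketch of the two steps as described, linear density normalization and correlation-maximizing alignment (a reconstruction from the abstract; the tool's actual implementation may differ):

        import numpy as np

        def normalize(img, lo, hi, ref_lo, ref_hi):
            # linear interpolation mapping this image's (lo, hi) brightness
            # reference points onto the prior image's (ref_lo, ref_hi)
            return (img - lo) * (ref_hi - ref_lo) / (hi - lo) + ref_lo

        def best_shift(cur, prior, max_shift=20):
            # brute-force search for the shift with the greatest correlation
            best, best_r = (0, 0), -np.inf
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    shifted = np.roll(np.roll(prior, dy, axis=0), dx, axis=1)
                    r = np.corrcoef(cur.ravel(), shifted.ravel())[0, 1]
                    if r > best_r:
                        best_r, best = r, (dy, dx)
            return best

        cur = np.random.rand(64, 64)
        prior = np.roll(cur, 3, axis=1)              # synthetic misalignment
        print(best_shift(cur, prior, max_shift=5))   # recovers (0, -3)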

  20. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used which is developed independently of the development of the code generator. For this purpose ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  1. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  2. Evaluation of a new software tool for the automatic volume calculation of hepatic tumors. First results

    International Nuclear Information System (INIS)

    Meier, S.; Mildenberger, P.; Pitton, M.; Thelen, M.; Schenk, A.; Bourquain, H.

    2004-01-01

    Purpose: Computed tomography has become the preferred method for detecting liver carcinomas. The introduction of spiral CT added volumetric assessment of intrahepatic tumors, which was unattainable in the clinical routine with incremental CT due to complex planimetric revisions and excessive computing time. In an ongoing clinical study, a new software tool was tested for the automatic calculation of tumor volume and the time needed for this procedure. Materials and methods: We analyzed patients suffering from hepatocellular carcinoma (HCC). All patients underwent treatment with repeated transcatheter chemoembolization of the hepatic artery. The volumes of the HCC lesions detected on CT were measured with the new software tool in HepaVision (MeVis, Germany). The results were compared with manual planimetric calculation of the volume performed by three independent radiologists. Results: Our first results in 16 patients show a correlation between the automatically and the manually calculated volumes (up to a difference of 2 ml) of 96.8%. While the manual method of analyzing the volume of a lesion requires 2.5 minutes on average, the automatic method merely requires about 30 seconds of user-interaction time. Conclusion: These preliminary results show a good correlation between automatic and manual calculations of the tumor volume. The new software tool requires less time for accurate determination of the tumor volume and can be applied in the daily clinical routine. (orig.)

  3. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master's thesis was performed at CERN, in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software package for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms, based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  4. Depfix, a Tool for Automatic Rule-based Post-editing of SMT

    Directory of Open Access Journals (Sweden)

    Rudolf Rosa

    2014-09-01

    Full Text Available We present Depfix, an open-source system for automatic post-editing of phrase-based machine translation outputs. Depfix employs a range of natural language processing tools to obtain analyses of the input sentences and uses a set of rules to correct common or serious errors in machine translation outputs. Depfix is currently implemented only for the English-to-Czech translation direction, but extending it to other languages is planned.

  5. Facilitating coronary artery evaluation in MDCT using a 3D automatic vessel segmentation tool

    International Nuclear Information System (INIS)

    Fawad Khan, M.; Gurung, Jessen; Maataoui, Adel; Brehmer, Boris; Herzog, Christopher; Vogl, Thomas J.; Wesarg, Stefan; Dogan, Selami; Ackermann, Hanns; Assmus, Birgit

    2006-01-01

    The purpose of this study was to investigate a 3D coronary artery segmentation algorithm using 16-row MDCT data sets. Fifty patients underwent cardiac CT (Sensation 16, Siemens) and coronary angiography. Automatic and manual detection of coronary artery stenoses was performed. A 3D coronary artery segmentation algorithm (Fraunhofer Institute for Computer Graphics, Darmstadt) was used for automatic evaluation. All significant stenoses (>50%) in vessels >1.5 mm in diameter were recorded. Each detection tool was used by one reader who was blinded to the results of the other detection method and to the results of conventional coronary angiography. Sensitivity and specificity were determined for automatic and manual detection, as was the time required for both CT-based evaluation methods. The overall sensitivity and specificity of the automatic and manual approaches were 93.1 vs. 95.83% and 86.1 vs. 81.9%. The time required for automatic evaluation was significantly shorter than with the manual approach: 246.04±43.17 s for the automatic approach and 526.88±45.71 s for the manual approach (P<0.0001). In 94% of the coronary artery branches, automatic detection required less time than the manual approach. Automatic coronary vessel evaluation is feasible. It reduces the time required for cardiac CT evaluation with similar sensitivity and specificity, and it facilitates the evaluation of MDCT coronary angiography in a standardized fashion. (orig.)

  6. Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization

    Science.gov (United States)

    Green, Lawrence; Carle, Alan; Fagan, Mike

    1999-01-01

    Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem, based upon the current objective function, constraints, and gradient values. The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop

  7. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of the system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. The quality and cost of a software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate software fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make use of these formulae through formal verification techniques.

  8. High-order space charge effects using automatic differentiation

    International Nuclear Information System (INIS)

    Reusch, Michael F.; Bruhwiler, David L.

    1997-01-01

    The Northrop Grumman TOPKARK code has been upgraded to Fortran 90, making use of operator overloading, so the same code can be used either to track an array of particles or to construct a Taylor map representation of the accelerator lattice. We review beam optics and beam dynamics simulations conducted with TOPKARK in the past, and we present a new method for modeling space charge forces to high order with automatic differentiation. This method generates an accurate, high-order, 6-D Taylor map of the phase-space variable trajectories for a bunched, high-current beam. The spatial distribution is modeled as the product of a Taylor series and a Gaussian. The variables in the argument of the Gaussian are normalized to the respective second moments of the distribution. This form allows accurate representation of a wide range of realistic distributions, including any asymmetries, and allows rapid calculation of the space charge fields with free-space boundary conditions. An example problem is presented to illustrate our approach.

  9. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings ... differential equations, but in this thesis we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives are described: the forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages ...

  10. Parameter optimization of differential evolution algorithm for automatic playlist generation problem

    Science.gov (United States)

    Alamag, Kaye Melina Natividad B.; Addawe, Joel M.

    2017-11-01

    With the digitalization of music, collections of music have grown very large, and there is a need to create lists of music that filter a collection according to user preferences, giving rise to the Automatic Playlist Generation Problem (APGP). Previous attempts to solve this problem include the use of search and optimization algorithms. If a music database is very large, the algorithm used must be able to search the lists thoroughly, taking into account the quality of the playlist given a set of user constraints. In this paper we run an evolutionary metaheuristic optimization algorithm, Differential Evolution (DE), with different combinations of parameter values and select the best-performing set when used to solve four standard test functions. The performance of the proposed algorithm is then compared with a standard Genetic Algorithm (GA) and a hybrid GA with Tabu Search. Numerical simulations are carried out to show the better results obtained with the Differential Evolution approach using the optimized parameter values.
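
    A brief SciPy sketch of this kind of experiment, comparing DE parameter settings on a standard test function (an illustration; the paper's parameter combinations and test functions may differ):

        import numpy as np
        from scipy.optimize import differential_evolution

        def rastrigin(x):
            # standard multimodal test function, global minimum 0 at the origin
            x = np.asarray(x)
            return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

        bounds = [(-5.12, 5.12)] * 5
        for mutation, recombination in [(0.5, 0.7), (0.8, 0.9), ((0.3, 1.0), 0.7)]:
            res = differential_evolution(rastrigin, bounds, mutation=mutation,
                                         recombination=recombination, seed=1)
            print(mutation, recombination, res.fun)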

  11. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the spindle unit performs the most frequent and important movements, such as those of the automatic tool changer. The vibration detection system includes the development of hardware and software, such as a vibration meter, a signal acquisition card, a data processing platform, and a machine control program. Meanwhile, because of the differences between mechanical configurations and the desired characteristics, a vibration detection system cannot simply be assembled from commercially available kits. For this reason, it was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine's characteristics and states. We also launched the development of the functional parts of the system simultaneously. Finally, we entered the conditions and parameters generated from both the states and the characteristics into the developed system to verify its feasibility.

  12. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  13. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
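
    The measurement pipeline described in the two records above can be pictured with a few lines of numpy. The sketch below computes a simplified radial intensity plot and the peak angles whose drift with radius indicates spirality; the centering, thresholds, and peak-slope analysis are stand-ins for Ganalyzer's actual implementation:

        import numpy as np

        def radial_intensity_plot(img, cx, cy, radii, n_theta=360):
            """Sample intensity on circles of given radii around (cx, cy).
            Returns an array of shape (len(radii), n_theta): one intensity
            'ring' per radius, as in a radial intensity plot."""
            theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
            rings = []
            for r in radii:
                x = np.clip((cx + r * np.cos(theta)).astype(int), 0, img.shape[1] - 1)
                y = np.clip((cy + r * np.sin(theta)).astype(int), 0, img.shape[0] - 1)
                rings.append(img[y, x])
            return np.asarray(rings)

        def peak_angles(rings):
            """Angle index of the brightest point on each ring. In a spiral
            galaxy these angles drift with radius; the slope of that drift is
            a simple proxy for spirality (roughly flat for ellipticals)."""
            return rings.argmax(axis=1)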

  14. A new fully automatic PIM tool to replicate two component tungsten DEMO divertor parts

    International Nuclear Information System (INIS)

    Antusch, Steffen; Commin, Lorelei; Heneka, Jochen; Piotter, Volker; Plewa, Klaus; Walter, Heinz

    2013-01-01

    Highlights: • Development of a fully automatic 2C-PIM tool. • Replication of fusion-relevant components in one step without additional brazing. • No cracks or gaps visible in the seam of the joining zone. • For both material combinations a solid bond of the material interface was achieved. • PIM is a powerful process for mass production as well as for joining even complex-shaped parts. -- Abstract: At Karlsruhe Institute of Technology (KIT), divertor design concepts for future nuclear fusion power plants beyond ITER are intensively investigated. One promising KIT divertor design concept for the future DEMO power reactor is based on modular He-cooled finger units. The manufacturing of such parts by mechanical machining such as milling and turning, however, is extremely cost- and time-intensive because tungsten is very hard and brittle. Powder Injection Molding (PIM) has been adapted to tungsten processing at KIT for several years. This production method is deemed promising in view of large-scale production of tungsten parts with high near-net-shape precision, hence offering a cost-saving process compared to conventional machining. The successfully manufactured divertor tile, consisting only of pure tungsten, exhibits a crack-free microstructure and a high density (>98% T.D.). Based on these results, a new fully automatic multicomponent PIM tool was developed; it allows the replication and joining, without brazing, of fusion-relevant components of different materials in one step, and the creation of composite materials. This contribution describes the process route to design and engineer the new fully automatic 2C-PIM tool, including the filling simulation and the implementation of the tool. The complete technological fabrication process of tungsten 2C-PIM, including material and feedstock (powder and binder) development, injection molding, and heat treatment of real DEMO divertor parts, is outlined.

  15. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Full Text Available Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with Graphical Processing Units, have broadly enabled enhanced parallelism. Compilers are being updated to address challenges such as synchronization and threading issues. Appropriate program and algorithm classifications will be of great advantage to software engineers in identifying opportunities for effective parallelization. In the present work we investigated current species for the classification of algorithms; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen that matches the structure with different issues and performs the given tasks. We tested these algorithms utilizing existing automatic species extraction tools along with the Bones compiler. We added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants, and mathematical functions. With this, we can retain significant data that are not captured by the original species of algorithms. We implemented new theories in the tool, enabling automatic characterization of program code.

  16. Automatic welding detection by an intelligent tool pipe inspection

    Science.gov (United States)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

    This work provides a model, based on machine learning techniques, for weld recognition from signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.

  17. High-order space charge effects using automatic differentiation

    International Nuclear Information System (INIS)

    Reusch, M.F.; Bruhwiler, D.L. (Computer Accelerator Physics Conference, Williamsburg, Virginia, 1996)

    1997-01-01

    The Northrop Grumman TOPKARK code has been upgraded to Fortran 90, making use of operator overloading, so the same code can be used either to track an array of particles or to construct a Taylor map representation of the accelerator lattice. We review beam optics and beam dynamics simulations conducted with TOPKARK in the past and present a new method for modeling space charge forces to high order with automatic differentiation. This method generates an accurate, high-order, 6-D Taylor map of the phase space variable trajectories for a bunched, high-current beam. The spatial distribution is modeled as the product of a Taylor series times a Gaussian. The variables in the argument of the Gaussian are normalized to the respective second moments of the distribution. This form allows for accurate representation of a wide range of realistic distributions, including any asymmetries, and allows for rapid calculation of the space charge fields with free-space boundary conditions. An example problem is presented to illustrate our approach. copyright 1997 American Institute of Physics

  18. NASCENT: an automatic protein interaction network generation tool for non-model organisms.

    Science.gov (United States)

    Banky, Daniel; Ordog, Rafael; Grolmusz, Vince

    2009-04-24

    Large quantities of reliable protein interaction data are available for model organisms in public depositories (e.g., MINT, DIP, HPRD, INTERACT). Most data correspond to experiments with the proteins of Saccharomyces cerevisiae, Drosophila melanogaster, Homo sapiens, Caenorhabditis elegans, Escherichia coli and Mus musculus. For other important organisms data availability is poor or non-existent. Here we present NASCENT, a completely automatic web-based tool, also available as a downloadable Java program, capable of modeling and generating protein interaction networks even for non-model organisms. The tool performs protein interaction network modeling through gene-name mapping, and outputs the resulting network in graphical form and also in computer-readable graph forms directly applicable by popular network modeling software. http://nascent.pitgroup.org.

  19. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user intervention as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generating unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom code for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages, including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
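
    The parse-and-compile step such a tool automates can be pictured as scanning each output file for a marker line and tabulating the value. Below is a minimal Python sketch of that idea; the "SCF Done" marker is the usual Gaussian energy line, but the workflow itself is spreadsheet-based VBA in ExcelAutomat, so this is an illustration rather than its code:

        import glob

        def collect_energies(pattern="*.log", marker="SCF Done"):
            """Scan quantum chemistry output files and tabulate the last
            occurrence of the energy marker line in each, mimicking the
            parse-and-compile step that ExcelAutomat performs in VBA."""
            table = []
            for path in sorted(glob.glob(pattern)):
                energy = None   # stays None if the marker never appears
                with open(path) as fh:
                    for line in fh:
                        if marker in line:
                            # e.g. " SCF Done:  E(RHF) =  -76.0107465 A.U. ..."
                            energy = float(line.split("=")[1].split()[0])
                table.append((path, energy))
            return table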

  20. Reproducing the internal and external anatomy of fossil bones: Two new automatic digital tools.

    Science.gov (United States)

    Profico, Antonio; Schlager, Stefan; Valoriani, Veronica; Buzi, Costantino; Melchionna, Marina; Veneziano, Alessio; Raia, Pasquale; Moggi-Cecchi, Jacopo; Manzi, Giorgio

    2018-04-21

    We present two new automatic tools, developed under the R environment, to reproduce the internal and external structures of bony elements. The first method, Computer-Aided Laser Scanner Emulator (CA-LSE), provides the reconstruction of the external portions of a 3D mesh by simulating the action of a laser scanner. The second method, Automatic Segmentation Tool for 3D objects (AST-3D), performs the digital reconstruction of anatomical cavities. We present the application of the CA-LSE and AST-3D methods to different anatomical remains, highly variable in terms of shape, size and structure: a modern human skull, a malleus bone, and a Neanderthal deciduous tooth. Both methods are developed in the R environment and embedded in the packages "Arothron" and "Morpho," where both the code and the data are fully available. The application of CA-LSE and AST-3D allows the isolation and manipulation of the internal and external components of the 3D virtual representation of complex bony elements. In particular, we present the output of the four case studies: a complete modern human endocast and the right maxillary sinus, the dental pulp of the Neanderthal tooth, and the inner network of blood vessels of the malleus. Both methods proved to be much faster, cheaper, and more accurate than other conventional approaches. The tools we present are available as add-ons to existing software within the R platform. Because of their ease of application and the unrestricted availability of the proposed methods, these tools can be widely used by paleoanthropologists, paleontologists and anatomists. © 2018 Wiley Periodicals, Inc.

  1. DAF: differential ACE filtering image quality assessment by automatic color equalization

    Science.gov (United States)

    Ouni, S.; Chambah, M.; Saint-Jean, C.; Rizzi, A.

    2008-01-01

    Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. In reality, however, objective quality metrics do not necessarily correlate well with perceived quality [1]. Moreover, some measures assume that there exists a reference in the form of an "original" to compare to, which prevents their use in the digital restoration field, where often there is no reference. That is why subjective evaluation has been the most used and most effective approach up to now. But subjective assessment is expensive and time consuming, and hence does not meet economic requirements [2,3]. Thus, reliable automatic methods for visual quality assessment are needed in the field of digital film restoration. The ACE method, for Automatic Color Equalization [4,6], is an algorithm for unsupervised enhancement of digital images. It is based on a new computational approach that tries to model the perceptual response of our vision system, merging the Gray World and White Patch equalization mechanisms in a global and local way. Like our vision system, ACE is able to adapt to widely varying lighting conditions and to extract visual information from the environment efficaciously. Moreover, ACE can be run in an unsupervised manner, which makes it very useful as a digital film restoration tool, since no a priori information is available. In this paper we deepen the investigation of using the ACE algorithm as a basis for reference-free image quality evaluation. This new metric, called DAF for Differential ACE Filtering [7], is an objective quality measure that can be used in several image restoration and image quality assessment systems. In this paper, we compare, on different image databases, the results obtained with DAF and with some subjective image quality assessments (Mean Opinion Score, MOS, as a measure of perceived image quality). We also study the correlation between the objective measure and MOS. In our experiments, we have used for the first image ...

  2. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    Science.gov (United States)

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.

  3. ModelMage: a tool for automatic model generation, selection and management.

    Science.gov (United States)

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML-model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating models the software can automatically fit all these models to the data and provides a ranking for model selection, in case data is available. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine. Thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software.

  4. Automatic registration method for multisensor datasets adopted for dimensional measurements on cutting tools

    International Nuclear Information System (INIS)

    Shaw, L; Mehari, F; Weckenmann, A; Ettl, S; Häusler, G

    2013-01-01

    Multisensor systems with optical 3D sensors are frequently employed to capture complete surface information by measuring workpieces from different views. During coarse and fine registration the resulting datasets are then transformed into one common coordinate system. Automatic fine registration methods are well established in dimensional metrology, whereas there is a deficit in automatic coarse registration methods. The advantage of a fully automatic registration procedure is twofold: it enables a fast and contact-free alignment, and it is flexibly applicable to datasets from any kind of optical 3D sensor. In this paper, an algorithm adapted for robust automatic coarse registration is presented. The method was originally developed for the field of object reconstruction and localization. It is based on a segmentation of planes in the datasets to calculate the transformation parameters. The rotation is defined by the normals of three corresponding segmented planes of two overlapping datasets, while the translation is calculated via the intersection point of the segmented planes. First results have shown that the translation is strongly shape dependent: 3D data of objects with non-orthogonal planar flanks cannot be registered with the current method. In the novel supplement to the algorithm, the translation is additionally calculated via the distance between centroids of corresponding segmented planes, which results in more than one option for the transformation. A newly introduced measure considering the distance between the datasets after coarse registration evaluates the best possible transformation. Results of the robust automatic registration method are presented on the example of datasets taken from a cutting tool with a fringe-projection system and a focus-variation system. The successful application in dimensional metrology is proven with evaluations of shape parameters based on the registered datasets of a calibrated workpiece. (paper)
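
    The rotation computation described above (from three corresponding plane normals) can be reproduced generically with a least-squares fit via SVD, the Kabsch method; the sketch below, including the centroid-based translation of the supplemented method, is an illustration rather than the authors' code:

        import numpy as np

        def rotation_from_normals(normals_a, normals_b):
            """Best-fit rotation R with R @ a_i close to b_i for corresponding
            unit plane normals (rows of the input arrays), via the Kabsch/SVD
            method. Three non-parallel plane pairs determine R uniquely."""
            H = normals_a.T @ normals_b          # 3x3 cross-covariance
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))
            D = np.diag([1.0, 1.0, d])           # guard against reflections
            return Vt.T @ D @ U.T

        def translation_from_centroids(cent_a, cent_b, R):
            """Translation from the centroids of corresponding segmented
            planes after rotation, as in the supplemented method."""
            return cent_b.mean(axis=0) - (R @ cent_a.T).T.mean(axis=0)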

  5. A Thermo-Hydraulic Tool for Automatic Virtual Hazop Evaluation

    Directory of Open Access Journals (Sweden)

    Pugi L.

    2014-12-01

    Full Text Available Development of complex lubrication systems in the Oil&Gas industry has reached high levels of competitiveness in terms of requested performance and reliability. In particular, the use of HazOp (Hazard and Operability analysis) represents a decisive factor in evaluating the safety and reliability of plants. The HazOp analysis is a structured and systematic examination of a planned or existing operation in order to identify and evaluate problems that may represent risks to personnel or equipment. In particular, P&ID schemes (Piping and Instrument Diagrams), according to the regulation in force, ISO 14617, are used to evaluate the design of the plant in order to increase its safety and reliability under different operating conditions. The use of a simulation tool can drastically increase the speed, efficiency and reliability of the design process. In this work, a tool called TTH lib (Transient Thermal Hydraulic Library) for the 1-D simulation of thermal hydraulic plants is presented. The proposed tool is applied to the analysis of safety-relevant components of compressor and pumping units, such as lubrication circuits. As opposed to known commercial products, TTH lib has been customized to ease the simulation of complex interactions with digital logic components and plant controllers, including their sensors and measurement systems. In particular, the proposed tool is optimized for fixed-step execution and fast prototyping of real-time code both for testing and production purposes. TTH lib can be used as a standard SimScape-Simulink library of components optimized and specifically designed in accordance with the P&ID definitions. Finally, an automatic code generation procedure has been developed, so TTH simulation models can be assembled directly from the P&ID schemes and technical documentation, including detailed information on sensors and measurement systems.

  6. DDT: A Research Tool for Automatic Data Distribution in High Performance Fortran

    Directory of Open Access Journals (Sweden)

    Eduard Ayguadé

    1997-01-01

    Full Text Available This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, as well as executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases, which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner-computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.

  7. Parallel computation of automatic differentiation applied to magnetic field calculations

    International Nuclear Information System (INIS)

    Hinkins, R.L.; Lawrence Berkeley Lab., CA

    1994-09-01

    The author presents a parallelization of an accelerator physics application to simulate magnetic field in three dimensions. The problem involves the evaluation of high order derivatives with respect to two variables of a multivariate function. Automatic differentiation software had been used with some success, but the computation time was prohibitive. The implementation runs on several platforms, including a network of workstations using PVM, a MasPar using MPFortran, and a CM-5 using CMFortran. A careful examination of the code led to several optimizations that improved its serial performance by a factor of 8.7. The parallelization produced further improvements, especially on the MasPar with a speedup factor of 620. As a result a problem that took six days on a SPARC 10/41 now runs in minutes on the MasPar, making it feasible for physicists at Lawrence Berkeley Laboratory to simulate larger magnets

  8. Automatic alternative phase-shift mask CAD layout tool for gate shrinkage of embedded DRAM in logic below 0.18 μm

    Science.gov (United States)

    Ohnuma, Hidetoshi; Kawahira, Hiroichi

    1998-09-01

    An automatic alternative phase-shift mask (PSM) pattern layout tool has been newly developed. This tool is dedicated to embedded DRAM in logic devices, to shrink gate line width while improving line-width controllability in a lithography process with a design rule below 0.18 micrometers using KrF excimer laser exposure. The tool can create a Levenson-type PSM that is used together with a binary mask in a double-exposure method for positive photoresist. Using graphs, the tool automatically creates alternative PSM patterns without introducing any phase conflicts. By applying it to actual embedded-DRAM-in-logic cells, we have produced 0.16 micrometer gate resist patterns in both random logic and DRAM areas. The patterns were fabricated using two masks with the double-exposure method. Gate line width has been well controlled under a practical exposure-focus window.
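
    Avoiding phase conflicts in alternating-aperture PSM layout is commonly cast as 2-coloring a conflict graph, where adjacent critical features must receive opposite phases (0° and 180°) and an odd cycle means a conflict. The sketch below shows that graph formulation generically; it is not the tool's algorithm:

        from collections import deque

        def assign_phases(n_features, conflicts):
            """2-color a conflict graph: nodes are critical features, edges join
            features that must get opposite phases. Returns a list of phases
            (0 or 180 degrees) or None if a phase conflict (odd cycle) exists."""
            adj = [[] for _ in range(n_features)]
            for u, v in conflicts:
                adj[u].append(v)
                adj[v].append(u)
            phase = [None] * n_features
            for start in range(n_features):
                if phase[start] is not None:
                    continue
                phase[start] = 0
                queue = deque([start])
                while queue:
                    u = queue.popleft()
                    for v in adj[u]:
                        if phase[v] is None:
                            phase[v] = 180 - phase[u]   # opposite phase
                            queue.append(v)
                        elif phase[v] == phase[u]:
                            return None                  # phase conflict
            return phase

        print(assign_phases(4, [(0, 1), (1, 2), (2, 3)]))  # [0, 180, 0, 180]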

  9. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
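
    Finite model generation can be pictured as a search for operation tables over a small domain that satisfy every equation of the theory. The brute-force sketch below finds 2-element models of a single associativity axiom; practical tools such as the one described rely on much stronger search and pruning:

        from itertools import product

        def find_models(n, max_models=3):
            """Enumerate binary operation tables on {0..n-1} satisfying
            associativity: (x*y)*z == x*(y*z) for all x, y, z. This is pure
            brute force (n**(n*n) tables), feasible only for tiny n."""
            domain = range(n)
            models = []
            for flat in product(domain, repeat=n * n):
                op = [list(flat[i * n:(i + 1) * n]) for i in range(n)]
                if all(op[op[x][y]][z] == op[x][op[y][z]]
                       for x in domain for y in domain for z in domain):
                    models.append(op)
                    if len(models) == max_models:
                        break
            return models

        for m in find_models(2):
            print(m)   # each is the Cayley table of a 2-element semigroup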

  10. RDFBuilder: a tool to automatically build RDF-based interfaces for MAGE-OM microarray data sources.

    Science.gov (United States)

    Anguita, Alberto; Martin, Luis; Garcia-Remesal, Miguel; Maojo, Victor

    2013-07-01

    This paper presents RDFBuilder, a tool that enables RDF-based access to MAGE-ML-compliant microarray databases. We have developed a system that automatically transforms the MAGE-OM model and microarray data stored in the ArrayExpress database into RDF format. Additionally, the system automatically enables a SPARQL endpoint. This allows users to execute SPARQL queries for retrieving microarray data, either from specific experiments or from more than one experiment at a time. Our system optimizes response times by caching and reusing information from previous queries. In this paper, we describe our methods for achieving this transformation. We show that our approach is complementary to other existing initiatives, such as Bio2RDF, for accessing and retrieving data from the ArrayExpress database. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
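
    To illustrate the access pattern such a SPARQL endpoint enables, the sketch below issues a query with the SPARQLWrapper Python library; the endpoint URL and the mage: predicate names are hypothetical placeholders, since RDFBuilder derives the actual vocabulary from MAGE-OM:

        from SPARQLWrapper import SPARQLWrapper, JSON

        # Hypothetical endpoint and vocabulary: the real predicate names are
        # derived automatically from the MAGE-OM model by RDFBuilder.
        endpoint = SPARQLWrapper("http://example.org/rdfbuilder/sparql")
        endpoint.setQuery("""
            PREFIX mage: <http://example.org/mage-om#>
            SELECT ?experiment ?name
            WHERE {
                ?experiment a mage:Experiment ;
                            mage:name ?name .
            }
            LIMIT 10
        """)
        endpoint.setReturnFormat(JSON)
        results = endpoint.query().convert()
        for row in results["results"]["bindings"]:
            print(row["experiment"]["value"], row["name"]["value"])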

  11. Health smart home for elders - a tool for automatic recognition of activities of daily living.

    Science.gov (United States)

    Le, Xuan Hoa Binh; Di Mascolo, Maria; Gouin, Alexia; Noury, Norbert

    2008-01-01

    Elders prefer to live in their own homes, but with aging comes a loss of autonomy and its associated risks. In order to help them live longer in safe conditions, we need a tool to automatically detect loss of autonomy by assessing the degree of performance of activities of daily living. This article presents an approach enabling activity recognition for an elder living alone in a home equipped with noninvasive sensors.

  12. Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models

    Science.gov (United States)

    Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.

    2012-04-01

    The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies, each of which suffers from its own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise this separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. The discretisation of the continuous adjoint is not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high-level language UFL (Unified Form Language) and to automatically generate the model using the software of the FEniCS project. In this approach it is the high-level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However, since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy, is managed by libadjoint [1]. In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation ...
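
    As background to the adjoint formulations discussed in this record (a standard derivation, not specific to libadjoint): for a forward model F(u, m) = 0 with solution u and parameters m, and a scalar functional J(u, m), a single adjoint solve yields the full gradient:

        % Standard discrete adjoint identities (background sketch)
        \left(\frac{\partial F}{\partial u}\right)^{T} \lambda
            = \left(\frac{\partial J}{\partial u}\right)^{T},
        \qquad
        \frac{\mathrm{d}J}{\mathrm{d}m}
            = \frac{\partial J}{\partial m} - \lambda^{T}\,\frac{\partial F}{\partial m}.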

  13. SU-C-202-03: A Tool for Automatic Calculation of Delivered Dose Variation for Off-Line Adaptive Therapy Using Cone Beam CT

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, B; Lee, S; Chen, S; Zhou, J; Prado, K; D’Souza, W; Yi, B [University of Maryland School of Medicine, Baltimore, MD (United States)

    2016-06-15

    Purpose: Monitoring the delivered dose is an important task for adaptive radiotherapy (ART) and for determining the time to re-plan. A software tool which enables automatic delivered-dose calculation using cone-beam CT (CBCT) has been developed and tested. Methods: The tool consists of four components: a CBCT Collecting Module (CCM), a Plan Registration Module (PRM), a Dose Calculation Module (DCM), and an Evaluation and Action Module (EAM). The CCM is triggered periodically (e.g., every day at 1:00 AM) to search for newly acquired CBCTs of patients of interest and then export the DICOM files of the images and related registrations defined in ARIA, followed by triggering the PRM. The PRM imports the DICOM images and registrations and links the CBCTs to the related treatment plan of the patient in the planning system (RayStation V4.5, RaySearch, Stockholm, Sweden). A pre-determined CT-to-density table is automatically generated for dose calculation. The current version of the DCM uses a rigid registration which regards the treatment isocenter of the CBCT as the isocenter of the treatment plan. Then it starts the dose calculation automatically. The EAM evaluates the plan using pre-determined plan evaluation parameters: PTV dose-volume metrics and critical organ doses. The tool has been tested for 10 patients. Results: Automatic plans are generated and saved, in the order of the treatment dates, in the Adaptive Planning module of the RayStation planning system, without any manual intervention. Once the CTV dose deviates by more than 3%, both email and page alerts are sent to the physician and the physicist of the patient so that the case can be examined closely. Conclusion: The tool is capable of performing automatic dose tracking and alerting clinicians when an action is needed. It is clinically useful for off-line adaptive therapy to catch any gross error. A practical way of determining the alarm level for OARs is under development.

  14. Attentional Bias for Pain and Sex, and Automatic Appraisals of Sexual Penetration: Differential Patterns in Dyspareunia vs Vaginismus?

    Science.gov (United States)

    Melles, Reinhilde J; Dewitte, Marieke D; Ter Kuile, Moniek M; Peters, Madelon M L; de Jong, Peter J

    2016-08-01

    Current information processing models propose that heightened attention bias for sex-related threats (eg, pain) and lowered automatic incentive processes ("wanting") may play an important role in the impairment of sexual arousal and the development of sexual dysfunctions such as genitopelvic pain/penetration disorder (GPPPD). Differential threat and incentive processing may also help explain the stronger persistence of coital avoidance in women with vaginismus compared to women with dyspareunia. As the first aim, we tested if women with GPPPD show (1) heightened attention for pain and sex, and (2) heightened threat and lower incentive associations with sexual penetration. Second, we examined whether the stronger persistence of coital avoidance in vaginismus vs dyspareunia might be explained by a stronger attentional bias or more dysfunctional automatic threat/incentive associations. Women with lifelong vaginismus (n = 37), dyspareunia (n = 29), and a no-symptoms comparison group (n = 51) completed a visual search task to assess attentional bias, and single target implicit-association tests to measure automatic sex-threat and sex-wanting associations. There were no group differences in attentional bias or automatic associations. Correlational analysis showed that slowed detection of sex stimuli and stronger automatic threat associations were related to lowered sexual arousal. The findings do not corroborate the view that attentional bias for pain or sex contributes to coital pain, or that differences in coital avoidance may be explained by differences in attentional bias or automatic threat/incentive associations. However, the correlational findings are consistent with the view that automatic threat associations and impaired attention for sex stimuli may interfere with the generation of sexual arousal. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  15. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  16. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
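
    As an illustration of the parameterization idea behind the two records above (not the spreadsheet's actual formulas or the Developer Mode XML schema), a family of circular trajectories in gantry/couch space can be generated from a couple of variables:

        import math

        def circle_trajectory(radius_deg, n_points, mu_total):
            """Toy parameterized trajectory: gantry and couch-rotation angles
            tracing a circle as MU accumulates. Illustrative only; the real
            Developer Mode XML has its own schema and axis limits."""
            rows = []
            for k in range(n_points + 1):
                t = k / n_points
                rows.append({
                    "mu": mu_total * t,
                    "gantry": radius_deg * math.cos(2 * math.pi * t),
                    "couch_rot": radius_deg * math.sin(2 * math.pi * t),
                })
            return rows

        for row in circle_trajectory(radius_deg=10.0, n_points=4, mu_total=100.0):
            print(row)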

  17. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.

  18. AUTOMATIC WINDING GENERATION USING MATRIX REPRESENTATION - ANFRACTUS TOOL 1.0

    Directory of Open Access Journals (Sweden)

    Daoud Ouamara

    2018-02-01

    Full Text Available This paper describes an original approach to AC/DC winding design in electrical machines. A research software package called “ANFRACTUS Tool 1.0”, allowing automatic generation of all windings in multi-phase electrical machines, has been developed using the matrix representation. Unlike existing methods, where the aim is to synthesize a single winding with higher performance, the proposed method provides the opportunity to choose among all feasible windings. The specificity of this approach is that it takes only the numbers of slots, phases and layers as input parameters; the number of poles is not required to run the generation process. Winding generation by matrix representation may be applied for any number of slots, phases and layers. The software does not deal with the manner in which coils are connected, but only with the placement of the coils in each slot and the sense of their currents. The waveform and the harmonic spectrum of the total magnetomotive force (MMF) are given as results.

  19. Health smart home: towards an assistant tool for automatic assessment of the dependence of elders.

    Science.gov (United States)

    Le, Xuan Hoa Binh; Di Mascolo, Maria; Gouin, Alexia; Noury, Norbert

    2007-01-01

    In order to help elders living alone to age in place independently and safely, it can be useful to have an assistant tool that can automatically assess their dependence and issue an alert if there is any loss of autonomy. The dependence can be assessed by the degree of performance, by the elders, of activities of daily living. This article presents an approach enabling activity recognition for an elder living alone in a Health Smart Home equipped with noninvasive sensors.

  20. Reducing the memory requirement in reverse mode automatic differentiation by solving TBR flow equations

    International Nuclear Information System (INIS)

    Naumann, U.

    2002-01-01

    The fast computation of gradients in reverse mode Automatic Differentiation (AD) requires the generation of adjoint versions of every statement in the original code. Due to the resulting reversal of the control flow, certain intermediate values have to be made available in reverse order to compute the local partial derivatives. This can be achieved by storing these values or by recomputing them when they become required. In either case one is interested in minimizing the size of this set. Following an extensive introduction of the ''To-Be-Recorded'' (TBR) problem, the authors present flow equations for propagating the TBR status of variables in the context of reverse mode AD of structured programs.
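
    The TBR notion can be made concrete with a toy reverse-mode sweep: only the intermediate values that enter the local partial derivatives must be recorded. A minimal sketch, not Naumann's formulation:

        import math

        # Reverse-mode sketch for y = sin(v1) with v1 = x1 * x2. The local
        # partials are dv1/dx1 = x2, dv1/dx2 = x1, dy/dv1 = cos(v1), so
        # x1, x2 and v1 are to-be-recorded (TBR); the value of y itself is
        # not needed in the reverse sweep and could be discarded or recomputed.
        def forward_and_tape(x1, x2):
            v1 = x1 * x2
            y = math.sin(v1)
            tape = {"x1": x1, "x2": x2, "v1": v1}   # the TBR set
            return y, tape

        def reverse_sweep(tape, ybar=1.0):
            v1bar = ybar * math.cos(tape["v1"])     # adjoint of sin
            x1bar = v1bar * tape["x2"]              # adjoint of the product
            x2bar = v1bar * tape["x1"]
            return x1bar, x2bar

        y, tape = forward_and_tape(2.0, 3.0)
        print(reverse_sweep(tape))   # gradient of sin(x1*x2) at (2, 3)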

  1. Differential Forms: A New Tool in Economics

    Science.gov (United States)

    Mimkes, Jürgen

    Econophysics is the transfer of methods from the natural to the socio-economic sciences. This concept was first applied to finance [1], but it is now also used in various applications of economics and social sciences [2,3]. The present paper focuses on problems in macroeconomics and growth. 1. Neoclassical theory [4,5] neglects the “ex post” property of income and growth. Income Y(K, L) is assumed to be a function of capital and labor; but functions cannot model the “ex post” character of income. 2. Neoclassical theory is based on a Cobb-Douglas function [6] with variable elasticity α, which may be fitted to economic data; but an undefined elasticity α leads to a descriptive rather than a predictive economic theory. The present paper introduces a new tool - differential forms and path-dependent integrals - to macroeconomics. This resolves the problems above: 1. The integral of a non-exact differential form is path dependent and can only be calculated “ex post”, like income and economic growth. 2. Non-exact differential forms can be made exact by an integrating factor; this leads to a new, well-defined, unique production function F and a predictive economic theory.
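
    The mathematical point can be made explicit (a standard statement from calculus, not specific to this paper): a one-form is exact precisely when its mixed partials agree; otherwise its line integral is path dependent, which is the “ex post” property invoked above, and an integrating factor restores exactness:

        % Exactness criterion and integrating factor (background sketch)
        \delta Q = a(x,y)\,dx + b(x,y)\,dy \ \text{is exact}
            \iff \frac{\partial a}{\partial y} = \frac{\partial b}{\partial x},
        \qquad
        \text{otherwise choose } \lambda \text{ such that }
            dF = \tfrac{1}{\lambda}\,\delta Q \text{ is exact.}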

  2. Application of automatic change of interval to de Vogelaere's method of the solution of the differential equation y'' = f (x, y)

    International Nuclear Information System (INIS)

    Rogers, M.H.

    1960-11-01

    The paper gives an extension to de Vogelaere's method for the solution of systems of second-order differential equations from which first derivatives are absent. The extension describes the way in which the step length can be changed automatically so as to give a prescribed accuracy at each step. (author)
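
    The step-doubling idea behind automatic interval change can be sketched generically: advance once with a full step and once with two half steps, compare, and halve or double the interval to meet the prescribed accuracy. This is a generic illustration, not de Vogelaere's actual formulas:

        def adaptive_step(step, x, y, v, h, tol):
            """Generic step-doubling control for y'' = f(x, y). `step` advances
            (x, y, y') by h and returns the new state. Not de Vogelaere's
            scheme: any one-step method for second-order ODEs fits here."""
            x1, y1, v1 = step(x, y, v, h)                 # one full step
            xh, yh, vh = step(x, y, v, h / 2)             # two half steps
            x2, y2, v2 = step(xh, yh, vh, h / 2)
            err = abs(y2 - y1)                            # local error estimate
            if err > tol:
                return adaptive_step(step, x, y, v, h / 2, tol)   # retry, halved
            next_h = 2 * h if err < tol / 32 else h       # grow when comfortably small
            return x2, y2, v2, next_h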

  3. High-Order Automatic Differentiation of Unmodified Linear Algebra Routines via Nilpotent Matrices

    Science.gov (United States)

    Dunham, Benjamin Z.

    This work presents a new automatic differentiation method, Nilpotent Matrix Differentiation (NMD), capable of propagating any order of mixed or univariate derivative through common linear algebra functions--most notably third-party sparse solvers and decomposition routines, in addition to basic matrix arithmetic operations and power series--without changing data type or modifying code line by line; this allows differentiation across sequences of arbitrarily many such functions with minimal implementation effort. NMD works by enlarging the matrices and vectors passed to the routines, replacing each original scalar with a matrix block augmented by derivative data; these blocks are constructed with special sparsity structures, termed "stencils," each designed to be isomorphic to a particular multidimensional hypercomplex algebra. The algebras are in turn designed such that Taylor expansions of hypercomplex function evaluations are finite in length and thus exactly track derivatives without approximation error. Although this use of the method in the "forward mode" is unique in its own right, it is also possible to apply it to existing implementations of the (first-order) discrete adjoint method to find high-order derivatives with lowered cost complexity; for example, for a problem with N inputs and an adjoint solver whose cost is independent of N--i.e., O(1)--the N x N Hessian can be found in O(N) time, which is comparable to existing second-order adjoint methods that require far more problem-specific implementation effort. Higher derivatives are likewise less expensive--e.g., an N x N x N rank-three tensor can be found in O(N^2). Alternatively, a Hessian-vector product can be found in O(1) time, which may open up many matrix-based simulations to a range of existing optimization or surrogate modeling approaches. As a final corollary in parallel to the NMD-adjoint hybrid method, the existing complex-step differentiation (CD) technique is also shown to be capable of ...
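
    The first-order univariate case of the mechanism is easy to demonstrate: replacing a scalar x by the 2x2 block [[x, 1], [0, x]], whose off-diagonal part is nilpotent, makes any polynomial evaluated with ordinary matrix arithmetic return f(x) and f'(x) together. A small numpy sketch of this idea (not the NMD implementation):

        import numpy as np

        def nilpotent_block(x):
            """Scalar x augmented with derivative data: the nilpotent
            off-diagonal entry plays the role of the dual unit, so
            f(block) = [[f(x), f'(x)], [0, f(x)]] for polynomial f."""
            return np.array([[x, 1.0],
                             [0.0, x]])

        def poly(X):
            # f(x) = x^3 - 2x + 5, written with matrix operations only,
            # exactly as it would pass through an unmodified linear algebra routine
            I = np.eye(X.shape[0])
            return X @ X @ X - 2.0 * X + 5.0 * I

        B = poly(nilpotent_block(2.0))
        print(B[0, 0])   # f(2)  = 9
        print(B[0, 1])   # f'(2) = 3x^2 - 2 at x=2 -> 10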

  4. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    OpenAIRE

    Baraka D. Sija; Young-Hoon Goo; Kyu-Seok Shim; Huru Hasanova; Myung-Sup Kim

    2018-01-01

    A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Enough knowledge on undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools towards ...

  5. Four-bar linkage-based automatic tool changer: Dynamic modeling and torque optimization

    International Nuclear Information System (INIS)

    Lee, Sangho; Seo, TaeWon; Kim, Jong-Won; Kim, Jongwon

    2017-01-01

    An Automatic tool changer (ATC) is a device used in a tapping machine to reduce process time. This paper presents the optimization of a Peak torque reduction mechanism (PTRM) for an ATC. It is necessary to reduce the fatigue load and energy consumed, which is related to the peak torque. The PTRM uses a torsion spring to reduce the peak torque and was applied to a novel ATC mechanism, which was modeled using inverse dynamics. Optimization of the PTRM is required to minimize the peak torque. The design parameters are the initial angle and stiffness of the torsion spring, and the objective function is the peak torque of the input link. The torque was simulated, and the peak torque was decreased by 10 %. The energy consumed was reduced by the optimization.

  6. Four-bar linkage-based automatic tool changer: Dynamic modeling and torque optimization

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sangho; Seo, TaeWon [Yeungnam University, Gyeongsan (Korea, Republic of); Kim, Jong-Won; Kim, Jongwon [Seoul National University, Seoul (Korea, Republic of)

    2017-05-15

    An Automatic tool changer (ATC) is a device used in a tapping machine to reduce process time. This paper presents the optimization of a Peak torque reduction mechanism (PTRM) for an ATC. It is necessary to reduce the fatigue load and energy consumed, which is related to the peak torque. The PTRM uses a torsion spring to reduce the peak torque and was applied to a novel ATC mechanism, which was modeled using inverse dynamics. Optimization of the PTRM is required to minimize the peak torque. The design parameters are the initial angle and stiffness of the torsion spring, and the objective function is the peak torque of the input link. The torque was simulated, and the peak torque was decreased by 10 %. The energy consumed was reduced by the optimization.

  7. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    Science.gov (United States)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  8. Data Quality Monitoring : Automatic MOnitoRing Environment (AMORE ) Web Administration Tool in ALICE Experiment

    CERN Document Server

    Nagi, Imre

    2013-01-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The quality of the acquired data evolves over time depending on the status of the detectors, their components, and the operating environment. To achieve excellent detector performance, all detector configurations have to be set correctly so that data-taking can proceed in an optimal way. This report describes a new implementation, using web technologies, of the administration tools of ALICE's DQM framework, AMORE (Automatic MonitoRing Environment).

  9. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    Energy Technology Data Exchange (ETDEWEB)

    Wu, C [Sutter Medical Foundation, Roseville, CA (United States)

    2016-06-15

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: An electronic treatment plan reporting software tool has been developed by us to enable fully automatic PDF reporting from the Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports a full auto mode and a manual reporting mode. In full auto mode, with a single mouse click, plan2pdf will generate a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, Pinnacle plan summary, orthogonal views through each plan POI and maximum dose point, DRRs for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients' plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  10. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    International Nuclear Information System (INIS)

    Wu, C

    2016-01-01

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: An electronic treatment plan reporting software tool has been developed by us to enable fully automatic PDF reporting from the Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports a full auto mode and a manual reporting mode. In full auto mode, with a single mouse click, plan2pdf will generate a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, Pinnacle plan summary, orthogonal views through each plan POI and maximum dose point, DRRs for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients' plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  11. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps are produced on a Google map for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, local earthquakes that occurred within the network are automatically discriminated from regional/teleseismic events that occurred outside it. Finally, for the largest events, if a sufficient number of P-wave polarity readings is available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may represent a relevant tool not only for seismologists, but also for non
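
    The duration magnitude mentioned above is typically computed from a relation of the form Md = a + b·log10(coda duration), often with a distance correction; the coefficients in this sketch are placeholders, not the values used by the described software.

      import math

      def duration_magnitude(coda_seconds, dist_km=0.0, a=-0.87, b=2.0, c=0.0035):
          # Generic form Md = a + b*log10(coda duration) + c*distance; the
          # coefficients are placeholders that real networks calibrate locally.
          return a + b * math.log10(coda_seconds) + c * dist_km

      print(duration_magnitude(60.0, dist_km=10.0))  # ~2.7 for these toy coefficients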

  12. Coherence measures in automatic time-migration velocity analysis

    International Nuclear Information System (INIS)

    Maciel, Jonathas S; Costa, Jessé C; Schleicher, Jörg

    2012-01-01

    Time-migration velocity analysis can be carried out automatically by evaluating the coherence of migrated seismic events in common-image gathers (CIGs). The performance of gradient methods for automatic time-migration velocity analysis depends on the coherence measure used as the objective function. We compare the results of four different coherence measures: conventional semblance, differential semblance, an extended differential semblance using differences of more distant image traces, and the product of the latter with conventional semblance. In our numerical experiments, the objective functions based on conventional semblance and on the product of conventional semblance with extended differential semblance provided the best velocity models, as evaluated by the flatness of the resulting CIGs. The method can be easily extended to anisotropic media. (paper)
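
    For readers unfamiliar with the coherence measures being compared, a minimal numpy sketch of conventional and differential semblance on a common-image gather (traces × samples) follows; the paper's exact windowing and normalization may differ.

      import numpy as np

      def semblance(cig):
          # conventional semblance of a CIG with shape (ntraces, nsamples);
          # flat (correctly migrated) events give values near 1
          num = (cig.sum(axis=0) ** 2).sum()
          den = cig.shape[0] * (cig ** 2).sum()
          return num / den

      def differential_semblance(cig, shift=1):
          # energy of differences between image traces `shift` apart; zero for
          # perfectly flat events (shift > 1 gives the "extended" variant)
          return ((cig[shift:] - cig[:-shift]) ** 2).sum()

      flat = np.tile(np.hanning(64), (12, 1))                 # perfectly flat event
      print(semblance(flat), differential_semblance(flat))    # -> 1.0, 0.0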

  13. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    Directory of Open Access Journals (Sweden)

    Raymond Mui

    2010-09-01

    Full Text Available Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation is used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.
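
    ASSIST instruments Java web applications; purely as an illustration of what query sanitization achieves, the Python/sqlite3 sketch below contrasts the vulnerable string-concatenation pattern with its parameterized equivalent.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
      conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

      name = "alice' OR '1'='1"  # hostile input

      # Vulnerable pattern that sanitization rewrites:
      #   conn.execute("SELECT * FROM users WHERE name = '" + name + "'")
      # Sanitized equivalent: the driver binds the value, so the quote is inert.
      rows = conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
      print(rows)  # [] -- the injection attempt matches no user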

  14. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. The authors' more than 10 years of experience in the development and design of interactive tools dedicated to the study of automatic control concepts is also presented. The second part of the paper summarizes the main features of the “Automatic Control with Interactive Tools” text that has been recently published by Pea...

  15. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
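
    A minimal sketch of the kind of analysis reported above: multiple linear regression with leave-one-out cross-validation and a BMI ≥ 23 cutoff, on synthetic stand-in data (the study's real predictors and coefficients are not reproduced here).

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_predict

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 5))  # stand-ins for gender, age, dietary scores, ...
      y = 22 + X @ np.array([1.0, -0.5, 0.8, 0.3, -0.2]) + rng.normal(size=100)  # toy BMI

      pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
      acc = np.mean((pred >= 23) == (y >= 23))  # "would-be obese" cutoff: BMI >= 23
      print(f"leave-one-out classification accuracy: {acc:.1%}")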

  16. Autonomic Differentiation Map: A Novel Statistical Tool for Interpretation of Heart Rate Variability

    Directory of Open Access Journals (Sweden)

    Daniela Lucini

    2018-04-01

    Full Text Available In spite of the large body of evidence suggesting Heart Rate Variability (HRV), alone or combined with blood pressure variability (providing an estimate of baroreflex gain), as a useful technique to assess the autonomic regulation of the cardiovascular system, there is still an ongoing debate about methodology, interpretation, and clinical applications. In the present investigation, we hypothesize that non-parametric and multivariate exploratory statistical manipulation of HRV data could provide a novel informational tool useful to differentiate normal controls from clinical groups, such as athletes, or subjects affected by obesity, hypertension, or stress. With a data-driven protocol in 1,352 ambulant subjects, we compute HRV and baroreflex indices from short-term data series as proxies of autonomic (ANS) regulation. We apply a three-step statistical procedure, by first removing age and gender effects. Subsequently, by factor analysis, we extract four ANS latent domains that retain the large majority of information (86.94%), subdivided into oscillatory (40.84%), amplitude (18.04%), pressure (16.48%), and pulse (11.58%) domains. Finally, we test the overall capacity to differentiate clinical groups vs. controls. To give more practical value and improve readability, statistical results concerning individual discriminant ANS proxies and ANS differentiation profiles are displayed through dedicated graphical tools, i.e., the significance diagram and the ANS differentiation map, respectively. This approach, which simultaneously uses all available information about the system, shows which domains make up the difference in ANS discrimination: e.g., athletes differ from controls in all domains, but with a graded strength: maximal in the (normalized) oscillatory and in the pulse domains, slightly less in the pressure domain and minimal in the amplitude domain. The application of multiple (non-parametric and exploratory) statistical and graphical tools to ANS proxies defines
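
    A toy sketch of the first two steps of the three-step procedure, under the assumption that age/gender effects are removed as regression residuals before factor analysis; the data and loadings here are synthetic.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      n = 300
      age = rng.uniform(20, 70, n)
      gender = rng.integers(0, 2, n).astype(float)
      hrv = rng.normal(size=(n, 10)) + 0.02 * age[:, None]  # toy HRV/baroreflex proxies

      # step 1: remove age and gender effects by keeping regression residuals
      covars = np.column_stack([age, gender])
      resid = hrv - LinearRegression().fit(covars, hrv).predict(covars)

      # step 2: extract four latent ANS domains by factor analysis
      scores = FactorAnalysis(n_components=4, random_state=0).fit_transform(resid)
      print(scores.shape)  # (300, 4) per-subject domain scores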

  17. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  18. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    Science.gov (United States)

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining if the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with completely automated software, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.

  19. Preliminary Design Through Graphs: A Tool for Automatic Layout Distribution

    Directory of Open Access Journals (Sweden)

    Carlo Biagini

    2015-02-01

    Full Text Available Diagrams are essential in the preliminary stages of design for understanding distributive aspects and assisting the decision-making process. By drawing a schematic graph, designers can visualize in a synthetic way the relationships between many aspects: functions and spaces, distribution of layouts, space adjacency, influence of traffic flows within a facility layout, and so on. This process can be automated through the use of modern Information and Communication Technology (ICT) tools that allow designers to manage a large quantity of information. The work that we present is part of an on-going research project into how modern parametric software influences decision-making on the basis of automatic and optimized layout distribution. The method involves two phases: the first aims to define the ontological relations between spaces, with particular reference to a specific building typology (rules of aggregation of spaces); the second entails the implementation of these rules through the use of specialist software. The generation of ontological relations begins with the collection of data from historical manuals and analyses of case studies. These analyses aim to generate a “relationship matrix” based on preferences of space adjacency. The phase of implementing the previously defined rules is based on the use of Grasshopper to analyse and visualize different layout configurations. The layout is generated by simulating a process involving the collision of spheres, which represent specific functions of the design program. The spheres are attracted or rejected as a function of the relationship matrix defined above. The layout thus obtained remains in a sort of abstract state, independent of information about the exterior form, but still provides a useful tool for the decision-making process. In addition, preliminary results gathered through the analysis of case studies are presented. These results provide a good variety
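
    A toy force simulation in the spirit of the colliding-spheres process described above: rooms attract or repel according to a relationship matrix. The names and constants are illustrative, not taken from the Grasshopper implementation.

      import numpy as np

      def layout(affinity, iters=500, step=0.05, radius=1.0):
          # rooms attract where affinity > 0, repel where affinity < 0,
          # and push apart whenever two "spheres" overlap (distance < radius)
          n = affinity.shape[0]
          pos = np.random.default_rng(0).normal(size=(n, 2))
          for _ in range(iters):
              delta = pos[None, :, :] - pos[:, None, :]      # delta[i, j] = pos_j - pos_i
              dist = np.linalg.norm(delta, axis=-1)[:, :, None] + 1e-9
              pull = affinity[:, :, None] * delta / dist     # adjacency preference
              overlap = (dist < radius).astype(float)
              push = -overlap * delta / dist                 # collision response
              pos += step * (pull + push).sum(axis=1)
          return pos

      # relationship matrix: kitchen-dining adjacency preferred, kitchen-bedroom avoided
      A = np.array([[0, 1, -1], [1, 0, 0], [-1, 0, 0]], float)
      print(layout(A))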

  20. Differential evolution algorithm based automatic generation control for interconnected power systems with

    Directory of Open Access Journals (Sweden)

    Banaja Mohanty

    2014-09-01

    Full Text Available This paper presents the design and performance analysis of Differential Evolution (DE) algorithm based Proportional–Integral (PI) and Proportional–Integral–Derivative (PID) controllers for Automatic Generation Control (AGC) of an interconnected power system. Initially, a two-area thermal system with governor dead-band nonlinearity is considered for the design and analysis purpose. In the proposed approach, the design problem is formulated as an optimization problem and DE is employed to search for the optimal controller parameters. Three different objective functions are used for the design purpose. The superiority of the proposed approach has been shown by comparing the results with a recently published Craziness based Particle Swarm Optimization (CPSO) technique for the same interconnected power system. It is noticed that the dynamic performance of the DE optimized PI controller is better than that of the CPSO optimized PI controller. Additionally, controller parameters are tuned at different loading conditions so that an adaptive gain scheduling control strategy can be employed. The study is further extended to a more realistic network of a two-area six-unit system with different power generating units such as thermal, hydro, wind and diesel generating units, considering boiler dynamics for thermal plants, Generation Rate Constraint (GRC) and Governor Dead Band (GDB) nonlinearity.
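
    A minimal sketch of DE-based controller tuning, assuming a toy first-order plant and an ITAE objective rather than the paper's two-area AGC model; scipy's differential_evolution stands in for the authors' DE implementation.

      import numpy as np
      from scipy.optimize import differential_evolution

      def itae(gains, dt=0.01, T=10.0):
          # ITAE cost for a toy first-order plant dy/dt = -y + u under PI control,
          # simulated by forward Euler for a unit step reference
          kp, ki = gains
          y, integ, cost = 0.0, 0.0, 0.0
          for k in range(int(T / dt)):
              e = 1.0 - y
              integ += e * dt
              u = kp * e + ki * integ
              y += dt * (-y + u)
              cost += (k * dt) * abs(e) * dt
          return cost

      res = differential_evolution(itae, bounds=[(0.0, 10.0), (0.0, 10.0)], seed=0)
      print(res.x, res.fun)  # optimized (Kp, Ki) and the corresponding ITAE score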

  1. Child vocalization composition as discriminant information for automatic autism detection.

    Science.gov (United States)

    Xu, Dongxin; Gilkerson, Jill; Richards, Jeffrey; Yapanel, Umit; Gray, Sharmi

    2009-01-01

    Early identification is crucial for young children with autism to access early intervention. The existing screens require either a parent-report questionnaire and/or direct observation by a trained practitioner. Although an automatic tool would benefit parents, clinicians and children, there is no automatic screening tool in clinical use. This study reports a fully automatic mechanism for autism detection/screening for young children. This is a direct extension of the LENA (Language ENvironment Analysis) system, which utilizes speech signal processing technology to analyze and monitor a child's natural language environment and the vocalizations/speech of the child. It is discovered that child vocalization composition contains rich discriminant information for autism detection. By applying pattern recognition and machine learning approaches to child vocalization composition data, accuracy rates of 85% to 90% in cross-validation tests for autism detection have been achieved at the equal-error-rate (EER) point on a data set with 34 children with autism, 30 language delayed children and 76 typically developing children. Due to its easy and automatic procedure, it is believed that this new tool can serve a significant role in childhood autism screening, especially in regards to population-based or universal screening.

  2. Differential equations

    CERN Document Server

    Barbu, Viorel

    2016-01-01

    This textbook is a comprehensive treatment of ordinary differential equations, concisely presenting basic and essential results in a rigorous manner. Including various examples from physics, mechanics, natural sciences, engineering and automatic control theory, Differential Equations is a bridge between the abstract theory of differential equations and applied systems theory. Particular attention is given to the existence and uniqueness of the Cauchy problem, linear differential systems, stability theory and applications to first-order partial differential equations. Upper undergraduate students and researchers in applied mathematics and systems theory with a background in advanced calculus will find this book particularly useful. Supplementary topics are covered in an appendix, enabling the book to be completely self-contained.

  3. Low rank approach to computing first and higher order derivatives using automatic differentiation

    International Nuclear Information System (INIS)

    Reed, J. A.; Abdel-Khalik, H. S.; Utke, J.

    2012-01-01

    This manuscript outlines a new approach for increasing the efficiency of applying automatic differentiation (AD) to large scale computational models. By using the principles of the Efficient Subspace Method (ESM), low rank approximations of the derivatives for first and higher orders can be calculated using minimized computational resources. The output obtained from nuclear reactor calculations typically has a much smaller numerical rank compared to the number of inputs and outputs. This rank deficiency can be exploited to reduce the number of derivatives that need to be calculated using AD. The effective rank can be determined according to ESM by computing derivatives with AD at random inputs. Reduced or pseudo variables are then defined and new derivatives are calculated with respect to the pseudo variables. Two different AD packages are used: OpenAD and Rapsodia. OpenAD is used to determine the effective rank and the subspace that contains the derivatives. Rapsodia is then used to calculate derivatives with respect to the pseudo variables for the desired order. The overall approach is applied to two simple problems and to MATWS, a safety code for sodium cooled reactors. (authors)
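
    A small numpy sketch of the ESM-style rank probing described above: Jacobian-vector products (which AD supplies) are evaluated at random directions, and an SVD of the responses estimates the effective rank and the pseudo-variable basis. The toy rank-3 Jacobian stands in for a real AD-instrumented model such as MATWS.

      import numpy as np

      def effective_rank_probe(jacvec, n_in, n_probe=20, tol=1e-8):
          # apply the (AD-supplied) Jacobian-vector product at random directions
          # and SVD the responses to estimate rank and the pseudo-variable basis
          R = np.random.default_rng(0).normal(size=(n_in, n_probe))
          Y = np.column_stack([jacvec(R[:, j]) for j in range(n_probe)])
          U, s, _ = np.linalg.svd(Y, full_matrices=False)
          r = int((s > tol * s[0]).sum())
          return r, U[:, :r]

      rng = np.random.default_rng(1)
      W = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 50))  # 50x50, rank 3
      r, U = effective_rank_probe(lambda v: W @ v, 50)
      print(r)  # 3 -- only 3 derivative directions need full AD treatment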

  4. Usage of aids monitoring in automatic braking systems of modern cars

    OpenAIRE

    Dembitskyi V.; Mazylyuk P.; Sitovskyi O.

    2016-01-01

    Safety can be increased by installing automatic braking systems on vehicles that monitor the traffic situation and the actions of the driver. In this paper the advantages and disadvantages of automatic braking systems are considered, and the modern tracking tools used in automatic braking systems are analyzed. Based on statistical data on accidents, the main dangers that automatic braking systems are intended to reduce are identified. In order to ensure the acc...

  5. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre

    1975-11-01

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the "linguistic filter", which facilitates the construction of statistical algorithms. Certain special filters, of prepositions, conjunctions, negatives, logical implication, compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed [fr

  6. SU-E-J-272: Auto-Segmentation of Regions with Differentiating CT Numbers for Treatment Response Assessment

    International Nuclear Information System (INIS)

    Yang, C; Noid, G; Dalah, E; Paulson, E; Li, X; Gilat-Schmidt, T

    2015-01-01

    Purpose: It has been reported recently that the change of CT number (CTN) during and after radiation therapy (RT) may be used to assess RT response. The purpose of this work is to develop a tool to automatically segment regions with differentiating CTN and/or with change of CTN in a series of CTs. Methods: A software tool was developed to identify regions with differentiating CTN using K-means clustering of CT numbers and to automatically delineate these regions using a convex-hull enclosing method. Pre- and post-RT CT, PET, or MRI images acquired for sample lung and pancreatic cancer cases were used to test the software tool. K-means clustering of CT numbers within the gross tumor volumes (GTVs) delineated based on the PET SUV (standard uptake value of fludeoxyglucose) and/or the MRI ADC (apparent diffusion coefficient) map was analyzed. The cluster centers with higher values were considered active tumor volumes (ATVs). The convex hull contours enclosing preset clusters were used to delineate these ATVs with color-washed displays. The CTN-defined ATVs were compared with the SUV- or ADC-defined ATVs. Results: The CTN stability of the CT scanner used to acquire the CTs in this work is less than 1.5 Hounsfield Units (HU) of variation annually. K-means cluster centers in the GTV differ by ∼20 HU, much larger than the variation due to CTN stability, for the lung cancer cases studied. The Dice coefficient between the ATVs delineated based on convex hull enclosure of high-CTN centers and the PET-defined GTVs based on an SUV cutoff value of 2.5 was 90(±5)%. Conclusion: A software tool was developed using K-means clustering and convex hull contours to automatically segment high-CTN regions which may not be identifiable using a simple threshold method. These CTN regions reasonably overlapped with the PET- or MRI-defined GTVs.
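
    A minimal 2D sketch of the described pipeline, on synthetic voxels: K-means clustering of CT numbers inside a GTV, selection of the higher-CTN cluster as the ATV, and a convex hull to delineate it.

      import numpy as np
      from scipy.spatial import ConvexHull
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      xy = rng.uniform(0, 50, size=(400, 2))  # toy GTV voxel positions
      hu = np.where(xy[:, 0] < 25, rng.normal(40, 5, 400), rng.normal(60, 5, 400))

      # K-means clustering of CT numbers inside the GTV
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(hu.reshape(-1, 1))
      high = labels == np.argmax([hu[labels == k].mean() for k in (0, 1)])

      # convex hull enclosing the high-CTN cluster delineates the ATV
      hull = ConvexHull(xy[high])
      print(f"ATV hull area: {hull.volume:.1f}")  # for 2D hulls, .volume is the area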

  7. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads

    2011-01-01

    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical...... is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  8. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    Science.gov (United States)

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the use of the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method described, which was applied to 54 CT images coming from a sample of outpatients affected by cognitive impairment, enabled us to obtain the generation of a model overlapping with the original image with quite good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. © The Author(s) 2015.

  9. Aerodynamic design applying automatic differentiation and using robust variable fidelity optimization

    Science.gov (United States)

    Takemiya, Tetsushi

    , and that (2) the AMF terminates optimization erroneously when the optimization problems have constraints. The first problem is due to inaccuracy in computing derivatives in the AMF, and the second problem is due to erroneous treatment of the trust region ratio, which sets the size of the domain for an optimization in the AMF. In order to solve the first problem of the AMF, automatic differentiation (AD) technique, which reads the codes of analysis models and automatically generates new derivative codes based on some mathematical rules, is applied. If derivatives are computed with the generated derivative code, they are analytical, and the required computational time is independent of the number of design variables, which is very advantageous for realistic aerospace engineering problems. However, if analysis models implement iterative computations such as computational fluid dynamics (CFD), which solves system partial differential equations iteratively, computing derivatives through the AD requires a massive memory size. The author solved this deficiency by modifying the AD approach and developing a more efficient implementation with CFD, and successfully applied the AD to general CFD software. In order to solve the second problem of the AMF, the governing equation of the trust region ratio, which is very strict against the violation of constraints, is modified so that it can accept the violation of constraints within some tolerance. By accepting violations of constraints during the optimization process, the AMF can continue optimization without terminating immaturely and eventually find the true optimum design point. With these modifications, the AMF is referred to as "Robust AMF," and it is applied to airfoil and wing aerodynamic design problems using Euler CFD software. The former problem has 21 design variables, and the latter 64. In both problems, derivatives computed with the proposed AD method are first compared with those computed with the finite

  10. A Clustering-Based Automatic Transfer Function Design for Volume Visualization

    Directory of Open Access Journals (Sweden)

    Tianjin Zhang

    2016-01-01

    Full Text Available The two-dimensional transfer functions (TFs) designed based on the intensity-gradient magnitude (IGM) histogram are effective tools for the visualization and exploration of 3D volume data. However, traditional design methods usually depend on multiple rounds of trial-and-error. We propose a novel method for the automatic generation of transfer functions by performing the affinity propagation (AP) clustering algorithm on the IGM histogram. Compared with previous clustering algorithms that were employed in volume visualization, the AP clustering algorithm has a much faster convergence speed and can achieve more accurate clustering results. In order to obtain meaningful clustering results, we introduce two similarity measurements: IGM similarity and spatial similarity. These two similarity measurements can effectively bring the voxels of the same tissue together and differentiate the voxels of different tissues so that the generated TFs can assign different optical properties to different tissues. Before performing the clustering algorithm on the IGM histogram, we propose to remove noisy voxels based on the spatial information of voxels. Our method does not require users to input the number of clusters, and the classification and visualization process is automatic and efficient. Experiments on various datasets demonstrate the effectiveness of the proposed method.
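
    A sketch of affinity propagation on a precomputed combined similarity (IGM similarity plus weighted spatial similarity), on synthetic data; the paper's actual similarity definitions and weights are not reproduced here.

      import numpy as np
      from sklearn.cluster import AffinityPropagation

      rng = np.random.default_rng(0)
      igm = rng.normal(size=(200, 2))    # toy (intensity, gradient magnitude) features
      space = rng.normal(size=(200, 2))  # toy spatial coordinates

      def neg_sqdist(a):
          # negative squared Euclidean distance as a similarity matrix
          return -((a[:, None, :] - a[None, :, :]) ** 2).sum(-1)

      S = neg_sqdist(igm) + 0.5 * neg_sqdist(space)  # IGM + weighted spatial similarity
      labels = AffinityPropagation(affinity="precomputed", random_state=0).fit_predict(S)
      print(len(set(labels)), "clusters found without specifying a cluster count")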

  11. Do Automatic Self-Associations Relate to Suicidal Ideation?

    NARCIS (Netherlands)

    Glashouwer, Klaske A.; de Jong, Peter J.; Penninx, Brenda W. J. H.; Kerkhof, Ad J. F. M.; van Dyck, Richard; Ormel, Johan

    Dysfunctional self-schemas are assumed to play an important role in suicidal ideation. According to recent information-processing models, it is important to differentiate between 'explicit' beliefs and automatic associations. Explicit beliefs stem from the weighting of propositions and their

  12. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior...... knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we...

  13. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze...... programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers...

  14. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary-searching ability of the live-wire method, reducing the necessary user interaction while keeping the segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  15. SELFADJUSTING AUTOMATIC CONTROL OF SOWING UNIT

    Directory of Open Access Journals (Sweden)

    A. Yu. Izmaylov

    2015-01-01

    Full Text Available Self-adjusting automatic control of the sowing unit and differentiated application of mineral fertilizer doses according to agrochemical indicators of the soil (precision agriculture) are being used more widely nowadays. The main requirement for differentiated seeding and fertilizing is the accuracy and duration of the transition from one norm to another. It was established that at a unit speed of 10 km/h the machine moves about 1.5 m or more in 0.5 s, while the radio-channel differential correction in this device is updated every 10 s, and in the RTK mode every 0.5-2 s, which degrades the accuracy of seed and fertilizer application. A block diagram was worked out for a system of automatic control of the technological process of seeding and mineral fertilizing, using navigation means for orientation of machine-tractor aggregates in the field and technical means for realization of precision-agriculture technology at sowing and fertilizer application, based on electronic soil-fertility maps and navigation satellite systems. It was noted that for regulation of the fertilizing dose it is necessary to equip the unit with an electric drive, and that to reduce errors navigation GLONASS, GPS and Galileo receivers should be used. A 32-channel receiver tracking the four leading navigation systems GPS/GLONASS/Galileo/Compass, developed by the domestic firm «KB NAVIS», was suggested. It was established that the automated device created by the All-Russia Research Institute of Mechanization for Agriculture, based on NAVSTAR and GLONASS/GPS system information, successfully operates the seeding and makes differentiated fertilizing possible.

  16. Conditioned craving cues elicit an automatic approach tendency

    NARCIS (Netherlands)

    van Gucht, D.; Vansteenwegen, D.; Van den Bergh, O.; Beckers, T.

    2008-01-01

    In two experiments, we used a Pavlovian differential conditioning procedure to induce craving for chocolate. As a result of repeated pairing with chocolate intake, initially neutral cues came to elicit an automatic approach tendency in a speeded stimulus-response compatibility reaction time task.

  17. Development of a clinical applicable graphical user interface to automatically detect exercise oscillatory ventilation: The VOdEX-tool.

    Science.gov (United States)

    Cornelis, Justien; Denis, Tim; Beckers, Paul; Vrints, Christiaan; Vissers, Dirk; Goossens, Maggy

    2017-08-01

    Cardiopulmonary exercise testing (CPET) has gained importance in the prognostic assessment of patients, especially those with heart failure (HF). A meaningful prognostic parameter for early mortality in HF is exercise oscillatory ventilation (EOV). This abnormal respiratory pattern is recognized by hypo- and hyperventilation during CPET. Until now, assessment of EOV has mainly been done by visual agreement or manual calculation. The purpose of this research was to automate the interpretation of EOV so that this prognostic parameter can be readily investigated during CPET. First, four definitions describing the original characteristics of EOV were selected and integrated in the "Ventilatory Oscillations during Exercise-tool" (VOdEX-tool), a graphical user interface that allows automated calculation of EOV. A Discrete Meyer level-2 wavelet transformation appeared to be the optimal filter to apply to the collected breath-by-breath minute ventilation CPET data. Diverse aspects of the definitions, i.e. cycle length, amplitude, regularity and total duration of EOV, were combined and calculated. The oscillations meeting the criteria were visualised. Filter methods and cut-off criteria were made adjustable for clinical application and research. The VOdEX-tool was connected to a database. The VOdEX-tool provides the possibility to calculate EOV automatically and to present the clinician an overview of the presence of EOV at a glance. The computerized analysis of EOV can be made readily available in clinical practice by integrating the tool into the manufacturers' existing CPET software. The VOdEX-tool enhances assessment of EOV and therefore contributes to the estimation of prognosis, especially in patients with HF. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
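
    PyWavelets ships a discrete Meyer wavelet ("dmey"), so the paper's chosen filter can be sketched as below on synthetic minute-ventilation data; exactly how the level-2 decomposition is applied in the VOdEX-tool is an assumption here (the approximation is kept as the denoised signal).

      import numpy as np
      import pywt

      t = np.arange(0.0, 600.0)  # seconds
      ve = (20 + 5 * np.sin(2 * np.pi * t / 60)
            + np.random.default_rng(0).normal(0, 1, t.size))  # toy minute ventilation

      coeffs = pywt.wavedec(ve, "dmey", level=2)           # Discrete Meyer, level 2
      coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]  # drop fine-scale breath noise
      denoised = pywt.waverec(coeffs, "dmey")[: ve.size]

      # cycle-length/amplitude/regularity criteria would then be tested on `denoised`
      print(denoised.std())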

  18. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer.

    Science.gov (United States)

    Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E

    2013-06-07

    In radiation oncology, recurrence analysis is an important part of the evaluation process and clinical quality assurance of treatment concepts. With the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border/out of field. With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm radius; however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
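
    The overlap statistic described above (percentage of the recurrence volume inside the 80% isodose) reduces to boolean-mask arithmetic on co-registered grids, as in this sketch with synthetic dose data.

      import numpy as np

      def pct_recurrence_in_isodose(dose, recurrence, rx_dose, level=0.8):
          # percentage of the recurrence volume inside the `level` isodose volume,
          # assuming the dose grid and recurrence mask are already co-registered
          isodose = dose >= level * rx_dose
          return 100.0 * (isodose & recurrence).sum() / recurrence.sum()

      dose = np.random.default_rng(0).uniform(0.0, 60.0, size=(40, 40, 40))  # toy dose
      rec = np.zeros(dose.shape, bool)
      rec[18:22, 18:22, 18:22] = True                                        # toy recurrence
      print(pct_recurrence_in_isodose(dose, rec, rx_dose=55.0))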

  19. Development of automatic laser welding system

    International Nuclear Information System (INIS)

    Ohwaki, Katsura

    2002-01-01

    Lasers are a new production tool for high-speed and low-distortion welding, and applications to automatic welding lines are increasing. IHI has long experience of laser processing for the preservation of nuclear power plants, welding of airplane engines and so on. Moreover, YAG laser oscillators and various kinds of hardware have been developed for laser welding and automation. Combining these welding technologies and laser hardware technologies produces the automatic laser welding system. In this paper, the component technologies are described, including combined optics intended to improve welding stability, laser oscillators, a monitoring system, a seam tracking system and so on. (author)

  20. Inter Genre Similarity Modelling For Automatic Music Genre Classification

    OpenAIRE

    Bagci, Ulas; Erzin, Engin

    2009-01-01

    Music genre classification is an essential tool for music information retrieval systems and it has been finding critical applications in various media platforms. Two important problems of the automatic music genre classification are feature extraction and classifier design. This paper investigates inter-genre similarity modelling (IGS) to improve the performance of automatic music genre classification. Inter-genre similarity information is extracted over the mis-classified feature population....

  1. An open source automatic quality assurance (OSAQA) tool for the ACR MRI phantom.

    Science.gov (United States)

    Sun, Jidi; Barnes, Michael; Dowling, Jason; Menk, Fred; Stanwell, Peter; Greer, Peter B

    2015-03-01

    Routine quality assurance (QA) is necessary and essential to ensure MR scanner performance. This includes geometric distortion, slice positioning and thickness accuracy, high contrast spatial resolution, intensity uniformity, ghosting artefact and low contrast object detectability. However, this manual process can be very time consuming. This paper describes the development and validation of an open source tool to automate the MR QA process, which aims to increase physicist efficiency, and improve the consistency of QA results by reducing human error. The OSAQA software was developed in Matlab and the source code is available for download from http://jidisun.wix.com/osaqa-project/. During program execution QA results are logged for immediate review and are also exported to a spreadsheet for long-term machine performance reporting. For the automatic contrast QA test, a user specific contrast evaluation was designed to improve accuracy for individuals on different display monitors. American College of Radiology QA images were acquired over a period of 2 months to compare manual QA and the results from the proposed OSAQA software. OSAQA was found to significantly reduce the QA time from approximately 45 to 2 min. Both the manual and OSAQA results were found to agree with regard to the recommended criteria and the differences were insignificant compared to the criteria. The intensity homogeneity filter is necessary to obtain an image with acceptable quality and at the same time keeps the high contrast spatial resolution within the recommended criterion. The OSAQA tool has been validated on scanners with different field strengths and manufacturers. A number of suggestions have been made to improve both the phantom design and QA protocol in the future.
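
    One of the listed tests, intensity uniformity, is commonly scored by the ACR percent integral uniformity; a sketch follows, with the window size and the noise phantom as assumptions rather than OSAQA's actual implementation.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def percent_integral_uniformity(roi, window=5):
          # PIU = 100 * (1 - (hi - lo)/(hi + lo)), where hi/lo are the brightest
          # and darkest small-window mean signals inside the large ROI
          means = uniform_filter(roi.astype(float), size=window)
          hi, lo = means.max(), means.min()
          return 100.0 * (1.0 - (hi - lo) / (hi + lo))

      phantom = np.random.default_rng(0).normal(1000.0, 20.0, size=(128, 128))
      print(percent_integral_uniformity(phantom))  # ACR asks roughly >= 87.5% at 1.5 T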

  2. Norms concerning the programmable automatic control devices

    International Nuclear Information System (INIS)

    Fourmentraux, G.

    1995-01-01

    This presentation is a report of the studies carried out by the Work Group on Functioning Safety of Programmable Automatic Control Devices and by the Group for Prevention Studies (GEP) from the CEA. The objective of these groups is to evaluate the methods which could be used to estimate the functioning safety of control and instrumentation systems involved in the Important Elements for Safety (EIS) of the Basic Nuclear Installations (INB) of the CEA, and also to carry out a qualification of automatic control devices. Norms, protocols and tools for the evaluation are presented. The problem comprises two aspects: the evaluation of fault avoidance techniques and the evaluation of fault control techniques used during the conceiving. For the fault avoidance techniques, the quality assurance organization, the environment tests, and the software quality plans are considered. For the fault control techniques, the different available tools and fault injection models are analysed. The results of an analysis carried out with the DEF.I tool from the National Institute for Research and Safety (INRS) are reported. (J.S.). 23 refs

  3. Automatic loading pattern optimization tool for Loviisa VVER-440 reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kuopanportti, Jaakko [Fortum Power and Heat, Fortum (Finland). Nuclear Competence Center

    2013-09-15

    An automatic loading pattern optimization tool called ALPOT has been developed for the Loviisa VVER-440 reactors. The ALPOT code utilizes a combination of three different optimization methods. The first method is the imitation of the equilibrium pattern, i.e. the optimized pattern in the case where the cycle length and the operating conditions are constant and the same shuffling pattern is repeated from cycle to cycle. In practice, the algorithm stochastically imitates the operation-year distribution of the assemblies in the equilibrium pattern. The function of the imitation algorithm is to provide initial patterns quickly for the next optimization phase, which is performed either with the stochastic guided binary search algorithm or the deterministic burnup kernel method, depending on the choice of the user. The former is a modified version of the standard binary search. The standard version goes through all possible swaps of the assemblies and chooses the best swap at each iteration round. The guided version chooses one assembly, tries to swap it with every other possible assembly, and performs the best swap at each iteration round. The search is guided so that the algorithm chooses assemblies at or near the most restrictive fuel assembly first. The kernel method creates burnup kernel functions to estimate the burnup variations that are required to achieve desired changes in the power distribution of the reactor. The idea of the kernel method is to first determine the optimal burnup distribution that minimizes the maximum relative assembly power, using the created kernel functions and a common solver routine. Then, the burnups of the available fuel assemblies are matched with the obtained burnup distribution. (orig.)

  4. Automatic loading pattern optimization tool for Loviisa VVER-440 reactors

    International Nuclear Information System (INIS)

    Kuopanportti, Jaakko

    2013-01-01

    An automatic loading pattern optimization tool called ALPOT has been developed for the Loviisa VVER-440 reactors. The ALPOT code utilizes a combination of three different optimization methods. The first method is the imitation of the equilibrium pattern, i.e. the optimized pattern in the case where the cycle length and the operating conditions are constant and the same shuffling pattern is repeated from cycle to cycle. In practice, the algorithm stochastically imitates the operation-year distribution of the assemblies in the equilibrium pattern. The function of the imitation algorithm is to provide initial patterns quickly for the next optimization phase, which is performed either with the stochastic guided binary search algorithm or the deterministic burnup kernel method, depending on the choice of the user. The former is a modified version of the standard binary search. The standard version goes through all possible swaps of the assemblies and chooses the best swap at each iteration round. The guided version chooses one assembly, tries to swap it with every other possible assembly, and performs the best swap at each iteration round. The search is guided so that the algorithm chooses assemblies at or near the most restrictive fuel assembly first. The kernel method creates burnup kernel functions to estimate the burnup variations that are required to achieve desired changes in the power distribution of the reactor. The idea of the kernel method is to first determine the optimal burnup distribution that minimizes the maximum relative assembly power, using the created kernel functions and a common solver routine. Then, the burnups of the available fuel assemblies are matched with the obtained burnup distribution. (orig.)
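
    A pure-Python sketch of one round of the guided binary search described in both records above; evaluate() and most_restrictive() are hypothetical stand-ins for the core-simulator call and the hot-assembly lookup.

      def guided_swap_round(pattern, evaluate, most_restrictive):
          # one round of the guided search: take the assembly at/near the most
          # restrictive position, trial-swap it with every other assembly, and
          # keep the best improving swap (if any)
          i = most_restrictive(pattern)
          best_j, best_score = None, evaluate(pattern)
          for j in range(len(pattern)):
              if j == i:
                  continue
              pattern[i], pattern[j] = pattern[j], pattern[i]
              score = evaluate(pattern)  # e.g. maximum relative assembly power
              if score < best_score:
                  best_j, best_score = j, score
              pattern[i], pattern[j] = pattern[j], pattern[i]  # undo trial swap
          if best_j is not None:
              pattern[i], pattern[best_j] = pattern[best_j], pattern[i]
          return pattern, best_score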

  5. Automatic Synthesis of Robust and Optimal Controllers

    DEFF Research Database (Denmark)

    Cassez, Franck; Jessen, Jan Jacob; Larsen, Kim Guldstrand

    2009-01-01

    In this paper, we show how to apply recent tools for the automatic synthesis of robust and near-optimal controllers for a real industrial case study. We show how to use three different classes of models and their supporting existing tools, Uppaal-TiGA for synthesis, phaver for verification......, and Simulink for simulation, in a complementary way. We believe that this case study shows that our tools have reached a level of maturity that allows us to tackle interesting and relevant industrial control problems....

  6. Towards an automatic tool for resolution evaluation of mammographic images

    Energy Technology Data Exchange (ETDEWEB)

    De Oliveira, J. E. E. [FUMEC, Av. Alfonso Pena 3880, CEP 30130-009 Belo Horizonte - MG (Brazil); Nogueira, M. S., E-mail: juliae@fumec.br [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Pte. Antonio Carlos 6627, 31270-901, Belo Horizonte - MG (Brazil)

    2014-08-15

    Quality of Mammographies from the Public and Private Services of the State. With an essentially educational character, an evaluation of image quality is held monthly using a breast phantom on each mammographic unit. In view of this, this work proposes to develop a protocol for automatic evaluation of the image quality of mammograms, so that the radiological protection and image quality requirements are met in the early detection of breast cancer. Specifically, image resolution is addressed and evaluated as a part of the program of image quality evaluation. Results show that for the fourth resolution, using 28 phantom images with the ground truth settled, the computer analysis of the resolution is promising and may be used as a tool for the assessment of image quality. (Author)

  7. Towards an automatic tool for resolution evaluation of mammographic images

    International Nuclear Information System (INIS)

    De Oliveira, J. E. E.; Nogueira, M. S.

    2014-08-01

    Quality of Mammographies from the Public and Private Services of the State. With an essentially educational character, an evaluation of image quality is held monthly using a breast phantom on each mammographic unit. In view of this, this work proposes to develop a protocol for automatic evaluation of the image quality of mammograms, so that the radiological protection and image quality requirements are met in the early detection of breast cancer. Specifically, image resolution is addressed and evaluated as a part of the program of image quality evaluation. Results show that for the fourth resolution, using 28 phantom images with the ground truth settled, the computer analysis of the resolution is promising and may be used as a tool for the assessment of image quality. (Author)

  8. Fourier transform infrared microspectroscopy identifies early lineage commitment in differentiating human embryonic stem cells.

    Science.gov (United States)

    Heraud, Philip; Ng, Elizabeth S; Caine, Sally; Yu, Qing C; Hirst, Claire; Mayberry, Robyn; Bruce, Amanda; Wood, Bayden R; McNaughton, Don; Stanley, Edouard G; Elefanty, Andrew G

    2010-03-01

    Human ESCs (hESCs) are a valuable tool for the study of early human development and represent a source of normal differentiated cells for pharmaceutical and biotechnology applications and ultimately for cell replacement therapies. For all applications, it will be necessary to develop assays to validate the efficacy of hESC differentiation. We explored the capacity for FTIR spectroscopy, a technique that rapidly characterises cellular macromolecular composition, to discriminate mesendoderm or ectoderm committed cells from undifferentiated hESCs. Distinct infrared spectroscopic "signatures" readily distinguished hESCs from these early differentiated progeny, with bioinformatic models able to correctly classify over 97% of spectra. These data identify a role for FTIR spectroscopy as a new modality to complement conventional analyses of hESCs and their derivatives. FTIR spectroscopy has the potential to provide low-cost, automatable measurements for the quality control of stem and differentiated cells to be used in industry and regenerative medicine. Crown Copyright 2009. Published by Elsevier B.V. All rights reserved.

  9. Microcontroller based automatic temperature control for oyster mushroom plants

    Science.gov (United States)

    Sihombing, P.; Astuti, T. P.; Herriyance; Sitompul, D.

    2018-03-01

    Oyster mushrooms need special treatment in cultivation because they are susceptible to disease. Mushroom growth will be inhibited if the temperature and humidity are not well controlled, because temperature and humidity affect mold growth. Oyster mushroom growth will usually be optimal at temperatures around 22-28°C and humidity around 70-90%. This problem is often encountered in the cultivation of oyster mushrooms. Therefore it is very important to control the temperature and humidity of the room for oyster mushroom cultivation. In this paper, we developed an Arduino Uno microcontroller-based automatic temperature monitoring tool for the cultivation of oyster mushrooms. We have designed a tool that controls the temperature and humidity automatically via an Android smartphone. If the temperature rises above 28°C in the mushroom room, the tool turns on the pump automatically to run water in order to lower the room temperature; if the room temperature falls below 22°C, the light is turned on in order to heat the room. Thus the temperature in the oyster mushroom room remains stable, so that the oyster mushrooms can grow with good quality.
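
    A minimal sketch of the stated control rule (pump on above 28°C, lamp on below 22°C) with mocked hardware I/O; on the actual Arduino Uno these would be sensor reads and digital outputs.

      import time

      PUMP_ON_ABOVE = 28.0  # degrees C, thresholds from the text
      LAMP_ON_BELOW = 22.0

      def read_temperature():
          return 25.0  # stand-in for a DHT-style sensor read

      def control_step(temp_c, pump, lamp):
          pump(temp_c > PUMP_ON_ABOVE)  # run water to lower the room temperature
          lamp(temp_c < LAMP_ON_BELOW)  # heat the room

      for _ in range(3):  # the real controller loops forever
          control_step(read_temperature(),
                       pump=lambda on: print("pump", "ON" if on else "off"),
                       lamp=lambda on: print("lamp", "ON" if on else "off"))
          time.sleep(1)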

  10. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is a currently growing field based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic analysis of associations between the variation of the patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automatization of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through the search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different case studies regarding the analysis of
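
    Per-SNP association testing of the kind described above is often a 2×2 contingency test; this sketch uses Fisher's exact test on hypothetical counts, which may or may not match DMET-Analyzer's actual statistic.

      import numpy as np
      from scipy.stats import fisher_exact

      # hypothetical 2x2 table: SNP variant carriers vs. drug responders
      #                 responder  non-responder
      table = np.array([[30,        10],    # variant present
                        [20,        40]])   # variant absent
      odds_ratio, p = fisher_exact(table)
      print(f"OR = {odds_ratio:.2f}, p = {p:.4g}")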

  11. An FMS Dynamic Production Scheduling Algorithm Considering Cutting Tool Failure and Cutting Tool Life

    International Nuclear Information System (INIS)

    Setiawan, A; Wangsaputra, R; Halim, A H; Martawirya, Y Y

    2016-01-01

    This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to the unavailability of cutting tools, caused either by cutting tool failure or by the tool life limit being reached. The FMS consists of parallel identical machines integrated with an automatic material handling system, and it runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. A job usually takes two stages. Each stage has sequential operations allocated to machines considering the cutting tool life. In the real situation, a cutting tool can fail before its tool life is reached. The objective in this paper is to develop a dynamic scheduling algorithm for when a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps. The first step is generating the initial schedule, the second step is determination of the cutting tool failure time, the third step is determination of the system status at the cutting tool failure time, and the fourth step is rescheduling the unfinished jobs. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules result in different starting and completion times for each operation compared with the initial schedule. (paper)

  12. MAISTAS: a tool for automatic structural evaluation of alternative splicing products.

    KAUST Repository

    Floris, Matteo; Raimondo, Domenico; Leoni, Guido; Orsini, Massimiliano; Marcatili, Paolo; Tramontano, Anna

    2011-01-01

    MOTIVATION: Analysis of the human genome revealed that the amount of transcribed sequence is an order of magnitude greater than the number of predicted and well-characterized genes. A sizeable fraction of these transcripts is related to alternatively spliced forms of known protein-coding genes. Inspection of the alternatively spliced transcripts identified in the pilot phase of the ENCODE project has clearly shown that their structure often differs substantially from that of other isoforms of the same gene, and therefore that they might perform unrelated functions or might not even correspond to a functional protein. Identifying these cases is obviously relevant for the functional assignment of gene products and for the interpretation of the effect of variations in the corresponding proteins. RESULTS: Here we describe a publicly available tool that, given a gene or a protein, retrieves and analyses all its annotated isoforms, provides users with three-dimensional models of the isoform(s) of interest whenever possible, and automatically assesses whether the homology-derived structural models correspond to plausible structures. This information is clearly relevant: when the homology model of some isoforms of a gene does not seem structurally plausible, the implications are that either they assume a structure unrelated to that of the other isoforms of the same gene, with presumably significant functional differences, or they do not correspond to functional products. We provide indications that the second hypothesis is likely to be true for a substantial fraction of the cases. AVAILABILITY: http://maistas.bioinformatica.crs4.it/.

  13. MAISTAS: a tool for automatic structural evaluation of alternative splicing products.

    KAUST Repository

    Floris, Matteo

    2011-04-15

    MOTIVATION: Analysis of the human genome revealed that the amount of transcribed sequence is an order of magnitude greater than the number of predicted and well-characterized genes. A sizeable fraction of these transcripts is related to alternatively spliced forms of known protein-coding genes. Inspection of the alternatively spliced transcripts identified in the pilot phase of the ENCODE project has clearly shown that their structure often differs substantially from that of other isoforms of the same gene, and therefore that they might perform unrelated functions or might not even correspond to a functional protein. Identifying these cases is obviously relevant for the functional assignment of gene products and for the interpretation of the effect of variations in the corresponding proteins. RESULTS: Here we describe a publicly available tool that, given a gene or a protein, retrieves and analyses all its annotated isoforms, provides users with three-dimensional models of the isoform(s) of interest whenever possible, and automatically assesses whether the homology-derived structural models correspond to plausible structures. This information is clearly relevant: when the homology model of some isoforms of a gene does not seem structurally plausible, the implications are that either they assume a structure unrelated to that of the other isoforms of the same gene, with presumably significant functional differences, or they do not correspond to functional products. We provide indications that the second hypothesis is likely to be true for a substantial fraction of the cases. AVAILABILITY: http://maistas.bioinformatica.crs4.it/.

  14. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

    Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) a user describes the requirements of the target embedded network systems as logical property-based constraints using SENS; (2) given SENS specifications, test cases are automatically generated using a SAT-based solver, and filtering mechanisms to select efficient test cases are also available in our tool; (3) in addition, given a testing goal by the user, test sequences are automatically extracted from the exhaustive test cases. We implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in the design and test generation of real embedded air-conditioning network systems.

  15. Theory and applications of differential algebra

    International Nuclear Information System (INIS)

    Pusch, G.D.

    1992-01-01

    Differential algebra (DA) is a new method of automatic differentiation. DA can rapidly and efficiently calculate the values of derivatives of arbitrarily complicated functions, in arbitrarily many variables, to arbitrary order, via its definition of multiplication. I provide a brief introduction to DA, and enumerate some of its recent applications. (author). 6 refs
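
    As a toy illustration of how derivatives can propagate through a definition of multiplication, the sketch below carries a value together with its first derivative (a dual number); real DA systems generalize this to arbitrary orders in many variables. All names are illustrative, and this is a sketch of the principle, not the author's implementation.

      class Dual:
          """Value together with its first derivative."""
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val + other.val, self.der + other.der)
          __radd__ = __add__
          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              # the product rule is encoded in the definition of multiplication
              return Dual(self.val * other.val,
                          self.der * other.val + self.val * other.der)
          __rmul__ = __mul__

      x = Dual(3.0, 1.0)          # seed dx/dx = 1
      f = x * x + 2 * x + 1       # f(x) = x^2 + 2x + 1
      print(f.val, f.der)         # 16.0 and f'(3) = 8.0, exact to roundoff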

  16. Validation of a Novel Digital Tool in Automatic Scoring of an Online ECG Examination at an International Cardiology Meeting.

    Science.gov (United States)

    Quinn, Kieran L; Crystal, Eugene; Lashevsky, Ilan; Arouny, Banafsheh; Baranchuk, Adrian

    2016-07-01

    We have previously developed a novel digital tool capable of automatically recognizing correct electrocardiography (ECG) diagnoses in an online exam, and demonstrated a significant improvement in diagnostic accuracy when utilizing an inductive-deductive reasoning strategy over a pattern recognition strategy. In this study, we sought to validate these findings with participants at the International Winter Arrhythmia School meeting, one of the foremost electrophysiology events in Canada. Preregistration for the event was sent by e-mail. The exam was administered on day 1 of the conference, and results and analysis were presented to participants the following morning. Twenty-five attendees completed the exam, providing a total of 500 responses to be marked. The online tool accurately identified 195 of a total of 395 correct responses (49%). In total, 305 responses required secondary manual review, of which 200 were added to the correct responses pool. The overall accuracy of correct ECG diagnosis for all participants was 69% and 84% when using pattern recognition or inductive-deductive strategies, respectively. Utilization of a novel digital tool to evaluate ECG competency can be set up as a workshop at international meetings or educational events, and results can be presented during the sessions to ensure immediate feedback.

  17. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television, optical-type automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light-sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.

  18. AUTOMATIC AND GENERIC MOSAICING OF MULTISENSOR IMAGES: AN APPLICATION TO PLEIADES HR

    Directory of Open Access Journals (Sweden)

    F. Bignalet-Cazalet

    2012-07-01

    In the early phase of the Pleiades program, the CNES (the French Space Agency) specified and developed a fully automatic mosaicing processing unit, in order to generate satellite image mosaics under operational conditions. This tool can automatically put each input image in a common geometry, homogenize the radiometry, and generate orthomosaics using stitching lines. As the image quality commissioning phase of Pleiades1A is on-going, this mosaicing process is being tested for the first time under operational conditions. The newly launched French high resolution satellite can acquire adjacent images for French Civil and Defense User Ground Segments. This paper presents the very first results of mosaicing Pleiades1A images. Beyond Pleiades' use, our mosaicing tool can process a significant variety of images, including other satellite and airborne acquisitions, using automatically-taken or external ground control points, offering time-based image superposition, and more. This paper also presents the design of the mosaicing tool and describes the processing workflow and the additional capabilities and applications.

  19. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase differences for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To verify the performance experimentally, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly- and elliptically-shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of automatic design by software. The proposed method provides a smart tool to realize various functional devices and systems automatically.

  20. Automatic Model-Based Generation of Parameterized Test Cases Using Data Abstraction

    NARCIS (Netherlands)

    Calamé, Jens R.; Ioustinova, Natalia; Romijn, J.M.T.; Smith, G.; van de Pol, Jan Cornelis

    2007-01-01

    Developing test suites is a costly and error-prone process. Model-based test generation tools facilitate this process by automatically generating test cases from system models. The applicability of these tools, however, depends on the size of the target systems. Here, we propose an approach to

  1. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Background: Clinical Bioinformatics is a growing field based on the integration of clinical and omics data, aiming at the development of personalized medicine. The introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may therefore help this field develop. For instance, the Affymetrix DMET (drug metabolism enzymes and transporters) platform is able to study the relationship between variations in patients' genomes and drug metabolism by detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This makes it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results: We developed DMET-Analyzer, a tool for automatic association analysis between variations in patient genomes and the clinical conditions of patients, i.e. their different responses to drugs. The proposed system allows: (i) automation of the DMET-SNP analysis workflow, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and searches of existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through searches of PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different
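
    One plausible way to test such SNP/drug-response associations on a 2x2 contingency table is Fisher's exact test; the counts below are invented for illustration, and the choice of test is an assumption, not a statement of the tool's published internals.

      from scipy.stats import fisher_exact

      #                 responders  non-responders   (hypothetical cohort)
      table = [[12, 3],   # SNP present
               [5, 14]]   # SNP absent
      odds_ratio, p_value = fisher_exact(table)
      print(f"odds ratio={odds_ratio:.2f}, p={p_value:.4f}")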

  2. High-Throughput Screening Enhances Kidney Organoid Differentiation from Human Pluripotent Stem Cells and Enables Automated Multidimensional Phenotyping.

    Science.gov (United States)

    Czerniecki, Stefan M; Cruz, Nelly M; Harder, Jennifer L; Menon, Rajasree; Annis, James; Otto, Edgar A; Gulieva, Ramila E; Islas, Laura V; Kim, Yong Kyun; Tran, Linh M; Martins, Timothy J; Pippin, Jeffrey W; Fu, Hongxia; Kretzler, Matthias; Shankland, Stuart J; Himmelfarb, Jonathan; Moon, Randall T; Paragas, Neal; Freedman, Benjamin S

    2018-05-15

    Organoids derived from human pluripotent stem cells are a potentially powerful tool for high-throughput screening (HTS), but the complexity of organoid cultures poses a significant challenge for miniaturization and automation. Here, we present a fully automated, HTS-compatible platform for enhanced differentiation and phenotyping of human kidney organoids. The entire 21-day protocol, from plating to differentiation to analysis, can be performed automatically by liquid-handling robots, or alternatively by manual pipetting. High-content imaging analysis reveals both dose-dependent and threshold effects during organoid differentiation. Immunofluorescence and single-cell RNA sequencing identify previously undetected parietal, interstitial, and partially differentiated compartments within organoids and define conditions that greatly expand the vascular endothelium. Chemical modulation of toxicity and disease phenotypes can be quantified for safety and efficacy prediction. Screening in gene-edited organoids in this system reveals an unexpected role for myosin in polycystic kidney disease. Organoids in HTS formats thus establish an attractive platform for multidimensional phenotypic screening.

  3. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  4. Automatic associations with the sensory aspects of smoking: Positive in habitual smokers but negative in non-smokers

    OpenAIRE

    Huijding, Jorg; Jong, Peter

    2006-01-01

    To test whether pictorial stimuli that focus on the sensory aspects of smoking elicit different automatic affective associations in smokers than in non-smokers, 31 smoking and 33 non-smoking students completed a single-target IAT. Explicit attitudes were assessed using a semantic differential. Automatic affective associations were positive in smokers but negative in non-smokers. Only automatic affective associations, but not self-reported attitudes, were significantly correlated with...

  5. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    Science.gov (United States)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U.S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user-supplied post-processors to convert the CL data into a form suitable for a particular N/C machine tool. This June 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature, which allows concave and convex polygon shapes of up to 40 points, including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user-friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  6. Molecular polymorphism as a tool for differentiating ground beetles (Carabus species): application of ubiquitin PCR/SSCP analysis.

    Science.gov (United States)

    Boge, A; Gerstmeier, R; Einspanier, R

    1994-11-01

    Differentiation between Carabus species (ground beetles) and subspecies is difficult, despite extensive studies. To address this problem we applied PCR in combination with SSCP analysis, focusing on the evolutionarily conserved ubiquitin gene, to elaborate a new approach to molecular differentiation between species. We report that Carabidae possess a ubiquitin gene and that this gene has a multimeric structure. Differential SSCP analysis was performed with the monomeric form of the gene to generate a clear SSCP pattern. Such PCR/SSCP resulted in reproducible patterns throughout our experiments. Comparing different Carabus species (Carabus granulatus, C. irregularis, C. violaceus and C. auronitens), we observed clear interspecies differences but no differences between genders. Some species showed remarkable differences between individuals. We suggest that the ubiquitin PCR/SSCP technique might be an additional tool for the differentiation of ground beetles.

  7. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    Science.gov (United States)

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-11-04

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS versions is translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database to obtain a list of active or in-preparation plans to be tested; it then runs in batch mode to compare all the plans, and a report of success or failure of the comparison is saved for review. This software tool was used as part of the software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that relied on repeating pretreatment QA measurements or labor-intensive and fallible hand comparisons.
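
    A toy version of the underlying idea, under invented assumptions (the element names and flattening scheme are made up; the paper's schema mapping is far richer): render each plan as XML, flatten it to path/value pairs, and report any mismatching parameter.

      import xml.etree.ElementTree as ET

      def flatten(xml_text):
          """Flatten an XML document into {path: text-value} pairs."""
          root = ET.fromstring(xml_text)
          out = {}
          def walk(el, path):
              for child in el:
                  walk(child, f"{path}/{child.tag}")
              if el.text and el.text.strip():
                  out[path] = el.text.strip()
          walk(root, root.tag)
          return out

      old = flatten("<plan><beam><mu>120.5</mu></beam><dose>2.0</dose></plan>")
      new = flatten("<plan><beam><mu>120.5</mu></beam><dose>2.1</dose></plan>")
      for key in sorted(old.keys() | new.keys()):
          if old.get(key) != new.get(key):
              print(f"MISMATCH {key}: {old.get(key)} -> {new.get(key)}")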

  8. A Global Multi-Objective Optimization Tool for Design of Mechatronic Components using Generalized Differential Evolution

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Nørgård, Christian; Roemer, Daniel Beck

    2016-01-01

    This paper illustrates how the relatively simple constrained multi-objective optimization algorithm Generalized Differential Evolution 3 (GDE3) can assist with the practical sizing of mechatronic components used in e.g. digital displacement fluid power machinery. The studied bi- and tri-objective problems are solved under different optimization control parameter settings, and it is concluded that GDE3 is a reliable optimization tool that can assist mechatronic engineers in the design and decision making process.
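
    For orientation, the sketch below shows only the classic single-objective DE/rand/1/bin core that GDE3 generalizes with constraint handling and non-dominated sorting for multiple objectives; the objective function and all parameter values are illustrative assumptions.

      import numpy as np

      def de_step(pop, f, F=0.8, CR=0.9, rng=np.random.default_rng(0)):
          # shared default rng keeps this sketch deterministic across calls
          n, d = pop.shape
          new = pop.copy()
          for i in range(n):
              a, b, c = pop[rng.choice([j for j in range(n) if j != i], 3, replace=False)]
              mutant = a + F * (b - c)                 # differential mutation
              cross = rng.random(d) < CR               # binomial crossover mask
              cross[rng.integers(d)] = True            # ensure at least one gene crosses
              trial = np.where(cross, mutant, pop[i])
              if f(trial) <= f(pop[i]):                # greedy one-to-one selection
                  new[i] = trial
          return new

      f = lambda x: np.sum(x**2)                       # toy objective
      pop = np.random.default_rng(1).uniform(-5, 5, (20, 3))
      for _ in range(100):
          pop = de_step(pop, f)
      print(min(f(x) for x in pop))                    # approaches 0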

  9. Automatic WSDL-guided Test Case Generation for PropEr Testing of Web Services

    Directory of Open Access Journals (Sweden)

    Konstantinos Sagonas

    2012-10-01

    With web services already being key ingredients of modern web systems, automatic and easy-to-use but at the same time powerful and expressive testing frameworks for web services are increasingly important. Our work aims at fully automatic testing of web services: ideally the user only specifies properties that the web service is expected to satisfy, in the form of input-output relations, and the system handles all the rest. In this paper we present in detail the component which lies at the heart of this system: how the WSDL specification of a web service is used to automatically create test case generators that can be fed to PropEr, a property-based testing tool, to create structurally valid random test cases for its operations and check its responses. Although the process is fully automatic, our tool optionally allows the user to easily modify its output to either add semantic information to the generators or write properties that test for more involved functionality of the web services.

  10. Fuzzy-Neural Automatic Daylight Control System

    Directory of Open Access Journals (Sweden)

    Grif H. Şt.

    2011-12-01

    The paper presents the design and the tuning of a CMAC (Cerebellar Model Articulation Controller) implemented in an automatic daylight control application. After the tuning process of the controller, the authors studied the behavior of the automatic lighting control system (ALCS) in the presence of luminance disturbances. The luminance disturbances were produced by the authors in both night and day conditions. During the night conditions, the luminance disturbances were produced by turning a halogen desk lamp on and off. During the day conditions the luminance disturbances were produced in two ways: by changes in the daylight contribution, achieved by covering and uncovering a part of the office window, and by turning the halogen desk lamp on and off. During the day conditions the luminance disturbances produced by the halogen lamp have a smaller amplitude than those produced during the night conditions. The luminance disturbance during the night conditions was a helpful tool for selecting proper values of the learning rate for the CMAC controller, and the luminance disturbances during the day conditions were a helpful tool for demonstrating the right setting of the CMAC controller.

  11. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  12. Automatic web site authoring with SiteGuide

    NARCIS (Netherlands)

    de Boer, V.; Hollink, V.; van Someren, M.W.; Kłopotek, M.A.; Przepiórkowski, A.; Wierzchoń, S.T.; Trojanowski, K.

    2009-01-01

    An important step in the design process for a web site is to determine which information is to be included and how the information should be organized on the web site’s pages. In this paper we describe ’SiteGuide’, a tool that automatically produces an information architecture for a web site that a

  13. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis. The use of the Matlab®-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, analyze single and multiple fault scenarios, and automatically generate parity relations for diagnosis for the system in normal and impaired conditions. User interface and algorithmic details are presented.

  14. flowAI: automatic and interactive anomaly discerning tools for flow cytometry data.

    Science.gov (United States)

    Monaco, Gianni; Chen, Hao; Poidinger, Michael; Chen, Jinmiao; de Magalhães, João Pedro; Larbi, Anis

    2016-08-15

    Flow cytometry (FCM) is widely used in both clinical and basic research to characterize cell phenotypes and functions. The latest FCM instruments analyze up to 20 markers of individual cells, producing high-dimensional data. This requires the use of the latest clustering and dimensionality reduction techniques to automatically segregate cell sub-populations in an unbiased manner. However, automated analyses may lead to false discoveries due to inter-sample differences in quality and properties. We present an R package, flowAI, containing two methods to clean FCM files of unwanted events: (i) an automatic method that adopts algorithms for the detection of anomalies and (ii) an interactive method with a graphical user interface implemented as an R Shiny application. The general approach behind the two methods consists of three key steps to check for and remove suspected anomalies that derive from (i) abrupt changes in the flow rate, (ii) instability of signal acquisition and (iii) outliers in the lower limit and margin events in the upper limit of the dynamic range. For each file analyzed our software generates a summary of the quality assessment from the aforementioned steps. The software presented is an intuitive solution seeking to improve the results not only of manual but also, and in particular, of automatic analysis of FCM data. R source code is available through Bioconductor: http://bioconductor.org/packages/flowAI/. Contacts: mongianni1@gmail.com or Anis_Larbi@immunol.a-star.edu.sg. Supplementary data are available at Bioinformatics online.
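
    A hedged sketch of the first of those three steps, assuming a simple rolling-median rule (the bin size, window and threshold are invented; flowAI's published algorithm differs in detail): events falling in time bins whose event rate deviates strongly from the local median are flagged.

      import numpy as np
      import pandas as pd

      def flag_flow_rate(event_times, bin_size=0.1, k=3.0):
          bins = (event_times // bin_size).astype(int)        # time bin of each event
          rate = pd.Series(bins).value_counts().sort_index()  # events per bin ~ flow rate
          med = rate.rolling(11, center=True, min_periods=1).median()
          mad = (rate - med).abs().rolling(11, center=True, min_periods=1).median()
          bad = rate[(rate - med).abs() > k * (mad + 1)].index  # anomalous bins
          return np.isin(bins, bad)                           # True = suspect event

      rng = np.random.default_rng(0)
      t = np.sort(np.concatenate([rng.uniform(0, 60, 50000),
                                  np.full(3000, 30.05)]))     # simulated clog/burst
      print(flag_flow_rate(t).sum(), "events flagged")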

  15. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery to replace the natural lens with an artificial intraocular lens (IOL), which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure higher patient satisfaction. Thus, studies on the visual behavior of these patients may be an important tool for determining the best type of IOL to implant. This study proposed an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient's visual routine in order to obtain additional information about the frequency of distant, intermediate, and near sights. Results: The results indicated an estimated frequency percentage, suggesting that visual analysis of the routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing the visual management strategy after cataract surgery, simultaneously stimulating interest in customized IOL manufacturing according to individual needs.

  16. Automatic generation of anatomic characteristics from cerebral aneurysm surface models.

    Science.gov (United States)

    Neugebauer, M; Lawonn, K; Beuing, O; Preim, B

    2013-03-01

    Computer-aided research on cerebral aneurysms often depends on a polygonal mesh representation of the vessel lumen. To support a differentiated, anatomy-aware analysis, it is necessary to derive anatomic descriptors from the surface model. We present an approach for automatic decomposition of the adjacent vessels into near- and far-vessel regions and for computation of the axial plane. We also present two example applications of the geometric descriptors: automatic computation of a unique vessel order and automatic viewpoint selection. Approximation methods are employed to analyze vessel cross-sections and the vessel area profile along the centerline. The resulting transition zones between near- and far-vessel regions are used as input for an optimization process to compute the axial plane. The unique vessel order is defined via projection into the plane space of the axial plane. The viewing direction for the automatic viewpoint selection is derived from the normal vector of the axial plane. The approach was successfully applied to representative data sets exhibiting a broad variability with respect to the configuration of their adjacent vessels. A robustness analysis showed that the automatic decomposition is stable against noise. A survey with 4 medical experts showed broad agreement with the automatically defined transition zones. Due to the general nature of the underlying algorithms, this approach is applicable to most of the likely aneurysm configurations in the cerebral vasculature. Additional geometric information obtained during automatic decomposition can support correction in case the automatic approach fails. The resulting descriptors can be used for various applications in the field of visualization, exploration and analysis of cerebral aneurysms.

  17. Spaceborne Differential SAR Interferometry: Data Analysis Tools for Deformation Measurement

    Directory of Open Access Journals (Sweden)

    Michele Crosetto

    2011-02-01

    This paper is focused on spaceborne Differential Interferometric SAR (DInSAR) for land deformation measurement and monitoring. In the last two decades several DInSAR data analysis procedures have been proposed. The objective of this paper is to describe the DInSAR data processing and analysis tools developed at the Institute of Geomatics in almost ten years of research activities. Four main DInSAR analysis procedures are described, which range from the standard DInSAR analysis based on a single interferogram to more advanced Persistent Scatterer Interferometry (PSI) approaches. These different procedures guarantee sufficient flexibility in DInSAR data processing. In order to provide a technical insight into these analysis procedures, a whole section discusses their main data processing and analysis steps, especially those needed in PSI analyses. A specific section is devoted to the core of our PSI analysis tools: the so-called 2+1D phase unwrapping procedure, which couples a 2D phase unwrapping, performed interferogram-wise, with a kind of 1D phase unwrapping along time, performed pixel-wise. In the last part of the paper, some examples of DInSAR results are discussed, which were derived by standard DInSAR or PSI analyses. Most of these results were derived from X-band SAR data coming from the TerraSAR-X and COSMO-SkyMed sensors.
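
    The temporal half of the 2+1D idea can be illustrated in a few lines (a toy stand-in with random data, not the authors' procedure): once each interferogram has been spatially unwrapped, the phase series of every pixel is unwrapped independently along the time axis.

      import numpy as np

      # stack of spatially unwrapped phases: (time, rows, cols), in radians
      stack = np.random.default_rng(0).uniform(-np.pi, np.pi, (30, 4, 4))
      stack_t_unwrapped = np.unwrap(stack, axis=0)   # pixel-wise, along time
      print(stack_t_unwrapped.shape)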

  18. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  19. Design of tool monitor simulator

    International Nuclear Information System (INIS)

    Yao Yonggang; Deng Changming; Zhang Jia; Meng Dan; Zhang Lu; Wang Zhi'ai; Shen Yang

    2011-01-01

    Taking the tool monitor used in the Qinshan Nuclear Power Plant as the object of study, a tool monitor simulator was manufactured. The device is designed to automatically emulate monitoring of the contamination level of objects, for training students: once the tool monitor reports contamination, the students can handle it properly. A brief introduction to the main functions and system design of the simulator is presented in the paper. (authors)

  20. Teaching with technology: automatically receiving information from the internet and web.

    Science.gov (United States)

    Wink, Diane M

    2010-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools, social networking and social bookmarking sites, virtual worlds, and Web-based teaching and learning programs. This article presents information and tools related to automatically receiving information from the Internet and Web.

  1. AuTom: a novel automatic platform for electron tomography reconstruction

    KAUST Repository

    Han, Renmin

    2017-07-26

    We have developed a software package for automatic electron tomography (ET): Automatic Tomography (AuTom). The presented package has the following characteristics: accurate alignment modules for marker-free datasets containing substantial biological structures; fully automatic alignment modules for datasets with fiducial markers; wide coverage of reconstruction methods, including a new iterative method based on compressed-sensing theory that suppresses the "missing wedge" effect; and multi-platform acceleration solutions that support faster iterative algebraic reconstruction. AuTom aims to achieve fully automatic alignment and reconstruction for electron tomography and has already been successful on a variety of datasets. AuTom also offers a user-friendly interface and auxiliary designs for file management and workflow management, in which fiducial marker-based datasets and marker-free datasets are addressed with totally different subprocesses. With all of these features, AuTom can serve as a convenient and effective tool for processing in electron tomography.

  2. Solving ordinary differential equations by electrical analogy: a multidisciplinary teaching tool

    Science.gov (United States)

    Sanchez Perez, J. F.; Conesa, M.; Alhama, I.

    2016-11-01

    Ordinary differential equations are the mathematical formulation for a great variety of problems in science and engineering, and frequently two different problems are equivalent from a mathematical point of view when they are formulated by the same equations. Students acquire the knowledge of how to solve these equations (at least some types of them) using protocols and strict algorithms of mathematical calculation, without thinking about the meaning of the equation. The aim of this work is for students to learn to design network models or circuits in this way: with simple knowledge of these, students can establish the association between electric circuits and differential equations and their equivalences, from a formal point of view, which allows them to connect the knowledge of two disciplines and to use this interdisciplinary approach to address complex problems. They thereby learn to use a multidisciplinary tool that allows them to solve these kinds of equations, even as first-year engineering students, whatever the order, degree or type of non-linearity. This methodology has been implemented in numerous final degree projects in engineering and science, e.g., chemical engineering, building engineering, industrial engineering, mechanical engineering, architecture, etc. Applications are presented to illustrate the subject of this manuscript.
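
    A minimal example of the analogy, with assumed component values: a series RC circuit driven by a voltage step obeys RC dv/dt + v = V_in, which has the same form as the generic first-order ODE y' = (u - y)/tau, so a single numerical solver serves both readings of the problem.

      import numpy as np
      from scipy.integrate import solve_ivp

      R, C, V_in = 1e3, 1e-6, 5.0           # 1 kOhm, 1 uF, 5 V step (assumed)
      tau = R * C                           # circuit time constant = ODE tau
      sol = solve_ivp(lambda t, v: (V_in - v) / tau, [0, 5 * tau], [0.0],
                      dense_output=True)
      t = np.linspace(0, 5 * tau, 5)
      print(sol.sol(t)[0])                  # capacitor voltage rises toward V_in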

  3. Automatic tools for enhancing the collaborative experience in large projects

    International Nuclear Information System (INIS)

    Bourilkov, D; Rodriquez, J L

    2014-01-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of the data analysis gains in importance. A key feature of collaboration in large scale projects is keeping a log of what is being done and how - for private use, reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Even better if the log is automatically created on the fly while the scientist or software developer is working in a habitual way, without the need for extra efforts. This saves time and enables a team to do more with the same resources. The CODESH - COllaborative DEvelopment SHell - and CAVES - Collaborative Analysis Versioning Environment System projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach for any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for analysis of Petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.

  4. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has risen for tools that can aid in the design of these. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been

  5. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle to their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type

  6. Automatic detection of coronary arterial branches from X-ray angiograms

    International Nuclear Information System (INIS)

    Lu, Shan; Eiho, Shigeru

    1992-01-01

    This paper describes a method to trace coronary arterial boundaries automatically from x-ray angiograms. We developed an automatic procedure to detect the edges of an artery together with its branches. Each candidate edge point is evaluated by a function based on a smoothing differential operator applied along a search line, which is obtained by using the continuity properties of the arterial edges. The boundary points along the artery are thus detected automatically. If a branch exists on the boundary, it can be detected automatically; the information about the branch is stored on the search-information stack and later used to detect the branch artery. In our edge detection process, the only user interaction required is the manual definition of a starting point, a direction, and a range for the search. We tested this method on computer-generated images with different stenoses and on a coronary angiogram. The results show that this method is useful for analyzing coronary angiograms. (author)
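
    A toy version of the edge-point evaluation (the gray-level profile, the kernel and the simple argmax rule are illustrative assumptions, not the paper's exact function): a smoothed derivative kernel is applied along a profile crossing the vessel edge, and the strongest response marks the boundary.

      import numpy as np

      rng = np.random.default_rng(0)
      # gray levels along a search line crossing the vessel boundary at index 20
      profile = np.concatenate([np.full(20, 200.0), np.full(20, 60.0)]) \
                + rng.normal(0, 3, 40)
      kernel = np.array([1, 2, 0, -2, -1]) / 8.0        # smoothing differential operator
      response = np.convolve(profile, kernel, mode="valid")
      edge_index = int(np.argmax(np.abs(response))) + 2  # re-center for the kernel width
      print(edge_index)                                  # near 20, the true boundary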

  7. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Routine use of quantitative three-dimensional analysis of material microstructure by, in particular, focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice, or by the quality of manual and automatic segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to capture the phase boundaries. Vector fields derived from

  8. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    Science.gov (United States)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when critical areas interact with a particle hit, designers need a software tool that allows an automatic and exhaustive analysis of the influence of Single-Event Effects. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  9. Differential Laser Doppler based Non-Contact Sensor for Dimensional Inspection with Error Propagation Evaluation

    Directory of Open Access Journals (Sweden)

    Ketsaya Vacharanukul

    2006-06-01

    To achieve dynamic error compensation in CNC machine tools, a non-contact laser probe capable of dimensional measurement of a workpiece while it is being machined has been developed and is presented in this paper. The measurements are automatically fed back to the machine controller for intelligent error compensation. Based on a well-resolved laser Doppler technique and real-time data acquisition, the probe delivers a very promising dimensional accuracy of a few microns over a range of 100 mm. The developed optical measuring apparatus employs a differential laser Doppler arrangement allowing acquisition of information from the workpiece surface. In addition, the measurements are traceable to standards of frequency, allowing higher precision.

  10. LOOP- SIMULATION OF THE AUTOMATIC FREQUENCY CONTROL SUBSYSTEM OF A DIFFERENTIAL MINIMUM SHIFT KEYING RECEIVER

    Science.gov (United States)

    Davarian, F.

    1994-01-01

    The LOOP computer program was written to simulate the Automatic Frequency Control (AFC) subsystem of a Differential Minimum Shift Keying (DMSK) receiver with a bit rate of 2400 baud. The AFC simulated by LOOP is a first order loop configuration with a first order R-C filter. NASA has been investigating the concept of mobile communications based on low-cost, low-power terminals linked via geostationary satellites. Studies have indicated that low bit rate transmission is suitable for this application, particularly from the frequency and power conservation point of view. A bit rate of 2400 BPS is attractive due to its applicability to the linear predictive coding of speech. Input to LOOP includes the following: 1) the initial frequency error; 2) the double-sided loop noise bandwidth; 3) the filter time constants; 4) the amount of intersymbol interference; and 5) the bit energy to noise spectral density. LOOP output includes: 1) the bit number and the frequency error of that bit; 2) the computed mean of the frequency error; and 3) the standard deviation of the frequency error. LOOP is written in MS SuperSoft FORTRAN 77 for interactive execution and has been implemented on an IBM PC operating under PC DOS with a memory requirement of approximately 40K of 8 bit bytes. This program was developed in 1986.
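
    A hedged re-creation of the kind of simulation LOOP performs (the gains, noise level and discriminator model below are invented stand-ins, not the original FORTRAN code): a first-order loop with an R-C filter tracks out an initial frequency error, bit by bit, and the mean and standard deviation of the residual error are reported.

      import numpy as np

      rng = np.random.default_rng(0)
      f_err, filt = 200.0, 0.0              # initial frequency error (Hz), filter state
      alpha, gain = 0.1, 0.05               # R-C filter constant, loop gain (assumed)
      history = []
      for bit in range(2400):               # one second of 2400-baud bits
          meas = f_err + rng.normal(0, 20)  # noisy per-bit frequency discriminator
          filt += alpha * (meas - filt)     # first-order R-C loop filter
          f_err -= gain * filt              # loop correction closes the feedback
          history.append(f_err)
      err = np.array(history[1200:])        # discard the acquisition transient
      print(err.mean(), err.std())          # the statistics LOOP reports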

  11. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for the control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  12. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask will later act as a template for a considerable number of dies on a wafer. Defects on the initial blank substrate, and on subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and distinguishing defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment compared to the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask

  13. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    OpenAIRE

    Zhang, Jing; Lipp, Ottmar V.; Hu, Ping

    2017-01-01

    The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study was a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between-subjects design. Participants were assigned to two groups according to their performance on an emotion regulation IAT (differentiating automatic emotion control tend...

  14. Signal Compression in Automatic Ultrasonic testing of Rails

    Directory of Open Access Journals (Sweden)

    Tomasz Ciszewski

    2007-01-01

    Full recording of the most important information carried by the ultrasonic signals allows statistical analysis of the measurement data. Statistical analysis of the results gathered during automatic ultrasonic tests, combined with features of the measuring method, differential lossy coding, and traditional lossless data compression methods (Huffman coding, dictionary coding), leads to a comprehensive, efficient data compression algorithm. The subject of this article is to present the algorithm and the benefits gained by using it in comparison with alternative compression methods. Storage of large amounts of data also allows an electronic catalogue of ultrasonic defects to be created; once it exists, training of the future qualification system on new solutions of the automat for rail testing will be possible.
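
    A compact sketch of that chain under invented data (code-length bookkeeping only, no actual bitstream): delta coding concentrates the sample distribution around small values, which Huffman coding then exploits.

      import heapq
      from collections import Counter

      def huffman_lengths(symbols):
          """Return Huffman code length per symbol (two or more symbols assumed)."""
          heap = [(n, i, (s,)) for i, (s, n) in enumerate(Counter(symbols).items())]
          heapq.heapify(heap)
          lengths = {s: 0 for s in set(symbols)}
          while len(heap) > 1:
              n1, _, g1 = heapq.heappop(heap)     # merge the two rarest groups;
              n2, i, g2 = heapq.heappop(heap)     # every member gains one code bit
              for s in g1 + g2:
                  lengths[s] += 1
              heapq.heappush(heap, (n1 + n2, i, g1 + g2))
          return lengths

      samples = [100, 101, 101, 103, 102, 102, 101, 100] * 100   # synthetic 8-bit stream
      deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
      lengths = huffman_lengths(deltas)
      bits = sum(lengths[d] for d in deltas)
      print(f"{bits} bits vs {8 * len(samples)} bits uncompressed")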

  15. Validation of the ICU-DaMa tool for automatically extracting variables for minimum dataset and quality indicators: The importance of data quality assessment.

    Science.gov (United States)

    Sirgo, Gonzalo; Esteban, Federico; Gómez, Josep; Moreno, Gerard; Rodríguez, Alejandro; Blanch, Lluis; Guardiola, Juan José; Gracia, Rafael; De Haro, Lluis; Bodí, María

    2018-04-01

    Big data analytics promise insights into healthcare processes and management, improving outcomes while reducing costs. However, data quality is a major challenge for reliable results. Business process discovery techniques and an associated data model were used to develop a data management tool, ICU-DaMa, for extracting variables essential for overseeing the quality of care in the intensive care unit (ICU). To determine the feasibility of using ICU-DaMa to automatically extract variables for the minimum dataset and ICU quality indicators from the clinical information system (CIS), the Wilcoxon signed-rank test and Fisher's exact test were used to compare the values extracted from the CIS with ICU-DaMa for 25 variables from all patients admitted to a polyvalent ICU during a two-month period against the gold standard of values manually extracted by two trained physicians. Discrepancies with the gold standard were classified into plausibility, conformance, and completeness errors. Data from 149 patients were included. Although there were no significant differences between the automatic method and the manual method, we detected differences in values for five variables: one plausibility error, two conformance errors, and two completeness errors. Plausibility: 1) Sex: ICU-DaMa incorrectly classified one male patient as female (an error generated by the hospital's Admissions Department). Conformance: 2) Reason for isolation: ICU-DaMa failed to detect a human error in which a professional misclassified a patient's isolation. 3) Brain death: ICU-DaMa failed to detect another human error in which a professional likely entered two mutually exclusive values related to the death of the patient (brain death and controlled donation after circulatory death). Completeness: 4) Destination at ICU discharge: ICU-DaMa incorrectly classified two patients because a professional failed to fill out the patient discharge form when the patients died. 5) Length of continuous renal replacement therapy.

  16. Automatic digitization. Experience of magnum 8000 in automatic digitization in EA; Digitalizacion automatica. Experiencias obtenidas durante la utilizacion del sistema magnus 8000 para la digitalizacion automatica en EA

    Energy Technology Data Exchange (ETDEWEB)

    Munoz Garcia, M.

    1995-12-31

    The paper describes the life cycle to be followed for the automatic digitization of files containing rasterised (scanned) images for their conversion into vector files (processable using CAD tools). The main characteristics of each of the five phases that make up the life cycle (capture, cleaning, conversion, revision and post-processing) are described. Lastly, the paper gives a comparative analysis of the results obtained using the automatic digitization process and other more conventional methods. (Author)

  17. Differences between automatically detected and steady-state fractional flow reserve.

    Science.gov (United States)

    Härle, Tobias; Meyer, Sven; Vahldiek, Felix; Elsässer, Albrecht

    2016-02-01

    Measurement of fractional flow reserve (FFR) has become a standard diagnostic tool in the catheterization laboratory. FFR evaluation studies were based on pressure recordings during steady-state maximum hyperemia. Commercially available computer systems detect the lowest Pd/Pa ratio automatically, which might not always be measured during steady-state hyperemia. We sought to compare automatically detected FFR and true steady-state FFR. Pressure measurement traces of 105 coronary lesions from 77 patients with intermediate coronary lesions or multivessel disease were reviewed. In all patients, hyperemia had been achieved by intravenous adenosine administration at a dosage of 140 µg/kg/min. In 42 lesions (40%), automatically detected FFR was lower than true steady-state FFR. Mean bias was 0.009 (standard deviation 0.015, limits of agreement -0.02, 0.037). In 4 lesions (3.8%), the two methods led to different treatment recommendations; in all 4 cases the instantaneous wave-free ratio confirmed steady-state FFR. Automatically detected FFR was slightly lower than steady-state FFR in more than one-third of cases. Consequently, interpretation of automatically detected FFR values closely below the cutoff value requires special attention.
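    The distinction the study draws can be reproduced on a synthetic pressure trace: the console-style automatic value is the minimum instantaneous Pd/Pa anywhere in the recording, while the steady-state value averages the ratio over an operator-chosen stable hyperemic window. The trace and window below are invented for illustration.

```python
# Synthetic comparison of automatically detected FFR (global minimum Pd/Pa)
# versus steady-state FFR (mean Pd/Pa over a stable hyperemic window).

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)                      # 60 s trace, 10 samples/s
pa = 90 + rng.normal(0, 1.5, t.size)             # aortic pressure (mmHg)
pd = pa * (0.84 + 0.03 * np.exp(-t / 10)) + rng.normal(0, 1.5, t.size)  # distal

ratio = pd / pa
ffr_auto = ratio.min()                           # what the console reports
steady = (t > 40) & (t < 55)                     # operator-chosen stable window
ffr_steady = ratio[steady].mean()

print(f"automatic FFR = {ffr_auto:.3f}, steady-state FFR = {ffr_steady:.3f}")
# The automatic value is <= the steady-state value by construction, which
# matches the small negative bias reported above.
```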

  18. Nouns referring to tools and natural objects differentially modulate the motor system.

    Science.gov (United States)

    Gough, Patricia M; Riggio, Lucia; Chersi, Fabian; Sato, Marc; Fogassi, Leonardo; Buccino, Giovanni

    2012-01-01

    While increasing evidence points to a critical role for the motor system in language processing, the focus of previous work has been on the linguistic category of verbs. Here we tested whether nouns are effective in modulating the motor system, and further whether different kinds of nouns (those referring to artifacts or natural items, and items that are graspable or ungraspable) would differentially modulate the system. A Transcranial Magnetic Stimulation (TMS) study was carried out to compare modulation of the motor system when subjects read nouns referring to objects which are Artificial or Natural and which are Graspable or Ungraspable. TMS was applied to the primary motor cortex representation of the first dorsal interosseous (FDI) muscle of the right hand at 150 ms after noun presentation. Analyses of Motor Evoked Potentials (MEPs) revealed that across the duration of the task, nouns referring to graspable artifacts (tools) were associated with significantly greater MEP areas. Analyses of the initial presentation of items revealed a main effect of graspability. The findings are in line with an embodied view of nouns, with MEP measures modulated according to whether nouns referred to natural objects or artifacts (tools), confirming tools as a special class of items in motor terms. Additionally, our data support a difference for graspable versus non-graspable objects, an effect which for natural objects is restricted to the initial presentation of items. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  20. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    Directory of Open Access Journals (Sweden)

    Baraka D. Sija

    2018-01-01

    A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) is the process of extracting the structure of a network protocol without access to its specifications. Sufficient knowledge of undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools for Protocol Reverse Engineering (PRE) and classifies them into four divisions: approaches that reverse engineer protocol finite state machines, approaches that reverse engineer protocol formats, approaches that reverse engineer both, and approaches that directly target neither. The efficiency of each approach's outputs is analyzed with respect to its selected inputs, along with the appropriate input formats for reverse engineering. Additionally, we present a discussion and an extended classification in terms of automated versus manual approaches, known versus novel categories of reverse-engineered protocols, and a survey of reverse-engineered protocols in relation to the seven-layer OSI (Open Systems Interconnection) model.

  1. Robust automatic high resolution segmentation of SOFC anode porosity in 3D

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Bowen, Jacob R.

    2008-01-01

    Routine use of 3D characterization of SOFCs by focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice. We apply advanced image analysis algorithms to automatically segment the porosity phase of an SOFC anode in 3D. The technique is based on numerical approximations to partial differential equations that evolve a 3D surface to the desired phase boundary. Vector fields derived from the experimentally acquired data are used as the driving force. The automatic segmentation is compared to manual delineation, revealing good correspondence, and the two approaches are quantitatively compared. It is concluded that the automatic approach is more robust, more reproducible and orders of magnitude quicker than manual segmentation of SOFC anode porosity for subsequent quantitative 3D analysis. Lastly...

  2. Foodomics: A new tool to differentiate between organic and conventional foods.

    Science.gov (United States)

    Vallverdú-Queralt, Anna; Lamuela-Raventós, Rosa Maria

    2016-07-01

    The demand for organic food is increasing annually due to the growing consumer trend for more natural products that have simpler ingredient lists, involve less processing and are grown free of pesticides. However, there is still not enough nutritional evidence in favor of organic food consumption. Classical chemical analysis of macro- and micronutrients has demonstrated that organic crops are poorer in nitrogen, but clear evidence for other nutrients is lacking. Omics technologies forming part of the new discipline of foodomics have allowed the detection of possible nutritional differences between organic and conventional production, although many results remain controversial and contradictory. The main focus of this review is to provide an overview of the studies that use foodomics techniques as a tool to differentiate between organic and conventional production. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    International Nuclear Information System (INIS)

    Zaffino, Paolo; Spadea, Maria Francesca; Raudaschl, Patrik; Fritscher, Karl; Sharp, Gregory C.

    2016-01-01

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, software without restrictions on anatomical district or image modality has been missing. In this paper we introduce PLASTIMATCH MABS, an open source software that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch using the same parameters identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy, and magnetic resonance image brain labeling for neuroscience investigation. Results: For the neurological study, the minimum Dice coefficient was 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, the minimum Dice coefficient was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). The time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both commercial and research) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparison against other methods.
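    The online labeling phase ends with label fusion across the registered atlases. The sketch below shows only a plain per-voxel majority vote on toy label maps; PLASTIMATCH MABS offers several voting criteria, and the registration step that produces these maps is omitted.

```python
# Minimal sketch of the voting stage of multi-atlas segmentation: each
# atlas's propagated label map casts one vote per voxel, majority wins.
# The arrays are toy stand-ins for real registered label volumes.

import numpy as np

def majority_vote(label_maps: list[np.ndarray]) -> np.ndarray:
    """Fuse registered atlas label maps by per-voxel majority voting."""
    stack = np.stack(label_maps)                       # (n_atlases, *volume)
    n_labels = stack.max() + 1
    # Count votes per label value with one one-hot sum per label.
    votes = np.stack([(stack == l).sum(axis=0) for l in range(n_labels)])
    return votes.argmax(axis=0)                        # winning label per voxel

atlas_labels = [np.array([[0, 1], [1, 2]]),
                np.array([[0, 1], [2, 2]]),
                np.array([[0, 0], [1, 2]])]
print(majority_vote(atlas_labels))   # [[0 1] [1 2]]
```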

  4. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Zaffino, Paolo; Spadea, Maria Francesca [Department of Experimental and Clinical Medicine, Magna Graecia University of Catanzaro, Catanzaro 88100 (Italy); Raudaschl, Patrik; Fritscher, Karl [Institute for Biomedical Image Analysis, Private University of Health Sciences, Medical Informatics and Technology, Hall in Tirol 6060 (Austria); Sharp, Gregory C. [Department for Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States)

    2016-09-15

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, software without restrictions on anatomical district or image modality has been missing. In this paper we introduce PLASTIMATCH MABS, an open source software that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch using the same parameters identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy, and magnetic resonance image brain labeling for neuroscience investigation. Results: For the neurological study, the minimum Dice coefficient was 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, the minimum Dice coefficient was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). The time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both commercial and research) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparison against other methods.

  5. Automatic Planning of External Search Engine Optimization

    Directory of Open Access Journals (Sweden)

    Vita Jasevičiūtė

    2015-07-01

    This paper describes an investigation of an external search engine optimization (SEO) action planning tool, dedicated to automatically extracting a small set of the most important keywords for each month over a whole-year period. The keywords in the set are extracted according to externally measured parameters, such as the average number of searches over the year and for each month individually. Additionally, the position of the optimized web site for each keyword is taken into account. The generated optimization plan is similar to optimization plans prepared manually by SEO professionals and can be successfully used as a support tool for web site search engine optimization.

  6. Development of tools for automatic generation of PLC code

    OpenAIRE

    Koutli, Maria; Chasapis, Georgios; Rochez, Jacques

    2014-01-01

    This Master's thesis was performed at CERN, specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms that are based on the CODESYS development tool into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS compatible P...

  7. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, with operational automatic control of homogeneity performed by the control system of the kneading-and-mixing machinery. Theoretical underpinnings of homogeneity control are presented, which relate homogeneity to changes in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during continuous mixing of the components. The following technical means were chosen for the automatic control system: vibro-acoustic sensors, remote terminal units, and electropneumatic control actuators, among others. To characterize the quality of automatic control, a structure flowchart with transfer functions that determine the operation of the ACS in transient dynamic mode is provided.

  8. Usage of aids monitoring in automatic braking systems of modern cars

    Directory of Open Access Journals (Sweden)

    Dembitskyi V.

    2016-08-01

    Safety can be increased by installing automatic braking systems on vehicles to monitor the traffic situation and the actions of the driver. This paper considers the advantages and disadvantages of automatic braking systems and analyzes the modern tracking tools used in them. Based on accident statistics, the main hazards that automatic braking systems can reduce are identified. To ensure the accuracy of information, research was conducted to determine the optimal combination of different sensors that provides an adequate perception of road conditions. The tracking system should be equipped with a combination of sensors such that, when an obstacle or hazard is detected, a signal is transmitted to the information processing and decision-making system. Information from the monitoring system should include data for identifying the object, its condition, and its speed.

  9. Automatic Affective Appraisal of Sexual Penetration Stimuli in Women with Vaginismus or Dyspareunia

    NARCIS (Netherlands)

    Huijding, Jorg; Borg, Charmaine; Weijmar-Schultz, Willibrord; de Jong, Peter J.

    Introduction. Current psychological views are that negative appraisals of sexual stimuli lie at the core of sexual dysfunctions. It is important to differentiate between deliberate appraisals and more automatic appraisals, as research has shown that the former are most relevant to controllable

  10. Individuals with fear of blushing explicitly and automatically associate blushing with social costs

    NARCIS (Netherlands)

    Glashouwer, K.A.; de Jong, P.J.; Dijk, C.; Buwalda, F.M.

    2011-01-01

    To explain fear of blushing, it has been proposed that individuals with fear of blushing overestimate the social costs of their blushing. Current information-processing models emphasize the relevance of differentiating between more automatic and more explicit cognitions, as both types of cognitions

  11. Individuals with Fear of Blushing Explicitly and Automatically Associate Blushing with Social Costs

    NARCIS (Netherlands)

    Glashouwer, Klaske A.; de Jong, Peter J.; Dijk, Corine; Buwalda, Femke M.

    2011-01-01

    To explain fear of blushing, it has been proposed that individuals with fear of blushing overestimate the social costs of their blushing. Current information-processing models emphasize the relevance of differentiating between more automatic and more explicit cognitions, as both types of cognitions

  12. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne, Carole; Rabatel, G.; Deshayes, M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with the Fourier transform and Gabor filters) has been developed to meet this need. It determines vine plot boundaries and accurately estimates interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...

  13. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  14. Automatic Evaluation for E-Learning Using Latent Semantic Analysis: A Use Case

    Directory of Open Access Journals (Sweden)

    Mireia Farrús

    2013-03-01

    Assessment in education allows for obtaining, organizing, and presenting information about how much and how well the student is learning. The current paper analyses and discusses some of the most state-of-the-art assessment systems in education. This work then presents a specific use case developed for the Universitat Oberta de Catalunya, an online university. An automatic evaluation tool is proposed that allows students to evaluate themselves at any time and receive instant feedback. This tool is a web-based platform designed for engineering subjects (i.e., with math symbols and formulas) in Catalan and Spanish. The technique used for automatic assessment is latent semantic analysis. Although the experimental framework of the use case is quite challenging, the results are promising.

  15. Automatic tracking of the constancy of the imaging chain of radiographic equipment using a tool integrating a phantom and evaluation software; Seguimiento automatico de la constancia de la cadena de imagen de Equipos Radiograficos mediante herramienta integrada por maniqui y software de evaluacion

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, P.; Rodenas, F.; Marin, B.; Alcaraz, D.; Verdu, G.

    2011-07-01

    This paper presents an innovative tool, at the national level, for the automatic analysis of the constancy of the imaging chain of digital radiographic equipment, both computed radiography (CR) and direct digital radiography (DR).

  16. An automatic virtual patient reconstruction from CT-scans for hepatic surgical planning.

    Science.gov (United States)

    Soler, L; Delingette, H; Malandain, G; Ayache, N; Koehl, C; Clément, J M; Dourthe, O; Marescaux, J

    2000-01-01

    PROBLEM/BACKGROUND: In order to support hepatic surgical planning, we developed automatic 3D reconstruction of patients from conventional CT scans, together with interactive visualization and virtual resection tools. From a conventional abdominal CT scan, we have developed several methods allowing the automatic 3D reconstruction of skin, bones, kidneys, lungs, liver, hepatic lesions, and vessels. These methods are based on deformable modeling or thresholding algorithms followed by the application of mathematical morphology operators. From these anatomical and pathological models, we have developed a new framework for translating anatomical knowledge into geometrical and topological constraints. More precisely, our approach automatically delineates the hepatic and portal veins, labels the portal vein, and finally builds an anatomical segmentation of the liver based on the Couinaud definition, which is currently used by surgeons all over the world. Finally, we have developed a user-friendly interface for the 3D visualization of anatomical and pathological structures, the accurate evaluation of volumes and distances, and virtual hepatic resection along a user-defined cutting plane. A validation study on a 30-patient database gives 2 mm precision for liver delineation and less than 1 mm for the delineation of all other anatomical and pathological structures. An in vivo validation performed during surgery also showed that the anatomical segmentation is more precise than delineation performed by a surgeon based on external landmarks. This surgery planning system has been routinely used by our medical partner, and this has resulted in an improvement in the planning and performance of hepatic surgery procedures. We have developed new tools for hepatic surgical planning allowing better surgery through automatic delineation and visualization of anatomical and pathological structures. These tools represent a first step towards the development of augmented reality.
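    A hedged sketch of the thresholding-plus-morphology idea the abstract mentions, applied to a synthetic volume; the intensity window, structuring parameters, and data are illustrative and far simpler than the deformable models used for the actual organs.

```python
# Sketch of a thresholding + mathematical-morphology segmentation pipeline,
# using SciPy on a synthetic CT-like volume. HU window values are invented.

import numpy as np
from scipy import ndimage

def segment_organ(volume_hu, lo=40, hi=140):
    """Threshold a CT volume to a soft-tissue window, then clean the mask
    with morphological operators and keep the largest connected component."""
    mask = (volume_hu >= lo) & (volume_hu <= hi)        # intensity gate
    mask = ndimage.binary_opening(mask, iterations=2)   # remove speckle
    mask = ndimage.binary_closing(mask, iterations=2)   # fill small holes
    labels, n = ndimage.label(mask)                     # connected components
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)             # largest region only

volume = np.random.default_rng(1).normal(0, 20, (40, 64, 64))
volume[10:30, 16:48, 16:48] += 100                      # fake organ at ~100 HU
organ = segment_organ(volume)
print(organ.sum(), "voxels segmented")
```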

  17. The Masculinity of Money: Automatic Stereotypes Predict Gender Differences in Estimated Salaries

    Science.gov (United States)

    Williams, Melissa J.; Paluck, Elizabeth Levy; Spencer-Rodgers, Julie

    2010-01-01

    We present the first empirical investigation of why men are assumed to earn higher salaries than women (the "salary estimation effect"). Although this phenomenon is typically attributed to conscious consideration of the national wage gap (i.e., real inequities in salary), we hypothesize instead that it reflects differential, automatic economic…

  18. Retractable Pin Tools for the Friction Stir Welding Process

    Science.gov (United States)

    1998-01-01

    Two companies have successfully commercialized a specialized welding tool developed at the Marshall Space Flight Center (MSFC). Friction stir welding uses the high rotational speed of a tool and the resulting frictional heat created from contact to crush, 'stir' together, and forge a bond between two metal alloys. It has had a major drawback: reliance on a single-piece pin tool. The pin is slowly plunged into the joint between the two materials to be welded and rotated at high speed. At the end of the weld, the single-piece pin tool is retracted and leaves a 'keyhole,' which is unacceptable when welding cylindrical objects such as drums, pipes and storage tanks. Another drawback is the requirement for different-length pin tools when welding materials of varying thickness. An engineer at the MSFC helped design an automatic retractable pin tool that uses a computer-controlled motor to automatically retract the pin into the shoulder of the tool at the end of the weld, preventing keyholes. This design allows the pin angle and length to be adjusted for changes in material thickness and results in a smooth hole closure at the end of the weld. Benefits of friction stir welding using the MSFC retractable pin tool technology include the following: the ability to weld a wide range of alloys, including previously unweldable and composite materials; twice the fatigue resistance of fusion welds with no keyholes; minimal material distortion; no hazards such as welding fumes, radiation, high voltage, liquid metals, or arcing; automatic retraction of the pin at the end of the weld; and full penetration of the pin throughout the weld.

  19. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    International Nuclear Information System (INIS)

    Qiu, J; Li, H. Harlod; Zhang, T; Yang, D; Ma, F

    2015-01-01

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g., Mosaiq and ARIA, require manual selection of the image processing filters and parameters, making them inefficient and impossible to automate. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is the automatic selection of optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
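    The three-step pipeline can be sketched with standard image libraries. In the sketch below a coarse grid search over the three parameters stands in for the interior-point optimizer (an assumption made to keep the example short), a stock test image stands in for an RT setup image, and background removal is omitted.

```python
# Sketch of the enhancement chain: high-pass filtering by subtracting a
# Gaussian-smoothed copy, then CLAHE, with parameters chosen to maximize
# the entropy of the processed result.

import numpy as np
from skimage import data, exposure, filters, measure

image = data.camera() / 255.0                 # stand-in for a 2D RT setup image

def enhance(img, w, kernel_size, clip_limit):
    hp = np.clip(img - w * filters.gaussian(img, sigma=8), 0, 1)  # high-pass
    return exposure.equalize_adapthist(hp, kernel_size=kernel_size,
                                       clip_limit=clip_limit)     # CLAHE

best = max(
    ((w, k, c) for w in (0.3, 0.5, 0.7)          # Gaussian weighting factor
               for k in (32, 64, 128)            # CLAHE block size
               for c in (0.01, 0.02, 0.04)),     # CLAHE clip limit
    key=lambda p: measure.shannon_entropy(enhance(image, *p)),
)
print("best (weight, block size, clip limit):", best)
```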

  20. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Taishan Medical University, Taian, Shandong (China); Washington University in St Louis, St Louis, MO (United States); Li, H. Harlod; Zhang, T; Yang, D [Washington University in St Louis, St Louis, MO (United States); Ma, F [Taishan Medical University, Taian, Shandong (China)

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g., Mosaiq and ARIA, require manual selection of the image processing filters and parameters, making them inefficient and impossible to automate. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is the automatic selection of optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.

  1. ASA24 enables multiple automatically coded self-administered 24-hour recalls and food records

    Science.gov (United States)

    A freely available web-based tool for epidemiologic, interventional, behavioral, or clinical research from NCI that enables multiple automatically coded self-administered 24-hour recalls and food records.

  2. A bottom-up approach to automatically configured Tango control systems

    International Nuclear Information System (INIS)

    Rubio-Manrique, S.; Beltran, D.; Costa, I.; Fernandez-Carreiras, D.; Gigante, J.V.; Klora, J.; Matilla, O.; Ranz, R.; Ribas, J.; Sanchez, O.

    2012-01-01

    Alba is the first synchrotron light source built in Spain. Most of the Alba control system has been developed on top of the Tango control system. A total of 5,531 devices are controlled in the Alba accelerators (linac, booster and storage ring) using 150 Linux PCs. Alba maintains a central repository, the so-called 'Cabling and Controls database' (CCDB), which keeps the inventory of equipment, cables, connections and their configuration and technical specifications. The valuable information kept in this MySQL database enables tools to automatically create and configure Tango devices and other software components of the control systems of the accelerators, beamlines and laboratories. This paper describes the process involved in this automatic setup.
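    A hedged sketch of what such automatic setup can look like from Python. The CCDB table and column names are hypothetical, as are the connection details; the Tango side (registering a device via Database.add_device with a DbDevInfo) follows the standard PyTango pattern, though the paper does not state that Alba's tools are written this way.

```python
# Sketch: read an inventory database and register one Tango device per row.
# Schema and credentials are invented; the PyTango calls are the usual ones.

import pymysql
from tango import Database, DbDevInfo

conn = pymysql.connect(host="ccdb.example.org", user="reader",
                       password="...", database="ccdb")  # placeholder details
tango_db = Database()

with conn.cursor() as cur:
    # Hypothetical schema: one row per piece of controlled equipment.
    cur.execute("SELECT device_name, device_class, server_instance FROM equipment")
    for name, dev_class, server in cur.fetchall():
        info = DbDevInfo()
        info.name = name            # e.g. "sr/ps/dipole-01"
        info._class = dev_class     # e.g. "PowerSupply"
        info.server = server        # e.g. "PSServer/sr-01"
        tango_db.add_device(info)   # registers the device in the Tango DB
```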

  3. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi

  4. Stiffness and the automatic selection of ODE codes

    International Nuclear Information System (INIS)

    Shampine, L.F.

    1984-01-01

    The author describes the basic ideas behind the most popular methods for the numerical solution of ordinary differential equations (ODEs). He takes up the qualitative behavior of solutions of ODEs and its relation to the propagation of numerical error. Codes for ODEs are intended either for stiff problems or for non-stiff problems; the difference is explained. Users of codes do not have the information needed to recognize stiffness. A code, DEASY, which automatically recognizes stiffness and selects a suitable method, is described.

  5. Automatic evaluation of practices in Moodle for Self Learning in Engineering

    Directory of Open Access Journals (Sweden)

    Carles Sanchez

    2015-06-01

    The first years of engineering degree courses usually involve large groups with a low teacher-student ratio. Overcrowded classrooms hinder the continuous assessment needed to promote independent learning. There is therefore a need for some kind of automatic evaluation to facilitate the correction of exercises outside the classroom. We introduce here a first experience using surveys in Moodle 2.0 to obtain an automatic evaluation of practical exercises in our Database course. We report survey ratings of the autonomous learning tool and preliminary statistics assessing the correlation with improved practice exam marks.

  6. Automatic Migration from PARMACS to MPI in Parallel Fortran Applications

    Directory of Open Access Journals (Sweden)

    Rolf Hempel

    1999-01-01

    The PARMACS message passing interface has been in widespread use by application projects, especially in Europe. With the new MPI standard for message passing, many projects face the problem of replacing PARMACS with MPI. An automatic translation tool has been developed which replaces all PARMACS 6.0 calls in an application program with their corresponding MPI calls. In this paper we describe the mapping of the PARMACS programming model onto MPI. We then present some implementation details of the converter tool.

  7. Solving Linear Differential Equations

    NARCIS (Netherlands)

    Nguyen, K.A.; Put, M. van der

    2010-01-01

    The theme of this paper is to 'solve' an absolutely irreducible differential module explicitly in terms of modules of lower dimension and finite extensions of the differential field K. Representations of semi-simple Lie algebras and differential Galois theory are the main tools. The results extend

  8. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electrical machines, from generators to motors; the motor as a power source of the machine tool; and electrical equipment for machine tools, such as main-circuit switches, automatic devices, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part handles wiring diagrams, covering the basic electrical circuits of machine tools and the wiring diagrams of machines such as milling machines, planers and grinding machines. The third part introduces machine fault diagnosis, giving practical solutions based on the diagnosis and diagnostic methods using voltage and resistance measurements with a tester.

  9. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  10. Exogenous (automatic) attention to emotional stimuli: a review

    OpenAIRE

    Carretié, Luis

    2014-01-01

    Current knowledge on the architecture of exogenous attention (also called automatic, bottom-up, or stimulus-driven attention, among other terms) has been mainly obtained from studies employing neutral, anodyne stimuli. Since, from an evolutionary perspective, exogenous attention can be understood as an adaptive tool for rapidly detecting salient events, reorienting processing resources to them, and enhancing processing mechanisms, emotional events (which are, by definition, salient for the in...

  11. Volume Ray Casting with Peak Finding and Differential Sampling

    KAUST Repository

    Knoll, A.; Hijazi, Y.; Westerteiger, R.; Schott, M.; Hansen, C.; Hagen, H.

    2009-01-01

    … classification. In this paper, we introduce a method for rendering such features by explicitly solving for isovalues within the volume rendering integral. In addition, we present a sampling strategy inspired by ray differentials that automatically matches …

  12. The INECO Frontal Screening tool differentiates behavioral variant frontotemporal dementia (bv-FTD) from major depression

    Directory of Open Access Journals (Sweden)

    Natalia Fiorentino

    Executive dysfunction may result from involvement of prefrontal circuitry occurring in both neurodegenerative diseases and psychiatric disorders. Moreover, multiple neuropsychiatric conditions may present with overlapping behavioral and cognitive symptoms, making differential diagnosis challenging, especially during the earlier stages. In this sense, cognitive assessment may contribute to the differential diagnosis by providing an objective and quantifiable set of measures with the potential to distinguish clinical conditions otherwise perceived in everyday clinical settings as quite similar. Objective: The goal of this study was to investigate the utility of the INECO Frontal Screening (IFS) for differentiating bv-FTD patients from patients with major depression. Methods: We studied 49 patients with a bv-FTD diagnosis and 30 patients diagnosed with unipolar depression, compared to a control group of 26 healthy controls, using the INECO Frontal Screening (IFS), the Mini-Mental State Examination (MMSE) and the Addenbrooke's Cognitive Examination-Revised (ACE-R). Results: The patient groups differed significantly on the motor inhibitory control (U=437.0, p<0.01), verbal working memory (U=298.0, p<0.001), spatial working memory (U=300.5, p<0.001), proverbs (U=341.5, p<0.001) and verbal inhibitory control (U=316.0, p<0.001) subtests, with bv-FTD patients scoring significantly lower than patients with depression. Conclusion: Our results suggest the IFS can be considered a useful tool for detecting executive dysfunction in both depression and bv-FTD patients and, perhaps more importantly, that it has the potential to help differentiate these two conditions.

  13. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Image processing and analysis for the characterization of RBC aggregation have frequently been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted for routine use in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis, which ensures repeatability.

  14. EpiTools, A software suite for presurgical brain mapping in epilepsy: Intracerebral EEG.

    Science.gov (United States)

    Medina Villalon, S; Paz, R; Roehri, N; Lagarde, S; Pizzo, F; Colombet, B; Bartolomei, F; Carron, R; Bénar, C-G

    2018-03-29

    In pharmacoresistant epilepsy, exploration with depth electrodes can be needed to precisely define the epileptogenic zone. Accurate localization of these electrodes is thus essential for the interpretation of Stereotaxic EEG (SEEG) signals. As SEEG analysis increasingly relies on signal processing, it is crucial to link these results to the patient's anatomy. Our aims were thus to develop a suite of software tools, called "EpiTools", able to i) precisely and automatically localize the position of each SEEG contact and ii) display the results of signal analysis in each patient's anatomy. The first tool, GARDEL (GUI for Automatic Registration and Depth Electrode Localization), automatically localizes SEEG contacts and labels each contact according to a pre-specified nomenclature (for instance that of FreeSurfer or MarsAtlas). The second tool, 3Dviewer, enables visualization, within the 3D anatomy of the patient, of the origin of signal processing results such as rates of biomarkers, connectivity graphs or the Epileptogenicity Index. GARDEL was validated in 30 patients by clinicians and proved to be highly reliable in determining the actual location of contacts within the patient's individual anatomy. GARDEL is a fully automatic electrode localization tool needing limited user interaction (only for electrode naming or contact correction). The 3Dviewer is able to read signal processing results and display them in relation to the patient's anatomy. EpiTools can help speed up the interpretation of SEEG data and improve its precision. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. AutoFACT: An Automatic Functional Annotation and Classification Tool

    Directory of Open Access Journals (Sweden)

    Lang B Franz

    2005-06-01

    Background: Assignment of function to new molecular sequence data is an essential step in genomics projects. The usual process involves similarity searches of a given sequence against one or more databases, an arduous process for large datasets. Results: We present AutoFACT, a fully automated and customizable annotation tool that assigns biologically informative functions to a sequence. Key features of this tool are that it (1) analyzes nucleotide and protein sequence data; (2) determines the most informative functional description by combining multiple BLAST reports from several user-selected databases; (3) assigns putative metabolic pathways, functional classes, enzyme classes, Gene Ontology terms and locus names; and (4) generates output in HTML, text and GFF formats for the user's convenience. We have compared AutoFACT to four well-established annotation pipelines. The error rate of functional annotation is estimated to be only 1-2%. Comparison of AutoFACT to the traditional top-BLAST-hit annotation method shows that our procedure increases the number of functionally informative annotations by approximately 50%. Conclusion: AutoFACT will serve as a useful annotation tool for smaller sequencing groups lacking dedicated bioinformatics staff. It is implemented in PERL and runs on LINUX/UNIX platforms. AutoFACT is available at http://megasun.bch.umontreal.ca/Software/AutoFACT.htm.

  16. An Automatic Instruction-Level Parallelization of Machine Code

    Directory of Open Access Journals (Sweden)

    MARINKOVIC, V.

    2018-02-01

    Prevailing multicore and novel manycore processors have created a great modern-day challenge: parallelization of embedded software that is still written as sequential code. In this paper, automatic code parallelization is considered, focusing on developing a parallelization tool at the binary level and on validating this approach. A novel instruction-level parallelization algorithm for assembly code is developed; it uses register names after static single assignment (SSA) conversion to find independent blocks of code and then schedules the independent blocks using METIS to achieve good load balance. Sequential consistency is verified, and validation is done by measuring program execution time on the target architecture. Great speedup, taken as the performance measure in the validation process, and optimal load balancing are achieved for multicore RISC processors with 2 to 16 cores (e.g., MIPS, MicroBlaze). In particular, for 16 cores, the average speedup is 7.92x, while in some cases it reaches 14x. The approach to automatic parallelization provided by this paper is useful to researchers and developers in the area of parallelization as the basis for further optimizations, as the back-end of a compiler, or as a code parallelization tool for an embedded system.
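    A toy illustration of the two ingredients named above: detecting blocks with no register dependences and balancing them across cores. Real input would be SSA-renamed assembly and the paper partitions with METIS; here the blocks, register sets, and costs are invented, and a longest-processing-time heuristic stands in for METIS.

```python
# Toy sketch: group register-dependent blocks together, then balance the
# groups across two cores with a longest-processing-time heuristic.

from itertools import combinations

# Each block: (name, registers read, registers written, cost in cycles)
blocks = [("b0", {"r1"}, {"r2"}, 40), ("b1", {"r3"}, {"r4"}, 25),
          ("b2", {"r2"}, {"r5"}, 30), ("b3", {"r6"}, {"r7"}, 25)]

def depends(a, b):
    """True if the blocks share a RAW/WAR/WAW hazard on any register."""
    _, ra, wa, _ = a
    _, rb, wb, _ = b
    return bool(wa & rb or ra & wb or wa & wb)

# Union-find: dependent blocks form a group that stays on one core, in order.
parent = list(range(len(blocks)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i
for i, j in combinations(range(len(blocks)), 2):
    if depends(blocks[i], blocks[j]):
        parent[find(i)] = find(j)

groups = {}
for i in range(len(blocks)):
    groups.setdefault(find(i), []).append(i)

# Assign groups to the least-loaded core, heaviest group first.
cores, loads = [[], []], [0, 0]
for members in sorted(groups.values(),
                      key=lambda g: -sum(blocks[i][3] for i in g)):
    k = loads.index(min(loads))
    cores[k].extend(members)
    loads[k] += sum(blocks[i][3] for i in members)

print(cores, loads)  # [[0, 2], [1, 3]] [70, 50]: b0+b2 share r2, b1/b3 fill core 1
```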

  17. Automatic Evaluations and Exercising: Systematic Review and Implications for Future Research.

    Science.gov (United States)

    Schinkoeth, Michaela; Antoniewicz, Franziska

    2017-01-01

    The general purpose of this systematic review was to summarize, structure and evaluate the findings on automatic evaluations of exercising. Studies were eligible for inclusion if they reported measuring automatic evaluations of exercising with an implicit measure and assessed some kind of exercise variable. Fourteen nonexperimental and six experimental studies (out of a total N = 1,928) were identified and rated by two independent reviewers. The main study characteristics were extracted and the grade of evidence for each study evaluated. First, the results revealed a large heterogeneity in the measures applied to assess automatic evaluations of exercising and in the exercise variables. Generally, small to large-sized significant relations between automatic evaluations of exercising and exercise variables were identified in the vast majority of studies. The review offers a systematization of the various examined exercise variables and prompts researchers to differentiate more carefully between actually observed exercise behavior (proximal exercise indicator) and associated physiological or psychological variables (distal exercise indicator). Second, a lack of transparently reported reflection on the differing theoretical bases leading to the use of specific implicit measures was observed. Implicit measures should be applied purposefully, taking into consideration the individual advantages and disadvantages of the measures. Third, 12 studies were rated as providing first-grade evidence (the lowest grade of evidence), five as second-grade, and three as third-grade evidence. There is a dramatic lack of experimental studies, which are essential for illustrating the cause-effect relation between automatic evaluations of exercising and exercise, and for investigating under which conditions automatic evaluations of exercising influence behavior. Conclusions about the necessity of exercise interventions targeted at the alteration of automatic evaluations of exercising should therefore

  18. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    There are increasing numbers of applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chess board, which is found automatically in each image even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the proposed algorithm as a valuable tool for camera calibration.
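    The Hough-transform stage can be illustrated with OpenCV. This sketch only finds candidate grid lines and splits them into the two dominant orientations; the corner analysis and perspective invariants the article combines with this are not shown, and the input filename is a placeholder.

```python
# Illustrative first stage of a chessboard detector: Hough lines split into
# two orientation families, without knowing the number of rows or columns.

import cv2
import numpy as np

img = cv2.imread("chessboard.jpg", cv2.IMREAD_GRAYSCALE)  # any chessboard photo
assert img is not None, "put a test image next to the script"

edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=120)
if lines is None:
    raise SystemExit("no lines found; lower the threshold")

horizontal, vertical = [], []
for rho, theta in lines[:, 0]:
    # theta is the angle of the line's normal: ~0 means a vertical line,
    # ~pi/2 means a horizontal line
    (vertical if abs(np.sin(theta)) < 0.5 else horizontal).append((rho, theta))

print(len(horizontal), "horizontal-ish and", len(vertical), "vertical-ish lines")
```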

  19. Recent advances in Automatic Speech Recognition for Vietnamese

    OpenAIRE

    Le , Viet-Bac; Besacier , Laurent; Seng , Sopheap; Bigi , Brigitte; Do , Thi-Ngoc-Diep

    2008-01-01

    This paper presents our recent activities for automatic speech recognition for Vietnamese. First, our text data collection and processing methods and tools are described. For language modeling, we investigate word, sub-word and also hybrid word/sub-word models. For acoustic modeling, when only limited speech data are available for Vietnamese, we propose some crosslingual acoustic modeling techniques. Furthermore, since the use of sub-word units can reduce the high out-...

  20. Integration of tools for binding archetypes to SNOMED CT.

    Science.gov (United States)

    Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan

    2008-10-27

    The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.

  1. Status of GRACE system - automatic computation of cross sections

    International Nuclear Information System (INIS)

    Fujimoto, J.; Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kato, K.; Nakazawa, N.; Kaneko, T.; Tanaka, H.

    1995-01-01

    Automated systems are essential tools for high-energy physics, and the GRACE system for tree processes makes it possible to calculate cross sections for complicated processes exactly. To check the output of the automatic system, we compare the 't Hooft-Feynman gauge with the unitary gauge, test the exchange of external particles, and check independence of the UV divergence parameter and of the IR divergence parameter.

  2. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

    The GRACE system is an excellent tool for calculating the cross section and generating events of an elementary process automatically. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed to make the system user-friendly. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by e+e- interactions will be calculated and summarized in a catalogue. (author)

  3. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    Energy Technology Data Exchange (ETDEWEB)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.; Overall, Christopher C.; Lee, Joon-Yong; Zucker, Jeremy D.; Glaesemann, Kurt R.; Jansson, Georg C.; Jansson, Janet K.

    2017-03-02

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomics and metatranscriptomics data, and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using a modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
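
    The contig-level taxonomy call lends itself to a compact sketch. Below is a toy majority-vote roll-up in Python: each ORF votes for its assigned lineage, and the contig keeps the deepest rank still backed by a majority of ORFs. This is a simplified stand-in for ATLAS's modified LCA analysis, not its actual implementation; the lineages and the 50% threshold are invented for illustration.

```python
# Toy contig-level taxonomy by majority voting over ORF lineages
# (a simplified stand-in for ATLAS's modified LCA; data are invented).
from collections import Counter

def contig_taxonomy(orf_lineages, majority=0.5):
    """orf_lineages: list of lineages, each a tuple from domain down to species."""
    n, call = len(orf_lineages), []
    depth = max(len(lineage) for lineage in orf_lineages)
    for rank in range(depth):
        votes = Counter(l[rank] for l in orf_lineages if len(l) > rank)
        taxon, count = votes.most_common(1)[0]
        if count / n <= majority:        # majority lost: stop at the previous rank
            break
        call.append(taxon)
    return call

orfs = [("Bacteria", "Proteobacteria", "Gammaproteobacteria", "Escherichia"),
        ("Bacteria", "Proteobacteria", "Gammaproteobacteria", "Shewanella"),
        ("Bacteria", "Proteobacteria", "Gammaproteobacteria", "Escherichia"),
        ("Bacteria", "Firmicutes")]
print(" > ".join(contig_taxonomy(orfs)))  # Bacteria > Proteobacteria > Gammaproteobacteria
```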

  4. Comparison of automatic and visual methods used for image segmentation in Endodontics: a microCT study.

    Science.gov (United States)

    Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz

    2017-01-01

    To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required; these can be determined by visual or automatic methods. Visual determination is influenced by the operator's visual acuity, while the automatic method is done entirely by computer algorithms. The aims were to compare visual and automatic segmentation, and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between visual and automatic segmentation methods regarding root canal volume measurements (p=0.93) and root canal surface area (p=0.79). Although both visual and automatic segmentation methods can be used to determine the threshold and calculate root canal volume and surface area, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.
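
    The abstract does not say which algorithm the vendor's "Automatic Threshold Tool" implements. As a hedged illustration of automatic global thresholding of a root-canal volume, here is Otsu's method (a standard between-class-variance criterion) in plain numpy, with a synthetic volume standing in for the microCT data.

```python
# Otsu's automatic threshold on a synthetic "microCT" volume: a dark canal
# inside bright dentine. Illustrative only; not the vendor tool's algorithm.
import numpy as np

def otsu_threshold(image: np.ndarray, bins: int = 256) -> float:
    """Return the grey value that maximises between-class variance."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                     # weight of the darker class
    w1 = 1.0 - w0                            # weight of the brighter class
    mu0 = np.cumsum(hist * centers)          # unnormalised cumulative class mean
    mu_total = mu0[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_total * w0 - mu0) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]

rng = np.random.default_rng(0)
volume = rng.normal(180, 10, (64, 64, 64))            # bright dentine
volume[24:40, 24:40, :] = rng.normal(60, 10, (16, 16, 64))  # dark canal
t = otsu_threshold(volume)
print(f"threshold={t:.1f}, canal volume={(volume < t).sum()} voxels")
```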

  5. Data-driven automatic parking constrained control for four-wheeled mobile vehicles

    Directory of Open Access Journals (Sweden)

    Wenxu Yan

    2016-11-01

    Full Text Available In this article, a novel data-driven constrained control scheme is proposed for automatic parking systems. The design of the proposed scheme depends only on the steering angle and the orientation angle of the car, and it does not involve any model information of the car. Therefore, the proposed scheme-based automatic parking system is applicable to different kinds of cars. In order to further reduce the desired-trajectory coordinate tracking errors, a coordinates compensation algorithm is also proposed. In the design procedure of the controller, a novel dynamic anti-windup compensator is used to deal with the magnitude and rate saturations of the automatic parking control input. It is theoretically proven, using the Lyapunov stability analysis method, that all the signals in the closed-loop system are uniformly ultimately bounded. Finally, a simulation comparison between the proposed scheme with coordinates compensation and a Proportional-Integral-Derivative (PID) control algorithm is given. It is shown that the proposed scheme with coordinates compensation has smaller tracking errors and more rapid responses than the PID scheme.
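
    The paper's dynamic anti-windup compensator handles both magnitude and rate saturation; the sketch below shows only the simpler textbook idea it builds on, a discrete PID loop with back-calculation anti-windup under magnitude saturation. Gains, limits and the error sequence are made-up values, not taken from the paper.

```python
# Minimal discrete PID with back-calculation anti-windup under magnitude
# saturation of the steering command. Generic textbook scheme, not the
# paper's dynamic compensator; all parameters are invented.
def pid_antiwindup(errors, dt=0.01, kp=2.0, ki=1.0, kd=0.05,
                   u_min=-0.5, u_max=0.5, kt=1.0):
    integral, prev_e, outputs = 0.0, errors[0], []
    for e in errors:
        u_raw = kp * e + ki * integral + kd * (e - prev_e) / dt
        u_sat = max(u_min, min(u_max, u_raw))   # magnitude saturation
        # Back-calculation: bleed off the integral while the actuator saturates.
        integral += (e + kt * (u_sat - u_raw)) * dt
        prev_e = e
        outputs.append(u_sat)
    return outputs

# A step tracking error that would wind up a plain PID integrator.
print(pid_antiwindup([1.0] * 5 + [0.0] * 5)[:6])
```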

  6. Neuroimaging in Parkinsonism: a study with magnetic resonance and spectroscopy as tools in the differential diagnosis

    International Nuclear Information System (INIS)

    Vasconcellos, Luiz Felipe Rocha; Novis, Sergio A. Pereira; Rosso, Ana Lucia Z.; Moreira, Denise Madeira

    2009-01-01

    The differential diagnosis of Parkinsonism based on clinical features may sometimes be difficult. Diagnostic tests in these cases might be useful, especially magnetic resonance imaging, a noninvasive exam, not as expensive as positron emission tomography, which provides a good basis for anatomical analysis. Magnetic resonance spectroscopy analyzes cerebral metabolism, yielding inconsistent results in parkinsonian disorders. We selected 40 individuals for magnetic resonance imaging and spectroscopy analysis: 12 with Parkinson's disease, 11 with progressive supranuclear palsy, 7 with multiple system atrophy (parkinsonian type), and 10 individuals without any psychiatric or neurological disorders (controls). Clinical scales included the Hoehn and Yahr scale, the unified Parkinson's disease rating scale and the mini mental status examination. The results showed that patients with Parkinson's disease and controls presented the same aspects on neuroimaging, with few or no abnormalities, while progressive supranuclear palsy and multiple system atrophy showed abnormalities, some of which were statistically significant. Thus, magnetic resonance imaging and spectroscopy could be useful as tools in the differential diagnosis of Parkinsonism. (author)

  7. Automatic analysis of image quality control for Image Guided Radiation Therapy (IGRT) devices in external radiotherapy

    International Nuclear Information System (INIS)

    Torfeh, Tarraf

    2009-01-01

    On-board imagers mounted on a radiotherapy treatment machine are very effective devices that improve the geometric accuracy of radiation delivery. However, a precise and regular quality control program is required in order to achieve this objective. Our purpose was to develop software tools dedicated to automatic image quality control of the IGRT devices used in external radiotherapy: the 2D-MV mode (high-energy images) for measuring patient position during treatment, the 2D-kV mode (low-energy images), and the 3D Cone Beam Computed Tomography (CBCT) MV or kV mode, used for patient positioning before treatment. Automated analysis of the Winston-Lutz test is also proposed; this test evaluates the mechanical aspects of treatment machines, on which additional constraints are imposed by the weight of the on-board imagers. Finally, a technique for generating digital phantoms in order to assess the performance of the proposed software tools is described. Software tools dedicated to automatic quality control of IGRT devices reduce by a factor of 100 the time spent by the medical physics team analyzing the results of controls, while improving accuracy through objective and reproducible analysis and offering traceability through automatically generated monitoring reports and statistical studies. (author) [fr

  8. Automatic assessment of average diaphragm motion trajectory from 4DCT images through machine learning.

    Science.gov (United States)

    Li, Guang; Wei, Jie; Huang, Hailiang; Gaebler, Carl Philipp; Yuan, Amy; Deasy, Joseph O

    2015-12-01

    To automatically estimate the average diaphragm motion trajectory (ADMT) based on four-dimensional computed tomography (4DCT), facilitating clinical assessment of respiratory motion and motion variation and retrospective motion study, we have developed an effective motion extraction approach and a machine-learning-based algorithm. Eleven patients with 22 sets of 4DCT images (4DCT1 at simulation and 4DCT2 at treatment) were studied. After automatically segmenting the lungs, the differential volume-per-slice (dVPS) curves of the left and right lungs were calculated as a function of slice number for each phase with respect to full exhalation. After a 5-slice moving average was performed, the discrete cosine transform (DCT) was applied to analyze the dVPS curves in the frequency domain. The dimensionality of the spectrum data was reduced by keeping the few lowest-frequency coefficients f_v, which account for most of the spectral energy Σ f_v^2. The multiple linear regression (MLR) method was then applied to determine the weights of these frequencies by fitting the ground truth (the measured ADMT), represented by three pivot points of the diaphragm on each side. The 'leave-one-out' cross-validation method was employed to analyze the statistical performance of the prediction in three image sets: 4DCT1, 4DCT2, and 4DCT1 + 4DCT2. The seven lowest frequencies in the DCT domain were found to be sufficient to approximate the patient dVPS curves (R = 91%-96% in MLR fitting). The mean error in the predicted ADMT using the leave-one-out method was 0.3 ± 1.9 mm for the left-side diaphragm and 0.0 ± 1.4 mm for the right-side diaphragm. The prediction error is lower in 4DCT2 than in 4DCT1, and lowest in 4DCT1 and 4DCT2 combined. This frequency-analysis-based machine learning technique predicts the ADMT automatically with an acceptable error (0.2 ± 1.6 mm). This volumetric approach is not affected by the presence of lung tumors.
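
    The processing chain (moving average, DCT truncation, MLR, leave-one-out) can be sketched compactly. The following Python fragment uses synthetic placeholder curves and targets; only the pipeline shape follows the abstract, none of the numbers do.

```python
# Sketch of the frequency-domain regression described above: compress each
# dVPS curve with a DCT, keep the lowest coefficients, and fit them to a
# scalar motion target by multiple linear regression with leave-one-out
# validation. All data here are synthetic placeholders.
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(1)
n_patients, n_slices, n_keep = 22, 120, 7

curves = rng.normal(size=(n_patients, n_slices)).cumsum(axis=1)  # fake dVPS curves
kernel = np.ones(5) / 5                                          # 5-slice moving average
smoothed = np.array([np.convolve(c, kernel, mode="same") for c in curves])
features = dct(smoothed, norm="ortho", axis=1)[:, :n_keep]       # 7 lowest frequencies
target = features @ rng.normal(size=n_keep) + rng.normal(0, 0.1, n_patients)

errors = []
for i in range(n_patients):                                      # leave-one-out
    train = np.delete(np.arange(n_patients), i)
    X = np.c_[np.ones(len(train)), features[train]]              # MLR with intercept
    w, *_ = np.linalg.lstsq(X, target[train], rcond=None)
    errors.append(np.r_[1.0, features[i]] @ w - target[i])
print(f"LOO error: {np.mean(errors):+.3f} ± {np.std(errors):.3f}")
```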

  9. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    Directory of Open Access Journals (Sweden)

    Ping Hu

    2017-04-01

    Full Text Available The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study used a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between-subjects design. Participants were assigned to two groups according to their performance on an emotion-regulation IAT (differentiating automatic emotion control tendency from automatic emotion expression tendency), and participants of the two groups were then randomly assigned to two emotion regulation priming conditions (emotion control priming or emotion expression priming). Anger was provoked by blaming participants for slow performance during a subsequent backward subtraction task. During anger provocation, the SCL of individuals with automatic emotion control tendencies in the control priming condition was lower than that of individuals with automatic emotion control tendencies in the expression priming condition. However, the SCL of individuals with automatic emotion expression tendencies did not differ between the automatic emotion control priming and the automatic emotion expression priming conditions. Heart rate during anger provocation was higher in individuals with automatic emotion expression tendencies than in individuals with automatic emotion control tendencies, regardless of priming condition. This pattern indicates an interactive effect of individual differences in AER and emotion regulation priming on SCL, which is an index of emotional arousal. Heart rate was only sensitive to the individual differences in AER and did not reflect this interaction. This finding has implications for clinical studies of emotion regulation strategy training, suggesting that different practices are optimal for individuals who differ in AER tendencies.

  10. SplitRacer - a semi-automatic tool for the analysis and interpretation of teleseismic shear-wave splitting

    Science.gov (United States)

    Reiss, Miriam Christina; Rümpker, Georg

    2017-04-01

    We present a semi-automatic, graphical user interface tool for the analysis and interpretation of teleseismic shear-wave splitting in MATLAB. Shear-wave splitting analysis is a standard tool to infer seismic anisotropy, which is often interpreted as due to lattice-preferred orientation of e.g. mantle minerals or shape-preferred orientation caused by cracks or alternating layers in the lithosphere, and hence provides a direct link to the earth's kinematic processes. The increasing number of permanent stations and temporary experiments has resulted in comprehensive studies of seismic anisotropy world-wide. Their successive comparison with a growing number of global models of mantle flow further advances our understanding of the earth's interior. However, increasingly large data sets pose the inevitable question of how to process them. Well-established routines and programs are accurate but often slow and impractical for analyzing a large amount of data. Additionally, shear-wave splitting results are seldom evaluated using the same quality criteria, which complicates a straightforward comparison. SplitRacer consists of several processing steps: i) download of data via FDSNWS; ii) direct reading of miniSEED files and an initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) analysis of the particle motion of selected phases and successive correction of the sensor misalignment based on the long axis of the particle motion; iv) splitting analysis of selected events: seismograms are first rotated into radial and transverse components, then the energy-minimization method is applied, which provides the polarization and delay time of the phase; to estimate errors, the analysis is done for different randomly chosen time windows; v) joint splitting analysis of all events for one station, where the energy content of all phases is inverted simultaneously. This decreases the influence of noise and increases the robustness of the measurement.
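
    Step iv, the energy-minimization method, is the classic grid search over fast-axis angle and delay time that minimizes energy on the corrected transverse component. A bare-bones version on synthetic data might look as follows; SplitRacer's actual implementation adds window selection, error estimation and quality criteria.

```python
# Bare-bones transverse-energy-minimisation splitting measurement:
# grid-search the fast axis phi and the delay (in samples) that minimise
# energy on the corrected transverse component. Synthetic data only.
import numpy as np

def min_transverse_energy(radial, transverse, max_lag=40):
    best = (None, None, np.inf)
    for phi in np.deg2rad(np.arange(-90, 90, 1)):
        c, s = np.cos(phi), np.sin(phi)
        fast = c * radial + s * transverse        # rotate into trial fast/slow frame
        slow = -s * radial + c * transverse
        for lag in range(1, max_lag):
            slow_adv = np.roll(slow, -lag)        # undo the trial delay
            t_corr = s * fast + c * slow_adv      # rotate back: transverse component
            energy = np.sum(t_corr[: len(t_corr) - lag] ** 2)
            if energy < best[2]:
                best = (np.rad2deg(phi), lag, energy)
    return best

# Synthetic split XKS wave: radial-polarised pulse, fast axis 30 deg, 15-sample delay.
n = 512
t = np.arange(n)
p = np.exp(-((t - 200) / 20.0) ** 2)
phi0, lag0 = np.deg2rad(30), 15
p_del = np.roll(p, lag0)
radial = p * np.cos(phi0) ** 2 + p_del * np.sin(phi0) ** 2
transverse = (p - p_del) * np.sin(phi0) * np.cos(phi0)
print(min_transverse_energy(radial, transverse)[:2])  # ~ (30 deg, 15 samples)
```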

  11. Automatic interpretation and writing report of the adult waking electroencephalogram.

    Science.gov (United States)

    Shibasaki, Hiroshi; Nakamura, Masatoshi; Sugi, Takenao; Nishida, Shigeto; Nagamine, Takashi; Ikeda, Akio

    2014-06-01

    Automatic interpretation of the EEG has so far faced significant difficulties because of the large amount of spatial as well as temporal information contained in the EEG, the continuous fluctuation of the background activity depending on changes in the subject's vigilance and attention level, the occurrence of paroxysmal activities such as spikes and spike-and-slow-waves, contamination of the EEG with a variety of artefacts, and the use of different recording electrodes and montages. Therefore, previous attempts at automatic EEG interpretation have focussed only on specific EEG features such as paroxysmal abnormalities, delta waves, sleep stages and artefact detection. As a result of a long-standing cooperation between clinical neurophysiologists and system engineers, we report for the first time on a comprehensive, computer-assisted, automatic interpretation of the adult waking EEG. This system analyses the background activity, intermittent abnormalities, artefacts and the level of vigilance and attention of the subject, and automatically presents its report in written form. In addition, it detects paroxysmal abnormalities and evaluates the effects of intermittent photic stimulation and hyperventilation on the EEG. This system of automatic EEG interpretation was formed by adopting the strategy that qualified EEGers employ for systematic visual inspection. It can be used as a supplementary tool for the EEGer's visual inspection, and for educating EEG trainees and EEG technicians. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Revisiting the dose-effect correlations in irradiated head and neck cancer using automatic segmentation tools of the dental structures, mandible and maxilla

    International Nuclear Information System (INIS)

    Thariat, J.; Ramus, L.; Odin, G.; Vincent, S.; Orlanducci, M.H.; Dassonville, O.; Darcourt, V.; Lacout, A.; Marcy, P.Y.; Cagnol, G.; Malandain, G.

    2011-01-01

    Purpose. - Manual delineation of dental structures is too time-consuming to be feasible in routine practice. Information on dose risk levels is crucial for dentists following irradiation of the head and neck to avoid post-extraction osteoradionecrosis, but has so far been based on empirical dose-effect data established on two-dimensional radiation therapy plans. Material and methods. - We present an automatic atlas-based segmentation framework for the dental structures, called Dentalmaps, constructed from a patient image-segmentation database. Results. - The framework is accurate to within 2 Gy and relevant for routine use. It has the potential to guide dental care in the context of new irradiation techniques. Conclusion. - This tool provides a user-friendly interface for dentists and radiation oncologists caring for irradiated head and neck cancer patients. It will likely improve the knowledge of dose-effect correlations for dental complications and osteoradionecrosis. (authors)

  13. Differentiation/Purification Protocol for Retinal Pigment Epithelium from Mouse Induced Pluripotent Stem Cells as a Research Tool.

    Directory of Open Access Journals (Sweden)

    Yuko Iwasaki

    Full Text Available To establish a novel protocol for differentiation of retinal pigment epithelium (RPE) with high purity from mouse induced pluripotent stem cells (iPSC). Retinal progenitor cells were differentiated from mouse iPSC, and RPE differentiation was then enhanced by activation of the Wnt signaling pathway, inhibition of the fibroblast growth factor signaling pathway, and inhibition of the Rho-associated, coiled-coil containing protein kinase signaling pathway. Expanded pigmented cells were purified by plate adhesion after Accutase® treatment. Enriched cells were cultured until they developed a cobblestone appearance with cuboidal shape. The characteristics of iPS-RPE were confirmed by gene expression, immunocytochemistry, and electron microscopy. Functions and immunologic features of the iPS-RPE were also evaluated. We obtained iPS-RPE at high purity (approximately 98%). The iPS-RPE showed apical-basal polarity and cellular structure characteristic of RPE. Expression levels of several RPE markers were lower than those of freshly isolated mouse RPE but comparable to those of primary cultured RPE. The iPS-RPE could form tight junctions, phagocytose photoreceptor outer segments, express immune antigens, and suppress lymphocyte proliferation. We successfully developed a differentiation/purification protocol to obtain mouse iPS-RPE. The mouse iPS-RPE can serve as an attractive tool for functional and morphological studies of RPE.

  14. Automatic Visualization of Software Requirements: Reactive Systems

    International Nuclear Information System (INIS)

    Castello, R.; Mili, R.; Tollis, I.G.; Winter, V.

    1999-01-01

    In this paper we present an approach that facilitates the validation of high consequence system requirements. This approach consists of automatically generating a graphical representation from an informal document. Our choice of a graphical notation is statecharts. We proceed in two steps: we first extract a hierarchical decomposition tree from a textual description, then we draw a graph that models the statechart in a hierarchical fashion. The resulting drawing is an effective requirements assessment tool that allows the end user to easily pinpoint inconsistencies and incompleteness

  15. Can wireless technology enable new diabetes management tools?

    Science.gov (United States)

    Hedtke, Paul A

    2008-01-01

    Mobile computing and communications technology embodied in the modern cell phone device can be employed to improve the lives of diabetes patients by giving them better tools for self-management. Several companies are working on the development of diabetes management tools that leverage the ubiquitous cell phone to bring self-management tools to the hand of the diabetes patient. Integration of blood glucose monitoring (BGM) technology with the cell phone platform adds a level of convenience for the person with diabetes, but, more importantly, allows BGM data to be automatically captured, logged, and processed in near real time in order to provide the diabetes patient with assistance in managing their blood glucose levels. Other automatic measurements can estimate physical activity, and information regarding medication events and food intake can be captured and analyzed in order to provide the diabetes patient with continual assistance in managing their therapy and behaviors in order to improve glycemic control. The path to realization of such solutions is not, however, without obstacles.

  16. Assessing hippocampal development and language in early childhood: Evidence from a new application of the Automatic Segmentation Adapter Tool.

    Science.gov (United States)

    Lee, Joshua K; Nordahl, Christine W; Amaral, David G; Lee, Aaron; Solomon, Marjorie; Ghetti, Simona

    2015-11-01

    Volumetric assessments of the hippocampus and other brain structures during childhood provide useful indices of brain development and correlates of cognitive functioning in typically and atypically developing children. Automated methods such as FreeSurfer promise efficient and replicable segmentation, but may include errors which are avoided by trained manual tracers. A recently devised automated correction tool that uses a machine learning algorithm to remove systematic errors, the Automatic Segmentation Adapter Tool (ASAT), was capable of substantially improving the accuracy of FreeSurfer segmentations in an adult sample [Wang et al., 2011], but the utility of ASAT has not been examined in pediatric samples. In Study 1, the validity of FreeSurfer and ASAT-corrected hippocampal segmentations was examined in 20 typically developing children and 20 children with autism spectrum disorder aged 2 and 3 years. We showed that while neither FreeSurfer nor ASAT accuracy differed by disorder or age, the accuracy of ASAT-corrected segmentations was substantially better than that of FreeSurfer segmentations in every case, using as few as 10 training examples. In Study 2, we applied ASAT to 89 typically developing children aged 2 to 4 years to examine relations between hippocampal volume, age, sex, and expressive language. Girls had smaller hippocampi overall, and in the left hippocampus this difference was larger in older than younger girls. Expressive language ability was greater in older children, and this difference was larger in those with larger hippocampi, bilaterally. Overall, this research shows that ASAT is highly reliable and useful for examinations relating behavior to hippocampal structure. © 2015 Wiley Periodicals, Inc.

  17. Automatic design optimization tool for passive structural control systems

    Science.gov (United States)

    Mojolic, Cristian; Hulea, Radu; Parv, Bianca Roxana

    2017-07-01

    The present paper proposes an automatic dynamic process for finding the parameters of seismic isolation systems applied to large-span structures. Three seismic isolation solutions are proposed for the model of the new Slatina Sport Hall: the first uses a friction pendulum system (FP), the second uses High Damping Rubber Bearings (HDRB), and the third uses Lead Rubber Bearings (LRB). The isolation level is placed at the top end of the roof-supporting columns. The aim is to calculate the parameters of each isolation system so that the whole structure's first vibration period is the one desired by the user. The model is computed with the SAP2000 software. In order to find the best solution for the optimization problem, an optimization process based on Genetic Algorithms (GA) has been developed in Matlab. With the use of the API (Application Programming Interface) libraries, a two-way link is created between the two programs in order to exchange results and link parameters. The main goal is to find the best seismic isolation method for each desired modal period so that the bending moment in the supporting columns is minimized.

  18. Design principles of metal-cutting machine tools

    CERN Document Server

    Koenigsberger, F

    1964-01-01

    Design Principles of Metal-Cutting Machine Tools discusses the fundamental aspects of machine tool design. The book covers the design considerations of metal-cutting machines, such as static and dynamic stiffness, operational speeds, gearboxes, and manual and automatic control. The text first details the data calculation and the general requirements of the machine tool. Next, the book discusses the design principles, which include stiffness and rigidity of the separate constructional elements and their combined behavior under load, as well as electrical, mechanical, and hydraulic drives for the op

  19. PACS quality control and automatic problem notifier

    Science.gov (United States)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert to a film-based system if components fail. System failures range from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described, along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established.

  20. Sexual Modes Questionnaire (SMQ): Translation and Psychometric Properties of the Italian Version of the Automatic Thought Scale.

    Science.gov (United States)

    Nimbi, Filippo Maria; Tripodi, Francesca; Simonelli, Chiara; Nobre, Pedro

    2018-03-01

    The Sexual Modes Questionnaire (SMQ) is a validated and widely used tool to assess the association among negative automatic thoughts, emotions, and sexual response during sexual activity in men and women. The aim was to test the psychometric characteristics of the Italian version of the SMQ, focusing on the Automatic Thoughts subscale (SMQ-AT). After linguistic translation, the psychometric properties (internal consistency, construct validity, and discriminant validity) were evaluated. 1,051 participants (425 men and 626 women; 776 healthy and 275 in clinical groups complaining of sexual problems) took part in the present study. Two confirmatory factor analyses were conducted to test the fit of the original factor structures of the SMQ versions. In addition, two principal component analyses were performed to identify two new factorial structures, which were further validated with confirmatory factor analyses. Cronbach's α and composite reliability were used as internal consistency measures, and differences between clinical and control groups were examined to test the discriminant validity of the male and female versions. The associations with emotions and sexual functioning measures are also reported. Principal component analyses identified 5 factors in the male version: erection concerns thoughts, lack of erotic thoughts, age- and body-related thoughts, negative thoughts toward sex, and worries about partner's evaluation and failure anticipation thoughts. In the female version 6 factors were found: sexual abuse thoughts, lack of erotic thoughts, low self-body image thoughts, failure and disengagement thoughts, sexual passivity and control, and partner's lack of affection. Confirmatory factor analysis supported the adequacy of the factor structure for men and women. Moreover, the SMQ showed a strong association with emotional response and sexual functioning, differentiating between clinical and control groups. This measure is useful to evaluate patients and design interventions focused on

  1. Automatic creation of specialised multilingual dictionaries in new subject areas

    Directory of Open Access Journals (Sweden)

    Joaquim Moré

    2009-05-01

    Full Text Available This article presents a tool to automatically generate specialised dictionaries of multilingual equivalents in new subject areas. The tool uses resources that are available on the web to search for equivalents and verify their reliability. These resources are, on the one hand, the Wikipedias, which can be freely downloaded and processed, and, on the other, the materials that terminological institutions of reference make available. This tool is of use to teachers producing teaching materials and researchers preparing theses, articles or reference manuals. It is also of use to translators and terminologists working on terminological standardisation in a new subject area in a given language, as it helps them in their work to pinpoint concepts that have yet to receive a standardised denomination.

  2. Automatic affective appraisal of sexual penetration stimuli in women with vaginismus or dyspareunia.

    Science.gov (United States)

    Huijding, Jorg; Borg, Charmaine; Weijmar-Schultz, Willibrord; de Jong, Peter J

    2011-03-01

    Current psychological views are that negative appraisals of sexual stimuli lie at the core of sexual dysfunctions. It is important to differentiate between deliberate appraisals and more automatic appraisals, as research has shown that the former are most relevant to controllable behaviors, and the latter are most relevant to reflexive behaviors. Accordingly, it can be hypothesized that in women with vaginismus, the persistent difficulty to allow vaginal entry is due to global negative automatic affective appraisals that trigger reflexive pelvic floor muscle contraction at the prospect of penetration. To test whether sexual penetration pictures elicited global negative automatic affective appraisals in women with vaginismus or dyspareunia and to examine whether deliberate appraisals and automatic appraisals differed between the two patient groups. Women with persistent vaginismus (N = 24), dyspareunia (N = 23), or no sexual complaints (N = 30) completed a pictorial Extrinsic Affective Simon Task (EAST), and then made a global affective assessment of the EAST stimuli using visual analogue scales (VAS). The EAST assessed global automatic affective appraisals of sexual penetration stimuli, while the VAS assessed global deliberate affective appraisals of these stimuli. Automatic affective appraisals of sexual penetration stimuli tended to be positive, independent of the presence of sexual complaints. Deliberate appraisals of the same stimuli were significantly more negative in the women with vaginismus than in the dyspareunia group and control group, while the latter two groups did not differ in their appraisals. Unexpectedly, deliberate appraisals seemed to be most important in vaginismus, whereas dyspareunia did not seem to implicate negative deliberate or automatic affective appraisals. These findings dispute the view that global automatic affect lies at the core of vaginismus and indicate that a useful element in therapeutic interventions may be the modification of

  3. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions built into an economic system that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  4. Automatic computation and solution of generalized harmonic balance equations

    Science.gov (United States)

    Peyton Jones, J. C.; Yaser, K. S. A.; Stevenson, J.

    2018-02-01

    Generalized methods are presented for generating and solving the harmonic balance equations for a broad class of nonlinear differential or difference equations and for a general set of harmonics chosen by the user. In particular, a new algorithm for automatically generating the Jacobian of the balance equations enables efficient solution of these equations using continuation methods. Efficient numeric validation techniques are also presented, and the combined algorithm is applied to the analysis of dc, fundamental, second and third harmonic response of a nonlinear automotive damper.
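
    As a minimal illustration of harmonic balance (not the authors' generalized formulation), the sketch below solves a Duffing oscillator with a truncated Fourier ansatz and trigonometric collocation; scipy's fsolve falls back on a finite-difference Jacobian, which is precisely the cost the paper's automatically generated Jacobian avoids.

```python
# Minimal harmonic-balance sketch for a Duffing oscillator
#   x'' + c x' + k x + beta x^3 = F cos(w t),
# using a truncated Fourier ansatz evaluated at collocation times.
# All parameter values are arbitrary illustrations.
import numpy as np
from scipy.optimize import fsolve

c, k, beta, F, w, N = 0.1, 1.0, 0.5, 0.3, 1.2, 5
t = 2 * np.pi / w * np.arange(2 * N + 1) / (2 * N + 1)   # collocation times

def series(coeffs, deriv=0):
    """Evaluate the Fourier ansatz (or its time derivatives) at times t."""
    a0, a, b = coeffs[0], coeffs[1:N + 1], coeffs[N + 1:]
    x = np.full_like(t, a0 if deriv == 0 else 0.0)
    for n in range(1, N + 1):
        wn = n * w
        cos, sin = np.cos(wn * t), np.sin(wn * t)
        if deriv == 0:
            x += a[n - 1] * cos + b[n - 1] * sin
        elif deriv == 1:
            x += -a[n - 1] * wn * sin + b[n - 1] * wn * cos
        else:
            x += -(wn ** 2) * (a[n - 1] * cos + b[n - 1] * sin)
    return x

def residual(coeffs):
    x, xd, xdd = series(coeffs), series(coeffs, 1), series(coeffs, 2)
    return xdd + c * xd + k * x + beta * x ** 3 - F * np.cos(w * t)

sol = fsolve(residual, np.zeros(2 * N + 1))   # finite-difference Jacobian (slow path)
print("fundamental amplitude:", np.hypot(sol[1], sol[N + 1]).round(4))
```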

  5. Evaluation of automatic image quality assessment in chest CT - A human cadaver study.

    Science.gov (United States)

    Franck, Caro; De Crop, An; De Roo, Bieke; Smeets, Peter; Vergauwen, Merel; Dewaele, Tom; Van Borsel, Mathias; Achten, Eric; Van Hoof, Tom; Bacher, Klaus

    2017-04-01

    The evaluation of clinical image quality (IQ) is important to optimize CT protocols and to keep patient doses as low as reasonably achievable. Considering the significant effort needed for human observer studies, automatic IQ tools are a promising alternative. The purpose of this study was to evaluate automatic IQ assessment in chest CT using Thiel-embalmed cadavers. Chest CTs of Thiel-embalmed cadavers were acquired at different exposures. Clinical IQ was determined by performing a visual grading analysis. Physical-technical IQ (noise, contrast-to-noise and contrast-detail) was assessed in a Catphan phantom. Soft and sharp reconstructions were made with filtered back projection and two strengths of iterative reconstruction. In addition to the classical IQ metrics, an automatic algorithm was used to calculate image quality scores (IQs). To be able to compare datasets reconstructed with different kernels, the IQs values were normalized. Good correlations were found between IQs and the measured physical-technical image quality: noise (ρ=-1.00), contrast-to-noise (ρ=1.00) and contrast-detail (ρ=0.96). The correlation coefficients between IQs and the observed clinical image quality of soft and sharp reconstructions were 0.88 and 0.93, respectively. The automatic scoring algorithm is a promising tool for the evaluation of thoracic CT scans in daily clinical practice. It allows monitoring of the image quality of a chest protocol over time, without human intervention. Different reconstruction kernels can be compared after normalization of the IQs. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  6. Applying CASE Tools for On-Board Software Development

    Science.gov (United States)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  7. Set-up for differential manometers testing

    International Nuclear Information System (INIS)

    Ratushnyj, M.I.; Galkin, Yu.V.; Nechaj, A.G.

    1985-01-01

    A set-up for checking and testing the metrological characteristics of TPP and NPP differential manometers with pressure drops up to 250 kPa is briefly described. The set-up provides automatic and manual assignment of gauge air pressure values with errors of 0.1% and 0.25%, respectively, and is supplied with standard equipment to measure output signals. The set-up is powered from a single-phase 220 V alternating current circuit, and air is supplied at 0.4-0.6 MPa from a pneumatic system. Application of the set-up increases operating efficiency five-fold while checking and tuning differential manometers.

  8. Delay differential equations via the matrix Lambert W function and bifurcation analysis: application to machine tool chatter.

    Science.gov (United States)

    Yi, Sun; Nelson, Patrick W; Ulsoy, A Galip

    2007-04-01

    In a turning process modeled using delay differential equations (DDEs), we investigate the stability of the regenerative machine tool chatter problem. An approach using the matrix Lambert W function for the analytical solution to systems of delay differential equations is applied to this problem and compared with the result obtained using a bifurcation analysis. The Lambert W function, known to be useful for solving scalar first-order DDEs, has recently been extended to a matrix Lambert W function approach to solve systems of DDEs. The essential advantages of the matrix Lambert W approach are not only the similarity to the concept of the state transition matrix in linear ordinary differential equations, enabling its use for general classes of linear delay differential equations, but also the observation that we need only the principal branch among an infinite number of roots to determine the stability of a system of DDEs. The bifurcation method combined with Sturm sequences provides an algorithm for determining the stability of DDEs without restrictive geometric analysis. With this approach, one can obtain the critical values of delay, which determine the stability of a system and hence the preferred operating spindle speed without chatter. We apply both the matrix Lambert W function and the bifurcation analysis approach to the problem of chatter stability in turning, and compare the results obtained to existing methods. The two new approaches show excellent accuracy and certain other advantages, when compared to traditional graphical, computational and approximate methods.
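
    For the scalar first-order case the approach reduces to a one-line stability test, shown below with scipy's Lambert W; the matrix extension that handles systems of DDEs is the paper's contribution and is not reproduced here. Parameter values are arbitrary.

```python
# Scalar Lambert W stability test for  x'(t) = a x(t) + ad x(t - tau).
# The characteristic roots are  s_k = a + W_k(ad * tau * exp(-a * tau)) / tau,
# and in the scalar case the principal branch k = 0 gives the rightmost root,
# so its real part alone decides stability.
import numpy as np
from scipy.special import lambertw

def rightmost_root(a: float, ad: float, tau: float) -> complex:
    return a + lambertw(ad * tau * np.exp(-a * tau), k=0) / tau

# For a = 0, ad = -1.5 the analytic stability boundary is tau = pi/3 ~ 1.047.
for tau in (0.5, 1.0, 1.2):
    s = rightmost_root(a=0.0, ad=-1.5, tau=tau)
    print(f"tau={tau:3.1f}  Re(s)={s.real:+.3f}  {'stable' if s.real < 0 else 'unstable'}")
```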

  9. Modeling and monitoring of pipelines and networks advanced tools for automatic monitoring and supervision of pipelines

    CERN Document Server

    Torres, Lizeth

    2017-01-01

    This book focuses on the analysis and design of advanced techniques for on-line automatic computational monitoring of pipelines and pipe networks. It discusses how to improve the systems’ security considering mathematical models of the flow, historical flow rate and pressure data, with the main goal of reducing the number of sensors installed along a pipeline. The techniques presented in the book have been implemented in digital systems to enhance the abilities of the pipeline network’s operators in recognizing anomalies. A real leak scenario in a Mexican water pipeline is used to illustrate the benefits of these techniques in locating the position of a leak. Intended for an interdisciplinary audience, the book addresses researchers and professionals in the areas of mechanical, civil and control engineering. It covers topics on fluid mechanics, instrumentation, automatic control, signal processing, computing, construction and diagnostic technologies.

  10. Automatic Tagging as a Support Strategy for Creating Knowledge Maps

    Directory of Open Access Journals (Sweden)

    Leonardo Moura De Araújo

    2017-06-01

    Full Text Available Graph organizers are powerful tools for both structuring and transmitting knowledge. Because of their unique characteristics, these organizers are valuable for cultural institutions, which own large amounts of information assets and need to constantly make sense of them. On one hand, graph organizers are tools for connecting numerous chunks of data. On the other hand, because they are visual media, they offer a bird's-eye view perspective on complexity, which is digestible to the human eye. They are effective tools for information synthesis, and are capable of providing valuable insights on data. Information synthesis is essential for Heritage Interpretation, since institutions depend on constant generation of new content to preserve relevance among their audiences. While Mind Maps are simpler to be structured and comprehended, Knowledge Maps offer challenges that require new methods to minimize the difficulties encountered during their assembly. This paper presents strategies based on manual and automatic tagging as an answer to this problem. In addition, we describe the results of a usability test and qualitative analysis performed to compare the workflows employed to construct both Mind Maps and Knowledge Maps. Furthermore, we also talk about how well concepts can be communicated through the visual representation of trees and networks. Depending on the employed method, different results can be achieved, because of their unique topological characteristics. Our findings suggest that automatic tagging supports and accelerates the construction of graphs.

  11. Haystack, a web-based tool for metabolomics research.

    Science.gov (United States)

    Grace, Stephen C; Embry, Stephen; Luo, Heng

    2014-01-01

    Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. It offers users a range of data visualization options and supports non
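
    Haystack's binning step can be sketched as follows: collapse each run's (m/z, intensity) list into fixed-width mass bins to get one feature vector per sample, then project the vectors with PCA. The bin width, mass range and synthetic peak data below are invented for illustration, not Haystack's defaults.

```python
# Hedged sketch of the binning-plus-PCA idea: one interval variable per m/z
# bin, one feature vector per LCMS run. All peak data are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
mz_min, mz_max, bin_width = 100.0, 1000.0, 1.0
edges = np.arange(mz_min, mz_max + bin_width, bin_width)

def bin_run(mz, intensity):
    """Sum intensities falling into each m/z bin."""
    binned, _ = np.histogram(mz, bins=edges, weights=intensity)
    return binned

# Two synthetic sample classes differing in a handful of diagnostic masses.
samples, labels = [], []
for cls, shift in ((0, 0.0), (1, 250.0)):
    for _ in range(10):
        mz = np.concatenate([rng.uniform(mz_min, mz_max, 200),
                             rng.normal(400.0 + shift, 0.2, 50)])
        inten = rng.exponential(1.0, mz.size)
        samples.append(bin_run(mz, inten))
        labels.append(cls)

X = np.log1p(np.array(samples))          # tame the intensity dynamic range
scores = PCA(n_components=2).fit_transform(X)
labels = np.array(labels)
print("class means on PC1:",
      scores[labels == 0, 0].mean().round(2),
      scores[labels == 1, 0].mean().round(2))
```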

  12. Automatic design of digital synthetic gene circuits.

    Directory of Open Access Journals (Sweden)

    Mario A Marchisio

    2011-02-01

    Full Text Available De novo computational design of synthetic gene circuits that achieve well-defined target functions is a hard task. Existing, brute-force approaches run optimization algorithms on the structure and on the kinetic parameter values of the network. However, more direct rational methods for automatic circuit design are lacking. Focusing on digital synthetic gene circuits, we developed a methodology and a corresponding tool for in silico automatic design. For a given truth table that specifies a circuit's input-output relations, our algorithm generates and ranks several possible circuit schemes without the need for any optimization. Logic behavior is reproduced by the action of regulatory factors and chemicals on the promoters and on the ribosome binding sites of biological Boolean gates. Simulations of circuits with up to four inputs show a faithful and unequivocal truth table representation, even under parametric perturbations and stochastic noise. A comparison with already implemented circuits, in addition, reveals the potential for simpler designs with the same function. Therefore, we expect the method to help both in devising new circuits and in simplifying existing solutions.

  13. Dissociation between controlled and automatic processes in the behavioral variant of fronto-temporal dementia.

    Science.gov (United States)

    Collette, Fabienne; Van der Linden, Martial; Salmon, Eric

    2010-01-01

    A decline of cognitive functioning affecting several cognitive domains has frequently been reported in patients with frontotemporal dementia. We were interested in determining whether these deficits can be interpreted as reflecting an impairment of controlled cognitive processes, using an assessment tool specifically developed to explore the distinction between automatic and controlled processes, namely the process dissociation procedure (PDP) developed by Jacoby. The PDP was applied to a word-stem completion task to determine the contribution of automatic and controlled processes to episodic memory performance, and was administered to a group of 12 patients with the behavioral variant of frontotemporal dementia (bv-FTD) and 20 control subjects (CS). Bv-FTD patients obtained a lower performance than CS on the estimates of controlled processes, but no group difference was observed for estimates of automatic processes. The between-group comparison of the estimates of controlled and automatic processes showed a larger contribution of automatic processes to performance in bv-FTD, and a slightly larger contribution of controlled processes in control subjects. These results clearly indicate an alteration of controlled memory processes in bv-FTD.
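
    Jacoby's PDP derives its estimates from two instruction conditions. With I the probability of completing a word stem with a studied word under inclusion instructions and E the same probability under exclusion instructions, the standard equations are C = I - E for controlled processing and A = E / (1 - C) for automatic processing. A minimal sketch with invented proportions:

```python
# Jacoby's standard process-dissociation estimation equations.
# The example proportions below are invented for illustration only;
# they merely echo the qualitative pattern reported in the abstract.
def pdp_estimates(p_inclusion: float, p_exclusion: float) -> tuple[float, float]:
    controlled = p_inclusion - p_exclusion              # C = I - E
    automatic = (p_exclusion / (1.0 - controlled)       # A = E / (1 - C)
                 if controlled < 1.0 else float("nan"))
    return controlled, automatic

for group, (inc, exc) in {"controls": (0.61, 0.32), "bv-FTD": (0.45, 0.34)}.items():
    c, a = pdp_estimates(inc, exc)
    print(f"{group:8s}  C={c:.2f}  A={a:.2f}")   # lower C, similar A in bv-FTD
```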

  14. Tool-change coupling for robots

    International Nuclear Information System (INIS)

    Cooper, C.

    1988-01-01

    A coupling device for enabling a robotic unit to couple automatically with any one of a number of tools comprises two coupling parts connected respectively to the tool and the arm. The two parts can be brought into interengaged relationship by appropriate manipulation of the arm and can be locked together by means of a locking element controlled by a piston-and-cylinder assembly: the element enters a tapered bore in a spigot after the spigot, mounted on one coupling part, has entered a bore in the other coupling part. The parts also incorporate registering passages that provide continuity of the pressurised air supply between the robotic unit and a fluid-powered device incorporated in the tool. (author)

  15. Neuroimaging in Parkinsonism: a study with magnetic resonance and spectroscopy as tools in the differential diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcellos, Luiz Felipe Rocha [Hospital dos Servidores do Estado, Rio de Janeiro RJ (Brazil)], e-mail: luizneurol@terra.com.br; Novis, Sergio A. Pereira; Rosso, Ana Lucia Z. [Hospital Universitario Clementino Fraga Filho (HUCFF), Rio de Janeiro, RJ (Brazil); Moreira, Denise Madeira [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Neurologia Deolindo Couto; Leite, Ana Claudia C.B. [Fundacao Oswaldo Cruz (FIOCRUZ), Rio de Janeiro, RJ (Brazil)

    2009-03-15

    The differential diagnosis of Parkinsonism based on clinical features may sometimes be difficult. Diagnostic tests in these cases might be useful, especially magnetic resonance imaging, a noninvasive exam, not as expensive as positron emission tomography, which provides a good basis for anatomical analysis. Magnetic resonance spectroscopy analyzes cerebral metabolism, yielding inconsistent results in parkinsonian disorders. We selected 40 individuals for magnetic resonance imaging and spectroscopy analysis: 12 with Parkinson's disease, 11 with progressive supranuclear palsy, 7 with multiple system atrophy (parkinsonian type), and 10 individuals without any psychiatric or neurological disorders (controls). Clinical scales included the Hoehn and Yahr scale, the unified Parkinson's disease rating scale and the mini mental status examination. The results showed that patients with Parkinson's disease and controls presented the same aspects on neuroimaging, with few or no abnormalities, while progressive supranuclear palsy and multiple system atrophy showed abnormalities, some of which were statistically significant. Thus, magnetic resonance imaging and spectroscopy could be useful as tools in the differential diagnosis of Parkinsonism. (author)

  16. Automatic charge control system for satellites

    Science.gov (United States)

    Shuman, B. M.; Cohen, H. A.

    1985-01-01

    The SCATHA and the ATS-5 and ATS-6 spacecraft provided insights into the problem of spacecraft charging at geosynchronous altitudes. Reduction of the levels of both absolute and differential charging was indicated by the emission of low-energy neutral plasma. It is appropriate to complete the transition from experimental results to the development of a system that will sense the state of charge of a spacecraft and, when a predetermined threshold is reached, will respond automatically to reduce it. A development program was initiated utilizing sensors comparable to the proton electrostatic analyzer, the surface potential monitor, and the transient pulse monitor that flew on SCATHA, combining their outputs through a microprocessor controller to operate a rapid-start, low-energy plasma source.

  17. Automatic flow-through dynamic extraction: A fast tool to evaluate char-based remediation of multi-element contaminated mine soils.

    Science.gov (United States)

    Rosende, María; Beesley, Luke; Moreno-Jimenez, Eduardo; Miró, Manuel

    2016-02-01

    An automatic in-vitro bioaccessibility test based upon dynamic microcolumn extraction in a programmable flow setup is proposed as a screening tool to evaluate biochar-based remediation of mine soils contaminated with trace elements, as a compelling alternative to conventional phyto-availability tests. The feasibility of the proposed system was evaluated by extracting the readily bioaccessible pools of As, Pb and Zn in two contaminated mine soils before and after the addition of two biochars (9% (w:w)) of diverse source origin (pine and olive). Bioaccessible fractions under worst-case scenarios were measured using 0.001 mol L(-1) CaCl2 as extractant to mimic plant uptake, with analysis of the extracts by inductively coupled plasma optical emission spectrometry. The t-test for comparison of means revealed efficient metal (mostly Pb and Zn) immobilization by the olive-pruning-based biochar relative to the bare (control) soil at the 0.05 significance level. In-vitro flow-through bioaccessibility tests are compared for the first time with in-vivo phyto-toxicity assays in a microcosm soil study. By assessing seed germination and shoot elongation of Lolium perenne in contaminated soils with and without biochar amendments, the dynamic flow-based bioaccessibility data proved to be in good agreement with the phyto-availability tests. Experimental results indicate that the dynamic extraction method is a viable and economical in-vitro tool in risk assessment explorations to evaluate the feasibility of a given biochar amendment for revegetation and remediation of metal-contaminated soils in a mere 10 min, against 4 days for phyto-toxicity assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Sequence History Update Tool

    Science.gov (United States)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for the Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is seamlessly formatted into a dynamic Web page. This tool replaces a previously tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable saving of time and effort. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  19. Effect of automatic control technologies on emission reduction in small-scale combustion

    Energy Technology Data Exchange (ETDEWEB)

    Ruusunen, M. [Control Engineering Laboratory, University of Oulu (Finland)

    2007-07-01

    Automatic control can be regarded as a primary measure for preventing combustion emissions. In this view, the control technology broadly covers the control methods, sensors and actuators for monitoring and controlling combustion. In addition to direct control of the combustion process, it can also provide tools for condition monitoring and for optimisation of total heat consumption through system integration, thus reducing the need for excess conversion of energy. Automatic control has already shown its potential in small-scale combustion. The potential, but still unrealised, advantages of automatic control at this scale are adaptation to changes in combustion conditions (fuel, environment, device, user) and continuous optimisation of the air/fuel ratio. Modern control technology also covers combustion condition monitoring, diagnostics, and higher-level optimisation of energy consumption with system integration. In theory, these primary measures maximise the overall efficiency, enabling a significant reduction in fuel consumption and thus in total emissions per small-scale combustion unit, specifically at the annual level.

  20. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is: "are there any algorithms that can design evolutionary algorithms automatically?" A more complete formulation of the question is: "can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?" In this paper, a novel evolutionary algorithm based on automatic design of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space, as most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems were conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which shows that algorithms designed automatically by computers can compete with algorithms designed by human beings.
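
    The operator-designing algorithm itself is not spelled out in the abstract; for orientation, here is the standard differential evolution baseline (DE/rand/1/bin) it is compared against, minimizing the sphere function.

```python
# Standard differential evolution (DE/rand/1/bin) baseline on the sphere
# function; hyperparameters are common textbook defaults, not the paper's.
import numpy as np

def differential_evolution(f, dim=10, pop_size=40, F=0.5, CR=0.9,
                           generations=300, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    fitness = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])   # differential mutation
            cross = rng.random(dim) < CR                  # binomial crossover
            cross[rng.integers(dim)] = True               # force at least one gene
            trial = np.where(cross, mutant, pop[i])
            if (ft := f(trial)) <= fitness[i]:            # greedy selection
                pop[i], fitness[i] = trial, ft
    return pop[fitness.argmin()], fitness.min()

best_x, best_f = differential_evolution(lambda x: float(np.sum(x ** 2)))
print(f"sphere minimum found: f={best_f:.2e}")
```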

  1. Automatic scoring of the severity of psoriasis scaling

    DEFF Research Database (Denmark)

    Gomez, David Delgado; Ersbøll, Bjarne Kjær; Carstensen, Jens Michael

    2004-01-01

    In this work, a combined statistical and image analysis method to automatically evaluate the severity of scaling in psoriasis lesions is proposed. The method separates the different regions of the disease in the image and scores the degree of scaling based on the properties of these areas. The pr...... with scores made by doctors. This, together with the fact that the obtained measures are continuous, indicates that the proposed method is a suitable tool to evaluate the lesion and to track the evolution of dermatological diseases....

  2. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility than partly automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose received by the test personnel. (orig.) [de

  3. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    Science.gov (United States)

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. The variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for reproducibility testing (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) proved effective and applicable on a large scale and to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
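
    The statistical-process-control core of such monitoring reduces to computing, for each image quality index, a coefficient of variation over the weekly measurements and flagging departures from tolerance. A minimal sketch follows; the index names and series are invented, and the 5%/10% limits mirror the figures above.

```python
# Sketch of the weekly reproducibility check: compute the coefficient of
# variation (COV) of each image quality index and flag out-of-tolerance ones.
import numpy as np

def cov_percent(values):
    """Coefficient of variation of a series of weekly IQI measurements."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

weekly_iqis = {                       # hypothetical phantom measurements
    "contrast_large_detail": [1.02, 1.00, 0.99, 1.01, 1.03],
    "contrast_0.25mm_detail": [0.41, 0.36, 0.44, 0.39, 0.45],
}

for name, series in weekly_iqis.items():
    cov = cov_percent(series)
    limit = 10.0 if "0.25" in name else 5.0   # looser limit for smallest objects
    status = "OK" if cov <= limit else "OUT OF CONTROL"
    print(f"{name}: COV = {cov:.1f}% -> {status}")
```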

  4. Arraycount, an algorithm for automatic cell counting in microwell arrays

    OpenAIRE

    Kachouie, Nezamoddin N.; Kang, Lifeng; Khademhosseini, Ali

    2009-01-01

    Microscale technologies have emerged as a powerful tool for studying and manipulating biological systems and miniaturizing experiments. However, the lack of software complementing these techniques has made it difficult to apply them for many high-throughput experiments. This work establishes Arraycount, an approach to automatically count cells in microwell arrays. The procedure consists of fluorescent microscope imaging of cells that are seeded in microwells of a microarray system and then an...

  5. Automatic associations with the sensory aspects of smoking : Positive in habitual smokers but negative in non-smokers

    NARCIS (Netherlands)

    Huijding, J; de Jong, PJ

    To test whether pictorial stimuli that focus on the sensory aspects of smoking elicit different automatic affective associations in smokers than in non-smokers, 31 smoking and 33 non-smoking students completed a single-target IAT. Explicit attitudes were assessed using a semantic differential.

  6. A study of an intelligent FME system for SFCR tools

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, H.A., E-mail: Hassan.hassan@opg.com [Ontario Power Generation, Toronto, Ontario (Canada)]

    2008-07-01

    In the nuclear field, accurate identification, tracking and history documentation of every nuclear tool, piece of equipment or component is key to safety, operational and maintenance excellence, and the security of the nuclear reactor. This paper offers a study of the possible development of the present Foreign Material Exclusion (FME) system using an Intelligent Nuclear Tools Identification System (INTIS) that was created and customized for the Single Fuel Channel Replacement (SFCR) tools. The conceptual design of the INTIS is presented, comparing the current and the proposed systems in terms of time, cost and the radiation doses received by employees during SFCR maintenance jobs. A model was created to help better understand and analyze the effects of deploying the INTIS on time, performance, accuracy, received dose and, finally, total cost. The model may also be extended to solve other nuclear application problems. The INTIS is based on Radio Frequency Identification (RFID) smart tags, which are networked with readers and service computers. The system software was designed to communicate with the network to provide coordinate information for any component at any time. It also allows digital signatures for use and/or approval to use the components and automatically updates their history in Data Base Management Systems (DBMS) in terms of the person performing the job and the time period and date of use. This feature, together with information on a part's life span, could be used in planning predictive and preventive maintenance. As a case study, the model was applied to a pilot project for SFCR tools FME. The INTIS automatically records all the tools to be used inside the vault and performs real-time tracking of any misplaced tool. It also automatically performs a continuous check of all tools, sending an alarm if any tool is left inside the vault after the job is done. Finally, a discussion of the results of the

  7. A study of an intelligent FME system for SFCR tools

    International Nuclear Information System (INIS)

    Hassan, H.A.

    2008-01-01

    In the nuclear field, accurate identification, tracking and history documentation of every nuclear tool, piece of equipment or component is key to safety, operational and maintenance excellence, and the security of the nuclear reactor. This paper offers a study of the possible development of the present Foreign Material Exclusion (FME) system using an Intelligent Nuclear Tools Identification System (INTIS) that was created and customized for the Single Fuel Channel Replacement (SFCR) tools. The conceptual design of the INTIS is presented, comparing the current and the proposed systems in terms of time, cost and the radiation doses received by employees during SFCR maintenance jobs. A model was created to help better understand and analyze the effects of deploying the INTIS on time, performance, accuracy, received dose and, finally, total cost. The model may also be extended to solve other nuclear application problems. The INTIS is based on Radio Frequency Identification (RFID) smart tags, which are networked with readers and service computers. The system software was designed to communicate with the network to provide coordinate information for any component at any time. It also allows digital signatures for use and/or approval to use the components and automatically updates their history in Data Base Management Systems (DBMS) in terms of the person performing the job and the time period and date of use. This feature, together with information on a part's life span, could be used in planning predictive and preventive maintenance. As a case study, the model was applied to a pilot project for SFCR tools FME. The INTIS automatically records all the tools to be used inside the vault and performs real-time tracking of any misplaced tool. It also automatically performs a continuous check of all tools, sending an alarm if any tool is left inside the vault after the job is done. Finally, a discussion of the results of the system

  8. Automatic annotation of head velocity and acceleration in Anvil

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2012-01-01

    We describe an automatic face tracker plugin for the ANVIL annotation tool. The face tracker produces data for velocity and for acceleration in two dimensions. We compare the annotations generated by the face tracking algorithm with independently made manual annotations for head movements....... The annotations are a useful supplement to manual annotations and may help human annotators to quickly and reliably determine onset of head movements and to suggest which kind of head movement is taking place....

  9. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  10. Automatic Recognition Method for Optical Measuring Instruments Based on Machine Vision

    Institute of Scientific and Technical Information of China (English)

    SONG Le; LIN Yuchi; HAO Liguo

    2008-01-01

    Based on a comprehensive study of various algorithms, automatic recognition of readings from traditional ocular optical measuring instruments is realized. Taking a universal tool microscope (UTM) lens-view image as an example, a two-layer automatic recognition model for data reading is established after a series of pre-processing algorithms is applied. The model is an optimal combination of a correlation-based template matching method and a concurrent back-propagation (BP) neural network. Multiple complementary feature extraction is used to generate the eigenvectors of the concurrent network. In order to improve fault tolerance, rotation-invariant features based on Zernike moments are extracted from the digit characters, and a four-dimensional group of outline features is also obtained. Moreover, the operating time and reading accuracy can be adjusted dynamically by setting the threshold value. The experimental results indicate that the newly developed algorithm achieves both good recognition precision and working speed, with an average correct-reading rate of 97.23%. The recognition method can obtain the readings of optical measuring instruments rapidly and stably without modifying their original structure, which meets the application requirements.
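
    The first layer of such a model, correlation-based template matching, can be sketched in a few lines. This is an illustrative stand-in using OpenCV rather than the authors' implementation; the dial image and digit-template file names are placeholders.

```python
# Sketch of the template-matching layer: correlate digit templates against the
# instrument's dial image and accept the best match above a threshold.
import cv2

dial = cv2.imread("utm_lens_view.png", cv2.IMREAD_GRAYSCALE)   # hypothetical image
best = None
for digit in range(10):
    template = cv2.imread(f"digit_{digit}.png", cv2.IMREAD_GRAYSCALE)
    scores = cv2.matchTemplate(dial, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    if best is None or max_val > best[0]:
        best = (max_val, digit, max_loc)

score, digit, loc = best
if score > 0.8:                      # threshold trades working speed against accuracy
    print(f"read digit {digit} at {loc} (correlation {score:.2f})")
else:
    print("no confident match; defer to the neural-network layer")
```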

  11. Automatic detection of osteoporotic vertebral fractures in routine thoracic and abdominal MDCT

    Energy Technology Data Exchange (ETDEWEB)

    Baum, Thomas; Dobritz, Martin; Rummeny, Ernst J.; Noel, Peter B. [Technische Universitaet Muenchen, Institut fuer Radiologie, Klinikum rechts der Isar, Muenchen (Germany); Bauer, Jan S. [Technische Universitaet Muenchen, Abteilung fuer Neuroradiologie, Klinikum rechts der Isar, Muenchen (Germany); Klinder, Tobias; Lorenz, Cristian [Philips Research Laboratories, Hamburg (Germany)]

    2014-04-15

    To develop a prototype algorithm for automatic spine segmentation in MDCT images and use it to automatically detect osteoporotic vertebral fractures. Cross-sectional routine thoracic and abdominal MDCT images of 71 patients, including 17 (8 males, 9 females) with 25 osteoporotic vertebral fractures, and longitudinal MDCT images of 9 patients with 18 incidental fractures in the follow-up MDCT were retrospectively selected. The spine segmentation algorithm localised and identified the vertebrae T5-L5. Each vertebra was automatically segmented using corresponding vertebral surface shape models that were adapted to the original images. The anterior, middle, and posterior height of each vertebra was automatically determined, and the anterior-posterior ratio (APR) and middle-posterior ratio (MPR) were computed. As the gold standard, radiologists graded vertebral fractures from T5 to L5 according to the Genant classification in consensus. Using ROC analysis to differentiate vertebrae without versus with prevalent fracture, AUC values of 0.84 and 0.83 were obtained for APR and MPR, respectively (p < 0.001). Longitudinal changes in APR and MPR were significantly different between vertebrae without versus with incidental fracture (ΔAPR: -8.5 % ± 8.6 % versus -1.6 % ± 4.2 %, p = 0.002; ΔMPR: -11.4 % ± 7.7 % versus -1.2 % ± 1.6 %, p < 0.001). This prototype algorithm may support radiologists in reporting currently underdiagnosed osteoporotic vertebral fractures so that appropriate therapy can be initiated. • This spine segmentation algorithm automatically localised, identified, and segmented the vertebrae in MDCT images. (orig.)
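
    The height-ratio screening step is simple to reproduce in outline: compute APR and MPR from per-vertebra heights and score their discriminative power with ROC analysis. The sketch below uses synthetic heights, not the study data.

```python
# Sketch of the fracture-screening step: anterior-posterior (APR) and
# middle-posterior (MPR) height ratios, evaluated by ROC analysis.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# hypothetical vertebral heights in mm: (anterior, middle, posterior)
intact = rng.normal([22, 21, 23], 1.0, (40, 3))
fractured = rng.normal([16, 15, 23], 1.5, (15, 3))   # anterior/middle collapse

heights = np.vstack([intact, fractured])
labels = np.r_[np.zeros(len(intact)), np.ones(len(fractured))]

apr = heights[:, 0] / heights[:, 2]
mpr = heights[:, 1] / heights[:, 2]

# lower ratios indicate fracture, so score with the negative ratio
print("AUC (APR):", round(roc_auc_score(labels, -apr), 2))
print("AUC (MPR):", round(roc_auc_score(labels, -mpr), 2))
```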

  12. Automatic centering device of a tool with regard to an aperture

    International Nuclear Information System (INIS)

    Delevalee, A.

    1993-01-01

    The manipulator arm carries a fixed support and a mobile support that can move perpendicularly to the axis of the tube. The mobile support can be held in any position by a brake arrangement, and an indexing device allows it to be placed in a predetermined initial position. A conical centering device can be placed coaxially with the tool; as it enters the tube, it ensures that the axis of the tool is aligned with the axis of the tube

  13. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.

    1993-01-01

    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software-oriented workshop, SWORD (which stands for 'Software Workshop Oriented towards Research and Development'), designed in the ADA language and including an integrated CAD system and software tools for the automatic generation of simulation software and man-machine interfaces for run-time simulation; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configurations generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  14. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science for performing the kinematic analyses that interest both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software program developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. The videos were then automatically tracked using DVP and a commercially available software program (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one exceeded 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
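
    The tracking loop and the 4-pixel intervention criterion can be sketched with a standard Kanade-Lucas-Tomasi tracker. This is an assumption-laden illustration, not the DVP code: the video file, the initial marker position, and the use of the previous accepted position as a stand-in for the reference coordinate are all placeholders.

```python
# Sketch of a KLT tracking loop that counts how often an operator would have
# to intervene (track lost, or drift beyond 4 pixels from the reference).
import cv2
import numpy as np

MAX_DRIFT_PX = 4.0
cap = cv2.VideoCapture("underwater_exercise.avi")   # hypothetical input video

ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
points = np.array([[[320.0, 240.0]]], dtype=np.float32)  # initial marker position

interventions, total = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, points, None, winSize=(21, 21), maxLevel=3)
    drift = np.linalg.norm(new_points - points)  # here: previous accepted position
    total += 1
    if status[0, 0] == 0 or drift > MAX_DRIFT_PX:
        interventions += 1       # the operator would reposition the cursor here
    points, prev_gray = new_points, gray

if total:
    print(f"manual interventions: {100.0 * interventions / total:.1f}%")
```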

  15. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used.... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling......, an important part of this technique is the attachment of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  16. AUTO-LAY: automatic layout generation for procedure flow diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Forzano, P; Castagna, P [Ansaldo SpA, Genoa (Italy)

    1996-12-31

    Nuclear power plant procedures can be seen from essentially two viewpoints: the process and the information management. From the first point of view, it is important to supply the knowledge needed to solve problems connected with control of the process; from the second, the focus of attention is on knowledge representation, its structure, elicitation and maintenance, and formal quality assurance. These two aspects of procedure representation can be considered and solved separately. In particular, methodological, formal and management issues require long and tedious activities that in most cases constitute a great barrier to procedure development and upgrading. To solve these problems, Ansaldo is developing DIAM, a wide integrated tool for procedure management that supports procedure writing, updating, usage and documentation. One of the most challenging features of DIAM is AUTO-LAY, a CASE sub-tool that, in a completely automatic way, structures parts of or complete flow diagrams. This feature is partially present in some other CASE products, which, however, do not allow complex graph handling or isomorphism between screen and paper representations. AUTO-LAY has the unique capability to draw graphs of any complexity, to section them into pages, and to automatically compose a document. This has been recognized in the literature as the most important second-generation CASE improvement. (author). 5 refs., 9 figs.

  17. AUTO-LAY: automatic layout generation for procedure flow diagrams

    International Nuclear Information System (INIS)

    Forzano, P.; Castagna, P.

    1995-01-01

    Nuclear power plant procedures can be seen from essentially two viewpoints: the process and the information management. From the first point of view, it is important to supply the knowledge needed to solve problems connected with control of the process; from the second, the focus of attention is on knowledge representation, its structure, elicitation and maintenance, and formal quality assurance. These two aspects of procedure representation can be considered and solved separately. In particular, methodological, formal and management issues require long and tedious activities that in most cases constitute a great barrier to procedure development and upgrading. To solve these problems, Ansaldo is developing DIAM, a wide integrated tool for procedure management that supports procedure writing, updating, usage and documentation. One of the most challenging features of DIAM is AUTO-LAY, a CASE sub-tool that, in a completely automatic way, structures parts of or complete flow diagrams. This feature is partially present in some other CASE products, which, however, do not allow complex graph handling or isomorphism between screen and paper representations. AUTO-LAY has the unique capability to draw graphs of any complexity, to section them into pages, and to automatically compose a document. This has been recognized in the literature as the most important second-generation CASE improvement. (author). 5 refs., 9 figs

  18. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

    Since the 1970s, advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and are still the most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-to-peak amplitude and duration. Wavelet parameters are very appropriate for AESD but need to be combined with other parameters to achieve the desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.
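
    A minimal example of the expert-parameter approach, flagging peaks by amplitude, duration, and a crude sharpness measure, might look as follows; the synthetic signal and all thresholds are illustrative assumptions, not clinically validated values.

```python
# Toy parameter-based spike detector: amplitude, duration, and sharpness
# thresholds applied to peaks of a synthetic EEG trace.
import numpy as np
from scipy.signal import find_peaks

FS = 256                                   # sampling rate (Hz)
t = np.arange(0, 10, 1 / FS)
eeg = 20 * np.random.default_rng(2).standard_normal(t.size)   # background (uV)
eeg[1280:1292] += np.hanning(12) * 150     # one injected "spike", ~47 ms wide

# amplitude and duration criteria; widths are returned in samples
peaks, props = find_peaks(eeg, height=100, width=(5, 18))  # roughly 20-70 ms

for p, w in zip(peaks, props["widths"]):
    sharpness = eeg[p] / w                 # crude amplitude/duration measure
    if sharpness > 8:
        print(f"spike at {t[p]:.2f} s, width {1000 * w / FS:.0f} ms")
```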

  19. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-(18F)fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-(18F)fluorodopa. (author)

  20. Probing stem cell differentiation using atomic force microscopy

    International Nuclear Information System (INIS)

    Liang, Xiaobin; Shi, Xuetao; Ostrovidov, Serge; Wu, Hongkai; Nakajima, Ken

    2016-01-01

    Highlights: • Atomic force microscopy (AFM) was developed to probe stem cell differentiation. • The mechanical properties of stem cells and their ECMs can be used to clearly distinguish specific stem cell-differentiated lineages. • AFM is a facile and useful tool for monitoring stem cell differentiation in a non-invasive manner. Abstract: A real-time method using atomic force microscopy (AFM) was developed to probe stem cell differentiation by measuring the mechanical properties of cells and the extracellular matrix (ECM). The mechanical properties of stem cells and their ECMs can be used to clearly distinguish specific stem cell-differentiated lineages. It is clear that AFM is a facile and useful tool for monitoring the differentiation of stem cells in a non-invasive manner.

  1. Probing stem cell differentiation using atomic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Xiaobin [Graduate School of Science and Engineering, Tokyo Institute of Technology, Ookayama 2-12-1, Meguro-ku, Tokyo 152-8550 (Japan); Shi, Xuetao, E-mail: mrshixuetao@gmail.com [School of Materials Science and Engineering, South China University of Technology, Guangzhou 510641 (China); Ostrovidov, Serge [WPI-Advanced Institute for Materials Research, Tohoku University, Sendai (Japan); Wu, Hongkai, E-mail: chhkwu@ust.hk [Department of Chemistry & Division of Biomedical Engineering, The Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong (China); Nakajima, Ken [Graduate School of Science and Engineering, Tokyo Institute of Technology, Ookayama 2-12-1, Meguro-ku, Tokyo 152-8550 (Japan)]

    2016-03-15

    Highlights: • Atomic force microscopy (AFM) was developed to probe stem cell differentiation. • The mechanical properties of stem cells and their ECMs can be used to clearly distinguish specific stem cell-differentiated lineages. • AFM is a facile and useful tool for monitoring stem cell differentiation in a non-invasive manner. Abstract: A real-time method using atomic force microscopy (AFM) was developed to probe stem cell differentiation by measuring the mechanical properties of cells and the extracellular matrix (ECM). The mechanical properties of stem cells and their ECMs can be used to clearly distinguish specific stem cell-differentiated lineages. It is clear that AFM is a facile and useful tool for monitoring the differentiation of stem cells in a non-invasive manner.

  2. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and ensure that equipment production efficiency is maximized. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness verified; it obtained the parameter values used to measure equipment performance, together with advice for improvement.
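
    A finite-state-machine model of tool efficiency can be sketched as a set of allowed state transitions plus time accounting. The sketch below uses the six equipment states commonly associated with the SEMI E10 standard and an invented transition log; the actual TEA transition rules are not reproduced here.

```python
# Toy finite-state-machine view of tool efficiency: validate transitions,
# accumulate time per state, and derive a utilisation figure.
STATES = {"PRODUCTIVE", "STANDBY", "ENGINEERING",
          "SCHEDULED_DOWN", "UNSCHEDULED_DOWN", "NON_SCHEDULED"}

ALLOWED = {  # example transition rules; a real model would encode the standard
    "STANDBY": {"PRODUCTIVE", "SCHEDULED_DOWN", "ENGINEERING"},
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
    "UNSCHEDULED_DOWN": {"STANDBY"},
    "SCHEDULED_DOWN": {"STANDBY"},
    "ENGINEERING": {"STANDBY"},
}

log = [("STANDBY", 0), ("PRODUCTIVE", 10), ("UNSCHEDULED_DOWN", 470),
       ("STANDBY", 530), ("PRODUCTIVE", 555), ("STANDBY", 1380)]  # (state, minute)

durations = dict.fromkeys(STATES, 0)
for (state, start), (nxt, end) in zip(log, log[1:]):
    if nxt not in ALLOWED.get(state, STATES):
        raise ValueError(f"illegal transition {state} -> {nxt}")
    durations[state] += end - start

total = sum(durations.values())
print("utilisation: %.1f%%" % (100 * durations["PRODUCTIVE"] / total))
```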

  3. Differential equations a dynamical systems approach ordinary differential equations

    CERN Document Server

    Hubbard, John H

    1991-01-01

    This is a corrected third printing of the first part of the text Differential Equations: A Dynamical Systems Approach written by John Hubbard and Beverly West. The authors' main emphasis in this book is on ordinary differential equations. The book is most appropriate for upper-level undergraduate and graduate students in the fields of mathematics, engineering, and applied mathematics, as well as the life sciences, physics and economics. Traditional courses on differential equations focus on techniques leading to solutions. Yet most differential equations do not admit solutions which can be written in elementary terms. The authors have taken the view that a differential equation defines functions; the object of the theory is to understand the behavior of these functions. The tools the authors use include qualitative and numerical methods besides the traditional analytic methods. The companion software, MacMath, is designed to bring these notions to life.

  4. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to be able to obtain high-quality species distribution data easily and efficiently for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
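
    The two steps the abstract describes, harvesting occurrences and sanity-checking coordinates, can be sketched directly against GBIF's public occurrence search API. This is not the SDMdata code itself; the species name and the filtering rules are illustrative.

```python
# Sketch: fetch occurrence records from GBIF and drop obviously bad coordinates.
import requests

def fetch_occurrences(species, limit=100):
    resp = requests.get(
        "https://api.gbif.org/v1/occurrence/search",
        params={"scientificName": species, "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]

def coordinates_ok(rec):
    lat, lon = rec.get("decimalLatitude"), rec.get("decimalLongitude")
    if lat is None or lon is None:
        return False
    # basic range check; (0, 0) is a common placeholder for missing data
    return -90 <= lat <= 90 and -180 <= lon <= 180 and (lat, lon) != (0, 0)

records = fetch_occurrences("Panthera onca")
clean = [r for r in records if coordinates_ok(r)]
print(f"kept {len(clean)} of {len(records)} records")
```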

  5. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e., automatically controlling the virtual...

  6. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system and software product and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool, using both the generic robot description and the control system.

  7. Automatically rating trainee skill at a pediatric laparoscopic suturing task.

    Science.gov (United States)

    Oquendo, Yousi A; Riddle, Elijah W; Hiller, Dennis; Blinman, Thane A; Kuchenbecker, Katherine J

    2018-04-01

    Minimally invasive surgeons must acquire complex technical skills while minimizing patient risk, a challenge that is magnified in pediatric surgery. Trainees need realistic practice with frequent detailed feedback, but human grading is tedious and subjective. We aim to validate a novel motion-tracking system and algorithms that automatically evaluate trainee performance of a pediatric laparoscopic suturing task. Subjects (n = 32) ranging from medical students to fellows performed two trials of intracorporeal suturing in a custom pediatric laparoscopic box trainer after watching a video of ideal performance. The motions of the tools and endoscope were recorded over time using a magnetic sensing system, and both tool grip angles were recorded using handle-mounted flex sensors. An expert rated the 63 trial videos on five domains from the Objective Structured Assessment of Technical Skill (OSATS), yielding summed scores from 5 to 20. Motion data from each trial were processed to calculate 280 features. We used regularized least squares regression to identify the most predictive features from different subsets of the motion data and then built six regression tree models that predict summed OSATS score. Model accuracy was evaluated via leave-one-subject-out cross-validation. The model that used all sensor data streams performed best, achieving 71% accuracy at predicting summed scores within 2 points, 89% accuracy within 4, and a correlation of 0.85 with human ratings. 59% of the rounded average OSATS score predictions were perfect, and 100% were within 1 point. This model employed 87 features, including none based on completion time, 77 from tool tip motion, 3 from tool tip visibility, and 7 from grip angle. Our novel hardware and software automatically rated previously unseen trials with summed OSATS scores that closely match human expert ratings. Such a system facilitates more feedback-intensive surgical training and may yield insights into the fundamental
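
    The validation scheme, regressing summed OSATS scores on motion features and evaluating with leave-one-subject-out cross-validation, can be sketched as follows. The features, subject assignments, and scores are synthetic, and a single regression tree stands in for the paper's regularised feature selection plus regression tree models.

```python
# Sketch: predict summed OSATS scores (5-20) from motion features with
# leave-one-subject-out cross-validation.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict

rng = np.random.default_rng(3)
n_trials, n_features = 63, 20
X = rng.standard_normal((n_trials, n_features))        # motion features per trial
y = np.clip(10 + X[:, 0] * 3 + rng.normal(0, 1, n_trials), 5, 20)  # OSATS sums
groups = np.repeat(np.arange(32), 2)[:n_trials]        # subject ID per trial

model = DecisionTreeRegressor(max_depth=4, random_state=0)
pred = cross_val_predict(model, X, y, groups=groups, cv=LeaveOneGroupOut())

within2 = np.mean(np.abs(pred - y) <= 2)
print(f"predictions within 2 points: {100 * within2:.0f}%")
print(f"correlation with ratings: {np.corrcoef(pred, y)[0, 1]:.2f}")
```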

  8. Decision support tool for early differential diagnosis of acute lung injury and cardiogenic pulmonary edema in medical critically ill patients.

    Science.gov (United States)

    Schmickl, Christopher N; Shahjehan, Khurram; Li, Guangxi; Dhokarh, Rajanigandha; Kashyap, Rahul; Janish, Christopher; Alsara, Anas; Jaffe, Allan S; Hubmayr, Rolf D; Gajic, Ognjen

    2012-01-01

    At the onset of acute hypoxic respiratory failure, critically ill patients with acute lung injury (ALI) may be difficult to distinguish from those with cardiogenic pulmonary edema (CPE). No single clinical parameter provides satisfying prediction. We hypothesized that a combination of them would facilitate early differential diagnosis. In a population-based retrospective development cohort, validated electronic surveillance identified critically ill adult patients with acute pulmonary edema. Recursive partitioning and logistic regression were used to develop a decision support tool based on routine clinical information to differentiate ALI from CPE. Performance of the score was validated in an independent cohort of referral patients. Blinded post hoc expert review served as the gold standard. Of 332 patients in the development cohort, expert reviewers (κ, 0.86) classified 156 as having ALI and 176 as having CPE. The validation cohort had 161 patients (ALI = 113, CPE = 48). The score was based on risk factors for ALI and CPE, age, alcohol abuse, chemotherapy, and the peripheral oxygen saturation/FiO2 ratio. It demonstrated good discrimination (area under curve [AUC] = 0.81; 95% CI, 0.77-0.86) and calibration (Hosmer-Lemeshow [HL] P = .16). Similar performance was obtained in the validation cohort (AUC = 0.80; 95% CI, 0.72-0.88; HL P = .13). A simple decision support tool accurately classifies acute pulmonary edema, reserving advanced testing for the subset of patients in whom a satisfying prediction cannot be made. This novel tool may facilitate early inclusion of patients with ALI and CPE into research studies as well as improve and rationalize clinical management and resource use.
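
    The form of such a score, a logistic model over the routine variables named above, can be illustrated on synthetic data. The coefficients and cohort below are invented for demonstration and have no clinical validity.

```python
# Sketch: logistic model over age, alcohol abuse, chemotherapy, and the
# SpO2/FiO2 ratio, classifying ALI (1) versus CPE (0) on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 300
age = rng.normal(65, 12, n)
alcohol = rng.integers(0, 2, n)
chemo = rng.integers(0, 2, n)
sf_ratio = rng.normal(220, 70, n)            # SpO2/FiO2

# invented ground truth: ALI more likely with risk factors and lower S/F
logit = (-2 + 1.2 * alcohol + 1.0 * chemo
         - 0.01 * (sf_ratio - 220) - 0.02 * (age - 65))
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, alcohol, chemo, sf_ratio])
model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print("AUC on training data:", round(auc, 2))
```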

  9. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA

  10. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an

  11. Semi-Automatic Construction of Skeleton Concept Maps from Case Judgments

    OpenAIRE

    Boer, A.; Sijtsma, B.; Winkels, R.; Lettieri, N.

    2014-01-01

    This paper proposes an approach to generating Skeleton Conceptual Maps (SCM) semi-automatically from legal case documents provided by the United Kingdom’s Supreme Court. SCM are incomplete knowledge representations for the purpose of scaffolding learning. The proposed system intends to provide students with a tool to pre-process text and to extract knowledge from documents in a time-saving manner. A combination of natural language processing methods and proposition extraction algorithms are u...

  12. MIAQuant, a novel system for automatic segmentation, measurement, and localization comparison of different biomarkers from serialized histological slices.

    Science.gov (United States)

    Casiraghi, Elena; Cossa, Mara; Huber, Veronica; Rivoltini, Licia; Tozzi, Matteo; Villa, Antonello; Vergani, Barbara

    2017-11-02

    In clinical practice, automatic image analysis methods that quickly quantify histological results by objective and replicable means are becoming more and more necessary and widespread. Although several commercial software products are available for this task, they offer very little flexibility and are provided as black boxes without modifiable source code. To overcome these problems, we employed the commonly used MATLAB platform to develop an automatic method, MIAQuant, for the analysis of histochemical and immunohistochemical images stained with various methods and acquired by different tools. It automatically extracts and quantifies markers characterized by various colors and shapes; furthermore, it aligns contiguous tissue slices stained with different markers and overlaps them in differing colors for visual comparison of their localization. Application of MIAQuant in clinical research fields, such as oncology and cardiovascular disease studies, has proven its efficacy, robustness and flexibility with respect to various problems; we highlight that the flexibility of MIAQuant makes it an important tool for basic research, where needs are constantly changing. The MIAQuant software and its user manual are freely available for clinical studies, pathological research, and diagnosis.
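
    The core quantification step, extracting a marker by its colour and measuring its area fraction, can be sketched as follows. MIAQuant itself is MATLAB code with more elaborate colour and shape handling; this Python stand-in assumes an HSV hue window for a brown stain and a placeholder image file.

```python
# Sketch: colour-threshold a stain in HSV space and report the stained fraction.
import cv2
import numpy as np

img = cv2.imread("slice_marker.png")               # hypothetical IHC image (BGR)
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# example window for a brown DAB-like stain; tune per marker
mask = cv2.inRange(hsv, (5, 60, 40), (25, 255, 220))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

stained_fraction = mask.mean() / 255.0
print(f"marker-positive area: {100 * stained_fraction:.1f}%")
```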

  13. Automatic measurement of axial length of human eye using three-dimensional magnetic resonance imaging

    International Nuclear Information System (INIS)

    Watanabe, Masaki; Kiryu, Tohru

    2011-01-01

    The measurement of axial length and the evaluation of the three-dimensional (3D) form of the eye are essential to evaluating the mechanism of myopia progression. We propose a method for automatic measurement of axial length, including adjustment of the pulse sequence for short-term scans to suppress the influence of eye blinks, using magnetic resonance imaging (MRI), which acquires 3D images noninvasively. Acquiring T2-weighted images with a 3.0-tesla MRI device and an eight-channel phased-array head coil, we extracted left and right eyeball images and reconstructed 3D volumes. The surface coordinates were calculated from the 3D volumes, the ellipsoid model coordinates were fitted to the surface coordinates, and the axial length was measured automatically. Measuring twenty-one subjects, we compared the automatically measured values of axial length with manually measured ones, and confirmed significant elongation of the axial length in myopia compared with emmetropia. Furthermore, there were no significant differences (P<0.05) between the means of the automatic measurements and the manual ones. Accordingly, the automatic measurement of axial length could be a tool for elucidating the mechanism of myopia progression, suitable for evaluating the axial length easily and noninvasively. (author)
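
    The geometric core of the method, fitting an ellipsoid to extracted surface coordinates and reading the axial length off the fitted axis, reduces to a linear least-squares problem for an axis-aligned ellipsoid. The sketch below uses synthetic surface points and assumes the coordinates are centred on the eyeball with the optic axis along z.

```python
# Sketch: fit (x/a)^2 + (y/b)^2 + (z/c)^2 = 1 to surface points by linear
# least squares and report the axial length as 2c.
import numpy as np

rng = np.random.default_rng(5)

# synthetic surface points of an eye with semi-axes 11.5, 11.5, 12.3 mm
u, v = rng.uniform(0, np.pi, 500), rng.uniform(0, 2 * np.pi, 500)
pts = np.column_stack([11.5 * np.sin(u) * np.cos(v),
                       11.5 * np.sin(u) * np.sin(v),
                       12.3 * np.cos(u)])
pts += rng.normal(0, 0.05, pts.shape)          # surface-extraction noise

# solve A p = 1 with p = (1/a^2, 1/b^2, 1/c^2)
A = pts ** 2
p, *_ = np.linalg.lstsq(A, np.ones(len(pts)), rcond=None)
a, b, c = 1 / np.sqrt(p)

print(f"axial length = {2 * c:.2f} mm")        # expect about 24.6 mm
```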

  14. Some operational tools for solving fractional and higher integer order differential equations: A survey on their mutual relations

    Science.gov (United States)

    Kiryakova, Virginia S.

    2012-11-01

    The Laplace Transform (LT) serves as a basis of the Operational Calculus (OC), widely explored by engineers and applied scientists in solving mathematical models for their practical needs. This transform is closely related to the exponential and trigonometric functions (exp, cos, sin) and to the classical differentiation and integration operators, reducing them to simple algebraic operations. Thus, the classical LT and the OC give a useful tool for handling differential equations and systems with constant coefficients. Several generalizations of the LT have been introduced to allow solving, in a similar way, differential equations with variable coefficients and of higher integer orders, as well as of fractional (arbitrary non-integer) orders. Note that fractional-order mathematical models are now widely used to better describe various systems and phenomena of the real world. This paper briefly surveys some of our results on classes of such integral transforms, which can be obtained from the LT by means of "transmutations", that is, operators of the generalized fractional calculus (GFC). On the list of these Laplace-type integral transforms, we consider the Borel-Dzrbashjan, Meijer, Krätzel, Obrechkoff, and generalized Obrechkoff (multi-index Borel-Dzrbashjan) transforms, etc. All of them are G- and H-integral transforms of convolutional type, having as kernels Meijer's G- or Fox's H-functions. Besides, some special functions (also being G- and H-functions), among them the generalized Bessel-type and Mittag-Leffler (M-L) type functions, generate Gel'fond-Leontiev (G-L) operators of generalized differentiation and integration, which happen to be also operators of the GFC. Our integral transforms have operational properties analogous to those of the LT: they algebrize the G-L generalized integrations and differentiations, and thus can serve for solving wide classes of differential equations with variable coefficients of arbitrary (including non-integer) order.
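
    For reference, the basic operational rule that "algebrizes" differentiation, together with its commonly used fractional analogue (stated here for the Caputo derivative, one standard choice among the fractional operators such surveys generalize), reads:

```latex
% Classical rule: the LT turns differentiation into multiplication by s
\mathcal{L}\{f'\}(s) = s\,F(s) - f(0),
\qquad F(s) = \int_0^\infty e^{-st} f(t)\, dt .

% Fractional analogue for the Caputo derivative of order \alpha, n-1 < \alpha \le n
\mathcal{L}\{{}^{C}D^{\alpha} f\}(s)
  = s^{\alpha} F(s) - \sum_{k=0}^{n-1} s^{\alpha-1-k} f^{(k)}(0) .
```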

  15. Automatic Road Pavement Assessment with Image Processing: Review and Comparison

    Directory of Open Access Journals (Sweden)

    Sylvie Chambon

    2011-01-01

    Full Text Available In the field of noninvasive sensing techniques for civil infrastructure monitoring, this paper addresses the problem of crack detection on the surface of French national roads by automatic analysis of optical images. The first contribution is a state of the art of the image-processing tools applied to civil engineering. The second contribution concerns fine-defect detection in pavement surfaces; the approach is based on multi-scale extraction and Markovian segmentation. Third, an evaluation and comparison protocol designed for this difficult task, road pavement crack detection, is introduced. Finally, the proposed method is validated, analysed, and compared to a detection approach based on morphological tools.

  16. X-ray conditions and response characteristics of automatic dose control in cinematography

    International Nuclear Information System (INIS)

    Arai, Hiroaki

    1997-01-01

    X-ray characteristics, including subject thickness (copper plate), tube voltage, tube current and irradiation time, were measured under stable conditions with an automatic dose control x-ray generator for cineangiography. Regardless of subject thickness, it is possible to decrease the energy input to the x-ray tube per frame. The automatic control response was measured after rapid fluctuations in subject thickness. Two inverter-type x-ray generators with different automatic control units were studied. The older control unit adjusts exposure dose through tube voltage and tube current, while the newer one adjusts exposure dose through tube voltage, tube current and irradiation time. The maximum rate of change in tube voltage is greater with the newer control unit. In addition, the actual tube current response of the newer control unit when increasing the nominal value is faster than that of the older one. In the new control unit, irradiation is cut off for each pulse by means of a signal indicating that the exposure has reached the proper value. Thus, given the same change in subject thickness, the newer control unit resumed stability faster than the older one. (author)

  17. X-ray conditions and response characteristics of automatic dose control in cinematography

    Energy Technology Data Exchange (ETDEWEB)

    Arai, Hiroaki [Cardiovascular Institute Hospital, Tokyo (Japan)]

    1997-11-01

    X-ray characteristics, including subject thickness (copper plate), tube voltage, tube current and irradiation time, were measured under stable conditions with an automatic dose control x-ray generator for cineangiography. Regardless of subject thickness, it is possible to decrease the energy input to the x-ray tube per frame. The automatic control response was measured after rapid fluctuations in subject thickness. Two inverter-type x-ray generators with different automatic control units were studied. The older control unit adjusts exposure dose through tube voltage and tube current, while the newer one adjusts exposure dose through tube voltage, tube current and irradiation time. The maximum rate of change in tube voltage is greater with the newer control unit. In addition, the actual tube current response of the newer control unit when increasing the nominal value is faster than that of the older one. In the new control unit, irradiation is cut off for each pulse by means of a signal indicating that the exposure has reached the proper value. Thus, given the same change in subject thickness, the newer control unit resumed stability faster than the older one. (author)

  18. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  19. Automatic analog IC sizing and optimization constrained with PVT corners and layout effects

    CERN Document Server

    Lourenço, Nuno; Horta, Nuno

    2017-01-01

    This book introduces readers to a variety of tools for automatic analog integrated circuit (IC) sizing and optimization. The authors provide a historical perspective on the early methods proposed to tackle automatic analog circuit sizing, with emphasis on the methodologies to size and optimize the circuit, and on the methodologies to estimate the circuit’s performance. The discussion also includes robust circuit design and optimization and the most recent advances in layout-aware analog sizing approaches. The authors describe a methodology for an automatic flow for analog IC design, including details of the inputs and interfaces, multi-objective optimization techniques, and the enhancements made in the base implementation by using machine learning techniques. The Gradient model is discussed in detail, along with the methods to include layout effects in the circuit sizing. The concepts and algorithms of all the modules are thoroughly described, enabling readers to reproduce the methodologies, improve the qual...

  20. DEEP--a tool for differential expression effector prediction.

    Science.gov (United States)

    Degenhardt, Jost; Haubrock, Martin; Dönitz, Jürgen; Wingender, Edgar; Crass, Torsten

    2007-07-01

    High-throughput methods for measuring transcript abundance, like SAGE or microarrays, are widely used for determining differences in gene expression between different tissue types, dignities (normal/malignant) or time points. Further analysis of such data frequently aims at the identification of gene interaction networks that form the causal basis for the observed properties of the systems under examination. To this end, it is usually not sufficient to rely on the measured gene expression levels alone; rather, additional biological knowledge has to be taken into account in order to generate useful hypotheses about the molecular mechanism leading to the realization of a certain phenotype. We present a method that combines gene expression data with biological expert knowledge on molecular interaction networks, as described by the TRANSPATH database on signal transduction, to predict additional--and not necessarily differentially expressed--genes or gene products which might participate in processes specific for either of the examined tissues or conditions. In a first step, significance values for over-expression in tissue/condition A or B are assigned to all genes in the expression data set. Genes with a significance value exceeding a certain threshold are used as starting points for the reconstruction of a graph with signaling components as nodes and signaling events as edges. In a subsequent graph traversal process, again starting from the previously identified differentially expressed genes, all encountered nodes 'inherit' all their starting nodes' significance values. In a final step, the graph is visualized, the nodes being colored according to a weighted average of their inherited significance values. Each node's, or sub-network's, predominant color, ranging from green (significant for tissue/condition A) over yellow (not significant for either tissue/condition) to red (significant for tissue/condition B), thus gives an immediate visual clue on which molecules--differentially
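
    The propagation step, in which every node reached from a differentially expressed seed inherits that seed's significance value and nodes are coloured by an average of what they inherit, can be sketched on a toy network. The genes, edges, and scores below are invented; DEEP itself traverses the TRANSPATH network.

```python
# Toy sketch of significance propagation over a signalling graph, with nodes
# coloured green (condition A), yellow (neither), or red (condition B).
from collections import defaultdict, deque

edges = {                      # directed signalling events (upstream -> downstream)
    "RTK": ["RAS"], "RAS": ["RAF"], "RAF": ["MEK"], "MEK": ["ERK"],
    "ERK": ["TF1"], "CYTOKINE_R": ["STAT"], "STAT": ["TF1"],
}

# seeds: signed significance (>0 favours condition B, <0 condition A)
seeds = {"RTK": +0.9, "CYTOKINE_R": -0.7}

inherited = defaultdict(list)
for seed, score in seeds.items():
    queue, seen = deque([seed]), set()
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        inherited[node].append(score)      # node inherits the seed's value
        queue.extend(edges.get(node, []))

for node, scores in sorted(inherited.items()):
    mean = sum(scores) / len(scores)
    colour = "red" if mean > 0.2 else "green" if mean < -0.2 else "yellow"
    print(f"{node:12s} {mean:+.2f} -> {colour}")
```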

  1. Automatic segmentation of human cortical layer-complexes and architectural areas using diffusion MRI and its validation

    Directory of Open Access Journals (Sweden)

    Matteo Bastiani

    2016-11-01

    Full Text Available Recently, several magnetic resonance imaging contrast mechanisms have been shown to distinguish cortical substructure corresponding to selected cortical layers. Here, we investigate cortical layer and area differentiation by automated unsupervised clustering of high-resolution diffusion MRI data. Several groups of adjacent layers could be distinguished in human primary motor and premotor cortex. We then used the signature of diffusion MRI signals along cortical depth as a criterion to detect area boundaries and find borders at which the signature changes abruptly. We validate our clustering results by histological analysis of the same tissue. These results confirm earlier studies showing that diffusion MRI can probe layer-specific intracortical fiber organization and, moreover, suggest that it contains enough information to automatically classify architecturally distinct cortical areas. We discuss the strengths and weaknesses of the automatic clustering approach and its appeal for MR-based cortical histology.

  2. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: The camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE= 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE= 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  3. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  4. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of gz = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Full Text Available Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research interest has surged in automatically creating procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  6. ODMSummary: A Tool for Automatic Structured Comparison of Multiple Medical Forms Based on Semantic Annotation with the Unified Medical Language System.

    Science.gov (United States)

    Storck, Michael; Krumm, Rainer; Dugas, Martin

    2016-01-01

    Medical documentation is applied in various settings including patient care and clinical research. Since medical documentation procedures are heterogeneous and continue to evolve, secondary use of medical data is complicated. Development of medical forms, merging of data from different sources and meta-analyses of different data sets are currently predominantly manual processes and therefore difficult and cumbersome. Available applications to automate these processes are limited. In particular, tools to compare multiple documentation forms are missing. The objective of this work is to design, implement and evaluate the new system ODMSummary for comparison of multiple forms with a high number of semantically annotated data elements and a high level of usability. System requirements are the capability to summarize and compare a set of forms, to estimate the documentation effort, to track changes in different versions of forms, and to find comparable items in different forms. Forms are provided in Operational Data Model format with semantic annotations from the Unified Medical Language System. 12 medical experts were invited to participate in a 3-phase evaluation of the tool regarding usability. ODMSummary (available at https://odmtoolbox.uni-muenster.de/summary/summary.html) provides a structured overview of multiple forms and their documentation fields. This comparison enables medical experts to assess multiple forms or whole datasets for secondary use. System usability was optimized based on expert feedback. The evaluation demonstrates that feedback from domain experts is needed to identify usability issues. In conclusion, this work shows that automatic comparison of multiple forms is feasible and the results are usable for medical experts.
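    The core matching idea, finding comparable items across forms via shared semantic annotations, can be sketched as follows. This is an illustrative re-expression, not ODMSummary's code; the item names and UMLS concept codes (CUIs) below are invented.

        def comparable_items(form_a, form_b):
            """Return pairs of item names whose semantic annotation (CUI) matches."""
            by_cui = {}
            for item, cui in form_b.items():
                by_cui.setdefault(cui, []).append(item)
            return [(item, match)
                    for item, cui in form_a.items()
                    for match in by_cui.get(cui, [])]

        form_a = {"Body weight": "C0005910", "Heart rate": "C0018810"}
        form_b = {"Weight": "C0005910", "Pulse": "C0018810", "Age": "C0001779"}
        print(comparable_items(form_a, form_b))
        # -> [('Body weight', 'Weight'), ('Heart rate', 'Pulse')]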

  7. Differential equations for dummies

    CERN Document Server

    Holzner, Steven

    2008-01-01

    The fun and easy way to understand and solve complex equations Many of the fundamental laws of physics, chemistry, biology, and economics can be formulated as differential equations. This plain-English guide explores the many applications of this mathematical tool and shows how differential equations can help us understand the world around us. Differential Equations For Dummies is the perfect companion for a college differential equations course and is an ideal supplemental resource for other calculus classes as well as science and engineering courses. It offers step-by-step techniques, practical tips, numerous exercises, and clear, concise examples to help readers improve their differential equation-solving skills and boost their test scores.

  8. Automatic measurement system for congenital hip dislocation using a computed radiography

    International Nuclear Information System (INIS)

    Komori, M.; Minato, K.; Hirakawa, A.; Kuwahara, M.

    1988-01-01

    The acetabular angle, a diagnostic parameter of congenital hip dislocation, has conventionally been measured manually on X-ray film. Using digital images provided directly by a computed radiography system, an automatic measurement system was developed for this parameter. The measurement process completed within a reasonable time and was sufficiently accurate. The system was combined with an image database, so that it can serve as a measurement tool within a PACS
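    For reference, the underlying geometry is simple: the acetabular angle is the angle between the Hilgenreiner line and the acetabular roof line. A minimal sketch follows; the landmark pixel coordinates are hypothetical, not from the paper.

        import math

        def angle_deg(p1, p2, q1, q2):
            """Angle in degrees between line p1-p2 and line q1-q2."""
            a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
            a2 = math.atan2(q2[1] - q1[1], q2[0] - q1[0])
            return abs(math.degrees(a2 - a1)) % 180

        # Hilgenreiner line through both triradiate cartilages (x, y in pixels):
        h_left, h_right = (120, 300), (420, 300)
        # Acetabular roof: from triradiate cartilage to lateral acetabular edge.
        roof_med, roof_lat = (420, 300), (470, 260)
        print(f"Acetabular angle: {angle_deg(h_left, h_right, roof_med, roof_lat):.1f} deg")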

  9. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  10. Species and tissues specific differentiation of processed animal proteins in aquafeeds using proteomics tools.

    Science.gov (United States)

    Rasinger, J D; Marbaix, H; Dieu, M; Fumière, O; Mauro, S; Palmblad, M; Raes, M; Berntssen, M H G

    2016-09-16

    The rapidly growing aquaculture industry drives the search for sustainable protein sources in fish feed. In the European Union (EU), non-ruminant processed animal proteins (PAP) have again been permitted in aquafeeds since 2013. To ensure that commercial fish feeds do not contain PAP from prohibited species, EU reference methods were established. However, due to the heterogeneous and complex nature of PAP, complementary methods are required to guarantee the safe use of this fish feed ingredient. In addition, there is a need for tissue-specific PAP detection to identify the sources (i.e. bovine carcass, blood, or meat) of illegal PAP use. In the present study, we investigated and compared different protein extraction, solubilisation and digestion protocols on different proteomics platforms for the detection and differentiation of prohibited PAP. In addition, we assessed whether tissue-specific PAP detection was feasible using proteomics tools. All work was performed independently in two different laboratories. We found that, irrespective of sample preparation, gel-based proteomics tools were inappropriate when working with PAP. Gel-free shotgun proteomics approaches in combination with direct spectral comparison were able to provide quality species- and tissue-specific data to complement and refine current methods of PAP detection and identification. To guarantee the safe use of processed animal protein (PAP) in aquafeeds, efficient PAP detection and monitoring tools are required. The present study investigated and compared various proteomics workflows and shows that the application of shotgun proteomics in combination with direct comparison of spectral libraries provides the desired species- and tissue-specific classification of this heat-sterilized and pressure-treated (≥133 °C at 3 bar for 20 min) protein feed ingredient. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
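    A schematic of the length-class dispatch described above (not the authors' code): intervals are routed to the trapezoidal rule, the Simpson rule, or a high-degree Gauss-Legendre sum depending on their length. The thresholds eps1 and eps2 are invented for illustration.

        import numpy as np
        from numpy.polynomial.legendre import leggauss

        def quad(f, a, b, eps1=1e-8, eps2=1e-3):
            h = b - a
            if h < eps1:                      # microscopic: trapezoidal rule
                return 0.5 * h * (f(a) + f(b))
            if h < eps2:                      # mesoscopic: Simpson rule
                return h / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))
            # macroscopic: Gauss-Legendre sum of high algebraic degree of precision
            x, w = leggauss(16)
            xm, xr = 0.5 * (a + b), 0.5 * h
            return xr * np.sum(w * f(xm + xr * x))

        print(quad(np.sin, 0.0, np.pi))       # ~2.0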

  12. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of cell mechanisms using different technologies, in order to explain the relationships among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated to be an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, the resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open-source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not supported in μ-CS. Micro-Analyzer is provided as a standalone Java tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  13. The fierce urgency of now: a proactive, pervasive content awareness tool

    Energy Technology Data Exchange (ETDEWEB)

    Powell, James E [Los Alamos National Laboratory; Collins, Linn M [Los Alamos National Laboratory; Martinez, Mark L B [Los Alamos National Laboratory

    2009-01-01

    Information awareness is distinct from explicit information seeking, such as searching. We describe an information awareness tool that supports text composition by providing awareness of relevant content and references proactively and non-intrusively. As a user composes text, the tool automatically searches multiple sources, retrieves results, and displays links to the results. The tool has been implemented using Web 2.0 and Digital Library 2.0 technologies, and is flexible and highly configurable.

  14. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic construction of thesauri and of automatic classification are examined. [fr]

  15. Automated Assessment in a Programming Tools Course

    Science.gov (United States)

    Fernandez Aleman, J. L.

    2011-01-01

    Automated assessment systems can be useful for both students and instructors. Ranking and immediate feedback can have a strongly positive effect on student learning. This paper presents an experience using automatic assessment in a programming tools course. The proposal aims at extending the traditional use of an online judging system with a…

  16. High-throughput full-automatic synchrotron-based tomographic microscopy

    International Nuclear Information System (INIS)

    Mader, Kevin; Marone, Federica; Hintermueller, Christoph; Mikuljan, Gordan; Isenegger, Andreas; Stampanoni, Marco

    2011-01-01

    At the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline of the Swiss Light Source, with an energy range of 8-45 keV and voxel sizes from 0.37 μm to 7.4 μm, full tomographic datasets are typically acquired in 5 to 10 min. To exploit the speed of the system and enable high-throughput studies to be performed in a fully automatic manner, a package of automation tools has been developed. The samples are automatically exchanged, aligned, moved to the correct region of interest, and scanned. This task is accomplished through the coordination of Python scripts, a robot-based sample-exchange system, sample positioning motors and a CCD camera. The tools are suited for any samples that can be mounted on a standard SEM stub, and require no specific environmental conditions. Up to 60 samples can be analyzed at a time without user intervention. The throughput of the system is dependent on resolution, energy and sample size, but rates of four samples per hour have been achieved with 0.74 μm voxel size at 17.5 keV. The maximum intervention-free scanning time is theoretically unlimited, and in practice experiments have been running unattended as long as 53 h (the average beam time allocation at TOMCAT is 48 h per user). The system is the first fully automated high-throughput tomography station: mounting samples, finding regions of interest, scanning and reconstructing can be performed without user intervention. The system also includes many features which accelerate and simplify the process of tomographic microscopy.

  17. Classification of C2C12 cells at differentiation by convolutional neural network of deep learning using phase contrast images.

    Science.gov (United States)

    Niioka, Hirohiko; Asatani, Satoshi; Yoshimura, Aina; Ohigashi, Hironori; Tagawa, Seiichi; Miyake, Jun

    2018-01-01

    In the field of regenerative medicine, tremendous numbers of cells are necessary for tissue/organ regeneration. Automatic cell-culturing systems have now been developed; the next step is constructing a non-invasive method to monitor the condition of cells automatically. As an image analysis method, the convolutional neural network (CNN), one of the deep learning methods, is approaching human recognition level. We constructed and applied a CNN algorithm for automatic recognition of cellular differentiation in the myogenic C2C12 cell line. Phase-contrast images of cultured C2C12 cells were prepared as the input dataset. In the differentiation process from myoblasts to myotubes, cellular morphology changes from a round shape to an elongated tubular shape due to fusion of the cells. The CNN abstracts the features of cell shape and classifies the cells according to the number of culturing days after differentiation is induced. Changes in cellular shape depending on the number of days of culture (Day 0, Day 3, Day 6) are classified with 91.3% accuracy. Image analysis with CNN has the potential to help realize the regenerative medicine industry.
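    A minimal CNN sketch in the spirit described above, with three output classes (Day 0/3/6). The architecture and the 64x64 grayscale input size are illustrative assumptions; the paper's exact network is not reproduced here.

        import torch
        import torch.nn as nn

        class C2C12Net(nn.Module):
            def __init__(self, n_classes=3):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(32 * 16 * 16, n_classes)

            def forward(self, x):              # x: (batch, 1, 64, 64) grayscale
                h = self.features(x)
                return self.classifier(h.flatten(1))

        logits = C2C12Net()(torch.randn(4, 1, 64, 64))
        print(logits.shape)                    # torch.Size([4, 3])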

  18. Atlas-based automatic segmentation of head and neck organs at risk and nodal target volumes: a clinical validation.

    Science.gov (United States)

    Daisne, Jean-François; Blumhofer, Andreas

    2013-06-26

    Intensity modulated radiotherapy for head and neck cancer necessitates accurate definition of organs at risk (OAR) and clinical target volumes (CTV). This crucial step is time-consuming and prone to inter- and intra-observer variations. Automatic segmentation by atlas deformable registration may help to reduce time and variations. We aim to test a new commercial atlas algorithm for automatic segmentation of OAR and CTV in both ideal and clinical conditions. The updated Brainlab automatic head and neck atlas segmentation was tested on 20 patients: 10 cN0-stages (ideal population) and 10 unselected N-stages (clinical population). Following manual delineation of OAR and CTV, automatic segmentation of the same set of structures was performed and afterwards manually corrected. Dice Similarity Coefficient (DSC), Average Surface Distance (ASD) and Maximal Surface Distance (MSD) were calculated for "manual to automatic" and "manual to corrected" volume comparisons. In both groups, automatic segmentation saved about 40% of the corresponding manual segmentation time. This effect was more pronounced for OAR than for CTV. Editing the automatically obtained contours significantly improved DSC, ASD and MSD. Large distortions of normal anatomy or lack of iodine contrast were the limiting factors. The updated Brainlab atlas-based automatic segmentation tool for head and neck cancer patients is time-saving but still necessitates review and corrections by an expert.
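    For reference, the Dice Similarity Coefficient used above has a standard definition, sketched here with NumPy on synthetic binary masks (the masks are invented for demonstration).

        import numpy as np

        def dice(mask_a, mask_b):
            """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        manual = np.zeros((64, 64), int); manual[10:40, 10:40] = 1
        auto = np.zeros((64, 64), int); auto[15:45, 12:42] = 1
        print(f"DSC = {dice(manual, auto):.3f}")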

  19. BPMNDiffViz : a tool for BPMN models comparison

    NARCIS (Netherlands)

    Ivanov, S.Y.; Kalenkova, A.A.; Aalst, van der W.M.P.; Daniel, F.; Zugal, S.

    2015-01-01

    Automatic comparison of business processes plays an important role in their analysis and optimization. In this paper we present the web-based tool BPMNDiffViz, that finds business processes discrepancies and visualizes them. BPMN (Business Process Model and Notation) 2.0 - one of the most commonly

  20. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    OpenAIRE

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scien...

  1. Design and implementation of a web-based PET-CT reporting assessment and e-portfolio tool

    International Nuclear Information System (INIS)

    Subesinghe, M.; Goldstone, A.R.; Patel, C.N.; Chowdhury, F.U.; Scarsbrook, A.F.

    2015-01-01

    Highlights: • We describe a simple internet-based reporting tool to enhance PET-CT training. • Automatically created competency based metrics are valuable in monitoring progress. • This tool provides robust evidence of competency in PET-CT reporting

  2. EZ and GOSSIP, two new VO compliant tools for spectral analysis

    Science.gov (United States)

    Franzetti, P.; Garill, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.

    2008-10-01

    We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and Integral Science Data Center (Geneve). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.

  3. ACIR: automatic cochlea image registration

    Science.gov (United States)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

    Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. To obtain these measurements, a segmentation method for cochlea medical images is needed. An important pre-processing step for good cochlea segmentation is efficient image registration. The cochlea's small size and complex structure, in addition to the different resolutions and head positions during imaging, pose a major challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multi-modal human cochlea images is proposed. This method is based on using small areas that have clear structures in both input images instead of registering the complete image. It uses the Adaptive Stochastic Gradient Descent Optimizer (ASGD) and Mattes's Mutual Information metric (MMI) to estimate 3D rigid transform parameters. State-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to put all the modalities in the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset which can be downloaded for free from a public XNAT server.
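    A hedged sketch of rigid multi-modal registration in the spirit of ACIR, using SimpleITK's generic framework: a Mattes mutual information metric with a 3D rigid (Euler) transform. Plain gradient descent stands in for the adaptive stochastic gradient descent optimizer that elastix provides, and the synthetic Gaussian images below are placeholders for real cochlea volumes.

        import SimpleITK as sitk

        fixed = sitk.GaussianSource(sitk.sitkFloat32, size=[64] * 3,
                                    sigma=[8.0] * 3, mean=[32.0] * 3)
        moving = sitk.GaussianSource(sitk.sitkFloat32, size=[64] * 3,
                                     sigma=[8.0] * 3, mean=[28.0] * 3)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=32)
        reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=100)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetInitialTransform(
            sitk.CenteredTransformInitializer(fixed, moving, sitk.Euler3DTransform()))

        transform = reg.Execute(fixed, moving)
        print(transform.GetParameters())     # 3 rotations + 3 translations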

  4. [OISO, automatic treatment of patients management in oncogenetics].

    Science.gov (United States)

    Guien, Céline; Fabre, Aurélie; Lagarde, Arnaud; Salgado, David; Gensollen-Thiriez, Catherine; Zattara, Hélène; Beroud, Christophe; Olschwang, Sylviane

    Oncogenetics is a long-term process which requires a close relationship between patients and medical teams, and good familial links allowing lifetime follow-up. Numerous documents are exchanged within the medical team, which has to interact frequently. We present here a new tool that has been conceived specifically for this management. The tool has been developed according to a model-view-controller approach with the relational system PostgreSQL 9.3. The web site uses the PHP 5.3, HTML5 and CSS3 languages, completed with JavaScript and jQuery-AJAX functions and two additional modules, FPDF and PHPMailer. The tool allows multiple interactions, clinical data management, mailing and emailing, and follow-up planning. Requests make it possible to follow all patients and their planning automatically, to send information to a large number of patients or physicians, and to report activity. The tool has been designed for oncogenetics and adapted to its different aspects. The CNIL delivered an authorization for its use. Secured web access allows management at a regional level. Its simple concept makes it evolvable according to the constant updates of the genetic and clinical management of patients. Copyright © 2017 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  5. Development and application of an automatic system for measuring the laser camera

    International Nuclear Information System (INIS)

    Feng Shuli; Peng Mingchen; Li Kuncheng

    2004-01-01

    Objective: To provide an automatic measurement and analysis system for assessing the imaging quality of laser cameras. Methods: On a dedicated imaging workstation (SGI 540), the procedure was written in the Matlab language. An automatic measurement and analysis system for laser camera imaging quality was developed according to the imaging quality measurement standard for laser cameras of the International Electrotechnical Commission (IEC). The measurement system applied the theory of digital signal processing, was based on the characteristics of digital images, and performed the automatic measurement and analysis using the sample pictures supplied with the laser camera. Results: All the imaging quality parameters of a laser camera, including the H-D and MTF curves, optical density at low, middle and high resolution, various geometric distortions, maximum and minimum density, as well as the dynamic range of the gray scale, could be measured by this system. The system was applied to measuring the laser cameras in 20 hospitals in Beijing. The measuring results showed that the system could provide objective and quantitative data, accurately evaluate the imaging quality of a laser camera, and correct results obtained by manual measurement based on the sample pictures supplied with the laser camera. Conclusion: The automatic measuring system is an effective and objective tool for testing the quality of laser cameras, and it lays a foundation for future research

  6. Automatic Atrial Fibrillation Detection: A Novel Approach Using Discrete Wavelet Transform and Heart Rate Variabilit

    DEFF Research Database (Denmark)

    Bruun, Iben H.; Hissabu, Semira M. S.; Poulsen, Erik S.

    2017-01-01

    be used as a screening tool for patients suspected to have AF. The method includes an automatic peak detection prior to the feature extraction, as well as a noise cancellation technique followed by a bagged tree classification. Simulation studies on the MIT-BIH Atrial Fibrillation database was performed...

  7. Bootstrap regularity for integro-differential operators and its application to nonlocal minimal surfaces

    OpenAIRE

    Barrera, Begoña Barrios; Figalli, Alessio; Valdinoci, Enrico

    2012-01-01

    We prove that $C^{1,\alpha}$ $s$-minimal surfaces are automatically $C^\infty$. For this, we develop a new bootstrap regularity theory for solutions of integro-differential equations of very general type, which we believe is of independent interest.

  8. Antibiogramj: A tool for analysing images from disk diffusion tests.

    Science.gov (United States)

    Alonso, C A; Domínguez, C; Heras, J; Mata, E; Pascual, V; Torres, C; Zarazaga, M

    2017-05-01

    Disk diffusion testing, known as the antibiogram, is widely applied in microbiology to determine the antimicrobial susceptibility of microorganisms. The diameter of the zone of growth inhibition around each antimicrobial disk is frequently measured manually by specialists using a ruler. This is a time-consuming and error-prone task that might be simplified using automated or semi-automated inhibition zone readers. However, most readers are expensive instruments with embedded software that require significant changes in laboratory design and workflow. Based on the workflow employed by specialists to determine the antimicrobial susceptibility of microorganisms, we have designed a software tool that semi-automatises the process from images of disk diffusion tests. Standard computer vision techniques are employed to achieve this automatisation. We present AntibiogramJ, a user-friendly and open-source software tool to semi-automatically determine, measure and categorise inhibition zones in images from disk diffusion tests. AntibiogramJ is implemented in Java and deals with images captured by any device that incorporates a camera, including digital cameras and mobile phones. The fully automatic procedure of AntibiogramJ for measuring inhibition zones achieves an overall agreement of 87% with an expert microbiologist; moreover, AntibiogramJ includes features to easily detect when the automatic reading is not correct and to fix it manually to obtain the correct result. AntibiogramJ is a user-friendly, platform-independent, open-source and free tool that, to the best of our knowledge, is the most complete software tool for antibiogram analysis without requiring any investment in new equipment or changes in the laboratory. Copyright © 2017 Elsevier B.V. All rights reserved.
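    An illustrative sketch of the core measurement (not AntibiogramJ's Java implementation): estimate the inhibition-zone diameter around a disk by building a radial intensity profile from the disk centre and scanning outward until bacterial growth is reached. The synthetic plate image and the threshold are invented for demonstration.

        import numpy as np

        def zone_diameter_px(img, center, growth_threshold):
            """Return the inhibition zone diameter (pixels) around `center`.
            Pixels brighter than the threshold are treated as growth-free."""
            cy, cx = center
            yy, xx = np.indices(img.shape)
            r = np.hypot(yy - cy, xx - cx).astype(int)
            # mean intensity at each integer radius (a radial profile)
            profile = np.bincount(r.ravel(), weights=img.ravel()) / np.bincount(r.ravel())
            below = np.nonzero(profile < growth_threshold)[0]
            radius = below[0] if below.size else len(profile)
            return 2 * radius

        # Synthetic plate: dark lawn of growth (0.2) with a bright clear zone (0.9).
        plate = np.full((200, 200), 0.2)
        yy, xx = np.indices(plate.shape)
        plate[np.hypot(yy - 100, xx - 100) < 35] = 0.9
        print(zone_diameter_px(plate, (100, 100), growth_threshold=0.5))  # ~70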

  9. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    Regional hydrological models are important tools in water resources management. Model prediction uncertainty is primarily due to structural (geological) non-uniqueness, which makes sampling of the structural model space necessary to estimate prediction uncertainties. Geological structures and hete… …and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above.

  10. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomogram (SPECT) without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation

  11. Development of a tool-kit for the detection of healthy and injured cardiac tissue based on MR imaging

    Directory of Open Access Journals (Sweden)

    Westphal Philip

    2017-09-01

    Full Text Available Planning of interventions to treat cardiac arrhythmia requires a 3D patient-specific model of the heart. Currently available commercial or free software dedicated to this task has important limitations for routine use. Automatic algorithms are not robust enough, while manual methods are time-consuming. Therefore, the project attempts to develop an optimal software tool. The heart model is generated from preoperative MR datasets acquired with contrast agent and allows visualisation of damaged cardiac tissue. A requirement in the development of the software tool was the use of semi-automatic functions to be more robust. Once the patient image dataset has been loaded, the user selects a region of interest. Thresholding functions allow selecting the areas of high intensities, which correspond to anatomical structures filled with contrast agent, namely cardiac cavities and blood vessels. Thereafter, the target structure, for example the left ventricle, is coarsely selected by interactively outlining the gross shape. An active contour function automatically adjusts the initial contour to the image content. The result can still be manually improved using fast interaction tools. Finally, possible scar tissue located in the cavity muscle is automatically detected and visualized on the 3D heart model. The model is exported in a format which is compatible with interventional devices at the hospital. The evaluation of the software tool included two steps. Firstly, a comparison with two free software tools was performed on two image data sets of variable quality. Secondly, six scientists and physicians tested our tool and filled out a questionnaire. The performance of our software tool was visually judged more satisfactory than the free software, especially on the data set of lower quality. Professionals evaluated our functionalities positively regarding time taken, ease of use and quality of results. Improvements would consist in performing the planning based

  12. Automatic Adjustments of a Trans-oesophageal Ultrasound Robot for Monitoring Intra-operative Catheters

    Science.gov (United States)

    Wang, Shuangyi; Housden, James; Singh, Davinder; Rhode, Kawal

    2017-12-01

    3D trans-oesophageal echocardiography (TOE) has in recent years become a powerful tool for monitoring intra-operative catheters used during cardiac procedures. However, control of the TOE probe remains a manual task, and therefore the operator has to hold the probe for a long period of time, sometimes in a radiation environment. To solve this problem, an add-on robotic system has been developed for holding and manipulating a commercial TOE probe. This paper focuses on the application of making automatic adjustments to the probe pose in order to accurately monitor moving catheters. The positioning strategy is divided into an initialization step based on a pre-planning method and a localized-adjustments step based on the robotic differential kinematics and related image servoing techniques. Both steps are described in the paper along with simulation experiments performed to validate the concept. The results indicate an error of less than 0.5 mm for the initialization step and an error of less than 2 mm for the localized-adjustments step. Since these errors are small compared with the much larger live 3D image volume, it is concluded that the methods are promising. Future work will focus on evaluating the method in a real TOE scanning scenario.
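    A generic sketch of the localized-adjustment idea: a damped least-squares (resolved-rate) step maps a desired small image-space correction to probe joint increments via the robot Jacobian. The Jacobian and target offset below are placeholders, not the probe's real kinematics.

        import numpy as np

        def dls_step(J, dx, damping=0.01):
            """Damped least-squares inverse kinematics step:
            dq = J^T (J J^T + lambda^2 I)^-1 dx."""
            JJt = J @ J.T
            return J.T @ np.linalg.solve(JJt + damping**2 * np.eye(JJt.shape[0]), dx)

        J = np.array([[1.0, 0.2, 0.0],        # 2 task dims x 3 probe joints
                      [0.0, 0.8, 0.5]])
        dx = np.array([0.4, -0.1])            # desired catheter-centering correction
        dq = dls_step(J, dx)
        print(dq, J @ dq)                     # J @ dq approximately equals dx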

  13. Higher-Order and Symbolic Computation. LISP and Symbolic Computationditorial

    DEFF Research Database (Denmark)

    Danvy, Olivier; Dybvig, R. Kent; Lawall, Julia

    2008-01-01

    …system for these static checks and a corresponding type-inference algorithm. In "An Investigation of Jones Optimality and BTI-Universal Specializers," Robert Glueck establishes a connection between Jones-optimal program specializers and binding-time improvers; this article completes a study started at ASIA-PEPM 2002 [1]. In "On the Implementation of Automatic Differentiation Tools," Christian H. Bischof, Paul D. Hovland, and Boyana Norris present a survey of some recent tools for the Automatic Differentiation technology (concentrating mainly on ADIC, ADIFOR and sketching XAIF). They also offer … for removing tuple constructions and tuple selections. This technique solves the problem of efficiently passing tuples to polymorphic functions by avoiding extra memory operations in selecting components of the tuple.

  14. Workflow-centred evaluation of an automatic lesion tracking software for chemotherapy monitoring by CT

    Energy Technology Data Exchange (ETDEWEB)

    Moltz, Jan Hendrik; Peitgen, Heinz-Otto [Fraunhofer MEVIS - Institute for Medical Image Computing, Bremen (Germany); D' Anastasi, Melvin [University Hospital Munich-Grosshadern, Department of Clinical Radiology, Muenchen (Germany); Kiessling, Andreas [University Hospital Giessen and Marburg, Department of Diagnostic Radiology, Marburg (Germany); Pinto dos Santos, Daniel [University Hospital Mainz, Department of Diagnostic and Interventional Radiology, Mainz (Germany); Schuelke, Christoph [University Hospital Muenster, Institute of Clinical Radiology, Muenster (Germany)

    2012-12-15

    In chemotherapy monitoring, an estimation of the change in tumour size is an important criterion for the assessment of treatment success. This requires a comparison between corresponding lesions in the baseline and follow-up computed tomography (CT) examinations. We evaluate the clinical benefits of an automatic lesion tracking tool that identifies the target lesions in the follow-up CT study and pre-computes the lesion volumes. Four radiologists performed volumetric follow-up examinations for 52 patients with and without lesion tracking. In total, 139 lung nodules, liver metastases and lymph nodes were given as target lesions. We measured reading time, inter-reader variability in lesion identification and volume measurements, and the amount of manual adjustments of the segmentation results. With lesion tracking, target lesion assessment time decreased by 38 % or 22 s per lesion. Relative volume difference between readers was reduced from 0.171 to 0.1. Segmentation quality was comparable with and without lesion tracking. Our automatic lesion tracking tool can make interpretation of follow-up CT examinations quicker and provide results that are less reader-dependent. (orig.)

  15. MIAQuant, a novel system for automatic segmentation, measurement, and localization comparison of different biomarkers from serialized histological slices

    Directory of Open Access Journals (Sweden)

    Elena Casiraghi

    2017-11-01

    Full Text Available In clinical practice, automatic image analysis methods that quickly quantify histological results by objective and replicable methods are becoming more and more necessary and widespread. Although several commercial software products are available for this task, they offer very little flexibility and are provided as black boxes without modifiable source code. To overcome these problems, we employed the commonly used MATLAB platform to develop an automatic method, MIAQuant, for the analysis of histochemical and immunohistochemical images, stained with various methods and acquired by different tools. It automatically extracts and quantifies markers characterized by various colors and shapes; furthermore, it aligns contiguous tissue slices stained by different markers and overlaps them with differing colors for visual comparison of their localization. Application of MIAQuant in clinical research fields, such as oncology and cardiovascular disease studies, has proven its efficacy, robustness and flexibility with respect to various problems; we highlight that the flexibility of MIAQuant makes it an important tool for basic research, where needs are constantly changing. The MIAQuant software and its user manual are freely available for clinical studies, pathological research, and diagnosis.
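    MIAQuant itself is written in MATLAB; as a language-neutral illustration of the marker-quantification step, this NumPy sketch measures a marker of a given colour as the fraction of stained pixels, using an invented RGB rule for a brown DAB-like stain on a synthetic image.

        import numpy as np

        def marker_area_fraction(rgb, rule):
            """Fraction of pixels satisfying a boolean colour rule on an RGB image."""
            mask = rule(rgb[..., 0], rgb[..., 1], rgb[..., 2])
            return mask.mean()

        brownish = lambda r, g, b: (r > 90) & (g > 40) & (g < r) & (b < g)

        img = np.zeros((100, 100, 3), np.uint8)
        img[20:50, 20:50] = (120, 70, 40)     # synthetic stained region
        print(f"stained fraction: {marker_area_fraction(img, brownish):.3f}")  # 0.090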

  16. Differential equation analysis in biomedical science and engineering ordinary differential equation applications with R

    CERN Document Server

    Schiesser, William E

    2014-01-01

    Features a solid foundation of mathematical and computational tools to formulate and solve real-world ODE problems across various fields With a step-by-step approach to solving ordinary differential equations (ODEs), Differential Equation Analysis in Biomedical Science and Engineering: Ordinary Differential Equation Applications with R successfully applies computational techniques for solving real-world ODE problems that are found in a variety of fields, including chemistry, physics, biology, and physiology. The book provides readers with the necessary knowledge to reproduce and extend the comp

  17. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  18. Fuchsia : A tool for reducing differential equations for Feynman master integrals to epsilon form

    Science.gov (United States)

    Gituliar, Oleksandr; Magerya, Vitaly

    2017-10-01

    We present Fuchsia, an implementation of the Lee algorithm, which for a given system of ordinary differential equations with rational coefficients ∂_x J(x, ε) = A(x, ε) J(x, ε) finds a basis transformation T(x, ε), i.e., J(x, ε) = T(x, ε) J′(x, ε), such that the system turns into the epsilon form: ∂_x J′(x, ε) = ε S(x) J′(x, ε), where S(x) is a Fuchsian matrix. A system of this form can be trivially solved in terms of polylogarithms as a Laurent series in the dimensional regulator ε. That makes the construction of the transformation T(x, ε) crucial for obtaining solutions of the initial system. In principle, Fuchsia can deal with any regular system; however, its primary task is to reduce differential equations for Feynman master integrals. It ensures that solutions contain only regular singularities due to the properties of Feynman integrals. Program Files doi: http://dx.doi.org/10.17632/zj6zn9vfkh.1 Licensing provisions: MIT. Programming language: Python 2.7. Nature of problem: Feynman master integrals may be calculated from solutions of a linear system of differential equations with rational coefficients. Such a system can be easily solved as an ε-series when its epsilon form is known. Hence, a tool which is able to find the epsilon form transformations can be used to evaluate Feynman master integrals. Solution method: The solution method is based on the Lee algorithm (Lee, 2015), which consists of three main steps: fuchsification, normalization, and factorization. During the fuchsification step a given system of differential equations is transformed into the Fuchsian form with the help of the Moser method (Moser, 1959). Next, during the normalization step the system is transformed to the form where eigenvalues of all residues are proportional to the dimensional regulator ε. Finally, the system is factorized to the epsilon form by finding an unknown transformation which satisfies a system of linear equations. Additional comments
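    A small sketch (not part of Fuchsia) of why the epsilon form is so convenient: once ∂_x J′ = ε S(x) J′, the solution is an iterated-integral series J′ = (1 + ε I₁ + ε² I₂ + …) J′(x₀). The 1x1 kernel S = 1/x below is an invented toy with a single pole at x = 0, where the iterated integrals are simply powers of logarithms.

        import sympy as sp

        x, t = sp.symbols('x t', positive=True)
        S = 1/x                               # toy 1x1 Fuchsian kernel, pole at x=0

        def eps_series(order, x0=1):
            terms, current = [sp.Integer(1)], sp.Integer(1)
            for _ in range(order):
                current = sp.integrate((S * current).subs(x, t), (t, x0, x))
                terms.append(sp.simplify(current))
            return terms                      # coefficient of eps^k at index k

        for k, term in enumerate(eps_series(3)):
            print(f"eps^{k}:", term)          # 1, log(x), log(x)**2/2, log(x)**3/6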

  19. Bond graphs : an integrating tool for design of mechatronic systems

    International Nuclear Information System (INIS)

    Ould Bouamama, B.

    2011-01-01

    The bond graph is a powerful tool, well known for the dynamic modelling of multiphysical systems: it is the only modelling technique to generate state-space or nonlinear models automatically using dedicated software tools (CAMP-G, 20-Sim, Symbols, Dymola...). Recently, several fundamental theories have been developed for using a bond graph model not only for modelling but also as a real integrated tool from conceptual ideas to the optimal practical realization of a mechatronic system. This keynote presents a synthesis of those new theories, which exploit particular properties (causal, structural and behavioral) of this graphical methodology. Based on a pedagogical example, it will be shown how, starting from a physical system (not a transfer function or state equation) and using only one representation (the bond graph), the following results can be obtained: modelling (formal generation of state equations), control analysis (observability, controllability, structural I/O decouplability, dynamic decoupling, ...), diagnosis analysis (automatic generation of robust fault indicators, sensor placement, structural diagnosability) and finally sizing of actuators. The presentation will be illustrated by real industrial applications. Limits and perspectives of bond graph theory conclude the keynote.

  20. The MIMIC Model as a Tool for Differential Bundle Functioning Detection

    Science.gov (United States)

    Finch, W. Holmes

    2012-01-01

    Increasingly, researchers interested in identifying potentially biased test items are encouraged to use a confirmatory, rather than exploratory, approach. One such method for confirmatory testing is rooted in differential bundle functioning (DBF), where hypotheses regarding potential differential item functioning (DIF) for sets of items (bundles)…

  1. Tie Points Extraction for SAR Images Based on Differential Constraints

    Science.gov (United States)

    Xiong, X.; Jin, G.; Xu, Q.; Zhang, H.

    2018-04-01

    Automatically extracting tie points (TPs) from large-size synthetic aperture radar (SAR) images is still challenging because the efficiency and correct ratio of image matching need to be improved. This paper proposes an automatic TP extraction method based on differential constraints for large-size SAR images obtained from approximately parallel tracks, between which the relative geometric distortions are small in the azimuth direction and large in the range direction. Image pyramids are built first, and then corresponding layers of the pyramids are matched from top to bottom. In this process, similarity is measured by the normalized cross-correlation (NCC) algorithm, computed over a rectangular window whose long side is parallel to the azimuth direction. False matches are removed by the differential-constrained random sample consensus (DC-RANSAC) algorithm, which applies strong constraints in the azimuth direction and weak constraints in the range direction. Matching points in the lower pyramid images are predicted with a local bilinear transformation model in the range direction. Experiments performed on ENVISAT ASAR and Chinese airborne SAR images validated the efficiency, correct ratio and accuracy of the proposed method.
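    The similarity measure at the heart of the matching step is standard normalized cross-correlation between two equally sized windows; a NumPy sketch follows, with random arrays standing in for real SAR image chips (the window shape echoes the azimuth-elongated window described above).

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation of two equally sized windows."""
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return (a * b).sum() / denom if denom else 0.0

        rng = np.random.default_rng(0)
        template = rng.standard_normal((64, 16))    # long side along azimuth
        patch = template + 0.1 * rng.standard_normal((64, 16))
        print(f"NCC = {ncc(template, patch):.3f}")  # close to 1.0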

  2. Automated differentiation between epileptic and non-epileptic convulsive seizures

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Conradsen, Isa; Moldovan, Mihai

    2015-01-01

    Our objective was the clinical validation of an automated algorithm based on surface electromyography (EMG) for differentiation between convulsive epileptic and psychogenic nonepileptic seizures (PNESs). Forty-four consecutive episodes with convulsive events were automatically analyzed with the a… …%) and 18 PNESs (95%). The overall diagnostic accuracy was 95%. This algorithm is useful for distinguishing between epileptic and psychogenic convulsive seizures....

  3. Computer vision and soft computing for automatic skull-face overlay in craniofacial superimposition.

    Science.gov (United States)

    Campomanes-Álvarez, B Rosario; Ibáñez, O; Navarro, F; Alemán, I; Botella, M; Damas, S; Cordón, O

    2014-12-01

    Craniofacial superimposition can provide evidence to support that some human skeletal remains belong or not to a missing person. It involves the process of overlaying a skull with a number of ante mortem images of an individual and the analysis of their morphological correspondence. Within the craniofacial superimposition process, the skull-face overlay stage just focuses on achieving the best possible overlay of the skull and a single ante mortem image of the suspect. Although craniofacial superimposition has been in use for over a century, skull-face overlay is still applied by means of a trial-and-error approach without an automatic method. Practitioners finish the process once they consider that a good enough overlay has been attained. Hence, skull-face overlay is a very challenging, subjective, error prone, and time consuming part of the whole process. Though the numerical assessment of the method quality has not been achieved yet, computer vision and soft computing arise as powerful tools to automate it, dramatically reducing the time taken by the expert and obtaining an unbiased overlay result. In this manuscript, we justify and analyze the use of these techniques to properly model the skull-face overlay problem. We also present the automatic technical procedure we have developed using these computational methods and show the four overlays obtained in two craniofacial superimposition cases. This automatic procedure can be thus considered as a tool to aid forensic anthropologists to develop the skull-face overlay, automating and avoiding subjectivity of the most tedious task within craniofacial superimposition. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    Science.gov (United States)

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
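    A compact sketch of the DerSimonian-Laird random-effects pooling that the record refers to (without the Knapp-Hartung adjustment of the confidence interval); the effect sizes and variances below are invented for demonstration.

        import numpy as np

        def dersimonian_laird(y, v):
            """Pooled effect under the DL random-effects model.
            y: per-study effect sizes, v: their within-study variances."""
            w = 1.0 / v
            y_fe = np.sum(w * y) / np.sum(w)              # fixed-effect mean
            q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
            w_re = 1.0 / (v + tau2)
            return np.sum(w_re * y) / np.sum(w_re), tau2

        effects = np.array([0.30, 0.55, 0.42, 0.12])
        variances = np.array([0.04, 0.09, 0.05, 0.02])
        print(dersimonian_laird(effects, variances))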

  5. Hera-FFX: a Firefox add-on for Semi-automatic Web Accessibility Evaluation

    OpenAIRE

    Fuertes Castro, José Luis; González, Ricardo; Gutiérrez, Emmanuelle; Martínez Normand, Loïc

    2009-01-01

    Website accessibility evaluation is a complex task requiring a combination of human expertise and software support. There are several online and offline tools to support the manual web accessibility evaluation process. However, they all have some weaknesses because none of them includes all the desired features. In this paper we present Hera-FFX, an add-on for the Firefox web browser that supports semi-automatic web accessibility evaluation.

  6. Online Questionnaires Use with Automatic Feedback for e-Innovation in University Students

    OpenAIRE

    Remesal, Ana; Colomina, Rosa M.; Mauri, Teresa; Rochera, M. José

    2017-01-01

    Technological tools have permeated higher education programs. However, their mere introduction does not guarantee instructional quality. This article presents the results of an innovation project aimed at fostering autonomous learning among students at a Pre-School and Primary Teacher Grade. For one semester all freshmen students used a system for autonomous learning embedded in the institutional online platform (Moodle), which included automatic formative feedback. The system was part of a c...

  7. Towards tool support for spreadsheet-based domain-specific languages

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Schultz, Ulrik Pagh

    2015-01-01

    Spreadsheets are commonly used by non-programmers to store data in a structured form; this data can in some cases be considered to be a program in a domain-specific language (DSL). Unlike ordinary text-based domain-specific languages, there is however currently no formalism for expressing the syntax of such spreadsheet-based DSLs (SDSLs), and there is no tool support for automatically generating language infrastructure such as parsers and IDE support. In this paper we define a simple notion of two-dimensional grammars for SDSLs, and show how such grammars can be used for automatically

  8. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  9. Attentional bias for pain and sex, and automatic appraisals of sexual penetration : Differential patterns in dyspareunia versus vaginismus?

    NARCIS (Netherlands)

    Melles, Reinhilde J.; Dewitte, Marieke D.; ter Kuile, Moniek M.; Peters, Madelon M.L.; Jong, de Peter J.

    Introduction Current information processing models propose that heightened attention bias for sex-related threats (eg, pain) and lowered automatic incentive processes (“wanting”) may play an important role in the impairment of sexual arousal and the development of sexual dysfunctions such as

  10. Atlas-based automatic segmentation of head and neck organs at risk and nodal target volumes: a clinical validation

    International Nuclear Information System (INIS)

    Daisne, Jean-François; Blumhofer, Andreas

    2013-01-01

    Intensity modulated radiotherapy for head and neck cancer necessitates accurate definition of organs at risk (OAR) and clinical target volumes (CTV). This crucial step is time-consuming and prone to inter- and intra-observer variations. Automatic segmentation by atlas deformable registration may help to reduce time and variations. We aim to test a new commercial atlas algorithm for automatic segmentation of OAR and CTV in both ideal and clinical conditions. The updated Brainlab automatic head and neck atlas segmentation was tested on 20 patients: 10 cN0-stages (ideal population) and 10 unselected N-stages (clinical population). Following manual delineation of OAR and CTV, automatic segmentation of the same set of structures was performed and afterwards manually corrected. Dice Similarity Coefficient (DSC), Average Surface Distance (ASD) and Maximal Surface Distance (MSD) were calculated for “manual to automatic” and “manual to corrected” volume comparisons. In both groups, automatic segmentation saved about 40% of the corresponding manual segmentation time. This effect was more pronounced for OAR than for CTV. Editing the automatically obtained contours significantly improved DSC, ASD and MSD. Large distortions of normal anatomy or lack of iodine contrast were the limiting factors. The updated Brainlab atlas-based automatic segmentation tool for head and neck cancer patients is time-saving but still necessitates review and correction by an expert
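
    The overlap metrics used in this record are standard and easy to reproduce. As a minimal illustration (not the Brainlab implementation), the Dice Similarity Coefficient between two binary masks can be computed with numpy; the example masks are synthetic.

        import numpy as np

        def dice_similarity(a, b):
            # DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            if denom == 0:
                return 1.0  # both masks empty: treat as perfect agreement
            return 2.0 * np.logical_and(a, b).sum() / denom

        # synthetic "manual" vs. "automatic" masks, offset by two voxels
        manual = np.zeros((64, 64), dtype=bool); manual[20:40, 20:40] = True
        auto = np.zeros((64, 64), dtype=bool); auto[22:42, 22:42] = True
        print(f"DSC = {dice_similarity(manual, auto):.3f}")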

  11. Differential equation analysis in biomedical science and engineering partial differential equation applications with R

    CERN Document Server

    Schiesser, William E

    2014-01-01

    Features a solid foundation of mathematical and computational tools to formulate and solve real-world PDE problems across various fields With a step-by-step approach to solving partial differential equations (PDEs), Differential Equation Analysis in Biomedical Science and Engineering: Partial Differential Equation Applications with R successfully applies computational techniques for solving real-world PDE problems that are found in a variety of fields, including chemistry, physics, biology, and physiology. The book provides readers with the necessary knowledge to reproduce and extend the com

  12. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  13. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

  14. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink

    2017-01-01

    High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient, and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
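
    The record does not publish the detector itself; a common baseline for ripple detection, band-pass filtering to 80–250 Hz followed by an envelope threshold, can be sketched in Python with scipy. The filter order, the 3-SD threshold, and the minimum event duration below are illustrative assumptions, not the authors' tuned parameters.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def detect_ripples(x, fs, band=(80.0, 250.0), n_std=3.0, min_dur=0.006):
            """Return (start, stop) times in seconds of candidate ripple events."""
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            env = np.abs(hilbert(filtfilt(b, a, x)))      # ripple-band amplitude envelope
            above = env > env.mean() + n_std * env.std()  # amplitude criterion
            padded = np.concatenate(([False], above, [False]))
            starts = np.flatnonzero(~padded[:-1] & padded[1:])
            stops = np.flatnonzero(padded[:-1] & ~padded[1:])
            return [(s / fs, e / fs) for s, e in zip(starts, stops)
                    if (e - s) / fs >= min_dur]

        # synthetic virtual-sensor trace with one injected 150 Hz burst at t = 4 s
        fs = 1000.0
        t = np.arange(0.0, 10.0, 1 / fs)
        x = np.random.default_rng(0).normal(0.0, 1.0, t.size)
        burst = (t > 4.0) & (t < 4.05)
        x[burst] += 8.0 * np.sin(2 * np.pi * 150.0 * t[burst])
        print(detect_ripples(x, fs))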

  15. Not proper ROC curves as new tool for the analysis of differentially expressed genes in microarray experiments

    Directory of Open Access Journals (Sweden)

    Pistoia Vito

    2008-10-01

    Background: Most microarray experiments are carried out with the purpose of identifying genes whose expression varies in relation with specific conditions or in response to environmental stimuli. In such studies, genes showing similar mean expression values between two or more groups are considered as not differentially expressed, even if hidden subclasses with different expression values may exist. In this paper we propose a new method for identifying differentially expressed genes, based on the area between the ROC curve and the rising diagonal (ABCR). ABCR represents a more general approach than the standard area under the ROC curve (AUC), because it can identify both proper (i.e., concave) and not proper ROC curves (NPRC). In particular, NPRC may correspond to those genes that tend to escape standard selection methods. Results: We assessed the performance of our method using data from a publicly available database of 4026 genes, including 14 normal B cell samples (NBC) and 20 heterogeneous lymphomas (namely, 9 follicular lymphomas and 11 chronic lymphocytic leukemias). Moreover, NBC also included two sub-classes, i.e., 6 heavily stimulated and 8 slightly or not stimulated samples. We identified 1607 differentially expressed genes with an estimated False Discovery Rate of 15%. Among them, 16 corresponded to NPRC and all escaped standard selection procedures based on AUC and t statistics. Moreover, a simple inspection of the shape of such plots allowed identification of the two subclasses within one class in 13 cases (81%). Conclusion: NPRC represent a new useful tool for the analysis of microarray data.
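
    The ABCR statistic itself is simple to compute from an empirical ROC curve. A small numpy sketch (not the authors' code) shows why it catches genes with hidden subclasses: a bimodal "tumor" class yields a crossing ROC curve whose AUC is near 0.5 while its ABCR is clearly positive.

        import numpy as np

        def roc_points(scores, labels):
            # empirical ROC: sweep thresholds from high to low score
            order = np.argsort(-scores)
            labels = np.asarray(labels)[order]
            tpr = np.concatenate(([0.0], np.cumsum(labels) / labels.sum()))
            fpr = np.concatenate(([0.0], np.cumsum(1 - labels) / (1 - labels).sum()))
            return fpr, tpr

        def abcr(scores, labels):
            # area between the ROC curve and the rising diagonal
            fpr, tpr = roc_points(scores, labels)
            return np.trapz(np.abs(tpr - fpr), fpr)

        rng = np.random.default_rng(1)
        normal = rng.normal(0.0, 1.0, 200)                   # reference class
        tumor = np.concatenate([rng.normal(-2.0, 1.0, 100),  # hidden low-expressing subclass
                                rng.normal(2.0, 1.0, 100)])  # hidden high-expressing subclass
        scores = np.concatenate([normal, tumor])
        labels = np.concatenate([np.zeros(200, int), np.ones(200, int)])
        fpr, tpr = roc_points(scores, labels)
        print(f"AUC = {np.trapz(tpr, fpr):.2f}, ABCR = {abcr(scores, labels):.2f}")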

  16. Application of a semi-automatic cartilage segmentation method for biomechanical modeling of the knee joint.

    Science.gov (United States)

    Liukkonen, Mimmi K; Mononen, Mika E; Tanska, Petri; Saarakkala, Simo; Nieminen, Miika T; Korhonen, Rami K

    2017-10-01

    Manual segmentation of articular cartilage from knee joint 3D magnetic resonance images (MRI) is a time consuming and laborious task. Thus, automatic methods are needed for faster and reproducible segmentations. In the present study, we developed a semi-automatic segmentation method based on radial intensity profiles to generate 3D geometries of knee joint cartilage which were then used in computational biomechanical models of the knee joint. Six healthy volunteers were imaged with a 3T MRI device and their knee cartilages were segmented both manually and semi-automatically. The values of cartilage thicknesses and volumes produced by these two methods were compared. Furthermore, the influences of possible geometrical differences on cartilage stresses and strains in the knee were evaluated with finite element modeling. The semi-automatic segmentation and 3D geometry construction of one knee joint (menisci, femoral and tibial cartilages) was approximately two times faster than with manual segmentation. Differences in cartilage thicknesses, volumes, contact pressures, stresses, and strains between segmentation methods in femoral and tibial cartilage were mostly insignificant (p > 0.05) and random, i.e. there were no systematic differences between the methods. In conclusion, the devised semi-automatic segmentation method is a quick and accurate way to determine cartilage geometries; it may become a valuable tool for biomechanical modeling applications with large patient groups.

  17. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language...... management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm’s leadership...... may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  18. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, considerable time and human resources are needed during the development period of a code, which may delay development. To reduce cost and human resources and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).

  19. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, considerable time and human resources are needed during the development period of a code, which may delay development. To reduce cost and human resources and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).
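
    The core of an NRT harness is a tolerance-based comparison of new outputs against a stored baseline. The records describe a VBA/Excel implementation; purely for illustration, the same check is sketched here in Python with invented result keys.

        def non_regression_check(new, baseline, rtol=1e-6):
            """Compare updated-code results against a stored baseline, key by key."""
            failures = []
            for key, ref in baseline.items():
                val = new.get(key)
                if val is None or abs(val - ref) > rtol * max(1.0, abs(ref)):
                    failures.append((key, ref, val))
            return failures

        # hypothetical scalar outputs of a thermal-hydraulic test case
        baseline = {"peak_clad_temp_K": 612.4, "break_flow_kg_s": 3.18}
        new_run = {"peak_clad_temp_K": 612.4, "break_flow_kg_s": 3.21}
        print(non_regression_check(new_run, baseline))   # flags the changed quantity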

  20. Automatic classification of liver scintigram patterns by computer

    International Nuclear Information System (INIS)

    Csernay, L.; Csirik, J.

    1976-01-01

    Pattern recognition of the projection is one of the problems in the automatic evaluation of scintigrams. The authors have elaborated an algorithm and a computer programme able to classify the shapes of liver scintigrams. The programme not only differentiates normal and pathologic basic forms, but also identifies nine normal forms described in the literature. For pattern recognition, structural and local parameters of the picture were defined. A detailed description of the programme is given in the report. The programme correctly classified 55 out of 60 actual liver scintigrams; in the 5 remaining cases, all normal liver scan patterns, the result differed from the subjective classification. No wrong classification was obtained when distinguishing normal from pathologic patterns

  1. SplitRacer - a new Semi-Automatic Tool to Quantify And Interpret Teleseismic Shear-Wave Splitting

    Science.gov (United States)

    Reiss, M. C.; Rumpker, G.

    2017-12-01

    We have developed a semi-automatic, MATLAB-based GUI that combines standard seismological tasks in the analysis and interpretation of teleseismic shear-wave splitting. Shear-wave splitting analysis is widely used to infer seismic anisotropy, which can be interpreted in terms of lattice-preferred orientation of mantle minerals, shape-preferred orientation caused by fluid-filled cracks, or alternating layers. Seismic anisotropy provides a unique link between directly observable surface structures and the more elusive dynamic processes in the mantle below. Thus, resolving the seismic anisotropy of the lithosphere/asthenosphere is of particular importance for geodynamic modeling and interpretation. The increasing number of seismic stations from temporary experiments and permanent installations creates a new basis for comprehensive studies of seismic anisotropy world-wide. However, the increasingly large data sets pose new challenges for the rapid and reliable analysis of teleseismic waveforms and for the interpretation of the measurements. Well-established routines and programs are available but are often impractical for analyzing large data sets from hundreds of stations. Additionally, shear-wave splitting results are seldom evaluated using the same well-defined quality criteria, which may complicate comparison with results from different studies. SplitRacer has been designed to overcome these challenges by incorporating the following processing steps: i) downloading of waveform data from multiple stations in mseed format using FDSNWS tools; ii) automated initial screening and categorizing of XKS-waveforms using a pre-set SNR-threshold; iii) particle-motion analysis of selected phases at longer periods to detect and correct for sensor misalignment; iv) splitting analysis of selected phases based on transverse-energy minimization for multiple, randomly-selected, relevant time windows; v) one- and two-layer joint-splitting analysis for all phases at one station by
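
    Step iv) above is the classical transverse-energy grid search: for each trial fast-axis angle and delay time, the splitting is reversed and the energy remaining on the transverse component is measured; the minimum identifies the splitting parameters. The Python sketch below illustrates the idea only; SplitRacer itself is MATLAB-based, and the windowing, rotation conventions and uncertainty estimation are simplified here.

        import numpy as np

        def transverse_energy(north, east, baz_deg, phi_deg, dt_samples):
            """Energy on the transverse component after reversing a trial splitting."""
            phi = np.radians(phi_deg)
            # rotate horizontals into the trial fast/slow frame and undo the delay
            fast = np.cos(phi) * north + np.sin(phi) * east
            slow = -np.sin(phi) * north + np.cos(phi) * east
            slow = np.roll(slow, -dt_samples)   # simplification: a window should be cut
            # rotate back to N/E, then to radial/transverse via the backazimuth
            n = np.cos(phi) * fast - np.sin(phi) * slow
            e = np.sin(phi) * fast + np.cos(phi) * slow
            alpha = np.radians(baz_deg)
            transverse = -np.sin(alpha) * n + np.cos(alpha) * e
            return float(np.sum(transverse ** 2))

        def splitting_grid_search(north, east, baz_deg, fs, max_dt=4.0):
            best = (np.inf, None, None)
            for phi in range(-90, 90):                  # fast axis, 1 degree steps
                for k in range(int(max_dt * fs)):       # delay in samples
                    energy = transverse_energy(north, east, baz_deg, phi, k)
                    if energy < best[0]:
                        best = (energy, phi, k / fs)
            return best[1], best[2]   # fast-axis angle (deg), delay time (s)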

  2. ON DIFFERENTIAL EQUATIONS, INTEGRABLE SYSTEMS, AND GEOMETRY

    OpenAIRE

    Enrique Gonzalo Reyes Garcia

    2004-01-01

    Partial differential equations appeared in the 18th century as essential tools for the analytic study of physical models and, later, they proved to be fundamental for the progress of mathematics. For example, fundamental results of modern differential geometry are based on deep theorems on differential equations. Reciprocally, it is possible to study differential equations through geometrical means just as it was done by o...

  3. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    Science.gov (United States)

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and a specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure as well as flow cytometric approaches have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as cytotoxicity parameter, as well as for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility of analysing 24 slides within 65 h by automatic analysis over the weekend and the high reproducibility of the results make automatic image processing a powerful tool for micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  4. Modular playware as a playful diagnosis tool for autistic children

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop

    2009-01-01

    children. Using artificial neural networks for automatic classification of the individual construction practices, we may compare this classification with the diagnosis of the children, and possibly obtain a supplementary diagnosis tool which is based on the autistic children's free play with the modular...

  5. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    International Nuclear Information System (INIS)

    Vega, J.; Murari, A.; Ratta, G.A.; Gonzalez, S.; Dormido-Canto, S.

    2010-01-01

    Nowadays, processing all information of a fusion database is a much more important issue than acquiring more data. Although typically fusion devices produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time consuming and unreliable activity. The development of automatic methods to create specialized databases ensures first, the reduction of human efforts to identify and locate physical events, second, the standardization of criteria (reducing the vulnerability to human errors) and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (that can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.

  6. Numerical solution of three-dimensional magnetic differential equations

    International Nuclear Information System (INIS)

    Reiman, A.H.; Greenside, H.S.

    1987-02-01

    A computer code is described that solves differential equations of the form B·∇f = h for a single-valued solution f, given a toroidal three-dimensional divergence-free field B and a single-valued function h. The code uses a new algorithm that Fourier decomposes a given function in a set of flux coordinates in which the field lines are straight. The algorithm automatically adjusts the required integration lengths to compensate for proximity to low order rational surfaces. Applying this algorithm to the Cartesian coordinates defines a transformation to magnetic coordinates, in which the magnetic differential equation can be accurately solved. Our method is illustrated by calculating the Pfirsch-Schlueter currents for a stellarator
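
    The heart of the method can be stated compactly: in straight-field-line coordinates (θ, ζ) with rotational transform ι, the operator B·∇ acts on a Fourier mode exp(i(mθ − nζ)) as multiplication by i(mι − n), up to a positive flux function that is set to 1 in the sketch below, so each coefficient of f follows by division, with resonances at exactly the low-order rational surfaces the abstract mentions. This is a hedged Python illustration of that division, not the code described in the record.

        import numpy as np

        def solve_magnetic_de(h_mn, iota, modes):
            """Solve B·∇f = h mode-by-mode on one flux surface.
            Assumes straight-field-line coordinates, the contravariant toroidal
            field normalized to 1, a vanishing (0, 0) mode of h (solvability),
            and iota != n/m for every retained mode (no exact resonance)."""
            f_mn = {}
            for (m, n), h in zip(modes, h_mn):
                if m == 0 and n == 0:
                    f_mn[(m, n)] = 0.0   # free constant of integration; set to zero
                    continue
                # blows up as m*iota -> n: the low-order rational surfaces near
                # which the record's code adjusts its integration lengths
                f_mn[(m, n)] = h / (1j * (m * iota - n))
            return f_mn

        # toy surface with iota = 0.42 and three retained modes of h
        modes = [(1, 0), (2, 1), (3, 1)]
        print(solve_magnetic_de([0.30, 0.10, 0.05], 0.42, modes))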

  7. "NeuroStem Chip": a novel highly specialized tool to study neural differentiation pathways in human stem cells

    Directory of Open Access Journals (Sweden)

    Li Jia-Yi

    2007-02-01

    Background: Human stem cells are viewed as a possible source of neurons for a cell-based therapy of neurodegenerative disorders, such as Parkinson's disease. Several protocols that generate different types of neurons from human stem cells (hSCs) have been developed. Nevertheless, the cellular mechanisms that underlie the development of neurons in vitro as they are subjected to the specific differentiation protocols are often poorly understood. Results: We have designed a focused DNA (oligonucleotide)-based large-scale microarray platform (named "NeuroStem Chip") and used it to study gene expression patterns in hSCs as they differentiate into neurons. We have selected genes that are relevant to cells (i) being stem cells, (ii) becoming neurons, and (iii) being neurons. The NeuroStem Chip has over 1,300 pre-selected gene targets and multiple controls spotted in quadruplicates (~46,000 spots total). In this study, we present the NeuroStem Chip in detail and describe the special advantages it offers to the fields of experimental neurology and stem cell biology. To illustrate the utility of the NeuroStem Chip platform, we have characterized an undifferentiated population of pluripotent human embryonic stem cells (hESCs, cell line SA02). In addition, we have performed a comparative gene expression analysis of those cells versus a heterogeneous population of hESC-derived cells committed towards the neuronal/dopaminergic differentiation pathway by co-culturing with PA6 stromal cells for 16 days and containing a few tyrosine hydroxylase-positive dopaminergic neurons. Conclusion: We characterized the gene expression profiles of undifferentiated and dopaminergic lineage-committed hESC-derived cells using a highly focused custom microarray platform (NeuroStem Chip) that can become an important research tool in human stem cell biology. We propose that the areas of application for the NeuroStem microarray platform could be the following: (i) characterization of the

  8. Segmentation of Extrapulmonary Tuberculosis Infection Using Modified Automatic Seeded Region Growing

    Directory of Open Access Journals (Sweden)

    Nordin Abdul

    2009-01-01

    In the image segmentation process of positron emission tomography combined with computed tomography (PET/CT) imaging, previous works used information in CT only for segmenting the image, without utilizing the information that can be provided by PET. This paper proposes to utilize the hot spot values in PET to guide the segmentation in CT in automatic image segmentation using the seeded region growing (SRG) technique. This automatic segmentation routine can be used as part of automatic diagnostic tools. In addition to the original initial seed selection using hot spot values in PET, this paper also introduces a new SRG growing criterion, the sliding windows. Fourteen images of patients having extrapulmonary tuberculosis were examined using the above-mentioned method. To evaluate the performance of the modified SRG, three fidelity criteria are measured: percentage of under-segmentation area, percentage of over-segmentation area, and average time consumption. In terms of the under-segmentation percentage, SRG with the averaging growing criterion shows the least error percentage (51.85%). Meanwhile, SRG with local averaging and variance yielded the best results (2.67%) for the over-segmentation percentage. In terms of time complexity, the modified SRG with the local averaging and variance growing criterion shows the best performance, with 5.273 s average execution time. The results indicate that the proposed methods yield fairly good performance in terms of the over- and under-segmentation area. The results also demonstrated that the hot spot values in PET can be used to guide the automatic segmentation in CT images.
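
    A minimal version of SRG with the plain averaging criterion is easy to state in Python; the PET hot spot supplies the seed, as in the record. The tolerance value and the 4-connected neighbourhood are illustrative choices, and the record's refined criteria (local averaging and variance, sliding windows) would replace the simple mean test below.

        import numpy as np
        from collections import deque

        def seeded_region_growing(img, seed, tol=30.0):
            """Grow a 4-connected region while |pixel - running region mean| <= tol."""
            h, w = img.shape
            mask = np.zeros((h, w), dtype=bool)
            mask[seed] = True
            total, count = float(img[seed]), 1
            queue = deque([seed])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                        if abs(float(img[ny, nx]) - total / count) <= tol:
                            mask[ny, nx] = True
                            total += float(img[ny, nx])
                            count += 1
                            queue.append((ny, nx))
            return mask

        # synthetic PET-like image: the hottest voxel seeds the growth
        pet = np.random.default_rng(2).normal(50.0, 5.0, (64, 64))
        pet[30:34, 30:34] += 100.0
        seed = np.unravel_index(np.argmax(pet), pet.shape)
        print(seeded_region_growing(pet, seed).sum(), "pixels in region")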

  9. Differential algebras without differentials: An easy C++ implementation

    International Nuclear Information System (INIS)

    Michelotti, L.

    1989-03-01

    Automated differentiation can be motivated and explained rather plainly without any reference to infinitesimals or differentials whatsoever. We shall describe one possible approach in this paper. The method which we shall use will suggest its own implementation. However, FORTRAN is not the most natural language in which to carry it out. In the second section we shall describe an almost trivial implementation using C++. (Indeed, one of the motivations for writing this paper is to persuade militant FORTRAN extremists to invest the four or five days necessary to learn this powerful and easy language.) Take heed, however, that what we describe below is only a stripped-down implementation, written in three days, of differential algebra's most essential features; it is not as robust as and does not contain the battery of tools available in Berz's DA package, the product of a significant amount of work. 10 refs

  10. Differential algebras without differentials: An easy C++ implementation

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, L.

    1989-03-01

    Automated differentiation can be motivated and explained rather plainly without any reference to infinitesimals or differentials whatsoever. We shall describe one possible approach in this paper. The method which we shall use will suggest its own implementation. However, FORTRAN is not the most natural language in which to carry it out. In the second section we shall describe an almost trivial implementation using C++. (Indeed, one of the motivations for writing this paper is to persuade militant FORTRAN extremists to invest the four or five days necessary to learn this powerful and easy language.) Take heed, however, that what we describe below is only a stripped-down implementation, written in three days, of differential algebra's most essential features; it is not as robust as and does not contain the battery of tools available in Berz's DA package, the product of a significant amount of work. 10 refs.
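
    The operator-overloading idea in these two records carries over directly to any language with overloadable arithmetic. A minimal sketch in Python rather than C++: a value carries its derivative along, and each overloaded operator applies the chain rule, so no symbolic manipulation or finite differencing is ever performed.

        import math

        class Dual:
            """Forward-mode AD pair: val holds f(x), der holds f'(x)."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der

            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__

            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                # product rule: (fg)' = f'g + fg'
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def sin(x):
            # chain rule for an elementary function
            return Dual(math.sin(x.val), math.cos(x.val) * x.der)

        # d/dx [x*sin(x) + 3x] at x = 2: seed der = 1, read the derivative off
        x = Dual(2.0, 1.0)
        y = x * sin(x) + 3 * x
        print(y.val, y.der)   # f(2) and f'(2) = sin(2) + 2*cos(2) + 3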

  11. Automatic and manual segmentation of healthy retinas using high-definition optical coherence tomography.

    Science.gov (United States)

    Golbaz, Isabelle; Ahlers, Christian; Goesseringer, Nina; Stock, Geraldine; Geitzenauer, Wolfgang; Prünte, Christian; Schmidt-Erfurth, Ursula Margarethe

    2011-03-01

    This study compared automatic and manual segmentation modalities in the retina of healthy eyes using high-definition optical coherence tomography (HD-OCT). Twenty retinas in 20 healthy individuals were examined using an HD-OCT system (Carl Zeiss Meditec, Inc.). Three-dimensional imaging was performed with an axial resolution of 6 μm at a maximum scanning speed of 25,000 A-scans/second. Volumes of 6 × 6 × 2 mm were scanned. Scans were analysed using a MATLAB-based algorithm and a manual segmentation software system (3D-Doctor). The volume values calculated by the two methods were compared. Statistical analysis revealed a high correlation between automatic and manual modes of segmentation. The automatic mode of measuring retinal volume and the corresponding three-dimensional images provided similar results to the manual segmentation procedure. Both methods were able to visualize retinal and subretinal features accurately. This study compared two methods of assessing retinal volume using HD-OCT scans in healthy retinas. Both methods were able to provide realistic volumetric data when applied to raster scan sets. Manual segmentation methods represent an adequate tool with which to control automated processes and to identify clinically relevant structures, whereas automatic procedures will be needed to obtain data in larger patient populations. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.

  12. On detection and automatic tracking of butt weld line in thin wall pipe welding by a mobile robot with visual sensor

    International Nuclear Information System (INIS)

    Suga, Yasuo; Ishii, Hideaki; Muto, Akifumi

    1992-01-01

    An automatic pipe welding mobile robot system with a visual sensor was constructed. The robot can move along a pipe and detect the weld line to be welded by the visual sensor. Moreover, in order to perform automatic welding, the welding torch can track the butt weld line of the pipes at a constant speed by rotating the robot head. The main results obtained are summarized as follows: 1) Using proper lighting fixed in front of the CCD camera, the butt weld line of thin wall pipes can be recognized stably. In this case, the root gap should be approximately 0.5 mm. 2) In order to detect the weld line stably while moving along the pipe, the brightness distribution measured by the CCD camera should be smoothed and differentiated; the weld line is then determined from the maximum and minimum values of the derivative. 3) By means of the basic robot system with a visual sensor controlled by a personal computer, the detection and in-process automatic tracking of a weld line are possible. The average tracking error was approximately 0.2 mm, the maximum error 0.5 mm, and the welding speed was held at a constant value with an error of about 0.1 cm/min. (author)
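
    Result 2) amounts to a one-dimensional edge detector. A small numpy sketch under stated assumptions (a single dark gap in an otherwise bright profile; smoothing window chosen arbitrarily) shows the smooth-differentiate-extrema recipe:

        import numpy as np

        def locate_weld_line(brightness, window=9):
            """Locate a butt weld line in a 1-D CCD brightness profile: smooth,
            differentiate, and take the midpoint between the derivative extrema
            (the dark gap gives one falling and one rising flank)."""
            kernel = np.ones(window) / window
            smooth = np.convolve(brightness, kernel, mode="same")  # moving average
            deriv = np.gradient(smooth)
            falling, rising = np.argmin(deriv), np.argmax(deriv)
            return (falling + rising) / 2.0

        # synthetic profile: bright pipe surface with a narrow dark gap near pixel 120
        profile = np.full(256, 200.0)
        profile[118:123] = 40.0
        profile += np.random.default_rng(3).normal(0.0, 2.0, 256)
        print(locate_weld_line(profile))   # ~120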

  13. Differential forms theory and practice

    CERN Document Server

    Weintraub, Steven H

    2014-01-01

    Differential forms are utilized as a mathematical technique to help students, researchers, and engineers analyze and interpret problems where abstract spaces and structures are concerned, and when questions of shape, size, and relative positions are involved. Differential Forms has gained high recognition in the mathematical and scientific community as a powerful computational tool in solving research problems and simplifying very abstract problems through mathematical analysis on a computer. Differential Forms, 2nd Edition, is a solid resource for students and professionals needing a solid g

  14. Using CASE tools to write engineering specifications

    Science.gov (United States)

    Henry, James E.; Howard, Robert W.; Iveland, Scott T.

    1993-08-01

    There are always a wide variety of obstacles to writing and maintaining engineering documentation. To combat these problems, documentation generation can be linked to the process of engineering development. The same graphics and communication tools used for structured system analysis and design (SSA/SSD) also form the basis for the documentation. The goal is to build a living document, such that as an engineering design changes, the documentation will `automatically' revise. `Automatic' is qualified by the need to maintain textual descriptions associated with the SSA/SSD graphics, and the need to generate new documents. This paper describes a methodology and a computer aided system engineering toolset that enables a relatively seamless transition into document generation for the development engineering team.

  15. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    Science.gov (United States)

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
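
    The default interval described here is reproducible in a few lines. The sketch below implements the DerSimonian-Laird estimate of between-study variance and the Knapp-Hartung-adjusted confidence interval from per-study effect sizes and variances; it is a plain re-derivation from the published formulas, not Meta-Essentials code, and the input numbers are invented.

        import numpy as np
        from scipy import stats

        def random_effects_meta(y, v):
            """DerSimonian-Laird tau^2 with a Knapp-Hartung-adjusted 95% CI."""
            y, v = np.asarray(y, float), np.asarray(v, float)
            k = len(y)
            w = 1.0 / v                                   # fixed-effect weights
            mu_fe = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (k - 1)) / c)            # DerSimonian-Laird estimator
            w_re = 1.0 / (v + tau2)                       # random-effects weights
            mu = np.sum(w_re * y) / np.sum(w_re)
            # Knapp-Hartung: weighted residual variance with a t-based interval
            var_kh = np.sum(w_re * (y - mu) ** 2) / ((k - 1) * np.sum(w_re))
            half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
            return mu, (mu - half, mu + half), tau2

        effects = [0.30, 0.15, 0.45, 0.20, 0.55]          # per-study effect sizes
        variances = [0.02, 0.03, 0.025, 0.04, 0.03]
        print(random_effects_meta(effects, variances))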

  16. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on its impact to the task of PPI extraction and it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.

  17. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  18. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology...... is used for checking the consistency of a design with respect to the availablity of services and resources. In the second application, a tool for automatically implementing the communication infrastructure of a process network application, the Service Relation Model is used for analyzing the capabilities...

  19. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    Science.gov (United States)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly as this affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing since these affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and the residual stresses in machining are absolutely necessary. In this work, a two-dimensional Finite Element model using an implicit Lagrangian formulation with an automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate such model the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses on the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The obtained results permit to conclude that in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced and machining with honed tools having large cutting edge radii produce better results than chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  20. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  1. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    Science.gov (United States)

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Criticality in cell differentiation

    Indian Academy of Sciences (India)

    Indrani Bose

    2017-11-09

    Nov 9, 2017 ... Differentiation is mostly based on binary decisions with the progenitor cells ..... accounts for the dominant part of the remaining variation ... significant loss in information. ..... making in vitro: emerging concepts and novel tools.

  3. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his...... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers...... by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  4. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    International Nuclear Information System (INIS)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya

    1999-02-01

    Development of operation support system such as automatic operating system and anomaly diagnosis systems of nuclear reactor is very important in practical nuclear ship because of a limited number of operators and severe conditions in which receiving support from others in a case of accident is very difficult. The goal of development of the operation support systems is to realize the perfect automatic control system in a series of normal operation from the reactor start-up to the shutdown. The automatic control system for the normal operation has been developed based on operating experiences of the first Japanese nuclear ship 'Mutsu'. Automation technique was verified by 'Mutsu' plant data at manual operation. Fully automatic control of start-up and shutdown operations was achieved by setting the desired value of operation and the limiting value of parameter fluctuation, and by making the operation program of the principal equipment such as the main coolant pump and the heaters. This report presents the automatic operation system developed for the start-up and the shutdown of reactor and the verification of the system using the Nuclear Ship Engineering Simulator System. (author)

  5. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  6. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  7. Utilization of a genetic algorithm for the automatic detection of oil spill from RADARSAT-2 SAR satellite data

    International Nuclear Information System (INIS)

    Marghany, Maged

    2014-01-01

    Highlights: • An oil platform located 70 km from the coast of Louisiana sank on Thursday. • Oil spill has backscatter values of −25 dB in RADARSAT-2 SAR. • Oil spill is portrayed in SCNB mode by shallower incidence angle. • Ideal detection of oil spills in SAR images requires moderate wind speeds. • Genetic algorithm is excellent tool for automatic detection of oil spill in RADARSAT-2 SAR data. - Abstract: In this work, a genetic algorithm is applied for the automatic detection of oil spills. The procedure is implemented using sequences from RADARSAT-2 SAR ScanSAR Narrow single-beam data acquired in the Gulf of Mexico. The study demonstrates that the implementation of crossover allows for the generation of an accurate oil spill pattern. This conclusion is confirmed by the receiver-operating characteristic (ROC) curve. The ROC curve indicates that the existence of oil slick footprints can be identified using the area between the ROC curve and the no-discrimination line of 90%, which is greater than that of other surrounding environmental features. In conclusion, the genetic algorithm can be used as a tool for the automatic detection of oil spills, and the ScanSAR Narrow single-beam mode serves as an excellent sensor for oil spill detection and survey
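
    The record credits crossover with producing an accurate spill pattern. A minimal binary genetic algorithm with tournament selection, single-point crossover and bit-flip mutation is sketched below; the fitness function, a toy agreement count against a hypothetical 32-pixel slick mask, stands in for the SAR-based fitness the paper actually uses.

        import random

        def genetic_algorithm(fitness, n_bits=32, pop_size=40, generations=100,
                              crossover_rate=0.9, mutation_rate=0.01, seed=42):
            """Minimal binary GA: tournament selection, single-point crossover,
            bit-flip mutation."""
            rng = random.Random(seed)
            pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
            for _ in range(generations):
                nxt = []
                while len(nxt) < pop_size:
                    # tournament selection of two parents
                    p1 = max(rng.sample(pop, 3), key=fitness)
                    p2 = max(rng.sample(pop, 3), key=fitness)
                    c1, c2 = p1[:], p2[:]
                    if rng.random() < crossover_rate:        # single-point crossover
                        cut = rng.randrange(1, n_bits)
                        c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
                    for child in (c1, c2):                   # bit-flip mutation
                        for i in range(n_bits):
                            if rng.random() < mutation_rate:
                                child[i] ^= 1
                        nxt.append(child)
                pop = nxt[:pop_size]
            return max(pop, key=fitness)

        # toy fitness: agreement with a hypothetical 32-pixel "dark slick" mask
        target = [1] * 8 + [0] * 24
        best = genetic_algorithm(lambda ind: sum(a == b for a, b in zip(ind, target)))
        print(sum(a == b for a, b in zip(best, target)), "/ 32 pixels matched")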

  8. Volatile fraction composition and physicochemical parameters as tools for the differentiation of lemon blossom honey and orange blossom honey.

    Science.gov (United States)

    Kadar, Melinda; Juan-Borrás, Marisol; Carot, Jose M; Domenech, Eva; Escriche, Isabel

    2011-12-01

    Volatile fraction profile and physicochemical parameters were studied with the aim of evaluating their effectiveness for the differentiation between lemon blossom honey (Citrus limon L.) and orange blossom honey (Citrus spp.). They would be useful complementary tools to the traditional analysis based on the percentage of pollen. A stepwise discriminant analysis constructed using 37 volatile compounds (extracted by purge and trap and analysed by gas chromatography-mass spectrometry), and physicochemical and colour parameters (diastase, conductivity, Pfund colour and CIE L a b) together provided a model that permitted the correct classification of 98.3% of the original and 96.6% of the cross-validated cases, indicating its efficiency and robustness. This model proved its effectiveness in the differentiation of both types of honey with another set of batches from the following year. This model, developed from the volatile compounds, physicochemical and colour parameters, has been useful for the differentiation of lemon and orange blossom honeys. Furthermore, it may be of particular interest for the attainment of a suitable classification of orange honey in which the pollen count is very low. These capabilities imply an evident marketing advantage for the beekeeping sector, since lemon blossom honey could be commercialized as unifloral honey and not as generic citrus honey and orange blossom honey could be correctly characterized. Copyright © 2011 Society of Chemical Industry.

  9. Quantum κ-deformed differential geometry and field theory

    Science.gov (United States)

    Mercati, Flavio

    2016-03-01

    I introduce in κ-Minkowski noncommutative spacetime the basic tools of quantum differential geometry, namely bicovariant differential calculus, Lie and inner derivatives, the integral, the Hodge-∗ and the metric. I show the relevance of these tools for field theory with an application to complex scalar field, for which I am able to identify a vector-valued four-form which generalizes the energy-momentum tensor. Its closedness is proved, expressing in a covariant form the conservation of energy-momentum.

  10. Evaluating the TESTAR tool in an industrial case study

    NARCIS (Netherlands)

    Bauersfeld, S; Vos, T.; Condori-Fernandez, O.N.; Bagnato, A; Brosse, E.; Morisio, M; Dybå, T; Torchiano, M

    2014-01-01

    [Context] Automated test case design and execution at the GUI level of applications is not yet established in industrial practice. Tests are still mainly designed and executed manually. In previous work we described TESTAR, a tool which allows setting up fully automatic testing at the GUI level of

  11. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  12. On a Use Case Points Measurement Tool for Effective Project Management

    OpenAIRE

    Inoue, Katsuro; Kusumoto, Shinji; Tsuda, Michio

    2007-01-01

    The use case point (UCP) method has been proposed to estimate software development effort in the early phases of a software project, and it is used in many software organizations. This paper briefly describes an automatic use case measurement tool, called U-EST.

  13. A new clinical tool for the quantification of myocardial CT perfusion imaging in patients with suspected Ischemic Heart Disease

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz Muñoz, A.; Dux-Santoy Hurtado, L.; Rodriguez Palomares, J.L.; Piella Fenoy, G.

    2016-07-01

    In clinical practice, the evaluation of myocardial perfusion using Computed Tomography (CT) imaging is usually performed visually or semi-quantitatively. The scarcity of quantitative perfusion data does not always allow a proper diagnosis of patients suspected of suffering from diseases such as Ischemic Heart Disease (IHD). In this work, a clinical tool for the automatic quantification of myocardial perfusion in patients with suspected IHD is proposed. Myocardial perfusion is assessed based on a combined diagnosis protocol (CT/CTP protocol) which involves the acquisition of two contrast-enhanced CT images, one obtained at rest and another acquired under pharmacological stress. The clinical tool allows the automatic quantification of perfusion in different myocardial segments defined according to the 16-AHA-segmentation model of the left ventricle, by providing the mean of Hounsfield Units in those regions. Based on this analysis, clinicians can compare the values at baseline and at hyperemia, and can better determine hypoperfusion defects in patients with IHD. The validation of the clinical tool was performed by comparing automatic and manual perfusion measurements of 10 patients with suspected IHD who were previously assessed with Single Photon Emission Computed Tomography (SPECT) for perfusion analysis. A strong linear correlation was found between the automatic and manual results. Afterwards, perfusion defects obtained from the CT/CTP protocol were compared to perfusion defects from SPECT, to assess the applicability of this clinical tool for the diagnosis of IHD. (Author)
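
    The quantification step reduces to a mean of Hounsfield Units per labelled segment. A hedged numpy sketch, with synthetic volumes and an invented label map standing in for a real 16-AHA segmentation:

        import numpy as np

        def mean_hu_per_segment(ct_volume, segment_labels, n_segments=16):
            """Mean Hounsfield Units in each AHA segment (label 0 = background)."""
            return {s: float(ct_volume[segment_labels == s].mean())
                    for s in range(1, n_segments + 1)}

        rng = np.random.default_rng(4)
        labels = rng.integers(0, 17, size=(32, 32, 32))           # stand-in segmentation
        rest = rng.normal(80.0, 10.0, size=(32, 32, 32))          # rest acquisition (HU)
        stress = rest + rng.normal(20.0, 5.0, size=(32, 32, 32))  # hyperemia raises enhancement
        rest_mu = mean_hu_per_segment(rest, labels)
        stress_mu = mean_hu_per_segment(stress, labels)
        # a segment whose stress value fails to rise would suggest hypoperfusion
        delta = {s: stress_mu[s] - rest_mu[s] for s in rest_mu}
        print(delta)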

  14. Automatic definition of prescription isodose for stereotactic irradiations of arteriovenous malformations

    International Nuclear Information System (INIS)

    Dejean, C.; Lefkopoulos, D.; Foulquier, J.N.; Schlienger, M.; Touboul, E.

    2001-01-01

    To evaluate the dosimetric consequences of automatically defining the prescription isodose based on lesion coverage. A clinical series of 124 arteriovenous malformations was analysed. Plan quality was quantified by the standard deviation of the differential dose-volume histogram calculated in the lesion. We define two quantitative protocols for the automatic definition of the prescription isodose based on lesion coverage: a volumetric definition of coverage (90% of lesion volume), and an isodose-based definition proposed by RTOG (prescription isodose equals the minimum isodose in the lesion divided by 0.9). We evaluated the plans obtained for these two protocols, calculating several dose-volume indices. These indices are presented as a function of the dose-volume histogram standard deviation in order to quantify the consequences of their variations for this representative series of plans. The margin our team tolerates is such that the sum of the under-dosed lesion and overdosed healthy tissue factors remains lower than one. The protocol based on volumetric coverage gives results situated within this margin. The protocol based on the RTOG definition produces conformation indices that can be greater than 1. The absolute dose would be decided taking into account the examined dose-volume indices and clinical data. A protocol for automatic definition of the prescription isodose using volumetric lesion coverage seems better adapted to arteriovenous malformation conformal plans in stereotactic conditions, given the variations observed in the overdosage of healthy tissues. (authors)
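
    Both protocols can be written down directly from a lesion's dose distribution. A numpy sketch with invented voxel doses: the volumetric rule is the 90%-coverage quantile of the relative dose, and the RTOG-style rule divides the minimum relative isodose in the lesion by 0.9 (which is exactly why it can exceed 1, as the record notes).

        import numpy as np

        def prescription_isodose_volumetric(dose_in_lesion, coverage=0.90):
            # highest relative isodose level still covering `coverage` of the lesion:
            # the (1 - coverage)-quantile of dose/dose_max over lesion voxels
            rel = np.asarray(dose_in_lesion) / np.max(dose_in_lesion)
            return float(np.quantile(rel, 1.0 - coverage))

        def prescription_isodose_rtog(dose_in_lesion):
            # the record's RTOG rule: minimum isodose in the lesion divided by 0.9
            rel = np.asarray(dose_in_lesion) / np.max(dose_in_lesion)
            return float(rel.min() / 0.9)

        lesion_dose = np.random.default_rng(5).uniform(18.0, 25.0, 5000)  # Gy, toy voxels
        print(prescription_isodose_volumetric(lesion_dose))
        print(prescription_isodose_rtog(lesion_dose))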

  15. Long range manipulator development and experiments with dismantling tools

    International Nuclear Information System (INIS)

    Mueller, K.

    1993-01-01

    An existing handling system (EMIR) was used as a carrier system for various tools for concrete dismantling and radiation protection monitoring. It combined the advantages of long reach and high payload with highly dexterous kinematics. This system was enhanced mechanically to allow the use of different tools. Tool attachment devices for automatic tool exchange were investigated as well as interfaces (electric, hydraulic, compressed air, cooling water and signals). The control system was improved with regard to accuracy and sensor data processing. Programmable logic controller functions for tool control were incorporated. A free field mockup of the EMIR was build that allowed close simulation of dismantling scenarios without radioactive inventory. Aged concrete was provided for the integration tests. The development scheduled included the basic concept investigation; the development of tools and sensors; the EMIR hardware enhancement including a tool exchange; the adaption of tools and mockup and the final evaluation of the system during experiments

  16. Clinical score to differentiate scrub typhus and dengue: A tool to differentiate scrub typhus and dengue

    Directory of Open Access Journals (Sweden)

    Shubhanker Mitra

    2017-01-01

    Background: Dengue and scrub typhus share similar clinical and epidemiological features, and are difficult to differentiate at initial presentation. Many places are endemic to both these infections, where they comprise the majority of acute undifferentiated febrile illnesses. Materials and Methods: We aimed to develop a score that can differentiate scrub typhus from dengue. In this cross-sectional study, 188 cases of scrub typhus and 201 cases of dengue infection who presented to the emergency department or medicine outpatient clinic from September 2012 to April 2013 were included. Univariate followed by multivariate logistic regression analysis was performed to identify clinical features and laboratory results that were significantly different between the two groups. Each variable was assigned scores based on the strength of association, and the receiver operating characteristic area under the curve (ROC-AUC) was generated and compared. Six scoring models were explored to ascertain the model with the best fit. Results: Model 2 was developed using the following six variables: oxygen saturation (>90%, ≤90%), total white blood cell count (<7000, ≥7000 cells/cumm), hemoglobin (≤14 and >14 g/dL), total bilirubin (<200 and ≥200 IU/dL), and altered sensorium (present or absent). Each variable was assigned scores based on its strength of association. The ROC-AUC curve (95% confidence interval) for model 2 was 0.84 (0.79–0.89). At the cut-off score of 13, the sensitivity and specificity were 85% and 77%, respectively, with a higher score favoring dengue. Conclusion: In areas with a high burden of scrub typhus and dengue, model 2 (the “clinical score to differentiate scrub typhus and dengue fever”) is a simple and rapid clinical scoring system that may be used to differentiate scrub typhus and dengue at initial presentation.

  17. Clinical Score to Differentiate Scrub Typhus and Dengue: A Tool to Differentiate Scrub Typhus and Dengue.

    Science.gov (United States)

    Mitra, Shubhanker; Gautam, Ira; Jambugulam, Mohan; Abhilash, Kundavaram Paul Prabhakar; Jayaseeelan, Vishalakshi

    2017-01-01

    Dengue and scrub typhus share similar clinical and epidemiological features, and are difficult to differentiate at initial presentation. Many places are endemic to both these infections, where they comprise the majority of acute undifferentiated febrile illnesses. We aimed to develop a score that can differentiate scrub typhus from dengue. In this cross-sectional study, 188 cases of scrub typhus and 201 cases of dengue infection who presented to the emergency department or medicine outpatient clinic from September 2012 to April 2013 were included. Univariate followed by multivariate logistic regression analysis was performed to identify clinical features and laboratory results that were significantly different between the two groups. Each variable was assigned scores based on the strength of association, and the receiver operating characteristic area under the curve (ROC-AUC) was generated and compared. Six scoring models were explored to ascertain the model with the best fit. Model 2 was developed using the following six variables: oxygen saturation (>90%, ≤90%), total white blood cell count (<7000 and ≥7000 cells/cumm), hemoglobin (≤14 and >14 g/dL), total bilirubin (<200 and ≥200 IU/dL), and altered sensorium (present or absent). Each variable was assigned scores based on its strength of association. The AUC-ROC curve (95% confidence interval) for model 2 was 0.84 (0.79-0.89). At the cut-off score of 13, the sensitivity and specificity were 85% and 77%, respectively, with a higher score favoring dengue. In areas of high burden of ST and dengue, model 2 (the "clinical score to differentiate scrub typhus and dengue fever") is a simple and rapid clinical scoring system that may be used to differentiate scrub typhus and dengue at initial presentation.
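    To illustrate how such an additive score is applied in practice, the sketch below implements a scoring function of the same shape in Python. Only the variables legible in the record and the cut-off of 13 come from the abstract; the integer point weights are invented placeholders, not the published values.

        # Hypothetical re-implementation of an additive score in the style of
        # "model 2". The point weights are placeholders, NOT the published
        # values; only the variables and the cut-off of 13 follow the record.
        CUTOFF = 13  # scores >= CUTOFF favour dengue, lower favours scrub typhus

        def dengue_vs_scrub_score(spo2_pct, wbc_per_cumm, hb_g_dl,
                                  bilirubin_iu_dl, altered_sensorium):
            score = 0
            score += 3 if spo2_pct > 90 else 0          # preserved oxygenation
            score += 4 if wbc_per_cumm < 7000 else 0    # leukopenia
            score += 4 if hb_g_dl > 14 else 0           # haemoconcentration
            score += 3 if bilirubin_iu_dl < 200 else 0
            score += 3 if not altered_sensorium else 0
            return score

        if __name__ == "__main__":
            s = dengue_vs_scrub_score(96, 4500, 15.2, 120, altered_sensorium=False)
            print(s, "dengue-like" if s >= CUTOFF else "scrub-typhus-like")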

  18. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    Science.gov (United States)

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. The test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  19. Automatic Genre Classification of Musical Signals

    Science.gov (United States)

    Barbedo, Jayme Garcia Arnal; Lopes, Amauri

    2006-12-01

    We present a strategy to perform automatic genre classification of musical signals. The technique divides the signals into 21.3-millisecond frames, from which 4 features are extracted. The values of each feature are treated over 1-second analysis segments. Some statistical results of the features along each analysis segment are used to determine a vector of summary features that characterizes the respective segment. Next, a classification procedure uses those vectors to differentiate between genres. The classification procedure has two main characteristics: (1) a very wide and deep taxonomy, which allows a very meticulous comparison between different genres, and (2) a wide pairwise comparison of genres, which emphasizes the differences between each pair of genres. The procedure points out the genre that best fits the characteristics of each segment. The final classification of the signal is given by the genre that appears most often across all signal segments. The approach has shown very good accuracy even for the lowest layers of the hierarchical structure.
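    A minimal sketch of the frame, segment, and majority-vote pipeline described above, assuming a mono signal as a NumPy array. The two features used here (RMS energy and zero-crossing rate) are stand-ins for the paper's four unspecified features, and the per-segment classifier is left abstract.

        import numpy as np

        def frame_features(signal, sr, frame_ms=21.3):
            """Split a mono signal into ~21.3 ms frames and extract simple
            features (RMS energy, zero-crossing rate)."""
            n = max(1, int(sr * frame_ms / 1000))
            frames = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
            feats = []
            for f in frames:
                rms = np.sqrt(np.mean(f.astype(float) ** 2))
                zcr = np.mean(np.abs(np.diff(np.sign(f)))) / 2
                feats.append((rms, zcr))
            return np.array(feats)

        def summarize_segments(feats, frames_per_segment):
            """Mean/std of each feature over 1 s segments -> summary vectors.
            For 21.3 ms frames, frames_per_segment is roughly 47."""
            segs = [feats[i:i + frames_per_segment]
                    for i in range(0, len(feats), frames_per_segment)]
            return np.array([np.concatenate([s.mean(0), s.std(0)])
                             for s in segs if len(s)])

        def classify_signal(summary_vectors, segment_classifier):
            """Label each segment, then take the majority vote over segments."""
            votes = [segment_classifier(v) for v in summary_vectors]
            return max(set(votes), key=votes.count)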

  20. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    Science.gov (United States)

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed a Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment consisted of a patient monitor, an anaesthesia machine, and a bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, a regional oximeter, and a rapid infusion device were added as required. The automatic recording option was used exclusively, and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) of cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.

  1. Reuse Tools to Support ADA Instantiation Construction

    Science.gov (United States)

    1990-06-01

    specification and body with embedded task shell instantiations, as well as an inter-task coordination procedure which controls task activation, execution, and … which is the root of a frame hierarchy. The specification frame controls the hierarchy's composition of the program and stores all its custom …

  2. Development of a Graphical Tool to integrate the Prometheus AEOlus methodology and Jason Platform

    Directory of Open Access Journals (Sweden)

    Rafhael CUNHA

    2017-07-01

    Full Text Available Software Engineering (SE) is an area that intends to build high-quality software in a systematic way. However, traditional software engineering techniques and methods do not support the demand for developing Multiagent Systems (MAS). Therefore, a new subarea has been studied, called Agent Oriented Software Engineering (AOSE). The AOSE area proposes solutions to issues related to the development of agent oriented systems. There is still no standardization in this subarea, resulting in several methodologies. Another issue in this subarea is that very few tools are able to automatically generate code. In this work we propose a tool to support the Prometheus AEOlus Methodology, because it provides modelling artifacts for all MAS dimensions: agents, environment, interaction, and organization. The tool supports all Prometheus AEOlus artifacts and can automatically generate code for the agent and interaction dimensions in the AgentSpeak language, which is the language used in the Jason Platform. We have validated the proposed tool, and a case study is presented.

  3. Automatic lung segmentation in functional SPECT images using active shape models trained on reference lung shapes from CT.

    Science.gov (United States)

    Cheimariotis, Grigorios-Aris; Al-Mashat, Mariam; Haris, Kostas; Aletras, Anthony H; Jögi, Jonas; Bajc, Marika; Maglaveras, Nicolaos; Heiberg, Einar

    2018-02-01

    Image segmentation is an essential step in quantifying the extent of reduced or absent lung function. The aim of this study is to develop and validate a new tool for automatic segmentation of lungs in ventilation and perfusion SPECT images and to compare automatic and manual SPECT lung segmentations with reference computed tomography (CT) volumes. A total of 77 subjects (69 patients with obstructive lung disease, and 8 subjects without apparent perfusion or ventilation loss) underwent low-dose CT followed by ventilation/perfusion (V/P) SPECT examination in a hybrid gamma camera system. In the training phase, lung shapes from the 57 anatomical low-dose CT images were used to construct two active shape models (right lung and left lung), which were then used for image segmentation. The algorithm was validated in 20 patients by comparing its results to reference delineations of the corresponding CT images, and by comparing automatic segmentations to manual delineations in SPECT images. The Dice coefficients between automatic and manual SPECT delineations were 0.83 ± 0.04 for the right and 0.82 ± 0.05 for the left lung. There was a statistically significant difference between reference volumes from CT and automatic delineations for the right (R = 0.53, p = 0.02) and left lung (R = 0.69, p < …). The method allows automatic quantification of a wide range of measurements.
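    The Dice coefficient used above for the SPECT-versus-manual comparison has a compact definition; a generic implementation for binary masks might look like this (the function name is our own):

        import numpy as np

        def dice_coefficient(mask_a, mask_b):
            """Dice similarity of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
            a = np.asarray(mask_a, dtype=bool)
            b = np.asarray(mask_b, dtype=bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Example: two overlapping 1-D "volumes" -> 2*2 / (3+3) = 0.667
        print(dice_coefficient([1, 1, 1, 0], [0, 1, 1, 1]))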

  4. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automatic system has brought many revolutions in the existing technologies. One among the technologies, which has greater developments, is the solar powered automatic shrimp feeding system. For instance, the solar power which is a renewable energy can be an alternative solution to energy crisis and basically reducing man power by using it in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems on manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10 hour timer to be set in intervals preferred by the user and will undergo a continuous process. The magnetic contactor acts as a switch connected to the 10 hour timer which controls the activation or termination of electrical loads and powered by means of a solar panel outputting electrical power, and a rechargeable battery in electrical communication with the solar panel for storing the power. By undergoing through series of testing, the components of the modified system were proven functional and were operating within the desired output. It was recommended that the timer to be used should be tested to avoid malfunction and achieve the fully automatic system and that the system may be improved to handle changes in scope of the project.

  5. TIE POINTS EXTRACTION FOR SAR IMAGES BASED ON DIFFERENTIAL CONSTRAINTS

    Directory of Open Access Journals (Sweden)

    X. Xiong

    2018-04-01

    Full Text Available Automatically extracting tie points (TPs) on large-size synthetic aperture radar (SAR) images is still challenging because the efficiency and correct ratio of the image matching need to be improved. This paper proposes an automatic TP extraction method based on differential constraints for large-size SAR images obtained from approximately parallel tracks, between which the relative geometric distortions are small in the azimuth direction and large in the range direction. Image pyramids are built first, and then corresponding layers of the pyramids are matched from the top to the bottom. In the process, similarity is measured by the normalized cross correlation (NCC) algorithm, calculated from a rectangular window with the long side parallel to the azimuth direction. False matches are removed by the differential constrained random sample consensus (DC-RANSAC) algorithm, which applies strong constraints in the azimuth direction and weak constraints in the range direction. Matching points in the lower pyramid images are predicted with a local bilinear transformation model in the range direction. Experiments performed on ENVISAT ASAR and Chinese airborne SAR images validated the efficiency, correct ratio and accuracy of the proposed method.
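    To make the matching step concrete, here is a sketch of NCC template matching with a rectangular window and an asymmetric search area (tight in azimuth, loose in range), mirroring the differential constraints above. It assumes image rows run along azimuth; the window and search sizes are arbitrary illustrative values, not the paper's.

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation of two equal-shape patches."""
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return (a * b).sum() / denom if denom else 0.0

        def match_point(master, slave, row, col, win=(31, 9), search=(2, 20)):
            """Find the slave-image position maximizing NCC around (row, col).
            win = (azimuth, range) window, long side along azimuth;
            search = (azimuth, range) half-widths: tight in azimuth,
            loose in range."""
            ha, hr = win[0] // 2, win[1] // 2
            tpl = master[row - ha:row + ha + 1, col - hr:col + hr + 1]
            best_score, best_rc = -2.0, None
            for da in range(-search[0], search[0] + 1):
                for dr in range(-search[1], search[1] + 1):
                    r, c = row + da, col + dr
                    if r - ha < 0 or c - hr < 0:
                        continue  # skip windows that fall off the image
                    cand = slave[r - ha:r + ha + 1, c - hr:c + hr + 1]
                    if cand.shape != tpl.shape:
                        continue
                    score = ncc(tpl, cand)
                    if score > best_score:
                        best_score, best_rc = score, (r, c)
            return best_rc, best_score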

  6. [Linked Data as a tool in the nutrition domain].

    Science.gov (United States)

    Míguez Pérez, R; Santos Gago, J M; Alonso Rorís, V M; Álvarez Sabucedo, L M; Mikic Fonte, F A

    2012-01-01

    Currently, there is a huge amount of information available on the Internet that can neither be interpreted nor used by software agents. This fact poses a serious drawback to the potential of tools that deal with data on the current Web. Nevertheless, recent advances in the Semantic Web domain make possible the development of a new generation of smart applications capable of creating added-value services for the final user. This work shows the technical challenges that must be faced in the area of nutrition in order to transform one or several old-fashioned sources of raw data into a web repository based on semantic technologies and linked with external, publicly available data on the Internet. This approach makes it possible for automatic tools to operate on top of this information, providing new functionalities of great interest in the domain of public health, such as the automatic generation of menus for children or intelligent dietetic assistants, among others. This article explains the process of creating such an information repository applying the guidelines of the Linked Data initiative, and provides insights into the use of tools to make the most of this technology for its adoption in related use cases and environments.

  7. Automatic sentence extraction for the detection of scientific paper relations

    Science.gov (United States)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnections between scientific papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, the performance improvements achieved to date, and the tools or data typically used in research in specific fields. So far, methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, with paper relations identified based on citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
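    A toy version of the extraction step: split the text into sentences and keep those containing citation markers. The sentence splitter and citation patterns below are illustrative only, not the authors' actual rules.

        import re

        SENT_SPLIT = re.compile(r"(?<=[.!?])\s+")
        # Numeric markers like [3] or [3, 12], and author-year markers
        # like (Smith et al., 2016) -- simplified illustrative patterns.
        CITATION = re.compile(
            r"\[\d+(?:\s*,\s*\d+)*\]"
            r"|\([A-Z][A-Za-z-]+(?: et al\.)?,\s*\d{4}\)")

        def extract_citation_sentences(text):
            return [s for s in SENT_SPLIT.split(text) if CITATION.search(s)]

        text = ("Deep parsers are accurate [3]. They are slow, however. "
                "Recent work (Smith et al., 2016) improves speed.")
        print(extract_citation_sentences(text))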

  8. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  9. Automatic delineation of brain regions on MRI and PET images from the pig.

    Science.gov (United States)

    Villadsen, Jonas; Hansen, Hanne D; Jørgensen, Louise M; Keller, Sune H; Andersen, Flemming L; Petersen, Ida N; Knudsen, Gitte M; Svarer, Claus

    2018-01-15

    The increasing use of the pig as a research model in neuroimaging requires standardized processing tools. For example, extraction of regional dynamic time series from brain PET images requires parcellation procedures that benefit from being automated. Manual inter-modality spatial normalization to an MRI atlas is operator-dependent, time-consuming, and can be inaccurate when cortical radiotracer binding or skull uptake is lacking. We present a parcellated PET template that allows automatic spatial normalization to PET images of any radiotracer. MRI and [11C]Cimbi-36 PET scans obtained in sixteen pigs formed the basis for the atlas. The high-resolution MRI scans allowed creation of an accurately averaged MRI template. By aligning the within-subject PET scans to their MRI counterparts, an averaged PET template was created in the same space. We developed an automatic procedure for spatial normalization of the averaged PET template to new PET images, and hereby facilitated transfer of the atlas regional parcellation. Evaluation of the automatic spatial normalization procedure found the median voxel displacement to be 0.22 ± 0.08 mm using the MRI template with individual MRI images and 0.92 ± 0.26 mm using the PET template with individual [11C]Cimbi-36 PET images. We tested the automatic procedure by assessing eleven PET radiotracers with different kinetics and spatial distributions, using perfusion-weighted images of early PET time frames. We here present an automatic procedure for accurate and reproducible spatial normalization and parcellation of pig PET images of any radiotracer with reasonable blood-brain barrier penetration. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Derivation of groundwater flow-paths based on semi-automatic extraction of lineaments from remote sensing data

    OpenAIRE

    U. Mallast; R. Gloaguen; S. Geyer; T. Rödiger; C. Siebert

    2011-01-01

    In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxili...

  11. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. But for industrial applications, a tool for fast data analysis is also required, and it should be easy to handle. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Based on experience, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument which covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  12. EINSTEIN - Expert system for an Intelligent Supply of Thermal Energy in Industry. Audit methodology and software tool

    Energy Technology Data Exchange (ETDEWEB)

    Schweiger, Hans; Danov, Stoyan (energyXperts.NET (Spain)); Vannoni, Claudia; Facci, Enrico (Sapienza Univ. of Rome, Dept. of Mechanics and Aeronautics, Rome (Italy)); Brunner, Christoph; Slawitsch, Bettina (Joanneum Research, Inst. of Sustainable Techniques and Systems - JOINTS, Graz (Austria))

    2009-07-01

    For optimising thermal energy supply in industry, a holistic, integral approach is required that includes possibilities of demand reduction by heat recovery and process integration, and an intelligent combination of efficient heat and cold supply technologies. EINSTEIN is a tool-kit for fast and high-quality thermal energy audits in industry, composed of an audit guide describing the methodology and a software tool that guides the auditor through all the audit steps. The main features of EINSTEIN are: (1) a basic questionnaire helps with the systematic collection of the necessary information, with the possibility of acquiring data remotely; (2) special tools allow for fast consistency checking and estimation of missing data, so that first predictions can be made even with very few data; (3) the data processing is based on standardised models for industrial processes and industrial heat supply systems; (4) semi-automation: the software tool supports decision making for the generation of alternative heat and cold supply proposals, automatically carries out all the necessary calculations, including dynamic simulation of the heat supply system, and creates a standard audit report. The software tool includes modules for benchmarking, automatic design of heat exchanger networks, and design assistants for the heat and cold supply system. The core of the expert system software tool is available for free, as an open source software project. This type of software development has proven to be very efficient for the dissemination of knowledge and for continuous maintenance and improvement thanks to user contributions.

  13. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  14. Decomposition analysis of differential dose volume histograms

    International Nuclear Information System (INIS)

    Heuvel, Frank van den

    2006-01-01

    Dose volume histograms are a common tool to assess the value of a treatment plan for various forms of radiation therapy treatment. The purpose of this work is to introduce, validate, and apply a set of tools to analyze differential dose volume histograms by decomposing them into physically and clinically meaningful normal distributions. A weighted sum of the decomposed normal distributions (e.g., weighted dose) is proposed as a new measure of target dose, rather than the more unstable point dose. The method and its theory are presented and validated using simulated distributions. Additional validation is performed by analyzing simple four-field box techniques encompassing a predefined target, using different treatment energies inside a water phantom. Furthermore, two clinical situations are analyzed using this methodology to illustrate practical usefulness. A treatment plan for a breast patient using a tangential field setup with wedges is compared to a comparable geometry using dose compensators. Finally, a normal tissue complication probability (NTCP) calculation is refined using this decomposition. The NTCP calculation is performed on a liver as organ at risk in a treatment of a mesothelioma patient with involvement of the right lung. The comparison of the wedged breast treatment versus the compensator technique yields comparable classical dose parameters (e.g., conformity index ≅1 and equal dose at the ICRU dose point). The methodology proposed here shows a 4% difference in weighted dose, outlining the difference in treatment using a single parameter instead of at least two in a classical analysis (e.g., mean dose and maximal dose, or total dose variance). NTCP calculations for the mesothelioma case are generated automatically and show a 3% decrease with respect to the classical calculation. The decrease is slightly dependent on the fractionation and on the α/β value utilized. In conclusion, this method is able to distinguish clinically …
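    As a sketch of the decomposition idea: fit the differential DVH with a weighted sum of normal distributions and report the weighted dose as the weight-averaged component means. This uses a two-component least-squares fit with SciPy; the original method's fitting details are not specified in the record, so treat this as one plausible realisation.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_gaussians(d, w1, mu1, s1, w2, mu2, s2):
            """Weighted sum of two normals: one decomposition candidate for a
            differential DVH (dose on x, voxel fraction on y)."""
            g = lambda mu, s: (np.exp(-0.5 * ((d - mu) / s) ** 2)
                               / (s * np.sqrt(2 * np.pi)))
            return w1 * g(mu1, s1) + w2 * g(mu2, s2)

        def decompose_dvh(dose_bins, ddvh, p0):
            params, _ = curve_fit(two_gaussians, dose_bins, ddvh, p0=p0)
            w1, mu1, _, w2, mu2, _ = params
            weighted_dose = (w1 * mu1 + w2 * mu2) / (w1 + w2)
            return params, weighted_dose

        # Example on a synthetic two-peak differential DVH
        d = np.linspace(0, 60, 200)
        y = two_gaussians(d, 0.7, 30, 3, 0.3, 45, 5)
        params, wdose = decompose_dvh(d, y, p0=[0.5, 25, 2, 0.5, 50, 4])
        print(wdose)  # close to 0.7*30 + 0.3*45 = 34.5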

  15. Particle swarm optimization applied to automatic lens design

    Science.gov (United States)

    Qin, Hua

    2011-06-01

    This paper describes a novel application of the Particle Swarm Optimization (PSO) technique to lens design. A mathematical model is constructed, and merit functions of an optical system are employed as fitness functions, combining the radii of curvature, the thicknesses between lens surfaces, and the refractive indices of the optical system. Using this function, aberration correction is carried out. A design example using PSO is given. Results show that PSO is a practical and powerful optical design tool; the method no longer depends on the initial lens structure and can create arbitrary search ranges for the structural parameters of a lens system, which is an important step towards automatic design with artificial intelligence.
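    A generic global-best PSO of the kind applied here, minimizing a stand-in merit function. In an actual lens design the parameter vector would hold curvatures, thicknesses and refractive indices, and the merit function would evaluate aberrations via ray tracing; both are abstracted away in this sketch.

        import numpy as np

        rng = np.random.default_rng(0)

        def pso(merit, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            """Minimize merit() over box bounds with basic global-best PSO."""
            lo, hi = bounds
            x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
            v = np.zeros_like(x)
            pbest = x.copy()
            pbest_val = np.array([merit(p) for p in x])
            g = pbest[pbest_val.argmin()].copy()
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)        # keep particles inside bounds
                val = np.array([merit(p) for p in x])
                improved = val < pbest_val
                pbest[improved], pbest_val[improved] = x[improved], val[improved]
                g = pbest[pbest_val.argmin()].copy()
            return g, pbest_val.min()

        # Example: a toy quadratic "merit function" with minimum at all ones
        best, best_val = pso(lambda p: ((p - 1.0) ** 2).sum(),
                             (np.full(5, -3.0), np.full(5, 3.0)))
        print(best, best_val)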

  16. Colour transformations and K-means segmentation for automatic cloud detection

    Directory of Open Access Journals (Sweden)

    Martin Blazek

    2015-08-01

    Full Text Available The main aim of this work is to find simple criteria for automatic recognition of several meteorological phenomena using optical digital sensors (e.g., wide-field cameras, automatic DSLR cameras or robotic telescopes). The output of those sensors is commonly represented in RGB channels containing information about both colour and luminosity, even when normalised. Transformation into other colour spaces (e.g., CIE 1931 xyz, CIE L*a*b*, YCbCr) can separate colour from luminosity, which is especially useful in the image processing of automatic cloud boundary recognition. Different colour transformations provide different sectorizations of cloudy images. Hence, the analysed meteorological phenomena (cloud types, clear sky) project differently into the colour diagrams of each international colour system. In such diagrams, statistical tools can be applied in search of criteria which could distinguish a clear sky from a covered one and possibly even perform a meteorological classification of cloud types. For the purpose of this work, a database of sky images (both clear and cloudy), with emphasis on a variety of different observation conditions (e.g., time, altitude, solar angle), was acquired. The effectiveness of several colour transformations for meteorological application is discussed, and the representation of different clouds (or clear sky) in those colour systems is analysed. Utilisation of this algorithm would be useful in all-sky surveys, supplementary meteorological observations, solar cell effectiveness predictions or daytime astronomical solar observations.
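    A compact sketch of the chroma-clustering idea: convert RGB to YCbCr with the standard BT.601 coefficients, discard luma, and run K-means on the chroma plane. The two-cluster choice and the reading of clusters as cloud versus clear sky are illustrative assumptions, not the paper's exact criteria.

        import numpy as np
        from sklearn.cluster import KMeans

        def rgb_to_cbcr(img):
            """Separate chroma from luma via RGB -> YCbCr (BT.601),
            keeping only (Cb, Cr) so clustering ignores brightness."""
            r = img[..., 0].astype(float)
            g = img[..., 1].astype(float)
            b = img[..., 2].astype(float)
            cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
            cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
            return np.stack([cb, cr], axis=-1)

        def segment_sky(img, n_clusters=2):
            """Cluster chroma values, e.g. into 'cloud' vs 'clear sky'."""
            chroma = rgb_to_cbcr(img).reshape(-1, 2)
            labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(chroma)
            return labels.reshape(img.shape[:2])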

  17. MixtureTree annotator: a program for automatic colorization and visual annotation of MixtureTree.

    Directory of Open Access Journals (Sweden)

    Shu-Chuan Chen

    Full Text Available The MixtureTree Annotator, written in JAVA, allows the user to automatically color any phylogenetic tree in Newick format generated from any phylogeny reconstruction program and to output the Nexus file. By providing the ability to automatically color the tree by sequence name, the MixtureTree Annotator provides a unique advantage over other programs which perform a similar function. In addition, the MixtureTree Annotator is the only package that can efficiently annotate the output produced by MixtureTree with mutation information and coalescent time information. In order to visualize the resulting output file, a modified version of FigTree is used. Certain popular tools which lack good built-in visualization, for example MEGA, Mesquite, PHY-FI, TreeView, treeGraph and Geneious, may give results with human errors due to either manually adding colors to each node or other limitations, for example coloring only based on a number, such as branch length, or by taxonomy. In addition to allowing the user to automatically color any given Newick tree by sequence name, the MixtureTree Annotator is the only method that allows the user to automatically annotate the resulting tree created by the MixtureTree program. The MixtureTree Annotator is fast and easy-to-use, while still allowing the user full control over the coloring and annotating process.

  18. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). We present a recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of automatic document summarization are not recent. Powerful algorithms have been developed …

  19. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    … on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled "AUTOMATIC ULTRASOUND SCANNING". The key goals of the project have been to develop automated techniques to minimize the unnecessary settings on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image …

  20. Differential geometry bundles, connections, metrics and curvature

    CERN Document Server

    Taubes, Clifford Henry

    2011-01-01

    Bundles, connections, metrics and curvature are the 'lingua franca' of modern differential geometry and theoretical physics. This book will supply a graduate student in mathematics or theoretical physics with the fundamentals of these objects. Many of the tools used in differential topology are introduced, and the basic results about differentiable manifolds, smooth maps, differential forms, vector fields, Lie groups, and Grassmannians are all presented here. Other material covered includes the basic theorems about geodesics and Jacobi fields, the classification theorem for flat connections, the …

  1. Monitoring tool usage in surgery videos using boosted convolutional and recurrent neural networks.

    Science.gov (United States)

    Al Hajj, Hassan; Lamard, Mathieu; Conze, Pierre-Henri; Cochener, Béatrice; Quellec, Gwenolé

    2018-05-09

    This paper investigates the automatic monitoring of tool usage during a surgery, with potential applications in report generation, surgical training and real-time decision support. Two surgeries are considered: cataract surgery, the most common surgical procedure, and cholecystectomy, one of the most common digestive surgeries. Tool usage is monitored in videos recorded either through a microscope (cataract surgery) or an endoscope (cholecystectomy). Following state-of-the-art video analysis solutions, each frame of the video is analyzed by convolutional neural networks (CNNs) whose outputs are fed to recurrent neural networks (RNNs) in order to take temporal relationships between events into account. The novelty lies in the way those CNNs and RNNs are trained. Computational complexity prevents the end-to-end training of "CNN+RNN" systems. Therefore, CNNs are usually trained first, independently from the RNNs. This approach is clearly suboptimal for surgical tool analysis: many tools are very similar to one another, but they can generally be differentiated based on past events. CNNs should be trained to extract the most useful visual features in combination with the temporal context. A novel boosting strategy is proposed to achieve this goal: the CNN and RNN parts of the system are simultaneously enriched by progressively adding weak classifiers (either CNNs or RNNs) trained to improve the overall classification accuracy. Experiments were performed on a dataset of 50 cataract surgery videos, where the usage of 21 surgical tools was manually annotated, and a dataset of 80 cholecystectomy videos, where the usage of 7 tools was manually annotated. Very good classification performance is achieved in both datasets: tool usage could be labeled with an average area under the ROC curve of Az = 0.9961 and Az = 0.9939, respectively, in offline mode (using past, present and future information), and Az = 0.9957 and Az = 0.9936, respectively, in online mode (using past and present information).

  2. Automatic coordination of protection devices in distribution system

    International Nuclear Information System (INIS)

    Comassetto, L.; Bernardon, D.P.; Canha, L.N.; Abaide, A.R.

    2008-01-01

    Among the several components of distribution systems, protection devices are of fundamental importance, since they aim at keeping the physical integrity not only of the system equipment, but also of the electricians' teams and the population in general. The tools existing on the market today for protection studies basically draw curves and need direct user intervention for protection device adjustment and selectivity coordination analyses, making them susceptible to user error and not always yielding the best technical and economical application. In Brazil, the correct application of protection devices demands a great deal of time and is extremely laborious due to the large number of devices (around 200), the very dynamic behaviour of distribution networks, and the need for constant system expansion. This article presents a computational tool developed with the objective of automatically determining the adjustments of all protection devices in distribution networks to obtain the best technical application, optimizing their performance and making protection studies easier. (author)

  3. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    30 Mineral Resources 1 (2010-07-01): § 77.314, Underground Coal Mines, Thermal Dryers. Automatic temperature control instruments. (a) Automatic temperature control instruments for a thermal dryer system shall be of the recording type. (b) Automatic …

  4. Automatic multi-cycle reload design of pressurized water reactor using particle swarm optimization algorithm and local search

    International Nuclear Information System (INIS)

    Lin, Chaung; Hung, Shao-Chun

    2013-01-01

    Highlights: • An automatic multi-cycle core reload design tool, which searches the fresh fuel assembly composition, is developed. • The search method adopts particle swarm optimization and local search. • The design objectives are to achieve the required cycle energy and minimum fuel cost, and to satisfy the constraints. • The constraints include the hot zero power moderator temperature coefficient and the hot channel factor. - Abstract: An automatic multi-cycle core reload design tool, which searches the fresh fuel assembly composition, is developed using particle swarm optimization and local search. The local search uses heuristic rules to change the current search result slightly so that the result can be improved. The composition of the fresh fuel assemblies should provide the required cycle energy and satisfy the constraints, such as the hot zero power moderator temperature coefficient and the hot channel factor. Instead of designing a loading pattern for each fuel assembly composition during the search process, two fixed loading patterns are used to calculate the core status, and the better fitness function value is used in the search process. The fitness function contains terms which reflect the design objectives, such as cycle energy, constraints, and fuel cost. The results show that the developed tool can achieve the desired objectives.

  5. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state-space analysis of electric systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  6. A deep convolutional neural network-based automatic delineation strategy for multiple brain metastases stereotactic radiosurgery.

    Directory of Open Access Journals (Sweden)

    Yan Liu

    Full Text Available Accurate and automatic brain metastases target delineation is a key step for efficient and effective stereotactic radiosurgery (SRS) treatment planning. In this work, we developed a deep learning convolutional neural network (CNN) algorithm for segmenting brain metastases on contrast-enhanced T1-weighted magnetic resonance imaging (MRI) datasets. We integrated the CNN-based algorithm into an automatic brain metastases segmentation workflow and validated it on both Multimodal Brain Tumor Image Segmentation challenge (BRATS) data and clinical patients' data. Validation on BRATS data yielded average DICE coefficients (DCs) of 0.75±0.07 in the tumor core and 0.81±0.04 in the enhancing tumor, which outperformed most techniques in the 2015 BRATS challenge. Segmentation results on patient cases showed an average DC of 0.67±0.03 and achieved an area under the receiver operating characteristic curve of 0.98±0.01. The developed automatic segmentation strategy surpasses current benchmark levels and offers a promising tool for SRS treatment planning for multiple brain metastases.

  7. Asymptotic behavior of second-order impulsive differential equations

    Directory of Open Access Journals (Sweden)

    Haifeng Liu

    2011-02-01

    Full Text Available In this article, we study the asymptotic behavior of all solutions of second-order nonlinear delay differential equations with impulses. Our main tools are impulsive differential inequalities and the Riccati transformation. We illustrate the results with an example.
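    For readers unfamiliar with the technique, the classical Riccati substitution, shown here for the unperturbed second-order equation, works as follows; this is a standard sketch, not the paper's exact setup (the impulsive version adds jump conditions at the impulse times).

        % Riccati transformation for x''(t) + q(t) x(t) = 0 (impulse-free case):
        \[
          w(t) := \frac{x'(t)}{x(t)}
          \quad\Longrightarrow\quad
          w'(t) = \frac{x''(t)}{x(t)} - \left(\frac{x'(t)}{x(t)}\right)^{2}
                = -q(t) - w^{2}(t),
        \]
        % so a nonoscillatory solution x yields a global solution of the Riccati
        % equation w' + w^2 + q(t) = 0 on a half-line; oscillation and asymptotic
        % criteria follow by showing no such global solution can exist.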

  8. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing results obtained with the Nuclear Ship Engineering Simulation System (NESSY) to those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  9. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing results obtained with the Nuclear Ship Engineering Simulation System (NESSY) to those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  10. Fourier Transform Infrared Spectroscopy (FTIR) as a Tool for the Identification and Differentiation of Pathogenic Bacteria.

    Science.gov (United States)

    Zarnowiec, Paulina; Lechowicz, Łukasz; Czerwonka, Grzegorz; Kaca, Wiesław

    2015-01-01

    Methods of human bacterial pathogen identification need to be fast, reliable, inexpensive, and time-efficient. These requirements may be met by vibrational spectroscopic techniques. The method most often used for bacterial detection and identification is Fourier transform infrared spectroscopy (FTIR). It enables biochemical scans of whole bacterial cells or parts thereof at infrared frequencies (4,000-600 cm(-1)). The recorded spectra must subsequently be transformed in order to minimize data variability and to amplify the chemically based spectral differences, facilitating spectrum interpretation and analysis. In the next step, the transformed spectra are analyzed by data reduction tools, regression techniques, and classification methods. Chemometric analysis of FTIR spectra is a basic technique for discriminating between bacteria at the genus, species, and clonal levels. Examples of bacterial pathogen identification and methods of differentiation up to the clonal level, based on infrared spectroscopy, are presented below.

  11. Introduction to differential equations

    CERN Document Server

    Taylor, Michael E

    2011-01-01

    The mathematical formulations of problems in physics, economics, biology, and other sciences are usually embodied in differential equations. The analysis of the resulting equations then provides new insight into the original problems. This book describes the tools for performing that analysis. The first chapter treats single differential equations, emphasizing linear and nonlinear first order equations, linear second order equations, and a class of nonlinear second order equations arising from Newton's laws. The first order linear theory starts with a self-contained presentation of the exponen…

  12. Automatic Shape Control of Triangular B-Splines of Arbitrary Topology

    Institute of Scientific and Technical Information of China (English)

    Ying He; Xian-Feng Gu; Hong Qin

    2006-01-01

    Triangular B-splines are powerful and flexible in modeling a broader class of geometric objects defined over arbitrary, non-rectangular domains. Despite their great potential and advantages in theory, practical techniques and computational tools for triangular B-splines are less developed. This is mainly because users have to handle a large number of irregularly distributed control points over an arbitrary triangulation. In this paper, an automatic and efficient method is proposed to generate visually pleasing, high-quality triangular B-splines of arbitrary topology. The experimental results on several real datasets show that triangular B-splines are powerful and effective in both theory and practice.

  13. Automatic exchange unit for control rod drive device

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To enable automatic restart and continuation of operation, without an external power-interruption remedy device, when the interrupted power source is recovered during automatic positioning operation. Constitution: In an automatic exchange unit for a control rod drive device of the type controlled by setting the deviation between the positioning target position and the present position of the device to zero, the position data of the drive device and the positioning target value are automatically read, and an operation-inhibit interlock is applied to the control system until the data reading is completed and the conditions for starting or restarting automatic operation are sequentially confirmed. After the confirmation, the interlock is released to start the automatic operation or reoperation. Accordingly, the automatic operation can be safely restarted and continued. (Yoshihara, H.)

  14. A survey on the automatic object tracking technology using video signals

    International Nuclear Information System (INIS)

    Lee, Jae Cheol; Jun, Hyeong Seop; Choi, Yu Rak; Kim, Jae Hee

    2003-01-01

    Recently, automatic identification and tracking of objects have been actively studied, following the rapid development of signal processing and vision technology based on improved hardware and software. Object tracking technology can be applied to various fields such as road monitoring of vehicles, weather satellites, traffic observation, intelligent remote video-conferencing and autonomous mobile robots. An object tracking system receives successive pictures from the camera and detects the motion of objects in these pictures. In this report, we investigate various object tracking techniques, such as brightness change using histogram characteristics, differential image analysis, and contour and feature extraction, and try to find suitable methods that can actually be applied to mobile robots.

  15. An automatic rat brain extraction method based on a deformable surface model.

    Science.gov (United States)

    Li, Jiehua; Liu, Xiaofeng; Zhuo, Jiachen; Gullapalli, Rao P; Zara, Jason M

    2013-08-15

    The extraction of the brain from the skull in medical images is a necessary first step before image registration or segmentation. While pre-clinical MR imaging studies on small animals, such as rats, are increasing, fully automatic image processing techniques specific to small animal studies remain lacking. In this paper, we present an automatic rat brain extraction method, the Rat Brain Deformable model method (RBD), which adapts the popular human brain extraction tool (BET) through the incorporation of information on the brain geometry and MR image characteristics of the rat brain. The robustness of the method was demonstrated on T2-weighted MR images of 64 rats and compared with other brain extraction methods (BET, PCNN, PCNN-3D). The results demonstrate that RBD reliably extracts the rat brain with high accuracy (>92% volume overlap) and is robust against signal inhomogeneity in the images. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. An Exploitability Analysis Technique for Binary Vulnerability Based on Automatic Exception Suppression

    Directory of Open Access Journals (Sweden)

    Zhiyuan Jiang

    2018-01-01

    Full Text Available To quickly verify and fix vulnerabilities, it is necessary to judge the exploitability of the massive number of crashes generated by automated vulnerability mining tools. While the current manual analysis of the crash process is inefficient and time-consuming, existing automated tools can only handle execute exceptions and some write exceptions, but cannot handle common read exceptions. To address this problem, we propose a method of determining exploitability based on exception type suppression. This method enables the program to continue to execute until an exploitable exception is triggered. The method performs a symbolic replay of the crash sample, constructing and reusing data gadgets to bypass the complex exception, thereby improving the efficiency and accuracy of vulnerability exploitability analysis. Testing on typical CGC/RHG binary software shows that this method can automatically convert a crash that cannot be judged by existing analysis tools into a different crash type and judge the exploitability successfully.

  17. The Influence of Facial Signals on the Automatic Imitation of Hand Actions.

    Science.gov (United States)

    Butler, Emily E; Ward, Robert; Ramsey, Richard

    2016-01-01

    Imitation and facial signals are fundamental social cues that guide interactions with others, but little is known regarding the relationship between these behaviors. It is clear that during expression detection, we imitate observed expressions by engaging similar facial muscles. It is proposed that a cognitive system, which matches observed and performed actions, controls imitation and contributes to emotion understanding. However, there is little known regarding the consequences of recognizing affective states for other forms of imitation, which are not inherently tied to the observed emotion. The current study investigated the hypothesis that facial cue valence would modulate automatic imitation of hand actions. To test this hypothesis, we paired different types of facial cue with an automatic imitation task. Experiments 1 and 2 demonstrated that a smile prompted greater automatic imitation than angry and neutral expressions. Additionally, a meta-analysis of this and previous studies suggests that both happy and angry expressions increase imitation compared to neutral expressions. By contrast, Experiments 3 and 4 demonstrated that invariant facial cues, which signal trait-levels of agreeableness, had no impact on imitation. Despite readily identifying trait-based facial signals, levels of agreeableness did not differentially modulate automatic imitation. Further, a Bayesian analysis showed that the null effect was between 2 and 5 times more likely than the experimental effect. Therefore, we show that imitation systems are more sensitive to prosocial facial signals that indicate "in the moment" states than enduring traits. These data support the view that a smile primes multiple forms of imitation including the copying actions that are not inherently affective. The influence of expression detection on wider forms of imitation may contribute to facilitating interactions between individuals, such as building rapport and affiliation.

  18. Solving Partial Differential Equations Using a New Differential Evolution Algorithm

    Directory of Open Access Journals (Sweden)

    Natee Panagant

    2014-01-01

    Full Text Available This paper proposes an alternative meshless approach to solve partial differential equations (PDEs). With a global approximate function being defined, a partial differential equation problem is converted into an optimisation problem with equality constraints from the PDE boundary conditions. An evolutionary algorithm (EA) is employed to search for the optimum solution. For this approach, the most difficult task is the low convergence rate of the EA, which consequently results in poor PDE solution approximation. However, its attractiveness remains due to the nature of a soft computing technique in EA. The algorithm can be used to tackle almost any kind of optimisation problem with simple evolutionary operations, which means it is mathematically simpler to use. A new efficient differential evolution (DE) is presented and used to solve a number of partial differential equations. The results obtained are illustrated and compared with exact solutions. It is shown that the proposed method has the potential to be a future meshless tool, provided that the search performance of EA is greatly enhanced.
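    To illustrate the conversion of a differential equation into an optimisation problem, the sketch below solves a toy boundary-value problem with a polynomial trial solution whose coefficients are found by a basic DE/rand/1 scheme. The paper's actual DE variant and penalty handling are not specified in the record; this is one plausible, simplified realisation.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy problem: u''(x) + u(x) = 0 on [0, pi/2], u(0)=0, u(pi/2)=1.
        # The exact solution is u(x) = sin(x).
        xs = np.linspace(0, np.pi / 2, 25)

        def fitness(a):
            """ODE residual at collocation points plus boundary penalties,
            for a polynomial trial solution u(x) = sum_k a_k x^k."""
            c = a[::-1]                         # highest-degree-first for numpy
            u = np.polyval(c, xs)
            d2 = np.polyval(np.polyder(c, 2), xs)
            res = np.mean((d2 + u) ** 2)
            bc = (np.polyval(c, 0.0)) ** 2 + (np.polyval(c, np.pi / 2) - 1.0) ** 2
            return res + 100.0 * bc             # penalty weight is arbitrary

        def differential_evolution(f, dim=6, pop=40, gens=400, F=0.7, CR=0.9):
            X = rng.uniform(-2, 2, (pop, dim))
            fX = np.array([f(x) for x in X])
            for _ in range(gens):
                for i in range(pop):
                    # DE/rand/1 mutation (a full DE would exclude index i here)
                    a, b, c = X[rng.choice(pop, 3, replace=False)]
                    mutant = a + F * (b - c)
                    cross = rng.random(dim) < CR
                    trial = np.where(cross, mutant, X[i])
                    ft = f(trial)
                    if ft < fX[i]:              # greedy one-to-one selection
                        X[i], fX[i] = trial, ft
            return X[fX.argmin()], fX.min()

        coeffs, err = differential_evolution(fitness)
        print(err, np.polyval(coeffs[::-1], np.pi / 4))  # should approach sin(pi/4)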

  19. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book tells of method of position determination and characteristic, control method of position determination and point of design, point of sensor choice for position detector, position determination of digital control system, application of clutch break in high frequency position determination, automation technique of position determination, position determination by electromagnetic clutch and break, air cylinder, cam and solenoid, stop position control of automatic guide vehicle, stacker crane and automatic transfer control.

  20. Automatic synthesis of sequential control schemes

    International Nuclear Information System (INIS)

    Klein, I.

    1993-01-01

    Of all hard- and software developed for industrial control purposes, the majority is devoted to sequential, or binary valued, control and only a minor part to classical linear control. Typically, the sequential parts of the controller are invoked during startup and shut-down to bring the system into its normal operating region and into some safe standby region, respectively. Despite its importance, fairly little theoretical research has been devoted to this area, and sequential control programs are therefore still created manually without much theoretical support for a systematic approach. We propose a method to create sequential control programs automatically. The main idea is to spend some effort off-line modelling the plant, and from this model generate the control strategy, that is, the plan. The plant is modelled using action structures, thereby concentrating on the actions instead of the states of the plant. In general the planning problem shows exponential complexity in the number of state variables. However, by focusing on the actions, we can identify problem classes as well as algorithms such that the planning complexity is reduced to polynomial complexity. We prove that these algorithms are sound, i.e., the generated solution will solve the stated problem, and complete, i.e., if the algorithms fail, then no solution exists. The algorithms generate a plan as a set of actions and a partial order on this set specifying the execution order. The generated plan is proven to be minimal and maximally parallel. For a larger class of problems we propose a method to split the original problem into a number of simple problems that can each be solved using one of the presented algorithms. It is also shown how a plan can be translated into a GRAFCET chart, and to illustrate these ideas we have implemented a planning tool, i.e., a system that is able to automatically create control schemes. Such a tool can of course also be used on-line if it is fast enough. This …

  1. Efficiently and easily integrating differential equations with JiTCODE, JiTCDDE, and JiTCSDE

    Science.gov (United States)

    Ansmann, Gerrit

    2018-04-01

    We present a family of Python modules for the numerical integration of ordinary, delay, or stochastic differential equations. The key features are that the user enters the derivative symbolically and it is just-in-time-compiled, allowing the user to efficiently integrate differential equations from a higher-level interpreted language. The presented modules are particularly suited for large systems of differential equations such as those used to describe dynamics on complex networks. Through the selected method of input, the presented modules also allow almost complete automatization of the process of estimating regular as well as transversal Lyapunov exponents for ordinary and delay differential equations. We conceptually discuss the modules' design, analyze their performance, and demonstrate their capabilities by application to timely problems.
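    For readers unfamiliar with the modules, a short sketch of the documented JiTCODE usage pattern follows (the Rössler system; API details may vary between versions):

```python
import numpy as np
from jitcode import jitcode, y

# Rössler system, entered symbolically; the derivative is compiled just in time
a, b, c = 0.2, 0.2, 5.7
f = [
    -y(1) - y(2),
    y(0) + a * y(1),
    b + y(2) * (y(0) - c),
]

ODE = jitcode(f)
ODE.set_integrator("dopri5")
ODE.set_initial_value([1.0, 2.0, 3.0], 0.0)

times = np.arange(0.0, 100.0, 0.1)
data = np.vstack([ODE.integrate(t) for t in times])
print(data.shape)   # one row per time point, one column per variable
```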

  2. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the basal ganglia (BG) in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  3. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  4. Wellhead bowl protector and retrieving tool

    International Nuclear Information System (INIS)

    Young, J.A.

    1991-01-01

    This patent describes an improvement in a wellhead protection system including a wear bushing and a retrieving tool. The improvement comprises a wear bushing supported within the wellhead, wherein the wear bushing includes an enlarged upper end having an external support shoulder for engagement with an internal support shoulder formed in the wellhead; wherein the wear bushing further includes an internal circumferential slot intersected by at least one vertically extending slot, the vertical slot extending from the circumferential slot to the upper end of the wear bushing; a retrieving tool having at least one outwardly biased, retractable lug member mounted thereon; and wherein the retrieving tool includes an enlarged portion adapted to be received within the enlarged upper end of the wear bushing. This patent also describes a method of retrieving a wear bushing from a wellhead comprising the steps of: lowering a retrieving tool into the wellhead for locking engagement with the wear bushing; aligning the retrieving tool with the wear bushing for automatically forcing lug members carried by the retrieving tool outwardly into locking engagement with the wear bushing; monitoring drill string weight for determining engagement of the retrieving tool with the wear bushing, wherein a substantial decrease in drill string weight is an indication that the retrieving tool is engaged with the wear bushing; and removing the wear bushing from the wellhead.

  5. Symbolic computation of analytic approximate solutions for nonlinear differential equations with initial conditions

    Science.gov (United States)

    Lin, Yezhi; Liu, Yinping; Li, Zhibin

    2012-01-01

    The Adomian decomposition method (ADM) is one of the most effective methods for constructing analytic approximate solutions of nonlinear differential equations. In this paper, based on the new definition of the Adomian polynomials, and the two-step Adomian decomposition method (TSADM) combined with the Padé technique, a new algorithm is proposed to construct accurate analytic approximations of nonlinear differential equations with initial conditions. Furthermore, a MAPLE package is developed, which is user-friendly and efficient. One only needs to input a system, initial conditions and several necessary parameters, then our package will automatically deliver analytic approximate solutions within a few seconds. Several different types of examples are given to illustrate the validity of the package. Our program provides a helpful and easy-to-use tool in science and engineering to deal with initial value problems.
    Program summary:
    Program title: NAPA
    Catalogue identifier: AEJZ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJZ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 4060
    No. of bytes in distributed program, including test data, etc.: 113 498
    Distribution format: tar.gz
    Programming language: MAPLE R13
    Computer: PC
    Operating system: Windows XP/7
    RAM: 2 Gbytes
    Classification: 4.3
    Nature of problem: Solve nonlinear differential equations with initial conditions.
    Solution method: Adomian decomposition method and Padé technique.
    Running time: Seconds at most in routine uses of the program. Special tasks may take up to some minutes.
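    The paper's new Adomian polynomial definition and MAPLE code are not reproduced in this record; purely as an illustration of the classical construction A_k = (1/k!) d^k/dλ^k N(Σ λ^i u_i) evaluated at λ = 0, here is a minimal SymPy sketch (Python rather than the authors' MAPLE):

```python
import sympy as sp

def adomian_polynomials(N, n):
    """First n classical Adomian polynomials of a nonlinearity N(u)."""
    lam = sp.Symbol('lambda')
    u = sp.symbols(f'u0:{n}')                       # series components u0, u1, ...
    series = sum(lam**i * u[i] for i in range(n))
    return [sp.expand(sp.diff(N(series), lam, k).subs(lam, 0) / sp.factorial(k))
            for k in range(n)]

# Quadratic nonlinearity N(u) = u**2:
print(adomian_polynomials(lambda s: s**2, 3))
# [u0**2, 2*u0*u1, 2*u0*u2 + u1**2]
```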

  6. Automatic learning of structural knowledge from geographic information for updating land cover maps

    OpenAIRE

    Bayoudh , Meriam; Roux , Emmanuel; Nock , Richard; Richard , G.

    2012-01-01

    International audience; The number of satellites and remote sensing sensors devoted to earth observation is becoming increasingly high, providing more and more data and especially images. At the same time, access to such data and to the tools to process them has improved considerably. In the presence of such a data flow - and given the necessity to follow up and predict environmental and societal changes in highly dynamic socio-environmental contexts - we need automatic image interpre...

  7. Differential Diagnosis Tool for Parkinsonian Syndrome Using Multiple Structural Brain Measures

    Directory of Open Access Journals (Sweden)

    Miho Ota

    2013-01-01

    Full Text Available Clinical differentiation of parkinsonian syndromes such as the Parkinson variant of multiple system atrophy (MSA-P) and the cerebellar subtype (MSA-C) from Parkinson's disease is difficult in the early stage of the disease. To identify the correlative pattern of brain changes for differentiating parkinsonian syndromes, we applied discriminant analysis techniques to magnetic resonance imaging (MRI). T1-weighted volume data and diffusion tensor images were obtained by MRI in eighteen patients with MSA-C, 12 patients with MSA-P, 21 patients with Parkinson's disease, and 21 healthy controls. They were evaluated using voxel-based morphometry and tract-based spatial statistics, respectively. Discriminant functions derived by stepwise methods resulted in correct classification rates of 0.89. When differentiating these diseases with the use of three independent variables together, the correct classification rate was the same as that obtained with stepwise methods. These findings support the view that each parkinsonian syndrome has structural deviations in multiple brain areas and that a combination of structural brain measures can help to distinguish parkinsonian syndromes.

  8. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

    A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, the χ² test, and sample counting. Output data are printed by the teletypewriter on standard continuous roll or multifold paper. Data are automatically corrected for background and counter efficiency.

  9. Automatic multi-modal MR tissue classification for the assessment of response to bevacizumab in patients with glioblastoma

    International Nuclear Information System (INIS)

    Liberman, Gilad; Louzoun, Yoram; Aizenstein, Orna; Blumenthal, Deborah T.; Bokstein, Felix; Palmon, Mika; Corn, Benjamin W.; Ben Bashat, Dafna

    2013-01-01

    Background: Current methods for evaluation of treatment response in glioblastoma are inaccurate, limited and time-consuming. This study aimed to develop a multi-modal MRI automatic classification method to improve accuracy and efficiency of treatment response assessment in patients with recurrent glioblastoma (GB). Materials and methods: A modification of the k-Nearest-Neighbors (kNN) classification method was developed and applied to 59 longitudinal MR data sets of 13 patients with recurrent GB undergoing bevacizumab (anti-angiogenic) therapy. Changes in the enhancing tumor volume were assessed using the proposed method and compared with Macdonald's criteria and with manual volumetric measurements. The edema-like area was further subclassified into peri- and non-peri-tumoral edema, using both the kNN method and an unsupervised method, to monitor longitudinal changes. Results: Automatic classification using the modified kNN method was applicable in all scans, even when the tumors were infiltrative with unclear borders. The enhancing tumor volume obtained using the automatic method was highly correlated with manual measurements (N = 33, r = 0.96, p < 0.0001), while standard radiographic assessment based on Macdonald's criteria matched manual delineation and automatic results in only 68% of cases. A graded pattern of tumor infiltration within the edema-like area was revealed by both automatic methods, showing high agreement. All classification results were confirmed by a senior neuro-radiologist and validated using MR spectroscopy. Conclusion: This study emphasizes the important role of automatic tools based on a multi-modal view of the tissue in monitoring therapy response in patients with high grade gliomas specifically under anti-angiogenic therapy
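    The paper's modified kNN method is not published as code in this record; to make the idea concrete, here is a sketch of plain voxel-wise kNN tissue classification with scikit-learn, using synthetic stand-ins for the multi-modal MR features (the authors' modification is not reproduced):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Hypothetical training set: one row per voxel, one column per MR modality
# (e.g. T1, T2, FLAIR intensities), labelled by tissue class.
X_train = rng.normal(size=(500, 3))
y_train = rng.integers(0, 3, size=500)   # 0 = normal, 1 = enhancing tumor, 2 = edema-like

clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)

X_followup = rng.normal(size=(4, 3))     # voxels from a follow-up scan
print(clf.predict(X_followup))
```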

  10. Computational Analysis of Brain Images: Towards a Useful Tool in Clinical Practice

    DEFF Research Database (Denmark)

    Puonti, Oula

    scans, many of the developed methods are not readily extendible to clinical applications due to the variability of clinical MRI data and the presence of pathologies, such as tumors or lesions. Thus, clinicians are forced to manually analyze the MRI data, which is a time consuming task and introduces...... rater-dependent variability that reduces the accuracy and sensitivity of the results. The goal of this PhD-project was to enlarge the scope of the automatic tools into clinical applications. In order to tackle the variability of the data and presence of pathologies, we base our methods on Bayesian...... this framework can be extended with models of brain lesions. This results in a set of fast, robust and fully automatic tools for segmenting MRI brain scans of both healthy subjects and subjects suffering from brain disorders such as multiple sclerosis. Having access to quantitative measures of both lesions...

  11. Technical session: the Atomika TXRF tool series

    International Nuclear Information System (INIS)

    Dobler, M. URL: www.atomika.com

    2000-01-01

    ATOMIKA Instruments GmbH holds worldwide competence as a renowned producer of high-performance metrology tools and analytic devices. ATOMIKA's TXRF products are widely accepted for elemental contamination monitoring on semiconductor materials as well as in chemical analysis. More than 100 companies and institutes base their analytical work on TXRF tools made by ATOMIKA Instruments. ATOMIKA's TXRF 8300W/8200W wafer contamination monitors are the result of an evolution based on a background of 20 years of competence. Built for the semiconductor industry, the TXRF 8300W/8200W detect metal contaminants on 300 mm or 200 mm silicon wafer surfaces with the highest possible sensitivity. Operating under ambient conditions, with a sealed x-ray tube, and having their own minienvironment (FOUP or SMIF, respectively), the TXRF 8300W/8200W are optimally suited for in-line use. Fab automation (GEM/SECS) is supported by predefined measurement recipes and fully automatic routines. High throughput and uptimes, an ergonomic design according to the SEMI standard, plus an unrivaled small footprint of 1.1 m² make the TXRF 8300W/8200W most efficient and economic solutions for industrial wafer monitoring. As the specific tool for multielement trace and thin-layer analysis, the ATOMIKA TXRF 8030C provides simultaneous and fast determination of all elements within the range from sodium to uranium. Sophisticated measurement instrumentation provides detection limits down to the ppt range. On the other hand, performance is decisively facilitated by features such as automatic switching of primary radiation, predefined measurement recipes, or software-driven optimization of the entire measurement process. These features make the TXRF 8030C a valuable analytic tool for a wide range of applications: contamination in water, dust or sediments; quantitative screening in the chemical industry; toxic elements in tissues and biological fluids; radioactive elements; process chemicals in the semiconductor industry

  12. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

    Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58 %. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge in the case of setting the initial matrix as identity, while this was not achieved by manual data sets. Given the same initial matrix, the repeatability of the automatic was [0.46, 0.34, 0.80, 0.47] versus [0.42, 0.51, 0.98, 1.15] mm in the manual case for the US image four corners. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated

  13. Semi-automatic watershed medical image segmentation methods for customized cancer radiation treatment planning simulation

    International Nuclear Information System (INIS)

    Kum Oyeon; Kim Hye Kyung; Max, N.

    2007-01-01

    A cancer radiation treatment planning simulation requires image segmentation to define the gross tumor volume, clinical target volume, and planning target volume. Manual segmentation, which is usual in clinical settings, depends on the operator's experience and may, in addition, change from trial to trial for the same operator. To overcome this difficulty, we developed semi-automatic watershed medical image segmentation tools using both the top-down watershed algorithm in the Insight Segmentation and Registration Toolkit (ITK) and Vincent-Soille's bottom-up watershed algorithm with region merging. We applied our algorithms to segment two- and three-dimensional head phantom CT data and to find pixel (or voxel) numbers for each segmented area, which are needed for radiation treatment optimization. A semi-automatic method is useful to avoid errors incurred by both human and machine sources, and it provides clear and visible information for pedagogical purposes. (orig.)
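    Neither the ITK pipeline nor the Vincent-Soille implementation is reproduced in this record; the sketch below shows the same marker-based watershed idea with scikit-image on a toy image of two overlapping disks (essentially that library's documented usage, not the authors' code):

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Toy image: two overlapping disks that plain thresholding cannot separate
x, y = np.indices((80, 80))
image = ((x - 28) ** 2 + (y - 28) ** 2 < 16 ** 2) | ((x - 44) ** 2 + (y - 52) ** 2 < 16 ** 2)

# Markers: local maxima of the distance transform, roughly one per object
distance = ndi.distance_transform_edt(image)
coords = peak_local_max(distance, footprint=np.ones((3, 3)), labels=image)
mask = np.zeros(distance.shape, dtype=bool)
mask[tuple(coords.T)] = True
markers, _ = ndi.label(mask)

labels = watershed(-distance, markers, mask=image)
print(np.unique(labels))   # 0 (background) plus one label per separated object
```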

  14. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors and analyzes its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.

  15. Design Tools for Reconfigurable Hardware in Orbit (RHinO)

    Science.gov (United States)

    French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian

    2004-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space-effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions into the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of Reconfigurable Hardware in Orbit, via an integrated design tool-suite aiming to reduce risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.

  16. Children’s Behavioral Pain Cues: Implicit Automaticity and Control Dimensions in Observational Measures

    Directory of Open Access Journals (Sweden)

    Kamal Kaur Sekhon

    2017-01-01

    Full Text Available Some pain behaviors appear to be automatic, reflexive manifestations of pain, whereas others present as voluntarily controlled. This project examined whether this distinction would characterize pain cues used in observational pain measures for children aged 4–12. To develop a comprehensive list of cues, a systematic literature search of studies describing development of children’s observational pain assessment tools was conducted using MEDLINE, PsycINFO, and Web of Science. Twenty-one articles satisfied the criteria. A total of 66 nonredundant pain behavior items were identified. To determine whether items would be perceived as automatic or controlled, 277 research participants rated each on multiple scales associated with the distinction. Factor analyses yielded three major factors: the “Automatic” factor included items related to facial expression, paralinguistics, and consolability; the “Controlled” factor included items related to intentional movements, verbalizations, and social actions; and the “Ambiguous” factor included items related to voluntary facial expressions. Pain behaviors in observational pain scales for children can be characterized as automatic, controlled, and ambiguous, supporting a dual-processing, neuroregulatory model of pain expression. These dimensions would be expected to influence judgments of the nature and severity of pain being experienced and the extent to which the child is attempting to control the social environment.

  17. Automatically sweeping dual-channel boxcar integrator

    International Nuclear Information System (INIS)

    Keefe, D.J.; Patterson, D.R.

    1978-01-01

    An automatically sweeping dual-channel boxcar integrator has been developed to automate the search for a signal that repeatedly follows a trigger pulse by a constant or slowly varying time delay when that signal is completely hidden in random electrical noise and dc-offset drifts. The automatically sweeping dual-channel boxcar integrator improves the signal-to-noise ratio and eliminates dc-drift errors in the same way that a conventional dual-channel boxcar integrator does, but, in addition, automatically locates the hidden signal. When the signal is found, its time delay is displayed with 100-ns resolution, and its peak value is automatically measured and displayed. This relieves the operator of the tedious, time-consuming, and error-prone search for the signal whenever the time delay changes. The automatically sweeping boxcar integrator can also be used as a conventional dual-channel boxcar integrator. In either mode, it can repeatedly integrate a signal up to 990 times and thus make accurate measurements of the signal pulse height in the presence of random noise, dc offsets, and unsynchronized interfering signals
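    The instrument itself is hardware, but its core signal-recovery idea (averaging many triggered sweeps so that uncorrelated noise cancels while the trigger-synchronous signal accumulates) is easy to show numerically; a sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate 200 triggered sweeps: a small pulse at a fixed delay, buried in noise
n_sweeps, n_samples, delay = 200, 1000, 420
t = np.arange(n_samples)
pulse = 0.5 * np.exp(-0.5 * ((t - delay) / 5.0) ** 2)
sweeps = pulse + rng.normal(0.0, 1.0, size=(n_sweeps, n_samples))

# Averaging across sweeps improves the signal-to-noise ratio by ~sqrt(n_sweeps),
# so both the hidden pulse and its time delay become recoverable
avg = sweeps.mean(axis=0)
print(int(np.argmax(avg)))   # recovered delay, close to 420
```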

  18. HClass: Automatic classification tool for health pathologies using artificial intelligence techniques.

    Science.gov (United States)

    Garcia-Chimeno, Yolanda; Garcia-Zapirain, Begonya

    2015-01-01

    The classification of subjects' pathologies enables a rigorousness to be applied to the treatment of certain pathologies, as doctors on occasion deal with so many variables that they can end up confusing some illnesses with others. Thanks to Machine Learning techniques applied to a health-record database, it is possible to make this classification using our algorithm, hClass. hClass contains a non-linear classifier of either a supervised, non-supervised or semi-supervised type. The machine is configured using other techniques such as validation of the set to be classified (cross-validation), reduction in features (PCA) and committees for assessing the various classifiers. The tool is easy to use; the sample matrix and features that one wishes to classify, the number of iterations and the subjects who are going to be used to train the machine all need to be introduced as inputs. As a result, the success rate is shown either via a classifier or via a committee if one has been formed. A 90% success rate is obtained with the AdaBoost classifier and 89.7% in the case of a committee (comprising three classifiers) when PCA is applied. This tool can be expanded to allow the user to fully characterise the classifiers by adjusting them to each classification use.
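    hClass itself is not distributed with this record; an analogous (and much simpler) scikit-learn pipeline combining the same ingredients named above, PCA for feature reduction, an AdaBoost classifier, and cross-validation, might look like this, with synthetic data standing in for the health-record matrix:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for a health-record feature matrix
X, y = make_classification(n_samples=300, n_features=20, n_informative=8, random_state=0)

model = make_pipeline(PCA(n_components=8), AdaBoostClassifier(random_state=0))
print(cross_val_score(model, X, y, cv=5).mean())   # cross-validated success rate
```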

  19. Automatic Segmentation of the Eye in 3D Magnetic Resonance Imaging: A Novel Statistical Shape Model for Treatment Planning of Retinoblastoma.

    Science.gov (United States)

    Ciller, Carlos; De Zanet, Sandro I; Rüegsegger, Michael B; Pica, Alessia; Sznitman, Raphael; Thiran, Jean-Philippe; Maeder, Philippe; Munier, Francis L; Kowal, Jens H; Cuadra, Meritxell Bach

    2015-07-15

    Proper delineation of ocular anatomy in 3-dimensional (3D) imaging is a big challenge, particularly when developing treatment plans for ocular diseases. Magnetic resonance imaging (MRI) is presently used in clinical practice for diagnosis confirmation and treatment planning for treatment of retinoblastoma in infants, where it serves as a source of information, complementary to the fundus or ultrasonographic imaging. Here we present a framework to fully automatically segment the eye anatomy for MRI based on 3D active shape models (ASM), and we validate the results and present a proof of concept to automatically segment pathological eyes. Manual and automatic segmentation were performed in 24 images of healthy children's eyes (3.29 ± 2.15 years of age). Imaging was performed using a 3-T MRI scanner. The ASM consists of the lens, the vitreous humor, the sclera, and the cornea. The model was fitted by first automatically detecting the position of the eye center, the lens, and the optic nerve, and then aligning the model and fitting it to the patient. We validated our segmentation method by using a leave-one-out cross-validation. The segmentation results were evaluated by measuring the overlap, using the Dice similarity coefficient (DSC) and the mean distance error. We obtained a DSC of 94.90 ± 2.12% for the sclera and the cornea, 94.72 ± 1.89% for the vitreous humor, and 85.16 ± 4.91% for the lens. The mean distance error was 0.26 ± 0.09 mm. The entire process took 14 seconds on average per eye. We provide a reliable and accurate tool that enables clinicians to automatically segment the sclera, the cornea, the vitreous humor, and the lens, using MRI. We additionally present a proof of concept for fully automatically segmenting eye pathology. This tool reduces the time needed for eye shape delineation and thus can help clinicians when planning eye treatment and confirming the extent of the tumor. Copyright © 2015 Elsevier Inc. All rights reserved.
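    The Dice similarity coefficient used for validation has a one-line definition, DSC = 2|A∩B| / (|A| + |B|); a small self-contained sketch on toy masks:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary segmentation masks."""
    a, b = np.asarray(a, dtype=bool), np.asarray(b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

manual = np.zeros((64, 64), dtype=bool); manual[20:40, 20:40] = True
auto = np.zeros((64, 64), dtype=bool);   auto[22:42, 21:41] = True
print(round(dice(manual, auto), 3))   # 1.0 would be perfect overlap
```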

  20. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  1. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources: the meter had to be consulted physically, the information imported into Infor EAM by hand, and the errors that can occur when doing all of this manually detected and corrected. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main target I had when developing my solution was flexibility and scalability so as to make...

  2. Bio-EdIP: An automatic approach for in vitro cell confluence images quantification.

    Science.gov (United States)

    Cardona, Andrés; Ariza-Jiménez, Leandro; Uribe, Diego; Arroyave, Johanna C; Galeano, July; Cortés-Mancera, Fabian M

    2017-07-01

    Cell imaging is a widely employed technique to analyze multiple biological processes. Therefore, simple, accurate and quantitative tools are needed to understand cellular events. For this purpose, Bio-EdIP was developed as a user-friendly tool to quantify confluence levels using cell culture images. The proposed algorithm combines a pre-processing step with subsequent stages that involve local processing techniques and a morphological reconstruction-based segmentation algorithm. Segmentation performance was assessed on three constructed image sets, comparing F-measure scores and AUC values (ROC analysis) for Bio-EdIP, its previous version and TScratch. Furthermore, segmentation results were compared with published algorithms using eight public benchmarks. Bio-EdIP automatically segmented cell-free regions from images of in vitro cell culture. Based on mean F-measure scores and ROC analysis, Bio-EdIP maintained high performance regardless of the image characteristics of the constructed dataset, when compared with its previous version and TScratch. Although the acquisition quality of the public dataset affected Bio-EdIP segmentation, performance was better on two out of eight public sets. Bio-EdIP is a user-friendly interface, useful for the automatic analysis of confluence levels and cell growth processes using in vitro cell culture images. Here, we also present new manually annotated data for algorithm evaluation. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    Title 14 (Aeronautics and Space), § 29.1329 Automatic pilot system: (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  4. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    Title 14 (Aeronautics and Space), § 27.1329 Automatic pilot system: (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  5. Automatic assessment of coronary artery calcium score from contrast-enhanced 256-row coronary computed tomography angiography.

    Science.gov (United States)

    Rubinshtein, Ronen; Halon, David A; Gaspar, Tamar; Lewis, Basil S; Peled, Nathan

    2014-01-01

    The coronary artery calcium score (CS), an independent predictor of cardiovascular events, can be obtained from a stand-alone nonenhanced computed tomography (CT) scan (CSCT) or as an additional nonenhanced procedure before contrast-enhanced coronary CT angiography (CCTA). We evaluated the accuracy of a novel fully automatic tool for computing CS from the CCTA examination. One hundred thirty-six consecutive symptomatic patients (aged 59 ± 11 years, 40% female) without known coronary artery disease who underwent both 256-row CSCT and CCTA were studied. Original scan reconstruction (slice thickness) was maintained (3 mm for CSCT and 0.67 mm for CCTA). CS was computed from CCTA by an automatic tool (COR Analyzer, rcadia Medical Imaging, Haifa, Israel) and compared with CS results obtained by standard assessment of nonenhanced CSCT (HeartBeat CS, Philips, Cleveland, Ohio). We also compared both methods for classification into 5 commonly used CS categories (0, 1 to 10, 11 to 100, 101 to 400, >400 Agatston units). All scans were of diagnostic quality. CS obtained by the COR Analyzer from CCTA classified 111 of 136 (82%) of patients into identical categories as CS by CSCT and 24 of the remaining 25 into an adjacent category. Overall, CS values from CCTA showed high correlation with CS values from CSCT (Spearman rank correlation = 0.95, p < 0.0001). In conclusion, CS automatically computed from 256-row CCTA correlated highly with standard CS values obtained from nonenhanced CSCT. CS obtained directly from CCTA may obviate the need for an additional scan and attendant radiation. Copyright © 2014 Elsevier Inc. All rights reserved.
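    The five Agatston categories used for the agreement scoring are simple score intervals; a trivial sketch of the mapping (the cut-offs are those listed in the abstract; the treatment of fractional scores is an assumption):

```python
import numpy as np

def cs_category(score):
    """Map an Agatston score to the categories 0, 1-10, 11-100, 101-400, >400."""
    return int(np.searchsorted([0, 10, 100, 400], score))   # returns 0..4

for s in (0, 7, 55, 250, 900):
    print(s, "->", cs_category(s))
```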

  6. ABI Base Recall: Automatic Correction and Ends Trimming of DNA Sequences.

    Science.gov (United States)

    Elyazghi, Zakaria; Yazouli, Loubna El; Sadki, Khalid; Radouani, Fouzia

    2017-12-01

    Automated DNA sequencers produce chromatogram files in ABI format. When viewing chromatograms, some ambiguities appear at various sites along the DNA sequences, because the program implemented in the sequencing machine and used to call bases cannot always precisely determine the right nucleotide, especially when it is represented by either a broad peak or a set of overlaying peaks. In such cases, a letter other than A, C, G, or T is recorded, most commonly N. Thus, DNA sequencing chromatograms need manual examination: checking for mis-calls and truncating the sequence when errors become too frequent. The purpose of this paper is to develop a program allowing the automatic correction of these ambiguities. This application is a Web-based program powered by Shiny and runs on the R platform for easy exploitation. As part of the interface, we added an automatic end-clipping option, alignment against reference sequences, and BLAST. To develop and test our tool, we collected several bacterial DNA sequences from different laboratories within Institut Pasteur du Maroc and performed both manual and automatic correction. A comparison between the two methods was carried out. As a result, we note that our program, ABI Base Recall, accomplishes good correction with high accuracy. Indeed, it increases the rate of identity and coverage, minimizes the number of mismatches and gaps, and hence provides a solution to sequencing ambiguities and saves biologists' time and labor.

  7. Control-oriented Automatic System for Transport Analysis (ASTRA)-Matlab integration for Tokamaks

    International Nuclear Information System (INIS)

    Sevillano, M.G.; Garrido, I.; Garrido, A.J.

    2011-01-01

    The exponential growth in energy consumption has led to a renewed interest in the development of alternatives to fossil fuels. Among the unconventional resources that may help to meet this energy demand, nuclear fusion has arisen as a promising source, which has given way to an unprecedented interest in solving the different control problems existing in nuclear fusion reactors such as Tokamaks. The aim of this manuscript is to show how one of the most popular codes used to simulate the performance of Tokamaks, the Automatic System For Transport Analysis (ASTRA) code, can be integrated into the Matlab-Simulink tool in order to make the development of suitable controllers for Tokamaks easier and more convenient. As a demonstrative case study to show the feasibility and the goodness of the proposed ASTRA-Matlab integration, a modified anti-windup Proportional Integral Derivative (PID)-based controller for the loop voltage of a Tokamak has been implemented. The integration achieved represents an original and innovative work in the Tokamak control area and it provides new possibilities for the development and application of advanced control schemes to the standardized and widely used ASTRA transport code for Tokamaks.
    Highlights:
    → The paper presents a useful tool for rapid prototyping of different solutions to deal with the control problems arising in Tokamaks.
    → The proposed tool embeds the standardized Automatic System For Transport Analysis (ASTRA) code for Tokamaks within the well-known Matlab-Simulink software.
    → This allows testing and combining diverse control schemes in a unified way, considering the ASTRA as the plant of the system.
    → A demonstrative Proportional Integral Derivative (PID)-based case study is provided to show the feasibility and capabilities of the proposed integration.

  8. Automatic cloud coverage assessment of Formosat-2 image

    Science.gov (United States)

    Hsu, Kuo-Hsien

    2011-11-01

    The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated in a daily-revisit mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images for the NSPO Image Processing System generally consists of two major steps. First, an unsupervised K-means method is used to automatically estimate the cloud statistics of a Formosat-2 image. Second, cloud coverage is estimated from the Formosat-2 image by manual examination. Clearly, a more accurate Automatic Cloud Coverage Assessment (ACCA) method would increase the efficiency of the second step by providing a good prediction of the cloud statistics. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method comprising pre-processing and post-processing analysis. In the pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel reexamination, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the results of the pre-processing analysis, increasing the efficiency of manual examination.
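    Of the ingredients listed, Otsu's method is the easiest to demonstrate in isolation; a minimal sketch with scikit-image on a synthetic "scene" (not Formosat-2 data):

```python
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(0)

# Toy single-band scene: darker land with a brighter "cloud" patch
scene = rng.normal(0.3, 0.05, size=(100, 100))
scene[30:60, 40:80] += 0.5

mask = scene > threshold_otsu(scene)            # global Otsu threshold
print(f"cloud coverage: {mask.mean():.1%}")     # fraction of pixels flagged cloudy
```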

  9. Automatic fuzzy inference system development for marker-based watershed segmentation

    International Nuclear Information System (INIS)

    Gonzalez, M A; Meschino, G J; Ballarin, V L

    2007-01-01

    Texture image segmentation is a constant challenge in digital image processing. The partition of an image into regions that allow the experienced observer to obtain the necessary information can be done using a Mathematical Morphology tool called the Watershed Transform. This transform is able to distinguish extremely complex objects and is easily adaptable to various kinds of images. The success of the Watershed Transform depends essentially on the existence of unequivocal markers for each of the objects of interest. The standard methods for marker detection are highly specific and complex when objects presenting great variability of shape, size and texture are processed. This paper proposes the automatic generation of a fuzzy inference system for marker detection, using object selection done by the expert. This method allows applying the Watershed Transform to biomedical images with different kinds of texture. The results support the conclusion that the proposed method is an effective tool for the application of the Watershed Transform

  10. Model-based vision system for automatic recognition of structures in dental radiographs

    Science.gov (United States)

    Acharya, Raj S.; Samarabandu, Jagath K.; Hausmann, E.; Allen, K. A.

    1991-07-01

    X-ray diagnosis of destructive periodontal disease requires an expert to assess serial radiographs to determine the change in the distance between the cemento-enamel junction (CEJ) and the bone crest. To achieve this without the subjectivity of a human expert, a knowledge-based system is proposed to automatically locate the two landmarks, which are the CEJ and the level of the alveolar crest at its junction with the periodontal ligament space. This work is part of an ongoing project to automatically measure the distance between the CEJ and the bone crest along a line parallel to the axis of the tooth. The approach presented in this paper is based on identifying a prominent feature such as the tooth boundary using local edge detection and edge thresholding to establish a reference, and then using model knowledge to process sub-regions in locating the landmarks. The segmentation techniques invoked around these regions consist of a neural-network-like hierarchical refinement scheme together with local gradient extraction, multilevel thresholding and ridge tracking. Recognition accuracy is further improved by first locating the easily identifiable parts of the bone surface and the interface between the enamel and the dentine, and then extending these boundaries towards the periodontal ligament space and the tooth boundary, respectively. The system is realized as a collection of tools (or knowledge sources) for pre-processing, segmentation, primary and secondary feature detection, and a control structure based on the blackboard model to coordinate the activities of these tools.

  11. Limiting precision in differential equation solvers. II Sources of trouble and starting a code

    International Nuclear Information System (INIS)

    Shampine, L.F.

    1978-01-01

    The reasons a class of codes for solving ordinary differential equations might want to use an extremely small step size are investigated. For this class, the likelihood of precision difficulties is evaluated and remedies are examined. The investigation suggests a way of automatically selecting an initial step size which should be reliably on scale
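    The paper's own selection rule is not reproduced in this abstract; one widely used rule of thumb in the same spirit (scaling the first step so it moves the solution by a small fraction of its tolerance-weighted scale) looks like this:

```python
import numpy as np

def initial_step(f, t0, y0, rtol=1e-6, atol=1e-9):
    # Heuristic starting step (not the paper's algorithm): choose h0 so that a
    # first explicit Euler step is small relative to the solution's scale.
    scale = atol + rtol * np.abs(y0)
    d0 = np.linalg.norm(y0 / scale)
    d1 = np.linalg.norm(f(t0, y0) / scale)
    if d0 < 1e-5 or d1 < 1e-5:
        return 1e-6
    return 0.01 * d0 / d1

# Stiff test problem y' = -50 y: the suggested step shrinks with the time scale
print(initial_step(lambda t, y: -50.0 * y, 0.0, np.array([1.0])))
```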

  12. Strategy proposed by Electricite de France in the development of automatic tools

    Energy Technology Data Exchange (ETDEWEB)

    Castaing, C.; Cazin, B. [Electricite de France, Noisy le grand (France)

    1995-03-01

    The strategy proposed by EDF for developing means to limit personal and collective dosimetry is recent. It follows on from a policy of developing remote operation means for inspection and maintenance activities on the reactor, pool bottoms, steam generators (SGs) and reactor building valves; these activities were targeted because of their high dosimetric cost. One of the main duties of the UTO (Technical Support Department) within EDF is the maintenance of Pressurized Water Reactors in French Nuclear Power Plant operations (consisting of 54 units) and the development and monitoring of specialized tools. To achieve this, the UTO has started a national think-tank on the implementation of the ALARA process in its field of activity and created an ALARA Committee responsible for running and monitoring it, as well as a policy for developing tools. This point is illustrated in the second part, on reactor vessel heads.

  13. Practising verbal maritime communication with computer dialogue systems using automatic speech recognition (My Practice session)

    OpenAIRE

    John, Peter; Wellmann, J.; Appell, J.E.

    2016-01-01

    This My Practice session presents a novel online tool for practising verbal communication in a maritime setting. It is based on low-fi ChatBot simulation exercises which employ computer-based dialogue systems. The ChatBot exercises are equipped with an automatic speech recognition engine specifically designed for maritime communication. The speech input and output functionality enables learners to communicate with the computer freely and spontaneously. The exercises replicate real communicati...

  14. Appennino: A GIS Tool for Analyzing Wildlife Habitat Use

    Directory of Open Access Journals (Sweden)

    Marco Ferretti

    2012-01-01

    Full Text Available The aim of the study was to test Appennino, a tool used to evaluate the habitats of animals through compositional analysis. This free tool calculates an animal's habitat use within the GIS platform ArcGIS and saves and exports the results of the comparative land uses to other statistical software. The Visual Basic for Applications programming language was employed to prepare the ESRI ArcGIS 9.x utility. The tool was tested on a dataset of 546 pheasant positions obtained from a study carried out in Tuscany (Italy). The tool automatically gave the same results as those obtained by calculating the surfaces in ESRI ArcGIS, exporting the data from ArcGIS, and then using a commercial spreadsheet and/or statistical software to calculate the animal's habitat use, with a considerable reduction in time.

  15. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving: namely, make the driver feel they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but it might also reduce workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  16. Automatic classification of hyperactive children: comparing multiple artificial intelligence approaches.

    Science.gov (United States)

    Delavarian, Mona; Towhidkhah, Farzad; Gharibzadeh, Shahriar; Dibajnia, Parvin

    2011-07-12

    Automatic classification of different behavioral disorders with many similarities (e.g. in symptoms) using an automated approach will help psychiatrists to concentrate on the correct disorder and its treatment as soon as possible, to avoid wasting time on diagnosis, and to increase the accuracy of diagnosis. In this study, we tried to differentiate and classify (diagnose) 306 children with many similar symptoms and different behavioral disorders such as ADHD, depression, anxiety, comorbid depression and anxiety, and conduct disorder with high accuracy. Classification was based on the symptoms and their severity. After examining 16 different available classifiers using "Prtools", we propose the nearest mean classifier as the most accurate classifier, with 96.92% accuracy in this research. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
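    A nearest mean (nearest centroid) classifier assigns each case to the class whose feature-vector mean is closest; an equivalent scikit-learn sketch on synthetic stand-in data (not the study's clinical data or its Prtools code):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import NearestCentroid

# Synthetic stand-in for symptom-severity feature vectors of 306 children, 5 disorders
X, y = make_classification(n_samples=306, n_features=12, n_informative=6,
                           n_classes=5, n_clusters_per_class=1, random_state=0)

clf = NearestCentroid()   # classifies by distance to each class mean
print(cross_val_score(clf, X, y, cv=5).mean())
```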

  17. ASTRA - an automatic system for transport analysis in a tokamak

    International Nuclear Information System (INIS)

    Pereverzev, G.V.; Yushmanov, P.N.; Dnestrovskii, A.Yu.; Polevoi, A.R.; Tarasjan, K.N.; Zakharov, L.E.

    1991-08-01

    The set of codes described here - ASTRA (Automatic System of Transport Analysis) - is a flexible and effective tool for the study of transport mechanisms in reactor-oriented facilities of the tokamak type. Flexibility is provided within the ASTRA system by a wide choice of standard relationships, functions and subroutines representing various transport coefficients, methods of auxiliary heating and other physical processes in the tokamak plasma, as well as by the possibility of pre-setting transport equations and variables for data output in a simple and conceptually transparent form. The transport code produced by the ASTRA system provides an adequate representation of the discharges for present experimental conditions. (orig.)

  18. Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool

    International Nuclear Information System (INIS)

    Chang, Y.H.; Mosleh, A.; Dang, V.N.

    2003-01-01

    The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)

  19. Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y.H.; Mosleh, A.; Dang, V.N

    2003-03-01

    The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)

  20. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that from nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed and the calculated radioxenon concentrations and raw gamma-ray spectra automatically transmitted to data centers

  1. A New Internet Tool for Automatic Evaluation in Control Systems and Programming

    Science.gov (United States)

    Munoz de la Pena, D.; Gomez-Estern, F.; Dormido, S.

    2012-01-01

    In this paper we present an innovative web-based education tool designed to automate the collection, evaluation and error detection of practical exercises assigned to computer programming and control engineering students. By using a student/instructor code-fusion architecture, the conceptual limits of multiple-choice tests are overcome by far…

  2. Automatic Mexico Gulf Oil Spill Detection from Radarsat-2 SAR Satellite Data Using Genetic Algorithm

    Science.gov (United States)

    Marghany, Maged

    2016-10-01

    In this work, a genetic algorithm is exploited for the automatic detection of oil spills of small and large size. This is achieved using arrays of RADARSAT-2 SAR ScanSAR Narrow single-beam data obtained in the Gulf of Mexico. The study shows that the genetic algorithm automatically segmented the dark-spot patches related to small and large oil spill pixels. This conclusion is confirmed by the receiver operating characteristic (ROC) curve and by documented ground data. The ROC curve indicates that the existence of oil slick footprints can be identified with an area under the curve, between the ROC curve and the no-discrimination line, of 90%, which is greater than that of other surrounding environmental features. Small oil spill sizes represented 30% of the discriminated oil spill pixels in the ROC curve. In conclusion, the genetic algorithm can be used as a tool for the automatic detection of oil spills of either small or large size, and the ScanSAR Narrow single-beam mode serves as an excellent sensor for oil spill pattern detection and surveying in the Gulf of Mexico.

  3. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Briefly described are two systems of automatic plasma control: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor and its possibilities will in many ways determine the reactor economy. (Ha)

  4. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users, easing and harmonizing the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. One can therefore use these patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns are not intended to stand for artefacts that will immediately be executed. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. We therefore introduce a continuous tool chain beginning at the design phase and ending with the execution of an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  5. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van der Leij, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  6. LARA. Localization of an automatized refueling machine by acoustical sounding in breeder reactors - implementation of artificial intelligence techniques

    International Nuclear Information System (INIS)

    Lhuillier, C.; Malvache, P.

    1987-01-01

    The automatic control of the machine that handles the nuclear subassemblies in fast neutron reactors requires autonomous perception and decision tools. An acoustical device allows the machine to determine its position in the work area. Artificial intelligence techniques, namely pattern recognition and scene analysis, are implemented to interpret the data. The localization process is managed by an expert system. 6 refs.; 8 figs

  7. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has reduced the time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard; Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using the Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode, allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention is Query-Based Document Management.

  8. Automatic Picking of Foraminifera: Design of the Foraminifera Image Recognition and Sorting Tool (FIRST) Prototype and Results of the Image Classification Scheme

    Science.gov (United States)

    de Garidel-Thoron, T.; Marchant, R.; Soto, E.; Gally, Y.; Beaufort, L.; Bolton, C. T.; Bouslama, M.; Licari, L.; Mazur, J. C.; Brutti, J. M.; Norsa, F.

    2017-12-01

    Foraminifera tests are the main proxy carriers for paleoceanographic reconstructions. Both geochemical and taxonomical studies require large numbers of tests to achieve statistical relevance. To date, the extraction of foraminifera from the sediment coarse fraction is still done by hand and is thus time-consuming. Moreover, the recognition of ecologically relevant morphotypes requires taxonomic skills that are not easily taught. The automatic recognition and extraction of foraminifera would greatly help paleoceanographers to overcome these issues. Recent advances in automatic image classification using machine learning open the way to the automatic extraction of foraminifera. Here we detail progress on the design of an automatic picking machine as part of the FIRST project. The machine handles 30 pre-sieved samples (100-1000 µm), separating them into individual particles (including foraminifera) and imaging each in pseudo-3D. The particles are classified, and specimens of interest are sorted either for Individual Foraminifera Analyses (44 per slide) and/or for classical multiple analyses (8 morphological classes per slide, up to 1000 individuals per hole). The classification is based on machine learning using Convolutional Neural Networks (CNNs), similar to the approach used in the coccolithophorid imaging system SYRACO. To prove its feasibility, we built two training image datasets of modern planktonic foraminifera containing approximately 2000 and 5000 images, corresponding to 15 and 25 morphological classes respectively. Using a CNN with a residual topology (ResNet) we achieve over 95% correct classification on each dataset. We tested the network on 160,000 images from 45 depths of a sediment core from the Pacific Ocean, for which we have human counts. The current algorithm is able to reproduce the downcore variability in both Globigerinoides ruber and the fragmentation index (r2 = 0.58 and 0.88 respectively). The FIRST prototype yields promising results for high-throughput automatic picking of foraminifera.
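
    The record names the network topology (a ResNet) but not the implementation; a minimal training sketch in Python with PyTorch/torchvision follows, assuming an image folder with one subdirectory per morphological class. The class count matches the smaller FIRST dataset, but the directory name, image size, and all hyperparameters are illustrative assumptions.

```python
import torch
from torch import nn, optim
from torchvision import datasets, models, transforms

NUM_CLASSES = 15  # 15 morphological classes, as in the smaller training set

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# Assumed layout: train_images/<class_name>/<image>.png
dataset = datasets.ImageFolder("train_images", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Residual topology: a small ResNet with a classification head sized to the task.
model = models.resnet18(num_classes=NUM_CLASSES)
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```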

  9. Deliberation versus automaticity in decision making: Which presentation format features facilitate automatic decision making?

    Directory of Open Access Journals (Sweden)

    Anke Soellner

    2013-05-01

    Full Text Available The idea of automatic decision making approximating normatively optimal decisions without necessitating much cognitive effort is intriguing. Whereas recent findings support the notion that such fast, automatic processes explain empirical data well, little is known about the conditions under which such processes are selected rather than more deliberate stepwise strategies. We investigate the role of the format of information presentation, focusing explicitly on the ease of information acquisition and its influence on information integration processes. In a probabilistic inference task, the standard matrix employed in prior research was contrasted with a newly created map presentation format and additional variations of both presentation formats. Across three experiments, a robust presentation format effect emerged: Automatic decision making was more prevalent in the matrix (with high information accessibility), whereas sequential decision strategies prevailed when the presentation format demanded more information acquisition effort. Further scrutiny of the effect showed that it is not driven by the presentation format as such, but rather by the extent of information search induced by a format. Thus, if information is accessible with minimal need for information search, information integration is likely to proceed in a perception-like, holistic manner. In turn, a moderate demand for information search decreases the likelihood of behavior consistent with the assumptions of automatic decision making.

  10. Spline-based automatic path generation of welding robot

    Institute of Scientific and Technical Information of China (English)

    Niu Xuejuan; Li Liangyu

    2007-01-01

    This paper presents a flexible method for the representation of welded seams based on spline interpolation. With this method, the tool path of a welding robot can be generated automatically from a 3D CAD model. The technique has been implemented and demonstrated in the FANUC Arc Welding Robot Workstation. Based on the method, a software system was developed using the VBA of SolidWorks 2006. It offers an interface between SolidWorks and ROBOGUIDE, the off-line programming software of the FANUC robot, combining the strong modelling function of the former with the simulation function of the latter. It is also capable of communicating with the on-line robot. Experimental results have shown high accuracy and strong reliability. This method will improve the intelligence and flexibility of the welding robot workstation.
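
    The record does not specify the spline formulation; the following minimal Python sketch (an assumption, using SciPy's B-spline fitting rather than whatever the authors implemented in VBA) shows how a smooth tool path and its tangent directions can be interpolated through seam points extracted from a CAD model. Tangents matter because the torch must stay oriented along the seam.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical 3D seam points taken from a CAD model (mm).
seam = np.array([
    [0.0, 0.0, 0.0],
    [10.0, 2.0, 0.5],
    [20.0, 3.5, 1.5],
    [30.0, 4.0, 3.0],
    [40.0, 3.0, 5.0],
])

# Fit a cubic B-spline through the seam points (s=0 -> exact interpolation).
tck, u = splprep(seam.T, s=0, k=3)

# Sample a dense tool path plus unit tangents along the weld.
u_fine = np.linspace(0, 1, 200)
x, y, z = splev(u_fine, tck)
dx, dy, dz = splev(u_fine, tck, der=1)
tangents = np.stack([dx, dy, dz], axis=1)
tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)

path = np.stack([x, y, z], axis=1)  # robot waypoints along the weld
```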

  11. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Full Text Available Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and the presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: a mobile telephone database and a landline database. The experimental results indicate that these acoustic-phonetic features do have some discriminating potential and are worth pursuing for discrimination. The automatic acoustic-phonetic features have acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kinds of voice features.
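
    The record gives only the framework, not formulas; a minimal score-based likelihood-ratio sketch in Python follows, in which within-speaker (same-speaker) and between-speaker (different-speaker) comparison scores from a reference database are modelled as Gaussians. The Gaussian modelling choice and all numbers are assumptions for illustration, not the paper's calibration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical comparison scores from a reference database:
# higher score means more similar acoustic-phonetic features.
same_speaker_scores = rng.normal(loc=2.0, scale=0.8, size=500)
diff_speaker_scores = rng.normal(loc=-1.0, scale=1.0, size=500)

# Fit simple Gaussian models of within- and between-speaker variability.
mu_s, sd_s = same_speaker_scores.mean(), same_speaker_scores.std(ddof=1)
mu_d, sd_d = diff_speaker_scores.mean(), diff_speaker_scores.std(ddof=1)

def likelihood_ratio(score: float) -> float:
    """LR = p(score | same speaker) / p(score | different speakers)."""
    return norm.pdf(score, mu_s, sd_s) / norm.pdf(score, mu_d, sd_d)

# LR > 1 supports the same-speaker hypothesis, LR < 1 the
# different-speaker hypothesis; the magnitude quantifies the strength.
print(likelihood_ratio(1.5))
```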

  12. Algebraic limit cycles in polynomial systems of differential equations

    International Nuclear Information System (INIS)

    Llibre, Jaume; Zhao Yulin

    2007-01-01

    Using elementary tools we construct cubic polynomial systems of differential equations with algebraic limit cycles of degrees 4, 5 and 6. We also construct a cubic polynomial system of differential equations having an algebraic homoclinic loop of degree 3. Moreover, we show that there are polynomial systems of differential equations of arbitrary degree that have algebraic limit cycles of degree 3, as well as give an example of a cubic polynomial system of differential equations with two algebraic limit cycles of degree 4
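
    The following textbook definitions, stated here for context and not taken from the paper, fix the meaning of the terms used above:

```latex
Consider a polynomial system
\[
  \dot{x} = P(x,y), \qquad \dot{y} = Q(x,y).
\]
An algebraic curve $f(x,y) = 0$ is \emph{invariant} for the system if there
exists a polynomial cofactor $K(x,y)$ such that
\[
  P\,\frac{\partial f}{\partial x} + Q\,\frac{\partial f}{\partial y} = K f ,
\]
so that the vector field is tangent to the curve at each of its points.
An \emph{algebraic limit cycle of degree $n$} is a limit cycle of the system
contained in an invariant algebraic curve $f = 0$ with $\deg f = n$.
```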

  13. Automatic recognition of conceptualization zones in scientific articles and two life science applications.

    Science.gov (United States)

    Liakata, Maria; Saha, Shyamasree; Dobnik, Simon; Batchelor, Colin; Rebholz-Schuhmann, Dietrich

    2012-04-01

    The paper demonstrates the utility of CoreSCs in two biomedical applications, as well as work in progress. A web-based tool for the automatic annotation of articles with CoreSCs, together with the corresponding documentation, is available online at http://www.sapientaproject.com/software. The site http://www.sapientaproject.com also contains detailed information pertaining to CoreSC annotation and links to the annotation guidelines, as well as a corpus of manually annotated articles which served as our training data. Contact: liakata@ebi.ac.uk. Supplementary data are available at Bioinformatics online.

  14. Towards a fully automatic and robust DIMM (DIMMA)

    International Nuclear Information System (INIS)

    Varela, A M; Muñoz-Tuñón, C; Del Olmo-García, A M; Rodríguez, L F; Delgado, J M; Castro-Almazán, J A

    2015-01-01

    Quantitative seeing measurements have been provided at the Canarian Observatories since 1990 by differential image motion monitors (DIMMs). Image quality needs to be studied through long-term (routine) measurements. This is important, for instance, in deciding on the siting of large telescopes or in the development of adaptive optics programmes, not to mention the development and design of new instruments. On the other hand, continuous real-time monitoring is essential in the day-to-day operation of telescopes. These routine measurements have to be carried out by standard, easy-to-operate and cross-calibrated instruments that are required to be operational with minimum intervention over many years. The DIMMA (Automatic Differential Image Motion Monitor) is the next step: a fully automated seeing monitor capable of providing data without manual operation and in remote locations. Currently, the IAC has two DIMMs working at the Roque de los Muchachos Observatory (ORM) and the Teide Observatory (OT). They are robotic but require an operator to start and initialize the program, focus the telescope, change the star when needed and turn off at the end of the night, all of which is done remotely. With a view to full automation, we have designed a code for monitoring image quality (avoiding spurious data) and a program for autofocus, which is presented here; a simple autofocus sketch is given below. The data quality control protocol is also given. (paper)
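
    The record does not describe the autofocus algorithm. One common approach, given here as an illustrative Python sketch rather than the IAC implementation, is to scan focuser positions, score each frame with a sharpness metric (variance of the Laplacian), and refine the best position with a parabolic fit; `capture_at` is a hypothetical callback standing in for the telescope control layer.

```python
import numpy as np
import cv2

def sharpness(frame: np.ndarray) -> float:
    """Focus metric: variance of the Laplacian (higher = sharper)."""
    return cv2.Laplacian(frame, cv2.CV_64F).var()

def autofocus(capture_at, positions):
    """Scan focuser positions, then refine the peak with a parabola fit.

    capture_at: callable that moves the focuser and returns a grayscale frame.
    positions:  iterable of focuser positions to scan.
    """
    positions = np.asarray(list(positions), dtype=float)
    scores = np.array([sharpness(capture_at(p)) for p in positions])
    i = int(np.argmax(scores))
    if 0 < i < len(positions) - 1:
        # Fit a parabola through the best point and its neighbours.
        a, b, c = np.polyfit(positions[i - 1:i + 2], scores[i - 1:i + 2], 2)
        if a < 0:  # a genuine maximum
            return -b / (2 * a)
    return positions[i]
```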

  15. Automatic recognition of cardiac arrhythmias based on the geometric patterns of Poincaré plots

    International Nuclear Information System (INIS)

    Zhang, Lijuan; Guo, Tianci; Xi, Bin; Fan, Yang; Wang, Kun; Bi, Jiacheng; Wang, Ying

    2015-01-01

    The Poincaré plot has emerged as an effective tool for assessing cardiovascular autonomic regulation. It displays nonlinear characteristics of heart rate variability (HRV) from electrocardiographic (ECG) recordings and gives a global view of long-range ECG signals. In telemedicine or computer-aided diagnosis systems, automatically classifying the patterns of Poincaré plots would offer significant auxiliary diagnostic information. We therefore developed an automatic classification system to distinguish five geometric patterns of Poincaré plots arising from four types of cardiac arrhythmias. Statistical features are derived from plot measurements, and an ensemble classifier built from three types of neural networks is proposed. To address the difficulty of setting a proper threshold when classifying multiple categories, a threshold selection strategy is analyzed. 24 h ECG monitoring recordings from 674 patients, covering four types of cardiac arrhythmias, are used for recognition. For comparison, Support Vector Machine (SVM) classifiers with linear and Gaussian kernels are also applied. The experimental results demonstrate the effectiveness of the extracted features and the better performance of the designed classifier. Our study can be applied in telemedicine and computer-aided diagnosis systems to automatically identify the corresponding sinus rhythms and arrhythmia substrates. (paper)
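
    The specific plot measurements used in the paper are not listed in this record; the standard Poincaré descriptors SD1 and SD2, computed below in a short numpy sketch over a hypothetical RR-interval series, are the usual starting point for such features.

```python
import numpy as np

def poincare_descriptors(rr: np.ndarray):
    """SD1/SD2 of the Poincaré plot of successive RR intervals (ms).

    Each plot point is (RR_n, RR_{n+1}); SD1 measures dispersion
    perpendicular to the identity line (short-term variability),
    SD2 along it (long-term variability).
    """
    x, y = rr[:-1], rr[1:]
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)
    return sd1, sd2

# Hypothetical RR intervals (ms) extracted from an ECG recording.
rr = np.array([812, 820, 795, 830, 845, 810, 790, 805, 825, 815], float)
sd1, sd2 = poincare_descriptors(rr)
print(f"SD1={sd1:.1f} ms, SD2={sd2:.1f} ms, ratio={sd1/sd2:.2f}")
```

    Geometric pattern classes (comet, torpedo, fan, and so on) are then separated by features such as the SD1/SD2 ratio and the spread of the point cloud.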

  16. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    46 CFR Shipping (2010-10-01), Automatic Auxiliary Boilers, Requirements for Specific Types of Automatic Auxiliary Boilers, § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers, defined as having heat-input ratings of 400,000 Btu/hr...

  17. Multi-objective optimum design of fast tool servo based on improved differential evolution algorithm

    International Nuclear Information System (INIS)

    Zhu, Zhiwei; Zhou, Xiaoqin; Liu, Qiang; Zhao, Shaoxin

    2011-01-01

    The flexure-based mechanism is a promising realization of the fast tool servo (FTS), and the optimum determination of flexure hinge parameters is one of the most important elements of FTS design. This paper presents a multi-objective optimization approach for the dimension and position parameters of the flexure-based mechanism, based on an improved differential evolution algorithm embedding chaos and a nonlinear simulated annealing algorithm. The optimum design results show that the proposed algorithm performs well and strikes a well-balanced compromise between two conflicting objectives, the stroke and the natural frequency of the FTS mechanism; a simplified sketch of this weighted two-objective formulation is given below. Validation tests based on finite element analysis (FEA) show good agreement with the results obtained using the proposed theoretical algorithm. Finally, a series of experimental tests is conducted to validate the design process and assess the performance of the FTS mechanism. The designed FTS reaches a stroke of 10.25 μm with at least 2 kHz bandwidth. Both the FEA and the experimental results demonstrate that the flexure-mechanism parameters determined by the proposed approach achieve the specified performance, and that the approach is suitable for the optimum design of FTS mechanisms.
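
    The paper's improved differential evolution variant (with chaos and simulated annealing) is not reproduced here; the sketch below uses SciPy's standard differential evolution on a weighted sum of the two conflicting objectives, with placeholder analytic stand-ins for the stroke and natural-frequency models. Everything in it, from the objective surrogates to the bounds, is an assumption for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Design vector x = (hinge thickness t [mm], hinge radius r [mm]).
# Placeholder surrogates: in the paper these would come from the
# compliance model of the flexure hinge, not from these toy formulas.
def stroke(x):               # larger for thin, long hinges
    t, r = x
    return r / (t ** 2)

def natural_frequency(x):    # larger for stiff (thick, short) hinges
    t, r = x
    return t ** 1.5 / np.sqrt(r)

def cost(x, w=0.5):
    # Normalize and combine the two conflicting objectives; DE minimizes,
    # so we negate the weighted sum of quantities we want to maximize.
    return -(w * stroke(x) / 10.0 + (1 - w) * natural_frequency(x) / 1.0)

bounds = [(0.3, 2.0),   # thickness t in mm
          (1.0, 8.0)]   # radius r in mm

result = differential_evolution(cost, bounds, seed=0, tol=1e-8)
print("optimal (t, r):", result.x, "cost:", result.fun)
```

    Sweeping the weight w from 0 to 1 traces an approximate Pareto front between stroke and natural frequency, which is the compromise the abstract describes.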

  18. A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures

    Science.gov (United States)

    Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.

    2017-10-01

    An auto-installing tool on a USB drive allows quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, at the main site of an HEP collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures that address complex real-life scenarios.

  19. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with unmanned aerial vehicle (UAV) systems has relied on remote control of the UAV from a ground control system over a radio-frequency (RF) modem link at a bandwidth of about 430 MHz. This RF-modem method, however, has limitations for long-distance communication. We therefore used the LTE (long-term evolution), Bluetooth, and Wi-Fi connectivity of a smart camera to implement a UAV communication module system that carries out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image-capturing device carried by the drone over the area that needs imaging, together with software for loading the smart camera and managing it; it combines automatic shooting driven by the smart camera's sensors with shooting-catalogue management of the captured images and their information. The UAV imagery processing module uses Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open-source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  20. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Full Text Available Recently, aerial photography with unmanned aerial vehicle (UAV) systems has relied on remote control of the UAV from a ground control system over a radio-frequency (RF) modem link at a bandwidth of about 430 MHz. This RF-modem method, however, has limitations for long-distance communication. We therefore used the LTE (long-term evolution), Bluetooth, and Wi-Fi connectivity of a smart camera to implement a UAV communication module system that carries out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image-capturing device carried by the drone over the area that needs imaging, together with software for loading the smart camera and managing it; it combines automatic shooting driven by the smart camera's sensors with shooting-catalogue management of the captured images and their information. The UAV imagery processing module uses Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open-source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.