WorldWideScience

Sample records for automatic performance debugging

  1. High Performance with Prescriptive Optimization and Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo

    Automatic parallelization and automatic vectorization are attractive as they transparently optimize programs. The thesis contributes an improved dependence analysis for explicitly parallel programs. These improvements lead to more loops being vectorized; on average we achieve a speedup of 1.46 over the existing dependence analysis and vectorizer in GCC. Automatic optimizations often fail for theoretical and practical reasons. When they fail, we argue that a hybrid approach can be effective. Using compiler feedback, we propose to use the programmer's intuition and insight to achieve high performance. Compiler feedback enlightens the programmer as to why a given optimization was not applied, and suggests how to change the source code to make it more amenable to optimization. We show how this can yield significant speedups, achieving 2.4× faster execution on a real industrial use case. To aid in parallel debugging we propose...

  2. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
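
    The grouping step this record describes can be sketched in a few lines: bucket threads by their lists of calling-instruction addresses so that outlier threads stand out. The addresses and thread ids below are hypothetical, and this is only an illustration of the bucketing idea, not the patented method:

```python
from collections import defaultdict

def group_threads_by_call_stack(stacks):
    """Group thread ids by their tuple of calling-instruction addresses.

    stacks: dict mapping thread id -> list of return addresses.
    Returns dict mapping address tuple -> sorted list of thread ids.
    """
    groups = defaultdict(list)
    for tid, addresses in stacks.items():
        groups[tuple(addresses)].append(tid)
    return {k: sorted(v) for k, v in groups.items()}

# Hypothetical snapshot: threads 0-2 share a call stack, thread 3 diverges.
stacks = {
    0: [0x400A10, 0x400B20],
    1: [0x400A10, 0x400B20],
    2: [0x400A10, 0x400B20],
    3: [0x400A10, 0x400C30],  # the odd one out: likely the defective thread
}
groups = group_threads_by_call_stack(stacks)
for addrs, tids in groups.items():
    print([hex(a) for a in addrs], "->", tids)
```

    Displaying the small group (thread 3 here) is exactly what lets a developer focus on the divergent, likely-defective threads.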

  3. Source-Level Debugging of Automatically Parallelized Programs

    Science.gov (United States)

    1992-10-23

    environment can be difficult. However, users typically design their programs so that they can be run... because debugging a program usually requires that... AUTOMATICALLY PARALLELIZED PROGRAM... In general, the relationship between global and local addresses on a processor p is as follows: Equation... Implementation, pages 1-11. ACM, San Francisco, California, June 1992. [Bruegge 91] Bruegge, B. A portable platform for distributed event environments

  4. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  6. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  8. DySectAPI: Scalable Prescriptive Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Quarfot Nielsen, Niklas

    We present DySectAPI, a tool that allows users to construct probe trees for automatic, event-driven debugging at scale. The traditional, interactive debugging model, whereby users manually step through and inspect their application, does not scale well even for current supercomputers. While lightweight debugging models scale well, they can currently only debug a subset of bug classes. DySectAPI fills the gap between these two approaches with a novel user-guided approach. Using both experimental results and analytical modeling, we show how DySectAPI scales and can run with a low overhead...
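
    As a conceptual illustration only, a probe tree can be thought of as a hierarchy of conditions whose child probes activate when the parent fires. The class, event shape, and messages below are invented for this sketch; the real DySectAPI targets MPI-scale debugging infrastructure, not plain Python:

```python
class Probe:
    """Sketch of a probe-tree node: when an event satisfies this probe's
    condition, its action runs and its child probes are evaluated.
    This mimics the prescriptive, event-driven model conceptually."""
    def __init__(self, condition, action, children=()):
        self.condition = condition
        self.action = action
        self.children = list(children)

    def handle(self, event, log):
        if self.condition(event):
            log.append(self.action(event))
            for child in self.children:
                child.handle(event, log)

# Hypothetical tree: on any crash, record it; if the crash hit rank 0,
# drill down and record extra detail.
leaf = Probe(lambda e: e["rank"] == 0,
             lambda e: f"detail: rank 0 crashed at {e['where']}")
root = Probe(lambda e: e["kind"] == "crash",
             lambda e: f"crash on rank {e['rank']}",
             children=[leaf])

log = []
root.handle({"kind": "crash", "rank": 0, "where": "solver.c:42"}, log)
print(log)
```

    The point of the tree shape is that expensive, detailed probes only activate along paths where coarser parent probes have already triggered, which is what keeps the model scalable.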

  9. Distributed debugging and Tumult

    NARCIS (Netherlands)

    Scholten, Johan; Jansen, P.G.

    1990-01-01

    A description is given of Tumult (Twente university multicomputer) and its operating system, along with considerations about parallel debugging, examples of parallel debuggers, and the proposed debugger for Tumult. Problems related to debugging distributed systems and solutions found in other...

  10. MPI Debugging with Handle Introspection

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.

    The Message Passing Interface, MPI, is the standard programming model for high performance computing clusters. However, debugging applications on large scale clusters is difficult. The widely used Message Queue Dumping interface enables inspection of message queue state but there is no general in...

  11. Tracking Students' Cognitive Processes during Program Debugging--An Eye-Movement Approach

    Science.gov (United States)

    Lin, Yu-Tzu; Wu, Cheng-Chih; Hou, Ting-Yun; Lin, Yu-Chih; Yang, Fang-Ying; Chang, Chia-Hu

    2016-01-01

    This study explores students' cognitive processes while debugging programs by using an eye tracker. Students' eye movements during debugging were recorded by an eye tracker to investigate whether and how high- and low-performance students act differently during debugging. Thirty-eight computer science undergraduates were asked to debug two C…

  12. An approach to profiling for run-time checking of computational properties and performance debugging in logic programs.

    OpenAIRE

    Mera, E.; Trigo, Teresa; López García, Pedro; Hermenegildo, Manuel V.

    2010-01-01

    Although several profiling techniques for identifying performance bottlenecks in logic programs have been developed, they are generally not automatic and in most cases they do not provide enough information for identifying the root causes of such bottlenecks. This complicates using their results for guiding performance improvement. We present a profiling method and tool that provides such explanations. Our profiler associates cost centers to certain program elements and can measure different ...

  13. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  14. Integrated Debugging of Modelica Models

    Directory of Open Access Journals (Sweden)

    Adrian Pop

    2014-04-01

    The high abstraction level of equation-based object-oriented (EOO) languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of those problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational and algorithmic code debugging in an integrated fashion.

  15. Debugging in a multi-processor environment

    International Nuclear Information System (INIS)

    Spann, J.M.

    1981-01-01

    The Supervisory Control and Diagnostic System (SCDS) for the Mirror Fusion Test Facility (MFTF) consists of nine 32-bit minicomputers arranged in a tightly coupled distributed computer system utilizing shared memory as the data exchange medium. Debugging more than one program in the multi-processor environment is a difficult process. This paper describes the new tools that were developed and how software testing is performed in the SCDS for the MFTF project.

  16. An object-oriented extension for debugging the virtual machine

    Energy Technology Data Exchange (ETDEWEB)

    Pizzi, Jr, Robert G. [Univ. of California, Davis, CA (United States)

    1994-12-01

    A computer is nothing more than a virtual machine programmed by source code to perform a task. The program's source code expresses abstract constructs which are compiled into some lower-level target language. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low-level target implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design and data into the source code. We introduce OODIE, an object-oriented extension to programming languages that allows programmers to specify a virtual environment by describing the meaning of the design and data of a virtual machine. This specification is translated into symbolic information such that an augmented debugger can present engineers with a programmable debugging environment specifically tailored for the virtual machine that is to be debugged.

  17. Study of the nonlinear imperfect software debugging model

    International Nuclear Information System (INIS)

    Wang, Jinyong; Wu, Zhibo

    2016-01-01

    In recent years there has been a dramatic proliferation of research on imperfect software debugging phenomena. Software debugging is a complex process and is affected by a variety of factors, including the environment, resources, personnel skills, and personnel psychologies. Therefore, the simple assumption that debugging is perfect is inconsistent with the actual software debugging process, wherein a new fault can be introduced when removing a fault. Furthermore, the fault introduction process is nonlinear, and the cumulative number of nonlinearly introduced faults increases over time. Thus, this paper proposes a nonlinear, NHPP imperfect software debugging model in consideration of the fact that fault introduction is a nonlinear process. The fitting and predictive power of the NHPP-based proposed model are validated through related experiments. Experimental results show that this model displays better fitting and predicting performance than the traditional NHPP-based perfect and imperfect software debugging models. S-confidence bounds are set to analyze the performance of the proposed model. This study also examines and discusses optimal software release-time policy comprehensively. In addition, this research on the nonlinear process of fault introduction is significant given the recent surge of studies on software-intensive products, such as cloud computing and big data. - Highlights: • Fault introduction is a nonlinear changing process during the debugging phase. • The assumption that the process of fault introduction is nonlinear is credible. • Our proposed model can better fit and accurately predict software failure behavior. • Research on fault introduction case is significant to software-intensive products.
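
    For orientation, a minimal numeric sketch of the modeling idea: the classic Goel-Okumoto NHPP assumes perfect debugging, with mean value function m(t) = a(1 - e^(-bt)), while imperfect-debugging variants let the expected fault content grow over time as faults are introduced during removal. The growth form and all parameters below are illustrative assumptions, not the paper's calibrated model:

```python
import math

def goel_okumoto(t, a, b):
    """Perfect-debugging NHPP: expected cumulative faults detected by
    time t, with a = initial fault content, b = fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def imperfect_debugging(t, a, b, alpha):
    """Illustrative imperfect-debugging variant: the fault content grows
    with time as a*(1 + alpha*t), so new faults can be introduced while
    old ones are removed. Form and parameters are hypothetical."""
    return a * (1.0 + alpha * t) * (1.0 - math.exp(-b * t))

# Compare the two mean value functions at a few time points.
for t in (0, 10, 50, 100):
    print(t,
          round(goel_okumoto(t, 100, 0.05), 1),
          round(imperfect_debugging(t, 100, 0.05, 0.002), 1))
```

    The imperfect-debugging curve always lies at or above the perfect-debugging one, reflecting the extra faults introduced during removal; fitting such a model is then a matter of estimating a, b, and the introduction parameter from failure data.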

  18. Lightweight and Statistical Techniques for Petascale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.

  19. Performance evaluation of automatic voltage regulators ...

    African Journals Online (AJOL)

    The performance of various Automatic Voltage Regulators (AVRs) in Nigeria and the causes of their inability to regulate at their set points have been investigated. The results indicate that the imported AVRs fail to give the 220 volts displayed on the name plate at the specified low set points (such as 100, 120 volts, etc.) on ...

  20. A Scalable Prescriptive Parallel Debugging Model

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Quarfot Nielsen, Niklas; Lee, Gregory L.

    2015-01-01

    Debugging is a critical step in the development of any parallel program. However, the traditional interactive debugging model, where users manually step through code and inspect their application, does not scale well even for current supercomputers due to its centralized nature. While lightweight...

  1. Visualizing Debugging Activity in Source Code Repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight into the...

  3. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers significantly affect their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, the network interconnect, such as InfiniBand, may be exploited to maximize energy savings, while the application performance loss and frequency-switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy saving strategies on a per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them in addition to DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with only a small performance loss on the given platform.

  4. Dependability Evaluation of COTS Microprocessors via On-Chip debugging facilities

    OpenAIRE

    Isaza-González, José; Serrano-Cases, Alejandro; Restrepo-Calle, Felipe; Cuenca-Asensi, Sergio; Martínez-Álvarez, Antonio

    2017-01-01

    This paper presents a fault injection tool and methodology for performing Single-Event-Upset (SEU) injection campaigns on Commercial-off-the-shelf (COTS) microprocessors. This method takes advantage of the debug facilities of modern microprocessors along with the standard GNU Debugger (GDB) for executing and debugging benchmarks. The experiments developed on real boards, as well as on virtual machines, demonstrate the feasibility and flexibility of the proposal as a low-cost solution for assess...

  5. Debug automation from pre-silicon to post-silicon

    CERN Document Server

    Dehbashi, Mehdi

    2015-01-01

    This book describes automated debugging approaches for the bugs and the faults which appear in different abstraction levels of a hardware system. The authors employ a transaction-based debug approach to systems at the transaction level, asserting the correct relation of transactions. The automated debug approach for design bugs finds the potential fault candidates at RTL and gate level of a circuit. Debug techniques for logic bugs and synchronization bugs are demonstrated, enabling readers to localize the most difficult bugs. Debug automation for electrical faults (delay faults) finds the potentially failing speedpaths in a circuit at gate level. The various debug approaches described achieve high diagnosis accuracy and reduce the debugging time, shortening the IC development cycle and increasing the productivity of designers. Describes a unified framework for debug automation used at both pre-silicon and post-silicon stages; Provides approaches for debug automation of a hardware system at different levels of ...

  6. GeoBuilder: a geometric algorithm visualization and debugging system for 2D and 3D geometric computing.

    Science.gov (United States)

    Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai

    2009-01-01

    Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, computer networks, etc., to facilitate algorithmic researchers in testing their ideas, demonstrating new findings, and teaching algorithm design in the classroom. Within the broad applications of algorithm visualization, there still remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effect in 3D environments. Using modern technologies of Java programming, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, engagement of collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications.

  7. TUNE: Compiler-Directed Automatic Performance Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-18

    This project has developed compiler-directed performance tuning technology targeting the Cray XT4 Jaguar system at Oak Ridge, which has multi-core Opteron nodes with SSE-3 SIMD extensions, and the Cray XE6 Hopper system at NERSC. To achieve this goal, we combined compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation, which have been developed by the PIs over the past several years. We examined DOE Office of Science applications to identify performance bottlenecks and apply our system to computational kernels that operate on dense arrays. Our goal for this performance-tuning technology has been to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, we aim to make our technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  8. Debugging systems-on-chip communication-centric and abstraction-based techniques

    CERN Document Server

    Vermeulen, Bart

    2014-01-01

    This book describes an approach and supporting infrastructure to facilitate debugging the silicon implementation of a System-on-Chip (SOC), allowing its associated product to be introduced into the market more quickly.  Readers learn step-by-step the key requirements for debugging a modern, silicon SOC implementation, nine factors that complicate this debugging task, and a new debug approach that addresses these requirements and complicating factors.  The authors’ novel communication-centric, scan-based, abstraction-based, run/stop-based (CSAR) debug approach is discussed in detail, showing how it helps to meet debug requirements and address the nine, previously identified factors that complicate debugging silicon implementations of SOCs. The authors also derive the debug infrastructure requirements to support debugging of a silicon implementation of an SOC with their CSAR debug approach. This debug infrastructure consists of a generic on-chip debug architecture, a configurable automated design-for-debug ...

  9. Automatic handwritten mensural notation interpreter: From manuscript to MIDI performance

    OpenAIRE

    Huang, Yu-Hui; Chen, Xuanli; Beck, Serafina; Burn, David; Van Gool, Luc

    2015-01-01

    Huang Y.-H., Chen X., Beck S., Burn D., Van Gool L., ''Automatic handwritten mensural notation interpreter: From manuscript to MIDI performance'', 16th International Society for Music Information Retrieval conference - ISMIR 2015, pp. 79-85, October 26-30, 2015, Malaga, Spain.

  10. Performance of automatic scanning microscope for nuclear emulsion experiments

    Science.gov (United States)

    Güler, A. Murat; Altınok, Özgür

    2015-12-01

    The impressive improvements in scanning technology and methods allow nuclear emulsion to be used as a target in recent large experiments. We report the performance of an automatic scanning microscope for nuclear emulsion experiments. After successful calibration and alignment of the system, we have reached 99% tracking efficiency for minimum-ionizing tracks penetrating through the emulsion films. The automatic scanning system has been successfully used for scanning emulsion films in the OPERA experiment and is planned for use in the next generation of nuclear emulsion experiments.

  11. Automatic Performance Model Generation for Java Enterprise Edition (EE) Applications

    OpenAIRE

    Brunnert, Andreas; Vögele, Christian; Krcmar, Helmut

    2015-01-01

    The effort required to create performance models for enterprise applications is often out of proportion compared to their benefits. This work aims to reduce this effort by introducing an approach to automatically generate component-based performance models for running Java EE applications. The approach is applicable for all Java EE server products as it relies on standardized component types and interfaces to gather the required data for modeling an application. The feasibility of the approac...

  12. A debugging method of the Quadrotor UAV based on infrared thermal imaging

    Science.gov (United States)

    Cui, Guangjie; Hao, Qian; Yang, Jianguo; Chen, Lizhi; Hu, Hongkang; Zhang, Lijun

    2018-01-01

    High-performance UAVs have been popular and in great demand in recent years. The paper introduces a new method for debugging quadrotor UAVs. Based on infrared thermal technology and heat transfer theory, a UAV is debugged above a hot-wire grid composed of 14 heated nichrome wires. The air flow propelled by the rotating rotors influences the temperature distribution of the hot-wire grid. An infrared thermal imager below observes the distribution and captures thermal images of the hot-wire grid. With the assistance of a mathematical model and some experiments, the paper discusses the relationship between the thermal images and the rotor speeds. By putting debugged UAVs through the test, reference information and thermal images can be acquired. The paper demonstrates that, by comparison against these reference thermal images, a UAV being debugged in the same test can yield critical data directly or after interpolation. The results are shown in the paper and the advantages are discussed.

  13. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two...

  14. Presenting results of software model checker via debugging interface

    OpenAIRE

    Kohan, Tomáš

    2012-01-01

    Title: Presenting results of software model checker via debugging interface Author: Tomáš Kohan Department: Department of Software Engineering Supervisor of the master thesis: RNDr. Ondřej Šerý, Ph.D., Department of Distributed and Dependable Systems Abstract: This thesis is devoted to the design and implementation of a new debugging interface for the Java PathFinder application. The Eclipse development environment was selected as a suitable interface container. The created interface should vis...

  15. Data Integration Tool: Permafrost Data Debugging

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.

    2017-12-01

    We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC) with the GTN-P to improve permafrost data access and discovery. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets (https://github.com/PermaData/DIT). Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs, incrementally interact with and evolve the widget workflows, and save those workflows for reproducibility. Drawing on ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, DIT was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (~1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
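
    The select-and-order widget model described here can be sketched as a simple function pipeline; the widgets below are toy stand-ins invented for illustration, not DIT's actual operators:

```python
def make_pipeline(*widgets):
    """Chain single-purpose widgets (read, transform, write) into one
    workflow, echoing the select-and-order model: each widget takes the
    previous widget's output as its input."""
    def run(data):
        for widget in widgets:
            data = widget(data)
        return data
    return run

# Toy widgets over a list of sensor readings (hypothetical data).
drop_missing = lambda xs: [x for x in xs if x is not None]
scale = lambda factor: (lambda xs: [x * factor for x in xs])
sort_asc = lambda xs: sorted(xs)

# The user picks and orders widgets to suit the dataset at hand.
workflow = make_pipeline(drop_missing, scale(1.0), sort_asc)
print(workflow([3.2, None, -1.5, 0.0]))
```

    Saving the ordered widget list (rather than the cleaned output alone) is what makes such a workflow reproducible and easy to iterate on.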

  16. Automatic performance tuning of parallel and accelerated seismic imaging kernels

    KAUST Repository

    Haberdar, Hakan

    2014-01-01

    With the increased complexity and diversity of mainstream high performance computing systems, significant effort is required to tune parallel applications in order to achieve the best possible performance for each particular platform. This task is becoming more and more challenging and requires an ever-larger set of skills. Automatic performance tuning is becoming a must for optimizing applications such as Reverse Time Migration (RTM), widely used in seismic imaging for oil and gas exploration. An empirical search based auto-tuning approach is applied to the MPI communication operations of the parallel isotropic and tilted transverse isotropic kernels. The application of auto-tuning using the Abstract Data and Communication Library improved the performance of the MPI communications as well as developer productivity by providing a higher level of abstraction. Keeping productivity in mind, we opted for pragma based programming for accelerated computation on the latest accelerator architectures such as GPUs, using the fairly new OpenACC standard. The same auto-tuning approach is also applied to the OpenACC accelerated seismic code for optimizing the compute intensive kernel of the Reverse Time Migration application. The application of this technique resulted in improved performance of the original code and its ability to adapt to different execution environments.
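    The empirical-search idea behind such auto-tuners is simple to sketch: time the kernel under each candidate configuration and keep the fastest. Everything below (the autotune helper, the toy kernel, the chunk knob) is a hypothetical illustration, not code from this tuner or the Abstract Data and Communication Library.

```python
import time

def time_kernel(kernel, params):
    """Wall-clock one execution of the kernel under the given parameters."""
    start = time.perf_counter()
    kernel(**params)
    return time.perf_counter() - start

def autotune(kernel, candidate_params, runs=3):
    """Empirical search: run the kernel under each candidate
    configuration, keep the best of several timings, and return
    the fastest configuration found."""
    best_params, best_time = None, float("inf")
    for params in candidate_params:
        elapsed = min(time_kernel(kernel, params) for _ in range(runs))
        if elapsed < best_time:
            best_params, best_time = params, elapsed
    return best_params, best_time

# Toy "kernel": summing in chunks; the chunk size stands in for a real
# tuning knob such as an MPI message size or an OpenACC gang/vector size.
def toy_kernel(chunk=1024):
    total = 0
    for i in range(0, 100_000, chunk):
        total += sum(range(i, min(i + chunk, 100_000)))
    return total

params, _ = autotune(toy_kernel, [{"chunk": c} for c in (64, 1024, 16384)])
print("best chunk:", params["chunk"])
```

    Taking the minimum of several runs per candidate reduces timing noise, which matters when the candidates differ by only a few percent.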

  17. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-nonsense look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our atten...

  18. Aspects of a Theory of Simplification, Debugging, and Coaching.

    Science.gov (United States)

    Fischer, Gerhard; And Others

    This paper analyses new methods of teaching skiing in terms of a computational paradigm for learning called increasingly complex microworlds (ICM). Examining the factors that underlie the dramatic enhancement of the learning of skiing led to the focus on the processes of simplification, debugging, and coaching. These three processes are studied in…

  19. Comparison of First Gear Performance for Manual and Automatic Transmissions

    Directory of Open Access Journals (Sweden)

    Kyle Stottlemyer

    2011-01-01

    Full Text Available The purpose of this project is to compare the first gear performance of an automobile for both its manual and automatic transmission options. Each transmission type has a different gear ratio, which yields a different acceleration curve for each transmission throughout the torque-rpm curve of the engine. Integral calculus was used to find an equation for the time at any point in the car's acceleration. The automobile's velocity versus time was then graphed to compare each transmission's acceleration trend. This process is similar to that which automotive companies may use when determining what type of transmission to pair with a particular vehicle. By observing the trends in the acceleration graphs, it was determined that there are specific advantages and disadvantages to each type of transmission. Which transmission is the “better” choice depends on what application the automobile will be used for (e.g. racing, day-to-day driving, towing/hauling).

  20. 20 CFR 404.285 - Recomputations performed automatically.

    Science.gov (United States)

    2010-04-01

    ... DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Recomputing Your Primary Insurance Amount... any sooner than it would be under an automatic recomputation. You may also waive a recomputation if...

  1. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
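    Record-level backward tracing of the kind Panda supports can be illustrated with a toy processing node that records, for each output record, which input records produced it. The function names and the output-index-to-input-indices mapping are assumptions for illustration, not Panda's interface.

```python
# Sketch of logical provenance for one processing node:
# the node keeps a map from each output record to the input
# records that contributed to it, enabling backward tracing.

def filter_node(records, predicate):
    """An opaque processing node that also emits provenance:
    output index -> list of contributing input indices."""
    outputs, provenance = [], {}
    for i, rec in enumerate(records):
        if predicate(rec):
            provenance[len(outputs)] = [i]
            outputs.append(rec)
    return outputs, provenance

def backward_trace(provenance, output_index):
    """Trace one output record back to its input records."""
    return provenance[output_index]

students = [("alice", 91), ("bob", 58), ("carol", 77)]
passed, prov = filter_node(students, lambda r: r[1] >= 60)
# Which input produced the second passing record ("carol")?
print(backward_trace(prov, 1))  # -> [2]
```

    Chaining such nodes lets a debugger walk a suspicious final result backward through the whole workflow, which is the drill-down use case the demonstration describes.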

  2. Predicting automatic speech recognition performance over communication channels from instrumental speech quality and intelligibility scores

    NARCIS (Netherlands)

    Gallardo, L.F.; Möller, S.; Beerends, J.

    2017-01-01

    The performance of automatic speech recognition based on coded-decoded speech heavily depends on the quality of the transmitted signals, determined by channel impairments. This paper examines relationships between speech recognition performance and measurements of speech quality and intelligibility.

  3. Edit, compile, execute and debug C++ on the web

    OpenAIRE

    Lobo Cusidó, Albert

    2017-01-01

    The aim of this project is to create a web application to edit, compile, and debug C++ code. This application can be used by instructors to make introductory programming courses more engaging. The first phase of this project provides the planning and design of a software solution to build the application. The main phase describes the implementation of the solution using rapid application development methodology. In the final phase, the implemented solution is evaluated, conc...

  4. Debugging Nondeterministic Failures in Linux Programs through Replay Analysis

    Directory of Open Access Journals (Sweden)

    Shakaiba Majeed

    2018-01-01

    Full Text Available Reproducing a failure is the first and most important step in debugging because it enables us to understand the failure and track down its source. However, many programs are susceptible to nondeterministic failures that are hard to reproduce, which makes debugging extremely difficult. We first address the reproducibility problem by proposing an OS-level replay system for a uniprocessor environment that can capture and replay the nondeterministic events needed to reproduce a failure in Linux interactive and event-based programs. We then present an analysis method, called replay analysis, based on the proposed record and replay system to diagnose concurrency bugs in such programs. The replay analysis method uses a combination of static analysis, dynamic tracing during replay, and delta debugging to identify failure-inducing memory access patterns that lead to concurrency failure. The experimental results show that the presented record and replay system has low recording overhead and hence can be safely used in production systems to catch rarely occurring bugs. We also present a few concurrency bug case studies from real-world applications to prove the effectiveness of the proposed bug diagnosis framework.
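    The delta debugging step used in the replay analysis can be sketched as a ddmin-style reduction: repeatedly drop chunks of the recorded event sequence while the failure still reproduces. This is a simplified illustration with an assumed fails() oracle standing in for a replayed run, not the paper's implementation.

```python
def delta_debug(events, fails):
    """Simplified delta debugging: try dropping chunks of the event
    sequence; keep any reduced sequence that still fails, refining
    the chunk granularity until no further reduction is possible."""
    n = 2
    while len(events) >= 2:
        chunk = max(1, len(events) // n)
        reduced = False
        for i in range(0, len(events), chunk):
            candidate = events[:i] + events[i + chunk:]
            if candidate and fails(candidate):
                events, n, reduced = candidate, max(n - 1, 2), True
                break
        if not reduced:
            if n >= len(events):
                break          # already at single-event granularity
            n = min(n * 2, len(events))
    return events

# Toy failure oracle: the run fails whenever events 3 and 7 both occur.
fails = lambda evs: 3 in evs and 7 in evs
minimal = delta_debug(list(range(10)), fails)
print(sorted(minimal))  # -> [3, 7]
```

    In the paper's setting, fails() would replay the recorded execution with the candidate event subset and check whether the concurrency failure still manifests.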

  5. A Framework to Debug Diagnostic Matrices

    Science.gov (United States)

    Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann

    2013-01-01

    Diagnostics is an important concept in system health and monitoring of space operations. Many existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also popularly known as a diagnostic dictionary, fault signature matrix, or reachability matrix) gleaned from physical models. Sometimes, however, this is not sufficient to obtain high diagnostic performance. In such a case, it is important to modify the D-matrix based on knowledge obtained from other sources, such as time-series data streams (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure includes conditions ranging from modifying 0s and 1s in the matrix to adding/removing rows (failure sources) and columns (tests). We experiment with this framework on datasets from the DX Challenge 2009.
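    A minimal version of the "modify 0s and 1s" step can be sketched as flipping the single D-matrix entry that best reconciles the matrix with observed test outcomes. The scoring function and the assumption of a known true fault are illustrative simplifications, not the DME procedure itself.

```python
# Toy D-matrix: rows = failure sources, columns = tests;
# entry 1 means the failure is expected to trip that test.

def diagnose(d_matrix, observed):
    """Score each failure source by how many test outcomes its row
    matches (higher = better match with the observed data)."""
    return [sum(1 for r, o in zip(row, observed) if r == o)
            for row in d_matrix]

def best_single_flip(d_matrix, observed, true_fault):
    """Try flipping each 0/1 entry; return the (row, col) flip that
    most improves the score of the known true failure source, or
    None if no single flip helps."""
    base = diagnose(d_matrix, observed)[true_fault]
    best, best_gain = None, 0
    for i, row in enumerate(d_matrix):
        for j, _ in enumerate(row):
            trial = [r[:] for r in d_matrix]
            trial[i][j] ^= 1
            gain = diagnose(trial, observed)[true_fault] - base
            if gain > best_gain:
                best, best_gain = (i, j), gain
    return best

d_matrix = [[1, 0, 1],   # fault 0
            [0, 1, 0]]   # fault 1
observed = [1, 1, 1]     # tests that actually fired
print(best_single_flip(d_matrix, observed, true_fault=0))  # -> (0, 1)
```

    A real evaluator would weigh such flips against row/column additions and removals, taking the least expensive modification first as the abstract describes.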

  6. Performance of data acceptance criteria over 50 months from an automatic real-time environmental radiation surveillance network

    International Nuclear Information System (INIS)

    Casanovas, R.; Morant, J.J.; Lopez, M.; Hernandez-Giron, I.; Batalla, E.; Salvado, M.

    2011-01-01

    The automatic real-time environmental radiation surveillance network of Catalonia (Spain) comprises two subnetworks: one with 9 aerosol monitors and the other with 8 Geiger monitors, together with 2 water monitors located in the Ebre river. Since September 2006, several improvements were implemented in order to improve the quality and quantity of the data, allowing a more accurate data analysis. However, several causes (natural causes, equipment failure, artificial external causes, and incidents in nuclear power plants) may produce measured radiological values inconsistent with a station's own background, whether insignificant spurious readings or true radiological values. Thus, data from a 50-month period were analysed, which allowed us to establish an easily implementable statistical criterion to find those values that require special attention. This criterion proved a very useful tool for creating a properly debugged database and for giving a quick response to equipment failures or possible radiological incidents. This paper presents the results obtained from applying the criterion, including the figures for the expected, raw, and debugged data, percentages of missing data grouped by cause, and radiological measurements from the networks. Finally, based on the discussed information, recommendations for the improvement of the network are identified to obtain better radiological information and analysis capabilities. - Highlights: → Causes producing data inconsistent with a station's own background are described. → Causes may be natural, equipment failure, external, or nuclear plant incidents. → These causes can produce either spurious or true radiological data. → A criterion to find these data was implemented and tested over a 50-month period. → Recommendations for the improvement of the network are identified.
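    An easily implementable criterion of the kind described, flagging readings that deviate from a station's own background by more than k standard deviations, might look like the sketch below. The threshold k = 2 and the toy dose-rate series are assumptions for illustration, not the network's actual criterion.

```python
import statistics

def flag_outliers(readings, k=2.0):
    """Flag readings deviating more than k standard deviations from
    the station's own background (mean of the series). A flagged
    value may be spurious or a true radiological anomaly -- it only
    marks data that require special attention."""
    mean = statistics.fmean(readings)
    sd = statistics.stdev(readings)
    return [i for i, x in enumerate(readings) if abs(x - mean) > k * sd]

# Toy dose-rate series (uSv/h) with one spike at index 5.
series = [0.10, 0.11, 0.10, 0.09, 0.10, 0.45, 0.11, 0.10]
print(flag_outliers(series))  # -> [5]
```

    In practice the background statistics would be computed per station from a long debugged history rather than from the series being screened, so that a sustained anomaly does not inflate its own threshold.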

  7. The roots of stereotype threat: when automatic associations disrupt girls' math performance.

    Science.gov (United States)

    Galdi, Silvia; Cadinu, Mara; Tomasetto, Carlo

    2014-01-01

    Although stereotype awareness is a prerequisite for stereotype threat effects (Steele & Aronson, 1995), research showed girls' deficit under stereotype threat before the emergence of math-gender stereotype awareness, and in the absence of stereotype endorsement. In a study including 240 six-year-old children, this paradox was addressed by testing whether automatic associations trigger stereotype threat in young girls. Whereas no indicators were found that children endorsed the math-gender stereotype, girls, but not boys, showed automatic associations consistent with the stereotype. Moreover, results showed that girls' automatic associations varied as a function of a manipulation regarding the stereotype content. Importantly, girls' math performance decreased in a stereotype-consistent, relative to a stereotype-inconsistent, condition and automatic associations mediated the relation between stereotype threat and performance. © 2013 The Authors. Child Development © 2013 Society for Research in Child Development, Inc.

  8. Development of a system for automatic control and performance evaluation of shutoff units in HANARO

    International Nuclear Information System (INIS)

    Jeong, Y. H.; Joe, Y. G.; Choi, Y. S.; Woo, J. S.

    2003-01-01

    The function of the shutoff units is to rapidly insert the shutoff rod into the reactor core for safe shutdown of the reactor. This paper describes the development of a system for automatic control and performance evaluation of shutoff units. The system automatically drives the shutoff unit with a specified operation cycle and records the performance of the drive mechanism in graphs and data. It also records the operating parameters of the shutoff unit and test facility. The characteristics of the developed system were evaluated by comparing it with the one in use in the HANARO reactor. The system will be used for performance and endurance tests in the test facility. Hereafter, the system will efficiently be used for normal operation and the periodic drop performance tests of shutoff units in HANARO.

  9. BoardScope: a debug tool for reconfigurable systems

    Science.gov (United States)

    Levi, Delon; Guccione, Steven A.

    1998-10-01

    BoardScope is a portable, interactive debug tool for Xilinx FPGA-based hardware. BoardScope features simple but powerful graphical interfaces for viewing FPGA circuits in their operational state. The main display graphically shows the Configurable Logic Block (CLB) flip-flop states of all FPGA devices in the system. This display indicates overall dataflow and is used to find design errors, such as clocking problems or incorrect initialization or reset. The CLB display shows the complete state, including LUT values and flip-flop configuration, of any CLB. Finally, a symbol file may be used to drive a waveform display. The waveform display permits both bit-level signals and multibit busses to be viewed in a fashion similar to that used by circuit simulators. BoardScope is implemented completely in JavaTM, using the Abstract Window Toolkit version 1.1. The interface to the hardware is provided by XHWIF, the Xilinx standard HardWare InterFace for FPGA based hardware. This simplifies porting, and provides remote access capabilities. Remote access allows multiple users to communicate with the hardware across a network. BoardScope currently runs on the WildForceTM and WildOneTM boards from Annapolis Microsystems, and the PCI Pamette from Digital Equipment Corporation. Ports to other systems are currently under way.

  10. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    International Nuclear Information System (INIS)

    Jang, Yu Jin

    2013-01-01

    This paper presents an automatic performance estimation scheme for a conceptual temperature control system with a multi-heater configuration, applied prior to constructing the physical system in order to achieve rapid validation of the conceptual design. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors, including the geometric shape of the controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated using an FEM (finite element method) simulation combined with the controller.

  11. An automatic frequency control system of 2-MeV electronic LINAC

    International Nuclear Information System (INIS)

    Hu Xue; Zhang Junqiang; Zhong Shaopeng; Zhao Minghua

    2013-01-01

    Background: In electronic LINACs, a magnetron is often used as the power source. The output frequency of the magnetron changes with the environment, and the resulting difference between the magnetron output frequency and the accelerator frequency degrades the performance of the LINAC system. Purpose: To ensure effective operation of the entire LINAC system, an automatic frequency control system is necessary. Methods: A phase-locked frequency discriminator is designed to discriminate the frequencies of the accelerator guide and the magnetron, and an analogue circuit is used to process the output signals of the frequency discriminator unit. Results: Working with the automatic frequency control (AFC) system, the output frequency of the magnetron can be controlled within the range of (2998 MHz, 2998 MHz + 70 kHz) and (2998 MHz, 2998 MHz - 30 kHz). Conclusions: Through measurement and debugging, the functionality of the frequency discriminator unit and the signal processor circuit was verified. (authors)

  12. The Effects of Background Noise on the Performance of an Automatic Speech Recogniser

    Science.gov (United States)

    Littlefield, Jason; HashemiSakhtsari, Ahmad

    2002-11-01

    Ambient or environmental noise is a major factor that affects the performance of an automatic speech recognizer. Large-vocabulary, speaker-dependent, continuous speech recognizers are commercially available. Speech recognizers perform well in a quiet environment but poorly in a noisy environment. Speaker-dependent speech recognizers require training prior to being tested, and the level of background noise in both phases affects the performance of the recognizer. This study aims to determine whether the best performance of a speech recognizer occurs when the levels of background noise during the training and test phases are the same, and how the performance is affected when the levels of background noise during the training and test phases differ. The relationship between the performance of the speech recognizer and upgrades to the computer speed, amount of memory, and software version was also investigated.

  13. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    Science.gov (United States)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology to maintain the real-time balance between power generation and load, and to ensure the quality of the power supply. Power grids require each power generation unit to have a satisfactory AGC performance, as specified in two detailed rules. The two rules provide a set of indices to measure the AGC performance of a power generation unit. However, the commonly-used method to calculate these indices is based on particular data samples from AGC responses and will lead to incorrect results in practice. This paper proposes a new method to estimate the AGC performance indices via system identification techniques. In addition, a nonlinear regression model between the performance indices and the load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.
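    The contrast between sample-based and identification-based index estimation can be illustrated with a first-order model y[k+1] = a*y[k] + b*u[k] fitted by least squares, from which an index is then read off the model rather than off raw samples. The model order, the simulated data, and the steady-state-gain index are illustrative assumptions, not the indices defined in the two detailed rules.

```python
def identify_first_order(u, y):
    """Least-squares fit of (a, b) for y[k+1] = a*y[k] + b*u[k],
    solved via the 2x2 normal equations."""
    syy = sum(yk * yk for yk in y[:-1])
    suu = sum(uk * uk for uk in u[:-1])
    syu = sum(yk * uk for yk, uk in zip(y[:-1], u[:-1]))
    sy1y = sum(y1 * yk for y1, yk in zip(y[1:], y[:-1]))
    sy1u = sum(y1 * uk for y1, uk in zip(y[1:], u[:-1]))
    det = syy * suu - syu * syu
    a = (sy1y * suu - sy1u * syu) / det
    b = (sy1u * syy - sy1y * syu) / det
    return a, b

# Simulate a unit with a = 0.8, b = 0.2 responding to a step command.
a_true, b_true = 0.8, 0.2
u = [1.0] * 50
y = [0.0]
for k in range(49):
    y.append(a_true * y[k] + b_true * u[k])

a, b = identify_first_order(u, y)
# Steady-state gain b/(1-a) is one simple performance index
# that can be computed from the identified model.
print(round(b / (1 - a), 3))  # -> 1.0
```

    With noisy field data the identified model smooths over the particular samples that would otherwise distort a directly computed index, which is the motivation the abstract gives.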

  14. Automatic target recognition performance losses in the presence of atmospheric and camera effects

    Science.gov (United States)

    Chen, Xiaohan; Schmid, Natalia A.

    2010-04-01

    The importance of networked automatic target recognition systems for surveillance applications is continuously increasing. Because of the requirement of low cost and limited payload, these networks are traditionally equipped with lightweight, low-cost sensors such as electro-optical (EO) or infrared sensors. The quality of imagery acquired by these sensors critically depends on the environmental conditions, the type and characteristics of the sensors, and the absence of occluding or concealing objects. In the past, a large number of efficient detection, tracking, and recognition algorithms have been designed to operate on imagery of good quality. However, detection and recognition limits under nonideal environmental and/or sensor-based distortions have not been carefully evaluated. We introduce a fully automatic target recognition system that involves a Haar-based detector to select potential regions of interest within images, performs adjustment of detected regions, segments potential targets using a region-based approach, identifies targets using Bessel K form-based encoding, and performs clutter rejection. We investigate the effects of environmental and camera conditions on target detection and recognition performance. Two databases are involved. One is a simulated database generated using a 3-D tool. The other database is formed by imaging 10 die-cast models of military vehicles from different elevation and orientation angles. The database contains imagery acquired both indoors and outdoors. The indoor data set is composed of clear and distorted images. The distortions include defocus blur, sided illumination, low contrast, shadows, and occlusions. All images in this database, however, have a uniform (blue) background. The indoor database is applied to evaluate the degradation of recognition performance due to camera and illumination effects. The database collected outdoors includes a real background and is much more complex to process. The numerical results...

  15. Compile-Time Debugging of C Programs Working on Trees

    DEFF Research Database (Denmark)

    Elgaard, Jacob; Møller, Anders; Schwartzbach, Michael I.

    2000-01-01

    We exhibit a technique for automatically verifying the safety of simple C programs working on tree-shaped data structures. We do not consider the complete behavior of programs, but only attempt to verify that they respect the shape and integrity of the store. A verified program is guaranteed to p...

  16. Automatic Human Movement Assessment With Switching Linear Dynamic System: Motion Segmentation and Motor Performance.

    Science.gov (United States)

    de Souza Baptista, Roberto; Bo, Antonio P L; Hayashibe, Mitsuhiro

    2017-06-01

    Performance assessment of human movement is critical in diagnosis and motor-control rehabilitation. Recent developments in portable sensor technology enable clinicians to measure spatiotemporal aspects to aid in the neurological assessment. However, the extraction of quantitative information from such measurements is usually done manually through visual inspection. This paper presents a novel framework for automatic human movement assessment that executes segmentation and motor performance parameter extraction in time-series of measurements from a sequence of human movements. We use the elements of a Switching Linear Dynamic System model as building blocks to translate formal definitions and procedures from human movement analysis. Our approach provides a method for users with no expertise in signal processing to create models for movements using a labeled dataset and later use them for automatic assessment. We validated our framework in preliminary tests involving six healthy adult subjects, who executed common movements in functional tests and rehabilitation exercise sessions, such as sit-to-stand and lateral elevation of the arms, and five elderly subjects, two of whom had limited mobility, who executed the sit-to-stand movement. The proposed method worked on random motion sequences for the dual purpose of movement segmentation (accuracy of 72%-100%) and motor performance assessment (mean error of 0%-12%).

  17. Vdebug: debugging tool for parallel scientific programs. Design report on vdebug

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2000-02-01

    We report on a debugging tool called vdebug which supports debugging work for parallel scientific simulation programs. It is difficult to debug scientific programs with existing debuggers, which show data values as characters, because the volume of data generated by such programs is too large for users to check in that form. To alleviate this, we have developed vdebug, which enables users to check the validity of large amounts of data by displaying those data values visually. Although vdebug was originally restricted to sequential programs, we have made it applicable to parallel programs by implementing a function that merges and visualizes data distributed across the programs on each computer node. Vdebug now works on seven kinds of parallel computers. In this report, we describe the design of vdebug. (author)

  18. Performance evaluation of a right atrial automatic capture verification algorithm using two different sensing configurations.

    Science.gov (United States)

    Sperzel, Johannes; Goetze, Stephan; Kennergren, Charles; Biffi, Mauro; Brooke, M Jason; Vireca, Elisa; Saha, Sunipa; Schubert, Bernd; Butter, Christian

    2009-05-01

    This acute data collection study evaluated the performance of a right atrial (RA) automatic capture verification (ACV) algorithm based on evoked response sensing from two electrode configurations during independent unipolar pacing. RA automatic threshold tests were conducted. Evoked response signals were simultaneously recorded between the RA(Ring) electrode and an empty pacemaker housing electrode (RA(Ring)-->Can) and the electrically isolated Indifferent header electrode (RA(Ring)-->Ind). The atrial evoked response (AER) and the performance of the ACV algorithm were evaluated off-line using each sensing configuration. An accurate threshold measurement was defined as one within 0.2 V of the manually measured unipolar threshold. Threshold tests were designed to fail for small AER amplitudes. AER signals were analyzed from 34 patients who were indicated for a pacemaker (five), implantable cardioverter-defibrillator (11), or cardiac resynchronization therapy pacemaker (six) or defibrillator (12). The minimum AER amplitude was larger from RA(Ring)-->Can (1.6+/-0.9 mV) than from RA(Ring)-->Ind (1.3+/-0.8 mV). The algorithm successfully measured the pacing threshold in 96.8% and 91.0% of tests for RA(Ring)-->Can and RA(Ring)-->Ind, respectively. No statistical difference between the unipolar and bipolar pacing thresholds was observed. The RA(Ring)-->Can AER sensing configuration may provide a means of implementing an independent pacing/sensing method for ACV in the RA. RA bipolar pacing therapy based on measured RA unipolar pacing thresholds may be feasible.

  19. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    Directory of Open Access Journals (Sweden)

    Luo Hanwu

    2016-01-01

    Full Text Available This paper presents a novel method to solve for the initial lightning breakdown current by combining the ATP and MATLAB simulation software effectively, with the aim of evaluating the lightning protection performance of transmission lines. Firstly, the executable ATP simulation model is generated automatically from the required information, such as power source parameters, tower parameters, overhead line parameters, grounding resistance, and lightning current parameters, through an interface program coded in MATLAB. Then, data are extracted from the generated LIS files obtained by executing the ATP simulation model, and the occurrence of transmission line breakdown can be determined from the relevant data in the LIS file. The lightning current amplitude is reduced when breakdown occurs, and raised otherwise. Thus the initial lightning breakdown current of a transmission line with given parameters can be determined accurately by continuously changing the lightning current amplitude, which is realized by a loop computing algorithm coded in MATLAB. The method proposed in this paper generates the ATP simulation program automatically, and facilitates the lightning protection performance assessment of transmission lines.
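    The loop that lowers the amplitude after a breakdown and raises it otherwise is, in effect, a bisection search. The paper drives ATP from MATLAB; the sketch below uses Python with a stand-in predicate in place of the ATP run, and the 87 kA threshold and the search bounds are arbitrary assumptions.

```python
def find_critical_current(breaks_down, lo=0.0, hi=300.0, tol=0.5):
    """Bisection on lightning-current amplitude (kA): raise the
    amplitude when no breakdown occurs, lower it when it does,
    until the initial breakdown current is bracketed within tol.
    In the real workflow breaks_down() would execute the generated
    ATP model and inspect the resulting LIS file."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if breaks_down(mid):
            hi = mid      # breakdown -> reduce amplitude
        else:
            lo = mid      # no breakdown -> increase amplitude
    return hi

# Stand-in for the ATP simulation: breakdown above 87 kA.
critical = find_critical_current(lambda i_ka: i_ka > 87.0)
print(round(critical, 1))  # -> 87.0
```

    Because each evaluation is a full ATP simulation, halving the search interval on every run matters: the bisection reaches 0.5 kA resolution over a 300 kA range in about ten simulations.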

  20. Modern multithreading implementing, testing, and debugging multithreaded Java and C++/Pthreads/Win32 programs

    CERN Document Server

    Carver, Richard H

    2005-01-01

    Master the essentials of concurrent programming,including testing and debuggingThis textbook examines languages and libraries for multithreaded programming. Readers learn how to create threads in Java and C++, and develop essential concurrent programming and problem-solving skills. Moreover, the textbook sets itself apart from other comparable works by helping readers to become proficient in key testing and debugging techniques. Among the topics covered, readers are introduced to the relevant aspects of Java, the POSIX Pthreads library, and the Windows Win32 Applications Programming Interface.

  1. The UPSF code: a metaprogramming-based high-performance automatically parallelized plasma simulation framework

    Science.gov (United States)

    Gao, Xiatian; Wang, Xiaogang; Jiang, Binhao

    2017-10-01

    UPSF (Universal Plasma Simulation Framework) is a new plasma simulation code designed for maximum flexibility by using cutting-edge techniques supported by the C++17 standard. Through the use of metaprogramming, UPSF provides arbitrary-dimensional data structures and methods to support various kinds of plasma simulation models, such as Vlasov, particle in cell (PIC), fluid, and Fokker-Planck models, and their variants and hybrid methods. Through C++ metaprogramming, a single code can be applied to systems of arbitrary dimension with no loss of performance. UPSF can also automatically parallelize the distributed data structure and accelerate matrix and tensor operations using BLAS. A three-dimensional particle in cell code has been developed based on UPSF. Two test cases, Landau damping and the Weibel instability for the electrostatic and electromagnetic situations respectively, are presented to show the validity and performance of the UPSF code.

  2. Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2013-12-01

    Full Text Available This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson’s disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD used a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’ and a global tapping severity (GTS. Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well with the visually assessed scores and were significantly different across Unified Parkinson’s Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, good ability to discriminate between healthy elderly subjects and patients in different disease stages, good sensitivity to treatment interventions, and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.
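    The kind of quantitative parameters derived from the tapping signal can be illustrated on a series of tap timestamps. The two features below (tapping speed, and a coefficient-of-variation stand-in for the 'arrhythmia' dimension) are simplified assumptions, not the paper's 24 parameters or its PCA/logistic-regression pipeline.

```python
import statistics

def tapping_features(tap_times):
    """Derive simple quantitative parameters from a series of tap
    timestamps (seconds): tapping speed (taps per second) and a
    rhythm-variability measure (coefficient of variation of the
    inter-tap intervals), loosely analogous to 'arrhythmia'."""
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    speed = len(intervals) / (tap_times[-1] - tap_times[0])
    arrhythmia = statistics.stdev(intervals) / statistics.fmean(intervals)
    return {"speed": speed, "arrhythmia": arrhythmia}

regular = [0.0, 0.5, 1.0, 1.5, 2.0]      # steady tapping
irregular = [0.0, 0.3, 1.1, 1.4, 2.0]    # same duration, uneven rhythm
print(tapping_features(regular)["arrhythmia"] <
      tapping_features(irregular)["arrhythmia"])  # -> True
```

    In the paper, many such parameters are reduced with PCA and mapped to the neurologist's global tapping severity by a trained classifier; this sketch shows only the feature-extraction step.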

  3. Performance Evaluation of Strain Gauge Printed Using Automatic Fluid Dispensing System on Conformal Substrates

    Science.gov (United States)

    Khairilhijra Khirotdin, Rd.; Faridzuan Ngadiron, Mohamad; Adzeem Mahadzir, Muhammad; Hassan, Nurhafizzah

    2017-08-01

    Smart textiles require flexible electronics that can withstand daily stresses such as bends and stretches. Printing with conductive inks provides the required flexibility, but current printing techniques suffer from ink incompatibility, a limited range of printable substrates, and incompatibility with conformal substrates due to their rigidity and low flexibility. An alternative printing technique based on an automatic fluid dispensing system is proposed, and its performance in printing strain gauges on conformal substrates was evaluated to determine its feasibility. The process parameters studied include printing speed, deposition height, curing time and curing temperature. The strain gauge proved functional as expected, since different strains were induced when it was bent at various bending angles and curvature radii on designated bending fixtures. The average change in resistance doubled before the strain gauge started to break. The printed strain gauges also exhibited excellent elasticity, as they were able to withstand bending up to a 70° angle and a 3 mm curvature radius.
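
As a rough illustration of how such a printed gauge would be characterized, the sketch below computes surface strain from substrate thickness and bending radius (for a thin substrate, eps ~ t / 2r) and the resulting gauge factor. The substrate thickness is a hypothetical value; the 3 mm radius and the doubling of resistance come from the abstract.

```python
def bending_strain(thickness_mm, radius_mm):
    """Surface strain of a thin substrate bent to a given curvature
    radius: eps ~ t / (2 r), neutral axis at mid-thickness."""
    return thickness_mm / (2.0 * radius_mm)

def gauge_factor(delta_r_over_r, strain):
    """Gauge factor GF = (dR/R) / strain."""
    return delta_r_over_r / strain

# Hypothetical 0.3 mm substrate at the paper's tightest 3 mm radius,
# with the resistance change doubling (dR/R = 1.0).
eps = bending_strain(0.3, 3.0)   # 5 % surface strain
gf = gauge_factor(1.0, eps)
```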

  4. International Internet-2 performance and automatic tuning protocol for medical imaging applications.

    Science.gov (United States)

    Chan, Lawrence W C; Zhou, Michael Z; Hau, S K; Law, Maria Y Y; Tang, F H; Documet, J

    2005-01-01

    Internet-2 is an advanced computer network that has been widely used for medical imaging applications such as teleradiology and teleconsultation, since it can fulfill the requirements of high-speed data transmission and short turn-around time at low operating cost once installed. However, this high performance may not be retained for global access from international network peers. For the international Internet-2 connection between PolyU and the IPI/USC, two major factors, network looping in the US and a bottleneck in the connection, raise the round-trip time and limit the available bandwidth, respectively. The available bandwidth is further underutilized if the TCP/IP parameters at the sending and receiving computers are not chosen appropriately. This paper proposes a repeatable and consistent protocol to automatically tune these parameters for clinical applications.
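
A central part of such TCP tuning is sizing the send and receive buffers to the bandwidth-delay product (BDP) of the path. The sketch below is a generic illustration, not the paper's protocol; the bandwidth and round-trip-time figures are hypothetical.

```python
import socket

def bandwidth_delay_product(bandwidth_bps, rtt_s):
    """Minimum TCP window (bytes) needed to keep a long-haul path full:
    BDP = bandwidth * round-trip time."""
    return int(bandwidth_bps * rtt_s / 8)

# Hypothetical trans-Pacific path: 100 Mbit/s with a 200 ms RTT.
bdp = bandwidth_delay_product(100e6, 0.200)

# Applying it to a socket (a sketch; the OS may clamp these values).
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bdp)
s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bdp)
s.close()
```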

  5. Text Summarization Evaluation: Correlating Human Performance on an Extrinsic Task with Automatic Intrinsic Metrics

    National Research Council Canada - National Science Library

    President, Stacy F; Dorr, Bonnie J

    2006-01-01

    This research describes two types of summarization evaluation methods, intrinsic and extrinsic, and concentrates on determining the level of correlation between automatic intrinsic methods and human...

  6. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

    Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.

  7. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    Science.gov (United States)

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A(2) SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.
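
Descriptor matching with SIFT is conventionally done by nearest-neighbour search combined with Lowe's ratio test, which pipelines like the one above rely on. The sketch below shows the ratio test on toy 3-D "descriptors"; a real pipeline would first obtain 128-dimensional descriptors from a SIFT implementation (e.g. OpenCV).

```python
import math

def ratio_test_matches(desc_a, desc_b, ratio=0.8):
    """For each descriptor in image A, find its two nearest neighbours
    in image B and keep the match only if the best distance is clearly
    smaller than the second best (Lowe's ratio test)."""
    matches = []
    for i, da in enumerate(desc_a):
        dists = sorted((math.dist(da, db), j) for j, db in enumerate(desc_b))
        (d1, j1), (d2, _) = dists[0], dists[1]
        if d1 < ratio * d2:
            matches.append((i, j1))
    return matches

# Toy descriptors: the first two have obvious partners in B, the third
# is ambiguous (two near-equal neighbours) and should be rejected.
A = [(0.0, 0.0, 1.0), (1.0, 0.0, 0.0), (0.5, 0.5, 0.0)]
B = [(0.0, 0.1, 1.0), (1.0, 0.1, 0.0), (0.5, 0.4, 0.0), (0.5, 0.6, 0.0)]
print(ratio_test_matches(A, B))   # → [(0, 0), (1, 1)]
```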

  8. Man vs. Machine: An interactive poll to evaluate hydrological model performance of a manual and an automatic calibration

    Science.gov (United States)

    Wesemann, Johannes; Burgholzer, Reinhard; Herrnegger, Mathew; Schulz, Karsten

    2017-04-01

    In recent years, much research in hydrological modelling has been invested in improving the automatic calibration of rainfall-runoff models. This includes, for example, (1) the implementation of new optimisation methods, (2) the incorporation of new and different objective criteria and signatures in the optimisation and (3) the use of auxiliary data sets apart from runoff. Nevertheless, in many applications manual calibration is still justifiable and frequently applied. The hydrologist performing the manual calibration, with their expert knowledge, is able to judge the hydrographs both in detail and in a holistic view. This integrated eye-ball verification procedure available to a human can be difficult to formulate in objective criteria, even when using a multi-criteria approach. Comparing the results of automatic and manual calibration is not straightforward. Automatic calibration often solely involves objective criteria such as the Nash-Sutcliffe efficiency or the Kling-Gupta efficiency as a benchmark during the calibration. Consequently, a comparison based on such measures is intrinsically biased towards automatic calibration. Additionally, objective criteria do not cover all aspects of a hydrograph, leaving questions concerning the quality of a simulation open. This contribution therefore seeks to examine the quality of manually and automatically calibrated hydrographs by interactively involving expert knowledge in the evaluation. Simulations have been performed for the Mur catchment in Austria with the rainfall-runoff model COSERO using two parameter sets evolved from a manual and an automatic calibration. A subset of the resulting hydrographs for observation and simulation, representing the typical flow conditions and events, will be evaluated in this study. 
In an interactive crowdsourcing approach experts attending the session can vote for their preferred simulated hydrograph without having information on the calibration method that
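
The two objective criteria named above can be computed directly from paired observed and simulated flows. A minimal sketch using the standard definitions of the Nash-Sutcliffe efficiency and the Kling-Gupta efficiency (the example series is hypothetical):

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations."""
    m = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    return 1.0 - sse / sum((o - m) ** 2 for o in obs)

def kge(obs, sim):
    """Kling-Gupta efficiency: 1 - sqrt((r-1)^2 + (a-1)^2 + (b-1)^2),
    with r = correlation, a = std ratio, b = mean ratio."""
    mo, ms = sum(obs) / len(obs), sum(sim) / len(sim)
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim))
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (so * ss)
    alpha, beta = ss / so, ms / mo
    return 1.0 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [3.0, 5.0, 9.0, 6.0, 4.0]   # hypothetical daily flows
sim = [2.9, 5.2, 8.5, 6.1, 4.3]
scores = (nse(obs, sim), kge(obs, sim))
```

A perfect simulation scores 1.0 on both criteria, which is one reason such measures alone cannot capture the qualitative hydrograph judgments the contribution crowdsources.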

  9. Performance Evaluation of Antlion Optimizer Based Regulator in Automatic Generation Control of Interconnected Power System

    Directory of Open Access Journals (Sweden)

    Esha Gupta

    2016-01-01

    Full Text Available This paper presents an application of the recently introduced Antlion Optimizer (ALO) to find the parameters of the primary governor loop of thermal generators for successful Automatic Generation Control (AGC) of a two-area interconnected power system. Two standard objective functions, Integral Square Error (ISE) and Integral Time Absolute Error (ITAE), have been employed to carry out this parameter estimation process. The problem is transformed into an optimization problem to obtain the integral gains, speed regulation, and frequency sensitivity coefficient for both areas. The regulator performance obtained with ALO is compared with that of Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Gravitational Search Algorithm (GSA) based regulators. Different types of perturbations and load changes are incorporated to establish the efficacy of the obtained design. It is observed that ALO outperforms all three optimization methods on this real problem. The optimization performance of ALO is compared with the other algorithms on the basis of the standard deviations in the values of the parameters and objective functions.
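
The two objective functions are integrals of the control error, approximated from sampled data. A minimal sketch of ISE and ITAE under a fixed sampling step (the error series below is hypothetical):

```python
def ise(errors, dt):
    """Integral of squared error, approximated by a Riemann sum."""
    return sum(e * e for e in errors) * dt

def itae(errors, dt):
    """Integral of time-weighted absolute error: sum of t * |e(t)| * dt."""
    return sum(k * dt * abs(e) for k, e in enumerate(errors)) * dt

# A frequency deviation decaying to zero over four 0.5 s samples.
err = [0.4, 0.2, 0.1, 0.0]
j_ise, j_itae = ise(err, 0.5), itae(err, 0.5)
```

An optimizer such as ALO would evaluate one of these costs for each candidate parameter set and minimize it.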

  10. Productive performance of Nile tilapia (Oreochromis niloticus) fed at different frequencies and periods with automatic dispenser

    Directory of Open Access Journals (Sweden)

    R.M.R. Sousa

    2012-02-01

    Full Text Available The performance of Nile tilapia (Oreochromis niloticus) raised in cages furnished with an automatic dispenser, supplied at different frequencies (once per hour and once every two hours) and periods (daytime, nighttime and both), was evaluated. Eighteen 1.0 m³ cages were placed into a 2000 m², two-meter-deep pond with a 5% water exchange. One hundred and seventy tilapia, with an initial weight of 16.0±4.9 g, were stocked into each 1 m³ cage, and the feed ration was adjusted every 21 days with biometry. Data were collected from March to July (autumn and winter). A significant difference in final weight (P<0.05) among treatments was observed. Increasing the feeding frequency improved the productive performance of Nile tilapia in cages and permitted better management of the feed. The better feed conversion rate at high feeding frequency (24 times day-1) can save up to 360 kg of feed for each ton of fish produced, increasing the economic sustainability of tilapia culture and suggesting less environmental pollution.

  11. Improving heat exchanger performance by using on-line automatic tube cleaning system

    Energy Technology Data Exchange (ETDEWEB)

    Someah, Kaveh; Beauchesne, Guy [OVIVO Water (United States)], email: kaveh.someah@ovivowater.com, email: guy.beauchesne@ovivowater.com

    2011-07-01

    In this presentation, OVIVO demonstrates an innovative solution for enhancing the performance and output of heat exchangers used in oil recovery plants. The solution proposed for reducing tube failure due to deposit buildup, corrosion, micro fouling, and scaling caused by the high mineral content of the water used, is to use an on-line automatic tube cleaning system (ATCS). The first ATCS type is the ball type, which injects rubber balls into the water entering the heat exchanger. The scrubbing and wiping action of the balls keeps the tubes clean, and the balls are subsequently collected by means of a strainer and can be re-circulated as needed. A second system aligns a brush and basket to each tube and, by periodically reversing water flow using a diverter valve, cleans each tube several times daily without process interruption or the need for unit shut down. The use of ATCS has been proven to improve plant performance and increase output while reducing operating and maintenance costs.

  12. Using automatic calibration method for optimizing the performance of Pedotransfer functions of saturated hydraulic conductivity

    Directory of Open Access Journals (Sweden)

    Ahmed M. Abdelbaki

    2016-06-01

    Full Text Available Pedotransfer functions (PTFs) are an easy way to predict saturated hydraulic conductivity (Ksat) without measurements. This study aims to auto-calibrate 22 PTFs. The PTFs were divided into three groups according to their input requirements, and the shuffled complex evolution algorithm was used in the calibration. The results showed great improvement in the performance of the functions compared to the originally published functions. For group 1 PTFs, the geometric mean error ratio (GMER) and the geometric standard deviation of the error ratio (GSDER) values were modified from the ranges (1.27–6.09) and (5.2–7.01) to (0.91–1.15) and (4.88–5.85), respectively. For group 2 PTFs, the GMER and GSDER values were modified from (0.3–1.55) and (5.9–12.38) to (1.00–1.03) and (5.5–5.9), respectively. For group 3 PTFs, the GMER and GSDER values were modified from (0.11–2.06) and (5.55–16.42) to (0.82–1.01) and (5.1–6.17), respectively. The results showed that automatic calibration is an efficient and accurate method to enhance the performance of PTFs.
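
GMER and GSDER are computed from the log-ratios of predicted to measured Ksat. A sketch using the common definitions (a GMER of 1 indicates no bias, and both statistics equal 1 for a perfect PTF); the sample values are hypothetical:

```python
import math
import statistics

def gmer(measured, predicted):
    """Geometric mean of the error ratios predicted/measured.
    1 = unbiased; > 1 = over-prediction on average."""
    logs = [math.log(p / m) for m, p in zip(measured, predicted)]
    return math.exp(statistics.mean(logs))

def gsder(measured, predicted):
    """Geometric standard deviation of the error ratios
    (spread around the geometric mean; 1 = perfect)."""
    logs = [math.log(p / m) for m, p in zip(measured, predicted)]
    return math.exp(statistics.stdev(logs))

ksat_meas = [1.2, 3.4, 0.8, 10.0]   # hypothetical measurements (cm/day)
ksat_pred = [1.0, 4.0, 1.0, 8.0]    # hypothetical PTF predictions
bias, spread = gmer(ksat_meas, ksat_pred), gsder(ksat_meas, ksat_pred)
```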

  14. Debugging of Class-D Audio Power Amplifiers

    DEFF Research Database (Denmark)

    Crone, Lasse; Pedersen, Jeppe Arnsdorf; Mønster, Jakob Døllner

    2012-01-01

    Determining and optimizing the performance of a Class-D audio power amplifier can be very difficult without knowledge of audio performance measuring equipment and of how the various noise and distortion sources influence the audio performance. This paper gives an introduction on how to measure the performance of the amplifier and how to find the noise and distortion sources, and suggests ways to remove them. Throughout the paper, measurements of a test amplifier are presented along with the relevant theory.

  15. Automatic PID Control Loops Design for Performance Improvement of Cryogenic Turboexpander

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.; Shah, D.K.

    2015-01-01

    The field of cryogenics involves temperatures below 123 K, far below ambient temperature. In addition, many industrially important physical processes, from fulfilling the needs of national thermonuclear fusion programs and superconducting magnets to the treatment of cutting tools and the preservation of blood cells, require extremely low temperatures. The low temperatures required for the liquefaction of common gases can be obtained by several processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. A helium liquefier is used for the liquefaction of helium gas. In general, a Helium Refrigerator/Liquefier (HRL) needs a turboexpander as the expansion machine to produce the cooling effect used for the production of liquid helium. Turboexpanders, high-speed devices supported on gas bearings, are the most critical components in many helium refrigeration systems. A very minor fault in operation or manufacturing, or impurities in the helium gas, can destroy the turboexpander. However, since the performance of expanders depends on a number of operating parameters, and the relations between them are quite complex, the instrumentation and control system design for the turboexpander needs special attention. The inefficiency of manual control leads to the need to design automatic control loops for the turboexpander. Proper design and implementation of the control loops play an important role in the successful operation of the cryogenic turboexpander. The PID control loops have to be implemented with accurate interlocks and logic to enhance the performance of the cryogenic turboexpander. For different normal and off-normal operations, speeds will differ, and hence a proper control method for avoiding critical rotational speeds is a must. This paper presents the design of the PID control loops needed for the
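
A minimal discrete PID loop of the kind described can be sketched as follows. The plant model, gains and setpoint are illustrative only; the actual turboexpander loops, interlocks and critical-speed-avoidance logic are not specified in the abstract.

```python
class PID:
    """Minimal discrete PID controller (illustrative sketch)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a hypothetical first-order plant (time constant 1 s) to a
# normalized speed setpoint of 1.0.
dt = 0.01
pid = PID(kp=2.0, ki=1.0, kd=0.0, dt=dt)
x = 0.0
for _ in range(3000):          # 30 s of simulated time
    u = pid.update(1.0, x)
    x += dt * (u - x)          # plant: dx/dt = u - x
```

In a real turboexpander controller this loop would run alongside interlocks that, for example, force the speed setpoint away from critical rotational speeds.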

  16. Distribution transformer with automatic voltage adjustment - performance; Transformador de distribucion con ajuste automatico de tension - desempeno

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Ruiz, Gustavo A.; Delgadillo Bocanegra, Alfonso; Betancourt Ramirez, Enrique [PROLEC-GE, Apodaca, Nuevo Leon (Mexico)]. E-mail: gustavo1.hernandez@ge.com; alfonso.delgadillobocanegra@ge.com; enrique.betancourt@ge.com; Ramirez Arredondo, Juan M. [CINVESTAV-Guadalajara, Zapopan, Jalisco (Mexico)]. E-mail: jramirez@gdl.cinvestav.mx

    2010-11-15

    In electric power distribution systems, power quality is strongly linked to the stability of the service voltage. In radial systems, it is virtually impossible to achieve a flat voltage profile along the lines, so it is desirable to have transformers that can automatically adjust their turns ratio. This work describes the development and performance of a transformer with an integrated electronic tap changer, which allows the turns ratio to be changed within the standard range of +/-5%, and identifies the application limits of the technology.
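
The voltage adjustment reduces to choosing, from the standardized tap positions, the turns-ratio correction that brings the service voltage closest to nominal. A sketch with illustrative values; the nominal voltage, tap granularity and sign convention are assumptions, only the +/-5% range comes from the abstract.

```python
NOMINAL = 220.0                              # nominal service voltage (assumed)
TAPS = [-0.05, -0.025, 0.0, 0.025, 0.05]     # +/-5 % standard range

def best_tap(measured_v):
    """Pick the tap whose turns-ratio correction brings the output
    voltage closest to nominal: V_out = V_in / (1 + tap)."""
    return min(TAPS, key=lambda t: abs(measured_v / (1 + t) - NOMINAL))

# A feeder sagging to 210 V is corrected by boosting (negative tap here).
tap = best_tap(210.0)
```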

  17. Performance of an automatic dose control system for CT: anthropomorphic phantom studies.

    Science.gov (United States)

    Gosch, D; Stumpp, P; Kahn, T; Nagel, H D

    2011-02-01

    To assess the performance of and provide more detailed insight into the characteristics and limitations of devices for automatic dose control (ADC) in CT. A comprehensive study of DoseRight 2.0, the ADC system provided by Philips for its Brilliance CT scanners, was conducted with assorted tests using an anthropomorphic phantom that allowed simulation of the operation of the system under almost realistic conditions. The scan protocol settings for the neck, chest and abdomen with pelvis were identical to those applied in clinical routine. Using the appropriate ADC functionalities, dose reductions of 40% for the neck, 20% for the chest and 10% for the abdomen with pelvis were achieved. Larger dose reductions can be expected for average patients, since their attenuating properties differ significantly from those of the anthropomorphic phantom. Adverse effects due to increased image noise were only moderate, as a consequence of the "adequate noise system" design and the complementary use of adaptive filtration. The results of specific tests also provided deeper insight into the operation of the ADC system, which helps to identify the causes of suspected malfunctions and to prevent potential pitfalls. Tests with anthropomorphic phantoms allow verification of the characteristics of devices for ADC in CT under almost realistic conditions. However, differences in phantom shape and material composition require supplementary studies on representative patient groups. © Georg Thieme Verlag KG Stuttgart · New York.

  18. TOPIC: a debugging code for torus geometry input data of Monte Carlo transport code

    International Nuclear Information System (INIS)

    Iida, Hiromasa; Kawasaki, Hiromitsu.

    1979-06-01

    TOPIC has been developed for debugging the geometry input data of the Monte Carlo transport code. The code has the following features: (1) It debugs the geometry input data of not only MORSE-GG but also MORSE-I, which is capable of treating torus geometry. (2) Its calculation results are shown in figures drawn by plotter or COM, and regions that are not defined or are doubly defined are easily detected. (3) It finds a multitude of input data errors in a single run. (4) The input data required by this code are few, so it is readily usable in a time-sharing system on a FACOM 230-60/75 computer. Example TOPIC calculations from design studies of tokamak fusion reactors (JXFR, INTOR-J) are presented. (author)

  19. Fundamentals of IP and SoC security design, verification, and debug

    CERN Document Server

    Ray, Sandip; Sur-Kolay, Susmita

    2017-01-01

    This book is about security in embedded systems and it provides an authoritative reference to all aspects of security in system-on-chip (SoC) designs. The authors discuss issues ranging from security requirements in SoC designs, definition of architectures and design choices to enforce and validate security policies, and trade-offs and conflicts involving security, functionality, and debug requirements. Coverage also includes case studies from the “trenches” of current industrial practice in design, implementation, and validation of security-critical embedded systems. Provides an authoritative reference and summary of the current state-of-the-art in security for embedded systems, hardware IPs and SoC designs; Takes a "cross-cutting" view of security that interacts with different design and validation components such as architecture, implementation, verification, and debug, each enforcing unique trade-offs; Includes high-level overview, detailed analysis on implementation, and relevant case studies on desi...

  20. General collaboration offer of Johnson Controls regarding the performance of air conditioning automatic control systems and other buildings' automatic control systems

    Energy Technology Data Exchange (ETDEWEB)

    Gniazdowski, J.

    1995-12-31

    JOHNSON CONTROLS manufactures measuring and control equipment (800 types) and is also a "turn-key" supplier of complete automatic control systems for the heating, air conditioning, ventilation and refrigeration engineering branches. The Company also supplies computer-based supervision and monitoring systems for buildings that may be applied in both small and large structures. Since 1990 the company has been performing full-range trade and contracting activities on the Polish market. We have our own well-trained technical staff and we collaborate with a number of design and contracting enterprises that enable us to have our projects carried out all over Poland. The prices of our supplies and services correspond to the level of the Polish market.

  1. Design and Performance of a Gas Chromatograph for Automatic Monitoring of Pollutants in Ambient Air

    Science.gov (United States)

    Villalobos, R.; Stevens, D.; LeBlanc, R.; Braun, L.

    1971-01-01

    In recent years, interest in air pollution constituents has focused on carbon monoxide and hydrocarbons as prime components of polluted air. Instrumental methods have been developed, and commercial instruments for continuous monitoring of these components have been available for a number of years. For the measurement of carbon monoxide, non-dispersive infrared spectroscopy has been the accepted tool, in spite of its marginal sensitivity at low parts-per-million levels. For continuously monitoring total hydrocarbons, the hydrogen flame ionization analyzer has been widely accepted as the preferred method. The inadequacy of this latter method became evident when it was concluded that methane is non-reactive and cannot be considered a contaminant even though present at over 1 ppm in the earth's atmosphere. Hence, the need for measuring methane separately became apparent as a means of measuring the reactive and potentially harmful non-methane hydrocarbons fraction. A gas chromatographic method for the measurement of methane and total hydrocarbons which met these requirements has been developed. In this technique, methane was separated on conventional gas chromatographic columns and detected by a hydrogen flame ionization detector (FID) while the total hydrocarbons were obtained by introducing a second sample directly into the FID without separating the various components. The reactive, or non-methane hydrocarbons, were determined by difference. Carbon monoxide was also measured after converting to methane over a heated catalyst to render it detectable by the FID. The development of this method made it possible to perform these measurements with a sensitivity of as much as 1 ppm full scale and a minimum detectability of 20 ppb. Incorporating this technique, criteria were developed by APCO for a second generation continuous automatic instrument for atmospheric monitoring stations.
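
The by-difference measurement described above is plain arithmetic; a sketch with hypothetical readings (the abstract only states that ambient methane exceeds 1 ppm):

```python
def reactive_hydrocarbons(total_hc_ppm, methane_ppm):
    """Non-methane (reactive) hydrocarbons obtained by difference,
    as in the dual-sample GC scheme: one sample gives total HC via
    the FID, the other gives methane after column separation."""
    return max(total_hc_ppm - methane_ppm, 0.0)

# Hypothetical readings: 2.8 ppm total HC minus a 1.5 ppm methane
# background leaves 1.3 ppm of reactive hydrocarbons.
rhc = reactive_hydrocarbons(2.8, 1.5)
```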

  2. The performance of an automatic acoustic-based program classifier compared to hearing aid users' manual selection of listening programs.

    Science.gov (United States)

    Searchfield, Grant D; Linford, Tania; Kobayashi, Kei; Crowhen, David; Latzel, Matthias

    2018-03-01

    To compare preference for and performance of manually selected programmes with an automatic sound classifier, the Phonak AutoSense OS. A single-blind repeated-measures study. Participants were fitted with Phonak Virto V90 ITE aids; preferences for different listening programmes were compared across four sound scenarios (speech in: quiet, noise, loud noise and a car). Following a 4-week trial, preferences were reassessed and each user's preferred programme was compared with the automatic classifier for sound quality and hearing in noise (HINT test) using a 12-loudspeaker array. Twenty-five participants with symmetrical moderate-severe sensorineural hearing loss. Participants' preferences for manual programmes varied considerably between and within sessions. A HINT Speech Reception Threshold (SRT) advantage was observed for the automatic classifier over participants' manual selections for speech in quiet, loud noise and car noise. Sound quality ratings were similar for both manual and automatic selections. The use of a sound classifier is a viable alternative to manual programme selection.

  3. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretical-experimental study of the effects of an automatic shading device on the energy performance of a dimmable lighting system and cooling equipment. Some equations related to the optical and thermal properties of fenestration are rebuilt, while others are newly derived, under a theoretical approach. To collect field data, the energy demand, among other variables, was measured in two distinct stories of the Test Tower with the same fenestration features. New data were gathered after adding an automatic shading device to the window of one story. Comparison of the collected data allows evaluation of the energy performance of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  4. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  5. Space-Based FPGA Radio Receiver Design, Debug, and Development of a Radiation-Tolerant Computing System

    Directory of Open Access Journals (Sweden)

    Zachary K. Baker

    2010-01-01

    Full Text Available Los Alamos has recently completed the latest in a series of Reconfigurable Software Radios, which incorporates several key innovations in both hardware design and algorithms. Due to our focus on satellite applications, each design must extract the best size, weight, and power performance possible from the ensemble of Commodity Off-the-Shelf (COTS) parts available at the time of design. A large component of our work lies in determining whether a given part will survive in space and how it will fail under various space radiation conditions. Using two Xilinx Virtex 4 FPGAs, we have achieved 1 TeraOps/second signal processing on a 1920 Megabit/second datastream. This processing capability enables very advanced algorithms such as our wideband RF compression scheme to operate at the source, allowing bandwidth-constrained applications to deliver previously unattainable performance. This paper discusses the design of the payload, making electronics survivable in the radiation of space, and techniques for debugging.

  6. Improving the working performance of automatic ball balancer by modifying its mechanism

    Science.gov (United States)

    Rezaee, Mousa; Fathi, Reza

    2015-12-01

    An automatic ball balancer consists of several balls that are free to move in a circular race containing a damping fluid. Although a traditional ABB can improve the vibration behavior of an unbalanced rotor under proper working conditions, at speeds below the first critical speed it makes the vibration amplitude of the rotor larger than that of a rotor without an automatic ball balancer. Moreover, it has a limited stable region of the perfect balancing configuration. Considering these deficiencies, in this study a new design for an automatic ball balancer is proposed. To analyze the problem, the nonlinear governing equations of a rotor equipped with the new ABB are derived using Lagrange's equations. Then, a stability analysis is carried out on the basis of the linearized equations of motion around the equilibrium positions to obtain the stable region of the system. It is shown that the new ABB can prevent the growth of rotor vibrations in the transient state. It also enlarges the stable region of the perfect balancing configuration. Comparing the results with those for the traditional ball balancer shows that the new ABB can reduce the vibration amplitude at speeds below the first critical speed and enlarges the stable region of the perfect balancing configuration.

  7. Validation of a novel automatic sleep spindle detector with high performance during sleep in middle aged subjects

    DEFF Research Database (Denmark)

    Wendt, Sabrina Lyngbye; Christensen, Julie A. E.; Kempfner, Jacob

    2012-01-01

    Many of the automatic sleep spindle detectors currently used to analyze sleep EEG are either validated on young subjects or not validated thoroughly. The purpose of this study is to develop and validate a fast and reliable sleep spindle detector with high performance in middle aged subjects. An automatic sleep spindle detector using a bandpass filtering approach and a time-varying threshold was developed. The validation was done on sleep epochs from EEG recordings with manually scored sleep spindles from 13 healthy subjects with a mean age of 57.9 ± 9.7 years. The sleep spindle detector reached a mean sensitivity of 84.6% and a mean specificity of 95.3%. The sleep spindle detector can be used to obtain measures of spindle count and density together with quantitative measures such as the mean spindle frequency, mean spindle amplitude, and mean spindle duration.
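The pipeline named in the abstract (bandpass filtering plus a time-varying threshold and duration gating) can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' validated detector: the sigma band (11-16 Hz), the mean-plus-k-standard-deviations threshold, and the 0.5-2 s duration gate are common textbook choices assumed here.

```python
import numpy as np

def sigma_envelope(x, fs, lo=11.0, hi=16.0):
    # Band-limited analytic signal: keep only positive-frequency FFT bins
    # inside [lo, hi], doubled; |ifft| is then the sigma-band envelope.
    f = np.fft.fftfreq(x.size, 1 / fs)
    H = np.where((f >= lo) & (f <= hi), 2.0, 0.0)
    return np.abs(np.fft.ifft(np.fft.fft(x) * H))

def detect_spindles(x, fs, k=3.0, min_dur=0.5, max_dur=2.0):
    # Flag runs where the envelope exceeds a data-driven threshold for a
    # spindle-plausible duration; returns (start, end) sample pairs.
    env = sigma_envelope(x, fs)
    thr = env.mean() + k * env.std()
    mask = np.r_[0, (env > thr).astype(int), 0]
    edges = np.flatnonzero(np.diff(mask))
    return [(s, e) for s, e in zip(edges[::2], edges[1::2])
            if min_dur * fs <= e - s <= max_dur * fs]

# demo: 30 s of background activity with one 1 s, 13 Hz "spindle" at t = 10 s
fs = 100
rng = np.random.default_rng(0)
t = np.arange(30 * fs) / fs
x = rng.normal(0.0, 1.0, t.size)
x[10 * fs:11 * fs] += 5.0 * np.sin(2 * np.pi * 13.0 * t[10 * fs:11 * fs])
events = detect_spindles(x, fs)
```

A validated detector would add artifact rejection and per-epoch thresholds; the sketch only shows the shape of the computation.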

  8. SimpleGeO - new developments in the interactive creation and debugging of geometries for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Theis, Christian; Feldbaumer, Eduard; Forkel-Wirth, Doris; Jaegerhofer, Lukas; Roesler, Stefan; Vincke, Helmut; Buchegger, Karl Heinz

    2010-01-01

    Nowadays radiation transport Monte Carlo simulations have become an indispensable tool in various fields of physics. The applications are diversified and range from physics simulations, like detector studies or shielding design, to medical applications. Usually a significant amount of time is spent on the cumbersome and often error-prone task of implementing geometries before the actual physics studies can be performed. SimpleGeo is an interactive solid modeler which allows for the interactive creation and visualization of geometries for various Monte Carlo particle transport codes in 3D. Even though visual validation of the geometry is important, it might not reveal subtle errors like overlapping or undefined regions. These might eventually corrupt the execution of the simulation or even lead to incorrect results, the latter being sometimes hard to identify. In many cases a debugger is provided by the Monte Carlo package, but such debuggers most often lack interactive visual feedback, making it hard for the user to localize and correct the error. In this paper we describe the latest developments in SimpleGeo, which include debugging facilities that give immediate visual feedback and apply various algorithms based on deterministic, Monte Carlo or Quasi-Monte Carlo methods. These approaches allow for a fast and robust identification of subtle geometry errors that are also marked visually. (author)

  9. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising because executing reverse code can restore the previous states of a program without state saving. Two methods that generate reverse code can be found in the literature: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
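The contrast between state saving and reverse code can be illustrated with a toy interpreter (entirely hypothetical, not the paper's system): a constructive update such as `x += a` is inverted algebraically with no saved state, while a destructive assignment forces the overwritten value to be captured.

```python
class ReversibleVM:
    """Toy model of dynamic reverse-code generation: every forward step
    appends its inverse to an undo list, which is executed most-recent-first
    to backtrack. Names and API are illustrative only."""

    def __init__(self):
        self.vars = {}
        self.undo = []  # dynamically generated "reverse code"

    def add(self, name, amount):
        # constructive update: the inverse (subtract) needs no saved state
        self.vars[name] += amount
        self.undo.append(lambda: self.vars.__setitem__(name, self.vars[name] - amount))

    def assign(self, name, value):
        # destructive update: the overwritten value must be saved
        old = self.vars.get(name)
        self.vars[name] = value
        if old is None:
            self.undo.append(lambda: self.vars.pop(name))
        else:
            self.undo.append(lambda: self.vars.__setitem__(name, old))

    def backtrack(self, steps=1):
        # run the reverse code backwards along the execution path
        for _ in range(steps):
            self.undo.pop()()

vm = ReversibleVM()
vm.assign("x", 10)
vm.add("x", 5)       # x == 15
vm.backtrack()       # inverse of "x += 5" recomputes x == 10
vm.backtrack()       # inverse of the assignment removes x entirely
```

The memory argument of the paper is visible even here: the `add` entry stores no old value at all, whereas a checkpointing scheme would copy state on every step.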

  10. Power amplifier automatic test system based on LXI bus technology

    Science.gov (United States)

    Li, Yushuang; Chen, Libing; Men, Tao; Yang, Qingfeng; Li, Ning; Nie, Tao

    2017-10-01

    The power amplifier is an important part of the high-power digital transceiver module. Because of the large number of units required and the diverse measurement indicators involved, manual testing cannot deliver consistent results, so an automatic test system was designed to meet the production requirements of the power amplifiers. This paper puts forward the plan of an automatic test system based on LXI bus technology and introduces the hardware and software architecture of the system. The test system has been used for debugging and testing the power amplifiers stably and efficiently, which greatly saves labor and effectively improves productivity.

  11. Parallelization of the AliRoot event reconstruction by performing a semi- automatic source-code transformation

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    side bus or processor interconnections. Parallelism can only result in performance gain if the memory usage is optimized, memory locality improved and the communication between threads minimized. But the domain of concurrent programming has become a field for highly skilled experts, as the implementation of multithreading is difficult, error-prone and labor intensive. A full re-implementation for parallel execution of existing offline frameworks, like AliRoot in ALICE, is thus unaffordable. An alternative method is to use a semi-automatic source-to-source transformation for getting a simple parallel design, with almost no interference between threads. This reduces the need to rewrite the develop...

  12. Automatic sprinkler system performance and reliability in United States Department of Energy Facilities, 1952 to 1980

    International Nuclear Information System (INIS)

    1982-06-01

    The automatic sprinkler system experiences of the United States Department of Energy and its predecessor agencies are analyzed. Based on accident and incident files in the Office of Operational Safety and on supplementary responses, 587 incidents, including over 100 fires, are analyzed. Tables and figures, with supplementary narratives, discuss fire experience by various categories such as number of heads operating, type of system, dollar losses, failures, extinguished vs. controlled, and types of sprinkler heads. Use is made of extreme value projections and frequency-severity plots to compare past experience and predict future experience. Non-fire incidents are analyzed in a similar manner by cause, system types and failure types. Discussion of no-loss incidents and non-fire protection water systems is included. The author's conclusions and recommendations, and appendices listing survey methodology, major incidents, and a bibliography, are included.

  13. Energy Bucket: A Tool for Power Profiling and Debugging of Sensor Nodes

    DEFF Research Database (Denmark)

    Andersen, Jacob; Hansen, Morten Tranberg

    2009-01-01

    The ability to precisely measure and compare energy consumption and relate this to particular parts of programs is a recurring theme in sensor network research. This paper presents the Energy Bucket, a low-cost tool designed for quick empirical measurements of energy consumptions across 5 decades of current draw. The Energy Bucket provides a light-weight state API for the target system, which facilitates easy scorekeeping of energy consumption between different parts of a target program. We demonstrate how this tool can be used to discover programming errors and debug sensor network applications. Furthermore, we show how this tool, together with the target system API, offers a very detailed analysis of where energy is spent in an application, which proves to be very useful when comparing alternative implementations or validating theoretical energy consumption models.

  14. Variable Correlation Digital Noise Source on FPGA — A Versatile Tool for Debugging Radio Telescope Backends

    Science.gov (United States)

    Buch, Kaushal D.; Gupta, Yashwant; Ajith Kumar, B.

    Contemporary wideband radio telescope backends are generally developed on Field Programmable Gate Arrays (FPGA) or hybrid (FPGA+GPU) platforms. One of the challenges faced while developing such instruments is the functional verification of the signal processing backend at various stages of development. In the case of an interferometer or pulsar backend, the typical requirement is for one independent noise source per input, with provision for a common, correlated signal component across all the inputs, with a controllable level of correlation. This paper describes the design of an FPGA-based variable correlation Digital Noise Source (DNS), and its applications to built-in testing and debugging of correlators and beamformers. This DNS uses the Central Limit Theorem-based approach for generation of Gaussian noise, and the architecture is optimized for resource requirements and ease of integration with existing signal processing blocks on FPGA.
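The Central Limit Theorem approach mentioned above can be sketched in NumPy: summing n uniform variables and normalizing (the Irwin-Hall construction) approximates a Gaussian sample, and mixing a shared component into every channel gives a tunable inter-channel correlation. The mixing rule below is a standard identity for correlated Gaussians, not the specific fixed-point architecture of this DNS.

```python
import numpy as np

def clt_gaussian(shape, n_sum=12, rng=None):
    # Sum of n_sum uniforms, shifted and scaled to zero mean, unit variance;
    # with n_sum = 12 the scale factor sqrt(n_sum / 12) is exactly 1.
    rng = rng or np.random.default_rng()
    u = rng.random((n_sum,) + tuple(shape))
    return (u.sum(axis=0) - n_sum / 2.0) / np.sqrt(n_sum / 12.0)

def correlated_channels(n_ch, n_samp, rho, rng=None):
    # Mix a common component into each channel: any two channels then have
    # correlation coefficient rho (0 = independent, 1 = identical).
    rng = rng or np.random.default_rng()
    common = clt_gaussian((n_samp,), rng=rng)
    private = clt_gaussian((n_ch, n_samp), rng=rng)
    return np.sqrt(rho) * common + np.sqrt(1.0 - rho) * private

# four noise channels with pairwise correlation 0.25
chans = correlated_channels(4, 100_000, rho=0.25, rng=np.random.default_rng(7))
```

On an FPGA the uniforms would typically come from LFSRs and the arithmetic would be fixed-point, but the statistical structure is the same.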

  15. Application of remote debugging techniques in user-centric job monitoring

    International Nuclear Information System (INIS)

    Dos Santos, T; Mättig, P; Harenberg, T; Volkmer, F; Beermann, T; Kalinin, S; Ahrens, R; Wulff, N

    2012-01-01

    With the Job Execution Monitor, a user-centric job monitoring software developed at the University of Wuppertal and integrated into the job brokerage systems of the WLCG, job progress and grid worker node health can be supervised in real time. Imminent error conditions can thus be detected early by the submitter and countermeasures can be taken. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job misbehaviour. To remove the last 'blind spot' from this monitoring, a remote debugging technique based on the GNU C compiler suite was developed and integrated into the software; its design concept and architecture are described in this paper and its application is discussed.

  16. Debugging Nano-Bio Interfaces: Systematic Strategies to Accelerate Clinical Translation of Nanotechnologies.

    Science.gov (United States)

    Mahmoudi, Morteza

    2018-03-17

    Despite considerable efforts in the field of nanomedicine that have been made by researchers, funding agencies, entrepreneurs, and the media, fewer nanoparticle (NP) technologies than expected have made it to clinical trials. The wide gap between the efforts and effective clinical translation is, at least in part, due to multiple overlooked factors in both in vitro and in vivo environments, a poor understanding of the nano-bio interface, and misinterpretation of the data collected in vitro, all of which reduce the accuracy of predictions regarding the NPs' fate and safety in humans. To minimize this bench-to-clinic gap, which may accelerate successful clinical translation of NPs, this opinion paper aims to introduce strategies for systematic debugging of nano-bio interfaces in the current literature. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Technology-Enhanced Assessment of Math Fact Automaticity: Patterns of Performance for Low- and Typically Achieving Students

    Science.gov (United States)

    Stickney, Eric M.; Sharp, Lindsay B.; Kenyon, Amanda S.

    2012-01-01

    Because math fact automaticity has been identified as a key barrier for students struggling with mathematics, we examined how initial math achievement levels influenced the path to automaticity (e.g., variation in number of attempts, speed of retrieval, and skill maintenance over time) and the relation between attainment of automaticity and gains…

  18. An integrated development environment for PMESII model authoring, integration, validation, and debugging

    Science.gov (United States)

    Pioch, Nicholas J.; Lofdahl, Corey; Sao Pedro, Michael; Krikeles, Basil; Morley, Liam

    2007-04-01

    To foster shared battlespace awareness in Air Operations Centers supporting the Joint Forces Commander and Joint Force Air Component Commander, BAE Systems is developing a Commander's Model Integration and Simulation Toolkit (CMIST), an Integrated Development Environment (IDE) for model authoring, integration, validation, and debugging. CMIST is built on the versatile Eclipse framework, a widely used open development platform comprised of extensible frameworks that enable development of tools for building, deploying, and managing software. CMIST provides two distinct layers: 1) a Commander's IDE for supporting staff to author models spanning the Political, Military, Economic, Social, Infrastructure, Information (PMESII) taxonomy; integrate multiple native (third-party) models; validate model interfaces and outputs; and debug the integrated models via intuitive controls and time series visualization, and 2) a PMESII IDE for modeling and simulation developers to rapidly incorporate new native simulation tools and models to make them available for use in the Commander's IDE. The PMESII IDE provides shared ontologies and repositories for world state, modeling concepts, and native tool characterization. CMIST includes extensible libraries for 1) reusable data transforms for semantic alignment of native data with the shared ontology, and 2) interaction patterns to synchronize multiple native simulations with disparate modeling paradigms, such as continuous-time system dynamics, agent-based discrete event simulation, and aggregate solution methods such as Monte Carlo sampling over dynamic Bayesian networks. This paper describes the CMIST system architecture, our technical approach to addressing these semantic alignment and synchronization problems, and initial results from integrating Political-Military-Economic models of post-war Iraq spanning multiple modeling paradigms.

  19. Design and performance of an automatic regenerating adsorption aerosol dryer for continuous operation at monitoring sites

    Science.gov (United States)

    Tuch, T. M.; Haudek, A.; Müller, T.; Nowak, A.; Wex, H.; Wiedensohler, A.

    2009-04-01

    Sizes of aerosol particles depend on the relative humidity of their carrier gas. Most monitoring networks therefore require that the aerosol is dried to a relative humidity below 50% RH to ensure comparability of measurements at different sites. Commercially available aerosol dryers are often not suitable for this purpose at remote monitoring sites: adsorption dryers need to be regenerated frequently, and maintenance-free single-column Nafion dryers are not designed for high aerosol flow rates. We therefore developed an automatically regenerating adsorption aerosol dryer with a design flow rate of 1 m3/h. The particle transmission efficiency of this dryer was determined during a 3-week experiment. The lower 50% cut-off was found to be below 3 nm at the design flow rate of the instrument. Measured transmission efficiencies are in good agreement with theoretical calculations. One dryer has been successfully deployed in the Amazonas river basin. From this monitoring site, we present data from the first 6 months of measurements (February 2008-August 2008). Apart from one unscheduled service, this dryer did not require any maintenance during this time period. The average relative humidity of the dried aerosol was 27.1±7.5% RH, compared to an average ambient relative humidity of nearly 80% and temperatures around 30°C. This initial deployment demonstrated that these dryers are well suited for continuous operation at remote monitoring sites under adverse ambient conditions.

  20. Performance of an automatic dose control system for CT. Patient studies

    Energy Technology Data Exchange (ETDEWEB)

    Stumpp, P.; Gosch, D.; Kuehn, A.; Sorge, I.; Kahn, T. [Universitaetsklinikum Leipzig (Germany). Klinik und Poliklinik fuer Diagnostische und Interventionelle Radiologie; Weber, D. [St. Elisabeth-Krankenhaus Leipzig (Germany). Roentgendiagnostik; Lehmkuhl, L. [Leipzig Univ. - Herzzentrum (Germany). Diagnostische und Interventionelle Radiologie; Nagel, H.D. [Dr. HD Nagel, Wissenschaft und Technik fuer die Radiologie, Buchholz (Germany)

    2013-02-15

    Purpose: To study the effect of an automatic dose control (ADC) system with adequate noise characteristic on the individual perception of image noise and diagnostic acceptance compared to objectively measured image noise and the dose reductions achieved in a representative group of patients. Materials and Methods: In a retrospective study two matched cohorts of 20 patients each were identified: a manual cohort with exposure settings according to body size (small - regular - large) and an ADC cohort with exposure settings calculated by the ADC system (DoseRight 2.0 trademark, Philips Healthcare). For each patient, 12 images from 6 defined anatomic levels from contrast-enhanced scans of chest and abdomen/pelvis were analyzed by 4 independent readers concerning image noise and diagnostic acceptance on a five-point Likert scale and evaluated for objectively measured image noise. Radiation exposure was calculated from recorded exposure data. Results: Use of the ADC system reduced the average effective dose for patients by 36 % in chest scans (3.2 vs. 4.9 mSv) and by 17 % in abdomen/pelvis scans (7.6 vs. 8.3 mSv). Average objective noise was slightly lower in the manual cohort (11.1 vs. 12.8 HU), correlating with a slightly better rating in subjective noise score (4.4 vs. 4.2). However, diagnostic acceptance was rated almost equal in both cohorts with excellent image quality (4.6 vs. 4.5). Conclusion: Use of an ADC system with adequate noise characteristic leads to significant reductions in radiation exposure for patients while maintaining excellent image quality. (orig.)

  1. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, Tamara Lynn [Univ. of California, Davis, CA (United States)

    2008-01-01

    Performance-driven interface contract enforcement research aims to improve the quality of programs built from plug-and-play scientific components. Interface contracts make the obligations on the caller and all implementations of the specified methods explicit. Runtime contract enforcement is a well-known technique for enhancing testing and debugging. However, checking all of the associated constraints during deployment is generally considered too costly from a performance standpoint. Previous solutions enforced subsets of constraints without explicit consideration of their performance implications. Hence, this research measures the impacts of different interface contract sampling strategies and compares results with new techniques driven by execution time estimates. Results from three studies indicate that automatically adjusting the level of checking based on performance constraints improves the likelihood of detecting contract violations under certain circumstances. Specifically, performance-driven enforcement is better suited to programs exercising constraints whose costs are at most moderately expensive relative to normal program execution.
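The idea of driving enforcement decisions by execution-time estimates can be sketched as follows. The budget policy, class name and API here are hypothetical illustrations of the general technique, not Dahlgren's actual algorithm.

```python
class ContractEnforcer:
    # Sketch: run a contract check only while cumulative checking time stays
    # within a configurable fraction of observed program execution time.
    def __init__(self, overhead_limit=0.05):
        self.limit = overhead_limit     # max fraction of runtime spent checking
        self.exec_time = 0.0            # accumulated "useful" execution time
        self.check_time = 0.0           # accumulated contract-checking time

    def record_call(self, duration):
        # called by instrumentation after each component method returns
        self.exec_time += duration

    def should_check(self, est_check_cost):
        budget = self.limit * self.exec_time
        return self.check_time + est_check_cost <= budget

    def enforce(self, predicate, est_check_cost):
        # Sample the contract only when the performance budget allows it.
        if self.should_check(est_check_cost):
            self.check_time += est_check_cost
            assert predicate(), "interface contract violation"
```

This captures the paper's observation in miniature: cheap contracts are checked almost always, while a contract whose estimated cost exceeds the remaining budget is skipped until more execution time has accrued.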

  2. Standard guide for in-plant performance evaluation of automatic pedestrian SNM monitors

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This guide is affiliated with Guide C1112 on special nuclear material (SNM) monitors, Guide C1169 on laboratory performance evaluation, and Guide C1189 on calibrating pedestrian SNM monitors. This guide to in-plant performance evaluation is a comparatively rapid way to verify whether a pedestrian SNM monitor performs as expected for detecting SNM or SNM-like test sources. 1.1.1 In-plant performance evaluation should not be confused with the simple daily functional test recommended in Guide C1112. In-plant performance evaluation takes place less often than daily tests, usually at intervals ranging from weekly to once every three months. In-plant evaluations are also more extensive than daily tests and may examine both a monitor's nuisance alarm record and its detection sensitivity for a particular SNM or alternative test source. 1.1.2 In-plant performance evaluation also should not be confused with laboratory performance evaluation. In-plant evaluation is comparatively rapid, takes place in the monitor...

  3. JACoW Automatic PID performance monitoring applied to LHC cryogenics

    CERN Document Server

    Bradu, Benjamin; Marti, Ruben; Tilaro, Filippo

    2018-01-01

    At CERN, the LHC (Large Hadron Collider) cryogenic system employs about 5000 PID (Proportional Integral Derivative) regulation loops distributed over the 27 km of the accelerator. Tuning all these regulation loops is a complex task and the systematic monitoring of them should be done in an automated way to be sure that the overall plant performance is improved by identifying the poorest performing PID controllers. It is nearly impossible to check the performance of a regulation loop with a classical threshold technique as the controlled variables could evolve in large operation ranges and the amount of data cannot be manually checked daily. This paper presents the adaptation and the application of an existing regulation indicator performance algorithm on the LHC cryogenic system and the different results obtained in the past year of operation. This technique is generic for any PID feedback control loop, it does not use any process model and needs only a few tuning parameters. The publication also describes th...
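One way to see why a fixed threshold fails across loops with very different operating ranges, and what a scale-free indicator can look like, is to normalize the control error by the loop's local range. The sketch below is a generic, model-free indicator in that spirit; it is not the algorithm actually deployed on the LHC cryogenic system, and all names are illustrative.

```python
import numpy as np

def loop_performance_index(sp, pv, window=500):
    # Score each window by the RMS control error normalized by the local
    # operating range, so loops with very different ranges become comparable
    # and one alert threshold can be applied to thousands of loops.
    sp, pv = np.asarray(sp, float), np.asarray(pv, float)
    scores = []
    for i in range(0, sp.size - window + 1, window):
        e = pv[i:i + window] - sp[i:i + window]
        span = np.ptp(sp[i:i + window]) + np.ptp(pv[i:i + window]) + 1e-12
        scores.append(np.sqrt(np.mean(e ** 2)) / span)
    return np.array(scores)  # higher = poorer regulation in that window
```

A monitoring service could compute this daily per loop and rank loops by their worst windows, surfacing the poorest performing PID controllers without any manual inspection.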

  4. PPP effectiveness study. [automatic procedures recording and crew performance monitoring system

    Science.gov (United States)

    Arbet, J. D.; Benbow, R. L.

    1976-01-01

    This design note presents a study of the Procedures and Performance Program (PPP) effectiveness. The intent of the study is to determine manpower time savings and the improvements in job performance gained through PPP automated techniques. The discussion presents a synopsis of PPP capabilities and identifies potential users and associated applications, PPP effectiveness, and PPP applications to other simulation/training facilities. Appendix A provides a detailed description of each PPP capability.

  5. Analysis of Wind Speed Forecasting Error Effects on Automatic Generation Control Performance

    Directory of Open Access Journals (Sweden)

    H. Rajabi Mashhadi

    2014-09-01

    The main goal of this paper is to study statistical indices and evaluate AGC indices in a power system with large penetration of wind turbine generation (WTGs). The increasing penetration of wind turbine generation calls for further study of its impact on power system frequency control. Frequency deviates whenever real-time system generation and load are unbalanced, and wind turbine generation fluctuates strongly, making the system more unbalanced. The AGC loop then helps to adjust the system frequency and the scheduled tie-line powers. The quality of the AGC loop is measured by indices; a good index is a proper measure of AGC performance under realistic power system operation. One well-known measure in the literature, introduced by NERC, is the Control Performance Standards (CPS). It has previously been claimed that a key factor in the CPS index is related to the standard deviation of the generation error, installed power and frequency response. This paper focuses on the impact of a several-hours-ahead wind speed forecast error on this factor. Furthermore, the evaluation of conventional control performance in power systems with large-scale wind turbine penetration is studied. The effects of wind speed standard deviation and of the degree of wind farm penetration are analyzed, and the importance of the mentioned factor is examined. In addition, the influence of the mean wind speed forecast error on this factor is investigated. The study system is a two-area system with a significant wind farm in one area. The results show that the mean wind speed forecast error has a considerable effect on AGC performance, while the mentioned key factor is insensitive to this mean error.
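For reference, the NERC CPS1 measure mentioned above combines the area control error (ACE) with the interconnection frequency error. A common textbook form is CPS1 = (2 - CF) x 100%, where CF is the average of (ACE / (-10B)) * Δf normalized by ε1², with B the frequency bias and ε1 the target RMS one-minute frequency error. The sketch below follows that form; consult the NERC standard for the authoritative definition.

```python
import numpy as np

def cps1(ace_mw, delta_f_hz, bias_b, eps1_hz):
    # bias_b: frequency bias in MW per 0.1 Hz (negative by convention);
    # eps1_hz: the interconnection's target RMS one-minute frequency error.
    ace = np.asarray(ace_mw, float)
    df = np.asarray(delta_f_hz, float)
    cf = np.mean((ace / (-10.0 * bias_b)) * df) / eps1_hz ** 2
    return (2.0 - cf) * 100.0  # values >= 100 % indicate compliance
```

When ACE is uncorrelated with Δf the average product vanishes and CPS1 approaches 200%, while ACE perfectly tracking -10B·Δf at Δf's target RMS gives 100%, which is why the index rewards control that does not aggravate frequency error.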

  6. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  7. Diagnostics for stochastic genome-scale modeling via model slicing and debugging.

    Directory of Open Access Journals (Sweden)

    Kevin J Tsai

    Modeling of biological behavior has evolved from simple gene expression plots represented by mathematical equations to genome-scale systems biology networks. However, due to obstacles in complexity and scalability of creating genome-scale models, several biological modelers have turned to programming or scripting languages and away from modeling fundamentals. In doing so, they have traded the ability to have exchangeable, standardized model representation formats, while those that remain true to standardized model representation are faced with challenges in model complexity and analysis. We have developed a model diagnostic methodology inspired by program slicing and debugging and demonstrate the effectiveness of the methodology on a genome-scale metabolic network model published in the BioModels database. The computer-aided identification revealed specific points of interest such as reversibility of reactions, initialization of species amounts, and parameter estimation that improved a candidate cell's adenosine triphosphate production. We then compared the advantages of our methodology over other modeling techniques such as model checking and model reduction. A software application that implements the methodology is available at http://gel.ym.edu.tw/gcs/.

  8. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper will focus on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the Interactive Executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) that support unit and integrated testing will be explored.

  9. A computer-aided control system for automatic performance measurements on the LHC series dipoles

    International Nuclear Information System (INIS)

    Gorskaya, E.; Samojlov, V.; Raimondo, A.; Rijllart, A.

    2003-01-01

    The control system software (Test Master) for the Large Hadron Collider (LHC) magnet series measurements is presented. This system was developed at CERN to automate as many tests on the LHC magnets as possible. The Test Master software is the middle layer of the main software architecture developed by the LHC/IAS group for central supervision of all types of LHC dipole tests in the SM18 hall. It serves as a manager and scheduler for applications, controlling all measurements that are performed in a cluster of two test benches. The software was implemented in the LabVIEW environment. The interactive user interface, the software architecture, communication protocols, configuration files, the different types of commands, and the status files of the Test Master are described.

  10. Parameter design and performance analysis of shift actuator for a two-speed automatic mechanical transmission for pure electric vehicles

    Directory of Open Access Journals (Sweden)

    Jianjun Hu

    2016-08-01

    Recent developments of pure electric vehicles have shown that pure electric vehicles equipped with two-speed or multi-speed gearbox possess higher energy efficiency by ensuring the drive motor operates at its peak performance range. This article presents the design, analysis, and control of a two-speed automatic mechanical transmission for pure electric vehicles. The shift actuator is based on a motor-controlled camshaft where a special geometric groove is machined, and the camshaft realizes the axial positions of the synchronizer sleeve for gear engaging, disengaging, and speed control of the drive motor. Based on the force analysis of shift process, the parameters of shift actuator and shift motor are designed. The drive motor's torque control strategy before shifting, speed governing control strategy before engaging, shift actuator's control strategy during gear engaging, and drive motor's torque recovery strategy after shift process are proposed and implemented with a prototype. To validate the performance of the two-speed gearbox, a test bed was developed based on dSPACE that emulates various operation conditions. The experimental results indicate that the shift process with the proposed shift actuator and control strategy could be accomplished within 1 s under various operation conditions, with shift smoothness up to passenger car standard.

  11. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated, in the sense of not being ultimately periodic, and it may not be easy to name the rule by which a sequence is generated; nevertheless such a rule always exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular:· a general introduction to automatic sequences· the basic (combinatorial) properties of automatic sequences· the algebraic approach to automatic sequences· geometric objects related to automatic sequences.
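A concrete example may help: the Thue-Morse sequence is the standard first example of an automatic sequence. Its i-th term is the parity of the number of 1-bits of i, equivalently the final state of a two-state automaton fed the binary digits of i.

```python
def thue_morse(n):
    # i-th term = parity of the number of 1-bits in the binary expansion of i
    return [bin(i).count("1") % 2 for i in range(n)]

def thue_morse_automaton(i):
    # Equivalent 2-state DFAO: read the binary digits of i (most significant
    # first); a 1-digit flips the state, a 0-digit keeps it.
    state = 0
    for bit in bin(i)[2:]:
        state ^= int(bit)
    return state

print(thue_morse(8))  # [0, 1, 1, 0, 1, 0, 0, 1]
```

The sequence is not ultimately periodic (it is even cube-free), yet the generating rule is as simple as a two-state machine, which is exactly the tension the text describes.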

  12. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High degree of accuracy in the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomically aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. The time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomically aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding the accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created for 1 vertebra, in 28 cases for 2 vertebrae and in 16 cases for 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomically aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 and more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time

  13. Automatic seismic event detection using migration and stacking: a performance and parameter study in Hengill, southwest Iceland

    Science.gov (United States)

    Wagner, F.; Tryggvason, A.; Roberts, R.; Lund, B.; Gudmundsson, Ó.

    2017-06-01

    We investigate the performance of a seismic event detection algorithm using migration and stacking of seismic traces. The focus lies on determining optimal data dependent detection parameters for a data set from a temporary network in the volcanically active Hengill area, southwest Iceland. We test variations of the short-term average to long-term average and Kurtosis functions, calculated from filtered seismic traces, as input data. With optimal detection parameters, our algorithm identified 94 per cent (219 events) of the events detected by the South Iceland Lowlands (SIL) system, that is, the automatic system routinely used on Iceland, as well as a further 209 events, previously missed. The assessed number of incorrect (false) detections was 25 per cent for our algorithm, which was considerably better than that from SIL (40 per cent). Empirical tests show that well-functioning processing parameters can be effectively selected based on analysis of small, representative subsections of data. Our migration approach is more computationally expensive than some alternatives, but not prohibitively so, and it appears well suited to analysis of large swarms of low magnitude events with interevent times on the order of seconds. It is, therefore, an attractive, practical tool for monitoring of natural or anthropogenic seismicity related to, for example, volcanoes, drilling or fluid injection.
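The short-term average to long-term average (STA/LTA) characteristic function mentioned in this record can be sketched in a few lines. The window lengths and the synthetic trace below are illustrative choices, not the parameters tuned in the Hengill study:

```python
import numpy as np

def sta_lta(trace, n_sta, n_lta):
    """Ratio of a short-term to a long-term moving average of signal energy,
    the classic STA/LTA characteristic function used for event detection."""
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    ratio = np.zeros(len(energy))
    for i in range(n_lta, len(energy)):
        sta = (csum[i + 1] - csum[i + 1 - n_sta]) / n_sta  # short window ending at i
        lta = (csum[i + 1] - csum[i + 1 - n_lta]) / n_lta  # long window ending at i
        if lta > 0:
            ratio[i] = sta / lta
    return ratio

# Synthetic trace: Gaussian noise with an impulsive "arrival" at sample 1000.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 2000)
trace[1000:1100] += rng.normal(0.0, 10.0, 100)
cf = sta_lta(trace, n_sta=20, n_lta=200)
peak = int(np.argmax(cf))
print(peak)
```

The detector in the paper stacks such characteristic functions (or Kurtosis-based ones) after migrating them to candidate source locations; optimised trigger routines of this kind are also available in seismology packages such as ObsPy.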

  14. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Directory of Open Access Journals (Sweden)

    Simone Hantke

    Full Text Available We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient.

  15. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient.
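The leave-one-speaker-out SVM evaluation described in both records can be sketched as follows. The feature vectors are synthetic stand-ins (the iHEARu-EAT corpus itself is not reproduced here), and the scikit-learn pipeline is an assumed tooling choice, not the authors' exact setup:

```python
import numpy as np
from sklearn.metrics import recall_score
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for acoustic feature vectors: 6 speakers,
# binary label "eating" (1) vs "not eating" (0), 10-dimensional features.
rng = np.random.default_rng(1)
X_parts, y, groups = [], [], []
for speaker in range(6):
    speaker_offset = rng.normal(0.0, 0.3, size=10)  # mild per-speaker effect
    for label in (0, 1):
        feats = rng.normal(label * 1.5, 1.0, size=(20, 10)) + speaker_offset
        X_parts.append(feats)
        y += [label] * 20
        groups += [speaker] * 20
X = np.vstack(X_parts)
y = np.array(y)
groups = np.array(groups)

# Leave-one-speaker-out: each fold holds out every utterance of one speaker,
# so the classifier is always tested on an unseen speaker.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
y_pred = np.empty_like(y)
for train, test in LeaveOneGroupOut().split(X, y, groups):
    clf.fit(X[train], y[train])
    y_pred[test] = clf.predict(X[test])

# Unweighted average recall, the "average recall" metric reported above.
uar = recall_score(y, y_pred, average="macro")
print(round(uar, 3))
```

Grouping folds by speaker rather than by random split is what makes the reported recalls credible as speaker-independent results.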

  16. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV-systems in Sweden has been in operation since March 2002. The systems in the database are described in detail and energy production is continuously added in the form of monthly values. The energy production and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy program this number is expected to increase to over 100 systems within the next few years. The new owners of PV-systems are obliged to report the produced electricity to the authorities at least once a year. In this work we have studied different means to simplify the collection of data. Four different methods are defined. 1. The conversion of readings from energy meters taken at arbitrary intervals into monthly values. 2. Methods to handle data obtained with the monitoring systems provided by different inverter manufacturers. 3. Methods to acquire data from PV-systems with energy meters reporting to the green certificate system. 4. Commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third methods rely on equipment that some PV-systems are expected to use for other reasons. Method 4 makes a fully automatic collection scheme possible. The described GPRS-systems are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD)
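Method 1, converting meter readings taken at arbitrary dates into monthly values, amounts to linearly interpolating the cumulative energy curve at month boundaries and differencing. A minimal sketch with made-up readings:

```python
from datetime import date

def monthly_production(readings, months):
    """Convert cumulative energy-meter readings taken at arbitrary dates
    into monthly production values by linear interpolation of the
    cumulative curve at month boundaries."""
    readings = sorted(readings)
    days = [d.toordinal() for d, _ in readings]
    kwh = [v for _, v in readings]

    def cum_at(day):
        # Linear interpolation between the two surrounding readings.
        for (d0, v0), (d1, v1) in zip(zip(days, kwh), zip(days[1:], kwh[1:])):
            if d0 <= day <= d1:
                return v0 + (v1 - v0) * (day - d0) / (d1 - d0)
        raise ValueError("day outside the covered period")

    return {first: cum_at(nxt.toordinal()) - cum_at(first.toordinal())
            for first, nxt in months}

# Readings roughly six weeks apart; each month given as
# (first day of month, first day of next month).
readings = [(date(2006, 1, 1), 0.0),
            (date(2006, 2, 15), 450.0),
            (date(2006, 4, 1), 900.0)]
months = [(date(2006, 1, 1), date(2006, 2, 1)),
          (date(2006, 2, 1), date(2006, 3, 1)),
          (date(2006, 3, 1), date(2006, 4, 1))]
monthly = monthly_production(readings, months)
print(monthly)
```

With a constant 10 kWh/day between readings, this yields 310 kWh for January, 280 kWh for February and 310 kWh for March; the monthly values always sum to the metered total.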

  17. Data Provenance Inference in Logic Programming: Reducing Effort of Instance-driven Debugging

    NARCIS (Netherlands)

    Huq, M.R.; Mileo, Alessandra; Wombacher, Andreas

    Data provenance allows scientists in different domains to validate their models and algorithms and to detect anomalies and unexpected behaviors. In previous works, we described on-the-fly interpretation of (Python) scripts to build a workflow provenance graph automatically and then infer fine-grained

  18. Assessing the Performance of Automatic Speech Recognition Systems When Used by Native and Non-Native Speakers of Three Major Languages in Dictation Workflows

    DEFF Research Database (Denmark)

    Zapata, Julián; Kirkedal, Andreas Søeborg

    2015-01-01

    In this paper, we report on a two-part experiment aiming to assess and compare the performance of two types of automatic speech recognition (ASR) systems on two different computational platforms when used to augment dictation workflows. The experiment was performed with a sample of speakers of three major languages and with different linguistic profiles: non-native English speakers; non-native French speakers; and native Spanish speakers. The main objective of this experiment is to examine ASR performance in translation dictation (TD) and medical dictation (MD) workflows without manual...

  19. On enhancing energy harvesting performance of the photovoltaic modules using an automatic cooling system and assessing its economic benefits of mitigating greenhouse effects on the environment

    Science.gov (United States)

    Wang, Jen-Cheng; Liao, Min-Sheng; Lee, Yeun-Chung; Liu, Cheng-Yue; Kuo, Kun-Chang; Chou, Cheng-Ying; Huang, Chen-Kang; Jiang, Joe-Air

    2018-02-01

    The performance of photovoltaic (PV) modules under outdoor operation is greatly affected by their location and environmental conditions. The temperature of a PV module gradually increases as it is exposed to solar irradiation, resulting in degradation of its electrical characteristics and power generation efficiency. This study adopts wireless sensor network (WSN) technology to develop an automatic water-cooling system for PV modules in order to improve their power generation efficiency. A temperature estimation method is developed to quickly and accurately estimate the PV module temperatures based on weather data provided by the WSN monitoring system. Further, an estimation method is also proposed for evaluating the electrical characteristics and output power of the PV modules, which is performed remotely via a control platform. The automatic WSN-based water-cooling mechanism is designed to prevent the PV module temperature from reaching saturation. With each PV module equipped with the WSN-based cooling system, the ambient conditions are monitored automatically and the temperature of the PV module is controlled by sprinkling water on the panel surface. The field-test experiment results show an increase in the energy harvested by the PV modules of approximately 17.75% when using the proposed WSN-based cooling system.

  20. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and costly natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events, and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, moreover offering a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in presence of flood, requires modelling the behavior of different objects in the scene in order to associate them to flood or no flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
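A naive-Bayes special case of the Bayesian-network fusion idea can be sketched as follows: per pixel, a SAR backscatter class and a terrain-elevation class are combined into a posterior flood probability. The conditional probability values are purely illustrative, not taken from the paper:

```python
# Toy discrete Bayesian fusion of two evidence sources per pixel.
p_flood = 0.1  # illustrative prior probability of flooding

# P(observation | flood state): a "dark" SAR return (low backscatter,
# typical of smooth open water) and "low" elevation both favour flooding.
p_sar = {("dark", True): 0.8, ("dark", False): 0.1,
         ("bright", True): 0.2, ("bright", False): 0.9}
p_elev = {("low", True): 0.9, ("low", False): 0.4,
          ("high", True): 0.1, ("high", False): 0.6}

def posterior_flood(sar_obs, elev_obs):
    """P(flood | SAR, elevation) under a naive-Bayes factorisation."""
    num = p_flood * p_sar[(sar_obs, True)] * p_elev[(elev_obs, True)]
    den = num + (1 - p_flood) * p_sar[(sar_obs, False)] * p_elev[(elev_obs, False)]
    return num / den

print(round(posterior_flood("dark", "low"), 3))    # evidence for flood
print(round(posterior_flood("bright", "high"), 3)) # evidence against flood
```

A full BN, as used in the paper, would additionally model dependencies between evidence sources (e.g. land cover conditioning the SAR signature) instead of assuming conditional independence.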

  1. An automatic pilot to take off coal fired power plants performance; Un pilote automatique pour faire decoller les performances des centrales au charbon

    Energy Technology Data Exchange (ETDEWEB)

    Falinower, C.M.; Maurin, S.; Ambos, P. [Electricite de France (EDF), 78 - Chatou (France). Dept. de Controle-Commande des Centrales; Dewasmes, M. [Electricite de France, 76 - Le Havre (France). Dept. Ressources Humaines et Financieres

    1999-10-01

    Implementing an automatic pilot for the electric power output of fossil-fired plants is a basic measure for optimising production: better availability, increased flexibility for the network, and less stress on the plant equipment. Such an automatic pilot has been successfully installed at the Le Havre coal-fired power plant, yielding better dispatch and load following and better control of steam pressure and temperatures. The renewal of instrumentation and control with a DCS, together with the use of modern and efficient techniques from control theory, has led to these promising results. (authors)

  2. Handling Conflicts in Depth-First Search for LTL Tableau to Debug Compliance Based Languages

    Directory of Open Access Journals (Sweden)

    Francois Hantry

    2011-09-01

    Full Text Available Providing adequate tools to tackle the problem of inconsistent compliance rules is a critical research topic. This problem is of paramount importance to achieve automatic support for early declarative design and to support evolution of rules in contract-based or service-based systems. In this paper we investigate the problem of extracting temporal unsatisfiable cores in order to detect the inconsistent part of a specification. We extend conflict-driven SAT-solver to provide a new conflict-driven depth-first-search solver for temporal logic. We use this solver to compute LTL unsatisfiable cores without re-exploring the history of the solver.

  3. Automatic gallbladder segmentation using combined 2D and 3D shape features to perform volumetric analysis in native and secretin-enhanced MRCP sequences.

    Science.gov (United States)

    Gloger, Oliver; Bülow, Robin; Tönnies, Klaus; Völzke, Henry

    2017-11-24

    We aimed to develop the first fully automated 3D gallbladder segmentation approach to perform volumetric analysis in volume data of magnetic resonance (MR) cholangiopancreatography (MRCP) sequences. Volumetric gallbladder analysis is performed for non-contrast-enhanced and secretin-enhanced MRCP sequences. Native and secretin-enhanced MRCP volume data were produced with a 1.5-T MR system. Images of coronal maximum intensity projections (MIP) are used to automatically compute 2D characteristic shape features of the gallbladder in the MIP images. A gallbladder shape space is generated to derive 3D gallbladder shape features, which are then combined with 2D gallbladder shape features in a support vector machine approach to detect gallbladder regions in MRCP volume data. A region-based level set approach is used for fine segmentation. Volumetric analysis is performed for both sequences to calculate gallbladder volume differences between both sequences. The approach presented achieves segmentation results with mean Dice coefficients of 0.917 in non-contrast-enhanced sequences and 0.904 in secretin-enhanced sequences. This is the first approach developed to detect and segment gallbladders in MR-based volume data automatically in both sequences. It can be used to perform gallbladder volume determination in epidemiological studies and to detect abnormal gallbladder volumes or shapes. The positive volume differences between both sequences may indicate the quantity of the pancreatobiliary reflux.

  4. Assessment of automatic exposure control performance in digital mammography using a no-reference anisotropic quality index

    Science.gov (United States)

    Barufaldi, Bruno; Borges, Lucas R.; Bakic, Predrag R.; Vieira, Marcelo A. C.; Schiabel, Homero; Maidment, Andrew D. A.

    2017-03-01

    Automatic exposure control (AEC) is used in mammography to obtain acceptable radiation dose and adequate image quality regardless of breast thickness and composition. Although there are physics methods for assessing the AEC, it is not clear whether mammography systems operate with optimal dose and image quality in clinical practice. In this work, we propose the use of a normalized anisotropic quality index (NAQI), validated in previous studies, to evaluate the quality of mammograms acquired using AEC. The authors used a clinical dataset that consists of 561 patients and 1,046 mammograms (craniocaudal breast views). The results show that image quality is often maintained, even at various radiation levels (mean NAQI = 0.14 +/- 0.02). However, a more careful analysis of NAQI reveals that the average image quality decreases as breast thickness increases. The NAQI is reduced by 32% on average, when the breast thickness increases from 31 to 71 mm. NAQI also decreases with lower breast density. The variation in breast parenchyma alone cannot fully account for the decrease of NAQI with thickness. Examination of images shows that images of large, fatty breasts are often inadequately processed. This work shows that NAQI can be applied in clinical mammograms to assess mammographic image quality, and highlights the limitations of the automatic exposure control for some images.

  5. Automatic 2D scintillation camera and computed tomography whole-body image registration to perform dosimetric calculations

    International Nuclear Information System (INIS)

    Cismondi, F.; Mosconi, S.L.

    2008-01-01

    Full text: In this work a software tool that has been developed to allow automatic registration of 2D Scintillation Camera (SC) and Computed Tomography (CT) images is presented. This tool, used with a dosimetric software with Integrated Activity or Residence Time as input data, assists physicians in assessing the effects of radiodiagnostic or radiotherapeutic practices that involve nuclear medicine 'open sources'. Images are registered locally and globally, maximizing the Mutual Information coefficient between the regions being registered. In the regional case whole-body images are segmented into five regions: head, thorax, pelvis, left and right legs. Each region has its own registration parameters, which are optimized through the Powell-Brent minimization method that 'maximizes' the Mutual Information coefficient. This software tool allows the user to draw ROIs, input isotope characteristics and finally calculate Integrated Activity or Residence Time in one or many specific organs. These last values can be introduced into dosimetric software packages to finally obtain Absorbed Dose values. (author)

  6. Automatic 2D scintillation camera and computed tomography whole-body image registration to perform dosimetry calculation

    Energy Technology Data Exchange (ETDEWEB)

    Cismondi, Federico; Mosconi, Sergio L [Fundacion Escuela de Medicina Nuclear, Mendoza (Argentina)

    2007-11-15

    In this paper we present a software tool that has been developed to allow automatic registration of 2D Scintillation Camera (SC) and Computed Tomography (CT) images. This tool, used with a dosimetric software with Integrated Activity or Residence Time as input data, assists physicians in assessing the effects of radiodiagnostic or radiotherapeutic practices. Images are registered locally and globally, maximizing the Mutual Information coefficient between the regions being registered. In the regional case whole-body images are segmented into five regions: head, thorax, pelvis, left and right legs. Each region has its own registration parameters, which are optimized through the Powell-Brent minimization method that 'maximizes' the Mutual Information coefficient. This software tool allows the user to draw ROIs, input isotope characteristics and finally calculate Integrated Activity or Residence Time in one or many specific organs. These last values can be introduced into dosimetric software packages to finally obtain Absorbed Dose values.
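The registration criterion at the core of these two records, histogram-based mutual information, can be sketched as below. The papers optimise full per-region transforms with the Powell-Brent method; for brevity this sketch recovers only a 1-D integer shift by exhaustive search over synthetic images:

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Histogram-based mutual information between two equally sized images."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()                     # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)         # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def register_shift(fixed, moving, max_shift=10):
    """Integer row shift of `moving` that maximises MI with `fixed`."""
    shifts = range(-max_shift, max_shift + 1)
    scores = {s: mutual_information(fixed, np.roll(moving, s, axis=0))
              for s in shifts}
    return max(scores, key=scores.get)

# Synthetic intensity image and a copy misaligned by 4 rows plus mild noise.
rng = np.random.default_rng(2)
fixed = rng.random((64, 64))
moving = np.roll(fixed, -4, axis=0) + rng.normal(0.0, 0.01, (64, 64))
print(register_shift(fixed, moving))
```

Because MI only compares intensity co-occurrence statistics, the same criterion works across modalities (SC vs CT), which plain intensity-difference metrics cannot handle.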

  7. Automatic capture of student notes to augment mentor feedback and student performance on patient write-ups.

    Science.gov (United States)

    Spickard, Anderson; Gigante, Joseph; Stein, Glenn; Denny, Joshua C

    2008-07-01

    To determine whether the integration of an automated electronic clinical portfolio into clinical clerkships can improve the quality of feedback given to students on their patient write-ups and the quality of students' write-ups. The authors conducted a single-blinded, randomized controlled study of an electronic clinical portfolio that automatically collects all students' clinical notes and notifies their teachers (attending and resident physicians) via e-mail. Third-year medical students were randomized to use the electronic portfolio or traditional paper means. Teachers in the portfolio group provided feedback directly on the student's write-up using a web-based application. Teachers in the control group provided feedback directly on the student's write-up by writing in the margins of the paper. Outcomes were teacher and student assessment of the frequency and quality of feedback on write-ups, expert assessment of the quality of student write-ups at the end of the clerkship, and participant assessment of the value of the electronic portfolio system. Teachers reported giving more frequent and detailed feedback using the portfolio system (p = 0.01). Seventy percent of students who used the portfolio system, versus 39% of students in the control group (p = 0.001), reported receiving feedback on more than half of their write-ups. Write-ups of portfolio students were rated of similar quality to write-ups of control students. Teachers and students agreed that the system was a valuable teaching tool and easy to use. An electronic clinical portfolio that automatically collects students' clinical notes is associated with improved teacher feedback on write-ups and similar quality of write-ups.

  8. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS) and surveys the recent state of the art. The book shows the main problems of ADS, the difficulties, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed

  9. Performance of a sequencing-batch membrane bioreactor (SMBR) with an automatic control strategy treating high-strength swine wastewater.

    Science.gov (United States)

    Sui, Qianwen; Jiang, Chao; Yu, Dawei; Chen, Meixue; Zhang, Junya; Wang, Yawei; Wei, Yuansong

    2018-01-15

    Due to its high strength of organic matter, nutrients and pathogens, swine wastewater is a major source of pollution to the rural environment and surface water. A sequencing-batch membrane bioreactor (SMBR) system with an automatic control strategy was developed for high-strength swine wastewater treatment. Short-cut nitrification and denitrification (SND) was achieved at a nitrite accumulation rate of 83.6%, with removal rates of COD, NH₄⁺-N and TN of 95%, 99% and 93%, respectively, at a reduced HRT of 6.0 d and a TN loading rate of 0.02 kg N/(kg VSS·d). With effective membrane separation, the reductions of total bacteria (TB) and putative pathogens were 2.77 logs and 1%, respectively. The shift of the microbial community responded well to the controlling parameters. During the SND process, ammonia oxidizing bacteria (AOB) (Nitrosomonas, Nitrosospira) were enriched by 52 times and nitrite oxidizing bacteria (NOB) (Nitrospira) were reduced by 2 times. The denitrifiers (Thauera) were well enriched and their diversity was enhanced. Copyright © 2017. Published by Elsevier B.V.

  10. Performance evaluation of an automatic positioning system for photovoltaic panels; Avaliacao de desempenho de um sistema de posicionamento automatico para paineis fotovoltaicos

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Alceu Ferreira; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Fac. de Engenharia], Emails: alceu@feb.unesp.br, jacagnon@feb.unesp.br

    2009-07-01

    The need to use electric energy in localities not served by the utilities has motivated this research, which focuses on photovoltaic systems and the search for better system performance by positioning the solar panels toward the sun. This work presents the performance evaluation of an automatic positioning system for photovoltaic panels, taking into account the increase in electric energy generation and the implantation costs. A simplified electromechanical device was designed that is able to support and move a photovoltaic panel along the day and along the year, keeping its surface aimed at the sun's rays, without using sensors and with optimized movements, since the panel's inclination is adjusted only once a day. The obtained results indicated that the proposal is viable, showing a cost compatible with the increase in electricity generation. (author)

  11. AVID: Automatic Visualization Interface Designer

    National Research Council Canada - National Science Library

    Chuah, Mei

    2000-01-01

    .... Automatic generation offers great flexibility in performing data and information analysis tasks, because new designs are generated on a case by case basis to suit current and changing future needs...

  12. An automatic image recognition approach

    Directory of Open Access Journals (Sweden)

    Tudor Barbu

    2007-07-01

    Full Text Available Our paper focuses on the graphical analysis domain. We propose an automatic image recognition technique. This approach consists of two main pattern recognition steps. First, it performs an image feature extraction operation on an input image set, using statistical dispersion features. Then, an unsupervised classification process is performed on the previously obtained graphical feature vectors. An automatic region-growing based clustering procedure is proposed and utilized in the classification stage.
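The two stages this record describes, dispersion-feature extraction followed by region-growing clustering of the feature vectors, might look roughly as below. The particular dispersion features and the distance threshold are illustrative assumptions, not the authors' exact design:

```python
import numpy as np

def dispersion_features(image):
    """Statistical dispersion features of an image's intensities
    (illustrative choice: standard deviation, interquartile range, MAD)."""
    v = np.asarray(image, dtype=float).ravel()
    q75, q25 = np.percentile(v, [75, 25])
    mad = np.median(np.abs(v - np.median(v)))
    return np.array([v.std(), q75 - q25, mad])

def region_growing_clusters(vectors, threshold):
    """Grow clusters from unlabelled seeds: a vector joins a cluster if it is
    within `threshold` of any current member, else it seeds a new cluster."""
    labels = [-1] * len(vectors)
    next_label = 0
    for i in range(len(vectors)):
        if labels[i] != -1:
            continue
        labels[i] = next_label
        frontier = [i]
        while frontier:  # flood-fill growth over the feature space
            j = frontier.pop()
            for k, w in enumerate(vectors):
                if labels[k] == -1 and np.linalg.norm(vectors[j] - w) < threshold:
                    labels[k] = next_label
                    frontier.append(k)
        next_label += 1
    return labels

# Two synthetic image populations: low-contrast vs high-contrast noise.
rng = np.random.default_rng(3)
images = [rng.normal(0, 0.1, (16, 16)) for _ in range(5)] + \
         [rng.normal(0, 2.0, (16, 16)) for _ in range(5)]
feats = [dispersion_features(im) for im in images]
labels = region_growing_clusters(feats, threshold=0.8)
print(labels)
```

Being unsupervised, the procedure needs no labelled training set; the threshold plays the role of the growing criterion in classical region-growing segmentation, here applied in feature space rather than image space.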

  13. Automatic on-line solid-phase extraction with ultra-high performance liquid chromatography and tandem mass spectrometry for the determination of ten antipsychotics in human plasma.

    Science.gov (United States)

    Zhong, Qisheng; Shen, Lingling; Liu, Jiaqi; Yu, Dianbao; Li, Simin; Li, Zhiru; Yao, Jinting; Huang, Taohong; Kawano, Shin-Ichi; Hashi, Yuki; Zhou, Ting

    2016-06-01

    An automatic on-line solid-phase extraction with ultra-high performance liquid chromatography and tandem mass spectrometry method was developed for the simultaneous determination of ten antipsychotics in human plasma. The plasma sample after filtration was injected directly into the system without any pretreatment. A Shim-pack MAYI-C8 (G) column was used as a solid-phase extraction column, and all the analytes were separated on a Shim-pack XR-ODS III column with a mobile phase consisting of 0.1% v/v formic acid in water with 5 mM ammonium acetate and acetonitrile. The method features were systematically investigated, including extraction conditions, desorption conditions, the equilibration solution, the valve switching time, and the dilution for column-head stacking. Under the optimized conditions, the whole analysis procedure took only 10 min. The limits of quantitation were in the range of 0.00321-2.75 μg/L and the recoveries ranged from 75.9 to 122%. Compared with the off-line ultra-high performance liquid chromatography and the reported methods, this validated on-line method showed significant advantages such as minimal pretreatment, shortest analysis time, and highest sensitivity. The results indicated that this automatic on-line method was rapid, sensitive, and reliable for the determination of antipsychotics in plasma and could be extended to other target analytes in biological samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Performance verification and comparison of TianLong automatic hypersensitive hepatitis B virus DNA quantification system with Roche CAP/CTM system.

    Science.gov (United States)

    Li, Ming; Chen, Lin; Liu, Li-Ming; Li, Yong-Li; Li, Bo-An; Li, Bo; Mao, Yuan-Li; Xia, Li-Fang; Wang, Tong; Liu, Ya-Nan; Li, Zheng; Guo, Tong-Sheng

    2017-10-07

    To investigate and compare the analytical and clinical performance of TianLong automatic hypersensitive hepatitis B virus (HBV) DNA quantification system and Roche CAP/CTM system. Two hundred blood samples for HBV DNA testing, HBV-DNA negative samples and high-titer HBV-DNA mixture samples were collected and prepared. National standard materials for serum HBV and a worldwide HBV DNA panel were employed for performance verification. The analytical performance, such as limit of detection, limit of quantification, accuracy, precision, reproducibility, linearity, genotype coverage and cross-contamination, was determined using the TianLong automatic hypersensitive HBV DNA quantification system (TL system). Correlation and Bland-Altman plot analyses were carried out to compare the clinical performance of the TL system assay and the CAP/CTM system. The detection limit of the TL system was 10 IU/mL, and its limit of quantification was 30 IU/mL. The differences between the expected and tested concentrations of the national standards were less than ± 0.4 log₁₀ IU/mL, which showed high accuracy of the system. Results of the precision, reproducibility and linearity tests showed that the multiple test coefficient of variation (CV) of the same sample was less than 5% for 10²-10⁶ IU/mL; and for 30-10⁸ IU/mL, the linear correlation coefficient r² = 0.99. The TL system detected HBV DNA (A-H) genotypes and there was no cross-contamination during the "checkerboard" test. When compared with the CAP/CTM assay, the two assays showed 100% consistency in both negative and positive sample results (15 negative samples and 185 positive samples). No statistical differences between the two assays in the HBV DNA quantification values were observed (P > 0.05). Correlation analysis indicated a significant correlation between the two assays, r² = 0.9774. The Bland-Altman plot analysis showed that 98.9% of the positive data were within the 95% acceptable range, and the maximum difference
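The Bland-Altman analysis used for the method comparison is straightforward to reproduce: compute the paired differences, their mean (bias), and the 95% limits of agreement (bias ± 1.96 SD). The paired titres below are hypothetical, not the study's data:

```python
import numpy as np

def bland_altman(x, y):
    """Bland-Altman agreement statistics for two paired measurement methods:
    mean bias, 95% limits of agreement, and the fraction of differences
    falling inside those limits."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = x - y
    bias = diff.mean()
    sd = diff.std(ddof=1)
    low, high = bias - 1.96 * sd, bias + 1.96 * sd
    inside = float(np.mean((diff >= low) & (diff <= high)))
    return bias, (low, high), inside

# Hypothetical paired log10 HBV-DNA titres from two assays on 185 samples.
rng = np.random.default_rng(4)
truth = rng.uniform(2, 7, 185)                 # log10 IU/mL
assay_a = truth + rng.normal(0.02, 0.15, 185)  # small assay-specific errors
assay_b = truth + rng.normal(-0.03, 0.15, 185)
bias, (low, high), inside = bland_altman(assay_a, assay_b)
print(round(bias, 3), round(inside, 3))
```

For well-agreeing assays, roughly 95% of paired differences are expected to fall inside the limits, which matches the 98.9% reported in the record.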

  15. Use of Body Armor Protection Levels with Squad Automatic Weapon Fighting Load Impacts Soldier Performance, Mobility, and Postural Control

    Science.gov (United States)

    2015-05-01

    to examine Soldier performance. For the rush task, two padded gym mats (Mats A and B), separated by 30 m, were placed at either end of the hallway...pelvis and torso) using double-sided tape and athletic tape wrapped around the segment. The thigh and pelvis clusters were placed over tight-fitting spandex

  16. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  17. An automatic on-line 2,2-diphenyl-1-picrylhydrazyl-high performance liquid chromatography method for high-throughput screening of antioxidants from natural products.

    Science.gov (United States)

    Lu, Yanzhen; Wu, Nan; Fang, Yingtong; Shaheen, Nusrat; Wei, Yun

    2017-10-27

    Many natural products are rich in antioxidants which play an important role in preventing or postponing a variety of diseases, such as cardiovascular and inflammatory disease, diabetes as well as breast cancer. In this paper, an automatic on-line 2,2-diphenyl-1-picrylhydrazyl-high performance liquid chromatography (DPPH-HPLC) method was established for antioxidant screening with nine standards including organic acids (4-hydroxyphenylacetic acid, p-coumaric acid, ferulic acid, and benzoic acid), alkaloids (coptisine and berberine), and flavonoids (quercitrin, astragalin, and quercetin). The optimal concentration of DPPH was determined, and six potential antioxidants including 4-hydroxyphenylacetic acid, p-coumaric acid, ferulic acid, quercitrin, astragalin, and quercetin, and three non-antioxidants including benzoic acid, coptisine, and berberine, were successfully screened out and validated by the conventional DPPH radical scavenging activity assay. The established method has been applied to the crude samples of Saccharum officinarum rinds, Coptis chinensis powders, and Malus pumila leaves, consecutively. Two potential antioxidant compounds from Saccharum officinarum rinds and five potential antioxidant compounds from Malus pumila leaves were rapidly screened out. Then these seven potential antioxidants were purified and identified as p-coumaric acid, ferulic acid, phloridzin, isoquercitrin, quercetin-3-xyloside, quercetin-3-arabinoside, and quercetin-3-rhamnoside using countercurrent chromatography combined with mass spectrometry, and their antioxidant activities were further evaluated by the conventional DPPH radical scavenging assay. The activity result was in accordance with that of the established method. This established method is cheap and automatic, and could be used as an efficient tool for high-throughput antioxidant screening from various complex natural products. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. An electronically controlled automatic security access gate

    Directory of Open Access Journals (Sweden)

    Jonathan A. ENOKELA

    2014-11-01

    Full Text Available The security challenges being encountered in many places require electronic means of controlling access to communities, recreational centres, offices, and homes. The electronically controlled automated security access gate proposed in this work helps to prevent unwanted access to controlled environments. This is achieved mainly through the use of a Radio Frequency (RF) transmitter-receiver pair. In the design, a microcontroller is programmed to decode a given sequence of keys entered on a keypad and commands a transmitter module to send out this code as a signal at a given radio frequency. Upon reception of this RF signal by the receiver module, another microcontroller activates a driver circuit to operate the gate automatically. The code for the microcontrollers was written in C and was debugged and compiled using the Keil µVision 4 integrated development environment. The resultant hex files were programmed into the memories of the microcontrollers with the aid of a universal programmer. Software simulation was carried out using the Proteus Virtual System Modelling (VSM) version 7.7. A scaled-down prototype of the system was built and tested. The electronically controlled automated security access gate can be useful in providing security for homes, organizations, and automobile terminals. The four-character password required to operate the gate gives the system an increased level of security. Due to its standalone nature of operation, the system is cheaper to maintain than a manually operated type.
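    The four-character password check at the heart of the design can be sketched as follows. This is a hypothetical illustration in Python, not the authors' C firmware; the class and method names are invented, and a real controller would additionally drive the RF transmitter and motor circuitry.

    ```python
    class KeypadGate:
        """Toy model of the 4-character password check in the gate controller."""

        def __init__(self, password):
            assert len(password) == 4, "the design uses a four-character password"
            self._password = password
            self._buffer = []

        def press(self, key):
            """Register one keypress; return True/False after the 4th key,
            or None while the code is still incomplete."""
            self._buffer.append(key)
            if len(self._buffer) < 4:
                return None
            entered = "".join(self._buffer)
            self._buffer = []          # reset for the next attempt
            return entered == self._password

    gate = KeypadGate("4721")
    for key in "472":
        assert gate.press(key) is None   # code incomplete, nothing happens
    print(gate.press("1"))               # correct 4th key -> True (open the gate)
    ```

    In the paper's architecture, a `True` result would trigger the transmitter module rather than directly opening the gate; the receiver-side microcontroller performs the complementary decode before driving the motor.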

  19. Performance Evaluation of Three Different High Resolution Satellite Images in Semi-Automatic Urban Illegal Building Detection

    Science.gov (United States)

    Khalilimoghadama, N.; Delavar, M. R.; Hanachi, P.

    2017-09-01

    The problem of overcrowding in mega cities has become prominent in recent years. To meet the housing needs of this increased population, which is of great importance in mega cities, a huge number of buildings are constructed annually. With the ever-increasing trend of building construction, we are faced with a growing number of building infractions and illegal buildings (IBs). Acquiring multi-temporal satellite images and using change detection techniques is one of the proper methods of IB monitoring. The choice of satellite images with different spatial and spectral resolutions has always been an issue in efficient detection of building changes. In this research, three bi-temporal high-resolution satellite images from the IRS-P5, GeoEye-1 and QuickBird sensors, acquired over the west of the metropolitan area of Tehran, capital of Iran, in addition to city maps and a municipality property database, were used to detect under-construction buildings with improved performance and accuracy. Determining which of the employed bi-temporal satellite images provides better performance and accuracy in IB detection is the other purpose of this research. Kappa coefficients of 70%, 64%, and 68% were obtained for producing change image maps using the GeoEye-1, IRS-P5, and QuickBird satellite images, respectively. In addition, overall accuracies of 100%, 6%, and 83% were achieved for IB detection using these satellite images, respectively. These accuracies substantiate the fact that the GeoEye-1 satellite images had the best performance among the employed images in producing a change image map and detecting the IBs.

  20. Performance evaluation and operational experience with a semi-automatic monitor for the radiological characterization of low-level wastes

    International Nuclear Information System (INIS)

    Davey, E.C.; Csullog, G.W.

    1987-03-01

    Chalk River Nuclear Laboratories (CRNL) have undertaken a Waste Disposal Project to co-ordinate the transition from the current practice of interim storage to permanent disposal for low-level radioactive wastes (LLW). The strategy of the project is to classify and segregate waste segments according to their hazardous radioactive lifetimes and to emplace them in disposal facilities engineered to isolate and contain them. To support this strategy, a waste characterization program was set up to estimate the volume and radioisotope inventories of the wastes managed by CRNL. A key element of the program is the demonstration of a non-invasive measurement technique for the isotope-specific characterization of solid LLW. This paper describes the approach taken at CRNL for the non-invasive assay of LLW and the field performance and early operational experience with a waste characterization monitor to be used in a waste processing facility.

  1. Does the amount of tagged stool and fluid significantly affect the radiation exposure in low-dose CT colonography performed with an automatic exposure control?

    International Nuclear Information System (INIS)

    Lim, Hyun Kyong; Lee, Kyoung Ho; Kim, So Yeon; Kim, Young Hoon; Kim, Kil Joong; Kim, Bohyoung; Lee, Hyunna; Park, Seong Ho; Yanof, Jeffrey H.; Hwang, Seung-sik

    2011-01-01

    To determine whether the amount of tagged stool and fluid significantly affects the radiation exposure in low-dose screening CT colonography performed with an automatic tube-current modulation technique. The study included 311 patients. The tagging agent was barium (n = 271) or iodine (n = 40). Correlation was measured between the mean volume CT dose index (CTDIvol) and the estimated x-ray attenuation of the tagged stool and fluid (ATT). Multiple linear regression analyses were performed to determine the effect of ATT on CTDIvol and the effect of ATT on image noise while adjusting for other variables including abdominal circumference. CTDIvol varied from 0.88 to 2.54 mGy. There was no significant correlation between CTDIvol and ATT (p = 0.61). ATT did not significantly affect CTDIvol (p = 0.93), while abdominal circumference was the only factor significantly affecting CTDIvol (p < 0.001). Image noise ranged from 59.5 to 64.1 HU. The p value for the regression model explaining the noise was 0.38. The amount of stool and fluid tagging does not significantly affect radiation exposure. (orig.)

  2. Computing eye gaze metrics for the automatic assessment of radiographer performance during X-ray image interpretation.

    Science.gov (United States)

    McLaughlin, Laura; Bond, Raymond; Hughes, Ciara; McConnell, Jonathan; McFadden, Sonyia

    2017-09-01

    To investigate image interpretation performance by diagnostic radiography students, diagnostic radiographers and reporting radiographers by computing eye gaze metrics using eye tracking technology. Three groups of participants were studied during their interpretation of 8 digital radiographic images including the axial and appendicular skeleton, and chest (prevalence of normal images was 12.5%). A total of 464 image interpretations were collected. Participants consisted of 21 radiography students, 19 qualified radiographers and 18 qualified reporting radiographers who were further qualified to report on the musculoskeletal (MSK) system. Eye tracking data was collected using the Tobii X60 eye tracker and subsequently eye gaze metrics were computed. Voice recordings, confidence levels and diagnoses provided a clear demonstration of the image interpretation and the cognitive processes undertaken by each participant. A questionnaire afforded the participants an opportunity to offer information on their experience in image interpretation and their opinion on the eye tracking technology. Reporting radiographers demonstrated a 15% greater accuracy rate (p≤0.001), were more confident (p≤0.001) and took a mean of 2.4s longer to clinically decide on all features compared to students. Reporting radiographers also had a 15% greater accuracy rate (p≤0.001), were more confident (p≤0.001) and took longer to clinically decide on an image diagnosis (p=0.02) than radiographers. Reporting radiographers had a greater mean fixation duration (p=0.01), mean fixation count (p=0.04) and mean visit count (p=0.04) within the areas of pathology compared to students. Eye tracking patterns, presented within heat maps, were a good reflection of group expertise and search strategies. Eye gaze metrics such as time to first fixate, fixation count, fixation duration and visit count within the areas of pathology were indicative of the radiographer's competency. The accuracy and confidence of
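    The gaze metrics named in the abstract (time to first fixate, fixation count, fixation duration, visit count) are straightforward aggregations over detected fixation events. A minimal illustrative sketch, assuming fixations have already been detected and labelled with an area of interest (AOI) — the event format and function name here are invented, not the Tobii API:

    ```python
    def aoi_metrics(fixations, aoi):
        """fixations: time-ordered list of (start_ms, end_ms, aoi_name)."""
        in_aoi = [(s, e) for s, e, a in fixations if a == aoi]
        if not in_aoi:
            return None
        durations = [e - s for s, e in in_aoi]
        # a 'visit' is a run of consecutive fixations inside the AOI
        visits = sum(1 for i, (s, e, a) in enumerate(fixations)
                     if a == aoi and (i == 0 or fixations[i - 1][2] != aoi))
        return {
            "time_to_first_fixate_ms": in_aoi[0][0],
            "fixation_count": len(in_aoi),
            "mean_fixation_duration_ms": sum(durations) / len(durations),
            "visit_count": visits,
        }

    events = [
        (0, 200, "background"),
        (200, 500, "pathology"),   # first visit to the pathology AOI
        (500, 700, "pathology"),
        (700, 900, "background"),
        (900, 1100, "pathology"),  # second visit
    ]
    print(aoi_metrics(events, "pathology"))
    ```

    For this toy record, the pathology AOI is first fixated at 200 ms, receives 3 fixations across 2 visits, and has a mean fixation duration of about 233 ms.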

  3. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.
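    The termination logic under discussion can be sketched in a few lines: stop the cycle the first time the RMC estimate reaches the target, instead of running a fixed time. This is an invented illustration (the readings and threshold are made up), not the report's control algorithm:

    ```python
    def cycle_intervals(rmc_readings, target_rmc):
        """Return (intervals_run, final_rmc): how many sensing intervals the
        dryer runs before the RMC estimate first reaches the target."""
        for i, rmc in enumerate(rmc_readings, start=1):
            if rmc <= target_rmc:
                return i, rmc
        return len(rmc_readings), rmc_readings[-1]  # target never reached

    # Simulated per-interval RMC estimates (%) for one load:
    readings = [50, 40, 30, 20, 12, 7, 4, 2]
    used, final = cycle_intervals(readings, target_rmc=5)
    print(used, final)   # stops at the 7th interval, RMC 4%
    ```

    Running the full fixed schedule would dry the load to 2% RMC, i.e. one extra heated interval — a tiny version of the over-drying waste the report quantifies.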

  4. Automatic Evaluation Of Interferograms

    Science.gov (United States)

    Becker, Friedhelm; Meier, Gerd E. A.; Wegner, Horst

    1983-03-01

    A system for the automatic evaluation of interference patterns has been developed. After digitizing the interferograms from classical and holographic interferometers with a television digitizer and performing different picture-enhancement operations, the fringe loci are extracted by use of a floating-threshold method. The fringes are numbered using a special scheme after the removal of any fringe disconnections which might appear if there was insufficient contrast in the interferograms. The reconstruction of the object function from the numbered fringe field is achieved by a local polynomial least-squares approximation. Applications are given, demonstrating the evaluation of interferograms of supersonic flow fields and the analysis of holographic interferograms of car tyres.

  5. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  6. Automatic fluid dispenser

    Science.gov (United States)

    Sakellaris, P. C. (Inventor)

    1977-01-01

    Fluid automatically flows to individual dispensing units at predetermined times from a fluid supply and is available only for a predetermined interval of time after which an automatic control causes the fluid to drain from the individual dispensing units. Fluid deprivation continues until the beginning of a new cycle when the fluid is once again automatically made available at the individual dispensing units.

  7. Performance of human observers and an automatic 3-dimensional computer-vision-based locomotion scoring method to detect lameness and hoof lesions in dairy cows

    NARCIS (Netherlands)

    Schlageter-Tello, Andrés; Hertem, Van Tom; Bokkers, Eddie A.M.; Viazzi, Stefano; Bahr, Claudia; Lokhorst, Kees

    2018-01-01

    The objective of this study was to determine if a 3-dimensional computer vision automatic locomotion scoring (3D-ALS) method was able to outperform human observers for classifying cows as lame or nonlame and for detecting cows affected and nonaffected by specific type(s) of hoof lesion. Data

  8. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  9. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, a progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.
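    The progressive-tax mechanism described above can be made concrete with toy arithmetic. The bracket boundaries and rates below are invented for illustration; the point is only that the average tax rate rises with income, so booms automatically withdraw proportionally more money:

    ```python
    def progressive_tax(income):
        """Two hypothetical brackets: 10% up to 50,000; 40% above that."""
        lower = min(income, 50_000) * 0.10
        upper = max(income - 50_000, 0) * 0.40
        return lower + upper

    for income in (40_000, 80_000):
        tax = progressive_tax(income)
        # average rate rises from 10% to over 21% as income doubles
        print(income, tax, tax / income)
    ```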

  10. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  11. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, procedures for drawing the root locus, frequency response, and the design of control systems.

  12. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.

    2010-01-01

    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  13. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided.
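    The chain-rule propagation described above can be sketched with forward-mode dual numbers: each quantity carries its value and its derivative, and arithmetic updates both together, so the derivative at a point is exact. This is a minimal Python illustration of the general technique, not the report's FORTRAN code:

    ```python
    class Dual:
        """Dual number: carries f(x) and f'(x) together."""

        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv

        def _coerce(self, other):
            return other if isinstance(other, Dual) else Dual(other)

        def __add__(self, other):
            other = self._coerce(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):                  # product rule
            other = self._coerce(other)
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)

        __rmul__ = __mul__

    def f(x):
        return x * x * x + 2 * x      # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

    x = Dual(2.0, 1.0)                # seed dx/dx = 1
    y = f(x)
    print(y.value, y.deriv)           # 12.0 14.0 -- exact, no truncation error
    ```

    Note that `f` is an ordinary function; the derivative emerges purely from overloading the elementary operations, which is what distinguishes automatic differentiation from both symbolic and finite-difference methods.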

  14. AUTOMATIC INTRAVENOUS DRIP CONTROLLER*

    African Journals Online (AJOL)

    Both the nursing staff shortage and the need for precise control in the administration of dangerous drugs intravenously have led to the development of various devices to achieve an automatic system. The continuous automatic control of the drip rate eliminates errors due to any physical effect such as movement of the ...

  15. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e., automatically controlling the virtual...

  16. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

    Abstract: An automatic tuning circuit adjusts frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for an integrated circuit. It is a feedback system that compares the input phase with the output phase, and it can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter and a relaxation oscillator. On PSPICE simulation...

  17. A Learning-Based Wrapper Method to Correct Systematic Errors in Automatic Image Segmentation: Consistently Improved Performance in Hippocampus, Cortex and Brain Segmentation

    Science.gov (United States)

    Wang, Hongzhi; Das, Sandhitsu R.; Suh, Jung Wook; Altinay, Murat; Pluta, John; Craige, Caryne; Avants, Brian; Yushkevich, Paul A.

    2011-01-01

    We propose a simple but generally applicable approach to improving the accuracy of automatic image segmentation algorithms relative to manual segmentations. The approach is based on the hypothesis that a large fraction of the errors produced by automatic segmentation are systematic, i.e., occur consistently from subject to subject, and serves as a wrapper method around a given host segmentation method. The wrapper method attempts to learn the intensity, spatial and contextual patterns associated with systematic segmentation errors produced by the host method on training data for which manual segmentations are available. The method then attempts to correct such errors in segmentations produced by the host method on new images. One practical use of the proposed wrapper method is to adapt existing segmentation tools, without explicit modification, to imaging data and segmentation protocols that are different from those on which the tools were trained and tuned. An open-source implementation of the proposed wrapper method is provided, and can be applied to a wide range of image segmentation problems. The wrapper method is evaluated with four host brain MRI segmentation methods: hippocampus segmentation using FreeSurfer (Fischl et al., 2002); hippocampus segmentation using multi-atlas label fusion (Artaechevarria et al., 2009); brain extraction using BET (Smith, 2002); and brain tissue segmentation using FAST (Zhang et al., 2001). The wrapper method generates 72%, 14%, 29% and 21% fewer erroneously segmented voxels than the respective host segmentation methods. In the hippocampus segmentation experiment with multi-atlas label fusion as the host method, the average Dice overlap between reference segmentations and segmentations produced by the wrapper method is 0.908 for normal controls and 0.893 for patients with mild cognitive impairment. Average Dice overlaps of 0.964, 0.905 and 0.951 are obtained for brain extraction, white matter segmentation and gray matter
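    The wrapper idea can be illustrated with a toy 1-D analogue (this is an invented sketch, not the authors' implementation): a "host" method that systematically over-segments by one pixel on each side, a learner that records how the manual label at each position depends on the local host pattern, and a corrector that applies those learned rules to new host outputs.

    ```python
    from collections import Counter, defaultdict

    def dilate(seg):
        """Toy host-method error: every pixel adjacent to a positive becomes positive."""
        n = len(seg)
        return [int(any(seg[j] for j in range(max(0, i - 1), min(n, i + 2))))
                for i in range(n)]

    def learn_correction(host, manual):
        """Map each local host pattern (left, centre, right) to the manual label
        most often seen at the centre -- the 'contextual pattern' of the error."""
        counts = defaultdict(Counter)
        padded = [0] + host + [0]
        for i, label in enumerate(manual):
            counts[tuple(padded[i:i + 3])][label] += 1
        return {pat: c.most_common(1)[0][0] for pat, c in counts.items()}

    def apply_correction(host, rules):
        padded = [0] + host + [0]
        return [rules.get(tuple(padded[i:i + 3]), padded[i + 1])
                for i in range(len(host))]

    # Train on one image with a manual segmentation...
    manual_train = [0, 0, 1, 1, 1, 0, 0, 0, 1, 0, 0, 0]
    rules = learn_correction(dilate(manual_train), manual_train)

    # ...then correct the host's systematic error on a new image.
    manual_test = [0, 0, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0]
    corrected = apply_correction(dilate(manual_test), rules)
    print(corrected == manual_test)   # the learned rules undo the systematic error
    ```

    The real method works analogously but in 3-D, learning from intensity, spatial and contextual features rather than a three-pixel window; the key property shared with this sketch is that the error must be systematic across subjects for the learned correction to transfer.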

  18. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, ‘automaticity’ refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due to both a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: a functional significance of automaticity; b neurophysiology of automaticity; c measurement of automaticity; d mechanistic factors that compromise automaticity; and e strategies for rehabilitation of automaticity.

  19. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  20. Automatic requirements traceability

    OpenAIRE

    Andžiulytė, Justė

    2017-01-01

    This paper focuses on automatic requirements traceability and algorithms that automatically find recommendation links for requirements. The main objective of this paper is the evaluation of these algorithms and preparation of the method defining algorithms to be used in different cases. This paper presents and examines probabilistic, vector space and latent semantic indexing models of information retrieval and association rule mining using authors own implementations of these algorithms and o...

  1. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book tells of method of position determination and characteristic, control method of position determination and point of design, point of sensor choice for position detector, position determination of digital control system, application of clutch break in high frequency position determination, automation technique of position determination, position determination by electromagnetic clutch and break, air cylinder, cam and solenoid, stop position control of automatic guide vehicle, stacker crane and automatic transfer control.

  2. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  3. ArrayBridge: Interweaving declarative array processing with high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Xing, Haoyuan [The Ohio State Univ., Columbus, OH (United States); Floratos, Sofoklis [The Ohio State Univ., Columbus, OH (United States); Blanas, Spyros [The Ohio State Univ., Columbus, OH (United States); Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, Prabhat [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Brown, Paul [Paradigm4, Inc., Waltham, MA (United States)

    2017-05-04

    Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
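    The version-deduplication idea mentioned above can be sketched in pure Python (this is an invented illustration, not the ArrayBridge implementation, which operates on HDF5 objects): store the first array version in full and each later version only as the elements that changed, so unchanged data is never duplicated; "time travel" then means replaying diffs up to the requested version.

    ```python
    class VersionedArray:
        def __init__(self, base):
            self.base = list(base)
            self.diffs = []                 # one {index: new_value} dict per version

        def commit(self, new):
            prev = self.read(len(self.diffs))
            self.diffs.append({i: v for i, (old, v) in
                               enumerate(zip(prev, new)) if old != v})

        def read(self, version):
            """Reconstruct the array as of the given version (0 = base)."""
            out = list(self.base)
            for diff in self.diffs[:version]:
                for i, v in diff.items():
                    out[i] = v
            return out

    va = VersionedArray([1, 2, 3, 4])
    va.commit([1, 2, 9, 4])                 # only index 2 changed
    va.commit([1, 5, 9, 4])                 # only index 1 changed
    print(va.read(0), va.read(2))           # [1, 2, 3, 4] [1, 5, 9, 4]
    print(sum(len(d) for d in va.diffs))    # 2 stored values instead of 8
    ```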

  4. Calibration of automatic performance measures - speed and volume data : volume 1, evaluation of the accuracy of traffic volume counts collected by microwave sensors.

    Science.gov (United States)

    2015-09-01

    Over the past few years, the Utah Department of Transportation (UDOT) has developed a system called the Signal Performance Metrics System (SPMS) to evaluate the performance of signalized intersections. This system currently provides data summarie...

  5. Presentation video retrieval using automatically recovered slide and spoken text

    Science.gov (United States)

    Cooper, Matthew

    2013-03-01

    Video is becoming a prevalent medium for e-learning. Lecture videos contain text information in both the presentation slides and lecturer's speech. This paper examines the relative utility of automatically recovered text from these sources for lecture video retrieval. To extract the visual information, we automatically detect slides within the videos and apply optical character recognition to obtain their text. Automatic speech recognition is used similarly to extract spoken text from the recorded audio. We perform controlled experiments with manually created ground truth for both the slide and spoken text from more than 60 hours of lecture video. We compare the automatically extracted slide and spoken text in terms of accuracy relative to ground truth, overlap with one another, and utility for video retrieval. Results reveal that automatically recovered slide text and spoken text contain different content with varying error profiles. Experiments demonstrate that automatically extracted slide text enables higher precision video retrieval than automatically recovered spoken text.
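The retrieval side of such experiments can be illustrated with a toy TF-IDF ranker over recovered text. This is a generic sketch, not the paper's system: the "videos", their slide text, and the query below are made up for illustration.

```python
import math
from collections import Counter

def tfidf_rank(query, docs):
    """Rank documents by cosine similarity of TF-IDF vectors."""
    N = len(docs)
    tokenized = [d.lower().split() for d in docs]
    df = Counter()                      # document frequency of each term
    for toks in tokenized:
        df.update(set(toks))

    def vec(toks):
        tf = Counter(toks)
        # Drop terms absent from the corpus or present in every document (idf = 0).
        return {t: tf[t] * math.log(N / df[t]) for t in tf if 0 < df[t] < N}

    def cosine(a, b):
        num = sum(w * b.get(t, 0.0) for t, w in a.items())
        den = (math.sqrt(sum(w * w for w in a.values()))
               * math.sqrt(sum(w * w for w in b.values())))
        return num / den if den else 0.0

    qv = vec(query.lower().split())
    scores = [cosine(qv, vec(toks)) for toks in tokenized]
    return sorted(range(N), key=lambda i: -scores[i])

# Hypothetical "videos", each represented by its recovered slide text.
slide_text = ["gradient descent optimization slides",
              "fourier transform spectral analysis",
              "neural network training lecture"]
print(tfidf_rank("gradient descent", slide_text))  # video 0 ranks first
```

The same ranking function could be run over spoken-text transcripts instead of slide text to compare the two sources, as the paper does at much larger scale.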

  6. The design and performance of the first fully automatic non-grid 5 MW multi-diesel / mini hydro / battery converter power stations

    International Nuclear Information System (INIS)

    Ahmad Shadzli Abdul Wahab

    2000-01-01

    Electricity power supply in remote communities and towns is traditionally supplied by diesel generator sets of varying capacities and sizes, from a few kilowatts to a few megawatts. These have proven to be versatile, robust, modular, cheap in capital investment, reliable, and easy to operate and maintain. These features make diesel generators the preferred choice for generating electric power for power-hungry remote communities. The main drawback, though, is the increasingly high cost of operation and maintenance, largely due to the upward trend in the cost of diesel fuel, the high cost of engine spare parts, and the inflationary nature of the salaries and wages of operators. For these reasons, engineers and technologists have for years worked tirelessly to find ways and means to reduce the O and M costs. One of the novel ideas was to hybridize the conventional diesel generating system with renewable energy resources, such as mini hydro, solar photovoltaic or wind energy. Many prototypes involving several configurations of energy resources, e.g. diesel/PV/battery, diesel/wind/battery, diesel/mini hydro/battery, have been tested, but none has so far been as successful as the Sema/Powercorp automated Intelligent Power System (IPS). Based on microprocessor hardware, powerful computer software programming and satellite communication technology, an IPS-equipped diesel power station can now be operated fully automatically, with capability for remote control and monitoring. The system is versatile in maximising the use of renewable energy resources such as wind, mini hydro or solar, thereby reducing very significantly the use of diesel fuel. Operation and maintenance costs are also reduced due to the use of minimum manpower and an increase in the fuel efficiency of the engines.
The tested and proven IPS technology has been operating successfully for the last ten years in remote diesel stations in the Northern Territory, Australia, Rathlin Island, Northern Ireland and its latest and

  7. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...... on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...

  8. Automatic measuring device for atomic oxygen concentrations (1962)

    International Nuclear Information System (INIS)

    Weill, J.; Deiss, M.; Mercier, R.

    1962-01-01

    Within the framework of the activities of the Autonomous Reactor Electronics Section we have developed a device, which renders automatic one type of measurement carried out in the Physical Chemistry Department at the Saclay Research Centre. We define here: - the physico-chemical principle of the apparatus which is adapted to the measurement of atomic oxygen concentrations; - the physical principle of the automatic measurement; - the properties, performance, constitution, use and maintenance of the automatic measurement device. It is concluded that the principle of the automatic device, whose tests have confirmed the estimation of the theoretical performance, could usefully be adapted to other types of measurement. (authors) [fr

  9. Testing the algorithms for automatic identification of errors on the measured quantities of the nuclear power plant. Verification tests

    International Nuclear Information System (INIS)

    Svatek, J.

    1999-12-01

    During the development and implementation of supporting software for the control room and emergency control centre at the Dukovany nuclear power plant it appeared necessary to validate the input quantities in order to assure operating reliability of the software tools. Therefore, the development of software for validation of the measured quantities of the plant data sources was initiated, and the software had to be debugged and verified. The report contains the proposal for and description of the verification tests for testing the algorithms of automatic identification of errors on the observed quantities of the NPP by means of homemade validation software. In particular, the algorithms treated serve the validation of the hot leg temperature at primary circuit loop no. 2 or 4 at the Dukovany-2 reactor unit using data from the URAN and VK3 information systems, recorded during 3 different days. (author)

  10. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, 6 universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
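The wavelet step can be illustrated with a minimal one-level 2-D Haar transform whose sub-band energies would serve as classifier inputs. This is a generic sketch, not the authors' code; the 4×4 "eye patch" below is made up, and a real pipeline would feed such features to a trained neural network.

```python
def haar2d(img):
    """One-level 2-D Haar DWT (averaging convention): returns the LL
    approximation and the LH, HL, HH detail sub-bands."""
    def split(mat):
        lo, hi = [], []
        for row in mat:
            lo.append([(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)])
            hi.append([(row[i] - row[i + 1]) / 2 for i in range(0, len(row), 2)])
        return lo, hi

    def T(mat):  # transpose
        return [list(col) for col in zip(*mat)]

    L, H = split(img)            # filter along rows
    LL, LH = split(T(L))         # then along columns
    HL, HH = split(T(H))
    return T(LL), T(LH), T(HL), T(HH)

def band_energy(band):
    """Sub-band energy, usable as one feature per band."""
    return sum(v * v for row in band for v in row)

# Made-up 4x4 "eye patch": constant intensity, so all detail energy is zero.
patch = [[1.0] * 4 for _ in range(4)]
LL, LH, HL, HH = haar2d(patch)
features = [band_energy(b) for b in (LL, LH, HL, HH)]  # -> [4.0, 0.0, 0.0, 0.0]
```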

  11. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  12. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  13. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstrac...

  14. Optimisation of high-performance liquid chromatography with diode array detection using an automatic peak tracking procedure based on augmented iterative target transformation factor analysis

    NARCIS (Netherlands)

    van Zomeren, Paul; Hoogvorst, A.; Coenegracht, P.M J; de Jong, G.J.

    2004-01-01

    An automated method for the optimisation of high-performance liquid chromatography is developed. First of all, the sample of interest is analysed with various eluent compositions. All obtained data are combined into one augmented data matrix. Subsequently, augmented iterative target transformation

  15. An automatic versatile system integrating solid-phase extraction with ultra-high performance liquid chromatography-tandem mass spectrometry using a dual-dilution strategy for direct analysis of auxins in plant extracts.

    Science.gov (United States)

    Zhong, Qisheng; Qiu, Xiongxiong; Lin, Caiyong; Shen, Lingling; Huo, Yin; Zhan, Song; Yao, Jinting; Huang, Taohong; Kawano, Shin-ichi; Hashi, Yuki; Xiao, Langtao; Zhou, Ting

    2014-09-12

    An automatic versatile system which integrated solid-phase extraction (SPE) with ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) was developed. Diverse commercial SPE columns can be used under ambient pressure in this online system, realized by a dual-dilution strategy. The first dilution enabled the direct injection of complex samples with minimal pretreatment, and the second dilution realized direct introduction of a large volume of strong eluent into the UHPLC column without causing peak broadening or distortion. In addition, a post-column compensation mode was also designed for matrix-effects evaluation. The features of the online system were systematically investigated, including the dilution effect, the capture of desorption solution, the column-head stacking effect and the system recovery. Compared with the offline UHPLC system, this online system showed significant advantages such as larger injection volume, higher sensitivity, shorter analysis time and better repeatability. The feasibility of the system was demonstrated by the direct analysis of three auxins from different plant tissues, including leaves of Dracaena sanderiana, and buds and petals of Bauhinia. Under the optimized conditions, the whole analysis procedure took only 7 min. All the correlation coefficients were greater than 0.9987, and the limits of detection and the limits of quantitation were in the range of 0.560-0.800 ng/g and 1.80-2.60 ng/g, respectively. The recoveries of the real samples ranged from 61.0% to 117%. Finally, the post-column compensation mode was applied and no matrix effects were observed under the analysis conditions. The automatic versatile system was rapid, sensitive and reliable. We expect this system could be extended to other target analytes in complex samples utilizing diverse SPE columns. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. On automatic machine translation evaluation

    Directory of Open Access Journals (Sweden)

    Darinka Verdonik

    2013-05-01

    Full Text Available An important task in developing machine translation (MT) is evaluating system performance. Automatic measures are most commonly used for this task, as manual evaluation is time-consuming and costly. However, performing an objective evaluation is not a trivial task. Automatic measures, such as BLEU, TER, NIST, METEOR etc., have their own weaknesses, while manual evaluations are also problematic since they are always to some extent subjective. In this paper we test the influence of the test set on the results of automatic MT evaluation for the subtitling domain. Translating subtitles is a rather specific task for MT, since subtitles are a sort of summarization of spoken text rather than a direct translation of (written) text. An additional problem when translating a language pair that does not include English, in our example Slovene-Serbian, is that the translations are commonly done from English to Serbian and from English to Slovenian, and not directly, since most TV production is originally filmed in English. All this poses additional challenges to MT and consequently to MT evaluation. Automatic evaluation is based on a reference translation, which is usually taken from an existing parallel corpus and marked as a test set. In our experiments, we compare the evaluation results for the same MT system output using three types of test set. In the first round, the test set is 4000 subtitles from the SUMAT parallel corpus of subtitles. These subtitles are not direct translations from Serbian to Slovene or vice versa, but are based on an English original. In the second round, the test set is 1000 subtitles randomly extracted from the first test set and translated anew, from Serbian to Slovenian, based solely on the Serbian written subtitles. In the third round, the test set is the same 1000 subtitles, however this time the Slovene translations were obtained by manually correcting the Slovene MT outputs so that they are correct translations of the
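For concreteness, a BLEU-style score can be sketched as clipped n-gram precision combined with a brevity penalty. This toy version uses only unigrams and bigrams against a single reference; real BLEU uses up to 4-grams, multiple references and smoothing, and the paper's experiments would have used standard evaluation toolkits, not this sketch.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=2):
    """Toy BLEU: clipped n-gram precisions (n = 1..max_n), geometric mean,
    and a brevity penalty against one reference."""
    c, r = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(c, n), ngrams(r, n)
        clipped = sum(min(cnt, ref[g]) for g, cnt in cand.items())
        total = sum(cand.values())
        if total == 0 or clipped == 0:
            return 0.0
        precisions.append(clipped / total)
    bp = 1.0 if len(c) >= len(r) else math.exp(1 - len(r) / len(c))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

print(bleu("the cat sat on the mat", "the cat sat on the mat"))  # 1.0
```

Because the score is computed against a fixed reference, swapping in a different test set changes the numbers even for identical MT output, which is exactly the sensitivity the paper investigates.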

  17. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims to acquire the cluster number automatically utilizing forced-strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the result is compared with other state-of-the-art automatic clustering methods. The experiment results show that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.
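The core idea of DE-based automatic clustering, encoding candidate centroids together with activation flags so the cluster count emerges from the optimization, can be sketched with a plain DE/rand/1/bin loop on 1-D data. Everything below is illustrative: the 0.5 activation threshold, the per-cluster penalty, and all parameter values are generic choices, not the AC-FSDE algorithm or its forced strategy.

```python
import random

def fitness(ind, data, kmax, penalty=0.5):
    """Within-cluster sum of squares plus a per-cluster penalty.
    The penalty stands in for a cluster-validity index (illustrative)."""
    centers = [ind[kmax + j] for j in range(kmax) if ind[j] > 0.5]
    if len(centers) < 2:
        return float("inf")
    wcss = sum(min((x - c) ** 2 for c in centers) for x in data)
    return wcss + penalty * len(centers)

def de_cluster(data, kmax=4, pop_size=20, gens=200, F=0.5, CR=0.9, seed=1):
    """DE/rand/1/bin over individuals [a_1..a_kmax, c_1..c_kmax]:
    activation flags a_j (> 0.5 means centroid c_j is active) plus centroids."""
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    dim = 2 * kmax
    pop = [[rng.random() if j < kmax else rng.uniform(lo, hi) for j in range(dim)]
           for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for k, p in enumerate(pop) if k != i], 3)
            trial = [a[j] + F * (b[j] - c[j]) if rng.random() < CR else pop[i][j]
                     for j in range(dim)]
            if fitness(trial, data, kmax) <= fitness(pop[i], data, kmax):
                pop[i] = trial          # greedy one-to-one selection
    best = min(pop, key=lambda p: fitness(p, data, kmax))
    return sorted(best[kmax + j] for j in range(kmax) if best[j] > 0.5)
```

On data with two well-separated groups, the penalty makes extra active centroids unprofitable, so the search settles on two clusters without the user specifying K.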

  18. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
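The numerical-experiment method described here can be illustrated with a self-contained sketch: generate artificial series as a linear trend plus AR(1) noise, estimate the slope, and measure estimator accuracy over many realizations. The generator, estimator and sample sizes below are generic illustrations, not the book's algorithms.

```python
import math
import random

def make_series(n, slope, phi=0.6, sigma=1.0, rng=random):
    """Artificial series: linear trend + AR(1) noise (illustrative generator)."""
    x, eps = [], 0.0
    for t in range(n):
        eps = phi * eps + rng.gauss(0.0, sigma)
        x.append(slope * t + eps)
    return x

def ols_slope(x):
    """Least-squares slope of x against t = 0..n-1."""
    n = len(x)
    tbar = (n - 1) / 2
    xbar = sum(x) / n
    num = sum((t - tbar) * (x[t] - xbar) for t in range(n))
    den = sum((t - tbar) ** 2 for t in range(n))
    return num / den

def mc_rmse(n=200, slope=0.05, trials=300, seed=7):
    """Monte Carlo accuracy of the estimator: RMSE of the recovered slope
    over many independently generated artificial series."""
    rng = random.Random(seed)
    errs = [(ols_slope(make_series(n, slope, rng=rng)) - slope) ** 2
            for _ in range(trials)]
    return math.sqrt(sum(errs) / trials)
```

Because the true trend of each artificial series is known by construction, the RMSE directly quantifies the estimation algorithm's accuracy under controlled noise conditions.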

  19. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his ...... a renewed stimulus for continuing and deepening Bob's research visions. A familiar touch is given to the book by some pictures kindly provided to us by his wife Nieba, the personal recollections of his brother Gary and some of his colleagues and friends....... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers...

  20. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychological...... aspects of the construct, such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty, and......, respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses....

  1. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  2. Automatic Language Identification

    Science.gov (United States)

    2000-08-01

    [Abstract garbled in source; two-column text interleaved. Recoverable fragments: distinguishing one language from another; hundreds of input languages would need to be supported; the reader is referred to the linguistics literature; a training algorithm builds one model per language from sets of French, German and Spanish training speech utterances; speech segments (i.e. vowels) in each utterance are located automatically; feature vectors are normalized to be insensitive to overall amplitude and pitch.]

  3. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J. S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J. P. K.; Geertzen, J. H. B.

    2004-01-01

    This paper describes a new automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitted

  4. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients,

  5. Automatic alignment of audiobooks in Afrikaans

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2012-11-01

    Full Text Available to perform Maximum A Posteriori adaptation on the baseline models. The corresponding value for models trained on the audiobook data is 0.996. An automatic measure of alignment accuracy is also introduced and compared to accuracies measured relative to a gold...

  6. ASAM: Automatic architecture synthesis and application mapping

    DEFF Research Database (Denmark)

    Jozwiak, Lech; Lindwer, Menno; Corvino, Rosilde

    2013-01-01

    This paper focuses on mastering the automatic architecture synthesis and application mapping for heterogeneous massively-parallel MPSoCs based on customizable application-specific instruction-set processors (ASIPs). It presents an overview of the research being currently performed in the scope...

  7. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

    and is demonstrated on two separate, previously published types of problems in topology optimization. Two separate software packages for automatic differentiation, CoDiPack and Tapenade are considered, and their performance and usability trade-offs are discussed and compared to a hand coded adjoint gradient...

  8. Exposing MPI Objects for Debugging

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.

    Developers rely on debuggers to inspect application state. In applications that use MPI, the Message Passing Interface, the MPI runtime contains an important part of this state. The MPI Tools Working Group has proposed an interface for MPI Handle Introspection. It allows debuggers and MPI impleme...

  9. Graphical debugging of combinational geometry

    International Nuclear Information System (INIS)

    Burns, T.J.; Smith, M.S.

    1992-01-01

    A graphical debugger for combinatorial geometry being developed at Oak Ridge National Laboratory is described. The prototype debugger consists of two parts: a FORTRAN-based "view" generator and a Microsoft Windows application for displaying the geometry. Options and features of both modules are discussed. Examples illustrating the various options available are presented. The potential for utilizing the images produced using the debugger as a visualization tool for the output of the radiation transport codes is discussed, as is the future direction of the development.

  10. Enhancing Automaticity through Task-Based Language Learning

    Science.gov (United States)

    De Ridder, Isabelle; Vangehuchten, Lieve; Gomez, Marta Sesena

    2007-01-01

    In general terms automaticity could be defined as the subconscious condition wherein "we perform a complex series of tasks very quickly and efficiently, without having to think about the various components and subcomponents of action involved" (DeKeyser 2001: 125). For language learning, Segalowitz (2003) characterised automaticity as a…

  11. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; De Vries, A.P.; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive

  12. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    T. Tsikrika (Theodora); C. Diou; A.P. de Vries (Arjen); A. Delopoulos

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the

  13. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    The human motion contains valuable information in many situations and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has...... bring the solution of fully automatic analysis and understanding of human motion closer....

  14. Automatic identification of species with neural networks

    Directory of Open Access Journals (Sweden)

    Andrés Hernández-Serna

    2014-11-01

    Full Text Available A new automatic identification system using photographic images has been designed to recognize fish, plant, and butterfly species from Europe and South America. The automatic classification system integrates multiple image processing tools to extract the geometry, morphology, and texture of the images. Artificial neural networks (ANNs) were used as the pattern recognition method. We tested a data set that included 740 species and 11,198 individuals. Our results show that the system performed with high accuracy, reaching 91.65% of true positive fish identifications, 92.87% of plants and 93.25% of butterflies. Our results highlight how the neural networks are complementary to species identification.

  15. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

    An automatic handling device for the steam relief valves (SRVs) is developed in order to achieve a decrease in worker exposure, an increase in availability factor, improvement in reliability, improvement in operational safety, and labor saving. A survey is made during a periodical inspection to examine the actual SRV handling operation. An SRV automatic handling device consists of four components: conveyor, armed conveyor, lifting machine, and control/monitoring system. The conveyor is so designed that the existing I-rail installed in the containment vessel can be used without any modification. This is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, operation panel, TV monitor and annunciator. The SRV handling device is operated by remote control from a control room. Trial equipment is constructed and performance/function testing is carried out using actual SRVs. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  16. Automatic scanning of emulsion films

    International Nuclear Information System (INIS)

    D'Ambrosio, N.; Mandrioli, G.; Sirrib, G.

    2003-01-01

    The use of nuclear emulsions in recent large neutrino experiments is largely due to significant developments in this detection technique. In the emulsion films, trajectories of through-going particles are permanently recorded: thus, the emulsion target can be considered not only a tracking but also a storing device. If the data readout is performed by automatic scanning systems interfaced to an acquisition computer equipped with a fast frame grabber, nuclear emulsions can be used as a very large target detector and quickly analyzed in particle physics experiments. Techniques for automatic scanning of nuclear emulsions were first developed by Niwa at Nagoya (Japan) in the late 70s. The first large-scale application was the CHORUS experiment; emulsions were then used to search for tau neutrinos in a high-track-density environment in DONUT. In order to measure with high accuracy and high speed, very strict constraints must be satisfied in terms of mechanical precision, camera speed, and image processing power. Recent improvements in this technique are briefly reported.

  17. Deep neural networks show an equivalent and often superior performance to dermatologists in onychomycosis diagnosis: Automatic construction of onychomycosis datasets by region-based convolutional deep neural network.

    Directory of Open Access Journals (Sweden)

    Seung Seog Han

    Full Text Available Although there have been reports of the successful diagnosis of skin disorders using deep learning, unrealistically large clinical image datasets are required for artificial intelligence (AI) training. We created datasets of standardized nail images using a region-based convolutional neural network (R-CNN) trained to distinguish the nail from the background. We used R-CNN to generate training datasets of 49,567 images, which we then used to fine-tune the ResNet-152 and VGG-19 models. The validation datasets comprised 100 and 194 images from Inje University (B1 and B2 datasets, respectively), 125 images from Hallym University (C dataset), and 939 images from Seoul National University (D dataset). The AI (ensemble model: ResNet-152 + VGG-19 + feedforward neural networks) results showed test sensitivity/specificity/area under the curve values of (96.0/94.7/0.98), (82.7/96.7/0.95), (92.3/79.3/0.93), and (87.7/69.3/0.82) for the B1, B2, C, and D datasets. With a combination of the B1 and C datasets, the AI Youden index was significantly (p = 0.01) higher than that of 42 dermatologists doing the same assessment manually. For the B1+C and B2+D dataset combinations, almost none of the dermatologists performed as well as the AI. By training with a dataset comprising 49,567 images, we achieved a diagnostic accuracy for onychomycosis using deep learning that was superior to that of most of the dermatologists who participated in this study.

  18. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects face from the stored video frame using skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experiment results indicate that using support vector machine as classifier can certainly improve the performance of automatic pain recognition system.
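The skin-colour modelling step of the face detector can be illustrated with a classic rule-based RGB test (thresholds in the style of the Peer et al. skin rule). The paper does not specify its exact model, so the thresholds below are an illustrative assumption, and the tiny "image" is made up.

```python
def is_skin(r, g, b):
    """Rule-based RGB skin test (illustrative thresholds, Peer et al.-style):
    a pixel is skin if it is bright, red-dominant, and not grey."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_mask(image):
    """image: 2-D list of (r, g, b) tuples -> boolean mask of skin pixels."""
    return [[is_skin(*px) for px in row] for row in image]

# Made-up 1x2 image: one skin-toned pixel, one grey pixel.
mask = skin_mask([[(200, 120, 90), (60, 60, 60)]])
print(mask)  # [[True, False]]
```

A face detector of this kind would then take the largest connected skin region as the face candidate before computing the location and shape features fed to the SVM.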

  19. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  20. Automatic inference of indexing rules for MEDLINE.

    Science.gov (United States)

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent

    2008-11-19

    Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.
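The rules ILP induces over indexing data are first-order clauses pairing conditions on a citation with a heading to recommend; the sketch below applies a rule set of that shape. The rules and headings here are illustrative, hand-written stand-ins, not the clauses actually learned in the paper.

```python
# Illustrative rule-application sketch for indexing recommendations.
# Each rule pairs a predicate over the citation text with a MeSH heading;
# the rules themselves are hypothetical stand-ins for ILP-induced clauses.
RULES = [
    (lambda text: "randomized" in text and "trial" in text,
     "Randomized Controlled Trials as Topic"),
    (lambda text: "mice" in text or "mouse" in text,
     "Mice"),
    (lambda text: "child" in text or "children" in text,
     "Child"),
]

def recommend_headings(title, abstract):
    """Return the MeSH headings whose rule fires on the citation text."""
    text = (title + " " + abstract).lower()
    return [heading for condition, heading in RULES if condition(text)]

recs = recommend_headings(
    "A randomized trial of drug X in children",
    "We enrolled 120 children in a double-blind randomized trial.")
```

In the paper's setting such recommendations would be merged with MTI's other outputs rather than used alone.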

  1. Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

    CERN Multimedia

    1969-01-01

    Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

  2. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fractions of each kidney with respect to the injected dose, taking into account the kidney depth, with the results referred to normal values; printout of the results. Automation minimizes parameter scatter and, by its simplification, is a great asset in routine work [fr

  3. Avaliação do desempenho de um sistema automático para controle da fertirrigação do tomateiro cultivado em substrato Performance evaluation of an automatic system for tomato fertigation control in substrate

    Directory of Open Access Journals (Sweden)

    Antonio J. Steidle Neto

    2009-09-01

    Full Text Available This study evaluated the performance of an automatic fertigation control system for tomato production in sand substrate, compared with a conventional control system, with respect to reducing nutrient solution use. In the automatic control method, fertigation events were scheduled according to the meteorological conditions of the cultivation environment and the crop development stage; the Penman-Monteith model was used as a decision support tool for determining the appropriate frequency of nutrient solution application. In the conventional control system, the intervals between fertigations remained fixed throughout the tomato crop cycle. The results showed that the automatic control system fully met the water requirements of the crop without compromising tomato production, while providing substantial reductions in nutrient solution consumption. The conventional control system, by contrast, performed an excessive number of fertigations, especially during the initial stage of tomato development and on days with heavy cloud cover. In the initial growth stage, the total volumes of nutrient solution applied by the conventional system exceeded the crop's water requirements by 1.31 and 1.39 L per plant on typical clear-sky and overcast days, respectively.
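The decision-support step described above can be sketched with the FAO-56 Penman-Monteith equation for daily reference evapotranspiration plus a simple cumulative-deficit trigger; the trigger threshold and the weather values below are illustrative assumptions, not the study's calibration.

```python
# FAO-56 Penman-Monteith daily reference evapotranspiration (mm/day),
# used here as the decision-support input for scheduling fertigation.
import math

def et0_fao56(t_mean, rh_mean, u2, rn, g=0.0, pressure=101.3):
    """t_mean [deg C], rh_mean [%], u2 wind speed at 2 m [m/s],
    rn net radiation [MJ/m2/day], g soil heat flux [MJ/m2/day]."""
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))  # saturation vapor pressure [kPa]
    ea = es * rh_mean / 100.0                                  # actual vapor pressure [kPa]
    delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of saturation curve [kPa/degC]
    gamma = 0.000665 * pressure                                # psychrometric constant [kPa/degC]
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

def fertigation_due(et0_since_last_mm, threshold_mm=2.0):
    """Illustrative trigger: fertigate once accumulated ET0 exceeds a threshold.
    The 2.0 mm threshold is a hypothetical value, not the study's setting."""
    return et0_since_last_mm >= threshold_mm

# A mild summer day: warm, moderately humid, light wind.
et0 = et0_fao56(t_mean=25.0, rh_mean=60.0, u2=2.0, rn=15.0)
```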

  4. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to the hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to operate the necessary control equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring a reactor to a hot stand-by condition quickly and safely. (author)

  5. Automatic readout micrometer

    Science.gov (United States)

    Lauritzen, T.

    A measuring system is described for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibility of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  6. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  7. Reachability Games on Automatic Graphs

    Science.gov (United States)

    Neider, Daniel

    In this work we study two-person reachability games on finite and infinite automatic graphs. For the finite case we empirically show that automatic game encodings are competitive with well-known symbolic techniques such as BDDs, SAT and QBF formulas. For the infinite case we present a novel algorithm utilizing algorithmic learning techniques, which allows one to solve large classes of automatic reachability games.
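The core of solving a finite reachability game, whatever the graph encoding, is the classical attractor construction: player 0 wins from exactly the nodes from which she can force a visit to the target set. A minimal sketch on an explicit (non-automatic) graph representation:

```python
# Classical attractor computation for a two-player reachability game on a
# finite graph. Player 0 wins from exactly the nodes in the attractor.
def attractor(v0, edges, target):
    """v0: set of player-0 nodes (all others belong to player 1);
    edges: dict node -> list of successors; target: set of goal nodes."""
    attr = set(target)
    changed = True
    while changed:
        changed = False
        for v, succs in edges.items():
            if v in attr:
                continue
            if v in v0:
                win = any(s in attr for s in succs)   # player 0 picks a good move
            else:
                win = all(s in attr for s in succs)   # player 1 is forced into attr
            if win:
                attr.add(v)
                changed = True
    return attr

# Tiny example: from node 2 player 0 moves straight to target 4; from node 1
# player 1 can escape to the self-loop at 3, so nodes 0 and 1 are losing.
edges = {0: [1], 1: [2, 3], 2: [4], 3: [3], 4: [4]}
win = attractor(v0={0, 2, 4}, edges=edges, target={4})
```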

  8. Automatic reactor protection system tester

    International Nuclear Information System (INIS)

    Deliant, J.D.; Jahnke, S.; Raimondo, E.

    1988-01-01

    The object of this paper is to present the automatic tester of reactor protection systems designed and developed by EDF and Framatome. In order, the following points are discussed: . The necessity for reactor protection system testing, . The drawbacks of manual testing, . The description and use of the Framatome automatic tester, . On-site installation of this system, . The positive results obtained using the Framatome automatic tester in France

  9. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  10. A simple method for automatic measurement of excitation functions

    International Nuclear Information System (INIS)

    Ogawa, M.; Adachi, M.; Arai, E.

    1975-01-01

    An apparatus has been constructed to perform the sequence control of a beam-analysing magnet for automatic excitation function measurements. This device is also applied to the feedback control of the magnet to lock the beam energy. (Auth.)

  11. Optical Automatic Car Identification (OACI) : Volume 1. Advanced System Specification.

    Science.gov (United States)

    1978-12-01

    A performance specification is provided in this report for an Optical Automatic Car Identification (OACI) scanner system which features 6% improved readability over existing industry scanner systems. It also includes the analysis and rationale which ...

  12. claVision: Visual Automatic Piano Music Transcription

    OpenAIRE

    Akbari, Mohammad; Cheng, Howard

    2015-01-01

    One important problem in Musical Information Retrieval is Automatic Music Transcription, which is an automated conversion process from played music to a symbolic notation such as sheet music. Since the accuracy of previous audio-based transcription systems is not satisfactory, we propose an innovative visual-based automatic music transcription system named claVision to perform piano music transcription. Instead of processing the music audio, the system performs the transcription only from the...

  13. Development of automatic ultrasonic testing system and its application

    International Nuclear Information System (INIS)

    Oh, Sang Hong; Matsuura, Toshihiko; Iwata, Ryusuke; Nakagawa, Michio; Horikawa, Kohsuke; Kim, You Chul

    1997-01-01

    Radiographic testing (RT) has usually been applied as a nondestructive test carried out to detect internal defects at the welded joints of a penstock. In cases where RT could not be applied, ultrasonic testing (UT) was performed. UT was generally carried out by manual scanning, and the inspection data were recorded by the inspector on site. As a weak point, there were thus no objective inspection records corresponding to the films of RT. An automatic ultrasonic testing system capable of automatic scanning and automatic recording was therefore needed, and in this respect such a system was developed. Test results obtained with the newly developed automatic ultrasonic testing system on the circumferential welded joints of a penstock at a site are shown in this paper.

  14. Automatic Detection of Wild-type Mouse Cranial Sutures

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Darvann, Tron Andre; Hermann, Nuno V.

    , automatic detection of the cranial sutures becomes important. We have previously built a craniofacial, wild-type mouse atlas from a set of 10 Micro CT scans using a B-spline-based nonrigid registration method by Rueckert et al. Subsequently, all volumes were registered nonrigidly to the atlas. Using...... these transformations, any annotation on the atlas can automatically be transformed back to all cases. For this study, two rounds of tracing seven of the cranial sutures, were performed on the atlas by one observer. The average of the two rounds was automatically propagated to all the cases. For validation......, the observer traced the sutures on each of the mouse volumes as well. The observer outperforms the automatic approach by approximately 0.1 mm. All mice have similar errors while the suture error plots reveal that suture 1 and 2 are cumbersome, both for the observer and the automatic approach. These sutures can...

  15. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of pearlite in nodular cast irons, porosity and average grain size in high-density sintered pellets of UO2, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with the corresponding ones from the direct counting process: counting of systematic points (grid) to measure volume, and the intersection method, utilizing a circumference of known radius for the average grain size. The technique adopted for nodular cast iron follows from the small difference in optical reflectivity between graphite and pearlite. Porosity evaluation of sintered UO2 pellets is also analyzed [pt

  16. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  17. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Full Text Available Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has surged toward automatically creating procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  18. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  19. Automatic alkaloid removal system.

    Science.gov (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing the alkaloid toxicity from Dioscorea hispida (DH) tuber. DH is a poisonous plant; scientific study has shown that its tubers contain a toxic alkaloid constituent, dioscorine. The tubers can only be consumed after the poison is removed. In this experiment, the tubers need to be blended into powder form before being inserted into the machine basket. The user pushes the START button on the machine controller to switch the water pump ON, creating a turbulent wave of water in the machine tank. The water stops automatically by triggering the outlet solenoid valve. The tuber powder is washed for 10 minutes while 1 liter of water contaminated with the toxin mixture flows out. At this point, the controller automatically triggers the inlet solenoid valve, and fresh water flows into the machine tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, and a positive, significant result is achieved according to several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate or time. These parameters come out near or the same as for the control water, and the toxin is assumed to be fully removed when the pH of the DH powder wash water approaches that of the control water. For the control water the pH is about 5.3, while the water from this experimental process is 6.0; before running the machine, the pH of the contaminated water is about 3.8, which is too acidic. This automated machine saves time in removing the toxicity from DH compared with the traditional method, while requiring less observation by the user.
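The controller's fill/wash/drain cycle with a pH-based stop rule can be sketched as a small simulation; the exponential toxin-removal model, the 0.2 pH tolerance and the per-cycle timing are assumptions for the sketch, not measurements from the machine.

```python
# Illustrative simulation of the wash-cycle controller described above:
# fill tank -> wash 10 min -> drain, repeated over a 7 h run, stopping early
# once the wash-water pH gets close to the control-water pH. The pH model
# and tolerance are assumed values, not data from the actual machine.
def run_wash_controller(ph_start=3.8, ph_control=6.0, removal_per_cycle=0.2,
                        cycle_minutes=10, total_minutes=7 * 60, tol=0.2):
    ph = ph_start
    elapsed = 0
    cycles = 0
    while elapsed + cycle_minutes <= total_minutes:
        # fill: inlet solenoid valve opens until the ultrasonic sensor
        # reports the desired level; wash: pump creates turbulence for
        # cycle_minutes; drain: outlet solenoid valve releases the
        # toxin-contaminated water.
        ph += removal_per_cycle * (ph_control - ph)   # assumed removal model
        elapsed += cycle_minutes
        cycles += 1
        if abs(ph_control - ph) <= tol:               # close enough to control water
            break
    return ph, cycles, elapsed

ph_final, cycles, elapsed = run_wash_controller()
```

With these assumed parameters the pH gap shrinks by 20% per cycle, so the controller stops well before the 7 h limit.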

  20. Desempenho de um regulador automático de vazão para canais de irrigação Performance of an automatic discharge regulator for irrigation channels

    Directory of Open Access Journals (Sweden)

    Luís G. H. do Amaral

    2010-12-01

    Full Text Available The control structures commonly used in the water intakes of irrigation channels do not allow the correct amount of water to be distributed, favoring waste and consequently reducing water use efficiency. The objective of this work was to determine the performance of an automatic discharge regulator in controlling the diverted discharge. For this purpose, a unit of the equipment, built in fiberglass, was installed on the side of a concrete channel at the Hydraulics Laboratory of the Federal University of Viçosa, in Viçosa, state of Minas Gerais, Brazil. The equipment was evaluated over its entire operating range; at each preset regulation, the diverted discharge was determined with the upstream water level varying from 0.30 to 0.45 m. The mean variation in the regulator discharge, considering its entire operating range, was ± 2.3% relative to the mean discharges supplied by the equipment at each regulation. This amplitude of variation in the supplied discharge is small compared with the equipment usually employed for discharge control in irrigation channels, demonstrating that the automatic discharge regulator is an appropriate device for water distribution in channel networks.

  1. Classifying visemes for automatic lipreading

    NARCIS (Netherlands)

    Visser, Michiel; Poel, Mannes; Nijholt, Antinus; Matousek, Vaclav; Mautner, Pavel; Ocelikovi, Jana; Sojka, Petr

    1999-01-01

    Automatic lipreading is automatic speech recognition that uses only visual information. The relevant data in a video signal is isolated and features are extracted from it. From a sequence of feature vectors, where every vector represents one video image, a sequence of higher level semantic elements

  2. On automatic visual inspection of reflective surfaces

    DEFF Research Database (Denmark)

    Kulmann, Lionel

    1995-01-01

    lighting methods in a framework, generally usable for inspecting reflective surfaces. Special attention has been given to the design of illumination techniques to enhance defects of highly reflective aluminum sheets. The chosen optical system setup has been used to enhance surface defects of other reflective......This thesis describes different methods to perform automatic visual inspection of reflective manufactured products, with the aim of increasing productivity, reducing cost and improving the quality level of the production. We investigate two different systems performing automatic visual inspection....... The first is the inspection of highly reflective aluminum sheets, used by the Danish company Bang & Olufsen, as a part of the exterior design and general appearance of their audio and video products. The second is the inspection of IBM hard disk read/write heads for defects during manufacturing. We have

  3. Automatic Algorithm Selection for Complex Simulation Problems

    CERN Document Server

    Ewald, Roland

    2012-01-01

    To select the most suitable simulation algorithm for a given task is often difficult. This is due to intricate interactions between model features, implementation details, and runtime environment, which may strongly affect the overall performance. An automated selection of simulation algorithms supports users in setting up simulation experiments without demanding expert knowledge on simulation. Roland Ewald analyzes and discusses existing approaches to solve the algorithm selection problem in the context of simulation. He introduces a framework for automatic simulation algorithm selection and
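One simple form of automated algorithm selection is purely empirical: time each candidate algorithm on a few trial runs of the model, then commit to the fastest for the full experiment. The sketch below uses two stand-in "simulators", not the framework's actual simulator portfolio.

```python
# Illustrative empirical algorithm selection: benchmark each candidate on
# trial runs, then pick the one with the lowest average runtime. The
# candidate "algorithms" are stand-ins with different complexity.
import time

def run_trials(algorithms, trial_input, trials=3):
    """Average wall-clock runtime per algorithm over a few trial runs."""
    scores = {}
    for name, algo in algorithms.items():
        start = time.perf_counter()
        for _ in range(trials):
            algo(trial_input)
        scores[name] = (time.perf_counter() - start) / trials
    return scores

def select_algorithm(algorithms, trial_input):
    scores = run_trials(algorithms, trial_input)
    return min(scores, key=scores.get)

# Two stand-in "simulators" solving the same task with different cost.
algos = {
    "quadratic": lambda n: [i * j for i in range(n) for j in range(n)],
    "linear": lambda n: [i * 2 for i in range(n)],
}
best = select_algorithm(algos, 300)
```

Real selection frameworks go further, e.g. learning a mapping from model features to algorithm performance, but the trial-run portfolio above is the simplest baseline.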

  4. Automatic Multimodal Cognitive Load Measurement (AMCLM)

    Science.gov (United States)

    2011-06-01

    Final Project Report, Grant AOARD-10-4029: Automatic Multimodal Cognitive Load Measurement (AMCLM), June 2011, NICTA DSIM Team. ...past year. At the start of the project, we carried out a literature review on video-based physiological measures of cognitive load, focusing on... ...increasing, while F2 is decreasing, as cognitive load is increased. Classification results performed on the Stroop test database show that formant...

  5. COMMISSIONING AND DETECTOR PERFORMANCE GROUPS (DPG)

    CERN Multimedia

    A. Ryd and T. Camporesi

    2010-01-01

    Commissioning and Run Coordination activities After the successful conclusion of the LHC pilot run commissioning in 2009 activities at the experiment restarted only late in January due to the cooling and detector maintenance. As usual we got going with weekly exercises used to deploy, debug, and validate improvements in firmware and software. A debriefing workshop aimed at analyzing the operational aspects of the 2009 pilot run was held on Jan. 15, 2009, to define a list of improvements (and relative priorities) to be planned. In the last month, most of the objectives set in the debriefing workshop have been attained. The major achievements/improvements obtained are the following: - Consolidation of the firmware for both readout and trigger for ECAL - Software implementation of procedures for raising the bias voltage of the silicon tracker and pixel driven by LHC mode changes with automatic propagation of the state changes from the DCS to the DAQ. The improvements in the software and firmware allow suppress...

  6. Automatic exposure for xeromammography

    International Nuclear Information System (INIS)

    Aichinger, H.

    1977-01-01

    During mammography without intensifying screens, exposure measurements are carried out behind the film. It is, however, difficult to construct an absolutely shadow-free ionization chamber of adequate sensitivity working in the necessary range of 25 to 50 kV. Repeated attempts have been made to utilize the advantages of automatic exposure for xeromammography. In this case also the ionization chamber was placed behind the Xerox plate. Depending on tube filtration, object thickness and tube voltage, more than 80%, sometimes even 90%, of the radiation is absorbed by the Xerox plate. In particular, the characteristic Mo radiation of 17.4 keV and 19.6 keV is almost totally absorbed by the plate and cannot therefore be registered by the ionization chamber. This results in a considerable dependence of the exposure on kV and object thickness. The dependence on tube voltage and object thickness has been examined dosimetrically and spectroscopically with a Ge(Li) spectrometer. Finally, the successful use of a shadow-free chamber is described; this has been particularly adapted for xeromammography and is placed in front of the plate. (orig) [de

  7. Historical Review and Perspective on Automatic Journalizing

    OpenAIRE

    Kato, Masaki

    2017-01-01

    Contents: Introduction; 1. EDP Accounting and Automatic Journalizing; 2. Learning System of Automatic Journalizing; 3. Automatic Journalizing by Artificial Intelligence; 4. Direction of the Progress of the Accounting Information System

  8. Discrete Model Reference Adaptive Control System for Automatic Profiling Machine

    Directory of Open Access Journals (Sweden)

    Peng Song

    2012-01-01

    Full Text Available An automatic profiling machine is a motion system with a high degree of parameter variation and frequent transient processes, and it requires accurate real-time control. In this paper, the discrete model reference adaptive control system of an automatic profiling machine is discussed. Firstly, the model of the automatic profiling machine is presented according to the parameters of the DC motor. Then the design of the discrete model reference adaptive control is proposed, and the control rules are proven. The simulation results show that the adaptive control system has favorable dynamic performance.
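The idea of discrete model reference adaptive control can be sketched with the classic single-parameter MIT-rule example: a feedforward gain is adapted online so that a plant with unknown gain tracks a reference model. The plant and model numbers below are illustrative, not the profiling machine's dynamics.

```python
# Minimal discrete model-reference adaptive control sketch: the classic
# MIT-rule example of adapting one feedforward gain so an unknown-gain
# plant tracks a reference model. Numbers are illustrative, not the
# profiling machine's DC-motor model.
a, b = 0.5, 2.0        # plant y[k+1] = a*y[k] + b*u[k]; gain b unknown to the controller
bm = 1.0               # reference model ym[k+1] = a*ym[k] + bm*r[k]
gamma = 0.02           # adaptation gain
theta = 0.0            # adjustable feedforward gain; ideal value is bm/b = 0.5
y = ym = 0.0
errs = []
for k in range(300):
    r = 1.0                      # constant reference command
    u = theta * r                # feedforward control law
    y = a * y + b * u            # plant step
    ym = a * ym + bm * r         # reference model step
    e = y - ym                   # tracking error
    theta -= gamma * e * ym      # MIT rule: gradient step on e^2 / 2
    errs.append(abs(e))
```

With a small adaptation gain the tracking error decays and theta settles near its ideal value; larger gains can oscillate or diverge, which is why proven control rules matter in the paper's setting.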

  9. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity.This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  10. Automatic operation device for control rods

    International Nuclear Information System (INIS)

    Sekimizu, Koichi

    1984-01-01

    Purpose: To enable automatic operation of control rods based on reactor operation planning and, in particular, to decrease the operator's load upon start-up and shutdown of the reactor. Constitution: Operation plans, demands for automatic operation, break point setting values, power and reactor core flow rate changes, demands for operation interrupt, demands for restart, demands for forecasting and the like are input to an input device, and an overall judging device performs a long-term forecast as far as the break point with a long-term forecasting device based on the operation plans. Automatic reactor operation and the like are carried out based on the long-term forecast, and short-term forecasting of the change in reactor core status due to the control rod operation sequence is performed based on the control rod pattern and the operation plan. It is then judged whether operation of the intended control rod is possible based on the result of the short-term forecast. (Aizawa, K.)

  11. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for the weld lines of a reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it requires a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor vessel mockup and on a real reactor ready for Ulchine nuclear power plant unit 6 at Dusan Heavy Industry in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a considerable reduction of inspection time, performance enhancement, automatic management of inspection history, etc. From an economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be synthetically applied to the automation of similar systems in nuclear power plants

  12. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    Automatic bin-picking is an important industrial process that can lead to significant savings and can potentially keep production in countries with high labour cost rather than outsourcing it. The presented work allows to minimize cycle time as well as setup cost, which are essential factors in automatic bin-picking, and therefore leads to a wider applicability of bin-picking in industry. We achieve comparable results and show that our learning approach can improve system performance significantly.

  13. Inter Genre Similarity Modelling For Automatic Music Genre Classification

    OpenAIRE

    Bagci, Ulas; Erzin, Engin

    2009-01-01

    Music genre classification is an essential tool for music information retrieval systems, and it has found important applications in various media platforms. Two important problems in automatic music genre classification are feature extraction and classifier design. This paper investigates inter-genre similarity modelling (IGS) to improve the performance of automatic music genre classification. Inter-genre similarity information is extracted over the mis-classified feature population....

  14. Automatic Prosodic Segmentation by F0 Clustering Using Superpositional Modeling.

    OpenAIRE

    Nakai, Mitsuru; Harald, Singer; Sagisaka, Yoshinori; Shimodaira, Hiroshi

    1995-01-01

    In this paper, we propose an automatic method for detecting accent phrase boundaries in Japanese continuous speech by using F0 information. In the training phase, hand labeled accent patterns are parameterized according to a superpositional model proposed by Fujisaki, and assigned to some clusters by a clustering method, in which accent templates are calculated as centroid of each cluster. In the segmentation phase, automatic N-best extraction of boundaries is performe...

  15. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...

  16. Automatic tuning of free electron lasers

    International Nuclear Information System (INIS)

    Agapov, Ilya; Zagorodnov, Igor; Geloni, Gianluca; Tomin, Sergey

    2017-01-01

    Existing FEL facilities often suffer from stability issues: the electron orbit, transverse electron optics, electron bunch compression and other parameters have to be readjusted frequently to compensate for drifts in the performance of various components. The tuning procedures typically employed in operation are often manual and lengthy. We have been developing a combination of model-free and model-based automatic tuning methods to meet the needs of present and upcoming XFEL facilities. Our approach has been implemented at FLASH to achieve automatic SASE tuning using empirical control of the orbit, electron optics and bunch compression. In this paper we describe our approach to empirical tuning, the software which implements it, and the results of using it at FLASH. We also discuss the potential of machine learning and model-based techniques in tuning methods.
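    The model-free part of such empirical tuning can be sketched as a derivative-free optimizer that perturbs actuator settings one at a time and keeps only changes that improve a measured signal. The sketch below is a minimal illustration with an invented stand-in for the SASE pulse-energy readback; it is not the actual FLASH control interface.

```python
import random

def empirical_tune(measure, settings, step=0.1, iters=200, seed=1):
    """Greedy random-walk tuner: perturb one actuator at a time and
    keep the change only if the measured signal improves."""
    rng = random.Random(seed)
    x = list(settings)
    best = measure(x)
    for _ in range(iters):
        i = rng.randrange(len(x))
        delta = rng.uniform(-step, step)
        x[i] += delta
        val = measure(x)
        if val > best:
            best = val       # improvement: keep the new setting
        else:
            x[i] -= delta    # no improvement: revert the perturbation
    return x, best

# Invented stand-in for a SASE pulse-energy readback, peaked at (0.3, -0.2).
def fake_sase(x):
    return 1.0 - (x[0] - 0.3) ** 2 - (x[1] + 0.2) ** 2

tuned, energy = empirical_tune(fake_sase, [0.0, 0.0])
```

    Real machines add safeguards this sketch omits: readback averaging against shot-to-shot jitter, actuator limits, and rollback when the signal is lost entirely.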

  17. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulatory items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and reduced radiation exposure. The final objective of this project is to develop an automatic inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and a robot control computer which steers the robot to navigate exactly along the inspection path. We expect our system to contribute to reduced inspection time, performance enhancement, and effective management of inspection results. The system developed by this project can be used for practical inspection work after field tests. (author)

  18. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    Andriamampianina, Lala

    1983-01-01

    A system for the semi-automatic survey of drawings is presented. Its design is oriented toward reducing the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, with the pen of the plotter replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between consecutive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor makes it possible to identify starting points and end points of a line, for the purpose of automatically following connected lines in a drawing. The advantage of the described method is that precision depends practically only on the plotter performance; the sensor resolution matters only for the thickness of strokes and the distance between two strokes. (author) [fr

  19. Automatic stereoscopic system for person recognition

    Science.gov (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.

    1999-06-01

    A biometric access control system based on identification of the human face is presented. The system performs remote measurements of the necessary face features. Two different scenarios of system behavior are implemented. The first assumes verification of personal data entered by the visitor from a console using a keyboard or card reader; the system functions as an automatic checkpoint that strictly controls the access of different visitors. The other scenario makes it possible to identify visitors without any personal identifier or pass: only the person's biometrics are used to identify the visitor. The recognition system automatically finds the necessary identification information previously stored in the database. Two laboratory models of the recognition system were developed. The models are designed to use different information types and sources. In addition to stereoscopic images input to the computer from cameras, the models can use voice data and some physical characteristics, such as the person's height measured by the imaging system.

  20. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  1. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. For industrial use, however, a tool for fast data analysis is also required, and it should be easy to handle. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Experience shows that the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument covering the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications are discussed.
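    As an illustration of the genetic-algorithm route to spectrum fitting, the toy sketch below fits the centre and width of a single Lorentzian transmission dip to synthetic data. A real Moessbauer analysis involves multiplets, velocity calibration and phase identification, all of which are omitted here; the GA parameters are invented.

```python
import random

def lorentzian(v, centre, width, depth=0.5):
    """Transmission dip: 1 - depth * Lorentzian of HWHM `width` at `centre`."""
    return 1.0 - depth * width ** 2 / ((v - centre) ** 2 + width ** 2)

def sse(params, vs, ys):
    """Sum of squared residuals between model and measured spectrum."""
    c, w = params
    return sum((lorentzian(v, c, w) - y) ** 2 for v, y in zip(vs, ys))

def ga_fit(vs, ys, pop_size=40, generations=60, seed=2):
    """Tiny GA: elitist selection, averaging crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-4.0, 4.0), rng.uniform(0.05, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: sse(p, vs, ys))
        elite = pop[:pop_size // 4]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            c = (a[0] + b[0]) / 2 + rng.gauss(0.0, 0.1)
            w = max(0.02, (a[1] + b[1]) / 2 + rng.gauss(0.0, 0.05))
            children.append((c, w))
        pop = elite + children
    return min(pop, key=lambda p: sse(p, vs, ys))

# Synthetic noiseless spectrum: dip at +1.2 mm/s with HWHM 0.3 mm/s.
vs = [i * 0.05 - 4.0 for i in range(161)]
ys = [lorentzian(v, 1.2, 0.3) for v in vs]
centre, width = ga_fit(vs, ys)
```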

  2. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  3. Automatic control system for the pig ion source for the U-400 cyclotron

    International Nuclear Information System (INIS)

    Kutner, V.B.; Subbotin, V.G.; Sukhov, A.M.; Tret'yakov, Yu.P.; Fefilov, B.V.

    1989-01-01

    An automatic control system is described for the U-400 cyclotron multiply-charged ion source, based on CAMAC apparatus and microprocessor controllers. The system allows automatic tuning of the ion source to the necessary regime, including automatic start-up of the discharge, obtaining the necessary sputtering parameters, and automatic search for the maximum beam current within the given discharge parameters. The system tunes the ion source to a quasi-optimal regime in 10-15 minutes with up to 5% deviation from the preset parameters. It is possible to stabilize the beam current to within 3% using automatic correction of the discharge regime. 6 refs.; 4 figs

  4. Neural stability: A reflection of automaticity in reading.

    Science.gov (United States)

    Lam, Silvia Siu-Yin; White-Schwoch, Travis; Zecker, Steven G; Hornickel, Jane; Kraus, Nina

    2017-08-01

    Automaticity, the ability to perform a task rapidly with minimal effort, plays a key role in reading fluency and is indexed by rapid automatized naming (RAN) and processing speed. Yet little is known about automaticity's neurophysiologic underpinnings. The more efficiently sound is encoded, the more automatic sound processing can be. In turn, this automaticity could free up cognitive resources such as attention and working memory to help build an integrative reading network. Therefore, we hypothesized that automaticity and reading fluency correlate with stable neural representation of sounds, given a larger body of literature suggesting the close relationship between neural stability and the integrative function in the central auditory system. To test this hypothesis, we recorded the frequency-following responses (FFR) to speech syllables and administered cognitive and reading measures to school-aged children. We show that the stability of neural responses to speech correlates with RAN and processing speed, but not phonological awareness. Moreover, the link between neural stability and RAN mediates the previously-determined link between neural stability and reading ability. Children with a RAN deficit have especially unstable neural responses. Our neurophysiological approach illuminates a potential neural mechanism specific to RAN, which in turn indicates a relationship between synchronous neural firing in the auditory system and automaticity critical for reading fluency. Copyright © 2017 Elsevier Ltd. All rights reserved.
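    Stability of neural responses of this kind is commonly quantified by correlating averages of non-overlapping subsets of trials. The sketch below implements a generic inter-trial correlation metric of that form; it is an illustrative assumption, not the authors' exact FFR pipeline.

```python
import math
import random

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def response_stability(trials):
    """Correlate the average of even-indexed trials with the average of
    odd-indexed trials; values near 1 indicate a stable neural response."""
    n = len(trials[0])
    evens, odds = trials[0::2], trials[1::2]
    even_avg = [sum(t[i] for t in evens) / len(evens) for i in range(n)]
    odd_avg = [sum(t[i] for t in odds) / len(odds) for i in range(n)]
    return pearson(even_avg, odd_avg)

# Synthetic demo: identical trials are perfectly stable; added noise
# (as in an unstable response) lowers the metric.
rng = random.Random(0)
clean = [[math.sin(0.1 * i) for i in range(200)] for _ in range(10)]
noisy = [[s + rng.gauss(0.0, 1.0) for s in trial] for trial in clean]
```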

  5. Automatic Music Transcription

    Science.gov (United States)

    Klapuri, Anssi; Virtanen, Tuomas

    Written musical notation describes music in a symbolic form that is suitable for performing a piece using the available musical instruments. Traditionally, musical notation indicates the pitch, target instrument, timing, and duration of each sound to be played. The aim of music transcription either by humans or by a machine is to infer these musical parameters, given only the acoustic recording of a performance.

  6. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with ²³⁸Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher-energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  7. Automatic contact in DYNA3D for vehicle crashworthiness

    International Nuclear Information System (INIS)

    Whirley, R.G.; Engelmann, B.E.

    1994-01-01

    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit, nonlinear, finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. The authors have used a new four-step automatic contact algorithm. Key aspects of the proposed method include (1) automatic identification of adjacent and opposite surfaces in the global search phase, and (2) the use of a smoothly varying surface normal that allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. Three examples are given to illustrate the performance of the newly proposed algorithm in the public DYNA3D code.

  8. Next Generation Model 8800 Automatic TLD Reader

    International Nuclear Information System (INIS)

    Velbeck, K.J.; Streetz, K.L.; Rotunda, J.E.

    1999-01-01

    BICRON NE has developed an advanced version of the Model 8800 Automatic TLD Reader. Improvements in the reader include a Windows NT™-based operating system and a Pentium microprocessor for the host controller, a servo-controlled transport, a VGA display, mouse control, and modular assembly. This high-capacity reader will automatically read fourteen hundred TLD cards in one loading. Up to four elements in a card can be heated without mechanical contact, using hot nitrogen gas. Improvements in performance include an increased throughput rate and more precise card positioning. Operation is simplified through easy-to-read Windows-type screens. Glow curves are displayed graphically along with light intensity, temperature, and channel scaling. Maintenance and diagnostic aids are included for easier troubleshooting. A click of the mouse commands actions that are displayed in easy-to-understand English words. Available options include an internal ⁹⁰Sr irradiator, automatic TLD calibration, and two different extremity monitoring modes. Results from testing include reproducibility, reader stability, linearity, detection threshold, residue, primary power supply voltage and frequency, transient voltage, drop testing, and light leakage. (author)

  9. Fuzzy Logic Based Automatic Door Control System

    Directory of Open Access Journals (Sweden)

    Harun SUMBUL

    2017-12-01

    Full Text Available In this paper, a fuzzy logic based automatic door control system is designed to provide heat energy savings. Heat energy loss usually occurs where automatic doors are used. The input states of the designed fuzzy logic system (WS: Walking Speed and DD: Distance to Door) and the output state (DOS: Door Opening Speed) are determined. Based on these, a rule base of 25 rules is created; the rules are processed by fuzzy logic and applied to the control of an automatic door. An interface program is prepared using the Matlab Graphical User Interface (GUI) programming environment, and sample results are checked in Matlab using the fuzzy logic toolbox. The designed fuzzy logic controller is tested for different speed cases and the results are plotted. As a result, very good results were obtained in controlling an automatic door with fuzzy logic. The analyses indicate that control performed with fuzzy logic provides heat energy savings, less heat energy loss, and reliable, consistent control that is feasible to implement in real systems.
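    A controller of the described kind can be sketched with triangular membership functions and weighted-average defuzzification. The membership ranges and the compact 9-rule table below are invented for illustration; the paper's 25-rule base and its exact parameters are not reproduced.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Invented membership parameters: WS in m/s, DD in m, DOS in percent.
WS_SETS = {"slow": (-1, 0, 1), "medium": (0, 1, 2), "fast": (1, 2, 3)}
DD_SETS = {"near": (-1, 0, 2), "mid": (0, 2, 4), "far": (2, 4, 6)}
DOS_OUT = {"low": 20.0, "medium": 55.0, "high": 90.0}  # singleton outputs

# Faster walkers and shorter distances call for faster door opening.
RULES = {
    ("slow", "near"): "medium", ("slow", "mid"): "low",      ("slow", "far"): "low",
    ("medium", "near"): "high", ("medium", "mid"): "medium", ("medium", "far"): "low",
    ("fast", "near"): "high",   ("fast", "mid"): "high",     ("fast", "far"): "medium",
}

def door_opening_speed(ws, dd):
    """Min-AND rule firing with weighted-average defuzzification."""
    num = den = 0.0
    for (ws_lbl, dd_lbl), out_lbl in RULES.items():
        w = min(tri(ws, *WS_SETS[ws_lbl]), tri(dd, *DD_SETS[dd_lbl]))
        num += w * DOS_OUT[out_lbl]
        den += w
    return num / den if den else 0.0
```

    With these invented sets, a fast walker at the door gets the full opening speed, a slow walker far away gets the minimum, and intermediate cases interpolate between rules.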

  10. Improving suspended sediment measurements by automatic samplers.

    Science.gov (United States)

    Gettel, Melissa; Gulliver, John S; Kayhanian, Masoud; DeGroot, Gregory; Brand, Joshua; Mohseni, Omid; Erickson, Andrew J

    2011-10-01

    Suspended solids, measured either as total suspended solids (TSS) or as suspended sediment concentration (SSC), constitute an integral particulate water quality parameter that is important in assessing particle-bound contaminants. At present, nearly all stormwater runoff quality monitoring is performed with automatic samplers, in which the sampling intake is typically installed at the bottom of a storm sewer or channel. This method of sampling often results in a less accurate measurement of suspended sediment and associated pollutants due to the vertical variation in particle concentration caused by particle settling. In this study, the inaccuracies associated with sampling by conventional intakes for automatic samplers were verified by testing with known suspended sediment concentrations and known particle sizes ranging from approximately 20 μm to 355 μm under various flow rates. Experimental results show that, for samples collected at a typical automatic sampler intake position, the ratio of sampled to feed suspended sediment concentration is up to 6600% without an intake strainer and up to 300% with a strainer. When the sampling intake is modified with multiple sampling tubes and fitted with a wing to provide lift (winged arm sampler intake), the accuracy of sampling improves substantially. With this modification, the differences between sampled and feed suspended sediment concentrations were more consistent, and the sampled-to-feed concentration ratio was accurate to within 10% for particle sizes up to 250 μm.

  11. Automatic evidence retrieval for systematic reviews.

    Science.gov (United States)

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy

    2014-10-01

    Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate an automatic method for citation snowballing's capacity to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
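    The reported precision, recall and F1 figures are tied together by the standard harmonic-mean formula, which can be checked directly against the numbers in the abstract:

```python
def f1(precision, recall):
    """F1 score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Identification, as a proportion of included citations:
print(round(f1(0.977, 0.667), 3))  # → 0.793
# Identification, as a proportion of citations present in MAS:
print(round(f1(0.977, 0.855), 3))  # → 0.912
# Full-text/abstract retrieval for the correctly identified citations:
print(round(f1(0.921, 0.829), 3))  # → 0.873
```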

  12. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, yet the task is quite challenging in US images due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application-dependent, while manual tracing methods are extremely time consuming and subject to a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmenting 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy of live-wire segmentation.
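    Live wire treats boundary extraction as a minimum-cost path problem between user seed points over a cost image in which strong edges are cheap. The sketch below runs Dijkstra on a toy 4-connected cost grid; the paper's enhancement step and its particular cost terms are not reproduced here.

```python
import heapq

def live_wire(cost, start, goal):
    """Dijkstra over a 2D cost grid (4-connectivity); returns the minimum
    path cost, where cost[r][c] is the price of stepping onto pixel (r, c)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Toy cost map: a cheap "edge" corridor along the top row and right column,
# mimicking a low-cost lesion boundary surrounded by expensive tissue.
grid = [[1, 1, 1, 1],
        [9, 9, 9, 1],
        [9, 9, 9, 1]]
```

    On this grid the cheapest path from (0, 0) to (2, 3) follows the corridor (total cost 6) instead of cutting through the high-cost interior, which is exactly the behavior live wire exploits to snap to lesion boundaries.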

  13. Automatic Recognition of Road Signs

    Science.gov (United States)

    Inoue, Yasuo; Kohashi, Yuuichirou; Ishikawa, Naoto; Nakajima, Masato

    2002-11-01

    The increase in traffic accidents is becoming a serious social problem with the recent rapid growth in traffic. In many cases the driver's carelessness is the primary factor in traffic accidents, and driver assistance systems are in demand to support driver safety. In this research, we propose a new method for automatic detection and recognition of road signs by image processing. The purpose of this research is to prevent accidents caused by the driver's carelessness and to call attention to the driver when a traffic regulation is violated. High accuracy and efficient sign detection are achieved by removing unnecessary information other than road signs from the image and detecting road signs using shape features. First, color information that is not used in road signs is removed from the image. Next, edges other than circular and triangular ones are removed to select sign shapes. In the recognition process, a normalized cross-correlation operation is carried out on the two-dimensional differentiation pattern of a sign, providing an accurate and efficient method for recognizing the road sign. Moreover, real-time operation in software was achieved by holding down the calculation cost while maintaining highly precise sign detection and recognition; specifically, processing takes 0.1 s/frame on a general-purpose PC (CPU: Pentium 4, 1.7 GHz). In-vehicle experiments confirmed that the system runs in real time and that detection and recognition of signs are performed correctly.
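    The normalized cross-correlation at the heart of the recognition step can be written compactly in NumPy. This is the textbook zero-mean NCC with a brute-force sliding search, shown for illustration rather than taken from the authors' implementation.

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalized cross-correlation; 1.0 means a perfect match."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def best_match(image, template):
    """Brute-force sliding search for the offset with the highest NCC."""
    th, tw = template.shape
    h, w = image.shape
    scores = {(r, c): ncc(image[r:r + th, c:c + tw], template)
              for r in range(h - th + 1) for c in range(w - tw + 1)}
    return max(scores, key=scores.get)

# Demo: plant a 3x3 "sign" template at offset (2, 3) in a random image.
rng = np.random.default_rng(0)
image = rng.random((8, 8))
template = image[2:5, 3:6].copy()
```

    Because NCC subtracts the mean and normalizes by the energy of both patches, the score is invariant to uniform brightness and contrast changes, which is what makes it robust for matching signs under varying illumination.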

  14. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes, and provides trouble-shooting aids for, the Automatic Sample Changer electronics of the automatic beta counting system developed by Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier and then coupled to an amplifier. The amplifier output is discriminated and becomes the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data and other information are punched onto a data card. The next sample to be counted is then automatically selected. The beta counter uses the same electronics for each count; the only differences are the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and punching the needed data onto an 80-column data card

  15. Automatic Segmentation of Vessels in In-Vivo Ultrasound Scans

    DEFF Research Database (Denmark)

    Tamimi-Sarnikowski, Philip; Brink-Kjær, Andreas; Moshavegh, Ramin

    2017-01-01

    Ultrasound has become highly popular for monitoring atherosclerosis by scanning the carotid artery. The screening involves measuring the thickness of the vessel wall and the diameter of the lumen. An automatic segmentation of the vessel lumen can enable the determination of lumen diameter. This paper presents a fully automatic segmentation algorithm for robustly segmenting the vessel lumen in longitudinal B-mode ultrasound images. The automatic segmentation is performed using a combination of B-mode and power Doppler images. The proposed algorithm includes a series of preprocessing steps, and was evaluated empirically on a dataset of 1770 in-vivo images recorded from 8 healthy subjects. The segmentation results were compared to manual delineation performed by two experienced users. The results showed a sensitivity and specificity of 90.41 ± 11.2 % and 97.93 ± 5.7 % (mean ± standard deviation).

  16. The automatic electromagnetic field generating system

    Science.gov (United States)

    Audone, B.; Gerbi, G.

    1982-07-01

    The technical study and the design approaches adopted for the definition of the automatic electromagnetic field generating system (AEFGS) dedicated to EMC susceptibility testing are presented. The AEFGS covers the frequency range 10 kHz to 40 GHz and operates successfully in the two EMC shielded chambers at ESTEC. The performance of the generator/amplifier subsystems, antenna selection, field amplitude and susceptibility feedback, and monitoring systems is described. System control modes which guarantee full AEFGS operability under different test conditions are discussed. Advantages of automating susceptibility testing include increased measurement accuracy and reduced testing cost.

  17. Automatic Angular alignment of LHC Collimators

    CERN Document Server

    Azzopardi, Gabriella; Salvachua Ferrando, Belen Maria; Mereghetti, Alessio; Bruce, Roderik; Redaelli, Stefano; CERN. Geneva. ATS Department

    2017-01-01

    The LHC is equipped with a complex collimation system to protect sensitive equipment from unavoidable beam losses. Collimators are positioned close to the beam using an alignment procedure. Until now they have always been aligned assuming no tilt between the collimator and the beam; however, tank misalignments or beam envelope angles at large-divergence locations could introduce a tilt that limits the collimation performance. Three different algorithms were implemented to automatically align a chosen collimator at various angles. The implementation was tested on a number of collimators during this MD, and no human intervention was required.

  18. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple trials of cross-validation. Second, we test the robustness of each system to spectral equalization. Finally, we test how well human subjects recognize the genres of music excerpts composed by each system to be highly genre representative. Our results suggest that neither high-performing system has a capacity to recognize music genre.

  19. Automatic blood detection in capsule endoscopy video

    Science.gov (United States)

    Novozámský, Adam; Flusser, Jan; Tachecí, Ilja; Sulík, Lukáš; Bureš, Jan; Krejcar, Ondřej

    2016-12-01

    We propose two automatic methods for detecting bleeding in wireless capsule endoscopy videos of the small intestine. The first uses solely the color information, whereas the second incorporates assumptions about the shape and size of blood spots. The main original contribution is the definition of a new color space that provides good separability of blood pixels from the intestinal wall. The methods can be applied either individually, or their results can be fused for the final decision. We evaluate their individual performance and various fusion rules on real data, manually annotated by an endoscopist.
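    What "good separability" of a color space means can be illustrated with a Fisher criterion on synthetic pixel clusters: a projection axis is good when the class means are far apart relative to the within-class spread. The cluster means and axes below are invented for illustration; the paper's actual transform is not reproduced.

```python
import numpy as np

def fisher_score(x0, x1):
    """1D Fisher criterion: squared mean separation over summed variances."""
    return (x0.mean() - x1.mean()) ** 2 / (x0.var() + x1.var() + 1e-12)

def project(rgb, axis):
    """Project RGB samples onto a unit-normalized color axis."""
    axis = np.asarray(axis, dtype=float)
    return rgb @ (axis / np.linalg.norm(axis))

rng = np.random.default_rng(0)
# Invented pixel clusters: blood is red-dominant, wall is paler and pinkish.
blood = rng.normal([150.0, 40.0, 40.0], 12.0, size=(500, 3))
wall = rng.normal([110.0, 80.0, 75.0], 12.0, size=(500, 3))

# A red-minus-green axis separates these classes far better than raw intensity.
rg_score = fisher_score(project(blood, [1, -1, 0]), project(wall, [1, -1, 0]))
gray_score = fisher_score(project(blood, [1, 1, 1]), project(wall, [1, 1, 1]))
```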

  20. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip of the scaler. A counter integrated in the micro-controller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed for applications needing low-cost, low-power-consumption solutions. The automatic scaler has already been applied in a surface contamination instrument. (authors)

  1. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  2. Traduction automatique et terminologie automatique (Automatic Translation and Automatic Terminology

    Science.gov (United States)

    Dansereau, Jules

    1978-01-01

    An exposition of reasons why a system of automatic translation could not use a terminology bank except as a source of information. The fundamental difference between the two tools is explained and examples of translation and mistranslation are given as evidence of the limits and possibilities of each process. (Text is in French.) (AMH)

  3. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality-management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system (ACS) for concrete mixture homogeneity during the mixing of components are proposed, in which the kneading-and-mixing machinery operates under operational automatic control of homogeneity. The theoretical underpinnings of homogeneity control are presented; they relate homogeneity to changes in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the ACS for regulating the water supply is determined as a function of changes in concrete mixture homogeneity during continuous mixing. The following technical means were chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of the automatic control, a block diagram with transfer functions describing the ACS operation in the transient dynamic mode is provided.
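    The water-supply regulation described above amounts to a feedback loop from a homogeneity proxy (the vibration frequency of the mixer body) to the water feed. The abstract does not give the actual control law, so the sketch below uses a purely illustrative proportional correction with invented gains and units:

```python
def water_feed_controller(freq_setpoint, kp=0.8):
    """Minimal feedback loop: trim the water feed rate from the drift of
    the mixer body's vibration frequency, used here as a stand-in for the
    homogeneity indicator. Gain and units are illustrative assumptions."""
    def control(measured_freq, current_feed):
        error = freq_setpoint - measured_freq   # Hz below setpoint -> add water
        return max(0.0, current_feed + kp * error)
    return control

control = water_feed_controller(freq_setpoint=50.0)
feed = 10.0
for measured in (48.0, 49.0, 50.0):   # frequency approaching the setpoint
    feed = control(measured, feed)
```

As the measured frequency converges to the setpoint, the corrections shrink and the feed rate settles.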

  4. Do judgments of learning predict automatic influences of memory?

    Science.gov (United States)

    Undorf, Monika; Böhm, Simon; Cüpper, Lutz

    2016-06-01

    Current memory theories generally assume that memory performance reflects both recollection and automatic influences of memory. Research on people's predictions about the likelihood of remembering recently studied information on a memory test, that is, on judgments of learning (JOLs), suggests that both magnitude and resolution of JOLs are linked to recollection. However, it has remained unresolved whether JOLs are also predictive of automatic influences of memory. This issue was addressed in 3 experiments. Using the process-dissociation procedure, we assessed the predictive accuracy of immediate and delayed JOLs (Experiment 1) and of immediate JOLs from a first and from a second study-test cycle (Experiments 2 and 3) for recollection and automatic influences. Results showed that each type of JOLs was predictive of both recollection and automatic influences. Moreover, we found that a delay between study and JOL improved the predictive accuracy of JOLs for recollection, while study-test experience improved the predictive accuracy of JOLs for both recollection and automatic influences. These findings demonstrate that JOLs predict not only recollection, but also automatic influences of memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. The irace package: Iterated racing for automatic algorithm configuration

    Directory of Open Access Journals (Sweden)

    Manuel López-Ibáñez

    2016-01-01

    Full Text Available Modern optimization algorithms typically require the setting of a large number of parameters to optimize their performance. The immediate goal of automatic algorithm configuration is to find, automatically, the best parameter settings of an optimizer. Ultimately, automatic algorithm configuration has the potential to lead to new design paradigms for optimization software. The irace package is a software package that implements a number of automatic configuration procedures. In particular, it offers iterated racing procedures, which have been used successfully to automatically configure various state-of-the-art algorithms. The iterated racing procedures implemented in irace include the iterated F-race algorithm and several extensions and improvements over it. In this paper, we describe the rationale underlying the iterated racing procedures and introduce a number of recent extensions. Among these, we introduce a restart mechanism to avoid premature convergence, the use of truncated sampling distributions to correctly handle parameter bounds, and an elitist racing procedure for ensuring that the best configurations returned are also those evaluated on the highest number of training instances. We experimentally evaluate the most recent version of irace and demonstrate with a number of example applications the use and potential of irace, in particular, and automatic algorithm configuration, in general.
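    The core racing idea, stripped of irace's statistical tests, elitism, and sampling machinery, can be sketched as follows; the fixed elimination margin stands in for the Friedman/t-tests used in the real package:

```python
import random
import statistics

def race(configs, instances, evaluate, margin=0.05):
    """One race: evaluate surviving configurations instance by instance,
    discarding any whose running mean cost trails the current best by more
    than `margin`. (Real irace eliminates via statistical tests instead.)"""
    costs = {c: [] for c in configs}
    alive = list(configs)
    for inst in instances:
        for c in alive:
            costs[c].append(evaluate(c, inst))
        best = min(statistics.mean(costs[c]) for c in alive)
        alive = [c for c in alive if statistics.mean(costs[c]) <= best + margin]
    return alive

# toy tuning problem: one parameter c, cost = |c - 0.3| plus instance noise
random.seed(0)
instances = list(range(10))
def evaluate(c, inst):
    return abs(c - 0.3) + random.gauss(0, 0.01)

survivors = race([0.0, 0.3, 0.9], instances, evaluate)
```

Iterated racing would then resample new candidate configurations around the survivors and race again, concentrating the search over successive iterations.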

  6. A consideration of the operation of automatic production machines.

    Science.gov (United States)

    Hoshi, Toshiro; Sugimoto, Noboru

    2015-01-01

    At worksites, various automatic production machines are in use to release workers from muscular labor or labor in the detrimental environment. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention, and points out two types of machine operation - operation for which quick performance is required (operation that is not permitted to be delayed) - and operation for which composed performance is required (operation that is not permitted to be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics are evaluated as "asymmetric on the time-axis". Here, in order for workers to accept the risk of automatic production machines, it is preconditioned in general that harm should be sufficiently small or avoidance of harm is easy. In this connection, this paper shows the possibility of facilitating the acceptance of the risk of automatic production machines by enhancing the asymmetric on the time-axis.

  7. Quality Assessment of Compressed Video for Automatic License Plate Recognition

    DEFF Research Database (Denmark)

    Ukhanova, Ann; Støttrup-Andersen, Jesper; Forchhammer, Søren

    2014-01-01

    Definition of video quality requirements for video surveillance poses new questions in the area of quality assessment. This paper presents a quality assessment experiment for an automatic license plate recognition scenario. We explore the influence of the compression by H.264/AVC and H.265/HEVC...... standards on the recognition performance. We compare logarithmic and logistic functions for quality modeling. Our results show that a logistic function can better describe the dependence of recognition performance on the quality for both compression standards. We observe that automatic license plate...
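    A logistic quality model of the kind compared in the paper can be fitted with a simple logit linearization when all observed rates lie strictly between 0 and 1; the data below are synthetic and the parameterization is an assumption:

```python
import numpy as np

def fit_logistic(x, y):
    """Fit y = 1 / (1 + exp(-(a*x + b))) by linearizing with the logit
    transform; valid when every y is strictly between 0 and 1."""
    z = np.log(y / (1.0 - y))       # logit(y) = a*x + b
    a, b = np.polyfit(x, z, 1)      # ordinary least squares on the line
    return a, b

def logistic(x, a, b):
    return 1.0 / (1.0 + np.exp(-(a * np.asarray(x) + b)))

# hypothetical data: plate-recognition rate vs. an encoder quality setting
quality = np.array([20.0, 30.0, 40.0, 50.0])
rate = logistic(quality, 0.2, -7.0)   # synthetic ground truth, no noise
a, b = fit_logistic(quality, rate)
```

On noise-free synthetic data the fit recovers the generating parameters exactly, which makes it a convenient sanity check before fitting measured recognition rates.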

  8. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
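    The threat and the fix can be illustrated with Python's sqlite3 module, whose parameterized queries are analogous to SQL PREPARE statements; the paper itself targets legacy web-application code, so this is only a minimal analogue:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# UNSAFE: string concatenation lets crafted input rewrite the query logic.
attacker = "' OR '1'='1"
unsafe_sql = "SELECT role FROM users WHERE name = '" + attacker + "'"
leaked = conn.execute(unsafe_sql).fetchall()   # tautology returns every row

# SAFE: a parameterized (prepared-style) query treats input as data only.
safe = conn.execute("SELECT role FROM users WHERE name = ?",
                    (attacker,)).fetchall()    # literal string matches nothing
```

The transformation the paper automates is essentially the rewrite from the first form to the second across an entire legacy codebase.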

  9. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The used strategy is a combination of two methods: the maximum correlation coefficient and the correlation in the subpixel range...... interactive software is also part of a computer-assisted learning program on digital photogrammetry....

  10. Automatic Error Analysis Using Intervals

    Science.gov (United States)

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
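    The interval approach can be sketched with a minimal interval type; INTLAB provides a far more complete (and rigorously rounded) version of the same idea:

```python
class Interval:
    """Closed interval [lo, hi] with arithmetic that propagates bounds."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # the product's bounds are the extremes of the endpoint products
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def width(self):
        return self.hi - self.lo

# area of a plate measured as (2.0 +/- 0.1) by (3.0 +/- 0.1)
a = Interval(1.9, 2.1)
b = Interval(2.9, 3.1)
area = a * b
```

The resulting interval [5.51, 6.51] bounds the true area with no derivative bookkeeping, which is the effort saving the abstract refers to for complicated formulas.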

  11. Automatic target identification using neural networks

    Science.gov (United States)

    Abdallah, Mahmoud A.; Samu, Tayib I.; Grissom, William A.

    1995-10-01

    Neural network theories are applied to attain human-like performance in areas such as speech recognition, statistical mapping, and target recognition or identification. In target identification, one of the difficult tasks has been the extraction of features to be used to train the neural network that is subsequently used for the target's identification. The purpose of this paper is to describe the development of an automatic target identification system using features extracted from a specific class of targets. The extracted features were the graphical representations of the silhouettes of the targets. Image processing techniques and some Fast Fourier Transform (FFT) properties were implemented to extract the features. The FFT eliminates variations in the extracted features due to rotation or scaling. A neural network was trained with the extracted features using the Learning Vector Quantization paradigm. An identification system was set up to test the algorithm. The image processing software was interfaced with the MATLAB Neural Network Toolbox via a computer program written in the C language to automate the target identification process. The system performed well, as it classified the objects used to train it irrespective of rotation, scaling, and translation. This automatic target identification system had a classification success rate of about 95%.
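    The translation-invariance part of the FFT property is easy to demonstrate: the magnitude spectrum is unchanged by circular shifts of the image. (Rotation and scale invariance generally require an additional step such as log-polar resampling, which is not shown here and is not detailed in the abstract.)

```python
import numpy as np

def fft_features(img):
    """Magnitude spectrum of a 2-D silhouette image. The magnitude is
    invariant to circular shifts; phase, which carries the position
    information, is discarded."""
    return np.abs(np.fft.fft2(img))

rng = np.random.default_rng(0)
silhouette = rng.random((16, 16))              # stand-in for a binary silhouette
shifted = np.roll(silhouette, shift=(3, 5), axis=(0, 1))

same = np.allclose(fft_features(silhouette), fft_features(shifted))
```

Because the features are identical for the original and the shifted image, a classifier trained on them cannot be confused by target translation.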

  12. Self-Compassion and Automatic Thoughts

    Science.gov (United States)

    Akin, Ahmet

    2012-01-01

    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  13. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, also when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file

  14. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  15. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provides dose-response curves that become linear under the logistic transformation (Rodbard). This transformation, applicable to radioimmunoassay, is well suited to computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was written by Dr. Kawamata. This approach proved useful for automatic computation of data derived from double antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
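    A common concrete form of the Rodbard logistic model is the four-parameter logistic (4PL) standard curve, which is inverted to read unknown doses from measured counts. The parameter values below are hypothetical, not taken from the paper:

```python
def four_pl(x, a, b, c, d):
    """Four-parameter logistic: response at dose x.
    a = response at zero dose, d = response at infinite dose,
    c = mid-range dose (ED50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Dose that produces response y on the standard curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# hypothetical insulin standard-curve parameters: counts vs. dose (uU/mL)
a, b, c, d = 10000.0, 1.2, 15.0, 500.0
counts = four_pl(40.0, a, b, c, d)             # simulate a measured sample
dose = inverse_four_pl(counts, a, b, c, d)     # read its dose off the curve
```

In an assay run, the four parameters would first be fitted to the standards, after which `inverse_four_pl` converts each sample's counts into a concentration.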

  16. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing, and information extraction in the state-of-the art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering, and advanced signal processing techniques and scientific evaluation methodologies being used in this multi disciplinary field will be part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  17. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  18. Towards Automatic Classification of Wikipedia Content

    Science.gov (United States)

    Szymański, Julian

    Wikipedia - the Free Encyclopedia encounters the problem of proper classification of new articles every day. The process of assigning articles to categories is performed manually and is time consuming. It requires knowledge of the Wikipedia structure that is beyond typical editor competence, which leads to human mistakes: omitted or incorrect assignments of articles to categories. The article presents the application of an SVM classifier for automatic classification of documents from The Free Encyclopedia. The classifier was tested with two text representations: inter-document connections (hyperlinks) and word content. The results of the experiments, evaluated on hand-crafted data, show that the Wikipedia classification process can be partially automated. The proposed approach can be used to build a decision support system which suggests to editors the best categories that fit new content entered into Wikipedia.
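    As a stand-in for the SVM used in the paper, a tiny linear SVM trained with the Pegasos subgradient method on toy word-count features illustrates the classification step; the real system used hyperlink and word-content representations of actual Wikipedia articles:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Tiny linear SVM trained with the Pegasos subgradient method.
    Labels y must be in {-1, +1}. A toy stand-in for the paper's SVM."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)                  # decaying step size
            if y[i] * (X[i] @ w) < 1:              # margin violated
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:
                w = (1 - eta * lam) * w
    return w

# toy "articles": feature 0 = science-word count, feature 1 = sports-word count
X = np.array([[3.0, 0.0], [4.0, 1.0], [0.0, 3.0], [1.0, 4.0]])
y = np.array([1, 1, -1, -1])     # +1 = science category, -1 = sports
w = train_linear_svm(X, y)
pred = np.sign(X @ w)
```

On this separable toy data the learned hyperplane recovers the category of every training article, mirroring the binary building block that multi-category schemes compose.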

  19. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  20. Automatic controller at associated memory

    International Nuclear Information System (INIS)

    Courty, P.

    1977-06-01

    Organized around an A2-type controller, this CAMAC device allows the associated computer to command the reading of 64K 16-bit words into an external memory. This memory is fully controlled by the computer. In the automatic mode, which operates at 10^6 words/sec, the computer can access any other module of the same crate by cycle stealing. [fr]

  1. Automatic Guidance for Remote Manipulator

    Science.gov (United States)

    Johnston, A. R.

    1986-01-01

    Position sensor and mirror guide manipulator toward object. Grasping becomes automatic when sensor begins to receive signal from reflector on object to be manipulated. Light-emitting diodes on manipulator produce light signals for reflector, which is composite of plane and corner reflectors. Proposed scheme especially useful when manipulator arm tends to flex or when object is moving. Sensor and microprocessor designed to compensate for manipulator-arm oscillation.

  2. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of the Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; and ALGOL 68. Other chapters discuss a general-purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming languages is also presented.

  3. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    The software produces cutter-location files for numerically controlled machine tools. APT, an acronym for Automatically Programed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programming NC machine tools. The APT system includes the specification of the APT programming language and the language processor, which executes APT statements and generates the NC machine-tool motions they specify.

  4. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
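    In miniature, the netlist-to-transfer-function idea reduces to assembling nodal equations from the element values and solving them per frequency. The series-R, shunt-C divider below is an illustrative special case, not the patent's general pipeline:

```python
import numpy as np

def rc_lowpass_H(R, C, freqs_hz):
    """Frequency response of a series-R, shunt-C voltage divider, built by
    nodal analysis at each frequency: a one-node stand-in for deriving a
    transfer function from a netlist of element values."""
    out = []
    for f in freqs_hz:
        s = 2j * np.pi * f
        # single unknown node voltage V: (1/R + s*C) * V = Vin / R, Vin = 1
        Y = 1.0 / R + s * C           # nodal admittance
        out.append((1.0 / R) / Y)     # V = transfer function H(s)
    return np.array(out)

H = rc_lowpass_H(R=1e3, C=1e-6, freqs_hz=[0.0, 1e6])
```

At DC the response is unity and far above the 1/(2*pi*R*C) corner it is strongly attenuated, matching the expected low-pass transfer function H(s) = 1/(1 + sRC).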

  5. Automatic segmentation of clinical texts.

    Science.gov (United States)

    Apostolova, Emilia; Channin, David S; Demner-Fushman, Dina; Furst, Jacob; Lytinen, Steven; Raicu, Daniela

    2009-01-01

    Clinical narratives, such as radiology and pathology reports, are commonly available in electronic form. However, they are also commonly entered and stored as free text. Knowledge of the structure of clinical narratives is necessary for enhancing the productivity of healthcare departments and facilitating research. This study attempts to automatically segment medical reports into semantic sections. Our goal is to develop a robust and scalable medical report segmentation system requiring minimum user input for efficient retrieval and extraction of information from free-text clinical narratives. Hand-crafted rules were used to automatically identify a high-confidence training set. This automatically created training dataset was later used to develop metrics and an algorithm that determines the semantic structure of the medical reports. A word-vector cosine similarity metric combined with several heuristics was used to classify each report sentence into one of several pre-defined semantic sections. This baseline algorithm achieved 79% accuracy. A Support Vector Machine (SVM) classifier trained on additional formatting and contextual features was able to achieve 90% accuracy. Plans for future work include developing a configurable system that could accommodate various medical report formatting and content standards.
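    The baseline described above, classifying each sentence by cosine similarity to per-section word vectors, can be sketched as follows; the section prototypes and the example sentence are invented for illustration:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bags of words (Counter objects)."""
    num = sum(a[w] * b[w] for w in a)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

# hypothetical section prototypes, as if built from a rule-labeled training set
prototypes = {
    "findings":   Counter("opacity effusion lung normal heart".split()),
    "impression": Counter("impression recommend follow up clinical".split()),
}

def classify(sentence):
    """Assign a sentence to the semantic section with the closest prototype."""
    words = Counter(sentence.lower().split())
    return max(prototypes, key=lambda s: cosine(words, prototypes[s]))

label = classify("The lung fields show a small effusion")
```

The paper's pipeline adds heuristics on top of this metric and then replaces the whole baseline with an SVM over additional formatting and contextual features.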

  6. Automatic Management of Parallel and Distributed System Resources

    Science.gov (United States)

    Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.

    1990-01-01

    Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.

  7. Thesaurus-Based Automatic Indexing: A Study of Indexing Failure.

    Science.gov (United States)

    Caplan, Priscilla Louise

    This study examines automatic indexing performed with a manually constructed thesaurus on a document collection of titles and abstracts of library science master's papers. Errors are identified when the meaning of a posted descriptor, as identified by context in the thesaurus, does not match that of the passage of text which occasioned the…

  8. Characteristics and design improvement of AP1000 automatic depressurization system

    International Nuclear Information System (INIS)

    Jin Fei

    2012-01-01

    The automatic depressurization system, a distinctive feature of the AP1000 design, enhances the plant's capability to mitigate design-basis accidents. The advancement of the system is discussed by comparing it with traditional PWR designs and analyzing system functions such as depressurization and venting. System design improvements made during the China Project are also described. At the end, suggestions for the system in the China Project are listed. (author)

  9. automatic time regulator for switching on an aeration device for ...

    African Journals Online (AJOL)

    human labor, necessitated the design of an automatic time regulator circuit, which controls the switching on and off .... Design Parameters. Rectifier performance parameters: The performance of the rectifier section of the power supply block was evaluated in terms of the following .... The C6 is a voltage control capacitor that.

  10. Nature Conservation Drones for Automatic Localization and Counting of Animals

    NARCIS (Netherlands)

    van Gemert, J.C.; Verschoor, C.R.; Mettes, P.; Epema, K.; Koh, L.P.; Wich, S.; Agapito, L.; Bronstein, M.M.; Rother, C.

    2015-01-01

    This paper is concerned with nature conservation by automatically monitoring animal distribution and animal abundance. Typically, such conservation tasks are performed manually on foot or after an aerial recording from a manned aircraft. Such manual approaches are expensive, slow and labor

  11. Do Judgments of Learning Predict Automatic Influences of Memory?

    Science.gov (United States)

    Undorf, Monika; Böhm, Simon; Cüpper, Lutz

    2016-01-01

    Current memory theories generally assume that memory performance reflects both recollection and automatic influences of memory. Research on people's predictions about the likelihood of remembering recently studied information on a memory test, that is, on judgments of learning (JOLs), suggests that both magnitude and resolution of JOLs are linked…

  12. Automatic attraction of visual attention by supraletter features of former target strings

    DEFF Research Database (Denmark)

    Kyllingsbæk, Søren; Lommel, Sven Van; Bundesen, Claus

    2014-01-01

    , performance (d’) degraded on trials in which former targets were present, suggesting that the former targets automatically drew processing resources away from the current targets. Apparently, the two experiments showed automatic attraction of visual attention by supraletter features of former target strings....

  13. Automatic target extraction in complicated background for camera calibration

    Science.gov (United States)

    Guo, Xichao; Wang, Cheng; Wen, Chenglu; Cheng, Ming

    2016-03-01

    To perform highly precise camera calibration against complicated backgrounds, a novel design of planar composite target and a corresponding automatic extraction algorithm are presented. Unlike other commonly used target designs, the proposed target simultaneously encodes feature-point coordinates and feature-point serial numbers. Based on the original target, templates are prepared by three geometric transformations and used as the input to template matching based on shape context. Finally, parity check and region growing methods are used to extract the target as the final result. The experimental results show that the proposed method for automatic extraction and recognition of the proposed target is effective, accurate and reliable.

  14. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program, ''CRITEST'', written in PL/1 for the EC computer and intended for automatic processing of the results of radioimmunological research has been elaborated. The program runs under the EC computer's operating system in a 60-kb memory partition. A modified Aitken algorithm was used in compiling the program. The program was clinically validated in determining a number of hormones: CTH, T4, T3, and TSH. Automatic processing of radioimmunological research data on the computer simplifies this labor-intensive analysis and improves its accuracy.

  15. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

    Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated...... alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis...

  16. Automatic Distribution Network Reconfiguration: An Event-Driven Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Fei; Jiang, Huaiguang; Tan, Jin

    2016-11-14

    This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, an optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which serves as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens, the distribution network can be reconfigured automatically based on real-time information that is observable and detectable.

  17. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns....... To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient...

  18. Quality assurance in the use of automatic processor equipment

    International Nuclear Information System (INIS)

    Cheung, Kyung Mo; Cheung, Hwan

    1986-01-01

    In recent years the concept of quality assurance in radiology has become as popular as apple pie. Unfortunately, however, the concept means different things to different people, and the methods proposed are many and diverse. The present article focuses on quality assurance in the use of automatic processor equipment. With automatic film processors, many factors can cause a reduction in image quality, but they center on the following: (1) chemical solutions, (2) mechanical parts of the equipment, and (3) faults in processing and its management, which are discussed in some detail. Quality assurance helps to ensure satisfactory performance of the automatic processor equipment for high image quality. The following record keeping is required for equipment quality control: 1) daily maintenance record, 2) weekly maintenance record, 3) monthly maintenance record, and 4) pH value record based on tests with litmus paper. Training of the personnel is required in the following subjects: 1) safety in processing operation, 2) management of the automatic processor equipment (a manual will be used), and 3) acquisition and absorption of the latest information on automatic processor equipment systems. The procedures described above are considered necessary for efficient processing operations that will give high and uniform image quality

  19. The Potential of Automatic Word Comparison for Historical Linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Greenhill, Simon J; Gray, Russell D

    2017-01-01

    The amount of data from languages spoken all over the world is rapidly increasing. Traditional manual methods in historical linguistics need to face the challenges brought by this influx of data. Automatic approaches to word comparison could provide invaluable help to pre-analyze data which can be later enhanced by experts. In this way, computational approaches can take care of the repetitive and schematic tasks leaving experts to concentrate on answering interesting questions. Here we test the potential of automatic methods to detect etymologically related words (cognates) in cross-linguistic data. Using a newly compiled database of expert cognate judgments across five different language families, we compare how well different automatic approaches distinguish related from unrelated words. Our results show that automatic methods can identify cognates with a very high degree of accuracy, reaching 89% for the best-performing method Infomap. We identify the specific strengths and weaknesses of these different methods and point to major challenges for future approaches. Current automatic approaches for cognate detection-although not perfect-could become an important component of future research in historical linguistics.
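
    As a rough illustration of automatic word comparison, the sketch below scores word pairs with a normalized Levenshtein distance, a common baseline for cognate detection; it is not the Infomap method the study found best, and the example words are invented.

```python
# Illustrative baseline for cognate detection: normalized Levenshtein
# distance between word forms. (The study's best method, Infomap, is a
# network-based approach; this simple string distance is a common baseline.)

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def normalized_distance(a: str, b: str) -> float:
    """Edit distance scaled to [0, 1]; lower suggests possible cognacy."""
    return levenshtein(a, b) / max(len(a), len(b), 1)

print(normalized_distance("hand", "hand"))  # 0.0 (identical forms)
print(normalized_distance("hund", "dog"))   # close to 1.0 (dissimilar)
```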

  20. New solutions for automatic control of skip hoisting

    Energy Technology Data Exchange (ETDEWEB)

    Kacy, K.; Kwiatkowski, M.; Lankosz, M.

    1977-01-01

    This paper discusses the design and operation of an automatic system for control of skip hoisting in the Barbara-Chorzow black coal mine in Poland. The control system for skip hoists consists of the CKD manual control system and an automatic control system developed in Poland. The number of common elements of the 2 control systems was reduced to a minimum. The main functional elements of the automatic control system are described: a control-point setting element, a hoisting-speed control element, an element for control of hoisting current and for limiting current during stoppage, and a relay control and protection element. The design of the elements is shown in 11 schemes and diagrams. The system is characterized by low price and installation cost (550,000 zlotys) in comparison to the cost of the ASEA control system (1,800,000 zlotys) or that of the PMUE system (1,200,000 zlotys), and by high reliability and control efficiency. In spite of increased ambient temperature, the temperature of the drive system with the automatic control system was within permissible limits. Performance tests showed that an independent system for manual and automatic control of hoist drive is superior to an integrated control system. 2 refs.

  1. Artificial Intelligence In Automatic Target Recognizers: Technology And Timelines

    Science.gov (United States)

    Gilmore, John F.

    1984-12-01

    The recognition of targets in thermal imagery has been a problem exhaustively analyzed in its current localized dimension. This paper discusses the application of artificial intelligence (AI) technology to automatic target recognition, a concept capable of expanding current ATR efforts into a new globalized dimension. Deficiencies of current automatic target recognition systems are reviewed in terms of system shortcomings. Areas of artificial intelligence which show the most promise in improving ATR performance are analyzed, and a timeline is formed in light of how near (as well as far) term artificial intelligence applications may exist. Current research in the area of high level expert vision systems is reviewed and the possible utilization of artificial intelligence architectures to improve low level image processing functions is also discussed. Additional application areas of relevance to solving the problem of automatic target recognition utilizing both high and low level processing are also explored.

  2. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system relating to the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system performs automatically the isolation of any individual image, the area and weighted area of which are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyo-typing [fr
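
    The two measurements mentioned (area and weighted area of an isolated image) can be sketched as follows; the grey-level grid and threshold are hypothetical stand-ins for the A.S.T.I.'s 16-level data.

```python
# For an isolated image stored as a grid of grey levels, the area is the
# number of pixels at or above a threshold and the weighted area is the
# sum of their grey values. Image and threshold are illustration values.

def area_and_weighted_area(image, threshold=1):
    area = 0
    weighted = 0
    for row in image:
        for grey in row:
            if grey >= threshold:
                area += 1
                weighted += grey
    return area, weighted

image = [
    [0, 0, 3, 0],
    [0, 5, 7, 0],
    [0, 2, 0, 0],
]
print(area_and_weighted_area(image))  # (4, 17)
```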

  3. Design of the automatic landing inversion flight control system based on neural network compensation for UAV

    Science.gov (United States)

    Chen, Yinchao; Yang, Wei

    2009-12-01

    A dynamic inversion control method based on neural network compensation for UAV automatic landing is introduced. Aimed at the nonlinear characteristics of the automatic landing procedure, the dynamic inversion method is used for feedback linearization. An on-line neural network is introduced to compensate for the dynamic inversion error caused by disturbance factors during automatic landing, improving the controller performance. Numerical simulations show that the control method makes the UAV follow the expected trajectory properly, with good dynamic performance and robustness.
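
    A minimal sketch of the compensation idea on a scalar toy system; this is not the paper's UAV model or its neural network, and the adaptive bias estimate below merely plays the compensator's role.

```python
# Toy scalar plant x' = f(x) + u with true f unknown; inversion uses an
# estimate f_hat, and an online-adapted correction term d_hat stands in
# for the neural-network compensation of the inversion error.

def simulate(steps=2000, dt=0.01, k=2.0, gamma=5.0):
    x, x_ref = 0.0, 1.0
    d_hat = 0.0                      # adaptive estimate of model error
    f_true = lambda x: -x + 0.5      # actual dynamics (0.5 is unmodeled)
    f_hat = lambda x: -x             # nominal model used for inversion
    for _ in range(steps):
        e = x_ref - x
        # dynamic inversion with adaptive compensation
        u = -f_hat(x) - d_hat + k * e
        x += dt * (f_true(x) + u)
        d_hat -= dt * gamma * e      # adaptation law (sign chosen for stability)
    return x

print(round(simulate(), 3))  # 1.0 -- tracks the reference despite model error
```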

  4. Automatic differentiation for gradient-based optimization of radiatively heated microelectronics manufacturing equipment

    Energy Technology Data Exchange (ETDEWEB)

    Moen, C.D.; Spence, P.A.; Meza, J.C.; Plantenga, T.D.

    1996-12-31

    Automatic differentiation is applied to the optimal design of microelectronic manufacturing equipment. The performance of nonlinear, least-squares optimization methods is compared between numerical and analytical gradient approaches. The optimization calculations are performed by running large finite-element codes in an object-oriented optimization environment. The Adifor automatic differentiation tool is used to generate analytic derivatives for the finite-element codes. The performance results support previous observations that automatic differentiation becomes beneficial as the number of optimization parameters increases. The increase in speed, relative to numerical differences, has a limited value and results are reported for two different analysis codes.
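
    The benefit of analytic derivatives over numerical differences can be illustrated with a minimal forward-mode AD sketch; Adifor itself operates on Fortran source, so the dual-number class below is only an analogue.

```python
# Forward-mode automatic differentiation via dual numbers, compared
# against a finite-difference approximation. The AD result is exact up
# to floating point; the finite difference carries truncation error.

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)  # product rule
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1     # f'(x) = 6x + 2

x0 = 1.5
ad = f(Dual(x0, 1.0)).der            # exact derivative: 11.0
fd = (f(x0 + 1e-6) - f(x0)) / 1e-6   # finite-difference approximation
print(ad, fd)
```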

  5. Unification of automatic target tracking and automatic target recognition

    Science.gov (United States)

    Schachter, Bruce J.

    2014-06-01

    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms taking place at varying time scales. A framework is provided for unifying ATT and ATR.

  6. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  7. Automatic Evaluation of Machine Translation

    DEFF Research Database (Denmark)

    Martinez, Mercedes Garcia; Koglin, Arlene; Mesa-Lao, Bartolomé

    2015-01-01

    of quality criteria in as few edits as possible. The quality of MT systems is generally measured by automatic metrics, producing scores that should correlate with human evaluation. In this study, we investigate correlations between one such metric, i.e. Translation Edit Rate (TER), and actual post-editing effort, namely i) temporal (time), ii) cognitive (mental processes) and iii) technical (keyboard activity). For the purposes of this research, TER scores were correlated with two different indicators of post-editing effort as computed in the CRITT Translation Process Database (TPR...
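
    For reference, TER is an edit distance normalized by reference length. A simplified word-level version (omitting TER's block-shift moves) can be sketched as:

```python
# Simplified word-level TER: edit distance between hypothesis and
# reference divided by reference length. (Full TER also counts block
# "shift" moves; this sketch omits them.)

def simple_ter(hypothesis: str, reference: str) -> float:
    hyp, ref = hypothesis.split(), reference.split()
    prev = list(range(len(ref) + 1))
    for i, h in enumerate(hyp, 1):
        cur = [i]
        for j, r in enumerate(ref, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (h != r)))
        prev = cur
    return prev[-1] / max(len(ref), 1)

print(simple_ter("the cat sat", "the cat sat on the mat"))  # 3 edits / 6 words = 0.5
```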

  8. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  9. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1 where J is the number of users helping that user.
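
    The quoted diversity-gain expression is easy to sanity-check numerically (the function name is ours):

```python
# With M retransmissions and J helping users, the reported diversity
# gain grows from M (standalone HARQ, J=0) to (J+1)(M-1)+1 under the
# coordinated scheme.

def coordinated_diversity_gain(M: int, J: int) -> int:
    return (J + 1) * (M - 1) + 1

for J in range(4):
    print(J, coordinated_diversity_gain(3, J))
# J=0 recovers the standalone gain M=3; each extra helper adds M-1=2.
```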

  10. ABS-SmartComAgri: An Agent-Based Simulator of Smart Communication Protocols in Wireless Sensor Networks for Debugging in Precision Agriculture.

    Science.gov (United States)

    García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime

    2018-03-27

    Smart communication protocols are becoming a key mechanism for improving communication performance in networks such as wireless sensor networks. However, the literature lacks mechanisms for simulating smart communication protocols in precision agriculture for decreasing production costs. In this context, the current work presents an agent-based simulator of smart communication protocols for efficiently managing pesticides. The simulator considers the needs of electric power, crop health, percentage of alive bugs and pesticide consumption. The current approach is illustrated with three different communication protocols respectively called (a) broadcast, (b) neighbor and (c) low-cost neighbor. The low-cost neighbor protocol obtained a statistically-significant reduction in the need of electric power over the neighbor protocol, with a very large difference according to the common interpretations about the Cohen's d effect size. The presented simulator is called ABS-SmartComAgri and is freely distributed as open-source from a public research data repository. It ensures the reproducibility of experiments and allows other researchers to extend the current approach.
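
    The effect-size comparison mentioned above uses Cohen's d. A standard pooled-standard-deviation implementation, with made-up power-consumption samples for illustration:

```python
# Cohen's d with a pooled standard deviation. The two sample lists are
# invented illustration data, not measurements from ABS-SmartComAgri.

import statistics

def cohens_d(sample_a, sample_b):
    na, nb = len(sample_a), len(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    pooled = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (statistics.mean(sample_a) - statistics.mean(sample_b)) / pooled

power_neighbor = [10.2, 9.8, 10.5, 10.1, 9.9]   # hypothetical readings
power_low_cost = [7.1, 6.8, 7.4, 7.0, 6.9]
d = cohens_d(power_neighbor, power_low_cost)
print(round(d, 2))  # |d| > 0.8 is conventionally interpreted as "large"
```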

  12. A System for Automatically Generating Scheduling Heuristics

    Science.gov (United States)

    Morris, Robert

    1996-01-01

    The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application selected for applying this method solves the problem of scheduling telescope observations, and is called the Associate Principal Astronomer. The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints expressed as an objective function established by an astronomer-user.
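
    The greedy heuristic search described above can be sketched as follows; the request fields and the example observation requests are invented for illustration, not taken from the APA scheduler.

```python
# Greedy scheduling over observation requests: each request has a hard
# time window (start, end) and a soft priority; the greedy pass picks
# the highest-priority feasible request at each step.

def greedy_schedule(requests, horizon):
    """requests: list of dicts with 'name', 'start', 'end', 'duration',
    'priority'. Returns list of (name, start_time)."""
    schedule, t = [], 0
    pending = sorted(requests, key=lambda r: -r["priority"])
    while t < horizon:
        for r in pending:
            begin = max(t, r["start"])
            if begin + r["duration"] <= r["end"]:      # hard constraint
                schedule.append((r["name"], begin))
                t = begin + r["duration"]
                pending.remove(r)
                break
        else:
            break                                       # nothing feasible left
    return schedule

requests = [
    {"name": "M31", "start": 0, "end": 8, "duration": 2, "priority": 5},
    {"name": "M42", "start": 1, "end": 6, "duration": 2, "priority": 9},
    {"name": "M13", "start": 0, "end": 3, "duration": 2, "priority": 1},
]
print(greedy_schedule(requests, horizon=10))  # [('M42', 1), ('M31', 3)]
```

    M13's window closes before the higher-priority observations finish, so the greedy pass drops it; a smarter heuristic would schedule the tight window first.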

  13. Automatic measurement of target crossing speed

    Science.gov (United States)

    Wardell, Mark; Lougheed, James H.

    1992-11-01

    The motion of ground vehicle targets after a ballistic round is launched can be a major source of inaccuracy for small (handheld) anti-armour weapon systems. A method of automatically measuring the crossing component to compensate the fire control solution has been devised and tested against various targets in a range of environments. A photodetector array aligned with the sight's horizontal reticle obtains scene features, which are digitized and processed to separate target from sight motion. Relative motion of the target against the background is briefly monitored to deduce angular crossing rate and a compensating lead angle is introduced into the aim point. Research to gather quantitative data and optimize algorithm performance is described, and some results from field testing are presented.
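
    To first order, the compensation reduces to lead angle ≈ angular crossing rate × projectile time of flight; the numbers below are hypothetical:

```python
# First-order lead-angle compensation: aim where the target will be
# when the round arrives. All values are hypothetical illustrations.

def lead_angle_mrad(crossing_rate_mrad_s: float, time_of_flight_s: float) -> float:
    return crossing_rate_mrad_s * time_of_flight_s

# Target crossing at 10 m/s at 500 m range -> angular rate = v/R
rate = 10.0 / 500.0 * 1000.0          # 20 mrad/s
print(lead_angle_mrad(rate, time_of_flight_s=1.5))  # 30.0 mrad
```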

  14. Automatic Mode Transition Enabled Robust Triboelectric Nanogenerators.

    Science.gov (United States)

    Chen, Jun; Yang, Jin; Guo, Hengyu; Li, Zhaoling; Zheng, Li; Su, Yuanjie; Wen, Zhen; Fan, Xing; Wang, Zhong Lin

    2015-12-22

    Although the triboelectric nanogenerator (TENG) has been proven to be a renewable and effective route for ambient energy harvesting, its robustness remains a great challenge due to the requirement of surface friction for a decent output, especially for the in-plane sliding mode TENG. Here, we present a rationally designed TENG for achieving a high output performance without compromising the device robustness by, first, converting the in-plane sliding electrification into a contact separation working mode and, second, creating an automatic transition between a contact working state and a noncontact working state. The magnet-assisted automatic transition triboelectric nanogenerator (AT-TENG) was demonstrated to effectively harness various ambient rotational motions to generate electricity with greatly improved device robustness. At a wind speed of 6.5 m/s or a water flow rate of 5.5 L/min, the harvested energy was capable of lighting up 24 spot lights (0.6 W each) simultaneously and charging a capacitor to greater than 120 V in 60 s. Furthermore, due to the rational structural design and unique output characteristics, the AT-TENG was not only capable of harvesting energy from natural bicycling and car motion but also acting as a self-powered speedometer with ultrahigh accuracy. Given such features as structural simplicity, easy fabrication, low cost, wide applicability even in a harsh environment, and high output performance with superior device robustness, the AT-TENG renders an effective and practical approach for ambient mechanical energy harvesting as well as self-powered active sensing.

  15. Improving Automatic Text Classification by Integrated Feature Analysis

    Science.gov (United States)

    Busagala, Lazaro S. P.; Ohyama, Wataru; Wakabayashi, Tetsushi; Kimura, Fumitaka

    Feature transformation in automatic text classification (ATC) can lead to better classification performance. Furthermore dimensionality reduction is important in ATC. Hence, feature transformation and dimensionality reduction are performed to obtain lower computational costs with improved classification performance. However, feature transformation and dimension reduction techniques have been conventionally considered in isolation. In such cases classification performance can be lower than when integrated. Therefore, we propose an integrated feature analysis approach which improves the classification performance at lower dimensionality. Moreover, we propose a multiple feature integration technique which also improves classification effectiveness.
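
    A toy version of integrating the two steps: TF-IDF feature transformation combined with a crude dimensionality reduction (keeping the k highest-document-frequency terms) in a single pipeline. This is only a stdlib sketch, not the authors' method.

```python
# TF-IDF weighting and dimensionality reduction performed together:
# the vocabulary is cut to the k most frequent terms before vectors
# are built, rather than reducing a full matrix afterwards.

import math
from collections import Counter

def tfidf_reduced(docs, k):
    df = Counter(term for doc in docs for term in set(doc.split()))
    vocab = [t for t, _ in df.most_common(k)]          # reduction step
    n = len(docs)
    vectors = []
    for doc in docs:
        tf = Counter(doc.split())
        vectors.append([tf[t] * math.log(n / df[t]) for t in vocab])
    return vocab, vectors

docs = ["the cat sat", "the dog sat", "a cat ran"]
vocab, vecs = tfidf_reduced(docs, k=3)
print(vocab)
print(vecs)
```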

  16. An automatic holographic adaptive phoropter

    Science.gov (United States)

    Amirsolaimani, Babak; Peyghambarian, N.; Schwiegerling, Jim; Bablumyan, Arkady; Savidis, Nickolaos; Peyman, Gholam

    2017-08-01

    Phoropters are the most common instrument used to detect refractive errors. During a refractive exam, lenses are flipped in front of the patient, who looks at the eye chart and tries to read the symbols. The procedure is fully dependent on the cooperation of the patient to read the eye chart, provides only a subjective measurement of visual acuity, and can at best provide a rough estimate of the patient's vision. Phoropters are difficult to use for mass screenings, requiring a skilled examiner, and it is hard to screen young children and the elderly. We have developed a simplified, lightweight automatic phoropter that can measure the optical error of the eye objectively without requiring the patient's input. The automatic holographic adaptive phoropter is based on a Shack-Hartmann wavefront sensor and three computer-controlled fluidic lenses. The fluidic lens system is designed to provide power and astigmatic corrections over a large range of corrections, without the need for verbal feedback from the patient, in less than 20 seconds.

  17. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.

    1978-01-01

    A remotely controlled automatic special welding machine for piping was developed. This machine is utilized for long distance pipe lines, chemical plants, thermal power generating plants and nuclear power plants effectively from the viewpoint of good quality control, reduction of labor and good controllability. The function of this welding machine is to inspect the shape and dimensions of edge preparation before welding work by the sense of touch, to detect the temperature of melt pool, inspect the bead form by the sense of touch, and check the welding state by ITV during welding work, and to grind the bead surface and inspect the weld metal by ultrasonic test automatically after welding work. The construction of this welding system, the main specification of the apparatus, the welding procedure in detail, the electrical source of this welding machine, the cooling system, the structure and handling of guide ring, the central control system and the operating characteristics are explained. The working procedure and the effect by using this welding machine, and the application to nuclear power plants and the other industrial field are outlined. The HIDIC 08 is used as the controlling computer. This welding machine is useful for welding SUS piping as well as carbon steel piping. (Nakai, Y.)

  18. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.
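
    The energy-minimization idea can be sketched as a brute-force search over candidate lens slots; the energy terms and all coordinates are invented simplifications of the paper's objective.

```python
# Toy layout energy: for each detail lens, the distance from its slot
# to the map region it magnifies, plus a fixed cost per lens. A brute
# force over slot assignments stands in for the paper's minimizer.

from itertools import permutations
import math

def energy(assignment, regions, slots, lens_cost=0.5):
    dist = sum(math.dist(slots[s], regions[i])
               for i, s in enumerate(assignment))
    return dist + lens_cost * len(assignment)

regions = [(1, 1), (4, 5)]                 # POI regions on the overview map
slots = [(0, 0), (5, 5), (2, 3)]           # candidate lens positions
best = min(permutations(range(len(slots)), len(regions)),
           key=lambda a: energy(a, regions, slots))
print(best)  # (0, 1): each lens lands next to its region
```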

  19. A comparison of accurate automatic hippocampal segmentation methods.

    Science.gov (United States)

    Zandifar, Azar; Fonov, Vladimir; Coupé, Pierrick; Pruessner, Jens; Collins, D Louis

    2017-07-15

    The hippocampus is one of the first brain structures affected by Alzheimer's disease (AD). While many automatic methods for hippocampal segmentation exist, few studies have compared them on the same data. In this study, we compare four fully automated hippocampal segmentation methods in terms of their conformity with manual segmentation and their ability to be used as an AD biomarker in clinical settings. We also apply error correction to the four automatic segmentation methods, and complete a comprehensive validation to investigate differences between the methods. The effect size and classification performance is measured for AD versus normal control (NC) groups and for stable mild cognitive impairment (sMCI) versus progressive mild cognitive impairment (pMCI) groups. Our study shows that the nonlinear patch-based segmentation method with error correction is the most accurate automatic segmentation method and yields the most conformity with manual segmentation (κ=0.894). The largest effect size between AD versus NC and sMCI versus pMCI is produced by FreeSurfer with error correction. We further show that, using only hippocampal volume, age, and sex as features, the area under the receiver operating characteristic curve reaches up to 0.8813 for AD versus NC and 0.6451 for sMCI versus pMCI. However, the automatic segmentation methods are not significantly different in their performance. Copyright © 2017. Published by Elsevier Inc.
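
    The κ=0.894 conformity figure is Cohen's kappa. For two binary segmentations flattened to label vectors (the vectors below are made up), kappa can be computed as:

```python
# Cohen's kappa: observed agreement corrected for chance agreement.
# The two label vectors are invented, not from the study's data.

def cohens_kappa(labels_a, labels_b):
    n = len(labels_a)
    agree = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    pa1 = sum(labels_a) / n                      # P(a == 1)
    pb1 = sum(labels_b) / n                      # P(b == 1)
    chance = pa1 * pb1 + (1 - pa1) * (1 - pb1)   # expected agreement
    return (agree - chance) / (1 - chance)

auto   = [1, 1, 1, 0, 0, 0, 1, 0]
manual = [1, 1, 0, 0, 0, 0, 1, 0]
print(round(cohens_kappa(auto, manual), 3))  # 0.75
```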

  20. Automatic Transcription of Polyphonic Vocal Music

    Directory of Open Access Journals (Sweden)

    Andrew McLeod

    2017-12-01

    Full Text Available This paper presents a method for automatic music transcription applied to audio recordings of a cappella performances with multiple singers. We propose a system for multi-pitch detection and voice assignment that integrates an acoustic and a music language model. The acoustic model performs spectrogram decomposition, extending probabilistic latent component analysis (PLCA) using a six-dimensional dictionary with pre-extracted log-spectral templates. The music language model performs voice separation and assignment using hidden Markov models that apply musicological assumptions. By integrating the two models, the system is able to detect multiple concurrent pitches in polyphonic vocal music and assign each detected pitch to a specific voice type such as soprano, alto, tenor or bass (SATB). We compare our system against multiple baselines, achieving state-of-the-art results for both multi-pitch detection and voice assignment on a dataset of Bach chorales and another of barbershop quartets. We also present an additional evaluation of our system using varied pitch tolerance levels to investigate its performance at 20-cent pitch resolution.
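
    The system assigns voices with hidden Markov models; as a crude illustrative stand-in, pitches can be binned by conventional SATB ranges (the MIDI ranges below are approximate assumptions):

```python
# Range-based SATB voice assignment -- a deliberately naive stand-in for
# the paper's HMM voice assignment. Ranges are approximate conventions.

RANGES = {  # (low, high) in MIDI note numbers
    "bass":    (40, 60),
    "tenor":   (47, 67),
    "alto":    (53, 74),
    "soprano": (60, 81),
}

def assign_voice(midi_pitch: int) -> str:
    """Among voices whose range contains the pitch, pick the one whose
    range centre is nearest."""
    candidates = [(abs(midi_pitch - (lo + hi) / 2), name)
                  for name, (lo, hi) in RANGES.items()
                  if lo <= midi_pitch <= hi]
    return min(candidates)[1]

print([assign_voice(p) for p in [45, 55, 65, 76]])
# ['bass', 'tenor', 'alto', 'soprano']
```

    Unlike this rule, an HMM also uses temporal context, which is what lets the paper's system keep crossing voices apart.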

  1. Automaticity in reading isiZulu

    OpenAIRE

    Sandra Land

    2016-01-01

    Automaticity, or instant recognition of combinations of letters as units of language, is essential for proficient reading in any language. The article explores automaticity amongst competent adult first-language readers of isiZulu, and the factors associated with it or its opposite - active decoding. Whilst the transparent spelling patterns of isiZulu aid learner readers, some of its orthographical features may militate against their gaining automaticity. These features are agglutination; a c...

  2. Production ready feature recognition based automatic group technology part coding

    Energy Technology Data Exchange (ETDEWEB)

    Ames, A.L.

    1990-01-01

    During the past four years, a feature recognition based expert system for automatically performing group technology part coding from solid model data has been under development. The system has become a production quality tool, capable of quickly generating the geometry-based portions of a part code with no human intervention. It has been tested on over 200 solid models, half of which are models of production Sandia designs. Its performance rivals that of humans performing the same task, often surpassing them in speed and uniformity. The feature recognition capability developed for part coding is being extended to support other applications, such as manufacturability analysis, automatic decomposition (for finite element meshing and machining), and assembly planning. Initial surveys of these applications indicate that the current capability will provide a strong basis for other applications and that extensions toward more global geometric reasoning and tighter coupling with solid modeler functionality will be necessary.

  3. Automatic Thermal Infrared Panoramic Imaging Sensor

    National Research Council Canada - National Science Library

    Gutin, Mikhail; Tsui, Eddy K; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-01-01

    ....  Automatic detection, location, and tracking of targets outside the protected area ensures maximum protection and at the same time reduces the workload on personnel, increases reliability and confidence...

  4. An automatically tuning intrusion detection system.

    Science.gov (United States)

    Yu, Zhenwei; Tsai, Jeffrey J P; Weigert, Thomas

    2007-04-01

    An intrusion detection system (IDS) is a security layer used to detect ongoing intrusive activities in information systems. Traditionally, intrusion detection relies on extensive knowledge of security experts, in particular, on their familiarity with the computer system to be protected. To reduce this dependence, various data-mining and machine learning techniques have been deployed for intrusion detection. An IDS is usually working in a dynamically changing environment, which forces continuous tuning of the intrusion detection model, in order to maintain sufficient performance. The manual tuning process required by current systems depends on the system operators in working out the tuning solution and in integrating it into the detection model. In this paper, an automatically tuning IDS (ATIDS) is presented. The proposed system will automatically tune the detection model on-the-fly according to the feedback provided by the system operator when false predictions are encountered. The system is evaluated using the KDDCup'99 intrusion detection dataset. Experimental results show that the system achieves up to 35% improvement in terms of misclassification cost when compared with a system lacking the tuning feature. If only 10% false predictions are used to tune the model, the system still achieves about 30% improvement. Moreover, when tuning is not delayed too long, the system can achieve about 20% improvement, with only 1.3% of the false predictions used to tune the model. The results of the experiments show that a practical system can be built based on ATIDS: system operators can focus on verification of predictions with low confidence, as only those predictions determined to be false will be used to tune the detection model.
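
    A toy analogue of the tuning loop (not the paper's ATIDS): a score-threshold detector that nudges its threshold whenever the operator flags a prediction as false, instead of waiting for manual retuning.

```python
# Feedback-driven tuning sketch: raise the alert threshold on a flagged
# false alarm, lower it on a flagged missed attack. The step size and
# scoring are invented simplifications.

class TunableDetector:
    def __init__(self, threshold=0.5, step=0.05):
        self.threshold, self.step = threshold, step

    def predict(self, score: float) -> bool:
        return score >= self.threshold          # True = "intrusion"

    def feedback(self, score: float, was_false: bool):
        """Operator verdict on a prediction; only false ones tune the model."""
        if not was_false:
            return
        if self.predict(score):                 # false alarm -> raise the bar
            self.threshold += self.step
        else:                                   # missed attack -> lower the bar
            self.threshold -= self.step

det = TunableDetector()
det.feedback(0.55, was_false=True)   # operator: that alert was wrong
print(round(det.threshold, 2))       # 0.55
```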

  5. PERFORMANCE

    Directory of Open Access Journals (Sweden)

    M Cilli

    2014-10-01

    Full Text Available This study aimed to investigate the kinematic and kinetic changes when resistance is applied in horizontal and vertical directions, produced by using different percentages of body weight, caused by jumping movements during a dynamic warm-up. The group of subjects consisted of 35 voluntary male athletes (19 basketball and 16 volleyball players; age: 23.4 ± 1.4 years, training experience: 9.6 ± 2.7 years; height: 177.2 ± 5.7 cm, body weight: 69.9 ± 6.9 kg) studying Physical Education, who had a jump training background and who were training for 2 hours on 4 days a week. A dynamic warm-up protocol containing seven specific resistance movements with specific resistance corresponding to different percentages of body weight (2%, 4%, 6%, 8%, 10%) was applied randomly on non-consecutive days. Effects of different warm-up protocols were assessed by pre-/post-exercise changes in jump height in the countermovement jump (CMJ) and the squat jump (SJ) measured using a force platform and changes in hip and knee joint angles at the end of the eccentric phase measured using a video camera. A significant increase in jump height was observed in the dynamic resistance warm-up conducted with different percentages of body weight (p<0.05). In jump movements before and after the warm-up, while no significant difference between the vertical ground reaction forces applied by athletes was observed (p>0.05), in some cases of resistance a significant reduction was observed in hip and knee joint angles (p<0.05). The dynamic resistance warm-up method was found to cause changes in the kinematics of jumping movements, as well as an increase in jump height values. As a result, dynamic warm-up exercises could be applicable in cases of resistance corresponding to 6-10% of body weight applied in horizontal and vertical directions in order to increase the jump performance acutely.

  6. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  7. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    , using this method has been developed. (ADIODES is an abbreviation of ``Automatic Differentiation Interval Ordinary Differential Equation Solver''). ADIODES is used to prove existence and uniqueness of periodic solutions to specific ordinary differential equations occurring in dynamical systems theory....... These proofs of existence and uniqueness are difficult or impossible to obtain using other known methods. Also, a method for solving boundary value problems is described. Finally, a method for enclosing solutions to a class of integral equations is described. This method is based on the mean value enclosure...... of an integral operator and uses interval Bernstein polynomials for enclosing the solution. Two numerical examples are given, using two orders of approximation and using different numbers of discretization points....
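    The core of such validated numerics is interval arithmetic, in which every operation returns an interval guaranteed to enclose the exact result. A minimal sketch (illustrative only; ADIODES itself combines this with automatic differentiation and validated ODE solving):

```python
# Minimal interval-arithmetic sketch: every operation returns an interval
# guaranteed to enclose all possible exact results.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of intervals: add endpoints
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product: take min/max over all endpoint products
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def contains(self, x):
        return self.lo <= x <= self.hi

# Every real value of x*y + x for x in [1,2], y in [-1,1] is guaranteed
# to lie inside the computed enclosure.
x, y = Interval(1, 2), Interval(-1, 1)
enclosure = x * y + x
```

    The enclosure here is [-1, 4]: conservative, but rigorous, which is what makes existence and uniqueness proofs possible.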

  8. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.

    1980-01-01

    Antares is a 24-beam-line CO2 laser system for controlled fusion research, under construction at Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experiment shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the center of large copper mirrors, and remotely illuminated to reduce heating effects.
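    The centroid-nulling step can be sketched as follows (hypothetical numbers and gain; the real system drives movable mirrors from a video centroid tracker):

```python
# Sketch of closed-loop centroid nulling: compute the intensity centroid
# of a spot in a TV frame, take its offset from the optical axis as the
# error, and command a proportional mirror correction against it.

def centroid(frame):
    """Intensity-weighted centroid (cx, cy) of a 2D pixel grid."""
    total = sum(sum(row) for row in frame)
    cy = sum(i * sum(row) for i, row in enumerate(frame)) / total
    cx = sum(j * v for row in frame for j, v in enumerate(row)) / total
    return cx, cy

def mirror_correction(frame, axis, gain=0.5):
    """One closed-loop step: move the mirror against the off-axis error."""
    cx, cy = centroid(frame)
    return gain * (axis[0] - cx), gain * (axis[1] - cy)

frame = [[0, 0, 0],
         [0, 0, 9],   # bright spot to the right of the axis
         [0, 0, 0]]
dx, dy = mirror_correction(frame, axis=(1.0, 1.0))
```

    Iterating this step drives the centroid error toward zero, which is the sense in which the error is "nulled" by the closed loop.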

  9. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.

    2012-01-01

    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified, including the pupil, iris, sclera, and eyelashes. Based on this segmentation, the ... regions. The result is a blue-brown ratio for each eye. Furthermore, an image clustering approach has been used with promising results. The approach is based on using a sparse dictionary of feature vectors learned from a training set of iris regions. The feature vectors contain both local structural ... is completely data driven and it can divide a group of eye images into classes based on structure, colour or a combination of the two. The methods have been tested on a large set of photos with promising results.
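    As an illustration of the blue-brown ratio idea (an assumed pixel rule, not the published algorithm), iris pixels can be classified from RGB values and the ratio computed:

```python
# Illustrative sketch: classify iris pixels as blue-ish or brown-ish from
# RGB and report a blue-brown ratio for the eye. The pixel rule here is a
# simple assumption, not the paper's classifier.

def is_blue(pixel):
    r, g, b = pixel
    return b > r and b > g     # blue channel dominates

def blue_brown_ratio(iris_pixels):
    blue = sum(1 for p in iris_pixels if is_blue(p))
    brown = len(iris_pixels) - blue
    return blue / brown if brown else float("inf")

# Four hypothetical iris pixels: two blue-ish, two brown-ish
pixels = [(60, 90, 160), (70, 95, 170), (120, 80, 60), (110, 75, 55)]
ratio = blue_brown_ratio(pixels)
```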

  10. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.

    1993-01-01

    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software oriented workshop SWORD (which stands for 'Software Workshop Oriented towards Research and Development') designed in the ADA language including integrated CAD system and software tools for automatic generation of simulation software and man-machine interface in order to operate run-time simulation; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configuration generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  11. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.

    1984-01-01

    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and long operation time required. The system presented in this paper realizes the automatic operation of the TIP system by monitoring and driving it with a process computer. This system significantly reduces the burden on customer operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives equipment by microprocessor control. The process computer contains such components as the CRT/KB unit, the printer plotter, the hard copier, and the message typers required for efficient man-machine communications. Its operation and interface properties are described

  12. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Statistical learning has been getting more and more interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: (1) automatic differentiation tools: tools to quickly build DAGs of computation that are fully differentiable; we shall focus on one such tool, PyTorch. (2) Easy deployment of trained neural networks into large systems with many constraints: for example, deploying a model at the reconstruction phase, where the neural network has to be integrated into CERN's bulk data-processing C++-only environment. (3) Some recent models in deep learning for segmentation and generation that might be useful for particle physics problems.
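    As a library-free illustration of what tools like PyTorch automate, here is a minimal reverse-mode automatic differentiation sketch that builds a DAG of operations and back-propagates gradients through it:

```python
# Minimal reverse-mode AD: each Var records its parents and the local
# derivative of the operation; backward() propagates gradients along
# every path of the DAG (correct here; naive recursion revisits shared
# nodes, which real frameworks avoid with a topological pass).

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x, y = Var(2.0), Var(3.0)
z = x * y + x          # z = xy + x, so dz/dx = y + 1, dz/dy = x
z.backward()
```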

  13. Analysis of a Simple Debugging Model.

    Science.gov (United States)

    1986-11-30

    … (2.4) Akman and Raftery (1986b) have shown that the unique prior of the form (2.4) for which B01 is invariant to scale … arbitrary, undefined, multiplicative constant c0/c1. Akman and Raftery (1986b) have shown how this may be assigned using the minimal imaginary experiment … greater than one. Raftery and Akman (1986) have applied this approach to the change-point Poisson process; their results may be compared with the non…

  14. Uav-Based Automatic Tree Growth Measurement for Biomass Estimation

    Science.gov (United States)

    Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.

    2016-06-01

    Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically ineffective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aerial Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data were collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week. In-situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal conditions for flight and parameter settings for image acquisition. The collected images are processed with a state-of-the-art tool, resulting in the generation of dense 3D point clouds. An algorithm is developed in order to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of the automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.
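    The height-calculation step can be sketched as follows (a simplifying assumption, not the authors' exact algorithm: for the points belonging to one detected tree, the lowest point approximates the ground and the highest the tree top):

```python
# Simplified tree-height estimate from a per-tree subset of a 3D point
# cloud: height = highest z minus lowest z (assumed ground level).

def tree_height(points):
    """points: iterable of (x, y, z) tuples for a single tree."""
    zs = [z for _, _, z in points]
    return max(zs) - min(zs)

# Hypothetical points for one tree (z in metres above a datum)
cloud = [(0.10, 0.20, 101.2),   # near the stem base
         (0.00, 0.10, 101.25),
         (0.05, 0.15, 103.0)]   # tree top
height = tree_height(cloud)
```

    Real pipelines would first filter vegetation from ground returns and fit a terrain model, but the week-over-week difference of such heights is what gives the growth estimate.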

  15. Failure of classical traffic flow theories: Stochastic highway capacity and automatic driving

    Science.gov (United States)

    Kerner, Boris S.

    2016-05-01

    In a mini-review, Kerner (2013) showed that classical traffic flow theories and models fail to explain empirical traffic breakdown - a phase transition from metastable free flow to synchronized flow at highway bottlenecks. The main objective of this mini-review is to study the consequences of this failure of classical traffic-flow theories for the analysis of empirical stochastic highway capacity, as well as for the effect of automatic driving vehicles and cooperative driving on traffic flow. To reach this goal, we show a deep connection between the understanding of empirical stochastic highway capacity and a reliable analysis of automatic driving vehicles in traffic flow. Using simulations in the framework of three-phase traffic theory, a probabilistic analysis of the effect of automatic driving vehicles on a mixed traffic flow consisting of a random distribution of automatic driving and manual driving vehicles has been made. We have found that the parameters of automatic driving vehicles can either decrease or increase the probability of the breakdown. The increase in the probability of traffic breakdown, i.e., the deterioration of the performance of the traffic system, can occur already at a small percentage (about 5%) of automatic driving vehicles. The increase in the probability of traffic breakdown through automatic driving vehicles can occur even if every platoon of automatic driving vehicles satisfies the condition for string stability.

  16. Automatic segmentation of diatom images for classification

    NARCIS (Netherlands)

    Jalba, Andrei C.; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    A general framework for automatic segmentation of diatom images is presented. This segmentation is a critical first step in contour-based methods for automatic identification of diatoms by computerized image analysis. We review existing results, adapt popular segmentation methods to this difficult

  17. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available Automatic systems have brought many revolutions to existing technologies. One technology that has seen considerable development is the solar powered automatic shrimp feeding system. Solar power, a renewable energy, can be an alternative solution to the energy crisis, while automatic operation reduces the need for manual labour. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer, set in intervals preferred by the user, that undergoes a continuous process. The magnetic contactor acts as a switch connected to the 10-hour timer, which controls the activation or termination of electrical loads; the system is powered by a solar panel outputting electrical power and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.
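    The timer logic can be sketched as follows (hypothetical parameters; the actual system uses a hardware 10-hour timer and a magnetic contactor rather than software):

```python
# Sketch of interval-based feeding within a 10-hour timer window: the
# user picks an interval, and feeding events fire at each multiple of
# that interval inside the window.

def feeding_schedule(window_hours=10, interval_hours=2):
    """Return feeding instants (hours from start) within the timer window."""
    times, t = [], 0
    while t <= window_hours:
        times.append(t)
        t += interval_hours
    return times

schedule = feeding_schedule(10, 2)   # feed every 2 hours for 10 hours
```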

  18. Towards Automatic Trunk Classification on Young Conifers

    DEFF Research Database (Denmark)

    Petri, Stig; Immerkær, John

    2009-01-01

    In the garden nursery industry providing young Nordmann firs for Christmas tree plantations, there is a rising interest in automatic classification of their products to ensure consistently high quality and reduce the cost of manual labor. This paper describes a fully automatic single-view algorithm...

  19. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore, the invention enables automatic marking of films in radiographic inspection, with regard to labelling of the test piece and of the part of it where testing took place. (RW) [de

  20. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  1. The TS 600: automatic control system for eddy currents

    International Nuclear Information System (INIS)

    Poulet, J.P.

    1986-10-01

    In the scope of fabrication and in-service inspection of the PWR steam generator tube bundle, FRAMATOME developed an automatic eddy current testing system: TS600. Based on a mini-computer, TS600 can digitize, store and process data in various ways, so it is possible to perform several kinds of inspection: conventional in-service inspection, roll area profilometry... TS600 can also be used to develop new methods of examination [fr

  2. Computer program for automatic generation of BWR control rod patterns

    International Nuclear Information System (INIS)

    Taner, M.S.; Levine, S.H.; Hsia, M.Y.

    1990-01-01

    A computer program named OCTOPUS has been developed to automatically determine a control rod pattern that approximates some desired target power distribution as closely as possible without violating any thermal safety or reactor criticality constraints. The program OCTOPUS performs a semi-optimization task based on the method of approximation programming (MAP) to develop control rod patterns. The SIMULATE-E code is used to determine the nucleonic characteristics of the reactor core state

  3. Assessing the efficacy of benchmarks for automatic speech accent recognition

    OpenAIRE

    Benjamin Bock; Lior Shamir

    2015-01-01

    Speech accents can possess valuable information about the speaker, and can be used in intelligent multimedia-based human-computer interfaces. The performance of algorithms for automatic classification of accents is often evaluated using audio datasets that include recording samples of different people, representing different accents. Here we describe a method that can detect bias in accent datasets, and apply the method to two accent identification datasets to reveal the existence of dataset ...

  4. METHOD FOR AUTOMATIC RAISING AND LEVELING OF SUPPORT PLATFORM

    OpenAIRE

    A. G. Stryzhniou

    2017-01-01

    The paper presents a method for automatic raising and leveling of a support platform that differs from others in simplicity and versatility. The method includes four phases of raising and leveling, during which the performance capabilities of the system are defined and the soil condition is tested. In addition, the current condition of the system is controlled and corrected, with the issuance of control parameters to the control panel. The method can be used not only for static, but also for dynamic leveling...

  5. ExpertBayes: Automatically refining manually built Bayesian networks

    OpenAIRE

    Almeida, Ezilda; Ferreira, Pedro; Vinhoza, Tiago; Dutra, Inês; Li, Jingwei; Wu, Yirong; Burnside, Elizabeth

    2014-01-01

    Bayesian network structures are usually built using only the data and starting from an empty network or from a naive Bayes structure. Very often, in some domains, like medicine, a prior structure knowledge is already known. This structure can be automatically or manually refined in search for better performance models. In this work, we take Bayesian networks built by specialists and show that minor perturbations to this original network can yield better classifiers with a very small computati...

  6. FaNexer: Persian Keyphrase Automatic Indexer

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Falahati Qadimi Fumani

    2014-06-01

    Full Text Available The main objective of this paper was to design a model of automatic keyphrase indexing for Persian. The training model, consisting of six features – “TF”, “TF × IDF”, “RE”, “RE × IDF”, “Node Degree” and “First Occurrence” – was elaborated on. These six features were defined briefly, and for each feature the discretization ranges applied, as well as the Yes/No probability scores of being an index term, were reported. Finally, the way the model, and each of its components, performed was demonstrated step by step by running the software on a sample full-text article. The ultimate assessment of the software on 75 test articles revealed that it performed very well on full texts (F-measure = 27.3%, precision = 31.68%, and recall = 25.45%) and abstracts (F-measure = 28%, precision = 32.19%, and recall = 26.27%) when the default was set at 7. The software also proved successful as regards generation of keyphrases rather than single-word index terms at default 7. In all, 58.1% of the index terms generated by the software for full-text documents, and 58.67% of those generated for abstracts, were phrases. Finally, 78.86% and 74.48% of the keyterms generated for full texts and abstracts, respectively, were judged as relevant by an LIS expert.
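    Two of the six features, TF and TF × IDF, can be computed as follows (an illustrative sketch; FaNexer additionally discretizes the values and combines all six features into Yes/No probability scores):

```python
# Sketch of the TF and TF x IDF features for candidate index terms:
# TF is the term's relative frequency in the document, and IDF weights
# down terms that appear across many documents in the corpus.

import math

def tf(term, doc_tokens):
    return doc_tokens.count(term) / len(doc_tokens)

def tf_idf(term, doc_tokens, corpus):
    df = sum(1 for d in corpus if term in d)      # document frequency
    idf = math.log(len(corpus) / df) if df else 0.0
    return tf(term, doc_tokens) * idf

# Toy tokenized corpus (hypothetical)
doc = ["indexing", "automatic", "indexing", "persian"]
corpus = [doc, ["automatic", "translation"], ["speech", "recognition"]]
score = tf_idf("indexing", doc, corpus)
```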

  7. Automatic Segmentation of Ultrasound Tomography Image

    Directory of Open Access Journals (Sweden)

    Shibin Wu

    2017-01-01

    Full Text Available Ultrasound tomography (UST) image segmentation is fundamental in breast density estimation, medicine response analysis, and anatomical change quantification. Existing methods are time consuming and require massive manual interaction. To address these issues, an automatic algorithm based on GrabCut (AUGC) is proposed in this paper. The presented method designs automated GrabCut initialization for incomplete labeling and is sped up with multicore parallel programming. To verify performance, AUGC is applied to segment thirty-two in vivo UST volumetric images. The performance of AUGC is validated with breast overlapping metrics (Dice coefficient (D), Jaccard (J), and false positive (FP)) and time cost (TC). Furthermore, AUGC is compared to other methods, including Confidence Connected Region Growing (CCRG), watershed, and Active Contour based Curve Delineation (ACCD). Experimental results indicate that AUGC achieves the highest accuracy (D=0.9275, J=0.8660, and FP=0.0077) and takes on average about 4 seconds to process a volumetric image. These results suggest that AUGC can benefit large-scale studies that use UST images for breast cancer screening and pathological quantification.
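    The overlap metrics used for validation can be sketched on binary masks (an illustrative implementation, not the paper's code):

```python
# Dice and Jaccard overlap between an automatic segmentation and a
# reference, both given as flat binary masks (1 = breast, 0 = background).

def dice(a, b):
    inter = sum(x and y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))

def jaccard(a, b):
    inter = sum(x and y for x, y in zip(a, b))
    union = sum(x or y for x, y in zip(a, b))
    return inter / union

auto   = [1, 1, 1, 0, 0]   # toy automatic mask
manual = [1, 1, 0, 0, 0]   # toy reference mask
```

    The two metrics are related by J = D / (2 - D), so reporting both mainly aids comparison with other studies.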

  8. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software program developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi (KLT) feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software program (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis.
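    The evaluation rule described above, flagging a manual intervention whenever the automatic track drifts more than 4 pixels from the reference, can be sketched as:

```python
# Compute the manual-intervention rate: the fraction of automatically
# tracked marker positions whose Euclidean distance to the manually
# tracked reference exceeds the 4-pixel threshold.

import math

def intervention_rate(auto_track, reference, threshold=4.0):
    flagged = sum(
        1 for (ax, ay), (rx, ry) in zip(auto_track, reference)
        if math.hypot(ax - rx, ay - ry) > threshold
    )
    return flagged / len(reference)

# Toy marker trajectories (hypothetical pixel coordinates)
reference  = [(10, 10), (12, 11), (14, 12), (16, 13)]
auto_track = [(10, 10), (12, 16), (14, 12), (25, 13)]  # two large drifts
rate = intervention_rate(auto_track, reference)
```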

  9. Automatically extracting information needs from complex clinical questions.

    Science.gov (United States)

    Cao, Yong-gang; Cimino, James J; Ely, John; Yu, Hong

    2010-12-01

    Clinicians pose complex clinical questions when seeing patients, and identifying the answers to those questions in a timely manner helps improve the quality of patient care. We report here on two natural language processing models, namely automatic topic assignment and keyword identification, that together automatically and effectively extract information needs from ad hoc clinical questions. Our study is motivated in the context of developing the larger clinical question answering system AskHERMES (Help clinicians to Extract and aRticulate Multimedia information for answering clinical quEstionS). We developed supervised machine-learning systems to automatically assign predefined general categories (e.g. etiology, procedure, and diagnosis) to a question. We also explored both supervised and unsupervised systems to automatically identify keywords that capture the main content of the question. We evaluated our systems on 4654 annotated clinical questions that were collected in practice. We achieved an F1 score of 76.0% for the task of general topic classification and 58.0% for keyword extraction. Our systems have been implemented into the larger question answering system AskHERMES. Our error analyses suggested that inconsistent annotation in our training data hurt both question analysis tasks. Our systems, available at http://www.askhermes.org, can automatically extract information needs from both short (fewer than 20 word tokens) and long questions, and from both well-structured and ill-formed questions. We speculate that the performance of general topic classification and keyword extraction can be further improved if consistently annotated data are made available. Copyright © 2010 Elsevier Inc. All rights reserved.
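    The reported scores combine precision and recall into F1; on a set-of-labels formulation this is simply (category names here are illustrative):

```python
# Precision, recall, and F1 for predicted vs. gold label sets, the
# standard metric behind the reported 76.0% / 58.0% scores.

def precision_recall_f1(predicted, gold):
    tp = len(predicted & gold)                     # true positives
    p = tp / len(predicted) if predicted else 0.0
    r = tp / len(gold) if gold else 0.0
    f1 = 2 * p * r / (p + r) if (p + r) else 0.0
    return p, r, f1

# Toy example: one category correct, one spurious, one missed
predicted = {"etiology", "diagnosis"}
gold = {"etiology", "procedure"}
p, r, f1 = precision_recall_f1(predicted, gold)
```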

  10. Control of automatic processes: A parallel distributed-processing account of the Stroop effect. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.D.; Dunbar, K.; McClelland, J.L.

    1989-11-22

    A growing body of evidence suggests that traditional views of automaticity are in need of revision. For example, automaticity has often been treated as an all-or-none phenomenon, and traditional theories have held that automatic processes are independent of attention. Yet recent empirical data suggest that automatic processes are continuous, and furthermore are subject to attentional control. In this paper we present a model of attention which addresses these issues. Using a parallel distributed processing framework we propose that the attributes of automaticity depend upon the strength of a processing pathway and that strength increases with training. Using the Stroop effect as an example, we show how automatic processes are continuous and emerge gradually with practice. Specifically, we present a computational model of the Stroop task which simulates the time course of processing as well as the effects of learning. This was accomplished by combining the cascade mechanism described by McClelland (1979) with the back propagation learning algorithm (Rumelhart, Hinton, & Williams, 1986). The model is able to simulate performance in the standard Stroop task, as well as aspects of performance in variants of this task which manipulate SOA, response set, and degree of practice. In the discussion we contrast our model with other models, and indicate how it relates to many of the central issues in the literature on attention, automaticity, and interference.
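    The cascade mechanism borrowed from McClelland (1979) updates each unit's activation as a running average of its net input, so stronger (faster-rate) pathways approach their asymptote sooner; a sketch:

```python
# Cascade dynamics: activation(t) = activation(t-1) + rate*(net - activation(t-1)).
# A higher rate stands in for a stronger, better-trained pathway, which
# reaches its asymptotic activation faster, the continuous view of automaticity.

def cascade(net_input, rate=0.1, steps=50):
    a = 0.0
    trajectory = []
    for _ in range(steps):
        a += rate * (net_input - a)
        trajectory.append(a)
    return trajectory

slow_pathway = cascade(1.0, rate=0.1)   # e.g. colour naming
fast_pathway = cascade(1.0, rate=0.3)   # e.g. word reading
```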

  11. Functionality and Performance Visualization of the Distributed High Quality Volume Renderer (HVR)

    KAUST Repository

    Shaheen, Sara

    2012-07-01

    Volume rendering systems are designed to enable scientists and a variety of experts to interactively explore volume data through 3D views of the volume. However, volume rendering techniques are computationally intensive tasks. Parallel distributed volume rendering systems and multi-threading architectures have been suggested as natural solutions to provide acceptable volume rendering performance for very large volume data sizes, such as Electron Microscopy (EM) data. This in turn adds another level of complexity when developing and manipulating volume rendering systems. Given that distributed parallel volume rendering systems are among the most complex systems to develop, trace and debug, traditional debugging tools do not provide enough support. As a consequence, there is a great demand for tools that facilitate the manipulation of such systems. This can be achieved by utilizing the power of computer graphics to design visual representations that reflect how the system works and that visualize its current performance state. The work presented falls within the field of software visualization, where visualization is used to aid the understanding of software systems. This thesis presents a number of visual representations that reflect functionality and performance aspects of the distributed HVR, a high quality volume renderer that uses various techniques to visualize large volume sizes interactively. The work visualizes different stages of the parallel volume rendering pipeline of HVR, along with means of performance analysis through a number of flexible and dynamic visualizations that reflect the current state of the system and enable manipulation of it at runtime. These visualizations aim to facilitate debugging, understanding and analyzing the distributed HVR.

  12. Advances in automatic data analysis capabilities

    International Nuclear Information System (INIS)

    Benson, J.; Bipes, T.; Udpa, L.

    2009-01-01

    Utilities perform eddy current tests on nuclear power plant steam generator (SG) tubes to detect degradation. This paper summarizes Electric Power Research Institute (EPRI) research to develop signal-processing algorithms that automate the analysis of eddy current test data. The research focuses on analyzing rotating probe and array probe data for detecting, classifying, and characterizing degradation in SG tubes. Automated eddy current data analysis systems for bobbin coil probe data have been available for more than a decade. However, automated data analysis systems for rotating and array probes have developed slowly because of the complexities of the inspection parameters associated with the data. Manual analysis of rotating probe data has been shown to be inconsistent and time consuming when flaw depth profiles are generated. Algorithms have been developed for detection of nearly all common steam generator degradation mechanisms. Included in the latest version of the developed software is the ability to perform automated defect profiling, which is useful in tube integrity determinations. Profiling performed manually can be time consuming, whereas automated profiling is performed in a fraction of the time and is much more repeatable. Recent advances in eddy current probe development have resulted in an array probe design capable of high-speed data acquisition over the full length of SG tubes. Probe qualification programs have demonstrated that array probes are capable of providing degradation detection capabilities similar to those of rotating probe technology. However, to date, utilities have not used the array probe in the field on a large-scale basis due to the large amount of data analyst resources and time required to process the vast quantity of data generated by the probe. To address this obstacle, EPRI initiated a program to develop automatic data analysis algorithms for rotating and array probes. During the development process for both rotating and array

  13. Automatic Earthquake Detection by Active Learning

    Science.gov (United States)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
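    The query-selection core of uncertainty-based active learning can be sketched as follows (a toy detector and candidate pool; not the authors' system):

```python
# Uncertainty sampling: from a pool of unlabeled candidates, query the
# human expert on the sample whose predicted probability is closest to
# 0.5, i.e. where the current detector is least sure.

def most_uncertain(pool, prob):
    """Index of the pool item the model is least confident about."""
    return min(range(len(pool)), key=lambda i: abs(prob(pool[i]) - 0.5))

# Toy probabilistic detector (hypothetical model): the probability that a
# waveform snippet is an earthquake grows with its peak amplitude.
def p_quake(snippet):
    peak = max(abs(v) for v in snippet)
    return min(1.0, peak / 10.0)

pool = [[0.1, -0.2, 0.1],    # p = 0.02, confidently noise
        [9.5, -8.0, 7.2],    # p = 0.95, confidently an earthquake
        [5.2, -4.8, 3.0]]    # p = 0.52, near the decision boundary
query_index = most_uncertain(pool, p_quake)
```

    The expert's label for the queried sample is then added to the training set and the detector is retrained, so each query is spent where it most improves the decision boundary.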

  14. Automatic locking orthotic knee device

    Science.gov (United States)

    Weddendorf, Bruce C. (Inventor)

    1993-01-01

    An articulated tang-in-clevis joint for incorporation in newly manufactured conventional strap-on orthotic knee devices, or for replacing such joints in conventional strap-on orthotic knee devices, is discussed. The instant tang-in-clevis joint allows the user the freedom to extend and bend the knee normally when no load (weight) is applied to the knee, and automatically locks the knee when the user transfers weight to it, thus preventing a damaged knee from bending uncontrollably when weight is applied. The tang-in-clevis joint of the present invention includes first and second clevis plates, a tang assembly, and a spacer plate secured between the clevis plates. Each clevis plate includes a bevelled serrated upper section. A bevelled shoe is secured to the tang in close proximity to the bevelled serrated upper section of the clevis plates. A coiled spring mounted within an oblong bore of the tang normally urges the shoes secured to the tang out of engagement with the serrated upper section of each clevis plate to allow rotation of the tang relative to the clevis plates. When weight is applied to the joint, the load compresses the coiled spring, and the serrations on each clevis plate dig into the bevelled shoes secured to the tang to prevent relative movement between the tang and clevis plates. A shoulder is provided on the tang and the spacer plate to prevent overextension of the joint.

  15. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Full Text Available Liquid nitrogen is one of the major substances used as a chiller in industries such as ice cream factories, milk dairies, blood-sample storage, and blood banks. It helps maintain the product at a low temperature for preservation. LN2 cannot be fully utilised: in practice, if 3.75 litres of LN2 are used in a single day, around 12% (450 ml) is wasted through vaporisation. A pressure relief valve is provided to create a pressure difference; if there is no pressure difference between the cylinder carrying the LN2 and its surroundings, the result is damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is currently carried out manually, so care must be taken during transmission to avoid wastage. With the concept presented in this project, the transmission of LN2 is carried out automatically, so as to reduce the wastage incurred by manual operation.

  16. Automatic segmentation of the colon

    Science.gov (United States)

    Wyatt, Christopher L.; Ge, Yaorong; Vining, David J.

    1999-05-01

    Virtual colonoscopy is a minimally invasive technique that enables detection of colorectal polyps and cancer. Normally, a patient's bowel is prepared with colonic lavage and gas insufflation prior to computed tomography (CT) scanning. An important step for 3D analysis of the image volume is segmentation of the colon. The high-contrast gas/tissue interface that exists in the colon lumen makes segmentation of the majority of the colon relatively easy; however, two factors inhibit automatic segmentation of the entire colon. First, the colon is not the only gas-filled organ in the data volume: lungs, small bowel, and stomach also meet this criterion. User-defined seed points placed in the colon lumen have previously been required to spatially isolate only the colon. Second, portions of the colon lumen may be obstructed by peristalsis, large masses, and/or residual feces. These complicating factors require increased user interaction during the segmentation process to isolate additional colon segments. To automate the segmentation of the colon, we have developed a method to locate seed points and segment the gas-filled lumen with no user supervision. We have also developed an automated approach to improve lumen segmentation by digitally removing residual contrast-enhanced fluid resulting from a new bowel preparation that liquefies and opacifies any residual feces.
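Segmenting a gas-filled lumen from a seed point is commonly implemented as a flood fill over voxels below an air threshold. The 2D sketch below uses an assumed Hounsfield-unit cutoff for illustration, not the paper's actual parameters:

```python
from collections import deque

def grow_region(volume, seed, air_threshold=-800):
    """Flood-fill the gas-filled lumen from a seed voxel: collect all
    4-connected pixels whose CT value is below the air threshold (HU)."""
    rows, cols = len(volume), len(volume[0])
    region, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if volume[r][c] >= air_threshold:
            continue
        region.add((r, c))
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region
```

Automatic seed finding, as described in the abstract, would then amount to selecting connected gas components that match colon-specific criteria rather than lungs or stomach.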

  17. Automatic panoramic thermal integrated sensor

    Science.gov (United States)

    Gutin, Mikhail A.; Tsui, Eddy K.; Gutin, Olga N.

    2005-05-01

    Historically, the US Army has recognized the advantages of panoramic imagers with high image resolution: increased area coverage with fewer cameras, instantaneous full horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The novel ViperViewTM high-resolution panoramic thermal imager is the heart of the Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC) in support of the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to improve situational awareness (SA) in many defense and offensive operations, as well as serve as a sensor node in tactical Intelligence Surveillance Reconnaissance (ISR). The ViperView is an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640x480-pixel IR camera, with improved image quality for longer range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS sensor suite include ancillary sensors, advanced power management, and wakeup capability. This paper describes the development status of the APTIS system.

  18. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation; in practice, scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, based on the skin's Tyndall effect, to eliminate reflections during imaging, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to extract texture and color features. In this step, an image-roughness feature has been defined so that scaling can easily be separated from normal skin. In the end, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the data set provided by Union Hospital, more than 90% of images can be segmented accurately.
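The paper's roughness feature is not specified in this record; a local standard deviation over a sliding window is a common minimal choice for separating rough scaling from smooth skin, and the sketch below assumes that definition:

```python
import numpy as np

def roughness(image, win=3):
    """Local standard deviation inside a sliding window as a simple
    texture-roughness feature (a hypothetical stand-in for the paper's
    definition): flat skin gives low values, scaling gives high values."""
    pad = win // 2
    padded = np.pad(image.astype(float), pad, mode='edge')
    out = np.empty(image.shape, dtype=float)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            out[r, c] = padded[r:r + win, c:c + win].std()
    return out
```

Each pixel's roughness value would then join the color features as one column of the per-window feature vector fed to the random forest.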

  19. Research and Development of Fully Automatic Alien Smoke Stack and Packaging System

    Science.gov (United States)

    Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu

    2017-12-01

    To address the low efficiency of manual sorting and packaging at current tobacco distribution centers, a safe, efficient, fully automatic shaped-cigarette stacking and packaging system was developed. The system combines PLC control technology, servo control technology, robot technology, image recognition technology, and human-computer interaction technology. The characteristics, principles, control process, and key technologies of the system are discussed in detail. Installation and commissioning showed that the fully automatic shaped-cigarette stacking and packaging system performs well and meets the requirements for shaped cigarettes.

  20. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input and performs a registration over scale and in-plane rotation fully automatically.

  1. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the basal ganglia (BG) in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  2. Upgradation in N2 gas station for TLD personnel monitoring using gas based semi-automatic badge readers- installation and integration of N2 gas generator plant to the system and its performance appraisal

    International Nuclear Information System (INIS)

    Presently, personnel monitoring against external radiation is carried out using a CaSO4:Dy phosphor-based TLD badge and a hot N2 gas-based semiautomatic TLD badge reader (TLDBR-7B). This system requires a supply of high-purity N2 gas for the operation of the reader. This gas is normally obtained from N2 gas cylinders, the use of which is cumbersome, tedious and cost-intensive, and involves safety hazards. To minimize the dependence on conventional gas cylinders, a medium-sized plant for generating N2 gas directly from air has been installed and coupled to the gas station and distribution system. The paper describes a comparative study of the performance of the TLD reading system using gas from the generator with that using gas cylinders, which has been found to be quite comparable. This upgradation has helped in a drastic reduction in cost and labour, and improved safety in the TLD Laboratory's working. (author)

  3. Comparison of automatic and visual methods used for image segmentation in Endodontics: a microCT study.

    Science.gov (United States)

    Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz

    2017-01-01

    To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required; these values can be determined by visual or automatic methods. Visual determination is influenced by the operator's visual acuity, while the automatic method is done entirely by computer algorithms. The aims were to compare visual and automatic segmentation, and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between the visual and automatic segmentation methods regarding root canal volume measurements (p=0.93) and root canal surface (p=0.79). Although both visual and automatic segmentation methods can be used to determine the threshold and calculate root canal volume and surface, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.
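The record does not say which algorithm the scanner's "Automatic Threshold Tool" uses; Otsu's method is a standard choice for automatic global thresholding and serves here only as an illustrative sketch:

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Automatic global threshold by Otsu's method: choose the histogram
    bin center that maximizes the between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                 # class-0 (below threshold) probability
    mu = np.cumsum(p * centers)       # cumulative mean
    mu_t = mu[-1]                     # global mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros(nbins)
    between[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]
```

Applied to the gray-level histogram of a microCT volume, the returned threshold would separate canal voxels from dentin without operator input.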

  4. Stay Focused! The Effects of Internal and External Focus of Attention on Movement Automaticity in Patients with Stroke

    NARCIS (Netherlands)

    Kal, E. C.; van der Kamp, J.; Houdijk, H.; Groet, E.; van Bennekom, C. A. M.; Scherder, E. J. A.

    2015-01-01

    Dual-task performance is often impaired after stroke. This may be resolved by enhancing patients' automaticity of movement. This study sets out to test the constrained action hypothesis, which holds that automaticity of movement is enhanced by triggering an external focus (on movement effects),

  5. Dynamic Anthropometry – Defining Protocols for Automatic Body Measurement

    Directory of Open Access Journals (Sweden)

    Slavenka Petrak

    2017-12-01

    Full Text Available The paper presents research on the possibilities of protocol development for automatic computer-based determination of measurements on a 3D body model in defined dynamic positions. Initially, two dynamic body positions were defined for the research on dimensional changes of targeted body lengths and surface segments during body movement from the basic static position into a selected dynamic body position. The assumption was that during body movement, specific length and surface dimensions would change significantly from the aspect of clothing construction and the functionality of a garment model. 3D body scanning of a female test sample was performed in the basic static and the two defined dynamic positions. The 3D body models were processed and measurement points were defined as a starting point for the determination of characteristic body measurements. The protocol for automatic computer measurement was defined for every dynamic body position by a systematic set of activities based on the determined measurement points. The verification of the developed protocols was performed by automatic determination of the defined measurements on the test sample and by comparing the results with conventional manual measurement.

  6. Automatic adjustment of astrochronologic correlations

    Science.gov (United States)

    Zeeden, Christian; Kaboth, Stefanie; Hilgen, Frederik; Laskar, Jacques

    2017-04-01

    Here we present an algorithm for the automated adjustment and optimisation of correlations between proxy data and an orbital tuning target (or similar datasets, e.g. ice models) for the R environment (R Development Core Team 2008), building on the 'astrochron' package (Meyers et al., 2014). The basis of this approach is an initial tuning on an orbital (precession, obliquity, eccentricity) scale. We use filters of the orbital frequency ranges of the data related to e.g. precession, obliquity or eccentricity, and compare these filters to an ensemble of target data, which may consist of e.g. different combinations of obliquity and precession, different phases of precession and obliquity, a mix of orbital and other data (e.g. ice models), or different orbital solutions. This approach allows for the identification of an ideal mix of precession and obliquity to be used as the tuning target. In addition, the uncertainty related to different tuning tie points (and also the precession and obliquity contributions of the tuning target) can easily be assessed. Our message is to suggest an initial tuning and then obtain a reproducible tuned time scale, avoiding arbitrarily chosen tie points and replacing them with automatically chosen ones representing filter maxima (or minima). We present and discuss the approach outlined above and apply it to artificial and geological data. Artificial data are assessed to find optimal filter settings; real datasets are used to demonstrate the possibilities of such an approach. References: Meyers, S.R. (2014). Astrochron: An R Package for Astrochronology. http://cran.r-project.org/package=astrochron R Development Core Team (2008). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0, URL http://www.R-project.org.

  7. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    OpenAIRE

    Zhang, Jing; Lipp, Ottmar V.; Hu, Ping

    2017-01-01

    The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study was a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between-subjects design. Participants were assigned to two groups according to their performance on an emotion regulation-IAT (differentiating automatic emotion control tend...

  8. Color Image Segmentation Based on Different Color Space Models Using Automatic GrabCut

    OpenAIRE

    Khattab, Dina; Ebied, Hala Mousher; Hussein, Ashraf Saad; Tolba, Mohamed Fahmy

    2014-01-01

    This paper presents a comparative study using different color spaces to evaluate the performance of color image segmentation using the automatic GrabCut technique. GrabCut is considered as one of the semiautomatic image segmentation techniques, since it requires user interaction for the initialization of the segmentation process. The automation of the GrabCut technique is proposed as a modification of the original semiautomatic one in order to eliminate the user interaction. The automatic Gra...

  9. Prototype Design and Application of a Semi-circular Automatic Parking System

    OpenAIRE

    Atacak, Ismail; Erdogdu, Ertugrul

    2017-01-01

    Nowadays, with the increasing population in urban areas, the number of vehicles used in traffic has also increased in these areas. This has brought with it major problems that are caused by insufficient parking areas, in terms of traffic congestion, drivers and environment. In this study, in order to overcome these problems, a multi-storey automatic parking system that automatically performs vehicle recognition, vehicle parking, vehicle delivery and pricing processes has been designed and the...

  10. Design of an automatic sample changer for the measurement of neutron flux by gamma spectrometry

    International Nuclear Information System (INIS)

    Gago, Javier; Bruna, Ruben; Baltuano, Oscar; Montoya, Eduardo; Descreaux, Killian

    2014-01-01

    This paper presents the calculations, selection, and component design for the construction of an automatic system to measure neutron flux in a working nuclear reactor by the gamma spectrometry technique, using samples irradiated in the RP-10 core. This system will perform measurements by interchanging 100 samples in a programmed and automatic way, reducing user operation time and obtaining more accurate measurements. (authors).

  11. Automatic Number Plate Recognition System for IPhone Devices

    Directory of Open Access Journals (Sweden)

    Călin Enăchescu

    2013-06-01

    Full Text Available This paper presents a system for automatic number plate recognition, implemented for devices running the iOS operating system. The methods used for number plate recognition are based on existing methods, but optimized for devices with low hardware resources. To solve the task of automatic number plate recognition, we have divided it into the following subtasks: image acquisition, localization of the number plate position in the image, and character detection. The first subtask is performed by the camera of an iPhone, the second is done using image pre-processing methods and template matching. For the character recognition we use a feed-forward artificial neural network. Each of these methods is presented along with its results.
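Template matching, as used in the plate localization step, can be sketched with normalized cross-correlation; the brute-force search below is illustrative, not the optimized iOS implementation:

```python
import numpy as np

def match_template(image, template):
    """Normalized cross-correlation: slide the template over the image
    and return the top-left position and score of the best match."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```

A score of 1.0 indicates a perfect match up to brightness and contrast, which is what makes the measure robust to the uneven lighting of roadside plate images.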

  12. Automatic Capture Verification in Pacemakers (Autocapture – Utility and Problems

    Directory of Open Access Journals (Sweden)

    Ruth Kam

    2004-04-01

    Full Text Available The concept of a closed-loop feedback system that would automatically assess the pacing threshold and self-adjust the pacing output to ensure consistent myocardial capture has many appeals. Enhancing patient safety in cases of an unexpected rise in threshold, reducing current drain and hence prolonging battery longevity, and reducing the amount of physician intervention required are just some of the advantages. Autocapture (AC) is a proprietary algorithm developed by St Jude Medical CRMD, Sylmar, CA, USA (SJM), which was the first to commercially provide these automatic functions in a single-chamber pacemaker (Microny and Regency), and subsequently in dual-chamber pacemakers (the Affinity, Entity and Identity families of pacemakers). This article reviews the conditions necessary for AC verification and performance, and the problems encountered in clinical practice.

  13. Feature-based automatic color calibration for networked camera system

    Science.gov (United States)

    Yamamoto, Shoji; Taki, Keisuke; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi

    2011-01-01

    In this paper, we have developed a feature-based automatic color calibration using area-based detection and an adaptive nonlinear regression method. Simple chartless color matching is achieved by using the characteristic overlap of the image areas of the cameras. Accurate detection of a common object is achieved by area-based detection that combines MSER with SIFT. Adaptive color calibration using the color of the detected object is computed by a nonlinear regression method. This method can indicate the contribution of the object's color to the color calibration, and automatic selection notification for the user is performed by this function. Experimental results show that the accuracy of the calibration improves gradually. This method can stand up to practical use in multi-camera color calibration if enough samples are obtained.

  14. Automatic control of biomass gasifiers using fuzzy inference systems

    Energy Technology Data Exchange (ETDEWEB)

    Sagues, C. [Universidad de Zaragoza (Spain). Dpto. de Informatica e Ingenieria de Sistemas; Garcia-Bacaicoa, P.; Serrano, S. [Universidad de Zaragoza (Spain). Dpto. de Ingenieria Quimica y Medio Ambiente

    2007-03-15

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need models to be tuned, a plant model is proposed, which has turned out to be very useful for testing different combinations of membership functions and rules in the proposed fuzzy controller. The global control scheme is shown, including the elements that generate the set points for the process variables automatically. The type of biomass and its moisture content are the only data that need to be introduced to the controller by a human operator at the beginning of operation to make it work autonomously. The advantages and good performance of the fuzzy controller with automatic generation of set points, compared to controllers using fixed parameters, are demonstrated. (author)
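A minimal Sugeno-style rule base shows the mechanics of such a controller; the membership ranges and air-flow consequents below are invented for illustration and are not the gasifier rules from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_air_control(temp_error):
    """Three-rule fuzzy controller (illustrative): map gasifier temperature
    error (setpoint minus measurement, in K) to an air-flow correction (%)
    by weighted-average defuzzification of singleton consequents."""
    w_neg = tri(temp_error, -100.0, -50.0, 0.0)   # "error is negative"
    w_zero = tri(temp_error, -50.0, 0.0, 50.0)    # "error is near zero"
    w_pos = tri(temp_error, 0.0, 50.0, 100.0)     # "error is positive"
    # Consequents: increase (+10%), hold (0%), decrease (-10%) air flow.
    num = 10.0 * w_neg + 0.0 * w_zero - 10.0 * w_pos
    den = w_neg + w_zero + w_pos
    return num / den if den > 0 else 0.0
```

Intermediate errors fire two rules at once, and the weighted average interpolates smoothly between their consequents, which is the behavior that makes fuzzy controllers forgiving of imprecise plant models.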

  15. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    Full Text Available There are increasing applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chessboard, which is found automatically in each image, even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the algorithm proposed as a valuable tool for camera calibration.

  16. Development of an Automatic Dispensing System for Traditional Chinese Herbs

    Directory of Open Access Journals (Sweden)

    Chi-Ying Lin

    2017-01-01

    Full Text Available The gathering of ingredients for decoctions of traditional Chinese herbs still relies on manual dispensation, due to the irregular shapes of many items and inconsistencies in weights. In this study, we developed an automatic dispensing system for Chinese herbal decoctions with the aim of reducing manpower costs and the risk of mistakes. We employed machine vision in conjunction with a robot manipulator to facilitate the grasping of ingredients. The name and formulation of the decoction are input via a human-computer interface, and the dispensing of multiple medicine packets is performed automatically. An off-line least-squares curve fitting method was used to calculate the amount of material grasped by the claws and thereby improve system efficiency as well as the accuracy of individual dosages. Experiments on the dispensing of actual ingredients demonstrate the feasibility of the proposed system.
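The off-line least-squares calibration step can be sketched as a polynomial fit from claw opening to grasped weight; the variable names and the dose-lookup helper below are hypothetical, not the paper's formulation:

```python
import numpy as np

def fit_grasp_model(claw_openings, measured_weights, degree=2):
    """Off-line least-squares fit (illustrative): map claw opening to the
    weight of herb material grasped, from bench calibration data."""
    return np.polyfit(claw_openings, measured_weights, degree)

def opening_for_dose(coeffs, target_weight, search=np.linspace(0, 10, 1001)):
    """Invert the fitted curve numerically: pick the claw opening whose
    predicted grasped weight is closest to the target dose."""
    pred = np.polyval(coeffs, search)
    return float(search[np.argmin(np.abs(pred - target_weight))])
```

At dispensing time the controller would look up the opening for each requested dose, so per-packet accuracy depends only on the quality of the one-time calibration.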

  17. AUTOMATIC EXTRACTION OF BUILDING OUTLINE FROM HIGH RESOLUTION AERIAL IMAGERY

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2016-06-01

    Full Text Available In this paper, a new approach for the automated extraction of building boundaries from high-resolution imagery is proposed. The proposed approach uses both geometric and spectral properties of a building to detect and locate buildings accurately. It consists of automatic generation of a high-quality point cloud from the imagery, building detection from the point cloud, classification of building roofs, and generation of building outlines. The point cloud is generated from the imagery automatically using semi-global image matching technology. Buildings are detected from the differential surface generated from the point cloud. Further classification of building roofs is performed in order to generate accurate building outlines. Finally, the classified building roofs are converted into vector format. Numerous tests have been performed on images of different locations, and the results are presented in the paper.

  18. Longitudinal automatic control system for a light weight aircraft

    Directory of Open Access Journals (Sweden)

    Cristian VIDAN

    2016-12-01

    Full Text Available This paper presents the design of an automatic control system for the longitudinal axis of a light-weight aircraft. To achieve this goal, it is important to start from the mathematical model in the longitudinal plane and then to determine the steady-state parameters for a given velocity and altitude. Using MATLAB software, the mathematical model in the longitudinal plane was linearized and the system transfer functions were obtained. To determine the automatic control design, we analyzed the stability of the linearized model for each input. After the stability problem was solved, we designed the control system architecture using MATLAB-Simulink software; the objective for stable flight was to continuously adjust the pitch angle θ through control of the elevator, and the velocity through control of the throttle. Finally, we analyzed the performance of the designed longitudinal control system; the results, highlighted in graphs, show that the purpose for which it was designed was fulfilled.
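The pitch-hold objective can be illustrated with a toy discrete PID loop on a crudely simplified pitch model; the gains and dynamics below are invented for illustration and bear no relation to the paper's linearized aircraft model:

```python
def simulate_pitch_hold(theta_ref=5.0, steps=1500, dt=0.01,
                        kp=4.0, ki=1.0, kd=0.5):
    """Toy pitch-hold loop: a discrete PID on the pitch-angle error drives
    the elevator command; the airframe is reduced to a damped pitch-rate
    integrator. Returns the final pitch angle after the simulation."""
    theta, q = 0.0, 0.0              # pitch angle [deg], pitch rate [deg/s]
    integral, prev_err = 0.0, theta_ref
    for _ in range(steps):
        err = theta_ref - theta
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        elevator = kp * err + ki * integral + kd * deriv
        q += (2.0 * elevator - 1.5 * q) * dt   # crude short-period dynamics
        theta += q * dt
    return theta
```

In the paper's setting the same error-driven structure would be realized as a Simulink loop around the linearized elevator-to-pitch transfer function, with a second loop from throttle to velocity.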

  19. Development of automatic inspection robot for nuclear power plants

    International Nuclear Information System (INIS)

    Yamada, K.; Suzuki, K.; Saitoh, K.; Sakaki, T.; Ohe, Y.; Mizutani, T.; Segawa, M.; Kubo, K.

    1987-01-01

    This robot system has been developed for the automatic inspection of nuclear power plants. The system configuration is composed of a vehicle that runs on a monorail, sensors on the vehicle, an image processor that processes the image information from the sensors, a computer that creates the robot's inspection plan, and an operation panel. The system has two main features. The first is the robot control system: the vehicle and the sensors are controlled by output data calculated in the computer from three-dimensional plant data. The second is malfunction recognition: malfunctions are recognized by combining the results of image processing with information from the microphone and infrared camera. Tests of a prototype automatic inspection robot system have been performed in a simulated main steam piping room of a nuclear power plant

  20. Flexible Automatic Discretization for Finite Differences: Eliminating the Human Factor

    Science.gov (United States)

    Pranger, Casper

    2017-04-01

    In the geophysical numerical modelling community, finite differences are (in part due to their small footprint) a popular spatial discretization method for PDEs in the regular-shaped continuum that is the earth. However, they rapidly become prone to programming mistakes when the physics increases in complexity. To eliminate opportunities for human error, we have designed an automatic discretization algorithm using Wolfram Mathematica, in which the user supplies symbolic PDEs, the number of spatial dimensions, and a choice of symbolic boundary conditions, and the script transforms this information into matrix and right-hand-side rules ready for use in a C++ code that will accept them. The symbolic PDEs are further used to automatically develop and perform manufactured-solution benchmarks, ensuring physical fidelity at all stages while providing pragmatic targets for numerical accuracy. We find that this procedure greatly accelerates code development and provides a great deal of flexibility in one's choice of physics.
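The core of any automatic finite-difference discretization is generating stencil weights from the Taylor expansion. A minimal numeric version (sketched in Python rather than the abstract's Mathematica) solves the Vandermonde system directly:

```python
import numpy as np
from math import factorial

def fd_coefficients(offsets, order):
    """Finite-difference weights for the order-th derivative at x = 0 on an
    arbitrary stencil, by solving the Taylor-expansion (Vandermonde) system
    sum_j w_j * x_j**k = k! * delta(k, order)."""
    offsets = np.asarray(offsets, dtype=float)
    n = len(offsets)
    A = np.vander(offsets, n, increasing=True).T   # A[k, j] = offsets[j]**k
    b = np.zeros(n)
    b[order] = factorial(order)
    return np.linalg.solve(A, b)
```

For the stencil [-1, 0, 1] this reproduces the familiar central-difference weights [-1/2, 0, 1/2] for the first derivative and [1, -2, 1] for the second; a symbolic pipeline then assembles such weights into the matrix and right-hand-side rules mentioned in the abstract.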

  1. Coherence measures in automatic time-migration velocity analysis

    International Nuclear Information System (INIS)

    Maciel, Jonathas S; Costa, Jessé C; Schleicher, Jörg

    2012-01-01

    Time-migration velocity analysis can be carried out automatically by evaluating the coherence of migrated seismic events in common-image gathers (CIGs). The performance of gradient methods for automatic time-migration velocity analysis depends on the coherence measures used as the objective function. We compare the results of four different coherence measures: conventional semblance, differential semblance, an extended differential semblance using differences of more distant image traces, and the product of the latter with conventional semblance. In our numerical experiments, the objective functions based on conventional semblance and on the product of conventional semblance with extended differential semblance provided the best velocity models, as evaluated by the flatness of the resulting CIGs. The method can be easily extended to anisotropic media. (paper)
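Conventional semblance, the first of the four measures, can be sketched directly from its standard definition as stacked energy over total energy in a sliding time window; the window length here is an arbitrary illustrative choice:

```python
import numpy as np

def semblance(gather, window=5):
    """Conventional semblance of a CIG (time samples x traces): ratio of
    stacked energy to total energy in a sliding time window. Perfectly
    flat events give 1.0; incoherent events give values near 0."""
    nt, ntraces = gather.shape
    half = window // 2
    out = np.zeros(nt)
    for t in range(nt):
        lo, hi = max(0, t - half), min(nt, t + half + 1)
        num = (gather[lo:hi].sum(axis=1) ** 2).sum()   # energy of the stack
        den = ntraces * (gather[lo:hi] ** 2).sum()     # total trace energy
        out[t] = num / den if den > 0 else 0.0
    return out
```

An automatic velocity analysis would evaluate this (or one of the differential variants) over candidate migration velocities and follow its gradient toward the velocity that flattens the gather.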

  2. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system composed of scripts written in Perl, C shell, and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise, the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  3. Spike Pattern Recognition for Automatic Collimation Alignment

    CERN Document Server

    Azzopardi, Gabriella; Salvachua Ferrando, Belen Maria; Mereghetti, Alessio; Redaelli, Stefano; CERN. Geneva. ATS Department

    2017-01-01

    The LHC makes use of a collimation system to protect its sensitive equipment by intercepting potentially dangerous beam-halo particles. The collimator settings that protect the machine against beam losses rely on a very precise alignment of all the collimators with respect to the beam. The beam center at each collimator is found by touching the beam halo in an alignment procedure. Until now, in order to determine whether a collimator is aligned with the beam, a user has been required to follow the collimator's BLM loss data and detect spikes. A machine learning (ML) model was trained to automatically recognize spikes when a collimator is aligned. The model was loosely integrated with the alignment implementation to determine its classification performance and reliability, without affecting the alignment process itself. The model was tested on a number of collimators during this MD and was able to output the classifications in real time.

  4. Automatic Genre Classification of Musical Signals

    Science.gov (United States)

    Barbedo, Jayme Garcia sArnal; Lopes, Amauri

    2006-12-01

    We present a strategy to perform automatic genre classification of musical signals. The technique divides the signals into 21.3-millisecond frames, from which 4 features are extracted. The values of each feature are aggregated over 1-second analysis segments. Statistics of the features along each analysis segment are used to determine a vector of summary features that characterizes the respective segment. Next, a classification procedure uses those vectors to differentiate between genres. The classification procedure has two main characteristics: (1) a very wide and deep taxonomy, which allows a very meticulous comparison between different genres, and (2) a wide pairwise comparison of genres, which emphasizes the differences between each pair of genres. The procedure points out the genre that best fits the characteristics of each segment. The final classification of the signal is given by the genre that appears most often across all signal segments. The approach has shown very good accuracy even for the lowest layers of the hierarchical structure.
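The final decision rule described in this abstract, label each 1-second segment and then assign the whole signal the genre that wins the most segments, can be sketched as a simple majority vote. The segment classifier here is a hypothetical stand-in, not the authors' pairwise comparator:

```python
from collections import Counter

def classify_signal(segment_features, classify_segment):
    """Label every analysis segment, then pick the most frequent genre."""
    votes = Counter(classify_segment(f) for f in segment_features)
    top_genre, _count = votes.most_common(1)[0]
    return top_genre

# Hypothetical segment classifier: threshold on a single summary feature.
def toy_classifier(feature):
    return "classical" if feature < 0.5 else "rock"

# Three of four segments fall below the threshold, so the vote is "classical".
genre = classify_signal([0.2, 0.9, 0.4, 0.3], toy_classifier)
```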

  5. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high-end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detectors. For BNL this display is of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on the BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphics development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detectors and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...

  6. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    CONTROLO 2016, the 12th Portuguese Conference on Automatic Control, held in Guimarães, Portugal, September 14th to 16th, was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control (APCA), the national member organization of the International Federation of Automatic Control (IFAC). The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  7. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  8. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  9. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  10. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  11. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  12. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing, which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammar. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to
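The blackbox idea above, enumerating every input a grammar can produce up to a size bound, can be illustrated with a toy recursive generator. The command grammar here is hypothetical and far simpler than the SCL grammar:

```python
def generate(symbol, grammar, max_depth):
    """Yield all strings derivable from `symbol` within `max_depth` expansions.
    Symbols absent from the grammar are treated as terminals."""
    if symbol not in grammar:
        yield symbol
        return
    if max_depth == 0:
        return
    for production in grammar[symbol]:
        # Expand the production left to right, combining all alternatives.
        results = [""]
        for part in production:
            results = [r + s for r in results
                       for s in generate(part, grammar, max_depth - 1)]
        yield from results

# Hypothetical command grammar: CMD -> VERB " " ARG
toy_grammar = {
    "CMD":  [["VERB", " ", "ARG"]],
    "VERB": [["open"], ["close"]],
    "ARG":  [["valve1"], ["valve2"]],
}
# All legal scripts up to depth 3: every verb/argument combination.
scripts = sorted(set(generate("CMD", toy_grammar, 3)))
```

Feeding each generated script to the system under test is what "exercising the system" amounts to in the blackbox approach.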

  13. Perseveration causes automatization of checking behavior in obsessive-compulsive disorder.

    Science.gov (United States)

    Dek, Eliane C P; van den Hout, Marcel A; Engelhard, Iris M; Giele, Catharina L; Cath, Danielle C

    2015-08-01

    Repeated checking leads to reductions in meta-memory (i.e., memory confidence, vividness, and detail) and automatization of checking behavior (Dek, van den Hout, Giele, & Engelhard, 2014, 2015). Dek et al. (2014) suggested that this is caused by increased familiarity with the checked stimuli. They predicted that defamiliarization of checking by modifying the perceptual characteristics of stimuli would cause de-automatization and attenuate the negative meta-memory effects of re-checking. However, their results were inconclusive. The present study investigated whether repeated checking leads to automatization of checking behavior, and whether defamiliarization indeed leads to de-automatization and attenuation of meta-memory effects in patients with OCD and healthy controls. Participants performed a checking task, in which they activated, deactivated, and checked threat-irrelevant stimuli. During pre- and post-test checking trials, check duration was recorded and a reaction time task was simultaneously administered as a dual task to assess automatization. After the pre- and post-test checking trials, meta-memory was rated. Results showed that relevant checking led to automatization of checking behavior on the RT measure and to negative meta-memory effects for patients and controls. Defamiliarization led to de-automatization measured with the RT task, but did not attenuate the negative meta-memory effects of repeated checking. Clinical implications are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Studies on the calibration of mammography automatic exposure mode with computed radiology

    International Nuclear Information System (INIS)

    Zhu Hongzhou; Shao Guoliang; Shi Lei; Liu Qing

    2010-01-01

    Objective: To optimize image quality and radiation dose by calibrating automatic mammography exposure, according to the automatic exposure control mode of a film-screen mammography system. Methods: The film-screen system (28 kV) was used to perform automatic exposure of 40-mm plexiglass and obtain the standard exposure dose; the CR exposure mode based on LgM = 2.0 was then calibrated in 10 steps. A mammary gland phantom (Fluke NA18-220) was examined with CR (26, 28, and 30 kV) using the corrected automatic exposure mode, and the exposure values (mAs) were recorded. The CR images were assessed and scored in a double-blind way by 4 radiologists according to the American College of Radiology (ACR) standard. Results: Based on a standard CR automatic exposure dose higher than the traditional exposure of the film-screen system, the calibration of mammography automatic exposure was accomplished. The test results of the calibrated mode were better than the ACR scoring criteria. Conclusions: The comparative study showed improvement in acquiring high-quality images and a reduction of radiation dose. The corrected mammography automatic exposure mode may be a better method for clinical use. (authors)

  15. Automatic safety rod for reactors. [LMFBR

    Science.gov (United States)

    Germer, J.H.

    1982-03-23

    An automatic safety rod for a nuclear reactor, containing neutron-absorbing material and designed to be inserted into a reactor core after a loss of flow. Actuation is based upon either a sudden decrease in core pressure drop or a pressure drop below a predetermined minimum value. The automatic safety rod includes a pressure regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  16. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  17. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious, are subject to capacity limitations, and can easily be adapted to situational circumstances. Both the perception of advertising and the way advertising influences brand evaluation involve both kinds of processes. Automatic processes govern the recognition of advertising stimuli, the relevance decision which determines further higher-level processing...

  18. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive, and it is therefore possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...
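The kind of automatic verification applied here, exhaustive exploration of a model's state space against a requirement, can be illustrated with a tiny explicit-state reachability check. The interlock model below is hypothetical, not taken from the paper's ladder program:

```python
def check_invariant(initial, step, invariant):
    """Explicit-state reachability: verify `invariant` holds in every state
    reachable from `initial` under the transition function `step`."""
    seen, frontier = {initial}, [initial]
    while frontier:
        state = frontier.pop()
        if not invariant(state):
            return False, state          # counterexample found
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

# Hypothetical interlock: the door may toggle only while the motor is off,
# and the motor may toggle only while the door is closed.
def step(state):
    motor, door = state
    if not motor:
        yield (motor, not door)
    if not door:
        yield (not motor, door)

# Requirement: the motor must never run with the door open.
ok, counterexample = check_invariant((False, False), step,
                                     lambda s: not (s[0] and s[1]))
```

Unlike testing, this visits every reachable state, so a passing result is a proof of the invariant for the model.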

  19. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.
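The core operation, an L2 projection of discrete terrain data onto a coarse function space, can be sketched in one dimension as a least-squares fit onto the span of {1, x}. The paper's transfinite spaces and image-driven refinement are far richer; this closed-form fit is only an illustrative simplification:

```python
def l2_project_linear(xs, ys):
    """Discrete L2 projection of samples (xs, ys) onto span{1, x}:
    ordinary least squares via the normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    denom = n * sxx - sx * sx
    slope = (n * sxy - sx * sy) / denom
    intercept = (sy * sxx - sx * sxy) / denom
    return slope, intercept

# A terrain profile that is exactly linear is reproduced by its projection.
a, b = l2_project_linear([0.0, 1.0, 2.0, 3.0], [10.0, 12.0, 14.0, 16.0])
```

Refinement then amounts to enlarging the function space where the residual between samples and projection is large.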

  20. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre

    1975-11-01

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the "linguistic filter", which facilitates the construction of statistical algorithms. Certain special filters, for prepositions, conjunctions, negatives, logical implication, and compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed [fr

  1. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for automatic control of commercial computer programs is presented. A connection was developed between the automation system of the EXAFS spectrometer (managed by a PC under DOS) and the commercial program for CCD detector control (managed by a PC under Windows). The described complex system is used for the automation of intermediate amplitude-spectra processing in EXAFS spectrum measurements at the Kurchatov SR source

  2. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing alternative to continuation methods. Automatic continuation also generally obtains better designs than the classical formulation, using a reduced number of iterations.

  3. Performance of good readers, students with dyslexia and attention deficit hyperactivity disorder in rapid automatized naming

    Directory of Open Access Journals (Sweden)

    Simone Aparecida Capellini

    2007-06-01

    PURPOSE: To characterize the performance of students with attention deficit hyperactivity disorder (ADHD) and with dyslexia in rapid automatized naming, and to compare it to the performance of children whose reading ability is in accordance with their age and schooling. METHODS: Thirty students aged 8 to 12 years, attending 2nd to 4th grades in a public elementary school, participated in this study, divided into three groups of 10 students each: one of students with ADHD, one of students with dyslexia, and one of students with good school performance. RESULTS: The results revealed statistically significant differences, with superior performance of the control group over both the ADHD group and the dyslexia group in the color, digit, letter, and object subtests, and superior performance of the ADHD group over the dyslexia group in rapid automatized naming. CONCLUSION: This study showed that students who read as expected for their age and schooling outperformed the ADHD and dyslexia groups, indicating that rapid naming ability can be considered a prerequisite for reading performance.

  4. QSFIT: automatic analysis of optical AGN spectra

    Science.gov (United States)

    Calderone, G.; Nicastro, L.; Ghisellini, G.; Dotti, M.; Sbarrato, T.; Shankar, F.; Colpi, M.

    2017-12-01

    We present QSFIT (Quasar Spectral Fitting package), a new software package to automatically perform the analysis of active galactic nuclei (AGNs) optical spectra. The software provides luminosity estimates for the AGN continuum, the Balmer continuum, both optical and ultraviolet iron blended complexes, host galaxy and emission lines, as well as width, velocity offset and equivalent width of 20 emission lines. Improving on a number of previous studies on AGN spectral analysis, QSFIT fits all the components simultaneously, using an AGN continuum model which extends over the entire available spectrum, and is thus a probe of the actual AGN continuum whose estimates are scarcely influenced by localized features (e.g. emission lines) in the spectrum. We used QSFIT to analyse 71 251 optical spectra of Type 1 AGN at z < 2 (obtained by the Sloan Digital Sky Survey, SDSS) and to produce a publicly available catalogue of AGN spectral properties. This catalogue allowed us (for the first time) to estimate the AGN continuum slope and the Balmer continuum luminosity on a very large sample, and to show that there is no evident correlation between these quantities and the redshift. All data in the catalogue, the plots with best-fitting model and residuals, and the IDL code we used to perform the analysis, are available on a dedicated website. The whole fitting process is customizable for specific needs, and can be extended to analyse spectra from other data sources. The ultimate purpose of QSFIT is to allow astronomers to run standardized recipes to analyse the AGN data, in a simple, replicable and shareable way.

  5. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    Science.gov (United States)

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure for recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]. These methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods require user input prior to the automatic segmentation, such as [3] and [4], and are inherently different from our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other examples in publication. PMID:25769273

  6. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    Requirements for Specific Types of Automatic Auxiliary Boilers, § 63.25-1 Small automatic auxiliary boilers: Small automatic auxiliary boilers, defined as having heat-input ratings of 400,000 Btu/hr...

  7. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    Underground Coal Mines, Thermal Dryers, § 77.314 Automatic temperature control instruments: (a) Automatic temperature control instruments for the thermal dryer system shall be of the recording type. (b) Automatic...

  8. Automatic characterization of loose parts impact damage risk parameters

    International Nuclear Information System (INIS)

    Glass, S.W.; Phillips, J.M.

    1985-01-01

    Loose parts caught in the high-velocity flows of the reactor coolant fluid strike against nuclear steam supply system (NSSS) components and can cause significant damage. Loose parts monitor systems (LPMS) have been available for years to detect metal-to-metal impacts. Once detected, however, an assessment of the damage risk potential for leaving the part in the system versus shutting it down and removing the part must be made. The principal parameters used in the damage risk assessment are time delays between the first and subsequent sensor indications (used to assess the impact location) and a correlation between the waveform and the impact energy of the part (how hard the part impacted). These parameters are not well suited to simple automatic techniques. The task has historically been performed by loose parts diagnostic experts who base much of their evaluation on experience and subjective interpretation of impact data waveforms. Three of the principal goals in developing the Babcock and Wilcox (B and W) LPMS-III were (a) to develop an accurate automatic assessment for the time delays, (b) to develop an automatic estimate of the impact energy, and (c) to present the data in a meaningful manner to the operator
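The first damage-risk parameter above, the time delay between the first and subsequent sensor indications, is commonly estimated by cross-correlating the impact waveforms from two sensors and taking the lag that best aligns them. A minimal discrete sketch, with hypothetical signals (the LPMS-III's actual estimator is not described in this record):

```python
def arrival_delay(sig_a, sig_b):
    """Lag (in samples) maximizing the cross-correlation of two
    equal-length sensor signals; positive means B lags A."""
    n = len(sig_a)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-n + 1, n):
        score = sum(sig_a[i] * sig_b[i + lag]
                    for i in range(max(0, -lag), min(n, n - lag)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Impulse reaches sensor A at sample 2 and sensor B at sample 5:
# the estimated delay is 3 samples.
a = [0, 0, 1, 0, 0, 0, 0, 0]
b = [0, 0, 0, 0, 0, 1, 0, 0]
delay = arrival_delay(a, b)
```

Combining such pairwise delays with known sensor positions and wave speed is what localizes the impact.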

  9. Advanced automatic target recognition for police helicopter missions

    Science.gov (United States)

    Stahl, Christoph; Schoppmann, Paul

    2000-08-01

    The results of a case study about the application of an advanced method for automatic target recognition to infrared imagery taken from police helicopter missions are presented. The method consists of the following steps: preprocessing, classification, fusion, postprocessing and tracking, and combines the three paradigms image pyramids, neural networks, and Bayesian nets. The technology has been developed using a variety of different scenes typical for military aircraft missions. Infrared cameras have been in use for several years at the Bavarian police helicopter forces and are highly valuable for night missions. Several object classes like 'persons' or 'vehicles' are tested and the possible discrimination between persons and animals is shown. The analysis of complex scenes with hidden objects and clutter shows the potentials and limitations of automatic target recognition for real-world tasks. Several display concepts illustrate the achievable improvement of the situation awareness. The similarities and differences between various mission types concerning object variability, time constraints, consequences of false alarms, etc. are discussed. Typical police actions like searching for missing persons or runaway criminals illustrate the advantages of automatic target recognition. The results demonstrate the possible operational benefits for the helicopter crew. Future work will include performance evaluation issues and a system integration concept for the target platform.

  10. Automatic Detection of Vehicles Using Intensity Laser and Anaglyph Image

    Directory of Open Access Journals (Sweden)

    Hideo Araki

    2006-12-01

    This work presents a methodology for automatic detection of moving cars in digital aerial images of urban areas, using intensity, anaglyph, and subtraction images. The anaglyph image is used to identify moving cars in the exposure, because moving cars appear in red due to the lack of homology between the left and right images. An implicit model was developed to provide a digital pixel value with this specific property, using the ratio between the RGB components of car objects in the anaglyph image. The intensity image is used to reduce false positives and to restrict processing to roads and streets. The subtraction image is applied to reduce false positives caused by road markings. The goal of this paper is to automatically detect moving cars in digital aerial images of urban areas. The implemented algorithm normalizes the left and right images and then forms the anaglyph using a translation. The results show the applicability of the proposed method, its potential for automatic car detection, and the performance of the proposed methodology.
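The pixel-level test implied above, flag pixels whose red channel dominates, since non-homologous (moving) objects appear red in the anaglyph, can be sketched as a simple ratio threshold. The threshold value and image layout are assumptions for illustration:

```python
def detect_motion_pixels(rgb_image, red_ratio_threshold=1.5):
    """Return (x, y) coordinates of pixels where red clearly dominates
    green and blue: the anaglyph signature of objects without
    left/right homology, such as moving cars."""
    hits = []
    for y, row in enumerate(rgb_image):
        for x, (r, g, b) in enumerate(row):
            if r >= red_ratio_threshold * max(g, b, 1):
                hits.append((x, y))
    return hits

# Tiny 2x2 image: one strongly red pixel (a "moving car"), three grey ones.
image = [[(200, 50, 40), (120, 118, 121)],
         [(90, 92, 88),  (100, 101, 99)]]
hits = detect_motion_pixels(image)
```

Masking these hits with a road layer (the intensity image) and a markings layer (the subtraction image) is what suppresses the false positives the abstract mentions.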

  11. Automatic vertebral identification using surface-based registration

    Science.gov (United States)

    Herring, Jeannette L.; Dawant, Benoit M.

    2000-06-01

    This work introduces an enhancement to currently existing methods of intra-operative vertebral registration by allowing the portion of the spinal column surface that correctly matches a set of physical vertebral points to be automatically selected from several possible choices. Automatic selection is made possible by the shape variations that exist among lumbar vertebrae. In our experiments, we register vertebral points representing physical space to spinal column surfaces extracted from computed tomography images. The vertebral points are taken from the posterior elements of a single vertebra to represent the region of surgical interest. The surface is extracted using an improved version of the fully automatic marching cubes algorithm, which results in a triangulated surface that contains multiple vertebrae. We find the correct portion of the surface by registering the set of physical points to multiple surface areas, including all vertebral surfaces that potentially match the physical point set. We then compute the standard deviation of the surface error for the set of points registered to each vertebral surface that is a possible match, and the registration that corresponds to the lowest standard deviation designates the correct match. We have performed our current experiments on two plastic spine phantoms and one patient.
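The selection criterion described above, register the physical point set to each candidate vertebral surface and keep the match whose residuals have the smallest standard deviation, can be sketched as follows. The rigid registration step itself is omitted, and the point-to-surface distance is a hypothetical nearest-sample distance:

```python
import math

def residual_std(points, surface):
    """Standard deviation of nearest-neighbour distances from each
    physical point to the sampled surface."""
    dists = [min(math.dist(p, q) for q in surface) for p in points]
    mean = sum(dists) / len(dists)
    return math.sqrt(sum((d - mean) ** 2 for d in dists) / len(dists))

def best_vertebra(points, candidate_surfaces):
    """Pick the candidate whose residuals vary least (best shape match)."""
    return min(candidate_surfaces,
               key=lambda name: residual_std(points, candidate_surfaces[name]))

points = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
surfaces = {
    "L3": [(0.0, 0.1), (1.0, 0.1), (2.0, 0.1)],   # uniform offset -> std 0
    "L4": [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)],   # uneven residuals
}
best = best_vertebra(points, surfaces)
```

A uniformly offset but consistent fit beats one with scattered residuals, which is exactly why the standard deviation (rather than the mean error) discriminates between vertebrae of similar size.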

  12. Automatic differential analysis of NMR experiments in complex samples.

    Science.gov (United States)

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2017-11-20

    Liquid-state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species. Such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal-processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful to decipher complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
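The bucketing step mentioned in this abstract reduces each spectrum to integrals over fixed-width regions, so that spectra from different samples can be compared position by position. A minimal index-based sketch; Plasmodesma's actual bucketing works on calibrated ppm axes, so this simplification is an assumption:

```python
def bucket(spectrum, width):
    """Sum intensities over consecutive buckets of `width` points,
    producing a coarse fingerprint of the spectrum."""
    return [sum(spectrum[i:i + width]) for i in range(0, len(spectrum), width)]

# Six intensity points collapsed into three buckets of two points each.
buckets = bucket([1, 2, 3, 4, 5, 6], 2)
```

Comparing bucket vectors across a sample series is what turns a pile of raw spectra into a differential analysis.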

  13. Automatic building extraction using LiDAR and aerial photographs

    Directory of Open Access Journals (Sweden)

    Melis Uzar

    This paper presents an automatic building extraction approach using LiDAR data and aerial photographs from a multi-sensor system positioned on the same platform. The approach consists of segmentation, analysis, and classification steps based on object-based image analysis. The chessboard, contrast-split, and multi-resolution segmentation methods were used in the segmentation step. The object primitives determined in segmentation, such as scale parameter, shape, completeness, brightness, and statistical parameters, were used to determine threshold values for classification in the analysis step. Rule-based classification was carried out with decision rules based on the determined object primitives and fuzzy rules. In this study, hierarchical classification was preferred: first, the vegetation and ground classes were generated; the building class was then extracted. NDVI, slope, and Hough images were generated and used to avoid confusing the building class with other classes. The intensity images generated from the LiDAR data and morphological operations were utilized to improve the accuracy of the building class. The proposed approach achieved an overall accuracy of approximately 93% for the target class in a suburban neighborhood, which was the study area. Moreover, completeness (96.73%) and correctness (95.02%) analyses were performed by comparing the automatically extracted buildings with reference data.
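One of the auxiliary layers mentioned above, the NDVI image used to keep vegetation out of the building class, follows the standard formula NDVI = (NIR - Red) / (NIR + Red). A minimal per-pixel sketch with hypothetical reflectance values:

```python
def ndvi(nir_band, red_band):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red); vegetation scores
    near +1, bare or built-up surfaces near 0."""
    return [[(n - r) / (n + r) if (n + r) else 0.0
             for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]

# A vegetation pixel (high NIR) next to a roof pixel (similar NIR and red).
grid = ndvi([[0.8, 0.4]], [[0.1, 0.4]])
```

Thresholding such a layer before building extraction removes the trees that would otherwise be confused with roofs.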

  14. Automatic classification of journalistic documents on the Internet

    Directory of Open Access Journals (Sweden)

    Elias OLIVEIRA

    Full Text Available Abstract Online journalism is increasing every day. There are many news agencies, newspapers, and magazines using digital publication in the global network. Documents published online are available to users, who use search engines to find them. In order to deliver documents that are relevant to the search, they must be indexed and classified. Due to the vast number of documents published online every day, a lot of research has been carried out to find ways to facilitate automatic document classification. The objective of the present study is to describe an experimental approach for the automatic classification of journalistic documents published on the Internet using the Vector Space Model for document representation. The model was tested based on a real journalism database, using algorithms that have been widely reported in the literature. This article also describes the metrics used to assess the performance of these algorithms and their required configurations. The results obtained show the efficiency of the method used and justify further research to find ways to facilitate the automatic classification of documents.
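
    In the Vector Space Model mentioned above, each document becomes a term-weight vector, and similarity is measured by the cosine of the angle between vectors. A minimal term-frequency sketch (real systems typically add TF-IDF weighting, stemming, and stop-word removal):

```python
import math
from collections import Counter

def tf_vector(tokens):
    """Raw term-frequency vector for a tokenised document."""
    return Counter(tokens)

def cosine_similarity(a, b):
    """Cosine of the angle between two sparse term vectors."""
    # Counter returns 0 for missing terms, so only shared terms contribute.
    dot = sum(w * b[t] for t, w in a.items())
    norm_a = math.sqrt(sum(w * w for w in a.values()))
    norm_b = math.sqrt(sum(w * w for w in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```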

  15. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    Goujon de Beauvivier, M.; Perez, J.-J.

    1979-01-01

    The application of microprocessors to the programming and computation of sequential chemical analysis of solutions is examined. Safety, performance, and reliability are compared with those of other methods. An example is given of uranium titration by spectrophotometry [fr]

  16. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    Science.gov (United States)

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  17. Initial investigation of an automatic registration algorithm for surgical navigation.

    Science.gov (United States)

    Bootsma, Gregory J; Siewerdsen, Jeffrey H; Daly, Michael J; Jaffray, David A

    2008-01-01

    The procedure required for registering a surgical navigation system prior to use in a surgical procedure is conventionally a time-consuming manual process that is prone to human error and must be repeated as necessary through the course of a procedure. The conventional procedure becomes even more time-consuming when intra-operative 3D imaging such as C-arm cone-beam CT (CBCT) is introduced, as each updated volume set requires a new registration. To improve the speed and accuracy of registering image and world reference frames in image-guided surgery, a novel automatic registration algorithm was developed and investigated. The surgical navigation system consists of either Polaris (Northern Digital Inc., Waterloo, ON) or MicronTracker (Claron Technology Inc., Toronto, ON) tracking camera(s), custom software (Cogito running on a PC), and a prototype CBCT imaging system based on a mobile isocentric C-arm (Siemens, Erlangen, Germany). Experiments were conducted to test the accuracy of automatic registration methods for both the MicronTracker and Polaris tracking cameras. Results indicate the automatic registration performs as well as the manual registration procedure using either the Claron or Polaris camera. The average root-mean-squared (rms) observed target registration error (TRE) for the manual procedure was 2.58 +/- 0.42 mm and 1.76 +/- 0.49 mm for the Polaris and MicronTracker, respectively. The mean observed TRE for the automatic algorithm was 2.11 +/- 0.13 and 2.03 +/- 0.3 mm for the Polaris and MicronTracker, respectively. Implementation and optimization of the automatic registration technique in C-arm CBCT guidance of surgical procedures is underway.
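
    The TRE values reported above are root-mean-square distances between corresponding target points in the two coordinate frames. A sketch of the metric itself (point pairing and frame transformation are assumed to have been done already):

```python
import math

def rms_tre(measured_points, reference_points):
    """Root-mean-square target registration error over paired 3-D points."""
    # Squared Euclidean distance for each corresponding point pair.
    squared = [
        sum((m - r) ** 2 for m, r in zip(p, q))
        for p, q in zip(measured_points, reference_points)
    ]
    return math.sqrt(sum(squared) / len(squared))
```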

  18. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. OPC is not inexpensive, and this causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  19. Automatic detection of AutoPEEP during controlled mechanical ventilation

    Directory of Open Access Journals (Sweden)

    Nguyen Quang-Thang

    2012-06-01

    Full Text Available Abstract Background Dynamic hyperinflation, hereafter called AutoPEEP (auto-positive end-expiratory pressure) with some slight abuse of language, is a frequent deleterious phenomenon in patients undergoing mechanical ventilation. Although not readily quantifiable, AutoPEEP can be recognized on the expiratory portion of the flow waveform. If expiratory flow does not return to zero before the next inspiration, AutoPEEP is present. This simple detection however requires the eye of an expert clinician at the patient’s bedside. An automatic detection of AutoPEEP should be helpful to optimize care. Methods In this paper, a platform for automatic detection of AutoPEEP based on the flow signal available on most recent mechanical ventilators is introduced. The detection algorithms are developed on the basis of robust non-parametric hypothesis tests that require no prior information on the signal distribution. In particular, two detectors are proposed: one is based on SNT (Signal Norm Testing) and the other is an extension of SNT in the sequential framework. The performance assessment was carried out on a respiratory system analog and ex vivo on various retrospectively acquired patient curves. Results The experimental results show that the proposed algorithm provides relevant AutoPEEP detection on both simulated and real data. The analysis of clinical data has shown that the proposed detectors can be used to automatically detect AutoPEEP with an accuracy of 93% and a recall (sensitivity) of 90%. Conclusions The proposed platform provides automatic early detection of AutoPEEP. Such functionality can be integrated into currently used mechanical ventilators for continuous monitoring of the patient-ventilator interface and, therefore, alleviate the clinician’s task.
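
    The bedside rule the detectors automate is simple: if expiratory flow has not returned to (near) zero when the next inspiration starts, AutoPEEP is present. A naive threshold sketch of that rule (the statistical SNT detectors in the paper are far more robust to noise; the tolerance value here is illustrative):

```python
def autopeep_suspected(expiratory_flow, tolerance=0.05):
    """Return True if end-expiratory flow (L/s) has not returned near zero.

    `expiratory_flow` holds the flow samples of one expiratory phase, with
    the last sample taken just before the next inspiration begins.
    """
    return abs(expiratory_flow[-1]) > tolerance
```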

  20. Automatic detection of tooth cracks in optical coherence tomography images.

    Science.gov (United States)

    Kim, Jun-Min; Kang, Se-Ryong; Yi, Won-Jin

    2017-02-01

    The aims of the present study were to compare the image quality and visibility of tooth cracks between conventional methods and swept-source optical coherence tomography (SS-OCT) and to develop an automatic detection technique for tooth cracks by SS-OCT imaging. We evaluated SS-OCT with a near-infrared wavelength centered at 1,310 nm over a spectral bandwidth of 100 nm at a rate of 50 kHz as a new diagnostic tool for the detection of tooth cracks. The reliability of the SS-OCT images was verified by comparing the crack lines with those detected using conventional methods. After performing preprocessing of the obtained SS-OCT images to emphasize cracks, an algorithm was developed and verified to detect tooth cracks automatically. The detection capability of SS-OCT was superior or comparable to that of trans-illumination, which did not discriminate among the cracks according to depth. Other conventional methods for the detection of tooth cracks did not sense initial cracks with a width of less than 100 μm. However, SS-OCT detected cracks of all sizes, ranging from craze lines to split teeth, and the crack lines were automatically detected in images using the Hough transform. We were able to distinguish structural cracks, craze lines, and split lines in tooth cracks using SS-OCT images, and to automatically detect the position of various cracks in the OCT images. Therefore, the detection capability of SS-OCT images provides a useful diagnostic tool for cracked tooth syndrome.
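
    The Hough transform used for the automatic crack detection maps every candidate edge point to votes in a (rho, theta) parameter space; straight crack lines show up as heavily voted cells. A minimal voting sketch (a grid-based accumulator over an edge map would be used in practice, not a dict over raw points):

```python
import math

def dominant_line(points, n_theta=180):
    """Hough transform for lines: each point votes for every (rho, theta)
    line it could lie on; the strongest cell is the best-supported line.

    Returns ((rho, theta_index), vote_count) of the winning cell.
    """
    votes = {}
    for x, y in points:
        for t in range(n_theta):
            theta = t * math.pi / n_theta
            # Normal-form line equation: rho = x*cos(theta) + y*sin(theta).
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            cell = (rho, t)
            votes[cell] = votes.get(cell, 0) + 1
    best = max(votes, key=votes.get)
    return best, votes[best]
```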

  1. Automatic ultrasound technique to measure angle of progression during labor.

    Science.gov (United States)

    Conversano, F; Peccarisi, M; Pisani, P; Di Paola, M; De Marco, T; Franchini, R; Greco, A; D'Ambrogio, G; Casciaro, S

    2017-12-01

    To evaluate the accuracy and reliability of an automatic ultrasound technique for assessment of the angle of progression (AoP) during labor. Thirty-nine pregnant women in the second stage of labor, with fetus in cephalic presentation, underwent conventional labor management with additional translabial sonographic examination. AoP was measured in a total of 95 acquisition sessions, both automatically by an innovative algorithm and manually by an experienced sonographer, who was blinded to the algorithm outcome. The results obtained from the manual measurement were used as the reference against which the performance of the algorithm was assessed. In order to overcome the common difficulties encountered when visualizing the pubic symphysis by sonography, the AoP was measured by considering as the symphysis landmark its centroid rather than its distal point, thereby assuring high measurement reliability and reproducibility, while maintaining objectivity and accuracy in the evaluation of the progression of labor. There was a strong and statistically significant correlation between AoP values measured by the algorithm and the reference values (r = 0.99, P < 0.001). The high accuracy provided by the automatic method was also highlighted by the corresponding high value of the coefficient of determination (r² = 0.98) and the low residual errors (root-mean-square error = 2°27' (2.1%)). The global agreement between the two methods, assessed through Bland-Altman analysis, resulted in a negligible mean difference of 1°1' (limits of agreement, 4°29'). The proposed automatic algorithm is a reliable technique for measurement of the AoP. Its (relative) operator-independence has the potential to reduce human errors and speed up ultrasound acquisition time, which should facilitate the management of women during labor. Copyright © 2017 ISUOG. Published by John Wiley & Sons Ltd.
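
    The Bland-Altman agreement figures quoted above come from the distribution of per-session differences between the two methods. A sketch of the computation (bias and 95% limits of agreement as bias ± 1.96 SD):

```python
def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    # Sample standard deviation of the differences.
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```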

  2. Automatic Power Line Inspection Using UAV Images

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-08-01

    Full Text Available Power line inspection ensures the safe operation of a power transmission grid. Using unmanned aerial vehicle (UAV) images of power line corridors is an effective way to carry out these vital inspections. In this paper, we propose an automatic inspection method for power lines using UAV images. This method, known as the power line automatic measurement method based on epipolar constraints (PLAMEC), acquires the spatial position of the power lines. Then, the semi patch matching based on epipolar constraints (SPMEC) dense matching method is applied to automatically extract dense point clouds within the power line corridor. Obstacles can then be automatically detected by calculating the spatial distance between a power line and the point cloud representing the ground. Experimental results show that PLAMEC automatically measures power lines effectively, with a measurement accuracy consistent with that of manual stereo measurements. The height root mean square (RMS) error of the point cloud was 0.233 m, and the RMS error of the power line was 0.205 m. In addition, we verified the detected obstacles in the field and measured the distance between the canopy and power line using a laser range finder. The results show that the difference of these two distances was within ±0.5 m.
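
    The obstacle-detection step reduces to a clearance check: the shortest distance from each ground or canopy point to the reconstructed power-line span. A sketch with the span modelled as a straight 3-D segment (the real method works with the measured line geometry, which sags as a catenary):

```python
def point_to_segment(p, a, b):
    """Shortest 3-D distance from point p to the segment from a to b."""
    ab = [bi - ai for ai, bi in zip(a, b)]
    ap = [pi - ai for ai, pi in zip(a, p)]
    denom = sum(c * c for c in ab)
    # Parameter of the closest point on the line, clamped to the segment.
    t = 0.0 if denom == 0.0 else max(
        0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / denom))
    closest = [ai + t * c for ai, c in zip(a, ab)]
    return sum((pi - ci) ** 2 for pi, ci in zip(p, closest)) ** 0.5
```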

  3. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating the radioxenon from a nuclear detonation from that of nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers

  4. Automatic Parallelization of Scientific Application

    DEFF Research Database (Denmark)

    Blum, Troels

    performance gains. Scientists working with computer simulations should be allowed to focus on their field of research and not spend excessive amounts of time learning exotic programming models and languages. We have with Bohrium achieved very promising results by starting out with a relatively simple approach...... in the cases where we were not able to gain any performance boost by specialization, the added cost, for kernel generation and extra bookkeeping, is minimal. Many of the lessons learned developing and optimizing the Bohrium GPU vector engine has proven to be valuable in a broader perspective, which has made...

  5. Automatic sentence extraction for the detection of scientific paper relations

    Science.gov (United States)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnections between scientific papers quickly. By observing inter-article relationships, researchers can identify, among others, the weaknesses of existing research, performance improvements achieved to date, and tools or data typically used in research in specific fields. So far, methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, and the paper relations are identified based on citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
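
    Extracting citation sentences can be sketched as sentence splitting followed by a filter for citation markers. A minimal regex-based sketch (the marker patterns are illustrative; the paper does not specify its citation formats):

```python
import re

# Matches common inline markers, e.g. "[12]" or "(Smith et al., 2018)".
CITATION = re.compile(r"\[\d+\]|\([A-Z][A-Za-z]+ et al\.,? \d{4}\)")

def citation_sentences(text):
    """Split text into sentences and keep those containing a citation marker."""
    # Naive splitter: whitespace preceded by sentence-ending punctuation.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if CITATION.search(s)]
```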

  6. Genetic Programming for Automatic Hydrological Modelling

    Science.gov (United States)

    Chadalawada, Jayashree; Babovic, Vladan

    2017-04-01

    One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volumes of data and high levels of uncertainty. Each of the existing hydrological models varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is influenced by prior understanding and data include: choice of the technique for the induction of knowledge from data, identification of alternative structural hypotheses, definition of rules and constraints for meaningful, intelligent combination of model component hypotheses, and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs based on a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired from the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach

  7. Automatic Robot Safety Shutdown System

    Science.gov (United States)

    Lirette, M.

    1985-01-01

    Robot turned off if acceleration exceeds preset value. Signals from accelerometer on robot arm pass through filter and amplifier, eliminating high-frequency noise and hydraulic-pump pulsations. Data digitized and processed in computer. Unit controls other machines that perform repetitive movements, including rotary tables, tracked vehicles, conveyor lines, and elevators.
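
    The trip logic described above is a digital filter followed by a threshold test. In this sketch a short moving average stands in for the analog low-pass stage that removes pump pulsation and high-frequency noise; the window size and limit are illustrative:

```python
def shutdown_signal(accel_samples, limit, window=3):
    """Trip a shutdown if the moving average of |acceleration| exceeds `limit`."""
    for i in range(window - 1, len(accel_samples)):
        # Moving average of the absolute acceleration over the last `window` samples.
        avg = sum(abs(a) for a in accel_samples[i - window + 1 : i + 1]) / window
        if avg > limit:
            return True  # turn the robot off
    return False
```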

  8. Automaticity in reading isiZulu

    Directory of Open Access Journals (Sweden)

    Sandra Land

    2016-06-01

    Full Text Available Automaticity, or instant recognition of combinations of letters as units of language, is essential for proficient reading in any language. The article explores automaticity amongst competent adult first-language readers of isiZulu, and the factors associated with it or its opposite - active decoding. Whilst the transparent spelling patterns of isiZulu aid learner readers, some of its orthographical features may militate against their gaining automaticity. These features are agglutination; a conjoined writing system; comparatively long, complex words; and a high rate of recurring strings of particular letters. This implies that optimal strategies for teaching reading in orthographically opaque languages such as English should not be assumed to apply to languages with dissimilar orthographies. Keywords: Orthography; Eye movement; Reading; isiZulu

  9. Development of fully automatic pipe welding system

    International Nuclear Information System (INIS)

    Tanioka, Shin-ichi; Nakano, Mitsuhiro; Tejima, Akio; Yamada, Minoru; Saito, Tatsuo; Saito, Yoshiyuki; Abe, Rikio

    1985-01-01

    We have succeeded in developing a fully automatic TIG welding system; namely CAPTIG that enables unmanned welding operations from the initial layer to the final finishing layer continuously. This welding system is designed for continuous, multilayered welding of thick and large diameter fixed pipes of nuclear power plants and large-size boiler plants where high-quality welding is demanded. In the tests conducted with this welding system, several hours of continuous unmanned welding corroborated that excellent beads are formed, good results are obtained in radiographic inspection and that quality welding is possible most reliably. This system incorporates a microcomputer for fully automatic controls by which it features a seam tracking function, wire feed position automatic control function, a self-checking function for inter-pass temperature, cooling water temperature and wire reserve. (author)

  10. Automatic control variac system for electronic accelerator

    International Nuclear Information System (INIS)

    Zhang Shuocheng; Wang Dan; Jing Lan; Qiao Weimin; Ma Yunhai

    2006-01-01

    An automatic control variac system is designed in order to satisfy the control requirements of the electronic accelerator developed by the Institute. The design and operational principles, the structure of the system, and the software for the industrial PC and the microcontroller unit are described. The interfaces of the control module are RS232 and RS485. A fiber optic interface (FOC) can be set up if an industrial FOC network is necessary, which will extend the field of its application and improve the system's communication. It is shown in practice that the system can adjust the variac output voltage automatically and assure accurate and automatic control of the electronic accelerator. The system is designed in accordance with general design principles and possesses merits such as easy operation and maintenance, good expansibility, and low cost; thus it could also be used in other industrial branches. (authors)

  12. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, we should detect the polar body of the oocyte automatically. The conventional polar body detection approaches have a low success rate or low efficiency. We propose a polar body detection method based on machine learning in this paper. On the one hand, the improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features of polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range of the polar body, which improves efficiency. Experimental results show that the success rate is 96% for various types of polar bodies. Furthermore, the method was applied to an enucleation experiment and improved the degree of automation of enucleation.
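
    The core of any HOG-style feature is a histogram of gradient orientations weighted by gradient magnitude. A bare-bones sketch of that core (the paper's improved HOG adds cell/block normalisation on top of this; all parameters here are illustrative):

```python
import math

def orientation_histogram(image, bins=8):
    """Magnitude-weighted histogram of gradient orientations.

    `image` is a 2-D list of grey values; central differences are taken
    inside the border, and unsigned orientations fall in [0, pi).
    """
    hist = [0.0] * bins
    h, w = len(image), len(image[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]
            gy = image[y + 1][x] - image[y - 1][x]
            mag = math.hypot(gx, gy)
            angle = math.atan2(gy, gx) % math.pi  # unsigned orientation
            hist[int(angle / math.pi * bins) % bins] += mag
    return hist
```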

  13. Defusing the Debugging Scandal - Dedicated Debugging Technologies for Advanced Dispatching Languages

    NARCIS (Netherlands)

    Yin, Haihan

    2013-01-01

    To increase program modularity, new programming paradigms, such as aspect-oriented programming, context-oriented programming, and predicated dispatching, have been researched in recent years. The new-paradigm languages allow changing behavior according to various kinds of contexts at the call sites.

  14. Region descriptors for automatic classification of small sea targets in infrared video

    NARCIS (Netherlands)

    Mouthaan, M.M.; Broek, S.P. van den; Hendriks, E.A.; Schwering, P.B.W.

    2011-01-01

    We evaluate the performance of different key-point detectors and region descriptors when used for automatic classification of small sea targets in infrared video. In our earlier research performed on this subject as well as in other literature, many different region descriptors have been proposed.

  15. Automatic Hierarchical Color Image Classification

    Directory of Open Access Journals (Sweden)

    Jing Huang

    2003-02-01

    Full Text Available Organizing images into semantic categories can be extremely useful for content-based image retrieval and image annotation. Grouping images into semantic classes is a difficult problem, however. Image classification attempts to solve this hard problem by using low-level image features. In this paper, we propose a method for hierarchical classification of images via supervised learning. This scheme relies on using a good low-level feature and subsequently performing feature-space reconfiguration using singular value decomposition to reduce noise and dimensionality. We use the training data to obtain a hierarchical classification tree that can be used to categorize new images. Our experimental results suggest that this scheme not only performs better than standard nearest-neighbor techniques, but also has both storage and computational advantages.

  16. Automatic stabilization of underwater robots during manipulation operations

    International Nuclear Information System (INIS)

    Filaretov, V.F.; Koval, E.V.

    1994-01-01

    When carrying out underwater technical work by means of an underwater vehicle equipped with a manipulator, it is desirable to perform manipulation operations with the vehicle hovering above the object, without the lengthy and complicated operations of rigid fixation. Stabilization of the underwater vehicle is achieved by compensating for all the effects on the vehicle caused by the manipulator operating in the water medium. This automatic stabilization is achieved by feeding the required control signals, proportional to the calculated components of the generalized forces and moments, into the corresponding vehicle propellers. The propellers thereby provide reaction forces that counteract these effects.

  17. Automatic generation of gene finders for eukaryotic species

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Krogh, A.

    2006-01-01

    Background The number of sequenced eukaryotic genomes is rapidly increasing. This means that over time it will be hard to keep supplying customised gene finders for each genome. This calls for procedures to automatically generate species-specific gene finders and to re-train them as the quantity...... length distributions. The performance of each individual gene predictor on each individual genome is comparable to the best of the manually optimised species-specific gene finders. It is shown that species-specific gene finders are superior to gene finders trained on other species....

  18. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

    In this presentation, the automatized material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; the tools of integral material and radioactivity flow are among the basic tools of computer optimisation of decommissioning waste processing; all the calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega represents an open modular system, which can be improved; improvement of the module for optimisation of decommissioning waste processing will be performed in the frame of the improvement of material procedures and scenarios.

  19. Human and automatic speaker recognition over telecommunication channels

    CERN Document Server

    Fernández Gallardo, Laura

    2016-01-01

    This work addresses the evaluation of the human and the automatic speaker recognition performances under different channel distortions caused by bandwidth limitation, codecs, and electro-acoustic user interfaces, among other impairments. Its main contribution is the demonstration of the benefits of communication channels of extended bandwidth, together with an insight into how speaker-specific characteristics of speech are preserved through different transmissions. It provides sufficient motivation for considering speaker recognition as a criterion for the migration from narrowband to enhanced bandwidths, such as wideband and super-wideband.

  20. Studies and Proposals for an Automatic Crystal Control System

    CERN Document Server

    Drobychev, Gleb; Khruschinsky, A A; Korzhik, Mikhail; Missevitch, Oleg; Oriboni, André; Peigneux, Jean-Pierre; Schneegans, Marc

    1997-01-01

    This document presents the status of the studies for an Automatic Crystal Control System (ACCOS) performed since autumn 1995 for the CMS collaboration. Evaluation of a start-stop method for light yield, light uniformity and decay time measurements of PbWO4 crystals is presented, as well as the first results obtained with a compact double-beam spectrophotometer for transverse transmission. Various overall schemes are proposed for an integrated set-up including crystal dimension measurement. The initial financial evaluation performed is also given.

  1. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix and by the Self-Organizing Map (SOM) clustering algorithm. We present a technique for the automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustworthy and robust early failure detection systems. (author)
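
    The texture attributes mentioned come from a grey-level co-occurrence matrix (GLCM): counts of how often pairs of grey levels occur at a fixed pixel offset. A minimal sketch of the matrix and one Haralick attribute (contrast), on a tiny quantised image; the offset and level count are illustrative:

```python
def cooccurrence_matrix(image, levels, dx=1, dy=0):
    """Grey-level co-occurrence counts for the offset (dx, dy)."""
    glcm = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                glcm[image[y][x]][image[ny][nx]] += 1
    return glcm

def contrast(glcm):
    """Haralick contrast: high for rough (corroded) texture, low for smooth."""
    total = sum(sum(row) for row in glcm)
    return sum(((i - j) ** 2) * v / total
               for i, row in enumerate(glcm) for j, v in enumerate(row))
```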

  2. Automatic SIMD vectorization of SSA-based control flow graphs

    CERN Document Server

    Karrenberg, Ralf

    2015-01-01

    Ralf Karrenberg presents Whole-Function Vectorization (WFV), an approach that allows a compiler to automatically create code that exploits data-parallelism using SIMD instructions. Data-parallel applications such as particle simulations, stock option price estimation or video decoding require the same computations to be performed on huge amounts of data. Without WFV, one processor core executes a single instance of a data-parallel function. WFV transforms the function to execute multiple instances at once using SIMD instructions. The author describes an advanced WFV algorithm that includes a v

  3. Automatic energy calibration of germanium detectors using fuzzy set theory

    CERN Document Server

    Stezowski, O; Prevost, A; Smith, A G; Wall, R

    2002-01-01

    With the advent of multi-detector arrays, many tasks that are usually performed by physicists, such as energy calibration, become very time consuming. There is therefore a need to develop more and more complex algorithms able to mimic human expertise. Fuzzy logic proposes a theoretical framework to build algorithms that are close to the human way of thinking. In this paper we apply fuzzy set theory in order to develop an automatic procedure for energy calibration. The algorithm, based on fuzzy concepts, has been tested on data taken with the EUROBALL IV gamma-ray array.
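
    A minimal sketch of how fuzzy memberships can drive automatic energy calibration, assuming a linear channel-to-energy law: every peak-pair/line-pair hypothesis is scored by the total membership of the mapped peaks in the reference line set. The reference energies, tolerance and peak list below are invented for illustration, not taken from the paper.

```python
import itertools

REF = [121.8, 344.3, 778.9, 1408.0]           # illustrative reference gamma lines (keV)

def membership(e, ref, tol=3.0):
    """Triangular fuzzy membership: 1 at the line, falling to 0 at +/- tol keV."""
    return max(0.0, 1.0 - abs(e - ref) / tol)

def calibrate(peaks, ref=REF, tol=3.0):
    """Score every peak-pair -> line-pair linear hypothesis by total fuzzy
    membership of the mapped peaks; return the best (gain, offset)."""
    best_score, best = -1.0, None
    for (c1, c2), (e1, e2) in itertools.product(
            itertools.combinations(peaks, 2), itertools.combinations(ref, 2)):
        gain = (e2 - e1) / (c2 - c1)
        if gain <= 0:
            continue
        offset = e1 - gain * c1
        score = sum(max(membership(gain * c + offset, r, tol) for r in ref)
                    for c in peaks)
        if score > best_score:
            best_score, best = score, (gain, offset)
    return best

# Peaks at channels consistent with E = 0.5 * channel + 10, plus one spurious peak
peaks = [223.6, 668.6, 1537.8, 2796.0, 512.0]
gain, offset = calibrate(peaks)
```

    The spurious peak gets zero membership under the true calibration, so the correct gain and offset still win the score.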

  4. Machine Learning Algorithms for Automatic Classification of Marmoset Vocalizations.

    Directory of Open Access Journals (Sweden)

    Hjalmar K Turesson

    Full Text Available Automatic classification of vocalization type could potentially become a useful tool for the acoustic monitoring of captive colonies of highly vocal primates. However, for classification to be useful in practice, a reliable algorithm that can be successfully trained on small datasets is necessary. In this work, we consider seven different classification algorithms with the goal of finding a robust classifier that can be successfully trained on small datasets. We found good classification performance (accuracy > 0.83 and F1-score > 0.84) using the Optimum Path Forest classifier. Dataset and algorithms are made publicly available.

  5. Automatic Voltage Control (AVC) System under Uncertainty from Wind Power

    DEFF Research Database (Denmark)

    Qin, Nan; Abildgaard, Hans; Flynn, Damian

    2016-01-01

    An automatic voltage control (AVC) system maintains the voltage profile of a power system in an acceptable range and minimizes the operational cost by coordinating the regulation of controllable components. Typically, all of the parameters in the optimization problem are assumed to be certain....... The proposed method improves the performance and the robustness of a scenario based approach by estimating the potential voltage variations due to fluctuating wind power production, and introduces a voltage margin to protect the decision against uncertainty for each scenario. The effectiveness of the proposed...

  6. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic......, uncommitted machine-learning based framework. Six different classifiers were evaluated in cross-validation schemes and the results showed that the presence of OA can be quantified by a bone structure marker. The performance of the developed marker reached a generalization area-under-the-ROC (AUC) of 0...

  7. ExpertBayes: Automatically refining manually built Bayesian networks.

    Science.gov (United States)

    Almeida, Ezilda; Ferreira, Pedro; Vinhoza, Tiago; Dutra, Inês; Li, Jingwei; Wu, Yirong; Burnside, Elizabeth

    2014-12-01

    Bayesian network structures are usually built using only the data, starting from an empty network or from a naïve Bayes structure. Very often, in some domains like medicine, prior structure knowledge is already available. This structure can be automatically or manually refined in search for better-performing models. In this work, we take Bayesian networks built by specialists and show that minor perturbations to the original network can yield better classifiers with a very small computational cost, while maintaining most of the intended meaning of the original model.
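
    A sketch of the kind of structural perturbation described: random add/delete/reverse moves applied to an expert-built network while preserving acyclicity. The node names and the move set are illustrative assumptions; scoring the perturbed structures against data, the actual refinement criterion, is omitted.

```python
import random

def is_acyclic(nodes, edges):
    """Depth-first cycle check on a directed graph given as (parent, child) pairs."""
    adj = {n: [] for n in nodes}
    for u, v in edges:
        adj[u].append(v)
    state = {n: 0 for n in nodes}              # 0 = unseen, 1 = on stack, 2 = done
    def dfs(u):
        state[u] = 1
        for v in adj[u]:
            if state[v] == 1 or (state[v] == 0 and dfs(v)):
                return True
        state[u] = 2
        return False
    return not any(state[n] == 0 and dfs(n) for n in nodes)

def perturb(nodes, edges, rng):
    """One random structural move (add/delete/reverse an edge) keeping a DAG."""
    edges = set(edges)
    for _ in range(100):                       # retry until a legal move is found
        op = rng.choice(["add", "delete", "reverse"])
        if op == "add":
            u, v = rng.sample(nodes, 2)
            cand = edges | {(u, v)}
        elif edges:                            # delete or reverse needs an edge
            u, v = rng.choice(sorted(edges))
            cand = edges - {(u, v)}
            if op == "reverse":
                cand = cand | {(v, u)}
        else:
            continue
        if cand != edges and is_acyclic(nodes, cand):
            return cand
    return edges

nodes = ["Age", "MassShape", "Biopsy"]                      # hypothetical variables
expert = {("Age", "MassShape"), ("MassShape", "Biopsy")}    # expert-built structure
variant = perturb(nodes, expert, random.Random(7))
```

    Each call proposes one nearby structure; a refinement loop would keep the variant only if it scores better on held-out data.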

  8. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    and specialization of classes (inheritance) are considered different abstractions. We present a new programming language, Lapis, that unifies inheritance and program specialization at the conceptual, syntactic, and semantic levels. This paper presents the initial development of Lapis, which uses inheritance...... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented...

  9. Automatic control system in the reactor peggy

    International Nuclear Information System (INIS)

    Bertrand, J.; Mourchon, R.; Da Costa, D.; Desandre-Navarre, Ch.

    1967-01-01

    The equipment makes it possible for the reactor to attain a given power automatically and for the power to be maintained around this level. The principle of its operation consists in changing from one power to another, at constant period, by means of a programmer that transforms a power-step request into a voltage variation which is linear with time and which represents the logarithm of the required power. The real power is compared continuously with the required power. Stabilization occurs automatically as soon as the difference between the reactor power and the required power diminishes to a few per cent. (authors) [fr
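
    The constant-period transition can be illustrated numerically: if the programmer's output voltage is linear in time and represents log power, the demanded power follows P(t) = P0 * exp(t / tau) until it is clamped at the target, and stabilization begins once the demand is within a few percent of that target. A toy sketch, with all values illustrative:

```python
import math

def demand_profile(p0, p1, period, dt, tol=0.02):
    """Constant-period power ramp: log(P) is linear in time (slope 1/period),
    clamped at the requested power; stop once within tol of the target."""
    t, p, out = 0.0, p0, []
    while True:
        out.append((t, p))
        if abs(p - p1) / p1 <= tol:
            return out
        p = min(p1, p0 * math.exp((t + dt) / period))
        t += dt

# Step request from 1 to 100 power units with a 10 s period, sampled every 0.5 s
profile = demand_profile(p0=1.0, p1=100.0, period=10.0, dt=0.5)
rise_time = profile[-1][0]
```

    The ramp reaches the 2% band at the first sample past period * ln(98), i.e. 46.0 s here, matching the constant-period behavior.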

  10. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  11. Some results of automatic processing of images

    International Nuclear Information System (INIS)

    Golenishchev, I.A.; Gracheva, T.N.; Khardikov, S.V.

    1975-01-01

    The problems of automatic deciphering of radiographic pictures, the purpose of which is to draw a conclusion about the quality of the inspected product from the defect images in the picture, are considered. The methods of defect image recognition are listed, and the algorithms and the class features of defects are described. The results of deciphering a small radiographic picture by means of the ''Minsk-22'' computer are presented. It is established that the sensitivity of the automatic deciphering method is close to that obtained by visual deciphering

  12. Automatic speech recognition a deep learning approach

    CERN Document Server

    Yu, Dong

    2015-01-01

    This book summarizes the recent advancement in the field of automatic speech recognition with a focus on discriminative and hierarchical models. This will be the first automatic speech recognition book to include a comprehensive coverage of recent developments such as conditional random field and deep learning techniques. It presents insights and theoretical foundation of a series of recent models such as conditional random field, semi-Markov and hidden conditional random field, deep neural network, deep belief network, and deep stacking models for sequential learning. It also discusses practical considerations of using these models in both acoustic and language modeling for continuous speech recognition.

  13. ANISOMAT+: An automatic tool to retrieve seismic anisotropy from local earthquakes

    Science.gov (United States)

    Piccinini, Davide; Pastori, Marina; Margheriti, Lucia

    2013-07-01

    An automatic analysis code called ANISOMAT+ has been developed and improved to automatically retrieve the crustal anisotropy parameters, fast polarization direction (ϕ) and delay time (δt), related to the shear wave splitting phenomena affecting seismic S-waves. The code is composed of a set of MatLab scripts and functions able to evaluate the anisotropic parameters from the three-component seismic recordings of local earthquakes using the cross-correlation method. Because the aim of the code is to achieve a fully automatic evaluation of anisotropic parameters, during its development we focused on devising several automatic checks intended to guarantee the quality and the stability of the results obtained. The basic idea behind the development of this automatic code is to build a tool able to work on a huge amount of data in a short time, obtaining stable results and minimizing the errors due to subjectivity. These features, coupled with a three-component digital seismic network and a monitoring system that performs automatic picking and location, are required to develop real-time monitoring of the anisotropic parameters.
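
    The cross-correlation method can be sketched as a grid search over candidate fast-axis angles and delays: the rotation and lag that maximize the similarity between the two rotated components give the estimate of (ϕ, δt). A synthetic illustration, not ANISOMAT+ itself; the Gaussian wavelet and grid steps are assumptions:

```python
import numpy as np

def splitting_params(n, e, max_lag=10, d_theta=5):
    """Grid-search the fast-axis angle (deg) and delay (samples) maximizing the
    normalized cross-correlation between the rotated fast and shifted slow traces."""
    best = (-1.0, 0, 0)
    for theta in range(0, 180, d_theta):
        a = np.deg2rad(theta)
        f = n * np.cos(a) + e * np.sin(a)      # candidate fast component
        s = -n * np.sin(a) + e * np.cos(a)     # candidate slow component
        for lag in range(1, max_lag + 1):
            x, y = f[:-lag], s[lag:]
            c = abs(np.dot(x, y)) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)
            if c > best[0]:
                best = (c, theta, lag)
    return best[1], best[2]

# Synthetic split S-wave: fast axis 30 deg from north, slow wave delayed 5 samples
t = np.arange(200)
w = np.exp(-0.5 * ((t - 100) / 8.0) ** 2)      # Gaussian pulse as source wavelet
fast, slow = w, np.roll(w, 5)
phi = np.deg2rad(30)
north = fast * np.cos(phi) - slow * np.sin(phi)
east = fast * np.sin(phi) + slow * np.cos(phi)
phi_est, dt_est = splitting_params(north, east)
```

    At the true angle the rotation exactly undoes the splitting, so the correlation peaks at the true delay; quality checks on the correlation surface (as in the paper) would guard against noisy or null measurements.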

  14. Semi-Automatic Removal of Foreground Stars from Images of Galaxies

    Science.gov (United States)

    Frei, Zsolt

    1996-07-01

    A new procedure, designed to remove foreground stars from galaxy profiles, is presented here. Although several programs exist for stellar and faint object photometry, none of them treat star removal from the images very carefully. I present my attempt to develop such a system, and briefly compare the performance of my software to one of the well-known stellar photometry packages, DAOPhot (Stetson 1987). Major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function from well separated stars that are situated off the galaxy; (2) automatic identification of those peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fix of remaining degradations in the image. The algorithm and software presented here are significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since: (a) the most suitable stars are selected automatically from the image for the PSF fit; (b) after star-removal an intelligent and automatic procedure removes any possible residuals; (c) an unlimited number of images can be cleaned in one run without any user interaction whatsoever. (SECTION: Computing and Data Analysis)
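
    Steps (1) and (2) can be sketched as follows: build an empirical PSF by averaging peak-normalized cutouts of isolated stars, then scale it to each foreground star's peak and subtract. The synthetic Gaussian stars and stamp size are illustrative assumptions; the residual-patching and cosmetic steps are omitted.

```python
import numpy as np

def cutout(img, y, x, r):
    """Square stamp of half-size r around (y, x); a view into the image."""
    return img[y - r:y + r + 1, x - r:x + r + 1]

def empirical_psf(img, star_positions, r=3):
    """Average peak-normalized cutouts of isolated calibration stars."""
    stamps = [cutout(img, y, x, r) for y, x in star_positions]
    psf = np.mean([s / s.max() for s in stamps], axis=0)
    return psf / psf.max()

def remove_star(img, y, x, psf):
    """Scale the PSF to the star's central peak and subtract it in place."""
    r = psf.shape[0] // 2
    patch = cutout(img, y, x, r)
    patch -= psf * patch[r, r]                 # patch is a view, so img is modified

# Synthetic frame: flat sky plus two identical Gaussian stars
yy, xx = np.mgrid[-3:4, -3:4]
star = np.exp(-(yy ** 2 + xx ** 2) / 4.0)
img = np.zeros((40, 40))
img[10:17, 10:17] += 5.0 * star               # isolated calibration star
img[25:32, 20:27] += 2.0 * star               # foreground star to remove
psf = empirical_psf(img, [(13, 13)], r=3)
remove_star(img, 28, 23, psf)
residual = np.abs(img[25:32, 20:27]).max()
```

    With an accurate PSF the subtraction leaves an essentially zero residual; real images would additionally need the residual-patching pass of step (2).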

  15. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Abstract Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and is thus usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in
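
    A toy version of the classification step, with a Pegasos-style linear SVM trained by stochastic subgradient descent standing in for the full SVM pipeline; the vocabulary, example sentences and hyperparameters are invented for illustration and are not from the paper.

```python
import numpy as np

def vectorize(docs, vocab):
    """Bag-of-words counts over a fixed vocabulary."""
    X = np.zeros((len(docs), len(vocab)))
    index = {w: i for i, w in enumerate(vocab)}
    for r, doc in enumerate(docs):
        for w in doc.lower().split():
            if w in index:
                X[r, index[w]] += 1
    return X

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos: stochastic subgradient descent on the hinge-loss SVM objective."""
    rng = np.random.default_rng(seed)
    w, t = np.zeros(X.shape[1]), 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)
            w *= (1 - eta * lam)               # shrink (regularization step)
            if y[i] * X[i].dot(w) < 1:         # margin violated: hinge subgradient
                w += eta * y[i] * X[i]
    return w

vocab = ["rnai", "knockdown", "phenotype", "allele", "sequence", "expression"]
train = ["RNAi knockdown produced a strong phenotype",
         "knockdown by RNAi alters expression",
         "the allele sequence was determined",
         "sequence variants of the allele"]
y = np.array([1, 1, -1, -1])                   # +1 = contains the curation data type
w = train_linear_svm(vectorize(train, vocab), y)
test_doc = vectorize(["RNAi phenotype observed after knockdown"], vocab)[0]
pred = 1 if test_doc.dot(w) > 0 else -1
```

    In production one would use a full-vocabulary TF-IDF representation and a mature SVM implementation, but the decision rule (sign of a learned linear score) is the same.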

  16. Automatic learning-based beam angle selection for thoracic IMRT.

    Science.gov (United States)

    Amit, Guy; Purdie, Thomas G; Levinshtein, Alex; Hope, Andrew J; Lindsay, Patricia; Marshall, Andrea; Jaffray, David A; Pekar, Vladimir

    2015-04-01

    The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose-volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner's clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume coverage and organ at risk

  17. An Automatic Learning-Based Framework for Robust Nucleus Segmentation.

    Science.gov (United States)

    Xing, Fuyong; Xie, Yuanpu; Yang, Lin

    2016-02-01

    Computer-aided image analysis of histopathology specimens could potentially provide support for early detection and improved characterization of diseases such as brain tumor, pancreatic neuroendocrine tumor (NET), and breast cancer. Automated nucleus segmentation is a prerequisite for various quantitative analyses including automatic morphological feature computation. However, it remains a challenging problem due to the complex nature of histopathology images. In this paper, we propose a learning-based framework for robust and automatic nucleus segmentation with shape preservation. Given a nucleus image, it begins with a deep convolutional neural network (CNN) model to generate a probability map, on which an iterative region merging approach is performed for shape initializations. Next, a novel segmentation algorithm is exploited to separate individual nuclei combining a robust selection-based sparse shape model and a local repulsive deformable model. One of the significant benefits of the proposed framework is that it is applicable to different staining histopathology images. Due to the feature learning characteristic of the deep CNN and the high level shape prior modeling, the proposed method is general enough to perform well across multiple scenarios. We have tested the proposed algorithm on three large-scale pathology image datasets using a range of different tissue and stain preparations, and the comparative experiments with recent state-of-the-art methods demonstrate the superior performance of the proposed approach.

  18. Automatic identification of mass spectra

    International Nuclear Information System (INIS)

    Drabloes, F.

    1992-01-01

    Several approaches to preprocessing and comparison of low resolution mass spectra have been evaluated by various test methods related to library search. It is shown that there is a clear correlation between the nature of any contamination of a spectrum, the basic principle of the transformation or distance measure, and the performance of the identification system. The identification of functionality from low resolution spectra has also been evaluated using several classification methods. It is shown that there is an upper limit to the success of this approach, but also that this can be improved significantly by using a very limited amount of additional information. 10 refs

  19. Can prosody aid the automatic classification of dialog acts in conversational speech?

    Science.gov (United States)

    Shriberg, E; Bates, R; Stolcke, A; Taylor, P; Jurafsky, D; Ries, K; Coccaro, N; Martin, R; Meteer, M; van Ess-Dykema, C

    1998-01-01

    Identifying whether an utterance is a statement, question, greeting, and so forth is integral to effective automatic understanding of natural dialog. Little is known, however, about how such dialog acts (DAs) can be automatically classified in truly natural conversation. This study asks whether current approaches, which use mainly word information, could be improved by adding prosodic information. The study is based on more than 1000 conversations from the Switchboard corpus. DAs were hand-annotated, and prosodic features (duration, pause, F0, energy, and speaking rate) were automatically extracted for each DA. In training, decision trees based on these features were inferred; trees were then applied to unseen test data to evaluate performance. Performance was evaluated for prosody models alone, and after combining the prosody models with word information--either from true words or from the output of an automatic speech recognizer. For an overall classification task, as well as three subtasks, prosody made significant contributions to classification. Feature-specific analyses further revealed that although canonical features (such as F0 for questions) were important, less obvious features could compensate if canonical features were removed. Finally, in each task, integrating the prosodic model with a DA-specific statistical language model improved performance over that of the language model alone, especially for the case of recognized words. Results suggest that DAs are redundantly marked in natural conversation, and that a variety of automatically extractable prosodic features could aid dialog processing in speech applications.

  20. Simulator training to automaticity leads to improved skill transfer compared with traditional proficiency-based training: a randomized controlled trial.

    Science.gov (United States)

    Stefanidis, Dimitrios; Scerbo, Mark W; Montero, Paul N; Acker, Christina E; Smith, Warren D

    2012-01-01

    We hypothesized that novices will perform better in the operating room after simulator training to automaticity compared with traditional proficiency based training (current standard training paradigm). Simulator-acquired skill translates to the operating room, but the skill transfer is incomplete. Secondary task metrics reflect the ability of trainees to multitask (automaticity) and may improve performance assessment on simulators and skill transfer by indicating when learning is complete. Novices (N = 30) were enrolled in an IRB-approved, blinded, randomized, controlled trial. Participants were randomized into an intervention (n = 20) and a control (n = 10) group. The intervention group practiced on the FLS suturing task until they achieved expert levels of time and errors (proficiency), were tested on a live porcine fundoplication model, continued simulator training until they achieved expert levels on a visual spatial secondary task (automaticity) and were retested on the operating room (OR) model. The control group participated only during testing sessions. Performance scores were compared within and between groups during testing sessions. : Intervention group participants achieved proficiency after 54 ± 14 and automaticity after additional 109 ± 57 repetitions. Participants achieved better scores in the OR after automaticity training [345 (range, 0-537)] compared with after proficiency-based training [220 (range, 0-452; P training to automaticity takes more time but is superior to proficiency-based training, as it leads to improved skill acquisition and transfer. Secondary task metrics that reflect trainee automaticity should be implemented during simulator training to improve learning and skill transfer.

  1. Automatic speech signal segmentation based on the innovation adaptive filter

    Directory of Open Access Journals (Sweden)

    Makowski Ryszard

    2014-06-01

    Full Text Available Speech segmentation is an essential stage in designing automatic speech recognition systems and one can find several algorithms proposed in the literature. It is a difficult problem, as speech is immensely variable. The aim of the authors’ studies was to design an algorithm that could be employed at the stage of automatic speech recognition. This would make it possible to avoid some problems related to speech signal parametrization. Posing the problem in such a way requires the algorithm to be capable of working in real time. The only such algorithm was proposed by Tyagi et al. (2006), and it is a modified version of Brandt’s algorithm. The article presents a new algorithm for unsupervised automatic speech signal segmentation. It performs segmentation without access to information about the phonetic content of the utterances, relying exclusively on second-order statistics of a speech signal. The starting point for the proposed method is time-varying Schur coefficients of an innovation adaptive filter. The Schur algorithm is known to be fast, precise, stable and capable of rapidly tracking changes in second-order signal statistics. A transition from one phoneme to another in the speech signal always indicates a change in signal statistics caused by vocal tract changes. In order to allow for the properties of human hearing, detection of inter-phoneme boundaries is performed based on statistics defined on the mel spectrum determined from the reflection coefficients. The paper presents the structure of the algorithm, defines its properties, lists parameter values, describes detection efficiency results, and compares them with those for another algorithm. The obtained segmentation results are satisfactory.
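
    The underlying idea, that a phoneme transition shows up as a jump in frame-to-frame second-order statistics, can be sketched with per-frame reflection (PARCOR) coefficients obtained by the Levinson-Durbin recursion. This is a simplified stand-in for the paper's time-varying Schur recursion, and it omits the mel-spectrum statistics used for the actual boundary detection; the signal and frame sizes are invented.

```python
import numpy as np

def reflection_coeffs(frame, order=4):
    """Levinson-Durbin: frame autocorrelation -> reflection (PARCOR) coefficients."""
    r = np.array([frame[:len(frame) - k] @ frame[k:] for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0], e, ks = 1.0, r[0], []
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / e
        a_new = a.copy()
        for j in range(1, i):
            a_new[j] = a[j] + k * a[i - j]     # update predictor coefficients
        a_new[i] = k
        a, e = a_new, e * (1.0 - k * k)        # shrink the prediction error
        ks.append(k)
    return np.array(ks)

# Synthetic utterance: a tonal segment followed by a noise-like segment
rng = np.random.default_rng(0)
fs, n = 8000, 4000
tone = np.sin(2 * np.pi * 300 * np.arange(n) / fs) + 0.01 * rng.standard_normal(n)
noise = rng.standard_normal(n)
signal = np.concatenate([tone, noise])

frame_len = 400
frames = signal.reshape(-1, frame_len)
K = np.array([reflection_coeffs(f) for f in frames])
jumps = np.linalg.norm(np.diff(K, axis=0), axis=1)   # change in 2nd-order statistics
boundary = (np.argmax(jumps) + 1) * frame_len        # first sample of the new segment
```

    The coefficient vector is nearly constant inside each segment and jumps sharply at the transition, which is exactly the cue a real-time segmenter thresholds.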

  2. Automatically predicting mood from expressed emotions

    NARCIS (Netherlands)

    Katsimerou, C.

    2016-01-01

    Affect-adaptive systems have the potential to assist users that experience systematically negative moods. This thesis aims at building a platform for automatically predicting a person’s mood from his/her visual expressions. The key word is mood, namely a relatively long-term, stable and diffused

  3. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  4. Automatic Estimation of Movement Statistics of People

    DEFF Research Database (Denmark)

    Ægidiussen Jensen, Thomas; Rasmussen, Henrik Anker; Moeslund, Thomas B.

    2012-01-01

    Automatic analysis of how people move about in a particular environment has a number of potential applications. However, no system has so far been able to do detection and tracking robustly. Instead, trajectories are often broken into tracklets. The key idea behind this paper is based around...

  5. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done and are being done are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  6. Automatic Differentiation and its Program Realization

    Czech Academy of Sciences Publication Activity Database

    Hartman, J.; Lukšan, Ladislav; Zítko, J.

    2009-01-01

    Roč. 45, č. 5 (2009), s. 865-883 ISSN 0023-5954 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : automatic differentiation * modeling languages * systems of optimization Subject RIV: BA - General Mathematics Impact factor: 0.445, year: 2009 http://dml.cz/handle/10338.dmlcz/140037
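
    The technique surveyed in this record, forward-mode automatic differentiation, can be shown with dual numbers: each value carries its derivative, and arithmetic propagates both exactly via the chain rule. A minimal sketch, not the implementation discussed in the article:

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 == 0: value plus derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)   # product rule
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.dot)

def derivative(f, x):
    """Seed dx/dx = 1 and read the derivative off the result."""
    return f(Dual(x, 1.0)).dot

f = lambda x: x * x.sin() + 3 * x          # f(x) = x*sin(x) + 3x
d = derivative(f, 1.2)                      # f'(x) = sin(x) + x*cos(x) + 3
```

    Unlike finite differences, the result is exact to machine precision; production AD tools extend the same idea to full operator sets and to reverse mode.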

  7. Development of automatic facilities for ZEPHYR

    International Nuclear Information System (INIS)

    Eder, O.; Lackner, E.; Pohl, F.; Schilling, H.B.

    1982-04-01

    This concept of remotely controlled facilities for repair and maintenance tasks inside the ZEPHYR vacuum vessel uses a supporting structure to insert various types of mobile automatic devices. The devices are guided by an egg-shaped disc which is part of the supporting structure. Considerations of adapting the guiding disc to the vessel contour are included. (orig.)

  8. AUTORED - the JADE automatic data reduction system

    International Nuclear Information System (INIS)

    Whittaker, J.B.

    1984-07-01

    The design and implementation of and experience with an automatic data processing system for the reduction of data from the JADE experiment at DESY is described. The central elements are a database and a job submitter which combine powerfully to minimise the need for manual intervention. (author)

  9. Automatic Assessment of 3D Modeling Exams

    Science.gov (United States)

    Sanna, A.; Lamberti, F.; Paravati, G.; Demartini, C.

    2012-01-01

    Computer-based assessment of exams provides teachers and students with two main benefits: fairness and effectiveness in the evaluation process. This paper proposes a fully automatic evaluation tool for the Graphic and Virtual Design (GVD) curriculum at the First School of Architecture of the Politecnico di Torino, Italy. In particular, the tool is…

  10. Automatic Smoker Detection from Telephone Speech Signals

    DEFF Research Database (Denmark)

    Alavijeh, Amir Hossein Poorjam; Hesaraki, Soheila; Safavi, Saeid

    2017-01-01

    This paper proposes an automatic smoking habit detection from spontaneous telephone speech signals. In this method, each utterance is modeled using i-vector and non-negative factor analysis (NFA) frameworks, which yield low-dimensional representation of utterances by applying factor analysis on G...

  11. Reduction of Dutch Sentences for Automatic Subtitling

    NARCIS (Netherlands)

    Tjong Kim Sang, E.F.; Daelemans, W.; Höthker, A.

    2004-01-01

    We compare machine learning approaches for sentence length reduction for automatic generation of subtitles for deaf and hearing-impaired people with a method which relies on hand-crafted deletion rules. We describe building the necessary resources for this task: a parallel corpus of examples of news

  12. Effective speed management through automatic enforcement.

    NARCIS (Netherlands)

    Oei, H.-l.

    1994-01-01

    This paper analyses several aspects of the Dutch experience of speed enforcement, and presents the results of some speed management experiments in The Netherlands, using automatic warning of speeders and enforcement of speeding. Traditional approaches to manage speed there have not resulted in

  13. Automatic Amharic text news classification: A neural networks ...

    African Journals Online (AJOL)

    The study is on automatic classification of Amharic news using a neural networks approach. The Learning Vector Quantization (LVQ) algorithm is employed to classify new instances of Amharic news based on a classifier developed using a training dataset. Two weighting schemes, Term Frequency (TF) and Term Frequency by ...

  14. Automatic invariant detection in dynamic web applications

    NARCIS (Netherlands)

    Groeneveld, F.; Mesbah, A.; Van Deursen, A.

    2010-01-01

    The complexity of modern web applications increases as client-side JavaScript and dynamic DOM programming are used to offer a more interactive web experience. In this paper, we focus on improving the dependability of such applications by automatically inferring invariants from the client-side and

  15. Automatization and familiarity in repeated checking

    NARCIS (Netherlands)

    Dek, E.C.P.|info:eu-repo/dai/nl/313959552; van den Hout, M.A.|info:eu-repo/dai/nl/070445354; Giele, C.L.|info:eu-repo/dai/nl/318754460; Engelhard, I.M.|info:eu-repo/dai/nl/239681533

    2015-01-01

    Repetitive, compulsive-like checking of an object leads to reductions in memory confidence, vividness, and detail. Experimental research suggests that this is caused by increased familiarity with perceptual characteristics of the stimulus and automatization of the checking procedure (Dek, van den

  16. 32 CFR 2001.30 - Automatic declassification.

    Science.gov (United States)

    2010-07-01

    ... that originated in an agency that has ceased to exist and for which there is no successor agency, the... international agreement that does not permit automatic or unilateral declassification. The declassifying agency... foreign nuclear programs (e.g., intelligence assessments or reports, foreign nuclear program information...

  17. Automatically Extracting Typical Syntactic Differences from Corpora

    NARCIS (Netherlands)

    Wiersma, Wybo; Nerbonne, John; Lauttamus, Timo

    We develop an aggregate measure of syntactic difference for automatically finding common syntactic differences between collections of text. With the use of this measure, it is possible to mine for differences between, for example, the English of learners and natives, or between related dialects. If

  18. Automatically unfair and operational requirement dismissals: Making ...

    African Journals Online (AJOL)

    This article explores the concept of the automatic unfair dismissal that is regulated in s 187(1)(c) of the Labour Relations Act 66 of 1995 (LRA), where the reason for the ... This dichotomy was dealt with by the court in Fry's Metals (Pty) Ltd v National Union of Metalworkers of SA 2003 ILJ 133 (LAC), but the decision was ...

  19. Automatic extraction of legal concepts and definitions

    NARCIS (Netherlands)

    Winkels, R.; Hoekstra, R.

    2012-01-01

    In this paper we present the results of an experiment in automatic concept and definition extraction from written sources of law using relatively simple natural language and standard semantic web technology. The software was tested on six laws from the tax domain.

  20. The CHilean Automatic Supernova sEarch

    DEFF Research Database (Denmark)

    Hamuy, M.; Pignata, G.; Maza, J.

    2012-01-01

    The CHilean Automatic Supernova sEarch (CHASE) project began in 2007 with the goal to discover young, nearby southern supernovae in order to (1) better understand the physics of exploding stars and their progenitors, and (2) refine the methods to derive extragalactic distances. During the first...

  1. AUTOMATIC IDENTIFICATION OF ITEMS IN WAREHOUSE MANAGEMENT

    OpenAIRE

    Vladimír Modrák; Peter Knuth

    2010-01-01

    Automatic identification of items saves time and is beneficial in various areas, including warehouse management. Identification can be done by many technologies, but RFID technology seems to be one of the smartest solutions. This article deals with testing and possible use of RFID technology in warehouse management. All results and measurement outcomes are documented in the form of graphs, followed by a comprehensive analysis.

  2. Automatic Positioning System of Small Agricultural Robot

    Science.gov (United States)

    Momot, M. V.; Proskokov, A. V.; Natalchenko, A. S.; Biktimirov, A. S.

    2016-08-01

    The present article discusses automatic positioning systems of agricultural robots used in field works. The existing solutions in this area have been analyzed. The article proposes an original solution, which is easy to implement and is characterized by high-accuracy positioning.

  3. Automatic assessment of cardiac perfusion MRI

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Stegmann, Mikkel Bille; Larsson, Henrik B.W.

    2004-01-01

    In this paper, a method based on Active Appearance Models (AAM) is applied for automatic registration of myocardial perfusion MRI. A semi-quantitative perfusion assessment of the registered image sequences is presented. This includes the formation of perfusion maps for three parameters; maximum up...

  4. Automatic Synthesis of Robust and Optimal Controllers

    DEFF Research Database (Denmark)

    Cassez, Franck; Jessen, Jan Jacob; Larsen, Kim Guldstrand

    2009-01-01

    In this paper, we show how to apply recent tools for the automatic synthesis of robust and near-optimal controllers for a real industrial case study. We show how to use three different classes of models and their supporting existing tools, Uppaal-TiGA for synthesis, phaver for verification...

  5. Automatic arms their history, development and use

    CERN Document Server

    Johnson, Melvin M

    2015-01-01

    The evolution of automatic weapons is one of the most significant developments in weapons history. While this development has been filled with disagreements, controversy, and stray hurdles, out of all of this tumult, shouting, and shooting has come the progress in firearms from the days when it was necessary to build a fire under a gun to make it go off to the “you press the button and they do the work" automatic firearms of the present day. In 1941, Melvin M. Johnson Jr. and Charles T. Haven, both well-established experts on firearms and ammunitions in their day, commemorated this development in Automatic Arms: Their History, Development and Use. The topics on which they illuminate the reader include: History and development How they work How to keep them firing How they may be employed in combat In the authors' foreword, they state, “There has been a great deal of general discussion about various automatic weapons pro and con, and naturally there have been misunderstandings and misinterpretations." They s...

  6. Automatic Water Sensor Window Opening System

    KAUST Repository

    Percher, Michael

    2013-12-05

    A system can automatically open at least one window of a vehicle when the vehicle is being submerged in water. The system can include a water collector and a water sensor, and when the water sensor detects water in the water collector, at least one window of the vehicle opens.

  7. Automatic TLI recognition system, general description

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. The system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system capabilities.

  8. Automatic prejudice in childhood and early adolescence

    NARCIS (Netherlands)

    Degner, J.; Wentura, D.

    2010-01-01

    Four cross-sectional studies are presented that investigated the automatic activation of prejudice in children and adolescents (aged 9 to 15 years). To this end, 4 different versions of the affective priming task were used, with pictures of ingroup and outgroup members being presented as

  9. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data processing program was possible. The program has the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology.
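
    The hierarchical lookup the record describes (an organ code first, then a pathology file selected by the code's first digit) can be sketched as follows. The dictionaries here are hypothetical miniatures invented for illustration; the real system used 11 FoxBASE dictionary files.

```python
# Hypothetical miniature dictionaries; only '131.3661' comes from the record.
ORGANS = {"131": "chest, lung", "761": "skull"}
PATHOLOGY_BY_SYSTEM = {
    "1": {"3661": "pneumonia, lobar"},   # pathology file for organ codes 1xx
    "7": {"4100": "fracture, simple"},   # pathology file for organ codes 7xx
}

def acr_code(organ_code, pathology_code):
    """Compose an ACR code; the pathology file is chosen automatically
    by the first digit of the organ code, as the record describes."""
    if organ_code not in ORGANS:
        raise KeyError(f"unknown organ code {organ_code}")
    path_file = PATHOLOGY_BY_SYSTEM[organ_code[0]]
    if pathology_code not in path_file:
        raise KeyError(f"unknown pathology code {pathology_code}")
    return f"{organ_code}.{pathology_code}"
```

    For example, `acr_code("131", "3661")` yields the record's example code "131.3661".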

  10. Automatic Radiometric Normalization of Multitemporal Satellite Imagery

    DEFF Research Database (Denmark)

    Canty, Morton J.; Nielsen, Allan Aasbjerg; Schmidt, Michael

    2004-01-01

    The linear scale invariance of the multivariate alteration detection (MAD) transformation is used to obtain invariant pixels for automatic relative radiometric normalization of time series of multispectral data. Normalization by means of ordinary least squares regression method is compared with n...
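
    The normalization step mentioned in the abstract can be sketched as follows: given a mask of invariant pixels (obtained in the paper via the MAD transformation), one band of the target scene is mapped onto the reference by ordinary least squares. Function and variable names are illustrative, not the authors' code.

```python
import numpy as np

def relative_normalization(target, reference, invariant_mask):
    """Fit reference ~= a * target + b on invariant pixels only (OLS),
    then apply the gain/offset to the whole target band."""
    t = target[invariant_mask].ravel()
    r = reference[invariant_mask].ravel()
    a, b = np.polyfit(t, r, 1)          # ordinary least squares, degree 1
    return a * target + b, (a, b)
```

    In practice this is repeated per spectral band, and the paper compares this OLS variant with orthogonal regression.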

  11. Automatically extracting class diagrams from spreadsheets

    NARCIS (Netherlands)

    Hermans, F.; Pinzger, M.; Van Deursen, A.

    2010-01-01

    The use of spreadsheets to capture information is widespread in industry. Spreadsheets can thus be a wealthy source of domain information. We propose to automatically extract this information and transform it into class diagrams. The resulting class diagram can be used by software engineers to

  12. Automatic characterization of dynamics in Absence Epilepsy

    DEFF Research Database (Denmark)

    Petersen, Katrine N. H.; Nielsen, Trine N.; Kjær, Troels W.

    2013-01-01

    Dynamics of the spike-wave paroxysms in Childhood Absence Epilepsy (CAE) are automatically characterized using novel approaches. Features are extracted from scalograms formed by Continuous Wavelet Transform (CWT). Detection algorithms are designed to identify an estimate of the temporal development...
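
    The scalogram-forming step can be illustrated with a hand-rolled CWT using the Ricker (Mexican hat) wavelet. This is only a sketch of the general technique, not the wavelet family or detection pipeline used in the study.

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican hat) wavelet of given width parameter a."""
    t = np.arange(points) - (points - 1) / 2
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

def cwt_scalogram(signal, widths):
    """|CWT| coefficient magnitudes: one row per scale, one column per sample."""
    out = np.empty((len(widths), len(signal)))
    for i, w in enumerate(widths):
        n = min(10 * int(w), len(signal))      # truncate the wavelet support
        out[i] = np.convolve(signal, ricker(n, w), mode="same")
    return np.abs(out)
```

    Features such as the temporal evolution of dominant scales can then be read off the rows of the scalogram.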

  13. Evaluating automatically annotated treebanks for linguistic research

    NARCIS (Netherlands)

    Bloem, J.; Bański, P.; Kupietz, M.; Lüngen, H.; Witt, A.; Barbaresi, A.; Biber, H.; Breiteneder, E.; Clematide, S.

    2016-01-01

    This study discusses evaluation methods for linguists to use when employing an automatically annotated treebank as a source of linguistic evidence. While treebanks are usually evaluated with a general measure over all the data, linguistic studies often focus on a particular construction or a group

  14. Automatic Subspace Learning via Principal Coefficients Embedding.

    Science.gov (United States)

    Peng, Xi; Lu, Jiwen; Yi, Zhang; Yan, Rui

    2017-11-01

    In this paper, we address two challenging problems in unsupervised subspace learning: 1) how to automatically identify the feature dimension of the learned subspace (i.e., automatic subspace learning) and 2) how to learn the underlying subspace in the presence of Gaussian noise (i.e., robust subspace learning). We show that these two problems can be solved simultaneously by a new method, called principal coefficients embedding (PCE). For a given data set, PCE recovers a clean data set and simultaneously learns a global reconstruction relation over it. By embedding this relation into a low-dimensional space, the proposed method obtains a projection matrix that can capture the latent manifold structure of the data, where the dimension is automatically determined by the rank of the reconstruction relation with theoretical guarantees. PCE has three advantages: 1) it can automatically determine the feature dimension even though data are sampled from a union of multiple linear subspaces in the presence of Gaussian noise; 2) although the objective function of PCE only considers Gaussian noise, experimental results show that it is robust to non-Gaussian noise (e.g., random pixel corruption) and real disguises; and 3) the method has a closed-form solution and can be computed very fast. Extensive experimental results show the superiority of PCE on a range of databases with respect to classification accuracy, robustness, and efficiency.
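
    The rank-based dimension selection at the heart of PCE can be illustrated in miniature: estimate the numerical rank of a (coefficient) matrix by counting singular values above a threshold. This is a generic stand-in for the method's theoretically grounded rank criterion; the tolerance is an illustrative choice.

```python
import numpy as np

def estimate_dimension(X, tol_ratio=1e-3):
    """Numerical rank of X: count singular values above tol_ratio times
    the largest one. A sketch of rank-based dimension selection."""
    s = np.linalg.svd(X, compute_uv=False)
    return int(np.sum(s > tol_ratio * s[0]))
```

    On data drawn from a union of low-dimensional subspaces, the rank of the reconstruction-coefficient matrix reveals the intrinsic dimension even under small Gaussian perturbations.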

  15. Automatic low-temperature calorimeter

    International Nuclear Information System (INIS)

    Malyshev, V.M.; Mil'ner, G.A.; Shibakin, V.F.; Sorkin, E.L.

    1986-01-01

    This paper describes a low-temperature adiabatic calorimeter with a range of 1.5-500 K. The system for maintaining adiabatic conditions is implemented with two resistance thermometers, whose sensitivity at low temperatures is several orders of magnitude higher than that of thermocouples. The calorimeter cryostat is installed in an STG-40 portable Dewar flask. The calorimeter is controlled by an Elektronika-60 microcomputer. Standard platinum and germanium thermometers were placed inside the calorimeter to calibrate the thermometers of the calorimeter and the shield, and the specific heats of specimens of OSCh 11-4 copper and KTP-8 paste were measured to demonstrate the capabilities of the described calorimeter. Experience with the calorimeter has shown that a thorough study of the dependence of heat capacity on temperature (over 100 points for one specimen) can be performed in one or two days.

  16. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to get an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage for the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, and the infrastructure to store (up to 40 Tb between results and intermediate files) and process the data, using local virtualization and the Amazon Web Service (AWS), which allowed this automatic production to be completed within 6 months. The stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and the management of human resources have also been important. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production, and its advantages over traditional vector extraction systems.
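
    The flow-accumulation criterion mentioned above can be illustrated with a toy D8 routine: each cell drains to its steepest-descent neighbor, and upstream contributions are accumulated by visiting cells from highest to lowest elevation. This is a textbook sketch, not IGN-ES's production workflow.

```python
import numpy as np

def d8_flow_accumulation(dem):
    """D8 flow accumulation: each cell drains to its steepest-descent
    neighbor; acc counts the cell itself plus all upstream cells."""
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=float)          # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]      # process highest cells first
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    for flat in order:
        r, c = divmod(int(flat), cols)
        best, drop = None, 0.0
        for dr, dc in nbrs:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                d = dem[r, c] - dem[rr, cc]
                if d > drop:                      # steepest descent so far
                    best, drop = (rr, cc), d
        if best is not None:                      # pits/edges keep their flow
            acc[best] += acc[r, c]
    return acc
```

    Thresholding the accumulation grid then yields candidate channel cells of the flow accumulation river network.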

  17. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    Full Text Available National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to get an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage for the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, and the infrastructure to store (up to 40 Tb between results and intermediate files) and process the data, using local virtualization and the Amazon Web Service (AWS), which allowed this automatic production to be completed within 6 months. The stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and the management of human resources have also been important. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production, and its advantages over traditional vector extraction systems.

  18. MRI in assessing children with learning disability, focal findings, and reduced automaticity.

    Science.gov (United States)

    Urion, David K; Huff, Hanalise V; Carullo, Maria Paulina

    2015-08-18

    In children with clinically diagnosed learning disabilities with focal findings on neurologic or neuropsychological evaluations, there is a hypothesized association between disorders in automaticity and focal structural abnormalities observed in brain MRIs. We undertook a retrospective analysis of cases referred to a tertiary-hospital-based learning disabilities program. Individuals were coded as having a focal deficit if either neurologic or neuropsychological evaluation demonstrated focal dysfunction. Those with abnormal MRI findings were categorized based on findings. Children with abnormalities from each of these categories were compared in terms of deficits in automaticity, as measured by the tasks of Rapid Automatized Naming, Rapid Alternating Stimulus Naming, or the timed motor performance battery from the Physical and Neurological Examination for Soft Signs. Data were compared in children with and without disorders of automaticity regarding type of brain structure abnormality. Of the 1,587 children evaluated, 127 had a focal deficit. Eighty-seven had a brain MRI (52 on 1.5-tesla machines and 35 on 3.0-tesla machines). Forty of these images were found to be abnormal. These children were compared with a clinic sample of 150 patients with learning disabilities and no focal findings on examination, who also had undergone MRI. Only 5 of the latter group had abnormalities on MRI. Reduced verbal automaticity was associated with cerebellar abnormalities, whereas reduced automaticity on motor or motor and verbal tasks was associated with white matter abnormalities. Reduced automaticity of retrieval and slow timed motor performance appear to be highly associated with MRI findings. © 2015 American Academy of Neurology.

  19. Method for automatic control rod operation using rule-based control

    International Nuclear Information System (INIS)

    Kinoshita, Mitsuo; Yamada, Naoyuki; Kiguchi, Takashi

    1988-01-01

    An automatic control rod operation method using rule-based control is proposed. Its features are as follows: (1) a production system to recognize plant events, determine control actions, and realize fast inference (fast selection of a suitable production rule); (2) use of the fuzzy control technique to determine quantitative control variables. The method's performance was evaluated by simulation tests of automatic control rod operation at a BWR plant start-up. The results were as follows: (1) the performance, in terms of stabilization of controlled variables and the time required for reactor start-up, was superior to that of other methods such as PID control and program control; (2) the processing time to select and interpret a suitable production rule, which was the same as that required for event recognition or determination of a control action, was short enough (below 1 s) for real-time control. The results showed that the method is effective for automatic control rod operation. (author)
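
    The combination of features (1) and (2) — a production system that selects a control action, with a fuzzy grade sizing the quantitative variable — can be sketched as below. The rules, thresholds, and state variables are invented for illustration and are not those of the paper.

```python
def fuzzy_low(x, lo, hi):
    """Membership of x in 'low': 1 below lo, 0 above hi, linear between."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

RULES = [
    # (condition on recognized plant state, control action) -- first match wins
    (lambda s: s["period_s"] < 30, "insert_rods"),          # period too short
    (lambda s: s["power_pct"] < s["target_pct"], "withdraw_rods"),
    (lambda s: True, "hold"),                               # default rule
]

def decide(state):
    """Production-rule selection plus a fuzzy grade for the step size."""
    for cond, action in RULES:
        if cond(state):
            if action == "withdraw_rods":
                # Larger withdrawal step when power is far below target.
                grade = fuzzy_low(state["power_pct"],
                                  state["target_pct"] - 20,
                                  state["target_pct"])
                return action, round(2.0 * grade, 3)  # hypothetical step size
            return action, 0.0
```

    A real controller would recognize events from many process variables and defuzzify over several membership functions; this sketch keeps one fuzzy set to show the mechanism.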

  20. The Development of Automatic Sequences for the RF and Cryogenic Systems at the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Gurd, Pamela; Casagrande, Fabio; Mccarthy, Michael; Strong, William; Ganni, Venkatarao

    2005-01-01

    Automatic sequences both ease the task of operating a complex machine and ensure procedural consistency. At the Spallation Neutron Source project (SNS), a set of automatic sequences has been developed to perform the start-up and shut-down of the high-power RF systems. Similarly, sequences have been developed to perform backfill, pump-down, automatic valve control, and energy management in the cryogenic system. The sequences run on Linux soft input-output controllers (IOCs), which are similar to ordinary EPICS (Experimental Physics and Industrial Control System) IOCs in terms of data sharing with other EPICS processes, but which share a Linux processor with other such processes. Each sequence waits for a command from an operator console and then starts the corresponding set of instructions, allowing operators to follow the sequences either from an overview screen or from detail screens. We describe each system and our operational experience with it.
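
    The operator-triggered sequences described here can be sketched as an ordered list of (action, check) steps that halts and reports on the first failed check. Step names and thresholds below are hypothetical, not the actual SNS sequences.

```python
def run_sequence(steps, state):
    """Execute a sequence: each step is (name, action, check).
    Stop at the first failed check so operators can see where it halted."""
    log = []
    for name, action, check in steps:
        action(state)                      # drive the (simulated) hardware
        ok = check(state)                  # verify the step's postcondition
        log.append((name, "OK" if ok else "FAULT"))
        if not ok:
            break
    return log

# Hypothetical miniature RF start-up; names and limits are invented.
steps = [
    ("enable_hv",   lambda s: s.update(hv_kv=75),     lambda s: s["hv_kv"] > 50),
    ("raise_drive", lambda s: s.update(drive_pct=80), lambda s: s["drive_pct"] >= 80),
]
```

    In the real system each step would write and monitor EPICS process variables; the per-step log mirrors how operators follow progress on the overview and detail screens.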