WorldWideScience

Sample records for automatic performance debugging

  1. High Performance with Prescriptive Optimization and Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo

    ...parallelization and automatic vectorization are attractive as they transparently optimize programs. The thesis contributes an improved dependence analysis for explicitly parallel programs. These improvements lead to more loops being vectorized; on average we achieve a speedup of 1.46 over the existing dependence analysis and vectorizer in GCC. Automatic optimizations often fail for theoretical and practical reasons, and when they fail we argue that a hybrid approach can be effective. Using compiler feedback, we propose to use the programmer’s intuition and insight to achieve high performance. Compiler feedback enlightens the programmer as to why a given optimization was not applied, and suggests how to change the source code to make it more amenable to optimization. We show how this can yield significant speedups, achieving 2.4× faster execution on a real industrial use case. To aid in parallel debugging we propose...

  2. DySectAPI: Scalable Prescriptive Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Quarfot Nielsen, Niklas

    We present DySectAPI, a tool that allows users to construct probe trees for automatic, event-driven debugging at scale. The traditional, interactive debugging model, whereby users manually step through and inspect their application, does not scale well even for current supercomputers. While lightweight debugging models scale well, they can currently only debug a subset of bug classes. DySectAPI fills the gap between these two approaches with a novel user-guided approach. Using both experimental results and analytical modeling, we show how DySectAPI scales and can run with low overhead...

  3. Re-targeting the Graze performance debugging tool for Java threads and analyzing the re-targeting to automatically parallelized (FORTRAN) code

    OpenAIRE

    Tsai, Pedro T. H.

    2000-01-01

    Approved for public release; distribution is unlimited. This research focuses on the design of a language-independent concept, Glimpse, for performance debugging of multi-threaded programs. It extends previous work on Graze, a tool designed and implemented for performance debugging of C++ programs. Not only is Glimpse easily portable among different programming languages, it is also useful in many different paradigms, ranging from a few long-lived threads to many short-lived...

  4. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interfaces to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interfaces as analysis tools. (LSP)

  5. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interfaces to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interfaces as analysis tools. (LSP)

  6. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  7. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
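
    The grouping idea in this patent-style abstract, bucketing threads by the addresses of their calling instructions so that outlier groups stand out as likely defective, can be sketched as follows. The addresses and thread layout below are invented for illustration:

```python
from collections import defaultdict

def group_threads(stack_traces):
    """Group thread ids by their call-site address tuples.

    stack_traces: dict mapping thread id -> tuple of return addresses.
    Threads with identical call paths land in the same group; small or
    singleton groups are likely outliers worth inspecting first.
    """
    groups = defaultdict(list)
    for tid, addrs in stack_traces.items():
        groups[addrs].append(tid)
    return dict(groups)

traces = {
    0: (0x4005d0, 0x400800),   # most threads wait at the same barrier
    1: (0x4005d0, 0x400800),
    2: (0x4005d0, 0x400800),
    3: (0x4005d0, 0x400950),   # one thread diverged: a debugging lead
}
groups = group_threads(traces)
outliers = [tids for tids in groups.values() if len(tids) == 1]
print(outliers)  # [[3]]
```

    Displaying the groups (rather than all threads) is what keeps the output comprehensible at scale.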

  8. TinyDebug

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    2011-01-01

    Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, this paper presents TinyDebug, a multi-purpose passive debugging framework for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system... logging to extraction, and we show how the framework improves upon existing message-based and event-logging debugging techniques while enabling distributed event processing. We also present a number of optional event analysis tools demonstrating the generality of the TinyDebug debug messages...

  9. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-01-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  10. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark.

    Science.gov (United States)

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-05-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today's datacenters is time-consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next-generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time-consuming for an end user. First, BIGDEBUG's simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.
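
    BIGDEBUG's record-level fault isolation, pinpointing the crash-inducing record while letting the rest of the computation proceed, can be sketched outside Spark in plain Python. `run_with_culprit_tracking` is a hypothetical name for illustration, not BigDebug's actual API:

```python
def run_with_culprit_tracking(fn, records):
    """Apply fn to each record; on failure, capture the offending record
    and the exception, and return results for the rest so the
    computation can resume after a quick fix."""
    results, culprits = [], []
    for rec in records:
        try:
            results.append(fn(rec))
        except Exception as exc:
            culprits.append((rec, exc))
    return results, culprits

ok, bad = run_with_culprit_tracking(lambda x: 100 // x, [5, 2, 0, 4])
print(ok)                   # [20, 50, 25]
print([r for r, _ in bad])  # [0]
```

    In a distributed engine the same bookkeeping has to happen per partition, which is where the engineering difficulty (and the sub-25% overhead claim) lies.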

  11. Study of the nonlinear imperfect software debugging model

    International Nuclear Information System (INIS)

    Wang, Jinyong; Wu, Zhibo

    2016-01-01

    In recent years there has been a dramatic proliferation of research on imperfect software debugging phenomena. Software debugging is a complex process affected by a variety of factors, including the environment, resources, personnel skills, and personnel psychology. The simple assumption that debugging is perfect is therefore inconsistent with the actual software debugging process, in which a new fault can be introduced while removing an existing one. Furthermore, the fault introduction process is nonlinear, and the cumulative number of introduced faults increases nonlinearly over time. Thus, this paper proposes a nonlinear NHPP imperfect software debugging model built on the observation that fault introduction is a nonlinear process. The fitting and predictive power of the proposed NHPP-based model are validated through related experiments. Experimental results show that this model displays better fitting and predicting performance than the traditional NHPP-based perfect and imperfect software debugging models. S-confidence bounds are set to analyze the performance of the proposed model. This study also examines and discusses optimal software release-time policy comprehensively. In addition, this research on the nonlinear process of fault introduction is significant given the recent surge of studies on software-intensive products, such as cloud computing and big data.
    Highlights:
    • Fault introduction is a nonlinear process during the debugging phase.
    • The assumption that the process of fault introduction is nonlinear is credible.
    • Our proposed model better fits and more accurately predicts software failure behavior.
    • Research on the fault introduction case is significant for software-intensive products.
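
    The paper's exact mean-value function is not given in this abstract, but a generic imperfect-debugging NHPP in the same spirit can be sketched numerically: the fault content a(t) grows nonlinearly as debugging introduces new faults, while the expected number of detected faults m(t) chases it. All parameter values below are illustrative:

```python
import math

def mean_faults(t_end, a0=100.0, b=0.1, alpha=0.02, dt=0.01):
    """Forward-Euler integration of m'(t) = b * (a(t) - m(t)) with a
    nonlinearly growing fault content a(t) = a0 * exp(alpha * t):
    detection (rate b) never catches the content while faults keep
    being introduced (rate alpha)."""
    m, t = 0.0, 0.0
    while t < t_end:
        a = a0 * math.exp(alpha * t)
        m += b * (a - m) * dt
        t += dt
    return m

# Expected detected faults keeps growing and stays below the content.
print(mean_faults(10), mean_faults(50))
```

    A perfect-debugging model would instead hold a(t) = a0 constant, so m(t) saturates; the nonlinear introduction term is what changes the long-run behavior.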

  12. Debug register rootkits : A study of malicious use of the IA-32 debug registers

    OpenAIRE

    Persson, Emil; Mattsson, Joel

    2012-01-01

    The debug register rootkit is a special type of rootkit that has existed for over a decade and is said to be undetectable by any scanning tool. It exploits the debug registers in Intel’s IA-32 processor architecture. This paper investigates the debug register rootkit to find out why it is considered a threat, and which malware removal tools have implemented detection algorithms against this threat. By implementing and running a debug register rootkit against the most popular Linux tools, ne...

  13. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  14. Multi-purpose passive debugging for embedded wireless

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, we propose a multi-purpose passive debugging framework, called TinyDebug, for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...

  15. Integrated Debugging of Modelica Models

    Directory of Open Access Journals (Sweden)

    Adrian Pop

    2014-04-01

    The high abstraction level of equation-based object-oriented (EOO) languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of those problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational and algorithmic code debugging in an integrated fashion.

  16. Debugging data transfers in CMS

    International Nuclear Information System (INIS)

    Bagliesi, G; Belforte, S; Bloom, K; Bockelman, B; Bonacorsi, D; Fisk, I; Flix, J; Hernandez, J; D'Hondt, J; Maes, J; Kadastik, M; Klem, J; Kodolova, O; Kuo, C-M; Letts, J; Magini, N; Metson, S; Piedra, J; Pukhaeva, N; Tuura, L

    2010-01-01

    The CMS experiment at CERN is preparing for LHC data taking in several computing preparation activities. In early 2007 a traffic load generator infrastructure for distributed data transfer tests was designed and deployed to equip the WLCG tiers which support the CMS virtual organization with a means for debugging, load-testing and commissioning data transfer routes among CMS computing centres. The LoadTest is based upon PhEDEx as a reliable, scalable data set replication system. The Debugging Data Transfers (DDT) task force was created to coordinate the debugging of the data transfer links. The task force aimed to commission the most crucial transfer routes among CMS tiers by designing and enforcing a clear procedure to debug problematic links. This procedure aimed to move a link from a debugging phase, in a separate and independent environment, to a production environment once a set of agreed conditions was achieved for that link. The goal was to deliver working transfer routes to the CMS data operations team one by one. The preparation, activities and experience of the DDT task force within the CMS experiment are discussed. Common technical problems and challenges encountered during the lifetime of the task force in debugging data transfer links in CMS are explained and summarized.

  17. Debugging Data Transfers in CMS

    CERN Document Server

    Bagliesi, G; Bloom, K; Bockelman, B; Bonacorsi, D; Fisk, I; Flix, J; Hernandez, J; D'Hondt, J; Kadastik, M; Klem, J; Kodolova, O; Kuo, C M; Letts, J; Maes, J; Magini, N; Metson, S; Piedra, J; Pukhaeva, N; Tuura, L; Sonajalg, S; Wu, Y; Van Mulders, P; Villella, I; Wurthwein, F

    2010-01-01

    The CMS experiment at CERN is preparing for LHC data taking in several computing preparation activities. In early 2007 a traffic load generator infrastructure for distributed data transfer tests called the LoadTest was designed and deployed to equip the WLCG sites that support CMS with a means for debugging, load-testing and commissioning data transfer routes among CMS computing centres. The LoadTest is based upon PhEDEx as a reliable, scalable data set replication system. The Debugging Data Transfers (DDT) task force was created to coordinate the debugging of the data transfer links. The task force aimed to commission the most crucial transfer routes among CMS sites by designing and enforcing a clear procedure to debug problematic links. This procedure aimed to move a link from a debugging phase, in a separate and independent environment, to a production environment once a set of agreed conditions was achieved for that link. The goal was to deliver working transfer routes to the CMS data operations team one by one...

  18. MPI Debugging with Handle Introspection

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.

    The Message Passing Interface, MPI, is the standard programming model for high performance computing clusters. However, debugging applications on large scale clusters is difficult. The widely used Message Queue Dumping interface enables inspection of message queue state but there is no general in...

  19. An object-oriented extension for debugging the virtual machine

    Energy Technology Data Exchange (ETDEWEB)

    Pizzi, Jr, Robert G. [Univ. of California, Davis, CA (United States)

    1994-12-01

    A computer is nothing more than a virtual machine programmed by source code to perform a task. The program's source code expresses abstract constructs which are compiled into some lower-level target language. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low-level target implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design and data into the source code. We introduce OODIE, an object-oriented extension to programming languages that allows programmers to specify a virtual environment by describing the meaning of the design and data of a virtual machine. This specification is translated into symbolic information such that an augmented debugger can present engineers with a programmable debugging environment specifically tailored for the virtual machine that is to be debugged.

  20. Debugging systems-on-chip communication-centric and abstraction-based techniques

    CERN Document Server

    Vermeulen, Bart

    2014-01-01

    This book describes an approach and supporting infrastructure to facilitate debugging the silicon implementation of a System-on-Chip (SOC), allowing its associated product to be introduced into the market more quickly. Readers learn step-by-step the key requirements for debugging a modern, silicon SOC implementation, nine factors that complicate this debugging task, and a new debug approach that addresses these requirements and complicating factors. The authors’ novel communication-centric, scan-based, abstraction-based, run/stop-based (CSAR) debug approach is discussed in detail, showing how it helps to meet debug requirements and address the nine previously identified factors that complicate debugging silicon implementations of SOCs. The authors also derive the debug infrastructure requirements to support debugging of a silicon implementation of an SOC with their CSAR debug approach. This debug infrastructure consists of a generic on-chip debug architecture, a configurable automated design-for-debug ...

  1. Lightweight and Statistical Techniques for Petascale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    ...to a small set of nodes, or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.

  2. Visualizing Debugging Activity in Source Code Repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight in the

  3. A debugging method of the Quadrotor UAV based on infrared thermal imaging

    Science.gov (United States)

    Cui, Guangjie; Hao, Qian; Yang, Jianguo; Chen, Lizhi; Hu, Hongkang; Zhang, Lijun

    2018-01-01

    High-performance UAVs have been popular and in great demand in recent years. This paper introduces a new method for debugging quadrotor UAVs. Based on infrared thermal technology and heat transfer theory, a UAV under debugging hovers above a hot-wire grid composed of 14 heated nichrome wires, so that the air flow propelled by the rotating rotors influences the temperature distribution of the grid. An infrared thermal imager below observes the distribution and captures thermal images of the hot-wire grid. With the assistance of a mathematical model and experiments, the paper discusses the relationship between the thermal images and the rotor speeds. By testing correctly debugged UAVs, reference information and thermal images can be acquired. The paper demonstrates that, by comparison against these reference thermal images, critical data for a UAV being debugged in the same test can be read directly or obtained after interpolation. The results are shown in the paper and the advantages are discussed.
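
    The final step, reading rotor data off a thermal image "directly or after interpolation" against reference data, amounts to a calibrated table lookup. A minimal sketch, with an entirely hypothetical calibration table (the paper's actual temperature-speed data is not given in the abstract):

```python
from bisect import bisect_left

def speed_from_temperature(temp_c, calibration):
    """Linearly interpolate rotor speed (rpm) from a hot-wire grid
    temperature reading, given a calibration table of
    (temperature_C, rpm) pairs sorted by temperature."""
    temps = [t for t, _ in calibration]
    rpms = [r for _, r in calibration]
    if temp_c <= temps[0]:
        return rpms[0]
    if temp_c >= temps[-1]:
        return rpms[-1]
    i = bisect_left(temps, temp_c)
    t0, t1 = temps[i - 1], temps[i]
    r0, r1 = rpms[i - 1], rpms[i]
    return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)

# Hypothetical calibration: faster rotors cool the heated wires more.
cal = [(40.0, 8000), (55.0, 5000), (70.0, 2000)]
print(speed_from_temperature(47.5, cal))  # 6500.0
```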

  4. Visualizing Debugging Activity in Source Code Repositories

    OpenAIRE

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight into the bug evolution over time.

  5. Debugging in a multi-processor environment

    International Nuclear Information System (INIS)

    Spann, J.M.

    1981-01-01

    The Supervisory Control and Diagnostic System (SCDS) for the Mirror Fusion Test Facility (MFTF) consists of nine 32-bit minicomputers arranged in a tightly coupled distributed computer system utilizing a shared memory as the data exchange medium. Debugging more than one program in this multi-processor environment is a difficult process. This paper describes the new tools that were developed and how the testing of software is performed in the SCDS for the MFTF project.

  6. Bifröst: debugging web applications as a whole

    NARCIS (Netherlands)

    K.B. van der Vlist (Kevin)

    2013-01-01

    Even though web application development is supported by professional tooling, debugging support is lacking. If one starts to debug a web application, hardly any tooling support exists. Only the core components like server processes and a web browser are exposed. Developers need to

  7. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
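
    Panda's storage format is not described in this abstract, but record-level backward tracing over logged parent links can be sketched as a simple graph walk; all identifiers below are hypothetical:

```python
def backward_trace(provenance, outputs):
    """Follow record-level provenance links from output records back to
    the source records that produced them.

    provenance: dict mapping each derived record id to the list of
    input record ids it was computed from (one entry per processing
    node that produced it)."""
    frontier, sources = set(outputs), set()
    while frontier:
        rec = frontier.pop()
        parents = provenance.get(rec)
        if parents:              # derived record: keep tracing upstream
            frontier.update(parents)
        else:                    # no recorded parents: a source record
            sources.add(rec)
    return sources

# Hypothetical two-node workflow: a join followed by an aggregation.
prov = {"agg1": ["join1", "join2"],
        "join1": ["a1", "b1"],
        "join2": ["a2", "b1"]}
print(sorted(backward_trace(prov, ["agg1"])))  # ['a1', 'a2', 'b1']
```

    Forward tracing is the same walk over the inverted mapping, which is how a surprising output can be drilled down to the exact inputs that explain it.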

  8. Debugging multi-core systems-on-chip

    NARCIS (Netherlands)

    Vermeulen, B.; Goossens, K.G.W.; Kornaros, G.

    2010-01-01

    In this chapter, we introduce three fundamental reasons why debugging a multi-processor SoC is intrinsically difficult: (1) limited internal observability, (2) asynchronicity, and (3) non-determinism.

  9. GeoBuilder: a geometric algorithm visualization and debugging system for 2D and 3D geometric computing.

    Science.gov (United States)

    Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai

    2009-01-01

    Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, computer networks, etc., to facilitate algorithmic researchers in testing their ideas, demonstrating new findings, and teaching algorithm design in the classroom. Within the broad applications of algorithm visualization, there still remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effect in 3D environments. Using modern technologies of Java programming, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, engagement of collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications.

  10. Visual Debugging of Object-Oriented Systems With the Unified Modeling Language

    National Research Council Canada - National Science Library

    Fox, Wendell

    2004-01-01

    ...Debugging and analysis tools are required to aid in this process. Debugging of large object-oriented systems is a difficult cognitive process that requires understanding of both the overall and detailed behavior of the application...

  11. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminry Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool.
In the first two...
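
    STAT's core idea, merging per-process stack traces into a call-prefix tree and flagging where the ranks diverge into distinct behavior classes, can be sketched as follows (frame names and ranks invented for illustration):

```python
def build_prefix_tree(traces):
    """Merge per-rank stack traces (outermost frame first) into a
    nested dict; each node records which ranks reached it."""
    root = {"ranks": set(), "children": {}}
    for rank, frames in traces.items():
        node = root
        for frame in frames:
            node = node["children"].setdefault(
                frame, {"ranks": set(), "children": {}})
            node["ranks"].add(rank)
    return root

def divergence_points(node, path=()):
    """Yield call paths where the ranks split into more than one
    behavior class - often the first place to look for a bug."""
    if len(node["children"]) > 1:
        yield path, {f: sorted(c["ranks"])
                     for f, c in node["children"].items()}
    for frame, child in node["children"].items():
        yield from divergence_points(child, path + (frame,))

traces = {0: ["main", "solve", "mpi_wait"],
          1: ["main", "solve", "mpi_wait"],
          2: ["main", "solve", "compute"]}   # rank 2 diverges
tree = build_prefix_tree(traces)
for path, split in divergence_points(tree):
    print(path, split)
```

    On a real machine the tree is built with a scalable reduction (MRNet in STAT's case); the point is that the tree's size tracks the number of behavior classes, not the number of ranks.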

  12. A Framework for Debugging Geoscience Projects in a High Performance Computing Environment

    Science.gov (United States)

    Baxter, C.; Matott, L.

    2012-12-01

    High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.

  13. Debugging and Logging Services for Defence Service Oriented Architectures

    Science.gov (United States)

    2012-02-01

    Service: a software component and callable end point that provides a logically related set of operations, each of which performs a logical step in a... It is important to note that in some cases, when the fault is identified to lie in uneditable code such as program libraries or outsourced software services, debugging is limited to characterisation of the fault, reporting it to the software or service provider, and development of work-arounds and management...

  14. A Novel Method for Live Debugging of Production Web Applications by Dynamic Resource Replacement

    OpenAIRE

    Khalid Al-Tahat; Khaled Zuhair Mahmoud; Ahmad Al-Mughrabi

    2014-01-01

    This paper proposes a novel methodology for enabling debugging and tracing of production web applications without affecting their normal flow and functionality. This method of debugging enables developers and maintenance engineers to replace a set of existing resources, such as images, server-side scripts, and cascading style sheets, with another set of resources per web session. The new resources are only active in the debug session; other sessions are not affected. T...
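
    The per-session replacement described above can be sketched as a resource resolver that consults an override map only for debug sessions; all names below are illustrative, not the paper's implementation:

```python
def make_resolver(resources, overrides):
    """Return a lookup that serves replacement resources only to debug
    sessions; every other session sees the production set untouched."""
    def resolve(session, path):
        if session.get("debug") and path in overrides:
            return overrides[path]
        return resources[path]
    return resolve

prod = {"/js/app.js": "app-v1.js", "/css/site.css": "site.css"}
debug_overrides = {"/js/app.js": "app-instrumented.js"}
resolve = make_resolver(prod, debug_overrides)

print(resolve({"debug": True}, "/js/app.js"))  # app-instrumented.js
print(resolve({}, "/js/app.js"))               # app-v1.js
```

    Because the override lookup is keyed by session, an instrumented script can trace one user's live session while production traffic keeps the original resources.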

  15. Vdebug: debugging tool for parallel scientific programs. Design report on vdebug

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2000-02-01

    We report on a debugging tool called vdebug which supports debugging work for parallel scientific simulation programs. It is difficult to debug scientific programs with an existing debugger because the volume of data generated by the programs is too large for users to check in character form; existing debuggers usually show data values as characters. To alleviate this, we have developed vdebug, which makes it possible to check the validity of large amounts of data by displaying the data values visually. Although vdebug had previously been restricted to sequential programs, we have made it applicable to parallel programs by adding the ability to merge and visualize data distributed across the programs on each computer node. Vdebug now works on seven kinds of parallel computers. In this report, we describe the design of vdebug. (author)

  16. Debugging Nondeterministic Failures in Linux Programs through Replay Analysis

    Directory of Open Access Journals (Sweden)

    Shakaiba Majeed

    2018-01-01

    Full Text Available Reproducing a failure is the first and most important step in debugging because it enables us to understand the failure and track down its source. However, many programs are susceptible to nondeterministic failures that are hard to reproduce, which makes debugging extremely difficult. We first address the reproducibility problem by proposing an OS-level replay system for a uniprocessor environment that can capture and replay the nondeterministic events needed to reproduce a failure in Linux interactive and event-based programs. We then present an analysis method, called replay analysis, based on the proposed record and replay system to diagnose concurrency bugs in such programs. The replay analysis method uses a combination of static analysis, dynamic tracing during replay, and delta debugging to identify the failure-inducing memory access patterns that lead to a concurrency failure. The experimental results show that the presented record and replay system has low recording overhead and hence can be safely used in production systems to catch rarely occurring bugs. We also present a few concurrency bug case studies from real-world applications to prove the effectiveness of the proposed bug diagnosis framework.
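
The delta-debugging step mentioned above can be illustrated with a simplified `ddmin`-style minimizer: repeatedly drop chunks of a candidate event set and keep any reduced set that still reproduces the failure. This is a sketch of the general technique, not the paper's actual implementation.

```python
def ddmin(events, fails):
    """Shrink `events` to a small subset for which fails(subset) is still True."""
    n = 2  # number of chunks to try removing
    while len(events) >= 2:
        chunk = max(1, len(events) // n)
        reduced = False
        for start in range(0, len(events), chunk):
            subset = events[:start] + events[start + chunk:]
            if subset and fails(subset):
                events, n, reduced = subset, max(n - 1, 2), True
                break
        if not reduced:
            if n >= len(events):
                break          # cannot refine further
            n = min(n * 2, len(events))
    return events

# Toy failure model: the "program" crashes whenever accesses 3 and 7 both occur.
crash = lambda evs: 3 in evs and 7 in evs
minimal = ddmin(list(range(10)), crash)
```

Starting from ten recorded events, the minimizer isolates the two that jointly trigger the failure.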

  17. Realization of rapid debugging for detection circuit of optical fiber gas sensor: Using an analog signal source

    Science.gov (United States)

    Tian, Changbin; Chang, Jun; Wang, Qiang; Wei, Wei; Zhu, Cunguang

    2015-03-01

    An optical fiber gas sensor mainly consists of two parts: the optical part and the detection circuit. When debugging the detection circuit, the optical part usually serves as the signal source. However, under debugging conditions the optical part is easily influenced by many factors: fluctuations of the ambient temperature or driving current cause instability in the wavelength and intensity of the laser; for a dual-beam sensor, different bends and stresses of the optical fiber lead to fluctuations of intensity and phase; and intensity noise from the collimator, coupler, and other optical devices in the system further degrades the purity of the optical signal source. To dramatically improve the debugging efficiency of the detection circuit and shorten the research and development cycle, this paper describes an analog signal source consisting of a single-chip microcomputer (SCM), an amplifier circuit, and a voltage-to-current conversion circuit. It can be used for rapid debugging of the detection circuit of the optical fiber gas sensor in place of the optical-part-based signal source. The analog signal source performs well and offers many other advantages, such as simple operation, small size, and light weight.

  18. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-nonsense look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our attention...

  19. An automatic frequency control system of 2-MeV electronic LINAC

    International Nuclear Information System (INIS)

    Hu Xue; Zhang Junqiang; Zhong Shaopeng; Zhao Minghua

    2013-01-01

    Background: In electronic LINACs, the magnetron is often used as the power source. The output frequency of the magnetron drifts with the environment, and the resulting difference between the magnetron output frequency and the resonant frequency of the accelerator degrades the performance of the LINAC system. Purpose: To ensure the effective performance of the entire LINAC system, an automatic frequency control system is necessary. Methods: A phase-locked frequency discriminator is designed to measure the frequency difference between the accelerating guide and the magnetron, and an analogue circuit is used to process the output signals of the frequency discriminator unit. Results: Working with the automatic frequency control (AFC) system, the output frequency of the magnetron can be controlled in the range of (2998 MHz, 2998 MHz + 70 kHz) and (2998 MHz, 2998 MHz - 30 kHz). Conclusions: Through measurement and debugging, the functionality of the frequency discriminator unit and the signal processor circuit was verified effectively. (authors)
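
The closed loop described above can be illustrated numerically: a discriminator yields the magnetron-vs-cavity frequency error, and a proportional correction steers the magnetron tuner to null it out. The plant model and the gain value are invented for this sketch and are not taken from the paper.

```python
def afc_step(f_magnetron, f_cavity, gain=0.5):
    """One control iteration: return the corrected magnetron frequency (kHz)."""
    error = f_magnetron - f_cavity      # frequency discriminator output
    return f_magnetron - gain * error   # tuner correction applied

f_cavity = 2_998_000.0   # accelerating-structure resonance, kHz
f_mag = 2_998_070.0      # magnetron drifted +70 kHz away

for _ in range(20):      # iterate the loop until the error is negligible
    f_mag = afc_step(f_mag, f_cavity)

residual = abs(f_mag - f_cavity)
```

With a stable loop gain, the initial 70 kHz offset decays geometrically toward zero.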

  20. Prototype application for the control and debugging of CMS upgrade projects

    CERN Document Server

    Mills-Howell, Dominic

    2016-01-01

    Following the high-luminosity upgrades of the LHC, many subsystems of the CMS experiment require upgrading, and others are using the LHC shutdowns as an opportunity to improve performance. The upgrades themselves have served to highlight the exigency of attacking problems that were previously unaddressed. One such problem is the need for a tool that allows users to easily monitor, debug, and test custom hardware. Such a tool could, in theory, be abstracted to work with various hardware devices, with the added benefits of being able to support future hardware and of maintaining parallel operation with the remaining control software.

  1. Test and debug features of the RTO7 chip

    NARCIS (Netherlands)

    Kaam, van K.M.M.; Vermeulen, H.G.H.; Bergveld, H.J.

    2005-01-01

    The Philips RTO7 chip consists of a complete receive chain, from RF up to and including digital demodulation, for Bluetooth-like radio communication. This paper describes both the implementation and verification of the test and debug hardware for the digital core of the RTO7. The core-based DfT and

  2. Modern multithreading implementing, testing, and debugging multithreaded Java and C++/Pthreads/Win32 programs

    CERN Document Server

    Carver, Richard H

    2005-01-01

    Master the essentials of concurrent programming, including testing and debugging. This textbook examines languages and libraries for multithreaded programming. Readers learn how to create threads in Java and C++, and develop essential concurrent programming and problem-solving skills. Moreover, the textbook sets itself apart from other comparable works by helping readers to become proficient in key testing and debugging techniques. Among the topics covered, readers are introduced to the relevant aspects of Java, the POSIX Pthreads library, and the Windows Win32 Applications Programming Interface.

  3. Proof Theory, Transformations, and Logic Programming for Debugging Security Protocols

    NARCIS (Netherlands)

    Pettorossi, Alberto; Delzanno, Giorgio; Etalle, Sandro

    2001-01-01

    We define a sequent calculus to formally specify, simulate, debug and verify security protocols. In our sequents we distinguish between the current knowledge of principals and the current global state of the session. Hereby, we can describe the operational semantics of principals and of an intruder

  4. Fundamentals of IP and SoC security design, verification, and debug

    CERN Document Server

    Ray, Sandip; Sur-Kolay, Susmita

    2017-01-01

    This book is about security in embedded systems and it provides an authoritative reference to all aspects of security in system-on-chip (SoC) designs. The authors discuss issues ranging from security requirements in SoC designs, definition of architectures and design choices to enforce and validate security policies, and trade-offs and conflicts involving security, functionality, and debug requirements. Coverage also includes case studies from the “trenches” of current industrial practice in design, implementation, and validation of security-critical embedded systems. Provides an authoritative reference and summary of the current state-of-the-art in security for embedded systems, hardware IPs and SoC designs; Takes a "cross-cutting" view of security that interacts with different design and validation components such as architecture, implementation, verification, and debug, each enforcing unique trade-offs; Includes high-level overview, detailed analysis on implementation, and relevant case studies on desi...

  5. Performance of data acceptance criteria over 50 months from an automatic real-time environmental radiation surveillance network

    International Nuclear Information System (INIS)

    Casanovas, R.; Morant, J.J.; Lopez, M.; Hernandez-Giron, I.; Batalla, E.; Salvado, M.

    2011-01-01

    The automatic real-time environmental radiation surveillance network of Catalonia (Spain) comprises two subnetworks: one with 9 aerosol monitors and the other with 8 Geiger monitors together with 2 water monitors located in the Ebre river. Since September 2006, several improvements have been implemented in order to obtain data of better quality and quantity, allowing a more accurate data analysis. However, several causes (natural causes, equipment failure, artificial external causes and incidents in nuclear power plants) may produce measured radiological values that do not match the station's own background, whether insignificant spurious readings or true radiological values. Thus, data from a 50-month period were analysed, which allowed an easily implementable statistical criterion to be established for finding those values that require special attention. This criterion proved a very useful tool for creating a properly debugged database and for giving a quick response to equipment failures or possible radiological incidents. This paper presents the results obtained from applying the criterion, including figures for the expected, raw and debugged data, percentages of missing data grouped by cause, and radiological measurements from the networks. Finally, based on the discussed information, recommendations for the improvement of the network are identified to obtain better radiological information and analysis capabilities. - Highlights: → Causes producing data that do not match the station's own background are described. → Causes may be natural, equipment failure, external or nuclear plant incidents. → These causes can produce either spurious or true radiological data. → A criterion to find these data was implemented and tested over a 50-month period. → Recommendations for the improvement of the network are identified.
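
An "easily implementable statistical criterion" of the kind described can be sketched as a deviation test against the station's own background: flag any reading that departs from the background mean by more than k standard deviations. The threshold k = 3 and the sample data are assumptions for the example, not the paper's actual parameters.

```python
def flag_outliers(readings, background, k=3.0):
    """Indices of readings deviating from the background mean by > k sigma."""
    mean = sum(background) / len(background)
    var = sum((x - mean) ** 2 for x in background) / len(background)
    sigma = var ** 0.5
    return [i for i, r in enumerate(readings) if abs(r - mean) > k * sigma]

# Dose-rate background of one station (µSv/h) and a batch of new readings,
# one of which is a spike that requires special attention.
background = [0.10, 0.11, 0.09, 0.10, 0.12, 0.10, 0.09, 0.11]
readings = [0.10, 0.11, 0.45, 0.10]
suspect = flag_outliers(readings, background)
```

Flagged values would then be inspected to decide whether they are spurious or true radiological events.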

  6. TOPIC: a debugging code for torus geometry input data of Monte Carlo transport code

    International Nuclear Information System (INIS)

    Iida, Hiromasa; Kawasaki, Hiromitsu.

    1979-06-01

    TOPIC has been developed for debugging the geometry input data of the Monte Carlo transport code. The code has the following features: (1) It debugs the geometry input data not only of MORSE-GG but also of MORSE-I, which is capable of treating torus geometry. (2) Its calculation results are shown in figures drawn by plotter or COM, so that regions left undefined or doubly defined are easily detected. (3) It finds a multitude of input data errors in a single run. (4) The input data required by this code are few, so that it is readily usable in a time-sharing system on a FACOM 230-60/75 computer. Example TOPIC calculations from design studies of tokamak fusion reactors (JXFR, INTOR-J) are presented. (author)
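
The essence of detecting undefined and doubly defined regions can be sketched as a point sampler: for each probe point, count how many input regions claim it; zero hits means "undefined", two or more means "doubly defined". The axis-aligned boxes below stand in for real Monte Carlo geometry bodies purely for illustration.

```python
def classify_points(points, regions):
    """Map each probe point to 'ok', 'undefined', or 'multiple'."""
    out = []
    for p in points:
        hits = sum(1 for (lo, hi) in regions
                   if all(l <= c <= h for c, l, h in zip(p, lo, hi)))
        out.append('ok' if hits == 1 else
                   'undefined' if hits == 0 else 'multiple')
    return out

# Two overlapping 2-D boxes, each given as (lower corner, upper corner).
regions = [((0.0, 0.0), (1.0, 1.0)), ((0.5, 0.0), (2.0, 1.0))]
points = [(0.25, 0.5), (0.75, 0.5), (3.0, 0.5)]
status = classify_points(points, regions)
```

A plot of the sampled points coloured by status would reveal the problem areas at a glance, as TOPIC's plotter output does.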

  7. An approach to profiling for run-time checking of computational properties and performance debugging in logic programs.

    OpenAIRE

    Mera, E.; Trigo, Teresa; López García, Pedro; Hermenegildo, Manuel V.

    2010-01-01

    Although several profiling techniques for identifying performance bottlenecks in logic programs have been developed, they are generally not automatic and in most cases they do not provide enough information for identifying the root causes of such bottlenecks. This complicates using their results for guiding performance improvement. We present a profiling method and tool that provides such explanations. Our profiler associates cost centers to certain program elements and can measure different ...

  8. Iterative Authoring Using Story Generation Feedback: Debugging or Co-creation?

    Science.gov (United States)

    Swartjes, Ivo; Theune, Mariët

    We explore the role that story generation feedback may play within the creative process of interactive story authoring. While such feedback is often used as 'debugging' information, we explore here a 'co-creation' view, in which the outcome of the story generator influences authorial intent. We illustrate an iterative authoring approach in which each iteration consists of idea generation, implementation and simulation. We find that the tension between authorial intent and the partially uncontrollable story generation outcome may be relieved by taking such a co-creation approach.

  9. Configuration and debug of field programmable gate arrays using MATLAB®/SIMULINK®

    International Nuclear Information System (INIS)

    Grout, I; Ryan, J; O'Shea, T

    2005-01-01

    Increasingly, the need to seamlessly link high-level behavioural descriptions of electronic hardware for modelling and simulation purposes to the final application hardware highlights the gap between high-level behavioural descriptions of the required circuit functionality (considering here digital logic) in commonly used mathematical modelling tools and hardware description languages such as VHDL and Verilog-HDL. In this paper, the linking of a MATLAB® model of a digital algorithm, targeted at a programmable logic device, to a VHDL design synthesised from that model is discussed. This VHDL model is itself synthesised and downloaded to the target Field Programmable Gate Array, both for normal operation and for design debug purposes. To demonstrate this, a circuit architecture mapped from a SIMULINK® model is presented. The rationale is a seamless interface between the initial algorithm development and the target hardware, enabling the hardware to be debugged and compared to the simulated model from a single interface, for use by a non-expert in programmable logic and hardware description languages

  10. Visual Debugging of Object-Oriented Systems With the Unified Modeling Language

    Science.gov (United States)

    2004-03-01

    to be “the systematic and imaginative use of the technology of interactive computer graphics and the disciplines of graphic design, typography ...Traditional debugging involves the user creating a mental image of the structure and execution path based on source code. According to Miller, the 7 ± 2...of each FigClass (the class that represents the image of a class), the DOI and LOD for each, and finally calls a method to apply the visual

  11. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Full Text Available Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising because executing reverse code can restore the previous states of a program without state saving. In the literature, two methods that generate reverse code can be found: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
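
The contrast with state saving can be made concrete with a toy model of dynamically generated reverse code: each forward assignment also emits an inverse operation, so stepping backwards replays the inverses instead of restoring whole saved states. This mirrors the idea only; the article's generator works on real programs, and all names below are invented.

```python
class ReversibleStore:
    def __init__(self):
        self.vars = {}
        self.reverse_log = []   # the dynamically generated "reverse code"

    def assign(self, name, value):
        old = self.vars.get(name)          # None means "was unbound" here
        # Inverse of this assignment: restore (or delete) the old binding.
        self.reverse_log.append((name, old))
        self.vars[name] = value

    def step_back(self):
        """Execute one inverse operation, undoing the latest assignment."""
        name, old = self.reverse_log.pop()
        if old is None:
            del self.vars[name]
        else:
            self.vars[name] = old

store = ReversibleStore()
store.assign('x', 1)
store.assign('x', 2)
store.step_back()       # undo x = 2, leaving x = 1
```

Only one (name, old value) pair is kept per step, rather than a snapshot of the full program state.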

  12. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, Tamara Lynn [Univ. of California, Davis, CA (United States)

    2008-01-01

    Performance-driven interface contract enforcement research aims to improve the quality of programs built from plug-and-play scientific components. Interface contracts make the obligations on the caller and on all implementations of the specified methods explicit. Runtime contract enforcement is a well-known technique for enhancing testing and debugging. However, checking all of the associated constraints during deployment is generally considered too costly from a performance standpoint. Previous solutions enforced subsets of constraints without explicit consideration of their performance implications. Hence, this research measures the impacts of different interface contract sampling strategies and compares results with new techniques driven by execution time estimates. Results from three studies indicate that automatically adjusting the level of checking based on performance constraints improves the likelihood of detecting contract violations under certain circumstances. Specifically, performance-driven enforcement is better suited to programs exercising constraints whose costs are at most moderately expensive relative to normal program execution.
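
The idea of performance-driven sampling can be sketched as a wrapper that checks a precondition only while the cumulative cost of checking stays within a chosen fraction of total execution time. The decorator name, the 5% budget, and the budgeting rule are all invented for illustration and are not the paper's mechanism.

```python
import time

def enforced(pre, budget_fraction=0.05):
    """Check `pre` only while checking time stays under the budget fraction."""
    def wrap(fn):
        spent = {'checking': 0.0, 'total': 1e-9}  # avoid divide-by-zero
        def inner(*args):
            t0 = time.perf_counter()
            if spent['checking'] <= budget_fraction * spent['total']:
                t1 = time.perf_counter()
                assert pre(*args), 'precondition violated'
                spent['checking'] += time.perf_counter() - t1
            result = fn(*args)
            spent['total'] += time.perf_counter() - t0
            return result
        return inner
    return wrap

@enforced(pre=lambda x: x >= 0)
def square_root(x):
    return x ** 0.5

value = square_root(9.0)   # first call: budget is unspent, so the check runs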

  13. Handwriting Automaticity: The Search for Performance Thresholds

    Science.gov (United States)

    Medwell, Jane; Wray, David

    2014-01-01

    Evidence is accumulating that handwriting has an important role in written composition. In particular, handwriting automaticity appears to relate to success in composition. This relationship has been little explored in British contexts and we currently have little idea of what threshold performance levels might be. In this paper, we report on two…

  14. Space-Based FPGA Radio Receiver Design, Debug, and Development of a Radiation-Tolerant Computing System

    Directory of Open Access Journals (Sweden)

    Zachary K. Baker

    2010-01-01

    Full Text Available Los Alamos has recently completed the latest in a series of Reconfigurable Software Radios, which incorporates several key innovations in both hardware design and algorithms. Due to our focus on satellite applications, each design must extract the best size, weight, and power performance possible from the ensemble of Commodity Off-the-Shelf (COTS) parts available at the time of design. A large component of our work lies in determining if a given part will survive in space and how it will fail under various space radiation conditions. Using two Xilinx Virtex 4 FPGAs, we have achieved 1 TeraOps/second signal processing on a 1920 Megabit/second datastream. This processing capability enables very advanced algorithms such as our wideband RF compression scheme to operate at the source, allowing bandwidth-constrained applications to deliver previously unattainable performance. This paper will discuss the design of the payload, making electronics survivable in the radiation of space, and techniques for debugging.

  15. Application of remote debugging techniques in user-centric job monitoring

    International Nuclear Information System (INIS)

    Dos Santos, T; Mättig, P; Harenberg, T; Volkmer, F; Beermann, T; Kalinin, S; Ahrens, R; Wulff, N

    2012-01-01

    With the Job Execution Monitor, a user-centric job monitoring software developed at the University of Wuppertal and integrated into the job brokerage systems of the WLCG, job progress and grid worker node health can be supervised in real time. Imminent error conditions can thus be detected early by the submitter and countermeasures can be taken. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job misbehaviour. To remove the last 'blind spot' from this monitoring, a remote debugging technique based on the GNU C compiler suite was developed and integrated into the software; its design concept and architecture are described in this paper and its application discussed.

  16. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    Science.gov (United States)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle hit interacts with critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effects influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation, and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  17. The roots of stereotype threat: when automatic associations disrupt girls' math performance.

    Science.gov (United States)

    Galdi, Silvia; Cadinu, Mara; Tomasetto, Carlo

    2014-01-01

    Although stereotype awareness is a prerequisite for stereotype threat effects (Steele & Aronson, 1995), research showed girls' deficit under stereotype threat before the emergence of math-gender stereotype awareness, and in the absence of stereotype endorsement. In a study including 240 six-year-old children, this paradox was addressed by testing whether automatic associations trigger stereotype threat in young girls. Whereas no indicators were found that children endorsed the math-gender stereotype, girls, but not boys, showed automatic associations consistent with the stereotype. Moreover, results showed that girls' automatic associations varied as a function of a manipulation regarding the stereotype content. Importantly, girls' math performance decreased in a stereotype-consistent, relative to a stereotype-inconsistent, condition and automatic associations mediated the relation between stereotype threat and performance. © 2013 The Authors. Child Development © 2013 Society for Research in Child Development, Inc.

  18. Simulation of a nuclear measurement system around a multi-task mode real-time monitor

    International Nuclear Information System (INIS)

    De Grandi, G.; Ouiguini, R.

    1983-01-01

    When debugging and testing hardware and software for the automation of systems, the unavailability of the system itself poses important logistical problems. This report proposes a simulator of the system to be automated, conceived around a multi-task mode real-time monitor, which allows the automation software to be debugged without the physical presence of the system being automated.

  19. An integrated development environment for PMESII model authoring, integration, validation, and debugging

    Science.gov (United States)

    Pioch, Nicholas J.; Lofdahl, Corey; Sao Pedro, Michael; Krikeles, Basil; Morley, Liam

    2007-04-01

    To foster shared battlespace awareness in Air Operations Centers supporting the Joint Forces Commander and Joint Force Air Component Commander, BAE Systems is developing a Commander's Model Integration and Simulation Toolkit (CMIST), an Integrated Development Environment (IDE) for model authoring, integration, validation, and debugging. CMIST is built on the versatile Eclipse framework, a widely used open development platform comprised of extensible frameworks that enable development of tools for building, deploying, and managing software. CMIST provides two distinct layers: 1) a Commander's IDE for supporting staff to author models spanning the Political, Military, Economic, Social, Infrastructure, Information (PMESII) taxonomy; integrate multiple native (third-party) models; validate model interfaces and outputs; and debug the integrated models via intuitive controls and time series visualization, and 2) a PMESII IDE for modeling and simulation developers to rapidly incorporate new native simulation tools and models to make them available for use in the Commander's IDE. The PMESII IDE provides shared ontologies and repositories for world state, modeling concepts, and native tool characterization. CMIST includes extensible libraries for 1) reusable data transforms for semantic alignment of native data with the shared ontology, and 2) interaction patterns to synchronize multiple native simulations with disparate modeling paradigms, such as continuous-time system dynamics, agent-based discrete event simulation, and aggregate solution methods such as Monte Carlo sampling over dynamic Bayesian networks. This paper describes the CMIST system architecture, our technical approach to addressing these semantic alignment and synchronization problems, and initial results from integrating Political-Military-Economic models of post-war Iraq spanning multiple modeling paradigms.

  20. Clinical performance of a new hepatitis B surface antigen quantitative assay with automatic dilution

    Directory of Open Access Journals (Sweden)

    Ta-Wei Liu

    2015-01-01

    Full Text Available Hepatitis B virus surface antigen (HBsAg) levels reflect disease status and can predict the clinical response to antiviral treatment; however, the emergence of HBsAg mutant strains has become a challenge. The Abbott HBsAg quantification assay provides enhanced detection of HBsAg and HBsAg mutants. We aimed to evaluate the performance of the Abbott HBsAg quantification assay with automatic sample dilution (shortened as automatic Architect assay), compared with the Abbott HBsAg quantification assay with manual sample dilution (shortened as manual Architect assay) and the Roche HBsAg quantification assay with automatic sample dilution (shortened as Elecsys). A total of 130 sera samples obtained from 87 hepatitis B virus (HBV)-infected patients were collected to assess the correlation between the automatic and manual Architect assays. Among the 87 patients, 41 provided 42 sera samples to confirm the linearity and reproducibility of the automatic Architect assay, and to find the correlation among the Elecsys and the two Architect assays. The coefficients of variation (0.44–9.53%) and R² values (0.996–1), which were both determined using values obtained from the automatic Architect assay, showed good reproducibility and linearity. Results of the two Architect assays demonstrated a feasible correlation (n = 130 samples; R = 0.898, p  0.93 in all cases. In conclusion, the correlation between the automatic and manual dilution Architect assays was feasible, particularly in the HBeAg-negative and low-DNA groups. With lower labor costs and less human error than the manual version, the Abbott automatic dilution Architect assay provided good clinical performance with regard to HBsAg levels.
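
The reproducibility and correlation figures quoted above rest on two standard statistics, the coefficient of variation and Pearson's correlation coefficient; a minimal pure-Python version of both, with invented sample data, is sketched here for reference.

```python
def coefficient_of_variation(xs):
    """CV in percent: 100 * (standard deviation / mean)."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return 100.0 * var ** 0.5 / mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented replicate measurements and paired assay results:
cv = coefficient_of_variation([10.0, 10.2, 9.8, 10.0])
r = pearson_r([1, 2, 3, 4], [2.1, 3.9, 6.0, 8.1])
```

Low CV across replicates indicates reproducibility; r (and R² = r²) close to 1 indicates linear agreement between assays.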

  1. Data Integration Tool: Permafrost Data Debugging

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.

    2017-12-01

    We developed a Data Integration Tool (DIT) to significantly speed up the manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC) with the GTN-P to improve permafrost data access and discovery. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets (https://github.com/PermaData/DIT). Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs, incrementally interact with and evolve the widget workflows, and save those workflows for reproducibility. Taking ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, DIT was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, to nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (~1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
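
The widget-workflow idea can be sketched as small composable operations applied in a user-chosen order. The widget names echo the ones mentioned (multiply by a constant, sort); the implementations are invented stand-ins, not DIT's actual code.

```python
def multiply_by(k):
    """Widget: scale every value by a constant (e.g. a unit conversion)."""
    return lambda data: [x * k for x in data]

def sort_ascending():
    """Widget: sort the values."""
    return lambda data: sorted(data)

def run_workflow(data, widgets):
    """Apply each widget in order, as a saved, reproducible workflow would."""
    for widget in widgets:
        data = widget(data)
    return data

raw = [3.0, 1.0, 2.0]                              # e.g. raw site readings
workflow = [multiply_by(10.0), sort_ascending()]   # user-ordered pipeline
clean = run_workflow(raw, workflow)
```

Because the workflow is just an ordered list, it can be saved and re-run on the next unruly dataset, which is the reproducibility point the abstract makes.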

  2. Debugging Nano-Bio Interfaces: Systematic Strategies to Accelerate Clinical Translation of Nanotechnologies.

    Science.gov (United States)

    Mahmoudi, Morteza

    2018-03-17

    Despite considerable efforts in the field of nanomedicine that have been made by researchers, funding agencies, entrepreneurs, and the media, fewer nanoparticle (NP) technologies than expected have made it to clinical trials. The wide gap between the efforts and effective clinical translation is, at least in part, due to multiple overlooked factors in both in vitro and in vivo environments, a poor understanding of the nano-bio interface, and misinterpretation of the data collected in vitro, all of which reduce the accuracy of predictions regarding the NPs' fate and safety in humans. To minimize this bench-to-clinic gap, which may accelerate successful clinical translation of NPs, this opinion paper aims to introduce strategies for systematic debugging of nano-bio interfaces in the current literature. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Energy Bucket: A Tool for Power Profiling and Debugging of Sensor Nodes

    DEFF Research Database (Denmark)

    Andersen, Jacob; Hansen, Morten Tranberg

    2009-01-01

    The ability to precisely measure and compare energy consumption and relate it to particular parts of programs is a recurring theme in sensor network research. This paper presents the Energy Bucket, a low-cost tool designed for quick empirical measurements of energy consumption across 5 decades of current draw. The Energy Bucket provides a light-weight state API for the target system, which facilitates easy scorekeeping of energy consumption between different parts of a target program. We demonstrate how this tool can be used to discover programming errors and debug sensor network applications. Furthermore, we show how this tool, together with the target system API, offers a very detailed analysis of where energy is spent in an application, which proves to be very useful when comparing alternative implementations or validating theoretical energy consumption models.
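
The scorekeeping idea behind such a state API can be sketched as follows: the target program announces which part of the code it is in, and each current-draw sample is charged to the active state. The class, state names, and sample values below are invented for the example.

```python
class EnergyScorekeeper:
    def __init__(self):
        self.state = 'idle'
        self.charge = {}                # state name -> accumulated energy (mJ)

    def set_state(self, name):
        """The light-weight call the target program makes at section entry."""
        self.state = name

    def sample(self, power_mw, dt_s):
        """Charge one meter measurement (power * duration) to the active state."""
        self.charge[self.state] = (self.charge.get(self.state, 0.0)
                                   + power_mw * dt_s)

keeper = EnergyScorekeeper()
keeper.set_state('radio_tx')
keeper.sample(60.0, 0.5)    # 60 mW for 0.5 s -> 30 mJ charged to radio_tx
keeper.set_state('cpu')
keeper.sample(8.0, 2.0)     # 8 mW for 2 s -> 16 mJ charged to cpu
```

Comparing the per-state totals of two alternative implementations immediately shows where the energy difference comes from.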

  4. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug', designed for debugging programs for scientific computing, has been improved in two respects: (1) shortening the elapsed time required for getting appropriate data to visualize; (2) adding new functions that enable a set of visualized data originating from two or more different programs to be compared and/or combined. As for shortening the elapsed time for getting data, the improved version of 'vdebug' achieved an over hundred-fold reduction in elapsed time with dbx and pdbx on the SX-4, and an over ten-fold reduction with ndb on the SR2201. As for the new functions to compare/combine visualized data, it was confirmed that the consistency between the computational results obtained at each calculation step on two different computers, SP and ONYX, could easily be checked. In this report, we illustrate how the tool 'vdebug' has been improved with an example. (author)

  5. Development of a system for automatic control and performance evaluation of shutoff units in HANARO

    International Nuclear Information System (INIS)

    Jeong, Y. H.; Joe, Y. G.; Choi, Y. S.; Woo, J. S.

    2003-01-01

    The function of the shutoff units is to rapidly insert the shutoff rod into the reactor core for safe shutdown of the reactor. This paper describes the development of a system for automatic control and performance evaluation of shutoff units. The system automatically drives the shutoff unit with a specified operation cycle and records the performance of the drive mechanism in graphs and data. It also records the operating parameters of the shutoff unit and the test facility. The characteristics of the developed system were evaluated by comparison with the system currently in use in the HANARO reactor. The system will be used for performance and endurance tests in the test facility. Hereafter, it will also be used efficiently for normal operation and the periodic drop performance tests of shutoff units in HANARO.

  6. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers significantly affect their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, the network interconnect, such as InfiniBand, may be exploited to maximize energy savings, while the application performance loss and frequency-switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy-saving strategies on a per-call basis. Next, it targets point-to-point communications, grouping them into phases and applying frequency scaling to save energy by exploiting architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling in addition to DVFS to maximize energy savings. Experimental results are presented for NAS parallel benchmark problems as well as for realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.

  7. Automatic and objective assessment of alternating tapping performance in Parkinson's disease.

    Science.gov (United States)

    Memedi, Mevludin; Khan, Taha; Grenholm, Peter; Nyholm, Dag; Westin, Jerker

    2013-12-09

    This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson's disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD have utilized a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions ('speed', 'accuracy', 'fatigue' and 'arrhythmia') and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well to visually assessed scores and were significantly different across Unified Parkinson's Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, had good ability to discriminate between healthy elderly and patients in different disease stages, had good sensitivity to treatment interventions and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.
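
    The dimension-reduction step of the pipeline above (24 quantitative parameters projected down to four dimension scores) can be sketched with principal component analysis via an SVD; the array shapes follow the paper's subject counts, but the data here are random stand-ins rather than real tapping parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(105, 24))   # 105 subjects x 24 tapping parameters (synthetic)

    # Centre the parameters, then project onto the leading principal
    # components to obtain one score per assessed tapping dimension.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    dimension_scores = Xc @ Vt[:4].T  # columns play the roles of 'speed',
                                      # 'accuracy', 'fatigue', 'arrhythmia'
    ```

    The resulting four-column score matrix is what a downstream classifier (logistic regression in the paper) would map to the visually assessed GTS scores.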

  8. Predicting automatic speech recognition performance over communication channels from instrumental speech quality and intelligibility scores

    NARCIS (Netherlands)

    Gallardo, L.F.; Möller, S.; Beerends, J.

    2017-01-01

    The performance of automatic speech recognition based on coded-decoded speech heavily depends on the quality of the transmitted signals, determined by channel impairments. This paper examines relationships between speech recognition performance and measurements of speech quality and intelligibility

  9. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    International Nuclear Information System (INIS)

    Jang, Yu Jin

    2013-01-01

    This paper presents an automatic performance estimation scheme for a conceptual temperature control system with a multi-heater configuration, applied prior to constructing the physical system so that the conceptual design can be validated rapidly. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors, including the geometric shape of the controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated using an FEM (finite element method) simulation combined with the controller.

  10. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Yu Jin [Dongguk University, GyeongJu (Korea, Republic of)

    2013-07-15

    This paper presents an automatic performance estimation scheme for a conceptual temperature control system with a multi-heater configuration, applied prior to constructing the physical system so that the conceptual design can be validated rapidly. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors, including the geometric shape of the controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated using an FEM (finite element method) simulation combined with the controller.

  11. 75 FR 37712 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Science.gov (United States)

    2010-06-30

    ... Performance Requirements To Support Air Traffic Control (ATC) Service; Technical Amendment AGENCY: Federal... FAA amended its regulations to add equipage requirements and performance standards for Automatic... Approval for deviation was renumbered as Sec. 21.618, effective April 14, 2010. On May 28, 2010, the FAA...

  12. Automatic target recognition performance losses in the presence of atmospheric and camera effects

    Science.gov (United States)

    Chen, Xiaohan; Schmid, Natalia A.

    2010-04-01

    The importance of networked automatic target recognition systems for surveillance applications is continuously increasing. Because of low-cost and limited-payload requirements, these networks are traditionally equipped with lightweight, low-cost sensors such as electro-optical (EO) or infrared sensors. The quality of imagery acquired by these sensors critically depends on the environmental conditions, the type and characteristics of the sensors, and the absence of occluding or concealing objects. In the past, a large number of efficient detection, tracking, and recognition algorithms have been designed to operate on imagery of good quality. However, detection and recognition limits under nonideal environmental and/or sensor-based distortions have not been carefully evaluated. We introduce a fully automatic target recognition system that uses a Haar-based detector to select potential regions of interest within images, performs adjustment of the detected regions, segments potential targets using a region-based approach, identifies targets using Bessel K form-based encoding, and performs clutter rejection. We investigate the effects of environmental and camera conditions on target detection and recognition performance. Two databases are involved. One is a simulated database generated using a 3-D tool. The other is formed by imaging 10 die-cast models of military vehicles from different elevation and orientation angles, both indoors and outdoors. The indoor data set is composed of clear and distorted images. The distortions include defocus blur, sided illumination, low contrast, shadows, and occlusions. All images in this database, however, have a uniform (blue) background. The indoor database is used to evaluate the degradation of recognition performance due to camera and illumination effects. The database collected outdoors includes a real background and is much more complex to process. The numerical results

  13. Impact of automatic calibration techniques on HMD life cycle costs and sustainable performance

    Science.gov (United States)

    Speck, Richard P.; Herz, Norman E., Jr.

    2000-06-01

    Automatic test and calibration has become a valuable feature in many consumer products--ranging from antilock braking systems to auto-tune TVs. This paper discusses HMDs (Helmet Mounted Displays) and how similar techniques can reduce life cycle costs and increase sustainable performance if they are integrated into a program early enough. Optical ATE (Automatic Test Equipment) is already zeroing distortion in the HMDs and thereby making binocular displays a practical reality. A suitcase sized, field portable optical ATE unit could re-zero these errors in the Ready Room to cancel the effects of aging, minor damage and component replacement. Planning on this would yield large savings through relaxed component specifications and reduced logistic costs. Yet, the sustained performance would far exceed that attained with fixed calibration strategies. Major tactical benefits can come from reducing display errors, particularly in information fusion modules and virtual `beyond visual range' operations. Some versions of the ATE described are in production and examples of high resolution optical test data will be discussed.

  14. 20 CFR 404.285 - Recomputations performed automatically.

    Science.gov (United States)

    2010-04-01

    ... DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Recomputing Your Primary Insurance Amount... any sooner than it would be under an automatic recomputation. You may also waive a recomputation if...

  15. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor deficits in PD associated with impaired motor automaticity, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigation of the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of measures of automaticity in the early diagnosis of PD would be valuable. PMID:26102020

  16. Trace-based post-silicon validation for VLSI circuits

    CERN Document Server

    Liu, Xiao

    2014-01-01

    This book first provides a comprehensive coverage of state-of-the-art validation solutions based on real-time signal tracing to guarantee the correctness of VLSI circuits.  The authors discuss several key challenges in post-silicon validation and provide automated solutions that are systematic and cost-effective.  A series of automatic tracing solutions and innovative design for debug (DfD) techniques are described, including techniques for trace signal selection for enhancing visibility of functional errors, a multiplexed signal tracing strategy for improving functional error detection, a tracing solution for debugging electrical errors, an interconnection fabric for increasing data bandwidth and supporting multi-core debug, an interconnection fabric design and optimization technique to increase transfer flexibility and a DfD design and associated tracing solution for improving debug efficiency and expanding tracing window. The solutions presented in this book improve the validation quality of VLSI circuit...

  17. Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2013-12-01

    This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson’s disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD have utilized a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well to visually assessed scores and were significantly different across Unified Parkinson’s Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, had good ability to discriminate between healthy elderly and patients in different disease stages, had good sensitivity to treatment interventions and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.

  18. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) to support unit and integrated testing are explored.

  19. Testing the algorithms for automatic identification of errors on the measured quantities of the nuclear power plant. Verification tests

    International Nuclear Information System (INIS)

    Svatek, J.

    1999-12-01

    During the development and implementation of supporting software for the control room and emergency control centre at the Dukovany nuclear power plant, it appeared necessary to validate the input quantities in order to assure the operating reliability of the software tools. Therefore, the development of software for validation of the measured quantities from the plant data sources was initiated, and this software had to be debugged and verified. The report proposes and describes verification tests for the algorithms for automatic identification of errors in the measured quantities of the NPP using homemade validation software. In particular, the algorithms treated serve to validate the hot leg temperature at primary circuit loop no. 2 or 4 of the Dukovany-2 reactor unit, using data from the URAN and VK3 information systems recorded during 3 different days. (author)

  20. DEVELOP-FPS: a First Person Shooter Development Tool for Rule-based Scripts

    Directory of Open Access Journals (Sweden)

    Bruno Correia

    2012-09-01

    We present DEVELOP-FPS, a software tool specially designed for the development of First Person Shooter (FPS) players controlled by rule-based scripts. DEVELOP-FPS may be used by FPS developers to create, debug, maintain and compare rule-based player behaviours, providing a set of useful functionalities: (i) easy preparation of the right scenarios for game debugging and testing; (ii) control of the game execution: users can stop and resume the game execution at any instant, monitoring and controlling every player in the game, monitoring the state of each player and their rule-base activation, and issuing commands to control their behaviour; and (iii) automatically running a certain number of game executions and collecting data in order to evaluate and compare the players' performance over a sufficient number of similar experiments.

  1. Control characteristics and heating performance analysis of automatic thermostatic valves for radiant slab heating system in residential apartments

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Byung-Cheon [Department of Building Equipment System Engineering, Kyungwon University, Seongnam City (Korea); Song, Jae-Yeob [Graduate School, Building Equipment System Engineering, Kyungwon University, Seongnam City (Korea)

    2010-04-15

    Computer simulations and experiments are carried out to investigate the control characteristics and heating performance of a radiant slab heating system with automatic thermostatic valves in residential apartments. An electrical equivalent R-C circuit is applied to analyze the unsteady heat transfer in the house. In addition, the radiant heat transfer between slabs, ceilings and walls in the room is evaluated by an enclosure analysis method. The heating performance and control characteristics were determined for control methods such as automatic thermostatic valves, the room-air-temperature-sensing method, the water-temperature-sensing method, the proportional control method, and the On-Off control method. (author)
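
    As a minimal sketch of the On-Off control method named above, the fragment below models a thermostatic valve that opens and closes around a setpoint with a small hysteresis band; the setpoint and band values are illustrative, not taken from the study:

    ```python
    def on_off_controller(setpoint_c, hysteresis_c=0.5):
        # Returns a step function mimicking an automatic thermostatic valve:
        # open below the band, closed above it; inside the band the previous
        # state is held so the valve does not chatter.
        state = {"open": False}

        def step(room_temp_c):
            if room_temp_c < setpoint_c - hysteresis_c:
                state["open"] = True
            elif room_temp_c > setpoint_c + hysteresis_c:
                state["open"] = False
            return state["open"]

        return step

    valve = on_off_controller(21.0)   # 21 degC setpoint, illustrative only
    ```

    Stepping the controller with a room-temperature trace yields the valve open/close sequence that, in the study, feeds the radiant slab heat transfer model.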

  2. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    Directory of Open Access Journals (Sweden)

    Luo Hanwu

    2016-01-01

    This paper presents a novel method to solve for the initial lightning breakdown current by combining the ATP and MATLAB simulation software effectively, with the aim of evaluating the lightning protection performance of transmission lines. First, the executable ATP simulation model is generated automatically from the required information, such as power source parameters, tower parameters, overhead line parameters, grounding resistance and lightning current parameters, through an interface program coded in MATLAB. Then, data are extracted from the LIS files obtained by executing the ATP simulation model, and the occurrence of transmission line breakdown can be determined from the relevant data in the LIS file. The lightning current amplitude is reduced when breakdown occurs, and increased otherwise. Thus the initial lightning breakdown current of a transmission line with given parameters can be determined accurately by continuously changing the lightning current amplitude, which is realized by a loop computing algorithm coded in MATLAB. The method proposed in this paper generates the ATP simulation program automatically and facilitates the lightning protection performance assessment of transmission lines.
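
    The loop computing algorithm described above, which repeatedly adjusts the lightning current amplitude until the breakdown threshold is bracketed, can be sketched as a bisection search; the `breakdown_occurs` stand-in and its threshold value are hypothetical placeholders for an actual ATP run and LIS-file check:

    ```python
    def breakdown_occurs(amplitude_ka, threshold_ka=87.3):
        # Hypothetical stand-in for generating the ATP model, running it,
        # and checking the LIS output for breakdown; the threshold is invented.
        return amplitude_ka >= threshold_ka

    def initial_breakdown_current(lo_ka=0.0, hi_ka=300.0, tol_ka=0.1):
        # Bisection over the lightning current amplitude: lower the amplitude
        # when breakdown occurs, raise it otherwise, until the bracket is tight.
        while hi_ka - lo_ka > tol_ka:
            mid = (lo_ka + hi_ka) / 2.0
            if breakdown_occurs(mid):
                hi_ka = mid
            else:
                lo_ka = mid
        return hi_ka
    ```

    Bisection reaches the tolerance in O(log((hi-lo)/tol)) simulation runs, far fewer than stepping the amplitude linearly.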

  3. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that, across practice, processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  4. Automatic orbital GTAW welding: Highest quality welds for tomorrow's high-performance systems

    Science.gov (United States)

    Henon, B. K.

    1985-01-01

    Automatic orbital gas tungsten arc welding (GTAW), or TIG welding, is certain to play an increasingly prominent role in tomorrow's technology. The welds are of the highest quality, and the repeatability of automatic welding is vastly superior to that of manual welding. Since less heat is applied to the weld during automatic welding than during manual welding, there is less change in the metallurgical properties of the parent material. The possibility of accurate control and the cleanliness of the automatic GTAW process make it highly suitable for welding the more exotic and expensive materials which are now widely used in the aerospace and hydrospace industries. Titanium, stainless steel, Inconel, and Incoloy, as well as aluminum, can all be welded to the highest quality specifications automatically. Automatic orbital GTAW equipment is available for the fusion butt welding of tube-to-tube joints as well as tube-to-autobuttweld fittings. The same equipment can also be used for the fusion butt welding of pipe up to 6 inches in diameter with a wall thickness of up to 0.154 inches.

  5. An electronically controlled automatic security access gate

    Directory of Open Access Journals (Sweden)

    Jonathan A. ENOKELA

    2014-11-01

    The security challenges encountered in many places require electronic means of controlling access to communities, recreational centres, offices, and homes. The electronically controlled automated security access gate proposed in this work helps to prevent unwanted access to controlled environments. This is achieved mainly through the use of a Radio Frequency (RF) transmitter-receiver pair. In the design, a microcontroller is programmed to decode a given sequence of keys entered on a keypad and commands a transmitter module to send out this code as a signal at a given radio frequency. Upon reception of this RF signal by the receiver module, another microcontroller activates driver circuitry to operate the gate automatically. The code for the microcontrollers was written in C and was debugged and compiled using the Keil µVision 4 integrated development environment. The resultant hex files were programmed into the memories of the microcontrollers with the aid of a universal programmer. Software simulation was carried out using Proteus Virtual System Modelling (VSM) version 7.7. A scaled-down prototype of the system was built and tested. The electronically controlled automated security access gate can be useful in providing security for homes, organizations, and automobile terminals. The four-character password required to operate the gate gives the system an increased level of security. Owing to its standalone operation, the system is cheaper to maintain than a manually operated type.

  6. Automatic user customization for improving the performance of a self-paced brain interface system.

    Science.gov (United States)

    Fatourechi, Mehrdad; Bashashati, Ali; Birch, Gary E; Ward, Rabab K

    2006-12-01

    Customizing the parameter values of brain interface (BI) systems by a human expert has the advantage of being fast and computationally efficient. However, as the number of users and EEG channels grows, this process becomes increasingly time consuming and exhausting. Manual customization also introduces inaccuracies in the estimation of the parameter values. In this paper, the performance of a self-paced BI system whose design parameter values were automatically user customized using a genetic algorithm (GA) is studied. The GA automatically estimates the shapes of movement-related potentials (MRPs), whose features are then extracted to drive the BI. Offline analysis of the data of eight subjects revealed that automatic user customization improved the true positive (TP) rate of the system by an average of 6.68% over that whose customization was carried out by a human expert, i.e., by visually inspecting the MRP templates. On average, the best improvement in the TP rate (an average of 9.82%) was achieved for four individuals with spinal cord injury. In this case, the visual estimation of the parameter values of the MRP templates was very difficult because of the highly noisy nature of the EEG signals. For four able-bodied subjects, for which the MRP templates were less noisy, the automatic user customization led to an average improvement of 3.58% in the TP rate. The results also show that the inter-subject variability of the TP rate is also reduced compared to the case when user customization is carried out by a human expert. These findings provide some primary evidence that automatic user customization leads to beneficial results in the design of a self-paced BI for individuals with spinal cord injury.
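
    The genetic-algorithm customization described above can be illustrated with a toy sketch: an elitist GA searching for two MRP template parameters that maximize a stand-in true-positive-rate function. The fitness function, parameter names, and GA settings are all hypothetical; the real system scores templates against recorded EEG signals:

    ```python
    import random

    random.seed(1)

    def tp_rate(params):
        # Hypothetical stand-in for the true-positive rate obtained when a
        # candidate MRP template (width, latency) is evaluated; it peaks at
        # (0.4, 0.6) purely for illustration.
        w, l = params
        return 1.0 - (w - 0.4) ** 2 - (l - 0.6) ** 2

    def mutate(p, sigma=0.05):
        # Gaussian mutation, clipped to the parameter range [0, 1].
        return tuple(min(1.0, max(0.0, x + random.gauss(0.0, sigma))) for x in p)

    def genetic_search(pop_size=20, generations=60):
        pop = [(random.random(), random.random()) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=tp_rate, reverse=True)
            parents = pop[: pop_size // 2]          # elitist selection
            pop = parents + [mutate(p) for p in parents]
        return max(pop, key=tp_rate)

    best = genetic_search()
    ```

    Because the top half of each generation is kept unmutated, the best fitness never decreases, mirroring why automatic customization cannot do worse than its own best candidate so far.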

  7. Automatic performance tuning of parallel and accelerated seismic imaging kernels

    KAUST Repository

    Haberdar, Hakan

    2014-01-01

    With the increased complexity and diversity of mainstream high performance computing systems, significant effort is required to tune parallel applications in order to achieve the best possible performance for each particular platform. This task becomes more and more challenging and requires a larger set of skills. Automatic performance tuning is becoming a must for optimizing applications such as Reverse Time Migration (RTM), widely used in seismic imaging for oil and gas exploration. An empirical-search-based auto-tuning approach is applied to the MPI communication operations of the parallel isotropic and tilted transverse isotropic kernels. The application of auto-tuning using the Abstract Data and Communication Library improved the performance of the MPI communications as well as developer productivity by providing a higher level of abstraction. Keeping productivity in mind, we opted for pragma-based programming for accelerated computation on the latest accelerator architectures such as GPUs, using the fairly new OpenACC standard. The same auto-tuning approach is also applied to the OpenACC-accelerated seismic code to optimize the compute-intensive kernel of the Reverse Time Migration application. The application of this technique resulted in improved performance of the original code and the ability to adapt to different execution environments.
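
    An empirical-search auto-tuner of the kind described above can be sketched in a few lines: time each candidate configuration of a kernel and keep the fastest. The toy kernel and candidate chunk sizes below are illustrative only; a real tuner would time MPI or OpenACC kernel variants instead:

    ```python
    import timeit

    def kernel(chunk, n=10_000):
        # Toy compute kernel whose running time depends on a tunable
        # chunk-size parameter but whose result does not.
        total = 0
        for start in range(0, n, chunk):
            total += sum(range(start, min(start + chunk, n)))
        return total

    def autotune(candidates, repeats=3):
        # Empirical search: time every candidate configuration and keep
        # the fastest, taking the best of several repeats to reduce noise.
        timings = {c: min(timeit.repeat(lambda c=c: kernel(c),
                                        number=5, repeat=repeats))
                   for c in candidates}
        return min(timings, key=timings.get)

    best = autotune([16, 128, 1024])
    ```

    Taking the minimum over repeats is the usual way to suppress timing noise; the tuner only changes which variant runs, never the computed result.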

  8. Automatic Human Movement Assessment With Switching Linear Dynamic System: Motion Segmentation and Motor Performance.

    Science.gov (United States)

    de Souza Baptista, Roberto; Bo, Antonio P L; Hayashibe, Mitsuhiro

    2017-06-01

    Performance assessment of human movement is critical in diagnosis and motor-control rehabilitation. Recent developments in portable sensor technology enable clinicians to measure spatiotemporal aspects of movement to aid in neurological assessment. However, the extraction of quantitative information from such measurements is usually done manually through visual inspection. This paper presents a novel framework for automatic human movement assessment that executes segmentation and motor performance parameter extraction in time series of measurements from a sequence of human movements. We use the elements of a Switching Linear Dynamic System model as building blocks to translate formal definitions and procedures from human movement analysis. Our approach provides a method for users with no expertise in signal processing to create models for movements using a labeled dataset and later use them for automatic assessment. We validated our framework in preliminary tests involving six healthy adult subjects, who executed common movements in functional tests and rehabilitation exercise sessions, such as sit-to-stand and lateral elevation of the arms, and five elderly subjects, two of whom had limited mobility, who executed the sit-to-stand movement. The proposed method worked on random motion sequences for the dual purpose of movement segmentation (accuracy of 72%-100%) and motor performance assessment (mean error of 0%-12%).

  9. Performance management system enhancement and maintenance

    Science.gov (United States)

    Cleaver, T. G.; Ahour, R.; Johnson, B. R.

    1984-01-01

    The research described in this report concludes a two-year effort to develop a Performance Management System (PMS) for the NCC computers. PMS provides semi-automated monthly reports to NASA and contractor management on the status and performance of the NCC computers in the TDRSS program. Throughout 1984, PMS was tested, debugged, extended, and enhanced. Regular PMS monthly reports were produced and distributed. PMS continues to operate at the NCC under control of Bendix Corp. personnel.

  10. The Effects of Background Noise on the Performance of an Automatic Speech Recogniser

    Science.gov (United States)

    Littlefield, Jason; HashemiSakhtsari, Ahmad

    2002-11-01

    Ambient or environmental noise is a major factor that affects the performance of an automatic speech recognizer. Large vocabulary, speaker-dependent, continuous speech recognizers are commercially available. Speech recognizers, perform well in a quiet environment, but poorly in a noisy environment. Speaker-dependent speech recognizers require training prior to them being tested, where the level of background noise in both phases affects the performance of the recognizer. This study aims to determine whether the best performance of a speech recognizer occurs when the levels of background noise during the training and test phases are the same, and how the performance is affected when the levels of background noise during the training and test phases are different. The relationship between the performance of the speech recognizer and upgrading the computer speed and amount of memory as well as software version was also investigated.

  11. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of a testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect, such that new faults may be introduced. In this paper, we first show how to incorporate the testing effort function and fault introduction into the FDP and then develop the FCP as a delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions about fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed.
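    One concrete member of this model family can be written down numerically. Below, detection follows the classic exponential form m_d(t) = a(1 - exp(-b W(t))) with a logistic testing-effort function W(t), and correction is modeled as detection delayed by a fixed lag; the constants and the choice of W are illustrative assumptions, not the paper's fitted values:

```python
import math

a, b = 100.0, 0.05          # total faults, detection rate per unit effort

def W(t):
    """Cumulative testing effort: a logistic curve saturating at 200 units."""
    return 200.0 / (1.0 + math.exp(-0.1 * (t - 30.0)))

def m_detected(t):
    """Expected number of faults detected by time t."""
    return a * (1.0 - math.exp(-b * W(t)))

def m_corrected(t, delay=5.0):
    """Correction modeled as detection delayed by a fixed correction lag."""
    return m_detected(t - delay) if t >= delay else 0.0
```

Because correction lags detection, m_corrected(t) never exceeds m_detected(t), and both saturate at the total fault content a as testing effort accumulates.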

  12. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary control equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As the result, it was shown that this system has sufficient performance to bring a reactor to a hot stand-by condition quickly and safely. (author)

  13. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary control equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship `Mutsu`. As the result, it was shown that this system has sufficient performance to bring a reactor to a hot stand-by condition quickly and safely. (author)

  14. SimpleGeO - new developments in the interactive creation and debugging of geometries for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Theis, Christian; Feldbaumer, Eduard; Forkel-Wirth, Doris; Jaegerhofer, Lukas; Roesler, Stefan; Vincke, Helmut; Buchegger, Karl Heinz

    2010-01-01

    Nowadays radiation transport Monte Carlo simulations have become an indispensable tool in various fields of physics. The applications are diversified and range from physics simulations, like detector studies or shielding design, to medical applications. Usually a significant amount of time is spent on the quite cumbersome and often error-prone task of implementing geometries before the actual physics studies can be performed. SimpleGeo is an interactive solid modeler which allows for the interactive creation and visualization of geometries for various Monte Carlo particle transport codes in 3D. Even though visual validation of the geometry is important, it might not reveal subtle errors like overlapping or undefined regions. These might eventually corrupt the execution of the simulation or even lead to incorrect results, the latter being sometimes hard to identify. In many cases debuggers are provided by the Monte Carlo packages, but they most often lack interactive visual feedback, thus making it hard for the user to localize and correct the error. In this paper we describe the latest developments in SimpleGeo, which include debugging facilities that support immediate visual feedback, and apply various algorithms based on deterministic, Monte Carlo or Quasi Monte Carlo methods. These approaches allow for a fast and robust identification of subtle geometry errors that are also marked visually. (author)

  15. SPPTOOLS: Programming tools for the IRAF SPP language

    Science.gov (United States)

    Fitzpatrick, M.

    1992-01-01

    An IRAF package to assist in SPP code development and debugging is described. SPP is the machine-independent programming language used by virtually all IRAF tasks. Tools have been written to aid both novice and advanced SPP programmers with development and debugging by providing tasks to check the code for the number and type of arguments in all calls to IRAF VOS library procedures, list the calling sequences of IRAF tasks, create a database of identifiers for quick access, check for memory which is not freed, and format source code. Debugging is simplified since the programmer is able to get a better understanding of the structure of his/her code, and IRAF library procedure calls (probably the most common source of errors) are automatically checked for correctness.

  16. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for adapting to the working environment at the plant site. Described here are the latest automatic welders in practical use for welding nuclear power components at the factories of Toshiba and IHI: those for pipes and for lining tanks. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, followed by butt welding through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders demonstrate excellent performance at the shops as well as at the plant site. (author)

  17. Validation of a novel automatic sleep spindle detector with high performance during sleep in middle aged subjects

    DEFF Research Database (Denmark)

    Wendt, Sabrina Lyngbye; Christensen, Julie A. E.; Kempfner, Jacob

    2012-01-01

    Many of the automatic sleep spindle detectors currently used to analyze sleep EEG are either validated on young subjects or not validated thoroughly. The purpose of this study is to develop and validate a fast and reliable sleep spindle detector with high performance in middle aged subjects....... An automatic sleep spindle detector using a bandpass filtering approach and a time varying threshold was developed. The validation was done on sleep epochs from EEG recordings with manually scored sleep spindles from 13 healthy subjects with a mean age of 57.9 ± 9.7 years. The sleep spindle detector reached...
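    The bandpass-plus-time-varying-threshold scheme described above can be sketched on synthetic data. All signal parameters, the crude band isolation, and the median-based threshold rule below are simplified illustrative assumptions, not the validated detector's settings:

```python
import numpy as np

fs = 100                                    # Hz, EEG sampling rate
t = np.arange(0, 30, 1 / fs)
eeg = 0.1 * np.sin(2 * np.pi * 2 * t)       # slow background activity
burst = (t > 10) & (t < 11)                 # one 1-s "spindle" at 13 Hz
eeg = eeg + burst * np.sin(2 * np.pi * 13 * t)

# Crude band isolation: subtract a moving average to suppress slow waves,
# then track sigma-band power with a moving RMS.
win = fs // 5
slow = np.convolve(eeg, np.ones(win) / win, mode="same")
sigma = eeg - slow
power = np.sqrt(np.convolve(sigma ** 2, np.ones(win) / win, mode="same"))

# Time-varying threshold: a multiple of the (recording-specific) median power.
thresh = 3.0 * np.median(power)
detected = power > thresh
spindle_seconds = t[detected]
```

Tying the threshold to the recording's own power statistics is what lets a detector of this kind adapt across subjects of different ages without retuning.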

  18. COMMISSIONING AND DETECTOR PERFORMANCE GROUPS (DPG)

    CERN Multimedia

    A. Ryd and T. Camporesi

    2010-01-01

    Commissioning and Run Coordination activities After the successful conclusion of the LHC pilot run commissioning in 2009, activities at the experiment restarted only late in January due to the cooling and detector maintenance. As usual, we got going with weekly exercises used to deploy, debug, and validate improvements in firmware and software. A debriefing workshop aimed at analyzing the operational aspects of the 2009 pilot run was held on Jan. 15, 2009, to define a list of improvements (and relative priorities) to be planned. In the last month, most of the objectives set in the debriefing workshop have been attained. The major achievements/improvements obtained are the following: - Consolidation of the firmware for both readout and trigger for ECAL - Software implementation of procedures for raising the bias voltage of the silicon tracker and pixel driven by LHC mode changes with automatic propagation of the state changes from the DCS to the DAQ. The improvements in the software and firmware allow suppress...

  19. Automated knowledge-base refinement

    Science.gov (United States)

    Mooney, Raymond J.

    1994-01-01

    Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.

  20. Compile-Time Debugging of C Programs Working on Trees

    DEFF Research Database (Denmark)

    Elgaard, Jacob; Møller, Anders; Schwartzbach, Michael I.

    2000-01-01

    of an initial store that leads to an error is automatically generated. This extends previous work that uses a similar technique to verify a simpler syntax manipulating only list structures. In that case, programs are translated into WS1S formulas. A naive generalization to recursive data-types determines...

  1. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient.
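    The evaluation protocol above, leave-one-speaker-out, is worth sketching: every utterance of one speaker is held out in turn, so the classifier is never tested on a voice it trained on. The sketch below substitutes a nearest-centroid classifier for the paper's SVM and uses synthetic features, purely to show the protocol:

```python
import numpy as np

# Synthetic stand-in data: 6 speakers, 20 utterances each, 3 features,
# label 0 = not eating, 1 = eating (fabricated for illustration).
rng = np.random.default_rng(42)
n_speakers, utts = 6, 20
speakers = np.repeat(np.arange(n_speakers), utts)
labels = rng.integers(0, 2, size=n_speakers * utts)
features = labels[:, None] * 2.0 + rng.normal(0, 0.5, (n_speakers * utts, 3))

def loso_accuracy(X, y, groups):
    """Leave-one-speaker-out: hold out all utterances of each speaker in turn."""
    correct = 0
    for spk in np.unique(groups):
        test = groups == spk
        train = ~test
        # Nearest-centroid stand-in for the paper's SVM classifier.
        centroids = np.stack([X[train & (y == c)].mean(axis=0) for c in (0, 1)])
        Xt = X[test]
        d = np.linalg.norm(Xt[:, None, :] - centroids[None, :, :], axis=2)
        correct += int((d.argmin(axis=1) == y[test]).sum())
    return correct / len(y)

acc = loso_accuracy(features, labels, speakers)
```

Grouping the folds by speaker rather than by utterance is what makes the reported recalls speaker-independent.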

  2. The performance of an automatic acoustic-based program classifier compared to hearing aid users' manual selection of listening programs.

    Science.gov (United States)

    Searchfield, Grant D; Linford, Tania; Kobayashi, Kei; Crowhen, David; Latzel, Matthias

    2018-03-01

    To compare preference for and performance of manually selected programmes to an automatic sound classifier, the Phonak AutoSense OS. A single blind repeated measures study. Participants were fit with Phonak Virto V90 ITE aids; preferences for different listening programmes were compared across four different sound scenarios (speech in: quiet, noise, loud noise and a car). Following a 4-week trial, preferences were reassessed and the user's preferred programme was compared to the automatic classifier for sound quality and hearing in noise (HINT test) using a 12-loudspeaker array. Twenty-five participants with symmetrical moderate-severe sensorineural hearing loss. Participants' manual programme preferences for the scenarios varied considerably between and within sessions. A HINT Speech Reception Threshold (SRT) advantage was observed for the automatic classifier over participants' manual selection for speech in quiet, loud noise and car noise. Sound quality ratings were similar for both manual and automatic selections. The use of a sound classifier is a viable alternative to manual programme selection.

  3. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1997-12-31

    This thesis develops a theoretic-experimental study dealing with the effects of an automatic shading device on the energetic performance of a dimmable lighting system and a cooling equipment. Some equations related to fenestration optical and thermal properties are rebuilt, while some others are created, under a theoretical approach. In order to collect field data, the energy demand (and other variables) was measured in two distinct stories, with the same fenestration features, of the Test Tower. New data was gathered after adding an automatic shading device to the window of one story. The comparison of the collected data allows the energetic performance evaluation of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  4. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretic-experimental study dealing with the effects of an automatic shading device on the energetic performance of a dimmable lighting system and a cooling equipment. Some equations related to fenestration optical and thermal properties are rebuilt, while some others are created, under a theoretical approach. In order to collect field data, the energy demand (and other variables) was measured in two distinct stories, with the same fenestration features, of the Test Tower. New data was gathered after adding an automatic shading device to the window of one story. The comparison of the collected data allows the energetic performance evaluation of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  5. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  6. ArrayBridge: Interweaving declarative array processing with high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Xing, Haoyuan [The Ohio State Univ., Columbus, OH (United States); Floratos, Sofoklis [The Ohio State Univ., Columbus, OH (United States); Blanas, Spyros [The Ohio State Univ., Columbus, OH (United States); Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, Prabhat [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Brown, Paul [Paradigm4, Inc., Waltham, MA (United States)

    2017-05-04

    Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.

  7. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). It surveys the recent state of the art, shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been develop
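    A frequency-based extractive summarizer, one of the simplest statistical approaches in this family, can be sketched in a few lines; the stopword list and scoring rule are minimal illustrations, not any particular system from the book:

```python
import re
from collections import Counter

STOP = {"the", "a", "of", "and", "to", "in", "is", "are", "for", "on", "it"}

def summarize(text, n=1):
    """Score each sentence by the mean document frequency of its content
    words; return the n best sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOP]
    freq = Counter(words)

    def score(s):
        toks = [w for w in re.findall(r"[a-z']+", s.lower()) if w not in STOP]
        return sum(freq[w] for w in toks) / (len(toks) or 1)

    ranked = sorted(sentences, key=score, reverse=True)[:n]
    return [s for s in sentences if s in ranked]

doc = ("Automatic summarization condenses documents. "
       "Statistical summarization methods score sentences by word frequency. "
       "The weather was pleasant that day.")
summary = summarize(doc, n=1)
```

Even this crude scorer prefers sentences built from the document's dominant vocabulary over off-topic ones, which is the statistical intuition the richer algorithms refine.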

  8. Monitoring the Performance of the Pedestrian Transfer Function of Train Stations Using Automatic Fare Collection Data

    NARCIS (Netherlands)

    Van den Heuvel, J.P.A.; Hoogenraad, J.H.

    2014-01-01

    Over the last years all train stations in The Netherlands have been equipped with automatic fare collection gates and/or validators. All public transport passengers use a smart card to pay their fare. In this paper we present a monitor for the performance of the pedestrian function of train stations

  9. Functionality and Performance Visualization of the Distributed High Quality Volume Renderer (HVR)

    KAUST Repository

    Shaheen, Sara

    2012-07-01

    Volume rendering systems are designed to enable scientists and a variety of experts to interactively explore volume data through 3D views of the volume. However, volume rendering techniques are computationally intensive tasks. Parallel distributed volume rendering systems and multi-threading architectures were suggested as natural solutions to provide acceptable volume rendering performance for very large volume data sizes, such as Electron Microscopy (EM) data. This in turn adds another level of complexity when developing and manipulating volume rendering systems. Given that distributed parallel volume rendering systems are among the most complex systems to develop, trace and debug, traditional debugging tools clearly do not provide enough support. As a consequence, there is a great demand for tools that facilitate the manipulation of such systems. This can be achieved by utilizing the power of computer graphics to design visual representations that reflect how the system works and that visualize its current performance state. The work presented falls within the field of software visualization, where visualization is used to aid in understanding various software. This thesis presents a number of visual representations that reflect functionality and performance aspects of the distributed HVR, a high-quality volume renderer that uses various techniques to visualize large volumes interactively. These representations cover different stages of the parallel volume rendering pipeline of HVR, along with means of performance analysis through a number of flexible and dynamic visualizations that reflect the current state of the system and enable manipulation of it at runtime. These visualizations aim to facilitate debugging, understanding and analyzing the distributed HVR.

  10. Spectrum-based Fault Localization in Embedded Software

    NARCIS (Netherlands)

    Abreu, R.

    2009-01-01

    Locating software components that are responsible for observed failures is a time-intensive and expensive phase in the software development cycle. Automatic fault localization techniques aid developers/testers in pinpointing the root cause of software failures, as such reducing the debugging effort.
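    A common concrete instance of spectrum-based fault localization ranks components by the Ochiai similarity coefficient computed from per-test coverage spectra and pass/fail outcomes. A sketch over a fabricated four-component program spectrum (the data are invented for illustration; component 1 is executed by exactly the failing tests):

```python
import math

spectra = [            # rows: tests; columns: components 0..3 (1 = executed)
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
]
failed = [1, 0, 1, 0]  # tests 0 and 2 fail

def ochiai(spectra, failed):
    """Ochiai score per component: ef / sqrt(total_failed * (ef + ep)),
    where ef/ep count failing/passing tests that executed the component."""
    n_f = sum(failed)
    scores = []
    for j in range(len(spectra[0])):
        ef = sum(s[j] and f for s, f in zip(spectra, failed))
        ep = sum(s[j] and not f for s, f in zip(spectra, failed))
        scores.append(ef / math.sqrt(n_f * (ef + ep)) if ef else 0.0)
    return scores

scores = ochiai(spectra, failed)
ranking = sorted(range(len(scores)), key=lambda j: -scores[j])
```

Components executed by many failing and few passing tests float to the top of the ranking, which is the list a developer would inspect first.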

  11. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High degree of accuracy in the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomical aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created in 1 vertebra, in 28 cases in 2 vertebrae and in 16 cases in 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomical aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 and more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time-saving when reconstructions of 2 and more vertebrae are performed

  12. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert; Cho, Junsang; Fang, Charlie; Salihoglu, Semih; Torikai, Satoshi; Widom, Jennifer

    2012-01-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs

  13. Overview of a Linguistic Theory of Design. AI Memo 383A.

    Science.gov (United States)

    Miller, Mark L.; Goldstein, Ira P.

    The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…

  14. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT.

    Science.gov (United States)

    Scholtz, Jan-Erik; Wichmann, Julian L; Kaup, Moritz; Fischer, Sebastian; Kerl, J Matthias; Lehnert, Thomas; Vogl, Thomas J; Bauer, Ralf W

    2015-03-01

    To evaluate software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. 77 patients (28 women, 49 men, mean age 65.3±14.4 years) with known or suspected spinal disorders (degenerative spine disease n=32; disc herniation n=36; traumatic vertebral fractures n=9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomical aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding accuracy of symmetric depiction of anatomical structures. In 33 cases double-angulated axial images were created in 1 vertebra, in 28 cases in 2 vertebrae and in 16 cases in 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomical aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p<0.05). Automatic reconstruction was time-saving in cases of 2 and more vertebrae (p<0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time-saving when reconstructions of 2 and more vertebrae are performed. Checking results of automatic labeling is necessary to prevent errors in labeling. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. "Intelligent" Automatic Beam Steering and Shaping

    CERN Document Server

    Jansson, A

    2000-01-01

    The strategy for Automated Beam Steering and Shaping (ABS) in the PS complex is to use theoretical response matrices calculated from an optics database. The main reason for this is that it enforces a certain understanding of the machine optics. A drawback is that the validation of such a matrix can be a lengthy process. However, every time a correction is made using an ABS program, a partial measurement of the response matrix is effectively performed. Since the ABS programs are very frequently used, the full matrices could thus be measured on an almost daily basis, provided this information is retained. The information can be used in two ways. Either the program passively logs the data to be analysed off-line, or the information is directly fed back to the matrix, which makes the program 'learn' as it executes. The data logging provides a powerful machine debugging tool, since deviations between the measured and theoretical matrices can be traced back to incorrect optical parameters. The 'learning' mode ensu...
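    The 'learning' mode described above can be sketched as a rank-one update that nudges the response matrix toward each observed (correction, response) pair; the update rule, the blending factor, and the 2x2 example matrices below are illustrative assumptions, not the ABS implementation:

```python
import numpy as np

def learn(R, dx, dy, alpha=0.5):
    """Each applied corrector change dx with observed response dy is a partial
    measurement of R (dy ~ R dx); blend a rank-one correction into R so the
    model drifts toward the measured optics."""
    dx = np.asarray(dx, float)
    dy = np.asarray(dy, float)
    residual = dy - R @ dx                    # what the current model mispredicts
    return R + alpha * np.outer(residual, dx) / (dx @ dx)

R_theory = np.array([[1.0, 0.0],
                     [0.0, 1.0]])             # theoretical response matrix
R_true = np.array([[1.2, 0.1],
                   [0.0, 0.9]])               # actual machine response

R = R_theory
rng = np.random.default_rng(1)
for _ in range(200):
    dx = rng.normal(size=2)                   # corrector change actually applied
    dy = R_true @ dx                          # beam response observed
    R = learn(R, dx, dy)

model_error = np.abs(R - R_true).max()
```

Logging the same (dx, dy) pairs instead of folding them in immediately corresponds to the passive, off-line analysis mode mentioned in the abstract.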

  16. Using PAFEC as a preprocessor for MSC/NASTRAN

    International Nuclear Information System (INIS)

    Gray, W.H.; Baudry, T.V.

    1983-01-01

    Programs for Automatic Finite Element Calculations (PAFEC) is a general-purpose, three-dimensional, linear and nonlinear finite element program. PAFEC's features include free-format input using engineering keywords, powerful mesh-generating facilities, sophisticated database management procedures, and extensive data validation checks. Presented here is a description of a software interface that permits PAFEC to be used as a preprocessor for MSC/NASTRAN. This user-friendly software, called PAFMSC, frees the stress analyst from the laborious and error-prone procedure of creating and debugging a rigid-format MSC/NASTRAN bulk data deck. By interactively creating and debugging a finite element model with PAFEC, thus taking full advantage of the free-format, engineering-keyword-oriented data structure of PAFEC, the stress analyst can drastically reduce the amount of time spent during model generation. The PAFMSC software will automatically convert a PAFEC data structure into an MSC/NASTRAN bulk data deck. The capabilities and limitations of the PAFMSC software are fully discussed in the following report
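The flavor of such a conversion can be sketched as below; the input node table and the restriction to free-field GRID entries are illustrative assumptions, as the real PAFMSC translation covers the full bulk data deck:

```python
def nastran_grid_cards(nodes):
    """Emit MSC/NASTRAN free-field GRID bulk data lines from an id -> (x, y, z) table."""
    return ["GRID,%d,,%g,%g,%g" % (nid, x, y, z)   # blank CP field between the commas
            for nid, (x, y, z) in sorted(nodes.items())]

cards = nastran_grid_cards({1: (0.0, 0.0, 0.0), 2: (1.0, 0.0, 0.0)})
```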

  18. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, 'automaticity' refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, due both to a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) functional significance of automaticity; (b) neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  19. A consideration of the operation of automatic production machines.

    Science.gov (United States)

    Hoshi, Toshiro; Sugimoto, Noboru

    2015-01-01

    At worksites, various automatic production machines are in use to release workers from muscular labor or labor in detrimental environments. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention, and points out two types of machine operation: operation for which quick performance is required (operation that must not be delayed), and operation for which composed performance is required (operation that must not be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics can be described as 'asymmetric on the time-axis'. Here, in order for workers to accept the risk of automatic production machines, it is generally a precondition that harm should be sufficiently small or that avoidance of harm is easy. In this connection, this paper shows the possibility of facilitating the acceptance of the risk of automatic production machines by enhancing this asymmetry on the time-axis.

  20. General collaboration offer of Johnson Controls regarding the performance of air conditioning automatic control systems and other buildings' automatic control systems

    Energy Technology Data Exchange (ETDEWEB)

    Gniazdowski, J.

    1995-12-31

    JOHNSON CONTROLS manufactures measuring and control equipment (800 types) and is as well a "turn-key" supplier of complete automatic control systems for the heating, air conditioning, ventilation and refrigerating engineering branches. The Company also supplies Buildings' Computer-Based Supervision and Monitoring Systems that may be applied in both small and large structures. Since 1990 the company has been performing full-range trade and contracting activities on the Polish market. We have our own well-trained technical staff and we collaborate with a series of design and contracting enterprises that enable us to have our projects carried out all over Poland. The prices of our supplies and services correspond with the level of the Polish market.

  1. Comparison of First Gear Performance for Manual and Automatic Transmissions

    Directory of Open Access Journals (Sweden)

    Kyle Stottlemyer

    2011-01-01

    Full Text Available The purpose of this project is to compare the first gear performance of an automobile for both its manual and automatic transmission options. Each transmission type has a different gear ratio, which yields a different acceleration curve for each transmission throughout the torque-rpm curve of the engine. The method of integral calculus was used to find an equation which could be used to solve for time at any point in the car's acceleration. The automobile velocity versus time was then graphed to compare each transmission's acceleration trend. This process is similar to that which automotive companies may use when determining what type of transmission to pair with a particular vehicle. By observing the trends in the acceleration graphs, it was determined that there are specific advantages and disadvantages to each type of transmission. Which transmission is the "better" choice depends on what application the automobile will be used for (e.g. racing, day-to-day driving, towing/hauling).
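The integral-calculus comparison described above can be sketched numerically: with t = ∫ m / F(v) dv, a taller first-gear ratio produces more wheel force and a shorter time to a given speed. All vehicle numbers below are invented for illustration:

```python
import numpy as np

def torque(rpm):
    """Toy engine torque curve in N*m, peaking mid-range (an assumption)."""
    return 200.0 - 0.00001 * (rpm - 3500.0) ** 2

def time_to_speed(gear_ratio, v_end, mass=1400.0, final_drive=3.7, r_wheel=0.3):
    """Numerically evaluate t = integral of m / F(v) dv up to v_end."""
    vs = np.linspace(0.5, v_end, 1000)           # start above 0 to avoid the v=0 singularity
    rpm = vs / r_wheel * gear_ratio * final_drive * 60.0 / (2.0 * np.pi)
    force = torque(rpm) * gear_ratio * final_drive / r_wheel
    dv = vs[1] - vs[0]
    return float(np.sum(mass / force) * dv)      # rectangle-rule integration

t_manual = time_to_speed(gear_ratio=3.8, v_end=12.0)   # taller (shorter) first gear
t_auto = time_to_speed(gear_ratio=2.9, v_end=12.0)
```

Under these toy numbers the higher manual first-gear ratio reaches 12 m/s sooner, mirroring the kind of trend comparison the article describes.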

  2. Towards an Intelligent Planning Knowledge Base Development Environment

    Science.gov (United States)

    Chien, S.

    1994-01-01

    This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing steps) from image processing requests made to the JPL Multimission Image Processing Laboratory.

  3. A Compact Methodology to Understand, Evaluate, and Predict the Performance of Automatic Target Recognition

    Science.gov (United States)

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Chen, Yiping; Zhuang, Zhaowen; Cheng, Yongqiang; Deng, Bin; Wang, Liandong; Zeng, Yonghu; Gao, Lei

    2014-01-01

    This paper offers a compact mechanism to carry out performance evaluation work for an automatic target recognition (ATR) system: (a) a standard description of the ATR system's output is suggested, a quantity to indicate the operating condition is presented based on the principle of feature extraction in pattern recognition, and a series of indexes to assess the output in different aspects are developed with the application of statistics; (b) performance of the ATR system is interpreted by a quality factor based on knowledge of engineering mathematics; (c) through a novel utility called “context-probability” estimation proposed based on probability, performance prediction for an ATR system is realized. The simulation result shows that the performance of an ATR system can be accounted for and forecast by the above-mentioned measures. Compared to existing technologies, the novel method can offer more objective performance conclusions for an ATR system. These conclusions may be helpful in knowing the practical capability of the tested ATR system. At the same time, the generalization performance of the proposed method is good. PMID:24967605

  4. Effects on automatic attention due to exposure to pictures of emotional faces while performing Chinese word judgment tasks.

    Science.gov (United States)

    Junhong, Huang; Renlai, Zhou; Senqi, Hu

    2013-01-01

    Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low or high demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that the reaction time was slower and the performance accuracy was higher while performing the low cognitive load task than while performing the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early component of ERP around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites was significantly larger than those elicited by unattended neutral faces while performing the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attention resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces could automatically capture attention if residues of attention resources were available under the unattended condition.

  5. Adaptive pseudolinear compensators of dynamic characteristics of automatic control systems

    Science.gov (United States)

    Skorospeshkin, M. V.; Sukhodoev, M. S.; Timoshenko, E. A.; Lenskiy, F. V.

    2016-04-01

    Adaptive pseudolinear gain and phase compensators of dynamic characteristics of automatic control systems are suggested. The automatic control system performance with adaptive compensators has been explored. The efficiency of pseudolinear adaptive compensators in the automatic control systems with time-varying parameters has been demonstrated.

  6. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation that is performed with the major semi-automatic processes, which would be more productive if a suitable...

  7. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may also be extended by calculating saturation activities from k0 and Q0 factors, and the f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)
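The k0 route mentioned above can be sketched with the standard k0 convention Q0(α) = (Q0 - 0.429)/Ēr^α + 0.429/((2α+1)·0.55^α), relating an analyte's saturation count rate to the Au comparator. The detector efficiencies and example numbers below are illustrative assumptions:

```python
def q0_alpha(q0, er, alpha):
    """Q0(alpha) = (Q0 - 0.429)/Er**alpha + 0.429/((2*alpha + 1) * 0.55**alpha)."""
    return (q0 - 0.429) / er ** alpha + 0.429 / ((2.0 * alpha + 1.0) * 0.55 ** alpha)

def saturation_activity(a_sat_au, k0, f, alpha, q0, er,
                        q0_au=15.7, er_au=5.65, eff=1.0, eff_au=1.0):
    """Analyte saturation count rate from the Au-comparator rate via the k0 factor."""
    return (a_sat_au * k0
            * (f + q0_alpha(q0, er, alpha)) / (f + q0_alpha(q0_au, er_au, alpha))
            * eff / eff_au)

# With alpha = 0 the epithermal correction vanishes and Q0(alpha) reduces to Q0.
a = saturation_activity(a_sat_au=100.0, k0=1.0, f=40.0, alpha=0.0, q0=15.7, er=5.65)
```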

  8. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly- and elliptically-shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of the automatic software design. The proposed method provides a smart tool to realize various functional devices and systems automatically.
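The beam-splitting behavior of a 1-bit coding sequence can be sketched with a plain array-factor calculation, a crude stand-in for the full-wave optimization described above; the element spacing and array size are invented:

```python
import numpy as np

def array_factor(phases, theta_deg, d=0.5):
    """|AF| of a linear array: element spacing d in wavelengths, per-element phases."""
    theta = np.radians(theta_deg)
    n = np.arange(len(phases))
    return np.abs(np.exp(1j * (2.0 * np.pi * d * np.outer(np.sin(theta), n)
                               + phases)).sum(axis=1))

theta = np.linspace(-90.0, 90.0, 721)
uniform = np.zeros(32)                        # coding '000...': single broadside beam
coded = np.tile([0.0, 0.0, np.pi, np.pi], 8)  # coding '0011...': symmetric split beams

peak_uniform = theta[np.argmax(array_factor(uniform, theta))]
peak_coded = theta[np.argmax(array_factor(coded, theta))]
```

With half-wavelength spacing, the period-4 '0011' coding moves the radiation peaks to sin θ = ±0.5, i.e. about ±30°, which is the coding-sequence beam control the paper exploits.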

  9. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Full Text Available Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: a mobile telephone database and a landline database. The experimental results indicate that these acoustic-phonetic features do have some discriminating potential and are worth trying in discrimination. The automatic acoustic-phonetic features have acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kinds of voice features.
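The likelihood-ratio framework can be sketched with a toy single-feature Gaussian model; the within/between variability numbers below are invented, whereas a real system models full feature vectors estimated from the reference database:

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_ratio(questioned, suspect_mean, within_sd, pop_mean, between_sd):
    """LR = p(evidence | same speaker) / p(evidence | different speakers)."""
    numerator = gaussian_pdf(questioned, suspect_mean, within_sd)   # within-speaker model
    denominator = gaussian_pdf(questioned, pop_mean, between_sd)    # reference population
    return numerator / denominator

# A questioned measurement close to the suspect's mean: LR > 1 supports same origin.
lr = likelihood_ratio(questioned=120.0, suspect_mean=118.0, within_sd=5.0,
                      pop_mean=140.0, between_sd=20.0)
```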

  10. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    Science.gov (United States)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology to maintain the real-time balance between power generation and load, and to ensure the quality of power supply. Power grids require each power generation unit to have a satisfactory AGC performance, as specified in two detailed rules. The two rules provide a set of indices to measure the AGC performance of a power generation unit. However, the commonly-used method to calculate these indices is based on particular data samples from AGC responses and will lead to incorrect results in practice. This paper proposes a new method to estimate the AGC performance indices via system identification techniques. In addition, a nonlinear regression model between performance indices and load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.
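The identification idea can be sketched as fitting a first-order lag to a unit's measured response and reading an index off the fitted model; the model form, the index definition and all numbers are illustrative assumptions, not the rules' actual indices:

```python
import numpy as np

def first_order(K, T, t, step=1.0):
    """Unit response to a step load command under a first-order-lag model."""
    return K * step * (1.0 - np.exp(-t / T))

t = np.linspace(0.0, 100.0, 201)
noise = 0.001 * np.random.default_rng(1).normal(size=t.size)
measured = first_order(0.98, 12.0, t) + noise        # synthetic plant data

# Grid-search system identification of gain K and time constant T.
Ks = np.linspace(0.8, 1.2, 41)
Ts = np.linspace(5.0, 20.0, 61)
_, K_hat, T_hat = min((float(np.sum((first_order(K, T, t) - measured) ** 2)), K, T)
                      for K in Ks for T in Ts)

t90 = -T_hat * np.log(0.1)   # example index: time to reach 90% of the commanded change
```

Fitting the whole response rather than reading indices off raw samples is what makes the estimate robust to noise in individual data points.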

  11. Supporting the Development and Adoption of Automatic Lameness Detection Systems in Dairy Cattle: Effect of System Cost and Performance on Potential Market Shares.

    Science.gov (United States)

    Van De Gucht, Tim; Van Weyenberg, Stephanie; Van Nuffel, Annelies; Lauwers, Ludwig; Vangeyte, Jürgen; Saeys, Wouter

    2017-10-08

    Most automatic lameness detection system prototypes have not yet been commercialized, and are hence not yet adopted in practice. Therefore, the objective of this study was to simulate the effect of detection performance (percentage missed lame cows and percentage false alarms) and system cost on the potential market share of three automatic lameness detection systems relative to visual detection: a system attached to the cow, a walkover system, and a camera system. Simulations were done using a utility model derived from survey responses obtained from dairy farmers in Flanders, Belgium. Overall, systems attached to the cow had the largest market potential, but were still not competitive with visual detection. Increasing the detection performance or lowering the system cost led to higher market shares for automatic systems at the expense of visual detection. The willingness to pay for extra performance was €2.57 per % less missed lame cows, €1.65 per % less false alerts, and €12.7 for lame leg indication, respectively. The presented results could be exploited by system designers to determine the effect of adjustments to the technology on a system's potential adoption rate.
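The reported willingness-to-pay figures imply a simple linear utility comparison; the sketch below, with an invented baseline system and cost, shows how a system's specifications would translate into net perceived value versus visual detection:

```python
WTP_MISSED = 2.57   # euro per percentage point fewer missed lame cows
WTP_FALSE = 1.65    # euro per percentage point fewer false alerts
WTP_LEG = 12.7      # euro for indicating which leg is lame

def utility_gain(d_missed_pct, d_false_pct, indicates_leg, extra_cost):
    """Net perceived value (euro) of an automatic system relative to visual detection."""
    gain = (WTP_MISSED * d_missed_pct
            + WTP_FALSE * d_false_pct
            + (WTP_LEG if indicates_leg else 0.0))
    return gain - extra_cost

# Hypothetical system: 10 points fewer missed cows, 5 points fewer false alerts,
# indicates the lame leg, and costs 30 euro more than visual detection.
net = utility_gain(10.0, 5.0, True, 30.0)
```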

  12. Technology-Enhanced Assessment of Math Fact Automaticity: Patterns of Performance for Low- and Typically Achieving Students

    Science.gov (United States)

    Stickney, Eric M.; Sharp, Lindsay B.; Kenyon, Amanda S.

    2012-01-01

    Because math fact automaticity has been identified as a key barrier for students struggling with mathematics, we examined how initial math achievement levels influenced the path to automaticity (e.g., variation in number of attempts, speed of retrieval, and skill maintenance over time) and the relation between attainment of automaticity and gains…

  13. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  14. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, known as automatic clustering. This study aims to acquire the cluster number automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the result is compared with other state-of-the-art automatic clustering methods. The experiment results show that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.
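The activation-threshold encoding behind such automatic clustering can be sketched with a plain DE loop on toy 1-D data. This is not the AC-FSDE algorithm itself: its forced mutation strategy is replaced by standard DE/rand/1/bin, and the fitness is a crude compactness/separation trade-off chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.1, 30), rng.normal(5.0, 0.1, 30)])
K_MAX = 5

def decode(vec):
    """First K_MAX genes are activation flags, the rest are candidate centroids."""
    active = vec[:K_MAX] > 0.5
    if active.sum() < 2:                       # force at least two active clusters
        active[np.argsort(vec[:K_MAX])[-2:]] = True
    return vec[K_MAX:][active]

def fitness(vec):
    centers = decode(vec)
    d = np.abs(data[:, None] - centers[None, :])
    labels = d.argmin(axis=1)
    used = np.unique(labels)
    if len(used) < 2:
        return 1e9
    intra = d.min(axis=1).mean()               # compactness (smaller is better)
    cs = centers[used]
    sep = min(abs(a - b) for i, a in enumerate(cs) for b in cs[i + 1:])
    return intra + 0.1 * len(used) - 0.5 * min(sep, 5.0) / 5.0

dim, pop_size, F, CR = 2 * K_MAX, 30, 0.7, 0.9
low = np.array([0.0] * K_MAX + [data.min()] * K_MAX)
high = np.array([1.0] * K_MAX + [data.max()] * K_MAX)
pop = rng.uniform(low, high, (pop_size, dim))
fit = np.array([fitness(p) for p in pop])
for _ in range(100):
    for i in range(pop_size):
        a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
        mutant = np.clip(a + F * (b - c), low, high)
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True        # guarantee at least one mutant gene
        trial = np.where(cross, mutant, pop[i])
        f = fitness(trial)
        if f <= fit[i]:
            pop[i], fit[i] = trial, f

centers = decode(pop[np.argmin(fit)])
```

On this two-blob toy problem the activation flags let DE decide the cluster count while the centroid genes position the clusters.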

  15. Application of ANN-SCE model on the evaluation of automatic generation control performance

    Energy Technology Data Exchange (ETDEWEB)

    Chang-Chien, L.R.; Lo, C.S.; Lee, K.S. [National Cheng Kung Univ., Tainan, Taiwan (China)

    2005-07-01

    An accurate evaluation of load frequency control (LFC) performance is needed to balance minute-to-minute electricity generation and demand. In this study, an artificial neural network-based system control error (ANN-SCE) model was used to assess the performance of automatic generation controls (AGC). The model was used to identify system dynamics for control references in supplementing AGC logic. The artificial neural network control error model was used to track a single area's LFC dynamics in Taiwan. The model was used to gauge the impacts of regulation control. Results of the training, evaluating, and projecting processes showed that the ANN-SCE model could be algebraically decomposed into components corresponding to different impact factors. The SCE information obtained from testing of various AGC gains provided data for the creation of a new control approach. The ANN-SCE model was used in conjunction with load forecasting and scheduled generation data to create an ANN-SCE identifier. The model successfully simulated SCE dynamics. 13 refs., 10 figs.

  16. Automatic Deficits can lead to executive deficits in ADHD

    Directory of Open Access Journals (Sweden)

    Gabriella Martino

    2017-12-01

    Full Text Available Executive dysfunction has been well documented in children with Attention Deficit Hyperactivity Disorder (ADHD) and with Reading Disorder (RD). The purpose of the present study was to test the alternative hypothesis that deficits in executive functioning within ADHD may be partially due to an impairment of automatic processing. In addition, given the co-occurrence of ADHD and RD, we tested the hypothesis that automatic processing may be a common cognitive factor between ADHD and RD. We investigated the automatic processing of selective visual attention through two experiments. 12 children with ADHD, 17 with ADHD+RD and 29 typically developing children, matched for age and gender, performed two tasks: the Visual Information Processing Task and the Clock Test. As expected, the ADHD and ADHD+RD groups differed from the control group in the controlled process task, suggesting a deficit in executive functioning. All clinical subjects also exhibited lower performance in automatic processes compared to the control group. The results of this study suggest that executive deficits within ADHD can be partially due to an impairment of automatic processing.

  17. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.
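The live-wire step can be sketched as a shortest-path search over a cost image in which strong (enhanced) boundaries are cheap to traverse; the 5x5 toy cost map below stands in for the auto-enhanced gradient image:

```python
import heapq
import numpy as np

def live_wire(cost, start, goal):
    """Dijkstra shortest path on a 2-D cost grid (4-connected)."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = cost[start]
    pq = [(cost[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + cost[nr, nc]
                prev[(nr, nc)] = (r, c)
                heapq.heappush(pq, (dist[nr, nc], (nr, nc)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# A cheap 'edge' along row 2 of the grid; the wire should snap to it.
cost = np.ones((5, 5))
cost[2, :] = 0.1
path = live_wire(cost, (2, 0), (2, 4))
```

Better enhancement lowers the cost along the true lesion boundary, so the optimal path follows it with fewer user-placed seed points, which is the interaction reduction the abstract reports.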

  18. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  19. Design and accomplishment of the monitoring unit for the 60Co train freight inspection system

    CERN Document Server

    Cong Peng

    2002-01-01

    The 60Co railway cargo inspection system is highly automated, and the monitoring unit is an important part of the automatic control system. The author introduces the design of the monitoring unit in detail and describes a new-style unit which differs from the traditional one. The monitoring unit, which is highly integrated, easy to mount and debug, and convenient to operate and maintain, has played an excellent role in the work of the whole inspection system.

  20. Automatic operation device for control rods

    International Nuclear Information System (INIS)

    Sekimizu, Koichi

    1984-01-01

    Purpose: To enable automatic operation of control rods based on reactor operation planning and, particularly, to decrease the operator's load upon startup and shutdown of the reactor. Constitution: Operation plans, demands for automatic operation, break point setting values, power and reactor core flow rate changes, demands for operation interrupt, demands for restart, demands for forecasting and the like are inputted to an input device, and an overall judging device performs a long-term forecast as far as the break point by a long-term forecasting device based on the operation plans. The automatic reactor operation or the like is carried out based on the long-term forecast, and the short-term forecast is performed from the change in the reactor core status due to the control rod operation sequence based on the control rod pattern and the operation plan. Then, it is judged whether the operation of the intended control rod is possible based on the result of the short-term forecast. (Aizawa, K.)

  1. Automatic Oscillating Turret.

    Science.gov (United States)

    1981-03-01

    Final Report, February 1978 - September 1980: Automatic Oscillating Turret System. Surviving fragments of the report documentation page and table of contents cover an appendix on the oscillating bumper turret and its description of the turret controls. Other criteria requirements were: 1. Turret controls inside cab. 2. Automatic oscillation with fixed elevation ranging from 20° below the horizontal to

  2. On the automaticity of response inhibition in individuals with alcoholism.

    Science.gov (United States)

    Noël, Xavier; Brevers, Damien; Hanak, Catherine; Kornreich, Charles; Verbanck, Paul; Verbruggen, Frederick

    2016-06-01

    Response inhibition is usually considered a hallmark of executive control. However, recent work indicates that stop performance can become associatively mediated ('automatic') over practice. This study investigated automatic response inhibition in sober and recently detoxified individuals with alcoholism. We administered to forty recently detoxified alcoholics and forty healthy participants a modified stop-signal task that consisted of a training phase in which a subset of the stimuli was consistently associated with stopping or going, and a test phase in which this mapping was reversed. In the training phase, stop performance improved for the consistent stop stimuli, compared with control stimuli that were not associated with going or stopping. In the test phase, go performance tended to be impaired for old stop stimuli. Combined, these findings support the automatic inhibition hypothesis. Importantly, performance was similar in both groups, which indicates that automatic inhibitory control develops normally in individuals with alcoholism. This finding is specific to individuals with alcoholism without other psychiatric disorders, which is rather atypical and prevents generalization. Personalized stimuli with a stronger affective content should be used in future studies. These results advance our understanding of behavioral inhibition in individuals with alcoholism. Furthermore, intact automatic inhibitory control may be an important element of successful cognitive remediation of addictive behaviors. Copyright © 2016 Elsevier Ltd. All rights reserved.
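Stop-signal data of this kind are commonly scored with the integration method for stop-signal reaction time (SSRT); the sketch below uses invented numbers and is not tied to this study's analysis:

```python
import numpy as np

def ssrt_integration(go_rts, p_respond_given_stop, mean_ssd):
    """SSRT = go-RT percentile matching the stop-failure rate, minus the mean SSD."""
    go_sorted = np.sort(np.asarray(go_rts))
    idx = int(round(p_respond_given_stop * len(go_sorted))) - 1
    nth_rt = go_sorted[max(idx, 0)]
    return float(nth_rt - mean_ssd)

rng = np.random.default_rng(42)
go_rts = rng.normal(500.0, 50.0, 200)   # go reaction times in ms (invented)
ssrt = ssrt_integration(go_rts, p_respond_given_stop=0.5, mean_ssd=250.0)
```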

  3. Automatic Operation For A Robot Lawn Mower

    Science.gov (United States)

    Huang, Y. Y.; Cao, Z. L.; Oh, S. J.; Kattan, E. U.; Hall, E. L.

    1987-02-01

    A domestic mobile robot, a lawn mower which performs in automatic operation mode, has been built at the Center of Robotics Research, University of Cincinnati. The robot lawn mower automatically completes its work with the region filling operation, a new kind of path planning for mobile robots. Some strategies for region-filling path planning have been developed for a partly-known or an unknown environment. Also, an advanced omnidirectional navigation system and a multisensor-based control system are used in the automatic operation. Research on the robot lawn mower, especially on the region filling of path planning, is significant for industrial and agricultural applications.
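Region filling is commonly realized as a boustrophedon (back-and-forth) sweep; a minimal grid version, with the grid discretization itself as an assumption:

```python
def boustrophedon(rows, cols):
    """Back-and-forth coverage path visiting every cell of a rows x cols grid once."""
    path = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cs)
    return path

path = boustrophedon(3, 4)   # serpentine sweep over a 3 x 4 lawn grid
```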

  4. Automatic segmentation of the heart in radiotherapy for breast cancer

    DEFF Research Database (Denmark)

    Laugaard Lorenzen, Ebbe; Ewertz, Marianne; Brink, Carsten

    2014-01-01

    Background. The aim of this study was to evaluate two fully automatic segmentation methods, in comparison with manual delineations, for delineating the heart on the planning computed tomography (CT) scans used in radiotherapy for breast cancer. Material and methods. Automatic delineation of the heart...... in 15 breast cancer patients was performed by two different automatic delineation systems. The accuracy and precision of the differences between manual and automatic delineations were evaluated in terms of volume, mean dose, maximum dose and spatial distance differences. Two sets of manual delineations...

  5. Assessing the Performance of Automatic Speech Recognition Systems When Used by Native and Non-Native Speakers of Three Major Languages in Dictation Workflows

    DEFF Research Database (Denmark)

    Zapata, Julián; Kirkedal, Andreas Søeborg

    2015-01-01

    In this paper, we report on a two-part experiment aiming to assess and compare the performance of two types of automatic speech recognition (ASR) systems on two different computational platforms when used to augment dictation workflows. The experiment was performed with a sample of speakers...

  6. The evaluation of the performance of the automatic exposure control system of some selected mammography facilities in the Greater Accra Region, Ghana

    International Nuclear Information System (INIS)

    Amesimenu, R.

    2013-07-01

    Mammography aids in the early detection of breast cancer. Although very useful, X-rays carry an associated risk of inducing cancer, so mammography procedures should be optimized, for example through the selection of exposure factors that give an optimal image at minimal dose to the patient. The automatic exposure control (AEC) system aids in the selection of exposure factors, controlling the amount of radiation to the breast and automatically compensating for differences in breast thickness and density. The performance of the automatic exposure control systems of the mammography equipment, and the status of quality management systems, including quality assurance and quality control, were assessed at four (4) mammography facilities within the Greater Accra Region. To assess the performance of the automatic exposure control systems, short-term reproducibility, thickness compensation and voltage compensation tests were carried out using breast-equivalent phantoms of various thicknesses. A half-value layer test, film reject analysis and patient dose assessment were also performed. Analysis of the responses to the questionnaire administered to radiographers and supervisors of the selected facilities revealed that three (3) of the facilities have some aspects of a quality management system programme in place, but not effectively implemented. Optical densities measured in the various tests of the automatic exposure control systems showed that the AEC compensates for different phantom thicknesses and tube voltages (kV), producing comparable optical densities across the phantom thicknesses and tube voltages. Some of the measured optical densities were within the recommended range of 1.5 OD-1.9 OD; the highest value was 0.13 OD above the upper limit of 1.9 OD. The film reject analysis showed that patient motion accounted for the largest share (28%) of film rejects. Other factors such as too light

  7. Automatic Sample Changer for X-Ray Spectrometry

    International Nuclear Information System (INIS)

    Morales Tarre, Orlando; Diaz Castro, Maikel; Rivero Ramirez, Doris; Lopez Pino, Neivy

    2011-01-01

    The design and construction of an automatic sample changer for Nuclear Analysis Laboratory's X-ray spectrometer at InSTEC is presented by giving basic details about its mechanical structure, control circuits and the software application developed to interact with the data acquisition software of the multichannel analyzer. Results of some test experiments performed with the automatic sample changer are also discussed. The system is currently in use at InSTEC. (Author)

  8. CT-automatic exposure control devices: What are their performances?

    Energy Technology Data Exchange (ETDEWEB)

    Gutierrez, Daniel [University Institute for Radiation Physics (IRA-DUMSC), Grand-Pre 1, CH-1007 Lausanne (Switzerland); Schmidt, Sabine; Denys, Alban; Schnyder, Pierre [Radiology Department, University Hospital of Lausanne, CHUV, CH-1011 Lausanne (Switzerland); Bochud, Francois O. [University Institute for Radiation Physics (IRA-DUMSC), Grand-Pre 1, CH-1007 Lausanne (Switzerland); Verdun, Francis R. [University Institute for Radiation Physics (IRA-DUMSC), Grand-Pre 1, CH-1007 Lausanne (Switzerland)], E-mail: francis.verdun@chuv.ch

    2007-10-01

    Purpose: To avoid unnecessary exposure of patients, manufacturers have developed tube current modulation algorithms. The purpose of this work is to assess the performance of computed tomography (CT) tube current modulation with respect to patient dose and image noise in MSCT scanners. Material and methods: A conical PMMA phantom with an elliptical cross-section, which varies the thickness of the irradiated object monotonically, and an anthropomorphic chest phantom were scanned under similar conditions on a General Electric (GE) LightSpeed VCT (64-slice) scanner. Noise was measured by calculating the standard deviation of the CT number over a homogeneous ROI in both phantoms. The dose was estimated from the parameters read in the DICOM header of each studied image. Results: The study showed that, most of the time, a constant noise level (noise index) can be obtained by varying the mA. Nevertheless, this adaptation may not be fast enough when the attenuation changes rapidly; an adaptation length of up to 5 cm was observed. An 18% dose reduction can be achieved (mean of 9.9%) by switching from the z-axis modulation algorithm to the xyz-axis modulation option. However, exposure in the chest area can be higher with current modulation than without, when trying to keep the image noise level constant in thoraco-abdominal investigations. Conclusion: Current modulation algorithms can produce inadequate-quality images due to problems with tube current stabilization when a sudden attenuation variation occurs, as at the start of a scanning sequence. As expected, rotational (xyz-axis) modulation performs better than the z-axis-only modulation algorithm. The use of automatic exposure control (AEC) can lead to an increase in dose if the maximum allowed current is not properly set in thoraco-abdominal acquisitions.
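    The noise measurement described (standard deviation of CT numbers over a homogeneous ROI) can be sketched as follows. The ROI values are simulated here with an assumed 12 HU noise level on a water-equivalent region; this is an illustration of the metric, not data from the study.

    ```python
    import random
    import statistics

    # Simulated homogeneous water ROI: CT number ~ 0 HU plus additive noise
    # (sigma = 12 HU is an assumed value, 400 pixels ~ a 20x20 ROI).
    random.seed(0)
    roi = [random.gauss(0.0, 12.0) for _ in range(400)]

    noise = statistics.pstdev(roi)   # image noise = std dev of CT numbers in the ROI
    mean_hu = statistics.fmean(roi)  # mean CT number, should stay near 0 HU for water
    print(f"mean = {mean_hu:.1f} HU, noise = {noise:.1f} HU")
    ```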

  9. Data Provenance Inference in Logic Programming: Reducing Effort of Instance-driven Debugging

    NARCIS (Netherlands)

    Huq, M.R.; Mileo, Alessandra; Wombacher, Andreas

    Data provenance allows scientists in different domains to validate their models and algorithms and to find anomalies and unexpected behaviors. In previous work, we described on-the-fly interpretation of (Python) scripts to build a workflow provenance graph automatically and then infer fine-grained

  10. Evaluation of manual and automatic manually triggered ventilation performance and ergonomics using a simulation model.

    Science.gov (United States)

    Marjanovic, Nicolas; Le Floch, Soizig; Jaffrelot, Morgan; L'Her, Erwan

    2014-05-01

    In the absence of endotracheal intubation, the manual bag-valve-mask (BVM) is the most frequently used ventilation technique during resuscitation. The efficiency of other devices has been poorly studied. The bench-test study described here was designed to evaluate the effectiveness of an automatic, manually triggered system, and to compare it with manual BVM ventilation. A respiratory system bench model was assembled using a lung simulator connected to a manikin to simulate a patient with unprotected airways. Fifty health-care providers from different professional groups (emergency physicians, residents, advanced paramedics, nurses, and paramedics; n = 10 per group) evaluated manual BVM ventilation, and compared it with an automatic manually triggered device (EasyCPR). Three pathological situations were simulated (restrictive, obstructive, normal). Standard ventilation parameters were recorded; the ergonomics of the system were assessed by the health-care professionals using a standard numerical scale once the recordings were completed. The tidal volume fell within the standard range (400-600 mL) for 25.6% of breaths (0.6-45 breaths) using manual BVM ventilation, and for 28.6% of breaths (0.3-80 breaths) using the automatic manually triggered device (EasyCPR) (P < .0002). Peak inspiratory airway pressure was lower using the automatic manually triggered device (EasyCPR) (10.6 ± 5 vs 15.9 ± 10 cm H2O, P < .001). The ventilation rate fell consistently within the guidelines, in the case of the automatic manually triggered device (EasyCPR) only (10.3 ± 2 vs 17.6 ± 6, P < .001). Significant pulmonary overdistention was observed when using the manual BVM device during the normal and obstructive sequences. The nurses and paramedics considered the ergonomics of the automatic manually triggered device (EasyCPR) to be better than those of the manual device. The use of an automatic manually triggered device may improve ventilation efficiency and decrease the risk of

  11. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Science.gov (United States)

    Butler, Emily E; Ward, Robert; Ramsey, Richard

    2015-01-01

    Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243), the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness) and with disorders of social cognition (autistic-like and schizotypal traits). Further personality traits (narcissism and empathy) were assessed in a subsample of participants (N=57). Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.


  13. Automatic differentiation for gradient-based optimization of radiatively heated microelectronics manufacturing equipment

    Energy Technology Data Exchange (ETDEWEB)

    Moen, C.D.; Spence, P.A.; Meza, J.C.; Plantenga, T.D.

    1996-12-31

    Automatic differentiation is applied to the optimal design of microelectronic manufacturing equipment. The performance of nonlinear, least-squares optimization methods is compared between numerical and analytical gradient approaches. The optimization calculations are performed by running large finite-element codes in an object-oriented optimization environment. The Adifor automatic differentiation tool is used to generate analytic derivatives for the finite-element codes. The performance results support previous observations that automatic differentiation becomes beneficial as the number of optimization parameters increases. The increase in speed, relative to numerical differences, has a limited value and results are reported for two different analysis codes.

  14. The ETA10 supercomputer system

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

    The ETA Systems, Inc. ETA 10 is a next-generation supercomputer featuring multiprocessing, a large hierarchical memory system, high performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid nitrogen cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed. (orig.)

  15. The ETA systems plans for supercomputers

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

    The ETA 10, from ETA Systems, is a class VII supercomputer featuring multiprocessing, a large hierarchical memory system, high performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid nitrogen cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed

  16. Debugging Democracy

    Directory of Open Access Journals (Sweden)

    Alexander Likhotal

    2016-05-01

    Full Text Available Democracy was the most successful political idea of the 20th century. However, since the beginning of the new century, democracy has clearly been suffering from serious structural problems rather than a few isolated ailments. Why has it run into trouble, and can it be revived? In the consumption-driven world, people have come to be driven by the belief in economic prosperity as the guarantee of human freedom. As a result, human development and personal status have become hostages of economic performance, deforming civilisation’s basic ethical matrix. However, in 10-15 years the world may be completely different. We are looking at communications and technology revolutions occurring in very abbreviated time frames. Soon, billions of people will interact via a fast data-transferring Metaweb, and it will change social standards as well as patterns of human behaviour. Integrated global economies functioning as holistic entities will spur a deep reframing of global governance, shaping a new configuration of political, economic and military power. One can hardly expect that these changes will leave democratic mechanisms intact. It is a pivotal moment for all of us, because we are facing paradigm changes in our way of life. We clearly need a new political vision that is deliverable quickly. Democracy can be reset if it can provide a platform for collective judgement and individual development—in a value-driven process, where values manifest themselves in concrete and socially meaningful issues and are not reduced to economic optimization and the politics of the wallet. In other words, the only remedy for the crisis of democracy is more democracy.

  17. Special Issue on Automatic Application Tuning for HPC Architectures

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    2014-01-01

    Full Text Available High Performance Computing architectures have become incredibly complex and exploiting their full potential is becoming more and more challenging. As a consequence, automatic performance tuning (autotuning of HPC applications is of growing interest and many research groups around the world are currently involved. Autotuning is still a rapidly evolving research field with many different approaches being taken. This special issue features selected papers presented at the Dagstuhl seminar on “Automatic Application Tuning for HPC Architectures” in October 2013, which brought together researchers from the areas of autotuning and performance analysis in order to exchange ideas and steer future collaborations.

  18. On automatic visual inspection of reflective surfaces

    DEFF Research Database (Denmark)

    Kulmann, Lionel

    1995-01-01

    This thesis describes different methods for performing automatic visual inspection of reflective manufactured products, with the aim of increasing productivity, reducing cost and improving the quality level of production. We investigate two different systems performing automatic visual inspection. The first is the inspection of highly reflective aluminum sheets, used by the Danish company Bang & Olufsen as part of the exterior design and general appearance of their audio and video products. The second is the inspection of IBM hard disk read/write heads for defects during manufacturing. We have...... surfaces, providing new and exciting applications subject to automated visual inspection. Several contextual features have been surveyed, along with the introduction of novel methods to perform data-dependent enhancement of local surface appearance. Morphological methods have been described and utilized......

  19. Development of automatic ultrasonic testing system and its application

    International Nuclear Information System (INIS)

    Oh, Sang Hong; Matsuura, Toshihiko; Iwata, Ryusuke; Nakagawa, Michio; Horikawa, Kohsuke; Kim, You Chul

    1997-01-01

    Radiographic testing (RT) is usually applied as the nondestructive test to detect internal defects at the welded joints of a penstock. Where RT could not be applied, ultrasonic testing (UT) was performed instead. UT was generally carried out by manual scanning, with the inspection data recorded by the inspector on site; as a weak point, there were no objective inspection records corresponding to the films of RT. An automatic ultrasonic testing system enabling automatic scanning and automatic recording was therefore expected and developed. Using the newly developed automatic ultrasonic testing system, test results for the circumferential welded joints of a penstock at a site are presented in this paper.

  20. Adapting Mask-RCNN for Automatic Nucleus Segmentation

    OpenAIRE

    Johnson, Jeremiah W.

    2018-01-01

    Automatic segmentation of microscopy images is an important task in medical image processing and analysis. Nucleus detection is an important example of this task. Mask-RCNN is a recently proposed state-of-the-art algorithm for object detection, object localization, and object instance segmentation of natural images. In this paper we demonstrate that Mask-RCNN can be used to perform highly effective and efficient automatic segmentations of a wide range of microscopy images of cell nuclei, for ...

  1. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic, self-contained data processing system, transportable to a site, able to produce displays such as ''A-scan'', ''B-scan'', ... and to present the inspection results very quickly. It can be used for pressure vessel inspection [fr]

  2. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...
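    To see why an extended FOV aggravates packet collisions, a common back-of-the-envelope approximation treats the many overlapping self-organized TDMA cells seen by a satellite as slotted ALOHA: a message is received cleanly only if no other ship transmits in its slot. The sketch below assumes the standard 2250-slot AIS frame (one minute per channel) and a hypothetical 10 messages per ship per frame; it is an illustrative model, not the paper's analysis.

    ```python
    def receive_probability(ships, msgs_per_ship, slots=2250):
        """Probability that one AIS message survives collision-free under a
        slotted-ALOHA approximation: no other message may land in its slot."""
        others = ships * msgs_per_ship - 1   # competing messages in the frame
        return (1 - 1 / slots) ** others

    # Load grows with the field of view: more ships visible, more collisions.
    for ships in (100, 1000, 5000):
        p = receive_probability(ships, msgs_per_ship=10)
        print(f"{ships:5d} ships in FOV: P(clean reception per message) = {p:.4f}")
    ```

    The qualitative point is the exponential decay: a terrestrial base station sees perhaps a hundred ships, while a satellite FOV can contain thousands, collapsing the per-message reception probability.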

  3. Parallelization of the AliRoot event reconstruction by performing a semi- automatic source-code transformation

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    side bus or processor interconnections. Parallelism can only result in a performance gain if memory usage is optimized, memory locality is improved and communication between threads is minimized. But the domain of concurrent programming has become a field for highly skilled experts, as the implementation of multithreading is difficult, error-prone and labor-intensive. A full re-implementation for parallel execution of existing offline frameworks, like AliRoot in ALICE, is thus unaffordable. An alternative method is to use a semi-automatic source-to-source transformation to obtain a simple parallel design with almost no interference between threads. This reduces the need of rewriting the develop...

  4. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, the 6 universal emotions, consisting of expressions of happiness, sadness, surprise, disgust, anger and fear, were classified at a success rate of 84% using artificial neural networks.
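    The wavelet feature-extraction step can be illustrated with a one-level Haar transform on a toy row of pixel intensities. This is a generic sketch of discrete wavelet decomposition, not the authors' wavelet family or parameter set; the sample row is hypothetical.

    ```python
    import math

    def haar_dwt(signal):
        """One level of the discrete Haar wavelet transform.

        Returns (approximation, detail) coefficients; the signal length must
        be even. Approximation coefficients capture coarse structure, details
        capture local change -- both are usable as classification features.
        """
        assert len(signal) % 2 == 0
        s = math.sqrt(2)
        approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
        return approx, detail

    # Hypothetical intensity row from an eye region (bright reflection mid-row)
    row = [10, 10, 12, 14, 50, 52, 14, 12]
    a, d = haar_dwt(row)
    ```

    Because the Haar basis is orthonormal, the transform preserves signal energy, so no feature information is lost at this stage; repeated application to `a` yields the multi-level decomposition.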

  5. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    and achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented......This paper presents work on automatic grasp generation and grasp learning for reducing the manual setup time and increase grasp success rates within bin-picking applications. We propose an approach that is able to generate good grasps automatically using a dynamic grasp simulator, a newly developed...

  6. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects face from the stored video frame using skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experiment results indicate that using support vector machine as classifier can certainly improve the performance of automatic pain recognition system.
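    As an illustration of the SVM classification stage, the sketch below trains a minimal linear SVM with the Pegasos sub-gradient method on hypothetical two-dimensional shape features. Neither the features nor the solver are the paper's; real systems use richer feature vectors and library solvers with kernels.

    ```python
    import random

    def train_linear_svm(data, labels, lam=0.01, epochs=200, seed=1):
        """Minimal linear SVM (no bias term) trained with the Pegasos
        sub-gradient method. labels must be +1 / -1."""
        random.seed(seed)
        w = [0.0] * len(data[0])
        t = 0
        for _ in range(epochs):
            for i in random.sample(range(len(data)), len(data)):  # shuffled pass
                t += 1
                eta = 1.0 / (lam * t)            # decaying step size
                x, y = data[i], labels[i]
                margin = y * sum(wj * xj for wj, xj in zip(w, x))
                w = [(1 - eta * lam) * wj for wj in w]   # regularization shrink
                if margin < 1:                           # hinge-loss violation
                    w = [wj + eta * y * xj for wj, xj in zip(w, x)]
        return w

    def predict(w, x):
        return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

    # Hypothetical 2-D shape features: (mouth openness, brow lowering)
    pain    = [(0.9, 0.2), (0.8, 0.3), (1.0, 0.25)]
    no_pain = [(0.1, 0.8), (0.2, 0.9), (0.15, 0.7)]
    X = pain + no_pain
    y = [1] * len(pain) + [-1] * len(no_pain)
    w = train_linear_svm(X, y)
    ```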

  7. Automatic referral to cardiac rehabilitation.

    Science.gov (United States)

    Fischer, Jane P

    2008-01-01

    The pervasive negative impact of cardiovascular disease in the United States is well documented. Although advances have been made, the campaign to reduce the occurrence, progression, and mortality continues. Determining evidence-based data is only half the battle. Implementing new and updated clinical guidelines into daily practice is a challenging task. Cardiac rehabilitation is an example of a proven intervention whose benefit is hindered through erratic implementation. The American Association of Cardiovascular and Pulmonary Rehabilitation (AACVPR), the American College of Cardiology (ACC), and the American Heart Association (AHA) have responded to this problem by publishing the AACVPR/ACC/AHA 2007 Performance Measures on Cardiac Rehabilitation for Referral to and Delivery of Cardiac Rehabilitation/Secondary Prevention Services. This new national guideline recommends automatic referral to cardiac rehabilitation for every eligible patient (performance measure A-1). This article offers guidance for the initiation of an automatic referral system, including individualizing your protocol with regard to electronic or paper-based order entry structures.

  8. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper, a method to automatically generate transition distances for LOD, improving image stability and performance, is presented. Three different methods were tested, all measuring the change between two levels of detail using spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time, both view-direction-based selection and the furthest distance for each direction was ...

  9. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
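    The chain-rule propagation that automatic differentiation performs can be sketched with forward-mode dual numbers: each value carries its derivative, and arithmetic propagates both exactly. A minimal illustration, not tied to any tool in the bibliography:

    ```python
    import math

    class Dual:
        """Dual number val + dot*eps for forward-mode automatic differentiation:
        arithmetic propagates derivative values exactly via the chain rule."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            # product rule: (uv)' = u v' + u' v
            return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
        __rmul__ = __mul__

    def sin(x):
        # chain rule: d sin(u) = cos(u) * du
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

    # f(x) = x^2 * sin(x); analytically f'(x) = 2x sin(x) + x^2 cos(x)
    x = Dual(1.5, 1.0)        # seed derivative dx/dx = 1
    f = x * x * sin(x)        # f.val is f(1.5), f.dot is f'(1.5), both exact
    ```

    Unlike finite differences, the derivative suffers no truncation error, which is why AD is attractive for the gradient-based uses listed above.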

  10. Automatic contact in DYNA3D for vehicle crashworthiness

    International Nuclear Information System (INIS)

    Whirley, R.G.; Engelmann, B.E.

    1994-01-01

    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit, nonlinear, finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. The authors have used a new four-step automatic contact algorithm. Key aspects of the proposed method include (1) automatic identification of adjacent and opposite surfaces in the global search phase, and (2) the use of a smoothly varying surface normal that allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. Three examples are given to illustrate the performance of the newly proposed algorithm in the public DYNA3D code

  11. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  12. Automatic Segmentation of Vessels in In-Vivo Ultrasound Scans

    DEFF Research Database (Denmark)

    Tamimi-Sarnikowski, Philip; Brink-Kjær, Andreas; Moshavegh, Ramin

    2017-01-01

    Ultrasound has become highly popular for monitoring atherosclerosis by scanning the carotid artery. The screening involves measuring the thickness of the vessel wall and the diameter of the lumen. An automatic segmentation of the vessel lumen can enable determination of the lumen diameter. This paper...... presents a fully automatic segmentation algorithm for robustly segmenting the vessel lumen in longitudinal B-mode ultrasound images. The automatic segmentation is performed using a combination of B-mode and power Doppler images. The proposed algorithm includes a series of preprocessing steps, and performs...... a vessel segmentation by use of the marker-controlled watershed transform. The ultrasound images used in the study were acquired using the bk3000 ultrasound scanner (BK Ultrasound, Herlev, Denmark) with two transducers, ”8L2 Linear” and ”10L2w Wide Linear” (BK Ultrasound, Herlev, Denmark). The algorithm...

  13. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

    Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58 %. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge in the case of setting the initial matrix as identity, while this was not achieved by manual data sets. Given the same initial matrix, the repeatability of the automatic was [0.46, 0.34, 0.80, 0.47] versus [0.42, 0.51, 0.98, 1.15] mm in the manual case for the US image four corners. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated
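    At its core, such a calibration estimates a rigid transform from point correspondences: the AE point localized in the US image versus the known robot positions. The sketch below solves the simplified 2-D version in closed form (Procrustes alignment), assuming exact correspondences; the real problem is 3-D and noise-contaminated, and this is not the paper's method.

    ```python
    import math

    def fit_rigid_2d(src, dst):
        """Closed-form least-squares rigid transform (theta, t) such that
        dst_i ~= R(theta) @ src_i + t  (2-D Procrustes alignment)."""
        n = len(src)
        cxs = sum(p[0] for p in src) / n
        cys = sum(p[1] for p in src) / n
        cxd = sum(p[0] for p in dst) / n
        cyd = sum(p[1] for p in dst) / n
        dot = cross = 0.0
        for (xs, ys), (xd, yd) in zip(src, dst):
            xs, ys, xd, yd = xs - cxs, ys - cys, xd - cxd, yd - cyd
            dot += xs * xd + ys * yd      # sum of dot products of centered pairs
            cross += xs * yd - ys * xd    # sum of cross products of centered pairs
        theta = math.atan2(cross, dot)    # optimal rotation angle
        c, s = math.cos(theta), math.sin(theta)
        t = (cxd - (c * cxs - s * cys), cyd - (s * cxs + c * cys))
        return theta, t

    # Synthetic check: rotate known points by 30 degrees, translate, recover.
    truth = math.radians(30.0)
    src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 1.0)]
    dst = [(math.cos(truth) * x - math.sin(truth) * y + 0.5,
            math.sin(truth) * x + math.cos(truth) * y - 1.0) for x, y in src]
    theta, t = fit_rigid_2d(src, dst)
    ```

    With noisy, automatically collected correspondences, the same least-squares formulation is what makes repeatable calibration feasible at scale.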

  14. Towards Automatic Trunk Classification on Young Conifers

    DEFF Research Database (Denmark)

    Petri, Stig; Immerkær, John

    2009-01-01

    In the garden nursery industry providing young Nordmann firs for Christmas tree plantations, there is a rising interest in automatic classification of their products to ensure consistently high quality and reduce the cost of manual labor. This paper describes a fully automatic single-view algorit...... performance of the algorithm by incorporating color information into the data considered by the dynamic programming algorithm....

  15. Application of nonlinear transformations to automatic flight control

    Science.gov (United States)

    Meyer, G.; Su, R.; Hunt, L. R.

    1984-01-01

    The theory of transformations of nonlinear systems to linear ones is applied to the design of an automatic flight controller for the UH-1H helicopter. The helicopter mathematical model is described and it is shown to satisfy the necessary and sufficient conditions for transformability. The mapping is constructed, taking the nonlinear model to canonical form. The performance of the automatic control system in a detailed simulation on the flight computer is summarized.

  16. Focus of attention and automaticity in handwriting.

    Science.gov (United States)

    MacMahon, Clare; Charness, Neil

    2014-04-01

    This study investigated the nature of automaticity in everyday tasks by testing handwriting performance under single and dual-task conditions. Item familiarity and hand dominance were also manipulated to understand both cognitive and motor components of the task. In line with previous literature, performance was superior in an extraneous focus of attention condition compared to two different skill focus conditions. This effect was found only when writing with the dominant hand. In addition, performance was superior for high familiarity compared to low familiarity items. These findings indicate that motor and cognitive familiarity are related to the degree of automaticity of motor skills and can be manipulated to produce different performance outcomes. The findings also imply that the progression of skill acquisition from novel to novice to expert levels can be traced using different dual-task conditions. The separation of motor and cognitive familiarity is a new approach in the handwriting domain, and provides insight into the nature of attentional demands during performance. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television (optical) automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.
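
    The tracking-error computation described above can be sketched for a single digitized scan line: dark pixels are taken as the seam gap, and the error is the offset of their centroid from the line centre. The threshold and pixel scale below are illustrative, not values from the original system.

    ```python
    def seam_tracking_error(scan_line, threshold=80, mm_per_pixel=0.05):
        """Estimate the cross-seam error from one digitized scan line of
        8-bit light levels. Pixels darker than `threshold` are taken as the
        seam gap; the error is the offset of their centroid from the line
        centre, in mm (positive = seam right of centre)."""
        seam = [i for i, level in enumerate(scan_line) if level < threshold]
        if not seam:
            return None  # no seam visible on this line
        centroid = sum(seam) / len(seam)
        centre = (len(scan_line) - 1) / 2
        return (centroid - centre) * mm_per_pixel
    ```

    In a closed-loop system like the one described, this error would be fed to the cross-seam actuator's motor controller each scan cycle.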

  18. Inter Genre Similarity Modelling For Automatic Music Genre Classification

    OpenAIRE

    Bagci, Ulas; Erzin, Engin

    2009-01-01

    Music genre classification is an essential tool for music information retrieval systems, and it has found critical applications in various media platforms. Two important problems of automatic music genre classification are feature extraction and classifier design. This paper investigates inter-genre similarity modelling (IGS) to improve the performance of automatic music genre classification. Inter-genre similarity information is extracted over the mis-classified feature population....

  19. Discrete Model Reference Adaptive Control System for Automatic Profiling Machine

    Directory of Open Access Journals (Sweden)

    Peng Song

    2012-01-01

    Full Text Available The automatic profiling machine is a motion system with a high degree of parameter variation and frequent transient processes, and it requires accurate, timely control. In this paper, the discrete model reference adaptive control system of the automatic profiling machine is discussed. Firstly, the model of the automatic profiling machine is presented according to the parameters of the DC motor. Then the design of the discrete model reference adaptive control is proposed, and the control rules are proven. The simulation results show that the adaptive control system has favorable dynamic performance.

  20. DDT: A Research Tool for Automatic Data Distribution in High Performance Fortran

    Directory of Open Access Journals (Sweden)

    Eduard Ayguadé

    1997-01-01

    Full Text Available This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT accepts programs written in Fortran 77 and generates High Performance Fortran (HPF directives to map arrays onto the memories of the processors and parallelize loops, and executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops. The algorithm builds a search space of candidate solutions for these phases which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.
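
    The cost-minimizing search over phase mappings can be illustrated with a Viterbi-style dynamic program: for each phase and candidate mapping, keep the cheapest way to reach it, accounting for both computation and remapping costs. This is a simplification of DDT's actual search space and cost model; all names are illustrative.

    ```python
    def best_mappings(compute_cost, remap_cost):
        """Pick one data mapping per phase minimizing total cost.

        compute_cost[p][m]: cost of executing phase p with mapping m.
        remap_cost[m1][m2]: cost of redistributing arrays from m1 to m2.
        Returns (mapping sequence, total cost)."""
        n_phases, n_maps = len(compute_cost), len(compute_cost[0])
        cost = list(compute_cost[0])   # cheapest cost ending phase 0 in each mapping
        back = []
        for p in range(1, n_phases):
            prev_cost, new_cost, choices = cost, [], []
            for m in range(n_maps):
                # cheapest predecessor mapping, including the remapping cost
                k = min(range(n_maps), key=lambda j: prev_cost[j] + remap_cost[j][m])
                choices.append(k)
                new_cost.append(prev_cost[k] + remap_cost[k][m] + compute_cost[p][m])
            cost = new_cost
            back.append(choices)
        final = min(range(n_maps), key=cost.__getitem__)
        total = cost[final]
        seq, m = [final], final
        for choices in reversed(back):  # backtrack the optimal sequence
            m = choices[m]
            seq.append(m)
        return seq[::-1], total
    ```

    With two phases whose cheapest mappings differ, the search trades the remapping cost against the computation cost, exactly the balance the article describes.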

  1. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities...... for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential...... benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  2. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr]

  3. CAVEAT: an assistance project for software validation using formal techniques

    International Nuclear Information System (INIS)

    Trotin, A.; Antoine, C.; Baudin, P.; Collart, J.M.; Raguideau, J.; Zylberajch, C.

    1995-01-01

    The aim of the CAVEAT project is to provide a tool for the validation of industrial C language software. It allows the user to go inside the program and gain a good comprehension of it. It also makes it possible to perform refined verifications of the consistency between the specifications and the program by translating the properties into a more suitable language. It automatically calculates the conditions to be proven, and offers assistance for performing interactive demonstrations. The principal application of this tool is the safety of systems during the verification/certification phase, or during the development phase, where it can work as an intelligent debugging system. (J.S.). 5 refs., 1 fig

  4. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  5. Development of an automatic identification algorithm for antibiogram analysis

    OpenAIRE

    Costa, LFR; Eduardo Silva; Noronha, VT; Ivone Vaz-Moreira; Olga C Nunes; de Andrade, MM

    2015-01-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present some difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) has been proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested using suscepti...

  6. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, the execution performance is an expected requirement in a software development process. Unfortunately, the verification and the maintenance of programs are the time consuming and the frustrating aspects of software engineering. The verification cannot be waived for the programs used for critical applications such as, military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. The definition, or what is understood by automatic programming, has been changed with our expectations. At present, the goal of automatic programming is the automation of programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  7. The Potential of Automatic Word Comparison for Historical Linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Greenhill, Simon J; Gray, Russell D

    2017-01-01

    The amount of data from languages spoken all over the world is rapidly increasing. Traditional manual methods in historical linguistics need to face the challenges brought by this influx of data. Automatic approaches to word comparison could provide invaluable help to pre-analyze data which can be later enhanced by experts. In this way, computational approaches can take care of the repetitive and schematic tasks leaving experts to concentrate on answering interesting questions. Here we test the potential of automatic methods to detect etymologically related words (cognates) in cross-linguistic data. Using a newly compiled database of expert cognate judgments across five different language families, we compare how well different automatic approaches distinguish related from unrelated words. Our results show that automatic methods can identify cognates with a very high degree of accuracy, reaching 89% for the best-performing method Infomap. We identify the specific strengths and weaknesses of these different methods and point to major challenges for future approaches. Current automatic approaches for cognate detection, although not perfect, could become an important component of future research in historical linguistics.
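
    As a minimal illustration of automatic word comparison, the sketch below scores candidate cognates by normalized edit distance. This is only a naive baseline of the kind such comparisons start from, not the Infomap method evaluated in the paper; the threshold one would use to call a pair "cognate" is an assumption.

    ```python
    def edit_distance(a, b):
        """Classic dynamic-programming Levenshtein distance."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,                  # deletion
                               cur[j - 1] + 1,               # insertion
                               prev[j - 1] + (ca != cb)))    # substitution / match
            prev = cur
        return prev[-1]

    def cognate_score(a, b):
        """Normalized similarity in [0, 1]; higher = more likely cognate."""
        if not a and not b:
            return 1.0
        return 1.0 - edit_distance(a, b) / max(len(a), len(b))
    ```

    Real cognate detection must also handle regular sound correspondences (e.g. English "water" vs. German "Wasser"), which is precisely where surface edit distance falls short and the more sophisticated methods compared in the paper earn their accuracy.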

  8. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically generate a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and inject the required data into the original source files, creating new ones ready to be compiled with all related dependencies. Besides, I proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I experimented with Natural Language Processing for extracting metadata; and I proposed the injection of the CERN Library documentation into the HTML version of the long writeups where it is referenced (for instance, when a CERN Library function is referenced in a sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  9. Automatic attraction of visual attention by supraletter features of former target strings

    DEFF Research Database (Denmark)

    Kyllingsbæk, Søren; Lommel, Sven Van; Bundesen, Claus

    2014-01-01

    , performance (d’) degraded on trials in which former targets were present, suggesting that the former targets automatically drew processing resources away from the current targets. Apparently, the two experiments showed automatic attraction of visual attention by supraletter features of former target strings....

  10. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    Science.gov (United States)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by system components during their execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
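
    The subscription-based filtering idea can be sketched as follows: consumers register predicates, and only matching events are forwarded, which is how such a mechanism reduces event traffic. The class and method names are illustrative, not the architecture's actual API.

    ```python
    class EventFilter:
        """Minimal subscription-based event filter: each management
        application registers a predicate and a sink; an event is
        forwarded only to the sinks whose predicates match it."""

        def __init__(self):
            self.subscriptions = []  # list of (predicate, sink) pairs

        def subscribe(self, predicate, sink):
            self.subscriptions.append((predicate, sink))

        def publish(self, event):
            """Deliver `event` to matching subscribers; return delivery count."""
            delivered = 0
            for predicate, sink in self.subscriptions:
                if predicate(event):
                    sink(event)
                    delivered += 1
            return delivered
    ```

    Placing such filters close to the event sources (rather than at a central collector) is what distributes the monitoring computation and lowers intrusiveness, as the architecture above aims to do.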

  11. The irace package: Iterated racing for automatic algorithm configuration

    Directory of Open Access Journals (Sweden)

    Manuel López-Ibáñez

    2016-01-01

    Full Text Available Modern optimization algorithms typically require the setting of a large number of parameters to optimize their performance. The immediate goal of automatic algorithm configuration is to find, automatically, the best parameter settings of an optimizer. Ultimately, automatic algorithm configuration has the potential to lead to new design paradigms for optimization software. The irace package is a software package that implements a number of automatic configuration procedures. In particular, it offers iterated racing procedures, which have been used successfully to automatically configure various state-of-the-art algorithms. The iterated racing procedures implemented in irace include the iterated F-race algorithm and several extensions and improvements over it. In this paper, we describe the rationale underlying the iterated racing procedures and introduce a number of recent extensions. Among these, we introduce a restart mechanism to avoid premature convergence, the use of truncated sampling distributions to correctly handle parameter bounds, and an elitist racing procedure for ensuring that the best configurations returned are also those evaluated on the highest number of training instances. We experimentally evaluate the most recent version of irace and demonstrate with a number of example applications the use and potential of irace, in particular, and automatic algorithm configuration, in general.
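
    The racing idea at the heart of irace can be illustrated with a toy loop that evaluates surviving configurations on successive instances and eliminates those clearly worse than the incumbent. Real iterated racing uses statistical tests (e.g. the Friedman test) and resamples new configurations around the survivors; the fixed elimination margin here is purely illustrative.

    ```python
    def race(configs, instances, cost, elim_margin=0.2):
        """Toy racing loop: after each training instance, drop every
        configuration whose mean cost exceeds the best mean by more than
        `elim_margin`. `cost(cfg, inst)` returns the cost of running
        configuration `cfg` on instance `inst`. Returns the survivors."""
        alive = list(range(len(configs)))
        totals = [0.0] * len(configs)
        for n, inst in enumerate(instances, 1):
            for i in alive:
                totals[i] += cost(configs[i], inst)
            best = min(totals[i] / n for i in alive)
            alive = [i for i in alive if totals[i] / n <= best + elim_margin]
        return [configs[i] for i in alive]
    ```

    Eliminating clearly inferior configurations early is what lets racing spend the evaluation budget on the promising ones, the same economy that makes iterated racing practical for expensive optimizer runs.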

  12. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
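
    For contrast with Tangent's source-code-transformation approach, the derivative information such a tool computes can be illustrated with the simplest alternative AD strategy: operator-overloading forward mode via dual numbers. This is not how Tangent works internally, only a compact sketch of what "calculating a derivative" means here.

    ```python
    class Dual:
        """Forward-mode AD value: `val` carries f(x), `dot` carries f'(x)."""

        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (fg)' = f'g + fg'
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)
        __rmul__ = __mul__

    def derivative(f, x):
        """Evaluate f'(x) by seeding the input with dot = 1."""
        return f(Dual(x, 1.0)).dot
    ```

    An SCT tool like Tangent instead emits new Python source computing the same derivative, which keeps the generated code readable and debuggable, one of the advantages the abstract alludes to.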

  13. Automatic control system for the pig ion source for the U-400 cyclotron

    International Nuclear Information System (INIS)

    Kutner, V.B.; Subbotin, V.G.; Sukhov, A.M.; Tret'yakov, Yu.P.; Fefilov, B.V.

    1989-01-01

    An automatic control system is described for the U-400 cyclotron multiply-charged ion source, based on CAMAC apparatus and microprocessor controllers. The system allows automatic tuning of the ion source to the necessary regime, including automatic start-up of the discharge, obtaining the necessary sputtering parameters, and the automatic search for a maximum beam current within the given discharge parameters. The system performs tuning of the ion source to the quasi-optimal regime in 10-15 minutes with up to 5% deviation from the preset parameters. It is possible to stabilize the beam current within 3% using automatic correction of the discharge regime. 6 refs.; 4 figs

  14. Automatic control system of the PIG ion source for the U-400 cyclotron

    International Nuclear Information System (INIS)

    Kutner, V.B.; Subbotin, V.G.; Sukhov, A.M.; Tretyakov, Y.P.; Fefilov, B.V.; Kasyanov, A.A.; Rybin, V.M.

    1990-01-01

    An automatic control system is described for the multiply charged ion source of the U-400 cyclotron based on CAMAC apparatus and microprocessor controllers. The system allows the automatic tuning of the ion source to the necessary regime, including the automatic start-up of discharge, determination of the necessary parameters of sputtering, and the automatic search for a maximum beam current for given discharge parameters. The system performs the tuning of the ion source to the quasioptimal regime in 10--15 min with up to 5% deviation from the preset parameters. It is possible to stabilize the beam current within 3% using the automatic correction of the discharge regime

  15. Text Summarization Evaluation: Correlating Human Performance on an Extrinsic Task with Automatic Intrinsic Metrics

    National Research Council Canada - National Science Library

    President, Stacy F; Dorr, Bonnie J

    2006-01-01

    This research describes two types of summarization evaluation methods, intrinsic and extrinsic, and concentrates on determining the level of correlation between automatic intrinsic methods and human...

  16. Automatic PID Control Loops Design for Performance Improvement of Cryogenic Turboexpander

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.; Shah, D.K.

    2015-01-01

    The field of cryogenics involves temperatures below 123 K, much lower than ambient temperature. In addition, many industrially important physical processes, from the needs of national thermonuclear fusion programs and superconducting magnets to the treatment of cutting tools and the preservation of blood cells, require extremely low temperatures. The low temperature required for the liquefaction of common gases can be obtained by several processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. A helium liquefier is used for the liquefaction of helium gas. In general, a Helium Refrigerator/Liquefier (HRL) needs a turboexpander as the expansion machine to produce the cooling effect used for the production of liquid helium. The turboexpander, a high-speed device supported on gas bearings, is the most critical component in many helium refrigeration systems. A very minor fault in operation or manufacturing, or impurities in the helium gas, can destroy the turboexpander. However, since the performance of expanders depends on a number of operating parameters with quite complex relations between them, the instrumentation and control system design for the turboexpander needs special attention. The inefficiency of manual control leads to the need for designing automatic control loops for the turboexpander. Proper design and implementation of the control loops play an important role in the successful operation of the cryogenic turboexpander. The PID control loops have to be implemented with accurate interlocks and logic to enhance the performance of the cryogenic turboexpander. For different normal and off-normal operations, speeds will differ, and hence a proper control method for avoiding critical rotational speeds is a must. This paper presents the design of the PID control loops needed for the
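
    A textbook discrete PID loop of the kind discussed above can be sketched as follows; the gains, setpoint, and first-order plant in the usage test are illustrative, not the paper's actual turboexpander loop design.

    ```python
    class PID:
        """Discrete PID controller: u = Kp*e + Ki*sum(e*dt) + Kd*de/dt."""

        def __init__(self, kp, ki, kd, setpoint, dt):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.setpoint, self.dt = setpoint, dt
            self.integral = 0.0
            self.prev_error = None

        def update(self, measurement):
            """Compute the control output for one sample period."""
            error = self.setpoint - measurement
            self.integral += error * self.dt
            deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * deriv
    ```

    A production loop for a gas-bearing turboexpander would add the interlocks mentioned in the abstract (anti-windup limits, rate limits, and critical-speed avoidance logic) around this core law.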

  17. Dissociation between controlled and automatic processes in the behavioral variant of fronto-temporal dementia.

    Science.gov (United States)

    Collette, Fabienne; Van der Linden, Martial; Salmon, Eric

    2010-01-01

    A decline of cognitive functioning affecting several cognitive domains has frequently been reported in patients with frontotemporal dementia. We were interested in determining whether these deficits can be interpreted as reflecting an impairment of controlled cognitive processes, using an assessment tool specifically developed to explore the distinction between automatic and controlled processes, namely the process dissociation procedure (PDP) developed by Jacoby. The PDP was applied to a word stem completion task to determine the contribution of automatic and controlled processes to episodic memory performance and was administered to a group of 12 patients with the behavioral variant of frontotemporal dementia (bv-FTD) and 20 control subjects (CS). Bv-FTD patients obtained a lower performance than CS for the estimates of controlled processes, but no group difference was observed for the estimates of automatic processes. The between-groups comparison of the estimates of controlled and automatic processes showed a larger contribution of automatic processes to performance in bv-FTD, while a slightly larger contribution of controlled processes was observed in control subjects. These results clearly indicate an alteration of controlled memory processes in bv-FTD.
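
    Jacoby's process dissociation estimates can be computed directly from performance in the inclusion and exclusion conditions: the controlled contribution is C = P(inclusion) - P(exclusion), and the automatic contribution is A = P(exclusion) / (1 - C). A minimal sketch:

    ```python
    def process_dissociation(p_inclusion, p_exclusion):
        """Jacoby's process dissociation estimates from the probabilities of
        completing a stem with a studied word under inclusion vs. exclusion
        instructions: controlled C = P(incl) - P(excl);
        automatic A = P(excl) / (1 - C)."""
        c = p_inclusion - p_exclusion
        a = p_exclusion / (1.0 - c) if c < 1.0 else float("nan")
        return c, a
    ```

    Under this model, a selective drop in C with unchanged A is exactly the pattern reported for the bv-FTD group above.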

  18. Transducer-actuator systems and methods for performing on-machine measurements and automatic part alignment

    Science.gov (United States)

    Barkman, William E.; Dow, Thomas A.; Garrard, Kenneth P.; Marston, Zachary

    2016-07-12

    Systems and methods for performing on-machine measurements and automatic part alignment, including: a measurement component operable for determining the position of a part on a machine; and an actuation component operable for adjusting the position of the part by contacting the part with a predetermined force responsive to the determined position of the part. The measurement component consists of a transducer. The actuation component consists of a linear actuator. Optionally, the measurement component and the actuation component consist of a single linear actuator operable for contacting the part with a first lighter force for determining the position of the part and with a second harder force for adjusting the position of the part. The actuation component is utilized in a substantially horizontal configuration and the effects of gravitational drop of the part are accounted for in the force applied and the timing of the contact.

  19. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  20. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  1. Trends of progress in medical technics as far as automatization is concerned

    Energy Technology Data Exchange (ETDEWEB)

    Agoston, M [Medicor Muevek, Budapest (Hungary)]

    1978-09-01

    The modernization of medical treatment is developing in the direction of establishing big hospitals and polyclinics. Highly productive automatic equipment makes it possible to perform mass examinations with high efficiency. X-ray instruments still form the most valuable and indispensable device group. One direction in developing the automation of these machines is achieving the best X-ray exposure. The relatively slow but continuous spread of isotope diagnostic instruments has also been connected with a number of results in automation. In the field of sterilization, bactericidal materials, gas and ray sterilizing methods, as well as combined systems, have come into use. Automation has a strong influence on the domain of epidemiology as well.

  2. Performance evaluation of 2D and 3D deep learning approaches for automatic segmentation of multiple organs on CT images

    Science.gov (United States)

    Zhou, Xiangrong; Yamada, Kazuma; Kojima, Takuya; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2018-02-01

    The purpose of this study is to evaluate and compare the performance of modern deep learning techniques for automatically recognizing and segmenting multiple organ regions in 3D CT images. CT image segmentation is one of the important tasks in medical image analysis and is still very challenging. Deep learning approaches have demonstrated the capability of scene recognition and semantic segmentation on natural images and have been used to address segmentation problems in medical images. Although several works showed promising results for CT image segmentation using deep learning approaches, there has been no comprehensive evaluation of deep learning segmentation performance across multiple organs on different portions of CT scans. In this paper, we evaluated and compared the segmentation performance of two deep learning approaches using 2D and 3D deep convolutional neural networks (CNNs), with and without a pre-processing step. A conventional approach representing the state-of-the-art performance of CT image segmentation without deep learning was also used for comparison. A dataset of 240 CT images scanned over different portions of human bodies was used for performance evaluation. Up to 17 types of organ regions in each CT scan were segmented automatically and compared to human annotations using the intersection-over-union (IU) ratio as the criterion. The experimental results showed mean IU values of 79% and 67%, averaged over the 17 organ types, for the 3D and 2D deep CNNs, respectively. All the results of the deep learning approaches showed better accuracy and robustness than the conventional segmentation method based on probabilistic atlas and graph-cut methods. The effectiveness and usefulness of deep learning approaches were demonstrated for solving the multiple-organ segmentation problem on 3D CT images.
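
    The intersection-over-union criterion used for the evaluation above is straightforward to compute per organ label; a minimal sketch (array shapes and the integer-label convention are assumptions):

    ```python
    import numpy as np

    def intersection_over_union(pred, truth, label):
        """IoU for one organ `label` between predicted and ground-truth
        label volumes (arrays of integer labels, any matching shape)."""
        p = (np.asarray(pred) == label)
        t = (np.asarray(truth) == label)
        union = np.logical_or(p, t).sum()
        if union == 0:
            return float("nan")  # label absent from both volumes
        return np.logical_and(p, t).sum() / union
    ```

    Averaging this score over the 17 organ labels of a scan gives the per-scan figure that the 79% and 67% means summarize.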

  3. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility than partially automated variants, would therefore provide a solution. In addition, a fully automated test reduces the radiation dose received by the test personnel. (orig.) [de

  4. Performance of ACCOS, an automatic crystal quality control system for the PWO crystals of the CMS calorimeter

    CERN Document Server

    Auffray, Etiennette; Freire, M; Lecoq, P; Le Goff, J M; Marcos, R; Drobychev, G Yu; Missevitch, O V; Oskine, A; Zouevski, R F; Peigneux, J P; Schneegans, M

    2001-01-01

    Nearly 80000 PWO crystals for the CMS electromagnetic calorimeter will arrive at CERN/Geneva and INFN-ENEA/Rome between now and the year 2004. The stringent specifications on their dimensions and optical quality have to be verified prior to their formal acceptance. Automatic systems for measuring the critical parameters of each crystal and recording them in a database have been designed and constructed. The first machine is now in stable operation at CERN. In this note, the performance of each instrument, based on measurements of ~1000 pre-production crystals, is analysed in terms of stability and compared to the results of conventional benches. (9 refs).

  5. Detecting accuracy of flaws by manual and automatic ultrasonic inspections

    International Nuclear Information System (INIS)

    Iida, K.

    1988-01-01

    As the final stage of the nine-year project on proving tests of the ultrasonic inspection technique applied to the ISI of LWR plants, automatic ultrasonic inspection tests were carried out on EDM notches, surface fatigue cracks, weld defects and stress corrosion cracks, which were deliberately introduced into full-size structural components simulating a 1,100 MWe BWR. The investigated items were the performance of a newly assembled automatic inspection apparatus, the detection limit for flaws, the detection resolution of adjacent collinear or parallel EDM notches, detection reproducibility and detection accuracy. Manual ultrasonic inspection of the same flaws was also carried out in order to obtain comparative data. This paper reports the confirmation that automatic ultrasonic inspection is much superior to manual inspection in flaw detection rate and detection reproducibility.

  6. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  7. Dynamic Anthropometry – Defining Protocols for Automatic Body Measurement

    Directory of Open Access Journals (Sweden)

    Slavenka Petrak

    2017-12-01

    The paper presents the research on possibilities of protocol development for automatic computer-based determination of measurements on a 3D body model in defined dynamic positions. Initially, two dynamic body positions were defined for the research on dimensional changes of targeted body lengths and surface segments during body movement from the basic static position into a selected dynamic body position. The assumption was that during body movement, specific length and surface dimensions would change significantly from the aspect of clothing construction and functionality of a garment model. 3D body scanning of a female test sample was performed in the basic static and two defined dynamic positions. 3D body models were processed and measurement points were defined as a starting point for the determination of characteristic body measurements. The protocol for automatic computer measurement was defined for every dynamic body position by a systematic set of activities based on the determined measurement points. The verification of the developed protocols was performed by automatic determination of the defined measurements on the test sample and by comparing the results with conventional manual measurement.

  8. MRI in assessing children with learning disability, focal findings, and reduced automaticity.

    Science.gov (United States)

    Urion, David K; Huff, Hanalise V; Carullo, Maria Paulina

    2015-08-18

    In children with clinically diagnosed learning disabilities with focal findings on neurologic or neuropsychological evaluations, there is a hypothesized association between disorders in automaticity and focal structural abnormalities observed in brain MRIs. We undertook a retrospective analysis of cases referred to a tertiary-hospital-based learning disabilities program. Individuals were coded as having a focal deficit if either neurologic or neuropsychological evaluation demonstrated focal dysfunction. Those with abnormal MRI findings were categorized based on findings. Children with abnormalities from each of these categories were compared in terms of deficits in automaticity, as measured by the tasks of Rapid Automatized Naming, Rapid Alternating Stimulus Naming, or the timed motor performance battery from the Physical and Neurological Examination for Soft Signs. Data were compared in children with and without disorders of automaticity regarding type of brain structure abnormality. Of the 1,587 children evaluated, 127 had a focal deficit. Eighty-seven had a brain MRI (52 on 1.5-tesla machines and 35 on 3.0-tesla machines). Forty of these images were found to be abnormal. These children were compared with a clinic sample of 150 patients with learning disabilities and no focal findings on examination, who also had undergone MRI. Only 5 of the latter group had abnormalities on MRI. Reduced verbal automaticity was associated with cerebellar abnormalities, whereas reduced automaticity on motor or motor and verbal tasks was associated with white matter abnormalities. Reduced automaticity of retrieval and slow timed motor performance appear to be highly associated with MRI findings. © 2015 American Academy of Neurology.

  9. Comparison of automatic and visual methods used for image segmentation in Endodontics: a microCT study.

    Science.gov (United States)

    Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz

    2017-01-01

    To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required; these can be determined by visual or automatic methods. Visual determination is influenced by the operator's visual acuity, while the automatic method is done entirely by computer algorithms. The aims were to compare visual and automatic segmentation, and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between visual and automatic segmentation methods regarding root canal volume (p=0.93) and root canal surface area (p=0.79) measurements. Although both visual and automatic segmentation methods can be used to determine the threshold and calculate root canal volume and surface area, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.
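    The threshold-based segmentation and volume calculation described above can be sketched as follows; the synthetic volume, threshold value, and voxel size are hypothetical, not taken from the study:

    ```python
    import numpy as np

    def segment_by_threshold(volume, threshold):
        """Binarize a grayscale volume: voxels at or above the threshold are canal."""
        return volume >= threshold

    def canal_volume_mm3(mask, voxel_size_mm):
        """Physical volume = voxel count times the volume of one cubic voxel."""
        return int(mask.sum()) * voxel_size_mm ** 3

    # Hypothetical 3x3x3-voxel bright region, isotropic 0.02 mm voxels
    vol = np.zeros((20, 20, 20))
    vol[5:8, 5:8, 5:8] = 200            # bright "canal" voxels
    mask = segment_by_threshold(vol, 100)
    print(canal_volume_mm3(mask, 0.02))  # 27 voxels x 0.02^3 mm^3 ≈ 0.000216
    ```
    
    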

  10. Automatic system for ionization chamber current measurements

    International Nuclear Information System (INIS)

    Brancaccio, Franco; Dias, Mauro S.; Koskinas, Marina F.

    2004-01-01

    The present work describes an automatic system developed for current integration measurements at the Laboratorio de Metrologia Nuclear of Instituto de Pesquisas Energeticas e Nucleares. This system includes software (graphic user interface and control) and a module connected to a microcomputer, by means of a commercial data acquisition card. Measurements were performed in order to check the performance and for validating the proposed design

  11. Method for automatic control rod operation using rule-based control

    International Nuclear Information System (INIS)

    Kinoshita, Mitsuo; Yamada, Naoyuki; Kiguchi, Takashi

    1988-01-01

    An automatic control rod operation method using rule-based control is proposed. Its features are as follows: (1) a production system to recognize plant events, determine control actions and realize fast inference (fast selection of a suitable production rule), (2) use of the fuzzy control technique to determine quantitative control variables. The method's performance was evaluated by simulation tests of automatic control rod operation at a BWR plant start-up. The results were as follows: (1) the performance, in terms of stabilization of controlled variables and time required for reactor start-up, was superior to that of other methods such as PID control and program control methods; (2) the process time to select and interpret the suitable production rule, which was the same as that required for event recognition or determination of a control action, was short enough (below 1 s) for real-time control. The results showed that the method is effective for automatic control rod operation. (author)

  12. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adopted as a routine in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (repeatability of the analysis).

  13. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients,

  14. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

    and is demonstrated on two separate, previously published types of problems in topology optimization. Two separate software packages for automatic differentiation, CoDiPack and Tapenade are considered, and their performance and usability trade-offs are discussed and compared to a hand coded adjoint gradient...

  15. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-[18F]fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-[18F]fluorodopa. (author)

  16. Automatic gallbladder segmentation using combined 2D and 3D shape features to perform volumetric analysis in native and secretin-enhanced MRCP sequences.

    Science.gov (United States)

    Gloger, Oliver; Bülow, Robin; Tönnies, Klaus; Völzke, Henry

    2017-11-24

    We aimed to develop the first fully automated 3D gallbladder segmentation approach to perform volumetric analysis in volume data of magnetic resonance (MR) cholangiopancreatography (MRCP) sequences. Volumetric gallbladder analysis is performed for non-contrast-enhanced and secretin-enhanced MRCP sequences. Native and secretin-enhanced MRCP volume data were produced with a 1.5-T MR system. Images of coronal maximum intensity projections (MIP) are used to automatically compute 2D characteristic shape features of the gallbladder in the MIP images. A gallbladder shape space is generated to derive 3D gallbladder shape features, which are then combined with 2D gallbladder shape features in a support vector machine approach to detect gallbladder regions in MRCP volume data. A region-based level set approach is used for fine segmentation. Volumetric analysis is performed for both sequences to calculate gallbladder volume differences between both sequences. The approach presented achieves segmentation results with mean Dice coefficients of 0.917 in non-contrast-enhanced sequences and 0.904 in secretin-enhanced sequences. This is the first approach developed to detect and segment gallbladders in MR-based volume data automatically in both sequences. It can be used to perform gallbladder volume determination in epidemiological studies and to detect abnormal gallbladder volumes or shapes. The positive volume differences between both sequences may indicate the quantity of the pancreatobiliary reflux.
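    A minimal sketch of the Dice coefficient used above to report segmentation quality; the toy masks are invented for illustration and are unrelated to the study's MRCP data:

    ```python
    import numpy as np

    def dice_coefficient(pred, gt):
        """Dice = 2|A∩B| / (|A| + |B|) for two binary segmentation masks."""
        pred = pred.astype(bool)
        gt = gt.astype(bool)
        denom = pred.sum() + gt.sum()
        if denom == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return 2.0 * np.logical_and(pred, gt).sum() / denom

    # Toy 2D example: two overlapping 4x4 squares, intersection of 4 pixels
    pred = np.zeros((10, 10), dtype=bool); pred[2:6, 2:6] = True
    gt = np.zeros((10, 10), dtype=bool); gt[4:8, 4:8] = True
    print(dice_coefficient(pred, gt))  # 2*4 / (16+16) = 0.25
    ```
    
    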

  17. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it needs a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on the reactor vessel mockup, and on a real reactor ready for Ulchine nuclear power plant unit 6 at Dusan Heavy Industry in Korea. After this project, we have a plan to commercialize our inspection system. Using this system, we can expect a large reduction of the inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be synthetically applied to the automation of similar systems in nuclear power plants.

  18. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations, i.e. automatically controlling the virtual...

  19. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    Andriamampianina, Lala

    1983-01-01

    A system for the semi-automatic survey of drawings is presented. Its design has been oriented to reducing the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, but the pen of the plotter is replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between successive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor permits the identification of starting points and end points of a line, for the purpose of automatically following connected lines in a drawing. The advantage of the method described is that precision practically depends only on the plotter performance, the sensor resolution being relevant only for the thickness of strokes and the distance between two strokes. (author) [fr

  20. On enhancing energy harvesting performance of the photovoltaic modules using an automatic cooling system and assessing its economic benefits of mitigating greenhouse effects on the environment

    Science.gov (United States)

    Wang, Jen-Cheng; Liao, Min-Sheng; Lee, Yeun-Chung; Liu, Cheng-Yue; Kuo, Kun-Chang; Chou, Cheng-Ying; Huang, Chen-Kang; Jiang, Joe-Air

    2018-02-01

    The performance of photovoltaic (PV) modules under outdoor operation is greatly affected by their location and environmental conditions. The temperature of a PV module gradually increases as it is exposed to solar irradiation, resulting in degradation of its electrical characteristics and power generation efficiency. This study adopts wireless sensor network (WSN) technology to develop an automatic water-cooling system for PV modules in order to improve their power generation efficiency. A temperature estimation method is developed to quickly and accurately estimate the PV module temperatures based on weather data provided by the WSN monitoring system. Further, an estimation method is also proposed for evaluation of the electrical characteristics and output power of the PV modules, which is performed remotely via a control platform. The automatic WSN-based water-cooling mechanism is designed to prevent the PV module temperature from reaching saturation. With each PV module equipped with the WSN-based cooling system, the ambient conditions are monitored automatically so that the temperature of the PV module is controlled by sprinkling water on the panel surface. The field-test experiment results show an increase in the energy harvested by the PV modules of approximately 17.75% when using the proposed WSN-based cooling system.

  1. Performance of an automatic dose control system for CT. Specifications and basic phantom tests

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, H.D. [Wissenschaft und Technik fuer die Radiolgoe, Dr. HD Nagel, Buchholz (Germany); Stumpp, P.; Kahn, T.; Gosch, D. [Universitaetsklinikum Leipzig (Germany). Klinik und Poliklinik fuer Diagnostische und Interventionelle Radiologie

    2011-01-15

    Purpose: To assess the performance and to provide more detailed insight into the characteristics and limitations of devices for automatic dose control (ADC) in CT. Materials and Methods: A comprehensive study on DoseRight 2.0, the ADC system provided by Philips for its Brilliance CT scanners, was conducted. Phantom tests were carried out on a 64-slice scanner (Brilliance 64) using assorted quality control (QC) phantoms that allowed verification of the basic specifications. If feasible, the findings were verified by model calculations based on known specifications. Results: For all tests, the dose reductions and modulation characteristics fully met the values expected from the specifications. Adverse effects due to increased image noise were only moderate as a result of the 'adequate noise system' design that employs comparatively gentle modulation, and the additional use of adaptive filtration. Conclusion: Simple tests with QC phantoms allow evaluation of the most relevant characteristics of devices for ADC in CT. (orig.)

  2. Semi-Automatic Removal of Foreground Stars from Images of Galaxies

    Science.gov (United States)

    Frei, Zsolt

    1996-07-01

    A new procedure, designed to remove foreground stars from galaxy profiles, is presented here. Although several programs exist for stellar and faint-object photometry, none of them treat star removal from the images very carefully. I present my attempt to develop such a system, and briefly compare the performance of my software to one of the well-known stellar photometry packages, DAOPhot (Stetson 1987). The major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function (PSF) from well-separated stars that are situated off the galaxy; (2) automatic identification of those peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fixing of remaining degradations in the image. The algorithm and software presented here are significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since: (a) the most suitable stars are selected automatically from the image for the PSF fit; (b) after star removal an intelligent and automatic procedure removes any possible residuals; (c) an unlimited number of images can be cleaned in one run without any user interaction whatsoever. (SECTION: Computing and Data Analysis)
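    The core of step (2), scaling the PSF to a star and subtracting it, can be sketched as a least-squares amplitude fit; this is a simplified illustration under stated assumptions (odd-sized, centered PSF; star away from image edges; toy data), not the author's software:

    ```python
    import numpy as np

    def subtract_star(image, psf, y, x):
        """Scale an empirical PSF to the star at (y, x) by least squares
        and subtract it. Assumes psf has odd dimensions and is centered;
        stars near the image edges are not handled in this sketch."""
        h, w = psf.shape
        dy, dx = h // 2, w // 2
        patch = image[y - dy:y + dy + 1, x - dx:x + dx + 1]
        scale = (patch * psf).sum() / (psf * psf).sum()  # least-squares amplitude
        cleaned = image.copy()
        cleaned[y - dy:y + dy + 1, x - dx:x + dx + 1] -= scale * psf
        return cleaned

    # Toy example: a star that is an exact scaled copy of the PSF is removed
    psf = np.array([[0., 1., 0.], [1., 4., 1.], [0., 1., 0.]])
    img = np.zeros((9, 9))
    img[3:6, 3:6] += 2.5 * psf       # "star" centered at (4, 4)
    out = subtract_star(img, psf, 4, 4)
    print(np.abs(out).max())          # exact removal: prints 0.0
    ```

    Real star fields also need background estimation and residual patching, which this sketch omits.
    
    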

  3. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J. S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J. P. K.; Geertzen, J. H. B.

    2004-01-01

    This paper describes a new automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitted

  4. Cognitive tasks promote automatization of postural control in young and older adults.

    Science.gov (United States)

    Potvin-Desrochers, Alexandra; Richer, Natalie; Lajoie, Yves

    2017-09-01

    Researchers looking at the effects of performing a concurrent cognitive task on postural control in young and older adults using traditional center-of-pressure measures and complexity measures have found discordant results. Results of experiments showing improvements of stability have suggested the use of strategies such as automatization of postural control or a stiffening strategy. This experiment aimed to confirm, using sample entropy, that in healthy young and older adults performing a cognitive task while standing leads to improvements that are due to automaticity of sway. Twenty-one young adults and twenty-five older adults were asked to stand on a force platform while performing a cognitive task. There were four cognitive tasks: simple reaction time, go/no-go reaction time, equation and occurrence of a digit in a number sequence. Results demonstrated decreased sway area and variability as well as increased sample entropy for both groups when performing a cognitive task. Results suggest that performing a concurrent cognitive task promotes the adoption of automatic postural control in young and older adults, as evidenced by increased postural stability and postural sway complexity. Copyright © 2017 Elsevier B.V. All rights reserved.
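    The sample entropy measure used above can be sketched as follows; the parameter choices (m=2, r=0.2 times the signal's standard deviation) are conventional defaults, not necessarily those of the study, and the test signals are synthetic:

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=None):
        """Sample entropy SampEn(m, r) = -ln(A/B), where B counts pairs of
        length-m templates within tolerance r (Chebyshev distance) and A
        counts pairs of length m+1. Higher values mean a less regular,
        more complex signal."""
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * x.std()
        num = len(x) - m  # same number of templates for both lengths

        def count_pairs(mm):
            count = 0
            for i in range(num):
                for j in range(i + 1, num):
                    if np.max(np.abs(x[i:i + mm] - x[j:j + mm])) <= r:
                        count += 1
            return count

        b = count_pairs(m)      # length-m template matches
        a = count_pairs(m + 1)  # length-(m+1) template matches
        return -np.log(a / b) if a > 0 and b > 0 else float("inf")

    rng = np.random.default_rng(0)
    regular = np.sin(np.linspace(0, 8 * np.pi, 200))  # smooth, predictable
    noisy = rng.standard_normal(200)                  # white noise
    print(sample_entropy(regular) < sample_entropy(noisy))  # regular signal is less complex
    ```
    
    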

  5. Measuring module of spectrometer of neutron small angle scattering on the IBR pulse reactor

    International Nuclear Information System (INIS)

    Vagov, V.A.; Zhukov, G.P.; Kozlova, E.P.; Korobchenko, M.L.; Namsraj, Yu.; Ostanevich, Yu.M.; Savvateev, A.S.; Salamatin, I.M.; Sirotin, A.P.

    1980-01-01

    Equipment and software for experiments with neutron small-angle scattering are described. They are intended for data acquisition, equipment control, storage of collected data and their output to the network of the Laboratory measuring centre. The set-up includes: 9 neutron detectors with the corresponding electronic apparatus, a sample-exchanging device, a communication link, an SM-3 type minicomputer of an extended configuration and some units of CAMAC electronic equipment. The software (the MUR applied operating system) is intended for the automatic performance of a given number of cycles of successive uniform runs of a given duration with the sample list at two possible filter positions. Besides, the MUR system contains test, debugging and service software. The software has been designed using the SANPO system means [ru

  6. 12 CFR 263.403 - Automatic removal, suspension, and debarment.

    Science.gov (United States)

    2010-01-01

    ... independent public accountant or accounting firm may not perform audit services for banking organizations if... permission to such accountant or firm to perform audit services for banking organizations. The request shall... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Automatic removal, suspension, and debarment...

  7. Automatic tuning of free electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Agapov, Ilya; Zagorodnov, Igor [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Geloni, Gianluca [European XFEL, Schenefeld (Germany); Tomin, Sergey [European XFEL, Schenefeld (Germany); NRC Kurchatov Institute, Moscow (Russian Federation)

    2017-04-07

    Existing FEL facilities often suffer from stability issues: the electron orbit, transverse electron optics, electron bunch compression and other parameters have to be readjusted often to account for drifts in the performance of various components. The tuning procedures typically employed in operation are often manual and lengthy. We have been developing a combination of model-free and model-based automatic tuning methods to meet the needs of present and upcoming XFEL facilities. Our approach has been implemented at FLASH to achieve automatic SASE tuning using empirical control of orbit, electron optics and bunch compression. In this paper we describe our approach to empirical tuning, the software which implements it, and the results of using it at FLASH. We also discuss the potential of using machine learning and model-based techniques in tuning methods.

  8. Automatic tuning of free electron lasers

    International Nuclear Information System (INIS)

    Agapov, Ilya; Zagorodnov, Igor; Geloni, Gianluca; Tomin, Sergey

    2017-01-01

    Existing FEL facilities often suffer from stability issues: the electron orbit, transverse electron optics, electron bunch compression and other parameters have to be readjusted often to account for drifts in the performance of various components. The tuning procedures typically employed in operation are often manual and lengthy. We have been developing a combination of model-free and model-based automatic tuning methods to meet the needs of present and upcoming XFEL facilities. Our approach has been implemented at FLASH to achieve automatic SASE tuning using empirical control of orbit, electron optics and bunch compression. In this paper we describe our approach to empirical tuning, the software which implements it, and the results of using it at FLASH. We also discuss the potential of using machine learning and model-based techniques in tuning methods.

  9. 76 FR 35736 - Special Conditions: Gulfstream Aerospace LP (GALP) Model G250 Airplane Automatic Power Reserve...

    Science.gov (United States)

    2011-06-20

    ... engine, the FADEC of the healthy (or both) engine will increase the power to the APR rating and the ``APR... Automatic Power Reserve (APR), an Automatic Takeoff Thrust Control System (ATTCS) AGENCY: Federal Aviation... novel or unusual design feature associated with go-around performance credit for use of Automatic Power...

  10. A semi-automatic traffic sign detection, classification and positioning system

    NARCIS (Netherlands)

    Creusen, I.M.; Hazelhoff, L.; With, de P.H.N.; Said, A.; Guleryuz, O.G.; Stevenson, R.L.

    2012-01-01

    The availability of large-scale databases containing street-level panoramic images offers the possibility to perform semi-automatic surveying of real-world objects such as traffic signs. These inventories can be performed significantly more efficiently than using conventional methods. Governmental

  11. Operating experience of the automatic technological control system at the Kolsk NPP

    International Nuclear Information System (INIS)

    Volkov, A.P.; Ignatenko, E.I.; Kolomtsev, Yu.V.; Mel'nikov, E.F.; Trofimov, B.A.

    1981-01-01

    Briefly reviewed is the operating experience of the automatic control systems of the Kolsk NPP (KNPP) power units, where the neutron flux measuring technique ''Iney'', the ARM-4 power regulator and the automatic turbine start-up system ATS are used. The main shortcomings of the technological process automatic control system (ACS) and ways of removing them are considered. It is noted that the KNPP ACS performs only limited start-up functions for the basic equipment and reactor power control, as well as partial protection functions at sudden load drops and switch-off of the main circulating pump [ru

  12. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input, and performs a registration over scale and in-plane rotation fully automatically.

  13. Studies on the calibration of mammography automatic exposure mode with computed radiography

    International Nuclear Information System (INIS)

    Zhu Hongzhou; Shao Guoliang; Shi Lei; Liu Qing

    2010-01-01

Objective: To optimize image quality and radiation dose by calibrating mammography automatic exposure against the automatic exposure control mode of a mammography film-screen system. Methods: The film-screen system (28 kV) was used to perform automatic exposure of 40-mm plexiglass and obtain the standard exposure dose; the CR exposure mode based on LgM=2.0 was then calibrated in 10 steps. A mammary gland phantom (Fluke NA18-220) was examined with CR (26, 28, and 30 kV) using the corrected automatic exposure mode, and the exposure values (mAs) were recorded. CR images were diagnosed and evaluated in a double-blind manner by 4 radiologists according to the American College of Radiology (ACR) standard. Results: With a CR automatic exposure dose standard higher than the traditional film-screen exposure, the calibration of mammography automatic exposure was accomplished. The calibrated mode scored better than the ACR requirement. Conclusions: The comparative study showed improvement in acquiring high-quality images and reduction of radiation dose. The corrected mammography automatic exposure mode might be a better method for clinical use. (authors)

  14. Next Generation Model 8800 Automatic TLD Reader

    International Nuclear Information System (INIS)

    Velbeck, K.J.; Streetz, K.L.; Rotunda, J.E.

    1999-01-01

BICRON NE has developed an advanced version of the Model 8800 Automatic TLD Reader. Improvements in the reader include a Windows NT™-based operating system and a Pentium microprocessor for the host controller, a servo-controlled transport, a VGA display, mouse control, and modular assembly. This high-capacity reader will automatically read fourteen hundred TLD cards in one loading. Up to four elements in a card can be heated without mechanical contact, using hot nitrogen gas. Improvements in performance include an increased throughput rate and more precise card positioning. Operation is simplified through easy-to-read Windows-type screens. Glow curves are displayed graphically along with light intensity, temperature, and channel scaling. Maintenance and diagnostic aids are included for easier troubleshooting. A click of a mouse will command actions that are displayed in easy-to-understand English words. Available options include an internal 90Sr irradiator, automatic TLD calibration, and two different extremity monitoring modes. Results from testing include reproducibility, reader stability, linearity, detection threshold, residue, primary power supply voltage and frequency, transient voltage, drop testing, and light leakage. (author)

  15. TU-H-CAMPUS-JeP1-02: Fully Automatic Verification of Automatically Contoured Normal Tissues in the Head and Neck

    Energy Technology Data Exchange (ETDEWEB)

    McCarroll, R [UT MD Anderson Cancer Center, Houston, TX (United States); UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX (United States); Beadle, B; Yang, J; Zhang, L; Kisling, K; Balter, P; Stingo, F; Nelson, C; Followill, D; Court, L [UT MD Anderson Cancer Center, Houston, TX (United States); Mejia, M [University of Santo Tomas Hospital, Manila, Metro Manila (Philippines)

    2016-06-15

    Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse’s Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher performing algorithm was selected as the primary contouring method, the other used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted from 0.5 to 2cm. Using a logit model the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients and the sensitivity and specificity of the models verified. Results: Per physician rating, the multi-atlas method (4.8/5 point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5cm (brain), 0.75cm (mandible, cord), 1cm (brainstem, cochlea), or 1.25cm (parotid), with sensitivity and specificity greater than 0.95. If sensitivity and specificity constraints are reduced to 0.9, detectable shifts of mandible and brainstem were reduced by 0.25cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified. 
This fully automated process could be used to
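The shift-detection idea above can be sketched with a toy logit model: the verifier computes a covariate such as mean distance to agreement (MDA) between the primary and verification contours, and a logistic function maps it to a detection probability. The coefficients below are invented for illustration, not the paper's fitted values:

```python
import math

def p_flag(mda_mm, b0=-6.0, b1=1.2):
    """Hypothetical logit model: probability the verification algorithm flags
    a contour, given its mean distance to agreement (mm). Coefficients invented."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * mda_mm)))

def min_detectable_mda(target=0.95, b0=-6.0, b1=1.2):
    """Invert the logit to find the MDA at which detection reaches `target`,
    analogous to the structure-specific minimum detectable shift."""
    return (math.log(target / (1 - target)) - b0) / b1

print(round(min_detectable_mda(), 2))  # MDA (mm) needed for 0.95 sensitivity
```

Relaxing the target sensitivity (e.g. 0.9 instead of 0.95) lowers the detectable MDA, mirroring the reduced detectable shifts reported in the abstract.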

  16. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  17. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
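For a within-subject compatibility effect like automatic imitation, g_z is a standardized mean difference computed from each participant's incompatible-minus-compatible score. A small Python sketch with invented reaction-time data (not data from the meta-analysis):

```python
import statistics

def hedges_gz(diffs):
    """Effect size for a within-subject effect: d_z = mean(diff) / sd(diff),
    with Hedges' small-sample bias correction J ~ 1 - 3/(4*df - 1)."""
    n = len(diffs)
    dz = statistics.mean(diffs) / statistics.stdev(diffs)
    j = 1 - 3 / (4 * (n - 1) - 1)   # approximate correction factor
    return j * dz

# hypothetical per-participant RT differences (incompatible - compatible), ms
rts = [35, 42, 28, 51, 33, 40, 37, 45]
print(round(hedges_gz(rts), 2))
```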

  18. Investigation of an automatic trim algorithm for restructurable aircraft control

    Science.gov (United States)

    Weiss, J.; Eterno, J.; Grunberg, D.; Looze, D.; Ostroff, A.

    1986-01-01

    This paper develops and solves an automatic trim problem for restructurable aircraft control. The trim solution is applied as a feed-forward control to reject measurable disturbances following control element failures. Disturbance rejection and command following performances are recovered through the automatic feedback control redesign procedure described by Looze et al. (1985). For this project the existence of a failure detection mechanism is assumed, and methods to cope with potential detection and identification inaccuracies are addressed.

  19. Structure of the automatic system for plasma equilibrium position control

    International Nuclear Information System (INIS)

    Gubarev, V.F.; Krivonos, Yu.G.; Samojlenko, Yu.I.; Snegur, A.A.

    1978-01-01

Considered are the principles of construction of an automatic system for controlling the equilibrium position of the plasma filament inside the discharge chamber of a tokamak-type installation. A combined current control system for the control winding is suggested. The most powerful subsystem drives the current in the control winding according to a program calculated beforehand; this provides rough plasma equilibrium along the major radius. A subsystem that changes the current within small limits according to the feedback principle operates simultaneously, so that stabilization of the plasma position in the discharge chamber is achieved. The advantage of such a system is a decrease in the automatic regulator power without lowering the requirements on the accuracy of equilibrium preservation. A subsystem for automatic control of the vertical plasma position is also included. This approach to the construction of the automatic control system has proved to be correct; it is based on experience with similar devices on some existing thermonuclear plants

  20. Productive performance of Nile tilapia (Oreochromis niloticus) fed at different frequencies and periods with automatic dispenser

    Directory of Open Access Journals (Sweden)

    R.M.R. Sousa

    2012-02-01

Full Text Available The performance of Nile tilapia (Oreochromis niloticus) raised in cages furnished with an automatic dispenser, supplied at different frequencies (once per hour and once every two hours) and periods (daytime, nighttime and both), was evaluated. Eighteen 1.0 m³ cages were placed into a 2000 m² pond, two meters deep with a 5% water exchange. One hundred and seventy tilapias, with initial weight of 16.0±4.9 g, were dispersed into each 1 m³ cage and the feed ration was adjusted every 21 days with biometry. Data were collected from March to July (autumn and winter). A significant difference in final weight (P<0.05) among treatments was observed. The increase in feeding frequency improves the productive performance of Nile tilapias in cages and permitted better management of the food. The better feed conversion rate at the high feeding frequency (24 times per day) can result in saving up to 360 kg of food for each ton of fish produced, increasing the economic sustainability of tilapia culture and suggesting less environmental pollution.
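The reported feed saving follows directly from the feed conversion ratio (FCR, kg of feed per kg of fish gain): the saving per ton of fish is the FCR difference times 1000 kg. A one-line check with hypothetical FCR values chosen to reproduce the ~360 kg figure (the abstract does not state the underlying FCRs):

```python
def feed_saved_per_ton(fcr_baseline, fcr_improved, fish_kg=1000):
    """Feed saved (kg) when producing `fish_kg` of fish at an improved
    feed conversion ratio (FCR = kg feed per kg fish gain)."""
    return (fcr_baseline - fcr_improved) * fish_kg

# hypothetical FCRs consistent with the reported saving
print(feed_saved_per_ton(1.80, 1.44))  # about 360 kg per ton of fish
```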

  1. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    Directory of Open Access Journals (Sweden)

    Ping Hu

    2017-04-01

Full Text Available The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study was a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between-subjects design. Participants were assigned to two groups according to their performance on an emotion regulation-IAT (differentiating automatic emotion control tendency and automatic emotion expression tendency). Then participants of the two groups were randomly assigned to two emotion regulation priming conditions (emotion control priming or emotion expression priming). Anger was provoked by blaming participants for slow performance during a subsequent backward subtraction task. In anger provocation, SCL of individuals with automatic emotion control tendencies in the control priming condition was lower than of those with automatic emotion control tendencies in the expression priming condition. However, SCL of individuals with automatic emotion expression tendencies did not differ in the automatic emotion control priming or the automatic emotion expression priming condition. Heart rate during anger provocation was higher in individuals with automatic emotion expression tendencies than in individuals with automatic emotion control tendencies regardless of priming condition. This pattern indicates an interactive effect of individual differences in AER and emotion regulation priming on SCL, which is an index of emotional arousal. Heart rate was only sensitive to the individual differences in AER, and did not reflect this interaction. This finding has implications for clinical studies of the use of emotion regulation strategy training suggesting that different practices are optimal for individuals who differ in AER tendencies.

  2. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for badly-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large-scale aerial images acquired using mini-UAV systems.

  3. The Safeguards analysis applied to the RRP. Automatic sampling authentication system

    International Nuclear Information System (INIS)

    Ono, Sawako; Nakashima, Shinichi; Iwamoto, Tomonori

    2004-01-01

The sampling for analysis from vessels and columns at the Rokkasho Reprocessing Plant (RRP) is performed mostly by the automatic sampling system. The safeguards samples for verification will also be taken using these sampling systems and transferred to the OSL through the pneumatic transfer network owned and controlled by the operator. In order to maintain sample integrity and continuity of knowledge (CoK) throughout sample processing, it is essential to develop and establish authentication measures for the automatic sampling system, including the transfer network. We have developed the Automatic Sampling Authentication System (ASAS) under consultation by the IAEA. This paper describes the structure, function and concept of ASAS. (author)

  4. A cloud-based system for automatic glaucoma screening.

    Science.gov (United States)

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage at large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resultant medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling early intervention and more efficient disease management.

  5. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

Abstract: An automatic tuning circuit adjusts frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for the integrated circuit. It is a feedback system that compares the input phase with the output phase and can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter and a relaxation oscillator. On PSPICE simulation...

  6. Quality Assessment of Compressed Video for Automatic License Plate Recognition

    DEFF Research Database (Denmark)

    Ukhanova, Ann; Støttrup-Andersen, Jesper; Forchhammer, Søren

    2014-01-01

Definition of video quality requirements for video surveillance poses new questions in the area of quality assessment. This paper presents a quality assessment experiment for an automatic license plate recognition scenario. We explore the influence of compression by the H.264/AVC and H.265/HEVC standards on recognition performance. We compare logarithmic and logistic functions for quality modeling. Our results show that a logistic function can better describe the dependence of recognition performance on quality for both compression standards. We observe that automatic license plate recognition in our study has a behavior similar to human recognition, allowing the use of the same mathematical models. We furthermore propose an application of one of the models for video surveillance systems.
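A logistic quality model of the kind referred to above maps a scalar quality score to a recognition probability that saturates at both ends, which a logarithmic model cannot do. A sketch with invented parameters (the paper's fitted coefficients are not reproduced here):

```python
import math

def logistic_model(q, q50, slope):
    """Recognition probability as a logistic function of a quality score q;
    q50 is the quality at which recognition reaches 50%. Parameters hypothetical."""
    return 1.0 / (1.0 + math.exp(-slope * (q - q50)))

# recognition saturates at high quality and collapses at low quality
for q in (20, 30, 40, 50):   # illustrative quality scores, e.g. PSNR in dB
    print(q, round(logistic_model(q, q50=32.0, slope=0.4), 3))
```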

  7. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
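A debugging graph of the kind described can be represented directly: nodes are the sets of faults already repaired, and each path from the empty set to the full set is one fault recovery order. A minimal Python sketch (fault names invented):

```python
from itertools import permutations

# a program with three known faults; each repair order is one debugging path
faults = ("A", "B", "C")

def debugging_paths(faults):
    """Enumerate every path through the debugging graph: each path is a repair
    order together with the sequence of repaired-fault sets it visits."""
    for order in permutations(faults):
        states = [frozenset(order[:i]) for i in range(len(order) + 1)]
        yield order, states

paths = list(debugging_paths(faults))
print(len(paths))  # -> 6, i.e. 3! possible fault recovery orders
```

Evaluating a reliability model's predictions along each such path separately is what exposes the order sensitivity the abstract reports.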

  8. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing, is followed by a comparison and summing-up of work by the authors and by a Soviet staff from the Moscou INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesaurus and automatic classification are examined [fr

  9. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

The article describes the factors affecting concrete mixture quality related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted moisture in aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the kneading-and-mixing machinery is supervised by operational automatic control of homogeneity. Theoretical underpinnings of homogeneity control are presented, relating homogeneity to changes in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system regulating the supply of water is determined by the change in concrete mixture homogeneity during continuous mixing of the components. The following technical means for establishing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system is described by a structure flowchart with transfer functions that determine the ACS operation in the transient dynamic mode.

  10. Full Automatic synthesis of [18F]FMISO

    International Nuclear Information System (INIS)

    Seung Jun Oh; Se Hun Kang; Jin-Sook Ryu; Dae Hyuk Moon

    2004-01-01

[18F]FMISO is a radiopharmaceutical for hypoxia imaging. Although it was developed in 1986, there has been no report of automatic synthesis. In this work, we established a fully automatic synthesis of [18F]FMISO and evaluated its stability according to the ICH guideline. Method: We used a GE MicroLab MX for automatic synthesis. The sequence program was modified to control the module as follows: [18F]fluoride drying → [18F]fluorination → trapping of the reaction mixture on a C18 cartridge → purification and elution of the reaction mixture → hydrolysis and HPLC purification. A disposable cassette was used for each synthesis and discarded afterwards. To find the optimal synthesis conditions, we tested 90-120 °C as the reaction temperature, 5-15 mg of 1-(2-nitro-1-imidazolyl)-2-O-tetrahydropyranyl-3-O-toluenesulfonyl-propanediol as the precursor, and 5-15 min as the [18F]fluorination time. The HPLC purification condition was EtOH:H2O = 5:95 at 5 ml/min with an Alltech Econosil column. To check the stability of production, we performed 30 runs. We checked the radiochemical stability for 6 hours at 25 °C and 40% humidity. For acceleration testing, we also performed stability tests at 50-70 °C with 60-80% humidity, or under UV light, for 6 hours after synthesis. Results: The optimal [18F]fluorination condition was 10 mg of precursor and 15 min incubation at 110 °C. Hydrolysis was performed at 105 °C for 5 min. After HPLC purification, the radiochemical yield and purity were 45±2.8% and 98±1.2%, respectively. Total synthesis time was 60±5.2 min. [18F]FMISO was stable for 6 hours after production, with 97±2.4% radiochemical purity. [18F]FMISO was also stable in the acceleration and photochemical tests, with 97±2.4% and 97±2.8% radiochemical purity, respectively. Conclusion: We established a fully automatic synthesis method for [18F]FMISO with reproducible, high production yield. [18F]FMISO synthesized by this method was stable
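Independently of the chemistry, the 60-minute synthesis time has a fixed decay cost, since fluorine-18 has a physical half-life of about 109.8 min. A quick Python check of the fraction of starting activity that survives the synthesis:

```python
F18_HALF_LIFE_MIN = 109.77  # physical half-life of fluorine-18, minutes

def fraction_remaining(t_min, half_life=F18_HALF_LIFE_MIN):
    """Fraction of 18F activity left after t_min minutes of processing time."""
    return 0.5 ** (t_min / half_life)

# roughly a third of the starting activity is lost to decay alone over a
# 60-minute synthesis, before the ~45% radiochemical yield is applied
print(round(fraction_remaining(60), 3))
```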

  11. Automatic Management of Parallel and Distributed System Resources

    Science.gov (United States)

    Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.

    1990-01-01

    Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.

  12. An interactive system for the automatic layout of printed circuit boards (ARAIGNEE)

    International Nuclear Information System (INIS)

    Combet, M.; Eder, J.; Pagny, C.

    1974-12-01

A software package for the automatic layout of printed circuit boards is presented. The program permits interaction of the user during the layout process. The automatic searching of paths can be interrupted at any step and convenient corrections can be inserted. This procedure strongly improves the performance of the program as far as the number of unresolved connections is concerned

  13. Semi-automatic tool to ease the creation and optimization of GPU programs

    DEFF Research Database (Denmark)

    Jepsen, Jacob

    2014-01-01

We present a tool that reduces the development time of GPU-executable code. We implement a catalogue of common optimizations specific to the GPU architecture. Through the tool, the programmer can semi-automatically transform a computationally-intensive code section into GPU-executable form. Many of the transformations can be performed automatically, which makes the tool usable for both novices and experts in GPU programming.

  14. Automatic control design procedures for restructurable aircraft control

    Science.gov (United States)

    Looze, D. P.; Krolewski, S.; Weiss, J.; Barrett, N.; Eterno, J.

    1985-01-01

A simple, reliable automatic redesign procedure for restructurable control is discussed. This procedure is based on Linear Quadratic (LQ) design methodologies. It employs a robust control system design for the unfailed aircraft to minimize the effects of failed surfaces and to extend the time available for restructuring the Flight Control System. The procedure uses the LQ design parameters for the unfailed system as a basis for choosing the design parameters of the failed system. This philosophy allows the engineering trade-offs that were present in the nominal design to be inherited by the restructurable design. In particular, it allows bandwidth limitations and performance trade-offs to be incorporated in the redesigned system. The procedure also has several other desirable features. It effectively redistributes authority among the available control effectors to maximize the system performance subject to actuator limitations and constraints. It provides a graceful performance degradation as the amount of control authority lessens. When given the parameters of the unfailed aircraft, the automatic redesign procedure reproduces the nominal control system design.
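The redesign philosophy of keeping the LQ weights fixed and recomputing the gain for the failed effector set can be sketched with a toy discrete-time model. The matrices below are invented, and the Riccati iteration is a generic LQR solver, not the paper's procedure:

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Infinite-horizon discrete-time LQR gain via Riccati value iteration."""
    P = Q.copy()
    for _ in range(iters):
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# toy 2-state model with two control effectors (hypothetical numbers)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0, 0.05], [0.1, 0.1]])
Q, R = np.eye(2), np.eye(2)

K_nominal = dlqr_gain(A, B, Q, R)
B_failed = B.copy(); B_failed[:, 0] = 0.0     # first effector lost
K_redesigned = dlqr_gain(A, B_failed, Q, R)   # same LQ weights, new gain

for K, Bx in ((K_nominal, B), (K_redesigned, B_failed)):
    rho = max(abs(np.linalg.eigvals(A - Bx @ K)))
    print(round(float(rho), 3))               # spectral radius < 1: stable
```

Zeroing a column of B models a lost effector; re-solving with the same Q and R redistributes authority to the remaining effector while keeping the closed loop stable, which is the trade-off-inheritance idea in miniature.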

  15. Research and Development of Fully Automatic Alien Smoke Stack and Packaging System

    Science.gov (United States)

    Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu

    2017-12-01

Manual sorting and packaging at current tobacco distribution centers is inefficient, which motivated the development of a safe, efficient, fully automatic alien smoke stacking and packaging system. The system adopts PLC control technology, servo control technology, robot technology, image recognition technology and human-computer interaction technology. The characteristics, principles, control process and key technologies of the system are discussed in detail. After installation and commissioning, the fully automatic alien smoke stack and packaging system performs well and meets the requirements for handling shaped cigarettes.

  16. Automatic Morphological Sieving: Comparison between Different Methods, Application to DNA Ploidy Measurements

    Directory of Open Access Journals (Sweden)

    Christophe Boudry

    1999-01-01

Full Text Available The aim of the present study is to propose alternative automatic methods to time-consuming interactive sorting of elements for DNA ploidy measurements. One archival brain tumour and two archival breast carcinomas were studied, corresponding to 7120 elements (3764 nuclei, 3356 debris and aggregates). Three automatic classification methods were tested to eliminate debris and aggregates from DNA ploidy measurements: mathematical morphology (MM), multiparametric analysis (MA) and neural network (NN). Performances were evaluated by reference to interactive sorting. The results obtained for the three methods concerning the percentage of debris and aggregates automatically removed reach 63, 75 and 85% for the MM, MA and NN methods, respectively, with false positive rates of 6, 21 and 25%. Information about DNA ploidy abnormalities was globally preserved after automatic elimination of debris and aggregates by the MM and MA methods as opposed to the NN method, showing that automatic classification methods can offer alternatives to tedious interactive elimination of debris and aggregates for DNA ploidy measurements of archival tumours.
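In its simplest form, a morphological sieve of this kind reduces to thresholding per-element shape features. A toy Python sketch (feature names and thresholds invented; the study's MM/MA/NN classifiers are far richer):

```python
def sieve(events, min_area=40.0, min_circularity=0.6):
    """Crude morphological sieve: keep nucleus-like events, reject debris
    (too small) and aggregates (too irregular). Thresholds are hypothetical."""
    nuclei, rejected = [], []
    for e in events:
        ok = e["area"] >= min_area and e["circularity"] >= min_circularity
        (nuclei if ok else rejected).append(e)
    return nuclei, rejected

events = [
    {"area": 120.0, "circularity": 0.9},   # nucleus-like
    {"area": 15.0,  "circularity": 0.8},   # debris (too small)
    {"area": 300.0, "circularity": 0.3},   # aggregate (too irregular)
]
nuclei, rejected = sieve(events)
print(len(nuclei), len(rejected))  # -> 1 2
```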

  17. Research on automatic correction of electronic beam path for distributed control

    International Nuclear Information System (INIS)

    Guo Xin; Su Haijun; Li Deming; Wang Shengli; Guo Honglei

    2014-01-01

    Background: Dynamitron, an electron irradiation accelerator of high-voltage, is used as a radiation source for industrial and agricultural production. The control system is an important component of dynamitron. Purpose: The aim is to improve the control system to meet the performance requirements of dynamitron for the stability. Methods: A distributed control system for the 1.5-MeV dynamitron is proposed to gain better performance. On this basis, an electron beam trajectory automatic correction method based on Cerebellar Model Articulation Controller and Proportional-Integral Derivative (CMAC-PID) controller is designed to improve the function of electron beam extraction system. Results: The distributed control system can meet the control requirements of the accelerator. The stability of the CMAC PID controller is better than that of conventional PID controller for the electron beam trajectory automatic correction system, and hence the CMAC-PID controller can provide better protection of dynamitron when electron beam deflection occurs. Conclusion: The distributed control system and the electron beam trajectory automatic correction method system can effectively improve the performance and reduce the failure probability of the accelerator, thereby enhancing the efficiency of the accelerator. (authors)

  18. Automatic phase aberration compensation for digital holographic microscopy based on deep learning background detection.

    Science.gov (United States)

    Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George

    2017-06-26

We propose a fully automatic technique to obtain aberration-free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. Traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This is a drawback for real-time implementation and for dynamic processes such as cell migration phenomena. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion. However, it corrects spherical/elliptical aberration only and disregards higher-order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations. Ideally, automatic image segmentation techniques make real-time measurement possible. However, existing automatic unsupervised segmentation techniques have poor performance when applied to DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique based on a convolutional neural network (CNN) with Zernike polynomial fitting (ZPF). The deep learning CNN is implemented to perform automatic background region detection, which allows the ZPF to compute the self-conjugated phase to compensate for most aberrations.
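The final compensation step — fitting a smooth surface to the phase values in the detected background and subtracting it everywhere — can be illustrated with an ordinary least-squares fit. This sketch uses a plain 2-D polynomial basis as a stand-in for the Zernike terms, a hand-made circular mask in place of the CNN output, and fully synthetic data.

```python
import numpy as np

# Synthetic 64x64 unwrapped phase: true aberration = tilt + defocus-like quadratic.
n = 64
y, x = np.mgrid[0:n, 0:n]
x = (x - n / 2) / (n / 2)   # normalize coordinates to [-1, 1)
y = (y - n / 2) / (n / 2)
aberration = 0.8 * x - 0.5 * y + 1.2 * (x**2 + y**2)

# Pretend a CNN flagged the background: everything outside a central "cell".
cell = (x**2 + y**2) < 0.25
background = ~cell
phase = aberration + 2.0 * cell.astype(float)   # the object adds phase on top

# Least-squares fit of a low-order basis on background pixels only.
basis = np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=-1)
coef, *_ = np.linalg.lstsq(basis[background], phase[background], rcond=None)

# Compensate: subtract the fitted aberration surface everywhere.
compensated = phase - basis @ coef
```

After compensation the background is flat and only the object phase (here a constant 2.0 rad inside the cell) remains, which is exactly what the quantitative measurement needs.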

  19. Synthèse de gestionnaires mémoire pour applications Java temps-réel embarquées

    OpenAIRE

    Salagnac , Guillaume

    2008-01-01

    In this thesis, we address the problem of dynamic memory management in real-time embedded Java systems. When programming in C or C++, all memory management is done explicitly by the programmer, inducing numerous execution faults because of hazardous use of memory operations. This greatly increases software development costs, because such errors are very hard to debug. The Java language tackles this problem by offering automatic memory management, thanks to the use of a garbage collector. Howe...

  20. Block storage subsystem performance analysis

    CERN Multimedia

    CERN. Geneva

    2016-01-01

You feel that your service is slow because of the storage subsystem? But there are too many abstraction layers between your software and the raw block device for you to debug the whole pile... Let's dive onto the platters and check out how the block storage sees your I/Os! We can even figure out what those patterns mean.

  1. Prototype Design and Application of a Semi-circular Automatic Parking System

    OpenAIRE

    Atacak, Ismail; Erdogdu, Ertugrul

    2017-01-01

Nowadays, with the increasing population in urban areas, the number of vehicles used in traffic has also increased in these areas. This has brought with it major problems caused by insufficient parking areas: traffic congestion and burdens on drivers and the environment. In this study, in order to overcome these problems, a multi-storey automatic parking system that automatically performs vehicle recognition, vehicle parking, vehicle delivery and pricing processes has been designed and the...

  2. Technique for comparing automatic quadrature routines

    Energy Technology Data Exchange (ETDEWEB)

    Lyness, J N; Kaganove, J J

    1976-02-01

The present unconstrained proliferation of automatic quadrature routines is wasteful of both human time and computing resources. At the root of the problem is the absence of generally accepted standards or benchmarks for comparing or evaluating such routines. In this paper a general technique, based on the nature of the performance profile, is described which can be used to evaluate such routines.
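The performance-profile idea — record, over a family of test integrands, how often a routine meets its own requested tolerance — can be sketched as follows. Adaptive Simpson quadrature stands in for the routines under comparison, and the monomial test family is an arbitrary choice for illustration.

```python
def adaptive_simpson(f, a, b, tol):
    """Classic recursive adaptive Simpson quadrature with Richardson correction."""
    def simp(fa, fm, fb, a, b):
        return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

    def rec(a, b, fa, fm, fb, whole, tol):
        m = 0.5 * (a + b)
        lm, rm = 0.5 * (a + m), 0.5 * (m + b)
        flm, frm = f(lm), f(rm)
        left = simp(fa, flm, fm, a, m)
        right = simp(fm, frm, fb, m, b)
        if abs(left + right - whole) <= 15.0 * tol:
            return left + right + (left + right - whole) / 15.0
        return (rec(a, m, fa, flm, fm, left, tol / 2.0) +
                rec(m, b, fm, frm, fb, right, tol / 2.0))

    fa, fm, fb = f(a), f(0.5 * (a + b)), f(b)
    return rec(a, b, fa, fm, fb, simp(fa, fm, fb, a, b), tol)


# Test family: integrate x**k over [0, 1]; the exact value is 1 / (k + 1).
problems = [(lambda x, k=k: x ** k, 1.0 / (k + 1)) for k in range(1, 11)]

# Performance profile: fraction of problems where the achieved error
# is within the requested tolerance.
profile = {}
for tol in (1e-4, 1e-6, 1e-8):
    hits = sum(abs(adaptive_simpson(f, 0.0, 1.0, tol) - exact) <= tol
               for f, exact in problems)
    profile[tol] = hits / len(problems)
```

Comparing several routines then amounts to plotting each routine's profile over the same problem family: a uniformly higher curve means a more trustworthy error control.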

  3. Automatic feature-based grouping during multiple object tracking.

    Science.gov (United States)

    Erlikhman, Gennady; Keane, Brian P; Mettler, Everett; Horowitz, Todd S; Kellman, Philip J

    2013-12-01

    Contour interpolation automatically binds targets with distractors to impair multiple object tracking (Keane, Mettler, Tsoi, & Kellman, 2011). Is interpolation special in this regard or can other features produce the same effect? To address this question, we examined the influence of eight features on tracking: color, contrast polarity, orientation, size, shape, depth, interpolation, and a combination (shape, color, size). In each case, subjects tracked 4 of 8 objects that began as undifferentiated shapes, changed features as motion began (to enable grouping), and returned to their undifferentiated states before halting. We found that intertarget grouping improved performance for all feature types except orientation and interpolation (Experiment 1 and Experiment 2). Most importantly, target-distractor grouping impaired performance for color, size, shape, combination, and interpolation. The impairments were, at times, large (>15% decrement in accuracy) and occurred relative to a homogeneous condition in which all objects had the same features at each moment of a trial (Experiment 2), and relative to a "diversity" condition in which targets and distractors had different features at each moment (Experiment 3). We conclude that feature-based grouping occurs for a variety of features besides interpolation, even when irrelevant to task instructions and contrary to the task demands, suggesting that interpolation is not unique in promoting automatic grouping in tracking tasks. Our results also imply that various kinds of features are encoded automatically and in parallel during tracking.

  4. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples.

  5. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

A program, ''CRITEST'', written in the language PL/1 for the EC computer and intended for automatic processing of the results of radioimmunological research has been developed. The program runs under the EC computer's operating system and occupies a 60-KB region. A modified Aitken algorithm was used in its implementation. The program was clinically validated in determining a number of hormones: CTH, T4, T3 and TSH. Automatic processing of radioimmunological research data on the computer makes it possible to simplify the labour-consuming analysis and to raise its accuracy.

  6. Artificial Intelligence In Automatic Target Recognizers: Technology And Timelines

    Science.gov (United States)

    Gilmore, John F.

    1984-12-01

    The recognition of targets in thermal imagery has been a problem exhaustively analyzed in its current localized dimension. This paper discusses the application of artificial intelligence (AI) technology to automatic target recognition, a concept capable of expanding current ATR efforts into a new globalized dimension. Deficiencies of current automatic target recognition systems are reviewed in terms of system shortcomings. Areas of artificial intelligence which show the most promise in improving ATR performance are analyzed, and a timeline is formed in light of how near (as well as far) term artificial intelligence applications may exist. Current research in the area of high level expert vision systems is reviewed and the possible utilization of artificial intelligence architectures to improve low level image processing functions is also discussed. Additional application areas of relevance to solving the problem of automatic target recognition utilizing both high and low level processing are also explored.

  7. Automatic evidence retrieval for systematic reviews.

    Science.gov (United States)

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy

    2014-10-01

    Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate an automatic method for citation snowballing's capacity to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
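For reference, the reported scores follow from the standard precision/recall/F1 definitions. In this sketch the number of retrieved citations (648) is not stated in the abstract; it is inferred here from the reported 97.7% precision and 633 correct identifications.

```python
def precision_recall_f1(tp, retrieved, relevant):
    """Standard information-retrieval metrics from raw counts."""
    precision = tp / retrieved
    recall = tp / relevant
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Counts from the abstract: 633 correctly identified out of 949 citations;
# 648 retrieved is an inferred (hypothetical) figure, see lead-in above.
p, r, f1 = precision_recall_f1(tp=633, retrieved=648, relevant=949)
```

Evaluated this way, p ≈ 0.977 and r ≈ 0.667 reproduce the abstract's "as proportion of included citations" figures.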

  8. Using automatic calibration method for optimizing the performance of Pedotransfer functions of saturated hydraulic conductivity

    Directory of Open Access Journals (Sweden)

    Ahmed M. Abdelbaki

    2016-06-01

Pedotransfer functions (PTFs) are an easy way to predict saturated hydraulic conductivity (Ksat) without measurements. This study aims to automatically calibrate 22 PTFs. The PTFs were divided into three groups according to their input requirements, and the shuffled complex evolution algorithm was used in calibration. The results showed great improvement in the performance of the functions compared to the originally published versions. For group 1 PTFs, the geometric mean error ratio (GMER) and the geometric standard deviation of the error ratio (GSDER) values improved from the ranges (1.27–6.09) and (5.2–7.01) to (0.91–1.15) and (4.88–5.85), respectively. For group 2 PTFs, the GMER and GSDER values improved from (0.3–1.55) and (5.9–12.38) to (1.00–1.03) and (5.5–5.9), respectively. For group 3 PTFs, the GMER and GSDER values improved from (0.11–2.06) and (5.55–16.42) to (0.82–1.01) and (5.1–6.17), respectively. The results showed that automatic calibration is an efficient and accurate method to enhance the performance of PTFs.
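GMER and GSDER are log-space statistics of the ratio of predicted to measured Ksat: GMER near 1 means the PTF is unbiased on average, and GSDER near 1 means little scatter. A minimal sketch, with made-up Ksat values:

```python
import math

def gmer_gsder(predicted, measured):
    """Geometric mean (GMER) and geometric standard deviation (GSDER)
    of the error ratio r_i = predicted_i / measured_i:
    GMER = exp(mean(ln r_i)),  GSDER = exp(sd(ln r_i))."""
    logs = [math.log(p / m) for p, m in zip(predicted, measured)]
    mean = sum(logs) / len(logs)
    var = sum((l - mean) ** 2 for l in logs) / (len(logs) - 1)
    return math.exp(mean), math.exp(math.sqrt(var))

# Hypothetical Ksat values (e.g. cm/day); a well-calibrated PTF
# gives GMER close to 1 with GSDER as small as possible.
measured = [10.0, 25.0, 50.0, 100.0]
predicted = [12.0, 20.0, 55.0, 95.0]
gmer, gsder = gmer_gsder(predicted, measured)
```

The calibration in the paper effectively drives GMER toward 1 (removing bias) while shrinking GSDER (reducing scatter) for each PTF group.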

  9. A simple method for automatic measurement of excitation functions

    International Nuclear Information System (INIS)

    Ogawa, M.; Adachi, M.; Arai, E.

    1975-01-01

    An apparatus has been constructed to perform the sequence control of a beam-analysing magnet for automatic excitation function measurements. This device is also applied to the feedback control of the magnet to lock the beam energy. (Auth.)

  10. Performance of an automatic dose control system for CT. Anthropomorphic phantom studies

    Energy Technology Data Exchange (ETDEWEB)

    Gosch, D.; Stumpp, P.; Kahn, T. [Universitaetsklinikum Leipzig (Germany). Klinik und Poliklinik fuer Diagnostische und Interventionelle Radiologie; Nagel, H.D. [Wissenschaft und Technik fuer die Radiologie, Dr. HD Nagel, Buchholz (Germany)

    2011-02-15

    Purpose: To assess the performance and to provide more detailed insight into characteristics and limitations of devices for automatic dose control (ADC) in CT. Materials and Methods: A comprehensive study on DoseRight 2.0, the ADC system provided by Philips for its Brilliance CT scanners, was conducted with assorted tests using an anthropomorphic phantom that allowed simulation of the operation of the system under almost realistic conditions. The scan protocol settings for the neck, chest and abdomen with pelvis were identical to those applied in the clinical routine. Results: Using the appropriate ADC functionalities, dose reductions equal 40 % for the neck, 20 % for the chest and 10 % for the abdomen with pelvis. Larger dose reductions can be expected for average patients, since their attenuating properties differ significantly from the anthropomorphic phantom. Adverse effects due to increased image noise were only moderate as a consequence of the 'adequate noise system' design and the complementary use of adaptive filtration. The results of specific tests also provided deeper insight into the operation of the ADC system that helps to identify the causes of suspected malfunctions and to prevent potential pitfalls. Conclusion: Tests with anthropomorphic phantoms allow verification of the characteristics of devices for ADC in CT under almost realistic conditions. However, differences in phantom shape and material composition require supplementary patient studies on representative patient groups. (orig.)

  11. Automatic path proposal computation for CT-guided percutaneous liver biopsy.

    Science.gov (United States)

    Helck, A; Schumann, C; Aumann, J; Thierfelder, K; Strobl, F F; Braunagel, M; Niethammer, M; Clevert, D A; Hoffmann, R T; Reiser, M; Sandner, T; Trumm, C

    2016-12-01

To evaluate the feasibility of automatic software-based path proposals for CT-guided percutaneous biopsies. Thirty-three patients (60 ± 12 years) referred for CT-guided biopsy of focal liver lesions were consecutively included. Pre-interventional CT and dedicated software (FraunhoferMeVis Pathfinder) were used for (semi)automatic segmentation of relevant structures. The software subsequently generated three path proposals in decreasing quality for CT-guided biopsy. Proposed needle paths were compared with the consensus proposal of two experts (comparable, less suitable, not feasible). In case of comparable results, the approach equivalent to the software-based path proposal was used. The quality of the segmentation process was evaluated (Likert scale, 1 = best, 6 = worst), and the time for processing was registered. All biopsies were performed successfully without complications. In 91 % one of the three automatic path proposals was rated comparable to the experts' proposal. None of the first proposals was rated not feasible, and 76 % were rated comparable to the experts' proposal. 7 % of automatic path proposals were rated not feasible, all being second choice ([Formula: see text]) or third choice ([Formula: see text]). In 79 %, the segmentation was at least good. The average total time for establishing an automatic path proposal was 42 ± 9 s. In the majority of cases, automatic software-based path proposals for CT-guided liver biopsies are easy to establish and comparable to experts' insertion trajectories.

  12. Interactive vs. automatic ultrasound image segmentation methods for staging hepatic lipidosis.

    Science.gov (United States)

    Weijers, Gert; Starke, Alexander; Haudum, Alois; Thijssen, Johan M; Rehage, Jürgen; De Korte, Chris L

    2010-07-01

The aim of this study was to test the hypothesis that automatic segmentation of vessels in ultrasound (US) images can produce similar or better results in grading fatty livers than interactive segmentation. A study was performed in postpartum dairy cows (N=151), as an animal model of human fatty liver disease, to test this hypothesis. Five transcutaneous and five intraoperative US liver images were acquired in each animal and a liver biopsy was taken. In liver tissue samples, triacylglycerol (TAG) was measured by biochemical analysis, and hepatic diseases other than hepatic lipidosis were excluded by histopathologic examination. Ultrasonic tissue characterization (UTC) parameters (mean echo level, standard deviation (SD) of echo level, signal-to-noise ratio (SNR), residual attenuation coefficient (ResAtt), and axial and lateral speckle size) were derived using a computer-aided US (CAUS) protocol and software package. First, the liver tissue was interactively segmented by two observers. With increasing fat content, fewer hepatic vessels were visible in the ultrasound images and, therefore, a smaller proportion of the liver needed to be excluded from these images. Automatic-segmentation algorithms were implemented, and it was investigated whether better results could be achieved than with the subjective and time-consuming interactive-segmentation procedure. The automatic-segmentation algorithms were based on both fixed and adaptive thresholding techniques in combination with a 'speckle'-shaped moving-window exclusion technique. All data were analyzed with and without postprocessing as contained in CAUS and with different automated-segmentation techniques. This enabled us to study the effect of the applied postprocessing steps on single and multiple linear regressions of the various UTC parameters with TAG. Improved correlations for all US parameters were found by using automatic-segmentation techniques. Stepwise multiple linear-regression formulas were derived and used

  13. Segmentation of Extrapulmonary Tuberculosis Infection Using Modified Automatic Seeded Region Growing

    Directory of Open Access Journals (Sweden)

    Nordin Abdul

    2009-01-01

In the image segmentation process of positron emission tomography combined with computed tomography (PET/CT) imaging, previous works used the information in CT only for segmenting the image, without utilizing the information that can be provided by PET. This paper proposes to utilize the hot-spot values in PET to guide the segmentation in CT, in automatic image segmentation using the seeded region growing (SRG) technique. This automatic segmentation routine can be used as part of automatic diagnostic tools. In addition to the original initial seed selection using hot-spot values in PET, this paper also introduces a new SRG growing criterion, the sliding window. Fourteen images of patients having extrapulmonary tuberculosis were examined using the above-mentioned method. To evaluate the performance of the modified SRG, three fidelity criteria were measured: percentage of under-segmentation area, percentage of over-segmentation area, and average time consumption. In terms of the under-segmentation percentage, SRG with the average-of-the-region growing criterion shows the least error percentage (51.85%). Meanwhile, SRG with local averaging and variance yielded the best results (2.67%) for the over-segmentation percentage. In terms of time complexity, the modified SRG with the local averaging and variance growing criterion shows the best performance with 5.273 s average execution time. The results indicate that the proposed methods yield fairly good performance in terms of the over- and under-segmentation area. The results also demonstrate that the hot-spot values in PET can be used to guide the automatic segmentation in the CT image.
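A minimal version of seeded region growing with a running-mean acceptance criterion (one common choice; the paper's sliding-window criterion is a refinement of this idea) might look like the sketch below. The synthetic "slice" and the brightest-pixel seed stand in for the CT image and the PET hot spot.

```python
from collections import deque

def seeded_region_growing(image, seed, tol):
    """Grow a region from `seed`, accepting 4-neighbours whose intensity
    stays within `tol` of the running region mean."""
    h, w = len(image), len(image[0])
    region = {seed}
    total = image[seed[0]][seed[1]]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in region:
                mean = total / len(region)
                if abs(image[nr][nc] - mean) <= tol:
                    region.add((nr, nc))
                    total += image[nr][nc]
                    queue.append((nr, nc))
    return region

# Synthetic 8x8 "CT slice": a bright 3x3 lesion in a dark background.
img = [[10] * 8 for _ in range(8)]
for r in range(2, 5):
    for c in range(3, 6):
        img[r][c] = 100

# Seed chosen at the PET hot spot, here simply the brightest pixel.
seed = max(((r, c) for r in range(8) for c in range(8)),
           key=lambda p: img[p[0]][p[1]])
lesion = seeded_region_growing(img, seed, tol=20)
```

Seeding from the hot spot removes the manual seed-picking step, which is exactly the role PET plays in the paper's pipeline.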

  14. Performance verification and comparison of TianLong automatic hypersensitive hepatitis B virus DNA quantification system with Roche CAP/CTM system.

    Science.gov (United States)

    Li, Ming; Chen, Lin; Liu, Li-Ming; Li, Yong-Li; Li, Bo-An; Li, Bo; Mao, Yuan-Li; Xia, Li-Fang; Wang, Tong; Liu, Ya-Nan; Li, Zheng; Guo, Tong-Sheng

    2017-10-07

To investigate and compare the analytical and clinical performance of the TianLong automatic hypersensitive hepatitis B virus (HBV) DNA quantification system and the Roche CAP/CTM system. Two hundred blood samples for HBV DNA testing, HBV-DNA-negative samples and high-titer HBV-DNA mixture samples were collected and prepared. National standard materials for serum HBV and a worldwide HBV DNA panel were employed for performance verification. The analytical performance, such as limit of detection, limit of quantification, accuracy, precision, reproducibility, linearity, genotype coverage and cross-contamination, was determined using the TianLong automatic hypersensitive HBV DNA quantification system (TL system). Correlation and Bland-Altman plot analyses were carried out to compare the clinical performance of the TL system assay and the CAP/CTM system. The detection limit of the TL system was 10 IU/mL, and its limit of quantification was 30 IU/mL. The differences between the expected and tested concentrations of the national standards were less than ±0.4 log10 IU/mL, which showed the high accuracy of the system. Results of the precision, reproducibility and linearity tests showed that the multiple-test coefficient of variation (CV) of the same sample was less than 5% for 10^2-10^6 IU/mL, and for 30-10^8 IU/mL the linear correlation coefficient was r^2 = 0.99. The TL system detected HBV DNA genotypes A-H, and there was no cross-contamination during the "checkerboard" test. When compared with the CAP/CTM assay, the two assays showed 100% consistency in both negative and positive sample results (15 negative samples and 185 positive samples). No statistically significant differences between the two assays in the HBV DNA quantification values were observed (P > 0.05). Correlation analysis indicated a significant correlation between the two assays, r^2 = 0.9774. The Bland-Altman plot analysis showed that 98.9% of the positive data were within the 95% acceptable range, and the maximum difference

  15. Measuring the accuracy of automatic shoeprint recognition methods.

    Science.gov (United States)

    Luostarinen, Tapio; Lehmussola, Antti

    2014-11-01

Shoeprints are an important source of information for criminal investigation. Therefore, an increasing number of automatic shoeprint recognition methods have been proposed for detecting the corresponding shoe models. However, comprehensive comparisons among the methods have not previously been made. In this study, an extensive set of methods proposed in the literature was implemented, and their performance was studied in varying conditions. Three datasets of different-quality shoeprints were used, and the methods were also evaluated with partial and rotated prints. The results show clear differences between the algorithms: while the best performing method, based on local image descriptors and RANSAC, provides rather good results in most of the experiments, some methods are not at all robust to even minor imperfections in the images. Finally, the results demonstrate that there is still a need for extensive research to improve the accuracy of automatic recognition of crime scene prints. © 2014 American Academy of Forensic Sciences.

  16. Debugging of Class-D Audio Power Amplifiers

    DEFF Research Database (Denmark)

    Crone, Lasse; Pedersen, Jeppe Arnsdorf; Mønster, Jakob Døllner

    2012-01-01

Determining and optimizing the performance of a Class-D audio power amplifier can be very difficult without knowledge of the use of audio performance measuring equipment and of how the various noise and distortion sources influence the audio performance. This paper gives an introduction on how to measure

  17. Changes in default mode network as automaticity develops in a categorization task.

    Science.gov (United States)

    Shamloo, Farzin; Helie, Sebastien

    2016-10-15

    The default mode network (DMN) is a set of brain regions in which blood oxygen level dependent signal is suppressed during attentional focus on the external environment. Because automatic task processing requires less attention, development of automaticity in a rule-based categorization task may result in less deactivation and altered functional connectivity of the DMN when compared to the initial learning stage. We tested this hypothesis by re-analyzing functional magnetic resonance imaging data of participants trained in rule-based categorization for over 10,000 trials (Helie et al., 2010) [12,13]. The results show that some DMN regions are deactivated in initial training but not after automaticity has developed. There is also a significant decrease in DMN deactivation after extensive practice. Seed-based functional connectivity analyses with the precuneus, medial prefrontal cortex (two important DMN regions) and Brodmann area 6 (an important region in automatic categorization) were also performed. The results show increased functional connectivity with both DMN and non-DMN regions after the development of automaticity, and a decrease in functional connectivity between the medial prefrontal cortex and ventromedial orbitofrontal cortex. Together, these results further support the hypothesis of a strategy shift in automatic categorization and bridge the cognitive and neuroscientific conceptions of automaticity in showing that the reduced need for cognitive resources in automatic processing is accompanied by a disinhibition of the DMN and stronger functional connectivity between DMN and task-related brain regions. Copyright © 2016 Elsevier B.V. All rights reserved.
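Seed-based functional connectivity, as used in this study, reduces at its core to correlating the seed region's BOLD time course with that of every other region. A toy sketch with hand-made time courses (the values are arbitrary, not fMRI data):

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Toy time courses: a seed region and two targets, one coupled, one not.
seed      = [0.1, 0.5, 0.9, 0.4, 0.2, 0.8, 0.6, 0.3]
coupled   = [s * 1.5 + 0.1 for s in seed]          # linearly related to seed
unrelated = [0.7, 0.1, 0.2, 0.9, 0.8, 0.1, 0.3, 0.6]

r_coupled = pearson(seed, coupled)
r_unrelated = pearson(seed, unrelated)
```

In the study, maps of such correlations (for precuneus, medial prefrontal cortex and BA6 seeds) are compared between early training and the automatic stage to detect connectivity changes.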

  18. [Modeling and implementation method for the automatic biochemistry analyzer control system].

    Science.gov (United States)

    Wang, Dong; Ge, Wan-cheng; Song, Chun-lin; Wang, Yun-guang

    2009-03-01

The automatic biochemistry analyzer is a necessary instrument for clinical diagnostics. In this paper, the system structure is first analyzed. The system problem description and the fundamental principles for dispatch are brought forward. The paper then puts emphasis on modeling the automatic biochemistry analyzer control system: the object model and the communication model are put forward. Finally, the implementation method is designed. The results indicate that a system based on the model has good performance.

  19. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method.

    Science.gov (United States)

    Veta, Mitko; van Diest, Paul J; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P W

    2016-01-01

Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. The development of automatic mitosis detection methods has attracted considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an "external" dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility. The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial

  20. An automatic tension measurement system of MWPC wires

    International Nuclear Information System (INIS)

    D'Antone, I.; Lolli, M.; Torromeo, G.

    1992-01-01

An electronic system for the automatic measurement of mechanical tension, used to test wire chambers, is presented. The developed system works in the tension range from 50 g to 300 g; this large working range is obtained by using a microcontroller that performs digital control on the bridge of an oscillator containing the wire whose tension has to be measured. The microcontroller automatically brings the system to the oscillation condition and subsequently, by measuring the frequency, evaluates, displays and sends to a host computer the value of the mechanical tension of the wire. The system is precise and allows fast measurements. A description of the hardware and software design is given. (orig.)
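The measurement principle rests on the stretched-string relation f = (1/2L)·sqrt(T/μ) for the fundamental mode, so the tension follows directly from the measured oscillation frequency. In this sketch the wire length and linear density are hypothetical values for a thin tungsten sense wire, not figures from the paper.

```python
def wire_tension_grams(freq_hz, length_m, lin_density_kg_per_m, g=9.81):
    """Tension of a stretched wire from its fundamental frequency.

    From f = (1 / 2L) * sqrt(T / mu):  T = mu * (2 * L * f)**2  [newtons],
    converted to grams-force as displayed by typical wire-tension meters.
    """
    tension_n = lin_density_kg_per_m * (2 * length_m * freq_hz) ** 2
    return tension_n / g * 1000.0

# Hypothetical 1 m wire of 25 um tungsten (mu ~ 9.47e-6 kg/m):
# a fundamental near 114 Hz corresponds to roughly 50 grams-force.
tension_g = wire_tension_grams(113.8, 1.0, 9.47e-6)
```

Sweeping the oscillator until the wire resonates, then applying this formula, is essentially what the microcontroller in the described system automates.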

  1. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis and interaction with the programmer. With this pragmatic approach, we can provide scalable and effective refactoring support for real-world code, including libraries and incomplete applications. Through a series of experiments that estimate how much manual effort our technique demands from the programmer, we show...

  2. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows creating and improving control strategies to supply drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build experimental curves.

  3. The Mark II Automatic Diflux

    Directory of Open Access Journals (Sweden)

    Jean L Rasson

    2011-07-01

    Full Text Available We report here on the new realization of an automatic fluxgate theodolite able to perform unattended absolute geomagnetic declination and inclination measurements: the AUTODIF MKII. The main changes of this version compared with the former one are presented as well as the better specifications we expect now. We also explain the absolute orientation procedure by means of a laser beam and a corner cube and the method for leveling the fluxgate sensor, which is different from a conventional DIflux theodolite.

  4. Research on Semi-automatic Bomb Fetching for an EOD Robot

    Directory of Open Access Journals (Sweden)

    Qian Jun

    2008-11-01

    Full Text Available An EOD robot system, SUPER-PLUS, which has a novel semi-automatic bomb fetching function, is presented in this paper. With limited human support, SUPER-PLUS scans the cluttered environment with a wrist-mounted laser distance sensor and plans a collision-free path for the manipulator to fetch the bomb. The modeling of the manipulator, bomb and environment, the C-space map, the path planning and the operation procedure are introduced in detail. The semi-automatic bomb fetching function has greatly improved the operation performance of the EOD robot.

  5. Research on Semi-Automatic Bomb Fetching for an EOD Robot

    Directory of Open Access Journals (Sweden)

    Zeng Jian-Jun

    2007-06-01

    Full Text Available An EOD robot system, SUPER-PLUS, which has a novel semi-automatic bomb fetching function, is presented in this paper. With limited human support, SUPER-PLUS scans the cluttered environment with a wrist-mounted laser distance sensor and plans a collision-free path for the manipulator to fetch the bomb. The modeling of the manipulator, bomb and environment, the C-space map, the path planning and the operation procedure are introduced in detail. The semi-automatic bomb fetching function has greatly improved the operation performance of the EOD robot.

  6. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error-free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  7. 7th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  8. Genital automatisms: Reappraisal of a remarkable but ignored symptom of focal seizures.

    Science.gov (United States)

    Dede, Hava Özlem; Bebek, Nerses; Gürses, Candan; Baysal-Kıraç, Leyla; Baykan, Betül; Gökyiğit, Ayşen

    2018-03-01

    Genital automatisms (GAs) are uncommon clinical phenomena of focal seizures. They are defined as repeated fondling, grabbing, or scratching of the genitals. The aim of this study was to determine the lateralizing and localizing value and associated clinical characteristics of GAs. Three hundred thirteen consecutive patients with drug-resistant seizures who were referred to our tertiary center for presurgical evaluation between 2009 and 2016 were investigated. The incidence of specific kinds of behavior, clinical semiology, associated symptoms/signs with corresponding ictal electroencephalography (EEG) findings, and their potential role in seizure localization and lateralization were evaluated. Fifteen (4.8%) of 313 patients had GAs. Genital automatisms were identified in 19 (16.4%) of a total 116 seizures. Genital automatisms were observed to occur more often in men than in women (M/F: 10/5). Nine of fifteen patients (60%) had temporal lobe epilepsy (right/left: 4/5) and three (20%) had frontal lobe epilepsy (right/left: 1/2), whereas the remaining two patients could not be classified. One patient was diagnosed as having Rasmussen encephalitis. Genital automatisms were ipsilateral to epileptic focus in 12 patients and contralateral in only one patient according to ictal-interictal EEG and neuroimaging findings. Epileptic focus could not be lateralized in the last 2 patients. Genital automatisms were associated with unilateral hand automatisms such as postictal nose wiping or manual automatisms in 13 (86.7%) of 15 and contralateral dystonia was seen in 6 patients. All patients had amnesia of the performance of GAs. Genital automatisms are more frequent in seizures originating from the temporal lobe, and they can also be seen in frontal lobe seizures. Genital automatisms seem to have a high lateralizing value to the ipsilateral hemisphere and are mostly concordant with other unilateral hand automatisms. Men exhibit GAs more often than women. Copyright © 2017

  9. Semi-supervised learning based probabilistic latent semantic analysis for automatic image annotation

    Institute of Scientific and Technical Information of China (English)

    Tian Dongping

    2017-01-01

    In recent years, the multimedia annotation problem has been attracting significant research attention in the multimedia and computer vision areas, especially automatic image annotation, whose purpose is to provide an efficient and effective searching environment for users to query their images more easily. In this paper, a semi-supervised learning based probabilistic latent semantic analysis (PLSA) model for automatic image annotation is presented. Since it is often hard to obtain or create labeled images in large quantities while unlabeled ones are easier to collect, a transductive support vector machine (TSVM) is exploited to enhance the quality of the training image data. Then, different image features with different magnitudes will result in different performance for automatic image annotation. To this end, a Gaussian normalization method is utilized to normalize different features extracted from effective image regions segmented by the normalized cuts algorithm, so as to preserve the intrinsic content of the images as completely as possible. Finally, a PLSA model with asymmetric modalities is constructed based on the expectation maximization (EM) algorithm to predict a candidate set of annotations with confidence scores. Extensive experiments on the general-purpose Corel5k dataset demonstrate that the proposed model can significantly improve the performance of traditional PLSA for the task of automatic image annotation.
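    The Gaussian normalization step the abstract mentions is, in the usual formulation, per-feature zero-mean unit-variance scaling. A minimal sketch (a generic illustration, not the paper's code; the guard for constant features is an added assumption):

```python
import numpy as np

def gaussian_normalize(features):
    # Gaussian (zero-mean, unit-variance) normalization per feature column,
    # so features with different magnitudes contribute comparably.
    mu = features.mean(axis=0)
    sigma = features.std(axis=0)
    sigma[sigma == 0] = 1.0  # assumption: leave constant features unscaled
    return (features - mu) / sigma
```

After normalization every non-constant feature column has mean 0 and standard deviation 1.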

  10. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  11. Automatic Offline Formulation of Robust Model Predictive Control Based on Linear Matrix Inequalities Method

    Directory of Open Access Journals (Sweden)

    Longge Zhang

    2013-01-01

    Full Text Available Two automatic robust model predictive control strategies are presented for uncertain polytopic linear plants with input and output constraints. A sequence of nested, geometrically proportioned, asymptotically stable ellipsoids and controllers is constructed offline first. Then, in the first strategy, the feedback controllers are automatically selected online with the receding horizon. Finally, a modified automatic offline robust MPC approach is constructed to improve the closed-loop system's performance. The new proposed strategies not only reduce conservatism but also decrease the online computation. Numerical examples are given to illustrate their effectiveness.
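    The online part of such offline-synthesized schemes typically reduces to picking, at each step, the gain attached to the smallest precomputed invariant ellipsoid that still contains the current state. A hedged sketch of that lookup only (the ellipsoid/controller pairs themselves would come from the offline LMI synthesis, which is not shown; ordering largest-to-smallest is an assumption):

```python
import numpy as np

def select_controller(x, ellipsoids):
    # ellipsoids: list of (P, K) pairs ordered from largest to smallest
    # invariant set {x : x^T P x <= 1}; return the gain K of the smallest
    # ellipsoid still containing the state x, or None if x is outside all.
    chosen = None
    for P, K in ellipsoids:
        if float(x @ P @ x) <= 1.0:
            chosen = K  # keep overwriting: later sets are smaller
    return chosen
```

Because the sets are nested, the last containing ellipsoid is the smallest one, which is where the least conservative gain lives.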

  12. Automatic Generation of Heuristics for Scheduling

    Science.gov (United States)

    Morris, Robert A.; Bresina, John L.; Rodgers, Stuart M.

    1997-01-01

    This paper presents a technique, called GenH, that automatically generates search heuristics for scheduling problems. The impetus for developing this technique is the growing consensus that heuristics encode advice that is, at best, useful in solving most, or typical, problem instances, and, at worst, useful in solving only a narrowly defined set of instances. In either case, heuristic problem solvers, to be broadly applicable, should have a means of automatically adjusting to the idiosyncrasies of each problem instance. GenH generates a search heuristic for a given problem instance by hill-climbing in the space of possible multi-attribute heuristics, where the evaluation of a candidate heuristic is based on the quality of the solution found under its guidance. We present empirical results obtained by applying GenH to the real world problem of telescope observation scheduling. These results demonstrate that GenH is a simple and effective way of improving the performance of a heuristic scheduler.
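    The core loop of such a generator can be illustrated by a minimal sketch, not the actual GenH implementation: the cost function, the two task attributes (duration, priority), and all step sizes below are hypothetical stand-ins for the paper's multi-attribute heuristics.

```python
import random

def evaluate(weights, tasks):
    # Hypothetical evaluation: order tasks by their weighted attribute score,
    # then sum priority-weighted completion times (lower is better).
    order = sorted(tasks, key=lambda t: sum(w * a for w, a in zip(weights, t)))
    finish, cost = 0, 0
    for duration, priority in order:
        finish += duration
        cost += finish * priority
    return cost

def gen_heuristic(tasks, n_attrs=2, steps=200, seed=0):
    # Hill-climb in the space of attribute weights, scoring each candidate
    # heuristic by the quality of the schedule found under its guidance.
    rng = random.Random(seed)
    best = [rng.uniform(-1.0, 1.0) for _ in range(n_attrs)]
    best_cost = evaluate(best, tasks)
    for _ in range(steps):
        cand = [w + rng.gauss(0.0, 0.3) for w in best]
        cost = evaluate(cand, tasks)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost
```

The key idea carried over from the abstract is that the search object is the heuristic (the weight vector), not the schedule itself.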

  13. Optical Automatic Car Identification (OACI) : Volume 1. Advanced System Specification.

    Science.gov (United States)

    1978-12-01

    A performance specification is provided in this report for an Optical Automatic Car Identification (OACI) scanner system which features 6% improved readability over existing industry scanner systems. It also includes the analysis and rationale which ...

  14. Automatic sentence extraction for the detection of scientific paper relations

    Science.gov (United States)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnection between scientific papers quickly. By observing the inter-article relationships, researchers can identify, among others, the weaknesses of existing research, performance improvements achieved to date, and tools or data typically used in research in specific fields. So far, methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, while the paper relations are identified based on the citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
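    A basic form of the citation-sentence extraction step can be sketched with regular expressions; this is a generic illustration under assumed citation formats ("[3]" and "(Author, 2010)" styles), not the study's actual extractor:

```python
import re

# Split on sentence-ending punctuation followed by whitespace.
SENT_SPLIT = re.compile(r'(?<=[.!?])\s+')
# Bracketed numeric citations or simple author-year citations (assumed styles).
CITATION = re.compile(r'\[\d+\]|\([A-Z][A-Za-z]+(?: et al\.)?,\s*\d{4}\)')

def citation_sentences(text):
    # Keep only sentences that contain a citation marker.
    return [s for s in SENT_SPLIT.split(text) if CITATION.search(s)]
```

Real paper text needs a more careful sentence splitter (abbreviations, equations), which is precisely the difficulty the abstract alludes to.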

  15. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  16. Automatic Number Plate Recognition System for IPhone Devices

    Directory of Open Access Journals (Sweden)

    Călin Enăchescu

    2013-06-01

    Full Text Available This paper presents a system for automatic number plate recognition, implemented for devices running the iOS operating system. The methods used for number plate recognition are based on existing methods, but optimized for devices with low hardware resources. To solve the task of automatic number plate recognition we have divided it into the following subtasks: image acquisition, localization of the number plate position on the image and character detection. The first subtask is performed by the camera of an iPhone, the second one is done using image pre-processing methods and template matching. For the character recognition we are using a feed-forward artificial neural network. Each of these methods is presented along with its results.
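    Template matching of the kind used above for plate localization is commonly implemented as normalized cross-correlation; the following is a generic, unoptimized sketch (not the paper's implementation, which targets low-resource iOS devices):

```python
import numpy as np

def match_template(image, template):
    # Normalized cross-correlation: slide the template over the image and
    # return the top-left offset with the highest correlation score in [-1, 1].
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw]
            w = w - w.mean()
            denom = t_norm * np.sqrt((w ** 2).sum())
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```

Production systems compute this in the frequency domain or with integral images; the brute-force loop above is only for clarity.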

  17. Coherence measures in automatic time-migration velocity analysis

    International Nuclear Information System (INIS)

    Maciel, Jonathas S; Costa, Jessé C; Schleicher, Jörg

    2012-01-01

    Time-migration velocity analysis can be carried out automatically by evaluating the coherence of migrated seismic events in common-image gathers (CIGs). The performance of gradient methods for automatic time-migration velocity analysis depends on the coherence measures used as the objective function. We compare the results of four different coherence measures: conventional semblance, differential semblance, an extended differential semblance using differences of more distant image traces, and the product of the latter with conventional semblance. In our numerical experiments, the objective functions based on conventional semblance and on the product of conventional semblance with extended differential semblance provided the best velocity models, as evaluated by the flatness of the resulting CIGs. The method can be easily extended to anisotropic media. (paper)
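    Conventional semblance, the first of the measures compared, is the ratio of stacked energy to total energy across traces; a minimal sketch (assuming the gather is already moveout-corrected, with rows as time samples and columns as traces):

```python
import numpy as np

def semblance(gather):
    # gather: 2D array (time samples x traces) after moveout correction.
    # Conventional semblance: stacked energy over total energy, in [0, 1];
    # identical traces give 1, incoherent traces give values near 1/N.
    num = np.sum(gather.sum(axis=1) ** 2)
    den = gather.shape[1] * np.sum(gather ** 2)
    return num / den
```

A correct migration velocity flattens events across the gather, which drives this ratio toward 1.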

  18. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1, of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  19. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    International Nuclear Information System (INIS)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya

    1999-02-01

    Development of operation support systems, such as automatic operating systems and anomaly diagnosis systems for the nuclear reactor, is very important in a practical nuclear ship because of the limited number of operators and the severe conditions, under which receiving support from others in the case of an accident is very difficult. The goal of the development of the operation support systems is to realize a perfect automatic control system for the whole sequence of normal operation, from reactor start-up to shutdown. The automatic control system for normal operation has been developed based on the operating experience of the first Japanese nuclear ship 'Mutsu'. The automation technique was verified against 'Mutsu' plant data from manual operation. Fully automatic control of the start-up and shutdown operations was achieved by setting the desired value of operation and the limiting value of parameter fluctuation, and by making the operation program of the principal equipment such as the main coolant pump and the heaters. This report presents the automatic operation system developed for the start-up and shutdown of the reactor and the verification of the system using the Nuclear Ship Engineering Simulator System. (author)
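    The combination of a desired value and a limiting value of parameter fluctuation amounts, in its simplest form, to a rate-limited setpoint ramp. A hedged sketch of that idea only (not the actual 'Mutsu' control program; names and units are hypothetical):

```python
def ramp_setpoint(current, target, max_step):
    # Move the controlled parameter toward the desired value while limiting
    # the per-step fluctuation to max_step in either direction.
    step = max(-max_step, min(max_step, target - current))
    return current + step
```

Repeated application walks the parameter to the target in bounded increments, which is what makes an unattended start-up sequence safe to program.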

  20. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    Science.gov (United States)

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure for recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]. These methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods require user input prior to the automatic segmentation, such as [3] and [4], and are inherently different from our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other examples in publication. PMID:25769273
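    The guided-proofreading idea of walking recommended superpixel merges and applying only confirmed ones can be sketched with a small union-find structure; this is a generic illustration of the workflow, not the paper's algorithm or data format:

```python
def resolve_merges(recommended, confirm):
    # recommended: list of (superpixel_a, superpixel_b, confidence) merges.
    # Visit merges from most to least confident; apply only those the user
    # (the confirm callback) accepts, tracking segments with union-find.
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    applied = []
    for a, b, confidence in sorted(recommended, key=lambda m: -m[2]):
        if confirm(a, b, confidence):
            parent[find(a)] = find(b)
            applied.append((a, b))
    return applied, find
```

The returned `find` closure lets a caller ask whether two superpixels ended up in the same segment after the confirmed merges.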

  1. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  2. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  3. Automatic vertebral identification using surface-based registration

    Science.gov (United States)

    Herring, Jeannette L.; Dawant, Benoit M.

    2000-06-01

    This work introduces an enhancement to currently existing methods of intra-operative vertebral registration by allowing the portion of the spinal column surface that correctly matches a set of physical vertebral points to be automatically selected from several possible choices. Automatic selection is made possible by the shape variations that exist among lumbar vertebrae. In our experiments, we register vertebral points representing physical space to spinal column surfaces extracted from computed tomography images. The vertebral points are taken from the posterior elements of a single vertebra to represent the region of surgical interest. The surface is extracted using an improved version of the fully automatic marching cubes algorithm, which results in a triangulated surface that contains multiple vertebrae. We find the correct portion of the surface by registering the set of physical points to multiple surface areas, including all vertebral surfaces that potentially match the physical point set. We then compute the standard deviation of the surface error for the set of points registered to each vertebral surface that is a possible match, and the registration that corresponds to the lowest standard deviation designates the correct match. We have performed our current experiments on two plastic spine phantoms and one patient.
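    The selection rule described above, pick the registration whose surface error has the lowest standard deviation, is easy to sketch once per-surface point-to-surface distances are available (computing those distances is the registration step itself and is not shown; the input format here is an assumption):

```python
import numpy as np

def best_surface(point_errors_per_surface):
    # point_errors_per_surface: {surface_id: array of point-to-surface
    # distances after registering the physical points to that surface}.
    # The candidate with the lowest standard deviation of surface error
    # designates the correct vertebra.
    stds = {sid: float(np.std(err))
            for sid, err in point_errors_per_surface.items()}
    return min(stds, key=stds.get), stds
```

Using the spread of residuals rather than their mean reflects the intuition that the correct vertebra fits consistently, while a wrong one fits some points well and others badly.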

  4. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  5. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, with major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed, in a simple and quick way, for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve the classification results, using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified that automatic 3D...
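    A point-based hierarchical-rule classifier of the general kind described above can be sketched as a cascade of tests per point; every threshold and attribute below is a hypothetical placeholder, not a value from the study:

```python
def classify_point(z, z_ground, planarity, n_returns):
    # Hypothetical hierarchical rules: decide ground first, then separate
    # vegetation from building roofs using planarity and multiple returns.
    height = z - z_ground
    if height < 0.2:                       # close to the ground surface
        return "ground"
    if n_returns > 1 or planarity < 0.6:   # canopy scatters the pulse
        return "vegetation"
    if height > 2.0 and planarity >= 0.6:  # elevated planar surface
        return "building"
    return "unclassified"
```

The hierarchy matters: each rule only has to discriminate among the points that survived the earlier ones, which keeps the individual tests simple.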

  6. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    Full Text Available LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, with major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed, in a simple and quick way, for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve the classification results, using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified...

  7. Using activity-related behavioural features towards more effective automatic stress detection.

    Directory of Open Access Journals (Sweden)

    Dimitris Giakoumis

    Full Text Available This paper introduces activity-related behavioural features that can be automatically extracted from a computer system, with the aim to increase the effectiveness of automatic stress detection. The proposed features are based on processing of appropriate video and accelerometer recordings taken from the monitored subjects. For the purposes of the present study, an experiment was conducted that utilized a stress-induction protocol based on the Stroop colour word test. Video, accelerometer and biosignal (Electrocardiogram and Galvanic Skin Response) recordings were collected from nineteen participants. Then, an explorative study was conducted by following a methodology mainly based on spatiotemporal descriptors (Motion History Images) that are extracted from video sequences. A large set of activity-related behavioural features, potentially useful for automatic stress detection, were proposed and examined. Experimental evaluation showed that several of these behavioural features significantly correlate to self-reported stress. Moreover, it was found that the use of the proposed features can significantly enhance the performance of typical automatic stress detection systems, commonly based on biosignal processing.
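    A Motion History Image, the spatiotemporal descriptor named above, is built by a simple per-frame update: moving pixels are stamped with the current (maximum) timestamp while the rest decay. A minimal sketch of the standard update rule (the motion mask would come from frame differencing, not shown):

```python
import numpy as np

def update_mhi(mhi, motion_mask, tau=10):
    # Standard MHI update: pixels where motion occurred are set to the
    # maximum timestamp tau; elsewhere the history decays by one step,
    # so bright regions in the MHI encode recent movement.
    return np.where(motion_mask, tau, np.maximum(mhi - 1, 0))
```

Features such as the amount and spatial spread of recent motion can then be read off the MHI array.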

  8. PACS quality control and automatic problem notifier

    Science.gov (United States)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition and contrast adjustment. The results of selected quality control reports will be presented. The intranet documentation server will be described, along with the automatic pager system. Monitor quality control reports will be described, and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established...
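    The notifier pattern described above, run health checks and page with a diagnostic code on failure, can be sketched as follows; the check names, codes, and paging callback are hypothetical, not the University of Florida system:

```python
def check_and_page(checks, send_page):
    # checks: {diagnostic_code: zero-argument health check returning truthy
    # on success}. Run every check (active processes, file systems, log
    # errors, uptime); page with the diagnostic code when one fails or raises.
    failures = []
    for code, check in checks.items():
        try:
            ok = bool(check())
        except Exception:
            ok = False  # a crashing check is itself a failure
        if not ok:
            send_page(code)
            failures.append(code)
    return failures
```

Keying pages by a short diagnostic code is what lets on-call staff look up the matching troubleshooting procedure on the intranet server.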

  9. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
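The dual-number mechanism behind forward-mode automatic differentiation can be sketched in a few lines. This is a generic illustration in Python, not the report's FORTRAN source; the class and function names are invented for the example.

```python
# Forward-mode automatic differentiation via dual numbers: each value
# carries its derivative, propagated exactly by the chain rule, so the
# result has no truncation error (as the abstract notes).
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val   # function value
        self.der = der   # derivative value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).der

# d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Extending the same pattern with `__sub__`, `__truediv__`, and elementary functions (sin, exp, ...) covers any function "expressible as combinations of elementary functions", as the abstract puts it.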

  10. Artificial neural network controller for automatic ship berthing using head-up coordinate system

    Directory of Open Access Journals (Sweden)

    Nam-Kyun Im

    2018-05-01

    Full Text Available The Artificial Neural Network (ANN) model has been known as one of the most effective theories for automatic ship berthing, as it has learning ability and mimics the actions of the human brain when performing the stages of ship berthing. However, existing ANN controllers can only bring a ship into a berth in a certain port, where the inputs of the ANN are the same as those of the teaching data. This means that those ANN controllers must be retrained when the ship arrives at a new port, which is time-consuming and costly. In this research, by using the head-up coordinate system, which includes the relative bearing and distance from the ship to the berth, a novel ANN controller is proposed to automatically control the ship into the berth in different ports without retraining the ANN structure. Numerical simulations were performed to verify the effectiveness of the proposed controller. First, teaching data were created in the original port to train the neural network; then, the controller was tested for automatic berthing in other ports, where the initial conditions of the inputs in the head-up coordinate system were similar to those of the teaching data in the original port. The results showed that the proposed controller has good performance for ship berthing in ports. Keywords: Automatic ship berthing, ANN controller, Head-up coordinate system, Low speed, Relative bearing

  11. Analysis of Phonetic Transcriptions for Danish Automatic Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    2013-01-01

    Automatic speech recognition (ASR) relies on three resources: audio, orthographic transcriptions and a pronunciation dictionary. The dictionary or lexicon maps orthographic words to sequences of phones or phonemes that represent the pronunciation of the corresponding word. The quality of a speech....... The analysis indicates that transcribing e.g. stress or vowel duration has a negative impact on performance. The best performance is obtained with coarse phonetic annotation, which improves performance by 1% in word error rate and 3.8% in sentence error rate....
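The word error rate (WER) used above to compare lexicons is the edit distance between the recognized and reference word sequences, normalized by the reference length. A minimal sketch, not the study's evaluation code:

```python
# Word error rate: Levenshtein distance (substitutions, insertions,
# deletions) between reference and hypothesis words, divided by the
# number of reference words.
def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the cat sat"))  # 0.0
print(wer("the cat sat", "the mat sat"))  # one substitution -> 1/3
```

Sentence error rate, the other metric named, is simply the fraction of sentences with a nonzero WER.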

  12. Go with the flow: how the consideration of joy versus pride influences automaticity.

    Science.gov (United States)

    Katzir, Maayan; Ori, Bnaya; Eyal, Tal; Meiran, Nachshon

    2015-02-01

    Recently, we have shown that the consideration of joy, without the actual experience of the emotion, impaired performance on the antisaccade task (Katzir, Eyal, Meiran, & Kessler, 2010). We interpreted this finding as indicating inhibitory control failure. However, impaired antisaccade performance may result from either the weakening of inhibitory control, the potentiation of the competing automatic response, or both. In the current research we used a task switching paradigm, which allowed us to assess cognitive control more directly, using Backward Inhibition, Competitor Rule Suppression, and Competitor Rule Priming as cognitive-control indices as well as assessing the Task Rule Congruency Effect (TRCE) which, like the antisaccade, is influenced by both control and automaticity. We found that considering joy compared to pride did not influence any of the cognitive control indices but increased the TRCE. We interpret this finding as evidence that joy consideration leads to increased reliance on automatic tendencies, such as short-term desires. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automatic systems have brought many revolutions to existing technologies. One such technology, which has seen considerable development, is the solar powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis, and using it in an automatic manner reduces manpower. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and then runs as a continuous process. The magnetic contactor acts as a switch connected to the 10-hour timer, controlling the activation or termination of electrical loads; it is powered by a solar panel producing electrical power and a rechargeable battery in electrical communication with the solar panel for storing that power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  14. ERPs reveal deficits in automatic cerebral stimulus processing in patients with NIDDM.

    Science.gov (United States)

    Vanhanen, M; Karhu, J; Koivisto, K; Pääkkönen, A; Partanen, J; Laakso, M; Riekkinen, P

    1996-11-04

    We compared auditory event-related potentials (ERPs) and neuropsychological test scores in nine patients with non-insulin-dependent diabetes mellitus (NIDDM) and in nine control subjects. The measures of automatic stimulus processing, habituation of auditory N100 and mismatch negativity (MMN) were impaired in patients. No differences were observed in the N2b and P3 components, which presumably reflect conscious cognitive analysis of the stimuli. A trend towards impaired performance in the Digit Span backward was found in diabetic subjects, but in the tests of secondary or long-term memory the groups were comparable. Patients with NIDDM may have defects in arousal and in the automatic ability to redirect attention, which can affect their cognitive performance.

  15. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany – which is a forum to discuss the latest advancements in the parallel tools.

  16. Region descriptors for automatic classification of small sea targets in infrared video

    NARCIS (Netherlands)

    Mouthaan, M.M.; Broek, S.P. van den; Hendriks, E.A.; Schwering, P.B.W.

    2011-01-01

    We evaluate the performance of different key-point detectors and region descriptors when used for automatic classification of small sea targets in infrared video. In our earlier research performed on this subject as well as in other literature, many different region descriptors have been proposed.

  17. Nature Conservation Drones for Automatic Localization and Counting of Animals

    NARCIS (Netherlands)

    van Gemert, J.C.; Verschoor, C.R.; Mettes, P.; Epema, K.; Koh, L.P.; Wich, S.; Agapito, L.; Bronstein, M.M.; Rother, C.

    2015-01-01

    This paper is concerned with nature conservation by automatically monitoring animal distribution and animal abundance. Typically, such conservation tasks are performed manually on foot or after an aerial recording from a manned aircraft. Such manual approaches are expensive, slow and labor intensive.

  18. Production of 99mTc generators with automatic elution

    International Nuclear Information System (INIS)

    Mengatti, J.; Yanagawa, S.T.I.; Mazzarro, E.; Gasiglia, H.T.; Rela, P.R.; Silva, C.P.G. da; Pereira, N.P.S. de.

    1983-10-01

    The improvements made to the routine production of 99mTc generators at the Instituto de Pesquisas Energeticas e Nucleares-CNEN/SP are described. The old model generators (manual elution of 99mTc) were replaced by automatically eluted generators (vacuum system). The alumina column, elution system and accessories were modified; the elution time was reduced from 60 to 20-30 seconds. The new generators give 80-90% elution yields using six milliliters of 0.9% sodium chloride as the 99mTc eluant, instead of the 10 milliliters needed to elute the old generators. The radioactive concentrations are therefore now 70% higher. The radioactive, radiochemical, chemical and microbiological criteria were examined for the 99mTc solutions. Like the old generators, the automatic generators were considered safe for medical purposes. (Author) [pt

  19. Do Judgments of Learning Predict Automatic Influences of Memory?

    Science.gov (United States)

    Undorf, Monika; Böhm, Simon; Cüpper, Lutz

    2016-01-01

    Current memory theories generally assume that memory performance reflects both recollection and automatic influences of memory. Research on people's predictions about the likelihood of remembering recently studied information on a memory test, that is, on judgments of learning (JOLs), suggests that both magnitude and resolution of JOLs are linked…

  20. Automatic classification of journalistic documents on the Internet

    Directory of Open Access Journals (Sweden)

    Elias OLIVEIRA

    Full Text Available Abstract Online journalism is increasing every day. There are many news agencies, newspapers, and magazines using digital publication in the global network. Documents published online are available to users, who use search engines to find them. In order to deliver documents that are relevant to the search, they must be indexed and classified. Due to the vast number of documents published online every day, a lot of research has been carried out to find ways to facilitate automatic document classification. The objective of the present study is to describe an experimental approach for the automatic classification of journalistic documents published on the Internet using the Vector Space Model for document representation. The model was tested based on a real journalism database, using algorithms that have been widely reported in the literature. This article also describes the metrics used to assess the performance of these algorithms and their required configurations. The results obtained show the efficiency of the method used and justify further research to find ways to facilitate the automatic classification of documents.
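The Vector Space Model mentioned above represents each document as a weighted term vector and compares documents by cosine similarity. A minimal TF-IDF nearest-neighbour sketch follows; the corpus, labels, and classification rule here are invented for illustration and are not those of the study:

```python
# Vector Space Model: documents as TF-IDF vectors, an unlabelled
# document classified by its cosine similarity to labelled examples.
import math
from collections import Counter

def tfidf_vectors(docs):
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    df = Counter(term for toks in tokenized for term in set(toks))
    idf = {t: math.log(n / df[t]) for t in df}  # rare terms weigh more
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: tf[t] * idf[t] for t in tf})
    return vectors

def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

docs = ["election results announced today",
        "team wins the championship game",
        "voters go to polls in election"]
labels = ["politics", "sports", None]          # classify the last one
vecs = tfidf_vectors(docs)
best = max(range(2), key=lambda i: cosine(vecs[2], vecs[i]))
print(labels[best])  # politics: shares the discriminative term "election"
```

Real systems add stemming, stop-word removal, and stronger classifiers, but the vector representation is the same.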

  1. A manual-control approach to development of VTOL automatic landing technology.

    Science.gov (United States)

    Kelly, J. R.; Niessen, F. R.; Garren, J. F., Jr.

    1973-01-01

    The operation of VTOL aircraft in the city-center environment will require complex landing-approach trajectories that insure adequate clearance from other traffic and obstructions and provide the most direct routing for efficient operations. As part of a larger program to develop the necessary technology base, a flight investigation was undertaken to study the problems associated with manual and automatic control of steep, decelerating instrument approaches and landings. The study employed a three-cue flight director driven by control laws developed and refined during manual-control studies and subsequently applied to the automatic approach problem. The validity of this approach was demonstrated by performing the first automatic approach and landings to a predetermined spot ever accomplished with a helicopter. The manual-control studies resulted in the development of a constant-attitude deceleration profile and a low-noise navigation system.

  2. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  3. IADE: a system for intelligent automatic design of bioisosteric analogs

    Science.gov (United States)

    Ertl, Peter; Lewis, Richard

    2012-11-01

    IADE, a software system supporting molecular modellers through the automatic design of non-classical bioisosteric analogs, scaffold hopping and fragment growing, is presented. The program combines sophisticated cheminformatics functionalities for constructing novel analogs and filtering them based on their drug-likeness and synthetic accessibility using automatic structure-based design capabilities: the best candidates are selected according to their similarity to the template ligand and to their interactions with the protein binding site. IADE works in an iterative manner, improving the fitness of designed molecules in every generation until structures with optimal properties are identified. The program frees molecular modellers from routine, repetitive tasks, allowing them to focus on analysis and evaluation of the automatically designed analogs, considerably enhancing their work efficiency as well as the area of chemical space that can be covered. The performance of IADE is illustrated through a case study of the design of a nonclassical bioisosteric analog of a farnesyltransferase inhibitor—an analog that has won a recent "Design a Molecule" competition.

  4. Quality assurance using outlier detection on an automatic segmentation method for the cerebellar peduncles

    Science.gov (United States)

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed to assess the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using a leave-one-out cross validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
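The first, box-whisker method above flags failures as univariate outliers beyond the Tukey fences Q1 - 1.5*IQR and Q3 + 1.5*IQR. A generic sketch of that rule (the feature values are invented; this is not the paper's implementation):

```python
# Tukey box-whisker outlier rule: values outside
# [Q1 - k*IQR, Q3 + k*IQR] with k = 1.5 are flagged.
def quartiles(xs):
    s = sorted(xs)
    def percentile(p):
        k = (len(s) - 1) * p
        lo, hi = int(k), min(int(k) + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * (k - lo)  # linear interpolation
    return percentile(0.25), percentile(0.75)

def tukey_outliers(xs, k=1.5):
    q1, q3 = quartiles(xs)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in xs if x < lo or x > hi]

# A hypothetical per-subject feature (e.g. a segmented-volume measure);
# the extreme value is flagged as a likely segmentation failure.
scores = [10.1, 9.8, 10.4, 10.0, 9.9, 10.2, 47.0]
print(tukey_outliers(scores))  # [47.0]
```

The paper's second method replaces this univariate rule with trained classifiers (LDA, LR, SVM, RFC) over the same features.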

  5. Automatic contact algorithm in DYNA3D for crashworthiness and impact problems

    International Nuclear Information System (INIS)

    Whirley, Robert G.; Engelmann, Bruce E.

    1994-01-01

    This paper presents a new approach for the automatic definition and treatment of mechanical contact in explicit non-linear finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational cost. Key aspects of the proposed new method include automatic identification of adjacent and opposite surfaces in the global search phase, and the use of a well-defined surface normal which allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. The paper concludes with three examples which illustrate the performance of the newly proposed algorithm in the public DYNA3D code. ((orig.))

  6. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic temperature control instruments. 77... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  7. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state space analysis of electric systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, procedures for drawing the root locus, frequency response, and the design of control systems.

  8. The Origins of Belief Representation: Monkeys Fail to Automatically Represent Others’ Beliefs

    Science.gov (United States)

    Martin, Alia; Santos, Laurie R.

    2014-01-01

    Young infants’ successful performance on false belief tasks has led several researchers to argue that there may be a core knowledge system for representing the beliefs of other agents, emerging early in human development and constraining automatic belief processing into adulthood. One way to investigate this purported core belief representation system is to examine whether non-human primates share such a system. Although non-human primates have historically performed poorly on false belief tasks that require executive function capacities, little work has explored how primates perform on more automatic measures of belief processing. To get at this issue, we modified Kovács et al. (2010)’s test of automatic belief representation to examine whether one non-human primate species—the rhesus macaque (Macaca mulatta)—is automatically influenced by another agent’s beliefs when tracking an object’s location. Monkeys saw an event in which a human agent watched an apple move back and forth between two boxes and an outcome in which one box was revealed to be empty. By occluding segments of the apple’s movement from either the monkey or the agent, we manipulated both the monkeys’ belief (true or false) and agent’s belief (true or false) about the final location of the apple. We found that monkeys looked longer at events that violated their own beliefs than at events that were consistent with their beliefs. In contrast to human infants, however, monkeys’ expectations were not influenced by another agent’s beliefs, suggesting that belief representation may be an aspect of core knowledge unique to humans. PMID:24374209

  9. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

    An automatic handling device for the steam relief valves (SRVs) is developed in order to decrease worker exposure, increase the availability factor, improve reliability, improve operational safety, and save labor. A survey is made during a periodical inspection to examine the actual SRV handling operation. The SRV automatic handling device consists of four components: conveyor, armed conveyor, lifting machine, and control/monitoring system. The conveyor is designed so that the existing I-rail installed in the containment vessel can be used without any modification; it conveys an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. Using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, operation panel, TV monitor and annunciator. The SRV handling device is operated by remote control from a control room. A trial unit is constructed and performance/function testing is carried out using actual SRVs. As a result, it is shown that the SRV handling device serves satisfactorily with only two operators. The time required for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  10. Automatic exchange unit for control rod drive device

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To enable automatic operation to restart and continue, without an external device to remedy power interruptions, when power is recovered after being interrupted during automatic positioning operation. Constitution: In an automatic exchange unit for a control rod drive device, of the control type that drives the deviation between the positioning target position and the present position of the device to zero, the position data of the drive device for the positioning target is automatically read, and an operation-inhibit interlock is applied to the control system until the data reading is completed and the conditions for starting or restarting automatic operation are sequentially confirmed. After confirmation, the interlock is released to start or restart the automatic operation. Accordingly, the automatic operation can be safely restarted and continued. (Yoshihara, H.)

  11. Facilitating coronary artery evaluation in MDCT using a 3D automatic vessel segmentation tool

    International Nuclear Information System (INIS)

    Fawad Khan, M.; Gurung, Jessen; Maataoui, Adel; Brehmer, Boris; Herzog, Christopher; Vogl, Thomas J.; Wesarg, Stefan; Dogan, Selami; Ackermann, Hanns; Assmus, Birgit

    2006-01-01

    The purpose of this study was to investigate a 3D coronary artery segmentation algorithm using 16-row MDCT data sets. Fifty patients underwent cardiac CT (Sensation 16, Siemens) and coronary angiography. Automatic and manual detection of coronary artery stenosis was performed. A 3D coronary artery segmentation algorithm (Fraunhofer Institute for Computer Graphics, Darmstadt) was used for automatic evaluation. All significant stenoses (>50%) in vessels >1.5 mm in diameter were protocoled. Each detection tool was used by one reader who was blinded to the results of the other detection method and the results of coronary angiography. Sensitivity and specificity were determined for automatic and manual detection as well as was the time for both CT-based evaluation methods. The overall sensitivity and specificity of the automatic and manual approach were 93.1 vs. 95.83% and 86.1 vs. 81.9%. The time required for automatic evaluation was significantly shorter than with the manual approach, i.e., 246.04±43.17 s for the automatic approach and 526.88±45.71 s for the manual approach (P<0.0001). In 94% of the coronary artery branches, automatic detection required less time than the manual approach. Automatic coronary vessel evaluation is feasible. It reduces the time required for cardiac CT evaluation with similar sensitivity and specificity as well as facilitates the evaluation of MDCT coronary angiography in a standardized fashion. (orig.)

  12. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Routine use of quantitative three-dimensional analysis of material microstructure by, in particular, focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice or by the quality of manual and automatic...... segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to capture the phase boundaries. Vector fields derived from...

  13. An automatic device for the quality control of large-scale crystal production

    CERN Document Server

    Baccaro, S; Castellani, M; Cecilia, A; Dafinei, I; Diemoz, M; Guerra, S; Longo, E; Montecchi, M; Organtini, G; Pellegrini, F

    2001-01-01

    In 1999, the construction of the electromagnetic calorimeter of the Compact Muon Solenoid (CMS) experiment started. Half of the barrel calorimeter made of 61200 lead tungstate (PWO) crystals will be assembled and tested in the Regional Centre of INFN-ENEA in Rome, Italy. Before assembling, all 30600 PWO crystals will be qualified for scintillation and radiation hardness characteristics by a specially built Automatic Crystal Control System. The measuring techniques for crystal qualification and performances of the automatic system will be discussed in this work. (11 refs).

  14. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutches and brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid, stop position control of automatic guided vehicles, stacker cranes, and automatic transfer control.

  15. Automatic segmentation of the left ventricle in a cardiac MR short axis image using blind morphological operation

    Science.gov (United States)

    Irshad, Mehreen; Muhammad, Nazeer; Sharif, Muhammad; Yasmeen, Mussarat

    2018-04-01

    Conventionally, cardiac MR image analysis is done manually. Automatic examination can replace the monotonous task of analyzing massive amounts of data to assess the global and regional function of the cardiac left ventricle (LV). This task is performed using MR images to calculate analytic cardiac parameters such as end-systolic volume, end-diastolic volume, ejection fraction, and myocardial mass. These analytic parameters depend upon genuine delineation of epicardial, endocardial, papillary muscle, and trabeculation contours. In this paper, we propose an automatic segmentation method using the sum of absolute differences technique to localize the left ventricle. Blind morphological operations are proposed to automatically segment and detect the LV contours of the epicardium and endocardium. We test the benchmark Sunnybrook dataset for evaluation of the proposed work. Contours of the epicardium and endocardium are compared quantitatively to determine contour accuracy, and high matching values are observed. The similarity (overlap) of the automatic examination with the ground truth analysis given by an expert is high, with an index value of 91.30%. The proposed automatic segmentation method gives better performance than existing techniques in terms of accuracy.
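Sum-of-absolute-differences (SAD) localization, the matching criterion named above, slides a template over the image and keeps the offset with the smallest SAD. A minimal 2D sketch on plain lists (the image and template are toy values, not cardiac data):

```python
# SAD template matching: the best match is the window position that
# minimizes the sum of absolute pixel differences with the template.
def sad(block_a, block_b):
    return sum(abs(a - b) for ra, rb in zip(block_a, block_b)
                          for a, b in zip(ra, rb))

def locate(image, template):
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), None
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            window = [row[x:x + tw] for row in image[y:y + th]]
            score = sad(window, template)
            if score < best:
                best, best_pos = score, (y, x)
    return best_pos

image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
print(locate(image, template))  # (1, 1): exact match, SAD = 0
```

In practice the template would be an LV appearance model and the search restricted to a region of interest, but the criterion is the same.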

  16. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  17. Color Image Segmentation Based on Different Color Space Models Using Automatic GrabCut

    Directory of Open Access Journals (Sweden)

    Dina Khattab

    2014-01-01

    Full Text Available This paper presents a comparative study using different color spaces to evaluate the performance of color image segmentation using the automatic GrabCut technique. GrabCut is considered as one of the semiautomatic image segmentation techniques, since it requires user interaction for the initialization of the segmentation process. The automation of the GrabCut technique is proposed as a modification of the original semiautomatic one in order to eliminate the user interaction. The automatic GrabCut utilizes the unsupervised Orchard and Bouman clustering technique for the initialization phase. Comparisons with the original GrabCut show the efficiency of the proposed automatic technique in terms of segmentation, quality, and accuracy. As no explicit color space is recommended for every segmentation problem, automatic GrabCut is applied with RGB, HSV, CMY, XYZ, and YUV color spaces. The comparative study and experimental results using different color images show that RGB color space is the best color space representation for the set of the images used.
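Running the same segmentation in several color spaces, as the comparison above does, only requires a per-pixel conversion first. A sketch using Python's standard library for RGB to HSV, with CMY as the simple per-channel complement; this illustrates the conversions only, not the GrabCut pipeline itself:

```python
# Per-pixel color space conversions used as preprocessing before
# segmentation. Channels are normalized to [0, 1].
import colorsys

def rgb_to_hsv(r, g, b):
    return colorsys.rgb_to_hsv(r, g, b)

def rgb_to_cmy(r, g, b):
    # CMY is the subtractive complement of RGB
    return (1 - r, 1 - g, 1 - b)

print(rgb_to_hsv(1.0, 0.0, 0.0))  # pure red -> (0.0, 1.0, 1.0)
print(rgb_to_cmy(1.0, 0.0, 0.0))  # -> (0.0, 1.0, 1.0), its cyan complement
```

XYZ and YUV, also tested in the study, are linear matrix transforms of RGB and follow the same per-pixel pattern.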

  18. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

    A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, the chi-square test, and sample counting. Output data are printed by the teletypewriter on standard continuous-roll or multifold paper. Data are automatically corrected for background and counter efficiency.

  19. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previously working software functions have not been compromised. The goal is to prevent software regression, whereby adding new features introduces software bugs. As the NRT is performed repeatedly, it consumes considerable time and human resources during the development period of a code and may delay the development schedule. To reduce cost, save human resources and prevent wasted time, non-regression tests need to be automated. As the tool for developing the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).

  20. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previously working software functions have not been compromised. The goal is to prevent software regression, whereby adding new features introduces software bugs. As the NRT is performed repeatedly, it consumes considerable time and human resources during the development period of a code and may delay the development schedule. To reduce cost, save human resources and prevent wasted time, non-regression tests need to be automated. As the tool for developing the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).
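The heart of such an automatic NRT can be sketched in a few lines: store reference results from a trusted code version, rerun after each change, and flag any numerical drift. The variable names and tolerance below are hypothetical, not those of the SPACE code or the actual VBA tool:

```python
def non_regression_check(reference, current, rel_tol=1e-6):
    """Compare a new run's outputs against stored reference results;
    return the list of quantities that drifted beyond tolerance."""
    failures = []
    for key, ref_val in reference.items():
        cur_val = current.get(key)
        if cur_val is None:
            failures.append((key, "missing in current run"))
        elif abs(cur_val - ref_val) > rel_tol * max(abs(ref_val), 1e-30):
            failures.append((key, f"ref={ref_val} cur={cur_val}"))
    return failures

# Hypothetical thermal-hydraulic outputs from two code versions
reference = {"peak_clad_temp_K": 618.4, "break_flow_kg_s": 12.75}
current = {"peak_clad_temp_K": 618.4000004, "break_flow_kg_s": 12.75}
failures = non_regression_check(reference, current)
```

An empty `failures` list means the update did not regress any of the tracked quantities; anything else would be reported to the developer instead of requiring a manual comparison.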

  1. Automatic segmentation of time-lapse microscopy images depicting a live Dharma embryo.

    Science.gov (United States)

    Zacharia, Eleni; Bondesson, Maria; Riu, Anne; Ducharme, Nicole A; Gustafsson, Jan-Åke; Kakadiaris, Ioannis A

    2011-01-01

    Biological inferences about the toxicity of chemicals reached during experiments on the zebrafish Dharma embryo can be greatly affected by the analysis of the time-lapse microscopy images depicting the embryo. Among the stages of image analysis, automatic and accurate segmentation of the Dharma embryo is the most crucial and challenging. In this paper, an accurate and automatic approach for the segmentation of Dharma embryo data obtained by fluorescent time-lapse microscopy is proposed. Experiments performed on four stacks of 3D images over time have shown promising results.

  2. Automatic setting of the distance between sample and detector in gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Andeweg, A.H.

    1980-01-01

    An apparatus has been developed that automatically sets the distance from the sample to the detector according to the radioactivity of the sample. The distance-setting unit works in conjunction with an automatic sample changer, and is interconnected with other components so that the counting head automatically moves to the optimum distance for the analysis of a particular sample. The distance, which is indicated digitally in increments of 0.01 mm, can be set between 18 and 995 mm at count rates that can be preset between 1000 and 10 000 counts per second. On being tested, the instrument performed well within the desired range and accuracy. Under routine conditions, the spectra were much more accurate than before, especially when samples of different radioactivity were counted.

  3. Automatic alignment device for focal spot measurements in the center of the field for mammography

    International Nuclear Information System (INIS)

    Vieira, Marcelo A.C.; Watanabe, Alex O.; Oliveira Junior, Paulo D.; Schiabel, Homero

    2010-01-01

    Some quality control procedures used for mammography, such as focal spot evaluation, require previous alignment of the measurement equipment with the X-ray central beam. However, alignment procedures are, in general, the most difficult and time-consuming task, and the operator is sometimes exposed to radiation while performing them. This work presents an automatic alignment system for mammographic equipment that locates the central ray of the radiation beam and immediately aligns with it by moving itself automatically along the field. The system consists of a bidirectional moving device connected to a CCD sensor for digital radiographic image acquisition. A computational analysis of a radiographic image, acquired at any position in the field, is performed in order to determine its position under the X-ray beam. Finally, a mechanical system with two directions of movement, electronically controlled by a microcontroller over a USB connection, aligns the system automatically with the central ray of the radiation beam. The alignment process is fully automatic, fast and accurate, with no operator exposure to radiation, which allows considerable time saving when performing quality control procedures for mammography. (author)
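The image-analysis step that locates the central ray can be approximated by an intensity-weighted centroid of the exposure. This toy version assumes a small grayscale array rather than the system's actual CCD frames and processing:

```python
def beam_center(image):
    """Intensity-weighted centroid of an exposure: a simple estimate of
    where the radiation field's central ray hits the detector."""
    total = sum(v for row in image for v in row)
    cy = sum(y * v for y, row in enumerate(image) for v in row) / total
    cx = sum(x * v for row in image for x, v in enumerate(row)) / total
    return cx, cy

# Toy 5x3 grayscale exposure with the beam peaking at column 2, row 1
image = [
    [0, 1, 2, 1, 0],
    [1, 3, 8, 3, 1],
    [0, 1, 2, 1, 0],
]
cx, cy = beam_center(image)
```

The offset between this centroid and the sensor's own center would then drive the two-axis positioning stage until the offset vanishes.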

  4. Intentional and Automatic Numerical Processing as Predictors of Mathematical Abilities in Primary School Children

    Directory of Open Access Journals (Sweden)

    Violeta ePina

    2015-03-01

    Full Text Available Previous studies have suggested that numerical processing relates to mathematical performance, but it seems that such a relationship is more evident for intentional than for automatic numerical processing. In the present study we assessed the relationship between the two types of numerical processing and specific mathematical abilities in a sample of 109 children in grades 1 to 6. Participants were tested on a wide range of mathematical tests and also performed both a numerical and a size comparison task. The results showed that numerical processing related to mathematical performance only when inhibitory control was involved in the comparison tasks. Concretely, we found that intentional numerical processing, as indexed by the numerical distance effect in the numerical comparison task, was related to mathematical reasoning skills only when the task-irrelevant dimension (the physical size) was incongruent; whereas automatic numerical processing, indexed by the congruency effect in the size comparison task, was related to mathematical calculation skills only when digits were separated by a small distance. The observed double dissociation highlights the relevance of both intentional and automatic numerical processing in mathematical skills, but only when inhibitory control is also involved.

  5. Development of remote automatic equipment for BWR power plants

    International Nuclear Information System (INIS)

    Sasaki, Masayoshi

    1984-01-01

    The development of remote-control automatic equipment for nuclear power stations has been promoted to raise the operating rate of plants by shortening the regular inspection period, to improve the safety and reliability of inspection and maintenance work through mechanization, to reduce the radiation exposure of workers and to reduce the manpower required. The removal of control rod drives from reactors and their reinstallation have been mechanized, but the disassembly, cleaning, inspection and reassembly of control rod drives are carried out manually, so Hitachi Ltd. has worked to develop automatic equipment for this purpose. The development targets, the investigations carried out, the construction and function of the equipment, its performance and the effects of its adoption are reported. Equipment for the volume reduction of spent fuel channel boxes and spent control rods has also been developed, since these are major high-level radioactive solid wastes and their apparent volume is large. Again, the development targets, the investigations carried out, the construction and function of the equipment, its performance and the effects of its adoption are reported. (Kako, I.)

  6. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms
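The transformation idea is easiest to see in forward mode, where each value carries its derivative along with it. This dual-number sketch is a minimal illustration of the principle, not one of the production tools discussed:

```python
class Dual:
    """Forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule applied mechanically at every multiplication
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # analytically, f'(x) = 6x + 2

y = f(Dual(2.0, 1.0))  # seed dx/dx = 1; y carries f(2) and f'(2)
```

Because the sum and product rules are applied to the algorithmic specification of `f` rather than to a formula, the derivative is exact to roundoff, which is precisely the property the abstract highlights.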

  7. Automatically sweeping dual-channel boxcar integrator

    International Nuclear Information System (INIS)

    Keefe, D.J.; Patterson, D.R.

    1978-01-01

    An automatically sweeping dual-channel boxcar integrator has been developed to automate the search for a signal that repeatedly follows a trigger pulse by a constant or slowly varying time delay when that signal is completely hidden in random electrical noise and dc-offset drifts. The automatically sweeping dual-channel boxcar integrator improves the signal-to-noise ratio and eliminates dc-drift errors in the same way that a conventional dual-channel boxcar integrator does, but, in addition, automatically locates the hidden signal. When the signal is found, its time delay is displayed with 100-ns resolution, and its peak value is automatically measured and displayed. This relieves the operator of the tedious, time-consuming, and error-prone search for the signal whenever the time delay changes. The automatically sweeping boxcar integrator can also be used as a conventional dual-channel boxcar integrator. In either mode, it can repeatedly integrate a signal up to 990 times and thus make accurate measurements of the signal pulse height in the presence of random noise, dc offsets, and unsynchronized interfering signals
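The noise-rejection principle can be demonstrated numerically: averaging N trigger-aligned traces leaves the coherent pulse intact while zero-mean noise shrinks by a factor of 1/sqrt(N). The simulation below is illustrative only; the pulse amplitude, delay and repetition count are invented, not the instrument's specifications:

```python
import random

def boxcar_sweep(traces):
    """Average trigger-aligned traces; the coherent pulse survives while
    zero-mean noise averages away, revealing the hidden delay bin."""
    n = len(traces[0])
    avg = [sum(t[i] for t in traces) / len(traces) for i in range(n)]
    delay = max(range(n), key=lambda i: avg[i])
    return delay, avg[delay]

random.seed(0)
true_delay = 37
traces = []
for _ in range(500):  # 500 repetitions of the triggered sweep
    t = [random.gauss(0.0, 1.0) for _ in range(100)]
    t[true_delay] += 0.5  # pulse completely hidden under unit noise
    traces.append(t)
delay, peak = boxcar_sweep(traces)
```

After 500 integrations the residual noise per bin is roughly 0.045, so the 0.5-amplitude pulse stands out clearly even though it is invisible in any single trace.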

  8. An automatic virtual patient reconstruction from CT-scans for hepatic surgical planning.

    Science.gov (United States)

    Soler, L; Delingette, H; Malandain, G; Ayache, N; Koehl, C; Clément, J M; Dourthe, O; Marescaux, J

    2000-01-01

    PROBLEM/BACKGROUND: In order to help hepatic surgical planning we perfected automatic 3D reconstruction of patients from conventional CT scans, together with interactive visualization and virtual resection tools. From a conventional abdominal CT scan, we have developed several methods allowing the automatic 3D reconstruction of skin, bones, kidneys, lungs, liver, hepatic lesions, and vessels. These methods are based on deformable modeling or thresholding algorithms followed by the application of mathematical morphological operators. From these anatomical and pathological models, we have developed a new framework for translating anatomical knowledge into geometrical and topological constraints. More precisely, our approach allows not only automatic delineation of the hepatic and portal veins but also labeling of the portal vein and, finally, construction of an anatomical segmentation of the liver based on the Couinaud definition, which is currently used by surgeons all over the world. Finally, we have developed a user-friendly interface for the 3D visualization of anatomical and pathological structures, the accurate evaluation of volumes and distances, and the virtual hepatic resection along a user-defined cutting plane. A validation study on a 30-patient database gives 2 mm precision for liver delineation and less than 1 mm for the delineation of all other anatomical and pathological structures. An in vivo validation performed during surgery also showed that the anatomical segmentation is more precise than the delineation performed by a surgeon based on external landmarks. This surgery planning system has been routinely used by our medical partner, and this has resulted in an improvement of the planning and performance of hepatic surgery procedures. We have developed new tools for hepatic surgical planning allowing better surgery through automatic delineation and visualization of anatomical and pathological structures. These tools represent a first step towards the development of an augmented

  9. Automatic and Intentional Number Processing Both Rely on Intact Right Parietal Cortex: A Combined fMRI and Neuronavigated TMS Study

    Science.gov (United States)

    Cohen Kadosh, Roi; Bien, Nina; Sack, Alexander T.

    2012-01-01

    Practice and training usually lead to performance increase in a given task. In addition, a shift from intentional toward more automatic processing mechanisms is often observed. It is currently debated whether automatic and intentional processing is subserved by the same or by different mechanism(s), and whether the same or different regions in the brain are recruited. Previous correlational evidence provided by behavioral, neuroimaging, modeling, and neuropsychological studies addressing this question yielded conflicting results. Here we used transcranial magnetic stimulation (TMS) to compare the causal influence of disrupting either left or right parietal cortex during automatic and intentional numerical processing, as reflected by the size congruity effect and the numerical distance effect, respectively. We found a functional hemispheric asymmetry within parietal cortex: only the TMS-induced right parietal disruption impaired both automatic and intentional numerical processing. In contrast, disrupting the left parietal lobe with TMS, or applying sham stimulation, did not affect performance during automatic or intentional numerical processing. The current results provide causal evidence for the functional relevance of right, but not left, parietal cortex for intentional and automatic numerical processing, implying that at least within the parietal cortices, automatic and intentional numerical processing rely on the same underlying hemispheric lateralization. PMID:22347175

  10. Determination of rifampicin in human plasma by high-performance liquid chromatography coupled with ultraviolet detection after automatized solid-liquid extraction.

    Science.gov (United States)

    Louveau, B; Fernandez, C; Zahr, N; Sauvageon-Martre, H; Maslanka, P; Faure, P; Mourah, S; Goldwirt, L

    2016-12-01

    A precise and accurate high-performance liquid chromatography (HPLC) quantification method for rifampicin in human plasma was developed and validated using ultraviolet detection after an automated solid-phase extraction. The method was validated with respect to selectivity, extraction recovery, linearity, intra- and inter-day precision, accuracy, lower limit of quantification and stability. Chromatographic separation was performed on a Chromolith RP 8 column using a mixture of 0.05 M acetate buffer pH 5.7-acetonitrile (35:65, v/v) as mobile phase. The compounds were detected at a wavelength of 335 nm with a lower limit of quantification of 0.05 mg/L in human plasma. Retention times for rifampicin and for 6,7-dimethyl-2,3-di(2-pyridyl)quinoxaline, used as internal standard, were 3.77 and 4.81 min, respectively. This robust and accurate method was successfully applied in routine therapeutic drug monitoring of patients treated with rifampicin. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Stay Focused! The Effects of Internal and External Focus of Attention on Movement Automaticity in Patients with Stroke

    NARCIS (Netherlands)

    Kal, E. C.; van der Kamp, J.; Houdijk, H.; Groet, E.; van Bennekom, C. A. M.; Scherder, E. J. A.

    2015-01-01

    Dual-task performance is often impaired after stroke. This may be resolved by enhancing patients' automaticity of movement. This study sets out to test the constrained action hypothesis, which holds that automaticity of movement is enhanced by triggering an external focus (on movement effects),

  12. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    Full Text Available There are increasing numbers of applications that require precise camera calibration to perform accurate measurements on objects located within images, and an automatic algorithm would shorten this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to a chess board, which is found automatically in each image even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the proposed algorithm as a valuable tool for camera calibration.

  13. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

    Full Text Available In this paper, an automatic emotion detection system is built for a computer or machine to detect the emotional state from facial expressions in human-computer communication. Firstly, dynamic motion features are extracted from facial expression videos, and then advanced machine learning methods for classification and regression are used to predict the emotional states. The system is evaluated on two publicly available datasets, i.e. GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can read the facial expression of its user automatically. This technique can be integrated into applications such as smart robots, interactive games and smart surveillance systems.

  14. Design of an automatic sample changer for the measurement of neutron flux by gamma spectrometry

    International Nuclear Information System (INIS)

    Gago, Javier; Bruna, Ruben; Baltuano, Oscar; Montoya, Eduardo; Descreaux, Killian

    2014-01-01

    This paper presents the calculations, selection and component design for the construction of an automatic system to measure neutron flux in a working nuclear reactor by the gamma spectrometry technique, using samples irradiated in the RP-10 core. This system will measure 100 samples, interchanging them in a programmed and automatic way, reducing the operation time required of the user and yielding more accurate measurements. (authors).

  15. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  16. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 29.1329 Section 29... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  17. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 27.1329 Section 27... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  18. The high-performance database archiver for the LHC experiments

    CERN Document Server

    González-Berges, M

    2007-01-01

    Each of the Large Hadron Collider (LHC) experiments will be controlled by a large distributed system built with the Supervisory Control and Data Acquisition (SCADA) tool Prozeßvisualisierungs- und Steuerungsystem (PVSS). There will be in the order of 150 computers and one million input/output parameters per experiment. The values read from the hardware, the alarms generated and the user actions will be archived for the later physics analysis, the operation and the debugging of the control system itself. Although the original PVSS implementation of a database archiver was appropriate for standard industrial use, the performance was not sufficient for the experiments. A collaboration was setup between CERN and ETM, the company that develops PVSS. Changes in the architecture and several optimizations were made and tested in a system of a comparable size to the final ones. As a result, we have been able to improve the performance by more than one order of magnitude, and what is more important, we now have a scal...

  19. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  20. Automatic Angular alignment of LHC Collimators

    CERN Document Server

    Azzopardi, Gabriella; Salvachua Ferrando, Belen Maria; Mereghetti, Alessio; Bruce, Roderik; Redaelli, Stefano; CERN. Geneva. ATS Department

    2017-01-01

    The LHC is equipped with a complex collimation system to protect sensitive equipment from unavoidable beam losses. Collimators are positioned close to the beam using an alignment procedure. Until now they have always been aligned assuming no tilt between the collimator and the beam, however, tank misalignments or beam envelope angles at large-divergence locations could introduce a tilt limiting the collimation performance. Three different algorithms were implemented to automatically align a chosen collimator at various angles. The implementation was tested on a number of collimators during this MD and no human intervention was required.

  1. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic differentiation is a relatively recent technique for the differentiation of functions, applicable directly to the source code that computes the function, written in standard programming languages. This technique permits the automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The values of the derivatives obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of automatic differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools into consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.

  2. Automatic multimodal detection for long-term seizure documentation in epilepsy.

    Science.gov (United States)

    Fürbass, F; Kampusch, S; Kaniusas, E; Koren, J; Pirker, S; Hopfengärtner, R; Stefan, H; Kluge, T; Baumgartner, C

    2017-08-01

    This study investigated the sensitivity and false detection rate of a multimodal automatic seizure detection algorithm and its applicability to reduced electrode montages for long-term seizure documentation in epilepsy patients. An automatic seizure detection algorithm based on EEG, EMG, and ECG signals was developed. EEG/ECG recordings of 92 patients from two epilepsy monitoring units, including 494 seizures, were used to assess detection performance. EMG data were extracted by bandpass filtering of EEG signals. Sensitivity and false detection rate were evaluated for each signal modality and for reduced electrode montages. All focal seizures evolving to bilateral tonic-clonic (BTCS, n=50) and 89% of focal seizures (FS, n=139) were detected. Average sensitivity was 94% in temporal lobe epilepsy (TLE) patients and 74% in extratemporal lobe epilepsy (XTLE) patients. Overall detection sensitivity was 86%. Average false detection rate was 12.8 false detections in 24 h (FD/24h) for TLE and 22 FD/24h for XTLE patients. Utilization of 8 frontal and temporal electrodes reduced average sensitivity from 86% to 81%. Our automatic multimodal seizure detection algorithm shows high sensitivity with full and reduced electrode montages. Evaluation of different signal modalities and electrode montages paves the way for semi-automatic seizure documentation systems. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
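The two headline figures of merit are straightforward to compute. The counts plugged in below are illustrative values chosen to reproduce the reported 86% overall sensitivity and the 12.8 FD/24h TLE rate, not the study's raw data:

```python
def detection_metrics(n_seizures, n_detected, false_detections, hours):
    """Sensitivity (% of true seizures flagged) and false-detection
    rate normalized to a 24 h monitoring day."""
    sensitivity = 100.0 * n_detected / n_seizures
    fd_per_24h = false_detections * 24.0 / hours
    return sensitivity, fd_per_24h

# Illustrative counts only (chosen to match the reported figures)
sens, fd = detection_metrics(n_seizures=494, n_detected=425,
                             false_detections=64, hours=120)
```

Normalizing false detections to a 24 h day is what makes rates comparable between patients monitored for different durations.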

  3. Feature-based automatic color calibration for networked camera system

    Science.gov (United States)

    Yamamoto, Shoji; Taki, Keisuke; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi

    2011-01-01

    In this paper, we have developed feature-based automatic color calibration using area-based detection and an adaptive nonlinear regression method. Simple chartless color matching is achieved by exploiting the image areas that overlap between cameras. Accurate detection of a common object is achieved by area-based detection that combines MSER with SIFT. Adaptive color calibration using the color of the detected object is then calculated by a nonlinear regression method. The method can indicate the contribution of the object's color to the calibration, and automatic selection notification for the user is performed by this function. Experimental results show that the accuracy of the calibration improves gradually. It is clear that this method can stand up to practical use in multi-camera color calibration if enough samples are obtained.
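As a minimal stand-in for the regression step, a per-channel ordinary least-squares gain/offset fit over matched pixels looks as follows. The paper uses an adaptive nonlinear method, and the pixel values here are synthetic:

```python
def fit_channel(src, ref):
    """Least-squares gain/offset (ref ~ a*src + b) for one color
    channel, from matched pixel pairs on a commonly detected object."""
    n = len(src)
    sx, sy = sum(src), sum(ref)
    sxx = sum(x * x for x in src)
    sxy = sum(x * y for x, y in zip(src, ref))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Synthetic matched red-channel samples (reference camera vs. source camera)
src = [10, 40, 90, 160, 220]
ref = [18, 52, 108, 186, 253]
a, b = fit_channel(src, ref)
```

Applying `a*value + b` to every pixel of the source camera would then bring its red channel into agreement with the reference camera; the real system does this adaptively and nonlinearly as more common objects are detected.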

  4. Segmenting articular cartilage automatically using a voxel classification approach

    DEFF Research Database (Denmark)

    Folkesson, Jenny; Dam, Erik B; Olsen, Ole F

    2007-01-01

    We present a fully automatic method for articular cartilage segmentation from magnetic resonance imaging (MRI) which we use as the foundation of a quantitative cartilage assessment. We evaluate our method by comparisons to manual segmentations by a radiologist and by examining the interscan...... reproducibility of the volume and area estimates. Training and evaluation of the method is performed on a data set consisting of 139 scans of knees with a status ranging from healthy to severely osteoarthritic. This is, to our knowledge, the only fully automatic cartilage segmentation method that has good...... agreement with manual segmentations, an interscan reproducibility as good as that of a human expert, and enables the separation between healthy and osteoarthritic populations. While high-field scanners offer high-quality imaging from which the articular cartilage have been evaluated extensively using manual...

  5. Development of automatic inspection robot for nuclear power plants

    International Nuclear Information System (INIS)

    Yamada, K.; Suzuki, K.; Saitoh, K.; Sakaki, T.; Ohe, Y.; Mizutani, T.; Segawa, M.; Kubo, K.

    1987-01-01

    This robot system has been developed for the automatic inspection of nuclear power plants. The system is composed of a vehicle that runs on a monorail, sensors on the vehicle, an image processor that processes the image information from the sensors, a computer that creates the robot's inspection plan, and an operation panel. The system has two main features. The first is the robot control system: the vehicle and the sensors are controlled by output data calculated in the computer from three-dimensional plant data. The second is malfunction recognition: a malfunction is recognized by combining the results of image processing with information from the microphone and the infrared camera. Tests of a prototype automatic inspection robot system have been performed in a simulated main steam piping room of a nuclear power plant

  6. A comparison of automatic and intentional instructions when using the method of vanishing cues in acquired brain injury.

    Science.gov (United States)

    Riley, Gerard A; Venn, Paul

    2015-01-01

    Thirty-four participants with acquired brain injury learned word lists under two forms of vanishing cues - one in which the learning trial instructions encouraged intentional retrieval (i.e., explicit memory) and one in which they encouraged automatic retrieval (which encompasses implicit memory). The automatic instructions represented a novel approach in which the cooperation of participants was actively sought to avoid intentional retrieval. Intentional instructions resulted in fewer errors during the learning trials and better performance on immediate and delayed retrieval tests. The advantage of intentional over automatic instructions was generally less for those who had more severe memory and/or executive impairments. Most participants performed better under intentional instructions on both the immediate and the delayed tests. Although those who were more severely impaired in both memory and executive function also did better with intentional instructions on the immediate retrieval test, they were significantly more likely to show an advantage for automatic instructions on the delayed test. It is suggested that this pattern of results may reflect impairments in the consolidation of intentional memories in this group. When using vanishing cues, automatic instructions may be better for those with severe consolidation impairments, but otherwise intentional instructions may be better.

  7. Automatic cough episode detection using a vibroacoustic sensor.

    Science.gov (United States)

    Mlynczak, Marcel; Pariaszewska, Katarzyna; Cybulski, Gerard

    2015-08-01

    Cough monitoring is an important element of the diagnostics of respiratory diseases. The European Respiratory Society recommends objective assessment of cough episodes and the search for methods of automatic analysis to make obtaining the quantitative parameters possible. The cough "events" could be classified using a microphone and a sensor that measures the vibrations of the chest. Analysis of the recorded signals consists of calculating the feature vectors for selected episodes and of performing automatic classification using them. The aim of the study was to assess the accuracy of classification based on artificial neural networks using vibroacoustic signals collected from the chest. Six healthy young men and eight healthy young women performed imitated coughs, hand clapping, speech and shouting. Three methods of parametrization were used to prepare the vectors of episode features: time domain, time-frequency domain and spectral modeling. We obtained an accuracy of 95% using artificial neural networks.
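
    The time-domain parametrization mentioned above can be illustrated with a minimal sketch. The feature choices below (RMS energy and zero-crossing rate) are illustrative assumptions, not the paper's actual feature set:

```python
import math

def time_domain_features(window):
    """Compute a small time-domain feature vector for one episode window.
    Features: RMS energy and zero-crossing rate (illustrative choices only;
    the study's exact feature set is not specified here)."""
    n = len(window)
    rms = math.sqrt(sum(x * x for x in window) / n)
    # Zero-crossing rate: fraction of adjacent sample pairs with a sign change
    zcr = sum(1 for a, b in zip(window, window[1:]) if a * b < 0) / (n - 1)
    return [rms, zcr]

# A loud, oscillating "cough-like" burst vs. a quiet, slowly varying signal
burst = [(-1) ** i * 0.8 for i in range(100)]
quiet = [0.01] * 100
print(time_domain_features(burst))  # high RMS, ZCR of 1.0
print(time_domain_features(quiet))  # low RMS, ZCR of 0.0
```

    A feature vector like this would then be fed to the neural-network classifier alongside the time-frequency and spectral-modeling features.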

  8. Automatic imitation effects are influenced by experience of synchronous action in children.

    Science.gov (United States)

    O'Sullivan, Eoin P; Bijvoet-van den Berg, Simone; Caldwell, Christine A

    2018-07-01

    By their fourth year of life, children are expert imitators, but it is unclear how this ability develops. One approach suggests that certain types of experience might forge associations between the sensory and motor representations of an action that may facilitate imitation at a later time. Sensorimotor experience of this sort may occur when an infant's action is imitated by a caregiver or when socially synchronous action occurs. This learning approach, therefore, predicts that the strength of sensory-motor associations should depend on the frequency and quality of previous experience. Here, we tested this prediction by examining automatic imitation, that is, the tendency of an action stimulus to facilitate the performance of that action and interfere with the performance of an incompatible action. We required children (aged between 3 years 8 months and 7 years 11 months) to respond to actions performed by an experimenter (e.g., two hands clapping) with both compatible actions (i.e., two hands clapping) and incompatible actions (i.e., two hands waving) at different stages in the experimental procedure. As predicted by a learning account, actions thought to be performed in synchrony (i.e., clapping/waving) produced stronger automatic imitation effects when compared with actions where previous sensorimotor experience is likely to be more limited (e.g., pointing/hand closing). Furthermore, these automatic imitation effects were not found to vary with age, with both compatible and incompatible responses quickening with age. These findings suggest a role for sensorimotor experience in the development of imitative ability. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Remote automatic control scheme for plasma arc cutting of contaminated waste

    International Nuclear Information System (INIS)

    Dudar, A.M.; Ward, C.R.; Kriikku, E.M.

    1993-01-01

    The Robotics Development Group at the Savannah River Technology Center has developed and implemented a scheme to perform automatic cutting of metallic contaminated waste. The scheme employs a plasma arc cutter in conjunction with a laser ranging sensor attached to a robotic manipulator called the Telerobot. A software algorithm using proportional control is then used to perturb the robot's trajectory in such a way as to regulate the plasma arc standoff and the robot's speed in order to achieve automatic plasma arc cuts. The scheme has been successfully tested on simulated waste materials and the results have been very favorable. This report details the development and testing of the scheme.
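
    The proportional standoff regulation described above can be sketched as a simple feedback loop; the gain and clamp values below are illustrative assumptions, not taken from the report:

```python
def standoff_correction(measured_mm, setpoint_mm, kp=0.5, max_step_mm=2.0):
    """Proportional correction perturbing the robot trajectory so the
    plasma-torch standoff tracks the setpoint (gain and clamp values are
    illustrative, not from the report)."""
    error = setpoint_mm - measured_mm
    step = kp * error
    # Clamp the per-cycle trajectory perturbation for safety
    return max(-max_step_mm, min(max_step_mm, step))

# Simulate the laser-ranging feedback loop converging on a 6 mm standoff
standoff = 10.0
for _ in range(10):
    standoff += standoff_correction(standoff, setpoint_mm=6.0)
print(round(standoff, 3))  # approaches 6.0
```

    In the real system the laser ranging sensor supplies the measured standoff each control cycle, and an analogous proportional term regulates the cutting speed.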

  10. Towards automatic music transcription: note extraction based on independent subspace analysis

    Science.gov (United States)

    Wellhausen, Jens; Hoynck, Michael

    2005-01-01

    Due to the increasing amount of music available electronically, the need for automatic search, retrieval and classification systems for music becomes more and more important. In this paper an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications, music analysis and music classification. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined using Independent Subspace Analysis to extract the sounding notes. Finally, the results are used to build a MIDI file as a new representation of the piece of music examined.

  11. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving—namely, make the driver feel like they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but might reduce the workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  12. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker’s coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software developed here can be used as a valid and useful tool for underwater motion analysis.
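
    The degree-of-automation measure used in the study (the proportion of positions requiring operator correction, with a 4-pixel tolerance) can be computed as follows; the coordinates are made-up examples:

```python
def manual_intervention_rate(auto_coords, ref_coords, tol_px=4.0):
    """Fraction of tracked marker positions that would require operator
    correction, i.e. where the automatically computed coordinate deviates
    from the manual reference by more than tol_px pixels (4 px in the study)."""
    assert len(auto_coords) == len(ref_coords)
    corrections = 0
    for (xa, ya), (xr, yr) in zip(auto_coords, ref_coords):
        if ((xa - xr) ** 2 + (ya - yr) ** 2) ** 0.5 > tol_px:
            corrections += 1
    return corrections / len(auto_coords)

# Hypothetical reference (manual) and automatic coordinates for four frames
ref = [(10.0, 10.0), (20.0, 20.0), (30.0, 30.0), (40.0, 40.0)]
auto = [(11.0, 10.0), (26.0, 20.0), (30.0, 30.5), (40.0, 47.0)]
print(manual_intervention_rate(auto, ref))  # 0.5 (two of four exceed 4 px)
```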

  13. Evaluating automatic laughter segmentation in meetings using acoustic and acoustic-phonetic features

    NARCIS (Netherlands)

    Truong, K.P.; Leeuwen, D.A. van

    2007-01-01

    In this study, we investigated automatic laughter segmentation in meetings. We first performed laughter-speech discrimination experiments with traditional spectral features and subsequently used acoustic-phonetic features. In segmentation, we used Gaussian Mixture Models that were trained with

  14. Individual Differences in Automatic Emotion Regulation Interact with Primed Emotion Regulation during an Anger Provocation

    OpenAIRE

    Zhang, Jing; Lipp, Ottmar V.; Hu, Ping

    2017-01-01

    The current study investigated the interactive effects of individual differences in automatic emotion regulation (AER) and primed emotion regulation strategy on skin conductance level (SCL) and heart rate during provoked anger. The study was a 2 × 2 [AER tendency (expression vs. control) × priming (expression vs. control)] between subject design. Participants were assigned to two groups according to their performance on an emotion regulation-IAT (differentiating automatic emotion control tend...

  15. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m3/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m3 of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers.

  16. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Briefly described are two systems of automatic plasma control: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor and its possibilities will in many ways determine the reactor economy. (Ha)

  17. Fully automatic diagnostic system for early- and late-onset mild Alzheimer's disease using FDG PET and 3D-SSP

    International Nuclear Information System (INIS)

    Ishii, Kazunari; Kono, Atsushi K.; Sasaki, Hiroki; Miyamoto, Naokazu; Fukuda, Tetsuya; Sakamoto, Setsu; Mori, Etsuro

    2006-01-01

    The purpose of this study was to design a fully automatic computer-assisted diagnostic system for early- and late-onset mild Alzheimer's disease (AD). Glucose metabolic images were obtained from mild AD patients and normal controls using positron emission tomography (PET) and 18F-fluorodeoxyglucose (FDG). Two groups of 20 mild AD patients with different ages of onset were examined. A fully automatic diagnostic system using the statistical brain mapping method was established from the early-onset (EO) and late-onset (LO) groups, with mean ages of 59.1 and 70.9 years and mean MMSE scores of 23.3 and 22.8, respectively. Age-matched normal subjects were used as controls. We compared the diagnostic performance of visual inspection of conventional axial FDG PET images by experts and beginners with that of our fully automatic diagnostic system in another 15 EO and 15 LO AD patients (mean ages 58.4 and 71.7, mean MMSE scores 23.6 and 23.1, respectively) and 30 age-matched normal controls. A receiver operating characteristic (ROC) analysis was performed to compare data. The diagnostic performance of the automatic diagnostic system was comparable with that of visual inspection by experts. The area under the ROC curve for the automatic diagnostic system was 0.967 for EO AD patients and 0.878 for LO AD patients. The mean area under the ROC curve for visual inspection by experts was 0.863 and 0.881 for the EO and LO AD patients, respectively. The mean area under the ROC curve for visual inspection by beginners was 0.828 and 0.717, respectively. The fully automatic diagnostic system for EO and LO AD was able to perform at a similar diagnostic level to visual inspection of conventional axial images by experts. (orig.)
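
    The area-under-the-ROC-curve figures compared above can be computed with the rank-based (Mann-Whitney) identity; the diagnostic scores below are hypothetical, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability that a randomly chosen patient scores higher than a
    randomly chosen control, with ties counting one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier scores for AD patients vs. normal controls
patients = [0.9, 0.8, 0.75, 0.6]
controls = [0.7, 0.5, 0.4, 0.3]
print(roc_auc(patients, controls))  # 0.9375
```

    An AUC of 1.0 would mean perfect separation of patients from controls; 0.5 is chance level.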

  18. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  19. AISLE: an automatic volumetric segmentation method for the study of lung allometry.

    Science.gov (United States)

    Ren, Hongliang; Kazanzides, Peter

    2011-01-01

    We developed a fully automatic segmentation method for volumetric CT (computed tomography) datasets to support construction of a statistical atlas for the study of allometric laws of the lung. The proposed segmentation method, AISLE (Automated ITK-Snap based on Level-set), is based on the level-set implementation from an existing semi-automatic segmentation program, ITK-Snap. AISLE can segment the lung field without human interaction and provide intermediate graphical results as desired. The preliminary experimental results show that the proposed method can achieve accurate segmentation, in terms of the volumetric overlap metric, by comparison with the ground-truth segmentation performed by a radiologist.

  20. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter

    Directory of Open Access Journals (Sweden)

    Hili Eidlin-Levy

    2017-12-01

    Full Text Available The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  1. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter.

    Science.gov (United States)

    Eidlin-Levy, Hili; Rubinsten, Orly

    2017-01-01

    The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.
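
    The Stroop effect underlying the group × congruency interaction is simply the reaction-time cost of incongruent trials; the reaction times below are invented for illustration:

```python
def stroop_effect_ms(congruent_rts, incongruent_rts):
    """Stroop effect: mean reaction time on incongruent trials (e.g. the
    larger area paired with the shorter perimeter) minus mean RT on
    congruent trials. Larger values indicate weaker inhibition of the
    irrelevant magnitude."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical RTs (ms); the DD group shows the larger congruency cost
control_effect = stroop_effect_ms([520, 540, 530], [560, 580, 570])
dd_effect = stroop_effect_ms([600, 620, 610], [720, 740, 730])
print(control_effect, dd_effect)  # 40.0 120.0
```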

  2. Mastitis therapy and control - Automatic on-line detection of abnormal milk.

    NARCIS (Netherlands)

    Hogeveen, H.

    2011-01-01

    Automated online detection of mastitis and abnormal milk is an important subject in the dairy industry, especially because of the introduction of automatic milking systems and the growing farm sizes with consequently less labor available per cow. Demands for performance, which is expressed as

  3. Automatic Algorithm Selection for Complex Simulation Problems

    CERN Document Server

    Ewald, Roland

    2012-01-01

    To select the most suitable simulation algorithm for a given task is often difficult. This is due to intricate interactions between model features, implementation details, and runtime environment, which may strongly affect the overall performance. An automated selection of simulation algorithms supports users in setting up simulation experiments without demanding expert knowledge on simulation. Roland Ewald analyzes and discusses existing approaches to solve the algorithm selection problem in the context of simulation. He introduces a framework for automatic simulation algorithm selection and

  4. Object Occlusion Detection Using Automatic Camera Calibration for a Wide-Area Video Surveillance System

    Directory of Open Access Journals (Sweden)

    Jaehoon Jung

    2016-06-01

    Full Text Available This paper presents an object occlusion detection algorithm using object depth information that is estimated by automatic camera calibration. The object occlusion problem is a major factor degrading the performance of object tracking and recognition. To detect an object occlusion, the proposed algorithm consists of three steps: (i) automatic camera calibration using both moving objects and a background structure; (ii) object depth estimation; and (iii) detection of occluded regions. The proposed algorithm estimates the depth of the object without extra sensors but with a generic red, green and blue (RGB) camera. As a result, the proposed algorithm can be applied to improve the performance of object tracking and object recognition algorithms for video surveillance systems.

  5. High-Throughput Automatic Training System for Odor-Based Learned Behaviors in Head-Fixed Mice

    Directory of Open Access Journals (Sweden)

    Zhe Han

    2018-02-01

    Full Text Available Understanding the neuronal mechanisms of learned behaviors requires efficient behavioral assays. We designed a high-throughput automatic training system (HATS) for olfactory behaviors in head-fixed mice. The hardware and software were constructed to enable automatic training with minimal human intervention. The integrated system was composed of customized 3D-printed supporting components, an odor-delivery unit with fast response, and an Arduino-based hardware-control and data-acquisition unit. Furthermore, the customized software was designed to enable automatic training in all training phases, including lick-teaching, shaping and learning. Using HATS, we trained mice to perform delayed non-match to sample (DNMS), delayed paired association (DPA), Go/No-go (GNG), and GNG reversal tasks. These tasks probed cognitive functions including sensory discrimination, working memory, decision making and cognitive flexibility. Mice reached stable levels of performance within several days in the tasks. HATS enabled an experimenter to train eight mice simultaneously, thereby greatly enhancing experimental efficiency. Combined with causal perturbation and activity recording techniques, HATS can greatly facilitate our understanding of the neural-circuitry mechanisms underlying learned behaviors.

  6. Deliberation versus automaticity in decision making: Which presentation format features facilitate automatic decision making?

    Directory of Open Access Journals (Sweden)

    Anke Soellner

    2013-05-01

    Full Text Available The idea of automatic decision making approximating normatively optimal decisions without necessitating much cognitive effort is intriguing. Whereas recent findings support the notion that such fast, automatic processes explain empirical data well, little is known about the conditions under which such processes are selected rather than more deliberate stepwise strategies. We investigate the role of the format of information presentation, focusing explicitly on the ease of information acquisition and its influence on information integration processes. In a probabilistic inference task, the standard matrix employed in prior research was contrasted with a newly created map presentation format and additional variations of both presentation formats. Across three experiments, a robust presentation format effect emerged: Automatic decision making was more prevalent in the matrix (with high information accessibility), whereas sequential decision strategies prevailed when the presentation format demanded more information acquisition effort. Further scrutiny of the effect showed that it is not driven by the presentation format as such, but rather by the extent of information search induced by a format. Thus, if information is accessible with minimal need for information search, information integration is likely to proceed in a perception-like, holistic manner. In turn, a moderate demand for information search decreases the likelihood of behavior consistent with the assumptions of automatic decision making.

  7. Automatic measuring device for atomic oxygen concentrations (1962); Dispositif de mesure automatique de concentrations d'oxygene atomique (1962)

    Energy Technology Data Exchange (ETDEWEB)

    Weill, J; Deiss, M; Mercier, R [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1962-07-01

    Within the framework of the activities of the Autonomous Reactor Electronics Section we have developed a device which renders automatic one type of measurement carried out in the Physical Chemistry Department at the Saclay Research Centre. We define here: - the physico-chemical principle of the apparatus, which is adapted to the measurement of atomic oxygen concentrations; - the physical principle of the automatic measurement; - the properties, performance, constitution, use and maintenance of the automatic measurement device. It is concluded that the principle of the automatic device, whose tests have confirmed the estimate of its theoretical performance, could usefully be adapted to other types of measurement. (authors)

  8. The Development of Automatic Sequences for the RF and Cryogenic Systems at the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Gurd, Pamela; Casagrande, Fabio; Mccarthy, Michael; Strong, William; Ganni, Venkatarao

    2005-01-01

    Automatic sequences both ease the task of operating a complex machine and ensure procedural consistency. At the Spallation Neutron Source project (SNS), a set of automatic sequences has been developed to perform the start-up and shutdown of the high-power RF systems. Similarly, sequences have been developed to perform backfill, pump-down, automatic valve control and energy management in the cryogenic system. The sequences run on Linux soft input-output controllers (IOCs), which are similar to ordinary EPICS (Experimental Physics and Industrial Control System) IOCs in terms of data sharing with other EPICS processes, but which share a Linux processor with other such soft IOCs. Each sequence waits for a command from an operator console and starts the corresponding set of instructions, allowing operators to follow the sequences either from an overview screen or from detail screens. We describe each system and our operational experience with it.

  9. Automatic luminous reflections detector using global threshold with increased luminosity contrast in images

    Science.gov (United States)

    Silva, Ricardo Petri; Naozuka, Gustavo Taiji; Mastelini, Saulo Martiello; Felinto, Alan Salvany

    2018-01-01

    The incidence of luminous reflections (LR) in captured images can interfere with the color of the affected regions. These regions tend to oversaturate, becoming whitish and, consequently, losing the original color information of the scene. Decision processes that employ images acquired from digital cameras can be impaired by LR incidence. Such applications include real-time video-assisted surgery and facial and ocular recognition. This work proposes an algorithm called contrast enhancement of potential LR regions, a preprocessing step that increases the contrast of potential LR regions in order to improve the performance of automatic LR detectors. In addition, three automatic detectors were compared with and without the employment of our preprocessing method. The first one is a technique already consolidated in the literature called the Chang-Tseng threshold. We propose two automatic detectors called adapted histogram peak and global threshold. We employed four performance metrics to evaluate the detectors, namely accuracy, precision, exactitude, and root mean square error. The exactitude metric was developed in this work; to compute it, a manually defined reference model was created. The global threshold detector combined with our preprocessing method presented the best results, with an average exactitude rate of 82.47%.
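
    A minimal sketch of a global-threshold detector with a contrast-enhancement preprocessing step follows; the generic linear stretch and the threshold value are assumptions, simpler than the paper's actual method:

```python
def stretch_contrast(img):
    """Linearly stretch luminosity to [0, 255] to accentuate potential
    luminous-reflection (LR) regions before thresholding (a generic
    contrast stretch; the paper's preprocessing is more elaborate)."""
    lo = min(min(row) for row in img)
    hi = max(max(row) for row in img)
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return [[(p - lo) * scale for p in row] for row in img]

def detect_lr(img, threshold=230):
    """Global threshold: mark pixels whose stretched luminosity exceeds
    the threshold as luminous reflections."""
    return [[p > threshold for p in row] for row in img]

# Tiny grayscale frame with one oversaturated, whitish patch
frame = [[120, 125, 118],
         [122, 250, 248],
         [119, 121, 117]]
mask = detect_lr(stretch_contrast(frame))
print(mask[1][1], mask[0][0])  # True False
```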

  10. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Small automatic auxiliary boilers. 63.25-1 Section 63.25... AUXILIARY BOILERS Requirements for Specific Types of Automatic Auxiliary Boilers § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  11. Evaluation of an automatic MR-based gold fiducial marker localisation method for MR-only prostate radiotherapy

    Science.gov (United States)

    Maspero, Matteo; van den Berg, Cornelis A. T.; Zijlstra, Frank; Sikkes, Gonda G.; de Boer, Hans C. J.; Meijer, Gert J.; Kerkmeijer, Linda G. W.; Viergever, Max A.; Lagendijk, Jan J. W.; Seevinck, Peter R.

    2017-10-01

    An MR-only radiotherapy planning (RTP) workflow would reduce the cost, radiation exposure and uncertainties introduced by CT-MRI registrations. In the case of prostate treatment, one of the remaining challenges currently holding back the implementation of an RTP workflow is the MR-based localisation of intraprostatic gold fiducial markers (FMs), which is crucial for accurate patient positioning. Currently, MR-based FM localisation is clinically performed manually. This is sub-optimal, as manual interaction increases the workload. Attempts to perform automatic FM detection often rely on being able to detect signal voids induced by the FMs in magnitude images. However, signal voids may not always be sufficiently specific, hampering accurate and robust automatic FM localisation. Here, we present an approach that aims at automatic MR-based FM localisation. This method is based on template matching using a library of simulated complex-valued templates, and exploiting the behaviour of the complex MR signal in the vicinity of the FM. Clinical evaluation was performed on seventeen prostate cancer patients undergoing external beam radiotherapy treatment. Automatic MR-based FM localisation was compared to manual MR-based and semi-automatic CT-based localisation (the current gold standard) in terms of detection rate and the spatial accuracy and precision of localisation. The proposed method correctly detected all three FMs in 15/17 patients. The spatial accuracy (mean) and precision (STD) were 0.9 mm and 0.5 mm respectively, which is below the voxel size of 1.1 × 1.1 × 1.2 mm3 and comparable to MR-based manual localisation. FM localisation failed (3/51 FMs) in the presence of bleeding or calcifications in the direct vicinity of the FM. The method was found to be spatially accurate and precise, which is essential for clinical use. To overcome any missed detection, we envision the use of the proposed method along with verification by an observer. This will result in a
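
    The template-matching step can be illustrated in one dimension with real values (the paper matches complex-valued simulated templates against 3-D MR data); all signals below are toy examples:

```python
def ncc(patch, template):
    """Normalized cross-correlation between a signal patch and a template
    (a real-valued stand-in for the paper's complex-valued MR templates)."""
    n = len(template)
    mp = sum(patch) / n
    mt = sum(template) / n
    num = sum((p - mp) * (t - mt) for p, t in zip(patch, template))
    dp = sum((p - mp) ** 2 for p in patch) ** 0.5
    dt = sum((t - mt) ** 2 for t in template) ** 0.5
    return num / (dp * dt) if dp and dt else 0.0

def best_match(signal, template):
    """Slide the template over a 1-D profile and return the offset with the
    highest correlation, a toy analogue of fiducial-marker localisation."""
    scores = [ncc(signal[i:i + len(template)], template)
              for i in range(len(signal) - len(template) + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

template = [1.0, -2.0, 1.0]          # simulated marker signature
signal = [0.0, 0.1, 1.0, -2.0, 1.0, 0.2, 0.0]
print(best_match(signal, template))  # 2
```

    In the actual method, the library contains many simulated templates covering plausible marker orientations, and candidate locations are verified before being accepted.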

  12. Evaluation of an automatic MR-based gold fiducial marker localisation method for MR-only prostate radiotherapy.

    Science.gov (United States)

    Maspero, Matteo; van den Berg, Cornelis A T; Zijlstra, Frank; Sikkes, Gonda G; de Boer, Hans C J; Meijer, Gert J; Kerkmeijer, Linda G W; Viergever, Max A; Lagendijk, Jan J W; Seevinck, Peter R

    2017-10-03

    An MR-only radiotherapy planning (RTP) workflow would reduce the cost, radiation exposure and uncertainties introduced by CT-MRI registrations. In the case of prostate treatment, one of the remaining challenges currently holding back the implementation of an RTP workflow is the MR-based localisation of intraprostatic gold fiducial markers (FMs), which is crucial for accurate patient positioning. Currently, MR-based FM localisation is clinically performed manually. This is sub-optimal, as manual interaction increases the workload. Attempts to perform automatic FM detection often rely on being able to detect signal voids induced by the FMs in magnitude images. However, signal voids may not always be sufficiently specific, hampering accurate and robust automatic FM localisation. Here, we present an approach that aims at automatic MR-based FM localisation. This method is based on template matching using a library of simulated complex-valued templates, and exploiting the behaviour of the complex MR signal in the vicinity of the FM. Clinical evaluation was performed on seventeen prostate cancer patients undergoing external beam radiotherapy treatment. Automatic MR-based FM localisation was compared to manual MR-based and semi-automatic CT-based localisation (the current gold standard) in terms of detection rate and the spatial accuracy and precision of localisation. The proposed method correctly detected all three FMs in 15/17 patients. The spatial accuracy (mean) and precision (STD) were 0.9 mm and 0.5 mm respectively, which is below the voxel size of 1.1 × 1.1 × 1.2 mm3 and comparable to MR-based manual localisation. FM localisation failed (3/51 FMs) in the presence of bleeding or calcifications in the direct vicinity of the FM. The method was found to be spatially accurate and precise, which is essential for clinical use. To overcome any missed detection, we envision the use of the proposed method along with verification by an observer. This will result in a

  13. 49 CFR 236.825 - System, automatic train control.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic train control. 236.825 Section..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.825 System, automatic train control. A system so arranged that its operation will automatically...

  14. Improving the safety and protective automatic actions of the CMS electromagnetic calorimeter detector control system

    CERN Document Server

    Jimenez Estupinan, Raul; Cirkovic, Predrag; Di Calafiori, Diogo Raphael; Dissertori, Guenther; Djambazov, Lubomir; Jovanovic, Dragoslav; Lustermann, Werner; Milenovic, Predrag; Zelepoukine, Serguei

    2017-01-01

    The CMS ECAL Detector Control System (DCS) features several monitoring mechanisms able to react and perform automatic actions based on pre-defined action matrices. The DCS is capable of early detection of anomalies inside the ECAL and on its off-detector support systems, triggering automatic actions to mitigate the impact of these events and preventing them from escalating to the safety system. The treatment of such events by the DCS allows for a faster recovery process, better understanding of the development of issues, and in most cases, actions with higher granularity than the safety system. This paper presents the details of the DCS automatic action mechanisms, as well as their evolution based on several years of CMS ECAL operations.

  15. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language \\cal{C}. The use of \\cal{C} allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  16. Automatic characterization of loose parts impact damage risk parameters

    International Nuclear Information System (INIS)

    Glass, S.W.; Phillips, J.M.

    1985-01-01

    Loose parts caught in the high-velocity flows of the reactor coolant fluid strike against nuclear steam supply system (NSSS) components and can cause significant damage. Loose parts monitor systems (LPMS) have been available for years to detect metal-to-metal impacts. Once detected, however, an assessment of the damage risk potential for leaving the part in the system versus shutting it down and removing the part must be made. The principal parameters used in the damage risk assessment are time delays between the first and subsequent sensor indications (used to assess the impact location) and a correlation between the waveform and the impact energy of the part (how hard the part impacted). These parameters are not well suited to simple automatic techniques. The task has historically been performed by loose parts diagnostic experts who base much of their evaluation on experience and subjective interpretation of impact data waveforms. Three of the principal goals in developing the Babcock and Wilcox (B and W) LPMS-III were (a) to develop an accurate automatic assessment for the time delays, (b) to develop an automatic estimate of the impact energy, and (c) to present the data in a meaningful manner to the operator

  17. Automatic Thread-Level Parallelization in the Chombo AMR Library

    Energy Technology Data Exchange (ETDEWEB)

    Christen, Matthias; Keen, Noel; Ligocki, Terry; Oliker, Leonid; Shalf, John; Van Straalen, Brian; Williams, Samuel

    2011-05-26

    The increasing on-chip parallelism has substantial implications for HPC applications. Currently, hybrid programming models (typically MPI+OpenMP) are employed for mapping software to the hardware in order to leverage the hardware's architectural features. In this paper, we present an approach that automatically introduces thread-level parallelism into Chombo, a parallel adaptive mesh refinement framework for finite-difference-type PDE solvers. In Chombo, core algorithms are specified in ChomboFortran, a macro language extension to F77 that is part of the Chombo framework. This domain-specific language provides a ready-made target for automatically migrating the large number of existing algorithms to a hybrid MPI+OpenMP implementation. It also provides access to an auto-tuning methodology that enables tuning certain aspects of an algorithm to hardware characteristics. Performance measurements are presented for a few of the most relevant kernels of a specific application benchmark using this technique, as well as benchmark results for the entire application. The kernel benchmarks show that, using auto-tuning, a performance gain of up to a factor of 11 was achieved with 4 threads relative to the serial reference implementation.
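
    The essence of such auto-tuning is to time candidate parameterizations of a kernel and keep the fastest. The following is an illustrative Python sketch of that loop, not Chombo's actual ChomboFortran/OpenMP tuner; `blocked_sum` is a hypothetical stand-in for a tunable kernel:

```python
import time
import numpy as np

def blocked_sum(a, block):
    """Sum `a` in chunks of `block` elements (stand-in for a tunable kernel)."""
    total = 0.0
    for i in range(0, len(a), block):
        total += a[i:i + block].sum()
    return total

def autotune(kernel, a, candidates, repeats=3):
    """Return the candidate parameter with the best median runtime."""
    timings = {}
    for c in candidates:
        runs = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            kernel(a, c)
            runs.append(time.perf_counter() - t0)
        timings[c] = sorted(runs)[len(runs) // 2]
    return min(timings, key=timings.get)

a = np.ones(1_000_000)
best = autotune(blocked_sum, a, candidates=[1_000, 10_000, 100_000])
print("best block size:", best)
```

    A real tuner would explore hardware-relevant parameters (tile sizes, thread counts) and cache the winning configuration per machine.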

  18. Automatic focusing of attention on object size and shape

    Directory of Open Access Journals (Sweden)

    Cesar Galera

    2005-01-01

    In two experiments we investigated the automatic adjustment of the attentional focus to simple geometric shapes. The participants performed a visual search task with four stimuli (the target and three distractors) always presented around the fixation point, inside an outlined frame not related to the search task. A cue informed the participant only about the possible size and shape of the frame, not about the target. The results of the first experiment showed faster target detection in valid-cue trials, suggesting that attention was captured automatically by the cue shape. In the second experiment, we introduced a flanker stimulus (compatible or incompatible with the target) in order to determine whether attentional resources spread homogeneously inside and outside the frame. The results showed that performance depended on both cue validity and frame orientation. The flanker effect depended on compatibility and flanker position (vertical or horizontal meridian). The results of both experiments suggest that the form of an irrelevant object can capture attention despite participants' intention, and the results of the second experiment suggest that attentional resources are more concentrated along the horizontal meridian.

  19. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives...... are described: The forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages...
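
    Forward-mode automatic differentiation, one of the three methods described, can be sketched with dual numbers carrying a value and a derivative through each operation. This is a minimal illustrative Python sketch, not the FADBAD/TADIFF C++ implementation:

```python
import math

class Dual:
    """Minimal forward-mode AD value: (value, derivative)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # Chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# d/dx [x * sin(x)] at x = 2 is sin(2) + 2*cos(2)
x = Dual(2.0, 1.0)           # seed the input's derivative with 1
y = x * sin(x)
print(y.val, y.dot)
```

    The backward (reverse) mode and Taylor expansion methods mentioned in the abstract propagate derivatives differently but rest on the same operator-overloading idea.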

  20. Automatic control logics to eliminate xenon oscillation based on Axial Offsets Trajectory Method

    International Nuclear Information System (INIS)

    Shimazu, Yoichiro

    1996-01-01

    We have proposed the Axial Offset (AO) Trajectory Method for xenon oscillation control in pressurized water reactors. The key feature of this method is that it clearly gives the control operations necessary to eliminate xenon oscillations. Using these features, automatic control logic for xenon oscillations can be made simple and easily realized. We investigated such automatic control logic. The AO Trajectory Method yields a very simple logic for eliminating xenon oscillations alone; however, additional considerations were necessary to eliminate the xenon oscillation while achieving a given axial power distribution. Another control logic based on modern control theory was also studied for comparison of control performance. The results show that automatic control logic based on the AO Trajectory Method is very simple and effective. (author)

  1. The Use of Automatic Indexing for Authority Control.

    Science.gov (United States)

    Dillon, Martin; And Others

    1981-01-01

    Uses an experimental system for authority control on a collection of bibliographic records to demonstrate the resemblance between thesaurus-based automatic indexing and automatic authority control. Details of the automatic indexing system are given, results discussed, and the benefits of the resemblance examined. Included are a rules appendix and…

  2. 30 CFR 77.1401 - Automatic controls and brakes.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic controls and brakes. 77.1401 Section... MINES Personnel Hoisting § 77.1401 Automatic controls and brakes. Hoists and elevators shall be equipped with overspeed, overwind, and automatic stop controls and with brakes capable of stopping the elevator...

  3. 30 CFR 57.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 57.19006 Section 57.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 57.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  4. 30 CFR 56.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 56.19006 Section 56.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 56.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  5. A data management program for the Electra 800 automatic analyser.

    Science.gov (United States)

    Cambus, J P; Nguyen, F; de Graeve, J; Aragon, B; Valdiguie, P

    1994-10-01

    The Electra 800 automatic coagulation analyser rapidly performs most chronometric coagulation tests with high precision. To facilitate data handling, software, adaptable to any PC running under MS-DOS, was written to manage the analyser. Data are automatically collected via the RS232 interface or can be manually input. The software can handle 64 different analyses, all entirely 'user defined'. An 'electronic worksheet' presents the results in pages of ten patients. This enables the operator to assess the data and to perform verifications or complementary tests if necessary. All results outside a predetermined range can be flagged and results can be deleted, modified or added. A patient's previous files can be recalled as the data are archived at the end of the day. A 120 Mb disk can store approximately 130,000 patient files. A daily archive function can print the day's work in alphabetical order. A communication protocol allows connection to a mainframe computer. This program and the user's manual are available on request, free of charge, from the authors.

  6. Neural-network classifiers for automatic real-world aerial image recognition

    Science.gov (United States)

    Greenberg, Shlomo; Guterman, Hugo

    1996-08-01

    We describe the application of the multilayer perceptron (MLP) network and a version of the adaptive resonance theory version 2-A (ART 2-A) network to the problem of automatic aerial image recognition (AAIR). The classification of aerial images, independent of their positions and orientations, is required for automatic tracking and target recognition. Invariance is achieved by the use of different invariant feature spaces in combination with supervised and unsupervised neural networks. The performance of neural-network-based classifiers in conjunction with several types of invariant AAIR global features, such as the Fourier-transform space, Zernike moments, central moments, and polar transforms, is examined. The advantages of this approach are discussed. The performance of the MLP network is compared with that of a classical correlator. The MLP neural-network correlator outperformed the binary phase-only filter (BPOF) correlator. It was found that the ART 2-A distinguished itself with its speed and its low number of required training vectors. However, only the MLP classifier was able to deal with a combination of shift and rotation geometric distortions.
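
    Invariant global features of the kind listed above can be illustrated with central moments. The sketch below computes the first Hu moment, which is invariant to translation (and, with the standard normalization, to scale and rotation); it is an illustrative sketch, not the AAIR feature pipeline:

```python
import numpy as np

def central_moment(img, p, q):
    """Central image moment mu_pq (translation-invariant by construction)."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    cx, cy = (x * img).sum() / m00, (y * img).sum() / m00
    return ((x - cx) ** p * (y - cy) ** q * img).sum()

def hu1(img):
    """First Hu moment: eta20 + eta02, with standard scale normalization."""
    mu00 = central_moment(img, 0, 0)
    eta20 = central_moment(img, 2, 0) / mu00 ** 2
    eta02 = central_moment(img, 0, 2) / mu00 ** 2
    return eta20 + eta02

img = np.zeros((20, 20))
img[5:9, 5:9] = 1.0                           # a small bright square
shifted = np.roll(img, (6, 7), axis=(0, 1))   # translated copy
print(hu1(img), hu1(shifted))                 # equal up to rounding
```

    Feeding such position-independent features to an MLP or ART 2-A classifier is what makes the recognition invariant to where the object sits in the image.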

  7. An evaluation of automatic coronary artery calcium scoring methods with cardiac CT using the orCaScore framework.

    Science.gov (United States)

    Wolterink, Jelmer M; Leiner, Tim; de Vos, Bob D; Coatrieux, Jean-Louis; Kelm, B Michael; Kondo, Satoshi; Salgado, Rodrigo A; Shahzad, Rahil; Shu, Huazhong; Snoeren, Miranda; Takx, Richard A P; van Vliet, Lucas J; van Walsum, Theo; Willems, Tineke P; Yang, Guanyu; Zheng, Yefeng; Viergever, Max A; Išgum, Ivana

    2016-05-01

    The amount of coronary artery calcification (CAC) is a strong and independent predictor of cardiovascular disease (CVD) events. In clinical practice, CAC is manually identified and automatically quantified in cardiac CT using commercially available software. This is a tedious and time-consuming process in large-scale studies. Therefore, a number of automatic methods that require no interaction and semiautomatic methods that require very limited interaction for the identification of CAC in cardiac CT have been proposed. Thus far, a comparison of their performance has been lacking. The objective of this study was to perform an independent evaluation of (semi)automatic methods for CAC scoring in cardiac CT using a publicly available standardized framework. Cardiac CT exams of 72 patients distributed over four CVD risk categories were provided for (semi)automatic CAC scoring. Each exam consisted of a noncontrast-enhanced calcium scoring CT (CSCT) and a corresponding coronary CT angiography (CCTA) scan. The exams were acquired in four different hospitals using state-of-the-art equipment from four major CT scanner vendors. The data were divided into 32 training exams and 40 test exams. A reference standard for CAC in CSCT was defined by consensus of two experts following a clinical protocol. The framework organizers evaluated the performance of (semi)automatic methods on test CSCT scans, per lesion, artery, and patient. Five (semi)automatic methods were evaluated. Four methods used both CSCT and CCTA to identify CAC, and one method used only CSCT. The evaluated methods correctly detected between 52% and 94% of CAC lesions with positive predictive values between 65% and 96%. Lesions in distal coronary arteries were most commonly missed and aortic calcifications close to the coronary ostia were the most common false positive errors. The majority (between 88% and 98%) of correctly identified CAC lesions were assigned to the correct artery. Linearly weighted Cohen's kappa
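
    The clinical CAC quantification that such methods automate is typically the Agatston score: lesions above 130 HU are weighted by their peak attenuation and their area. A minimal sketch under assumed inputs (per-lesion pixel counts and peak HU; a real scorer also thresholds the CT, groups voxels into lesions per slice, and applies minimum-area rules):

```python
import numpy as np

def agatston_weight(max_hu):
    """Standard density weight for a lesion's peak attenuation."""
    if max_hu >= 400: return 4
    if max_hu >= 300: return 3
    if max_hu >= 200: return 2
    if max_hu >= 130: return 1
    return 0

def agatston_score(lesions, pixel_area_mm2=0.5):
    """Sum over lesions of (area in mm^2) * density weight.

    `lesions` is a list of (n_pixels, max_hu) tuples -- an assumed,
    pre-segmented input format for illustration only.
    """
    return sum(n * pixel_area_mm2 * agatston_weight(hu)
               for n, hu in lesions)

# Two lesions: 10 px peaking at 250 HU, 4 px peaking at 450 HU
# 10*0.5*2 + 4*0.5*4 = 18.0
print(agatston_score([(10, 250), (4, 450)]))
```

    The per-artery and per-patient evaluations in the study compare exactly such lesion-level quantities between the automatic methods and the expert reference standard.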

  8. Synthesis of Fault-Tolerant Schedules with Transparency/Performance Trade-offs for Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Izosimov, Viacheslav; Pop, Paul; Eles, Petru

    2006-01-01

    of the application. We propose a novel algorithm for the synthesis of fault-tolerant schedules that can handle the transparency/performance trade-offs imposed by the designer, and makes use of the fault-occurrence information to reduce the overhead due to fault tolerance. We model the application as a conditional...... process graph, where the fault occurrence information is represented as conditional edges and the transparent recovery is captured using synchronization nodes....... such that the operation of other processes is not affected, we call it transparent recovery. Although transparent recovery has the advantages of fault containment, improved debugability and less memory needed to store the fault-tolerant schedules, it will introduce delays that can violate the timing constraints...

  9. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    International Nuclear Information System (INIS)

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

    The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling system heat-up, power ascent, power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO, from Dec. 1997 to Feb. 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)

  10. Design of Wireless Automatic Synchronization for the Low-Frequency Coded Ground Penetrating Radar

    Directory of Open Access Journals (Sweden)

    Zhenghuan Xia

    2015-01-01

    Low-frequency coded ground penetrating radar (GPR) with a pair of wire dipole antennas has advantages for deep detection. Due to the large distance between the two antennas, synchronization is a major design challenge in implementing the GPR system. This paper proposes a simple and stable wireless automatic synchronization method, based on our developed GPR system, which needs no synchronization chips or modules and reduces the cost of the hardware. The transmitter emits a synchronization preamble and pseudorandom binary sequence (PRBS) at appropriate time intervals, while the receiver automatically estimates the synchronization time and receives the signal returned from the underground targets. All processing is performed in a single FPGA. The performance of the proposed synchronization method is validated experimentally.
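
    The receiver-side estimate of the synchronization time can be sketched as a cross-correlation of the received stream against the known preamble, taking the peak lag as the synchronization instant. The sequence length, offset, and noise level below are illustrative assumptions, not the system's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# A known +/-1 pseudorandom preamble shared by transmitter and receiver.
prbs = rng.choice([-1.0, 1.0], size=127)

# Received stream: noise, with the preamble buried at an unknown offset.
offset = 500
rx = rng.normal(scale=0.5, size=2000)
rx[offset:offset + len(prbs)] += prbs

# The receiver picks the lag that maximizes correlation with the preamble.
corr = np.correlate(rx, prbs, mode="valid")
est = int(np.argmax(corr))
print(est)
```

    Because the PRBS has a sharp autocorrelation peak, the estimate stays robust at low SNR, which is what makes a chip-free wireless scheme workable.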

  11. Distribution transformer with automatic voltage adjustment - performance; Transformador de distribucion con ajuste automatico de tension - desempeno

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Ruiz, Gustavo A.; Delgadillo Bocanegra, Alfonso; Betancourt Ramirez, Enrique [PROLEC-GE, Apodaca, Nuevo Leon (Mexico)]. E-mail: gustavo1.hernandez@ge.com; alfonso.delgadillobocanegra@ge.com; enrique.betancourt@ge.com; Ramirez Arredondo, Juan M. [CINVESTAV-Guadalajara, Zapopan, Jalisco (Mexico)]. E-mail: jramirez@gdl.cinvestav.mx

    2010-11-15

    In electric power distribution systems, power quality is strongly linked with the stability of the service voltage. In radial-type systems, it is virtually impossible to achieve a flat voltage along the lines, so it is desirable to have transformers that can automatically adjust their turns ratio. This work describes the development and performance of a transformer with an integrated electronic tap changer, which allows the turns ratio to be changed over the standard range of +/-5%, and identifies the application limits of the technology.

  12. A survey on exploring key performance indicators

    Directory of Open Access Journals (Sweden)

    Mohammed Badawy

    2016-12-01

    Key performance indicators (KPIs) allow gathering knowledge and exploring the best way to achieve organization goals. Many researchers have proposed different approaches for determining KPIs, whether manually, semi-automatically, or automatically, applied in different fields. This work provides a survey of different approaches for exploring and predicting KPIs.

  13. Automatic Detection of Vehicles Using Intensity Laser and Anaglyph Image

    Directory of Open Access Journals (Sweden)

    Hideo Araki

    2006-12-01

    This work presents a methodology for automatic detection of moving cars in digital aerial images of urban areas, using intensity, anaglyph, and subtraction images. The anaglyph image is used to identify moving cars because, in the exposure pair, moving cars appear red due to the lack of homology between the two views. An implicit model was developed to provide a digital pixel value with this specific property, using the ratio between the RGB components of car objects in the anaglyph image. The intensity image is used to reduce false positives and to restrict processing to roads and streets. The subtraction image is applied to reduce false positives caused by road markings. The implemented algorithm normalizes the left and right images and then forms the anaglyph using a translation. The results show the applicability of the proposed method and its potential for automatic car detection, and demonstrate the performance of the proposed methodology.
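
    The red-ratio test described above can be sketched as a per-pixel threshold on the ratio of the red channel to the mean of green and blue. The threshold value is an illustrative assumption, not the paper's calibrated model:

```python
import numpy as np

def motion_mask(anaglyph, ratio_thresh=1.5):
    """Flag pixels whose red channel dominates green+blue.

    In an anaglyph built from two exposures, moving objects are not
    homologous between the views and show up with a strong red cast;
    the threshold here is an illustrative choice only.
    """
    r = anaglyph[..., 0].astype(float)
    gb = anaglyph[..., 1:].astype(float).mean(axis=-1) + 1e-6
    return (r / gb) > ratio_thresh

img = np.full((4, 4, 3), 100, dtype=np.uint8)     # neutral gray scene
img[1, 2] = (220, 60, 60)                          # red "motion" pixel
mask = motion_mask(img)
print(mask.sum(), mask[1, 2])
```

    The intensity and subtraction images would then be intersected with this mask to suppress the false positives the abstract mentions.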

  14. Automatic track counting with an optic RAM-based instrument

    International Nuclear Information System (INIS)

    Staderini, E.M.; Castellano, Alfredo

    1986-01-01

    A new image sensor, the optic RAM, is now used in a microprocessor-controlled instrument to read and digitize images from CR39 solid-state nuclear track detectors. The system performs image analysis, filtering, track counting and evaluation fully automatically, requiring neither an optical microscope nor photographic or television devices. The proposed system is a very compact, low-power device. (author)

  15. Type tests to the automatic thermoluminescent dosimetry system acquired by the CPHR for personal dosimetry

    International Nuclear Information System (INIS)

    Molina P, D.; Pernas S, R.; Martinez G, A.

    2006-01-01

    The CPHR individual monitoring service acquired an automatic RADOS TLD system to improve its capacity to satisfy the increasing needs of its national customers. The TLD system consists of two automatic TLD readers (model DOSACUS), a TLD irradiator, and personal dosimeter cards with slides and holders. The dosimeters were composed of these cards and LiF:Mg,Cu,P (model GR-200) detectors. The readers provide the detectors a constant-temperature readout cycle using hot nitrogen gas. In order to evaluate the performance characteristics of the system, different performance tests recommended by the IEC 1066 standard were carried out. Important dosimetric characteristics evaluated were batch homogeneity, reproducibility, detection threshold, energy dependence, residual signal and fading. The results of the tests showed good performance characteristics of the system. (Author)

  16. Interactive vs. automatic ultrasound image segmentation methods for staging hepatic lipidosis.

    NARCIS (Netherlands)

    Weijers, G.; Starke, A.; Haudum, A.; Thijssen, J.M.; Rehage, J.; Korte, C.L. de

    2010-01-01

    The aim of this study was to test the hypothesis that automatic segmentation of vessels in ultrasound (US) images can produce similar or better results in grading fatty livers than interactive segmentation. A study was performed in postpartum dairy cows (N=151), as an animal model of human fatty

  17. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  18. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors) [fr

  19. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    Science.gov (United States)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than the currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, which is extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the analysts' workload to be reduced because NET-VISA is better at finding missed events and assembling a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations arising from the automatic tests, which show an overall overlap improvement of 11% (meaning that the missed-event rate is cut by 42%), hold for the integrated interactive module as well. Analysts also find new events that qualify for the CTBTO Reviewed Event Bulletin beyond those obtained through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  20. Radiometric densimeter for measuring and automatic control of liquid density

    International Nuclear Information System (INIS)

    Wajs, J.

    1982-01-01

    The operating principle of the radiometric densimeter produced by the 'POLON' Works is presented. A simplified analysis of the correction of density-indication changes due to liquid temperature variations is described. A method is given for replacing the measuring pipe carrying the liquid under measurement with suitable standards; the method is intended for automatic control systems. (A.S.)

  1. Continuous moisture measurement in metallurgical coke with automatic charge correction

    International Nuclear Information System (INIS)

    Watzke, H.; Mehlhose, D.

    1981-01-01

    A process control system has been developed for automatic batching of the coke needed for metallurgical processes, taking into account the moisture content. The measurement is performed with a neutron moisture gauge consisting of an Am-Be neutron source and a BF₃ counter. The output of the counter is used for computer-controlled batching.

  2. Performance evaluation of an automatic positioning system for photovoltaic panels; Avaliacao de desempenho de um sistema de posicionamento automatico para paineis fotovoltaicos

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Alceu Ferreira; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Fac. de Engenharia], Emails: alceu@feb.unesp.br, jacagnon@feb.unesp.br

    2009-07-01

    The need to supply electric energy in localities not served by the utilities motivated this research, whose main focus was photovoltaic systems and the search for better performance by keeping the solar panels aimed toward the sun. This work presents a performance evaluation of an automatic positioning system for photovoltaic panels, taking into account the increase in electric energy generation and the implementation costs. A simplified electromechanical device was designed that supports and moves a photovoltaic panel throughout the day and the year, keeping its surface aimed at the sun's rays, without using sensors and with optimized movements, since the panel's inclination is adjusted only once a day. The obtained results indicated that the proposal is viable, showing a cost compatible with the increase in electricity generation. (author)

  3. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. The authors' more than 10 years of experience in the development and design of interactive tools dedicated to the study of automatic control concepts is also presented. The second part of the paper summarizes the main features of the “Automatic Control with Interactive Tools” text that has been recently published by Pea...

  4. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented......Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs...

  5. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to automatically perform maceral and reflectance analyses of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator only for the preparation of coal samples and system startup; sample scanning, microscope focusing and field-centre analysis are then fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image processing system, using both reflectance and morphological information. In this way, the system tries to reproduce the analysis procedure followed by a human expert in petrography. (Author)

  6. Automatic Transcription of Polyphonic Vocal Music

    Directory of Open Access Journals (Sweden)

    Andrew McLeod

    2017-12-01

    This paper presents a method for automatic music transcription applied to audio recordings of a cappella performances with multiple singers. We propose a system for multi-pitch detection and voice assignment that integrates an acoustic model and a music language model. The acoustic model performs spectrogram decomposition, extending probabilistic latent component analysis (PLCA) using a six-dimensional dictionary with pre-extracted log-spectral templates. The music language model performs voice separation and assignment using hidden Markov models that apply musicological assumptions. By integrating the two models, the system is able to detect multiple concurrent pitches in polyphonic vocal music and assign each detected pitch to a specific voice type such as soprano, alto, tenor or bass (SATB). We compare our system against multiple baselines, achieving state-of-the-art results for both multi-pitch detection and voice assignment on a dataset of Bach chorales and another of barbershop quartets. We also present an additional evaluation of our system using varied pitch tolerance levels to investigate its performance at 20-cent pitch resolution.

  7. Automatic Capture Verification in Pacemakers (Autocapture) – Utility and Problems

    Directory of Open Access Journals (Sweden)

    Ruth Kam

    2004-04-01

    The concept of a closed-loop feedback system that automatically assesses the pacing threshold and self-adjusts the pacing output to ensure consistent myocardial capture has many appeals. Enhanced patient safety in cases of an unexpected rise in threshold, reduced current drain (hence prolonged battery longevity) and a reduced amount of physician intervention are just some of the advantages. Autocapture (AC) is a proprietary algorithm developed by St Jude Medical CRMD, Sylmar, CA, USA (SJM), which was the first to commercially provide these automatic functions in a single-chamber pacemaker (Microny and Regency), and subsequently in dual-chamber pacemakers (the Affinity, Entity and Identity families). This article reviews the conditions necessary for AC verification and performance and the problems encountered in clinical practice.

  8. Automatic detection of AutoPEEP during controlled mechanical ventilation

    Directory of Open Access Journals (Sweden)

    Nguyen Quang-Thang

    2012-06-01

    Background: Dynamic hyperinflation, hereafter called AutoPEEP (auto-positive end-expiratory pressure) with some slight abuse of language, is a frequent deleterious phenomenon in patients undergoing mechanical ventilation. Although not readily quantifiable, AutoPEEP can be recognized on the expiratory portion of the flow waveform: if expiratory flow does not return to zero before the next inspiration, AutoPEEP is present. This simple detection, however, requires the eye of an expert clinician at the patient's bedside, so automatic detection of AutoPEEP should be helpful to optimize care. Methods: In this paper, a platform for automatic detection of AutoPEEP based on the flow signal available on most recent mechanical ventilators is introduced. The detection algorithms are developed on the basis of robust non-parametric hypothesis tests that require no prior information on the signal distribution. In particular, two detectors are proposed: one is based on SNT (Signal Norm Testing) and the other is an extension of SNT in the sequential framework. The performance assessment was carried out on a respiratory system analog and ex vivo on various retrospectively acquired patient curves. Results: The experimental results have shown that the proposed algorithm provides relevant AutoPEEP detection on both simulated and real data. The analysis of clinical data has shown that the proposed detectors can automatically detect AutoPEEP with an accuracy of 93% and a recall (sensitivity) of 90%. Conclusions: The proposed platform provides automatic early detection of AutoPEEP. Such functionality can be integrated into currently used mechanical ventilators for continuous monitoring of the patient-ventilator interface and can, therefore, alleviate the clinician's task.
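
    The bedside rule quoted above (expiratory flow not returning to zero before the next inspiration) can be sketched as a simple threshold check. This is a toy stand-in, not the paper's SNT-based detectors, and the tolerance value is illustrative:

```python
def autopeep_suspected(flow, insp_start_idx, tol=0.05):
    """Flag AutoPEEP when expiratory flow has not returned to (near) zero
    just before an inspiration begins.

    flow           -- sampled flow signal (L/s), expiratory flow negative
    insp_start_idx -- indices where inspirations (or the record end) begin
    tol            -- zero-flow tolerance in L/s (illustrative value)
    """
    flags = []
    for i in insp_start_idx:
        end_expiratory_flow = flow[i - 1]          # last expiratory sample
        flags.append(end_expiratory_flow < -tol)   # still exhaling -> AutoPEEP
    return flags

# Two breaths: flow decays to ~0 before the second breath starts (index 5),
# but is still -0.3 L/s at the end of the record (treated as a boundary).
flow = [0.5, 0.3, -0.6, -0.2, -0.01, 0.5, 0.2, -0.7, -0.5, -0.3]
print(autopeep_suspected(flow, insp_start_idx=[5, 10]))   # -> [False, True]
```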

  9. Approaches to Debugging at Scale on the Peregrine System | High-Performance

    Science.gov (United States)

    Several approaches are possible. Approach 1: Run an interactive job. Submit an interactive job asking for the number of nodes you need; type exit to end the interactive job, and then type exit again to end the screen session. Approach 2: Request a

  10. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  11. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Since tracks in solid state track detectors are measured with a microscope, observers are forced to do hard work that consumes time and labour. This results in poor statistical accuracy or in personal error. Therefore, much research has been done with the aim of simplifying and automating track measurement. Automated measurement falls into two categories: simple counting of the number of tracks, and cases that also require geometrical elements such as the size of tracks or their coordinates. The former is called automatic counting and the latter automatic analysis. The method generally used to evaluate the number of tracks in automatic counting is estimation of the total number of tracks over the whole detector area or within a microscope field of view; it is suitable when the track density is high. Methods that count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high-quality images obtained with a high-resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Among the many kinds of automatic measurement reported so far, the most frequently used are ''spark counting'' and ''video image analysis''. (Wakatsuki, Y.)
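
    In its simplest form, the "automatic counting" category reduces to labeling connected regions in a binarized image. A toy, dependency-free sketch follows; real systems operate on grey-level video images, not hand-written 0/1 grids:

```python
def count_tracks(binary):
    """Count connected components of 1-pixels (4-connectivity) in a
    binarized micrograph, a toy stand-in for automatic track counting."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]              # flood-fill this track
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and binary[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

image = [[1, 1, 0, 0],
         [0, 1, 0, 1],
         [0, 0, 0, 1],
         [1, 0, 0, 0]]
print(count_tracks(image))   # -> 3
```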

  12. Application of image recognition-based automatic hyphae detection in fungal keratitis.

    Science.gov (United States)

    Wu, Xuelian; Tao, Yuan; Qiu, Qingchen; Wu, Xinyi

    2018-03-01

    The purpose of this study is to evaluate the accuracy of two methods in the diagnosis of fungal keratitis: automatic hyphae detection based on image recognition, and corneal smear examination. We evaluate the sensitivity and specificity of automatic hyphae detection based on image recognition in diagnosing fungal keratitis, analyze the consistency between clinical symptoms and the density of hyphae, and perform quantification using automatic hyphae detection. In our study, 56 cases with fungal keratitis (single eye only) and 23 cases with bacterial keratitis were included. All cases underwent routine slit lamp biomicroscopy, corneal smear examination, microorganism culture and assessment of in vivo confocal microscopy images before starting medical treatment. We then recognized the hyphae in the in vivo confocal microscopy images using automatic hyphae detection based on image recognition, evaluated its sensitivity and specificity, and compared it with corneal smear examination. The next step was to use a density index to assess the severity of infection, then find its correlation with the patients' clinical symptoms and evaluate the consistency between them. The accuracy of this technology was superior to that of corneal smear examination (p ...). The sensitivity of automatic hyphae detection based on image recognition was 89.29%, and the specificity was 95.65%. The area under the ROC curve was 0.946. The correlation coefficient between the grading of severity in fungal keratitis by automatic hyphae detection based on image recognition and the clinical grading was 0.87. Automatic hyphae detection based on image recognition has high sensitivity and specificity and is able to identify fungal keratitis better than corneal smear examination. This technology has advantages compared with the conventional artificial identification of confocal

  13. Effect of automatic recirculation flow control on the transient response for Lungmen ABWR plant

    Energy Technology Data Exchange (ETDEWEB)

    Tzang, Y.-C., E-mail: yctzang@aec.gov.t [National Tsing Hua University, Department of Engineering and System Science, Hsinchu 30013, Taiwan (China); Chiang, R.-F.; Ferng, Y.-M.; Pei, B.-S. [National Tsing Hua University, Department of Engineering and System Science, Hsinchu 30013, Taiwan (China)

    2009-12-15

    In this study, the automatic mode of the recirculation flow control system (RFCS) for the Lungmen ABWR plant has been modeled and incorporated into the basic RETRAN-02 system model. The integrated system model is then used to analyze two transients in which the automatic RFCS is involved: (1) trip of one reactor internal pump (RIP), and (2) loss of feedwater heating. In general, the integrated system model predicts well the response of key system parameters, including neutron flux, steam dome pressure, heat flux, RIP flow, core inlet flow, feedwater flow, steam flow, and reactor water level. The transients are also analyzed for the manual RFCS case; comparisons of the transient response of the key system parameters between the automatic and manual RFCS cases show that the differences in transient response can be clearly identified. The results also show that the delta critical power ratio (ΔCPR) for the analyzed transients may not be less limiting in the automatic RFCS case under certain combinations of control system settings.

  14. Inertial piezoelectric linear motor driven by a single-phase harmonic wave with automatic clamping mechanism

    Science.gov (United States)

    He, Liangguo; Chu, Yuheng; Hao, Sai; Zhao, Xiaoyong; Dong, Yuge; Wang, Yong

    2018-05-01

    A novel single-phase, harmonic-driven inertial piezoelectric linear motor using an automatic clamping mechanism was designed, fabricated, and tested to reduce sliding friction and to simplify the drive mechanism and power supply control of the inertial motor. A piezoelectric bimorph and a flexible hinge were connected in series to form the automatic clamping mechanism, which serves as both the driving and the clamping element. A dynamic simulation in Simulink was performed to prove the feasibility of the motor, and the finite element software COMSOL was used to design its structure. An experimental setup was built to validate the working principle and evaluate the performance of the motor. The prototype motor produced a no-load velocity of 3.178 mm/s at a voltage of 220 Vp-p and a maximum traction force of 4.25 N under a preload force of 8 N. A minimum resolution of 1.14 μm was achieved at a driving frequency of 74 Hz, a driving voltage of 50 Vp-p, and a preload force of 0 N.

  15. 9th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  16. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison

    Science.gov (United States)

    Sa, Qila; Wang, Zhihui

    2018-03-01

    At present, content-based video retrieval (CBVR) is the mainstream video retrieval method, using a video's own features to perform automatic identification and retrieval. This method involves a key technology: shot segmentation. In this paper, a method for automatic video shot boundary detection using K-means clustering and improved adaptive dual threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm, namely, frames with significant change and frames with no significant change. Then, based on the classification results, the improved adaptive dual threshold comparison method is used to determine the abrupt as well as the gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
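
    The first stage, clustering frame-difference features into "significant change" vs "no significant change", can be sketched with a tiny one-dimensional two-cluster k-means. The feature values below are illustrative, and the paper's adaptive dual-threshold stage is omitted:

```python
def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means; returns a 'high change' boolean per value."""
    lo, hi = min(values), max(values)
    for _ in range(iters):
        # Assign each value to the nearer centroid (True -> high cluster).
        assign = [abs(v - lo) > abs(v - hi) for v in values]
        hi_vals = [v for v, a in zip(values, assign) if a]
        lo_vals = [v for v, a in zip(values, assign) if not a]
        if hi_vals:
            hi = sum(hi_vals) / len(hi_vals)
        if lo_vals:
            lo = sum(lo_vals) / len(lo_vals)
    return assign

# Absolute histogram-difference magnitudes between consecutive frames
# (illustrative numbers); spikes suggest candidate shot boundaries.
diffs = [0.02, 0.03, 0.01, 0.85, 0.02, 0.04, 0.90, 0.03]
boundaries = [i + 1 for i, high in enumerate(kmeans_1d(diffs)) if high]
print(boundaries)   # -> [4, 7]
```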

  18. Stay Focused! The Effects of Internal and External Focus of Attention on Movement Automaticity in Patients with Stroke.

    Science.gov (United States)

    Kal, E C; van der Kamp, J; Houdijk, H; Groet, E; van Bennekom, C A M; Scherder, E J A

    2015-01-01

    Dual-task performance is often impaired after stroke. This may be resolved by enhancing patients' automaticity of movement. This study sets out to test the constrained action hypothesis, which holds that automaticity of movement is enhanced by triggering an external focus (on movement effects), rather than an internal focus (on movement execution). Thirty-nine individuals with chronic, unilateral stroke performed a one-leg-stepping task with both legs in single- and dual-task conditions. Attentional focus was manipulated with instructions. Motor performance (movement speed), movement automaticity (fluency of movement), and dual-task performance (dual-task costs) were assessed. The effects of focus on movement speed, single- and dual-task movement fluency, and dual-task costs were analysed with generalized estimating equations. Results showed that, overall, single-task performance was unaffected by focus (p = .341). Regarding movement fluency, no main effects of focus were found in single- or dual-task conditions (p's ≥ .13). However, focus by leg interactions suggested that an external focus reduced movement fluency of the paretic leg compared to an internal focus (single-task conditions: p = .068; dual-task conditions: p = .084). An external focus also tended to result in inferior dual-task performance (β = -2.38, p = .065). Finally, a near-significant interaction (β = 2.36, p = .055) suggested that dual-task performance was more constrained by patients' attentional capacity in external focus conditions. We conclude that, compared to an internal focus, an external focus did not result in more automated movements in chronic stroke patients. Contrary to expectations, trends were found for enhanced automaticity with an internal focus. These findings might be due to patients' strong preference to use an internal focus in daily life. Future work needs to establish the more permanent effects of learning with different attentional foci on re-automating motor control

  19. Optimal Recovery Trajectories for Automatic Ground Collision Avoidance Systems (Auto GCAS)

    Science.gov (United States)

    Suplisson, Angela W.

    The US Air Force recently fielded the F-16 Automatic Ground Collision Avoidance System (Auto GCAS). This system meets the operational requirements of being both aggressive and timely, meaning that extremely agile avoidance maneuvers will be executed at the last second to avoid the ground. This small window of automatic operation maneuvering in close proximity to the ground makes the problem challenging. There currently exists no similar Auto GCAS for manned military 'heavy' aircraft with lower climb performance such as transport, tanker, or bomber aircraft. The F-16 Auto GCAS recovery is a single pre-planned roll to wings-level and 5-g pull-up which is very effective for fighters due to their high g and climb performance, but it is not suitable for military heavy aircraft. This research proposes a new optimal control approach to the ground collision avoidance problem for heavy aircraft by mapping the aggressive and timely requirements of the automatic recovery to the optimal control formulation which includes lateral maneuvers around terrain. This novel mapping creates two ways to pose the optimal control problem for Auto GCAS; one as a Max Distance with a Timely Trigger formulation and the other as a Min Control with an Aggressive Trigger formulation. Further, the optimal path and optimal control admitted by these two formulations are demonstrated to be equivalent at the point the automatic recovery is initiated for the simplified 2-D case. The Min Control formulation was demonstrated to have faster computational speed and was chosen for the 3-D case. Results are presented for representative heavy aircraft scenarios against 3-D digital terrain. The Min Control formulation was then compared to a Multi-Trajectory Auto GCAS with five pre-planned maneuvers. Metrics were developed to quantify the improvement from using an optimal approach versus the pre-planned maneuvers. 
The proposed optimal Min Control method was demonstrated to require less control or trigger later

  20. Speed and automaticity of word recognition - inseparable twins?

    DEFF Research Database (Denmark)

    Poulsen, Mads; Asmussen, Vibeke; Elbro, Carsten

    'Speed and automaticity' of word recognition is a standard collocation. However, it is not clear whether speed and automaticity (i.e., effortlessness) make independent contributions to reading comprehension. In theory, both speed and automaticity may save cognitive resources for comprehension processes. Hence, the aim of the present study was to assess the unique contributions of word recognition speed and automaticity to reading comprehension while controlling for decoding speed and accuracy. Method: 139 Grade 5 students completed tests of reading comprehension and computer-based tests of speed of decoding and word recognition together with a test of effortlessness (automaticity) of word recognition. Effortlessness was measured in a dual task in which participants were presented with a word enclosed in an unrelated figure. The task was to read the word and decide whether the figure was a triangle ...

  1. Towards Automatic Classification of Wikipedia Content

    Science.gov (United States)

    Szymański, Julian

    Wikipedia - the Free Encyclopedia - encounters the problem of proper classification of new articles every day. The process of assigning articles to categories is performed manually and is a time-consuming task. It requires knowledge of the Wikipedia structure that is beyond typical editor competence, which leads to human mistakes - omitted or incorrect assignments of articles to categories. This article presents the application of an SVM classifier for automatic classification of documents from the Free Encyclopedia. The classifier has been tested using two text representations: inter-document connections (hyperlinks) and word content. The results of the experiments, evaluated on hand-crafted data, show that the Wikipedia classification process can be partially automated. The proposed approach can be used to build a decision support system which suggests to editors the best-fitting categories for new content entered into Wikipedia.
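
    An SVM needs a learning library; as a dependency-free stand-in, a nearest-centroid bag-of-words classifier illustrates the same classify-by-word-content idea. The categories and documents below are invented toy data, not the article's Wikipedia corpus:

```python
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words vector of a document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity of two sparse word-count vectors."""
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train_centroids(labeled_docs):
    """Sum word counts per category (a nearest-centroid stand-in
    for the SVM used in the article)."""
    centroids = {}
    for text, label in labeled_docs:
        centroids.setdefault(label, Counter()).update(vectorize(text))
    return centroids

def classify(text, centroids):
    vec = vectorize(text)
    return max(centroids, key=lambda c: cosine(vec, centroids[c]))

docs = [("the goalkeeper saved the penalty in the final", "Sport"),
        ("parliament passed the new budget law", "Politics"),
        ("the striker scored two goals", "Sport"),
        ("the senate debated the election reform", "Politics")]
centroids = train_centroids(docs)
print(classify("the referee awarded a penalty goal", centroids))   # -> Sport
```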

  2. Channel selection for automatic seizure detection

    DEFF Research Database (Denmark)

    Duun-Henriksen, Jonas; Kjaer, Troels Wesenberg; Madsen, Rasmus Elsborg

    2012-01-01

    Objective: To investigate the performance of epileptic seizure detection using only a few of the recorded EEG channels and the ability of software to select these channels compared with a neurophysiologist. Methods: Fifty-nine seizures and 1419 h of interictal EEG are used for training and testing of an automatic channel selection method. The characteristics of the seizures are extracted by the use of a wavelet analysis and classified by a support vector machine. The best channel selection method is based upon maximum variance during the seizure. Results: Using only three channels, a seizure detection sensitivity of 96% and a false detection rate of 0.14/h were obtained. This corresponds to the performance obtained when channels are selected through visual inspection by a clinical neurophysiologist, and constitutes a 4% improvement in sensitivity compared to seizure detection using channels recorded ...
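
    The selection rule reported above (keep the channels with maximum variance during the seizure) is easy to sketch directly. Channel names and sample values below are toy data:

```python
def select_channels(eeg, n_select=3):
    """Rank EEG channels by signal variance over a seizure segment and
    keep the top n, the best-performing selection rule reported above.

    eeg: dict mapping channel name -> list of samples from the segment."""
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    ranked = sorted(eeg, key=lambda ch: variance(eeg[ch]), reverse=True)
    return ranked[:n_select]

# Toy seizure segment: channels F3 and T4 fluctuate strongly, Cz is flat.
seizure = {"F3": [5, -6, 7, -8], "T4": [3, -4, 5, -2],
           "Cz": [0.1, 0.0, -0.1, 0.0], "O1": [1, -1, 1, -1]}
print(select_channels(seizure, n_select=2))   # -> ['F3', 'T4']
```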

  3. [Integrated Development of Full-automatic Fluorescence Analyzer].

    Science.gov (United States)

    Zhang, Mei; Lin, Zhibo; Yuan, Peng; Yao, Zhifeng; Hu, Yueming

    2015-10-01

    In view of the fact that medical inspection equipment sold in the domestic market is mainly imported and very expensive, we developed a full-automatic fluorescence analyzer in our center, presented in this paper. The paper introduces in detail the hardware architecture, consisting of an FPGA/DSP motion control card, a PC and an STM32 embedded microprocessor unit; the software system, based on multi-threaded C#; and the design and implementation of the communication between the two units. By simplifying the hardware structure, selecting hardware appropriately and adopting object-oriented technology in the control system software, we have improved the precision and speed of the control system significantly. Finally, performance tests showed that the control system could meet the needs of an automated fluorescence analyzer in functionality, performance and cost.

  4. Multi-atlas-based automatic 3D segmentation for prostate brachytherapy in transrectal ultrasound images

    Science.gov (United States)

    Nouranian, Saman; Mahdavi, S. Sara; Spadinger, Ingrid; Morris, William J.; Salcudean, S. E.; Abolmaesumi, P.

    2013-03-01

    One of the commonly used treatment methods for early-stage prostate cancer is brachytherapy. The standard of care for planning this procedure is segmentation of contours from transrectal ultrasound (TRUS) images, which closely follow the prostate boundary. This process is currently performed either manually or using semi-automatic techniques. This paper introduces a fully automatic segmentation algorithm which uses a priori knowledge of contours in a reference data set of TRUS volumes. A non-parametric deformable registration method is employed to transform the atlas prostate contours to the target image coordinates. All atlas images are sorted based on their registration results, and the highest-ranked registrations are selected for decision fusion. A Simultaneous Truth and Performance Level Estimation (STAPLE) algorithm is utilized to fuse labels from the registered atlases and produce a segmented target volume. In this experiment, 50 patient TRUS volumes were obtained and a leave-one-out study is reported. We also compare our results with a state-of-the-art semi-automatic prostate segmentation method that has been used clinically for planning prostate brachytherapy procedures, and we show comparable accuracy and precision within clinically acceptable runtime.
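
    The decision-fusion step can be illustrated with plain majority voting, a simplified stand-in for the STAPLE algorithm named above (STAPLE additionally weights each atlas by its estimated performance level):

```python
def majority_vote(label_maps):
    """Fuse binary label maps (e.g., registered atlas contours) voxel-wise.
    A voxel is labeled prostate (1) if more than half of the atlases agree.
    Plain majority voting is a simplified stand-in for STAPLE."""
    n = len(label_maps)
    fused = []
    for voxels in zip(*label_maps):       # iterate voxel positions
        fused.append(1 if sum(voxels) * 2 > n else 0)
    return fused

# Three registered atlases voting on a 6-voxel toy volume
atlases = [[1, 1, 0, 0, 1, 0],
           [1, 0, 0, 1, 1, 0],
           [1, 1, 0, 0, 0, 1]]
print(majority_vote(atlases))   # -> [1, 1, 0, 0, 1, 0]
```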

  5. Automatic bad channel detection in intracranial electroencephalographic recordings using ensemble machine learning.

    Science.gov (United States)

    Tuyisenge, Viateur; Trebaul, Lena; Bhattacharjee, Manik; Chanteloup-Forêt, Blandine; Saubat-Guigui, Carole; Mîndruţă, Ioana; Rheims, Sylvain; Maillard, Louis; Kahane, Philippe; Taussig, Delphine; David, Olivier

    2018-03-01

    Intracranial electroencephalographic (iEEG) recordings contain "bad channels", which show non-neuronal signals. Here, we developed a new method that automatically detects iEEG bad channels using machine learning of seven signal features. The features quantified the signals' variance, spatial-temporal correlation and nonlinear properties. Because the number of bad channels is usually much lower than the number of good channels, we implemented an ensemble bagging classifier, known to be optimal in terms of stability and predictive accuracy for datasets with imbalanced class distributions. This method was applied to stereo-electroencephalographic (SEEG) signals recorded during low-frequency stimulations performed in 206 patients from 5 clinical centers. We found that the classification accuracy was extremely good: it increased with the number of subjects used to train the classifier and reached a plateau at 99.77% for 110 subjects. The classification performance was thus not impacted by the multicentric nature of the data. The proposed method for automatically detecting bad channels demonstrated convincing results and can be envisaged for use on larger datasets for automatic quality control of iEEG data. This is the first method proposed to classify bad channels in iEEG and should help improve data selection when reviewing iEEG signals. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
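
    A bagging ensemble of the kind described can be sketched without libraries by bootstrap-aggregating one-feature threshold stumps. The two features and the toy data below are invented, not the paper's seven signal features:

```python
import random

def fit_stump(X, y):
    """Pick the (feature, threshold, polarity) with best training accuracy."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for pol in (1, -1):
                pred = [1 if pol * (row[f] - t) > 0 else 0 for row in X]
                acc = sum(p == yy for p, yy in zip(pred, y)) / len(y)
                if best is None or acc > best[0]:
                    best = (acc, f, t, pol)
    return best[1:]

def bagging_fit(X, y, n_estimators=25, seed=0):
    """Bootstrap-aggregated stumps: a dependency-free stand-in for the
    ensemble bagging classifier described above."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_estimators):
        idx = [rng.randrange(len(X)) for _ in X]   # bootstrap sample
        models.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return models

def bagging_predict(models, row):
    votes = sum(1 if pol * (row[f] - t) > 0 else 0 for f, t, pol in models)
    return 1 if votes * 2 > len(models) else 0     # majority vote

# Toy features per channel: [variance, nonlinearity]; label 1 = bad channel
X = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15], [5.0, 3.0], [4.0, 2.5], [6.0, 3.5]]
y = [0, 0, 0, 1, 1, 1]
models = bagging_fit(X, y)
print([bagging_predict(models, r) for r in ([0.05, 0.05], [5.5, 3.2])])
```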

  6. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques.

  7. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led towards automatic exchange of information as the global standard and the issues that remain to be solved. First, the various competing models of information exchange, such as Double Tax Treaties (DTT), TIEAs, FATCA or EU Directives, are described with a view to showing how they interact. Second, the so-called Rubik strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  8. Comparison of manual and automatic MR-CT registration for radiotherapy of prostate cancer.

    Science.gov (United States)

    Korsager, Anne Sofie; Carl, Jesper; Riis Østergaard, Lasse

    2016-05-08

    In image-guided radiotherapy (IGRT) of prostate cancer, delineation of the clinical target volume (CTV) often relies on magnetic resonance (MR) imaging because of its good soft-tissue visualization. Registration of MR and computed tomography (CT) is required in order to transfer this accurate delineation to the dose planning CT. An automatic approach for local MR-CT registration of the prostate has previously been developed using voxel property-based registration as an alternative to manual landmark-based registration. The aim of this study is to compare the two registration approaches and to investigate the clinical potential for replacing the manual registration with the automatic one. Registrations and analysis were performed for 30 prostate cancer patients treated with IGRT using a Ni-Ti prostate stent as a fiducial marker. The comparison included computing translational and rotational differences between the approaches, visual inspection, and computing the overlap of the CTV. The computed mean translational differences were 1.65, 1.60, and 1.80 mm and the computed mean rotational differences were 1.51°, 3.93°, and 2.09° in the superior/inferior, anterior/posterior, and medial/lateral directions, respectively. The sensitivity of overlap was 87%. The results demonstrate that the automatic registration approach performs registrations comparable to the manual registration.
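
    The per-direction comparison reported above can be sketched as follows. Each registration is represented here simply as three translations (mm) and three rotations (degrees); all numbers are toy values, not the study's data:

```python
def registration_difference(manual, automatic):
    """Per-axis translational (mm) and rotational (deg) differences between
    two rigid registrations, each given as (tx, ty, tz, rx, ry, rz)."""
    t_diff = [abs(m - a) for m, a in zip(manual[:3], automatic[:3])]
    r_diff = [abs((m - a + 180) % 360 - 180)     # wrap angles to [-180, 180]
              for m, a in zip(manual[3:], automatic[3:])]
    return t_diff, r_diff

# Toy example: manual vs automatic registration parameters
manual_reg    = (10.0, -4.0, 2.5, 1.0, -2.0, 0.5)
automatic_reg = ( 8.4, -2.4, 0.7, 2.5,  1.9, 2.6)
t, r = registration_difference(manual_reg, automatic_reg)
print([round(x, 2) for x in t], [round(x, 2) for x in r])
```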

  9. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    Science.gov (United States)

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  10. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  11. Compatibility of automatic exposure control with new screen phosphors in diagnostic roentgenography

    International Nuclear Information System (INIS)

    Mulvaney, J.A.

    1982-01-01

    Automatic exposure control systems are used in diagnostic roentgenography to obtain proper film density for a variety of patient examinations and roentgenographic techniques. Most automatic exposure control systems have been designed for use with par speed, calcium tungstate intensifying screens. The use of screens with faster speeds and new phosphor materials has put extreme demands on present systems. The performance of a representative automatic exposure control system is investigated to determine its ability to maintain constant film density over a wide range of x-ray tube voltages and acrylic phantom thicknesses with four different intensifying screen phosphors. The effects of x-ray energy dependence, generator switching time and stored charge are investigated. The system is able to maintain film density to within plus or minus 0.2 optical density units for techniques representing adult patients. A single nonadjustable tube voltage compensation circuit is adequate for the four different screen phosphors for x-ray tube voltages above sixty peak kilovolts. For techniques representing pediatric patients at high x-ray tube voltages, excess film density occurs due to stored charge in the transformer and high-voltage cables. An anticipation circuit in the automatic exposure control circuit can be modified to correct for stored charge effects. In a separate experiment the energy dependence of three different ionization chamber detectors used in automatic exposure control systems is compared directly with the energy dependence of three different screen phosphors. The data on detector sensitivity and screen speed are combined to predict the best tube voltage compensation for each combination of screen and detector

  12. Characteristics and design improvement of AP1000 automatic depressurization system

    International Nuclear Information System (INIS)

    Jin Fei

    2012-01-01

    The automatic depressurization system, a distinctive feature of the AP1000 design, enhances the plant's capability to mitigate design basis accidents. The advancement of the system is discussed by comparing it with traditional PWR designs and analyzing system functions such as depressurizing and venting. System design improvements made during the China Project are also described. At the end, suggestions for the system in the China Project are listed. (author)

  13. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on the BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...

  14. Post-convergence automatic differentiation of iterative schemes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1997-01-01

    A new approach for performing automatic differentiation (AD) of computer codes that embody an iterative procedure, based on differentiating a single additional iteration upon achieving convergence, is described and implemented. This post-convergence automatic differentiation (PAD) technique results in better accuracy of the computed derivatives, as it eliminates part of the derivatives convergence error, and a large reduction in execution time, especially when many iterations are required to achieve convergence. In addition, it provides a way to compute derivatives of the converged solution without having to repeat the entire iterative process every time new parameters are considered. These advantages are demonstrated and the PAD technique is validated via a set of three linear and nonlinear codes used to solve neutron transport and fluid flow problems. The PAD technique reduces the execution time over direct AD by a factor of up to 30 and improves the accuracy of the derivatives by up to two orders of magnitude. The PAD technique's biggest disadvantage lies in the necessity to compute the iterative map's Jacobian, which for large problems can be prohibitive. Methods are discussed to alleviate this difficulty
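
    The sensitivity relation that PAD exploits can be sketched on a toy fixed-point problem. The map g(x, p) = cos(x) + p and all numbers below are illustrative, not from the paper: at a converged fixed point x* = g(x*, p), the derivative satisfies dx*/dp = g_p / (1 - g_x), so it can be evaluated once after convergence instead of carrying derivatives through every iteration.

```python
import math

def solve_and_differentiate(p, tol=1e-12, max_iter=500):
    """Fixed-point iteration x_{n+1} = g(x_n, p) with g(x, p) = cos(x) + p,
    followed by a single post-convergence sensitivity evaluation."""
    x = 0.0
    for _ in range(max_iter):
        x_new = math.cos(x) + p
        if abs(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    # At the fixed point, dx*/dp = g_p / (1 - g_x) with g_x = -sin(x*), g_p = 1.
    dxdp = 1.0 / (1.0 + math.sin(x))
    return x, dxdp

x_star, sensitivity = solve_and_differentiate(0.1)
print(x_star, sensitivity)
```

    The derivative obtained this way agrees with a finite-difference estimate, without re-running the iteration for perturbed parameters, which is the execution-time advantage the abstract describes.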

  15. Automatic Lamp and Fan Control Based on Microcontroller

    Science.gov (United States)

    Widyaningrum, V. T.; Pramudita, Y. D.

    2018-01-01

    In general, automation can be described as a process following pre-determined sequential steps with little or no human exertion. Automation is provided through the use of various sensors suitable for observing the production processes, actuators, and different techniques and devices. In this research, the automation system developed is an automatic lamp and an automatic fan for a smart home. Both of these systems are processed using an Arduino Mega 2560 microcontroller. The microcontroller obtains values of physical conditions through sensors connected to it. The automatic lamp system requires an LDR (Light Dependent Resistor) sensor to detect light, while the automatic fan system requires a DHT11 sensor to detect temperature. In the tests performed, the lamp and fan worked properly. The lamp turns on automatically when the light begins to darken, and turns off automatically when the light becomes bright again. In addition, it can also be concluded that the readings of an LDR sensor placed outside the room differ from those of an LDR sensor placed in the room. This is because the light intensity received by the LDR sensor in the room is blocked by the wall of the house or by other objects. The fan turns on automatically when the temperature is greater than 25°C, and the fan speed can also be adjusted. The fan turns off automatically when the temperature is less than or equal to 25°C.
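
    The control rules in this record amount to simple threshold comparisons. A minimal sketch (the LDR threshold of 500 is an assumed ADC value, not from the paper; the 25°C fan threshold is taken from the abstract):

```python
LIGHT_THRESHOLD = 500      # assumed LDR reading on a 0-1023 ADC scale
TEMP_THRESHOLD_C = 25.0    # fan switch point stated in the abstract

def lamp_state(ldr_reading):
    """Lamp turns on when ambient light drops below the threshold."""
    return "ON" if ldr_reading < LIGHT_THRESHOLD else "OFF"

def fan_state(temperature_c):
    """Fan runs only when the temperature exceeds 25 degrees C."""
    return "ON" if temperature_c > TEMP_THRESHOLD_C else "OFF"

print(lamp_state(300), fan_state(28.0))  # ON ON
print(lamp_state(800), fan_state(25.0))  # OFF OFF
```

    On the actual microcontroller the same comparisons would run inside the Arduino loop, with the sensor reads replacing the hard-coded arguments.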

  16. Experimental Breeder Reactor-II automatic control-rod-drive system

    International Nuclear Information System (INIS)

    Christensen, L.J.

    1983-01-01

    A computer-controlled automatic control rod drive system (ACRDS) was designed and operated in EBR-II during reactor runs 121 and 122. The ACRDS was operated in a checkout mode during run 121 using a low worth control rod. During run 122 a high worth control rod was used to perform overpower transient tests as part of the LMFBR oxide fuels transient testing program. The testing program required an increase in power of 4 MW/s, a hold time of 12 minutes and a power decrease of 4 MW/s. During run 122, 13 power transients were performed

  17. Automatic control system at the ''Loviisa'' NPP

    International Nuclear Information System (INIS)

    Kukhtevich, I.V.; Mal'tsev, B.K.; Sergievskaya, E.N.

    1980-01-01

    The automatic control system of the Loviisa-1 NPP (Finland) is described. According to the operating conditions of the Finnish power system, the Loviisa-1 NPP must operate in the mode of weekly and daily control of the loading schedule and participate in the ongoing control of power system frequency and capacity. To meet these requirements, the NPP is equipped with an all-regime system for automatic control that functions during reactor start-up and shut-down, in normal and transient regimes, and in emergency situations. The automatic control system includes: a data subsystem, an automatic control subsystem, a discrete control subsystem (including remote control), a subsystem for reactor control and protection, and the overall station system of protections: control and dosimetry inside the reactor. The structures of the data-computer complex, the discrete control subsystems, the reactor control and protection systems, the neutron flux control system, the in-reactor control system, the station protection system, and the system for control of fuel element tightness are presented briefly. Two years of NPP operating experience confirmed the advisability of the chosen degree of automation. The Loviisa-1 NPP operates successfully in the mode of weekly and daily control of the supervisor schedule and ongoing frequency control (short-term control)

  18. Atlas-based automatic segmentation of head and neck organs at risk and nodal target volumes: a clinical validation.

    Science.gov (United States)

    Daisne, Jean-François; Blumhofer, Andreas

    2013-06-26

    Intensity modulated radiotherapy for head and neck cancer necessitates accurate definition of organs at risk (OAR) and clinical target volumes (CTV). This crucial step is time consuming and prone to inter- and intra-observer variations. Automatic segmentation by atlas deformable registration may help to reduce time and variations. We aim to test a new commercial atlas algorithm for automatic segmentation of OAR and CTV in both ideal and clinical conditions. The updated Brainlab automatic head and neck atlas segmentation was tested on 20 patients: 10 cN0-stages (ideal population) and 10 unselected N-stages (clinical population). Following manual delineation of OAR and CTV, automatic segmentation of the same set of structures was performed and afterwards manually corrected. Dice Similarity Coefficient (DSC), Average Surface Distance (ASD) and Maximal Surface Distance (MSD) were calculated for "manual to automatic" and "manual to corrected" volume comparisons. In both groups, automatic segmentation saved about 40% of the corresponding manual segmentation time. This effect was more pronounced for OAR than for CTV. Editing the automatically obtained contours significantly improved DSC, ASD and MSD. Large distortions of normal anatomy or lack of iodine contrast were the limiting factors. The updated Brainlab atlas-based automatic segmentation tool for head and neck cancer patients is timesaving but still necessitates review and corrections by an expert.
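
    The Dice Similarity Coefficient reported in this record is straightforward to compute from two binary segmentations: DSC = 2|A ∩ B| / (|A| + |B|). A minimal sketch with toy masks (the masks are illustrative, not the study's data):

```python
def dice_similarity(a, b):
    """Dice Similarity Coefficient between two binary segmentations,
    each given as a set of voxel coordinates: DSC = 2|A & B| / (|A| + |B|)."""
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

# Two toy 2-D masks of 100 voxels each, overlapping in 80 voxels:
manual = {(x, y) for x in range(10) for y in range(10)}
auto_seg = {(x, y) for x in range(2, 12) for y in range(10)}
print(dice_similarity(manual, auto_seg))  # 0.8
```

    A DSC of 1.0 means the automatic and manual contours coincide exactly; values improve as the automatically obtained contours are manually corrected, which is what the study measures.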

  19. Development report: Automatic System Test and Calibration (ASTAC) equipment

    Science.gov (United States)

    Thoren, R. J.

    1981-01-01

    A microcomputer-based automatic test system was developed for the daily performance monitoring of the wind energy system time domain (WEST) analyzer. The test system consists of a microprocessor-based controller and a hybrid interface unit, which are used for inputting prescribed test signals into all WEST subsystems and for monitoring WEST responses to these signals. Performance is compared to theoretically correct performance levels calculated off line on a large general-purpose digital computer. Results are displayed on a cathode ray tube or are available from a line printer. Excessive drift and/or lack of repeatability of the high speed analog sections within WEST is easily detected and the malfunctioning hardware identified using this system.

  20. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Science.gov (United States)

    2010-04-01

    ... SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Equipment § 211.68 Automatic, mechanical, and electronic equipment. (a) Automatic, mechanical, or electronic... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Automatic, mechanical, and electronic equipment...

  1. Hardware-in-the-Loop Test for Automatic Voltage Regulator of Synchronous Condenser

    DEFF Research Database (Denmark)

    Nguyen, Ha Thi; Yang, Guangya; Nielsen, Arne Hejde

    2018-01-01

    Automatic voltage regulator (AVR) plays an important role in volt/var control of synchronous condenser (SC) in power systems. Testing AVR performance under steady-state and dynamic conditions in a real grid is expensive, inefficient, and hard to achieve. To address this issue, we implement hardware...

  2. Automatic Functionality Assignment to AUTOSAR Multicore Distributed Architectures

    DEFF Research Database (Denmark)

    Maticu, Florin; Pop, Paul; Axbrink, Christian

    2016-01-01

    The automotive electronic architectures have moved from federated architectures, where one function is implemented in one ECU (Electronic Control Unit), to distributed architectures, where several functions may share resources on an ECU. In addition, multicore ECUs are being adopted because...... of better performance, cost, size, fault-tolerance and power consumption. In this paper we present an approach for the automatic software functionality assignment to multicore distributed architectures. We consider that the systems use the AUTomotive Open System ARchitecture (AUTOSAR). The functionality...

  3. Color Segmentation Approach of Infrared Thermography Camera Image for Automatic Fault Diagnosis

    International Nuclear Information System (INIS)

    Djoko Hari Nugroho; Ari Satmoko; Budhi Cynthia Dewi

    2007-01-01

    Predictive maintenance based on fault diagnosis has become very important nowadays to assure the availability and reliability of a system. The main purpose of this research is to configure computer software for automatic fault diagnosis based on an image model acquired from an infrared thermography camera using a color segmentation approach. This technique detects hot spots in plant equipment. The image acquired from the camera is first converted to the RGB (Red, Green, Blue) image model and then converted to the CMYK (Cyan, Magenta, Yellow, Key for Black) image model. Assuming that the yellow color in the image represents hot spots in the equipment, the CMYK image model is then diagnosed using a color segmentation model to estimate the fault. The software was implemented in the Borland Delphi 7.0 programming language. Its performance was then tested on 10 input infrared thermography images. The experimental results show that the software is capable of detecting faults automatically, with a performance value of 80% on the 10 input images. (author)
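
    The RGB-to-CMYK conversion and yellow-channel thresholding this record describes can be sketched with the standard conversion formulas; the 0.6 threshold and the sample pixels below are illustrative assumptions, not values from the paper:

```python
def rgb_to_cmyk(r, g, b):
    """Convert normalized RGB (0-1) to CMYK using the standard formulas."""
    k = 1.0 - max(r, g, b)
    if k == 1.0:                      # pure black: C = M = Y = 0
        return 0.0, 0.0, 0.0, 1.0
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

def hot_spot_mask(pixels, y_threshold=0.6):
    """Mark pixels whose yellow component exceeds a threshold (assumed hot spots)."""
    return [rgb_to_cmyk(*px)[2] > y_threshold for px in pixels]

# A saturated yellowish pixel vs. a dark blue one:
print(hot_spot_mask([(1.0, 0.9, 0.1), (0.1, 0.1, 0.5)]))  # [True, False]
```

    Applied per pixel over a thermography frame, the resulting boolean mask is the segmented hot-spot region the fault diagnosis operates on.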

  4. Evaluation of automatic face recognition for automatic border control on actual data recorded of travellers at Schiphol Airport

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Hendrikse, A.J.; Gerritsen, K.J.; Brömme, A.; Busch, C.

    2012-01-01

    Automatic border control at airports using automated facial recognition for checking the passport is becoming more and more common. A problem is that it is not clear how reliable these automatic gates are. Very few independent studies exist that assess the reliability of automated facial recognition

  5. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    Science.gov (United States)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  6. Type tests to the automatic system of thermoluminescent dosimetry acquired by the CPHR for personnel dosimetry

    International Nuclear Information System (INIS)

    Molina P, D.; Pernas S, R.

    2005-01-01

    The CPHR individual monitoring service acquired an automatic RADOS TLD system to improve its capacity to satisfy the increasing needs of its national customers. The TLD system consists of two automatic TLD readers (model DOSACUS), a TLD irradiator, and personal dosimeter cards including slides and holders. The dosimeters are composed of these personal dosimeter cards and LiF:Mg,Cu,P (model GR-200) detectors. The readers provide the detectors with a constant-temperature readout cycle using hot nitrogen gas. In order to evaluate the performance characteristics of the system, different performance tests recommended by the IEC 1066 standard were carried out. Important dosimetric characteristics evaluated were batch homogeneity, reproducibility, detection threshold, energy dependence, residual signal and fading. The results of the tests showed good performance characteristics of the system. (Author)

  7. Farm-specific economic value of automatic lameness detection systems in dairy cattle: From concepts to operational simulations.

    Science.gov (United States)

    Van De Gucht, Tim; Saeys, Wouter; Van Meensel, Jef; Van Nuffel, Annelies; Vangeyte, Jurgen; Lauwers, Ludwig

    2018-01-01

    Although prototypes of automatic lameness detection systems for dairy cattle exist, information about their economic value is lacking. In this paper, a conceptual and operational framework for simulating the farm-specific economic value of automatic lameness detection systems was developed and tested on 4 system types: walkover pressure plates, walkover pressure mats, camera systems, and accelerometers. The conceptual framework maps essential factors that determine economic value (e.g., lameness prevalence, incidence and duration, lameness costs, detection performance, and their relationships). The operational simulation model links treatment costs and avoided losses with detection results and farm-specific information, such as herd size and lameness status. Results show that detection performance, herd size, discount rate, and system lifespan have a large influence on economic value. In addition, lameness prevalence influences the economic value, stressing the importance of an adequate prior estimation of the on-farm prevalence. The simulations provide first estimates for the upper limits for purchase prices of automatic detection systems. The framework allowed for identification of knowledge gaps obstructing more accurate economic value estimation. These include insights in cost reductions due to early detection and treatment, and links between specific lameness causes and their related losses. Because this model provides insight in the trade-offs between automatic detection systems' performance and investment price, it is a valuable tool to guide future research and developments. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
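
    The trade-off the framework evaluates between purchase price, discount rate, and system lifespan is, at its core, a net-present-value calculation. A hedged sketch of that calculation (all monetary figures below are illustrative, not from the study):

```python
def system_npv(annual_avoided_losses, annual_costs, purchase_price,
               lifespan_years, discount_rate):
    """Net present value of a detection system: discounted yearly net benefit
    (avoided lameness losses minus running costs) minus the purchase price."""
    npv = -purchase_price
    for year in range(1, lifespan_years + 1):
        npv += (annual_avoided_losses - annual_costs) / (1.0 + discount_rate) ** year
    return npv

# Illustrative figures only: 3000/yr avoided losses, 500/yr running costs,
# 10000 purchase price, 10-year lifespan, 5% discount rate.
value = system_npv(3000.0, 500.0, 10000.0, 10, 0.05)
print(round(value, 2))
```

    Setting the NPV to zero and solving for the purchase price gives the kind of upper limit on system price that the simulations estimate; the farm-specific inputs (prevalence, herd size, detection performance) determine the avoided-losses term.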

  8. Development of an Automatic Dispensing System for Traditional Chinese Herbs

    Directory of Open Access Journals (Sweden)

    Chi-Ying Lin

    2017-01-01

    Full Text Available The gathering of ingredients for decoctions of traditional Chinese herbs still relies on manual dispensation, due to the irregular shape of many items and inconsistencies in weights. In this study, we developed an automatic dispensing system for Chinese herbal decoctions with the aim of reducing manpower costs and the risk of mistakes. We employed machine vision in conjunction with a robot manipulator to facilitate the grasping of ingredients. The name and formulation of the decoction are input via a human-computer interface, and the dispensing of multiple medicine packets is performed automatically. An off-line least-squares curve fitting method was used to calculate the amount of material grasped by the claws and thereby improve system efficiency as well as the accuracy of individual dosages. Experiments on the dispensing of actual ingredients demonstrate the feasibility of the proposed system.

  9. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters entitled: the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  10. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  11. Observed use of automatic seat belts in 1987 cars.

    Science.gov (United States)

    Williams, A F; Wells, J K; Lund, A K; Teed, N

    1989-10-01

    Usage of the automatic belt systems supplied by six large-volume automobile manufacturers to meet the federal requirements for automatic restraints was observed in suburban Washington, D.C., Chicago, Los Angeles, and Philadelphia. The different belt systems studied were: Ford and Toyota (motorized, nondetachable automatic shoulder belt), Nissan (motorized, detachable shoulder belt), VW and Chrysler (nonmotorized, detachable shoulder belt), and GM (nonmotorized, detachable lap and shoulder belt). Use of automatic belts was significantly greater than manual belt use in otherwise comparable late-model cars for all manufacturers except Chrysler; in Chrysler cars, automatic belt use was significantly lower than manual belt use. The automatic shoulder belts provided by Ford, Nissan, Toyota, and VW increased use rates to about 90%. Because use rates were lower in Ford cars with manual belts, their increase was greater. GM cars had the smallest increase in use rates; however, lap belt use was highest in GM cars. The other manufacturers supply knee bolsters to supplement shoulder belt protection; all except VW also provide manual lap belts, which were used by about half of those who used the automatic shoulder belt. The results indicate that some manufacturers have been more successful than others in providing automatic belt systems that result in high use that, in turn, will mean fewer deaths and injuries in those cars.

  12. Dissociating Working Memory Updating and Automatic Updating: The Reference-Back Paradigm

    Science.gov (United States)

    Rac-Lubashevsky, Rachel; Kessler, Yoav

    2016-01-01

    Working memory (WM) updating is a controlled process through which relevant information in the environment is selected to enter the gate to WM and substitute its contents. We suggest that there is also an automatic form of updating, which influences performance in many tasks and is primarily manifested in reaction time sequential effects. The goal…

  13. Automaticity of cognitive biases in addictive behaviours: further evidence with gamblers.

    Science.gov (United States)

    McCusker, C G; Gettings, B

    1997-11-01

    The hypothesis that automatic, non-volitional attentional and memory biases for addiction-related constructs exist is tested with compulsive gamblers. An independent groups design was employed. Processing of gambling, compared to neutral and drug-related information, was examined in 15 gamblers recruited from new members of Gamblers Anonymous. Comparisons were made with the performance of their spouses (N = 15) to help distinguish addiction mechanisms from more non-specific emotional experiences with gambling, and with an independent control group (N = 15) recruited from the staff and students of a university department. A modified Stroop procedure was first employed. Automatic cognitive interference was assessed relatively, by comparing colour-naming times on the gambling, drug and neutral Stroops. A subsequent word-stem completion task of implicit memory was then used to assess selective and automatic priming of the gambling constructs in memory. Only the gamblers showed selective and automatic interference for gambling-related constructs on the Stroop task. Spouses behaved like the control group on this task. An implicit memory bias for gambling-related words was statistically detected only in the gamblers compared to the control group, although the trend was similar in the comparison with spouses. Further evidence for the specificity of these effects was obtained in subgroup comparisons involving fruit-machine with racing gamblers. Results are generally consistent with an automaticity in the cognitive biases gamblers show for gambling-related information. Implications for cognitive understanding and treatments are highlighted.

  14. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. 
The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  15. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  16. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  17. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    The wide use of automatic process control systems, and the use of high-performance systems containing a number of computers (processors), create opportunities for high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computations, and the processing of big data arrays all require a high level of productivity and, at the same time, minimal time for data handling and the delivery of results. To achieve the best times, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. This paper considers some of the basic task scheduling methods for multi-machine process control systems, highlights their advantages and disadvantages, and offers some considerations on their use when developing software for automatic process control systems.
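The paper surveys multi-machine scheduling methods without fixing a single one; as a concrete illustration, here is a sketch of the classic longest-processing-time-first (LPT) list-scheduling heuristic, a standard baseline for minimizing makespan on identical machines (illustrative, not necessarily one of the methods the authors consider):

```python
import heapq

def lpt_schedule(durations, n_machines):
    """Assign each task to the currently least-loaded machine,
    considering tasks in decreasing order of duration (the LPT rule).
    Returns (assignment: machine -> task indices, makespan)."""
    # Min-heap of (current load, machine id) picks the least-loaded machine.
    heap = [(0.0, m) for m in range(n_machines)]
    heapq.heapify(heap)
    assignment = {m: [] for m in range(n_machines)}
    for task in sorted(range(len(durations)), key=lambda i: -durations[i]):
        load, m = heapq.heappop(heap)
        assignment[m].append(task)
        heapq.heappush(heap, (load + durations[task], m))
    makespan = max(load for load, _ in heap)
    return assignment, makespan
```

For example, `lpt_schedule([7, 5, 4, 3, 3, 2], 2)` splits the 24 units of work perfectly, giving a makespan of 12.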

  18. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferred way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark wrong data, remove them, and replace them with interpolated data. In general, the first step in detecting wrong, anomalous data is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation score generation, and score interpretation. This paper presents the overall framework for a data quality improvement system suitable for automatic, semi-automatic, or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the score interpretation, needs to be investigated further on the developed system.
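The three-part validation workflow (preparation, score generation, score interpretation) can be illustrated with a deliberately simple score, the absolute z-score, followed by linear interpolation of flagged samples. The threshold is an illustrative assumption, not one of the methods applied to the Belgrade data:

```python
def validation_scores(series):
    """Score generation: absolute z-score of each sample."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    if std == 0:
        std = 1.0
    return [abs(x - mean) / std for x in series]

def clean(series, threshold=2.0):
    """Score interpretation: mark high-score samples as anomalous and
    replace them by linear interpolation of the nearest valid neighbours,
    mirroring the expert workflow described in the abstract."""
    scores = validation_scores(series)
    good = [i for i, s in enumerate(scores) if s < threshold]
    out = list(series)
    for i, s in enumerate(scores):
        if s >= threshold:
            left = max((j for j in good if j < i), default=None)
            right = min((j for j in good if j > i), default=None)
            if left is not None and right is not None:
                w = (i - left) / (right - left)
                out[i] = series[left] * (1 - w) + series[right] * w
            elif left is not None:
                out[i] = series[left]
            elif right is not None:
                out[i] = series[right]
    return out
```

A spike in otherwise flat data, e.g. `clean([1, 1, 1, 10, 1, 1, 1])`, is flagged and replaced by the interpolated value `1.0`.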

  19. Automatic multimodal real-time tracking for image plane alignment in interventional Magnetic Resonance Imaging

    International Nuclear Information System (INIS)

    Neumann, Markus

    2014-01-01

    Interventional magnetic resonance imaging (MRI) aims at performing minimally invasive percutaneous interventions, such as tumor ablations and biopsies, under MRI guidance. During such interventions, the acquired MR image planes are typically aligned to the surgical instrument (needle) axis and to surrounding anatomical structures of interest, in order to efficiently monitor, in real time, the advancement of the instrument inside the patient's body. Object tracking inside the MRI is expected to facilitate and accelerate MR-guided interventions by automatically aligning the image planes to the surgical instrument. In this PhD thesis, an image-based workflow is proposed and refined for automatic image plane alignment. An automatic tracking workflow was developed that performs detection and tracking of a passive marker directly in clinical real-time images. This tracking workflow is designed for fully automated image plane alignment with minimal tracking-dedicated time. Its main drawback is its inherent dependence on the slow clinical MRI update rate. First, the addition of motion estimation and prediction with a Kalman filter was investigated and improved the workflow's tracking performance. Second, a complementary optical sensor was used for multi-sensor tracking in order to decouple the tracking update rate from the MR image acquisition rate. Performance of the workflow was evaluated with both computer simulations and experiments using an MR-compatible test bed. Results show a high robustness of the multi-sensor tracking approach for dynamic image plane alignment, due to the combination of the individual strengths of each sensor. (author)
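The motion prediction step added to the tracking workflow can be illustrated with a minimal one-dimensional constant-velocity Kalman filter; the noise parameters below are illustrative assumptions, not values from the thesis:

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one coordinate of a marker:
    predict bridges the gap between slow image updates, update fuses
    each new position measurement."""

    def __init__(self, q=0.01, r=0.25):
        self.x, self.v = 0.0, 0.0          # state: position, velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
        self.q, self.r = q, r              # process / measurement noise

    def predict(self, dt):
        # Motion model x' = x + v*dt; propagate covariance P' = F P F^T + Q.
        self.x += self.v * dt
        p = self.P
        self.P = [[p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q,
                   p[0][1] + dt * p[1][1]],
                  [p[1][0] + dt * p[1][1],
                   p[1][1] + self.q]]
        return self.x

    def update(self, z):
        # Position-only measurement, H = [1, 0].
        s = self.P[0][0] + self.r
        k0, k1 = self.P[0][0] / s, self.P[1][0] / s  # Kalman gain
        innov = z - self.x
        self.x += k0 * innov
        self.v += k1 * innov
        p = self.P
        self.P = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
        return self.x
```

Fed with measurements of a marker moving at constant speed, the filter quickly converges to the true position and velocity, so the predicted position can drive plane alignment between image updates.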

  20. Automatic selection of resting-state networks with functional magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Silvia Francesca eStorti

    2013-05-01

    Full Text Available Functional magnetic resonance imaging (fMRI) during a resting-state condition can reveal the co-activation of specific brain regions in distributed networks, called resting-state networks, which are selected by independent component analysis (ICA) of the fMRI data. One of the major difficulties with component analysis is the automatic selection of the ICA features related to brain activity. In this study we describe a method designed to automatically select networks of potential functional relevance, specifically, those regions known to be involved in motor function, visual processing, executive functioning, auditory processing, memory, and the default-mode network. To do this, image analysis was based on probabilistic ICA as implemented in FSL software. After decomposition, the optimal number of components was selected by applying a novel algorithm which takes into account, for each component, Pearson's median coefficient of skewness of the spatial maps generated by FSL, followed by clustering, segmentation, and spectral analysis. To evaluate the performance of the approach, we investigated the resting-state networks in 25 subjects. For each subject, three resting-state scans were obtained with a Siemens Allegra 3 T scanner (NYU data set). Comparison of the visually and the automatically identified neuronal networks showed that the algorithm had high accuracy (first scan: 95%, second scan: 95%, third scan: 93%) and precision (90%, 90%, 84%). The reproducibility of the networks for visual and automatic selection was very close: it was highly consistent in each subject for the default-mode network (≥ 92%) and the occipital network, which includes the medial visual cortical areas (≥ 94%), and consistent for the attention network (≥ 80%), the right and/or left lateralized frontoparietal attention networks, and the temporal-motor network (≥ 80%). The automatic selection method may be used to detect neural networks and reduce subjectivity in ICA.
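The per-component statistic the selection starts from, Pearson's (second) median coefficient of skewness, 3·(mean − median)/σ, is easy to compute; genuine networks have strongly skewed spatial maps (few highly active voxels), while noise components are closer to symmetric. The selection threshold below is an illustrative assumption, not the paper's full pipeline:

```python
import statistics

def median_skewness(values):
    """Pearson's second skewness coefficient: 3 * (mean - median) / std."""
    mean = statistics.fmean(values)
    med = statistics.median(values)
    std = statistics.pstdev(values)
    return 0.0 if std == 0 else 3.0 * (mean - med) / std

def select_components(spatial_maps, threshold=0.1):
    """Keep indices of components whose spatial maps are sufficiently
    right-skewed (a crude stand-in for the full selection algorithm)."""
    return [i for i, m in enumerate(spatial_maps)
            if median_skewness(m) > threshold]
```

A map with a few large values, such as `[0, 0, 0, 0, 10]`, scores 1.5 and is kept, while a symmetric map like `[1, 2, 3]` scores 0 and is rejected.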

  1. Prediction Governors for Input-Affine Nonlinear Systems and Application to Automatic Driving Control

    Directory of Open Access Journals (Sweden)

    Yuki Minami

    2018-04-01

    Full Text Available In recent years, automatic driving control has attracted attention. To achieve satisfactory driving control performance, the prediction accuracy of the traveling route is important: if a highly accurate prediction method can be used, an accurate traveling route can be obtained. Despite the considerable efforts that have been invested in improving prediction methods, prediction errors do occur in general. Thus, a method to minimize the influence of prediction errors on automatic driving control systems is required. This need motivated us to focus on the design of a mechanism for shaping prediction signals, called a prediction governor. In this study, we first extended our previous study to the input-affine nonlinear system case. Then, we analytically derived a solution to an optimal design problem for prediction governors. Finally, we applied the solution to an automatic driving control system and demonstrated its usefulness through a numerical example and an experiment using a radio-controlled car.

  2. Automatic analog IC sizing and optimization constrained with PVT corners and layout effects

    CERN Document Server

    Lourenço, Nuno; Horta, Nuno

    2017-01-01

    This book introduces readers to a variety of tools for automatic analog integrated circuit (IC) sizing and optimization. The authors provide a historical perspective on the early methods proposed to tackle automatic analog circuit sizing, with emphasis on the methodologies to size and optimize the circuit, and on the methodologies to estimate the circuit’s performance. The discussion also includes robust circuit design and optimization and the most recent advances in layout-aware analog sizing approaches. The authors describe a methodology for an automatic flow for analog IC design, including details of the inputs and interfaces, multi-objective optimization techniques, and the enhancements made in the base implementation by using machine learning techniques. The Gradient model is discussed in detail, along with the methods to include layout effects in the circuit sizing. The concepts and algorithms of all the modules are thoroughly described, enabling readers to reproduce the methodologies, improve the qual...

  3. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Full Text Available The performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with graphical processing units, have enabled broad enhancements in parallelism. Several compilers have been updated to address the challenges of synchronization and threading. Appropriate program and algorithm classification can greatly help software engineers identify opportunities for effective parallelization. In the present work we investigate current species for the classification of algorithms; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen whose structures match different issues and perform a given task. We have tested these algorithms using existing automatic species-extraction tools along with the Bones compiler. We have added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants, and mathematical functions. With this, we can retain significant data that is not captured by the original species of algorithms. We implemented these new capabilities in the tool, enabling automatic characterization of program code.

  4. Automatic speech signal segmentation based on the innovation adaptive filter

    Directory of Open Access Journals (Sweden)

    Makowski Ryszard

    2014-06-01

    Full Text Available Speech segmentation is an essential stage in designing automatic speech recognition systems, and one can find several algorithms proposed in the literature. It is a difficult problem, as speech is immensely variable. The aim of the authors’ studies was to design an algorithm that could be employed at the stage of automatic speech recognition. This would make it possible to avoid some problems related to speech signal parametrization. Posing the problem in such a way requires the algorithm to be capable of working in real time. The only such algorithm was proposed by Tyagi et al. (2006), and it is a modified version of Brandt’s algorithm. The article presents a new algorithm for unsupervised automatic speech signal segmentation. It performs segmentation without access to information about the phonetic content of the utterances, relying exclusively on second-order statistics of the speech signal. The starting point for the proposed method is the time-varying Schur coefficients of an innovation adaptive filter. The Schur algorithm is known to be fast, precise, stable and capable of rapidly tracking changes in second-order signal statistics. A transition from one phoneme to another in the speech signal always indicates a change in signal statistics caused by vocal tract changes. In order to allow for the properties of human hearing, detection of inter-phoneme boundaries is performed based on statistics defined on the mel spectrum determined from the reflection coefficients. The paper presents the structure of the algorithm, defines its properties, lists parameter values, describes detection efficiency results, and compares them with those of another algorithm. The segmentation results obtained are satisfactory.
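The underlying idea, detecting boundaries as changes in second-order signal statistics, can be illustrated with a much simpler stand-in than Schur coefficients: compare short-term energy (a second-order statistic) in adjacent windows and mark a boundary where the ratio jumps. Window length and threshold are illustrative assumptions, not the paper's parameters:

```python
def boundaries(signal, win=8, ratio=4.0):
    """Return sample indices where the energy of the window to the left
    and the window to the right differ by more than `ratio`, a crude
    proxy for a change in second-order statistics at a phoneme boundary."""
    def energy(seg):
        return sum(x * x for x in seg) / len(seg)

    marks = []
    for t in range(win, len(signal) - win, win):
        left = energy(signal[t - win:t])
        right = energy(signal[t:t + win])
        r = max(left, right) / max(min(left, right), 1e-12)
        if r > ratio:
            marks.append(t)
    return marks
```

For a signal that switches from low to high amplitude halfway through, e.g. `[0.01] * 16 + [1.0] * 16`, the single detected boundary is at sample 16.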

  5. Automatic Encoding and Language Detection in the GSDL

    Directory of Open Access Journals (Sweden)

    Otakar Pinkas

    2014-10-01

    Full Text Available Automatic detection of the encoding and language of a text is part of the Greenstone Digital Library Software (GSDL) for building and distributing digital collections, developed by the University of Waikato (New Zealand) in cooperation with UNESCO. Automatic encoding and language detection in Slavic languages is difficult and sometimes fails; our aim is to detect the cases of failure. Automatic detection in the GSDL is based on the n-gram method. The most frequent n-grams for Czech are presented, and the whole process of automatic detection in the GSDL is described. The input documents for the test collections are plain texts encoded in ISO-8859-1, ISO-8859-2 and Windows-1250. We manually evaluated the quality of the automatic detection. The causes of errors include the predominance of an improper language model and an incorrect switch to Windows-1250. We carried out further tests on more complex documents.
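A toy version of the n-gram method the GSDL relies on: build trigram frequency profiles from per-language sample text, then score an unknown text by how much its trigram counts overlap each profile. The sample texts below are tiny illustrative stand-ins, not Greenstone's training data:

```python
from collections import Counter

def ngrams(text, n=3):
    """Character n-gram counts, with padding spaces at the ends."""
    text = " " + text.lower() + " "
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def detect(text, profiles):
    """Return the language whose n-gram profile best overlaps the text."""
    doc = ngrams(text)

    def score(profile):
        # Overlap = sum of shared counts per n-gram.
        return sum(min(doc[g], c) for g, c in profile.items())

    return max(profiles, key=lambda lang: score(profiles[lang]))
```

With profiles built from one English and one German sentence, a short English query is attributed to the English profile. Real detectors use much larger training corpora, which is exactly why closely related Slavic languages, with heavily overlapping trigram sets, are hard to tell apart.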

  6. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, the national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016, the 12th Portuguese Conference on Automatic Control, held in Guimarães, Portugal, September 14th to 16th, was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how automatic control can be used to increase the well-being of people.

  7. The TS 600: automatic control system for eddy currents

    International Nuclear Information System (INIS)

    Poulet, J.P.

    1986-10-01

    Within the scope of fabrication and in-service inspection of PWR steam generator tube bundles, FRAMATOME developed an automatic eddy current testing system, the TS600. Based on a mini-computer, the TS600 can digitize, store, and process data in various ways, so it is possible to perform several kinds of inspection: conventional in-service inspection, roll-area profilometry... The TS600 can also be used to develop new methods of examination. [fr]

  8. Automatic Design of a Maglev Controller in State Space

    Science.gov (United States)

    1991-12-01

    Design of a Maglev Controller in State Space. Feng Zhao and Richard Thornton. Abstract: We describe the automatic synthesis of a global nonlinear controller for... the global switching points of the controller is presented. The synthesized control system can stabilize the maglev vehicle with large initial displacements. (Report numbers: N00014-89-J-3202, MIP-9001651.)

  9. Some experimental results for an automatic helium liquefier

    International Nuclear Information System (INIS)

    Watanabe, T.; Kudo, T.; Kuraoka, Y.; Sakura, K.; Tsuruga, H.; Watanabe, T.

    1984-01-01

    This chapter describes the testing of an automatic cooldown system. The liquefying machine examined is a CTi Model 1400. The automatic helium gas liquefying system is operated using sequence control with a programmable controller. The automatic mode is carried out by operation of two compressors. The monitoring system consists of 41 remote sensors. Liquid level is measured by a superconducting level meter. The J-T valve and return valve, which require precise control, are operated by pulse motors. The advantages of the automatic cooldown system are reduced operator manpower, smooth changes in temperatures and pressures (so that the flow chart of automation is simple), and the possibility of continuous liquefier operation.

  10. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
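A brute-force flavour of this kind of finite model generation can be sketched directly: enumerate every binary operation table over a small domain and keep those satisfying the given equations. This is an illustrative baseline, not the tool's search strategy:

```python
from itertools import product

def find_models(n, laws):
    """Yield operation tables op[i][j] over the domain {0..n-1} that
    satisfy every law; a law is a predicate over (table, domain)."""
    dom = range(n)
    for flat in product(dom, repeat=n * n):
        op = [list(flat[i * n:(i + 1) * n]) for i in dom]
        if all(law(op, dom) for law in laws):
            yield op

def associative(op, dom):
    return all(op[op[a][b]][c] == op[a][op[b][c]]
               for a in dom for b in dom for c in dom)

def commutative(op, dom):
    return all(op[a][b] == op[b][a] for a in dom for b in dom)
```

On the two-element domain this finds the 6 commutative semigroup tables (among the 16 possible binary operations). The n^(n²) search space explains why serious model generators prune with constraint propagation rather than enumerate.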

  11. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  12. 46 CFR 171.118 - Automatic ventilators and side ports.

    Science.gov (United States)

    2010-10-01

    Title 46, Shipping (2010-10-01). COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED); SUBDIVISION AND STABILITY...; Bulkhead or Weather Deck. § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  13. 30 CFR 75.1404 - Automatic brakes; speed reduction gear.

    Science.gov (United States)

    2010-07-01

    Title 30, Mineral Resources (2010-07-01). § 75.1404 Automatic brakes; speed reduction gear. [Statutory Provisions] Each locomotive and haulage car used in an... permit automatic brakes, locomotives and haulage cars shall be subject to speed reduction gear, or other...

  14. Automatic blood vessel based-liver segmentation using the portal phase abdominal CT

    Science.gov (United States)

    Maklad, Ahmed S.; Matsuhiro, Mikio; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Shimada, Mitsuo; Iinuma, Gen

    2018-02-01

    Liver segmentation is the basis for computer-based planning of hepatic surgical interventions. Automatic segmentation of the liver is highly important in the diagnosis and analysis of hepatic diseases and in surgery planning. Blood vessels (BVs) have shown high utility for liver segmentation. In our previous work, we developed a semi-automatic method that segments the liver from portal-phase abdominal CT images in two stages. The first stage was interactive segmentation of abdominal blood vessels (ABVs) and their subsequent classification into hepatic (HBVs) and non-hepatic (non-HBVs). This stage had five interactions: a selective threshold for bone segmentation, selection of two seed points for kidney segmentation, selection of the inferior vena cava (IVC) entrance for starting ABV segmentation, identification of the portal vein (PV) entrance to the liver, and identification of the IVC exit for distinguishing HBVs from other ABVs (non-HBVs). The second stage is automatic segmentation of the liver based on the segmented ABVs, as described in [4]. Towards full automation of our method, we developed a method [5] that segments ABVs automatically, eliminating the first three interactions. In this paper, we propose full automation of the classification of ABVs into HBVs and non-HBVs, and consequently full automation of the liver segmentation proposed in [4]. Results illustrate that the method is effective at segmenting the liver from portal-phase abdominal CT images.

  15. Automatic Error Recovery in Robot Assembly Operations Using Reverse Execution

    DEFF Research Database (Denmark)

    Laursen, Johan Sund; Schultz, Ulrik Pagh; Ellekilde, Lars-Peter

    2015-01-01

    ..., in particular for small-batch productions. As an alternative, we propose a system for automatically handling certain classes of errors instead of preventing them. Specifically, we show that many operations can be automatically reversed. Errors can be handled through automatic reverse execution of the control program to a safe point, from which forward execution can be resumed. This paper describes the principles behind automatic reversal of robotic assembly operations, and experimentally demonstrates the use of a domain-specific language that supports automatic error handling through reverse execution. Our...
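The reverse-execution idea can be sketched as a stack of completed operations, each registered with its own undo action; on an error the controller unwinds, newest first, back to a safe point. This is a minimal illustration, not the paper's domain-specific language, and the operation names are hypothetical:

```python
class ReversibleProgram:
    """Record each executed operation together with its reversal so the
    program can be unwound to a safe point after an error."""

    def __init__(self):
        self.history = []  # (name, undo) of completed operations

    def run(self, name, do, undo):
        """Execute `do`; remember `undo` so the step can be reversed."""
        do()
        self.history.append((name, undo))

    def reverse_to(self, safe_point):
        """Undo completed operations, newest first, until the operation
        named `safe_point` is on top; forward execution can resume there."""
        while self.history and self.history[-1][0] != safe_point:
            name, undo = self.history.pop()
            undo()
```

For example, after a failed insertion the controller can call `reverse_to("pick")` to back the robot out to the state just after the pick operation and retry from there.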

  16. Automatic system for 3D reconstruction of the chick eye based on digital photographs.

    Science.gov (United States)

    Wong, Alexander; Genest, Reno; Chandrashekar, Naveen; Choh, Vivian; Irving, Elizabeth L

    2012-01-01

    The geometry of anatomical specimens is very complex and accurate 3D reconstruction is important for morphological studies, finite element analysis (FEA) and rapid prototyping. Although magnetic resonance imaging, computed tomography and laser scanners can be used for reconstructing biological structures, the cost of the equipment is fairly high and specialised technicians are required to operate the equipment, making such approaches limiting in terms of accessibility. In this paper, a novel automatic system for 3D surface reconstruction of the chick eye from digital photographs of a serially sectioned specimen is presented as a potential cost-effective and practical alternative. The system is designed to allow for automatic detection of the external surface of the chick eye. Automatic alignment of the photographs is performed using a combination of coloured markers and an algorithm based on complex phase order likelihood that is robust to noise and illumination variations. Automatic segmentation of the external boundaries of the eye from the aligned photographs is performed using a novel level-set segmentation approach based on a complex phase order energy functional. The extracted boundaries are sampled to construct a 3D point cloud, and a combination of Delaunay triangulation and subdivision surfaces is employed to construct the final triangular mesh. Experimental results using digital photographs of the chick eye show that the proposed system is capable of producing accurate 3D reconstructions of the external surface of the eye. The 3D model geometry is similar to a real chick eye and could be used for morphological studies and FEA.

  17. Towards Automatic Music Transcription: Extraction of MIDI-Data out of Polyphonic Piano Music

    Directory of Open Access Journals (Sweden)

    Jens Wellhausen

    2005-06-01

    Full Text Available Driven by the increasing amount of music available electronically, the need for automatic search and retrieval systems for music becomes more and more important. In this paper an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications and music analysis. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined to extract the notes played. An algorithm for chord separation based on Independent Subspace Analysis is presented. Finally, the results are used to build a MIDI file.
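The last step of any such transcription pipeline, mapping detected fundamental frequencies to MIDI note numbers, is standard (A4 = 440 Hz = MIDI note 69) and can be sketched directly:

```python
import math

def freq_to_midi(freq_hz):
    """Nearest MIDI note number for a fundamental frequency, using the
    equal-tempered mapping around A4 = 440 Hz = note 69."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_name(note):
    """Human-readable pitch name for a MIDI note number."""
    names = ["C", "C#", "D", "D#", "E", "F",
             "F#", "G", "G#", "A", "A#", "B"]
    return f"{names[note % 12]}{note // 12 - 1}"
```

For instance, a detected fundamental of 261.63 Hz maps to MIDI note 60, i.e. middle C ("C4"). The hard part of transcription is of course the earlier stage, separating the simultaneous notes of a chord, which is where the Independent Subspace Analysis above comes in.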

  18. 6th International Parallel Tools Workshop

    CERN Document Server

    Brinkmann, Steffen; Gracia, José; Resch, Michael; Nagel, Wolfgang

    2013-01-01

    The latest advances in High Performance Computing hardware have significantly raised the level of available compute performance. At the same time, the growing hardware capabilities of modern supercomputing architectures have caused an increasing complexity of parallel application development. Despite numerous efforts to improve and simplify parallel programming, there is still a lot of manual debugging and tuning work required. This process is supported by special software tools that facilitate debugging, performance analysis, and optimization, and thus make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 6th International Parallel Tools Workshop, held in Stuttgart, Germany, 25-26 September 2012.

  19. Analysis of Factors Affecting System Performance in the ASpIRE Challenge

    Science.gov (United States)

    2015-12-13

    ...performance in the ASpIRE (Automatic Speech recognition In Reverberant Environments) challenge. In particular, overall word error rate (WER) of the solver... in mismatched conditions. Index Terms: speech recognition, reverberant rooms, microphone audio.
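The word error rate (WER) analysed in this record is the word-level edit distance between hypothesis and reference, normalized by the reference length; a minimal implementation of the standard definition:

```python
def wer(reference, hypothesis):
    """Word error rate: minimum number of word substitutions, deletions
    and insertions turning `hypothesis` into `reference`, divided by the
    number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)
```

A hypothesis that drops one of three reference words, e.g. `wer("the cat sat", "the cat")`, scores 1/3; note that WER can exceed 1.0 when the hypothesis contains many insertions.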

  20. Automatic Mode Transition Enabled Robust Triboelectric Nanogenerators.

    Science.gov (United States)

    Chen, Jun; Yang, Jin; Guo, Hengyu; Li, Zhaoling; Zheng, Li; Su, Yuanjie; Wen, Zhen; Fan, Xing; Wang, Zhong Lin

    2015-12-22

    Although the triboelectric nanogenerator (TENG) has been proven to be a renewable and effective route for ambient energy harvesting, its robustness remains a great challenge due to the requirement of surface friction for a decent output, especially for the in-plane sliding mode TENG. Here, we present a rationally designed TENG for achieving a high output performance without compromising the device robustness by, first, converting the in-plane sliding electrification into a contact separation working mode and, second, creating an automatic transition between a contact working state and a noncontact working state. The magnet-assisted automatic transition triboelectric nanogenerator (AT-TENG) was demonstrated to effectively harness various ambient rotational motions to generate electricity with greatly improved device robustness. At a wind speed of 6.5 m/s or a water flow rate of 5.5 L/min, the harvested energy was capable of lighting up 24 spot lights (0.6 W each) simultaneously and charging a capacitor to greater than 120 V in 60 s. Furthermore, due to the rational structural design and unique output characteristics, the AT-TENG was not only capable of harvesting energy from natural bicycling and car motion but also acting as a self-powered speedometer with ultrahigh accuracy. Given such features as structural simplicity, easy fabrication, low cost, wide applicability even in a harsh environment, and high output performance with superior device robustness, the AT-TENG renders an effective and practical approach for ambient mechanical energy harvesting as well as self-powered active sensing.