WorldWideScience

Sample records for automatic performance debugging

  1. Automatic Performance Debugging of SPMD Parallel Programs

    CERN Document Server

    Liu, Xu; Zhan, Jianfeng; Tu, Bibo; Meng, Dan

    2010-01-01

    Automatic performance debugging of parallel applications usually involves two steps: automatic detection of performance bottlenecks and uncovering their root causes for performance optimization. Previous work falls short of resolving this challenging issue in several ways: first, several efforts automate the analysis process but present the results in a confined way that identifies only performance problems covered by a priori knowledge; second, several tools apply exploratory or confirmatory data analysis to automatically discover relevant relationships in performance data, but these efforts do not focus on locating performance bottlenecks or uncovering their root causes. In this paper, we design and implement an innovative system, AutoAnalyzer, to automatically debug the performance problems of single program, multiple data (SPMD) parallel programs. Our system is unique along two dimensions: first, without any a priori knowledge, we automatically locate bottlenecks and uncover their root causes for performance o...

  2. Automatic Performance Debugging of SPMD-style Parallel Programs

    CERN Document Server

    Liu, Xu; Zhan, Kunlin; Shi, Weisong; Yuan, Lin; Meng, Dan; Wang, Lei

    2011-01-01

    The single program, multiple data (SPMD) programming model is widely used for both high performance computing and Cloud computing. In this paper, we design and implement an innovative system, AutoAnalyzer, that automates the process of debugging performance problems of SPMD-style parallel programs, including data collection, performance behavior analysis, locating bottlenecks, and uncovering their root causes. AutoAnalyzer is unique in terms of two features: first, without any a priori knowledge, it automatically locates bottlenecks and uncovers their root causes for performance optimization; second, it is lightweight in terms of the size of performance data to be collected and analyzed. Our contributions are three-fold: first, we propose two effective clustering algorithms to investigate the existence of performance bottlenecks that cause process behavior dissimilarity or code region behavior disparity, respectively; meanwhile, we present two searching algorithms to locate bottlenecks; second, on a basis o...
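
The clustering algorithms themselves are not given in this excerpt. As a loose sketch of the underlying idea of flagging a process whose performance vector diverges from its peers, one might write something like the following (the metric names, data, and threshold are invented for illustration and are not AutoAnalyzer's actual algorithm):

```python
# Sketch: flag process-behavior dissimilarity by comparing each per-process
# performance vector against the centroid of all processes.
from math import dist

def dissimilar_processes(vectors, threshold=1.5):
    """Return indices of processes whose metric vector lies more than
    `threshold` times the mean distance away from the centroid."""
    n = len(vectors)
    k = len(vectors[0])
    centroid = [sum(v[i] for v in vectors) / n for i in range(k)]
    dists = [dist(v, centroid) for v in vectors]
    mean_d = sum(dists) / n
    return [i for i, d in enumerate(dists) if mean_d > 0 and d > threshold * mean_d]

# Four SPMD processes described by normalized (cycles, cache misses, MPI wait)
# metrics; process 3 behaves very differently from its peers.
metrics = [(1.0, 0.9, 1.1), (1.1, 1.0, 0.9), (0.9, 1.1, 1.0), (3.0, 2.8, 3.2)]
print(dissimilar_processes(metrics))  # → [3]
```

A flagged process then becomes the starting point for root-cause analysis on its code regions.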

  3. Automatic program debugging for intelligent tutoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Murray, W.R.

    1986-01-01

    This thesis explores the process by which student programs can be automatically debugged in order to increase the instructional capabilities of intelligent tutoring systems. This research presents a methodology and implementation for the diagnosis and correction of nontrivial recursive programs. In this approach, recursive programs are debugged by repairing induction proofs in the Boyer-Moore Logic. The potential of a program debugger to automatically debug widely varying novice programs in a nontrivial domain is proportional to its capability to reason about computational semantics. By increasing these reasoning capabilities, a more powerful and robust system can result. This thesis supports these claims by examining related work in automated program debugging and by discussing the design, implementation, and evaluation of Talus, an automatic debugger for LISP programs. Talus relies on its ability to reason about computational semantics to perform algorithm recognition, infer code teleology, and automatically detect and correct nonsyntactic errors in student programs written in a restricted, but nontrivial, subset of LISP.

  4. Automatic Debugging Support for UML Designs

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    Design of large software systems requires rigorous application of software engineering methods covering all phases of the software process. Debugging during the early design phases is extremely important, because late bug fixes are expensive. In this paper, we describe an approach which facilitates debugging of UML requirements and designs. The Unified Modeling Language (UML) is a set of notations for object-oriented design of a software system. We have developed an algorithm which translates requirement specifications in the form of annotated sequence diagrams into structured statecharts. This algorithm detects conflicts between sequence diagrams and inconsistencies in the domain knowledge. After synthesis from sequence diagrams, these statecharts are usually subject to manual modification and refinement. By using the "backward" direction of our synthesis algorithm, we are able to map modifications made to the statechart back into the requirements (sequence diagrams) and check for conflicts there. Fed back to the user, the conflicts detected by our algorithm are the basis for deductive debugging of requirements and domain theory at very early development stages. Our approach allows us to generate explanations of why there is a conflict and which parts of the specifications are affected.

  5. Do moods affect programmers’ debug performance?

    OpenAIRE

    Khan, I A; Brinkman, W.P.; Hierons, R. M.

    2010-01-01

    There is much research showing that people’s mood can affect their activities. This paper argues that this also applies to programmers, especially when debugging. A literature-based framework is presented linking programming with various cognitive activities, and cognitive activities with moods. Further, the effect of mood on debugging was tested in two experiments. In the first experiment, programmers (n = 72) saw short movie clips selected for their ability to provoke specific mo...

  7. Instrumentation, performance visualization, and debugging tools for multiprocessors

    Science.gov (United States)

    Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.

    1991-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging and tuning parallel programs become intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.
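
As a toy illustration of the kind of source-level instrumentation AIMS automates, the sketch below records enter/exit timestamps around a function for later display (AIMS itself instruments FORTRAN at compile time; this decorator-based analogy is purely illustrative):

```python
# Sketch: record per-call enter/exit events with timestamps, the raw
# material a performance visualizer would plot on a timeline.
import functools
import time

trace = []  # (event, function, timestamp) records

def instrument(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        trace.append(("enter", fn.__name__, time.perf_counter()))
        try:
            return fn(*args, **kwargs)
        finally:
            trace.append(("exit", fn.__name__, time.perf_counter()))
    return wrapper

@instrument
def solve(n):
    return sum(i * i for i in range(n))

solve(1000)
print([e[:2] for e in trace])  # → [('enter', 'solve'), ('exit', 'solve')]
```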

  8. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  10. Supercomputer debugging workshop `92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  11. Debugging, Advanced Debugging and Runtime Analysis

    Directory of Open Access Journals (Sweden)

    Salim Istyaq

    2010-03-01

    ...variables on error or at chosen points. Automated functional GUI testing tools are used to repeat system-level tests through the GUI; benchmarks allow run-time performance comparisons to be made; and performance analysis can help to highlight hot spots and resource usage. A runtime tool allows you to examine the application internals after the run via the recorded runtime analysis data. Runtime analysis removes the guesswork from debugging: it helps to uncover memory corruption, memory leaks, and similar defects. Runtime analysis is an effort aimed at understanding software component behavior by using data collection during the execution of the component. Runtime analysis is a topic of great interest in computer science. A program can take seconds, hours, or even years to finish executing, depending on various parameters.
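
As a minimal illustration of runtime analysis for leak detection, here is a sketch using Python's standard tracemalloc module (the leaky workload is contrived; real runtime-analysis tools of the kind described above work on compiled applications):

```python
# Sketch: detect a leak by measuring retained allocation growth across
# repeated calls with tracemalloc.
import tracemalloc

leaky_store = []

def leaky_step():
    leaky_store.append(bytearray(1024 * 100))  # 100 KiB retained per call

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for _ in range(50):
    leaky_step()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth_kib = (after - before) / 1024
print(f"retained growth: {growth_kib:.0f} KiB")  # roughly 5000 KiB → likely leak
```

Steadily growing retained memory across iterations, as here, is the classic leak signature such tools report.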

  13. TinyDebug

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    2011-01-01

    Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, this paper presents TinyDebug, a multi-purpose passive debugging framework for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...

  15. DySectAPI: Scalable Prescriptive Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Quarfot Nielsen, Niklas;

    We present DySectAPI, a tool that allows users to construct probe trees for automatic, event-driven debugging at scale. The traditional, interactive debugging model, whereby users manually step through and inspect their application, does not scale well even for current supercomputers. While...... lightweight debugging models scale well, they can currently only debug a subset of bug classes. DySectAPI fills the gap between these two approaches with a novel user-guided approach. Using both experimental results and analytical modeling we show how DySectAPI scales and can run with a low overhead...

  16. MPI Debugging with Handle Introspection

    OpenAIRE

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.; Karlsson, Sven; Mohror, Kathryn

    2014-01-01

    The Message Passing Interface, MPI, is the standard programming model for high performance computing clusters. However, debugging applications on large scale clusters is difficult. The widely used Message Queue Dumping interface enables inspection of message queue state, but there is no general interface for extracting information from MPI objects such as communicators. A developer can debug the MPI library as if it were part of the application, but this exposes an unneeded level of detail. The ...

  17. A Scalable Prescriptive Parallel Debugging Model

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Quarfot Nielsen, Niklas; Lee, Gregory L.;

    2015-01-01

    Debugging is a critical step in the development of any parallel program. However, the traditional interactive debugging model, where users manually step through code and inspect their application, does not scale well even for current supercomputers due to its centralized nature. While lightweight...... and test their debugging intuition in a way that helps to reduce the error space. Based on this debugging model we introduce a prototype implementation embodying it, the DySectAPI, allowing programmers to construct probe trees for automatic, event-driven debugging at scale. In this paper we...... introduce the concepts behind DySectAPI and, using both experimental results and analytical modelling, we show that the DySectAPI implementation can run with a low overhead on current systems. We achieve a logarithmic scaling of the prototype and show predictions that even for a large system the overhead...

  19. Model-Based Debugging

    OpenAIRE

    Mirghasemi, Salman

    2009-01-01

    Software debugging is still one of the most challenging and time-consuming aspects of software development. Monitoring software behavior and finding the causes of this behavior are at the center of the debugging process. Although many tools and techniques have been introduced to support developers in this task, we are still a long way from the ideal. In this paper, we first give a detailed explanation of the main issues in this domain and why the available techniques and tools...

  20. Automatic Breakpoint Generating Approach Based on Minimum Debugging Frontier Set

    Institute of Scientific and Technical Information of China (English)

    李丰; 霍玮; 陈聪明; 李龙; 衷璐洁; 冯晓兵

    2013-01-01

    To this day, debugging still takes almost 70% of the time in software engineering. The conventional debugging process, based on setting breakpoints and inspecting the states at them, remains the most common and useful way to detect faults. The efficiency of debugging differs a lot, as both the selection and inspection of breakpoints are up to programmers. This paper presents a novel breakpoint generating approach based on a new concept named minimum debugging frontier sets (MDFS). A debugging frontier set describes a set of trace points which have the ability of bug isolation, and an MDFS is one of minimum size. Benefiting from this bug-isolation ability, the error-suspicious domain is always narrowed down to one side of the MDFS, regardless of whether the state at the MDFS is proven correct or not. Breakpoints generated on the basis of an MDFS also minimize the number of statement instances to be inspected at each breakpoint. The paper also establishes a set of criteria to judge the quality of breakpoints. Empirical results indicate that breakpoints generated through this approach not only require low inspection cost, but also accelerate debugging, and that this MDFS-based debugging prototype performs better than state-of-the-art fault-localization techniques on the Siemens Suite.
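
The MDFS construction itself is not reproduced here, but the bug-isolation property it relies on can be sketched: a correct state at a frontier pushes the suspicious domain to the later side, an incorrect state to the earlier side, which is exactly a binary search over trace points (the oracle `check` below is hypothetical):

```python
# Sketch of frontier-based bug isolation as a binary search.
# check(i) -> True means the program state at trace point i is still correct.

def locate_fault(n_points, check):
    """Return the first trace point whose state is incorrect."""
    lo, hi = 0, n_points  # invariant: correct before lo, faulty by hi
    while lo < hi:
        mid = (lo + hi) // 2
        if check(mid):
            lo = mid + 1   # frontier correct: fault lies on the later side
        else:
            hi = mid       # frontier faulty: fault lies on the earlier side
    return lo

# Toy trace where statement instance 37 first corrupts the state.
first_bad = 37
print(locate_fault(100, lambda i: i < first_bad))  # → 37
```

Each probe halves the suspicious domain, which is why well-chosen frontiers keep the inspection cost low.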

  1. IDebug: An Advanced Debugging Framework for Java

    OpenAIRE

    Kiniry, Joseph R.

    1998-01-01

    IDebug, the Infospheres debugging framework, is an advanced debugging framework for Java. This framework provides the standard core debugging and specification constructs such as assertions, debug levels and categories, stack traces, and specialized exceptions. Debugging functionality can be fine-tuned on a per-thread and/or per-class basis, debugging contexts can be stored to and recovered from persistent storage, and several aspects of the debugging run-time are configurable at the meta-l...
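
A rough analogue of per-category debug levels can be built with Python's standard logging module (this is only an analogy, not IDebug's actual Java API):

```python
# Sketch: one logger per debug category, each with its own level,
# so verbosity is fine-tuned per category.
import logging

logging.basicConfig(format="%(name)s %(levelname)s: %(message)s")
net = logging.getLogger("net")
store = logging.getLogger("store")
net.setLevel(logging.DEBUG)       # verbose category
store.setLevel(logging.WARNING)   # quiet category

net.debug("handshake complete")   # emitted
store.debug("cache miss")         # suppressed by the category's level
```

The same pattern extends to per-thread filtering via `logging.Filter`, mirroring IDebug's per-thread fine-tuning.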

  2. Voice-controlled Debugging of Spreadsheets

    CERN Document Server

    Flood, Derek

    2008-01-01

    Developments in Mobile Computing are putting pressure on the software industry to research new modes of interaction that do not rely on the traditional keyboard and mouse combination. Computer users suffering from Repetitive Strain Injury also seek an alternative to keyboard and mouse devices to reduce suffering in wrist and finger joints. Voice-control is an alternative approach to spreadsheet development and debugging that has been researched and used successfully in other domains. While voice-control technology for spreadsheets is available, its effectiveness has not been investigated. This study is the first to compare the performance of a set of expert spreadsheet developers that debugged a spreadsheet using voice-control technology and another set that debugged the same spreadsheet using keyboard and mouse. The study showed that voice, despite its advantages, proved to be slower and less accurate. However, it also revealed ways in which the technology might be improved to redress this imbalance.

  3. Integrated Debugging of Modelica Models

    Directory of Open Access Journals (Sweden)

    Adrian Pop

    2014-04-01

    The high abstraction level of equation-based object-oriented (EOO) languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of those problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational and algorithmic code debugging in an integrated fashion.

  4. Debugging in a multi-processor environment

    International Nuclear Information System (INIS)

    The Supervisory Control and Diagnostic System (SCDS) for the Mirror Fusion Test Facility (MFTF) consists of nine 32-bit minicomputers arranged in a tightly coupled distributed computer system utilizing shared memory as the data exchange medium. Debugging more than one program in the multi-processor environment is a difficult process. This paper describes the new tools that were developed and how the testing of software is performed in the SCDS for the MFTF project.

  5. Query strategy for sequential ontology debugging

    CERN Document Server

    Shchekotykhina, Kostyantyn; Fleiss, Philipp; Rodler, Patrick

    2011-01-01

    Debugging of ontologies is an important prerequisite for their widespread application, especially in areas that rely upon everyday users to create and maintain knowledge bases, as in the case of the Semantic Web. Recent approaches use diagnosis methods to identify causes of inconsistent or incoherent ontologies. However, in most debugging scenarios these methods return many alternative diagnoses, thus placing the burden of fault localization on the user. This paper demonstrates how the target diagnosis can be identified by performing a sequence of observations, that is, by querying an oracle about entailments of the target ontology. We exploit a priori probabilities of typical user errors to formulate information-theoretic concepts for query selection. Our evaluation showed that the proposed method significantly reduces the number of required queries compared to myopic strategies. We experimented with different probability distributions of user errors and different qualities of the a priori probabilities. Ou...
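
One simple instance of such information-theoretic query selection is the split-in-half heuristic: prefer the query whose possible answers divide the diagnosis probability mass most evenly. A sketch with invented priors (not the paper's exact formulation):

```python
# Sketch: pick the oracle query that splits the diagnosis probability
# mass closest to 50/50, maximizing expected information per answer.

def best_query(diag_probs, queries):
    """Each query is modeled by the set of diagnoses predicting answer
    'yes'; prefer the query whose 'yes' mass is closest to 0.5."""
    def imbalance(q):
        return abs(sum(diag_probs[d] for d in q) - 0.5)
    return min(queries, key=imbalance)

# Three candidate diagnoses with prior fault probabilities.
priors = {"d1": 0.5, "d2": 0.3, "d3": 0.2}
queries = [frozenset({"d2"}), frozenset({"d1", "d2"}), frozenset({"d1"})]
print(best_query(priors, queries))  # → frozenset({'d1'})
```

An even split guarantees that either oracle answer eliminates about half the remaining probability mass.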

  6. An object-oriented extension for debugging the virtual machine

    Energy Technology Data Exchange (ETDEWEB)

    Pizzi, R.G. Jr. [California Univ., Davis, CA (United States)

    1994-12-01

    A computer is nothing more than a virtual machine programmed by source code to perform a task. The program's source code expresses abstract constructs which are compiled into some lower-level target language. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low-level target implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design and data into the source code. We introduce OODIE, an object-oriented extension to programming languages that allows programmers to specify a virtual environment by describing the meaning of the design and data of a virtual machine. This specification is translated into symbolic information such that an augmented debugger can present engineers with a programmable debugging environment specifically tailored for the virtual machine that is to be debugged.

  7. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers significantly affect their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, network interconnects, such as Infiniband, may be exploited to maximize energy savings, while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy saving strategies on a per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them apart from DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for the realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
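
The basic DVFS trade-off, scaling the frequency down as far as a performance-loss bound allows, can be sketched as follows, under the deliberately crude assumption that runtime scales as 1/frequency (communication and memory-bound phases scale far less, which is exactly why scaling down during MPI stalls is cheap):

```python
# Sketch: choose the lowest CPU frequency whose predicted slowdown stays
# within a performance-loss bound, for a purely CPU-bound model.

def pick_frequency(freqs_ghz, max_loss=0.10):
    """Lowest frequency f with runtime(f)/runtime(f_max) <= 1 + max_loss,
    assuming runtime ~ 1/frequency."""
    f_max = max(freqs_ghz)
    feasible = [f for f in freqs_ghz if f_max / f <= 1 + max_loss]
    return min(feasible)

# Hypothetical P-states; a 10% loss bound permits dropping to 2.4 GHz.
print(pick_frequency([1.6, 2.0, 2.4, 2.6]))  # → 2.4
```

A runtime system would apply a model like this per phase, with much lower effective slowdowns for communication phases.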

  8. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark

    Science.gov (United States)

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-01-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today’s data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG’s simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.
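
The on-demand watchpoint idea, filtering streamed records through a guard predicate so that only a handful of suspects are inspected instead of millions of records, can be sketched in plain Python (this is not BigDebug's actual Spark API):

```python
# Sketch: a "watchpoint" as a lazy guard predicate over streamed records.

def watchpoint(records, guard):
    """Yield only the records matching the guard."""
    for r in records:
        if guard(r):
            yield r

# 1000 synthetic records; only the slowest ones match the guard.
records = [{"id": i, "latency_ms": 10 + (i % 7)} for i in range(1000)]
suspects = list(watchpoint(records, lambda r: r["latency_ms"] > 15))
print(len(suspects))  # → 142
```

Because the guard is evaluated where the data lives, only matching records need to be shipped back to the user.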

  10. Lightweight and Statistical Techniques for Petascale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    ...to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.

  11. Multi-purpose passive debugging for embedded wireless

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, we propose a multi-purpose passive debugging framework, called TinyDebug, for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...

  12. TUNE: Compiler-Directed Automatic Performance Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-18

    This project has developed compiler-directed performance tuning technology targeting the Cray XT4 Jaguar system at Oak Ridge, which has multi-core Opteron nodes with SSE-3 SIMD extensions, and the Cray XE6 Hopper system at NERSC. To achieve this goal, we combined compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation, which have been developed by the PIs over the past several years. We examined DOE Office of Science applications to identify performance bottlenecks and apply our system to computational kernels that operate on dense arrays. Our goal for this performance-tuning technology has been to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, we aim to make our technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  13. Performance of automatic scanning microscope for nuclear emulsion experiments

    International Nuclear Information System (INIS)

    The impressive improvements in scanning technology and methods have allowed nuclear emulsion to be used as a target in recent large experiments. We report the performance of an automatic scanning microscope for nuclear emulsion experiments. After successful calibration and alignment of the system, we reached 99% tracking efficiency for minimum-ionizing tracks penetrating through the emulsion films. The automatic scanning system is successfully used for scanning emulsion films in the OPERA experiment and is planned for use in the next generation of nuclear emulsion experiments

  14. PGMPI: Automatically Verifying Self-Consistent MPI Performance Guidelines

    OpenAIRE

    Hunold, Sascha; Carpen-Amarie, Alexandra; Lübbe, Felix Donatus; Träff, Jesper Larsson

    2016-01-01

    The Message Passing Interface (MPI) is the most commonly used application programming interface for process communication on current large-scale parallel systems. Due to the scale and complexity of modern parallel architectures, it is becoming increasingly difficult to optimize MPI libraries, as many factors can influence the communication performance. To assist MPI developers and users, we propose an automatic way to check whether MPI libraries respect self-consistent performance guidelines....
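The guideline-checking idea is to benchmark a specialized operation against a semantically equivalent mock-up and flag the guideline as violated when the specialized version is slower. A toy, non-MPI sketch of that comparison (the tolerance factor and the example function pair are illustrative assumptions, not PGMPI's actual benchmarks):

```python
import statistics
import time

def measure(fn, args, repeats=5):
    """Median wall-clock time of fn(*args) over several repeats."""
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - start)
    return statistics.median(times)

def check_guideline(name, specialized, mockup, args, tolerance=1.10):
    """A self-consistent performance guideline states that a specialized
    routine should be no slower than a mock-up built from general pieces."""
    t_spec = measure(specialized, args)
    t_mock = measure(mockup, args)
    return {"guideline": name, "violated": t_spec > tolerance * t_mock}

# Illustrative pair: the built-in sum versus a hand-written loop.
def builtin_sum(data):
    return sum(data)

def loop_sum(data):
    total = 0.0
    for x in data:
        total += x
    return total
```

In the MPI setting the "specialized" routine would be, say, a collective like MPI_Allgather and the mock-up an equivalent composition of simpler calls.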

  15. Automatic or Deliberate? Cerebral correlates of automatic associations towards performance enhancing substances

    Directory of Open Access Journals (Sweden)

    Sebastian eSchindler

    2015-12-01

    The direct assessment of explicit attitudes towards performance-enhancing substances, for example neuroenhancement or doping in sports, can be affected by social-desirability biases and cheating attempts. According to dual-process theories of cognition, indirect measures like the Implicit Association Test (IAT) measure automatic associations towards a topic (as opposed to explicit attitudes measured by self-report). Such automatic associations are thought to occur rapidly and to evade voluntary control. However, whether or not such indirect tests actually reflect automatic associations is difficult to validate. Electroencephalography's superior time resolution makes it possible to differentiate between highly automatic and more elaborate processing stages. We therefore examined at which processing stages cortical differences between negative and positive attitudes to doping occur, and whether these differences can be related to BIAT scores. We tested 42 university students (31 females, 24.43 ± 3.17 years old), who were requested to complete a brief doping IAT (BIAT) on attitudes towards doping. Cerebral activity during doping-BIAT completion was assessed using high-density EEG. Behaviorally, participants' D-scores exhibited negative attitudes towards doping, represented by faster reaction times in the doping + dislike pairing task. Event-related potentials (ERPs) revealed earliest effects between 200 and 300 ms, where a relatively larger occipital positivity was found for the doping + dislike pairing task. Further, in the LPP time range between 400 and 600 ms, a larger late positive potential was found for the doping + dislike pairing task over central regions. These LPP amplitude differences successfully predicted participants' BIAT D-scores. Results indicate that event-related potentials differentiate between positive and negative doping attitudes at stages of mid-latency. However, it seems that IAT scores can be predicted only by
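For readers unfamiliar with the behavioral measure: an IAT D-score is commonly computed as the latency difference between the two pairing tasks divided by the pooled standard deviation of all trials. This sketch follows the widely used Greenwald-style scoring, which is an assumption and not necessarily the exact variant used in the study:

```python
import statistics

def d_score(rt_compatible, rt_incompatible):
    """D = (mean RT incompatible - mean RT compatible) / pooled SD.
    Positive values indicate faster responses in the 'compatible' pairing;
    e.g. faster doping + dislike responses yield a positive D for that
    pairing, read as a negative attitude towards doping."""
    pooled_sd = statistics.stdev(rt_compatible + rt_incompatible)
    return (statistics.mean(rt_incompatible)
            - statistics.mean(rt_compatible)) / pooled_sd
```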

  16. Debug automation from pre-silicon to post-silicon

    CERN Document Server

    Dehbashi, Mehdi

    2015-01-01

    This book describes automated debugging approaches for the bugs and faults which appear at different abstraction levels of a hardware system. The authors employ a transaction-based debug approach to systems at the transaction level, asserting the correct relation of transactions. The automated debug approach for design bugs finds the potential fault candidates at the RTL and gate level of a circuit. Debug techniques for logic bugs and synchronization bugs are demonstrated, enabling readers to localize the most difficult bugs. Debug automation for electrical faults (delay faults) finds the potentially failing speedpaths in a circuit at gate level. The various debug approaches described achieve high diagnosis accuracy and reduce the debugging time, shortening the IC development cycle and increasing the productivity of designers. Describes a unified framework for debug automation used at both pre-silicon and post-silicon stages; Provides approaches for debug automation of a hardware system at different levels of ...

  17. Automatic performance tuning of parallel and accelerated seismic imaging kernels

    KAUST Repository

    Haberdar, Hakan

    2014-01-01

    With the increased complexity and diversity of mainstream high performance computing systems, significant effort is required to tune parallel applications in order to achieve the best possible performance for each particular platform. This task is becoming more and more challenging and requires a larger set of skills. Automatic performance tuning is becoming a must for optimizing applications such as Reverse Time Migration (RTM), widely used in seismic imaging for oil and gas exploration. An empirical-search-based auto-tuning approach is applied to the MPI communication operations of the parallel isotropic and tilted transverse isotropic kernels. The application of auto-tuning using the Abstract Data and Communication Library improved the performance of the MPI communications as well as developer productivity by providing a higher level of abstraction. Keeping productivity in mind, we opted for pragma-based programming for accelerated computation on the latest accelerated architectures such as GPUs, using the fairly new OpenACC standard. The same auto-tuning approach is also applied to the OpenACC-accelerated seismic code to optimize the compute-intensive kernel of the Reverse Time Migration application. The application of this technique resulted in improved performance of the original code and its ability to adapt to different execution environments.

  18. Interactive ontology debugging: Two query strategies for efficient fault localization.

    Science.gov (United States)

    Shchekotykhin, Kostyantyn; Friedrich, Gerhard; Fleiss, Philipp; Rodler, Patrick

    2012-04-01

    Effective debugging of ontologies is an important prerequisite for their broad application, especially in areas that rely on everyday users to create and maintain knowledge bases, such as the Semantic Web. In such systems ontologies capture formalized vocabularies of terms shared by their users. However, in many cases users have different local views of the domain, i.e. of the context in which a given term is used. Inappropriate usage of terms together with natural complications when formulating and understanding logical descriptions may result in faulty ontologies. Recent ontology debugging approaches use diagnosis methods to identify the causes of the faults. In most debugging scenarios these methods return many alternative diagnoses, thus placing the burden of fault localization on the user. This paper demonstrates how the target diagnosis can be identified by performing a sequence of observations, that is, by querying an oracle about entailments of the target ontology. To identify the best query we propose two query selection strategies: a simple "split-in-half" strategy and an entropy-based strategy. The latter allows knowledge about typical user errors to be exploited to minimize the number of queries. Our evaluation showed that the entropy-based method significantly reduces the number of required queries compared to the "split-in-half" approach. We experimented with different probability distributions of user errors and different qualities of the a priori probabilities. Our measurements demonstrated the superiority of entropy-based query selection even in cases where all fault probabilities are equal, i.e. where no information about typical user errors is available.
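The entropy-based strategy can be sketched as follows: given fault probabilities for the competing diagnoses, pick the query whose yes/no answer distribution has maximal entropy (with equal priors this degenerates to the split-in-half strategy). A simplified sketch assuming each diagnosis deterministically predicts the oracle's answer; the data layout is an illustrative assumption:

```python
import math

def expected_entropy_gain(query_yes_set, diagnoses):
    """diagnoses: dict diagnosis -> prior probability (summing to 1).
    query_yes_set: diagnoses under which the oracle would answer 'yes'.
    With deterministic predictions, the information gain of a query is
    the entropy of its answer distribution."""
    p_yes = sum(p for d, p in diagnoses.items() if d in query_yes_set)
    if p_yes in (0.0, 1.0):
        return 0.0  # answer already determined: zero information
    return -(p_yes * math.log2(p_yes) + (1 - p_yes) * math.log2(1 - p_yes))

def select_query(queries, diagnoses):
    """queries: dict query name -> set of diagnoses answering 'yes'."""
    return max(queries,
               key=lambda q: expected_entropy_gain(queries[q], diagnoses))
```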

  19. Lessons learned at 208K: Towards Debugging Millions of Cores

    Energy Technology Data Exchange (ETDEWEB)

    Lee, G L; Ahn, D H; Arnold, D C; de Supinski, B R; Legendre, M; Miller, B P; Schulz, M J; Liblit, B

    2008-04-14

    Petascale systems will present several new challenges to performance and correctness tools. Such machines may contain millions of cores, requiring that tools use scalable data structures and analysis algorithms to collect and to process application data. In addition, at such scales, each tool itself will become a large parallel application--already, debugging the full Blue-Gene/L (BG/L) installation at the Lawrence Livermore National Laboratory requires employing 1664 tool daemons. To reach such sizes and beyond, tools must use a scalable communication infrastructure and manage their own tool processes efficiently. Some system resources, such as the file system, may also become tool bottlenecks. In this paper, we present challenges to petascale tool development, using the Stack Trace Analysis Tool (STAT) as a case study. STAT is a lightweight tool that gathers and merges stack traces from a parallel application to identify process equivalence classes. We use results gathered at thousands of tasks on an Infiniband cluster and results up to 208K processes on BG/L to identify current scalability issues as well as challenges that will be faced at the petascale. We then present implemented solutions to these challenges and show the resulting performance improvements. We also discuss future plans to meet the debugging demands of petascale machines.
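STAT's core reduction, many processes collapsing into few distinct behaviors, can be illustrated in a few lines. Real STAT merges traces into a prefix tree over a tree-based overlay network; this flat grouping is only a sketch with made-up frame names:

```python
from collections import defaultdict

def merge_stack_traces(traces):
    """Group task ranks by identical call stack. Each equivalence class
    needs only one representative debug target, so a job with many
    thousands of tasks reduces to a handful of behavior classes."""
    classes = defaultdict(list)
    for rank, stack in sorted(traces.items()):
        classes[tuple(stack)].append(rank)
    return dict(classes)
```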

  20. Debugging systems-on-chip communication-centric and abstraction-based techniques

    CERN Document Server

    Vermeulen, Bart

    2014-01-01

    This book describes an approach and supporting infrastructure to facilitate debugging the silicon implementation of a System-on-Chip (SOC), allowing its associated product to be introduced into the market more quickly.  Readers learn step-by-step the key requirements for debugging a modern, silicon SOC implementation, nine factors that complicate this debugging task, and a new debug approach that addresses these requirements and complicating factors.  The authors’ novel communication-centric, scan-based, abstraction-based, run/stop-based (CSAR) debug approach is discussed in detail, showing how it helps to meet debug requirements and address the nine, previously identified factors that complicate debugging silicon implementations of SOCs. The authors also derive the debug infrastructure requirements to support debugging of a silicon implementation of an SOC with their CSAR debug approach. This debug infrastructure consists of a generic on-chip debug architecture, a configurable automated design-for-debug ...

  1. Bifröst: debugging web applications as a whole

    NARCIS (Netherlands)

    Vlist, K.B. van der

    2013-01-01

    Even though web application development is supported by professional tooling, debugging support is lacking. If one starts to debug a web application, hardly any tooling support exists. Only the core components like server processes and a web browser are exposed. Developers need to manually weave ava

  2. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminry Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10{sup 5}) and O(10{sup 6}) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. 
In the first two
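The CBI statistical technique mentioned above ranks instrumented predicates by how strongly their being true correlates with failure. A sketch of its classic Increase metric (the counts are illustrative; real CBI also computes confidence intervals and accounts for sparse sampling):

```python
def increase_score(pred_true_fail, pred_true_succ,
                   pred_obs_fail, pred_obs_succ):
    """CBI-style 'Increase' metric: how much more likely a run is to
    fail when predicate P is observed true, versus merely observed.
    Predicates with a large positive Increase are strong bug indicators."""
    failure = pred_true_fail / (pred_true_fail + pred_true_succ)
    context = pred_obs_fail / (pred_obs_fail + pred_obs_succ)
    return failure - context
```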

  3. Debugging: Finding, Fixing and Flailing, a Multi-Institutional Study of Novice Debuggers

    Science.gov (United States)

    Fitzgerald, Sue; Lewandowski, Gary; McCauley, Renee; Murphy, Laurie; Simon, Beth; Thomas, Lynda; Zander, Carol

    2008-01-01

    Debugging is often difficult and frustrating for novices. Yet because students typically debug outside the classroom and often in isolation, instructors rarely have the opportunity to closely observe students while they debug. This paper describes the details of an exploratory study of the debugging skills and behaviors of contemporary novice Java…

  4. Automatic Eye Detection Error as a Predictor of Face Recognition Performance

    OpenAIRE

    Dutta, Abhishek; Veldhuis, Raymond; Spreeuwers, Luuk

    2014-01-01

    Various facial image quality parameters like pose, illumination, noise, resolution, etc., are known to be predictors of face recognition performance. However, there still remain many other properties of facial images that are not captured by the existing quality parameters. In this paper, we propose a novel image quality parameter called the Automatic Eye Detection Error (AEDE), which measures the difference between manually located and automatically detected eye coordinates. Our experiment res...

  5. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottlenecks of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.
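The automated-instrumentation idea, wrapping code regions so timing data is collected without manual edits, can be sketched in Python with a decorator. ParaVT itself instruments PVM programs; the names and the in-memory store below are illustrative assumptions:

```python
import functools
import time

# Collected timing data: region name -> list of per-call durations.
REGION_TIMES = {}

def instrument(fn):
    """Automatically wrap a function so every call records its
    wall-clock duration, without editing the function body."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            REGION_TIMES.setdefault(fn.__name__, []).append(
                time.perf_counter() - start)
    return wrapper

@instrument
def work(n):
    """Toy code region."""
    return sum(range(n))
```

A visual front end like ParaVT's would then aggregate `REGION_TIMES` across processes to highlight the slowest regions.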

  6. Compiler-Directed Automatic Performance Tuning (TUNE) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Chame, Jacqueline [USC-ISI

    2013-06-07

    TUNE was created to develop compiler-directed performance tuning technology targeting the Cray XT4 system at Oak Ridge. TUNE combines compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation. The goal of this performance-tuning technology is to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, TUNE aims to make compiler technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  7. AutomaDeD: Automata-Based Debugging for Dissimilar Parallel Tasks

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Laguna, I; Bagchi, S; de Supinski, B R; Ahn, D; Schulz, M

    2010-03-23

    Today's largest systems have over 100,000 cores, with million-core systems expected over the next few years. This growing scale makes debugging the applications that run on them a daunting challenge. Few debugging tools perform well at this scale and most provide an overload of information about the entire job. Developers need tools that quickly direct them to the root cause of the problem. This paper presents AutomaDeD, a tool that identifies which tasks of a large-scale application first manifest a bug at a specific code region at a specific point during program execution. AutomaDeD creates a statistical model of the application's control-flow and timing behavior that organizes tasks into groups and identifies deviations from normal execution, thus significantly reducing debugging effort. In addition to a case study in which AutomaDeD locates a bug that occurred during development of MVAPICH, we evaluate AutomaDeD on a range of bugs injected into the NAS parallel benchmarks. Our results demonstrate that AutomaDeD detects the time period when a bug first manifested itself with 90% accuracy for stalls and hangs and 70% accuracy for interference faults. It identifies the subset of processes first affected by the fault with 80% and 70% accuracy, respectively, and the code region where the fault first manifested with 90% and 50% accuracy, respectively.
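The grouping-and-deviation idea can be sketched crudely: represent each task by a vector of per-region times and flag the task farthest from the group centroid. AutomaDeD actually builds semi-Markov models of control flow and timing; this plain distance check is only a stand-in with made-up profile data:

```python
def find_outlier_task(profiles):
    """profiles: dict task_id -> list of per-code-region times.
    Returns the task whose timing vector deviates most from the mean,
    i.e. the most likely first manifestation point of a stall or hang."""
    n = len(profiles)
    dims = len(next(iter(profiles.values())))
    centroid = [sum(p[i] for p in profiles.values()) / n
                for i in range(dims)]

    def dist(p):
        return sum((a - b) ** 2 for a, b in zip(p, centroid)) ** 0.5

    return max(profiles, key=lambda t: dist(profiles[t]))
```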

  8. Automatic Assessment of Complex Task Performance in Games and Simulations. CRESST Report 775

    Science.gov (United States)

    Iseli, Markus R.; Koenig, Alan D.; Lee, John J.; Wainess, Richard

    2010-01-01

    Assessment of complex task performance is crucial to evaluating personnel in critical job functions such as Navy damage control operations aboard ships. Games and simulations can be instrumental in this process, as they can present a broad range of complex scenarios without involving harm to people or property. However, "automatic" performance…

  9. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-nonsense look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our atten

  10. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
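Record-level backward tracing over logical provenance can be sketched as walking per-node output-to-input mappings in reverse. The stage layout below is an illustrative assumption, not Panda's storage format:

```python
def backward_trace(stages, record):
    """stages: workflow nodes in execution order, each represented by its
    logical provenance mapping {output_record: set of input records}.
    Walks from a final output record back to the base records that
    contributed to it."""
    frontier = {record}
    for mapping in reversed(stages):
        next_frontier = set()
        for r in frontier:
            next_frontier |= mapping.get(r, set())
        frontier = next_frontier
    return frontier
```

Forward tracing is the symmetric walk from inputs to outputs, useful for asking "which results does this suspect input affect?" during workflow debugging.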

  11. Prototype application for the control and debugging of CMS upgrade projects

    CERN Document Server

    Mills-Howell, Dominic

    2016-01-01

    Following the high-luminosity upgrades of the LHC, many subsystems of the CMS experiment require upgrading, and others are using the LHC shutdowns as an opportunity to improve performance. The upgrades themselves have served to highlight the need to attack problems that were previously unaddressed. One such problem is the need for a tool that allows users to easily monitor, debug, and test custom hardware. Such a tool could, in theory, be abstracted to work with various hardware devices, with the added benefit of being able to support future hardware while maintaining parallel operation with the remaining control software.

  12. Automatic image registration performance for two different CBCT systems; variation with imaging dose

    Science.gov (United States)

    Barber, J.; Sykes, J. R.; Holloway, L.; Thwaites, D. I.

    2014-03-01

    The performance of an automatic image registration algorithm was compared on image sets collected with two commercial CBCT systems, and the relationship with imaging dose was explored. CBCT images of a CIRS Virtually Human Male Pelvis phantom (VHMP) were collected on Varian TrueBeam/OBI and Elekta Synergy/XVI linear accelerators, across a range of mAs settings. Each CBCT image was registered 100 times, with random initial offsets introduced. Image registration was performed using the grey value correlation ratio algorithm in the Elekta XVI software, to a mask of the prostate volume with 5 mm expansion. Residual registration errors were calculated after correcting for the initial introduced phantom set-up error. Registration performance with the OBI images was similar to that of XVI. There was a clear dependence on imaging dose for the XVI images, with residual errors increasing below 4 mGy. It was not possible to acquire images with doses lower than ~5 mGy with the OBI system, and no evidence of reduced performance was observed at this dose. Registration failures (maximum target registration error > 3.6 mm on the surface of a 30 mm sphere) occurred in 5% to 9% of registrations, except for the lowest dose XVI scan (31%). The uncertainty in automatic image registration with both OBI and XVI images was found to be adequate for clinical use within a normal range of acquisition settings.
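For reference, the grey value correlation ratio used as the similarity metric measures how well one image's intensities are predicted by a binned version of the other's; a value of 1 means perfect functional dependence (good alignment). A small pure-Python sketch over flattened intensity lists (the bin count and edge handling are illustrative choices):

```python
def correlation_ratio(fixed, moving, bins=8):
    """eta^2 of moving-image intensities conditioned on binned
    fixed-image intensities: 1 - within-bin variance / total variance."""
    lo, hi = min(fixed), max(fixed)
    width = (hi - lo) / bins or 1.0  # guard against a constant image
    groups = {}
    for f, m in zip(fixed, moving):
        b = min(int((f - lo) / width), bins - 1)
        groups.setdefault(b, []).append(m)
    n = len(moving)
    mean_all = sum(moving) / n
    var_all = sum((m - mean_all) ** 2 for m in moving) / n
    if var_all == 0:
        return 1.0
    within = 0.0
    for vals in groups.values():
        mu = sum(vals) / len(vals)
        within += sum((v - mu) ** 2 for v in vals)
    return 1.0 - within / (n * var_all)
```

A registration algorithm maximizes this metric over rigid transforms of the moving image; only the metric itself is sketched here.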

  13. Performance testing of the automatic transmission

    Institute of Scientific and Technical Information of China (English)

    张琳琳

    2013-01-01

    This article systematically analyzes the procedures and methods for performance testing of the automatic transmission from three aspects: the inspection and replacement of automatic transmission fluid, the inspection and adjustment of the shift lever position, and the performance test of the automatic transmission.

  14. A debugging system for azimuthally acoustic logging tools based on modular and hierarchical design ideas

    Science.gov (United States)

    Zhang, K.; Ju, X. D.; Lu, J. Q.; Men, B. Y.

    2016-08-01

    On the basis of modular and hierarchical design ideas, this study presents a debugging system for an azimuthally sensitive acoustic bond tool (AABT). The debugging system includes three parts: a personal computer (PC), an embedded front-end machine and function expansion boards. Modular and hierarchical design ideas are applied throughout the design and debugging process. The PC communicates with the front-end machine via the Internet, and the front-end machine and function expansion boards are connected by an extended parallel bus. In this way, the three parts of the debugging system achieve stable, high-speed data communication. This study introduces not only the system-level and sub-system-level debugging of the tool but also the debugging of the analogue signal-processing board, which is important and widely used in logging tools. Experiments illustrate that the debugging system can greatly improve AABT verification and calibration efficiency and that board-level debugging can examine and improve analogue signal-processing boards. The design is clearly structured, making the debugging system easy to extend and upgrade.

  15. The ALDB box: automatic testing of cognitive performance in groups of aviary-housed pigeons.

    Science.gov (United States)

    Huber, Ludwig; Heise, Nils; Zeman, Christopher; Palmers, Christian

    2015-03-01

    The combination of highly controlled experimental testing and the voluntary participation of unrestrained animals has many advantages over traditional, laboratory-based learning environments in terms of animal welfare, learning speed, and resource economy. Such automatic learning environments have recently been developed for primates (Fagot & Bonté, 2010; Fagot & Paleressompoulle, 2009) but, so far, this has not been achieved with highly mobile creatures such as birds. Here, we present a novel testing environment for pigeons. Living together in small groups in outside aviaries, they can freely choose to participate in learning experiments by entering and leaving the automatic learning box at any time. At the single-access entry, they are individualized using radio frequency identification technology and then trained or tested in a stress-free and self-terminating manner. The voluntary nature of their participation according to their individual biorhythm guarantees high motivation levels and good learning and test performance. Around-the-clock access allows for massed-trials training, which in baboons has been proven to have facilitative effects on discrimination learning. The performance of 2 pigeons confirmed the advantages of the automatic learning device for birds (ALDB) box. The latter is the result of a development process of several years that required us to deal with and overcome a number of technical challenges: (1) mechanically controlled access to the box, (2) identification of the birds, (3) the release of a bird and, at the same time, prevention of others from entering the box, and (4) reliable functioning of the device despite long operation times and exposure to high dust loads and low temperatures.

  17. A Framework to Debug Diagnostic Matrices

    Science.gov (United States)

    Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann

    2013-01-01

    Diagnostics is an important concept in system health and monitoring of space operations. Many existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also popularly known as a diagnostic dictionary, fault signature matrix or reachability matrix) gleaned from physical models. Sometimes, however, this matrix is not sufficient to obtain high diagnostic performance. In such a case, it is important to modify the D-matrix based on knowledge obtained from other sources, such as a time-series data stream (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure includes conditions ranging from modifying 0s and 1s in the matrix to adding/removing rows (failure sources) and columns (tests). We experiment with this framework on datasets from the DX Challenge 2009.
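The D-matrix and the evaluation loop around it can be sketched as follows: diagnosis matches observed test outcomes against fault signatures, and the evaluator scores a candidate matrix against labeled runs, which is the quantity a DME-style procedure would try to improve by flipping individual 0/1 entries. The matrix contents and run data below are illustrative:

```python
def diagnose(d_matrix, test_results):
    """d_matrix: dict fault -> list of 0/1 expected test outcomes.
    Returns faults whose signature matches the observed outcomes."""
    return [f for f, sig in d_matrix.items() if sig == test_results]

def evaluate(d_matrix, labeled_runs):
    """Fraction of labeled runs (observed outcomes, true fault) whose
    true fault appears among the diagnoses."""
    hits = sum(fault in diagnose(d_matrix, obs)
               for obs, fault in labeled_runs)
    return hits / len(labeled_runs)
```

An update loop would try each single-entry flip, keep any flip that raises the evaluation score, and stop when no cheap modification improves it further.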

  18. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    Directory of Open Access Journals (Sweden)

    Luo Hanwu

    2016-01-01

    This paper presents a novel method to solve for the initial lightning breakdown current by effectively combining the ATP and MATLAB simulation software, with the aim of evaluating the lightning protection performance of transmission lines. Firstly, the executable ATP simulation model is generated automatically from the required information, such as power source parameters, tower parameters, overhead line parameters, grounding resistance and lightning current parameters, through an interface program coded in MATLAB. Then, data are extracted from the LIS files produced by executing the ATP simulation model, and the occurrence of transmission line breakdown can be determined from the relevant data in the LIS file. The lightning current amplitude is reduced when breakdown occurs, and increased otherwise. Thus the initial lightning breakdown current of a transmission line with given parameters can be determined accurately by continuously adjusting the lightning current amplitude, realized by a loop computing algorithm coded in MATLAB. The method proposed in this paper generates the ATP simulation program automatically and facilitates the lightning protection performance assessment of transmission lines.
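The amplitude-adjustment loop is essentially a threshold search over a monotone breakdown predicate, and bisection is one natural way to implement it. In this sketch the ATP run and LIS-file parsing are abstracted behind the predicate, and the bounds and tolerance are illustrative assumptions:

```python
def initial_breakdown_current(breaks_down, lo=0.0, hi=300.0, tol=0.5):
    """Bisection stand-in for the ATP/MATLAB loop. breaks_down(amplitude)
    would execute the generated ATP model at that lightning current
    amplitude (kA) and report whether the line breaks down; here it can
    be any monotone predicate. Returns the breakdown threshold to
    within tol."""
    assert breaks_down(hi) and not breaks_down(lo)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if breaks_down(mid):
            hi = mid  # breakdown occurred: lower the amplitude
        else:
            lo = mid  # no breakdown: raise the amplitude
    return hi
```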

  19. Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2013-12-01

    This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of the alternating tapping performance of patients with Parkinson’s disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD used a touch-pad handheld computer to perform alternating tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well with the visually assessed scores and were significantly different across Unified Parkinson’s Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, good ability to discriminate between healthy elderly subjects and patients in different disease stages, and good sensitivity to treatment interventions, and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful for objectively assessing the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.
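The dimension-reduction step can be sketched with a plain SVD-based principal component projection. This is NumPy-only and omits the classifier stage and the study's exact parameterization:

```python
import numpy as np

def pca_reduce(X, k):
    """Project rows of X (n_samples x n_features) onto the top-k
    principal components of the mean-centered data."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)
    # Right singular vectors of the centered data are the principal
    # axes, ordered by decreasing explained variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T
```

In the paper's pipeline, the 24 tapping parameters per test would form the rows of X, and the reduced scores would feed the logistic regression classifier.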

  20. Effect of an automatic feeding system on growth performance and feeding behaviour of pigs reared outdoors

    Directory of Open Access Journals (Sweden)

    Riccardo Fortina

    2010-01-01

    Full Text Available Nine Mora Romagnola and 10 Large White x Mora Romagnola growing pigs were reared outdoors. Both groups were fed ad libitum. Conventional pigs received feed twice a day, distributed in two long troughs. Inside the corral of the second group, an automatic station was set up for feed distribution, weighing of the pigs, and monitoring by an analog camera. The self-feeders thus received feed ad libitum individually from the automatic system, divided into small quantities at meal times. During the experiment the analog camera operated 24 hours a day to collect pictures of the pigs in order to investigate their behaviours. For each picture, the day and hour, the number of visible pigs and their behaviours were recorded, and a statistical analysis of the data, expressed as hourly frequencies of behavioural elements, was performed. Moreover, to highlight “active” and “passive” behaviours between the groups, two categories, “Move” and “Rest”, were created by grouping some behavioural elements. With regard to performance, conventional pigs reached a higher total weight gain (56.1±2.42 kg vs 46.7±2.42 kg; P=0.0117), but the feed conversion index (FCI) of both groups was similar. The self-feeders consumed less feed than the conventional animals. The feeding system seems to influence behaviour. The percentage of time spent in Eating activity differs (P<0.0001) between the self-fed (median 24.6%) and conventional pigs (median 10.9%). The resulting more regular eating pattern of the self-feeders influenced the distribution of daily activities. The behavioural category Rest (median: self-feeders 55.0% vs 71.4% conventional pigs) was dominant, with conventional pigs becoming more restless, particularly at meal times. This type of feeding competition and aggressive behaviour did not happen in the self-feeders due to the feed distribution system. The self-feeder results showed that pigs eat at the automatic station both day and night. The animals perform on

  1. Transducer-actuator systems and methods for performing on-machine measurements and automatic part alignment

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, William E.; Dow, Thomas A.; Garrard, Kenneth P.; Marston, Zachary

    2016-07-12

    Systems and methods for performing on-machine measurements and automatic part alignment, including: a measurement component operable for determining the position of a part on a machine; and an actuation component operable for adjusting the position of the part by contacting the part with a predetermined force responsive to the determined position of the part. The measurement component consists of a transducer. The actuation component consists of a linear actuator. Optionally, the measurement component and the actuation component consist of a single linear actuator operable for contacting the part with a first lighter force for determining the position of the part and with a second harder force for adjusting the position of the part. The actuation component is utilized in a substantially horizontal configuration and the effects of gravitational drop of the part are accounted for in the force applied and the timing of the contact.

  2. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    Science.gov (United States)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Nimar S. Arora, Stuart Russell, Erik Sudderth, Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.

  3. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

    Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.
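A core ingredient of SIFT-based tie point extraction is the descriptor matching stage, usually filtered with Lowe's ratio test: a candidate match is kept only when the nearest descriptor is much closer than the second nearest. A minimal NumPy sketch on synthetic descriptors (the 128-dimensional size mimics SIFT; the data are invented):

```python
import numpy as np

def ratio_test_matches(desc1, desc2, ratio=0.8):
    """Lowe's ratio test: accept a match only when the nearest descriptor in
    desc2 is sufficiently closer than the second nearest. Returns index pairs."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        j, k = np.argsort(dists)[:2]
        if dists[j] < ratio * dists[k]:
            matches.append((i, j))
    return matches

# Synthetic check: 5 descriptors from image 2 re-observed with small noise
rng = np.random.default_rng(0)
desc2 = rng.normal(size=(50, 128))
desc1 = desc2[:5] + rng.normal(scale=0.01, size=(5, 128))
matches = ratio_test_matches(desc1, desc2)
```

The ratio threshold trades completeness against reliability: lowering it rejects ambiguous matches, which matters under the convergent geometries and poor textures the abstract mentions.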

  4. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    Science.gov (United States)

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A(2) SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems. PMID:22412336

  5. A HYBRID METHOD FOR AUTOMATIC SPEECH RECOGNITION PERFORMANCE IMPROVEMENT IN REAL WORLD NOISY ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Urmila Shrawankar

    2013-01-01

    Full Text Available It is a well-known fact that speech recognition systems perform well when used in conditions similar to those under which the acoustic models were trained; mismatches degrade the performance. In adverse environments it is very difficult to predict the category of real-world environmental noise in advance, and difficult to achieve environmental robustness. After a rigorous experimental study it was observed that no single method is available that both cleans speech corrupted by real, natural environmental (mixed) noise and preserves its quality. It was also observed that back-end techniques alone are not sufficient to improve the performance of a speech recognition system; performance-improvement techniques need to be applied at every step of the back end as well as the front end of the Automatic Speech Recognition (ASR) model. Current recognition systems address this problem using a technique called adaptation. This study presents an experimental study with two aims: first, to implement a hybrid method that cleans the speech signal as much as possible using all combinations of filters and enhancement techniques; second, to develop a method for training on all categories of noise that can adapt the acoustic models to a new environment, helping to improve the performance of the speech recognizer under mismatched real-world environmental conditions. The experiments confirm that hybrid adaptation methods improve ASR performance on both levels: Signal-to-Noise Ratio (SNR) improvement as well as word recognition accuracy in real-world noisy environments.
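SNR improvement, one of the two evaluation criteria above, can be illustrated with a toy front end: a simple moving-average filter stands in for the hybrid enhancement chain (which the abstract does not specify), and SNR is measured against a known clean signal:

```python
import numpy as np

def snr_db(clean, noisy):
    """Signal-to-noise ratio in dB, treating (noisy - clean) as the noise."""
    noise = noisy - clean
    return 10 * np.log10(np.sum(clean**2) / np.sum(noise**2))

rng = np.random.default_rng(1)
t = np.arange(4000)
clean = np.sin(2 * np.pi * t / 200)   # slow, speech-like tone (synthetic)
noisy = clean + rng.normal(scale=0.5, size=t.size)

# Toy enhancement step: 5-point moving average (stand-in for the hybrid front end)
kernel = np.ones(5) / 5
enhanced = np.convolve(noisy, kernel, mode="same")

before, after = snr_db(clean, noisy), snr_db(clean, enhanced)
```

Averaging five samples cuts the white-noise power roughly fivefold while barely attenuating the slow signal, so the measured SNR rises by several dB; real enhancement chains are judged by exactly this kind of before/after comparison alongside word accuracy.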

  6. Productive performance of Nile tilapia (Oreochromis niloticus) fed at different frequencies and periods with automatic dispenser

    Directory of Open Access Journals (Sweden)

    R.M.R. Sousa

    2012-02-01

    Full Text Available The performance of Nile tilapia (Oreochromis niloticus) raised in cages furnished with an automatic dispenser, supplied at different frequencies (once per hour and once every two hours) and periods (daytime, nighttime and both), was evaluated. Eighteen 1.0m³ cages were placed into a 2000m² pond, two meters deep with a 5% water exchange. One hundred and seventy tilapias, with an initial weight of 16.0±4.9g, were dispersed into each 1m³ cage and the feed ration was adjusted every 21 days with biometry. Data were collected from March to July (autumn and winter). A significant difference in final weight (P<0.05) among treatments was observed. Increasing the feeding frequency improved the productive performance of Nile tilapia in cages and permitted better management of the food. The better feed conversion rate at high feeding frequency (24 times day-1) can save up to 360kg of food for each ton of fish produced, increasing the economic sustainability of tilapia culture and suggesting less environmental pollution.
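The reported saving follows from simple arithmetic on the feed conversion rate (FCR, kg of feed per kg of fish gain): a drop of 0.36 in FCR means 360 kg less feed per tonne of fish. The two FCR values below are invented for illustration; the abstract gives only the resulting saving:

```python
fcr_low_freq = 1.80    # kg feed per kg fish gain (illustrative)
fcr_high_freq = 1.44   # illustrative improved FCR at 24 feedings per day

# Feed saved when producing 1000 kg of fish at the better conversion rate
feed_saved_per_ton = (fcr_low_freq - fcr_high_freq) * 1000  # kg per tonne of fish
```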

  7. CCD:An Integrated C Coding and Debugging Tool

    Institute of Scientific and Technical Information of China (English)

    金立群

    1993-01-01

    CCD is an integrated software tool intended to support coding and debugging for the C language. It integrates a hybrid editor, an incremental semantic analyzer, a multi-entry parser, an incremental unparser and a source-level debugger into a single tool. The integration is realized by sharing common knowledge among all the components of the system and by task-oriented combination of the components. A nonlocal attribute grammar is adopted for specifying the common knowledge about the syntax and semantics of the C language. Incremental attribute evaluation is used to implement the semantic analyzer and the unparser, increasing system efficiency. CCD treats preprocessor directives and comments in a regular way to make the tool practical.

  8. Data Provenance as a Tool for Debugging Hydrological Models based on Python

    Science.gov (United States)

    Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.

    2012-12-01

    There is an increase in the data volume used in hydrological modeling. The increasing data volume requires additional effort in debugging models, since a single output value is influenced by a multitude of input values; it is difficult to keep an overview of the data dependencies, and even knowing these dependencies, it is a tedious job to infer all the relevant data values. These data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph, without executing the script. To debug the model, the user specifies the value of interest in space and time; the tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and used more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consume around 40 GB of storage. Using the proposed tool a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, it is a useful tool that provides the modeler with efficient information to investigate the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. If a Python script contains an unknown function or class, the tool requests additional information about the function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R
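The core idea, inferring which values a result depends on from the script's source without executing it, can be sketched with Python's standard `ast` module. This is a minimal assignment-level version; the actual tool additionally tracks individual raster cells in space and time, which is not attempted here:

```python
import ast

def dependencies(source):
    """Map each assigned name to the set of names its value is computed from."""
    deps = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            read = {n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)}
            for target in node.targets:
                if isinstance(target, ast.Name):
                    deps[target.id] = read
    return deps

def provenance(deps, name):
    """Transitively collect every name that influenced `name`."""
    seen, stack = set(), list(deps.get(name, ()))
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(deps.get(n, ()))
    return seen

# Toy hydrological script (variable names invented for illustration)
script = "demand = population * per_capita\nsupply = runoff - demand\n"
influencers = provenance(dependencies(script), "supply")
```

Because the analysis is purely static, the dependency graph is available before any of the 3000 input files is read, which is exactly what makes it cheap to use as a debugging aid.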

  9. Validation of a novel automatic sleep spindle detector with high performance during sleep in middle aged subjects

    DEFF Research Database (Denmark)

    Wendt, Sabrina Lyngbye; Christensen, Julie A. E.; Kempfner, Jacob;

    2012-01-01

    Many of the automatic sleep spindle detectors currently used to analyze sleep EEG are either validated on young subjects or not validated thoroughly. The purpose of this study is to develop and validate a fast and reliable sleep spindle detector with high performance in middle aged subjects. An a...

  10. Distribution transformer with automatic voltage adjustment - performance; Transformador de distribucion con ajuste automatico de tension - desempeno

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Ruiz, Gustavo A.; Delgadillo Bocanegra, Alfonso; Betancourt Ramirez, Enrique [PROLEC-GE, Apodaca, Nuevo Leon (Mexico)]. E-mail: gustavo1.hernandez@ge.com; alfonso.delgadillobocanegra@ge.com; enrique.betancourt@ge.com; Ramirez Arredondo, Juan M. [CINVESTAV-Guadalajara, Zapopan, Jalisco (Mexico)]. E-mail: jramirez@gdl.cinvestav.mx

    2010-11-15

    In electric power distribution systems, power quality is strongly linked to the stability of the service voltage. In radial systems, it is virtually impossible to maintain a flat voltage along the lines, so it is desirable to have transformers that can adjust the turns ratio automatically. This work describes the development and performance of a transformer with an integrated electronic tap changer, which allows the turns ratio to be changed within the standard range of ±5%, and identifies the application limits of the technology.

  11. Effect of an automatic feeding system on growth performance and feeding behaviour of pigs reared outdoors

    OpenAIRE

    Riccardo Fortina; Salvatore Barbera; Paolo Cornale

    2009-01-01

    Nine Mora Romagnola and 10 Large White x Mora Romagnola growing pigs were reared outdoors. In both groups ad libitum feed was provided. Conventional pigs received it twice a day, distributed in two long troughs. Inside the corral of the second group, an automatic station was set up for: feed distribution, pigs weighing, and control by an analog camera. Thus the self-feeders received feed ad libitum individually by the automatic system, divided into small quantities at meal times. During the e...

  12. General collaboration offer of Johnson Controls regarding the performance of air conditioning automatic control systems and other buildings' automatic control systems

    Energy Technology Data Exchange (ETDEWEB)

    Gniazdowski, J.

    1995-12-31

    JOHNSON CONTROLS manufactures measuring and control equipment (800 types) and is also a "turn-key" supplier of complete automatic control systems for the heating, air conditioning, ventilation and refrigeration engineering branches. The Company also supplies computer-based supervision and monitoring systems for buildings that may be applied in both small and large structures. Since 1990 the company has been performing full-range trade and contracting activities on the Polish market. We have our own well-trained technical staff and we collaborate with a number of design and contracting enterprises that enable us to have our projects carried out all over Poland. The prices of our supplies and services correspond to the level of the Polish market.

  13. Remote Collaborative Debugging Model Based on Debugging Agent

    Institute of Scientific and Technical Information of China (English)

    胡先浪; 张培培

    2011-01-01

    Addressing the constraints of embedded software and hardware resources, and the fact that existing software development tools support only a one-to-one debugging mode, this paper first analyses the principle of embedded cross-debugging, then proposes a remote collaborative debugging model based on a debugging agent and describes in detail the implementation of its core component, the debugging agent. An implementation and functional verification are presented on the domestic integrated development environment JARI-IDE and the domestic embedded operating system JARI-Works. The model enables developers at different locations to share resources and collaborate in real time, providing a new approach to modular, division-of-labor debugging.

  14. An Imperfect-debugging Fault-detection Dependent-parameter Software

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Software reliability growth models (SRGMs) incorporating imperfect debugging and the learning phenomenon of developers have recently been developed by many researchers to estimate software reliability measures such as the number of remaining faults and software reliability. However, the model parameters of the fault content rate function and the fault detection rate function of these SRGMs are usually assumed to be independent of each other. In practice, this assumption may not hold, and it is worth investigating what happens when it does not. In this paper, we undertake such a study and propose a software reliability model connecting imperfect debugging and the learning phenomenon through a parameter shared by the two functions, called the imperfect-debugging fault-detection dependent-parameter model. Software testing data collected from real applications are used to illustrate both the descriptive and the predictive power of the proposed model under a non-zero initial debugging process.
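The class of models discussed can be sketched numerically. Below is a generic imperfect-debugging NHPP in which the fault content grows during debugging (new faults are introduced) and the detection rate follows a learning curve sharing the parameter b; this is an illustrative formulation for the model class, not the paper's exact dependent-parameter model:

```python
import math

def mean_faults(t_end, a0=100.0, b=0.1, alpha=0.02, dt=0.01):
    """Euler-integrate dm/dt = b(t) * (a(t) - m(t)) for a generic
    imperfect-debugging NHPP: fault content a(t) = a0*exp(alpha*t) grows as
    debugging introduces new faults, and the learning-curve detection rate
    b(t) shares the parameter b (illustrative, not the paper's exact model)."""
    m, t = 0.0, 0.0
    while t < t_end:
        a_t = a0 * math.exp(alpha * t)       # fault content at time t
        b_t = b / (1.0 + math.exp(-b * t))   # detection rate with learning
        m += b_t * (a_t - m) * dt
        t += dt
    return m

# Expected number of faults detected by 10 and by 50 test hours
m10, m50 = mean_faults(10), mean_faults(50)
```

The mean value function m(t) stays below the current fault content a(t) and never saturates at the initial fault count a0, which is the qualitative signature that distinguishes imperfect-debugging models from perfect-debugging ones.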

  15. Application of VMware VProbes to debugging of a segmentation based separation kernel

    OpenAIRE

    Sanders, Kyle

    2009-01-01

    Approved for public release; distribution is unlimited. Debugging is a useful technique in all aspects of software development, including that of operating systems. Because they provide low level interfaces to the hardware, operating systems are particularly difficult to debug. There is little room to add abstraction between the computer hardware and the executing operating system software. Many debuggers are intimately tied to the system’s memory model, compiler, and loader. For specialize...

  16. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretical and experimental study of the effects of an automatic shading device on the energetic performance of a dimmable lighting system and cooling equipment. Some equations related to the optical and thermal properties of fenestration are rederived, while others are newly developed under a theoretical approach. In order to collect field data, the energy demand and other variables were measured in two distinct stories of the Test Tower with the same fenestration features. New data were gathered after adding an automatic shading device to the window of one story. The comparison of the collected data allows the energetic performance evaluation of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  17. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard;

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  18. Improving the working performance of automatic ball balancer by modifying its mechanism

    Science.gov (United States)

    Rezaee, Mousa; Fathi, Reza

    2015-12-01

    An automatic ball balancer consists of several balls that are free to move in the circular race containing a damping fluid. Although a traditional ABB can improve the vibration behavior of an unbalanced rotor under proper working conditions, at speeds below the first critical speed, it makes the vibration amplitude of the rotor larger than that of a rotor without an automatic ball balancer. Moreover, it has a limited stable region of the perfect balancing configuration. Considering these deficiencies, in this study a new design for automatic ball balancer is proposed. To analyze the problem, the nonlinear governing equations of a rotor equipped with the new ABB are derived using Lagrange's equations. Then, stability analysis is carried out on the basis of linearized equations of motion around the equilibrium positions to obtain the stable region of the system. It is shown that the new ABB can prevent the rotor from increasing the vibrations at the transient state. Also, it increases the stable region of the perfect balancing configuration. Comparing the results with those corresponding to the traditional ball balancer shows that the new ABB can reduce the vibration amplitude at speeds below the first critical speed and it increases the stable region of the perfect balancing configuration.

  19. Compile-Time Debugging of C Programs Working on Trees

    DEFF Research Database (Denmark)

    Elgaard, Jacob; Møller, Anders; Schwartzbach, Michael I.

    2000-01-01

    We exhibit a technique for automatically verifying the safety of simple C programs working on tree-shaped data structures. We do not consider the complete behavior of programs, but only attempt to verify that they respect the shape and integrity of the store. A verified program is guaranteed to preserve the tree-shapes of data structures, to avoid pointer errors such as NULL dereferences, leaking memory, and dangling references, and furthermore to satisfy assertions specified in a specialized store logic. A program is transformed into a single formula in WSRT, an extension of WS2S that is decided … of an initial store that leads to an error is automatically generated. This extends previous work that uses a similar technique to verify a simpler syntax manipulating only list structures. In that case, programs are translated into WS1S formulas. A naive generalization to recursive data-types determines…

  20. Automatic sprinkler system performance and reliability in United States Department of Energy Facilities, 1952 to 1980

    International Nuclear Information System (INIS)

    The automatic sprinkler system experiences of the United States Department of Energy and its predecessor agencies are analyzed. Based on accident and incident files in the Office of Operational Safety and on supplementary responses, 587 incidents including over 100 fires are analyzed. Tables and figures, with supplementary narratives, discuss fire experience by various categories such as number of heads operating, type of system, dollar losses, failures, extinguished vs. controlled, and types of sprinkler heads. Use is made of extreme value projections and frequency-severity plots to compare past experience and predict future experience. Non-fire incidents are analyzed in a similar manner by cause, system type and failure type. Discussion of no-loss incidents and non-fire-protection water systems is included. The author's conclusions and recommendations, and appendices listing the survey methodology, major incidents, and a bibliography, are included
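The extreme value projections mentioned can be sketched by fitting a Gumbel distribution to annual maximum losses with the method of moments and reading off return levels. The loss figures below are invented for illustration, not taken from the report:

```python
import math

annual_max_losses = [12, 30, 8, 55, 21, 90, 15, 40, 27, 60]  # k$ per year (illustrative)

n = len(annual_max_losses)
mean = sum(annual_max_losses) / n
var = sum((x - mean) ** 2 for x in annual_max_losses) / (n - 1)

# Method-of-moments Gumbel fit: scale beta and location mu
beta = math.sqrt(6 * var) / math.pi
mu = mean - 0.5772 * beta      # 0.5772 ~ Euler-Mascheroni constant

def return_level(T):
    """Loss exceeded on average once every T years under the fitted Gumbel."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

loss_10yr, loss_100yr = return_level(10), return_level(100)
```

Plotting such return levels against the return period gives the frequency-severity comparison the report uses to extrapolate past experience into predictions.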

  1. Stair descending exercise using a novel automatic escalator: effects on muscle performance and health-related parameters.

    Science.gov (United States)

    Paschalis, Vassilis; Theodorou, Anastasios A; Panayiotou, George; Kyparos, Antonios; Patikas, Dimitrios; Grivas, Gerasimos V; Nikolaidis, Michalis G; Vrabas, Ioannis S

    2013-01-01

    A novel automatic escalator was designed, constructed and used in the present investigation. The aim of the present investigation was to compare the effect of two repeated sessions of stair descending versus stair ascending exercise on muscle performance and health-related parameters in young healthy men. Twenty males participated and were randomly divided into two equal-sized groups: a stair descending group (muscle-damaging group) and a stair ascending group (non-muscle-damaging group). Each group performed two sessions of stair descending or stair ascending exercise on the automatic escalator, with a three-week period elapsing between the two exercise sessions. Indices of muscle function, insulin sensitivity, blood lipid profile and redox status were assessed before and immediately after, as well as at day 2 and day 4 after, both exercise sessions. It was found that the first bout of stair descending exercise caused muscle damage and induced insulin resistance and oxidative stress, but positively affected the blood lipid profile. However, after the second bout of stair descending exercise the alterations in all parameters were diminished or abolished. On the other hand, the stair ascending exercise induced only minor effects on muscle function and health-related parameters after both exercise bouts. The results of the present investigation indicate that stair descending exercise seems to be a promising form of exercise that can provoke positive effects on blood lipid profile and antioxidant status.

  2. Modern multithreading implementing, testing, and debugging multithreaded Java and C++/Pthreads/Win32 programs

    CERN Document Server

    Carver, Richard H

    2005-01-01

    Master the essentials of concurrent programming, including testing and debugging. This textbook examines languages and libraries for multithreaded programming. Readers learn how to create threads in Java and C++, and develop essential concurrent programming and problem-solving skills. Moreover, the textbook sets itself apart from other comparable works by helping readers to become proficient in key testing and debugging techniques. Among the topics covered, readers are introduced to the relevant aspects of Java, the POSIX Pthreads library, and the Windows Win32 Applications Programming Interface.

  3. Performing the processing required to automatically obtain a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically obtain a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and inject the required data into the original source files, creating new ones ready to be compiled with all related dependencies. In addition, I proposed the creation of an HTML version, consistent with the PDF and navigable for easy access; experimented with Natural Language Processing for extracting metadata; and proposed injecting the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in a sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  4. Design and performance of an automatic regenerating adsorption aerosol dryer for continuous operation at monitoring sites

    Science.gov (United States)

    Tuch, T. M.; Haudek, A.; Müller, T.; Nowak, A.; Wex, H.; Wiedensohler, A.

    2009-04-01

    Sizes of aerosol particles depend on the relative humidity of their carrier gas. Most monitoring networks therefore require that the aerosol is dried to a relative humidity below 50% RH to ensure comparability of measurements at different sites. Commercially available aerosol dryers are often not suitable for this purpose at remote monitoring sites: adsorption dryers need to be regenerated frequently, and maintenance-free single-column Nafion dryers are not designed for high aerosol flow rates. We therefore developed an automatically regenerating adsorption aerosol dryer with a design flow rate of 1 m³/h. The particle transmission efficiency of this dryer was determined during a 3-week experiment. The lower 50% cut-off was found to be below 3 nm at the design flow rate of the instrument. Measured transmission efficiencies are in good agreement with theoretical calculations. One dryer has been successfully deployed in the Amazonas river basin. From this monitoring site, we present data from the first 6 months of measurements (February 2008-August 2008). Apart from one unscheduled service, this dryer did not require any maintenance during this time period. The average relative humidity of the dried aerosol was 27.1±7.5% RH, compared to an average ambient relative humidity of nearly 80% and temperatures around 30°C. This initial deployment demonstrated that these dryers are well suited for continuous operation at remote monitoring sites under adverse ambient conditions.

  5. Design and performance of a video-based laser beam automatic alignment system

    Institute of Scientific and Technical Information of China (English)

    Daizhong Liu(刘代中); Renfang Xu(徐仁芳); Dianyuan Fan(范滇元)

    2004-01-01

    A laser alignment system is applied to a high-power laser facility for inertial confinement fusion. A design of the automated, closed-loop laser beam alignment system is described. Its function is to sense beam alignment errors in a laser beam transport system and automatically steer mirrors preceding the sensor location as required to maintain beam alignment. The laser beam is sampled by a sensor package, which uses video cameras to sense pointing and centering errors. The camera outputs are fed to a personal computer, which includes video digitizers and uses image storage and software to sense the centroid of the image. Signals are sent through the computer to a stepper motor controller, which drives stepper motors on mirror mounts preceding the beam sampling location to return the beam alignment to the prescribed condition. Its optical principles and key techniques are given. The pointing and centering sensitivities of the beam alignment sensor package are analyzed. The system has been verified on the multi-pass amplifier experimental system.
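The centering step the abstract describes, locating the intensity centroid of a camera frame and converting the offset into stepper-motor commands, can be sketched as follows. This is a minimal illustration; the frame layout, the steps-per-pixel gain, and all function names are assumptions, not taken from the paper.

```python
def beam_centroid(image):
    """Intensity-weighted centroid (row, col) of a camera frame,
    given as a list of rows of pixel intensities."""
    total = sum(sum(row) for row in image)
    r = sum(i * sum(row) for i, row in enumerate(image)) / total
    c = sum(j * v for row in image for j, v in enumerate(row)) / total
    return r, c

def correction_steps(centroid, target, steps_per_pixel=4):
    """Stepper-motor steps needed to move the beam centroid to the target."""
    return tuple(round(steps_per_pixel * (t - c)) for c, t in zip(centroid, target))

# Synthetic 8x8 frame with a bright spot offset from the target position (4, 4).
frame = [[0.0] * 8 for _ in range(8)]
frame[2][5] = 1.0
print(beam_centroid(frame))                            # (2.0, 5.0)
print(correction_steps(beam_centroid(frame), (4, 4)))  # (8, -4)
```

In the real system this loop would run continuously: sample a frame, compute the centroid error, command the mirror-mount motors, and repeat until the error is within tolerance.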

  6. Xenus AC Drives’Debugging and Application%Xenus交流驱动器的调试及应用

    Institute of Scientific and Technical Information of China (English)

    翟昭斌

    2014-01-01

    With AC motors widely used in automatic control systems, a wide variety of related AC drives is available for different control areas. The AC drives produced by Copley are fully functional, easy to use, simple to debug, easy to control, and applicable to a wide range of uses. This article mainly introduces the characteristics and functions of the Xenus AC drive and its concrete application in a servo system.

  7. Automatic Eye Detection Error as a Predictor of Face Recognition Performance

    NARCIS (Netherlands)

    Dutta, Abhishek; Veldhuis, Raymond; Spreeuwers, Luuk

    2014-01-01

    Various facial image quality parameters like pose, illumination, noise, resolution, etc. are known to be predictors of face recognition performance. However, there still remain many other properties of facial images that are not captured by the existing quality parameters. In this paper, we propose

  8. Debugging of Class-D Audio Power Amplifiers

    DEFF Research Database (Denmark)

    Crone, Lasse; Pedersen, Jeppe Arnsdorf; Mønster, Jakob Døllner;

    2012-01-01

    Determining and optimizing the performance of a Class-D audio power amplifier can be very difficult without knowledge of the use of audio performance measuring equipment and of how the various noise and distortion sources influence the audio performance. This paper gives an introduction on how to measu...

  9. EOS: Automatic In-vivo Evolution of Kernel Policies for Better Performance

    OpenAIRE

    Cui, Yan; Chen, Quan; Yang, Junfeng

    2015-01-01

    Today's monolithic kernels often implement a small, fixed set of policies such as disk I/O scheduling policies, while exposing many parameters to let users select a policy or adjust the specific setting of the policy. Ideally, the parameters exposed should be flexible enough for users to tune for good performance, but in practice, users lack domain knowledge of the parameters and are often stuck with bad, default parameter settings. We present EOS, a system that bridges the knowledge gap betw...

  10. Multistation alarm system for eruptive activity based on the automatic classification of volcanic tremor: specifications and performance

    Science.gov (United States)

    Langer, Horst; Falsaperla, Susanna; Messina, Alfio; Spampinato, Salvatore

    2015-04-01

    The alert system is hitherto one of the main automatic alerting tools to identify impending eruptive events at Etna. The currently operating software, named KKAnalysis, is applied to the data stream continuously recorded at two seismic stations. The data are merged with reference datasets of past eruptive episodes; in doing so, the results of pattern classification can be immediately compared to previous eruptive scenarios. Given the rich material collected in recent years, here we propose the application of the alert system to a wider range of stations (up to a total of eleven) at different elevations (1200-3050 m) and distances (1-8 km) from the summit craters. Critical alert parameters were empirically defined to obtain an optimal tuning of the alert system for each station. To verify the robustness of this new, multistation alert system, a dataset encompassing about eight years of continuous seismic records (since 2006) was processed automatically using KKAnalysis and collateral software offline. Then, we analyzed the performance of the classifier in terms of timing and spatial distribution of the stations.

  11. A computer-aided control system for automatic performance measurements on the LHC series dipoles

    International Nuclear Information System (INIS)

    The control system software (Test Master) for the Large Hadron Collider (LHC) magnet series measurements is presented. This system was developed at CERN to automate as many tests on the LHC magnets as possible. The Test Master software is the middle layer of the main software architecture developed by the LHC/IAS group for central supervision of all types of LHC dipole tests in the SM18 hall. It serves as a manager and scheduler for applications, controlling all measurements that are performed in a cluster of two test benches. The software was implemented in the LabVIEW environment. The interactive user interface, the software architecture, the communication protocols, the configuration files, and the different types of command and status files of the Test Master are described.

  12. Parameter design and performance analysis of shift actuator for a two-speed automatic mechanical transmission for pure electric vehicles

    Directory of Open Access Journals (Sweden)

    Jianjun Hu

    2016-08-01

    Full Text Available Recent developments of pure electric vehicles have shown that pure electric vehicles equipped with a two-speed or multi-speed gearbox possess higher energy efficiency by ensuring the drive motor operates in its peak performance range. This article presents the design, analysis, and control of a two-speed automatic mechanical transmission for pure electric vehicles. The shift actuator is based on a motor-controlled camshaft in which a special geometric groove is machined; the camshaft sets the axial positions of the synchronizer sleeve for gear engaging and disengaging and the speed control of the drive motor. Based on a force analysis of the shift process, the parameters of the shift actuator and shift motor are designed. The drive motor’s torque control strategy before shifting, speed-governing control strategy before engaging, shift actuator’s control strategy during gear engaging, and drive motor’s torque recovery strategy after the shift process are proposed and implemented with a prototype. To validate the performance of the two-speed gearbox, a test bed was developed based on dSPACE that emulates various operation conditions. The experimental results indicate that the shift process with the proposed shift actuator and control strategy could be accomplished within 1 s under various operation conditions, with shift smoothness up to the passenger car standard.

  13. Interactive debug program for evaluation and modification of assembly-language software

    Science.gov (United States)

    Arpasi, D. J.

    1979-01-01

    An assembly-language debug program written for the Honeywell HDC-601 and DDP-516/316 computers is described. Names and relative addressing are used to improve operator-machine interaction. Features include versatile display, on-line assembly, and improved program execution and analysis. The program is discussed from both a programmer's and an operator's standpoint. Functional diagrams are included to describe the program, and each command is illustrated.

  14. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated, in the sense of not being ultimately periodic, and they may look rather complicated, in the sense that it may not be easy to name the rule by which the sequence is generated; however, there exists a rule which generates the sequence. The concept of automatic sequences has special applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
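As an illustration of a sequence produced by a finite automaton (a standard textbook example, not taken from this record), the Thue-Morse sequence is 2-automatic: a two-state automaton reads the base-2 digits of n and its final state is the n-th term.

```python
def thue_morse(n):
    """n-th term of the Thue-Morse sequence, computed by running a
    two-state automaton over the binary digits of n: the state flips
    on each digit 1, and the final state is the output."""
    state = 0
    for digit in bin(n)[2:]:
        if digit == "1":
            state ^= 1
    return state

print([thue_morse(n) for n in range(8)])  # [0, 1, 1, 0, 1, 0, 0, 1]
```

Equivalently, the term is the parity of the number of 1-bits in n, which is exactly what the automaton computes digit by digit.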

  15. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    International Nuclear Information System (INIS)

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High grade of accurateness for the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomical aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created in 1 vertebra, in 28 cases in 2 vertebrae and in 16 cases in 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomical aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 or more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time

  16. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High grade of accurateness for the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomical aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created in 1 vertebra, in 28 cases in 2 vertebrae and in 16 cases in 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomical aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 or more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time

  17. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, T L

    2007-10-01

    Performance-driven interface contract enforcement research aims to improve the quality of programs built from plug-and-play scientific components. Interface contracts make the obligations on the caller and all implementations of the specified methods explicit. Runtime contract enforcement is a well-known technique for enhancing testing and debugging. However, checking all of the associated constraints during deployment is generally considered too costly from a performance standpoint. Previous solutions enforced subsets of constraints without explicit consideration of their performance implications. Hence, this research measures the impacts of different interface contract sampling strategies and compares results with new techniques driven by execution time estimates. Results from three studies indicate automatically adjusting the level of checking based on performance constraints improves the likelihood of detecting contract violations under certain circumstances. Specifically, performance-driven enforcement is better suited to programs exercising constraints whose costs are at most moderately expensive relative to normal program execution.
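The idea of adjusting the level of checking based on performance constraints can be sketched as follows. This is a hypothetical illustration, not the tool described in the report: a contract is enforced on a given call only while the cumulative checking time stays within a fixed fraction of the observed execution time, so cheap contracts keep being checked and expensive ones are sampled less often.

```python
import time

def make_enforcer(overhead_limit=0.05):
    """Return an enforcement wrapper that checks a precondition contract
    only while cumulative checking time is below `overhead_limit` of the
    cumulative method execution time (all names are illustrative)."""
    spent = {"check": 0.0, "work": 0.0}

    def enforced(func, contract, *args):
        # Always check on the first call; afterwards, check only if the
        # enforcement overhead budget has not been exhausted.
        check = spent["work"] == 0.0 or spent["check"] <= overhead_limit * spent["work"]
        if check:
            t0 = time.perf_counter()
            assert contract(*args), "contract violation"
            spent["check"] += time.perf_counter() - t0
        t0 = time.perf_counter()
        result = func(*args)
        spent["work"] += time.perf_counter() - t0
        return result

    return enforced

enforce = make_enforcer()
print(enforce(lambda x: x * x, lambda x: x >= 0, 3))  # 9
```

The report's actual techniques use per-contract execution-time estimates rather than a single global budget; this sketch only shows the shape of the trade-off.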

  18. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, Tamara Lynn [Univ. of California, Davis, CA (United States)

    2008-01-01

    Performance-driven interface contract enforcement research aims to improve the quality of programs built from plug-and-play scientific components. Interface contracts make the obligations on the caller and all implementations of the specified methods explicit. Runtime contract enforcement is a well-known technique for enhancing testing and debugging. However, checking all of the associated constraints during deployment is generally considered too costly from a performance standpoint. Previous solutions enforced subsets of constraints without explicit consideration of their performance implications. Hence, this research measures the impacts of different interface contract sampling strategies and compares results with new techniques driven by execution time estimates. Results from three studies indicate automatically adjusting the level of checking based on performance constraints improves the likelihood of detecting contract violations under certain circumstances. Specifically, performance-driven enforcement is better suited to programs exercising constraints whose costs are at most moderately expensive relative to normal program execution.

  19. Description and performance of a fully automatic device for the study of the sedimentation of magnetic suspensions

    Science.gov (United States)

    Iglesias, G. R.; López-López, M. T.; Delgado, A. V.; Durán, J. D. G.

    2011-07-01

    In this paper we describe an experimental setup for the automatic determination of the sedimentation behavior of magnetic suspensions (i.e., disperse systems consisting of ferro- or ferri-magnetic particles in a suitable fluid) of arbitrary volume fraction of solids. The device is based on the evaluation of the inductance of a thin coil surrounding the test tube containing the sample. The inductance L is evaluated from the measurement of the resonant frequency of a parallel LC circuit constructed with the coil and a capacitor of known capacitance. The coil can be moved vertically along the tube at specified steps and time intervals, and from the knowledge of L as a function of the vertical position and time, one can get an image of the particle concentration profiles at given instants of time. The performance of the device is tested against suspensions of spherical iron particles in the micrometer size range dispersed in silicone oil, with various initial concentrations of solids. The sedimentation profiles are then compared with the predictions of existing models for the settling of disperse systems of non-interacting particles.
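The inductance evaluation the abstract describes follows directly from the resonance condition of a parallel LC circuit, f = 1/(2π√(LC)). A minimal sketch (the example values are illustrative, not from the paper):

```python
import math

def inductance_from_resonance(f_res, capacitance):
    """Coil inductance from the resonant frequency of a parallel LC
    circuit: f = 1 / (2*pi*sqrt(L*C))  =>  L = 1 / ((2*pi*f)**2 * C)."""
    return 1.0 / ((2 * math.pi * f_res) ** 2 * capacitance)

# Example: a 100 kHz resonance measured with a 10 nF reference capacitor.
L = inductance_from_resonance(100e3, 10e-9)
print(f"{L * 1e6:.1f} uH")  # 253.3 uH
```

As magnetic particles settle past the coil, the local permeability, and hence L and the resonant frequency, changes, which is what lets the device map concentration versus height.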

  20. Semi-automatic laboratory goniospectrometer system for performing multi-angular reflectance and polarization measurements for natural surfaces

    Science.gov (United States)

    Sun, Z. Q.; Wu, Z. F.; Zhao, Y. S.

    2014-01-01

    In this paper, the design and operation of the Northeast Normal University Laboratory Goniospectrometer System for performing multi-angular reflectance and polarization measurements under controlled illumination conditions is described. A semi-automatic arm, carried on a rotating circular ring, enables the acquisition of a large number of measurements of the surface Bidirectional Reflectance Factor (BRF) over the full hemisphere. In addition, a set of polarizing optics enables measurement of linear polarization over the spectrum from 350 nm to 2300 nm. Because of the stable measurement conditions in the laboratory, the BRF and linear polarization have average uncertainties of 1% and less than 5%, respectively, depending on the sample properties. The polarimetric accuracy of the instrument is below 0.01 in terms of the absolute value of the degree of linear polarization, which was established by measuring a Spectralon plane. This paper also presents the reflectance and polarization of snow, soil, sand, and ice measured during 2010-2013 in order to illustrate the system's stability and accuracy. These measurement results are useful for understanding the scattering properties of natural surfaces on Earth.

  1. Description and performance of a fully automatic device for the study of the sedimentation of magnetic suspensions.

    Science.gov (United States)

    Iglesias, G R; López-López, M T; Delgado, A V; Durán, J D G

    2011-07-01

    In this paper we describe an experimental setup for the automatic determination of the sedimentation behavior of magnetic suspensions (i.e., disperse systems consisting of ferro- or ferri-magnetic particles in a suitable fluid) of arbitrary volume fraction of solids. The device is based on the evaluation of the inductance of a thin coil surrounding the test tube containing the sample. The inductance L is evaluated from the measurement of the resonant frequency of a parallel LC circuit constructed with the coil and a capacitor of known capacitance. The coil can be moved vertically along the tube at specified steps and time intervals, and from the knowledge of L as a function of the vertical position and time, one can get an image of the particle concentration profiles at given instants of time. The performance of the device is tested against suspensions of spherical iron particles in the micrometer size range dispersed in silicone oil, with various initial concentrations of solids. The sedimentation profiles are then compared with the predictions of existing models for the settling of disperse systems of non-interacting particles. PMID:21806198

  2. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Directory of Open Access Journals (Sweden)

    Simone Hantke

    Full Text Available We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient.
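The leave-one-speaker-out protocol used here can be sketched as follows. This is a toy illustration: a nearest-centroid classifier stands in for the SVM so the example stays self-contained, and all data, features, and names are invented.

```python
def loso_splits(samples):
    """Leave-one-speaker-out splits: all utterances of one speaker form
    the test set; everything else is training data."""
    speakers = sorted({s for s, _, _ in samples})
    for held_out in speakers:
        train = [x for x in samples if x[0] != held_out]
        test = [x for x in samples if x[0] == held_out]
        yield held_out, train, test

def nearest_centroid_predict(train, features):
    """Toy stand-in for the SVM: predict the label whose training
    centroid is closest to the feature value."""
    groups = {}
    for _, feats, label in train:
        groups.setdefault(label, []).append(feats)
    centroids = {lab: sum(v) / len(v) for lab, v in groups.items()}
    return min(centroids, key=lambda lab: abs(centroids[lab] - features))

# (speaker, feature, label) triples; features are 1-D for brevity.
data = [("s1", 0.10, "eat"), ("s1", 0.90, "no"), ("s2", 0.20, "eat"),
        ("s2", 0.80, "no"), ("s3", 0.15, "eat"), ("s3", 0.85, "no")]
correct = total = 0
for _, train, test in loso_splits(data):
    for _, feats, label in test:
        correct += nearest_centroid_predict(train, feats) == label
        total += 1
print(f"accuracy: {correct / total:.2f}")  # accuracy: 1.00
```

The point of the speaker-wise split is that no utterance of the test speaker ever appears in training, so the reported recall reflects generalization to unseen speakers.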

  3. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient. PMID:27176486

  4. LVDT Position Feedback Site Debugging Special Tool%LVDT位置反馈现场调试专用工具

    Institute of Scientific and Technical Information of China (English)

    卢震宇

    2014-01-01

    This paper concerns the LVDT opening-degree feedback devices of the high- and intermediate-pressure main steam valves and control valves of a power plant steam turbine, and addresses the manpower, time, and environmental-interference problems encountered during their debugging. During unit operation, the DCS sends opening signals to the valves to automatically adjust flow, pressure, and other parameters; the only way to know the actual valve opening in the field is the current signal that the position feedback device transmits back to the control room CRT. If the error between the feedback signal and the commanded signal exceeds a certain value, the system loses stability and the entire unit may be forced into manual operation, causing unnecessary losses. Because the turbine head operates at high temperature, the probability of damage to the electronic devices rises and spare parts must be replaced more frequently, so checking the position feedback devices becomes increasingly important. Inaccurate position feedback cannot truly reflect the valve opening, and since the valve opening controls the oil and steam admission, it affects the normal load operation and equipment of the unit. The high- and intermediate-pressure main steam valves and control valves are the key equipment for adjusting turbine load, and the LVDT position feedback is the only means by which plant personnel can continuously monitor the valve opening, so it directly affects the accuracy of the unit's load condition. The existing debugging method is time-consuming and laborious, and poses hidden safety problems. A special tool matching the specifications of the on-site LVDT connectors inserts a current meter into the field circuit, so that debugging personnel can observe the current change alone in the field. This realizes on-site debugging and observation, reduces environmental interference, and improves maintenance efficiency.

  5. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV systems in Sweden has been in operation since March 2002. The systems in the database are described in detail and energy production is continuously added in the form of monthly values. The energy production and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy program, this number is expected to increase to over 100 systems in the next years. The new owners of PV systems are obliged to report the produced electricity to the authorities at least once a year. In this work we have studied different means to simplify the collection of data. Four different methods are defined: 1. The conversion of readings from energy meters made at arbitrary distances in time into monthly values. 2. Methods to handle data obtained with the monitoring systems provided by different inverter manufacturers. 3. Methods to acquire data from PV systems with energy meters reporting to the green certificate system. 4. Commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third methods use equipment that is expected to be installed in some PV systems for other reasons. Method 4 makes a fully automatic collection method possible. The described GPRS systems are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD).
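Method 1, converting cumulative meter readings taken at arbitrary dates into monthly values, can be sketched by linear pro-rating between consecutive readings. This is a hypothetical illustration of one reasonable algorithm; the report does not specify the exact conversion used.

```python
from datetime import date

def monthly_energy(readings):
    """Pro-rate cumulative meter readings, given as (date, kWh) pairs in
    chronological order, into per-month energy, assuming a constant
    production rate between consecutive readings."""
    out = {}
    for (d0, e0), (d1, e1) in zip(readings, readings[1:]):
        rate = (e1 - e0) / (d1 - d0).days  # kWh per day over this interval
        d = d0
        while d < d1:
            # First day of the month following d.
            month_end = date(d.year + d.month // 12, d.month % 12 + 1, 1)
            stop = min(d1, month_end)
            key = (d.year, d.month)
            out[key] = out.get(key, 0.0) + rate * (stop - d).days
            d = stop
    return out

readings = [(date(2006, 1, 16), 100.0), (date(2006, 3, 2), 145.0)]
print(monthly_energy(readings))  # {(2006, 1): 16.0, (2006, 2): 28.0, (2006, 3): 1.0}
```

The constant-rate assumption is the weak point of such a conversion: for PV production, sparse readings smear seasonal variation across months, which is presumably why the report also considers automatic collection methods.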

  6. Space-Based FPGA Radio Receiver Design, Debug, and Development of a Radiation-Tolerant Computing System

    Directory of Open Access Journals (Sweden)

    Zachary K. Baker

    2010-01-01

    Full Text Available Los Alamos has recently completed the latest in a series of Reconfigurable Software Radios, which incorporates several key innovations in both hardware design and algorithms. Due to our focus on satellite applications, each design must extract the best size, weight, and power performance possible from the ensemble of Commodity Off-the-Shelf (COTS parts available at the time of design. A large component of our work lies in determining if a given part will survive in space and how it will fail under various space radiation conditions. Using two Xilinx Virtex 4 FPGAs, we have achieved 1 TeraOps/second signal processing on a 1920 Megabit/second datastream. This processing capability enables very advanced algorithms such as our wideband RF compression scheme to operate at the source, allowing bandwidth-constrained applications to deliver previously unattainable performance. This paper will discuss the design of the payload, making electronics survivable in the radiation of space, and techniques for debug.

  7. Completely Debugging Indeterminate MPI/PVM Programs%不确定性MPI/PVM程序的完全调试

    Institute of Scientific and Technical Information of China (English)

    王锋; 安虹; 陈志辉; 陈国良

    2001-01-01

    This paper discusses how to completely debug indeterminate MPI/PVM parallel programs. Due to the indeterminacy, previously encountered bugs may be non-repeatable in successive executions during a cyclic debugging session. Based on the FIFO communication model of MPI/PVM, an implementation of the record-and-replay technique is presented. Moreover, users are provided with an easy way to completely debug their programs by covering all possible execution paths through controllable replay. Compared with other solutions, the proposed method incurs much less temporal and spatial overhead. The implementation has been completed on two kinds of message-passing architectures: one is the Dawning-2000 super server (developed by the National Research Center for Intelligent Computing Systems of China), with single-processor (PowerPC) nodes interconnected by a custom-built wormhole mesh network; the other is a cluster of workstations (PowerPC/AIX) built at the National High Performance Computing Center at Hefei.
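The record-and-replay idea can be sketched as follows. This is a simplified, single-process illustration, not the actual MPI/PVM implementation: in record mode the order in which wildcard receives are satisfied is logged; in replay mode the log overrides the runtime's choice, so the same (or a deliberately chosen alternative) execution path is reproduced.

```python
import queue

class RecordReplayChannel:
    """Per-sender FIFO channels, as in the MPI/PVM communication model.
    In record mode, the sender chosen for each wildcard receive is logged;
    in replay mode, deliveries are forced to follow a previous log."""
    def __init__(self, log=None):
        self.replay_log = list(log) if log is not None else None
        self.record_log = []
        self.queues = {}

    def send(self, sender, msg):
        self.queues.setdefault(sender, queue.Queue()).put(msg)

    def recv_any(self, runtime_pick):
        # Record mode: accept whichever sender the runtime happened to pick.
        # Replay mode: ignore the pick and follow the recorded order.
        sender = runtime_pick if self.replay_log is None else self.replay_log.pop(0)
        self.record_log.append(sender)
        return sender, self.queues[sender].get()

# Record a nondeterministic run in which p2's message arrived first.
ch = RecordReplayChannel()
ch.send("p1", "a"); ch.send("p2", "b")
ch.recv_any("p2"); ch.recv_any("p1")
# Replay: even though the runtime now picks p1 first, the log forces p2.
replay = RecordReplayChannel(log=ch.record_log)
replay.send("p1", "a"); replay.send("p2", "b")
print(replay.recv_any("p1"))  # ('p2', 'b')
```

Because only the sender order (not message contents) is logged, the recording overhead stays small, which matches the paper's emphasis on low temporal and spatial overhead.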

  8. The evaluation of the performance of the automatic exposure control system of some selected mammography facilities in the Greater Accra Region, Ghana

    International Nuclear Information System (INIS)

    Mammography aids in the early detection of breast cancer. Although very useful, X-rays carry an associated risk of inducing cancer, and as such mammography procedures should be optimized through appropriate processes such as the selection of exposure factors for an optimum image and minimal dose to patients. The automatic exposure control (AEC) aids in the selection of exposure factors, thus controlling the amount of radiation to the breast, and automatically compensates for differences in breast thickness and density. The performance of the automatic exposure control system of mammography equipment and the status of quality management systems, including quality assurance and quality control, of four (4) mammography facilities within the Greater Accra Region were assessed. In assessing the performance of the automatic exposure control system, the short-term reproducibility test and the thickness and voltage compensation tests were carried out using breast-equivalent phantoms of various thicknesses. A half-value layer test, film reject analysis and patient dose assessment were also performed. Analysis of the responses to the questionnaire administered to radiographers and supervisors of the selected facilities revealed that three (3) of the facilities have some aspects of a quality management system programme in place, but not effectively implemented. Measured optical densities from the various tests performed to evaluate the performance of the automatic exposure control systems revealed that the AEC compensates for the different phantom thicknesses and tube voltages (kV) by producing comparable optical densities for the various phantom thicknesses and tube voltages. Some of the measured optical densities were within the recommended optical density range of 1.5 OD - 1.9 OD. The highest optical density value was 0.13 OD above the upper limit of 1.9 OD. The film reject analysis showed that patient motion accounted for the largest part (28%) of film rejects. Other factors such as too light

  9. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

Full Text Available Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising, because executing reverse code can restore the previous states of a program without state saving. Two methods that generate reverse code can be found in the literature: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate the non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
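The dynamic reverse-code idea can be illustrated with a minimal sketch (all names here are hypothetical, not the authors' implementation): each destructive update records, on the fly, an inverse operation that restores the previous value, so stepping backwards replays recorded inverses instead of restoring saved snapshots.

```python
# Minimal illustration of dynamic reverse-code generation: every
# destructive update appends an inverse operation to an undo log,
# so backtracking executes reverse code instead of restoring state
# snapshots saved up front.
class ReversibleStore:
    def __init__(self):
        self.vars = {}
        self.undo_log = []  # stack of inverse operations (reverse code)

    def assign(self, name, value):
        old = self.vars.get(name)
        # Generate the reverse code for this assignment on the fly.
        if old is None:
            self.undo_log.append(lambda: self.vars.pop(name))
        else:
            self.undo_log.append(lambda: self.vars.__setitem__(name, old))
        self.vars[name] = value

    def step_back(self):
        # Run the most recently generated inverse operation.
        self.undo_log.pop()()

store = ReversibleStore()
store.assign("x", 1)
store.assign("x", 2)
store.step_back()        # undo "x = 2"
print(store.vars["x"])   # -> 1
```

Only the old values actually overwritten are retained, which is why this approach can use far less memory than periodic checkpointing of the whole program state.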

  10. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

Flooding is one of the most frequent and expensive natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events, and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, moreover offering a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide an accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in the presence of flood, requires modelling the behavior of different objects in the scene in order to associate them with flood or no-flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps

  11. 汽车液力自动变速器的性能%The performance on automotive automatic transmission

    Institute of Scientific and Technical Information of China (English)

    孙小男; 赵薇

    2013-01-01

The hydraulic automatic transmission (Automatic Transmission, AT) holds the most important position in the automatic transmission market, and the trend of AT development toward higher gear counts is increasingly evident. Automatic transmissions with more gears can better improve all aspects of vehicle performance, offering a finer distribution of speed ratios between gears and a wider overall ratio range.

  12. Study on triterpenoic acids distribution in Ganoderma mushrooms by automatic multiple development high performance thin layer chromatographic fingerprint analysis.

    Science.gov (United States)

    Yan, Yu-Zhen; Xie, Pei-Shan; Lam, Wai-Kei; Chui, Eddie; Yu, Qiong-Xi

    2010-01-01

    Ganoderma--"Lingzhi" in Chinese--is one of the superior Chinese tonic materia medicas in China, Japan, and Korea. Two species, Ganoderma lucidum (Red Lingzhi) and G. sinense (Purple Lingzhi), have been included in the Chinese Pharmacopoeia since its 2000 Edition. However, some other species of Ganoderma are also available in the market. For example, there are five species divided by color called "Penta-colors Lingzhi" that have been advocated as being the most invigorating among the Lingzhi species; but there is no scientific evidence for such a claim. Morphological identification can serve as an effective practice for differentiating the various species, but the inherent quality has to be delineated by chemical analysis. Among the diverse constituents in Lingzhi, triterpenoids are commonly recognized as the major active ingredients. An automatic triple development HPTLC fingerprint analysis was carried out for detecting the distribution consistency of the triterpenoic acids in various Lingzhi samples. The chromatographic conditions were optimized as follows: stationary phase, precoated HPTLC silica gel 60 plate; mobile phase, toluene-ethyl acetate-methanol-formic acid (15 + 15 + 1 + 0.1); and triple-development using automatic multiple development equipment. The chromatograms showed good resolution, and the color images provided more specific HPTLC fingerprints than have been previously published. It was observed that the abundance of triterpenoic acids and consistent fingerprint pattern in Red Lingzhi (fruiting body of G. lucidum) outweighs the other species of Lingzhi. PMID:21140647

  13. Design of absorbency photometer used in a fully automatic ELISA analyzer

    Science.gov (United States)

    Dong, Ningning; Zhu, Lianqing; Dong, Mingli; Niu, Shouwei

    2008-03-01

Absorbency measurement is the most important step in ELISA analysis. Based on spectrophotometry, an absorbency photometer system used in a fully automatic ELISA analyzer was developed. The system is a core module of the fully automatic ELISA analyzer. The principle and function of the system are analyzed. Three main units of the system, the photoelectric transform unit, the data processing unit, and the communication and control unit, are designed and debugged. Finally, the system was tested using the verification plate. The experimental results agree well with the requirements.
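The spectrophotometric relation underlying such a photometer is the Beer-Lambert law: absorbance is the base-10 logarithm of the ratio of incident to transmitted light intensity. A minimal sketch of that conversion (illustrative only, not the analyzer's firmware):

```python
import math

def absorbance(incident_intensity, transmitted_intensity):
    """Beer-Lambert absorbance: A = log10(I0 / I)."""
    return math.log10(incident_intensity / transmitted_intensity)

# A blank well transmits all incident light (A = 0); a well that
# transmits 10% of the incident light has an absorbance of 1.0.
print(absorbance(1000.0, 1000.0))  # -> 0.0
print(absorbance(1000.0, 100.0))   # -> 1.0
```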

  14. Energy Bucket: A Tool for Power Profiling and Debugging of Sensor Nodes

    DEFF Research Database (Denmark)

    Andersen, Jacob; Hansen, Morten Tranberg

    2009-01-01

The ability to precisely measure and compare energy consumption and relate this to particular parts of programs is a recurring theme in sensor network research. This paper presents the Energy Bucket, a low-cost tool designed for quick empirical measurements of energy consumptions across 5 decades... of current draw. The Energy Bucket provides a light-weight state API for the target system, which facilitates easy scorekeeping of energy consumption between different parts of a target program. We demonstrate how this tool can be used to discover programming errors and debug sensor network applications... Furthermore, we show how this tool, together with the target system API, offers a very detailed analysis of where energy is spent in an application, which proves to be very useful when comparing alternative implementations or validating theoretical energy consumption models...

  15. Application of remote debugging techniques in user-centric job monitoring

    International Nuclear Information System (INIS)

    With the Job Execution Monitor, a user-centric job monitoring software developed at the University of Wuppertal and integrated into the job brokerage systems of the WLCG, job progress and grid worker node health can be supervised in real time. Imminent error conditions can thus be detected early by the submitter and countermeasures can be taken. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job misbehaviour. To remove the last 'blind spot' from this monitoring, a remote debugging technique based on the GNU C compiler suite was developed and integrated into the software; its design concept and architecture is described in this paper and its application discussed.

  16. Assessing the Performance of Automatic Speech Recognition Systems When Used by Native and Non-Native Speakers of Three Major Languages in Dictation Workflows

    DEFF Research Database (Denmark)

    Zapata, Julián; Kirkedal, Andreas Søeborg

    2015-01-01

    In this paper, we report on a two-part experiment aiming to assess and compare the performance of two types of automatic speech recognition (ASR) systems on two different computational platforms when used to augment dictation workflows. The experiment was performed with a sample of speakers...... of three major languages and with different linguistic profiles: non-native English speakers; non-native French speakers; and native Spanish speakers. The main objective of this experiment is to examine ASR performance in translation dictation (TD) and medical dictation (MD) workflows without manual...... transcription vs. with transcription. We discuss the advantages and drawbacks of a particular ASR approach in different computational platforms when used by various speakers of a given language, who may have different accents and levels of proficiency in that language, and who may have different levels...

  17. Automatic 2D scintillation camera and computed tomography whole-body image registration to perform dosimetry calculation

    Energy Technology Data Exchange (ETDEWEB)

    Cismondi, Federico; Mosconi, Sergio L [Fundacion Escuela de Medicina Nuclear, Mendoza (Argentina)

    2007-11-15

In this paper we present a software tool that has been developed to allow automatic registration of 2D Scintillation Camera (SC) and Computed Tomography (CT) images. This tool, used with dosimetric software taking Integrated Activity or Residence Time as input data, allows the user to assist physicians in assessing the effects of radiodiagnostic or radiotherapeutic practices. Images are registered locally and globally, maximizing the Mutual Information coefficient between the regions being registered. In the regional case, whole-body images are segmented into five regions: head, thorax, pelvis, left and right legs. Each region has its own registration parameters, which are optimized through the Powell-Brent minimization method, which maximizes the Mutual Information coefficient. This software tool allows the user to draw ROIs, input isotope characteristics and finally calculate the Integrated Activity or Residence Time in one or many specific organs. These last values can be introduced into many dosimetric software packages to finally obtain Absorbed Dose values.

  18. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS), and presents a recent state of the art. The book shows the main problems of ADS, its difficulties and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed
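As a flavor of the statistical approaches such textbooks cover, a minimal frequency-based extractive summarizer (an illustrative sketch, not an algorithm taken from the book) scores each sentence by the frequencies of its words and keeps the top-scoring ones in their original order:

```python
import re
from collections import Counter

def summarize(text, n_sentences=1):
    """Naive frequency-based extractive summarization sketch."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    # Score a sentence by the total corpus frequency of its words.
    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    chosen = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
    # Emit the selected sentences in their original order.
    return " ".join(s for s in sentences if s in chosen)

text = ("Summarization selects the most informative sentences. "
        "Frequent words often mark informative sentences. "
        "The weather was pleasant.")
print(summarize(text, 1))  # -> Summarization selects the most informative sentences.
```

Real extractive systems refine this with stop-word removal, TF-IDF weighting, or graph-based ranking, but the score-and-select skeleton is the same.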

  19. Automatic Reading

    Institute of Scientific and Technical Information of China (English)

    胡迪

    2007-01-01

Reading is the key to school success and, like any skill, it takes practice. A child learns to walk by practising until he no longer has to think about how to put one foot in front of the other. The great athlete practises until he can play quickly, accurately and without thinking. Educators call it automaticity.

  20. Performance evaluation of an automatic positioning system for photovoltaic panels; Avaliacao de desempenho de um sistema de posicionamento automatico para paineis fotovoltaicos

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Alceu Ferreira; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Fac. de Engenharia], Emails: alceu@feb.unesp.br, jacagnon@feb.unesp.br

    2009-07-01

The need to use electric energy in localities not served by the utilities motivated the development of this research, whose main focus was photovoltaic systems and the search for better performance of these systems by positioning the solar panels toward the sun. This work presents the performance evaluation of an automatic positioning system for photovoltaic panels, taking into account the increase in electric energy generation and the implantation costs. A simplified electromechanical device was designed, able to support and move a photovoltaic panel along the day and along the year, keeping its surface aimed at the sun's rays, without using sensors and with optimized movements, since the adjustment of the panel's inclination takes place only once a day. The obtained results indicated that the proposal is viable, showing a cost compatible with the increase in electricity generation. (author)

  1. Linux kernel debug technology research%Linux内核调试技术的方法研究

    Institute of Scientific and Technical Information of China (English)

    洪永学; 余红英; 姜世杰; 林丽蓉

    2012-01-01

When developing Linux applications and kernel drivers, the Linux kernel often needs to be tailored or modified. Because of the particularities of an operating system kernel and the differences between versions, porting a driver or writing an application produces all sorts of errors and warnings, such as segmentation faults, syntax errors and unused variables, and the methods used to debug ordinary user programs cannot be applied to the kernel. For this reason, this paper first introduces the two most commonly used Linux kernel debugging techniques: printk-based logging and stack tracing from Oops messages. Finally, an LCD driver example explains in detail how to use Oops stack-trace information to debug a Linux kernel driver, demonstrating the importance of stack-trace analysis.

  2. Incorporating S-shaped testing-effort functions into NHPP software reliability model with imperfect debugging

    Institute of Scientific and Technical Information of China (English)

    Qiuying Li; Haifeng Li; Minyan Lu

    2015-01-01

Testing-effort (TE) and imperfect debugging (ID) in the reliability modeling process may further improve the fitting and prediction results of software reliability growth models (SRGMs). For describing the S-shaped varying trend of the TE increasing rate more accurately, first, two S-shaped testing-effort functions (TEFs), i.e., the delayed S-shaped TEF (DS-TEF) and the inflected S-shaped TEF (IS-TEF), are proposed. These two TEFs are then incorporated into various types (exponential-type, delayed S-shaped and inflected S-shaped) of non-homogeneous Poisson process (NHPP) SRGMs with two forms of ID, respectively, to obtain a series of new NHPP SRGMs that consider S-shaped TEFs as well as ID. Finally, these new SRGMs and several comparison NHPP SRGMs are applied to four real failure data sets to investigate the fitting and prediction power of the new SRGMs. The experimental results show that: (i) the proposed IS-TEF is more suitable and flexible for describing the consumption of TE than the previous TEFs; (ii) incorporating TEFs into the inflected S-shaped NHPP SRGM may be more effective and appropriate than doing so for the exponential-type and the delayed S-shaped NHPP SRGMs; (iii) the inflected S-shaped NHPP SRGM considering both the IS-TEF and ID yields more accurate fitting and prediction results than the other comparison NHPP SRGMs.
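The two TEF shapes referred to are commonly written as follows (standard forms from the software-reliability literature; the paper's exact parameterization may differ):

```latex
% Delayed S-shaped testing-effort function (cumulative effort by time t):
W_d(t) = \alpha \left[ 1 - (1 + \beta t)\, e^{-\beta t} \right]

% Inflected S-shaped testing-effort function:
W_i(t) = \frac{\alpha \left( 1 - e^{-\beta t} \right)}{1 + \lambda\, e^{-\beta t}}
```

Here \(\alpha\) is the total testing effort eventually consumed, \(\beta\) the effort consumption rate, and \(\lambda\) the inflection parameter controlling where the S-curve bends.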

  3. Diagnostic performance of a commercially available computer-aided diagnosis system for automatic detection of pulmonary nodules: comparison with single and double reading

    International Nuclear Information System (INIS)

    Objective: To assess the diagnostic performance of a commercially available computer-aided diagnosis (CAD) system for automatic detection of pulmonary nodules with multi-row detector CT scans compared to single and double reading by radiologists. Materials and Methods: A CAD system for automatic nodule detection (Siemens LungCare NEV VB10) was applied to four-detector row low-dose CT (LDCT) performed on nine patients with pulmonary metastases and compared to the findings of three radiologists. A standard-dose CT (SDCT) was acquired simultaneously and used for establishing the reference data base. The study design was approved by the Institutional Review Board and the appropriate German authorities. The reference data base consisted of 457 nodules (mean size 3.9±3.1 mm) and was established by fusion of the sets of nodules detected by three radiologists independently reading LDCT and SDCT and by CAD. An independent radiologist used thin slices to eliminate false positive findings from the reference base. Results: An average sensitivity of 54% (range 51% to 55%) was observed for single reading by one radiologist. CAD demonstrated a similar sensitivity of 55%. Double reading by two radiologists increased the sensitivity to an average of 67% (range 67% to 68%). The difference to single reading was significant (p<0.001). CAD as second opinion after single reading increased the sensitivity to 79% (range 77% to 81%), which proved to be significantly better than double reading (p<0.001). CAD produced more false positive results (7.2%) than human readers but it was acceptable in clinical routine. (orig.)

  4. 单馈圆极化微带天线的工程调试方法研究%Engineering debug method of single-feeding circularly polarized micro-strip antenna

    Institute of Scientific and Technical Information of China (English)

    于家傲; 陈文君; 袁靖; 鞠志忠

    2014-01-01

To solve the problems of inconsistent resonant frequency points and degraded axial ratio between the engineering implementation of a single-feed circularly polarized micro-strip antenna and its design scheme, this paper, based on the theory of single-feed circularly polarized antennas, puts forward two kinds of engineering debugging schemes for this micro-strip antenna and simulates them using HFSS software. Simulation results illustrate that debugging at various positions of the antenna can respectively optimize the resonant frequency, reflection coefficient and axial ratio, as well as other antenna performances, which provides guidance for the engineering debugging of this kind of antenna.

  5. Facial expressions as feedback cue in human-robot interaction - a comparison between human and automatic recognition performances

    OpenAIRE

    Lang, Christian; Wachsmuth, Sven; Wersing, Heiko; Hanheide, Marc

    2010-01-01

Facial expressions are one important nonverbal communication cue, as they can provide feedback in conversations between people and also in human-robot interaction. This paper presents an evaluation of three standard pattern recognition techniques (active appearance models, Gabor energy filters, and raw images) for facial feedback interpretation in terms of valence (success and failure) and compares the results to human performance. The database used contains videos of people interacting w...

  6. 安全阀的调试与维修%Debugging and Maintenance of Safety Valve

    Institute of Scientific and Technical Information of China (English)

    吴科学

    2016-01-01

This paper briefly describes applications of the SY-type safety valve in the petrochemical industry, as well as the role the safety valve plays in production units. The on-site setting of the safety valve's set pressure is a dangerous but very necessary task, because the specific data obtained from this work reflect how the valve is used in the unit and directly determine whether it can provide the corresponding safety protection. The paper gives a detailed method for setting and debugging the pressure set point at the production site and, drawing on operation, maintenance and overhaul experience from field use, summarizes various common faults and the methods for handling them, and notes the aspects that require attention during maintenance.

  7. Performance evaluation and operational experience with a semi-automatic monitor for the radiological characterization of low-level wastes

    International Nuclear Information System (INIS)

    Chalk River Nuclear Laboratories (CRNL) have undertaken a Waste Disposal Project to co-ordinate the transition from the current practice of interim storage to permanent disposal for low-level radioactive wastes (LLW). The strategy of the project is to classify and segregate waste segments according to their hazardous radioactive lifetimes and to emplace them in disposal facilities engineered to isolate and contain them. To support this strategy, a waste characterization program was set up to estimate the volume and radioisotope inventories of the wastes managed by CRNL. A key element of the program is the demonstration of a non-invasive measurement technique for the isotope-specific characterization of solid LLW. This paper describes the approach taken at CRNL for the non-invasive assay of LLW and the field performance and early operational experience with a waste characterization monitor to be used in a waste processing facility

  8. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    Science.gov (United States)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

The main goal of the presented work was to develop a multifunctional beam composed of fiber-reinforced plastics (FRP) and an embedded optical fiber with various fiber Bragg grating (FBG) sensors. These beams are developed for use as structural members for bridges or industrial applications. It is now possible to realize large-scale cross sections, the embedding is part of a fully automated process, and jumpers can be omitted so as not to negatively influence the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analysis were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). In addition to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.

  9. Data Provenance Inference in Logic Programming: Reducing Effort of Instance-driven Debugging

    NARCIS (Netherlands)

    Huq, Mohammad Rezwanul; Mileo, Alessandra; Wombacher, Andreas

    2013-01-01

Data provenance allows scientists in different domains to validate their models and algorithms and to find anomalies and unexpected behaviors. In previous works, we described on-the-fly interpretation of (Python) scripts to build a workflow provenance graph automatically and then infer fine-grained pro

  10. 射频宽带产品的指压调试法%RF broadband products Shiatsu debugging method

    Institute of Scientific and Technical Information of China (English)

    胡志山

    2014-01-01

The transmission loss of RF broadband products is a key parameter in production debugging and sample trial production, and the return loss of each port is the greatest debugging difficulty. Usually we need to add grounding compensation capacitors to the circuit to optimize the parameters; however, the position and value of a compensation capacitor can only be determined by a stepwise-approximation trial method and are hard to pin down quickly. With today's labor shortages and intense price competition, this has become a stumbling block for many manufacturers. Based on years of working practice, the author has summarized a finger-pressure ("Shiatsu") debugging method that can quickly and easily determine the position and value of the compensation capacitors, and is well suited for adoption on production and engineering lines.

  11. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  12. A Software Agent for Automatic Creation of a PLC Program

    Directory of Open Access Journals (Sweden)

    Walid M. Aly

    2008-01-01

Full Text Available Using structured design techniques to design a Programmable Logic Controller (PLC) program decreases the time needed for debugging and produces concise, bug-free code. This study is concerned with the design of a software agent for automatic creation of code for a PLC program that can be downloaded to a Siemens Step 7 series controller. The code is generated according to the syntax rules of the AWL language; AWL is the abbreviation of the German word Anweisungsliste, which means Instruction List. The proposed system uses an object-oriented approach to transfer the design specification into an object that adequately describes the system using the state-based design technique. The industrial system specifications are supplied by the user through a simple Graphical User Interface (GUI) environment. These specifications define the attribute values of an object-oriented class describing the control system; all the functions needed to generate the code are encapsulated in the class.
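The idea of encapsulating a specification in a class and emitting an instruction list from it can be sketched as follows (a simplified illustration with hypothetical class and rule names, not the agent described in the study; the mnemonics U/S/R follow standard Siemens AWL bit-logic conventions):

```python
# Simplified sketch: emit an AWL-style instruction list from a
# state-based specification. U = AND an input bit, S = set an output,
# R = reset an output (standard Siemens AWL mnemonics).
class ControlSystem:
    """Holds a state-based specification and generates AWL code."""

    def __init__(self):
        self.rules = []  # (input_address, action, output_address)

    def add_rule(self, input_addr, action, output_addr):
        assert action in ("S", "R")  # set or reset
        self.rules.append((input_addr, action, output_addr))

    def generate_awl(self):
        lines = []
        for input_addr, action, output_addr in self.rules:
            lines.append(f"U  {input_addr}")          # check the input bit
            lines.append(f"{action}  {output_addr}")  # act on the output bit
        return "\n".join(lines)

plc = ControlSystem()
plc.add_rule("E0.0", "S", "A0.0")  # start button latches the motor on
plc.add_rule("E0.1", "R", "A0.0")  # stop button releases it
print(plc.generate_awl())
```

The generated text forms a start/stop latch; a real agent would additionally validate addresses and cover timers, counters, and more complex state transitions.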

  13. AX-4280全自动尿液分析仪性能评价%To evaluate the performance of AX-4280 automatic urine dry chemistry analyzer

    Institute of Scientific and Technical Information of China (English)

    王梅; 吴燕; 夏云

    2012-01-01

Objective: To evaluate the performance of the AX-4280 fully automated urine dry chemistry analyzer. Methods: The AX-4280 automatic urine dry chemistry analyzer was used to measure the precision, accuracy and carryover rates of all parameters. Results: The SD values of the high- and low-level control precisions (0.001-0.943 and 0-0.832), the carryover rates (0.8%-10.0%) and the accuracy were all within the requirements of the instrument. Conclusion: All performance indicators of the AX-4280 automatic urine dry chemistry analyzer meet the instrument's requirements, and it can be used for clinical urine dry chemistry analysis.

  14. The yin and yang properties of pentatonic music in TCM music therapy: based on mode and tempo%中医音乐治疗中五声性音乐阴阳属性--从调式和速度的角度

    Institute of Scientific and Technical Information of China (English)

    左志坚

    2016-01-01

According to Chinese traditional music theory, the creation, performance and musical language of Chinese music all reflect yin-yang thinking. Chinese traditional music is pentatonic, and the pentatonic modes it uses have yin and yang properties. Overall, the yin-yang properties of the modes can be divided into three categories: clear, almost clear and unclear. Tempo has only a subtle influence on modes whose yin-yang property is clear, but plays a decisive role for modes whose yin-yang property is almost clear or unclear.

  15. An electronically controlled automatic security access gate

    Directory of Open Access Journals (Sweden)

    Jonathan A. ENOKELA

    2014-11-01

Full Text Available The security challenges encountered in many places require electronic means of controlling access to communities, recreational centres, offices, and homes. The electronically controlled automated security access gate proposed in this work helps to prevent unwanted access to controlled environments. This is achieved mainly through the use of a Radio Frequency (RF) transmitter-receiver pair. In the design, a microcontroller is programmed to decode a given sequence of keys entered on a keypad and commands a transmitter module to send out this code as a signal at a given radio frequency. Upon reception of this RF signal by the receiver module, another microcontroller activates a driver circuit to operate the gate automatically. The code for the microcontrollers was written in the C language and was debugged and compiled using the Keil µVision 4 integrated development environment. The resulting hex files were programmed into the memories of the microcontrollers with the aid of a universal programmer. Software simulation was carried out using Proteus Virtual System Modelling (VSM) version 7.7. A scaled-down prototype of the system was built and tested. The electronically controlled automated security access gate can be useful in providing security for homes, organizations, and automobile terminals. The four-character password required to operate the gate gives the system an increased level of security. Due to its standalone nature of operation, the system is cheaper to maintain in comparison with a manually operated type.
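The keypad-decode-then-transmit flow can be modeled in a few lines (an illustrative sketch only; the actual firmware is written in C for the microcontrollers, and all names and the sample password here are hypothetical):

```python
# Illustrative model of the gate logic: a four-character password is
# collected from the keypad; only a correct entry is handed to the RF
# transmitter, where the receiver side then drives the gate open.
STORED_PASSWORD = "4729"  # hypothetical four-character code

def read_keypad(keys):
    """Collect exactly four key presses into a candidate code."""
    return "".join(keys[:4])

def transmit_if_valid(entered, send):
    """Hand the code to the RF transmitter only when it matches."""
    if entered == STORED_PASSWORD:
        send(entered)  # modulate and radiate the code
        return True
    return False       # wrong code: nothing is transmitted

sent = []
code = read_keypad(["4", "7", "2", "9"])
opened = transmit_if_valid(code, sent.append)
print(opened, sent)  # -> True ['4729']
```

Validating before transmitting keeps wrong codes off the air entirely, which is one plausible way to realize the four-character protection the record describes.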

  16. GNU/Hurd上远程调试的实现%The Implementation of Remote Debug on GNU/Hurd

    Institute of Scientific and Technical Information of China (English)

    陆岳

    2013-01-01

GNU Hurd consists of a set of protocols and daemons that run on the GNU Mach microkernel; together they are intended to form the kernel of the GNU operating system. GDB is a widely used debugger that has been adapted to many platforms and operating systems, but remote debugging had not yet been implemented on GNU/Hurd. This paper analyzes in detail the Mach exception-handling model and the principles behind the debugger's implementation, and introduces the implementation of the remote debugging tool gdbserver on GNU/Hurd.

  17. Debugging and realization to the activeX scripting%对于ActiveX Scripting的调试及实现

    Institute of Scientific and Technical Information of China (English)

    侯迎春

    2003-01-01

    ActiveX Scripting is one component of Microsoft's ActiveX. Microsoft's ActiveX scripting architecture comprises five parts: the Host Application, the Language Engine (script language engine), the Process Debug Manager, the Machine Debug Manager, and the Application Debugger. To build a debugger, one first builds the Host Application and Application Debugger frameworks, and then executes scripts in the Language Engine.

  18. MONITORING ON DEBUGGING OF GEOTHERMAL HEAT PUMP AIR CONDITIONING SYSTEM AND DOMESTIC HOT-WATER SYSTEM%地源热泵空调及生活热水系统调试监控

    Institute of Scientific and Technical Information of China (English)

    丁育南; 丁楠育

    2012-01-01

    Based on the monitoring requirements for the debugging of the geothermal heat pump air-conditioning system and domestic hot-water system of a project, the debugging monitoring method is introduced in terms of the operation requirements of the domestic hot-water system, the key points of monitoring system debugging, and the analysis of debugging results. To ensure that the energy-conservation and emission-reduction targets required by the design are met, the system should first be debugged by zone and by subsystem, with joint debugging proceeding only after these pass; shortcomings that appear during debugging should be summarized and corrected in real time. The work provides a reference for the debugging and monitoring of similar projects.

  19. Handling Conflicts in Depth-First Search for LTL Tableau to Debug Compliance Based Languages

    Directory of Open Access Journals (Sweden)

    Francois Hantry

    2011-09-01

    Full Text Available Providing adequate tools to tackle the problem of inconsistent compliance rules is a critical research topic. The problem is of paramount importance for achieving automatic support for early declarative design and for supporting the evolution of rules in contract-based or service-based systems. In this paper we investigate the problem of extracting temporal unsatisfiable cores in order to detect the inconsistent part of a specification. We extend a conflict-driven SAT solver to provide a new conflict-driven depth-first-search solver for temporal logic, and we use this solver to compute LTL unsatisfiable cores without re-exploring the history of the solver.

  20. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two datasets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS......). In the second dataset an accidental release of radioactivity into the environment was simulated in the south-western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable...

  1. X线机高压发生器的调试%The Debugging of X-ray High Voltage Generator

    Institute of Scientific and Technical Information of China (English)

    戴丹; 王魏; 戴竞; 郭永平; 徐月萍; 高建全; 张春潮; 叶践

    2012-01-01

    With the wide use of X-ray examination and people's growing attention to their health, more and more people undergo X-ray examinations. Based on the structure and principle of the X-ray machine, this paper briefly discusses the debugging of the X-ray high-voltage generator.

  2. ABBOTT ARCHITECT C16000全自动生化分析仪性能评价%Evaluation of the performance of the ABBOTT ARCHITECT C16000 automatic biochemistry analyzer

    Institute of Scientific and Technical Information of China (English)

    张娟; 蒋小燕; 李顺君; 黄文芳

    2014-01-01

    Objective: To evaluate the main performance characteristics of the ABBOTT ARCHITECT C16000 biochemistry analyzer and to judge whether its performance meets the laboratory's requirements. Methods: In accordance with clinical laboratory management regulations and national laboratory accreditation requirements, the precision, accuracy, and linearity of 17 test items (Urea, Cre, UA, Glu, etc.) were analyzed following the CLSI EP5-A2, EP9-A2, and EP6-A documents, and the quoted reference ranges of the 17 test items were verified. Results: The coefficient of variation (CV) of within-batch precision for Urea, Cre, UA, Glu, etc. was ≤1/4 of the CLIA'88 standard, and the CV of between-batch precision was ≤1/3 of the CLIA'88 standard; in the accuracy test, the relative bias of the 17 test items was ≤1/2 of the CLIA'88 standard; the linearity of the 17 items was good (r2 > 0.95); the quoted reference ranges of the test items were suitable. Conclusion: The performance of the ABBOTT ARCHITECT C16000 automatic biochemistry analyzer meets the laboratory's requirements.
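The precision criterion above can be sketched in a few lines: compute the coefficient of variation of a replicate series and compare it with a fraction of the total allowable error. This is only an illustration of the arithmetic, not the full CLSI EP5-A2 protocol; the replicate values and the CLIA'88 limit used here are hypothetical.

```python
# Illustrative within-batch precision check (hypothetical data and limits).
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation of a replicate series, in percent."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical replicate urea results (mmol/L) from one analytical run.
urea_run = [5.02, 5.05, 4.98, 5.01, 5.04, 4.99, 5.03, 5.00]

clia_allowable_cv = 9.0                    # hypothetical CLIA'88 limit, percent
within_batch_limit = clia_allowable_cv / 4  # the 1/4-CLIA criterion from the study

cv = cv_percent(urea_run)
print(f"within-batch CV = {cv:.2f}%  (limit {within_batch_limit:.2f}%)")
print("PASS" if cv <= within_batch_limit else "FAIL")
```

The between-batch and bias checks follow the same pattern with 1/3 and 1/2 of the allowable error as the respective limits.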

  3. Automatic chemical monitoring in the composition of functions performed by the unit level control system in the new projects of nuclear power plant units

    Science.gov (United States)

    Denisova, L. G.; Khrennikov, N. N.

    2014-08-01

    The article presents information on the state of the regulatory framework and on the development of a subsystem for automated chemical monitoring of the water chemistries of the primary and secondary coolant circuits, used as part of the unit-level automatic process control system in new projects of VVER-based nuclear power plant units. To implement the strategy of developing and deploying the water-chemistry-related part of the automated process control system within the standard AES-2006 nuclear power plant project, regulatory documents must be developed that set out the requirements imposed on automatic water chemistry monitoring systems in accordance with federal codes and regulations in the field of atomic energy use.

  4. Design of Pneumatic Device of the Automatic Transmission Performance Test-bed%气动自动变速器试验台架的设计

    Institute of Scientific and Technical Information of China (English)

    王利利

    2012-01-01

    Based on the working principle of the automatic transmission and on experience from specialized teaching practice, the shortcomings of most current transmission cutaway test-beds are analyzed, and a cutaway automatic-transmission test-bed using compressed air as the power transmission medium is designed. The pneumatic cutaway test-bed can give dynamic demonstrations, overcoming a difficult point in teaching, raising students' interest in learning, and achieving better results in experimental teaching.

  5. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Automatic fiscal stabilizers are policies or institutions, built into an economic system, that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. They are also called automatic stabilizers or built-in stabilizers.

  6. Automatic input rectification

    OpenAIRE

    Long, Fan; Ganesh, Vijay; Carbin, Michael James; Sidiroglou, Stelios; Rinard, Martin

    2012-01-01

    We present a novel technique, automatic input rectification, and a prototype implementation, SOAP. SOAP learns a set of constraints characterizing typical inputs that an application is highly likely to process correctly. When given an atypical input that does not satisfy these constraints, SOAP automatically rectifies the input (i.e., changes the input so that it satisfies the learned constraints). The goal is to automatically convert potentially dangerous inputs into typical inputs that the ...
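The rectification idea can be sketched very simply: learn per-field bounds from a corpus of typical inputs, then clamp atypical inputs back into the learned ranges. This is a minimal illustration of the concept, not the actual SOAP implementation; the field names and values are made up.

```python
# Minimal input-rectification sketch: learn range constraints, then clamp.

def learn_constraints(typical_inputs):
    """Learn [min, max] bounds for each named integer field."""
    bounds = {}
    for inp in typical_inputs:
        for field, value in inp.items():
            lo, hi = bounds.get(field, (value, value))
            bounds[field] = (min(lo, value), max(hi, value))
    return bounds

def rectify(inp, bounds):
    """Clamp each field into its learned range; typical inputs pass unchanged."""
    return {f: min(max(v, bounds[f][0]), bounds[f][1]) if f in bounds else v
            for f, v in inp.items()}

training = [{"width": 640, "height": 480}, {"width": 1920, "height": 1080}]
bounds = learn_constraints(training)

# An atypical (potentially dangerous) input: a huge width that might overflow.
suspicious = {"width": 2**31 - 1, "height": 600}
print(rectify(suspicious, bounds))   # width clamped to 1920
```

The real system learns much richer constraints than simple ranges, but the principle of converting a dangerous input into a nearby typical one is the same.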

  7. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
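The chain-rule propagation mentioned above is easy to demonstrate with forward-mode automatic differentiation using dual numbers: each value carries its derivative alongside it, and every operation propagates both. This is a toy sketch of the idea only; production tools in this literature (such as ADIFOR) work by source transformation rather than operator overloading.

```python
# Forward-mode automatic differentiation with dual numbers (illustrative).
import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot              # value and derivative
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)   # product rule
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)  # chain rule

def f(x):
    return x * x + sin(x)      # f(x) = x^2 + sin(x), so f'(x) = 2x + cos(x)

x = Dual(1.5, 1.0)             # seed the derivative dx/dx = 1
y = f(x)
print(y.val, y.dot)            # value f(1.5) and exact derivative f'(1.5)
```

Note that the derivative is exact to machine precision: nothing is approximated by finite differences, and nothing is manipulated symbolically.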

  8. Automatic Radiation Monitoring in Slovenia

    International Nuclear Information System (INIS)

    Full text: The automatic radiation monitoring system in Slovenia started in the early nineties and now comprises measurements of: 1. External gamma radiation: for the time being there are forty-three probes with GM tubes integrated into a common automatic network operated at the SNSA; the probes measure dose rate in 30-minute intervals. 2. Aerosol radioactivity: three automatic aerosol stations measure the concentration of artificial alpha and beta activity in the air, gamma-emitting radionuclides, radioactive iodine-131 in the air (in all chemical forms), and natural radon and thoron progeny. 3. Radon progeny concentration: radon progeny concentration is measured hourly and the results are displayed as equilibrium equivalent concentrations (EEC). 4. Radioactive deposition measurements: in support of the gamma dose rate measurements, the SNSA developed and installed an automatic measuring station for surface contamination equipped with a gamma spectrometry system (with a 3x3'' NaI(Tl) detector). All data are transferred through different communication pathways to the SNSA and collected in 30-minute intervals. Within these intervals the central computer analyses and processes the collected data and creates different reports. Every month a QA/QC analysis of the data is performed, showing the statistics of acquisition errors and the availability of measuring results. All results are promptly available on the SNSA web pages. The data are checked and sent daily to the EURDEP system at Ispra (Italy) and also to the Austrian, Croatian and Hungarian authorities. (author)

  9. 浅谈高速铁路供电SCADA系统调试工作%On the High-speed Rail Power Supply SCADA System Debugging

    Institute of Scientific and Technical Information of China (English)

    曾亮

    2015-01-01

    In order to standardize the acceptance of remote debugging for railway power supply SCADA systems and to eliminate the safety hazards present in equipment after SCADA takeover, this paper draws on debugging experience with SCADA systems on existing lines and on the systems' operational requirements, and focuses on the content, requirements, and procedures of SCADA system debugging. It offers practical guidance for this work.

  10. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques suffice for identifying a number of authentication flaws in symmetric key protocols such as Needham-Schroeder, Otway-Rees, Yahalom and Andrew Secure RPC.

  11. 变压器短路试验电流的计算及调试%Calculation and Debugging of Current in Short- Circuit Tests of Transformers

    Institute of Scientific and Technical Information of China (English)

    杨治业

    2001-01-01

    The method of calculating the current in short-circuit tests of transformers, and the associated error, are analyzed and presented, and problems related to current debugging in short-circuit tests are discussed.

  12. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, ‘automaticity’ refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due both to a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) the functional significance of automaticity; (b) the neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  13. Differential Protection Debugging of DGT801 Series Digital Transformer%DGT801系列数字式变压器差动保护调试

    Institute of Scientific and Technical Information of China (English)

    何霞

    2012-01-01

    The differential protection of the DGT801 series digital transformer can become stuck during on-site installation and debugging. To meet future needs in debugging and accident handling, the wiring and algorithm of the DGT801 series domestic microprocessor-based transformer differential protection are analyzed. Taking the dual-transformer differential protection of a hydropower station as an example, this paper discusses how to use a three-phase protection calibrator to perform the ratio-restraint braking test, by which this type of protection can be debugged.

  14. COMMISION DEBUGGING OF AIR-COOLED ISLAND SYSTEMS FOR 600 MW AIR-COOLED UNITS%600 MW空冷机组空冷岛的调试

    Institute of Scientific and Technical Information of China (English)

    杨海生; 李路江; 吴瑞涛; 刘春报; 刘红霞

    2009-01-01

    Problems that appeared during the commissioning of the air-cooled island systems of two 600 MW air-cooled units at Guodian Hebei Longshan Power Generation Co., Ltd. are analyzed, such as an unduly fast vacuum drop in the air-cooled island during unit start-up after shutdown in winter, and a rapid rise of back-pressure in the air-cooled island caused by full-load operation of all cooling-air fans together with air leakage in the steam-seal system. The preventive measures adopted for these problems during commissioning are given, and concrete recommendations are put forward for the problems that could not be solved during commissioning.

  15. 一种基于通信事件的机载分布式软件调试方法%A Debugging Method for Airborne Distributed Software Based on Communication Event

    Institute of Scientific and Technical Information of China (English)

    张树兵; 叶宏

    2011-01-01

    The development and integration of multiple application partitions is one of the key steps in developing an IMA system, and controlling and debugging the distributed cooperation based on inter-partition communication is the crux of that work. We introduce the concept of a "communication event" to abstract inter-partition communication activity, and propose a debugging approach, the communication-event debugging method, which synchronously controls the source and destination partitions of inter-partition communication and cooperatively debugs the communication activity. We further introduce a hybrid debugging method that effectively combines communication-event debugging with code debugging, and discuss its debugging process as well as the advantages and shortcomings it has shown in practice.

  16. JOSTRAN: An Interactive Joss Dialect for Writing and Debugging Fortran Programs.

    Science.gov (United States)

    Graham, W. R.; Macneilage, D. C.

    JOSTRAN is a JOSS dialect that expedites the construction of FORTRAN programs. JOSS is an interactive, on-line computer system. JOSS language programs are list-processed; i.e., each statement is interpreted at execution time. FORTRAN is the principal language for programming digital computers to perform numerical calculations. The JOSS language…

  17. New hardware support transactional memory and parallel debugging in multicore processors

    OpenAIRE

    Orosa Nogueira, Lois

    2013-01-01

    This thesis contributes to the area of hardware support for parallel programming by introducing new hardware elements into multicore processors, with the aim of improving performance and optimizing new tools, abstractions and applications related to parallel programming, such as transactional memory and data-race detectors. Specifically, we configure a hardware transactional memory system with signatures as part of the hardware support, and we develop a new hardware filter for reducing the...

  18. Automatic Kurdish Dialects Identification

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2016-02-01

    Full Text Available Automatic dialect identification is a necessary language technology for processing multi-dialect languages in which the dialects are linguistically far from each other. It becomes particularly crucial where the dialects are mutually unintelligible: to perform computational activities on these languages, the system needs to identify the dialect that is the subject of the process. The Kurdish language encompasses various dialects and is written using several different scripts, and it lacks a standard orthography. This situation makes Kurdish dialect identification all the more interesting and necessary, from both the research and the application perspectives. In this research, we have applied a classification method, based on supervised machine learning, to identify the dialects of Kurdish texts. The research has focused on the two most widely spoken and dominant Kurdish dialects, namely Kurmanji and Sorani. The approach could be applied to the other Kurdish dialects as well, and the method is also applicable to languages that are similar to Kurdish in their dialectal diversity and differences.
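A supervised text classifier of the kind described above can be sketched with character-bigram Naive Bayes, a common baseline for dialect and language identification. This is only an illustration of the technique, not the paper's actual model or features; the "training texts" below are toy placeholder strings, not real Kurmanji or Sorani data.

```python
# Toy character-bigram Naive Bayes dialect classifier (illustrative only).
import math
from collections import Counter

def ngrams(text, n=2):
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def train(corpus):
    """corpus: dict label -> list of texts. Returns per-label n-gram counts."""
    return {label: Counter(g for t in texts for g in ngrams(t))
            for label, texts in corpus.items()}

def classify(text, model, alpha=1.0):
    """Pick the label maximizing the add-alpha-smoothed log-likelihood."""
    vocab = {g for counts in model.values() for g in counts}
    best, best_lp = None, -math.inf
    for label, counts in model.items():
        total = sum(counts.values())
        lp = sum(math.log((counts[g] + alpha) / (total + alpha * len(vocab)))
                 for g in ngrams(text))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Hypothetical two-dialect corpus (placeholder strings, not real data).
toy_corpus = {"dialect_a": ["ez diçim malê", "tu çawa yî"],
              "dialect_b": ["min dezanim", "to chon it"]}
model = train(toy_corpus)
print(classify("ez çawa", model))
```

Character n-grams are a natural choice here because they work across differing scripts and nonstandard orthography, both of which the abstract identifies as challenges for Kurdish.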

  19. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  20. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  1. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that the CPU time and memory requirements of hierarchical analysis are low if heuristics are used during the abstraction process.
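The payoff of hierarchical abstraction can be sketched with a toy model: each block is either a leaf with a known delay or a series/parallel composition of sub-blocks, and the abstracted delay of each block type is cached so it is analyzed only once no matter how often it is instantiated. This is a deliberately simplified illustration of the idea, not the PREDICT tool; the block library and delays are hypothetical.

```python
# Toy hierarchical delay analysis with memoized block abstraction.
from functools import lru_cache

# Hypothetical library: leaves map to a delay in ns; composites map to
# ("series" | "parallel", [sub-block names]).
LIBRARY = {
    "inv":   1.0,
    "nand":  1.5,
    "adder": ("series", ["nand", "nand", "inv"]),
    "alu":   ("parallel", ["adder", "adder"]),  # worst case over parallel paths
    "chip":  ("series", ["alu", "inv"]),
}

@lru_cache(maxsize=None)           # each block type is abstracted only once
def delay(name):
    spec = LIBRARY[name]
    if isinstance(spec, (int, float)):
        return float(spec)          # leaf: delay known directly
    kind, subs = spec
    sub_delays = [delay(s) for s in subs]
    return sum(sub_delays) if kind == "series" else max(sub_delays)

print(delay("chip"))   # adder = 4.0 ns, alu = 4.0 ns, chip = 5.0 ns
```

In a flat analysis, every instance of `adder` would be re-traversed; here the cache plays the role of the abstracted delay stored as a database property in the hierarchy.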

  2. 超大规模集成电路可调试性设计综述%Survey of Design-for-Debug of VLSI

    Institute of Scientific and Technical Information of China (English)

    钱诚; 沈海华; 陈天石; 陈云霁

    2012-01-01

    Design-for-debug (DFD) has become an important feature of modern VLSI. On the one hand, traditional pre-silicon verification methods are no longer sufficient to ensure the quality of modern complex VLSI designs, so employing DFD to facilitate post-silicon validation has attracted wide interest from both academia and industry; on the other hand, debugging parallel programs is a notoriously difficult problem, since many subtle bugs cannot be tracked down with traditional single-stepping and breakpoint methods and, without dedicated hardware support, cost enormous human effort. In this paper, we comprehensively analyze the existing structures of DFD and introduce the different fields of DFD for debugging hardware and software. These fields contain various kinds of DFD infrastructure, such as DFD infrastructure for processor pipelines, for systems-on-chip (SoC), and for the on-chip networks of multicore processors. We also introduce recent research on how to design DFD infrastructure for a given processor architecture and how to use it to solve debugging problems in these different fields: the topology of the whole infrastructure, the hardware design of components, the methods of analyzing signals, the

  3. Factory Equipment Maintenance and Debugging PLC Programming Ideas to Build%工厂PLC设备维修与调试编程思路构建

    Institute of Scientific and Technical Information of China (English)

    张自强

    2014-01-01

    This paper analyzes in depth the functions and roles of the PLC (programmable logic controller), explores approaches to the maintenance and debug programming of PLC equipment in factories, and provides some references for troubleshooting plant PLC equipment failures.

  4. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through the discrete wavelet transform. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, six universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
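The feature-extraction step can be illustrated with one level of the Haar wavelet transform, the simplest discrete wavelet: pairwise averages give a coarse approximation of the signal and pairwise differences capture local detail. This is a 1-D sketch of the principle only; the study would use a 2-D multi-level DWT over the eye-region pixels, and the pixel row below is hypothetical.

```python
# One level of the 1-D Haar DWT (illustrative feature-extraction step).

def haar_step(signal):
    """Split a signal into pairwise averages (approximation) and
    pairwise differences (detail). Length must be even."""
    assert len(signal) % 2 == 0
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

# A hypothetical row of eye-region pixel intensities.
row = [52, 54, 60, 58, 120, 124, 90, 88]
approx, detail = haar_step(row)
print(approx)   # [53.0, 59.0, 122.0, 89.0]
print(detail)   # [-1.0, 1.0, -2.0, 1.0]
```

Applying `haar_step` recursively to the approximation yields the multi-level coefficients that typically serve as inputs to a classifier such as an artificial neural network.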

  5. Automatic utilities auditing

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Colin Boughton [Energy Metering Technology (United Kingdom)

    2000-08-01

    At present, energy audits represent only snapshot situations of the flow of energy. The normal pattern of energy audits, as seen through the eyes of an experienced energy auditor, is described, and a brief history of energy auditing is given. It is claimed that the future of energy auditing lies in automatic meter reading with expert data analysis, providing continuous automatic auditing and thereby reducing the skill element. Ultimately, it will be feasible to carry out auditing at intervals of, say, 30 minutes rather than five years.

  6. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations, i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot

  7. Bugs that debugs: Probiotics

    Directory of Open Access Journals (Sweden)

    Sugumari Elavarasu

    2012-01-01

    Full Text Available The oral cavity harbors a diverse array of bacterial species; more than 600 species colonize the oral cavity. These include a number of organisms that are not commonly known to reside in the gastrointestinal (GI) tract, as well as more familiar ones: Lactobacillus acidophilus, Lactobacillus casei, Lactobacillus fermentum, Lactobacillus plantarum, Lactobacillus rhamnosus, and Lactobacillus salivarius. The balance of all these microorganisms can easily be disturbed, and a prevalence of pathogenic organisms can lead to various oral health problems including dental caries, periodontitis, and halitosis.

  8. Performance management system enhancement and maintenance

    Science.gov (United States)

    Cleaver, T. G.; Ahour, R.; Johnson, B. R.

    1984-01-01

    The research described in this report concludes a two-year effort to develop a Performance Management System (PMS) for the NCC computers. PMS provides semi-automated monthly reports to NASA and contractor management on the status and performance of the NCC computers in the TDRSS program. Throughout 1984, PMS was tested, debugged, extended, and enhanced. Regular PMS monthly reports were produced and distributed. PMS continues to operate at the NCC under control of Bendix Corp. personnel.

  9. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...

  10. Real-time automatic registration in optical surgical navigation

    Science.gov (United States)

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Si, Xuan; Chen, Xiuwen; Wu, Xiaoming

    2016-05-01

    An image-guided surgical navigation system requires the improvement of the patient-to-image registration time to enhance the convenience of the registration procedure. A critical step in achieving this aim is performing a fully automatic patient-to-image registration. This study reports on a design of custom fiducial markers and the performance of a real-time automatic patient-to-image registration method using these markers on the basis of an optical tracking system for rigid anatomy. The custom fiducial markers are designed to be automatically localized in both patient and image spaces. An automatic localization method is performed by registering a point cloud sampled from the three dimensional (3D) pedestal model surface of a fiducial marker to each pedestal of fiducial markers searched in image space. A head phantom is constructed to estimate the performance of the real-time automatic registration method under four fiducial configurations. The head phantom experimental results demonstrate that the real-time automatic registration method is more convenient, rapid, and accurate than the manual method. The time required for each registration is approximately 0.1 s. The automatic localization method precisely localizes the fiducial markers in image space. The averaged target registration error for the four configurations is approximately 0.7 mm. The automatic registration performance is independent of the positions relative to the tracking system and the movement of the patient during the operation.
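The core geometric step in fiducial-based registration, recovering a rigid transform from matched marker positions, can be shown compactly in 2-D, where complex numbers encode rotation. This is a simplified analogue for illustration: the paper's method works in 3-D with an optical tracker (where the rotation is typically recovered via an SVD), and the marker coordinates below are hypothetical.

```python
# 2-D rigid point-set registration via complex least squares (illustrative).
import cmath, math

def register_2d(src, dst):
    """Find rotation r (unit complex) and translation t with dst ~= r*src + t."""
    n = len(src)
    cs = sum(src) / n                # centroid of source markers
    cd = sum(dst) / n                # centroid of destination markers
    # Cross-correlation of centered point sets; its phase is the rotation.
    a = sum((d - cd) * (s - cs).conjugate() for s, d in zip(src, dst))
    r = a / abs(a)                   # project onto a pure rotation
    t = cd - r * cs
    return r, t

# Hypothetical marker positions, and the same markers after a known motion.
markers = [complex(0, 0), complex(10, 0), complex(0, 5), complex(7, 3)]
true_r = cmath.exp(1j * math.radians(30))
true_t = complex(2, -1)
moved = [true_r * m + true_t for m in markers]

r, t = register_2d(markers, moved)
err = max(abs(r * m + t - q) for m, q in zip(markers, moved))
print(f"max registration error = {err:.2e}")
```

With noise-free correspondences the transform is recovered exactly up to rounding; the residual of the same computation on measured marker positions is what target registration error quantifies.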

  11. Performance testing of a semi-automatic card punch system, using direct STR profiling of DNA from blood samples on FTA™ cards.

    Science.gov (United States)

    Ogden, Samantha J; Horton, Jeffrey K; Stubbs, Simon L; Tatnell, Peter J

    2015-01-01

    The 1.2 mm Electric Coring Tool (e-Core™) was developed to increase the throughput of FTA™ sample collection cards used during forensic workflows and is similar to a 1.2 mm Harris manual micro-punch for sampling dried blood spots. Direct short tandem repeat (STR) DNA profiling was used to compare samples taken by the e-Core tool with those taken by the manual micro-punch. The performance of the e-Core device was evaluated using a commercially available PowerPlex™ 18D STR System. In addition, an analysis was performed to investigate the potential carryover of DNA via the e-Core punch from one FTA disc to another. This contamination study was carried out using Applied Biosystems AmpFlSTR™ Identifiler™ Direct PCR Amplification kits. The e-Core instrument does not contaminate FTA discs when a cleaning punch is used following excision of discs containing samples, and it generates STR profiles that are comparable to those generated by the manual micro-punch. PMID:25407399

  12. The automatic pop-out mechanism of the subject-performed tasks effect: evidence from output monitoring

    Institute of Scientific and Technical Information of China (English)

    李广政; 王丽娟

    2016-01-01

    Using the "recognition of recall" paradigm, this study examined the retrieval mechanism of the enactment effect from the perspective of output monitoring. It is well established that simple action phrases are retained better when participants perform the actions than when they merely read the phrases; this superior memory after motor learning is known as the subject-performed tasks (SPT) effect. Although several theories have been proposed to account for the effect, they contradict one another, the main point of contention being whether the SPT effect benefits from automatic retrieval. Eighty college students took part in two between-subject experiments in which items were memorized either by silent reading (verbal tasks, VTs) or by symbolic enactment after reading (SPTs). In Experiment 1, the serial-position curve of free recall under SPTs lacked a primacy effect but showed an extended recency effect, and "recognition of recall" performance under SPTs was significantly worse than under VTs, specifically at blocks 2-9, 10, 11, and 12, indicating automatic pop-out during retrieval at those blocks; that is, the SPT effect benefited from automatic pop-out at retrieval. In Experiment 2, with a categorized test, SPTs and VTs produced similar serial-position curves in free recall, while the "recognition of recall" results replicated Experiment 1. The study concludes that there is no direct causal relationship between the serial-position effect in free recall and the automatic pop-out mechanism, and that the "recognition of recall" paradigm is a sensitive measure of the retrieval mechanism underlying the SPT effect.

  13. Automatic differentiation: Obtaining fast and reliable derivatives -- fast

    Energy Technology Data Exchange (ETDEWEB)

    Bischof, C.H.; Khademi, P.M.; Pusch, G. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.; Carle, A. [Rice Univ., Houston, TX (United States). Center for Research on Parallel Computation

    1994-12-31

    In this paper, the authors introduce automatic differentiation as a method for computing derivatives of large computer codes. After a brief discussion of methods of differentiating codes, they review automatic differentiation and introduce the ADIFOR (Automatic DIfferentiation of FORtran) tool. They highlight some applications of ADIFOR to large industrial and scientific codes (groundwater transport, CFD airfoil design, and sensitivity-enhanced MM5 mesoscale weather model), and discuss the effectiveness and performance of their approach. Finally, they discuss sparsity in automatic differentiation and introduce the SparsLinC library.
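The forward mode of automatic differentiation that tools in the ADIFOR family implement can be sketched with dual numbers: each value carries its derivative, and arithmetic propagates both together. The minimal class below is an illustration of the technique only; the names are hypothetical and bear no relation to ADIFOR's actual API.

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 = 0: val is the value, dot the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (u*v)' = u*v' + u'*v
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

    def sin(self):
        # Chain rule through sin
        return Dual(math.sin(self.val), math.cos(self.val) * self.dot)

def derivative(f, x):
    """d/dx f at x via one forward-mode pass (seed the derivative with 1)."""
    return f(Dual(x, 1.0)).dot
```

For example, `derivative(lambda x: x * x + 3 * x, 2.0)` propagates value and derivative in a single evaluation, which is the essence of forward-mode AD; source-transformation tools like ADIFOR achieve the same effect by generating derivative code.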

  14. A Novel Optimal Software Release Policy under Imperfect Debugging

    Institute of Scientific and Technical Information of China (English)

    刘云; 田斌; 赵玮

    2005-01-01

    Optimal software release timing is a challenging problem in software reliability. Most available software release models rest on the unreasonable assumption that the debugging process is perfect and that no new faults are introduced during debugging. This paper presents an optimal software release policy model under imperfect debugging. The model accounts not only for imperfect debugging and the new faults introduced while debugging, but also for the fact that the probability of perfect debugging increases as experience is gained during software testing. The solution of the model is also given.

  15. Evaluation of anti-fouling performance for ion-rod water treater with automatic dynamic simulator of fouling

    Institute of Scientific and Technical Information of China (English)

    孙灵芳; 杨善让; 秦裕琨; 徐志明

    2005-01-01

    The application of a novel Automatic Dynamic Simulator of Fouling (ADSF) to evaluate the effectiveness of an ion-rod water treater is reported. The effects of several parameters of the water treater were studied with an ADSF built according to patented technology, using an orthogonal experimental design with artificial hard water. Experimental results confirmed that the ion-rod water treater could mitigate fouling, with an anti-fouling efficiency that varies with test conditions. The efficiency increased with flow velocity in the range 0.8-1.2 m·s-1 and with output voltage in the range 7500-15000 V, and it first rose and then fell with increasing hardness. A rough ion-rod surface outperformed a smooth one. The order of influence of these factors on treater performance was: water hardness, surface roughness, flow velocity, and output voltage. The research also provides guidance for improving the performance of ion-rod water treaters.

  16. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…

  17. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; Vries, A.P. de; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding expensive manual annotation.

  18. Practical automatic Arabic license plate recognition system

    Science.gov (United States)

    Mohammad, Khader; Agaian, Sos; Saleh, Hani

    2011-02-01

    Since the 1970s, the need for automatic license plate recognition (ALPR) systems has been increasing. A license plate recognition system automatically recognizes a license plate number extracted from image sensors. ALPR systems are used in conjunction with various transportation systems in application areas such as law enforcement (e.g. speed-limit enforcement) and commercial uses such as parking enforcement, automatic toll payment, private and public entrances, border control, and theft and vandalism control. Vehicle license plate recognition has been intensively studied in many countries, and because different types of license plates are in use, the requirements of an ALPR system differ for each country. Generally, an automatic license plate localization and recognition system is made up of three modules: license plate localization, character segmentation, and optical character recognition. This paper presents an Arabic license plate recognition system that is insensitive to character size, font, shape, and orientation, with an extremely high accuracy rate. The proposed system is based on a combination of enhancement, license plate localization, morphological processing, and feature vector extraction using the Haar transform. The system is fast because the alphabet and numerals are classified according to the license plate layout. Experimental results for license plates of two different Arab countries show an average of 99% successful license plate localization and recognition on a total of more than 20 images captured in a complex outdoor environment, with shorter run times than conventional and many state-of-the-art methods.
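The feature-extraction stage above relies on the Haar transform. One level of the 1D transform can be sketched as pairwise averages (a low-pass approximation) followed by pairwise differences (high-pass detail); this is a simplified illustration of the transform itself, not the paper's implementation.

```python
def haar_step(signal):
    """One level of the (averaging) Haar transform on an even-length sequence:
    first half = pairwise averages, second half = pairwise half-differences."""
    avgs = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    diffs = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return avgs + diffs
```

Applying the step recursively to the approximation half yields the full multi-level transform; in recognition systems the resulting coefficients serve as a compact feature vector for the classifier.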

  19. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    2004-01-01

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance and allows knee flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients fitted with the hinge system.

  20. Automatic Pilot For Flight-Test Maneuvers

    Science.gov (United States)

    Duke, Eugene L.; Jones, Frank P.; Roncoli, Ralph B.

    1992-01-01

    Autopilot replaces pilot during automatic maneuvers. Pilot, based on ground, flies aircraft to required altitude, then turns control over to autopilot. Increases quality of maneuvers significantly beyond that attainable through remote manual control by pilot on ground. Also increases quality of maneuvers because it performs maneuvers faster than pilot could and because it does not have to repeat poorly executed maneuvers.

  1. Automatic visual inspection of hybrid microcircuits

    Energy Technology Data Exchange (ETDEWEB)

    Hines, R.E.

    1980-05-01

    An automatic visual inspection system using a minicomputer and a video digitizer was developed for inspecting hybrid microcircuits (HMC) and thin-film networks (TFN). The system performed well in detecting missing components on HMCs and reduced the testing time for each HMC by 75%.

  2. ASAM: Automatic architecture synthesis and application mapping

    DEFF Research Database (Denmark)

    Jozwiak, Lech; Lindwer, Menno; Corvino, Rosilde;

    2013-01-01

    This paper focuses on mastering the automatic architecture synthesis and application mapping for heterogeneous massively-parallel MPSoCs based on customizable application-specific instruction-set processors (ASIPs). It presents an overview of the research being currently performed in the scope of...

  3. Simultaneous automatic determination of catecholamines and their 3-O-methyl metabolites in rat plasma by high-performance liquid chromatography using peroxyoxalate chemiluminescence reaction.

    Science.gov (United States)

    Tsunoda, M; Takezawa, K; Santa, T; Imai, K

    1999-05-01

    A highly specific and sensitive automated high-performance liquid chromatographic method for the simultaneous determination of catecholamines (CAs; norepinephrine, epinephrine, and dopamine) and their 3-O-methyl metabolites (normetanephrine, metanephrine, and 3-methoxytyramine) is described. Automated precolumn ion-exchange extraction of diluted plasma is coupled with HPLC separation of CAs and their 3-O-methyl metabolites on an ODS column, postcolumn coulometric oxidation, fluorescence derivatization with ethylenediamine, and finally peroxyoxalate chemiluminescence reaction detection. The detection limits were about 3 fmol for norepinephrine, epinephrine, and dopamine, 5 fmol for normetanephrine, and 10 fmol for metanephrine and 3-methoxytyramine (signal-to-noise ratio of 3). Fifty microliters of rat plasma was used and 4-methoxytyramine was employed as an internal standard. The relative standard deviations for the method (n = 5) were 2.5-7.6% for the intraday assay and 6.3-9.1% for the interday assay. The method was applicable to the determination of normetanephrine and metanephrine in 50 microl of rat plasma. PMID:10222014

  4. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
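The book's Monte Carlo methodology, generating artificial series with a known trend, running an estimator, and measuring its error, can be sketched as below. This is an illustrative example using an ordinary-least-squares slope estimator on a linear trend plus AR(1) noise; the parameters and the estimator choice are assumptions for illustration, not the book's algorithms.

```python
import math
import random

def mc_trend_accuracy(true_slope=0.05, n=200, trials=500,
                      noise_sd=1.0, phi=0.6, seed=1):
    """Monte Carlo RMSE of an OLS slope estimator on artificial
    linear-trend + AR(1)-noise series."""
    rng = random.Random(seed)
    errs = []
    for _ in range(trials):
        # Generate one artificial series: trend + autocorrelated noise
        eps = 0.0
        xs = list(range(n))
        ys = []
        for t in xs:
            eps = phi * eps + rng.gauss(0.0, noise_sd)
            ys.append(true_slope * t + eps)
        # Ordinary least-squares slope estimate
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
        errs.append(slope - true_slope)
    return math.sqrt(sum(e * e for e in errs) / trials)
```

Running the experiment over many synthetic series gives an empirical accuracy for the estimator under known conditions, which is exactly the kind of evaluation the numerical-experiment method supports.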

  5. Automatic Program Reports

    OpenAIRE

    Lígia Maria da Silva Ribeiro; Gabriel de Sousa Torcato David

    2007-01-01

    To profit from the data collected by the SIGARRA academic IS, a systematic set of graphs and statistics has been added to it and is available on-line. This analytic information can be automatically included in a flexible yearly report for each program as well as in a synthesis report for the whole school. Some difficulties in the interpretation of some graphs led to the definition of new key indicators and the development of a data warehouse across the university where effective data consolidation...

  6. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  7. Automatic Differentiation Variational Inference

    OpenAIRE

    Kucukelbir, Alp; Tran, Dustin; Ranganath, Rajesh; Gelman, Andrew; Blei, David M.

    2016-01-01

    Probabilistic modeling is iterative. A scientist posits a simple model, fits it to her data, refines it according to her analysis, and repeats. However, fitting complex models to large data is a bottleneck in this process. Deriving algorithms for new models can be both mathematically and computationally challenging, which makes it difficult to efficiently cycle through the steps. To this end, we develop automatic differentiation variational inference (ADVI). Using our method, the scientist on...

  8. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI, a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  9. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects face from the stored video frame using skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experiment results indicate that using support vector machine as classifier can certainly improve the performance of automatic pain recognition system.
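The classification stage described above can be caricatured with a minimal linear SVM trained by Pegasos-style stochastic subgradient descent on toy 2D "feature" vectors. This is only an illustrative stand-in: the paper's features come from detected faces, and every name and parameter below is a hypothetical assumption.

```python
import random

def train_linear_svm(data, labels, lam=0.01, epochs=200, seed=0):
    """Linear SVM via Pegasos-style stochastic subgradient descent.
    data: list of feature vectors; labels: +1 / -1 per vector."""
    rng = random.Random(seed)
    d = len(data[0])
    w = [0.0] * d
    b = 0.0
    t = 0
    idx = list(range(len(data)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            margin = labels[i] * (sum(wj * xj for wj, xj in zip(w, data[i])) + b)
            if margin < 1:
                # Hinge-loss violation: shrink w and step toward the example
                w = [(1 - eta * lam) * wj + eta * labels[i] * xj
                     for wj, xj in zip(w, data[i])]
                b += eta * labels[i]
            else:
                # No violation: only the regularization shrinkage applies
                w = [(1 - eta * lam) * wj for wj in w]
    return w, b

def predict(w, b, x):
    """Sign of the decision function."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

On separable toy clusters the trainer quickly finds a separating hyperplane; production systems would instead use a mature solver (e.g. an SMO-based library) and real shape/location features.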

  10. Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

    CERN Multimedia

    1969-01-01


  11. Automatic bootstrapping and tracking of object contours.

    Science.gov (United States)

    Chiverton, John; Xie, Xianghua; Mirmehdi, Majid

    2012-03-01

    A new fully automatic object tracking and segmentation framework is proposed. The framework consists of a motion-based bootstrapping algorithm concurrent to a shape-based active contour. The shape-based active contour uses finite shape memory that is automatically and continuously built from both the bootstrap process and the active-contour object tracker. A scheme is proposed to ensure that the finite shape memory is continuously updated but forgets unnecessary information. Two new ways of automatically extracting shape information from image data given a region of interest are also proposed. Results demonstrate that the bootstrapping stage provides important motion and shape information to the object tracker. This information is found to be essential for good (fully automatic) initialization of the active contour. Further results also demonstrate convergence properties of the content of the finite shape memory and similar object tracking performance in comparison with an object tracker with unlimited shape memory. Tests with an active contour using a fixed-shape prior also demonstrate superior performance for the proposed bootstrapped finite-shape-memory framework and similar performance when compared with a recently proposed active contour that uses an alternative online learning model. PMID:21908256

  12. Design of the display function in a debugging system for a space encoder

    Institute of Scientific and Technical Information of China (English)

    左洋; 龙科慧; 乔克; 刘金国

    2012-01-01

    To meet the high reliability and stability requirements of space encoders, a display module for an encoder debugging system was designed. The system uses the 80C32E single-chip microcontroller as its core processor to acquire, process, communicate, and display encoder signals. In the display module, angle information is shown in binary form on an LED bar and in degrees-minutes-seconds form on an LCD screen. During encoder debugging, the LED bar display is suited to professionals monitoring whether the encoder data carries normally and reading the angle information, while the LCD is suited to non-professionals viewing the encoder angle. The system is compact, flexible to operate, and offers good visibility, which facilitates encoder debugging. Experimental results show that the debugging system can display angle information in both forms simultaneously with good real-time behavior and convenient readout, and that its display function can be further extended.
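The two display formats in the abstract (a binary code on the LED bar, degrees-minutes-seconds on the LCD) differ only by a conversion step. A sketch of that conversion for a hypothetical 16-bit absolute encoder (the resolution and function name are illustrative assumptions):

```python
def code_to_dms(code, bits=16):
    """Map an absolute encoder's binary angle code to (degrees, minutes, seconds).
    A full revolution (2**bits counts) corresponds to 360 degrees."""
    angle = code * 360.0 / (1 << bits)   # code -> decimal degrees
    d = int(angle)
    m_float = (angle - d) * 60.0
    m = int(m_float)
    s = (m_float - m) * 60.0
    return d, m, s
```

The LED bar would show `code` bit by bit, while the LCD would show the `(d, m, s)` tuple; with 16 bits one count is about 19.8 arc-seconds.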

  13. On automatic visual inspection of reflective surfaces

    DEFF Research Database (Denmark)

    Kulmann, Lionel

    1995-01-01

    This thesis describes different methods for performing automatic visual inspection of reflective manufactured products, with the aim of increasing productivity, reducing cost and improving the quality level of production. Two different systems performing automatic visual inspection are investigated. The first is the inspection of highly reflective aluminum sheets, used by the Danish company Bang & Olufsen as part of the exterior design and general appearance of their audio and video products. The second is the inspection of IBM hard disk read/write heads for defects during manufacturing. Visual inspection system design methods are surveyed and available image processing hardware for high-resolution image capture is presented. General, practical visual inspection system solutions for high-resolution inspection of surfaces are presented, along with known and new...

  14. A General Method for Module Automatic Testing in Avionics Systems

    Directory of Open Access Journals (Sweden)

    Li Ma

    2013-05-01

    Full Text Available The traditional Automatic Test Equipment (ATE) systems are insufficient to cope with the challenges of testing increasingly complex avionics systems. In this study, we propose a general method for module automatic testing in an avionics test platform based on the PXI bus. We apply virtual instrument technology to realize automatic testing and fault reporting of signal performance. Taking the avionics bus ARINC429 as an example, we introduce the architecture of the automatic test system as well as the implementation of the algorithms in LabVIEW. Comprehensive experiments show that the proposed method can effectively accomplish automatic testing and fault reporting of signal performance, greatly improving the generality and reliability of ATE in avionics systems.

  15. Research on automatic identification system performance with hardware-in-the-loop simulation

    Institute of Scientific and Technical Information of China (English)

    马枫; 严新平; 初秀民

    2013-01-01

    The Automatic Identification System (AIS) is a widely used radio monitoring tool for maritime administration. To analyze its packet-error characteristics over complex terrain, a hardware-in-the-loop simulation method was designed: AIS signals with arbitrary field strength and interference characteristics are generated, and the packet error rate is evaluated with real receivers. The Okumura-Hata model was used to relate field strength to distance, so that the terrain attenuation and common types of interference encountered in a real AIS network could be simulated. With this method, the error characteristics of signal attenuation can be found effectively and the properties of different interference signals verified. A field test of AIS signal interference in Shenzhen demonstrated the practicality of the hardware-in-the-loop method and platform. The approach provides a platform for optimizing AIS hardware design and an experimental basis for choosing the best layout of base stations.

  16. Performance evaluation of an automatic system for tomato fertigation control in substrate

    Directory of Open Access Journals (Sweden)

    Antonio J. Steidle Neto

    2009-09-01

    Full Text Available This work evaluated the performance of an automatic fertigation control system for tomato production in sand substrate, comparing it with a conventional control system with respect to the reduction of nutrient solution use. In the automatic control method, fertigation events were scheduled according to the meteorological conditions of the cultivation environment and the crop development stage, with the Penman-Monteith model used as a decision support tool for determining the appropriate frequency of nutrient solution application. In the conventional control system, the intervals between fertigations remained fixed throughout the tomato crop cycle. The results showed that the automatic control system fully met the water requirements of the crop without compromising tomato production, while providing substantial reductions in nutrient solution consumption. The conventional control system, in contrast, performed an excessive number of fertigations, especially during the initial development stage of the tomato crop and on very cloudy days. In the initial growth stage, the total volumes of nutrient solution applied to the tomato crop by the conventional system exceeded the crop water requirements by 1.31 and 1.39 L plant-1 on typical clear and cloudy days, respectively.
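The Penman-Monteith decision support mentioned above can be illustrated with the FAO-56 daily reference evapotranspiration form of the equation. The sketch below uses the standard FAO-56 constants and grass reference parameterization; it is an illustration of the general model, not the authors' implementation, and the input values in the usage note are assumptions.

```python
import math

def fao56_et0(t_mean, rn, g, u2, rh_mean, pressure=101.3):
    """FAO-56 Penman-Monteith daily reference evapotranspiration [mm/day].
    t_mean: mean air temperature [deg C]; rn, g: net radiation and soil heat
    flux [MJ m-2 day-1]; u2: wind speed at 2 m [m/s]; rh_mean: relative
    humidity [%]; pressure: atmospheric pressure [kPa]."""
    # Saturation and actual vapour pressure [kPa]
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))
    ea = es * rh_mean / 100.0
    # Slope of the saturation vapour pressure curve [kPa / deg C]
    delta = 4098.0 * es / (t_mean + 237.3) ** 2
    # Psychrometric constant [kPa / deg C]
    gamma = 0.000665 * pressure
    num = (0.408 * delta * (rn - g)
           + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea))
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den
```

For a warm clear day (for example 25 deg C, Rn = 15 MJ m-2 day-1, u2 = 2 m/s, 60% humidity), the formula gives roughly 5-6 mm/day; a controller like the one in the study would scale fertigation frequency from such an estimate.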

  17. Performance evaluation of the Sysmex CS 5100 automatic blood coagulation analyzer

    Institute of Scientific and Technical Information of China (English)

    王芳; 张军; 徐唯傑; 乐军; 陈晓燕; 陈晋

    2015-01-01

    Objective: To evaluate the performance of the Sysmex CS 5100 (CS 5100) automatic blood coagulation analyzer. Methods: PT, PT(INR), APTT, Fbg, TT, AT, D-dimer (DD) and FDP were measured on the CS 5100. Accuracy, imprecision, Fbg linearity (reportable range), reference ranges, and carry-over rate were evaluated, and the correlation with the Sysmex CA 7000 automatic blood coagulation analyzer was analyzed. Results: Accuracy results fell within the ranges given in the quality control instructions. The maximum within-run and between-day coefficients of variation (CV) met the relevant industry requirements (CLIA '88). Fbg linearity verification showed a linear range of 1.071 to 5.355, with a correlation coefficient (r) of 0.9950, meeting the specified requirement (r ≥ 0.975). Reference range verification showed R values > 0.9 for all tests, so the laboratory's preset reference ranges are applicable to this analyzer. For all test items, r values between the CS 5100 and the CA 7000 were > 0.95, showing good comparability between the two instruments. Conclusion: The CS 5100 has excellent accuracy, good precision, a wide Fbg detection range and strong resistance to biological interference, and fully meets clinical laboratory requirements.
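The acceptance criteria quoted above (r ≥ 0.975 for Fbg linearity, r > 0.95 for the method comparison) reduce to computing a Pearson correlation between measured and expected (or comparator) values. A minimal sketch of that check, with hypothetical data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sxy / (sx * sy)

def passes_linearity(measured, expected, threshold=0.975):
    """Apply an r-based acceptance criterion like the one in the study."""
    return pearson_r(measured, expected) >= threshold
```

For instance, diluting a high-Fbg pool to known fractions and regressing measured against expected concentrations would yield the r value compared against the 0.975 threshold.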

  18. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing fully automatic plant operation after reactor scram events. The goal of automatic operation after a reactor scram event is to bring the reactor to the hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the safety actions of the equipment after a reactor scram, to operate the necessary control equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance of this system was evaluated by comparing results obtained with the Nuclear Ship Engineering Simulation System (NESSY) against those measured in the scram test of the nuclear ship 'Mutsu'. The results showed that this system has sufficient performance to bring a reactor to a hot stand-by condition quickly and safely. (author)

  19. Automatic Configuration in NTP

    Institute of Scientific and Technical Information of China (English)

    Jiang Zongli(蒋宗礼); Xu Binbin

    2003-01-01

    NTP is nowadays the most widely used distributed network time protocol; it aims at synchronizing the clocks of computers in a network and preserving the accuracy and validity of the time information transmitted in the network. Without an automatic configuration mechanism, the stability and flexibility of a synchronization network built upon the NTP protocol are not satisfactory. P2P's resource discovery mechanism is used to look for time sources in a synchronization network, and according to the network environment and node quality, the synchronization network is constructed dynamically.
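For context on what the discovered time sources are used for: NTP synchronization rests on a four-timestamp exchange, from which the standard clock offset and round-trip delay are computed. A sketch of those two formulas:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Standard NTP clock offset and round-trip delay from the four timestamps:
    t1 = client transmit, t2 = server receive,
    t3 = server transmit, t4 = client receive (all in seconds)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # estimated client clock error
    delay = (t4 - t1) - (t3 - t2)            # network round-trip time
    return offset, delay
```

A client polling several dynamically discovered sources would compute these per exchange and combine them (with NTP's filtering and selection algorithms) to steer its clock.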

  20. Considering the Fault Dependency Concept with Debugging Time Lag in Software Reliability Growth Modeling Using a Power Function of Testing Time

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Since the early 1970s, tremendous growth has been seen in research on software reliability growth modeling. In general, software reliability growth models (SRGMs) are applicable to the late stages of testing in software development, and they can provide useful information about how to improve the reliability of software products. A number of SRGMs have been proposed in the literature to represent the time-dependent fault identification/removal phenomenon, and new models are still being proposed that fit a greater number of reliability growth curves. It is often assumed, when mathematical models are developed, that detected faults are immediately corrected. This assumption may not be realistic in practice because the time to remove a detected fault depends on the complexity of the fault, the skill and experience of the personnel, the size of the debugging team, the technique, and so on. Thus, the detected fault need not be immediately removed, and it may lag the fault detection process by a delay effect factor. In this paper, we first review how different software reliability growth models have been developed, where the fault detection process depends not only on the residual fault content but also on the testing time, and see how these models can be reinterpreted as delayed fault detection models by using a delay effect factor. Based on the power function of the testing time concept, we propose four new SRGMs that assume the presence of two types of faults in the software: leading and dependent faults. Leading faults are those that can be removed upon a failure being observed. However, dependent faults are masked by leading faults and can only be removed after the corresponding leading fault has been removed, with a debugging time lag. These models have been tested on real software error data to show their goodness of fit, predictive validity and applicability.
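Two of the classical mean-value functions this literature builds on, the exponential (Goel-Okumoto) model and the delayed S-shaped model, whose S-shape captures exactly the kind of debugging time lag discussed above, can be sketched as:

```python
import math

def goel_okumoto(t, a, b):
    """Exponential SRGM: expected cumulative faults detected by time t.
    a = total expected faults, b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def delayed_s_shaped(t, a, b):
    """Yamada delayed S-shaped SRGM: fault removal lags detection,
    giving an S-shaped cumulative curve with the same asymptote a."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))
```

Both curves rise to the asymptote `a`, but the delayed S-shaped curve stays below the exponential one early in testing, reflecting faults that are detected but not yet removed; the paper's four models refine this idea with a power function of testing time and leading/dependent fault types.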

  1. An Analysis of the Roles of Debugging and Precompilation in Computer Software Engineering

    Institute of Scientific and Technical Information of China (English)

    周霞

    2014-01-01

Computer software development is currently a mainstream computer application technology: it relies on the logical capabilities of computer programming languages to realize the required functions of the software. During software development, the problem must be analysed from different angles in order to understand the nature of the software engineering task and the function of the program code. Debugging is indispensable work in this process, while precompilation serves as a test of the software's functions and of its practicality. This paper analyses computer software development technology and, on that basis, examines the roles of debugging and precompilation.

  2. Automatic device for maintenance on safety relief valve

    Energy Technology Data Exchange (ETDEWEB)

    Fujio, M. [Okano Valve Mfg. Co. Ltd., Kitakyushu, Fukuoka (Japan)

    1996-02-01

This system shortens periodic inspection of nuclear power plants and saves labour, particularly in the maintenance of safety relief valves in BWR plants. The automatic device has the following performance features. (author).

  3. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand-and-shoe monitors. In addition, this prototype system also has a number of new features, including: microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card-reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company-controlled boundaries. Investigation of the commercially available portal and hand-and-shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  4. Study of a USB Device-side Driver and Its Application in GDB Remote Debugging

    Institute of Scientific and Technical Information of China (English)

    况阳; 雷航; 詹瑾瑜

    2011-01-01

In embedded Linux software development, embedded software can be debugged remotely with GDB (GNU debugger) on the host and GDBserver on the target; GDB and GDBserver communicate via the RSP (remote serial protocol), and this approach can significantly improve development efficiency. At present the host and target machines can be connected over a serial port or Ethernet, but a USB (universal serial bus) connection is not yet supported. This paper introduces the relevant USB concepts and the principles of GDB remote debugging and, after analysing the existing debugging models, uses the Gadget function driver on the Linux device side to implement a USB + GDB + GDBserver remote debugging model. The model fills a gap in the existing debugging models, and with USB interfaces becoming ever more common, it brings considerable convenience to engineers in actual development.
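
The RSP traffic between GDB and GDBserver is transport-independent, which is what makes a USB channel feasible at all: every command is framed as `$<payload>#<checksum>`, where the checksum is the modulo-256 sum of the payload bytes in two hex digits. A minimal sketch of that framing:

```python
def rsp_encode(payload: str) -> str:
    """Frame an RSP command: '$' + payload + '#' + two-hex-digit checksum,
    where the checksum is the modulo-256 sum of the payload bytes."""
    checksum = sum(payload.encode("ascii")) % 256
    return f"${payload}#{checksum:02x}"

def rsp_decode(packet: str) -> str:
    """Validate the frame and checksum of an RSP packet, returning the payload."""
    if not (packet.startswith("$") and len(packet) >= 4 and packet[-3] == "#"):
        raise ValueError("malformed RSP packet")
    payload, checksum = packet[1:-3], int(packet[-2:], 16)
    if sum(payload.encode("ascii")) % 256 != checksum:
        raise ValueError("bad RSP checksum")
    return payload
```

For example, `rsp_encode("g")` produces `"$g#67"`, the request to read the general registers; the same bytes work unchanged whether the link underneath is serial, TCP, or a Gadget-driver USB endpoint.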

  5. A semi-automatic method for ontology mapping

    OpenAIRE

    PEREZ, Laura Haide; Cechich, Alejandra; Buccella, Agustina

    2007-01-01

    Ontology mapping involves the task of finding similarities among overlapping sources by using ontologies. In a Federated System in which distributed, autonomous and heterogeneous information sources must be integrated, ontologies have emerged as tools to solve semantic heterogeneity problems. In this paper we propose a three-level approach that provides a semi-automatic method to ontology mapping. It performs some tasks automatically and guides the user in performing other tasks for which ...

  6. Performance of an automatic discharge regulator for irrigation channels

    Directory of Open Access Journals (Sweden)

    Luís G. H. do Amaral

    2010-12-01

Full Text Available The control structures commonly used in the water intakes of irrigation channels do not allow the correct amount of water to be distributed, favouring waste and consequently reducing water-use efficiency. The objective of this work was to determine the performance of an automatic discharge regulator in controlling the diverted discharge. A fibreglass unit of the equipment was installed on the side of a concrete channel at the Hydraulics Laboratory of the Federal University of Viçosa, in Viçosa, state of Minas Gerais, Brazil. The regulator was evaluated over its entire operating range; at each preset regulation, the diverted discharge was measured with the upstream water level varying from 0.30 to 0.45 m. The mean variation in the regulator's discharge over its whole operating range was ± 2.3% relative to the mean discharge supplied at each setting. This range of variation is small compared with the equipment usually employed for discharge control in irrigation channels, demonstrating that the automatic discharge regulator is suitable for water distribution in channel networks.

  7. A Statistical Approach to Automatic Speech Summarization

    Science.gov (United States)

    Hori, Chiori; Furui, Sadaoki; Malkin, Rob; Yu, Hua; Waibel, Alex

    2003-12-01

    This paper proposes a statistical approach to automatic speech summarization. In our method, a set of words maximizing a summarization score indicating the appropriateness of summarization is extracted from automatically transcribed speech and then concatenated to create a summary. The extraction process is performed using a dynamic programming (DP) technique based on a target compression ratio. In this paper, we demonstrate how an English news broadcast transcribed by a speech recognizer is automatically summarized. We adapted our method, which was originally proposed for Japanese, to English by modifying the model for estimating word concatenation probabilities based on a dependency structure in the original speech given by a stochastic dependency context free grammar (SDCFG). We also propose a method of summarizing multiple utterances using a two-level DP technique. The automatically summarized sentences are evaluated by summarization accuracy based on a comparison with a manual summary of speech that has been correctly transcribed by human subjects. Our experimental results indicate that the method we propose can effectively extract relatively important information and remove redundant and irrelevant information from English news broadcasts.
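
The extraction step above can be sketched as a small dynamic program. The toy below selects exactly m words maximizing the sum of per-word significance scores plus pairwise concatenation scores between consecutive selected words; the scores are illustrative stand-ins, whereas the paper's scores come from word significance, linguistic likelihood, and SDCFG-based concatenation probabilities, with m set by the target compression ratio:

```python
def summarize(words, sig, link, m):
    """Extract m words (in order) maximizing total word significance sig[i]
    plus concatenation scores link[k][i] between consecutive selected words.
    DP state: best[j][i] = best score of a j-word selection ending at word i."""
    n = len(words)
    NEG = float("-inf")
    best = [[NEG] * n for _ in range(m + 1)]
    back = [[None] * n for _ in range(m + 1)]
    for i in range(n):
        best[1][i] = sig[i]
    for j in range(2, m + 1):
        for i in range(j - 1, n):            # need j-1 words before position i
            for k in range(j - 2, i):        # previous selected word
                if best[j - 1][k] == NEG:
                    continue
                score = best[j - 1][k] + sig[i] + link[k][i]
                if score > best[j][i]:
                    best[j][i], back[j][i] = score, k
    # trace back from the best final word
    i = max(range(n), key=lambda i: best[m][i])
    chosen = [i]
    for j in range(m, 1, -1):
        i = back[j][i]
        chosen.append(i)
    return [words[i] for i in reversed(chosen)]
```

With zero concatenation scores this degenerates to picking the m most significant words in order; non-zero link scores are what let the DP prefer fluent word sequences over bags of keywords.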

  8. What is automatized during perceptual categorization?

    Science.gov (United States)

    Roeder, Jessica L; Ashby, F Gregory

    2016-09-01

    An experiment is described that tested whether stimulus-response associations or an abstract rule are automatized during extensive practice at perceptual categorization. Twenty-seven participants each completed 12,300 trials of perceptual categorization, either on rule-based (RB) categories that could be learned explicitly or information-integration (II) categories that required procedural learning. Each participant practiced predominantly on a primary category structure, but every third session they switched to a secondary structure that used the same stimuli and responses. Half the stimuli retained their same response on the primary and secondary categories (the congruent stimuli) and half switched responses (the incongruent stimuli). Several results stood out. First, performance on the primary categories met the standard criteria of automaticity by the end of training. Second, for the primary categories in the RB condition, accuracy and response time (RT) were identical on congruent and incongruent stimuli. In contrast, for the primary II categories, accuracy was higher and RT was lower for congruent than for incongruent stimuli. These results are consistent with the hypothesis that rules are automatized in RB tasks, whereas stimulus-response associations are automatized in II tasks. A cognitive neuroscience theory is proposed that accounts for these results. PMID:27232521

  9. MARZ: Manual and automatic redshifting software

    Science.gov (United States)

    Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.

    2016-04-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application MARZ with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based, Javascript web-application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph to produce high quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard is not possible. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can be easily redshifted manually by cycling automatic results, manual template comparison, or marking spectral features.
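
MARZ's automatic matching rests on cross-correlating each spectrum against templates over trial redshifts. The idea can be sketched, independently of the actual AUTOZ algorithm, as a brute-force search in pure Python (the grids, template, and scoring here are illustrative assumptions, not the MARZ implementation):

```python
def interp(x, xs, ys):
    """Linear interpolation of (xs, ys) at x; xs ascending, clipped at the ends."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])

def best_redshift(obs_wave, obs_flux, tpl_wave, tpl_flux, z_grid):
    """Return the trial z maximizing the normalized correlation between the
    observed spectrum and the template redshifted by a factor (1 + z)."""
    def corr(z):
        # evaluate the template at rest-frame wavelength w / (1 + z)
        shifted = [interp(w / (1.0 + z), tpl_wave, tpl_flux) for w in obs_wave]
        num = sum(a * b for a, b in zip(obs_flux, shifted))
        den = (sum(a * a for a in obs_flux) * sum(b * b for b in shifted)) ** 0.5
        return num / den if den else 0.0
    return max(z_grid, key=corr)
```

A real pipeline would cross-correlate on a log-wavelength grid via FFT and rank peaks across many stellar and galaxy templates, but the search-over-(1+z) structure is the same.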

  10. A Statistical Approach to Automatic Speech Summarization

    Directory of Open Access Journals (Sweden)

    Chiori Hori

    2003-02-01

    Full Text Available This paper proposes a statistical approach to automatic speech summarization. In our method, a set of words maximizing a summarization score indicating the appropriateness of summarization is extracted from automatically transcribed speech and then concatenated to create a summary. The extraction process is performed using a dynamic programming (DP technique based on a target compression ratio. In this paper, we demonstrate how an English news broadcast transcribed by a speech recognizer is automatically summarized. We adapted our method, which was originally proposed for Japanese, to English by modifying the model for estimating word concatenation probabilities based on a dependency structure in the original speech given by a stochastic dependency context free grammar (SDCFG. We also propose a method of summarizing multiple utterances using a two-level DP technique. The automatically summarized sentences are evaluated by summarization accuracy based on a comparison with a manual summary of speech that has been correctly transcribed by human subjects. Our experimental results indicate that the method we propose can effectively extract relatively important information and remove redundant and irrelevant information from English news broadcasts.

  11. Performance evaluation of an automatic chemistry analyzer based on the Medical Industry Standard of the People's Republic of China

    Institute of Scientific and Technical Information of China (English)

    阳苹; 张莉萍; 毕小云; 邓小玲; 肖勤; 陈维蓓

    2011-01-01

Objective: The performance of a Roche Modular DDP was evaluated according to the "Medical Industry Standard of the People's Republic of China: Automatic Chemistry Analyzer", administered by the State Food and Drug Administration (SFDA). Methods: Using standard solutions assayed and calibrated by the National Institute of Metrology, the stray light, absorbance linearity range, absorbance accuracy, absorbance stability, absorbance reproducibility, sample carryover rate, sampling accuracy and sampling reproducibility of the Roche Modular DDP were evaluated through repeated measurements of the known standard substances, as required by the standard. Results: The stray-light absorbance exceeded 23 000; the relative variation was less than 5% when the absorbance was no less than 32 000; at absorbances of 5 000 and 10 000 the errors were within ± 300 and ± 700, respectively; the difference between the highest and lowest readings was less than 100; the CV at the smallest reaction volume was less than 1.5%; the carryover rate was less than 0.5%; the absorbance of the CHKS was within the defined range with a CV of less than 1.5%; and the absorbances of the CHKR1 and CHKR2 were both within their defined ranges, with CVs of less than 0.5% and less than 1.0%, respectively. Conclusion: The performance indices of the Roche Modular DDP meet the requirements of the "Medical Industry Standard of the People's Republic of China: Automatic Chemistry Analyzer".

  12. Performance Verification of the Precil C2000-A Automatic Blood Coagulation Analyzer

    Institute of Scientific and Technical Information of China (English)

    张伟坚; 刘光明; 梁凤琼

    2014-01-01

Objective: To verify the performance of the Precil C2000-A automatic blood coagulation analyzer and determine whether it meets the requirements of clinical testing. Methods: Following the standards of the US National Committee for Clinical Laboratory Standards (NCCLS), assayed control plasma and/or calibration plasma were used with the routine coagulation items (D-dimer (D-D), prothrombin time (PT), activated partial thromboplastin time (APTT), thrombin time (TT) and fibrinogen (FIB)) to verify and evaluate the precision, trueness, carryover rate, linearity range, interference resistance (with hemoglobin, direct bilirubin and triglyceride as interferents) and channel consistency of the analytical system. Results: For all coagulation items, within-run precision was below 3% and between-run precision below 5%; results for the assayed controls or calibrators deviated from their respective target values by less than 8%; for the serially diluted linearity specimens, regression of the theoretical against the measured values gave slopes between 0.97 and 1.03 with r greater than 0.975, meeting the linearity requirement; carryover rates were below 3%; deviations in the interference tests were below 3%; and the results on the four channels showed no statistically significant differences (P>0.05), indicating good channel consistency. Conclusion: The domestically produced Precil C2000-A automatic coagulation analyzer has good analytical performance; its accuracy, precision, linearity range and carryover rate all meet quality-management requirements, it is notably resistant to interference from hemolytic, icteric and lipemic specimens, and it fully satisfies the requirements of clinical testing.
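
The acceptance criteria above (within-run CV below 3%, bias below 8%, carryover below 3%) rest on a few standard formulas, sketched here. The carryover formula in particular is a commonly used high-low protocol and is an assumption on our part, since the abstract does not state which variant was applied:

```python
def cv_percent(values):
    """Coefficient of variation (%): sample standard deviation over the mean."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

def bias_percent(measured_mean, target):
    """Relative bias (%) of the measured mean from the assigned target value."""
    return 100.0 * (measured_mean - target) / target

def carryover_percent(h3, l1, l3):
    """Carryover (%) from three high-level runs followed by three low-level
    runs: (L1 - L3) / (H3 - L3), using the last high and the first/last low."""
    return 100.0 * (l1 - l3) / (h3 - l3)
```

Each verification item then reduces to computing one of these statistics per analyte and comparing it against the stated limit.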

  13. Automatic Algorithm Selection for Complex Simulation Problems

    CERN Document Server

    Ewald, Roland

    2012-01-01

    To select the most suitable simulation algorithm for a given task is often difficult. This is due to intricate interactions between model features, implementation details, and runtime environment, which may strongly affect the overall performance. An automated selection of simulation algorithms supports users in setting up simulation experiments without demanding expert knowledge on simulation. Roland Ewald analyzes and discusses existing approaches to solve the algorithm selection problem in the context of simulation. He introduces a framework for automatic simulation algorithm selection and

  14. Performance Evaluation of Platelet Count by the Sysmex XN-3000 Automatic Hematology Analyzer

    Institute of Scientific and Technical Information of China (English)

    张琴; 廖扬; 石玉玲

    2014-01-01

Objective: To study the platelet (PLT) counting performance of the Sysmex XN-3000 automatic hematology analyzer. Methods: PLT was counted on the XN-3000 by the electrical impedance method (PLT-I), the optical method (PLT-O) and the fluorescence method (PLT-F), and the three methods were assessed for precision, linearity range, carryover rate and interference from red-cell fragments. The correlation between the XN-3000 PLT counts and flow cytometry (FCM) using an anti-CD61 monoclonal antibody was also examined. Results: Compared with PLT-I and PLT-O, PLT-F showed the best repeatability and the highest precision. All three methods gave good results for dilution linearity and carryover. In the red-cell fragment interference experiment, PLT-F showed the strongest resistance to interference (all P>0.05), followed by PLT-O. Correlation analysis showed that all three methods correlated well with FCM, with PLT-F the most accurate, especially in samples with low PLT counts. Conclusion: Routine clinical specimens can be counted on the Sysmex XN-3000 with PLT-I or PLT-O. For hemolysed specimens or abnormal PLT counts, review with the PLT-F method is recommended and, if necessary, counting by microscopy or flow cytometry.

  15. R&D of an Automatic Testing System for the Overall Performance of New Types of Motors

    Institute of Scientific and Technical Information of China (English)

    卢慧芬; 卢荻; 沈若凡; 许越华; 赵建勇

    2016-01-01

This paper describes the integration of a WT1600 digital power meter and a JN338A torque transducer into a system for testing the overall performance of new types of motors, which can measure characteristic motor parameters such as voltage, current, torque, speed and power. The testing system effectively combines hardware (an industrial PC, a data-acquisition card and sensors) with application software for data collection and analysis and a graphical user interface, achieving good automatic measurement of the characteristic-point data of new motors, such as input and output parameters, and improving testing efficiency and accuracy. Application results show that the testing system is easy to operate, safe and reliable; it meets the requirements of present-day motor testing and has engineering value. The system has been successfully deployed on the motor and electrical-apparatus platform of the College of Electrical Engineering at Zhejiang University.

  16. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Directory of Open Access Journals (Sweden)

    Emily E Butler

    Full Text Available Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243, the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness and with disorders of social cognition (autistic-like and schizotypal traits. Further personality traits (narcissism and empathy were assessed in a subsample of participants (N=57. Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.

  17. Hardware Design of an On-chip Debug System for the 8051 MCU

    Institute of Scientific and Technical Information of China (English)

    肖哲靖; 徐静平; 雷青松; 钟德刚

    2011-01-01

This paper presents an on-chip debug system for an industrial 8051 MCU, designed with an ASIC flow so that all debug functions are integrated into a single chip. The debug system can not only halt the 8051, resume it, single-step it or skip an instruction, but can also read and write all registers, the internal and external program memories, the data memories and the SFRs, and set hardware breakpoints in them. The debug system connects to the computer through a three-wire interface that occupies less space than the standard JTAG interface. The design was functionally verified on a Xilinx xc3s400 FPGA, and place-and-route was completed with the SMIC 0.18 μm technology library. The results show that the system avoids the drawbacks of traditional debugging based on software simulation or an emulator, saves users the cost of a commercial emulator, reduces debugging cost, and improves debugging efficiency. The proposed design method is also applicable to on-chip debug systems for other microcontrollers.

  18. Automatic Software Install/Update for Embedded Linux

    Institute of Scientific and Technical Information of China (English)

    TAO Li; HUANG Pei-wei

    2008-01-01

Linux has an automount feature in its kernel: filesystems can be mounted and unmounted automatically, much as in MS Windows, and the feature works well where resources such as system memory must be conserved and operation automated. Building on this feature in an embedded Linux environment, an approach is proposed for installing and updating software automatically on an embedded platform. The configuration files related to this feature are introduced, and a practical example is given in which the approach installs from a universal serial bus (USB) memory disk whenever the disk becomes available.
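
The automount behaviour described is typically driven by autofs map files. A minimal sketch follows; the mount point, device path, filesystem type, and timeout are illustrative assumptions, not values taken from the paper:

```text
# /etc/auto.master: have the automounter manage /misc,
# unmounting entries after 5 seconds of inactivity
/misc   /etc/auto.misc  --timeout=5

# /etc/auto.misc: mount the USB memory disk read-only
# on demand when /misc/usb is first accessed
usb     -fstype=vfat,ro         :/dev/sda1
```

An install/update script then only needs to touch `/misc/usb`: the kernel automounter mounts the disk on demand and releases it again after the timeout, with no explicit mount/umount calls in the script.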

  19. Image feature meaning for automatic key-frame extraction

    Science.gov (United States)

    Di Lecce, Vincenzo; Guerriero, Andrea

    2003-12-01

Video abstraction and summarization, required in several applications, have driven a number of research efforts on automatic video analysis techniques. Automatic video analysis is based on recognizing shots, short sequences of contiguous frames that depict the same scene, and key frames representing the salient content of each shot. Since effective shot-boundary detection techniques already exist in the literature, in this paper we focus our attention on key-frame extraction techniques that identify the low-level visual features of the frames that best represent the shot content. To evaluate the features' performance, key frames automatically extracted using these features are compared with human operators' video annotations.
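
One simple instance of key-frame extraction from low-level features is to track a colour-histogram distance within a shot and emit a new key frame whenever the content drifts far enough from the last one. A toy sketch, in which the histogram representation and threshold are illustrative assumptions:

```python
def key_frames(histograms, threshold):
    """Pick key frames from a shot: keep frame 0, then add any frame whose
    colour-histogram L1 distance from the most recent key frame exceeds
    the threshold. histograms is a list of equal-length numeric vectors."""
    keys = [0]
    for i in range(1, len(histograms)):
        last = histograms[keys[-1]]
        dist = sum(abs(a - b) for a, b in zip(histograms[i], last))
        if dist > threshold:
            keys.append(i)
    return keys
```

Evaluating such a feature, as the paper does, amounts to comparing the indices this returns against the frames a human annotator marked as salient.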

  20. Application of PLC Industrial Communication Networks in an Automatic Assembly Line

    Institute of Scientific and Technical Information of China (English)

    陈英

    2013-01-01

This paper adopts the Siemens S7-200 PLC as the main controller of an automatic assembly line; the PLC's integrated RS485 communication port provides PPI network communication. On the basis of this capability, the hardware connection and commissioning of the PPI network were carried out, the PPI network parameters were set and debugged, and programs for reading and writing data over a complex PPI network of several Siemens S7-200 PLCs were written and debugged.

  1. Automatic graphene transfer system for improved material quality and efficiency

    OpenAIRE

    Alberto Boscá; Jorge Pedrós; Javier Martínez; Tomás Palacios; Fernando Calle

    2015-01-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The proce...

  2. Automatic learning strategies and their application to electrophoresis analysis

    OpenAIRE

    Roch, Christian Maurice; Pun, Thierry; Hochstrasser, Denis; Pellegrini, Christian

    1989-01-01

Automatic learning plays an important role in image analysis and pattern recognition. A taxonomy of automatic learning strategies is presented; this categorization is based on the amount of inference the learning element must perform to bridge the gap between the environmental and system knowledge representation levels. Four main categories are identified and described: rote learning, learning by deduction, learning by induction, and learning by analogy. An application of learning by induction to...

  3. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    and achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented...... work allows to minimize cycle time as well as setup cost, which are essential factors in automatic bin-picking. It therefore leads to a wider applicability of bin-picking in industry....

  4. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for the weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercial robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it requires a long inspection time and is hard to handle and maintain. To resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. Following our conceptual design, we built a reactor inspection system comprising an underwater inspection robot, a laser position-control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor-vessel mockup and on a real reactor vessel prepared for Ulchin nuclear power plant unit 6 at Doosan Heavy Industries in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction in inspection time, enhanced performance, automatic management of the inspection history, and so on. From an economic point of view, we can also expect import substitution of more than 4 million dollars. The essential technologies established for intelligent control and automation are expected to be broadly applied to the automation of similar systems in nuclear power plants

  5. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity.This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  6. The Automatic Telescope Network (ATN)

    CERN Document Server

    Mattox, J R

    1999-01-01

Because of the scheduled GLAST mission by NASA, there is strong scientific justification for preparation for very extensive blazar monitoring in the optical bands to exploit the opportunity to learn about blazars through the correlation of variability of the gamma-ray flux with flux at lower frequencies. Current optical facilities do not provide the required capability. Developments in technology have enabled astronomers to readily deploy automatic telescopes. The effort to create an Automatic Telescope Network (ATN) for blazar monitoring in the GLAST era is described. Other scientific applications of the networks of automatic telescopes are discussed. The potential of the ATN for science education is also discussed.

  7. Automatic Prosodic Break Detection and Feature Analysis

    Institute of Scientific and Technical Information of China (English)

    Chong-Jia Ni; Ai-Ying Zhang; Wen-Ju Liu; Bo Xu

    2012-01-01

Automatic prosodic break detection and annotation are important for both speech understanding and natural speech synthesis. In this paper, we discuss automatic prosodic break detection and feature analysis. The contributions of the paper are two-fold. First, we use a classifier-combination method to detect Mandarin and English prosodic breaks using acoustic, lexical and syntactic evidence. Our proposed method achieves better performance on both the Mandarin prosodic annotation corpus, the Annotated Speech Corpus of Chinese Discourse, and the English prosodic annotation corpus, the Boston University Radio News Corpus, when compared with the baseline system and other published experimental results. Second, we analyse the features used for prosodic break detection: the roles of features such as duration, pitch, energy, and intensity are analysed and compared in Mandarin and English prosodic break detection. Based on this feature analysis, we also verify some linguistic conclusions.
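
Classifier combination in this setting can be as simple as a weighted vote over the individual detectors' per-boundary decisions. This is an illustrative sketch, not the paper's actual combiner; the equal default weights are an assumption:

```python
def combine(votes, weights=None):
    """Weighted majority vote for prosodic break detection: each classifier
    emits 1 (break) or 0 (no break) per word boundary; votes is a list of
    such decision vectors, one per classifier. Returns the fused decisions."""
    n = len(votes[0])
    weights = weights or [1.0] * len(votes)
    out = []
    for j in range(n):
        score = sum(w * v[j] for w, v in zip(weights, votes))
        # declare a break when the weighted vote exceeds half the total weight
        out.append(1 if score * 2 > sum(weights) else 0)
    return out
```

Weighting the acoustic, lexical, and syntactic classifiers by their held-out accuracies is one natural refinement of this scheme.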

  8. Automatic stereoscopic system for person recognition

    Science.gov (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.

    1999-06-01

    A biometric access control system based on identification of the human face is presented. The system developed performs remote measurements of the necessary face features. Two different scenarios of system behavior are implemented. The first assumes verification of personal data entered by the visitor from a console using a keyboard or card reader; the system functions as an automatic checkpoint that strictly controls the access of different visitors. The other scenario makes it possible to identify visitors without any personal identifier or pass: only the person's biometrics are used to identify the visitor, and the recognition system automatically finds the necessary identification information stored beforehand in the database. Two laboratory models of the recognition system were developed. The models are designed to use different information types and sources: in addition to stereoscopic images input to the computer from cameras, the models can use voice data and some physical characteristics, such as the person's height measured by the imaging system.

  9. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulatory items. Compared to manual ultrasonic inspection, automatic inspection offers more accurate and reliable results and reduces radiation exposure. The final objective of this project is to develop an automatic inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, together with a robot control computer that makes the robot navigate exactly along the inspection path. We expect our system to contribute to reduced inspection time, enhanced performance, and effective management of inspection results. After field tests, the system developed in this project can be used for practical inspection work. (author)

  10. Automatic line generalization using zero-crossings

    Science.gov (United States)

    Thapa, K.

    1988-01-01

    The problem of automating the process of line generalization has been very difficult. It has not been solved yet despite the concerted effort of many private firms as well as government agencies: no algorithm exists that can automatically perform this process when there is a drastic change in scale between the original and generalized maps. In this paper, an algorithm that successfully generalizes lines automatically from any large scale to any small scale is presented. The algorithm achieves different levels of smoothing of the line while preserving its overall shape. The results are compared with those obtained by manual methods, and it was found that the results obtained by the algorithm are very close to those obtained by cartographers using manual methods.

  11. Automatic Palette Identification of Colored Graphics

    Science.gov (United States)

    Lacroix, Vinciane

    The median-shift, a new clustering algorithm, is proposed to automatically identify the palette of colored graphics, a prerequisite for graphics vectorization. The median-shift is an iterative process which shifts each data point to the "median" point of its neighborhood, defined by a distance measure and a maximum radius, the only parameter of the method. The process is viewed as a graph transformation which converges to a set of clusters made of one or several connected vertices. As palette identification depends on color perception, the clustering is performed in the L*a*b* feature space. As pixels located on edges have mixed colors not expected to be part of the palette, they are removed from the initial data set by an automatic pre-processing step. Results are shown on scanned maps and on the Macbeth color chart and compared to well-established methods.
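
    The shift-to-median iteration described above can be sketched in a few lines; this is a minimal illustration of the idea, not the author's implementation, and the toy 3-D "colour" points and the radius value are invented:

```python
import math

def median_shift(points, radius, max_iter=50):
    """Iteratively shift every point to the component-wise median of its
    neighbourhood (all points of the current configuration within
    `radius`), then group points that converged to the same location."""
    def median(vals):
        s = sorted(vals)
        n = len(s)
        return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

    current = [tuple(p) for p in points]
    for _ in range(max_iter):
        nxt = []
        for p in current:
            neigh = [q for q in current if math.dist(p, q) <= radius]
            nxt.append(tuple(median([q[d] for q in neigh])
                             for d in range(len(p))))
        if nxt == current:          # converged: nobody moved
            break
        current = nxt

    # points whose fixed points lie within `radius` of each other
    # receive the same cluster label
    centres, labels = [], []
    for p in current:
        for i, c in enumerate(centres):
            if math.dist(p, c) <= radius:
                labels.append(i)
                break
        else:
            centres.append(p)
            labels.append(len(centres) - 1)
    return labels

# two well-separated "colour" blobs in a 3-D feature space
data = [(0, 0, 0), (1, 0, 0), (0, 1, 0),
        (10, 10, 10), (11, 10, 10), (10, 11, 10)]
labels = median_shift(data, radius=3.0)
```

    With the single radius parameter, points that converge to nearby fixed points end up in the same cluster, mirroring the connected-vertices view described in the abstract.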

  12. Automatic Phonetic Transcription for Danish Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    to acquire and expensive to create. For languages with productive compounding or agglutinative languages like German and Finnish, respectively, phonetic dictionaries are also hard to maintain. For this reason, automatic phonetic transcription tools have been produced for many languages. The quality...... of automatic phonetic transcriptions vary greatly with respect to language and transcription strategy. For some languages where the difference between the graphemic and phonetic representations are small, graphemic transcriptions can be used to create ASR systems with acceptable performance. In other languages...... representations, e.g. morphological analysis, decompounding, letter-to-sound rules, etc. Two different phonetic transcribers for Danish will be compared in this study: eSpeak (Duddington, 2010) and Phonix (Henrichsen, 2014). Both transcribers produce a richer transcription than ASR can utilise such as stress...

  13. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  14. On-line Control Retrofit and Debugging of Two Washing Machines

    Institute of Scientific and Technical Information of China (English)

    高锦南

    2013-01-01

    This paper introduces the retrofit scheme and the debugging process that enable two washing machines at a printing and dyeing enterprise to be controlled either on-line as a pair or as single machines. On-line control of the two washing machines greatly shortens the time needed to wash a product twice and effectively improves cloth-washing efficiency, so that the washing process runs smoothly and achieves the desired process result.

  15. Analyzing and Debugging C-Language Pointer Errors

    Institute of Scientific and Technical Information of China (English)

    许永达

    2013-01-01

    Some pointer errors in C programs are not easily found at the compile stage, and current teaching materials, which focus mainly on concepts and theory, do not describe them sufficiently. This article aims at preventing such errors by analyzing them in sample programs, debugging them in Visual C++ 6.0, showing how the errors manifest, analyzing their causes, and proposing the correct way to use pointers.

  16. Debugging Software Design for Pure Electric Vehicle Motors

    Institute of Scientific and Technical Information of China (English)

    李豹; 张云; 朱孟美; 孔辉

    2013-01-01

    To make the initial commissioning of pure electric vehicle motors more convenient, motor-debugging software was developed in LabVIEW. The software integrates motor parameter and characteristic measurement, PI parameter tuning, motor operating-parameter setting, result analysis, data processing, and display. Its main function is to implement communication between the PC and the motor drive, handling the transmission, reception, and display of data.

  17. Automatic programming of simulation models

    Science.gov (United States)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to, and is achieved by a specification language that is more natural to the user's problem domain and to the user's way of thinking about and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools that assist the modeler in defining or constructing a model of the system and then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.

  18. Analysis and Debugging of Sleeper Distance Deviation for SVM1000 Tracklayer Units After Overhauling

    Institute of Scientific and Technical Information of China (English)

    毕建设

    2015-01-01

    After overhaul and modification of the SVM1000 tracklayer units, a sleeper-distance error was found during on-site debugging that led to a large accumulated deviation. Influence factors such as construction-surface slope, ambient temperature, travelling speed, the rotary encoder, and the control program were analyzed, and the program was finally identified as the main factor. This deviation analysis and debugging can serve as a reference for similar problems.
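
    The accumulation effect at the heart of the problem can be illustrated with a short sketch; the 0.6 m nominal spacing and the constant +2 mm per-sleeper error below are invented for illustration, not values from the article:

```python
def cumulative_deviation(spacings, nominal=0.6):
    """Running accumulated deviation (in metres) of measured sleeper
    spacings from the nominal spacing. Small same-sign errors per sleeper
    add up over a long laying run, which is why the accumulated value
    matters more than any single spacing."""
    total, acc = 0.0, []
    for s in spacings:
        total += s - nominal
        acc.append(total)
    return acc

# a constant +2 mm error per sleeper accumulates to about +20 mm
# after 10 sleepers
acc = cumulative_deviation([0.602] * 10)
```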

  19. Automatic Number Plate Recognition System

    OpenAIRE

    Rajshree Dhruw; Dharmendra Roy

    2014-01-01

    Automatic Number Plate Recognition (ANPR) is a mass surveillance system that captures images of vehicles and recognizes their license numbers. The objective is to design an efficient automatic authorized-vehicle identification system using Indian vehicle number plates. In this paper we discuss different methodologies for number plate localization, character segmentation, and recognition of the number plate. The system is mainly applicable to non-standard Indian number plates by recognizing...

  20. Evaluation of the performance of the PRECIL LBY-NJ4A automatic platelet aggregation analyzer

    Institute of Scientific and Technical Information of China (English)

    石冬敏; 吴元健; 马伟

    2012-01-01

    Objective To evaluate the performance of the PRECIL LBY-NJ4A automatic platelet aggregation analyzer (NJ4A). Methods Blood samples were collected into 109 mmol/L sodium citrate vacuum tubes, and platelet-rich plasma (PRP) and platelet-poor plasma (PPP) were separated. The NJ4A with its matching quality-control material, agonists, and cleaning solution was used to measure the maximum platelet aggregation rate (MAR) and to test precision, channel consistency, uncertainty, detection limit, and interference. Results The within-run coefficient of variation (CV) ranged from 3.4% to 5.0%; the means measured on the four channels showed no statistically significant difference (P>0.05); the carryover rate was 2.82%; the within-run CV was ≤3.5%, the between-run CV ≤4.2%, and the total CV ≤3.8%; the total error ranged from 5.6% to 10.3%, and the uncertainty was within the acceptable range. Measured values of the two levels of quality-control material did not differ significantly from their target values (P>0.05), and results from the NJ4A correlated well with those from the LBY-NJ2 (r=0.998). Conclusion The precision, accuracy, uncertainty, sensitivity, carryover rate, and interference resistance of the NJ4A meet CLSI requirements, and the analyzer is suitable for clinical use.

  1. Performance evaluation of the Roche Cobas c701 fully automatic biochemical analyzer

    Institute of Scientific and Technical Information of China (English)

    邓小玲; 侯玉磊; 陈特; 毕小云

    2015-01-01

    Objective To assess the performance of the Roche Cobas c701 fully automatic biochemical analyzer. Methods Following guideline EP15-A2 from the Clinical and Laboratory Standards Institute, the electrolytes (potassium, sodium and chloride) and nine analytes covering all wavelengths (alanine aminotransferase, aspartate aminotransferase, alkaline phosphatase, gamma-glutamyl transferase, creatinine, urea nitrogen, glucose, total protein and triglycerides) were measured on the Cobas c701 with original reagents, and the precision and accuracy of all parameters were verified. Results At both levels of the tested parameters, the repeatability standard deviation (Sr) was less than or equal to the manufacturer's repeatability claim (σr), and the within-laboratory standard deviation (St) was less than or equal to the manufacturer's claim (σt); precision was therefore acceptable and consistent with the manufacturer's declaration. Correlation between theoretical and measured values was good (regression coefficients 0.9994 to 1.0000). Compared with the external quality assessment of the Ministry of Health clinical inspection center, the bias of all parameters was acceptable (within the scope prescribed by CLIA'88). Conclusion The repeatability, precision and accuracy of the parameters measured on the Roche Cobas c701 reach the performance that the manufacturer declares.
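
    The Sr/σr and St/σt comparisons can be sketched as follows; this is a simplified illustration of an EP15-A2-style precision check (pooled within-day SD plus a between-day component), not the full protocol, and the replicate data and claims are invented:

```python
import statistics as stats

def precision_check(runs, claimed_sr, claimed_st):
    """Simplified EP15-A2-style precision verification: `runs` holds one
    list of replicate results per day. The repeatability SD (Sr) is the
    pooled within-day SD; the within-laboratory SD (St) adds a
    between-day variance component. Verification passes when each
    estimate is <= the manufacturer's claimed value."""
    n = len(runs[0])                                      # replicates per day
    sr2 = stats.mean([stats.variance(r) for r in runs])   # pooled within-day
    sb2 = stats.variance([stats.mean(r) for r in runs])   # variance of day means
    between = max(sb2 - sr2 / n, 0.0)                     # between-day component
    sr, s_total = sr2 ** 0.5, (sr2 + between) ** 0.5
    return sr, s_total, sr <= claimed_sr and s_total <= claimed_st

# three days, three replicates per day; claims are invented for illustration
runs = [[5.0, 5.1, 5.0], [5.1, 5.0, 5.1], [5.0, 5.1, 5.1]]
sr, s_total, ok = precision_check(runs, claimed_sr=0.2, claimed_st=0.3)
```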

  2. A Test Suite for High-Performance Parallel Java

    OpenAIRE

    Hauser, Jochem; Ludewig, Thorsten; Williams, Roy D.; Winkelmann, Ralf; Gollnick, Torsten; Brunett, Sharon; Muylaert, Jean

    1999-01-01

    The Java programming language has a number of features that make it attractive for writing high-quality, portable parallel programs. A pure object formulation, strong typing and the exception model make programs easier to create, debug, and maintain. The elegant threading provides a simple route to parallelism on shared-memory machines. Anticipating great improvements in numerical performance, this paper presents a suite of simple programs that indicate how a pure Java Navier-Stokes solver mi...

  3. Automatic Music Transcription

    Science.gov (United States)

    Klapuri, Anssi; Virtanen, Tuomas

    Written musical notation describes music in a symbolic form that is suitable for performing a piece using the available musical instruments. Traditionally, musical notation indicates the pitch, target instrument, timing, and duration of each sound to be played. The aim of music transcription either by humans or by a machine is to infer these musical parameters, given only the acoustic recording of a performance.

  4. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  5. SEMANTIC INTEGRATION FOR AUTOMATIC ONTOLOGY MAPPING

    Directory of Open Access Journals (Sweden)

    Siham AMROUCH

    2013-11-01

    Full Text Available In the last decade, ontologies have played a key technological role for information sharing and agent interoperability in different application domains. In the semantic web domain, ontologies are efficiently used to face the great challenge of representing the semantics of data, in order to bring the actual web to its full power and hence achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To meet this requirement, mapping ontologies is a solution that cannot be avoided. Indeed, ontology mapping builds a meta layer that allows different applications and information systems to access and share their information, of course after resolving the different forms of syntactic, semantic and lexical mismatches. In the contribution presented in this paper, we have integrated the semantic aspect based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character is the main difference between our contribution and most existing semi-automatic ontology mapping algorithms, such as Chimaera, Prompt, Onion and Glue. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules: the former analyzes the concepts' names and the latter analyzes their properties. Each of these two sub-modules is itself based on the combination of lexical and semantic similarity measures.
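
    The two-sub-module mapping-discovery step can be sketched as follows; the tiny SYNONYMS table merely stands in for a real WordNet lookup, and the weights and threshold are invented for illustration:

```python
from difflib import SequenceMatcher

# Hand-made stand-in for a WordNet synonym lookup -- illustrative only;
# the algorithm described above queries the real WordNet resource.
SYNONYMS = {
    "car": {"automobile", "auto"},
    "automobile": {"car", "auto"},
    "person": {"human", "individual"},
    "human": {"person", "individual"},
}

def lexical_sim(a, b):
    """String similarity of the two concept names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_sim(a, b):
    """1.0 when the names are identical or synonyms, else 0.0."""
    a, b = a.lower(), b.lower()
    return 1.0 if a == b or b in SYNONYMS.get(a, set()) else 0.0

def property_sim(props_a, props_b):
    """Jaccard overlap of the two concepts' property-name sets."""
    pa, pb = set(props_a), set(props_b)
    return len(pa & pb) / len(pa | pb) if pa | pb else 0.0

def concept_match(name_a, name_b, props_a, props_b,
                  w_name=0.6, w_props=0.4, threshold=0.7):
    """Combine the name sub-module (best of lexical and semantic
    similarity) with the property sub-module; the weights and the
    acceptance threshold are invented for this sketch."""
    name_score = max(lexical_sim(name_a, name_b), semantic_sim(name_a, name_b))
    score = w_name * name_score + w_props * property_sim(props_a, props_b)
    return score, score >= threshold

score, ok = concept_match("Car", "Automobile",
                          ["speed", "colour"], ["speed", "colour", "make"])
```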

  6. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images due to imaging artifacts that complicate detection and measurement of suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and show a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods; however, considerable user interaction may be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach that improves the boundary-searching ability of the live-wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmenting 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  7. Automatic basal slice detection for cardiac analysis

    Science.gov (United States)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step in measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods at times identify the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice is detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. Of the 51 successfully tested samples, 92% and 84% of detection results were accurate at the end-systolic and end-diastolic phases of the cardiac cycle, respectively.

  8. Variable-mass Thermodynamics Calculation Model for Gas-operated Automatic Weapon

    Institute of Scientific and Technical Information of China (English)

    陈建彬; 吕小强

    2011-01-01

    Because energy and mass exchange occurs between the barrel and the gas-operated device of an automatic weapon, a new variable-mass thermodynamics model is built to describe its interior ballistics and the dynamic characteristics of the gas-operated device accurately. The model is used to calculate the automatic-mechanism velocity of a certain automatic weapon; the calculated results agree well with the experimental results, validating the model. The influences of structure parameters on the gas-operated device's dynamic characteristics are discussed. The model is shown to be valuable for the design and accurate performance prediction of gas-operated automatic weapons.

  9. COMMISSIONING AND DETECTOR PERFORMANCE GROUPS (DPG)

    CERN Multimedia

    A. Ryd and T. Camporesi

    2010-01-01

    Commissioning and Run Coordination activities After the successful conclusion of the LHC pilot run commissioning in 2009 activities at the experiment restarted only late in January due to the cooling and detector maintenance. As usual we got going with weekly exercises used to deploy, debug, and validate improvements in firmware and software. A debriefing workshop aimed at analyzing the operational aspects of the 2009 pilot run was held on Jan. 15, 2009, to define a list of improvements (and relative priorities) to be planned. In the last month, most of the objectives set in the debriefing workshop have been attained. The major achievements/improvements obtained are the following: - Consolidation of the firmware for both readout and trigger for ECAL - Software implementation of procedures for raising the bias voltage of the silicon tracker and pixel driven by LHC mode changes with automatic propagation of the state changes from the DCS to the DAQ. The improvements in the software and firmware allow suppress...

  10. Performance comparison between a domestic automatic enzyme immunoassay analyzer and the Tecan Fame ELISA system

    Institute of Scientific and Technical Information of China (English)

    包建玲; 唐婧; 孟存仁; 张朝霞

    2014-01-01

    Objective To compare the performance of the Addcare automatic enzyme-linked immunosorbent assay (ELISA) analyzer with that of the Tecan Fame ELISA system, and to assess the feasibility of the Addcare analyzer for clinical use. Methods The pipetting needles, washers and microplate readers of the two systems were evaluated. A gravimetric method was used to measure pipetting errors and washer residues; distilled water, a methyl orange colorimetric system, potassium dichromate and p-nitrophenol were used to test the microplate readers' zero-point drift, channel differences, sensitivity and accuracy. Eighty specimens were tested for hepatitis C virus antibody to determine the total coincidence rate of the two systems. Results For a 10 μL dispense, the mean volume was 10.164 μL with a total CV of 3.91% on the Addcare system, and 10.223 μL with a total CV of 2.96% on the Tecan system. Zero-drift absorbance values of both systems were within 0.0236 ± 0.0038, channel differences within ±0.0029, and well-to-well differences within ±0.0014. Washer residues were within 0.4 μL for the Addcare system and within 0.6 μL for the Fame system. The total coincidence rate of the two systems for hepatitis C virus antibody testing of the 80 samples was 100%. Conclusion The performance of both systems is stable and their results are consistent, meeting clinical needs.

  11. Simulative debugging method for serial communication under the Keil C environment

    Institute of Scientific and Technical Information of China (English)

    朱艳萍; 邹应全; 廖建辉

    2012-01-01

    To meet the needs of practical microcontroller teaching, this paper presents the specific method and steps for simulating serial communication between a single-chip microcomputer (SCM) and a PC in the Keil C51 software environment. In the simulation experiment, the serial program sends a two-digit hexadecimal value, converts it to decimal, and displays it on a four-digit LED display. Comparing the simulated I/O port data obtained by single-stepping in software with the actual serial-port communication results shows that they are consistent. The experiment familiarizes students with debugging in the microcontroller software development environment and deepens their understanding of serial communication and dynamic display.
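
    The conversion at the core of the experiment, one received byte of two hex digits turned into the four decimal display digits, can be mirrored on the host side; the real program runs on the 8051 under Keil C51, so this stand-alone sketch only illustrates the arithmetic:

```python
def to_display_digits(byte_value):
    """Convert one received byte (two hex digits, 0x00-0xFF) into the four
    decimal digits shown on a 4-digit LED display, most significant digit
    first, with leading zeros as a real display would show them."""
    if not 0 <= byte_value <= 0xFF:
        raise ValueError("expected a single byte (0x00-0xFF)")
    return [byte_value // 1000,        # always 0 for a single byte
            byte_value // 100 % 10,
            byte_value // 10 % 10,
            byte_value % 10]

digits = to_display_digits(0xFF)   # 255 -> [0, 2, 5, 5]
```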

  12. Automatic monitoring system for ''F'' installation

    International Nuclear Information System (INIS)

    The design and operation procedure of the first part of the automatic radiation monitoring system of the Laboratory of Nuclear Problems, JINR (the ''F'' installation), are described. The system consists of 50 data-measuring lines, of which 30 are used for monitoring by means of radiation detectors, 12 control the state of branch circuits, and the others give auxiliary information on accelerator performance. The data are handled and registered every few seconds by a crate controller with a built-in microcomputer. The monitoring results are output on a special light panel, via sound signalling, and on a printer.

  13. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple...... trials of cross-validation. Second, we test the robustness of each system to spectral equalization. Finally, we test how well human subjects recognize the genres of music excerpts composed by each system to be highly genre representative. Our results suggest that neither high-performing system has...... a capacity to recognize music genre....

  14. All-optical automatic pollen identification: Towards an operational system

    Science.gov (United States)

    Crouzy, Benoît; Stella, Michelle; Konzelmann, Thomas; Calpini, Bertrand; Clot, Bernard

    2016-09-01

    We present results from the development and validation campaign of an optical pollen monitoring method based on time-resolved scattering and fluorescence. Focus is first set on supervised learning algorithms for pollen-taxa identification and on the determination of aerosol properties (particle size and shape). The identification capability provides a basis for a pre-operational automatic pollen season monitoring performed in parallel to manual reference measurements (Hirst-type volumetric samplers). Airborne concentrations obtained from the automatic system are compatible with those from the manual method for total pollen, and the automatic device provides real-time data reliably (one week of interruption over five months). In addition, although the calibration dataset still needs to be completed, we are able to follow the grass pollen season. The high sampling rate of the automatic device allows us to go beyond the commonly presented daily values and obtain statistically significant hourly concentrations. Finally, we discuss the remaining challenges in obtaining an operational automatic monitoring system and how the generic validation environment developed for the present campaign could be used for further tests of automatic pollen monitoring devices.
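
    Aggregating high-rate counts into hourly concentrations is straightforward; the sketch below assumes per-minute counts and a constant sampled air volume per minute, both invented for illustration and not taken from the campaign described above:

```python
def hourly_concentrations(counts_per_minute, litres_per_minute=5.0):
    """Aggregate per-minute pollen counts into a mean concentration
    (grains per cubic metre) for every full hour of data; partial hours
    at the end of the record are dropped."""
    hours = []
    for start in range(0, len(counts_per_minute) - 59, 60):
        grains = sum(counts_per_minute[start:start + 60])
        m3 = 60 * litres_per_minute / 1000.0    # litres sampled -> m^3
        hours.append(grains / m3)
    return hours

# two hours of synthetic data: 1 grain/min, then 4 grains/min
conc = hourly_concentrations([1] * 60 + [4] * 60)
```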

  15. Debugging and Improvement of the Combined Control and Automatic Regulation of Oil-feed Pumps

    Institute of Scientific and Technical Information of China (English)

    唐立平

    2003-01-01

    The 4×300 MW unit project at Qianbei power plant has a fuel-oil pump house with three oil-feed pumps supplying fuel oil to four boilers; the system uses the SIEMENS PCS7 process control system. During commissioning we found that the logic implemented by the PLC manufacturer according to the control requirements provided by the Southwest Electric Power Design Institute caused the oil-feed pumps to run unstably in actual operation, a safety hazard endangering unit operation. After repeated modification and debugging, the logic was adapted to the on-site operating requirements.

  16. Study on Debugging and Operation of the ARN-type Automatic Tuning Neutralizer in a 10 kV System

    Institute of Scientific and Technical Information of China (English)

    陈三运; 伍昌庭

    2000-01-01

    For the ARN-type automatic tuning arc-suppression devices installed in the 10 kV systems of two 110 kV substations in the Yichang urban area, this paper introduces the working principle of the automatic tuning system and its commissioning. Typical cases from operation after commissioning are analyzed, improvement measures are proposed, and the results of their implementation are described.

  17. Automatic Melody Segmentation

    OpenAIRE

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation analysis is a widespread practice among musicians: performers use it to help them memorise pieces, music theorists and historians use it to compare works, music students use it to understand the composi...

  18. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  19. Algorithms for skiascopy measurement automatization

    Science.gov (United States)

Fomins, Sergejs; Trukša, Renārs; Krūmiņa, Gunta

    2014-10-01

An automatic dynamic infrared retinoscope was developed, which allows the procedure to run at a much higher rate. Our system uses a USB image sensor with up to 180 Hz refresh rate, equipped with a long-focus objective and an 850 nm infrared light-emitting diode as the light source. Two servo motors driven by a microprocessor control the rotation of the semitransparent mirror and the motion of the retinoscope chassis. The image of the eye pupil reflex is captured via software and analyzed along the horizontal plane. An algorithm for automatic accommodative-state analysis is developed based on the intensity changes of the fundus reflex.

  20. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.

  1. On-line current feed and computer aided control tactics for automatic balancing head

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

In the designed automatic balancing head, a non-contact induction transformer is used to deliver driving energy, solving the problem of on-line current feed and control. Computer-controlled automatic balancing experiments with phase-magnitude control tactics were performed on a flexible rotor system. The results of the experiments prove that the energy-feeding method and the control tactics are effective in the automatic balancing head for vibration control.

  2. Performative

    DEFF Research Database (Denmark)

    Sack-Nielsen, Torsten

    2015-01-01

The article describes the potential of building skins being climate-adaptive. The principle of folding, and the relation between form and performance of facades are discussed here.

  3. High Performance LINUX Clusters With OSCAR, Rocks, OpenMosix & MPI

    CERN Document Server

    Sloan, Joseph D

    2005-01-01

    This new guide covers everything you need to plan, build, and deploy a high-performance Linux cluster. You'll learn about planning, hardware choices, bulk installation of Linux on multiple systems, and other basic considerations. Learn about the major free software projects and how to choose those that are most helpful to new cluster administrators and programmers. Guidelines for debugging, profiling, performance tuning, and managing jobs from multiple users round out this immensely useful book

  4. Automatic detection of microcalcifications with multi-fractal spectrum.

    Science.gov (United States)

    Ding, Yong; Dai, Hang; Zhang, Hang

    2014-01-01

To improve the detection of microcalcifications (MCs), this paper proposes an automatic MC detection system based on the multi-fractal spectrum in digitized mammograms. The system rests on the principle that normal tissues possess certain fractal properties which change in the presence of MCs. The multi-fractal spectrum is applied to reveal these fractal properties. By quantifying the deviations between the multi-fractal spectrums of normal tissues and MCs, the system can identify MCs through the altered fractal properties and finally locate their position. The performance of the proposed system is compared with leading automatic detection systems on a mammographic image database. Experimental results demonstrate that the proposed system is statistically superior to most of the compared systems. PMID:25227013

  5. Automatic Identification of Metaphoric Utterances

    Science.gov (United States)

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…

  6. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.

  7. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The used strategy is a combination of two methods: the maximum correlation coefficient and the correlation in the subpixel range...
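The maximum-correlation-coefficient step mentioned above can be illustrated with a small Python sketch of normalized cross-correlation template matching; this is an assumption-laden toy (function names and the toy image are invented for illustration), not the interactive measuring program described in the record:

```python
def ncc(patch, templ):
    """Normalized correlation coefficient between two equally sized patches."""
    n = len(templ) * len(templ[0])
    flat_p = [v for row in patch for v in row]
    flat_t = [v for row in templ for v in row]
    mp, mt = sum(flat_p) / n, sum(flat_t) / n
    num = sum((a - mp) * (b - mt) for a, b in zip(flat_p, flat_t))
    dp = sum((a - mp) ** 2 for a in flat_p) ** 0.5
    dt = sum((b - mt) ** 2 for b in flat_t) ** 0.5
    return num / (dp * dt) if dp and dt else 0.0

def locate_target(image, templ):
    """Slide the template over the image; return the offset maximizing NCC."""
    th, tw = len(templ), len(templ[0])
    best = (-2.0, (0, 0))
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            patch = [row[x:x + tw] for row in image[y:y + th]]
            best = max(best, (ncc(patch, templ), (y, x)))
    return best[1]

image = [[0, 0, 0, 0],
         [0, 9, 1, 0],
         [0, 1, 9, 0],
         [0, 0, 0, 0]]
templ = [[9, 1],
         [1, 9]]
print(locate_target(image, templ))  # → (1, 1)
```

Real systems refine this integer-pixel maximum with subpixel interpolation around the peak, which is the second step the abstract alludes to.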

  8. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.;

    2012-01-01

    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified including the pupil, iris, sclera, and eyelashes. Based on this segmentation, the ...

  9. Automatic Association of News Items.

    Science.gov (United States)

    Carrick, Christina; Watters, Carolyn

    1997-01-01

    Discussion of electronic news delivery systems and the automatic generation of electronic editions focuses on the association of related items of different media type, specifically photos and stories. The goal is to be able to determine to what degree any two news items refer to the same news event. (Author/LRW)

  10. Automatic milking : a better understanding

    NARCIS (Netherlands)

    Meijering, A.; Hogeveen, H.; Koning, de C.J.A.M.

    2004-01-01

    In 2000 the book Robotic Milking, reflecting the proceedings of an International Symposium which was held in The Netherlands came out. At that time, commercial introduction of automatic milking systems was no longer obstructed by technological inadequacies. Particularly in a few west-European countr

  11. 考虑光伏组件发电性能的自动除尘系统运行时间优化%Optimization of running time of automatic dedusting system considering generating performance of PV modules

    Institute of Scientific and Technical Information of China (English)

    郭枭; 澈力格尔; 韩雪; 田瑞

    2015-01-01

Low power-generation efficiency is one of the main obstacles to the large-scale application of PV (photovoltaic) modules, so studying its influencing factors is of great significance. This article describes an independently developed automatic dedusting system for PV modules, which has a simple structure, low installation cost and reliable operation, removes deposited dust continuously and effectively, and uses no water. The system has been applied in three settings: stand-alone PV power supplies operating at temperatures from −45℃ to 35℃, experimental tests of PV modules mounted at various angles, and a large-area PV power system. The dedusting effect of the automatic system was tested at temperatures from −10℃ to 5℃ in the stand-alone application. Using the automatic dedusting system, the dynamic occlusion during operation was simulated and its influence on the output parameters of the PV modules was studied; the dedusting effect was analyzed for different amounts of deposited dust, its variation with the deposition amount was summarized, and the opening time and running period were determined. The experimental PV modules were placed outdoors on open ground at an angle of 45° for 3, 7 and 20 days, giving dust deposition amounts of 0.1274, 0.2933 and 0.8493 g/m² respectively. The correction coefficient of the PV modules used in the experiments was 0.9943. The results show that, during the horizontal and vertical cleaning cycles, the output parameters of the PV modules (output power, current and voltage) change in a V-shaped pattern as the cleaning brush crosses a row of cells. Compared with the downward pass, the output parameters of the PV modules during the upward pass fluctuate

  12. 实时监控程序的实验室快速调试开发%Rapid Debugging and Development of Real-time Monitoring Program in Lab

    Institute of Scientific and Technical Information of China (English)

    蔡文斋

    2015-01-01

This paper proposes several simulation and debugging methods for developing real-time monitoring application software in telemetry, tracking and command (TT&C) projects, and mainly introduces a debugging window embedded in the monitoring program itself. This method uses an independent hexadecimal edit window that sends messages in place of a hardware sensor connected to the computer, fully simulating the hardware communication environment in the lab. It can flexibly supply, according to protocol, the fixed-length messages that the hardware sensor would send to the computer, so the whole monitoring system can be debugged without any hardware; once the hardware components are connected, protocol-level communication works immediately.

  13. Analysis of Phonetic Transcriptions for Danish Automatic Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    2013-01-01

Automatic speech recognition (ASR) relies on three resources: audio, orthographic transcriptions and a pronunciation dictionary. The dictionary or lexicon maps orthographic words to sequences of phones or phonemes that represent the pronunciation of the corresponding word. The quality of a speech....... The analysis indicates that transcribing e.g. stress or vowel duration has a negative impact on performance. The best performance is obtained with coarse phonetic annotation and improves performance by 1% word error rate and 3.8% sentence error rate....

  14. Fault Analysis and Phase Debugging Method of the Thyristor Charger%晶闸管充电机的故障分析及相位调试方法

    Institute of Scientific and Technical Information of China (English)

    杨伟珍

    2001-01-01

The voltage phase characteristics of the thyristor charger under different wiring schemes and the phase-shift characteristics of the thyristor rectification circuit are analyzed, and the voltage phasor diagram is drawn. The phase debugging method for the thyristor rectification circuit is introduced.

  15. Automatic image classification for the urinoculture screening.

    Science.gov (United States)

    Andreini, Paolo; Bonechi, Simone; Bianchini, Monica; Garzelli, Andrea; Mecocci, Alessandro

    2016-03-01

Urinary tract infections (UTIs) are considered to be the most common bacterial infection; it is estimated that about 150 million UTIs occur worldwide yearly, giving rise to roughly $6 billion in healthcare expenditures and resulting in 100,000 hospitalizations. Nevertheless, it is difficult to carefully assess the incidence of UTIs, since an accurate diagnosis depends both on the presence of symptoms and on a positive urinoculture, whereas in most outpatient settings this diagnosis is made without an ad hoc analysis protocol. On the other hand, in the traditional urinoculture test, a sample of midstream urine is put onto a Petri dish, where a growth medium favors the proliferation of germ colonies. Then, the infection severity is evaluated by visual inspection by a human expert, an error-prone and lengthy process. In this paper, we propose a fully automated system for urinoculture screening that can provide quick and easily traceable results for UTIs. Based on advanced image processing and machine learning tools, the infection type recognition, together with the estimation of the bacterial load, can be automatically carried out, yielding accurate diagnoses. The proposed AID (Automatic Infection Detector) system provides support during the whole analysis process: first, digital color images of Petri dishes are automatically captured, then specific preprocessing and spatial clustering algorithms are applied to isolate the colonies from the culture ground and, finally, an accurate classification of the infections and their severity evaluation are performed. The AID system speeds up the analysis, contributes to the standardization of the process, allows result repeatability, and reduces the costs. Moreover, the continuous transition between sterile and external environments (typical of the standard analysis procedure) is completely avoided. PMID:26780249

  16. Automatic sensor placement

    Science.gov (United States)

    Abidi, Besma R.

    1995-10-01

    Active sensing is the process of exploring the environment using multiple views of a scene captured by sensors from different points in space under different sensor settings. Applications of active sensing are numerous and can be found in the medical field (limb reconstruction), in archeology (bone mapping), in the movie and advertisement industry (computer simulation and graphics), in manufacturing (quality control), as well as in the environmental industry (mapping of nuclear dump sites). In this work, the focus is on the use of a single vision sensor (camera) to perform the volumetric modeling of an unknown object in an entirely autonomous fashion. The camera moves to acquire the necessary information in two ways: (a) viewing closely each local feature of interest using 2D data; and (b) acquiring global information about the environment via 3D sensor locations and orientations. A single object is presented to the camera and an initial arbitrary image is acquired. A 2D optimization process is developed. It brings the object in the field of view of the camera, normalizes it by centering the data in the image plane, aligns the principal axis with one of the camera's axes (arbitrarily chosen), and finally maximizes its resolution for better feature extraction. The enhanced image at each step is projected along the corresponding viewing direction. The new projection is intersected with previously obtained projections for volume reconstruction. During the global exploration of the scene, the current image as well as previous images are used to maximize the information in terms of shape irregularity as well as contrast variations. The scene on the borders of occlusion (contours) is modeled by an entropy-based objective functional. This functional is optimized to determine the best next view, which is recovered by computing the pose of the camera. A criterion based on the minimization of the difference between consecutive volume updates is set for termination of the

  17. Exposing MPI Objects for Debugging

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.;

    Developers rely on debuggers to inspect application state. In applications that use MPI, the Message Passing Interface, the MPI runtime contains an important part of this state. The MPI Tools Working Group has proposed an interface for MPI Handle Introspection. It allows debuggers and MPI impleme...

  18. Exposing MPI Objects for Debugging

    OpenAIRE

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.; Karlsson, Sven; Mohror, Kathryn

    2014-01-01

    Developers rely on debuggers to inspect application state. In applications that use MPI, the Message Passing Interface, the MPI runtime contains an important part of this state. The MPI Tools Working Group has proposed an interface for MPI Handle Introspection. It allows debuggers and MPI implementations to cooperate in extracting information from MPI objects. Information that can then be presented to the developer. MPI Handle Introspection provides a more general interface than previous work...

  19. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  20. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob;

    for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential...... benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...... be combined with target-specific optimizations. Furthermore, comparing the first benchmark to hand-parallelized, hand-optimized pthreads and OpenMP versions, we find that code generated using our approach typically outperforms the pthreads code (within 93-339%). It also performs competitively against the Open...

  1. Towards Automatic Trunk Classification on Young Conifers

    DEFF Research Database (Denmark)

    Petri, Stig; Immerkær, John

    2009-01-01

    In the garden nursery industry providing young Nordmann firs for Christmas tree plantations, there is a rising interest in automatic classification of their products to ensure consistently high quality and reduce the cost of manual labor. This paper describes a fully automatic single-view algorit...... performance of the algorithm by incorporating color information into the data considered by the dynamic programming algorithm....

  2. Alcohol-related effects on automaticity due to experimentally manipulated conditioning

    NARCIS (Netherlands)

    T.E. Gladwin; R.W. Wiers

    2012-01-01

    Background:  The use of alcohol is associated with various forms of automatic processing, such as approach tendencies and attentional biases, which may play a role in addictive behavior. The development of such automaticity has generally occurred well before subjects perform tasks designed to detect

  3. Alcohol-Related Effects on Automaticity due to Experimentally Manipulated Conditioning

    NARCIS (Netherlands)

    Gladwin, T.E.; Wiers, R.W.H.J.

    2012-01-01

    Background:• The use of alcohol is associated with various forms of automatic processing, such as approach tendencies and attentional biases, which may play a role in addictive behavior. The development of such automaticity has generally occurred well before subjects perform tasks designed to detec

  4. Nature Conservation Drones for Automatic Localization and Counting of Animals

    NARCIS (Netherlands)

    J.C. van Gemert; C.R. Verschoor; P. Mettes; K. Epema; L.P. Koh; S. Wich

    2014-01-01

    This paper is concerned with nature conservation by automatically monitoring animal distribution and animal abundance. Typically, such conservation tasks are performed manually on foot or after an aerial recording from a manned aircraft. Such manual approaches are expensive, slow and labor intensive

  5. An Automatic Proof of Euler's Formula

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2005-05-01

Full Text Available In this information age, everything is digitalized. The encoding of functions and the automatic proof of statements about them are important. This paper discusses the automatic calculation of Taylor expansion coefficients; as an example, the method is applied to prove Euler's formula automatically.
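The coefficient-comparison idea the abstract describes can be sketched in Python: Euler's formula e^{ix} = cos x + i·sin x holds because the two sides have identical Maclaurin coefficients. This is an illustrative numerical check under that assumption, not the paper's actual encoding of functions:

```python
from math import factorial

def exp_ix_coeff(n):
    # n-th Maclaurin coefficient of e^{ix}: i^n / n!
    return (1j ** n) / factorial(n)

def cos_plus_isin_coeff(n):
    # n-th Maclaurin coefficient of cos x + i*sin x
    if n % 2 == 0:            # cos contributes (-1)^(n/2) / n!
        return ((-1) ** (n // 2)) / factorial(n)
    else:                     # sin contributes i * (-1)^((n-1)/2) / n!
        return 1j * ((-1) ** ((n - 1) // 2)) / factorial(n)

# mechanically check coefficient equality, term by term
assert all(abs(exp_ix_coeff(n) - cos_plus_isin_coeff(n)) < 1e-12
           for n in range(20))
print("Euler's formula verified up to order 19")
```

A symbolic prover would establish the equality exactly for all n; the numeric tolerance here only absorbs floating-point rounding in the complex powers.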

  6. Self-Compassion and Automatic Thoughts

    Science.gov (United States)

    Akin, Ahmet

    2012-01-01

    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  7. Automatic target extraction in complicated background for camera calibration

    Science.gov (United States)

    Guo, Xichao; Wang, Cheng; Wen, Chenglu; Cheng, Ming

    2016-03-01

In order to perform highly precise camera calibration against a complex background, a novel planar composite target design and the corresponding automatic extraction algorithm are presented. Unlike other commonly used target designs, the proposed target simultaneously encodes the feature-point coordinates and the feature-point serial numbers. Based on the original target, templates are prepared by three geometric transformations and used as input to template matching based on shape context. Finally, parity check and region-growing methods are used to extract the target as the final result. The experimental results show that the proposed method for automatic extraction and recognition of the proposed target is effective, accurate and reliable.

  8. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

    Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated...... alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis...

  9. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file
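The idea of a self-describing file, where each record carries its own class description so that newer code can still read objects written with an older layout, can be illustrated schematically. The following Python sketch is purely hypothetical (it does not use the ROOT API; the schema, field names and defaults are invented):

```python
import json

# Hypothetical "version 2" of a class: it added a 'charge' field
# that older files do not contain.
CURRENT_FIELDS = ["name", "pt", "charge"]
DEFAULTS = {"charge": 0}

def write_record(obj, schema_version):
    """Serialize an object together with its own schema description."""
    payload = {"schema": sorted(obj), "version": schema_version, "data": obj}
    return json.dumps(payload)

def read_record(blob):
    """Read an object; fields absent from the stored schema get defaults."""
    payload = json.loads(blob)
    obj = dict(payload["data"])
    for field in CURRENT_FIELDS:
        if field not in obj:
            obj[field] = DEFAULTS[field]   # schema evolution step
    return obj

# A record written by "version 1" code, before 'charge' existed:
old = write_record({"name": "track1", "pt": 3.2}, schema_version=1)
print(read_record(old))  # → {'name': 'track1', 'pt': 3.2, 'charge': 0}
```

ROOT's actual mechanism stores binary streamer information per class version; the point illustrated here is only that carrying the schema alongside the data is what makes reading old versions possible without the original compiled code.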

  10. Automatic spikes detection in seismogram

    Institute of Scientific and Technical Information of China (English)

    王海军; 靳平; 刘贵忠

    2003-01-01

Data processing for a seismic network is complex and tedious, because a large amount of data is recorded every day, which makes it impossible to process the data entirely by manual work. Therefore, seismic data should be processed automatically to produce initial results on event detection and location; afterwards, these results are reviewed and modified by an analyst. In automatic processing, data quality checking is important. Three main kinds of problem data exist in real seismic records: spikes, repeated data and dropouts. A spike is defined as an isolated large-amplitude point; the other two kinds share the feature that the amplitude of the sample points is uniform over an interval. In data quality checking, the first step is to detect and count the problem data in a data segment; if the proportion of problem data exceeds a threshold, the whole segment is masked and excluded from later processing.
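The spike definition above (an isolated large-amplitude point) suggests a simple detector. The following Python sketch flags samples that jump away from both neighbours; the median-absolute-deviation threshold is an assumed criterion, not one specified in the record:

```python
def find_spikes(samples, threshold=5.0):
    """Flag isolated large-amplitude points (spikes) in a waveform.

    A sample is flagged when it deviates from BOTH neighbours by more
    than `threshold` times the median absolute deviation of the trace,
    so a genuine step or ramp (which moves one side only) is not flagged.
    """
    n = len(samples)
    med = sorted(samples)[n // 2]
    mad = sorted(abs(x - med) for x in samples)[n // 2] or 1.0
    spikes = []
    for i in range(1, n - 1):
        jump_prev = abs(samples[i] - samples[i - 1])
        jump_next = abs(samples[i] - samples[i + 1])
        if jump_prev > threshold * mad and jump_next > threshold * mad:
            spikes.append(i)
    return spikes

trace = [0.1, 0.0, -0.2, 0.1, 9.0, 0.0, 0.1, -0.1]
print(find_spikes(trace))  # → [4]
```

Counting the flagged indices against a segment-length threshold would then implement the masking step the abstract describes.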

  11. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies used in this multidisciplinary field are part of this exposition. The issues of modeling target signatures in various spectral modalities (LADAR, IR, SAR, high-resolution radar, acoustic, seismic, visible, hyperspectral) in diverse geometric aspects are addressed. The methods for signal processing and classification cover concepts such as sensor-adaptive and artificial neural networks, time reversal filt...

  12. Automatic Schema Evolution in Root

    Institute of Scientific and Technical Information of China (English)

    ReneBrun; FonsRademakers

    2001-01-01

ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, also when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file.

  13. The Automaticity of Social Life

    OpenAIRE

    Bargh, John A.; Williams, Erin L.

    2006-01-01

    Much of social life is experienced through mental processes that are not intended and about which one is fairly oblivious. These processes are automatically triggered by features of the immediate social environment, such as the group memberships of other people, the qualities of their behavior, and features of social situations (e.g., norms, one's relative power). Recent research has shown these nonconscious influences to extend beyond the perception and interpretation of the social world to ...

  14. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  15. Automatic Generation of Technical Documentation

    OpenAIRE

    Reiter, Ehud; Mellish, Chris; Levine, John

    1994-01-01

    Natural-language generation (NLG) techniques can be used to automatically produce technical documentation from a domain knowledge base and linguistic and contextual models. We discuss this application of NLG technology from both a technical and a usefulness (costs and benefits) perspective. This discussion is based largely on our experiences with the IDAS documentation-generation project, and the reactions various interested people from industry have had to IDAS. We hope that this summary of ...

  16. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; the ALGOL 68. Other chapters discuss the general purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming language is also shown.

  17. The Automatic Galaxy Collision Software

    CERN Document Server

    Smith, Beverly J; Pfeiffer, Phillip; Perkins, Sam; Barkanic, Jason; Fritts, Steve; Southerland, Derek; Manchikalapudi, Dinikar; Baker, Matt; Luckey, John; Franklin, Coral; Moffett, Amanda; Struck, Curtis

    2009-01-01

    The key to understanding the physical processes that occur during galaxy interactions is dynamical modeling, and especially the detailed matching of numerical models to specific systems. To make modeling interacting galaxies more efficient, we have constructed the `Automatic Galaxy Collision' (AGC) code, which requires less human intervention in finding good matches to data. We present some preliminary results from this code for the well-studied system Arp 284 (NGC 7714/5), and address questions of uniqueness of solutions.

  18. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... of an integral operator and uses interval Bernstein polynomials for enclosing the solution. Two numerical examples are given, using two orders of approximation and using different numbers of discretization points....

  19. Driver behavior following an automatic steering intervention.

    Science.gov (United States)

    Fricke, Nicola; Griesche, Stefan; Schieben, Anna; Hesse, Tobias; Baumann, Martin

    2015-10-01

    The study investigated driver behavior toward an automatic steering intervention of a collision mitigation system. Forty participants were tested in a driving simulator and confronted with an inevitable collision. They performed a naïve drive and afterwards a repeated exposure in which they were told to hold the steering wheel loosely. In a third drive they experienced a false alarm situation. Data on driving behavior, i.e. steering and braking behavior, as well as subjective data were assessed in the scenarios. Results showed that most participants held on to the steering wheel strongly or counter-steered during the system intervention in the first encounter. Moreover, subjective data collected after the first drive showed that the majority of drivers were not aware of the system intervention. In the repeated drive, in which participants were instructed to hold the steering wheel loosely, significantly more participants held the steering wheel loosely and thus complied with the instruction. This study seems to imply that without knowledge and information from the system about an upcoming intervention, the most prevalent driving behavior is a strong reaction with the steering wheel, similar to an automatic steering reflex, which decreases the system's effectiveness. Results of the second drive show some potential for countermeasures, such as informing drivers shortly before a system intervention in order to prevent inhibiting reactions. PMID:26310799

  20. Multiobjective image recognition algorithm in the fully automatic die bonder

    Institute of Scientific and Technical Information of China (English)

    JIANG Kai; CHEN Hai-xia; YUAN Sen-miao

    2006-01-01

    It is a very important task to automatically fix the number of dies in the image recognition system of a fully automatic die bonder. A multiobjective image recognition algorithm based on a clustering Genetic Algorithm (GA) is proposed in this paper. In the evolutionary process of the GA, a clustering method is provided that utilizes information from the template and the fitness landscape of the current population. The whole population is grouped into different niches by the clustering method. Experimental results demonstrated that the number of target images could be determined by the algorithm automatically, and that multiple targets could be recognized at a time. As a result, the time consumed by one image recognition is shortened, the performance of the image recognition system is improved, and the automation of the system is fulfilled.

  1. PLC Transformation and Debugging of the Electrical Control System of the T68 Horizontal Boring Machine

    Institute of Scientific and Technical Information of China (English)

    彭爱梅

    2012-01-01

    This paper introduces the ideas and methods for transforming the electrical control system of the T68 horizontal boring machine with a PLC. It covers the design and debugging of the jog control, high-speed control, and stopping/braking control for forward and reverse rotation of the spindle motor M1, the spindle variable-speed control, and the main, auxiliary, and control circuits of the boring machine control system. After the retrofit, the machine tool runs stably, the failure rate is reduced, and production efficiency is improved.

  2. Discussion on the Construction and Debugging of a Total-Flooding High-Expansion Foam Extinguishing System

    Institute of Scientific and Technical Information of China (English)

    兰雪梅; 应晓东

    2012-01-01

    Based on the fire characteristics of the paint shop of a wood-products plant, the technical advantages of the total-flooding high-expansion foam extinguishing system were compared with those of other extinguishing systems to determine the system selection. Drawing on the lessons of three failed spray tests during project commissioning, the key points of constructing and debugging a total-flooding high-expansion foam extinguishing system, and the problems that commonly arise during debugging, are summarized, and solutions are proposed.

  3. New automatic minidisk infiltrometer: design and testing

    Directory of Open Access Journals (Sweden)

    Klípa Vladimír

    2015-06-01

    Soil hydraulic conductivity is a key parameter to predict water flow through the soil profile. We have developed an automatic minidisk infiltrometer (AMI) to enable easy measurement of unsaturated hydraulic conductivity using the tension infiltrometer method in the field. AMI senses the cumulative infiltration by recording the change in buoyancy force acting on a vertical solid bar fixed in the reservoir tube of the infiltrometer. Performance of the instrument was tested in the laboratory and in two contrasting catchments at three sites with different land use. Hydraulic conductivities determined using AMI were compared with earlier manually taken readings. The results of laboratory testing demonstrated high accuracy and robustness of the AMI measurement. Field testing of AMI proved the suitability of the instrument for the determination of sorptivity and near-saturated hydraulic conductivity.
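A minimal sketch, not from the paper, of how tension-infiltrometer readings are commonly reduced to the quantities named above: assuming the two-term model I(t) = C1·sqrt(t) + C2·t, where C1 is related to sorptivity and C2 to near-saturated hydraulic conductivity, the coefficients follow from a linear least-squares fit of cumulative infiltration against sqrt(t) and t.

```python
import math

def fit_two_term(times, infiltration):
    """Least-squares fit of I(t) = C1*sqrt(t) + C2*t via the 2x2
    normal equations, with basis functions x = sqrt(t) and y = t."""
    s_xx = sum(t for t in times)                 # sum of x^2 = sum of t
    s_xy = sum(math.sqrt(t) * t for t in times)  # sum of x*y
    s_yy = sum(t * t for t in times)             # sum of y^2
    b_x = sum(math.sqrt(t) * i for t, i in zip(times, infiltration))
    b_y = sum(t * i for t, i in zip(times, infiltration))
    det = s_xx * s_yy - s_xy ** 2
    c1 = (b_x * s_yy - b_y * s_xy) / det
    c2 = (s_xx * b_y - s_xy * b_x) / det
    return c1, c2
```

With readings that exactly follow the model, the fit recovers the coefficients; with noisy field data it returns the least-squares estimates.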

  4. Towards automatic synthesis of linear algebra programs

    Energy Technology Data Exchange (ETDEWEB)

    Boyle, J. M.

    1979-01-01

    Automating the writing of efficient computer programs from an abstract specification of the computation that they are to perform is discussed. Advantages offered by automatic synthesis of programs include economy, reliability, and improved service. The synthesis of simple linear algebra programs is considered in general and then illustrated for the usual matrix product, a column-oriented matrix product, a rank-one update matrix product, and a program to multiply three matrices. The accumulation of inner products and the transformational implementation of program synthesis are addressed. The discussion attempts to illustrate both the general strategy of the syntheses and how various tactics can be adapted to make the syntheses proceed deterministically to programs that are optimal with respect to certain criteria. (RWR)
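Two of the product variants named above can be illustrated concretely. A minimal sketch (our illustration, not the paper's synthesized output): the usual inner-product form accumulates each C[i][j] in turn, while the column-oriented form builds each column of C as a linear combination of the columns of A.

```python
def matmul_inner(A, B):
    """Usual product: each C[i][j] is an accumulated inner product."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            s = 0.0
            for k in range(m):
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

def matmul_columns(A, B):
    """Column-oriented product: column j of C is the combination
    of the columns of A weighted by column j of B."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for j in range(p):
        for k in range(m):
            bkj = B[k][j]
            for i in range(n):
                C[i][j] += A[i][k] * bkj
    return C
```

Both compute the same result; they differ only in loop order and memory-access pattern, which is exactly the kind of variation a synthesizer can choose between.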

  5. Automatic Tuning of Interactive Perception Applications

    CERN Document Server

    Zhu, Qian; Mummert, Lily; Pillai, Padmanabhan

    2012-01-01

    Interactive applications incorporating high-data rate sensing and computer vision are becoming possible due to novel runtime systems and the use of parallel computation resources. To allow interactive use, such applications require careful tuning of multiple application parameters to meet required fidelity and latency bounds. This is a nontrivial task, often requiring expert knowledge, which becomes intractable as resources and application load characteristics change. This paper describes a method for automatic performance tuning that learns application characteristics and effects of tunable parameters online, and constructs models that are used to maximize fidelity for a given latency constraint. The paper shows that accurate latency models can be learned online, knowledge of application structure can be used to reduce the complexity of the learning task, and operating points can be found that achieve 90% of the optimal fidelity by exploring the parameter space only 3% of the time.

  6. ASAM: Automatic Architecture Synthesis and Application Mapping

    DEFF Research Database (Denmark)

    Jozwiak, L.; Lindwer, M.; Corvino, R.;

    2012-01-01

    This paper focuses on mastering the automatic architecture synthesis and application mapping for heterogeneous massively-parallel MPSoCs based on customizable application-specific instruction-set processors (ASIPs). It presents an overview of the research currently being performed in the scope of the European project ASAM of the ARTEMIS program. The paper briefly presents the results of our analysis of the main problems to be solved and challenges to be faced in the design of such heterogeneous MPSoCs. It explains which system, design, and electronic design automation (EDA) concepts seem to be adequate to resolve the problems and address the challenges. Finally, it introduces and briefly discusses the ASAM design-flow and its main stages.

  7. Automatic scanning of NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At the European Laboratory for Particle Physics, CERN, personal neutron monitoring for over 4000 collaborators is performed with Kodak NTA film, one of the few suitable dosemeters in the stray radiation environment of a high energy accelerator. After development, films are scanned with a projection microscope. To overcome this lengthy and strenuous procedure, an automated analysis system for the dosemeters has been developed. General purpose image recognition software, tailored to the specific needs with a macro language, analyses the digitised microscope image. This paper reports on the successful automatic scanning of NTA films irradiated with neutrons from a 238Pu-Be source (E ≈ 4 MeV), as well as on the extension of the method to neutrons of higher energies. The question of detection limits is discussed in the light of an application of the method in routine personal neutron monitoring. (9 refs).

  8. Unification of automatic target tracking and automatic target recognition

    Science.gov (United States)

    Schachter, Bruce J.

    2014-06-01

    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms taking place at varying time scales. A framework is provided for unifying ATT and ATR.

  9. Automatic Mode Transition Enabled Robust Triboelectric Nanogenerators.

    Science.gov (United States)

    Chen, Jun; Yang, Jin; Guo, Hengyu; Li, Zhaoling; Zheng, Li; Su, Yuanjie; Wen, Zhen; Fan, Xing; Wang, Zhong Lin

    2015-12-22

    Although the triboelectric nanogenerator (TENG) has been proven to be a renewable and effective route for ambient energy harvesting, its robustness remains a great challenge due to the requirement of surface friction for a decent output, especially for the in-plane sliding mode TENG. Here, we present a rationally designed TENG for achieving a high output performance without compromising the device robustness by, first, converting the in-plane sliding electrification into a contact separation working mode and, second, creating an automatic transition between a contact working state and a noncontact working state. The magnet-assisted automatic transition triboelectric nanogenerator (AT-TENG) was demonstrated to effectively harness various ambient rotational motions to generate electricity with greatly improved device robustness. At a wind speed of 6.5 m/s or a water flow rate of 5.5 L/min, the harvested energy was capable of lighting up 24 spot lights (0.6 W each) simultaneously and charging a capacitor to greater than 120 V in 60 s. Furthermore, due to the rational structural design and unique output characteristics, the AT-TENG was not only capable of harvesting energy from natural bicycling and car motion but also acting as a self-powered speedometer with ultrahigh accuracy. Given such features as structural simplicity, easy fabrication, low cost, wide applicability even in a harsh environment, and high output performance with superior device robustness, the AT-TENG renders an effective and practical approach for ambient mechanical energy harvesting as well as self-powered active sensing. PMID:26529374

  10. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describes the use of the Genie system in numerical calculation and analyzes Mercury autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  11. Unsupervised automatic music genre classification

    OpenAIRE

    Barreira, Luís Filipe Marques

    2010-01-01

    Work presented within the scope of the Master's in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering. In this study we explore automatic music genre recognition and classification of digital music. Music has always been a reflection of cultural differences and an influence on our society. Today's digital content development triggered the massive use of digital music. Nowadays, digital music is manually labeled without following a universa...

  12. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describes a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  13. The Automaticity of Social Life.

    Science.gov (United States)

    Bargh, John A; Williams, Erin L

    2006-02-01

    Much of social life is experienced through mental processes that are not intended and about which one is fairly oblivious. These processes are automatically triggered by features of the immediate social environment, such as the group memberships of other people, the qualities of their behavior, and features of social situations (e.g., norms, one's relative power). Recent research has shown these nonconscious influences to extend beyond the perception and interpretation of the social world to the actual guidance, over extended time periods, of one's important goal pursuits and social interactions.

  14. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    Steve Renals

    2011-10-01

    This paper is about the recognition and interpretation of multiparty meetings captured as audio, video and other signals. This is a challenging task since the meetings consist of spontaneous and conversational interactions between a number of participants: it is a multimodal, multiparty, multistream problem. We discuss the capture and annotation of the Augmented Multiparty Interaction (AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  15. Automatic Inference of DATR Theories

    CERN Document Server

    Barg, P

    1996-01-01

    This paper presents an approach for the automatic acquisition of linguistic knowledge from unstructured data. The acquired knowledge is represented in the lexical knowledge representation language DATR. A set of transformation rules that establish inheritance relationships and a default-inference algorithm make up the basis components of the system. Since the overall approach is not restricted to a special domain, the heuristic inference strategy uses criteria to evaluate the quality of a DATR theory, where different domains may require different criteria. The system is applied to the linguistic learning task of German noun inflection.

  16. The Automaticity of Social Life.

    Science.gov (United States)

    Bargh, John A; Williams, Erin L

    2006-02-01

    Much of social life is experienced through mental processes that are not intended and about which one is fairly oblivious. These processes are automatically triggered by features of the immediate social environment, such as the group memberships of other people, the qualities of their behavior, and features of social situations (e.g., norms, one's relative power). Recent research has shown these nonconscious influences to extend beyond the perception and interpretation of the social world to the actual guidance, over extended time periods, of one's important goal pursuits and social interactions. PMID:18568084

  17. Automatic Generation of Technical Documentation

    CERN Document Server

    Reiter, E R; Levine, J; Reiter, Ehud; Mellish, Chris; Levine, John

    1994-01-01

    Natural-language generation (NLG) techniques can be used to automatically produce technical documentation from a domain knowledge base and linguistic and contextual models. We discuss this application of NLG technology from both a technical and a usefulness (costs and benefits) perspective. This discussion is based largely on our experiences with the IDAS documentation-generation project, and the reactions various interested people from industry have had to IDAS. We hope that this summary of our experiences with IDAS and the lessons we have learned from it will be beneficial for other researchers who wish to build technical-documentation generation systems.

  18. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1, where J is the number of users helping that user.
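The diversity-gain expression quoted above is simple enough to check with arithmetic. A minimal sketch (an illustration of the quoted formula, not the paper's derivation):

```python
def diversity_gain(M, J=0):
    """Diversity gain with M maximum retransmissions and J helping
    users, per the quoted expression (J+1)(M-1)+1; J=0 recovers the
    uncoordinated value M."""
    return (J + 1) * (M - 1) + 1
```

For example, with M = 3 retransmissions, plain HARQ gives a gain of 3, while two helping users raise it to 7.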

  19. Automatic transcription of polyphonic singing

    OpenAIRE

    Paščinski, Uroš

    2015-01-01

    In this work we focus on automatic transcription of polyphonic singing, in particular multiple fundamental frequency (F0) estimation. From field recordings, a test set of Slovenian folk songs with polyphonic singing is extracted and manually transcribed. On the test set we try the general algorithm for multiple F0 detection. An interactive visualization of the main parts of the algorithm is made to analyse how it works and to detect possible issues. As the data set is ne...

  20. Automatic surveillance system using fish-eye lens camera

    Institute of Scientific and Technical Information of China (English)

    Xue Yuan; Yongduan Song; Xueye Wei

    2011-01-01

    This letter presents an automatic surveillance system using a fish-eye lens camera. Our system achieves wide-area automatic surveillance without a dead angle using only one camera. We propose a new human detection method to select the most adaptive classifier based on the locations of the human candidates. Human regions are detected from the fish-eye image effectively and are corrected for perspective versions. An experiment is performed on indoor video sequences with different illumination and crowded conditions, with results demonstrating the efficiency of our algorithm.

  1. The Design of an Automatic Washing Machine Control System Based on Proteus

    Institute of Scientific and Technical Information of China (English)

    刘晓彤

    2012-01-01

    This paper mainly discusses using the Proteus simulation software to carry out the hardware and software simulation design, debugging, and running of an automatic washing machine control system. The design uses an AT89S52 MCU as the control core and a washing machine as the controlled object, and controls the entire washing process through the corresponding input and output devices. The paper introduces a convenient and efficient design scheme for automatic washing machine control: the hardware circuit and the control-system software program are both designed with Proteus, the compiled program is then loaded onto a virtual prototype built from the hardware schematic, and real-time debugging and simulation of both hardware and software are achieved.

  2. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.

  3. Automatic Speech Segmentation Based on HMM

    OpenAIRE

    M. Kroul

    2007-01-01

    This contribution deals with the problem of automatic phoneme segmentation using HMMs. Automation of the speech segmentation task is important for applications where a large amount of data needs to be processed, so that manual segmentation is out of the question. In this paper we focus on automatic segmentation of recordings which will be used to create a triphone synthesis unit database. For speech synthesis, the speech unit quality is a crucial aspect, so the maximal accuracy in segmentation is ...

  4. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs...... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented....

  5. Automatic Control of Water Pumping Stations

    Institute of Scientific and Technical Information of China (English)

    Muhannad Alrheeh; JIANG Zhengfeng

    2006-01-01

    Automatic control of pumps is an attractive way to operate the many kinds of water pumping stations, according to their functions. In this paper, the pumping station considered is used in a water supply system. The paper introduces the idea of a pump controller and the important factors that must be considered when designing an automatic control system for water pumping stations, and then presents the automatic control circuit together with the function of each of its components.

  6. An automatic visual analysis system for tennis

    OpenAIRE

    Connaghan, Damien; Moran, Kieran; O'Connor, Noel E.

    2013-01-01

    This article presents a novel video analysis system for coaching tennis players of all levels, which uses computer vision algorithms to automatically edit and index tennis videos into meaningful annotations. Existing tennis coaching software lacks the ability to automatically index a tennis match into key events, and therefore, a coach who uses existing software is burdened with time-consuming manual video editing. This work aims to explore the effectiveness of a system to automatically de...

  7. Automatic generation of application specific FPGA multicore accelerators

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Schleuniger, Pascal; Jensen, Nicklas Bo;

    2014-01-01

    High performance computing systems make increasing use of hardware accelerators to improve performance and power properties. For large high-performance FPGAs to be successfully integrated in such computing systems, methods to raise the abstraction level of FPGA programming are required. In this paper we propose a tool flow which automatically generates highly optimized hardware multicore systems based on parameters. Profiling feedback is used to adjust these parameters to improve performance and lower the power consumption. For an image processing application we show that our tools are able......

  8. Study on the Anaerobic Debugging of Organic Slurry in Municipal Solid Waste Treatment and Recycling Utilization

    Institute of Scientific and Technical Information of China (English)

    金慧宁; 张进锋; 史东晓; 屈阳; 王风庆; 李习武; 吴海锁

    2014-01-01

    Anaerobic debugging experiments were carried out in a pilot plant, using the organic slurry produced by the crushing and percolation steps in the pretreatment of municipal solid waste (MSW). The results indicated: ① inoculated with sludge from a sewage treatment plant, the anaerobic debugging of the organic slurry from MSW pretreatment was completed in 120 days; ② the anaerobic debugging consisted of three stages: the cultivation stage, the loading-increase stage and the stable stage. In the cultivation stage, at an OLR of 0.5 kg/(m3·d), the COD removal rate went through a slow-growth and then a fast-growth period. In the loading-increase stage, the COD removal rate stayed around 85%. In the stable stage, at an OLR of 7.0 kg/(m3·d), the COD removal rate remained around 80%; ③ the biogas production rate was 0.45~0.55 m3/kg, and the percentages of methane and carbon dioxide remained at about 55% and 40%, respectively.

  9. Automatic colorimetric calibration of human wounds

    Directory of Open Access Journals (Sweden)

    Meert Theo

    2010-03-01

    Abstract Background Recently, digital photography in medicine is considered an acceptable tool in many clinical domains, e.g. wound care. Although ever higher resolutions are available, reproducibility is still poor and visual comparison of images remains difficult. This is even more the case for measurements performed on such images (colour, area, etc.). This problem is often neglected and images are freely compared and exchanged without further thought. Methods The first experiment checked whether camera settings or lighting conditions could negatively affect the quality of colorimetric calibration. Digital images plus a calibration chart were exposed to a variety of conditions. Precision and accuracy of colours after calibration were quantitatively assessed with a probability distribution for perceptual colour differences (dE_ab). The second experiment was designed to assess the impact of the automatic calibration procedure (i.e. chart detection) on real-world measurements. 40 different images of real wounds were acquired and a region of interest was selected in each image. 3 rotated versions of each image were automatically calibrated and colour differences were calculated. Results 1st experiment: colour differences between the measurements and real spectrophotometric measurements reveal median dE_ab values of 6.40 for the proper patches of calibrated normal images and 17.75 for uncalibrated images, demonstrating an important improvement in accuracy after calibration. The reproducibility, visualized by the probability distribution of the dE_ab errors between 2 measurements of the patches of the images, has a median of 3.43 dE_ab for all calibrated images and 23.26 dE_ab for all uncalibrated images. If we restrict ourselves to the proper patches of normal calibrated images, the median is only 2.58 dE_ab (Wilcoxon rank-sum testing, p ...). Conclusion The investigators proposed an automatic colour calibration algorithm that ensures reproducible colour
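The dE_ab values in this record are perceptual colour differences in CIELAB space. A minimal sketch, assuming the common CIE76 convention (Euclidean distance between (L*, a*, b*) triples; the record does not state which colour-difference formula was used):

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two
    (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))
```

Under this convention, two measurements of the same patch differing by 3 units in a* and 4 in b* yield dE_ab = 5, which is on the order of the medians reported for calibrated images above.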

  10. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask will later act as a template for a considerable number of dies on the wafer. Defects on the initial blank substrate, and on subsequently cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and of the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages of mask blank preparation. The attributes Calibre ADC uses to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. This automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment inherent in the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
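
The decision-tree step can be pictured as follows; the attribute thresholds and class names here are invented for illustration and are not Calibre ADC's actual rules:

```python
def classify_blank_defect(size_um, polarity_transmitted, polarity_reflected):
    """Toy decision tree mapping defect attributes to a class code.
    The thresholds and class names are illustrative, not Calibre ADC's."""
    if polarity_transmitted == "dark" and polarity_reflected == "dark":
        kind = "particle"
    elif polarity_transmitted == "bright":
        kind = "pinhole"
    else:
        kind = "residue"
    severity = "critical" if size_um >= 0.5 else "non-critical"
    return f"{kind}/{severity}"

print(classify_blank_defect(0.8, "dark", "dark"))  # -> particle/critical
```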

  11. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, T

    2007-02-22

    Several performance-driven approaches to selectively enforcing interface contracts for scientific components are investigated. The goal is to facilitate debugging of deployed applications built from plug-and-play components while keeping the cost of enforcement within acceptable overhead limits. This paper describes a study of global enforcement using a priori execution cost estimates obtained from traces. Thirteen trials are formed from five single-component programs. Enforcement experiments conducted using twenty-three enforcement policies are used to determine the nature of the exercised contracts and the impact of a variety of sampling strategies. Performance-driven enforcement appears to be best suited to programs that exercise moderately expensive contracts.
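
One common sampling strategy, checking a contract on only a fraction of calls to bound the overhead, can be sketched as a decorator. The uniform-random policy below is an assumption for illustration; it does not model the paper's trace-based cost estimates:

```python
import functools
import random

def enforce(precondition, sample_rate=0.1, rng=random.Random(42)):
    """Check an interface contract on only a sampled fraction of calls,
    trading detection coverage for lower enforcement overhead. Uniform
    random sampling is just one of many possible policies."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            if rng.random() < sample_rate and not precondition(*args, **kwargs):
                raise AssertionError(f"contract violated in {fn.__name__}")
            return fn(*args, **kwargs)
        return inner
    return wrap

@enforce(lambda x: x >= 0, sample_rate=1.0)  # always check, for the demo
def sqrt_like(x):
    return x ** 0.5
```

With `sample_rate` below 1.0, some violations pass undetected, which is exactly the coverage-for-overhead trade-off the study quantifies.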

  12. Multilabel Learning for Automatic Web Services Tagging

    Directory of Open Access Journals (Sweden)

    Mustapha AZNAG

    2014-08-01

    Full Text Available Recently, some web services portals and search engines, such as Biocatalogue and Seekda!, have allowed users to manually annotate Web services using tags. User tags provide meaningful descriptions of services and allow users to index and organize their contents. Tagging is widely used to annotate objects in Web 2.0 applications. In this paper we propose a novel probabilistic topic model (an extension of the CorrLDA model, Correspondence Latent Dirichlet Allocation) to automatically tag web services according to existing manual tags. Our probabilistic topic model is a latent variable model that exploits local label correlations. Indeed, exploiting label correlations is a challenging and crucial problem, especially in the multi-label learning context. Several existing systems can recommend tags for web services based on existing manual tags, which in most cases have better quality. We develop three strategies to automatically recommend the best tags for web services. We also propose WS-Portal, an enriched web services search engine which contains 7063 providers, 115 sub-classes of category and 22236 web services crawled from the Internet. In WS-Portal, several technologies are employed to improve the effectiveness of web service discovery (i.e. web services clustering, tag recommendation, service rating and monitoring). Our experiments are performed on real-world web services. The comparison of Precision@n and Normalised Discounted Cumulative Gain (NDCGn) values for our approach indicates that the method presented in this paper outperforms the method based on CorrLDA in terms of ranking and quality of generated tags.
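
One simple recommendation strategy in this spirit is to rank candidate tags by how similar their topic distributions are to the service's. The topic vectors and tag names below are hypothetical; the paper's three strategies are model-specific:

```python
import math

def cosine(u, v):
    """Cosine similarity between two topic-distribution vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def recommend_tags(service_topics, tag_topics, k=2):
    """Rank candidate tags by topic-distribution similarity to the service."""
    ranked = sorted(tag_topics,
                    key=lambda t: cosine(service_topics, tag_topics[t]),
                    reverse=True)
    return ranked[:k]

tags = {"weather": [0.8, 0.2, 0.0],   # hypothetical per-tag topic mixtures
        "finance": [0.1, 0.9, 0.0],
        "maps":    [0.7, 0.2, 0.1]}
print(recommend_tags([0.9, 0.1, 0.0], tags))  # -> ['weather', 'maps']
```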

  13. Digital movie-based on automatic titrations.

    Science.gov (United States)

    Lima, Ricardo Alexandre C; Almeida, Luciano F; Lyra, Wellington S; Siqueira, Lucas A; Gaião, Edvaldo N; Paiva Junior, Sérgio S L; Lima, Rafaela L F C

    2016-01-15

    This study proposes the use of digital movies (DMs) in a flow-batch analyzer (FBA) to perform automatic, fast and accurate titrations. The term used for this process is "Digital movie-based on automatic titrations" (DMB-AT). A webcam records the DM during the addition of the titrant to the mixing chamber (MC). While the DM is recorded, it is decompiled into frames ordered sequentially at a constant rate of 26 frames per second (FPS). The first frame is used as a reference to define the region of interest (ROI) of 28×13 pixels and the R, G and B values, which are used to calculate the Hue (H) values for each frame. Pearson's correlation coefficient (r) is calculated between the H values of the initial frame and each subsequent frame. The titration curves are plotted in real time using the r values and the opening time of the titrant valve. The end point is estimated by the second derivative method. Software written in the C language manages all analytical steps and data treatment in real time. The feasibility of the method was attested by application to acid/base test samples and edible oils. Results were compared with classical titration and did not present statistically significant differences when the paired t-test at the 95% confidence level was applied. The proposed method is able to process about 117-128 samples per hour for the test and edible oil samples, respectively, and its precision was confirmed by overall relative standard deviation (RSD) values, always less than 1.0%. PMID:26592600
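
The two numerical steps, Pearson's r between hue profiles and the second-derivative end-point estimate, can be sketched on a synthetic titration curve (the r values below are invented):

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def endpoint_index(r_values):
    """Index of the largest-magnitude second difference: a discrete
    version of the second-derivative end-point criterion."""
    d2 = [r_values[i - 1] - 2 * r_values[i] + r_values[i + 1]
          for i in range(1, len(r_values) - 1)]
    return 1 + max(range(len(d2)), key=lambda i: abs(d2[i]))

# Synthetic r-vs-valve-opening curve: r stays near 1, then drops at the end point
r_curve = [1.0, 0.99, 0.98, 0.97, 0.5, 0.1, 0.05, 0.04]
print(endpoint_index(r_curve))  # -> 3
```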

  14. Removal of interproximal subgingival plaque by hand and automatic toothbrushes.

    Science.gov (United States)

    Taylor, J Y; Wood, C L; Garnick, J J; Thompson, W O

    1995-03-01

    Subgingival plaque removal at interproximal sites by automatic and hand toothbrushes was compared with control sites at which cleansing was not performed. There were 58 patients, 35 to 63 years of age, each with one hopeless tooth requiring extraction. Each patient was randomly assigned to one of four test groups: hand brush; automatic toothbrush 1; automatic toothbrush 2; and no brushing. The brushing instructions as stated by the manufacturers were demonstrated and the patient brushed the sextant containing the test tooth for 20 seconds. The level of the gingival margin was marked at each interproximal test site. The teeth were extracted and processed for SEM, and subgingival plaque was viewed at X100 and X2000 magnifications. A montage of photomicrographs of the gingival groove to the occlusal margin of the bacterial plaque at X100 magnification was made and the distance from the groove to the margin was measured. An ANOVA was performed using P = 0.05 level for significance. Due to processing difficulties, only 33 specimens were available for analysis. The average distances from the groove to the subgingival plaque front for the four test groups were 0.514, 0.132, 0.163, and 0.111 mm respectively. The maximum distance (1.5 mm) of plaque removal was greatest for the hand toothbrush. Due to the large standard deviation (0.636 compared to 0.146, 0.250, and 0.124 respectively), the hand brushing group was excluded from ANOVA. There were no statistically significant differences among the automatic toothbrushes and the no brushing control (P = 0.8393). It was concluded that a single session of oral hygiene instruction with an automatic toothbrush did not result in subgingival interproximal plaque cleansing. PMID:7776163

  15. ANPS - AUTOMATIC NETWORK PROGRAMMING SYSTEM

    Science.gov (United States)

    Schroer, B. J.

    1994-01-01

    Development of some of the space program's large simulation projects, such as simulating the countdown sequence prior to spacecraft liftoff, requires the support of automated tools and techniques. The number of preconditions which must be met for a successful spacecraft launch and the complexity of their interrelationships account for the difficulty of creating an accurate model of the countdown sequence. Researchers developed ANPS for the NASA Marshall Space Flight Center to assist programmers attempting to model the pre-launch countdown sequence. Incorporating the elements of automatic programming as its foundation, ANPS aids the user in defining the problem and then automatically writes the appropriate simulation program in GPSS/PC code. The program's interactive user dialogue interface creates an internal problem specification file from user responses; the file includes the time line for the countdown sequence, the attributes of the individual activities which are part of a launch, and the dependent relationships between the activities. The program's automatic simulation code generator receives the file as input and selects appropriate macros from the library of software modules to generate the simulation code in the target language GPSS/PC. The user can recall the problem specification file for modification to effect any desired changes in the source code. ANPS is designed to write simulations for problems concerning the pre-launch activities of space vehicles and the operation of ground support equipment, and has potential for use in developing network reliability models for hardware systems and subsystems. ANPS was developed in 1988 for use on IBM PC or compatible machines. The program requires at least 640 KB memory and one 360 KB disk drive, PC DOS Version 2.0 or above, and GPSS/PC System Version 2.0 from Minuteman Software. The program is written in Turbo Prolog Version 2.0. GPSS/PC is a trademark of Minuteman Software.

  16. Failure of classical traffic flow theories: Stochastic highway capacity and automatic driving

    Science.gov (United States)

    Kerner, Boris S.

    2016-05-01

    In a mini-review, Kerner (2013) showed that classical traffic flow theories and models fail to explain empirical traffic breakdown - a phase transition from metastable free flow to synchronized flow at highway bottlenecks. The main objective of this mini-review is to study the consequences of this failure of classical traffic-flow theories for the analysis of empirical stochastic highway capacity, as well as for the effect of automatic driving vehicles and cooperative driving on traffic flow. To reach this goal, we show a deep connection between the understanding of empirical stochastic highway capacity and a reliable analysis of automatic driving vehicles in traffic flow. Using simulations in the framework of three-phase traffic theory, a probabilistic analysis is made of the effect of automatic driving vehicles on a mixed traffic flow consisting of a random distribution of automatic driving and manual driving vehicles. We have found that the parameters of automatic driving vehicles can either decrease or increase the probability of the breakdown. The increase in the probability of traffic breakdown, i.e., the deterioration of the performance of the traffic system, can occur already at a small share (about 5%) of automatic driving vehicles. The increase in the probability of traffic breakdown through automatic driving vehicles can occur even if every platoon of automatic driving vehicles satisfies the condition for string stability.
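
The probabilistic analysis rests on estimating a breakdown probability from many simulation runs. The toy model below illustrates only that Monte Carlo estimation procedure; the per-vehicle disturbance numbers are invented and this is not the three-phase-theory simulation used in the paper:

```python
import random

def breakdown_probability(p_av, trials=2000, rng=random.Random(1)):
    """Monte Carlo estimate of the probability of traffic breakdown in a
    mixed flow with a fraction p_av of automatic driving vehicles.
    The per-vehicle disturbance values are assumptions for illustration."""
    n_vehicles = 50
    breakdowns = 0
    for _ in range(trials):
        disturbance = sum(
            0.04 if rng.random() < p_av else 0.02  # assumed per-type effect
            for _ in range(n_vehicles))
        if rng.random() < min(1.0, disturbance / 2.5):
            breakdowns += 1
    return breakdowns / trials
```

In this invented parameterization the automatic vehicles raise the breakdown probability; the paper shows that, depending on their parameters, they can push it in either direction.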

  17. Changes in default mode network as automaticity develops in a categorization task.

    Science.gov (United States)

    Shamloo, Farzin; Helie, Sebastien

    2016-10-15

    The default mode network (DMN) is a set of brain regions in which blood oxygen level dependent signal is suppressed during attentional focus on the external environment. Because automatic task processing requires less attention, development of automaticity in a rule-based categorization task may result in less deactivation and altered functional connectivity of the DMN when compared to the initial learning stage. We tested this hypothesis by re-analyzing functional magnetic resonance imaging data of participants trained in rule-based categorization for over 10,000 trials (Helie et al., 2010) [12,13]. The results show that some DMN regions are deactivated in initial training but not after automaticity has developed. There is also a significant decrease in DMN deactivation after extensive practice. Seed-based functional connectivity analyses with the precuneus, medial prefrontal cortex (two important DMN regions) and Brodmann area 6 (an important region in automatic categorization) were also performed. The results show increased functional connectivity with both DMN and non-DMN regions after the development of automaticity, and a decrease in functional connectivity between the medial prefrontal cortex and ventromedial orbitofrontal cortex. Together, these results further support the hypothesis of a strategy shift in automatic categorization and bridge the cognitive and neuroscientific conceptions of automaticity in showing that the reduced need for cognitive resources in automatic processing is accompanied by a disinhibition of the DMN and stronger functional connectivity between DMN and task-related brain regions. PMID:27457134

  18. Automatic defect identification on PWR nuclear power station fuel pellets

    International Nuclear Information System (INIS)

    This article presents a new technique for the automatic identification of structural failures in green nuclear fuel pellets. The technique was developed to identify failures that occur during the fabrication process. It is based on a smart image analysis technique for automatic identification of failures on the uranium oxide pellets used as fuel in PWR nuclear power stations. To achieve this goal, an artificial neural network (ANN) has been trained and validated on image histograms of pellets containing examples not only of normal (flawless) pellets, but of defective pellets as well (with the main flaws normally found during the manufacturing process). Based on this technique, a new automatic identification system for flaws in nuclear fuel element pellets, combining image pre-processing and intelligent analysis, will be developed and implemented in the Brazilian nuclear fuel production industry. Based on the theoretical performance of the technology proposed and presented in this article, it is believed that this new system, NuFAS (Nuclear Fuel Pellets Failures Automatic Identification Neural System), will be able to identify structural failures in nuclear fuel pellets with virtually zero error margins. Once implemented, NuFAS will add value to the quality control process of the national production of nuclear fuel.
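
A drastically simplified stand-in for the histogram-plus-ANN idea: extract one histogram feature (the fraction of dark pixels, which may indicate cracks or chips) and separate the classes with a learned threshold. All values are hypothetical, and the midpoint rule replaces the actual neural network:

```python
def histogram_feature(pixels, dark_cutoff=60):
    """Fraction of dark pixels in a grey-level image: a crude stand-in
    for the image-histogram features fed to the ANN described above."""
    return sum(p < dark_cutoff for p in pixels) / len(pixels)

def train_threshold(normal_feats, defective_feats):
    """One-feature 'classifier': the midpoint between class means.
    A far simpler stand-in for training a neural network."""
    m_ok = sum(normal_feats) / len(normal_feats)
    m_bad = sum(defective_feats) / len(defective_feats)
    return (m_ok + m_bad) / 2

ok_feats = [0.01, 0.02, 0.015]   # hypothetical flawless-pellet features
bad_feats = [0.12, 0.15, 0.10]   # hypothetical defective-pellet features
threshold = train_threshold(ok_feats, bad_feats)
sample = histogram_feature([10, 200, 220, 230])  # one dark pixel out of four
print(sample > threshold)  # -> True (classified as defective)
```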

  19. Uav-Based Automatic Tree Growth Measurement for Biomass Estimation

    Science.gov (United States)

    Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.

    2016-06-01

    Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically ineffective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aircraft Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data was collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week, and the in-situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal conditions for a flight and parameter settings for image acquisition. The collected images are processed in a state-of-the-art tool, resulting in the generation of dense 3D point clouds. An algorithm is developed to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of the automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.
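
Once a dense point cloud is available, the height step reduces to comparing the tree-top elevation with a ground estimate. A minimal sketch on hypothetical points; real pipelines must first segment individual trees and filter noise:

```python
def tree_height(points, ground_percentile=0.05):
    """Estimate tree height from a 3D point cloud of a single tree:
    highest point (tree top) minus a low-percentile ground elevation.
    The 5th-percentile ground rule is an assumption for robustness."""
    zs = sorted(p[2] for p in points)
    ground = zs[int(ground_percentile * (len(zs) - 1))]
    return zs[-1] - ground

# Hypothetical (x, y, z) points: two ground returns, two canopy returns
cloud = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.1), (0.0, 1.0, 2.9), (1.0, 1.0, 3.0)]
print(tree_height(cloud))  # -> 3.0
```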

  20. Autoclass: An automatic classification system

    Science.gov (United States)

    Stutz, John; Cheeseman, Peter; Hanson, Robin

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.

  1. Automatic summarising factors and directions

    CERN Document Server

    Jones, K S

    1998-01-01

    This position paper suggests that progress with automatic summarising demands a better research methodology and a carefully focussed research strategy. In order to develop effective procedures it is necessary to identify and respond to the context factors, i.e. input, purpose, and output factors, that bear on summarising and its evaluation. The paper analyses and illustrates these factors and their implications for evaluation. It then argues that this analysis, together with the state of the art and the intrinsic difficulty of summarising, imply a nearer-term strategy concentrating on shallow, but not surface, text analysis and on indicative summarising. This is illustrated with current work, from which a potentially productive research programme can be developed.

  2. Automatic Sequencing for Experimental Protocols

    Science.gov (United States)

    Hsieh, Paul F.; Stern, Ivan

    We present a paradigm and implementation of a system for the specification of the experimental protocols to be used for the calibration of AXAF mirrors. For the mirror calibration, several thousand individual measurements need to be defined. For each measurement, over one hundred parameters need to be tabulated for the facility test conductor and several hundred instrument parameters need to be set. We provide a high level protocol language which allows for a tractable representation of the measurement protocol. We present a procedure dispatcher which automatically sequences a protocol more accurately and more rapidly than is possible by an unassisted human operator. We also present back-end tools to generate printed procedure manuals and database tables required for review by the AXAF program. This paradigm has been tested and refined in the calibration of detectors to be used in mirror calibration.
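
The core of such a dispatcher is the mechanical expansion of a compact protocol specification into the full list of individual measurements. The spec format below is invented for illustration; it is not the AXAF protocol language:

```python
def expand_protocol(spec):
    """Expand a compact protocol specification into the full list of
    individual measurement steps, in the spirit of the high-level
    protocol language described above (format invented)."""
    steps = []
    for block in spec:
        for energy in block["energies"]:
            for position in block["positions"]:
                steps.append({"detector": block["detector"],
                              "energy_keV": energy,
                              "position": position})
    return steps

spec = [{"detector": "A", "energies": [1.5, 6.4], "positions": ["center", "edge"]},
        {"detector": "B", "energies": [1.5], "positions": ["center"]}]
print(len(expand_protocol(spec)))  # -> 5
```

Expanding a few compact blocks into thousands of concrete steps is what makes a machine dispatcher faster and less error-prone than an unassisted operator.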

  3. Analysis and Treatment of Malfunctions during the Debugging Process of the TRT Fast Shutoff Valve System

    Institute of Scientific and Technical Information of China (English)

    秦海兵; 王刚

    2015-01-01

    Through analysis of the faults observed during debugging and operation of the TRT fast shutoff valve (the valve failing to open, and excessively long opening and buffer times), the causes of the problems were found and corrected. Opinions on troubleshooting and maintenance of the hydraulic system are also provided.

  4. On the Cost Control of the Construction Stage of Electricity Installation and Debugging Engineering

    Institute of Scientific and Technical Information of China (English)

    王馗

    2014-01-01

    With the continuous innovation and improvement of the market economic system, competition among electric power enterprises is increasingly fierce. To gain a foothold in this competition, electric power enterprises must strictly control construction costs in order to improve their comprehensive competitiveness. For electricity installation and debugging engineering, the costs of the construction stage mainly include construction materials, construction machinery and equipment, project management, and construction labor; the direct influencing factor of these costs is the construction scheme. Based on the author's own working experience, effective measures to control the costs of the construction stage of electricity installation and debugging engineering are put forward.

  5. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available Automatic systems have transformed many existing technologies; one area of active development is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis while reducing manpower when applied in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and then runs as a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer, controlling the activation or termination of the electrical loads; it is powered by a solar panel, with a rechargeable battery in electrical communication with the solar panel for storing power. Through a series of tests, the components of the system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.
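
The timer logic described, a 10-hour window divided into user-chosen intervals, can be sketched directly; the interval semantics here are an assumption, since the abstract does not specify them:

```python
def feeding_times(start_hour, interval_hours, duration_hours=10):
    """Trigger times (clock hours) for a 10-hour timer set to a
    user-chosen feeding interval. Semantics assumed: feed at the start
    of the window and then every interval until the window closes."""
    t, times = 0.0, []
    while t < duration_hours:
        times.append(start_hour + t)
        t += interval_hours
    return times

print(feeding_times(6, 2.5))  # -> [6.0, 8.5, 11.0, 13.5]
```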

  6. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors)

  7. Automatic segmentation of diatom images for classification

    NARCIS (Netherlands)

    Jalba, Andrei C.; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    2004-01-01

    A general framework for automatic segmentation of diatom images is presented. This segmentation is a critical first step in contour-based methods for automatic identification of diatoms by computerized image analysis. We review existing results, adapt popular segmentation methods to this difficult problem

  8. Data mining of geospatial data: combining visual and automatic methods

    OpenAIRE

    Demšar, Urška

    2006-01-01

    Most of the largest databases currently available have a strong geospatial component and contain potentially useful information which might be of value. The discipline concerned with extracting this information and knowledge is data mining. Knowledge discovery is performed by applying automatic algorithms which recognise patterns in the data. Classical data mining algorithms assume that data are independently generated and identically distributed. Geospatial data are multidimensional, spatial...

  9. Automatic classification of eclipsing binaries light curves using neural networks

    CERN Document Server

    Sarro, L M; Giménez, A

    2005-01-01

    In this work we present a system for the automatic classification of the light curves of eclipsing binaries. This system is based on a classification scheme that aims to separate eclipsing binary systems according to their geometrical configuration, in a modified version of the traditional classification scheme. The classification is performed by a Bayesian ensemble of neural networks trained with Hipparcos data of seven different categories, including eccentric binary systems and two types of pulsating light curve morphologies.

  10. Fiona: a parallel and automatic strategy for read error correction

    OpenAIRE

    Schulz, Marcel H; Weese, David; Holtgrewe, Manuel; Dimitrova, Viktoria; Niu, Sijia; Reinert, Knut; Richard, Hugues

    2014-01-01

    Motivation: Automatic error correction of high-throughput sequencing data can have a dramatic impact on the amount of usable base pairs and their quality. It has been shown that the performance of tasks such as de novo genome assembly and SNP calling can be dramatically improved after read error correction. While a large number of methods specialized for correcting substitution errors as found in Illumina data exist, few methods for the correction of indel errors, common to technologies like ...

  11. The TS 600: automatic control system for eddy currents

    International Nuclear Information System (INIS)

    In the scope of fabrication and in-service inspection of PWR steam generator tube bundles, FRAMATOME developed an automatic eddy current testing system, the TS600. Based on a mini-computer, the TS600 can digitize, store and process data in various ways, making it possible to perform several kinds of inspection: conventional in-service inspection, roll-area profilometry, and so on. The TS600 can also be used to develop new methods of examination.

  12. Automatic Camera Viewfinder Based on TI DaVinci

    Institute of Scientific and Technical Information of China (English)

    WANG Hai-gang; XIAO Zhi-tao; GENG Lei

    2009-01-01

    Presented is an automatic camera viewfinder based on the TI DaVinci digital platform; the discussion focuses on the Linux-based software system. The system raises an alarm and saves a picture when preset features appear in the view, and the saved pictures can be downloaded and zoomed. All functions are operated through an OSD menu. The system is well regarded for its flexible operation, powerful functions, multitasking and stable performance.

  13. AUTOMATIC DESIGNING OF POWER SUPPLY SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. I. Kirspou

    2016-01-01

    Full Text Available The development of an automatic designing system for the power supply of industrial enterprises is considered in the paper. Its complete structure and principle of operation are determined and established. A modern graphical interface and data scheme have been developed, and the software has been fully implemented. The methodology and software correspond to the requirements of up-to-date designing, describe a general algorithm of the program process, and also reveal the properties of the automatic designing system's objects. The automatic designing system is based on a module principle and uses object-oriented programming. It makes it possible to carry out design calculations of a power supply system consistently and to select the required equipment, with all calculations subsequently output in the form of an explanatory note. The automatic designing system can be applied by designing organizations under conditions of actual designing.

  14. SPHOTOM - Package for an Automatic Multicolour Photometry

    Science.gov (United States)

    Parimucha, Š.; Vaňko, M.; Mikloš, P.

    2012-04-01

    We present basic information about SPHOTOM, a package for automatic multicolour photometry. The package is being developed as a photometric pipeline which we plan to use in the near future with our new instruments. It can operate in two independent modes: (i) a GUI mode, in which the user selects images and controls the functions of the package through an interface, and (ii) a command-line mode, in which all processes are controlled through a main parameter file. SPHOTOM is developed as a universal package for Linux-based systems with easy implementation for different observatories. The photometric part of the package is based on the SExtractor code, which allows us to detect all objects on the images and perform their photometry with different apertures. We can also compute astrometric solutions for all images for a correct cross-identification of the stars on the images. The result is a catalogue of all objects with their instrumental photometric measurements, which are consequently used for differential magnitude calculations with one or more comparison stars, transformations to an international system, and determinations of colour indices.
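
The differential-magnitude step reduces to Pogson's relation applied to the aperture fluxes of the target and a comparison star; the flux values below are hypothetical:

```python
import math

def instrumental_mag(flux):
    """Instrumental magnitude from an aperture flux (arbitrary zero point)."""
    return -2.5 * math.log10(flux)

def differential_mag(target_flux, comparison_flux):
    """Differential magnitude of the target relative to a comparison star;
    the instrumental zero point cancels in the difference."""
    return instrumental_mag(target_flux) - instrumental_mag(comparison_flux)

print(differential_mag(100.0, 1000.0))  # ten times fainter -> +2.5 mag
```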

  15. Automatic image segmentation by dynamic region merging.

    Science.gov (United States)

    Peng, Bo; Zhang, Lei; Zhang, David

    2011-12-01

    This paper addresses the automatic image segmentation problem in a region merging style. With an initially oversegmented image, in which many regions (or superpixels) with homogeneous color are detected, image segmentation is performed by iteratively merging the regions according to a statistical test. There are two essential issues in a region-merging algorithm: the order of merging and the stopping criterion. In the proposed algorithm, these two issues are solved by a novel predicate, which is defined by the sequential probability ratio test and the minimal cost criterion. Starting from an oversegmented image, neighboring regions are progressively merged if there is evidence for merging according to this predicate. We show that the merging order follows the principle of dynamic programming. This formulates image segmentation as an inference problem, where the final segmentation is established based on the observed image. We also prove that the produced segmentation satisfies certain global properties. In addition, a faster algorithm is developed to accelerate the region-merging process; it maintains a nearest neighbor graph in each iteration. Experiments on real natural images are conducted to demonstrate the performance of the proposed dynamic region-merging algorithm. PMID:21609885
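
A stripped-down, 1-D illustration of the merging loop: the real algorithm applies an SPRT-based predicate to 2-D image regions over a nearest-neighbor graph, which this sketch replaces with a mean-intensity test over adjacent segments:

```python
def merge_regions(regions, threshold=10.0):
    """Greedy 1-D region merging: repeatedly merge the adjacent pair with
    the smallest mean-intensity difference (minimal-cost order) while that
    difference stays below a threshold (stopping criterion). A simple
    stand-in for the paper's SPRT predicate."""
    regions = [list(r) for r in regions]

    def mean(r):
        return sum(r) / len(r)

    while len(regions) > 1:
        cost, i = min((abs(mean(regions[j]) - mean(regions[j + 1])), j)
                      for j in range(len(regions) - 1))
        if cost >= threshold:
            break
        regions[i] = regions[i] + regions.pop(i + 1)
    return regions

# Four oversegmented segments collapse into two homogeneous regions
print(len(merge_regions([[10, 12], [11, 13], [100, 102], [101]])))  # -> 2
```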

  16. Automatic analysis of ventilation and perfusion pulmonary scintigrams

    International Nuclear Information System (INIS)

    A fully automatic program is used to analyse pulmonary ventilation and perfusion scintigrams. The ventilation study is performed using a standard washin-washout 133Xe method; multiple-view late xenon washout images are also recorded. The perfusion study is performed using sup(99m)Tc serum albumin. The FORTRAN program recognizes the different steps of the test, whatever their durations. It performs background subtraction, draws pulmonary regions of interest (ROIs), and calculates ventilation and perfusion parameters for each ROI and each lung. It also processes the multiple-view late xenon washout images so that they give not only topographic information about hypoventilated regions, but also semi-quantitative information about the strength of xenon retention. During processing, the operator has only to check two intermediate results (e.g. the automatically determined pulmonary ROIs). All numerical and processed iconographic results are obtained within 10 minutes after the end of the test. This program has already been used to analyse 1,000 pulmonary studies, during which correction of intermediate results was very rarely necessary. This efficient and reliable automatic program is very useful for the daily practice of a Nuclear Medicine department

  17. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input and performs a registration over scale and in-plane rotation fully automatically.

  18. Automatic Meal Inspection System Using LBP-HF Feature for Central Kitchen

    OpenAIRE

    Yue-Min Jiang; Ho-Hsin Lee; Cheng-Chang Lien; Chun-Feng Tai; Pi-Chun Chu; Ting-Wei Yang

    2015-01-01

    This paper proposes an intelligent and automatic meal inspection system that can be applied to meal inspection for central kitchen automation. Diets specifically designed for patients must meet personalized requirements, such as low sodium intake or certain required foods. Hence, the proposed system can benefit an inspection process that is often performed manually. In the proposed system, firstly, the meal box can be detected and located automatically wi...

  19. Neuronal Spectral Analysis of EEG and Expert Knowledge Integration for Automatic Classification of Sleep Stages

    OpenAIRE

    Kerkeni, Nizar; Alexandre, Frédéric; Bedoui, Mohamed Hédi; Bougrain, Laurent; Dogui, Mohamed

    2005-01-01

    http://www.wseas.org Being able to analyze and interpret the signal coming from electroencephalogram (EEG) recordings can be of high interest for many applications, including medical diagnosis and Brain-Computer Interfaces. Indeed, human experts are today able to extract from this signal many hints related to physiological as well as cognitive states of the recorded subject, and it would be very interesting to perform such a task automatically, but today no completely automatic system exists. In pre...

  20. NASA MSFC hardware in the loop simulations of automatic rendezvous and capture systems

    Science.gov (United States)

    Tobbe, Patrick A.; Naumann, Charles B.; Sutton, William; Bryan, Thomas C.

    1991-01-01

    Two complementary hardware-in-the-loop simulation facilities for automatic rendezvous and capture systems at MSFC are described. One, the Flight Robotics Laboratory, uses an 8 DOF overhead manipulator with a work volume of 160 by 40 by 23 feet to evaluate automatic rendezvous algorithms and range/rate sensing systems. The other, the Space Station/Station Operations Mechanism Test Bed, uses a 6 DOF hydraulic table to perform docking and berthing dynamics simulations.

  1. Automatic subject classification of textual documents using limited or no training data

    OpenAIRE

    Joorabchi, Arash

    2010-01-01

    With the explosive growth in the number of electronic documents available on the internet, intranets, and digital libraries, there is a growing need for automatic systems capable of indexing and organising such large volumes of data more than ever. Automatic Text Classification (ATC) has become one of the principal means for enhancing the performance of information retrieval systems and organising digital libraries and other textual collections. Within this context, the use of ...

  2. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision making about the presence of hidden explosive. Innovative algorithms for likely background calculation, a system based on neural networks for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data elaboration.

  3. Development of an automatic identification algorithm for antibiogram analysis.

    Science.gov (United States)

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present some difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) is proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested using susceptibility tests performed for 12 different antibiotics, for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using AIA and the results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and the correlation between AIA-based and human-based readings. Agreement was observed in 88% of cases, and 89% of the tests showed no difference, despite reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second halo of inhibition. Furthermore, AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, AIA may be a practical tool for automated reading of antibiograms in diagnostic and microbiology laboratories. PMID:26513468
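The core measurement an AIA-style reader performs can be sketched as estimating the inhibition-zone diameter from a segmented plate image. Below is a simplified, hypothetical version: given a binary mask of clear (inhibited) pixels, it reports the diameter of the area-equivalent circle; a real system would also handle the oddities listed above (overlapping zones, second halos).

```python
import math

def zone_diameter(mask):
    """Estimate inhibition-zone diameter (in pixels) from a binary mask
    where 1 marks clear (inhibited) pixels, using the area-equivalent
    circle. A simplification of what an automated reader would do."""
    area = sum(sum(row) for row in mask)
    return 2.0 * math.sqrt(area / math.pi)

# A synthetic 21x21 mask with a clear disc of radius 8 around the centre.
r, size = 8, 21
c = size // 2
mask = [[1 if (x - c) ** 2 + (y - c) ** 2 <= r * r else 0
         for x in range(size)] for y in range(size)]
print(round(zone_diameter(mask), 1))  # ~15.8, close to the true diameter of 16
```

Converting the pixel diameter to millimetres (via the image scale) then lets the measured zone be compared against breakpoint tables, as in the manual reading.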

  4. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    Full Text Available There are increasing applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chessboard, which is found automatically in each image even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners, and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the algorithm proposed as a valuable tool for camera calibration.

  5. Automatic sleep staging using state machine-controlled decision trees.

    Science.gov (United States)

    Imtiaz, Syed Anas; Rodriguez-Villegas, Esther

    2015-01-01

    Automatic sleep staging from a reduced number of channels is desirable to save time, reduce costs, and make sleep monitoring more accessible by enabling home-based polysomnography. This paper introduces a novel algorithm for automatic scoring of sleep stages using a combination of small decision trees driven by a state machine. The algorithm uses two channels of EEG for feature extraction and has a state machine that selects a suitable decision tree for classification based on the prevailing sleep stage. Its performance has been evaluated using the complete dataset of 61 recordings from the PhysioNet Sleep-EDF Expanded database, achieving overall accuracies of 82% and 79% on the training and test sets, respectively. The algorithm has been developed with a very small number of decision tree nodes active at any given time, making it suitable for use in resource-constrained wearable systems.
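The state-machine idea above, where the prevailing stage selects which small decision tree classifies the next epoch, can be sketched as follows. The trees, feature names, and thresholds here are purely illustrative stand-ins for the paper's EEG-derived features.

```python
# Hypothetical per-stage decision trees over toy EEG features.
def tree_wake(f):
    return "N1" if f["energy"] < 0.4 else "WAKE"

def tree_n1(f):
    return "N2" if f["spindle"] > 0.5 else "N1"

def tree_n2(f):
    if f["delta"] > 0.6:
        return "N3"
    return "N2" if f["spindle"] > 0.3 else "N1"

TREES = {
    "WAKE": tree_wake,
    "N1": tree_n1,
    "N2": tree_n2,
    "N3": lambda f: "N3" if f["delta"] > 0.4 else "N2",
}

def score_epochs(epochs, start="WAKE"):
    """State machine: the current stage picks which small decision tree
    classifies the next epoch (a sketch of the paper's architecture)."""
    stage, out = start, []
    for features in epochs:
        stage = TREES[stage](features)
        out.append(stage)
    return out

epochs = [
    {"energy": 0.3, "spindle": 0.1, "delta": 0.1},  # quieting EEG -> N1
    {"energy": 0.2, "spindle": 0.7, "delta": 0.2},  # spindles -> N2
    {"energy": 0.1, "spindle": 0.4, "delta": 0.8},  # slow waves -> N3
]
print(score_epochs(epochs))  # -> ['N1', 'N2', 'N3']
```

Because only one small tree is active per epoch, the number of live nodes stays tiny, which is what makes this structure attractive for wearable hardware.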

  6. Detection of Off-normal Images for NIF Automatic Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Awwal, A S; McClay, W A; Ferguson, S W; Burkhart, S C

    2005-07-11

    One of the major purposes of the National Ignition Facility at Lawrence Livermore National Laboratory is to accurately focus 192 high-energy laser beams on a millimeter-scale fusion target at the precise location and time. The automatic alignment system developed for NIF is used to align the beams in order to achieve the required focusing effect. However, if a distorted image is inadvertently created by a faulty camera shutter or some other opto-mechanical malfunction, the resulting image, termed ''off-normal'', must be detected and rejected before further alignment processing occurs. Thus the off-normal processor acts as a preprocessor to automatic alignment image processing. In this work, we discuss the development of an ''off-normal'' preprocessor capable of rapidly detecting and rejecting off-normal images. A wide variety of off-normal images for each loop is used to develop the rejection criteria accurately.

  7. Automatic relational database compression scheme design based on swarm evolution

    Institute of Scientific and Technical Information of China (English)

    HU Tian-lei; CHEN Gang; LI Xiao-yan; DONG Jin-xiang

    2006-01-01

    Compression is an intuitive way to boost the performance of a database system. However, compared with other physical database design techniques, compression consumes a large amount of CPU power, so there is a trade-off between the reduction in disk accesses and the overhead of CPU processing. Automatic design and adaptive administration of database systems are widely demanded, and automatic selection of compression schemes that balances this trade-off is very important. In this paper, we present a model with novel techniques to integrate a rapidly convergent agent-based evolution framework, SWAF (SWarm Algorithm Framework), into adaptive attribute compression for relational databases. The model evolutionarily consults statistics of CPU load and IO bandwidth to select compression schemes, considering both sides of the trade-off. We have implemented a prototype on the Oscar RDBMS, with experiments highlighting the correctness and efficiency of our techniques.
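The trade-off being evolved can be made concrete with a toy per-attribute cost model: compress an attribute only when the projected IO saving, weighted by how saturated IO currently is, outweighs the CPU cost weighted by current CPU load. This is a hypothetical sketch of the objective, not the SWAF algorithm itself.

```python
def choose_schemes(attributes, cpu_load, io_saturation):
    """Toy selector over the compression trade-off: each attribute maps
    to (estimated IO saving, estimated CPU cost), both in [0, 1].
    Scarce IO makes savings more valuable; high CPU load makes the
    decompression overhead more expensive."""
    chosen = {}
    for name, (io_saving, cpu_cost) in attributes.items():
        benefit = io_saving * io_saturation - cpu_cost * cpu_load
        chosen[name] = "compressed" if benefit > 0 else "plain"
    return chosen

# A wide text column compresses well; a small integer key does not.
attrs = {"comment_text": (0.8, 0.3), "int_id": (0.1, 0.2)}
print(choose_schemes(attrs, cpu_load=0.5, io_saturation=0.9))
# -> {'comment_text': 'compressed', 'int_id': 'plain'}
```

An evolutionary framework like SWAF would search over such per-attribute choices as system statistics change, rather than evaluating a fixed formula once.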

  8. Automatic Medical Image Classification and Abnormality Detection Using K-Nearest Neighbour

    Directory of Open Access Journals (Sweden)

    Dr. R. J. Ramteke , Khachane Monali Y.

    2012-12-01

    Full Text Available This research work presents a method for automatic classification of medical images into two classes, Normal and Abnormal, based on image features, together with automatic abnormality detection. Our proposed system consists of four phases: preprocessing, feature extraction, classification, and post-processing. A statistical texture feature set is derived from normal and abnormal images. We used the KNN classifier for classifying images and compared its performance with kernel-based SVM classifiers (linear and RBF). The confusion matrix was computed, and the results show that KNN obtains an 80% classification rate, which is higher than the SVM classification rate, so we chose the KNN algorithm for image classification. If an image is classified as abnormal, a post-processing step is applied and the abnormal region is highlighted on the image. The system has been tested on a number of real CT scan brain images.
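The KNN classification step can be sketched in a few lines: each image is reduced to a texture feature vector, and a query is labelled by majority vote among its k nearest training vectors. The two-dimensional features below are invented for illustration; the paper uses a larger statistical texture set.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Minimal k-nearest-neighbour classifier over feature vectors.
    `train` is a list of (feature_tuple, label) pairs."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy statistical-texture features: (mean intensity, entropy-like score).
train = [
    ((0.30, 0.20), "normal"), ((0.35, 0.25), "normal"), ((0.32, 0.22), "normal"),
    ((0.80, 0.70), "abnormal"), ((0.85, 0.75), "abnormal"), ((0.78, 0.65), "abnormal"),
]
print(knn_classify(train, (0.82, 0.72)))  # -> abnormal
print(knn_classify(train, (0.33, 0.21)))  # -> normal
```

In the full system, an "abnormal" verdict would then trigger the post-processing step that localizes and highlights the abnormal region.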

  9. Automatic local Gabor Features extraction for face recognition

    CERN Document Server

    Jemaa, Yousra Ben

    2009-01-01

    We present in this paper a biometric system for face detection and recognition in color images. The face detection technique is based on skin color information and fuzzy classification. A new algorithm is proposed to automatically detect face features (eyes, mouth, and nose) and extract their corresponding geometrical points. These fiducial points are described by sets of wavelet components which are used for recognition. To achieve face recognition, we use neural networks and study their performance for different inputs. We compare the two types of features used for recognition: geometric distances and Gabor coefficients, which can be used either independently or jointly. This comparison shows that Gabor coefficients are more powerful than geometric distances. We show with experimental results how the high recognition ratio makes our system an effective tool for automatic face detection and recognition.

  10. Automatic Facial Expression Recognition Based on Hybrid Approach

    Directory of Open Access Journals (Sweden)

    Ali K. K. Bermani

    2012-12-01

    Full Text Available The topic of automatic recognition of facial expressions attracted many researchers at the end of the last century and has drawn great interest in the past few years. Several techniques have emerged to improve the efficiency of recognition by addressing problems in face detection and feature extraction for recognizing expressions. This paper proposes an automatic system for facial expression recognition with a hybrid feature extraction phase that combines holistic and analytic approaches by extracting 307 facial expression features (19 geometric features, 288 appearance features). Expression recognition is performed using a radial basis function (RBF) artificial neural network to recognize the six basic emotions (anger, fear, disgust, happiness, surprise, sadness) in addition to the neutral expression. The system achieved a recognition rate of 97.08% on a person-dependent database and 93.98% on a person-independent one.

  11. AUTOMATIC EXTRACTION OF BUILDING OUTLINE FROM HIGH RESOLUTION AERIAL IMAGERY

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2016-06-01

    Full Text Available In this paper, a new approach for automated extraction of building boundary from high resolution imagery is proposed. The proposed approach uses both geometric and spectral properties of a building to detect and locate buildings accurately. It consists of automatic generation of high quality point cloud from the imagery, building detection from point cloud, classification of building roof and generation of building outline. Point cloud is generated from the imagery automatically using semi-global image matching technology. Buildings are detected from the differential surface generated from the point cloud. Further classification of building roof is performed in order to generate accurate building outline. Finally classified building roof is converted into vector format. Numerous tests have been done on images in different locations and results are presented in the paper.

  12. Automatic Extraction of Building Outline from High Resolution Aerial Imagery

    Science.gov (United States)

    Wang, Yandong

    2016-06-01

    In this paper, a new approach for automated extraction of building boundary from high resolution imagery is proposed. The proposed approach uses both geometric and spectral properties of a building to detect and locate buildings accurately. It consists of automatic generation of high quality point cloud from the imagery, building detection from point cloud, classification of building roof and generation of building outline. Point cloud is generated from the imagery automatically using semi-global image matching technology. Buildings are detected from the differential surface generated from the point cloud. Further classification of building roof is performed in order to generate accurate building outline. Finally classified building roof is converted into vector format. Numerous tests have been done on images in different locations and results are presented in the paper.

  13. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

    Full Text Available In this paper, an automatic emotion detection system is built for a computer or machine to detect the emotional state from facial expressions in human-computer communication. Firstly, dynamic motion features are extracted from facial expression videos, and then advanced machine learning methods for classification and regression are used to predict the emotional states. The system is evaluated on two publicly available datasets, i.e. GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can read the facial expression of its user automatically. This technique can be integrated into applications such as smart robots, interactive games, and smart surveillance systems.

  14. Automatic spike sorting using tuning information.

    Science.gov (United States)

    Ventura, Valérie

    2009-09-01

    Current spike sorting methods focus on clustering neurons' characteristic spike waveforms. The resulting spike-sorted data are typically used to estimate how covariates of interest modulate the firing rates of neurons. However, when these covariates do modulate the firing rates, they provide information about spikes' identities, which thus far have been ignored for the purpose of spike sorting. This letter describes a novel approach to spike sorting, which incorporates both waveform information and tuning information obtained from the modulation of firing rates. Because it efficiently uses all the available information, this spike sorter yields lower spike misclassification rates than traditional automatic spike sorters. This theoretical result is verified empirically on several examples. The proposed method does not require additional assumptions; only its implementation is different. It essentially consists of performing spike sorting and tuning estimation simultaneously rather than sequentially, as is currently done. We used an expectation-maximization maximum likelihood algorithm to implement the new spike sorter. We present the general form of this algorithm and provide a detailed implementable version under the assumptions that neurons are independent and spike according to Poisson processes. Finally, we uncover a systematic flaw of spike sorting based on waveform information only.
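The letter's central idea, that tuning information should enter the spike-sorting decision alongside waveform shape, can be sketched as a posterior over neuron identity combining a Gaussian waveform likelihood with a Poisson point-process term from each neuron's tuning curve. The neurons, parameters, and tuning curves below are hypothetical.

```python
import math

def assign_spike(waveform_feat, covariate, neurons):
    """Classify a spike using both waveform and tuning information.
    Each hypothetical neuron has a Gaussian waveform model (mu, sigma)
    and a tuning curve rate(covariate); the score sums both log terms."""
    scores = {}
    for name, (mu, sigma, rate) in neurons.items():
        wf_loglik = -((waveform_feat - mu) ** 2) / (2 * sigma ** 2)
        # Poisson point-process term: a spike is more likely to come from
        # the neuron whose firing rate is high at the current covariate.
        tuning_loglik = math.log(rate(covariate))
        scores[name] = wf_loglik + tuning_loglik
    return max(scores, key=scores.get)

neurons = {
    "A": (1.0, 0.5, lambda c: 10.0 if c > 0 else 1.0),  # tuned to c > 0
    "B": (1.2, 0.5, lambda c: 1.0 if c > 0 else 10.0),  # tuned to c <= 0
}
# The waveform feature 1.1 lies between the two means, so waveform alone
# is ambiguous; the tuning term breaks the tie.
print(assign_spike(1.1, covariate=1.0, neurons=neurons))   # -> A
print(assign_spike(1.1, covariate=-1.0, neurons=neurons))  # -> B
```

The EM algorithm in the letter estimates the waveform and tuning parameters jointly rather than taking them as given, but the classification rule has this combined-likelihood form.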

  15. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments, and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system comprised of scripts written in Perl, C shell, and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise it converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  16. Automatic Specification Evaluator for Effective Migration

    Directory of Open Access Journals (Sweden)

    P. Sakthivel

    2012-01-01

    Full Text Available Problem statement: Software reengineering is an effective technique for reusing an older application in a new environment. Nowadays, reengineering techniques are increasingly used in spite of the many difficulties and issues that arise when an older application is converted to a newer one, so there is a need to enhance the new system to satisfy user requirements and quality aspects. Approach: For this enhancement of the new system, we propose a method named Automatic Specification Evaluator (ASE), in which interferences and their effects on the new system are identified by their attributes and the interferences are modified if necessary. The accuracy of the migration is further increased by reapplying the same method. Results: After the proposed ASE method, system interference was reduced and the efficiency of the new system was improved; in many migration situations, ASE produces the target system with zero interference. Conclusion: Our proposed method performs well on the new system, so the new system can adopt the properties of the legacy system while also satisfying the user requirements

  17. Studies on the calibration of mammography automatic exposure mode with computed radiography

    International Nuclear Information System (INIS)

    Objective: To optimize image quality and radiation dose by correcting the mammography automatic exposure, according to the automatic exposure control mode of a mammography film-screen system. Methods: The film-screen system (28 kV) was used to perform automatic exposure of plexiglass (40 mm) and obtain the standard exposure dose; the CR exposure mode based on LgM=2.0 was then calibrated in 10 steps. A mammary gland phantom (Fluke NA18-220) was examined with CR (26, 28, and 30 kV) using the corrected automatic exposure mode, and the exposure values (mAs) were recorded. The CR images were evaluated in a double-blind way by 4 radiologists according to the American College of Radiology (ACR) standard. Results: Based on the standard that the CR automatic exposure dose is higher than the traditional exposure of the film-screen system, the calibration of mammography automatic exposure was accomplished. The test results of the calibrated mode exceeded the ACR scoring criteria. Conclusions: The comparative study showed improvements in acquiring high-quality images and reducing radiation dose. The corrected mammography automatic exposure mode may be a better method for clinical use. (authors)

  18. Towards automatic classification of all WISE sources

    Science.gov (United States)

    Kurcz, A.; Bilicki, M.; Solarz, A.; Krupa, M.; Pollo, A.; Małek, K.

    2016-07-01

    Context. The Wide-field Infrared Survey Explorer (WISE) has detected hundreds of millions of sources over the entire sky. Classifying them reliably is, however, a challenging task owing to degeneracies in WISE multicolour space and low levels of detection in its two longest-wavelength bandpasses. Simple colour cuts are often not sufficient; for satisfactory levels of completeness and purity, more sophisticated classification methods are needed. Aims: Here we aim to obtain comprehensive and reliable star, galaxy, and quasar catalogues based on automatic source classification in full-sky WISE data. This means that the final classification will employ only parameters available from WISE itself, in particular those which are reliably measured for the majority of sources. Methods: For the automatic classification we applied a supervised machine learning algorithm, support vector machines (SVM). It requires a training sample with relevant classes already identified, and we chose to use the SDSS spectroscopic dataset (DR10) for that purpose. We tested the performance of two kernels used by the classifier, and determined the minimum number of sources in the training set required to achieve stable classification, as well as the minimum dimension of the parameter space. We also tested SVM classification accuracy as a function of extinction and apparent magnitude. Thus, the calibrated classifier was finally applied to all-sky WISE data, flux-limited to 16 mag (Vega) in the 3.4 μm channel. Results: By calibrating on the test data drawn from SDSS, we first established that a polynomial kernel is preferred over a radial one for this particular dataset. Next, using three classification parameters (W1 magnitude, W1-W2 colour, and a differential aperture magnitude) we obtained very good classification efficiency in all the tests. At the bright end, the completeness for stars and galaxies reaches ~95%, deteriorating to ~80% at W1 = 16 mag, while for quasars it stays at a level of

  19. Desktop calibration of automatic transmission for passenger vehicle

    Institute of Scientific and Technical Information of China (English)

    FANG Chi; SHI Jian-peng; WANG Jun

    2014-01-01

    Desktop calibration of automatic transmissions (AT) is a method that can effectively reduce cost, enhance efficiency, and shorten vehicle development cycles. We introduce the principle and approach of desktop calibration of an AT based on the coupling characteristics between the engine and the torque converter, obtaining the correct operating points exactly. The results are shown to agree reasonably well with experimental measurements. The method was used in different applications abroad based on AT technology and achieved good vehicle performance compared with traditional AT calibration, which focuses primarily on drivability, performance, and fuel consumption.

  20. Automatic Identification of Silence, Unvoiced and Voiced Chunks in Speech

    Directory of Open Access Journals (Sweden)

    Poonam Sharma

    2013-05-01

    Full Text Available The objective of this work is to automatically segment the speech signal into silence, voiced, and unvoiced regions, which is very beneficial in increasing the accuracy and performance of recognition systems. The proposed algorithm is based on three important characteristics of the speech signal, namely zero crossing rate, short-time energy, and fundamental frequency. The performance of the proposed algorithm is evaluated using data collected from four different speakers, and an overall accuracy of 96.61% is achieved.
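Two of the three features named above, zero crossing rate (ZCR) and short-time energy (STE), are simple enough to sketch directly. The rule-of-thumb thresholds below are illustrative, not the paper's: silence has low energy and low ZCR, unvoiced speech has low energy but high ZCR, and voiced speech carries most of the energy.

```python
import math

def short_time_energy(frame):
    # Mean squared amplitude over the frame.
    return sum(s * s for s in frame) / len(frame)

def zero_crossing_rate(frame):
    # Fraction of adjacent sample pairs whose signs differ.
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0))
    return crossings / (len(frame) - 1)

def classify_frame(frame, energy_thr=0.01, zcr_thr=0.3):
    """Label a frame using illustrative thresholds on STE and ZCR."""
    e, z = short_time_energy(frame), zero_crossing_rate(frame)
    if e < energy_thr and z < zcr_thr:
        return "silence"
    return "voiced" if e >= energy_thr else "unvoiced"

voiced = [math.sin(2 * math.pi * 5 * t / 100) for t in range(100)]  # strong tone
silence = [0.001] * 100                                             # near-zero signal
print(classify_frame(voiced), classify_frame(silence))  # -> voiced silence
```

A full implementation would add the fundamental-frequency check to separate voiced speech from other high-energy frames.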

  1. Traceability Through Automatic Program Generation

    Science.gov (United States)

    Richardson, Julian; Green, Jeff

    2003-01-01

    Program synthesis is a technique for automatically deriving programs from specifications of their behavior. One of the arguments made in favour of program synthesis is that it allows one to trace from the specification to the program. One way in which traceability information can be derived is to augment the program synthesis system so that manipulations and calculations it carries out during the synthesis process are annotated with information on what the manipulations and calculations were and why they were made. This information is then accumulated throughout the synthesis process, at the end of which, every artifact produced by the synthesis is annotated with a complete history relating it to every other artifact (including the source specification) which influenced its construction. This approach requires modification of the entire synthesis system - which is labor-intensive and hard to do without influencing its behavior. In this paper, we introduce a novel, lightweight technique for deriving traceability from a program specification to the corresponding synthesized code. Once a program has been successfully synthesized from a specification, small changes are systematically made to the specification and the effects on the synthesized program observed. We have partially automated the technique and applied it in an experiment to one of our program synthesis systems, AUTOFILTER, and to the GNU C compiler, GCC. The results are promising: 1. Manual inspection of the results indicates that most of the connections derived from the source (a specification in the case of AUTOFILTER, C source code in the case of GCC) to its generated target (C source code in the case of AUTOFILTER, assembly language code in the case of GCC) are correct. 2. Around half of the lines in the target can be traced to at least one line of the source. 3. Small changes in the source often induce only small changes in the target.
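The lightweight technique described above, perturb one specification line, re-synthesize, and record which target lines change, can be sketched with a toy synthesizer standing in for AUTOFILTER or GCC. Everything here (the `toy_synth` function, the perturbation scheme) is hypothetical illustration.

```python
def trace_lines(synthesize, spec_lines):
    """Perturb each specification line in turn, re-run synthesis, and
    record which target lines changed; the result maps each source
    line index to the target line indices it influenced."""
    baseline = synthesize(spec_lines)
    traces = {}
    for i in range(len(spec_lines)):
        mutated = list(spec_lines)
        mutated[i] = mutated[i] + "  # perturbed"
        changed = {
            j for j, (a, b) in enumerate(zip(baseline, synthesize(mutated)))
            if a != b
        }
        traces[i] = sorted(changed)
    return traces

# A toy "synthesizer": each spec line expands to two target lines.
def toy_synth(spec):
    out = []
    for line in spec:
        out.append(f"compute {line}")
        out.append(f"store {line}")
    return out

print(trace_lines(toy_synth, ["x = 1", "y = x + 2"]))
# -> {0: [0, 1], 1: [2, 3]}
```

The same loop works for any black-box generator, which is why the paper could apply it to both a program synthesis system and a C compiler without modifying either.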

  2. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. A Demonstration of Automatically Switched Optical Network

    Institute of Scientific and Technical Information of China (English)

    Weisheng Hu; Qingji Zeng; Yaohui Jin; Chun Jiang; Yue Wang; Xiaodong Wang; Chunlei Zhang; Yang Lu; Buwei Xu; Peigang Hu

    2003-01-01

    We built an automatically switched optical network (ASON) testbed with four optical cross-connect nodes. Many fundamental ASON features are demonstrated, implemented by control protocols based on the generalized multi-protocol label switching (GMPLS) framework.

  5. Computer systems for automatic earthquake detection

    Science.gov (United States)

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.

  6. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  7. Automatic acquisition of pattern collocations in GO

    Institute of Scientific and Technical Information of China (English)

    LIU Zhi-qing; DOU Qing; LI Wen-hong; LU Ben-jie

    2008-01-01

    The quality, quantity, and consistency of the knowledge used in GO-playing programs often determine their strengths, and automatic acquisition of large amounts of high-quality and consistent GO knowledge is crucial for successful GO playing. In a previous article on this subject, we presented an algorithm for efficient and automatic acquisition of spatial patterns of GO, as well as their frequency of occurrence, from game records. In this article, we present two algorithms: one for efficient and automatic acquisition of pairs of spatial patterns that appear jointly in a local context, and the other for determining whether the joint pattern appearances are statistically significant and not just a coincidence. Results of the two algorithms include 1,779,966 pairs of spatial patterns acquired automatically from 16,067 game records of professional GO players, of which about 99.8% qualify as pattern collocations with a statistical confidence of 99.5% or higher.
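
    The record does not name the significance test used; a common stand-in is a one-sided z-test of the observed joint count against the independence hypothesis. The counts, function name, and threshold below are illustrative assumptions:

```python
import math

def collocation_z(n, c_a, c_b, c_ab):
    """One-sided z-score of the joint count of patterns A and B against the
    independence hypothesis. n = local contexts examined, c_a / c_b =
    contexts containing A / B, c_ab = contexts containing both. The paper's
    exact statistic is unspecified; this binomial test is a stand-in."""
    p = (c_a / n) * (c_b / n)             # expected joint probability
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    return (c_ab - mu) / sigma

# a 99.5% one-sided confidence level corresponds to z > 2.576
z = collocation_z(n=10_000, c_a=400, c_b=300, c_ab=60)   # z is about 13.9
```

Pairs whose z-score clears the 2.576 threshold would be kept as collocations; pairs whose joint count matches the independence expectation (here, 12) would be discarded as coincidence.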

  8. Three layered framework for automatic service composition

    Science.gov (United States)

    Liu, Xinqiong; Xia, Ping; Wan, Junli

    2009-10-01

    For automatic service composition, a planning-based framework, MOCIS, is proposed. Planning rests on two major techniques, service reasoning and constraint satisfaction, where constraint satisfaction is divided into quality-constraint and quantity-constraint satisfaction. In contrast to traditional methods that interleave activity, message, and provider concerns, the novelty of the framework is separating them into three layers: the activity layer handles service reasoning, the message layer quality constraints, and the provider layer quantity constraints. The layered architecture makes automatic web service composition practical: an activity tree abstracting BPEL, and abstract and concrete BPEL lists, are derived automatically at each layer, and users can select the abstract BPEL or BPEL that satisfies their request. E-traveling composition cases have been tested, demonstrating that complex services can be composed automatically through the three layers.

  9. Variable load automatically tests dc power supplies

    Science.gov (United States)

    Burke, H. C., Jr.; Sullivan, R. M.

    1965-01-01

    Continuously variable load automatically tests dc power supplies over an extended current range. External meters monitor current and voltage, and multipliers at the outputs facilitate plotting the power curve of the unit.

  10. Automatic coding of online collaboration protocols

    NARCIS (Netherlands)

    Erkens, Gijsbert; Janssen, J.J.H.M.

    2006-01-01

    An automatic coding procedure is described to determine the communicative functions of messages in chat discussions. Five main communicative functions are distinguished: argumentative (indicating a line of argumentation or reasoning), responsive (e.g., confirmations, denials, and answers), informati...

  11. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  12. Automatization and familiarity in repeated checking

    NARCIS (Netherlands)

    Dek, Eliane C P; van den Hout, Marcel A.; Giele, Catharina L.; Engelhard, Iris M.

    2014-01-01

    Repeated checking paradoxically increases memory uncertainty. This study investigated the underlying mechanism of this effect. We hypothesized that as a result of repeated checking, familiarity with stimuli increases, and automatization of the checking procedure occurs, which should result in decrea...

  13. Automatic safety rod for reactors. [LMFBR

    Science.gov (United States)

    Germer, J.H.

    1982-03-23

    An automatic safety rod for a nuclear reactor containing neutron absorbing material and designed to be inserted into a reactor core after a loss of flow. Actuation is based on either a sudden decrease in core pressure drop or the pressure drop falling below a predetermined minimum value. The automatic control rod includes a pressure regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  14. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.
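
    The paper projects 2-D terrain data onto transfinite function spaces; as a minimal sketch of the underlying L2-projection step, the 1-D analogue below assembles the piecewise-linear (P1) mass matrix M and load vector F on a uniform mesh and solves M c = F. The mesh, element type, and function are illustrative assumptions, not the paper's transfinite spaces:

```python
import numpy as np

def l2_projection_p1(f, a, b, n):
    """L2 projection of f onto continuous piecewise-linear functions on a
    uniform n-element mesh of [a, b]: assemble the P1 mass matrix and load
    vector, then solve M c = F for the nodal coefficients."""
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    M = np.zeros((n + 1, n + 1))
    F = np.zeros(n + 1)
    for e in range(n):
        # element mass matrix of linear hat functions: (h/6) * [[2,1],[1,2]]
        M[e:e + 2, e:e + 2] += h / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
        # element load F_i = integral of f * phi_i, by 2-point Gauss quadrature
        for gp in (-1 / np.sqrt(3), 1 / np.sqrt(3)):
            xm = x[e] + h * (gp + 1) / 2
            phi = np.array([(1 - gp) / 2, (1 + gp) / 2])
            F[e:e + 2] += h / 2 * f(xm) * phi
    return x, np.linalg.solve(M, F)

x, c = l2_projection_p1(np.sin, 0.0, np.pi, 64)
# c approximates sin at the nodes; refining where the residual is large
# mirrors the paper's error-driven refinement idea
```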

  15. Automatic Programming with Ant Colony Optimization

    OpenAIRE

    Green, Jennifer; Jacqueline L. Whalley; Johnson, Colin G.

    2004-01-01

    Automatic programming is the use of search techniques to find programs that solve a problem. The most commonly explored automatic programming technique is genetic programming, which uses genetic algorithms to carry out the search. In this paper we introduce a new technique called Ant Colony Programming (ACP) which uses an ant colony based search in place of genetic algorithms. This algorithm is described and compared with other approaches in the literature.

  16. Automatic Morphometry of Nerve Histological Sections

    OpenAIRE

    Romero, E.; Cuisenaire, O.; Denef, J.; Delbeke, J.; Macq, B.; Veraart, C.

    2000-01-01

    A method for the automatic segmentation, recognition and measurement of neuronal myelinated fibers in nerve histological sections is presented. In this method, the fiber parameters i.e. perimeter, area, position of the fiber and myelin sheath thickness are automatically computed. Obliquity of the sections may be taken into account. First, the image is thresholded to provide a coarse classification between myelin and non-myelin pixels. Next, the resulting binary image is further simplified usi...

  17. Automatic processing of dominance and submissiveness

    OpenAIRE

    Moors, Agnes; De Houwer, Jan

    2005-01-01

    We investigated whether people are able to detect in a relatively automatic manner the dominant or submissive status of persons engaged in social interactions. Using a variant of the affective Simon task (De Houwer & Eelen, 1998), we demonstrated that the verbal response DOMINANT or SUBMISSIVE was facilitated when it had to be made to a target person that was respectively dominant or submissive. These results provide new information about the automatic nature of appraisals and ...

  18. AUTOMATIC CAPTION GENERATION FOR ELECTRONICS TEXTBOOKS

    OpenAIRE

    Veena Thakur; Trupti Gedam

    2015-01-01

    Automatic or semi-automatic approaches for developing Technology Supported Learning Systems (TSLS) are required to lighten their development cost. The main objective of this paper is to automate the generation of a caption module; it aims at reproducing the way teachers prepare their lessons and the learning material they will use throughout the course. Teachers tend to choose one or more textbooks that cover the contents of their subjects, determine the topics to be addressed, and identify...

  19. Automatic text categorisation of racist webpages

    OpenAIRE

    Greevy, Edel

    2004-01-01

    Automatic Text Categorisation (TC) involves the assignment of one or more predefined categories to text documents in order that they can be effectively managed. In this thesis we examine the possibility of applying automatic text categorisation to the problem of categorising texts (web pages) based on whether or not they are racist. TC has proven successful for topic-based problems such as news story categorisation. However, the problem of detecting racism is dissimilar to topic-based pro...

  20. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik;

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  1. UMLS-based automatic image indexing.

    Science.gov (United States)

    Sneiderman, Charles Alan; Demner-Fushman, Dina; Fung, Kin Wah; Bray, Bruce

    2008-01-01

    To date, most accurate image retrieval techniques rely on textual descriptions of images. Our goal is to automatically generate indexing terms for an image extracted from a biomedical article by identifying Unified Medical Language System (UMLS) concepts in the image caption and its discussion in the text. In a pilot evaluation of the suggested image indexing method by five physicians, a third of the automatically identified index terms were found suitable for indexing.

  2. A Log Anomaly Detection Algorithm for Debugging Based on Grammar-Based Codes

    Institute of Scientific and Technical Information of China (English)

    王楠; 韩冀中; 方金云

    2013-01-01

    Debugging non-deterministic bugs has long been an important problem in software development. In recent years, with the rapid emergence of large cloud computing systems and the development of record-replay debugging, the key to such debugging has become mining anomalous information from console text logs and/or execution-flow logs, so anomaly detection algorithms can be applied in this area. However, most traditional anomaly detection algorithms were designed to detect network attacks and are not suitable for these new problems. One important reason is the Markov assumption on which many traditional anomaly detection methods are based: Markov-based methods are sensitive to abrupt changes in event transitions, whereas system diagnosis requires the ability to detect semantic misbehavior. Experimental results show that Markov-based methods perform poorly on such problems. This paper presents a novel anomaly detection algorithm based on grammar-based codes. Unlike previous approaches, the algorithm is non-Markovian: it relies on no statistical model, probability model, machine learning, or Markov assumption, and is extremely simple to design and implement. Experiments show that it outperforms traditional methods at detecting high-level semantic anomalies.
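
    The record gives no implementation details; as a sketch of the minimum-description-length idea behind grammar-based-code anomaly detection, the example below uses zlib (an LZ77 coder) as a stand-in for a true grammar-based code such as Sequitur, with hypothetical log tokens:

```python
import zlib

def description_length(seq: str) -> int:
    """Compressed size in bytes, a stand-in for a grammar's description length."""
    return len(zlib.compress(seq.encode()))

def anomaly_score(baseline: str, window: str) -> int:
    """Extra bytes needed to encode `window` after the compressor has seen
    the baseline's regularities. Windows that follow the 'grammar' of normal
    logs compress well; semantically unusual windows inflate the length."""
    return description_length(baseline + window) - description_length(baseline)

normal = "open read read close " * 50      # hypothetical normal event log
typical = "open read read close "
unusual = "close close open panic "
# anomaly_score(normal, unusual) exceeds anomaly_score(normal, typical)
```

Because the score needs no probability model of event transitions, a window whose events are individually common but semantically out of order still stands out, which is the non-Markovian property the abstract emphasizes.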

  3. Semi-automatic removal of foreground stars from images of galaxies

    CERN Document Server

    Frei, Z

    1996-01-01

    A new procedure designed to remove foreground stars from galaxy profiles is presented. Although several programs exist for stellar and faint-object photometry, none of them treats star removal from the images very carefully. I present my attempt to develop such a system, and briefly compare the performance of my software to one of the well-known stellar photometry packages, DAOPhot. The major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function from well-separated stars situated off the galaxy; (2) automatic identification of peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fixing of remaining degradations in the image. The algorithm and software presented here are significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since...
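
    Step (2), scaling the empirical PSF to each stellar peak and subtracting it, can be sketched as a least-squares fit. The array shapes, background handling, and toy scene below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def subtract_star(image, psf, y, x):
    """Scale an empirical PSF stamp to the star at (y, x) by least squares,
    after removing the local background, then subtract it in place.
    `psf` is an odd-sized stamp with the peak at its centre (an assumption)."""
    h, w = psf.shape
    y0, x0 = y - h // 2, x - w // 2
    patch = image[y0:y0 + h, x0:x0 + w]
    bg = np.median(patch)                 # crude local background estimate
    # argmin_s ||(patch - bg) - s*psf||^2  =>  s = <patch - bg, psf> / <psf, psf>
    scale = np.sum((patch - bg) * psf) / np.sum(psf * psf)
    image[y0:y0 + h, x0:x0 + w] -= scale * psf
    return image

# toy scene: a Gaussian "star" of amplitude 3 on a flat "galaxy" level of 5
yy, xx = np.mgrid[-3:4, -3:4]
psf = np.exp(-(yy ** 2 + xx ** 2) / 2.0)
img = np.full((21, 21), 5.0)
img[7:14, 7:14] += 3.0 * psf
cleaned = subtract_star(img, psf, 10, 10)   # residual is small vs. amplitude 3
```

In a real pipeline the residual patch would then be inspected and patched only where it remains significant, as the abstract describes.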

  4. Advanced automatic target recognition for police helicopter missions

    Science.gov (United States)

    Stahl, Christoph; Schoppmann, Paul

    2000-08-01

    The results of a case study on the application of an advanced method for automatic target recognition to infrared imagery taken from police helicopter missions are presented. The method consists of the following steps: preprocessing, classification, fusion, postprocessing, and tracking, and combines three paradigms: image pyramids, neural networks, and Bayesian nets. The technology was developed using a variety of scenes typical of military aircraft missions. Infrared cameras have been in use for several years by the Bavarian police helicopter forces and are highly valuable for night missions. Several object classes like 'persons' or 'vehicles' are tested, and the possible discrimination between persons and animals is shown. The analysis of complex scenes with hidden objects and clutter shows the potential and limitations of automatic target recognition for real-world tasks. Several display concepts illustrate the achievable improvement in situation awareness. The similarities and differences between various mission types concerning object variability, time constraints, consequences of false alarms, etc. are discussed. Typical police actions like searching for missing persons or runaway criminals illustrate the advantages of automatic target recognition. The results demonstrate the possible operational benefits for the helicopter crew. Future work will include performance evaluation issues and a system integration concept for the target platform.

  5. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error-free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  6. Automatic spectrophotometric determination of trace amounts of boron with curcumin

    International Nuclear Information System (INIS)

    The proposed method utilizes a rosocyanin complex formed by the reaction of boric acid and curcumin without evaporation to dryness. The automatic determination of boron in aqueous solution is performed according to the predetermined program (Fig. 8), after manual injection of a sample solution (2.00 ml) to the reaction vessel. Glacial acetic acid (5.40 ml) and propionic anhydride (13.20 ml) are added and the solution is circulated through the circulating pipe consisting of a bubble remover, an absorbance measuring flow cell, an air blowing tube and a drain valve. Oxalyl chloride (0.81 ml) is added and the solution is circulated for 80 seconds to eliminate water. Sulfuric acid (1.08 ml) and curcumin reagent (3.01 ml) are added and the solution is circulated for 120 seconds to form a rosocyanin complex. After addition of an acetate buffer solution (21.34 ml) for the neutralisation of an interfering proton complex of curcumin, the absorbance of the orange solution is measured at 545 nm. This automatic analysis is sensitive (Fig. 9) and rapid; less than 1.5 μg of boron is determined in 7 minutes. It can be applied to the determination of trace amounts of boron in steel samples, combined with an automatic distillation under development. (auth.)

  7. Automatic graphene transfer system for improved material quality and efficiency

    Science.gov (United States)

    Boscá, Alberto; Pedrós, Jorge; Martínez, Javier; Palacios, Tomás; Calle, Fernando

    2016-02-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The process is based on the all-fluidic manipulation of the graphene to avoid mechanical damage, strain and contamination, and on the combination of capillary action and electrostatic repulsion between the graphene and its container to ensure a centered sample on top of the target substrate. The improved carrier mobility and yield of the automatically transferred graphene, as compared to that manually transferred, is demonstrated by the optical and electrical characterization of field-effect transistors fabricated on both materials. In particular, 70% higher mobility values, with a 30% decrease in the unintentional doping and a 10% strain reduction are achieved. The system has been developed for lab-scale transfer and proved to be scalable for industrial applications.

  8. Automatic reconstruction of neural morphologies with multi-scale tracking.

    Science.gov (United States)

    Choromanska, Anna; Chang, Shih-Fu; Yuste, Rafael

    2012-01-01

    Neurons have complex axonal and dendritic morphologies that are the structural building blocks of neural circuits. The traditional method to capture these morphological structures using manual reconstructions is time-consuming and partly subjective, so it appears important to develop automatic or semi-automatic methods to reconstruct neurons. Here we introduce a fast algorithm for tracking neural morphologies in 3D with simultaneous detection of branching processes. The method is based on existing tracking procedures, adding the machine vision technique of multi-scaling. Starting from a seed point, our algorithm tracks axonal or dendritic arbors within a sphere of a variable radius, then moves the sphere center to the point on its surface with the shortest Dijkstra path, detects branching points on the surface of the sphere, scales it until branches are well separated and then continues tracking each branch. We evaluate the performance of our algorithm on preprocessed data stacks obtained by manual reconstructions of neural cells, corrupted with different levels of artificial noise, and unprocessed data sets, achieving 90% precision and 81% recall in branch detection. We also discuss limitations of our method, such as reconstructing highly overlapping neural processes, and suggest possible improvements. Multi-scaling techniques, well suited to detect branching structures, appear a promising strategy for automatic neuronal reconstructions. PMID:22754498

  9. Automatic imitation in a rich social context with virtual characters

    Directory of Open Access Journals (Sweden)

    Xueni Pan

    2015-06-01

    It has been well established that people respond faster when they perform an action that is congruent with an observed action than when they respond with an incongruent action. Here we propose a new method using interactive Virtual Characters (VCs) to test whether social congruency effects can be obtained in a richer social context with sequential hand-arm actions. Two separate experiments were conducted, exploring whether it is feasible to measure spatial congruency (experiment 1) and anatomical congruency (experiment 2) in response to a virtual character, compared to the same action sequence indicated by three virtual balls. In experiment 1, we found a robust spatial congruency effect for both the VC and the virtual balls, modulated by a social facilitation effect for participants who felt the VC was human. In experiment 2, which allowed for anatomical congruency, a form-by-congruency interaction provided evidence that participants automatically imitate the actions of the VC but do not imitate the balls. Our method and results build a bridge between studies using minimal stimuli in automatic imitation and studies of mimicry in rich social interaction, and open a new venue for future research on automatic imitation with more ecologically valid social interaction.

  10. 20 CFR 404.285 - Recomputations performed automatically.

    Science.gov (United States)

    2010-04-01

    Section 404.285, Employees' Benefits; Social Security Administration; Federal Old-Age, Survivors and Disability Insurance (1950- ); Computing Primary Insurance Amounts; Recomputing Your Primary Insurance Amount. ... retired, disabled, and deceased worker to see if the worker's primary insurance amount may be...

  11. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    14 CFR 23.1329, Aeronautics and Space; Installation; § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can (1) be quickly and...

  12. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    The application of microprocessors to the programming and computation of solution chemical analyses by a sequential technique is examined. Safety, performance, and reliability are compared with those of other methods. An example is given of uranium titration by spectrophotometry.

  13. QoS-Aware Automatic Service Composition: A Graph View

    Institute of Scientific and Technical Information of China (English)

    Wei Jiang; Tian Wu; Song-Lin Hu; Zhi-Yong Liu

    2011-01-01

    In research on service composition, efficient algorithms are needed that not only retrieve correct service compositions automatically from thousands of services but also satisfy the quality requirements of different service users. However, most approaches treat these two aspects as separate problems: automatic service composition and service selection. Although recent research recognizes the limitation of this separate view and some specific methods have been proposed, they still suffer from serious limitations in scalability and accuracy when addressing both requirements simultaneously. To cope with these limitations and efficiently solve the combined problem, known as QoS-aware or QoS-driven automatic service composition, we propose a new graph search problem, single-source optimal directed acyclic graphs (DAGs), for the first time. This novel single-source optimal DAGs (SSOD) problem is similar to, but more general than, the classical single-source shortest paths (SSSP) problem. In this paper, a new graph model of the SSOD problem is proposed and a Sim-Dijkstra algorithm is presented to address it with time complexity O(n log n + m), where n and m are the numbers of nodes and edges in the graph, together with proofs of its soundness. It is applied directly to the QoS-aware automatic service composition problem, and a service composition tool named QSynth is implemented. Evaluations show that the Sim-Dijkstra algorithm achieves superior scalability and efficiency across a large variety of composition scenarios, even outperforming our worklist algorithm that won the performance championship of Web Services Challenge 2009.
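
    Sim-Dijkstra itself is not specified in the record; the sketch below shows the classical single-source shortest-paths relaxation with a binary heap that it generalizes, on a hypothetical service graph whose edge weights stand in for QoS costs:

```python
import heapq

def dijkstra(graph, source):
    """Classical SSSP with a binary heap: O((n + m) log n) with lazy deletion.
    Sim-Dijkstra extends this relaxation scheme from optimal paths to optimal
    DAGs; the service graph here is an illustrative stand-in."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd              # relax edge (u, v)
                heapq.heappush(pq, (nd, v))
    return dist

# hypothetical QoS costs between abstract services s, a, b and target t
g = {"s": [("a", 2), ("b", 5)], "a": [("b", 1), ("t", 6)], "b": [("t", 2)]}
# dijkstra(g, "s") -> {"s": 0, "a": 2, "b": 3, "t": 5}
```

The key difference in SSOD is that a composition may need several incoming branches (a DAG) rather than a single best predecessor, which is why a generalized relaxation is required.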

  14. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. 
The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  15. Automatic detection of AutoPEEP during controlled mechanical ventilation

    Directory of Open Access Journals (Sweden)

    Nguyen Quang-Thang

    2012-06-01

    Abstract. Background: Dynamic hyperinflation, hereafter called AutoPEEP (auto-positive end-expiratory pressure) with some slight abuse of language, is a frequent deleterious phenomenon in patients undergoing mechanical ventilation. Although not readily quantifiable, AutoPEEP can be recognized on the expiratory portion of the flow waveform: if expiratory flow does not return to zero before the next inspiration, AutoPEEP is present. This simple detection, however, requires the eye of an expert clinician at the patient's bedside, so automatic detection of AutoPEEP should be helpful to optimize care. Methods: In this paper, a platform for automatic detection of AutoPEEP based on the flow signal available on most recent mechanical ventilators is introduced. The detection algorithms are developed on the basis of robust non-parametric hypothesis tests that require no prior information on the signal distribution. In particular, two detectors are proposed: one based on SNT (Signal Norm Testing) and the other an extension of SNT in the sequential framework. The performance assessment was carried out on a respiratory system analog and ex vivo on various retrospectively acquired patient curves. Results: The experimental results show that the proposed algorithm provides relevant AutoPEEP detection on both simulated and real data. The analysis of clinical data shows that the proposed detectors can automatically detect AutoPEEP with an accuracy of 93% and a recall (sensitivity) of 90%. Conclusions: The proposed platform provides automatic early detection of AutoPEEP. Such functionality can be integrated into currently used mechanical ventilators for continuous monitoring of the patient-ventilator interface and can therefore alleviate the clinician's task.
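
    The clinical rule the abstract quotes (expiratory flow not returning to zero before the next inspiration) can be sketched directly; the threshold, sign convention, and waveform below are illustrative assumptions, far simpler than the paper's SNT-based detectors:

```python
def autopeep_flags(flow, insp_starts, tol=0.05):
    """Flag breaths whose expiratory flow has not returned to near zero just
    before the next inspiration onset. `flow` is in L/s with expiration
    negative by convention; `insp_starts` are sample indices of inspiration
    onsets (the final index marks the end of the record). `tol` is an
    illustrative threshold, not the SNT statistic used in the paper."""
    return [abs(flow[i - 1]) > tol for i in insp_starts[1:]]

flow = [0.5, 0.3, 0.0, -0.6, -0.3, -0.02,   # breath 1: flow back near zero
        0.5, 0.3, 0.0, -0.6, -0.4, -0.20]   # breath 2: flow still -0.2 L/s
flags = autopeep_flags(flow, insp_starts=[0, 6, 12])
# flags == [False, True]: AutoPEEP suspected on the second breath only
```

The paper replaces the fixed `tol` with a non-parametric hypothesis test on the end-expiratory flow samples, which is what makes the decision robust to noise and signal-distribution changes.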

  16. Experiences with automatic N and P measurements of an activated sludge process in a research environment

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Temmink, H.

    1996-01-01

    Some of the advantages of on-line automatic measurement of ammonia, nitrate and phosphate for studying activated sludge systems are pointed out with the help of examples of batch experiments. Sample taking is performed by cross-flow filtration and measurement of all three analytes is performed...

  17. Analysis of and countermeasures for problems occurring in reverse osmosis during debugging and operation

    Institute of Scientific and Technical Information of China (English)

    秦昊

    2012-01-01

    The technological process of the groundwater pre-desalination system of Shaanxi Weihe Coal Chemical Engineering Group is introduced, together with the process and equipment problems that occurred during debugging and operation. The effects on the reverse osmosis system of influent conductivity, influent SDI, influent residual chlorine and ORP, the security filter, and product-water backpressure, among other factors, are analyzed and discussed, and corresponding countermeasures are put forward.

  18. Multi-channel Virtual Instrument Oscilloscope for a Servo Motor Debugging System

    Institute of Scientific and Technical Information of China (English)

    张超; 游林儒

    2012-01-01

    The application of virtual instrument technology to a servo motor debugging system is introduced. The RS-485 bus was chosen as the data transmission channel to meet the system's real-time requirements; over a serial interface it sends the voltage, current, and speed collected by the microcontroller, along with related servo-control parameters, to the PC. The PC side is programmed in LabVIEW, which implements data collection, processing, display, and storage, realizing on-line monitoring of the servo motor control system. Experimental results show that the system meets the required technical targets.

  19. Application of a Portable Gas Chromatography Detection Technique in the Debugging Stage of a UHV Project

    Institute of Scientific and Technical Information of China (English)

    王海飞; 苏镇西; 祁炯; 赵也; 程伟; 袁小芳; 韩慧慧

    2015-01-01

    The existing portable gas chromatography detection technique is improved by adding an FPD detector and optimizing the experimental conditions, so that more kinds of decomposition products can be detected while maintaining high separation and detection efficiency. In the debugging stage of UHV sulfur hexafluoride electrical equipment, this technique can quickly and accurately detect SF6 gas decomposition products in suspected faulty equipment, helping to accurately judge the position, type, and severity of the fault.

  20. Automatic measurement and representation of prosodic features

    Science.gov (United States)

    Ying, Goangshiuan Shawn

    Effective measurement and representation of prosodic features of the acoustic signal for use in automatic speech recognition and understanding systems is the goal of this work. Prosodic features-stress, duration, and intonation-are variations of the acoustic signal whose domains are beyond the boundaries of each individual phonetic segment. Listeners perceive prosodic features through a complex combination of acoustic correlates such as intensity, duration, and fundamental frequency (F0). We have developed new tools to measure F0 and intensity features. We apply a probabilistic global error correction routine to an Average Magnitude Difference Function (AMDF) pitch detector. A new short-term frequency-domain Teager energy algorithm is used to measure the energy of a speech signal. We have conducted a series of experiments performing lexical stress detection on words in continuous English speech from two speech corpora. We have experimented with two different approaches, a segment-based approach and a rhythm unit-based approach, in lexical stress detection. The first approach uses pattern recognition with energy- and duration-based measurements as features to build Bayesian classifiers to detect the stress level of a vowel segment. In the second approach we define rhythm unit and use only the F0-based measurement and a scoring system to determine the stressed segment in the rhythm unit. A duration-based segmentation routine was developed to break polysyllabic words into rhythm units. The long-term goal of this work is to develop a system that can effectively detect the stress pattern for each word in continuous speech utterances. Stress information will be integrated as a constraint for pruning the word hypotheses in a word recognition system based on hidden Markov models.

  1. Automatic detection of aircraft emergency landing sites

    Science.gov (United States)

    Shen, Yu-Fei; Rahman, Zia-ur; Krusienski, Dean; Li, Jiang

    2011-06-01

    An automatic landing site detection algorithm is proposed for aircraft emergency landing. Emergency landing is an unplanned event in response to emergency situations. If, as is unfortunately usually the case, there is no airstrip or airfield that can be reached by the un-powered aircraft, a crash landing or ditching has to be carried out. Identifying a safe landing site is critical to the survival of passengers and crew. Conventionally, the pilot chooses the landing site visually by looking at the terrain through the cockpit. The success of this vital decision greatly depends on the external environmental factors that can impair human vision, and on the pilot's flight experience, which can vary significantly among pilots. Therefore, we propose a robust, reliable and efficient algorithm that is expected to alleviate the negative impact of these factors. We present only the detection mechanism of the proposed algorithm and assume that image enhancement for increased visibility, and image stitching for a larger field of view, have already been performed on the images acquired by aircraft-mounted cameras. Specifically, we describe an elastic bound detection method which is designed to position the horizon. The terrain image is divided into non-overlapping blocks which are then clustered according to a "roughness" measure. Adjacent smooth blocks are merged to form potential landing sites whose dimensions are measured with principal component analysis and geometric transformations. If the dimensions of the candidate region exceed the minimum requirement for safe landing, the potential landing site is considered a safe candidate and highlighted on the human-machine interface. In the end, the pilot makes the final decision by confirming one of the candidates, also considering other factors such as wind speed and wind direction. Preliminary results show the feasibility of the proposed algorithm.
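The block-wise "roughness" clustering step can be sketched as follows, using per-block intensity variance as a stand-in roughness measure (the block size and threshold are illustrative assumptions, not values from the paper):

```python
import numpy as np

def smooth_block_mask(image, block=8, roughness_thresh=25.0):
    """Label each non-overlapping block as 'smooth' if its intensity
    variance (a simple stand-in for the paper's roughness measure) is
    below a threshold; smooth blocks are landing-site candidates."""
    h, w = image.shape
    rows, cols = h // block, w // block
    mask = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            tile = image[r * block:(r + 1) * block,
                         c * block:(c + 1) * block]
            mask[r, c] = tile.var() < roughness_thresh
    return mask

# Synthetic terrain: noisy left half (rough), near-constant right half
# (smooth); only the right-half blocks should be flagged as candidates.
rng = np.random.default_rng(0)
terrain = np.empty((64, 64))
terrain[:, :32] = rng.uniform(0, 255, (64, 32))          # rough half
terrain[:, 32:] = 128.0 + rng.normal(0, 1.0, (64, 32))   # smooth half
mask = smooth_block_mask(terrain)
```

Merging adjacent `True` blocks and measuring the merged region's extent would correspond to the PCA/geometric-transformation step described in the abstract.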

  2. Automatic query formulations in information retrieval.

    Science.gov (United States)

    Salton, G; Buckley, C; Fox, E A

    1983-07-01

    Modern information retrieval systems are designed to supply relevant information in response to requests received from the user population. In most retrieval environments the search requests consist of keywords, or index terms, interrelated by appropriate Boolean operators. Since it is difficult for untrained users to generate effective Boolean search requests, trained search intermediaries are normally used to translate original statements of user need into useful Boolean search formulations. Methods are introduced in this study which reduce the role of the search intermediaries by making it possible to generate Boolean search formulations completely automatically from natural language statements provided by the system patrons. Frequency considerations are used automatically to generate appropriate term combinations as well as Boolean connectives relating the terms. Methods are covered to produce automatic query formulations both in a standard Boolean logic system, as well as in an extended Boolean system in which the strict interpretation of the connectives is relaxed. Experimental results are supplied to evaluate the effectiveness of the automatic query formulation process, and methods are described for applying the automatic query formulation process in practice. PMID:10299297
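As an illustration of frequency-driven query formulation (a simplified heuristic for the idea, not the exact rules of the paper): rare request terms are specific enough to stand alone, while common terms are paired with AND to raise their joint specificity, and the clauses are joined by OR:

```python
def boolean_query(request_terms, doc_freq, n_docs, rare_cutoff=0.2):
    """Heuristic sketch of frequency-based Boolean query formulation.

    `doc_freq` maps each term to its document frequency; the cutoff
    separating rare from common terms is an illustrative assumption.
    """
    rare = [t for t in request_terms
            if doc_freq.get(t, 0) / n_docs <= rare_cutoff]
    common = [t for t in request_terms if t not in rare]
    clauses = list(rare)
    # AND adjacent common terms pairwise to sharpen broad terms.
    for a, b in zip(common[::2], common[1::2]):
        clauses.append(f"({a} AND {b})")
    return " OR ".join(clauses)

# Hypothetical document frequencies over a 1000-document collection:
df = {"information": 900, "retrieval": 700, "boolean": 120, "query": 80}
q = boolean_query(["information", "retrieval", "boolean", "query"],
                  df, n_docs=1000)
```

Here `q` becomes `boolean OR query OR (information AND retrieval)`: the two rare terms stand alone while the two frequent terms are conjoined.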

  3. Automatic prejudice in childhood and early adolescence.

    Science.gov (United States)

    Degner, Juliane; Wentura, Dirk

    2010-03-01

    Four cross-sectional studies are presented that investigated the automatic activation of prejudice in children and adolescents (aged 9 years to 15 years). Therefore, 4 different versions of the affective priming task were used, with pictures of ingroup and outgroup members being presented as prejudice-related prime stimuli. In all 4 studies, a pattern occurred that suggests a linear developmental increase of automatic prejudice with significant effects of outgroup negativity appearing only around the ages of 12 to 13 years. Results of younger children, on the contrary, did not indicate any effect of automatic prejudice activation. In contrast, prejudice effects in an Implicit Association Test (IAT) showed high levels of prejudice independent of age (Study 3). Results of Study 4 suggest that these age differences are due to age-related differences in spontaneous categorization processes. Introducing a forced-categorization into the affective priming procedure produced a pattern of results equivalent to that obtained with the IAT. These results suggest that although children are assumed to acquire prejudice at much younger ages, automatization of such attitudes might be related to developmental processes in early adolescence. We discuss possible theoretical implications of these results for a developmental theory of prejudice representation and automatization during childhood and adolescence. PMID:20175618

  4. Automatic contrast: evidence that automatic comparison with the social self affects evaluative responses.

    Science.gov (United States)

    Ruys, Kirsten I; Spears, Russell; Gordijn, Ernestine H; de Vries, Nanne K

    2007-08-01

    The aim of the present research was to investigate whether unconsciously presented affective information may cause opposite evaluative responses depending on what social category the information originates from. We argue that automatic comparison processes between the self and the unconscious affective information produce this evaluative contrast effect. Consistent with research on automatic behaviour, we propose that when an intergroup context is activated, an automatic comparison to the social self may determine the automatic evaluative responses, at least for highly visible categories (e.g. sex, ethnicity). Contrary to previous research on evaluative priming, we predict automatic contrastive responses to affective information originating from an outgroup category such that the evaluative response to neutral targets is opposite to the valence of the suboptimal primes. Two studies using different intergroup contexts provide support for our hypotheses. PMID:17705936

  5. Design of an Automatic Coffee Machine Control System Based on the EM78P510NK

    Institute of Scientific and Technical Information of China (English)

    邵雯

    2013-01-01

    This article introduces the design of an automatic coffee machine control system. Its core is an MCU based on the EM78P510NK, which reliably controls the piston position, seal gland position, temperature, water flow, and coffee grind coarseness, and raises an alarm when water or coffee beans run short. The control system is reliable in operation, simple in structure, and easy to debug and extend, making it suitable for commercial, catering, office, and home use.

  6. Automatic Parallelization of Scientific Application

    DEFF Research Database (Denmark)

    Blum, Troels

    performance gains. Scientists working with computer simulations should be allowed to focus on their field of research and not spend excessive amounts of time learning exotic programming models and languages. We have with Bohrium achieved very promising results by starting out with a relatively simple approach...

  7. Automatic energy calibration of germanium detectors using fuzzy set theory

    International Nuclear Information System (INIS)

    With the advent of multi-detector arrays, many tasks that are usually performed by physicists, such as energy calibration, become very time consuming. There is therefore a need to develop more and more complex algorithms able to mimic human expertise. Fuzzy logic proposes a theoretical framework to build algorithms that are close to the human way of thinking. In this paper we apply fuzzy set theory in order to develop an automatic procedure for energy calibration. The algorithm, based on fuzzy concepts, has been tested on data taken with the EUROBALL IV γ-ray array

  8. Mind Out of Action: The Intentionality of Automatic Actions

    OpenAIRE

    Di Nucci, Ezio

    2008-01-01

    We think less than we think. My thesis moves from this suspicion to show that standard accounts of intentional action can't explain the whole of agency. Causalist accounts such as Davidson's and Bratman's, according to which an action can be intentional only if it is caused by a particular mental state of the agent, don't work for every kind of action. So-called automatic actions, effortless performances over which the agent doesn't deliberate, and to which she doesn't need to ...

  9. High Range Resolution Profile Automatic Target Recognition Using Sparse Representation

    Institute of Scientific and Technical Information of China (English)

    Zhou Nuo; Chen Wei

    2010-01-01

    Sparse representation is a new signal analysis method that has received increasing attention in recent years. In this article, a novel scheme for high range resolution profile automatic target recognition of ground moving targets is proposed. Sparse representation theory is applied to analyze the components of high range resolution profiles, and sparse coefficients are used to describe their features. Numerous experiments with the number of target types ranging from 2 to 6 have been implemented. Results show that the proposed scheme not only provides higher recognition precision in real time, but also achieves more robust performance as the number of target types increases.

  10. Automatic SIMD vectorization of SSA-based control flow graphs

    CERN Document Server

    Karrenberg, Ralf

    2015-01-01

    Ralf Karrenberg presents Whole-Function Vectorization (WFV), an approach that allows a compiler to automatically create code that exploits data-parallelism using SIMD instructions. Data-parallel applications such as particle simulations, stock option price estimation or video decoding require the same computations to be performed on huge amounts of data. Without WFV, one processor core executes a single instance of a data-parallel function. WFV transforms the function to execute multiple instances at once using SIMD instructions. The author describes an advanced WFV algorithm that includes a v

  11. Automatic Resonance Alignment of High-Order Microring Filters

    CERN Document Server

    Mak, Jason C C; Xue, Tianyuan; Mikkelsen, Jared C; Yong, Zheng; Poon, Joyce K S

    2015-01-01

    Automatic resonance alignment tuning is performed in high-order series coupled microring filters using a feedback system. By inputting only a reference wavelength, a filter is tuned such that passband ripples are dramatically reduced compared to the initial detuned state and the passband becomes centered at the reference. The method is tested on 5th order microring filters fabricated in a standard silicon photonics foundry process. Repeatable tuning is demonstrated for filters on multiple dies from the wafer and for arbitrary reference wavelengths within the free spectral range of the microrings.

  12. Human and automatic speaker recognition over telecommunication channels

    CERN Document Server

    Fernández Gallardo, Laura

    2016-01-01

    This work addresses the evaluation of the human and the automatic speaker recognition performances under different channel distortions caused by bandwidth limitation, codecs, and electro-acoustic user interfaces, among other impairments. Its main contribution is the demonstration of the benefits of communication channels of extended bandwidth, together with an insight into how speaker-specific characteristics of speech are preserved through different transmissions. It provides sufficient motivation for considering speaker recognition as a criterion for the migration from narrowband to enhanced bandwidths, such as wideband and super-wideband.

  13. Automatic scoring of the severity of psoriasis scaling

    DEFF Research Database (Denmark)

    Gomez, David Delgado; Ersbøll, Bjarne Kjær; Carstensen, Jens Michael

    2004-01-01

    In this work, a combined statistical and image analysis method to automatically evaluate the severity of scaling in psoriasis lesions is proposed. The method separates the different regions of the disease in the image and scores the degree of scaling based on the properties of these areas....... The proposed method provides a solution to the lack of suitable methods to assess the lesion and to evaluate changes during the treatment. An experiment over a collection of psoriasis images is conducted to test the performance of the method. Results show that the obtained scores are highly correlated...

  14. Automatic Identification of Modal, Breathy and Creaky Voices

    Directory of Open Access Journals (Sweden)

    Poonam Sharma

    2013-12-01

    Full Text Available This paper presents a method for automatically identifying the different voice qualities present in a speech signal, which is beneficial for efficient speech recognition systems. The proposed technique is based on three important characteristics of the speech signal: Zero Crossing Rate, Short Time Energy, and Fundamental Frequency. The performance of the proposed algorithm is evaluated using data collected from three different speakers, and an overall accuracy of 87.2% is achieved.
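Two of the three features named above, Zero Crossing Rate and Short Time Energy, are straightforward frame statistics. A minimal sketch:

```python
import numpy as np

def zcr(frame):
    """Zero Crossing Rate: fraction of adjacent-sample sign changes.
    High for noise-like (breathy/unvoiced) frames, low for voiced ones."""
    return float(np.mean(np.signbit(frame[:-1]) != np.signbit(frame[1:])))

def short_time_energy(frame):
    """Short Time Energy: mean squared amplitude of the frame."""
    return float(np.mean(frame.astype(float) ** 2))

# A low-frequency tone (voiced-like) crosses zero far less often than
# a noise-like signal of the same length.
sr = 8000
t = np.arange(0, 0.02, 1.0 / sr)
voiced = np.sin(2 * np.pi * 100 * t)                     # few crossings
noise = np.random.default_rng(1).normal(0, 1, t.size)    # many crossings
```

Fundamental frequency, the third feature, needs a pitch detector (e.g. autocorrelation- or AMDF-based) rather than a single frame statistic, so it is not sketched here.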

  15. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, the uniform corrosion is characterized by texture attributes extracted from co-occurrence matrix and the Self Organizing Mapping (SOM) clustering algorithm. We present a technique for automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustful and robust early failure detection systems. (author)

  16. Analytical Performance of the Olympus AU2700 Automatic Biochemical Analyzer in Detecting Electrolyte Projects

    Institute of Scientific and Technical Information of China (English)

    韩学波; 唐秀英; 李莉; 于欣

    2014-01-01

    Objective: To evaluate the analytical performance of the Olympus AU2700 automatic biochemical analyzer in determining electrolyte projects. Methods: Following the Clinical and Laboratory Standards Institute (CLSI) guidance documents and other related experimental protocols, the precision, accuracy, and linear range of the Olympus AU2700 automatic biochemical analyzer for electrolyte determination were analyzed. Results: The within-run coefficients of variation were 1.84% and 0.89% for K+, 0.05% and 0.94% for Na+, and 0.88% and 0.75% for Cl-; the between-run coefficients of variation were 2.08% and 4.20% for K+, 0.64% and 0.81% for Na+, and 1.12% and 1.21% for Cl-. These results satisfy the CLIA'88 standard. Accuracy: the detection results were within the permissible range of the EQA scheme. Linearity: the linear regression equation for potassium was Y=1.0085X-0.0312, with a slope of 1.0085 and a correlation coefficient of 0.99, giving a reportable range of 0.68-11.8 mmol/L; for sodium, Y=0.9998X-0.0055, with a slope of 0.9998 and a correlation coefficient of 1, giving a reportable range of 37-288.6 mmol/L; for chloride, Y=0.9895X+1.8413, with a slope of 0.9895 and a correlation coefficient of 0.99, giving a reportable range of 37.1-246.6 mmol/L. Conclusion: The precision, accuracy, and linear range of the Olympus AU2700 automatic biochemical analyzer for electrolyte projects meet the requirements of biochemical assay laboratories, and the analyzer may be used for clinical specimen examination.
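The precision figures reported above are coefficients of variation. A minimal sketch of the computation over one run of replicates (the replicate values below are hypothetical, not taken from the study):

```python
def cv_percent(values):
    """Coefficient of variation (%) = 100 * SD / mean, the precision
    statistic reported for each electrolyte level (sample SD, n-1)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# Hypothetical replicate K+ results from one run, in mmol/L:
k_run = [4.1, 4.0, 4.2, 4.1, 4.0, 4.1, 4.2, 4.1, 4.0, 4.1]
cv = cv_percent(k_run)   # roughly 1.8 % for these values
```

Within-run CV uses replicates from a single run, while between-run CV pools one result per run across days; the formula itself is the same.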

  17. Research on an Intelligent Automatic Turning System

    Directory of Open Access Journals (Sweden)

    Lichong Huang

    2012-12-01

    Full Text Available The equipment manufacturing industry is a strategic industry of a country, and its core part is the CNC machine tool. Therefore, enhancing independent research on the relevant technology of CNC machines, especially open CNC systems, is of great significance. This paper presents some key techniques of an Intelligent Automatic Turning System and gives a viable solution for system integration. First of all, the integrated system architecture and the flexible and efficient workflow for performing the intelligent automatic turning process are illustrated. Secondly, innovative methods for workpiece feature recognition and expression and for process planning of NC machining are put forward. Thirdly, the cutting tool auto-selection and cutting parameter optimization solutions are generated with an integrated inference of rule-based reasoning and case-based reasoning. Finally, an actual machining case based on the developed intelligent automatic turning system proved that the presented solutions are valid, practical and efficient.

  18. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious, are subject to capacity limitations, and can easily be adapted to situational circumstances. Both the perception of advertising and the way advertising influences brand evaluation involve both kinds of processes. Automatic processes govern the recognition of advertising stimuli and the relevance decision which determines further higher-level processing... These distinctions are at variance with current notions about advertising effects. For example, the attention span problem will be relevant only for strategic processes, not for automatic processes; a certain amount of learning can occur with very little conscious effort; and advertising's effect on brand evaluation may be more stable...

  19. Fault injection system for automatic testing system

    Institute of Scientific and Technical Information of China (English)

    王胜文; 洪炳熔

    2003-01-01

    Considering the deficiency of the means for confirming the attribution of fault redundancy in the research of Automatic Testing Systems (ATS), a fault-injection system has been proposed to study the fault redundancy of an automatic testing system through comparison. By means of a fault-embedded environmental simulation, faults are injected at the input level of the software under test. These faults may induce inherent failure modes, thus bringing about unexpected output, and the anticipated goal of the test is attained. The fault injection consists of a voltage signal generator, a current signal generator, and a rear drive circuit, all specially developed, and the ATS can work regularly by means of software simulation. The experimental results indicate that the fault-injection system can find deficiencies in the automatic testing software and identify the preference of fault redundancy. Moreover, some software deficiencies never exposed before can be identified by analyzing the testing results.

  20. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, the polar body of the oocyte must be detected automatically. Conventional polar body detection approaches have a low success rate or low efficiency. We propose a polar body detection method based on machine learning in this paper. On one hand, an improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features of polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range for the polar body, which improves efficiency. Experimental results show that the success rate is 96% for various types of polar bodies. Furthermore, the method is applied to an enucleation experiment and improves the degree of automation of enucleation.

  1. Effects of bandwidth feedback on the automatization of an arm movement sequence.

    Science.gov (United States)

    Agethen, Manfred; Krause, Daniel

    2016-02-01

    We examined the effects of a bandwidth feedback manipulation on motor learning. Effects on movement accuracy, as well as on movement consistency, have been addressed in earlier studies. We have additionally investigated the effects on motor automatization. Because providing error feedback is believed to induce attentional control processes, we suppose that a bandwidth method should facilitate motor automatization. Participants (N=48) were assigned to four groups: one control group and three intervention groups. Participants of the intervention groups practiced an arm movement sequence with 760 trials. The BW0-Group practiced with 100% frequency of feedback. For the BW10-Group, feedback was provided when the errors were larger than 10°. The YokedBW10-Group participants were matched to the feedback schedule of research twins from the BW10-Group. All groups performed pre-tests and retention tests with a secondary task paradigm to test for automaticity. The BW10-Group indicated a higher degree of automatization compared with the BW0-Group, which did not exhibit a change in automaticity. The comparison of the YokedBW10-Group, which also exhibited automatization, and the BW10-Group leads to the proposal that reduction of quantitative feedback frequency and additional positive feedback are responsible for the bandwidth effect. Differences in movement accuracy and consistency were not evident.

  2. Coronary CT angiography: automatic cardiac-phase selection for image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Ruzsics, Balazs; Brothers, Robin L.; Costello, Philip [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Gebregziabher, Mulugeta [Medical University of South Carolina, Department of Biostatistics, Bioinformatics, and Epidemiology, Charleston, SC (United States); Lee, Heon [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Seoul Medical Center, Department of Radiology, Seoul (Korea); Allmendinger, Thomas; Vogt, Sebastian [Siemens Medical Solutions, Division CT, Forchheim (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Charleston, SC (United States)

    2009-08-15

    We evaluated an algorithm for automatic selection of the cardiac phase with the least motion for image reconstruction at coronary computed tomography (CT) angiography (CCTA). We analyzed data of 100 patients (49 female, mean age 59 years) who had undergone retrospectively ECG-gated CCTA. Two experienced observers visually identified the most suitable end-systolic and end-diastolic phases using a series of image reconstructions in 5% increments across the RR cycle. The same data were then reconstructed using an automatic phase finding algorithm based on a 4D weighting function of cardiac motion. On average, the algorithm determined the most suitable systolic reconstruction phase at 40.11{+-}6.29% RR compared with 40.07{+-}5.58% RR by the human observers (p=NS). The most suitable diastolic phase was found at 72.71{+-}7.37% RR by the automatic algorithm, compared with 76.43{+-}6.35% RR by the observers (p<0.05). No statistically significant difference was found between automatically and visually determined reconstruction phases regarding motion and stair-step artifacts in either systole or diastole (p>0.05). Thus, there appears to be no relevant difference between an automatic phase finding algorithm and visual selection by experienced observers for determining the phase with the least cardiac motion for CCTA image reconstruction. The use of automatic phase finding may therefore facilitate the performance of cardiac CT and reduce human error. (orig.)

  3. Determination of the Artificial Sweetener 4-Ethoxyphenylurea in Succade by Automatic Solid-phase Extraction and High Performance Liquid Chromatography with Fluorescence Detection

    Institute of Scientific and Technical Information of China (English)

    陈章捷; 陈金凤; 张艳燕; 钟坚海; 魏晶晶

    2014-01-01

    A high performance liquid chromatography method is proposed for the determination of the artificial sweetener 4-ethoxyphenylurea in succade. The sample is ultrasonically extracted with an acetic acid/ammonium acetate buffer solution and purified by automatic solid-phase extraction. The extract is separated on an SB-C18 reversed-phase column and detected with a fluorescence detector. The correlation coefficient of 4-ethoxyphenylurea over the range of 0 to 10 mg/L is 0.9987, and the limit of quantitation (S/N=10) is less than 0.1 mg/kg. Using three blank succade samples as matrices, recovery was tested at three spiking levels; average recoveries ranged from 81.7% to 92.4%, with RSDs (n=6) between 2.4% and 6.8%.
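The recovery values quoted above follow the usual spiked-recovery formula. A minimal sketch, with hypothetical numbers chosen to fall inside the reported range:

```python
def spike_recovery_percent(measured_spiked, measured_blank, amount_added):
    """Spiked recovery (%): the fraction of a known addition that is
    found, (spiked result - blank result) / amount added * 100."""
    return 100.0 * (measured_spiked - measured_blank) / amount_added

# Hypothetical example: blank succade reads 0.00 mg/kg, 2.00 mg/kg of
# 4-ethoxyphenylurea is added, and 1.76 mg/kg is measured back.
r = spike_recovery_percent(1.76, 0.00, 2.00)   # 88.0 %, inside 81.7-92.4 %
```

The RSD of the six replicate recoveries at each level gives the 2.4-6.8% precision figures reported alongside.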

  4. An Automatic Learning-Based Framework for Robust Nucleus Segmentation.

    Science.gov (United States)

    Xing, Fuyong; Xie, Yuanpu; Yang, Lin

    2016-02-01

    Computer-aided image analysis of histopathology specimens could potentially provide support for early detection and improved characterization of diseases such as brain tumor, pancreatic neuroendocrine tumor (NET), and breast cancer. Automated nucleus segmentation is a prerequisite for various quantitative analyses including automatic morphological feature computation. However, it remains a challenging problem due to the complex nature of histopathology images. In this paper, we propose a learning-based framework for robust and automatic nucleus segmentation with shape preservation. Given a nucleus image, it begins with a deep convolutional neural network (CNN) model to generate a probability map, on which an iterative region merging approach is performed for shape initializations. Next, a novel segmentation algorithm is exploited to separate individual nuclei, combining a robust selection-based sparse shape model and a local repulsive deformable model. One of the significant benefits of the proposed framework is that it is applicable to different staining histopathology images. Due to the feature learning characteristic of the deep CNN and the high-level shape prior modeling, the proposed method is general enough to perform well across multiple scenarios. We have tested the proposed algorithm on three large-scale pathology image datasets using a range of different tissue and stain preparations, and comparative experiments with recent state-of-the-art methods demonstrate the superior performance of the proposed approach. PMID:26415167

  5. Semi-automatic knee cartilage segmentation

    Science.gov (United States)

    Dam, Erik B.; Folkesson, Jenny; Pettersen, Paola C.; Christiansen, Claus

    2006-03-01

    Osteo-Arthritis (OA) is a very common age-related cause of pain and reduced range of motion. A central effect of OA is wear-down of the articular cartilage that otherwise ensures smooth joint motion. Quantification of the cartilage breakdown is central in monitoring disease progression and therefore cartilage segmentation is required. Recent advances allow automatic cartilage segmentation with high accuracy in most cases. However, the automatic methods still fail in some problematic cases. For clinical studies, even if a few failing cases will be averaged out in the overall results, this reduces the mean accuracy and precision and thereby necessitates larger/longer studies. Since the severe OA cases are often most problematic for the automatic methods, there is even a risk that the quantification will introduce a bias in the results. Therefore, interactive inspection and correction of these problematic cases is desirable. For diagnosis on individuals, this is even more crucial since the diagnosis will otherwise simply fail. We introduce and evaluate a semi-automatic cartilage segmentation method combining an automatic pre-segmentation with an interactive step that allows inspection and correction. The automatic step consists of voxel classification based on supervised learning. The interactive step combines a watershed transformation of the original scan with the posterior probability map from the classification step at sub-voxel precision. We evaluate the method for the task of segmenting the tibial cartilage sheet from low-field magnetic resonance imaging (MRI) of knees. The evaluation shows that the combined method allows accurate and highly reproducible correction of the segmentation of even the worst cases in approximately ten minutes of interaction.

  6. Automatic Hierarchical Color Image Classification

    Directory of Open Access Journals (Sweden)

    Jing Huang

    2003-02-01

    Full Text Available Organizing images into semantic categories can be extremely useful for content-based image retrieval and image annotation. Grouping images into semantic classes is a difficult problem, however. Image classification attempts to solve this hard problem by using low-level image features. In this paper, we propose a method for hierarchical classification of images via supervised learning. This scheme relies on using a good low-level feature and subsequently performing feature-space reconfiguration using singular value decomposition to reduce noise and dimensionality. We use the training data to obtain a hierarchical classification tree that can be used to categorize new images. Our experimental results suggest that this scheme not only performs better than standard nearest-neighbor techniques, but also has both storage and computational advantages.
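    The scheme described (a low-level feature representation, SVD-based feature-space reconfiguration to reduce noise and dimensionality, then classification) can be illustrated with a small sketch. The toy data, dimensions, and nearest-neighbor classifier below are illustrative stand-ins, not the paper's actual pipeline:

    ```python
    import numpy as np

    def svd_reduce(features, k):
        """Project feature vectors onto the top-k singular directions
        to reduce noise and dimensionality (illustrative sketch)."""
        mean = features.mean(axis=0)
        centered = features - mean
        # Thin SVD; rows of vt are the principal directions
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        basis = vt[:k].T                      # (n_features, k)
        return centered @ basis, mean, basis

    def project(x, mean, basis):
        return (x - mean) @ basis

    # Toy data: 6 "images" with 4 low-level features, 2 classes
    rng = np.random.default_rng(0)
    train = np.vstack([rng.normal(0, 0.1, (3, 4)),
                       rng.normal(1, 0.1, (3, 4))])
    labels = [0, 0, 0, 1, 1, 1]

    reduced, mean, basis = svd_reduce(train, k=2)

    def classify(x):
        # Nearest neighbor in the reduced feature space
        d = np.linalg.norm(reduced - project(x, mean, basis), axis=1)
        return labels[int(np.argmin(d))]

    print(classify(np.full(4, 0.05)))   # query near class 0
    print(classify(np.full(4, 0.95)))   # query near class 1
    ```

    The paper builds a hierarchical classification tree on top of such a reduced space; the sketch shows only the flat nearest-neighbor baseline it is compared against.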

  7. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Abstract Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify, from all published literature, the papers that contain results for the specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus is usually time-consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers, based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in production use for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years, and it is in the process of being adopted in the biocuration processes at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reduce the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature, such as C. elegans and D. melanogaster, to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in
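    The core idea, a linear SVM separating bag-of-words representations of papers by data type, can be sketched as follows. The vocabulary, toy documents, and the Pegasos-style subgradient trainer are illustrative assumptions, not WormBase's actual production system:

    ```python
    import numpy as np

    def train_linear_svm(X, y, lam=0.01, epochs=200):
        """Minimal Pegasos-style linear SVM (hinge loss, L2 regularization),
        standing in for the SVM classifier described in the abstract."""
        n, d = X.shape
        w = np.zeros(d)
        t = 0
        rng = np.random.default_rng(42)
        for _ in range(epochs):
            for i in rng.permutation(n):
                t += 1
                eta = 1.0 / (lam * t)
                margin = y[i] * (X[i] @ w)
                w *= (1 - eta * lam)          # regularization shrinkage
                if margin < 1:                # hinge-loss subgradient step
                    w += eta * y[i] * X[i]
        return w

    # Toy "papers": bag-of-words counts over a tiny invented vocabulary
    vocab = ["rnai", "phenotype", "sequence", "expression"]
    docs = np.array([
        [3, 2, 0, 0],   # RNAi paper
        [2, 3, 0, 1],   # RNAi paper
        [0, 0, 3, 2],   # not an RNAi paper
        [0, 1, 2, 3],   # not an RNAi paper
    ], dtype=float)
    labels = np.array([1, 1, -1, -1])

    w = train_linear_svm(docs, labels)
    predict = lambda x: 1 if x @ w > 0 else -1
    print(predict(np.array([2., 2., 0., 0.])))   # RNAi-like document
    print(predict(np.array([0., 0., 2., 2.])))   # non-RNAi document
    ```

    In practice such a classifier is trained per data type on curated positive and negative example papers, and new papers are routed to curators when the decision value is positive.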

  8. Automatic learning-based beam angle selection for thoracic IMRT

    International Nuclear Information System (INIS)

    Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner’s clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume

  9. Document Exploration and Automatic Knowledge Extraction for Unstructured Biomedical Text

    Science.gov (United States)

    Chu, S.; Totaro, G.; Doshi, N.; Thapar, S.; Mattmann, C. A.; Ramirez, P.

    2015-12-01

    We describe our work on building a web-browser based document reader with a built-in exploration tool and automatic concept extraction of medical entities for biomedical text. Vast amounts of biomedical information are offered in unstructured text form through scientific publications and R&D reports. Utilizing text mining can help us mine information and extract relevant knowledge from a plethora of biomedical text. The ability to employ such technologies to aid researchers in coping with information overload is greatly desirable. In recent years, there has been an increased interest in automatic biomedical concept extraction [1, 2] and intelligent PDF reader tools with the ability to search on content and find related articles [3]. Such reader tools are typically desktop applications and are limited to specific platforms. Our goal is to provide researchers with a simple tool to aid them in finding, reading, and exploring documents. Thus, we propose a web-based document explorer, which we call Shangri-Docs, which combines a document reader with automatic concept extraction and highlighting of relevant terms. Shangri-Docs also provides the ability to handle a wide variety of document formats (e.g. PDF, Word, PPT, text, etc.) and to exploit the linked nature of the Web and personal content by performing searches on content from public sites (e.g. Wikipedia, PubMed) and private cataloged databases simultaneously. Shangri-Docs utilizes Apache cTAKES (clinical Text Analysis and Knowledge Extraction System) [4] and the Unified Medical Language System (UMLS) to automatically identify and highlight terms and concepts, such as specific symptoms, diseases, drugs, and anatomical sites, mentioned in the text. cTAKES was originally designed specifically to extract information from clinical medical records. Our investigation led us to extend the automatic knowledge extraction process of cTAKES to the biomedical research domain by improving the ontology guided information extraction

  10. Automatic learning-based beam angle selection for thoracic IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Amit, Guy; Marshall, Andrea [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, Ontario M5G 2M9 (Canada); Purdie, Thomas G., E-mail: tom.purdie@rmp.uhn.ca; Jaffray, David A. [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, Ontario M5G 2M9 (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5S 3E2 (Canada); Techna Institute, University Health Network, Toronto, Ontario M5G 1P5 (Canada); Levinshtein, Alex [Department of Computer Science, University of Toronto, Toronto, Ontario M5S 3G4 (Canada); Hope, Andrew J.; Lindsay, Patricia [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, Ontario M5G 2M9, Canada and Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5S 3E2 (Canada); Pekar, Vladimir [Philips Healthcare, Markham, Ontario L6C 2S3 (Canada)

    2015-04-15

    Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner’s clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume
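    The selection step can be sketched as scoring candidate angles from anatomical features and keeping the top-ranked ones. In this sketch, k-NN regression stands in for the random forest regressor, and the two features and the scoring rule are invented purely for illustration:

    ```python
    import numpy as np

    def knn_beam_score(train_feats, train_scores, query, k=3):
        """Predict a beam-angle score from anatomical features.
        k-NN regression is a lightweight stand-in here for the random
        forest regressor described in the abstract."""
        d = np.linalg.norm(train_feats - query, axis=1)
        idx = np.argsort(d)[:k]
        return train_scores[idx].mean()

    # Toy training set: per-angle features, e.g.
    # (target coverage proxy, normal-tissue path length) -- invented
    rng = np.random.default_rng(1)
    feats = rng.uniform(0, 1, (50, 2))
    # Invented ground-truth scoring rule: favor coverage, penalize path
    scores = feats[:, 0] - 0.5 * feats[:, 1]

    # Score 36 candidate gantry angles (every 10 degrees), keep the best 5
    angles = np.arange(0, 360, 10)
    cand_feats = rng.uniform(0, 1, (len(angles), 2))
    pred = np.array([knn_beam_score(feats, scores, f) for f in cand_feats])
    best = angles[np.argsort(pred)[::-1][:5]]
    print(len(best))
    ```

    The actual method additionally adjusts the selected angles with an optimization scheme that accounts for learned interbeam dependencies, which this per-angle ranking omits.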

  11. Automatic malware analysis: an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades, causing billions of dollars in damages each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regards to this new analysis framework, a series of analysis techniques for automatic malware analy

  12. Development of automatic laser welding system

    International Nuclear Information System (INIS)

    Lasers are a new production tool for high-speed, low-distortion welding, and their applications to automatic welding lines are increasing. IHI has long experience of laser processing for the preservation of nuclear power plants, the welding of airplane engines, and so on. Moreover, YAG laser oscillators and various kinds of hardware have been developed for laser welding and automation. Combining these welding technologies and laser hardware technologies produces the automatic laser welding system. In this paper, the component technologies are described, including combined optics intended to improve welding stability, laser oscillators, a monitoring system, a seam tracking system and so on. (author)

  13. Automatic Keyword Extraction from Individual Documents

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Stuart J.; Engel, David W.; Cramer, Nicholas O.; Cowley, Wendy E.

    2010-05-03

    This paper introduces a novel and domain-independent method for automatically extracting keywords, as sequences of one or more words, from individual documents. We describe the method’s configuration parameters and algorithm, and present an evaluation on a benchmark corpus of technical abstracts. We also present a method for generating lists of stop words for specific corpora and domains, and evaluate its ability to improve keyword extraction on the benchmark corpus. Finally, we apply our method of automatic keyword extraction to a corpus of news articles and define metrics for characterizing the exclusivity, essentiality, and generality of extracted keywords within a corpus.
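    The described method can be sketched as stop-word-delimited keyword extraction: candidate phrases are maximal runs of non-stop words, each word is scored by its degree-to-frequency ratio over the candidates, and a phrase scores the sum of its word scores. The stop-word list below is a tiny illustrative subset, not a corpus-derived list:

    ```python
    import re
    from collections import defaultdict

    # Tiny illustrative stop-word list (real systems use corpus-derived lists)
    STOP = {"a", "an", "the", "of", "and", "or", "is", "for", "to", "in",
            "this", "we", "on", "from", "with", "as", "that", "over"}

    def extract_keywords(text):
        """Rank candidate keywords: phrases are runs of non-stop words,
        words are scored by degree/frequency, phrases by summed scores."""
        words = re.findall(r"[a-z0-9]+", text.lower())
        phrases, cur = [], []
        for w in words:
            if w in STOP:
                if cur:
                    phrases.append(tuple(cur))
                    cur = []
            else:
                cur.append(w)
        if cur:
            phrases.append(tuple(cur))
        freq, degree = defaultdict(int), defaultdict(int)
        for p in phrases:
            for w in p:
                freq[w] += 1
                degree[w] += len(p)   # co-occurrence degree, incl. the word
        score = {w: degree[w] / freq[w] for w in freq}
        ranked = sorted(phrases, key=lambda p: -sum(score[w] for w in p))
        return [" ".join(p) for p in ranked]

    text = ("Compatibility of systems of linear constraints over the set "
            "of natural numbers is considered in this paper")
    print(extract_keywords(text)[0])   # multi-word phrases rank highest
    ```

    Degree/frequency scoring favors words that occur in longer candidate phrases, so multi-word technical terms outrank frequent single words.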

  14. Automatic speech recognition: a deep learning approach

    CERN Document Server

    Yu, Dong

    2015-01-01

    This book summarizes the recent advancement in the field of automatic speech recognition with a focus on discriminative and hierarchical models. This will be the first automatic speech recognition book to include a comprehensive coverage of recent developments such as conditional random field and deep learning techniques. It presents insights and theoretical foundation of a series of recent models such as conditional random field, semi-Markov and hidden conditional random field, deep neural network, deep belief network, and deep stacking models for sequential learning. It also discusses practical considerations of using these models in both acoustic and language modeling for continuous speech recognition.

  15. Technology Performance Exchange

    Energy Technology Data Exchange (ETDEWEB)

    2015-09-01

    To address the need for accessible, high-quality data, the Department of Energy has developed the Technology Performance Exchange (TPEx). TPEx enables technology suppliers, third-party testing laboratories, and other entities to share product performance data. These data are automatically transformed into a format that technology evaluators can easily use in their energy modeling assessments to inform procurement decisions.

  16. Automatic leather inspection of defective patterns

    Science.gov (United States)

    Tafuri, Maria; Branca, Antonella; Attolico, Giovanni; Distante, Arcangelo; Delaney, William

    1996-02-01

    Constant and consistent quality levels in the manufacturing industry increasingly require automatic inspection. This paper describes a vision system for leather inspection based upon visual textural properties of the material surface. As the visual appearances of both leather and defects exhibit a wide range of variations due to original skin characteristics, curing processes and defect causes, location and classification of defective areas become hard tasks. This paper describes a method for separating the oriented structures of defects from normal leather, a background that is not homogeneous in color, thickness, brightness, or wrinkledness. The first step is the evaluation of the orientation field from the image of the leather. Such a field associates to each point of the image a 2D vector whose direction is the dominant local orientation of the gradient vectors and whose length is proportional to their coherence evaluated in a neighborhood of fixed size. The second step analyzes this vector flow field by projecting it onto a set of basis vectors (elementary texture vectors) spanning the vector space in which the vector fields associated with the defects can be defined. The coefficients of these projections are the parameters by means of which both detection and classification can be performed. Since the set of basis vectors is neither orthogonal nor complete, the projection requires the definition of a global optimization criterion, which has been chosen to be the minimum difference between the original flow field and the vector field obtained as a linear combination of the basis vectors using the estimated coefficients. This optimization step is performed through a neural network initialized to recognize a limited number of patterns (corresponding to the basis vectors). This second step estimates the parameter vector in each point of the original image. Both leather without defects and defects can be characterized in terms of coefficient vectors making it possible to
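    The first step, estimating a dominant gradient orientation and its coherence, can be sketched with the structure tensor. This toy version computes a single global orientation for a synthetic striped image, rather than the per-pixel field over fixed neighborhoods that the paper describes:

    ```python
    import numpy as np

    def dominant_orientation(img):
        """Dominant orientation and coherence from the averaged structure
        tensor (sketch of one element of an orientation field)."""
        gy, gx = np.gradient(img.astype(float))
        jxx = (gx * gx).mean()
        jyy = (gy * gy).mean()
        jxy = (gx * gy).mean()
        # Orientation of the dominant eigenvector of the structure tensor
        theta = 0.5 * np.arctan2(2 * jxy, jxx - jyy)
        # Coherence in [0, 1]: 1 for perfectly oriented texture
        lam = np.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2)
        coherence = lam / (jxx + jyy + 1e-12)
        return theta, coherence

    # Vertical stripes: gradients point horizontally, so the dominant
    # orientation is 0 and the coherence is near 1
    x = np.arange(64)
    stripes = np.tile(np.sin(0.5 * x), (64, 1))
    theta, coh = dominant_orientation(stripes)
    print(round(abs(float(theta)), 2), coh > 0.9)
    ```

    Evaluating the tensor over local windows instead of the whole image yields the per-point vector field that the defect-detection step then projects onto the basis vectors.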

  17. Functionality and Performance Visualization of the Distributed High Quality Volume Renderer (HVR)

    KAUST Repository

    Shaheen, Sara

    2012-07-01

    Volume rendering systems are designed to enable scientists and a variety of experts to interactively explore volume data through 3D views of the volume. However, volume rendering techniques are computationally intensive tasks. Parallel distributed volume rendering systems and multi-threading architectures have been suggested as natural solutions to provide acceptable volume rendering performance for very large volume data sizes, such as electron microscopy (EM) data. This in turn adds another level of complexity when developing and manipulating volume rendering systems. Given that distributed parallel volume rendering systems are among the most complex systems to develop, trace and debug, it is obvious that traditional debugging tools do not provide enough support. As a consequence, there is a great demand for tools that facilitate the manipulation of such systems. This can be achieved by utilizing the power of computer graphics to design visual representations that reflect how the system works and that visualize its current performance state. The work presented falls within the field of software visualization, where visualization is used to aid the understanding of software. This thesis presents a number of visual representations that reflect functionality and performance aspects of the distributed HVR, a high-quality volume renderer that uses various techniques to visualize large volumes interactively. The visual representations cover different stages of the parallel volume rendering pipeline of HVR, along with means of performance analysis through a number of flexible and dynamic visualizations that reflect the current state of the system and can be manipulated at runtime. These visualizations are aimed at facilitating the debugging, understanding and analysis of the distributed HVR.

  18. Automatic speech signal segmentation based on the innovation adaptive filter

    Directory of Open Access Journals (Sweden)

    Makowski Ryszard

    2014-06-01

    Full Text Available Speech segmentation is an essential stage in designing automatic speech recognition systems and one can find several algorithms proposed in the literature. It is a difficult problem, as speech is immensely variable. The aim of the authors’ studies was to design an algorithm that could be employed at the stage of automatic speech recognition. This would make it possible to avoid some problems related to speech signal parametrization. Posing the problem in such a way requires the algorithm to be capable of working in real time. The only such algorithm was proposed by Tyagi et al. (2006), and it is a modified version of Brandt’s algorithm. The article presents a new algorithm for unsupervised automatic speech signal segmentation. It performs segmentation without access to information about the phonetic content of the utterances, relying exclusively on second-order statistics of a speech signal. The starting point for the proposed method is the time-varying Schur coefficients of an innovation adaptive filter. The Schur algorithm is known to be fast, precise, stable and capable of rapidly tracking changes in second-order signal statistics. A transition from one phoneme to another in the speech signal always indicates a change in signal statistics caused by vocal tract changes. In order to allow for the properties of human hearing, detection of inter-phoneme boundaries is performed based on statistics defined on the mel spectrum determined from the reflection coefficients. The paper presents the structure of the algorithm, defines its properties, lists parameter values, describes detection efficiency results, and compares them with those for another algorithm. The obtained segmentation results are satisfactory.
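    The underlying principle, placing boundaries where second-order statistics change, can be sketched as follows. Windowed variance and lag-1 autocorrelation stand in here for the time-varying Schur/reflection coefficients and mel-spectrum statistics of the actual algorithm:

    ```python
    import numpy as np

    def boundary_scores(signal, win=100):
        """Score each position by the change in second-order statistics
        between adjacent left and right windows -- a crude stand-in for
        tracking time-varying Schur/reflection coefficients."""
        def stats(seg):
            seg = seg - seg.mean()
            var = seg.var() + 1e-12
            r1 = (seg[:-1] * seg[1:]).mean() / var   # lag-1 autocorrelation
            return np.array([np.log(var), r1])
        scores = np.zeros(len(signal))
        for t in range(win, len(signal) - win):
            d = stats(signal[t - win:t]) - stats(signal[t:t + win])
            scores[t] = np.linalg.norm(d)
        return scores

    # Synthetic two-"phoneme" signal: second-order statistics change at 500
    t = np.arange(1000)
    sig = np.where(t < 500, np.sin(0.1 * t), np.sin(2.5 * t))
    scores = boundary_scores(sig)
    print(abs(int(np.argmax(scores)) - 500) < 50)
    ```

    A real segmenter would threshold such a score in real time rather than taking a global argmax, and would use perceptually weighted (mel-spectrum) statistics instead of raw autocorrelation.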

  19. Automatic identification of mass spectra

    International Nuclear Information System (INIS)

    Several approaches to preprocessing and comparison of low resolution mass spectra have been evaluated by various test methods related to library search. It is shown that there is a clear correlation between the nature of any contamination of a spectrum, the basic principle of the transformation or distance measure, and the performance of the identification system. The identification of functionality from low resolution spectra has also been evaluated using several classification methods. It is shown that there is an upper limit to the success of this approach, but also that this can be improved significantly by using a very limited amount of additional information. 10 refs
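    The interplay of spectrum transformation and distance measure can be sketched with a minimal library search: spectra are intensity-transformed, then compared by cosine similarity. The toy spectra and the square-root transform here are illustrative choices, not the systems evaluated in the record:

    ```python
    import numpy as np

    def spectrum_similarity(a, b):
        """Cosine similarity between intensity-transformed spectra.
        The square-root transform (a common preprocessing choice) damps
        dominant peaks so contamination perturbs the score less."""
        a = np.sqrt(np.asarray(a, dtype=float))
        b = np.sqrt(np.asarray(b, dtype=float))
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    # Toy library: reference spectra binned on a common m/z axis (invented)
    library = {
        "ethanol":  [0, 10, 100, 40, 5, 0],
        "methanol": [80, 100, 5, 0, 0, 0],
    }
    unknown = [0, 12, 95, 38, 6, 2]   # noisy ethanol-like query spectrum
    best = max(library, key=lambda k: spectrum_similarity(library[k], unknown))
    print(best)
    ```

    The record's observation that transformation choice and distance measure interact with the type of contamination corresponds to varying the transform (e.g. square root vs. logarithm vs. none) and re-ranking the library.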

  20. Adaptive neuro-fuzzy inference system based automatic generation control

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, S.H.; Etemadi, A.H. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran)

    2008-07-15

    Fixed gain controllers for automatic generation control are designed at nominal operating conditions and fail to provide best control performance over a wide range of operating conditions. So, to keep system performance near its optimum, it is desirable to track the operating conditions and use updated parameters to compute control gains. A control scheme based on an adaptive neuro-fuzzy inference system (ANFIS), which is trained by the results of off-line studies obtained using particle swarm optimization, is proposed in this paper to optimize and update control gains in real-time according to load variations. Also, frequency relaxation is implemented using ANFIS. The efficiency of the proposed method is demonstrated via simulations. Compliance of the proposed method with the NERC control performance standard is verified. (author)

  1. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on a BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...

  2. Automatic noncontact ultrasonic inspection technique

    International Nuclear Information System (INIS)

    A system for EMAT, which generates ultrasound by electromagnetic forces and performs nondestructive testing without contact, was established. By linking it with a 3-axis scanning system and a data acquisition and processing system, the automation of EMAT testing was attempted. An EMAT sensor was fabricated and its directivity pattern was measured. To be suitable for automation, it has a transmitter and a receiver in one case, and the direction of its main beam can be controlled by the frequency of the driving signal. A program which controls the EMAT system, the 3-axis scanner and the data acquisition and processing system was developed. It also processes the acquired data and displays the processing results. An IBM-PC/AT compatible PC was used as the main controller, and the strategy of the program is emulation of the real devices on the PC monitor. To prove the performance of the established EMAT system, two aluminium blocks containing artificial flaws and a welded aluminium block were tested. The results of the tests were satisfactory.

  3. Comparison of TCP automatic tuning techniques for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Weigle, E. H. (Eric H.); Feng, W. C. (Wu-Chun)

    2002-01-01

    Rather than painful, manual, static, per-connection optimization of TCP buffer sizes simply to achieve acceptable performance for distributed applications, many researchers have proposed techniques to perform this tuning automatically. This paper first discusses the relative merits of the various approaches in theory, and then provides substantial experimental data concerning two competing implementations - the buffer autotuning already present in Linux 2.4.x and 'Dynamic Right-Sizing.' This paper reveals heretofore unknown aspects of the problem and current solutions, provides insight into the proper approach for various circumstances, and points toward ways to further improve performance. TCP, for good or ill, is the only protocol widely available for reliable end-to-end congestion-controlled network communication, and thus it is the one used for almost all distributed computing. Unfortunately, TCP was not designed with high-performance computing in mind - its original design decisions focused on long-term fairness first, with performance a distant second. Thus users must often perform tortuous manual optimizations simply to achieve acceptable behavior. The most important and often most difficult task is determining and setting appropriate buffer sizes. Because of this, at least six ways of automatically setting these sizes have been proposed. In this paper, we compare and contrast these tuning methods. First we explain each method, followed by an in-depth discussion of their features. Next we discuss the experiments to fully characterize two particularly interesting methods (Linux 2.4 autotuning and Dynamic Right-Sizing). We conclude with results and possible improvements.
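    The quantity every tuning method, manual or automatic, must get right is the bandwidth-delay product: the buffer has to hold at least that much unacknowledged data to keep the pipe full. A small sketch with illustrative numbers:

    ```python
    def tcp_buffer_size(bandwidth_bps, rtt_s):
        """Minimum send/receive buffer (bytes) to keep a TCP connection's
        pipe full: one bandwidth-delay product of unacknowledged data."""
        return int(bandwidth_bps / 8 * rtt_s)

    # A 1 Gb/s path with a 40 ms RTT needs ~5 MB of buffer, far above
    # common static defaults (e.g. 64 KB), which is why manual tuning
    # was painful and automatic tuning attractive.
    bdp = tcp_buffer_size(1_000_000_000, 0.040)
    print(bdp)  # 5000000 bytes = 5 MB
    ```

    Autotuning schemes such as the two compared in the paper estimate this product dynamically per connection instead of relying on a static setting.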

  4. Intermediate leak protection/automatic shutdown for B and W helical coil steam generator

    International Nuclear Information System (INIS)

    The report summarizes a follow-on study to the multi-tiered Intermediate Leak/Automatic Shutdown System report. It makes the automatic shutdown system specific to the Babcock and Wilcox (B and W) helical coil steam generator and to the Large Development LMFBR Plant. Threshold leak criteria specific to this steam generator design are developed, and performance predictions are presented for a multi-tier intermediate leak, automatic shutdown system applied to this unit. Preliminary performance predictions for application to the helical coil steam generator were given in the referenced report; for the most part, these predictions have been confirmed. The importance of including a cover gas hydrogen meter in this unit is demonstrated by calculation of a response time one-fifth that of an in-sodium meter at hot standby and refueling conditions

  5. Automatic assessment of cardiac perfusion MRI

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Stegmann, Mikkel Bille; Larsson, Henrik B.W.

    2004-01-01

    In this paper, a method based on Active Appearance Models (AAM) is applied for automatic registration of myocardial perfusion MRI. A semi-quantitative perfusion assessment of the registered image sequences is presented. This includes the formation of perfusion maps for three parameters; maximum up...

  6. Feedback Improvement in Automatic Program Evaluation Systems

    Science.gov (United States)

    Skupas, Bronius

    2010-01-01

    Automatic program evaluation is a way to assess source program files. These techniques are used in learning management environments, programming exams and contest systems. However, use of automated program evaluation encounters problems: some evaluations are not clear for the students and the system messages do not show reasons for lost points.…

  7. Experiments in Automatic Library of Congress Classification.

    Science.gov (United States)

    Larson, Ray R.

    1992-01-01

    Presents the results of research into the automatic selection of Library of Congress Classification numbers based on the titles and subject headings in MARC records from a test database at the University of California at Berkeley Library School library. Classification clustering and matching techniques are described. (44 references) (LRW)

  8. Automatic Radiometric Normalization of Multitemporal Satellite Imagery

    DEFF Research Database (Denmark)

    Canty, Morton J.; Nielsen, Allan Aasbjerg; Schmidt, Michael

    2004-01-01

    The linear scale invariance of the multivariate alteration detection (MAD) transformation is used to obtain invariant pixels for automatic relative radiometric normalization of time series of multispectral data. Normalization by means of ordinary least squares regression method is compared...... normalization, compare favorably with results from normalization from manually obtained time-invariant features....

  9. 42 CFR 407.17 - Automatic enrollment.

    Science.gov (United States)

    2010-10-01

    ... SUPPLEMENTARY MEDICAL INSURANCE (SMI) ENROLLMENT AND ENTITLEMENT Individual Enrollment and Entitlement for SMI... enrolled for SMI if he or she: (1) Resides in the United States, except in Puerto Rico; (2) Becomes... chapter; and (3) Does not decline SMI enrollment. (b) Opportunity to decline automatic enrollment. (1)...

  10. Automatic extraction of legal concepts and definitions

    NARCIS (Netherlands)

    R. Winkels; R. Hoekstra

    2012-01-01

    In this paper we present the results of an experiment in automatic concept and definition extraction from written sources of law using relatively simple natural language and standard semantic web technology. The software was tested on six laws from the tax domain.

  11. Neuroanatomical automatic segmentation in brain cancer patients

    OpenAIRE

    D’Haese, P.; Niermann, K; Cmelak, A.; Donnelly, E.; Duay, V.; Li, R; Dawant, B.

    2003-01-01

    Conformally prescribed radiation therapy for brain cancer requires precisely defining the target treatment area, as well as delineating vital brain structures which must be spared from radiotoxicity. The current clinical practice of manually segmenting brain structures can be complex and exceedingly time consuming. Automatic computer-aided segmentation methods have been proposed to increase efficiency and reproducibility in developing radiation treatment plans. Previous studies have establishe...

  12. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan;

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  13. Automatic alignment of hieroglyphs and transliteration

    OpenAIRE

    Nederhof, Mark Jan

    2009-01-01

    Automatic alignment has important applications in philology, facilitating study of texts on the basis of electronic resources produced by different scholars. A simple technique is presented to realise such alignment for Ancient Egyptian hieroglyphic texts and transliteration. Preliminary experiments with the technique are reported, and plans for future work are discussed. Postprint

  14. Learning slip behavior using automatic mechanical supervision

    OpenAIRE

    Angelova, Anelia; Matthies, Larry; Helmick, Daniel; Perona, Pietro

    2007-01-01

    We address the problem of learning terrain traversability properties from visual input, using automatic mechanical supervision collected from sensors onboard an autonomous vehicle. We present a novel probabilistic framework in which the visual information and the mechanical supervision interact to learn particular terrain types and their properties. The proposed method is applied to learning of rover slippage from visual information in a completely auto...

  15. Automatic Synthesis of Robust and Optimal Controllers

    DEFF Research Database (Denmark)

    Cassez, Franck; Jessen, Jan Jacob; Larsen, Kim Guldstrand;

    2009-01-01

    In this paper, we show how to apply recent tools for the automatic synthesis of robust and near-optimal controllers for a real industrial case study. We show how to use three different classes of models and their supporting existing tools, Uppaal-TiGA for synthesis, phaver for verification, and S...

  16. Automatic Guidance System for Welding Torches

    Science.gov (United States)

    Smith, H.; Wall, W.; Burns, M. R., Jr.

    1984-01-01

    Digital system automatically guides welding torch to produce square-butt, V-groove, and lap-joint weldments within tracking accuracy of ±0.2 millimeter. Television camera observes and traverses weld joint, carrying welding torch behind. Image of joint digitized, and resulting data used to derive control signals that enable torch to track joint.

  17. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers of ever-doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  18. Automatically extracting class diagrams from spreadsheets

    NARCIS (Netherlands)

    Hermans, F.; Pinzger, M.; Van Deursen, A.

    2010-01-01

    The use of spreadsheets to capture information is widespread in industry. Spreadsheets can thus be a rich source of domain information. We propose to automatically extract this information and transform it into class diagrams. The resulting class diagram can be used by software engineers to under

  19. Automatic invariant detection in dynamic web applications

    NARCIS (Netherlands)

    Groeneveld, F.; Mesbah, A.; Van Deursen, A.

    2010-01-01

    The complexity of modern web applications increases as client-side JavaScript and dynamic DOM programming are used to offer a more interactive web experience. In this paper, we focus on improving the dependability of such applications by automatically inferring invariants from the client-side and us

  20. Automatic prejudice in childhood and early adolescence

    NARCIS (Netherlands)

    J. Degner; D. Wentura

    2010-01-01

    Four cross-sectional studies are presented that investigated the automatic activation of prejudice in children and adolescents (aged 9 years to 15 years). To this end, 4 different versions of the affective priming task were used, with pictures of ingroup and outgroup members being presented as prejudi

  1. Automatic thematic mapping in the EROS program

    Science.gov (United States)

    Edson, D. T.

    1972-01-01

    A specified approach to the automatic extraction and cartographic presentation of thematic data contained in multispectral photographic images is presented. Experimental efforts were directed toward the mapping of open waters, snow and ice, infrared reflective vegetation, and massed works of man. The system must also be able to process data from a wide variety of sources.
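As a hedged illustration of the kind of per-pixel thematic classification this record describes, the sketch below thresholds two toy spectral bands to label water and infrared-reflective vegetation. The band values, the NDVI cutoff, and the water thresholds are all assumptions chosen for the example, not values from the EROS program.

```python
import numpy as np

# Toy 2-band "scene": reflectance-like values for a 2x2 pixel grid.
red = np.array([[200.0, 30.0], [90.0, 40.0]])
nir = np.array([[60.0, 20.0], [180.0, 35.0]])

# Normalized difference vegetation index; high where vegetation reflects IR.
ndvi = (nir - red) / (nir + red + 1e-9)

themes = np.full(red.shape, "other", dtype=object)
themes[ndvi > 0.3] = "vegetation"          # IR-reflective vegetation (assumed cutoff)
themes[(nir < 50) & (red < 50)] = "water"  # open water absorbs both bands (assumed cutoff)
```

A production system would calibrate such thresholds against ground truth per scene rather than hard-coding them.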

  2. Automatic quality assurance in cutting and machining

    International Nuclear Information System (INIS)

    Requirements, economics, and the possibility of automatic data acquisition and processing are discussed for different production stages. Which of the stages (materials and measuring equipment handling, data acquisition, and data processing) is to have priority in automation depends on the time requirements of these stages. (orig.)

  3. Automatic Positioning System of Small Agricultural Robot

    Science.gov (United States)

    Momot, M. V.; Proskokov, A. V.; Natalchenko, A. S.; Biktimirov, A. S.

    2016-08-01

    The present article discusses automatic positioning systems of agricultural robots used in field work. The existing solutions in this area have been analyzed. The article proposes an original solution, which is easy to implement and is characterized by high-accuracy positioning.

  4. Automatic Water Sensor Window Opening System

    KAUST Repository

    Percher, Michael

    2013-12-05

    A system can automatically open at least one window of a vehicle when the vehicle is being submerged in water. The system can include a water collector and a water sensor, and when the water sensor detects water in the water collector, at least one window of the vehicle opens.

  5. Automatic characterization of dynamics in Absence Epilepsy

    DEFF Research Database (Denmark)

    Petersen, Katrine N. H.; Nielsen, Trine N.; Kjær, Troels W.;

    2013-01-01

    Dynamics of the spike-wave paroxysms in Childhood Absence Epilepsy (CAE) are automatically characterized using novel approaches. Features are extracted from scalograms formed by Continuous Wavelet Transform (CWT). Detection algorithms are designed to identify an estimate of the temporal development...

  6. The CHilean Automatic Supernova sEarch

    DEFF Research Database (Denmark)

    Hamuy, M.; Pignata, G.; Maza, J.;

    2012-01-01

    The CHilean Automatic Supernova sEarch (CHASE) project began in 2007 with the goal to discover young, nearby southern supernovae in order to (1) better understand the physics of exploding stars and their progenitors, and (2) refine the methods to derive extragalactic distances. During the first...

  7. Automatically predicting mood from expressed emotions

    NARCIS (Netherlands)

    Katsimerou, C.

    2016-01-01

    Affect-adaptive systems have the potential to assist users that experience systematically negative moods. This thesis aims at building a platform for predicting automatically a person’s mood from his/her visual expressions. The key word is mood, namely a relatively long-term, stable and diffused aff

  8. Hierarchical word clustering - automatic thesaurus generation

    OpenAIRE

    Hodge, V.J.; Austin, J.

    2002-01-01

    In this paper, we propose a hierarchical, lexical clustering neural network algorithm that automatically generates a thesaurus (synonym abstraction) using purely stochastic information derived from unstructured text corpora and requiring no prior word classifications. The lexical hierarchy overcomes the Vocabulary Problem by accommodating paraphrasing through using synonym clusters and overcomes Information Overload by focusing search within cohesive clusters. We describe existing word catego...
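As a hedged sketch of the synonym-clustering idea in this record, the function below greedily groups words whose co-occurrence vectors have high cosine similarity to a cluster seed. This is a deliberately simplified stand-in for the paper's hierarchical neural clustering; the words, vectors, and similarity threshold are illustrative assumptions.

```python
import numpy as np

def cluster_synonyms(vectors, words, threshold=0.8):
    """Greedy single-pass clustering by cosine similarity to each cluster's
    seed vector. Rows of `vectors` are word co-occurrence profiles."""
    normed = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    clusters = []  # list of (seed_vector, member_words)
    for vec, word in zip(normed, words):
        for seed, members in clusters:
            if float(seed @ vec) >= threshold:  # cosine similarity of unit vectors
                members.append(word)
                break
        else:
            clusters.append((vec, [word]))
    return [members for _, members in clusters]

words = ["car", "automobile", "banana"]
cooc = np.array([[1.0, 0.9, 0.0],   # toy co-occurrence rows: "car" and
                 [0.9, 1.0, 0.0],   # "automobile" share contexts,
                 [0.0, 0.0, 1.0]])  # "banana" does not
clusters = cluster_synonyms(cooc, words)
```

With these toy vectors, "car" and "automobile" land in one cluster and "banana" in another.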

  9. Automatization and familiarity in repeated checking

    NARCIS (Netherlands)

    Dek, E.C.P.; van den Hout, M.A.; Giele, C.L.; Engelhard, I.M.

    2015-01-01

    Repetitive, compulsive-like checking of an object leads to reductions in memory confidence, vividness, and detail. Experimental research suggests that this is caused by increased familiarity with perceptual characteristics of the stimulus and automatization of the checking procedure (Dek, van den Ho

  10. Automatically identifying periodic social events from Twitter

    NARCIS (Netherlands)

    Kunneman, F.A.; Bosch, A.P.J. van den

    2015-01-01

    Many events referred to on Twitter are of a periodic nature, characterized by roughly constant time intervals in between occurrences. Examples are annual music festivals, weekly television programs, and the full moon cycle. We propose a system that can automatically identify periodic events from Twi

  11. Automatic Estimation of Movement Statistics of People

    DEFF Research Database (Denmark)

    Ægidiussen Jensen, Thomas; Rasmussen, Henrik Anker; Moeslund, Thomas B.

    2012-01-01

    Automatic analysis of how people move about in a particular environment has a number of potential applications. However, no system has so far been able to do detection and tracking robustly. Instead, trajectories are often broken into tracklets. The key idea behind this paper is based around...

  12. Reduction of Dutch Sentences for Automatic Subtitling

    NARCIS (Netherlands)

    Tjong Kim Sang, E.F.; Daelemans, W.; Höthker, A.

    2004-01-01

    We compare machine learning approaches for sentence length reduction for automatic generation of subtitles for deaf and hearing-impaired people with a method which relies on hand-crafted deletion rules. We describe building the necessary resources for this task: a parallel corpus of examples of news

  13. Precision laser automatic tracking system.

    Science.gov (United States)

    Lucy, R F; Peters, C J; McGann, E J; Lang, K T

    1966-04-01

    A precision laser tracker has been constructed and tested that is capable of tracking a low-acceleration target to an accuracy of about 25 microrad root mean square. In tracking high-acceleration targets, the error is directly proportional to the angular acceleration. For an angular acceleration of 0.6 rad/sec(2), the measured tracking error was about 0.1 mrad. The basic components in this tracker, similar in configuration to a heliostat, are a laser and an image dissector, which are mounted on a stationary frame, and a servocontrolled tracking mirror. The daytime sensitivity of this system is approximately 3 x 10(-10) W/m(2); the ultimate nighttime sensitivity is approximately 3 x 10(-14) W/m(2). Experimental tests were performed to evaluate both the dynamic characteristics and the sensitivity of the system. Dynamic performance was obtained using a small rocket covered with retroreflective material launched at an acceleration of about 13 g at a point 204 m from the tracker. The daytime sensitivity of the system was checked using an efficient retroreflector mounted on a light aircraft. This aircraft was tracked out to a maximum range of 15 km, which confirmed the daytime sensitivity of the system measured by other means. The system has also been used to passively track stars and the Echo I satellite. It passively tracked a +7.5 magnitude star, and the signal-to-noise ratio in this experiment indicates that it should be possible to track a +12.5 magnitude star.

  14. Automatic software fault localization based on artificial bee colony

    Institute of Scientific and Technical Information of China (English)

    Linzhi Huang; Jun Ai

    2015-01-01

    Software debugging accounts for a vast majority of the financial and time costs in software development and maintenance. Thus, approaches to software fault localization that can help automate the debugging process have become a hot topic in the field of software engineering. Given the great demand for software fault localization, an approach based on the artificial bee colony (ABC) algorithm is proposed and integrated with other related techniques. In this process, the source program is initially instrumented after analyzing the dependence information. The test case sets are then compiled and run on the instrumented program, and the execution results are input to the ABC algorithm. The algorithm determines the largest fitness value and best food source by calculating the average fitness of the employed bees in the iterative process. The program unit with the highest suspicion score corresponding to the best test case set is regarded as the final fault localization. Experiments are conducted with the TCAS program in the Siemens suite. Results demonstrate that the proposed fault localization method is effective and efficient. The ABC algorithm can efficiently avoid local optima and ensure the validity of the fault localization to a large extent.
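To make the ABC search concrete, here is a minimal artificial bee colony optimizer maximizing a toy one-dimensional "suspiciousness" landscape. It implements only the employed-bee and scout phases (the onlooker phase is omitted for brevity); all parameter values and the fitness function are assumptions for illustration, not the paper's actual system.

```python
import random

def abc_maximize(fitness, lo, hi, n_sources=10, iters=200, limit=20, seed=1):
    """Minimal artificial bee colony search over a 1-D interval."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_sources)]  # food sources
    trials = [0] * n_sources                              # stagnation counters
    best = max(xs, key=fitness)
    for _ in range(iters):
        for i in range(n_sources):
            k = rng.randrange(n_sources)      # random partner source
            phi = rng.uniform(-1, 1)
            cand = min(hi, max(lo, xs[i] + phi * (xs[i] - xs[k])))
            if fitness(cand) > fitness(xs[i]):  # greedy selection
                xs[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if fitness(xs[i]) > fitness(best):  # remember global best
                best = xs[i]
            if trials[i] > limit:               # abandoned source -> scout phase
                xs[i], trials[i] = rng.uniform(lo, hi), 0
    return best

# Toy fitness with a single peak at x = 3: the search should converge near it.
best = abc_maximize(lambda x: -(x - 3.0) ** 2, 0.0, 10.0)
```

In the paper's setting, the fitness would instead score test case sets against the instrumented program's execution results.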

  15. Automatic Blocking Of QR and LU Factorizations for Locality

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Q; Kennedy, K; You, H; Seymour, K; Dongarra, J

    2004-03-26

    QR and LU factorizations for dense matrices are important linear algebra computations that are widely used in scientific applications. To efficiently perform these computations on modern computers, the factorization algorithms need to be blocked when operating on large matrices to effectively exploit the deep cache hierarchy prevalent in today's computer memory systems. Because both the QR (based on Householder transformations) and LU factorization algorithms contain complex loop structures, few compilers can fully automate the blocking of these algorithms. Although linear algebra libraries such as LAPACK provide manually blocked implementations of these algorithms, automatically generating blocked versions of the computations yields additional benefits, such as automatic adaptation to different blocking strategies. This paper demonstrates how to apply an aggressive loop transformation technique, dependence hoisting, to produce efficient blockings for both QR and LU with partial pivoting. We present different blocking strategies that can be generated by our optimizer and compare the performance of the auto-blocked versions with the manually tuned versions in LAPACK, using reference BLAS, ATLAS BLAS, and native BLAS specially tuned for the underlying machine architectures.
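The blocking idea can be sketched as follows: factor a small diagonal block, solve triangular systems for the adjacent panels, then update the trailing matrix with one large matrix multiply, which is the cache-friendly step. This is a simplified right-looking blocked LU without pivoting (LAPACK's `getrf` also pivots); the test matrix is made diagonally dominant so pivoting can safely be skipped in the demo.

```python
import numpy as np

def blocked_lu(A, bs=2):
    """Right-looking blocked LU without pivoting; returns L (unit lower) and U."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for k in range(0, n, bs):
        kb = min(bs, n - k)
        # 1) unblocked LU of the kb x kb diagonal block
        for j in range(k, k + kb):
            A[j + 1:k + kb, j] /= A[j, j]
            A[j + 1:k + kb, j + 1:k + kb] -= np.outer(
                A[j + 1:k + kb, j], A[j, j + 1:k + kb])
        if k + kb < n:
            L11 = np.tril(A[k:k + kb, k:k + kb], -1) + np.eye(kb)
            U11 = np.triu(A[k:k + kb, k:k + kb])
            # 2) triangular solves for the row panel (U12) and column panel (L21)
            A[k:k + kb, k + kb:] = np.linalg.solve(L11, A[k:k + kb, k + kb:])
            A[k + kb:, k:k + kb] = np.linalg.solve(
                U11.T, A[k + kb:, k:k + kb].T).T
            # 3) trailing-matrix update: the large, cache-friendly GEMM
            A[k + kb:, k + kb:] -= A[k + kb:, k:k + kb] @ A[k:k + kb, k + kb:]
    return np.tril(A, -1) + np.eye(n), np.triu(A)

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)) + 6.0 * np.eye(6)  # dominant diagonal: safe pivots
L, U = blocked_lu(A, bs=2)
```

The block size `bs` is the tuning knob the paper's optimizer adapts automatically; real implementations choose it to fit the cache hierarchy.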

  16. Automatic Dissection Position Selection for Cleavage-Stage Embryo Biopsy.

    Science.gov (United States)

    Wang, Zenan; Ang, Wei Tech

    2016-03-01

    Embryo biopsies are routinely performed for preimplantation genetic diagnosis (PGD). In order to avoid blastomere membrane rupture and cell lysis, correct selection of a suitable dissection position on the zona pellucida (ZP) is necessary. Although the technology for automated cell manipulation has advanced greatly over the past decade, fully automated embryo biopsy in PGD has not yet been realized. Automated PGD may ultimately set a new clinical standard that improves the consistency of outcomes, increases cell survival rates, flattens the learning curve of the manual procedure, and reduces the effects of human fatigue. In this paper, we present the first approach to automatically select a suitable ZP dissection position prior to embryo biopsy from a single focused embryo image based on edge detection. The proposed method consists of a technique that estimates the elliptical ZP boundaries and another two techniques that select the suitable position for ZP dissection. These techniques achieved success rates of 96%, 94%, and 94%, respectively. In addition, the proposed ZP boundary estimation technique has the potential to perform the ZP thickness variation (ZPTV) test and other ZP morphology measurements with further improvement in the future. Our methods provide a starting point for fast position selection prior to automatic embryo biopsy. PMID:26259216
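The boundary-estimation step amounts to fitting a conic to edge points detected on the membrane. As a hedged illustration, the sketch below uses a least-squares (Kasa) circle fit rather than the paper's ellipse fit, on synthetic edge points; the point coordinates are assumptions for the example.

```python
import numpy as np

def fit_circle(xs, ys):
    """Kasa least-squares circle fit: solve 2*cx*x + 2*cy*y + c = x^2 + y^2."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones(len(xs))])
    b = xs ** 2 + ys ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r

# Synthetic "membrane" edge points on a circle of radius 3 centered at (5, -2).
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
xs = 5.0 + 3.0 * np.cos(theta)
ys = -2.0 + 3.0 * np.sin(theta)
cx, cy, r = fit_circle(xs, ys)
```

An ellipse fit replaces the three-parameter model with a general conic but follows the same least-squares pattern.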

  17. A Cough-Based Algorithm for Automatic Diagnosis of Pertussis.

    Science.gov (United States)

    Pramono, Renard Xaviero Adhi; Imtiaz, Syed Anas; Rodriguez-Villegas, Esther

    2016-01-01

    Pertussis is a contagious respiratory disease which mainly affects young children and can be fatal if left untreated. The World Health Organization estimates 16 million pertussis cases annually worldwide resulting in over 200,000 deaths. It is prevalent mainly in developing countries where it is difficult to diagnose due to the lack of healthcare facilities and medical professionals. Hence, a low-cost, quick and easily accessible solution is needed to provide pertussis diagnosis in such areas to contain an outbreak. In this paper we present an algorithm for automated diagnosis of pertussis using audio signals by analyzing cough and whoop sounds. The algorithm consists of three main blocks to perform automatic cough detection, cough classification and whooping sound detection. Each of these extract relevant features from the audio signal and subsequently classify them using a logistic regression model. The output from these blocks is collated to provide a pertussis likelihood diagnosis. The performance of the proposed algorithm is evaluated using audio recordings from 38 patients. The algorithm is able to diagnose all pertussis successfully from all audio recordings without any false diagnosis. It can also automatically detect individual cough sounds with 92% accuracy and PPV of 97%. The low complexity of the proposed algorithm coupled with its high accuracy demonstrates that it can be readily deployed using smartphones and can be extremely useful for quick identification or early screening of pertussis and for infection outbreaks control. PMID:27583523
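Each block in the pipeline above classifies extracted features with a logistic regression model. The sketch below trains a plain batch-gradient logistic classifier on synthetic two-dimensional features standing in for audio descriptors; the feature names, values, and hyperparameters are assumptions, not the paper's actual feature set.

```python
import numpy as np

def train_logistic(X, y, lr=0.5, epochs=500):
    """Batch-gradient logistic regression; returns weights and bias."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        grad = p - y                             # gradient of log loss
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Synthetic (spectral-centroid-like, duration-like) features, scaled to [0, 1].
X = np.array([[0.90, 0.80], [0.80, 0.90], [0.85, 0.70],   # "cough" class
              [0.10, 0.20], [0.20, 0.10], [0.15, 0.30]])  # "not cough" class
y = np.array([1, 1, 1, 0, 0, 0])
w, b = train_logistic(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
```

The paper chains three such classifiers (cough detection, cough classification, whoop detection) and collates their outputs into a likelihood diagnosis.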

  18. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. To this end, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); production was then launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, and the infrastructure to store (up to 40 TB between results and intermediate files) and process them, using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; the stability of the software (TerraScan by TerraSolid, GlobalMapper by Blue Marble, FME by Safe, ArcGIS by Esri) and the management of human resources were also important. The result of this production has been an accurate automatically extracted river network for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production, and its advantages over traditional vector extraction systems.
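The flow-accumulation criterion mentioned above can be sketched with the classic D8 algorithm: each cell drains to its steepest-descent neighbour, and cells are processed from highest to lowest elevation so upstream contributions arrive before a cell passes its total downstream. The tiny DEM below is an assumption for illustration, not IGN-ES data.

```python
import numpy as np

def d8_flow_accumulation(dem):
    """D8 flow accumulation on a depressionless DEM grid."""
    rows, cols = dem.shape
    acc = np.ones(dem.shape)  # each cell contributes its own area
    for idx in np.argsort(dem, axis=None)[::-1]:  # high to low elevation
        r, c = divmod(int(idx), cols)
        best_drop, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    # slope = elevation drop over distance (diagonals are longer)
                    drop = (dem[r, c] - dem[nr, nc]) / np.hypot(dr, dc)
                    if drop > best_drop:
                        best_drop, target = drop, (nr, nc)
        if target is not None:
            acc[target] += acc[r, c]  # pass accumulated flow downstream
    return acc

dem = np.array([[5.0, 4.0, 5.0],
                [4.0, 3.0, 4.0],
                [5.0, 2.0, 5.0]])  # a small valley draining to the bottom centre
acc = d8_flow_accumulation(dem)
```

Thresholding the accumulation grid then yields the channel cells that form the extracted river network.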

  19. Cross-cultural Adaptation of the Health Belief Scale and Evaluation of Its Reliability and Validity

    Institute of Scientific and Technical Information of China (English)

    季韶艳; 杨辉

    2013-01-01

    Objective: To cross-culturally adapt the Health Belief Scale from the Chinese version of the Nursing Outcomes Classification and to evaluate its reliability and validity, providing a scientific and valid instrument for assessing the health beliefs of Chinese patients. Method: Following guidelines for cross-cultural scale adaptation, a new version of the Health Belief Scale was formed through two rounds of expert consultation and a pilot test with 40 patients; reliability and validity were then analyzed on survey results from 200 hospitalized patients. Result: The internal consistency of the new scale was 0.935, test-retest reliability 0.889, and split-half reliability 0.936. Expert review indicated good content validity. The KMO and Bartlett's tests showed the data were suitable for factor analysis; exploratory factor analysis extracted five common factors with a cumulative variance contribution of 54.993%. Conclusion: The final version of the Health Belief Scale has high reliability and validity, is scientific and simple to administer, and is suitable for wide application among Chinese patients to assess their level of health beliefs and objectively evaluate the quality of nursing services.

  20. Technical evaluation of TomoTherapy automatic roll correction.

    Science.gov (United States)

    Laub, Steve; Snyder, Michael; Burmeister, Jay

    2015-01-01

    The TomoTherapy Hi·Art System allows the application of rotational corrections as a part of the pretreatment image guidance process. This study outlines a custom method to perform an end-to-end evaluation of the TomoTherapy Hi·Art roll correction feature. A roll-sensitive plan was designed and delivered to a cylindrical solid water phantom to test the accuracy of roll corrections, as well as the ability of the automatic registration feature to detect induced roll. Cylindrical target structures containing coaxial inner avoidance structures were placed adjacent to the plane bisecting the phantom and 7 cm laterally off central axis. The phantom was positioned at isocenter with the target-plane parallel to the couch surface. Varying degrees of phantom roll were induced and dose to the targets and inner avoidance structures was measured using Kodak EDR2 films placed in the target-plane. Normalized point doses were compared with baseline (no roll) data to determine the sensitivity of the test and the effectiveness of the roll correction feature. Gamma analysis comparing baseline, roll-corrected, and uncorrected films was performed using film analysis software. MVCT images were acquired prior to plan delivery. Measured roll was compared with induced roll to evaluate the automatic registration feature's ability to detect rotational misalignment. Rotations beyond 0.3° result in statistically significant deviation from baseline point measurements. Gamma pass rates begin to drop below 90% at approximately 0.5° induced rotation at 3%/3 mm and between 0.2° and 0.3° for 2%/2 mm. With roll correction applied, point dose measurements for all rotations are indistinguishable from baseline, and gamma pass rates exceed 96% when using 3% and 3 mm as evaluation criteria. Measured roll via the automatic registration algorithm agrees with induced rotation to within the test sensitivity for nearly all imaging settings. The TomoTherapy automatic registration system accurately detects