WorldWideScience

Sample records for automatic performance debugging

  1. Automatic Performance Debugging of SPMD Parallel Programs

    CERN Document Server

    Liu, Xu; Zhan, Jianfeng; Tu, Bibo; Meng, Dan

    2010-01-01

Automatic performance debugging of parallel applications usually involves two steps: automatic detection of performance bottlenecks and uncovering their root causes for performance optimization. Previous work fails to resolve this challenging issue in several ways: first, several previous efforts automate analysis processes, but present the results in a confined way that only identifies performance problems with a priori knowledge; second, several tools take exploratory or confirmatory data analysis to automatically discover relevant performance data relationships. However, these efforts do not focus on locating performance bottlenecks or uncovering their root causes. In this paper, we design and implement an innovative system, AutoAnalyzer, to automatically debug the performance problems of single program, multiple data (SPMD) parallel programs. Our system is unique in terms of two dimensions: first, without any a priori knowledge, we automatically locate bottlenecks and uncover their root causes for performance o...

  2. Automatic Performance Debugging of SPMD-style Parallel Programs

    CERN Document Server

    Liu, Xu; Zhan, Kunlin; Shi, Weisong; Yuan, Lin; Meng, Dan; Wang, Lei

    2011-01-01

The single program, multiple data (SPMD) programming model is widely used for both high performance computing and Cloud computing. In this paper, we design and implement an innovative system, AutoAnalyzer, that automates the process of debugging performance problems of SPMD-style parallel programs, including data collection, performance behavior analysis, locating bottlenecks, and uncovering their root causes. AutoAnalyzer is unique in terms of two features: first, without any a priori knowledge, it automatically locates bottlenecks and uncovers their root causes for performance optimization; second, it is lightweight in terms of the size of performance data to be collected and analyzed. Our contributions are three-fold: first, we propose two effective clustering algorithms to investigate the existence of performance bottlenecks that cause process behavior dissimilarity or code region behavior disparity, respectively; meanwhile, we present two searching algorithms to locate bottlenecks; second, on a basis o...
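The first clustering idea, flagging processes whose performance behavior is dissimilar from the rest, can be illustrated with a minimal sketch. The region names, timings, and the simple distance-to-mean outlier test below are illustrative stand-ins, not AutoAnalyzer's actual algorithms:

```python
import math

# Per-process wall-clock times (seconds) for three instrumented code
# regions. Numbers are invented; process 3 spends far longer in
# "exchange", a hypothetical load-imbalance bottleneck.
profiles = {
    0: {"compute": 4.1, "exchange": 0.9, "reduce": 0.3},
    1: {"compute": 4.0, "exchange": 1.0, "reduce": 0.3},
    2: {"compute": 4.2, "exchange": 0.8, "reduce": 0.4},
    3: {"compute": 4.1, "exchange": 3.9, "reduce": 0.3},
}

def dissimilar_processes(profiles, factor=1.5):
    """Flag processes whose region profile lies unusually far from
    the mean profile (a crude stand-in for clustering)."""
    regions = sorted(next(iter(profiles.values())))
    mean = [sum(p[r] for p in profiles.values()) / len(profiles)
            for r in regions]
    dist = {pid: math.dist([p[r] for r in regions], mean)
            for pid, p in profiles.items()}
    typical = sum(dist.values()) / len(dist)
    return [pid for pid, d in dist.items() if d > factor * typical]

print(dissimilar_processes(profiles))  # → [3]
```

A real tool would cluster full behavior vectors rather than thresholding distances, but the output is the same kind of artifact: a short list of suspect processes to examine further.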

  3. Automatic program debugging for intelligent tutoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Murray, W.R.

    1986-01-01

This thesis explores the process by which student programs can be automatically debugged in order to increase the instructional capabilities of intelligent tutoring systems. This research presents a methodology and implementation for the diagnosis and correction of nontrivial recursive programs. In this approach, recursive programs are debugged by repairing induction proofs in the Boyer-Moore Logic. The potential of a program debugger to automatically debug widely varying novice programs in a nontrivial domain is proportional to its capability to reason about computational semantics. By increasing these reasoning capabilities a more powerful and robust system can result. This thesis supports these claims by examining related work in automated program debugging and by discussing the design, implementation, and evaluation of Talus, an automatic debugger for LISP programs. Talus relies on its ability to reason about computational semantics to perform algorithm recognition, infer code teleology, and automatically detect and correct nonsyntactic errors in student programs written in a restricted, but nontrivial, subset of LISP.

  4. Automatic Debugging Support for UML Designs

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

Design of large software systems requires rigorous application of software engineering methods covering all phases of the software process. Debugging during the early design phases is extremely important, because late bug-fixes are expensive. In this paper, we describe an approach which facilitates debugging of UML requirements and designs. The Unified Modeling Language (UML) is a set of notations for object-oriented design of a software system. We have developed an algorithm which translates requirement specifications in the form of annotated sequence diagrams into structured statecharts. This algorithm detects conflicts between sequence diagrams and inconsistencies in the domain knowledge. After synthesizing statecharts from sequence diagrams, these statecharts usually are subject to manual modification and refinement. By using the "backward" direction of our synthesis algorithm, we are able to map modifications made to the statechart back into the requirements (sequence diagrams) and check for conflicts there. Fed back to the user, conflicts detected by our algorithm form the basis for deduction-based debugging of requirements and domain theory at very early development stages. Our approach allows us to generate explanations of why there is a conflict and which parts of the specifications are affected.

  5. Do moods affect programmers’ debug performance?

    OpenAIRE

    Khan, I A; Brinkman, W.P.; Hierons, R. M.

    2010-01-01

There is much research showing that people’s mood can affect their activities. This paper argues that this also applies to programmers, especially their debugging. A literature-based framework is presented linking programming with various cognitive activities, as well as linking cognitive activities with moods. Further, the effect of mood on debugging was tested in two experiments. In the first experiment, programmers (n = 72) saw short movie clips selected for their ability to provoke specific mo...

  6. Do moods affect programmers’ debug performance?

    NARCIS (Netherlands)

    Khan, I.A.; Brinkman, W.P.; Hierons, R.M.

    2010-01-01

There is much research showing that people’s mood can affect their activities. This paper argues that this also applies to programmers, especially their debugging. A literature-based framework is presented linking programming with various cognitive activities as well as linking cognitive activities with

  7. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  8. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  9. Supercomputer debugging workshop `92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  10. Extraction Error Modeling and Automated Model Debugging in High-Performance Low Power Custom Designs

    OpenAIRE

    Yang, Yu-Shen; Veneris, Andreas; Thadikaran, Paul; Venkataraman, Srikanth

    2005-01-01

    Test model generation is common in the design cycle of custom made high performance low power designs targeted for high volume production. Logic extraction is a key step in test model generation to produce a logic level netlist from the transistor level representation. This is a semi-automated process which is error prone. This paper analyzes typical extraction errors applicable to clocking schemes seen in high-performance designs today. An automated debugging solution for these errors in des...

  11. Debugging, Advanced Debugging and Runtime Analysis

    Directory of Open Access Journals (Sweden)

    Salim Istyaq

    2010-03-01

variables on error or at chosen points. Automated functional GUI testing tools are used to repeat system-level tests through the GUI; benchmarks allow run-time performance comparisons to be made; and performance analysis can help to highlight hot spots and resource usage. A runtime tool will allow you to examine the application internals after the run via the recorded runtime analysis data. Runtime analysis removes the guesswork from debugging. It helps to uncover memory corruption, memory leaks, etc. Runtime analysis is an effort aimed at understanding software component behavior by using data collection during the execution of the component. Runtime analysis is a topic of great interest in Computer Science. A program can take seconds, hours or even years to finish executing, depending on various parameters.
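As a concrete example of the kind of leak detection such runtime analysis performs, Python's standard tracemalloc module can diff heap snapshots taken around a suspect operation; the leaky function here is a contrived example:

```python
import tracemalloc

leak = []

def leaky(n):
    # Contrived "leak": keep growing a module-level list forever.
    leak.extend(bytearray(1024) for _ in range(n))

tracemalloc.start()
before = tracemalloc.take_snapshot()
leaky(100)
after = tracemalloc.take_snapshot()

# Compare snapshots per source line; the biggest growth points at the
# allocation site responsible for the retained memory.
stats = after.compare_to(before, "lineno")
top = stats[0]
print(top.size_diff > 50_000)  # → True: ~100 KiB retained by leaky()
```

Commercial runtime-analysis suites do the same bookkeeping continuously and at native-code level, but the principle, attribute retained allocations to their source location, is identical.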

  12. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-01-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  13. TinyDebug

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    2011-01-01

Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, this paper presents TinyDebug, a multi-purpose passive debugging framework for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...

  14. DySectAPI: Scalable Prescriptive Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Quarfot Nielsen, Niklas;

We present the DySectAPI, a tool that allows users to construct probe trees for automatic, event-driven debugging at scale. The traditional, interactive debugging model, whereby users manually step through and inspect their application, does not scale well even for current supercomputers. While...... lightweight debugging models scale well, they can currently only debug a subset of bug classes. DySectAPI fills the gap between these two approaches with a novel user-guided approach. Using both experimental results and analytical modeling we show how DySectAPI scales and can run with a low overhead on...

  15. MPI Debugging with Handle Introspection

    OpenAIRE

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.; Karlsson, Sven; Mohror, Kathryn

    2014-01-01

The Message Passing Interface, MPI, is the standard programming model for high performance computing clusters. However, debugging applications on large scale clusters is difficult. The widely used Message Queue Dumping interface enables inspection of message queue state, but there is no general interface for extracting information from MPI objects such as communicators. A developer can debug the MPI library as if it were part of the application, but this exposes an unneeded level of detail. The ...

  16. 05501 Summary -- Automatic Performance Analysis

    OpenAIRE

    Gerndt, Hans Michael; Malony, Allen; Miller, Barton P.; Nagel, Wolfgang

    2006-01-01

    The Workshop on Automatic Performance Analysis (WAPA 2005, Dagstuhl Seminar 05501), held December 13-16, 2005, brought together performance researchers, developers, and practitioners with the goal of better understanding the methods, techniques, and tools that are needed for the automation of performance analysis for high performance computing.

  17. Debugging, Advanced Debugging and Runtime Analysis

    OpenAIRE

    Salim Istyaq; Aufaq Zargar

    2010-01-01

    This paper discusses debugging and runtime analysis of software and outlines its enormous benefits to software developers and testers. A debugger is usually quite helpful in tracking down many logic problems. However, even with the most advanced debugger at your disposal, it doesn't guarantee that it will be a straightforward task to rid your program of bugs. Debugging techniques might help you in your task of flushing errors out of your program. Some of these willdirectly involve the debugge...

  18. Debugging Democracy

    Directory of Open Access Journals (Sweden)

    Alexander Likhotal

    2016-05-01

Democracy was the most successful political idea of the 20th century. However, since the beginning of the new century democracy has clearly been suffering from serious structural problems, rather than a few isolated ailments. Why has it run into trouble, and can it be revived? In the consumption-driven world, people have come to believe in economic prosperity as the guarantee of human freedom. As a result, human development and personal status have become hostages of economic performance, deforming civilisation's basic ethical matrix. However, in 10-15 years the world may be completely different. We are looking at communications and technology revolutions occurring in very abbreviated time frames. Soon, billions of people will interact via a fast data-transferring Metaweb, and it will change social standards as well as human behaviour patterns. Integrated global economies functioning as holistic entities will spur a deep reframing of global governance, shaping a new configuration of political, economic and military power. One can hardly expect that these changes will leave democratic mechanisms intact. It is a pivotal moment for all of us because we are facing paradigm changes in our way of life. We clearly need a new political vision that is deliverable quickly. Democracy can be reset if it can provide a platform for collective judgement and individual development, in a value-driven process where values manifest themselves in concrete and socially meaningful issues and are not reduced to economic optimization and the politics of the wallet. In other words, the only remedy for the crisis of democracy is more democracy.

  19. Model-Based Debugging

    OpenAIRE

    Mirghasemi, Salman

    2009-01-01

Software debugging is still one of the most challenging and time consuming aspects of software development. Monitoring software behavior and finding the causes of that behavior are at the center of the debugging process. Although many tools and techniques have been introduced to support developers in this task, we are still a long way from the ideal point. In this paper, we first give a detailed explanation of the main issues in this domain and why the available techniques and tools...

  20. MPI Debugging with Handle Introspection

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.;

    The Message Passing Interface, MPI, is the standard programming model for high performance computing clusters. However, debugging applications on large scale clusters is difficult. The widely used Message Queue Dumping interface enables inspection of message queue state but there is no general...... interface for extracting information from MPI objects such as communicators. A developer can debug the MPI library as if it was part of the application, but this exposes an unneeded level of detail. The Tools Working Group in the MPI Forum has proposed a specification for MPI Handle Introspection. It...... defines a standard interface that lets debuggers extract information from MPI objects. Extracted information is then presented to the developer, in a human readable format. The interface is designed to be independent of MPI implementations and debuggers. In this paper, we describe our support for...

  1. IDebug: An Advanced Debugging Framework for Java

    OpenAIRE

    Kiniry, Joseph R.

    1998-01-01

    IDebug, the Infospheres debugging framework, is an advanced debugging framework for Java. This framework provides the standard core debugging and specification constructs such as assertions, debug levels and categories, stack traces, and specialized exceptions. Debugging functionality can be fine-tuned to a per-thread and/or a per-class basis, debugging contexts can be stored to and recovered from persistent storage, and several aspects of the debugging run-time are configurable at the meta-l...

  2. Voice-controlled Debugging of Spreadsheets

    CERN Document Server

    Flood, Derek

    2008-01-01

Developments in Mobile Computing are putting pressure on the software industry to research new modes of interaction that do not rely on the traditional keyboard and mouse combination. Computer users suffering from Repetitive Strain Injury also seek an alternative to keyboard and mouse devices to reduce suffering in wrist and finger joints. Voice-control is an alternative approach to spreadsheet development and debugging that has been researched and used successfully in other domains. While voice-control technology for spreadsheets is available, its effectiveness has not been investigated. This study is the first to compare the performance of a set of expert spreadsheet developers that debugged a spreadsheet using voice-control technology and another set that debugged the same spreadsheet using keyboard and mouse. The study showed that voice, despite its advantages, proved to be slower and less accurate. However, it also revealed ways in which the technology might be improved to redress this imbalance.

  3. Integrated Debugging of Modelica Models

    Directory of Open Access Journals (Sweden)

    Adrian Pop

    2014-04-01

The high abstraction level of equation-based object-oriented (EOO) languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of those problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational and algorithmic code debugging in an integrated fashion.

  4. Handwriting Automaticity: The Search for Performance Thresholds

    Science.gov (United States)

    Medwell, Jane; Wray, David

    2014-01-01

    Evidence is accumulating that handwriting has an important role in written composition. In particular, handwriting automaticity appears to relate to success in composition. This relationship has been little explored in British contexts and we currently have little idea of what threshold performance levels might be. In this paper, we report on two…

  5. Debugging in a multi-processor environment

    International Nuclear Information System (INIS)

The Supervisory Control and Diagnostic System (SCDS) for the Mirror Fusion Test Facility (MFTF) consists of nine 32-bit minicomputers arranged in a tightly coupled distributed computer system utilizing a shared memory as the data exchange medium. Debugging more than one program in the multi-processor environment is a difficult process. This paper describes what new tools were developed and how the testing of software is performed in the SCDS for the MFTF project

  6. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark

    Science.gov (United States)

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-01-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today’s data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG’s simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.
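The on-demand watchpoint idea, inspecting only records that match a user predicate instead of pausing the whole computation, can be mimicked outside Spark with a small sketch; the record stream and predicate are invented for illustration:

```python
def watchpoint(records, predicate, capture):
    """Pass every record through untouched; siphon matching records
    into `capture` for later inspection, without pausing the job."""
    for rec in records:
        if predicate(rec):
            capture.append(rec)
        yield rec

# Hypothetical job: sum latencies while capturing suspicious negative
# values on the side (all names and data invented for illustration).
captured = []
latencies = [12, 7, -3, 9, -1]
total = sum(watchpoint(latencies, lambda x: x < 0, captured))
print(total, captured)  # → 24 [-3, -1]
```

BIGDEBUG applies the same principle inside distributed Spark operators, where avoiding a global pause is what makes the inspection affordable at terabyte scale.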

  7. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers affect significantly their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, network interconnect, such as Infiniband, may be exploited to maximize energy savings while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather and proposes energy saving strategies on the per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them apart from DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for the realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
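The core DVFS trade-off described above, running communication-bound phases at a lower frequency while bounding the performance loss, can be sketched as follows. The frequency list, the linear slowdown model, and the loss budget are all illustrative assumptions, not the system's actual policy:

```python
# Available CPU frequencies (GHz); numbers and the linear slowdown
# model below are illustrative assumptions, not measured data.
FREQS = [1.2, 1.6, 2.0, 2.4]

def pick_frequency(compute_fraction, loss_budget=0.05, f_max=max(FREQS)):
    """Pick the lowest frequency whose predicted slowdown of a phase
    stays within the loss budget. Communication phases are mostly
    network-bound, so only the compute fraction slows down."""
    for f in sorted(FREQS):  # ascending: first feasible is lowest
        slowdown = compute_fraction * (f_max / f - 1.0)
        if slowdown <= loss_budget:
            return f
    return f_max

print(pick_frequency(0.04))  # communication-dominated phase → 1.2
print(pick_frequency(0.60))  # compute-heavy phase → 2.4
```

A runtime system such as the one described would additionally account for frequency-switching overhead before committing to a change.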

  8. Lightweight and Statistical Techniques for Petascale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to perform root cause analysis. Work done under this project falls into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and on the Dyninst binary analysis and instrumentation toolkits.

  9. TESTING AND DEBUGGING OF DISTRIBUTED SOFTWARE

    OpenAIRE

    Cunha, José C.; Henryk Krawczyk

    2012-01-01

This paper introduces the topic of testing and debugging of distributed software in this special issue of the Computers and Artificial Intelligence Journal. A global picture is given of the problems involved in developing distributed applications in order to motivate the need for testing and debugging activities. The main issues and approaches of testing and debugging are surveyed, the focus being on the identification of current and future trends. We conclude by intro...

  10. Multi-purpose passive debugging for embedded wireless

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, we propose a multi-purpose passive debugging framework, called TinyDebug, for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...

  11. A Scalable Prescriptive Parallel Debugging Model

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Quarfot Nielsen, Niklas; Lee, Gregory L.;

    2015-01-01

Debugging is a critical step in the development of any parallel program. However, the traditional interactive debugging model, where users manually step through code and inspect their application, does not scale well even for current supercomputers due to its centralized nature. While lightweight...

  12. Exposing MPI Objects for Debugging

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.;

Developers rely on debuggers to inspect application state. In applications that use MPI, the Message Passing Interface, the MPI runtime contains an important part of this state. The MPI Tools Working Group has proposed an interface for MPI Handle Introspection. It allows debuggers and MPI...... implementations to cooperate in extracting information from MPI objects, information that can then be presented to the developer. MPI Handle Introspection provides a more general interface than previous work, such as Message Queue Dumping. We add support for introspection to the TotalView debugger and a...... development version of Open MPI. We explain the interactions between the debugger and MPI library and demonstrate how MPI Handle Introspection raises the abstraction level to simplify debugging of MPI-related programming errors....

  13. Cerebral Correlates of Automatic Associations Towards Performance Enhancing Substances

    OpenAIRE

    Schindler, Sebastian; Wolff, Wanja

    2015-01-01

    The direct assessment of explicit attitudes toward performance enhancing substances, for example Neuroenhancement or doping in sports, can be affected by social desirability biases and cheating attempts. According to Dual Process Theories of cognition, indirect measures like the Implicit Association Test (IAT) measure automatic associations toward a topic (as opposed to explicit attitudes measured by self-report measures). Such automatic associations are thought to occur rapidly and to evade ...

  14. TUNE: Compiler-Directed Automatic Performance Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-18

    This project has developed compiler-directed performance tuning technology targeting the Cray XT4 Jaguar system at Oak Ridge, which has multi-core Opteron nodes with SSE-3 SIMD extensions, and the Cray XE6 Hopper system at NERSC. To achieve this goal, we combined compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation, which have been developed by the PIs over the past several years. We examined DOE Office of Science applications to identify performance bottlenecks and apply our system to computational kernels that operate on dense arrays. Our goal for this performance-tuning technology has been to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, we aim to make our technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  15. Requirements for Automatic Performance Analysis - APART Technical Report

    OpenAIRE

    Riley, Graham D.; Gurd, John R.

    1999-01-01

    This report discusses the requirements for automatic performance analysis tools. The discussion proceeds by first examining the nature and purpose of performance analysis. This results in an identification of the sources of performance data available to the analysis process and some properties of the process itself. Consideration is then given to the automation of the process. Many environmental factors affecting the performance analysis process are identified leading to the definition of a s...

  16. Performance of automatic scanning microscope for nuclear emulsion experiments

    International Nuclear Information System (INIS)

The impressive improvements in scanning technology and methods have allowed nuclear emulsion to be used as a target in recent large experiments. We report the performance of an automatic scanning microscope for nuclear emulsion experiments. After successful calibration and alignment of the system, we have reached 99% tracking efficiency for minimum ionizing tracks penetrating through the emulsion films. The automatic scanning system has been used successfully for the scanning of emulsion films in the OPERA experiment and is planned for use in the next generation of nuclear emulsion experiments

  17. Performance of automatic scanning microscope for nuclear emulsion experiments

    Science.gov (United States)

    Güler, A. Murat; Altınok, Özgür

    2015-12-01

The impressive improvements in scanning technology and methods have allowed nuclear emulsion to be used as a target in recent large experiments. We report the performance of an automatic scanning microscope for nuclear emulsion experiments. After successful calibration and alignment of the system, we have reached 99% tracking efficiency for minimum ionizing tracks penetrating through the emulsion films. The automatic scanning system has been used successfully for the scanning of emulsion films in the OPERA experiment and is planned for use in the next generation of nuclear emulsion experiments.

  18. PGMPI: Automatically Verifying Self-Consistent MPI Performance Guidelines

    OpenAIRE

    Hunold, Sascha; Carpen-Amarie, Alexandra; Lübbe, Felix Donatus; Träff, Jesper Larsson

    2016-01-01

    The Message Passing Interface (MPI) is the most commonly used application programming interface for process communication on current large-scale parallel systems. Due to the scale and complexity of modern parallel architectures, it is becoming increasingly difficult to optimize MPI libraries, as many factors can influence the communication performance. To assist MPI developers and users, we propose an automatic way to check whether MPI libraries respect self-consistent performance guidelines....

  19. Debug automation from pre-silicon to post-silicon

    CERN Document Server

    Dehbashi, Mehdi

    2015-01-01

This book describes automated debugging approaches for the bugs and faults which appear at different abstraction levels of a hardware system. The authors employ a transaction-based debug approach for systems at the transaction level, asserting the correct relation of transactions. The automated debug approach for design bugs finds the potential fault candidates at the RTL and gate level of a circuit. Debug techniques for logic bugs and synchronization bugs are demonstrated, enabling readers to localize the most difficult bugs. Debug automation for electrical faults (delay faults) finds the potentially failing speedpaths in a circuit at gate level. The various debug approaches described achieve high diagnosis accuracy and reduce debugging time, shortening the IC development cycle and increasing the productivity of designers. Describes a unified framework for debug automation used at both pre-silicon and post-silicon stages; Provides approaches for debug automation of a hardware system at different levels of ...

  20. An automatic system for measuring road and tunnel lighting performance

    OpenAIRE

    Greffier, Florian; Charbonnier, Pierre; Tarel, Jean-Philippe; Boucher, Vincent; FOURNELA, Fabrice

    2015-01-01

    Various problems in different domains are related to the operation of the Human Visual System (HVS). This is notably the case when considering the driver's visual perception, and road safety in general. That is why several standards of road equipments are directly derived from human visual abilities and especially in road and tunnel lighting installations design. This paper introduces an automatic system for measuring road and tunnel lighting performance. The proposed device is based on an em...

  1. Automatic or Deliberate? Cerebral correlates of automatic associations towards performance enhancing substances

    Directory of Open Access Journals (Sweden)

    Sebastian eSchindler

    2015-12-01

Full Text Available The direct assessment of explicit attitudes towards performance enhancing substances, for example Neuroenhancement or doping in sports, can be affected by social desirability biases and cheating attempts. According to Dual Process Theories of cognition, indirect measures like the Implicit Association Test (IAT) measure automatic associations towards a topic (as opposed to explicit attitudes measured by self-report measures). Such automatic associations are thought to occur rapidly and to evade voluntary control. However, whether or not such indirect tests actually reflect automatic associations is difficult to validate. Electroencephalography's superior time resolution makes it possible to differentiate between highly automatic and more elaborate processing stages. We therefore examined at which processing stages cortical differences between negative and positive attitudes to doping occur, and whether or not these differences can be related to BIAT scores. We tested 42 university students (31 females, 24.43 ± 3.17 years old), who were requested to complete a brief doping IAT (BIAT) on attitudes towards doping. Cerebral activity during doping BIAT completion was assessed using high-density EEG. Behaviorally, participants' D-scores exhibited negative attitudes towards doping, represented by faster reaction times in the doping + dislike pairing task. Event-related potentials (ERPs) revealed earliest effects between 200 and 300 ms. Here, a relatively larger occipital positivity was found for the doping + dislike pairing task. Further, in the LPP time range between 400 and 600 ms a larger late positive potential was found for the doping + dislike pairing task over central regions. These LPP amplitude differences successfully predicted participants' BIAT D-scores. Results indicate that event-related potentials differentiate between positive and negative doping attitudes at stages of mid-latency. However, it seems that IAT scores can be predicted only by

  2. Cerebral Correlates of Automatic Associations Towards Performance Enhancing Substances.

    Science.gov (United States)

    Schindler, Sebastian; Wolff, Wanja

    2015-01-01

The direct assessment of explicit attitudes toward performance enhancing substances, for example Neuroenhancement or doping in sports, can be affected by social desirability biases and cheating attempts. According to Dual Process Theories of cognition, indirect measures like the Implicit Association Test (IAT) measure automatic associations toward a topic (as opposed to explicit attitudes measured by self-report measures). Such automatic associations are thought to occur rapidly and to evade voluntary control. However, whether or not such indirect tests actually reflect automatic associations is difficult to validate. Electroencephalography (EEG) has a superior time resolution which can differentiate between highly automatic and more elaborate processing stages. We therefore used EEG to examine at which processing stages cortical differences between negative and positive attitudes to doping occur, and whether or not these differences can be related to BIAT scores. We tested 42 university students (31 females, 24.43 ± 3.17 years old), who were requested to complete a brief doping IAT (BIAT) on attitudes toward doping. Cerebral activity during doping BIAT completion was assessed using high-density EEG. Behaviorally, participants D-scores exhibited negative attitudes toward doping, represented by faster reaction times in the doping + dislike pairing task. Event-related potentials (ERPs) revealed earliest effects between 200 and 300 ms. Here, a relatively larger occipital positivity was found for the doping + dislike pairing task. Further, in the LPP time range between 400 and 600 ms a larger late positive potential was found for the doping + dislike pairing task over central regions. These LPP amplitude differences successfully predicted participants' BIAT D-scores. Results indicate that event-related potentials differentiate between positive and negative doping attitudes at stages of mid-latency. However, it seems that IAT scores can be predicted only
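The D-score the abstract relies on is not defined in the record; a minimal sketch of a Greenwald-style IAT D-score computation, using hypothetical reaction times for the two pairing blocks, might look like:

```python
import statistics

# Hypothetical reaction times (ms) from the two BIAT pairing blocks.
rt_doping_dislike = [520, 540, 510, 530, 525]  # compatible block (faster)
rt_doping_like = [640, 660, 630, 650, 645]     # incompatible block (slower)

def d_score(compatible, incompatible):
    """Greenwald-style D-score sketch: difference of the block mean latencies
    divided by the standard deviation pooled over all trials."""
    pooled_sd = statistics.stdev(compatible + incompatible)
    return (statistics.mean(incompatible) - statistics.mean(compatible)) / pooled_sd

d = d_score(rt_doping_dislike, rt_doping_like)  # positive: doping + dislike is faster
```

A positive score here corresponds to the negative doping attitude reported behaviorally; the data and block labels are illustrative, not the study's.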

  3. Automatic performance tuning of parallel and accelerated seismic imaging kernels

    KAUST Repository

    Haberdar, Hakan

    2014-01-01

With the increased complexity and diversity of mainstream high performance computing systems, significant effort is required to tune parallel applications in order to achieve the best possible performance for each particular platform. This task is becoming increasingly challenging and requires a broader set of skills. Automatic performance tuning is becoming a must for optimizing applications such as Reverse Time Migration (RTM), widely used in seismic imaging for oil and gas exploration. An empirical-search-based auto-tuning approach is applied to the MPI communication operations of the parallel isotropic and tilted transverse isotropic kernels. The application of auto-tuning using the Abstract Data and Communication Library improved the performance of the MPI communications as well as developer productivity by providing a higher level of abstraction. Keeping productivity in mind, we opted for pragma-based programming for accelerated computation on the latest accelerated architectures, such as GPUs, using the fairly new OpenACC standard. The same auto-tuning approach is also applied to the OpenACC-accelerated seismic code to optimize the compute-intensive kernel of the Reverse Time Migration application. The application of this technique resulted in improved performance of the original code and in its ability to adapt to different execution environments.
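The record does not spell out the empirical search itself; in its simplest form, empirical auto-tuning times every variant in a small parameter space and keeps the fastest. A sketch, with a synthetic cost function standing in for actual kernel timings (the parameter names and space are made up for illustration):

```python
import itertools

def run_kernel(params):
    """Stand-in for timing one run of the real kernel; returns a synthetic
    'runtime' minimized at block size 64, unroll factor 4."""
    block, unroll = params
    return abs(block - 64) * 0.01 + abs(unroll - 4) * 0.02 + 1.0

def empirical_search(space):
    """Time every variant in the parameter space and keep the fastest."""
    best_params, best_time = None, float("inf")
    for params in itertools.product(*space):
        t = run_kernel(params)
        if t < best_time:
            best_params, best_time = params, t
    return best_params, best_time

space = [[16, 32, 64, 128], [1, 2, 4, 8]]  # block sizes x unroll factors
best_params, best_time = empirical_search(space)
```

Real auto-tuners replace the synthetic cost with wall-clock measurements and usually prune the space rather than enumerate it.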

  4. Lessons learned at 208K: Towards Debugging Millions of Cores

    Energy Technology Data Exchange (ETDEWEB)

    Lee, G L; Ahn, D H; Arnold, D C; de Supinski, B R; Legendre, M; Miller, B P; Schulz, M J; Liblit, B

    2008-04-14

    Petascale systems will present several new challenges to performance and correctness tools. Such machines may contain millions of cores, requiring that tools use scalable data structures and analysis algorithms to collect and to process application data. In addition, at such scales, each tool itself will become a large parallel application--already, debugging the full Blue-Gene/L (BG/L) installation at the Lawrence Livermore National Laboratory requires employing 1664 tool daemons. To reach such sizes and beyond, tools must use a scalable communication infrastructure and manage their own tool processes efficiently. Some system resources, such as the file system, may also become tool bottlenecks. In this paper, we present challenges to petascale tool development, using the Stack Trace Analysis Tool (STAT) as a case study. STAT is a lightweight tool that gathers and merges stack traces from a parallel application to identify process equivalence classes. We use results gathered at thousands of tasks on an Infiniband cluster and results up to 208K processes on BG/L to identify current scalability issues as well as challenges that will be faced at the petascale. We then present implemented solutions to these challenges and show the resulting performance improvements. We also discuss future plans to meet the debugging demands of petascale machines.
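The core idea the abstract attributes to STAT, merging stack traces into process equivalence classes, can be sketched as a simple grouping step over hypothetical call stacks (the traces below are invented for illustration, not STAT's data format):

```python
from collections import defaultdict

# Hypothetical call stacks gathered from four MPI tasks.
traces = {
    0: ("main", "solver", "MPI_Allreduce"),
    1: ("main", "solver", "MPI_Allreduce"),
    2: ("main", "io_write"),  # the task stuck somewhere else
    3: ("main", "solver", "MPI_Allreduce"),
}

def equivalence_classes(traces):
    """Merge tasks with identical call stacks into one class, so a developer
    inspects a handful of classes instead of every process."""
    classes = defaultdict(list)
    for task, trace in sorted(traces.items()):
        classes[trace].append(task)
    return dict(classes)

classes = equivalence_classes(traces)
```

At scale the grouping happens over a tree of tool daemons rather than in one dictionary, but the merge operation is the same.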

  5. Debugging systems-on-chip communication-centric and abstraction-based techniques

    CERN Document Server

    Vermeulen, Bart

    2014-01-01

This book describes an approach and supporting infrastructure to facilitate debugging the silicon implementation of a System-on-Chip (SOC), allowing its associated product to be introduced into the market more quickly.  Readers learn step-by-step the key requirements for debugging a modern, silicon SOC implementation, nine factors that complicate this debugging task, and a new debug approach that addresses these requirements and complicating factors.  The authors’ novel communication-centric, scan-based, abstraction-based, run/stop-based (CSAR) debug approach is discussed in detail, showing how it helps to meet debug requirements and address the nine previously identified factors that complicate debugging silicon implementations of SOCs. The authors also derive the debug infrastructure requirements to support debugging of a silicon implementation of an SOC with their CSAR debug approach. This debug infrastructure consists of a generic on-chip debug architecture, a configurable automated design-for-debug ...

6. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two

  7. Debugging: Finding, Fixing and Flailing, a Multi-Institutional Study of Novice Debuggers

    Science.gov (United States)

    Fitzgerald, Sue; Lewandowski, Gary; McCauley, Renee; Murphy, Laurie; Simon, Beth; Thomas, Lynda; Zander, Carol

    2008-01-01

    Debugging is often difficult and frustrating for novices. Yet because students typically debug outside the classroom and often in isolation, instructors rarely have the opportunity to closely observe students while they debug. This paper describes the details of an exploratory study of the debugging skills and behaviors of contemporary novice Java…

  8. Statistical Debugging of Programs written in Dynamic Programming Language : RUBY

    OpenAIRE

    Akhter, Adeel; Azhar, Hassan

    2010-01-01

Debugging is an important and critical phase of the software development process. Software debugging is a serious and demanding practice in functional, test-driven development. Software vendors encourage their programmers to practice test-driven development during the initial development phases to capture bug traces and the associated code coverage affected by diagnosed bugs. Application source code with fewer threats of bug existence or faulty executions is assumed as hig...

  9. Improving Statistical Language Model Performance with Automatically Generated Word Hierarchies

    CERN Document Server

    McMahon, J; Mahon, John Mc

    1995-01-01

    An automatic word classification system has been designed which processes word unigram and bigram frequency statistics extracted from a corpus of natural language utterances. The system implements a binary top-down form of word clustering which employs an average class mutual information metric. Resulting classifications are hierarchical, allowing variable class granularity. Words are represented as structural tags --- unique $n$-bit numbers the most significant bit-patterns of which incorporate class information. Access to a structural tag immediately provides access to all classification levels for the corresponding word. The classification system has successfully revealed some of the structure of English, from the phonemic to the semantic level. The system has been compared --- directly and indirectly --- with other recent word classification systems. Class based interpolated language models have been constructed to exploit the extra information supplied by the classifications and some experiments have sho...
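The clustering described above is driven by an average class mutual information metric computed over word bigrams. A small sketch of evaluating that metric for a given class assignment (the toy corpus and class labels are illustrative, not the paper's data):

```python
import math
from collections import Counter

corpus = "the cat sat on the mat the dog sat on the rug".split()

# Illustrative class assignment (determiners, nouns, verbs, prepositions).
cls = {"the": 0, "cat": 1, "dog": 1, "mat": 1, "rug": 1, "sat": 2, "on": 3}

def class_bigram_mi(words, classes):
    """Mutual information between the classes of adjacent words; a clustering
    algorithm of this kind merges classes so as to keep this quantity high."""
    pairs = list(zip(words, words[1:]))
    n = len(pairs)
    joint = Counter((classes[a], classes[b]) for a, b in pairs)
    left, right = Counter(), Counter()
    for (c1, c2), k in joint.items():
        left[c1] += k
        right[c2] += k
    return sum(
        (k / n) * math.log2((k / n) / ((left[c1] / n) * (right[c2] / n)))
        for (c1, c2), k in joint.items()
    )

mi = class_bigram_mi(corpus, cls)
```

Collapsing every word into a single class drives the metric to zero, which is why an informative partition scores higher.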

  10. The roots of stereotype threat: when automatic associations disrupt girls' math performance.

    Science.gov (United States)

    Galdi, Silvia; Cadinu, Mara; Tomasetto, Carlo

    2014-01-01

    Although stereotype awareness is a prerequisite for stereotype threat effects (Steele & Aronson, 1995), research showed girls' deficit under stereotype threat before the emergence of math-gender stereotype awareness, and in the absence of stereotype endorsement. In a study including 240 six-year-old children, this paradox was addressed by testing whether automatic associations trigger stereotype threat in young girls. Whereas no indicators were found that children endorsed the math-gender stereotype, girls, but not boys, showed automatic associations consistent with the stereotype. Moreover, results showed that girls' automatic associations varied as a function of a manipulation regarding the stereotype content. Importantly, girls' math performance decreased in a stereotype-consistent, relative to a stereotype-inconsistent, condition and automatic associations mediated the relation between stereotype threat and performance. PMID:23713580

  11. Automatic Eye Detection Error as a Predictor of Face Recognition Performance

    OpenAIRE

    Dutta, Abhishek; Veldhuis, Raymond; Spreeuwers, Luuk

    2014-01-01

Various facial image quality parameters, like pose, illumination, noise, and resolution, are known to be predictors of face recognition performance. However, there still remain many other properties of facial images that are not captured by the existing quality parameters. In this paper, we propose a novel image quality parameter called the Automatic Eye Detection Error (AEDE), which measures the difference between manually located and automatically detected eye coordinates. Our experiment res...

  12. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

In this paper, we introduce the design and implementation of ParaVT, which is a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottlenecks of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  13. Performance of data acceptance criteria over 50 months from an automatic real-time environmental radiation surveillance network

    Energy Technology Data Exchange (ETDEWEB)

Casanovas, R., E-mail: ramon.casanovas@urv.cat [Unitat de Física Mèdica, Facultat de Medicina i Ciències de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Morant, J.J. [Servei de Protecció Radiològica, Facultat de Medicina i Ciències de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); López, M. [Unitat de Física Mèdica, Facultat de Medicina i Ciències de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Servei de Protecció Radiològica, Facultat de Medicina i Ciències de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Hernández-Girón, I. [Unitat de Física Mèdica, Facultat de Medicina i Ciències de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Batalla, E. [Servei de Coordinació d'Activitats Radioactives, Departament d'Economia i Finances, Generalitat de Catalunya, ES-08018 Barcelona (Spain); Salvadó, M. [Unitat de Física Mèdica, Facultat de Medicina i Ciències de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain)

    2011-08-15

The automatic real-time environmental radiation surveillance network of Catalonia (Spain) comprises two subnetworks: one with 9 aerosol monitors, and the other with 8 Geiger monitors together with 2 water monitors located in the Ebre river. Since September 2006, several improvements have been implemented to obtain data of better quality and quantity, allowing a more accurate data analysis. However, several causes (natural causes, equipment failure, artificial external causes and incidents in nuclear power plants) may produce measured radiological values inconsistent with a station's own background, whether spurious values without significance or true radiological values. A data analysis over a 50-month period was therefore performed and made it possible to establish an easily implementable statistical criterion for finding those values that require special attention. This criterion proved a very useful tool for creating a properly debugged database and for giving a quick response to equipment failures or possible radiological incidents. This paper presents the results obtained from applying the criterion, including the figures for the expected, raw and debugged data, the percentages of missing data grouped by cause, and the radiological measurements from the networks. Finally, based on the discussed information, recommendations for the improvement of the network are identified to obtain better radiological information and analysis capabilities. - Highlights: > Causes producing data inconsistent with each station's own background are described. > Causes may be natural, equipment failure, external, or nuclear plant incidents. > These causes can produce either spurious or true radiological data. > A criterion to find these data was implemented and tested over a 50-month period. > Recommendations for the improvement of the network are identified.
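The record does not give the statistical criterion itself; a simple sketch of one plausible acceptance test flags readings that stray more than k standard deviations from a station's recent history (the threshold, units, and data are illustrative, not the paper's actual criterion):

```python
import statistics

def flag_anomalies(readings, k=3.0):
    """Flag readings lying more than k standard deviations from the mean of
    the series. A real network would estimate the background from a
    previously debugged history rather than from the raw series itself."""
    mu = statistics.mean(readings)
    sigma = statistics.stdev(readings)
    return [x for x in readings if abs(x - mu) > k * sigma]

# 20 normal background dose-rate readings plus one spike (made-up units).
readings = [0.1] * 20 + [0.5]
flagged = flag_anomalies(readings)
```

Flagged values then require the special attention the abstract mentions: they may be spurious (equipment failure, external interference) or genuine radiological events.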

  14. Performance of data acceptance criteria over 50 months from an automatic real-time environmental radiation surveillance network

    International Nuclear Information System (INIS)

The automatic real-time environmental radiation surveillance network of Catalonia (Spain) comprises two subnetworks: one with 9 aerosol monitors, and the other with 8 Geiger monitors together with 2 water monitors located in the Ebre river. Since September 2006, several improvements have been implemented to obtain data of better quality and quantity, allowing a more accurate data analysis. However, several causes (natural causes, equipment failure, artificial external causes and incidents in nuclear power plants) may produce measured radiological values inconsistent with a station's own background, whether spurious values without significance or true radiological values. A data analysis over a 50-month period was therefore performed and made it possible to establish an easily implementable statistical criterion for finding those values that require special attention. This criterion proved a very useful tool for creating a properly debugged database and for giving a quick response to equipment failures or possible radiological incidents. This paper presents the results obtained from applying the criterion, including the figures for the expected, raw and debugged data, the percentages of missing data grouped by cause, and the radiological measurements from the networks. Finally, based on the discussed information, recommendations for the improvement of the network are identified to obtain better radiological information and analysis capabilities. - Highlights: → Causes producing data inconsistent with each station's own background are described. → Causes may be natural, equipment failure, external, or nuclear plant incidents. → These causes can produce either spurious or true radiological data. → A criterion to find these data was implemented and tested over a 50-month period. → Recommendations for the improvement of the network are identified.

15. Compiler-Directed Automatic Performance Tuning (TUNE) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Chame, Jacqueline [USC-ISI

    2013-06-07

    TUNE was created to develop compiler-directed performance tuning technology targeting the Cray XT4 system at Oak Ridge. TUNE combines compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation. The goal of this performance-tuning technology is to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, TUNE aims to make compiler technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  16. AutomaDeD: Automata-Based Debugging for Dissimilar Parallel Tasks

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Laguna, I; Bagchi, S; de Supinski, B R; Ahn, D; Schulz, M

    2010-03-23

Today's largest systems have over 100,000 cores, with million-core systems expected over the next few years. This growing scale makes debugging the applications that run on them a daunting challenge. Few debugging tools perform well at this scale and most provide an overload of information about the entire job. Developers need tools that quickly direct them to the root cause of the problem. This paper presents AutomaDeD, a tool that identifies which tasks of a large-scale application first manifest a bug at a specific code region at a specific point during program execution. AutomaDeD creates a statistical model of the application's control-flow and timing behavior that organizes tasks into groups and identifies deviations from normal execution, thus significantly reducing debugging effort. In addition to a case study in which AutomaDeD locates a bug that occurred during development of MVAPICH, we evaluate AutomaDeD on a range of bugs injected into the NAS parallel benchmarks. Our results demonstrate that AutomaDeD detects the time period when a bug first manifested itself with 90% accuracy for stalls and hangs and 70% accuracy for interference faults. It identifies the subset of processes first affected by the fault with 80% and 70% accuracy, respectively, and the code region where the fault first manifested with 90% and 50% accuracy, respectively.

  17. Can search algorithms save large-scale automatic performance tuning?

    Energy Technology Data Exchange (ETDEWEB)

    Balaprakash, P.; Wild, S. M.; Hovland, P. D. (Mathematics and Computer Science)

    2011-01-01

    Empirical performance optimization of computer codes using autotuners has received significant attention in recent years. Given the increased complexity of computer architectures and scientific codes, evaluating all possible code variants is prohibitively expensive for all but the simplest kernels. One way for autotuners to overcome this hurdle is through use of a search algorithm that finds high-performing code variants while examining relatively few variants. In this paper we examine the search problem in autotuning from a mathematical optimization perspective. As an illustration of the power and limitations of this optimization, we conduct an experimental study of several optimization algorithms on a number of linear algebra kernel codes. We find that the algorithms considered obtain performance gains similar to the optimal ones found by complete enumeration or by large random searches but in a tiny fraction of the computation time.
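The trade-off the abstract describes, search algorithms finding near-optimal variants while examining only a few, can be illustrated by comparing complete enumeration with random search over a synthetic runtime surface (the surface and sample size are made up for illustration):

```python
import random

def runtime(variant):
    """Synthetic runtime surface over a 2-D space of code variants."""
    i, j = variant
    return (i - 13) ** 2 + (j - 7) ** 2 + 10.0

space = [(i, j) for i in range(32) for j in range(32)]  # 1024 variants

# Complete enumeration: guaranteed optimum, but 1024 evaluations.
best_exhaustive = min(space, key=runtime)

# Random search: ~5% of the evaluations, usually a near-optimal variant.
random.seed(0)
best_sampled = min(random.sample(space, 50), key=runtime)
```

On real kernels each evaluation is a compile-and-run, so trimming 1024 evaluations to 50 is precisely the saving the paper investigates; mathematical optimization algorithms can do better still by exploiting structure in the surface.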

  18. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-holds-barred look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our attention...

  19. 75 FR 48552 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Science.gov (United States)

    2010-08-11

    ... (ADS-B) Out Performance Requirements To Support Air Traffic Control (ATC) Service; OMB Approval of..., ``Automatic Dependent Surveillance- Broadcast (ADS-B) Out Performance Requirements To Support Air Traffic... rule, ``Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support...

  20. Automatic Assessment of Complex Task Performance in Games and Simulations. CRESST Report 775

    Science.gov (United States)

    Iseli, Markus R.; Koenig, Alan D.; Lee, John J.; Wainess, Richard

    2010-01-01

    Assessment of complex task performance is crucial to evaluating personnel in critical job functions such as Navy damage control operations aboard ships. Games and simulations can be instrumental in this process, as they can present a broad range of complex scenarios without involving harm to people or property. However, "automatic" performance…

  1. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
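Record-level backward tracing over logical provenance, as described above, amounts to a graph walk from an output record back to base inputs. A minimal sketch (the node names, record identifiers, and data layout are hypothetical, not Panda's API):

```python
# Hypothetical logical provenance: for each processing node, a mapping from
# each output record to the input records that contributed to it.
provenance = {
    "join":   {"out1": ["a1", "b1"], "out2": ["a2", "b1"]},
    "filter": {"a1": ["raw1"], "a2": ["raw2"]},
}

def backward_trace(record, provenance):
    """Walk provenance mappings from an output record back to base inputs."""
    sources, frontier = set(), [record]
    while frontier:
        rec = frontier.pop()
        found = False
        for node_map in provenance.values():
            if rec in node_map:
                frontier.extend(node_map[rec])
                found = True
        if not found:  # no node produced this record, so it is a base input
            sources.add(rec)
    return sources

sources = backward_trace("out1", provenance)
```

Forward tracing is the mirror image: invert each mapping and walk from an input record toward the outputs it influenced, which is what makes drilling down on a suspicious result practical.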

  2. Prototype application for the control and debugging of CMS upgrade projects

    CERN Document Server

    Mills-Howell, Dominic

    2016-01-01

Following the high-luminosity upgrades of the LHC, many subsystems of the CMS experiment require upgrading, and others are using the LHC shutdowns as an opportunity to improve performance. The upgrades themselves have served to highlight the urgency of attacking problems that were previously unaddressed. One such problem is the need for a tool that allows users to easily monitor, debug, and test custom hardware. Such a tool could, in theory, be abstracted to work with various hardware devices, with the added benefit of being able to support future hardware while maintaining parallel operation with the remaining control software.

  3. Memory Debug Technique Using March17N BIST

    OpenAIRE

    Ms. Zeenath

    2014-01-01

A memory debug technique plays a key role in system-on-chip (SOC) product development and yield ramp-up. Diagnosis techniques are essential during the rapid development of semiconductor memories, catching design and manufacturing failures and improving overall yield and quality. Conventional failure analysis (FA) based on bitmaps and the experience of the FA engineer is time consuming and error prone. The increasing time-to-volume pressu...

  4. SPC for Software Reliability-Imperfect Software Debugging Model

    OpenAIRE

    R Satya Prasad; Supriya, N.; G. Krishna Mohan

    2011-01-01

Software reliability processes can be monitored efficiently by using Statistical Process Control (SPC). It assists the software development team in identifying failures and actions to be taken during the software failure process and hence assures better software reliability. In this paper, we consider a software reliability growth model based on a Non-Homogeneous Poisson Process (NHPP) that incorporates the imperfect debugging problem. The proposed model utilizes the failure data collected from software de...
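The record names an NHPP-based growth model with imperfect debugging but does not give its form; one commonly used simplification scales the detection rate of the Goel-Okumoto mean value function by the probability p that a debugging attempt actually removes the fault (the functional form and parameters here are illustrative, not the paper's fitted model):

```python
import math

def mean_failures(t, a=100.0, b=0.05, p=0.9):
    """Expected cumulative failures by time t, m(t) = a * (1 - exp(-b*p*t)):
    a Goel-Okumoto-style NHPP whose detection rate b is scaled by the
    probability p that a debugging attempt actually removes the fault.
    Parameters a (total faults), b (detection rate) are illustrative."""
    return a * (1.0 - math.exp(-b * p * t))
```

SPC then monitors the observed failure counts against control limits derived from such a mean value function, signalling when the failure process drifts out of control.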

  5. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    OpenAIRE

    Luo Hanwu; Li Mengke; Xu Xinyao; Cui Shigang; Han Yin; Yan Kai; Wang Jing; Le Jian

    2016-01-01

This paper presents a novel method to solve the initial lightning breakdown current by combining the ATP and MATLAB simulation software effectively, with the aim of evaluating the lightning protection performance of transmission lines. Firstly, the executable ATP simulation model is generated automatically according to the required information such as power source parameters, tower parameters, overhead line parameters, grounding resistance and lightning current parameters, etc. through an interface p...

  6. A debugging system for azimuthally acoustic logging tools based on modular and hierarchical design ideas

    Science.gov (United States)

    Zhang, K.; Ju, X. D.; Lu, J. Q.; Men, B. Y.

    2016-08-01

On the basis of modular and hierarchical design ideas, this study presents a debugging system for an azimuthally sensitive acoustic bond tool (AABT). The debugging system includes three parts: a personal computer (PC), an embedded front-end machine and function expansion boards. Modular and hierarchical design ideas are applied throughout the design and debugging processes. The PC communicates with the front-end machine via the Internet, and the front-end machine and function expansion boards are connected by an extended parallel bus. In this way, the three parts of the debugging system achieve stable, high-speed data communication. This study introduces not only the system-level and sub-system-level debugging of the tool but also the debugging of the analogue signal processing board, which is important and widely used in logging tools. Experiments illustrate that the debugging system can greatly improve AABT verification and calibration efficiency, and that board-level debugging can examine and improve analogue signal processing boards. The design thinking is clear and the design structure is reasonable, making the debugging system easy to extend and upgrade.

  7. A Framework to Debug Diagnostic Matrices

    Science.gov (United States)

    Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann

    2013-01-01

Diagnostics is an important concept in system health and monitoring of space operations. Many existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also known as a diagnostic dictionary, fault signature matrix or reachability matrix) gleaned from physical models. Sometimes, however, this matrix is not coherent enough to obtain high diagnostic performance. In such a case, it is important to modify the D-matrix based on knowledge obtained from other sources, such as time-series data streams (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure includes conditions ranging from modifying 0s and 1s in the matrix to adding or removing rows (failure sources) and columns (tests). We evaluate this framework on datasets from the DX Challenge 2009.
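
    To make the D-matrix idea concrete, here is a toy sketch (the failure sources, tests, and repair step are invented for illustration; the actual DME procedure is more elaborate). Rows are failure sources, columns are tests, and a 1 means the failure causes the test to fail; diagnosis matches observed test outcomes against row signatures, and "debugging" the matrix means flipping an entry that contradicts field data:

```python
# Hypothetical D-matrix: rows = failure sources, columns = three tests.
D = {
    "pump":   [1, 1, 0],
    "valve":  [0, 1, 1],
    "sensor": [0, 0, 1],
}

def diagnose(d_matrix, observed):
    """Return failure sources whose signatures match the observed
    test outcomes (1 = test failed)."""
    return [f for f, sig in d_matrix.items() if sig == observed]

print(diagnose(D, [0, 1, 1]))   # -> ['valve']

# A DME-style repair step: if maintenance data show that 'pump' also
# trips test 3, flipping D['pump'][2] from 0 to 1 restores a coherent
# diagnosis for the outcome [1, 1, 1].
D["pump"][2] = 1
print(diagnose(D, [1, 1, 1]))   # -> ['pump']
```

    Before the flip, the outcome [1, 1, 1] matches no row at all, which is precisely the kind of incoherence the framework is designed to detect and repair.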

  8. SU-E-J-29: Automatic Image Registration Performance of Three IGRT Systems for Prostate Radiotherapy

    International Nuclear Information System (INIS)

Purpose: To compare the performance of an automatic image registration algorithm on image sets collected on three commercial image guidance systems, and explore its relationship with imaging parameters such as dose and sharpness. Methods: Images of a CIRS Virtually Human Male Pelvis phantom (VHMP) were collected on the CBCT systems of Varian TrueBeam/OBI and Elekta Synergy/XVI linear accelerators, across a range of mAs settings; and MVCT on a Tomotherapy Hi-ART accelerator with a range of pitch. Using the 6D correlation ratio algorithm of XVI, each image was registered to a mask of the prostate volume with a 5 mm expansion. Registrations were repeated 100 times, with random initial offsets introduced to simulate daily matching. Residual registration errors were calculated by correcting for the initial phantom set-up error. Automatic registration was also repeated after reconstructing images with different sharpness filters. Results: All three systems showed good registration performance, with residual translations <0.5 mm (1σ) for typical clinical dose and reconstruction settings. Residual rotational error had a larger range, with 0.8°, 1.2° and 1.9° (1σ) in XVI, OBI and Tomotherapy, respectively. The registration accuracy of XVI images showed a strong dependence on imaging dose, particularly below 4 mGy. No evidence of reduced performance was observed at the lowest dose settings for OBI and Tomotherapy, but these were above 4 mGy. Registration failures (maximum target registration error > 3.6 mm on the surface of a 30 mm sphere) occurred in 5% to 10% of registrations. Changing the sharpness of image reconstruction had no significant effect on registration performance. Conclusions: Using the present automatic image registration algorithm, all IGRT systems tested provided satisfactory registrations for clinical use, within a normal range of acquisition settings.

  9. SU-E-J-29: Automatic Image Registration Performance of Three IGRT Systems for Prostate Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Barber, J [Sydney West Radiation Oncology Network, Sydney, NSW (Australia); University of Sydney, Sydney, NSW (Australia); Sykes, J [Sydney West Radiation Oncology Network, Sydney, NSW (Australia); Holloway, L [Ingham Institute, Sydney, NSW (Australia); University of Sydney, Sydney, NSW (Australia); Thwaites, D [University of Sydney, Sydney, NSW (Australia)

    2015-06-15

Purpose: To compare the performance of an automatic image registration algorithm on image sets collected on three commercial image guidance systems, and explore its relationship with imaging parameters such as dose and sharpness. Methods: Images of a CIRS Virtually Human Male Pelvis phantom (VHMP) were collected on the CBCT systems of Varian TrueBeam/OBI and Elekta Synergy/XVI linear accelerators, across a range of mAs settings; and MVCT on a Tomotherapy Hi-ART accelerator with a range of pitch. Using the 6D correlation ratio algorithm of XVI, each image was registered to a mask of the prostate volume with a 5 mm expansion. Registrations were repeated 100 times, with random initial offsets introduced to simulate daily matching. Residual registration errors were calculated by correcting for the initial phantom set-up error. Automatic registration was also repeated after reconstructing images with different sharpness filters. Results: All three systems showed good registration performance, with residual translations <0.5 mm (1σ) for typical clinical dose and reconstruction settings. Residual rotational error had a larger range, with 0.8°, 1.2° and 1.9° (1σ) in XVI, OBI and Tomotherapy, respectively. The registration accuracy of XVI images showed a strong dependence on imaging dose, particularly below 4 mGy. No evidence of reduced performance was observed at the lowest dose settings for OBI and Tomotherapy, but these were above 4 mGy. Registration failures (maximum target registration error > 3.6 mm on the surface of a 30 mm sphere) occurred in 5% to 10% of registrations. Changing the sharpness of image reconstruction had no significant effect on registration performance. Conclusions: Using the present automatic image registration algorithm, all IGRT systems tested provided satisfactory registrations for clinical use, within a normal range of acquisition settings.

  10. Automatic target recognition performance losses in the presence of atmospheric and camera effects

    Science.gov (United States)

    Chen, Xiaohan; Schmid, Natalia A.

    2010-04-01

    The importance of networked automatic target recognition systems for surveillance applications is continuously increasing. Because of the requirement of a low cost and limited payload, these networks are traditionally equipped with lightweight, low-cost sensors such as electro-optical (EO) or infrared sensors. The quality of imagery acquired by these sensors critically depends on the environmental conditions, type and characteristics of sensors, and absence of occluding or concealing objects. In the past, a large number of efficient detection, tracking, and recognition algorithms have been designed to operate on imagery of good quality. However, detection and recognition limits under nonideal environmental and/or sensor-based distortions have not been carefully evaluated. We introduce a fully automatic target recognition system that involves a Haar-based detector to select potential regions of interest within images, performs adjustment of detected regions, segments potential targets using a region-based approach, identifies targets using Bessel K form-based encoding, and performs clutter rejection. We investigate the effects of environmental and camera conditions on target detection and recognition performance. Two databases are involved. One is a simulated database generated using a 3-D tool. The other database is formed by imaging 10 die-cast models of military vehicles from different elevation and orientation angles. The database contains imagery acquired both indoors and outdoors. The indoors data set is composed of clear and distorted images. The distortions include defocus blur, sided illumination, low contrast, shadows, and occlusions. All images in this database, however, have a uniform (blue) background. The indoors database is applied to evaluate the degradations of recognition performance due to camera and illumination effects. The database collected outdoors includes a real background and is much more complex to process. The numerical results

  11. Automatic image registration performance for two different CBCT systems; variation with imaging dose

    Science.gov (United States)

    Barber, J.; Sykes, J. R.; Holloway, L.; Thwaites, D. I.

    2014-03-01

    The performance of an automatic image registration algorithm was compared on image sets collected with two commercial CBCT systems, and the relationship with imaging dose was explored. CBCT images of a CIRS Virtually Human Male Pelvis phantom (VHMP) were collected on Varian TrueBeam/OBI and Elekta Synergy/XVI linear accelerators, across a range of mAs settings. Each CBCT image was registered 100 times, with random initial offsets introduced. Image registration was performed using the grey value correlation ratio algorithm in the Elekta XVI software, to a mask of the prostate volume with 5 mm expansion. Residual registration errors were calculated after correcting for the initial introduced phantom set-up error. Registration performance with the OBI images was similar to that of XVI. There was a clear dependence on imaging dose for the XVI images with residual errors increasing below 4mGy. It was not possible to acquire images with doses lower than ~5mGy with the OBI system and no evidence of reduced performance was observed at this dose. Registration failures (maximum target registration error > 3.6 mm on the surface of a 30mm sphere) occurred in 5% to 9% of registrations except for the lowest dose XVI scan (31%). The uncertainty in automatic image registration with both OBI and XVI images was found to be adequate for clinical use within a normal range of acquisition settings.
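
    Several records above use the same failure criterion: maximum target registration error (TRE) over the surface of a 30 mm sphere. As a hedged sketch of that metric (assuming a 30 mm radius, a small residual rotation about one axis, and a residual translation; the papers' exact convention may differ), the worst-case point displacement can be brute-forced:

```python
# Illustrative computation of max TRE on a sphere surface for a residual
# rigid-body error (rotation about z plus translation). Radius and the
# single-axis rotation are simplifying assumptions for the demo.
import math

def max_tre_on_sphere(rot_deg, translation_mm, radius_mm=30.0, samples=2000):
    """Worst-case displacement of sphere-surface points under a z-axis
    rotation followed by a translation, sampled on a golden-angle spiral."""
    theta = math.radians(rot_deg)
    worst = 0.0
    for i in range(samples):
        z = radius_mm * (2 * i / (samples - 1) - 1)
        r = math.sqrt(max(radius_mm**2 - z**2, 0.0))
        phi = i * 2.399963  # golden angle, for even surface coverage
        x, y = r * math.cos(phi), r * math.sin(phi)
        xr = x * math.cos(theta) - y * math.sin(theta)
        yr = x * math.sin(theta) + y * math.cos(theta)
        dx = xr + translation_mm[0] - x
        dy = yr + translation_mm[1] - y
        dz = translation_mm[2]
        worst = max(worst, math.sqrt(dx * dx + dy * dy + dz * dz))
    return worst

# A 1 degree rotation alone moves a point at 30 mm radius by
# 2 * 30 * sin(0.5 deg) ~ 0.52 mm; translation adds on top where aligned.
print(round(max_tre_on_sphere(1.0, (0.5, 0.0, 0.0)), 2))
```

    This shows why the rotational residuals reported above (up to ~1.9° at 1σ) matter clinically: at the prostate periphery they translate into millimetre-scale displacements even before any translational error is added.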

  12. Automatic image registration performance for two different CBCT systems; variation with imaging dose

    International Nuclear Information System (INIS)

    The performance of an automatic image registration algorithm was compared on image sets collected with two commercial CBCT systems, and the relationship with imaging dose was explored. CBCT images of a CIRS Virtually Human Male Pelvis phantom (VHMP) were collected on Varian TrueBeam/OBI and Elekta Synergy/XVI linear accelerators, across a range of mAs settings. Each CBCT image was registered 100 times, with random initial offsets introduced. Image registration was performed using the grey value correlation ratio algorithm in the Elekta XVI software, to a mask of the prostate volume with 5 mm expansion. Residual registration errors were calculated after correcting for the initial introduced phantom set-up error. Registration performance with the OBI images was similar to that of XVI. There was a clear dependence on imaging dose for the XVI images with residual errors increasing below 4mGy. It was not possible to acquire images with doses lower than ∼5mGy with the OBI system and no evidence of reduced performance was observed at this dose. Registration failures (maximum target registration error > 3.6 mm on the surface of a 30mm sphere) occurred in 5% to 9% of registrations except for the lowest dose XVI scan (31%). The uncertainty in automatic image registration with both OBI and XVI images was found to be adequate for clinical use within a normal range of acquisition settings.

  13. Automatic and Parallel Optimized Learning for Neural Networks performing MIMO Applications

    Directory of Open Access Journals (Sweden)

    LAUDANI, A.

    2013-02-01

Full Text Available An automatic and optimized approach based on multivariate function decomposition is presented to face Multi-Input-Multi-Output (MIMO) applications by using Single-Input-Single-Output (SISO) feed-forward Neural Networks (NNs). Indeed, the learning time and the computational costs are often too large for an effective use of MIMO NNs. Since deriving MISO neural models from a single MIMO NN is frequently adopted in the literature, the proposed method introduces three further steps: 1) a further decomposition; 2) a learning optimization; 3) parallel training to speed up the process. Starting from a MISO NN, a collection of SISO NNs can be obtained by means of a multi-dimensional Singular Value Decomposition (SVD). Then, a general approach for the learning optimization of SISO NNs is applied. It is based on the observation that the performance of SISO NNs improves in terms of generalization and robustness against noise under suitable learning conditions. Thus, each SISO NN is trained and optimized using limited training data, which allows a significant decrease in computational costs. Moreover, a parallel architecture can easily be implemented. Consequently, the presented approach performs an automatic conversion of a MIMO NN into a collection of parallel-optimized SISO NNs. Experimental results are presented.
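
    The core idea of the record above, SVD-based separation of a multivariate map into single-input factors, can be sketched numerically (this is an assumption-laden toy, not the paper's algorithm: a sampled two-input function is split into a short sum of products of one-input profiles, each of which a SISO network could then learn):

```python
# Sketch: separate a sampled two-input map F(x, y) into rank-1 terms
# F ~ sum_k s_k * u_k(x) * v_k(y) via SVD. The test function is invented.
import numpy as np

x = np.linspace(0, 1, 50)
y = np.linspace(0, 1, 60)
F = np.outer(np.sin(2 * np.pi * x), np.cos(np.pi * y)) + 0.3 * np.outer(x, y**2)

U, s, Vt = np.linalg.svd(F, full_matrices=False)

# Keep the first r separable terms.
r = 2
F_approx = (U[:, :r] * s[:r]) @ Vt[:r, :]

rel_err = np.linalg.norm(F - F_approx) / np.linalg.norm(F)
print(f"rank-{r} relative error: {rel_err:.2e}")
```

    Here the test map is a sum of two separable terms, so two SVD components reproduce it to machine precision; for a general map, the decay of the singular values tells you how many SISO factors are worth training.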

  14. The ALDB box: automatic testing of cognitive performance in groups of aviary-housed pigeons.

    Science.gov (United States)

    Huber, Ludwig; Heise, Nils; Zeman, Christopher; Palmers, Christian

    2015-03-01

The combination of highly controlled experimental testing and the voluntary participation of unrestrained animals has many advantages over traditional, laboratory-based learning environments in terms of animal welfare, learning speed, and resource economy. Such automatic learning environments have recently been developed for primates (Fagot & Bonté, 2010; Fagot & Paleressompoulle, 2009) but such an environment had not, so far, been achieved with highly mobile creatures such as birds. Here, we present a novel testing environment for pigeons. Living together in small groups in outside aviaries, they can freely choose to participate in learning experiments by entering and leaving the automatic learning box at any time. At the single-access entry, they are individualized using radio frequency identification technology and then trained or tested in a stress-free and self-terminating manner. The voluntary nature of their participation according to their individual biorhythm guarantees high motivation levels and good learning and test performance. Around-the-clock access allows for massed-trials training, which in baboons has been proven to have facilitative effects on discrimination learning. The performance of 2 pigeons confirmed the advantages of the ALDB (automatic learning device for birds) box. The latter is the result of a development process of several years that required us to deal with and overcome a number of technical challenges: (1) mechanically controlled access to the box, (2) identification of the birds, (3) the release of a bird and, at the same time, prevention of others from entering the box, and (4) reliable functioning of the device despite long operation times and exposure to high dust loads and low temperatures. PMID:24737096

  15. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    Directory of Open Access Journals (Sweden)

    Luo Hanwu

    2016-01-01

Full Text Available This paper presents a novel method to solve the initial lightning breakdown current by combining the ATP and MATLAB simulation software effectively, with the aim of evaluating the lightning protection performance of transmission lines. Firstly, the executable ATP simulation model is generated automatically according to the required information such as power source parameters, tower parameters, overhead line parameters, grounding resistance and lightning current parameters, etc. through an interface program coded in MATLAB. Then, data are extracted from the LIS files obtained by executing the ATP simulation model, and the occurrence of transmission line breakdown can be determined from the relevant data in the LIS file. The lightning current amplitude is reduced when breakdown occurs, and increased otherwise. Thus the initial lightning breakdown current of a transmission line with given parameters can be determined accurately by continuously changing the lightning current amplitude, which is realized by a loop computing algorithm coded in MATLAB. The method proposed in this paper can generate the ATP simulation program automatically, and facilitates the lightning protection performance assessment of transmission lines.
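
    The search loop described above, repeatedly adjusting the lightning current amplitude until the smallest amplitude causing breakdown is bracketed, is essentially a bisection on a monotone pass/fail test. A hedged sketch (the ATP run and LIS-file parsing are replaced by a stand-in threshold function; the real tool regenerates the ATP model at each step):

```python
# Sketch of the amplitude-search loop. breakdown_occurs() mocks
# "run ATP, parse the LIS file"; the 87.4 kA threshold is invented.

def breakdown_occurs(amplitude_kA, threshold_kA=87.4):
    """Stand-in for an ATP simulation: True if the line flashes over."""
    return amplitude_kA >= threshold_kA

def initial_breakdown_current(lo=0.0, hi=300.0, tol=0.1):
    """Bisect for the smallest amplitude that causes breakdown,
    assuming breakdown occurs at hi and not at lo."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if breakdown_occurs(mid):
            hi = mid     # breakdown: reduce the amplitude
        else:
            lo = mid     # no breakdown: raise it
    return hi

print(round(initial_breakdown_current(), 1))  # converges near the threshold
```

    Bisection needs only about log2(300 / 0.1) ~ 12 simulation runs per line configuration, which is what makes the automated assessment practical when each run means regenerating and executing an ATP model.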

  16. Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2013-12-01

Full Text Available This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of the alternating tapping performance of patients with Parkinson’s disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD used a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well with the visually assessed scores and were significantly different across Unified Parkinson’s Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, good ability to discriminate between healthy elderly subjects and patients in different disease stages, good sensitivity to treatment interventions, and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.
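
    The first processing step, turning raw tap timestamps into quantitative parameters, can be illustrated with a minimal sketch (the feature names mirror two of the paper's dimensions, but the formulas and sample data here are illustrative assumptions, not the 24 published parameters):

```python
# Toy features from tap timestamps: tapping rate ('speed') and the
# coefficient of variation of inter-tap intervals (a crude 'arrhythmia').
from statistics import mean, stdev

def tapping_features(timestamps):
    """Return (taps per second, CV of inter-tap intervals)."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    speed = len(intervals) / (timestamps[-1] - timestamps[0])
    arrhythmia = stdev(intervals) / mean(intervals)
    return speed, arrhythmia

regular   = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]   # perfectly even taps
irregular = [0.0, 0.3, 1.1, 1.4, 2.3, 2.5]   # same rate, uneven timing
for name, taps in [("regular", regular), ("irregular", irregular)]:
    speed, arr = tapping_features(taps)
    print(f"{name}: {speed:.1f} taps/s, arrhythmia CV = {arr:.2f}")
```

    Both sequences have the same overall rate, but only the CV-style measure separates them, which is why a multi-dimensional parameter set followed by PCA is needed rather than a single speed score.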

  17. Effect of an automatic feeding system on growth performance and feeding behaviour of pigs reared outdoors

    Directory of Open Access Journals (Sweden)

    Riccardo Fortina

    2010-01-01

Full Text Available Nine Mora Romagnola and 10 Large White x Mora Romagnola growing pigs were reared outdoors. In both groups feed was provided ad libitum. The conventional pigs received it twice a day, distributed in two long troughs. Inside the corral of the second group, an automatic station was set up for feed distribution, pig weighing, and control by an analog camera. Thus the self-feeders received feed ad libitum individually from the automatic system, divided into small quantities at meal times. During the experiment the analog camera was used 24 hours a day to collect pictures of the pigs in order to investigate their behaviours. For each picture the day and hour, the number of visible pigs and their behaviours were recorded, and a statistical analysis of the data, expressed as hourly frequencies of behavioural elements, was performed. Moreover, to highlight “active” and “passive” behaviours between the groups, two categories, “Move” and “Rest”, were created by grouping behavioural elements. With regard to performance, the conventional pigs reached a higher total weight gain (56.1±2.42 kg vs 46.7±2.42 kg; P=0.0117), but the feed conversion index (FCI) of both groups was similar. The self-feeders consumed less feed than the conventional animals. The feeding system seems to influence behaviour: the percentage of time spent in eating activity differs (P<0.0001) between the self-fed (median 24.6%) and conventional pigs (median 10.9%). The resulting more regular eating pattern of the self-feeders influenced the distribution of daily activities. The behavioural category Rest (median: self-feeders 55.0% vs 71.4% conventional pigs) was dominant, with conventional pigs becoming more restless, particularly at meal times. This kind of feeding competition and aggressive behaviour did not occur in the self-feeders, owing to the feed distribution system. The self-feeder results showed that pigs eat at the automatic station both day and night. The animals perform on

  18. Transducer-actuator systems and methods for performing on-machine measurements and automatic part alignment

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, William E.; Dow, Thomas A.; Garrard, Kenneth P.; Marston, Zachary

    2016-07-12

    Systems and methods for performing on-machine measurements and automatic part alignment, including: a measurement component operable for determining the position of a part on a machine; and an actuation component operable for adjusting the position of the part by contacting the part with a predetermined force responsive to the determined position of the part. The measurement component consists of a transducer. The actuation component consists of a linear actuator. Optionally, the measurement component and the actuation component consist of a single linear actuator operable for contacting the part with a first lighter force for determining the position of the part and with a second harder force for adjusting the position of the part. The actuation component is utilized in a substantially horizontal configuration and the effects of gravitational drop of the part are accounted for in the force applied and the timing of the contact.

  19. Installation and debugging of the hydrostatic leveling system at SSRF

    International Nuclear Information System (INIS)

In this paper, we discuss the technical problems encountered during the installation and debugging of the hydrostatic leveling system (HLS) established on the linac, booster and storage ring of the Shanghai Synchrotron Radiation Facility (SSRF), and the ways these problems were solved. Before system installation, all the sensors were tested to exclude any that, owing to transportation, failed to meet the accuracy requirement of the SSRF design. The test method and test process are given in detail. During HLS installation, we solved a number of problems that could not be anticipated, e.g. problems with the supports and the water supply. The surveying results after system installation show that the system works normally and satisfies the design requirements. (authors)

  20. Data Provenance as a Tool for Debugging Hydrological Models based on Python

    Science.gov (United States)

    Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.

    2012-12-01

There is an increase in the data volume used in hydrological modeling. The increasing data volume requires additional effort in debugging models, since a single output value is influenced by a multitude of input values, and it is difficult to keep an overview of the data dependencies. Further, even knowing these dependencies, it is a tedious job to infer all the relevant data values. These data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph, without executing the script. To debug the model, the user specifies the value of interest in space and time; the tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and used more than 3000 individual files, each representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consume around 40 GB of memory. Using the proposed tool, a modeler is able to select individual values and infer which values were used to calculate them. Especially in cases of outliers or missing values, it is a beneficial tool that provides the modeler with efficient information for investigating the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. In case a Python script contains an unknown function or class, the tool requests additional information about the function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R
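
    The central trick, inferring data dependencies from a Python script without executing it, can be sketched with the standard-library `ast` module (a toy illustration under stated assumptions: only simple assignments, invented variable names; the actual tool handles far more of the language):

```python
# Toy static provenance: parse a script, record which names each
# assignment reads, then walk the dependency graph backwards from a
# value of interest. The SCRIPT content is an invented example.
import ast

SCRIPT = """
demand = population * per_capita_use
supply = rainfall + groundwater
deficit = demand - supply
"""

def provenance(source):
    """Map each assigned name to the set of names its expression reads."""
    deps = {}
    for node in ast.parse(source).body:
        if isinstance(node, ast.Assign):
            target = node.targets[0].id
            deps[target] = {n.id for n in ast.walk(node.value)
                            if isinstance(n, ast.Name)}
    return deps

def upstream(deps, name):
    """All variables that transitively feed into `name`."""
    seen, stack = set(), [name]
    while stack:
        for d in deps.get(stack.pop(), ()):
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return seen

deps = provenance(SCRIPT)
print(sorted(upstream(deps, "deficit")))
```

    Selecting `deficit` pulls in every upstream input, exactly the "which values produced this outlier" query the abstract describes, and nothing in the script is ever executed.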

  1. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    Science.gov (United States)

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems. PMID:22412336

  2. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.

  3. A HYBRID METHOD FOR AUTOMATIC SPEECH RECOGNITION PERFORMANCE IMPROVEMENT IN REAL WORLD NOISY ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Urmila Shrawankar

    2013-01-01

Full Text Available It is a well known fact that speech recognition systems perform well when used in conditions similar to those used to train the acoustic models. However, mismatches degrade the performance. In an adverse environment, it is very difficult to predict the category of real-world environmental noise in advance, and difficult to achieve environmental robustness. After a rigorous experimental study it was observed that no single method is available that both cleans noisy speech and preserves the quality of speech corrupted by real, natural environmental (mixed) noise. It was also observed that back-end techniques alone are not sufficient to improve the performance of a speech recognition system; it is necessary to implement performance improvement techniques at every step of the back-end as well as the front-end of the Automatic Speech Recognition (ASR) model. Current recognition systems address this problem using a technique called adaptation. This study presents an experimental study with two aims: the first is to implement a hybrid method that cleans the speech signal as much as possible with combinations of filters and enhancement techniques; the second is to develop a method for training all categories of noise that can adapt the acoustic models to a new environment, helping to improve the performance of the speech recognizer under real-world environmental mismatched conditions. This experiment confirms that hybrid adaptation methods improve ASR performance on both levels: Signal-to-Noise Ratio (SNR) improvement as well as word recognition accuracy in real-world noisy environments.

  4. Image guided radiotherapy : performance of a cone beam CT automatic image registration of the prostate

    International Nuclear Information System (INIS)

    Full text: Image registration is one source of uncertainty in image guided radiotherapy. The performance of masked, soft-tissue, automatic image registration of the prostate between CT and CBCT images was measured and its relationship with reduced imaging dose investigated. An anthropomorphic pelvis phantom (CIRS) was CT scanned and used as a reference for IGRT. Seven CBCT scans were taken using the Elekta Synergy system with nominal imaging doses from 1 to 40 mGy. Rigid-body image registration was repeated 100 times with randomly selected start positions representing normal prostate set-up errors. Image registration used the 'Elekta Correlation Ratio' algorithm with CT data masked to the prostate + 5 mm isotropic margin. Residual error analysis was performed to determine the registration accuracy, precision and robustness. Rigid-body errors were analysed as target registration error (TRE), the average error between any two corresponding points on the surface of a 5 cm sphere centred on the isocentre. Similar methods were applied to 21 CBCT scans from seven patients. The TRE was stable for imaging doses above 6 mGy. Median(TRE) was 3.6 mm. Registration performance for patient images was highly variable; 4121 CT-CBCT registrations showed median(TRE) < 1 mm and RFF < 20%. For the rest, median(TRE) was up to 9 mm and RFF from 20 to 90%. A clear dose-response relationship was evident for CT-CBCT image registration performance of the prostate in phantom measurements. Performance with patient images was highly variable.
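The TRE metric described above (average displacement of corresponding points on a 5 cm sphere under a residual rigid-body registration error) can be sketched numerically. The sampling density and random seed are assumptions for illustration; the abstract does not specify how the sphere surface was sampled:

```python
import numpy as np

def target_registration_error(R, t, radius=25.0, n=2000, seed=0):
    """Mean displacement (same units as t) of points on a sphere of
    the given radius under a residual rigid-body error (rotation R,
    translation t) -- a sketch of the TRE metric described above."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n, 3))
    # Project random directions onto the sphere surface.
    pts = radius * v / np.linalg.norm(v, axis=1, keepdims=True)
    moved = pts @ R.T + t
    return float(np.mean(np.linalg.norm(moved - pts, axis=1)))

# Pure 1 mm translation, no rotation: every surface point moves 1 mm.
tre = target_registration_error(np.eye(3), np.array([1.0, 0.0, 0.0]))
```

For a pure translation the TRE equals the translation magnitude; rotational error contributes more at larger sphere radii, which is why the metric is tied to a fixed 5 cm sphere.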

  5. Performance Evaluation of Antlion Optimizer Based Regulator in Automatic Generation Control of Interconnected Power System

    Directory of Open Access Journals (Sweden)

    Esha Gupta

    2016-01-01

    Full Text Available This paper presents an application of the recently introduced Antlion Optimizer (ALO) to find the parameters of the primary governor loop of thermal generators for successful Automatic Generation Control (AGC) of a two-area interconnected power system. Two standard objective functions, Integral Square Error (ISE) and Integral Time Absolute Error (ITAE), have been employed to carry out this parameter-estimation process. The problem is transformed into an optimization problem to obtain the integral gains, speed regulation, and frequency sensitivity coefficient for both areas. The performance of the regulator obtained from ALO is compared with Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Gravitational Search Algorithm (GSA) based regulators. Different types of perturbations and load changes are incorporated to establish the efficacy of the obtained design. It is observed that ALO outperforms all three optimization methods for this real problem. The optimization performance of ALO is compared with the other algorithms on the basis of standard deviations in the values of parameters and objective functions.
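The two objective functions named above have standard textbook definitions: ISE = ∫ e(t)² dt and ITAE = ∫ t·|e(t)| dt, evaluated over the simulated response. A minimal numerical sketch (the grid and the exponentially decaying test error signal are illustrative, not from the paper):

```python
import numpy as np

def _trapezoid(y, t):
    """Trapezoidal integration, kept explicit for portability."""
    y, t = np.asarray(y, float), np.asarray(t, float)
    return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(t)))

def ise(t, e):
    """Integral Square Error: ISE = integral of e(t)^2 dt."""
    return _trapezoid(np.asarray(e) ** 2, t)

def itae(t, e):
    """Integral Time Absolute Error: ITAE = integral of t*|e(t)| dt."""
    return _trapezoid(np.asarray(t) * np.abs(e), t)

# For e(t) = exp(-t): ISE -> 1/2 and ITAE -> 1 as t -> infinity.
t = np.linspace(0.0, 30.0, 30001)
e = np.exp(-t)
ise_val, itae_val = ise(t, e), itae(t, e)
```

ITAE's time weighting penalizes errors that persist late in the response, so optimizers driven by ITAE tend to favour faster-settling, less oscillatory AGC responses than ISE-driven ones.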

  6. Productive performance of Nile tilapia (Oreochromis niloticus fed at different frequencies and periods with automatic dispenser

    Directory of Open Access Journals (Sweden)

    R.M.R. Sousa

    2012-02-01

    Full Text Available The performance of Nile tilapia (Oreochromis niloticus) raised in cages furnished with an automatic dispenser, supplied at different frequencies (once per hour and once every two hours) and periods (daytime, nighttime and both), was evaluated. Eighteen 1.0 m³ cages were placed in a 2000 m² pond, two meters deep with a 5% water exchange. One hundred and seventy tilapias, with an initial weight of 16.0±4.9 g, were placed in each 1 m³ cage and the feed ration was adjusted every 21 days with biometry. Data were collected from March to July (autumn and winter). A significant difference in final weight (P<0.05) among treatments was observed. Increasing the feeding frequency improved the productive performance of Nile tilapia in cages and permitted better management of the feed. The better feed conversion rate at the high feeding frequency (24 times day-1) can save up to 360 kg of feed for each ton of fish produced, increasing the economic sustainability of tilapia culture and suggesting less environmental pollution.

  7. Validation of a novel automatic sleep spindle detector with high performance during sleep in middle aged subjects

    DEFF Research Database (Denmark)

    Wendt, Sabrina Lyngbye; Christensen, Julie A. E.; Kempfner, Jacob;

    2012-01-01

    Many of the automatic sleep spindle detectors currently used to analyze sleep EEG are either validated on young subjects or not validated thoroughly. The purpose of this study is to develop and validate a fast and reliable sleep spindle detector with high performance in middle aged subjects. An...

  8. Automatic PID Control Loops Design for Performance Improvement of Cryogenic Turboexpander

    International Nuclear Information System (INIS)

    The field of cryogenics involves temperatures below 123 K, far below ambient temperature. Many industrially important physical processes, from the needs of national thermonuclear fusion programs and superconducting magnets to the treatment of cutting tools and the preservation of blood cells, require extremely low temperatures. The low temperatures required for liquefaction of common gases can be obtained by several processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. A helium liquefier is used for the liquefaction of helium gas. In general, a Helium Refrigerator/Liquefier (HRL) needs a turboexpander as the expansion machine to produce the cooling effect used for the production of liquid helium. Turboexpanders, high-speed devices supported on gas bearings, are the most critical components in many helium refrigeration systems. A very minor fault in operation or manufacturing, or impurities in the helium gas, can destroy the turboexpander. Since the performance of expanders depends on a number of operating parameters whose interrelations are quite complex, the instrumentation and control system design for the turboexpander needs special attention. The inefficiency of manual control leads to the need for automatic control loops for the turboexpander. Proper design and implementation of the control loops play an important role in the successful operation of a cryogenic turboexpander. The PID control loops have to be implemented with accurate interlocks and logic to enhance the performance of the cryogenic turboexpander. For different normal and off-normal operations, speeds will differ, and hence a proper control method for avoiding critical rotational speeds is a must. This paper presents the design of the PID control loops needed for the

  9. Distribution transformer with automatic voltage adjustment - performance; Transformador de distribucion con ajuste automatico de tension - desempeno

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Ruiz, Gustavo A.; Delgadillo Bocanegra, Alfonso; Betancourt Ramirez, Enrique [PROLEC-GE, Apodaca, Nuevo Leon (Mexico)]. E-mail: gustavo1.hernandez@ge.com; alfonso.delgadillobocanegra@ge.com; enrique.betancourt@ge.com; Ramirez Arredondo, Juan M. [CINVESTAV-Guadalajara, Zapopan, Jalisco (Mexico)]. E-mail: jramirez@gdl.cinvestav.mx

    2010-11-15

    In the electric power distribution systems, the power quality is strongly linked with the service stability voltage. In the radial kind systems, it is virtually impossible to achieve a flat voltage along the lines, so it is desirable to count with transformers that can adjust automatically the turns ratio. In this work, it is described the development and the performance of a transformer with an integrated electronic tap changer, that allows to change the turns ratio along the standard range of +/-5%, and it was identified the application limits of the technology. [Spanish] En los sistemas de distribucion de energia electrica, la calidad del suministro de energia esta fuertemente ligada con la estabilidad del voltaje de servicio. En sistemas de tipo radial, es virtualmente imposible mantener uniforme la tension a lo largo de las lineas, por lo que se hace deseable contar con transformadores que puedan ajustar automaticamente la relacion de transformacion. En este trabajo, se describe el desarrollo y desempeno de un transformador con switch electronico integrado, que permite variar la relacion de transformacion dentro del rango estandarizado de +/-5%, y se identifican los limites de aplicacion de la tecnologia.

  10. Remote Collaborative Debugging Model Based on Debugging Agent%基于调试代理的远程协同调试模型

    Institute of Scientific and Technical Information of China (English)

    胡先浪; 张培培

    2011-01-01

    Given the limited software and hardware resources of embedded systems, and the fact that existing software development tools support only a one-to-one debugging mode, this paper first analyzes the theory of embedded cross-debugging, then proposes a remote collaborative debugging model based on a debugging agent and implements the debugging-agent functionality, the core of remote collaborative debugging. The experimental method and functional verification are presented on the domestic integrated development environment JARI-IDE and the domestic embedded operating system JARI-Works. The model enables developers at different locations to share resources and work collaboratively in real time, providing a new approach to modular, divided debugging.%针对嵌入式软硬件资源的限制问题,以及现有的软件开发工具仅支持一对一的调试模式的问题,在嵌入式远程调试原理的基础上,分析了嵌入式交叉调试原理,在此基础上提出了基于调试代理的远程协同调试模型,并详细给出了远程协同调试实现的核心--调试代理功能的实现;其次在国产集成开发环境JARI-IDE和同产嵌入式操作系统JARI-Works上给出了实现方法和功能验证,实现了分布在不同地点的开发人员共享资源,实时协同工作,为模块化分工调试提供了一种新的实现方法.

  11. An Imperfect-debugging Fault-detection Dependent-parameter Software

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Software reliability growth models (SRGMs) incorporating imperfect debugging and the learning phenomenon of developers have recently been developed by many researchers to estimate software reliability measures such as the number of remaining faults and software reliability. However, the model parameters of the fault content rate function and the fault detection rate function of these SRGMs are usually assumed to be independent of each other. In practice this assumption may not hold, and it is worth investigating what happens when it does not. In this paper we undertake such a study and propose a software reliability model connecting imperfect debugging and the learning phenomenon through a parameter common to the two functions, called the imperfect-debugging fault-detection dependent-parameter model. Software testing data collected from real applications are used to illustrate the proposed model's descriptive and predictive power by determining the non-zero initial debugging process.

  12. Effect of an automatic feeding system on growth performance and feeding behaviour of pigs reared outdoors

    OpenAIRE

    Riccardo Fortina; Salvatore Barbera; Paolo Cornale

    2009-01-01

    Nine Mora Romagnola and 10 Large White x Mora Romagnola growing pigs were reared outdoors. In both groups ad libitum feed was provided. Conventional pigs received it twice a day, distributed in two long troughs. Inside the corral of the second group, an automatic station was set up for: feed distribution, pigs weighing, and control by an analog camera. Thus the self-feeders received feed ad libitum individually by the automatic system, divided into small quantities at meal times. During the e...

  13. Application of VMware VProbes to debugging of a segmentation based separation kernel

    OpenAIRE

    Sanders, Kyle

    2009-01-01

    Approved for public release; distribution is unlimited Debugging is a useful technique in all aspects of software development, including that of operating systems. Because they provide low level interfaces to the hardware, operating systems are particularly difficult to debug. There is little room to add abstraction between the computer hardware and the executing operating system software. Many debuggers are intimately tied to the system’s memory model, compiler, and loader. For specialize...

  14. General collaboration offer of Johnson Controls regarding the performance of air conditioning automatic control systems and other buildings` automatic control systems

    Energy Technology Data Exchange (ETDEWEB)

    Gniazdowski, J.

    1995-12-31

    JOHNSON CONTROLS manufactures measuring and control equipment (800 types) and is also a "turn-key" supplier of complete automatic control systems for the heating, air conditioning, ventilation and refrigeration branches. The Company also supplies Buildings' Computer-Based Supervision and Monitoring Systems that may be applied in both small and large structures. Since 1990 the company has been performing full-range trade and contracting activities on the Polish market. We have our own well-trained technical staff and we collaborate with a number of design and contracting enterprises that enable us to have our projects carried out all over Poland. The prices of our supplies and services correspond with the level of the Polish market.

  15. Automatic Communication Path Creation and Reconfiguration in Dynamic Network Environments for Optimizing Performance of Network Applications

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar Singh

    2012-03-01

    Full Text Available Accessing network services across a wide area network remains a challenging task, mainly because of the heterogeneous and constantly changing network environment, which usually causes an undesirable user experience for network-oblivious applications. A promising approach to address this is to provide network awareness in communication paths. Many challenging problems remain, in particular: how to automatically create effective network paths whose performance is optimized for encountered network conditions; how to dynamically reconfigure such paths when network conditions change; and how to manage and distribute network resources among different paths and between different network regions. This paper describes solutions for these problems, built into a programmable network infrastructure called Switching Network Services (SNS). The SNS infrastructure provides applications with network-aware communication paths that are automatically created and dynamically modified. SNS highlights four key mechanisms: a high-level integrated type-based specification of components and network resources; automatic path creation strategies; system support for low-overhead path reconfiguration; and distributed strategies for managing and allocating network resources. We evaluate these mechanisms using experiments with typical applications running in the SNS infrastructure, and extensive simulation of a large-scale network topology to compare with other alternatives. Experimental results validate the effectiveness of our approach, verifying that (1) the path-based approach provides the best and most robust performance under a wide range of network configurations compared to end-point or proxy-based alternatives; (2) automatic generation of network-aware paths is feasible and provides considerable performance advantages, requiring only minimal input from applications; (3) path reconfiguration strategies ensure continuous adaptation and

  16. Performing Label-Fusion-Based Segmentation Using Multiple Automatically Generated Templates

    Science.gov (United States)

    Chakravarty, M. Mallar; Steadman, Patrick; van Eede, Matthijs C.; Calcott, Rebecca D.; Gu, Victoria; Shaw, Philip; Raznahan, Armin; Collins, D. Louis; Lerch, Jason P.

    2016-01-01

    Classically, model-based segmentation procedures match magnetic resonance imaging (MRI) volumes to an expertly labeled atlas using nonlinear registration. The accuracy of these techniques is limited due to atlas biases, misregistration, and resampling error. Multi-atlas-based approaches are used as a remedy and involve matching each subject to a number of manually labeled templates. This approach yields numerous independent segmentations that are fused using a voxel-by-voxel label-voting procedure. In this article, we demonstrate how the multi-atlas approach can be extended to work with input atlases that are unique and extremely time consuming to construct, by generating a library of multiple automatically generated templates of different brains (MAGeT Brain). We demonstrate the efficacy of our method for the mouse and human using two different nonlinear registration algorithms (ANIMAL and ANTs). The input atlases consist of a high-resolution mouse brain atlas and an atlas of the human basal ganglia and thalamus derived from serial histological data. MAGeT Brain segmentation improves the identification of the mouse anterior commissure (mean Dice kappa κ = 0.801), but may be encountering a ceiling effect for hippocampal segmentations. Applying MAGeT Brain to human subcortical structures improves segmentation accuracy for all structures compared to regular model-based techniques (κ = 0.845, 0.752, and 0.861 for the striatum, globus pallidus, and thalamus, respectively). Experiments performed with three manually derived input templates suggest that MAGeT Brain can approach or exceed the accuracy of multi-atlas label-fusion segmentation (κ = 0.894, 0.815, and 0.895 for the striatum, globus pallidus, and thalamus, respectively). PMID:22611030
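The voxel-by-voxel label-voting step can be sketched as a majority vote over candidate segmentations. This is a minimal NumPy illustration of the generic fusion rule, not MAGeT Brain's implementation, and it ignores tie-breaking and label-weighting refinements real pipelines apply:

```python
import numpy as np

def fuse_labels(label_maps):
    """Voxel-wise majority vote over candidate segmentations.
    label_maps: sequence of integer label arrays of identical shape."""
    stack = np.asarray(label_maps)
    n_labels = int(stack.max()) + 1
    # Count votes per label at each voxel, then pick the winner.
    counts = np.stack([(stack == lbl).sum(axis=0) for lbl in range(n_labels)])
    return counts.argmax(axis=0)

# Three toy 2x2 "segmentations" of the same subject.
a = np.array([[0, 1], [1, 2]])
b = np.array([[0, 1], [2, 2]])
c = np.array([[1, 1], [1, 2]])
fused = fuse_labels([a, b, c])
```

Because each template contributes one independent vote per voxel, random per-template registration errors tend to cancel, which is the intuition behind the accuracy gains reported above.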

  17. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretic-experimental study dealing with the effects of an automatic shading device on the energetic performance of a dimmable lighting system and a cooling equipment. Some equations related to fenestration optical and thermal properties are rebuilt, while some others are created, under a theoretical approach. In order to collect field data, the energy demand-and other variables - was measured in two distinct stories, with the same fenestration features, of the Test Tower. New data was gathered after adding an automatic shading device to the window of one story. The comparison of the collected data allows the energetic performance evaluation of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  18. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard;

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV is...

  19. Compile-Time Debugging of C Programs Working on Trees

    DEFF Research Database (Denmark)

    Elgaard, Jacob; Møller, Anders; Schwartzbach, Michael I.

    We exhibit a technique for automatically verifying the safety of simple C programs working on tree-shaped data structures. We do not consider the complete behavior of programs, but only attempt to verify that they respect the shape and integrity of the store. A verified program is guaranteed to...... by the MONA tool. This technique is complete for loop-free code, but for loops and recursive functions we rely on Hoare-style invariants. A default well-formedness invariant is supplied and can be strengthened as needed by programmer annotations. If a program fails to verify, a counterexample in the...... form of an initial store that leads to an error is automatically generated. This extends previous work that uses a similar technique to verify a simpler syntax manipulating only list structures. In that case, programs are translated into WS1S formulas. A naive generalization to recursive data...

  20. Modern multithreading implementing, testing, and debugging multithreaded Java and C++/Pthreads/Win32 programs

    CERN Document Server

    Carver, Richard H

    2005-01-01

    Master the essentials of concurrent programming, including testing and debugging. This textbook examines languages and libraries for multithreaded programming. Readers learn how to create threads in Java and C++, and develop essential concurrent programming and problem-solving skills. Moreover, the textbook sets itself apart from other comparable works by helping readers to become proficient in key testing and debugging techniques. Among the topics covered, readers are introduced to the relevant aspects of Java, the POSIX Pthreads library, and the Windows Win32 Application Programming Interface.

  1. Automatic sprinkler system performance and reliability in United States Department of Energy Facilities, 1952 to 1980

    International Nuclear Information System (INIS)

    The automatic sprinkler system experiences of the United States Department of Energy and its predecessor agencies are analyzed. Based on accident and incident files in the Office of Operational Safety and on supplementary responses, 587 incidents including over 100 fires are analyzed. Tables and figures, with supplementary narratives discuss fire experience by various categories such as number of heads operating, type of system, dollar losses, failures, extinguished vs. controlled, and types of sprinkler heads. Use is made of extreme value projections and frequency-severity plots to compare past experience and predict future experience. Non-fire incidents are analyzed in a similar manner by cause, system types and failure types. Discussion of no-loss incidents and non-fire protection water systems is included. The author's conclusions and recommendations and appendices listing survey methodology, major incidents, and a bibliography are included

  2. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically produce a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and inject the required data into the original source files, creating new ones ready to be compiled with all related dependencies. Besides this, I proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I experimented with Natural Language Processing for extracting metadata; and I proposed injecting the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  3. Design and performance of an automatic regenerating adsorption aerosol dryer for continuous operation at monitoring sites

    Science.gov (United States)

    Tuch, T. M.; Haudek, A.; Müller, T.; Nowak, A.; Wex, H.; Wiedensohler, A.

    2009-04-01

    Sizes of aerosol particles depend on the relative humidity of their carrier gas. Most monitoring networks therefore require that the aerosol is dried to a relative humidity below 50% RH to ensure comparability of measurements at different sites. Commercially available aerosol dryers are often not suitable for this purpose at remote monitoring sites: adsorption dryers need to be regenerated frequently, and maintenance-free single-column Nafion dryers are not designed for high aerosol flow rates. We therefore developed an automatic regenerating adsorption aerosol dryer with a design flow rate of 1 m3/h. The particle transmission efficiency of this dryer was determined during a 3-week experiment. The lower 50% cut-off was found to be below 3 nm at the design flow rate of the instrument. Measured transmission efficiencies are in good agreement with theoretical calculations. One dryer has been successfully deployed in the Amazonas river basin. From this monitoring site, we present data from the first 6 months of measurements (February 2008-August 2008). Apart from one unscheduled service, this dryer did not require any maintenance during this period. The average relative humidity of the dried aerosol was 27.1+/-7.5% RH, compared to an average ambient relative humidity of nearly 80% and temperatures around 30°C. This initial deployment demonstrated that these dryers are well suited for continuous operation at remote monitoring sites under adverse ambient conditions.

  4. Design and performance of a video-based laser beam automatic alignment system

    Institute of Scientific and Technical Information of China (English)

    Daizhong Liu(刘代中); Renfang Xu(徐仁芳); Dianyuan Fan(范滇元)

    2004-01-01

    A laser alignment system is applied to a high power laser facility for inertial confinement fusion. The design of an automated, closed-loop laser beam alignment system is described. Its function is to sense beam alignment errors in a laser beam transport system and automatically steer mirrors preceding the sensor location as required to maintain beam alignment. The laser beam is sampled by a sensor package, which uses video cameras to sense pointing and centering errors. The camera outputs are fed to a personal computer, which includes video digitizers and uses image storage and software to sense the centroid of the image. Signals are sent through the computer to a stepper motor controller, which drives stepper motors on mirror mounts preceding the beam sampling location to return the beam alignment to the prescribed condition. Its optical principles and key techniques are given. The pointing and centering sensitivities of the beam alignment sensor package are analyzed. The system has been verified on the multi-pass amplifier experimental system.
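The centroid-sensing step described above is conventionally an intensity-weighted centroid of the camera frame. A minimal sketch under that assumption (the abstract does not detail the actual image-processing chain, and background subtraction and thresholding are omitted):

```python
import numpy as np

def beam_centroid(frame):
    """Intensity-weighted centroid (row, col) of a camera frame,
    as used to sense beam pointing/centering errors."""
    frame = np.asarray(frame, dtype=float)
    total = frame.sum()
    rows, cols = np.indices(frame.shape)
    return (float((rows * frame).sum() / total),
            float((cols * frame).sum() / total))

# A single bright pixel at (2, 3) yields exactly that centroid.
frame = np.zeros((5, 5))
frame[2, 3] = 7.0
centroid = beam_centroid(frame)
```

The controller then compares this centroid against the prescribed position and converts the offset into stepper-motor corrections for the upstream mirrors.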

  5. Performance of an automatic dose control system for CT. Patient studies

    International Nuclear Information System (INIS)

    Purpose: To study the effect of an automatic dose control (ADC) system with adequate noise characteristic on the individual perception of image noise and diagnostic acceptance compared to objectively measured image noise and the dose reductions achieved in a representative group of patients. Materials and Methods: In a retrospective study two matched cohorts of 20 patients each were identified: a manual cohort with exposure settings according to body size (small - regular - large) and an ADC cohort with exposure settings calculated by the ADC system (DoseRight 2.0 trademark, Philips Healthcare). For each patient, 12 images from 6 defined anatomic levels from contrast-enhanced scans of chest and abdomen/pelvis were analyzed by 4 independent readers concerning image noise and diagnostic acceptance on a five-point Likert scale and evaluated for objectively measured image noise. Radiation exposure was calculated from recorded exposure data. Results: Use of the ADC system reduced the average effective dose for patients by 36 % in chest scans (3.2 vs. 4.9 mSv) and by 17 % in abdomen/pelvis scans (7.6 vs. 8.3 mSv). Average objective noise was slightly lower in the manual cohort (11.1 vs. 12.8 HU), correlating with a slightly better rating in subjective noise score (4.4 vs. 4.2). However, diagnostic acceptance was rated almost equal in both cohorts with excellent image quality (4.6 vs. 4.5). Conclusion: Use of an ADC system with adequate noise characteristic leads to significant reductions in radiation exposure for patients while maintaining excellent image quality. (orig.)
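"Objectively measured image noise" in CT studies is conventionally the standard deviation of CT numbers (HU) inside a homogeneous region of interest. A minimal sketch under that common definition (the study does not specify its exact ROI procedure):

```python
import numpy as np

def roi_noise_hu(image, r0, r1, c0, c1):
    """Objective image noise: sample standard deviation of CT
    numbers (HU) inside a rectangular ROI [r0:r1, c0:c1]."""
    roi = np.asarray(image, dtype=float)[r0:r1, c0:c1]
    return float(roi.std(ddof=1))

# A perfectly homogeneous ROI has zero noise; a varying one does not.
flat = roi_noise_hu(np.zeros((4, 4)), 0, 4, 0, 4)
noisy = roi_noise_hu(np.array([[10.0, 12.0], [14.0, 16.0]]), 0, 2, 0, 2)
```

Comparing such ROI measurements at matched anatomic levels is what allows the subjective Likert noise scores above to be checked against an objective quantity.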

  6. Debugging of Class-D Audio Power Amplifiers

    DEFF Research Database (Denmark)

    Crone, Lasse; Pedersen, Jeppe Arnsdorf; Mønster, Jakob Døllner;

    2012-01-01

    Determining and optimizing the performance of a Class-D audio power amplifier can be very difficult without knowledge of the use of audio performance measuring equipment and of how the various noise and distortion sources influence the audio performance. This paper gives an introduction on how to measure...

  7. Standard guide for in-plant performance evaluation of automatic pedestrian SNM monitors

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This guide is affiliated with Guide C1112 on special nuclear material (SNM) monitors, Guide C1169 on laboratory performance evaluation, and Guide C1189 on calibrating pedestrian SNM monitors. This guide to in-plant performance evaluation is a comparatively rapid way to verify whether a pedestrian SNM monitor performs as expected for detecting SNM or SNM-like test sources. 1.1.1 In-plant performance evaluation should not be confused with the simple daily functional test recommended in Guide C1112. In-plant performance evaluation takes place less often than daily tests, usually at intervals ranging from weekly to once every three months. In-plant evaluations are also more extensive than daily tests and may examine both a monitor's nuisance alarm record and its detection sensitivity for a particular SNM or alternative test source. 1.1.2 In-plant performance evaluation also should not be confused with laboratory performance evaluation. In-plant evaluation is comparatively rapid, takes place in the monitor...

  8. Performance analysis of automatic generation control of interconnected power systems with delayed mode operation of area control error

    Directory of Open Access Journals (Sweden)

    Janardan Nanda

    2015-05-01

    Full Text Available This study presents automatic generation control (AGC) of an interconnected power system comprising two thermal areas and one hydro area with integral controllers. Emphasis is given to a delay in the area control error for the actuation of the supplementary controller, and to examining its impact on the dynamic response compared with the usual practice of no delay. The analysis is based on a 50% loading condition in all areas. The system performance is examined considering a 1% step load perturbation. Results reveal that delayed mode operation provides better system dynamic performance than that obtained without delay and has several distinct merits for the governor: the delay is linked with a reduction in wear and tear of the secondary controller and hence increases the life of the governor. The controller gains are optimised by particle swarm optimisation. The performance of delayed mode operation of AGC at other loading conditions is also analysed. An attempt has also been made to find the impact of weights for different components in a cost function used to optimise the controller gains. A modified cost function with different weights for different components, when used for controller gain optimisation, improves the system performance.
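The area control error that drives the supplementary controller has the standard definition ACE = ΔP_tie + B·Δf, where ΔP_tie is the tie-line power deviation, Δf the frequency deviation, and B the frequency bias. A minimal sketch (the numeric values in the example are illustrative, not from this study):

```python
def area_control_error(dp_tie, df, bias):
    """Standard AGC area control error for one control area:
    ACE = delta_P_tie + B * delta_f.
    dp_tie: tie-line power deviation (p.u.)
    df:     frequency deviation (Hz)
    bias:   frequency bias B (p.u./Hz)"""
    return dp_tie + bias * df

# Example: tie-line export 0.02 p.u. high, frequency 0.1 Hz low,
# bias 0.425 p.u./Hz -> ACE = 0.02 + 0.425 * (-0.1) = -0.0225.
ace = area_control_error(0.02, -0.1, 0.425)
```

In the delayed mode studied above, this signal is fed to the integral controller only after a delay, which reduces how often the governor reacts to transient ACE fluctuations.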

  9. Interactive debug program for evaluation and modification of assembly-language software

    Science.gov (United States)

    Arpasi, D. J.

    1979-01-01

    An assembly-language debug program written for the Honeywell HDC-601 and DDP-516/316 computers is described. Names and relative addressing are used to improve operator-machine interaction. Features include versatile display, on-line assembly, and improved program execution and analysis. The program is discussed from both a programmer's and an operator's standpoint. Functional diagrams are included to describe the program, and each command is illustrated.

  10. Debugging and application of russian three-joint manipulator in the pilot plant

    International Nuclear Information System (INIS)

    This article comprehensively analyses the malfunctions that arose during system debugging and full-scale application of 11 Russian three-joint manipulators in the Pilot Plant, and describes the methods and means adopted to prevent the common malfunctions and keep the manipulators working normally. Finally, it summarises the remedies applied when the manipulators malfunction, so that maintenance personnel can diagnose and repair faults quickly and well. (authors)

  11. Automatic Eye Detection Error as a Predictor of Face Recognition Performance

    NARCIS (Netherlands)

    Dutta, Abhishek; Veldhuis, Raymond; Spreeuwers, Luuk

    2014-01-01

    Various facial image quality parameters, such as pose, illumination, noise, and resolution, are known to be predictors of face recognition performance. However, there still remain many other properties of facial images that are not captured by the existing quality parameters. In this paper, we propose

  12. Performance of automatic generation control mechanisms with large-scale wind power

    International Nuclear Information System (INIS)

    The unpredictability and variability of wind power increasingly challenges real-time balancing of supply and demand in electric power systems. In liberalised markets, balancing is a responsibility jointly held by the TSO (real-time power balancing) and PRPs (energy programs). In this paper, a procedure is developed for the simulation of power system balancing and the assessment of AGC performance in the presence of large-scale wind power, using the Dutch control zone as a case study. The simulation results show that the performance of existing AGC-mechanisms is adequate for keeping ACE within acceptable bounds. At higher wind power penetrations, however, the capabilities of the generation mix are increasingly challenged and additional reserves are required at the same level. (au)
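The ACE figure that the abstract uses as its performance criterion is conventionally computed from the tie-line power deviation and the frequency deviation. A minimal sketch with hypothetical numbers (the sign convention and all figures are our assumptions, not taken from the paper):

```python
def area_control_error(dp_tie_mw, df_hz, bias_mw_per_hz):
    """Classical area control error: ACE = dP_tie + B * df.
    Under this sign convention, a positive ACE means the area is
    over-generating and AGC should lower its generation set-points."""
    return dp_tie_mw + bias_mw_per_hz * df_hz

# Hypothetical scenario: a wind surplus pushes exports and frequency up
ace = area_control_error(dp_tie_mw=50.0, df_hz=0.02, bias_mw_per_hz=1500.0)
# 50 MW unscheduled export + 1500 MW/Hz * 0.02 Hz = 80 MW for AGC to remove
```

Keeping this quantity within acceptable bounds is exactly what the simulations in the paper test the existing AGC mechanisms against.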

  13. Mining Test Repositories for Automatic Detection of UI Performance Regressions in Android Apps

    OpenAIRE

    Gomez, Maria; Rouvoy, Romain; Adams, Bram; Seinturier, Lionel

    2016-01-01

    The reputation of a mobile app vendor’s apps is crucial for survival amongst the ever-increasing competition, and this reputation largely depends on the quality of the apps, both functional and non-functional. One major non-functional requirement of mobile apps is to guarantee smooth UI interactions, since choppy scrolling or navigation caused by performance problems on a mobile device’s limited hardware resources is highly annoying for end-users. The main research challenge of automaticall...

  14. EOS: Automatic In-vivo Evolution of Kernel Policies for Better Performance

    OpenAIRE

    Cui, Yan; Chen, Quan; Yang, Junfeng

    2015-01-01

    Today's monolithic kernels often implement a small, fixed set of policies such as disk I/O scheduling policies, while exposing many parameters to let users select a policy or adjust the specific setting of the policy. Ideally, the parameters exposed should be flexible enough for users to tune for good performance, but in practice, users lack domain knowledge of the parameters and are often stuck with bad, default parameter settings. We present EOS, a system that bridges the knowledge gap betw...

  15. Multistation alarm system for eruptive activity based on the automatic classification of volcanic tremor: specifications and performance

    Science.gov (United States)

    Langer, Horst; Falsaperla, Susanna; Messina, Alfio; Spampinato, Salvatore

    2015-04-01

    system is hitherto one of the main automatic alerting tools to identify impending eruptive events at Etna. The currently operating software named KKAnalysis is applied to the data stream continuously recorded at two seismic stations. The data are merged with reference datasets of past eruptive episodes. In doing so, the results of pattern classification can be immediately compared to previous eruptive scenarios. Given the rich material collected in recent years, here we propose the application of the alert system to a wider range (up to a total of eleven) stations at different elevations (1200-3050 m) and distances (1-8 km) from the summit craters. Critical alert parameters were empirically defined to obtain an optimal tuning of the alert system for each station. To verify the robustness of this new, multistation alert system, a dataset encompassing about eight years of continuous seismic records (since 2006) was processed automatically using KKAnalysis and collateral software offline. Then, we analyzed the performance of the classifier in terms of timing and spatial distribution of the stations.

  16. A computer-aided control system for automatic performance measurements on the LHC series dipoles

    International Nuclear Information System (INIS)

    The control system software (Test Master) for the Large Hadron Collider (LHC) magnet series measurements is presented. This system was developed at CERN to automate as many tests on the LHC magnets as possible. The Test Master software is the middle layer of the main software architecture developed by the LHC/IAS group for central supervision of all types of LHC dipole tests in the SM18 hall. It serves as a manager and scheduler for applications, controlling all measurements performed in a cluster of two test benches. The software was implemented in the LabVIEW environment. The interactive user interface, the software architecture, the communication protocols, the file configuration, and the different types of command and status files of the Test Master are described.

  17. A preliminary study into performing routine tube output and automatic exposure control quality assurance using radiology information system data

    International Nuclear Information System (INIS)

    Data are currently being collected from hospital radiology information systems in the North West of the UK for the purposes of both clinical audit and patient dose audit. Could these data also be used to satisfy quality assurance (QA) requirements according to UK guidance? From 2008 to 2009, 731 653 records were submitted from 8 hospitals in North West England. For automatic exposure control QA, the protocol from the Institute of Physics and Engineering in Medicine (IPEM) report 91 recommends that the tube current-time product (mAs) be monitored for repeatability and reproducibility using a suitable phantom at 70-81 kV. Abdomen AP and chest PA examinations were analysed to find the most common kilovoltage; the matching records were then used to plot average monthly mAs over time. IPEM report 91 also recommends that a range of commonly used clinical settings be used to check output reproducibility and repeatability. For each tube, the dose-area product values were plotted over time for the two most common exposure factor sets. Results show that it is possible to perform performance checks of AEC systems; however, more work is required before tube output performance can be monitored. Procedurally, the management system requires work, and the benefits to the workflow would need to be demonstrated. (authors)
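Repeatability and reproducibility monitoring of the kind described above is commonly summarised with a coefficient of variation over repeated readings. A minimal sketch with hypothetical readings and an assumed 5% action level (the tolerance and figures are illustrative, not taken from IPEM report 91):

```python
from statistics import mean, stdev

def reproducibility_cv(values):
    """Coefficient of variation (%) of repeated readings -- a simple
    repeatability/reproducibility figure for QA trend monitoring."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical monthly average mAs values at the most common kV setting
mas_readings = [16.1, 15.8, 16.3, 16.0, 15.9]
cv = reproducibility_cv(mas_readings)
out_of_tolerance = cv > 5.0   # example action level; real limits differ
```

In a RIS-based workflow the `mas_readings` list would be replaced by the monthly averages extracted from the submitted records, with any month exceeding the action level flagged for investigation.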

  18. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, T L

    2007-10-01

    Performance-driven interface contract enforcement research aims to improve the quality of programs built from plug-and-play scientific components. Interface contracts make the obligations on the caller and all implementations of the specified methods explicit. Runtime contract enforcement is a well-known technique for enhancing testing and debugging. However, checking all of the associated constraints during deployment is generally considered too costly from a performance standpoint. Previous solutions enforced subsets of constraints without explicit consideration of their performance implications. Hence, this research measures the impacts of different interface contract sampling strategies and compares results with new techniques driven by execution time estimates. Results from three studies indicate that automatically adjusting the level of checking based on performance constraints improves the likelihood of detecting contract violations under certain circumstances. Specifically, performance-driven enforcement is better suited to programs exercising constraints whose costs are at most moderately expensive relative to normal program execution.
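The performance-driven idea can be illustrated with a toy enforcer that checks a contract only while cumulative checking time stays within an overhead budget derived from execution-time estimates. The class, method names, and budget policy below are our own sketch, not the implementation evaluated in the report:

```python
class PerformanceDrivenEnforcer:
    """Enforce contracts only while cumulative checking time stays below
    a user-set fraction of cumulative program execution time (a sketch
    of performance-driven sampling; names and policy are assumptions)."""

    def __init__(self, overhead_limit=0.10):
        self.overhead_limit = overhead_limit  # e.g. at most 10% overhead
        self.check_time = 0.0                 # time spent checking so far
        self.program_time = 0.0               # observed program time so far

    def record_method(self, elapsed):
        """Account for a method execution of `elapsed` time units."""
        self.program_time += elapsed

    def maybe_check(self, predicate, estimated_cost):
        """Run the contract predicate if it fits in the overhead budget.
        Returns True/False for pass/violation, or None if skipped."""
        budget = self.overhead_limit * self.program_time
        if self.check_time + estimated_cost > budget:
            return None                       # skipped: budget exhausted
        self.check_time += estimated_cost
        return bool(predicate())

enf = PerformanceDrivenEnforcer(overhead_limit=0.10)
enf.record_method(100.0)                      # 100 time units observed
r1 = enf.maybe_check(lambda: True, 4.0)       # within budget: enforced
r2 = enf.maybe_check(lambda: False, 4.0)      # within budget: violation
r3 = enf.maybe_check(lambda: True, 4.0)       # 12 > 10: skipped
```

This mirrors the report's finding in miniature: cheap constraints get checked routinely, while expensive ones are throttled once their cost would dominate execution time.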

  19. LVDT Position Feedback Site Debugging Special Tool%LVDT位置反馈现场调试专用工具

    Institute of Scientific and Technical Information of China (English)

    卢震宇

    2014-01-01

    This paper concerns the LVDT, the opening-degree feedback device for the high- and intermediate-pressure main steam valves and control valves of a power plant steam turbine, and addresses the manpower, time, and environmental-interference problems encountered during debugging. During unit operation, the DCS sends opening commands to automatically adjust flow, pressure, and other parameters, and the only means of knowing the actual valve opening in the field is the current signal that the position feedback device transmits back to the control-room CRT. If the error between the feedback signal and the commanded signal exceeds a certain value, the system loses stability and may force the entire unit onto manual control, causing unnecessary losses. Because the head of the unit operates at high temperature, the failure probability of the electronic devices rises sharply and spare parts must be replaced more frequently, so checking the position feedback device becomes increasingly important. Inaccurate position feedback cannot truly reflect the valve opening, and since the valve opening controls the oil/steam intake, it affects normal load operation and the unit's equipment. The high- and intermediate-pressure main steam valves and control valves are the key equipment for adjusting turbine load, and the LVDT position feedback is the only way for plant personnel to continuously monitor valve opening, so it directly affects the accuracy of the unit's load control. The traditional debugging method is time-consuming and laborious, and poses hidden safety problems. By matching the specification of the on-site LVDT connector, a special tool inserts a current measurement into the field circuit, so that debugging personnel can observe the current change alone in the field. This enables on-site debugging and observation, reduces environmental interference, and improves maintenance efficiency.

  20. Performance portability study of an automatic target detection and classification algorithm for hyperspectral image analysis using OpenCL

    Science.gov (United States)

    Bernabe, Sergio; Igual, Francisco D.; Botella, Guillermo; Garcia, Carlos; Prieto-Matias, Manuel; Plaza, Antonio

    2015-10-01

    Recent advances in heterogeneous high performance computing (HPC) have opened new avenues for demanding remote sensing applications. Perhaps one of the most popular algorithms in target detection and identification is the automatic target detection and classification algorithm (ATDCA), widely used in the hyperspectral image analysis community. Previous research has already investigated the mapping of ATDCA on graphics processing units (GPUs) and field programmable gate arrays (FPGAs), showing impressive speedup factors that allow its exploitation in time-critical scenarios. Based on these studies, our work explores the performance portability of a tuned OpenCL implementation across a range of processing devices including multicore processors, GPUs and other accelerators. This approach differs from previous papers, which focused on achieving the optimal performance on each platform. Here, we are more interested in the following issues: (1) evaluating if a single code written in OpenCL allows us to achieve acceptable performance across all of them, and (2) assessing the gap between our portable OpenCL code and those hand-tuned versions previously investigated. Our study includes the analysis of different tuning techniques that expose data parallelism as well as enable an efficient exploitation of the complex memory hierarchies found in these new heterogeneous devices. Experiments have been conducted using hyperspectral data sets collected by NASA's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensors. To the best of our knowledge, this kind of analysis has not been previously conducted in the hyperspectral imaging processing literature, and in our opinion it is very important in order to really calibrate the possibility of using heterogeneous platforms for efficient hyperspectral imaging processing in real remote sensing missions.

  1. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    International Nuclear Information System (INIS)

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High degree of accuracy in the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomically aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. The time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomically aligned axial images. Reformatted images from both reconstruction methods were assessed by two observers regarding the accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created for 1 vertebra, in 28 cases for 2 vertebrae and in 16 cases for 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomically aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 or more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time

  2. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High degree of accuracy in the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomically aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. The time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomically aligned axial images. Reformatted images from both reconstruction methods were assessed by two observers regarding the accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created for 1 vertebra, in 28 cases for 2 vertebrae and in 16 cases for 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomically aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 or more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time

  3. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences produced by a finite automaton. Although they are not random, they may appear random. They are complicated in the sense of not being ultimately periodic, and it may not be easy to name the rule by which a given sequence is generated; nevertheless, such a rule always exists. Automatic sequences have special applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.

  4. Description and performance of a fully automatic device for the study of the sedimentation of magnetic suspensions

    Science.gov (United States)

    Iglesias, G. R.; López-López, M. T.; Delgado, A. V.; Durán, J. D. G.

    2011-07-01

    In this paper we describe an experimental setup for the automatic determination of the sedimentation behavior of magnetic suspensions (i.e., disperse systems consisting of ferro- or ferrimagnetic particles in a suitable fluid) of arbitrary volume fraction of solids. The device is based on the evaluation of the inductance of a thin coil surrounding the test tube containing the sample. The inductance L is evaluated from the measurement of the resonant frequency of a parallel LC circuit constructed with the coil and a capacitor of known capacitance. The coil can be moved vertically along the tube at specified steps and time intervals, and from the knowledge of L as a function of vertical position and time, one can get an image of the particle concentration profiles at given instants of time. The performance of the device is tested against suspensions of spherical iron particles in the micrometer size range dispersed in silicone oil, with various initial concentrations of solids. The sedimentation profiles are then compared with the predictions of existing models for the settling of disperse systems of non-interacting particles.
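The inductance-recovery step described above follows from the resonance condition of a parallel LC circuit, f = 1/(2π√(LC)). A minimal sketch with hypothetical component values (the figures are illustrative, not the instrument's actual coil and capacitor):

```python
from math import pi, sqrt

def inductance_from_resonance(f_res_hz, capacitance_f):
    """Invert f = 1/(2*pi*sqrt(L*C)) to recover the coil inductance:
    L = 1 / ((2*pi*f)^2 * C)."""
    return 1.0 / ((2.0 * pi * f_res_hz) ** 2 * capacitance_f)

# Round-trip check: a hypothetical 250 uH coil with a 10 nF capacitor
L_true, C = 250e-6, 10e-9
f_res = 1.0 / (2.0 * pi * sqrt(L_true * C))       # ~100.7 kHz
L_est = inductance_from_resonance(f_res, C)       # recovers L_true
```

In the instrument, L (and hence the local particle concentration) is inferred this way at each vertical coil position as settling proceeds.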

  5. Description and performance of a fully automatic device for the study of the sedimentation of magnetic suspensions.

    Science.gov (United States)

    Iglesias, G R; López-López, M T; Delgado, A V; Durán, J D G

    2011-07-01

    In this paper we describe an experimental setup for the automatic determination of the sedimentation behavior of magnetic suspensions (i.e., disperse systems consisting of ferro- or ferrimagnetic particles in a suitable fluid) of arbitrary volume fraction of solids. The device is based on the evaluation of the inductance of a thin coil surrounding the test tube containing the sample. The inductance L is evaluated from the measurement of the resonant frequency of a parallel LC circuit constructed with the coil and a capacitor of known capacitance. The coil can be moved vertically along the tube at specified steps and time intervals, and from the knowledge of L as a function of vertical position and time, one can get an image of the particle concentration profiles at given instants of time. The performance of the device is tested against suspensions of spherical iron particles in the micrometer size range dispersed in silicone oil, with various initial concentrations of solids. The sedimentation profiles are then compared with the predictions of existing models for the settling of disperse systems of non-interacting particles. PMID:21806198

  6. Semi-automatic laboratory goniospectrometer system for performing multi-angular reflectance and polarization measurements for natural surfaces

    Science.gov (United States)

    Sun, Z. Q.; Wu, Z. F.; Zhao, Y. S.

    2014-01-01

    In this paper, the design and operation of the Northeast Normal University Laboratory Goniospectrometer System for performing multi-angular reflectance and polarization measurements under controlled illumination conditions are described. A semi-automatic arm, carried on a rotating circular ring, enables the acquisition of a large number of measurements of the surface Bidirectional Reflectance Factor (BRF) over the full hemisphere. In addition, a set of polarizing optics enables measurement of linear polarization over the spectrum from 350 nm to 2300 nm. Because of the stable measurement conditions in the laboratory, the BRF and the linear polarization have average uncertainties of 1% and of less than 5%, respectively, depending on the sample properties. The polarimetric accuracy of the instrument is below 0.01 in terms of the absolute value of the degree of linear polarization, established by measuring a Spectralon panel. This paper also presents the reflectance and polarization of snow, soil, sand, and ice measured during 2010-2013 in order to illustrate the system's stability and accuracy. These measurements are useful for understanding the scattering properties of natural surfaces on Earth.
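The degree of linear polarization quoted above can be computed from intensities measured behind a linear polarizer at three orientations. A minimal Stokes-parameter sketch; the three-angle scheme is a common convention and not necessarily the instrument's exact measurement procedure:

```python
from math import hypot

def degree_of_linear_polarization(i0, i45, i90):
    """DoLP from intensities behind a linear polarizer at 0, 45, 90 deg:
    I = i0 + i90, Q = i0 - i90, U = 2*i45 - i0 - i90,
    DoLP = sqrt(Q^2 + U^2) / I."""
    stokes_i = i0 + i90
    stokes_q = i0 - i90
    stokes_u = 2.0 * i45 - i0 - i90
    return hypot(stokes_q, stokes_u) / stokes_i

dolp_unpolarized = degree_of_linear_polarization(1.0, 1.0, 1.0)  # 0.0
dolp_full = degree_of_linear_polarization(1.0, 0.5, 0.0)         # 1.0
```

A Spectralon panel, being nearly depolarizing, should yield a DoLP close to zero; the instrument's stated polarimetric accuracy bounds how far from zero the measured value may drift.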

  7. Space-Based FPGA Radio Receiver Design, Debug, and Development of a Radiation-Tolerant Computing System

    Directory of Open Access Journals (Sweden)

    Zachary K. Baker

    2010-01-01

    Full Text Available Los Alamos has recently completed the latest in a series of Reconfigurable Software Radios, which incorporates several key innovations in both hardware design and algorithms. Due to our focus on satellite applications, each design must extract the best size, weight, and power performance possible from the ensemble of Commodity Off-the-Shelf (COTS parts available at the time of design. A large component of our work lies in determining if a given part will survive in space and how it will fail under various space radiation conditions. Using two Xilinx Virtex 4 FPGAs, we have achieved 1 TeraOps/second signal processing on a 1920 Megabit/second datastream. This processing capability enables very advanced algorithms such as our wideband RF compression scheme to operate at the source, allowing bandwidth-constrained applications to deliver previously unattainable performance. This paper will discuss the design of the payload, making electronics survivable in the radiation of space, and techniques for debug.

  8. Comparison of Automatic Classifiers’ Performances using Word-based Feature Extraction Techniques in an E-government setting

    OpenAIRE

    Marin Rodenas, Alfonso

    2011-01-01

    Nowadays email is commonly used by citizens to establish communication with their government. Among the received emails, governments deal with common queries and subjects that handling officers have to answer manually. Automatic classification of incoming emails makes it possible to increase communication efficiency by decreasing the delay between a query and its response. This thesis takes part within the IMAIL project, which aims to provide an automatic answering solution to th...

  9. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient. PMID:27176486
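The leave-one-speaker-out evaluation framework mentioned above partitions the data so that each speaker's utterances are held out exactly once, preventing speaker identity from leaking between training and test sets. A minimal sketch of the splitting logic (the data layout and names are illustrative):

```python
def leave_one_speaker_out(samples):
    """Yield (held_out_speaker, train, test) splits, holding out each
    speaker's samples exactly once. `samples` is a list of
    (speaker_id, features, label) tuples."""
    for held_out in sorted({spk for spk, _, _ in samples}):
        train = [s for s in samples if s[0] != held_out]
        test = [s for s in samples if s[0] == held_out]
        yield held_out, train, test

# Toy dataset: three speakers, eating-condition labels
data = [("s1", [0.1], "eating"), ("s1", [0.2], "not_eating"),
        ("s2", [0.3], "eating"), ("s3", [0.4], "not_eating")]
folds = list(leave_one_speaker_out(data))   # one fold per speaker
```

In the paper's setup, an SVM would be fit on each `train` split and scored on the corresponding `test` split, with recalls averaged over all folds.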

  10. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient. PMID:27176486

  11. Debugging Democracy

    OpenAIRE

    Alexander Likhotal

    2016-01-01

    Democracy was the most successful political idea of the 20th century. However since the beginning of the new century democracy has been clearly suffering from serious structural problems, rather than a few isolated ailments. Why has it run into trouble, can it be revived? In the consumption driven world people have started to be driven by the belief in economic prosperity as the guarantee of human freedom. As a result, human development and personal status have become hostages of economic per...

  12. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV systems in Sweden has been in operation since March 2002. The systems in the database are described in detail and energy production is continuously added in the form of monthly values. The energy production and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy program, this number is expected to increase to over 100 systems in the next few years. The new owners of PV systems are obliged to report the produced electricity to the authorities at least once a year. In this work we have studied different means of simplifying the collection of data. Four methods are defined: 1. Conversion of readings from energy meters, taken at arbitrary intervals, into monthly values. 2. Methods to handle data obtained with the monitoring systems provided by different inverter manufacturers. 3. Methods to acquire data from PV systems with energy meters reporting to the green certificate system. 4. Commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third methods rely on equipment expected to be installed in some PV systems for other reasons. Method 4 offers the possibility of a fully automatic collection method. The described GPRS systems are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD)

  13. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Full Text Available Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising because executing reverse code can restore the previous states of a program without state saving. Two methods of generating reverse code can be found in the literature: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate the non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. For non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
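The contrast with state saving can be illustrated with a toy example: for invertible updates, a debugger can emit an inverse operation instead of a snapshot. This sketch is ours and far simpler than the article's dynamic-analysis approach; it only models additive updates:

```python
class ReversibleVM:
    """Toy illustration of reverse-code generation: each executed
    operation appends an inverse operation to a log, so backtracking
    replays inverses instead of restoring saved state. Only additive
    updates are modelled; a real system handles arbitrary statements
    and non-deterministic interleavings."""

    def __init__(self):
        self.vars = {}
        self.reverse_log = []           # inverse ops, most recent last

    def add(self, name, k):
        self.vars[name] = self.vars.get(name, 0) + k
        self.reverse_log.append((name, -k))   # reverse code: subtract k

    def backtrack(self, steps=1):
        for _ in range(steps):
            name, inverse_k = self.reverse_log.pop()
            self.vars[name] += inverse_k

vm = ReversibleVM()
vm.add("x", 5)
vm.add("x", 3)      # x == 8
vm.backtrack()      # x back to 5
vm.backtrack()      # x back to 0
```

The memory advantage the article reports comes from exactly this trade: the log stores small inverse operations rather than full copies of program state.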

  14. The evaluation of the performance of the automatic exposure control system of some selected mammography facilities in the Greater Accra Region, Ghana

    International Nuclear Information System (INIS)

    Mammography aids in the early detection of breast cancer. Although very useful, X-rays carry an associated risk of inducing cancer, so mammography procedures should be optimized through appropriate processes such as the selection of exposure factors for an optimal image at a minimal dose to patients. The automatic exposure control (AEC) aids in the selection of exposure factors, thus controlling the amount of radiation to the breast, and automatically compensates for differences in breast thickness and density. The performance of the automatic exposure control systems of the mammography equipment, and the status of quality management systems, including quality assurance and quality control, of four (4) mammography facilities within the Greater Accra Region were assessed. In assessing the performance of the automatic exposure control system, short-term reproducibility tests and thickness and voltage compensation tests were carried out using breast-equivalent phantoms of various thicknesses. A half-value layer test, film reject analysis, and patient dose assessment were also performed. Analysis of the responses to the questionnaire administered to radiographers and supervisors of the selected facilities revealed that three (3) of the facilities have some aspects of a quality management system programme in place, but not effectively implemented. Measured optical densities from the various tests performed to evaluate the performance of the automatic exposure control systems revealed that the AEC compensates for the different phantom thicknesses and tube voltages (kV) by producing comparable optical densities. Some of the measured optical densities were within the recommended range of 1.5 OD - 1.9 OD; the highest value was 0.13 OD above the upper limit of 1.9 OD. The film reject analysis showed that patient motion accounted for the largest share (28%) of film rejects, with other factors, such as films that were too light, also contributing.

  15. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and costly natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events and for the practical purpose of precisely assessing inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, which moreover offer a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide an accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each with a particular signature in the presence of flooding, requires modelling the behavior of the different objects in the scene in order to associate them with flood or no-flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information, or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
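As a toy illustration of the kind of fusion a Bayesian network performs here, the sketch below combines a SAR backscatter cue with an ancillary elevation cue under a naive factorization; all probability tables are invented for the example and are not taken from the paper.

```python
# Toy flood/no-flood fusion under a naive Bayes factorization:
# P(flood | sar, elev) ∝ P(flood) * P(sar | flood) * P(elev | flood).
# All numbers below are made up purely for illustration.

P_flood = 0.2
P_sar_dark = {True: 0.8, False: 0.1}   # P(dark backscatter | flood state)
P_elev_low = {True: 0.9, False: 0.3}   # P(low elevation     | flood state)

def flood_posterior(sar_dark, elev_low):
    def joint(flood):
        prior = P_flood if flood else 1 - P_flood
        l_sar = P_sar_dark[flood] if sar_dark else 1 - P_sar_dark[flood]
        l_elev = P_elev_low[flood] if elev_low else 1 - P_elev_low[flood]
        return prior * l_sar * l_elev
    num = joint(True)
    return num / (num + joint(False))

print(round(flood_posterior(True, True), 3))   # both cues agree: high posterior
print(round(flood_posterior(True, False), 3))  # terrain contradicts SAR: lower
```

The value of the ancillary data is visible in the second call: the same dark SAR pixel receives a much lower flood probability once the terrain cue argues against inundation.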

  16. Application of remote debugging techniques in user-centric job monitoring

    International Nuclear Information System (INIS)

    With the Job Execution Monitor, a user-centric job monitoring software developed at the University of Wuppertal and integrated into the job brokerage systems of the WLCG, job progress and grid worker node health can be supervised in real time. Imminent error conditions can thus be detected early by the submitter and countermeasures can be taken. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job misbehaviour. To remove the last 'blind spot' from this monitoring, a remote debugging technique based on the GNU C compiler suite was developed and integrated into the software; its design concept and architecture are described in this paper and its application discussed.

  17. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect such that new faults may be introduced. In this paper, we first show how to incorporate testing effort function and fault introduction into FDP and then develop FCP as delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
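One common paired-model form consistent with the description above (though not necessarily the paper's exact equations) treats fault detection as driven by the cumulative testing effort W(t) and fault correction as detection delayed by a correction lag. A minimal sketch, with every parameter value assumed:

```python
import math

# Hedged sketch of a paired FDP/FCP model: detection follows
# m_d(t) = a * (1 - exp(-b * W(t))), with W(t) a cumulative testing-effort
# function (here an illustrative logistic curve), and correction is
# detection delayed by a constant lag. All parameters are assumptions.

a, b = 100.0, 0.05          # total fault content, detection efficiency

def W(t):
    # Cumulative testing effort (illustrative logistic shape).
    return 200.0 / (1.0 + math.exp(-0.1 * (t - 30.0)))

def m_detect(t):
    # Expected number of faults detected by time t.
    return a * (1.0 - math.exp(-b * W(t)))

def m_correct(t, lag=5.0):
    # Delayed FDP: faults are corrected `lag` time units after detection.
    return m_detect(t - lag) if t >= lag else 0.0

for t in (10, 30, 60):
    print(t, round(m_detect(t), 1), round(m_correct(t), 1))
```

The imperfect-debugging and fault-introduction variants in the paper would replace the constant fault content a with a time-dependent term; the delayed-FDP structure stays the same.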

  18. Study on triterpenoic acids distribution in Ganoderma mushrooms by automatic multiple development high performance thin layer chromatographic fingerprint analysis.

    Science.gov (United States)

    Yan, Yu-Zhen; Xie, Pei-Shan; Lam, Wai-Kei; Chui, Eddie; Yu, Qiong-Xi

    2010-01-01

    Ganoderma--"Lingzhi" in Chinese--is one of the superior Chinese tonic materia medicas in China, Japan, and Korea. Two species, Ganoderma lucidum (Red Lingzhi) and G. sinense (Purple Lingzhi), have been included in the Chinese Pharmacopoeia since its 2000 Edition. However, some other species of Ganoderma are also available in the market. For example, there are five species distinguished by color, called "Penta-colors Lingzhi", that have been advocated as the most invigorating among the Lingzhi species; but there is no scientific evidence for such a claim. Morphological identification can serve as an effective practice for differentiating the various species, but the inherent quality has to be delineated by chemical analysis. Among the diverse constituents in Lingzhi, triterpenoids are commonly recognized as the major active ingredients. An automatic triple-development HPTLC fingerprint analysis was carried out to detect the distribution consistency of the triterpenoic acids in various Lingzhi samples. The chromatographic conditions were optimized as follows: stationary phase, precoated HPTLC silica gel 60 plate; mobile phase, toluene-ethyl acetate-methanol-formic acid (15 + 15 + 1 + 0.1); and triple development using automatic multiple development equipment. The chromatograms showed good resolution, and the color images provided more specific HPTLC fingerprints than have been previously published. It was observed that the abundance of triterpenoic acids and the consistency of the fingerprint pattern in Red Lingzhi (fruiting body of G. lucidum) outweigh those of the other Lingzhi species. PMID:21140647

  19. Orbital welding automatic pressure test by ODA automatic machines is 35 years old

    International Nuclear Information System (INIS)

    A review of the development of the technology and equipment for automatic orbital welding with automatic pressure testing of nuclear power station pipelines and objects of other purposes is presented. Welding variants with automatic pressure testing and the various welding machines are described. The priority of the national developments is emphasized

  20. Automatic 2D scintillation camera and computed tomography whole-body image registration to perform dosimetric calculations

    International Nuclear Information System (INIS)

    Full text: In this work a software tool that has been developed to allow automatic registration of 2D Scintillation Camera (SC) and Computed Tomography (CT) images is presented. This tool, used with dosimetric software that takes Integrated Activity or Residence Time as input data, allows the user to advise physicians on the effects of radiodiagnostic or radiotherapeutic practices that involve nuclear medicine 'open sources'. Images are registered locally and globally by maximizing the Mutual Information coefficient between the regions being registered. In the regional case, whole-body images are segmented into five regions: head, thorax, pelvis, left and right legs. Each region has its own registration parameters, which are optimized through the Powell-Brent minimization method, which maximizes the Mutual Information coefficient. This software tool allows the user to draw ROIs, input isotope characteristics, and finally calculate the Integrated Activity or Residence Time in one or many specific organs. These last values can be introduced into many dosimetric software packages to finally obtain Absorbed Dose values. (author)
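As a rough illustration of the registration criterion described above, the following sketch computes a histogram-based mutual information coefficient between two images. The tool's actual regional Powell-Brent optimization is not reproduced; the function names, bin count, and test images are assumptions.

```python
import numpy as np

# Histogram-based mutual information, the quantity such registration
# methods maximize over the transform parameters (illustrative sketch).

def mutual_information(img_a, img_b, bins=32):
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()                 # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)     # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                            # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
ct = rng.random((64, 64))
aligned = ct * 0.5 + 0.1                    # intensity-remapped but aligned
shuffled = rng.permutation(ct.ravel()).reshape(64, 64)

# An aligned pair shares far more information than a scrambled one, even
# though the modalities' intensity scales differ -- the property that makes
# MI suitable for multi-modal (SC/CT) registration.
print(mutual_information(ct, aligned) > mutual_information(ct, shuffled))
```

An optimizer such as Powell's method would evaluate this score repeatedly while varying each region's translation and rotation parameters, keeping the transform with the highest MI.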

  1. Automatic 2D scintillation camera and computed tomography whole-body image registration to perform dosimetry calculation

    International Nuclear Information System (INIS)

    In this paper we present a software tool that has been developed to allow automatic registration of 2D Scintillation Camera (SC) and Computed Tomography (CT) images. This tool, used with dosimetric software that takes Integrated Activity or Residence Time as input data, allows the user to advise physicians on the effects of radiodiagnostic or radiotherapeutic practices. Images are registered locally and globally by maximizing the Mutual Information coefficient between the regions being registered. In the regional case, whole-body images are segmented into five regions: head, thorax, pelvis, left and right legs. Each region has its own registration parameters, which are optimized through the Powell-Brent minimization method, which maximizes the Mutual Information coefficient. This software tool allows the user to draw ROIs, input isotope characteristics, and finally calculate the Integrated Activity or Residence Time in one or many specific organs. These last values can be introduced into many dosimetric software packages to finally obtain Absorbed Dose values

  2. Assessing the Performance of Automatic Speech Recognition Systems When Used by Native and Non-Native Speakers of Three Major Languages in Dictation Workflows

    DEFF Research Database (Denmark)

    Zapata, Julián; Kirkedal, Andreas Søeborg

    In this paper, we report on a two-part experiment aiming to assess and compare the performance of two types of automatic speech recognition (ASR) systems on two different computational platforms when used to augment dictation workflows. The experiment was performed with a sample of speakers of...... three major languages and with different linguistic profiles: non-native English speakers; non-native French speakers; and native Spanish speakers. The main objective of this experiment is to examine ASR performance in translation dictation (TD) and medical dictation (MD) workflows without manual...... transcription vs. with transcription. We discuss the advantages and drawbacks of a particular ASR approach in different computational platforms when used by various speakers of a given language, who may have different accents and levels of proficiency in that language, and who may have different levels of...

  3. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations for and the different algorithms of automatic document summarization (ADS), and surveys the recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS as well as current applications and trends. The approaches covered are statistical, linguistic, and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of automatic document summarization are not recent, while powerful algorithms have recently been developed

  4. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications that involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD), for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC), for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real-Time Executive (RTEXEC) that support unit and integrated testing are explored

  5. In-situ FPGA debug driven by on-board microcontroller

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Zachary Kent [Los Alamos National Laboratory

    2009-01-01

    Often we are faced with the situation that the behavior of a circuit changes in an unpredictable way when the chassis cover is attached or the system is not easily accessible. For instance, in a deployed environment, such as space, hardware can malfunction in unpredictable ways. What can a designer do to ascertain the cause of the problem? Register interrogations only go so far, and sometimes the problem being debugged lies in the register transactions themselves, or in the FPGA programming. This work provides a solution: the ability to drive a JTAG chain via an on-board microcontroller and to use a simple clone of the Xilinx ChipScope core, without a Xilinx JTAG cable or any external interfaces. We have demonstrated the functionality of the prototype system using a Xilinx Spartan-3E FPGA and a Microchip PIC18F2550 microcontroller. This paper discusses the implementation details and presents case studies describing how the tools have aided satellite hardware development.

  6. Performance evaluation of an automatic positioning system for photovoltaic panels; Avaliacao de desempenho de um sistema de posicionamento automatico para paineis fotovoltaicos

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Alceu Ferreira; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Fac. de Engenharia], Emails: alceu@feb.unesp.br, jacagnon@feb.unesp.br

    2009-07-01

    The need to use electric energy in localities not served by the utilities motivated the development of this research, whose main focus was photovoltaic systems and the search for better performance of these systems by keeping the solar panels positioned toward the sun. This work presents the performance evaluation of an automatic positioning system for photovoltaic panels, taking into account the increase in electric energy generation and the implementation costs. A simplified electromechanical device was designed that is able to support and move a photovoltaic panel over the day and over the year, keeping its surface aimed at the sun's rays, without using sensors and with optimized movement, since the adjustment of the panel's inclination takes place only once a day. The results obtained indicate that the proposal is viable, showing a cost compatible with the increase in electricity generation. (author)

  7. Research on engineering debugging methods for single-feed circularly polarized microstrip antennas

    Institute of Scientific and Technical Information of China (English)

    于家傲; 陈文君; 袁靖; 鞠志忠

    2014-01-01

    To resolve discrepancies between the engineering implementation of a single-feed circularly polarized microstrip antenna and its design, such as inconsistent resonant frequency points and a degraded axial ratio, this paper proposes two kinds of engineering debugging schemes for this type of antenna, based on single-feed circularly polarized antenna theory, and simulates them using HFSS software. The simulation results show that debugging at different positions on the antenna can separately optimize the resonant frequency, reflection coefficient, axial ratio, and other antenna performance figures, which provides guidance for the engineering debugging of this class of antenna.

  8. 75 FR 30159 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Science.gov (United States)

    2010-05-28

    ...) Out Performance Requirements To Support Air Traffic Control (ATC) Service; Final Rule #0;#0;Federal...--Broadcast (ADS-B) Out Performance Requirements To Support Air Traffic Control (ATC) Service AGENCY: Federal..., published in the Federal Register on October 5, 2007 (72 FR 56947), Congress enacted the ``Century...

  9. 75 FR 37712 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Science.gov (United States)

    2010-06-30

    ... (ADS-B) Out Performance Requirements To Support Air Traffic Control (ATC) Service; Technical Amendment... Marking, and Miscellaneous Amendments'' (74 FR 53368) in which the FAA revised part 21 subpart O. As part... Surveillance--Broadcast (ADS-B) Out Performance Requirements To Support Air Traffic Control (ATC) Service''...

  10. Diagnostic performance of a commercially available computer-aided diagnosis system for automatic detection of pulmonary nodules: comparison with single and double reading

    International Nuclear Information System (INIS)

    Objective: To assess the diagnostic performance of a commercially available computer-aided diagnosis (CAD) system for automatic detection of pulmonary nodules on multi-row detector CT scans, compared to single and double reading by radiologists. Materials and Methods: A CAD system for automatic nodule detection (Siemens LungCare NEV VB10) was applied to four-detector-row low-dose CT (LDCT) performed on nine patients with pulmonary metastases and compared to the findings of three radiologists. A standard-dose CT (SDCT) was acquired simultaneously and used to establish the reference database. The study design was approved by the Institutional Review Board and the appropriate German authorities. The reference database consisted of 457 nodules (mean size 3.9±3.1 mm) and was established by fusing the sets of nodules detected by three radiologists independently reading LDCT and SDCT and by CAD. An independent radiologist used thin slices to eliminate false-positive findings from the reference database. Results: An average sensitivity of 54% (range 51% to 55%) was observed for single reading by one radiologist. CAD demonstrated a similar sensitivity of 55%. Double reading by two radiologists increased the sensitivity to an average of 67% (range 67% to 68%). The difference from single reading was significant (p<0.001). CAD as a second opinion after single reading increased the sensitivity to 79% (range 77% to 81%), which proved to be significantly better than double reading (p<0.001). CAD produced more false-positive results (7.2%) than human readers, but this was acceptable in clinical routine. (orig.)

  11. 75 FR 37711 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Science.gov (United States)

    2010-06-30

    ... Support Air Traffic Control (ATC) Service'' (75 FR 30160). There are three footnotes in the preamble for... (ADS-B) Out Performance Requirements To Support Air Traffic Control (ATC) Service; Correction AGENCY... Amendments,'' published October 16, 2009 (74 FR 53368), the FAA revised part 21 subpart O, and Sec....

  12. Facial expressions as feedback cue in human-robot interaction - a comparison between human and automatic recognition performances

    OpenAIRE

    Lang, Christian; Wachsmuth, Sven; Wersing, Heiko; Hanheide, Marc

    2010-01-01

    Facial expressions are one important nonverbal communication cue, as they can provide feedback in conversations between people and also in human-robot interaction. This paper presents an evaluation of three standard pattern recognition techniques (active appearance models, Gabor energy filters, and raw images) for facial feedback interpretation in terms of valence (success and failure) and compares the results to the human performance. The database used contains videos of people interacting w...

  13. A finger-pressure debugging method for RF broadband products

    Institute of Scientific and Technical Information of China (English)

    胡志山

    2014-01-01

    The transmission loss of RF broadband products is a key parameter in production debugging and sample trial production, with the reflection loss at each port being the greatest debugging difficulty. Usually a grounded compensation capacitor must be added to the circuit to optimize these parameters; however, the position and value of the compensation capacitor can only be determined by a stepwise-approximation trial method and are difficult to pin down quickly. With today's labor shortages and intense price competition, this has become a stumbling block for many manufacturers. Drawing on years of production experience, the author has developed a finger-pressure debugging method that quickly and easily determines the position and value of the compensation capacitor, and is well suited to adoption on production and engineering lines.
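The stepwise-approximation search that the finger-pressure method replaces can be sketched as a one-dimensional minimization of reflection against capacitance. The return-loss model and every value below are invented for illustration; a real bench measurement would replace `reflection_db`.

```python
# Toy model of stepwise approximation toward the compensation capacitance:
# reflection worsens quadratically away from an assumed optimum C_OPT_PF.

C_OPT_PF = 4.7   # assumed "true" best capacitance, picofarads

def reflection_db(c_pf):
    # Stand-in for a measured port reflection (lower is better).
    return -30.0 + 2.0 * (c_pf - C_OPT_PF) ** 2

def stepwise_search(lo=1.0, hi=10.0, tol=0.05):
    # Successively narrow the capacitance interval around the minimum,
    # comparing two trial values per step (ternary-style interval shrink).
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if reflection_db(m1) < reflection_db(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

best = stepwise_search()
print(round(best, 1))   # converges near the assumed optimum
```

Each loop iteration corresponds to one solder-and-measure cycle on the bench, which is why the abstract calls the trial method slow and why a finger-pressure probe that varies the effective capacitance continuously is attractive.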

  14. Data Provenance Inference in Logic Programming: Reducing Effort of Instance-driven Debugging

    NARCIS (Netherlands)

    Huq, Mohammad Rezwanul; Mileo, Alessandra; Wombacher, Andreas

    2013-01-01

    Data provenance allows scientists in different domains to validate their models and algorithms and to find anomalies and unexpected behaviors. In previous works, we described on-the-fly interpretation of (Python) scripts to build a workflow provenance graph automatically and then infer fine-grained pro

  15. Performance evaluation and operational experience with a semi-automatic monitor for the radiological characterization of low-level wastes

    International Nuclear Information System (INIS)

    Chalk River Nuclear Laboratories (CRNL) have undertaken a Waste Disposal Project to co-ordinate the transition from the current practice of interim storage to permanent disposal for low-level radioactive wastes (LLW). The strategy of the project is to classify and segregate waste segments according to their hazardous radioactive lifetimes and to emplace them in disposal facilities engineered to isolate and contain them. To support this strategy, a waste characterization program was set up to estimate the volume and radioisotope inventories of the wastes managed by CRNL. A key element of the program is the demonstration of a non-invasive measurement technique for the isotope-specific characterization of solid LLW. This paper describes the approach taken at CRNL for the non-invasive assay of LLW and the field performance and early operational experience with a waste characterization monitor to be used in a waste processing facility

  16. Does the amount of tagged stool and fluid significantly affect the radiation exposure in low-dose CT colonography performed with an automatic exposure control?

    International Nuclear Information System (INIS)

    To determine whether the amount of tagged stool and fluid significantly affects the radiation exposure in low-dose screening CT colonography performed with an automatic tube-current modulation technique. The study included 311 patients. The tagging agent was barium (n = 271) or iodine (n = 40). Correlation was measured between mean volume CT dose index (CTDIvol) and the estimated x-ray attenuation of the tagged stool and fluid (ATT). Multiple linear regression analyses were performed to determine the effect of ATT on CTDIvol and the effect of ATT on image noise while adjusting for other variables including abdominal circumference. CTDIvol varied from 0.88 to 2.54 mGy. There was no significant correlation between CTDIvol and ATT (p = 0.61). ATT did not significantly affect CTDIvol (p = 0.93), while abdominal circumference was the only factor significantly affecting CTDIvol (p < 0.001). Image noise ranged from 59.5 to 64.1 HU. The p value for the regression model explaining the noise was 0.38. The amount of stool and fluid tagging does not significantly affect radiation exposure. (orig.)
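The adjustment described (regressing CTDIvol on tagging attenuation while controlling for abdominal circumference) can be sketched with an ordinary least-squares fit. The data below are synthetic, constructed so that, mirroring the study's conclusion, only circumference drives dose; all scales and coefficients are assumptions.

```python
import numpy as np

# Synthetic multiple linear regression: CTDIvol ~ intercept + ATT + circumference.
# By construction ATT plays no role, so its fitted coefficient should be ~0.

rng = np.random.default_rng(1)
n = 311
circumference = rng.normal(90.0, 10.0, n)             # cm (assumed scale)
att = rng.normal(300.0, 50.0, n)                      # HU (assumed scale)
ctdi = 0.02 * circumference + rng.normal(0, 0.05, n)  # ATT absent from model

X = np.column_stack([np.ones(n), att, circumference])
coef, *_ = np.linalg.lstsq(X, ctdi, rcond=None)
print(abs(coef[1]) < 0.01)   # ATT coefficient is ~0 in this synthetic setup
```

With an automatic exposure control, dose tracks patient girth; a near-zero ATT coefficient alongside a clearly positive circumference coefficient is the regression signature the study reports.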

  17. Automatic bone and plaque removal using dual energy CT for head and neck angiography: Feasibility and initial performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, C., E-mail: Christoph.thomas@med.uni-tuebingen.d [Department of Diagnostic and Interventional Radiology, University of Tuebingen, Hoppe-Seyler-Strasse 3, 72076 Tuebingen (Germany); Korn, A. [Department of Neuroradiology, University of Tuebingen, Tuebingen (Germany); Krauss, B. [Siemens Medical Solutions, Forchheim (Germany); Ketelsen, D.; Tsiflikas, I.; Reimann, A.; Brodoefel, H.; Claussen, C.D.; Kopp, A.F. [Department of Diagnostic and Interventional Radiology, University of Tuebingen, Hoppe-Seyler-Strasse 3, 72076 Tuebingen (Germany); Ernemann, U. [Department of Neuroradiology, University of Tuebingen, Tuebingen (Germany); Heuschmid, M. [Department of Diagnostic and Interventional Radiology, University of Tuebingen, Hoppe-Seyler-Strasse 3, 72076 Tuebingen (Germany)

    2010-10-15

    Purpose: We sought to evaluate the feasibility and efficiency of dual energy (DE) bone and plaque removal in head and neck CT angiography. Materials and methods: 20 patients with suspected carotid stenoses received head and neck DE-CTA as part of their pre-interventional workup. Visual grading using multiplanar reformations (MPR), thick-slab maximum intensity projections (MIP), and quantitative vessel analysis (QVA) of stenoses was performed prior to and after DE bone removal. Results were evaluated for the detection of relevant stenoses (vessel area reduction >70%). Vessel segmentation errors were analyzed. Results: Segmentation errors occurred in 19% of all vessel segments. Nevertheless, most post-bone-removal artifacts could be recognized using the MPR technique for reading. Compared to MPR reading prior to bone removal, sensitivity, specificity, positive and negative predictive values after bone removal were 100%, 98%, 88% and 100% for MPR reading and 100%, 91%, 63% and 100% for exclusive MIP reading, respectively. There was good agreement between the QVA results prior to and after DE plaque removal (r{sup 2} = 0.8858). Conclusion: DE bone and plaque removal for head and neck angiography is feasible and offers a rapid and highly sensitive overview of vascular head and neck studies. Due to the slightly limited specificity of the MIP technique caused by segmentation errors, possible stenoses should be verified and graded using MPR techniques.

  18. Automatic bone and plaque removal using dual energy CT for head and neck angiography: Feasibility and initial performance evaluation

    International Nuclear Information System (INIS)

    Purpose: We sought to evaluate the feasibility and efficiency of dual energy (DE) bone and plaque removal in head and neck CT angiography. Materials and methods: 20 patients with suspected carotid stenoses received head and neck DE-CTA as part of their pre-interventional workup. Visual grading using multiplanar reformations (MPR), thick-slab maximum intensity projections (MIP), and quantitative vessel analysis (QVA) of stenoses was performed prior to and after DE bone removal. Results were evaluated for the detection of relevant stenoses (vessel area reduction >70%). Vessel segmentation errors were analyzed. Results: Segmentation errors occurred in 19% of all vessel segments. Nevertheless, most post-bone-removal artifacts could be recognized using the MPR technique for reading. Compared to MPR reading prior to bone removal, sensitivity, specificity, positive and negative predictive values after bone removal were 100%, 98%, 88% and 100% for MPR reading and 100%, 91%, 63% and 100% for exclusive MIP reading, respectively. There was good agreement between the QVA results prior to and after DE plaque removal (r2 = 0.8858). Conclusion: DE bone and plaque removal for head and neck angiography is feasible and offers a rapid and highly sensitive overview of vascular head and neck studies. Due to the slightly limited specificity of the MIP technique caused by segmentation errors, possible stenoses should be verified and graded using MPR techniques.

  19. Automatic identification approach for high-performance liquid chromatography-multiple reaction monitoring fatty acid global profiling.

    Science.gov (United States)

    Tie, Cai; Hu, Ting; Jia, Zhi-Xin; Zhang, Jin-Lan

    2015-08-18

    Fatty acids (FAs) are a group of lipid molecules that are essential to organisms. As potential biomarkers for different diseases, FAs have attracted increasing attention from both biological researchers and the pharmaceutical industry. A sensitive and accurate method for globally profiling and identifying FAs is required for biomarker discovery. The high selectivity and sensitivity of high-performance liquid chromatography-multiple reaction monitoring (HPLC-MRM) gives it great potential to fulfill the need to identify FAs from complicated matrices. This paper develops a new approach to global FA profiling and identification for HPLC-MRM FA data mining. Mathematical models for identifying FAs were built using the isotope-induced retention time (RT) shift (IRS) and the peak area ratios between parallel isotope peaks for a series of FA standards. The FA structures were predicted using another model based on the RT and molecular weight. Fully automated FA identification software was coded on the Qt platform based on these mathematical models. Different samples were used to verify the software. A high identification efficiency (greater than 75%) was observed when 96 FA species were identified in plasma. This FA identification strategy promises to accelerate FA research and applications. PMID:26189701
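A hypothetical sketch of the matching step implied by such models: pair light/heavy isotope peaks whose retention-time difference falls within a tolerance of the expected IRS and whose peak-area ratio lies within an accepted window. All thresholds and peak data below are invented, not taken from the paper.

```python
# Illustrative isotope-peak pairing using an assumed isotope-induced
# retention-time shift (IRS) and a peak-area ratio window.
# Peak tuples are (retention_time_min, m_over_z, area).

EXPECTED_IRS = 0.12       # assumed RT shift between isotope partners (min)
RT_TOL = 0.03             # tolerance on the observed shift
RATIO_RANGE = (0.8, 1.2)  # acceptable light/heavy area ratio

def pair_isotope_peaks(light_peaks, heavy_peaks):
    pairs = []
    for rt_l, mz_l, area_l in light_peaks:
        for rt_h, mz_h, area_h in heavy_peaks:
            irs_ok = abs((rt_h - rt_l) - EXPECTED_IRS) <= RT_TOL
            ratio_ok = RATIO_RANGE[0] <= area_l / area_h <= RATIO_RANGE[1]
            if irs_ok and ratio_ok:
                pairs.append((rt_l, rt_h, mz_l, mz_h))
    return pairs

light = [(5.20, 255.2, 1000.0), (7.90, 283.3, 500.0)]
heavy = [(5.31, 257.2, 1050.0), (8.40, 285.3, 900.0)]
print(pair_isotope_peaks(light, heavy))
```

Requiring both conditions simultaneously is what makes the identification automatic: a chance co-elution rarely satisfies the RT-shift and area-ratio constraints at once.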

  20. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    Science.gov (United States)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

    The main goal of the presented work was to develop a multifunctional beam composed of fiber-reinforced plastic (FRP) with an embedded optical fiber carrying various fiber Bragg grating (FBG) sensors. These beams are intended for use as structural members in bridges or industrial applications. It is now possible to realize large-scale cross sections, the embedding is part of a fully automated process, and jumpers can be omitted so as not to negatively influence the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analyses were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). Next to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.

  1. Development of an automated system for NPP operator support and monitoring of compliance with operating procedure requirements

    International Nuclear Information System (INIS)

    The report presents one of the options for solving the problem of effective support of the reactor operator and monitoring of compliance with maintenance operating procedure requirements, based on the methodology of a hybrid expert system (ES). The requirements of the operating documentation (OD) are entered in formalized form into the knowledge base (KB) of the expert system and are then used to check the operator's actions and to give him effective information support. The system functions are as follows: identification of the reactor status; extraction of the operator actions for a given situation from the system knowledge base of limitations; graphic representation of the present reactor status and the operator's action limitations; determination of the operator's conduct in a given situation; explanation to the operator of the logical output produced by the system; context assistance in the form of hypertext based on the operating documentation, available to the operator on request; determination of the permissible operator conduct in a given situation; recording and output of information on operator conduct that contradicts OD requirements; entering changes into the system data and knowledge base when corrections are needed; and a protocol of system operation. Expected effects: increased supervision efficiency for operator compliance with NPP maintenance operating procedures; effective support for the operator in various reactor operating modes; and the collection of statistical data to improve the OD.

  2. The yin and yang properties of pentatonic music in TCM music therapy: from the perspectives of mode and tempo

    Institute of Scientific and Technical Information of China (English)

    左志坚

    2016-01-01

    According to Chinese traditional music theory, the creation, performance, and musical language of Chinese music all reflect yin-yang thinking. Chinese traditional music is pentatonic, and the pentatonic modes it uses have yin and yang properties. Overall, the yin-yang properties of the modes can be divided into three categories: clear, largely clear, and unclear. Tempo has only a subtle influence on modes whose yin-yang property is clear, but plays a decisive role for modes whose yin-yang property is largely clear or unclear.

  3. Reproducing Context-sensitive Crashes in Mobile Apps using Crowdsourced Debugging

    OpenAIRE

    Gomez, Maria; Rouvoy, Romain; Seinturier, Lionel

    2015-01-01

    While the number of mobile apps published by app stores keeps increasing, the quality of these apps greatly varies. Unfortunately, end-users continue experiencing bugs and crashes for some of the apps installed on their mobile devices. Although developers heavily test their apps before release, context-sensitive crashes might still emerge after deployment. This paper therefore introduces MoTiF, a crowdsourced approach to support developers in automatically reproducing context-sensitive crashes...

  4. The Implementation of Remote Debugging on GNU/Hurd

    Institute of Scientific and Technical Information of China (English)

    陆岳

    2013-01-01

    GNU Hurd consists of a set of protocols and daemons that run on the GNU Mach microkernel; together they are intended to form the kernel of the GNU operating system. GDB is a widely used debugger that has been adapted to many platforms and operating systems, but remote debugging has not yet been implemented on GNU/Hurd. This paper analyzes in detail the Mach exception-handling model and the principles of the debugger's implementation, and introduces the implementation of the remote debugging tool gdbserver on GNU/Hurd.

  5. A Software Agent for Automatic Creation of a PLC Program

    Directory of Open Access Journals (Sweden)

    Walid M. Aly

    2008-01-01

    Full Text Available Using structured design techniques to design a Programmable Logic Controller (PLC) program decreases the time needed for debugging and produces concise, bug-free code. This study is concerned with the design of a software agent for the automatic creation of code for a PLC program that can be downloaded onto a Siemens Step 7 series controller. The code is generated according to the syntax rules of the AWL language; AWL is the abbreviation of the German word Anweisungsliste, which means instruction list. The proposed system uses an object-oriented approach to transfer the design specification into an object that adequately describes the system using the state-based design technique. The industrial system specifications are supplied by the user through a simple Graphical User Interface (GUI) environment. These specifications define the attribute values of an object-oriented class describing the control system; all the functions needed to generate the code are encapsulated in the class.
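The code-generation step described above can be illustrated with a minimal sketch. The rule format and the `to_awl` helper below are hypothetical illustrations, not the paper's actual agent; only the AWL mnemonics (U for AND, UN for AND NOT, = for coil assignment) follow the Siemens instruction-list syntax the abstract refers to.

```python
def to_awl(rules):
    """Emit AWL instruction-list text from a simple rule spec.

    Each rule is (output_operand, [(input_operand, negated), ...]); the
    inputs are ANDed together to drive the output coil.
    """
    lines = []
    for output, terms in rules:
        for operand, negated in terms:
            lines.append(("UN " if negated else "U  ") + operand)
        lines.append("=  " + output)
    return "\n".join(lines)

# Hypothetical example: Q0.0 is on when I0.0 is on and I0.1 is off.
rules = [("Q0.0", [("I0.0", False), ("I0.1", True)])]
print(to_awl(rules))
```

A real agent of the kind the abstract describes would derive such rules from the state-based design object rather than take them verbatim from the user.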

  6. Debugging and Realization of ActiveX Scripting

    Institute of Scientific and Technical Information of China (English)

    侯迎春

    2003-01-01

    ActiveX Scripting is a component of Microsoft's ActiveX. Microsoft's ActiveX scripting architecture comprises five parts: the Host Application, the Language Engine, the Process Debug Manager, the Machine Debug Manager, and the Application Debugger. To build the debugging facility, the Host Application and Application Debugger frameworks are established first, and the script is then executed in the language engine.

  7. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  8. Handling Conflicts in Depth-First Search for LTL Tableau to Debug Compliance Based Languages

    Directory of Open Access Journals (Sweden)

    Francois Hantry

    2011-09-01

    Full Text Available Providing adequate tools to tackle the problem of inconsistent compliance rules is a critical research topic. This problem is of paramount importance to achieve automatic support for early declarative design and to support evolution of rules in contract-based or service-based systems. In this paper we investigate the problem of extracting temporal unsatisfiable cores in order to detect the inconsistent part of a specification. We extend conflict-driven SAT-solver to provide a new conflict-driven depth-first-search solver for temporal logic. We use this solver to compute LTL unsatisfiable cores without re-exploring the history of the solver.

  9. Handling Conflicts in Depth-First Search for LTL Tableau to Debug Compliance Based Languages

    CERN Document Server

    Hantry, Francois; 10.4204/EPTCS.68.5

    2011-01-01

    Providing adequate tools to tackle the problem of inconsistent compliance rules is a critical research topic. This problem is of paramount importance to achieve automatic support for early declarative design and to support evolution of rules in contract-based or service-based systems. In this paper we investigate the problem of extracting temporal unsatisfiable cores in order to detect the inconsistent part of a specification. We extend conflict-driven SAT-solver to provide a new conflict-driven depth-first-search solver for temporal logic. We use this solver to compute LTL unsatisfiable cores without re-exploring the history of the solver.

  10. An electronically controlled automatic security access gate

    Directory of Open Access Journals (Sweden)

    Jonathan A. ENOKELA

    2014-11-01

    Full Text Available The security challenges encountered in many places require electronic means of controlling access to communities, recreational centres, offices, and homes. The electronically controlled automated security access gate proposed in this work helps to prevent unwanted access to controlled environments. This is achieved mainly through the use of a Radio Frequency (RF) transmitter-receiver pair. In the design, a microcontroller is programmed to decode a given sequence of keys entered on a keypad and commands a transmitter module to send out this code as a signal at a given radio frequency. Upon reception of this RF signal by the receiver module, another microcontroller activates driver circuitry to operate the gate automatically. The code for the microcontrollers was written in C and was debugged and compiled using the KEIL µVision 4 integrated development environment. The resultant hex files were programmed into the memories of the microcontrollers with the aid of a universal programmer. Software simulation was carried out using Proteus Virtual System Modeling (VSM) version 7.7. A scaled-down prototype of the system was built and tested. The electronically controlled automated security access gate can be useful in providing security for homes, organizations, and automobile terminals. The four-character password required to operate the gate gives the system an increased level of security, and owing to its standalone operation the system is cheap to maintain in comparison with a manually operated type.

  11. The Debugging of the X-ray High-Voltage Generator

    Institute of Scientific and Technical Information of China (English)

    戴丹; 王魏; 戴竞; 郭永平; 徐月萍; 高建全; 张春潮; 叶践

    2012-01-01

    With the wide use of X-ray examinations and people's growing attention to their health, more and more people undergo X-ray examinations. Based on the structure and working principle of the X-ray machine, this paper briefly discusses the debugging of the X-ray high-voltage generator.

  12. Evaluation of the performance of the ABBOTT ARCHITECT C16000 automatic biochemistry analyzer

    Institute of Scientific and Technical Information of China (English)

    张娟; 蒋小燕; 李顺君; 黄文芳

    2014-01-01

    Objective To evaluate the main performance characteristics of the ABBOTT ARCHITECT C16000 biochemistry analyzer and to judge whether its performance meets the laboratory's requirements. Methods According to clinical laboratory management methods and the requirements of national laboratory accreditation, the precision, accuracy, and linearity of 17 test items (Urea, Cre, UA, Glu, etc.) were analyzed following the CLSI EP5-A2, EP9-A2, and EP6-A documents, and the quoted reference ranges of the 17 test items were verified. Results The coefficient of variation (CV) of within-batch precision for Urea, Cre, UA, Glu, etc. was ≤1/4 of the CLIA'88 standard, and the CV of between-batch precision was ≤1/3 of the CLIA'88 standard; in the accuracy test, the relative bias of the 17 test items was ≤1/2 of the CLIA'88 standard; the linearity of the 17 items was good (r2 >0.95); and the quoted reference range of each test item was suitable. Conclusion The performance of the ABBOTT ARCHITECT C16000 automatic biochemistry analyzer meets the laboratory's needs.

  13. Improvement in the performance of CAD for the Alzheimer-type dementia based on automatic extraction of temporal lobe from coronal MR images

    International Nuclear Information System (INIS)

    In this study, we extracted whole-brain and temporal lobe images from MR images (26 healthy elderly controls and 34 Alzheimer-type dementia patients) by means of binarization, mask processing, template matching, Hough transformation, boundary tracing, etc. We assessed the extraction accuracy by comparing the extracted images to images extracted by a radiological technologist. The agreement rates were: whole-brain images 91.3±4.3%, right temporal lobe 83.3±6.9%, and left temporal lobe 83.7±7.6%. Furthermore, discriminant analysis using 6 textural features demonstrated a sensitivity and specificity of 100% when the healthy elderly controls were compared to the Alzheimer-type dementia patients. Our research shows the possibility of automatic, objective diagnosis of temporal lobe abnormalities using automatically extracted images of the temporal lobes. (author)

  14. Automatic Speaker Recognition System

    Directory of Open Access Journals (Sweden)

    Parul; R. B. Dubey

    2012-12-01

    Full Text Available Spoken language is used by humans to convey many types of information. Primarily, speech conveys messages via words. Owing to advances in speech technology, people's interactions with remote machines, such as phone banking, internet browsing, and secured information retrieval by voice, are becoming popular today. Speaker verification and speaker identification are important for authentication and verification for security purposes. Speaker identification methods can be divided into text-independent and text-dependent. Speaker recognition is the process of automatically recognizing a speaker's voice on the basis of individual information included in the input speech waves. It consists of comparing a speech signal from an unknown speaker to a set of stored data of known speakers, recognizing who has spoken by matching the input signal with pre-stored samples. This work focuses on improving the performance of speaker verification under noisy conditions.

  15. On the Debugging of High-Speed Railway Power Supply SCADA Systems

    Institute of Scientific and Technical Information of China (English)

    曾亮

    2015-01-01

    In order to standardize the remote debugging and acceptance of railway power supply SCADA systems and to eliminate the safety hazards present in equipment after SCADA takeover, this paper draws on debugging experience with SCADA systems on existing lines and on the operational requirements of such systems, and focuses on the content, requirements, and procedures of SCADA system debugging. It has practical guiding significance.

  16. Calculation and Debugging of Current in Short-Circuit Tests of Transformers

    Institute of Scientific and Technical Information of China (English)

    杨治业

    2001-01-01

    The method for calculating the current in transformer short-circuit tests and its error are analyzed and presented, and problems related to current debugging in short-circuit tests are discussed.

  17. Differential Protection Debugging of DGT801 Series Digital Transformers

    Institute of Scientific and Technical Information of China (English)

    何霞

    2012-01-01

      The differential protection of the DGT801 series digital transformer can get stuck during on-site installation and debugging. To meet future needs in debugging and accident handling, this paper analyzes the wiring and algorithm of the DGT801 series of domestic microprocessor-based transformer differential protection. Taking the dual-transformer differential protection of a hydropower station as an example, it discusses how to use a three-phase protection calibrator to perform a ratio-restraint braking test, through which this type of protection can be debugged.

  18. Commission Debugging of Air-Cooled Island Systems for 600 MW Air-Cooled Units

    Institute of Scientific and Technical Information of China (English)

    杨海生; 李路江; 吴瑞涛; 刘春报; 刘红霞

    2009-01-01

    The problems that appeared during commission debugging of the air-cooled island systems of two 600 MW air-cooled units of Guodian Longshan Power Generation Co Ltd are analyzed, such as an unduly fast vacuum drop in the air-cooled island during unit start-up after shutdown in winter, and a rapid rise of back-pressure in the air-cooled island caused by full-load operation of all cooling air fans and air leakage in the steam seal system. The preventive measures adopted for these problems during commission debugging are given, and concrete recommendations are put forward for some problems that could not be solved during debugging.

  19. A Debugging Method for Airborne Distributed Software Based on Communication Events

    Institute of Scientific and Technical Information of China (English)

    张树兵; 叶宏

    2011-01-01

    Controlling and debugging the distributed cooperation among multiple application partitions is the key point in the development and integration of multiple application partitions, which is one of the core steps in developing an IMA system. We introduce the concept of a communication event to abstract inter-partition communication activities, and then put forward a debugging approach, the communication event debugging method, which is good at synchronously controlling and cooperatively debugging the source partition and destination partition of an inter-partition communication. Furthermore, we introduce a hybrid debugging method that effectively combines the communication event debugging method with code debugging, and discuss its debugging process and the advantages and disadvantages it has shown in practice.

  20. Automatic input rectification

    OpenAIRE

    Long, Fan; Ganesh, Vijay; Carbin, Michael James; Sidiroglou, Stelios; Rinard, Martin

    2012-01-01

    We present a novel technique, automatic input rectification, and a prototype implementation, SOAP. SOAP learns a set of constraints characterizing typical inputs that an application is highly likely to process correctly. When given an atypical input that does not satisfy these constraints, SOAP automatically rectifies the input (i.e., changes the input so that it satisfies the learned constraints). The goal is to automatically convert potentially dangerous inputs into typical inputs that the ...
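As a rough illustration of the idea in this record (learn constraints from typical inputs, then rectify atypical ones to satisfy them), the sketch below infers per-field numeric bounds from training samples and clamps out-of-range values. The field names and the clamping rule are invented for illustration and are far simpler than SOAP's learned constraints.

```python
def learn_bounds(typical_inputs):
    """Learn per-field (min, max) constraints from inputs the
    application is known to process correctly."""
    keys = typical_inputs[0].keys()
    return {k: (min(s[k] for s in typical_inputs),
                max(s[k] for s in typical_inputs)) for k in keys}

def rectify(inp, bounds):
    """Clamp each field of an atypical input into its learned range,
    leaving unknown fields untouched."""
    return {k: max(bounds[k][0], min(v, bounds[k][1])) if k in bounds else v
            for k, v in inp.items()}

# Hypothetical usage: image dimensions learned from two typical inputs.
bounds = learn_bounds([{"width": 64, "height": 48},
                       {"width": 1024, "height": 768}])
print(rectify({"width": 10**9, "height": 600}, bounds))
```

The oversized `width` is pulled back inside the learned range, which is the rectification step the abstract describes: a potentially dangerous input is converted into one that satisfies the constraints inferred from typical inputs.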

  1. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions, built into an economic system, that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, a progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, the payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  2. Face-Based Automatic Personality Perception

    OpenAIRE

    Al Moubayed, Noura; Vazquez-Alvarez, Yolanda; McKay, Alex; Vinciarelli, Alessandro

    2014-01-01

    Automatic Personality Perception is the task of automatically predicting the personality traits people attribute to others. This work presents experiments where such a task is performed by mapping facial appearance into the Big-Five personality traits, namely Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism. The experiments are performed over the pictures of the FERET corpus, originally collected for biometrics purposes, for a total of 829 individuals. The results show...

  3. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
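For readers new to the topic, the chain-rule propagation that distinguishes automatic differentiation from symbolic and numeric differentiation can be shown with a minimal forward-mode sketch using dual numbers. This toy class is illustrative and not drawn from the bibliography itself.

```python
class Dual:
    """Dual number carrying a value and its derivative through the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _coerce(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._coerce(o)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._coerce(o)
        # product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def f(x):
    return x * x * x + 2 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

y = f(Dual(3.0, 1.0))  # seed derivative 1.0 to differentiate w.r.t. x
print(y.val, y.der)    # f(3) = 33, f'(3) = 29
```

The derivative emerges exactly (to floating point), without symbolic manipulation or finite-difference approximation, which is precisely the "neither symbolic nor numeric" property the record mentions.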

  4. JOSTRAN: An Interactive Joss Dialect for Writing and Debugging Fortran Programs.

    Science.gov (United States)

    Graham, W. R.; Macneilage, D. C.

    JOSTRAN is a JOSS dialect that expedites the construction of FORTRAN programs. JOSS is an interactive, on-line computer system. JOSS language programs are list-processed; i.e., each statement is interpreted at execution time. FORTRAN is the principal language for programming digital computers to perform numerical calculations. The JOSS language…

  5. New hardware support transactional memory and parallel debugging in multicore processors

    OpenAIRE

    Orosa Nogueira, Lois

    2013-01-01

    This thesis contributes to the area of hardware support for parallel programming by introducing new hardware elements in multicore processors, with the aim of improving performance and optimizing new tools, abstractions, and applications related to parallel programming, such as transactional memory and data race detectors. Specifically, we configure a hardware transactional memory system with signatures as part of the hardware support, and we develop a new hardware filter for reducing the...

  6. Automatic Radiation Monitoring in Slovenia

    International Nuclear Information System (INIS)

    Full text: The automatic radiation monitoring system in Slovenia started in the early nineties and now comprises measurements of: 1. External gamma radiation: for the time being there are forty-three probes with GM tubes integrated into a common automatic network operated at the SNSA; the probes measure the dose rate in 30-minute intervals. 2. Aerosol radioactivity: three automatic aerosol stations measure the concentration of artificial alpha and beta activity in the air, gamma-emitting radionuclides, radioactive iodine-131 in the air (in all chemical forms), and natural radon and thoron progeny. 3. Radon progeny concentration: measured hourly, with results displayed as equilibrium equivalent concentrations (EEC). 4. Radioactive deposition measurements: as a support to the gamma dose rate measurements, the SNSA developed and installed an automatic measuring station for surface contamination equipped with a gamma spectrometry system (with a 3x3'' NaI(Tl) detector). All data are transferred through different communication pathways to the SNSA and are collected in 30-minute intervals. Within these intervals the central computer analyses and processes the collected data and creates different reports. Every month a QA/QC analysis of the data is performed, showing the statistics of acquisition errors and the availability of measuring results. All results are promptly available on our WEB pages. The data are checked and sent daily to the EURDEP system at Ispra (Italy) and also to the Austrian, Croatian and Hungarian authorities. (author)

  7. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, ‘automaticity’ refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due both to a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) functional significance of automaticity; (b) neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  8. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two data-sets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS). In the second dataset an accidental release of radioactivity in the environment was simulated in the South-Western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable of identifying a release of radioactivity provided that the number of sampling locations is sufficiently high. Consequently, we believe that a combination of applying the presented mapping approach and the physical knowledge of the transport processes of radioactivity should be used to predict the extreme values.
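The kriging interpolation underlying such automatic mapping can be sketched as follows. This is a generic ordinary-kriging solver with an assumed exponential variogram, not the universal-kriging model or the fitted parameters used in the paper; the sill and range values are placeholders.

```python
import numpy as np

def variogram(h, sill=1.0, rng=50.0):
    # Assumed exponential variogram model; in practice this is fitted to data.
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, x0):
    """Predict the value at location x0 from observations z at coordinates xy."""
    n = len(z)
    # Pairwise distances between sample locations.
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Kriging system: variogram matrix bordered by the unbiasedness constraint.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)  # n weights plus one Lagrange multiplier
    return float(w[:n] @ z)

# Hypothetical usage: three monitoring stations, predict at a new point.
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
z = np.array([1.0, 2.0, 3.0])
print(ordinary_kriging(xy, z, np.array([5.0, 5.0])))
```

At an observed location the predictor reproduces the datum exactly (kriging is an exact interpolator), while between stations it returns a weighted, smoothed estimate, which matches the smoothing tendency the abstract reports.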

  9. Automatic Kurdish Dialects Identification

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2016-02-01

    Full Text Available Automatic dialect identification is a necessary language technology for processing multi-dialect languages in which the dialects are linguistically far from each other. This becomes particularly crucial where the dialects are mutually unintelligible: to perform computational activities on these languages, the system needs to identify the dialect that is the subject of the process. The Kurdish language encompasses various dialects. It is written using several different scripts, and the language lacks a standard orthography. This situation makes Kurdish dialect identification both more interesting and more necessary, from the research as well as the application perspective. In this research, we have applied a classification method, based on supervised machine learning, to identify the dialects of Kurdish texts. The research has focused on the two most widely spoken and dominant Kurdish dialects, namely Kurmanji and Sorani. The approach could be applied to the other Kurdish dialects as well, and the method is also applicable to languages which are similar to Kurdish in their dialectal diversity and differences.
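A minimal sketch of the supervised-classification approach described above, using a character-bigram naive Bayes classifier. The training strings and labels below are toy placeholders, not real Kurmanji or Sorani text, and the paper's actual features and learner may differ.

```python
import math
from collections import Counter, defaultdict

def ngrams(text, n=2):
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def train(samples):
    """samples: list of (text, label). Returns a character-bigram
    naive Bayes model: per-label gram counts, totals, doc counts, vocab."""
    counts, totals, docs = defaultdict(Counter), Counter(), Counter()
    for text, label in samples:
        grams = ngrams(text)
        counts[label].update(grams)
        totals[label] += len(grams)
        docs[label] += 1
    vocab = {g for c in counts.values() for g in c}
    return counts, totals, docs, vocab

def classify(text, model):
    counts, totals, docs, vocab = model
    n_docs = sum(docs.values())
    best, best_lp = None, -math.inf
    for label in counts:
        lp = math.log(docs[label] / n_docs)  # class prior
        for g in ngrams(text):
            # Laplace-smoothed bigram likelihood
            lp += math.log((counts[label][g] + 1) / (totals[label] + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Toy placeholder data standing in for dialect-labeled text samples.
model = train([("abab abba", "kurmanji"), ("cdcd cddc", "sorani")])
print(classify("abba", model))
```

Character n-grams are a common choice for dialect identification because they capture orthographic and script differences without requiring word-level resources, which matters for a language without a standard orthography.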

  10. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism, surrealist leader André Breton (1896-1966) defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual-arts techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  11. Automatic Implantable Cardiac Defibrillator

    Medline Plus

    Full Text Available Automatic Implantable Cardiac Defibrillator February 19, 2009 Halifax Health Medical Center, Daytona Beach, FL Welcome to Halifax Health Daytona Beach, Florida. Over the next hour you' ...

  12. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  13. Automatic Arabic Text Classification

    OpenAIRE

    Al-harbi, S.; Almuhareb, A.; Al-Thubaity, A.; Khorsheed, M. S.; Al-Rajeh, A.

    2008-01-01

    Automated document classification is an important text mining task especially with the rapid growth of the number of online documents present in Arabic language. Text classification aims to automatically assign the text to a predefined category based on linguistic features. Such a process has different useful applications including, but not restricted to, e-mail spam detection, web page content filtering, and automatic message routing. This paper presents the results of experiments on documen...

  14. ES3: Automatic capture and reconstruction of science product lineage and metadata

    Science.gov (United States)

    Frew, J.; Slaughter, P.; Painter, T.

    2007-12-01

    The MODSCAG algorithm derives per-pixel fractional snow-covered area and snow grain size from MODIS imagery. The current implementation of MODSCAG is a combination of UNIX shell scripts, compiled C programs, and interpreted IDL programs. MODSCAG is under active development, and tracing its output products back to specific parameter settings or software versions is crucial for debugging and quality control. To this end, we are running MODSCAG on the Earth System Science Server (ES3), a suite of software that automatically captures run-time information about user processes and stores this information in an XML database, which can be queried to retrieve a specific MODSCAG output's complete lineage graph and any associated metadata. This poster/presentation will illustrate how this lineage/metadata capture operates without any modifications to either the host operating system or the science application code.

  15. The design and performance of the first fully automatic non-grid 5 MW multi-diesel / mini hydro / battery converter power stations

    International Nuclear Information System (INIS)

    Electric power supply in remote communities and towns has traditionally been provided by diesel generator sets of varying capacities and sizes, from a few kilowatts to a few megawatts. Diesel generation has proven to be versatile, robust, modular, cheap in capital investment, reliable, and easy to operate and maintain. These features make diesel generators the preferred choice for generating electric power for power-hungry remote communities. The main drawback, though, is the increasingly high cost of operation and maintenance, largely due to the upward trend in the cost of diesel fuel, the high cost of engine spare parts, and the inflationary nature of the salaries and wages of operators. For these reasons, engineers and technologists have for years worked tirelessly to find ways and means to reduce the O and M costs. One of the novel ideas was to hybridise the conventional diesel generating system with renewable energy resources, such as mini hydro, solar photovoltaic or wind energy. Many prototypes involving several configurations of energy resources, e.g. diesel/PV/battery, diesel/wind/battery, diesel/mini hydro/battery, have been tested, but none has so far been as successful as the Sema/Powercorp automated Intelligent Power System (IPS). Based on microprocessor hardware, powerful computer software and satellite communication technology, the IPS-equipped diesel power station can now be operated fully automatically, with capability for remote control and monitoring. The system is versatile in maximising the use of renewable energy resources such as wind, mini hydro or solar, thereby very significantly reducing the use of diesel fuel. Operation and maintenance costs are also reduced due to the use of minimum manpower and an increase in the fuel efficiency of the engines.
The tested and proven IPS technology has been operating successfully for the last ten years in remote diesel stations in the Northern Territory, Australia, Rathlin Island, Northern Ireland and its latest and

  16. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.

  17. A Novel Optimal Software Release Policy under Imperfect Debugging

    Institute of Scientific and Technical Information of China (English)

    刘云; 田斌; 赵玮

    2005-01-01

    Optimal software release is a challenging problem in software reliability. Most available software release models rest on the unreasonable assumption that the software debugging process is perfect, or that no new faults are introduced during debugging. This paper presents an optimal software release policy model under imperfect debugging. The model not only takes imperfect debugging and the new faults introduced during debugging into account, but also considers that the probability of perfect debugging increases as experience is gained during software testing. The paper also gives the solution of the model.
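The trade-off behind such release policies can be sketched numerically. The model below is an illustrative stand-in, not the paper's formulation: fault detection follows a Goel-Okumoto-style exponential mean value function, each debugging attempt succeeds with probability p and introduces a new fault with probability q, and the release time minimises testing cost plus the cost of faults escaping to the field (all parameter values are invented):

```python
import math

def expected_faults_removed(T, a=100.0, b=0.1, p=0.9, q=0.05):
    """Net faults removed by time T under imperfect debugging:
    a = initial fault content, b = detection rate, p = fix probability,
    q = probability of introducing a new fault per attempt (all invented)."""
    detected = a * (1.0 - math.exp(-b * T))
    return detected * (p - q)

def total_cost(T, c_fix_test=1.0, c_fix_field=10.0, c_test_per_day=2.0,
               a=100.0, **kw):
    """Illustrative cost: in-test fixes + field fixes + testing effort."""
    removed = expected_faults_removed(T, a=a, **kw)
    remaining = a - removed
    return (c_fix_test * removed
            + c_fix_field * remaining
            + c_test_per_day * T)

# Grid search for the cost-minimising release time.
best_T = min((t * 0.5 for t in range(1, 201)), key=total_cost)
print(best_T)
```

With these invented costs, testing longer pays off until the marginal cost of continued testing exceeds the expected saving from fewer field failures; the grid search just locates that crossover.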

  18. Performance management system enhancement and maintenance

    Science.gov (United States)

    Cleaver, T. G.; Ahour, R.; Johnson, B. R.

    1984-01-01

    The research described in this report concludes a two-year effort to develop a Performance Management System (PMS) for the NCC computers. PMS provides semi-automated monthly reports to NASA and contractor management on the status and performance of the NCC computers in the TDRSS program. Throughout 1984, PMS was tested, debugged, extended, and enhanced. Regular PMS monthly reports were produced and distributed. PMS continues to operate at the NCC under control of Bendix Corp. personnel.

  19. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his...... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers by...... members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers a...

  20. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, 6 universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
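The feature-extraction step can be illustrated with a one-dimensional Haar transform, a minimal sketch of the discrete wavelet transformation mentioned above. The paper's actual wavelet family, decomposition depth and 2-D handling are not specified here; this version works on a single row of pixel intensities:

```python
def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform:
    returns (approximation, detail) coefficient lists."""
    assert len(signal) % 2 == 0
    s = 2 ** -0.5
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_features(signal, levels=2):
    """Energy of the detail coefficients per level -- one common way to
    turn DWT output into a compact feature vector for a classifier."""
    feats = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(sum(d * d for d in detail))
    return feats
```

Such per-level energies (or the raw coefficients) would then be fed to the artificial neural network as input features.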

  1. Automatic utilities auditing

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Colin Boughton [Energy Metering Technology (United Kingdom)

    2000-08-01

    At present, energy audits represent only snapshot situations of the flow of energy. The normal pattern of energy audits as seen through the eyes of an experienced energy auditor is described. A brief history of energy auditing is given. It is claimed that the future of energy auditing lies in automatic meter reading with expert data analysis providing continuous automatic auditing thereby reducing the skill element. Ultimately, it will be feasible to carry out auditing at intervals of say 30 minutes rather than five years.

  2. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...... camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot...

  3. Application Accuracy of Automatic Registration in Frameless Stereotaxy

    OpenAIRE

    Rachinger, Jens; Keller, Boris von; Ganslandt, Oliver; Fahlbusch, Rudolf; Nimsky, Christopher

    2013-01-01

    Objective: We compared the application accuracy of an infrared-based neuronavigation system when used with a novel automatic registration with its application accuracy when standard fiducial-based registration is performed. Methods: The automatic referencing tool is based on markers that are integrated in the headrest holder we routinely use in our intraoperative magnetic resonance imaging (MRI) setting and can be detected by the navigation software automatically. For navigation targeting we...

  4. Automatic Dance Lesson Generation

    Science.gov (United States)

    Yang, Yang; Leung, H.; Yue, Lihua; Deng, LiQun

    2012-01-01

    In this paper, an automatic lesson generation system is presented which is suitable in a learning-by-mimicking scenario where the learning objects can be represented as multiattribute time series data. The dance is used as an example in this paper to illustrate the idea. Given a dance motion sequence as the input, the proposed lesson generation…

  5. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...

  6. Automatic welding techniques for nuclear power plants

    International Nuclear Information System (INIS)

    Improved BWRs (ABWRs) further enhanced operability, safety and economic efficiency as the overall characteristics of the plants through simplification and higher performance. In particular, reactor internal pumps (RIPs), together with the improved control rod drive (CRD) system, promoted the simplification of the reactor system and the improvement of operability and safety. The structures of the RIP casing proper and the welded parts, the automatic TIG welder for RIP casings and the nondestructive inspection after welding, the three-dimensional automatic welding of CRD stubs, the narrow gap welding of flow nozzles and the automatic welding of spent fuel storage racks are reported. The work of replacing the recirculating pipings in reactors with 316L pipes, which withstand SCC, by remote automatic welding to reduce the radiation exposure of workers is introduced. The fully automatic TIG welding system for pipings was developed for the purpose of realizing unmanned welding, or welding that does not require skill, and its constitution and performance are described. (K.I.)

  7. Real-time automatic registration in optical surgical navigation

    Science.gov (United States)

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Si, Xuan; Chen, Xiuwen; Wu, Xiaoming

    2016-05-01

    An image-guided surgical navigation system requires the improvement of the patient-to-image registration time to enhance the convenience of the registration procedure. A critical step in achieving this aim is performing a fully automatic patient-to-image registration. This study reports on a design of custom fiducial markers and the performance of a real-time automatic patient-to-image registration method using these markers on the basis of an optical tracking system for rigid anatomy. The custom fiducial markers are designed to be automatically localized in both patient and image spaces. An automatic localization method is performed by registering a point cloud sampled from the three dimensional (3D) pedestal model surface of a fiducial marker to each pedestal of fiducial markers searched in image space. A head phantom is constructed to estimate the performance of the real-time automatic registration method under four fiducial configurations. The head phantom experimental results demonstrate that the real-time automatic registration method is more convenient, rapid, and accurate than the manual method. The time required for each registration is approximately 0.1 s. The automatic localization method precisely localizes the fiducial markers in image space. The averaged target registration error for the four configurations is approximately 0.7 mm. The automatic registration performance is independent of the positions relative to the tracking system and the movement of the patient during the operation.
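The fiducial-based rigid registration and the target registration error (TRE) reported above can be sketched in two dimensions, where the least-squares rotation has a closed form. This is a simplified 2-D analogue of the paper's 3-D optical-tracking setup, with invented marker coordinates:

```python
import math

def rigid_register_2d(src, dst):
    """Least-squares rotation + translation mapping src points onto dst
    (2-D analogue of fiducial-based rigid registration)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x, y, u, v = x - csx, y - csy, u - cdx, v - cdy
        sxx += x * u + y * v          # "dot" term
        sxy += x * v - y * u          # "cross" term
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    def apply(p):
        return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)
    return apply

def target_registration_error(transform, targets_src, targets_dst):
    """Mean distance between mapped targets and their true positions."""
    errs = [math.dist(transform(a), b)
            for a, b in zip(targets_src, targets_dst)]
    return sum(errs) / len(errs)

# Demo: invented fiducials moved by a known 30-degree rotation and shift.
th = math.radians(30.0)
fid_src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]
fid_dst = [(math.cos(th) * x - math.sin(th) * y + 2.0,
            math.sin(th) * x + math.cos(th) * y - 1.0) for x, y in fid_src]
reg = rigid_register_2d(fid_src, fid_dst)
print(target_registration_error(reg, fid_src, fid_dst))
```

The 3-D case replaces the closed-form angle with an SVD-based (Kabsch) rotation estimate, but the fiducial-matching and TRE logic is the same.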

  8. Design of the Display Function in a Debugging System for a Space Encoder

    Institute of Scientific and Technical Information of China (English)

    左洋; 龙科慧; 乔克; 刘金国

    2012-01-01

    To meet the reliability and stability requirements of aerospace encoders, a display module for the debugging system of a space encoder was designed. The system uses the 80C32E single-chip microcontroller as its core processor to implement encoder signal acquisition, processing, communication and display. In the display module, angle information is shown in binary form on a row of LED lamps, and in degrees-minutes-seconds form on an LCD screen. When debugging the encoder, the LED display is suited to professionals monitoring whether the encoder data carries normally and reading the angle information, while the LCD display is suited to non-professionals viewing the encoder angle. The system is small, flexible in operation, and offers good visibility, which facilitates encoder debugging. Experimental results show that the debugging system displays the angle information in both forms simultaneously and in real time, the data are easy to read, and the display function can be further extended.

  9. Performance testing of a semi-automatic card punch system, using direct STR profiling of DNA from blood samples on FTA™ cards.

    Science.gov (United States)

    Ogden, Samantha J; Horton, Jeffrey K; Stubbs, Simon L; Tatnell, Peter J

    2015-01-01

    The 1.2 mm Electric Coring Tool (e-Core™) was developed to increase the throughput of FTA™ sample collection cards used during forensic workflows and is similar to a 1.2 mm Harris manual micro-punch for sampling dried blood spots. Direct short tandem repeat (STR) DNA profiling was used to compare samples taken by the e-Core tool with those taken by the manual micro-punch. The performance of the e-Core device was evaluated using a commercially available PowerPlex™ 18D STR System. In addition, an analysis was performed that investigated the potential carryover of DNA via the e-Core punch from one FTA disc to another. This contamination study was carried out using Applied Biosystems AmpflSTR™ Identifiler™ Direct PCR Amplification kits. The e-Core instrument does not contaminate FTA discs when a cleaning punch is used following excision of discs containing samples and generates STR profiles that are comparable to those generated by the manual micro-punch. PMID:25407399

  10. Automatic differentiation: Obtaining fast and reliable derivatives -- fast

    Energy Technology Data Exchange (ETDEWEB)

    Bischof, C.H.; Khademi, P.M.; Pusch, G. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.; Carle, A. [Rice Univ., Houston, TX (United States). Center for Research on Parallel Computation

    1994-12-31

    In this paper, the authors introduce automatic differentiation as a method for computing derivatives of large computer codes. After a brief discussion of methods of differentiating codes, they review automatic differentiation and introduce the ADIFOR (Automatic DIfferentiation of FORtran) tool. They highlight some applications of ADIFOR to large industrial and scientific codes (groundwater transport, CFD airfoil design, and sensitivity-enhanced MM5 mesoscale weather model), and discuss the effectiveness and performance of their approach. Finally, they discuss sparsity in automatic differentiation and introduce the SparsLinC library.
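ADIFOR works by source transformation of Fortran, but the underlying idea of forward-mode automatic differentiation can be sketched with operator overloading on dual numbers. This is a minimal Python illustration of the principle, unrelated to ADIFOR's actual implementation:

```python
import math

class Dual:
    """Forward-mode AD value: carries f(x) and df/dx together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def sin(x):
    """Chain rule for sin: d/dx sin(u) = cos(u) * u'."""
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def derivative(f, x):
    """Seed x with derivative 1 and read off df/dx."""
    return f(Dual(x, 1.0)).dot

# Analytically, d/dx [x * sin(x)] = sin(x) + x * cos(x).
print(derivative(lambda x: x * sin(x), 2.0))
```

Source-transformation tools like ADIFOR achieve the same propagation of derivative values, but by generating augmented code ahead of time instead of overloading operators at run time.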

  11. High-Resolution Dynamical Downscaling of ERA-Interim Using the WRF Regional Climate Model for the Area of Poland. Part 2: Model Performance with Respect to Automatically Derived Circulation Types

    Science.gov (United States)

    Ojrzyńska, Hanna; Kryza, Maciej; Wałaszek, Kinga; Szymanowski, Mariusz; Werner, Małgorzata; Dore, Anthony J.

    2016-03-01

    This paper presents the application of the high-resolution WRF model data for the automatic classification of the atmospheric circulation types and the evaluation of the model results for daily rainfall and air temperatures. The WRF model evaluation is performed by comparison with measurements and gridded data (E-OBS). The study is focused on the area of Poland and covers the 1981-2010 period, for which the WRF model has been run using three nested domains with spatial resolution of 45 km × 45 km, 15 km × 15 km and 5 km × 5 km. For the model evaluation, we have used the data from the innermost domain, and data from the second domain were used for circulation typology. According to the circulation type analysis, the anticyclonic types (AAD and AAW) are the most frequent. The WRF model is able to reproduce the daily air temperatures and the error statistics are better, compared with the interpolation-based gridded dataset. The high-resolution WRF model shows a higher spatial variability of both air temperature and rainfall, compared with the E-OBS dataset. For the rainfall, the WRF model, in general, overestimates the measured values. The model performance shows a seasonal pattern and is also dependent on the atmospheric circulation type, especially for daily rainfall.
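Evaluations of this kind reduce to stratified error statistics: model-minus-observation bias and RMSE computed separately for each circulation type. A minimal sketch (the type labels follow the abstract's AAD/AAW naming, the numbers are invented, and the study's actual metrics may differ):

```python
import math
from collections import defaultdict

def error_stats_by_type(records):
    """records: iterable of (circulation_type, modelled, observed) triples.
    Returns {type: (mean_bias, rmse)} -- stratified error statistics of the
    kind used to evaluate a model against measurements."""
    groups = defaultdict(list)
    for ctype, modelled, observed in records:
        groups[ctype].append(modelled - observed)
    out = {}
    for ctype, errs in groups.items():
        bias = sum(errs) / len(errs)
        rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
        out[ctype] = (bias, rmse)
    return out

# Invented daily temperature pairs tagged with a circulation type.
records = [("AAD", 1.0, 0.0), ("AAD", -1.0, 0.0), ("AAW", 2.0, 1.0)]
print(error_stats_by_type(records))
```

Grouping the errors by circulation type is what reveals the circulation-dependent model performance the abstract describes.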

  12. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…

  13. 12 CFR 263.403 - Automatic removal, suspension, and debarment.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Automatic removal, suspension, and debarment... From Performing Audit Services § 263.403 Automatic removal, suspension, and debarment. (a) An... to a temporary suspension or permanent revocation of registration or a temporary or...

  14. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; De Vries, A.P.; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive manua

  15. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and automatic classification are examined

  16. Automatic visual inspection of hybrid microcircuits

    Energy Technology Data Exchange (ETDEWEB)

    Hines, R.E.

    1980-05-01

    An automatic visual inspection system using a minicomputer and a video digitizer was developed for inspecting hybrid microcircuits (HMC) and thin-film networks (TFN). The system performed well in detecting missing components on HMCs and reduced the testing time for each HMC by 75%.

  17. Simultaneous automatic determination of catecholamines and their 3-O-methyl metabolites in rat plasma by high-performance liquid chromatography using peroxyoxalate chemiluminescence reaction.

    Science.gov (United States)

    Tsunoda, M; Takezawa, K; Santa, T; Imai, K

    1999-05-01

    A highly specific and sensitive automated high-performance liquid chromatographic method for the simultaneous determination of catecholamines (CAs; norepinephrine, epinephrine, and dopamine) and their 3-O-methyl metabolites (normetanephrine, metanephrine, and 3-methoxytyramine) is described. Automated precolumn ion-exchange extraction of diluted plasma is coupled with HPLC separation of CAs and their 3-O-methyl metabolites on an ODS column, postcolumn coulometric oxidation, fluorescence derivatization with ethylenediamine, and finally peroxyoxalate chemiluminescence reaction detection. The detection limits were about 3 fmol for norepinephrine, epinephrine, and dopamine, 5 fmol for normetanephrine, and 10 fmol for metanephrine and 3-methoxytyramine (signal-to-noise ratio of 3). Fifty microliters of rat plasma was used and 4-methoxytyramine was employed as an internal standard. The relative standard deviations for the method (n = 5) were 2.5-7.6% for the intraday assay and 6.3-9.1% for the interday assay. The method was applicable to the determination of normetanephrine and metanephrine in 50 microl of rat plasma. PMID:10222014

  18. Performance Evaluation of Automatic Anatomy Segmentation Algorithm on Repeat or Four-Dimensional Computed Tomography Images Using Deformable Image Registration Method

    International Nuclear Information System (INIS)

    Purpose: Auto-propagation of anatomic regions of interest from the planning computed tomography (CT) scan to the daily CT is an essential step in image-guided adaptive radiotherapy. The goal of this study was to quantitatively evaluate the performance of the algorithm in typical clinical applications. Methods and Materials: We had previously adopted an image intensity-based deformable registration algorithm to find the correspondence between two images. In the present study, the regions of interest delineated on the planning CT image were mapped onto daily CT or four-dimensional CT images using the same transformation. Postprocessing methods, such as boundary smoothing and modification, were used to enhance the robustness of the algorithm. Auto-propagated contours for 8 head-and-neck cancer patients with a total of 100 repeat CT scans, 1 prostate patient with 24 repeat CT scans, and 9 lung cancer patients with a total of 90 four-dimensional CT images were evaluated against physician-drawn contours and physician-modified deformed contours using the volume overlap index and mean absolute surface-to-surface distance. Results: The deformed contours were reasonably well matched with the daily anatomy on the repeat CT images. The volume overlap index and mean absolute surface-to-surface distance were 83% and 1.3 mm, respectively, compared with the independently drawn contours. Better agreement (>97% and <0.4 mm) was achieved if the physician was only asked to correct the deformed contours. The algorithm was also robust in the presence of random noise in the image. Conclusion: The deformable algorithm might be an effective method to propagate the planning regions of interest to subsequent CT images of changed anatomy, although a final review by physicians is highly recommended
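The two evaluation metrics used above have simple definitions: a volume overlap index compares the two contoured volumes, and the surface distance averages nearest-neighbour distances between the contour surfaces. A brute-force sketch on small point sets, assuming the volume overlap index is the Dice coefficient (which the abstract does not state explicitly):

```python
import math

def volume_overlap_index(a, b):
    """Dice overlap of two voxel sets (sets of (x, y, z) index tuples)."""
    return 2.0 * len(a & b) / (len(a) + len(b))

def mean_absolute_surface_distance(surf_a, surf_b):
    """Symmetric mean nearest-neighbour distance between two surface
    point sets (brute force; adequate for small contours)."""
    def one_way(src, dst):
        return sum(min(math.dist(p, q) for q in dst) for p in src) / len(src)
    return 0.5 * (one_way(surf_a, surf_b) + one_way(surf_b, surf_a))

# Toy voxel sets with invented coordinates: one shared voxel out of two each.
a = {(0, 0, 0), (1, 0, 0)}
b = {(1, 0, 0), (2, 0, 0)}
print(volume_overlap_index(a, b))   # 2*1/(2+2) = 0.5
```

In practice the surfaces contain thousands of points, so a spatial index (e.g. a k-d tree) replaces the brute-force nearest-neighbour search.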

  19. Minimum Hamiltonian Ascent Trajectory Evaluation (MASTRE) program (update to automatic flight trajectory design, performance prediction, and vehicle sizing for support of Shuttle and Shuttle derived vehicles) engineering manual

    Science.gov (United States)

    Lyons, J. T.

    1993-01-01

    The Minimum Hamiltonian Ascent Trajectory Evaluation (MASTRE) program and its predecessors, the ROBOT and the RAGMOP programs, have had a long history of supporting MSFC in the simulation of space boosters for the purpose of performance evaluation. The ROBOT program was used in the simulation of the Saturn 1B and Saturn 5 vehicles in the 1960's and provided the first utilization of the minimum Hamiltonian (or min-H) methodology and the steepest ascent technique to solve the optimum trajectory problem. The advent of the Space Shuttle in the 1970's and its complex airplane design required a redesign of the trajectory simulation code since aerodynamic flight and controllability were required for proper simulation. The RAGMOP program was the first attempt to incorporate the complex equations of the Space Shuttle into an optimization tool by using an optimization method based on steepest ascent techniques (but without the min-H methodology). Development of the complex partial derivatives associated with the Space Shuttle configuration and using techniques from the RAGMOP program, the ROBOT program was redesigned to incorporate these additional complexities. This redesign created the MASTRE program, which was referred to as the Minimum Hamiltonian Ascent Shuttle TRajectory Evaluation program at that time. Unique to this program were first-stage (or booster) nonlinear aerodynamics, upper-stage linear aerodynamics, engine control via moment balance, liquid and solid thrust forces, variable liquid throttling to maintain constant acceleration limits, and a total upgrade of the equations used in the forward and backward integration segments of the program. This modification of the MASTRE code has been used to simulate the new space vehicles associated with the National Launch Systems (NLS). Although not as complicated as the Space Shuttle, the simulation and analysis of the NLS vehicles required additional modifications to the MASTRE program in the areas of providing

  20. The automatic NMR gaussmeter

    International Nuclear Information System (INIS)

    The paper describes an automatic gaussmeter operating according to the principle of nuclear magnetic resonance. The operating principle, the block diagram and the operating parameters of the meter are discussed. It can be applied to measurements of induction in electromagnets of wide-line EPR and NMR radio-spectrometers and in calibration stands of magnetic induction values. The frequency range of the autodyne oscillator, from 0.6 to 86 MHz for protons, corresponds to a field range from 0.016 to 2 T. The application of other nuclei, such as 7Li and 2D, is also foreseen. The induction measurement is carried out automatically, and the NMR signal and the value of the measured induction are displayed on a monitor screen. (author)

  1. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
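The Monte Carlo evaluation approach can be sketched as follows: generate many artificial series with a known trend plus serially correlated noise, estimate the trend on each, and summarise the estimation error. This is a generic illustration (an OLS slope fitted to a linear trend with AR(1) noise), not the book's original algorithms:

```python
import random
import statistics

def series_with_trend(n, slope, noise_sd, rng):
    """Artificial series: linear trend plus AR(1) noise (phi = 0.6, invented)."""
    x, eps = [], 0.0
    for t in range(n):
        eps = 0.6 * eps + rng.gauss(0.0, noise_sd)
        x.append(slope * t + eps)
    return x

def ols_slope(x):
    """Ordinary least-squares slope of x against its time index."""
    n = len(x)
    tbar = (n - 1) / 2.0
    xbar = sum(x) / n
    num = sum((t - tbar) * (v - xbar) for t, v in enumerate(x))
    den = sum((t - tbar) ** 2 for t in range(n))
    return num / den

def monte_carlo_slope_error(trials=200, n=200, slope=0.05, noise_sd=1.0, seed=1):
    """Bias and spread of the slope estimator over many synthetic series."""
    rng = random.Random(seed)
    errs = [ols_slope(series_with_trend(n, slope, noise_sd, rng)) - slope
            for _ in range(trials)]
    return statistics.mean(errs), statistics.stdev(errs)

print(monte_carlo_slope_error())
```

Because the true trend of each artificial series is known, the error distribution directly measures the accuracy of the estimation algorithm, which is exactly what real data cannot provide.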

  2. Considering the Fault Dependency Concept with Debugging Time Lag in Software Reliability Growth Modeling Using a Power Function of Testing Time

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Since the early 1970s tremendous growth has been seen in the research of software reliability growth modeling. In general, software reliability growth models (SRGMs) are applicable to the late stages of testing in software development and can provide useful information about how to improve the reliability of software products. A number of SRGMs have been proposed in the literature to represent the time-dependent fault identification/removal phenomenon; still, new models are being proposed that could fit a greater number of reliability growth curves. Often, it is assumed that detected faults are immediately corrected when mathematical models are developed. This assumption may not be realistic in practice because the time to remove a detected fault depends on the complexity of the fault, the skill and experience of the personnel, the size of the debugging team, the technique, and so on. Thus, the detected fault need not be immediately removed, and it may lag the fault detection process by a delay effect factor. In this paper, we first review how different software reliability growth models have been developed where the fault detection process depends not only on the residual fault content but also on the testing time, and see how these models can be reinterpreted as delayed fault detection models by using a delay effect factor. Based on the power function of the testing time concept, we propose four new SRGMs that assume the presence of two types of faults in the software: leading and dependent faults. Leading faults are those that can be removed upon a failure being observed. However, dependent faults are masked by leading faults and can only be removed after the corresponding leading fault has been removed, with a debugging time lag. These models have been tested on real software error data to show their goodness of fit, predictive validity and applicability.
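The delay-effect idea can be made concrete: if fault removal lags detection by a time-dependent factor phi(t), the removal curve is the detection curve evaluated at t - phi(t). The sketch below is a standard textbook construction rather than the paper's four models: it uses an exponential detection function and the particular lag phi(t) = ln(1 + b t)/b, which reproduces the classic delayed S-shaped form a(1 - (1 + b t)e^(-b t)); parameter values are invented:

```python
import math

def detected(t, a=100.0, b=0.2):
    """Exponential (Goel-Okumoto) mean number of faults detected by time t;
    a = total fault content, b = detection rate (both invented)."""
    return a * (1.0 - math.exp(-b * t))

def removed(t, a=100.0, b=0.2):
    """Removal lags detection by phi(t) = ln(1 + b t)/b; substituting this
    lag into detected() yields exactly a * (1 - (1 + b t) * exp(-b t))."""
    return detected(t - math.log(1.0 + b * t) / b, a, b)

for t in (5.0, 10.0, 20.0):
    print(round(detected(t), 1), round(removed(t), 1))
```

Early in testing the removal curve trails detection substantially; as testing proceeds the two converge, which is the S-shaped behaviour the delay factor is meant to capture.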

  3. Automatic Wall Painting Robot

    OpenAIRE

    P.KEERTHANAA, K.JEEVITHA, V.NAVINA, G.INDIRA, S.JAYAMANI

    2013-01-01

    The primary aim of the project is to design, develop and implement an automatic wall painting robot which helps to achieve low-cost painting equipment. Despite the advances in robotics and its wide-spreading applications, interior wall painting has shared little in research activities. The painting chemicals can cause hazards to the human painters such as eye and respiratory system problems. Also the nature of the painting procedure, which requires repeated work and hand raising, makes it boring, time a...

  4. Automatic Program Reports

    OpenAIRE

    Lígia Maria da Silva Ribeiro; Gabriel de Sousa Torcato David

    2007-01-01

    To profit from the data collected by the SIGARRA academic IS, a systematic set of graphs and statistics has been added to it and is available on-line. This analytic information can be automatically included in a flexible yearly report for each program as well as in a synthesis report for the whole school. Some difficulties in the interpretation of some graphs led to the definition of new key indicators and the development of a data warehouse across the university where effective data consolidation...

  5. Automatic Inductive Programming Tutorial

    OpenAIRE

    Aler, Ricardo

    2006-01-01

    Computers that can program themselves are an old dream of Artificial Intelligence, but only recently has there been notable progress. In relation to Machine Learning, a computer program is the most powerful structure that can be learned, pushing the final goal well beyond neural networks or decision trees. There are currently many separate areas, working independently, related to automatic programming, both deductive and inductive. The first goal of this tutorial is to give the attendees ...

  6. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  7. Automatic Differentiation Variational Inference

    OpenAIRE

    Kucukelbir, Alp; Tran, Dustin; Ranganath, Rajesh; Gelman, Andrew; Blei, David M.

    2016-01-01

    Probabilistic modeling is iterative. A scientist posits a simple model, fits it to her data, refines it according to her analysis, and repeats. However, fitting complex models to large data is a bottleneck in this process. Deriving algorithms for new models can be both mathematically and computationally challenging, which makes it difficult to efficiently cycle through the steps. To this end, we develop automatic differentiation variational inference (ADVI). Using our method, the scientist on...

  8. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction-intention-action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychological..., respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses.

  9. Automatic digital image registration

    Science.gov (United States)

    Goshtasby, A.; Jain, A. K.; Enslin, W. R.

    1982-01-01

    This paper introduces a general procedure for automatic registration of two images which may have translational, rotational, and scaling differences. This procedure involves (1) segmentation of the images, (2) isolation of dominant objects from the images, (3) determination of corresponding objects in the two images, and (4) estimation of transformation parameters using the centers of gravity of objects as control points. An example is given which uses this technique to register two images which have translational, rotational, and scaling differences.
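    The centre-of-gravity control points in step (4) determine a similarity transform (translation, rotation, scale). As an illustrative sketch only, not the paper's actual implementation, such a transform can be estimated by least squares when 2-D points are treated as complex numbers; all names below are hypothetical:

    ```python
    # Estimate a 2-D similarity transform dst ≈ c*src + t from matched
    # control points, where the complex gain c = s*exp(i*theta) encodes
    # scale s and rotation theta. Points are complex numbers (x + iy).
    import cmath

    def estimate_similarity(src, dst):
        """Least-squares fit of dst ≈ c*src + t over matched point pairs."""
        n = len(src)
        mean_a = sum(src) / n
        mean_b = sum(dst) / n
        # Centre both point sets, then solve for the complex gain c.
        num = sum((b - mean_b) * (a - mean_a).conjugate()
                  for a, b in zip(src, dst))
        den = sum(abs(a - mean_a) ** 2 for a in src)
        c = num / den
        t = mean_b - c * mean_a
        return c, t  # scale = abs(c), rotation = cmath.phase(c)

    if __name__ == "__main__":
        src = [0 + 0j, 1 + 0j, 0 + 1j, 1 + 1j]
        c_true, t_true = 2j, 3 + 4j          # scale 2, rotation 90 degrees
        dst = [c_true * p + t_true for p in src]
        c, t = estimate_similarity(src, dst)
        print(abs(c), cmath.phase(c), t)
    ```

    With noise-free correspondences the transform is recovered exactly; with noisy centroids the same formula gives the least-squares estimate.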

  10. Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

    CERN Multimedia

    1969-01-01

    Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

  11. Automatic scanning for nuclear emulsion

    International Nuclear Information System (INIS)

    Automatic scanning systems have recently been developed for application in neutrino experiments exploiting nuclear emulsion detectors of particle tracks. These systems speed up substantially the analysis of events in emulsion, allowing the realisation of experiments with unprecedented statistics. The pioneering work on automatic scanning has been done by the University of Nagoya (Japan). The so-called new track selector has a very good reproducibility in position (∼1 μm) and angle (∼3 mrad), with the possibility to reconstruct, in about 3 s, all the tracks in a view of 150x150 μm2 and 1 mm of thickness. A new system (ultra track selector), with a speed higher by one order of magnitude, has come into operation. R and D programs are under way in Nagoya and in other laboratories on new systems; the scanning speed in nuclear emulsion can be further increased by an order of magnitude. The recent progress in the technology of digital signal processing and of image acquisition systems (CCDs and fast frame grabbers) allows the realisation of systems with high performance. New interesting applications of the technique in other fields (e.g. in biophysics) have recently been envisaged

  12. SRV-automatic handling device

    International Nuclear Information System (INIS)

    An automatic handling device for the steam relief valves (SRVs) has been developed in order to achieve a decrease in worker exposure, an increase in availability factor, and improvements in reliability, operational safety, and labor saving. A survey was made during a periodical inspection to examine the actual SRV handling operation. The SRV automatic handling device consists of four components: conveyor, armed conveyor, lifting machine, and control/monitoring system. The conveyor is so designed that the existing I-rail installed in the containment vessel can be used without any modification; it is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, operation panel, TV monitor and annunciator. The SRV handling device is operated by remote control from a control room. Trial equipment was constructed and performance/function testing carried out using actual SRVs. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  13. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects faces in stored video frames using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
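    As a hedged sketch of the classification stage only (the paper's face detector and its actual location/shape features are not reproduced), a minimal linear SVM trained with the Pegasos sub-gradient method on made-up 2-D features might look like this; the data and parameter values are illustrative assumptions:

    ```python
    # Minimal linear SVM (hinge loss, Pegasos-style sub-gradient descent).
    # Features and labels below are toy stand-ins for the paper's
    # location/shape features; labels are +1 (pain) / -1 (no pain).

    def train_svm(xs, ys, lam=0.01, epochs=200):
        dim = len(xs[0])
        w, b, t = [0.0] * dim, 0.0, 0
        for _ in range(epochs):
            for x, y in zip(xs, ys):
                t += 1
                eta = 1.0 / (lam * t)
                margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
                decay = 1.0 - eta * lam      # shrink from the L2 regularizer
                w = [decay * wi for wi in w]
                if margin < 1:               # hinge-loss sub-gradient step
                    w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                    b += eta * y
        return w, b

    def predict(w, b, x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

    if __name__ == "__main__":
        xs = [(2, 2), (3, 2), (2, 3), (-2, -2), (-3, -2), (-2, -3)]
        ys = [1, 1, 1, -1, -1, -1]
        w, b = train_svm(xs, ys)
        print([predict(w, b, x) for x in xs])
    ```

    In practice one would use a tuned SVM library with kernel support rather than this bare linear version; the sketch only shows the decision mechanism the abstract refers to.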

  14. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI, a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  15. Study of USB device driver and its application in GDB remote debugging

    Institute of Scientific and Technical Information of China (English)

    况阳; 雷航; 詹瑾瑜

    2011-01-01

    During embedded Linux software development, remote debugging of embedded software can be performed with GDB (GNU debugger) on the host and GDBserver on the target; GDB communicates with GDBserver via the RSP (remote serial protocol). This approach can significantly improve development efficiency. At present the host and target can be connected via a serial port or Ethernet, but a USB (universal serial bus) connection is not yet supported. This paper introduces the relevant USB concepts and the fundamentals of GDB remote debugging and, after analysing the existing debugging models, uses the Gadget function driver on the Linux device side to realise a USB + GDB + GDBserver remote debugging model. This model complements the existing debugging models; with USB interfaces becoming ever more common, it offers great convenience to engineers in actual development.

  16. Automatic bootstrapping and tracking of object contours.

    Science.gov (United States)

    Chiverton, John; Xie, Xianghua; Mirmehdi, Majid

    2012-03-01

    A new fully automatic object tracking and segmentation framework is proposed. The framework consists of a motion-based bootstrapping algorithm concurrent to a shape-based active contour. The shape-based active contour uses finite shape memory that is automatically and continuously built from both the bootstrap process and the active-contour object tracker. A scheme is proposed to ensure that the finite shape memory is continuously updated but forgets unnecessary information. Two new ways of automatically extracting shape information from image data given a region of interest are also proposed. Results demonstrate that the bootstrapping stage provides important motion and shape information to the object tracker. This information is found to be essential for good (fully automatic) initialization of the active contour. Further results also demonstrate convergence properties of the content of the finite shape memory and similar object tracking performance in comparison with an object tracker with unlimited shape memory. Tests with an active contour using a fixed-shape prior also demonstrate superior performance for the proposed bootstrapped finite-shape-memory framework and similar performance when compared with a recently proposed active contour that uses an alternative online learning model. PMID:21908256

  17. A General Method for Module Automatic Testing in Avionics Systems

    Directory of Open Access Journals (Sweden)

    Li Ma

    2013-05-01

    Full Text Available The traditional Automatic Test Equipment (ATE) systems are insufficient to cope with the challenges of testing more and more complex avionics systems. In this study, we propose a general method for module automatic testing in an avionics test platform based on the PXI bus. We apply virtual instrument technology to realize automatic testing and fault reporting of signal performance. Taking the avionics bus ARINC429 as an example, we introduce the architecture of the automatic test system as well as the implementation of the algorithms in LabVIEW. Comprehensive experiments show that the proposed method can effectively accomplish automatic testing and fault reporting of signal performance, greatly improving the generality and reliability of ATE in avionics systems.

  18. Research on Automatic Identification System Performance with Hardware-in-the-Loop Simulation

    Institute of Scientific and Technical Information of China (English)

    马枫; 严新平; 初秀民

    2013-01-01

    The Automatic Identification System (AIS) is the principal radio monitoring tool used by maritime administrations. To analyse its packet-error characteristics over complex terrain, a hardware-in-the-loop simulation method is designed: AIS signals with arbitrary field strength and interference characteristics are generated, and a real receiver is used to evaluate the packet error rate. The Okumura-Hata model is used to relate field strength to distance, so that the terrain attenuation and the common kinds of interference encountered in an operating AIS network can be simulated. With this method the error characteristics of signal attenuation can be identified and the behaviour under different interference signals verified. A field test on AIS signal interference in Shenzhen demonstrates the practicality of the simulation method and platform. The approach provides a platform for optimising AIS hardware design and an experimental basis for choosing the best base-station layout.
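    The Okumura-Hata model mentioned in the abstract maps distance to median path loss (and hence received field strength). A sketch of the standard urban Hata formula follows; 161.975 MHz is a real AIS channel frequency, but the antenna heights and distances are assumed values, and this is not the paper's own implementation:

    ```python
    import math

    def hata_urban_path_loss(f_mhz, h_base_m, h_mobile_m, d_km):
        """Okumura-Hata median path loss (dB), urban, small/medium city.
        Valid roughly for 150-1500 MHz, base height 30-200 m, 1-20 km."""
        # Mobile-antenna correction factor for a small/medium city.
        a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
                - (1.56 * math.log10(f_mhz) - 0.8))
        return (69.55 + 26.16 * math.log10(f_mhz)
                - 13.82 * math.log10(h_base_m) - a_hm
                + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

    if __name__ == "__main__":
        # AIS channel A is 161.975 MHz; heights below are assumptions.
        for d in (1.0, 5.0, 10.0, 20.0):
            print(d, "km:", round(hata_urban_path_loss(161.975, 30.0, 5.0, d), 1), "dB")
    ```

    Converting a transmit power to received field strength then amounts to subtracting this loss, which is how a field-strength-versus-distance table of the kind the paper describes can be built.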

  19. Evaluation of the performance of the Sysmex CS 5100 automatic blood coagulation analyzer

    Institute of Scientific and Technical Information of China (English)

    王芳; 张军; 徐唯傑; 乐军; 陈晓燕; 陈晋

    2015-01-01

    Objective: To evaluate the performance of the Sysmex CS 5100 (hereafter CS 5100) automatic blood coagulation analyzer. Methods: PT, PT(INR), APTT, Fbg, TT, AT, D-dimer (DD) and FDP were measured on the CS 5100. Accuracy, imprecision, Fbg linearity (reportable range), reference range and carry-over rate were evaluated, and correlation with the Sysmex CA 7000 automatic blood coagulation analyzer was analysed. Results: Accuracy was within the range given in the quality-control instructions. The maximum within-run and between-day coefficients of variation met the requirements of CLIA '88. Fbg linearity verification showed a linear range of 1.071 to 5.355 with a correlation coefficient (r) of 0.9950, meeting the specified requirement (r ≥ 0.975). Reference-range verification showed R values above 0.9 for all tests, so the laboratory's preset reference ranges are applicable to this analyzer. For all tested items, r values between the CS 5100 and the CA 7000 exceeded 0.95, showing good agreement between the two instruments. Conclusion: The CS 5100 offers excellent accuracy, good precision, a wide Fbg measuring range and strong resistance to biological interference, and fully meets clinical laboratory requirements.

  20. Performance evaluation of an automatic system for tomato fertigation control in substrate

    Directory of Open Access Journals (Sweden)

    Antonio J. Steidle Neto

    2009-09-01

    Full Text Available This work evaluated the performance of an automatic fertigation control system for tomato production in sand substrate, compared with a conventional control system, in terms of nutrient-solution savings. In the automatic control method, fertigation events were scheduled according to the meteorological conditions of the growing environment and the crop's development stage; the Penman-Monteith model was used as a decision-support tool for choosing the appropriate frequency of nutrient-solution application. In the conventional system, the intervals between fertigations remained fixed throughout the tomato cycle. The results showed that the automatic control system fully met the crop's water requirements without compromising tomato production, while providing substantial reductions in nutrient-solution consumption. The conventional system, by contrast, performed an excessive number of fertigation events, especially during the initial development stage of the tomato and on days with heavy cloud cover. During the initial growth stage, the total volumes of nutrient solution applied by the conventional system exceeded the crop's water requirements by 1.31 and 1.39 L per plant on typical clear and cloudy days, respectively.

  1. Techniques for automatic speech recognition

    Science.gov (United States)

    Moore, R. K.

    1983-05-01

    A brief insight into some of the algorithms that lie behind current automatic speech recognition systems is provided. Early phonetically based approaches were not particularly successful, due mainly to a lack of appreciation of the problems involved. These problems are summarized, and various recognition techniques are reviewed in the context of the solutions that they provide. It is pointed out that the majority of currently available speech recognition equipment employs a "whole-word" pattern matching approach which, although relatively simple, has proved particularly successful in its ability to recognize speech. The concept of time-normalization plays a central role in this type of recognition process, and a family of such algorithms is described in detail. The technique of dynamic time warping is not only capable of providing good performance for isolated word recognition, but can also be extended to the recognition of connected speech (thereby removing one of the most severe limitations of early speech recognition equipment).
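    The time-normalization family referred to above is built on dynamic programming over an alignment grid. A minimal dynamic time warping distance between two feature sequences can be sketched as follows (1-D values for brevity; real recognizers compare frame-wise spectral vectors):

    ```python
    # Dynamic time warping: minimum-cost alignment of two sequences under
    # the classic step pattern (diagonal, vertical, horizontal moves).

    def dtw_distance(a, b, cost=lambda x, y: abs(x - y)):
        n, m = len(a), len(b)
        INF = float("inf")
        # d[i][j] = best cost of aligning a[:i] with b[:j].
        d = [[INF] * (m + 1) for _ in range(n + 1)]
        d[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                step = cost(a[i - 1], b[j - 1])
                d[i][j] = step + min(d[i - 1][j - 1],  # match
                                     d[i - 1][j],      # insertion
                                     d[i][j - 1])      # deletion
        return d[n][m]

    if __name__ == "__main__":
        # A time-stretched copy of a template aligns at zero cost ...
        print(dtw_distance([1, 2, 3], [1, 2, 2, 3]))   # -> 0.0
        # ... while a genuinely different sequence does not.
        print(dtw_distance([1, 2, 3], [1, 2, 5]))      # -> 2.0
    ```

    This is the whole-word matching idea in miniature: an utterance is classified by the stored template whose DTW distance to it is smallest, regardless of speaking rate.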

  2. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of automatic operation after a reactor scram event is to bring the reactor to the hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing results obtained with the Nuclear Ship Engineering Simulation System (NESSY) against those measured in the scram test of the nuclear ship `Mutsu`. As a result, it was shown that this system had sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  3. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of automatic operation after a reactor scram event is to bring the reactor to the hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing results obtained with the Nuclear Ship Engineering Simulation System (NESSY) against those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system had sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  4. Automatic radioactive waste recycling

    International Nuclear Information System (INIS)

    The production of a plutonium ingot by calcium reduction process at CEA/Valduc generates a residue called 'slag'. This article introduces the recycling unit which is dedicated to the treatment of slags. The aim is to separate and to recycle the plutonium trapped in this bulk on the one hand, and to generate a disposable waste from the slag on the other hand. After a general introduction of the facilities, some elements will be enlightened, particularly the dissolution step, the filtration and the drying equipment. Reflections upon technological constraints will be proposed, and the benefits of a fully automatic recycling unit of nuclear waste will also be stressed. (authors)

  5. Automatic Configuration in NTP

    Institute of Scientific and Technical Information of China (English)

    Jiang Zongli(蒋宗礼); Xu Binbin

    2003-01-01

    NTP is nowadays the most widely used distributed network time protocol, which aims at synchronizing the clocks of computers in a network and keeping the accuracy and validation of the time information which is transmitted in the network. Without automatic configuration mechanism, the stability and flexibility of the synchronization network built upon NTP protocol are not satisfying. P2P's resource discovery mechanism is used to look for time sources in a synchronization network, and according to the network environment and node's quality, the synchronization network is constructed dynamically.

  6. 49 CFR 236.504 - Operation interconnected with automatic block-signal system.

    Science.gov (United States)

    2010-10-01

    49 Transportation 4 2010-10-01 Operation interconnected with automatic block-signal system. (a) A continuous inductive automatic train stop... be so interconnected with the signal system as to perform its intended function in event of...

  7. Automatic device for maintenance on safety relief valve

    International Nuclear Information System (INIS)

    This system offers shorter, labor-saving periodic inspection of nuclear power plants, particularly for maintenance of safety relief valves in the BWR plant. This automatic device has the following performance features. (author)

  8. Automatic analysis and classification of surface electromyography.

    Science.gov (United States)

    Abou-Chadi, F E; Nashar, A; Saad, M

    2001-01-01

    In this paper, parametric modeling of surface electromyography (SEMG) signals, which facilitates automatic SEMG feature extraction, and artificial neural networks (ANN) are combined to provide an integrated system for the automatic analysis and diagnosis of myopathic disorders. Three ANN paradigms were investigated: the multilayer backpropagation algorithm, the self-organizing feature map algorithm, and a probabilistic neural network model. The performance of the three classifiers was compared with that of the classical Fisher linear discriminant (FLD) classifier. The results show that the three ANN models give higher performance, with the percentage of correct classification reaching 90%; poorer diagnostic performance was obtained from the FLD classifier. The system presented here indicates that surface EMG, when properly processed, can be used to provide the physician with a diagnostic assist device. PMID:11556501

  9. Automatic readout micrometer

    International Nuclear Information System (INIS)

    A measuring system is disclosed for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibilities of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range

  10. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: micro computer control and readout, nineteen large area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  11. A semi-automatic method for ontology mapping

    OpenAIRE

    PEREZ, Laura Haide; Cechich, Alejandra; Buccella, Agustina

    2007-01-01

    Ontology mapping involves the task of finding similarities among overlapping sources by using ontologies. In a Federated System in which distributed, autonomous and heterogeneous information sources must be integrated, ontologies have emerged as tools to solve semantic heterogeneity problems. In this paper we propose a three-level approach that provides a semi-automatic method to ontology mapping. It performs some tasks automatically and guides the user in performing other tasks for which ...

  12. Performance of an automatic discharge regulator for irrigation channels

    Directory of Open Access Journals (Sweden)

    Luís G. H. do Amaral

    2010-12-01

    Full Text Available The control structures commonly used at the water intakes of irrigation channels do not allow the correct amount of water to be distributed, favouring waste and consequently reducing water-use efficiency. The objective of this work was to determine the performance of an automatic discharge regulator in controlling the diverted discharge. A fiberglass unit of the equipment was installed on the side of a concrete channel at the Hydraulics Laboratory of the Federal University of Viçosa, in Viçosa, state of Minas Gerais, Brazil. The equipment was evaluated over its entire operating range; for each preset regulation, the diverted discharge was determined with the upstream water level varying from 0.30 to 0.45 m. The mean variation in the regulator's discharge over its whole operating range was ± 2.3% relative to the mean discharge supplied at each setting. This amplitude of variation is small compared with the equipment usually employed for discharge control in irrigation channels, showing that the automatic discharge regulator is a suitable device for water distribution in channel networks.

  13. MARZ: Manual and automatic redshifting software

    Science.gov (United States)

    Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.

    2016-04-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application, MARZ, with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based JavaScript web application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph, producing high-quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard is not possible. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can easily be redshifted manually by cycling through automatic results, manual template comparison, or marking spectral features.
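
The template cross-correlation idea behind AUTOZ can be illustrated with a toy sketch (not the MARZ implementation): on a uniform log-wavelength grid a redshift becomes a simple shift, so the best trial redshift is the one whose shifted template correlates most strongly with the observed spectrum. All names and grid parameters below are illustrative assumptions.

```python
import numpy as np

def estimate_redshift(obs_flux, template_flux, loglam_step, z_grid):
    """Pick the trial redshift whose shifted template best matches the spectrum.
    A redshift z shifts log10(lambda) by log10(1+z); on a uniform log-wavelength
    grid of step `loglam_step` that is an integer roll of the template."""
    best_z, best_score = None, -np.inf
    o = (obs_flux - obs_flux.mean()) / obs_flux.std()
    for z in z_grid:
        shift = int(round(np.log10(1.0 + z) / loglam_step))
        t = np.roll(template_flux, shift)
        t = (t - t.mean()) / t.std()
        score = np.dot(o, t) / len(o)   # normalised cross-correlation at lag 0
        if score > best_score:
            best_z, best_score = z, score
    return best_z
```

For example, a synthetic spectrum built by shifting a Gaussian emission-line template by the amount corresponding to z = 0.1 is recovered at z = 0.1 by this search.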

  14. What is automatized during perceptual categorization?

    Science.gov (United States)

    Roeder, Jessica L; Ashby, F Gregory

    2016-09-01

    An experiment is described that tested whether stimulus-response associations or an abstract rule are automatized during extensive practice at perceptual categorization. Twenty-seven participants each completed 12,300 trials of perceptual categorization, either on rule-based (RB) categories that could be learned explicitly or information-integration (II) categories that required procedural learning. Each participant practiced predominantly on a primary category structure, but every third session they switched to a secondary structure that used the same stimuli and responses. Half the stimuli retained their same response on the primary and secondary categories (the congruent stimuli) and half switched responses (the incongruent stimuli). Several results stood out. First, performance on the primary categories met the standard criteria of automaticity by the end of training. Second, for the primary categories in the RB condition, accuracy and response time (RT) were identical on congruent and incongruent stimuli. In contrast, for the primary II categories, accuracy was higher and RT was lower for congruent than for incongruent stimuli. These results are consistent with the hypothesis that rules are automatized in RB tasks, whereas stimulus-response associations are automatized in II tasks. A cognitive neuroscience theory is proposed that accounts for these results. PMID:27232521

  15. Automatic Wall Painting Robot

    Directory of Open Access Journals (Sweden)

    P. Keerthanaa, K. Jeevitha, V. Navina, G. Indira, S. Jayamani

    2013-07-01

    Full Text Available The primary aim of the project is to design, develop and implement an automatic wall painting robot which helps to achieve low-cost painting equipment. Despite the advances in robotics and its widespread applications, interior wall painting has shared little in research activities. The painting chemicals can cause hazards to human painters, such as eye and respiratory system problems. Also, the nature of the painting procedure, which requires repeated work and hand raising, makes it boring, time and effort consuming. When construction workers and robots are properly integrated in building tasks, the whole construction process can be better managed and savings in human labour and time are obtained as a consequence. In addition, it would offer the opportunity to reduce or eliminate human exposure to difficult and hazardous environments, which would solve most of the problems connected with safety when many activities occur at the same time. These factors motivate the development of an automated robotic painting system.

  16. Automatic alkaloid removal system.

    Science.gov (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing alkaloid toxicity from the Dioscorea hispida (DH) tuber. DH is a poisonous plant whose tubers have been shown by scientific study to contain a toxic alkaloid constituent, dioscorine; the tubers can only be consumed after the poison is removed. In this experiment, the tubers are blended into powder form before being inserted into the machine basket. The user pushes the START button on the machine controller to switch the water pump on, creating a turbulent wave of water in the machine tank. The water flow is stopped automatically by triggering the outlet solenoid valve. The tuber powder is washed for 10 minutes while 1 litre of water contaminated with the toxin mixture flows out. The controller then automatically triggers the inlet solenoid valve, and fresh water flows into the machine tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, and a positive, significant result is obtained according to several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate or time. These parameters approach those of the control water, and the toxin is assumed to be fully removed when the pH of the DH powder wash water is near that of the control water. The pH of the control water is about 5.3, the water from this process reaches 6.0, and before running the machine the contaminated water has a pH of about 3.8, which is too acidic. This automated machine saves time in removing toxicity from DH compared with the traditional method, while requiring less supervision by the user. PMID:24783795

  17. Channel selection for automatic seizure detection

    DEFF Research Database (Denmark)

    Duun-Henriksen, Jonas; Kjaer, Troels Wesenberg; Madsen, Rasmus Elsborg; Remvig, Line Sofie; Thomsen, Carsten Eckhart; Sørensen, Helge Bjarup Dissing

    2012-01-01

    Objective: To investigate the performance of epileptic seizure detection using only a few of the recorded EEG channels and the ability of software to select these channels compared with a neurophysiologist. Methods: Fifty-nine seizures and 1419 h of interictal EEG are used for training and testing...... of an automatic channel selection method. The characteristics of the seizures are extracted by the use of a wavelet analysis and classified by a support vector machine. The best channel selection method is based upon maximum variance during the seizure. Results: Using only three channels, a seizure...... recorded directly on the epileptic focus. Conclusions: Based on our dataset, automatic seizure detection can be done using only three EEG channels without loss of performance. These channels should be selected based on maximum variance and not, as often done, using the focal channels. Significance: With...
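
The maximum-variance channel selection described above can be sketched in a few lines (a minimal illustration on a synthetic multichannel array, not the authors' code):

```python
import numpy as np

def select_channels(eeg, n_channels=3):
    """Rank EEG channels by signal variance over a seizure segment and keep
    the top few, as in variance-based channel selection.
    eeg: array of shape (channels, samples); returns channel indices."""
    variances = eeg.var(axis=1)
    return np.argsort(variances)[::-1][:n_channels]
```

In a real pipeline the selected channels would then feed the wavelet feature extraction and support vector machine classifier.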

  18. Performance Verification of the Precil C2000-A Automatic Blood Coagulation Instrument

    Institute of Scientific and Technical Information of China (English)

    张伟坚; 刘光明; 梁凤琼

    2014-01-01

    Objective: To verify the performance of the fully automatic blood coagulation analyzer Precil C2000-A and determine whether it meets the requirements of clinical testing. Methods: Following the standards of the US National Committee for Clinical Laboratory Standards (NCCLS), assigned-value quality-control plasma and/or calibration plasma were used with the routine coagulation tests [D-dimer (D-D), prothrombin time (PT), activated partial thromboplastin time (APTT), thrombin time (TT), fibrinogen (FIB)] to verify and evaluate the precision, trueness, carryover rate, linear range, anti-interference capability (interferents: hemoglobin, direct bilirubin and triacylglycerol) and channel consistency of the analytical system. Results: For all coagulation tests, within-run precision was below 3% and between-run precision below 5%; the bias of assigned-value controls or calibrators from their target values was below 8%. After serial dilution of linearity specimens, regression of theoretical against measured values gave slopes between 0.97 and 1.03 with r greater than 0.975, meeting the linearity requirement. Carryover rates were below 3%, interference-test deviations were below 3%, and results on the four channels showed no statistically significant differences (P>0.05), indicating good channel consistency. Conclusion: The domestically produced Precil C2000-A fully automatic coagulation analyzer has good analytical performance; its accuracy, precision, linear range and carryover rate all meet quality-management requirements, it is notably resistant to interference from hemolytic, icteric and lipemic specimens, and it fully satisfies the requirements of clinical testing.
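
The acceptance statistics used in verifications like this one (within-run imprecision, bias against a target, carryover) follow standard formulas that can be sketched as below. The carryover function assumes the common three-high/three-low replicate protocol; this is an illustration of the arithmetic, not the instrument's procedure.

```python
import statistics

def cv_percent(values):
    """Within-run imprecision as the coefficient of variation (%)."""
    return statistics.stdev(values) / statistics.mean(values) * 100

def bias_percent(values, target):
    """Deviation of the run mean from the assigned target value (%)."""
    return (statistics.mean(values) - target) / target * 100

def carryover_percent(h1, h2, h3, l1, l2, l3):
    """CLSI-style carryover: a high sample run three times, then a low sample
    three times; the first low result is inflated by any carryover."""
    return (l1 - l3) / (h3 - l3) * 100
```

For example, replicates of 10.0, 10.1 and 9.9 give a CV of 1.0%, comfortably inside the 3% within-run limit reported above.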

  19. Performance Evaluation of Platelet Count by the Sysmex XN-3000 Automatic Hematology Analyzer

    Institute of Scientific and Technical Information of China (English)

    张琴; 廖扬; 石玉玲

    2014-01-01

    Objective: To study the performance of the methods for platelet (PLT) counting on the Sysmex XN-3000 automatic hematology analyzer. Methods: PLT was counted on the XN-3000 with the platelet electrical impedance method (PLT-I), the platelet optical method (PLT-O) and the platelet fluorescence method (PLT-F) to test precision, linear range, carryover rate and interference from red blood cell fragments. In addition, the correlation between Sysmex XN-3000 PLT counts and flow cytometry (FCM) using an anti-CD61 monoclonal antibody was examined. Results: Compared with PLT-I and PLT-O, PLT-F had better repeatability and higher precision. All three methods gave good results for linear range and carryover. In the red-blood-cell-fragment interference experiment, PLT-F had the strongest anti-interference ability (P>0.05), with PLT-O second. Correlation analysis showed that the three methods correlated well; the accuracy of PLT-F was higher, especially in the low-PLT sample group. Conclusion: Routine clinical specimens can be counted on the Sysmex XN-3000 with PLT-I or PLT-O. When specimens are hemolysed or the PLT count is abnormal, review with the PLT-F method is recommended and, if necessary, the microscopic method or flow cytometry.

  20. Automatic Status Logger For a Gas Turbine

    OpenAIRE

    JONAS, Susanne

    2007-01-01

    The company Siemens Industrial Turbo Machinery AB manufactures and commissions, among other things, gas turbines, steam turbines, compressors and turn-key power plants, and carries out service on components for heat and power production. Siemens also performs research and development, marketing, sales and installation of turbines and complete power plants, as well as service and refurbishment. Our thesis for the engineering degree is to develop an automatic status log which will be used as a tool to ...

  1. Automatic Algorithm Selection for Complex Simulation Problems

    CERN Document Server

    Ewald, Roland

    2012-01-01

    To select the most suitable simulation algorithm for a given task is often difficult. This is due to intricate interactions between model features, implementation details, and runtime environment, which may strongly affect the overall performance. An automated selection of simulation algorithms supports users in setting up simulation experiments without demanding expert knowledge on simulation. Roland Ewald analyzes and discusses existing approaches to solve the algorithm selection problem in the context of simulation. He introduces a framework for automatic simulation algorithm selection and

  2. Automatic Multilevel Medical Image Annotation and Retrieval

    OpenAIRE

    Mueen, A.; Zainuddin, R.; Baba, M. Sapiyan

    2007-01-01

    Image retrieval at the semantic level mostly depends on image annotation or image classification. Image annotation performance largely depends on three issues: (1) automatic image feature extraction; (2) semantic image concept modeling; (3) the algorithm for semantic image annotation. To address the first issue, multilevel features are extracted to construct the feature vector, which represents the contents of the image. To address the second issue, a domain-dependent concept hierarchy is constructed for...

  3. Two Sets of Washing Machine On-line Control Retrofit and Debugging

    Institute of Scientific and Technical Information of China (English)

    高锦南

    2013-01-01

    This paper mainly introduces the retrofit scheme and debugging process by which two washing machines at a printing and dyeing enterprise can be controlled either on-line as a pair or as single machines. On-line control of the two washing machines greatly shortens the time needed to wash a product twice and effectively improves cloth-washing efficiency, so that the washing process runs smoothly and achieves the desired process result.

  4. Analyzing and Debugging the Errors of C Language Pointers

    Institute of Scientific and Technical Information of China (English)

    许永达

    2013-01-01

    Some pointer errors in C programs are not easily found at the compiling phase, and current teaching materials describe them mainly at the level of concepts and theory, which is insufficient. This article aims at preventing such errors by analyzing sample programs that contain them, debugging them in Visual C++ 6.0, showing how the errors manifest, analyzing their causes, and presenting the correct way to use pointers.

  5. Debugging Software Design for Pure Electric Car Motors

    Institute of Scientific and Technical Information of China (English)

    李豹; 张云; 朱孟美; 孔辉

    2013-01-01

    To make the initial commissioning of pure electric car motors more convenient, debugging software for these motors was developed in LabVIEW. The software integrates motor parameter and characteristic measurement, PI parameter adjustment, setting of motor operating parameters, result analysis, and data processing and display. Its main function is to implement communication between the PC and the motor drive, achieving the transmission, reception and display of data.

  6. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Directory of Open Access Journals (Sweden)

    Emily E Butler

    Full Text Available Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243, the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness and with disorders of social cognition (autistic-like and schizotypal traits. Further personality traits (narcissism and empathy were assessed in a subsample of participants (N=57. Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.

  7. Analysis and Debugging of Sleeper Distance Deviation for SVM1000 Tracklayer Units After Overhauling

    Institute of Scientific and Technical Information of China (English)

    毕建设

    2015-01-01

    After overhaul and reconstruction of the SVM1000 tracklayer units, a sleeper-distance deviation was found during on-site debugging, leading to a large accumulated deviation. Influence factors such as construction surface slope, ambient temperature, travel speed, the rotary encoder and the control program were analysed, and the program was finally identified as the main factor. This deviation analysis and debugging provides a reference for similar problems in the future.

  8. Making automatic differentiation truly automatic : coupling PETSc with ADIC

    International Nuclear Information System (INIS)

    Despite its name, automatic differentiation (AD) is often far from an automatic process. Often one must specify independent and dependent variables, indicate the derivative quantities to be computed, and perhaps even provide information about the structure of the Jacobians or Hessians being computed. However, when AD is used in conjunction with a toolkit with well-defined interfaces, many of these issues do not arise. The authors describe recent research into coupling the ADIC automatic differentiation tool with PETSc, a toolkit for the parallel numerical solution of PDEs. This research leverages the interfaces and objects of PETSc to make the AD process very nearly transparent.
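
The core idea of forward-mode AD, which ADIC realizes by source-to-source transformation of C code, can be sketched with operator-overloaded dual numbers (a toy analogue, not ADIC itself): each value carries its derivative alongside it, and arithmetic propagates both.

```python
class Dual:
    """Minimal forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate df/dx at x by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).dot
```

For example, `derivative(lambda x: 3 * x * x + 2 * x + 1, 2.0)` evaluates 6x + 2 at x = 2 and returns 14.0.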

  9. Automatic Parallelization and Locality Optimization of Beamforming Algorithms

    OpenAIRE

    Hartono, Albert; Vasilache, Nicolas; Bastoul, Cédric; Leung, Allen; Meister, Benoît; Lethin, Richard; Vouras, Peter

    2010-01-01

    This paper demonstrates the benefits of a global optimization strategy using a new automatic parallelization and locality optimization methodology for high performance embedded computing algorithms that occur in adaptive radar systems, for modern multi-core computing chips. As a baseline, the resulting performance was compared against the performance that could be obtained using highly optimized math libraries.

  10. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    OpenAIRE

    Jose María Armingol; Arturo de la Escalera

    2010-01-01

    There are increasing applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chess board, which is found automatically in each image, when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined ana...

  11. Indexing of Arabic documents automatically based on lexical analysis

    OpenAIRE

    Molijy, Abdulrahman Al; Hmeidi, Ismail; Alsmadi, Izzat

    2012-01-01

    The continuous information explosion through the Internet and all information sources makes it necessary to perform all information-processing activities automatically, in quick and reliable manners. In this paper, we proposed and implemented a method to automatically create an index for books written in the Arabic language. The process depends largely on text summarization and abstraction processes to collect the main topics and statements in the book. The process is developed in terms of accuracy a...
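
A minimal sketch of frequency-based index construction (deliberately ignoring the Arabic-specific lexical analysis and summarization the paper relies on): rank terms per page by frequency and record which pages each frequent term appears on.

```python
import re
from collections import Counter

def build_index(pages, top_n=5):
    """Toy back-of-book index: for each page, keep the top_n most frequent
    terms and record the page numbers on which they occur. A real system
    would add stemming, stop-word removal and other lexical analysis."""
    index = {}
    for page_no, text in enumerate(pages, start=1):
        words = re.findall(r"\w+", text.lower())
        for term, _count in Counter(words).most_common(top_n):
            index.setdefault(term, []).append(page_no)
    return index
```

For instance, a term appearing among the most frequent words on pages 1 and 2 is indexed under both page numbers.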

  12. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    and achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented...... work allows to minimize cycle time as well as setup cost, which are essential factors in automatic bin-picking. It therefore leads to a wider applicability of bin-picking in industry....

  13. Automatic graphene transfer system for improved material quality and efficiency

    OpenAIRE

    Alberto Boscá; Jorge Pedrós; Javier Martínez; Tomás Palacios; Fernando Calle

    2015-01-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The proce...

  14. Nature Conservation Drones for Automatic Localization and Counting of Animals

    OpenAIRE

    Gemert, van, M.J.C.; Verschoor, C.R.; Mettes, P.; Epema, K.; Koh, L. P.; Wich, S.

    2014-01-01

    This paper is concerned with nature conservation by automatically monitoring animal distribution and animal abundance. Typically, such conservation tasks are performed manually on foot or after an aerial recording from a manned aircraft. Such manual approaches are expensive, slow and labor intensive. In this paper, we investigate the combination of small unmanned aerial vehicles (UAVs or “drones”) with automatic object recognition techniques as a viable solution to manual animal surveying. Si...

  15. Automatic learning strategies and their application to electrophoresis analysis

    OpenAIRE

    Roch, Christian Maurice; Pun, Thierry; Hochstrasser, Denis; Pellegrini, Christian

    1989-01-01

    Automatic learning plays an important role in image analysis and pattern recognition. A taxonomy of automatic learning strategies is presented; this categorization is based on the amount of inferences the learning element must perform to bridge the gap between environmental and system knowledge representation level. Four main categories are identified and described: rote learning, learning by deduction, learning by induction, and learning by analogy. An application of learning by induction to...

  16. A Test Suite for High-Performance Parallel Java

    OpenAIRE

    Hauser, Jochem; Ludewig, Thorsten; Williams, Roy D.; Winkelmann, Ralf; Gollnick, Torsten; Brunett, Sharon; Muylaert, Jean

    1999-01-01

    The Java programming language has a number of features that make it attractive for writing high-quality, portable parallel programs. A pure object formulation, strong typing and the exception model make programs easier to create, debug, and maintain. The elegant threading provides a simple route to parallelism on shared-memory machines. Anticipating great improvements in numerical performance, this paper presents a suite of simple programs that indicate how a pure Java Navier-Stokes solver mi...

  17. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it needs a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system comprising an underwater inspection robot, a laser position-control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor vessel mockup and on a real reactor ready for Ulchine nuclear power plant unit 6 at Dusan Heavy Industry in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction of inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be broadly applied to the automation of similar systems in nuclear power plants.

  18. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity.This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  19. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulation items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and reduced radiation exposure. The final objective of this project is to develop an automatic pipeline inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and developed the robot control computer which controls the robot to navigate exactly along the inspection path. We expect our system to contribute to reduction of inspection time, performance enhancement, and effective management of inspection results. The system developed by this project can be practically used for inspection work after field tests. (author)

  20. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    A system for the semi-automatic survey of drawings is presented. Its design has been oriented toward reducing the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, but the pen of the plotter is replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between successive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor permits the identification of starting points and end points of a line, so that connected lines in the drawing can be followed automatically. The advantage of the described method is that precision depends practically only on the plotter performance, the sensor resolution mattering only for the thickness of strokes and the distance between two strokes. (author)

  1. Automatic Phonetic Transcription for Danish Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    to acquire and expensive to create. For languages with productive compounding or agglutinative languages like German and Finnish, respectively, phonetic dictionaries are also hard to maintain. For this reason, automatic phonetic transcription tools have been produced for many languages. The quality...... of automatic phonetic transcriptions vary greatly with respect to language and transcription strategy. For some languages where the difference between the graphemic and phonetic representations are small, graphemic transcriptions can be used to create ASR systems with acceptable performance. In other languages......, syllabication, stød and several other suprasegmental features (Kirkedal, 2013). Simplifying the transcriptions by filtering out the symbols for suprasegmental features in a post-processing step produces a format that is suitable for ASR purposes. eSpeak is an open source speech synthesizer originally created...

  2. Automatic reactor power control device

    International Nuclear Information System (INIS)

    An anticipated transient without scram (ATWS) of a BWR-type reactor is judged, and a signal is generated, based on a reactor power signal and a scram actuation demand signal. The ATWS signal and a predetermined water level signal to be generated upon occurrence of ATWS are input, and an injection water flow rate signal giving the injection flow rate optimum for reactor flooding and power suppression is output. In addition, a reactor pressure setting signal is output based on the injection performance of the high-pressure or low-pressure water injection system upon occurrence of ATWS. Further, the reactor pressure setting signal is input to calculate the opening/closing set pressures of the main steam relief valve and to output an opening set-pressure signal and a closing set-pressure signal for the valve. As a result, the reactor power and the reactor water level can be automatically controlled even upon an ATWS caused by failure to insert all of the control rods, making it possible to maintain the integrity and safety of the reactor, the reactor pressure vessel and the reactor container. (N.H.)

  3. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.
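
The first step of such a system, recognizing a frequently occurring computation in source code before replacing it with a parallel algorithm, can be illustrated with a toy detector for sum-reduction loops (a sketch over Python ASTs, not the knowledge-based system described):

```python
import ast

def is_sum_reduction(src):
    """Detect the classic  acc += expr  loop pattern that a parallelizing
    compiler could replace by a parallel reduction."""
    tree = ast.parse(src)
    for node in ast.walk(tree):
        if isinstance(node, ast.For):
            for stmt in node.body:
                if (isinstance(stmt, ast.AugAssign)
                        and isinstance(stmt.op, ast.Add)
                        and isinstance(stmt.target, ast.Name)):
                    return True
    return False
```

A recognizer like this returns True for `for x in data: s += x` and False for a loop with no accumulation, which is the kind of local pattern knowledge that guides algorithm replacement.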

  4. Robust automatic target recognition in FLIR imagery

    Science.gov (United States)

    Soyman, Yusuf

    2012-05-01

    In this paper, a robust automatic target recognition algorithm for FLIR imagery is proposed. The target is first segmented from the background using a wavelet transform; segmentation is accomplished by a parametric Gabor wavelet transformation. Invariant features of the segmented target are then extracted via moments. Higher-order moments, while providing better quality for identifying the image, are more sensitive to noise. A trade-off study is therefore performed to select a few moments that provide effective performance. A Bayes method with the Mahalanobis distance is used for classification. Results are assessed based on false alarm rates. The proposed method is shown to be robust against rotations, translations and scale effects. Moreover, it is shown to perform effectively on low-contrast objects in FLIR images. Performance comparisons are also made on both GPU and CPU; results indicate that the GPU has superior performance over the CPU.
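A Bayes classifier over Gaussian classes with equal priors and per-class statistics reduces to assigning each feature vector to the class with the smallest Mahalanobis distance, which appears to be the decision rule used here. A minimal sketch, with invented two-feature "moment" vectors and class statistics purely for illustration:

```python
# Toy minimum-Mahalanobis-distance classifier; the two-feature vectors
# and class statistics are invented for illustration.

def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance for 2-D features (manual 2x2 inverse)."""
    a, b = cov[0]; c, d = cov[1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mean[0], x[1] - mean[1]]
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

def classify(x, classes):
    """Assign x to the class whose statistics give the smallest distance."""
    return min(classes, key=lambda name: mahalanobis2(x, *classes[name]))

classes = {
    "tank":  ([2.0, 5.0], [[1.0, 0.0], [0.0, 1.0]]),
    "truck": ([8.0, 1.0], [[1.0, 0.0], [0.0, 1.0]]),
}
label = classify([2.5, 4.0], classes)   # nearer the "tank" statistics
```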

  5. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob;

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for t...... be combined with target-specific optimizations. Furthermore, comparing the first benchmark to hand-parallelized, hand-optimized pthreads and OpenMP versions, we find that code generated using our approach typically outperforms the pthreads code (within 93-339%). It also performs competitively against the OpenMP code (within 75-111%). The second benchmark outperforms hand-parallelized and optimized OpenMP code (within 109-242%)....

  6. Prospects for de-automatization.

    Science.gov (United States)

    Kihlstrom, John F

    2011-06-01

    Research by Raz and his associates has repeatedly found that suggestions for hypnotic agnosia, administered to highly hypnotizable subjects, reduce or even eliminate Stroop interference. The present paper sought unsuccessfully to extend these findings to negative priming in the Stroop task. Nevertheless, the reduction of Stroop interference has broad theoretical implications, both for our understanding of automaticity and for the prospect of de-automatizing cognition in meditation and other altered states of consciousness. PMID:20356765

  7. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automation of user management at the company Studio Moderna. The company has grown rapidly in recent years, which is why we needed a faster, easier and cheaper way of managing user accounts. We automated the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop-down menus, wrote a script in the scripting programming langu...

  8. Automatic Number Plate Recognition System

    OpenAIRE

    Rajshree Dhruw; Dharmendra Roy

    2014-01-01

    Automatic Number Plate Recognition (ANPR) is a mass surveillance system that captures images of vehicles and recognizes their license numbers. The objective is to design an efficient automatic authorized-vehicle identification system using Indian vehicle number plates. In this paper we discuss different methodologies for number plate localization and for character segmentation and recognition of the number plate. The system is mainly applicable to non-standard Indian number plates by recognizing...

  9. Eating as an Automatic Behavior

    OpenAIRE

    Deborah A. Cohen, MD, MPH; Thomas A. Farley, MD, MPH

    2007-01-01

    The continued growth of the obesity epidemic at a time when obesity is highly stigmatizing should make us question the assumption that, given the right information and motivation, people can successfully reduce their food intake over the long term. An alternative view is that eating is an automatic behavior over which the environment has more control than do individuals. Automatic behaviors are those that occur without awareness, are initiated without intention, tend to continue without contr...

  10. The Midas touch: automatic steering for longwall shearers

    Energy Technology Data Exchange (ETDEWEB)

    1986-10-01

    Automatic steering for longwall shearers has helped many mines produce record tonnages. British Coal's Machine Information Display and Automation System (MIDAS) has been designed to provide automatic steering for both single- and double-ended ranging drum shearers. The article describes the MIDAS unit, which is installed in the shearer and performs all control computation. The main objective is to maintain a constant roof coal thickness and work to a constant seam height. The use of natural radiation sensors places some limitations on the MIDAS system, as the seam must have a shale overburden to emit the necessary natural gamma radiation. The fully automatic, degraded automatic and manual operating modes are described, along with the early trials of MIDAS and its use at Silverwood Colliery.

  11. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with ²³⁸Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  12. SEMANTIC INTEGRATION FOR AUTOMATIC ONTOLOGY MAPPING

    Directory of Open Access Journals (Sweden)

    Siham AMROUCH

    2013-11-01

    Full Text Available In the last decade, ontologies have played a key technology role for information sharing and agent interoperability in different application domains. In the semantic web domain, ontologies are efficiently used to face the great challenge of representing the semantics of data, in order to bring the actual web to its full power and hence achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To meet this requirement, mapping ontologies is a solution that is not to be avoided. Indeed, ontology mapping builds a meta layer that allows different applications and information systems to access and share their information, of course after resolving the different forms of syntactic, semantic and lexical mismatches. In the contribution presented in this paper, we have integrated the semantic aspect based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character constitutes the main difference between our contribution and most of the existing semi-automatic algorithms of ontology mapping, such as Chimaera, Prompt, Onion, Glue, etc. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules. The former analyses the concepts' names and the latter analyses their properties. Each of these two sub-modules is itself based on the combination of lexical and semantic similarity measures.
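The combination of lexical and semantic similarity measures can be sketched as follows. This is a stand-in, not the paper's algorithm: the tiny synonym table replaces WordNet, and the equal 0.5/0.5 weighting is an assumption.

```python
# Illustrative combination of a lexical measure (normalised Levenshtein)
# and a semantic measure (a toy synonym table standing in for WordNet).
# Weights and the synonym table are invented assumptions.

def lexical_sim(a, b):
    """Normalised Levenshtein similarity between two concept names."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(prev[j] + 1, cur[j - 1] + 1,
                         prev[j - 1] + (a[i - 1] != b[j - 1]))
        prev = cur
    return 1.0 - prev[n] / max(m, n)

SYNONYMS = {frozenset({"car", "automobile"}), frozenset({"person", "human"})}

def semantic_sim(a, b):
    """1.0 for identical or synonymous names, else 0.0 (WordNet stand-in)."""
    return 1.0 if a == b or frozenset({a, b}) in SYNONYMS else 0.0

def combined_sim(a, b, w_lex=0.5, w_sem=0.5):
    return w_lex * lexical_sim(a, b) + w_sem * semantic_sim(a, b)

score_syn = combined_sim("car", "automobile")  # found only via synonyms
score_lex = combined_sim("colour", "color")    # found only via spelling
```

The two examples show why both measures are needed: "car"/"automobile" share almost no letters, while "colour"/"color" are not in any synonym list.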

  13. Debugging of Combination Control and Automatic Regulation of Oil-feed Pumps and Its Improvement

    Institute of Scientific and Technical Information of China (English)

    唐立平

    2003-01-01

    The 4×300 MW unit project at Qianbei Power Plant has one fuel-oil pump house with three oil-feed pumps that supply fuel oil to four boilers; the system uses the Siemens PCS7 process control system. During commissioning we found that the control logic, implemented by the PLC vendor according to the requirements provided by the Southwest Electric Power Design Institute, caused unstable operation of the oil-feed pumps in actual service, a safety hazard that endangered unit operation. After repeated modification and debugging, the logic was adapted to the on-site operating requirements.

  14. Study on Debugging Program and Operation of ARN Type Automatic Tuning Neutralizer in 10 kV System

    Institute of Scientific and Technical Information of China (English)

    陈三运; 伍昌庭

    2000-01-01

    This paper describes the automatic tuning arc-suppression devices installed in the 10 kV systems of two 110 kV substations in the Yichang urban area, covering the working principle of the automatic tuning system and its commissioning and debugging. It presents a typical analysis of the device's behaviour after it was put into operation, proposes improvement measures, and reports the results achieved.

  15. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and show a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction may be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach that improves the boundary-searching ability of the live-wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmenting 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.
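The live-wire method mentioned above is, at its core, a minimum-cost graph search between user seeds over a pixel cost map in which boundary pixels are cheap. A minimal sketch of that idea, with a 4-neighbour grid and an invented toy cost map (real live-wire uses gradient-based costs and 8-connectivity):

```python
# Minimal Dijkstra sketch of the graph-search idea behind live-wire:
# the boundary between two seeds is the minimum-cost path over a cost
# grid. The toy cost map and 4-connectivity are simplifying assumptions.
import heapq

def live_wire(cost, start, goal):
    """Dijkstra shortest path over a 2-D cost grid; returns the path."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue            # stale heap entry
        for nr, nc in ((r+1, c), (r-1, c), (r, c+1), (r, c-1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [goal], goal
    while node != start:        # walk predecessors back to the seed
        node = prev[node]
        path.append(node)
    return path[::-1]

# A low-cost "edge" runs along row 1 of this toy cost map.
cost = [[9, 9, 9, 9],
        [1, 1, 1, 1],
        [9, 9, 9, 9]]
path = live_wire(cost, (1, 0), (1, 3))
```

The path snaps to the cheap row, which is exactly how live-wire contours snap to image edges; the enhancement proposed in the paper would sharpen that cost map.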

  16. COMMISSIONING AND DETECTOR PERFORMANCE GROUPS (DPG)

    CERN Multimedia

    A. Ryd and T. Camporesi

    2010-01-01

    Commissioning and Run Coordination activities. After the successful conclusion of the LHC pilot run in 2009, commissioning activities at the experiment restarted only late in January owing to cooling and detector maintenance. As usual, we got going with weekly exercises used to deploy, debug, and validate improvements in firmware and software. A debriefing workshop aimed at analyzing the operational aspects of the 2009 pilot run was held on Jan. 15, 2009, to define a list of improvements (and relative priorities) to be planned. In the last month, most of the objectives set in the debriefing workshop have been attained. The major achievements/improvements obtained are the following: - Consolidation of the firmware for both readout and trigger for ECAL; - Software implementation of procedures for raising the bias voltage of the silicon tracker and pixels, driven by LHC mode changes, with automatic propagation of the state changes from the DCS to the DAQ. The improvements in the software and firmware allow suppress...

  17. An automatic high frequency radio network for the Military Airlift Command

    Science.gov (United States)

    Bland, R. G.; Hall, W. D.; Salamone, P. P.; Wickwire, K. H.

    The authors describe the initial design of an automatic high-frequency (HF) radio network for the Military Airlift Command (MAC). The network design incorporates new technology that will make HF communications more reliable. The network uses the Automatic Communications Processor (ACP) developed for MAC to perform adaptive frequency selection and automatic link establishment (ALE). The MAC network functions with decentralized control performed by network control computers at each node, which carry out networking functions such as automatic route selection and message relaying. Simulation results show that an automated HF radio network provides significant improvement in message delivery compared with a network that uses only point-to-point links.

  18. High Performance LINUX Clusters With OSCAR, Rocks, OpenMosix & MPI

    CERN Document Server

    Sloan, Joseph D

    2005-01-01

    This new guide covers everything you need to plan, build, and deploy a high-performance Linux cluster. You'll learn about planning, hardware choices, bulk installation of Linux on multiple systems, and other basic considerations. Learn about the major free software projects and how to choose those that are most helpful to new cluster administrators and programmers. Guidelines for debugging, profiling, performance tuning, and managing jobs from multiple users round out this immensely useful book

  19. Automatic monitoring system for ''F'' installation

    International Nuclear Information System (INIS)

    The design and operation procedure of the first part of the automatic radiation monitoring system of the Laboratory of Nuclear Problems, JINR (the ''F'' installation) are described. The system consists of 50 data-measuring lines, of which 30 are used for monitoring by means of radiation detectors, 12 control the state of branch circuits, and the others give auxiliary information on the accelerator performance. The data are handled and registered by a crate controller with a built-in microcomputer once every few seconds. The monitoring results are output to a special light panel, a sound signal, and a printer

  20. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    reduce the risks of ending in local minima. However, the numerical performance of continuation methods has not been studied in detail. The first purpose of this article is to benchmark existing continuation methods and the classical formulation with fixed penalty parameter in structural topology...... issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing...

  1. ASAM: Automatic Architecture Synthesis and Application Mapping

    DEFF Research Database (Denmark)

    Jozwiak, L.; Lindwer, M.; Corvino, R.;

    2012-01-01

    This paper focuses on mastering the automatic architecture synthesis and application mapping for heterogeneous massively-parallel MPSoCs based on customizable application-specific instruction-set processors (ASIPs). It presents an over-view of the research being currently performed in the scope...... of the European project ASAM of the ARTEMIS program. The paper briefly presents the results of our analysis of the main problems to be solved and challenges to be faced in the design of such heterogeneous MPSoCs. It explains which system, design, and electronic design automation (EDA) concepts seem to be adequate...

  2. ASAM: Automatic architecture synthesis and application mapping

    DEFF Research Database (Denmark)

    Jozwiak, Lech; Lindwer, Menno; Corvino, Rosilde;

    2013-01-01

    This paper focuses on mastering the automatic architecture synthesis and application mapping for heterogeneous massively-parallel MPSoCs based on customizable application-specific instruction-set processors (ASIPs). It presents an overview of the research being currently performed in the scope...... of the European project ASAM of the ARTEMIS program. The paper briefly presents the results of our analysis of the main challenges to be faced in the design of such heterogeneous MPSoCs. It explains which system, design, and electronic design automation (EDA) concepts seem to be adequate to address the challenges...

  3. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple...... trials of cross-validation. Second, we test the robustness of each system to spectral equalization. Finally, we test how well human subjects recognize the genres of music excerpts composed by each system to be highly genre representative. Our results suggest that neither high-performing system has a......

  4. Vision-based industrial automatic vehicle classifier

    Science.gov (United States)

    Khanipov, Timur; Koptelov, Ivan; Grigoryev, Anton; Kuznetsova, Elena; Nikolaev, Dmitry

    2015-02-01

    The paper describes an automatic, video-stream-based motor vehicle classification system. The system determines vehicle type at payment collection plazas on toll roads. Classification is performed in accordance with a preconfigured set of rules that determine the type by the number of wheel axles, vehicle length, height over the first axle and full height. These characteristics are calculated using various computer vision algorithms: contour detectors, correlation analysis, the fast Hough transform, Viola-Jones detectors, connected-component analysis, elliptic shape detectors and others. Input data consist of video streams and induction loop signals. Output signals are vehicle entry and exit events, vehicle type, motion direction, speed and the features mentioned above.
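The "preconfigured set of rules" step can be sketched as a first-match rule table over the measured features. The class boundaries and class names below are invented for illustration, not the plaza's actual configuration:

```python
# Hedged sketch of a rule-based vehicle-type decision; thresholds and
# labels are invented assumptions.

def classify_vehicle(axles, length_m, height_over_first_axle_m, full_height_m):
    """Return a toll class from measured features; first matching rule wins."""
    rules = [
        (lambda: axles >= 3 and full_height_m > 3.0,            "heavy truck"),
        (lambda: axles == 2 and full_height_m > 2.5,            "bus/van"),
        (lambda: axles == 2 and height_over_first_axle_m > 1.3, "SUV"),
        (lambda: axles == 2 and length_m <= 5.5,                "car"),
    ]
    for predicate, label in rules:
        if predicate():
            return label
    return "unclassified"

kind = classify_vehicle(axles=2, length_m=4.4,
                        height_over_first_axle_m=0.9, full_height_m=1.5)
```

Keeping the rules as data (rather than hard-coded branches) matches the paper's point that the mapping from features to toll classes is reconfigurable per plaza.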

  5. Automatic control algorithm effects on energy production

    Science.gov (United States)

    Mcnerney, G. M.

    1981-01-01

    A computer model was developed using actual wind time series and turbine performance data to simulate the power produced by the Sandia 17-m VAWT operating in automatic control. The model was used to investigate the influence of starting algorithms on annual energy production. The results indicate that, depending on turbine and local wind characteristics, a bad choice of a control algorithm can significantly reduce overall energy production. The model can be used to select control algorithms and threshold parameters that maximize long term energy production. The results from local site and turbine characteristics were generalized to obtain general guidelines for control algorithm design.
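The effect of a starting algorithm on energy capture can be illustrated with a toy hysteresis simulation in the spirit of the study above. The wind series, power curve, and thresholds below are invented, not Sandia's data:

```python
# Toy simulation: a start/stop algorithm with hysteresis versus one
# without. Wind series, cubic power curve, and thresholds are invented.

def simulate_energy(wind, cut_in=5.0, cut_out=4.0, rated=100.0):
    """Start above cut_in, stop below cut_out; returns total energy
    (arbitrary units, one time unit per sample)."""
    running, energy = False, 0.0
    for v in wind:
        if not running and v >= cut_in:
            running = True
        elif running and v < cut_out:
            running = False
        if running:
            energy += min(rated, v ** 3 / 10.0)   # toy power curve
    return energy

wind = [3, 6, 4.5, 6, 3, 6, 4.5, 6]
# A hysteresis band keeps the turbine running through short lulls...
wide = simulate_energy(wind, cut_in=5.0, cut_out=4.0)
# ...while stopping at the start threshold sheds energy on every lull.
narrow = simulate_energy(wind, cut_in=5.0, cut_out=5.0)
```

Even this toy model reproduces the paper's qualitative conclusion: a poorly chosen stop threshold causes repeated shutdowns in gusty wind and reduces total production.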

  6. Do judgments of learning predict automatic influences of memory?

    Science.gov (United States)

    Undorf, Monika; Böhm, Simon; Cüpper, Lutz

    2016-06-01

    Current memory theories generally assume that memory performance reflects both recollection and automatic influences of memory. Research on people's predictions about the likelihood of remembering recently studied information on a memory test, that is, on judgments of learning (JOLs), suggests that both magnitude and resolution of JOLs are linked to recollection. However, it has remained unresolved whether JOLs are also predictive of automatic influences of memory. This issue was addressed in 3 experiments. Using the process-dissociation procedure, we assessed the predictive accuracy of immediate and delayed JOLs (Experiment 1) and of immediate JOLs from a first and from a second study-test cycle (Experiments 2 and 3) for recollection and automatic influences. Results showed that each type of JOLs was predictive of both recollection and automatic influences. Moreover, we found that a delay between study and JOL improved the predictive accuracy of JOLs for recollection, while study-test experience improved the predictive accuracy of JOLs for both recollection and automatic influences. These findings demonstrate that JOLs predict not only recollection, but also automatic influences of memory. PMID: 26595068
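The process-dissociation procedure used here separates the two influences with Jacoby's standard equations: recollection R = I − E and automatic influence A = E / (1 − R), where I and E are the proportions of old items produced under inclusion and exclusion instructions. A worked example (the proportions are invented for illustration):

```python
# Jacoby's process-dissociation equations; the inclusion/exclusion
# proportions below are invented for illustration.

def process_dissociation(inclusion, exclusion):
    recollection = inclusion - exclusion          # R = I - E
    automatic = exclusion / (1.0 - recollection)  # A = E / (1 - R)
    return recollection, automatic

# R = 0.70 - 0.30 = 0.40;  A = 0.30 / 0.60 = 0.50
R, A = process_dissociation(inclusion=0.70, exclusion=0.30)
```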

  7. All-optical automatic pollen identification: Towards an operational system

    Science.gov (United States)

    Crouzy, Benoît; Stella, Michelle; Konzelmann, Thomas; Calpini, Bertrand; Clot, Bernard

    2016-09-01

    We present results from the development and validation campaign of an optical pollen monitoring method based on time-resolved scattering and fluorescence. Focus is first set on supervised learning algorithms for pollen-taxa identification and on the determination of aerosol properties (particle size and shape). The identification capability provides a basis for a pre-operational automatic pollen season monitoring performed in parallel to manual reference measurements (Hirst-type volumetric samplers). Airborne concentrations obtained from the automatic system are compatible with those from the manual method for total pollen, and the automatic device provides real-time data reliably (a one-week interruption over five months). In addition, although the calibration dataset still needs to be completed, we are able to follow the grass pollen season. The high sampling rate of the automatic device allows us to go beyond the commonly presented daily values, and we obtain statistically significant hourly concentrations. Finally, we discuss the remaining challenges in obtaining an operational automatic monitoring system and how the generic validation environment developed for the present campaign could be used for further tests of automatic pollen monitoring devices.

  8. Exposing MPI Objects for Debugging

    OpenAIRE

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.; Karlsson, Sven; Mohror, Kathryn

    2014-01-01

    Developers rely on debuggers to inspect application state. In applications that use MPI, the Message Passing Interface, the MPI runtime contains an important part of this state. The MPI Tools Working Group has proposed an interface for MPI Handle Introspection. It allows debuggers and MPI implementations to cooperate in extracting information from MPI objects. Information that can then be presented to the developer. MPI Handle Introspection provides a more general interface than previous work...

  9. Automatic Melody Segmentation

    OpenAIRE

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation analysis is a widespread practice among musicians: performers use it to help them memorise pieces, music theorists and historians use it to compare works, music students use it to understand the composi...

  10. Performative

    DEFF Research Database (Denmark)

    Sack-Nielsen, Torsten

    2015-01-01

    The article describes the potential of building skins being climate-adaptive. The principle of folding, and the relation between form and performance of facades are discussed here.

  11. IMP: A performance code

    Science.gov (United States)

    Dauro, Vincent A., Sr.

    IMP (Integrated Mission Program) is a simulation language and code used to model present and future Earth, Moon, or Mars missions. The profile is user-controlled through selection from a large menu of events and maneuvers. A Fehlberg 7/13 Runge-Kutta integrator with error and step-size control is used to numerically integrate the differential equations of motion (DEQs) of three spacecraft: a main, a target, and an observer. Through selection, the DEQs include guided thrust, oblate gravity, atmospheric drag, solar pressure, and Moon gravity effects. Guidance parameters for thrust events and performance parameters of velocity changes (Delta-V) and propellant usage (maximum of five systems) are developed as needed. Print, plot, summary, and debug files are output.
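The error- and step-size-controlled Runge-Kutta integration mentioned above works by comparing two estimates of different order at each step and resizing the step from the discrepancy. A minimal stand-in using an embedded Euler/Heun pair (the 7/13-stage Fehlberg pair is far larger, but the control loop is the same idea); tolerances and growth limits are illustrative:

```python
# Adaptive step-size control with an embedded Euler/Heun pair: a
# minimal stand-in for the Fehlberg-style controller named above.
import math

def integrate(f, t, y, t_end, tol=1e-8):
    """Adaptively integrate y' = f(t, y) from t to t_end."""
    h = (t_end - t) / 100.0
    while t < t_end:
        h = min(h, t_end - t)
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        euler = y + h * k1                  # 1st-order estimate
        heun = y + h * (k1 + k2) / 2.0      # 2nd-order estimate
        err = abs(heun - euler)             # local error estimate
        if err <= tol:
            t, y = t + h, heun              # accept the step
        # resize the step from the error estimate (clamped growth)
        h *= min(2.0, max(0.1, 0.9 * (tol / max(err, 1e-15)) ** 0.5))
    return y

# Test problem: y' = -y, y(0) = 1, so y(1) = exp(-1)
y1 = integrate(lambda t, y: -y, 0.0, 1.0, 1.0, tol=1e-8)
```

Rejected steps cost extra function evaluations but keep the local error under the tolerance, which is why such integrators can take large steps through coast arcs and small steps through burns.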

  12. Indexing of Arabic documents automatically based on lexical analysis

    CERN Document Server

    Molijy, Abdulrahman Al; Alsmadi, Izzat

    2012-01-01

    The continuous information explosion through the Internet and other information sources makes it necessary to perform all information processing activities automatically, in a quick and reliable manner. In this paper, we propose and implement a method to automatically create an index for books written in the Arabic language. The process depends largely on text summarization and abstraction to collect the main topics and statements in the book. The method was evaluated in terms of accuracy and performance, and results showed that it can effectively replace the effort of manually indexing books and documents, which can be very useful in information processing and retrieval applications.
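The core of such an indexer, stripped of the Arabic lexical analysis, is a map from significant terms to the pages on which they occur, with term frequency as a crude stand-in for the paper's summarization step. The stop-word list and cut-off below are invented for illustration:

```python
# Toy back-of-book indexer: term -> pages, plus top terms by frequency.
# Stop words and the cut-off are illustrative; a real system would use
# Arabic lexical analysis and summarization.
from collections import Counter

STOP_WORDS = {"the", "a", "of", "and", "in", "is", "to"}

def build_index(pages):
    """Map each significant term to the set of pages it occurs on."""
    index = {}
    for page_no, text in enumerate(pages, start=1):
        for word in text.lower().split():
            if word not in STOP_WORDS:
                index.setdefault(word, set()).add(page_no)
    return index

def top_terms(pages, k=3):
    """The k most frequent significant terms, as candidate index heads."""
    counts = Counter(w for text in pages for w in text.lower().split()
                     if w not in STOP_WORDS)
    return [w for w, _ in counts.most_common(k)]

pages = ["the grammar of arabic", "arabic morphology and grammar"]
index = build_index(pages)
```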

  13. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  14. On-line current feed and computer aided control tactics for automatic balancing head

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In the designed automatic balancing head, a non-contact induction transformer is used to deliver driving energy, solving the problem of on-line current feed and control. Computer-controlled automatic balancing experiments with phase-magnitude control tactics were performed on a flexible rotor system. The results prove that the energy-feeding method and the control tactics are effective in the automatic balancing head for vibration control.
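The phase-magnitude idea behind single-plane balancing is conveniently expressed with complex phasors: a trial weight reveals the rotor's influence coefficient, from which the correction weight follows. This sketch is offered as background on the technique, not as the paper's controller; all numbers are invented:

```python
# Single-plane influence-coefficient balancing with complex phasors.
# Vibration readings and trial weight are invented for illustration.
import cmath

def correction_weight(v0, v1, trial):
    """v0: initial vibration phasor; v1: vibration with 'trial' weight
    fitted; returns the correction weight phasor that cancels v0."""
    influence = (v1 - v0) / trial       # response per unit weight
    return -v0 / influence

v0 = cmath.rect(5.0, cmath.pi / 4)      # 5 um at 45 degrees
trial = cmath.rect(2.0, 0.0)            # 2 g at 0 degrees
influence = cmath.rect(1.5, cmath.pi / 3)
v1 = v0 + influence * trial             # simulated trial-run reading
w = correction_weight(v0, v1, trial)
residual = v0 + influence * w           # fitting w cancels the vibration
```

Magnitude and angle of `w` are exactly the "phase-magnitude" pair the balancing head must realize with its movable masses.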

  15. Automatic bagout system

    International Nuclear Information System (INIS)

    Nuclear material entrained wastes are generated at the Plutonium Facility at Los Alamos National Laboratory. These wastes are removed from the glove box lines using the bagout method. This is a manual operation performed by technicians. An automated system is being developed to relieve the technicians from this task. The system will reduce the amount of accumulated radiation exposure to the worker. The primary components of the system consist of a six degree of freedom robot, a bag sealing device, and a small gantry robot. 1 ref., 5 figs

  16. Automatic remote communication system

    International Nuclear Information System (INIS)

    The Upgraded RECOVER (Remote Continual Verification) system is a communication system for remote continual verification of the security and safeguards status of nuclear material in principal nuclear facilities. The system is composed of a command center and facility sub-systems. The command center is a mini-computer system that processes C/S (Containment and Surveillance) status data. Facility sub-systems consist of an OSM (On-site Multiplexer), MU (Monitoring Unit) and C/S sensors. The system uses the public telephone network for communication between the command center and facility sub-systems, and it encrypts communication data to prevent falsification and wiretapping by unauthorized persons. This system inherits the design principles of the RECOVER system tested earlier by the IAEA, with upgraded and expanded capabilities. Development of the system began in 1983 and was completed in 1987. Performance tests carried out since 1987 have shown fairly good results, with some indications that further improvements are needed. The Upgraded RECOVER system provides timely information about the status of C/S systems, which could contribute to reduced inspection effort and improved cost performance. (author)

  17. AUTOMATIC SEGMENTATION OF PELVIS FOR BRACHYTHERAPY OF PROSTATE.

    Science.gov (United States)

    Kardell, M; Magnusson, M; Sandborg, M; Alm Carlsson, G; Jeuthe, J; Malusek, A

    2016-06-01

    Advanced model-based iterative reconstruction algorithms in quantitative computed tomography (CT) perform automatic segmentation of tissues to estimate material properties of the imaged object. Compared with conventional methods, these algorithms may improve quality of reconstructed images and accuracy of radiation treatment planning. Automatic segmentation of tissues is, however, a difficult task. The aim of this work was to develop and evaluate an algorithm that automatically segments tissues in CT images of the male pelvis. The newly developed algorithm (MK2014) combines histogram matching, thresholding, region growing, deformable model and atlas-based registration techniques for the segmentation of bones, adipose tissue, prostate and muscles in CT images. Visual inspection of segmented images showed that the algorithm performed well for the five analysed images. The tissues were identified and outlined with accuracy sufficient for the dual-energy iterative reconstruction algorithm whose aim is to improve the accuracy of radiation treatment planning in brachytherapy of the prostate. PMID:26567322
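Two of the techniques the MK2014 algorithm combines, thresholding and region growing, can be sketched together: grow a region from a seed over connected pixels whose intensity stays within a threshold band. The toy image and band are invented for illustration:

```python
# Minimal thresholded region growing over a 4-connected grid; the toy
# "image" and the intensity band are illustrative assumptions.
from collections import deque

def region_grow(image, seed, lo, hi):
    """Set of 4-connected pixels reachable from seed (seed assumed
    in-band) with intensities in [lo, hi]."""
    rows, cols = len(image), len(image[0])
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r+1, c), (r-1, c), (r, c+1), (r, c-1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and lo <= image[nr][nc] <= hi):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# 0 = soft-tissue background, ~100 = a "bone-like" blob in the corner
image = [[100, 100, 0],
         [100,  90, 0],
         [  0,   0, 0]]
bone = region_grow(image, seed=(0, 0), lo=80, hi=255)
```

In the full algorithm this step runs after histogram matching (so the band is comparable across patients) and feeds the deformable-model and atlas stages.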

  18. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell (RBC) aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adopted as a routine in hemorheological and clinical biochemistry laboratories because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  19. Eating as an Automatic Behavior

    Directory of Open Access Journals (Sweden)

    Deborah A. Cohen, MD, MPH

    2008-01-01

    Full Text Available The continued growth of the obesity epidemic at a time when obesity is highly stigmatizing should make us question the assumption that, given the right information and motivation, people can successfully reduce their food intake over the long term. An alternative view is that eating is an automatic behavior over which the environment has more control than do individuals. Automatic behaviors are those that occur without awareness, are initiated without intention, tend to continue without control, and operate efficiently or with little effort. The concept that eating is an automatic behavior is supported by studies that demonstrate the impact of the environmental context and food presentation on eating. The amount of food eaten is strongly influenced by factors such as portion size, food visibility and salience, and the ease of obtaining food. Moreover, people are often unaware of the amount of food they have eaten or of the environmental influences on their eating. A revised view of eating as an automatic behavior, as opposed to one that humans can self-regulate, has profound implications for our response to the obesity epidemic, suggesting that the focus should be less on nutrition education and more on shaping the food environment.

  20. Automatic Association of News Items.

    Science.gov (United States)

    Carrick, Christina; Watters, Carolyn

    1997-01-01

    Discussion of electronic news delivery systems and the automatic generation of electronic editions focuses on the association of related items of different media type, specifically photos and stories. The goal is to be able to determine to what degree any two news items refer to the same news event. (Author/LRW)

  1. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.;

    2012-01-01

    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified including the pupil, iris, sclera, and eyelashes. Based on this segmentation, the ...

  2. Automatic Identification of Metaphoric Utterances

    Science.gov (United States)

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…

  3. Optimization of the running time of an automatic dedusting system considering the generating performance of PV modules

    Institute of Scientific and Technical Information of China (English)

    郭枭; 澈力格尔; 韩雪; 田瑞

    2015-01-01

    Low power-generation efficiency is one of the main obstacles to applying PV (photovoltaic) modules on a large scale, and therefore studying the influencing factors is of great significance. This article describes an independently developed automatic dedusting system for PV modules, which has the advantages of simple structure, low installation cost, reliable operation, dedusting without the use of water, and continuous and effective cleaning. The system has been applied in three settings: stand-alone PV power supplies operating at temperatures from -45℃ to 35℃, experimental tests of PV modules assembled at various angles, and a large-area PV power system. The dedusting effect of the automatic dedusting system was tested at temperatures from -10℃ to 5℃ when applied to a stand-alone PV power supply. With the automatic dedusting system in place, dynamic occlusion during operation was simulated and its influence on the output parameters of the PV modules was studied; the dedusting effect was analyzed under different amounts of ash deposition; the way this effect changes with the amount of deposition was summarized; and the opening time and running period were determined. The experimental PV modules were placed outdoors on open ground at an angle of 45° for 3, 7 and 20 days, giving ash depositions of 0.1274, 0.2933 and 0.8493 g/m2, respectively. The correction coefficient of the PV modules involved in the experiments is 0.9943. The results show that, when the system performs its horizontal and vertical cycles, the cleaning brush causes the output parameters of the PV modules, including output power, current and voltage, to vary in a V-shaped pattern as it crosses a row of cells. Compared with the downward stroke, the output parameters of the PV modules during the upward stroke fluctuate

  4. PLC Transformation and Debugging of the Electrical Control System of the T68 Horizontal Boring Machine

    Institute of Scientific and Technical Information of China (English)

    彭爱梅

    2012-01-01

    This paper introduces ideas and methods for transforming the electrical control system of the T68 horizontal boring machine with a PLC. It then designs and debugs the jog control, high-speed control and braking control for forward and reverse rotation of the spindle motor M1, as well as the spindle variable-speed control, and also designs and debugs the main circuit, auxiliary circuit and control circuit of the boring machine control system. After the reconstruction, the machine tool runs stably, the failure rate is reduced and production efficiency is improved.

  5. Discussion of the construction and debugging of a total-flooding high-expansion foam extinguishing system

    Institute of Scientific and Technical Information of China (English)

    兰雪梅; 应晓东

    2012-01-01

    In view of the fire characteristics of a paint shop in a wood-products factory, the choice of extinguishing system was determined by comparing the technical advantages of a total-flooding high-expansion foam fire-extinguishing system with those of other extinguishing systems. Drawing on the experience of three failed spray tests during project debugging, the key points of constructing and debugging a total-flooding high-expansion foam system, and the problems likely to occur during debugging, are summarized, and solutions are proposed.

  6. Automatic image classification for the urinoculture screening.

    Science.gov (United States)

    Andreini, Paolo; Bonechi, Simone; Bianchini, Monica; Garzelli, Andrea; Mecocci, Alessandro

    2016-03-01

    Urinary tract infections (UTIs) are considered to be the most common bacterial infection; indeed, it is estimated that about 150 million UTIs occur worldwide every year, giving rise to roughly $6 billion in healthcare expenditures and resulting in 100,000 hospitalizations. Nevertheless, it is difficult to carefully assess the incidence of UTIs, since an accurate diagnosis depends both on the presence of symptoms and on a positive urinoculture, whereas in most outpatient settings this diagnosis is made without an ad hoc analysis protocol. On the other hand, in the traditional urinoculture test, a sample of midstream urine is put onto a Petri dish, where a growth medium favors the proliferation of germ colonies. Then, the infection severity is evaluated by a visual inspection of a human expert, an error-prone and lengthy process. In this paper, we propose a fully automated system for urinoculture screening that can provide quick and easily traceable results for UTIs. Based on advanced image processing and machine learning tools, the infection type recognition, together with the estimation of the bacterial load, can be automatically carried out, yielding accurate diagnoses. The proposed AID (Automatic Infection Detector) system provides support during the whole analysis process: first, digital color images of Petri dishes are automatically captured, then specific preprocessing and spatial clustering algorithms are applied to isolate the colonies from the culture ground and, finally, an accurate classification of the infections and their severity evaluation are performed. The AID system speeds up the analysis, contributes to the standardization of the process, allows result repeatability, and reduces the costs. Moreover, the continuous transition between sterile and external environments (typical of the standard analysis procedure) is completely avoided. PMID:26780249
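
    The colony-isolation step described above can be sketched in a few lines. This is a toy illustration under stated assumptions, not the AID implementation: the border-based background estimate, the colour-distance threshold and the minimum-area filter are all invented parameters.

    ```python
    import numpy as np
    from scipy import ndimage

    def isolate_colonies(img, diff_thresh=30.0, min_area=20):
        """Isolate colony-like blobs from a dish image (toy sketch).

        The culture-ground colour is estimated from the image border
        (assuming colonies do not touch it), pixels that differ from it
        are marked, and marked pixels are grouped into spatially
        connected components; tiny components are dropped as noise.
        """
        img = img.astype(float)
        border = np.concatenate([img[0], img[-1], img[:, 0], img[:, -1]])
        background = border.mean(axis=0)               # mean border colour
        dist = np.linalg.norm(img - background, axis=2)
        labels, n = ndimage.label(dist > diff_thresh)  # 4-connected blobs
        for lab in range(1, n + 1):                    # drop tiny blobs
            if (labels == lab).sum() < min_area:
                labels[labels == lab] = 0
        kept = [lab for lab in range(1, n + 1) if (labels == lab).any()]
        return labels, len(kept)

    # Synthetic dish: uniform grey ground with one dark colony.
    img = np.full((50, 50, 3), 200, dtype=np.uint8)
    img[20:28, 20:28] = (100, 50, 50)
    labels, n = isolate_colonies(img)
    print(n)   # 1
    ```

    In the real pipeline the connected components would then be passed to a classifier for infection-type recognition and bacterial-load estimation.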

  7. Automatic sensor placement

    Science.gov (United States)

    Abidi, Besma R.

    1995-10-01

    Active sensing is the process of exploring the environment using multiple views of a scene captured by sensors from different points in space under different sensor settings. Applications of active sensing are numerous and can be found in the medical field (limb reconstruction), in archeology (bone mapping), in the movie and advertisement industry (computer simulation and graphics), in manufacturing (quality control), as well as in the environmental industry (mapping of nuclear dump sites). In this work, the focus is on the use of a single vision sensor (camera) to perform the volumetric modeling of an unknown object in an entirely autonomous fashion. The camera moves to acquire the necessary information in two ways: (a) viewing closely each local feature of interest using 2D data; and (b) acquiring global information about the environment via 3D sensor locations and orientations. A single object is presented to the camera and an initial arbitrary image is acquired. A 2D optimization process is developed. It brings the object in the field of view of the camera, normalizes it by centering the data in the image plane, aligns the principal axis with one of the camera's axes (arbitrarily chosen), and finally maximizes its resolution for better feature extraction. The enhanced image at each step is projected along the corresponding viewing direction. The new projection is intersected with previously obtained projections for volume reconstruction. During the global exploration of the scene, the current image as well as previous images are used to maximize the information in terms of shape irregularity as well as contrast variations. The scene on the borders of occlusion (contours) is modeled by an entropy-based objective functional. This functional is optimized to determine the best next view, which is recovered by computing the pose of the camera. A criterion based on the minimization of the difference between consecutive volume updates is set for termination of the

  8. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  9. Nuclear demagnetization refrigerator with automatic control, pick up and data process system

    International Nuclear Information System (INIS)

    A nuclear demagnetization refrigerator for various physical research at ultralow temperatures, with an automatic control, pick-up and data processing system, has been developed. The design of the main units and the performance of the refrigerator and the automatic system are described. The possibilities of operating the set-up in various regimes are analyzed for the case of NMR investigation of helium quantum crystals

  10. An interactive system for the automatic layout of printed circuit boards (ARAIGNEE)

    International Nuclear Information System (INIS)

    A software package for the automatic layout of printed circuit boards is presented. The program permits interaction by the user during the layout process. The automatic searching of paths can be interrupted at any step and convenient corrections can be inserted. This procedure strongly improves the performance of the program as far as the number of unresolved connections is concerned

  11. Speaker-Machine Interaction in Automatic Speech Recognition. Technical Report.

    Science.gov (United States)

    Makhoul, John I.

    The feasibility and limitations of speaker adaptation in improving the performance of a "fixed" (speaker-independent) automatic speech recognition system were examined. A fixed vocabulary of 55 syllables is used in the recognition system which contains 11 stops and fricatives and five tense vowels. The results of an experiment on speaker…

  12. Automatic Management of Parallel and Distributed System Resources

    Science.gov (United States)

    Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.

    1990-01-01

    Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.

  13. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

    Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis…

  14. Automatic control rod programming for boiling water reactors

    International Nuclear Information System (INIS)

    The objective of long-term control rod programming is to develop a sequence of exposure-dependent control rod patterns that assure the safe and efficient depletion of the nuclear fuel for the duration of the cycle. A two-step method was implemented in the code OCTOPUS to perform this task automatically for the Pennsylvania Power and Light Co. BWRs. Although the execution of OCTOPUS provides good or satisfactory results, its input and execution mode have been improved by making them more user-friendly and automatic. (authors)

  15. Semi-automatic process partitioning for parallel computation

    Science.gov (United States)

    Koelbel, Charles; Mehrotra, Piyush; Vanrosendale, John

    1988-01-01

    On current multiprocessor architectures one must carefully distribute data in memory in order to achieve high performance. Process partitioning is the operation of rewriting an algorithm as a collection of tasks, each operating primarily on its own portion of the data, to carry out the computation in parallel. A semi-automatic approach to process partitioning is considered in which the compiler, guided by advice from the user, automatically transforms programs into such an interacting task system. This approach is illustrated with a picture processing example written in BLAZE, which is transformed into a task system maximizing locality of memory reference.

  16. Automatic target extraction in complicated background for camera calibration

    Science.gov (United States)

    Guo, Xichao; Wang, Cheng; Wen, Chenglu; Cheng, Ming

    2016-03-01

    In order to perform high-precision camera calibration against a complex background, a novel design of planar composite target and a corresponding automatic extraction algorithm are presented. Unlike other commonly used target designs, the proposed target simultaneously encodes the coordinates and the serial numbers of its feature points. Based on the original target, templates are then prepared by three geometric transformations and used as the input of template matching based on shape context. Finally, parity check and region growing methods are used to extract the target as the final result. The experimental results show that the proposed method for automatic extraction and recognition of the proposed target is effective, accurate and reliable.
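
    The template-matching step can be illustrated with plain normalised cross-correlation. This is a deliberately simplified stand-in for the paper's shape-context matching, and all names and sizes below are illustrative.

    ```python
    import numpy as np

    def ncc_match(image, template):
        """Return ((row, col), score) of the best match of `template`
        in `image` under normalised cross-correlation. Brute-force
        sliding window; adequate for the small targets on a board."""
        ih, iw = image.shape
        th, tw = template.shape
        t = template - template.mean()
        tnorm = max(np.sqrt((t * t).sum()), 1e-12)
        best_score, best_pos = -2.0, (0, 0)
        for r in range(ih - th + 1):
            for c in range(iw - tw + 1):
                w = image[r:r + th, c:c + tw]
                wz = w - w.mean()
                denom = np.sqrt((wz * wz).sum()) * tnorm
                score = (wz * t).sum() / denom if denom > 0 else 0.0
                if score > best_score:
                    best_score, best_pos = score, (r, c)
        return best_pos, best_score

    # Plant a template in a flat image and recover its position.
    rng = np.random.default_rng(0)
    template = rng.random((4, 4))
    image = np.zeros((20, 20))
    image[5:9, 7:11] = template
    pos, score = ncc_match(image, template)
    print(pos)   # (5, 7)
    ```

    Production systems would normally use an FFT-based or pyramid implementation rather than this quadratic loop.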

  17. Automatic Schema Evolution in Root

    Institute of Scientific and Technical Information of China (English)

    Rene Brun; Fons Rademakers

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, also when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file.

  18. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering, and advanced signal processing techniques and scientific evaluation methodologies being used in this multidisciplinary field will be part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  19. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, also when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file

  20. Automatic spikes detection in seismogram

    Institute of Scientific and Technical Information of China (English)

    王海军; 靳平; 刘贵忠

    2003-01-01

    Data processing for a seismic network is complex and tedious, because a large amount of data is recorded by the network every day, which makes it impossible to process the data entirely by manual work. Therefore, seismic data should be processed automatically to produce initial results on event detection and location; afterwards, these results are reviewed and modified by an analyst. In automatic processing, data quality checking is important. Three main kinds of problem data exist in real seismic records: spikes, repeated data and dropouts. A spike is defined as an isolated large-amplitude point; the other two kinds of problem data share the feature that the amplitudes of sample points are uniform over an interval. In data quality checking, the first step is to detect and count the problem data in a data segment; if the percentage of problem data exceeds a threshold, the whole data segment is masked and is not processed in the later stages.
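
    The spike-flagging and segment-masking logic described above can be sketched as follows; the MAD-based threshold and the 5% masking fraction are assumptions for illustration, not values from the paper.

    ```python
    import numpy as np

    def find_spikes(x, k=6.0):
        """Flag isolated large-amplitude samples (spikes) by comparing
        each sample's deviation from the median against k robust
        standard deviations (1.4826 * MAD estimates sigma for Gaussian
        noise). k is an assumed tuning parameter."""
        x = np.asarray(x, dtype=float)
        med = np.median(x)
        mad = max(np.median(np.abs(x - med)), 1e-12)
        return np.abs(x - med) > k * 1.4826 * mad

    def mask_segment(x, max_bad_fraction=0.05):
        """Mask (return None for) a whole segment when the fraction of
        problem samples exceeds a threshold, as in the quality check
        described above; otherwise pass the segment through."""
        return None if find_spikes(x).mean() > max_bad_fraction else x

    rng = np.random.default_rng(0)
    trace = rng.normal(0.0, 1.0, 1000)
    trace[500] = 50.0                       # one isolated spike
    print(find_spikes(trace).sum())         # 1
    print(mask_segment(trace) is not None)  # True: only 0.1% bad
    ```

    Detection of repeated data and dropouts would follow the same pattern, flagging intervals where consecutive amplitudes are uniform instead of isolated outliers.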

  1. Automatic registration of satellite imagery

    Science.gov (United States)

    Fonseca, Leila M. G.; Costa, Max H. M.; Manjunath, B. S.; Kenney, C.

    1997-01-01

    Image registration is one of the basic image processing operations in remote sensing. With the increase in the number of images collected every day from different sensors, automated registration of multi-sensor/multi-spectral images has become an important issue. A wide range of registration techniques has been developed for many different types of applications and data. The objective of this paper is to present an automatic registration algorithm which uses a multiresolution analysis procedure based upon the wavelet transform. The procedure is completely automatic and relies on the grey-level information content of the images and their local wavelet transform modulus maxima. The registration algorithm is very simple and easy to apply because it requires essentially a single parameter. We have obtained very encouraging results on test data sets from the TM and SPOT sensor images of forest, urban and agricultural areas.
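
    A minimal flavour of correlation-based registration can be given with FFT phase correlation for pure translations. This is a toy stand-in for the paper's wavelet modulus-maxima procedure, shown only to illustrate matching two images through their frequency content.

    ```python
    import numpy as np

    def estimate_shift(ref, moving):
        """Estimate the integer (dy, dx) such that rolling `moving` by
        (dy, dx) aligns it with `ref`, via FFT phase correlation.
        Assumes a pure circular translation between the two images."""
        cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
        cross /= np.abs(cross) + 1e-12           # keep phase only
        corr = np.fft.ifft2(cross).real          # delta at the shift
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > ref.shape[0] // 2:               # wrap to signed shifts
            dy -= ref.shape[0]
        if dx > ref.shape[1] // 2:
            dx -= ref.shape[1]
        return int(dy), int(dx)

    ref = np.random.default_rng(1).random((32, 32))
    moving = np.roll(ref, (3, 5), axis=(0, 1))
    print(estimate_shift(ref, moving))   # (-3, -5): undoes the roll
    ```

    Real multi-sensor registration must also handle rotation, scale and local distortion, which is where the multiresolution wavelet matching of the paper comes in.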

  2. The Automatic Galaxy Collision Software

    CERN Document Server

    Smith, Beverly J; Pfeiffer, Phillip; Perkins, Sam; Barkanic, Jason; Fritts, Steve; Southerland, Derek; Manchikalapudi, Dinikar; Baker, Matt; Luckey, John; Franklin, Coral; Moffett, Amanda; Struck, Curtis

    2009-01-01

    The key to understanding the physical processes that occur during galaxy interactions is dynamical modeling, and especially the detailed matching of numerical models to specific systems. To make modeling interacting galaxies more efficient, we have constructed the `Automatic Galaxy Collision' (AGC) code, which requires less human intervention in finding good matches to data. We present some preliminary results from this code for the well-studied system Arp 284 (NGC 7714/5), and address questions of uniqueness of solutions.

  3. Automatic Generation of Technical Documentation

    OpenAIRE

    Reiter, Ehud; Mellish, Chris; Levine, John

    1994-01-01

    Natural-language generation (NLG) techniques can be used to automatically produce technical documentation from a domain knowledge base and linguistic and contextual models. We discuss this application of NLG technology from both a technical and a usefulness (costs and benefits) perspective. This discussion is based largely on our experiences with the IDAS documentation-generation project, and the reactions various interested people from industry have had to IDAS. We hope that this summary of ...

  4. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; the ALGOL 68. Other chapters discuss the general purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming language is also shown.

  5. Automatically constructing the semantic web

    OpenAIRE

    Becerra, Victor Manuel; Brown, Matthew; Nasuto, Slawomir

    2008-01-01

    The storage and processing capacity realised by computing has led to an explosion of data retention. We now reach the point of information overload and must begin to use computers to process more complex information. In particular, the proposition of the Semantic Web has given structure to this problem, but has not yet been realised practically. The largest of its problems is that of ontology construction; without a suitable automatic method most ontologies will have to be encoded by hand. In this paper we disc...

  6. Approaches to Automatic Text Structuring

    OpenAIRE

    Erbs, Nicolai

    2015-01-01

    Structured text helps readers to better understand the content of documents. In classic newspaper texts or books, some structure already exists. In the Web 2.0, the amount of textual data, especially user-generated data, has increased dramatically. As a result, there exists a large amount of textual data which lacks structure, thus making it more difficult to understand. In this thesis, we will explore techniques for automatic text structuring to help readers to fulfill their information need...

  7. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range. F...... interactive software is also part of a computer-assisted learning program on digital photogrammetry....

  8. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, an acronym for Automatically Programed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programming NC machine tools. The APT system includes the specification of the APT programming language and a language processor, which executes APT statements and generates the NC machine-tool motions specified by the APT statements.

  9. Automatic translation among spoken languages

    Science.gov (United States)

    Walter, Sharon M.; Costigan, Kelly

    1994-02-01

    The Machine Aided Voice Translation (MAVT) system was developed in response to the shortage of experienced military field interrogators with both foreign language proficiency and interrogation skills. Combining speech recognition, machine translation, and speech generation technologies, the MAVT accepts an interrogator's spoken English question and translates it into spoken Spanish. The spoken Spanish response of the potential informant can then be translated into spoken English. Potential military and civilian applications for automatic spoken language translation technology are discussed in this paper.

  10. Automatic evaluation of uterine cervix segmentations

    Science.gov (United States)

    Lotenberg, Shelly; Gordon, Shiri; Long, Rodney; Antani, Sameer; Jeronimo, Jose; Greenspan, Hayit

    2007-03-01

    In this work we focus on the generation of reliable ground truth data for a large medical repository of digital cervicographic images (cervigrams) collected by the National Cancer Institute (NCI). This work is part of an ongoing effort conducted by NCI together with the National Library of Medicine (NLM) at the National Institutes of Health (NIH) to develop a web-based database of the digitized cervix images in order to study the evolution of lesions related to cervical cancer. As part of this effort, NCI has gathered twenty experts to manually segment a set of 933 cervigrams into regions of medical and anatomical interest. This process yields a set of images with multi-expert segmentations. The objectives of the current work are: 1) generate multi-expert ground truth and assess the difficulty of segmenting an image, 2) analyze observer variability in the multi-expert data, and 3) utilize the multi-expert ground truth to evaluate automatic segmentation algorithms. The work is based on STAPLE (Simultaneous Truth and Performance Level Estimation), which is a well-known method to generate ground truth segmentation maps from multiple experts' observations. We have analyzed both intra- and inter-expert variability within the segmentation data. We propose novel measures of "segmentation complexity" by which we can automatically identify cervigrams that were found difficult to segment by the experts, based on their inter-observer variability. Finally, the results are used to assess our own automated algorithm for cervix boundary detection.
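
    A simple flavour of multi-expert ground truth and inter-observer variability can be sketched with a per-pixel majority vote and mean pairwise Dice. STAPLE itself additionally estimates each expert's sensitivity and specificity via EM, which this toy omits; the tiny masks below are invented.

    ```python
    import numpy as np

    def majority_ground_truth(masks):
        """Per-pixel majority vote over expert binary masks -- a simple
        stand-in for STAPLE, which additionally weights each expert by
        sensitivity/specificity estimated via EM."""
        return np.stack(masks).astype(float).mean(axis=0) >= 0.5

    def mean_pairwise_dice(masks):
        """Inter-observer agreement as the mean Dice coefficient over
        all expert pairs; low values flag hard-to-segment images."""
        dices = []
        for i in range(len(masks)):
            for j in range(i + 1, len(masks)):
                a = masks[i].astype(bool)
                b = masks[j].astype(bool)
                denom = a.sum() + b.sum()
                dices.append(2.0 * np.logical_and(a, b).sum() / denom
                             if denom else 1.0)
        return float(np.mean(dices))

    masks = [np.array([[1, 1, 0, 0]]),
             np.array([[1, 1, 1, 0]]),
             np.array([[1, 0, 0, 0]])]
    print(majority_ground_truth(masks).astype(int))  # [[1 1 0 0]]
    print(round(mean_pairwise_dice(masks), 3))       # 0.656
    ```

    A "segmentation complexity" score in the spirit of the paper could then rank images by how low their mean pairwise agreement is.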

  11. Driver behavior following an automatic steering intervention.

    Science.gov (United States)

    Fricke, Nicola; Griesche, Stefan; Schieben, Anna; Hesse, Tobias; Baumann, Martin

    2015-10-01

    The study investigated driver behavior toward an automatic steering intervention of a collision mitigation system. Forty participants were tested in a driving simulator and confronted with an inevitable collision. They performed a naïve drive and afterwards a repeated exposure in which they were told to hold the steering wheel loosely. In a third drive they experienced a false alarm situation. Data on driving behavior, i.e. steering and braking behavior, as well as subjective data, were assessed in the scenarios. Results showed that most participants held on to the steering wheel strongly or counter-steered during the system intervention during the first encounter. Moreover, subjective data collected after the first drive showed that the majority of drivers were not aware of the system intervention. Data from the repeated drive, in which participants were instructed to hold the steering wheel loosely, showed that significantly more participants held the steering wheel loosely and thus complied with the instruction. This study seems to imply that without knowledge and information about an upcoming intervention, the most prevalent driving behavior is a strong reaction with the steering wheel, similar to an automatic steering reflex, which decreases the system's effectiveness. Results of the second drive show some potential for countermeasures, such as informing drivers shortly before a system intervention in order to prevent inhibiting reactions. PMID:26310799

  12. A Robot Based Automatic Paint Inspection System

    Science.gov (United States)

    Atkinson, R. M.; Claridge, J. F.

    1988-06-01

    The final inspection of manufactured goods is a labour-intensive activity. The use of human inspectors has a number of potential disadvantages; it can be expensive, the inspection standard applied is subjective and the inspection process can be slow compared with the production process. The use of automatic optical and electronic systems to perform the inspection task is now a growing practice but, in general, such systems have been applied to small components which are accurately presented. Recent advances in vision systems and robot control technology have made possible the installation of an automated paint inspection system at the Austin Rover Group's plant at Cowley, Oxford. The automatic inspection of painted car bodies is a particularly difficult problem, but one which has major benefits. The pass line of the car bodies is ill-determined, the surface to be inspected is of varying surface geometry and only a short time is available to inspect a large surface area. The benefits, however, are due to the consistent standard of inspection which should lead to lower levels of customer complaints and improved process feedback. The Austin Rover Group initiated the development of a system to fulfil this requirement. Three companies collaborated on the project; Austin Rover itself undertook the production line modifications required for body presentation, Sira Ltd developed the inspection cameras and signal processing system and Unimation (Europe) Ltd designed, supplied and programmed the robot system. Sira's development was supported by a grant from the Department of Trade and Industry.

  13. Social influence effects on automatic racial prejudice.

    Science.gov (United States)

    Lowery, B S; Hardin, C D; Sinclair, S

    2001-11-01

    Although most research on the control of automatic prejudice has focused on the efficacy of deliberate attempts to suppress or correct for stereotyping, the reported experiments tested the hypothesis that automatic racial prejudice is subject to common social influence. In experiments involving actual interethnic contact, both tacit and expressed social influence reduced the expression of automatic prejudice, as assessed by two different measures of automatic attitudes. Moreover, the automatic social tuning effect depended on participant ethnicity. European Americans (but not Asian Americans) exhibited less automatic prejudice in the presence of a Black experimenter than a White experimenter (Experiments 2 and 4), although both groups exhibited reduced automatic prejudice when instructed to avoid prejudice (Experiment 3). Results are consistent with shared reality theory, which postulates that social regulation is central to social cognition. PMID:11708561

  14. Automatic hypermnesia and impaired recollection in schizophrenia.

    Science.gov (United States)

    Linscott, R J; Knight, R G

    2001-10-01

    Evidence from studies of nonmnemonic automatic cognitive processes provides reason to expect that schizophrenia is associated with exaggerated automatic memory (implicit memory), or automatic hypermnesia. Participants with schizophrenia (n = 22) and control participants (n = 26) were compared on word stem completion (WSC) and list discrimination (LD) tasks administered using the process dissociation procedure. Unadjusted, extended measurement model and dual-process signal-detection methods were used to estimate recollection and automatic memory indices. Schizophrenia was associated with automatic hypermnesia on the WSC task and impaired recollection on both tasks. Thought disorder was associated with even greater automatic hypermnesia. The absence of automatic hypermnesia on the LD task was interpreted with reference to the neuropsychological bases of context and content memory. PMID:11761047

  15. Study on the Anaerobic Debugging of Organic Slurry in Municipal Solid Waste Treatment and Recycling Utilization

    Institute of Scientific and Technical Information of China (English)

    金慧宁; 张进锋; 史东晓; 屈阳; 王风庆; 李习武; 吴海锁

    2014-01-01

    A pilot-scale anaerobic start-up (debugging) study was performed on the organic slurry produced by the crushing and percolation pretreatment of municipal solid waste (MSW). The results indicated: ① inoculated with sludge from a sewage treatment plant, the anaerobic start-up of the pretreated organic slurry was completed in 120 days; ② the start-up consisted of three stages: sludge cultivation, load increase and stable operation. In the cultivation stage, at an organic loading rate (OLR) of 0.5 kg/(m3·d), the COD removal rate first increased slowly and then rapidly. In the load-increase stage, the COD removal rate was maintained at about 85%. In the stable stage, at an OLR of 7.0 kg/(m3·d), the COD removal rate remained at about 80%; ③ the biogas production rate was 0.45-0.55 m3/kg, and the fractions of CH4 and CO2 were maintained at about 55% and 40%, respectively.
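The loading and removal figures quoted above follow from two simple definitions; the sketch below (function and parameter names are my own, not the paper's) shows the arithmetic:

```python
def organic_loading_rate(cod_fed_kg_per_day, reactor_volume_m3):
    """Organic loading rate (OLR) in kg COD/(m3·d)."""
    return cod_fed_kg_per_day / reactor_volume_m3

def cod_removal_fraction(cod_in_mg_l, cod_out_mg_l):
    """Fraction of influent COD removed across the reactor."""
    return (cod_in_mg_l - cod_out_mg_l) / cod_in_mg_l
```

For example, feeding 7 kg of COD per day into a 1 m3 reactor corresponds to the stable-stage OLR of 7.0 kg/(m3·d) reported above.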

  16. The design of automatic washing machine control system based on Proteus

    Institute of Scientific and Technical Information of China (English)

    刘晓彤

    2012-01-01

    This paper discusses the use of the Proteus simulation software to carry out the hardware and software simulation design, debugging and running of an automatic washing machine control system. The design uses an AT89S52 microcontroller as the control core and a washing machine as the controlled object, and controls the entire washing process through the corresponding input and output devices. The paper presents the design of a control scheme for an automatic washing machine from the standpoint of convenience and efficiency: the hardware circuit and the control system software are both designed with Proteus, the compiled program is then loaded onto a virtual prototype built from the hardware schematic, and finally real-time debugging and simulated running of both hardware and software are achieved.

  17. EZ: A Tool For Automatic Redshift Measurement

    Science.gov (United States)

    Garilli, B.; Fumana, M.; Franzetti, P.; Paioro, L.; Scodeggio, M.; Le Fèvre, O.; Paltani, S.; Scaramella, R.

    2010-07-01

    We present EZ (Easy redshift), a tool we have developed within the VVDS project to help with redshift measurement from optical spectra. EZ has been designed with large spectroscopic surveys in mind, and in its development particular care has been given to the reliability of the results obtained in an automatic and unsupervised mode. Nevertheless, the possibility of running it interactively has been preserved, and a graphical user interface for results inspection has been designed. EZ has been successfully used within the VVDS project, as well as within zCosmos. In this article we describe its architecture and the algorithms used, and evaluate its performance on both simulated and real data. EZ is an open-source program, freely downloadable from the Pandora web site.

  18. Punjabi Automatic Speech Recognition Using HTK

    Directory of Open Access Journals (Sweden)

    Mohit Dua

    2012-07-01

    Full Text Available This paper discusses the implementation of an isolated-word Automatic Speech Recognition (ASR) system for the Indian regional language Punjabi. The HTK toolkit, based on the Hidden Markov Model (HMM) statistical approach, is used to develop the system. Initially the system is trained on 115 distinct Punjabi words using data collected from eight speakers, and it is then tested with samples from six speakers in real-time environments. To make the system more interactive and fast, a GUI for the testing module has been developed on the Java platform. The paper also describes the role of each HTK tool used in the various phases of system development, presenting a detailed architecture of an ASR system built using HTK library modules and tools. The experimental results show overall system performance of 95.63% and 94.08%.

  19. Automatic Tuning of Interactive Perception Applications

    CERN Document Server

    Zhu, Qian; Mummert, Lily; Pillai, Padmanabhan

    2012-01-01

    Interactive applications incorporating high-data rate sensing and computer vision are becoming possible due to novel runtime systems and the use of parallel computation resources. To allow interactive use, such applications require careful tuning of multiple application parameters to meet required fidelity and latency bounds. This is a nontrivial task, often requiring expert knowledge, which becomes intractable as resources and application load characteristics change. This paper describes a method for automatic performance tuning that learns application characteristics and effects of tunable parameters online, and constructs models that are used to maximize fidelity for a given latency constraint. The paper shows that accurate latency models can be learned online, knowledge of application structure can be used to reduce the complexity of the learning task, and operating points can be found that achieve 90% of the optimal fidelity by exploring the parameter space only 3% of the time.
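The tuning loop described above (learn latency and fidelity models online, then choose the operating point that maximizes fidelity under a latency bound) reduces, once the models exist, to a constrained search over parameter settings. A minimal sketch; all names here are illustrative assumptions, not the paper's API:

```python
def best_operating_point(candidates, latency_model, fidelity_model, latency_bound):
    """Among candidate parameter settings, return the one with the highest
    predicted fidelity whose predicted latency meets the bound.

    Returns None when no candidate is feasible."""
    feasible = [p for p in candidates if latency_model(p) <= latency_bound]
    return max(feasible, key=fidelity_model) if feasible else None
```

In the paper's setting, the two models are themselves learned online from observed runs, which is what lets the system explore only a small fraction of the parameter space.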

  20. Automatic check of the RFX machine signals

    International Nuclear Information System (INIS)

    The paper deals with an automatic computer procedure developed to detect errors or faults in the signal waveforms measured during the operation of the RFX machine. These data are generated by the diagnostic equipment and other sensors installed in the machine and in its power supplies. The signal check is paramount, since even a single faulty waveform may lead to wrong results in the post-processing phase. Due to their large number, the manual inspection of each signal after the pulse is impossible. The check has to be performed as quickly as possible, in order to identify and discard the defective waveforms before their on-line elaboration. To this purpose, the on-line signal check program is integrated in the control data acquisition system of RFX. Presently, the procedure is being used to check the quality of the saved machine pulse data
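A per-signal check of the kind described, rejecting defective waveforms before their on-line elaboration, can be as simple as range and flatline tests. This sketch is illustrative only and not the RFX procedure itself:

```python
import statistics

def check_waveform(samples, lo, hi, min_std=1e-6):
    """Flag a measured waveform as defective if any sample leaves the
    plausible range, or if the trace is flat (e.g. a dead sensor)."""
    if any(s < lo or s > hi for s in samples):
        return "out-of-range"
    if len(samples) > 1 and statistics.pstdev(samples) < min_std:
        return "flatline"
    return "ok"
```

A real system would add channel-specific limits and timing checks, but the structure (fast, automatic screening of every channel after each pulse) is the same.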

  1. Automatic scanning of NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At the European Laboratory for Particle Physics CERN, personal neutron monitoring for over 4000 collaborators is performed with Kodak NTA film, one of the few suitable dosemeters in the stray radiation environment of a high energy accelerator. After development, films are scanned with a projection microscope. To overcome this lengthy and strenuous procedure an automated analysis system for the dosemeters has been developed. General purpose image recognition software, tailored to the specific needs with a macro language, analyses the digitised microscope image. This paper reports on the successful automatic scanning of NTA films irradiated with neutrons from a /sup 238/Pu-Be source (E approximately=4 MeV), as well as on the extension of the method to neutrons of higher energies. The question of detection limits is discussed in the light of an application of the method in routine personal neutron monitoring. (9 refs).

  2. Unification of automatic target tracking and automatic target recognition

    Science.gov (United States)

    Schachter, Bruce J.

    2014-06-01

    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including the retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms operating at varying time scales. A framework is provided for unifying ATT and ATR.

  3. Automatic Mode Transition Enabled Robust Triboelectric Nanogenerators.

    Science.gov (United States)

    Chen, Jun; Yang, Jin; Guo, Hengyu; Li, Zhaoling; Zheng, Li; Su, Yuanjie; Wen, Zhen; Fan, Xing; Wang, Zhong Lin

    2015-12-22

    Although the triboelectric nanogenerator (TENG) has been proven to be a renewable and effective route for ambient energy harvesting, its robustness remains a great challenge due to the requirement of surface friction for a decent output, especially for the in-plane sliding mode TENG. Here, we present a rationally designed TENG for achieving a high output performance without compromising the device robustness by, first, converting the in-plane sliding electrification into a contact separation working mode and, second, creating an automatic transition between a contact working state and a noncontact working state. The magnet-assisted automatic transition triboelectric nanogenerator (AT-TENG) was demonstrated to effectively harness various ambient rotational motions to generate electricity with greatly improved device robustness. At a wind speed of 6.5 m/s or a water flow rate of 5.5 L/min, the harvested energy was capable of lighting up 24 spot lights (0.6 W each) simultaneously and charging a capacitor to greater than 120 V in 60 s. Furthermore, due to the rational structural design and unique output characteristics, the AT-TENG was not only capable of harvesting energy from natural bicycling and car motion but also acting as a self-powered speedometer with ultrahigh accuracy. Given such features as structural simplicity, easy fabrication, low cost, wide applicability even in a harsh environment, and high output performance with superior device robustness, the AT-TENG renders an effective and practical approach for ambient mechanical energy harvesting as well as self-powered active sensing. PMID:26529374

  4. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, T

    2007-02-22

    Several performance-driven approaches to selectively enforcing interface contracts for scientific components are investigated. The goal is to facilitate debugging deployed applications built from plug-and-play components while keeping the cost of enforcement within acceptable overhead limits. This paper describes a study of global enforcement using a priori execution cost estimates obtained from traces. Thirteen trials are formed from five single-component programs. Enforcement experiments conducted using twenty-three enforcement policies are used to determine the nature of exercised contracts and the impact of a variety of sampling strategies. Performance-driven enforcement appears to be best suited to programs that exercise moderately expensive contracts.
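Selective enforcement of the kind studied here can be sketched as a decorator that checks a contract on only a sampled fraction of calls, trading coverage for overhead. The names and sampling policy below are my own illustration, not the paper's implementation:

```python
import random

def enforce(precondition, rate=1.0, rng=random.random):
    """Check `precondition` on roughly `rate` fraction of calls.

    rate=0.0 disables enforcement entirely; rate=1.0 checks every call."""
    def wrap(fn):
        def inner(*args, **kwargs):
            if rng() < rate and not precondition(*args, **kwargs):
                raise AssertionError("interface contract violated")
            return fn(*args, **kwargs)
        return inner
    return wrap

@enforce(lambda x: x >= 0, rate=1.0)
def sqrt(x):
    return x ** 0.5
```

A performance-driven policy would set `rate` per contract from measured enforcement cost, which is the idea the trace-based cost estimates above support.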

  5. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury Autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  6. Automatic Generation of Technical Documentation

    CERN Document Server

    Reiter, E R; Levine, J; Reiter, Ehud; Mellish, Chris; Levine, John

    1994-01-01

    Natural-language generation (NLG) techniques can be used to automatically produce technical documentation from a domain knowledge base and linguistic and contextual models. We discuss this application of NLG technology from both a technical and a usefulness (costs and benefits) perspective. This discussion is based largely on our experiences with the IDAS documentation-generation project, and the reactions various interested people from industry have had to IDAS. We hope that this summary of our experiences with IDAS and the lessons we have learned from it will be beneficial for other researchers who wish to build technical-documentation generation systems.

  7. Unsupervised automatic music genre classification

    OpenAIRE

    Barreira, Luís Filipe Marques

    2010-01-01

    Work presented in the context of the Master's degree in Informatics Engineering, as a partial requirement for obtaining that degree. In this study we explore automatic music genre recognition and classification of digital music. Music has always been a reflection of cultural differences and an influence on our society. Today's digital content development has triggered the massive use of digital music. Nowadays, digital music is manually labeled without following a universa...

  8. Real time automatic scene classification

    OpenAIRE

    Israël, Menno; Broek, van den, Wouter; Putten, van, M.J.A.M.; Uyl, den, T.M.; Verbrugge, R.; Taatgen, N.; Schomaker, L.

    2004-01-01

    This work has been done as part of the EU VICAR (IST) project and the EU SCOFI project (IAP). The aim of the first project was to develop a real time video indexing classification annotation and retrieval system. For our systems, we have adapted the approach of Picard and Minka [3], who categorized elements of a scene automatically with so-called ’stuff’ categories (e.g., grass, sky, sand, stone). Campbell et al. [1] use similar concepts to describe certain parts of an image, which they named...

  9. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  10. Automatic transcription of polyphonic singing

    OpenAIRE

    Paščinski, Uroš

    2015-01-01

    In this work we focus on automatic transcription of polyphonic singing. In particular we do the multiple fundamental frequency (F0) estimation. From the terrain recordings a test set of Slovenian folk songs with polyphonic singing is extracted and manually transcribed. On the test set we try the general algorithm for multiple F0 detection. An interactive visualization of the main parts of the algorithm is made to analyse how it works and try to detect possible issues. As the data set is ne...

  11. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    Steve Renals

    2011-10-01

    This paper is about the recognition and interpretation of multiparty meetings captured as audio, video and other signals. This is a challenging task since the meetings consist of spontaneous and conversational interactions between a number of participants: it is a multimodal, multiparty, multistream problem. We discuss the capture and annotation of the Augmented Multiparty Interaction (AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  12. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user's message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1, where J is the number of users helping that user.
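The quoted diversity gain is easy to sanity-check numerically; setting J = 0 (no helping users) recovers the conventional gain of M. The function name below is mine, used only to state the formula:

```python
def diversity_gain(M, J):
    """Diversity gain reported for the coordinated HARQ scheme:
    (J+1)(M-1)+1 with M retransmissions and J helping users.
    With J = 0 this reduces to M, the conventional HARQ gain."""
    return (J + 1) * (M - 1) + 1
```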

  13. Automatic surveillance system using fish-eye lens camera

    Institute of Scientific and Technical Information of China (English)

    Xue Yuan; Yongduan Song; Xueye Wei

    2011-01-01

    This letter presents an automatic surveillance system using a fish-eye lens camera. Our system achieves wide-area automatic surveillance without a dead angle using only one camera. We propose a new human detection method that selects the most adaptive classifier based on the locations of the human candidates. Human regions are detected effectively from the fish-eye image and are corrected for perspective distortion. An experiment is performed on indoor video sequences with different illumination and crowded conditions, with results demonstrating the efficiency of our algorithm.

  14. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization.

  15. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings with intervals as initial values. A modification of the mean value enclosure of discrete mappings is considered, namely the extended mean value enclosure, which in most cases leads to even better enclosures. These methods have previously been described in connection with discretizing solutions of ordinary... One of the methods uses the mean value enclosure of an integral operator and interval Bernstein polynomials for enclosing the solution. Two numerical examples are given, using two orders of approximation and different numbers of discretization points.
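The mean value enclosure mentioned above evaluates f(m) + f'(X)(X - m) over an interval X with midpoint m, which for narrow X is typically tighter than direct interval evaluation. This minimal sketch (a toy interval class, not the thesis's implementation) illustrates it for f(x) = x² on [1, 2]:

```python
class Interval:
    """Closed interval [lo, hi] with just the arithmetic needed below."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))

def mean_value_enclosure(f, dfdx, x):
    """Mean value form f(m) + f'(X)(X - m), with m the midpoint of X."""
    m = (x.lo + x.hi) / 2
    d = dfdx(x) * Interval(x.lo - m, x.hi - m)
    return Interval(f(m) + d.lo, f(m) + d.hi)
```

For f(x) = x² with f'(X) = 2X on X = [1, 2], this yields [0.25, 4.25], a valid enclosure of the true range [1, 4].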

  16. Automatic Differentiation of Algorithms for Machine Learning

    OpenAIRE

    Baydin, Atilim Gunes; Pearlmutter, Barak A.

    2014-01-01

    Automatic differentiation --- the mechanical transformation of numeric computer programs to calculate derivatives efficiently and accurately --- dates to the origin of the computer age. Reverse mode automatic differentiation both antedates and generalizes the method of backwards propagation of errors used in machine learning. Despite this, practitioners in a variety of fields, including machine learning, have been little influenced by automatic differentiation, and make scant use of available...
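Reverse-mode automatic differentiation, which the abstract notes both antedates and generalizes backpropagation, can be demonstrated in a few lines: record the computation graph with local partial derivatives, then sweep backwards accumulating adjoints. A minimal illustrative sketch, not taken from the paper:

```python
class Var:
    """Minimal reverse-mode AD node: forward pass records the graph,
    backward() propagates adjoints along it."""
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0
    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])
    def __mul__(self, other):
        # local partials: d(xy)/dx = y, d(xy)/dy = x
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])
    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)
```

For z = x·y + x at x = 3, y = 4, the backward sweep yields dz/dx = y + 1 = 5 and dz/dy = x = 3.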

  17. Automatic Speech Segmentation Based on HMM

    OpenAIRE

    M. Kroul

    2007-01-01

    This contribution deals with the problem of automatic phoneme segmentation using HMMs. Automation of the speech segmentation task is important for applications where a large amount of data needs to be processed, so that manual segmentation is out of the question. In this paper we focus on automatic segmentation of recordings that will be used to create a database of triphone units for speech synthesis. For speech synthesis, the quality of the speech units is a crucial aspect, so the maximal accuracy in segmentation is ...

  18. Automatic Control of Water Pumping Stations

    Institute of Scientific and Technical Information of China (English)

    Muhannad Alrheeh; JIANG Zhengfeng

    2006-01-01

    Automatic control of pumps is an attractive proposal for operating water pumping stations, of which there are many kinds according to their functions. In this paper, the pumping station considered is used in a water supply system. The paper introduces the idea of a pump controller and the important factors that must be considered when designing an automatic control system for water pumping stations. The automatic control circuit, together with the function of all its components, is then introduced.

  19. Automatic inference of specifications using matching logic

    OpenAIRE

    Alpuente Frasnedo, María; Feliú Gabaldón, Marco Antonio; Villanueva García, Alicia

    2013-01-01

    Formal specifications can be used for various software engineering activities ranging from finding errors to documenting software and automatic test-case generation. Automatically discovering specifications for heap-manipulating programs is a challenging task. In this paper, we propose a technique for automatically inferring formal specifications from C code which is based on the symbolic execution and automated reasoning tandem "MATCHING LOGIC /K framework". We implemented our technique for ...

  20. An automatic visual analysis system for tennis

    OpenAIRE

    Connaghan, Damien; Moran, Kieran; O'Connor, Noel E.

    2013-01-01

    This article presents a novel video analysis system for coaching tennis players of all levels, which uses computer vision algorithms to automatically edit and index tennis videos into meaningful annotations. Existing tennis coaching software lacks the ability to automatically index a tennis match into key events, and therefore, a coach who uses existing software is burdened with time-consuming manual video editing. This work aims to explore the effectiveness of a system to automatically de...

  1. Automatic generation of application specific FPGA multicore accelerators

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Schleuniger, Pascal; Jensen, Nicklas Bo;

    2014-01-01

    High performance computing systems make increasing use of hardware accelerators to improve performance and power properties. For large high-performance FPGAs to be successfully integrated in such computing systems, methods to raise the abstraction level of FPGA programming are required. In this paper we propose a tool flow which automatically generates highly optimized hardware multicore systems based on parameters. Profiling feedback is used to adjust these parameters to improve performance and lower the power consumption. For an image processing application we show that our tools are able...

  2. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask later acts as a template for a considerable number of dies on a wafer. Defects on the initial blank substrate, and on subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include the defect's location and size, its signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment inherent in manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
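The decision-tree step described above maps measured defect attributes (size, signal polarity in transmitted and reflected review images) to a classification code. The toy sketch below conveys the idea only; the thresholds and class names are my assumptions, not Calibre ADC's actual rules:

```python
def classify_defect(size_um, transmitted, reflected):
    """Toy decision tree mapping defect attributes to a class label.
    Thresholds and labels are illustrative, not from any real tool."""
    if size_um < 0.05:
        return "false/noise"      # below resolvable size: likely noise
    if transmitted == "dark" and reflected == "dark":
        return "particle"         # opaque in both channels
    if transmitted == "bright":
        return "pinhole"          # light passes where it should not
    return "needs-review"         # ambiguous signature
```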

  3. Automatic colorimetric calibration of human wounds

    Directory of Open Access Journals (Sweden)

    Meert Theo

    2010-03-01

    Full Text Available Abstract Background Recently, digital photography in medicine has come to be considered an acceptable tool in many clinical domains, e.g. wound care. Although ever higher resolutions are available, reproducibility is still poor and visual comparison of images remains difficult. This is even more the case for measurements performed on such images (colour, area, etc.). This problem is often neglected, and images are freely compared and exchanged without further thought. Methods The first experiment checked whether camera settings or lighting conditions could negatively affect the quality of colorimetric calibration. Digital images plus a calibration chart were exposed to a variety of conditions. Precision and accuracy of colours after calibration were quantitatively assessed with a probability distribution for perceptual colour differences (dE_ab). The second experiment was designed to assess the impact of the automatic calibration procedure (i.e. chart detection) on real-world measurements. 40 different images of real wounds were acquired and a region of interest was selected in each image. 3 rotated versions of each image were automatically calibrated and colour differences were calculated. Results 1st experiment: colour differences between the measurements and real spectrophotometric measurements reveal median dE_ab values of 6.40 for the proper patches of calibrated normal images and 17.75 for uncalibrated images, respectively, demonstrating an important improvement in accuracy after calibration. The reproducibility, visualized by the probability distribution of the dE_ab errors between 2 measurements of the patches of the images, has a median of 3.43 dE_ab for all calibrated images and 23.26 dE_ab for all uncalibrated images. If we restrict ourselves to the proper patches of normal calibrated images the median is only 2.58 dE_ab!
Wilcoxon rank-sum testing (p Conclusion The investigators proposed an automatic colour calibration algorithm that ensures reproducible colour
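The dE_ab figures reported in this record are CIE76 colour differences, i.e. Euclidean distances in CIELAB space. A minimal sketch (the helper name is mine):

```python
def delta_e_ab(lab1, lab2):
    """CIE76 colour difference (dE_ab) between two CIELAB (L*, a*, b*) triples:
    the Euclidean distance in Lab space."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5
```

Two colours differing by (dL, da, db) = (0, 3, 4) are 5.0 dE_ab apart; medians of 2.58 vs 17.75 therefore represent a large accuracy gap.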

  4. Automatic laser tracking and ranging system.

    Science.gov (United States)

    Cooke, C R

    1972-02-01

    An automatic laser tracking and ranging system has been developed for use with cooperative retroreflective targets. Target position is determined with high precision at ranges out to 19 km and sample rates up to one hundred measurements per second. The data are recorded on a magnetic tape in the form of azimuth, elevation, range, and standard time and are computer-compatible. The system is fully automatic with the exception of the initial acquisition sequence, which is performed manually. This eliminates the need for expensive and time-consuming photographic data reduction. Also, position is uniquely determined by a single instrument. To provide convenient operation at remote sites, the system is van-mounted and operates off a portable power generator. The transmitter is a flash-pumped Q-spoiled Nd:YAG laser developing 1 MW peak power in a 10-mrad beam at a rate of 100 pps. The beam, which is coaxial with the receiver, is directed to the target by an azimuth-elevation mirror mount. The return beam is imaged on separate ranging and tracking receivers. The ranging receiver measures time of flight of the 25-nsec laser pulse with range accuracies of +/-15 cm. The tracking receiver uses a quadrant photodiode followed by matched log video amplifiers and achieves a tracking accuracy of +/-0.1 mrad. An optical dynamic range of 30 dB is provided to minimize error due to scintillation. Also, 80 dB of optical dynamic range is provided by adjustable neutral density filters to compensate for changes in target range. PMID:20111495
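The quoted +/-15 cm range accuracy is consistent with timing the pulse's round trip at sub-nanosecond resolution: range is half the round-trip time multiplied by the speed of light. A small illustrative sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s):
    """One-way range from the round-trip time of flight of a laser pulse."""
    return C * round_trip_s / 2
```

A 1 ns timing error corresponds to about 0.15 m of range, matching the quoted +/-15 cm accuracy.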

  5. Multilabel Learning for Automatic Web Services Tagging

    Directory of Open Access Journals (Sweden)

    Mustapha AZNAG

    2014-08-01

    Full Text Available Recently, some web services portals and search engines, such as Biocatalogue and Seekda!, have allowed users to manually annotate Web services using tags. User tags provide meaningful descriptions of services and allow users to index and organize their contents. Tagging is widely used to annotate objects in Web 2.0 applications. In this paper we propose a novel probabilistic topic model (which extends the CorrLDA model, Correspondence Latent Dirichlet Allocation) to automatically tag web services according to existing manual tags. Our probabilistic topic model is a latent variable model that exploits local label correlations. Indeed, exploiting label correlations is a challenging and crucial problem, especially in the multi-label learning context. Moreover, several existing systems can recommend tags for web services based on existing manual tags; in most cases, the manual tags have better quality. We also develop three strategies to automatically recommend the best tags for web services. We also propose, in this paper, WS-Portal, an enriched web services search engine which contains 7063 providers, 115 sub-classes of category and 22236 web services crawled from the Internet. In WS-Portal, several technologies are employed to improve the effectiveness of web service discovery (i.e. web services clustering, tag recommendation, service rating and monitoring). Our experiments are performed on real-world web services. The comparisons of Precision@n and Normalised Discounted Cumulative Gain (NDCGn) values for our approach indicate that the method presented in this paper outperforms the method based on CorrLDA in terms of ranking and quality of generated tags.

  6. Digital movie-based on automatic titrations.

    Science.gov (United States)

    Lima, Ricardo Alexandre C; Almeida, Luciano F; Lyra, Wellington S; Siqueira, Lucas A; Gaião, Edvaldo N; Paiva Junior, Sérgio S L; Lima, Rafaela L F C

    2016-01-15

    This study proposes the use of digital movies (DMs) in a flow-batch analyzer (FBA) to perform automatic, fast and accurate titrations. The term used for this process is "Digital movie-based on automatic titrations" (DMB-AT). A webcam records the DM during the addition of the titrant to the mixing chamber (MC). While the DM is recorded, it is decompiled into frames ordered sequentially at a constant rate of 26 frames per second (FPS). The first frame is used as a reference to define the region of interest (ROI) of 28×13 pixels and the R, G and B values, which are used to calculate the Hue (H) values for each frame. The Pearson's correlation coefficient (r) is calculated between the H values of the initial frame and each subsequent frame. The titration curves are plotted in real time using the r values and the opening time of the titrant valve. The end point is estimated by the second derivative method. Software written in the C language manages all analytical steps and data treatment in real time. The feasibility of the method was demonstrated by application to acid/base test samples and edible oils. Results were compared with classical titration and did not present statistically significant differences when the paired t-test at the 95% confidence level was applied. The proposed method is able to process about 117-128 samples per hour for the test and edible oil samples, respectively, and its precision was confirmed by overall relative standard deviation (RSD) values, always less than 1.0%. PMID:26592600
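The per-frame processing described above, RGB-to-Hue conversion followed by Pearson correlation against the initial frame, can be sketched as follows. The two toy "frames" and their pixel values are hypothetical (the real system uses 28×13-pixel ROIs), and the implementation below is written in Python for illustration, not the authors' C software:

```python
import colorsys

def hue_vector(frame):
    """Per-pixel HSV hue for a frame given as (R, G, B) tuples in 0-255."""
    return [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
            for r, g, b in frame]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

# Toy 4-pixel ROI: the titrant shifts the solution from orange toward yellow.
frame0 = [(200, 120, 30), (190, 110, 35), (205, 130, 28), (198, 125, 33)]
frame1 = [(200, 150, 30), (190, 140, 35), (205, 160, 28), (198, 155, 33)]
r = pearson_r(hue_vector(frame0), hue_vector(frame1))
print(round(r, 3))
```

In the real analyzer, a drop in r against the reference frame flags the colour change; the end point is then located from the second derivative of the r-versus-time curve.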

  7. Removal of interproximal subgingival plaque by hand and automatic toothbrushes.

    Science.gov (United States)

    Taylor, J Y; Wood, C L; Garnick, J J; Thompson, W O

    1995-03-01

    Subgingival plaque removal at interproximal sites by automatic and hand toothbrushes was compared with control sites at which cleansing was not performed. There were 58 patients, 35 to 63 years of age, each with one hopeless tooth requiring extraction. Each patient was randomly assigned to one of four test groups: hand brush; automatic toothbrush 1; automatic toothbrush 2; and no brushing. The brushing instructions as stated by the manufacturers were demonstrated and the patient brushed the sextant containing the test tooth for 20 seconds. The level of the gingival margin was marked at each interproximal test site. The teeth were extracted and processed for SEM, and subgingival plaque was viewed at X100 and X2000 magnifications. A montage of photomicrographs of the gingival groove to the occlusal margin of the bacterial plaque at X100 magnification was made and the distance from the groove to the margin was measured. An ANOVA was performed using the P = 0.05 level for significance. Due to processing difficulties, only 33 specimens were available for analysis. The average distances from the groove to the subgingival plaque front for the four test groups were 0.514, 0.132, 0.163, and 0.111 mm, respectively. The maximum distance (1.5 mm) of plaque removal was greatest for the hand toothbrush. Due to its large standard deviation (0.636, compared to 0.146, 0.250, and 0.124, respectively), the hand brushing group was excluded from the ANOVA. There were no statistically significant differences among the automatic toothbrushes and the no brushing control (P = 0.8393). It was concluded that a single session of oral hygiene instruction with an automatic toothbrush did not result in subgingival interproximal plaque cleansing. PMID:7776163

  8. ANPS - AUTOMATIC NETWORK PROGRAMMING SYSTEM

    Science.gov (United States)

    Schroer, B. J.

    1994-01-01

    Development of some of the space program's large simulation projects -- like the project which involves simulating the countdown sequence prior to spacecraft liftoff -- requires the support of automated tools and techniques. The number of preconditions which must be met for a successful spacecraft launch and the complexity of their interrelationship account for the difficulty of creating an accurate model of the countdown sequence. Researchers developed ANPS for the NASA Marshall Space Flight Center to assist programmers attempting to model the pre-launch countdown sequence. Incorporating the elements of automatic programming as its foundation, ANPS aids the user in defining the problem and then automatically writes the appropriate simulation program in GPSS/PC code. The program's interactive user dialogue interface creates an internal problem specification file from user responses which includes the time line for the countdown sequence, the attributes for the individual activities which are part of a launch, and the dependent relationships between the activities. The program's automatic simulation code generator receives the file as input and selects appropriate macros from the library of software modules to generate the simulation code in the target language GPSS/PC. The user can recall the problem specification file for modification to effect any desired changes in the source code. ANPS is designed to write simulations for problems concerning the pre-launch activities of space vehicles and the operation of ground support equipment and has potential for use in developing network reliability models for hardware systems and subsystems. ANPS was developed in 1988 for use on IBM PC or compatible machines. The program requires at least 640 KB memory and one 360 KB disk drive, PC DOS Version 2.0 or above, and GPSS/PC System Version 2.0 from Minuteman Software. The program is written in Turbo Prolog Version 2.0. GPSS/PC is a trademark of Minuteman Software.

  9. Failure of classical traffic flow theories: Stochastic highway capacity and automatic driving

    Science.gov (United States)

    Kerner, Boris S.

    2016-05-01

    In a mini-review, Kerner (2013), it has been shown that classical traffic flow theories and models failed to explain empirical traffic breakdown - a phase transition from metastable free flow to synchronized flow at highway bottlenecks. The main objective of this mini-review is to study the consequences of this failure of classical traffic-flow theories for an analysis of empirical stochastic highway capacity as well as for the effect of automatic driving vehicles and cooperative driving on traffic flow. To reach this goal, we show a deep connection between the understanding of empirical stochastic highway capacity and a reliable analysis of automatic driving vehicles in traffic flow. With the use of simulations in the framework of three-phase traffic theory, a probabilistic analysis of the effect of automatic driving vehicles on mixed traffic flow consisting of a random distribution of automatic driving and manual driving vehicles has been made. We have found that the parameters of automatic driving vehicles can either decrease or increase the probability of the breakdown. The increase in the probability of traffic breakdown, i.e., the deterioration of the performance of the traffic system, can occur already at a small percentage (about 5%) of automatic driving vehicles. The increase in the probability of traffic breakdown through automatic driving vehicles can occur even if every platoon of automatic driving vehicles satisfies the condition for string stability.

  10. Changes in default mode network as automaticity develops in a categorization task.

    Science.gov (United States)

    Shamloo, Farzin; Helie, Sebastien

    2016-10-15

    The default mode network (DMN) is a set of brain regions in which blood oxygen level dependent signal is suppressed during attentional focus on the external environment. Because automatic task processing requires less attention, development of automaticity in a rule-based categorization task may result in less deactivation and altered functional connectivity of the DMN when compared to the initial learning stage. We tested this hypothesis by re-analyzing functional magnetic resonance imaging data of participants trained in rule-based categorization for over 10,000 trials (Helie et al., 2010) [12,13]. The results show that some DMN regions are deactivated in initial training but not after automaticity has developed. There is also a significant decrease in DMN deactivation after extensive practice. Seed-based functional connectivity analyses with the precuneus, medial prefrontal cortex (two important DMN regions) and Brodmann area 6 (an important region in automatic categorization) were also performed. The results show increased functional connectivity with both DMN and non-DMN regions after the development of automaticity, and a decrease in functional connectivity between the medial prefrontal cortex and ventromedial orbitofrontal cortex. Together, these results further support the hypothesis of a strategy shift in automatic categorization and bridge the cognitive and neuroscientific conceptions of automaticity in showing that the reduced need for cognitive resources in automatic processing is accompanied by a disinhibition of the DMN and stronger functional connectivity between DMN and task-related brain regions. PMID:27457134

  11. Automatic defect identification on PWR nuclear power station fuel pellets

    International Nuclear Information System (INIS)

    This article presents a new technique for the automatic identification of structural failures in green nuclear fuel pellets. The technique was developed to identify failures that occur during the fabrication process. It is based on a smart image analysis technique for automatic identification of failures on the uranium oxide pellets used as fuel in PWR nuclear power stations. To achieve this goal, an artificial neural network (ANN) was trained and validated on image histograms of pellets, containing examples not only of normal (flawless) pellets but of defective pellets as well (with the main flaws normally found during the manufacturing process). Based on this technique, a new automatic identification system for flaws on nuclear fuel element pellets, composed of image pre-processing and intelligent analysis stages, will be developed and implemented in the Brazilian nuclear fuel production industry. Based on the theoretical performance of the technology proposed and presented in this article, it is believed that this new system, NuFAS (Nuclear Fuel Pellets Failures Automatic Identification Neural System), will be able to identify structural failures in nuclear fuel pellets with virtually zero error margins. Once implemented, NuFAS will add value to the quality control process of national nuclear fuel production.
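As a rough illustration of the histogram-plus-ANN idea, a single logistic unit trained on normalized grey-level histograms can separate "dark-flaw" pellets from normal ones. The network architecture and the toy 4-bin histograms below are stand-ins; the actual NuFAS design is not described in this abstract:

```python
import math

def train_logistic(hists, labels, lr=0.5, epochs=200):
    """Train one logistic unit on histogram features by stochastic
    gradient descent (a minimal stand-in for the paper's ANN)."""
    w = [0.0] * len(hists[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(hists, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y                      # gradient of log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0               # 1 = defective

# Toy histograms: defective pellets (label 1) have more mass in dark bins.
hists = [[0.10, 0.10, 0.40, 0.40], [0.05, 0.15, 0.40, 0.40],
         [0.50, 0.30, 0.10, 0.10], [0.45, 0.35, 0.10, 0.10]]
labels = [1, 1, 0, 0]
w, b = train_logistic(hists, labels)
print([predict(w, b, h) for h in hists])  # → [1, 1, 0, 0]
```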

  12. Uav-Based Automatic Tree Growth Measurement for Biomass Estimation

    Science.gov (United States)

    Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.

    2016-06-01

    Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically non-effective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aircraft Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data were collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week. In-situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal conditions for a flight and parameter settings for image acquisition. The collected images are processed in a state-of-the-art tool, resulting in the generation of dense 3D point clouds. An algorithm is developed in order to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of the automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.
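The height computation described above, locating the tree top in the point cloud and subtracting the ground level, might be sketched as follows. The low-percentile ground heuristic and the toy cloud are assumptions for illustration, not the authors' algorithm:

```python
def tree_height(points, ground_percentile=0.05):
    """Estimate tree height from a 3D point cloud of one crown:
    top = highest Z; ground = low-percentile Z to reject below-ground noise."""
    zs = sorted(p[2] for p in points)
    ground = zs[int(len(zs) * ground_percentile)]
    return zs[-1] - ground

# Toy cloud: ground points near z = 100.0 m, tree top near z = 103.2 m.
cloud = [(0.0, 0.0, 100.0), (1.0, 0.0, 100.1), (0.0, 1.0, 100.0),
         (0.5, 0.5, 101.5), (0.5, 0.5, 102.4), (0.5, 0.5, 103.2)]
print(round(tree_height(cloud), 2))  # → 3.2
```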

  13. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device

  14. Automatic home medical product recommendation.

    Science.gov (United States)

    Luo, Gang; Thomas, Selena B; Tang, Chunqiang

    2012-04-01

    Web-based personal health records (PHRs) are being widely deployed. To improve PHR's capability and usability, we proposed the concept of intelligent PHR (iPHR). In this paper, we use automatic home medical product recommendation as a concrete application to demonstrate the benefits of introducing intelligence into PHRs. In this new application domain, we develop several techniques to address the emerging challenges. Our approach uses treatment knowledge and nursing knowledge, and extends the language modeling method to (1) construct a topic-selection input interface for recommending home medical products, (2) produce a global ranking of Web pages retrieved by multiple queries, and (3) provide diverse search results. We demonstrate the effectiveness of our techniques using USMLE medical exam cases. PMID:20703712

  15. Automatic sampling of radioactive liquors

    International Nuclear Information System (INIS)

    This paper describes the latest techniques in sampling radioactive liquors in an Irradiated Fuel Reprocessing Plant. Previously to obtain a sample from these liquors operators were involved at the point of sampling, the transport of samples in shielded containers to the laboratories and at the offloading of the samples at the laboratory. Penetration of the radioactive containments occurred at the sampling point and again in the laboratory, these operations could lead to possible radioactive contamination. The latest design consists of a Sample Bottle Despatch Facility Autosampler units, Pneumatic Transfer System and Receipt Facility which reduces considerably operator involvement, provides a safe rapid transport system and minimises any possibility of radioactive contamination. The system can be made fully automatic and ease of maintenance has been ensured by the design

  16. Automatic sampling of radioactive liquors

    International Nuclear Information System (INIS)

    This paper describes the latest techniques in sampling radioactive liquors in an Irradiated Fuel Reprocessing Plant. Previously to obtain a sample from these liquors operators were involved at the point of sampling, the transport of samples in shielded containers to the laboratories and at the offloading of the samples at the laboratory. Penetration of the radioactive containments occurred at the sampling point and again in the laboratory, these operations could lead to possible radioactive contamination. The latest design consists of a Sample Bottle Despatch Facility Autosampler units, Pneumatic Transfer System and Receipt Facility which reduces considerably operator involvement, provides a safe rapid transport system and minimises any possibility of radioactive contamination. The system can be made fully automatic and ease of maintenance has been ensured by the design. (author)

  17. Automatic sampling of radioactive liquors

    International Nuclear Information System (INIS)

    The latest techniques in sampling radioactive liquors in an Irradiated Fuel Reprocessing Plant are described. Previously to obtain a sample from these liquors operators were involved at the point of sampling, the transport of samples in shielded containers to the laboratories and at the offloading of the samples at the laboratory. Penetration of the radioactive containments occurred at the sampling point and again in the laboratory; these operations could lead to possible radioactive contamination. The latest design consists of a Sample Bottle Despatch Facility Autosampler units, Pneumatic Transfer System and Receipt Facility which reduces considerably operator involvement, provides a safe rapid transport system and minimises any possibility of radioactive contamination. The system can be made fully automatic and ease of maintenance has been ensured by the design. (author)

  18. Automatic Sequencing for Experimental Protocols

    Science.gov (United States)

    Hsieh, Paul F.; Stern, Ivan

    We present a paradigm and implementation of a system for the specification of the experimental protocols to be used for the calibration of AXAF mirrors. For the mirror calibration, several thousand individual measurements need to be defined. For each measurement, over one hundred parameters need to be tabulated for the facility test conductor and several hundred instrument parameters need to be set. We provide a high level protocol language which allows for a tractable representation of the measurement protocol. We present a procedure dispatcher which automatically sequences a protocol more accurately and more rapidly than is possible by an unassisted human operator. We also present back-end tools to generate printed procedure manuals and database tables required for review by the AXAF program. This paradigm has been tested and refined in the calibration of detectors to be used in mirror calibration.

  19. Autoclass: An automatic classification system

    Science.gov (United States)

    Stutz, John; Cheeseman, Peter; Hanson, Robin

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.

  20. Automatic force balance calibration system

    Science.gov (United States)

    Ferris, Alice T.

    1995-05-01

    A system for automatically calibrating force balances is provided. The invention uses a reference balance aligned with the balance being calibrated to provide superior accuracy while minimizing the time required to complete the calibration. The reference balance and the test balance are rigidly attached together with closely aligned moment centers. Loads placed on the system equally affect each balance, and the differences in the readings of the two balances can be used to generate the calibration matrix for the test balance. Since the accuracy of the test calibration is determined by the accuracy of the reference balance, and current technology allows for reference balances to be calibrated to within +/-0.05%, the entire system has an accuracy of +/-0.2%. The entire apparatus is relatively small and can be mounted on a movable base for easy transport between test locations. The system can also accept a wide variety of reference balances, thus allowing calibration under diverse load and size requirements.
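The idea of deriving a calibration matrix from paired readings can be illustrated with an ordinary least-squares fit: apply many load cases, record both balances, and solve for the matrix that maps test-balance readings to reference readings. The 2-channel matrix and noise level below are made up for illustration; the actual procedure is not specified in this abstract:

```python
import numpy as np

# Simulated paired readings: rows are load cases, columns are channels.
rng = np.random.default_rng(0)
C_true = np.array([[1.02, 0.01],
                   [-0.02, 0.98]])                 # hypothetical "true" matrix
T = rng.uniform(-1.0, 1.0, size=(20, 2))           # test-balance readings
R = T @ C_true.T + rng.normal(0, 1e-4, (20, 2))    # reference readings + noise

# Least-squares estimate: find X minimizing ||T @ X - R||, so C = X.T
X, *_ = np.linalg.lstsq(T, R, rcond=None)
print(np.round(X.T, 2))
```

With clean reference data the recovered matrix matches the true one to the noise floor, which is why the overall accuracy is bounded by the reference balance's accuracy.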

  1. Methods of automatic scanning of SSNTDs

    International Nuclear Information System (INIS)

    The methods of automatic scanning of solid state nuclear track detectors are reviewed. The paper deals with transmission of light, charged particles, chemicals and electrical current through conventionally etched detectors. Special attention is given to the jumping spark technique and breakdown counters. Eventually optical automatic devices are examined. (orig.)

  2. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors)

  3. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    inheritance with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be e...

  4. ANNUAL REPORT-AUTOMATIC INDEXING AND ABSTRACTING.

    Science.gov (United States)

    Lockheed Missiles and Space Co., Palo Alto, CA. Electronic Sciences Lab.

    The investigation is concerned with the development of automatic indexing, abstracting, and extracting systems. Basic investigations in English morphology, phonetics, and syntax are pursued as necessary means to this end. In the first section the theory and design of the "Sentence Dictionary" experiment in automatic extraction is outlined. Some of…

  5. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automatic systems have brought many revolutions to existing technologies. One technology which has seen great development is the solar powered automatic shrimp feeding system. For instance, solar power, which is a renewable energy, can be an alternative solution to the energy crisis, and it basically reduces man power when used in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10 hour timer to be set at intervals preferred by the user and which undergoes a continuous process. The magnetic contactor acts as a switch connected to the 10 hour timer, which controls the activation or termination of the electrical loads and is powered by a solar panel outputting electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and were operating within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system may be improved to handle changes in the scope of the project.

  6. Data mining of geospatial data: combining visual and automatic methods

    OpenAIRE

    Demšar, Urška

    2006-01-01

    Most of the largest databases currently available have a strong geospatial component and contain potentially useful information which might be of value. The discipline concerned with extracting this information and knowledge is data mining. Knowledge discovery is performed by applying automatic algorithms which recognise patterns in the data. Classical data mining algorithms assume that data are independently generated and identically distributed. Geospatial data are multidimensional, spatial...

  7. Automatic classification of eclipsing binaries light curves using neural networks

    CERN Document Server

    Sarro, L M; Giménez, A

    2005-01-01

    In this work we present a system for the automatic classification of the light curves of eclipsing binaries. This system is based on a classification scheme that aims to separate eclipsing binary systems according to their geometrical configuration in a modified version of the traditional classification scheme. The classification is performed by a Bayesian ensemble of neural networks trained with Hipparcos data of seven different categories, including eccentric binary systems and two types of pulsating light curve morphologies.

  8. Fiona: a parallel and automatic strategy for read error correction

    OpenAIRE

    Schulz, Marcel H; Weese, David; Holtgrewe, Manuel; Dimitrova, Viktoria; Niu, Sijia; Reinert, Knut; Richard, Hugues

    2014-01-01

    Motivation: Automatic error correction of high-throughput sequencing data can have a dramatic impact on the amount of usable base pairs and their quality. It has been shown that the performance of tasks such as de novo genome assembly and SNP calling can be dramatically improved after read error correction. While a large number of methods specialized for correcting substitution errors as found in Illumina data exist, few methods for the correction of indel errors, common to technologies like ...

  9. Automatic Camera Viewfinder Based on TI DaVinci

    Institute of Scientific and Technical Information of China (English)

    WANG Hai-gang; XIAO Zhi-tao; GENG Lei

    2009-01-01

    Presented is an automatic camera viewfinder based on the TI DaVinci digital platform; mainly discussed is the design of the Linux-based software system. The system can raise an alarm and save the picture when the specified features appear in the view, and the saved pictures can be downloaded and zoomed. All functions are operated through an OSD menu. The system is notable for its flexible operation, powerful functions, multitasking and stable performance.

  10. The TS 600: automatic control system for eddy currents

    International Nuclear Information System (INIS)

    In the scope of fabrication and in-service inspection of the PWR steam generator tube bundle, FRAMATOME developed an automatic Eddy Current testing system: TS600. Based on a mini-computer, TS600 allows data to be digitized, stored and processed in various ways, so it is possible to perform several kinds of inspection: conventional in-service inspection, roll area profilometry, etc. TS600 can also be used to develop new methods of examination.

  11. Automatic Image Registration Technique of Remote Sensing Images

    OpenAIRE

    M. Wahed; Gh.S.El-tawel; A.Gad El-karim

    2013-01-01

    Image registration is a crucial step in most image processing tasks for which the final result is achieved from a combination of various resources. Automatic registration of remote-sensing images is a difficult task as it must deal with the intensity changes and variation of scale, rotation and illumination of the images. This paper proposes image registration technique of multi-view, multi- temporal and multi-spectral remote sensing images. Firstly, a preprocessing step is performed by apply...

  12. AUTOMATIC DESIGNING OF POWER SUPPLY SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. I. Kirspou

    2016-01-01

    Full Text Available The development of an automatic designing system for the power supply of industrial enterprises is considered in the paper. Its complete structure and principle of operation are determined and established. A modern graphical interface and data scheme are developed, and the software is completely realized. The methodology and software correspond to the requirements of up-to-date designing, describe a general algorithm of the program process and also reveal the properties of the automatic designing system's objects. The automatic designing system is based on a module principle and uses object-oriented programming. It makes it possible to carry out consistent designing calculations of a power supply system and to select the required equipment, with subsequent output of all calculations in the form of an explanatory note. The automatic designing system can be applied by designing organizations under conditions of actual designing.

  13. Automatic image segmentation by dynamic region merging.

    Science.gov (United States)

    Peng, Bo; Zhang, Lei; Zhang, David

    2011-12-01

    This paper addresses the automatic image segmentation problem in a region merging style. With an initially oversegmented image, in which many regions (or superpixels) with homogeneous color are detected, an image segmentation is performed by iteratively merging the regions according to a statistical test. There are two essential issues in a region-merging algorithm: order of merging and the stopping criterion. In the proposed algorithm, these two issues are solved by a novel predicate, which is defined by the sequential probability ratio test and the minimal cost criterion. Starting from an oversegmented image, neighboring regions are progressively merged if there is an evidence for merging according to this predicate. We show that the merging order follows the principle of dynamic programming. This formulates the image segmentation as an inference problem, where the final segmentation is established based on the observed image. We also prove that the produced segmentation satisfies certain global properties. In addition, a faster algorithm is developed to accelerate the region-merging process, which maintains a nearest neighbor graph in each iteration. Experiments on real natural images are conducted to demonstrate the performance of the proposed dynamic region-merging algorithm. PMID:21609885
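A much-simplified sketch of merge order driven by a similarity predicate: repeatedly merge the most similar pair of adjacent regions and stop when no pair passes the test. A plain mean-difference threshold on 1-D "regions" stands in for the paper's SPRT-based predicate and minimal-cost order:

```python
def merge_regions(values, threshold=10.0):
    """Greedy 1-D region merging: always merge the adjacent pair with the
    smallest mean difference; stop when that difference exceeds threshold
    (a toy stand-in for the SPRT/minimal-cost predicate)."""
    regions = [[v] for v in values]
    while len(regions) > 1:
        diffs = [abs(sum(a) / len(a) - sum(b) / len(b))
                 for a, b in zip(regions, regions[1:])]
        i = min(range(len(diffs)), key=diffs.__getitem__)
        if diffs[i] > threshold:
            break                              # stopping criterion
        regions[i] = regions[i] + regions.pop(i + 1)
    return regions

# Oversegmented "pixel" intensities: two homogeneous areas, ~100 and ~200.
print(merge_regions([100, 102, 98, 201, 199, 202]))
# → [[100, 102, 98], [201, 199, 202]]
```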

  14. Fuzzy logic in automatic control devices

    International Nuclear Information System (INIS)

    Fuzzy logic is a theory that, applied to an automatic control device, allows a regulation to be performed as efficiently as an expert operator could have done manually. Describing the behaviour of a regulation system implies the use of laws such as 'if...then'; these laws link input variables that are 'conditions' to output variables that are 'conclusions'. In DAPNIA facilities fuzzy logic has been used to improve the performance of 3 control systems: -the regulation of the helium cycle compressor of a condenser, which required 21 laws, 4 conditions and 3 conclusions, -the regulation of the temperature of the LHC testing station at STCM, and -the regulation of the temperature of the hydrogen target for the CLAS experiment, where fuzzy logic improved temperature stability from ±150 mK to ±20 mK using 9 laws, 2 conditions and 2 conclusions. The application of fuzzy logic to regulation is presented through a simple example. (A.C.)
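The 'if...then' law structure can be sketched with a minimal Mamdani-style controller over two conditions (an error and its derivative) and one conclusion. The membership shapes, rule table and scaling below are illustrative, not the DAPNIA settings.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_output(error, derror):
    """Evaluate three 'if...then' laws and defuzzify by weighted average."""
    neg = lambda x: tri(x, -2.0, -1.0, 0.0)
    zero = lambda x: tri(x, -1.0, 0.0, 1.0)
    pos = lambda x: tri(x, 0.0, 1.0, 2.0)
    # Each law: (firing strength of its conditions, conclusion singleton).
    laws = [
        (min(neg(error), zero(derror)), -1.0),  # if error neg and derror zero then lower
        (min(zero(error), zero(derror)), 0.0),  # if error zero and derror zero then hold
        (min(pos(error), zero(derror)), +1.0),  # if error pos and derror zero then raise
    ]
    num = sum(w * out for w, out in laws)
    den = sum(w for w, _ in laws)
    return num / den if den else 0.0
```

A zero error yields a zero correction, a full positive error a full positive correction, and intermediate errors interpolate smoothly between the laws.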

  15. Automatic detection of apnoea of prematurity

    International Nuclear Information System (INIS)

    The detection of the incidents of apnoea of prematurity (AP) in preterm infants is important in the intensive care unit, but this detection is often based on simple threshold techniques, which suffer from poor specificity. Three methods for the automatic detection of AP were designed, tested and evaluated using approximately 2426 h of continuous recording from 54 neonates (μ = 44 h and σ = 7 h). The first method was based on the cumulative sum of the time series of heart rate (HR), respiratory rate (RR) and oxygen saturation (SpO2) along with the sum of their Shannon entropy. The performance of this method gave 94.53% sensitivity, 74.72% specificity and 77.84% accuracy. The second method was based on the correlation between the time series of HR, RR and SpO2, which were used as inputs to an artificial neural network. This gave 81.85% sensitivity, 75.83% specificity and 76.78% accuracy. The third method utilized the derivative of the three time series and yielded a performance of 100% sensitivity, 96.19% specificity and 96.79% accuracy. Although not optimized to work in real time, the latter method has the potential for forming the basis of a real time system for the detection of incidents of AP
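The building blocks of the first method, a cumulative sum over each vital-sign time series plus its Shannon entropy, can be sketched as below. The baseline target, bin count and value range are illustrative, and the paper's thresholding and fusion of the HR, RR and SpO2 channels are not reproduced.

```python
import math
from collections import Counter

def cusum(series, target):
    """Cumulative sum of deviations from a target (baseline) value."""
    total, out = 0.0, []
    for x in series:
        total += x - target
        out.append(total)
    return out

def shannon_entropy(series, bins=8, lo=0.0, hi=1.0):
    """Shannon entropy (bits) of the series' histogram over [lo, hi)."""
    width = (hi - lo) / bins
    counts = Counter(min(int((x - lo) / width), bins - 1) for x in series)
    n = len(series)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A constant signal has zero entropy, while a signal spread evenly over all bins reaches the maximum of log2(bins); sudden sustained deviations show up as drift in the cumulative sum.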

  16. Control of automatic processes: A parallel distributed-processing account of the Stroop effect. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.D.; Dunbar, K.; McClelland, J.L.

    1989-11-22

    A growing body of evidence suggests that traditional views of automaticity are in need of revision. For example, automaticity has often been treated as an all-or-none phenomenon, and traditional theories have held that automatic processes are independent of attention. Yet recent empirical data suggest that automatic processes are continuous, and furthermore are subject to attentional control. In this paper we present a model of attention which addresses these issues. Using a parallel distributed processing framework we propose that the attributes of automaticity depend upon the strength of a processing pathway and that strength increases with training. Using the Stroop effect as an example, we show how automatic processes are continuous and emerge gradually with practice. Specifically, we present a computational model of the Stroop task which simulates the time course of processing as well as the effects of learning. This was accomplished by combining the cascade mechanism described by McClelland (1979) with the back propagation learning algorithm (Rumelhart, Hinton, Williams, 1986). The model is able to simulate performance in the standard Stroop task, as well as aspects of performance in variants of this task which manipulate SOA, response set, and degree of practice. In the discussion we contrast our model with other models, and indicate how it relates to many of the central issues in the literature on attention, automaticity, and interference.

  17. Automatic analysis of ventilation and perfusion pulmonary scintigrams

    International Nuclear Information System (INIS)

    A fully automatic program is used to analyse Pulmonary Ventilation and Perfusion Scintigrams. Ventilation study is performed using a standard washin-washout 133Xe method. Multiple View Late Xenon Washout Images are also recorded. Perfusion study is performed using sup(99m)Tc serum albumin. The FORTRAN program recognizes the different steps of the test, whatever their durations are. It performs background subtraction, drows pulmonary Regions of Interest and calculate Ventilation and Perfusion parameters for each ROI and each lung. It also processes Multiple View Late Xenon Washout Images in such a way that they give not only a topographic information about hypoventilated regions, but also a semi-quantitative information about the strongness of xenon retention. During the processing, the operator has only to control two intermediate results (e.g. automatically determained pulmonary ROI). All the numerical and processed iconographic results are obtained within 10 minutes after the end of the test. This program has already been used to analyse 1,000 pulmonary studies. During those studies, correction of intermediate results has been very scarcely necessary. This efficient and reliable automatic program is very useful for the daily practice of a Nuclear Medecin Department

  18. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software developed for automatic tracking of underwater movements (DVP, based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers’ positions were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM. Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker’s coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4% than for COM (17.8%. Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% less manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis.

  19. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input, and performs a registration over scale and inplane rotation fully automatically

  20. A Log Anomaly Detection Algorithm for Debugging Based on Grammar-Based Codes

    Institute of Scientific and Technical Information of China (English)

    王楠; 韩冀中; 方金云

    2013-01-01

    Debugging non-deterministic bugs has long been an important research area in software development. In recent years, with the rapid emergence of large cloud computing systems and the development of record-replay debugging, the key to such debugging problems has become mining anomaly information from text console logs and/or execution flow logs. Anomaly detection algorithms can therefore be used in this area. However, although many approaches have been proposed, traditional anomaly detection algorithms were designed for detecting network attacks and are not suitable for the new problems. One important reason is the Markov assumption on which many traditional anomaly detection methods are based: Markov-based methods are sensitive to harsh thrashing in event transitions, whereas the new problems in system diagnosis require the ability to detect semantic misbehaviors. Experiments show that Markov-based methods perform poorly on those problems. This paper presents a novel anomaly detection algorithm based on grammar-based codes. Different from previous approaches, our algorithm is a non-Markov approach; it does not rely on statistical models, probabilistic models, machine learning, or the Markov assumption, and its design and implementation are very simple. Experiments show that, in detecting high-level semantic anomalies, the algorithm outperforms traditional methods.
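A grammar-based code can be sketched in the Re-Pair style: repeatedly replace the most frequent adjacent symbol pair with a fresh nonterminal and use the resulting grammar size as a complexity score, so that logs with regular event patterns compress well while irregular ones score higher. This is an illustrative stand-in under that assumption; the paper's actual grammar construction and scoring may differ.

```python
from collections import Counter

def grammar_size(seq):
    """Approximate grammar size of a symbol sequence via iterated
    digram (adjacent-pair) substitution, Re-Pair style."""
    seq = list(seq)
    rules = 0
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break  # no pair repeats: nothing left to factor out
        nt = ('R', rules)  # fresh nonterminal symbol
        rules += 1
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(nt)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    # size = remaining start sequence + two symbols per rule body
    return len(seq) + 2 * rules

def anomaly_score(log, baseline_size):
    """Score a log window against a baseline grammar size; > 1 means
    the window is more complex (less regular) than the baseline."""
    return grammar_size(log) / baseline_size
```

For instance, the regular event stream "abababab" yields a smaller grammar than the patternless "abcdefgh" of the same length, so the latter scores as more anomalous.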

  1. Automatic Meal Inspection System Using LBP-HF Feature for Central Kitchen

    OpenAIRE

    Yue-Min Jiang; Ho-Hsin Lee; Cheng-Chang Lien; Chun-Feng Tai; Pi-Chun Chu; Ting-Wei Yang

    2015-01-01

    This paper proposes an intelligent and automatic meal inspection system that can be applied to meal inspection for central kitchen automation. Diets specifically designed for patients must be personalized, for instance with low sodium intake or certain required foods. Hence, the proposed system can benefit the inspection process, which is often performed manually. In the proposed system, firstly, the meal box can be detected and located automatically wi...

  2. Automatically activated stereotypes and differential treatment against the obese in hiring

    OpenAIRE

    Rooth, Dan-Olof

    2008-01-01

    This study provides empirical support for automatically activated associations inducing unequal treatment against the obese among recruiters in a real-life hiring situation. A field experiment on differential treatment against obese job applicants in hiring is combined with a measure of employers' automatic/implicit performance stereotype toward obese relative to normal weight using the implicit association test. We find a strong and statistically significant obesity difference in the correla...

  3. NASA MSFC hardware in the loop simulations of automatic rendezvous and capture systems

    Science.gov (United States)

    Tobbe, Patrick A.; Naumann, Charles B.; Sutton, William; Bryan, Thomas C.

    1991-01-01

    Two complementary hardware-in-the-loop simulation facilities for automatic rendezvous and capture systems at MSFC are described. One, the Flight Robotics Laboratory, uses an 8 DOF overhead manipulator with a work volume of 160 by 40 by 23 feet to evaluate automatic rendezvous algorithms and range/rate sensing systems. The other, the Space Station/Station Operations Mechanism Test Bed, uses a 6 DOF hydraulic table to perform docking and berthing dynamics simulations.

  4. Automatic subject classification of textual documents using limited or no training data

    OpenAIRE

    Joorabchi, Arash

    2010-01-01

    With the explosive growth in the number of electronic documents available on the internet, intranets, and digital libraries, there is a growing need, more than ever, for automatic systems capable of indexing and organising such large volumes of data. Automatic Text Classification (ATC) has become one of the principal means for enhancing the performance of information retrieval systems and organising digital libraries and other textual collections. Within this context, the use of ...

  5. Neuronal Spectral Analysis of EEG and Expert Knowledge Integration for Automatic Classification of Sleep Stages

    OpenAIRE

    Kerkeni, Nizar; Alexandre, Frédéric; Bedoui, Mohamed Hédi; Bougrain, Laurent; Dogui, Mohamed

    2005-01-01

    http://www.wseas.org Being able to analyze and interpret the signal coming from electroencephalogram (EEG) recordings can be of high interest for many applications, including medical diagnosis and Brain-Computer Interfaces. Indeed, human experts are today able to extract from this signal many hints related to physiological as well as cognitive states of the recorded subject, and it would be very interesting to perform such a task automatically, but today no completely automatic system exists. In pre...

  6. Evaluating PcGets and RETINA as Automatic Model Selection Algorithms.

    OpenAIRE

    Jennifer L. Castle

    2005-01-01

    The paper describes two automatic model selection algorithms, RETINA and PcGets, briefly discussing how the algorithms work and what their performance claims are. RETINA's Matlab implementation of the code is explained, then the program is compared with PcGets on the data in Perez-Amaral, Gallo and White (2005, Econometric Theory, Vol. 21, pp. 262-277), "A Comparison of Complementary Automatic Modelling Methods: RETINA and PcGets", and Hoover and Perez (1999, Econometrics Journal, Vol. 2, pp....

  7. Design and microfabrication of new automatic human blood sample collection and preparation devices

    OpenAIRE

    Tran, Minh Nhut

    2015-01-01

    For self-sampling or the collection of blood by health personnel for point-of-care diagnostics in health rooms, it may often be necessary to perform automatic collection of blood samples. The most important operation when handling whole blood is to combine automatic sample collection with optimal mixing of anticoagulation liquid and weak fixatives. In particular before doing any transport of a sample or point-of-care nucleic acid diagnostics (PO...

  8. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision making about the presence of hidden explosive. Innovative algorithms for estimating the likely background, a system based on neural networks for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data elaboration.

  9. Development of an automatic identification algorithm for antibiogram analysis.

    Science.gov (United States)

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present some difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) has been proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested using susceptibility tests, which were performed for 12 different antibiotics for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using the AIA and the results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and the correlation between AIA-based and human-based readings. Agreement was observed in 88% of cases, and 89% of the tests showed no difference or only reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second inhibition halo. Furthermore, AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, AIA may be a practical tool for automated reading of antibiograms in diagnostic and microbiology laboratories. PMID:26513468
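The weighted kappa agreement index used to compare AIA-based and human readings can be sketched as follows, assuming linear disagreement weights; the category labels in the example are hypothetical.

```python
def weighted_kappa(a, b, categories):
    """Linearly weighted kappa between two raters' label sequences.
    1.0 = perfect agreement, 0.0 = chance-level agreement."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(a)
    # Observed confusion matrix between the two raters.
    obs = [[0] * k for _ in range(k)]
    for x, y in zip(a, b):
        obs[idx[x]][idx[y]] += 1
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Linear disagreement weights: 0 on the diagonal, up to 1 at the corners.
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    num = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    den = sum(w[i][j] * row[i] * col[j] / n for i in range(k) for j in range(k))
    return 1.0 - num / den if den else 1.0
```

Identical readings give kappa 1.0, while systematic disagreement at chance level drives it toward 0.0.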

  10. Fuzzy logic based automatic voltage regulator for damping power oscillations

    Energy Technology Data Exchange (ETDEWEB)

    Prasertwong, K. [Srinakharinwirot Univ., Ongkharak, Nahhonnayok (Thailand). Dept. of Electrical Engineering; Mithulananthan, N. [Asian Inst. of Technology, Klong Luang, Pathumthani (Thailand). Energy Field of Study

    2008-07-01

    Low frequency oscillations in a power system can result in instability and widespread blackouts. A new fuzzy logic based automatic voltage regulator for damping power system oscillations is presented. The proposed controller has one voltage control loop, which functions as the automatic voltage regulating unit of a synchronous machine. The input signals for voltage control are the terminal voltage error and its derivative. Comparison studies were also conducted to evaluate the performance of the proposed controller against the conventional automatic voltage regulator (AVR) and against the conventional AVR combined with a power system stabilizer (PSS). This paper systematically explains the steps involved in fuzzy logic control design for oscillation damping in power systems. A comparison between the fuzzy logic AVR and the conventional AVR revealed that the fuzzy logic AVR performed better, providing good damping and improved dynamics. Although fuzzy-based controllers have a number of advantages, different operating points need to be considered in order to ensure their robustness. Fuzzy logic controllers are suitable for nonlinear, dynamic processes for which an exact mathematical model may not be available. 9 refs, 5 tabs., 14 figs.

  11. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    Full Text Available There are increasing applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chessboard, which is found automatically in each image, even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the algorithm proposed as a valuable tool for camera calibration.

  12. Automatic relational database compression scheme design based on swarm evolution

    Institute of Scientific and Technical Information of China (English)

    HU Tian-lei; CHEN Gang; LI Xiao-yan; DONG Jin-xiang

    2006-01-01

    Compression is an intuitive way to boost the performance of a database system. However, compared with other physical database design techniques, compression consumes a large amount of CPU power. There is a trade-off between the reduction of disk access and the overhead of CPU processing. Automatic design and adaptive administration of database systems are widely demanded, and the automatic selection of a compression schema that balances this trade-off is very important. In this paper, we present a model with novel techniques to integrate a rapidly convergent agent-based evolution framework, the SWAF (SWarm Algorithm Framework), into adaptive attribute compression for relational databases. The model evolutionarily consults statistics of CPU load and I/O bandwidth to select compression schemas considering both aspects of the trade-off. We have implemented a prototype of the model on the Oscar RDBMS, with experiments highlighting the correctness and efficiency of our techniques.

  13. Automatic Identification of Human Erythrocytes in Microscopic Fecal Specimens.

    Science.gov (United States)

    Liu, Lin; Lei, Haoting; Zhang, Jing; Yuan, Yang; Zhang, Zhenglong; Liu, Juanxiu; Xie, Yu; Ni, Guangming; Liu, Yong

    2015-11-01

    Traditional fecal erythrocyte detection is performed via a manual operation that is unsuitable because it depends significantly on the expertise of individual inspectors. To recognize human erythrocytes automatically and precisely, automatic segmentation is very important for extraction of characteristics. In addition, multiple recognition algorithms are also essential. This paper proposes an algorithm based on morphological segmentation and a fuzzy neural network. The morphological segmentation process comprises three operational steps: top-hat transformation, Otsu's method, and image binarization. Following initial screening by area and circularity, fuzzy c-means clustering and the neural network algorithms are used for secondary screening. Subsequently, the erythrocytes are screened by combining the results of five images obtained at different focal lengths. Experimental results show that even when the illumination, noise pollution, and position of the erythrocytes are different, they are all segmented and labeled accurately by the proposed method. Thus, the proposed method is robust even in images with significant amounts of noise. PMID:26349804
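The Otsu step of the morphological segmentation pipeline can be sketched as an exhaustive search for the grey-level threshold that maximises between-class variance; the pixel data in the example are illustrative.

```python
def otsu_threshold(pixels, levels=256):
    """Return the Otsu threshold for a list of integer grey values in
    [0, levels): the cut maximising between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * hist[i] for i in range(levels))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]          # pixels at or below t (class 0)
        if w0 == 0:
            continue
        w1 = total - w0        # pixels above t (class 1)
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t  # pixels <= best_t fall in the background class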

  14. Automatic Modulation Recognition by Support Vector Machines Using Wavelet Kernel

    International Nuclear Information System (INIS)

    Automatic modulation identification plays a significant role in electronic warfare, electronic surveillance systems and electronic counter measure. The task of modulation recognition of communication signals is to determine the modulation type and signal parameters. In fact, automatic modulation identification can be range to an application of pattern recognition in communication field. The support vector machines (SVM) is a new universal learning machine which is widely used in the fields of pattern recognition, regression estimation and probability density. In this paper, a new method using wavelet kernel function was proposed, which maps the input vector xi into a high dimensional feature space F. In this feature space F, we can construct the optimal hyperplane that realizes the maximal margin in this space. That is to say, we can use SVM to classify the communication signals into two groups, namely analogue modulated signals and digitally modulated signals. In addition, computer simulation results are given at last, which show good performance of the method
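A wavelet kernel of the kind described (a product of mother-wavelet evaluations over the component-wise differences) can be sketched as below. The Morlet-style mother wavelet h(u) = cos(1.75u)·exp(−u²/2) and the dilation parameter a are common choices in the wavelet-kernel literature, assumed here rather than taken from the paper.

```python
import math

def wavelet_kernel(x, y, a=1.0):
    """K(x, y) = prod_i cos(1.75*u_i) * exp(-u_i**2 / 2), u_i = (x_i - y_i)/a."""
    k = 1.0
    for xi, yi in zip(x, y):
        u = (xi - yi) / a
        k *= math.cos(1.75 * u) * math.exp(-u * u / 2.0)
    return k
```

Such a kernel can then be handed to an SVM as a precomputed Gram matrix over the training signals; note that K(x, x) = 1 and the kernel is symmetric, as the maximal-margin formulation requires.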

  15. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

    Full Text Available In this paper, an automatic emotion detection system is built for a computer or machine to detect the emotional state from facial expressions in human-computer communication. Firstly, dynamic motion features are extracted from facial expression videos, and then advanced machine learning methods for classification and regression are used to predict the emotional states. The system is evaluated on two publicly available datasets, i.e. GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can read the facial expression of its user automatically. This technique can be integrated into applications such as smart robots, interactive games and smart surveillance systems.

  16. Detection of Off-normal Images for NIF Automatic Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Awwal, A S; McClay, W A; Ferguson, S W; Burkhart, S C

    2005-07-11

    One of the major purposes of the National Ignition Facility at Lawrence Livermore National Laboratory is to accurately focus 192 high-energy laser beams on a millimeter-scale fusion target at the precise location and time. The automatic alignment system developed for NIF is used to align the beams in order to achieve the required focusing effect. However, if a distorted image is inadvertently created by a faulty camera shutter or some other opto-mechanical malfunction, the resulting image, termed "off-normal", must be detected and rejected before further alignment processing occurs. Thus the off-normal processor acts as a preprocessor to automatic alignment image processing. In this work, we discuss the development of an off-normal preprocessor capable of rapidly detecting and rejecting off-normal images. A wide variety of off-normal images from each loop is used to develop the rejection criterion accurately.

  17. Automatic Medical Image Classification and Abnormality Detection Using KNearest Neighbour

    Directory of Open Access Journals (Sweden)

    Dr. R. J. Ramteke , Khachane Monali Y.

    2012-12-01

    Full Text Available This research work presents a method for the automatic classification of medical images into two classes, Normal and Abnormal, based on image features, together with automatic abnormality detection. Our proposed system consists of four phases: preprocessing, feature extraction, classification, and post-processing. A statistical texture feature set is derived from normal and abnormal images. We used the KNN classifier for classifying images. The KNN classifier's performance was compared with a kernel-based SVM classifier (linear and RBF). The confusion matrix was computed, and the results show that KNN obtains an 80% classification rate, which is higher than the SVM classification rate, so we chose the KNN algorithm for the classification of images. If an image is classified as abnormal, a post-processing step is applied and the abnormal region is highlighted on the image. The system has been tested on a number of real brain CT scan images.
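The KNN classification step over statistical texture feature vectors can be sketched as follows; the feature vectors, labels and choice of k are illustrative, not the paper's.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k nearest neighbours of `query`
    under Euclidean distance."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(train, key=lambda fv_lab: dist(fv_lab[0], query))[:k]
    return Counter(lab for _, lab in nearest).most_common(1)[0][0]
```

A query near the cluster of normal-image feature vectors is labelled Normal, one near the abnormal cluster Abnormal.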

  18. Automatic Facial Expression Recognition Based on Hybrid Approach

    Directory of Open Access Journals (Sweden)

    Ali K. K. Bermani

    2012-12-01

    Full Text Available The topic of automatic recognition of facial expressions attracted many researchers late in the last century and has drawn great interest in the past few years. Several techniques have emerged to improve the efficiency of recognition by addressing problems in face detection and in extracting features for recognizing expressions. This paper proposes an automatic system for facial expression recognition that uses a hybrid approach in the feature extraction phase, combining holistic and analytic approaches by extracting 307 facial expression features (19 geometric features, 288 appearance features). Expression recognition is performed using a radial basis function (RBF) artificial neural network to recognize the six basic emotions (anger, fear, disgust, happiness, surprise, sadness) in addition to the neutral state. The system achieved a recognition rate of 97.08% on a person-dependent database and 93.98% on a person-independent one.

  19. Automatic local Gabor Features extraction for face recognition

    CERN Document Server

    Jemaa, Yousra Ben

    2009-01-01

    We present in this paper a biometric system for face detection and recognition in color images. The face detection technique is based on skin color information and fuzzy classification. A new algorithm is proposed to detect face features (eyes, mouth and nose) automatically and extract their corresponding geometrical points. These fiducial points are described by sets of wavelet components which are used for recognition. To achieve face recognition, we use neural networks and study their performance for different inputs. We compare the two types of features used for recognition, geometric distances and Gabor coefficients, which can be used either independently or jointly. This comparison shows that Gabor coefficients are more powerful than geometric distances. We show with experimental results how the high recognition ratio makes our system an effective tool for automatic face detection and recognition.

  20. Automatic parameterization for magnetometer zero offset determination

    Directory of Open Access Journals (Sweden)

    M. A. Pudney

    2012-06-01

    Full Text Available In-situ magnetic field measurements are of critical importance in understanding how the Sun creates and controls the heliosphere. To ensure the measurements are accurate, it is necessary to track the combined slowly-varying spacecraft magnetic field and magnetometer zero offset – the systematic error in the sensor measurements. For a 3-axis stabilised spacecraft, in-flight correction of zero offsets primarily relies on the use of Alfvénic rotations in the magnetic field. We present a method to automatically determine a key parameter related to the ambient compressional variance of the signal (which determines the selection criteria for identifying clear Alfvénic rotations. We apply our method to different solar wind conditions, performing a statistical analysis of the data periods required to achieve a 70% chance of calculating an offset using Helios datasets. We find that 70% of 40 min data periods in regions of fast solar wind possess sufficient rotational content to calculate an offset. To achieve the same 70% calculation probability in regions of slow solar wind requires data periods of 2 h duration. We also find that 40 min data periods at perihelion compared to 1 h and 40 min data periods at aphelion are required to achieve the same 70% calculation probability. We compare our method with previous work that uses a fixed parameter approach and demonstrate an improvement in the calculation probability of up to 10% at aphelion and 5% at perihelion.
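The constant-|B| idea behind offset correction during Alfvénic rotations can be sketched as a linear least-squares problem: if the true field magnitude stays constant while the direction rotates, measured vectors b_i satisfy |b_i|² − 2 b_i·o = const, which is linear in the offset o. The stdlib-only sketch below solves the normal equations by Gaussian elimination; it illustrates only this classic constant-magnitude idea, not the paper's selection criteria or parameterization, and the sample data are synthetic.

```python
def estimate_offset(samples):
    """Estimate a 3-axis zero offset from field vectors measured during
    magnitude-preserving rotations. Unknowns u = (o_x, o_y, o_z, c);
    each sample gives the row [2bx, 2by, 2bz, 1] . u = |b|^2."""
    A, y = [], []
    for bx, by, bz in samples:
        A.append([2 * bx, 2 * by, 2 * bz, 1.0])
        y.append(bx * bx + by * by + bz * bz)
    n = 4
    # Normal equations M u = v with M = A^T A, v = A^T y.
    M = [[sum(A[r][i] * A[r][j] for r in range(len(A))) for j in range(n)]
         for i in range(n)]
    v = [sum(A[r][i] * y[r] for r in range(len(A))) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c2 in range(col, n):
                M[r][c2] -= f * M[col][c2]
            v[r] -= f * v[col]
    # Back substitution.
    u = [0.0] * n
    for r in range(n - 1, -1, -1):
        u[r] = (v[r] - sum(M[r][c2] * u[c2] for c2 in range(r + 1, n))) / M[r][r]
    return u[:3]  # estimated offset vector
```

Given rotations that span all three axes, the offset added to a constant-magnitude field is recovered exactly; with too little rotational content the system becomes ill-conditioned, which is precisely why the paper's selection criteria for clear Alfvénic rotations matter.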


  2. Automatic Specification Evaluator for Effective Migration

    Directory of Open Access Journals (Sweden)

    P. Sakthivel

    2012-01-01

    Full Text Available Problem statement: Software reengineering is an effective technique for reusing an older application in a new environment. Reengineering efforts are increasing, yet many difficulties and issues arise when an older application is converted to a new one, so the new system must be enhanced to satisfy user requirements and quality aspects. Approach: For this enhancement, we propose a method, the Automatic Specification Evaluator (ASE), in which interferences and their effects on the new system are identified through their attributes and modified where necessary. The accuracy of the migration is further increased by reapplying the same method. Results: With the proposed ASE method, system interference was reduced and the efficiency of the new system improved; in many migration situations, ASE produces the target system with zero interference. Conclusion: The proposed method performs well, allowing the new system to adopt the properties of the legacy system while satisfying user requirements.

  3. Automatic target tracking in FLIR image sequences

    Science.gov (United States)

    Bal, Abdullah; Alam, Mohammad S.

    2004-09-01

    Moving target tracking is a challenging task and is increasingly important for various applications. In this paper, we present a target detection and tracking algorithm based on the target's intensity relative to the surrounding background and on the target's shape information. The proposed automatic target tracking (ATT) algorithm comprises two techniques: an intensity variation function (IVF) and template modeling (TM). The IVF is formulated from the target's intensity feature, while TM is based on the target's shape information. The IVF produces a maximum peak when the reference target's intensity variation is similar to that of the candidate target. When the IVF fails, due to background clutter, non-target objects, or other artifacts, the TM technique is triggered by a control module. By evaluating the outputs of the IVF and TM techniques, the tracker determines the true coordinates of the target. Performance of the proposed ATT algorithm is tested using real-life forward-looking infrared (FLIR) image sequences taken from an airborne, moving platform.
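
    The template-modeling half of such a tracker can be illustrated with a minimal sum-of-squared-differences search. This is a generic sketch, not the paper's exact TM formulation, and the tiny synthetic frame below stands in for a FLIR image:

```python
def match_template(frame, tmpl):
    """Locate `tmpl` in `frame` by minimising the sum of squared
    differences, a simple stand-in for the paper's template-modeling
    (TM) step; the real system pairs TM with an intensity variation
    function (IVF) under a control module."""
    th, tw = len(tmpl), len(tmpl[0])
    best, best_pos = float("inf"), None
    for y in range(len(frame) - th + 1):
        for x in range(len(frame[0]) - tw + 1):
            ssd = sum((frame[y + dy][x + dx] - tmpl[dy][dx]) ** 2
                      for dy in range(th) for dx in range(tw))
            if ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos, best

frame = [[0] * 8 for _ in range(8)]
for dy in range(2):
    for dx in range(2):
        frame[3 + dy][5 + dx] = 9        # bright 2x2 "target"
tmpl = [[9, 9], [9, 9]]
pos, score = match_template(frame, tmpl)
print(pos, score)
```

A tracker would run this search in a small window around the last known target position on each new frame.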

  4. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on the spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments, and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system composed of scripts written in Perl, C shell, and AWK. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise it converts the commands to binary and sends the resultant information to be radiated to the spacecraft.

  5. Studies on the calibration of mammography automatic exposure mode with computed radiology

    International Nuclear Information System (INIS)

    Objective: To optimize image quality and radiation dose by correcting mammography automatic exposure, according to the automatic exposure control mode of a mammography film-screen system. Methods: The film-screen system (28 kV) was used to perform automatic exposure of 40 mm plexiglass and obtain the standard exposure dose; the CR exposure mode based on LgM = 2.0 was then calibrated in 10 steps. A mammary gland phantom (Fluke NA18-220) was examined with CR (26, 28, and 30 kV) using the corrected automatic exposure mode, and the exposure values (mAs) were recorded. The CR images were diagnosed and evaluated in a double-blind manner by 4 radiologists according to the American College of Radiology (ACR) standard. Results: Based on the standard of CR automatic exposure with a dose higher than the traditional film-screen exposure, the calibration of mammography automatic exposure was accomplished. The results for the calibrated mode exceeded the ACR scoring criteria. Conclusions: The comparative study showed improvement in acquiring high-quality images and reduction of radiation dose. The corrected mammography automatic exposure mode may be a better method for clinical use. (authors)

  6. Towards automatic classification of all WISE sources

    Science.gov (United States)

    Kurcz, A.; Bilicki, M.; Solarz, A.; Krupa, M.; Pollo, A.; Małek, K.

    2016-07-01

    Context. The Wide-field Infrared Survey Explorer (WISE) has detected hundreds of millions of sources over the entire sky. Classifying them reliably is, however, a challenging task owing to degeneracies in WISE multicolour space and low levels of detection in its two longest-wavelength bandpasses. Simple colour cuts are often not sufficient; for satisfactory levels of completeness and purity, more sophisticated classification methods are needed. Aims: Here we aim to obtain comprehensive and reliable star, galaxy, and quasar catalogues based on automatic source classification in full-sky WISE data. This means that the final classification will employ only parameters available from WISE itself, in particular those which are reliably measured for the majority of sources. Methods: For the automatic classification we applied a supervised machine learning algorithm, support vector machines (SVM). It requires a training sample with relevant classes already identified, and we chose to use the SDSS spectroscopic dataset (DR10) for that purpose. We tested the performance of two kernels used by the classifier, and determined the minimum number of sources in the training set required to achieve stable classification, as well as the minimum dimension of the parameter space. We also tested SVM classification accuracy as a function of extinction and apparent magnitude. Thus, the calibrated classifier was finally applied to all-sky WISE data, flux-limited to 16 mag (Vega) in the 3.4 μm channel. Results: By calibrating on the test data drawn from SDSS, we first established that a polynomial kernel is preferred over a radial one for this particular dataset. Next, using three classification parameters (W1 magnitude, W1-W2 colour, and a differential aperture magnitude) we obtained very good classification efficiency in all the tests. At the bright end, the completeness for stars and galaxies reaches ~95%, deteriorating to ~80% at W1 = 16 mag, while for quasars it stays at a level of
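
    The classification setup above, a three-parameter feature vector (W1 magnitude, W1-W2 colour, differential aperture magnitude) with separate train and apply stages, can be sketched as follows. To keep the sketch dependency-free, a deliberately simple nearest-centroid classifier stands in for the paper's SVM, and the feature values are invented:

```python
from statistics import mean

def train_centroids(samples):
    """Fit a nearest-centroid classifier: one centroid per class in
    the three-parameter space. A stand-in for the SVM used in the
    paper, shown only to illustrate the feature layout and the
    train/apply split."""
    centroids = {}
    for label in {lbl for _, lbl in samples}:
        feats = [f for f, lbl in samples if lbl == label]
        centroids[label] = tuple(mean(c) for c in zip(*feats))
    return centroids

def classify(centroids, feat):
    # Assign the class whose centroid is nearest in squared distance.
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(feat, centroids[lbl])))

# Toy training set: (W1 mag, W1-W2 colour, differential aperture mag),
# all values invented for illustration.
train = [((10.0, 0.1, 0.0), "star"), ((11.0, 0.0, 0.1), "star"),
         ((14.0, 0.6, 0.8), "galaxy"), ((15.0, 0.7, 0.9), "galaxy"),
         ((15.5, 1.2, 0.2), "quasar"), ((16.0, 1.1, 0.3), "quasar")]
model = train_centroids(train)
print(classify(model, (10.5, 0.05, 0.05)))
```

In the paper's pipeline the training labels come from SDSS spectroscopy and the fitted classifier is then applied to the full flux-limited WISE catalogue.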

  7. Automatic plant start-up system for nuclear power station

    International Nuclear Information System (INIS)

    An automatic plant start-up system using a process computer has been applied to unit No. 1 of Onagawa Nuclear Power Station, Tohoku Electric Power Co., Inc. This is the world's first commercial-scale system for LWRs. Turbine start-up and power control by reactor recirculation flow are automated to reduce operators' workload and to improve the efficiency and accuracy of plant operation. The test data and the results of practical operation have proved that the performance of the system is satisfactory. Major functions, the configuration, and the results of performance tests at the factory and on site are presented here. (author)

  8. Traceability Through Automatic Program Generation

    Science.gov (United States)

    Richardson, Julian; Green, Jeff

    2003-01-01

    Program synthesis is a technique for automatically deriving programs from specifications of their behavior. One of the arguments made in favour of program synthesis is that it allows one to trace from the specification to the program. One way in which traceability information can be derived is to augment the program synthesis system so that manipulations and calculations it carries out during the synthesis process are annotated with information on what the manipulations and calculations were and why they were made. This information is then accumulated throughout the synthesis process, at the end of which, every artifact produced by the synthesis is annotated with a complete history relating it to every other artifact (including the source specification) which influenced its construction. This approach requires modification of the entire synthesis system - which is labor-intensive and hard to do without influencing its behavior. In this paper, we introduce a novel, lightweight technique for deriving traceability from a program specification to the corresponding synthesized code. Once a program has been successfully synthesized from a specification, small changes are systematically made to the specification and the effects on the synthesized program observed. We have partially automated the technique and applied it in an experiment to one of our program synthesis systems, AUTOFILTER, and to the GNU C compiler, GCC. The results are promising: 1. Manual inspection of the results indicates that most of the connections derived from the source (a specification in the case of AUTOFILTER, C source code in the case of GCC) to its generated target (C source code in the case of AUTOFILTER, assembly language code in the case of GCC) are correct. 2. Around half of the lines in the target can be traced to at least one line of the source. 3. Small changes in the source often induce only small changes in the target.
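
    The lightweight technique, perturb one source line, re-run the generator, and record which target lines change, can be sketched with a toy "synthesizer". Everything here is invented for illustration; it stands in for AUTOFILTER or GCC, which the experiment actually used:

```python
def synthesize(src_lines):
    """Toy 'synthesizer': each source line yields one or two target
    lines (upper-cased, and reversed for longer lines). Purely
    illustrative, standing in for a real generator such as a compiler."""
    out = []
    for line in src_lines:
        out.append(line.upper())
        if len(line) > 4:
            out.append(line[::-1])
    return out

def trace(src_lines):
    """Derive source->target links by perturbing one source line at a
    time and observing which target lines change, the paper's
    lightweight traceability idea."""
    base = synthesize(src_lines)
    links = {}
    for i in range(len(src_lines)):
        mutated = list(src_lines)
        mutated[i] = mutated[i] + "!"       # small, systematic change
        changed = [j for j, (a, b)
                   in enumerate(zip(base, synthesize(mutated))) if a != b]
        links[i] = changed
    return links

src = ["let x", "emit result"]
print(trace(src))
```

Real generators are not line-stable under mutation the way this toy is, which is exactly why the paper reports only partial (though useful) trace coverage.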

  9. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to
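
    The blackbox idea, enumerating every input a grammar can derive up to a size bound, can be sketched without JPF by a bounded recursive expansion. The two-rule grammar below is invented and is not the real SCL grammar:

```python
def expand(symbol, grammar, depth):
    """Enumerate every sentence derivable from `symbol` within `depth`
    expansions, the blackbox idea behind generating all legal scripts
    up to a prespecified size."""
    if symbol not in grammar:                 # terminal symbol
        return [symbol]
    if depth == 0:
        return []
    results = []
    for production in grammar[symbol]:
        # Cartesian product of the expansions of each rule symbol.
        parts = [[]]
        for sym in production:
            parts = [p + [s] for p in parts
                     for s in expand(sym, grammar, depth - 1)]
        results.extend(" ".join(p) for p in parts)
    return results

grammar = {
    "cmd":    [["verb", "target"]],
    "verb":   [["set"], ["get"]],
    "target": [["power"], ["mode"]],
}
print(sorted(expand("cmd", grammar, 3)))
```

Each generated string would then be fed to the interpreter under test, exactly as the generated scripts are fed to the SCL Executive above.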

  10. Extensometer automatically measures elongation in elastomers

    Science.gov (United States)

    Hooper, C. D.

    1966-01-01

    Extensometer, with a calibrated shaft, measures the elongation of elastomers and automatically records this distance on a chart. It is adaptable to almost any tensile testing machine and is fabricated at a relatively low cost.

  11. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  12. Computer systems for automatic earthquake detection

    Science.gov (United States)

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
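
    The record does not name the detection algorithm, but the classic automatic trigger of that era is the short-term-average / long-term-average (STA/LTA) ratio, sketched here as an illustration; the window lengths and threshold are invented, not the USGS system's actual values:

```python
def sta_lta_trigger(signal, sta=3, lta=10, threshold=3.0):
    """Flag sample indices where the short-term average amplitude
    exceeds `threshold` times the long-term average, the classic
    STA/LTA event detector for seismic traces."""
    triggers = []
    for i in range(lta, len(signal)):
        lta_val = sum(abs(x) for x in signal[i - lta:i]) / lta
        sta_val = sum(abs(x) for x in signal[i - sta:i]) / sta
        if lta_val > 0 and sta_val / lta_val >= threshold:
            triggers.append(i)
    return triggers

quiet = [0.1, -0.1] * 10                      # background noise
event = quiet + [2.0, -2.5, 3.0, -2.0] + quiet  # burst of large amplitudes
print(sta_lta_trigger(event))
```

A network system would run one such trigger per station and declare an earthquake when several stations trigger within a short time window.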

  13. Variable load automatically tests dc power supplies

    Science.gov (United States)

    Burke, H. C., Jr.; Sullivan, R. M.

    1965-01-01

    Continuously variable load automatically tests dc power supplies over an extended current range. External meters monitor current and voltage, and multipliers at the outputs facilitate plotting the power curve of the unit.

  14. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  15. Coke oven automatic combustion control system

    Energy Technology Data Exchange (ETDEWEB)

    Shihara, Y.

    1981-01-01

    This article describes and discusses the development and application of an automatic combustion control system for coke ovens that has been used at the Yawata Works of the Nippon Steel Corporation, Japan. (In Japanese)

  16. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  17. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  18. Automatic calibration system for pressure transducers

    Science.gov (United States)

    1968-01-01

    A fifty-channel automatic pressure transducer calibration system increases the quantity and accuracy of test evaluation calibrations. The pressure transducers are installed in an environmental test chamber and manifolded to a pressure balance so that a uniform pressure is applied to all of them.

  19. A Demonstration of Automatically Switched Optical Network

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    We build an automatically switched optical network (ASON) testbed with four optical cross-connect nodes. Many fundamental ASON features are demonstrated, implemented with control protocols based on the generalized multi-protocol label switching (GMPLS) framework.

  20. Analysis and countermeasures for problems occurring in reverse osmosis during debugging and running

    Institute of Scientific and Technical Information of China (English)

    秦昊

    2012-01-01

    The technological process of the groundwater pre-desalination system of Shaanxi Weihe Coal Chemical Engineering Group, as well as the technical and equipment problems that occurred during debugging and running, is introduced. In addition, the effects on the reverse osmosis system of influent electrical conductivity, influent SDI, influent residual chlorine and ORP, the security filter, product-water backpressure, etc., are analysed and discussed, and corresponding countermeasures are put forward.

  1. Application of a Portable Gas Chromatography Detection Technique in the Debugging Stage of a UHV Project

    Institute of Scientific and Technical Information of China (English)

    王海飞; 苏镇西; 祁炯; 赵也; 程伟; 袁小芳; 韩慧慧

    2015-01-01

    By adding an FPD detector and optimizing the experimental conditions, the existing portable gas chromatography detection technique is improved: it can detect more kinds of decomposition products while offering high separation and detection efficiency. In the debugging stage of UHV SF6 electrical equipment, this technique can quickly and accurately detect SF6 gas decomposition products in suspected faulty equipment, helping to accurately judge the position, type, and severity of the fault.

  2. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the "linguistic filter", which facilitates the construction of statistical algorithms. Certain special filters, of prepositions, conjunctions, negatives, logical implication, compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed

  3. Automatic text categorisation of racist webpages

    OpenAIRE

    Greevy, Edel

    2004-01-01

    Automatic Text Categorisation (TC) involves the assignment of one or more predefined categories to text documents in order that they can be effectively managed. In this thesis we examine the possibility of applying automatic text categorisation to the problem of categorising texts (web pages) based on whether or not they are racist. TC has proven successful for topic-based problems such as news story categorisation. However, the problem of detecting racism is dissimilar to topic-based pro...

  4. Development of automatic weld strength testing machine

    International Nuclear Information System (INIS)

    In order to improve the testing process and accuracy, and to carry out all manual work, including documentation, automatically and effortlessly, an automatic computerised strength-testing machine incorporating state-of-the-art hardware and software was developed. The operator has only to submit the weld to the machine for testing and start the testing process by pressing a switch. This paper describes the salient features of this machine

  5. Dynamic Automatic Noisy Speech Recognition System (DANSR)

    OpenAIRE

    Paul, Sheuli

    2014-01-01

    In this thesis we studied and investigated a very common but long-standing noise problem, and we provide a solution to it. The task is to deal with different types of noise that occur simultaneously, which we call hybrid noise. Although there are individual solutions for specific noise types, one cannot simply combine them, because each solution affects the whole speech signal. We developed an automatic speech recognition system, DANSR (Dynamic Automatic Noisy Speech Recognition System), for hybri...

  6. AUTOMATIC CAPTION GENERATION FOR ELECTRONICS TEXTBOOKS

    OpenAIRE

    Veena Thakur; Trupti Gedam

    2015-01-01

    Automatic or semi-automatic approaches for developing Technology Supported Learning Systems (TSLS) are required to reduce their development cost. The main objective of this paper is to automate the generation of a caption module; it aims at reproducing the way teachers prepare their lessons and the learning material they will use throughout the course. Teachers tend to choose one or more textbooks that cover the contents of their subjects, determine the topics to be addressed, and identify...

  7. Automatic Morphometry of Nerve Histological Sections

    OpenAIRE

    Romero, E.; Cuisenaire, O.; Denef, J.; Delbeke, J.; Macq, B.; Veraart, C.

    2000-01-01

    A method for the automatic segmentation, recognition, and measurement of neuronal myelinated fibers in nerve histological sections is presented. In this method, the fiber parameters, i.e. perimeter, area, position of the fiber, and myelin sheath thickness, are automatically computed. Obliquity of the sections may be taken into account. First, the image is thresholded to provide a coarse classification between myelin and non-myelin pixels. Next, the resulting binary image is further simplified usi...
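
    The first step, a coarse myelin / non-myelin classification by global thresholding, can be sketched directly. The grey-level values and threshold below are invented (myelin stains dark, so low values map to myelin):

```python
def threshold_myelin(image, level=128):
    """Coarse myelin (1) / non-myelin (0) mask by global thresholding
    of grey values; pixels darker than `level` are taken as myelin.
    The threshold is illustrative, not from the paper."""
    return [[1 if px < level else 0 for px in row] for row in image]

# Tiny synthetic section: a dark (myelin-like) band in a bright field.
section = [[200, 40, 45, 210],
           [198, 38, 42, 220],
           [205, 41, 39, 215]]
mask = threshold_myelin(section)
print(mask)
```

The later steps of the pipeline would then simplify this binary mask and measure each fiber's perimeter, area, and sheath thickness from it.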

  8. An automatic system for multielement solvent extractions

    International Nuclear Information System (INIS)

    The automatic system described is suitable for multi-element separations by solvent extraction techniques with organic solvents heavier than water. The analysis is run automatically by a central control unit and includes steps such as pH regulation and reduction or oxidation. As an example, the separation of radioactive Hg2+, Cu2+, Mo6+, Cd2+, As5+, Sb5+, Fe3+, and Co3+ by means of diethyldithiocarbamate complexes is reported. (Auth.)

  9. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  10. Evaluation framework for automatic singing transcription

    OpenAIRE

    Molina, Emilio; Ana M. Barbancho; Tardón, Lorenzo J.; Barbancho, Isabel

    2014-01-01

    In this paper, we analyse the evaluation strategies used in previous works on automatic singing transcription, and we present a novel, comprehensive and freely available evaluation framework for automatic singing transcription. This framework consists of a cross-annotated dataset and a set of extended evaluation measures, which are integrated in a Matlab toolbox. The presented evaluation measures are based on standard MIREX note-tracking measures, but they provide extra information about the ...
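
    The core of MIREX-style note-tracking evaluation, matching estimated note onsets against reference onsets within a tolerance and reporting precision, recall, and F-measure, can be sketched as follows. The onset-only matching (pitch ignored) and the tolerance value are simplifications for illustration:

```python
def note_f_measure(reference, estimated, onset_tol=0.05):
    """Greedy onset matching: an estimated note is correct if its
    onset lies within `onset_tol` seconds of a not-yet-matched
    reference onset. Returns (precision, recall, F-measure)."""
    matched, used = 0, set()
    for onset in estimated:
        for i, ref in enumerate(reference):
            if i not in used and abs(onset - ref) <= onset_tol:
                matched += 1
                used.add(i)
                break
    precision = matched / len(estimated) if estimated else 0.0
    recall = matched / len(reference) if reference else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

ref = [0.00, 0.50, 1.00, 1.50]   # reference note onsets (seconds)
est = [0.01, 0.52, 1.40]         # transcribed onsets
print(note_f_measure(ref, est))
```

A full framework like the one in the record additionally checks pitch and offset agreement and reports several such measures side by side.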

  11. Automatic Programming with Ant Colony Optimization

    OpenAIRE

    Green, Jennifer; Jacqueline L. Whalley; Johnson, Colin G.

    2004-01-01

    Automatic programming is the use of search techniques to find programs that solve a problem. The most commonly explored automatic programming technique is genetic programming, which uses genetic algorithms to carry out the search. In this paper we introduce a new technique called Ant Colony Programming (ACP) which uses an ant colony based search in place of genetic algorithms. This algorithm is described and compared with other approaches in the literature.

  12. Automatic processing of dominance and submissiveness

    OpenAIRE

    Moors, Agnes; De Houwer, Jan

    2005-01-01

    We investigated whether people are able to detect in a relatively automatic manner the dominant or submissive status of persons engaged in social interactions. Using a variant of the affective Simon task (De Houwer & Eelen, 1998), we demonstrated that the verbal response DOMINANT or SUBMISSIVE was facilitated when it had to be made to a target person that was respectively dominant or submissive. These results provide new information about the automatic nature of appraisals and ...

  13. Automatic anatomy recognition of sparse objects

    Science.gov (United States)

    Zhao, Liming; Udupa, Jayaram K.; Odhner, Dewey; Wang, Huiqian; Tong, Yubing; Torigian, Drew A.

    2015-03-01

    A general body-wide automatic anatomy recognition (AAR) methodology was proposed in our previous work based on hierarchical fuzzy models of multitudes of objects which was not tied to any specific organ system, body region, or image modality. That work revealed the challenges encountered in modeling, recognizing, and delineating sparse objects throughout the body (compared to their non-sparse counterparts) if the models are based on the object's exact geometric representations. The challenges stem mainly from the variation in sparse objects in their shape, topology, geographic layout, and relationship to other objects. That led to the idea of modeling sparse objects not from the precise geometric representations of their samples but by using a properly designed optimal super form. This paper presents the underlying improved methodology which includes 5 steps: (a) Collecting image data from a specific population group G and body region Β and delineating in these images the objects in Β to be modeled; (b) Building a super form, S-form, for each object O in Β; (c) Refining the S-form of O to construct an optimal (minimal) super form, S*-form, which constitutes the (fuzzy) model of O; (d) Recognizing objects in Β using the S*-form; (e) Defining confounding and background objects in each S*-form for each object and performing optimal delineation. Our evaluations based on 50 3D computed tomography (CT) image sets in the thorax on four sparse objects indicate that substantially improved performance (FPVF~2%, FNVF~10%, and success where the previous approach failed) can be achieved using the new approach.

  14. Automatic Parallelization of Scientific Application

    DEFF Research Database (Denmark)

    Blum, Troels

    This has led to more specialized methods, as I have shown with the work done with both specialized and parametrized kernels. Both have their benefits and recognizable use cases. We achieved clear performance benefits without any significant negative impact on overall application performance. Even in...

  15. Semi-automatic removal of foreground stars from images of galaxies

    CERN Document Server

    Frei, Z

    1996-01-01

    A new procedure designed to remove foreground stars from galaxy profiles is presented. Although several programs exist for stellar and faint-object photometry, none of them treats star removal from the images very carefully. I present my attempt to develop such a system and briefly compare the performance of my software to one of the well-known stellar photometry packages, DAOPhot. The major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function (PSF) from well separated stars that are situated off the galaxy; (2) automatic identification of those peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fixing of remaining degradations in the image. The algorithm and software presented here are significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since...
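
    The heart of step (2), subtracting a scaled copy of the empirical PSF at each detected stellar peak, can be sketched on a toy image. The unit-peak PSF, the flat "galaxy" background, and the scale-by-peak-minus-background rule are all simplifications of what a real pipeline would fit:

```python
def remove_star(image, psf, peak_y, peak_x, background=0.0):
    """Subtract a scaled empirical PSF centred on a detected stellar
    peak. `psf` is assumed normalised to unit peak; the scale is the
    peak value above the local background, a simplification of the
    fitting a real photometry pipeline would perform."""
    scale = image[peak_y][peak_x] - background
    half = len(psf) // 2
    out = [row[:] for row in image]
    for dy in range(len(psf)):
        for dx in range(len(psf[0])):
            y, x = peak_y - half + dy, peak_x - half + dx
            if 0 <= y < len(out) and 0 <= x < len(out[0]):
                out[y][x] -= scale * psf[dy][dx]
    return out

psf = [[0.1, 0.2, 0.1],
       [0.2, 1.0, 0.2],
       [0.1, 0.2, 0.1]]
galaxy = [[1.0] * 5 for _ in range(5)]          # flat "galaxy" light
star_img = [row[:] for row in galaxy]
for dy in range(3):                              # plant a fake star at (2, 2)
    for dx in range(3):
        star_img[1 + dy][1 + dx] += 4.0 * psf[dy][dx]
cleaned = remove_star(star_img, psf, 2, 2, background=1.0)
print(round(cleaned[2][2], 6))
```

With the right scale, the subtraction restores the underlying galaxy value at every pixel; in practice step (3) still patches residuals where the PSF model is imperfect.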

  16. Research and implementation of report automatic generation measure based on perl CGI

    International Nuclear Information System (INIS)

    In a large-scale real-time data processing system, operation is usually accompanied by a large volume of data about operating states, and operations engineers must manage these data through performance reports. A solution for automatic performance report generation is presented: it builds a report generation system that extracts messages from the database and the UNIX file system and deploys them to an application system. The system has been applied at the CTBT NDC. (authors)
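
    The extract-then-render flow of such a report generator can be sketched in a few lines (the record's system uses Perl CGI; Python is used here for consistency with the other sketches, and the station fields are invented):

```python
from datetime import date

def build_report(states, day):
    """Assemble a plain-text performance report from operation-state
    records, mimicking the extract-then-render flow the record
    describes; real input would come from a database and UNIX files."""
    lines = ["Performance report for " + day.isoformat(),
             "-" * 40]
    for station, status, uptime in states:
        lines.append(f"{station:<10} {status:<8} uptime {uptime:5.1f}%")
    ok = sum(1 for _, status, _ in states if status == "OK")
    lines.append(f"{ok}/{len(states)} stations nominal")
    return "\n".join(lines)

states = [("ALPHA", "OK", 99.2),
          ("BRAVO", "DEGRADED", 87.5),
          ("CHARLIE", "OK", 99.9)]
report = build_report(states, date(2024, 1, 31))
print(report)
```

A CGI wrapper would simply emit this text (or an HTML rendering of it) in response to an engineer's request.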

  17. Advanced automatic target recognition for police helicopter missions

    Science.gov (United States)

    Stahl, Christoph; Schoppmann, Paul

    2000-08-01

    The results of a case study on the application of an advanced method for automatic target recognition to infrared imagery taken from police helicopter missions are presented. The method consists of the following steps: preprocessing, classification, fusion, postprocessing, and tracking, and combines the three paradigms of image pyramids, neural networks, and Bayesian nets. The technology has been developed using a variety of different scenes typical for military aircraft missions. Infrared cameras have been in use for several years by the Bavarian police helicopter forces and are highly valuable for night missions. Several object classes, such as 'persons' or 'vehicles', are tested, and the possible discrimination between persons and animals is shown. The analysis of complex scenes with hidden objects and clutter shows the potential and limitations of automatic target recognition for real-world tasks. Several display concepts illustrate the achievable improvement in situation awareness. The similarities and differences between various mission types concerning object variability, time constraints, consequences of false alarms, etc. are discussed. Typical police actions, like searching for missing persons or runaway criminals, illustrate the advantages of automatic target recognition. The results demonstrate the possible operational benefits for the helicopter crew. Future work will include performance evaluation issues and a system integration concept for the target platform.

  18. BIOESTIM: software for automatic design of estimators in bioprocess engineering.

    Science.gov (United States)

    Farza, M; Chéruy, A

    1994-09-01

    This paper describes BIOESTIM, a software package devoted to on-line estimation in bioprocess engineering. BIOESTIM enables bioengineers to automatically design state and parameter estimators from minimal knowledge of the process kinetics. Such estimators allow the development of software sensors capable of coping with the lack of reliable instrumentation suited to real-time monitoring. The estimator-building procedure in BIOESTIM starts from a dynamical material-balance model of the bioprocess. This model, supplied by the user, is next completed by other information with no requirement for numerical values: the user has only to specify the available measurements, coupled reactions, and the known yield coefficients. On the basis of this knowledge, BIOESTIM performs symbolic algebraic manipulations on the model in order to study estimation possibilities and check the identifiability of yield coefficients. When the design of an estimator is possible, the corresponding equations are automatically generated. Moreover, these estimators are stored in a user-specified file which is automatically interfaced with specialized simulation software including data treatment and numerical integration packages. Thus, the user can simulate the estimator's performance under various operational conditions using available experimental measurements. A typical example dealing with microbial growth and biosynthesis reactions is given in order to illustrate the main functional capabilities of BIOESTIM. BIOESTIM has been designed and written in a modular fashion. The module dealing with estimator design makes use of symbolic computation; it is written in Mathematica and runs on every computer on which this language is available. PMID:7828062

  19. Automatic graphene transfer system for improved material quality and efficiency

    Science.gov (United States)

    Boscá, Alberto; Pedrós, Jorge; Martínez, Javier; Palacios, Tomás; Calle, Fernando

    2016-02-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth substrate to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The process is based on the all-fluidic manipulation of the graphene, to avoid mechanical damage, strain and contamination, and on the combination of capillary action and electrostatic repulsion between the graphene and its container, to ensure a centered sample on top of the target substrate. The improved carrier mobility and yield of the automatically transferred graphene, as compared to manually transferred material, are demonstrated by the optical and electrical characterization of field-effect transistors fabricated on both materials. In particular, 70% higher mobility values, a 30% decrease in unintentional doping and a 10% strain reduction are achieved. The system has been developed for lab-scale transfer and proved to be scalable for industrial applications.

  20. Enhanced Automatic Wavelet Independent Component Analysis for Electroencephalographic Artifact Removal

    Directory of Open Access Journals (Sweden)

    Nadia Mammone

    2014-12-01

    Full Text Available Electroencephalography (EEG) is a fundamental diagnostic instrument for many neurological disorders, and it is the main tool for the investigation of the cognitive or pathological activity of the brain through the bioelectromagnetic fields that it generates. Artifacts can make the correct interpretation of the EEG misleading, both for clinicians’ visual evaluation and for automated procedures. As a consequence, artifact rejection in EEG is a key preprocessing step, and the quest for reliable automatic processors has been growing quickly in the last few years. Recently, a promising automatic methodology, known as automatic wavelet-independent component analysis (AWICA), has been proposed. In this paper, a more efficient and sensitive version, called enhanced-AWICA (EAWICA), is proposed, and an extensive performance comparison is carried out by tuning the different parameters involved in artifact detection. EAWICA is shown to minimize information loss and to outperform AWICA in artifact removal, on both simulated and real experimental EEG recordings.
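The kind of higher-order-statistics marker that wavelet-ICA pipelines use to single out artifactual components can be sketched with a kurtosis threshold. AWICA/EAWICA also use entropy measures and a full wavelet-ICA decomposition, so this is only a toy illustration with assumed names and threshold:

```python
def kurtosis(x):
    """Excess kurtosis of a sequence: heavy-tailed, spiky signals
    (typical of eye-blink or muscle artifacts) score high."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    return sum((v - m) ** 4 for v in x) / n / var ** 2 - 3.0

def flag_artifacts(components, k_thresh=2.0):
    """Flag components whose excess kurtosis exceeds a threshold.
    A crude stand-in for the statistical markers used by AWICA-style
    artifact rejection; the threshold value is an assumption."""
    return [kurtosis(c) > k_thresh for c in components]
```

A spiky component is flagged while a regular oscillation is not, which is the behaviour such markers are chosen for.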

  1. Automatic spectrophotometric determination of trace amounts of boron with curcumin

    International Nuclear Information System (INIS)

    The proposed method utilizes a rosocyanin complex formed by the reaction of boric acid and curcumin without evaporation to dryness. The automatic determination of boron in aqueous solution is performed according to the predetermined program (Fig. 8), after manual injection of a sample solution (2.00 ml) to the reaction vessel. Glacial acetic acid (5.40 ml) and propionic anhydride (13.20 ml) are added and the solution is circulated through the circulating pipe consisting of a bubble remover, an absorbance measuring flow cell, an air blowing tube and a drain valve. Oxalyl chloride (0.81 ml) is added and the solution is circulated for 80 seconds to eliminate water. Sulfuric acid (1.08 ml) and curcumin reagent (3.01 ml) are added and the solution is circulated for 120 seconds to form a rosocyanin complex. After addition of an acetate buffer solution (21.34 ml) for the neutralisation of an interfering proton complex of curcumin, the absorbance of the orange solution is measured at 545 nm. This automatic analysis is sensitive (Fig. 9) and rapid; less than 1.5 μg of boron is determined in 7 minutes. It can be applied to the determination of trace amounts of boron in steel samples, combined with an automatic distillation under development. (auth.)

  2. Automatic reconstruction of neural morphologies with multi-scale tracking.

    Science.gov (United States)

    Choromanska, Anna; Chang, Shih-Fu; Yuste, Rafael

    2012-01-01

    Neurons have complex axonal and dendritic morphologies that are the structural building blocks of neural circuits. The traditional method to capture these morphological structures using manual reconstructions is time-consuming and partly subjective, so it appears important to develop automatic or semi-automatic methods to reconstruct neurons. Here we introduce a fast algorithm for tracking neural morphologies in 3D with simultaneous detection of branching processes. The method is based on existing tracking procedures, adding the machine vision technique of multi-scaling. Starting from a seed point, our algorithm tracks axonal or dendritic arbors within a sphere of a variable radius, then moves the sphere center to the point on its surface with the shortest Dijkstra path, detects branching points on the surface of the sphere, scales it until branches are well separated and then continues tracking each branch. We evaluate the performance of our algorithm on preprocessed data stacks obtained by manual reconstructions of neural cells, corrupted with different levels of artificial noise, and unprocessed data sets, achieving 90% precision and 81% recall in branch detection. We also discuss limitations of our method, such as reconstructing highly overlapping neural processes, and suggest possible improvements. Multi-scaling techniques, well suited to detect branching structures, appear a promising strategy for automatic neuronal reconstructions. PMID:22754498
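The centre-advancement rule described above (move the sphere to the surface point reached by the shortest Dijkstra path) can be sketched on a toy 2D grid; all names here are assumptions, and the actual algorithm operates on 3D image stacks with branching detection:

```python
import heapq

def dijkstra_next_center(weights, seed, surface):
    """Return the surface point with the shortest Dijkstra path from
    the current sphere centre: the rule the tracker uses to advance.
    `weights` maps grid node -> traversal cost; edges connect
    4-neighbours. A toy 2D stand-in for the 3D image graph."""
    dist = {seed: 0.0}
    heap = [(0.0, seed)]
    while heap:
        d, (y, x) = heapq.heappop(heap)
        if d > dist.get((y, x), float("inf")):
            continue  # stale queue entry
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if (ny, nx) in weights:
                nd = d + weights[(ny, nx)]
                if nd < dist.get((ny, nx), float("inf")):
                    dist[(ny, nx)] = nd
                    heapq.heappush(heap, (nd, (ny, nx)))
    return min(surface, key=lambda p: dist.get(p, float("inf")))
```

With image-derived costs (cheap along bright neurites, expensive through background), the minimum-distance surface point naturally follows the arbor.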

  3. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    Application of microprocessors to programming and computing of solutions chemical analysis by a sequential technique is examined. Safety, performances reliability are compared to other methods. An example is given on uranium titration by spectrophotometry

  4. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. 
The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  5. Automatic detection of AutoPEEP during controlled mechanical ventilation

    Directory of Open Access Journals (Sweden)

    Nguyen Quang-Thang

    2012-06-01

    Full Text Available Abstract Background Dynamic hyperinflation, hereafter called AutoPEEP (auto-positive end expiratory pressure), with some slight language abuse, is a frequent deleterious phenomenon in patients undergoing mechanical ventilation. Although not readily quantifiable, AutoPEEP can be recognized on the expiratory portion of the flow waveform. If expiratory flow does not return to zero before the next inspiration, AutoPEEP is present. This simple detection, however, requires the eye of an expert clinician at the patient’s bedside. An automatic detection of AutoPEEP should be helpful to optimize care. Methods In this paper, a platform for automatic detection of AutoPEEP based on the flow signal available on most recent mechanical ventilators is introduced. The detection algorithms are developed on the basis of robust non-parametric hypothesis tests that require no prior information on the signal distribution. In particular, two detectors are proposed: one is based on SNT (Signal Norm Testing) and the other is an extension of SNT in the sequential framework. The performance assessment was carried out on a respiratory system analog and ex vivo on various retrospectively acquired patient curves. Results The experimental results show that the proposed algorithm provides relevant AutoPEEP detection on both simulated and real data. The analysis of clinical data has shown that the proposed detectors can be used to automatically detect AutoPEEP with an accuracy of 93% and a recall (sensitivity) of 90%. Conclusions The proposed platform provides automatic early detection of AutoPEEP. Such functionality can be integrated into currently used mechanical ventilators for continuous monitoring of the patient-ventilator interface and can, therefore, alleviate the clinician's task.
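The detection rule quoted above (expiratory flow failing to return to zero before the next inspiration) can be sketched as a simple per-breath check. This is far cruder than the paper's non-parametric SNT detectors; every name and the tolerance value below are assumptions:

```python
def detect_autopeep(flow, insp_starts, tol=0.05):
    """Flag AutoPEEP breaths: if expiratory flow (L/s) has not returned
    to near zero just before the next inspiration begins, dynamic
    hyperinflation is suspected. `insp_starts` holds the sample index
    at which each inspiration begins; `tol` is the zero-flow tolerance."""
    flags = []
    for start in insp_starts[1:]:  # first breath has no preceding expiration
        end_expiratory_flow = flow[start - 1]
        flags.append(abs(end_expiratory_flow) > tol)
    return flags
```

A real detector must also cope with sensor noise and cardiogenic oscillations, which is precisely why the paper resorts to statistical hypothesis testing rather than a fixed threshold.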

  6. Design of an Automatic Coffee Machine Control System Based on the EM78P510NK

    Institute of Scientific and Technical Information of China (English)

    邵雯

    2013-01-01

    This article presents the design of an automatic coffee machine control system built around an EM78P510NK MCU. The system reliably controls the piston position, the seal-gland position, the temperature, the water flow and the coarseness of the coffee grind, and it raises an alarm when water or coffee beans run low. The control system operates reliably, has a simple structure and is easy to debug and extend, making it suitable for commercial, catering, office and home use.

  7. Automatic query formulations in information retrieval.

    Science.gov (United States)

    Salton, G; Buckley, C; Fox, E A

    1983-07-01

    Modern information retrieval systems are designed to supply relevant information in response to requests received from the user population. In most retrieval environments the search requests consist of keywords, or index terms, interrelated by appropriate Boolean operators. Since it is difficult for untrained users to generate effective Boolean search requests, trained search intermediaries are normally used to translate original statements of user need into useful Boolean search formulations. Methods are introduced in this study which reduce the role of the search intermediaries by making it possible to generate Boolean search formulations completely automatically from natural language statements provided by the system patrons. Frequency considerations are used to automatically generate appropriate term combinations as well as the Boolean connectives relating the terms. Methods are covered to produce automatic query formulations both in a standard Boolean logic system and in an extended Boolean system in which the strict interpretation of the connectives is relaxed. Experimental results are supplied to evaluate the effectiveness of the automatic query formulation process, and methods are described for applying the automatic query formulation process in practice. PMID:10299297
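The frequency-driven construction can be caricatured in a few lines: frequent terms form a conjunctive core, singletons become a disjunctive clause. The stopword list, cutoff and AND/OR policy below are assumptions, far simpler than the methods of the paper:

```python
import re
from collections import Counter

# Tiny illustrative stopword list; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "of", "in", "to", "for", "is", "are",
             "on", "and", "or", "about"}

def boolean_query(request, freq_cutoff=2):
    """Build a Boolean query from a natural-language request:
    terms occurring at least `freq_cutoff` times are ANDed together,
    remaining terms are grouped into an OR clause."""
    terms = [t for t in re.findall(r"[a-z]+", request.lower())
             if t not in STOPWORDS]
    counts = Counter(terms)
    core = sorted(t for t, c in counts.items() if c >= freq_cutoff)
    rare = sorted(t for t, c in counts.items() if c < freq_cutoff)
    clauses = []
    if core:
        clauses.append(" AND ".join(core))
    if rare:
        clauses.append("(" + " OR ".join(rare) + ")")
    return " AND ".join(clauses)
```

For example, a request repeating "information retrieval" yields those two terms as the conjunctive core, with one-off terms relegated to an OR group.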

  8. Automatic detection of aircraft emergency landing sites

    Science.gov (United States)

    Shen, Yu-Fei; Rahman, Zia-ur; Krusienski, Dean; Li, Jiang

    2011-06-01

    An automatic landing site detection algorithm is proposed for aircraft emergency landing. Emergency landing is an unplanned event in response to emergency situations. If, as is unfortunately usually the case, there is no airstrip or airfield that can be reached by the un-powered aircraft, a crash landing or ditching has to be carried out. Identifying a safe landing site is critical to the survival of passengers and crew. Conventionally, the pilot chooses the landing site visually by looking at the terrain through the cockpit. The success of this vital decision greatly depends on the external environmental factors that can impair human vision, and on the pilot's flight experience, which can vary significantly among pilots. Therefore, we propose a robust, reliable and efficient algorithm that is expected to alleviate the negative impact of these factors. We present only the detection mechanism of the proposed algorithm and assume that image enhancement for increased visibility and image stitching for a larger field of view have already been performed on the images acquired by aircraft-mounted cameras. Specifically, we describe an elastic bound detection method which is designed to position the horizon. The terrain image is divided into non-overlapping blocks which are then clustered according to a "roughness" measure. Adjacent smooth blocks are merged to form potential landing sites whose dimensions are measured with principal component analysis and geometric transformations. If the dimensions of the candidate region exceed the minimum requirement for safe landing, the potential landing site is considered a safe candidate and highlighted on the human-machine interface. In the end, the pilot makes the final decision by confirming one of the candidates, also considering other factors such as wind speed and wind direction. Preliminary results show the feasibility of the proposed algorithm.
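The block "roughness" step can be sketched with a per-block intensity variance over non-overlapping blocks. The merging, PCA sizing and horizon-detection steps are omitted; the names and the variance measure are assumptions, since the paper does not specify its roughness metric here:

```python
def smooth_blocks(image, block, thresh):
    """Partition a grayscale image (2D list of intensities) into
    non-overlapping `block` x `block` tiles and flag tiles whose
    intensity variance ("roughness") is below `thresh` as candidate
    landing terrain. Returns {(block_row, block_col): is_smooth}."""
    h, w = len(image), len(image[0])
    flags = {}
    for by in range(0, h, block):
        for bx in range(0, w, block):
            vals = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            flags[(by // block, bx // block)] = var < thresh
    return flags
```

Adjacent flagged tiles would then be merged into candidate regions and measured against the aircraft's minimum landing footprint.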

  9. Automatic measurement and representation of prosodic features

    Science.gov (United States)

    Ying, Goangshiuan Shawn

    Effective measurement and representation of prosodic features of the acoustic signal, for use in automatic speech recognition and understanding systems, is the goal of this work. Prosodic features (stress, duration, and intonation) are variations of the acoustic signal whose domains extend beyond the boundaries of each individual phonetic segment. Listeners perceive prosodic features through a complex combination of acoustic correlates such as intensity, duration, and fundamental frequency (F0). We have developed new tools to measure F0 and intensity features. We apply a probabilistic global error correction routine to an Average Magnitude Difference Function (AMDF) pitch detector. A new short-term frequency-domain Teager energy algorithm is used to measure the energy of a speech signal. We have conducted a series of experiments performing lexical stress detection on words in continuous English speech from two speech corpora. We have experimented with two different approaches, a segment-based approach and a rhythm-unit-based approach, to lexical stress detection. The first approach uses pattern recognition with energy- and duration-based measurements as features to build Bayesian classifiers that detect the stress level of a vowel segment. In the second approach, we define a rhythm unit and use only the F0-based measurement and a scoring system to determine the stressed segment in the rhythm unit. A duration-based segmentation routine was developed to break polysyllabic words into rhythm units. The long-term goal of this work is to develop a system that can effectively detect the stress pattern for each word in continuous speech utterances. Stress information will be integrated as a constraint for pruning the word hypotheses in a word recognition system based on hidden Markov models.
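The AMDF measurement mentioned above can be sketched as follows: for each candidate lag, average the absolute difference between the frame and its shifted copy; voiced speech produces a deep valley at the pitch period. This omits the paper's probabilistic global error correction, and the names and search bounds are assumptions:

```python
def amdf_pitch(frame, fs, fmin=60.0, fmax=400.0):
    """Estimate F0 (Hz) of a voiced frame with the Average Magnitude
    Difference Function, searching lags between fs/fmax and fs/fmin."""
    lo, hi = int(fs / fmax), int(fs / fmin)
    best_lag, best_val = lo, float("inf")
    for lag in range(lo, min(hi, len(frame) // 2) + 1):
        # Mean absolute difference between the frame and its lagged copy.
        d = sum(abs(frame[n] - frame[n - lag])
                for n in range(lag, len(frame))) / (len(frame) - lag)
        if d < best_val:
            best_lag, best_val = lag, d
    return fs / best_lag
```

On real speech the AMDF valley can be shallow or ambiguous (octave errors), which motivates the global error-correction stage the authors add.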

  10. On the malleability of automatic attitudes: combating automatic prejudice with images of admired and disliked individuals.

    Science.gov (United States)

    Dasgupta, N; Greenwald, A G

    2001-11-01

    Two experiments examined whether exposure to pictures of admired and disliked exemplars can reduce automatic preference for White over Black Americans and younger over older people. In Experiment 1, participants were exposed to either admired Black and disliked White individuals, disliked Black and admired White individuals, or nonracial exemplars. Immediately after exemplar exposure and 24 hr later, they completed an Implicit Association Test that assessed automatic racial attitudes and 2 explicit attitude measures. Results revealed that exposure to admired Black and disliked White exemplars significantly weakened automatic pro-White attitudes for 24 hr beyond the treatment but did not affect explicit racial attitudes. Experiment 2 provided a replication using automatic age-related attitudes. Together, these studies provide a strategy that attempts to change the social context and, through it, to reduce automatic prejudice and preference. PMID:11708558

  11. Automatic contrast: evidence that automatic comparison with the social self affects evaluative responses.

    Science.gov (United States)

    Ruys, Kirsten I; Spears, Russell; Gordijn, Ernestine H; de Vries, Nanne K

    2007-08-01

    The aim of the present research was to investigate whether unconsciously presented affective information may cause opposite evaluative responses depending on what social category the information originates from. We argue that automatic comparison processes between the self and the unconscious affective information produce this evaluative contrast effect. Consistent with research on automatic behaviour, we propose that when an intergroup context is activated, an automatic comparison to the social self may determine the automatic evaluative responses, at least for highly visible categories (e.g. sex, ethnicity). Contrary to previous research on evaluative priming, we predict automatic contrastive responses to affective information originating from an outgroup category such that the evaluative response to neutral targets is opposite to the valence of the suboptimal primes. Two studies using different intergroup contexts provide support for our hypotheses. PMID:17705936

  12. Functionality and Performance Visualization of the Distributed High Quality Volume Renderer (HVR)

    KAUST Repository

    Shaheen, Sara

    2012-07-01

    Volume rendering systems are designed to enable scientists and a variety of experts to interactively explore volume data through 3D views of the volume. However, volume rendering techniques are computationally intensive tasks. Parallel distributed volume rendering systems and multi-threading architectures have been suggested as natural solutions to provide acceptable volume rendering performance for very large volume data sizes, such as electron microscopy (EM) data. This in turn adds another level of complexity when developing and manipulating volume rendering systems. Given that distributed parallel volume rendering systems are among the most complex systems to develop, trace and debug, traditional debugging tools clearly do not provide enough support. As a consequence, there is a great demand for tools that facilitate the manipulation of such systems. This can be achieved by utilizing the power of computer graphics to design visual representations that reflect how the system works and that visualize its current performance state. The work presented falls within the field of software visualization, where visualization is used to aid the understanding of software. This thesis presents a number of visual representations that reflect functionality and performance aspects of the distributed HVR, a high-quality volume rendering system that uses various techniques to visualize large volumes interactively. The work visualizes different stages of the parallel volume rendering pipeline of HVR, along with means of performance analysis through a number of flexible and dynamic visualizations that reflect the current state of the system and can be manipulated at runtime. These visualizations aim to facilitate debugging, understanding and analyzing the distributed HVR.

  13. Mind Out of Action: The Intentionality of Automatic Actions

    OpenAIRE

    Di Nucci, Ezio

    2008-01-01

    We think less than we think. My thesis moves from this suspicion to show that standard accounts of intentional action can't explain the whole of agency. Causalist accounts such as Davidson's and Bratman's, according to which an action can be intentional only if it is caused by a particular mental state of the agent, don't work for every kind of action. So-called automatic actions, effortless performances over which the agent doesn't deliberate, and to which she doesn't need to ...

  14. AN AUTOMATIC TEST ENVIRONMENT FOR MICROELECTRONICS EDUCATION AND RESEARCH

    OpenAIRE

    Federico Sandoval-Ibarra

    2008-01-01

    An automatic test environment (ATE) based on a PSoC has been developed to perform electrical characterization of integrated circuits (ICs). The ICs are designed for academic and research purposes as part of the Electronic Design graduate program at CINVESTAV-Guadalajara Unit; these ICs are manufactured in standard N-well, 5-V, 1.5μm/0.5μm CMOS technologies. The ATE offers programmable capabilities to develop master-slave architectures, memory for data storage, functions generator to stimulate circ...

  15. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    complicates the analysis instead and contributes to model inadequacy. As such, scatter can be considered as an example of element-wise outliers. However, no straightforward method for identifying the scatter region can be found in the literature. In this paper an automatic scatter identification method is...... input data for three different PARAFAC methods. Firstly inserting missing values in the scatter regions are tested, secondly an interpolation of the scatter regions is performed and finally the scatter regions are down-weighted. These results show that the PARAFAC method to choose after scatter...

  16. High Range Resolution Profile Automatic Target Recognition Using Sparse Representation

    Institute of Scientific and Technical Information of China (English)

    Zhou Nuo; Chen Wei

    2010-01-01

    Sparse representation is a new signal analysis method which has received increasing attention in recent years. In this article, a novel scheme for high range resolution profile automatic target recognition of ground moving targets is proposed. Sparse representation theory is applied to analyze the components of high range resolution profiles, and sparse coefficients are used to describe their features. Numerous experiments with the number of target types ranging from 2 to 6 have been carried out. Results show that the proposed scheme not only provides higher recognition precision in real time, but also achieves more robust performance as the number of target types increases.
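The classification idea can be illustrated with a reconstruction-residual rule over per-class dictionaries. Note this toy version drops the l1 sparsity constraint that gives sparse representation its name and assumes orthonormal dictionary columns; the class labels and helper names are invented for illustration:

```python
def residual(cols, x):
    """Squared reconstruction residual of x on the span of orthonormal
    dictionary columns: project via inner products, measure what's left."""
    proj = [0.0] * len(x)
    for col in cols:
        c = sum(a * b for a, b in zip(col, x))
        proj = [p + c * a for p, a in zip(proj, col)]
    return sum((a - b) ** 2 for a, b in zip(x, proj))

def classify(dictionaries, x):
    """Assign x (a range profile) to the class whose dictionary
    reconstructs it with the smallest residual: a least-squares
    stand-in for sparse-representation classification."""
    return min(dictionaries, key=lambda cls: residual(dictionaries[cls], x))
```

In the full scheme, the sparse coefficients themselves act as the feature description, and the residual comparison is carried out over the sparse (not least-squares) reconstruction.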

  17. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix and by the Self-Organizing Map (SOM) clustering algorithm. We present a technique for the automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology to design trustworthy and robust early failure detection systems. (author)
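The grey-level co-occurrence matrix underlying the texture attributes can be sketched for a single pixel offset; the SOM clustering step is omitted and all names are assumptions:

```python
def cooccurrence(img, levels, dx=1, dy=0):
    """Grey-level co-occurrence matrix for one offset (dx, dy):
    M[i][j] counts how often grey level i is followed by grey level j
    at that displacement. `img` is a 2D list of values in [0, levels)."""
    M = [[0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                M[img[y][x]][img[y2][x2]] += 1
    return M
```

Texture attributes such as contrast, energy and homogeneity are then computed from the normalized matrix and fed to the clustering stage.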

  18. Automatic scoring of the severity of psoriasis scaling

    DEFF Research Database (Denmark)

    Gomez, David Delgado; Ersbøll, Bjarne Kjær; Carstensen, Jens Michael

    In this work, a combined statistical and image analysis method to automatically evaluate the severity of scaling in psoriasis lesions is proposed. The method separates the different regions of the disease in the image and scores the degree of scaling based on the properties of these areas. The...... proposed method provides a solution to the lack of suitable methods to assess the lesion and to evaluate changes during the treatment. An experiment over a collection of psoriasis images is conducted to test the performance of the method. Results show that the obtained scores are highly correlated with...

  19. Automatic generation of alignments for 3D QSAR analyses.

    Science.gov (United States)

    Jewell, N E; Turner, D B; Willett, P; Sexton, G J

    2001-01-01

    Many 3D QSAR methods require the alignment of the molecules in a dataset, which can require a fair amount of manual effort in deciding upon a rational basis for the superposition. This paper describes the use of FBSS, a program for field-based similarity searching in chemical databases, for generating such alignments automatically. The CoMFA and CoMSIA experiments with several literature datasets show that the QSAR models resulting from the FBSS alignments are broadly comparable in predictive performance with the models resulting from manual alignments. PMID:11774998
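Field-based similarity of the kind FBSS ranks alignments by can be illustrated with a Carbo-style index over Gaussian charge fields; this is a generic sketch, not FBSS's actual implementation, and the atom representation, decay constant and names are all assumptions:

```python
import math

def carbo_similarity(mol_a, mol_b, alpha=0.3):
    """Carbo-style similarity between two molecules, each a list of
    (x, y, z, charge) atoms: normalized overlap of Gaussian-decayed
    charge fields, 1.0 for identical fields."""
    def overlap(m1, m2):
        s = 0.0
        for x1, y1, z1, q1 in m1:
            for x2, y2, z2, q2 in m2:
                r2 = (x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2
                s += q1 * q2 * math.exp(-alpha * r2)
        return s
    return overlap(mol_a, mol_b) / math.sqrt(
        overlap(mol_a, mol_a) * overlap(mol_b, mol_b))
```

An alignment search would then adjust the relative pose of `mol_b` to maximize this score, yielding the superposition used for CoMFA/CoMSIA.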

  20. Automatic energy calibration of germanium detectors using fuzzy set theory

    International Nuclear Information System (INIS)

    With the advent of multi-detector arrays, many tasks that are usually performed by physicists, such as energy calibration, become very time-consuming. There is therefore a need to develop more and more complex algorithms able to mimic human expertise. Fuzzy logic offers a theoretical framework for building algorithms that are close to the human way of thinking. In this paper we apply fuzzy set theory to develop an automatic procedure for energy calibration. The algorithm, based on fuzzy concepts, has been tested on data taken with the EUROBALL IV γ-ray array
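One way fuzzy set theory can serve energy calibration is to score candidate gains by the triangular membership of predicted peak energies against reference gamma lines. This is a toy sketch under assumed names, not the EUROBALL procedure itself (which must also handle offsets and many candidate line assignments):

```python
def membership(x, center, width):
    """Triangular fuzzy membership: 1 at the reference energy,
    falling linearly to 0 at +/- width (keV)."""
    return max(0.0, 1.0 - abs(x - center) / width)

def match_score(gain, peak_channels, ref_energies, width=2.0):
    # Fuzzy degree to which a candidate gain maps the observed peak
    # channels onto the reference gamma lines.
    return sum(max(membership(gain * ch, e, width) for e in ref_energies)
               for ch in peak_channels)

def best_gain(peak_channels, ref_energies, candidate_gains):
    """Pick the linear gain whose predicted peak energies best match
    the reference lines: a toy stand-in for fuzzy auto-calibration."""
    return max(candidate_gains,
               key=lambda g: match_score(g, peak_channels, ref_energies))
```

Graded membership (rather than hard pass/fail windows) is what lets such a procedure tolerate peak-position noise the way a physicist would.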