WorldWideScience

Sample records for automatic performance debugging

  1. Automatic Debugging Support for UML Designs

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    Design of large software systems requires rigorous application of software engineering methods covering all phases of the software process. Debugging during the early design phases is extremely important, because late bug-fixes are expensive. In this paper, we describe an approach which facilitates debugging of UML requirements and designs. The Unified Modeling Language (UML) is a set of notations for object-oriented design of a software system. We have developed an algorithm which translates requirement specifications in the form of annotated sequence diagrams into structured statecharts. This algorithm detects conflicts between sequence diagrams and inconsistencies in the domain knowledge. After synthesizing statecharts from sequence diagrams, these statecharts are usually subject to manual modification and refinement. By using the "backward" direction of our synthesis algorithm, we are able to map modifications made to the statechart back into the requirements (sequence diagrams) and check for conflicts there. Fed back to the user, the conflicts detected by our algorithm form the basis for deduction-based debugging of requirements and domain theory in very early development stages. Our approach also generates explanations of why there is a conflict and which parts of the specifications are affected.

  2. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
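
    A minimal sketch of the thread-grouping idea summarized above, assuming each thread is represented by the list of calling-instruction addresses on its stack (the addresses and thread ids below are invented): threads whose stacks match fall into the same group, so unusually small groups stand out as candidates for defective threads.

    ```python
    # Group threads by their call stacks (tuples of calling-instruction addresses).
    from collections import defaultdict

    def group_threads_by_call_stack(thread_stacks):
        """thread_stacks: dict mapping thread id -> tuple of return addresses."""
        groups = defaultdict(list)
        for tid, stack in thread_stacks.items():
            groups[stack].append(tid)
        return groups

    if __name__ == "__main__":
        stacks = {
            0: (0x4005F0, 0x400A10, 0x400C80),   # most threads wait here
            1: (0x4005F0, 0x400A10, 0x400C80),
            2: (0x4005F0, 0x400A10, 0x400C80),
            3: (0x4005F0, 0x4009B0),             # lone thread stuck elsewhere: suspect
        }
        # Display the groups, smallest first, to surface the defective thread.
        for stack, tids in sorted(group_threads_by_call_stack(stacks).items(),
                                  key=lambda kv: len(kv[1])):
            print(f"{len(tids):3d} thread(s) at {[hex(a) for a in stack]}: {tids}")
    ```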

  3. Instrumentation, performance visualization, and debugging tools for multiprocessors

    Science.gov (United States)

    Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.

    1991-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging, and tuning parallel programs becomes intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.

  4. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

    This report discusses the following topics on supercomputer debugging: Distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  5. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

    This report discusses the following topics on supercomputer debugging: Distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  6. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  7. PerfXplain: Debugging MapReduce Job Performance

    CERN Document Server

    Khoussainova, Nodira; Suciu, Dan

    2012-01-01

    While users today have access to many tools that assist in performing large scale data analysis tasks, understanding the performance characteristics of their parallel computations, such as MapReduce jobs, remains difficult. We present PerfXplain, a system that enables users to ask questions about the relative performances (i.e., runtimes) of pairs of MapReduce jobs. PerfXplain provides a new query language for articulating performance queries and an algorithm for generating explanations from a log of past MapReduce job executions. We formally define the notion of an explanation together with three metrics, relevance, precision, and generality, that measure explanation quality. We present the explanation-generation algorithm based on techniques related to decision-tree building. We evaluate the approach on a log of past executions on Amazon EC2, and show that our approach can generate quality explanations, outperforming two naive explanation-generation methods.
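
    A toy illustration of the pairwise-comparison idea behind such a system, not PerfXplain's actual query language or explanation algorithm: given a log of past job executions, it reports the configuration parameters that differ between two queried jobs, ranked by how strongly each setting separates fast from slow runs in the log (all field names are invented).

    ```python
    def explain_pair(log, job_a, job_b, params):
        """Rank the config parameters that plausibly explain why one job was faster."""
        faster, slower = sorted((job_a, job_b), key=lambda j: log[j]["runtime"])
        candidates = []
        for p in params:
            if log[faster][p] == log[slower][p]:
                continue                      # identical setting cannot explain the gap
            same = [j for j in log if log[j][p] == log[faster][p]]
            other = [j for j in log if log[j][p] == log[slower][p]]
            if same and other:
                # How much slower are runs that use the slower job's setting, on average?
                gap = (sum(log[j]["runtime"] for j in other) / len(other)
                       - sum(log[j]["runtime"] for j in same) / len(same))
                candidates.append((gap, p))
        return [p for gap, p in sorted(candidates, reverse=True) if gap > 0]

    log = {
        "job1": {"runtime": 120, "reducers": 4,  "compress": False},
        "job2": {"runtime": 45,  "reducers": 16, "compress": False},
        "job3": {"runtime": 50,  "reducers": 16, "compress": True},
    }
    print(explain_pair(log, "job1", "job2", ["reducers", "compress"]))  # ['reducers']
    ```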

  8. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-01-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  9. A Scalable Prescriptive Parallel Debugging Model

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Quarfot Nielsen, Niklas; Lee, Gregory L.;

    2015-01-01

    Debugging is a critical step in the development of any parallel program. However, the traditional interactive debugging model, where users manually step through code and inspect their application, does not scale well even for current supercomputers due to its centralized nature. While lightweight...... and test their debugging intuition in a way that helps to reduce the error space. Based on this debugging model we introduce a prototype implementation embodying this model, the DySectAPI, allowing programmers to construct probe trees for automatic, event-driven debugging at scale. In this paper we...

  10. TinyDebug

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    2011-01-01

    Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, this paper presents TinyDebug, which is a multi-purpose passive debugging framework for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...

  11. DySectAPI: Scalable Prescriptive Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Quarfot Nielsen, Niklas;

    We present the DySectAPI, a tool that allows users to construct probe trees for automatic, event-driven debugging at scale. The traditional, interactive debugging model, whereby users manually step through and inspect their application, does not scale well even for current supercomputers. While...

  12. DySectAPI: Scalable Prescriptive Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Quarfot Nielsen, Niklas;

    We present the DySectAPI, a tool that allows users to construct probe trees for automatic, event-driven debugging at scale. The traditional, interactive debugging model, whereby users manually step through and inspect their application, does not scale well even for current supercomputers. While...... lightweight debugging models scale well, they can currently only debug a subset of bug classes. DySectAPI fills the gap between these two approaches with a novel user-guided approach. Using both experimental results and analytical modeling we show how DySectAPI scales and can run with a low overhead...
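
    A generic sketch of a prescriptive probe tree, not the DySectAPI itself: each probe names an event, a filter and an action, and its child probes are armed only after the parent fires, so the debugging session narrows automatically as events arrive.

    ```python
    class Probe:
        def __init__(self, event, condition, action, children=()):
            self.event, self.condition = event, condition
            self.action, self.children = action, list(children)

    def run_session(events, root_probes):
        armed = list(root_probes)
        for ev in events:                        # ev: dict with 'name', 'task', ...
            for probe in list(armed):
                if probe.event == ev["name"] and probe.condition(ev):
                    probe.action(ev)
                    armed.remove(probe)          # fire once, then descend the tree
                    armed.extend(probe.children)

    # Only after an even-numbered task enters the solver do we arm the crash probe.
    leaf = Probe("segfault", lambda e: True,
                 lambda e: print(f"collect backtrace on task {e['task']}"))
    root = Probe("enter_solver", lambda e: e["task"] % 2 == 0,
                 lambda e: print(f"task {e['task']} entered solver"), [leaf])

    run_session([{"name": "enter_solver", "task": 4},
                 {"name": "segfault", "task": 4}], [root])
    ```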

  13. Distributed debugging and tumult

    NARCIS (Netherlands)

    Scholten, J.; Jansen, P.G.

    1990-01-01

    A description is given of Tumult (Twente university multicomputer) and its operating system, along with considerations about parallel debugging, examples of parallel debuggers, and the proposed debugger for Tumult. Problems related to debugging distributed systems and solutions found in other distri

  14. MPI Debugging with Handle Introspection

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.;

    The Message Passing Interface, MPI, is the standard programming model for high performance computing clusters. However, debugging applications on large scale clusters is difficult. The widely used Message Queue Dumping interface enables inspection of message queue state but there is no general...... interface for extracting information from MPI objects such as communicators. A developer can debug the MPI library as if it was part of the application, but this exposes an unneeded level of detail. The Tools Working Group in the MPI Forum has proposed a specification for MPI Handle Introspection....... It defines a standard interface that lets debuggers extract information from MPI objects. Extracted information is then presented to the developer, in a human readable format. The interface is designed to be independent of MPI implementations and debuggers. In this paper, we describe our support...

  15. Voice-controlled Debugging of Spreadsheets

    CERN Document Server

    Flood, Derek

    2008-01-01

    Developments in Mobile Computing are putting pressure on the software industry to research new modes of interaction that do not rely on the traditional keyboard and mouse combination. Computer users suffering from Repetitive Strain Injury also seek an alternative to keyboard and mouse devices to reduce suffering in wrist and finger joints. Voice-control is an alternative approach to spreadsheet development and debugging that has been researched and used successfully in other domains. While voice-control technology for spreadsheets is available its effectiveness has not been investigated. This study is the first to compare the performance of a set of expert spreadsheet developers that debugged a spreadsheet using voice-control technology and another set that debugged the same spreadsheet using keyboard and mouse. The study showed that voice, despite its advantages, proved to be slower and less accurate. However, it also revealed ways in which the technology might be improved to redress this imbalance.

  16. Integrated Debugging of Modelica Models

    Directory of Open Access Journals (Sweden)

    Adrian Pop

    2014-04-01

    The high abstraction level of equation-based object-oriented (EOO) languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of those problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational and algorithmic code debugging in an integrated fashion.

  17. A Component-Based Debugging Approach for Detecting Structural Inconsistencies in Declarative Equation Based Models

    Institute of Scientific and Technical Information of China (English)

    Jian-Wan Ding; Li-Ping Chen; Fan-Li Zhou

    2006-01-01

    Object-oriented modeling with declarative equation based languages often unconsciously leads to structural inconsistencies. Component-based debugging is a new structural analysis approach that addresses this problem by analyzing the structure of each component in a model to separately locate faulty components. The analysis procedure is performed recursively based on the depth-first rule. It first generates fictitious equations for a component to establish a debugging environment, and then detects structural defects by using graph theoretical approaches to analyzing the structure of the system of equations resulting from the component. The proposed method can automatically locate components that cause the structural inconsistencies, and show the user detailed error messages. This information can be a great help in finding and localizing structural inconsistencies, and in some cases pinpoints them immediately.

  18. Query strategy for sequential ontology debugging

    CERN Document Server

    Shchekotykhina, Kostyantyn; Fleiss, Philipp; Rodler, Patrick

    2011-01-01

    Debugging of ontologies is an important prerequisite for their wide-spread application, especially in areas that rely upon everyday users to create and maintain knowledge bases, as in the case of the Semantic Web. Recent approaches use diagnosis methods to identify causes of inconsistent or incoherent ontologies. However, in most debugging scenarios these methods return many alternative diagnoses, thus placing the burden of fault localization on the user. This paper demonstrates how the target diagnosis can be identified by performing a sequence of observations, that is, by querying an oracle about entailments of the target ontology. We exploit a-priori probabilities of typical user errors to formulate information-theoretic concepts for query selection. Our evaluation showed that the proposed method significantly reduces the number of required queries compared to myopic strategies. We experimented with different probability distributions of user errors and different qualities of the a-priori probabilities. Ou...
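
    A hedged sketch of the information-theoretic query-selection idea, not the paper's exact formulation: each candidate query partitions the current diagnoses into those predicting a "yes" answer and the rest, and the query whose answer probabilities come closest to an even split (maximal expected information gain under the diagnosis priors) is asked first.

    ```python
    import math

    def entropy(p):
        return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    def select_query(diagnosis_priors, queries):
        """diagnosis_priors: diagnosis -> prior probability (summing to 1).
        queries: query -> set of diagnoses under which the oracle answers 'yes'."""
        best, best_gain = None, -1.0
        for q, yes_set in queries.items():
            p_yes = sum(p for d, p in diagnosis_priors.items() if d in yes_set)
            gain = entropy(p_yes)            # 1 bit when the split is perfectly even
            if gain > best_gain:
                best, best_gain = q, gain
        return best

    priors = {"D1": 0.5, "D2": 0.3, "D3": 0.2}
    queries = {"Q1": {"D1"}, "Q2": {"D1", "D2"}, "Q3": {"D3"}}
    print(select_query(priors, queries))     # Q1: splits the probability mass 0.5/0.5
    ```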

  19. An object-oriented extension for debugging the virtual machine

    Energy Technology Data Exchange (ETDEWEB)

    Pizzi, R.G. Jr. [California Univ., Davis, CA (United States)

    1994-12-01

    A computer is nothing more than a virtual machine programmed by source code to perform a task. The program's source code expresses abstract constructs which are compiled into some lower level target language. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low-level target implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design and data into the source code. We introduce OODIE, an object-oriented extension to programming languages that allows programmers to specify a virtual environment by describing the meaning of the design and data of a virtual machine. This specification is translated into symbolic information such that an augmented debugger can present engineers with a programmable debugging environment specifically tailored for the virtual machine that is to be debugged.

  20. Debugging expert systems using a dynamically created hypertext network

    Science.gov (United States)

    Boyle, Craig D. B.; Schuette, John F.

    1991-01-01

    The labor intensive nature of expert system writing and debugging motivated this study. The hypothesis is that a hypertext based debugging tool is easier and faster than one traditional tool, the graphical execution trace. HESDE (Hypertext Expert System Debugging Environment) uses Hypertext nodes and links to represent the objects and their relationships created during the execution of a rule based expert system. HESDE operates transparently on top of the CLIPS (C Language Integrated Production System) rule based system environment and is used during the knowledge base debugging process. During the execution process HESDE builds an execution trace. Use of facts, rules, and their values are automatically stored in a Hypertext network for each execution cycle. After the execution process, the knowledge engineer may access the Hypertext network and browse the network created. The network may be viewed in terms of rules, facts, and values. An experiment was conducted to compare HESDE with a graphical debugging environment. Subjects were given representative tasks. For speed and accuracy, in eight of the eleven tasks given to subjects, HESDE was significantly better.

  1. Debugging Data Transfers in CMS

    CERN Document Server

    Bagliesi, G; Bloom, K; Bockelman, B; Bonacorsi, D; Fisk, I; Flix, J; Hernandez, J; D'Hondt, J; Kadastik, M; Klem, J; Kodolova, O; Kuo, C M; Letts, J; Maes, J; Magini, N; Metson, S; Piedra, J; Pukhaeva, N; Tuura, L; Sonajalg, S; Wu, Y; Van Mulders, P; Villella, I; Wurthwein, F

    2010-01-01

    The CMS experiment at CERN is preparing for LHC data taking in several computing preparation activities. In early 2007 a traffic load generator infrastructure for distributed data transfer tests, called the LoadTest, was designed and deployed to equip the WLCG sites that support CMS with a means for debugging, load-testing and commissioning data transfer routes among CMS computing centres. The LoadTest is based upon PhEDEx as a reliable, scalable data set replication system. The Debugging Data Transfers (DDT) task force was created to coordinate the debugging of the data transfer links. The task force aimed to commission the most crucial transfer routes among CMS sites by designing and enforcing a clear procedure to debug problematic links. This procedure aimed to move a link from a debugging phase in a separate and independent environment to a production environment once a set of agreed conditions was achieved for that link. The goal was to deliver one by one working transfer routes to the CMS data operations team...

  2. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark.

    Science.gov (United States)

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-05-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today's data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG's simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.
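
    A generic illustration of record-level fault isolation in a map stage, not BigDebug's API or Spark code: each record is processed in isolation so that a failure pinpoints the exact crash-inducing record instead of aborting the whole job, and the remaining records still produce results.

    ```python
    def safe_map(fn, records):
        """Apply fn to every record, collecting crash culprits instead of aborting."""
        results, culprits = [], []
        for i, rec in enumerate(records):
            try:
                results.append(fn(rec))
            except Exception as exc:          # record-level isolation of the failure
                culprits.append((i, rec, exc))
        return results, culprits

    ok, bad = safe_map(int, ["3", "7", "oops", "11"])
    print(ok)    # [3, 7, 11]
    print(bad)   # [(2, 'oops', ValueError(...))]
    ```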

  3. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark

    Science.gov (United States)

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-01-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today’s data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG’s simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact. PMID:27390389

  4. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers affect significantly their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, network interconnect, such as Infiniband, may be exploited to maximize energy savings while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather and proposes energy saving strategies on the per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them apart from DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for the realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
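
    A conceptual sketch of the phase-based frequency-scaling idea described above, not the report's runtime system: the set_cpu_frequency helper is hypothetical (a real implementation might write to the Linux cpufreq interface or use CPU clock modulation), and a sleep stands in for an MPI collective such as allgather.

    ```python
    import time
    from contextlib import contextmanager

    LOW_FREQ_KHZ, HIGH_FREQ_KHZ = 1_200_000, 2_400_000

    def set_cpu_frequency(khz):
        # Hypothetical helper: print instead of touching cpufreq/clock modulation.
        print(f"[dvfs] frequency -> {khz} kHz")

    @contextmanager
    def communication_phase():
        set_cpu_frequency(LOW_FREQ_KHZ)       # cores mostly stall on the network here
        try:
            yield
        finally:
            set_cpu_frequency(HIGH_FREQ_KHZ)  # restore full speed for the compute phase

    with communication_phase():
        time.sleep(0.01)                      # stand-in for a collective, e.g. allgather
    ```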

  5. Hardware Support for Software Debugging

    Science.gov (United States)

    2011-05-01

    [Fragmentary text extracted from briefing slides.] Topics include the cost of software defects (citing a 2002 NIST study of their financial cost) and concurrency debugging with ReEnact. ReEnact leverages modified Thread-Level Speculation (TLS) hardware to create partial orderings of the threads in a multithreaded program using logical vector clocks; using these orderings, ReEnact is able to detect and often repair data race conditions in a multithreaded program.

  6. Lightweight and Statistical Techniques for Petascale PetaScale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    ... to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability and on the Dyninst binary analysis and instrumentation toolkits.

  7. Debugging Concurrent Software: Advances and Challenges

    Institute of Scientific and Technical Information of China (English)

    Jeff Huang; Charles Zhang

    2016-01-01

    Concurrency debugging is an extremely important yet challenging problem that has been hampering developer productivity and software reliability in the multicore era. We have worked on this problem for the past eight years and have developed several effective methods and automated tools for helping developers debug shared-memory concurrent programs. This article discusses challenges in concurrency debugging and summarizes our research contributions in four important directions: concurrency bug reproduction, detection, understanding, and fixing. It also discusses other recent advances in tackling these challenges.

  8. Automatic Information Processing and High Performance Skills

    Science.gov (United States)

    1992-10-01

    [Fragmentary report text; only pieces of the reference list and table of contents are recoverable.] References include proceedings of the Human Factors Society Twenty-Sixth Annual Meeting (pp. 10-14), Shiffrin (1988) on attention, Shiffrin and Dumais (1981) on the development of automatism, and an article in Learning, Memory, and Cognition. Table-of-contents entries include "Change and Skill Acquisition in Visual Search" and "Consistent Memory and Visual Search".

  9. Multi-purpose passive debugging for embedded wireless

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    Debugging embedded wireless systems can be cumbersome and hard due to low visibility. To ease the task of debugging we propose a multi-purpose passive debugging framework, called TinyDebug, for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...

  10. TUNE: Compiler-Directed Automatic Performance Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-18

    This project has developed compiler-directed performance tuning technology targeting the Cray XT4 Jaguar system at Oak Ridge, which has multi-core Opteron nodes with SSE-3 SIMD extensions, and the Cray XE6 Hopper system at NERSC. To achieve this goal, we combined compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation, which have been developed by the PIs over the past several years. We examined DOE Office of Science applications to identify performance bottlenecks and apply our system to computational kernels that operate on dense arrays. Our goal for this performance-tuning technology has been to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, we aim to make our technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  11. Performance of automatic scanning microscope for nuclear emulsion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Güler, A. Murat, E-mail: mguler@newton.physics.metu.edu.tr [Middle East Technical University, 06800 Ankara (Turkey); Altınok, Özgür [Middle East Technical University, 06800 Ankara (Turkey); Tufts University, Medford, MA 02155 (United States)

    2015-12-31

    The impressive improvements in scanning technology and methods allow nuclear emulsion to be used as a target in recent large experiments. We report the performance of an automatic scanning microscope for nuclear emulsion experiments. After successful calibration and alignment of the system, we have reached 99% tracking efficiency for minimum ionizing tracks that penetrate through the emulsion films. The automatic scanning system has been used successfully for the scanning of emulsion films in the OPERA experiment and is planned for use in the next generation of nuclear emulsion experiments.

  12. Debug automation from pre-silicon to post-silicon

    CERN Document Server

    Dehbashi, Mehdi

    2015-01-01

    This book describes automated debugging approaches for the bugs and the faults which appear in different abstraction levels of a hardware system. The authors employ a transaction-based debug approach to systems at the transaction-level, asserting the correct relation of transactions. The automated debug approach for design bugs finds the potential fault candidates at RTL and gate-level of a circuit. Debug techniques for logic bugs and synchronization bugs are demonstrated, enabling readers to localize the most difficult bugs. Debug automation for electrical faults (delay faults) finds the potentially failing speedpaths in a circuit at gate-level. The various debug approaches described achieve high diagnosis accuracy and reduce the debugging time, shortening the IC development cycle and increasing the productivity of designers. Describes a unified framework for debug automation used at both pre-silicon and post-silicon stages; Provides approaches for debug automation of a hardware system at different levels of ...

  13. Automatic or Deliberate? Cerebral correlates of automatic associations towards performance enhancing substances

    Directory of Open Access Journals (Sweden)

    Sebastian Schindler

    2015-12-01

    The direct assessment of explicit attitudes towards performance enhancing substances, for example neuroenhancement or doping in sports, can be affected by social desirability biases and cheating attempts. According to dual process theories of cognition, indirect measures like the Implicit Association Test (IAT) measure automatic associations towards a topic (as opposed to explicit attitudes measured by self-report measures). Such automatic associations are thought to occur rapidly and to evade voluntary control. However, whether or not such indirect tests actually reflect automatic associations is difficult to validate. Electroencephalography's superior time resolution makes it possible to differentiate between highly automatic and more elaborate processing stages. We therefore examined at which processing stages cortical differences between negative and positive attitudes to doping occur, and whether or not these differences can be related to BIAT scores. We tested 42 university students (31 females, 24.43 ± 3.17 years old), who were requested to complete a brief doping IAT (BIAT) on attitudes towards doping. Cerebral activity during doping BIAT completion was assessed using high-density EEG. Behaviorally, participants' D-scores exhibited negative attitudes towards doping, represented by faster reaction times in the doping + dislike pairing task. Event-related potentials (ERPs) revealed earliest effects between 200 and 300 ms. Here, a relatively larger occipital positivity was found for the doping + dislike pairing task. Further, in the LPP time range between 400 and 600 ms a larger late positive potential was found for the doping + dislike pairing task over central regions. These LPP amplitude differences successfully predicted participants' BIAT D-scores. Results indicate that event-related potentials differentiate between positive and negative doping attitudes at mid-latency stages. However, it seems that IAT scores can be predicted only by

  14. Debugging and Rectification of Electric Heating Boilers

    Institute of Scientific and Technical Information of China (English)

    GE Cheng-song

    2015-01-01

    The steam system of CRARL mainly provides steam for the dissolving system, and the steam is transported through pipes. The major equipment is a 150 kW steam electric heating boiler (FH-JZ-003), with rated evaporation 0.2 T/h and rated pressure 1.0 MPa. It was found during debugging

  15. Automatic performance tuning of parallel and accelerated seismic imaging kernels

    KAUST Repository

    Haberdar, Hakan

    2014-01-01

    With the increased complexity and diversity of mainstream high performance computing systems, significant effort is required to tune parallel applications in order to achieve the best possible performance for each particular platform. This task becomes more and more challenging and requires a larger set of skills. Automatic performance tuning is becoming a must for optimizing applications such as Reverse Time Migration (RTM), widely used in seismic imaging for oil and gas exploration. An empirical search based auto-tuning approach is applied to the MPI communication operations of the parallel isotropic and tilted transverse isotropic kernels. The application of auto-tuning using the Abstract Data and Communication Library improved the performance of the MPI communications as well as developer productivity by providing a higher level of abstraction. Keeping productivity in mind, we opted for pragma-based programming for accelerated computation on the latest accelerated architectures such as GPUs, using the fairly new OpenACC standard. The same auto-tuning approach is also applied to the OpenACC-accelerated seismic code for optimizing the compute intensive kernel of the Reverse Time Migration application. The application of such techniques resulted in an improved performance of the original code and its ability to adapt to different execution environments.
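
    A minimal empirical-search auto-tuning loop in the spirit described above, not the Abstract Data and Communication Library itself: every candidate parameter setting is timed on the target platform and the fastest configuration is kept (the kernel and its parameter space are invented for the example).

    ```python
    import itertools, time

    def kernel(block_size, use_builtin, n=200_000):
        # Stand-in compute kernel; every configuration computes the same sum,
        # but the blocking and summation strategy affect its runtime.
        total = 0
        for start in range(0, n, block_size):
            chunk = range(start, min(start + block_size, n))
            total += sum(chunk) if use_builtin else sum(i for i in chunk)
        return total

    def time_once(fn):
        t0 = time.perf_counter()
        fn()
        return time.perf_counter() - t0

    def autotune(param_space, repeats=3):
        best_cfg, best_t = None, float("inf")
        for values in itertools.product(*param_space.values()):
            cfg = dict(zip(param_space, values))
            t = min(time_once(lambda: kernel(**cfg)) for _ in range(repeats))
            if t < best_t:
                best_cfg, best_t = cfg, t
        return best_cfg, best_t

    space = {"block_size": [64, 256, 4096], "use_builtin": [False, True]}
    print(autotune(space))
    ```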

  16. Research and Practice on the Development of the Course of “Automatic Filling Line Debugging and Maintenance”

    Institute of Scientific and Technical Information of China (English)

    张赛昆; 赵堂春; 崔健

    2015-01-01

    “Automatic Filling Line Debugging and Maintenance” is a level-four pilot course in the graded-system reform of the mechatronics (mechanical and electrical integration) programme. The course was developed on the basis of a job analysis for mechatronics positions, and its teaching objectives were determined according to the graded-system standard. Based on the characteristics of the course's teaching carrier, the teaching content was organized into five projects corresponding to the respective job positions. The course is delivered in a student-centred way, using a teaching organization model that integrates teaching, learning, training, practice and assessment; it stays close to enterprises and achieves the "four unifications" with them; it adopts a task-driven, station-responsibility approach to better bring out the students' initiative; and the evaluation method introduces enterprise standards so that students can more quickly become competent in enterprise work.

  17. Debugging systems-on-chip communication-centric and abstraction-based techniques

    CERN Document Server

    Vermeulen, Bart

    2014-01-01

    This book describes an approach and supporting infrastructure to facilitate debugging the silicon implementation of a System-on-Chip (SOC), allowing its associated product to be introduced into the market more quickly. Readers learn step-by-step the key requirements for debugging a modern, silicon SOC implementation, nine factors that complicate this debugging task, and a new debug approach that addresses these requirements and complicating factors. The authors' novel communication-centric, scan-based, abstraction-based, run/stop-based (CSAR) debug approach is discussed in detail, showing how it helps to meet debug requirements and address the nine previously identified factors that complicate debugging silicon implementations of SOCs. The authors also derive the debug infrastructure requirements to support debugging of a silicon implementation of an SOC with their CSAR debug approach. This debug infrastructure consists of a generic on-chip debug architecture, a configurable automated design-for-debug ...

  18. Comparison of First Gear Performance for Manual and Automatic Transmissions

    Directory of Open Access Journals (Sweden)

    Kyle Stottlemyer

    2011-01-01

    The purpose of this project is to compare the first gear performance of an automobile for both its manual and automatic transmission options. Each transmission type has a different gear ratio, which yields a different acceleration curve for each transmission throughout the torque-rpm curve of the engine. The method of integral calculus was used to find an equation which could be used to solve for the time at any point in the car's acceleration. The automobile velocity versus time was then graphed to compare each transmission's acceleration trend. This process is similar to that which automotive companies may use when determining what type of transmission to pair with a particular vehicle. By observing the trends in the acceleration graphs, it was determined that there are specific advantages and disadvantages to each type of transmission. Which transmission is the “better” choice depends on what application the automobile will be used for (e.g. racing, day-to-day driving, towing/hauling).
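
    The integration step summarized above can be written out explicitly. The symbols below are generic (engine torque curve T, overall gear ratio g, wheel radius r, vehicle mass m), and aerodynamic drag and driveline losses are neglected, so this is only a sketch of the calculation, not the article's exact equation.

    ```latex
    % Tractive force in a fixed gear: F(v) = g*T(omega(v))/r, with engine speed
    % omega(v) = g*v/r.  The time to accelerate from v1 to v2 follows from a = dv/dt.
    \[
      a(v) = \frac{F(v)}{m} = \frac{g\,T\!\left(\frac{g v}{r}\right)}{m r},
      \qquad
      t(v_1 \to v_2) = \int_{v_1}^{v_2} \frac{dv}{a(v)}
                     = \int_{v_1}^{v_2} \frac{m r}{g\,T\!\left(\frac{g v}{r}\right)}\,dv .
    \]
    ```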

  19. Bifröst: debugging web applications as a whole

    NARCIS (Netherlands)

    Vlist, K.B. van der

    2013-01-01

    Even though web application development is supported by professional tooling, debugging support is lacking. If one starts to debug a web application, hardly any tooling support exists. Only the core components like server processes and a web browser are exposed. Developers need to manually weave ava

  20. Experiences in Parallel Debugging. Revision 1.3.

    Science.gov (United States)

    2007-11-02

    This document describes a prototype thread and lock debugging package that has been used for debugging parallelized Tera file system code. One of the...and as a result facilitate more thorough parallel testing prior to first shipment of the Tera operating system code.

  1. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminry Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two

  2. Debugging: Finding, Fixing and Flailing, a Multi-Institutional Study of Novice Debuggers

    Science.gov (United States)

    Fitzgerald, Sue; Lewandowski, Gary; McCauley, Renee; Murphy, Laurie; Simon, Beth; Thomas, Lynda; Zander, Carol

    2008-01-01

    Debugging is often difficult and frustrating for novices. Yet because students typically debug outside the classroom and often in isolation, instructors rarely have the opportunity to closely observe students while they debug. This paper describes the details of an exploratory study of the debugging skills and behaviors of contemporary novice Java…

  3. Improving Statistical Language Model Performance with Automatically Generated Word Hierarchies

    CERN Document Server

    McMahon, J; Mahon, John Mc

    1995-01-01

    An automatic word classification system has been designed which processes word unigram and bigram frequency statistics extracted from a corpus of natural language utterances. The system implements a binary top-down form of word clustering which employs an average class mutual information metric. Resulting classifications are hierarchical, allowing variable class granularity. Words are represented as structural tags --- unique $n$-bit numbers the most significant bit-patterns of which incorporate class information. Access to a structural tag immediately provides access to all classification levels for the corresponding word. The classification system has successfully revealed some of the structure of English, from the phonemic to the semantic level. The system has been compared --- directly and indirectly --- with other recent word classification systems. Class based interpolated language models have been constructed to exploit the extra information supplied by the classifications and some experiments have sho...

  4. Performance of Three-Arm Ac Automatic Voltage Regulator

    Directory of Open Access Journals (Sweden)

    T. Papinaidu

    2014-04-01

    In this paper the design and simulation of an automatic voltage regulator (AVR) is proposed. The AVR provides voltage buck and boost capability to eliminate power problems created by under-voltage or over-voltage fluctuations. It also protects against minor and severe spikes and surges that comprise over 80% of power problems. Overheating of components due to voltage swells is also avoided by using the AVR. The switching losses are also reduced, as only one arm among the three arms is operated at higher frequencies, depending on the mode of operation. Moreover, there is no need to use a large capacitor, and as a result the overall size of the converter is also reduced. Hence, the output voltage of the AVR can be maintained at the specified level, the cost of the AVR can be reduced, and the efficiency of the power converter can be improved.

  5. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, which is a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottleneck of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  6. Performance of data acceptance criteria over 50 months from an automatic real-time environmental radiation surveillance network

    Energy Technology Data Exchange (ETDEWEB)

    Casanovas, R., E-mail: ramon.casanovas@urv.cat [Unitat de Fisica Medica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Morant, J.J. [Servei de Proteccio Radiologica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Lopez, M. [Unitat de Fisica Medica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Servei de Proteccio Radiologica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Hernandez-Giron, I. [Unitat de Fisica Medica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Batalla, E. [Servei de Coordinacio d' Activitats Radioactives, Departament d' Economia i Finances, Generalitat de Catalunya, ES-08018 Barcelona (Spain); Salvado, M. [Unitat de Fisica Medica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain)

    2011-08-15

    The automatic real-time environmental radiation surveillance network of Catalonia (Spain) comprises two subnetworks: one with 9 aerosol monitors and the other with 8 Geiger monitors together with 2 water monitors located in the Ebre river. Since September 2006, several improvements have been implemented in order to obtain data of better quality and quantity, allowing a more accurate data analysis. However, several causes (natural causes, equipment failure, artificial external causes and incidents in nuclear power plants) may produce measured radiological values that do not match the station's own background, whether spurious values without significance or true radiological values. Thus, a data analysis over a 50-month period was performed and made it possible to establish an easily implementable statistical criterion for finding those values that require special attention. This criterion proved a very useful tool for creating a properly debugged database and for giving a quick response to equipment failures or possible radiological incidents. This paper presents the results obtained from applying the criterion, including the figures for the expected, raw and debugged data, the percentages of missing data grouped by cause, and radiological measurements from the networks. Finally, based on the discussed information, recommendations for the improvement of the network are identified to obtain better radiological information and analysis capabilities. - Highlights: > Causes producing data that mismatch the stations' own background are described. > Causes may be natural, equipment failure, external causes or nuclear plant incidents. > These causes can produce either spurious or true radiological data. > A criterion to find these data was implemented and tested over a 50-month period. > Recommendations for the improvement of the network are identified.
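
    An illustrative flagging rule only; the abstract does not spell out the criterion, so this assumes a simple test against each station's own background in which a reading receives special attention when it exceeds the rolling background mean by more than k standard deviations.

    ```python
    from statistics import mean, stdev

    def flag_readings(series, window=24, k=4.0):
        """series: chronological dose-rate readings from one station."""
        flagged = []
        for i in range(window, len(series)):
            background = series[i - window:i]          # the station's own recent history
            mu, sigma = mean(background), stdev(background)
            if sigma > 0 and series[i] > mu + k * sigma:
                flagged.append((i, series[i]))
        return flagged

    readings = [0.10, 0.11, 0.10, 0.12, 0.11, 0.10, 0.11, 0.10,
                0.12, 0.11, 0.10, 0.11, 0.10, 0.12, 0.11, 0.10,
                0.11, 0.10, 0.12, 0.11, 0.10, 0.11, 0.12, 0.10,
                0.45, 0.11]                            # one value needing attention
    print(flag_readings(readings))                     # [(24, 0.45)]
    ```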

  7. Complier-Directed Automatic Performance Tuning (TUNE) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Chame, Jacqueline [USC-ISI

    2013-06-07

    TUNE was created to develop compiler-directed performance tuning technology targeting the Cray XT4 system at Oak Ridge. TUNE combines compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation. The goal of this performance-tuning technology is to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, TUNE aims to make compiler technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  8. AutomaDeD: Automata-Based Debugging for Dissimilar Parallel Tasks

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Laguna, I; Bagchi, S; de Supinski, B R; Ahn, D; Schulz, M

    2010-03-23

    Today's largest systems have over 100,000 cores, with million-core systems expected over the next few years. This growing scale makes debugging the applications that run on them a daunting challenge. Few debugging tools perform well at this scale and most provide an overload of information about the entire job. Developers need tools that quickly direct them to the root cause of the problem. This paper presents AutomaDeD, a tool that identifies which tasks of a large-scale application first manifest a bug at a specific code region at a specific point during program execution. AutomaDeD creates a statistical model of the application's control-flow and timing behavior that organizes tasks into groups and identifies deviations from normal execution, thus significantly reducing debugging effort. In addition to a case study in which AutomaDeD locates a bug that occurred during development of MVAPICH, we evaluate AutomaDeD on a range of bugs injected into the NAS parallel benchmarks. Our results demonstrate that AutomaDeD detects the time period when a bug first manifested itself with 90% accuracy for stalls and hangs and 70% accuracy for interference faults. It identifies the subset of processes first affected by the fault with 80% and 70% accuracy, respectively, and the code region where the fault first manifested with 90% and 50% accuracy, respectively.
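
    A much-simplified illustration of the deviation-detection idea, not AutomaDeD's statistical model of control-flow and timing behavior: each task reports the time it spends in each code region, and a task is flagged when its time in some region deviates from the cross-task median by a large factor.

    ```python
    from statistics import median

    def find_outlier_tasks(region_times, factor=3.0):
        """region_times: region -> {task_id: seconds spent in that region}."""
        suspects = {}
        for region, times in region_times.items():
            med = median(times.values())
            for task, t in times.items():
                if med > 0 and (t > factor * med or t < med / factor):
                    suspects.setdefault(task, []).append(region)
        return suspects

    times = {
        "solver_loop": {0: 1.9, 1: 2.1, 2: 2.0, 3: 2.0},
        "MPI_Waitall": {0: 0.2, 1: 0.2, 2: 0.3, 3: 9.5},   # task 3 is stalled
    }
    print(find_outlier_tasks(times))   # {3: ['MPI_Waitall']}
    ```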

  9. REDIR: Automated Static Detection of Obfuscated Anti-Debugging Techniques

    Science.gov (United States)

    2014-03-27

    data can replicate non-debugging (C_nd) and debugging (C_d) conditions. Evaluation of C_nd and C_d creates Boolean values E_nd and E_d respectively. If ... the detection of α and β. First, “questioning” creates the sub-program C that provides for “elaboration” to create instrumented sub-programs C_nd and C_d ... are created. Then, “questioning” resumes by evaluating C_nd and C_d to create E_nd and E_d. Finally, “questioning” ... E_nd and E_d to determine an inequality

  10. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-nonsense look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our atten

  11. Monitoring the Performance of the Pedestrian Transfer Function of Train Stations Using Automatic Fare Collection Data

    NARCIS (Netherlands)

    Van den Heuvel, J.P.A.; Hoogenraad, J.H.

    2014-01-01

    Over the last years all train stations in The Netherlands have been equipped with automatic fare collection gates and/or validators. All public transport passengers use a smart card to pay their fare. In this paper we present a monitor for the performance of the pedestrian function of train stations

  12. Realization of rapid debugging for detection circuit of optical fiber gas sensor: Using an analog signal source

    Science.gov (United States)

    Tian, Changbin; Chang, Jun; Wang, Qiang; Wei, Wei; Zhu, Cunguang

    2015-03-01

    An optical fiber gas sensor mainly consists of two parts: the optical part and the detection circuit. In debugging the detection circuit, the optical part usually serves as the signal source. However, under debugging conditions the optical part can easily be influenced by many factors: fluctuations of the ambient temperature or the driving current result in instability of the wavelength and intensity of the laser; for a dual-beam sensor, different bends and stresses of the optical fiber lead to fluctuations of the intensity and phase; and intensity noise from the collimator, coupler, and other optical devices in the system also degrades the purity of the optical-part-based signal source. In order to dramatically improve the debugging efficiency of the detection circuit and shorten the period of research and development, this paper describes an analog signal source, consisting of a single chip microcomputer (SCM), an amplifier circuit, and a voltage-to-current conversion circuit. It can be used to realize rapid debugging of the detection circuit of the optical fiber gas sensor in place of the optical-part-based signal source. This analog signal source performs well and has many other advantages, such as simple operation, small size, and light weight.

  13. Debugging and Logging Services for Defence Service Oriented Architectures

    Science.gov (United States)

    2012-02-01

    [Fragmentary report text.] ... in a choreographed interaction, it must at least maintain correlation IDs or behave ... (footnote: we see tracing to be a function of logging more than ...) ... in which choreographed interactions can be debugged by examining the message queues and editing their content. It is the author's conjecture that

  14. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
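
    A hedged sketch of record-level backward tracing over logical provenance in the spirit of the system described above, not Panda's actual API: each processing node logs, for every output record, the input records it was derived from, and tracing walks those mappings backwards to the sources.

    ```python
    def backward_trace(record, node_order, provenance):
        """provenance: node -> {output_record: set of input_records it derives from}."""
        frontier = {record}
        for node in reversed(node_order):              # walk the workflow backwards
            mapping = provenance[node]
            frontier = set().union(*(mapping.get(r, {r}) for r in frontier))
        return frontier

    node_order = ["join", "aggregate"]
    provenance = {
        "join":      {"j1": {"a1", "b1"}, "j2": {"a2", "b1"}},
        "aggregate": {"out_avg": {"j1", "j2"}},
    }
    print(backward_trace("out_avg", node_order, provenance))  # {'a1', 'a2', 'b1'}
    ```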

  15. Prototype application for the control and debugging of CMS upgrade projects

    CERN Document Server

    Mills-Howell, Dominic

    2016-01-01

    Following the high-luminosity upgrades of the LHC, many subsystems of the CMS experiment require upgrading, and others are using the LHC shutdowns as an opportunity to improve performance. The upgrades themselves have highlighted the urgency of attacking problems that were previously unaddressed. One such problem is the need for a tool that allows users to easily monitor, debug, and test custom hardware. Such a tool could, in theory, be abstracted to work with various hardware devices, with the added benefits of being able to support future hardware and of maintaining parallel operation with the remaining control software.

  16. Dependence Analysis Based on Dynamic Slicing for Debugging

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Dynamic program slicing is an effective technique for narrowing the errors to the relevant parts of a program when debugging. Given a slicing criterion, the dynamic slice contains only those statements that actually affect the variables in the slicing criterion. This paper proposes a dynamic slicing method based on static dependence analysis. It uses the program dependence graph and other static information to reduce the information that needs to be traced during program execution. Thus, the efficiency is dramatically improved while the precision is not degraded. The slicing criterion is modified to fit debugging: it consists of the file name and the line number of the statement.
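
    The core of a slice computation is a backward traversal of the dependence graph from the slicing criterion. The following is a minimal sketch under that assumption; the toy graph and line numbers are invented and do not come from the tool described above.

```python
# Minimal sketch of a backward slice over a program dependence graph.
# Each statement (identified by line number) maps to the statements it depends
# on via data or control dependences; the graph below is a toy example.
from collections import deque

dependences = {
    1: set(),        # x = input()
    2: set(),        # y = input()
    3: {1},          # a = x * 2
    4: {2},          # b = y + 1
    5: {3, 4},       # z = a + b
    6: {5},          # print(z)
}

def backward_slice(criterion_line, dependences):
    """Return all statements that may affect the slicing criterion."""
    sliced, worklist = set(), deque([criterion_line])
    while worklist:
        line = worklist.popleft()
        if line in sliced:
            continue
        sliced.add(line)
        worklist.extend(dependences.get(line, ()))
    return sorted(sliced)

# Slicing criterion: (file name, line number); the file name is omitted here.
print(backward_slice(5, dependences))   # -> [1, 2, 3, 4, 5]
```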

  17. A debugging system for azimuthally acoustic logging tools based on modular and hierarchical design ideas

    Science.gov (United States)

    Zhang, K.; Ju, X. D.; Lu, J. Q.; Men, B. Y.

    2016-08-01

    On the basis of modular and hierarchical design ideas, this study presents a debugging system for an azimuthally sensitive acoustic bond tool (AABT). The debugging system includes three parts: a personal computer (PC), an embedded front-end machine, and function expansion boards. Modular and hierarchical design ideas are applied throughout the design and debugging processes. The PC communicates with the front-end machine via the Internet, and the front-end machine and the function expansion boards are connected by an extended parallel bus. In this way, the three parts of the debugging system achieve stable, high-speed data communication. This study introduces not only the system-level and sub-system-level debugging of the tool but also the debugging of the analogue signal processing board, which is important and widely used in logging tools. Experiments illustrate that the debugging system can greatly improve AABT verification and calibration efficiency and that board-level debugging can examine and improve analogue signal processing boards. The design approach is clear and the structure reasonable, making the debugging system easy to extend and upgrade.

  18. Rate of learning and asymptotic performance in an automatization task and the relation to reading.

    Science.gov (United States)

    Hecht, Rozalia; Crewther, David; Crewther, Sheila

    2004-12-01

    In the present study, direct evidence was sought linking cognitive automatic processing with reading in the general adult population. Reading speed was compared between single-task and dual-task performance. A total of 18 adults without dyslexia participated (7 men and 11 women, age M=25.3 yr., SD=2.7). Participants initially were trained in single-task mode on two types of tasks. The first was a central alphanumeric equation task (true or false), which comprised 3 subtests of increasing difficulty, ranging from an easily automated task to a varied and unpredictable mathematical operation. The second task was a peripheral pattern subitization task for which stimulus exposure time was related to performance. Finally, participants received dual-task training, which required simultaneous processing of both tasks. Slower reading speed was significantly related to rate of learning and speed of performance on predictable alphanumeric operations in dual-task conditions. There was no effect of reading speed on performance in the varied alphanumeric task. Faster readers were no better than slower readers on the pattern-subitization task. These findings suggest that faster readers automatized the predictable alphanumeric task more rapidly than slower readers and hence were better able to cope with the dual-task condition.

  19. Performance testing of the automatic transmission%自动变速器的性能检验

    Institute of Scientific and Technical Information of China (English)

    张琳琳

    2013-01-01

    The article systematically analyses the procedures and methods of performance testing of the automatic transmission from three aspects: inspection and replacement of the automatic transmission fluid, inspection and adjustment of the shift lever position, and the performance test of the automatic transmission.

  20. A Framework to Debug Diagnostic Matrices

    Science.gov (United States)

    Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann

    2013-01-01

    Diagnostics is an important concept in system health and the monitoring of space operations. Many existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also popularly known as a diagnostic dictionary, fault signature matrix, or reachability matrix) gleaned from physical models. However, this matrix is sometimes not sufficient to obtain high diagnostic performance. In such a case, it is important to modify the D-matrix based on knowledge obtained from other sources, such as time-series data streams (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure includes conditions ranging from modifying 0s and 1s in the matrix to adding or removing rows (failure sources) and columns (tests). We experiment with this framework on datasets from the DX Challenge 2009.
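
    For readers unfamiliar with D-matrices, the toy sketch below shows the basic idea: rows are failure sources, columns are tests, and a 1 means the test is expected to fail under that fault. It also shows the kind of single-entry correction a DME-style update might apply. The matrix, names and ranking rule are illustrative inventions, not the DME algorithm itself.

```python
# Toy sketch of diagnosis with a D-matrix (rows = failure sources,
# columns = tests; a 1 means the test is expected to fail under that fault).
import numpy as np

failure_sources = ["pump", "valve", "sensor"]
tests = ["t_pressure", "t_flow", "t_voltage"]
D = np.array([
    [1, 1, 0],   # pump fault trips pressure and flow tests
    [0, 1, 0],   # valve fault trips the flow test
    [0, 0, 1],   # sensor fault trips the voltage test
])

def rank_candidates(D, observed):
    """Rank failure sources by how well their signature matches observed test
    outcomes (1 = test failed, 0 = test passed)."""
    observed = np.asarray(observed)
    mismatches = np.abs(D - observed).sum(axis=1)
    return sorted(zip(failure_sources, mismatches), key=lambda p: p[1])

observed = [1, 1, 0]                      # pressure and flow tests failed
print(rank_candidates(D, observed))       # pump ranks first

# A DME-style correction: if maintenance data showed that the valve fault also
# trips the pressure test, flip the corresponding 0 to 1 and re-rank.
D[1, 0] = 1
print(rank_candidates(D, observed))
```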

  1. Insertion of coherence requests for debugging a multiprocessor

    Science.gov (United States)

    Blumrich, Matthias A.; Salapura, Valentina

    2010-02-23

    A method and system are disclosed to insert coherence events in a multiprocessor computer system, and to present those coherence events to the processors of the multiprocessor computer system for analysis and debugging purposes. The coherence events are inserted in the computer system by adding one or more special insert registers. By writing into the insert registers, coherence events are inserted in the multiprocessor system as if they were generated by the normal coherence protocol. Once these coherence events are processed, the processing of coherence events can continue in the normal operation mode.

  2. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    Directory of Open Access Journals (Sweden)

    Luo Hanwu

    2016-01-01

    This paper presents a novel method to solve for the initial lightning breakdown current by effectively combining the ATP and MATLAB simulation software, with the aim of evaluating the lightning protection performance of a transmission line. Firstly, the executable ATP simulation model is generated automatically according to the required information, such as power source parameters, tower parameters, overhead line parameters, grounding resistance and lightning current parameters, through an interface program coded in MATLAB. Then, data are extracted from the LIS files obtained by executing the ATP simulation model, and the occurrence of transmission line breakdown can be determined from the relevant data in the LIS file. The lightning current amplitude is reduced when breakdown occurs, and increased otherwise. Thus, the initial lightning breakdown current of a transmission line with given parameters can be determined accurately by continuously changing the lightning current amplitude, which is realized by a loop computing algorithm coded in MATLAB. The method proposed in this paper can generate the ATP simulation program automatically and facilitates the lightning protection performance assessment of transmission lines.
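
    The search loop described above can be pictured as a bisection on the lightning current amplitude: reduce the amplitude when breakdown occurs, increase it otherwise, until the critical value is bracketed. The sketch below illustrates that idea in Python rather than MATLAB; `run_atp_and_check_breakdown` is a hypothetical placeholder standing in for generating the ATP model, executing it and parsing the LIS file.

```python
# Illustrative bisection variant of the loop described above (Python, not the
# authors' MATLAB code). The simulation call is a hypothetical placeholder.

def run_atp_and_check_breakdown(amplitude_kA):
    # Placeholder: pretend the line flashes over above 28.5 kA.
    return amplitude_kA >= 28.5

def find_critical_current(lo_kA=1.0, hi_kA=200.0, tol_kA=0.1):
    """Shrink the amplitude interval until the smallest amplitude that causes
    breakdown is known to within tol_kA."""
    assert run_atp_and_check_breakdown(hi_kA), "upper bound must cause breakdown"
    while hi_kA - lo_kA > tol_kA:
        mid = 0.5 * (lo_kA + hi_kA)
        if run_atp_and_check_breakdown(mid):
            hi_kA = mid        # breakdown occurred: reduce the amplitude
        else:
            lo_kA = mid        # no breakdown: increase the amplitude
    return hi_kA

print(f"initial breakdown current ≈ {find_critical_current():.1f} kA")
```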

  3. Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2013-12-01

    This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson’s disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD used a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well with the visually assessed scores and were significantly different across Unified Parkinson’s Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, good ability to discriminate between healthy elderly subjects and patients in different disease stages, good sensitivity to treatment interventions, and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.
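
    The classification steps described above (dimension reduction with PCA followed by a logistic-regression classifier evaluated with 10-fold stratified cross-validation) can be sketched with standard scikit-learn components. The random arrays below are stand-ins for the 24 tapping parameters and the visually assessed GTS labels; this is not the authors' code.

```python
# Hedged sketch of the described scoring pipeline with surrogate random data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(105, 24))            # 105 subjects x 24 tapping parameters
y = rng.integers(0, 3, size=105)          # surrogate GTS scores (0-2)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=4),                  # four tapping dimensions
    LogisticRegression(max_iter=1000),
)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv)
print(f"mean CV accuracy: {scores.mean():.2f}")
```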

  4. Optical performance of a PDMS tunable lens with automatically controlled applied stress

    Science.gov (United States)

    Cruz-Felix, Angel S.; Santiago-Alvarado, Agustín.; Hernández-Méndez, Arturo; Reyes-Pérez, Emilio R.; Tepichín-Rodriguez, Eduardo

    2016-09-01

    Advances in the field of adaptive optics and in the fabrication of tunable optical components capable of automatically modifying their physical features are of great interest in areas like machine vision, imaging systems, ophthalmology, etc. Components such as tunable lenses are used to reduce the overall size of optical setups, as in small camera systems, and even to imitate some biological functions of the human eye. In this direction, in recent years we have been working on the development and fabrication of PDMS-made tunable lenses and on the design of special mechanical mounting systems to manipulate them. A PDMS-made tunable lens was previously designed by us, following the scheme reported by Navarro et al. in 1985, in order to mimic the accommodation process performed by the crystalline lens of the human eye. The design included a simulation of the application of radial stress onto the lens, and it was shown that the effective focal length was indeed changed. In this work we show the fabrication process of this particular tunable lens and an optimized mechanism that is able to automatically change the curvature of both surfaces of the lens through the application of controlled stress. We also show the results of a study and analysis of aberrations performed on the Solid Elastic Lens (SEL).

  5. Transducer-actuator systems and methods for performing on-machine measurements and automatic part alignment

    Science.gov (United States)

    Barkman, William E.; Dow, Thomas A.; Garrard, Kenneth P.; Marston, Zachary

    2016-07-12

    Systems and methods for performing on-machine measurements and automatic part alignment, including: a measurement component operable for determining the position of a part on a machine; and an actuation component operable for adjusting the position of the part by contacting the part with a predetermined force responsive to the determined position of the part. The measurement component consists of a transducer. The actuation component consists of a linear actuator. Optionally, the measurement component and the actuation component consist of a single linear actuator operable for contacting the part with a first lighter force for determining the position of the part and with a second harder force for adjusting the position of the part. The actuation component is utilized in a substantially horizontal configuration and the effects of gravitational drop of the part are accounted for in the force applied and the timing of the contact.

  6. CCD:An Integrated C Coding and Debugging Tool

    Institute of Scientific and Technical Information of China (English)

    金立群

    1993-01-01

    CCD is an integrated software tool intended to support coding and debugging for the C language. It integrates a hybrid editor, an incremental semantic analyzer, a multi-entry parser, an incremental unparser and a source-level debugger into a single tool. The integration is realized by sharing common knowledge among all the components of the system and by task-oriented combination of the components. A nonlocal attribute grammar is adopted for specifying the common knowledge about the syntax and semantics of the C language. Incremental attribute evaluation is used to implement the semantic analyzer and the unparser to increase system efficiency. CCD handles preprocessor directives and comments in a regular way to keep it practical.

  7. Data Provenance as a Tool for Debugging Hydrological Models based on Python

    Science.gov (United States)

    Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.

    2012-12-01

    There is an increase in the data volume used in hydrological modeling. The increasing data volume requires additional effort in debugging models, since a single output value is influenced by a multitude of input values, and it is difficult to keep an overview of the data dependencies. Further, even knowing these dependencies, it is a tedious job to infer all the relevant data values. These data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph without executing the script. To debug the model, the user specifies the value of interest in space and time; the tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and used more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consume around 40 GB of memory. Using the proposed tool, a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, the tool efficiently provides the modeler with the information needed to investigate the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. In case a Python script contains an unknown function or class, the tool requests additional information about the function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R…
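
    A much simplified version of the underlying idea (inferring data dependencies from a Python script without executing it) can be built on the standard ast module, as sketched below. The real tool also tracks individual raster cells and time steps; this toy only records which names each assigned variable was computed from, and the sample script and `read_map` call are invented.

```python
# Simplified sketch of static, variable-level dependency extraction from a
# Python script, using the standard ast module. Not the described tool itself.
import ast
from collections import defaultdict

source = """
precip = read_map('precipitation.map')
et     = read_map('evapotranspiration.map')
runoff = precip - et
demand = runoff * 0.3
"""

class DependencyCollector(ast.NodeVisitor):
    def __init__(self):
        self.deps = defaultdict(set)   # variable -> names it was computed from

    def visit_Assign(self, node):
        used = {n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)}
        for target in node.targets:
            if isinstance(target, ast.Name):
                self.deps[target.id] |= used
        self.generic_visit(node)

collector = DependencyCollector()
collector.visit(ast.parse(source))

def provenance(var, deps, seen=None):
    """Transitively expand the names a variable depends on."""
    seen = set() if seen is None else seen
    for d in deps.get(var, ()):
        if d not in seen:
            seen.add(d)
            provenance(d, deps, seen)
    return seen

print(sorted(provenance("demand", collector.deps)))
# -> ['et', 'precip', 'read_map', 'runoff']
```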

  8. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    Science.gov (United States)

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A(2) SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.

  9. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.
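
    For orientation, the sketch below shows generic SIFT-based tie-point extraction with the SIFT implementation shipped in OpenCV (version 4.4 or later), using Lowe's ratio test to keep reliable matches. It is not the auto-adaptive A2-SIFT variant evaluated in the two records above, and the image file names are placeholders.

```python
# Minimal SIFT tie-point extraction with OpenCV; file names are placeholders.
import cv2

img1 = cv2.imread("aerial_left.tif", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("aerial_right.tif", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
candidates = matcher.knnMatch(des1, des2, k=2)

# Lowe's ratio test: accept a match only if it is clearly better than the
# second-best candidate.
tie_points = [
    (kp1[m.queryIdx].pt, kp2[m.trainIdx].pt)
    for m, n in candidates if m.distance < 0.8 * n.distance
]
print(f"{len(tie_points)} candidate tie points")
```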

  10. A HYBRID METHOD FOR AUTOMATIC SPEECH RECOGNITION PERFORMANCE IMPROVEMENT IN REAL WORLD NOISY ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Urmila Shrawankar

    2013-01-01

    It is a well-known fact that speech recognition systems perform well when used in conditions similar to those used to train the acoustic models; mismatches degrade the performance. In an adverse environment, it is very difficult to predict the category of noise in advance in the case of real-world environmental noise, and difficult to achieve environmental robustness. After a rigorous experimental study, it is observed that no single method is available that will clean noisy speech corrupted by real, natural environmental (mixed) noise while preserving its quality. It is also observed that back-end techniques alone are not sufficient to improve the performance of a speech recognition system; it is necessary to implement performance improvement techniques at every step of the back-end as well as the front-end of the Automatic Speech Recognition (ASR) model. Current recognition systems address this problem using a technique called adaptation. This study presents an experimental investigation with two aims. The first is to implement a hybrid method that clarifies the speech signal as much as possible using combinations of filters and enhancement techniques. The second is to develop a method for training all categories of noise that can adapt the acoustic models to a new environment, helping to improve the performance of the speech recognizer under real-world environmental mismatched conditions. The experiments confirm that hybrid adaptation methods improve ASR performance on both levels: Signal-to-Noise Ratio (SNR) improvement as well as word recognition accuracy in real-world noisy environments.

  11. Automatization and Basic Fact Performance of Normal and Learning Disabled Children. Technical Report # 10.

    Science.gov (United States)

    Garnett, Katherine; Fleischner, Jeannette E.

    The relationship between automatization ability (the tendency for repetitive routine aspects of behavior to become so overlearned that a minimum of conscious effort and attention is necessary for rapid efficient execution), as measured by the Rapid Automatic Naming (RAN) Test, and proficiency in arithmetic basic fact computation was investigated…

  12. Automatization and Basic Fact Performance of Normal and Learning Disabled Children.

    Science.gov (United States)

    Garnett, Katherine; Fleischner, Jeannette E.

    1983-01-01

    The relationship between automatization ability, as measured by the "Rapid Automatic Naming Test" (RAN), and proficiency in arithmetic basic fact computation was investigated with 120 learning disabled (LD) and 120 nondisabled children between 8 and 13 years of age. (Author/SW)

  13. Productive performance of Nile tilapia (Oreochromis niloticus fed at different frequencies and periods with automatic dispenser

    Directory of Open Access Journals (Sweden)

    R.M.R. Sousa

    2012-02-01

    The performance of Nile tilapia (Oreochromis niloticus) raised in cages furnished with an automatic dispenser, supplied at different frequencies (once per hour and once every two hours) and periods (daytime, nighttime and both), was evaluated. Eighteen 1.0 m³ cages were placed in a 2000 m² pond, two meters deep with a 5% water exchange. One hundred and seventy tilapias, with an initial weight of 16.0±4.9 g, were dispersed into each 1 m³ cage, and the feed ration was adjusted every 21 days with biometry. Data were collected from March to July (autumn and winter). A significant difference in final weight (P<0.05) among treatments was observed. The increase in feeding frequency improved the productive performance of Nile tilapias in cages and permitted better management of the food. The better feed conversion rate for high feeding frequency (24 times day-1) can result in savings of up to 360 kg of food for each ton of fish produced, increasing the economic sustainability of tilapia culture and suggesting less environmental pollution.

  14. Using automatic calibration method for optimizing the performance of Pedotransfer functions of saturated hydraulic conductivity

    Directory of Open Access Journals (Sweden)

    Ahmed M. Abdelbaki

    2016-06-01

    Pedotransfer functions (PTFs) are an easy way to predict saturated hydraulic conductivity (Ksat) without measurements. This study aims to auto-calibrate 22 PTFs. The PTFs were divided into three groups according to their input requirements, and the shuffled complex evolution algorithm was used for calibration. The results showed a great improvement in the performance of the functions compared with the originally published functions. For group 1 PTFs, the geometric mean error ratio (GMER) and the geometric standard deviation of the error ratio (GSDER) values were modified from the ranges (1.27–6.09) and (5.2–7.01) to (0.91–1.15) and (4.88–5.85), respectively. For group 2 PTFs, the GMER and GSDER values were modified from (0.3–1.55) and (5.9–12.38) to (1.00–1.03) and (5.5–5.9), respectively. For group 3 PTFs, the GMER and GSDER values were modified from (0.11–2.06) and (5.55–16.42) to (0.82–1.01) and (5.1–6.17), respectively. The results showed that automatic calibration is an efficient and accurate method to enhance the performance of the PTFs.
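
    The two evaluation statistics quoted above can be computed as sketched below, assuming their usual definitions: GMER is the geometric mean of the ratio of predicted to measured Ksat, and GSDER the geometric standard deviation of that ratio; values near 1 indicate an unbiased PTF with little spread. The Ksat numbers are made up for illustration.

```python
# Sketch of GMER and GSDER under their standard definitions (assumption).
import numpy as np

def gmer_gsder(k_predicted, k_measured):
    log_ratio = np.log(np.asarray(k_predicted) / np.asarray(k_measured))
    gmer = np.exp(log_ratio.mean())
    gsder = np.exp(log_ratio.std(ddof=1))
    return gmer, gsder

# Example with made-up Ksat values (cm/day).
k_meas = [12.0, 35.0, 80.0, 150.0]
k_pred = [10.0, 40.0, 95.0, 120.0]
print("GMER = %.2f, GSDER = %.2f" % gmer_gsder(k_pred, k_meas))
```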

  15. Investigation of measureable parameters that correlate with automatic target recognition performance in synthetic aperture sonar

    Science.gov (United States)

    Gazagnaire, Julia; Cobb, J. T.; Isaacs, Jason

    2015-05-01

    There is a desire in the Mine Counter Measure community to develop a systematic method to predict and/or estimate the performance of Automatic Target Recognition (ATR) algorithms that are detecting and classifying mine-like objects within sonar data. Ideally, parameters exist that can be measured directly from the sonar data that correlate with ATR performance. In this effort, two metrics were analyzed for their predictive potential using high frequency synthetic aperture sonar (SAS) images. The first parameter is a measure of contrast. It is essentially the variance in pixel intensity over a fixed partition of relatively small size. An analysis was performed to determine the optimum block size for this contrast calculation. These blocks were then overlapped in the horizontal and vertical direction over the entire image. The second parameter is the one-dimensional K-shape parameter. The K-distribution is commonly used to describe sonar backscatter return from range cells that contain a finite number of scatterers. An Ada-Boosted Decision Tree classifier was used to calculate the probability of classification (Pc) and false alarm rate (FAR) for several types of targets in SAS images from three different data sets. ROC curves as a function of the measured parameters were generated and the correlation between the measured parameters in the vicinity of each of the contacts and the ATR performance was investigated. The contrast and K-shape parameters were considered separately. Additionally, the contrast and K-shape parameter were associated with background texture types using previously labeled high frequency SAS images.
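
    The first metric described above, pixel-intensity variance over small overlapping blocks, is straightforward to compute; a sketch follows. The block size, step and the surrogate Rayleigh-distributed image are illustrative choices, not the values used in the study.

```python
# Sketch of a block-wise contrast (local variance) map for a sonar image.
import numpy as np

def block_contrast(image, block=16, step=8):
    """Return a map of pixel-intensity variance over overlapping blocks."""
    rows = range(0, image.shape[0] - block + 1, step)
    cols = range(0, image.shape[1] - block + 1, step)
    return np.array([
        [image[r:r + block, c:c + block].var() for c in cols]
        for r in rows
    ])

rng = np.random.default_rng(1)
sas_image = rng.rayleigh(scale=1.0, size=(128, 256))   # surrogate SAS snippet
contrast_map = block_contrast(sas_image)
print(contrast_map.shape, contrast_map.mean())
```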

  16. Validation of a novel automatic sleep spindle detector with high performance during sleep in middle aged subjects

    DEFF Research Database (Denmark)

    Wendt, Sabrina Lyngbye; Christensen, Julie A. E.; Kempfner, Jacob

    2012-01-01

    Many of the automatic sleep spindle detectors currently used to analyze sleep EEG are either validated on young subjects or not validated thoroughly. The purpose of this study is to develop and validate a fast and reliable sleep spindle detector with high performance in middle aged subjects. An a...

  17. Automatic PID Control Loops Design for Performance Improvement of Cryogenic Turboexpander

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.; Shah, D. K.

    2015-04-01

    The field of cryogenics involves temperatures below 123 K, much lower than ambient temperature. Many industrially important physical processes, from fulfilling the needs of national thermonuclear fusion programs and superconducting magnets to the treatment of cutting tools and the preservation of blood cells, require extremely low temperatures. The low temperature required for the liquefaction of common gases can be obtained by several processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. A helium liquefier is used for the liquefaction of helium gas. In general, a Helium Refrigerator/Liquefier (HRL) needs a turboexpander as the expansion machine to produce the cooling effect that is used for the production of liquid helium. Turboexpanders, high-speed devices supported on gas bearings, are the most critical components in many helium refrigeration systems. A very minor fault in operation or manufacturing, or impurities in the helium gas, can destroy the turboexpander. However, since the performance of expanders depends on a number of operating parameters and the relations between them are quite complex, the instrumentation and control system design for the turboexpander needs special attention. The inefficiency of manual control leads to the need to design automatic control loops for the turboexpander. Proper design and implementation of the control loops play an important role in the successful operation of a cryogenic turboexpander. The PID control loops have to be implemented with accurate interlocks and logic to enhance the performance of the cryogenic turboexpander. For different normal and off-normal operations, speeds will be different, and hence a proper control method for critical rotational speed avoidance is a must. This paper presents the design of the PID control loops needed for the…
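
    As a generic illustration of the kind of loop discussed above, the sketch below runs a discrete PID speed controller against a toy first-order plant. The gains, limits and plant model are invented; a real turboexpander controller would additionally implement the interlocks and critical-speed avoidance logic mentioned in the abstract.

```python
# Generic discrete PID speed loop on a toy first-order plant (illustrative only).
class PID:
    def __init__(self, kp, ki, kd, dt, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        if self.out_min < out < self.out_max:          # simple anti-windup:
            self.integral += error * self.dt           # integrate only when unsaturated
            out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return min(max(out, self.out_min), self.out_max)

# Toy plant: actuator opening u drives the speed toward 4000 * u (arbitrary units).
pid = PID(kp=0.002, ki=0.02, kd=0.0, dt=0.01)
speed = 0.0
for _ in range(2000):                                  # simulate 20 s
    u = pid.step(setpoint=2000.0, measurement=speed)
    speed += (4000.0 * u - speed) * pid.dt             # first-order lag response
print(f"final speed = {speed:.0f} (target 2000)")
```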

  18. Distribution transformer with automatic voltage adjustment - performance; Transformador de distribucion con ajuste automatico de tension - desempeno

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Ruiz, Gustavo A.; Delgadillo Bocanegra, Alfonso; Betancourt Ramirez, Enrique [PROLEC-GE, Apodaca, Nuevo Leon (Mexico)]. E-mail: gustavo1.hernandez@ge.com; alfonso.delgadillobocanegra@ge.com; enrique.betancourt@ge.com; Ramirez Arredondo, Juan M. [CINVESTAV-Guadalajara, Zapopan, Jalisco (Mexico)]. E-mail: jramirez@gdl.cinvestav.mx

    2010-11-15

    In electric power distribution systems, power quality is strongly linked with the stability of the service voltage. In radial systems, it is virtually impossible to maintain a flat voltage along the lines, so it is desirable to have transformers that can automatically adjust the turns ratio. In this work, the development and performance of a transformer with an integrated electronic tap changer are described; the changer allows the turns ratio to be varied within the standard range of +/-5%, and the application limits of the technology are identified.

  19. Smooth pursuit detection in binocular eye-tracking data with automatic video-based performance evaluation.

    Science.gov (United States)

    Larsson, Linnéa; Nyström, Marcus; Ardö, Håkan; Åström, Kalle; Stridh, Martin

    2016-12-01

    An increasing number of researchers record binocular eye-tracking signals from participants viewing moving stimuli, but the majority of event-detection algorithms are monocular and do not consider smooth pursuit movements. The purposes of the present study are to develop an algorithm that discriminates between fixations and smooth pursuit movements in binocular eye-tracking signals and to evaluate its performance using an automated video-based strategy. The proposed algorithm uses a clustering approach that takes both spatial and temporal aspects of the binocular eye-tracking signal into account, and is evaluated using a novel video-based evaluation strategy based on automatically detected moving objects in the video stimuli. The binocular algorithm detects 98% of fixations in image stimuli compared to 95% when only one eye is used, while for video stimuli, both the binocular and monocular algorithms detect around 40% of smooth pursuit movements. The present article shows that using binocular information for discrimination of fixations and smooth pursuit movements is advantageous in static stimuli, without impairing the algorithm's ability to detect smooth pursuit movements in video and moving-dot stimuli. With an automated evaluation strategy, time-consuming manual annotations are avoided and a larger amount of data can be used in the evaluation process.

  20. An Imperfect-debugging Fault-detection Dependent-parameter Software

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Software reliability growth models (SRGMs) incorporating imperfect debugging and the learning phenomenon of developers have recently been developed by many researchers to estimate software reliability measures such as the number of remaining faults and software reliability. However, the model parameters of the fault content rate function and the fault detection rate function of the SRGMs are often considered to be independent of each other. In practice, this assumption may not hold, and it is worth investigating what happens if it does not. In this paper, we undertake such a study and propose a software reliability model connecting imperfect debugging and the learning phenomenon through a common parameter of the two functions, called the imperfect-debugging fault-detection dependent-parameter model. Software testing data collected from real applications are utilized to illustrate the proposed model's descriptive and predictive power by determining the non-zero initial debugging process.

  1. Nonresident and Endangered Variables: The Effects of Code Generation Optimizations on Symbolic Debugging

    Science.gov (United States)

    1992-12-01

    …affect debugging by making variables inaccessible at a breakpoint. By attempting to pack as many variables as possible into a limited number of… However, C is nowadays a widely used programming language, and the practice of debugging has evolved to include asynchronous breakpoints (e.g., for data…) …the number of breakpoints with suspect variables to approximately the number of breakpoints with out-of-order function calls, which is the limit on how…

  2. An interactive debugging system composed of a minicomputer and a microprocessor

    OpenAIRE

    Okada, Kenichi; 松尾, 泰樹; 北川, 節

    1980-01-01

    This paper describes a debugging system developed on a combined system of a minicomputer and a microprocessor that is microprogrammable by the user; it aims at effective debugging of the errors detected during execution of programs written in a low-level language. The multiprocessor organization, the adoption of a firmware monitor and special hardware yield such advantageous features as bilateral tracing, procedure extraction, eight kinds of event monit…

  3. General collaboration offer of Johnson Controls regarding the performance of air conditioning automatic control systems and other buildings` automatic control systems

    Energy Technology Data Exchange (ETDEWEB)

    Gniazdowski, J.

    1995-12-31

    JOHNSON CONTROLS manufactures measuring and control equipment (800 types) and is also a "turn-key" supplier of complete automatic control systems for the heating, air conditioning, ventilation and refrigerating engineering branches. The Company also supplies Buildings' Computer-Based Supervision and Monitoring Systems that may be applied in both small and large structures. Since 1990 the company has been performing full-range trade and contracting activities on the Polish market. We have our own well-trained technical staff, and we collaborate with a number of design and contracting enterprises that enable us to have our projects carried out all over Poland. The prices of our supplies and services correspond with the level of the Polish market.

  4. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretical and experimental study of the effects of an automatic shading device on the energetic performance of a dimmable lighting system and cooling equipment. Some equations related to the optical and thermal properties of fenestration are rebuilt, while others are created, under a theoretical approach. In order to collect field data, the energy demand (and other variables) was measured in two distinct stories, with the same fenestration features, of the Test Tower. New data were gathered after adding an automatic shading device to the window of one story. The comparison of the collected data allows the energetic performance evaluation of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  5. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  6. Improving the working performance of automatic ball balancer by modifying its mechanism

    Science.gov (United States)

    Rezaee, Mousa; Fathi, Reza

    2015-12-01

    An automatic ball balancer consists of several balls that are free to move in the circular race containing a damping fluid. Although a traditional ABB can improve the vibration behavior of an unbalanced rotor under proper working conditions, at speeds below the first critical speed, it makes the vibration amplitude of the rotor larger than that of a rotor without an automatic ball balancer. Moreover, it has a limited stable region of the perfect balancing configuration. Considering these deficiencies, in this study a new design for automatic ball balancer is proposed. To analyze the problem, the nonlinear governing equations of a rotor equipped with the new ABB are derived using Lagrange's equations. Then, stability analysis is carried out on the basis of linearized equations of motion around the equilibrium positions to obtain the stable region of the system. It is shown that the new ABB can prevent the rotor from increasing the vibrations at the transient state. Also, it increases the stable region of the perfect balancing configuration. Comparing the results with those corresponding to the traditional ball balancer shows that the new ABB can reduce the vibration amplitude at speeds below the first critical speed and it increases the stable region of the perfect balancing configuration.

  7. Compile-Time Debugging of C Programs Working on Trees

    DEFF Research Database (Denmark)

    Elgaard, Jacob; Møller, Anders; Schwartzbach, Michael I.

    2000-01-01

    We exhibit a technique for automatically verifying the safety of simple C programs working on tree-shaped data structures. We do not consider the complete behavior of programs, but only attempt to verify that they respect the shape and integrity of the store. A verified program is guaranteed to preserve the tree-shapes of data structures, to avoid pointer errors such as NULL dereferences, leaking memory, and dangling references, and furthermore to satisfy assertions specified in a specialized store logic. A program is transformed into a single formula in WSRT, an extension of WS2S that is decided… If the program fails to verify, an initial store that leads to an error is automatically generated. This extends previous work that uses a similar technique to verify a simpler syntax manipulating only list structures. In that case, programs are translated into WS1S formulas; a naive generalization to recursive data-types determines…

  8. An Approach to automatically optimize the Hydraulic performance of Blade System for Hydraulic Machines using Multi-objective Genetic Algorithm

    Science.gov (United States)

    Lai, Xide; Chen, Xiaoming; Zhang, Xiang; Lei, Mingchuan

    2016-11-01

    This paper presents an approach to the automatic hydraulic optimization of a hydraulic machine's blade system, combining a blade geometric modeller and parametric generator with an automatic CFD solution procedure and a multi-objective genetic algorithm. In order to evaluate a large number of design options and quickly estimate the blade system's hydraulic performance, an approximate model that can substitute for the original model inside the optimization loop is employed, using function approximation. As the approximate model is constructed from database samples containing a set of blade geometries and their resulting hydraulic performances, it can correctly imitate the real blade's performance as predicted by the original model. As hydraulic machine designers are accustomed to designing with 2D blade profiles on stream surfaces that are then stacked into a 3D blade geometric model in the form of NURBS surfaces, the geometric variables to be optimized were defined by a series of profiles on stream surfaces. The approach depends on the cooperation between a genetic algorithm, a database, and user-defined objective functions and constraints, which comprise hydraulic performance, structural and geometric constraint functions. An example covering the optimization design of a mixed-flow pump impeller is presented.
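
    The surrogate idea described above (an approximate model fitted to a database of previously evaluated designs replaces the expensive CFD solver inside the search loop) is sketched below in a deliberately toy form: two design variables, a quadratic stand-in for CFD, a least-squares surrogate and a simple (1+lambda) evolution step instead of the authors' NURBS parameterization and multi-objective GA.

```python
# Conceptual sketch of surrogate-assisted optimization (all pieces are stand-ins).
import numpy as np

rng = np.random.default_rng(0)

def expensive_cfd(x):
    """Pretend CFD evaluation of a 2-variable blade parameter vector."""
    return -((x[0] - 0.3) ** 2 + (x[1] + 0.2) ** 2)   # efficiency-like score

# 1) Database of already-simulated designs.
X_db = rng.uniform(-1, 1, size=(60, 2))
y_db = np.array([expensive_cfd(x) for x in X_db])

# 2) Fit an approximate model (quadratic least squares) to the database.
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(features(X_db), y_db, rcond=None)
surrogate = lambda X: features(np.atleast_2d(X)) @ coef

# 3) Cheap evolutionary search driven by the surrogate only.
best = X_db[np.argmax(y_db)]
for _ in range(200):
    offspring = best + rng.normal(scale=0.1, size=(20, 2))
    scores = surrogate(offspring)
    if scores.max() > surrogate(best)[0]:
        best = offspring[np.argmax(scores)]

print("surrogate optimum:", best, "true score:", expensive_cfd(best))
```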

  9. Modern multithreading implementing, testing, and debugging multithreaded Java and C++/Pthreads/Win32 programs

    CERN Document Server

    Carver, Richard H

    2005-01-01

    Master the essentials of concurrent programming, including testing and debugging. This textbook examines languages and libraries for multithreaded programming. Readers learn how to create threads in Java and C++, and develop essential concurrent programming and problem-solving skills. Moreover, the textbook sets itself apart from other comparable works by helping readers to become proficient in key testing and debugging techniques. Among the topics covered, readers are introduced to the relevant aspects of Java, the POSIX Pthreads library, and the Windows Win32 Applications Programming Interface.

  10. Parallelization of the AliRoot event reconstruction by performing a semi- automatic source-code transformation

    CERN Document Server

    CERN. Geneva

    2012-01-01

    …side bus or processor interconnections. Parallelism can only result in performance gains if memory usage is optimized, memory locality improved and the communication between threads minimized. But the domain of concurrent programming has become a field for highly skilled experts, as the implementation of multithreading is difficult, error prone and labor intensive. A full re-implementation for parallel execution of existing offline frameworks, like AliRoot in ALICE, is thus unaffordable. An alternative method is to use a semi-automatic source-to-source transformation to obtain a simple parallel design with almost no interference between threads. This reduces the need of rewriting the develop…

  11. Xenus AC Drives’Debugging and Application%Xenus交流驱动器的调试及应用

    Institute of Scientific and Technical Information of China (English)

    翟昭斌

    2014-01-01

    With the wide application of AC motors in automatic control systems, a wide variety of related AC drives are available for different control fields. The AC drives produced by Copley are full-featured, easy to use, simple to debug, easy to control and widely applicable. This article mainly introduces the characteristics and functions of the Xenus drive and its concrete application in a servo system.

  12. Stair descending exercise using a novel automatic escalator: effects on muscle performance and health-related parameters.

    Science.gov (United States)

    Paschalis, Vassilis; Theodorou, Anastasios A; Panayiotou, George; Kyparos, Antonios; Patikas, Dimitrios; Grivas, Gerasimos V; Nikolaidis, Michalis G; Vrabas, Ioannis S

    2013-01-01

    A novel automatic escalator was designed, constructed and used in the present investigation. The aim of the investigation was to compare the effect of two repeated sessions of stair descending versus stair ascending exercise on muscle performance and health-related parameters in young healthy men. Twenty males participated and were randomly divided into two equal-sized groups: a stair descending group (muscle-damaging group) and a stair ascending group (non-muscle-damaging group). Each group performed two sessions of stair descending or stair ascending exercise on the automatic escalator, with a three-week period elapsing between the two exercise sessions. Indices of muscle function, insulin sensitivity, blood lipid profile and redox status were assessed before and immediately after, as well as at day 2 and day 4 after, both exercise sessions. It was found that the first bout of stair descending exercise caused muscle damage, induced insulin resistance and oxidative stress, and positively affected the blood lipid profile. However, after the second bout of stair descending exercise the alterations in all parameters were diminished or abolished. On the other hand, the stair ascending exercise induced only minor effects on muscle function and health-related parameters after both exercise bouts. The results of the present investigation indicate that stair descending exercise seems to be a promising form of exercise that can provoke positive effects on blood lipid profile and antioxidant status.

  13. Feeding patterns and performance of cows in controlled cow traffic in automatic milking systems.

    Science.gov (United States)

    Melin, M; Svennersten-Sjaunja, K; Wiktorsson, H

    2005-11-01

    Two groups of dairy cows monitored from 3 to 19 wk postpartum were subjected to 2 different cow traffic routines in an automatic milking system with control gates and an open waiting area. Using different time settings in the control gates, the groups of cows were separated by average milking frequency; cows in the high milking frequency routine had a minimum of 4 h between milkings (MF4) and were milked 3.2 +/- 0.1 times daily, whereas cows in the low milking frequency routine had at least 8 h between milkings (MF8) and were milked 2.1 +/- 0.1 times daily. Cows in the 2 groups were switched to the opposite milking frequency control for wk 18 and 19. The increased milking frequency resulted in a higher milk yield of about 9% through 16 wk of early lactation. Although the higher milk yield was not significant when measured as energy-corrected milk, significant interactions of milking frequency and study period for milk yield and energy-corrected milk yield were consistent with a yield response when cows were milked more frequently. Meal criteria estimated for each individual cow were used to group feeding visits into meals. During MF4, cows fed in fewer meals per day and had longer meals than during MF8. The control gates were used efficiently, with only a few passages not resulting in actual meals. Although the voluntary meal intervals seemed to be short, the average milking frequency was far below that theoretically possible. This was explained by individual differences in milking frequency and long intervals from when a cow was redirected in a control gate until it arrived in the milking unit. A wide individual range in the voluntary interval between the first and the second meal in the milking cycle suggests that fixed time limits for control gates set on group level have no justifiable biological basis. It was also concluded that primiparous cows were well adapted to the automatic milking system after 2 wk in the barn.

  14. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically obtain a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and to inject the required data into the original source files, creating new ones ready to be compiled with all related dependencies. In addition, I proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I experimented with Natural Language Processing for extracting metadata; and I proposed the injection of the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  15. Static load test performance of a telescoping structure for an automatically deployable ROPS.

    Science.gov (United States)

    Etherton, J R; Cutlip, R G; Harris, J R; Ronaghi, M; Means, K H; Gillispie, A

    2002-02-01

    The automatically deployable ROPS was developed as part of an innovative project to provide passive protection against overturn fatality to operators of new tractors used in both low-clearance and unrestricted-clearance tasks. The primary objective of this phase of the research was to build a telescoping structure that would prove that a ROPS can be built that will (1) reliably deploy on signal, (2) rise in a sufficiently short amount of time, (3) firmly latch in its deployed position, and (4) satisfy SAE J2194 testing requirements. The two-post structure had previously been found to meet deployment time criteria, and design analyses indicated that neither the slip-fit joint nor the latch pins would fail at test loading. Four directions of static loading were applied to the structure to satisfy SAE requirements. For the series of static loading tests, the raised structure was found to maintain a protective clearance zone after all loads were applied. The structure is overly stiff and should be redesigned to increase its ability to absorb ground-impact energy. Results of dynamic tests and field upset tests are reported in companion articles. The next phase of development is to optimize the structure so that it will plastically deform and absorb energy that would otherwise be transferred to the tractor chassis.

  16. Mediation and Automatization.

    Science.gov (United States)

    Hutchins, Edwin

    This paper discusses the relationship between the mediation of task performance by some structure that is not inherent in the task domain itself and the phenomenon of automatization, in which skilled performance becomes effortless or phenomenologically "automatic" after extensive practice. The use of a common simple explicit mediating…

  17. Debugging of Class-D Audio Power Amplifiers

    DEFF Research Database (Denmark)

    Crone, Lasse; Pedersen, Jeppe Arnsdorf; Mønster, Jakob Døllner;

    2012-01-01

    Determining and optimizing the performance of a Class-D audio power amplifier can be very difficult without knowledge of the use of audio performance measuring equipment and of how the various noise and distortion sources influence the audio performance. This paper gives an introduction on how to measu…

  18. Application performance evaluation on automatic fire alarm system of metro%地铁火灾自动报警系统使用效能评价

    Institute of Scientific and Technical Information of China (English)

    宋立巍; 宋立丹; 赵海荣

    2009-01-01

    The application performance evaluation of automatic fire alarm systems in metros was studied. Evaluation requirements, an evaluation index system, an evaluation process and a scoring model for the application performance of metro automatic fire alarm systems were proposed, providing technical reference and guidance for the application performance evaluation of such systems.

  19. Fundamentals of IP and SoC security design, verification, and debug

    CERN Document Server

    Ray, Sandip; Sur-Kolay, Susmita

    2017-01-01

    This book is about security in embedded systems and it provides an authoritative reference to all aspects of security in system-on-chip (SoC) designs. The authors discuss issues ranging from security requirements in SoC designs, definition of architectures and design choices to enforce and validate security policies, and trade-offs and conflicts involving security, functionality, and debug requirements. Coverage also includes case studies from the “trenches” of current industrial practice in design, implementation, and validation of security-critical embedded systems. Provides an authoritative reference and summary of the current state-of-the-art in security for embedded systems, hardware IPs and SoC designs; Takes a "cross-cutting" view of security that interacts with different design and validation components such as architecture, implementation, verification, and debug, each enforcing unique trade-offs; Includes high-level overview, detailed analysis on implementation, and relevant case studies on desi...

  20. Interactive debug program for evaluation and modification of assembly-language software

    Science.gov (United States)

    Arpasi, D. J.

    1979-01-01

    An assembly-language debug program written for the Honeywell HDC-601 and DDP-516/316 computers is described. Names and relative addressing are used to improve operator-machine interaction. Features include versatile display, on-line assembly, and improved program execution and analysis. The program is discussed from both a programmer's and an operator's standpoint. Functional diagrams are included to describe the program, and each command is illustrated.

  1. Text Summarization Evaluation: Correlating Human Performance on an Extrinsic Task with Automatic Intrinsic Metrics

    Science.gov (United States)

    2006-05-01

    …computed the Pearson r and Spearman ρ (Siegel and Castellan, 1988) correlation values for the comparison of the human judgments and the ROUGE scores, and… average performance of a system for all topics. Table 4.15 and Table 4.16 show the rank correlations, using Pearson r (Siegel and Castellan, 1988)… Intrinsic and Extrinsic Scores Grouped by System (including Full Text)… and also Spearman ρ (Siegel and Castellan, 1988) is introduced in this…

  2. Archival Automatic Identification System (AIS) Data for Navigation Project Performance Evaluation

    Science.gov (United States)

    2015-08-01

    …and available to USACE practitioners via the MOU mentioned above, provides several of these parameters at a cost that is significantly lower than… performance information can be screened for a variety of embedded factors in the context of navigation features, such as inbound or outbound vessels. Vessel… collection, yet AIS data provides triple the data volume for this single transit, with no explicit cost incurred. Each historical data request from…

  3. Multistation alarm system for eruptive activity based on the automatic classification of volcanic tremor: specifications and performance

    Science.gov (United States)

    Langer, Horst; Falsaperla, Susanna; Messina, Alfio; Spampinato, Salvatore

    2015-04-01

    …system is hitherto one of the main automatic alerting tools to identify impending eruptive events at Etna. The currently operating software, named KKAnalysis, is applied to the data stream continuously recorded at two seismic stations. The data are merged with reference datasets of past eruptive episodes; in doing so, the results of pattern classification can be immediately compared with previous eruptive scenarios. Given the rich material collected in recent years, here we propose the application of the alert system to a wider range of stations (up to a total of eleven) at different elevations (1200-3050 m) and distances (1-8 km) from the summit craters. Critical alert parameters were empirically defined to obtain an optimal tuning of the alert system for each station. To verify the robustness of this new, multistation alert system, a dataset encompassing about eight years of continuous seismic records (since 2006) was processed automatically offline using KKAnalysis and collateral software. Then, we analyzed the performance of the classifier in terms of timing and spatial distribution of the stations.

  4. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, Tamara Lynn [Univ. of California, Davis, CA (United States)

    2008-01-01

    Performance-driven interface contract enforcement research aims to improve the quality of programs built from plug-and-play scientific components. Interface contracts make the obligations on the caller and all implementations of the specified methods explicit. Runtime contract enforcement is a well-known technique for enhancing testing and debugging. However, checking all of the associated constraints during deployment is generally considered too costly from a performance standpoint. Previous solutions enforced subsets of constraints without explicit consideration of their performance implications. Hence, this research measures the impacts of different interface contract sampling strategies and compares the results with new techniques driven by execution time estimates. Results from three studies indicate that automatically adjusting the level of checking based on performance constraints improves the likelihood of detecting contract violations under certain circumstances. Specifically, performance-driven enforcement is better suited to programs exercising constraints whose costs are at most moderately expensive relative to normal program execution.
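
    The core idea, enforcing only the contract checks whose estimated cost fits a performance budget, can be illustrated with a small decorator. This is a hedged sketch of the general technique, not Dahlgren's implementation; the budget and the timing estimates are hypothetical.

    ```python
    import functools

    OVERHEAD_BUDGET = 0.10  # max fraction of the estimated call time to spend on checks (hypothetical)

    def contract(precondition, est_check_cost, est_call_cost):
        """Enforce `precondition` only when its estimated cost fits the overhead budget."""
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                if est_check_cost <= OVERHEAD_BUDGET * est_call_cost:
                    assert precondition(*args, **kwargs), f"contract violated for {func.__name__}"
                return func(*args, **kwargs)
            return wrapper
        return decorator

    @contract(precondition=lambda xs, key: all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1)),
              est_check_cost=1e-6, est_call_cost=1e-4)   # hypothetical timing estimates
    def binary_search(xs, key):
        lo, hi = 0, len(xs)
        while lo < hi:
            mid = (lo + hi) // 2
            if xs[mid] < key:
                lo = mid + 1
            else:
                hi = mid
        return lo

    print(binary_search([1, 3, 5, 7, 9], 7))  # -> 3, with the sortedness check enabled
    ```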

  5. Digital automatic gain control

    Science.gov (United States)

    Uzdy, Z.

    1980-01-01

    Performance analysis, used to evaluate the fitness of several circuits for digital automatic gain control (AGC), indicates that a digital integrator employing a coherent amplitude detector (CAD) is the device best suited for the application. The circuit reduces gain error to half that of a conventional analog AGC while making it possible to automatically modify the response of the receiver to match incoming signal conditions.
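
    As a rough illustration of the digital-integrator AGC idea (not the circuit evaluated in this report), the sketch below accumulates the amplitude error block by block and updates a gain word so that the detected output amplitude tracks a set point; the loop constant and set point are invented for the example.

    ```python
    import numpy as np

    def digital_agc(samples, setpoint=1.0, loop_gain=0.05, blocks=64):
        """Integrate the amplitude error block by block and update the gain (illustrative only)."""
        gain = 1.0
        out = []
        for block in np.array_split(samples, blocks):
            y = gain * block
            detected = np.mean(np.abs(y))               # crude amplitude-detector stand-in
            gain += loop_gain * (setpoint - detected)   # digital integrator accumulates the error
            out.append(y)
        return np.concatenate(out), gain

    # A tone whose amplitude drifts from 0.2 to 2.0; the loop drives the output toward the set point.
    t = np.linspace(0, 1, 8000)
    x = np.linspace(0.2, 2.0, t.size) * np.sin(2 * np.pi * 50 * t)
    y, final_gain = digital_agc(x)
    print(round(final_gain, 3))
    ```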

  6. Parameter design and performance analysis of shift actuator for a two-speed automatic mechanical transmission for pure electric vehicles

    Directory of Open Access Journals (Sweden)

    Jianjun Hu

    2016-08-01

    Recent developments of pure electric vehicles have shown that pure electric vehicles equipped with two-speed or multi-speed gearbox possess higher energy efficiency by ensuring the drive motor operates at its peak performance range. This article presents the design, analysis, and control of a two-speed automatic mechanical transmission for pure electric vehicles. The shift actuator is based on a motor-controlled camshaft where a special geometric groove is machined, and the camshaft realizes the axial positions of the synchronizer sleeve for gear engaging, disengaging, and speed control of the drive motor. Based on the force analysis of shift process, the parameters of shift actuator and shift motor are designed. The drive motor’s torque control strategy before shifting, speed governing control strategy before engaging, shift actuator’s control strategy during gear engaging, and drive motor’s torque recovery strategy after shift process are proposed and implemented with a prototype. To validate the performance of the two-speed gearbox, a test bed was developed based on dSPACE that emulates various operation conditions. The experimental results indicate that the shift process with the proposed shift actuator and control strategy could be accomplished within 1 s under various operation conditions, with shift smoothness up to passenger car standard.

  7. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High grade of accurateness for the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomical aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created in 1 vertebra, in 28 cases in 2 vertebrae and in 16 cases in 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomical aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 and more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time-saving and may improve workflow in daily practice.

  8. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated in the sense of not being ultimately periodic, and they may look complicated in the sense that it may not be easy to name the rule by which a sequence is generated; however, such a rule always exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
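
    A classic concrete example is the Thue-Morse sequence, which is 2-automatic: its n-th term is obtained by running a two-state automaton over the binary digits of n (equivalently, it is the parity of the number of 1-bits of n). The sketch below is illustrative and not taken from the book.

    ```python
    def thue_morse(n: int) -> int:
        """n-th Thue-Morse term: run a 2-state automaton over the binary digits of n."""
        state = 0
        for bit in bin(n)[2:]:
            if bit == "1":      # the automaton toggles its state on input digit 1
                state ^= 1
        return state            # the output map sends the final state to 0 or 1

    print("".join(str(thue_morse(n)) for n in range(16)))  # 0110100110010110
    ```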

  9. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Directory of Open Access Journals (Sweden)

    Simone Hantke

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient.

  10. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient.
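
    The evaluation protocol described above, an SVM classifier in a leave-one-speaker-out framework, can be sketched with scikit-learn's LeaveOneGroupOut using the speaker ID as the grouping variable. This is a generic illustration with random stand-in features, not the iHEARu-EAT feature set or the authors' code.

    ```python
    import numpy as np
    from sklearn.model_selection import LeaveOneGroupOut
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import recall_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 20))            # stand-in acoustic features
    y = rng.integers(0, 2, size=300)          # 0 = not eating, 1 = eating (hypothetical labels)
    speakers = rng.integers(0, 30, size=300)  # speaker IDs used as the grouping variable

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    recalls = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=speakers):
        clf.fit(X[train_idx], y[train_idx])
        pred = clf.predict(X[test_idx])
        # unweighted average recall, the measure typically reported in paralinguistics
        recalls.append(recall_score(y[test_idx], pred, average="macro", zero_division=0))

    print(f"mean UAR over held-out speakers: {np.mean(recalls):.3f}")
    ```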

  11. Space-Based FPGA Radio Receiver Design, Debug, and Development of a Radiation-Tolerant Computing System

    Directory of Open Access Journals (Sweden)

    Zachary K. Baker

    2010-01-01

    Los Alamos has recently completed the latest in a series of Reconfigurable Software Radios, which incorporates several key innovations in both hardware design and algorithms. Due to our focus on satellite applications, each design must extract the best size, weight, and power performance possible from the ensemble of Commodity Off-the-Shelf (COTS) parts available at the time of design. A large component of our work lies in determining if a given part will survive in space and how it will fail under various space radiation conditions. Using two Xilinx Virtex 4 FPGAs, we have achieved 1 TeraOps/second signal processing on a 1920 Megabit/second datastream. This processing capability enables very advanced algorithms such as our wideband RF compression scheme to operate at the source, allowing bandwidth-constrained applications to deliver previously unattainable performance. This paper will discuss the design of the payload, making electronics survivable in the radiation of space, and techniques for debug.

  12. Completely Debugging Indeterminate MPI/PVM Programs

    Institute of Scientific and Technical Information of China (English)

    王锋; 安虹; 陈志辉; 陈国良

    2001-01-01

    This paper discusses how to completely debug indeterminate MPI/PVM parallel programs. Due to the indeterminacy, bugs encountered in a previous run may be non-repeatable in successive executions during a cyclic debugging session. Based on the FIFO communication model of MPI/PVM, an implementation of a record-and-replay technique is presented. Moreover, users are provided with an easy way to completely debug their programs by covering all possible execution paths through controllable replay. Compared with other solutions, the proposed method incurs much less temporal and spatial overhead. The implementation has been completed on two kinds of message-passing architectures: one is the Dawning-2000 super server (developed by the National Research Center for Intelligent Computing Systems of China), whose single-processor (PowerPC) nodes are interconnected by a custom-built wormhole mesh network; the other is a cluster of workstations (PowerPC/AIX) built at the National High Performance Computing Center at Hefei.
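
    The record-and-replay idea for taming non-deterministic message receipt can be illustrated with mpi4py: in record mode every wildcard receive logs the actual source, and in replay mode the logged source is forced so that the same message ordering is reproduced. This is a minimal sketch of the general technique under assumed file names, not the implementation described in the paper.

    ```python
    # Minimal record/replay sketch for non-deterministic receives (illustrative only).
    import json
    import sys
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    MODE = sys.argv[1] if len(sys.argv) > 1 else "record"   # "record" or "replay"
    LOG = f"recv_order_rank{rank}.json"                     # hypothetical log file name

    def logged_recv(log):
        """Wildcard receive in record mode; forced-source receive in replay mode."""
        if MODE == "record":
            status = MPI.Status()
            data = comm.recv(source=MPI.ANY_SOURCE, status=status)
            log.append(status.Get_source())                 # remember who we actually received from
            return data
        return comm.recv(source=log.pop(0))                 # replay the recorded source order

    if rank == 0:
        log = [] if MODE == "record" else json.load(open(LOG))
        received = [logged_recv(log) for _ in range(comm.Get_size() - 1)]
        if MODE == "record":
            json.dump(log, open(LOG, "w"))
        print(received)
    else:
        comm.send(f"hello from rank {rank}", dest=0)
    ```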

  13. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV systems in Sweden has been in operation since March 2002. The systems in the database are described in detail and energy production is continuously added in the form of monthly values. The energy production and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy program this number is expected to increase to over 100 systems in the next years. The new owners of PV systems are obliged to report the produced electricity to the authorities at least once a year. In this work we have studied different means to simplify the collection of data. Four different methods are defined: 1. The conversion of readings from energy meters made at arbitrary distance in time into monthly values. 2. Methods to handle data obtained with the monitoring systems provided by different inverter manufacturers. 3. Methods to acquire data from PV systems with energy meters reporting to the green certificate system. 4. Commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third methods make use of equipment that is expected to be installed in some PV systems for other reasons. Method 4 gives the possibility to create a fully automatic collection method. The described GPRS systems are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD)

  14. Distribution Based Change-Point Problem With Two Types of Imperfect Debugging in Software Reliability

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Software testing is an important phase of the software development life cycle. It controls the quality of the software product. Due to the complexity of software systems and incomplete understanding of the software, the testing team may not be able to remove/correct a fault perfectly on observation/detection of a failure: the original fault may remain, resulting in a phenomenon known as imperfect debugging, or get replaced by another fault, causing fault generation. In the case of imperfect debugging the fault content of the software remains the same, while in the case of fault generation the fault content increases as the testing progresses and removal/correction results in the introduction of new faults while removing/correcting old ones. During software testing the fault detection/correction rate may not be the same throughout the whole testing process, but may change at any moment in time. In the literature various software reliability models have been proposed incorporating the change-point concept. In this paper we propose a distribution-based change-point problem with two types of imperfect debugging in software reliability. The models developed have been validated and verified using real data sets. Estimated parameters and comparison criteria results have also been presented.

  15. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising because executing reverse code can restore the previous states of a program without state saving. In the literature, two methods that generate reverse code can be found: (a) static reverse-code generation that pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation that generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
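
    The contrast with state saving can be shown on a toy example: instead of snapshotting all variables before every step, each executed statement emits a small piece of reverse code that undoes it, which is enough to step the program backwards. The sketch below only illustrates the idea and is not the authors' debugger; statements that destroy information would additionally need the overwritten value recorded, which is where dynamic analysis helps.

    ```python
    # Toy illustration of reverse-code generation for backtracking (not the paper's tool).
    env = {"x": 0, "y": 0}
    undo_stack = []  # reverse code generated on the fly, one entry per executed statement

    def run(var, delta):
        """Execute `var += delta` and generate its reverse statement `var -= delta`."""
        env[var] += delta
        undo_stack.append(lambda v=var, d=delta: env.__setitem__(v, env[v] - d))

    def step_back(n=1):
        """Backtrack n statements by executing the generated reverse code."""
        for _ in range(n):
            undo_stack.pop()()

    run("x", 5)
    run("y", 3)
    run("x", -2)
    print(env)        # {'x': 3, 'y': 3}
    step_back(2)      # undo the last two statements
    print(env)        # {'x': 5, 'y': 0}
    ```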

  16. Assessing the Performance of Automatic Speech Recognition Systems When Used by Native and Non-Native Speakers of Three Major Languages in Dictation Workflows

    DEFF Research Database (Denmark)

    Zapata, Julián; Kirkedal, Andreas Søeborg

    2015-01-01

    In this paper, we report on a two-part experiment aiming to assess and compare the performance of two types of automatic speech recognition (ASR) systems on two different computational platforms when used to augment dictation workflows. The experiment was performed with a sample of speakers...... of three major languages and with different linguistic profiles: non-native English speakers; non-native French speakers; and native Spanish speakers. The main objective of this experiment is to examine ASR performance in translation dictation (TD) and medical dictation (MD) workflows without manual...

  17. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and expansive natural hazard. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events, and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, moreover offering a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in presence of flood, requires modelling the behavior of different objects in the scene in order to associate them to flood or no flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kind of data, are so particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
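
    A minimal sketch of the kind of Bayesian fusion described above: each pixel's flood probability combines a SAR-derived likelihood with ancillary evidence (low elevation, distance from the river) under a naive conditional-independence assumption. The conditional probability tables below are invented for illustration and are not those of the tool presented here.

    ```python
    def flood_posterior(sar_dark, low_elevation, near_river, prior=0.1):
        """Combine binary evidence naively: P(flood | e1..e3) ∝ P(flood) * Π P(ei | flood)."""
        # Hypothetical conditional probability tables P(evidence | flood) and P(evidence | no flood).
        p_e_given_f  = {"sar_dark": 0.85, "low_elev": 0.90, "near_river": 0.80}
        p_e_given_nf = {"sar_dark": 0.20, "low_elev": 0.40, "near_river": 0.30}

        ev = {"sar_dark": sar_dark, "low_elev": low_elevation, "near_river": near_river}
        lik_f, lik_nf = prior, 1.0 - prior
        for name, observed in ev.items():
            lik_f  *= p_e_given_f[name]  if observed else 1.0 - p_e_given_f[name]
            lik_nf *= p_e_given_nf[name] if observed else 1.0 - p_e_given_nf[name]
        return lik_f / (lik_f + lik_nf)

    print(round(flood_posterior(True, True, True), 3))    # strong combined evidence -> high probability
    print(round(flood_posterior(True, False, False), 3))  # SAR evidence alone -> much weaker
    ```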

  18. Application of remote debugging techniques in user-centric job monitoring

    Science.gov (United States)

    dos Santos, T.; Mättig, P.; Wulff, N.; Harenberg, T.; Volkmer, F.; Beermann, T.; Kalinin, S.; Ahrens, R.

    2012-06-01

    With the Job Execution Monitor, a user-centric job monitoring software developed at the University of Wuppertal and integrated into the job brokerage systems of the WLCG, job progress and grid worker node health can be supervised in real time. Imminent error conditions can thus be detected early by the submitter and countermeasures can be taken. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job misbehaviour. To remove the last "blind spot" from this monitoring, a remote debugging technique based on the GNU C compiler suite was developed and integrated into the software; its design concept and architecture is described in this paper and its application discussed.
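
    The underlying capability rests on standard GNU toolchain features: gdb can attach to a running process non-interactively and report a stack trace. As a hedged, generic illustration (not the Job Execution Monitor's actual interface), a monitoring agent could collect such a trace like this:

    ```python
    import subprocess

    def grab_backtrace(pid: int) -> str:
        """Attach gdb non-interactively to a running process and return its stack trace."""
        result = subprocess.run(
            ["gdb", "--batch", "-p", str(pid),
             "-ex", "thread apply all bt",   # backtrace of every thread
             "-ex", "detach"],
            capture_output=True, text=True, timeout=60,
        )
        return result.stdout

    if __name__ == "__main__":
        import sys
        # e.g. python grab_bt.py 12345   (PID of the monitored job on the worker node)
        print(grab_backtrace(int(sys.argv[1])))
    ```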

  19. Design of absorbency photometer used in a fully automatic ELISA analyzer

    Science.gov (United States)

    Dong, Ningning; Zhu, Lianqing; Dong, Mingli; Niu, Shouwei

    2008-03-01

    Absorbency measurement is the most important step in ELISA analysis. Based on spectrophotometry, an absorbency photometer system used in a fully automatic ELISA analyzer is developed. The system is one of the core modules of the fully automatic ELISA analyzer. The principle and function of the system are analyzed. Three main units of the system, the photoelectric transform unit, the data processing unit and the communication and control unit, are designed and debugged. Finally, the test of the system is carried out using a verification plate. The experimental results agree well with the requirements.
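
    The central computation of such an absorbency photometer is the Beer-Lambert relation A = log10(I0 / I), where I0 is the intensity transmitted by a blank well and I the intensity transmitted by the sample well. A small illustrative calculation with hypothetical photodiode readings:

    ```python
    import math

    def absorbance(i_blank: float, i_sample: float) -> float:
        """Absorbance from transmitted intensities via the Beer-Lambert law: A = log10(I0 / I)."""
        return math.log10(i_blank / i_sample)

    # Hypothetical photodiode readings for a blank well and a reacted ELISA well at 450 nm.
    print(round(absorbance(i_blank=1.00, i_sample=0.25), 3))  # A = 0.602
    ```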

  20. The performance of automotive hydraulic automatic transmissions

    Institute of Scientific and Technical Information of China (English)

    孙小男; 赵薇

    2013-01-01

    The hydraulic automatic transmission (Automatic Transmission, AT) occupies the most important position in the automatic transmission market, and the trend of AT development toward a higher number of gears is increasingly evident. Automatic transmissions with more gears can better improve every aspect of vehicle performance, offering a finer distribution of speed ratios between gears and a wider overall ratio range.

  1. Performance verification of the Siemens BN II automatic protein analyzer

    Institute of Scientific and Technical Information of China (English)

    于淼琛; 朱鸿; 王凤

    2016-01-01

    Objective: To verify the main analytical performance of the Siemens BN II automatic protein analyzer. Methods: According to laboratory accreditation criteria and the performance verification documents of the American CLIA'88, the precision, accuracy, clinical reportable range and biological reference interval were verified for 8 routine assays performed on the Siemens BN II automatic protein analyzer. Results: The precision, accuracy, linear range, clinical reportable range and biological reference interval were all acceptable. Conclusion: The Siemens BN II automatic protein analyzer fully meets the requirements of the intended clinical application.

  2. Design and performance of an Automatic Gain Control system for the High Energy X-Ray Timing Experiment

    Science.gov (United States)

    Pelling, Michael R.; Rothschild, Richard E.; Macdonald, Daniel R.; Hertel, Robert; Nishiie, Edward

    1991-01-01

    The High Energy X-Ray Timing Experiment (HEXTE), currently under development for the X-Ray Timing Explorer (XTE) mission, employs a closed-loop gain control system to attain 0.5 percent stabilization of each of the eight phoswich detector gains. This Automatic Gain Control (AGC) system utilizes a split-window discriminator scheme to control the response of each detector pulse height analyzer to gated Am-241 X-ray events at 60 keV. A prototype AGC system has been implemented and tested within the gain perturbation environment expected to be experienced by the HEXTE instrument in flight. The AGC system and test configuration are described. Response, stability and noise characteristics are measured and compared with theoretical predictions. The system is found to be generally suitable for the HEXTE application.

  3. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). It presents a recent state of the art, shows the main problems of ADS, the difficulties involved and the solutions provided by the community, and covers recent advances in ADS as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed

  4. Performance evaluation of an automatic positioning system for photovoltaic panels; Avaliacao de desempenho de um sistema de posicionamento automatico para paineis fotovoltaicos

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Alceu Ferreira; Cagnon, Jose Angelo [Universidade Estadual Paulista (FEB/UNESP), Bauru, SP (Brazil). Fac. de Engenharia], Emails: alceu@feb.unesp.br, jacagnon@feb.unesp.br

    2009-07-01

    The need to supply electric energy in localities not served by the utilities has motivated the development of this research, whose main focus was photovoltaic systems and the search for better performance of these systems by positioning the solar panels toward the sun. This work presents the performance evaluation of an automatic positioning system for photovoltaic panels, taking into account the increase in the generation of electric energy and the costs of implantation. A simplified electromechanical device was designed which is able to support and to move a photovoltaic panel along the day and along the year, keeping its surface aimed at the sun's rays, without using sensors and with optimization of movements, since the adjustment of the panel's inclination takes place only once a day. The obtained results indicated that the proposal is viable, showing a cost compatible with the increase in the generation of electricity. (author)
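
    A sensorless, once-a-day tilt adjustment of this kind can be driven purely by the calendar, for instance from the solar declination given by Cooper's approximation; at solar noon the direct beam is normal to a panel tilted from the horizontal by the difference between site latitude and declination. The sketch below is a generic illustration under these assumptions, not the authors' device, and the example latitude is only indicative.

    ```python
    import math

    def solar_declination_deg(day_of_year: int) -> float:
        """Cooper's approximation of the solar declination in degrees."""
        return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

    def daily_tilt_deg(latitude_deg: float, day_of_year: int) -> float:
        """Tilt from horizontal (facing the equatorward sun) that points the panel at the noon sun."""
        return abs(latitude_deg - solar_declination_deg(day_of_year))

    # Example for a site near latitude -22.3 deg (southern hemisphere): adjust once per day.
    for day in (21, 172, 355):   # a January day, and the June and December solstices
        print(day, round(daily_tilt_deg(-22.3, day), 1))
    ```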

  5. Research on Embedded Software Debugging Method

    Institute of Scientific and Technical Information of China (English)

    李志丹

    2012-01-01

    In embedded system development, the software often contains many errors owing to limits in the developer's experience and skills. Debugging is therefore a comparatively important step in the software development cycle. This paper mainly introduces some methods of software debugging under the VxWorks OS.

  6. Automatic Reading

    Institute of Scientific and Technical Information of China (English)

    胡迪

    2007-01-01

    Reading is the key to school success and, like any skill, it takes practice. A child learns to walk by practising until he no longer has to think about how to put one foot in front of the other. The great athlete practises until he can play quickly, accurately and without thinking. Educators call it automaticity.

  7. Research on Linux kernel debugging technology

    Institute of Scientific and Technical Information of China (English)

    洪永学; 余红英; 姜世杰; 林丽蓉

    2012-01-01

    When developing Linux applications and kernel drivers, the Linux kernel often needs to be trimmed or modified. Because of the special nature of the operating system kernel and the differences between versions, porting a driver or writing an application produces all sorts of errors and warnings, such as segmentation faults, syntax errors and unused variables, and the kernel cannot be debugged with the methods used for ordinary user programs. For these reasons, this paper first introduces the two commonly used Linux kernel debugging methods, namely printk-based print debugging and stack back-tracing from Oops information. Finally, an LCD driver example explains in detail how to use Oops information for stack back-tracing when debugging a Linux kernel driver, reflecting the importance of the stack back-trace technique.

  8. Engineering debugging method for single-feed circularly polarized micro-strip antennas

    Institute of Scientific and Technical Information of China (English)

    于家傲; 陈文君; 袁靖; 鞠志忠

    2014-01-01

    To solve the problems of inconsistent resonant frequencies and degraded axial ratio between the engineering implementation of a single-feed circularly polarized micro-strip antenna and its design scheme, this paper puts forward two kinds of engineering debugging schemes for this antenna, based on the theory of single-feed circularly polarized antennas, and simulates them using the HFSS software. The simulation results illustrate that debugging at different positions of the antenna can separately optimize the resonant frequency, reflection coefficient, axial ratio and other performance figures of the single-feed circularly polarized micro-strip antenna, which provides guidance for the engineering debugging of this kind of antenna.

  9. Diagnostics for stochastic genome-scale modeling via model slicing and debugging.

    Directory of Open Access Journals (Sweden)

    Kevin J Tsai

    Modeling of biological behavior has evolved from simple gene expression plots represented by mathematical equations to genome-scale systems biology networks. However, due to obstacles in complexity and scalability of creating genome-scale models, several biological modelers have turned to programming or scripting languages and away from modeling fundamentals. In doing so, they have traded the ability to have exchangeable, standardized model representation formats, while those that remain true to standardized model representation are faced with challenges in model complexity and analysis. We have developed a model diagnostic methodology inspired by program slicing and debugging and demonstrate the effectiveness of the methodology on a genome-scale metabolic network model published in the BioModels database. The computer-aided identification revealed specific points of interest such as reversibility of reactions, initialization of species amounts, and parameter estimation that improved a candidate cell's adenosine triphosphate production. We then compared the advantages of our methodology over other modeling techniques such as model checking and model reduction. A software application that implements the methodology is available at http://gel.ym.edu.tw/gcs/.

  10. Diagnostics for stochastic genome-scale modeling via model slicing and debugging.

    Science.gov (United States)

    Tsai, Kevin J; Chang, Chuan-Hsiung

    2014-01-01

    Modeling of biological behavior has evolved from simple gene expression plots represented by mathematical equations to genome-scale systems biology networks. However, due to obstacles in complexity and scalability of creating genome-scale models, several biological modelers have turned to programming or scripting languages and away from modeling fundamentals. In doing so, they have traded the ability to have exchangeable, standardized model representation formats, while those that remain true to standardized model representation are faced with challenges in model complexity and analysis. We have developed a model diagnostic methodology inspired by program slicing and debugging and demonstrate the effectiveness of the methodology on a genome-scale metabolic network model published in the BioModels database. The computer-aided identification revealed specific points of interest such as reversibility of reactions, initialization of species amounts, and parameter estimation that improved a candidate cell's adenosine triphosphate production. We then compared the advantages of our methodology over other modeling techniques such as model checking and model reduction. A software application that implements the methodology is available at http://gel.ym.edu.tw/gcs/.

  11. Incorporating S-shaped testing-effort functions into NHPP software reliability model with imperfect debugging

    Institute of Scientific and Technical Information of China (English)

    Qiuying Li; Haifeng Li; Minyan Lu

    2015-01-01

    Testing-effort (TE) and imperfect debugging (ID) in the reliability modeling process may further improve the fitting and prediction results of software reliability growth models (SRGMs). For describing the S-shaped varying trend of the TE increasing rate more accurately, first, two S-shaped testing-effort functions (TEFs), i.e., a delayed S-shaped TEF (DS-TEF) and an inflected S-shaped TEF (IS-TEF), are proposed. Then these two TEFs are incorporated into various types (exponential-type, delayed S-shaped and inflected S-shaped) of non-homogeneous Poisson process (NHPP) SRGMs with two forms of ID respectively for obtaining a series of new NHPP SRGMs which consider S-shaped TEFs as well as ID. Finally these new SRGMs and several comparison NHPP SRGMs are applied to four real failure data-sets respectively for investigating the fitting and prediction power of these new SRGMs. The experimental results show that: (i) the proposed IS-TEF is more suitable and flexible for describing the consumption of TE than the previous TEFs; (ii) incorporating TEFs into the inflected S-shaped NHPP SRGM may be more effective and appropriate compared with the exponential-type and the delayed S-shaped NHPP SRGMs; (iii) the inflected S-shaped NHPP SRGM considering both the IS-TEF and ID yields the most accurate fitting and prediction results among the comparison NHPP SRGMs.
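
    For concreteness, the kind of model family being compared can be written down. As a hedged sketch using standard forms from the SRGM literature (the paper's exact parameterisations may differ), the two S-shaped testing-effort functions and a TE-dependent NHPP mean value function with imperfect debugging are:

    ```latex
    % Cumulative testing effort by time t (delayed and inflected S-shaped forms):
    \[
      W_{\mathrm{DS}}(t) = \alpha\left[1 - (1+\beta t)\,e^{-\beta t}\right],
      \qquad
      W_{\mathrm{IS}}(t) = \frac{\alpha\left(1 - e^{-\beta t}\right)}{1 + \lambda e^{-\beta t}} .
    \]
    % TE-dependent NHPP mean value function m(t); imperfect debugging enters either through a
    % fault-removal probability p or through fault generation a(t) = a + \gamma m(t):
    \[
      \frac{dm(t)}{dt} = b\,\frac{dW(t)}{dt}\,\bigl[a(t) - m(t)\bigr],
      \qquad a(t) = a + \gamma\, m(t).
    \]
    ```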

  12. Debugging and Maintenance of Safety Valves

    Institute of Scientific and Technical Information of China (English)

    吴科学

    2016-01-01

    This paper briefly describes some applications of the SY-type safety valve in the petrochemical industry, as well as the role that safety valves play in production units. On-site setting of the safety valve set point is a dangerous but very necessary task, because the specific data obtained from this work reflect how the valve is used in the unit and directly determine whether it can provide the corresponding safety protection. The paper gives detailed methods for setting and debugging the pressure set point at the production site and, combining operation, maintenance and overhaul experience from field use, summarizes various common faults and the methods for handling them, and explains the aspects that should be paid attention to during maintenance.

  13. Use of Body Armor Protection Levels with Squad Automatic Weapon Fighting Load Impacts Soldier Performance, Mobility, and Postural Control

    Science.gov (United States)

    2015-05-01

    Keywords: course; protective equipment; ballistics; biomechanics; postural control; PPE (personal protective equipment); Natick Soldier Research ... The evaluation reported was carried out by personnel of the Biomechanics Team ... performance (e.g., long distance runs, short sprints, agility runs, and obstacle courses) (Knapik, 2004). Recently, Peoples et al. (2010) compared the ...

  14. Skilled Performance, Practice, and the Differentiation of Speed-Up from Automatization Effects: Evidence from Second Language Word Recognition.

    Science.gov (United States)

    Segalowitz, Norman S.; Segalowitz, Sidney J.

    1993-01-01

    Practice on cognitive tasks, such as word recognition tasks, will usually lead to faster and more stable responding in a second language. An analysis is presented of the relationship between observed reductions in performance latency and latency variability with respect to whether processing has become faster or whether a qualitative change, such…

  15. Data Provenance Inference in Logic Programming: Reducing Effort of Instance-driven Debugging

    NARCIS (Netherlands)

    Huq, Mohammad Rezwanul; Mileo, Alessandra; Wombacher, Andreas

    2013-01-01

    Data provenance allows scientists in different domains to validate their models and algorithms and to find out anomalies and unexpected behaviors. In previous works, we described on-the-fly interpretation of (Python) scripts to build a workflow provenance graph automatically and then infer fine-grained provenance.

  16. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    Science.gov (United States)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

    The main goal of the presented work was to evolve a multifunctional beam composed out of fiber reinforced plastics (FRP) and an embedded optical fiber with various fiber Bragg grating sensors (FBG). These beams are developed for use as structural members for bridges or industrial applications. It is now possible to realize large scale cross sections, the embedding is part of a fully automated process, and jumpers can be omitted in order not to negatively influence the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analysis were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). In addition to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.

  17. The yin and yang properties of pentatonic music in TCM music therapy: based on mode and tempo

    Institute of Scientific and Technical Information of China (English)

    左志坚

    2016-01-01

    According to Chinese traditional music theory, the creation, performance and musical language of Chinese music all reflect yin-yang thinking. Chinese traditional music is pentatonic music, and the pentatonic modes it uses have yin and yang properties. Overall, the yin-yang property of a mode can be divided into three categories: clear, basically clear and unclear. Tempo has a subtle influence on modes whose yin-yang property is clear, and plays a decisive role for modes whose yin-yang property is basically clear or unclear.

  18. Research on Methods for General Multi-Core Parallel Debugging

    Institute of Scientific and Technical Information of China (English)

    王敬宇

    2009-01-01

    Multi-core architectures further increase the difficulty of parallel programming. High productivity in parallel software cannot be achieved with current manual debugging techniques. This paper analyzes the currently available debugging methods and presents a progressive debugging approach organized by parallel granularity, which can make full use of our incremental experience in parallel programming and continuously refine the debugging techniques.

  19. A Software Agent for Automatic Creation of a PLC Program

    Directory of Open Access Journals (Sweden)

    Walid M. Aly

    2008-01-01

    Using structured design techniques to design a Programmable Logic Controller (PLC) program decreases the time needed for debugging and produces concise, bug-free code. This study is concerned with the design of a software agent for the automatic creation of code for a PLC program that can be downloaded to a Siemens Step 7 series controller. The code is generated according to the syntax rules of the AWL language; AWL is the abbreviation of the German word Anweisungsliste, which means instruction list. The proposed system uses an object-oriented approach to transfer the design specification into an object that adequately describes the system using the state-based design technique. The industrial system specifications are supplied by the user through a simple Graphical User Interface (GUI) environment. These specifications define the attribute values of an object-oriented class describing the control system, and all the functions needed to generate the code are encapsulated in the class.
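
    The flavour of the generated output can be conveyed with a toy generator: each transition of the state-based design becomes a short network that ANDs the current state bit with its trigger input and then sets/resets the state memory bits. The memory and input addresses, the tiny specification format, and the mapping below are hypothetical; the output is illustrative AWL-style text, not a drop-in Step 7 program and not the paper's agent.

    ```python
    # Toy generator of AWL-style instruction list code from a state-based spec (illustrative only).
    transitions = [
        # (from_state_bit, trigger_input, to_state_bit)
        ("M0.0", "E0.0", "M0.1"),   # Idle    --start button--> Running
        ("M0.1", "E0.1", "M0.0"),   # Running --stop button-->  Idle
    ]
    outputs = {"M0.1": "A0.0"}      # the Running state drives output A0.0 (e.g. a contactor)

    def emit_awl() -> str:
        lines = []
        for src, trig, dst in transitions:
            lines += [f"U {src}",    # AND: currently in the source state
                      f"U {trig}",   # AND: transition trigger present
                      f"S {dst}",    # set the destination state bit
                      f"R {src}",    # reset the source state bit
                      ""]
        for state, coil in outputs.items():
            lines += [f"U {state}", f"= {coil}", ""]
        return "\n".join(lines)

    print(emit_awl())
    ```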

  20. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  1. Performance evaluation of the AX-4280 automatic urine dry chemistry analyzer

    Institute of Scientific and Technical Information of China (English)

    王梅; 吴燕; 夏云

    2012-01-01

    Objective: To evaluate the performance of the AX-4280 fully automated urine dry chemistry analyzer. Methods: The precision, accuracy and carryover rates of all parameters were determined on the AX-4280 automatic urine dry chemistry analyzer. Results: The SD values of the precision for the high and low controls (0.001-0.943 and 0-0.832), the carryover rates (0.8%-10.0%) and the accuracy were all within the requirements of the instrument. Conclusion: All performance indicators of the AX-4280 automatic urine dry chemistry analyzer meet the instrument's requirements, and the analyzer can be used for clinical urine dry chemistry testing.
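
    The quantities verified here follow standard formulas: imprecision is the SD or CV of repeated control measurements, accuracy is the deviation of the mean from the target, and carryover is commonly estimated from a high-high-high / low-low-low run as (L1 - L3) / (H3 - L3) x 100%. A small generic calculation with hypothetical readings (not the paper's data):

    ```python
    import statistics as st

    def precision_cv(replicates):
        """Within-run imprecision as CV% of repeated control measurements."""
        return st.stdev(replicates) / st.mean(replicates) * 100

    def bias_percent(replicates, target):
        """Accuracy as relative deviation of the mean from the assigned target value."""
        return (st.mean(replicates) - target) / target * 100

    def carryover_percent(high_runs, low_runs):
        """Carryover from an H1,H2,H3 then L1,L2,L3 sequence: (L1 - L3) / (H3 - L3) * 100."""
        h3, l1, l3 = high_runs[-1], low_runs[0], low_runs[-1]
        return (l1 - l3) / (h3 - l3) * 100

    # Hypothetical readings for one urine-chemistry parameter (arbitrary units).
    controls, target = [102, 98, 101, 99, 100], 100
    print(round(precision_cv(controls), 2), round(bias_percent(controls, target), 2))
    print(round(carryover_percent([980, 985, 990], [15, 11, 10]), 2))
    ```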

  2. An electronically controlled automatic security access gate

    Directory of Open Access Journals (Sweden)

    Jonathan A. ENOKELA

    2014-11-01

    The security challenges being encountered in many places require electronic means of controlling access to communities, recreational centres, offices, and homes. The electronically controlled automated security access gate being proposed in this work helps to prevent unwanted access to controlled environments. This is achieved mainly through the use of a Radio Frequency (RF) transmitter-receiver pair. In the design a microcontroller is programmed to decode a given sequence of keys that is entered on a keypad and commands a transmitter module to send out this code as a signal at a given radio frequency. Upon reception of this RF signal by the receiver module, another microcontroller activates a driver circuitry to operate the gate automatically. The codes for the microcontrollers were written in the C language and were debugged and compiled using the KEIL Micro vision 4 integrated development environment. The resultant hex files were programmed into the memories of the microcontrollers with the aid of a universal programmer. Software simulation was carried out using the Proteus Virtual System Modeling (VSM) version 7.7. A scaled-down prototype of the system was built and tested. The electronically controlled automated security access gate can be useful in providing security for homes, organizations, and automobile terminals. The four-character password required to operate the gate gives the system an increased level of security. Due to its standalone nature of operation the system is cheaper to maintain in comparison with a manually operated type.

  3. Monitoring of the debugging of a geothermal heat pump air conditioning system and domestic hot-water system

    Institute of Scientific and Technical Information of China (English)

    丁育南; 丁楠育

    2012-01-01

    Taking the monitoring requirements for the debugging of the geothermal heat pump air conditioning system and domestic hot-water system of a certain project as an example, the debugging monitoring method is introduced from the aspects of the operation requirements of the domestic hot-water system, the key points of monitoring the system debugging, and the analysis of the debugging results. To ensure that the system meets the energy conservation and emission reduction targets required in the design, the system should first be debugged by zone and by subsystem, and only after these pass should joint debugging be carried out; shortcomings that appear during debugging should be summarized and corrected in real time. This provides a reference for the debugging monitoring of similar projects.

  4. Handling Conflicts in Depth-First Search for LTL Tableau to Debug Compliance Based Languages

    Directory of Open Access Journals (Sweden)

    Francois Hantry

    2011-09-01

    Providing adequate tools to tackle the problem of inconsistent compliance rules is a critical research topic. This problem is of paramount importance to achieve automatic support for early declarative design and to support evolution of rules in contract-based or service-based systems. In this paper we investigate the problem of extracting temporal unsatisfiable cores in order to detect the inconsistent part of a specification. We extend conflict-driven SAT-solver to provide a new conflict-driven depth-first-search solver for temporal logic. We use this solver to compute LTL unsatisfiable cores without re-exploring the history of the solver.

  5. Performance verification of a calibrator for automatic biochemical analysis systems

    Institute of Scientific and Technical Information of China (English)

    张丽; 管晓媛; 段兵; 黄一玲; 田蕾; 李一石

    2014-01-01

    Objective: To verify the performance of a new lot of the calibrator for automatic biochemical analysis systems (Cfas, Roche), using lactate dehydrogenase (LDH) as an example. Methods: LDH was measured in twenty fresh patient serum samples on a Roche Cobas C501 automatic biochemical analyzer before and after the change to the new calibrator lot. The results were analyzed with the Bland-Altman method using the Medcalc 12.7.0 software. Results: In the Bland-Altman scatter diagram, 19 points were within the limits of agreement (LoA), more than 95% of all points. The mean difference between the two LDH measurements was only -0.2 U/L, very close to the zero-difference line, and the maximum difference was 8 U/L. The largest comparison deviation was 2.48%, less than the acceptable deviation criterion (2.87%), a magnitude that is clinically acceptable. Conclusion: The LDH results obtained before and after the change of calibrator lot were in agreement, indicating that the performance of the new calibrator lot is acceptable.
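
    The Bland-Altman analysis used here reduces to computing, for each sample, the difference between the two measurements (old versus new calibrator lot), then the mean difference and the limits of agreement mean ± 1.96·SD, and counting how many points fall inside those limits. A generic sketch with made-up LDH values (not the study's data):

    ```python
    import numpy as np

    def bland_altman(before, after):
        """Mean difference and 95% limits of agreement between two measurement series."""
        diff = np.asarray(after, float) - np.asarray(before, float)
        mean_diff = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        within = int(np.sum(np.abs(diff - mean_diff) <= loa))
        return mean_diff, (mean_diff - loa, mean_diff + loa), within

    # Hypothetical LDH results (U/L) for a few samples measured with the old and new calibrator lot.
    old_lot = [180, 220, 310, 150, 265, 198, 402, 233]
    new_lot = [182, 219, 308, 151, 263, 200, 405, 230]
    md, (lo, hi), n_in = bland_altman(old_lot, new_lot)
    print(f"mean difference {md:.1f} U/L, LoA [{lo:.1f}, {hi:.1f}], {n_in}/{len(old_lot)} within LoA")
    ```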

  6. Synchronized debugging of the rectifier system for ion-exchange membrane caustic soda production

    Institute of Scientific and Technical Information of China (English)

    朱琦; 孔强; 王小红

    2011-01-01

    The principles, steps and precautions for the synchronized debugging of the rectifier system for ion-exchange membrane caustic soda production are discussed.

  7. Design and Debugging of Power Harmonic Filters

    Institute of Scientific and Technical Information of China (English)

    仇润鹤; 吴震春

    2001-01-01

    In view of the harmonic characteristics of coal mine power grids, using single-tuned harmonic filters to eliminate the harmonics produced by AC equipment is an economical and effective method. The design and debugging of such harmonic filters are discussed and analyzed in detail with a practical case.
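
    For a single-tuned filter the component values follow from the chosen harmonic order h and the fundamental reactive power Qc the branch must supply: with capacitive reactance XC = h²·V² / (Qc·(h² - 1)), the capacitor is C = 1/(ω·XC) and the tuning inductor is L = XC / (h²·ω). The sketch below applies these textbook relations to hypothetical figures; it is an illustration, not the design discussed in the paper, and per-phase versus three-phase bookkeeping is glossed over.

    ```python
    import math

    def single_tuned_filter(v_bus, q_var, harmonic, f=50.0):
        """Textbook single-tuned filter sizing: returns (C in farads, L in henries)."""
        w = 2 * math.pi * f
        xc = harmonic**2 * v_bus**2 / (q_var * (harmonic**2 - 1))  # capacitive reactance at 50 Hz
        c = 1.0 / (w * xc)
        l = xc / (harmonic**2 * w)        # makes the LC branch resonate at h * f
        return c, l

    # Hypothetical figures: 6 kV bus, 300 kvar of compensation, filter tuned to the 5th harmonic.
    c, l = single_tuned_filter(v_bus=6000.0, q_var=300e3, harmonic=5.0)
    print(f"C = {c * 1e6:.1f} uF, L = {l * 1e3:.2f} mH")   # resonates near 250 Hz
    ```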

  8. The Debugging of the X-ray High-Voltage Generator

    Institute of Scientific and Technical Information of China (English)

    戴丹; 王魏; 戴竞; 郭永平; 徐月萍; 高建全; 张春潮; 叶践

    2012-01-01

    With the widespread use of X-ray examination and people's growing attention to their health, more and more people undergo X-ray examinations. Based on the structure and principle of the X-ray machine, this paper briefly discusses the debugging of the X-ray high-voltage generator section.

  9. Debugging Method Based on OMAPL138 Dual-core System

    Institute of Scientific and Technical Information of China (English)

    栾小飞

    2012-01-01

    The OMAPL138 high-performance, low-power dual-core processor provides strong support for handheld mobile devices. This paper analyzes the software architecture of the dual-core communication module DSPLink and introduces how the module is compiled and loaded under the embedded Linux operating system. Taking the message queue (MSGQ) component as an example, it analyzes how the communication channel between the ARM and DSP cores is established and connected. Using the MSGQ interfaces of DSPLink on the DSP/BIOS and Linux sides together with multi-threading techniques, a message-passing channel is built between the ARM and the DSP, providing a solution for black-box debugging of the DSP side during dual-core development.

  10. Research and Implementation of ARM Software Trace Debugging Technology in LTE Systems

    Institute of Scientific and Technical Information of China (English)

    申敏; 彭涛; 周勃

    2012-01-01

    The embedded trace debugging technology based on the ARM11 core in an LTE system is discussed. In order to meet the needs of high-speed data services in the LTE system and to facilitate flow control and the localization and analysis of abnormal situations in the operation of the ARM subsystem during product development, a trace debugging solution with high reliability, strong performance and real-time behavior is put forward. The difficulty lies in managing the on-chip buffer for trace information and in handling that buffer when a high-priority task preempts a low-priority task in the embedded operating system, so as to ensure the efficient operation of the system and the correct ordering of the trace information.

  11. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo;

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  12. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two data-sets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS). In the second dataset an accidental release of radioactivity into the environment was simulated in the south-western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable
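
    The mapping step itself, universal kriging of station observations onto a regular grid, can be sketched with the pykrige package (used here only as an illustration; it is not necessarily the tool used in the paper), where the regional trend is handled by a linear drift term:

    ```python
    import numpy as np
    from pykrige.uk import UniversalKriging

    rng = np.random.default_rng(1)
    # Hypothetical monitoring stations: coordinates (km) and gamma dose rates (uSv/h).
    x, y = rng.uniform(0, 100, 50), rng.uniform(0, 100, 50)
    z = 0.08 + 0.0004 * x + rng.normal(0, 0.005, 50)   # weak regional trend plus noise

    uk = UniversalKriging(x, y, z,
                          variogram_model="spherical",
                          drift_terms=["regional_linear"])   # universal kriging = drift + residual field
    gridx = np.arange(0.0, 100.0, 5.0)
    gridy = np.arange(0.0, 100.0, 5.0)
    zhat, ss = uk.execute("grid", gridx, gridy)   # predicted field and kriging variance

    print(zhat.shape, float(zhat.mean()))
    ```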

  13. Performance Evaluation of Destiny Max Automatic Coagulation Analyzer

    Institute of Scientific and Technical Information of China (English)

    谢田刚; 王海燕; 王胜江

    2012-01-01

    Objective: To evaluate the performance of the mechanical (magnetic bead) detection system of the Destiny Max automatic coagulation analyzer. Methods: The mechanical method was selected, and prothrombin time (PT), activated partial thromboplastin time (APTT), thrombin time (TT) and fibrinogen (FIB) were used to evaluate the accuracy, precision and carryover rate. The sample test results were compared with those of the STR-Evolution, and the correlation and Kappa consistency were analyzed. Results: The accuracy deviations of PT, APTT, TT and FIB tested with TriniCHECK Control 1 on the Destiny Max were 2.21%, 2.57%, 4.68% and 4.72%. The accuracy deviations of PT, APTT and FIB tested with TriniCHECK Control 2 were 1.96%, 0.54% and 3.46%, and the precisions were 1.33%, 1.90% and 3.54%, respectively. The carryover rate of the FIB high value was 2.56%, and that of FIB to the PT reagent was 2.15%. The comparison of clinical sample results between the Destiny Max and the STR-Evolution was favorable, with a linear correlation coefficient r > 0.95, Kappa > 0.8 and P > 0.05, showing no significant difference. Conclusion: The accuracy, precision and carryover rate of the Destiny Max all reached the performance requirements, and the clinical results compared with the STR-Evolution show good correlation and consistency, so the Destiny Max automatic coagulation analyzer can provide reliable test results for the clinic. [Chinese Medical Equipment Journal, 2012, 33(6): 107-109]

  14. Automatic chemical monitoring in the composition of functions performed by the unit level control system in the new projects of nuclear power plant units

    Science.gov (United States)

    Denisova, L. G.; Khrennikov, N. N.

    2014-08-01

    The article presents information on the state of regulatory framework and development of a subsystem for automated chemical monitoring of water chemistries in the primary and secondary coolant circuits used as part of the automatic process control system in new projects of VVER reactor-based nuclear power plant units. For the strategy of developing and putting in use the water chemistry-related part of the automated process control system within the standard AES-2006 nuclear power plant project to be implemented, it is necessary to develop regulatory documents dealing with certain requirements imposed on automatic water chemistry monitoring systems in accordance with the requirements of federal codes and regulations in the field of using atomic energy.

  15. Design of Pneumatic Device of the Automatic Transmission Performance Test-bed

    Institute of Scientific and Technical Information of China (English)

    王利利

    2012-01-01

    Based on the working principle of the automatic transmission and on experience from experimental teaching of automatic transmissions, the shortcomings of most current sectioned-transmission test benches are analysed, and a sectioned automatic transmission test bench that uses compressed air as the power transmission medium is designed. The pneumatic test bench can give a dynamic demonstration, overcomes a difficult point in teaching, raises students' interest in learning and achieves better results in experimental teaching.

  16. On the High-speed Rail Power Supply SCADA System Debugging

    Institute of Scientific and Technical Information of China (English)

    曾亮

    2015-01-01

    In order to standardize the remote-control commissioning and acceptance of the railway power supply SCADA system and to eliminate the safety risks present in the equipment after the SCADA system takes over, this paper draws on commissioning experience with SCADA systems on existing lines and on the operational requirements of the system, and discusses in detail the content, requirements and procedures of SCADA system commissioning, which has practical guiding significance.

  17. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money in the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  18. Hydraulic balance debugging of central air-conditioning chilled water system

    Institute of Scientific and Technical Information of China (English)

    王赛华

    2014-01-01

    Adding static and dynamic hydraulic balancing devices to the pipework is the common way to correct hydraulic imbalance in a piping network, and hydraulic balance commissioning has therefore become one of the important parts of air-conditioning system commissioning. This paper introduces the methods for hydraulic balance commissioning of air-conditioning water systems and, with an engineering example, explains the principle and process of static hydraulic balancing by the flow-ratio adjustment method.

  19. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
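
    The chain-rule propagation that the bibliography refers to is easy to demonstrate with forward-mode automatic differentiation over dual numbers. The tiny Dual class below is a generic textbook construction, not taken from any tool listed in the bibliography.

```python
"""Forward-mode automatic differentiation with dual numbers (textbook sketch)."""
import math

class Dual:
    """Value/derivative pair propagated through arithmetic via the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def sin(x: Dual) -> Dual:
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def f(x):
    # f(x) = x*sin(x) + 3x,  so  f'(x) = sin(x) + x*cos(x) + 3
    return x * sin(x) + 3 * x

if __name__ == "__main__":
    x = Dual(1.2, 1.0)             # seed derivative dx/dx = 1
    y = f(x)
    print("f(1.2)  =", y.val)
    print("f'(1.2) =", y.dot)      # matches sin(1.2) + 1.2*cos(1.2) + 3
```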

  20. JOSTRAN: An Interactive Joss Dialect for Writing and Debugging Fortran Programs.

    Science.gov (United States)

    Graham, W. R.; Macneilage, D. C.

    JOSTRAN is a JOSS dialect that expedites the construction of FORTRAN programs. JOSS is an interactive, on-line computer system. JOSS language programs are list-processed; i.e., each statement is interpreted at execution time. FORTRAN is the principal language for programming digital computers to perform numerical calculations. The JOSS language…

  1. Automated debug for common information model defect using natural language processing algorithm

    Institute of Scientific and Technical Information of China (English)

    项炜

    2013-01-01

    Common Information Model (CIM) is an open industrial standard that has been implemented in the products of many companies, and a large number of bugs are reported and fixed against it. To reduce the time and effort needed to find the root cause of a defect, this paper proposes a method for automatic debugging based on a natural language processing algorithm. It first segments the bug descriptions using a maximum entropy model, then uses simHash over a specially constructed dictionary to find the most similar already-fixed bug, and finally applies text mining to the trace provided by the customer to find the root cause and the solution. The experimental result achieves 87.5% accuracy, which shows the effectiveness of the method.
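
    The core retrieval step described above, finding the most similar already-fixed bug via simHash, can be sketched as follows. The tokenizer is a naive whitespace split rather than the maximum-entropy segmenter used in the paper, and the example bug texts are invented.

```python
"""simHash-based similar-bug lookup (illustrative).

Uses whitespace tokenisation instead of the paper's maximum-entropy
segmenter; 64-bit token hashes come from MD5.
"""
import hashlib

def _token_hash(token: str) -> int:
    return int(hashlib.md5(token.encode("utf-8")).hexdigest(), 16) & ((1 << 64) - 1)

def simhash(text: str) -> int:
    """64-bit simHash: sum signed bit contributions of all token hashes."""
    weights = [0] * 64
    for token in text.lower().split():
        h = _token_hash(token)
        for bit in range(64):
            weights[bit] += 1 if (h >> bit) & 1 else -1
    return sum(1 << bit for bit, w in enumerate(weights) if w > 0)

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def most_similar(new_bug: str, fixed_bugs: dict) -> str:
    """Return the id of the fixed bug whose simHash is closest to the new report."""
    target = simhash(new_bug)
    return min(fixed_bugs, key=lambda k: hamming(target, simhash(fixed_bugs[k])))

if __name__ == "__main__":
    fixed = {                      # hypothetical fixed-bug descriptions
        "BUG-101": "provider crashes when enumerating CIM_DiskDrive instances",
        "BUG-207": "association traversal returns stale CIM_ComputerSystem objects",
    }
    new = "crash while enumerating CIM_DiskDrive instances on provider restart"
    print("closest fixed bug:", most_similar(new, fixed))
```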

  2. Automatic Differentiation Package

    Energy Technology Data Exchange (ETDEWEB)

    2007-03-01

    Sacado is an automatic differentiation package for C++ codes using operator overloading and C++ templating. Sacado provides forward, reverse, and Taylor polynomial automatic differentiation classes and utilities for incorporating these classes into C++ codes. Users can compute derivatives of computations arising in engineering and scientific applications, including nonlinear equation solving, time integration, sensitivity analysis, stability analysis, optimization and uncertainty quantification.

  3. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, ‘automaticity’ refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due to both a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) functional significance of automaticity; (b) neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  4. Automatic Testing of a CANopen Node

    OpenAIRE

    Liang, Hui

    2013-01-01

    This Bachelor’s thesis was commissioned by TK Engineering Oy in Vaasa. The goals of the thesis were to test a prototype CANopen node, called UWASA Node for conformance to the CiA 301 standard, and to develop the automatic performance test software and the automatic CiA 401 test software. A test report that describes to the designer what needs to be corrected and improved is made in this thesis. For the CiA 301 test there is a CANopen conformance test tool that can be used. The automatic perfo...

  5. Safety performance test system for an automatic powder-spraying explosion suppression device for gas pipeline transport

    Institute of Scientific and Technical Information of China (English)

    许明英

    2015-01-01

    Based on the standards relevant to safety performance testing of automatic powder-spraying explosion suppression devices for gas pipeline transport, this paper introduces the design of a safety performance test system for such devices, which solves, in a scientific and ingenious way, problems such as automating the test process and keeping personnel safe. The system has been applied in the laboratory and clearly improves both the degree of automation of the testing and test safety.

  6. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  7. Automatic Kurdish Dialects Identification

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2016-02-01

    Full Text Available Automatic dialect identification is a necessary language technology for processing multi-dialect languages in which the dialects are linguistically far from each other. Particularly, this becomes crucial where the dialects are mutually unintelligible. Therefore, to perform computational activities on these languages, the system needs to identify the dialect that is the subject of the process. The Kurdish language encompasses various dialects. It is written using several different scripts. The language lacks a standard orthography. This situation makes Kurdish dialect identification more interesting and required, both from the research and from the application perspectives. In this research, we have applied a classification method, based on supervised machine learning, to identify the dialects of Kurdish texts. The research has focused on two widely spoken and most dominant Kurdish dialects, namely, Kurmanji and Sorani. The approach could be applied to the other Kurdish dialects as well. The method is also applicable to languages which are similar to Kurdish in their dialectal diversity and differences.
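
    A supervised classifier of the kind described, distinguishing Kurmanji from Sorani text, can be prototyped in a few lines with character n-gram features. The scikit-learn pipeline and the toy training sentences below are illustrative assumptions; the paper's own feature set and classifier may differ, and the sample sentences are hypothetical Latinised approximations.

```python
"""Character n-gram dialect identification sketch (scikit-learn).

The toy corpus and the choice of features/classifier are illustrative
assumptions, not the setup evaluated in the paper.
"""
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled corpus: text plus dialect label ("kurmanji" or "sorani").
train_texts = [
    "ez diçim malê û sibe vedigerim",               # hypothetical Kurmanji sample
    "tu çawa yî, navê te çi ye",                    # hypothetical Kurmanji sample
    "min dechim bo mallewe u beyani degerrimewe",   # hypothetical Sorani sample
    "to chonit, nawit chiye",                       # hypothetical Sorani sample
]
train_labels = ["kurmanji", "kurmanji", "sorani", "sorani"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # sub-word features
    LogisticRegression(max_iter=1000),
)
model.fit(train_texts, train_labels)

print(model.predict(["navê wî çi ye"]))   # expected: kurmanji
```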

  8. Survey of Design-for-Debug of VLSI

    Institute of Scientific and Technical Information of China (English)

    钱诚; 沈海华; 陈天石; 陈云霁

    2012-01-01

    Design-for-debug (DFD) has become an important feature of modern VLSI. On the one hand, traditional pre-silicon verification methods are no longer sufficient to ensure the quality of modern complex VLSI designs, so employing DFD to support post-silicon validation has attracted wide interest from both academia and industry. On the other hand, debugging parallel programs is a notoriously difficult problem: many subtle bugs cannot be handled with traditional single-stepping and breakpoint techniques, and without dedicated hardware support they consume enormous time and effort. This paper comprehensively analyses existing DFD structures and introduces the different fields of DFD for debugging hardware and software. These fields cover various kinds of DFD infrastructure, such as infrastructure for the processor pipeline, for systems-on-chip (SoC) and for the networks of multi-core processors. The paper also introduces recent research on how to design DFD infrastructure for a given processor architecture and how to use it to solve debugging problems in these fields, covering the topology of the whole infrastructure, the hardware design of its components and the methods of analysing signals; it further discusses the hot topics in DFD research, their root causes, and the development trends of the field.

  9. Word Automaticity of Tree Automatic Scattered Linear Orderings Is Decidable

    CERN Document Server

    Huschenbett, Martin

    2012-01-01

    A tree automatic structure is a structure whose domain can be encoded by a regular tree language such that each relation is recognisable by a finite automaton processing tuples of trees synchronously. Words can be regarded as specific simple trees and a structure is word automatic if it is encodable using only these trees. The question naturally arises whether a given tree automatic structure is already word automatic. We prove that this problem is decidable for tree automatic scattered linear orderings. Moreover, we show that in case of a positive answer a word automatic presentation is computable from the tree automatic presentation.

  10. Presentation video retrieval using automatically recovered slide and spoken text

    Science.gov (United States)

    Cooper, Matthew

    2013-03-01

    Video is becoming a prevalent medium for e-learning. Lecture videos contain text information in both the presentation slides and lecturer's speech. This paper examines the relative utility of automatically recovered text from these sources for lecture video retrieval. To extract the visual information, we automatically detect slides within the videos and apply optical character recognition to obtain their text. Automatic speech recognition is used similarly to extract spoken text from the recorded audio. We perform controlled experiments with manually created ground truth for both the slide and spoken text from more than 60 hours of lecture video. We compare the automatically extracted slide and spoken text in terms of accuracy relative to ground truth, overlap with one another, and utility for video retrieval. Results reveal that automatically recovered slide text and spoken text contain different content with varying error profiles. Experiments demonstrate that automatically extracted slide text enables higher precision video retrieval than automatically recovered spoken text.

  11. Constructing programming ideas for the maintenance and debugging of factory PLC equipment

    Institute of Scientific and Technical Information of China (English)

    张自强

    2014-01-01

    The functions and role of the PLC programmable controller are analysed in depth, and programming ideas for the maintenance and debugging of PLC equipment in factories are explored, providing some reference for eliminating failures of factory PLC equipment.

  12. Assembly and debugging technology for ultrasonic motor

    Institute of Scientific and Technical Information of China (English)

    高跃民; 姬海英

    2011-01-01

    Assembly and adjustment are important links in the development of ultrasonic motors; the main difficulties are the bonding of the piezoelectric vibrator, pre-load adjustment and electromechanical matching. A set of bonding, assembly and adjustment methods suitable for fast, small-batch production of ultrasonic motors was worked out. Integrated bonding of the friction material, stator elastic body, piezoelectric ceramic and flexible printed circuit board (PCB) solves the bonding problem of the piezoelectric vibrator and achieves high adhesion and precise positioning. Combining coarse and fine adjustment completes the electromechanical matching quickly and shortens the production cycle. A standardized assembly and adjustment procedure was formed and applied to small-batch production and development. The ultrasonic motors developed are miniature and flat, with a volume of 14 mm × 14 mm × 4 mm and an output torque of 2 mN·m; they passed a random vibration test with a power spectral density (PSD) of 0.2 g²/Hz and temperature tests from -40 ℃ to 60 ℃, and were integrated into a system to realize a stepping drive function.

  13. Debug Method of Sealed Gas Detector C2 Gate at Nuclear Power Plants

    Institute of Scientific and Technical Information of China (English)

    李翔; 余哲; 李买林; 郑欢; 王建新; 任熠; 张佳; 刘晋瑾; 刘芸

    2014-01-01

    The debugging method for the C2 gate at nuclear power plants, which uses a sealed gas proportional counter as its detector, is described, including an analysis of the feasibility and necessity of optimizing the equipment's performance. To ensure the best detection efficiency, the optimum high-voltage value of the detector has to be determined; the minimum detectable limit of the equipment is calculated and a low-background alarm threshold is set; and protective measures for the detector are provided.

  14. Bugs that debugs: Probiotics

    Directory of Open Access Journals (Sweden)

    Sugumari Elavarasu

    2012-01-01

    Full Text Available The oral cavity harbors a diverse array of bacterial species. There are more than 600 species that colonize in the oral cavity. These include a lot of organisms that are not commonly known to reside in the gastrointestinal (GI) tract and also are more familiar: Lactobacillus acidophilus, Lactobacillus casei, Lactobacillus fermentum, Lactobacillus plantarum, Lactobacillus rhamnosus, and Lactobacillus salivarius. The balance of all these microorganisms can easily be disturbed and a prevalence of pathogenic organisms can lead to various oral health problems including dental caries, periodontitis, and halitosis.

  15. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that the CPU time and memory requirements of hierarchical analysis are low if heuristics are used during the abstraction process.

  16. Performance management system enhancement and maintenance

    Science.gov (United States)

    Cleaver, T. G.; Ahour, R.; Johnson, B. R.

    1984-01-01

    The research described in this report concludes a two-year effort to develop a Performance Management System (PMS) for the NCC computers. PMS provides semi-automated monthly reports to NASA and contractor management on the status and performance of the NCC computers in the TDRSS program. Throughout 1984, PMS was tested, debugged, extended, and enhanced. Regular PMS monthly reports were produced and distributed. PMS continues to operate at the NCC under control of Bendix Corp. personnel.

  17. Design of an electrical debugging system for a portable launch control device

    Institute of Scientific and Technical Information of China (English)

    潘勃; 卢选民; 苏龙; 王剑亮

    2014-01-01

    To simulate the electrical signal characteristics of the aircraft, two types of launchers and the missile, and to integrate debugging and joint system testing during the development of the launcher and its supporting launch control box, an electrical debugging system for a certain type of portable launch control device was designed and developed. The overall design of the system, its hardware configuration, the adapter design and the software workflow are introduced. Testing shows that the system is safe and reliable and fully meets the design requirements.

  18. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug', designed for the debugging of programs for scientific computing, has been improved in two respects: (1) shortening the elapsed time required for getting the appropriate data to visualize; (2) adding new functions that make it possible to compare and/or combine sets of visualized data originating from two or more different programs. As for shortening the elapsed time for getting data, the improved version of 'vdebug' achieved over a hundred-fold reduction with dbx and pdbx on the SX-4 and over a ten-fold reduction with ndb on the SR2201. As for the new functions to compare/combine visualized data, it was confirmed that we could easily check the consistency between the computational results obtained at each calculational step on two different computers: SP and ONYX. In this report, we illustrate with an example how the tool 'vdebug' has been improved. (author)

  19. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005 by members of the IFIP Working Group 2.1, of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers... Automatic Program Development offers...

  20. A Novel Optimal Software Release Policy under Imperfect Debugging

    Institute of Scientific and Technical Information of China (English)

    刘云; 田斌; 赵玮

    2005-01-01

    Optimal software release is a challenging problem in software reliability. Most of the available software release models rest on the unrealistic assumption that the debugging process is perfect and that no new faults are introduced during debugging. This paper presents an optimal software release policy model under imperfect debugging. The model not only takes imperfect debugging and the new faults introduced during debugging into account, but also considers that the probability of a perfect fix increases as experience is gained during software testing. The solution of the model is also given.
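
    To make the optimisation concrete, the sketch below minimises a release-cost function under a simple imperfect-debugging reliability growth model. The mean-value function m(T) = a(1 - e^(-pbT)), the cost coefficients and the parameter values are illustrative assumptions; they are not the model actually derived in the paper.

```python
"""Optimal release time under imperfect debugging (illustrative model).

Assumes a simple NHPP with perfect-fix probability p, detection rate b and
initial fault content a; cost coefficients are invented for the example.
"""
import numpy as np
from scipy.optimize import minimize_scalar

a, b, p = 120.0, 0.05, 0.9                 # faults, detection rate (1/day), P(perfect fix)
c_test, c_field, c_time = 1.0, 25.0, 0.8   # cost per fault (test/field), cost per day

def m(t):
    """Expected faults removed by time t under the assumed imperfect-debugging model."""
    return a * (1.0 - np.exp(-p * b * t))

def total_cost(T):
    remaining = a - m(T)                   # faults expected to escape to the field
    return c_test * m(T) + c_field * remaining + c_time * T

res = minimize_scalar(total_cost, bounds=(1.0, 365.0), method="bounded")
print("optimal release day: %.1f, expected cost: %.1f" % (res.x, res.fun))
```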

  1. A SURVEY: PID OPTIMIZATION FOR AUTOMATIC VOLTAGE REGULATOR

    OpenAIRE

    Ajay Dixit*, Miss.Pragati Joshi, Mr.Mahesh Lokhande

    2016-01-01

    This paper presents a survey of PSO-based tuning of the automatic voltage regulator for a synchronous generator, which is used to obtain regulation and voltage stability for electrical equipment. Many techniques and controllers have been applied to automatic voltage regulators to improve robustness, overshoot, rise time and voltage control, and the problem addressed here is to survey them for the automatic voltage regulator. Comparative studies based on the PID controller are performed to sh...
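
    A bare-bones particle swarm optimiser of the kind surveyed can be written in a few dozen lines. The cost function below is a hypothetical placeholder (a quadratic penalty around nominal PID gains) standing in for a simulated closed-loop AVR performance index such as ITAE; the swarm parameters are common defaults, not values from any of the surveyed papers.

```python
"""Minimal particle swarm optimisation of PID gains (illustrative).

The cost function is a hypothetical stand-in for a closed-loop AVR
performance index (e.g. ITAE); swarm parameters are common defaults.
"""
import numpy as np

rng = np.random.default_rng(1)

def cost(gains):
    """Placeholder objective: replace with a simulated step-response criterion."""
    kp, ki, kd = gains
    return (kp - 0.65) ** 2 + (ki - 0.45) ** 2 + (kd - 0.25) ** 2

def pso(n_particles=30, n_iter=100, bounds=(0.0, 2.0), w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, 3))   # positions: (Kp, Ki, Kd)
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest, pbest_cost.min()

if __name__ == "__main__":
    gains, best = pso()
    print("best (Kp, Ki, Kd): %s, cost %.4f" % (np.round(gains, 3), best))
```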

  2. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot...

  3. Experiment of Road Performance of Asphalt Mixture with Automatic Long-term Snowmelt Agent

    Institute of Scientific and Technical Information of China (English)

    李福普; 王志军

    2012-01-01

    An asphalt mixture in which the Mafilon additive replaces part of the mineral filler, and which thereby helps to de-ice the pavement, is a long-term active snow-melting asphalt mixture. To verify the road performance of asphalt mixtures with Mafilon, indoor tests were conducted, the de-icing capability of the mixture was observed in the field, and the performance of several asphalt mixtures with different Mafilon contents was compared. The results show that (1) with the same gradation, the long-term active snow-melting asphalt mixture has good high-temperature (anti-rutting) performance, and the Mafilon content has little effect on the low-temperature performance of the mixture; (2) the moisture stability of the three asphalt mixtures meets the requirement of the specification, but it decreases as the Mafilon content increases. According to the indoor and field de-icing results, the higher the proportion of mineral filler replaced by Mafilon, the more pronounced the de-icing effect.

  4. Design of the Monitoring System of an Automatic Three-dimensional Parking Lot

    Institute of Scientific and Technical Information of China (English)

    刘少军; 张思雨

    2016-01-01

    Aiming at the shortcomings of traditional parking lots, a garage monitoring system is designed with a programmable logic controller (PLC), based on an analysis of the principle of the lifting-and-transferring stereoscopic garage. The monitoring part of the control system uses the industrial control configuration software MCGS on the Windows platform. Automatic, real-time monitoring of the parking lot is achieved through building the configuration-software database, connecting the animation, and writing and debugging the control-flow scripts. Experiments have demonstrated that the system has favourable market prospects and high application value, since it runs smoothly, performs reliably, offers a friendly human-computer interface and has a high level of automation.

  5. Real-time automatic registration in optical surgical navigation

    Science.gov (United States)

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Si, Xuan; Chen, Xiuwen; Wu, Xiaoming

    2016-05-01

    An image-guided surgical navigation system requires the improvement of the patient-to-image registration time to enhance the convenience of the registration procedure. A critical step in achieving this aim is performing a fully automatic patient-to-image registration. This study reports on a design of custom fiducial markers and the performance of a real-time automatic patient-to-image registration method using these markers on the basis of an optical tracking system for rigid anatomy. The custom fiducial markers are designed to be automatically localized in both patient and image spaces. An automatic localization method is performed by registering a point cloud sampled from the three dimensional (3D) pedestal model surface of a fiducial marker to each pedestal of fiducial markers searched in image space. A head phantom is constructed to estimate the performance of the real-time automatic registration method under four fiducial configurations. The head phantom experimental results demonstrate that the real-time automatic registration method is more convenient, rapid, and accurate than the manual method. The time required for each registration is approximately 0.1 s. The automatic localization method precisely localizes the fiducial markers in image space. The averaged target registration error for the four configurations is approximately 0.7 mm. The automatic registration performance is independent of the positions relative to the tracking system and the movement of the patient during the operation.
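
    The patient-to-image step described above is, at its core, a rigid point-set registration between automatically localised fiducials in the two spaces. The sketch below uses the standard Kabsch/SVD solution for the rigid transform and then reports a target registration error on a check point; the fiducial coordinates and noise are synthetic, and the paper's 3D-model-based localisation of the markers is not reproduced here.

```python
"""Rigid fiducial registration (Kabsch/SVD) and target registration error.

Fiducial coordinates and noise are synthetic; the automatic marker
localisation described in the paper is not reproduced here.
"""
import numpy as np

def rigid_register(src, dst):
    """Least-squares rotation R and translation t such that dst ~ R @ src + t."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

rng = np.random.default_rng(2)
fid_img = rng.uniform(0, 100, size=(4, 3))                  # fiducials in image space (mm)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
t_true = np.array([10.0, -5.0, 3.0])
fid_pat = fid_img @ R_true.T + t_true + rng.normal(0, 0.2, (4, 3))  # tracked fiducials

R, t = rigid_register(fid_img, fid_pat)
target_img = np.array([50.0, 50.0, 50.0])                   # a check point, e.g. a lesion
target_pat_true = R_true @ target_img + t_true
tre = np.linalg.norm((R @ target_img + t) - target_pat_true)
print("target registration error: %.2f mm" % tre)
```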

  6. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstract...

  7. Exploring Automatization Processes.

    Science.gov (United States)

    DeKeyser, Robert M.

    1996-01-01

    Presents the rationale for and the results of a pilot study attempting to document in detail how automatization takes place as the result of different kinds of intensive practice. Results show that reaction times and error rates gradually decline with practice, and the practice effect is skill-specific. (36 references) (CK)

  8. On automatic machine translation evaluation

    Directory of Open Access Journals (Sweden)

    Darinka Verdonik

    2013-05-01

    Full Text Available An important task in developing machine translation (MT) is evaluating system performance. Automatic measures are most commonly used for this task, as manual evaluation is time-consuming and costly. However, performing an objective evaluation is not a trivial task. Automatic measures, such as BLEU, TER, NIST, METEOR etc., have their own weaknesses, while manual evaluations are also problematic since they are always to some extent subjective. In this paper we test the influence of the test set on the results of automatic MT evaluation for the subtitling domain. Translating subtitles is a rather specific task for MT, since subtitles are a sort of summarization of spoken text rather than a direct translation of (written) text. An additional problem when translating a language pair that does not include English, in our example Slovene-Serbian, is that the translations are commonly done from English to Serbian and from English to Slovenian, and not directly, since most TV production is originally filmed in English. All this poses additional challenges to MT and consequently to MT evaluation. Automatic evaluation is based on a reference translation, which is usually taken from an existing parallel corpus and marked as a test set. In our experiments, we compare the evaluation results for the same MT system output using three types of test set. In the first round, the test set is 4000 subtitles from the SUMAT parallel corpus of subtitles. These subtitles are not direct translations from Serbian to Slovene or vice versa, but are based on an English original. In the second round, the test set is 1000 subtitles randomly extracted from the first test set and translated anew, from Serbian to Slovenian, based solely on the Serbian written subtitles. In the third round, the test set is the same 1000 subtitles, but this time the Slovene translations were obtained by manually correcting the Slovene MT outputs so that they are correct translations of the
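
    For readers unfamiliar with the automatic measures mentioned, the snippet below computes corpus-level BLEU with NLTK for a toy hypothesis set against one reference per segment, which is essentially the quantity that changes when the reference side of the test set is swapped as in the experiments above. The example sentences are invented, and smoothing is added because the segments are very short.

```python
"""Corpus-level BLEU with NLTK (toy example; sentences are invented)."""
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# One reference per hypothesis; corpus_bleu expects a list of reference lists.
references = [
    [["the", "meeting", "starts", "at", "nine"]],
    [["she", "did", "not", "answer", "the", "phone"]],
]
hypotheses = [
    ["the", "meeting", "begins", "at", "nine"],
    ["she", "didn't", "answer", "the", "phone"],
]

smooth = SmoothingFunction().method1   # avoids zero n-gram counts on short segments
score = corpus_bleu(references, hypotheses, smoothing_function=smooth)
print("BLEU = %.3f" % score)
```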

  9. The performance evaluation of reticulocytes detected by XN-3000 automatic blood analyzer

    Institute of Scientific and Technical Information of China (English)

    周爱国; 苏勇; 陆奎英

    2016-01-01

    Objective: To evaluate the performance of reticulocyte (Ret) detection on the Sysmex XN-3000 automatic blood analyzer. Methods: 96 blood samples submitted for clinical reticulocyte testing were collected at random and analysed both on the XN-3000 automatic blood analyzer and by manual microscopy. The precision, stability, linearity and carry-over of Ret measurements on the XN-3000 were evaluated, and the results of the two methods were compared by correlation analysis. Results: The precision, linear range and sensitivity of Ret measured on the XN-3000 were within the allowable ranges. Stability did not change significantly within 48 h at 4 ℃ (CV < 1%). The carry-over rate was 0.14%. The XN-3000 results agreed well with manual microscopy (r = 0.979). Conclusion: The Sysmex XN-3000 automatic blood analyzer is suitable for clinical testing of large batches of specimens and is an ideal instrument for measuring Ret.

  10. Automaticity and Reading: Perspectives from the Instance Theory of Automatization.

    Science.gov (United States)

    Logan, Gordon D.

    1997-01-01

    Reviews recent literature on automaticity, defining the criteria that distinguish automatic processing from non-automatic processing, and describing modern theories of the underlying mechanisms. Focuses on evidence from studies of reading and draws implications from theory and data for practical issues in teaching reading. Suggests that…

  11. Evaluation of anti-fouling performance for ion-rod water treater with automatic dynamic simulator of fouling

    Institute of Scientific and Technical Information of China (English)

    孙灵芳; 杨善让; 秦裕琨; 徐志明

    2005-01-01

    The application of a novel Automatic Dynamic Simulator of Fouling (ADSF) to evaluate the effectiveness of an ion-rod water treater is reported. The effects of some parameters of the water treater were studied with an ADSF built according to the patented technology, and an orthogonal experimental design was adopted using artificial hard water. The experimental results confirmed that the ion-rod water treater can mitigate fouling and that the anti-fouling efficiency varies with the test conditions. The anti-fouling efficiency of the treater increased with flow velocity in the range of 0.8-1.2 m·s-1 and with output voltage in the range of 7500-15000 V. The efficiency went up initially, and then went down, as hardness increased. A rough ion-rod surface was superior to a smooth one. The order of influence of these factors on treater performance was: water hardness, surface roughness, flow velocity and output voltage. The research also provides guidance for improving the performance of ion-rod water treaters.

  12. High-Resolution Dynamical Downscaling of ERA-Interim Using the WRF Regional Climate Model for the Area of Poland. Part 2: Model Performance with Respect to Automatically Derived Circulation Types

    Science.gov (United States)

    Ojrzyńska, Hanna; Kryza, Maciej; Wałaszek, Kinga; Szymanowski, Mariusz; Werner, Małgorzata; Dore, Anthony J.

    2017-02-01

    This paper presents the application of the high-resolution WRF model data for the automatic classification of the atmospheric circulation types and the evaluation of the model results for daily rainfall and air temperatures. The WRF model evaluation is performed by comparison with measurements and gridded data (E-OBS). The study is focused on the area of Poland and covers the 1981-2010 period, for which the WRF model has been run using three nested domains with spatial resolution of 45 km × 45 km, 15 km × 15 km and 5 km × 5 km. For the model evaluation, we have used the data from the innermost domain, and data from the second domain were used for circulation typology. According to the circulation type analysis, the anticyclonic types (AAD and AAW) are the most frequent. The WRF model is able to reproduce the daily air temperatures and the error statistics are better, compared with the interpolation-based gridded dataset. The high-resolution WRF model shows a higher spatial variability of both air temperature and rainfall, compared with the E-OBS dataset. For the rainfall, the WRF model, in general, overestimates the measured values. The model performance shows a seasonal pattern and is also dependent on the atmospheric circulation type, especially for daily rainfall.

  13. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…

  14. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; De Vries, A.P.; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive manua

  15. Enhancing Automaticity through Task-Based Language Learning

    Science.gov (United States)

    De Ridder, Isabelle; Vangehuchten, Lieve; Gomez, Marta Sesena

    2007-01-01

    In general terms automaticity could be defined as the subconscious condition wherein "we perform a complex series of tasks very quickly and efficiently, without having to think about the various components and subcomponents of action involved" (DeKeyser 2001: 125). For language learning, Segalowitz (2003) characterised automaticity as a…

  16. Remote updating and debugging of multiple FPGAs based on the XVC network protocol

    Institute of Scientific and Technical Information of China (English)

    薛乾; 曾云; 张杰

    2015-01-01

    Background: The silicon pixel detector for synchrotron radiation is a new instrument developed to meet the major technical needs of X-ray detection at the Beijing Advanced Light Source. The whole detector is placed in a radiation environment, and the multiple front-end readout boards are placed in sealed cooling containers, so traditional Universal Serial Bus - Joint Test Action Group (USB-JTAG) cables can no longer be used. Purpose: This study aims to design and implement remote updating and debugging of multiple Field Programmable Gate Arrays (FPGAs) for the detector without opening the cooling container. Methods: A network-connected ARM microcontroller receives configuration files via the network and generates the JTAG sequence for the FPGAs by means of the Xilinx Virtual Cable (XVC) protocol. Results: Remote updating and debugging of multiple FPGAs were realized. Conclusion: This method improves the reliability of remote updating and debugging of FPGAs and is easy to extend with little additional circuitry.
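
    The XVC transport the authors rely on is a small TCP protocol. The client sketch below frames the commands commonly documented for XVC 1.0 ("getinfo:", "shift:"); the exact wire format should be checked against the Xilinx XVC specification before use, and the host address and JTAG vectors here are placeholders, not values from the paper.

```python
"""Minimal XVC (Xilinx Virtual Cable) TCP client sketch.

Frames the getinfo:/shift: commands as commonly documented for XVC 1.0;
verify the wire format against the official specification before use.
Host, port and the JTAG vectors below are placeholders.
"""
import socket
import struct

class XvcClient:
    def __init__(self, host: str, port: int = 2542) -> None:
        self.sock = socket.create_connection((host, port))

    def _recv(self, n: int) -> bytes:
        buf = b""
        while len(buf) < n:
            chunk = self.sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("XVC server closed the connection")
            buf += chunk
        return buf

    def getinfo(self) -> str:
        self.sock.sendall(b"getinfo:")
        reply = b""
        while not reply.endswith(b"\n"):
            reply += self._recv(1)
        return reply.decode()                    # e.g. "xvcServer_v1.0:<max_len>\n"

    def shift(self, num_bits: int, tms: bytes, tdi: bytes) -> bytes:
        """Clock num_bits through the TAP; returns the captured TDO bytes."""
        nbytes = (num_bits + 7) // 8
        assert len(tms) == nbytes and len(tdi) == nbytes
        self.sock.sendall(b"shift:" + struct.pack("<I", num_bits) + tms + tdi)
        return self._recv(nbytes)

if __name__ == "__main__":
    client = XvcClient("192.0.2.10")             # placeholder ARM controller address
    print(client.getinfo())
    # Five TMS ones drive the TAP state machine to Test-Logic-Reset.
    tdo = client.shift(5, tms=b"\x1f", tdi=b"\x00")
    print("TDO bits:", bin(tdo[0]))
```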

  17. ASAM: Automatic architecture synthesis and application mapping

    DEFF Research Database (Denmark)

    Jozwiak, Lech; Lindwer, Menno; Corvino, Rosilde

    2013-01-01

    This paper focuses on mastering the automatic architecture synthesis and application mapping for heterogeneous massively-parallel MPSoCs based on customizable application-specific instruction-set processors (ASIPs). It presents an overview of the research being currently performed in the scope of...

  18. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    2004-01-01

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitte

  19. Automatic visual inspection of hybrid microcircuits

    Energy Technology Data Exchange (ETDEWEB)

    Hines, R.E.

    1980-05-01

    An automatic visual inspection system using a minicomputer and a video digitizer was developed for inspecting hybrid microcircuits (HMC) and thin-film networks (TFN). The system performed well in detecting missing components on HMCs and reduced the testing time for each HMC by 75%.

  20. Performance verification of Roche COBAS6000 automatic electrochemiluminescence immunoassay analyzer

    Institute of Scientific and Technical Information of China (English)

    谭晓辉; 王勇

    2011-01-01

    Objective To verify the performance of the Roche COBAS 6000 automatic electrochemiluminescence immunoassay analyzer according to the requirements of ISO 15189. Methods Verification experiments were performed to measure the precision, accuracy, clinical reportable range (CRR), analytical measurement range (AMR) and reference interval for alpha-fetoprotein (AFP). Results The high and low values of the within-run coefficient of variation (CV) were 3.28% and 3.46%, and those of between-day precision were 4.39% and 5.13%, all less than the CV claimed by the manufacturer (10%). The relative bias was 0.862%, less than 5%. The analytical measurement range was 0.80-1 200 ng/ml, the reference interval was 0-20.00 ng/ml, and the clinical reportable range was 0-60 000 ng/ml. Conclusion The basic performance of the Roche COBAS 6000 automatic electrochemiluminescence immunoassay analyzer is consistent with the data provided by the manufacturer, so it can be used for testing clinical samples and the results are credible.

  1. Automatic cell counting with ImageJ.

    Science.gov (United States)

    Grishagin, Ivan V

    2015-03-15

    Cell counting is an important routine procedure. However, to date there is no comprehensive, easy to use, and inexpensive solution for routine cell counting, and this procedure usually needs to be performed manually. Here, we report a complete solution for automatic cell counting in which a conventional light microscope is equipped with a web camera to obtain images of a suspension of mammalian cells in a hemocytometer assembly. Based on the ImageJ toolbox, we devised two algorithms to automatically count these cells. This approach is approximately 10 times faster and yields more reliable and consistent results compared with manual counting.
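
    The counting step described above (threshold the image, clean it up, label connected components) maps onto a few calls in scikit-image. The sketch below is an analogous Python pipeline rather than the authors' ImageJ-based algorithms, and the image path and minimum object size are assumptions.

```python
"""Cell counting by threshold + connected-component labelling (scikit-image).

An analogue of the ImageJ-based pipeline in the paper, not the authors'
algorithms; the image path and minimum object size are assumptions.
"""
from skimage import io, filters, measure, morphology

def count_cells(path: str, min_area: int = 50) -> int:
    image = io.imread(path, as_gray=True)
    # Otsu threshold separates cells from background; invert if cells are dark.
    mask = image > filters.threshold_otsu(image)
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    mask = morphology.binary_closing(mask)
    labels = measure.label(mask)
    return labels.max()            # number of connected components found

if __name__ == "__main__":
    print("cells counted:", count_cells("hemocytometer_frame.png"))   # placeholder file
```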

  2. Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

    CERN Multimedia

    1969-01-01

    Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

  3. Discussion on Static Debug Station Setting Mode in Medium Repair Depot of Urban Rail Transit

    Institute of Scientific and Technical Information of China (English)

    邱建平

    2013-01-01

    The content of static commissioning (static debugging) work for urban rail vehicles and the standards for its site are introduced, and three ways of arranging static-debug stands in a medium-repair depot are discussed and analysed. The conclusion is that the static-debug shed of a medium-repair depot should preferably be set up independently; when the biweekly/three-month inspection shed has ample spare capacity, the static-debug stands may share that shed and its tracks; and it is not recommended to place them in the same shed as the medium and unscheduled repair facility.

  4. Performance evaluation of an automatic anatomy segmentation algorithm on repeat or four-dimensional CT images using a deformable image registration method

    Science.gov (United States)

    Wang, He; Garden, Adam S.; Zhang, Lifei; Wei, Xiong; Ahamad, Anesa; Kuban, Deborah A.; Komaki, Ritsuko; O’Daniel, Jennifer; Zhang, Yongbin; Mohan, Radhe; Dong, Lei

    2008-01-01

    Purpose Auto-propagation of anatomical regions of interest (ROIs) from the planning CT to daily CT is an essential step in image-guided adaptive radiotherapy. The goal of this study was to quantitatively evaluate the performance of the algorithm in typical clinical applications. Method and Materials We previously adopted an image intensity-based deformable registration algorithm to find the correspondence between two images. In this study, the ROIs delineated on the planning CT image were mapped onto daily CT or four-dimensional (4D) CT images using the same transformation. Post-processing methods, such as boundary smoothing and modification, were used to enhance the robustness of the algorithm. Auto-propagated contours for eight head-and-neck patients with a total of 100 repeat CTs, one prostate patient with 24 repeat CTs, and nine lung cancer patients with a total of 90 4D-CT images were evaluated against physician-drawn contours and physician-modified deformed contours using the volume-overlap-index (VOI) and mean absolute surface-to-surface distance (ASSD). Results The deformed contours were reasonably well matched with daily anatomy on repeat CT images. The VOI and mean ASSD were 83% and 1.3 mm when compared to the independently drawn contours. A better agreement (greater than 97% and less than 0.4 mm) was achieved if the physician was only asked to correct the deformed contours. The algorithm was robust in the presence of random noise in the image. Conclusion The deformable algorithm may be an effective method to propagate the planning ROIs to subsequent CT images of changed anatomy, although a final review by physicians is highly recommended. PMID:18722272
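
    The two evaluation metrics used above can be computed directly from binary masks. In the sketch below the volume overlap index is implemented as the Dice-style overlap 2|A∩B|/(|A|+|B|) and the mean absolute surface-to-surface distance via distance transforms; whether these exact definitions match the paper's is an assumption, and the spherical test masks are synthetic.

```python
"""Volume overlap index and mean surface distance from binary masks.

The Dice-style overlap and the distance-transform ASSD below are common
definitions assumed to match the paper's metrics; the masks are synthetic.
"""
import numpy as np
from scipy import ndimage

def volume_overlap_index(a, b):
    """2|A∩B| / (|A| + |B|)  (Dice-style overlap, in %)."""
    return 200.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def mean_surface_distance(a, b, spacing=(1.0, 1.0, 1.0)):
    """Symmetric mean absolute surface-to-surface distance (mm)."""
    surf_a = a ^ ndimage.binary_erosion(a)
    surf_b = b ^ ndimage.binary_erosion(b)
    dt_a = ndimage.distance_transform_edt(~surf_a, sampling=spacing)
    dt_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
    return 0.5 * (dt_b[surf_a].mean() + dt_a[surf_b].mean())

# Two synthetic "ROIs": spheres of equal radius, one shifted by two voxels.
z, y, x = np.ogrid[:64, :64, :64]
roi_a = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 <= 15 ** 2
roi_b = (z - 32) ** 2 + (y - 30) ** 2 + (x - 32) ** 2 <= 15 ** 2

print("VOI  %.1f %%" % volume_overlap_index(roi_a, roi_b))
print("ASSD %.2f mm" % mean_surface_distance(roi_a, roi_b))
```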

  5. Minimum Hamiltonian Ascent Trajectory Evaluation (MASTRE) program (update to automatic flight trajectory design, performance prediction, and vehicle sizing for support of Shuttle and Shuttle derived vehicles) engineering manual

    Science.gov (United States)

    Lyons, J. T.

    1993-01-01

    The Minimum Hamiltonian Ascent Trajectory Evaluation (MASTRE) program and its predecessors, the ROBOT and the RAGMOP programs, have had a long history of supporting MSFC in the simulation of space boosters for the purpose of performance evaluation. The ROBOT program was used in the simulation of the Saturn 1B and Saturn 5 vehicles in the 1960's and provided the first utilization of the minimum Hamiltonian (or min-H) methodology and the steepest ascent technique to solve the optimum trajectory problem. The advent of the Space Shuttle in the 1970's and its complex airplane design required a redesign of the trajectory simulation code since aerodynamic flight and controllability were required for proper simulation. The RAGMOP program was the first attempt to incorporate the complex equations of the Space Shuttle into an optimization tool by using an optimization method based on steepest ascent techniques (but without the min-H methodology). Development of the complex partial derivatives associated with the Space Shuttle configuration and using techniques from the RAGMOP program, the ROBOT program was redesigned to incorporate these additional complexities. This redesign created the MASTRE program, which was referred to as the Minimum Hamiltonian Ascent Shuttle TRajectory Evaluation program at that time. Unique to this program were first-stage (or booster) nonlinear aerodynamics, upper-stage linear aerodynamics, engine control via moment balance, liquid and solid thrust forces, variable liquid throttling to maintain constant acceleration limits, and a total upgrade of the equations used in the forward and backward integration segments of the program. This modification of the MASTRE code has been used to simulate the new space vehicles associated with the National Launch Systems (NLS). Although not as complicated as the Space Shuttle, the simulation and analysis of the NLS vehicles required additional modifications to the MASTRE program in the areas of providing

  6. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychological... aspects of the construct, such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty, and..., respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses.

  7. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    Medical ultrasound has been a widely used imaging modality in healthcare platforms for examination, diagnostic purposes, and for real-time guidance during surgery. However, despite the recent advances, medical ultrasound remains the most operator-dependent imaging modality, as it heavily relies...... on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  8. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  9. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects the face in a stored video frame using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
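
    The classification stage described (shape and location features fed to an SVM) reduces to a few scikit-learn calls. The feature vectors below are random stand-ins for the real facial measurements, so the snippet only shows the wiring of such a classifier, not the reported performance.

```python
"""SVM classification of facial features for pain vs. no-pain (wiring only).

Feature vectors are random stand-ins for the real shape/location features,
so only the pipeline structure is meaningful here.
"""
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 20))            # 20 shape/location features per frame
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 400) > 0).astype(int)  # 1 = pain

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy: %.2f" % clf.score(X_te, y_te))
```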

  10. Automatic Phonetic Transcription for Danish Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    Automatic speech recognition (ASR) uses dictionaries that map orthographic words to their phonetic representation. To minimize the occurrence of out-of-vocabulary words, ASR requires large phonetic dictionaries to model pronunciation. Hand-crafted high-quality phonetic dictionaries are difficult...... of automatic phonetic transcriptions vary greatly with respect to language and transcription strategy. For some languages, where the difference between the graphemic and phonetic representations is small, graphemic transcriptions can be used to create ASR systems with acceptable performance. In other languages......, like Danish, the graphemic and phonetic representations are very dissimilar and more complex rewriting rules must be applied to create the correct phonetic representation. Automatic phonetic transcribers use different strategies, from deep analysis to shallow rewriting rules, to produce phonetic...

  11. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television optical type automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross seam actuator digital drive motor controller to complete the closed loop, feedback, tracking system. This weld seam tracking control system is capable of a tracking accuracy of + or - 0.2 mm, or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.
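
    The tracking computation described above reduces, for each digitized scan line, to locating the seam in the pixel row and converting its offset from the torch centreline into a cross-seam correction. The sketch below only illustrates that idea; the pixel data, resolution and gain values are hypothetical, not the system's actual firmware.

        # Sketch: derive a cross-seam correction from one digitized scan line of 8-bit pixels.
        # Pixel pitch and loop gain are made-up values for illustration.
        import numpy as np

        PIXEL_PITCH_MM = 0.05      # assumed width of one pixel on the workpiece
        KP = 0.8                   # assumed proportional gain of the correction loop

        def seam_position(scan_line):
            """Pixel index of the seam, taken here as the darkest gap in the smoothed line."""
            smoothed = np.convolve(scan_line, np.ones(5) / 5, mode="same")
            return int(np.argmin(smoothed))

        def cross_seam_correction(scan_line):
            """Signed correction (mm) between detected seam and the torch centreline."""
            centre = len(scan_line) // 2
            error_px = seam_position(scan_line) - centre
            return KP * error_px * PIXEL_PITCH_MM

        # Synthetic scan line: bright plate with a dark seam slightly right of centre.
        line = np.full(256, 200, dtype=float)
        line[140:146] = 20
        print("correction: %+.2f mm" % cross_seam_correction(line))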

  12. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  13. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  14. Automatization of lexicographic work

    Directory of Open Access Journals (Sweden)

    Iztok Kosem

    2013-12-01

    Full Text Available A new approach to lexicographic work, in which the lexicographer is seen more as a validator of the choices made by computer, was recently envisaged by Rundell and Kilgarriff (2011). In this paper, we describe an experiment using such an approach during the creation of the Slovene Lexical Database (Gantar, Krek, 2011). The corpus data, i.e. grammatical relations, collocations, examples, and grammatical labels, were automatically extracted from the 1.18-billion-word Gigafida corpus of Slovene. The evaluation of the extracted data consisted of making a comparison between the time spent writing a manual entry and a (semi-)automatic entry, and identifying potential improvements in the extraction algorithm and in the presentation of data. An important finding was that the automatic approach was far more effective than the manual approach, without any significant loss of information. Based on our experience, we would propose a slightly revised version of the approach envisaged by Rundell and Kilgarriff in which the validation of data is left to lower-level linguists or crowd-sourcing, whereas high-level tasks such as meaning description remain the domain of lexicographers. Such an approach indeed reduces the scope of the lexicographer's work, however it also results in the ability of bringing the content to the users more quickly.

  15. Considering the Fault Dependency Concept with Debugging Time Lag in Software Reliability Growth Modeling Using a Power Function of Testing Time

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Since the early 1970s, tremendous growth has been seen in the research of software reliability growth modeling. In general, software reliability growth models (SRGMs) are applicable to the late stages of testing in software development and they can provide useful information about how to improve the reliability of software products. A number of SRGMs have been proposed in the literature to represent the time-dependent fault identification/removal phenomenon; still new models are being proposed that could fit a greater number of reliability growth curves. Often, it is assumed that detected faults are immediately corrected when mathematical models are developed. This assumption may not be realistic in practice because the time to remove a detected fault depends on the complexity of the fault, the skill and experience of the personnel, the size of the debugging team, the technique, and so on. Thus, the detected fault need not be immediately removed, and it may lag the fault detection process by a delay effect factor. In this paper, we first review how different software reliability growth models have been developed, where the fault detection process is dependent not only on the residual fault content but also on the testing time, and see how these models can be reinterpreted as delayed fault detection models by using a delay effect factor. Based on the power function of the testing time concept, we propose four new SRGMs that assume the presence of two types of faults in the software: leading and dependent faults. Leading faults are those that can be removed upon a failure being observed. However, dependent faults are masked by leading faults and can only be removed after the corresponding leading fault has been removed, with a debugging time lag. These models have been tested on real software error data to show their goodness of fit, predictive validity and applicability.
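
    As a worked illustration of how a debugging time lag enters the mean value function of an SRGM (a generic delayed-detection example, not one of the four models proposed in the paper), the widely used delayed S-shaped form adds a factor (1 + bt) to the plain exponential model, accounting for the interval between detecting a fault and actually removing it:

        % LaTeX sketch: exponential (immediate removal) vs. delayed S-shaped (removal lags detection)
        m_{\mathrm{exp}}(t) = a\left(1 - e^{-bt}\right), \qquad
        m_{\mathrm{delayed}}(t) = a\left[1 - (1 + bt)\,e^{-bt}\right]

    Here a is the expected total fault content, b the fault detection rate, and m(t) the expected number of faults removed by testing time t; the extra (1 + bt) term slows the early growth of m(t), mirroring the debugging time lag discussed above.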

  16. A General Method for Module Automatic Testing in Avionics Systems

    Directory of Open Access Journals (Sweden)

    Li Ma

    2013-05-01

    Full Text Available The traditional Automatic Test Equipment (ATE) systems are insufficient to cope with the challenges of testing more and more complex avionics systems. In this study, we propose a general method for module automatic testing in an avionics test platform based on the PXI bus. We apply virtual instrument technology to realize the automatic testing and the fault reporting of signal performance. Taking the avionics bus ARINC429 as an example, we introduce the architecture of the automatic test system as well as the implementation of the algorithms in LabVIEW. The comprehensive experiments show the proposed method can effectively accomplish the automatic testing and fault reporting of signal performance. It greatly improves the generality and reliability of ATE in avionics systems.

  17. On automatic visual inspection of reflective surfaces

    DEFF Research Database (Denmark)

    Kulmann, Lionel

    1995-01-01

    This thesis describes different methods to perform automatic visual inspection of reflective manufactured products, with the aim of increasing productivity, reducing cost and improving the quality level of the production. We investigate two different systems performing automatic visual inspection...... surveyed visual inspection system design methods and presented available image processing hardware to perform high resolution image capture. We present general usable practical visual inspection system solutions, when performing high resolution visual inspection of surfaces. We have presented known and new...... in algorithms for detecting 3-dimensional surface damages based on images from a novel structured lighting setup enhancing the appearance of these defects in specular surfaces. A hardware implementable polynomial classifier structure has been described and compared to better known techniques based...

  18. Analysis on Performance and Fault Relation for Automatic Pressure Keeping Weight Type Hydraulic Control Butterfly Valve

    Institute of Scientific and Technical Information of China (English)

    陈培兴; 狄翠霞

    2012-01-01

    Because two-stage closing is adopted in the automatic pressure keeping weight type hydraulic control butterfly valve, it differs from common valves. The main faults that may occur in the butterfly valve and their causes are briefly introduced. A concrete example then shows how to remove faults according to changes in, and the mutual relations among, the main performance indexes of the two-stage hydraulic control butterfly valve, namely the opening time, the fast-closing time and the slow-closing time.

  19. Performance evaluation of an automatic system for tomato fertigation control in substrate

    Directory of Open Access Journals (Sweden)

    Antonio J. Steidle Neto

    2009-09-01

    Full Text Available The objective of this work was to evaluate the performance of an automatic fertigation control system for tomato production in sand substrate, as compared to a conventional control system, with respect to the reduction of nutrient solution consumption. In the automatic control method, fertigation events were established as a function of the meteorological conditions in the cultivation environment and the crop development stage; the Penman-Monteith model was used as a decision support tool regarding the appropriate frequency for applying the nutrient solution. In the conventional control system, the intervals between fertigations remained fixed throughout the tomato crop cycle. The results showed that the automatic control system fully met the water requirements of the crop without compromising tomato production, providing substantial reductions in nutrient solution consumption. On the other hand, the conventional control system performed an excessive number of fertigations, mainly during the initial development stage of the tomato crop and on days characterized by heavy cloudiness. In the initial growth stage, the total volumes of nutrient solution applied to the tomato crop by the conventional system exceeded the crop water requirements by 1.31 and 1.39 L plant-1 on typical clear-sky and cloudy days, respectively.

  20. Automatization and working memory capacity in schizophrenia.

    Science.gov (United States)

    van Raalten, Tamar R; Ramsey, Nick F; Jansma, J Martijn; Jager, Gerry; Kahn, René S

    2008-03-01

    Working memory (WM) dysfunction in schizophrenia is characterized by inefficient WM recruitment and reduced capacity, but it is not yet clear how these relate to one another. In controls practice of certain cognitive tasks induces automatization, which is associated with reduced WM recruitment and increased capacity of concurrent task performance. We therefore investigated whether inefficient function and reduced capacity in schizophrenia was associated with a failure in automatization. FMRI data was acquired with a verbal WM task with novel and practiced stimuli in 18 schizophrenia patients and 18 controls. Participants performed a dual-task outside the scanner to test WM capacity. Patients showed intact performance on the WM task, which was paralleled by excessive WM activity. Practice improved performance and reduced WM activity in both groups. The difference in WM activity after practice predicted performance cost in controls but not in patients. In addition, patients showed disproportionately poor dual-task performance compared to controls, especially when processing information that required continuous adjustment in WM. Our findings support the notion of inefficient WM function and reduced capacity in schizophrenia. This was not related to a failure in automatization, but was evident when processing continuously changing information. This suggests that inefficient WM function and reduced capacity may be related to an inability to process information requiring frequent updating.

  1. Automatic device for maintenance on safety relief valve

    Energy Technology Data Exchange (ETDEWEB)

    Fujio, M. [Okano Valve Mfg. Co. Ltd., Kitakyushu, Fukuoka (Japan)]

    1996-02-01

    This system offers shorter, labor-saving periodic inspection of nuclear power plants, particularly for the maintenance of safety relief valves in BWR plants. This automatic device has the following performance features. (author).

  2. Automatic Configuration in NTP

    Institute of Scientific and Technical Information of China (English)

    Jiang Zongli(蒋宗礼); Xu Binbin

    2003-01-01

    NTP is nowadays the most widely used distributed network time protocol, which aims at synchronizing the clocks of computers in a network and keeping the accuracy and validity of the time information transmitted in the network. Without an automatic configuration mechanism, the stability and flexibility of a synchronization network built upon the NTP protocol are not satisfying. P2P's resource discovery mechanism is used to look for time sources in a synchronization network, and according to the network environment and node quality, the synchronization network is constructed dynamically.

  3. Comparison of automatic control systems

    Science.gov (United States)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  4. Hardware Design of an Onchip Debug System for 8051 MCU

    Institute of Scientific and Technical Information of China (English)

    肖哲靖; 徐静平; 雷青松; 钟德刚

    2011-01-01

    This paper presents the hardware design of an on-chip debug system for an industrial 8051 MCU, developed with an ASIC design flow so that all debug functions are integrated into the chip itself. The debug system not only can make the 8051 halt, run, single-step or skip an instruction, but also can read and write all registers, internal and external program memories, data memories and SFRs, and set hardware breakpoints in them. A three-wire interface, which takes less space than the standard JTAG interface, is used by the debug system to connect to the host computer. The design was functionally verified on a Xilinx xc3s400 FPGA, and place-and-route was completed with the SMIC 0.18 μm technology library. The results show that the system effectively avoids the disadvantages of traditional debug methods based on software simulation or external emulators, saves users the cost of a commercial emulator, and improves debugging efficiency. The proposed design method is also applicable to other microcontrollers.

  5. Automatic Table Boundary Detection and Performance Evaluation in Fixed-Layout Documents

    Institute of Scientific and Technical Information of China (English)

    房婧; 高良才; 仇睿恒; 汤帜

    2013-01-01

    The authors propose a novel and effective table boundary detection method based on visual separators and geometric content layout information, which is effective for both Chinese and English documents. Additionally, due to the lack of an automatic evaluation system for table boundary detection, the authors also provide a publicly available large-scale dataset, composed of equal numbers of Chinese and English pages with annotated ground truth, and propose mobile-reading-oriented performance measurements. Evaluation and comparison with two other open-source table boundary detection projects demonstrate the effectiveness of the proposed method and the practicality of the evaluation suite, with particularly good results on the Chinese dataset.

  6. Design of Small-scale Automatic Handling System

    Institute of Scientific and Technical Information of China (English)

    张嘉睿; 葛垚

    2012-01-01

    In order to solve the problems of complex structure and high cost in existing handling systems, this paper proposes a design scheme for a small-scale automatic handling system. The system uses supports and ropes to handle goods under a multi-computer network control scheme, and uses an interpolation algorithm to preset or optimize the handling paths. The system realizes the functions of sorting into storage, palletizing and distribution out of storage through manual, semi-automatic or automatic modes. Debugging and running results show that the system realizes the intended functions well.

  7. What is automatized during perceptual categorization?

    Science.gov (United States)

    Roeder, Jessica L; Ashby, F Gregory

    2016-09-01

    An experiment is described that tested whether stimulus-response associations or an abstract rule are automatized during extensive practice at perceptual categorization. Twenty-seven participants each completed 12,300 trials of perceptual categorization, either on rule-based (RB) categories that could be learned explicitly or information-integration (II) categories that required procedural learning. Each participant practiced predominantly on a primary category structure, but every third session they switched to a secondary structure that used the same stimuli and responses. Half the stimuli retained their same response on the primary and secondary categories (the congruent stimuli) and half switched responses (the incongruent stimuli). Several results stood out. First, performance on the primary categories met the standard criteria of automaticity by the end of training. Second, for the primary categories in the RB condition, accuracy and response time (RT) were identical on congruent and incongruent stimuli. In contrast, for the primary II categories, accuracy was higher and RT was lower for congruent than for incongruent stimuli. These results are consistent with the hypothesis that rules are automatized in RB tasks, whereas stimulus-response associations are automatized in II tasks. A cognitive neuroscience theory is proposed that accounts for these results.

  8. A Statistical Approach to Automatic Speech Summarization

    Science.gov (United States)

    Hori, Chiori; Furui, Sadaoki; Malkin, Rob; Yu, Hua; Waibel, Alex

    2003-12-01

    This paper proposes a statistical approach to automatic speech summarization. In our method, a set of words maximizing a summarization score indicating the appropriateness of summarization is extracted from automatically transcribed speech and then concatenated to create a summary. The extraction process is performed using a dynamic programming (DP) technique based on a target compression ratio. In this paper, we demonstrate how an English news broadcast transcribed by a speech recognizer is automatically summarized. We adapted our method, which was originally proposed for Japanese, to English by modifying the model for estimating word concatenation probabilities based on a dependency structure in the original speech given by a stochastic dependency context free grammar (SDCFG). We also propose a method of summarizing multiple utterances using a two-level DP technique. The automatically summarized sentences are evaluated by summarization accuracy based on a comparison with a manual summary of speech that has been correctly transcribed by human subjects. Our experimental results indicate that the method we propose can effectively extract relatively important information and remove redundant and irrelevant information from English news broadcasts.
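
    A toy sketch of the extraction step described above, assuming each word already carries a significance score and each word pair a concatenation score (random placeholders here, not the paper's linguistic or SDCFG-based scores). Dynamic programming then picks, in order, the fixed-ratio subset of words with the highest combined score.

        # Sketch: DP extraction of a fixed-ratio word subset maximizing significance plus
        # word-pair concatenation scores (toy scores; not the authors' scoring model).
        import numpy as np

        words = "the minister said the new budget will cut taxes next year".split()
        rng = np.random.default_rng(1)
        sig = rng.random(len(words))                    # per-word significance (placeholder)
        concat = rng.random((len(words), len(words)))   # word-pair concatenation score (placeholder)

        def summarize(words, sig, concat, ratio=0.5):
            n, k = len(words), max(1, int(len(words) * ratio))
            dp = np.full((n, k + 1), -np.inf)   # dp[i][j]: best score with j words chosen, last one i
            back = np.full((n, k + 1), -1, dtype=int)
            dp[:, 1] = sig                      # every single word is a length-1 summary
            for j in range(2, k + 1):
                for i in range(j - 1, n):
                    for p in range(j - 2, i):
                        cand = dp[p][j - 1] + sig[i] + concat[p][i]
                        if cand > dp[i][j]:
                            dp[i][j], back[i][j] = cand, p
            i = int(np.argmax(dp[:, k]))        # best ending word of a length-k summary
            chosen = []
            for j in range(k, 0, -1):
                chosen.append(i)
                i = back[i][j]
            return [words[idx] for idx in reversed(chosen)]

        print(" ".join(summarize(words, sig, concat)))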

  9. MARZ: Manual and automatic redshifting software

    Science.gov (United States)

    Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.

    2016-04-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application MARZ with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based, Javascript web-application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph to produce high quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard is not possible. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can be easily redshifted manually by cycling automatic results, manual template comparison, or marking spectral features.
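
    The matching step rests on cross-correlating the observed spectrum against redshifted templates and adopting the trial redshift that maximizes the correlation. The sketch below illustrates only that principle with a single synthetic emission-line template; it is not the modified AUTOZ algorithm, and the wavelength grid, line list and noise level are invented for the example.

        # Sketch: redshift estimation by correlating an observed spectrum with a template
        # over a grid of trial redshifts (toy spectra; not the AUTOZ/MARZ pipeline).
        import numpy as np

        def make_spectrum(wavelengths, line_centres, width=4.0):
            """Sum of Gaussian emission lines at the given (rest or observed) wavelengths."""
            flux = np.zeros_like(wavelengths)
            for c in line_centres:
                flux += np.exp(-0.5 * ((wavelengths - c) / width) ** 2)
            return flux

        wave = np.linspace(3500.0, 9000.0, 4000)          # observed wavelength grid (Angstrom)
        rest_lines = np.array([3727.0, 4861.0, 5007.0])   # e.g. [OII], H-beta, [OIII]
        true_z = 0.42
        observed = make_spectrum(wave, rest_lines * (1 + true_z))
        observed += np.random.default_rng(2).normal(0.0, 0.05, wave.size)   # add noise

        trial_z = np.linspace(0.0, 1.0, 2001)
        scores = []
        for z in trial_z:
            template = make_spectrum(wave, rest_lines * (1 + z))
            norm = np.linalg.norm(observed) * np.linalg.norm(template) + 1e-12
            scores.append(np.dot(observed, template) / norm)   # normalized correlation
        best_z = trial_z[int(np.argmax(scores))]
        print("true z = %.3f, estimated z = %.3f" % (true_z, best_z))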

  10. Automatic Fixture Planning

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Fixture planning is a crucial problem in the field of fixture design. In this paper, the research scope and research methods of computer-aided fixture planning are presented. Based on the positioning principles of typical workparts, an ANN algorithm, namely the Hopfield algorithm, is adopted for automatic fixture planning. This paper also presents in-depth research into the selection of positioning and clamping surfaces (or points) on workparts, using positioning-clamping-surface-selecting rules and matrix evaluation of deterministic workpart positioning. At the end of the paper, methods to select positioning and clamping elements from a database and a layout algorithm to assemble the selected fixture elements into a tangible fixture are developed.

  11. The performance evaluation of the automatic analyser based on the "Medical standard of the People's Republic of China"

    Institute of Scientific and Technical Information of China (English)

    阳苹; 张莉萍; 毕小云; 邓小玲; 肖勤; 陈维蓓

    2011-01-01

    Objective: To evaluate the performance of the Roche Modular DDP according to the "Medical standard of the People's Republic of China - automatic chemistry analyzer" administered by the State Food and Drug Administration (SFDA). Methods: The stray light, absorbance linear range, absorbance accuracy, absorbance stability, absorbance reproducibility, sample carryover rate, sampling accuracy and sampling reproducibility of the Roche Modular DDP were evaluated by repeated measurement of standard substances, using standard solutions calibrated by the National Institute of Metrology, according to the requirements of the standard. Results: The highest stray light absorbance was more than 23 000; the relative variation was less than 5 % when the absorbance was no less than 32 000; when the absorbance was 5 000 and 10 000, the errors were ±300 and ±700, respectively; the highest value minus the lowest value was less than 100; the CV was less than 1.5 % with the smallest reaction volume; the carryover rate was less than 0.5 %; the absorbance of the CHKS was in the defined range and its CV was less than 1.5 %; the absorbances of the CHKR1 and the CHKR2 were both in the defined ranges and their CVs were less than 0.5 % and less than 1.0 %, respectively. Conclusion: The performance indexes of the Roche Modular DDP meet the requirements of the "Medical standard of the People's Republic of China - automatic chemistry analyzer".

  12. Application of PLC Communication in Automatic Assembly Line

    Institute of Scientific and Technical Information of China (English)

    陈英

    2013-01-01

    The Siemens S7-200 PLC is selected as the main controller of the automatic assembly line; its integrated RS485 communication port provides PPI network communication. Based on this network communication capability, the hardware connection and debugging of the PPI network and the setting and debugging of the PPI network parameters are realized, as well as the writing and debugging of data read/write programs for a complex PPI network composed of multiple Siemens S7-200 PLCs.

  13. Fast Automatic Heuristic Construction Using Active Learning

    OpenAIRE

    Ogilvie, William; Petoumenos, Pavlos; Wang, Zheng; Leather, Hugh

    2015-01-01

    Building effective optimization heuristics is a challenging task which often takes developers several months if not years to complete. Predictive modelling has recently emerged as a promising solution, automatically constructing heuristics from training data. However, obtaining this data can take months per platform. This is becoming an ever more critical problem and if no solution is found we shall be left with out of date heuristics which cannot extract the best performance from modern mach...

  14. Automatic Algorithm Selection for Complex Simulation Problems

    CERN Document Server

    Ewald, Roland

    2012-01-01

    To select the most suitable simulation algorithm for a given task is often difficult. This is due to intricate interactions between model features, implementation details, and runtime environment, which may strongly affect the overall performance. An automated selection of simulation algorithms supports users in setting up simulation experiments without demanding expert knowledge on simulation. Roland Ewald analyzes and discusses existing approaches to solve the algorithm selection problem in the context of simulation. He introduces a framework for automatic simulation algorithm selection and

  15. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Science.gov (United States)

    Butler, Emily E; Ward, Robert; Ramsey, Richard

    2015-01-01

    Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243), the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness) and with disorders of social cognition (autistic-like and schizotypal traits). Further personality traits (narcissism and empathy) were assessed in a subsample of participants (N=57). Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.

  16. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Directory of Open Access Journals (Sweden)

    Emily E Butler

    Full Text Available Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243), the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness) and with disorders of social cognition (autistic-like and schizotypal traits). Further personality traits (narcissism and empathy) were assessed in a subsample of participants (N=57). Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.

  17. Automatic Weather Station (AWS) Lidar

    Science.gov (United States)

    Rall, Jonathan A.R.; Abshire, James B.; Spinhirne, James D.; Smith, David E. (Technical Monitor)

    2000-01-01

    An autonomous, low-power atmospheric lidar instrument is being developed at NASA Goddard Space Flight Center. This compact, portable lidar will operate continuously in a temperature controlled enclosure, charge its own batteries through a combination of a small rugged wind generator and solar panels, and transmit its data from remote locations to ground stations via satellite. A network of these instruments will be established by co-locating them at remote Automatic Weather Station (AWS) sites in Antarctica under the auspices of the National Science Foundation (NSF). The NSF Office of Polar Programs provides support to place the weather stations in remote areas of Antarctica in support of meteorological research and operations. The AWS meteorological data will directly benefit the analysis of the lidar data while a network of ground based atmospheric lidar will provide knowledge regarding the temporal evolution and spatial extent of Type Ia polar stratospheric clouds (PSC). These clouds play a crucial role in the annual austral springtime destruction of stratospheric ozone over Antarctica, i.e. the ozone hole. In addition, the lidar will monitor and record the general atmospheric conditions (transmission and backscatter) of the overlying atmosphere which will benefit the Geoscience Laser Altimeter System (GLAS). Prototype lidar instruments have been deployed to the Amundsen-Scott South Pole Station (1995-96, 2000) and to an Automated Geophysical Observatory site (AGO 1) in January 1999. We report on data acquired with these instruments, instrument performance, and anticipated performance of the AWS Lidar.

  18. Image feature meaning for automatic key-frame extraction

    Science.gov (United States)

    Di Lecce, Vincenzo; Guerriero, Andrea

    2003-12-01

    Video abstraction and summarization, being required in several applications, have directed a number of research efforts to automatic video analysis techniques. The processes for automatic video analysis are based on the recognition of short sequences of contiguous frames that describe the same scene (shots), and key frames representing the salient content of each shot. Since effective shot boundary detection techniques exist in the literature, in this paper we focus our attention on key frame extraction techniques, to identify the low-level visual features of the frames that best represent the shot content. To evaluate the performance of the features, key frames automatically extracted using these features are compared to human operator video annotations.
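
    As a simple illustration of key frame selection driven by low-level visual features (a common baseline, not the specific features evaluated in the paper), the sketch below keeps a frame whenever its grey-level histogram differs sufficiently from that of the last selected key frame; the synthetic frames and the threshold are placeholders.

        # Sketch: key-frame selection within a shot by grey-level histogram change
        # (toy frames and threshold; a baseline illustration only).
        import numpy as np

        def histogram(frame, bins=16):
            """Normalized grey-level histogram used as a low-level visual feature."""
            h, _ = np.histogram(frame, bins=bins, range=(0, 256))
            return h / h.sum()

        def select_key_frames(frames, threshold=0.25):
            keys = [0]                                   # always keep the first frame of the shot
            last = histogram(frames[0])
            for i, frame in enumerate(frames[1:], start=1):
                h = histogram(frame)
                if np.abs(h - last).sum() > threshold:   # L1 distance between histograms
                    keys.append(i)
                    last = h
            return keys

        # Synthetic shot: 30 frames with an appearance change halfway through.
        rng = np.random.default_rng(3)
        frames = [rng.integers(80, 120, size=(48, 64)) for _ in range(15)]
        frames += [rng.integers(150, 200, size=(48, 64)) for _ in range(15)]
        print("key frames at indices:", select_key_frames(frames))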

  19. A Reversible Debugging Method Based on Virtual Machine Logging and Replaying

    Institute of Scientific and Technical Information of China (English)

    邵腾刚; 张俊飞

    2011-01-01

    When debugging a program, a traditional debugger can only make the program execute forward and obtain its current state. This paper proposes a debugging method that allows a program to execute in reverse, back to any moment in the past, thereby enhancing the capabilities of the debugger. The method is implemented by adding complete logging and replaying functions to the Xen virtual machine and modifying the GDB debugger accordingly; the debugged target can be restored to any moment of its execution. The reversible debugger implemented in this paper can address the difficulty of debugging large-scale software and operating system kernels, and greatly accelerates development progress.
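
    The core idea (log every nondeterministic input while the target runs, then reproduce any earlier state by deterministically replaying the log up to the chosen point) can be sketched at a much smaller scale than a Xen/GDB implementation. The toy program, log format and step function below are illustrative only.

        # Sketch: deterministic record/replay as the basis of reverse debugging.
        # Record: log each nondeterministic input. "Travel back": replay the log up to step k.
        # (A real system such as the Xen-based one records at the virtual machine level.)
        import random

        def step(state, nondet_input):
            """One deterministic step of the toy 'program' given its nondeterministic input."""
            return state * 2 + nondet_input

        def record(n_steps, seed=42):
            rng = random.Random(seed)
            state, log = 0, []
            for _ in range(n_steps):
                value = rng.randint(0, 9)    # nondeterministic input (e.g. I/O, interrupt data)
                log.append(value)            # logged so the run can be reproduced exactly
                state = step(state, value)
            return state, log

        def replay(log, upto):
            """Reconstruct the program state as it was after `upto` recorded steps."""
            state = 0
            for value in log[:upto]:
                state = step(state, value)
            return state

        final_state, log = record(10)
        print("final state:", final_state)
        print("state after step 4 (reached by replay):", replay(log, 4))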

  20. Debugging of Turbine Engine Fuel Starting Jet

    Institute of Scientific and Technical Information of China (English)

    杨凯; 王芳琦

    2016-01-01

    During the development stage of a certain type of engine, the flow of the fuel starting jet exceeded the maximum design value, and the conventional grinding method could not bring the flow value down into the design range, so the key dimensions of the starting jet had to be redesigned. To address this problem, the structural dimensions of the starting jet were adjusted on the basis of the working principle of the centrifugal nozzle and related theory, combined with test-piece debugging. Practical measurement and batch processing verified the reasonableness of the modification and the accuracy of the calculation, and the design of the starting jet was finalized.

  1. Automatization of hardware configuration for plasma diagnostic system

    Science.gov (United States)

    Wojenski, A.; Pozniak, K. T.; Kasprowicz, G.; Kolasinski, P.; Krawczyk, R. D.; Zabolotny, W.; Linczuk, P.; Chernyshova, M.; Czarski, T.; Malinowski, K.

    2016-09-01

    Soft X-ray plasma measurement systems are mostly multi-channel, high performance systems. In the case of a modular construction, it is necessary to perform sophisticated system discovery in parallel with automatic system configuration. In the paper, the structure of the modular system designed for tokamak plasma soft X-ray measurements is described. The concept of the system discovery and further automatic configuration is also presented. The FCS application (FMC/FPGA Configuration Software) is used for running the sophisticated system setup with automatic verification of proper configuration. In order to provide flexibility for further system configurations (e.g. user setup), a common communication interface is also described. The approach presented here is related to the automatic system firmware building presented in previous papers. Modular construction and multichannel measurements are key requirements for SXR diagnostics with GEM detectors.

  2. Technical Study of Short-Circuit Breaking Debugging Method for Contact Arc Extinguishing System

    Institute of Scientific and Technical Information of China (English)

    贺雅洁; 黄世泽; 郭其一; 胡景泰; 朱奇敏

    2012-01-01

    The various ex-works debugging methods and debugging phenomena for the contact arc extinguishing system of the control and protective switching device (CPS) are introduced. The working principle of the contact arc extinguishing system is described and the forces acting on its contacts are analysed. For three situations that arise during the breaking test process, their impact on the short-circuit breaking capability of the KBO contact arc extinguishing system is analysed. Tests verify the correctness of the theoretical analysis, and suggestions are made for the debugging of the contact arc extinguishing system.

  3. Automatization and Abstract Problem-Solving as Predictors of Academic Achievement.

    Science.gov (United States)

    Meltzer, Lynn J.; And Others

    The associations among cognitive automatization, abstract problem solving, and educational performance were studied using 127 fourth to ninth grade students. A number of measures of fast, automatic, and fluent performance (FAF measures) were used: writing the alphabet; reading from a word list; and mentally performing arithmetic operations. The…

  4. Application of Matlab Software in Teaching of Automatic Control Theory

    Institute of Scientific and Technical Information of China (English)

    曹海红

    2012-01-01

    In order to stimulate students' enthusiasm for learning automatic control theory and to improve their ability to analyze, design and debug automatic control systems, this paper describes, with examples, how the MATLAB programming language can be used to calculate and analyze some of the typical and abstract systems encountered in automatic control theory courses. The results indicate that this kind of educational software improves students' understanding of automatic control theory and achieves the teaching objectives of the course.

  5. Automatic phases recognition in pituitary surgeries by microscope images classification

    OpenAIRE

    Lalys, Florent; Riffaud, Laurent; Morandi, Xavier; Jannin, Pierre

    2010-01-01

    International audience; The segmentation of the surgical workflow might be helpful for providing context-sensitive user interfaces, or generating automatic report. Our approach focused on the automatic recognition of surgical phases by microscope image classification. Our workflow, including images features extraction, image database labelisation, Principal Component Analysis (PCA) transformation and 10-fold cross-validation studies was performed on a specific type of neurosurgical interventi...

  6. Automatic aircraft recognition

    Science.gov (United States)

    Hmam, Hatem; Kim, Jijoong

    2002-08-01

    Automatic aircraft recognition is very complex because of clutter, shadows, clouds, self-occlusion and degraded imaging conditions. This paper presents an aircraft recognition system, which assumes from the start that the image is possibly degraded, and implements a number of strategies to overcome edge fragmentation and distortion. The current vision system employs a bottom up approach, where recognition begins by locating image primitives (e.g., lines and corners), which are then combined in an incremental fashion into larger sets of line groupings using knowledge about aircraft, as viewed from a generic viewpoint. Knowledge about aircraft is represented in the form of whole/part shape description and the connectedness property, and is embedded in production rules, which primarily aim at finding instances of the aircraft parts in the image and checking the connectedness property between the parts. Once a match is found, a confidence score is assigned and as evidence in support of an aircraft interpretation is accumulated, the score is increased proportionally. Finally a selection of the resulting image interpretations with the highest scores, is subjected to competition tests, and only non-ambiguous interpretations are allowed to survive. Experimental results demonstrating the effectiveness of the current recognition system are given.

  7. Automatization and training in visual search.

    Science.gov (United States)

    Czerwinski, M; Lightfoot, N; Shiffrin, R M

    1992-01-01

    In several search tasks, the amount of practice on particular combinations of targets and distractors was equated in varied-mapping (VM) and consistent-mapping (CM) conditions. The results indicate the importance of distinguishing between memory and visual search tasks, and implicate a number of factors that play important roles in visual search and its learning. Visual search was studied in Experiment 1. VM and CM performance were almost equal, and slope reductions occurred during practice for both, suggesting the learning of efficient attentive search based on features, and no important role for automatic attention attraction. However, positive transfer effects occurred when previous CM targets were re-paired with previous CM distractors, even though these targets and distractors had not been trained together. Also, the introduction of a demanding simultaneous task produced advantages of CM over VM. These latter two results demonstrated the operation of automatic attention attraction. Visual search was further studied in Experiment 2, using novel characters for which feature overlap and similarity were controlled. The design and many of the findings paralleled Experiment 1. In addition, enormous search improvement was seen over 35 sessions of training, suggesting the operation of perceptual unitization for the novel characters. Experiment 3 showed a large, persistent advantage for CM over VM performance in memory search, even when practice on particular combinations of targets and distractors was equated in the two training conditions. A multifactor theory of automatization and attention is put forth to account for these findings and others in the literature.

  8. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulation items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and reduced radiation exposure. The final objective of this project is to develop an automatic pipeline inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and developed the robot control computer which controls the robot to navigate along the inspection path exactly. We expect our system can contribute to reduction of inspection time, performance enhancement, and effective management of inspection results. The system developed by this project can be practically used for inspection work after field tests. (author)

  9. Automatic stereoscopic system for person recognition

    Science.gov (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.

    1999-06-01

    A biometric access control system based on identification of human face is presented. The system developed performs remote measurements of the necessary face features. Two different scenarios of the system behavior are implemented. The first one assumes the verification of personal data entered by visitor from console using keyboard or card reader. The system functions as an automatic checkpoint, that strictly controls access of different visitors. The other scenario makes it possible to identify visitors without any person identifier or pass. Only person biometrics are used to identify the visitor. The recognition system automatically finds necessary identification information preliminary stored in the database. Two laboratory models of recognition system were developed. The models are designed to use different information types and sources. In addition to stereoscopic images inputted to computer from cameras the models can use voice data and some person physical characteristics such as person's height, measured by imaging system.

  10. Automatic Palette Identification of Colored Graphics

    Science.gov (United States)

    Lacroix, Vinciane

    The median-shift, a new clustering algorithm, is proposed to automatically identify the palette of colored graphics, a pre-requisite for graphics vectorization. The median-shift is an iterative process which shifts each data point to the "median" point of its neighborhood defined thanks to a distance measure and a maximum radius, the only parameter of the method. The process is viewed as a graph transformation which converges to a set of clusters made of one or several connected vertices. As the palette identification depends on color perception, the clustering is performed in the L*a*b* feature space. As pixels located on edges are made of mixed colors not expected to be part of the palette, they are removed from the initial data set by an automatic pre-processing. Results are shown on scanned maps and on the Macbeth color chart and compared to well established methods.
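
    A compact sketch of the shift-to-neighbourhood-median mechanism, run here on RGB triples for brevity rather than in the L*a*b* space used by the method, with a made-up radius. It conveys the iteration only, not the author's graph-based formulation or edge-pixel pre-processing.

        # Sketch: median-shift clustering of pixel colours to recover a small palette.
        # Each point is repeatedly moved to the coordinate-wise median of its neighbourhood;
        # the radius is the single parameter. Toy RGB data; the paper works in L*a*b*.
        import numpy as np

        def median_shift(points, radius, n_iter=20):
            pts = points.astype(float).copy()
            for _ in range(n_iter):
                new_pts = np.empty_like(pts)
                for i, p in enumerate(pts):
                    neigh = pts[np.linalg.norm(pts - p, axis=1) <= radius]
                    new_pts[i] = np.median(neigh, axis=0)   # shift to the neighbourhood median
                if np.allclose(new_pts, pts):
                    break
                pts = new_pts
            # Points that converged to (almost) the same location form one palette colour.
            return np.unique(np.round(pts).astype(int), axis=0)

        # Synthetic graphic: three ink colours plus small scanning noise.
        rng = np.random.default_rng(4)
        inks = np.array([[200, 30, 30], [30, 30, 200], [240, 240, 240]])
        pixels = np.vstack([c + rng.normal(0, 3, size=(60, 3)) for c in inks])
        print(median_shift(pixels, radius=25.0))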

  11. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity. This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  12. The Automatic Telescope Network (ATN)

    CERN Document Server

    Mattox, J R

    1999-01-01

    Because of the scheduled GLAST mission by NASA, there is strong scientific justification for preparation for very extensive blazar monitoring in the optical bands to exploit the opportunity to learn about blazars through the correlation of variability of the gamma-ray flux with flux at lower frequencies. Current optical facilities do not provide the required capability. Developments in technology have enabled astronomers to readily deploy automatic telescopes. The effort to create an Automatic Telescope Network (ATN) for blazar monitoring in the GLAST era is described. Other scientific applications of the networks of automatic telescopes are discussed. The potential of the ATN for science education is also discussed.

  13. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  14. Evaluation of the performance of PRECIL LBY-NJ4A automatic platelet aggregation analyzer

    Institute of Scientific and Technical Information of China (English)

    石冬敏; 吴元健; 马伟

    2012-01-01

    Objective: To evaluate the performance of the PRECIL LBY-NJ4A automatic platelet aggregation analyzer (NJ4A). Methods: Blood samples were collected into vacuum tubes with 109 mmol/L sodium citrate, and platelet-rich plasma (PRP) and platelet-poor plasma (PPP) were separated. The NJ4A analyzer, with its matching quality control materials, agonists and cleaning solution, was used to measure the maximum aggregation rate of platelets (MAR) and to test precision, channel consistency, uncertainty, detection limit and interference. Results: The within-run imprecision, expressed as the coefficient of variation (CV), ranged from 3.4% to 5.0%. The means of the four different channels showed no statistically significant difference (P>0.05), and the carryover rate was 2.82%. The within-run, between-run and total CVs were no more than 3.5%, 4.2% and 3.8%, respectively. The total error ranged from 5.6% to 10.3%, and the uncertainty was within the acceptable range. The measured values of two levels of quality control materials showed no statistically significant difference from their target values (P>0.05). The comparison between the NJ4A and the LBY-NJ2 showed good correlation (r=0.998). Conclusion: The precision, accuracy, uncertainty, sensitivity, carryover rate, anti-interference and other performance indicators of the NJ4A comply with CLSI specifications, and the analyzer is suitable for clinical application.

  15. Performance evaluation of Roche Cobas c701 fully automatic biochemical analyzer

    Institute of Scientific and Technical Information of China (English)

    邓小玲; 侯玉磊; 陈特; 毕小云

    2015-01-01

    Objective: To assess the performance of the Roche Cobas c701 fully automatic biochemical analyzer. Methods: According to EP15-A2 from the Clinical and Laboratory Standards Institute, the electrolytes (potassium, sodium and chloride) and nine analytes covering all measurement wavelengths (alanine transaminase, aspartate transaminase, alkaline phosphatase, gamma-glutamyl transferase, creatinine, urea nitrogen, glucose, total protein and triglyceride) were measured on the Roche Cobas c701 analyzer with original reagents. The precision, accuracy and linear range of all parameters were verified. Results: For the two levels of tested parameters, the standard deviation of repeatability (Sr) was less than or equal to the manufacturer's claimed repeatability standard deviation (σr), and the standard deviation of intermediate precision (St) was less than or equal to the manufacturer's claimed precision (σt); the precision was therefore acceptable and consistent with the manufacturer's claims. Correlations between theoretical and measured values were good (regression coefficients 0.9994-1.0000). Compared with the external quality assessment of the Ministry of Health clinical inspection center, the bias of all parameters measured on the Roche Cobas c701 analyzer was acceptable (within the scope prescribed by CLIA'88). Conclusion: The repeatability, precision and accuracy of the parameters measured on the Roche Cobas c701 reach the performance claimed by the manufacturer.

  16. Automatic Coarse Graining of Polymers

    OpenAIRE

    Faller, Roland

    2003-01-01

    Several recently proposed semi-automatic and fully-automatic coarse-graining schemes for polymer simulations are discussed. All these techniques derive effective potentials for multi-atom units or super-atoms from atomistic simulations. These include techniques relying on single chain simulations in vacuum and self-consistent optimizations from the melt like the simplex method and the inverted Boltzmann method. The focus is on matching the polymer structure on different scales. Several ...

  17. Automatic Sarcasm Detection: A Survey

    OpenAIRE

    Joshi, Aditya; Bhattacharyya, Pushpak; Carman, Mark James

    2016-01-01

    Automatic sarcasm detection is the task of predicting sarcasm in text. This is a crucial step to sentiment analysis, considering prevalence and challenges of sarcasm in sentiment-bearing text. Beginning with an approach that used speech-based features, sarcasm detection has witnessed great interest from the sentiment analysis community. This paper is the first known compilation of past work in automatic sarcasm detection. We observe three milestones in the research so far: semi-supervised pat...

  18. Prospects for de-automatization.

    Science.gov (United States)

    Kihlstrom, John F

    2011-06-01

    Research by Raz and his associates has repeatedly found that suggestions for hypnotic agnosia, administered to highly hypnotizable subjects, reduce or even eliminate Stroop interference. The present paper sought unsuccessfully to extend these findings to negative priming in the Stroop task. Nevertheless, the reduction of Stroop interference has broad theoretical implications, both for our understanding of automaticity and for the prospect of de-automatizing cognition in meditation and other altered states of consciousness.

  19. The automatization of journalistic narrative

    Directory of Open Access Journals (Sweden)

    Naara Normande

    2013-06-01

    This paper proposes an initial discussion of the production of automatized journalistic narratives. Despite being a topic discussed on specialized sites and at international conferences in the communication field, the concepts are still underdeveloped in academic research. For this article, we studied the concepts of narrative, databases and algorithms, indicating a theoretical trend that explains these automatized journalistic narratives. As illustrations, we use the cases of the Los Angeles Times, Narrative Science and Automated Insights.

  20. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automatization of user management in the company Studio Moderna. The company has grown exponentially in recent years, which is why we needed to find a faster, easier and cheaper way of managing user accounts. We automatized the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop-down menus, wrote a script in a scripting programming langu...

  1. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  2. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    This paper presents work on automatic grasp generation and grasp learning for reducing the manual setup time and increasing grasp success rates within bin-picking applications. We propose an approach that is able to generate good grasps automatically using a dynamic grasp simulator, a newly developed...... and achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented...... work allows cycle time as well as setup cost to be minimized, which are essential factors in automatic bin-picking. It therefore leads to a wider applicability of bin-picking in industry....

  3. COMMISSIONING AND DETECTOR PERFORMANCE GROUPS (DPG)

    CERN Multimedia

    A. Ryd and T. Camporesi

    2010-01-01

    Commissioning and Run Coordination activities After the successful conclusion of the LHC pilot run commissioning in 2009, activities at the experiment restarted only late in January due to cooling and detector maintenance. As usual we got going with weekly exercises used to deploy, debug, and validate improvements in firmware and software. A debriefing workshop aimed at analyzing the operational aspects of the 2009 pilot run was held on Jan. 15, 2009, to define a list of improvements (and their relative priorities) to be planned. In the last month, most of the objectives set in the debriefing workshop have been attained. The major achievements/improvements obtained are the following: - Consolidation of the firmware for both readout and trigger for ECAL - Software implementation of procedures for raising the bias voltage of the silicon tracker and pixel detector, driven by LHC mode changes, with automatic propagation of the state changes from the DCS to the DAQ. The improvements in the software and firmware allow suppress...

  4. Simulative debugging method for serial communication under Keil C environment

    Institute of Scientific and Technical Information of China (English)

    朱艳萍; 邹应全; 廖建辉

    2012-01-01

    To meet the needs of practical teaching of single-chip microcomputers, specific methods and steps for simulating serial communication between an SCM and a PC under the Keil C51 software environment are proposed. The simulation experiment is as follows: the serial program sends two hexadecimal digits, converts them to decimal, and displays the result on a four-digit LED display. Comparing the simulated I/O data obtained by single-stepping the software with the actual serial communication results shows that they are consistent. The experiment makes students more familiar with hardware-simulation debugging in the single-chip microcomputer software development environment and deepens their understanding of serial communication and dynamic display.

  5. Teaching Exploration of Process Control System Running and Debugging Project

    Institute of Scientific and Technical Information of China (English)

    徐文明

    2016-01-01

    The paper introduces the teaching requirements of the running and debugging project for a process control system. It presents a teaching method that combines the teaching tasks with real enterprise fault cases, which addresses the key teaching point of diagnosing and eliminating equipment faults. Drawing on enterprise work experience and practice, and using the trial-and-error tuning method, practical procedures for tuning the system parameters are summarized, which resolves the teaching difficulties of parameter tuning and data analysis for a second-order liquid-level control system. In addition, the paper proposes combining course teaching with student community activities and with vocational skills competitions; in practice, these combinations have achieved good teaching results.

  6. Automatic morphometry of nerve histological sections.

    Science.gov (United States)

    Romero, E; Cuisenaire, O; Denef, J F; Delbeke, J; Macq, B; Veraart, C

    2000-04-15

    A method for the automatic segmentation, recognition and measurement of neuronal myelinated fibers in nerve histological sections is presented. In this method, the fiber parameters, i.e. perimeter, area, position of the fiber and myelin sheath thickness, are automatically computed; obliquity of the sections may be taken into account. First, the image is thresholded to provide a coarse classification between myelin and non-myelin pixels. Next, the resulting binary image is further simplified using connected morphological operators. By applying semantic rules to the zonal graph, axon candidates are identified; these are either isolated or still connected. Then, separation of connected fibers is performed by evaluating myelin sheath thickness around each candidate area with a Euclidean distance transformation. Finally, properties of each detected fiber are computed and false positives are removed. The accuracy of the method is assessed by evaluating missed detections and the false positive ratio, and by comparing the results to the manual procedure with sampling. In the evaluated nerve surface, 0.9% false positives were found, along with 6.36% missed detections. The resulting histograms show strong correlation with those obtained by manual measurement. The noise introduced by this method is significantly lower than the intrinsic sampling variability. This automatic method constitutes an original tool for morphometrical analysis.
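
    A condensed sketch of the kind of pipeline the abstract outlines (threshold, morphological simplification, candidate axons, per-fiber measures via a Euclidean distance transform), written with scikit-image and SciPy. It is not the authors' implementation; the dark-myelin assumption, parameters and thickness heuristic are illustrative.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, morphology, measure

def measure_fibers(gray, min_axon_area=50):
    """gray: 2-D array of a nerve cross-section; myelin assumed dark (assumption)."""
    myelin = gray < filters.threshold_otsu(gray)            # coarse myelin mask
    myelin = morphology.binary_closing(myelin, morphology.disk(2))
    axons = ndi.binary_fill_holes(myelin) & ~myelin          # lumens enclosed by myelin
    axons = morphology.remove_small_objects(axons, min_axon_area)
    # Distance from each fiber pixel to the background, used for a rough thickness guess
    dist_to_background = ndi.distance_transform_edt(ndi.binary_fill_holes(myelin))
    fibers = []
    for region in measure.regionprops(measure.label(axons)):
        r, c = map(int, region.centroid)
        # Heuristic: outer radius at the centroid minus the equivalent inner radius
        thickness = float(dist_to_background[r, c]) - np.sqrt(region.area / np.pi)
        fibers.append({"area": region.area,
                       "perimeter": region.perimeter,
                       "centroid": (r, c),
                       "approx_sheath_thickness": thickness})
    return fibers
```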

  7. SEMANTIC INTEGRATION FOR AUTOMATIC ONTOLOGY MAPPING

    Directory of Open Access Journals (Sweden)

    Siham AMROUCH

    2013-11-01

    In the last decade, ontologies have played a key technological role for information sharing and agent interoperability in different application domains. In the semantic web domain, ontologies are efficiently used to face the great challenge of representing the semantics of data, in order to bring the actual web to its full power and hence achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To meet this requirement, mapping ontologies is a solution that cannot be avoided. Indeed, ontology mapping builds a meta layer that allows different applications and information systems to access and share their information, of course after resolving the different forms of syntactic, semantic and lexical mismatches. In the contribution presented in this paper, we have integrated the semantic aspect based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character marks the main difference between our contribution and most of the existing semi-automatic ontology mapping algorithms, such as Chimaera, Prompt, Onion and Glue. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules: the former analyzes the concepts' names and the latter analyzes their properties. Each of these two sub-modules is itself based on the combination of lexical and semantic similarity measures.
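
    A minimal sketch of the general idea of combining a lexical measure over concept names with a WordNet-based semantic measure. The weights and the specific similarity functions are illustrative choices, not those of the paper, and the NLTK WordNet corpus must be available (nltk.download('wordnet')).

```python
from difflib import SequenceMatcher
from nltk.corpus import wordnet as wn

def lexical_sim(a, b):
    # Simple string similarity between the two concept names
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_sim(a, b):
    """Best path similarity between any WordNet senses of the two terms."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(a) for s2 in wn.synsets(b)]
    return max(scores, default=0.0)

def concept_match_score(name1, name2, w_lex=0.4, w_sem=0.6):
    # Weights are arbitrary illustrative choices, not values from the paper.
    return w_lex * lexical_sim(name1, name2) + w_sem * semantic_sim(name1, name2)

print(concept_match_score("car", "automobile"))   # high: synonyms in WordNet
print(concept_match_score("car", "ontology"))     # low
```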

  8. Automatic basal slice detection for cardiac analysis

    Science.gov (United States)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step to measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with a variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the society for cardiovascular magnetic resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods are at times identifying the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to the poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice was detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. From the 51 successfully tested samples, 92% and 84% of detection results were accurate at the end-systolic and the end-diastolic phases of the cardiac cycle, respectively.

  9. Variable-mass Thermodynamics Calculation Model for Gas-operated Automatic Weapon

    Institute of Scientific and Technical Information of China (English)

    陈建彬; 吕小强

    2011-01-01

    Energy and mass exchange occur between the barrel and the gas-operated device of an automatic weapon. To describe its interior ballistics and the dynamic characteristics of the gas-operated device accurately, a new variable-mass thermodynamics model is built. It is used to calculate the automatic-mechanism velocity of a particular automatic weapon; the calculated results agree well with the experimental results, which validates the model. The influences of structural parameters on the dynamic characteristics of the gas-operated device are discussed. The model is shown to be valuable for the design and accurate performance prediction of gas-operated automatic weapons.

  10. 12 CFR 263.403 - Automatic removal, suspension, and debarment.

    Science.gov (United States)

    2010-01-01

    12 Banks and Banking 3 2010-01-01 2010-01-01 false Automatic removal, suspension, and... An independent public accountant or accounting firm may not perform audit services for banking organizations if... permission to such accountant or firm to perform audit services for banking organizations. The request shall...

  11. The performance comparison between a domestic automatic enzyme immunoassay analyzer and Tecan Fame ELISA system

    Institute of Scientific and Technical Information of China (English)

    包建玲; 唐婧; 孟存仁; 张朝霞

    2014-01-01

    Objective To compare the performance of the Addcare automatic enzyme-linked immunosorbent assay (ELISA) analyzer with the Tecan Fame ELISA system, and to assess the feasibility of the Addcare ELISA analyzer for clinical application. Methods The performance of the pipetting needles, washers and microplate readers of the two systems was evaluated. The gravimetric method was used to measure pipetting errors and washer residues. Distilled water, methyl orange, potassium dichromate and p-nitrophenol were used to test the microplate readers' zero-point drift, channel difference, sensitivity and accuracy. Eighty specimens were tested for hepatitis C virus antibody to determine the total coincidence rate of the two systems. Results The mean value of the Addcare 10 μL pipetting system was 10.164 μL with a total CV of 3.91%; for the Tecan system the mean value was 10.223 μL with a total CV of 2.96%. The zero-drift absorbance values of both systems were within 0.0236 ± 0.0038, differences among channels were within ±0.0029, and differences among wells were within ±0.0014. The washing-system residues of the Addcare system were within 0.4 μL, and those of the Fame system were within 0.6 μL. The total coincidence rate of the two systems for hepatitis C virus antibody testing of the 80 samples was 100%. Conclusion The performance of the two systems is stable and the test results are consistent, which can meet clinical needs.

  12. Automatic Fall Detection using Smartphone Acceleration Sensor

    Directory of Open Access Journals (Sweden)

    Tran Tri Dang

    2016-12-01

    In this paper, we describe our work on developing an automatic fall detection technique using a smartphone. Falls are detected by analyzing the acceleration patterns generated during various activities. An additional long-lie detection algorithm is used to improve the fall detection rate while keeping the false positive rate at an acceptable value. An application prototype is implemented on the Android operating system and is used to evaluate the performance of the proposed technique. Experimental results show the potential of using this app for fall detection. However, a more realistic experimental setting is needed to make this technique suitable for use in real-life situations.
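
    A simplified, hypothetical sketch of the general approach (an impact spike in the acceleration magnitude followed by a "long lie" close to 1 g). The thresholds, sampling rate and window lengths are made-up illustrative values rather than the authors'.

```python
import math

def detect_fall(samples, fs=50, impact_g=2.5, lie_g=1.1, lie_seconds=3.0):
    """samples: list of (ax, ay, az) in units of g, sampled at fs Hz."""
    mag = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    lie_window = int(lie_seconds * fs)
    for i, m in enumerate(mag):
        if m < impact_g:
            continue                                  # no impact spike here
        after = mag[i + fs : i + fs + lie_window]     # skip ~1 s of post-impact motion
        if len(after) == lie_window and all(a < lie_g for a in after):
            return True                               # impact followed by near-1 g stillness
    return False

# Hypothetical trace: ~1 g at rest, a 3 g impact, then stillness (a simulated fall).
still = [(0.0, 0.0, 1.0)] * 200
trace = still + [(0.0, 0.0, 3.0)] + [(0.0, 0.0, 1.0)] * 400
print(detect_fall(trace))   # True under these illustrative thresholds
```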

  13. Automatic control algorithm effects on energy production

    Science.gov (United States)

    Mcnerney, G. M.

    1981-01-01

    A computer model was developed using actual wind time series and turbine performance data to simulate the power produced by the Sandia 17-m VAWT operating in automatic control. The model was used to investigate the influence of starting algorithms on annual energy production. The results indicate that, depending on turbine and local wind characteristics, a bad choice of a control algorithm can significantly reduce overall energy production. The model can be used to select control algorithms and threshold parameters that maximize long term energy production. The results from local site and turbine characteristics were generalized to obtain general guidelines for control algorithm design.
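
    A toy sketch of the kind of simulation described: a start/stop control algorithm with hysteresis is applied to a wind-speed time series and a turbine power curve to estimate energy production. The power curve, thresholds and wind series below are invented for illustration and do not represent the Sandia 17-m VAWT.

```python
def simulate_energy(wind, dt_hours, power_curve, start_ws=5.0, stop_ws=4.0):
    """wind: wind speeds (m/s); power_curve(ws) -> kW; hysteresis avoids rapid cycling."""
    running, energy_kwh = False, 0.0
    for ws in wind:
        if not running and ws >= start_ws:
            running = True
        elif running and ws < stop_ws:
            running = False
        if running:
            energy_kwh += power_curve(ws) * dt_hours
    return energy_kwh

def toy_power_curve(ws):
    # Cut-in at 4 m/s, cut-out at 25 m/s, rated at 100 kW above about 12 m/s
    if ws < 4.0 or ws > 25.0:
        return 0.0
    return min(100.0, 100.0 * ((ws - 4.0) / 8.0) ** 3)

wind_series = [3.0, 4.5, 6.0, 7.5, 9.0, 5.5, 4.2, 3.8, 6.5, 8.0]
print(simulate_energy(wind_series, dt_hours=1.0, power_curve=toy_power_curve))
```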

  14. Research on automatic human chromosome image analysis

    Science.gov (United States)

    Ming, Delie; Tian, Jinwen; Liu, Jian

    2007-11-01

    Human chromosome karyotyping is one of the essential tasks in cytogenetics, especially in genetic syndrome diagnoses. In this thesis, an automatic procedure is introduced for human chromosome image analysis. According to different status of touching and overlapping chromosomes, several segmentation methods are proposed to achieve the best results. Medial axis is extracted by the middle point algorithm. Chromosome band is enhanced by the algorithm based on multiscale B-spline wavelets, extracted by average gray profile, gradient profile and shape profile, and calculated by the WDD (Weighted Density Distribution) descriptors. The multilayer classifier is used in classification. Experiment results demonstrate that the algorithms perform well.

  15. All-optical automatic pollen identification: Towards an operational system

    Science.gov (United States)

    Crouzy, Benoît; Stella, Michelle; Konzelmann, Thomas; Calpini, Bertrand; Clot, Bernard

    2016-09-01

    We present results from the development and validation campaign of an optical pollen monitoring method based on time-resolved scattering and fluorescence. Focus is first set on supervised learning algorithms for pollen-taxa identification and on the determination of aerosol properties (particle size and shape). The identification capability provides a basis for a pre-operational automatic pollen season monitoring performed in parallel to manual reference measurements (Hirst-type volumetric samplers). Airborne concentrations obtained from the automatic system are compatible with those from the manual method regarding total pollen and the automatic device provides real-time data reliably (one week interruption over five months). In addition, although the calibration dataset still needs to be completed, we are able to follow the grass pollen season. The high sampling from the automatic device allows to go beyond the commonly-presented daily values and we obtain statistically significant hourly concentrations. Finally, we discuss remaining challenges for obtaining an operational automatic monitoring system and how the generic validation environment developed for the present campaign could be used for further tests of automatic pollen monitoring devices.

  16. Rapid Debugging and Development of Real-time Monitoring Program in Lab

    Institute of Scientific and Technical Information of China (English)

    蔡文斋

    2015-01-01

    This paper proposes several simulation and debugging methods for developing real-time monitoring application software in Telemetry, Tracking and Command (TT&C) projects, and mainly presents a window-based debugging method embedded in the monitoring program itself. The method uses an independent hexadecimal edit window that sends messages in place of a hardware sensor connected to the computer, so the hardware communication environment can be fully simulated in the lab. Fixed-length messages that the hardware sensor would send to the computer can be composed flexibly according to the protocol, allowing the whole monitoring system to be debugged without any hardware. Once the hardware components are connected, communication at the protocol level works immediately.
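
    A generic sketch of the idea of feeding hand-composed, fixed-length frames to the same parsing code that would normally receive them from the hardware sensor, so protocol-level logic can be debugged with no hardware attached. The frame layout here (sync byte, 4-byte big-endian value, checksum) is invented for illustration.

```python
import struct

FRAME_LEN = 6
SYNC = 0xAA

def make_frame(value: int) -> bytes:
    # Sync byte + 4-byte big-endian payload + 1-byte modular checksum
    body = bytes([SYNC]) + struct.pack(">I", value)
    return body + bytes([sum(body) & 0xFF])

def parse_frame(frame: bytes) -> int:
    if len(frame) != FRAME_LEN or frame[0] != SYNC:
        raise ValueError("bad frame")
    if (sum(frame[:-1]) & 0xFF) != frame[-1]:
        raise ValueError("checksum mismatch")
    (value,) = struct.unpack(">I", frame[1:5])
    return value

# "Hex window" input typed by the tester instead of coming from the sensor:
typed_hex = make_frame(12345).hex()
print(parse_frame(bytes.fromhex(typed_hex)))   # -> 12345
```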

  17. Safety Assessment of Main Engine Debugging during Sea Trial of Newly-built Ship Based on FSA

    Institute of Scientific and Technical Information of China (English)

    任威; 潘新祥; 刘新建

    2011-01-01

    The sea trial of a newly built ship has many particularities and hidden safety hazards; among sea-trial accidents, loss of main engine control is one of the major hazards. Using the Formal Safety Assessment (FSA) method, this thesis studies the factors contributing to faults and hidden hazards of main engine debugging during the sea trials of newly built ships, and establishes a fault tree for main power plant debugging. Based on the fault tree and its minimal path sets, targeted analysis is carried out and corresponding policies and suggestions are put forward, opening a new approach to the emerging area of ship safety management during trials. Using fault tree analysis to examine the faults and hidden hazards of main engine debugging during sea trials is a scientific and effective method, and it can improve the safety of newly built ships in future sea trials.

  18. On-line current feed and computer aided control tactics for automatic balancing head

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In the designed automatic balancing head, a non-contact induction transformer is used to deliver driving energy, which solves the problem of on-line current feed and control. Computer-controlled automatic balancing experiments with phase-magnitude control tactics were performed on a flexible rotor system. The results of the experiments prove that the energy feeding method and the control tactics are effective in the automatic balancing head for vibration control.

  19. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  20. Algorithms for skiascopy measurement automatization

    Science.gov (United States)

    Fomins, Sergejs; Trukša, Renārs; KrūmiĆa, Gunta

    2014-10-01

    An automatic dynamic infrared retinoscope was developed, which allows the procedure to run at a much higher rate. Our system uses a USB image sensor with up to 180 Hz refresh rate equipped with a long-focus objective and an 850 nm infrared light-emitting diode as the light source. Two servo motors driven by a microprocessor control the rotation of the semitransparent mirror and the motion of the retinoscope chassis. The image of the eye pupil reflex is captured via software and analyzed along the horizontal plane. An algorithm for automatic accommodative state analysis is developed based on the intensity changes of the fundus reflex.

  1. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
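
    A brute-force sketch of finite model generation for a single binary operation: enumerate all operation tables on {0, ..., n-1} and keep those satisfying the given equations (here associativity and commutativity). Real tools prune the search far more cleverly; this only illustrates the task.

```python
from itertools import product

def find_models(n, laws):
    """Yield all binary operation tables on {0,...,n-1} satisfying every law."""
    elems = tuple(range(n))
    cells = list(product(elems, repeat=2))
    for values in product(elems, repeat=len(cells)):   # brute force: n**(n*n) tables
        op = dict(zip(cells, values))
        f = lambda x, y: op[(x, y)]
        if all(law(f, elems) for law in laws):
            yield op

def associative(f, E):
    return all(f(f(x, y), z) == f(x, f(y, z)) for x in E for y in E for z in E)

def commutative(f, E):
    return all(f(x, y) == f(y, x) for x in E for y in E)

models = list(find_models(2, [associative, commutative]))
print(len(models), "commutative semigroup tables of size 2")
```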

  2. Fault Analysis and Phase Debugging Method of the Thyristor Charger

    Institute of Scientific and Technical Information of China (English)

    杨伟珍

    2001-01-01

    The voltage phase characteristics of the thyristor charger under different wiring configurations and the phase-shift characteristic of the thyristor rectification circuit are analyzed, and the voltage phasor diagram is drawn. The phase debugging method for the thyristor rectification circuit is introduced.

  3. Automatic detection of microcalcifications with multi-fractal spectrum.

    Science.gov (United States)

    Ding, Yong; Dai, Hang; Zhang, Hang

    2014-01-01

    To improve the detection of microcalcifications (MCs), this paper proposes an automatic MC detection system making use of the multi-fractal spectrum in digitized mammograms. The approach is based on the principle that normal tissues possess certain fractal properties which change with the presence of MCs. In this system, the multi-fractal spectrum is applied to reveal such fractal properties. By quantifying the deviations of the multi-fractal spectra between normal tissues and MCs, the system can identify MCs altering the fractal properties and finally locate the position of the MCs. The performance of the proposed system is compared with leading automatic detection systems on a mammographic image database. Experimental results demonstrate that the proposed system is statistically superior to most of the compared systems and delivers a superior performance.

  4. Automatic Segmentation of Vessels in In-Vivo Ultrasound Scans

    DEFF Research Database (Denmark)

    Tamimi-Sarnikowski, Philip; Brink-Kjær, Andreas; Moshavegh, Ramin

    2017-01-01

    Ultrasound has become highly popular for monitoring atherosclerosis by scanning the carotid artery. The screening involves measuring the thickness of the vessel wall and the diameter of the lumen. An automatic segmentation of the vessel lumen can enable the determination of lumen diameter. This paper...... presents a fully automatic segmentation algorithm for robustly segmenting the vessel lumen in longitudinal B-mode ultrasound images. The automatic segmentation is performed using a combination of B-mode and power Doppler images. The proposed algorithm includes a series of preprocessing steps, and performs...... a vessel segmentation by use of the marker-controlled watershed transform. The ultrasound images used in the study were acquired using the bk3000 ultrasound scanner (BK Ultrasound, Herlev, Denmark) with two transducers ”8L2 Linear” and ”10L2w Wide Linear” (BK Ultrasound, Herlev, Denmark). The algorithm...
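
    A generic sketch (not the authors' algorithm) of marker-controlled watershed segmentation using a Doppler-derived "inside" marker and a bright-tissue "outside" marker. The thresholds and the assumption that both images are scaled to [0, 1] are illustrative.

```python
import numpy as np
from skimage import filters, segmentation

def segment_lumen(bmode, doppler_power):
    """bmode, doppler_power: 2-D float arrays scaled to [0, 1] (assumption)."""
    markers = np.zeros(bmode.shape, dtype=np.int32)
    markers[doppler_power > 0.7] = 2                       # confident lumen (flow) seeds
    markers[(bmode > 0.6) & (doppler_power < 0.1)] = 1     # confident tissue seeds
    edges = filters.sobel(bmode)                           # flood the B-mode gradient image
    labels = segmentation.watershed(edges, markers)
    return labels == 2                                     # boolean lumen mask
```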

  5. Using numerosity judgments to determine what is learned during automatization.

    Science.gov (United States)

    Green, J T

    1997-07-01

    An important question in learning is the nature of the information required to support skilled, or automatic, performance. In 2 experiments, participants counted patterns of 8-10 objects displayed on a computer screen for 4 sessions of 240 trials each before being transferred to a different set of patterns for a final session of 240 trials. The patterns in the final session differed from those seen in the 4 training sessions in either overall configuration (Experiment 1) or identity of constituents (Experiment 2). Results indicated that both types of information are important in learning and automatization of a counting task and support the idea that what is attended to during training will be necessary to support subsequent automatic performance.

  6. Exposing MPI Objects for Debugging

    DEFF Research Database (Denmark)

    Brock-Nannestad, Laust; DelSignore, John; Squyres, Jeffrey M.;

    Developers rely on debuggers to inspect application state. In applications that use MPI, the Message Passing Interface, the MPI runtime contains an important part of this state. The MPI Tools Working Group has proposed an interface for MPI Handle Introspection. It allows debuggers and MPI...... implementations to cooperate in extracting information from MPI objects. Information that can then be presented to the developer. MPI Handle Introspection provides a more general interface than previous work, such as Message Queue Dumping. We add support for introspection to the TotalView debugger...

  7. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.;

    2012-01-01

    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified including the pupil, iris, sclera, and eyelashes. Based on this segmentation, the ...

  8. Trevi Park: Automatic Parking System

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    TreviPark is an underground, multi-story stacking system that holds cars efficiently, thus reducing the cost of each parking space, as a fully automatic parking system intended to maximize space utilization in parking structures. TreviPark costs less than the price of a conventional urban garage and takes up half the volume and 80% of the depth.

  9. Automatic agar tray inoculation device

    Science.gov (United States)

    Wilkins, J. R.; Mills, S. M.

    1972-01-01

    Automatic agar tray inoculation device is simple in design and foolproof in operation. It employs either conventional inoculating loop or cotton swab for uniform inoculation of agar media, and it allows technician to carry on with other activities while tray is being inoculated.

  10. Automatic Error Analysis Using Intervals

    Science.gov (United States)

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
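
    A tiny interval-arithmetic sketch of the idea (INTLAB itself is a MATLAB toolbox and, unlike this toy class, uses directed rounding): carry [lo, hi] bounds through a formula instead of propagating standard errors. The V/I example values are hypothetical.

```python
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        prods = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(prods), max(prods))
    def __truediv__(self, o):
        assert not (o.lo <= 0.0 <= o.hi), "division by an interval containing 0"
        return self * Interval(1.0 / o.hi, 1.0 / o.lo)
    def __repr__(self):
        return f"[{self.lo:.6g}, {self.hi:.6g}]"

# Example: R = V / I with V = 5.00 +/- 0.05 V and I = 0.100 +/- 0.002 A
V, I = Interval(4.95, 5.05), Interval(0.098, 0.102)
print("R =", V / I)   # worst-case bounds on the resistance
```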

  11. Automatic Identification of Metaphoric Utterances

    Science.gov (United States)

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…

  12. Automatic milking : a better understanding

    NARCIS (Netherlands)

    Meijering, A.; Hogeveen, H.; Koning, de C.J.A.M.

    2004-01-01

    In 2000 the book Robotic Milking, reflecting the proceedings of an International Symposium which was held in The Netherlands came out. At that time, commercial introduction of automatic milking systems was no longer obstructed by technological inadequacies. Particularly in a few west-European countr

  13. Automatic Dissection Of Plantlets

    Science.gov (United States)

    Batchelor, B. G.; Harris, I. P.; Marchant, J. A.; Tillett, R. D.

    1989-03-01

    Micropropagation is a technique used in horticulture for generating a monoclonal colony of plants. A tiny plantlet is cut into several parts, each of which is then replanted. At the moment, the cutting is performed manually. Automating this task would have significant economic benefits. A robot designed to dissect plants would need to be equipped with intelligent visual sensing. This article is concerned with the image acquisition and processing techniques which such a machine might use. A program, which can calculate where to cut a plant with an "open" structure, is presented. This is expressed in the ProVision language, which is described in another article presented at this conference. (Article 1002-65)

  14. Optimization of running time of automatic dedusting system considering the generating performance of PV modules

    Institute of Scientific and Technical Information of China (English)

    郭枭; 澈力格尔; 韩雪; 田瑞

    2015-01-01

    Low power-generation efficiency is one of the main obstacles to the large-scale application of PV (photovoltaic) modules, so studying its influencing factors is of great significance. This article presents an independently developed automatic dedusting system for PV modules, which has a simple structure, low installation cost and reliable operation, removes deposited ash without using water, and dedusts continuously and effectively. The system has been applied in three settings: stand-alone PV power supplies operating at temperatures in the range of -45℃−35℃, experimental tests of PV module cells assembled at various angles, and a large-area PV power system. The dedusting effect of the automatic dedusting system applied to a stand-alone PV power supply was tested at temperatures in the range of -10℃−5℃. Using the automatic dedusting system, the dynamic occlusion occurring during operation was simulated and its influence on the output parameters of the PV modules was studied; the dedusting effect was analyzed under different amounts of ash deposition; the way the dedusting effect changes with the amount of ash deposition was summarized; and the opening time and running period of the system were determined. The experimental PV modules were placed outdoors on open ground at an angle of 45° for 3, 7 and 20 days, giving ash deposition amounts of 0.1274, 0.2933 and 0.8493 g/m2, respectively. The correction coefficient of the PV modules involved in the experiments is 0.9943. The results show that, when the system runs through its horizontal and vertical cycle, the cleaning brush makes the output parameters of the PV modules, including the output power, current and voltage, change in a V-shaped pattern as it crosses a row of cells. Compared with the downward pass, the output parameters of the PV modules during the upward pass fluctuate

  15. Automatic attraction of visual attention by supraletter features of former target strings

    DEFF Research Database (Denmark)

    Kyllingsbæk, Søren; Lommel, Sven Van; Bundesen, Claus

    2014-01-01

    , performance (d’) degraded on trials in which former targets were present, suggesting that the former targets automatically drew processing resources away from the current targets. Apparently, the two experiments showed automatic attraction of visual attention by supraletter features of former target strings....

  16. Alcohol-Related Effects on Automaticity due to Experimentally Manipulated Conditioning

    NARCIS (Netherlands)

    Gladwin, T.E.; Wiers, R.W.H.J.

    2012-01-01

    Background:• The use of alcohol is associated with various forms of automatic processing, such as approach tendencies and attentional biases, which may play a role in addictive behavior. The development of such automaticity has generally occurred well before subjects perform tasks designed to detec

  17. Alcohol-related effects on automaticity due to experimentally manipulated conditioning

    NARCIS (Netherlands)

    Gladwin, T.E.; Wiers, R.W.

    2012-01-01

    Background:  The use of alcohol is associated with various forms of automatic processing, such as approach tendencies and attentional biases, which may play a role in addictive behavior. The development of such automaticity has generally occurred well before subjects perform tasks designed to detect

  18. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...
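
    A bare-bones sketch of the two-stage idea (candidate phrase selection, then ranking by simple properties); it is not the thesis' algorithm and uses only toy features such as frequency, phrase length and first-occurrence position.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "are", "for", "on", "by", "with"}

def rank_topics(text, k=5, max_len=3):
    words = re.findall(r"[a-z]+", text.lower())
    cands = []                                   # (phrase, position of first word)
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            phrase = words[i:i + n]
            if phrase[0] not in STOPWORDS and phrase[-1] not in STOPWORDS:
                cands.append((" ".join(phrase), i))
    freq = Counter(p for p, _ in cands)
    first = {}
    for p, i in cands:
        first.setdefault(p, i)
    total = max(len(words), 1)
    # Score: frequent, longer phrases that appear early in the document rank higher
    score = {p: freq[p] * len(p.split()) * (1.0 - first[p] / total) for p in freq}
    return sorted(score, key=score.get, reverse=True)[:k]

print(rank_topics("Topic indexing assigns topics to documents. Automatic topic "
                  "indexing competes with manual topic indexing by librarians."))
```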

  19. 12 CFR 19.244 - Automatic removal, suspension, and debarment.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Automatic removal, suspension, and debarment. 19.244 Section 19.244 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY RULES OF PRACTICE AND PROCEDURE Removal, Suspension, and Debarment of Accountants From Performing...

  20. An automatic redesign approach for restructurable control systems

    Science.gov (United States)

    Looze, D. P.; Weiss, J. L.; Eterno, J. S.; Barrett, N. M.

    1985-01-01

    This paper presents an approach to the automatic redesign of flight control systems for aircraft that have suffered one or more control element failures. The procedure is based on Linear Quadratic design techniques, and produces a control system that maximizes a measure of feedback system performance subject to a bandwidth constraint.
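
    A rough illustration of the Linear Quadratic machinery such an approach builds on: recompute a state-feedback gain for a (hypothetical) post-failure model by solving the continuous-time algebraic Riccati equation. The bandwidth constraint, the failure modelling and the paper's specific performance measure are omitted.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    # Solve the continuous-time ARE and return K = R^{-1} B^T P
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# Toy second-order plant; the "failure" halves control effectiveness.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
B_nominal = np.array([[0.0], [1.0]])
B_failed = 0.5 * B_nominal
Q, R = np.diag([10.0, 1.0]), np.array([[1.0]])

K_nominal = lqr_gain(A, B_nominal, Q, R)
K_redesigned = lqr_gain(A, B_failed, Q, R)   # automatic redesign after the failure
print(K_nominal, K_redesigned)
```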

  1. Nature Conservation Drones for Automatic Localization and Counting of Animals

    NARCIS (Netherlands)

    J.C. van Gemert; C.R. Verschoor; P. Mettes; K. Epema; L.P. Koh; S. Wich

    2014-01-01

    This paper is concerned with nature conservation by automatically monitoring animal distribution and animal abundance. Typically, such conservation tasks are performed manually on foot or after an aerial recording from a manned aircraft. Such manual approaches are expensive, slow and labor intensive

  2. The Automatization of Verbal Morphology in Instructed Second Language Acquisition

    Science.gov (United States)

    Rodgers, Daryl M.

    2011-01-01

    According to information-processing accounts of skill acquisition, learner performance becomes more automatic over time and with practice, requiring less attention, time, and cognitive effort (DeKeyser, "Skill acquisition theory," Lawrence Erlbaum Associates, 2007a). This study set out to provide converging evidence for the development of…

  3. An Automatic Proof of Euler's Formula

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2005-05-01

    In this information age, everything is digitalized. The encoding of functions and the automatic proof of functions are important. This paper discusses the automatic calculation of Taylor expansion coefficients; as an example, the method is applied to prove Euler's formula automatically.
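
    A small SymPy check in the spirit of the paper: compare the Taylor coefficients of exp(ix) with those of cos(x) + i sin(x) up to a finite order; the truncation order is arbitrary.

```python
import sympy as sp

x = sp.symbols('x')
order = 12
lhs = sp.series(sp.exp(sp.I * x), x, 0, order).removeO()
rhs = sp.series(sp.cos(x) + sp.I * sp.sin(x), x, 0, order).removeO()
print(sp.simplify(lhs - rhs) == 0)   # True: the coefficients agree term by term
```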

  4. Self-Compassion and Automatic Thoughts

    Science.gov (United States)

    Akin, Ahmet

    2012-01-01

    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  5. Automatic Control System for Neutron Laboratory Safety

    Institute of Scientific and Technical Information of China (English)

    ZHAO; Xiao; ZHANG; Guo-guang; FENG; Shu-qiang; SU; Dan; YANG; Guo-zhao; ZHANG; Shuai

    2015-01-01

    In order to cooperate with the experiments on the neutron generator and realize automatic control of the experiment, a set of automatic control systems for the safety of the neutron laboratory is designed. The system block diagram is shown as Fig.1. The automatic control device processes switch signals, so a PLC is selected as the core component

  6. Automatic identification of artifacts in electrodermal activity data.

    Science.gov (United States)

    Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind

    2015-01-01

    Recently, wearable devices have allowed for long term, ambulatory measurement of electrodermal activity (EDA). Despite the fact that ambulatory recording can be noisy, and recording artifacts can easily be mistaken for a physiological response during analysis, to date there is no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts, and provides an empirical evaluation of classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.

  7. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

    Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated...... alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis...

  8. Automatic Distribution Network Reconfiguration: An Event-Driven Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Fei; Jiang, Huaiguang; Tan, Jin

    2016-11-14

    This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, an optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which performs as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek for the new optimal topology. In this manner, whenever an event happens the distribution network can be reconfigured automatically based on the real-time information that is observable and detectable.

  9. The Potential of Automatic Word Comparison for Historical Linguistics

    Science.gov (United States)

    Greenhill, Simon J.; Gray, Russell D.

    2017-01-01

    The amount of data from languages spoken all over the world is rapidly increasing. Traditional manual methods in historical linguistics need to face the challenges brought by this influx of data. Automatic approaches to word comparison could provide invaluable help to pre-analyze data which can be later enhanced by experts. In this way, computational approaches can take care of the repetitive and schematic tasks leaving experts to concentrate on answering interesting questions. Here we test the potential of automatic methods to detect etymologically related words (cognates) in cross-linguistic data. Using a newly compiled database of expert cognate judgments across five different language families, we compare how well different automatic approaches distinguish related from unrelated words. Our results show that automatic methods can identify cognates with a very high degree of accuracy, reaching 89% for the best-performing method Infomap. We identify the specific strengths and weaknesses of these different methods and point to major challenges for future approaches. Current automatic approaches for cognate detection—although not perfect—could become an important component of future research in historical linguistics. PMID:28129337

  10. The Potential of Automatic Word Comparison for Historical Linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Greenhill, Simon J; Gray, Russell D

    2017-01-01

    The amount of data from languages spoken all over the world is rapidly increasing. Traditional manual methods in historical linguistics need to face the challenges brought by this influx of data. Automatic approaches to word comparison could provide invaluable help to pre-analyze data which can be later enhanced by experts. In this way, computational approaches can take care of the repetitive and schematic tasks leaving experts to concentrate on answering interesting questions. Here we test the potential of automatic methods to detect etymologically related words (cognates) in cross-linguistic data. Using a newly compiled database of expert cognate judgments across five different language families, we compare how well different automatic approaches distinguish related from unrelated words. Our results show that automatic methods can identify cognates with a very high degree of accuracy, reaching 89% for the best-performing method Infomap. We identify the specific strengths and weaknesses of these different methods and point to major challenges for future approaches. Current automatic approaches for cognate detection-although not perfect-could become an important component of future research in historical linguistics.
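
    A minimal illustration of the cognate-detection task, judging two word forms as potential cognates when their normalized edit distance falls below a threshold; the methods actually compared in the paper (including the best-performing Infomap approach) are considerably more sophisticated.

```python
def edit_distance(a, b):
    # Standard Levenshtein distance with a rolling row
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def maybe_cognate(a, b, threshold=0.5):
    # Normalize by the longer word so the score is comparable across lengths
    return edit_distance(a, b) / max(len(a), len(b)) <= threshold

pairs = [("hand", "hand"), ("water", "wasser"), ("dog", "perro")]
for a, b in pairs:
    print(a, b, maybe_cognate(a, b))
```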

  11. Automatic Schema Evolution in Root

    Institute of Scientific and Technical Information of China (English)

    ReneBrun; FonsRademakers

    2001-01-01

    ROOT version 3(spring 2001) supports automatic class schema evolution.In addition this version also produces files that are self-describing.This is achieved by storing in each file a record with the description of all the persistent classes in the file.Being self-describing guarantees that a file can always be read later,its structure browsed and objects inspected.also when the library with the compiled code of these classes is missing The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session.ROOT supports the automatic generation of C++ code describing the data objects in a file.

  12. Automatic spikes detection in seismogram

    Institute of Scientific and Technical Information of China (English)

    王海军; 靳平; 刘贵忠

    2003-01-01

    Data processing for a seismic network is complex and tedious, because a large amount of data is recorded in the network every day, which makes it impossible to process these data entirely by manual work. Therefore, seismic data should be processed automatically to produce initial results on event detection and location; afterwards, these results are reviewed and modified by an analyst. In automatic processing, data quality checking is important. There are three main types of problem data in real seismic records: spikes, repeated data and dropouts. A spike is defined as an isolated large-amplitude point; the other two problem types share the feature that the amplitudes of sample points are uniform over an interval. In data quality checking, the first step is to detect and count problem data in a data segment; if the percentage of problem data exceeds a threshold, the whole data segment is masked and is not processed later.
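
    A simple sketch of the quality checks described: flag isolated large-amplitude points (spikes) and constant-amplitude runs (repeated data or dropouts), then mask the segment if the fraction of problem samples exceeds a threshold. The thresholds and window lengths are illustrative, not operational values.

```python
import numpy as np

def quality_check(x, spike_factor=8.0, run_len=20, max_bad_fraction=0.05):
    x = np.asarray(x, dtype=float)
    dev = np.abs(x - np.median(x))
    scale = np.median(dev) + 1e-12
    spikes = dev > spike_factor * scale            # isolated large amplitudes
    same_as_next = np.diff(x) == 0                 # constant-amplitude runs
    runs = np.zeros_like(spikes)
    count = 0
    for i, s in enumerate(same_as_next):
        count = count + 1 if s else 0
        if count >= run_len:
            runs[i - run_len + 1 : i + 2] = True
    bad_fraction = np.mean(spikes | runs)
    return {"bad_fraction": float(bad_fraction),
            "mask_segment": bad_fraction > max_bad_fraction}

rng = np.random.default_rng(0)
trace = rng.normal(size=2000)
trace[500] = 50.0                    # an artificial spike
trace[1000:1100] = trace[1000]       # an artificial constant run
print(quality_check(trace))
```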

  13. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing, and information extraction in the state-of-the art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering, and advanced signal processing techniques and scientific evaluation methodologies being used in this multi disciplinary field will be part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  14. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  15. Multiobjective image recognition algorithm in the fully automatic die bonder

    Institute of Scientific and Technical Information of China (English)

    JIANG Kai; CHEN Hai-xia; YUAN Sen-miao

    2006-01-01

    It is a very important task to automatically determine the number of dies in the image recognition system of a fully automatic die bonder. A multiobjective image recognition algorithm based on a clustering Genetic Algorithm (GA) is proposed in this paper. In the evolutionary process of the GA, a clustering method is provided that utilizes information from the template and the fitness landscape of the current population. The whole population is grouped into different niches by the clustering method. Experimental results demonstrated that the number of target images could be determined by the algorithm automatically, and multiple targets could be recognized at a time. As a result, the time consumed by one image recognition is shortened, the performance of the image recognition system is improved, and the automation of the system is achieved.

  16. Driver behavior following an automatic steering intervention.

    Science.gov (United States)

    Fricke, Nicola; Griesche, Stefan; Schieben, Anna; Hesse, Tobias; Baumann, Martin

    2015-10-01

    The study investigated driver behavior toward an automatic steering intervention of a collision mitigation system. Forty participants were tested in a driving simulator and confronted with an inevitable collision. They performed a naïve drive and afterwards a repeated exposure in which they were told to hold the steering wheel loosely. In a third drive they experienced a false alarm situation. Data on driving behavior, i.e. steering and braking behavior as well as subjective data was assessed in the scenarios. Results showed that most participants held on to the steering wheel strongly or counter-steered during the system intervention during the first encounter. Moreover, subjective data collected after the first drive showed that the majority of drivers was not aware of the system intervention. Data from the repeated drive in which participants were instructed to hold the steering wheel loosely, led to significantly more participants holding the steering wheel loosely and thus complying with the instruction. This study seems to imply that without knowledge and information of the system about an upcoming intervention, the most prevalent driving behavior is a strong reaction with the steering wheel similar to an automatic steering reflex which decreases the system's effectiveness. Results of the second drive show some potential for countermeasures, such as informing drivers shortly before a system intervention in order to prevent inhibiting reactions.

  17. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The used strategy is a combination of two methods: the maximum correlation coefficient and the correlation in the subpixel range. F...... interactive software is also part of a computer-assisted learning program on digital photogrammetry....
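
    A compact sketch of target measurement by the maximum correlation coefficient with a simple parabolic sub-pixel refinement, using scikit-image's normalized cross-correlation; the réseau-specific processing of the paper is not reproduced.

```python
import numpy as np
from skimage.feature import match_template

def locate_target(image, template):
    """Return (row, col, peak_correlation) of the best template match."""
    ncc = match_template(image.astype(float), template.astype(float), pad_input=True)
    r, c = np.unravel_index(np.argmax(ncc), ncc.shape)

    def parabolic_offset(v_minus, v_0, v_plus):
        # Vertex of the parabola through three neighbouring correlation samples
        denom = v_minus - 2.0 * v_0 + v_plus
        return 0.0 if denom == 0 else 0.5 * (v_minus - v_plus) / denom

    dr = dc = 0.0
    if 0 < r < ncc.shape[0] - 1:
        dr = parabolic_offset(ncc[r - 1, c], ncc[r, c], ncc[r + 1, c])
    if 0 < c < ncc.shape[1] - 1:
        dc = parabolic_offset(ncc[r, c - 1], ncc[r, c], ncc[r, c + 1])
    return r + dr, c + dc, float(ncc[r, c])

# Synthetic demo: the template is a patch cut from the image itself.
rng = np.random.default_rng(1)
img = rng.random((64, 64))
tmpl = img[27:34, 37:44]          # 7x7 patch whose centre is at (30, 40)
print(locate_target(img, tmpl))   # row, col close to (30, 40), correlation ~1
```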

  18. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  19. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; the ALGOL 68. Other chapters discuss the general purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming language is also shown.

  20. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    ...of wages and salaries and proprietors' incomes as recorded in the NIPAs to changes in the GDP gap, CBO uses separate regressions based on equation (1)... [Residue of Table 1, "Deficit or Surplus With and Without Automatic Stabilizers", with columns for the GDP gap, the unemployment gap (percent), revenues, and outlays; notes: GDP = gross domestic product; * = between -0.05 percent and 0.05 percent; the GDP gap equals the difference between actual or projected GDP and CBO's...]

  1. Automatic translation among spoken languages

    Science.gov (United States)

    Walter, Sharon M.; Costigan, Kelly

    1994-01-01

    The Machine Aided Voice Translation (MAVT) system was developed in response to the shortage of experienced military field interrogators with both foreign language proficiency and interrogation skills. Combining speech recognition, machine translation, and speech generation technologies, the MAVT accepts an interrogator's spoken English question and translates it into spoken Spanish. The spoken Spanish response of the potential informant can then be translated into spoken English. Potential military and civilian applications for automatic spoken language translation technology are discussed in this paper.

  2. The Automatic Galaxy Collision Software

    CERN Document Server

    Smith, Beverly J; Pfeiffer, Phillip; Perkins, Sam; Barkanic, Jason; Fritts, Steve; Southerland, Derek; Manchikalapudi, Dinikar; Baker, Matt; Luckey, John; Franklin, Coral; Moffett, Amanda; Struck, Curtis

    2009-01-01

    The key to understanding the physical processes that occur during galaxy interactions is dynamical modeling, and especially the detailed matching of numerical models to specific systems. To make modeling interacting galaxies more efficient, we have constructed the `Automatic Galaxy Collision' (AGC) code, which requires less human intervention in finding good matches to data. We present some preliminary results from this code for the well-studied system Arp 284 (NGC 7714/5), and address questions of uniqueness of solutions.

  3. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
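    As a toy illustration of the general idea (not the patented system described above), the fragment below symbolically derives the transfer function of a hypothetical RC low-pass filter from its nodal equation; sympy and the example circuit are stand-ins chosen only for brevity.

    ```python
    # Minimal sketch: compute H(s) for an assumed RC low-pass filter, to
    # illustrate deriving a transfer function from circuit equations.
    import sympy as sp

    s, R, C, Vin, Vout = sp.symbols('s R C V_in V_out')

    # Nodal equation at the output node: (Vin - Vout)/R = s*C*Vout
    node_eq = sp.Eq((Vin - Vout) / R, s * C * Vout)

    # Solve for Vout and form the transfer function H(s) = Vout/Vin
    Vout_expr = sp.solve(node_eq, Vout)[0]
    H = sp.simplify(Vout_expr / Vin)
    print(H)  # 1/(C*R*s + 1)
    ```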

  4. Performance-Driven Interface Contract Enforcement for Scientific Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, T

    2007-02-22

    Several performance-driven approaches to selectively enforce interface contracts for scientific components are investigated. The goal is to facilitate debugging deployed applications built from plug-and-play components while keeping the cost of enforcement within acceptable overhead limits. This paper describes a study of global enforcement using a priori execution cost estimates obtained from traces. Thirteen trials are formed from five, single-component programs. Enforcement experiments conducted using twenty-three enforcement policies are used to determine the nature of exercised contracts and the impact of a variety of sampling strategies. Performance-driven enforcement appears to be best suited to programs that exercise moderately expensive contracts.
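    The sketch below conveys the flavour of performance-driven enforcement with a hypothetical Python decorator that runs pre- and postcondition checks only while their accumulated cost stays under an overhead budget; it is not the component framework studied in the report, and the 5% budget is an arbitrary example value.

    ```python
    # Hypothetical sketch of budget-limited contract enforcement.
    import time
    from functools import wraps

    def enforce(pre=None, post=None, overhead_limit=0.05):
        """Check contracts only while cumulative checking time stays below
        `overhead_limit` as a fraction of total observed run time."""
        state = {"check_time": 0.0, "total_time": 0.0}

        def decorator(func):
            @wraps(func)
            def wrapper(*args, **kwargs):
                budget_ok = (state["total_time"] == 0.0 or
                             state["check_time"] / state["total_time"] < overhead_limit)
                t0 = time.perf_counter()
                if budget_ok and pre is not None:
                    assert pre(*args, **kwargs), "precondition violated"
                t1 = time.perf_counter()
                result = func(*args, **kwargs)
                t2 = time.perf_counter()
                if budget_ok and post is not None:
                    assert post(result), "postcondition violated"
                t3 = time.perf_counter()
                state["check_time"] += (t1 - t0) + (t3 - t2)
                state["total_time"] += t3 - t0
                return result
            return wrapper
        return decorator

    @enforce(pre=lambda xs: len(xs) > 0, post=lambda r: r >= 0.0)
    def mean_abs(xs):
        return sum(abs(x) for x in xs) / len(xs)
    ```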

  5. Towards automatic synthesis of linear algebra programs

    Energy Technology Data Exchange (ETDEWEB)

    Boyle, J. M.

    1979-01-01

    Automating the writing of efficient computer programs from an abstract specification of the computation that they are to perform is discussed. Advantages offered by automatic synthesis of programs include economy, reliability, and improved service. The synthesis of simple linear algebra programs is considered in general and then illustrated for the usual matrix product, a column-oriented matrix product, a rank-one update matrix product, and a program to multiply three matrices. The accumulation of inner products and the transformational implementation of program synthesis are also addressed. The discussion attempts to illustrate both the general strategy of the syntheses and how various tactics can be adapted to make the syntheses proceed deterministically to programs that are optimal with respect to certain criteria. (RWR)
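    As a deliberately tiny illustration of synthesizing a linear algebra routine from an abstract choice (here only the loop orientation), the following sketch emits the text of a matrix-product function and then executes it; this is a toy in the spirit of the record, not Boyle's transformation system.

    ```python
    # Toy "synthesizer": generate a row- or column-oriented matrix product.
    def synthesize_matmul(name="matmul", orientation="row"):
        # the order of the two outer loops differs between the two variants
        outer, inner = ("i", "j") if orientation == "row" else ("j", "i")
        return "\n".join([
            f"def {name}(A, B, n):",
            "    C = [[0.0] * n for _ in range(n)]",
            f"    for {outer} in range(n):",
            f"        for {inner} in range(n):",
            "            for k in range(n):",
            "                C[i][j] += A[i][k] * B[k][j]",
            "    return C",
        ])

    code = synthesize_matmul(orientation="column")
    namespace = {}
    exec(code, namespace)                # materialize the generated routine
    print(namespace["matmul"]([[1, 2], [3, 4]], [[5, 6], [7, 8]], 2))
    ```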

  6. Automatic Vessel Segmentation on Retinal Images

    Institute of Scientific and Technical Information of China (English)

    Chun-Yuan Yu; Chia-Jen Chang; Yen-Ju Yao; Shyr-Shen Yu

    2014-01-01

    Several features of retinal vessels can be used to monitor the progression of diseases. Changes in vascular structures, for example, vessel caliber, branching angle, and tortuosity, are portents of many diseases such as diabetic retinopathy and arterial hypertension. This paper proposes an automatic retinal vessel segmentation method based on morphological closing and multi-scale line detection. First, an illumination correction is performed on the green band retinal image. Next, morphological closing and subtraction processing are applied to obtain the crude retinal vessel image. Then, multi-scale line detection is used to refine the vessel image. Finally, the binary vasculature is extracted by the Otsu algorithm. In this paper, to mitigate the drawbacks of multi-scale line detection, only line detectors at 4 scales are used. The experimental results show that the accuracy is 0.939 for the DRIVE (digital retinal images for vessel extraction) retinal database, which is much better than that of other methods.
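    A rough sketch of such a pipeline is given below using OpenCV; the illumination-correction step, kernel sizes and scales are guesses for illustration, not the parameters used by the authors.

    ```python
    # Sketch of a closing + multi-scale line detection + Otsu pipeline.
    import cv2
    import numpy as np

    def segment_vessels(bgr_image):
        # vessels appear dark in the green band, which carries most contrast
        green = bgr_image[:, :, 1]

        # stand-in for illumination correction: local contrast equalization
        green = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(green)

        # closing fills the thin dark vessels; the difference highlights them
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
        closed = cv2.morphologyEx(green, cv2.MORPH_CLOSE, kernel)
        crude = cv2.subtract(closed, green).astype(np.float32)

        # multi-scale line detection: max response over oriented line kernels
        response = np.zeros_like(crude)
        for length in (5, 9, 13, 17):                    # 4 scales
            for angle in range(0, 180, 15):
                k = np.zeros((length, length), np.float32)
                k[length // 2, :] = 1.0 / length         # horizontal line detector
                rot = cv2.getRotationMatrix2D(((length - 1) / 2, (length - 1) / 2), angle, 1.0)
                k = cv2.warpAffine(k, rot, (length, length))
                response = np.maximum(response, cv2.filter2D(crude, -1, k))

        # binarize the refined vessel map with Otsu's method
        response = cv2.normalize(response, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        _, binary = cv2.threshold(response, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return binary
    ```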

  7. Two novel automatic frequency tracking loops

    Science.gov (United States)

    Aguirre, Sergio; Hinedi, Sami

    1989-01-01

    Two automatic-frequency-control (AFC) loops are introduced and analyzed in detail. The algorithms are generalizations of the well-known cross-product AFC loop with improved performance. The first estimator uses running overlapping discrete Fourier transforms to create a discriminator curve proportional to the frequency estimation error, whereas the second one preprocesses the received data and then uses an extended Kalman filter to estimate the input frequency. The algorithms are tested by computer simulations in a highly dynamic environment at low carrier/noise ratio (CNR). The algorithms are suboptimum tracking schemes with a larger frequency-error variance compared to an optimum strategy, but they offer simplicity of mechanization and a very low CNR operating threshold.
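    For readers unfamiliar with the baseline these loops generalize, here is a minimal numpy sketch of a cross-product frequency discriminator driving a first-order AFC loop; the loop gain, sample rate and clean test tone are illustrative choices, not values from the paper.

    ```python
    # Cross-product AFC sketch: Im(z[n] * conj(z[n-1])) ~ frequency error.
    import numpy as np

    def cross_product_afc(iq, fs, gain=0.05):
        """Track the carrier frequency of complex baseband samples `iq`."""
        freq_est, phase = 0.0, 0.0
        estimates = []
        prev = iq[0]
        for z in iq[1:]:
            # mix the sample down by the current frequency estimate
            phase -= 2 * np.pi * freq_est / fs
            zc = z * np.exp(1j * phase)
            # discriminator output is roughly proportional to the residual frequency
            err = np.imag(zc * np.conj(prev))
            freq_est += gain * err * fs / (2 * np.pi)
            prev = zc
            estimates.append(freq_est)
        return np.array(estimates)

    # quick check on a clean 100 Hz tone sampled at 8 kHz
    fs, f0 = 8000.0, 100.0
    t = np.arange(4000) / fs
    print(cross_product_afc(np.exp(2j * np.pi * f0 * t), fs)[-1])  # converges near 100
    ```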

  8. Automatic Tuning of Interactive Perception Applications

    CERN Document Server

    Zhu, Qian; Mummert, Lily; Pillai, Padmanabhan

    2012-01-01

    Interactive applications incorporating high-data rate sensing and computer vision are becoming possible due to novel runtime systems and the use of parallel computation resources. To allow interactive use, such applications require careful tuning of multiple application parameters to meet required fidelity and latency bounds. This is a nontrivial task, often requiring expert knowledge, which becomes intractable as resources and application load characteristics change. This paper describes a method for automatic performance tuning that learns application characteristics and effects of tunable parameters online, and constructs models that are used to maximize fidelity for a given latency constraint. The paper shows that accurate latency models can be learned online, knowledge of application structure can be used to reduce the complexity of the learning task, and operating points can be found that achieve 90% of the optimal fidelity by exploring the parameter space only 3% of the time.
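    The following simplified sketch captures the idea of learning a latency model from a few observed runs and then choosing the highest-fidelity setting whose predicted latency meets the bound; the parameters, latency and fidelity functions are synthetic stand-ins, not the perception applications from the paper.

    ```python
    # Sketch: fit a latency model, then pick the best feasible setting.
    import itertools
    import numpy as np

    np.random.seed(0)
    resolutions = [120, 240, 480, 960]          # tunable parameter 1 (e.g. frame size)
    detectors   = [1, 2, 4]                     # tunable parameter 2 (e.g. detector stages)
    settings = list(itertools.product(resolutions, detectors))

    def observe_latency(res, det):              # pretend online measurement (ms)
        return 0.05 * res * det + np.random.normal(0, 2)

    def fidelity(res, det):                     # pretend fidelity score
        return 1.0 - 1.0 / (0.01 * res * det)

    # learning phase: measure a handful of settings and fit a linear latency model
    sampled = settings[::2]
    X = np.array([[r, d, r * d, 1.0] for r, d in sampled])
    y = np.array([observe_latency(r, d) for r, d in sampled])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    def predicted_latency(res, det):
        return np.array([res, det, res * det, 1.0]) @ coef

    # decision phase: highest fidelity subject to a 150 ms latency constraint
    feasible = [s for s in settings if predicted_latency(*s) <= 150.0]
    best = max(feasible, key=lambda s: fidelity(*s))
    print("chosen setting:", best)
    ```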

  9. New automatic minidisk infiltrometer: design and testing

    Directory of Open Access Journals (Sweden)

    Klípa Vladimír

    2015-06-01

    Full Text Available Soil hydraulic conductivity is a key parameter to predict water flow through the soil profile. We have developed an automatic minidisk infiltrometer (AMI) to enable easy measurement of unsaturated hydraulic conductivity using the tension infiltrometer method in the field. AMI senses the cumulative infiltration by recording change in buoyancy force acting on a vertical solid bar fixed in the reservoir tube of the infiltrometer. Performance of the instrument was tested in the laboratory and in two contrasting catchments at three sites with different land use. Hydraulic conductivities determined using AMI were compared with earlier manually taken readings. The results of laboratory testing demonstrated high accuracy and robustness of the AMI measurement. Field testing of AMI proved the suitability of the instrument for use in the determination of sorptivity and near saturated hydraulic conductivity.

  10. Automatic scanning of NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At the European Laboratory for Particle Physics CERN, personal neutron monitoring for over 4000 collaborators is performed with Kodak NTA film, one of the few suitable dosemeters in the stray radiation environment of a high energy accelerator. After development, films are scanned with a projection microscope. To overcome this lengthy and strenuous procedure an automated analysis system for the dosemeters has been developed. General purpose image recognition software, tailored to the specific needs with a macro language, analyses the digitised microscope image. This paper reports on the successful automatic scanning of NTA films irradiated with neutrons from a /sup 238/Pu-Be source (E approximately=4 MeV), as well as on the extension of the method to neutrons of higher energies. The question of detection limits is discussed in the light of an application of the method in routine personal neutron monitoring. (9 refs).

  11. Unification of automatic target tracking and automatic target recognition

    Science.gov (United States)

    Schachter, Bruce J.

    2014-06-01

    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition, do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanism taking place at varying time scales. A framework is provided for unifying ATT and ATR.

  12. Automatic surveillance system using fish-eye lens camera

    Institute of Scientific and Technical Information of China (English)

    Xue Yuan; Yongduan Song; Xueye Wei

    2011-01-01

    This letter presents an automatic surveillance system using a fish-eye lens camera. Our system achieves wide-area automatic surveillance without a dead angle using only one camera. We propose a new human detection method to select the most adaptive classifier based on the locations of the human candidates. Human regions are detected from the fish-eye image effectively and are corrected for perspective distortion. An experiment is performed on indoor video sequences with different illumination and crowded conditions, with results demonstrating the efficiency of our algorithm.

  13. Design of vehicle automatic driving device

    Institute of Scientific and Technical Information of China (English)

    吴小邦

    2012-01-01

    The automatic driving control of a fire engine mainly involves the key technologies of steering, gear shifting, braking and clutch control. The control mechanisms use mechanically integrated or pneumatically integrated designs and can be installed in the narrow space near the vehicle's gearshift box and under the outside of the cab. Using the fire engine's own air supply and pneumatic positioning control makes debugging convenient and the device maintenance-free. Single-solenoid control valves ensure that, if the vehicle unexpectedly loses power, each cylinder returns to its original position.

  14. National Ignition Facility sub-system design requirements automatic alignment system SSDR 1.5.5

    Energy Technology Data Exchange (ETDEWEB)

    VanArsdall, P.; Bliss, E.

    1996-09-01

    This System Design Requirement document establishes the performance, design, development, and test requirements for the Automatic Alignment System, which is part of the NIF Integrated Computer Control System (ICCS).

  15. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describes a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  16. Automatic Inference of DATR Theories

    CERN Document Server

    Barg, P

    1996-01-01

    This paper presents an approach for the automatic acquisition of linguistic knowledge from unstructured data. The acquired knowledge is represented in the lexical knowledge representation language DATR. A set of transformation rules that establish inheritance relationships and a default-inference algorithm make up the basis components of the system. Since the overall approach is not restricted to a special domain, the heuristic inference strategy uses criteria to evaluate the quality of a DATR theory, where different domains may require different criteria. The system is applied to the linguistic learning task of German noun inflection.

  17. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    Steve Renals

    2011-10-01

    This paper is about the recognition and interpretation of multiparty meetings captured as audio, video and other signals. This is a challenging task since the meetings consist of spontaneous and conversational interactions between a number of participants: it is a multimodal, multiparty, multistream problem. We discuss the capture and annotation of the Augmented Multiparty Interaction (AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  18. Commutated automatic gain control system

    Science.gov (United States)

    Yost, S. R.

    1982-01-01

    The commutated automatic gain control (AGC) system designed and built for the prototype Loran-C receiver is discussed. The current version of the prototype receiver, the Mini L-80, was tested initially in 1980. The receiver uses a Super Jolt microcomputer to control a memory-aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The AGC control adjusts the level of each station signal, such that the early portion of each envelope rise is at about the same amplitude in the receiver envelope detector.

  19. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1 where J is the number of users helping that user.

  20. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  1. Analysis and Treatment of Malfunctions during the Debugging Process of TRT Fast Shutoff Valve System

    Institute of Scientific and Technical Information of China (English)

    秦海兵; 王刚

    2015-01-01

    Through analysis of faults such as the valve failing to open and excessively long opening and buffer times, which occurred during debugging and operation of the TRT fast shutoff valve, the causes of the problems were identified and addressed. Opinions on troubleshooting and maintenance of the hydraulic system are also provided.

  2. On the Cost Control of the Construction Stage of Electricity Installation and Debugging Engineering

    Institute of Scientific and Technical Information of China (English)

    王馗

    2014-01-01

    With the continuous innovation and improvement of the market economic system, competition among electric power enterprises is increasingly fierce. To gain a foothold in this competition, electric power enterprises must strictly control construction costs in order to improve their comprehensive competitiveness. For electricity installation and debugging engineering, the costs of the construction stage mainly include construction materials, construction machinery and equipment, project management, and construction labor; the construction scheme is the factor that most directly influences these costs. Drawing on the author's own working experience, effective measures for controlling the costs of the construction stage of electricity installation and debugging engineering are proposed.

  3. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.

  4. QXT-full Automatic Saccharify Instrument

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    QXT is a fully automatic eight-hole saccharification instrument. The instrument uses microcomputer process-control technology and can correctly automate the entire saccharification process. By adopting a high-precision expert PID control mode and the microcomputer's fully digital automatic calibration technology, it not only ensures the precision of the linear temperature-rise region (1 ℃/min) and the constant-temperature region (temperature error ±0.2 ℃), but also overcomes disturbances.

  5. Automatic Control of Water Pumping Stations

    Institute of Scientific and Technical Information of China (English)

    Muhannad Alrheeh; JIANG Zhengfeng

    2006-01-01

    Automatic control of pumps is an attractive way to operate water pumping stations, which come in many kinds according to their functions. In this paper, the pumping station considered is used for a water supply system. The paper introduces the idea of a pump controller and the important factors that must be considered when designing an automatic control system for water pumping stations, and then presents the automatic control circuit together with the function of all its components.

  6. Memory as a function of attention, level of processing, and automatization.

    Science.gov (United States)

    Fisk, A D; Schneider, W

    1984-04-01

    The relationships between long-term memory (LTM) modification, attentional allocation, and type of processing are examined. Automatic/controlled processing theory (Schneider & Shiffrin, 1977) predicts that the nature and amount of controlled processing determines LTM storage and that stimuli can be automatically processed with no lasting LTM effect. Subjects performed the following: (a) an intentional learning, (b) a semantic categorization, (c) a graphic categorization, (d) a distracting digit-search while intentionally learning words, and (e) a distracting digit-search while ignoring words. Frequency judgments were more accurate in the semantic and intentional conditions than the graphic condition. Frequency judgments in the digit-search conditions were near chance. Experiment 2 extensively trained subjects to develop automatic categorization. Automatic categorization produced no frequency learning and little recognition. These results also disconfirm the Hasher and Zacks (1979) "automatic encoding" proposal regarding the nature of processing.

  7. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since this mask is to later act as a template for a considerable number of dies on a wafer. Defects on the initial blank substrate, and subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical defects or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and distinguishing defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment compared to the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask

  8. Multilabel Learning for Automatic Web Services Tagging

    Directory of Open Access Journals (Sweden)

    Mustapha AZNAG

    2014-08-01

    Full Text Available Recently, some web services portals and search engines such as Biocatalogue and Seekda! have allowed users to manually annotate Web services using tags. User tags provide meaningful descriptions of services and allow users to index and organize their contents. The tagging technique is widely used to annotate objects in Web 2.0 applications. In this paper we propose a novel probabilistic topic model (which extends the CorrLDA model - Correspondence Latent Dirichlet Allocation) to automatically tag web services according to existing manual tags. Our probabilistic topic model is a latent variable model that exploits local correlation labels. Indeed, exploiting label correlations is a challenging and crucial problem, especially in the multi-label learning context. Moreover, several existing systems can recommend tags for web services based on existing manual tags. In most cases, the manual tags have better quality. We also develop three strategies to automatically recommend the best tags for web services. We also propose, in this paper, WS-Portal, an enriched Web services search engine which contains 7063 providers, 115 sub-classes of category and 22236 web services crawled from the Internet. In WS-Portal, several technologies are employed to improve the effectiveness of web service discovery (i.e., web services clustering, tag recommendation, service rating and monitoring). Our experiments are performed on real-world web services. The comparisons of Precision@n and Normalised Discounted Cumulative Gain (NDCGn) values for our approach indicate that the method presented in this paper outperforms the method based on the CorrLDA in terms of ranking and quality of generated tags.

  9. Multiclassifier fusion of an ultrasonic lip reader in automatic speech recognition

    Science.gov (United States)

    Jennnings, David L.

    1994-12-01

    This thesis investigates the use of two active ultrasonic devices in collecting lip information for performing and enhancing automatic speech recognition. The two devices explored are called the 'Ultrasonic Mike' and the 'Lip Lock Loop.' The devices are tested in a speaker dependent isolated word recognition task with a vocabulary consisting of the spoken digits from zero to nine. Two automatic lip readers are designed and tested based on the output of the ultrasonic devices. The automatic lip readers use template matching and dynamic time warping to determine the best candidate for a given test utterance. The automatic lip readers alone achieve accuracies of 65-89%, depending on the number of reference templates used. Next the automatic lip reader is combined with a conventional automatic speech recognizer. Both classifier level fusion and feature level fusion are investigated. Feature fusion is based on combining the feature vectors prior to dynamic time warping. Classifier fusion is based on a pseudo probability mass function derived from the dynamic time warping distances. The combined systems are tested with various levels of acoustic noise added. In one typical test, at a signal to noise ratio of 0dB, the acoustic recognizer's accuracy alone was 78%, the automatic lip reader's accuracy was 69%, but the combined accuracy was 93%. This experiment demonstrates that a simple ultrasonic lip motion detector, that has an output data rate 12,500 times less than a typical video camera, can significantly improve the accuracy of automatic speech recognition in noise.
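    A compact sketch of the two ingredients mentioned above follows: a classic dynamic-time-warping distance for template matching, and a simple score-level fusion that turns per-word DTW distances into a pseudo probability mass function before combining the acoustic and lip channels. The exponential score mapping and the fusion weight are illustrative assumptions, not the thesis' exact formulation.

    ```python
    # DTW template matching plus a simple classifier-level fusion sketch.
    import numpy as np

    def dtw_distance(a, b):
        """Classic DTW between two feature sequences (frames x dims)."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.linalg.norm(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def scores_to_probs(distances):
        """Map per-template DTW distances to a pseudo probability mass function."""
        s = np.exp(-np.asarray(distances) / np.mean(distances))
        return s / s.sum()

    def fuse(acoustic_dists, lip_dists, w_lip=0.4):
        p = (1 - w_lip) * scores_to_probs(acoustic_dists) + w_lip * scores_to_probs(lip_dists)
        return int(np.argmax(p))          # index of the recognized word template

    # toy usage: pick the template whose fused score is best
    acoustic = [12.3, 8.1, 15.2]          # DTW distances of an utterance to 3 templates
    lip      = [10.0, 9.5, 20.1]
    print("recognized template:", fuse(acoustic, lip))
    ```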

  10. Failure of classical traffic flow theories: Stochastic highway capacity and automatic driving

    Science.gov (United States)

    Kerner, Boris S.

    2016-05-01

    In a mini-review Kerner (2013) it has been shown that classical traffic flow theories and models failed to explain empirical traffic breakdown - a phase transition from metastable free flow to synchronized flow at highway bottlenecks. The main objective of this mini-review is to study the consequence of this failure of classical traffic-flow theories for an analysis of empirical stochastic highway capacity as well as for the effect of automatic driving vehicles and cooperative driving on traffic flow. To reach this goal, we show a deep connection between the understanding of empirical stochastic highway capacity and a reliable analysis of automatic driving vehicles in traffic flow. With the use of simulations in the framework of three-phase traffic theory, a probabilistic analysis of the effect of automatic driving vehicles on a mixture traffic flow consisting of a random distribution of automatic driving and manual driving vehicles has been made. We have found that the parameters of automatic driving vehicles can either decrease or increase the probability of the breakdown. The increase in the probability of traffic breakdown, i.e., the deterioration of the performance of the traffic system can occur already at a small percentage (about 5%) of automatic driving vehicles. The increase in the probability of traffic breakdown through automatic driving vehicles can be realized, even if any platoon of automatic driving vehicles satisfies condition for string stability.

  11. Uav-Based Automatic Tree Growth Measurement for Biomass Estimation

    Science.gov (United States)

    Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.

    2016-06-01

    Manual in-situ measurements of geometric tree parameters for the biomass volume estimation are time-consuming and economically non-effective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is an automatic tree growth estimation based on Unmanned Aircraft Vehicle (UAV) imagery. The experiment was conducted in an agriculture test field with scots pine canopies. The data was collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week. In situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal conditions for a flight and parameter settings for image acquisition. The collected images are processed in a state of the art tool resulting in a generation of dense 3D point clouds. The algorithm is developed in order to estimate geometric tree parameters from 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree heights estimation is about 5cm.

  12. UAV-BASED AUTOMATIC TREE GROWTH MEASUREMENT FOR BIOMASS ESTIMATION

    Directory of Open Access Journals (Sweden)

    M. Karpina

    2016-06-01

    Full Text Available Manual in-situ measurements of geometric tree parameters for the biomass volume estimation are time-consuming and economically non-effective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is an automatic tree growth estimation based on Unmanned Aircraft Vehicle (UAV) imagery. The experiment was conducted in an agriculture test field with scots pine canopies. The data was collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week. In situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal conditions for a flight and parameter settings for image acquisition. The collected images are processed in a state of the art tool resulting in a generation of dense 3D point clouds. The algorithm is developed in order to estimate geometric tree parameters from 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree heights estimation is about 5cm.

  13. Automatic Network Reconstruction using ASP

    CERN Document Server

    Ostrowski, Max; Durzinsky, Markus; Marwan, Wolfgang; Wagler, Annegret

    2011-01-01

    Building biological models by inferring functional dependencies from experimental data is an important issue in Molecular Biology. To relieve the biologist from this traditionally manual process, various approaches have been proposed to increase the degree of automation. However, available approaches often yield a single model only, rely on specific assumptions, and/or use dedicated, heuristic algorithms that are intolerant to changing circumstances or requirements in the view of the rapid progress made in Biotechnology. Our aim is to provide a declarative solution to the problem by appeal to Answer Set Programming (ASP) overcoming these difficulties. We build upon an existing approach to Automatic Network Reconstruction proposed by part of the authors. This approach has firm mathematical foundations and is well suited for ASP due to its combinatorial flavor providing a characterization of all models explaining a set of experiments. The usage of ASP has several benefits over the existing heuristic a...

  14. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with "Automatic Validation of Numerical Solutions". The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings ... is the possibility to combine the three methods in an extremely flexible way. We examine some applications where this flexibility is very useful. A method for Taylor expanding solutions of ordinary differential equations is presented, and a method for obtaining interval enclosures of the truncation errors incurred ... with intervals as initial values. A modification of the mean value enclosure of discrete mappings is considered, namely the extended mean value enclosure which in most cases leads to even better enclosures. These methods have previously been described in connection with discretizing solutions of ordinary...

  15. Automatic summarising factors and directions

    CERN Document Server

    Jones, K S

    1998-01-01

    This position paper suggests that progress with automatic summarising demands a better research methodology and a carefully focussed research strategy. In order to develop effective procedures it is necessary to identify and respond to the context factors, i.e. input, purpose, and output factors, that bear on summarising and its evaluation. The paper analyses and illustrates these factors and their implications for evaluation. It then argues that this analysis, together with the state of the art and the intrinsic difficulty of summarising, imply a nearer-term strategy concentrating on shallow, but not surface, text analysis and on indicative summarising. This is illustrated with current work, from which a potentially productive research programme can be developed.

  16. Autoclass: An automatic classification system

    Science.gov (United States)

    Stutz, John; Cheeseman, Peter; Hanson, Robin

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.

  17. An Automatic High Efficient Method for Dish Concentrator Alignment

    OpenAIRE

    Yong Wang; Song Li; Jinshan Xu; Yijiang Wang; Xu Cheng; Changgui Gu; Shengyong Chen; Bin Wan

    2014-01-01

    Alignment of dish concentrator is a key factor to the performance of solar energy system. We propose a new method for the alignment of faceted solar dish concentrator. The isosceles triangle configuration of facet’s footholds determines a fixed relation between light spot displacements and foothold movements, which allows an automatic determination of the amount of adjustments. Tests on a 25 kW Stirling Energy System dish concentrator verify the feasibility, accuracy, and efficiency of our...

  18. A fully automatic system for acid-base coulometric titrations

    OpenAIRE

    1990-01-01

    An automatic system for acid-base titrations by electrogeneration of H+ and OH- ions, with potentiometric end-point detection, was developed. The system includes a PC-compatible computer for instrumental control, data acquisition and processing, which allows up to 13 samples to be analysed sequentially with no human intervention. The system performance was tested on the titration of standard solutions, which it carried out with low errors and RSD. It was subsequently applied to the analysis o...

  19. Automatic Camera Viewfinder Based on TI DaVinci

    Institute of Scientific and Technical Information of China (English)

    WANG Hai-gang; XIAO Zhi-tao; GENG Lei

    2009-01-01

    Presented is an automatic camera viewfinder based on the TI DaVinci digital platform; the Linux-based software scheme is the main topic discussed. The system can raise an alarm and save the picture when preset features appear in the view, and the saved pictures can be downloaded and zoomed out. All functions are operated from an OSD menu. The system is well regarded for its flexible operation, powerful functions, multitasking, and stable performance.

  20. Computer program for automatic generation of BWR control rod patterns

    Energy Technology Data Exchange (ETDEWEB)

    Taner, M.S.; Levine, S.H.; Hsia, M.Y. (Pennsylvania State Univ., University Park (United States))

    1990-01-01

    A computer program named OCTOPUS has been developed to automatically determine a control rod pattern that approximates some desired target power distribution as closely as possible without violating any thermal safety or reactor criticality constraints. The program OCTOPUS performs a semi-optimization task based on the method of approximation programming (MAP) to develop control rod patterns. The SIMULATE-E code is used to determine the nucleonic characteristics of the reactor core state.

  1. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available Automation has brought many advances to existing technologies; one technology that has seen substantial development is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative answer to the energy crisis while reducing manpower through automatic operation. The researchers believe an automatic shrimp feeding system may help solve the problems of manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system; it specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the system. The researchers designed and developed an automatic system that uses a 10-hour timer, set at intervals preferred by the user, running as a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer and controls the activation or termination of the electrical loads; the system is powered by a solar panel outputting electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and to achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  2. Automatic cobb angle determination from radiographic images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H.F.; Veldhuizen, Albert G.; Ooijen, van Peter M.A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Met

  3. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  4. Analysis on performance of communicable disease automatic early warning information system in Gongshu District of Hangzhou City, 2008-2013

    Institute of Scientific and Technical Information of China (English)

    谢健; 徐旭卿; 鲁琴宝

    2015-01-01

    Objective: To evaluate the performance of the communicable disease automatic early warning information system during 2008 to 2013 and provide scientific evidence for improving the system. Methods: Descriptive epidemiological analysis was conducted on the information obtained from the automatic early warning system from 2008 to 2013 in Gongshu District of Hangzhou City. Results: A total of 1,626 warning signals involving 20 kinds of communicable diseases were released by the automatic early warning system from 21 April 2008 to 31 December 2013. The five major diseases were other infectious diarrhea, mumps, measles, bacillary dysentery, and hand, foot and mouth disease, accounting for 87.27% of all early warning signals; 13 signal events were verified as outbreaks/epidemics by field investigation (including 6 of cholera and 3 of influenza). The positive predictive value (PVP) of early warning was 0.80% and the sensitivity was 85.71%. From warning signal release to reporting, the median response interval was 0.42 hours. Conclusions: The automatic early warning system ran stably and effectively for the early detection of priority diseases such as cholera in Gongshu District.

  5. Hidden Markov models in automatic speech recognition

    Science.gov (United States)

    Wrzoskowicz, Adam

    1993-11-01

    This article describes a method for constructing an automatic speech recognition system based on hidden Markov models (HMMs). The author discusses the basic concepts of HMM theory and the application of these models to the analysis and recognition of speech signals. The author provides algorithms which make it possible to train the ASR system and recognize signals on the basis of distinct stochastic models of selected speech sound classes. The author describes the specific components of the system and the procedures used to model and recognize speech. The author discusses problems associated with the choice of optimal signal detection and parameterization characteristics and their effect on the performance of the system. The author presents different options for the choice of speech signal segments and their consequences for the ASR process. The author gives special attention to the use of lexical, syntactic, and semantic information for the purpose of improving the quality and efficiency of the system. The author also describes an ASR system developed by the Speech Acoustics Laboratory of the IBPT PAS. The author discusses the results of experiments on the effect of noise on the performance of the ASR system and describes methods of constructing HMM's designed to operate in a noisy environment. The author also describes a language for human-robot communications which was defined as a complex multilevel network from an HMM model of speech sounds geared towards Polish inflections. The author also added mandatory lexical and syntactic rules to the system for its communications vocabulary.
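    As a pointer to the core computation behind such recognizers, the snippet below implements the forward algorithm for a toy discrete-observation HMM; the two-state left-to-right model and its probabilities are arbitrary illustrative values.

    ```python
    # Forward algorithm for a toy discrete-observation HMM.
    import numpy as np

    def forward_log_likelihood(pi, A, B, obs):
        """pi: initial state probs (N,), A: transition matrix (N, N),
        B: emission probs per state and symbol (N, M), obs: symbol indices."""
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return np.log(alpha.sum())

    pi = np.array([1.0, 0.0])
    A  = np.array([[0.7, 0.3],
                   [0.0, 1.0]])               # simple left-to-right model
    B  = np.array([[0.8, 0.2],                # state 0 mostly emits symbol 0
                   [0.1, 0.9]])               # state 1 mostly emits symbol 1
    print(forward_log_likelihood(pi, A, B, [0, 0, 1, 1]))
    ```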

  6. Automatic figure classification in bioscience literature.

    Science.gov (United States)

    Kim, Daehyun; Ramesh, Balaji Polepalli; Yu, Hong

    2011-10-01

    Millions of figures appear in biomedical articles, and it is important to develop an intelligent figure search engine to return relevant figures based on user entries. In this study we report a figure classifier that automatically classifies biomedical figures into five predefined figure types: Gel-image, Image-of-thing, Graph, Model, and Mix. The classifier explored rich image features and integrated them with text features. We performed feature selection and explored different classification models, including a rule-based figure classifier, a supervised machine-learning classifier, and a multi-model classifier, the latter of which integrated the first two classifiers. Our results show that feature selection improved figure classification and the novel image features we explored were the best among image features that we have examined. Our results also show that integrating text and image features achieved better performance than using either of them individually. The best system is a multi-model classifier which combines the rule-based hierarchical classifier and a support vector machine (SVM) based classifier, achieving a 76.7% F1-score for five-type classification. We demonstrated our system at http://figureclassification.askhermes.org/.

  7. Comparing different classifiers for automatic age estimation.

    Science.gov (United States)

    Lanitis, Andreas; Draganova, Chrisina; Christodoulou, Chris

    2004-02-01

    We describe a quantitative evaluation of the performance of different classifiers in the task of automatic age estimation. In this context, we generate a statistical model of facial appearance, which is subsequently used as the basis for obtaining a compact parametric description of face images. The aim of our work is to design classifiers that accept the model-based representation of unseen images and produce an estimate of the age of the person in the corresponding face image. For this application, we have tested different classifiers: a classifier based on the use of quadratic functions for modeling the relationship between face model parameters and age, a shortest distance classifier, and artificial neural network based classifiers. We also describe variations to the basic method where we use age-specific and/or appearance specific age estimation methods. In this context, we use age estimation classifiers for each age group and/or classifiers for different clusters of subjects within our training set. In those cases, part of the classification procedure is devoted to choosing the most appropriate classifier for the subject/age range in question, so that more accurate age estimates can be obtained. We also present comparative results concerning the performance of humans and computers in the task of age estimation. Our results indicate that machines can estimate the age of a person almost as reliably as humans.

  8. SPHOTOM - Package for an Automatic Multicolour Photometry

    Science.gov (United States)

    Parimucha, Š.; Vaňko, M.; Mikloš, P.

    2012-04-01

    We present basic information about the SPHOTOM package for automatic multicolour photometry. This package is in development for the creation of a photometric pipeline, which we plan to use in the near future with our new instruments. It can operate in two independent modes: (i) a GUI mode, in which the user selects images and controls the functions of the package through an interface, and (ii) a command line mode, in which all processes are controlled using a main parameter file. SPHOTOM is developed as a universal package for Linux-based systems with easy implementation for different observatories. The photometric part of the package is based on the SExtractor code, which allows us to detect all objects on the images and perform their photometry with different apertures. We can also perform astrometric solutions for all images for a correct cross-identification of the stars on the images. The result is a catalogue of all objects with their instrumental photometric measurements, which are consequently used for differential magnitude calculations with one or more comparison stars, transformation to an international system, and determination of colour indices.

  9. Automatic cloud classification of whole sky images

    Directory of Open Access Journals (Sweden)

    A. Heinle

    2010-05-01

    Full Text Available The recently increasing development of whole sky imagers enables temporal and spatial high-resolution sky observations. One application already performed in most cases is the estimation of fractional sky cover. A distinction between different cloud types, however, is still in progress. Here, an automatic cloud classification algorithm is presented, based on a set of mainly statistical features describing the color as well as the texture of an image. The k-nearest-neighbour classifier is used due to its high performance in solving complex issues, simplicity of implementation and low computational complexity. Seven different sky conditions are distinguished: high thin clouds (cirrus and cirrostratus), high patched cumuliform clouds (cirrocumulus and altocumulus), stratocumulus clouds, low cumuliform clouds, thick clouds (cumulonimbus and nimbostratus), stratiform clouds and clear sky. Based on the Leave-One-Out Cross-Validation the algorithm achieves an accuracy of about 97%. In addition, a test run of random images is presented, still outperforming previous algorithms by yielding a success rate of about 75%, or up to 88% if only "serious" errors with respect to radiation impact are considered. Reasons for the decrement in accuracy are discussed, and ideas to further improve the classification results, especially in problematic cases, are investigated.
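    A bare-bones sketch of the classification step is shown below: a handful of statistical colour/texture features and a k-nearest-neighbour majority vote. The specific features are simplified stand-ins for the feature set described by the authors.

    ```python
    # Minimal colour/texture features plus a k-NN vote for sky images.
    import numpy as np

    def sky_features(rgb):                      # rgb: H x W x 3 array in [0, 255]
        r = rgb[..., 0].astype(float)
        b = rgb[..., 2].astype(float)
        ratio = (b - r) / (b + r + 1e-6)        # red/blue contrast separates cloud from sky
        return np.array([r.mean(), b.mean(), ratio.mean(), ratio.std(),
                         np.abs(np.diff(b, axis=1)).mean()])   # crude texture term

    def knn_predict(train_X, train_y, x, k=3):
        """train_y must hold non-negative integer class labels."""
        idx = np.argsort(np.linalg.norm(train_X - x, axis=1))[:k]
        votes = train_y[idx]
        return np.bincount(votes).argmax()      # majority vote among the k neighbours
    ```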

  10. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    Science.gov (United States)

    Ansar, Adnan I.

    2011-01-01

    MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input, and performs a registration over scale and in-plane rotation fully automatically.

  11. An Automatic Labeling of K-means Clusters based on Chi-Square Value

    Science.gov (United States)

    Kusumaningrum, R.; Farikhin

    2017-01-01

    Automatic labeling methods in text clustering are widely implemented. However, there are limited studies on automatic cluster labeling for numeric data points. Therefore, the aim of this study is to develop a novel automatic cluster labeling method for numeric data points that uses the result of a Chi-Square test as the cluster label. We performed K-means clustering as the clustering method, with the disparity of Health Human Resources as a case study. The results show that the accuracy of cluster labeling is about 89.14%.
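    One way to read this idea is sketched below: cluster with K-means, then label each cluster by the feature whose median-split distribution differs most from the remaining clusters under a chi-square test. This is an interpretation for illustration, not necessarily the authors' exact procedure.

    ```python
    # K-means clustering with chi-square based cluster labeling (sketch).
    import numpy as np
    from scipy.stats import chi2_contingency
    from sklearn.cluster import KMeans

    def label_clusters(X, feature_names, n_clusters=3):
        clusters = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
        labels = {}
        for c in range(n_clusters):
            in_cluster = clusters == c
            best_feature, best_chi2 = None, -1.0
            for j, name in enumerate(feature_names):
                # discretize the feature into "low/high" halves by its median
                high = X[:, j] > np.median(X[:, j])
                table = np.array([[np.sum(high & in_cluster), np.sum(~high & in_cluster)],
                                  [np.sum(high & ~in_cluster), np.sum(~high & ~in_cluster)]])
                if (table.sum(axis=0) == 0).any() or (table.sum(axis=1) == 0).any():
                    continue                  # degenerate table, skip this feature
                chi2 = chi2_contingency(table)[0]
                if chi2 > best_chi2:
                    best_feature, best_chi2 = name, chi2
            labels[c] = best_feature
        return clusters, labels
    ```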

  12. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers’ positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker’s coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis.
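    For reference, a compact pyramidal Kanade-Lucas-Tomasi tracking loop with OpenCV is sketched below; the 4-pixel threshold echoes the manual-correction criterion mentioned in the abstract, while the window size and the way flagged points are handled are illustrative assumptions.

    ```python
    # Sketch of KLT marker tracking with a crude plausibility check.
    import cv2
    import numpy as np

    def track_markers(frames, initial_points, max_jump_px=4.0):
        """frames: list of grayscale images; initial_points: (N, 2) float array."""
        pts = initial_points.astype(np.float32).reshape(-1, 1, 2)
        trajectory = [pts.reshape(-1, 2).copy()]
        for prev, nxt in zip(frames[:-1], frames[1:]):
            new_pts, status, _ = cv2.calcOpticalFlowPyrLK(
                prev, nxt, pts, None, winSize=(21, 21), maxLevel=3)
            jump = np.linalg.norm(new_pts - pts, axis=2).ravel()
            # flag points that were lost or moved implausibly far; in a real
            # workflow these would be handed back to the operator for correction
            bad = (status.ravel() == 0) | (jump > max_jump_px)
            new_pts[bad] = pts[bad]
            pts = new_pts
            trajectory.append(pts.reshape(-1, 2).copy())
        return np.stack(trajectory)          # (n_frames, N, 2) marker positions
    ```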

  13. Confidence and rejection in automatic speech recognition

    Science.gov (United States)

    Colton, Larry Don

    Automatic speech recognition (ASR) is performed imperfectly by computers. For some designated part (e.g., word or phrase) of the ASR output, rejection is deciding (yes or no) whether it is correct, and confidence is the probability (0.0 to 1.0) of it being correct. This thesis presents new methods of rejecting errors and estimating confidence for telephone speech. These are also called word or utterance verification and can be used in wordspotting or voice-response systems. Open-set or out-of-vocabulary situations are a primary focus. Language models are not considered. In vocabulary-dependent rejection all words in the target vocabulary are known in advance and a strategy can be developed for confirming each word. A word-specific artificial neural network (ANN) is shown to discriminate well, and scores from such ANNs are shown on a closed-set recognition task to reorder the N-best hypothesis list (N=3) for improved recognition performance. Segment-based duration and perceptual linear prediction (PLP) features are shown to perform well for such ANNs. The majority of the thesis concerns vocabulary- and task-independent confidence and rejection based on phonetic word models. These can be computed for words even when no training examples of those words have been seen. New techniques are developed using phoneme ranks instead of probabilities in each frame. These are shown to perform as well as the best other methods examined despite the data reduction involved. Certain new weighted averaging schemes are studied but found to give no performance benefit. Hierarchical averaging is shown to improve performance significantly: frame scores combine to make segment (phoneme state) scores, which combine to make phoneme scores, which combine to make word scores. Use of intermediate syllable scores is shown to not affect performance. Normalizing frame scores by an average of the top probabilities in each frame is shown to improve performance significantly. Perplexity of the wrong

  14. The nature of the automatization deficit in Chinese children with dyslexia.

    Science.gov (United States)

    Wong, Simpson W L; Ho, Connie S-H

    2010-01-01

    Clarifying whether automatization deficits constitute the primary causes or symptoms of developmental dyslexia, we focused on three critical issues of the dyslexic automatization deficit, namely universality, domain specificity, and severity. Thirty Chinese dyslexic children (mean age 10 years and 5 months), 30 chronological-age-, and 30 reading-level-matched children were tested in 4 areas of automaticity: motor, visual search, Stroop facilitation effects, and automatic word recognition. The results showed that the dyslexic children performed significantly worse than the CA-controls but not the RL-controls in all the tasks except for Stroop congruent-color words, on which they performed worse than children in both control groups. The deficits reflect a lag in reading experiences rather than a persistent cognitive deficit.

  15. 7 CFR 58.418 - Automatic cheese making equipment.

    Science.gov (United States)

    2010-01-01

    ... processing or packaging areas. (c) Automatic salter. The automatic salter shall be constructed of stainless.... The automatic salter shall be constructed so that it can be satisfactorily cleaned. The salting...

  16. A Log Anomaly Detection Algorithm for Debugging Based on Grammar-Based Codes

    Institute of Scientific and Technical Information of China (English)

    王楠; 韩冀中; 方金云

    2013-01-01

    Debugging non-deterministic bugs has long been an important problem in software development. In recent years, with the rapid emergence of large cloud computing systems and the development of record-replay debugging, the key to such debugging problems has become mining anomaly information from textual console logs and/or execution-flow logs, so anomaly detection algorithms can be applied in this area. However, although many approaches have been proposed, traditional anomaly detection algorithms were designed for detecting network attacks and are not well suited to the new problems. One important reason is the Markov assumption on which many traditional anomaly detection methods are based: Markov-based methods are sensitive to abrupt changes in event transitions, whereas the new problems in system diagnosis require the ability to detect semantic misbehaviors. Experimental results show that Markov-based methods perform poorly on such problems. This paper presents a novel anomaly detection algorithm based on grammar-based codes. Unlike previous approaches, the algorithm is non-Markovian: it does not rely on statistical models, probabilistic models, machine learning, or the Markov assumption, and it is extremely simple to design and implement. Experiments show that it outperforms traditional methods in detecting high-level semantic anomalies.
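
    The record above relies on grammar-based codes. As a rough, hedged illustration of the underlying idea (a log sequence is anomalous when it does not compress well against normal behaviour), the sketch below substitutes a generic compressor (zlib) for a grammar-based code, so it only approximates the paper's method; the log strings are invented.

      import zlib

      def compressed_size(data: bytes) -> int:
          return len(zlib.compress(data, 9))

      def anomaly_score(normal_log: str, candidate: str) -> float:
          """Extra code length the candidate adds beyond the normal log, per character.
          Larger scores suggest the candidate does not follow the normal 'grammar'."""
          base = compressed_size(normal_log.encode())
          joint = compressed_size((normal_log + candidate).encode())
          return (joint - base) / max(1, len(candidate))

      normal = "open read read write close\n" * 200
      print(anomaly_score(normal, "open read read write close\n"))    # low: familiar pattern
      print(anomaly_score(normal, "write close open panic retry\n"))  # higher: unusual ordering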

  17. A Hierarchy of Tree-Automatic Structures

    CERN Document Server

    Finkel, Olivier

    2011-01-01

    We consider $\omega^n$-automatic structures which are relational structures whose domain and relations are accepted by automata reading ordinal words of length $\omega^n$ for some integer $n\geq 1$. We show that all these structures are $\omega$-tree-automatic structures presentable by Muller or Rabin tree automata. We prove that the isomorphism relation for $\omega^2$-automatic (resp. $\omega^n$-automatic for $n>2$) boolean algebras (respectively, partial orders, rings, commutative rings, non commutative rings, non commutative groups) is not determined by the axiomatic system ZFC. We infer from the proof of the above result that the isomorphism problem for $\omega^n$-automatic boolean algebras, $n > 1$, (respectively, rings, commutative rings, non commutative rings, non commutative groups) is neither a $\Sigma_2^1$-set nor a $\Pi_2^1$-set. We obtain that there exist infinitely many $\omega^n$-automatic, hence also $\omega$-tree-automatic, atomless boolean algebras $B_n$, $n\geq 1$, which are pairwise isomorp...

  18. Automatic image cropping for republishing

    Science.gov (United States)

    Cheatle, Phil

    2010-02-01

    Image cropping is an important aspect of creating aesthetically pleasing web pages and repurposing content for different web or printed output layouts. Cropping provides both the possibility of improving the composition of the image, and also the ability to change the aspect ratio of the image to suit the layout design needs of different document or web page formats. This paper presents a method for aesthetically cropping images on the basis of their content. Underlying the approach is a novel segmentation-based saliency method which identifies some regions as "distractions", as an alternative to the conventional "foreground" and "background" classifications. Distractions are a particular problem with typical consumer photos found on social networking websites such as FaceBook, Flickr etc. Automatic cropping is achieved by identifying the main subject area of the image and then using an optimization search to expand this to form an aesthetically pleasing crop. Evaluation of aesthetic functions like auto-crop is difficult as there is no single correct solution. A further contribution of this paper is an automated evaluation method which goes some way towards handling the complexity of aesthetic assessment. This allows crop algorithms to be easily evaluated against a large test set.

  19. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided calculation of PASI for estimating lesions. Current algorithms can only handle erythema alone or only deal with scaling segmentation, whereas in practice scaling and erythema are often mixed together. In order to segment the whole lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. First, polarized light is applied during imaging, exploiting the skin's Tyndall effect to eliminate reflections, and the Lab color space is used to match human perception. Second, a sliding window and its sub-windows are used to extract textural and color features; in this step an image-roughness feature is defined so that scaling can be easily separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the data set provided by Union Hospital, more than 90% of images can be segmented accurately.
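
    A hedged sketch of the window-classification step described above, with Lab colour means and a simple gradient-based roughness measure fed to a random forest. The window size, feature definitions and label scheme are placeholders rather than the paper's exact choices.

      import numpy as np
      import cv2
      from sklearn.ensemble import RandomForestClassifier

      def window_features(img_bgr, cx, cy, win=15):
          """Lab colour means and a gradient-based roughness measure for one window
          centred at (cx, cy); assumes the window lies fully inside the image."""
          lab = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2LAB)
          gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
          gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
          gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
          h = win // 2
          sl = np.s_[cy - h:cy + h + 1, cx - h:cx + h + 1]
          colour = lab[sl].reshape(-1, 3).mean(axis=0)
          roughness = float(np.std(np.hypot(gx[sl], gy[sl])))   # scaling tends to score high
          return [*colour, roughness]

      def train_lesion_classifier(images, samples):
          """samples: (image_index, cx, cy, label) tuples with a hypothetical scheme
          such as 0 = normal skin, 1 = erythema, 2 = scaling."""
          X = [window_features(images[i], cx, cy) for i, cx, cy, _ in samples]
          y = [label for *_, label in samples]
          return RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)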

  20. Automatic Assessment of Programming assignment

    Directory of Open Access Journals (Sweden)

    Surendra Gupta

    2012-01-01

    Full Text Available The study of programming languages is increasingly important, and effective programming skills are essential for all computer science students, who can master programming only through intensive exercise practice. With the growing number of students per class, the assessment of programming exercises creates an extensive workload for the teacher or instructor, particularly if it has to be carried out manually. In this paper, we propose an automatic assessment system for programming assignments that uses a verification program with random inputs. One of the most important properties of a program is that it carries out its intended function; the intended function of a program, or part of one, can be checked by a verification program based on the inverse function. We used such verification programs to check intended functionality and to evaluate submissions. The assessment system has been tested on basic C programming courses, and the initial results are promising, showing that it works well for basic programming exercises.
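
    As a hedged illustration of assessment by a verification program with random inputs, the sketch below checks a hypothetical "sort" submission: the output must be ordered and must be a permutation of the input. The task and checks are invented for illustration; the paper itself targets C submissions.

      import random
      from collections import Counter

      def verify_sort(student_sort, trials=1000, seed=0):
          """Auto-assess a 'sort' submission with random inputs: the output must be
          non-decreasing and use exactly the same multiset of elements as the input."""
          rng = random.Random(seed)
          for _ in range(trials):
              data = [rng.randint(-100, 100) for _ in range(rng.randint(0, 20))]
              out = student_sort(list(data))
              if Counter(out) != Counter(data):
                  return False, data                      # wrong multiset of elements
              if any(a > b for a, b in zip(out, out[1:])):
                  return False, data                      # not ordered
          return True, None

      # A (buggy) hypothetical submission that drops duplicates:
      buggy = lambda xs: sorted(set(xs))
      print(verify_sort(buggy))      # (False, <counterexample>)
      print(verify_sort(sorted))     # (True, None)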

  1. Automatic sleep staging using state machine-controlled decision trees.

    Science.gov (United States)

    Imtiaz, Syed Anas; Rodriguez-Villegas, Esther

    2015-01-01

    Automatic sleep staging from a reduced number of channels is desirable to save time, reduce costs and make sleep monitoring more accessible by providing home-based polysomnography. This paper introduces a novel algorithm for automatic scoring of sleep stages using a combination of small decision trees driven by a state machine. The algorithm uses two channels of EEG for feature extraction and has a state machine that selects a suitable decision tree for classification based on the prevailing sleep stage. Its performance has been evaluated using the complete dataset of 61 recordings from PhysioNet Sleep EDF Expanded database achieving an overall accuracy of 82% and 79% on training and test sets respectively. The algorithm has been developed with a very small number of decision tree nodes that are active at any given time making it suitable for use in resource-constrained wearable systems.
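
    A toy, hedged sketch of the control flow described above: a state machine keeps the prevailing sleep stage and selects a small decision tree to classify the next epoch. The per-stage trees, features and training data are placeholders, not the published algorithm.

      from sklearn.tree import DecisionTreeClassifier

      STAGES = ["Wake", "N1", "N2", "N3", "REM"]

      class StateMachineStager:
          def __init__(self, trees):
              self.trees = trees        # dict: prevailing stage -> fitted small tree
              self.state = "Wake"       # assume recordings start awake

          def step(self, epoch_features):
              """Score one epoch with the tree selected by the prevailing stage."""
              tree = self.trees[self.state]
              self.state = tree.predict([epoch_features])[0]
              return self.state

      def fit_stage_trees(X_by_stage, y_by_stage, max_nodes=8):
          """One shallow tree per prevailing stage, fitted on (epoch features,
          next-stage label) pairs taken from epochs that followed that stage."""
          return {s: DecisionTreeClassifier(max_leaf_nodes=max_nodes, random_state=0)
                         .fit(X_by_stage[s], y_by_stage[s])
                  for s in STAGES}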

  2. OSKI: A Library of Automatically Tuned Sparse Matrix Kernels

    Energy Technology Data Exchange (ETDEWEB)

    Vuduc, R; Demmel, J W; Yelick, K A

    2005-07-19

    The Optimized Sparse Kernel Interface (OSKI) is a collection of low-level primitives that provide automatically tuned computational kernels on sparse matrices, for use by solver libraries and applications. These kernels include sparse matrix-vector multiply and sparse triangular solve, among others. The primary aim of this interface is to hide the complex decision-making process needed to tune the performance of a kernel implementation for a particular user's sparse matrix and machine, while also exposing the steps and potentially non-trivial costs of tuning at run-time. This paper provides an overview of OSKI, which is based on our research on automatically tuned sparse kernels for modern cache-based superscalar machines.

  3. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    Full Text Available There are increasing applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chess board, which is found automatically in each image, when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrate the viability of the algorithm proposed, as a valuable tool for camera calibration.
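
    For comparison, a hedged sketch of the standard OpenCV calibration route for the same task. Unlike the method above, OpenCV's findChessboardCorners must be told the number of inner corners, and the image file names here are hypothetical; the sketch assumes at least one board is detected.

      import glob
      import cv2
      import numpy as np

      PATTERN = (7, 7)   # inner corners; OpenCV needs this, the paper's method does not
      objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
      objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

      obj_points, img_points, size = [], [], None
      for path in glob.glob("calib_*.png"):              # hypothetical image names
          gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
          found, corners = cv2.findChessboardCorners(gray, PATTERN)
          if found:
              corners = cv2.cornerSubPix(
                  gray, corners, (11, 11), (-1, -1),
                  (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
              obj_points.append(objp)
              img_points.append(corners)
              size = gray.shape[::-1]

      # Intrinsic matrix, distortion coefficients, and per-view extrinsics.
      rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
      print("reprojection RMS:", rms)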

  4. Automatic control of biomass gasifiers using fuzzy inference systems

    Energy Technology Data Exchange (ETDEWEB)

    Sagues, C. [Universidad de Zaragoza (Spain). Dpto. de Informatica e Ingenieria de Sistemas; Garcia-Bacaicoa, P.; Serrano, S. [Universidad de Zaragoza (Spain). Dpto. de Ingenieria Quimica y Medio Ambiente

    2007-03-15

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need models to be tuned, a plant model is proposed which has turned out very useful to prove different combinations of membership functions and rules in the proposed fuzzy control. The global control scheme is shown, including the elements to generate the set points for the process variables automatically. There, the type of biomass and its moisture content are the only data which need to be introduced to the controller by a human operator at the beginning of operation to make it work autonomously. The advantages and good performance of the fuzzy controller with the automatic generation of set points, compared to controllers utilising fixed parameters, are demonstrated. (author)

  5. Automatic control of biomass gasifiers using fuzzy inference systems.

    Science.gov (United States)

    Sagüés, C; García-Bacaicoa, P; Serrano, S

    2007-03-01

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need models to be tuned, a plant model is proposed which has turned out very useful to prove different combinations of membership functions and rules in the proposed fuzzy control. The global control scheme is shown, including the elements to generate the set points for the process variables automatically. There, the type of biomass and its moisture content are the only data which need to be introduced to the controller by a human operator at the beginning of operation to make it work autonomously. The advantages and good performance of the fuzzy controller with the automatic generation of set points, compared to controllers utilising fixed parameters, are demonstrated.
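
    The two records above describe the same fuzzy gasifier controller. As a hedged, hand-rolled illustration of the kind of rule evaluation such a controller performs, the sketch below fuzzifies one input (temperature error), fires three rules and defuzzifies a suggested air-flow change; all membership functions, variables and rules are illustrative, not those of the cited controller.

      def tri(x, a, b, c):
          """Triangular membership function with corners a <= b <= c."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def airflow_change(temp_error):
          mu_low  = tri(temp_error, -100.0, -50.0, 0.0)    # temperature too low
          mu_ok   = tri(temp_error,  -25.0,   0.0, 25.0)   # temperature about right
          mu_high = tri(temp_error,    0.0,  50.0, 100.0)  # temperature too high

          # Rule consequents as output singletons (per-cent change in air flow),
          # defuzzified by a weighted average.
          rules = [(mu_low, +10.0), (mu_ok, 0.0), (mu_high, -10.0)]
          total = sum(mu for mu, _ in rules)
          return sum(mu * out for mu, out in rules) / total if total else 0.0

      print(airflow_change(-60.0))   # gasifier too cold -> increase air flow
      print(airflow_change(5.0))     # near the set point -> small correction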

  6. Pilot control through the TAFCOS automatic flight control system

    Science.gov (United States)

    Wehrend, W. R., Jr.

    1979-01-01

    The set of flight control logic used in a recently completed flight test program to evaluate the total automatic flight control system (TAFCOS) with the controller operating in a fully automatic mode, was used to perform an unmanned simulation on an IBM 360 computer in which the TAFCOS concept was extended to provide a multilevel pilot interface. A pilot TAFCOS interface for direct pilot control by use of a velocity-control-wheel-steering mode was defined as well as a means for calling up conventional autopilot modes. It is concluded that the TAFCOS structure is easily adaptable to the addition of a pilot control through a stick-wheel-throttle control similar to conventional airplane controls. Conventional autopilot modes, such as airspeed-hold, altitude-hold, heading-hold, and flight path angle-hold, can also be included.

  7. Sparse encoding of automatic visual association in hippocampal networks

    DEFF Research Database (Denmark)

    Hulme, Oliver J; Skov, Martin; Chadwick, Martin J

    2014-01-01

    Intelligent action entails exploiting predictions about associations between elements of ones environment. The hippocampus and mediotemporal cortex are endowed with the network topology, physiology, and neurochemistry to automatically and sparsely code sensori-cognitive associations that can...... by these stimuli. Using multivariate Bayesian decoding, we show that human hippocampal and temporal neocortical structures host sparse associative representations that are automatically triggered by visual input. Furthermore, as predicted theoretically, there was a significant increase in sparsity in the Cornu...... Ammonis subfields, relative to the entorhinal cortex. Remarkably, the sparsity of CA encoding correlated significantly with associative memory performance over subjects; elsewhere within the temporal lobe, entorhinal, parahippocampal, perirhinal and fusiform cortices showed the highest model evidence...

  8. Detection of Off-normal Images for NIF Automatic Alignment

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Awwal, A S; McClay, W A; Ferguson, S W; Burkhart, S C

    2005-07-11

    One of the major purposes of the National Ignition Facility at Lawrence Livermore National Laboratory is to accurately focus 192 high energy laser beams on a nanoscale (mm) fusion target at the precise location and time. The automatic alignment system developed for NIF is used to align the beams in order to achieve the required focusing effect. However, if a distorted image is inadvertently created by a faulty camera shutter or some other opto-mechanical malfunction, the resulting image, termed "off-normal", must be detected and rejected before further alignment processing occurs. Thus the off-normal processor acts as a preprocessor to automatic alignment image processing. In this work, we discuss the development of an "off-normal" pre-processor capable of rapidly detecting and rejecting off-normal images. A wide variety of off-normal images for each loop is used to develop the rejection criterion accurately.

  9. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

    Full Text Available In this paper, an automatic emotion detection system is built for a computer or machine to detect the emotional state from facial expressions in human-computer communication. First, dynamic motion features are extracted from facial expression videos, and then advanced machine learning methods for classification and regression are used to predict the emotional states. The system is evaluated on two publicly available datasets, i.e. GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can read the facial expression of its user automatically. This technique can be integrated into applications such as smart robots, interactive games and smart surveillance systems.

  10. AUTOMATIC EXTRACTION OF BUILDING OUTLINE FROM HIGH RESOLUTION AERIAL IMAGERY

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2016-06-01

    Full Text Available In this paper, a new approach for automated extraction of building boundary from high resolution imagery is proposed. The proposed approach uses both geometric and spectral properties of a building to detect and locate buildings accurately. It consists of automatic generation of high quality point cloud from the imagery, building detection from point cloud, classification of building roof and generation of building outline. Point cloud is generated from the imagery automatically using semi-global image matching technology. Buildings are detected from the differential surface generated from the point cloud. Further classification of building roof is performed in order to generate accurate building outline. Finally classified building roof is converted into vector format. Numerous tests have been done on images in different locations and results are presented in the paper.

  11. Automatic Extraction of Building Outline from High Resolution Aerial Imagery

    Science.gov (United States)

    Wang, Yandong

    2016-06-01

    In this paper, a new approach for automated extraction of building boundary from high resolution imagery is proposed. The proposed approach uses both geometric and spectral properties of a building to detect and locate buildings accurately. It consists of automatic generation of high quality point cloud from the imagery, building detection from point cloud, classification of building roof and generation of building outline. Point cloud is generated from the imagery automatically using semi-global image matching technology. Buildings are detected from the differential surface generated from the point cloud. Further classification of building roof is performed in order to generate accurate building outline. Finally classified building roof is converted into vector format. Numerous tests have been done on images in different locations and results are presented in the paper.

  12. Automatic relational database compression scheme design based on swarm evolution

    Institute of Scientific and Technical Information of China (English)

    HU Tian-lei; CHEN Gang; LI Xiao-yan; DONG Jin-xiang

    2006-01-01

    Compression is an intuitive way to boost the performance of a database system. However, compared with other physical database design techniques, compression consumes a large amount of CPU power. There is a trade-off between the reduction of disk access and the overhead of CPU processing. Automatic design and adaptive administration of database systems are widely demanded, and the automatic selection of compression schemas to balance this trade-off is very important. In this paper, we present a model with novel techniques to integrate a rapidly convergent agent-based evolution framework, the SWAF (SWarm Algorithm Framework), into adaptive attribute compression for relational databases. The model evolutionarily consults statistics of CPU load and IO bandwidth to select compression schemas considering both sides of the trade-off. We have implemented a prototype on the Oscar RDBMS, with experiments highlighting the correctness and efficiency of our techniques.

  13. Automatic denoising of single-trial evoked potentials.

    Science.gov (United States)

    Ahmadi, Maryam; Quian Quiroga, Rodrigo

    2013-02-01

    We present an automatic denoising method based on the wavelet transform to obtain single trial evoked potentials. The method is based on the inter- and intra-scale variability of the wavelet coefficients and their deviations from baseline values. The performance of the method is tested with simulated event related potentials (ERPs) and with real visual and auditory ERPs. For the simulated data the presented method gives a significant improvement in the observation of single trial ERPs as well as in the estimation of their amplitudes and latencies, in comparison with a standard denoising technique (Donoho's thresholding) and in comparison with the noisy single trials. For the real data, the proposed method largely filters the spontaneous EEG activity, thus helping the identification of single trial visual and auditory ERPs. The proposed method provides a simple, automatic and fast tool that allows the study of single trial responses and their correlations with behavior.
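
    A hedged sketch of generic single-trial wavelet denoising with soft thresholding, closer to the Donoho-style baseline mentioned above than to the authors' inter- and intra-scale method; the wavelet choice, decomposition level and toy trial are assumptions.

      import numpy as np
      import pywt

      def denoise_trial(x, wavelet="db4", level=5):
          """Soft-threshold detail coefficients with a universal threshold whose
          noise level is estimated from the finest decomposition level."""
          coeffs = pywt.wavedec(x, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # robust noise estimate
          thr = sigma * np.sqrt(2.0 * np.log(len(x)))          # universal threshold
          coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[: len(x)]

      # Toy single trial: an "ERP" bump buried in noise.
      t = np.linspace(0.0, 1.0, 512)
      trial = np.exp(-((t - 0.3) / 0.03) ** 2) + 0.5 * np.random.randn(t.size)
      clean = denoise_trial(trial)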

  14. An Automatic Clustering Technique for Optimal Clusters

    CERN Document Server

    Pavan, K Karteeka; Rao, A V Dattatreya; 10.5121/ijcsea.2011.1412

    2011-01-01

    This paper proposes a simple, automatic and efficient clustering algorithm, namely, Automatic Merging for Optimal Clusters (AMOC) which aims to generate nearly optimal clusters for the given datasets automatically. The AMOC is an extension to standard k-means with a two phase iterative procedure combining certain validation techniques in order to find optimal clusters with automation of merging of clusters. Experiments on both synthetic and real data have proved that the proposed algorithm finds nearly optimal clustering structures in terms of number of clusters, compactness and separation.
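
    A hedged sketch of the two-phase shape of AMOC-like clustering: over-cluster with k-means, then greedily merge the closest pair of clusters while a validity score does not deteriorate. The silhouette score here stands in for the validation techniques actually used by AMOC.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import silhouette_score

      def amoc_like(X, k_start=10, random_state=0):
          """Two-phase sketch: start with too many k-means clusters, then merge the
          two closest centroids as long as the silhouette score does not get worse."""
          labels = KMeans(n_clusters=k_start, n_init=10,
                          random_state=random_state).fit_predict(X)
          best = silhouette_score(X, labels)
          while len(np.unique(labels)) > 2:
              ids = np.unique(labels)
              cents = np.array([X[labels == i].mean(axis=0) for i in ids])
              d = np.linalg.norm(cents[:, None] - cents[None, :], axis=-1)
              np.fill_diagonal(d, np.inf)
              a, b = np.unravel_index(np.argmin(d), d.shape)
              merged = np.where(labels == ids[b], ids[a], labels)   # merge b into a
              score = silhouette_score(X, merged)
              if score < best:
                  break
              labels, best = merged, score
          return labels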

  15. Application of the binary search algorithm in electronic conference system debugging

    Institute of Scientific and Technical Information of China (English)

    李蓓

    2014-01-01

    Locating the delegate unit responsible for a correlated fault is the key step in debugging an electronic conference system. In order to improve locating efficiency, a fast locating algorithm based on binary search is proposed. Through a reliability analysis of the delegate units on the trunk line of the electronic conference system, the fault conditions of the delegate units are transformed into quantitative data that vary dynamically with the number of units, forming an ordered set of reliability data, so that the commissioning problem becomes a data analysis problem. On this basis, the binary search algorithm is introduced, an efficient and accurate fault-location algorithm is designed, and its efficiency is analyzed. The method is simple to operate and can be applied, without any special tools, to on-site commissioning of electronic conference systems in complex environments.
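
    A minimal sketch of the binary-search idea from the record above: delegate units daisy-chained on the trunk line are probed in halves to find the first unit at which the line stops working. The probe function is a placeholder for the reliability measurement described in the paper.

      def locate_faulty_unit(n_units, line_ok):
          """Binary search over daisy-chained delegate units 1..n_units.
          line_ok(k) answers: 'does the trunk line still work with only the first
          k units connected?' The first unit for which it fails is the fault."""
          lo, hi = 1, n_units
          while lo < hi:
              mid = (lo + hi) // 2
              if line_ok(mid):
                  lo = mid + 1          # fault lies after unit mid
              else:
                  hi = mid              # fault at or before unit mid
          return lo

      # Toy check with a hypothetical fault at unit 37 of 120:
      print(locate_faulty_unit(120, lambda k: k < 37))   # -> 37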

  16. Perseveration causes automatization of checking behavior in obsessive-compulsive disorder.

    Science.gov (United States)

    Dek, Eliane C P; van den Hout, Marcel A; Engelhard, Iris M; Giele, Catharina L; Cath, Danielle C

    2015-08-01

    Repeated checking leads to reductions in meta-memory (i.e., memory confidence, vividness and detail), and automatization of checking behavior (Dek, van den Hout, Giele, & Engelhard, 2014, 2015). Dek et al. (2014) suggested that this is caused by increased familiarity with the checked stimuli. They predicted that defamiliarization of checking by modifying the perceptual characteristics of stimuli would cause de-automatization and attenuate the negative meta-memory effects of re-checking. However, their results were inconclusive. The present study investigated whether repeated checking leads to automatization of checking behavior, and if defamiliarization indeed leads to de-automatization and attenuation of meta-memory effects in patients with OCD and healthy controls. Participants performed a checking task, in which they activated, deactivated and checked threat-irrelevant stimuli. During a pre- and post-test checking trial, check duration was recorded and a reaction time task was simultaneously administered as dual-task to assess automatization. After the pre- and post-test checking trial, meta-memory was rated. Results showed that relevant checking led to automatization of checking behavior on the RT measure, and negative meta-memory effects for patients and controls. Defamiliarization led to de-automatization measured with the RT task, but did not attenuate the negative meta-memory effects of repeated checking. Clinical implications are discussed.

  17. Automatic spike sorting using tuning information.

    Science.gov (United States)

    Ventura, Valérie

    2009-09-01

    Current spike sorting methods focus on clustering neurons' characteristic spike waveforms. The resulting spike-sorted data are typically used to estimate how covariates of interest modulate the firing rates of neurons. However, when these covariates do modulate the firing rates, they provide information about spikes' identities, which thus far have been ignored for the purpose of spike sorting. This letter describes a novel approach to spike sorting, which incorporates both waveform information and tuning information obtained from the modulation of firing rates. Because it efficiently uses all the available information, this spike sorter yields lower spike misclassification rates than traditional automatic spike sorters. This theoretical result is verified empirically on several examples. The proposed method does not require additional assumptions; only its implementation is different. It essentially consists of performing spike sorting and tuning estimation simultaneously rather than sequentially, as is currently done. We used an expectation-maximization maximum likelihood algorithm to implement the new spike sorter. We present the general form of this algorithm and provide a detailed implementable version under the assumptions that neurons are independent and spike according to Poisson processes. Finally, we uncover a systematic flaw of spike sorting based on waveform information only.

  18. Automatization for development of HPLC methods.

    Science.gov (United States)

    Pfeffer, M; Windt, H

    2001-01-01

    Within the frame of inprocess analytics of the synthesis of pharmaceutical drugs a lot of HPLC methods are required for checking the quality of intermediates and drug substances. The methods have to be developed in terms of optimal selectivity and low limit of detection, minimum running time and chromatographic robustness. The goal was to shorten the method development process. Therefore, the screening of stationary phases was automated by means of switching modules equipped with 12 HPLC columns. Mobile phase and temperature could be optimized by using Drylab after evaluating chromatograms of gradient elutions performed automatically. The column switching module was applied for more than three dozens of substances, e.g. steroidal intermediates. Resolution (especially of isomeres), peak shape and number of peaks turned out to be the criteria for selection of the appropriate stationary phase. On the basis of the "best" column the composition of the "best" eluent was usually defined rapidly and with less effort. This approach leads to savings in manpower by more than one third. Overnight, impurity profiles of the intermediates were obtained yielding robust HPLC methods with high selectivity and minimized elution time.

  19. Desktop calibration of automatic transmission for passenger vehicle

    Institute of Scientific and Technical Information of China (English)

    FANG Chi; SHI Jian-peng; WANG Jun

    2014-01-01

    Desktop calibration of an automatic transmission (AT) is a method that can effectively reduce cost, enhance efficiency and shorten the development period of a vehicle. We introduce the principle and approach of desktop calibration of an AT based on the coupling characteristics between the engine and the torque converter, and obtain the correct operating points. The results are shown to agree reasonably well with experimental measurements. The method has been used in different AT-based applications abroad and, compared with traditional AT technology, achieves good vehicle performance in terms of drivability, performance and fuel consumption.

  20. Aircraft automatic flight control system with model inversion

    Science.gov (United States)

    Smith, G. A.; Meyer, George

    1990-01-01

    A simulator study was conducted to verify the advantages of a Newton-Raphson model-inversion technique as a design basis for an automatic trajectory control system in an aircraft with highly nonlinear characteristics. The simulation employed a detailed mathematical model of the aerodynamic and propulsion system performance characteristics of a vertical-attitude takeoff and landing tactical aircraft. The results obtained confirm satisfactory control system performance over a large portion of the flight envelope. System response to wind gusts was satisfactory for various plausible combinations of wind magnitude and direction.
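
    A hedged, scalar illustration of the model-inversion idea: Newton-Raphson iteration on a toy nonlinear plant model to find the control input that yields a commanded output. The saturating plant model below is invented and bears no relation to the aircraft model of the study.

      import math

      def invert_model(f, y_cmd, u0=0.0, tol=1e-8, max_iter=50, h=1e-6):
          """Newton-Raphson model inversion: find control u with f(u) ~= y_cmd,
          using a central finite-difference derivative of the plant model f."""
          u = u0
          for _ in range(max_iter):
              err = f(u) - y_cmd
              if abs(err) < tol:
                  break
              dfdu = (f(u + h) - f(u - h)) / (2.0 * h)
              u -= err / dfdu
          return u

      # Toy nonlinear 'plant': a response that saturates with input.
      plant = lambda u: 4.0 * math.tanh(0.5 * u)
      u_star = invert_model(plant, y_cmd=3.0)
      print(u_star, plant(u_star))    # plant(u_star) ~= 3.0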

  1. Towards automatic classification of all WISE sources

    Science.gov (United States)

    Kurcz, A.; Bilicki, M.; Solarz, A.; Krupa, M.; Pollo, A.; Małek, K.

    2016-07-01

    Context. The Wide-field Infrared Survey Explorer (WISE) has detected hundreds of millions of sources over the entire sky. Classifying them reliably is, however, a challenging task owing to degeneracies in WISE multicolour space and low levels of detection in its two longest-wavelength bandpasses. Simple colour cuts are often not sufficient; for satisfactory levels of completeness and purity, more sophisticated classification methods are needed. Aims: Here we aim to obtain comprehensive and reliable star, galaxy, and quasar catalogues based on automatic source classification in full-sky WISE data. This means that the final classification will employ only parameters available from WISE itself, in particular those which are reliably measured for the majority of sources. Methods: For the automatic classification we applied a supervised machine learning algorithm, support vector machines (SVM). It requires a training sample with relevant classes already identified, and we chose to use the SDSS spectroscopic dataset (DR10) for that purpose. We tested the performance of two kernels used by the classifier, and determined the minimum number of sources in the training set required to achieve stable classification, as well as the minimum dimension of the parameter space. We also tested SVM classification accuracy as a function of extinction and apparent magnitude. Thus, the calibrated classifier was finally applied to all-sky WISE data, flux-limited to 16 mag (Vega) in the 3.4 μm channel. Results: By calibrating on the test data drawn from SDSS, we first established that a polynomial kernel is preferred over a radial one for this particular dataset. Next, using three classification parameters (W1 magnitude, W1-W2 colour, and a differential aperture magnitude) we obtained very good classification efficiency in all the tests. At the bright end, the completeness for stars and galaxies reaches ~95%, deteriorating to ~80% at W1 = 16 mag, while for quasars it stays at a level of
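
    A hedged sketch of the classification step described above: a polynomial-kernel SVM trained on three WISE parameters (W1 magnitude, W1-W2 colour, differential aperture magnitude). The training arrays and class labels are placeholders for the SDSS-matched sample; this is not the authors' pipeline.

      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # X columns: W1 magnitude, W1-W2 colour, differential aperture magnitude.
      # y labels (hypothetical coding): 0 = star, 1 = galaxy, 2 = quasar, taken
      # from SDSS spectroscopy for the training objects.
      def train_wise_classifier(X, y):
          clf = make_pipeline(StandardScaler(),
                              SVC(kernel="poly", degree=3, C=1.0, probability=True))
          return clf.fit(X, y)

      # Hypothetical usage on a flux-limited (W1 < 16 mag) catalogue:
      # model = train_wise_classifier(X_train, y_train)
      # proba = model.predict_proba(X_wise_all_sky)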

  2. Automatic tuning of myoelectric prostheses.

    Science.gov (United States)

    Bonivento, C; Davalli, A; Fantuzzi, C; Sacchetti, R; Terenzi, S

    1998-07-01

    This paper is concerned with the development of a software package for the automatic tuning of myoelectric prostheses. The package core consists of Fuzzy Logic Expert Systems (FLES) that embody skilled operator heuristics in the tuning of prosthesis control parameters. The prosthesis system is an artificial arm-hand system developed at the National Institute of Accidents at Work (INAIL) laboratories. The prosthesis is powered by an electric motor that is controlled by a microprocessor using myoelectric signals acquired from skin-surface electrodes placed on a muscle in the residual limb of the subject. The software package, Microprocessor Controlled Arm (MCA) Auto Tuning, is a tool for aiding both INAIL expert operators and unskilled persons in the controller parameter tuning procedure. Prosthesis control parameter setup and subsequent recurrent adjustments are fundamental for the correct working of the prosthesis, especially when we consider that myoelectric parameters may vary greatly with environmental modifications. The parameter adjustment requires the end-user to go to the manufacturer's laboratory for the control parameters setup because, generally, he/she does not have the necessary knowledge and instruments to do this at home. However, this procedure is not very practical and involves a waste of time for the technicians and uneasiness for the clients. The idea behind the MCA Auto Tuning package consists in translating technician expertise into an FLES knowledge database. The software interacts through a user-friendly graphic interface with an unskilled user, who is guided through a step-by-step procedure in the prosthesis parameter tuning that emulates the traditional expert-aided procedure. The adoption of this program on a large scale may yield considerable economic benefits and improve the service quality supplied to the users of prostheses. In fact, the time required to set the prosthesis parameters are remarkably reduced, as is the technician

  3. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, the national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016 – the 12th Portuguese Conference on Automatic Control, held in Guimarães, Portugal, September 14th to 16th – was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC, and promoted by APCA. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  4. Automaticity in social-cognitive processes.

    Science.gov (United States)

    Bargh, John A; Schwader, Kay L; Hailey, Sarah E; Dyer, Rebecca L; Boothby, Erica J

    2012-12-01

    Over the past several years, the concept of automaticity of higher cognitive processes has permeated nearly all domains of psychological research. In this review, we highlight insights arising from studies in decision-making, moral judgments, close relationships, emotional processes, face perception and social judgment, motivation and goal pursuit, conformity and behavioral contagion, embodied cognition, and the emergence of higher-level automatic processes in early childhood. Taken together, recent work in these domains demonstrates that automaticity does not result exclusively from a process of skill acquisition (in which a process always begins as a conscious and deliberate one, becoming capable of automatic operation only with frequent use) - there are evolved substrates and early childhood learning mechanisms involved as well.

  5. Automatic lexical classification: bridging research and practice.

    Science.gov (United States)

    Korhonen, Anna

    2010-08-13

    Natural language processing (NLP)--the automatic analysis, understanding and generation of human language by computers--is vitally dependent on accurate knowledge about words. Because words change their behaviour between text types, domains and sub-languages, a fully accurate static lexical resource (e.g. a dictionary, word classification) is unattainable. Researchers are now developing techniques that could be used to automatically acquire or update lexical resources from textual data. If successful, the automatic approach could considerably enhance the accuracy and portability of language technologies, such as machine translation, text mining and summarization. This paper reviews the recent and on-going research in automatic lexical acquisition. Focusing on lexical classification, it discusses the many challenges that still need to be met before the approach can benefit NLP on a large scale.

  6. Automatic identification for standing tree limb pruning

    Institute of Scientific and Technical Information of China (English)

    Sun Renshan; Li Wenbin; Tian Yongchen; Hua Li

    2006-01-01

    To meet the demand of automatic pruning machines, this paper presents a new method for dynamic automatic identification of standing tree limbs and capture of digital images of Platycladus orientalis. Methods from computer vision, image processing and wavelet analysis were used to compress, filter and segment the pictures, abate noise and capture their outlines. We then present the algorithm for dynamic automatic identification of standing tree limbs, extracting basic growth characteristics of the standing trees such as form, size, degree of bending and relative spatial position. We use pattern recognition technology to confirm the proportional relationship matching the database and thus achieve the goal of dynamic automatic identification of standing tree limbs.

  7. A Demonstration of Automatically Switched Optical Network

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    We build an automatically switched optical network (ASON) testbed with four optical cross-connect nodes. Many fundamental ASON features are demonstrated, implemented by control protocols based on the generalized multi-protocol label switching (GMPLS) framework.

  8. Automatic acquisition of pattern collocations in GO

    Institute of Scientific and Technical Information of China (English)

    LIU Zhi-qing; DOU Qing; LI Wen-hong; LU Ben-jie

    2008-01-01

    The quality, quantity, and consistency of the knowledge used in GO-playing programs often determine their strengths, and automatic acquisition of large amounts of high-quality and consistent GO knowledge is crucial for successful GO playing. In a previous article on this subject, we presented an algorithm for efficient and automatic acquisition of spatial patterns of GO as well as their frequency of occurrence from game records. In this article, we present two algorithms, one for efficient and automatic acquisition of pairs of spatial patterns that appear jointly in a local context, and the other for determining whether the joint pattern appearances are of statistical significance and not just a coincidence. Results of the two algorithms include 1 779 966 pairs of spatial patterns acquired automatically from 16 067 game records of professional GO players, of which about 99.8% are qualified as pattern collocations with a statistical confidence of 99.5% or higher.

  9. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  10. Automatization and familiarity in repeated checking

    NARCIS (Netherlands)

    Dek, Eliane C P; van den Hout, Marcel A.; Giele, Catharina L.; Engelhard, Iris M.

    2014-01-01

    Repeated checking paradoxically increases memory uncertainty. This study investigated the underlying mechanism of this effect. We hypothesized that as a result of repeated checking, familiarity with stimuli increases, and automatization of the checking procedure occurs, which should result in decrea

  11. Automatic Speech Segmentation Based on HMM

    Directory of Open Access Journals (Sweden)

    M. Kroul

    2007-06-01

    Full Text Available This contribution deals with the problem of automatic phoneme segmentation using HMMs. Automatization of the speech segmentation task is important for applications in which a large amount of data needs to be processed, so that manual segmentation is out of the question. In this paper we focus on automatic segmentation of recordings that will be used to create a triphone synthesis unit database. For speech synthesis, speech unit quality is a crucial aspect, so maximal segmentation accuracy is needed here. In this work, different kinds of HMMs with various parameters have been trained and their usefulness for automatic segmentation is discussed. At the end of this work, segmentation accuracy tests of all models are presented.

  12. Automatic coding of online collaboration protocols

    NARCIS (Netherlands)

    Erkens, Gijsbert; Janssen, J.J.H.M.

    2006-01-01

    An automatic coding procedure is described to determine the communicative functions of messages in chat discussions. Five main communicative functions are distinguished: argumentative (indicating a line of argumentation or reasoning), responsive (e.g., confirmations, denials, and answers), informati

  13. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  14. Collapsible truss structure is automatically expandable

    Science.gov (United States)

    1965-01-01

    Coil springs wound with maximum initial tension in a three-truss, closed loop structure form a collapsible truss structure. The truss automatically expands and provides excellent rigidity and close dimensional tolerance when expanded.

  15. Phoneme vs Grapheme Based Automatic Speech Recognition

    OpenAIRE

    Magimai.-Doss, Mathew; Dines, John; Bourlard, Hervé; Hermansky, Hynek

    2004-01-01

    In recent literature, different approaches have been proposed to use graphemes as subword units with implicit source of phoneme information for automatic speech recognition. The major advantage of using graphemes as subword units is that the definition of lexicon is easy. In previous studies, results comparable to phoneme-based automatic speech recognition systems have been reported using context-independent graphemes or context-dependent graphemes with decision trees. In this paper, we study...

  16. Automatic quiz generation for elderly people

    OpenAIRE

    Samuelsen, Jeanette

    2016-01-01

    Studies have indicated that games can be beneficial for the elderly, in areas such as cognitive functioning and well-being. Taking part in social activities, such as playing a game with others, could also be beneficial. One type of game is a computer-based quiz. One can create quiz questions manually; however, this can be time-consuming. Another approach is to generate quiz questions automatically. This project has examined how quizzes for Norwegian elderly can be automatically generated usin...

  17. Automatic Age Estimation System for Face Images

    OpenAIRE

    Chin-Teng Lin; Dong-Lin Li; Jian-Hao Lai; Ming-Feng Han; Jyh-Yeong Chang

    2012-01-01

    Humans are the most important tracking objects in surveillance systems. However, human tracking is not enough to provide the required information for personalized recognition. In this paper, we present a novel and reliable framework for automatic age estimation based on computer vision. It exploits global face features based on the combination of Gabor wavelets and orthogonal locality preserving projections. In addition, the proposed system can extract face aging features automatically in rea...

  18. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik;

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  19. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  20. Automatic Fringe Detection Of Dynamic Moire Patterns

    Science.gov (United States)

    Fang, Jing; Su, Xian-ji; Shi, Hong-ming

    1989-10-01

    Fringe-carrier method is used in automatic fringe-order numbering of dynamic in-plane moire patterns. In experiment both static carrier and dynamic moire patterns are recorded. The image files corresponding to instants are set up to assign fringe orders automatically. Subtracting the carrier image from the modulated ones, the moire patterns due to the dynamic deformations are restored with fringe-order variation displayed by different grey levels.

  1. Automatic safety rod for reactors. [LMFBR

    Science.gov (United States)

    Germer, J.H.

    1982-03-23

    An automatic safety rod for a nuclear reactor containing neutron absorbing material and designed to be inserted into a reactor core after a loss of flow. Actuation occurs either upon a sudden decrease in core pressure drop or when the pressure drop falls below a predetermined minimum value. The automatic control rod includes a pressure regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  2. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing...... alternative to continuation methods. Automatic continuation also generally obtains better designs than the classical formulation using a reduced number of iterations....

  3. UMLS-based automatic image indexing.

    Science.gov (United States)

    Sneiderman, Charles Alan; Demner-Fushman, Dina; Fung, Kin Wah; Bray, Bruce

    2008-01-01

    To date, most accurate image retrieval techniques rely on textual descriptions of images. Our goal is to automatically generate indexing terms for an image extracted from a biomedical article by identifying Unified Medical Language System (UMLS) concepts in image caption and its discussion in the text. In a pilot evaluation of the suggested image indexing method by five physicians, a third of the automatically identified index terms were found suitable for indexing.

  4. Semi-automatic removal of foreground stars from images of galaxies

    CERN Document Server

    Frei, Z

    1996-01-01

    A new procedure, designed to remove foreground stars from galaxy profiles is presented. Although several programs exist for stellar and faint object photometry, none of them treat star removal from the images very carefully. I present my attempt to develop such a system, and briefly compare the performance of my software to one of the well known stellar photometry packages, DAOPhot. Major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function from well separated stars that are situated off the galaxy; (2) automatic identification of those peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fix of remaining degradations in the image. The algorithm and software presented here is significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since...
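
    A hedged, numpy-only sketch of the core subtraction step: scale an empirical, centred PSF to a detected stellar peak by least squares and subtract it. Building the PSF from isolated off-galaxy stars and patching residuals are only summarized in comments; the toy image and star position are invented.

      import numpy as np

      def subtract_star(image, psf, x, y):
          """Scale an empirical, centred PSF to the peak at (x, y) by least squares
          and subtract it in place. Constructing the PSF (by stacking isolated,
          off-galaxy stars) and patching residuals are not shown here."""
          ph, pw = psf.shape
          y0, x0 = y - ph // 2, x - pw // 2
          patch = image[y0:y0 + ph, x0:x0 + pw]
          scale = float(np.sum(patch * psf) / np.sum(psf * psf))   # LS amplitude
          image[y0:y0 + ph, x0:x0 + pw] = patch - scale * psf
          return image

      # Toy example: a Gaussian "star" sitting on a smooth "galaxy" gradient.
      yy, xx = np.mgrid[-10:11, -10:11]
      psf = np.exp(-(xx**2 + yy**2) / (2.0 * 2.0**2))
      galaxy = np.fromfunction(lambda i, j: 0.01 * i + 0.02 * j, (101, 101))
      galaxy[40:61, 50:71] += 250.0 * psf          # inject a foreground star at (x=60, y=50)
      cleaned = subtract_star(galaxy.copy(), psf, x=60, y=50)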

  5. Analysis and countermeasures of problems occurring in reverse osmosis during debugging and operation

    Institute of Scientific and Technical Information of China (English)

    秦昊

    2012-01-01

    The technological process of the groundwater pre-desalination system of Shaanxi Weihe Coal Chemical Engineering Group is introduced, along with the technical and equipment problems that occurred during debugging and operation. The effects of influent electrical conductivity, influent SDI, influent residual chlorine and ORP, the security filter, and product-water backpressure on the reverse osmosis system are analyzed and discussed, and corresponding countermeasures are put forward.

  6. Discussion of operation debugging and communication technology for low-voltage power line carrier meter reading

    Institute of Scientific and Technical Information of China (English)

    刘桐然; 常晓华; 冯婧

    2011-01-01

    The characteristics and development trends of power line carrier communication technology are introduced. Following the three service layers of the carrier system, the common problems that affect the meter-reading success rate and the corresponding debugging methods are analyzed, and concrete solutions are proposed. In addition, based on the system composition and the characteristics of the carrier channel, the impedance characteristics, attenuation characteristics and noise interference of the power network that affect the quality of power line carrier transmission are analyzed, and two new technologies in the development of power line carrier channels are introduced in detail.

  7. Design and Debugging of a Financial Payment PCD for the EMV Level 1 Analog Test

    Institute of Scientific and Technical Information of China (English)

    刘丽丽; 王丰; 余秋芳; 王西国

    2015-01-01

    Based on the latest EMV Level 1 specification, the test environment and test cases of the EMV Level 1 Analog test for PCDs are summarized, the problems that may be encountered in actual testing are analyzed, and the technical difficulties are identified. For these difficulties, the aspects of PCD design that require particular attention and the corresponding design techniques are proposed, and the debugging methods used to reach an optimal system configuration and successfully pass the EMV Level 1 Analog test are given.

  8. Multi-channel Virtual Instrument Oscilloscope for a Servo Motor Debugging System

    Institute of Scientific and Technical Information of China (English)

    张超; 游林儒

    2012-01-01

    The application of virtual instrument technology in a servo motor debugging system is introduced. To meet the system's real-time requirements, an RS-485 bus is used as the data transmission channel, sending the voltage, current, speed and related servo-control parameters collected by the microcontroller to the PC through a serial interface. The PC software, programmed in LabVIEW, performs data acquisition, processing, display and storage, providing on-line monitoring of the servo motor control system. Experimental results show that the system meets the expected technical requirements.

  9. Installation and Debugging of IPQAM Interactive Signal Insertion Using a Directly Modulated Optical Transmitter

    Institute of Scientific and Technical Information of China (English)

    刘圣奇

    2016-01-01

    With the rapid growth of interactive TV users, and in order to improve service quality, a 1 550 nm directly modulated optical transmitter is used to implement the insertion of IPQAM interactive signals. This allows a smooth upgrade while saving investment costs, facilitates the roll-out of two-way services, and allows the system to be upgraded without changing the network structure, laying a foundation for carrying out FTTH services over GPON in the future. The approach has practical value and is offered for reference.

  10. PDBalert: automatic, recurrent remote homology tracking and protein structure prediction

    Directory of Open Access Journals (Sweden)

    Söding Johannes

    2008-11-01

    Full Text Available Abstract Background During the last years, methods for remote homology detection have grown more and more sensitive and reliable. Automatic structure prediction servers relying on these methods can generate useful 3D models even below 20% sequence identity between the protein of interest and the known structure (template). When no homologs can be found in the protein structure database (PDB), the user would need to rerun the same search at regular intervals in order to make timely use of a template once it becomes available. Results PDBalert is a web-based automatic system that sends an email alert as soon as a structure with homology to a protein in the user's watch list is released to the PDB database or appears among the sequences on hold. The mail contains links to the search results and to an automatically generated 3D homology model. The sequence search is performed with the same software as used by the very sensitive and reliable remote homology detection server HHpred, which is based on pairwise comparison of Hidden Markov models. Conclusion PDBalert will accelerate the information flow from the PDB database to all those who can profit from the newly released protein structures for predicting the 3D structure or function of their proteins of interest.

  11. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application that guides users in preparing dishes relevant to their diet profiles and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, changes in food appearance, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. Annotation accuracy is increased with respect to completely automatic tools, and human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  12. Automatic imitation in a rich social context with virtual characters

    Directory of Open Access Journals (Sweden)

    Xueni ePan

    2015-06-01

    Full Text Available It has been well established that people respond faster when they perform an action that is congruent with an observed action than when they respond with an incongruent action. Here we propose a new method of using interactive Virtual Characters (VCs) to test whether social congruency effects can be obtained in a richer social context with sequential hand-arm actions. Two separate experiments were conducted, exploring whether it is feasible to measure spatial congruency (experiment 1) and anatomical congruency (experiment 2) in response to a virtual character, compared to the same action sequence indicated by three virtual balls. In experiment 1, we found a robust spatial congruency effect for both the VC and the virtual balls, modulated by a social facilitation effect for participants who felt the VC was human. In experiment 2, which allowed for anatomical congruency, a form by congruency interaction provided evidence that participants automatically imitate the actions of the VC but do not imitate the balls. Our method and results build a bridge between studies using minimal stimuli in automatic imitation and studies of mimicry in rich social interaction, and open a new avenue for future research in the area of automatic imitation with a more ecologically valid social interaction.

  13. Automatic and controlled processing in the corticocerebellar system.

    Science.gov (United States)

    Ramnani, Narender

    2014-01-01

    During learning, performance changes often involve a transition from controlled processing in which performance is flexible and responsive to ongoing error feedback, but effortful and slow, to a state in which processing becomes swift and automatic. In this state, performance is unencumbered by the requirement to process feedback, but its insensitivity to feedback reduces its flexibility. Many properties of automatic processing are similar to those that one would expect of forward models, and many have suggested that these may be instantiated in cerebellar circuitry. Since hierarchically organized frontal lobe areas can both send and receive commands, I discuss the possibility that they can act both as controllers and controlled objects and that their behaviors can be independently modeled by forward models in cerebellar circuits. Since areas of the prefrontal cortex contribute to this hierarchically organized system and send outputs to the cerebellar cortex, I suggest that the cerebellum is likely to contribute to the automation of cognitive skills, and to the formation of habitual behavior which is resistant to error feedback. An important prerequisite to these ideas is that cerebellar circuitry should have access to higher order error feedback that signals the success or failure of cognitive processing. I have discussed the pathways through which such feedback could arrive via the inferior olive and the dopamine system. Cerebellar outputs inhibit both the inferior olive and the dopamine system. It is possible that learned representations in the cerebellum use this as a mechanism to suppress the processing of feedback in other parts of the nervous system. Thus, cerebellar processes that control automatic performance may be completed without triggering the engagement of controlled processes by prefrontal mechanisms.

  14. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and...

  15. QoS-Aware Automatic Service Composition: A Graph View

    Institute of Scientific and Technical Information of China (English)

    Wei Jiang; Tian Wu; Song-Lin Hu; Zhi-Yong Liu

    2011-01-01

    Research on service composition demands efficient algorithms that not only retrieve correct service compositions automatically from thousands of services, but also satisfy the quality requirements of different service users. However, most approaches treat these two aspects as two separate problems: automatic service composition and service selection. Although recent work recognizes the restriction of this separate view and some specific methods have been proposed, they still suffer from serious limitations in scalability and accuracy when addressing both requirements simultaneously. In order to cope with these limitations and efficiently solve the combined problem, known as the QoS-aware or QoS-driven automatic service composition problem, we propose a new graph search problem, single-source optimal directed acyclic graphs (DAGs), for the first time. This novel single-source optimal DAGs (SSOD) problem is similar to, but more general than, the classical single-source shortest paths (SSSP) problem. In this paper, a new graph model of the SSOD problem is proposed and a Sim-Dijkstra algorithm is presented to address it with a time complexity of O(n log n + m) (n and m are the numbers of nodes and edges in the graph, respectively), together with proofs of its soundness. The algorithm is also applied directly to the QoS-aware automatic service composition problem, and a service composition tool named QSynth is implemented. Evaluations show that the Sim-Dijkstra algorithm achieves superior scalability and efficiency across a large variety of composition scenarios, and is even more efficient than our worklist algorithm that won the performance championship of the Web Services Challenge 2009.
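
    The Sim-Dijkstra algorithm itself is not reproduced in this record. As background, a minimal sketch of the classical single-source shortest paths baseline that the SSOD formulation generalizes is shown below, using Dijkstra's algorithm with a binary heap; the service graph, node names and edge weights are hypothetical illustrations, not data from the paper.

```python
# Minimal sketch: classical SSSP with a heap-based Dijkstra, the baseline
# problem that the SSOD formulation generalizes. The graph below is a
# hypothetical service-composition example, not data from the paper.
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical QoS graph: edge weights could encode, e.g., response time.
services = {
    "in": [("svcA", 1.2), ("svcB", 0.7)],
    "svcA": [("out", 0.5)],
    "svcB": [("svcA", 0.3), ("out", 1.9)],
}
print(dijkstra(services, "in"))  # {'in': 0.0, 'svcB': 0.7, 'svcA': 1.0, 'out': 1.5}
```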

  16. 基于EM78P510NK的全自动咖啡机控制系统的设计%Design of Automatic Coffee Machine Control System Based on the EM78P510NK

    Institute of Scientific and Technical Information of China (English)

    邵雯

    2013-01-01

    This article describes the design of an automatic coffee machine control system. Its core is an MCU based on the EM78P510NK, which reliably controls the piston position, seal gland position, temperature, water flow and coffee grind coarseness, and raises an alarm when water or coffee beans run short. The control system operates reliably, has a simple structure, and is easy to debug and extend, making it suitable for commercial, catering, office and home use.

  17. Region descriptors for automatic classification of small sea targets in infrared video

    NARCIS (Netherlands)

    Mouthaan, M.M.; Broek, S.P. van den; Hendriks, E.A.; Schwering, P.B.W.

    2011-01-01

    We evaluate the performance of different key-point detectors and region descriptors when used for automatic classification of small sea targets in infrared video. In our earlier research performed on this subject as well as in other literature, many different region descriptors have been proposed. H

  18. Functionality and Performance Visualization of the Distributed High Quality Volume Renderer (HVR)

    KAUST Repository

    Shaheen, Sara

    2012-07-01

    Volume rendering systems are designed to enable scientists and a variety of experts to interactively explore volume data through 3D views of the volume. However, volume rendering techniques are computationally intensive tasks. Parallel distributed volume rendering systems and multi-threading architectures have been suggested as natural solutions to provide acceptable volume rendering performance for very large volume data sizes, such as Electron Microscopy (EM) data. This in turn adds another level of complexity when developing and manipulating volume rendering systems. Given that distributed parallel volume rendering systems are among the most complex systems to develop, trace and debug, it is obvious that traditional debugging tools do not provide enough support. As a consequence, there is a great demand for tools that facilitate the manipulation of such systems. This can be achieved by utilizing the power of computer graphics to design visual representations that reflect how the system works and that visualize its current performance state. The work presented falls within the field of software visualization, where visualization is used to support the understanding of various software systems. This thesis presents a number of visual representations that reflect functionality and performance aspects of the distributed HVR, a high-quality volume renderer that uses various techniques to visualize large volume sizes interactively. The representations cover different stages of the parallel volume rendering pipeline of HVR, along with means of performance analysis through a number of flexible and dynamic visualizations that reflect the current state of the system and can be manipulated at runtime. These visualizations are aimed at facilitating the debugging, understanding and analysis of the distributed HVR.

  19. Automatic detection of aircraft emergency landing sites

    Science.gov (United States)

    Shen, Yu-Fei; Rahman, Zia-ur; Krusienski, Dean; Li, Jiang

    2011-06-01

    An automatic landing site detection algorithm is proposed for aircraft emergency landing. Emergency landing is an unplanned event in response to emergency situations. If, as is unfortunately usually the case, there is no airstrip or airfield that can be reached by the un-powered aircraft, a crash landing or ditching has to be carried out. Identifying a safe landing site is critical to the survival of passengers and crew. Conventionally, the pilot chooses the landing site visually by looking at the terrain through the cockpit. The success of this vital decision greatly depends on the external environmental factors that can impair human vision, and on the pilot's flight experience that can vary significantly among pilots. Therefore, we propose a robust, reliable and efficient algorithm that is expected to alleviate the negative impact of these factors. We present only the detection mechanism of the proposed algorithm and assume that the image enhancement for increased visibility, and image stitching for a larger field-of-view, have already been performed on the images acquired by aircraft-mounted cameras. Specifically, we describe an elastic bound detection method which is designed to position the horizon. The terrain image is divided into non-overlapping blocks which are then clustered according to a "roughness" measure. Adjacent smooth blocks are merged to form potential landing sites whose dimensions are measured with principal component analysis and geometric transformations. If the dimensions of the candidate region exceed the minimum requirement for safe landing, the potential landing site is considered a safe candidate and highlighted on the human machine interface. At the end, the pilot makes the final decision by confirming one of the candidates, also considering other factors such as wind speed and wind direction, etc. Preliminary results show the feasibility of the proposed algorithm.
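
    As a rough illustration of the block-based "roughness" idea described above, the sketch below tiles a grayscale terrain image into blocks, scores each block by its intensity standard deviation, and groups adjacent smooth blocks into candidate regions. The block size, threshold, minimum region size and the use of SciPy connected-component labelling are assumptions for illustration, not the paper's parameters.

```python
import numpy as np
from scipy import ndimage

def candidate_landing_blocks(gray, block=32, rough_thresh=8.0):
    """Tile the image into blocks, mark blocks whose intensity std is low
    ('smooth'), and group adjacent smooth blocks into candidate regions."""
    h, w = gray.shape
    bh, bw = h // block, w // block
    smooth = np.zeros((bh, bw), dtype=bool)
    for i in range(bh):
        for j in range(bw):
            tile = gray[i*block:(i+1)*block, j*block:(j+1)*block]
            smooth[i, j] = tile.std() < rough_thresh  # low std ~ low roughness
    labels, n = ndimage.label(smooth)  # merge adjacent smooth blocks
    # Keep only regions large enough (in blocks) to be worth measuring further.
    sizes = ndimage.sum(smooth, labels, index=range(1, n + 1))
    return labels, [k + 1 for k, s in enumerate(sizes) if s >= 4]

# Synthetic terrain: noisy background with one flat (smooth) rectangle.
rng = np.random.default_rng(0)
terrain = rng.normal(128, 20, size=(256, 256))
terrain[64:192, 96:224] = 128 + rng.normal(0, 2, size=(128, 128))
labels, candidates = candidate_landing_blocks(terrain)
print("candidate regions:", candidates)
```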

  20. Automatic Fastening Large Structures: a New Approach

    Science.gov (United States)

    Lumley, D. F.

    1985-01-01

    The external tank (ET) intertank structure for the space shuttle, a 27.5 ft diameter, 22.5 ft long, externally stiffened, mechanically fastened skin-stringer-frame structure, was a labor-intensive manual structure built on a modified Saturn tooling position. A new approach was developed based on half-section subassemblies. The heart of this manufacturing approach will be a 33 ft high vertical automatic riveting system with a 28 ft rotary positioner coming on-line in mid 1985. The automatic riveting system incorporates many of the latest automatic riveting technologies. Key features include: vertical columns with two sets of independently operating CNC drill-riveting heads; the capability to drill, insert and upset any one-piece fastener up to 3/8 inch diameter, including slugs, without displacing the workpiece; an offset bucking ram with programmable rotation and deep retraction; a vision system for automatic parts-program re-synchronization and part edge-margin control; and an automatic rivet selection/handling system.

  1. Automaticity: Componential, Causal, and Mechanistic Explanations.

    Science.gov (United States)

    Moors, Agnes

    2016-01-01

    The review first discusses componential explanations of automaticity, which specify non/automaticity features (e.g., un/controlled, un/conscious, non/efficient, fast/slow) and their interrelations. Reframing these features as factors that influence processes (e.g., goals, attention, and time) broadens the range of factors that can be considered (e.g., adding stimulus intensity and representational quality). The evidence reviewed challenges the view of a perfect coherence among goals, attention, and consciousness, and supports the alternative view that (a) these and other factors influence the quality of representations in an additive way (e.g., little time can be compensated by extra attention or extra stimulus intensity) and that (b) a first threshold of this quality is required for unconscious processing and a second threshold for conscious processing. The review closes with a discussion of causal explanations of automaticity, which specify factors involved in automatization such as repetition and complexity, and a discussion of mechanistic explanations, which specify the low-level processes underlying automatization.

  2. Automatic Parallelization of Scientific Application

    DEFF Research Database (Denmark)

    Blum, Troels

    In my PhD work I show that it is possible to run unmodified Python/NumPy code on modern GPUs. This is done by using the Bohrium runtime system to translate the NumPy array operations into an array-based bytecode sequence. Executing these bytecodes on two GPUs from different vendors shows great...... performance gains. Scientists working with computer simulations should be allowed to focus on their field of research and not spend excessive amounts of time learning exotic programming models and languages. We have with Bohrium achieved very promising results by starting out with a relatively simple approach....... This has led to more specialized methods, as I have shown with the work done with both specialized and parametrized kernels. Both have their benefits and recognizable use cases. We achieved clear performance benefits without any significant negative impact on overall application performance. Even

  3. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high-end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on the BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...

  4. Performance of a streaming mesh refinement algorithm.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, David C.; Pebay, Philippe Pierre

    2004-08-01

    In SAND report 2004-1617, we outline a method for edge-based tetrahedral subdivision that does not rely on saving state or communication to produce compatible tetrahedralizations. This report analyzes the performance of the technique by characterizing (a) mesh quality, (b) execution time, and (c) traits of the algorithm that could affect quality or execution time differently for different meshes. It also details the method used to debug the several hundred subdivision templates that the algorithm relies upon. Mesh quality is on par with other similar refinement schemes and throughput on modern hardware can exceed 600,000 output tetrahedra per second. But if you want to understand the traits of the algorithm, you have to read the report!

  5. Analytical Performance of Olympus AU2700 Automatic Biochemical Analyzer in Detecting Electrolyte Proj ect%奥林巴斯AU2700全自动生化分析仪测定电解质项目的性能验证研究

    Institute of Scientific and Technical Information of China (English)

    韩学波; 唐秀英; 李莉; 于欣

    2014-01-01

    Objective: To evaluate the analytical performance of the Olympus AU2700 automatic biochemical analyzer in determining electrolyte parameters. Methods: In accordance with the Clinical and Laboratory Standards Institute (CLSI) guidance documents and other related literature, the precision, accuracy and linear range of the Olympus AU2700 automatic biochemical analyzer for electrolyte determination were analyzed. Results: For K+, the within-run coefficients of variation were 1.84% and 0.89%, and the between-run coefficients of variation were 2.08% and 4.20%. For Na+, the within-run coefficients of variation were 0.05% and 0.94%, and the between-run coefficients of variation were 0.64% and 0.81%. For Cl-, the within-run coefficients of variation were 0.88% and 0.75%, and the between-run coefficients of variation were 1.12% and 1.21%. These results satisfy the CLIA'88 standard. Accuracy: the detection results in the EQA scheme were within the permissible range. Linearity: the linear regression equation for potassium was Y=1.0085X-0.0312, with a slope of 1.0085 and a correlation coefficient of 0.99, giving a reportable range of 0.68-11.8 mmol/L; for sodium, Y=0.9998X-0.0055, with a slope of 0.9998 and a correlation coefficient of 1.00, giving a reportable range of 37-288.6 mmol/L; and for chloride, Y=0.9895X+1.8413, with a slope of 0.9895 and a correlation coefficient of 0.99, giving a reportable range of 37.1-246.6 mmol/L. Conclusion: The Olympus AU2700 automatic biochemical analyzer meets the requirements of the biochemical laboratory for electrolyte determination in terms of precision, accuracy, linear range and other aspects, and may be used for clinical specimen examination.

  6. Implementation of a microcontroller-based semi-automatic coagulator.

    Science.gov (United States)

    Chan, K; Kirumira, A; Elkateeb, A

    2001-01-01

    The coagulator is an instrument used in hospitals to detect clot formation as a function of time. Generally, these coagulators are very expensive and therefore not affordable by a doctors' office and small clinics. The objective of this project is to design and implement a low cost semi-automatic coagulator (SAC) prototype. The SAC is capable of assaying up to 12 samples and can perform the following tests: prothrombin time (PT), activated partial thromboplastin time (APTT), and PT/APTT combination. The prototype has been tested successfully.

  7. Automatic cone photoreceptor segmentation using graph theory and dynamic programming.

    Science.gov (United States)

    Chiu, Stephanie J; Lokhnygina, Yuliya; Dubis, Adam M; Dubra, Alfredo; Carroll, Joseph; Izatt, Joseph A; Farsiu, Sina

    2013-06-01

    Geometrical analysis of the photoreceptor mosaic can reveal subclinical ocular pathologies. In this paper, we describe a fully automatic algorithm to identify and segment photoreceptors in adaptive optics ophthalmoscope images of the photoreceptor mosaic. This method is an extension of our previously described closed contour segmentation framework based on graph theory and dynamic programming (GTDP). We validated the performance of the proposed algorithm by comparing it to the state-of-the-art technique on a large data set consisting of over 200,000 cones and posted the results online. We found that the GTDP method achieved a higher detection rate, decreasing the cone miss rate by over a factor of five.
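
    The full GTDP framework (closed-contour segmentation built on graph theory) is considerably more involved, but its dynamic-programming core — finding a minimum-cost path across a cost image column by column — can be sketched as follows; the cost definition and the simple three-neighbour connectivity are simplifying assumptions, not the published algorithm.

```python
import numpy as np

def min_cost_path(cost):
    """Dynamic programming: cheapest left-to-right path through a cost image,
    moving one column at a time to the same, upper, or lower row."""
    h, w = cost.shape
    acc = cost.astype(float).copy()
    back = np.zeros((h, w), dtype=int)
    for j in range(1, w):
        for i in range(h):
            lo, hi = max(0, i - 1), min(h, i + 2)
            k = lo + np.argmin(acc[lo:hi, j - 1])
            acc[i, j] = cost[i, j] + acc[k, j - 1]
            back[i, j] = k
    # Trace back from the cheapest end point in the last column.
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(w - 1, 0, -1):
        path.append(back[path[-1], j])
    return path[::-1]  # row index of the path in each column

# Toy cost image: a low-cost band around row 5 that the path should follow.
cost = np.ones((10, 20))
cost[5, :] = 0.1
print(min_cost_path(cost))
```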

  8. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, the uniform corrosion is characterized by texture attributes extracted from co-occurrence matrix and the Self Organizing Mapping (SOM) clustering algorithm. We present a technique for automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustful and robust early failure detection systems. (author)
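
    To illustrate the kind of texture description used in this approach, the sketch below computes gray-level co-occurrence matrix (GLCM) attributes with scikit-image and clusters the resulting feature vectors. K-means is used here as a simple stand-in for the Self-Organizing Map mentioned in the abstract, and the patch size and GLCM parameters are illustrative assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # skimage >= 0.19 spelling
from sklearn.cluster import KMeans

def glcm_features(patch, levels=32):
    """Contrast/homogeneity/energy/correlation from a co-occurrence matrix."""
    q = (patch / (256 // levels)).astype(np.uint8)  # quantize to fewer gray levels
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy", "correlation")]

def cluster_patches(image, patch=32, n_clusters=2):
    """Split the image into patches and cluster their GLCM feature vectors."""
    feats, coords = [], []
    for i in range(0, image.shape[0] - patch + 1, patch):
        for j in range(0, image.shape[1] - patch + 1, patch):
            feats.append(glcm_features(image[i:i + patch, j:j + patch]))
            coords.append((i, j))
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(feats)
    return dict(zip(coords, labels))

# Synthetic example: smooth metal surface vs. a noisier 'corroded' corner.
rng = np.random.default_rng(1)
img = np.full((128, 128), 120.0) + rng.normal(0, 3, (128, 128))
img[64:, 64:] += rng.normal(0, 40, (64, 64))  # rough, high-contrast area
print(cluster_patches(np.clip(img, 0, 255)))
```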

  9. Automatic SIMD vectorization of SSA-based control flow graphs

    CERN Document Server

    Karrenberg, Ralf

    2015-01-01

    Ralf Karrenberg presents Whole-Function Vectorization (WFV), an approach that allows a compiler to automatically create code that exploits data-parallelism using SIMD instructions. Data-parallel applications such as particle simulations, stock option price estimation or video decoding require the same computations to be performed on huge amounts of data. Without WFV, one processor core executes a single instance of a data-parallel function. WFV transforms the function to execute multiple instances at once using SIMD instructions. The author describes an advanced WFV algorithm that includes a v

  10. Human and automatic speaker recognition over telecommunication channels

    CERN Document Server

    Fernández Gallardo, Laura

    2016-01-01

    This work addresses the evaluation of the human and the automatic speaker recognition performances under different channel distortions caused by bandwidth limitation, codecs, and electro-acoustic user interfaces, among other impairments. Its main contribution is the demonstration of the benefits of communication channels of extended bandwidth, together with an insight into how speaker-specific characteristics of speech are preserved through different transmissions. It provides sufficient motivation for considering speaker recognition as a criterion for the migration from narrowband to enhanced bandwidths, such as wideband and super-wideband.

  11. Machine Learning Algorithms for Automatic Classification of Marmoset Vocalizations

    Science.gov (United States)

    Ribeiro, Sidarta; Pereira, Danillo R.; Papa, João P.; de Albuquerque, Victor Hugo C.

    2016-01-01

    Automatic classification of vocalization type could potentially become a useful tool for the acoustic monitoring of captive colonies of highly vocal primates. However, for classification to be useful in practice, a reliable algorithm that can be successfully trained on small datasets is necessary. In this work, we consider seven different classification algorithms with the goal of finding a robust classifier that can be successfully trained on small datasets. We found good classification performance (accuracy > 0.83 and F1-score > 0.84) using the Optimum Path Forest classifier. Dataset and algorithms are made publicly available. PMID:27654941
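
    The abstract reports accuracy and F1 scores obtained on small training sets. A minimal sketch of that evaluation protocol is shown below, using synthetic call features and a random-forest classifier as a stand-in for the Optimum Path Forest classifier named in the record; the feature dimensions and class structure are made up for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import accuracy_score, f1_score

# Synthetic stand-in for acoustic features (e.g. MFCC statistics) of calls
# from four hypothetical vocalization types; real work would extract these
# from audio recordings.
rng = np.random.default_rng(42)
n_per_class, n_features, n_classes = 60, 20, 4
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
pred = cross_val_predict(clf, X, y, cv=5)  # cross-validation suits small datasets
print("accuracy:", round(accuracy_score(y, pred), 3))
print("macro F1:", round(f1_score(y, pred, average="macro"), 3))
```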

  12. Automatically Restructuring Practice Guidelines using the GEM DTD

    CERN Document Server

    Bouffier, Amanda

    2007-01-01

    This paper describes a system capable of semi-automatically filling an XML template from free texts in the clinical domain (practice guidelines). The XML template includes semantic information not explicitly encoded in the text (pairs of conditions and actions/recommendations). Therefore, there is a need to compute the exact scope of conditions over text sequences expressing the required actions. We present a system developed for this task. We show that it yields good performance when applied to the analysis of French practice guidelines.

  13. Effects of bandwidth feedback on the automatization of an arm movement sequence.

    Science.gov (United States)

    Agethen, Manfred; Krause, Daniel

    2016-02-01

    We examined the effects of a bandwidth feedback manipulation on motor learning. Effects on movement accuracy, as well as on movement consistency, have been addressed in earlier studies. We have additionally investigated the effects on motor automatization. Because providing error feedback is believed to induce attentional control processes, we suppose that a bandwidth method should facilitate motor automatization. Participants (N=48) were assigned to four groups: one control group and three intervention groups. Participants of the intervention groups practiced an arm movement sequence with 760 trials. The BW0-Group practiced with 100% frequency of feedback. For the BW10-Group, feedback was provided when the errors were larger than 10°. The YokedBW10-Group participants were matched to the feedback schedule of research twins from the BW10-Group. All groups performed pre-tests and retention tests with a secondary task paradigm to test for automaticity. The BW10-Group indicated a higher degree of automatization compared with the BW0-Group, which did not exhibit a change in automaticity. The comparison of the YokedBW10-Group, which also exhibited automatization, and the BW10-Group leads to the proposal that reduction of quantitative feedback frequency and additional positive feedback are responsible for the bandwidth effect. Differences in movement accuracy and consistency were not evident.

  14. Determination of Artificial Sweetener 4-Ethoxyphenylurea in Succade by Automatic Solid-phase Extraction and High Performance Chromatography with Fluorescence Method%全自动固相萃取-高效液相色谱荧光法测定蜜饯中人工合成甜味剂对乙氧基苯脲含量

    Institute of Scientific and Technical Information of China (English)

    陈章捷; 陈金凤; 张艳燕; 钟坚海; 魏晶晶

    2014-01-01

    High performance liquid chromatography is applied for the determination of the artificial sweetener 4-ethoxyphenylurea in succade. The sample is extracted ultrasonically with an acetic acid/ammonium acetate buffer solution and purified by automatic solid-phase extraction. The extract is separated on an SB-C18 reversed-phase column and detected with a fluorescence detector. The correlation coefficient of the calibration in the range of 0 to 10 mg/L is 0.9987, and the limit of quantification (S/N=10) is less than 0.1 mg/kg. Using three blank succade samples as matrices, spike-recovery tests were performed at three concentration levels; the average recoveries ranged from 81.7% to 92.4%, with relative standard deviations (n=6) of 2.4% to 6.8%.
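
    The figures of merit quoted above (calibration slope, correlation coefficient, spike recovery and RSD) are routine to compute. A small sketch with hypothetical calibration and spike data — not the paper's measurements — is shown below.

```python
import numpy as np

# Hypothetical calibration standards (mg/L) and detector responses.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
resp = np.array([0.02, 1.01, 2.03, 5.05, 10.06])

slope, intercept = np.polyfit(conc, resp, 1)  # linear calibration fit
r = np.corrcoef(conc, resp)[0, 1]             # correlation coefficient
print(f"y = {slope:.4f}x + {intercept:.4f}, r = {r:.4f}")

# Spike-recovery check: hypothetical responses of a blank spiked at 2 mg/kg.
spiked = np.array([1.72, 1.78, 1.81, 1.69, 1.75, 1.80])
found = (spiked - intercept) / slope          # back-calculated amounts
recovery = 100.0 * found / 2.0
print(f"mean recovery = {recovery.mean():.1f}%  "
      f"RSD = {100.0 * found.std(ddof=1) / found.mean():.1f}% (n={len(found)})")
```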

  15. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, the polar body of the oocyte must be detected automatically. Conventional polar body detection approaches have a low success rate or low efficiency. We propose a polar body detection method based on machine learning in this paper. On one hand, an improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features of polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range for the polar body, which improves efficiency. Experimental results show that the success rate is 96% for various types of polar bodies. Furthermore, the method is applied to an enucleation experiment and improves the degree of automation of enucleation.
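
    A minimal sketch of the HOG-plus-classifier part of such a pipeline is shown below, using scikit-image's standard HOG descriptor and a linear SVM on synthetic patches; the patch size, HOG parameters and training data are illustrative assumptions, and the paper's improved HOG variant and position-prediction step are not reproduced.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def hog_descriptor(patch):
    """Standard HOG features for a small grayscale patch."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

# Synthetic training patches: 'polar body' = bright blob on dark background,
# 'background' = plain noise. Real training data would be cropped from
# oocyte micrographs.
rng = np.random.default_rng(0)
def make_patch(with_blob):
    p = rng.normal(0.2, 0.05, (32, 32))
    if with_blob:
        yy, xx = np.mgrid[:32, :32]
        p += 0.8 * np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / 30.0)
    return np.clip(p, 0, 1)

X = [hog_descriptor(make_patch(i % 2 == 0)) for i in range(200)]
y = [i % 2 == 0 for i in range(200)]
clf = LinearSVC(C=1.0).fit(X, y)

test = make_patch(with_blob=True)
print("polar body detected:", bool(clf.predict([hog_descriptor(test)])[0]))
```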

  16. Automatic Image-Based Pencil Sketch Rendering

    Institute of Scientific and Technical Information of China (English)

    王进; 鲍虎军; 周伟华; 彭群生; 徐迎庆

    2002-01-01

    This paper presents an automatic image-based approach for converting greyscale images to pencil sketches, in which strokes follow the image features. The algorithm first extracts a dense direction field automatically using Logical/Linear operators which embody the drawing mechanism. Next, a reconstruction approach based on a sampling-and-interpolation scheme is introduced to generate stroke paths from the direction field. Finally, pencil strokes are rendered along the specified paths with consideration of image tone and artificial illumination. As an important application, the technique is applied to render portraits from images with little user interaction. The experimental results demonstrate that the approach can automatically achieve compelling pencil sketches from reference images.

  17. Research on an Intelligent Automatic Turning System

    Directory of Open Access Journals (Sweden)

    Lichong Huang

    2012-12-01

    Full Text Available Equipment manufacturing is a strategic industry for a country, and its core component is the CNC machine tool. Therefore, strengthening independent research on the relevant technologies of CNC machines, especially open CNC systems, is of great significance. This paper presents some key techniques of an Intelligent Automatic Turning System and gives a viable solution for system integration. First of all, the integrated system architecture and the flexible and efficient workflow for performing the intelligent automatic turning process are illustrated. Secondly, innovative methods for workpiece feature recognition and expression and for process planning of NC machining are put forward. Thirdly, the cutting tool auto-selection and cutting parameter optimization solutions are generated with an integrated inference combining rule-based and case-based reasoning. Finally, an actual machining case based on the developed intelligent automatic turning system proved that the presented solutions are valid, practical and efficient.

  18. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs...... and specialization of classes (inheritance) are considered different abstractions. We present a new programming language, Lapis, that unifies inheritance and program specialization at the conceptual, syntactic, and semantic levels. This paper presents the initial development of Lapis, which uses inheritance...... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented...

  19. Automatic Age Estimation System for Face Images

    Directory of Open Access Journals (Sweden)

    Chin-Teng Lin

    2012-11-01

    Full Text Available Humans are the most important tracking objects in surveillance systems. However, human tracking is not enough to provide the required information for personalized recognition. In this paper, we present a novel and reliable framework for automatic age estimation based on computer vision. It exploits global face features based on the combination of Gabor wavelets and orthogonal locality preserving projections. In addition, the proposed system can extract face aging features automatically in real‐time. This means that the proposed system has more potential in applications compared to other semi‐automatic systems. The results obtained from this novel approach could provide clearer insight for operators in the field of age estimation to develop real‐world applications.

  20. A simple readout electronics for automatic power controlled self-mixing laser diode systems.

    Science.gov (United States)

    Cattini, Stefano; Rovati, Luigi

    2008-08-01

    The paper describes a simple electronic circuit to drive a laser diode for self-mixing interferometry. The network integrates a stable commercial automatic power controller and a current mirror based readout of the interferometric signal. The first prototype version of the circuit has been realized and characterized. The system allows easily performing precise interferometric measurements with no thermostatic circuitry to stabilize the laser diode temperature and an automatic control gain network to compensate emitted optical power fluctuations. To achieve this result, in the paper a specific calibration procedure to be performed is described.

  1. Semi-automatic knee cartilage segmentation

    Science.gov (United States)

    Dam, Erik B.; Folkesson, Jenny; Pettersen, Paola C.; Christiansen, Claus

    2006-03-01

    Osteo-Arthritis (OA) is a very common age-related cause of pain and reduced range of motion. A central effect of OA is wear-down of the articular cartilage that otherwise ensures smooth joint motion. Quantification of the cartilage breakdown is central in monitoring disease progression and therefore cartilage segmentation is required. Recent advances allow automatic cartilage segmentation with high accuracy in most cases. However, the automatic methods still fail in some problematic cases. For clinical studies, even if a few failing cases will be averaged out in the overall results, this reduces the mean accuracy and precision and thereby necessitates larger/longer studies. Since the severe OA cases are often most problematic for the automatic methods, there is even a risk that the quantification will introduce a bias in the results. Therefore, interactive inspection and correction of these problematic cases is desirable. For diagnosis on individuals, this is even more crucial since the diagnosis will otherwise simply fail. We introduce and evaluate a semi-automatic cartilage segmentation method combining an automatic pre-segmentation with an interactive step that allows inspection and correction. The automatic step consists of voxel classification based on supervised learning. The interactive step combines a watershed transformation of the original scan with the posterior probability map from the classification step at sub-voxel precision. We evaluate the method for the task of segmenting the tibial cartilage sheet from low-field magnetic resonance imaging (MRI) of knees. The evaluation shows that the combined method allows accurate and highly reproducible correction of the segmentation of even the worst cases in approximately ten minutes of interaction.

  2. Automatic learning-based beam angle selection for thoracic IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Amit, Guy; Marshall, Andrea [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, Ontario M5G 2M9 (Canada); Purdie, Thomas G., E-mail: tom.purdie@rmp.uhn.ca; Jaffray, David A. [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, Ontario M5G 2M9 (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5S 3E2 (Canada); Techna Institute, University Health Network, Toronto, Ontario M5G 1P5 (Canada); Levinshtein, Alex [Department of Computer Science, University of Toronto, Toronto, Ontario M5S 3G4 (Canada); Hope, Andrew J.; Lindsay, Patricia [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, Ontario M5G 2M9, Canada and Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5S 3E2 (Canada); Pekar, Vladimir [Philips Healthcare, Markham, Ontario L6C 2S3 (Canada)

    2015-04-15

    Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner’s clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume
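
    A schematic version of the learning component — mapping anatomical features to a per-angle score with random forest regression and keeping the highest-scoring angles — might look like the sketch below. The features, scores and simple top-k selection are illustrative assumptions and omit the paper's optimization over inter-beam dependencies.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: each row = anatomical features describing one
# candidate beam angle for one patient (e.g. target depth, lung path length),
# and the target is a clinically derived beam-quality score in [0, 1].
rng = np.random.default_rng(7)
X_train = rng.random((500, 6))
y_train = np.clip(1.0 - X_train[:, 1] + 0.1 * rng.normal(size=500), 0, 1)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

def select_beams(candidate_features, angles_deg, k=5):
    """Score every candidate angle and return the k best (no inter-beam term)."""
    scores = model.predict(candidate_features)
    best = np.argsort(scores)[::-1][:k]
    return sorted(angles_deg[i] for i in best)

angles = np.arange(0, 360, 10)         # candidate gantry angles
feats = rng.random((len(angles), 6))   # features for a new patient
print("selected beam angles:", select_beams(feats, angles))
```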

  3. Document Exploration and Automatic Knowledge Extraction for Unstructured Biomedical Text

    Science.gov (United States)

    Chu, S.; Totaro, G.; Doshi, N.; Thapar, S.; Mattmann, C. A.; Ramirez, P.

    2015-12-01

    We describe our work on building a web-browser based document reader with built-in exploration tool and automatic concept extraction of medical entities for biomedical text. Vast amounts of biomedical information are offered in unstructured text form through scientific publications and R&D reports. Utilizing text mining can help us to mine information and extract relevant knowledge from a plethora of biomedical text. The ability to employ such technologies to aid researchers in coping with information overload is greatly desirable. In recent years, there has been an increased interest in automatic biomedical concept extraction [1, 2] and intelligent PDF reader tools with the ability to search on content and find related articles [3]. Such reader tools are typically desktop applications and are limited to specific platforms. Our goal is to provide researchers with a simple tool to aid them in finding, reading, and exploring documents. Thus, we propose a web-based document explorer, which we called Shangri-Docs, which combines a document reader with automatic concept extraction and highlighting of relevant terms. Shangri-Docs also provides the ability to evaluate a wide variety of document formats (e.g. PDF, Words, PPT, text, etc.) and to exploit the linked nature of the Web and personal content by performing searches on content from public sites (e.g. Wikipedia, PubMed) and private cataloged databases simultaneously. Shangri-Docs utilizes Apache cTAKES (clinical Text Analysis and Knowledge Extraction System) [4] and Unified Medical Language System (UMLS) to automatically identify and highlight terms and concepts, such as specific symptoms, diseases, drugs, and anatomical sites, mentioned in the text. cTAKES was originally designed specially to extract information from clinical medical records. Our investigation leads us to extend the automatic knowledge extraction process of cTAKES for biomedical research domain by improving the ontology guided information extraction

  4. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Abstract Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus is usually time-consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers, based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in
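
    A minimal sketch of this kind of SVM-based paper triage — TF-IDF features over abstracts plus a linear SVM — is shown below with made-up example sentences; the real system is trained on large curated corpora from WormBase and related databases.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Toy labelled abstracts: 1 = contains the data type of interest
# (e.g. expression pattern results), 0 = does not. Real training sets are
# built from previously curated papers.
docs = [
    "GFP reporter expression was observed in intestinal cells",       # 1
    "The promoter drives expression in neurons during L2 stage",      # 1
    "We describe a new phylogenetic analysis of nematode species",    # 0
    "A population genetics survey of wild isolates is presented",     # 0
    "In situ hybridization shows transcript expression in the gonad", # 1
    "This review summarizes recent advances in the field",            # 0
]
labels = [1, 1, 0, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(docs, labels)

new = ["Reporter expression is detected in body wall muscle",
       "We sequenced the genome of a related species"]
print(dict(zip(new, clf.predict(new))))
```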

  5. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has become a severe threat to interconnected computer systems for decades and has caused billions of dollars damages each year. A large volume of new malware samples are discovered daily. Even worse, malware is rapidly evolving becoming more sophisticated and evasive to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regards to this new analysis framework, a series of analysis techniques for automatic malware analy

  6. Automatic Target Detection Using Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ganesan L

    2004-01-01

    Full Text Available Automatic target recognition (ATR) involves processing images for detecting, classifying, and tracking targets embedded in a background scene. This paper presents an algorithm for detecting a specified set of target objects embedded in visual images for an ATR application. The developed algorithm employs a novel technique for automatically detecting man-made and non-man-made single, two, and multitargets from nontarget objects, located within a cluttered environment, by evaluating nonoverlapping image blocks, where a block-by-block comparison of wavelet co-occurrence features is performed. The results of the proposed algorithm are found to be satisfactory.

  7. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    , and can easily be adapted to situational circumstances. Both the perception of advertising and the way advertising influences brand evaluation involves both processes. Automatic processes govern the recognition of advertising stimuli, the relevance decision which determines further higher-level processing...... are at variance with current notions about advertising effects. For example, the attention span problem will be relevant only for strategic processes, not for automatic processes, a certain amount of learning can occur with very little conscious effort, and advertising's effect on brand evaluation may be more stable

  8. Automatic Hierarchical Color Image Classification

    Directory of Open Access Journals (Sweden)

    Jing Huang

    2003-02-01

    Full Text Available Organizing images into semantic categories can be extremely useful for content-based image retrieval and image annotation. Grouping images into semantic classes is a difficult problem, however. Image classification attempts to solve this hard problem by using low-level image features. In this paper, we propose a method for hierarchical classification of images via supervised learning. This scheme relies on using a good low-level feature and subsequently performing feature-space reconfiguration using singular value decomposition to reduce noise and dimensionality. We use the training data to obtain a hierarchical classification tree that can be used to categorize new images. Our experimental results suggest that this scheme not only performs better than standard nearest-neighbor techniques, but also has both storage and computational advantages.

  9. Automatic speech signal segmentation based on the innovation adaptive filter

    Directory of Open Access Journals (Sweden)

    Makowski Ryszard

    2014-06-01

    Full Text Available Speech segmentation is an essential stage in designing automatic speech recognition systems and one can find several algorithms proposed in the literature. It is a difficult problem, as speech is immensely variable. The aim of the authors' studies was to design an algorithm that could be employed at the stage of automatic speech recognition. This would make it possible to avoid some problems related to speech signal parametrization. Posing the problem in such a way requires the algorithm to be capable of working in real time. The only such algorithm was proposed by Tyagi et al. (2006), and it is a modified version of Brandt's algorithm. The article presents a new algorithm for unsupervised automatic speech signal segmentation. It performs segmentation without access to information about the phonetic content of the utterances, relying exclusively on second-order statistics of a speech signal. The starting point for the proposed method is time-varying Schur coefficients of an innovation adaptive filter. The Schur algorithm is known to be fast, precise, stable and capable of rapidly tracking changes in second-order signal statistics. A transfer from one phoneme to another in the speech signal always indicates a change in signal statistics caused by vocal tract changes. In order to allow for the properties of human hearing, detection of inter-phoneme boundaries is performed based on statistics defined on the mel spectrum determined from the reflection coefficients. The paper presents the structure of the algorithm, defines its properties, lists parameter values, describes detection efficiency results, and compares them with those for another algorithm. The obtained segmentation results are satisfactory.
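
    The published algorithm tracks time-varying Schur (reflection) coefficients of an innovation adaptive filter. A much simpler sketch of the underlying idea — flagging frame boundaries where short-time spectral statistics change abruptly — is given below; the frame length, the spectral-shape distance and the threshold are assumptions, not the paper's method.

```python
import numpy as np

def segment_boundaries(signal, sr, frame_ms=20, threshold=0.6):
    """Flag frame boundaries where the short-time spectral shape (a proxy for
    second-order statistics) changes abruptly between neighbouring frames."""
    n = int(sr * frame_ms / 1000)
    frames = [signal[i:i + n] for i in range(0, len(signal) - n, n)]
    specs = []
    for f in frames:
        s = np.abs(np.fft.rfft(f * np.hanning(n))) + 1e-12
        specs.append(s / s.sum())  # normalized spectral shape
    bounds = []
    for k in range(1, len(specs)):
        # symmetric Kullback-Leibler-style distance between adjacent frames
        d = np.sum((specs[k] - specs[k - 1]) * np.log(specs[k] / specs[k - 1]))
        if d > threshold:
            bounds.append(k * n / sr)  # boundary time in seconds
    return bounds

# Synthetic two-'phoneme' signal: a 300 Hz tone followed by a 1200 Hz tone.
sr = 16000
t = np.arange(sr) / sr
x = np.concatenate([np.sin(2 * np.pi * 300 * t[:sr // 2]),
                    np.sin(2 * np.pi * 1200 * t[:sr // 2])])
print("detected boundaries (s):", segment_boundaries(x, sr))
```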

  10. Laboratory evaluation of the Boehringer Mannheim "Hitachi 705" automatic analyzer.

    Science.gov (United States)

    Kineiko, R W; Floering, D A; Morrissey, M

    1983-04-01

    We evaluated a new multi-channel chemistry analyzer, the Hitachi 705 Automatic Analyzer, marketed by Boehringer Mannheim Diagnostics, Inc. The instrument is a computer-controlled discrete analyzer, which can be run in a combination profile mode and single-test mode. Sixteen different tests per sample may be performed at the rate of 180 tests per hour. The Hitachi 705 is especially suitable for use in hospitals that do not perform profile testing. Precision and linearity were excellent and the instrument was relatively trouble-free, with little operator attention required during operation. The Hitachi 705 was easily interfaced to our laboratory computer. We compared the performance of the instrument with that of the Du Pont aca; the two instruments compared favorably.

  11. Automatic software fault localization based on artificial bee colony

    Institute of Scientific and Technical Information of China (English)

    Linzhi Huang∗; Jun Ai

    2015-01-01

    Software debugging accounts for a vast majority of the financial and time costs in software development and maintenance. Thus, approaches to software fault localization that can help automate the debugging process have become a hot topic in the field of software engineering. Given the great demand for software fault localization, an approach based on the artificial bee colony (ABC) algorithm is proposed to be integrated with other related techniques. In this process, the source program is initially instrumented after analyzing the dependence information. The test case sets are then compiled and run on the instrumented program, and execution results are input to the ABC algorithm. The algorithm can determine the largest fitness value and best food source by calculating the average fitness of the employed bees in the iterative process. The program unit with the highest suspicion score corresponding to the best test case set is regarded as the final fault localization. Experiments are conducted with the TCAS program in the Siemens suite. Results demonstrate that the proposed fault localization method is effective and efficient. The ABC algorithm can efficiently avoid the local optimum, and ensure the validity of the fault location to a larger extent.
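
    The record does not give the fitness function in detail. The general shape of spectrum-based fault localization — scoring program units by how strongly their execution correlates with failing test cases, the kind of suspiciousness ranking such a search optimizes over — can be sketched as follows; the Ochiai formula and the toy coverage matrix are illustrative choices, not the paper's method.

```python
import numpy as np

def ochiai_suspiciousness(coverage, failed):
    """coverage[t, u] = 1 if test t executes program unit u;
    failed[t] = True if test t fails. Returns one score per unit."""
    failed = np.asarray(failed, dtype=bool)
    cov = np.asarray(coverage, dtype=bool)
    ef = (cov & failed[:, None]).sum(axis=0)   # executed by failing tests
    ep = (cov & ~failed[:, None]).sum(axis=0)  # executed by passing tests
    denom = np.sqrt(failed.sum() * (ef + ep))
    return np.divide(ef, denom, out=np.zeros(cov.shape[1]), where=denom > 0)

# Toy example: 5 tests x 4 units; unit 2 is touched only by the failing tests.
coverage = [[1, 1, 0, 1],
            [1, 0, 1, 0],
            [0, 1, 0, 0],
            [1, 1, 1, 1],
            [0, 0, 0, 1]]
failed = [False, True, False, True, False]
scores = ochiai_suspiciousness(coverage, failed)
print("suspiciousness per unit:", np.round(scores, 2))
print("most suspicious unit:", int(np.argmax(scores)))
```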

  12. An automatic grid generation approach over free-form surface for architectural design

    Institute of Scientific and Technical Information of China (English)

    苏亮; 祝顺来; 肖南; 高博青

    2014-01-01

    An essential step for the realization of free-form surface structures is to create an efficient structural grid that satisfies not only the architectural aesthetics, but also the structural performance. Employing the main stress trajectories as the representation of force flows on a free-form surface, an automatic grid generation approach is proposed for the architectural design. The algorithm automatically plots the main stress trajectories on a 3D free-form surface, and adopts a modified advancing front meshing technique to generate the structural grid. Based on the proposed algorithm, an automatic grid generator named "St-Surmesh" is developed for the practical architectural design of free-form surface structures. The surface geometry of one of the Sun Valleys in Expo Axis for the Expo Shanghai 2010 is selected as a numerical example for validating the proposed approach. Comparative studies are performed to demonstrate how different structural grids affect the design of a free-form surface structure.

  13. The role of attention in automatization: does attention operate at encoding, or retrieval, or both?

    Science.gov (United States)

    Boronat, C B; Logan, G D

    1997-01-01

    In this research, we investigated whether attention operates in the encoding of automatized information, the retrieval of automatized information, or in both cases. Subjects searched two-word displays for members of a target category in focused-attention or divided-attention conditions that were crossed with block (training vs. transfer). To see whether subjects encoded all available items or only attended items, we compared performance for subjects in different training conditions but in the same transfer condition. Subjects encoded attended items. To see whether subjects retrieved all the items they had in memory, or only items associated with that to which they were attending at retrieval, we compared performance for subjects in the same training conditions but in different transfer conditions. Subjects retrieved attended items. Attention was found to operate at both encoding and retrieval. These findings support the instance theory of automaticity, which predicts the role of attention at encoding and retrieval.

  14. Go with the flow: how the consideration of joy versus pride influences automaticity.

    Science.gov (United States)

    Katzir, Maayan; Ori, Bnaya; Eyal, Tal; Meiran, Nachshon

    2015-02-01

    Recently, we have shown that the consideration of joy, without the actual experience of the emotion, impaired performance on the antisaccade task (Katzir, Eyal, Meiran, & Kessler, 2010). We interpreted this finding as indicating inhibitory control failure. However, impaired antisaccade performance may result from either the weakening of inhibitory control, the potentiation of the competing automatic response, or both. In the current research we used a task switching paradigm, which allowed us to assess cognitive control more directly, using Backward Inhibition, Competitor Rule Suppression, and Competitor Rule Priming as cognitive-control indices as well as assessing the Task Rule Congruency Effect (TRCE) which, like the antisaccade, is influenced by both control and automaticity. We found that considering joy compared to pride did not influence any of the cognitive control indices but increased the TRCE. We interpret this finding as evidence that joy consideration leads to increased reliance on automatic tendencies, such as short-term desires.

  15. Comparison of TCP automatic tuning techniques for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Weigle, E. H. (Eric H.); Feng, W. C. (Wu-Chun)

    2002-01-01

    Rather than painful, manual, static, per-connection optimization of TCP buffer sizes simply to achieve acceptable performance for distributed applications, many researchers have proposed techniques to perform this tuning automatically. This paper first discusses the relative merits of the various approaches in theory, and then provides substantial experimental data concerning two competing implementations - the buffer autotuning already present in Linux 2.4.x and 'Dynamic Right-Sizing.' This paper reveals heretofore unknown aspects of the problem and current solutions, provides insight into the proper approach for various circumstances, and points toward ways to further improve performance. TCP, for good or ill, is the only protocol widely available for reliable end-to-end congestion-controlled network communication, and thus it is the one used for almost all distributed computing. Unfortunately, TCP was not designed with high-performance computing in mind - its original design decisions focused on long-term fairness first, with performance a distant second. Thus users must often perform tortuous manual optimizations simply to achieve acceptable behavior. The most important and often most difficult task is determining and setting appropriate buffer sizes. Because of this, at least six ways of automatically setting these sizes have been proposed. In this paper, we compare and contrast these tuning methods. First we explain each method, followed by an in-depth discussion of their features. Next we discuss the experiments to fully characterize two particularly interesting methods (Linux 2.4 autotuning and Dynamic Right-Sizing). We conclude with results and possible improvements.
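
    For comparison with the manual tuning the paper argues against, the sketch below shows the two knobs involved on a typical Linux system: per-socket buffer sizes set explicitly from an application, and the kernel autotuning ranges exposed under /proc. The paths and sizes are those commonly found on Linux and should be treated as illustrative.

```python
import socket

# Explicit (manual) per-connection tuning: request larger send/receive buffers.
# The kernel may round or cap these values; Linux typically doubles them.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 4 * 1024 * 1024)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 4 * 1024 * 1024)
print("effective SO_SNDBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF))
print("effective SO_RCVBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))
sock.close()

# Kernel autotuning ranges (min / default / max, in bytes) on Linux.
# Note: setting SO_RCVBUF/SO_SNDBUF explicitly disables autotuning for that socket.
for path in ("/proc/sys/net/ipv4/tcp_rmem", "/proc/sys/net/ipv4/tcp_wmem"):
    try:
        with open(path) as f:
            print(path, "->", f.read().strip())
    except OSError:
        print(path, "not available on this system")
```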

  16. Parallel solution of systems of linear equations generated by COMSOL 3.2 using the Sun Performance Library

    DEFF Research Database (Denmark)

    Gersborg-Hansen, Allan; Dammann, Bernd; Aage, Niels

    This note investigates the use of the Sun Performance Library for parallel solution of a system of linear equations generated by COMSOL 3.2. In many engineering disciplines this is a computational bottleneck for large problems which are often met in research practice. Most researchers are primarily...... for a test problem run in 2D and 3D. Moreover this note quantifies the performance of COMSOL running on a Sparc ULTRA III processor. The study shows that for small problems such as debugging tasks, teaching exercises etc. the Sun computer is not competitive compared with a standard PC....

  17. Automatic Water Sensor Window Opening System

    KAUST Repository

    Percher, Michael

    2013-12-05

    A system can automatically open at least one window of a vehicle when the vehicle is being submerged in water. The system can include a water collector and a water sensor, and when the water sensor detects water in the water collector, at least one window of the vehicle opens.
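
    The described behavior is a simple sensor-to-actuator rule. A minimal polling-loop sketch follows; read_water_sensor and open_window are hypothetical stand-ins for the vehicle's actual sensor and window interfaces (simulated here so the sketch is self-contained).

        import random
        import time

        def read_water_sensor() -> bool:
            # Hypothetical stand-in for the collector's water sensor (simulated).
            return random.random() < 0.01

        def open_window(window_id: int) -> None:
            # Hypothetical stand-in for the vehicle's window actuator.
            print(f"opening window {window_id}")

        def monitor(poll_seconds: float = 0.1) -> None:
            # Poll the sensor; when water is detected in the collector, open a window.
            while True:
                if read_water_sensor():
                    open_window(0)
                    break
                time.sleep(poll_seconds)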

  18. Channel selection for automatic seizure detection

    DEFF Research Database (Denmark)

    Duun-Henriksen, Jonas; Kjaer, Troels Wesenberg; Madsen, Rasmus Elsborg

    2012-01-01

    of an automatic channel selection method. The characteristics of the seizures are extracted by the use of a wavelet analysis and classified by a support vector machine. The best channel selection method is based upon maximum variance during the seizure. Results: Using only three channels, a seizure detection...
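
    A minimal sketch of the described pipeline, under stated assumptions: channels are ranked by variance during the seizure, per-channel wavelet sub-band energies serve as features, and a support vector machine does the classification. The array shapes, the 'db4' wavelet, and the fixed channel set are illustrative choices, not the paper's exact configuration.

        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def select_channels(eeg, k=3):
            # Rank channels (rows of eeg) by variance and keep the top k.
            return np.argsort(eeg.var(axis=1))[::-1][:k]

        def wavelet_features(signal, wavelet="db4", level=4):
            # Energy of each wavelet sub-band as a simple feature vector.
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            return np.array([np.sum(c ** 2) for c in coeffs])

        def features_for_epoch(eeg, channels):
            return np.concatenate([wavelet_features(eeg[ch]) for ch in channels])

        def train_detector(epochs, labels):
            # epochs: list of (channels x samples) arrays; labels: 1 = seizure, 0 = non-seizure.
            channels = select_channels(epochs[0])      # assumed fixed across epochs
            X = np.vstack([features_for_epoch(e, channels) for e in epochs])
            clf = SVC(kernel="rbf").fit(X, labels)
            return clf, channels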

  19. Automatic Activation of Exercise and Sedentary Stereotypes

    Science.gov (United States)

    Berry, Tanya; Spence, John C.

    2009-01-01

    We examined the automatic activation of "sedentary" and "exerciser" stereotypes using a social prime Stroop task. Results showed significantly slower response times between the exercise words and the exercise control words and between the sedentary words and the exercise control words when preceded by an attractive exerciser prime. Words preceded…

  20. Automatic Guidance System for Welding Torches

    Science.gov (United States)

    Smith, H.; Wall, W.; Burns, M. R., Jr.

    1984-01-01

    Digital system automatically guides the welding torch to produce square-butt, V-groove, and lap-joint weldments within a tracking accuracy of ±0.2 millimeter. A television camera observes and traverses the weld joint, carrying the welding torch behind it. The image of the joint is digitized, and the resulting data are used to derive control signals that enable the torch to track the joint.
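
    The record describes deriving torch corrections from a digitized camera image of the joint. A minimal sketch under stated assumptions: the joint is located as the darkest pixel in each scan line, and the averaged pixel offset is converted into a lateral correction. The camera scale and proportional gain are assumed values, not those of the original system.

        import numpy as np

        MM_PER_PIXEL = 0.05   # assumed camera calibration
        GAIN = 0.5            # assumed proportional gain for the cross-seam correction

        def joint_offset_mm(scanline, image_width):
            # Locate the joint as the darkest pixel in one digitized scan line.
            joint_px = int(np.argmin(scanline))
            return (joint_px - image_width / 2) * MM_PER_PIXEL

        def torch_correction(image):
            # Average the joint offset over all scan lines; return a lateral correction in mm.
            offsets = [joint_offset_mm(row, image.shape[1]) for row in image]
            return -GAIN * float(np.mean(offsets))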

  1. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan;

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...
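
    A minimal sketch of the underlying idea, under stated assumptions: analysis results are tabled (memoized) per input unit, and when a unit changes only its entry and the entries that depend on it are invalidated and recomputed. This is written in plain Python rather than the tabled logic programming used in the paper.

        class IncrementalAnalysis:
            # Tiny stand-in for a tabled analysis with dependency-based invalidation.

            def __init__(self, analyze):
                self.analyze = analyze        # analyze(unit, deps) -> result
                self.table = {}               # unit -> cached result
                self.rdeps = {}               # unit -> set of units depending on it

            def result(self, unit, deps=()):
                if unit not in self.table:
                    for d in deps:
                        self.rdeps.setdefault(d, set()).add(unit)
                    self.table[unit] = self.analyze(unit, deps)
                return self.table[unit]

            def invalidate(self, unit, _seen=None):
                # Drop the cached result for a changed unit and everything downstream of it.
                _seen = _seen if _seen is not None else set()
                if unit in _seen:
                    return
                _seen.add(unit)
                for dependent in self.rdeps.get(unit, ()):
                    self.invalidate(dependent, _seen)
                self.table.pop(unit, None)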

  2. Automatic TLI recognition system, general description

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system capabilities.

  3. Automatically output-power-controlled WDM EDFA

    Science.gov (United States)

    Choi, Bo-Hun; Lee, Sang Soo; Kim, Chang-Bong; Ko, Jesoo

    2006-09-01

    Our amplifier, which uses an all-optical method and a fixed gain-flattening filter (GFF), achieved automatic gain flatness across the entire C-band without any noise-figure (NF) degradation, together with a constant 25 dB gain, while the input was varied between one channel and forty WDM channels.

  4. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple t...

  5. 42 CFR 407.17 - Automatic enrollment.

    Science.gov (United States)

    2010-10-01

    ... SUPPLEMENTARY MEDICAL INSURANCE (SMI) ENROLLMENT AND ENTITLEMENT Individual Enrollment and Entitlement for SMI... enrolled for SMI if he or she: (1) Resides in the United States, except in Puerto Rico; (2) Becomes... chapter; and (3) Does not decline SMI enrollment. (b) Opportunity to decline automatic enrollment. (1)...

  6. Automatic assessment of cardiac perfusion MRI

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Stegmann, Mikkel Bille; Larsson, Henrik B.W.

    2004-01-01

    In this paper, a method based on Active Appearance Models (AAM) is applied for automatic registration of myocardial perfusion MRI. A semi-quantitative perfusion assessment of the registered image sequences is presented. This includes the formation of perfusion maps for three parameters: maximum up...
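
    One of the semi-quantitative parameters mentioned, maximum upslope, can be illustrated with a short sketch: for each pixel of a registered image sequence, take the largest frame-to-frame increase of its intensity curve. The array layout and the synthetic data are assumptions for illustration only.

        import numpy as np

        def max_upslope_map(frames):
            # frames: (time, rows, cols) registered perfusion sequence.
            # Returns a (rows, cols) map of the steepest frame-to-frame intensity increase.
            upslopes = np.diff(frames.astype(float), axis=0)   # per-frame intensity change
            return upslopes.max(axis=0)

        # Example with a synthetic 30-frame, 64x64 sequence.
        sequence = np.random.rand(30, 64, 64)
        perfusion_map = max_upslope_map(sequence)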

  7. Automatic extraction of legal concepts and definitions

    NARCIS (Netherlands)

    R. Winkels; R. Hoekstra

    2012-01-01

    In this paper we present the results of an experiment in automatic concept and definition extraction from written sources of law using relatively simple natural language and standard semantic web technology. The software was tested on six laws from the tax domain.
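
    The paper combines natural language processing with semantic web technology; the sketch below shows only the simplest ingredient of such an approach, matching "X means Y"-style definition patterns with a regular expression. The pattern set and the sample sentence are assumptions, not the authors' implementation.

        import re

        DEFINITION_PATTERN = re.compile(
            r"(?P<term>[A-Z][\w ]{2,60}?)\s+(?:means|is defined as)\s+(?P<definition>[^.;]+)"
        )

        def extract_definitions(text):
            # Return (term, definition) pairs for every matched definition pattern.
            return [(m.group("term").strip(), m.group("definition").strip())
                    for m in DEFINITION_PATTERN.finditer(text)]

        sample = "Taxable income means gross income minus allowable deductions."
        print(extract_definitions(sample))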

  8. Automatic Syntactic Analysis of Free Text.

    Science.gov (United States)

    Schwarz, Christoph

    1990-01-01

    Discusses problems encountered with the syntactic analysis of free-text documents in indexing. Postcoordination and precoordination of terms are discussed, an automatic indexing system called COPSY (context operator syntax) that uses natural language processing techniques is described, and future developments are explained. (60 references) (LRW)

  9. Toward the Automatic Identification of Sublanguage Vocabulary.

    Science.gov (United States)

    Haas, Stephanie W.; He, Shaoyi

    1993-01-01

    Describes the development of a method for the automatic identification of sublanguage vocabulary words as they occur in abstracts. Highlights include research relating to sublanguages and their vocabulary; domain terms; evaluation criteria, including recall and precision; and implications for natural language processing and information retrieval.…
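
    The evaluation criteria mentioned, recall and precision, can be computed directly from the set of automatically identified words and a manually prepared reference list. A small sketch with hypothetical word sets:

        def precision_recall(identified, reference):
            identified, reference = set(identified), set(reference)
            tp = len(identified & reference)                      # true positives
            precision = tp / len(identified) if identified else 0.0
            recall = tp / len(reference) if reference else 0.0
            return precision, recall

        # Hypothetical example: words flagged as sublanguage vocabulary vs. a gold list.
        auto = {"angioplasty", "stent", "baseline", "lesion"}
        gold = {"angioplasty", "stent", "lesion", "restenosis"}
        print(precision_recall(auto, gold))   # (0.75, 0.75)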

  10. Automatically extracting class diagrams from spreadsheets

    NARCIS (Netherlands)

    Hermans, F.; Pinzger, M.; Van Deursen, A.

    2010-01-01

    The use of spreadsheets to capture information is widespread in industry. Spreadsheets can thus be a rich source of domain information. We propose to automatically extract this information and transform it into class diagrams. The resulting class diagram can be used by software engineers to under
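
    A minimal sketch of the underlying idea, under stated assumptions: treat a spreadsheet's header row as candidate attribute names and infer simple attribute types from the column values, yielding one class description per sheet. The real approach also analyses formulas and layout patterns, which is not shown here.

        def infer_type(values):
            # Crude type inference over a column's non-empty cell values.
            if not values:
                return "str"
            try:
                for v in values:
                    float(v)
                return "float"
            except ValueError:
                return "str"

        def class_from_rows(rows, class_name="Sheet"):
            header, data = rows[0], rows[1:]
            attrs = {name: infer_type([r[i] for r in data if i < len(r) and r[i]])
                     for i, name in enumerate(header)}
            return {"class": class_name, "attributes": attrs}

        rows = [["name", "salary"], ["Alice", "52000"], ["Bob", "61500"]]
        print(class_from_rows(rows))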

  11. Feedback Improvement in Automatic Program Evaluation Systems

    Science.gov (United States)

    Skupas, Bronius

    2010-01-01

    Automatic program evaluation is a way to assess source program files. These techniques are used in learning management environments, programming exams, and contest systems. However, the use of automated program evaluation encounters problems: some evaluations are not clear to students, and the system messages do not show the reasons for lost points.…

  12. Automatic Thesaurus Construction Using Bayesian Networks.

    Science.gov (United States)

    Park, Young C.; Choi, Key-Sun

    1996-01-01

    Discusses automatic thesaurus construction and characterizes the statistical behavior of terms by using an inference network. Highlights include low-frequency terms and data sparseness, Bayesian networks, collocation maps and term similarity, constructing a thesaurus from a collocation map, and experiments with test collections. (Author/LRW)
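
    The abstract's notion of deriving term similarity from collocation data can be illustrated with a small sketch: count co-occurrences within documents and compare terms by the cosine similarity of their co-occurrence vectors. The Bayesian-network inference step used in the paper is not reproduced; the toy documents are assumptions.

        from collections import Counter, defaultdict
        from math import sqrt

        def cooccurrence_map(documents):
            # term -> Counter of co-occurring terms, counted once per document.
            cooc = defaultdict(Counter)
            for doc in documents:
                terms = set(doc.lower().split())
                for t in terms:
                    for u in terms:
                        if t != u:
                            cooc[t][u] += 1
            return cooc

        def cosine(a, b):
            shared = set(a) & set(b)
            num = sum(a[t] * b[t] for t in shared)
            den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
            return num / den if den else 0.0

        docs = ["retrieval of medical documents",
                "document retrieval and indexing",
                "medical indexing systems"]
        cooc = cooccurrence_map(docs)
        print(cosine(cooc["retrieval"], cooc["indexing"]))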

  13. Automatic characterization of dynamics in Absence Epilepsy

    DEFF Research Database (Denmark)

    Petersen, Katrine N. H.; Nielsen, Trine N.; Kjær, Troels W.;

    2013-01-01

    Dynamics of the spike-wave paroxysms in Childhood Absence Epilepsy (CAE) are automatically characterized using novel approaches. Features are extracted from scalograms formed by Continuous Wavelet Transform (CWT). Detection algorithms are designed to identify an estimate of the temporal development...
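
    A minimal sketch of the scalogram step, under stated assumptions, using PyWavelets' continuous wavelet transform with a Morlet wavelet; the scales, sampling rate, and per-scale energy feature are illustrative choices, not the paper's configuration.

        import numpy as np
        import pywt

        def scalogram_features(eeg_channel, fs=256.0, scales=np.arange(1, 64)):
            # Continuous wavelet transform of one EEG channel; return per-scale energy.
            coeffs, freqs = pywt.cwt(eeg_channel, scales, "morl", sampling_period=1.0 / fs)
            energy = np.sum(np.abs(coeffs) ** 2, axis=1)   # one value per scale
            return energy, freqs

        # Synthetic 4-second segment at 256 Hz with a 3 Hz component,
        # roughly the rhythm of spike-wave activity.
        t = np.arange(0, 4, 1 / 256)
        segment = np.sin(2 * np.pi * 3 * t) + 0.3 * np.random.randn(t.size)
        energy, freqs = scalogram_features(segment)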

  14. Automatic contour welder incorporates speed control system

    Science.gov (United States)

    Wall, W. A., Jr.

    1968-01-01

    Speed control system maintains the welding torch of an automatic welder at a substantially constant speed. The system is particularly useful when welding contoured or unusually shaped surfaces, which cause the distance from the work surface to the weld carriage to vary in a random manner.

  15. Automatic solar lamp intensity control system

    Science.gov (United States)

    Leverone, H.; Mandell, N.

    1968-01-01

    A system was developed that places solar cells directly in the path of the radiation incident on the test volume and uses a dc bridge-null arrangement. The solar cell is affixed to a heat sink mounted on each of three arms for each solar lamp. Control of the radiation from the solar lamps is automatic.

  16. 38 CFR 51.31 - Automatic recognition.

    Science.gov (United States)

    2010-07-01

    ...) PER DIEM FOR NURSING HOME CARE OF VETERANS IN STATE HOMES Obtaining Per Diem for Nursing Home Care in... that already is recognized by VA as a State home for nursing home care at the time this part becomes effective, automatically will continue to be recognized as a State home for nursing home care but will...

  17. Conscious and Automatic Processes in Language Learning.

    Science.gov (United States)

    Carroll, John B.

    1981-01-01

    Proposes theory that the learning processes of first- and second-language learners are fundamentally the same, differing only in kinds of information used by both kinds of learners and the degree of automatization attained. Suggests designing second-language learning processes to simulate those occurring in natural settings. (Author/BK)

  18. Automatic prejudice in childhood and early adolescence

    NARCIS (Netherlands)

    Degner, J.; Wentura, D.

    2010-01-01

    Four cross-sectional studies are presented that investigated the automatic activation of prejudice in children and adolescents (aged 9 to 15 years). To this end, 4 different versions of the affective priming task were used, with pictures of ingroup and outgroup members being presented as prejudi

  19. Automatization of Student Assessment Using Multimedia Technology.

    Science.gov (United States)

    Taniar, David; Rahayu, Wenny

    Most use of multimedia technology in teaching and learning to date has emphasized the teaching aspect only. An application of multimedia in examinations has been neglected. This paper addresses how multimedia technology can be applied to the automatization of assessment, by proposing a prototype of a multimedia question bank, which is able to…

  20. Automatization and familiarity in repeated checking

    NARCIS (Netherlands)

    Dek, E.C.P.; van den Hout, M.A.; Giele, C.L.; Engelhard, I.M.

    2015-01-01

    Repetitive, compulsive-like checking of an object leads to reductions in memory confidence, vividness, and detail. Experimental research suggests that this is caused by increased familiarity with perceptual characteristics of the stimulus and automatization of the checking procedure (Dek, van den Ho