WorldWideScience

Sample records for automated computational tool

  1. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    ... information that is returned from the tools to the human user, and the forms in which these outputs are presented ... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM, APPENDIX 2, INTRODUCTION: this document and the Automated Software Tool Monitoring Program (Appendix 1) are ... Output features provide links from the tool to both the human user and the target machine (where applicable); they describe the types ...

  2. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    Science.gov (United States)

    2017-04-13

    AFRL-AFOSR-UK-TR-2017-0029. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems. Report covering the period 2012 - 01/25/2015.

  3. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-31

    ... maintain an effective and economical monitoring program that includes both processes and products and which makes data available to the Government ... presented to help the AFPRO understand what a software tool is and how it works. There are many ways in which one can view the characteristics of soft...

  4. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  5. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Science.gov (United States)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  6. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    Science.gov (United States)

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  7. A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data

    Science.gov (United States)

    Morton, Elizabeth; Lamitina, Todd

    2010-01-01

    Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
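
    The published tools are MATLAB code available from the authors' site; purely as an illustration of the kind of per-sample filtering and size-normalized fluorescence readout that COPAquant computes, here is a short Python sketch. The file name and column names (tof, ext, green) are assumptions for the example, not the actual COPAS export format.

```python
# Hypothetical sketch (not the published MATLAB code) of the kind of
# per-sample filtering and ratio computation that COPAquant performs.
# Column names (tof, ext, green) follow common COPAS output fields but
# are assumptions here.
import csv

def load_events(path):
    """Read one COPAS sample file exported as CSV into a list of dicts."""
    with open(path, newline="") as fh:
        return [{k: float(v) for k, v in row.items()} for row in csv.DictReader(fh)]

def filter_events(events, min_tof=100, min_ext=50):
    """Drop debris: keep events above size (TOF) and extinction thresholds."""
    return [e for e in events if e["tof"] >= min_tof and e["ext"] >= min_ext]

def summarize(events):
    """Return mean GFP signal normalized to object size, a typical readout."""
    ratios = [e["green"] / e["tof"] for e in events if e["tof"] > 0]
    if not ratios:
        return {"n": 0, "mean_green_per_tof": float("nan")}
    return {"n": len(ratios), "mean_green_per_tof": sum(ratios) / len(ratios)}

if __name__ == "__main__":
    events = filter_events(load_events("sample_well_A01.csv"))  # assumed file
    print(summarize(events))
```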

  8. Automated Parallel Computing Tools for Multicore Machines and Clusters, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to improve productivity of high performance computing for applications on multicore computers and clusters. These machines built from one or more chips...

  9. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  10. A new concept in glasshouse computer automation with SCADA and CASE Tools

    NARCIS (Netherlands)

    Meurs, van W.Th.M.; Gieling, Th.H.; Janssen, H.J.J.

    1996-01-01

    Climate control computers in greenhouses control heating and ventilation, supply water, dilute and dispense nutrients and integrate models into an optimally controlled system. This paper describes how information technology, as in use in other sectors of industry, applies to greenhouse control. In

  11. Computer automation and artificial intelligence

    International Nuclear Information System (INIS)

    Hasnain, S.B.

    1992-01-01

    Rapid advances in computing, resulting from the microchip revolution, have increased its applications manifold, particularly in computer automation. Yet the level of automation available has limited its application to more complex and dynamic systems, which require intelligent computer control. In this paper a review of artificial intelligence techniques used to augment automation is presented. The current sequential processing approach usually adopted in artificial intelligence has succeeded in emulating the symbolic processing part of intelligence, but the processing power required to capture the more elusive aspects of intelligence leads towards parallel processing. An overview of parallel processing, with emphasis on the transputer, is also provided. A fuzzy knowledge-based controller for amination drug delivery in muscle relaxant anesthesia, implemented on a transputer, is described. 4 figs. (author)

  12. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of the reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies
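
    Neither QUADRIGA nor CycleKit is described at the code level in the abstract; the following is only a minimal Python sketch of the general pattern it describes, namely running interdependent calculations in a valid order, collecting results and checking them against limits. The calculation names, quantities and limits below are invented.

```python
# Minimal sketch (not the ÚJV Řež CycleKit code) of the general pattern:
# run a set of calculations with interdependencies in order, collect
# results, and check them against limits. Names and limits are invented.
from graphlib import TopologicalSorter

def run_campaign(calculations, dependencies, limits):
    """calculations: name -> callable returning {quantity: value};
    dependencies: name -> set of prerequisite names;
    limits: quantity -> (low, high)."""
    results, violations = {}, []
    for name in TopologicalSorter(dependencies).static_order():
        results[name] = calculations[name]()
        for quantity, value in results[name].items():
            low, high = limits.get(quantity, (float("-inf"), float("inf")))
            if not (low <= value <= high):
                violations.append((name, quantity, value))
    return results, violations

# Toy usage with invented calculation names and limits.
calcs = {
    "depletion": lambda: {"burnup": 45.0},
    "power_peaking": lambda: {"Fq": 2.1},
}
deps = {"power_peaking": {"depletion"}, "depletion": set()}
print(run_campaign(calcs, deps, {"Fq": (0.0, 2.5)}))
```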

  13. Computer automation in veterinary hospitals.

    Science.gov (United States)

    Rogers, H

    1996-05-01

    Computers have been used to automate complex and repetitive tasks in veterinary hospitals since the 1960s. Early systems were expensive, but their use was justified because they performed jobs which would have been impossible or which would have required greater resources in terms of time and personnel had they been performed by other methods. Systems found in most veterinary hospitals today are less costly, magnitudes more capable, and often underused. Modern multitasking operating systems and graphical interfaces bring many opportunities for automation. Commercial and custom programs developed and used in a typical multidoctor mixed species veterinary practice are described.

  14. Automated, computer interpreted radioimmunoassay results

    International Nuclear Information System (INIS)

    Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.

    1984-01-01

    90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for use on a Hewlett Packard Model 1000 mini-computer system with conventional dot matrix printers. The computer program correlates the results of a combination of assays, interprets them and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient data base for radioassay laboratory results and to produce a computer-generated interpretation of these results using an algorithm that produces normal and abnormal interpretives. Their laboratory assays 50,000 patient samples each year using 28 different radioassays. Of these, 85% have been interpreted using the computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time and have provided uniformity of the interpretations among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians. Turnaround times for reports prior to the automated computer program were generally two to three days, whereas the computerized interpretation system allows reports to generally be issued the day assays are completed
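
    As an illustration of the kind of algorithm described here, mapping an assay value to a normal or abnormal interpretive using age- and sex-specific reference ranges, the following is a hypothetical Python sketch. The analyte, ranges and wording are invented and are not the clinical values used by the authors.

```python
# Illustrative sketch only: how an algorithm might flag a radioassay result
# as normal/abnormal against age- and sex-specific reference ranges and
# emit an interpretive line. Ranges and wording are invented.
def interpret(analyte, value, age, sex, reference_ranges):
    """reference_ranges: analyte -> list of (sex, age_min, age_max, low, high)."""
    for rng_sex, age_min, age_max, low, high in reference_ranges[analyte]:
        if rng_sex in (sex, "any") and age_min <= age < age_max:
            if value < low:
                return f"{analyte} {value} is BELOW the reference range {low}-{high}."
            if value > high:
                return f"{analyte} {value} is ABOVE the reference range {low}-{high}."
            return f"{analyte} {value} is within the reference range {low}-{high}."
    return f"No reference range defined for {analyte}, age {age}, sex {sex}."

ranges = {"T4": [("any", 18, 120, 4.5, 12.0)]}   # invented example range
print(interpret("T4", 14.2, 47, "F", ranges))     # flagged for physician review
```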

  15. Hypercard Another Computer Tool.

    Science.gov (United States)

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  16. Automated Behavior Property Verification Tool

    National Research Council Canada - National Science Library

    Leo, John K

    2008-01-01

    .... A type of CGF in which the entities have limited autonomy is semi-automated forces (SAF). The SAF system for this thesis research is OneSAF, a near real-time SAF that offers raw data collection of the entities in a particular simulation scenario...

  17. ASTROS: A multidisciplinary automated structural design tool

    Science.gov (United States)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  18. Automated validation of a computer operating system

    Science.gov (United States)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.

  19. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  20. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains:    Several new parts such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; Additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...
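
    As a flavour of the tree methods the book treats, the following Python sketch prices a European call with the standard Cox-Ross-Rubinstein binomial tree. This is generic textbook material, not code taken from the book itself.

```python
# A minimal Cox-Ross-Rubinstein binomial-tree sketch for a European call,
# illustrating the kind of tree method covered in the book.
import math

def crr_european_call(S0, K, r, sigma, T, n_steps):
    dt = T / n_steps
    u = math.exp(sigma * math.sqrt(dt))       # up factor
    d = 1.0 / u                               # down factor
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Option values at maturity for each terminal node.
    values = [max(S0 * u**j * d**(n_steps - j) - K, 0.0) for j in range(n_steps + 1)]
    # Backward induction through the tree.
    for _ in range(n_steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

print(round(crr_european_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n_steps=200), 4))
```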

  1. AGWA: The Automated Geospatial Watershed Assessment Tool

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  2. Automated Assessment in a Programming Tools Course

    Science.gov (United States)

    Fernandez Aleman, J. L.

    2011-01-01

    Automated assessment systems can be useful for both students and instructors. Ranking and immediate feedback can have a strongly positive effect on student learning. This paper presents an experience using automatic assessment in a programming tools course. The proposal aims at extending the traditional use of an online judging system with a…

  3. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  4. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the task of providing global monitoring and that of performing automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of storage area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows the status of storage resources to be monitored with fine time granularity and automatic actions to be taken in foreseen cases, such as automatic outage handling and notifications to sites. Hence, human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
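
    The SAAB algorithm itself is not reproduced in the abstract; the Python sketch below only illustrates the general idea of inferring a storage area's status from a window of recent monitoring-test outcomes. The window size, thresholds and status names are invented.

```python
# Sketch of the general idea only (not the actual SAAB algorithm or its
# thresholds): infer a storage area's status from the recent history of
# monitoring-test outcomes and decide whether to exclude or restore it.
def decide_status(history, window=10, blacklist_below=0.3, restore_above=0.8):
    """history: list of booleans, True = test passed, most recent last."""
    recent = history[-window:]
    if not recent:
        return "unknown"
    pass_rate = sum(recent) / len(recent)
    if pass_rate < blacklist_below:
        return "blacklist"       # exclude from production/analysis
    if pass_rate > restore_above:
        return "active"          # restore / keep in use
    return "probation"           # keep monitoring, no automatic action

print(decide_status([True] * 6 + [False] * 4))   # -> 'probation'
```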

  5. Automated Computer Access Request System

    Science.gov (United States)

    Snook, Bryan E.

    2010-01-01

    The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).
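
    As a hypothetical illustration of combining rules-based and role-based routing in the way the abstract describes, here is a small Python sketch. The user attributes, rules and approver roles are invented and do not reflect AutoCAR's actual logic.

```python
# Hypothetical sketch of rules-based plus role-based routing; attributes,
# rules and approver roles below are invented for illustration only.
def route_request(user, approvers):
    """user: dict with 'nationality', 'jsc_affiliate', 'export_controlled';
    approvers: role -> {'primary': address, 'backup': address}."""
    if user["export_controlled"] or user["nationality"] != "US":
        role = "export_control_officer"          # rules-based branch
    elif not user["jsc_affiliate"]:
        role = "sponsor_org_approver"
    else:
        role = "line_manager"                    # role-based default
    assigned = approvers[role]
    return assigned["primary"] or assigned["backup"]   # fall back to backup

approvers = {
    "export_control_officer": {"primary": "eco@example.org", "backup": "eco2@example.org"},
    "sponsor_org_approver": {"primary": "sponsor@example.org", "backup": None},
    "line_manager": {"primary": None, "backup": "deputy@example.org"},
}
user = {"nationality": "US", "jsc_affiliate": True, "export_controlled": False}
print(route_request(user, approvers))   # -> 'deputy@example.org'
```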

  6. Honeywell modular automation system computer software documentation

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211

  7. Future Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), which was held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include wireless communications, advances in wireless video, wireless sensor networking, security in wireless networks, network measurement and management, hybrid and discrete-event systems, internet analytics and automation, robotic systems and applications, reconfigurable automation systems, and machine vision in automation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find them stimulating.

  8. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents an analysis of modern tools for automated testing of various web-based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. One of the main goals of test automation is to gain faster execution compared to manual testing, along with overall cost reduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  9. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  10. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  11. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit - SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity applications.
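
    SMI++ has its own state manager language; the toy Python sketch below only illustrates the underlying idea of hierarchically organized objects whose parent re-evaluates simple rules whenever a child changes state. The object and state names are invented.

```python
# Toy sketch of the finite-state-machine / rule idea behind SMI++ (not the
# SMI++ language itself): sub-system objects hold states, and a parent
# object re-evaluates simple rules whenever a child changes state.
class SubSystem:
    def __init__(self, name, parent=None):
        self.name, self.state, self.parent = name, "NOT_READY", parent

    def set_state(self, state):
        self.state = state
        if self.parent:
            self.parent.on_child_change()      # propagate change upward

class Detector:
    """Parent 'logical' object whose state is derived from its children."""
    def __init__(self, children):
        self.children = children
        for c in children:
            c.parent = self
        self.state = "NOT_READY"

    def on_child_change(self):
        # Rule: READY only when every child is READY; ERROR wins otherwise.
        states = {c.state for c in self.children}
        if "ERROR" in states:
            self.state = "ERROR"
        elif states == {"READY"}:
            self.state = "READY"
        else:
            self.state = "NOT_READY"
        print(f"detector -> {self.state}")

hv, daq = SubSystem("HV"), SubSystem("DAQ")
det = Detector([hv, daq])
hv.set_state("READY")    # detector -> NOT_READY
daq.set_state("READY")   # detector -> READY
```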

  12. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit – SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity applications.

  13. Design of automation tools for management of descent traffic

    Science.gov (United States)

    Erzberger, Heinz; Nedell, William

    1988-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan view traffic display consisting of a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively in combination with a mouse input device to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system consisting of descent advisor algorithm, a library of aircraft performance models, national airspace system data bases, and interactive display software has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational
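
    The Descent Advisor relies on detailed aircraft performance models, airspace data bases and wind information; purely as a back-of-the-envelope illustration of one element, relating distance to go and a scheduled arrival time to an advised speed, here is a small Python sketch with invented numbers.

```python
# Back-of-the-envelope sketch of one piece of the idea: given an aircraft's
# distance to the feeder gate and its scheduled arrival time, compute the
# average ground speed it would need. The real Descent Advisor uses full
# aircraft performance models and wind data; this is only illustrative.
def required_ground_speed(distance_nm, now_s, scheduled_arrival_s):
    """Return required average ground speed in knots, or None if the
    scheduled time has already passed."""
    time_left_h = (scheduled_arrival_s - now_s) / 3600.0
    if time_left_h <= 0:
        return None
    return distance_nm / time_left_h

# 80 nm to the gate, 12 minutes until the scheduled arrival time.
speed = required_ground_speed(distance_nm=80.0, now_s=0.0, scheduled_arrival_s=720.0)
print(f"advise ~{speed:.0f} kt ground speed")   # -> ~400 kt
```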

  14. Computer-Aided Instruction in Automated Instrumentation.

    Science.gov (United States)

    Stephenson, David T.

    1986-01-01

    Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…

  15. Process computers automate CERN power supply installations

    International Nuclear Information System (INIS)

    Ullrich, H.; Martin, A.

    1974-01-01

    Higher standards of performance and reliability in the power plants of large particle accelerators necessitate increasing use of automation. The CERN (European Nuclear Research Centre) in Geneva started to employ process computers for plant automation at an early stage in its history. The great complexity and extent of the plants for high-energy physics first led to the setting-up of decentralized automatic systems which are now being increasingly combined into one interconnected automation system. One of these automatic systems controls and monitors the extensive power supply installations for the main ring magnets in the experimental zones. (orig.) [de

  16. Personal computer based home automation system

    OpenAIRE

    Hellmuth, George F.

    1993-01-01

    The systems engineering process is applied in the development of the preliminary design of a home automation communication protocol. The objective of the communication protocol is to provide a means for a personal computer to communicate with adapted appliances in the home. A needs analysis is used to ascertain that a need exists for a home automation system. Numerous design alternatives are suggested and evaluated to determine the best possible protocol design. Coaxial cable...

  17. Computer automation of a dilution cryogenic system

    International Nuclear Information System (INIS)

    Nogues, C.

    1992-09-01

    This study has been realized in the framework of studies on developing new techniques for low temperature detectors for neutrinos and dark matter. The principles of low temperature physics and of helium-4 and dilution cryostats are first reviewed. The cryogenic system used and the techniques for low temperature thermometry and the regulation systems are then described. The computer automation of the dilution cryogenic system involves: numerical measurement of the parameter set (pressure, temperature, flow rate); computer-assisted operation of the cryostat and the pump bench; numerical regulation of pressure and temperature; and full automation of operation sequences, allowing the system to evolve from one state to another (temperature descent, for example).
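
    The regulation layer is not specified in detail in the abstract; the following Python sketch shows a plain discrete PID controller of the sort such a system might use for temperature regulation. The gains, setpoint and units are invented.

```python
# Illustrative sketch of the numerical temperature-regulation piece (a
# plain discrete PID loop); the actual gains, sensors and heater interface
# of the cryostat control system are not described here and are invented.
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd, self.setpoint = kp, ki, kd, setpoint
        self.integral, self.prev_error = 0.0, None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.5, ki=0.05, kd=0.0, setpoint=100.0)   # setpoint in mK (invented)
print(pid.update(measurement=92.0, dt=1.0))           # suggested heater drive
```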

  18. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  19. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    CUNNINGHAM, L.T.

    1999-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211 and vertical denitration calciner in HC-230C-2

  20. CSS Preprocessing: Tools and Automation Techniques

    Directory of Open Access Journals (Sweden)

    Ricardo Queirós

    2018-01-01

    Cascading Style Sheets (CSS) is a W3C specification for a style sheet language used for describing the presentation of a document written in a markup language, more precisely, for styling Web documents. However, in the last few years, the landscape for CSS development has changed dramatically with the appearance of several languages and tools aiming to help developers build clean, modular and performance-aware CSS. These new approaches give developers mechanisms to preprocess CSS rules through the use of programming constructs, defined as CSS preprocessors, with the ultimate goal of bringing those missing constructs to the CSS realm and fostering structured programming of stylesheets. At the same time, a new set of tools appeared, defined as postprocessors, for extension and automation purposes, covering a broad set of features ranging from identifying unused and duplicate code to applying vendor prefixes. With all these tools and techniques in hand, developers need a consistent workflow to foster modular CSS coding. This paper aims to present an introductory survey on CSS processors. The survey gathers information on a specific set of processors, categorizes them and compares their features against a set of predefined criteria such as maturity, coverage and performance. Finally, we propose a basic set of best practices in order to set up a simple and pragmatic styling code workflow.

  1. A Computational Architecture for Programmable Automation Research

    Science.gov (United States)

    Taylor, Russell H.; Korein, James U.; Maier, Georg E.; Durfee, Lawrence F.

    1987-03-01

    This short paper describes recent work at the IBM T. J. Watson Research Center directed at developing a highly flexible computational architecture for research on sensor-based programmable automation. The system described here has been designed with a focus on dynamic configurability, layered user interfaces and incorporation of sensor-based real time operations into new commands. It is these features which distinguish it from earlier work. The system is currently being implemented at IBM for research purposes and internal use and is an outgrowth of programmable automation research which has been ongoing since 1972 [e.g., 1, 2, 3, 4, 5, 6].

  2. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), which was held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find them stimulating.

  3. Shuttle Repair Tools Automate Vehicle Maintenance

    Science.gov (United States)

    2013-01-01

    Successfully building, flying, and maintaining the space shuttles was an immensely complex job that required a high level of detailed, precise engineering. After each shuttle landed, it entered a maintenance, repair, and overhaul (MRO) phase. Each system was thoroughly checked and tested, and worn or damaged parts replaced, before the shuttle was rolled out for its next mission. During the MRO period, workers needed to record exactly what needed replacing and why, as well as follow precise guidelines and procedures in making their repairs. That meant traceability, and with it lots of paperwork. In 2007, the number of reports generated during electrical system repairs was getting out of hand, placing among the top three systems in terms of paperwork volume. Repair specialists at Kennedy Space Center were unhappy spending so much time at a desk and so little time actually working on the shuttle. "Engineers weren't spending their time doing technical work," says Joseph Schuh, an electrical engineer at Kennedy. "Instead, they were busy with repetitive, time-consuming processes that, while important in their own right, provided a low return on time invested." The strain of such inefficiency was bad enough that slow electrical repairs jeopardized rollout on several occasions. Knowing there had to be a way to streamline operations, Kennedy asked Martin Belson, a project manager with 30 years' experience as an aerospace contractor, to co-lead a team in developing software that would reduce the effort required to document shuttle repairs. The result was System Maintenance Automated Repair Tasks (SMART) software. SMART is a tool for aggregating and applying information on every aspect of repairs, from procedures and instructions to a vehicle's troubleshooting history. Drawing on that data, SMART largely automates the processes of generating repair instructions and post-repair paperwork. In the case of the space shuttle, this meant that SMART had 30 years' worth of operations

  4. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  5. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
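
    GRESS itself instruments FORTRAN source code; the dual-number Python sketch below only illustrates the underlying "computer calculus" idea of propagating derivative values through a model alongside the ordinary values.

```python
# Minimal dual-number (forward-mode) sketch of the "computer calculus" idea
# behind GRESS: propagate derivatives alongside values through a model.
# GRESS works on FORTRAN source; this class only illustrates the mathematics.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def model(x):
    """Toy 'computer model': y = 3*x*x + 2*x + 1."""
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)            # seed dx/dx = 1
y = model(x)
print(y.value, y.deriv)       # 17.0 and dy/dx = 6*x + 2 = 14.0
```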

  6. BEACON: automated tool for Bacterial GEnome Annotation ComparisON.

    Science.gov (United States)

    Kalkatawi, Manal; Alam, Intikhab; Bajic, Vladimir B

    2015-08-18

    Genome annotation is one way of summarizing the existing knowledge about genomic characteristics of an organism. There has been an increased interest during the last several decades in computer-based structural and functional genome annotation. Many methods for this purpose have been developed for eukaryotes and prokaryotes. Our study focuses on comparison of functional annotations of prokaryotic genomes. To the best of our knowledge there is no fully automated system for detailed comparison of functional genome annotations generated by different annotation methods (AMs). The presence of many AMs and development of new ones introduce needs to: a/ compare different annotations for a single genome, and b/ generate annotation by combining individual ones. To address these issues we developed an Automated Tool for Bacterial GEnome Annotation ComparisON (BEACON) that benefits both AM developers and annotation analysers. BEACON provides detailed comparison of gene function annotations of prokaryotic genomes obtained by different AMs and generates extended annotations through combination of individual ones. For the illustration of BEACON's utility, we provide a comparison analysis of multiple different annotations generated for four genomes and show on these examples that the extended annotation can increase the number of genes annotated by putative functions up to 27%, while the number of genes without any function assignment is reduced. We developed BEACON, a fast tool for an automated and a systematic comparison of different annotations of single genomes. The extended annotation assigns putative functions to many genes with unknown functions. BEACON is available under GNU General Public License version 3.0 and is accessible at: http://www.cbrc.kaust.edu.sa/BEACON/ .

  7. BEACON: automated tool for Bacterial GEnome Annotation ComparisON

    KAUST Repository

    Kalkatawi, Manal M.

    2015-08-18

    Background Genome annotation is one way of summarizing the existing knowledge about genomic characteristics of an organism. There has been an increased interest during the last several decades in computer-based structural and functional genome annotation. Many methods for this purpose have been developed for eukaryotes and prokaryotes. Our study focuses on comparison of functional annotations of prokaryotic genomes. To the best of our knowledge there is no fully automated system for detailed comparison of functional genome annotations generated by different annotation methods (AMs). Results The presence of many AMs and development of new ones introduce needs to: a/ compare different annotations for a single genome, and b/ generate annotation by combining individual ones. To address these issues we developed an Automated Tool for Bacterial GEnome Annotation ComparisON (BEACON) that benefits both AM developers and annotation analysers. BEACON provides detailed comparison of gene function annotations of prokaryotic genomes obtained by different AMs and generates extended annotations through combination of individual ones. For the illustration of BEACON’s utility, we provide a comparison analysis of multiple different annotations generated for four genomes and show on these examples that the extended annotation can increase the number of genes annotated by putative functions up to 27 %, while the number of genes without any function assignment is reduced. Conclusions We developed BEACON, a fast tool for an automated and a systematic comparison of different annotations of single genomes. The extended annotation assigns putative functions to many genes with unknown functions. BEACON is available under GNU General Public License version 3.0 and is accessible at: http://www.cbrc.kaust.edu.sa/BEACON/
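
    BEACON's own pipeline is available at the URL above; the Python sketch below is only a schematic illustration of the comparison-and-combination idea: collect per-gene function calls from several annotation methods, report disagreements, and build an extended annotation that covers genes missed by any single method. The method names, gene IDs and tie-break rule are invented.

```python
# Sketch of the comparison/combination idea only (not BEACON's algorithm):
# compare per-gene annotations from several annotation methods and build an
# "extended" annotation that fills genes left unannotated by single methods.
def extend_annotations(per_method):
    """per_method: method name -> {gene_id: function string or None}."""
    genes = set().union(*(ann.keys() for ann in per_method.values()))
    extended, conflicts = {}, {}
    for gene in genes:
        calls = {m: ann.get(gene) for m, ann in per_method.items() if ann.get(gene)}
        if not calls:
            extended[gene] = None                       # still unannotated
        elif len(set(calls.values())) == 1:
            extended[gene] = next(iter(calls.values()))  # methods agree
        else:
            conflicts[gene] = calls                      # methods disagree
            extended[gene] = sorted(calls.values())[0]   # simplistic tie-break
    return extended, conflicts

methods = {
    "AM1": {"g1": "DNA polymerase", "g2": None},
    "AM2": {"g1": "DNA polymerase", "g2": "hypothetical protein"},
}
print(extend_annotations(methods))
```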

  8. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In an age when celerity is expected of every sector of the country, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  9. Computer-aided translation tools

    DEFF Research Database (Denmark)

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: while most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially...

  10. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2000-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes hardware and PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell System

  11. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed

  12. Computer Assisted Advising Tool (CAAT).

    Science.gov (United States)

    Matsen, Marie E.

    Lane Community College's Computer Assisted Advising Tool (CAAT) is used by counselors to assist students in developing a plan for the completion of a degree or certificate. CAAT was designed to facilitate student advisement from matriculation to graduation by comparing degree requirements with the courses completed by students. Three major sources…

  13. Automated and interactive fuel management tools: Past, present and future

    International Nuclear Information System (INIS)

    Cook, A.G.; Casadei, A.L.

    1986-01-01

    The past, present and future status of automated and interactive fuel management tools is reviewed. Issues such as who the customers for these products are, and what they need, are addressed. The nature of the fuel management problem is reviewed. The Westinghouse fuel management tools and methods are presented as an example of how the technology has evolved

  14. Method for automation of tool preproduction

    Science.gov (United States)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool production is the creation or selection of a tool design that can secure high process efficiency, tool availability and the required quality of the obtained surfaces with minimum expenditure of means and resources. For the staff engaged in tool preparation, correctly selecting the appropriate tool from the set of variants takes much time. Program software has been developed to solve this problem; it helps to create, systematize and carry out a comparative analysis of tool designs in order to identify the rational variant under the given production conditions. As indicated in the literature, systematization and selection of the rational tool design are carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and to obtain a significant annual saving.

  15. Robotic Automation in Computer Controlled Polishing

    Science.gov (United States)

    Walker, D. D.; Yu, G.; Bibby, M.; Dunn, C.; Li, H.; Wu, Y.; Zheng, X.; Zhang, P.

    2016-02-01

    We first present a Case Study: the manufacture of 1.4 m prototype mirror-segments for the European Extremely Large Telescope, undertaken by the National Facility for Ultra Precision Surfaces, at the OpTIC facility operated by Glyndwr University. Scale-up to serial-manufacture demands delivery of 1.4 m off-axis aspheric hexagonal segments to high surface precision, and this led us to compare robots and computer numerically controlled ('CNC') polishing machines for optical fabrication. The objective was not to assess which is superior. Rather, it was to understand for the first time their complementary properties, leading us to operate them together as a unit, integrated in hardware and software. Three key areas are reported. First is the novel use of robots to automate currently-manual operations on CNC polishing machines, to improve work-throughput, mitigate risk of damage to parts, and reduce dependence on highly-skilled staff. Second is the use of robots to pre-process surfaces prior to CNC polishing, to reduce total process time. The third draws the threads together, describing our vision of the automated manufacturing cell, where the operator interacts at cell rather than machine level. This promises to deliver a step-change in end-to-end manufacturing times and costs, compared with either platform used on its own or, indeed, the state-of-the-art used elsewhere.

  16. Foundational Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)]

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  17. JTst - An Automated Unit Testing Tool for Java Program

    OpenAIRE

    Kamal Z. Zamli; Nor A. M. Isa

    2008-01-01

    Software testing is an integral part of the software development lifecycle. Lack of testing can often lead to disastrous consequences including loss of data, fortunes, and even lives. Despite its importance, current software testing practice lacks automation and is still primarily based on highly manual processes, from the generation of test cases up to the actual execution of the test. Although the emergence of helpful automated testing tools in the market is blooming, their adoptions are lackin...

  18. Toward a universal, automated facial measurement tool in facial reanimation.

    Science.gov (United States)

    Hadlock, Tessa A; Urban, Luke S

    2012-01-01

    To describe a highly quantitative facial function-measuring tool that yields accurate, objective measures of facial position in significantly less time than existing methods. Facial Assessment by Computer Evaluation (FACE) software was designed for facial analysis. Outputs report the static facial landmark positions and dynamic facial movements relevant in facial reanimation. Fifty individuals underwent facial movement analysis using Photoshop-based measurements and the new software; comparisons of agreement and efficiency were made. Comparisons were made between individuals with normal facial animation and patients with paralysis to gauge sensitivity to abnormal movements. Facial measurements were matched using FACE software and Photoshop-based measures at rest and during expressions. The automated assessments required significantly less time than Photoshop-based assessments. FACE measurements easily revealed differences between individuals with normal facial animation and patients with facial paralysis. FACE software produces accurate measurements of facial landmarks and facial movements and is sensitive to paralysis. Given its efficiency, it serves as a useful tool in the clinical setting for zonal facial movement analysis in comprehensive facial nerve rehabilitation programs.

  19. Computational botany methods for automated species identification

    CERN Document Server

    Remagnino, Paolo; Wilkin, Paul; Cope, James; Kirkup, Don

    2017-01-01

    This book discusses innovative methods for mining information from images of plants, especially leaves, and highlights the diagnostic features that can be implemented in fully automatic systems for identifying plant species. Adopting a multidisciplinary approach, it explores the problem of plant species identification, covering both the concepts of taxonomy and morphology. It then provides an overview of morphometrics, including the historical background and the main steps in the morphometric analysis of leaves together with a number of applications. The core of the book focuses on novel diagnostic methods for plant species identification developed from a computer scientist’s perspective. It then concludes with a chapter on the characterization of botanists' visions, which highlights important cognitive aspects that can be implemented in a computer system to more accurately replicate the human expert’s fixation process. The book not only represents an authoritative guide to advanced computational tools fo...

  20. A Fully Automated Penumbra Segmentation Tool

    DEFF Research Database (Denmark)

    Nagenthiraja, Kartheeban; Ribe, Lars Riisgaard; Hougaard, Kristina Dupont

    2012-01-01

    Introduction: Perfusion- and diffusion weighted MRI (PWI/DWI) is widely used to select patients who are likely to benefit from recanalization therapy. The visual identification of PWI-DWI-mismatch tissue depends strongly on the observer, prompting a need for software which estimates potentially… salvageable tissue, quickly and accurately. We present a fully Automated Penumbra Segmentation (APS) algorithm using PWI and DWI images. We compare the automatically generated PWI-DWI mismatch mask to masks outlined manually by experts, in 168 patients. Method: The algorithm initially identifies PWI lesions… at 600·10⁻⁶ mm²/s. Due to the nature of thresholding, the ADC mask overestimates the DWI lesion volume, and consequently we initialized a level-set algorithm on the DWI image with the ADC mask as prior knowledge. Combining the PWI and inverted DWI masks then yields the PWI-DWI mismatch mask. Four expert raters…

  1. Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

    OpenAIRE

    Loke Mun Sei

    2015-01-01

    Software testing has become a mandatory process in assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, ...

  2. doit – Automation Tool

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available

    This article describes how traditional build tools work, what the shortcomings of this model are for modern software development, and finally how doit solves these problems. doit is written in Python; it comes from the idea of bringing the power of build tools to the execution of any kind of task.
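    As a minimal sketch of how doit expresses tasks, the hypothetical dodo.py below defines two tasks with file dependencies and targets so that doit re-runs them only when their inputs change; the file names are illustrative.

```python
# dodo.py -- a minimal doit task file (illustrative sketch; file names are hypothetical).
# Run with:  doit          (executes tasks whose dependencies changed)
#            doit list     (shows the available tasks)

def task_compress():
    """Compress the report only when the source file has changed."""
    return {
        'actions': ['gzip -kf report.txt'],   # shell action
        'file_dep': ['report.txt'],           # doit re-runs only if this changes
        'targets': ['report.txt.gz'],
    }

def task_wordcount():
    """A pure-Python action: count words in the report."""
    def count_words():
        with open('report.txt') as fh:
            print(len(fh.read().split()))
    return {
        'actions': [count_words],
        'file_dep': ['report.txt'],
        'verbosity': 2,                       # show the printed output
    }
```

    Because actions can be shell commands or plain Python callables, doit generalizes the build-tool dependency model to arbitrary tasks, which is the point the article makes.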

  3. Automated tools for safety-critical software

    International Nuclear Information System (INIS)

    Lapassat, A.M.

    1993-01-01

    The regulator (DSIN), the utilities (EDF, CEA, etc.) and the CEA Institute for Protection and Nuclear Safety (IPSN) work together on French nuclear safety. This paper presents a tool, called CLAIRE, for the simulation and testing of different nuclear safety systems. (TEC)

  4. Automated tools for cross-referencing large databases. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Clapp, N E; Green, P L; Bell, D [and others]

    1997-05-01

    A Cooperative Research and Development Agreement (CRADA) was funded with TRESP Associates, Inc., to develop a limited prototype software package operating on one platform (e.g., a personal computer, small workstation, or other selected device) to demonstrate the concepts of using an automated database application to improve the process of detecting fraud and abuse of the welfare system. An analysis was performed on Tennessee's welfare administration system. This analysis was undertaken to determine if the incidence of welfare waste, fraud, and abuse could be reduced and if the administrative process could be improved to reduce benefits overpayment errors. The analysis revealed a general inability to obtain timely data to support the verification of a welfare recipient's economic status and eligibility for benefits. It has been concluded that the provision of more modern computer-based tools and the establishment of electronic links to other state and federal data sources could increase staff efficiency, reduce the incidence of out-of-date information provided to welfare assistance staff, and make much of the new data required available in real time. Electronic data links have been proposed to allow near-real-time access to data residing in databases located in other states and at federal agency data repositories. The ability to provide these improvements to the local office staff would require the provision of additional computers, software, and electronic data links within each of the offices and the establishment of approved methods of accessing remote databases and transferring potentially sensitive data. In addition, investigations will be required to ascertain if existing laws would allow such data transfers, and if not, what changed or new laws would be required. The benefits, in both cost and efficiency, to the state of Tennessee of having electronically-enhanced welfare system administration and control are expected to result in a rapid return of investment.

  5. BEACON: automated tool for Bacterial GEnome Annotation ComparisON

    KAUST Repository

    Kalkatawi, Manal M.; Alam, Intikhab; Bajic, Vladimir B.

    2015-01-01

    We developed BEACON, a fast tool for an automated and systematic comparison of different annotations of single genomes. The extended annotation assigns putative functions to many genes with unknown functions. BEACON is available under the GNU General Public License version 3.0 and is accessible at: http://www.cbrc.kaust.edu.sa/BEACON/

  6. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  7. TSORT - an automated tool for allocating tasks to training strategies

    International Nuclear Information System (INIS)

    Carter, R.J.; Jorgensen, C.C.

    1986-01-01

    An automated tool (TSORT) that can aid training system developers in determining which training strategy should be applied to a particular task and in grouping similar tasks into training categories has been developed. This paper describes the rationale for TSORT's development and addresses its structure, including training categories, task description dimensions, and categorization metrics. It also provides some information on TSORT's application

  8. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  9. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection… to UML and a test automation principle based on traces written as a kind of regular expressions...

  10. An automated tool for solar power systems

    International Nuclear Information System (INIS)

    Natsheh, E.M.; Natsheh, A.R.; Albarbar, AH

    2014-01-01

    In this paper a novel model of a smart grid-connected solar power system is developed. The model is implemented using the MatLab/SIMULINK software package. An artificial neural network (ANN) algorithm is used to maximize the generated power, based on a maximum power point tracker (MPPT) implementation. The dynamic behavior of the proposed model is examined under different operating conditions. Solar irradiance and temperature data are gathered from a grid-connected, 28.8 kW solar power system located in central Manchester. The developed system and its control strategy exhibit excellent performance, with a tracking efficiency exceeding 94.5%. The proposed model and its control strategy offer a suitable tool for smart grid performance optimization. (author)

  11. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
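    COSTMODL incorporates the Basic, Intermediate and Ada COCOMO models among others. As a rough illustration of the kind of estimate such models produce, the sketch below implements only the published Basic COCOMO relationships (effort = a·KLOC^b person-months, schedule = 2.5·effort^c months); it is not COSTMODL's own code and omits the calibration and life-cycle redefinition features described above.

```python
# Basic COCOMO sketch (published Boehm constants); COSTMODL adds further models
# and user calibration, so treat this only as an illustration of the estimate.
MODES = {
    # mode: (a, b, c) -> effort = a * KLOC**b  [person-months],
    #                    schedule = 2.5 * effort**c  [calendar months]
    'organic':       (2.4, 1.05, 0.38),
    'semi-detached': (3.0, 1.12, 0.35),
    'embedded':      (3.6, 1.20, 0.32),
}

def basic_cocomo(kloc: float, mode: str = 'organic'):
    a, b, c = MODES[mode]
    effort = a * kloc ** b            # person-months
    schedule = 2.5 * effort ** c      # calendar months
    staffing = effort / schedule      # average full-time staff
    return effort, schedule, staffing

effort, months, staff = basic_cocomo(32.0, 'semi-detached')
print(f"Effort: {effort:.1f} PM, Schedule: {months:.1f} months, Avg staff: {staff:.1f}")
```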

  12. Computer simulation and automation of data processing

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1981-01-01

    The principles of computerized simulation and automation of data processing are presented. The automated processing system is constructed according to the module-hierarchical principle. The main operating modes of the system are as follows: preprocessing, installation analysis, interpretation, accuracy analysis and parameter control. The definition of the quasireal experiment, which permits planning of the real experiment, is given. It is pointed out that realization of the quasireal experiment by means of the computerized installation model, with subsequent automated processing, permits scanning of the quantitative aspect of the system as a whole as well as optimal design of installation parameters for obtaining maximum resolution [ru

  13. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    OpenAIRE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software which are used in solving tasks, and of the tools on which the teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, the formulation of new tasks on the basis of a limited number of tools, and fast automated checking are specified. Some methodological comments on application of computer tools and ...

  14. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90 Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90 Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90 Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90 Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low activity samples. The authors' current research activities are focused on expansion of the radiochemical applications of the FIA methodology, with the ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as separation of actinide elements

  15. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    Science.gov (United States)

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
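    To give a concrete sense of what a formalized indicator looks like once turned into a computable query, the sketch below runs a SPARQL query over an RDF patient-data graph with rdflib. The ontology IRI, class and property names and the example indicator are invented for illustration; they are not the ArchMS model or actual CLIF output.

```python
# Hypothetical sketch: executing a formalized quality indicator as a SPARQL
# query over an OWL/RDF patient-data graph. All IRIs, class and property names
# below are invented; ArchMS/CLIF use their own archetype-based model.
from rdflib import Graph

g = Graph()
g.parse("patient_records.ttl", format="turtle")   # hypothetical data file

# Example indicator: number of diabetes patients with an HbA1c measurement.
query = """
PREFIX ex: <http://example.org/clinical#>
SELECT (COUNT(DISTINCT ?p) AS ?numerator)
WHERE {
  ?p a ex:Patient ;
     ex:hasDiagnosis ex:DiabetesMellitus ;
     ex:hasObservation ?obs .
  ?obs a ex:HbA1cMeasurement .
}
"""
for row in g.query(query):
    print("Patients with an HbA1c measurement:", row.numerator)
```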

  16. Automated Classification of Seedlings Using Computer Vision

    DEFF Research Database (Denmark)

    Dyrmann, Mads; Christiansen, Peter

    The objective of this project is to investigate the possibilities of recognizing plant species at multiple growth stages based on RGB images. Plants and leaves are initially segmented from a database through a partly automated procedure providing samples of 2438 plants and 4767 leaves distributed...

  17. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  18. Automated computation of autonomous spectral submanifolds for nonlinear modal analysis

    Science.gov (United States)

    Ponsioen, Sten; Pedergnana, Tiemo; Haller, George

    2018-04-01

    We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.

  19. Visualization Tools for Teaching Computer Security

    Science.gov (United States)

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  20. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  1. Towards full automation of accelerators through computer control

    International Nuclear Information System (INIS)

    Gamble, J.; Hemery, J.-Y.; Kemp, D.; Keyser, R.; Koutchouk, J.-P.; Martucci, P.; Tausch, L.; Vos, L.

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The paper describes this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (Auth.)

  2. Process computers automate CERN power supply installations

    CERN Document Server

    Ullrich, H

    1974-01-01

    Computerized automation systems are being used at CERN, Geneva, to improve the capacity, operational reliability and flexibility of the power supply installations for main ring magnets in the experimental zones of particle accelerators. A detailed account of the technological problem involved is followed in the article by a description of the system configuration, the program system and field experience already gathered in similar schemes. (1 refs).

  3. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talks and the key figures of each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  4. ProofJudge: Automated Proof Judging Tool for Learning Mathematical Logic

    DEFF Research Database (Denmark)

    Villadsen, Jørgen

    2015-01-01

    Today we have software in many artefacts, from medical devices to cars and airplanes, and the software must not only be efficient and intelligent but also reliable and secure. Tests can show the presence of bugs but cannot guarantee their absence. A machine-checked proof using mathematical logic...... pen and paper because no adequate tool was available. The learning problem is how to make abstract concepts of logic as concrete as possible. ProofJudge is a computer system and teaching approach for teaching mathematical logic and automated reasoning which augments the e-learning tool NaDeA (Natural...

  5. ProofJudge: Automated Proof Judging Tool for Learning Mathematical Logic

    DEFF Research Database (Denmark)

    Villadsen, Jørgen

    2016-01-01

    Today we have software in many artefacts, from medical devices to cars and airplanes, and the software must not only be efficient and intelligent but also reliable and secure. Tests can show the presence of bugs but cannot guarantee their absence. A machine-checked proof using mathematical logic...... using pen and paper because no adequate tool was available. The learning problem is how to make abstract concepts of logic as concrete as possible. ProofJudge is a computer system and teaching approach for teaching mathematical logic and automated reasoning which augments the e-learning tool Na...

  6. Bayesian ISOLA: new tool for automated centroid moment tensor inversion

    Science.gov (United States)

    Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John

    2017-04-01

    Focal mechanisms are important for understanding the seismotectonics of a region, and they serve as a basic input for seismic hazard assessment. Usually, the point source approximation and the moment tensor (MT) are used. We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components with instrumental disturbances or poor signal-to-noise ratios are rejected, and full-waveform inversion in a space-time grid around a provided hypocenter. The method is innovative in the following aspects: (i) The CMT inversion is fully automated; no user interaction is required, although the details of the process can be visually inspected later on many figures which are automatically plotted. (ii) The automated process includes detection of disturbances based on the MouseTrap code, so disturbed recordings do not affect the inversion. (iii) A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequencies. (iv) A Bayesian approach is used, so not only the best solution is obtained, but also the posterior probability density function. (v) A space-time grid search, effectively combined with the least-squares inversion of moment tensor components, speeds up the inversion and allows more accurate results to be obtained compared to stochastic methods. The method has been tested on synthetic and observed data. It has been tested by comparison with manually processed moment tensors of all events with M≥3 in the Swiss catalogue over 16 years using data available at the Swiss data center (http://arclink.ethz.ch). The quality of the results of the presented automated process is comparable with careful manual processing of data. The software package, programmed in Python, has been designed to be as versatile as possible in

  7. Computer automation of ultrasonic testing. [inspection of ultrasonic welding

    Science.gov (United States)

    Yee, B. G. W.; Kerlin, E. E.; Gardner, A. H.; Dunmyer, D.; Wells, T. G.; Robinson, A. R.; Kunselman, J. S.; Walker, T. C.

    1974-01-01

    Report describes a prototype computer-automated ultrasonic system developed for the inspection of weldments. This system can be operated in three modes: manual, automatic, and computer-controlled. In the computer-controlled mode, the system will automatically acquire, process, analyze, store, and display ultrasonic inspection data in real-time. Flaw size (in cross-section), location (depth), and type (porosity-like or crack-like) can be automatically discerned and displayed. The results and pertinent parameters are recorded.

  8. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  9. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  10. Evaluation of a New Digital Automated Glycemic Pattern Detection Tool.

    Science.gov (United States)

    Comellas, María José; Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg

    2017-11-01

    Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. These tools emerge in a broad variety, in a more or less nonevaluated manner. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying, early in the development and implementation of the tool, areas of improvement and potential safety risks. Multicenter web-based evaluation in which 37 endocrinologists were asked to assess glycemic patterns of 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on time spent to analyze each report and agreement on the presence or absence of defined patterns. The eDetecta module markedly reduced the time taken to analyze each case relative to manual analysis of the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min). Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement on patterns of glycemic variability. Further analysis of the low-agreement cases led to identifying areas where the algorithms used could be improved to optimize trend pattern identification. eDetecta was a useful tool for glycemic pattern detection, helping clinicians to reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study.

  11. USSR Report, Cybernetics Computers and Automation Technology

    Science.gov (United States)

    1985-09-05

    organization, the SKALD program utilizes a dictionary or data base to generate SKALD poetry at the computer center of Minsk State Pedagogical … wonderful capabilities at the Krasnoyarsk branch of the USSR AN [Academy of Sciences] Siberian section's Computer Center. They began training the kids

  12. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using engineering software such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of the graphical documentation of a power system. In the fourth chapter, the application of software tools in the project management in power systems ...

  13. Java Tool Framework for Automation of Hardware Commissioning and Maintenance Procedures

    International Nuclear Information System (INIS)

    Ho, J C; Fisher, J M; Gordon, J B; Lagin, L J; West, S L

    2007-01-01

    The National Ignition Facility (NIF) is a 192-beam laser system designed to study high energy density physics. Each beam line contains a variety of line replaceable units (LRUs) that contain optics, stepping motors, sensors and other devices to control and diagnose the laser. During commissioning and subsequent maintenance of the laser, LRUs undergo a qualification process using the Integrated Computer Control System (ICCS) to verify and calibrate the equipment. The commissioning processes are both repetitive and tedious when we use remote manual computer controls, making them ideal candidates for software automation. Maintenance and Commissioning Tool (MCT) software was developed to improve the efficiency of the qualification process. The tools are implemented in Java, leveraging ICCS services and CORBA to communicate with the control devices. The framework provides easy-to-use mechanisms for handling configuration data, task execution, task progress reporting, and generation of commissioning test reports. The tool framework design and application examples will be discussed

  14. Glycan arrays and other tools produced by automated glycan assembly

    Directory of Open Access Journals (Sweden)

    Peter H. Seeberger

    2017-01-01

    Full Text Available Carbohydrates are the dominant biopolymer on earth and play important roles ranging from building material for plants to function in many biological systems. Glycans remain poorly studied due to a lack of synthetic tools. The goal of my laboratory has been to develop a general method for the automated assembly of glycans. The general protocols we developed resulted in the commercialisation of the Glyconeer 2.1™ synthesizer as well as the building blocks and all reagents. Oligosaccharides as long as 50-mers are now accessible within days. Rapid access to defined oligosaccharides has been the foundation to many applications including synthetic tools such as glycan microarrays, glycan nanoparticles and anti-glycan antibodies. The platform technology is helping to address real-life problems by the creation of new vaccines and diagnostics. After addressing mainly mammalian glycobiology earlier, material science and plant biology are benefitting increasingly from synthetic glycans.

  15. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.
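    The sketch below illustrates the kind of exhaustive search loop such a tool automates: enumerate candidate design options, evaluate each with the simulation engine, and keep the best combination. Here a simple surrogate function stands in for an EnergyPlus run, and all parameter names and coefficients are invented for illustration; this is not the tool's actual module structure or API.

```python
# Hypothetical sketch of an automated "what-if" search over building options.
# simulate_annual_energy() is a stand-in for a full simulation run.
import itertools

def simulate_annual_energy(params: dict) -> float:
    """Surrogate for a simulation run: returns annual energy use in kWh."""
    base = 120_000.0
    base -= 90_000.0 * params["wall_insulation_m"]   # thicker insulation saves energy
    base += 4_000.0 * params["window_u_value"]       # poorer glazing costs energy
    base -= 3_000.0 * params["overhang_depth_m"]     # shading reduces cooling load
    return base

search_space = {
    "wall_insulation_m": [0.05, 0.10, 0.15],
    "window_u_value":    [1.2, 1.8, 2.4],
    "overhang_depth_m":  [0.0, 0.5, 1.0],
}

best_energy, best_params = float("inf"), None
for combo in itertools.product(*search_space.values()):
    params = dict(zip(search_space.keys(), combo))
    energy = simulate_annual_energy(params)
    if energy < best_energy:
        best_energy, best_params = energy, params

print(f"Best option: {best_params} -> {best_energy:.0f} kWh/year")
```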

  16. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    Science.gov (United States)

    Schovancová, J.; Campana, S.; Di Girolamo, A.; Jézéquel, S.; Ueda, I.; Wenaus, T.; Atlas Collaboration

    2014-06-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or a service expert; ATLAS national contacts and sites, for the real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and re-usability of the visualization bits across the different tools. A rich family of filtering and searching options enhancing the available user interfaces comes naturally with the separation of the data and visualization layers. With a variety of reliable monitoring data accessible through standardized interfaces, automating actions under well-defined conditions that correlate multiple data sources has become feasible. In this contribution we also discuss the automated exclusion of degraded resources and their automated recovery in various activities.

  17. Mathematical support for automated geometry analysis of lathe machining of oblique peakless round-nose tools

    Science.gov (United States)

    Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.

    2017-01-01

    Automation of engineering processes requires developing relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper is focused on developing a procedure for determining the geometry of lathe machining with oblique peakless round-nose tools, with the use of vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in distinction to the traditional analytic description. This advantage is very promising for developing automated control of the preproduction process. A kinematic criterion for the applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
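    A minimal numpy sketch of the vector/matrix idea follows; it is not the authors' formulation, and the angle names and values are assumptions. It only shows how successive rotations compose to re-express a tool-face normal in the machine coordinate system, from which a working angle can be read off.

```python
# Minimal sketch (not the authors' formulation): composing rotation matrices to
# transform the tool's rake-face normal into the machine coordinate system.
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

inclination = np.radians(10.0)   # blade inclination angle (assumed value)
rake        = np.radians(8.0)    # nominal rake angle (assumed value)

# Normal of the rake face in the tool's own coordinate system.
n_tool = np.array([0.0, 0.0, 1.0])

# Apply inclination about X, then rake about Z, to express the normal
# in the machine (workpiece) coordinate system.
n_machine = rot_z(rake) @ rot_x(inclination) @ n_tool

# Working angle between the rake-face normal and the machine Z axis.
working_angle = np.degrees(np.arccos(n_machine @ np.array([0.0, 0.0, 1.0])))
print(f"Transformed normal: {n_machine.round(4)}, working angle: {working_angle:.2f} deg")
```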

  18. Chinese-English Automation and Computer Technology Dictionary, Volume 2.

    Science.gov (United States)

    1980-08-01

    Chinese-English Automation and Computer Technology Dictionary, Vol. 2. [Scanned cover stamp and sample entries; recoverable glossary fragments: zhuangbei (information link); tongxin lianjie zhuangzhi (communication link); tongxin shebei (communications equipment); communications facility.]

  19. Automated computation of one-loop integrals in massless theories

    International Nuclear Information System (INIS)

    Hameren, A. van; Vollinga, J.; Weinzierl, S.

    2005-01-01

    We consider one-loop tensor and scalar integrals, which occur in a massless quantum field theory, and we report on the implementation into a numerical program of an algorithm for the automated computation of these one-loop integrals. The number of external legs of the loop integrals is not restricted. All calculations are done within dimensional regularization. (orig.)

  20. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  1. USSR Report, Cybernetics, Computers and Automation Technology

    Science.gov (United States)

    1987-03-31

    A version of the system was tested by adapting PAL-11 and MACRO-11 assembly code for the "Elektronika-60" and "Elektronika-60M" computers; ASM-86 for the… GS, "On the Results of Evaluation of Insurance Payments in Collective and State Farms and Private Households," the actuarial analysis tables based…

  2. Computational tasks in robotics and factory automation

    NARCIS (Netherlands)

    Biemans, Frank P.; Vissers, C.A.

    1988-01-01

    The design of Manufacturing Planning and Control Systems (MPCSs), systems that negotiate with Customers and Suppliers to exchange products in return for money in order to generate profit, is discussed. The computational tasks of MPCS components are systematically specified as a starting point for

  3. USSR Report: Cybernetics, Computers and Automation Technology

    Science.gov (United States)

    1986-12-03

    [Georgian SSR Academy of Sciences: "Ready for Dialogue"] [Text] Computers in schools, auditoria, and educational laboratories are a phenomenon to which we… professional-technical academies and VUZ auditoria. Obviously, the color of the screens and the characters on them is of major importance for people

  4. Automated Reporting of DXA Studies Using a Custom-Built Computer Program.

    Science.gov (United States)

    England, Joseph R; Colletti, Patrick M

    2018-06-01

    Dual-energy x-ray absorptiometry (DXA) scans are a critical population health tool and relatively simple to interpret but can be time consuming to report, often requiring manual transfer of bone mineral density and associated statistics into commercially available dictation systems. We describe here a custom-built computer program for automated reporting of DXA scans using Pydicom, an open-source package built in the Python computer language, and regular expressions to mine DICOM tags for patient information and bone mineral density statistics. This program, easy to emulate by any novice computer programmer, has doubled our efficiency at reporting DXA scans and has eliminated dictation errors.
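    The sketch below illustrates the general approach the abstract describes, combining pydicom and regular expressions to pre-populate a report; it is not the authors' program. The file name, the tag used for the results text, and the regular expression are assumptions, since the tag holding BMD summary text is vendor-specific.

```python
# Illustrative sketch: mining DICOM tags with pydicom and regular expressions
# to pre-populate a DXA report. The tag holding the BMD summary text is
# vendor-specific; the keyword and file name used here are assumptions.
import re
import pydicom

ds = pydicom.dcmread("dxa_study.dcm")            # hypothetical file name

patient = {
    "name": str(ds.get("PatientName", "Unknown")),
    "id":   str(ds.get("PatientID", "Unknown")),
    "dob":  str(ds.get("PatientBirthDate", "")),
}

# Assume the scanner stores a plain-text results block in this tag.
results_text = str(ds.get("ImageComments", ""))

# Pull out lines such as "L1-L4 BMD: 0.982 g/cm2  T-score: -1.3"
pattern = re.compile(
    r"(?P<region>[\w\-]+)\s+BMD:\s*(?P<bmd>[\d.]+)\s*g/cm2\s+T-score:\s*(?P<t>-?[\d.]+)"
)
rows = [m.groupdict() for m in pattern.finditer(results_text)]

report = [f"Patient: {patient['name']} (ID {patient['id']})"]
for r in rows:
    report.append(f"{r['region']}: BMD {r['bmd']} g/cm2, T-score {r['t']}")
print("\n".join(report))
```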

  5. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Science.gov (United States)

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  6. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design… management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying...

  7. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.

  8. Computer simulation boosts automation in the stockyard

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-04-01

    Today's desktop computers and advanced software keep pace with handling equipment to reach new heights of sophistication, with graphic simulation able to show precisely what is happening, and what could happen, in the coal terminal's stockyard. The article describes an innovative coal terminal nearing completion on the Pacific coast at Lazaro Cardenas in Mexico, called the Petracalco terminal. Here coal is unloaded, stored and fed to the nearby power plant of Pdte Plutarco Elias Calles. The R & D department of the Italian company Techint, Italimpianti has developed MHATIS, a sophisticated software system for marine terminal management here, allowing analysis of performance with the use of graphical animation. Strategies can be tested before being put into practice and likely power station demand can be predicted. The design and operation of the MHATIS system is explained. Other integrated coal handling plants described in the article are that developed by the then PWH (renamed Krupp Foerdertechnik) of Germany for the Israel Electric Corporation and the installation by the same company of a further bucketwheel for a redesigned coal stockyard at the Port of Hamburg operated by Hansaport. 1 fig., 4 photos.

  9. Automated System for Teaching Computational Complexity of Algorithms Course

    Directory of Open Access Journals (Sweden)

    Vadim S. Roublev

    2017-01-01

    Full Text Available This article describes the problems of designing an automated teaching system for the "Computational complexity of algorithms" course. This system should provide students with the means to familiarize themselves with a complex mathematical apparatus and improve their mathematical thinking in the respective area. The article introduces the technique of the algorithm symbol scroll table, which allows lower and upper bounds of computational complexity to be estimated. Further, we introduce a set of theorems that facilitate the analysis in cases when integer rounding of algorithm parameters is involved and when analyzing the complexity of a sum. At the end, the article introduces a normal system of symbol transformations that both allows one to perform any symbol transformation and simplifies the automated validation of such transformations. The article is published in the authors' wording.

  10. ASPECT (Automated System-level Performance Evaluation and Characterization Tool), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI has developed a suite of SAA tools and an analysis capability referred to as ASPECT (Automated System-level Performance Evaluation and Characterization Tool)....

  11. Tools for automating the imaging of zebrafish larvae.

    Science.gov (United States)

    Pulak, Rock

    2016-03-01

    The VAST BioImager system is a set of tools developed for zebrafish researchers who require the collection of images from a large number of 2-7 dpf zebrafish larvae. The VAST BioImager automates larval handling, positioning and orientation tasks. Color images at about 10 μm resolution are collected from the on-board camera of the system. If images of greater resolution and detail are required, the system is mounted on an upright microscope, such as a confocal or fluorescence microscope, to utilize its capabilities. The system loads a larva, positions it in view of the camera, determines its orientation using pattern recognition analysis, and then positions it more precisely to a user-defined orientation for optimal imaging of any desired tissue or organ system. Multiple images of the same larva can be collected. The specific part of each larva and the desired orientation and position are identified by the researcher, and an experiment defining the settings and a series of steps can be saved and repeated for imaging of subsequent larvae. The system captures images, then ejects and loads another larva from either a bulk reservoir, a well of a 96-well plate using the LP Sampler, or individually targeted larvae from a Petri dish or other container using the VAST Pipettor. Alternative manual protocols for handling larvae for image collection are tedious and time consuming. The VAST BioImager automates these steps to allow for greater throughput of assays and screens requiring high-content image collection of zebrafish larvae, such as might be used in drug discovery and toxicology studies. Copyright © 2015 The Author. Published by Elsevier Inc. All rights reserved.

  12. A novel tool for automated evaluation of radiographic weld images

    International Nuclear Information System (INIS)

    Rajagopalan, C.; Venkatraman, B.; Jayakumar, T.; Kalyanasundaram, P.; Raj, B.

    2004-01-01

    Radiography is one of the oldest and most widely used NDT methods for the detection of volumetric defects in welds and castings. Once a radiograph of a weld, a casting or an assembly is taken, the radiographer examines it. The task of the radiographer consists of identifying the defects and quantitatively evaluating them based on codes and specifications. Radiographic interpretation primarily depends on the expertise of the individual radiographer. To overcome the subjectivity involved in human interpretation, it is thus desirable to develop a computer-based automated system to aid in the interpretation of radiographs. Towards this goal, the authors have developed a flowchart chalking out the various stages involved. Typical weld images of tube-to-tubesheet weld joints were digitised using a high-resolution digitiser. The images were segmented and 52 invariant moments were computed to be used as features. The results of these are presented in this paper. Once the features (invariant moments) are extracted and ranked, a neural network classifier based on error back-propagation has to classify the (top ranking) features and evaluate the image for acceptance or rejection. (author)
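    For orientation, the sketch below computes the classic seven Hu invariant moments from a segmented defect region with OpenCV; the paper uses a larger 52-moment feature set, so this only illustrates the general feature-extraction step, and the image file name is hypothetical.

```python
# Illustrative sketch: seven Hu invariant moments from a segmented weld-defect
# region. Not the authors' 52-feature set; only the general extraction step.
import cv2
import numpy as np

img = cv2.imread("weld_region.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file
_, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

m = cv2.moments(mask, binaryImage=True)
hu = cv2.HuMoments(m).flatten()

# Log-scale the moments so they are numerically comparable across defects.
features = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
print("Hu moment features:", np.round(features, 3))
```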

  13. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2014-08-01

    Full Text Available The article presents the results of an analysis of the standard computer tools of dynamic mathematics software which are used in solving tasks, and of the tools on which the teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, the formulation of new tasks on the basis of a limited number of tools, and fast automated checking are specified. Some methodological comments on the application of computer tools and methodological features of the use of interactive mathematical environments are presented. Problems arising from the use of computer tools are identified, among them the rethinking of forms and methods of training by the teacher, the search for creative problems, the rational choice of environment, the checking of e-solutions, and common mistakes in the use of computer tools.

  14. Automating ATLAS Computing Operations using the Site Status Board

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Campana, S; Di Girolamo, A; Espinal Curull, X; Gayazov, S; Magradze, E; Nowotka, MM; Rinaldi, L; Saiz, P; Schovancova, J; Stewart, GA; Wright, M

    2012-01-01

    The automation of operations is essential to reduce manpower costs and improve the reliability of the system. The Site Status Board (SSB) is a framework which allows Virtual Organizations to monitor their computing activities at distributed sites and to evaluate site performance. The ATLAS experiment intensively uses SSB for the distributed computing shifts, for estimating data processing and data transfer efficiencies at a particular site, and for implementing automatic exclusion of sites from computing activities, in case of potential problems. ATLAS SSB provides a real-time aggregated monitoring view and keeps the history of the monitoring metrics. Based on this history, usability of a site from the perspective of ATLAS is calculated. The presentation will describe how SSB is integrated in the ATLAS operations and computing infrastructure and will cover implementation details of the ATLAS SSB sensors and alarm system, based on the information in SSB. It will demonstrate the positive impact of the use of SS...

  15. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Arup Ghosh

    2016-01-01

    Full Text Available Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.

  16. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    Science.gov (United States)

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.
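
    Both records above describe a nominal model mined from PLC log data and a hash-table index used to flag deviations at run time. A minimal sketch of that idea, with hypothetical signal names and a simplified (signal, value) log format, is shown below: the nominal model is a dictionary keyed by observed signal transitions, and any transition absent from it is reported as an anomaly.

        from collections import defaultdict

        def build_nominal_model(training_log):
            """Index every observed (signal, previous_value, new_value) transition."""
            model = defaultdict(int)
            last = {}
            for signal, value in training_log:            # log entries as (name, value) pairs
                key = (signal, last.get(signal), value)   # hash-table key for the transition
                model[key] += 1
                last[signal] = value
            return model

        def detect_anomalies(model, live_log):
            """Report any transition never seen during nominal operation."""
            anomalies = []
            last = {}
            for signal, value in live_log:
                key = (signal, last.get(signal), value)
                if key not in model:                      # O(1) hash lookup
                    anomalies.append(key)
                last[signal] = value
            return anomalies

        # Hypothetical log-data records of PLC signals.
        nominal = [("conveyor", 0), ("conveyor", 1), ("gripper", 0), ("conveyor", 0)]
        live = [("conveyor", 0), ("conveyor", 1), ("gripper", 2)]   # gripper=2 never seen

        model = build_nominal_model(nominal)
        print(detect_anomalies(model, live))   # -> [('gripper', None, 2)]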

  17. GOPET: A tool for automated predictions of Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Glatting Karl-Heinz

    2006-03-01

    Full Text Available Abstract Background Vast progress in sequencing projects has called for annotation on a large scale. A number of methods have been developed to address this challenging task. These methods, however, either apply to specific subsets, or their predictions are not formalised, or they do not provide precise confidence values for their predictions. Description We recently established a learning system for automated annotation, trained with a broad variety of different organisms to predict the standardised annotation terms from Gene Ontology (GO). Now, this method has been made available to the public via our web-service GOPET (Gene Ontology term Prediction and Evaluation Tool). It supplies annotation for sequences of any organism. For each predicted term an appropriate confidence value is provided. The basic method had been developed for predicting molecular function GO-terms. It is now expanded to predict biological process terms. This web service is available via http://genius.embnet.dkfz-heidelberg.de/menu/biounit/open-husar Conclusion Our web service gives experimental researchers as well as the bioinformatics community a valuable sequence annotation device. Additionally, GOPET also provides less significant annotation data which may serve as an extended discovery platform for the user.

  18. HPCToolkit: performance tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M [Department of Computer Science, Rice University, Houston, TX 77005 (United States)

    2008-07-15

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei.

  19. HPCToolkit: performance tools for scientific computing

    International Nuclear Information System (INIS)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M

    2008-01-01

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei

  20. A computational framework for automation of point defect calculations

    International Nuclear Information System (INIS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    2017-01-01

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
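
    As an illustration of the quantities such a framework automates, the sketch below evaluates the standard charged-defect formation-energy expression with the three correction terms listed above. The function signature and every numerical value are hypothetical placeholders for illustration only, not the package's actual API.

        def defect_formation_energy(E_defect, E_host, removed_atoms, chem_pots,
                                    charge, E_fermi, E_vbm, pot_align, E_image, E_bandfill):
            """E_f = E_def - E_host + sum(n_i * mu_i) + q*(E_F + E_VBM + dV) + corrections."""
            dE = E_defect - E_host
            dE += sum(n * chem_pots[species] for species, n in removed_atoms.items())
            dE += charge * (E_fermi + E_vbm + pot_align)    # (1) potential alignment
            dE += E_image                                   # (2) image-charge correction
            dE += E_bandfill                                # (3) band-filling correction
            return dE

        # Hypothetical numbers (eV) for a +2 oxygen vacancy in ZnO.
        Ef = defect_formation_energy(
            E_defect=-215.3, E_host=-220.1,
            removed_atoms={"O": 1}, chem_pots={"O": -4.9},
            charge=+2, E_fermi=0.5, E_vbm=2.1,
            pot_align=0.08, E_image=0.35, E_bandfill=-0.02)
        print(f"Formation energy: {Ef:.2f} eV")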

  1. Final Report: Correctness Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  2. Automated cutting in the food industry using computer vision

    KAUST Repository

    Daley, Wayne D R

    2012-01-01

    The processing of natural products has posed a significant problem to researchers and developers involved in the development of automation. The challenges have come from areas such as sensing, grasping and manipulation, as well as product-specific areas such as cutting and handling of meat products. Meat products are naturally variable, and fixed automation is at the limit of its ability to accommodate them. Intelligent automation systems (such as robots) are also challenged, mostly because of a lack of knowledge of the physical characteristics of the individual products. Machine vision has helped to address some of these shortcomings but underperforms in many situations. Developments in sensors, software and processing power are now offering capabilities that will help to make more of these problems tractable. In this chapter we describe some of the developments that are underway in terms of computer vision for meat product applications, the problems they are addressing and potential future trends. © 2012 Woodhead Publishing Limited All rights reserved.

  3. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    AN EVALUATION OF THE AUTOMATED COST ESTIMATING INTEGRATED TOOLS (ACEIT) SYSTEM. Thesis, AFIT/GCA/LSQ/89S-5, Caroline L. Hanson, Major, USAF. Only the scanned cover-page information is recoverable for this record; no abstract text survives.

  4. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  5. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  6. Translation Memory and Computer Assisted Translation Tool for Medieval Texts

    Directory of Open Access Journals (Sweden)

    Törcsvári Attila

    2013-05-01

    Full Text Available Translation memories (TMs), as part of Computer Assisted Translation (CAT) tools, support translators in reusing portions of formerly translated text. Fencing books are good candidates for using TMs due to the high number of repeated terms. Medieval texts suffer from a number of drawbacks that make even a “simple” rewording into the modern version of the same language hard. The analyzed difficulties are: lack of systematic spelling, unusual word orders and typos in the original. A hypothesis is made and verified that even simple modernization increases legibility and is feasible, and that it is worthwhile to apply translation memories due to the numerous and even extremely long repeated terms. Therefore, methods and algorithms are presented (1) for automated transcription of medieval texts (when a limited training set is available), and (2) for the collection of repeated patterns. The efficiency of the algorithms is analyzed for recall and precision.

  7. Supplementary Material for: BEACON: automated tool for Bacterial GEnome Annotation ComparisON

    KAUST Repository

    Kalkatawi, Manal M.; Alam, Intikhab; Bajic, Vladimir B.

    2015-01-01

    Abstract Background Genome annotation is one way of summarizing the existing knowledge about genomic characteristics of an organism. There has been an increased interest during the last several decades in computer-based structural and functional genome annotation. Many methods for this purpose have been developed for eukaryotes and prokaryotes. Our study focuses on comparison of functional annotations of prokaryotic genomes. To the best of our knowledge there is no fully automated system for detailed comparison of functional genome annotations generated by different annotation methods (AMs). Results The presence of many AMs and development of new ones introduce needs to: a/ compare different annotations for a single genome, and b/ generate annotation by combining individual ones. To address these issues we developed an Automated Tool for Bacterial GEnome Annotation ComparisON (BEACON) that benefits both AM developers and annotation analysers. BEACON provides detailed comparison of gene function annotations of prokaryotic genomes obtained by different AMs and generates extended annotations through combination of individual ones. For the illustration of BEACON's utility, we provide a comparison analysis of multiple different annotations generated for four genomes and show on these examples that the extended annotation can increase the number of genes annotated by putative functions up to 27 %, while the number of genes without any function assignment is reduced. Conclusions We developed BEACON, a fast tool for an automated and a systematic comparison of different annotations of single genomes. The extended annotation assigns putative functions to many genes with unknown functions. BEACON is available under GNU General Public License version 3.0 and is accessible at: http://www.cbrc.kaust.edu.sa/BEACON/ .

  8. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems

  9. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab
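
    GRESS itself augments existing FORTRAN source so that derivatives are propagated alongside values. The Python sketch below illustrates the same underlying idea generically with forward-mode automatic differentiation over dual numbers; it is an illustration of the technique, not of GRESS, and the toy model function is made up.

        class Dual:
            """Value plus derivative with respect to one chosen input."""
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)

            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value * other.value,
                            self.deriv * other.value + self.value * other.deriv)

            __rmul__ = __mul__

        def model(k, t):
            """Toy stand-in for a physics model: response = k*t + 0.5*k*k."""
            return k * t + 0.5 * k * k

        # Sensitivity of the model output to k at k = 2.0, t = 3.0.
        k = Dual(2.0, 1.0)            # seed derivative d(k)/d(k) = 1
        out = model(k, 3.0)
        print(out.value, out.deriv)   # value 8.0, sensitivity d(out)/dk = 5.0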

  10. A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data

    Science.gov (United States)

    Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.

    2011-01-01

    A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography. This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2-D sheets in the vertical direction in addition to volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and under proper conditions, is FULLY AUTOMATED and requires no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Please inquire with the presenting author if further interested. This software differentiates itself in total from other possible re-slicing software solutions due to complete automation and advanced processing and analysis capabilities.
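
    A minimal sketch of the unwrap-and-re-slice idea on a single CT slice, assuming the cylinder axis is already centred in the image: the slice is resampled from Cartesian to (radius, angle) coordinates with scipy, so a full stack of such slices would unwrap into the 2-D "sheets" described above. The synthetic ring image stands in for real CT data; this is not the NASA software itself.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def unwrap_slice(slice2d, n_radii=200, n_angles=720):
            """Resample one CT slice from (x, y) to (radius, angle) coordinates."""
            cy, cx = (np.asarray(slice2d.shape) - 1) / 2.0   # assume axis at image centre
            radii = np.linspace(0, min(cx, cy), n_radii)
            angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
            r, a = np.meshgrid(radii, angles, indexing="ij")
            ys = cy + r * np.sin(a)
            xs = cx + r * np.cos(a)
            # Interpolate the Cartesian slice at the polar sample positions.
            return map_coordinates(slice2d, [ys, xs], order=1)

        # Synthetic slice with a bright ring, standing in for a pipe wall.
        y, x = np.mgrid[:256, :256]
        ring = (np.hypot(x - 127.5, y - 127.5) // 10 == 8).astype(float)
        unwrapped = unwrap_slice(ring)
        print(unwrapped.shape)   # (200, 720): rows are radii, columns are angles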

  11. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and have had a profound impact on the software engineering process itself...

  12. Decomposition recovery extension to the Computer Aided Prototyping System (CAPS) change-merge tool.

    OpenAIRE

    Keesling, William Ronald

    1997-01-01

    Approved for public release; distribution is unlimited A promising use of Computer Aided Prototyping System (CAPS) is to support concurrent design. Key to success in this context is the ability to automatically and reliably combine and integrate the prototypes produced in concurrent efforts. Thus, to be of practical use in this as well as most prototyping contexts, a CAPS tool must have a fast, automated, reliable prototype integration capability. The current CAPS Change Merge Tool is fast...

  13. Automated Generation of User Guidance by Combining Computation and Deduction

    Directory of Open Access Journals (Sweden)

    Walther Neuper

    2012-02-01

    Full Text Available Herewith, a fairly old concept is published for the first time and named "Lucas Interpretation". This has been implemented in a prototype, which has been proved useful in educational practice and has gained academic relevance with an emerging generation of educational mathematics assistants (EMA based on Computer Theorem Proving (CTP. Automated Theorem Proving (ATP, i.e. deduction, is the most reliable technology used to check user input. However ATP is inherently weak in automatically generating solutions for arbitrary problems in applied mathematics. This weakness is crucial for EMAs: when ATP checks user input as incorrect and the learner gets stuck then the system should be able to suggest possible next steps. The key idea of Lucas Interpretation is to compute the steps of a calculation following a program written in a novel CTP-based programming language, i.e. computation provides the next steps. User guidance is generated by combining deduction and computation: the latter is performed by a specific language interpreter, which works like a debugger and hands over control to the learner at breakpoints, i.e. tactics generating the steps of calculation. The interpreter also builds up logical contexts providing ATP with the data required for checking user input, thus combining computation and deduction. The paper describes the concepts underlying Lucas Interpretation so that open questions can adequately be addressed, and prerequisites for further work are provided.

  14. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  15. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

    Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed signal (digital/analog/RF...

  16. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  17. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with an automated tool that enables the implementation of Software Engineering techniques oriented to achieving an acceptable level of quality at the release process. Specifically, this thesis develops the testbed concept a...

  18. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time and labor consuming process. In this research work, this challenge is picked up and a methodology and algorithms for automated design of intelligent integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid-prototyping of such systems under realization constraints and, additionally, includes features of system instance specific self-correction for sustained operation of a large volume and in a dynamically changing environment. The extension of these concepts to the reconfigurable hardware platform renders so called self-x sensor systems, which stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. By our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.

  19. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.
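
    The kind of input-to-output uncertainty propagation described here can be sketched with a simple Monte Carlo loop: sample the uncertain inputs from assumed distributions, run the model for each sample, and summarize the spread of the computed output. The model below is a toy placeholder, not FRAP, and the input distributions are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(42)

        def clad_temperature(power, gap_conductance, coolant_temp):
            """Toy stand-in for a fuel-rod thermal model (not the real code)."""
            return coolant_temp + power / gap_conductance

        n = 10_000
        # Assumed input uncertainties, here normal/lognormal for illustration.
        power = rng.normal(20.0, 1.0, n)              # linear power, kW/m
        gap_h = rng.lognormal(np.log(0.5), 0.2, n)    # gap conductance, kW/m/K
        t_cool = rng.normal(560.0, 5.0, n)            # coolant temperature, K

        temps = clad_temperature(power, gap_h, t_cool)
        mean, std = temps.mean(), temps.std()
        lo, hi = np.percentile(temps, [2.5, 97.5])
        print(f"Cladding temperature: {mean:.0f} +/- {std:.0f} K (95% band {lo:.0f}-{hi:.0f} K)")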

  20. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA - the Production and Distributed Analysis Workload Management System - has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available for bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even when running on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run on a distributed computing environment powered by PanDA. To run the pipeline we split input files into chunks which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to the way ATLAS processes and simulates data. We dramatically decreased the total wall time thanks to automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid can reduce payload execution time for mammoth DNA samples from weeks to days.
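
    The split-run-merge pattern described above (divide the sequencing input into chunks, process each chunk as a separate job, then concatenate the outputs) can be sketched as follows. The script name paleomix_chunk.sh and the file names are hypothetical placeholders, and a local process pool stands in for PanDA job brokering.

        import subprocess
        from concurrent.futures import ProcessPoolExecutor
        from pathlib import Path

        def split_reads(fastq_path, lines_per_chunk=4_000_000):
            """Split a FASTQ file into chunks of whole 4-line records."""
            chunks, buf, idx = [], [], 0
            with open(fastq_path) as fh:
                for i, line in enumerate(fh, 1):
                    buf.append(line)
                    if i % lines_per_chunk == 0:
                        chunks.append(_write_chunk(buf, idx))
                        buf, idx = [], idx + 1
            if buf:
                chunks.append(_write_chunk(buf, idx))
            return chunks

        def _write_chunk(lines, idx):
            path = Path(f"chunk_{idx:04d}.fastq")
            path.write_text("".join(lines))
            return path

        def run_chunk(chunk):
            """Run the per-chunk pipeline job (placeholder command)."""
            out = chunk.with_suffix(".out")
            subprocess.run(["./paleomix_chunk.sh", str(chunk), str(out)], check=True)
            return out

        if __name__ == "__main__":
            chunks = split_reads("mammoth_sample.fastq")
            with ProcessPoolExecutor() as pool:          # stands in for PanDA brokering
                outputs = list(pool.map(run_chunk, chunks))
            with open("merged.out", "wb") as merged:     # final merge step
                for out in outputs:
                    merged.write(out.read_bytes())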

  1. Automated breast segmentation in ultrasound computer tomography SAFT images

    Science.gov (United States)

    Hopp, T.; You, W.; Zapf, M.; Tan, W. Y.; Gemmeke, H.; Ruiter, N. V.

    2017-03-01

    Ultrasound Computer Tomography (USCT) is a promising new imaging system for breast cancer diagnosis. An essential step before further processing is to remove the water background from the reconstructed images. In this paper we present a fully-automated image segmentation method based on three-dimensional active contours. The active contour method is extended by applying gradient vector flow and encoding the USCT aperture characteristics as additional weighting terms. A surface detection algorithm based on a ray model is developed to initialize the active contour, which is iteratively deformed to capture the breast outline in USCT reflection images. The evaluation with synthetic data showed that the method is able to cope with noisy images, and is not influenced by the position of the breast and the presence of scattering objects within the breast. The proposed method was applied to 14 in-vivo images resulting in an average surface deviation from a manual segmentation of 2.7 mm. We conclude that automated segmentation of USCT reflection images is feasible and produces results comparable to a manual segmentation. By applying the proposed method, reproducible segmentation results can be obtained without manual interaction by an expert.
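
    The authors use a three-dimensional active contour with gradient vector flow and a ray-model initialization; a much reduced two-dimensional sketch of the same family of methods, using the morphological Chan-Vese active contour available in scikit-image on a synthetic reflection slice, looks like this. It is an illustration of active-contour segmentation, not the authors' algorithm, and the image is made up.

        import numpy as np
        from skimage.segmentation import morphological_chan_vese, disk_level_set

        # Synthetic "reflection slice": a bright ellipse (breast outline) in a noisy water background.
        rng = np.random.default_rng(0)
        y, x = np.mgrid[:200, :200]
        image = ((x - 100) ** 2 / 70 ** 2 + (y - 110) ** 2 / 55 ** 2 < 1).astype(float)
        image += 0.3 * rng.standard_normal(image.shape)

        # Initial level set roughly centred on the breast, then iteratively deformed.
        init = disk_level_set(image.shape, center=(110, 100), radius=20)
        mask = morphological_chan_vese(image, 80, init_level_set=init, smoothing=2)

        print("Segmented area (pixels):", int(mask.sum()))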

  2. Automated high speed volume computed tomography for inline quality control

    International Nuclear Information System (INIS)

    Hanke, R.; Kugel, A.; Troup, P.

    2004-01-01

    Increasing complexity of innovative products as well as growing requirements on quality and reliability call for more detailed knowledge about the internal structures of manufactured components, obtained by 100 % inspection rather than just by sampling tests. A first-step solution, radioscopic inline inspection machines equipped with automated data evaluation software, has become state of the art on the production floor during the last years. However, these machines provide just ordinary two-dimensional information and deliver no volume data, e.g. to evaluate the exact position or shape of detected defects. One way to solve this problem is the application of X-ray computed tomography (CT). Compared to the performance of the first generation of medical scanners (scanning times of many hours), modern volume CT machines for industrial applications today need about 5 minutes for a full object scan, depending on the object size. Of course, this is still too long to introduce this powerful method into inline production quality control. In order to gain acceptance, the scanning time including subsequent data evaluation must be decreased significantly and adapted to the manufacturing cycle times. This presentation demonstrates the new technical set-up, reconstruction results and the methods for high-speed volume data evaluation of a new fully automated high-speed CT scanner with cycle times below one minute for an object size of less than 15 cm. This will directly create new opportunities in the design and construction of more complex objects. (author)

  3. Experiences of Using Automated Assessment in Computer Science Courses

    Directory of Open Access Journals (Sweden)

    John English

    2015-10-01

    Full Text Available In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students type in their own answers). Students were allowed to correct errors based on feedback provided by the system and resubmit their answers. A total of 141 students were surveyed to assess their opinions of this approach, and we analysed their responses. Analysis of the questionnaire showed a low correlation between questions, indicating the statistical independence of the individual questions. As a whole, student feedback on using Checkpoint was very positive, emphasizing the benefits of multiple attempts, impartial marking, and a quick turnaround time for submissions. Many students said that Checkpoint gave them confidence in learning and motivation to practise. Students also said that the detailed feedback that Checkpoint generated when their programs failed helped them understand their mistakes and how to correct them.

  4. An automated annotation tool for genomic DNA sequences using

    Indian Academy of Sciences (India)

    Genomic sequence data are often available well before the annotated sequence is published. We present a method for analysis of genomic DNA to identify coding sequences using the GeneScan algorithm and characterize these resultant sequences by BLAST. The routines are used to develop a system for automated ...

  5. Computer program for the automated attendance accounting system

    Science.gov (United States)

    Poulson, P.; Rasmusson, C.

    1971-01-01

    The automated attendance accounting system (AAAS) was developed under the auspices of the Space Technology Applications Program. The task is basically the adaptation of a small digital computer, coupled with specially developed pushbutton terminals located in school classrooms and offices for the purpose of taking daily attendance, maintaining complete attendance records, and producing partial and summary reports. Especially developed for high schools, the system is intended to relieve both teachers and office personnel from the time-consuming and dreary task of recording and analyzing the myriad classroom attendance data collected throughout the semester. In addition, since many school district budgets are related to student attendance, the increase in accounting accuracy is expected to augment district income. A major component of this system is the real-time AAAS software system, which is described.

  6. Applications of the soft computing in the automated history matching

    Energy Technology Data Exchange (ETDEWEB)

    Silva, P.C.; Maschio, C.; Schiozer, D.J. [Unicamp (Brazil)

    2006-07-01

    Reservoir management is a research field in petroleum engineering that optimizes reservoir performance based on environmental, political, economic and technological criteria. Reservoir simulation is based on geological models that simulate fluid flow. Models must be constantly corrected to yield the observed production behaviour. The process of history matching is controlled by the comparison of production data, well test data and measured data from simulations. Parametrization, objective function analysis, sensitivity analysis and uncertainty analysis are important steps in history matching. One of the main challenges facing automated history matching is to develop algorithms that find the optimal solution in multidimensional search spaces. Optimization algorithms can be either global optimizers, which work with noisy multi-modal functions, or local optimizers, which cannot. The problem with global optimizers is the very large number of function calls, which is an inconvenience due to the long reservoir simulation time. For that reason, techniques such as least squares, thin plate splines, kriging and artificial neural networks (ANN) have been used as substitutes for reservoir simulators. This paper describes the use of optimization algorithms to find the optimal solution in automated history matching. Several ANNs were used, including the generalized regression neural network, a fuzzy system with subtractive clustering and a radial basis network. The UNIPAR soft computing method was used along with a modified Hooke-Jeeves optimization method. Two case studies with synthetic and real reservoirs are examined. It was concluded that the combination of global and local optimization has the potential to improve the history matching process and that the use of substitute models can reduce computational efforts. 15 refs., 11 figs.
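
    A compact sketch of the surrogate idea discussed here: sample a few reservoir-parameter sets, evaluate a misfit for each, fit a cheap proxy, and let a derivative-free optimizer search the proxy instead of the expensive simulator. The misfit function below is a made-up stand-in for a reservoir simulation, scipy's thin-plate-spline RBFInterpolator stands in for the ANN surrogates of the paper, and Nelder-Mead stands in for the modified Hooke-Jeeves search.

        import numpy as np
        from scipy.interpolate import RBFInterpolator
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)

        def misfit(params):
            """Placeholder for an expensive reservoir simulation plus data mismatch."""
            perm, poro = params
            return (perm - 150.0) ** 2 / 1e4 + (poro - 0.22) ** 2 * 1e3

        # Small design of experiments over permeability (mD) and porosity (-).
        samples = np.column_stack([rng.uniform(50, 300, 40), rng.uniform(0.1, 0.3, 40)])
        values = np.array([misfit(p) for p in samples])

        # Thin-plate-spline RBF proxy of the misfit surface.
        proxy = RBFInterpolator(samples, values, smoothing=1e-6)

        # Derivative-free local search on the cheap proxy instead of the simulator.
        x0 = samples[values.argmin()]
        result = minimize(lambda p: proxy(p[None, :])[0], x0, method="Nelder-Mead")
        print("Proxy optimum:", result.x, "proxy misfit:", float(result.fun))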

  7. Tool for Automated Retrieval of Generic Event Tracks (TARGET)

    Science.gov (United States)

    Clune, Thomas; Freeman, Shawn; Cruz, Carlos; Burns, Robert; Kuo, Kwo-Sen; Kouatchou, Jules

    2013-01-01

    Methods have been developed to identify and track tornado-producing mesoscale convective systems (MCSs) automatically over the continental United States, in order to facilitate systematic studies of these powerful and often destructive events. Several data sources were combined to ensure event identification accuracy. Records of watches and warnings issued by the National Weather Service (NWS), and tornado locations and tracks from the Tornado History Project (THP), were used to locate MCSs in high-resolution precipitation observations and GOES infrared (11-micron) Rapid Scan Operation (RSO) imagery. Thresholds are then applied to the latter two data sets to define MCS events and track their development. MCSs produce a broad range of severe convective weather events that significantly affect the living conditions of the populations exposed to them. Understanding how MCSs grow and develop could help scientists improve their weather prediction models, and also provide tools to decision-makers whose goals are to protect populations and their property. Associating storm cells across frames of remotely sensed images poses a difficult problem because storms evolve, split, and merge. Any storm-tracking method should include the following processes: storm identification, storm tracking, and quantification of storm intensity and activity. The spatiotemporal coordinates of the tracks will enable researchers to obtain other coincident observations to conduct more thorough studies of these events. In addition to the tracked locations, the areal extents, precipitation intensities, and accumulations of these events, all as functions of time, were also obtained and recorded. All parameters so derived can be catalogued into a moving object database (MODB) for custom queries. The purpose of this software is to provide a generalized, cross-platform, pluggable tool for identifying events within a set of scientific data based upon specified criteria with the
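
    The identification and tracking steps mentioned above can be sketched in a few lines: threshold a precipitation (or infrared brightness) field, label connected regions as candidate storm cells, and link cells across consecutive frames by spatial overlap. The synthetic fields and the threshold value are placeholders, not the TARGET software's actual data or settings.

        import numpy as np
        from scipy import ndimage

        def identify_cells(field, threshold):
            """Label connected regions exceeding the intensity threshold."""
            labels, n = ndimage.label(field > threshold)
            return labels, n

        def link_by_overlap(labels_prev, labels_curr):
            """Associate each current cell with the previous cell it overlaps most."""
            links = {}
            for cell in range(1, labels_curr.max() + 1):
                overlap = labels_prev[labels_curr == cell]
                overlap = overlap[overlap > 0]
                links[cell] = int(np.bincount(overlap).argmax()) if overlap.size else None
            return links

        # Two synthetic precipitation frames with a storm drifting eastward.
        rng = np.random.default_rng(3)
        frame0 = rng.gamma(1.0, 1.0, (100, 100)); frame0[40:60, 20:40] += 15
        frame1 = rng.gamma(1.0, 1.0, (100, 100)); frame1[40:60, 28:48] += 15

        l0, _ = identify_cells(frame0, threshold=10)
        l1, _ = identify_cells(frame1, threshold=10)
        print(link_by_overlap(l0, l1))   # current cell id -> matching previous cell id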

  8. Computational Complexity of Some Problems on Generalized Cellular Automations

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2012-03-01

    Full Text Available We prove that the preimage problem for a generalized cellular automaton is NP-hard. The results of this work are important for supporting the security of ciphers based on cellular automata.

  9. AUTOMATION OF REMEDY TICKETS CATEGORIZATION USING BUSINESS INTELLIGENCE TOOLS

    OpenAIRE

    DR. M RAJASEKHARA BABU; ANKITA TIWARI

    2012-01-01

    The work log of an issue is often the primary source of information for predicting the cause. Mining patterns from work log is an important issue management task. This paper aims at developing an application which categorizes the issues into problem areas using a clustering algorithm. This algorithm helps one to cluster the issues by mining patterns from the work log files. Standard reports can be generated for the root cause analysis. The whole process is automated using Business Intelligenc...

  10. Computational Tools for RF Structure Design

    CERN Document Server

    Jensen, E

    2004-01-01

    The Finite Differences Method and the Finite Element Method are the two principally employed numerical methods in modern RF field simulation programs. The basic ideas behind these methods are explained, with regard to available simulation programs. We then go through a list of characteristic parameters of RF structures, explaining how they can be calculated using these tools. With the help of these parameters, we introduce the frequency-domain and the time-domain calculations, leading to impedances and wake-fields, respectively. Subsequently, we present some readily available computer programs, which are in use for RF structure design, stressing their distinctive features and limitations. One final example benchmarks the precision of different codes for calculating the eigenfrequency and Q of a simple cavity resonator.

  11. VISTA - computational tools for comparative genomics

    Energy Technology Data Exchange (ETDEWEB)

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes for the kinesin family member 3A (KIF3A) protein.

  12. Electronic circuit design with HEP computational tools

    International Nuclear Information System (INIS)

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and the TCP/IP communications protocol, using CPS - Cooperative Processes Software, the SPICE program and the CERNLIB software package. It is part of a set of tools being developed, intended to help electronic engineers to design, model and simulate complex systems and circuits for High Energy Physics detectors, based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, through several different processes running SPICE simultaneously on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, and for bigger simulation output files. Simulation results are written to an HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphically, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or on Exabyte tape files for large amounts of data. HEP tools also help circuit or component modeling, like the MINUIT program from CERNLIB, which implements the Nelder and Mead Simplex and Gradient (with or without derivatives) algorithms, and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)

  13. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)

  14. A least-squares computational "tool kit"

    International Nuclear Information System (INIS)

    Smith, D.L.

    1993-04-01

    The information assembled in this report is intended to offer a useful computational "tool kit" to individuals who are interested in a variety of practical applications for the least-squares method of parameter estimation. The fundamental principles of Bayesian analysis are outlined first and these are applied to development of both the simple and the generalized least-squares conditions. Formal solutions that satisfy these conditions are given subsequently. Their application to both linear and non-linear problems is described in detail. Numerical procedures required to implement these formal solutions are discussed and two utility computer algorithms are offered for this purpose (codes LSIOD and GLSIOD written in FORTRAN). Some simple, easily understood examples are included to illustrate the use of these algorithms. Several related topics are then addressed, including the generation of covariance matrices, the role of iteration in applications of least-squares procedures, the effects of numerical precision and an approach that can be pursued in developing data analysis packages that are directed toward special applications.
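
    As a concrete instance of the generalized least-squares condition discussed in the report, the sketch below solves x = (A^T V^-1 A)^-1 A^T V^-1 b for a small linear model with a non-diagonal data covariance V and also returns the covariance of the fitted parameters. All numbers are illustrative only and are not taken from the report.

        import numpy as np

        # Linear model b = A x with correlated measurement errors described by V.
        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [1.0, 2.0],
                      [1.0, 3.0]])
        b = np.array([1.1, 2.9, 5.2, 6.8])
        V = 0.04 * np.eye(4) + 0.01             # non-diagonal data covariance

        Vinv = np.linalg.inv(V)
        cov_x = np.linalg.inv(A.T @ Vinv @ A)   # covariance of the estimated parameters
        x_hat = cov_x @ A.T @ Vinv @ b          # generalized least-squares solution

        chi2 = (b - A @ x_hat) @ Vinv @ (b - A @ x_hat)
        print("parameters:", x_hat)
        print("parameter covariance:\n", cov_x)
        print("chi-square:", chi2)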

  15. Computer-aided detection and automated CT volumetry of pulmonary nodules

    International Nuclear Information System (INIS)

    Marten, Katharina; Engelke, Christoph

    2007-01-01

    With use of multislice computed tomography (MSCT), small pulmonary nodules are being detected in vast numbers, constituting the majority of all noncalcified lung nodules. Although the prevalence of lung cancers among such lesions in lung cancer screening populations is low, their isolation may contribute to increased patient survival. Computer-aided diagnosis (CAD) has emerged as a diverse set of diagnostic tools to handle the large number of images in MSCT datasets and most importantly, includes automated detection and volumetry of pulmonary nodules. Current CAD systems can significantly enhance experienced radiologists' performance and outweigh human limitations in identifying small lesions and manually measuring their diameters, augment observer consistency in the interpretation of such examinations and may thus help to detect significantly higher rates of early malignomas and give more precise estimates on chemotherapy response than can radiologists alone. In this review, we give an overview of current CAD in lung nodule detection and volumetry and discuss their relative merits and limitations. (orig.)

  16. Computer automation of a health physics program record

    International Nuclear Information System (INIS)

    Bird, E.M.; Flook, B.A.; Jarrett, R.D.

    1984-01-01

    A multi-user computer data base management system (DBMS) has been developed to automate USDA's national radiological safety program. It maintains information on approved users of radioactive material and radiation emanating equipment as a central file which is accessed whenever information on the user is required. Files of inventory, personnel dosimetry records, laboratory and equipment surveys, leak tests, bioassay reports, and all other information are linked to each approved user by an assigned code that identifies the user by state, agency, and facility. The DBMS is menu-driven with provisions for addition, modification and report generation of information maintained in the system. This DBMS was designed as a single-entry system to reduce the redundancy of data entry. Prompts guide the user at decision points and data validation routines check for proper data entry. The DBMS generates lists of current inventories, leak test forms and inspection reports, scans for overdue reports from users, and generates follow-up letters. The DBMS operates on a Wang OIS computer and utilizes its compiled BASIC, List Processing, Word Processing, and indexed (ISAM) file features. This system is a very fast relational database supporting many users simultaneously while providing several methods of data protection. All data files are compatible with List Processing. Information in these files can be examined, sorted, modified, or output to word processing documents using software supplied by Wang. This has reduced the need for special one-time programs and provides alternative access to the data.

  17. Improving patient safety and efficiency of medication reconciliation through the development and adoption of a computer-assisted tool with automated electronic integration of population-based community drug data: the RightRx project.

    Science.gov (United States)

    Tamblyn, Robyn; Winslade, Nancy; Lee, Todd C; Motulsky, Aude; Meguerditchian, Ari; Bustillo, Melissa; Elsayed, Sarah; Buckeridge, David L; Couture, Isabelle; Qian, Christina J; Moraga, Teresa; Huang, Allen

    2018-05-01

    Many countries require hospitals to implement medication reconciliation for accreditation, but the process is resource-intensive, thus adherence is poor. We report on the impact of prepopulating and aligning community and hospital drug lists with data from population-based and hospital-based drug information systems to reduce workload and enhance adoption and use of an e-medication reconciliation application, RightRx. The prototype e-medical reconciliation web-based software was developed for a cluster-randomized trial at the McGill University Health Centre. User-centered design and agile development processes were used to develop features intended to enhance adoption, safety, and efficiency. RightRx was implemented in medical and surgical wards, with support and training provided by unit champions and field staff. The time spent per professional using RightRx was measured, as well as the medication reconciliation completion rates in the intervention and control units during the first 20 months of the trial. Users identified required modifications to the application, including the need for dose-based prescribing, the role of the discharge physician in prescribing community-based medication, and access to the rationale for medication decisions made during hospitalization. In the intervention units, both physicians and pharmacists were involved in discharge reconciliation, for 96.1% and 71.9% of patients, respectively. Medication reconciliation was completed for 80.7% (surgery) to 96.0% (medicine) of patients in the intervention units, and 0.7% (surgery) to 82.7% of patients in the control units. The odds of completing medication reconciliation were 9 times greater in the intervention compared to control units (odds ratio: 9.0, 95% confidence interval, 7.4-10.9, P < .0001) after adjusting for differences in patient characteristics. High rates of medication reconciliation completion were achieved with automated prepopulation and alignment of community and hospital

  18. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    Science.gov (United States)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time-dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems are being done to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  19. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

    Full Text Available BACKGROUND: In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high-resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice, an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U elastase, n = 9; 0.5 U elastase, n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly, until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes, compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high-resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This
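
    The record does not spell out the segmentation algorithm, but aerated lung volume from (micro-)CT is commonly obtained by counting lung-mask voxels whose intensity falls in an aerated range and multiplying by the voxel volume. The sketch below follows that generic recipe; the thresholds, voxel size and synthetic data are assumptions, not values from the study.

```python
import numpy as np

def aerated_lung_volume_mm3(ct_hu, lung_mask, hu_lo=-1000, hu_hi=-200,
                            voxel_mm=0.05):
    """Aerated volume = number of lung voxels whose HU falls in the aerated
    range, times the voxel volume. Thresholds and voxel size here are
    illustrative, not the values used in the cited study."""
    aerated = (ct_hu >= hu_lo) & (ct_hu <= hu_hi) & lung_mask
    return aerated.sum() * voxel_mm ** 3

# Tiny synthetic volume for illustration.
rng = np.random.default_rng(0)
ct = rng.normal(-500, 150, size=(64, 64, 64))   # pseudo-HU values
mask = np.ones_like(ct, dtype=bool)             # pretend whole-lung mask
print(f"aerated volume ~ {aerated_lung_volume_mm3(ct, mask):.1f} mm^3")
```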

  20. Automated software development tools in the MIS (Management Information Systems) environment

    Energy Technology Data Exchange (ETDEWEB)

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  1. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure error and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis

  2. A software for computer automated radioactive particle tracking

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Wilson S.; Brandao, Luis E. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mails: wilson@ien.gov.br; brandao@ien.gov.br; Braz, Delson [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)]. E-mail: delson@smb.lin.ufrj.br

    2008-07-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Since this particle behaves similarly to the phase under investigation, important conclusions can be drawn. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain photon counting rates at four detector surfaces. These counting rates were related to the emission of gamma radiation from a radioactive source because they are the main TRACO-1 input variables. Although the results found so far are preliminary, their analysis suggests that tracking a radioactive source with TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)
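
    In this technique the particle position is reconstructed from the count rates registered by several detectors, using a counting model calibrated beforehand. The sketch below illustrates the inverse problem with a pure inverse-square model and a least-squares fit; the detector layout, calibration constant and model are assumptions and do not reproduce TRACO-1's actual reconstruction.

```python
import numpy as np
from scipy.optimize import least_squares

# Detector positions (m) and a simple inverse-square counting model.
detectors = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
k = 5.0e3  # assumed calibration constant (counts * m^2)

def expected_counts(pos):
    d2 = ((detectors - pos) ** 2).sum(axis=1)
    return k / d2

# Synthetic measurement from a known position, then recover it by fitting.
true_pos = np.array([0.3, 0.4, 0.2])
measured = expected_counts(true_pos)
fit = least_squares(lambda p: expected_counts(p) - measured,
                    x0=[0.5, 0.5, 0.5])
print("recovered position:", fit.x.round(3))
```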

  3. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whitman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-01-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of sixteen plasma traces has been processed using this technique
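
    The record mentions despiking the centroid and width curves with Tukey's 'twicing'. Twicing smooths the data, smooths the residuals, and adds the two results back together. The sketch below illustrates the idea with a simple running-mean smoother standing in for Tukey's running medians; the smoother choice and window size are assumptions.

```python
import numpy as np

def smooth(y, w=5):
    """Simple running-mean smoother (stand-in for Tukey's running medians)."""
    kernel = np.ones(w) / w
    return np.convolve(y, kernel, mode="same")

def twice(y, w=5):
    """Tukey's 'twicing': smooth the data, then smooth the residuals and
    add them back, recovering signal lost in the first pass."""
    s = smooth(y, w)
    return s + smooth(y - s, w)

rng = np.random.default_rng(1)
trace = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.3 * rng.standard_normal(200)
despiked = twice(trace)
```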

  4. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    Johnson, S.M.; Lorenz, R.

    1992-01-01

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS

  5. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whiteman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-11-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of the plasma traces has been processed with this technique

  6. A software for computer automated radioactive particle tracking

    International Nuclear Information System (INIS)

    Vieira, Wilson S.; Brandao, Luis E.; Braz, Delson

    2008-01-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Since this particle behaves similarly to the phase under investigation, important conclusions can be drawn. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain photon counting rates at four detector surfaces. These counting rates were related to the emission of gamma radiation from a radioactive source because they are the main TRACO-1 input variables. Although the results found so far are preliminary, their analysis suggests that tracking a radioactive source with TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)

  7. Automation of potentiometric titration with a personal computer using ...

    African Journals Online (AJOL)

    sampling was designed and tested for automation of potentiometric titrations with personal ... automation permits us to carry out new types of experiments, such as those requiring ... have proved to be very useful in routine tasks but not in research, due to their ... This is done by a simple delay sub-routine in data acquisition.

  8. Exploring repetitive DNA landscapes using REPCLASS, a tool that automates the classification of transposable elements in eukaryotic genomes.

    Science.gov (United States)

    Feschotte, Cédric; Keswani, Umeshkumar; Ranganathan, Nirmal; Guibotsy, Marcel L; Levine, David

    2009-07-23

    Eukaryotic genomes contain large amounts of repetitive DNA, most of which is derived from transposable elements (TEs). Progress has been made to develop computational tools for ab initio identification of repeat families, but there is an urgent need to develop tools to automate the annotation of TEs in genome sequences. Here we introduce REPCLASS, a tool that automates the classification of TE sequences. Using control repeat libraries, we show that the program can accurately classify virtually any known TE type. Combining REPCLASS with ab initio repeat finding in the genomes of Caenorhabditis elegans and Drosophila melanogaster allowed us to recover the contrasting TE landscape characteristic of these species. Unexpectedly, REPCLASS also uncovered several novel TE families in both genomes, augmenting the TE repertoire of these model species. When applied to the genomes of distant Caenorhabditis and Drosophila species, the approach revealed a remarkable conservation of TE composition profile within each genus, despite substantial interspecific variations in genome size and in the number of TEs and TE families. Lastly, we applied REPCLASS to analyze 10 fungal genomes from a wide taxonomic range, most of which have not been analyzed for TE content previously. The results showed that TE diversity varies widely across the fungi "kingdom" and appears to positively correlate with genome size, in particular for DNA transposons. Together, these data validate REPCLASS as a powerful tool to explore the repetitive DNA landscapes of eukaryotes and to shed light onto the evolutionary forces shaping TE diversity and genome architecture.

  9. Regimes of data output from an automated scanning system into a computer

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Shaislamov, P.T.

    1984-01-01

    A method is described for implementing a rather complex algorithm for transmitting various coordinate and service data from the devices of an automated scanning system to a monitoring computer, within the automated system for processing images from bubble chambers. The adopted data output algorithm and the corresponding equipment that was developed enable data transmission both as separate words and as word arrays

  10. Bayesian ISOLA: new tool for automated centroid moment tensor inversion

    Czech Academy of Sciences Publication Activity Database

    Vackář, J.; Burjánek, Jan; Gallovič, F.; Zahradník, J.; Clinton, J.

    2017-01-01

    Vol. 210, No. 2 (2017), p. 693-705, ISSN 0956-540X. Institutional support: RVO:67985530. Keywords: inverse theory * waveform inversion * computational seismology * earthquake source observations * seismic noise. Subject RIV: DC - Seismology, Volcanology, Earth Structure. OECD field: Volcanology. Impact factor: 2.414, year: 2016

  11. Automated speech quality monitoring tool based on perceptual evaluation

    OpenAIRE

    Vozňák, Miroslav; Rozhon, Jan

    2010-01-01

    The paper deals with a speech quality monitoring tool that we have developed in accordance with PESQ (Perceptual Evaluation of Speech Quality); it runs automatically and calculates the MOS (Mean Opinion Score). Results are stored in a database and used in a research project investigating how meteorological conditions influence speech quality in a GSM network. The meteorological station, which is located on our university campus, provides information about temperature,...

  12. Automation support for mobile app quality assurance - a tool landscape

    OpenAIRE

    Braun, Susanne; Elberzhager, Frank; Holl, Konstantin

    2017-01-01

    Competitive pressure in app stores, as well as direct and transparent feedback of app store reviews have resulted in an increased demand for outstanding app quality and user experience. At the same time, reduced time-to-market, decreased budgets and time available for quality assurance, and careful user experience design have to be considered. In response, an enormous market for mobile app quality and user experience measurement tools has grown around the mobile app store ecosystems. Develope...

  13. A novel DTI-QA tool: Automated metric extraction exploiting the sphericity of an agar filled phantom.

    Science.gov (United States)

    Chavez, Sofia; Viviano, Joseph; Zamyadi, Mojdeh; Kingsley, Peter B; Kochunov, Peter; Strother, Stephen; Voineskos, Aristotle

    2018-02-01

    To develop a quality assurance (QA) tool (acquisition guidelines and automated processing) for diffusion tensor imaging (DTI) data using a common agar-based phantom used for fMRI QA. The goal is to produce a comprehensive set of automated, sensitive and robust QA metrics. A readily available agar phantom was scanned with and without parallel imaging reconstruction. Other scanning parameters were matched to the human scans. A central slab made up of either a thick slice or an average of a few slices, was extracted and all processing was performed on that image. The proposed QA relies on the creation of two ROIs for processing: (i) a preset central circular region of interest (ccROI) and (ii) a signal mask for all images in the dataset. The ccROI enables computation of average signal for SNR calculations as well as average FA values. The production of the signal masks enables automated measurements of eddy current and B0 inhomogeneity induced distortions by exploiting the sphericity of the phantom. Also, the signal masks allow automated background localization to assess levels of Nyquist ghosting. The proposed DTI-QA was shown to produce eleven metrics which are robust yet sensitive to image quality changes within site and differences across sites. It can be performed in a reasonable amount of scan time (~15min) and the code for automated processing has been made publicly available. A novel DTI-QA tool has been proposed. It has been applied successfully on data from several scanners/platforms. The novelty lies in the exploitation of the sphericity of the phantom for distortion measurements. Other novel contributions are: the computation of an SNR value per gradient direction for the diffusion weighted images (DWIs) and an SNR value per non-DWI, an automated background detection for the Nyquist ghosting measurement and an error metric reflecting the contribution of EPI instability to the eddy current induced shape changes observed for DWIs. Copyright © 2017 Elsevier
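
    A minimal sketch of the two-ROI idea described in the record: a preset central circular ROI (ccROI) supplies the mean signal and a background region supplies the noise estimate, giving one SNR value per diffusion-weighted volume. The ROI radii, the noise estimator and the synthetic data are assumptions, not the published definitions.

```python
import numpy as np

def circular_roi(shape, radius):
    """Boolean mask for a centred circular ROI (the 'ccROI')."""
    yy, xx = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    return (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2

def snr_per_volume(volumes, roi, background):
    """One SNR value per volume: mean signal in the ccROI divided by the
    standard deviation in a background (air) region. The published tool's
    exact noise estimator may differ; this is a common choice."""
    return np.array([v[roi].mean() / v[background].std() for v in volumes])

# Synthetic stack of diffusion-weighted slices for illustration.
rng = np.random.default_rng(2)
dwis = 100.0 + 5.0 * rng.standard_normal((30, 96, 96))   # 30 directions
roi = circular_roi((96, 96), radius=20)
bg = ~circular_roi((96, 96), radius=44)                  # corners as background
print(snr_per_volume(dwis, roi, bg)[:5].round(1))
```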

  14. UAV : Warnings From Multiple Automated Static Analysis Tools At A Glance

    NARCIS (Netherlands)

    Buckers, T.B.; Cao, C.S.; Doesburg, M.S.; Gong, Boning; Wang, Sunwei; Beller, M.M.; Zaidman, A.E.; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Automated Static Analysis Tools (ASATs) are an integral part of today’s software quality assurance practices. At present, a plethora of ASATs exist, each with different strengths. However, there is little guidance for developers on which of these ASATs to choose and combine for a project. As a

  15. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Full Text Available Virtual screening is an effective tool for lead identification in drug discovery. However, there are limited numbers of crystal structures available compared to the number of biological sequences, which makes Structure-Based Drug Discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and automatic virtual screening followed by pharmacology-based prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding site identification, automated docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in the characterization of ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. The judicious combination of ligands binding different receptors can be used to inhibit selective biological pathways in a disease. This tool also allows the user to systemically investigate network-dependent effects of a drug or drug candidate.

  16. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would increase the characteristics of robustness, efficiency, flexibility, and advanced manoeuvrability.

  17. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    Science.gov (United States)

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom built virtual machines to distribute pre-packaged with pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing. PMID:21878105

  18. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    Science.gov (United States)

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom built virtual machines to distribute pre-packaged with pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing.

  19. Automation and schema acquisition in learning elementary computer programming : implications for the design of practice

    NARCIS (Netherlands)

    van Merrienboer, Jeroen J.G.; van Merrienboer, J.J.G.; Paas, Fred G.W.C.

    1990-01-01

    Two complementary processes may be distinguished in learning a complex cognitive skill such as computer programming. First, automation offers task-specific procedures that may directly control programming behavior; second, schema acquisition offers cognitive structures that provide analogies in new

  20. Program software for the automated processing of gravity and magnetic survey data for the Mir computer

    Energy Technology Data Exchange (ETDEWEB)

    Lyubimov, G.A.

    1980-01-01

    A presentation is made of the content of program software for the automated processing of gravity and magnetic survey data for the small Mir-1 and Mir-2 computers as worked out on the Voronezh geophysical expedition.

  1. Functions and Requirements and Specifications for Replacement of the Computer Automated Surveillance System (CASS)

    International Nuclear Information System (INIS)

    SCAIEF, C.C.

    1999-01-01

    This functions, requirements and specifications document defines the baseline requirements and criteria for the design, purchase, fabrication, construction, installation, and operation of the system to replace the Computer Automated Surveillance System (CASS) alarm monitoring

  2. Automated patterning and probing with multiple nanoscale tools for single-cell analysis.

    Science.gov (United States)

    Li, Jiayao; Kim, Yeonuk; Liu, Boyin; Qin, Ruwen; Li, Jian; Fu, Jing

    2017-10-01

    The nano-manipulation approach that combines Focused Ion Beam (FIB) milling and various imaging and probing techniques enables researchers to investigate cellular structures in three dimensions. Such a fusion approach, however, requires extensive effort to locate and examine randomly distributed targets, due to the limited Field of View (FOV) when high magnification is desired. In the present study, we present a development that automates 'pattern and probe' particularly for single-cell analysis, achieved by computer-aided tools including feature recognition and geometric planning algorithms. Scheduling of serial FOVs for imaging and probing of multiple cells was considered as a rectangle covering problem, and optimal or near-optimal solutions were obtained with the heuristics developed. FIB milling was then employed automatically, followed by downstream analysis using Atomic Force Microscopy (AFM) to probe the cellular interior. Our strategy was applied to examine bacterial cells (Klebsiella pneumoniae) and achieved high efficiency with limited human intervention. The developed algorithms can be easily adapted and integrated with different imaging platforms towards high-throughput imaging analysis of single cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
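
    The record casts FOV scheduling as a rectangle covering problem solved with heuristics. The sketch below is a simple greedy stand-in, not the authors' heuristic: it repeatedly centres a fixed-size FOV on the target whose window would capture the most uncovered cells.

```python
import numpy as np

def greedy_fov_cover(points, fov=50.0):
    """Greedy covering: centre each FOV on the point whose FOV-sized window
    captures the most uncovered targets, until all targets are covered."""
    remaining = points.copy()
    fovs = []
    while len(remaining):
        # Count how many uncovered points each candidate centre would cover.
        counts = [
            np.sum(np.all(np.abs(remaining - p) <= fov / 2, axis=1))
            for p in remaining
        ]
        centre = remaining[int(np.argmax(counts))]
        covered = np.all(np.abs(remaining - centre) <= fov / 2, axis=1)
        fovs.append(centre)
        remaining = remaining[~covered]
    return np.array(fovs)

rng = np.random.default_rng(3)
cells = rng.uniform(0, 500, size=(200, 2))   # random cell coordinates (um)
print(f"{len(greedy_fov_cover(cells))} FOVs cover 200 cells")
```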

  3. An automated computer misuse detection system for UNICOS

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Neuman, M.C.; Simmonds, D.D.; Stallings, C.A.; Thompson, J.L.; Christoph, G.G.

    1994-09-27

    An effective method for detecting computer misuse is the automatic monitoring and analysis of on-line user activity. This activity is reflected in the system audit record, in the system vulnerability posture, and in other evidence found through active testing of the system. During the last several years we have implemented an automatic misuse detection system at Los Alamos. This is the Network Anomaly Detection and Intrusion Reporter (NADIR). We are currently expanding NADIR to include processing of the Cray UNICOS operating system. This new component is called the UNICOS Realtime NADIR, or UNICORN. UNICORN summarizes user activity and system configuration in statistical profiles. It compares these profiles to expert rules that define security policy and improper or suspicious behavior. It reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. The first phase of UNICORN development is nearing completion, and will be operational in late 1994.

  4. Computer automated mass spectrometer for isotope analysis on gas samples

    International Nuclear Information System (INIS)

    Pamula, A.; Kaucsar, M.; Fatu, C.; Ursu, D.; Vonica, D.; Bendea, D.; Muntean, F.

    1998-01-01

    A low resolution, high precision instrument was designed and realized in the mass spectrometry laboratory of the Institute of Isotopic and Molecular Technology, Cluj-Napoca. The paper presents the vacuum system, the sample inlet system, the ion source, the magnetic analyzer and the ion collector. The instrument is almost completely automated. There are described the analog-to-digital conversion circuits, the local control microcomputer, the automation systems and the performance checking. (authors)

  5. The Effect of Computer Automation on Institutional Review Board (IRB) Office Efficiency

    Science.gov (United States)

    Oder, Karl; Pittman, Stephanie

    2015-01-01

    Companies purchase computer systems to make their processes more efficient through automation. Some academic medical centers (AMC) have purchased computer systems for their institutional review boards (IRB) to increase efficiency and compliance with regulations. IRB computer systems are expensive to purchase, deploy, and maintain. An AMC should…

  6. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  7. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce side effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  8. Automated computation of arbor densities: a step toward identifying neuronal cell types

    Directory of Open Access Journals (Sweden)

    Uygar eSümbül

    2014-11-01

    Full Text Available The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.
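
    A minimal sketch of the arbor-density idea in the record: once a stack is registered to a global depth coordinate, the arbor density is essentially the distribution of arbor volume across depth bins. The code assumes depth is simply one axis of an already registered binary arbor mask; the reconstruction and fiducial-based registration steps of the paper are not reproduced.

```python
import numpy as np

def arbor_density_profile(arbor_mask, depth_axis=0, nbins=20):
    """Fraction of total arbor volume in each depth bin, assuming the stack
    has already been registered so that `depth_axis` is the global depth
    coordinate (the paper uses starburst amacrine arbors as fiducials)."""
    # Sum arbor voxels slice by slice along the depth axis.
    voxels_per_slice = arbor_mask.sum(axis=tuple(
        ax for ax in range(arbor_mask.ndim) if ax != depth_axis))
    edges = np.linspace(0, len(voxels_per_slice), nbins + 1).astype(int)
    per_bin = np.add.reduceat(voxels_per_slice, edges[:-1])
    return per_bin / per_bin.sum()

rng = np.random.default_rng(4)
mask = rng.random((60, 128, 128)) < 0.02   # synthetic sparse arbor mask
print(arbor_density_profile(mask).round(3))
```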

  9. Computing tools for accelerator design calculations

    International Nuclear Information System (INIS)

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief, summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular, when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations

  10. Automation tools for accelerator control a network based sequencer

    International Nuclear Information System (INIS)

    Clout, P.; Geib, M.; Westervelt, R.

    1991-01-01

    In conjunction with a major client, Vista Control Systems has developed a sequencer for control systems which works in conjunction with its realtime, distributed Vsystem database. Vsystem is a network-based data acquisition, monitoring and control system which has been applied successfully to both accelerator projects and projects outside this realm of research. The network-based sequencer allows a user to simply define a thread of execution in any supported computer on the network. The script defining a sequence has a simple syntax designed for non-programmers, with facilities for selectively abbreviating the channel names for easy reference. The semantics of the script contains most of the familiar capabilities of conventional programming languages, including standard stream I/O and the ability to start other processes with parameters passed. The script is compiled to threaded code for execution efficiency. The implementation is described in some detail and examples are given of applications for which the sequencer has been used

  11. AMMOS: Automated Molecular Mechanics Optimization tool for in silico Screening

    Directory of Open Access Journals (Sweden)

    Pajeva Ilza

    2008-10-01

    Full Text Available Abstract Background Virtual or in silico ligand screening combined with other computational methods is one of the most promising approaches to search for new lead compounds, thereby greatly assisting the drug discovery process. Despite considerable progress made in virtual screening methodologies, available computer programs do not easily address problems such as: structural optimization of compounds in a screening library, receptor flexibility/induced-fit, and accurate prediction of protein-ligand interactions. It has been shown that structural optimization of chemical compounds and post-docking optimization in multi-step structure-based virtual screening approaches help to further improve the overall efficiency of the methods. To address some of these points, we developed the program AMMOS for refining both the 3D structures of the small molecules present in chemical libraries and the predicted receptor-ligand complexes, allowing partial to full atom flexibility through molecular mechanics optimization. Results The program AMMOS carries out an automatic procedure that allows for the structural refinement of compound collections and energy minimization of protein-ligand complexes using the open source program AMMP. The performance of our package was evaluated by comparing the structures of small chemical entities minimized by AMMOS with those minimized with the Tripos and MMFF94s force fields. Next, AMMOS was used for fully flexible minimization of protein-ligand complexes obtained from a multi-step virtual screening. Enrichment studies of the selected pre-docked complexes containing 60% of the initially added inhibitors were carried out with or without final AMMOS minimization on two protein targets having different binding pocket properties. AMMOS was able to improve the enrichment after the pre-docking stage with 40 to 60% of the initially added active compounds found in the top 3% to 5% of the entire compound collection

  12. Tools for automated acoustic monitoring within the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
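
    monitoR is an R package; as a language-neutral illustration of its spectrogram cross-correlation detector, the Python sketch below slides a template spectrogram over a survey spectrogram and flags frames whose correlation score exceeds a threshold. The spectrogram parameters, threshold and synthetic signals are assumptions, not monitoR defaults.

```python
import numpy as np
from scipy.signal import spectrogram

def xcorr_detect(survey, template, fs=22050, thresh=0.6):
    """Slide a template spectrogram across a survey spectrogram and return
    frame indices where the correlation score exceeds `thresh`."""
    _, _, S = spectrogram(survey, fs=fs, nperseg=256)
    _, _, T = spectrogram(template, fs=fs, nperseg=256)
    t = (T - T.mean()) / T.std()
    scores = []
    for i in range(S.shape[1] - T.shape[1] + 1):
        w = S[:, i:i + T.shape[1]]
        w = (w - w.mean()) / (w.std() + 1e-12)
        scores.append(float((w * t).mean()))   # correlation of z-scored patches
    scores = np.array(scores)
    return np.where(scores > thresh)[0], scores

# Synthetic example: embed the template call inside background noise.
rng = np.random.default_rng(5)
template = np.sin(2 * np.pi * 3000 * np.arange(0, 0.2, 1 / 22050))
survey = 0.1 * rng.standard_normal(22050 * 3)
survey[22050:22050 + template.size] += template
hits, _ = xcorr_detect(survey, template)
print("detections near frames:", hits[:5])
```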

  13. Tools for automated acoustic monitoring within the R package monitoR

    DEFF Research Database (Denmark)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.

  14. Operational status display and automation tools for FERMI-Elettra

    International Nuclear Information System (INIS)

    Scafuri, C.

    2012-01-01

    Detecting and locating faults and malfunctions of an accelerator is a difficult and time-consuming task. The situation is even more difficult during the commissioning phase of a new accelerator, when the plants are not yet well known. Faults involving single devices are easy to detect; however, a fault-free machine does not imply that it is ready to run: the definition of a malfunction depends on the expected behavior of the plant. In the case of FERMI-Elettra, in which the electron beam goes to different branches of the machine depending on the programmed activity, the configuration of the plant determines the rules for detecting malfunctions. In order to help the detection of faults and malfunctions and to display the status of the plant, a tool known as the 'Matrix' has been developed. It is composed of a graphical front-end, which displays a synthetic view of the plant status grouped by subsystem and location along the accelerator, and a back-end calculation engine. Once a problem is detected, the graphical front-end also makes it possible to focus on its details. The calculation engine is composed of a set of objects known as Sequencers. The calculation rules have been determined by analyzing the various subsystems and the global working of the accelerator with plant and operations experts. The Sequencer is designed so that it can also issue commands to the plant. This will be used in the next releases of the Matrix for actively switching from one accelerator configuration to another. (author)

  15. Computer automation of an accelerator mass spectrometry system

    International Nuclear Information System (INIS)

    Gressett, J.D.; Maxson, D.L.; Matteson, S.; McDaniel, F.D.; Duggan, J.L.; Mackey, H.J.; North Texas State Univ., Denton, TX; Anthony, J.M.

    1989-01-01

    The determination of trace impurities in electronic materials using accelerator mass spectrometry (AMS) requires efficient automation of the beam transport and mass discrimination hardware. The ability to choose between a variety of charge states, isotopes and injected molecules is necessary to provide survey capabilities similar to that available on conventional mass spectrometers. This paper will discuss automation hardware and software for flexible, high-sensitivity trace analysis of electronic materials, e.g. Si, GaAs and HgCdTe. Details regarding settling times will be presented, along with proof-of-principle experimental data. Potential and present applications will also be discussed. (orig.)

  16. Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System.

    Science.gov (United States)

    Punjabi, Naresh M; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N

    2015-10-01

    Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. The analysis sample comprised of 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events was conducted using the 2007 American Academy of Sleep Medicine criteria. Clinical sleep laboratories. A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90-0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91-0.96). Thus, interscorer correlation between the manually scored results was no different than that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automated scored percentages of sleep stages N1, N2, and N3. Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual versus automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different than that between any two human scorers. In light of the burden associated with manual scoring, automated
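
    The record reports correlations (with 95% confidence intervals) between manually and automatically scored AHI values. As an illustration only, the sketch below computes a Pearson correlation with a Fisher z-transform interval on hypothetical AHI data; the study's exact statistical procedure is not described in the record.

```python
import math
import numpy as np

def pearson_ci(x, y):
    """Pearson correlation with a 95% Fisher z-transform confidence interval."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = float(np.corrcoef(x, y)[0, 1])
    z = 0.5 * math.log((1 + r) / (1 - r))        # Fisher z-transform
    se = 1.0 / math.sqrt(len(x) - 3)
    lo, hi = z - 1.96 * se, z + 1.96 * se
    return r, (math.tanh(lo), math.tanh(hi))     # back-transform the bounds

# Hypothetical manually and automatically scored AHI values (not study data).
rng = np.random.default_rng(6)
manual = rng.gamma(2.0, 10.0, size=97)
automated = manual + rng.normal(0.0, 4.0, size=97)
print(pearson_ci(manual, automated))
```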

  17. Computational Design Tools for Integrated Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    In an architectural conceptual sketching process, where an architect is working with the initial ideas for a design, the process is characterized by three phases: sketching, evaluation and modification. Basically the architect needs to address three areas in the conceptual sketching phase: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or non-existent link between the digital conceptual design tools used by architects and designers and the engineering analysis and simulation tools. Based on an analysis of the architectural design process, different digital design methods are related to tasks in an integrated design process.

  18. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding important new fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would make it possible to take advantage of particular features of the nanocrystals, such as their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic and even stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  19. Computational Tools for Stem Cell Biology.

    Science.gov (United States)

    Bian, Qin; Cahan, Patrick

    2016-12-01

    For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the past several years, a new subdiscipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  1. CAT: a computer code for the automated construction of fault trees

    International Nuclear Information System (INIS)

    Apostolakis, G.E.; Salem, S.L.; Wu, J.S.

    1978-03-01

    A computer code, CAT (Computer Automated Tree), is presented which applies decision table methods to model the behavior of components for systematic construction of fault trees. The decision tables for some commonly encountered mechanical and electrical components are developed; two nuclear subsystems, a Containment Spray Recirculation System and a Consequence Limiting Control System, are analyzed to demonstrate the applications of CAT code
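
    A minimal sketch of the decision-table idea attributed to CAT: each table row maps a combination of input state and internal failure mode to an output state, and the rows that yield an undesired output become AND gates under an OR gate in the fault tree. The valve table below is illustrative and not taken from the CAT code.

```python
# Minimal decision-table sketch: each row maps a combination of input state
# and internal (failure) mode to an output state. Rows matching the undesired
# output become AND gates collected under an OR gate.
VALVE_TABLE = [
    # (input_flow, internal_mode) -> output_flow
    (("flow", "stuck_closed"), "no_flow"),
    (("no_flow", "normal"), "no_flow"),
    (("flow", "normal"), "flow"),
]

def causes_of(output_state, table):
    """Return the fault-tree gate (OR of ANDed conditions) for an output state."""
    rows = [conds for conds, out in table if out == output_state]
    return {"OR": [{"AND": list(conds)} for conds in rows]}

print(causes_of("no_flow", VALVE_TABLE))
# {'OR': [{'AND': ['flow', 'stuck_closed']}, {'AND': ['no_flow', 'normal']}]}
```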

  2. Automated, Resummed and Effective: Precision Computations for the LHC and Beyond

    CERN Document Server

    2017-01-01

    Precise predictions for collider processes are crucial to interpret the results from the Large Hadron Collider (LHC) at CERN. The goal of this programme is to bring together experts from different communities in precision collider physics (diagrammatic resummation vs. effective field theory, automated numerical computations vs. analytic approaches, etc.) to discuss the latest advances in jet physics, higher-order computations and resummation.

  3. Enhancement of the reliability of automated ultrasonic inspections using tools of quantitative NDT

    International Nuclear Information System (INIS)

    Kappes, W.; Baehr, W.; Kroening, M.; Schmitz, V.

    1994-01-01

    To achieve reliable test results from automated ultrasonic inspection of safety related components, optimization and integral consideration of the various inspection stages - inspection planning, inspection performance and evaluation of results - are indispensable. For this purpose, a large potential of methods is available: advanced measurement techniques, mathematical-numerical modelling processes, artificial intelligence tools, data bases and CAD systems. The potential inherent in these methods to enhance inspection reliability is outlined by way of different applications. (orig.) [de

  4. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing

    OpenAIRE

    Wittbrodt, Jonas N.; Liebel, Urban; Gehrig, Jochen

    2014-01-01

    Background The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. ...

  5. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    Science.gov (United States)

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery place ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements (one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis) are reviewed, and their execution in a combined data flow/visualization environment is outlined. © 2016 Society for Laboratory Automation and Screening.

  6. Evaluation of an Automated Analysis Tool for Prostate Cancer Prediction Using Multiparametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Matthias C Roethke

    Full Text Available To evaluate the diagnostic performance of an automated analysis tool for the assessment of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) of the prostate. A fully automated analysis tool was used for a retrospective analysis of mpMRI sets (T2-weighted, T1-weighted dynamic contrast-enhanced, and diffusion-weighted sequences). The software provided a malignancy prediction value for each image pixel, defined as Malignancy Attention Index (MAI), that can be depicted as a colour map overlay on the original images. The malignancy maps were compared to histopathology derived from a combination of MRI-targeted and systematic transperineal MRI/TRUS-fusion biopsies. In total, mpMRI data of 45 patients were evaluated. With a sensitivity of 85.7% (with 95% CI of 65.4-95.0), a specificity of 87.5% (with 95% CI of 69.0-95.7) and a diagnostic accuracy of 86.7% (with 95% CI of 73.8-93.8) for detection of prostate cancer, the automated analysis results corresponded well with the reported diagnostic accuracies by human readers based on the PI-RADS system in the current literature. The study revealed comparable diagnostic accuracies for the detection of prostate cancer of a user-independent MAI-based automated analysis tool and PI-RADS-scoring-based human reader analysis of mpMRI. Thus, the analysis tool could serve as a detection support system for less experienced readers. The results of the study also suggest the potential of MAI-based analysis for advanced lesion assessments, such as cancer extent and staging prediction.
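
    The record reports sensitivity, specificity and accuracy with 95% confidence intervals against the biopsy reference. The sketch below computes these proportions with Wilson score intervals from a confusion matrix; the counts shown were chosen to be consistent with the reported percentages for the 45 patients (they are not stated in the record), and the paper's exact CI method is an assumption.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion k/n."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy, each with a Wilson 95% CI."""
    return {
        "sensitivity": (tp / (tp + fn), wilson_ci(tp, tp + fn)),
        "specificity": (tn / (tn + fp), wilson_ci(tn, tn + fp)),
        "accuracy": ((tp + tn) / (tp + fp + tn + fn),
                     wilson_ci(tp + tn, tp + fp + tn + fn)),
    }

# Illustrative counts consistent with the reported percentages.
print(diagnostic_metrics(tp=18, fn=3, tn=21, fp=3))
```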

  7. AI tools in computer based problem solving

    Science.gov (United States)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value oriented, deterministic, algorithmic problems, has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different model, much less structured. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32 bit workstations executing identical software with large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  8. Computational Tools applied to Urban Engineering

    OpenAIRE

    Filho, Armando Carlos de Pina; Lima, Fernando Rodrigues; Amaral, Renato Dias Calado do

    2010-01-01

    This chapter sought to present the main details of three technologies widely used in Urban Engineering: CAD (Computer-Aided Design); GIS (Geographic Information System); and BIM (Building Information Modelling). As can be seen, each of them has specific characteristics and diverse applications in urban projects, providing better results in relation to the planning, management and maintenance of the systems. In relation to the software presented, it is important to note that the...

  9. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  10. Concept of a computer network architecture for complete automation of nuclear power plants

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ray, A.

    1990-01-01

    The state of the art in automation of nuclear power plants has been largely limited to computerized data acquisition, monitoring, display, and recording of process signals. Complete automation of nuclear power plants, which would include plant operations, control, and management, fault diagnosis, and system reconfiguration with efficient and reliable man/machine interactions, has been projected as a realistic goal. This paper presents the concept of a computer network architecture that would use a high-speed optical data highway to integrate diverse, interacting, and spatially distributed functions that are essential for a fully automated nuclear power plant

  11. Automated design of analog and high-frequency circuits a computational intelligence approach

    CERN Document Server

    Liu, Bo; Fernández, Francisco V

    2014-01-01

    Computational intelligence techniques are becoming more and more important for automated problem solving nowadays. Due to the growing complexity of industrial applications and the increasingly tight time-to-market requirements, the time available for thorough problem analysis and development of tailored solution methods is decreasing. There is no doubt that this trend will continue in the foreseeable future. Hence, it is not surprising that robust and general automated problem solving methods with satisfactory performance are needed.

  12. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system’s... The list of such variables and functional relations constitute the system’s structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes... of the graph. SaTool makes analysis of the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.
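
    Structural analysis of this kind is often formulated as a matching problem on a bipartite graph whose two node sets are the functional relations (constraints) and the unknown variables; constraints left unmatched provide the analytical redundancy used for fault diagnosis. The sketch below illustrates that general idea on an invented toy system using networkx; it is not SaTool's implementation.

```python
import networkx as nx

# Toy structure graph: functional relations (constraints) vs. unknown variables.
# Each constraint lists the unknown variables it involves (known inputs omitted).
constraints = {
    "c1": ["x1", "x2"],
    "c2": ["x2", "x3"],
    "c3": ["x1", "x3"],
    "c4": ["x3"],          # extra relation -> potential analytical redundancy
}

g = nx.Graph()
g.add_nodes_from(constraints, bipartite=0)
for c, variables in constraints.items():
    for v in variables:
        g.add_node(v, bipartite=1)
        g.add_edge(c, v)

# A maximum matching pairs each matched constraint with the variable it is
# used to compute; unmatched constraints become candidate parity relations.
matching = nx.bipartite.maximum_matching(g, top_nodes=set(constraints))
matched = set(constraints) & set(matching)
unmatched = set(constraints) - matched

print("matched constraints :", sorted(matched))
print("redundant relations :", sorted(unmatched),
      "(usable as parity relations for fault detection)")
```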

  13. Computer Aided Design Tools for Extreme Environment Electronics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  14. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  15. Applications of computational tools in biosciences and medical engineering

    CERN Document Server

    Altenbach, Holm

    2015-01-01

     This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as the finite element methods, computer-aided design and optimization as well as visualization techniques such as computed axial tomography open up completely new research fields that combine the fields of engineering and bio/medical. Nevertheless, there are still hurdles since both directions are based on quite different ways of education. Often even the “language” can vary from discipline to discipline.

  16. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, to analyze single and multiple fault scenarios and to automatically generate parity relations for diagnosis for the system in normal and impaired conditions. User interface and algorithmic details are presented.

  17. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  18. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  19. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  20. Computational tool for postoperative evaluation of cochlear implant patients

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Pavan, Ana Luiza M.; Pina, Diana R. de; Altemani, Joao M.C.; Castilho, Arthur M.

    2016-01-01

    The aim of this study was to develop a tool to calculate the insertion depth angle of cochlear implants from computed tomography exams. The tool uses different image processing techniques, such as thresholding and active contours. We then compared the average insertion depth angles of implants from three different manufacturers. In the future, the developed tool can be used to compare the insertion depth angle of the cochlear implant with the patient's postoperative hearing response. (author)
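
    For illustration only: once the electrode contacts have been segmented, the insertion depth angle is essentially the cumulative angle swept by the electrode trajectory around the cochlear centre, measured from the most basal contact. The sketch below accumulates that angle from 2-D contact coordinates; the coordinates and the choice of centre are hypothetical and are not taken from the tool described above.

```python
import numpy as np

def insertion_depth_angle(contacts_xy, center_xy):
    """Cumulative angle (degrees) swept by the contact trajectory around
    center_xy, from the first (most basal) to the last (most apical) contact."""
    pts = np.asarray(contacts_xy, dtype=float) - np.asarray(center_xy, dtype=float)
    angles = np.unwrap(np.arctan2(pts[:, 1], pts[:, 0]))
    return abs(np.degrees(angles[-1] - angles[0]))

# Hypothetical segmented contact positions (mm) spiralling around the origin,
# standing in for points extracted by thresholding/active-contour segmentation.
theta = np.linspace(0.1, 2.5 * np.pi, 12)       # sweep of roughly 445 degrees
radius = np.linspace(4.0, 1.5, theta.size)
contacts = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

angle = insertion_depth_angle(contacts, (0.0, 0.0))
print(f"estimated insertion depth angle: {angle:.0f} degrees")
```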

  1. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  2. Methods of physical experiment and installation automation on the base of computers

    International Nuclear Information System (INIS)

    Stupin, Yu.V.

    1983-01-01

    Peculiarities of using computers for the automation of physical experiments and installations are considered. Systems for data acquisition and processing based on microprocessors, micro- and mini-computers, CAMAC equipment and real-time operating systems are described, as well as systems intended for the automation of physical experiments on accelerators, laser thermonuclear fusion installations and installations for plasma investigation. The problems of multi-machine complexes and multi-user systems, the development of automated systems for collective use, the arrangement of inter-machine data exchange and the management of experimental databases are discussed. Data on software systems used for complex experimental data processing are presented. It is concluded that the application of new computers, combined with the new possibilities that universal operating systems provide to users, substantially increases the efficiency of a scientist's work.

  3. An automated magnetic tape vault at CERN computer center

    CERN Multimedia

    Claudia Marcelloni

    2008-01-01

    In the rapidly changing data processing landscape, the underlying long-term storage technology remains the tried and tested magnetic tape. This robust and mature technology is used to store the complete LHC data set, from which a fraction of the data is copied to overlying disk caches for fast and widespread access. The handling of the magnetic tape cartridges is now fully automated, as they are racked in vaults where they are moved between the storage shelves and the tape drives by robotic arms.

  4. Automated insertion of sequences into a ribosomal RNA alignment: An application of computational linguistics in molecular biology

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Ronald C. [Case Western Reserve Univ., Cleveland, OH (United States)

    1991-11-01

    This thesis involved the construction of (1) a grammar that incorporates knowledge on base invariancy and secondary structure in a molecule and (2) a parser engine that uses the grammar to position bases into the structural subunits of the molecule. These concepts were combined with a novel pinning technique to form a tool that semi-automates insertion of a new species into the alignment for the 16S rRNA molecule (a component of the ribosome) maintained by Dr. Carl Woese's group at the University of Illinois at Urbana. The tool was tested on species extracted from the alignment and on a group of entirely new species. The results were very encouraging, and the tool should be a substantial aid to the curators of the 16S alignment. The construction of the grammar was itself automated, allowing application of the tool to alignments for other molecules. The logic programming language Prolog was used to construct all programs involved. The computational linguistics approach used here was found to be a useful way to attack the problem of insertion into an alignment.

  5. Automated insertion of sequences into a ribosomal RNA alignment: An application of computational linguistics in molecular biology

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, R.C.

    1991-11-01

    This thesis involved the construction of (1) a grammar that incorporates knowledge on base invariancy and secondary structure in a molecule and (2) a parser engine that uses the grammar to position bases into the structural subunits of the molecule. These concepts were combined with a novel pinning technique to form a tool that semi-automates insertion of a new species into the alignment for the 16S rRNA molecule (a component of the ribosome) maintained by Dr. Carl Woese's group at the University of Illinois at Urbana. The tool was tested on species extracted from the alignment and on a group of entirely new species. The results were very encouraging, and the tool should be a substantial aid to the curators of the 16S alignment. The construction of the grammar was itself automated, allowing application of the tool to alignments for other molecules. The logic programming language Prolog was used to construct all programs involved. The computational linguistics approach used here was found to be a useful way to attack the problem of insertion into an alignment.

  6. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  7. New tools to aid in scientific computing and visualization

    International Nuclear Information System (INIS)

    Wallace, M.G.; Christian-Frear, T.L.

    1992-01-01

    In this paper, two computer programs are described which aid in the pre- and post-processing of computer generated data. CoMeT (Computational Mechanics Toolkit) is a customizable, interactive, graphical, menu-driven program that provides the analyst with a consistent user-friendly interface to analysis codes. Trans Vol (Transparent Volume Visualization) is a specialized tool for the scientific three-dimensional visualization of complex solids by the technique of volume rendering. Both tools are described in basic detail along with an application example concerning the simulation of contaminant migration from an underground nuclear repository

  8. ClassyFire: automated chemical classification with a comprehensive, computable taxonomy.

    Science.gov (United States)

    Djoumbou Feunang, Yannick; Eisner, Roman; Knox, Craig; Chepelev, Leonid; Hastings, Janna; Owen, Gareth; Fahy, Eoin; Steinbeck, Christoph; Subramanian, Shankar; Bolton, Evan; Greiner, Russell; Wishart, David S

    2016-01-01

    Scientists have long been driven by the desire to describe, organize, classify, and compare objects using taxonomies and/or ontologies. In contrast to biology, geology, and many other scientific disciplines, the world of chemistry still lacks a standardized chemical ontology or taxonomy. Several attempts at chemical classification have been made, but they have mostly been limited to either manual or semi-automated proof-of-principle applications. This is regrettable as comprehensive chemical classification and description tools could not only improve our understanding of chemistry but also improve the linkage between chemistry and many other fields. For instance, the chemical classification of a compound could help predict its metabolic fate in humans, its druggability, or potential hazards associated with it, among others. However, the sheer number (tens of millions of compounds) and complexity of chemical structures are such that any manual classification effort would prove to be near impossible. We have developed a comprehensive, flexible, and computable, purely structure-based chemical taxonomy (ChemOnt), along with a computer program (ClassyFire) that uses only chemical structures and structural features to automatically assign all known chemical compounds to a taxonomy consisting of >4800 different categories. This new chemical taxonomy consists of up to 11 different levels (Kingdom, SuperClass, Class, SubClass, etc.) with each of the categories defined by unambiguous, computable structural rules. Furthermore, each category is named using a consensus-based nomenclature and described (in English) based on the characteristic common structural properties of the compounds it contains. The ClassyFire webserver is freely accessible at http://classyfire.wishartlab.com/. Moreover, a Ruby API version is available at https://bitbucket.org/wishartlab/classyfire_api, which provides programmatic access to the ClassyFire server and database. ClassyFire has been used to
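
    To make the idea of "unambiguous, computable structural rules" concrete, the sketch below classifies molecules against a few invented toy categories defined by SMARTS patterns using RDKit. The categories and rules are illustrative assumptions only and have nothing to do with ClassyFire's actual ChemOnt taxonomy or API.

```python
from rdkit import Chem

# Toy categories, each defined by a single SMARTS pattern as its structural
# rule (purely illustrative; not ChemOnt categories).
RULES = {
    "Carboxylic acids": Chem.MolFromSmarts("C(=O)[OX2H1]"),
    "Primary amines":   Chem.MolFromSmarts("[NX3;H2][#6]"),
    "Benzenoids":       Chem.MolFromSmarts("c1ccccc1"),
}

def classify(smiles):
    """Return every toy category whose structural rule matches the molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return ["<invalid SMILES>"]
    hits = [name for name, pattern in RULES.items()
            if mol.HasSubstructMatch(pattern)]
    return hits or ["Unclassified by these toy rules"]

for smi in ["CC(=O)O", "NCCc1ccccc1", "C1CCCCC1"]:
    print(f"{smi:15s} -> {classify(smi)}")
```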

  9. The RABiT: a rapid automated biodosimetry tool for radiological triage. II. Technological developments.

    Science.gov (United States)

    Garty, Guy; Chen, Youhua; Turner, Helen C; Zhang, Jian; Lyulko, Oleksandra V; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Lawrence Yao, Y; Brenner, David J

    2011-08-01

    Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. The RABiT analyses fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cut-off dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. We have developed a new robotic system for lymphocyte processing, making use of an upgraded laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Parallel handling of multiple samples through the use of dedicated, purpose-built, robotics and high speed imaging allows analysis of up to 30,000 samples per day.

  10. Computer tools for systems engineering at LaRC

    Science.gov (United States)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  11. An automated A-value measurement tool for accurate cochlear duct length estimation.

    Science.gov (United States)

    Iyaniwura, John E; Elfarnawany, Mai; Ladak, Hanif M; Agrawal, Sumit K

    2018-01-22

    There has been renewed interest in the cochlear duct length (CDL) for preoperative cochlear implant electrode selection and postoperative generation of patient-specific frequency maps. The CDL can be estimated by measuring the A-value, which is defined as the length between the round window and the furthest point on the basal turn. Unfortunately, there is significant intra- and inter-observer variability when these measurements are made clinically. The objective of this study was to develop an automated A-value measurement algorithm to improve accuracy and eliminate observer variability. Clinical and micro-CT images of 20 cadaveric cochleae specimens were acquired. The micro-CT of one sample was chosen as the atlas, and A-value fiducials were placed onto that image. Image registration (rigid affine and non-rigid B-spline) was applied between the atlas and the 19 remaining clinical CT images. The registration transform was applied to the A-value fiducials, and the A-value was then automatically calculated for each specimen. High resolution micro-CT images of the same 19 specimens were used to measure the gold standard A-values for comparison against the manual and automated methods. The registration algorithm had excellent qualitative overlap between the atlas and target images. The automated method eliminated the observer variability and the systematic underestimation by experts. Manual measurement of the A-value on clinical CT had a mean error of 9.5 ± 4.3% compared to micro-CT, and this improved to an error of 2.7 ± 2.1% using the automated algorithm. Both the automated and manual methods correlated significantly with the gold standard micro-CT A-values (r = 0.70, p < ...). An automated A-value measurement tool using atlas-based registration methods was successfully developed and validated. The automated method eliminated the observer variability and improved accuracy as compared to manual measurements by experts. This open-source tool has the potential to benefit
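
    The accuracy comparison reported above reduces to a percent-error calculation of each method against the micro-CT gold standard. The sketch below shows that calculation on a handful of hypothetical A-values (in mm); the numbers are invented and are not the study's data.

```python
import numpy as np

# Hypothetical A-values (mm) for five specimens: micro-CT gold standard,
# expert manual measurement on clinical CT, and the automated atlas-based
# measurement (all values invented for illustration).
micro_ct  = np.array([9.2, 9.6, 8.9, 9.4, 9.1])
manual    = np.array([8.3, 8.8, 8.0, 8.6, 8.1])
automated = np.array([9.0, 9.5, 8.7, 9.2, 9.0])

def percent_error(measured, reference):
    """Mean and SD of the absolute percent error against the reference."""
    err = 100.0 * np.abs(measured - reference) / reference
    return err.mean(), err.std(ddof=1)

for label, values in [("manual", manual), ("automated", automated)]:
    mean, sd = percent_error(values, micro_ct)
    print(f"{label:9s}: mean error {mean:.1f}% +/- {sd:.1f}% vs micro-CT")
```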

  12. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to identification and control of dynamic processes has been discussed. The limitations of using neural networks for control purposes have been pointed out, and a different technique, evolutionary computation, has been discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods have been presented. A framework for an integrated system, using both neural networks and evolutionary computation, has been proposed to identify the process and then control the product quality, in a dynamic, multivariable system, in real-time.

  13. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    Science.gov (United States)

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  14. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    Science.gov (United States)

    Saito, Jim

    1987-01-01

    The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run to use the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools since they were not updated and are not currently active. Additionally, the current descriptions of the AED V&V tools are contained herein and provide information to augment NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100 and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  15. A New Tool for Automated Data Collection and Complete On-site Flux Data Processing for Eddy Covariance Measurements

    Science.gov (United States)

    Begashaw, I. G.; Kathilankal, J. C.; Li, J.; Beaty, K.; Ediger, K.; Forgione, A.; Fratini, G.; Johnson, D.; Velgersdyk, M.; Hupp, J. R.; Xu, L.; Burba, G. G.

    2014-12-01

    The eddy covariance method is widely used for direct measurements of turbulent exchange of gases and energy between the surface and atmosphere. In the past, raw data were collected first in the field and then processed back in the laboratory to achieve fully corrected publication-ready flux results. This post-processing consumed a significant amount of time and resources, and precluded researchers from accessing near real-time final flux results. A new automated measurement system with novel hardware and software designs was developed, tested, and deployed starting late 2013. The major advancements with this automated flux system include: 1) enabling simultaneous logging of high-frequency, three-dimensional wind speeds and multiple gas densities (CO2, H2O and CH4), low-frequency meteorological data, and site metadata through a specially designed file format; 2) conducting fully corrected, real-time on-site flux computations using conventional as well as user-specified methods, by implementing EddyPro Software on a small low-power microprocessor; 3) providing precision clock control and coordinate information for data synchronization and inter-site data comparison by incorporating a GPS and Precision Time Protocol. Along with these innovations, a data management server application was also developed to chart fully corrected real-time fluxes to assist remote system monitoring, to send e-mail alerts, and to automate data QA/QC, transfer and archiving at individual stations or on a network level. The combination of all of these functions was designed to help save a substantial amount of time and cost associated with managing a research site by eliminating post-field data processing, reducing user errors and facilitating real-time access to fully corrected flux results. The design, functionality, and test results from this new eddy covariance measurement tool will be presented.
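
    At its core, the conventional flux computation mentioned above is the covariance of vertical wind speed and scalar (gas) density over an averaging interval. The sketch below shows that basic calculation on synthetic 10 Hz data; it omits the many corrections (coordinate rotation, WPL, spectral corrections) that a tool such as EddyPro applies, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 30-minute record at 10 Hz: vertical wind w (m s-1) and CO2 mass
# density c (mg m-3), with a correlated component so the flux is non-zero.
n = 30 * 60 * 10
updraft = rng.normal(0.0, 0.3, n)
w = updraft + rng.normal(0.0, 0.1, n)
c = 700.0 - 20.0 * updraft + rng.normal(0.0, 5.0, n)

# Reynolds decomposition: subtract the block averages, then the eddy flux is
# the mean product of the fluctuations, i.e. the covariance w'c'.
w_prime = w - w.mean()
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)        # mg m-2 s-1

print(f"uncorrected CO2 flux over the averaging period: {flux:.2f} mg m-2 s-1")
```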

  16. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have a great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  17. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states of failure occurrences are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
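
    As a generic illustration of the Markovian part of such an approach (not the paper's integrated Markov-neural model), the sketch below computes the steady-state availability of a single AGV from a small continuous-time Markov chain; all states and transition rates are invented.

```python
import numpy as np

# Toy 3-state continuous-time Markov chain for one AGV:
# state 0 = operating, 1 = degraded, 2 = failed.
# Off-diagonal entries of Q are transition rates (per hour); rows sum to zero.
lam1, lam2, mu1, mu2 = 0.02, 0.05, 0.5, 0.1
Q = np.array([
    [-lam1,          lam1,         0.0],
    [  mu1, -(mu1 + lam2),        lam2],
    [  mu2,           0.0,        -mu2],
])

# Steady-state distribution pi solves pi @ Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0] + pi[1]       # AGV usable while operating or degraded
print("steady-state probabilities:", np.round(pi, 4))
print(f"long-run availability: {availability:.4f}")
```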

  18. Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System

    Science.gov (United States)

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-01-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…

  19. Computer-automated tuning of semiconductor double quantum dots into the single-electron regime

    NARCIS (Netherlands)

    Baart, T.A.; Eendebak, P.T.; Reichl, C.; Wegscheider, W.; Vandersypen, L.M.K.

    2016-01-01

    We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the

  20. Quantitative analysis of spider locomotion employing computer-automated video tracking

    DEFF Research Database (Denmark)

    Baatrup, E; Bayley, M

    1993-01-01

    The locomotor activity of adult specimens of the wolf spider Pardosa amentata was measured in an open-field setup, using computer-automated colour object video tracking. The x,y coordinates of the animal in the digitized image of the test arena were recorded three times per second during four...

  1. In-House Automation of a Small Library Using a Mainframe Computer.

    Science.gov (United States)

    Waranius, Frances B.; Tellier, Stephen H.

    1986-01-01

    An automated library routine management system was developed in-house to create a system unique to the Library and Information Center, Lunar and Planetary Institute, Houston, Texas. A modular approach was used to allow continuity in operations and services as the system was implemented. Acronyms, computer accounts, and file names are appended.

  2. Decreasing Transition Times in Elementary School Classrooms: Using Computer-Assisted Instruction to Automate Intervention Components

    Science.gov (United States)

    Hine, Jeffrey F.; Ardoin, Scott P.; Foster, Tori E.

    2015-01-01

    Research suggests that students spend a substantial amount of time transitioning between classroom activities, which may reduce time spent academically engaged. This study used an ABAB design to evaluate the effects of a computer-assisted intervention that automated intervention components previously shown to decrease transition times. We examined…

  3. Computer software configuration management plan for the Honeywell modular automation system

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software management plan for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This type of system will be used to control new thermal stabilization furnaces, a vertical denitrator calciner, and a pyrolysis furnace

  4. Philosophy of a computer-automated counting system

    International Nuclear Information System (INIS)

    Perry, D.G.; Giesler, G.C.

    1979-01-01

    The LAMPF Nuclear Chemistry computer system is designed to provide both real-time control of data acquisition and facilities for data processing for a large variety of users. It is a PDP-11/34 connected to a parallel CAMAC branch highway as well as a large variety of peripherals. The philosophy for the design of this system is discussed; such points as use of the computer for control only versus direct data acquisition by the computer, why a CAMAC system was chosen, and the advantages and disadvantages of this system are covered. Also discussed are future expansion of the system and what might be done differently if the system were redesigned. 3 figures

  5. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (UNIX VERSION)

    Science.gov (United States)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input

  6. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (DEC VAX VERSION)

    Science.gov (United States)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input

  7. A computer tool to support in design of industrial Ethernet.

    Science.gov (United States)

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues

    2009-04-01

    This paper presents a computer tool to support the design and development of an industrial Ethernet network. The tool verifies the physical layer (cable resistance and capacitance, scan time, network power supply including the Power over Ethernet (PoE) concept, and wireless links) and the occupation rate (the amount of information transmitted on the network versus the controller network scan time). These functions are accomplished without a single physical element installed in the network, using simulation only. The tool's software presents a detailed view of the network to the user, points out possible problems in the network, and provides an extremely friendly environment.
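
    A back-of-envelope version of the occupation-rate check described above is sketched below: the cyclic traffic generated per controller scan is compared against the link capacity available in that scan period. The device count, payload size, scan time and link speed are illustrative assumptions, not values from the paper.

```python
# Rough occupation-rate estimate for cyclic industrial Ethernet traffic:
# fraction of the link capacity per scan period consumed by the devices.
ETHERNET_OVERHEAD_BYTES = 38     # preamble + MAC header + CRC + interframe gap

def occupation_rate(devices, payload_bytes, scan_time_s, link_bps=100e6):
    bits_per_scan = devices * (payload_bytes + ETHERNET_OVERHEAD_BYTES) * 8
    return bits_per_scan / (link_bps * scan_time_s)

rate = occupation_rate(devices=60, payload_bytes=100, scan_time_s=0.010)
print(f"estimated occupation rate: {rate:.1%} of a 100 Mbit/s link per 10 ms scan")
```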

  8. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  9. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Entezari, Pegah; Kino, Aya; Honarmand, Amir R.; Galizia, Mauricio S.; Yang, Yan; Collins, Jeremy; Yaghmai, Vahid; Carr, James C.

    2013-01-01

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to the intraclass correlation coefficient (ICC) and Bland–Altman plots. The number of cases requiring manual contouring or center line adjustment for the semi-automated method and the post-processing time for each method were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of the center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is

  10. Evaluation of a semi-automated computer algorithm for measuring total fat and visceral fat content in lambs undergoing in vivo whole body computed tomography.

    Science.gov (United States)

    Rosenblatt, Alana J; Scrivani, Peter V; Boisclair, Yves R; Reeves, Anthony P; Ramos-Nieves, Jose M; Xie, Yiting; Erb, Hollis N

    2017-10-01

    Computed tomography (CT) is a suitable tool for measuring body fat, since it is non-destructive and can be used to differentiate metabolically active visceral fat from total body fat. Whole body analysis of body fat is likely to be more accurate than single CT slice estimates of body fat. The aim of this study was to assess the agreement between semi-automated computer analysis of whole body volumetric CT data and conventional proximate (chemical) analysis of body fat in lambs. Data were collected prospectively from 12 lambs that underwent duplicate whole body CT, followed by slaughter and carcass analysis by dissection and chemical analysis. Agreement between methods for quantification of total and visceral fat was assessed by Bland-Altman plot analysis. The repeatability of CT was assessed for these measures using the mean difference of duplicated measures. When compared to chemical analysis, CT systematically underestimated total and visceral fat contents by more than 10% of the mean fat weight. Therefore, carcass analysis and semi-automated CT computer measurements were not interchangeable for quantifying body fat content without the use of a correction factor. CT acquisition was repeatable, with a mean difference of repeated measures being close to zero. Therefore, uncorrected whole body CT might have an application for assessment of relative changes in fat content, especially in growing lambs.
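
    The Bland-Altman agreement analysis used in studies like this reduces to the mean difference (bias) between paired measurements and its 95% limits of agreement. The sketch below computes those quantities for a few hypothetical paired total-fat values; the numbers are invented, not the study's data.

```python
import numpy as np

# Hypothetical paired total-fat measurements (kg) for six lambs:
# chemical carcass analysis vs. the semi-automated whole-body CT estimate.
chemical = np.array([4.1, 5.3, 6.0, 3.8, 7.2, 5.6])
ct       = np.array([3.6, 4.7, 5.2, 3.3, 6.4, 5.0])

diff = ct - chemical
bias = diff.mean()                               # systematic difference
sd   = diff.std(ddof=1)
loa  = (bias - 1.96 * sd, bias + 1.96 * sd)      # 95% limits of agreement

print(f"bias: {bias:.2f} kg ({100 * bias / chemical.mean():.1f}% of mean)")
print(f"95% limits of agreement: {loa[0]:.2f} to {loa[1]:.2f} kg")
```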

  11. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  12. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    Science.gov (United States)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
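
    As a minimal illustration of the quantity such a tool computes, the sketch below evaluates a CLs upper limit for a single-bin counting experiment with known background and no uncertainties, using the analytic Poisson form. It is far simpler than OPTHYLIC itself, which adds systematic uncertainties, multi-channel combination and a hybrid Bayesian treatment; the background and observed counts are invented.

```python
from scipy.stats import poisson

def cls(s, b, n_obs):
    """CLs for a single-bin counting experiment without uncertainties."""
    cl_sb = poisson.cdf(n_obs, s + b)    # p-value under signal + background
    cl_b = poisson.cdf(n_obs, b)         # p-value under background only
    return cl_sb / cl_b

def upper_limit(b, n_obs, cl=0.95, step=0.01):
    """Smallest signal yield excluded at the requested confidence level."""
    s = 0.0
    while cls(s, b, n_obs) > 1.0 - cl:
        s += step
    return s

# Example: expected background of 3.0 events, 4 events observed.
print(f"observed 95% CL upper limit on s: {upper_limit(b=3.0, n_obs=4):.2f} events")
```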

  13. Use of computed tomography and automated software for quantitative analysis of the vasculature of patients with pulmonary hypertension

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Danilo Tadao; Pádua, Adriana Ignácio de; Lima Filho, Moyses Oliveira; Marin Neto, José Antonio; Elias Júnior, Jorge; Baddini-Martinez, José; Santos, Marcel Koenigkam, E-mail: danilowada@yahoo.com.br [Universidade de São Paulo (HCFMRP/USP), Ribeirão Preto, SP (Brazil). Faculdade de Medicina. Hospital das Clínicas

    2017-11-15

    Objective: To perform a quantitative analysis of the lung parenchyma and pulmonary vasculature of patients with pulmonary hypertension (PH) on computed tomography angiography (CTA) images, using automated software. Materials and Methods: We retrospectively analyzed the CTA findings and clinical records of 45 patients with PH (17 males and 28 females), in comparison with a control group of 20 healthy individuals (7 males and 13 females); the mean age differed significantly between the two groups (53 ± 14.7 vs. 35 ± 9.6 years; p = 0.0001). Results: The automated analysis showed that, in comparison with the controls, the patients with PH showed lower 10th percentile values for lung density, higher vascular volumes in the right upper lung lobe, and higher vascular volume ratios between the upper and lower lobes. In our quantitative analysis, we found no differences among the various PH subgroups. We inferred that a difference in the 10th percentile values indicates areas of hypovolaemia in patients with PH and that a difference in pulmonary vascular volumes indicates redistribution of the pulmonary vasculature and an increase in pulmonary vascular resistance. Conclusion: Automated analysis of pulmonary vessels on CTA images revealed alterations and could represent an objective diagnostic tool for the evaluation of patients with PH. (author)
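
    The two quantitative metrics described (the 10th percentile of lung density and the upper-to-lower-lobe vascular volume ratio) reduce to simple operations once the segmentations exist. The sketch below computes both from synthetic stand-in data; the voxel values and per-lobe volumes are invented and do not come from the study's software.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for segmented results: lung-parenchyma voxel densities
# (in HU) and per-lobe vascular volumes (in mL) as an automated tool might
# report them (RUL/RML/RLL = right lobes, LUL/LLL = left lobes).
lung_hu = rng.normal(-850, 60, size=200_000)
vascular_volume_ml = {"RUL": 48.0, "RML": 20.0, "RLL": 61.0,
                      "LUL": 45.0, "LLL": 57.0}

p10 = np.percentile(lung_hu, 10)                   # 10th percentile density
upper = vascular_volume_ml["RUL"] + vascular_volume_ml["LUL"]
lower = vascular_volume_ml["RLL"] + vascular_volume_ml["LLL"]
ratio = upper / lower                              # upper/lower redistribution

print(f"10th percentile lung density: {p10:.0f} HU")
print(f"upper/lower-lobe vascular volume ratio: {ratio:.2f}")
```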

  14. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program
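
    Conceptually, a DRR is a set of line integrals of X-ray attenuation through the CT volume, one per detector pixel. The sketch below shows the simplest parallel-ray version on a synthetic volume; it ignores the divergent source geometry, the penumbra model and the CT-number editing described above, and the attenuation coefficient is an assumed nominal value.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic CT volume in Hounsfield units: noisy water background (HU ~ 0)
# with a denser cubic insert in the middle.
ct = np.zeros((64, 64, 64))
ct[24:40, 24:40, 24:40] = 800.0
ct += rng.normal(0.0, 20.0, ct.shape)

MU_WATER = 0.019      # assumed linear attenuation coefficient of water, mm^-1
VOXEL_MM = 1.0

# HU -> linear attenuation, then integrate along the beam axis (parallel rays)
# and convert the path integral to a transmitted fraction per detector pixel.
mu = np.clip(MU_WATER * (1.0 + ct / 1000.0), 0.0, None)
path_integral = mu.sum(axis=0) * VOXEL_MM
drr = np.exp(-path_integral)

print("DRR image shape:", drr.shape)
print(f"transmission behind the insert vs. background: "
      f"{drr[32, 32]:.3f} vs. {drr[5, 5]:.3f}")
```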

  15. Automating Commercial Video Game Development using Computational Intelligence

    OpenAIRE

    Tse G. Tan; Jason Teo; Patricia Anthony

    2011-01-01

    Problem statement: The retail sales of computer and video games have grown enormously during the last few years, not just in the United States (US), but all over the world. This is the reason a lot of game developers and academic researchers have focused on game-related technologies, such as graphics, audio, physics and Artificial Intelligence (AI), with the goal of creating newer and more fun games. In recent years, there has been an increasing interest in game AI for pro...

  16. Computer Vision Research and Its Applications to Automated Cartography

    Science.gov (United States)

    1984-09-01

    Imaging Geometry from a Camera Transformation Matrix. Many scene analysis algorithms require knowledge of the geometry of the image formation process as a... to compute the imaging geometry directly from the constraints provided by the known data points. Partial information such as the camera's focal length...

  17. A fascinating country in the world of computing your guide to automated reasoning

    CERN Document Server

    Wos, Larry

    1999-01-01

    This book shows you - through examples and puzzles and intriguing questions - how to make your computer reason logically. To help you, the book includes a CD-ROM with OTTER, the world's most powerful general-purpose reasoning program. The automation of reasoning has advanced markedly in the past few decades, and this book discusses some of the remarkable successes that automated reasoning programs have had in tackling challenging problems in mathematics, logic, program verification, and circuit design. Because the intended audience includes students and teachers, the book provides many exercis

  18. On Computational Fluid Dynamics Tools in Architectural Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Hougaard, Mads; Stærdahl, Jesper Winther

    engineering computational fluid dynamics (CFD) simulation program ANSYS CFX and a CFD based representative program RealFlow are investigated. These two programs represent two types of CFD based tools available for use during phases of an architectural design process. However, as outlined in two case studies...

  19. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  20. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  1. Cloud Computing as a Tool for Improving Business Competitiveness

    Directory of Open Access Journals (Sweden)

    Wišniewski Michał

    2014-08-01

    Full Text Available This article organizes knowledge on cloud computing, presenting the classification of deployment models, characteristics and service models. The author, looking at the problem from the entrepreneur’s perspective, draws attention to the differences in the benefits depending on the cloud computing deployment model and considers an effective way of selecting cloud computing services according to the specific characteristics of an organization. Within this work, the thesis that, in economic terms, cloud computing is not always the best solution for an organization was considered. This raises the question, “What kind of tools should be used to estimate the usefulness of the cloud computing service model in the enterprise?”

  2. Computer-Automated Evolution of Spacecraft X-Band Antennas

    Science.gov (United States)

    Lohn, Jason D.; Homby, Gregory S.; Linden, Derek S.

    2010-01-01

    A document discusses the use of computer-aided evolution in arriving at a design for X-band communication antennas for NASA's three Space Technology 5 (ST5) satellites, which were launched on March 22, 2006. Two evolutionary algorithms, incorporating different representations of the antenna design and different fitness functions, were used to automatically design and optimize an X-band antenna. A set of antenna designs satisfying the initial ST5 mission requirements was evolved by use of these algorithms. The two best antennas - one from each evolutionary algorithm - were built. During flight-qualification testing of these antennas, the mission requirements were changed. After minimal changes in the evolutionary algorithms - mostly in the fitness functions - new antenna designs satisfying the changed mission requirements were evolved; within one month of this change, two new antennas were designed, and prototypes of the antennas were built and tested. One of these newly evolved antennas was approved for deployment on the ST5 mission, and flight-qualified versions of this design were built and installed on the spacecraft. At the time of writing the document, these antennas were the first computer-evolved hardware in outer space.
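
    For readers unfamiliar with the approach, the sketch below shows the general shape of an evolutionary optimization loop of the kind described: a population of candidate parameter vectors is repeatedly scored, selected, recombined and mutated. The fitness function here is a stand-in toy; the actual work evolved antenna geometries and scored them with electromagnetic simulations against mission requirements.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in design problem: evolve 8 real-valued "antenna parameters" x that
# maximise a toy fitness (closeness to a hidden target vector).
TARGET = rng.uniform(-1.0, 1.0, size=8)

def fitness(x):
    return -np.sum((x - TARGET) ** 2)

pop = rng.uniform(-1.0, 1.0, size=(40, 8))            # initial random population
for generation in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]            # keep the best half
    # Crossover: average two randomly chosen parents; mutation: Gaussian noise.
    idx = rng.integers(0, len(parents), size=(40, 2))
    pop = parents[idx].mean(axis=1) + rng.normal(0.0, 0.05, size=(40, 8))

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(f"best fitness after 200 generations: {fitness(best):.4f}")
```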

  3. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards.
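
    As one concrete example of a calculation such tools automate, the sketch below computes nonoverlap of all pairs (NAP), a common single-case effect-size metric, for an invented AB phase comparison. It is offered purely as an illustration of SCD effect-size computation, not as a reproduction of any specific WWC-aligned tool, and the data are made up.

```python
from itertools import product

def nonoverlap_of_all_pairs(baseline, treatment):
    """NAP: share of all baseline/treatment pairs in which the treatment value
    exceeds the baseline value (ties count half). Values near 1.0 indicate a
    strong effect in the direction of improvement."""
    pairs = list(product(baseline, treatment))
    wins = sum(1.0 for a, b in pairs if b > a)
    ties = sum(0.5 for a, b in pairs if b == a)
    return (wins + ties) / len(pairs)

# Invented AB-phase data for one participant (e.g. words read correctly per
# minute), where higher values indicate improvement.
baseline = [10, 12, 14, 11, 13]
treatment = [13, 15, 14, 16, 17, 15]

print(f"NAP = {nonoverlap_of_all_pairs(baseline, treatment):.2f}")
```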

  4. Computer programing for geosciences: Teach your students how to make tools

    Science.gov (United States)

    Grapenthin, Ronni

    2011-12-01

    When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programing concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.

  5. Computer automation for protection factor calculations of buildings

    International Nuclear Information System (INIS)

    Farafat, M.A.Z.; Madian, A.H.

    2011-01-01

    The protection factors of buildings differ according to their constructional and architectural specifications. The UK and the USA developed a manual method for calculating the protection factor of any building, i.e. the degree to which it shields its occupants from gamma rays and fall-out. The manual calculation method is complex and difficult to use, so the researchers have simplified it into a proposed form that is easier to understand and apply. The researchers have also designed a computer program, in Visual Basic, to calculate the different protection factors for buildings. The program aims to reduce the time required to calculate the protection of the spaces in any building: the user enters the building's specification data, and the program computes the protection factor in a very short time, saving effort and time in comparison with the manual calculation.
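
    A minimal sketch of the general idea is given below, assuming a simple exponential barrier-attenuation model with placeholder coefficients; it is not the UK/USA manual method, nor the Visual Basic program described above.

        import math

        # Hypothetical linear attenuation coefficients (1/cm); placeholders only.
        BARRIER_MU = {"concrete": 0.15, "brick": 0.12, "wood": 0.03}

        def protection_factor(barriers):
            """barriers: list of (material, thickness_cm) tuples between source and occupant.
            The protection factor is the ratio of the unshielded dose rate to the
            dose rate behind all barriers."""
            transmission = 1.0
            for material, thickness in barriers:
                transmission *= math.exp(-BARRIER_MU[material] * thickness)
            return 1.0 / transmission

        # A room behind 20 cm of concrete and 10 cm of brick (illustrative only).
        print(round(protection_factor([("concrete", 20), ("brick", 10)]), 1))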

  6. Computer program CDCID: an automated quality control program using CDC update

    International Nuclear Information System (INIS)

    Singer, G.L.; Aguilar, F.

    1984-04-01

    A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program

  7. Emergency healthcare process automation using mobile computing and cloud services.

    Science.gov (United States)

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2012-10-01

    Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time raises new challenges, including the specification of a common information format, the interoperability among heterogeneous institutional information systems, and the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case.

  8. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and in the design of power system equipment. Several design example calculations are carried out using engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  9. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and in the design of power system equipment. Several design example calculations are carried out using engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  10. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Directory of Open Access Journals (Sweden)

    Danli Wang

    2014-01-01

    Full Text Available Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity.

  11. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  12. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    CERN Document Server

    Manassah, Jamal T

    2013-01-01

    Ideal for use as a short-course textbook and for self-study, Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB is accessible after just one semester of calculus and introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  13. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Full Text Available Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.
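
    As a rough illustration of intensity-profile analysis (not OpenComet's actual head-finding algorithm), the toy sketch below splits a 1-D comet profile at the brightest column's half-intensity point and reports two common comet metrics; the profile values are synthetic.

        import numpy as np

        def tail_metrics(profile):
            """Toy comet metrics from a 1-D intensity profile along the comet axis.
            The head is approximated as the brightest column plus the columns to its
            right that stay above half the peak; everything further right is the tail."""
            profile = np.asarray(profile, dtype=float)
            peak = int(np.argmax(profile))
            half = profile[peak] / 2.0
            right = peak
            while right + 1 < profile.size and profile[right + 1] >= half:
                right += 1
            tail = profile[right + 1:]
            tail_dna = tail.sum() / profile.sum() * 100.0      # % DNA in tail
            tail_moment = tail_dna * tail.size / 100.0         # % DNA x tail length
            return tail_dna, tail_moment

        profile = [2, 5, 40, 80, 75, 30, 12, 8, 5, 3, 2, 1]    # synthetic comet profile
        print(tail_metrics(profile))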

  14. Automated quantitative coronary computed tomography correlates of myocardial ischaemia on gated myocardial perfusion SPECT

    International Nuclear Information System (INIS)

    Graaf, Michiel A. de; Boogers, Mark J.; Veltman, Caroline E.; El-Naggar, Heba M.; Bax, Jeroen J.; Delgado, Victoria; Broersen, Alexander; Kitslaar, Pieter H.; Dijkstra, Jouke; Kroft, Lucia J.; Younis, Imad Al; Reiber, Johan H.; Scholte, Arthur J.

    2013-01-01

    Automated software tools have permitted more comprehensive, robust and reproducible quantification of coronary stenosis, plaque burden and plaque location from coronary computed tomography angiography (CTA) data. The association between these quantitative CTA (QCT) parameters and the presence of myocardial ischaemia has not been explored. The aim of the present investigation was to evaluate the association between QCT parameters of coronary artery lesions and the presence of myocardial ischaemia on gated myocardial perfusion single-photon emission CT (SPECT). Included in the study were 40 patients (mean age 58.2 ± 10.9 years, 27 men) with known or suspected coronary artery disease (CAD) who had undergone multidetector row CTA and gated myocardial perfusion SPECT within 6 months. From the CTA datasets, vessel-based and lesion-based visual analyses were performed. Consecutively, lesion-based QCT was performed to assess plaque length, plaque burden, percentage lumen area stenosis and remodelling index. Subsequently, the presence of myocardial ischaemia was assessed using the summed difference score (SDS ≥2) on gated myocardial perfusion SPECT. Myocardial ischaemia was seen in 25 patients (62.5 %) in 37 vascular territories. Quantitatively assessed significant stenosis and quantitatively assessed lesion length were independently associated with myocardial ischaemia (OR 7.72, 95 % CI 2.41-24.7). The addition of quantitatively assessed significant stenosis (χ2 = 20.7) and lesion length (χ2 = 26.0) to the clinical variables and the visual assessment (χ2 = 5.9) had incremental value in the association with myocardial ischaemia. Coronary lesion length and quantitatively assessed significant stenosis were independently associated with myocardial ischaemia. Both quantitative parameters have incremental value over baseline variables and visually assessed significant stenosis. Potentially, QCT can refine assessment of CAD, which may be of potential use for identification of patients with myocardial ischaemia. (orig.)
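
    The kind of association analysis reported above can be sketched as a logistic regression on simulated lesion data; the numbers, variable names and coefficients below are hypothetical and only show how odds ratios for stenosis and lesion length might be obtained, not the study's actual analysis.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical lesion-level data: significant stenosis (0/1), lesion length (mm)
        # and ischaemia on SPECT (SDS >= 2 coded as 1).
        rng = np.random.default_rng(0)
        n = 120
        stenosis = rng.integers(0, 2, n)
        length = rng.normal(12, 4, n)
        logit = -3 + 1.8 * stenosis + 0.15 * length
        ischaemia = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        X = sm.add_constant(np.column_stack([stenosis, length]))
        fit = sm.Logit(ischaemia, X).fit(disp=False)
        odds_ratios = np.exp(fit.params[1:])   # ORs for stenosis and lesion length
        print(dict(zip(["stenosis", "lesion_length"], odds_ratios.round(2))))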

  15. Field-programmable custom computing technology architectures, tools, and applications

    CERN Document Server

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  16. GAUFRE: A tool for an automated determination of atmospheric parameters from spectroscopy

    Directory of Open Access Journals (Sweden)

    Fossati L.

    2013-03-01

    Full Text Available We present an automated tool for measuring atmospheric parameters (Teff, log g, [Fe/H]) for F-G-K dwarf and giant stars. The tool, called GAUFRE, is composed of several routines written in C++: GAUFRE-RV measures radial velocity from spectra via cross-correlation against a synthetic template, GAUFRE-EW measures atmospheric parameters through the classic line-by-line technique and GAUFRE-CHI2 performs a χ2 fitting to a library of synthetic spectra. A set of F-G-K stars extensively studied in the literature were used as a benchmark for the program: their high signal-to-noise and high resolution spectra were analyzed using GAUFRE and the results were compared with those in the literature. The tool is also implemented in order to perform the spectral analysis after fixing the surface gravity (log g) to the accurate value provided by asteroseismology. A set of CoRoT stars, belonging to the LRc01 and LRa01 fields, was used to first test the performance and behavior of the program when using the seismic log g.
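
    A minimal sketch of the fit-to-library step follows, assuming the observed and model spectra share a wavelength grid and a single noise level; the tiny "library" and its parameter tuples are invented for illustration and do not reproduce GAUFRE's implementation.

        import numpy as np

        def chi2_best_match(observed, sigma, library):
            """Return the (Teff, logg, [Fe/H]) tuple of the library spectrum that
            minimises chi^2 = sum(((obs - model) / sigma)**2)."""
            best_params, best_chi2 = None, np.inf
            for params, model in library.items():
                chi2 = np.sum(((observed - model) / sigma) ** 2)
                if chi2 < best_chi2:
                    best_params, best_chi2 = params, chi2
            return best_params, best_chi2

        # Three made-up "synthetic spectra" and a noisy observation of the first one.
        grid = np.linspace(0.0, 1.0, 50)
        library = {(5800, 4.4, 0.0): np.sin(grid) + 1.0,
                   (5000, 3.0, -0.5): 0.8 * np.sin(grid) + 1.0,
                   (6200, 4.0, 0.2): 1.2 * np.sin(grid) + 1.0}
        observed = library[(5800, 4.4, 0.0)] + np.random.default_rng(1).normal(0, 0.02, grid.size)
        print(chi2_best_match(observed, 0.02, library))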

  17. Assessing the Validity of Automated Webcrawlers as Data Collection Tools to Investigate Online Child Sexual Exploitation.

    Science.gov (United States)

    Westlake, Bryce; Bouchard, Martin; Frank, Richard

    2017-10-01

    The distribution of child sexual exploitation (CE) material has been aided by the growth of the Internet. The graphic nature and prevalence of the material has made researching and combating difficult. Although used to study online CE distribution, automated data collection tools (e.g., webcrawlers) have yet to be shown effective at targeting only relevant data. Using CE-related image and keyword criteria, we compare networks starting from CE websites to those from similar non-CE sexuality websites and dissimilar sports websites. Our results provide evidence that (a) webcrawlers have the potential to provide valid CE data, if the appropriate criterion is selected; (b) CE distribution is still heavily image-based suggesting images as an effective criterion; (c) CE-seeded networks are more hub-based and differ from non-CE-seeded networks on several website characteristics. Recommendations for improvements to reliable criteria selection are discussed.

  18. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with the rapid cross-section generation the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
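
    A simplified sketch of the core geometric step (placing stations along a cross-section perpendicular to a centerline segment and sampling DEM elevations) is shown below; it ignores map projections and the overlap-correction algorithm, assumes the DEM origin is at (0, 0), and uses a synthetic toy DEM.

        import numpy as np

        def cross_section(dem, cell_size, p0, p1, half_width, n_stations):
            """Sample DEM elevations along a cross-section perpendicular to the
            centerline segment p0 -> p1, centred on its midpoint."""
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            mid = (p0 + p1) / 2.0
            direction = (p1 - p0) / np.linalg.norm(p1 - p0)
            normal = np.array([-direction[1], direction[0]])   # 90-degree rotation
            offsets = np.linspace(-half_width, half_width, n_stations)
            stations = mid + offsets[:, None] * normal
            cols = np.clip((stations[:, 0] / cell_size).astype(int), 0, dem.shape[1] - 1)
            rows = np.clip((stations[:, 1] / cell_size).astype(int), 0, dem.shape[0] - 1)
            return offsets, dem[rows, cols]                    # (station offset, elevation)

        # Toy 100 x 100 DEM: a V-shaped valley whose floor runs along row 50.
        dem = np.add.outer(np.abs(np.arange(100) - 50) * 0.5, np.zeros(100))
        offsets, elevations = cross_section(dem, 1.0, (10, 50), (90, 50), 20, 9)
        print(list(zip(offsets, elevations)))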

  19. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    Science.gov (United States)

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the use of the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method described, which was applied to 54 CT images coming from a sample of outpatients affected by cognitive impairment, enabled us to generate a model that overlaps the original image with quite good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. © The Author(s) 2015.

  20. Aligator: A computational tool for optimizing total chemical synthesis of large proteins.

    Science.gov (United States)

    Jacobsen, Michael T; Erickson, Patrick W; Kay, Michael S

    2017-09-15

    The scope of chemical protein synthesis (CPS) continues to expand, driven primarily by advances in chemical ligation tools (e.g., reversible solubilizing groups and novel ligation chemistries). However, the design of an optimal synthesis route can be an arduous and fickle task due to the large number of theoretically possible, and in many cases problematic, synthetic strategies. In this perspective, we highlight recent CPS tool advances and then introduce a new and easy-to-use program, Aligator (Automated Ligator), for analyzing and designing the most efficient strategies for constructing large targets using CPS. As a model set, we selected the E. coli ribosomal proteins and associated factors for computational analysis. Aligator systematically scores and ranks all feasible synthetic strategies for a particular CPS target. The Aligator script methodically evaluates potential peptide segments for a target using a scoring function that includes solubility, ligation site quality, segment lengths, and number of ligations to provide a ranked list of potential synthetic strategies. We demonstrate the utility of Aligator by analyzing three recent CPS projects from our lab: TNFα (157 aa), GroES (97 aa), and DapA (312 aa). As the limits of CPS are extended, we expect that computational tools will play an increasingly important role in the efficient execution of ambitious CPS projects such as production of a mirror-image ribosome. Copyright © 2017 Elsevier Ltd. All rights reserved.
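
    The scoring idea can be sketched as follows; the segment-length range, "good junction" residues and weights are invented placeholders, not Aligator's actual scoring function, and the 101-residue target is a toy sequence.

        IDEAL_LEN = (30, 50)                     # assumed comfortable segment length range
        GOOD_LIGATION_NTERM = {"C", "A", "S"}    # simplistic "good junction" residues

        def score_strategy(segments):
            """Score a candidate segmentation (list of peptide strings) of a target."""
            score = 0.0
            for seg in segments:
                if IDEAL_LEN[0] <= len(seg) <= IDEAL_LEN[1]:
                    score += 2.0                                   # reward comfortable lengths
                else:
                    score -= abs(len(seg) - sum(IDEAL_LEN) / 2) / 10.0
            for nxt in segments[1:]:
                score += 1.0 if nxt[0] in GOOD_LIGATION_NTERM else -1.5   # junction quality
            score -= 0.5 * (len(segments) - 1)                     # every ligation costs something
            return score

        target = "M" + "ACDEFGHIKLMNPQRSTVWY" * 5                  # 101-residue toy target
        strategies = [[target[:35], target[35:70], target[70:]],
                      [target[:50], target[50:]]]
        for s in strategies:
            print(len(s), "segments ->", round(score_strategy(s), 2))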

  1. Advanced digital computers, controls, and automation technologies for power plants: Proceedings

    International Nuclear Information System (INIS)

    Bhatt, S.C.

    1992-08-01

    This document is a compilation of the papers that were presented at an EPRI workshop on Advances in Computers, Controls, and Automation Technologies for Power Plants. The workshop, sponsored by EPRI's Nuclear Power Division, took place in February 1992. It was attended by 157 representatives from electric utilities, equipment manufacturers, engineering consulting organizations, universities, national laboratories, government agencies and international utilities. More than 40% of the attendees were from utilities, the largest single group; about 30% were from equipment manufacturers and engineering consulting organizations, and participants from government agencies, universities, and national laboratories accounted for roughly 10% each. The workshop included a keynote address, 35 technical papers, and vendors' equipment demonstrations. The technical papers described the state of the art in recent utility digital upgrades such as digital feedwater controllers, steam generator level controllers, integrated plant computer systems, computer-aided diagnostics, automated testing and surveillance, and other applications. A group of technical papers presented the ongoing B&W PWR integrated plant control system prototype developments with the triple redundant advanced digital control system. Several international papers from France, Japan and the U.K. presented their programs on advanced power plant design and applications. Significant advances in control and automation technologies such as adaptive controls, self-tuning methods, neural networks and expert systems were presented by developers, universities, and national laboratories. Individual papers are indexed separately.

  2. Understanding organometallic reaction mechanisms and catalysis: computational and experimental tools

    CERN Document Server

    Ananikov, Valentin P

    2014-01-01

    Exploring and highlighting the new horizons in the studies of reaction mechanisms that open joint application of experimental studies and theoretical calculations is the goal of this book. The latest insights and developments in the mechanistic studies of organometallic reactions and catalytic processes are presented and reviewed. The book adopts a unique approach, exemplifying how to use experiments, spectroscopy measurements, and computational methods to reveal reaction pathways and molecular structures of catalysts, rather than concentrating solely on one discipline. The result is a deeper

  3. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    Science.gov (United States)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  4. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  5. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    Science.gov (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time-series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for
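
    A minimal sketch of the dimensionality idea follows: the effective number of principal components of a channels-by-samples window rises when a malfunctioning channel injects energy that is incoherent with the rest of the array. The simulated data and the 95% energy threshold are illustrative assumptions, not the authors' processing parameters.

        import numpy as np

        def effective_dimension(window, energy=0.95):
            """Number of principal components needed to capture `energy` of the
            variance in a (channels x samples) data window."""
            centered = window - window.mean(axis=1, keepdims=True)
            s = np.linalg.svd(centered, compute_uv=False)
            var = s ** 2 / (s ** 2).sum()
            return int(np.searchsorted(np.cumsum(var), energy) + 1)

        rng = np.random.default_rng(2)
        t = np.linspace(0, 10, 2000)
        coherent = np.sin(2 * np.pi * 1.5 * t)                    # signal seen across the array
        healthy = coherent + 0.1 * rng.normal(size=(9, t.size))   # nine well-behaved channels
        quiet = 0.1 * rng.normal(size=t.size)                     # a tenth, noisy-but-sane channel
        bad = 5.0 * rng.normal(size=t.size)                       # a malfunctioning channel

        print(effective_dimension(np.vstack([healthy, quiet])))   # low dimension
        print(effective_dimension(np.vstack([healthy, bad])))     # higher dimension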

  6. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    Science.gov (United States)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating the problem solving. Advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control of the systems in parallel mode with various degrees of detailed elaboration.

  7. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Directory of Open Access Journals (Sweden)

    Nathan Gould

    2014-10-01

    Full Text Available Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de-novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
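
    The first-generation codon-usage objective mentioned above can be illustrated with a naive back-translator that always picks a single preferred codon per residue; the small usage table is a hypothetical excerpt, not a complete organism table, and real tools balance this against many other objectives.

        # Assumed host-preferred codons (illustrative excerpt only).
        PREFERRED_CODON = {
            "M": "ATG", "K": "AAA", "L": "CTG", "S": "AGC",
            "T": "ACC", "G": "GGC", "A": "GCG", "*": "TAA",
        }

        def naive_codon_optimize(protein):
            """Back-translate a protein into DNA using one preferred codon per residue."""
            try:
                return "".join(PREFERRED_CODON[aa] for aa in protein)
            except KeyError as missing:
                raise ValueError(f"No codon preference listed for residue {missing}") from None

        print(naive_codon_optimize("MKLSTGA*"))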

  8. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Energy Technology Data Exchange (ETDEWEB)

    Gould, Nathan [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States); Hendy, Oliver [Department of Biology, The College of New Jersey, Ewing, NJ (United States); Papamichail, Dimitris, E-mail: papamicd@tcnj.edu [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States)

    2014-10-06

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  9. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    International Nuclear Information System (INIS)

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  10. Natural language processing tools for computer assisted language learning

    Directory of Open Access Journals (Sweden)

    Vandeventer Faltin, Anne

    2003-01-01

    Full Text Available This paper illustrates the usefulness of natural language processing (NLP tools for computer assisted language learning (CALL through the presentation of three NLP tools integrated within a CALL software for French. These tools are (i a sentence structure viewer; (ii an error diagnosis system; and (iii a conjugation tool. The sentence structure viewer helps language learners grasp the structure of a sentence, by providing lexical and grammatical information. This information is derived from a deep syntactic analysis. Two different outputs are presented. The error diagnosis system is composed of a spell checker, a grammar checker, and a coherence checker. The spell checker makes use of alpha-codes, phonological reinterpretation, and some ad hoc rules to provide correction proposals. The grammar checker employs constraint relaxation and phonological reinterpretation as diagnosis techniques. The coherence checker compares the underlying "semantic" structures of a stored answer and of the learners' input to detect semantic discrepancies. The conjugation tool is a resource with enhanced capabilities when put on an electronic format, enabling searches from inflected and ambiguous verb forms.

  11. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle, with the output of evaluation fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  12. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    International Nuclear Information System (INIS)

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy

  13. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California 92093 (United States); Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504 (Greece); McNutt, Todd R. [Department of Radiation Oncology and Molecular Radiation Science, School of Medicine, Johns Hopkins University, Baltimore, Maryland 21231 (United States); Mutic, Sasa [Department of Radiation Oncology, Washington University in St. Louis, St. Louis, Missouri 63110 (United States)

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  14. Vision 20/20: Automation and advanced computing in clinical radiation oncology.

    Science.gov (United States)

    Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  15. TAPS: an automated tool for identification of skills, knowledges, and abilities using natural language task description

    International Nuclear Information System (INIS)

    Jorgensen, C.C.; Carter, R.J.

    1986-01-01

    A prototype, computer-based tool (TAPS) has been developed to aid training system developers in identifying skills, knowledges, and abilities (SKAs) during task analysis. TAPS uses concepts of flexible pattern matching to evaluate English descriptions of job behaviors and to recode them as SKA lists. This paper addresses the rationale for TAPS and describes its design including SKA definitions and task analysis logic. It also presents examples of TAPS's application

  16. TAPS: an automated tool for identification of skills, knowledges, and abilities using natural language task description

    Energy Technology Data Exchange (ETDEWEB)

    Jorgensen, C.C.; Carter, R.J.

    1986-01-01

    A prototype, computer-based tool (TAPS) has been developed to aid training system developers in identifying skills, knowledges, and abilities (SKAs) during task analysis. TAPS uses concepts of flexible pattern matching to evaluate English descriptions of job behaviors and to recode them as SKA lists. This paper addresses the rationale for TAPS and describes its design including SKA definitions and task analysis logic. It also presents examples of TAPS's application.

  17. TAPS - An automated tool for identification of skills, knowledges, and abilities using natural language task description

    International Nuclear Information System (INIS)

    Jorgensen, C.C.; Carter, R.J.

    1986-01-01

    A prototype, computer-based tool (TAPS) has been developed to aid training system developers in identifying skills, knowledges, and abilities (SKAs) during task analysis. TAPS uses concepts of flexible pattern matching to evaluate English descriptions of job behaviors and to recode them as SKA lists. This paper addresses the rationale for TAPS and describes its design including SKA definitions and task analysis logic. It also presents examples of TAPS's application

  18. How do Air Traffic Controllers Use Automation and Tools Differently During High Demand Situations?

    Science.gov (United States)

    Kraut, Joshua M.; Mercer, Joey; Morey, Susan; Homola, Jeffrey; Gomez, Ashley; Prevot, Thomas

    2013-01-01

    In a human-in-the-loop simulation, two air traffic controllers managed identical airspace while burdened with higher than average workload, and while using advanced tools and automation designed to assist with scheduling aircraft on multiple arrival flows to a single meter fix. This paper compares the strategies employed by each controller, and investigates how the controllers' strategies change while managing their airspace under more normal workload conditions and a higher workload condition. Each controller engaged in different methods of maneuvering aircraft to arrive on schedule, and adapted their strategies to cope with the increased workload in different ways. Based on the conclusions three suggestions are made: that quickly providing air traffic controllers with recommendations and information to assist with maneuvering and scheduling aircraft when burdened with increased workload will improve the air traffic controller's effectiveness, that the tools should adapt to the strategy currently employed by a controller, and that training should emphasize which traffic management strategies are most effective given specific airspace demands.

  19. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    Science.gov (United States)

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
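
    A compact sketch of the ratio / histogram / threshold workflow on synthetic bands is given below; it is written in Python with NumPy rather than the C/XView implementation described above, and the class thresholds are arbitrary.

        import numpy as np

        def band_ratio_classes(band_a, band_b, thresholds):
            """Compute a band-ratio image, its histogram, and a thresholded class map."""
            ratio = band_a / np.maximum(band_b, 1e-6)       # guard against division by zero
            hist, edges = np.histogram(ratio, bins=32)      # inspect the spread of ratio values
            classes = np.digitize(ratio, thresholds)        # class codes 0..len(thresholds)
            return ratio, hist, classes

        rng = np.random.default_rng(3)
        band_a = rng.uniform(0.1, 1.0, (64, 64))            # synthetic reflectance bands
        band_b = rng.uniform(0.1, 1.0, (64, 64))
        ratio, hist, classes = band_ratio_classes(band_a, band_b, thresholds=[0.8, 1.2, 2.0])
        print(np.bincount(classes.ravel()))                 # pixels per threshold class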

  20. Evaluating hydrological response to forecasted land-use change—scenario testing with the automated geospatial watershed assessment (AGWA) tool

    Science.gov (United States)

    Kepner, William G.; Semmens, Darius J.; Hernandez, Mariano; Goodrich, David C.

    2009-01-01

    Envisioning and evaluating future scenarios has emerged as a critical component of both science and social decision-making. The ability to assess, report, map, and forecast the life support functions of ecosystems is absolutely critical to our capacity to make informed decisions to maintain the sustainable nature of our ecosystem services now and into the future. During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial-analysis technologies have been used to develop landscape information that can be integrated with hydrologic models to determine long-term change and make predictive inferences about the future. Two diverse case studies in northwest Oregon (Willamette River basin) and southeastern Arizona (San Pedro River) were examined in regard to future land use scenarios relative to their impact on surface water conditions (e.g., sediment yield and surface runoff) using hydrologic models associated with the Automated Geospatial Watershed Assessment (AGWA) tool. The base reference grid for land cover was modified in both study locations to reflect stakeholder preferences 20 to 60 yrs into the future, and the consequences of landscape change were evaluated relative to the selected future scenarios. The two studies provide examples of integrating hydrologic modeling with a scenario analysis framework to evaluate plausible future forecasts and to understand the potential impact of landscape change on ecosystem services.

  1. Computers and the internet: tools for youth empowerment.

    Science.gov (United States)

    Valaitis, Ruta K

    2005-10-04

    Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. The constant comparison method and between-method triangulation were used in the analysis to establish themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.

  2. Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes

    Science.gov (United States)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A and R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percent of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Here, appendixes are provided.

  3. Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results

    Science.gov (United States)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percent of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.

  4. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1990-01-01

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of Personal Computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent to symbolic programming languages but, just as significant, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production

  5. Automated segmentation of pulmonary structures in thoracic computed tomography scans: a review

    International Nuclear Information System (INIS)

    Van Rikxoort, Eva M; Van Ginneken, Bram

    2013-01-01

    Computed tomography (CT) is the modality of choice for imaging the lungs in vivo. Sub-millimeter isotropic images of the lungs can be obtained within seconds, allowing the detection of small lesions and detailed analysis of disease processes. The high resolution of thoracic CT and the high prevalence of lung diseases require a high degree of automation in the analysis pipeline. The automated segmentation of pulmonary structures in thoracic CT has been an important research topic for over a decade now. This systematic review provides an overview of current literature. We discuss segmentation methods for the lungs, the pulmonary vasculature, the airways, including airway tree construction and airway wall segmentation, the fissures, the lobes and the pulmonary segments. For each topic, the current state of the art is summarized, and topics for future research are identified. (topical review)

  6. aMCfast: automation of fast NLO computations for PDF fits

    CERN Document Server

    Bertone, Valerio; Frixione, Stefano; Rojo, Juan; Sutton, Mark

    2014-01-01

    We present the interface between MadGraph5_aMC@NLO, a self-contained program that calculates cross sections up to next-to-leading order accuracy in an automated manner, and APPLgrid, a code that parametrises such cross sections in the form of look-up tables which can be used for the fast computations needed in the context of PDF fits. The main characteristic of this interface, which we dub aMCfast, is its being fully automated as well, which removes the need to extract manually the process-specific information for additional physics processes, as is the case with other matrix element calculators, and renders it straightforward to include any new process in the PDF fits. We demonstrate this by studying several cases which are easily measured at the LHC, have a good constraining power on PDFs, and some of which were previously unavailable in the form of a fast interface.

  7. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  8. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  9. 3D data processing with advanced computer graphics tools

    Science.gov (United States)

    Zhang, Song; Ekstrand, Laura; Grieve, Taylor; Eisenmann, David J.; Chumbley, L. Scott

    2012-09-01

    Often, the 3-D raw data coming from an optical profilometer contain spiky noise and an irregular grid, which make the data difficult to analyze and, because of their enormous size, difficult to store. This paper addresses these two issues by substantially reducing the spiky noise of the 3-D raw data and by rapidly re-sampling the raw data onto regular grids at any pixel size and any orientation with advanced computer graphics tools. Experimental results are presented to demonstrate the effectiveness of the proposed approach.
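
    The two operations described above can be illustrated with a short Python sketch using a median filter for spike suppression and scattered-data interpolation for re-sampling onto a regular grid; the filter size, pixel size and synthetic surface are assumptions rather than the paper's actual settings.

```python
# Spike suppression and regular-grid re-sampling; all parameters are illustrative.
import numpy as np
from scipy.ndimage import median_filter
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Synthetic profilometer-like data: a smooth surface plus sparse spikes,
# sampled at irregular (scattered) positions.
n = 4000
x = rng.uniform(0, 10, n)
y = rng.uniform(0, 10, n)
z = np.sin(x) * np.cos(y)
spikes = rng.choice(n, size=40, replace=False)
z[spikes] += rng.normal(0, 25, size=40)          # spiky noise

# Re-sample the scattered points onto a regular grid at a chosen pixel size.
pixel = 0.1
gx, gy = np.meshgrid(np.arange(0, 10, pixel), np.arange(0, 10, pixel))
gz = griddata((x, y), z, (gx, gy), method='linear')

# Median filtering strongly attenuates isolated spikes while keeping the surface.
gz_clean = median_filter(np.nan_to_num(gz), size=5)
print("max abs value before/after filtering:",
      float(np.nanmax(np.abs(gz))), float(np.abs(gz_clean).max()))
```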

  10. Mobile Computing and Cloud maturity - Introducing Machine Learning for ERP Configuration Automation

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2013-01-01

    Nowadays the smartphone market is clearly growing due to the new types of functionality that mobile devices offer and the role they play in everyday life. Their utility and benefits rely on the applications that can be installed on the device (the so-called mobile apps). Cloud computing enhances the world of mobile applications by providing disk space and freeing the user from local storage needs, thereby providing cheaper storage, wider accessibility and greater speed for business. In this paper we introduce various aspects of mobile computing and stress the importance of achieving cloud maturity by using machine learning to automate the configuration of software applications deployed on cloud nodes, using the open source application ERP5 and SlapOS, an open source operating system for Decentralized Cloud Computing.

  11. Automated quantitative coronary computed tomography correlates of myocardial ischaemia on gated myocardial perfusion SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Graaf, Michiel A. de; Boogers, Mark J.; Veltman, Caroline E. [Leiden University Medical Center, Department of Cardiology, Leiden (Netherlands); The Interuniversity Cardiology Institute of The Netherlands, Utrecht (Netherlands); El-Naggar, Heba M.; Bax, Jeroen J.; Delgado, Victoria [Leiden University Medical Center, Department of Cardiology, Leiden (Netherlands); Broersen, Alexander; Kitslaar, Pieter H.; Dijkstra, Jouke [Leiden University Medical Center, Department of Radiology, Division of Image Processing, Leiden (Netherlands); Kroft, Lucia J. [Leiden University Medical Center, Department of Radiology, Leiden (Netherlands); Younis, Imad Al [Leiden University Medical Center, Department of Nuclear Medicine, Leiden (Netherlands); Reiber, Johan H. [Leiden University Medical Center, Department of Radiology, Division of Image Processing, Leiden (Netherlands); Medis medical imaging systems B.V., Leiden (Netherlands); Scholte, Arthur J. [Leiden University Medical Center, Department of Cardiology, Leiden (Netherlands)

    2013-08-15

    Automated software tools have permitted more comprehensive, robust and reproducible quantification of coronary stenosis, plaque burden and plaque location from coronary computed tomography angiography (CTA) data. The association between these quantitative CTA (QCT) parameters and the presence of myocardial ischaemia has not been explored. The aim of the present investigation was to evaluate the association between QCT parameters of coronary artery lesions and the presence of myocardial ischaemia on gated myocardial perfusion single-photon emission CT (SPECT). Included in the study were 40 patients (mean age 58.2 ± 10.9 years, 27 men) with known or suspected coronary artery disease (CAD) who had undergone multidetector row CTA and gated myocardial perfusion SPECT within 6 months. From the CTA datasets, vessel-based and lesion-based visual analyses were performed. Consecutively, lesion-based QCT was performed to assess plaque length, plaque burden, percentage lumen area stenosis and remodelling index. Subsequently, the presence of myocardial ischaemia was assessed using the summed difference score (SDS ≥ 2) on gated myocardial perfusion SPECT. Myocardial ischaemia was seen in 25 patients (62.5 %) in 37 vascular territories. Quantitatively assessed significant stenosis and quantitatively assessed lesion length were independently associated with myocardial ischaemia (OR 7.72, 95 % CI 2.41-24.7, p < 0.001, and OR 1.07, 95 % CI 1.00-1.45, p = 0.032, respectively) after correcting for clinical variables and visually assessed significant stenosis. The addition of quantitatively assessed significant stenosis (χ² = 20.7) and lesion length (χ² = 26.0) to the clinical variables and the visual assessment (χ² = 5.9) had incremental value in the association with myocardial ischaemia. Coronary lesion length and quantitatively assessed significant stenosis were independently associated with myocardial ischaemia, and both quantitative parameters had incremental value over clinical variables and visual assessment.

  12. Concurrent validity of an automated algorithm for computing the center of pressure excursion index (CPEI).

    Science.gov (United States)

    Diaz, Michelle A; Gibbons, Mandi W; Song, Jinsup; Hillstrom, Howard J; Choe, Kersti H; Pasquale, Maria R

    2018-01-01

    Center of Pressure Excursion Index (CPEI), a parameter computed from the distribution of plantar pressures during the stance phase of barefoot walking, has been used to assess dynamic foot function. The original custom program developed to calculate CPEI required the oversight of a user who could manually correct for certain exceptions to the computational rules. A new fully automatic program has been developed to calculate CPEI with an algorithm that accounts for these exceptions. The purpose of this paper is to compare the resulting CPEI values computed by these two programs on plantar pressure data from both asymptomatic and pathologic subjects. If comparable, the new program offers significant benefits: reduced potential for variability due to rater discretion and faster CPEI calculation. CPEI values were calculated from barefoot plantar pressure distributions during comfortably paced walking in 61 healthy asymptomatic adults, 19 diabetic adults with moderate hallux valgus, and 13 adults with mild hallux valgus. Right-foot data for each subject were analyzed with linear regression and a Bland-Altman plot. The automated algorithm yielded CPEI values that were linearly related to those of the original program (R² = 0.99), indicating close agreement between the two computation methods. The results of this analysis suggest that the new automated algorithm may be used to calculate CPEI on both healthy and pathologic feet. Copyright © 2017 Elsevier B.V. All rights reserved.
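
    The agreement analysis described above (linear regression plus a Bland-Altman plot) is straightforward to reproduce; the sketch below uses synthetic CPEI values, not the study's data, and is only meant to show the computation.

```python
# Linear regression and Bland-Altman limits of agreement on synthetic CPEI values.
import numpy as np

rng = np.random.default_rng(1)
cpei_manual = rng.normal(20, 5, 60)                    # original program (synthetic)
cpei_auto = cpei_manual + rng.normal(0.3, 0.8, 60)     # automated algorithm (synthetic)

# Linear regression (slope, intercept) and R^2.
slope, intercept = np.polyfit(cpei_manual, cpei_auto, 1)
r2 = np.corrcoef(cpei_manual, cpei_auto)[0, 1] ** 2

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = cpei_auto - cpei_manual
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

print(f"slope={slope:.3f} intercept={intercept:.3f} R^2={r2:.3f}")
print(f"bias={bias:.2f}  95% limits of agreement=({loa[0]:.2f}, {loa[1]:.2f})")
```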

  13. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  14. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2016-03-01

    Tool change is one of the most frequently performed machining processes, and improper percussion while the tool's position is changed can damage the spindle bearing. A spindle malfunction can cause problems such as a dropped tool or bias in a machined hole. The measures currently available in machine tools to avoid such issues only involve determining whether the tool-clamping state is correct, using the spindle and the air adhesion method, which is also used to satisfy the high precision required of mechanical components. These measures therefore cannot be used with every type of machine tool, and improper tapping of the spindle during an automatic tool change cannot be detected. This study therefore proposes a new type of diagnostic framework that combines cloud computing and vibration sensors, in which tool change is automatically diagnosed by an architecture that identifies abnormalities, thereby enhancing the reliability and productivity of the machine and equipment.

  15. Computer-automated tuning of semiconductor double quantum dots into the single-electron regime

    Energy Technology Data Exchange (ETDEWEB)

    Baart, T. A.; Vandersypen, L. M. K. [QuTech, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Kavli Institute of Nanoscience, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Eendebak, P. T. [QuTech, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands); Netherlands Organisation for Applied Scientific Research (TNO), P.O. Box 155, 2600 AD Delft (Netherlands); Reichl, C.; Wegscheider, W. [Solid State Physics Laboratory, ETH Zürich, 8093 Zürich (Switzerland)

    2016-05-23

    We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the double quantum dots into the single-electron regime. The algorithm only requires (1) prior knowledge of the gate design and (2) the pinch-off value of the single gate T that is shared by all the quantum dots. This work significantly alleviates the user effort required to tune multiple quantum dot devices.
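
    One building block such an auto-tuning routine relies on is estimating a pinch-off voltage from a current-versus-gate-voltage sweep. The sketch below shows a simple threshold-crossing estimate; the 5% criterion and the synthetic sweep are assumptions, and the published algorithm is considerably more involved.

```python
# Threshold-based pinch-off estimate from a gate-voltage sweep (illustrative only).
import numpy as np

def pinch_off_voltage(v_gate, current, fraction=0.05):
    """Return the gate voltage at the boundary between the pinched-off region
    (current below `fraction` of its maximum) and the open channel."""
    threshold = fraction * current.max()
    below = np.where(current < threshold)[0]
    return v_gate[below[-1]] if below.size else None

# Synthetic sweep: the current turns off smoothly around -0.8 V.
v = np.linspace(-1.5, 0.0, 301)
i_meas = 1e-9 / (1.0 + np.exp(-(v + 0.8) / 0.03))   # sigmoidal turn-off
i_meas += np.random.default_rng(2).normal(0, 1e-12, v.size)

print("estimated pinch-off voltage [V]:",
      round(float(pinch_off_voltage(v, i_meas)), 3))
```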

  16. Application of process computers for automation of power plants in Hungary

    Energy Technology Data Exchange (ETDEWEB)

    Papp, G.; Szilagyi, R.

    1982-04-01

    An automation system for normal operation and accidents is presented. In normal operation, the operators have only a supervisory function; in case of disturbances, only a minimum number of units will fail. The process computer's characteristics are: storage cycle, 750 ns; parallel system; configuration length, 12 bits; one-address binary two's-complement arithmetic; operative ferromagnetic storage, 24 K; core registers, 5. There are two peripheral disk storages with a total capacity of 6 Mbit and two floppy disk storages, each with a capacity of 800 Kbit.

  17. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  18. Software Application Profile: PHESANT: a tool for performing automated phenome scans in UK Biobank.

    Science.gov (United States)

    Millard, Louise A C; Davies, Neil M; Gaunt, Tom R; Davey Smith, George; Tilling, Kate

    2017-10-05

    Epidemiological cohorts typically contain a diverse set of phenotypes such that automation of phenome scans is non-trivial, because they require highly heterogeneous models. For this reason, phenome scans have to date tended to use a smaller homogeneous set of phenotypes that can be analysed in a consistent fashion. We present PHESANT (PHEnome Scan ANalysis Tool), a software package for performing comprehensive phenome scans in UK Biobank. PHESANT tests the association of a specified trait with all continuous, integer and categorical variables in UK Biobank, or a specified subset. PHESANT uses a novel rule-based algorithm to determine how to appropriately test each trait, then performs the analyses and produces plots and summary tables. The PHESANT phenome scan is implemented in R. PHESANT includes a novel Javascript D3.js visualization and accompanying Java code that converts the phenome scan results to the required JavaScript Object Notation (JSON) format. PHESANT is available on GitHub at [https://github.com/MRCIEU/PHESANT]. Git tag v0.5 corresponds to the version presented here. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
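
    The core idea of the rule-based dispatch can be illustrated with a toy example: choose a model according to each outcome variable's type. PHESANT itself is written in R and its decision rules are far richer; the thresholds and model choices below are illustrative assumptions only.

```python
# Toy rule-based dispatch: pick a test per variable type (not PHESANT's actual rules).
import numpy as np
import pandas as pd

def choose_test(series: pd.Series) -> str:
    """Return the name of the model a phenome scan might use for this variable."""
    s = series.dropna()
    if s.nunique() == 2:
        return "binary logistic regression"
    if pd.api.types.is_float_dtype(s):
        return "linear regression"
    if pd.api.types.is_integer_dtype(s):
        # Few distinct integer values behave like an ordered categorical (assumed cut-off).
        return "linear regression" if s.nunique() > 20 else "ordered logistic regression"
    return "multinomial logistic regression"

rng = np.random.default_rng(3)
demo = pd.DataFrame({
    "bmi": rng.normal(27, 4, 100),                    # continuous
    "cigarettes_per_day": rng.integers(0, 40, 100),   # integer
    "ever_smoked": rng.integers(0, 2, 100),           # binary
    "hair_colour": rng.choice(list("ABCD"), 100),     # categorical
})
for col in demo:
    print(f"{col:>20s} -> {choose_test(demo[col])}")
```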

  19. A computer vision-based automated Figure-8 maze for working memory test in rodents.

    Science.gov (United States)

    Pedigo, Samuel F; Song, Eun Young; Jung, Min Whan; Kim, Jeansok J

    2006-09-30

    The benchmark test for prefrontal cortex (PFC)-mediated working memory in rodents is a delayed alternation task utilizing variations of T-maze or Figure-8 maze, which requires the animals to make specific arm entry responses for reward. In this task, however, manual procedures involved in shaping target behavior, imposing delays between trials and delivering rewards can potentially influence the animal's performance on the maze. Here, we report an automated Figure-8 maze which does not necessitate experimenter-subject interaction during shaping, training or testing. This system incorporates a computer vision system for tracking, motorized gates to impose delays, and automated reward delivery. The maze is controlled by custom software that records the animal's location and activates the gates according to the animal's behavior and a control algorithm. The program performs calculations of task accuracy, tracks movement sequence through the maze, and provides other dependent variables (such as running speed, time spent in different maze locations, activity level during delay). Testing in rats indicates that the performance accuracy is inversely proportional to the delay interval, decreases with PFC lesions, and that animals anticipate timing during long delays. Thus, our automated Figure-8 maze is effective at assessing working memory and provides novel behavioral measures in rodents.
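
    The accuracy bookkeeping such a system performs reduces to simple sequence logic: in a delayed-alternation task a trial is correct when the chosen arm differs from the previous choice. The sketch below is illustrative; the arm labels and example session are made up.

```python
# Delayed-alternation accuracy from a sequence of arm choices (illustrative data).
def alternation_accuracy(arm_choices):
    """arm_choices: sequence like ['L', 'R', 'R', ...]; the first trial is not scored."""
    correct = sum(1 for prev, cur in zip(arm_choices, arm_choices[1:]) if cur != prev)
    scored = len(arm_choices) - 1
    return correct / scored if scored else float("nan")

session = list("LRLRRLRLLR")          # 10 trials recorded by the tracker
print(f"alternation accuracy: {alternation_accuracy(session):.0%}")
```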

  20. Large-scale automated analysis of news media: a novel computational method for obesity policy research.

    Science.gov (United States)

    Hamad, Rita; Pomeranz, Jennifer L; Siddiqi, Arjumand; Basu, Sanjay

    2015-02-01

    Analyzing news media allows obesity policy researchers to understand popular conceptions about obesity, which is important for targeting health education and policies. A persistent dilemma is that investigators have to read and manually classify thousands of individual news articles to identify how obesity and obesity-related policy proposals may be described to the public in the media. A machine learning method called "automated content analysis", which permits researchers to train computers to "read" and classify massive volumes of documents, was demonstrated. 14,302 newspaper articles that mentioned the word "obesity" during 2011-2012 were identified. Four states that vary in obesity prevalence and policy (Alabama, California, New Jersey, and North Carolina) were examined. The reliability of an automated program to categorize the media's framing of obesity as an individual-level problem (e.g., diet) and/or an environmental-level problem (e.g., obesogenic environment) was tested. The automated program performed similarly to human coders. The proportion of articles with individual-level framing (27.7-31.0%) was higher than the proportion with neutral (18.0-22.1%) or environmental-level framing (16.0-16.4%) across all states and over the entire study period. The feasibility of large-scale automated analysis of news media for obesity policy research was thus demonstrated. © 2014 The Obesity Society.
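
    In spirit, the automated content analysis step resembles a supervised text classifier trained on hand-coded articles and applied to new text. The miniature corpus and the TF-IDF/logistic-regression pipeline below are illustrative assumptions, not the study's actual model.

```python
# Tiny supervised framing classifier (illustrative corpus, not the study's model).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "New diet program helps residents count calories and exercise more",
    "Personal responsibility and willpower are key to losing weight",
    "City council bans sugary drink ads near schools to fight obesity",
    "Report blames food deserts and cheap junk food for rising obesity",
]
train_labels = ["individual", "individual", "environmental", "environmental"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

new_article = "Advocates urge zoning rules that limit food outlets in poor neighborhoods"
print(model.predict([new_article])[0])        # expected: environmental
```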

  1. SlugIn 1.0: A Free Tool for Automated Slug Test Analysis.

    Science.gov (United States)

    Martos-Rosillo, Sergio; Guardiola-Albert, Carolina; Padilla Benítez, Alberto; Delgado Pastor, Joaquín; Azcón González, Antonio; Durán Valsero, Juan José

    2018-05-01

    The correct characterization of aquifer parameters is essential for water-supply and water-quality investigations. Slug tests are widely used for these purposes. While free software is available to interpret slug tests, some codes are not user-friendly, or do not include a wide range of methods to interpret the results, or do not include automatic, inverse solutions to the test data. The private sector has also generated several good programs to interpret slug test data, but they are not free of charge. The computer program SlugIn 1.0 is available online for free download, and is demonstrated to aid in the analysis of slug tests to estimate hydraulic parameters. The program provides an easy-to-use Graphical User Interface. SlugIn 1.0 incorporates automated parameter estimation and facilitates the visualization of several interpretations of the same test. It incorporates solutions for confined and unconfined aquifers, partially penetrating wells, skin effects, shape factor, anisotropy, high hydraulic conductivity formations and the Mace test for large-diameter wells. It is available in English and Spanish and can be downloaded from the web site of the Geological Survey of Spain. Two field examples are presented to illustrate how the software operates. © 2018, National Ground Water Association.
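
    As an example of the kind of interpretation such a program typically automates, the sketch below applies Hvorslev's classical solution: the normalized head decays exponentially, and hydraulic conductivity follows from the fitted basic time lag. The well geometry, the synthetic data and the choice of this particular solution are assumptions, not SlugIn's implementation.

```python
# Hvorslev-type slug-test interpretation on synthetic data (all values illustrative).
import numpy as np

# Synthetic recovery data: normalized head H/H0 at times t (seconds).
t = np.array([5, 10, 20, 40, 60, 90, 120, 180], dtype=float)
h_norm = np.exp(-t / 55.0) * (1 + np.random.default_rng(4).normal(0, 0.01, t.size))

# Fit ln(H/H0) = -t/T0  ->  slope = -1/T0 (basic time lag).
slope, _ = np.polyfit(t, np.log(h_norm), 1)
T0 = -1.0 / slope

# Hvorslev solution (r_c: casing radius, L_e: screen length, R_w: well radius), in metres.
r_c, L_e, R_w = 0.05, 3.0, 0.075
K = r_c**2 * np.log(L_e / R_w) / (2 * L_e * T0)

print(f"basic time lag T0 = {T0:.1f} s, hydraulic conductivity K = {K:.2e} m/s")
```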

  2. Automated Quantification of Stroke Damage on Brain Computed Tomography Scans: e-ASPECTS

    Directory of Open Access Journals (Sweden)

    James Hampton-Till

    2015-08-01

    Emergency radiological diagnosis of acute ischaemic stroke requires the accurate detection and appropriate interpretation of relevant imaging findings. Non-contrast computed tomography (CT) provides fast and low-cost assessment of the early signs of ischaemia and is the most widely used diagnostic modality for acute stroke. The Alberta Stroke Program Early CT Score (ASPECTS) is a quantitative and clinically validated method to measure the extent of ischaemic signs on brain CT scans. The CE-marked electronic-ASPECTS (e-ASPECTS) software automates the ASPECTS score. Anglia Ruskin Clinical Trials Unit (ARCTU) independently carried out a clinical investigation of the e-ASPECTS software, an automated scoring system which can be integrated into the diagnostic pathway of an acute ischaemic stroke patient, thereby assisting the physician with expert interpretation of the brain CT scan. Here we describe a literature review of the clinical importance of reliable assessment of early ischaemic signs on plain CT scans, and of technologies automating these scoring systems for ischaemic stroke on CT scans, focusing on the e-ASPECTS software. To be suitable for critical appraisal in this evaluation, the published studies needed a sample size of a minimum of 10 cases. All randomised studies were screened and data deemed relevant to demonstrating the performance of ASPECTS were appraised. The literature review focused on three domains: (i) interpretation of brain CT scans of stroke patients, (ii) the application of the ASPECTS score in ischaemic stroke, and (iii) automation of brain CT analysis. Finally, the appraised references are discussed in the context of the clinical impact of e-ASPECTS and the expected performance, which will be independently evaluated in a non-inferiority study conducted by the ARCTU.
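
    The ASPECTS score itself is simple arithmetic once the ten standard MCA-territory regions have been assessed; the hard part that e-ASPECTS automates is producing those assessments from the CT volume. A minimal sketch of the scoring step, with illustrative region flags, follows.

```python
# ASPECTS scoring: 10 minus the number of affected standard regions.
# The region flags passed in below are illustrative, not derived from a scan.
ASPECTS_REGIONS = ["C", "L", "IC", "I", "M1", "M2", "M3", "M4", "M5", "M6"]

def aspects_score(affected):
    """`affected`: iterable of region labels judged to show early ischaemic change."""
    unknown = set(affected) - set(ASPECTS_REGIONS)
    if unknown:
        raise ValueError(f"unknown region labels: {unknown}")
    return 10 - len(set(affected))

print(aspects_score([]))                 # 10: normal scan
print(aspects_score(["I", "M2", "M5"]))  # 7: three regions affected
```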

  3. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. All
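
    A hedged toy version of the idea can be written in a few lines: encode enzymes as simple substrate/product rules and let graph search answer connectivity questions. The two "enzymes", the string encoding of glycans and the use of networkx below are illustrative assumptions and far simpler than the framework's actual specificity model.

```python
# Toy glycosylation network: rule-based enzymes plus graph search (illustrative only).
import networkx as nx

ENZYMES = {
    # name: (required substrate suffix, residue appended by the transferase)
    "GalT": ("GlcNAc", "-Gal"),
    "SiaT": ("Gal",    "-Neu5Ac"),
}

def extend(glycan):
    """Yield (product, enzyme) pairs reachable from `glycan` in one reaction."""
    for name, (suffix, added) in ENZYMES.items():
        if glycan.endswith(suffix):
            yield glycan + added, name

def build_network(seeds, max_steps=5):
    g = nx.DiGraph()
    frontier = list(seeds)
    for _ in range(max_steps):
        nxt = []
        for glycan in frontier:
            for product, enzyme in extend(glycan):
                if not g.has_edge(glycan, product):
                    g.add_edge(glycan, product, enzyme=enzyme)
                    nxt.append(product)
        frontier = nxt
    return g

net = build_network(["GlcNAc"])
print(list(nx.all_simple_paths(net, "GlcNAc", "GlcNAc-Gal-Neu5Ac")))
```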

  5. Utility in a Fallible Tool: A Multi-Site Case Study of Automated Writing Evaluation

    Science.gov (United States)

    Grimes, Douglas; Warschauer, Mark

    2010-01-01

    Automated writing evaluation (AWE) software uses artificial intelligence (AI) to score student essays and support revision. We studied how an AWE program called MY Access![R] was used in eight middle schools in Southern California over a three-year period. Although many teachers and students considered automated scoring unreliable, and teachers'…

  6. THE DEVELOPMENT OF AUTOMATION MANAGEMENT TOOLS BY THE DIVISIONS OF TACTICAL MISSILE DEFENSE

    Directory of Open Access Journals (Sweden)

    O. V. Voronin

    2017-01-01

    The article summarizes the basic directions of automation for the planning and management of combat by tactical missile defense divisions. It focuses on the problem of the automated choice of a rational option for combat order and fire control carried out by tactical missile defense divisions during an operation.

  7. Automation of analytical processes. A tool for higher efficiency and safety

    International Nuclear Information System (INIS)

    Groll, P.

    1976-01-01

    The analytical laboratory of a radiochemical facility is usually faced with the fact that numerous analyses of a similar type must be routinely carried out. Automation of such routine analytical procedures helps in increasing the efficiency and safety of the work. A review of the requirements for automation and its advantages is given and demonstrated on three examples. (author)

  8. Clinical validation of fully automated computation of ejection fraction from gated equilibrium blood-pool scintigrams

    International Nuclear Information System (INIS)

    Reiber, J.H.C.; Lie, S.P.; Simoons, M.L.; Hoek, C.; Gerbrands, J.J.; Wijns, W.; Bakker, W.H.; Kooij, P.P.M.

    1983-01-01

    A fully automated procedure for the computation of left-ventricular ejection fraction (EF) from cardiac-gated Tc-99m blood-pool (GBP) scintigrams with fixed, dual, and variable ROI methods is described. By comparison with EF data from contrast ventriculography in 68 patients, the dual-ROI method (separate end-diastolic and end-systolic contours) was found to be the method of choice; processing time was 2 min. The success rate of the dual-ROI procedure was 92%, as assessed from 100 GBP studies. The overall reproducibility of data acquisition and analysis was determined in 12 patients. The mean value and standard deviation of the differences between repeat studies (average time interval 27 min) were 0.8% and 4.3% EF units, respectively (r = 0.98). The authors conclude that left-ventricular EF can be computed automatically from GBP scintigrams with minimal operator interaction and good reproducibility; the EFs are similar to those from contrast ventriculography.
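
    The arithmetic behind the EF computation is compact once the ROI counts are available: with background-corrected counts at end-diastole and end-systole, the ejection fraction is their relative difference. The count values in the sketch below are invented for illustration, and the per-frame background correction is an assumption about the implementation.

```python
# Ejection fraction from background-corrected ROI counts (illustrative numbers).
def ejection_fraction(ed_counts, es_counts, bkg_ed, bkg_es):
    """EF from left-ventricular ROI counts at end-diastole (ED) and end-systole (ES)."""
    ed = ed_counts - bkg_ed
    es = es_counts - bkg_es
    return (ed - es) / ed

print(f"EF = {ejection_fraction(12500, 6400, 2100, 2050):.0%}")
```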

  9. Automated selection of brain regions for real-time fMRI brain-computer interfaces

    Science.gov (United States)

    Lührs, Michael; Sorger, Bettina; Goebel, Rainer; Esposito, Fabrizio

    2017-02-01

    Objective. Brain-computer interfaces (BCIs) implemented with real-time functional magnetic resonance imaging (rt-fMRI) use fMRI time-courses from predefined regions of interest (ROIs). To reach best performances, localizer experiments and on-site expert supervision are required for ROI definition. To automate this step, we developed two unsupervised computational techniques based on the general linear model (GLM) and independent component analysis (ICA) of rt-fMRI data, and compared their performances on a communication BCI. Approach. 3 T fMRI data of six volunteers were re-analyzed in simulated real-time. During a localizer run, participants performed three mental tasks following visual cues. During two communication runs, a letter-spelling display guided the subjects to freely encode letters by performing one of the mental tasks with a specific timing. GLM- and ICA-based procedures were used to decode each letter, respectively using compact ROIs and whole-brain distributed spatio-temporal patterns of fMRI activity, automatically defined from subject-specific or group-level maps. Main results. Letter-decoding performances were comparable to supervised methods. In combination with a similarity-based criterion, GLM- and ICA-based approaches successfully decoded more than 80% (average) of the letters. Subject-specific maps yielded optimal performances. Significance. Automated solutions for ROI selection may help accelerating the translation of rt-fMRI BCIs from research to clinical applications.
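
    A hedged sketch of the GLM-based decoding idea: regress an ROI time course on one regressor per mental task and pick the task with the largest beta. The boxcar design, timings and noise level below are illustrative assumptions; the actual pipeline additionally convolves regressors with a haemodynamic response function and decodes letter by letter.

```python
# Simplified GLM-based decoding of which mental task was performed (synthetic data).
import numpy as np

rng = np.random.default_rng(5)
n_vols, tasks = 60, ["mental calculation", "inner speech", "motor imagery"]

# One boxcar regressor per task (20 volumes each, in order).
X = np.zeros((n_vols, len(tasks)))
for k in range(len(tasks)):
    X[20 * k:20 * (k + 1), k] = 1.0

# Simulate an ROI time course in which the subject performed "inner speech".
y = 2.5 * X[:, 1] + rng.normal(0, 1.0, n_vols)

betas, *_ = np.linalg.lstsq(X, y, rcond=None)
print("decoded task:", tasks[int(np.argmax(betas))])
```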

  10. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    Science.gov (United States)

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-04-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
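
    The combination strategy can be illustrated by feeding a CAD score together with clinical features into a supervised classifier and reading off the AUC and the specificity at 95% sensitivity. Synthetic data and a logistic-regression combiner are assumptions below; the paper evaluates its own machine-learning framework on real screening records.

```python
# Combining a CAD score with clinical features (synthetic data, illustrative combiner).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 400
tb = rng.integers(0, 2, n)                                  # active TB yes/no
cad = rng.normal(0.4 + 0.3 * tb, 0.2, n)                    # CAD score from the CXR
clinical = rng.normal(tb[:, None] * 0.5, 1.0, (n, 4))       # 4 clinical features
X = np.column_stack([cad, clinical])

X_tr, X_te, y_tr, y_te = train_test_split(X, tb, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]

auc = roc_auc_score(y_te, scores)
fpr, tpr, _ = roc_curve(y_te, scores)
spec_at_95sens = 1 - fpr[np.argmax(tpr >= 0.95)]            # first point with sens >= 95%
print(f"AUC = {auc:.2f}, specificity at 95% sensitivity = {spec_at_95sens:.2f}")
```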

  11. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.
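
    A typical way to use MEGA-CC for batch processing is to loop over input files and invoke the executable with an analysis-options (.mao) file prepared with MEGA-Proto. The sketch below is a minimal driver; the flag names used (-a for the analysis options, -d for the data file, -o for the output) are assumptions to be checked against the MEGA-CC documentation for the installed version.

```python
# Minimal batch driver around the MEGA-CC executable; flag names are assumptions.
import subprocess
from pathlib import Path

MEGACC = "megacc"                      # assumed to be on PATH
MAO_FILE = "infer_distances.mao"       # hypothetical analysis-options file from MEGA-Proto

def run_batch(alignment_dir: str, out_dir: str) -> None:
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for aln in sorted(Path(alignment_dir).glob("*.fasta")):
        result = subprocess.run(
            [MEGACC, "-a", MAO_FILE, "-d", str(aln), "-o", str(out / aln.stem)],
            capture_output=True, text=True)
        status = "ok" if result.returncode == 0 else "FAILED"
        print(f"{aln.name}: {status}")

if __name__ == "__main__":
    run_batch("alignments", "results")
```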

  12. Atomdroid: a computational chemistry tool for mobile platforms.

    Science.gov (United States)

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
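
    A pocket-sized version of the kind of calculation described above is a Metropolis Monte Carlo walk over atom positions with a Lennard-Jones energy. The reduced units, step size and temperature below are illustrative assumptions and not Atomdroid's force field.

```python
# Metropolis Monte Carlo with a Lennard-Jones energy (reduced units, illustrative settings).
import numpy as np

def lj_energy(coords, eps=1.0, sigma=1.0):
    """Total Lennard-Jones energy of an N x 3 coordinate array."""
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r = np.linalg.norm(coords[i] - coords[j])
            e += 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return e

rng = np.random.default_rng(7)
coords = rng.uniform(0, 2.0, (6, 3))          # 6 atoms in a small box
energy, kT, step = lj_energy(coords), 0.5, 0.1

for _ in range(5000):                          # Metropolis sweep
    trial = coords.copy()
    atom = rng.integers(len(coords))
    trial[atom] += rng.uniform(-step, step, 3)
    e_trial = lj_energy(trial)
    if e_trial < energy or rng.random() < np.exp(-(e_trial - energy) / kT):
        coords, energy = trial, e_trial

print(f"final Lennard-Jones energy: {energy:.3f} (reduced units)")
```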

  13. Constructing Bridges between Computational Tools in Heterogeneous and Homogeneous Catalysis

    KAUST Repository

    Falivene, Laura; Kozlov, Sergey M.; Cavallo, Luigi

    2018-01-01

    Better catalysts are needed to address numerous challenges faced by humanity. In this perspective, we review concepts and tools in theoretical and computational chemistry that can help to accelerate the rational design of homogeneous and heterogeneous catalysts. In particular, we focus on the following three topics: 1) identification of key intermediates and transition states in a reaction using the energetic span model, 2) disentanglement of factors influencing the relative stability of the key species using energy decomposition analysis and the activation strain model, and 3) discovery of new catalysts using volcano relationships. To facilitate wider use of these techniques across different areas, we illustrate their potentials and pitfalls when applied to the study of homogeneous and heterogeneous catalysts.
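
    The first of the three topics, the energetic span model, lends itself to a small numeric sketch: given the free-energy profile of one catalytic cycle, the span is the largest effective barrier between any intermediate and any transition state, with the reaction free energy added when the transition state precedes the intermediate in the cycle. The profile values below are invented for illustration.

```python
# Energetic span of a catalytic cycle from an invented free-energy profile.
import math

intermediates = [0.0, -3.0, -8.0]      # I1, I2, I3 in cycle order (kcal/mol)
transition_states = [12.0, 6.0, 2.0]   # TS1 (I1->I2), TS2 (I2->I3), TS3 (I3->I1)
dG_reaction = -15.0                    # free energy released per turnover

def energetic_span(i_list, ts_list, dg_r):
    span = -math.inf
    for i_idx, gi in enumerate(i_list):
        for t_idx, gt in enumerate(ts_list):
            # If the TS comes before the intermediate in the cycle, the turnover
            # wraps around and the reaction free energy is added.
            correction = 0.0 if t_idx >= i_idx else dg_r
            span = max(span, gt - gi + correction)
    return span

dE = energetic_span(intermediates, transition_states, dG_reaction)
RT = 0.001987 * 298.15                 # kcal/mol at 298 K
tof = (1.381e-23 * 298.15 / 6.626e-34) * math.exp(-dE / RT)
print(f"energetic span = {dE:.1f} kcal/mol, approximate TOF = {tof:.2e} s^-1")
```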

  14. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of the ductile failure behaviour of cracked structures. (author)

  16. TRAC, a collaborative computer tool for tracer-test interpretation

    Directory of Open Access Journals (Sweden)

    Fécamp C.

    2013-05-01

    Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results through hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on considering the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
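
    The kind of analytical solution such a program assembles can be illustrated with the classical one-dimensional advection-dispersion solution for a continuous injection (Ogata-Banks). The velocity, dispersion coefficient and observation distance below are illustrative assumptions, not values from TRAC, and the solution is shown only as an example of the family of formulas involved.

```python
# Ogata-Banks solution for 1-D advection-dispersion with continuous injection.
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0=1.0):
    """Relative concentration C/C0 at distance x and time t in 1-D uniform flow
    (velocity v, dispersion coefficient D)."""
    a = (x - v * t) / (2.0 * np.sqrt(D * t))
    b = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

x_obs = 25.0                     # m between injection and observation well (assumed)
v, D = 2.0, 1.5                  # m/day and m^2/day (assumed)
times = np.array([2.0, 5.0, 10.0, 12.5, 15.0, 20.0])   # days
for t, c in zip(times, ogata_banks(x_obs, times, v, D)):
    print(f"t = {t:5.1f} d   C/C0 = {c:.3f}")
```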

  17. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  18. Automated innovative diagnostic, data management and communication tool, for improving malaria vector control in endemic settings.

    Science.gov (United States)

    Vontas, John; Mitsakakis, Konstantinos; Zengerle, Roland; Yewhalaw, Delenasaw; Sikaala, Chadwick Haadezu; Etang, Josiane; Fallani, Matteo; Carman, Bill; Müller, Pie; Chouaïbou, Mouhamadou; Coleman, Marlize; Coleman, Michael

    2016-01-01

    Malaria is a life-threatening disease that caused more than 400,000 deaths in sub-Saharan Africa in 2015. Mass prevention of the disease is best achieved by vector control, which relies heavily on the use of insecticides. Monitoring mosquito vector populations is an integral component of control programs and a prerequisite for effective interventions. Several individual methods are used for this task; however, there are obstacles to their uptake, as well as challenges in organizing, interpreting and communicating vector population data. The Horizon 2020 "DMC-MALVEC" consortium will develop a fully integrated and automated multiplex vector-diagnostic platform (LabDisk) for characterizing mosquito populations in terms of species composition, Plasmodium infections and biochemical insecticide resistance markers. The LabDisk will be interfaced with a Disease Data Management System (DDMS), custom-made data management software that will collate and manage data from routine entomological monitoring activities, providing information in a timely fashion, based on user needs and in a standardized way. Another key element will be ResistanceSim, a serious game, i.e. a modern ICT platform that uses interactive ways of communicating guidelines and exemplifying good practices of optimal use of interventions in the health sector. The use of the tool will teach operational end users the value of quality data (relevant, timely and accurate) for making informed decisions. The integrated system (LabDisk, DDMS & ResistanceSim) will be evaluated in four malaria-endemic countries (Cameroon, Ivory Coast, Ethiopia and Zambia), representative of the vector control challenges in sub-Saharan Africa and of malaria settings with different levels of endemicity, to support informed decision-making in vector control and disease management.

  19. Automated computation of femoral angles in dogs from three-dimensional computed tomography reconstructions: Comparison with manual techniques.

    Science.gov (United States)

    Longo, F; Nicetto, T; Banzato, T; Savio, G; Drigo, M; Meneghello, R; Concheri, G; Isola, M

    2018-02-01

    The aim of this ex vivo study was to test a novel three-dimensional (3D) automated computer-aided design (CAD) method (aCAD) for the computation of femoral angles in dogs from 3D reconstructions of computed tomography (CT) images. The repeatability and reproducibility of three methods for the measurement of three femoral angles were evaluated: manual radiography, manual CT reconstructions and the aCAD method. The angles were: (1) anatomical lateral distal femoral angle (aLDFA); (2) femoral neck angle (FNA); and (3) femoral torsion angle (FTA). Femoral angles of 22 femurs obtained from 16 cadavers were measured by three blinded observers. Measurements were repeated three times by each observer for each diagnostic technique. Femoral angle measurements were analysed using a mixed effects linear model for repeated measures to determine the levels of intra-observer agreement (repeatability) and inter-observer agreement (reproducibility). Repeatability and reproducibility of measurements using the aCAD method were excellent (intra-class coefficients, ICCs ≥ 0.98) for all three angles assessed. Manual radiography and CT exhibited excellent agreement for the aLDFA measurement (ICCs ≥ 0.90); however, FNA repeatability and reproducibility were poor. The 3D aCAD method provided the highest repeatability and reproducibility among the tested methodologies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Time consumption and quality of an automated fusion tool for SPECT and MRI images of the brain

    International Nuclear Information System (INIS)

    Fiedler, E.; Platsch, G.; Schwarz, A.; Schmiedehausen, K.; Kuwert, T.; Tomandl, B.; Huk, W.; Rupprecht, Th.; Rahn, N.

    2003-01-01

    Aim: Although the fusion of images from different modalities may improve diagnostic accuracy, it is rarely used in clinical routine work due to logistic problems. Therefore we evaluated the performance and time needed for fusing MRI and SPECT images using semiautomated dedicated software. Patients, material and method: In 32 patients regional cerebral blood flow was measured using 99mTc ethylcystein dimer (ECD) and the three-headed SPECT camera MultiSPECT 3. MRI scans of the brain were performed using either a 0.2 T Open or a 1.5 T Sonata. Twelve of the MRI data sets were acquired using a 3D T1-weighted MPRAGE sequence, 20 with a 2D acquisition technique and different echo sequences. Image fusion was performed on a Syngo workstation using an entropy-minimizing algorithm by an experienced user of the software. The fusion results were classified. We measured the time needed for the automated fusion procedure and, where necessary, that for manual realignment after automated but insufficient fusion. Results: The mean time of the automated fusion procedure was 123 s; it was significantly shorter for the 2D than for the 3D MRI datasets. For four of the 2D data sets and two of the 3D data sets an optimal fit was reached using the automated approach. The remaining 26 data sets required manual correction. The sum of the time required for automated fusion and that needed for manual correction averaged 320 s (50-886 s). Conclusion: The fusion of 3D MRI data sets lasted significantly longer than that of the 2D MRI data. The automated fusion tool delivered an optimal fit in 20% of cases; in 80% manual correction was necessary. Nevertheless, each of the 32 SPECT data sets could be merged in less than 15 min with the corresponding MRI data, which seems acceptable for clinical routine use.
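
    The cost function behind entropy-minimizing fusion can be sketched compactly: the joint entropy of the intensity pairs of two images drops as they are brought into register. The one-dimensional shift search and synthetic images below stand in for the real three-dimensional rigid registration performed by the workstation software.

```python
# Joint entropy as a registration cost, minimized over a 1-D shift (synthetic images).
import numpy as np
from scipy.ndimage import gaussian_filter

def joint_entropy(a, b, bins=32):
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(8)
mri = gaussian_filter(rng.normal(0, 1, (128, 128)), sigma=3)      # smooth synthetic "MRI"
spect = np.roll(mri, 7, axis=1) + rng.normal(0, 0.3 * mri.std(), mri.shape)

# Exhaustive search over horizontal shifts: the entropy minimum recovers the offset.
shifts = range(-15, 16)
costs = [joint_entropy(mri, np.roll(spect, -s, axis=1)) for s in shifts]
print("estimated shift (true value 7):", list(shifts)[int(np.argmin(costs))])
```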

  2. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong

    2014-11-01

    Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud enables users to access their programs, systems and platforms at any time and in any place. Green computing can also yield greener technology by reducing power consumption for sustainable environments. Furthermore, in order to apply user needs to system development, user characteristics are regarded as some of the most important factors to be considered in product industries. In this paper, we propose a cloud monitoring system to observe and manage manufacturing systems/factory automation for sustainable green computing. The monitoring system utilizes resources in the G-cloud environment and hence can reduce the amount of system resources and devices required, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function, which allows system configurations to be automatically matched to the individual's requirements, thus increasing efficiency.

  3. A computer-controlled automated test system for fatigue and fracture testing

    International Nuclear Information System (INIS)

    Nanstad, R.K.; Alexander, D.J.; Swain, R.L.; Hutton, J.T.; Thomas, D.L.

    1989-01-01

    A computer-controlled system consisting of a servohydraulic test machine, an in-house designed test controller, and a desktop computer has been developed for performing automated fracture toughness and fatigue crack growth testing both in the laboratory and in hot cells for remote testing of irradiated specimens. Both unloading compliance and dc-potential drop can be used to monitor crack growth. The test controller includes a dc-current supply programmer, a function generator for driving the servohydraulic test machine to required test outputs, five measurement channels (each consisting of low-pass filter, track/hold amplifier, and 16-bit analog-to-digital converter), and digital logic for various control and data multiplexing functions. The test controller connects to the computer via a 16-bit wide photo-isolated bidirectional bus. The computer, a Hewlett-Packard series 200/300, inputs specimen and test parameters from the operator, configures the test controller, stores test data from the test controller in memory, does preliminary analysis during the test, and records sensor calibrations, specimen and test parameters, and test data on flexible diskette for later recall and analysis with measured initial and final crack length information. During the test, the operator can change test parameters as necessary. 24 refs., 6 figs

  4. Movie magic in the clinic: computer-generated characters for automated health counseling.

    Science.gov (United States)

    Bickmore, Timothy

    2008-11-06

    In this presentation, I demonstrate how many of the technologies used in movie special effects and games have been successfully used in health education and behavior change interventions. Computer-animated health counselors simulate human face-to-face dialogue as a computer interface medium, including not only verbal behavior but nonverbal conversational behavior such as hand gesture, body posture shifts, and facial display of emotion. This technology has now been successfully used in a wide range of health interventions for education and counseling of patients and consumers, including applications in physical activity promotion, medication adherence, and hospital discharge. These automated counselors have been deployed on home computers, hospital-based touch screen kiosks, and mobile devices with integrated health behavior sensing capability. Development of these agents is an interdisciplinary endeavor spanning the fields of character modeling and animation, computational linguistics, artificial intelligence, health communication and behavioral medicine. I will give demonstrations of several fielded systems, describe the technologies and methodologies underlying their development, and present results from five randomized controlled trials that have been completed or are in progress.

  5. REMOD: a computational tool for remodeling neuronal dendrites

    Directory of Open Access Journals (Sweden)

    Panagiotis Bozelos

    2014-05-01

    In recent years, several modeling studies have indicated that dendritic morphology is a key determinant of how individual neurons acquire a unique signal processing profile. The highly branched dendritic structure that originates from the cell body explores the surrounding 3D space in a fractal-like manner until it reaches a certain amount of complexity. Its shape undergoes significant alterations not only in various neuropathological conditions, but in physiological ones, too. Yet, despite the profound effect that these alterations can have on neuronal function, the causal relationship between structure and function remains largely elusive. The lack of a systematic approach for remodeling neuronal cells and their dendritic trees is a key limitation that contributes to this problem. In this context, we developed a computational tool that allows the remodeling of any type of neuron, given a set of exemplar morphologies. The tool is written in Python and provides a simple GUI that guides the user through various options to manipulate selected neuronal morphologies. It provides the ability to load one or more morphology files (.swc or .hoc) and choose specific dendrites on which to operate one of the following actions: shrink, remove, extend or branch (as shown in Figure 1). The user retains complete control over the extent of each alteration, and if a chosen action is not possible due to pre-existing structural constraints, appropriate warnings are produced. Importantly, the tool can also be used to extract morphology statistics for one or multiple morphologies, including features such as the total dendritic length, path length to the root, branch order, diameter tapering, etc. Finally, an experimental utility enables the user to remodel entire dendritic trees based on preloaded statistics from a database of cell-type specific neuronal morphologies. To our knowledge, this is the first tool that allows (a) the remodeling of existing – as opposed to de novo –
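
    One of the morphology statistics mentioned above, total dendritic length, can be computed directly from an SWC reconstruction, in which each line stores index, type, x, y, z, radius and parent. The embedded three-segment morphology below is a toy example, not a real reconstruction, and the dendrite type codes follow the usual SWC convention.

```python
# Total dendritic length from a (toy) SWC reconstruction.
import math

SWC_EXAMPLE = """\
1 1 0.0 0.0 0.0 5.0 -1
2 3 10.0 0.0 0.0 1.0 1
3 3 10.0 8.0 0.0 0.8 2
4 3 18.0 0.0 0.0 0.9 2
"""

def total_dendritic_length(swc_text, dendrite_types=(3, 4)):
    """Sum of segment lengths whose child node is a (basal/apical) dendrite."""
    nodes = {}
    for line in swc_text.splitlines():
        if not line.strip() or line.startswith("#"):
            continue
        idx, ntype, x, y, z, radius, parent = line.split()
        nodes[int(idx)] = (int(ntype), float(x), float(y), float(z), int(parent))
    total = 0.0
    for ntype, x, y, z, parent in nodes.values():
        if parent != -1 and ntype in dendrite_types:
            _, px, py, pz, _ = nodes[parent]
            total += math.dist((x, y, z), (px, py, pz))
    return total

print(f"total dendritic length: {total_dendritic_length(SWC_EXAMPLE):.1f} um")
```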

  6. Automated Design and Analysis Tool for CEV Structural and TPS Components, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  7. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This developed...

  8. Reference Tools for Data Processing, Office Automation, and Data Communications: An Introductory Guide.

    Science.gov (United States)

    Cupoli, Patricia Dymkar

    1981-01-01

    Provides an introduction to various reference sources which are useful in dealing with the areas of data processing, office automation, and communications technologies. A bibliography with vendor listings is included. (FM)

  9. cmpXLatt: Westinghouse automated testing tool for nodal cross section models

    International Nuclear Information System (INIS)

    Guimaraes, Petri Forslund; Rönnberg, Kristian

    2011-01-01

    The procedure for evaluating the merits of different nodal cross section representation models is normally both cumbersome and time consuming, and includes many manual steps when preparing appropriate benchmark problems. Therefore, a computer tool called cmpXLatt has been developed at Westinghouse in order to facilitate the process of performing comparisons between nodal diffusion theory results and corresponding transport theory results on a single node basis. Due to the large number of state points that can be evaluated by cmpXLatt, a systematic and comprehensive way of performing verification and validation of nodal cross section models is provided. This paper presents the main features of cmpXLatt and demonstrates the benefits of using cmpXLatt in a real life application. (author)

  10. Vessel suppressed chest Computed Tomography for semi-automated volumetric measurements of solid pulmonary nodules.

    Science.gov (United States)

    Milanese, Gianluca; Eberhard, Matthias; Martini, Katharina; Vittoria De Martini, Ilaria; Frauenfelder, Thomas

    2018-04-01

    To evaluate whether vessel-suppressed computed tomography (VSCT) can be reliably used for semi-automated volumetric measurements of solid pulmonary nodules, as compared to standard CT (SCT). MATERIAL AND METHODS: Ninety-three SCT scans were processed with dedicated software (ClearRead CT, Riverain Technologies, Miamisburg, OH, USA) that subtracts vessels from the lung parenchyma. Semi-automated volumetric measurements of 65 solid nodules were compared between SCT and VSCT. The measurements were repeated by two readers. For each solid nodule, the volumes measured on SCT by Reader 1 and Reader 2 were averaged, and this average acted as the standard-of-reference value. Concordance between measurements was assessed using Lin's concordance correlation coefficient (CCC). Limits of agreement (LoA) between readers and between CT datasets were evaluated. Standard-of-reference nodule volumes ranged from 13 to 366 mm3. The mean overestimation between readers was 3 mm3 and 2.9 mm3 on SCT and VSCT, respectively. Semi-automated volumetric measurements on VSCT showed substantial agreement with the standard of reference (Lin's CCC = 0.990 for Reader 1; 0.985 for Reader 2). The upper and lower LoA between readers' measurements were (16.3, -22.4 mm3) and (15.5, -21.4 mm3) for SCT and VSCT, respectively. VSCT datasets are suitable for the measurement of solid nodules, showing almost perfect concordance between readers and with measurements on SCT. Copyright © 2018 Elsevier B.V. All rights reserved.
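
    For readers unfamiliar with the concordance statistic reported above, the sketch below implements Lin's concordance correlation coefficient from its standard definition; it is an illustration only, not the software used in the study, and the example volumes are made up.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()               # population variances (ddof=0)
    sxy = ((x - mx) * (y - my)).mean()      # population covariance
    return 2.0 * sxy / (vx + vy + (mx - my) ** 2)

# Example: concordance between two readers' nodule volumes (illustrative values, mm^3).
reader1 = [20.1, 55.4, 130.0, 365.0]
reader2 = [19.8, 57.0, 128.5, 360.2]
print(lins_ccc(reader1, reader2))
```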

  11. The Impact of Misspelled Words on Automated Computer Scoring: A Case Study of Scientific Explanations

    Science.gov (United States)

    Ha, Minsu; Nehm, Ross H.

    2016-06-01

    Automated computerized scoring systems (ACSSs) are being increasingly used to analyze text in many educational settings. Nevertheless, the impact of misspelled words (MSW) on scoring accuracy remains to be investigated in many domains, particularly jargon-rich disciplines such as the life sciences. Empirical studies confirm that MSW are a pervasive feature of human-generated text and that despite improvements, spell-check and auto-replace programs continue to be characterized by significant errors. Our study explored four research questions relating to MSW and text-based computer assessments: (1) Do English language learners (ELLs) produce equivalent magnitudes and types of spelling errors as non-ELLs? (2) To what degree do MSW impact concept-specific computer scoring rules? (3) What impact do MSW have on computer scoring accuracy? and (4) Are MSW more likely to impact false-positive or false-negative feedback to students? We found that although ELLs produced twice as many MSW as non-ELLs, MSW were relatively uncommon in our corpora. The MSW in the corpora were found to be important features of the computer scoring models. Although MSW did not significantly or meaningfully impact computer scoring efficacy across nine different computer scoring models, MSW had a greater impact on the scoring algorithms for naïve ideas than key concepts. Linguistic and concept redundancy in student responses explains the weak connection between MSW and scoring accuracy. Lastly, we found that MSW tend to have a greater impact on false-positive feedback. We discuss the implications of these findings for the development of next-generation science assessments.

  12. Cloud computing: An innovative tool for library services

    OpenAIRE

    Sahu, R.

    2015-01-01

    Cloud computing is an emerging information and communication technology offering potential benefits such as reduced cost, anywhere-anytime access, elasticity, and flexibility. This paper defines cloud computing, outlines its essential characteristics, models, and components, weighs its advantages and drawbacks, and describes cloud computing in libraries.

  13. Life Cycle Assessment of Connected and Automated Vehicles: Sensing and Computing Subsystem and Vehicle Level Effects.

    Science.gov (United States)

    Gawron, James H; Keoleian, Gregory A; De Kleine, Robert D; Wallington, Timothy J; Kim, Hyung Chul

    2018-03-06

    Although recent studies of connected and automated vehicles (CAVs) have begun to explore the potential energy and greenhouse gas (GHG) emission impacts from an operational perspective, little is known about how the full life cycle of the vehicle will be impacted. We report the results of a life cycle assessment (LCA) of Level 4 CAV sensing and computing subsystems integrated into internal combustion engine vehicle (ICEV) and battery electric vehicle (BEV) platforms. The results indicate that CAV subsystems could increase vehicle primary energy use and GHG emissions by 3-20% due to increases in power consumption, weight, drag, and data transmission. However, when potential operational effects of CAVs are included (e.g., eco-driving, platooning, and intersection connectivity), the net result is up to a 9% reduction in energy and GHG emissions in the base case. Overall, this study highlights opportunities where CAVs can improve net energy and environmental performance.

  14. Automated agents for management and control of the ALICE Computing Grid

    CERN Document Server

    Grigoras, C; Carminati, F; Legrand, I; Voicu, R

    2010-01-01

    A complex software environment such as the ALICE Computing Grid infrastructure requires permanent control and management for the large set of services involved. Automating control procedures reduces the human interaction with the various components of the system and yields better availability of the overall system. In this paper we will present how we used the MonALISA framework to gather, store and display the relevant metrics in the entire system from central and remote site services. We will also show the automatic local and global procedures that are triggered by the monitored values. Decision-taking agents are used to restart remote services, alert the operators in case of problems that cannot be automatically solved, submit production jobs, replicate and analyze raw data, resource load-balance and other control mechanisms that optimize the overall work flow and simplify day-to-day operations. Synthetic graphical views for all operational parameters, correlations, state of services and applications as we...

  15. A Tool for Multiple Targeted Genome Deletions that Is Precise, Scar-Free, and Suitable for Automation.

    Directory of Open Access Journals (Sweden)

    Wayne Aubrey

    Full Text Available Many advances in synthetic biology require the removal of a large number of genomic elements from a genome. Most existing deletion methods leave behind markers, and as there are a limited number of markers, such methods can only be applied a fixed number of times. Deletion methods that recycle markers generally are either imprecise (they remove untargeted sequences) or leave scar sequences, which can cause genome instability and rearrangements. No existing marker recycling method is automation-friendly. We have developed a novel, openly available deletion tool that consists of: (1) a method for deleting genomic elements that can be repeatedly used without limit, is precise, scar-free, and suitable for automation; and (2) software to design the method's primers. Our tool is sequence agnostic and could be used to delete large numbers of coding sequences, promoter regions, transcription factor binding sites, terminators, etc. in a single genome. We have validated our tool on the deletion of non-essential open reading frames (ORFs) from S. cerevisiae. The tool is applicable to arbitrary genomes, and we provide primer sequences for the deletion of: 90% of the ORFs from the S. cerevisiae genome, 88% of the ORFs from the S. pombe genome, and 85% of the ORFs from the L. lactis genome.

  16. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    Science.gov (United States)

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

    CRISIS is a computer tool for probabilistic seismic hazard analysis (PSHA), whose development started in the late 1980s at the Instituto de Ingeniería, UNAM, Mexico. It started circulating outside the Mexican borders at the beginning of the 1990s, when it was first distributed as part of SEISAN tools. Throughout the years, CRISIS has been used for seismic hazard studies in several countries in Latin America (Mexico, Guatemala, Belize, El Salvador, Honduras, Nicaragua, Costa Rica, Panama, Colombia, Venezuela, Ecuador, Peru, Argentina and Chile), and in many other countries of the world. CRISIS has always circulated free of charge for non-commercial applications. It is worth noting that CRISIS has been written mainly by people who are, at the same time, PSHA practitioners. Therefore, the development loop has been relatively short, and most of the modifications and improvements have been made to satisfy the needs of the developers themselves. CRISIS has evolved from a rather simple FORTRAN code to a relatively complex program with a friendly graphical interface, able to handle a variety of modeling possibilities for source geometries, seismicity descriptions and ground motion prediction models (GMPM). We will describe some of the improvements made for the newest version of the code, CRISIS 2012. These improvements, some of which were made in the frame of the Italian research project INGV-DPC S2 (http://nuovoprogettoesse2.stru.polimi.it/), funded by the Dipartimento della Protezione Civile (DPC; National Civil Protection Department), include: a wider variety of source geometries; a wider variety of seismicity models, including the ability to handle non-Poissonian occurrence models and Poissonian smoothed-seismicity descriptions; and enhanced capabilities for using different kinds of GMPM (attenuation tables, built-in models and generalized attenuation models). In the case of built-in models, there is, by default, a set ready to use in CRISIS, but additional custom GMPMs

  17. Preclinical validation of automated dual-energy X-ray absorptiometry and computed tomography-based body composition measurements

    International Nuclear Information System (INIS)

    DEVRIESE, Joke; Pottel, Hans; BEELS, Laurence; VAN DE WIELE, Christophe; MAES, Alex; GHEYSENS, Olivier

    2016-01-01

    The aim of this study was to determine and validate a set of Hounsfield unit (HU) ranges to segment computed tomography (CT) images into tissue types and to test the validity of dual-energy X-ray absorptiometry (DXA) tissue segmentation on pure, unmixed porcine tissues. This preclinical prospective study was approved by the local ethical committee. Different quantities of porcine bone tissue (BT), lean tissue (LT) and adipose tissue (AT) were scanned using DXA and CT. Tissue type segmentation in DXA was performed via the standard clinical protocol and in CT through different sets of HU ranges. Percent coefficients of variation (%CV) were used to assess precision, while % differences of observed masses were tested against zero using the Wilcoxon signed-rank test. Total mass DXA measurements differed little but significantly (P=0.016) from true mass, while total mass CT measurements based on literature values showed non-significant (P=0.69) differences of 1.7% and 2.0%. BT mass estimates with DXA differed more from true mass (median -78.2 to -75.8%) than other tissue types (median -11.3 to -8.1%). Tissue mass estimates with CT and literature HU ranges showed small differences from true mass for every tissue type (median -10.4 to 8.8%). The most suitable method for automated tissue segmentation is CT, which can become a valuable tool in quantitative nuclear medicine.
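
    The validated HU ranges are the study's own result and are not reproduced here; the sketch below only illustrates the general thresholding approach, with placeholder HU ranges and nominal tissue densities that are assumptions of this example.

```python
import numpy as np

# Placeholder Hounsfield-unit ranges and nominal densities (g/cm^3); these are
# illustrative assumptions, not the validated values from the study.
HU_RANGES = {"adipose": (-190, -30), "lean": (-29, 150), "bone": (151, 3000)}
DENSITY = {"adipose": 0.95, "lean": 1.05, "bone": 1.85}

def tissue_masses(hu_volume, voxel_volume_cm3):
    """Estimate per-tissue mass (g) by thresholding a CT volume on HU ranges."""
    hu_volume = np.asarray(hu_volume)
    masses = {}
    for tissue, (lo, hi) in HU_RANGES.items():
        n_voxels = np.count_nonzero((hu_volume >= lo) & (hu_volume <= hi))
        masses[tissue] = n_voxels * voxel_volume_cm3 * DENSITY[tissue]
    return masses
```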

  18. Crowdsourcing image annotation for nucleus detection and segmentation in computational pathology: evaluating experts, automated methods, and the crowd.

    Science.gov (United States)

    Irshad, H; Montaser-Kouhsari, L; Waltz, G; Bucur, O; Nowak, J A; Dong, F; Knoblauch, N W; Beck, A H

    2015-01-01

    The development of tools in computational pathology to assist physicians and biomedical scientists in the diagnosis of disease requires access to high-quality annotated images for algorithm learning and evaluation. Generating high-quality expert-derived annotations is time-consuming and expensive. We explore the use of crowdsourcing for rapidly obtaining annotations for two core tasks in computational pathology: nucleus detection and nucleus segmentation. We designed and implemented crowdsourcing experiments using the CrowdFlower platform, which provides access to a large set of labor channel partners that accesses and manages millions of contributors worldwide. We obtained annotations from four types of annotators and compared concordance across these groups. We obtained: crowdsourced annotations for nucleus detection and segmentation on a total of 810 images; annotations using automated methods on 810 images; annotations from research fellows for detection and segmentation on 477 and 455 images, respectively; and expert pathologist-derived annotations for detection and segmentation on 80 and 63 images, respectively. For the crowdsourced annotations, we evaluated performance across a range of contributor skill levels (1, 2, or 3). The crowdsourced annotations (4,860 images in total) were completed in only a fraction of the time and cost required for obtaining annotations using traditional methods. For the nucleus detection task, the research fellow-derived annotations showed the strongest concordance with the expert pathologist-derived annotations (F-M = 93.68%), followed by the crowdsourced contributor levels 1, 2, and 3 and the automated method, which showed relatively similar performance (F-M = 87.84%, 88.49%, 87.26%, and 86.99%, respectively). For the nucleus segmentation task, the crowdsourced contributor level 3-derived annotations, research fellow-derived annotations, and automated method showed the strongest concordance with the expert pathologist
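
    The concordance values above are F-measures; as a reminder of how that statistic is assembled once detections have been matched to the expert reference, a short sketch follows (the matching step itself, e.g. pairing detections to reference nuclei within a distance tolerance, is left to the caller, and the example counts are made up).

```python
def f_measure(true_positives, false_positives, false_negatives):
    """Harmonic mean of precision and recall (the F-M statistic quoted above)."""
    detected = true_positives + false_positives
    actual = true_positives + false_negatives
    precision = true_positives / detected if detected else 0.0
    recall = true_positives / actual if actual else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

# Example with made-up counts: 880 matched nuclei, 60 spurious detections, 65 missed.
print(f_measure(880, 60, 65))
```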

  19. Photonic measurement of apparent presence of spirit using a computer automated system.

    Science.gov (United States)

    Schwartz, Gary E

    2011-01-01

    Research investigating the potential of detecting the purported presence of spirit (POS) has been hampered by the necessity of employing a human being to collect the data. To infer the presence of alleged spirit, it is essential to remove the simultaneous presence of an experimenter (POE), thereby eliminating his or her physical energy as well as accompanying conscious intentions and expectations. The purpose of these two proof-of-concept experiments was to explore the feasibility of completely automating data collection in the absence of an experimenter to determine if evidence consistent with POS was still obtained. A computer-automated system was developed, making it possible to collect all data in the absence of an experimenter (thereby achieving complete experimenter blinding). In the evenings, the computer would perform as follows: (1) start the experimental run at random times, (2) conduct 30-minute baseline as well as POS trials involving two different alleged spirits, and (3) record background light in a completely dark chamber with a highly sensitive low-light Princeton Instruments charge-coupled device (CCD) camera system. The CCD camera and light-tight recording chamber were housed in a light-tight room; the computer, large screen monitor, and speakers were housed in a separate control room. The participants were two purported spirits involved in previous research published in this journal, in which a silicon photomultiplier system was used. The primary intervention was the computer selecting and presenting visual and auditory information inviting Spirit 1 or Spirit 2 to enter the chamber in the absence of experimenter presence and awareness. The CCD camera provided 512 × 512 pixel images of 30-minute exposures (reflecting a combination of possible background light plus instrument dark noise). The images were imported into image processing software, and two-dimensional fast Fourier transform (FFT) analyses were performed. Visual examinations of the FFT

  20. Connectivity among computer-aided engineering methods, procedures, and tools used in developing the SSC collider magnets

    International Nuclear Information System (INIS)

    Kallas, N.; Jalloh, A.R.

    1992-01-01

    The accomplishment of functional productivity for the computer aided engineering (CAE) environment at the magnet engineering department (ME) of the magnet systems division (MSD) at the Superconducting Super Collider Laboratory (SSCL) involves most of the basic aspects of information engineering. It is highly desirable to arrive at a software and hardware topology that offers total, two-way (back and forth), automatic and direct software and hardware connectivity among computer-aided design and drafting (CADD), analysis codes, and office automation tools applicable to the disciplines involved. This paper describes the components, data flow, and practices employed in the development of the CAE environment from a systems engineering aspect rather than from the analytical angle. When appropriate, references to case studies are made in order to demonstrate the connectivity of the techniques used

  2. Chimenea and other tools: Automated imaging of multi-epoch radio-synthesis data with CASA

    Science.gov (United States)

    Staley, T. D.; Anderson, G. E.

    2015-11-01

    In preparing the way for the Square Kilometre Array and its pathfinders, there is a pressing need to begin probing the transient sky in a fully robotic fashion using the current generation of radio telescopes. Effective exploitation of such surveys requires a largely automated data-reduction process. This paper introduces an end-to-end automated reduction pipeline, AMIsurvey, used for calibrating and imaging data from the Arcminute Microkelvin Imager Large Array. AMIsurvey makes use of several component libraries which have been packaged separately for open-source release. The most scientifically significant of these is chimenea, which implements a telescope-agnostic algorithm for automated imaging of pre-calibrated multi-epoch radio-synthesis data, of the sort typically acquired for transient surveys or follow-up. The algorithm aims to improve upon standard imaging pipelines by utilizing iterative RMS-estimation and automated source-detection to avoid so called 'Clean-bias', and makes use of CASA subroutines for the underlying image-synthesis operations. At a lower level, AMIsurvey relies upon two libraries, drive-ami and drive-casa, built to allow use of mature radio-astronomy software packages from within Python scripts. While targeted at automated imaging, the drive-casa interface can also be used to automate interaction with any of the CASA subroutines from a generic Python process. Additionally, these packages may be of wider technical interest beyond radio-astronomy, since they demonstrate use of the Python library pexpect to emulate terminal interaction with an external process. This approach allows for rapid development of a Python interface to any legacy or externally-maintained pipeline which accepts command-line input, without requiring alterations to the original code.
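
    The paper notes that drive-casa and drive-ami use the Python library pexpect to emulate terminal interaction with an external pipeline. The sketch below shows only that generic pattern; the command-line flags, prompt regex, and task call are assumptions about a CASA-like interactive session, not the actual drive-casa or drive-ami interfaces.

```python
import pexpect

# Assumed invocation and prompt pattern for a CASA-like interactive shell.
child = pexpect.spawn("casa --nologger --nogui", encoding="utf-8", timeout=600)
prompt = r"CASA <\d+>:"          # assumption: the prompt looks like "CASA <1>:"
child.expect(prompt)             # wait for the shell to come up

# Send one task invocation and block until the prompt returns.
child.sendline("clean(vis='epoch1.ms', imagename='epoch1_img', niter=500)")
child.expect(prompt)
print(child.before)              # everything the task printed before the next prompt

child.sendline("exit")
child.close()
```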

  3. Computer games as a pedagogical tool in education

    OpenAIRE

    Maher, Ken

    1997-01-01

    Designing computer based environments is never easy, especially when considering young learners. Traditionally, computer gaming has been seen as lacking in educational value, but rating highly in satisfaction and motivation. The objective of this dissertation is to look at elements of computer based learning and to ascertain how computer games can be included as a means of improving learning. Various theories are drawn together from psychology, instructional technology and computer gaming, to...

  4. XVI International symposium on nuclear electronics and VI International school on automation and computing in nuclear physics and astrophysics

    International Nuclear Information System (INIS)

    Churin, I.N.

    1995-01-01

    Reports and papers of the 16th International Symposium on Nuclear Electronics and the 6th International School on Automation and Computing in Nuclear Physics and Astrophysics are presented. The latest achievements in the development of fast-response electronic circuits designed for detection and spectrometric facilities are reviewed. Particular attention is paid to systems for the acquisition, processing and storage of experimental data. Modern equipment designed for data communication in computer networks is also covered

  5. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    Science.gov (United States)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work being conducted by the Applied Sciences Program Office at NASA-Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source processing code on a local prototype platform, and then transitioning this code, along with its associated environment requirements, onto an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions, and then
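
    The NDVI and NDMI products mentioned above follow the standard band-ratio definitions NDVI = (NIR - Red)/(NIR + Red) and NDMI = (NIR - SWIR)/(NIR + SWIR). The numpy sketch below illustrates those two indices only and is not code from the project's pipeline (the small denominator guard is an assumption of this example).

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def ndmi(nir, swir):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = np.asarray(nir, float), np.asarray(swir, float)
    return (nir - swir) / np.clip(nir + swir, 1e-6, None)
```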

  6. Pharmacokinetic study with computational tools in the medicinal chemistry course

    Directory of Open Access Journals (Sweden)

    Monique Araújo de Brito

    2011-12-01

    Full Text Available To improve the teaching-learning process in the Medicinal Chemistry course, new strategies have been incorporated into practical classes of this fundamental discipline of the pharmaceutical curriculum. Many changes and improvements have been made in the area of medicinal chemistry so far, and students should be prepared for these new approaches with the use of technological resources in this field. Practical activities using computational techniques have been directed to the evaluation of chemical and physicochemical properties that affect the pharmacokinetics of drugs. Their objectives were to allow students to know these tools, to learn how to access them, to search for the structures of drugs and to analyze results. To the best of our knowledge, this is the first study in Brazil to demonstrate the use of computational practices in teaching pharmacokinetics. Practical classes using Osiris and Molinspiration were attractive to students, who developed the activities easily and acquired better theoretical knowledge.

  7. Automated tools to be used for ascertaining structural condition in South African hard rock mines

    CSIR Research Space (South Africa)

    Teleka, R

    2011-11-01

    Full Text Available in the mining operations and in the efforts to improve mine safety. The belief is that if mines are made safer, more skilled labor will be attracted to the industry than is currently the case. The purpose of this paper is to discuss the possibility of using automated...

  8. A systematic engineering tool chain approach for self-organizing building automation systems

    NARCIS (Netherlands)

    Mc Gibney, A.; Rea, S.; Lehmann, M.; Thior, S.; Lesecq, S.; Hendriks, M.; Guyon-Gardeux, C.; Mai, Linh Tuan; Pacull, F.; Ploennigs, J.; Basten, T.; Pesch, D.

    2013-01-01

    There is a strong push towards smart buildings that aim to achieve comfort, safety and energy efficiency, through building automation systems (BAS) that incorporate multiple subsystems such as heating and air-conditioning, lighting, access control etc. The design, commissioning and operation of BAS

  9. Automated and unbiased image analyses as tools in phenotypic classification of small-spored Alternaria species

    DEFF Research Database (Denmark)

    Andersen, Birgitte; Hansen, Michael Edberg; Smedsgaard, Jørn

    2005-01-01

    often has been broadly applied to various morphologically and chemically distinct groups of isolates from different hosts. The purpose of this study was to develop and evaluate automated and unbiased image analysis systems that will analyze different phenotypic characters and facilitate testing...

  10. Automation System Goals for the Creation and Operation of the Tool

    Science.gov (United States)

    Khisamutdinov, R. M.; Khisamutdinov, M. R.

    2014-12-01

    Comprehensive automation of the processes for the creation and operation of tooling, consistent linking of the hierarchical levels into a single system for the collection and processing of data and for operations management, and integration with TEAMCENTRE PLM and SAP/R3 ERP significantly improve the quality and efficiency of production preparation.

  11. Automated detection of heuristics and biases among pathologists in a computer-based system.

    Science.gov (United States)

    Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-08-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to diagnostic errors. The authors conducted the study using a computer-based system to view and diagnose virtual slide cases. The software recorded participant responses throughout the diagnostic process, and automatically classified participant actions based on definitions of eight common heuristics and/or biases. The authors measured frequency of heuristic use and bias across three levels of training. Biases studied were detected at varying frequencies, with availability and search satisficing observed most frequently. There were few significant differences by level of training. For representativeness and anchoring, the heuristic was used appropriately as often or more often than it was used in biased judgment. Approximately half of the diagnostic errors were associated with one or more biases. We conclude that heuristic use and biases were observed among physicians at all levels of training using the virtual slide system, although their frequencies varied. The system can be employed to detect heuristic use and to test methods for decreasing diagnostic errors resulting from cognitive biases.

  12. Automated detection of lung nodules in low-dose computed tomography

    International Nuclear Information System (INIS)

    Cascio, D.; Cheran, S.C.; Chincarini, A.; De Nunzio, G.; Delogu, P.; Fantacci, M.E.; Gargano, G.; Gori, I.; Retico, A.; Masala, G.L.; Preite Martinez, A.; Santoro, M.; Spinelli, C.; Tarantino, T.

    2007-01-01

    A computer-aided detection (CAD) system for the identification of pulmonary nodules in low-dose multi-detector computed-tomography (CT) images has been developed in the framework of the MAGIC-5 Italian project. One of the main goals of this project is to build a distributed database of lung CT scans in order to enable automated image analysis through a data and cpu GRID infrastructure. The basic modules of our lung-CAD system, consisting of a 3D dot-enhancement filter for nodule detection and a neural classifier for false-positive finding reduction, are described. The system was designed and tested for both internal and sub-pleural nodules. The database used in this study consists of 17 low-dose CT scans reconstructed with thin slice thickness (~300 slices/scan). The preliminary results are shown in terms of the FROC analysis reporting a good sensitivity (85% range) for both internal and sub-pleural nodules at an acceptable level of false positive findings (1-9 FP/scan); the sensitivity value remains very high (75% range) even at 1-6 FP/scan. (orig.)

  13. Automated Cervical Screening and Triage, Based on HPV Testing and Computer-Interpreted Cytology.

    Science.gov (United States)

    Yu, Kai; Hyun, Noorie; Fetterman, Barbara; Lorey, Thomas; Raine-Bennett, Tina R; Zhang, Han; Stamps, Robin E; Poitras, Nancy E; Wheeler, William; Befano, Brian; Gage, Julia C; Castle, Philip E; Wentzensen, Nicolas; Schiffman, Mark

    2018-04-11

    State-of-the-art cervical cancer prevention includes human papillomavirus (HPV) vaccination among adolescents and screening/treatment of cervical precancer (CIN3/AIS and, less strictly, CIN2) among adults. HPV testing provides sensitive detection of precancer but, to reduce overtreatment, secondary "triage" is needed to predict women at highest risk. Those with the highest-risk HPV types or abnormal cytology are commonly referred to colposcopy; however, expert cytology services are critically lacking in many regions. To permit completely automatable cervical screening/triage, we designed and validated a novel triage method, a cytologic risk score algorithm based on computer-scanned liquid-based slide features (FocalPoint, BD, Burlington, NC). We compared it with abnormal cytology in predicting precancer among 1839 women testing HPV positive (HC2, Qiagen, Germantown, MD) in 2010 at Kaiser Permanente Northern California (KPNC). Precancer outcomes were ascertained by record linkage. As additional validation, we compared the algorithm prospectively with cytology results among 243 807 women screened at KPNC (2016-2017). All statistical tests were two-sided. Among HPV-positive women, the algorithm matched the triage performance of abnormal cytology. Combined with HPV16/18/45 typing (Onclarity, BD, Sparks, MD), the automatable strategy referred 91.7% of HPV-positive CIN3/AIS cases to immediate colposcopy while deferring 38.4% of all HPV-positive women to one-year retesting (compared with 89.1% and 37.4%, respectively, for typing and cytology triage). In the 2016-2017 validation, the predicted risk scores strongly correlated with cytology (P < .001). High-quality cervical screening and triage performance is achievable using this completely automated approach. Automated technology could permit extension of high-quality cervical screening/triage coverage to currently underserved regions.

  14. Circular Hough transform diffraction analysis: A software tool for automated measurement of selected area electron diffraction patterns within Digital MicrographTM

    International Nuclear Information System (INIS)

    Mitchell, D.R.G.

    2008-01-01

    A software tool (script and plugin) for computing circular Hough transforms (CHT) in Digital Micrograph TM has been developed, for the purpose of automated analysis of selected area electron diffraction patterns (SADPs) of polycrystalline materials. The CHT enables the diffraction pattern centre to be determined with sub-pixel accuracy, regardless of the exposure condition of the transmitted beam or if a beam stop is present. Radii of the diffraction rings can also be accurately measured with sub-pixel precision. If the pattern is calibrated against a known camera length, then d-spacings with an accuracy of better than 1% can be obtained. These measurements require no a priori knowledge of the pattern and very limited user interaction. The accuracy of the CHT is degraded by distortion introduced by the projector lens, and this should be minimised prior to pattern acquisition. A number of optimisations in the CHT software enable rapid processing of patterns; a typical analysis of a 1kx1k image taking just a few minutes. The CHT tool appears robust and is even able to accurately measure SADPs with very incomplete diffraction rings due to texture effects. This software tool is freely downloadable via the Internet
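
    The Digital MicrographTM script itself is not reproduced here; purely to illustrate the circular Hough transform approach it describes, the sketch below uses scikit-image to recover ring centres and radii from an edge-filtered diffraction pattern (the radius search range and edge-detection settings are arbitrary assumptions).

```python
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

def ring_fits(pattern, radii=np.arange(50, 450, 2), n_rings=5, sigma=2.0):
    """Return (x_centre, y_centre, radius) for the strongest diffraction rings.

    `pattern` is a 2D array (e.g. a 1k x 1k SADP); the radii are in pixels and can
    be converted to d-spacings once the camera length has been calibrated.
    """
    edges = canny(pattern.astype(float), sigma=sigma)
    accumulator = hough_circle(edges, radii)
    _accums, cx, cy, found_radii = hough_circle_peaks(
        accumulator, radii, total_num_peaks=n_rings)
    return list(zip(cx, cy, found_radii))
```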

  15. Circular Hough transform diffraction analysis: A software tool for automated measurement of selected area electron diffraction patterns within Digital MicrographTM

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, D.R.G. [Institute of Materials and Engineering Science, ANSTO, PMB 1, Menai, NSW 2234 (Australia)], E-mail: drm@ansto.gov.au

    2008-03-15

    A software tool (script and plugin) for computing circular Hough transforms (CHT) in Digital MicrographTM has been developed, for the purpose of automated analysis of selected area electron diffraction patterns (SADPs) of polycrystalline materials. The CHT enables the diffraction pattern centre to be determined with sub-pixel accuracy, regardless of the exposure condition of the transmitted beam or if a beam stop is present. Radii of the diffraction rings can also be accurately measured with sub-pixel precision. If the pattern is calibrated against a known camera length, then d-spacings with an accuracy of better than 1% can be obtained. These measurements require no a priori knowledge of the pattern and very limited user interaction. The accuracy of the CHT is degraded by distortion introduced by the projector lens, and this should be minimised prior to pattern acquisition. A number of optimisations in the CHT software enable rapid processing of patterns; a typical analysis of a 1kx1k image taking just a few minutes. The CHT tool appears robust and is even able to accurately measure SADPs with very incomplete diffraction rings due to texture effects. This software tool is freely downloadable via the Internet.

  16. Espina: A Tool for the Automated Segmentation and Counting of Synapses in Large Stacks of Electron Microscopy Images

    Science.gov (United States)

    Morales, Juan; Alonso-Nanclares, Lidia; Rodríguez, José-Rodrigo; DeFelipe, Javier; Rodríguez, Ángel; Merchán-Pérez, Ángel

    2011-01-01

    The synapses in the cerebral cortex can be classified into two main types, Gray's type I and type II, which correspond to asymmetric (mostly glutamatergic excitatory) and symmetric (inhibitory GABAergic) synapses, respectively. Hence, the quantification and identification of their different types and the proportions in which they are found, is extraordinarily important in terms of brain function. The ideal approach to calculate the number of synapses per unit volume is to analyze 3D samples reconstructed from serial sections. However, obtaining serial sections by transmission electron microscopy is an extremely time consuming and technically demanding task. Using focused ion beam/scanning electron microscope microscopy, we recently showed that virtually all synapses can be accurately identified as asymmetric or symmetric synapses when they are visualized, reconstructed, and quantified from large 3D tissue samples obtained in an automated manner. Nevertheless, the analysis, segmentation, and quantification of synapses is still a labor intensive procedure. Thus, novel solutions are currently necessary to deal with the large volume of data that is being generated by automated 3D electron microscopy. Accordingly, we have developed ESPINA, a software tool that performs the automated segmentation and counting of synapses in a reconstructed 3D volume of the cerebral cortex, and that greatly facilitates and accelerates these processes. PMID:21633491

  17. ESPINA: a tool for the automated segmentation and counting of synapses in large stacks of electron microscopy images

    Directory of Open Access Journals (Sweden)

    Juan eMorales

    2011-03-01

    Full Text Available The synapses in the cerebral cortex can be classified into two main types, Gray’s type I and type II, which correspond to asymmetric (mostly glutamatergic excitatory) and symmetric (inhibitory GABAergic) synapses, respectively. Hence, the quantification and identification of their different types and the proportions in which they are found, is extraordinarily important in terms of brain function. The ideal approach to calculate the number of synapses per unit volume is to analyze three-dimensional samples reconstructed from serial sections. However, obtaining serial sections by transmission electron microscopy is an extremely time consuming and technically demanding task. Using FIB/SEM microscopy, we recently showed that virtually all synapses can be accurately identified as asymmetric or symmetric synapses when they are visualized, reconstructed and quantified from large three-dimensional tissue samples obtained in an automated manner. Nevertheless, the analysis, segmentation and quantification of synapses is still a labor intensive procedure. Thus, novel solutions are currently necessary to deal with the large volume of data that is being generated by automated 3D electron microscopy. Accordingly, we have developed ESPINA, a software tool that performs the automated segmentation and counting of synapses in a reconstructed 3D volume of the cerebral cortex, and that greatly facilitates and accelerates these processes.

  18. Automated interpretable computational biology in the clinic: a framework to predict disease severity and stratify patients from clinical data

    Directory of Open Access Journals (Sweden)

    Soumya Banerjee

    2017-10-01

    Full Text Available We outline an automated computational and machine learning framework that predicts disease severity and stratifies patients. We apply our framework to available clinical data. Our algorithm automatically generates insights and predicts disease severity with minimal operator intervention. The computational framework presented here can be used to stratify patients, predict disease severity and propose novel biomarkers for disease. Insights from machine learning algorithms coupled with clinical data may help guide therapy, personalize treatment and help clinicians understand the change in disease over time. Computational techniques like these can be used in translational medicine in close collaboration with clinicians and healthcare providers. Our models are also interpretable, allowing clinicians with minimal machine learning experience to engage in model building. This work is a step towards automated machine learning in the clinic.

  19. THE AUTOMATED GENERATION OF ENGINEERING KNOWLEDGE USING A DIGITAL ENGINEERING TOOL: AN INDUSTRIAL EVALUATION CASE STUDY

    OpenAIRE

    RAYMOND CW SUNG; JAMES M RITCHIE; THEODORE LIM; YING LIU; ZOE KOSMADOUDI

    2012-01-01

    In a knowledge-based economy, it will be crucial to capture expertise and rationale in working environments of all kinds, as the need develops to understand how people work, the intuitive processes they use as they carry out tasks and make decisions, and the most effective methods and rationales for solving problems. Key outputs from this will be the capability to automate decision-making activities and to support training and learning in competitive business environm...

  20. Electromagnetic compatibility of tools and automated process control systems of NPP units

    International Nuclear Information System (INIS)

    Alpeev, A.S.

    1994-01-01

    Problems of electromagnetic compatibility of automated process control subsystems in NPP units are discussed. It is emphasized that, at the stage of developing the request for proposal for each APC subsystem, special attention should be paid to the electromagnetic environment in the specific room and to the requirements on the quality of the functions performed by the system. In addition, requirements for electromagnetic compatibility tests at the workstations should be formulated, and mock-ups of the subsystems should be tested

  1. VALIDATING the Accuracy of Sighten's Automated Shading Tool

    Energy Technology Data Exchange (ETDEWEB)

    2018-05-04

    Solar companies - including installers, financiers, and distributors - leverage Sighten software to deliver accurate shading calculations and solar proposals. Sighten recently partnered with Google Project Sunroof to provide automated remote shading analysis directly within the Sighten platform. The National Renewable Energy Laboratory (NREL), in partnership with Sighten, independently verified the accuracy of Sighten's remote-shading solar access values (SAVs) on an annual basis for locations in Los Angeles, California, and Denver, Colorado.

  2. An automated optimization tool for high-dose-rate (HDR) prostate brachytherapy with divergent needle pattern

    Science.gov (United States)

    Borot de Battisti, M.; Maenhout, M.; de Senneville, B. Denis; Hautvast, G.; Binnekamp, D.; Lagendijk, J. J. W.; van Vulpen, M.; Moerland, M. A.

    2015-10-01

    Focal high-dose-rate (HDR) brachytherapy for prostate cancer has gained increasing interest as an alternative to whole-gland therapy, as it may contribute to the reduction of treatment-related toxicity. For focal treatment, optimal needle guidance and placement is warranted. This can be achieved under MR guidance. However, MR-guided needle placement is currently not possible due to space restrictions in the closed MR bore. To overcome this problem, an MR-compatible, single-divergent needle-implant robotic device is under development at the University Medical Centre, Utrecht: placed between the legs of the patient inside the MR bore, this robot will tap the needle in a divergent pattern from a single rotation point into the tissue. This rotation point is just beneath the perineal skin to have access to the focal prostate tumor lesion. Currently, there is no treatment planning system commercially available which allows optimization of the dose distribution with such a needle arrangement. The aim of this work is to develop an automatic inverse dose planning optimization tool for focal HDR prostate brachytherapy with needle insertions in a divergent configuration. A complete optimizer workflow is proposed which includes the determination of (1) the position of the center of rotation, (2) the needle angulations and (3) the dwell times. Unlike most currently used optimizers, no prior selection or adjustment of input parameters such as minimum or maximum dose or weight coefficients for treatment region and organs at risk is required. To test this optimizer, a planning study was performed on ten patients (treatment volumes ranged from 8.5 cm3 to 23.3 cm3) by using 2-14 needle insertions. The total computation time of the optimizer workflow was below 20 min and a clinically acceptable plan was reached on average using only four needle insertions.

  3. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    Science.gov (United States)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying

    2013-07-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials evaluating the plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of indices, there are minimal differences between the two methods. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.
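
    The program's actual index definitions are held in its XML configuration and are not shown here; the sketch below merely illustrates, under assumed definitions, two dose-volume quantities of the kind such a tool computes (a V_x volume fraction and a D_x% dose) from a per-voxel dose array.

```python
import numpy as np

def v_gy(dose_gy, threshold_gy):
    """Fraction of the structure volume receiving at least `threshold_gy` (e.g. V20)."""
    dose_gy = np.asarray(dose_gy, dtype=float)
    return np.count_nonzero(dose_gy >= threshold_gy) / dose_gy.size

def d_percent(dose_gy, percent):
    """Minimum dose received by the hottest `percent`% of the structure (e.g. D95)."""
    dose_desc = np.sort(np.asarray(dose_gy, dtype=float))[::-1]
    n = max(1, int(round(dose_desc.size * percent / 100.0)))
    return dose_desc[:n].min()
```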

  4. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    International Nuclear Information System (INIS)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Xiao, Ying; Lee, Andrew J

    2013-01-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials evaluating the plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of indices, there are minimal differences between the two methods. The evaluation time is reduced from 10–20 min to 2 min by applying the semi-automated plan-quality evaluation program. (note)

  5. Automated cell analysis tool for a genome-wide RNAi screen with support vector machine based supervised learning

    Science.gov (United States)

    Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen

    2011-03-01

    RNAi-based high-throughput microscopy screens have become an important tool in the biological sciences in order to decrypt mostly unknown biological functions of human genes. However, manual analysis is impossible for such screens since the number of image data sets can often run into the hundreds of thousands. Reliable automated tools are thus required to analyse the fluorescence microscopy image data sets, which usually contain two or more reaction channels. The image analysis tool presented here is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels and the investigated cells can appear in different phenotypes. The main issue of the image processing task is an automatic cell segmentation which has to be robust and accurate for all different phenotypes, followed by phenotype classification. The cell segmentation is done in two steps by segmenting the cell nuclei first and then using a classifier-enhanced region growing on the basis of the cell nuclei to segment the cells. The classification of the cells is realized by a support vector machine which has to be trained manually using supervised learning. Furthermore, the tool is brightness invariant, tolerating differences in staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied to an RNAi screen containing three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
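
    As a generic illustration of the supervised-learning step described above (not the screen's actual feature set or classifier configuration), a scikit-learn pipeline trained on synthetic stand-in features might look like this:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins for per-cell feature vectors (intensity/shape descriptors)
# and manually annotated phenotype labels; a real screen would supply its own.
rng = np.random.default_rng(0)
train_features = rng.normal(size=(200, 8))
train_labels = rng.integers(0, 3, size=200)      # three hypothetical phenotype classes

classifier = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
classifier.fit(train_features, train_labels)
predicted_phenotypes = classifier.predict(rng.normal(size=(10, 8)))
print(predicted_phenotypes)
```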

  6. Gear cutting tools fundamentals of design and computation

    CERN Document Server

    Radzevich, Stephen P

    2010-01-01

    Presents the DG/K-based method of surface generation, a novel and practical mathematical method for designing gear cutting tools with optimal parameters. This book proposes a scientific classification for the various kinds of the gear machining meshes, discussing optimal designs of gear cutting tools.

  7. Computer-mediated-communication and social networking tools at work

    NARCIS (Netherlands)

    Ou, C.X.J.; Sia, C.L.; Hui, C.K.

    2013-01-01

    Purpose – Advances in information technology (IT) have resulted in the development of various computer‐mediated communication (CMC) and social networking tools. However, quantifying the benefits of utilizing these tools in the organizational context remains a challenge. In this study, the authors

  8. Correcting Inconsistencies and Errors in Bacterial Genome Metadata Using an Automated Curation Tool in Excel (AutoCurE).

    Science.gov (United States)

    Schmedes, Sarah E; King, Jonathan L; Budowle, Bruce

    2015-01-01

    Whole-genome data are invaluable for large-scale comparative genomic studies. Current sequencing technologies have made it feasible to sequence entire bacterial genomes with relative ease and speed, at a substantially reduced cost per nucleotide and hence per genome. More than 3,000 bacterial genomes have been sequenced and are available at the finished status. Publicly available genomes can be readily downloaded; however, there are challenges to verify the specific supporting data contained within the download and to identify errors and inconsistencies that may be present within the organizational data content and metadata. AutoCurE, an automated tool for bacterial genome database curation in Excel, was developed to facilitate local database curation of supporting data that accompany downloaded genomes from the National Center for Biotechnology Information. AutoCurE provides an automated approach to curate local genomic databases by flagging inconsistencies or errors, comparing the downloaded supporting data to the genome reports to verify genome name, RefSeq accession numbers, the presence of archaea, BioProject/UIDs, and sequence file descriptions. Flags are generated for nine metadata fields if there are inconsistencies between the downloaded genomes and the genome reports and if erroneous or missing data are evident. AutoCurE is an easy-to-use tool for local database curation for large-scale genome data prior to downstream analyses.
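
    AutoCurE itself runs inside Excel; purely as a loose analogue of the flagging step it performs, the pandas sketch below compares a local metadata table against a genome-report table on a few fields. All column names here are invented for illustration and are not AutoCurE's.

```python
import pandas as pd

def flag_mismatches(local_df, report_df, key="assembly",
                    fields=("refseq_accession", "bioproject", "genome_name")):
    """Return, per field, the keys whose values differ between the two tables.

    `local_df` and `report_df` are pandas DataFrames sharing the `key` column;
    all column names are hypothetical.
    """
    merged = local_df.merge(report_df, on=key, suffixes=("_local", "_report"))
    flags = {}
    for field in fields:
        mismatch = merged[f"{field}_local"] != merged[f"{field}_report"]
        flags[field] = merged.loc[mismatch, key].tolist()
    return flags
```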

  9. Automated home cage observations as a tool to measure the effects of wheel running on cage floor locomotion.

    Science.gov (United States)

    de Visser, Leonie; van den Bos, Ruud; Spruijt, Berry M

    2005-05-28

    This paper introduces automated observations in a modular home cage system as a tool to measure the effects of wheel running on the time distribution and daily organization of cage floor locomotor activity in female C57BL/6 mice. Mice (n = 16) were placed in the home cage system for 6 consecutive days. Fifty percent of the subjects had free access to a running wheel that was integrated in the home cage. Overall activity levels in terms of duration of movement were increased by wheel running, while time spent inside a sheltering box was decreased. Wheel running affected the hourly pattern of movement during the animals' active period of the day. Mice without a running wheel, in contrast to mice with a running wheel, showed a clear differentiation between novelty-induced and baseline levels of locomotion as reflected by a decrease after the first day of introduction to the home cage. The results are discussed in the light of the use of running wheels as a tool to measure general activity and as an object for environmental enrichment. Furthermore, the possibilities of using automated home cage observations for e.g. behavioural phenotyping are discussed.

  10. Effects of Attitudes and Behaviours on Learning Mathematics with Computer Tools

    Science.gov (United States)

    Reed, Helen C.; Drijvers, Paul; Kirschner, Paul A.

    2010-01-01

    This mixed-methods study investigates the effects of student attitudes and behaviours on the outcomes of learning mathematics with computer tools. A computer tool was used to help students develop the mathematical concept of function. In the whole sample (N = 521), student attitudes could account for a 3.4 point difference in test scores between…

  11. Development of a computer-based automated pure tone hearing screening device: a preliminary clinical trial.

    Science.gov (United States)

    Gan, Kok Beng; Azeez, Dhifaf; Umat, Cila; Ali, Mohd Alauddin Mohd; Wahab, Noor Alaudin Abdul; Mukari, Siti Zamratol Mai-Sarah

    2012-10-01

    Hearing screening is important for the early detection of hearing loss. The requirements of specialized equipment, skilled personnel, and quiet environments for valid screening results limit its application in schools and health clinics. This study aimed to develop an automated hearing screening kit (auto-kit) with the capability of real-time noise level monitoring to ensure that the screening is performed in an environment that conforms to the standard. The auto-kit consists of a laptop, a 24-bit resolution sound card, headphones, a microphone, and a graphical user interface, which is calibrated according to the American National Standards Institute S3.6-2004 standard. The auto-kit can present four test tones (500, 1000, 2000, and 4000 Hz) at a 25 or 40 dB HL screening cut-off level. The clinical results at the 40 dB HL screening cut-off level showed that the auto-kit has a sensitivity of 92.5% and a specificity of 75.0%. Because the 500 Hz test tone is not included in the standard hearing screening procedure, it can be excluded from the auto-kit test procedure. The exclusion of the 500 Hz test tone improved the specificity of the auto-kit from 75.0% to 92.3%, which suggests that the auto-kit could be a valid hearing screening device. In conclusion, the auto-kit may be a valuable hearing screening tool, especially in countries where resources are limited.
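    The reported screening metrics follow the standard definitions, illustrated in the Python sketch below; the confusion-matrix counts are hypothetical values chosen only to reproduce the quoted percentages, not the study's raw data.

```python
# Minimal sketch of the screening metrics reported above; the counts are
# placeholders, not the study's data.
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Dropping a test frequency that produces false "refer" results raises
# specificity (fewer false positives) while leaving sensitivity untouched.
sens, spec = sensitivity_specificity(tp=37, fn=3, tn=30, fp=10)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```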

  12. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    Science.gov (United States)

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.
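    The contrast between a linear and a branching pipeline can be illustrated with a short, generic Python sketch (CTAP itself is Matlab/EEGLAB-based, so this is a conceptual analogue, not its API); the step functions are hypothetical placeholders.

```python
# Conceptual sketch of linear vs. branching preprocessing: each step maps
# data to data; a branch applies alternative steps to copies of the same
# input so their outputs can be compared.
import copy

def run_linear(data, steps):
    for step in steps:
        data = step(data)
    return data

def run_branching(data, shared_steps, branches):
    """branches: dict mapping branch name -> list of alternative steps."""
    base = run_linear(data, shared_steps)
    return {name: run_linear(copy.deepcopy(base), steps)
            for name, steps in branches.items()}

# Hypothetical steps for illustration only.
def highpass(d): return {**d, "filtered": True}
def detect_bad_channels_method_a(d): return {**d, "bad_chan_method": "A"}
def detect_bad_channels_method_b(d): return {**d, "bad_chan_method": "B"}

results = run_branching({"eeg": "..."},
                        shared_steps=[highpass],
                        branches={"A": [detect_bad_channels_method_a],
                                  "B": [detect_bad_channels_method_b]})
print(results)
```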

  13. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    Science.gov (United States)

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  14. Computer Art--A New Tool in Advertising Graphics.

    Science.gov (United States)

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  15. Computational tools for cyclotron design, commissioning, and operation

    International Nuclear Information System (INIS)

    Kost, C.J.

    1989-05-01

    Many support systems are required in the design, commissioning, and normal operation of a modern cyclotron. Presented is an overview of the computing environment developed during these various stages at TRIUMF. The current computing environment is also discussed, with emphasis on how one can provide an integrated system which is user-friendly

  16. GoSam-2.0. A tool for automated one-loop calculations within the Standard Model and beyond

    International Nuclear Information System (INIS)

    Cullen, Gavin; Deurzen, Hans van; Greiner, Nicolas

    2014-05-01

    We present version 2.0 of the program package GoSam for the automated calculation of one-loop amplitudes. GoSam is devised to compute one-loop QCD and/or electroweak corrections to multi-particle processes within and beyond the Standard Model. The new code contains improvements in the generation and in the reduction of the amplitudes, performs better in terms of computing time and numerical accuracy, and has an extended range of applicability. The extended version of the "Binoth-Les-Houches-Accord" interface to Monte Carlo programs is also implemented. We give a detailed description of installation and usage of the code, and illustrate the new features in dedicated examples.

  17. Improved cancer detection in automated breast ultrasound by radiologists using Computer Aided Detection

    Energy Technology Data Exchange (ETDEWEB)

    Zelst, J.C.M. van, E-mail: Jan.vanZelst@radboudumc.nl [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Tan, T.; Platel, B. [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Jong, M. de [Jeroen Bosch Medical Centre, Department of Radiology, ‘s-Hertogenbosch (Netherlands); Steenbakkers, A. [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Mourits, M. [Jeroen Bosch Medical Centre, Department of Radiology, ‘s-Hertogenbosch (Netherlands); Grivegnee, A. [Jules Bordet Institute, Department of Radiology, Brussels (Belgium); Borelli, C. [Catholic University of the Sacred Heart, Department of Radiological Sciences, Rome (Italy); Karssemeijer, N.; Mann, R.M. [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands)

    2017-04-15

    Objective: To investigate the effect of dedicated Computer Aided Detection (CAD) software for automated breast ultrasound (ABUS) on the performance of radiologists screening for breast cancer. Methods: 90 ABUS views of 90 patients were randomly selected from a multi-institutional archive of cases collected between 2010 and 2013. This dataset included normal cases (n = 40) with >1 year of follow up, benign (n = 30) lesions that were either biopsied or remained stable, and malignant lesions (n = 20). Six readers evaluated all cases with and without CAD in two sessions. CAD-software included conventional CAD-marks and an intelligent minimum intensity projection of the breast tissue. Readers reported using a likelihood-of-malignancy scale from 0 to 100. Alternative free-response ROC analysis was used to measure the performance. Results: Without CAD, the average area-under-the-curve (AUC) of the readers was 0.77 and significantly improved with CAD to 0.84 (p = 0.001). Sensitivity of all readers improved (range 5.2–10.6%) by using CAD but specificity decreased in four out of six readers (range 1.4–5.7%). No significant difference was observed in the AUC between experienced radiologists and residents both with and without CAD. Conclusions: Dedicated CAD-software for ABUS has the potential to improve the cancer detection rates of radiologists screening for breast cancer.

  18. Improved cancer detection in automated breast ultrasound by radiologists using Computer Aided Detection

    International Nuclear Information System (INIS)

    Zelst, J.C.M. van; Tan, T.; Platel, B.; Jong, M. de; Steenbakkers, A.; Mourits, M.; Grivegnee, A.; Borelli, C.; Karssemeijer, N.; Mann, R.M.

    2017-01-01

    Objective: To investigate the effect of dedicated Computer Aided Detection (CAD) software for automated breast ultrasound (ABUS) on the performance of radiologists screening for breast cancer. Methods: 90 ABUS views of 90 patients were randomly selected from a multi-institutional archive of cases collected between 2010 and 2013. This dataset included normal cases (n = 40) with >1 year of follow up, benign (n = 30) lesions that were either biopsied or remained stable, and malignant lesions (n = 20). Six readers evaluated all cases with and without CAD in two sessions. CAD-software included conventional CAD-marks and an intelligent minimum intensity projection of the breast tissue. Readers reported using a likelihood-of-malignancy scale from 0 to 100. Alternative free-response ROC analysis was used to measure the performance. Results: Without CAD, the average area-under-the-curve (AUC) of the readers was 0.77 and significantly improved with CAD to 0.84 (p = 0.001). Sensitivity of all readers improved (range 5.2–10.6%) by using CAD but specificity decreased in four out of six readers (range 1.4–5.7%). No significant difference was observed in the AUC between experienced radiologists and residents both with and without CAD. Conclusions: Dedicated CAD-software for ABUS has the potential to improve the cancer detection rates of radiologists screening for breast cancer.

  19. Automated agents for management and control of the ALICE Computing Grid

    International Nuclear Information System (INIS)

    Grigoras, C; Betev, L; Carminati, F; Legrand, I; Voicu, R

    2010-01-01

    A complex software environment such as the ALICE Computing Grid infrastructure requires permanent control and management for the large set of services involved. Automating control procedures reduces the human interaction with the various components of the system and yields better availability of the overall system. In this paper we will present how we used the MonALISA framework to gather, store and display the relevant metrics in the entire system from central and remote site services. We will also show the automatic local and global procedures that are triggered by the monitored values. Decision-taking agents are used to restart remote services, alert the operators in case of problems that cannot be automatically solved, submit production jobs, replicate and analyze raw data, resource load-balance and other control mechanisms that optimize the overall work flow and simplify day-to-day operations. Synthetic graphical views for all operational parameters, correlations, state of services and applications as well as the full history of all monitoring metrics are available for the entire system that now encompasses 85 sites all over the world, more than 14000 CPU cores and 10 PB of storage.
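    A toy sketch of the threshold-driven decision logic described above is given below in Python; the metric names, thresholds and the single "restart" action are illustrative assumptions, not the actual MonALISA agent interface.

```python
# Toy sketch of a threshold-driven decision agent: metrics exceeding their
# configured limits trigger an automatic action (names and values are made up).
def decide(metrics, thresholds):
    """Return a list of (metric, action) decisions from monitored values."""
    actions = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value > limit:
            # First try an automatic restart; a real agent would escalate to
            # operators when the problem cannot be solved automatically.
            actions.append((name, "restart"))
    return actions

metrics = {"site_X.storage_errors": 12, "site_Y.job_failure_rate": 0.02}
thresholds = {"site_X.storage_errors": 5, "site_Y.job_failure_rate": 0.10}
print(decide(metrics, thresholds))   # [('site_X.storage_errors', 'restart')]
```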

  20. An observer study comparing spot imaging regions selected by radiologists and a computer for an automated stereo spot mammography technique

    International Nuclear Information System (INIS)

    Goodsitt, Mitchell M.; Chan, Heang-Ping; Lydick, Justin T.; Gandra, Chaitanya R.; Chen, Nelson G.; Helvie, Mark A.; Bailey, Janet E.; Roubidoux, Marilyn A.; Paramagul, Chintana; Blane, Caroline E.; Sahiner, Berkman; Petrick, Nicholas A.

    2004-01-01

    We are developing an automated stereo spot mammography technique for improved imaging of suspicious dense regions within digital mammograms. The technique entails the acquisition of a full-field digital mammogram, automated detection of a suspicious dense region within that mammogram by a computer aided detection (CAD) program, and acquisition of a stereo pair of images with automated collimation to the suspicious region. The latter stereo spot image is obtained within seconds of the original full-field mammogram, without releasing the compression paddle. The spot image is viewed on a stereo video display. A critical element of this technique is the automated detection of suspicious regions for spot imaging. We performed an observer study to compare the suspicious regions selected by radiologists with those selected by a CAD program developed at the University of Michigan. True regions of interest (TROIs) were separately determined by one of the radiologists who reviewed the original mammograms, biopsy images, and histology results. We compared the radiologist and computer-selected regions of interest (ROIs) to the TROIs. Both the radiologists and the computer were allowed to select up to 3 regions in each of 200 images (mixture of 100 CC and 100 MLO views). We computed overlap indices (the overlap index is defined as the ratio of the area of intersection to the area of interest) to quantify the agreement between the selected regions in each image. The averages of the largest overlap indices per image for the 5 radiologist-to-computer comparisons were directly related to the average number of regions per image traced by the radiologists (about 50% for 1 region/image, 84% for 2 regions/image and 96% for 3 regions/image). The average of the overlap indices with all of the TROIs was 73% for CAD and 76.8%+/-10.0% for the radiologists. This study indicates that the CAD determined ROIs could potentially be useful for a screening technique that includes stereo spot
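    As a concrete illustration of the overlap index defined above (area of intersection divided by the area of the selected region of interest), the following Python sketch computes it from two binary masks; the masks here are synthetic examples, not study data.

```python
# Minimal sketch of the overlap index: intersection area divided by the area
# of the selected ROI, computed on boolean masks of identical shape.
import numpy as np

def overlap_index(roi_mask, true_roi_mask):
    intersection = np.logical_and(roi_mask, true_roi_mask).sum()
    return intersection / roi_mask.sum()

roi = np.zeros((100, 100), dtype=bool);  roi[20:60, 20:60] = True
troi = np.zeros((100, 100), dtype=bool); troi[30:70, 30:70] = True
print(f"overlap index = {overlap_index(roi, troi):.2f}")
```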

  1. Automated egg grading system using computer vision: Investigation on weight measure versus shape parameters

    Science.gov (United States)

    Nasir, Ahmad Fakhri Ab; Suhaila Sabarudin, Siti; Majeed, Anwar P. P. Abdul; Ghani, Ahmad Shahrizan Abdul

    2018-04-01

    Chicken eggs are a food in high demand by humans. Human operators cannot work perfectly and continuously when conducting egg grading. Instead of grading eggs by weight measurement, an automatic egg grading system using computer vision (based on egg shape parameters) can be used to improve the productivity of egg grading. However, an early hypothesis indicated that more eggs would change class when graded by shape parameters than when graded by weight. This paper presents a comparison of egg classification by the two above-mentioned methods. Firstly, 120 images of chicken eggs of various grades (A–D) produced in Malaysia are captured. Then, the egg images are processed using image pre-processing techniques such as image cropping, smoothing and segmentation. Thereafter, eight egg shape features, including area, major axis length, minor axis length, volume, diameter and perimeter, are extracted. Lastly, feature selection (information gain ratio) and feature extraction (principal component analysis) are performed, and a k-nearest neighbour classifier is used in the classification process. Two approaches, namely supervised learning (using weight-based grades assigned by the egg supplier) and unsupervised learning (using shape-based grades assigned by the authors), are used to conduct the experiment. Clustering results reveal many changes in egg classes after performing shape-based grading. On average, the best recognition result using the shape-based grading labels is 94.16%, while using the weight-based labels it is 44.17%. In conclusion, an automated egg grading system using computer vision benefits from shape-based features, since it works from images, whereas the weight parameter is more suitable for a weight-based grading system.
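    A hedged sketch of the classification step (principal component analysis followed by a k-nearest neighbour classifier) is given below using scikit-learn; the feature matrix is synthetic stand-in data, not the paper's 120-egg dataset.

```python
# Sketch of PCA + k-NN classification on egg-shape features; the feature
# values and labels are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))          # 120 eggs x 8 shape features
y = rng.integers(0, 4, size=120)       # grades A-D encoded as 0-3

model = make_pipeline(StandardScaler(), PCA(n_components=4),
                      KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2%}")
```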

  2. Identification and red blood cell automated counting from blood smear images using computer-aided system.

    Science.gov (United States)

    Acharya, Vasundhara; Kumar, Preetham

    2018-03-01

    Red blood cell count plays a vital role in identifying the overall health of the patient. Hospitals use the hemocytometer to count blood cells. The conventional method of placing the smear under a microscope and counting the cells manually leads to erroneous results and puts medical laboratory technicians under stress. A computer-aided system will help to attain precise results in less time. This research work proposes an image-processing technique for counting the number of red blood cells. It aims to examine and process the blood smear image in order to support the counting of red blood cells and to identify the number of normal and abnormal cells in the image automatically. The K-medoids algorithm, which is robust to external noise, is used to extract the white blood cells (WBCs) from the image. Granulometric analysis is used to separate the red blood cells from the white blood cells. The red blood cells obtained are counted using the labeling algorithm and the circular Hough transform. The radius range for the circle-drawing algorithm is estimated by computing the distance of the pixels from the boundary, which automates the entire algorithm. A comparison is done between the counts obtained using the labeling algorithm and the circular Hough transform. Results of the work showed that the circular Hough transform was more accurate in counting the red blood cells than the labeling algorithm, as it was successful in identifying even overlapping cells. The work also compares the cell counts obtained using the proposed methodology and the manual approach. The work is designed to address all the drawbacks of the previous research work. The research work can be extended to extract various texture and shape features of the identified abnormal cells so that diseases such as anemia of inflammation and chronic disease can be detected as early as possible.
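    The circle-counting step can be illustrated with OpenCV's built-in circular Hough transform, as in the sketch below; the image path and the radius/sensitivity parameters are placeholders that would need tuning for real smear images, and this is not the authors' exact pipeline.

```python
# Illustrative use of OpenCV's circular Hough transform to count roughly
# circular cells in a smear image; parameters and file name are placeholders.
import cv2

img = cv2.imread("blood_smear.png", cv2.IMREAD_GRAYSCALE)
img = cv2.medianBlur(img, 5)                      # suppress noise before detection

circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1.2, minDist=15,
                           param1=100, param2=30, minRadius=8, maxRadius=25)

count = 0 if circles is None else circles.shape[1]
print(f"detected {count} candidate red blood cells")
```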

  3. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  4. Recommendations for the Use of Automated Gray Matter Segmentation Tools: Evidence from Huntington’s Disease

    Science.gov (United States)

    Johnson, Eileanoir B.; Gregory, Sarah; Johnson, Hans J.; Durr, Alexandra; Leavitt, Blair R.; Roos, Raymund A.; Rees, Geraint; Tabrizi, Sarah J.; Scahill, Rachael I.

    2017-01-01

    The selection of an appropriate segmentation tool is a challenge facing any researcher aiming to measure gray matter (GM) volume. Many tools have been compared, yet there is currently no method that can be recommended above all others; in particular, there is a lack of validation in disease cohorts. This work utilizes a clinical dataset to conduct an extensive comparison of segmentation tools. Our results confirm that all tools have advantages and disadvantages, and we present a series of considerations that may be of use when selecting a GM segmentation method, rather than a ranking of these tools. Seven segmentation tools were compared using 3 T MRI data from 20 controls, 40 premanifest Huntington’s disease (HD), and 40 early HD participants. Segmented volumes underwent detailed visual quality control. Reliability and repeatability of total, cortical, and lobular GM were investigated in repeated baseline scans. The relationship between each tool was also examined. Longitudinal within-group change over 3 years was assessed via generalized least squares regression to determine sensitivity of each tool to disease effects. Visual quality control and raw volumes highlighted large variability between tools, especially in occipital and temporal regions. Most tools showed reliable performance and the volumes were generally correlated. Results for longitudinal within-group change varied between tools, especially within lobular regions. These differences highlight the need for careful selection of segmentation methods in clinical neuroimaging studies. This guide acts as a primer aimed at the novice or non-technical imaging scientist providing recommendations for the selection of cohort-appropriate GM segmentation software. PMID:29066997

  5. Recommendations for the Use of Automated Gray Matter Segmentation Tools: Evidence from Huntington’s Disease

    Directory of Open Access Journals (Sweden)

    Eileanoir B. Johnson

    2017-10-01

    Full Text Available The selection of an appropriate segmentation tool is a challenge facing any researcher aiming to measure gray matter (GM) volume. Many tools have been compared, yet there is currently no method that can be recommended above all others; in particular, there is a lack of validation in disease cohorts. This work utilizes a clinical dataset to conduct an extensive comparison of segmentation tools. Our results confirm that all tools have advantages and disadvantages, and we present a series of considerations that may be of use when selecting a GM segmentation method, rather than a ranking of these tools. Seven segmentation tools were compared using 3 T MRI data from 20 controls, 40 premanifest Huntington’s disease (HD), and 40 early HD participants. Segmented volumes underwent detailed visual quality control. Reliability and repeatability of total, cortical, and lobular GM were investigated in repeated baseline scans. The relationship between each tool was also examined. Longitudinal within-group change over 3 years was assessed via generalized least squares regression to determine sensitivity of each tool to disease effects. Visual quality control and raw volumes highlighted large variability between tools, especially in occipital and temporal regions. Most tools showed reliable performance and the volumes were generally correlated. Results for longitudinal within-group change varied between tools, especially within lobular regions. These differences highlight the need for careful selection of segmentation methods in clinical neuroimaging studies. This guide acts as a primer aimed at the novice or non-technical imaging scientist providing recommendations for the selection of cohort-appropriate GM segmentation software.

  6. Recommendations for the Use of Automated Gray Matter Segmentation Tools: Evidence from Huntington's Disease.

    Science.gov (United States)

    Johnson, Eileanoir B; Gregory, Sarah; Johnson, Hans J; Durr, Alexandra; Leavitt, Blair R; Roos, Raymund A; Rees, Geraint; Tabrizi, Sarah J; Scahill, Rachael I

    2017-01-01

    The selection of an appropriate segmentation tool is a challenge facing any researcher aiming to measure gray matter (GM) volume. Many tools have been compared, yet there is currently no method that can be recommended above all others; in particular, there is a lack of validation in disease cohorts. This work utilizes a clinical dataset to conduct an extensive comparison of segmentation tools. Our results confirm that all tools have advantages and disadvantages, and we present a series of considerations that may be of use when selecting a GM segmentation method, rather than a ranking of these tools. Seven segmentation tools were compared using 3 T MRI data from 20 controls, 40 premanifest Huntington's disease (HD), and 40 early HD participants. Segmented volumes underwent detailed visual quality control. Reliability and repeatability of total, cortical, and lobular GM were investigated in repeated baseline scans. The relationship between each tool was also examined. Longitudinal within-group change over 3 years was assessed via generalized least squares regression to determine sensitivity of each tool to disease effects. Visual quality control and raw volumes highlighted large variability between tools, especially in occipital and temporal regions. Most tools showed reliable performance and the volumes were generally correlated. Results for longitudinal within-group change varied between tools, especially within lobular regions. These differences highlight the need for careful selection of segmentation methods in clinical neuroimaging studies. This guide acts as a primer aimed at the novice or non-technical imaging scientist providing recommendations for the selection of cohort-appropriate GM segmentation software.

  7. A tool for computing diversity and consideration on differences between diversity indices

    OpenAIRE

    Palaghianu, Ciprian

    2016-01-01

    Diversity represents a key concept in ecology, and there are various methods of assessing it. The multitude of diversity indices is quite puzzling and sometimes difficult to compute for a large volume of data. This paper promotes a computational tool used to assess the diversity of different entities. The BIODIV software is a user-friendly tool developed using Microsoft Visual Basic. It is capable of computing several diversity indices such as Shannon, Simpson, Pielou, Brillouin, Berger-Park...
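    For reference, two of the listed indices have standard closed forms that are easy to compute; the Python sketch below (independent of the BIODIV implementation) evaluates the Shannon index and the complement form of the Simpson index from an example abundance vector.

```python
# Standard formulas: Shannon H' = -sum(p_i * ln p_i) and Simpson 1-D =
# 1 - sum(p_i^2), computed from species abundance counts (example data).
import math

def shannon(counts):
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def simpson(counts):
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

abundances = [33, 29, 28, 5, 5]      # example abundance vector
print(f"Shannon H' = {shannon(abundances):.3f}")
print(f"Simpson 1-D = {simpson(abundances):.3f}")
```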

  8. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    Energy Technology Data Exchange (ETDEWEB)

    Grout, R. W.

    2013-10-01

    This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification processes and similar processes. The algorithm is based on the 'Photon Monte Carlo' approach and is implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
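    A heavily reduced sketch of the Photon Monte Carlo idea is shown below: photons leave a point source in random directions and are tallied as absorbed when their rays hit opaque spherical particles. The geometry, particle data and absorption model are illustrative assumptions, not the IHT implementation.

```python
# Very reduced Photon Monte Carlo sketch: sample emission directions, test
# ray-sphere intersection against opaque particles, tally absorbed fraction.
import numpy as np

rng = np.random.default_rng(1)

def random_directions(n):
    """Uniformly distributed unit vectors on the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def hits_sphere(origin, direction, center, radius):
    """Ray-sphere intersection test for a ray starting at `origin`."""
    oc = origin - center
    b = np.dot(direction, oc)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - c
    return disc >= 0 and (-b + np.sqrt(disc)) > 0   # hit in front of origin

source = np.zeros(3)
particles = [(np.array([0.5, 0.0, 0.0]), 0.1),      # (center, radius), made up
             (np.array([0.0, 0.8, 0.2]), 0.15)]

n_photons = 100_000
absorbed = sum(1 for d in random_directions(n_photons)
               if any(hits_sphere(source, d, c, r) for c, r in particles))
print(f"absorbed fraction: {absorbed / n_photons:.4f}")
```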

  9. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for the formal modelling of relay interlocking systems and explains how it has been stepwise and formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as its result a state transition system modelling

  10. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available
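    Improvement (2), splitting large sequence files for better load balance, can be sketched in a few lines of Python because FASTQ records occupy exactly four lines each; the file names and chunk size below are placeholders, and this is not Rainbow's actual code.

```python
# Split a FASTQ file on 4-line record boundaries so chunks can be processed
# on separate workers; file name and chunk size are placeholders.
from itertools import islice

def split_fastq(path, records_per_chunk=1_000_000):
    with open(path) as fh:
        chunk = 0
        while True:
            lines = list(islice(fh, records_per_chunk * 4))
            if not lines:
                break
            with open(f"{path}.part{chunk:04d}", "w") as out:
                out.writelines(lines)
            chunk += 1
    return chunk

n_parts = split_fastq("sample_R1.fastq")
print(f"wrote {n_parts} chunks")
```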

  11. ERP Human Enhancement Progress Report : Use case and computational model for adaptive maritime automation

    NARCIS (Netherlands)

    Kleij, R. van der; Broek, J. van den; Brake, G.M. te; Rypkema, J.A.; Schilder, C.M.C.

    2015-01-01

    Automation is often applied in order to increase the cost-effectiveness, reliability and safety of maritime ship and offshore operations. Automation of operator tasks has not, however, eliminated human error so much as created opportunities for new kinds of error. The ambition of the Adaptive

  12. Risks and benefits of social computing as a healthcare tool

    CSIR Research Space (South Africa)

    Mxoli, Avuya

    2016-03-01

    Full Text Available Cybercitizen describes a frequent user of the Internet or in other terms, a member of an online community (cybercommunity). This digital space can be used to participate in educational, economical and cultural activities. Social computing...

  13. Computers in the Classroom: From Tool to Medium.

    Science.gov (United States)

    Perrone, Corrina; Repenning, Alexander; Spencer, Sarah; Ambach, James

    1996-01-01

    Discusses the computer as a communication medium to support learning. Illustrates the benefits of this reconceptualization in the context of having students author and play interactive simulation games and exchange them over the Internet. (RS)

  14. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 3: HARP Graphics Oriented (GO) input user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.

  15. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    Science.gov (United States)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  16. Computer-generated movies as an analytic tool

    International Nuclear Information System (INIS)

    Elliott, R.L.

    1978-01-01

    One of the problems faced by the users of large, sophisticated modeling programs at the Los Alamos Scientific Laboratory (LASL) is the analysis of the results of their calculations. One of the more productive and frequently spectacular methods is the production of computer-generated movies. An overview of the generation of computer movies at LASL is presented. The hardware, software, and generation techniques are briefly discussed

  17. Assessing mouse behaviour throughout the light/dark cycle using automated in-cage analysis tools.

    Science.gov (United States)

    Bains, Rasneer S; Wells, Sara; Sillito, Rowland R; Armstrong, J Douglas; Cater, Heather L; Banks, Gareth; Nolan, Patrick M

    2018-04-15

    An important factor in reducing variability in mouse test outcomes has been to develop assays that can be used for continuous automated home cage assessment. Our experience has shown that this has been most evidenced in long-term assessment of wheel-running activity in mice. Historically, wheel-running in mice and other rodents has been used as a robust assay to determine, with precision, the inherent period of circadian rhythms in mice. Furthermore, this assay has been instrumental in dissecting the molecular genetic basis of mammalian circadian rhythms. In teasing out the elements of this test that have determined its robustness - automated assessment of an unforced behaviour in the home cage over long time intervals - we and others have been investigating whether similar test apparatus could be used to accurately discriminate differences in distinct behavioural parameters in mice. Firstly, using these systems, we explored behaviours in a number of mouse inbred strains to determine whether we could extract biologically meaningful differences. Secondly, we tested a number of relevant mutant lines to determine how discriminative these parameters were. Our findings show that, when compared to conventional out-of-cage phenotyping, a far deeper understanding of mouse mutant phenotype can be established by monitoring behaviour in the home cage over one or more light:dark cycles. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  18. Development of an Automated Decision-Making Tool for Supervisory Control System

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Muhlheim, Michael David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Flanagan, George F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    This technical report was generated as a product of the Supervisory Control for Multi-Modular Small Modular Reactor (SMR) Plants project within the Instrumentation, Control and Human-Machine Interface technology area under the Advanced Small Modular Reactor (AdvSMR) Research and Development Program of the US Department of Energy. The report documents the definition of strategies, functional elements, and the structural architecture of a supervisory control system for multi-modular AdvSMR plants. This research activity advances the state of the art by incorporating real-time, probabilistic-based decision-making into the supervisory control system architectural layers through the introduction of a tiered-plant system approach. The report provides background information on the state of the art of automated decision-making, including the description of existing methodologies. It then presents a description of a generalized decision-making framework, upon which the supervisory control decision-making algorithm is based. The probabilistic portion of automated decision-making is demonstrated through a simple hydraulic loop example.

  19. Tools for a simulation supported commissioning of the automation of HVAC plants. Hardware-in-the-loop in building automation; Werkzeuge fuer eine simulationsgestuetzte Inbetriebnahme der Automation von RLT- Anlagen. Hardware-in-the-Loop in der Gebaeudeautomation

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Andreas; Sokollik, Frank [Hochschule Merseburg (Germany). Fachbereich Informatik und Kommunikationssysteme

    2012-07-01

    Hardware-in-the-loop (HiL) is a method for testing and validating technical automation solutions against virtual processes in a simulation environment. Applied to the automation of indoor air handling systems, it allows commissioning tests of the controller to be carried out in advance against a simulated plant. These tests can be used, for example, to find logic errors during program development or to tune controller parameters. The tuning of parameters can be performed independently of the season by modifying the ambient climatic conditions. The plant parameters can be tested under dynamic conditions. The control behaviour can be visualized by ramping up load conditions on dynamic HVAC components and optimized if necessary. Within BMBF-funded projects, a HiL solution was developed in a .NET environment. The coupling of simulation and control takes place via the bus systems CAN and BACnet. The elements of the air-conditioner simulation are implemented in an object-oriented manner in the programming language C and are based on the solution of dynamic mass and energy balances. The HiL features are implemented in a multi-client architecture, covering primarily the simulation and communication. Other implemented features include: import of virtual systems from a CAE system, adjustment of simulation parameters using structured parameter sets, features for distributed simulation of complex systems over the network, a tool for the dimensioning of controllers, and chart and visualization features.

  20. An automated tool for cortical feature analysis: Application to differences on 7 Tesla T2* -weighted images between young and older healthy subjects.

    Science.gov (United States)

    Doan, Nhat Trung; van Rooden, Sanneke; Versluis, Maarten J; Buijs, Mathijs; Webb, Andrew G; van der Grond, Jeroen; van Buchem, Mark A; Reiber, Johan H C; Milles, Julien

    2015-07-01

    High field T2*-weighted MR images of the cerebral cortex are increasingly used to study tissue susceptibility changes related to aging or pathologies. This paper presents a novel automated method for the computation of quantitative cortical measures and group-wise comparison using 7 Tesla T2*-weighted magnitude and phase images. The cerebral cortex was segmented using a combination of T2*-weighted magnitude and phase information and subsequently was parcellated based on an anatomical atlas. Local gray matter (GM)/white matter (WM) contrast and cortical profiles, which depict the magnitude or phase variation across the cortex, were computed from the magnitude and phase images in each parcellated region and further used for group-wise comparison. Differences in local GM/WM contrast were assessed using linear regression analysis. Regional cortical profiles were compared both globally and locally using permutation testing. The method was applied to compare a group of 10 young volunteers with a group of 15 older subjects. Using local GM/WM contrast, significant differences were revealed in at least 13 of 17 studied regions. Highly significant differences between cortical profiles were shown in all regions. The proposed method can be a useful tool for studying cortical changes in normal aging and potentially in neurodegenerative diseases. Magn Reson Med 74:240-248, 2015. © 2014 Wiley Periodicals, Inc.

  1. Flow Mapping in a Gas-Solid Riser via Computer Automated Radioactive Particle Tracking (CARPT)

    Energy Technology Data Exchange (ETDEWEB)

    Muthanna Al-Dahhan; Milorad P. Dudukovic; Satish Bhusarapu; Timothy J. O' hern; Steven Trujillo; Michael R. Prairie

    2005-06-04

    Statement of the Problem: Developing and disseminating a general and experimentally validated model for turbulent multiphase fluid dynamics suitable for engineering design purposes in industrial scale applications of riser reactors and pneumatic conveying requires collecting reliable data on solids trajectories, velocities (averaged and instantaneous), solids holdup distribution and solids fluxes in the riser as a function of operating conditions. Such data are currently not available on the same system. The Multiphase Fluid Dynamics Research Consortium (MFDRC) was established to address these issues on a chosen example of a circulating fluidized bed (CFB) reactor, which is widely used in the petroleum and chemical industry, including coal combustion. This project addresses the problem of lacking reliable data to advance CFB technology. Project Objectives: The objective of this project is to advance the understanding of the solids flow pattern and mixing in a well-developed flow region of a gas-solid riser, operated at different gas flow rates and solids loadings, using state-of-the-art non-intrusive measurements. This work creates an insight and reliable database for local solids fluid-dynamic quantities in a pilot-plant scale CFB, which can then be used to validate/develop phenomenological models for the riser. This study also attempts to provide benchmark data for validation of Computational Fluid Dynamic (CFD) codes and their current closures. Technical Approach: The non-invasive Computer Automated Radioactive Particle Tracking (CARPT) technique provides the complete Eulerian solids flow field (time average velocity map and various turbulence parameters such as the Reynolds stresses, turbulent kinetic energy, and eddy diffusivities). It also gives directly the Lagrangian information of solids flow and yields the true solids residence time distribution (RTD). Another radiation based technique, Computed Tomography (CT), yields detailed time averaged local holdup profiles at

  2. Evaluating tablet computers as a survey tool in rural communities.

    Science.gov (United States)

    Newell, Steve M; Logan, Henrietta L; Guo, Yi; Marks, John G; Shepperd, James A

    2015-01-01

    Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants' responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida's state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants' usability ratings. Overall, participants reported similar scales responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  3. Safe manning of merchant ships: an approach and computer tool

    DEFF Research Database (Denmark)

    Alapetite, Alexandre; Kozin, Igor

    2017-01-01

    In the shipping industry, staffing expenses have become a vital competition parameter. In this paper, an approach and a software tool are presented to support decisions on the staffing of merchant ships. The tool is implemented in the form of a Web user interface that makes use of discrete-event simulation and allows estimation of the workload and of whether different scenarios are successfully performed taking account of the number of crewmembers, watch schedules, distribution of competencies, and others. The software library ‘SimManning’ at the core of the project is provided as open source...

  4. Automated real time peg and tool detection for the FLS trainer box.

    Science.gov (United States)

    Nemani, Arun; Sankaranarayanan, Ganesh

    2012-01-01

    This study proposes a method that effectively tracks trocar tool and peg positions in real time to allow real time assessment of the peg transfer task of the Fundamentals of Laparoscopic Surgery (FLS). By utilizing custom code along with OpenCV libraries, tool and peg positions can be accurately tracked without altering the original setup conditions of the FLS trainer box. This is achieved via a series of image filtration sequences, thresholding functions, and Haar training methods.
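    A hedged sketch of the kind of thresholding-and-contour sequence described above is shown below using standard OpenCV calls; the HSV bounds, morphology kernel and image path are placeholders to be calibrated for the actual FLS trainer box, and this is not the authors' custom code.

```python
# Segment brightly coloured pegs by HSV thresholding and report their
# centroids; bounds and file name are placeholders to be calibrated.
import cv2
import numpy as np

frame = cv2.imread("fls_frame.png")
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

lower, upper = np.array([100, 80, 80]), np.array([130, 255, 255])   # e.g. blue pegs
mask = cv2.inRange(hsv, lower, upper)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for cnt in contours:
    m = cv2.moments(cnt)
    if m["m00"] > 0:
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
        print(f"peg candidate at ({cx}, {cy})")
```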

  5. Implementing iRound: A Computer-Based Auditing Tool.

    Science.gov (United States)

    Brady, Darcie

    Many hospitals use rounding or auditing as a tool to help identify gaps and needs in quality and process performance. Some hospitals are also using rounding to help improve patient experience. It is known that purposeful rounding helps improve Hospital Consumer Assessment of Healthcare Providers and Systems scores by helping manage patient expectations, provide service recovery, and recognize quality caregivers. Rounding works when a standard method is used across the facility, where data are comparable and trustworthy. This facility had a pen-and-paper process in place that made data reporting difficult, created a silo culture between departments, and most audits and rounds were completed differently on each unit. It was recognized that this facility needed to standardize the rounding and auditing process. The tool created by the Advisory Board called iRound was chosen as the tool this facility would use for patient experience rounds as well as process and quality rounding. The success of the iRound tool in this facility depended on several factors that started many months before implementation to current everyday usage.

  6. Computational tool for simulation of power and refrigeration cycles

    Science.gov (United States)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for simulating power cycles makes it possible to model the optimal changes for best performance. There is also strong research interest in the Organic Rankine Cycle (ORC), which aims to generate electricity at low power levels through cogeneration and in which the working fluid is usually a refrigerant. A tool for designing the elements of an ORC cycle and selecting the working fluid would be helpful, because cogeneration heat sources vary widely and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in C++ with a graphical interface developed in the cross-platform Qt environment that runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles, selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculation of the plant efficiency, identification of the flow fractions in each branch and, finally, generation of a highly instructive report in PDF format via LaTeX.
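    As a minimal example of the kind of cycle calculation such a tool automates, the Python sketch below evaluates an ideal Rankine cycle for water through CoolProp's PropsSI interface; the pressures and turbine inlet temperature are assumed illustrative values, not figures from the paper.

```python
# Ideal Rankine cycle for water using CoolProp's PropsSI; state numbering:
# 1 condenser exit (sat. liquid), 2 pump exit, 3 boiler exit, 4 turbine exit.
from CoolProp.CoolProp import PropsSI

p_low, p_high, T_turbine_in = 10e3, 8e6, 773.15   # Pa, Pa, K (assumed values)

h1 = PropsSI('H', 'P', p_low, 'Q', 0, 'Water')              # saturated liquid
s1 = PropsSI('S', 'P', p_low, 'Q', 0, 'Water')
h2 = PropsSI('H', 'P', p_high, 'S', s1, 'Water')            # isentropic pump
h3 = PropsSI('H', 'P', p_high, 'T', T_turbine_in, 'Water')  # boiler exit
s3 = PropsSI('S', 'P', p_high, 'T', T_turbine_in, 'Water')
h4 = PropsSI('H', 'P', p_low, 'S', s3, 'Water')             # isentropic turbine

eta = ((h3 - h4) - (h2 - h1)) / (h3 - h2)
print(f"ideal Rankine thermal efficiency: {eta:.1%}")
```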

  7. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2013-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in the standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and re-usability of the visua...

  8. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2014-01-01

    This contribution summarizes evolution of the ATLAS Distributed Computing (ADC) Monitoring project during the LHC Run I. The ADC Monitoring targets at the three groups of customers: ADC Operations team to early identify malfunctions and escalate issues to an activity or a service expert, ATLAS national contacts and sites for the real-time monitoring and long-term measurement of the performance of the provided computing resources, and the ATLAS Management for long-term trends and accounting information about the ATLAS Distributed Computing resources.\\\\ During the LHC Run I a significant development effort has been invested in standardization of the monitoring and accounting applications in order to provide extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind visual identity of the provided graphical elements, and re-usability of the visua...

  9. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    Directory of Open Access Journals (Sweden)

    Kumar Parijat Tripathi

    Full Text Available RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting these extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery) tools. It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction related information. It clusters the transcripts based on functional annotations and generates a tabular report of functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA (ncRNA) by ab initio methods) helps to identify non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software for both NGS and array data. It helps users to characterize the de-novo assembled reads obtained from NGS experiments for non-referenced organisms, while it also performs the functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. Web application is

  10. Computer algebra as a research tool in physics

    International Nuclear Information System (INIS)

    Drouffe, J.M.

    1985-04-01

    The progress of computer algebra observed during these last years has certainly had an impact in physics. I want to specify the role of these new techniques in this application domain and to analyze their present limitations. In Section 1, I describe briefly the use of algebraic manipulation programs at the elementary level. The numerical and symbolic solutions of problems are compared in Section 2. Section 3 is devoted to a prospective view of the use of computer algebra at the highest level, as an 'intelligent' system. I recall in Section 4 what is required from a system to be used in physics

  11. Automated linear regression tools improve RSSI WSN localization in multipath indoor environment

    Directory of Open Access Journals (Sweden)

    Laermans Eric

    2011-01-01

    Full Text Available Abstract Received signal strength indication (RSSI)-based localization is emerging in wireless sensor networks (WSNs). Localization algorithms need to include the physical and hardware limitations of RSSI measurements in order to give more accurate results in dynamic real-life indoor environments. In this study, we use the Interdisciplinary Institute for Broadband Technology real-life test bed and present an automated method to optimize and calibrate the experimental data before offering them to a positioning engine. In a preprocessing localization step, we introduce a new method to provide bounds for the range, thereby further improving the accuracy of our simple and fast 2D localization algorithm based on corrected distance circles. A maximum likelihood algorithm with a mean square error cost function has a higher position error median than our algorithm. Our experiments further show that the complete proposed algorithm eliminates outliers and avoids any manual calibration procedure.
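
    For readers unfamiliar with RSSI ranging, the sketch below shows the standard log-distance path-loss model fitted by linear regression and then inverted to estimate range. It is a generic illustration under assumed calibration values, not the paper's implementation or test-bed data.

```python
# Fit RSSI = A - 10*n*log10(d) by linear regression, then invert to estimate range.
# The anchor distances and RSSI readings below are invented calibration values.
import numpy as np

d = np.array([1.0, 2.0, 4.0, 8.0, 16.0])              # known anchor distances (m)
rssi = np.array([-40.0, -47.0, -53.0, -61.0, -68.0])  # measured RSSI (dBm)

x = -10.0 * np.log10(d)
n_hat, A_hat = np.polyfit(x, rssi, 1)                 # slope ~ path-loss exponent n, intercept ~ A

def estimate_range(rssi_dbm):
    """Invert the fitted path-loss model to get a distance estimate in metres."""
    return 10 ** ((A_hat - rssi_dbm) / (10.0 * n_hat))

print(f"n = {n_hat:.2f}, A = {A_hat:.1f} dBm, range(-55 dBm) = {estimate_range(-55.0):.1f} m")
```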

  12. Tools for Designing, Evaluating, and Certifying NextGen Technologies and Procedures: Automation Roles and Responsibilities

    Science.gov (United States)

    Kanki, Barbara G.

    2011-01-01

    Barbara Kanki from NASA Ames Research Center will discuss research that focuses on the collaborations between pilots, air traffic controllers and dispatchers that will change in NextGen systems as automation increases and roles and responsibilities change. The approach taken by this NASA Ames team is to build a collaborative systems assessment template (CSAT) based on detailed task descriptions within each system to establish a baseline of the current operations. The collaborative content and context are delineated through the review of regulatory and advisory materials, policies, procedures and documented practices as augmented by field observations and interviews. The CSAT is developed to aid the assessment of key human factors and performance tradeoffs that result from considering different collaborative arrangements under NextGen system changes. In theory, the CSAT product may be applied to any NextGen application (such as Trajectory Based Operations) with specified ground and aircraft capabilities.

  13. The Spiral Discovery Network as an Automated General-Purpose Optimization Tool

    Directory of Open Access Journals (Sweden)

    Adam B. Csapo

    2018-01-01

    Full Text Available The Spiral Discovery Method (SDM) was originally proposed as a cognitive artifact for dealing with black-box models that are dependent on multiple inputs with nonlinear and/or multiplicative interaction effects. Besides directly helping to identify functional patterns in such systems, SDM also simplifies their control through its characteristic spiral structure. In this paper, a neural network-based formulation of SDM is proposed together with a set of automatic update rules that makes it suitable for both semiautomated and automated forms of optimization. The behavior of the generalized SDM model, referred to as the Spiral Discovery Network (SDN), and its applicability to nondifferentiable nonconvex optimization problems are elucidated through simulation. Based on the simulation, the case is made that its applicability would be worth investigating in all areas where the default approach of gradient-based backpropagation is used today.

  14. Enhancing interest in statistics among computer science students using computer tool entrepreneur role play

    Science.gov (United States)

    Judi, Hairulliza Mohamad; Sahari @ Ashari, Noraidah; Eksan, Zanaton Hj

    2017-04-01

    Previous research in Malaysia indicates that there is a problem regarding attitudes towards statistics among students. Students did not show positive attitudes in the affective, cognitive, capability, value, interest and effort aspects, although they did well in the difficulty aspect. This issue should be given substantial attention because students' attitudes towards statistics may affect the teaching and learning process of the subject. Teaching statistics using role play is an appropriate attempt to improve attitudes to statistics, to enhance the learning of statistical techniques and statistical thinking, and to increase generic skills. The objectives of the paper are to give an overview of role play in statistics learning and to assess the effect of these activities on students' attitude and learning within an action research framework. The computer tool entrepreneur role play was conducted in a two-hour tutorial class session of first-year students in the Faculty of Information Sciences and Technology (FTSM), Universiti Kebangsaan Malaysia, enrolled in the Probability and Statistics course. The results show that most students felt that they had an enjoyable time in the role play. Furthermore, benefits and disadvantages of the role play activities were highlighted to complete the review. Role play is expected to serve as an important activity that takes into account students' experience, emotions and responses, providing useful information on how to modify students' thinking or behavior to improve learning.

  15. Automated cropland mapping of continental Africa using Google Earth Engine cloud computing

    Science.gov (United States)

    Xiong, Jun N.; Thenkabail, Prasad S.; Gumma, Murali Krishna; Teluguntla, Pardhasaradhi G.; Poehnelt, Justin; Congalton, Russell G.; Yadav, Kamini; Thau, David

    2017-01-01

    The automation of agricultural mapping using satellite-derived remotely sensed data remains a challenge in Africa because of the heterogeneous and fragmented landscape, complex crop cycles, and limited access to local knowledge. Currently, consistent, continent-wide routine cropland mapping of Africa does not exist, with most studies focused either on certain portions of the continent or, at most, on a one-time effort at mapping the continent with coarse-resolution remote sensing. In this research, we addressed these limitations by applying an automated cropland mapping algorithm (ACMA) that captures extensive knowledge on the croplands of Africa available through: (a) ground-based training samples, (b) very high (sub-meter to five-meter) resolution imagery (VHRI), and (c) local knowledge captured during field visits and/or sourced from country reports and literature. The study used 16-day time-series of Moderate Resolution Imaging Spectroradiometer (MODIS) normalized difference vegetation index (NDVI) composited data at 250-m resolution for the entire African continent. Based on these data, the study first produced accurate reference cropland layers or RCLs (cropland extent/areas, irrigation versus rainfed, cropping intensities, crop dominance, and croplands versus cropland fallows) for the year 2014 that provided an overall accuracy of around 90% for crop extent in different agro-ecological zones (AEZs). The RCLs for the year 2014 (RCL2014) were then used in the development of the ACMA algorithm to create ACMA-derived cropland layers for 2014 (ACL2014). ACL2014, when compared pixel-by-pixel with the RCL2014, had an overall similarity greater than 95%. Based on the ACL2014, the African continent had 296 Mha of net cropland areas (260 Mha cultivated plus 36 Mha fallows) and 330 Mha of gross cropland areas. Of the 260 Mha of net cropland areas cultivated during 2014, 90.6% (236 Mha) was rainfed and just 9.4% (24 Mha) was irrigated. Africa has about 15% of the
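
    As an illustration of the data side of such a workflow, the sketch below assembles a 16-day MODIS NDVI stack for 2014 with the Google Earth Engine Python API. The collection identifier and bounding box are assumptions made for illustration; this is not the authors' script.

```python
# Hedged sketch: build a one-year, 16-day MODIS NDVI stack as classifier input.
# The collection id and region below are assumptions, not taken from the paper.
import ee

ee.Initialize()

africa = ee.Geometry.Rectangle([-20, -35, 52, 38])       # rough continental bounding box
ndvi_2014 = (ee.ImageCollection('MODIS/006/MOD13Q1')     # 250 m, 16-day NDVI composites
             .filterDate('2014-01-01', '2015-01-01')
             .select('NDVI'))

# Flatten the collection into one multi-band image (one band per 16-day period),
# a common representation for per-pixel time-series classification.
stack = ndvi_2014.toBands().clip(africa)
print(stack.bandNames().getInfo())
```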

  16. Automated parasite faecal egg counting using fluorescence labelling, smartphone image capture and computational image analysis.

    Science.gov (United States)

    Slusarewicz, Paul; Pagano, Stefanie; Mills, Christopher; Popa, Gabriel; Chow, K Martin; Mendenhall, Michael; Rodgers, David W; Nielsen, Martin K

    2016-07-01

    Intestinal parasites are a concern in veterinary medicine worldwide and for human health in the developing world. Infections are identified by microscopic visualisation of parasite eggs in faeces, which is time-consuming, requires technical expertise and is impractical for use on-site. For these reasons, recommendations for parasite surveillance are not widely adopted and parasite control is based on administration of rote prophylactic treatments with anthelmintic drugs. This approach is known to promote anthelmintic resistance, so there is a pronounced need for a convenient egg counting assay to promote good clinical practice. Using a fluorescent chitin-binding protein, we show that this structural carbohydrate is present and accessible in shells of ova of strongyle, ascarid, trichurid and coccidian parasites. Furthermore, we show that a cellular smartphone can be used as an inexpensive device to image fluorescent eggs and, by harnessing the computational power of the phone, to perform image analysis to count the eggs. Strongyle egg counts generated by the smartphone system had a significant linear correlation with manual McMaster counts (R² = 0.98), but with a significantly lower coefficient of variation (P = 0.0177). Furthermore, the system was capable of differentiating equine strongyle and ascarid eggs similar to the McMaster method, but with significantly lower coefficients of variation (P < 0.0001). This demonstrates the feasibility of a simple, automated on-site test to detect and/or enumerate parasite eggs in mammalian faeces without the need for a laboratory microscope, and highlights the potential of smartphones as relatively sophisticated, inexpensive and portable medical diagnostic devices. Copyright © 2016 Australian Society for Parasitology. Published by Elsevier Ltd. All rights reserved.
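
    For illustration, a minimal version of fluorescent-object counting can be written as thresholding plus connected-component labelling. The sketch below uses scipy with arbitrary threshold and size parameters; it is not the assay's actual image-analysis code, and a real counter would add shape filters and a calibration to eggs per gram.

```python
# Illustrative egg counting by thresholding and connected components.
# Threshold and minimum object size are arbitrary assumptions.
import numpy as np
from scipy import ndimage

def count_eggs(image, threshold=100, min_pixels=50):
    mask = image > threshold                       # keep bright (fluorescent) pixels
    labels, n = ndimage.label(mask)                # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))        # discard tiny speckles

rng = np.random.default_rng(0)
fake_image = rng.integers(0, 90, size=(200, 200))  # synthetic background
fake_image[50:60, 50:60] = 200                     # one synthetic "egg"
print(count_eggs(fake_image))                      # -> 1
```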

  17. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    Science.gov (United States)

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  18. Computer modelling as a tool for understanding language evolution

    NARCIS (Netherlands)

    de Boer, Bart; Gontier, N; VanBendegem, JP; Aerts, D

    2006-01-01

    This paper describes the uses of computer models in studying the evolution of language. Language is a complex dynamic system that can be studied at the level of the individual and at the level of the population. Much of the dynamics of language evolution and language change occur because of the

  19. The ZAP Project: Designing Interactive Computer Tools for Learning Psychology

    Science.gov (United States)

    Hulshof, Casper; Eysink, Tessa; de Jong, Ton

    2006-01-01

    In the ZAP project, a set of interactive computer programs called "ZAPs" was developed. The programs were designed in such a way that first-year students experience psychological phenomena in a vivid and self-explanatory way. Students can either take the role of participant in a psychological experiment, they can experience phenomena themselves,…

  20. Coordinated computer-supported collaborative learning: Awareness and awareness tools

    NARCIS (Netherlands)

    Janssen, J.J.H.M.; Bodermer, D.

    2013-01-01

    Traditionally, research on awareness during online collaboration focused on topics such as the effects of spatial information about group members’ activities on the collaborative process. When the concept of awareness was introduced to computer-supported collaborative learning, this focus shifted to

  1. Computers, Laptops and Tools. ACER Research Monograph No. 56.

    Science.gov (United States)

    Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian

    In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…

  2. Computer Generated Optical Illusions: A Teaching and Research Tool.

    Science.gov (United States)

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  3. MRIVIEW: An interactive computational tool for investigation of brain structure and function

    International Nuclear Information System (INIS)

    Ranken, D.; George, J.

    1993-01-01

    MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities

  4. Knowledge-based systems and interactive graphics for reactor control using the Automated Reasoning Tool(ART) system

    International Nuclear Information System (INIS)

    Ragheb, M.; Clayton, B.; Davies, P.

    1987-01-01

    The use of knowledge-based systems and advanced graphics concepts is described using the Automated Reasoning Tool (ART) for a model nuclear plant system. Through the use of asynchronous graphic input/output, the user is allowed to communicate through a graphical display with a Production-Rule Analysis System modelling the plant while its rules are actively being fired. The user changes the status of system components by pointing at them on the system configuration display with a mouse cursor and clicking one of the buttons on the mouse. The Production-Rule Analysis System accepts the new input and immediately displays its diagnosis of the system state and any associated recommendations as to the appropriate course of action. This approach offers a distinct advantage over typing the component statuses in response to queries from a conventional Production-Rule Analysis System. Moreover, two effective ways of communication between man and machine are combined

  5. A malaria diagnostic tool based on computer vision screening and visualization of Plasmodium falciparum candidate areas in digitized blood smears.

    Directory of Open Access Journals (Sweden)

    Nina Linder

    Full Text Available INTRODUCTION: Microscopy is the gold standard for diagnosis of malaria; however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. METHODS: Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. RESULTS: The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. The method achieved a diagnostic sensitivity and specificity of 95% and 100% as well as 90% and 100% for the two readers respectively using the diagnostic tool. Parasitemia was separately calculated by the automated system and the correlation coefficient between manual and automated parasitemia counts was 0.97. CONCLUSION: We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for
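
    The feature/classifier pairing described (local binary patterns fed to a support vector machine) can be sketched as follows. The patches and labels are random stand-ins, so this only shows the shape of the approach, not the published system or its training data.

```python
# Hedged sketch of LBP features + SVM classification on stand-in image patches.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(patch, P=8, R=1.0):
    """Uniform LBP histogram of a grayscale patch (P+2 bins for 'uniform')."""
    lbp = local_binary_pattern(patch, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(1)
patches = rng.random((100, 32, 32))               # stand-in candidate regions
labels = rng.integers(0, 2, size=100)             # 1 = parasite, 0 = artefact

X = np.array([lbp_histogram(p) for p in patches])
clf = SVC(kernel="rbf", probability=True).fit(X, labels)
print(clf.predict_proba(X[:3]))                   # per-candidate probabilities
```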

  6. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    Science.gov (United States)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms with support to migrate them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level process chain, and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  7. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  8. SequenceL: Automated Parallel Algorithms Derived from CSP-NT Computational Laws

    Science.gov (United States)

    Cooke, Daniel; Rushton, Nelson

    2013-01-01

    With the introduction of new parallel architectures like the Cell and multicore chips from IBM, Intel, AMD, and ARM, as well as the petascale processing available for high-end computing, a larger number of programmers will need to write parallel codes. Adding the parallel control structure to the sequence, selection, and iterative control constructs increases the complexity of code development, which often results in increased development costs and decreased reliability. SequenceL is a high-level programming language, that is, a programming language that is closer to a human's way of thinking than to a machine's. Historically, high-level languages have resulted in decreased development costs and increased reliability, at the expense of performance. In recent applications at JSC and in industry, SequenceL has demonstrated the usual advantages of high-level programming in terms of low cost and high reliability. SequenceL programs, however, have run at speeds typically comparable with, and in many cases faster than, their counterparts written in C and C++ when run on single-core processors. Moreover, SequenceL is able to generate parallel executables automatically for multicore hardware, gaining parallel speedups without any extra effort from the programmer beyond what is required to write the sequential/single-core code. A SequenceL-to-C++ translator has been developed that automatically renders readable multithreaded C++ from a combination of a SequenceL program and sample data input. The SequenceL language is based on two fundamental computational laws, Consume-Simplify-Produce (CSP) and Normalize-Transpose (NT), which enable it to automate the creation of parallel algorithms from high-level code that has no annotations of parallelism whatsoever. In our anecdotal experience, SequenceL development has been in every case less costly than development of the same algorithm in sequential (that is, single-core, single process) C or C++, and an order of magnitude less

  9. Improving Climate Communication through Comprehensive Linguistic Analyses Using Computational Tools

    Science.gov (United States)

    Gann, T. M.; Matlock, T.

    2014-12-01

    An important lesson from climate communication research is that there is no single way to reach out and inform the public. Different groups conceptualize climate issues in different ways, and different groups have different values and assumptions. This variability makes it extremely difficult to effectively and objectively communicate climate information. One of the main challenges is the following: How do we acquire a better understanding of how values and assumptions vary across groups, including political groups? A necessary starting point is to pay close attention to the linguistic content of messages used across current popular media sources. Careful analyses of that information—including how it is realized in language for conservative and progressive media—may ultimately help climate scientists, government agency officials, journalists and others develop more effective messages. Past research has looked at partisan media coverage of climate change, but little attention has been given to the fine-grained linguistic content of such media. And when researchers have done detailed linguistic analyses, they have relied primarily on hand-coding, an approach that is costly, labor intensive, and time-consuming. Our project, building on recent work on partisan news media (Gann & Matlock, 2014; under review), uses high-dimensional semantic analyses and other automated classification techniques from natural language processing to quantify how climate issues are characterized in media sources that differ in political orientation. In addition to discussing varied linguistic patterns, we share new methods for improving climate communication for varied stakeholders, and for developing better assessments of their effectiveness.
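
    A minimal sketch of this kind of automated media-text classification, assuming TF-IDF features and a linear classifier rather than the authors' specific models, might look like this (the toy sentences and labels are invented):

```python
# Hedged sketch: classify short media snippets by political orientation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "climate regulations threaten jobs and energy prices",
    "carbon emissions drive record heat and extreme weather",
    "the climate agenda expands government control",
    "scientists warn warming oceans intensify storms",
]
orientation = ["conservative", "progressive", "conservative", "progressive"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(docs, orientation)
print(model.predict(["new climate rules raise energy costs"]))
```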

  10. INTEGRATION OF INFORMATIONAL COMPUTER TECHNOLOGIES SMK: AUTOMATION OF THE MAIN FUNCTIONS OF THE TECHNICAL CONTROL DEPARTMENT

    Directory of Open Access Journals (Sweden)

    S. A. Pavlenko

    2010-01-01

    Full Text Available It is shown that automating some functions of the technical control department makes it possible to record defects, reclamations and technology failures, and to produce the necessary reporting forms and quality certificates for production.

  11. PCE: web tools to compute protein continuum electrostatics

    Science.gov (United States)

    Miteva, Maria A.; Tufféry, Pierre; Villoutreix, Bruno O.

    2005-01-01

    PCE (protein continuum electrostatics) is an online service for protein electrostatic computations presently based on the MEAD (macroscopic electrostatics with atomic detail) package initially developed by D. Bashford [(2004) Front Biosci., 9, 1082–1099]. This computer method uses a macroscopic electrostatic model for the calculation of protein electrostatic properties, such as pKa values of titratable groups and electrostatic potentials. The MEAD package generates electrostatic energies via finite difference solution to the Poisson–Boltzmann equation. Users submit a PDB file and PCE returns potentials and pKa values as well as color (static or animated) figures displaying electrostatic potentials mapped on the molecular surface. This service is intended to facilitate electrostatics analyses of proteins and thereby broaden the accessibility to continuum electrostatics to the biological community. PCE can be accessed at . PMID:15980492
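
    For context, finite-difference solvers such as MEAD discretize the Poisson–Boltzmann equation; in its linearized form (a standard textbook statement given here for background, not quoted from the PCE paper) it reads

```latex
\nabla \cdot \left[\varepsilon(\mathbf{r})\,\nabla\varphi(\mathbf{r})\right]
  - \bar{\kappa}^{2}(\mathbf{r})\,\varphi(\mathbf{r})
  = -4\pi\rho_{f}(\mathbf{r}),
\qquad
\bar{\kappa}^{2}(\mathbf{r}) = \varepsilon(\mathbf{r})\,\kappa^{2}(\mathbf{r}),
```

    where φ is the electrostatic potential, ε the position-dependent dielectric constant, κ the inverse Debye screening length in solvent-accessible regions, and ρ_f the fixed (protein) charge density in Gaussian units; pKa values are then obtained from electrostatic energies computed with this potential.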

  12. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  13. Use of Computer vision for Automation of a Roadheader in Selective Cutting Operation

    OpenAIRE

    Fuentes-Cantillana , J.L.; Catalina , J.C.; Rodriguez , A.; Orteu , Jean-José; Dumahu , Didier

    1991-01-01

    International audience; State of the art of automation in roadheaders: most of the experimental work on roadheader automation has been centered on operations that imply cutting a complete section with a constant profile, or one showing only slight changes, and with an arrangement of the cutting sequence subject basically only to the restrictions arising from geometrical or geotechnical conditions. Nowadays, the market offers systems able to automatically control the cutting of a fixe...

  14. Evaluation of right ventricular function by coronary computed tomography angiography using a novel automated 3D right ventricle volume segmentation approach: a validation study.

    Science.gov (United States)

    Burghard, Philipp; Plank, Fabian; Beyer, Christoph; Müller, Silvana; Dörler, Jakob; Zaruba, Marc-Michael; Pölzl, Leo; Pölzl, Gerhard; Klauser, Andrea; Rauch, Stefan; Barbieri, Fabian; Langer, Christian-Ekkehardt; Schgoer, Wilfried; Williamson, Eric E; Feuchtner, Gudrun

    2018-06-04

    To evaluate right ventricle (RV) function by coronary computed tomography angiography (CTA) using a novel automated three-dimensional (3D) RV volume segmentation tool in comparison with clinical reference modalities. Twenty-six patients with severe end-stage heart failure [left ventricle (LV) ejection fraction (EF) right heart invasive catheterisation (IC). Automated 3D RV volume segmentation was successful in 26 (100%) patients. Read-out time was 3 min 33 s (range, 1 min 50s-4 min 33s). RV EF by CTA was more strongly correlated with right atrial pressure (RAP) by IC (r = -0.595; p = 0.006) but more weakly with TAPSE (r = 0.366, p = 0.94). When comparing TAPSE with RAP by IC (r = -0.317, p = 0.231), a weak-to-moderate non-significant inverse correlation was found. Interobserver correlation was high with r = 0.96 (p right atrium (RA) and right ventricle (RV) was 196.9 ± 75.3 and 217.5 ± 76.1 HU, respectively. Measurement of RV function by CTA using a novel 3D volumetric segmentation tool is fast and reliable by applying a dedicated biphasic injection protocol. The RV EF from CTA is a closer surrogate of RAP than TAPSE by TTE. • Evaluation of RV function by cardiac CTA by using a novel 3D volume segmentation tool is fast and reliable. • A biphasic contrast agent injection protocol ensures homogeneous RV contrast attenuation. • Cardiac CT is a valuable alternative modality to CMR for the evaluation of RV function.

  15. Present status of computational tools for maglev development

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  16. A Tool for the Automated Design and Evaluation of Habitat Interior Layouts

    Science.gov (United States)

    Simon, Matthew A.; Wilhite, Alan W.

    2013-01-01

    The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.

  17. From Blunt to Pointy Tools: Transcending Task Automation to Effective Instructional Practice with CaseMate

    Science.gov (United States)

    Swan, Gerry

    2009-01-01

    While blogs, wikis and many other Web 2.0 applications can be employed in learning settings, instruction is not the primary purpose of these tools. The educational field must actively participate in the definition and development of what repurposed or new Web 2.0 applications mean in educational settings. One way of viewing this needed…

  18. TESTAR : Tool Support for Test Automation at the User Interface Level

    NARCIS (Netherlands)

    Vos, Tanja E.J.; Kruse, Peter M.; Condori Fernandez, Nelly; Bauersfeld, Sebastian; Wegener, Joachim

    2015-01-01

    Testing applications with a graphical user interface (GUI) is an important, though challenging and time-consuming task. The state of the art in industry is still capture-and-replay tools, which may simplify the recording and execution of input sequences, but do not support the tester in finding

  19. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request particularly requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.

  20. A basic tool for computer-aided sail design

    International Nuclear Information System (INIS)

    Thrasher, D.F.; Dunyak, T.J.; Mook, D.T.; Nayfeh, A.H.

    1985-01-01

    Recent developments in modelling lifting surfaces have provided a tool that can also be used to model sails. The simplest of the adequate models is the vortex-lattice method. This method can fully account for the aerodynamic interactions among several lifting surfaces having arbitrary planforms, camber, and twist as long as separation occurs only along the edges and the phenomenon known as vortex bursting does not occur near the sails. This paper describes this method and how it can be applied to the design of sails
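
    As background on the method (standard vortex-lattice theory rather than material quoted from this paper), each lattice element is a vortex filament whose induced velocity at a control point follows the Biot–Savart law, and the unknown circulations are fixed by requiring no flow through the surface:

```latex
\mathbf{V}(\mathbf{r}) \;=\; \frac{\Gamma}{4\pi} \int \frac{d\boldsymbol{\ell}' \times (\mathbf{r} - \mathbf{r}')}{\left|\mathbf{r} - \mathbf{r}'\right|^{3}},
\qquad
\sum_{j} A_{ij}\,\Gamma_{j} \;=\; -\,\mathbf{V}_{\infty} \cdot \mathbf{n}_{i},
```

    where Γ_j are the panel circulations, A_ij the normal-velocity influence coefficients, V_∞ the onset flow, and n_i the unit normal at control point i; sail camber and twist enter through the panel geometry and the normals n_i.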

  1. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    2000-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  2. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1999-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  3. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1998-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  4. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a computational tool with unique predictive capabilities for the aerothermodynamic environment around ablation-cooled...

  5. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a predictive computational tool for the aerothermal environment around ablation-cooled hypersonic atmospheric entry...

  6. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    International Nuclear Information System (INIS)

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats), was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  7. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    Energy Technology Data Exchange (ETDEWEB)

    Laganà, Alessandro [Department of Molecular Virology, Immunology and Medical Genetics, Comprehensive Cancer Center, The Ohio State University, Columbus, OH (United States); Shasha, Dennis [Courant Institute of Mathematical Sciences, New York University, New York, NY (United States); Croce, Carlo Maria [Department of Molecular Virology, Immunology and Medical Genetics, Comprehensive Cancer Center, The Ohio State University, Columbus, OH (United States)

    2014-12-11

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats), was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  8. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
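
    A minimal sketch of the kind of discrete-time Markov analysis described, with tool components as states and an absorbing failure state, is given below; the transition probabilities are invented placeholders, not values from the Forensic Toolkit Imager case study.

```python
# Illustrative discrete-time Markov chain reliability estimate.
# States: 0 = acquire image, 1 = analyse, 2 = report, 3 = failed (absorbing).
# All transition probabilities are invented for illustration.
import numpy as np

P = np.array([
    [0.00, 0.97, 0.00, 0.03],
    [0.00, 0.00, 0.98, 0.02],
    [0.90, 0.00, 0.05, 0.05],   # reporting loops or starts a new acquisition
    [0.00, 0.00, 0.00, 1.00],
])

def reliability(P, start=0, failed=3, steps=100):
    """Probability of not having been absorbed into the failed state after `steps` transitions."""
    state = np.zeros(P.shape[0])
    state[start] = 1.0
    dist = state @ np.linalg.matrix_power(P, steps)
    return 1.0 - dist[failed]

print(f"P(no failure within 100 transitions) = {reliability(P):.3f}")
```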

  9. The Use of Computers as a Design Tool.

    Science.gov (United States)

    1980-01-01

    sponsor of numerical computing engines for defense needs; there is no such driving sponsorship today. It is concluded that as a result of these changes...

  10. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed
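
    The coupled-analysis optimization loop can be sketched generically as a constrained minimization in which black-box structural and thermal responses feed the constraints. In the illustration below those responses are replaced by cheap stand-in functions, and all variable names and limits are assumptions rather than the paper's cask model.

```python
# Hedged sketch: constrained design optimization with stand-in analysis functions.
import numpy as np
from scipy.optimize import minimize

def package_mass(x):            # x = [wall_thickness, overpack_thickness] in metres
    return 8000 * x[0] + 3000 * x[1]

def peak_stress(x):             # stand-in for a structural (impact) analysis
    return 400e6 * np.exp(-25 * x[0])

def peak_seal_temp(x):          # stand-in for a thermal (fire) analysis
    return 900 * np.exp(-12 * x[1]) + 100

cons = [
    {"type": "ineq", "fun": lambda x: 350e6 - peak_stress(x)},   # stress limit (Pa)
    {"type": "ineq", "fun": lambda x: 180 - peak_seal_temp(x)},  # seal temperature limit (C)
]
res = minimize(package_mass, x0=[0.05, 0.10], bounds=[(0.01, 0.3)] * 2,
               constraints=cons, method="SLSQP")
print(res.x, res.fun)
```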

  11. Automating testbed documentation and database access using World Wide Web (WWW) tools

    Science.gov (United States)

    Ames, Charles; Auernheimer, Brent; Lee, Young H.

    1994-01-01

    A method for providing uniform transparent access to disparate distributed information systems was demonstrated. A prototype testing interface was developed to access documentation and information using publicly available hypermedia tools. The prototype gives testers a uniform, platform-independent user interface to on-line documentation, user manuals, and mission-specific test and operations data. Mosaic was the common user interface, and HTML (Hypertext Markup Language) provided hypertext capability.

  12. Automated riverine landscape characterization: GIS-based tools for watershed-scale research, assessment, and management.

    Science.gov (United States)

    Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C

    2013-09-01

    River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.

  13. Configuration monitoring tool for large-scale distributed computing

    International Nuclear Information System (INIS)

    Wu, Y.; Graham, G.; Lu, X.; Afaq, A.; Kim, B.J.; Fisk, I.

    2004-01-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN will likely use a grid system to achieve much of its offline processing need. Given the heterogeneous and dynamic nature of grid systems, it is desirable to have in place a configuration monitor. The configuration monitoring tool is built using the Globus toolkit and web services. It consists of an information provider for the Globus MDS, a relational database for keeping track of the current and old configurations, and client interfaces to query and administer the configuration system. The Grid Security Infrastructure (GSI), together with EDG Java Security packages, are used for secure authentication and transparent access to the configuration information across the CMS grid. This work has been prototyped and tested using US-CMS grid resources

  14. Soft computing simulation tools for nuclear energy systems

    International Nuclear Information System (INIS)

    Kannan Balasubramanian, S.

    2012-01-01

    This chapter deals with simulation, a very powerful tool in designing, constructing and operating nuclear power generating facilities. There are very different types of power plants, and the examples mentioned in this chapter originate from experience with water-cooled and water-moderated thermal reactors based on fission of uranium-235. Nevertheless, the methodological achievements in simulation mentioned below are not limited to this particular type of nuclear power generating reactor. Simulation means investigation of processes in the time domain. We can calculate the characteristics and properties of different systems; for example, we can design a bridge over a river, but only by calculating how it responds to a thunderstorm with high winds can we determine whether its motion evolves after a certain time into destructive oscillation - this type of calculation is called simulation

  15. Platformation: Cloud Computing Tools at the Service of Social Change

    Directory of Open Access Journals (Sweden)

    Anil Patel

    2012-07-01

    Full Text Available The following article establishes some context and definitions for what is termed the “sharing imperative” – a movement or tendency towards sharing information online and in real time that has rapidly transformed several industries. As internet-enabled devices proliferate to all corners of the globe, ways of working and accessing information have changed. Users now expect to be able to access the products, services, and information that they want from anywhere, at any time, on any device. This article addresses how the nonprofit sector might respond to those demands by embracing the sharing imperative. It suggests that how well an organization shares has become one of the most pressing governance questions a nonprofit organization must tackle. Finally, the article introduces Platformation, a project whereby tools that enable better inter and intra-organizational sharing are tested for scalability, affordability, interoperability, and security, all with a non-profit lens.

  16. Configuration monitoring tool for large-scale distributed computing

    CERN Document Server

    Wu, Y; Fisk, I; Graham, G; Kim, B J; Lü, X

    2004-01-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN will likely use a grid system to achieve much of its offline processing need. Given the heterogeneous and dynamic nature of grid systems, it is desirable to have in place a configuration monitor. The configuration monitoring tool is built using the Globus toolkit and web services. It consists of an information provider for the Globus MDS, a relational database for keeping track of the current and old configurations, and client interfaces to query and administer the configuration system. The Grid Security Infrastructure (GSI), together with EDG Java Security packages, are used for secure authentication and transparent access to the configuration information across the CMS grid. This work has been prototyped and tested using US-CMS grid resources.

  17. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient

  18. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    Science.gov (United States)

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for the successful introduction of advanced tools into the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  19. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    Science.gov (United States)

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  20. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  1. Automated structure and flow measurement - a promising tool in nailfold capillaroscopy.

    Science.gov (United States)

    Berks, Michael; Dinsdale, Graham; Murray, Andrea; Moore, Tonia; Manning, Joanne; Taylor, Chris; Herrick, Ariane L

    2018-07-01

    Despite increasing interest in nailfold capillaroscopy, objective measures of capillary structure and blood flow have been little studied. We aimed to test the hypothesis that structural measurements, capillary flow, and a combined measure have the predictive power to separate patients with systemic sclerosis (SSc) from those with primary Raynaud's phenomenon (PRP) and healthy controls (HC). 50 patients with SSc, 12 with PRP, and 50 HC were imaged using a novel capillaroscopy system that generates high-quality nailfold images and provides fully-automated measurements of capillary structure and blood flow (capillary density, mean width, maximum width, shape score, derangement and mean flow velocity). Population statistics summarise the differences between the three groups. Areas under ROC curves (A_Z) were used to measure classification accuracy when assigning individuals to SSc and HC/PRP groups. Statistically significant differences in group means were found between patients with SSc and both HC and patients with PRP, for all measurements, e.g. mean width (μm) ± SE: 15.0 ± 0.71, 12.7 ± 0.74 and 11.8 ± 0.23 for SSc, PRP and HC respectively. Combining the five structural measurements gave better classification (A_Z = 0.919 ± 0.026) than the best single measurement (mean width, A_Z = 0.874 ± 0.043), whilst adding flow further improved classification (A_Z = 0.930 ± 0.024). Structural and blood flow measurements are both able to distinguish patients with SSc from those with PRP/HC. Importantly, these hold promise as clinical trial outcome measures for treatments aimed at improving finger blood flow or microvascular remodelling. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
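
    Combining several measurements into one discriminating score and reporting the area under the ROC curve can be sketched as follows; the data are simulated, so this mirrors only the analysis pattern, not the nailfold measurements or the study's classifier.

```python
# Hedged sketch: combine several measures into one score and compare ROC AUCs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 100
y = rng.integers(0, 2, size=n)                     # 1 = SSc, 0 = PRP/HC (simulated)
# five structural measures + flow, shifted slightly for the positive group
X = rng.normal(size=(n, 6)) + 0.8 * y[:, None]

clf = LogisticRegression().fit(X, y)
combined_score = clf.decision_function(X)
print("single-measure AUC:", round(float(roc_auc_score(y, X[:, 0])), 3))
print("combined AUC:      ", round(float(roc_auc_score(y, combined_score)), 3))
```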

  2. Automated image-based colon cleansing for laxative-free CT colonography computer-aided polyp detection

    International Nuclear Information System (INIS)

    Linguraru, Marius George; Panjwani, Neil; Fletcher, Joel G.; Summer, Ronald M.

    2011-01-01

    Purpose: To evaluate the performance of a computer-aided detection (CAD) system for detecting colonic polyps at noncathartic computed tomography colonography (CTC) in conjunction with an automated image-based colon cleansing algorithm. Methods: An automated colon cleansing algorithm was designed to detect and subtract tagged-stool, accounting for heterogeneity and poor tagging, to be used in conjunction with a colon CAD system. The method is locally adaptive and combines intensity, shape, and texture analysis with probabilistic optimization. CTC data from cathartic-free bowel preparation were acquired for testing and training the parameters. Patients underwent various colonic preparations with barium or Gastroview in divided doses over 48 h before scanning. No laxatives were administered and no dietary modifications were required. Cases were selected from a polyp-enriched cohort and included scans in which at least 90% of the solid stool was visually estimated to be tagged and each colonic segment was distended in either the prone or supine view. The CAD system was run comparatively with and without the stool subtraction algorithm. Results: The dataset comprised 38 CTC scans from prone and/or supine scans of 19 patients containing 44 polyps larger than 10 mm (22 unique polyps, if matched between prone and supine scans). The results are robust on fine details around folds, thin-stool linings on the colonic wall, near polyps and in large fluid/stool pools. The sensitivity of the CAD system is 70.5% per polyp at a rate of 5.75 false positives/scan without using the stool subtraction module. This detection improved significantly (p = 0.009) after automated colon cleansing on cathartic-free data to 86.4% true positive rate at 5.75 false positives/scan. Conclusions: An automated image-based colon cleansing algorithm designed to overcome the challenges of the noncathartic colon significantly improves the sensitivity of colon CAD by approximately 15%.

  3. Modeling biology with HDL languages: a first step toward a genetic design automation tool inspired from microelectronics.

    Science.gov (United States)

    Gendrault, Yves; Madec, Morgan; Lallement, Christophe; Haiech, Jacques

    2014-04-01

    Nowadays, synthetic biology is a hot research topic. Each day, progress is made in increasing the complexity of artificial biological functions in order to move toward complex biodevices and biosystems. Up to now, these systems have been handmade by bioengineers, which requires strong technical skills and leads to non-reusable development. In contrast, scientific fields that share the same design approach, such as microelectronics, have already overcome several of these issues, and designers succeed in building extremely complex systems with many evolved functions. In systems engineering, and more specifically in microelectronics, the development of the domain has been promoted by both the improvement of technological processes and electronic design automation tools. The work presented in this paper paves the way for the adaptation of microelectronics design tools to synthetic biology. Considering the similarities and differences between synthetic biology and microelectronics, the milestones of this adaptation are described. The first one concerns the modeling of biological mechanisms. To this end, a new formalism is proposed, based on an extension of the generalized Kirchhoff laws to biology. In this way, a description of all biological mechanisms can be made with languages widely used in microelectronics. The approach is then successfully validated on specific examples drawn from the literature.

  4. Computer assisted audit tools and techniques in real world: CAATT's applications and approaches in context

    OpenAIRE

    Pedrosa, I.; Costa, C. J.

    2012-01-01

    Nowadays, Computer Aided Audit Tools (and Techniques) support almost all audit processes concerning data extraction and analysis. These tools were initially aimed at supporting financial auditing processes; however, their scope goes beyond this, and we therefore present case studies and good practices in an academic context. Although in large auditing companies audit tools for data extraction and analysis are very common and applied in several contexts, we realized that it is not easy to find practical...

  5. Automated computer-based CT stratification as a predictor of outcome in hypersensitivity pneumonitis

    International Nuclear Information System (INIS)

    Jacob, Joseph; Mak, S.M.; Mok, W.; Hansell, D.M.; Bartholmai, B.J.; Rajagopalan, S.; Karwoski, R.; Della Casa, G.; Sugino, K.; Walsh, S.L.F.; Wells, A.U.

    2017-01-01

    Hypersensitivity pneumonitis (HP) has a variable clinical course. Modelling of quantitative CALIPER-derived CT data can identify distinct disease phenotypes. Mortality prediction using CALIPER analysis was compared with the interstitial lung disease gender, age, physiology (ILD-GAP) outcome model. CALIPER CT analysis of parenchymal patterns in 98 consecutive HP patients was compared with visual CT scoring by two radiologists and with functional indices, including forced vital capacity (FVC) and diffusion capacity for carbon monoxide (DLco), in univariate and multivariate Cox mortality models. Automated stratification of CALIPER scores was evaluated against outcome models. Univariate predictors of mortality included visual and CALIPER CT fibrotic patterns and all functional indices. Multivariate analyses identified only two independent predictors of mortality: CALIPER reticular pattern (p = 0.001) and DLco (p < 0.0001). Automated stratification distinguished three distinct HP groups (log-rank test p < 0.0001). Substitution of the automated stratified groups for FVC and DLco in the ILD-GAP model demonstrated no loss of model strength (C-Index = 0.73 for both models). Model strength improved when the automated stratified groups were combined with the ILD-GAP model (C-Index = 0.77). CALIPER-derived variables are the strongest CT predictors of mortality in HP. Automated CT stratification is equivalent to functional indices in the ILD-GAP model for predicting outcome in HP. (orig.)
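
    For readers unfamiliar with the statistics used here, the sketch below shows how a multivariate Cox model and Harrell's C-index might be computed with the lifelines Python library. The data file and column names (e.g., caliper_reticular, dlco) are hypothetical placeholders, not the study's dataset or code.

    ```python
    # Hedged sketch: fitting a multivariate Cox model and reporting Harrell's
    # C-index, as one might do when comparing CALIPER-derived predictors with
    # functional indices. The CSV file and column names are hypothetical.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("hp_cohort.csv")   # hypothetical: one row per patient
    # expected columns: survival_months, died (0/1), caliper_reticular, dlco

    cph = CoxPHFitter()
    cph.fit(df[["survival_months", "died", "caliper_reticular", "dlco"]],
            duration_col="survival_months", event_col="died")
    cph.print_summary()                         # hazard ratios and p-values
    print("C-index:", cph.concordance_index_)   # discrimination of the fitted model
    ```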

  6. Automated computer-based CT stratification as a predictor of outcome in hypersensitivity pneumonitis

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Joseph; Mak, S.M.; Mok, W.; Hansell, D.M. [Royal Brompton and Harefield NHS Foundation Trust, Department of Radiology, Royal Brompton Hospital, London (United Kingdom); Bartholmai, B.J. [Mayo Clinic Rochester, Division of Radiology, Rochester, MN (United States); Rajagopalan, S.; Karwoski, R. [Mayo Clinic Rochester, Biomedical Imaging Resource, Rochester, MN (United States); Della Casa, G. [Universita degli Studi di Modena e Reggio Emilia, Modena, Emilia-Romagna (Italy); Sugino, K. [Toho University Omori Medical Centre, Tokyo (Japan); Walsh, S.L.F. [Kings College Hospital, London (United Kingdom); Wells, A.U. [Royal Brompton and Harefield NHS Foundation Trust, Interstitial Lung Disease Unit, Royal Brompton Hospital, London (United Kingdom)

    2017-09-15

    Hypersensitivity pneumonitis (HP) has a variable clinical course. Modelling of quantitative CALIPER-derived CT data can identify distinct disease phenotypes. Mortality prediction using CALIPER analysis was compared with the interstitial lung disease gender, age, physiology (ILD-GAP) outcome model. CALIPER CT analysis of parenchymal patterns in 98 consecutive HP patients was compared with visual CT scoring by two radiologists and with functional indices, including forced vital capacity (FVC) and diffusion capacity for carbon monoxide (DLco), in univariate and multivariate Cox mortality models. Automated stratification of CALIPER scores was evaluated against outcome models. Univariate predictors of mortality included visual and CALIPER CT fibrotic patterns and all functional indices. Multivariate analyses identified only two independent predictors of mortality: CALIPER reticular pattern (p = 0.001) and DLco (p < 0.0001). Automated stratification distinguished three distinct HP groups (log-rank test p < 0.0001). Substitution of the automated stratified groups for FVC and DLco in the ILD-GAP model demonstrated no loss of model strength (C-Index = 0.73 for both models). Model strength improved when the automated stratified groups were combined with the ILD-GAP model (C-Index = 0.77). CALIPER-derived variables are the strongest CT predictors of mortality in HP. Automated CT stratification is equivalent to functional indices in the ILD-GAP model for predicting outcome in HP. (orig.)

  7. Automated tools and techniques for distributed Grid Software: Development of the testbed infrastructure

    OpenAIRE

    Aguado Sanchez, C; Di Meglio, A

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaboration on its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of...

  8. Towards a Tool for Computer Supported Structuring of Products

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp

    1997-01-01

    ... However, a product possesses not only a component structure but also various organ structures which are superimposed on the component structure. The organ structures carry behaviour and make the product suited for its life phases. Our long-term research goal is to develop a computer-based system that is capable of supporting synthesis activities in engineering design, and thereby also supports the handling of various organ structures. Such a system must contain a product model in which it is possible to describe and manipulate both the various organ structures and the component structure. In this paper we focus on the relationships between organ structures and the component structure. By an analysis of an existing product it is shown that a component may contribute to more than one organ. A set of organ structures is identified and their influence on the component structure is illustrated.

  9. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    Science.gov (United States)

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that sought to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT, and by extension cloud computing, has positive impacts on daily life, and this informed the Nigerian government's policy to…

  10. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    Science.gov (United States)

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  11. The Computer as a Tool for Learning through Reflection. Technical Report No. 376.

    Science.gov (United States)

    Collins, Allan; Brown, John Seely

    Because of its ability to record and represent process, the computer can provide a powerful, motivating, and as yet untapped tool for focusing the students' attention directly on their own thought processes and learning through reflection. Properly abstracted and structured, the computational medium can capture the processes by which a novice or…

  12. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Performance growth of single-core processors came to a halt in the past decade, and further gains have come from the introduction of parallelism in processors. Multicore architectures, along with graphics processing units, have broadened the scope for parallelism. Several compilers have been updated to address the resulting synchronization and threading challenges. Appropriate classification of programs and algorithms can greatly help software engineers identify opportunities for effective parallelization. In the present work we investigate current species-based classifications of algorithms; related work on classification is discussed, along with a comparison of the issues that make classification challenging. A set of algorithms is chosen whose structures correspond to different issues and perform a given task. We tested these algorithms using existing automatic species-extraction tools together with the Bones compiler, and added functionality to the existing tool to provide a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and increment statements, user-defined types, constants, and mathematical functions. With this, we can retain significant information that is not captured by the original algorithmic species. These extensions were implemented in the tool, enabling automatic characterization of program code.
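
    As a minimal illustration of why such classification matters, the sketch below shows the simplest case: once a loop is identified as a dependence-free, map-like species, it can be dispatched directly to a process pool. This is an illustrative example only; it is not the Bones compiler or the species-extraction tooling described above, and the kernel and data are placeholders.

    ```python
    # Illustrative sketch (not the Bones tool): a loop classified as a
    # dependence-free "map" species can be dispatched to a process pool.
    from multiprocessing import Pool

    def kernel(x):
        # element-wise work with no loop-carried dependence -> map-like species
        return x * x + 1

    if __name__ == "__main__":
        data = list(range(1_000_000))
        with Pool() as pool:                          # one worker per available core
            results = pool.map(kernel, data, chunksize=10_000)
        print(results[:5])
    ```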

  13. An automated graphics tool for comparative genomics: the Coulson plot generator.

    Science.gov (United States)

    Field, Helen I; Coulson, Richard M R; Field, Mark C

    2013-04-27

    Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insight into the evolutionary history of cellular functions. However, a standard spreadsheet is a poor visual means from which to extract such patterns within a dataset. We devised the Coulson plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie describes a complex or process from a separate taxon and is divided into sectors corresponding to the number of proteins (subunits) in that complex/process. The predicted presence or absence of each protein is indicated by the occupancy of its sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable Portable Document Format (PDF) or SVG file. CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents it in a graphically rich format, making comparisons and the identification of patterns significantly clearer. While the Coulson plot format is highly useful in comparative genomics, its
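
    The sketch below illustrates the pie-matrix idea behind the Coulson plot using matplotlib: one pie per taxon/complex cell, one equal sector per subunit, filled when the subunit is predicted present. It is a toy reimplementation of the concept with invented data, not the CPG application or its input format.

    ```python
    # Minimal sketch of the pie-matrix idea behind a Coulson plot (not CPG itself):
    # one pie per (taxon, complex) cell, one equal sector per subunit, filled when
    # the subunit is predicted present. Data are invented.
    import matplotlib.pyplot as plt

    complexes = {"Complex A": 4, "Complex B": 6}           # subunit counts
    taxa = {
        "Taxon 1": {"Complex A": [1, 1, 1, 0], "Complex B": [1, 1, 0, 0, 1, 1]},
        "Taxon 2": {"Complex A": [1, 0, 0, 0], "Complex B": [1, 1, 1, 1, 1, 1]},
    }

    fig, axes = plt.subplots(len(taxa), len(complexes), figsize=(4, 4))
    for i, (taxon, presence) in enumerate(taxa.items()):
        for j, (complex_name, n_subunits) in enumerate(complexes.items()):
            ax = axes[i][j]
            colors = ["tab:blue" if p else "white" for p in presence[complex_name]]
            ax.pie([1] * n_subunits, colors=colors,
                   wedgeprops={"edgecolor": "black", "linewidth": 0.5})
            if i == 0:
                ax.set_title(complex_name, fontsize=8)
            if j == 0:
                ax.text(-1.8, 0, taxon, fontsize=8, ha="right", va="center")
    fig.savefig("coulson_sketch.svg")
    ```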

  14. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
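
    The general scheduling idea can be sketched as follows: estimate a runtime for each genome-to-genome comparison from genome sizes, then submit the longest jobs first so a fixed-size cluster stays busy. The runtime model, constants, and genome sizes below are invented for illustration; they are not the cost model developed for Roundup.

    ```python
    # Hedged sketch of the general idea: estimate a runtime per genome-to-genome
    # comparison, then order submissions longest-first (a common heuristic for
    # keeping a fixed-size cluster busy). Model and constants are invented.
    from itertools import combinations

    genome_sizes = {"genomeA": 4_600, "genomeB": 12_100, "genomeC": 3_200}  # genes

    def estimated_runtime_minutes(g1, g2):
        # assume cost roughly proportional to the number of pairwise sequence comparisons
        return 5e-6 * genome_sizes[g1] * genome_sizes[g2]

    jobs = [(g1, g2, estimated_runtime_minutes(g1, g2))
            for g1, g2 in combinations(genome_sizes, 2)]
    jobs.sort(key=lambda job: job[2], reverse=True)   # longest jobs first

    for g1, g2, minutes in jobs:
        print(f"submit {g1} vs {g2}: ~{minutes:.1f} min")
    ```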

  15. Controller Strategies for Automation Tool Use under Varying Levels of Trajectory Prediction Uncertainty

    Science.gov (United States)

    Morey, Susan; Prevot, Thomas; Mercer, Joey; Martin, Lynne; Bienert, Nancy; Cabrall, Christopher; Hunt, Sarah; Homola, Jeffrey; Kraut, Joshua

    2013-01-01

    A human-in-the-loop simulation was conducted to examine the effects of varying levels of trajectory prediction uncertainty on air traffic controller workload and performance, as well as how strategies and the use of decision support tools change in response. This paper focuses on the strategies employed by two controllers from separate teams who worked in parallel but independently under identical conditions (airspace, arrival traffic, tools) with the goal of ensuring schedule conformance and safe separation for a dense arrival flow in en route airspace. Despite differences in strategy and methods, both controllers achieved high levels of schedule conformance and safe separation. Overall, results show that trajectory uncertainties introduced by wind and aircraft performance prediction errors do not affect the controllers' ability to manage traffic. Controller strategies were fairly robust to changes in error, though strategies were affected by the amount of delay to absorb (scheduled time of arrival minus estimated time of arrival). Using the results and observations, this paper proposes an ability to dynamically customize the display of information including delay time based on observed error to better accommodate different strategies and objectives.

  16. Computers, coders, and voters: Comparing automated methods for estimating party positions

    DEFF Research Database (Denmark)

    Hjorth, F.; Klemmensen, R.; Hobolt, S.

    2015-01-01

    Assigning political actors positions in ideological space is a task of key importance to political scientists. In this paper we compare estimates obtained using the automated Wordscores and Wordfish techniques, along with estimates from voters and the Comparative Manifesto Project (CMP), against ... texts and a more ideologically charged vocabulary in order to produce estimates comparable to Wordscores. The paper contributes to the literature on automated content analysis by providing a comprehensive test of convergent validation, in terms of both number of cases analyzed and number of validation...
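
    For context, the sketch below outlines the core of the Wordscores technique (Laver, Benoit and Garry): word scores are learned from reference texts with known positions, and new texts are then scored as frequency-weighted averages of those word scores. The texts and reference positions are toy data, and the usual rescaling of virgin-text scores is omitted; this is not the code used in the paper.

    ```python
    # Compact sketch of the Wordscores idea: learn word scores from reference
    # texts with known positions, then score virgin texts as frequency-weighted
    # averages of those word scores. Toy data; rescaling step omitted.
    from collections import Counter

    reference_texts = {
        "left_manifesto":  ("tax the rich and fund public services", -1.0),
        "right_manifesto": ("cut tax and free the market", 1.0),
    }

    # Relative frequency of each word within each reference text.
    rel_freq = {}
    for name, (text, _score) in reference_texts.items():
        counts = Counter(text.split())
        total = sum(counts.values())
        rel_freq[name] = {word: n / total for word, n in counts.items()}

    # Word score: reference positions weighted by P(reference | word).
    vocabulary = {word for freqs in rel_freq.values() for word in freqs}
    word_scores = {}
    for word in vocabulary:
        denom = sum(freqs.get(word, 0.0) for freqs in rel_freq.values())
        word_scores[word] = sum(
            (rel_freq[name].get(word, 0.0) / denom) * score
            for name, (_text, score) in reference_texts.items()
        )

    def score_text(text):
        """Score a virgin text as the mean word score, weighted by word frequency."""
        counts = Counter(w for w in text.split() if w in word_scores)
        total = sum(counts.values())
        return sum(n * word_scores[w] for w, n in counts.items()) / total

    print(score_text("fund public services and cut tax"))
    ```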

  17. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing have evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany, a forum to discuss the latest advancements in parallel tools.

  18. Validating automated kidney stone volumetry in computed tomography and mathematical correlation with estimated stone volume based on diameter.

    Science.gov (United States)

    Wilhelm, Konrad; Miernik, Arkadiusz; Hein, Simon; Schlager, Daniel; Adams, Fabian; Benndorf, Matthias; Fritz, Benjamin; Langer, Mathias; Hesse, Albrecht; Schoenthaler, Martin; Neubauer, Jakob

    2018-06-02

    To validate the AutoMated UroLithiasis Evaluation Tool (AMULET) software for kidney stone volumetry and compare its performance to standard clinical practice. Maximum diameter and volume of 96 urinary stones were measured as the reference standard by three independent urologists. The same stones were positioned in an anthropomorphic phantom and CT scans were acquired with standard settings. Three independent radiologists blinded to the reference values took manual measurements of the maximum diameter and automatic measurements of maximum diameter and volume. An "expected volume" was calculated from the manual diameter measurements using the formula V = (4/3)πr³. 96 stones were analyzed in the study; we had initially aimed to assess 100, but nine were replaced during data acquisition due to crumbling and 4 had to be excluded because the automated measurement did not work. Mean reference maximum diameter was 13.3 mm (5.2-32.1 mm). Correlation coefficients among all measured outcomes were compared. The correlation of the manual and automatic diameter measurements with the reference was 0.98 and 0.91, respectively. Automated volumetry is possible and significantly more accurate than diameter-based volumetric calculations. To avoid bias in clinical trials, size should be measured as volume. However, automated diameter measurements are not as accurate as manual measurements.
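
    The diameter-based comparator is simply the volume of a sphere with the measured maximum diameter. A minimal sketch with invented stone values is shown below; it is not the AMULET software.

    ```python
    # Minimal sketch of the diameter-based "expected volume": treat the stone as a
    # sphere with the measured maximum diameter. Values are toy data.
    import math

    def expected_volume_mm3(max_diameter_mm):
        r = max_diameter_mm / 2.0
        return (4.0 / 3.0) * math.pi * r ** 3

    measured = [("stone_01", 13.3, 690.0),   # (id, max diameter mm, segmented volume mm^3)
                ("stone_02", 8.4, 180.0)]
    for stone_id, d, v_measured in measured:
        v_sphere = expected_volume_mm3(d)
        print(f"{stone_id}: expected {v_sphere:.0f} mm^3 vs measured {v_measured:.0f} mm^3 "
              f"({v_sphere / v_measured:.2f}x)")
    ```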

  19. BREED: a CDC-7600 computer program for the automation of breeder reactor design analysis (LWBR Development Program)

    International Nuclear Information System (INIS)

    Candelore, N.R.; Maher, C.M.

    1985-03-01

    BREED is an executive CDC-7600 program which was developed to facilitate the sequence of calculations and movement of data through a prescribed series of breeder reactor design computer programs in an uninterrupted single-job mode. It provides the capability to interface different application programs into a single computer run to provide a complete design function. The automation that can be achieved as a result of using BREED significantly reduces not only the time required for data preparation and hand transfer of data, but also the time required to complete an iteration of the total design effort. Data processing within a technical discipline and data transfer between technical disciplines can be accommodated. The input/output data processing is achieved with BREED by using a set of simple, easily understood user commands, usually short descriptive words, which the user inserts in his input deck. The input deck completely identifies and controls the calculational sequence needed to produce a desired end product. This report has been prepared to provide instructional material on the use of BREED and its user-oriented procedures to facilitate computer automation of design calculations
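
    The executive pattern described here, reading short keyword commands from an input deck, running the corresponding design step, and passing its output to the next step, can be sketched in a few lines. The commands and steps below are invented placeholders; this illustrates the pattern only, not BREED itself.

    ```python
    # Illustrative sketch only (BREED itself is a CDC-7600 executive program):
    # a driver reads keyword commands from an input deck, runs each design step,
    # and hands its output to the next step. Commands and steps are invented.
    def run_neutronics(data):
        data["flux"] = "computed"
        return data

    def run_thermal(data):
        data["temperatures"] = "computed from " + data.get("flux", "nothing")
        return data

    COMMANDS = {"NEUTRON": run_neutronics, "THERMAL": run_thermal}

    def run_deck(deck_lines):
        data = {}
        for line in deck_lines:
            command = line.strip().upper()
            if command in COMMANDS:
                data = COMMANDS[command](data)   # output of one step feeds the next
        return data

    print(run_deck(["NEUTRON", "THERMAL"]))
    ```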

  20. A review of computer tools for analysing the integration of renewable energy into various energy systems

    DEFF Research Database (Denmark)

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis, which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy, but instead the ‘ideal’ energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors...