WorldWideScience

Sample records for automated structure generation

  1. Automated quadrilateral mesh generation for digital image structures

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    With the development of advanced imaging technology, digital images are widely used. This paper proposes an automatic quadrilateral mesh generation algorithm for multi-colour image structures. It takes an arbitrary digital image as input for automatic quadrilateral mesh generation; this includes removing noise, extracting and smoothing the boundary geometries between different colours, and generating an all-quad mesh with those boundaries as constraints. An application example is...
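
    As a much-simplified sketch of the starting point of such pipelines (illustrative only, not the paper's algorithm; the `pixels_to_quads` function and sample mask are invented), the following Python turns a segmented binary image directly into a conforming quadrilateral mesh, one quad per foreground pixel:

```python
# Minimal illustration: build an all-quad mesh from a binary image by
# emitting one quadrilateral per foreground pixel. Shared corner nodes are
# deduplicated so neighbouring quads are conforming.
import numpy as np

def pixels_to_quads(mask):
    """Return (nodes, quads) for every True pixel in a 2D boolean mask."""
    node_ids = {}            # (row, col) corner -> node index
    nodes, quads = [], []

    def node(r, c):
        if (r, c) not in node_ids:
            node_ids[(r, c)] = len(nodes)
            nodes.append((c, r))          # x = column, y = row
        return node_ids[(r, c)]

    for r, c in zip(*np.nonzero(mask)):
        # Corners of the unit-square cell (r, c), consistent winding order.
        quads.append((node(r, c), node(r, c + 1),
                      node(r + 1, c + 1), node(r + 1, c)))
    return np.array(nodes, float), np.array(quads, int)

mask = np.array([[1, 1, 0],
                 [1, 1, 1]], dtype=bool)
nodes, quads = pixels_to_quads(mask)
print(len(nodes), "nodes,", len(quads), "quads")   # 11 nodes, 5 quads
```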

  2. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) with multiple levels of detail (Mixed and Reverse LoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Castel Masegra in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  3. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done “by hand”. In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.

  4. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
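
    Taxi's source is public, but the snippet below deliberately does not claim its API: it is a minimal sketch of the general idea behind a workflow manager for lattice campaigns, where measurement tasks wait on the configuration-generation tasks they depend on. The `Task` and `run_pool` names are invented.

```python
# Sketch of a dependency-ordered task runner: tasks are dispatched only once
# their prerequisites are complete, so a long campaign (generate a gauge
# configuration, then measure on it) can run unattended.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    action: callable
    requires: list = field(default_factory=list)

def run_pool(tasks):
    """Repeatedly run any task whose prerequisites are all complete."""
    pending = {t.name: t for t in tasks}
    while pending:
        ready = [t for t in pending.values()
                 if all(r not in pending for r in t.requires)]
        if not ready:
            raise RuntimeError("dependency cycle or missing prerequisite")
        for t in ready:
            t.action()                  # run the job (stubbed with prints)
            del pending[t.name]

run_pool([
    Task("config_100", lambda: print("generate gauge config 100")),
    Task("meas_100", lambda: print("measure observables on 100"),
         requires=["config_100"]),
])
```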

  5. Automated Narratives and Journalistic Text Generation: The Lead Organization Structure Translated into Code.

    Directory of Open Access Journals (Sweden)

    Márcio Carneiro dos Santos

    2016-07-01

    This paper describes the experiment of building software capable of generating newspaper leads and titles automatically from information obtained from the Internet. The theoretical possibility, already noted by Lage at the end of the last century, rests on the relatively rigid and simple structure of this type of story, which makes it easier to represent or translate its syntax into instructions the computer can execute. The paper also discusses the relationship between society, technique and technology, giving a brief history of the introduction of digital solutions in newsrooms and their impacts. The development was done with the Python programming language and the NLTK (Natural Language Toolkit) library, using the results of the 2013 Brazilian Soccer Championship published on an internet portal as the data source.
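
    The lead's rigid syntax is what makes the translation to code feasible. As a hedged illustration of that idea (invented templates and team names, plain string formatting rather than the paper's NLTK pipeline):

```python
# Template-driven lead generation from structured match data: slot the
# facts of a match into a fixed lead structure and a randomly chosen title.
import random

LEAD = ("{winner} beat {loser} {ws}-{ls} on {day} at {venue}, "
        "moving to {points} points in the championship.")
TITLES = ["{winner} defeats {loser}", "{winner} tops {loser} {ws}-{ls}"]

def write_story(match):
    title = random.choice(TITLES).format(**match)
    lead = LEAD.format(**match)
    return title, lead

match = dict(winner="Team A", loser="Team B", ws=2, ls=1,
             day="Sunday", venue="the municipal stadium", points=42)
title, lead = write_story(match)
print(title)
print(lead)
```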

  6. Automated drawing generation system

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Kawahata, Junichi; Yoshida, Naoto; Ono, Satoru

    1991-01-01

    Since automated CAD drawing generation systems still required human intervention, improvements were focused on the interactive processing section (data input and correction), which demands a vast amount of work. As a result, human intervention was eliminated, achieving the original objective of a computerized system. This is the first step taken towards complete automation. The effects of development and commercialization of the system are as follows. (1) The interactive processing time required for generating drawings was reduced; introduction of the CAD system shortened the time needed to generate drawings. (2) Differences in skill between workers preparing drawings have been eliminated and the quality of drawings has been made uniform. (3) The extent of knowledge and experience demanded of workers has been reduced. (author)

  7. Managing expectations: assessment of chemistry databases generated by automated extraction of chemical structures from patents.

    Science.gov (United States)

    Senger, Stefan; Bartek, Luca; Papadatos, George; Gaulton, Anna

    2015-12-01

    First public disclosure of new chemical entities often takes place in patents, which makes them an important source of information. However, with an ever increasing number of patent applications, manual processing and curation on such a large scale becomes even more challenging. An alternative approach better suited to this large corpus of documents is the automated extraction of chemical structures. A number of patent chemistry databases generated by the latter approach are now available, but little is known that can help to manage expectations when using them. This study aims to address this by comparing two such freely available sources, SureChEMBL and IBM SIIP (IBM Strategic Intellectual Property Insight Platform), with manually curated commercial databases. When looking at the percentage of chemical structures successfully extracted from a set of patents, using SciFinder as our reference, 59% and 51% were also found in SureChEMBL and IBM SIIP, respectively. When performing this comparison with compounds as the starting point, i.e. establishing whether, for a list of compounds, the databases provide the links between chemical structures and the patents they appear in, we obtained similar results: SureChEMBL and IBM SIIP found 62% and 59%, respectively, of the compound-patent pairs obtained from Reaxys. In our comparison of automatically generated vs. manually curated patent chemistry databases, the former successfully provided approximately 60% of the links between chemical structures and patents. It needs to be stressed that only a very limited number of patents and compound-patent pairs were used for our comparison. Nevertheless, our results will hopefully help to manage expectations of users of patent chemistry databases of this type, provide a useful framework for more studies like ours, and guide future developments of the workflows used for the automated extraction of chemical structures from patents. The challenges we have encountered...

  8. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting for other people as well. It is also the topic of a workshop paper that I am presenting in Paris. (Abstract below.) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of automated test case generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems (the oracle problem, combinatorial explosion, ...). Abstract of the paper: Over the last decade, code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise execution branch coverage. Only a few of them consider dependencies from outside the Code Under Test’s scope such...

  9. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment built on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML), supporting configuration design and parametric CFD grid generation. This report focuses on the work in the area of parametric CFD grid generation, using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  10. Cassini Tour Atlas Automated Generation

    Science.gov (United States)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

    During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. A separate Tour Atlas was required for each trajectory so that Cassini scientists could analyze the science opportunities in each candidate tour quickly and thoroughly and select the optimal series of orbits for science return. Manually generating that number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  11. Automated protein structure calculation from NMR data

    International Nuclear Information System (INIS)

    Williamson, Mike P.; Craven, C. Jeremy

    2009-01-01

    Current software is almost at the stage to permit completely automatic structure determination of small proteins of <15 kDa, from NMR spectra to structure validation with minimal user interaction. This goal is welcome, as it makes structure calculation more objective and therefore more easily validated, without any loss in the quality of the structures generated. Moreover, it releases expert spectroscopists to carry out research that cannot be automated. It should not take much further effort to extend automation to ca 20 kDa. However, there are technological barriers to further automation, of which the biggest are identified as: routines for peak picking; adoption and sharing of a common framework for structure calculation, including the assembly of an automated and trusted package for structure validation; and sample preparation, particularly for larger proteins. These barriers should be the main target for development of methodology for protein structure determination, particularly by structural genomics consortia

  12. Automated Generation of Attack Trees

    DEFF Research Database (Denmark)

    Vigo, Roberto; Nielson, Flemming; Nielson, Hanne Riis

    2014-01-01

    Attack trees are widely used to represent threat scenarios in a succinct and intuitive manner, suitable for conveying security information to non-experts. The manual construction of such objects relies on the creativity and experience of specialists, and therefore it is error-prone and impracticable for large systems. Nonetheless, the automated generation of attack trees has only been explored in connection with computer networks, leveraging rich models whose analysis typically leads to an exponential blow-up of the state space. We propose a static analysis approach where attack trees are automatically inferred from a process algebraic specification in a syntax-directed fashion, encompassing a great many application domains and avoiding a systematic exponential explosion. Moreover, we show how the standard propositional denotation of an attack tree can be used to phrase...
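
    To make the "propositional denotation" mentioned at the end concrete, here is a small self-contained sketch (not the paper's formalism): an attack tree as nested AND/OR nodes, evaluated against a set of attacker capabilities.

```python
# Standard propositional reading of an attack tree: an OR node succeeds if
# any child does, an AND node only if all children do. The tree below is an
# invented example.
def evaluate(node, facts):
    """node = ("leaf", name) | ("and", [children]) | ("or", [children])"""
    kind, payload = node
    if kind == "leaf":
        return payload in facts
    results = [evaluate(child, facts) for child in payload]
    return all(results) if kind == "and" else any(results)

tree = ("or", [
    ("leaf", "steal password"),
    ("and", [("leaf", "physical access"), ("leaf", "bypass lock")]),
])
print(evaluate(tree, {"physical access", "bypass lock"}))   # True
```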

  13. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
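
    As a hedged illustration of the mixed-integer programming formulation behind ATA, the sketch below selects items that maximize total information subject to a test-length and a content constraint. It uses the open-source PuLP modeller; the item-bank values are synthetic, and real specifications involve many more constraint types.

```python
# Test assembly as a 0/1 integer program: maximize summed item information
# at the target ability, subject to a fixed test length and a minimum
# number of items with a given content attribute.
from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum, value

info = [0.9, 0.7, 0.8, 0.4, 0.6]      # item information at target ability
is_algebra = [1, 0, 1, 1, 0]          # content attribute per item

prob = LpProblem("test_assembly", LpMaximize)
x = [LpVariable(f"x{i}", cat=LpBinary) for i in range(len(info))]

prob += lpSum(info[i] * x[i] for i in range(len(info)))      # objective
prob += lpSum(x) == 3                                        # test length
prob += lpSum(is_algebra[i] * x[i] for i in range(len(info))) >= 2

prob.solve()
print("selected items:", [i for i in range(len(info)) if value(x[i]) == 1])
```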

  14. Automated Test Requirement Document Generation

    Science.gov (United States)

    1987-11-01

    [OCR fragments from the scanned report; only partially recoverable:] a citation to "Diagnostics Based on the Principles of Artificial Intelligence", 1984 International Test Conference, 1 October 1984; a glossary of acronyms (AFSATCOM: Air Force Satellite Communication; AI: Artificial Intelligence; ASIC: Application Specific Integrated Circuit; ...); and the observation that Built-In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be...

  15. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  16. Automated procedures for sizing aerospace vehicle structures /SAVES/

    Science.gov (United States)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  17. A linear programming approach to reconstructing subcellular structures from confocal images for automated generation of representative 3D cellular models.

    Science.gov (United States)

    Wood, Scott T; Dean, Brian C; Dean, Delphine

    2013-04-01

    This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery. Copyright © 2012 Elsevier B.V. All rights reserved.
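
    A toy version of the stress-fiber step: choose non-negative weights for a dictionary of candidate fibers so that their superposition best matches the observed image. The paper poses this with linear programming; non-negative least squares stands in here to keep the sketch short, and the data are synthetic.

```python
# Recover which candidate fibers are present by fitting non-negative
# weights so that (fibers @ weights) reproduces the observed voxel values.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_voxels, n_candidates = 200, 30
fibers = rng.random((n_voxels, n_candidates))   # candidate fiber "images"
true_w = np.zeros(n_candidates)
true_w[[3, 17]] = [1.0, 0.5]                    # two fibers actually present
image = fibers @ true_w                         # noiseless observation

weights, residual = nnls(fibers, image)
print("recovered fibers:", np.nonzero(weights > 1e-6)[0],
      "residual:", residual)
```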

  18. Semi-automated ontology generation and evolution

    Science.gov (United States)

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

    Extending the notion of data models or object models, ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, only requesting user input when un-resolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical to ontological mappings in the form of WordNet, and Suggested Upper Merged Ontology (SUMO) integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts of speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user assisted semantic-based ontology generation. This paper will describe the OGEP technology in the context of the architectural

  19. Automated Liquibase Generator and Validator (ALGV)

    Directory of Open Access Journals (Sweden)

    Manik Jain

    2015-08-01

    This paper presents an automation tool, ALGV (Automated Liquibase Generator and Validator), for the automated generation and verification of Liquibase scripts. Liquibase is one of the most efficient ways of applying and persisting changes to a database schema. Since its invention by Nathan Voxland [1], it has become the de facto standard for database change management. The advantages of using Liquibase scripts over traditional SQL queries range from version control to reusing the same scripts across multiple database platforms. Irrespective of these advantages, manual creation of Liquibase scripts takes considerable effort and is sometimes error-prone. ALGV helps to reduce the time-consuming manual script generation, typing effort, possible error occurrence, and manual verification process, cutting the time required by 75%. Automating the Liquibase generation process also removes the burden of recalling the specific tags to be used for a particular change. Moreover, developers can concentrate on the business logic and business data rather than spending their efforts on writing script files.
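
    A rough sketch of the kind of output an ALGV-style generator automates: emitting a Liquibase `createTable` changeSet from a simple table description. The element and attribute names follow the standard Liquibase changelog schema; the generator function and table spec are invented.

```python
# Generate a Liquibase XML changeSet for table creation from a Python spec,
# so developers declare intent (table, columns) instead of hand-writing XML.
import xml.etree.ElementTree as ET

def create_table_changeset(author, cid, table, columns):
    cs = ET.Element("changeSet", id=cid, author=author)
    ct = ET.SubElement(cs, "createTable", tableName=table)
    for name, ctype in columns:
        ET.SubElement(ct, "column", name=name, type=ctype)
    return cs

cs = create_table_changeset("alice", "1", "person",
                            [("id", "int"), ("name", "varchar(255)")])
print(ET.tostring(cs, encoding="unicode"))
```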

  20. Automated generation of lattice QCD Feynman rules

    Energy Technology Data Exchange (ETDEWEB)

    Hart, A.; Mueller, E.H. [Edinburgh Univ. (United Kingdom). SUPA School of Physics and Astronomy; von Hippel, G.M. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Horgan, R.R. [Cambridge Univ. (United Kingdom). DAMTP, CMS

    2009-04-15

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  1. Automated generation of lattice QCD Feynman rules

    International Nuclear Information System (INIS)

    Hart, A.; Mueller, E.H.; Horgan, R.R.

    2009-04-01

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  2. Nuclear power generation and automation technology

    International Nuclear Information System (INIS)

    Korei, Yoshiro

    1985-01-01

    The proportion of nuclear power in total generated electric power has been increasing year after year, and ensuring its stable supply has been demanded. For the further development of nuclear power generation, heightening economical efficiency, which is the largest merit of nuclear power, and public acceptance as a safe and stable electric power source are the important subjects. In order to address these subjects, various automation techniques have been applied in nuclear power generation for the purposes of heightening reliability, saving labor and reducing radiation exposure. Meeting the strong need for automation, computer-aided automation technology has been applied to the design, manufacture, construction, operation and maintenance of nuclear power plants. Computer-aided design and examples of the design of a reactor building, piping and a fuel assembly are shown, along with an automatic all-position TIG pipe welder, a new central monitoring and control system, an automatic exchanger for control rod drive mechanisms, an automatic in-service inspection system for nozzles and piping, and a robot for steam generator maintenance. The trend of technical development is explained, together with an intelligent mobile robot, a system maintenance robot and a four-legged walking robot. (Kako, I.)

  3. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3D laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent on the...

  4. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3D laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent...

  5. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3D laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent on the interpretation and transformation of the resulting Point Cloud data into information which can be used in architectural and engineering design workflows. Our approach to tackling this problem is, in contrast to existing ones which work on the level of points, based on the detection of building elements...

  6. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds

    Directory of Open Access Journals (Sweden)

    Christopher Watson

    2012-05-01

    Unmanned Aerial Vehicles (UAVs) are an exciting new remote sensing tool capable of acquiring high resolution spatial data. Remote sensing with UAVs has the potential to provide imagery at an unprecedented spatial and temporal resolution. The small footprint of UAV imagery, however, makes it necessary to develop automated techniques to geometrically rectify and mosaic the imagery so that larger areas can be monitored. In this paper, we present a technique for geometric correction and mosaicking of UAV photography using feature matching and Structure from Motion (SfM) photogrammetric techniques. Images are processed to create three-dimensional point clouds, initially in an arbitrary model space. The point clouds are transformed into a real-world coordinate system using either a direct georeferencing technique that uses estimated camera positions or a Ground Control Point (GCP) technique that uses automatically identified GCPs within the point cloud. The point cloud is then used to generate a Digital Terrain Model (DTM) required for rectification of the images. Subsequent georeferenced images are then joined together to form a mosaic of the study area. The absolute spatial accuracy of the direct technique was found to be 65–120 cm whilst the GCP technique achieves an accuracy of approximately 10–15 cm.
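
    The crux of both georeferencing routes is estimating the transform from arbitrary model space into world coordinates. Below is a hedged sketch under the assumption that this is a 3D similarity transform fitted to GCP pairs, using the closed-form Umeyama/Procrustes solution; the coordinates are synthetic.

```python
# Fit scale, rotation and translation mapping model-space GCPs onto their
# surveyed world coordinates (Umeyama's closed-form solution).
import numpy as np

def similarity_transform(src, dst):
    """Return scale s, rotation R, translation t with dst ~ s*R@src + t."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A / len(src))    # cross-covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))              # avoid reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / A.var(0).sum()
    return s, R, mu_d - s * R @ mu_s

src = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
angle = 0.3
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
dst = 2.5 * src @ Rz.T + np.array([10., 20, 30])    # known ground truth

s, R, t = similarity_transform(src, dst)
print(np.round(s, 3), np.allclose(s * src @ R.T + t, dst))   # 2.5 True
```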

  7. Automated Item Generation with Recurrent Neural Networks.

    Science.gov (United States)

    von Davier, Matthias

    2018-03-12

    Utilizing technology for automated item generation is not a new idea. However, test items used in commercial testing programs or in research are still predominantly written by humans, in most cases by content experts or professional item writers. Human experts are a limited resource, and testing agencies incur high costs in the process of continuously renewing item banks to sustain testing programs. Using algorithms instead holds the promise of providing unlimited resources for this crucial part of assessment development. The approach presented here deviates in several ways from previous attempts to solve this problem. In the past, automatic item generation relied either on generating clones of narrowly defined item types such as those found in language-free intelligence tests (e.g., Raven's progressive matrices) or on an extensive analysis of task components and derivation of schemata to produce items with pre-specified variability that are hoped to have predictable levels of difficulty. It is somewhat unlikely that researchers utilizing these previous approaches would look at the proposed approach with favor; however, recent applications of machine learning show success in solving tasks that seemed impossible for machines not too long ago. The proposed approach uses deep learning to implement probabilistic language models, not unlike what Google Brain and Amazon Alexa use for language processing and generation.
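
    The paper's models are recurrent neural networks; as a deliberately tiny stand-in that still shows the core of a probabilistic language model (estimate next-token probabilities from a corpus, then sample new text), here is a bigram Markov generator with an invented corpus:

```python
# Learn an empirical next-word distribution from a toy corpus, then sample
# a new sentence. A real system would replace the bigram table with an RNN.
import random
from collections import defaultdict

corpus = ("the patient reported mild pain . the nurse recorded the vital "
          "signs . the patient rested").split()

follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)              # empirical P(next | current)

random.seed(1)
word, out = "the", ["the"]
for _ in range(8):
    choices = follows.get(word)
    if not choices:                   # dead end: no observed successor
        break
    word = random.choice(choices)
    out.append(word)
print(" ".join(out))
```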

  8. Automated test data generation for branch testing using incremental

    Indian Academy of Sciences (India)

    Cost of software testing can be reduced by automated test data generation to find a minimal set of data that has maximum coverage. Search-based software testing (SBST) is one of the techniques recently used for automated testing task. SBST makes use of control flow graph (CFG) and meta-heuristic search algorithms to ...

  9. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed, using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These criteria have allowed the structure-solution process to be converted into an optimization problem and automated. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations.

  10. Loft: An Automated Mesh Generator for Stiffened Shell Aerospace Vehicles

    Science.gov (United States)

    Eldred, Lloyd B.

    2011-01-01

    Loft is an automated mesh generation code that is designed for aerospace vehicle structures. From user input, Loft generates meshes for wings, noses, tanks, fuselage sections, thrust structures, and so on. As a mesh is generated, each element is assigned properties to mark the part of the vehicle with which it is associated. This property assignment is an extremely powerful feature that enables detailed analysis tasks, such as load application and structural sizing. This report is presented in two parts. The first part is an overview of the code and its applications. The modeling approach that was used to create the finite element meshes is described. Several applications of the code are demonstrated, including a Next Generation Launch Technology (NGLT) wing-sizing study, a lunar lander stage study, a launch vehicle shroud shape study, and a two-stage-to-orbit (TSTO) orbiter. Part two of the report is the program user manual. The manual includes in-depth tutorials and a complete command reference.

  11. ASTROS: A multidisciplinary automated structural design tool

    Science.gov (United States)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  12. Automation Framework for Flight Dynamics Products Generation

    Science.gov (United States)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

    XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.

  13. Automated Testing with Targeted Event Sequence Generation

    DEFF Research Database (Denmark)

    Jensen, Casper Svenning; Prasad, Mukul R.; Møller, Anders

    2013-01-01

    Automated software testing aims to detect errors by producing test inputs that cover as much of the application source code as possible. Applications for mobile devices are typically event-driven, which raises the challenge of automatically producing event sequences that result in high coverage...

  14. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  15. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  16. Automated robust generation of compact 3D statistical shape models

    Science.gov (United States)

    Vrtovec, Tomaz; Likar, Bostjan; Tomazevic, Dejan; Pernus, Franjo

    2004-05-01

    Ascertaining the detailed shape and spatial arrangement of anatomical structures is important not only within diagnostic settings but also in the areas of planning, simulation, intraoperative navigation, and tracking of pathology. Robust, accurate and efficient automated segmentation of anatomical structures is difficult because of their complexity and inter-patient variability. Furthermore, the position of the patient during image acquisition, the imaging device and protocol, image resolution, and other factors induce additional variations in shape and appearance. Statistical shape models (SSMs) have proven quite successful in capturing structural variability. A possible approach to obtain a 3D SSM is to extract reference voxels by precisely segmenting the structure in one, reference image. The corresponding voxels in other images are determined by registering the reference image to each other image. The SSM obtained in this way describes statistically plausible shape variations over the given population as well as variations due to imperfect registration. In this paper, we present a completely automated method that significantly reduces shape variations induced by imperfect registration, thus allowing a more accurate description of variations. At each iteration, the derived SSM is used for coarse registration, which is further improved by describing finer variations of the structure. The method was tested on 64 lumbar spinal column CT scans, from which 23, 38, 45, 46 and 42 volumes of interest containing vertebra L1, L2, L3, L4 and L5, respectively, were extracted. Separate SSMs were generated for each vertebra. The results show that the method is capable of reducing the variations induced by registration errors.
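
    A minimal sketch of the point distribution model underlying such SSMs, assuming landmark correspondence has already been established by registration: stack each shape's landmark coordinates into a row, and a PCA (here via an SVD of the centered data) yields the mean shape plus the principal modes of variation. The data below are synthetic.

```python
# Build a statistical shape model: each training shape becomes one row of
# flattened landmark coordinates; SVD of the centered matrix gives the
# variation modes. The synthetic data contain a single true mode.
import numpy as np

rng = np.random.default_rng(3)
n_shapes, n_landmarks = 40, 25
mean_shape = rng.random((n_landmarks, 3))
mode = rng.standard_normal((n_landmarks, 3)) * 0.05
shapes = np.stack([mean_shape + rng.standard_normal() * mode
                   for _ in range(n_shapes)])        # (40, 25, 3)

X = shapes.reshape(n_shapes, -1)                     # one row per shape
X_centered = X - X.mean(0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
explained = S**2 / (S**2).sum()
print("variance captured by first mode: %.2f" % explained[0])   # ~1.00
```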

  17. Automating the Generation of Heterogeneous Aviation Safety Cases

    Science.gov (United States)

    Denney, Ewen W.; Pai, Ganesh J.; Pohl, Josef M.

    2012-01-01

    A safety case is a structured argument, supported by a body of evidence, which provides a convincing and valid justification that a system is acceptably safe for a given application in a given operating environment. This report describes the development of a fragment of a preliminary safety case for the Swift Unmanned Aircraft System. The construction of the safety case fragment consists of two parts: a manually constructed system-level case, and an automatically constructed lower-level case, generated from formal proof of safety-relevant correctness properties. We provide a detailed discussion of the safety considerations for the target system, emphasizing the heterogeneity of sources of safety-relevant information, and use a hazard analysis to derive safety requirements, including formal requirements. We evaluate the safety case using three classes of metrics for measuring degrees of coverage, automation, and understandability. We then present our preliminary conclusions and make suggestions for future work.

  18. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  19. Automated Test Case Generation for an Autopilot Requirement Prototype

    Science.gov (United States)

    Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael

    2011-01-01

    Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution that allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component of the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.
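
    A small sketch of the core mechanism (not ADEPT itself): treat a branch's path condition as a constraint and ask an SMT solver for a satisfying input, which becomes a test case. This uses the Z3 solver's Python bindings; the guarded autopilot condition is invented.

```python
# Generate one concrete input per branch of
#     if altitude > 10000 and altitude < 20000: engage_hold()
# by solving each branch's path condition with Z3.
from z3 import Int, Solver, And, Not, sat

altitude = Int("altitude")
branch_taken = And(altitude > 10000, altitude < 20000)

for label, cond in [("then", branch_taken), ("else", Not(branch_taken))]:
    s = Solver()
    s.add(cond)                       # the path condition for this branch
    if s.check() == sat:
        print(label, "branch input:", s.model()[altitude])
```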

  20. Cerebellum engages in automation of verb-generation skill.

    Science.gov (United States)

    Yang, Zhi; Wu, Paula; Weng, Xuchu; Bandettini, Peter A

    2014-03-01

    Numerous studies have shown cerebellar involvement in item-specific association, a form of explicit learning. However, very few have demonstrated cerebellar participation in the automation of non-motor cognitive tasks. Applying fMRI to a repeated verb-generation task, we sought to distinguish cerebellar involvement in the learning of item-specific noun-verb associations from its involvement in the automation of verb-generation skill. The same set of nouns was repeated in six verb-generation blocks so that subjects practiced generating verbs for the nouns. The practice was followed by a novel block with a different set of nouns. The cerebellar vermis (IV/V) and the right cerebellar lobule VI showed decreased activation following practice; activation in the right cerebellar Crus I was significantly lower in the novel challenge than in the initial verb-generation task. Furthermore, activation in this region during well-practiced blocks strongly correlated with improvement of behavioral performance in both the well-practiced and the novel blocks, suggesting its role in the learning of general mental skills not specific to the practiced noun-verb pairs. Therefore, the cerebellum processes both explicit verbal associative learning and the automation of cognitive tasks. Different cerebellar regions predominate in this processing: lobule VI during the acquisition of item-specific associations, and Crus I during the automation of verb-generation skill through practice.

  1. Automated mass spectrum generation for new physics

    CERN Document Server

    Alloul, Adam; De Causmaecker, Karen; Fuks, Benjamin; Rausch de Traubenberg, Michel

    2013-01-01

    We describe an extension of the FeynRules package dedicated to the automatic generation of the mass spectrum associated with any Lagrangian-based quantum field theory. After introducing a simplified way to implement particle mixings, we present a new class of FeynRules functions allowing both for the analytical computation of all the model mass matrices and for the generation of a C++ package, dubbed ASperGe. This program can then be further employed for a numerical evaluation of the rotation matrices necessary to diagonalize the field basis. We illustrate these features in the context of the Two-Higgs-Doublet Model, the Minimal Left-Right Symmetric Standard Model and the Minimal Supersymmetric Standard Model.
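
    Numerically, the step that ASperGe automates boils down to diagonalising mass matrices. A generic illustration with NumPy (the 2x2 symmetric matrix is an arbitrary example, not taken from any of the models above):

```python
# Diagonalise a symmetric mass-squared matrix: eigenvalues give the mass
# spectrum, the eigenvector matrix is the rotation to the mass basis.
import numpy as np

M = np.array([[125.0**2, 40.0],
              [40.0,      90.0**2]])      # symmetric mass-squared matrix

masses_sq, rotation = np.linalg.eigh(M)   # eigenvalues in ascending order
print("masses:", np.sqrt(masses_sq))
# rotation.T @ M @ rotation is diagonal: the field rotation to mass basis
print(np.round(rotation.T @ M @ rotation, 6))
```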

  2. AutoDipole - Automated generation of dipole subtraction terms

    International Nuclear Information System (INIS)

    Hasegawa, K.; Uwer, P.

    2009-11-01

    We present an automated generation of the subtraction terms for next-to-leading order QCD calculations in the Catani-Seymour dipole formalism. For a given scattering process with n external particles our Mathematica package generates all dipole terms, allowing for both massless and massive dipoles. The numerical evaluation of the subtraction terms proceeds with MadGraph, which provides Fortran code for the necessary scattering amplitudes. Checks of the numerical stability are discussed. (orig.)

  3. AutoDipole - Automated generation of dipole subtraction terms

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, K.; Uwer, P. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Moch, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2009-11-15

    We present an automated generation of the subtraction terms for next-to-leading order QCD calculations in the Catani-Seymour dipole formalism. For a given scattering process with n external particles our Mathematica package generates all dipole terms, allowing for both massless and massive dipoles. The numerical evaluation of the subtraction terms proceeds with MadGraph, which provides Fortran code for the necessary scattering amplitudes. Checks of the numerical stability are discussed. (orig.)

  4. Generative Moire Structures

    Directory of Open Access Journals (Sweden)

    Adrian – Mihail Marian

    2006-01-01

    “GRAPHIC ON COMPUTER”, the work of the Czech artist Petar Milojevic published in Titus Mocanu's 1973 book “THE MODERN ART'S MORPHOLOGY”, had a great influence on me. I tried to discover the algorithm that generated this work. It was not so difficult to do, and in a short time I was able to draw 12 such structures. Over time, with interruptions, I have returned to this kind of work. In my personal exhibition “CYBERNETIC DESIGN”, held at the “M4-1-13-etopa” gallery of Pitesti in March 1981, I presented 8 such structures. To my joy, they had an impact on art lovers.

  5. A programmable Gaussian random pulse generator for automated performance measurements

    International Nuclear Information System (INIS)

    Abdel-Aal, R.E.

    1989-01-01

    This paper describes a versatile random signal generator which produces logic pulses with a Gaussian distribution for the pulse spacing. The average rate at the pulse generator output can be software-programmed, which makes it useful in performing automated measurements of dead time and CPU time performance of data acquisition systems and modules over a wide range of data rates. Hardware and software components are described and data on the input-output characteristics and the statistical properties of the pulse generator are given. Typical applications are discussed together with advantages over using radioactive test sources. Results obtained from an automated performance run on a VAX 11/785 data acquisition system are presented. (orig.)
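
    The generator's defining statistic is Gaussian-distributed pulse spacing with a programmable mean rate. Here is a sketch of that behaviour in software (illustrative parameters; spacings are clipped at zero so time always advances, which slightly truncates the distribution):

```python
# Simulate pulse arrival times whose successive spacings are drawn from a
# Gaussian with mean 1/rate, mimicking the programmable pulse generator.
import numpy as np

def gaussian_pulse_times(rate_hz, sigma, n, seed=0):
    rng = np.random.default_rng(seed)
    spacings = rng.normal(1.0 / rate_hz, sigma, n)
    spacings = np.clip(spacings, 1e-9, None)   # forbid negative gaps
    return np.cumsum(spacings)

times = gaussian_pulse_times(rate_hz=1000.0, sigma=2e-4, n=10000)
print("achieved mean rate: %.1f Hz" % (len(times) / times[-1]))
```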

  6. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M.

    2013-01-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine-tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing for the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of...

  7. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web services.

  8. Automated processing of data generated by molecular dynamics

    International Nuclear Information System (INIS)

    Lobato Hoyos, Ivan; Rojas Tapia, Justo; Instituto Peruano de Energia Nuclear, Lima

    2008-01-01

    A new integrated tool for automated processing of data generated by molecular dynamics packages and programs has been developed. The program calculates important quantities such as the pair correlation function, performs common-neighbour analysis, counts nanoparticles and computes their size distribution, and converts output files between different formats. The work explains in detail the modules of the tool and the interfaces between them. The use of the program is illustrated with application examples involving the calculation of various properties of silver nanoparticles. (author)
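
    As an example of one quantity such a tool computes, here is a hedged sketch of a radial pair correlation function g(r) for a set of particle positions, assuming free (non-periodic) boundaries and synthetic coordinates, so edge effects are ignored:

```python
# Histogram pairwise distances and normalise by the ideal-gas expectation
# to obtain g(r); for uniform random positions g(r) is close to 1.
import numpy as np

def pair_correlation(pos, r_max, n_bins, volume):
    n = len(pos)
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]                  # unique pairs only
    hist, edges = np.histogram(d, bins=n_bins, range=(0.0, r_max))
    r = 0.5 * (edges[1:] + edges[:-1])
    shell = 4 * np.pi * r**2 * np.diff(edges)       # spherical shell volumes
    ideal = (n / volume) * shell * n / 2            # expected pair counts
    return r, hist / ideal

rng = np.random.default_rng(7)
pos = rng.random((500, 3)) * 20.0                   # particles in a 20^3 box
r, g = pair_correlation(pos, r_max=5.0, n_bins=50, volume=20.0**3)
print(np.round(g[:5], 2))
```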

  9. JWST Associations overview: automated generation of combined products

    Science.gov (United States)

    Alexov, Anastasia; Swade, Daryl; Bushouse, Howard; Diaz, Rosa; Eisenhamer, Jonathan; Hack, Warren; Kyprianou, Mark; Levay, Karen; Rahmani, Christopher; Swam, Mike; Valenti, Jeff

    2018-01-01

    We present the design of the James Webb Space Telescope (JWST) Data Management System (DMS) automated processing of Associations. An Association captures the relationship between exposures and higher-level data products, such as combined mosaics created from dithered and tiled observations. The astronomer's intent is captured within the Proposal Planning System (PPS) and provided to DMS as candidate associations. These candidates are converted into Association Pools and Association Generator Tables that serve as input to automated processing, which creates the combined data products. Association Pools are generated to capture a list of exposures that could potentially form associations and provide relevant information about those exposures. The Association Generator, using grouping definitions, creates one or more Association Tables from a single input Association Pool. Each Association Table defines a set of exposures to be combined and the ruleset for the combination to be performed; the calibration software creates Associated data products based on these input tables. The initial design produces automated Associations within a proposal. This overall JWST design is also conducive to eventually producing Associations for observations from multiple proposals, similar to the Hubble Legacy Archive (HLA).

  10. Automation of steam generator services at public service electric & gas

    Energy Technology Data Exchange (ETDEWEB)

    Cruickshank, H.; Wray, J.; Scull, D. [Public Service Electric & Gas, Hancock's Bridge, NJ (United States)

    1995-03-01

    Public Service Electric & Gas takes an aggressive approach to pursuing new exposure reduction techniques. Evaluation of historic outage exposure shows that over the last eight refueling outages, primary steam generator work has averaged sixty-six (66) person-rem, or approximately twenty-five percent (25%) of the general outage exposure at Salem Station. This maintenance evolution represents the largest percentage of exposure for any single activity. Because of this, primary steam generator work represents an excellent opportunity for the development of significant exposure reduction techniques. A study of primary steam generator maintenance activities demonstrated that seventy-five percent (75%) of radiation exposure was due to work activities on the primary steam generator platform, and that development of automated methods for performing these activities was worth pursuing. Existing robotics systems were examined and it was found that a new approach would have to be developed. This resulted in a joint research and development project between Westinghouse and Public Service Electric & Gas to develop an automated system for accomplishing the Health Physics functions on the primary steam generator platform. R.O.M.M.R.S. (Remotely Operated Managed Maintenance Robotics System) was the result of this venture.

  11. A Recommendation Algorithm for Automating Corollary Order Generation

    Science.gov (United States)

    Klann, Jeffrey; Schadow, Gunther; McCoy, JM

    2009-01-01

    Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, the e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows that corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association-rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed that 75.3% were clinically meaningful. Also, at least one panelist felt that 47.9% would be directly relevant in guideline development. This automated generation of a rough cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation of decision support development, which could increase development efficiency and content quality while automatically capturing local standards. PMID:20351875
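
    The core of such an item-based approach can be reduced to counting co-occurrences and scoring them with interestingness measures. The Python sketch below is a simplified stand-in, not the authors' implementation: it ranks candidate corollary pairs by lift computed from a toy order history.

        from collections import Counter
        from itertools import combinations

        def corollary_candidates(orders, min_support=2):
            """Suggest corollary order pairs (a -> b) ranked by lift, from a
            list of orders, each order being a set of order-item codes."""
            n = len(orders)
            item_count = Counter()
            pair_count = Counter()
            for order in orders:
                items = sorted(set(order))
                item_count.update(items)
                pair_count.update(combinations(items, 2))
            rules = []
            for (a, b), c in pair_count.items():
                if c < min_support:
                    continue
                support = c / n
                confidence = c / item_count[a]           # P(b | a)
                lift = confidence / (item_count[b] / n)  # interestingness measure
                rules.append((a, b, support, confidence, lift))
            return sorted(rules, key=lambda r: r[-1], reverse=True)

        # Toy order history: gentamicin is usually ordered with drug-level monitoring.
        orders = [{"gentamicin", "gentamicin_level"},
                  {"gentamicin", "gentamicin_level", "cbc"},
                  {"cbc"}, {"heparin", "aptt"}, {"heparin", "aptt"}]
        for rule in corollary_candidates(orders)[:3]:
            print(rule)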

  12. Carbohydrate structure: the rocky road to automation.

    Science.gov (United States)

    Agirre, Jon; Davies, Gideon J; Wilson, Keith S; Cowtan, Kevin D

    2017-06-01

    With the introduction of intuitive graphical software, structural biologists who are not experts in crystallography are now able to build complete protein or nucleic acid models rapidly. In contrast, carbohydrates are in a wholly different situation: scant automation exists, with manual building attempts being sometimes toppled by incorrect dictionaries or refinement problems. Sugars are the most stereochemically complex family of biomolecules and, as pyranose rings, have clear conformational preferences. Despite this, all refinement programs may produce high-energy conformations at medium to low resolution, without any support from the electron density. This problem renders the affected structures unusable in glyco-chemical terms. Bringing structural glycobiology up to 'protein standards' will require a total overhaul of the methodology. Time is of the essence, as the community is steadily increasing the production rate of glycoproteins, and electron cryo-microscopy has just started to image them in precisely that resolution range where crystallographic methods falter most. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    Tran, Viet-Phu; Tran, Hoai-Nam; Yamamoto, Akio; Endo, Tomohiro

    2017-01-01

    This paper presents the development of automated burnup-chain generation for reactor analysis applications. Algorithms are proposed to reevaluate decay modes, branching ratios and effective fission product (FP) cumulative yields of a given list of important FPs, taking into account intermediate reactions. A new burnup chain is generated using updated data sources taken from the JENDL FP Decay Data File 2011 and Fission Yields Data File 2011. The new burnup chain is output according to the format of the SRAC code system. Verification has been performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference chain with 193 fission products used in SRAC. Burnup calculations using the new burnup chain have also been performed for UO₂ and MOX fuel pin cells and compared with the reference chain th2cm6fp193bp6T.
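
    The folding of intermediate reactions can be illustrated with a small Python sketch (our own, not the authors' code): when a short-lived nuclide is dropped from the chain, its fission yield is transferred to its daughters and any decay branch that passed through it is redirected according to its branching ratios.

        def fold_intermediate(chain, nuclide):
            """Remove a short-lived nuclide from a burnup chain, redirecting
            its fission yield and incoming decay branches to its daughters.
            chain: {name: {"yield": float, "branches": {daughter: ratio}}}"""
            node = chain.pop(nuclide)
            # Daughters inherit the removed nuclide's fission yield.
            for daughter, ratio in node["branches"].items():
                chain[daughter]["yield"] += ratio * node["yield"]
            # Parents that decayed through it now decay directly to its daughters.
            for other in chain.values():
                if nuclide in other["branches"]:
                    r = other["branches"].pop(nuclide)
                    for daughter, ratio in node["branches"].items():
                        other["branches"][daughter] = (
                            other["branches"].get(daughter, 0.0) + r * ratio)
            return chain

        # Toy 3-nuclide chain: A -> B (short-lived) -> C
        chain = {"A": {"yield": 0.02, "branches": {"B": 1.0}},
                 "B": {"yield": 0.01, "branches": {"C": 1.0}},
                 "C": {"yield": 0.03, "branches": {}}}
        fold_intermediate(chain, "B")
        print(chain["C"]["yield"])     # 0.04: C absorbs B's effective yield
        print(chain["A"]["branches"])  # {'C': 1.0}: A now decays directly to C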

  14. Automated generation of burnup chain for reactor analysis applications

    Energy Technology Data Exchange (ETDEWEB)

    Tran, Viet-Phu [VINATOM, Hanoi (Viet Nam). Inst. for Nuclear Science and Technology; Tran, Hoai-Nam [Duy Tan Univ., Da Nang (Viet Nam). Inst. of Research and Development; Yamamoto, Akio; Endo, Tomohiro [Nagoya Univ., Nagoya-shi (Japan). Dept. of Materials, Physics and Energy Engineering

    2017-05-15

    This paper presents the development of automated burnup-chain generation for reactor analysis applications. Algorithms are proposed to reevaluate decay modes, branching ratios and effective fission product (FP) cumulative yields of a given list of important FPs, taking into account intermediate reactions. A new burnup chain is generated using updated data sources taken from the JENDL FP Decay Data File 2011 and Fission Yields Data File 2011. The new burnup chain is output according to the format of the SRAC code system. Verification has been performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference chain with 193 fission products used in SRAC. Burnup calculations using the new burnup chain have also been performed for UO₂ and MOX fuel pin cells and compared with the reference chain th2cm6fp193bp6T.

  15. PONDEROSA, an automated 3D-NOESY peak picking program, enables automated protein structure determination.

    Science.gov (United States)

    Lee, Woonghee; Kim, Jin Hae; Westler, William M; Markley, John L

    2011-06-15

    PONDEROSA (Peak-picking Of Noe Data Enabled by Restriction of Shift Assignments) accepts input information consisting of a protein sequence, backbone and sidechain NMR resonance assignments, and 3D-NOESY (¹³C-edited and/or ¹⁵N-edited) spectra, and returns assignments of NOESY crosspeaks, distance and angle constraints, and a reliable NMR structure represented by a family of conformers. PONDEROSA incorporates and integrates external software packages (TALOS+, STRIDE and CYANA) to carry out different steps in the structure determination. PONDEROSA implements internal functions that identify and validate NOESY peak assignments and assess the quality of the calculated three-dimensional structure of the protein. The robustness of the analysis results from PONDEROSA's hierarchical processing steps that involve iterative interaction among the internal and external modules. PONDEROSA supports a variety of input formats: SPARKY assignment table (.shifts) and spectrum file formats (.ucsf), XEASY proton file format (.prot), and NMR-STAR format (.star). To demonstrate the utility of PONDEROSA, we used the package to determine 3D structures of two proteins: human ubiquitin and Escherichia coli iron-sulfur scaffold protein variant IscU(D39A). The automatically generated structural constraints and ensembles of conformers were as good as or better than those determined previously by much less automated means. The program, in the form of binary code along with tutorials and reference manuals, is available at http://ponderosa.nmrfam.wisc.edu/.

  16. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases ¹⁵N–¹H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  17. Automated Generation of User Guidance by Combining Computation and Deduction

    Directory of Open Access Journals (Sweden)

    Walther Neuper

    2012-02-01

    Herewith, a fairly old concept is published for the first time under the name "Lucas Interpretation". It has been implemented in a prototype, which has proved useful in educational practice and has gained academic relevance with an emerging generation of educational mathematics assistants (EMA) based on Computer Theorem Proving (CTP). Automated Theorem Proving (ATP), i.e. deduction, is the most reliable technology used to check user input. However, ATP is inherently weak at automatically generating solutions for arbitrary problems in applied mathematics. This weakness is crucial for EMAs: when ATP flags user input as incorrect and the learner gets stuck, the system should be able to suggest possible next steps. The key idea of Lucas Interpretation is to compute the steps of a calculation by following a program written in a novel CTP-based programming language; i.e. computation provides the next steps. User guidance is generated by combining deduction and computation: the latter is performed by a specific language interpreter, which works like a debugger and hands over control to the learner at breakpoints, i.e. tactics generating the steps of the calculation. The interpreter also builds up logical contexts providing ATP with the data required for checking user input, thus combining computation and deduction. The paper describes the concepts underlying Lucas Interpretation so that open questions can be adequately addressed, and prerequisites for further work are provided.

  18. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    Tran Viet Phu; Tran Hoai Nam; Akio Yamamoto; Tomohiro Endo

    2015-01-01

    This paper presents the development of automated generation of a new burnup chain for reactor analysis applications. The JENDL FP Decay Data File 2011 and Fission Yields Data File 2011 were used as the data sources. The nuclides in the new chain are determined by restrictions on the half-life and cumulative yield of fission products, or from a given list. Decay modes, branching ratios and fission yields are then recalculated taking into account intermediate reactions. The new burnup chain is output according to the format of the SRAC code system. Verification was performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference chain with 193 fission products used in SRAC. Further development and applications are being planned with the burnup chain code. (author)

  19. Balancing Structure for Multiple Generator

    Directory of Open Access Journals (Sweden)

    LUPU Ciprian

    2014-05-01

    This paper presents a strategy for (re)balancing a multiple-generator control system structure while maintaining the global output in the case of load and functioning disturbances. Applicability is demonstrated on control structures of two and three sources connected in parallel to produce energy, a situation encountered more and more these days, especially in the renewable energy industry (wind, solar, small generators, etc.).

  20. A Structured Light Scanner for Hyper Flexible Industrial Automation

    DEFF Research Database (Denmark)

    Hansen, Kent; Pedersen, Jeppe; Sølund, Thomas

    2014-01-01

    A current trend in industrial automation implies a need for automatic scene understanding from optical 3D sensors, which in turn imposes a need for a lightweight and reliable 3D optical sensor to be mounted on a collaborative robot, e.g., a Universal Robots UR5 or a KUKA LWR. Here, we empirically ... contribute to the robustness of the system. Hereby, we demonstrate that structured light scanning is a technology well suited for hyper flexible industrial automation, by proposing an appropriate system.

  1. A utilization of fuzzy control for design automation of nuclear structures

    International Nuclear Information System (INIS)

    Yoshimura, Shinobu; Yagawa, Genki; Mochizuki, Yoshihiko

    1991-01-01

    This paper describes the automated design of nuclear structures by means of artificial intelligence techniques. The 'generate and test' strategy is adopted as the basic design strategy. An empirical approach based on fuzzy control is introduced for efficient design modification. The system is applied to the design of some 2D models of the fusion first wall. (author)

  2. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While STPA has proven to be very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety requirements and other functional model-based requirements during early development of the system.

  3. An automated approach to network features of protein structure ensembles

    Science.gov (United States)

    Bhattacharyya, Moitrayee; Bhat, Chanda R; Vishveshwara, Saraswathi

    2013-01-01

    Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from the PDB and from simulations establishes a need for a standalone, efficient program that assembles network concepts/parameters under one hood in an automated manner. Herein, we discuss the development and application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulations/NMR studies or from multiple X-ray structures. The novelty in network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction of a flexible weighting scheme in terms of residue pairwise cross-correlation/interaction energy in PSN-Ensemble brings dynamical/chemical knowledge into the network representation. Also, the results are mapped onto a graphical display of the structure, allowing easy access to network analysis for the general biological community. The potential of PSN-Ensemble toward examining structural ensembles is exemplified using MD trajectories of an ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single static structures of the active/inactive states of the β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html. PMID:23934896
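
    The underlying network construction is straightforward to sketch. The Python fragment below builds a generic residue-contact network for illustration (using networkx); the distance cutoff and inverse-distance weights are our own simplifications, not PSN-Ensemble's side-chain interaction criterion or weighting scheme.

        import networkx as nx
        import numpy as np

        def structure_network(coords, labels, cutoff=6.5):
            """Build a residue interaction network: nodes are residues, and an
            edge joins residues whose centroids lie within `cutoff` angstroms."""
            g = nx.Graph()
            g.add_nodes_from(labels)
            for i in range(len(labels)):
                for j in range(i + 1, len(labels)):
                    d = np.linalg.norm(coords[i] - coords[j])
                    if d < cutoff:
                        g.add_edge(labels[i], labels[j], weight=1.0 / d)
            return g

        # Toy example: centroids of five residues.
        coords = np.array([[0, 0, 0], [4, 0, 0], [8, 0, 0], [4, 4, 0], [20, 0, 0]],
                          dtype=float)
        g = structure_network(coords, ["R1", "R2", "R3", "R4", "R5"])
        hubs = nx.betweenness_centrality(g)  # crude proxy for communication hubs
        print(sorted(hubs, key=hubs.get, reverse=True)[0])  # most central residue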

  4. UNICOS CPC6: automated code generation for process control applications

    International Nuclear Information System (INIS)

    Fernandez Adiego, B.; Blanco Vinuela, E.; Prieto Barreiro, I.

    2012-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS). As a part of this framework, UNICOS-CPC provides a well-defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software; it dynamically discovers and calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA generator (based on PVSS) and the CPC wizard, a dedicated plug-in created to provide the user with a friendly GUI (Graphical User Interface). A tool called UAB Bootstrap will manage the different UAB components, like CPC, and their dependencies on the resource packages. This tool guides the control system developer during the installation, update and execution of the UAB components. (authors)

  5. Automated and fast building of three-dimensional RNA structures.

    Science.gov (United States)

    Zhao, Yunjie; Huang, Yangyu; Gong, Zhou; Wang, Yanjie; Man, Jianfen; Xiao, Yi

    2012-01-01

    Building the tertiary structures of non-coding RNAs is required to understand their functions and to design new molecules. Current algorithms for RNA tertiary structure prediction give satisfactory accuracy only for RNAs of small size and simple topology, and many of them need manual manipulation. Here, we present an automated and fast program, 3dRNA, for RNA tertiary structure prediction with reasonable accuracy for RNAs of larger size and more complex topology.

  6. Automating generation of textual class definitions from OWL to English.

    Science.gov (United States)

    Stevens, Robert; Malone, James; Williams, Sandra; Power, Richard; Third, Allan

    2011-05-17

    Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort and there is often a lag between specification of the main part of an ontology (logical descriptions and definitions of entities) and the development of the text-based definitions. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. The application described here uses NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities, so avoiding the bottleneck of authoring these definitions by hand. To produce the descriptions, the program collects all the axioms relating to a given entity, groups them according to common structure, realises each group through an English sentence, and assembles the resulting sentences into a paragraph, to form as 'coherent' a text as possible without human intervention. Sentence generation is accomplished using a generic grammar based on logical patterns in OWL, together with a lexicon for realising atomic entities. We have tested our output for the Experimental Factor Ontology (EFO) using a simple survey strategy to explore the fluency of the generated text and how well it conveys the underlying axiomatisation. Two rounds of survey and improvement show that overall the generated English definitions are found to convey the intended meaning of the axiomatisation in a satisfactory manner. The surveys also suggested that one form of generated English will not be universally liked; that intrusion of too much 'formal ontology' was not liked; and that too much explicit exposure of OWL semantics was also not liked. Our prototype tools can generate reasonable paragraphs of English text that can act as definitions. The definitions were found acceptable by our survey and, as a result, the
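
    The group-then-realise pipeline can be sketched in a few lines. The toy Python fragment below is illustrative only (the actual system's grammar and lexicon are far richer): it groups property axioms that share a predicate and realises each group as one coordinated English sentence.

        def realize(entity, axioms):
            """Render a toy definition from (property, filler) axioms, grouping
            axioms that share a property into one coordinated sentence."""
            groups = {}
            for prop, filler in axioms:
                groups.setdefault(prop, []).append(filler)
            sentences = []
            for prop, fillers in groups.items():
                verb = prop.replace("_", " ")          # e.g. "has_part" -> "has part"
                if len(fillers) == 1:
                    obj = fillers[0]
                else:
                    obj = ", ".join(fillers[:-1]) + " and " + fillers[-1]
                sentences.append(f"{entity.capitalize()} {verb} {obj}.")
            return " ".join(sentences)

        axioms = [("is", "an experimental factor"),
                  ("has_part", "a nucleus"),
                  ("has_part", "a cell membrane")]
        print(realize("cell", axioms))
        # Cell is an experimental factor. Cell has part a nucleus and a cell membrane.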

  7. A Next-Generation Automated Holdup Measurement System (HMS-5)

    International Nuclear Information System (INIS)

    Gariazzo, Claudio Andres; Smith, Steven E.; Solodov, Alexander A

    2007-01-01

    hardware such as lanthanum halide detectors and digital processing multichannel analyzers will be incorporated into the new HMS-5 system to accommodate the evolving realm of SNM detection and quantification. HMS-5 is the natural progression from previous incarnations of automated special nuclear material holdup measurement systems for process facilities. ORNL is leading this next-generation system with assistance from its foreign partners and the past experience of its Safeguards Laboratory staff

  8. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    The present application of optimum design appears to be restricted to components of the structure rather than to the total structural system. Since design normally involves many analyses of the system, any improvement in the efficiency of the basic methods of analysis will allow more complicated systems to be designed by optimum methods. The evaluation of the risk and reliability of a structural system can be extremely important. Reliability studies have been made of many non-structural systems for which the individual components have been extensively tested and the service environment is known; for such systems the reliability studies are valid. For most structural systems, however, the properties of the components can only be estimated, and statistical data associated with the potential loads are often minimal. Also, a potentially critical loading condition may be completely neglected in the study. For these reasons, and because of the problems associated with the reliability of both linear and nonlinear analysis computer programs, it appears to be premature to place significant value on such studies for complex structures. With these comments as background, the purpose of this paper is to discuss the following: the relationship of analysis to design; new methods of analysis; new or improved finite elements; the effect of the minicomputer on structural analysis methods; the use of systems of microprocessors for nonlinear structural analysis; and the role of interactive graphics systems in future analysis and design. This discussion will focus on the impact of new, inexpensive computer hardware on design and analysis methods.

  9. ADGEN: An automated adjoint code generator for large-scale sensitivity analysis

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.; Horwedel, J.E.; Lucius, J.L.

    1987-01-01

    This paper describes a new automated system, named ADGEN, which makes use of the strengths of computer calculus to automate the costly and time-consuming calculation of derivatives in FORTRAN computer codes and to automatically generate adjoint solutions of computer codes.
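
    ADGEN works by source-level transformation of FORTRAN to produce adjoint (reverse-mode) solutions, but the underlying idea of computer calculus, propagating derivatives mechanically alongside values, can be sketched compactly with forward-mode automatic differentiation via operator overloading. The Python sketch below is purely an illustration of that idea, not ADGEN's mechanism.

        class Dual:
            """Forward-mode automatic differentiation via dual numbers:
            carries a value and its derivative through arithmetic."""
            def __init__(self, val, dot=0.0):
                self.val, self.dot = val, dot
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.dot + o.dot)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val,
                            self.dot * o.val + self.val * o.dot)  # product rule
            __rmul__ = __mul__

        def f(x):
            return 3 * x * x + 2 * x + 1  # any code path built from +, * works

        x = Dual(2.0, 1.0)  # seed dx/dx = 1
        y = f(x)
        print(y.val, y.dot)  # 17.0 and f'(2) = 6*2 + 2 = 14.0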

  10. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well-defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software; it dynamically discovers and calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA g...

  11. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    This paper discusses the following: 1. The relationship of analysis to design. 2. New methods of analysis. 3. Improved finite elements. 4. The effect of the minicomputer on structural analysis methods. 5. The use of systems of microprocessors for nonlinear structural analysis. 6. The role of interactive graphics systems in future analysis and design. The discussion focuses on the impact of new inexpensive computer hardware on design and analysis methods. (Auth.)

  12. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  13. Quantitative DMS mapping for automated RNA secondary structure inference

    OpenAIRE

    Cordero, Pablo; Kladwang, Wipapat; VanLang, Christopher C.; Das, Rhiju

    2012-01-01

    For decades, dimethyl sulfate (DMS) mapping has informed manual modeling of RNA structure in vitro and in vivo. Here, we incorporate DMS data into automated secondary structure inference using a pseudo-energy framework developed for 2'-OH acylation (SHAPE) mapping. On six non-coding RNAs with crystallographic models, DMS-guided modeling achieves overall false negative and false discovery rates of 9.5% and 11.6%, comparable to or better than SHAPE-guided modeling; and non-parametric bootstrappin...
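
    The pseudo-energy framework referred to above converts per-nucleotide chemical reactivities into free-energy terms added during folding. The Python sketch below uses the logarithmic form and the default slope/intercept published for SHAPE by Deigan et al. (2009); applying those exact numbers to DMS data is our simplification for illustration, not the calibration used in the paper.

        import math

        def pseudo_energy(reactivity, m=2.6, b=-0.8):
            """Per-nucleotide pseudo-free-energy term added to base-pair
            stacks during secondary structure folding (kcal/mol). High
            chemical reactivity penalizes pairing; the log form follows
            the SHAPE pseudo-energy framework of Deigan et al. (2009)."""
            return m * math.log(reactivity + 1.0) + b

        # A reactive (likely unpaired) and an unreactive (likely paired) position:
        print(pseudo_energy(2.0))   # ~ +2.06 kcal/mol: discourages pairing
        print(pseudo_energy(0.05))  # ~ -0.67 kcal/mol: mild bonus for pairing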

  14. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tripartite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  15. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tripartite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  16. Automation of program model developing for complex structure control objects

    International Nuclear Information System (INIS)

    Ivanov, A.P.; Sizova, T.B.; Mikhejkina, N.D.; Sankovskij, G.A.; Tyufyagin, A.N.

    1991-01-01

    A brief description is given of software for the automated development of models: an integrating modular programming system, a program module generator and a program module library providing thermal-hydraulic calculation of process dynamics in power unit equipment components and simulation of on-line control system operation. Technical recommendations for model development are based on experience in the creation of concrete models of NPP power units. 8 refs., 1 tab., 4 figs

  17. A fast, automated, semideterministic weight windows generator for MCNP

    International Nuclear Information System (INIS)

    Mickael, M.W.

    1995-01-01

    A fast automated method is developed to estimate particle importance in the Los Alamos Monte Carlo code MCNP. It provides an automated and efficient way of predicting and setting up an importance map for the weight-windows technique. A short analog simulation is first performed to obtain effective group parameters based on the input description of the problem. A solution of the multigroup time-dependent adjoint diffusion equation is then used to estimate particle importance. At any point in space, time, and energy, the particle importance is determined from the calculated parameters and used as the lower limit of the weight window. The method has been tested for neutron, photon, and coupled neutron-photon problems. Significant improvement in simulation efficiency is obtained using this technique at no additional computer time and with no prior knowledge of the nature of the problem. Moreover, time and angular importances, which are not yet available in MCNP, are easily implemented in this method.
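
    The final step, turning an importance estimate into weight-window bounds, is simple arithmetic: the lower bound is taken inversely proportional to importance and normalized at the source. The Python sketch below illustrates this convention; the window width ratio and the normalization choice are typical values for illustration, not necessarily those of the method described.

        import numpy as np

        def weight_window_lower_bounds(importance, source_cell, ratio=5.0):
            """Set weight-window lower bounds inversely proportional to the
            estimated adjoint importance, normalized so that particles born
            in the source cell start inside the window with weight 1."""
            importance = np.asarray(importance, dtype=float)
            w_low = importance[source_cell] / importance
            w_low /= ratio ** 0.5        # center the window around weight 1
            return w_low, w_low * ratio  # (lower, upper) bounds per cell

        # Toy 1-D problem: importance rises toward a detector at the far end.
        imp = np.array([1.0, 2.5, 6.0, 15.0, 40.0])
        lo, hi = weight_window_lower_bounds(imp, source_cell=0)
        print(lo)  # bounds shrink in important regions -> particles split there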

  18. Automated hexahedral mesh generation from biomedical image data: applications in limb prosthetics.

    Science.gov (United States)

    Zachariah, S G; Sanders, J E; Turkiyyah, G M

    1996-06-01

    A general method to generate hexahedral meshes for finite element analysis of residual limbs and similar biomedical geometries is presented. The method utilizes skeleton-based subdivision of cross-sectional domains to produce simple subdomains in which structured meshes are easily generated. Application to a below-knee residual limb and external prosthetic socket is described. The residual limb was modeled as consisting of bones, soft tissue, and skin. The prosthetic socket model comprised a socket wall with an inner liner. The geometries of these structures were defined using axial cross-sectional contour data from X-ray computed tomography, optical scanning, and mechanical surface digitization. A tubular surface representation, using B-splines to define the directrix and generator, is shown to be convenient for definition of the structure geometries. Conversion of cross-sectional data to the compact tubular surface representation is direct, and the analytical representation simplifies geometric querying and numerical optimization within the mesh generation algorithms. The element meshes remain geometrically accurate since boundary nodes are constrained to lie on the tubular surfaces. Several element meshes of increasing mesh density were generated for two residual limbs and prosthetic sockets. Convergence testing demonstrated that approximately 19 elements are required along a circumference of the residual limb surface for a simple linear elastic model. A model with the fibula absent compared with the same geometry with the fibula present showed differences suggesting higher distal stresses in the absence of the fibula. Automated hexahedral mesh generation algorithms for sliced data represent an advancement in prosthetic stress analysis since they allow rapid modeling of any given residual limb and optimization of mesh parameters.

  19. Automated Localization of Multiple Pelvic Bone Structures on MRI.

    Science.gov (United States)

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2016-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are at present identified manually on MRI to locate reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures automatically. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this paper, we present a model that combines support vector machines and nonlinear regression, capturing global and local information, to automatically identify the bounding boxes of bone structures on MRI. The model identifies the location of the pelvic bone structures by establishing the association between their relative locations and using local information such as texture features. Results show that the proposed method is able to locate the bone structures of interest accurately (Dice similarity index >0.75) in 87-91% of the images. This research aims to enable accurate, consistent, and fully automated localization of bone structures on MRI to facilitate and improve the diagnosis of health conditions such as female POP.

  20. Automated, parallel mass spectrometry imaging and structural identification of lipids

    DEFF Research Database (Denmark)

    Ellis, Shane R.; Paine, Martin R.L.; Eijkel, Gert B.

    2018-01-01

    We report a method that enables automated data-dependent acquisition of lipid tandem mass spectrometry data in parallel with a high-resolution mass spectrometry imaging experiment. The method does not increase the total image acquisition time and is combined with automatic structural assignments. This lipidome-per-pixel approach automatically identified and validated 104 unique molecular lipids and their spatial locations from rat cerebellar tissue.

  1. Towards automated crystallographic structure refinement with phenix.refine

    OpenAIRE

    Afonine, Pavel V.; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W.; Mustyakimov, Marat; Terwilliger, Thomas C.; Urzhumtsev, Alexandre; Zwart, Peter H.; Adams, Paul D.

    2012-01-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An i...

  2. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    Science.gov (United States)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
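
    The essence of a single-frequency Gabor analysis can be sketched briefly. The Python fragment below is an illustration with a simplified max-based normalization, not the paper's normalization procedure: it builds a complex Gabor kernel, convolves it with a striated test image, and reports a response in [0, 1].

        import numpy as np
        from scipy.signal import fftconvolve

        def gabor_kernel(freq, theta, sigma, size=21):
            """Complex Gabor kernel: a plane wave of spatial frequency `freq`
            (cycles/pixel) along direction `theta`, under a Gaussian envelope."""
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
            return envelope * np.exp(2j * np.pi * freq * xr)

        def gabor_response(image, freq, theta, sigma=5.0):
            """Magnitude of the single-frequency Gabor response, normalized
            to [0, 1] by its maximum (a simplified stand-in for the paper's
            normalization procedure)."""
            mag = np.abs(fftconvolve(image, gabor_kernel(freq, theta, sigma),
                                     mode="same"))
            return mag / mag.max()

        # Toy "sarcomere" image: regular stripes with a period of 10 pixels.
        img = 1.0 + np.cos(2 * np.pi * np.arange(200) / 10.0)
        img = np.tile(img, (50, 1))
        resp = gabor_response(img, freq=0.1, theta=0.0)
        print(resp.mean())  # close to 1 for a perfectly ordered striation pattern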

  3. AUTOMATED LOW-COST PHOTOGRAMMETRY FOR FLEXIBLE STRUCTURE MONITORING

    Directory of Open Access Journals (Sweden)

    C. H. Wang

    2012-07-01

    Structural monitoring requires instruments which can provide high precision and accuracy, reliable measurements at good temporal resolution and rapid processing speeds. Long-term campaigns and flexible structures are regarded as two of the most challenging subjects in monitoring engineering structures. Long-term monitoring in civil engineering is generally considered to be labour-intensive and financially expensive, and it can take significant effort to arrange the necessary human resources, transportation and equipment maintenance. When dealing with flexible structure monitoring, it is of paramount importance that any monitoring equipment used is able to carry out rapid sampling. Low-cost, automated, photogrammetric techniques therefore have the potential to become routinely viable for monitoring non-rigid structures. This research aims to provide a photogrammetric solution for long-term flexible structural monitoring purposes. The automated approach was achieved using low-cost imaging devices (mobile phones) to replace traditional image acquisition stations and substantially reduce the equipment costs. A self-programmed software package was developed to deal with the hardware-software integration and system operation. In order to evaluate the performance of this low-cost monitoring system, a shaking table experiment was undertaken. Different network configurations and target sizes were used to determine the best configuration. A large quantity of image data was captured by four DSLR cameras and four mobile phone cameras respectively. These image data were processed using photogrammetric techniques to calculate the final results for the system evaluation.

  4. Genetic algorithms in teaching artificial intelligence (automated generation of specific algebras)

    Science.gov (United States)

    Habiballa, Hashim; Jendryscik, Radek

    2017-11-01

    Teaching essential Artificial Intelligence (AI) methods is an important task for an educator in the branch of soft computing. The key focus is often on proper understanding of the principles of AI methods in two essential respects - why we use soft-computing methods at all, and how we apply these methods to generate reasonable results in sensible time. We present an interesting problem solved in non-educational research concerning the automated generation of specific algebras in a huge search space. We emphasize the above-mentioned points through an educational case study of an interesting problem in the automated generation of specific algebras.

  5. Automated detection of structural alerts (chemical fragments) in (eco)toxicology

    Directory of Open Access Journals (Sweden)

    Ronan Bureau

    2013-02-01

    This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches of in silico toxicology, due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process of defining its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data.
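
    A minimal Python sketch of Emerging Pattern mining for structural alerts follows (our own illustration, with hypothetical fragment names): fragments are ranked by their growth rate, the ratio of relative frequencies between the toxic and non-toxic datasets, and fragments exceeding a growth threshold are flagged as candidate alerts.

        def emerging_patterns(toxic, nontoxic, min_growth=3.0):
            """Rank chemical fragments by growth rate: the ratio of a
            fragment's relative frequency in the toxic set to its relative
            frequency in the non-toxic set (Emerging Pattern mining)."""
            def freq(fragment, dataset):
                return sum(fragment in mol for mol in dataset) / len(dataset)
            fragments = set().union(*toxic, *nontoxic)
            alerts = []
            for frag in fragments:
                f_tox, f_non = freq(frag, toxic), freq(frag, nontoxic)
                growth = f_tox / f_non if f_non else float("inf")
                if growth >= min_growth:
                    alerts.append((frag, f_tox, growth))
            return sorted(alerts, key=lambda a: a[-1], reverse=True)

        # Toy datasets: molecules represented as sets of fragment identifiers.
        toxic = [{"nitro", "phenyl"}, {"nitro", "amine"}, {"epoxide"}]
        nontoxic = [{"phenyl"}, {"amine", "phenyl"}, {"hydroxyl"}]
        print(emerging_patterns(toxic, nontoxic))
        # 'nitro' and 'epoxide' emerge as candidate structural alerts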

  6. AUTOMATED DETECTION OF STRUCTURAL ALERTS (CHEMICAL FRAGMENTS) IN (ECO)TOXICOLOGY

    Directory of Open Access Journals (Sweden)

    Alban Lepailleur

    2013-02-01

    This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches of in silico toxicology, due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process of defining its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data.

  7. The Generator of the Event Structure Lexicon (GESL): Automatic Annotation of Event Structure for Textual Inference Tasks

    Science.gov (United States)

    Im, Seohyun

    2013-01-01

    This dissertation aims to develop the Generator of the Event Structure Lexicon (GESL) which is a tool to automate annotating the event structure of verbs in text to support textual inference tasks related to lexically entailed subevents. The output of the GESL is the Event Structure Lexicon (ESL), which is a lexicon of verbs in text which includes…

  8. The AUDANA algorithm for automated protein 3D structure determination from NMR NOE data

    International Nuclear Information System (INIS)

    Lee, Woonghee; Petit, Chad M.; Cornilescu, Gabriel; Stark, Jaime L.; Markley, John L.

    2016-01-01

    We introduce AUDANA (Automated Database-Assisted NOE Assignment), an algorithm for determining three-dimensional structures of proteins from NMR data that automates the assignment of 3D-NOE spectra, generates distance constraints, and conducts iterative high temperature molecular dynamics and simulated annealing. The protein sequence, chemical shift assignments, and NOE spectra are the only required inputs. Distance constraints generated automatically from ambiguously assigned NOE peaks are validated during the structure calculation against information from an enlarged version of the freely available PACSY database that incorporates information on protein structures deposited in the Protein Data Bank (PDB). This approach yields robust sets of distance constraints and 3D structures. We evaluated the performance of AUDANA with input data for 14 proteins ranging in size from 6 to 25 kDa that had 27–98 % sequence identity to proteins in the database. In all cases, the automatically calculated 3D structures passed stringent validation tests. Structures were determined with and without database support. In 9/14 cases, database support improved the agreement with manually determined structures in the PDB and in 11/14 cases, database support lowered the r.m.s.d. of the family of 20 structural models.

  9. The AUDANA algorithm for automated protein 3D structure determination from NMR NOE data

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison and Biochemistry Department (United States); Petit, Chad M. [University of Alabama at Birmingham, Department of Biochemistry and Molecular Genetics (United States); Cornilescu, Gabriel; Stark, Jaime L.; Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison and Biochemistry Department (United States)

    2016-06-15

    We introduce AUDANA (Automated Database-Assisted NOE Assignment), an algorithm for determining three-dimensional structures of proteins from NMR data that automates the assignment of 3D-NOE spectra, generates distance constraints, and conducts iterative high temperature molecular dynamics and simulated annealing. The protein sequence, chemical shift assignments, and NOE spectra are the only required inputs. Distance constraints generated automatically from ambiguously assigned NOE peaks are validated during the structure calculation against information from an enlarged version of the freely available PACSY database that incorporates information on protein structures deposited in the Protein Data Bank (PDB). This approach yields robust sets of distance constraints and 3D structures. We evaluated the performance of AUDANA with input data for 14 proteins ranging in size from 6 to 25 kDa that had 27–98 % sequence identity to proteins in the database. In all cases, the automatically calculated 3D structures passed stringent validation tests. Structures were determined with and without database support. In 9/14 cases, database support improved the agreement with manually determined structures in the PDB and in 11/14 cases, database support lowered the r.m.s.d. of the family of 20 structural models.

  10. Automated determination of fibrillar structures by simultaneous model building and fiber diffraction refinement.

    Science.gov (United States)

    Potrzebowski, Wojciech; André, Ingemar

    2015-07-01

    For highly oriented fibrillar molecules, three-dimensional structures can often be determined from X-ray fiber diffraction data. However, because of limited information content, structure determination and validation can be challenging. We demonstrate that automated structure determination of protein fibers can be achieved by guiding the building of macromolecular models with fiber diffraction data. We illustrate the power of our approach by determining the structures of six bacteriophage viruses de novo using fiber diffraction data alone and together with solid-state NMR data. Furthermore, we demonstrate the feasibility of molecular replacement from monomeric and fibrillar templates by solving the structure of a plant virus using homology modeling and protein-protein docking. The generated models explain the experimental data to the same degree as deposited reference structures but with improved structural quality. We also developed a cross-validation method for model selection. The results highlight the power of fiber diffraction data as structural constraints.

  11. Towards automated crystallographic structure refinement with phenix.refine

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V., E-mail: pafonine@lbl.gov; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Mustyakimov, Marat; Terwilliger, Thomas C. [Los Alamos National Laboratory, M888, Los Alamos, NM 87545 (United States); Urzhumtsev, Alexandre [CNRS–INSERM–UdS, 1 Rue Laurent Fries, BP 10142, 67404 Illkirch (France); Université Henri Poincaré, Nancy 1, BP 239, 54506 Vandoeuvre-lès-Nancy (France); Zwart, Peter H. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Adams, Paul D. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); University of California Berkeley, Berkeley, CA 94720 (United States)

    2012-04-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods.

  12. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    Nadinic, B.; Vanjak, Z.

    2004-01-01

    The INETEC Institute for Nuclear Technology developed a software package called EddyOne, which offers automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: general requirements for automated analysis of bobbin coil eddy current data; main approaches to automated analysis; multi-rule algorithms for data screening; landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); field experience with the EddyOne software; development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); automated analysis software qualification; and conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers and heat exchangers. Such results are then compared with results obtained by other automated software vendors, showing a clear advantage for the INETEC approach. It has to be pointed out that INETEC field experience was also collected on WWER steam generators, which is so far a unique experience. (author)

  13. Automated Diagnosis and Classification of Steam Generator Tube Defects

    International Nuclear Information System (INIS)

    Garcia, Gabe V.

    2004-01-01

    A major cause of failure in nuclear steam generators is tube degradation. Tube defects are divided into seven categories, one of which is intergranular attack/stress corrosion cracking (IGA/SCC). Defects of this type usually begin on the outer surface of the tubes and propagate both inward and laterally. In many cases these defects occur at or near the tube support plates. Several different methods exist for the nondestructive evaluation of nuclear steam generator tubes for defect characterization

  14. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model-driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  15. Automated Generation of Tabular Equations of State with Uncertainty Information

    Science.gov (United States)

    Carpenter, John H.; Robinson, Allen C.; Debusschere, Bert J.; Mattsson, Ann E.

    2015-06-01

    As computational science pushes toward higher fidelity prediction, understanding the uncertainty associated with closure models, such as the equation of state (EOS), has become a key focus. Traditional EOS development often involves a fair amount of art, where expert modelers may appear as magicians, providing what is felt to be the closest possible representation of the truth. Automation of the development process gives a means by which one may demystify the art of EOS, while simultaneously obtaining uncertainty information in a manner that is both quantifiable and reproducible. We describe our progress on the implementation of such a system to provide tabular EOS tables with uncertainty information to hydrocodes. Key challenges include encoding the artistic expert opinion into an algorithmic form and preserving the analytic models and uncertainty information in a manner that is both accurate and computationally efficient. Results are demonstrated on a multi-phase aluminum model. *Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  16. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Background: To search for chemical structures in research articles, diagrams or text representing molecules need to be translated into a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results: This paper aims to provide critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams in research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface (and the algorithm parameters can be readily changed) to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion: The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  17. Structured automated code checking through structural components and systems engineering

    NARCIS (Netherlands)

    Coenders, J.L.; Rolvink, A.

    2014-01-01

    This paper presents a proposal to employ the design computing methodology proposed as StructuralComponents (Rolvink et al [6] and van de Weerd et al [7]) as a method to perform a digital verification process to fulfil the requirements related to structural design and engineering as part of a

  18. Automated problem list generation and physicians' perspective from a pilot study.

    Science.gov (United States)

    Devarakonda, Murthy V; Mehta, Neil; Tsou, Ching-Huei; Liang, Jennifer J; Nowacki, Amy S; Jelovsek, John Eric

    2017-09-01

    An accurate, comprehensive and up-to-date problem list can help clinicians provide patient-centered care. Unfortunately, problem lists created and maintained in electronic health records by providers tend to be inaccurate, duplicative and out of date. With advances in machine learning and natural language processing, it is possible to automatically generate a problem list from the data in the EHR and keep it current. In this paper, we describe an automated problem list generation method and report on insights from a pilot study of physicians' assessment of the generated problem lists compared to existing provider-curated problem lists in an institution's EHR system. The natural language processing and machine learning based Watson method models clinical thinking in identifying a patient's problem list using clinical notes and structured data. This pilot study assessed the Watson method and included 15 randomly selected, de-identified patient records from a large healthcare system that were each planned to be reviewed by at least two internal medicine physicians. The physicians created their own problem lists, and then evaluated the overall usefulness of their own problem lists (P), Watson generated problem lists (W), and the existing EHR problem lists (E) on a 10-point scale. The primary outcome was pairwise comparisons of P, W, and E. Six out of the 10 invited physicians completed 27 assessments of P, W, and E, and in the process evaluated 732 Watson generated problems and 444 problems in the EHR system. As expected, physicians rated their own lists, P, highest. However, W was rated higher than E. In 89% of assessments, Watson identified at least one important problem that physicians missed. Cognitive computing systems like this Watson system hold the potential for accurate, problem-list-centered summarization of patient records, potentially leading to increased efficiency, better clinical decision support, and improved quality of patient care.

  19. Automated Layout Generation of Analogue and Mixed-Signal ASIC's

    DEFF Research Database (Denmark)

    Bloch, Rene

    search for better solutions can be guided into new and more prosperous areas of the search space. This feature also provides the designer with the ability to easily try out several implementation options, thus exploring the solution space, which are especially important in the early stages of the design...... is generated using a full-custom layout style and is based on a library of CMOS process independent device generators. The placement for the analogue circuit is derived using the interactive floorplan optimization algorithm described above. This ensures that a high degree of user control is implemented...

  20. Generative Representations for the Automated Design of Modular Physical Robots

    Science.gov (United States)

    Hornby, Gregory S.; Lipson, Hod; Pollack, Jordan B.

    2003-01-01

    We will begin with a brief background of evolutionary robotics and related work, and demonstrate the scaling problem with our own prior results. Next we propose the use of an evolved generative representation as opposed to a non-generative representation. We describe this representation in detail as well as the evolutionary process that uses it. We then compare progress of evolved robots with and without the use of the grammar, and quantify the obtained advantage. Working two-dimensional and three-dimensional physical robots produced by the system are shown.

  1. Automated generation of formal safety conditions from railway interlocking tables

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2014-01-01

    This paper describes a tool for extracting formal safety conditions from interlocking tables for railway interlocking systems. The tool has been applied to generate safety conditions for the interlocking system at Stenstrup station in Denmark, and the SAL model checker tool has been used to check...

  2. Verification Test of Automated Robotic Assembly of Space Truss Structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1995-01-01

    A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level and continued development at an enhanced level is warranted.

  3. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  4. Automated generation of partial Markov chain from high level descriptions

    International Nuclear Information System (INIS)

    Brameret, P.-A.; Rauzy, A.; Roussel, J.-M.

    2015-01-01

    We propose an algorithm to generate partial Markov chains from high-level implicit descriptions, namely AltaRica models. This algorithm relies on two components. First, a variation on Dijkstra's algorithm to compute shortest paths in a graph. Second, the definition of a notion of distance to select which states must be kept and which can be safely discarded. The proposed method solves two problems at once. First, it avoids a manual construction of Markov chains, which is both tedious and error-prone. Second, at the price of acceptable approximations, it makes it possible to push back dramatically the exponential blow-up of the size of the resulting chains. We report experimental results that show the efficiency of the proposed approach. - Highlights: • We generate Markov chains from a higher-level safety modeling language (AltaRica). • We use a variation on Dijkstra's algorithm to generate partial Markov chains. • Hence we solve two problems: the first problem is the tedious manual construction of Markov chains. • The second problem is the blow-up of the size of the chains, at the cost of decent approximations. • The experimental results highlight the efficiency of the method
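
    The distance-based truncation described above can be sketched in a few lines of Python. This is a minimal illustration of the idea only, not the authors' AltaRica-based implementation; the `successors` callback, the cost model, and the single aggregated "DISCARDED" state are assumptions made for the example.

```python
import heapq

def partial_chain(initial, successors, max_distance):
    """Explore a chain Dijkstra-style, keeping states within max_distance.

    `successors(state)` is a hypothetical callback yielding
    (next_state, transition_rate, step_cost) triples derived from a
    high-level model; states beyond the cutoff are redirected to a
    single absorbing "DISCARDED" state.
    """
    dist = {initial: 0.0}
    transitions = []                      # (src, dst, rate) of the kept chain
    heap = [(0.0, initial)]
    while heap:
        d, state = heapq.heappop(heap)
        if d > dist.get(state, float("inf")):
            continue                      # stale heap entry, already settled
        for nxt, rate, cost in successors(state):
            nd = d + cost
            if nd > max_distance:
                transitions.append((state, "DISCARDED", rate))
                continue
            transitions.append((state, nxt, rate))
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return dist, transitions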

  5. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include complex signal analysis, a massive amount of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method of addressing these challenges is to incorporate automation into the data analysis process. Specific advantages that automated data analysis has the potential to provide include the ability to analyze data more quickly, consistently and accurately than can be achieved manually. Automated data analysis can also potentially perform the data analysis function with significantly smaller analyst staffing levels. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, at both the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided, which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case-based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  6. Substructure analysis techniques and automation. [to eliminate logistical data handling and generation chores

    Science.gov (United States)

    Hennrich, C. W.; Konrath, E. J., Jr.

    1973-01-01

    A basic automated substructure analysis capability for NASTRAN is presented which eliminates most of the logistical data handling and generation chores that are currently associated with the method. Rigid formats are proposed which will accomplish this using three new modules, all of which can be added to level 16 with a relatively small effort.

  7. Automated Report Generation for Research Data Repositories: From i2b2 to PDF.

    Science.gov (United States)

    Thiemann, Volker S; Xu, Tingyan; Röhrig, Rainer; Majeed, Raphael W

    2017-01-01

    We developed an automated toolchain to generate reports of i2b2 data. It is based on free open source software and runs on a Java Application Server. It is successfully used in an ED registry project. The solution is highly configurable and portable to other projects based on i2b2 or compatible factual data sources.

  8. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    Science.gov (United States)

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
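
    As a rough illustration of the non-structured branch of such a pipeline, the sketch below fits a Gaussian Mixture Model to per-voxel feature vectors with scikit-learn. The input array is a random stand-in for real multiparametric MR features, and the number of tissue classes is an assumption for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical multiparametric MR input: one feature vector per voxel,
# e.g. intensities from several co-registered MR sequences.
voxels = np.random.rand(10000, 4)          # stand-in for real MR features

# Fit an unsupervised GMM and assign each voxel to a tissue class.
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
labels = gmm.fit_predict(voxels)

# Posterior probabilities can then be combined with tissue probability
# maps in a postprocessing step to single out the tumour classes.
posteriors = gmm.predict_proba(voxels)
```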

  9. Automating the generation of finite element dynamical cores with Firedrake

    Science.gov (United States)

    Ham, David; Mitchell, Lawrence; Homolya, Miklós; Luporini, Fabio; Gibson, Thomas; Kelly, Paul; Cotter, Colin; Lange, Michael; Kramer, Stephan; Shipton, Jemma; Yamazaki, Hiroe; Paganini, Alberto; Kärnä, Tuomas

    2017-04-01

    The development of a dynamical core is an increasingly complex software engineering undertaking. As the equations become more complete, the discretisations more sophisticated and the hardware acquires ever more fine-grained parallelism and deeper memory hierarchies, the problem of building, testing and modifying dynamical cores becomes increasingly complex. Here we present Firedrake, a code generation system for the finite element method with specialist features designed to support the creation of geoscientific models. Using Firedrake, the dynamical core developer writes the partial differential equations in weak form in a high level mathematical notation. Appropriate function spaces are chosen and time stepping loops written at the same high level. When the programme is run, Firedrake generates high performance C code for the resulting numerics which are executed in parallel. Models in Firedrake typically take a tiny fraction of the lines of code required by traditional hand-coding techniques. They support more sophisticated numerics than are easily achieved by hand, and the resulting code is frequently higher performance. Critically, debugging, modifying and extending a model written in Firedrake is vastly easier than by traditional methods due to the small, highly mathematical code base. Firedrake supports a wide range of key features for dynamical core creation: A vast range of discretisations, including both continuous and discontinuous spaces and mimetic (C-grid-like) elements which optimally represent force balances in geophysical flows. High aspect ratio layered meshes suitable for ocean and atmosphere domains. Curved elements for high accuracy representations of the sphere. Support for non-finite element operators, such as parametrisations. Access to PETSc, a world-leading library of programmable linear and nonlinear solvers. High performance adjoint models generated automatically by symbolically reasoning about the forward model. This poster will present
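
    A flavour of the high-level notation can be given with a generic Firedrake-style example. The snippet below solves a simple Poisson problem in weak form; it is an illustrative sketch of the programming model, not part of any particular dynamical core.

```python
from firedrake import *

# Weak form of a Poisson problem written at the same high level as the
# mathematics; Firedrake generates and executes the parallel C code.
mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(V)
v = TestFunction(V)
f = Function(V).interpolate(Constant(1.0))   # source term

a = inner(grad(u), grad(v)) * dx             # bilinear form
L = f * v * dx                               # linear form

bc = DirichletBC(V, 0.0, "on_boundary")
solution = Function(V)
solve(a == L, solution, bcs=bc)
```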

  10. An automated framework for hypotheses generation using literature

    Directory of Open Access Journals (Sweden)

    Abedi Vida

    2012-08-01

    Background: In bio-medicine, exploratory studies and hypothesis generation often begin with researching existing literature to identify a set of factors and their association with diseases, phenotypes, or biological processes. Many scientists are overwhelmed by the sheer volume of literature on a disease when they plan to generate a new hypothesis or study a biological phenomenon. The situation is even worse for junior investigators who often find it difficult to formulate new hypotheses or, more importantly, corroborate if their hypothesis is consistent with existing literature. It is a daunting task to be abreast with so much being published and also remember all combinations of direct and indirect associations. Fortunately there is a growing trend of using literature mining and knowledge discovery tools in biomedical research. However, there is still a large gap between the huge amount of effort and resources invested in disease research and the little effort in harvesting the published knowledge. The proposed hypothesis generation framework (HGF) finds “crisp semantic associations” among entities of interest - that is a step towards bridging such gaps. Methodology: The proposed HGF shares similar end goals with SWAN but is more holistic in nature and was designed and implemented using scalable and efficient computational models of disease-disease interaction. The integration of mapping ontologies with latent semantic analysis is critical in capturing domain-specific direct and indirect “crisp” associations, and making assertions about entities (such as disease X is associated with a set of factors Z). Results: Pilot studies were performed using two diseases. A comparative analysis of the computed “associations” and “assertions” with curated expert knowledge was performed to validate the results. It was observed that the HGF is able to capture “crisp” direct and indirect associations, and provide knowledge

  11. An automated framework for hypotheses generation using literature.

    Science.gov (United States)

    Abedi, Vida; Zand, Ramin; Yeasin, Mohammed; Faisal, Fazle Elahi

    2012-08-29

    In bio-medicine, exploratory studies and hypothesis generation often begin with researching existing literature to identify a set of factors and their association with diseases, phenotypes, or biological processes. Many scientists are overwhelmed by the sheer volume of literature on a disease when they plan to generate a new hypothesis or study a biological phenomenon. The situation is even worse for junior investigators who often find it difficult to formulate new hypotheses or, more importantly, corroborate if their hypothesis is consistent with existing literature. It is a daunting task to be abreast with so much being published and also remember all combinations of direct and indirect associations. Fortunately there is a growing trend of using literature mining and knowledge discovery tools in biomedical research. However, there is still a large gap between the huge amount of effort and resources invested in disease research and the little effort in harvesting the published knowledge. The proposed hypothesis generation framework (HGF) finds "crisp semantic associations" among entities of interest - that is a step towards bridging such gaps. The proposed HGF shares similar end goals with SWAN but is more holistic in nature and was designed and implemented using scalable and efficient computational models of disease-disease interaction. The integration of mapping ontologies with latent semantic analysis is critical in capturing domain specific direct and indirect "crisp" associations, and making assertions about entities (such as disease X is associated with a set of factors Z). Pilot studies were performed using two diseases. A comparative analysis of the computed "associations" and "assertions" with curated expert knowledge was performed to validate the results. It was observed that the HGF is able to capture "crisp" direct and indirect associations, and provide knowledge discovery on demand. The proposed framework is fast, efficient, and robust in
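
    A minimal sketch of the latent-semantic-analysis step that underlies this kind of association mining, using scikit-learn; the mini-corpus, entity names, and the two-dimensional latent space are hypothetical placeholders, not the framework's actual data or configuration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-corpus: one document per entity (e.g. per disease),
# built by concatenating abstracts mentioning that entity.
docs = {
    "disease_X": "oxidative stress inflammation insulin resistance",
    "disease_Y": "inflammation cytokine response vascular damage",
    "factor_Z":  "insulin resistance glucose metabolism",
}

tfidf = TfidfVectorizer().fit_transform(docs.values())
lsa = TruncatedSVD(n_components=2).fit_transform(tfidf)  # latent space

# Cosine similarity in the latent space approximates the strength of
# direct and indirect associations between entities.
sim = cosine_similarity(lsa)
for name, row in zip(docs, sim):
    print(name, [f"{s:.2f}" for s in row])
```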

  12. Automated Testcase Generation for Numerical Support Functions in Embedded Systems

    Science.gov (United States)

    Schumann, Johann; Schnieder, Stefan-Alexander

    2014-01-01

    We present a tool for the automatic generation of test stimuli for small numerical support functions, e.g., code for trigonometric functions, quaternions, filters, or table lookup. Our tool is based on KLEE to produce a set of test stimuli for full path coverage. We use a method of iterative deepening over abstractions to deal with floating-point values. During actual testing the stimuli exercise the code against a reference implementation. We illustrate our approach with results of experiments with low-level trigonometric functions, interpolation routines, and mathematical support functions from an open source UAS autopilot.
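
    The sketch below illustrates the general idea of exercising a numerical support function against a trusted reference with boundary-oriented stimuli. It does not use KLEE; the stimulus set, the approximation under test, and the tolerance are assumptions for the example (Python 3.9+ is assumed for `math.nextafter`).

```python
import math

def float_stimuli():
    """Boundary-oriented test stimuli for a floating-point function:
    special values, powers of two, and values one ulp away from them."""
    for x in (0.0, -0.0, 1.0, -1.0, math.pi, math.inf, -math.inf, math.nan):
        yield x
    for e in range(-5, 6):
        base = 2.0 ** e
        yield base
        yield math.nextafter(base, math.inf)    # one ulp above
        yield math.nextafter(base, -math.inf)   # one ulp below

def sin_approx(x):
    """Stand-in for the embedded numerical support function under test."""
    return x - x**3 / 6 + x**5 / 120

# Exercise the implementation against a trusted reference.
for x in float_stimuli():
    if math.isfinite(x):
        err = abs(sin_approx(x) - math.sin(x))
        assert err < 1e-2 or abs(x) > 1.0, (x, err)
```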

  13. ADVANTG An Automated Variance Reduction Parameter Generator, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, Scott W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevill, Aaron M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ibrahim, Ahmad M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daily, Charles R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wagner, John C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Jeffrey O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Grove, Robert E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-01

    The primary objective of ADVANTG is to reduce both the user effort and the computational time required to obtain accurate and precise tally estimates across a broad range of challenging transport applications. ADVANTG has been applied to simulations of real-world radiation shielding, detection, and neutron activation problems. Examples of shielding applications include material damage and dose rate analyses of the Oak Ridge National Laboratory (ORNL) Spallation Neutron Source and High Flux Isotope Reactor (Risner and Blakeman 2013) and the ITER Tokamak (Ibrahim et al. 2011). ADVANTG has been applied to a suite of radiation detection, safeguards, and special nuclear material movement detection test problems (Shaver et al. 2011). ADVANTG has also been used in the prediction of activation rates within light water reactor facilities (Pantelias and Mosher 2013). In these projects, ADVANTG was demonstrated to significantly increase the tally figure of merit (FOM) relative to an analog MCNP simulation. The ADVANTG-generated parameters were also shown to be more effective than manually generated geometry splitting parameters.

  14. Automated problem generation in Learning Management Systems: a tutorial

    Directory of Open Access Journals (Sweden)

    Jaime Romero

    2016-07-01

    The benefits of problem solving have been widely acknowledged in the literature. Its implementation in e-learning platforms can ease both its management and the learning process itself. However, its implementation can also become a very time-consuming task, particularly when the number of problems to generate is high. In this tutorial we describe a methodology that we have developed to alleviate the workload of producing a large number of problems in Moodle for an undergraduate business course. The methodology follows a six-step process, allows evaluating students' problem-solving skills, minimizes plagiarism, and provides immediate feedback. We expect this tutorial to encourage other educators to apply our six-step process, thus benefiting themselves and their students.

  15. Fully Automated Volumetric Modulated Arc Therapy Plan Generation for Prostate Cancer Patients

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Al-Mamgani, Abrahim; Incrocci, Luca; Heijmen, Ben J.M.

    2014-01-01

    Purpose: To develop and evaluate fully automated volumetric modulated arc therapy (VMAT) treatment planning for prostate cancer patients, avoiding manual trial-and-error tweaking of plan parameters by dosimetrists. Methods and Materials: A system was developed for fully automated generation of VMAT plans with our commercial clinical treatment planning system (TPS), linked to the in-house developed Erasmus-iCycle multicriterial optimizer for preoptimization. For 30 randomly selected patients, automatically generated VMAT plans (VMAT_auto) were compared with VMAT plans generated manually by 1 expert dosimetrist in the absence of time pressure (VMAT_man). For all treatment plans, planning target volume (PTV) coverage and sparing of organs-at-risk were quantified. Results: All generated plans were clinically acceptable and had similar PTV coverage (V95% > 99%). For VMAT_auto and VMAT_man plans, the organ-at-risk sparing was similar as well, although only the former plans were generated without any planning workload. Conclusions: Fully automated generation of high-quality VMAT plans for prostate cancer patients is feasible and has recently been implemented in our clinic

  16. Automation of the process of generation of the students insurance, applying RFID and GPRS technologies

    Directory of Open Access Journals (Sweden)

    Nelson Barrera-Lombana

    2013-07-01

    Full Text Available This article presents the description of the design and implementation of a system which allows the fulfilment of a consultation service on various parameters to a web server using a GSM modem, exchanging information systems over the Internet (ISS and radio-frequency identification (RFID. The application validates for its use in automation of the process of generation of the student insurance, and hardware and software, developed by the Research Group in Robotics and Industrial Automation GIRAof UPTC, are used as a platform.

  17. Semi-Automated Discovery of Application Session Structure

    Energy Technology Data Exchange (ETDEWEB)

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.

  18. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research, however are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right as it is a time consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third-party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  19. Next generation structural silicone glazing

    Directory of Open Access Journals (Sweden)

    Charles D. Clift

    2015-06-01

    This paper presents an advanced engineering evaluation, using nonlinear analysis of hyperelastic material, that provides significant improvement to structural silicone glazing (SSG) design in high-performance curtain wall systems. Very high cladding wind pressures required in hurricane zones often result in bulky SSG profile dimensions. The architectural desire for aesthetically slender curtain wall framing sight-lines, in combination with a desire to reduce aluminium usage, led to optimization of silicone material geometry for better stress distribution. To accomplish accurate simulation of predicted behaviour under structural load, robust stress-strain curves of the silicone material are essential. The silicone manufacturer provided physical property testing via a specialized laboratory protocol. A series of rigorous curve-fit techniques were then applied to closely model the test data in the finite element computer analysis that accounts for nonlinear strain of hyperelastic silicone. Comparison of this advanced design technique with traditional SSG design highlights differences in stress distribution contours in the silicone material. Simplified structural engineering per the traditional SSG design method does not provide accurate forecasting of material and stress optimization as shown in the advanced design. Full-scale specimens were subjected to structural load testing to verify the design capacity, not only for high wind pressure values, but also for debris impact per ASTM E1886 and ASTM E1996. Also, construction of the test specimens allowed development of SSG installation techniques necessitated by the unique geometry of the silicone profile. Finally, correlation of physical test results with theoretical simulations is made, so that evaluation of design confidence is possible. This design technique will introduce significant engineering advancement to the curtain wall industry.
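
    As a sketch of the curve-fitting step, the snippet below fits a common two-parameter Mooney-Rivlin hyperelastic model to uniaxial test data with SciPy. The material model choice and the synthetic data are assumptions for illustration, not the paper's actual laboratory protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def mooney_rivlin_uniaxial(stretch, c10, c01):
    """Engineering stress for an incompressible Mooney-Rivlin solid
    under uniaxial tension: P = 2*(L - L**-2)*(C10 + C01/L)."""
    return 2.0 * (stretch - stretch**-2) * (c10 + c01 / stretch)

# Hypothetical uniaxial stress-stretch data from a silicone test.
stretch = np.linspace(1.05, 2.0, 12)
stress = mooney_rivlin_uniaxial(stretch, 0.35, 0.05)
stress += np.random.normal(scale=0.005, size=stress.size)  # measurement noise

(c10, c01), _ = curve_fit(mooney_rivlin_uniaxial, stretch, stress, p0=(0.1, 0.1))
print(f"fitted C10={c10:.3f} MPa, C01={c01:.3f} MPa")
```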

  20. Validation of Fully Automated VMAT Plan Generation for Library-Based Plan-of-the-Day Cervical Cancer Radiotherapy

    OpenAIRE

    Sharfo, Abdul Wahab M.; Breedveld, Sebastiaan; Voet, Peter W. J.; Heijkoop, Sabrina T.; Mens, Jan-Willem M.; Hoogeman, Mischa S.; Heijmen, Ben J. M.

    2016-01-01

    Purpose: To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods: Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-be...

  1. Automated analysis technique developed for detection of ODSCC on the tubes of OPR1000 steam generator

    International Nuclear Information System (INIS)

    Kim, In Chul; Nam, Min Woo

    2013-01-01

    A steam generator (SG) tube is an important component of a nuclear power plant (NPP). It works as a pressure boundary between the primary and secondary systems. The integrity of an SG tube can be assessed by an eddy current test at every outage. The eddy current technique (adopting a bobbin probe) is currently the main technique used to assess the integrity of the tubing of a steam generator. An eddy current signal analyst for steam generator tubes continuously analyzes data over a given period of time. However, there is a possibility that the analyst conducting the test may tire and make mistakes, such as missing indications or not being able to separate a true defect signal from one that is more complicated. Such errors could lead to confusion and an improper interpretation of the signal analysis. In order to avoid these possibilities, many countries have opted for automated analyses. Axial ODSCC (outside diameter stress corrosion cracking) defects on the tubes of OPR1000 steam generators have been found on the tubes that are in contact with tube support plates. In this study, automated analysis software called CDS (computer data screening), made by Zetec, was used. This paper will discuss the results of introducing an automated analysis system for axial ODSCC on the tubes of an OPR1000 steam generator.

  2. M073: Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2004-01-01

    A quality check for an automated system of analyzing large sets of neutron activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra are also visually reviewed to check the peak fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8, pulse height, tally with a detector model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix that was similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to expected results for QA/QC assurance.

  3. Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2007-01-01

    A quality check for an automated system of analyzing large sets of neutron activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra are also visually reviewed to check the peak fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8, pulse-height, tally with a detector model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix that was similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to expected results for QA/QC assurance. (author)

  4. Development of automated generation system of accidental operating procedures for a PWR

    International Nuclear Information System (INIS)

    Artaud, J.L.

    1991-06-01

    The aim of the ACACIA project is to develop an automated generation system for accident operating procedures for a PWR. This research and development study, conducted jointly by CEA and EDF, has two objectives: in the medium term, the realization of a validation tool and procedure generation; in the long term, the dynamic generation of real-time procedures. This work is devoted to the realization of two prototypes. These prototypes and the techniques used are described in detail. The last chapter explores the perspectives offered by this type of tool. [fr]

  5. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    Science.gov (United States)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  6. The development of an automated sentence generator for the assessment of reading speed

    Directory of Open Access Journals (Sweden)

    Legge Gordon E

    2008-03-01

    Reading speed is an important outcome measure for many studies in neuroscience and psychology. Conventional reading speed tests have a limited corpus of sentences and usually require observers to read sentences aloud. Here we describe an automated sentence generator which can create over 100,000 unique sentences, scored using a true/false response. We propose that an estimate of the minimum exposure time required for observers to categorise the truth of such sentences is a good alternative to reading speed measures that guarantees comprehension of the printed material. Removing one word from the sentence reduces performance to chance, indicating minimal redundancy. Reading speed assessed using rapid serial visual presentation (RSVP) of these sentences is not statistically different from using MNREAD sentences. The automated sentence generator would be useful for measuring reading speed with button-press response (such as within MRI scanners) and for studies requiring many repeated measures of reading speed.
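
    A toy version of the idea, assuming hypothetical word lists (the published generator uses a controlled vocabulary matched for length and frequency), might look like this: half the generated sentences pair a subject with another subject's predicate, making them categorically false.

```python
import random

# Hypothetical word lists; placeholders for the controlled vocabulary.
SUBJECTS = ["dogs", "cars", "apples", "rivers"]
TRUE_FACTS = {"dogs": "bark", "cars": "have wheels",
              "apples": "grow on trees", "rivers": "flow downhill"}

def make_sentence(rng):
    """Return (sentence, is_true); false sentences swap in a predicate
    belonging to a different subject."""
    subject = rng.choice(SUBJECTS)
    if rng.random() < 0.5:
        return f"{subject} {TRUE_FACTS[subject]}", True
    wrong = rng.choice([s for s in SUBJECTS if s != subject])
    return f"{subject} {TRUE_FACTS[wrong]}", False

rng = random.Random(42)
for _ in range(4):
    print(make_sentence(rng))
```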

  7. Fluorescence In Situ Hybridization (FISH Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is inefficient and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, comprising a stack of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate the non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and map the FISH-probed signals recorded in the multiple imaging slices into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between the normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently visually detected by an observer. The Kappa coefficients for agreement between CAD and observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in four testing samples. The study demonstrated the feasibility of automated FISH signal analysis by applying a CAD scheme to the automatically generated 2-D projection images.
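
    A minimal sketch of the projection and top-hat steps using scikit-image; the random stack, footprint radius, and detection threshold are assumptions for illustration (recent scikit-image versions use the `footprint` keyword for structuring elements).

```python
import numpy as np
from skimage.morphology import white_tophat, disk

# Hypothetical image stack: z-slices of one FISH channel (z, y, x).
stack = np.random.rand(20, 512, 512)

# Collapse the 3-D stack into a 2-D projection image (maximum intensity),
# mirroring the automated projection step described above.
projection = stack.max(axis=0)

# A white top-hat transform keeps small bright features (probe spots)
# while suppressing the larger-scale cell background.
spots = white_tophat(projection, footprint=disk(5))
signal_mask = spots > spots.mean() + 3 * spots.std()
print("candidate FISH spot pixels:", int(signal_mask.sum()))
```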

  8. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    Science.gov (United States)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out which will benefit the entire Structural Biology community.

  9. Automation of eddy current system for in-service inspection of turbine and generator bores

    International Nuclear Information System (INIS)

    Viertl, J.R.M.

    1988-01-01

    The most commonly applied inspection method for ferromagnetic turbine and generator rotor bores is the magnetic particle test technique. This method is subjective, depends on the test operator's skill and diligence in identifying test indications, and suffers from poor repeatability, especially for small indications. Automation would improve repeatability. However, magnetic particle tests are not easily automated, because the data are in the form of sketches, photographs, and written and oral descriptions of the indications. Eddy current inspection has obvious potential to replace magnetic particle methods in this application. Eddy current tests can be readily automated, as the data are in the form of voltages that can be recorded, digitized, and manipulated by a computer. The current project continues the investigation of the correlation between eddy current and magnetic particle inspection. Two systems have been combined to acquire eddy current data automatically. This combination of systems consists of the Nortec-25L Eddyscope (to provide the analog eddy current signals) and the General Electric DATAQ (TM) System (to perform the automatic data acquisition). The automation of the system is discussed

  10. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite, resulting in 97% and 99% success rates for the client and server implementation, respectively. The tests show...

  11. Automated importance generation and biasing techniques for Monte Carlo shielding techniques by the TRIPOLI-3 code

    International Nuclear Information System (INIS)

    Both, J.P.; Nimal, J.C.; Vergnaud, T.

    1990-01-01

    We discuss an automated biasing procedure for generating the parameters necessary to achieve efficient biased Monte Carlo shielding calculations. The biasing techniques considered here are the exponential transform and collision biasing, deriving from the concept of the biased game based on the importance function. We use a simple model of the importance function with exponential attenuation as the distance to the detector increases. This importance function is generated on a three-dimensional mesh that includes the geometry, using graph-theory algorithms. This scheme is currently being implemented in the third version of the neutron and gamma ray transport code TRIPOLI-3. (author)

  12. Consistency and accuracy of diagnostic cancer codes generated by automated registration: comparison with manual registration

    Directory of Open Access Journals (Sweden)

    Codazzi Tiziana

    2006-09-01

    Background: Automated procedures are increasingly used in cancer registration, and it is important that the data produced are systematically checked for consistency and accuracy. We evaluated an automated procedure for cancer registration adopted by the Lombardy Cancer Registry in 1997, comparing automatically-generated diagnostic codes with those produced manually over one year (1997). Methods: The automatically generated cancer cases were produced by Open Registry algorithms. For manual registration, trained staff consulted clinical records, pathology reports and death certificates. The social security code, present and checked in both databases in all cases, was used to match the files in the automatic and manual databases. The cancer cases generated by the two methods were compared by manual revision. Results: The automated procedure generated 5027 cases: 2959 (59%) were accepted automatically and 2068 (41%) were flagged for manual checking. Among the cases accepted automatically, discrepancies in data items (surname, first name, sex and date of birth) constituted 8.5% of cases, and discrepancies in the first three digits of the ICD-9 code constituted 1.6%. Among flagged cases, cancers of the female genital tract, hematopoietic system, metastatic and ill-defined sites, and oropharynx predominated. The usual reasons were use of specific vs. generic codes, presence of multiple primaries, and use of extranodal vs. nodal codes for lymphomas. The percentage of automatically accepted cases ranged from 83% for breast and thyroid cancers to 13% for metastatic and ill-defined cancer sites. Conclusion: Since 59% of cases were accepted automatically and contained relatively few, mostly trivial discrepancies, the automatic procedure is efficient for routine case generation, effectively cutting the workload required for routine case checking by this amount. Among cases not accepted automatically, discrepancies were mainly due to variations in coding practice.
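
    The matching-and-comparison step can be sketched as follows, with matching on the always-present social security code and comparison of the first three ICD-9 digits as described above; the field names and records are hypothetical.

```python
# Records keyed by social security code, one dict per registry.
auto_db = {"SSC001": {"icd9": "174.9"}, "SSC002": {"icd9": "162.3"}}
manual_db = {"SSC001": {"icd9": "174.2"}, "SSC002": {"icd9": "162.3"}}

agree = flagged = 0
for ssc, auto_rec in auto_db.items():
    manual_rec = manual_db.get(ssc)
    if manual_rec is None:
        continue  # unmatched record: handled separately
    # Compare the first three digits of the ICD-9 code, as in the study.
    if auto_rec["icd9"][:3] == manual_rec["icd9"][:3]:
        agree += 1
    else:
        flagged += 1
print(f"matching 3-digit codes: {agree}, discrepancies: {flagged}")
```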

  13. Automated Monte Carlo biasing for photon-generated electrons near surfaces.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Crawford, Martin James; Kensek, Ronald Patrick

    2009-09-01

    This report describes efforts to automate the biasing of coupled electron-photon Monte Carlo particle transport calculations. The approach was based on weight-windows biasing. Weight-window settings were determined using adjoint-flux Monte Carlo calculations. A variety of algorithms were investigated for adaptivity of the Monte Carlo tallies. Tree data structures were used to investigate spatial partitioning. Functional-expansion tallies were used to investigate higher-order spatial representations.
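
    A minimal sketch of deriving weight-window centers from an adjoint-flux estimate. The inverse-flux rule, source-region normalization, and window-width ratio below are common conventions assumed for illustration, not necessarily the report's exact prescription.

```python
import numpy as np

# Hypothetical adjoint flux estimated on a spatial mesh by a prior
# adjoint Monte Carlo run (larger values = more important regions).
adjoint_flux = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0])

# Weight-window centers taken inversely proportional to the adjoint
# flux, normalized so the source region (first cell) has unit weight.
centers = adjoint_flux[0] / adjoint_flux

ratio = 5.0                      # assumed upper/lower window width
lower = centers / np.sqrt(ratio)
upper = centers * np.sqrt(ratio)
for i, (lo, c, hi) in enumerate(zip(lower, centers, upper)):
    print(f"cell {i}: window [{lo:.2e}, {hi:.2e}], survival weight {c:.2e}")
```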

  14. A Solution Generator Algorithm for Decision Making based Automated Negotiation in the Construction Domain

    Directory of Open Access Journals (Sweden)

    Arazi Idrus

    2017-12-01

    In this paper, we present our work in progress on a proposed framework for automated negotiation in the construction domain. The proposed framework enables software agents to conduct negotiations and autonomously make value-based decisions. The framework consists of three main components: a solution generator algorithm, a negotiation algorithm, and a conflict resolution algorithm. This paper extends the discussion of the solution generator algorithm, which enables software agents to generate solutions and rank them from 1st to nth for the negotiation stage of the operation. The solution generator algorithm consists of three steps: review solutions, rank solutions, and form ranked solutions. For validation purposes, we present a scenario that utilizes the proposed algorithm to rank solutions. The validation shows that the algorithm is promising; however, it also highlights conflicts between different parties that need further negotiation.
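
    A toy illustration of the "rank solutions" step, assuming a weighted-sum value model; the attributes, weights, and candidate solutions are hypothetical, not the paper's actual criteria.

```python
# Candidate solutions are scored by a weighted sum of value criteria
# (negative weights penalize cost and duration), then ranked 1st..nth.
WEIGHTS = {"cost": -0.5, "duration": -0.3, "quality": 0.2}

solutions = [
    {"name": "A", "cost": 100, "duration": 30, "quality": 8},
    {"name": "B", "cost": 80,  "duration": 45, "quality": 6},
    {"name": "C", "cost": 120, "duration": 25, "quality": 9},
]

def score(sol):
    return sum(w * sol[attr] for attr, w in WEIGHTS.items())

ranked = sorted(solutions, key=score, reverse=True)
for rank, sol in enumerate(ranked, start=1):
    print(rank, sol["name"], round(score(sol), 1))
```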

  15. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  16. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually carried out with API tools, which allow building original software for aiding different engineering activities. In this paper the original software worked out to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in the standard tools of specialized CAD systems. This comes from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear, and the base circle respectively. In the Generator module the involute curve is drawn through 11 involute points, which are located on and above the base and addendum circles; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, so that the gear wheel modelling time is reduced to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from the analysis are presented in detail.
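
    The 11-point involute sampling can be sketched directly from the parametric involute equations x = r_b(cos t + t sin t), y = r_b(sin t - t cos t), where t is the roll angle and r_b the base radius. The sampling range extending slightly past the addendum circle and the example radii below are assumptions for illustration.

```python
import numpy as np

def involute_points(base_radius, addendum_radius, n_points=11):
    """Sample points on an involute of a circle, from the base circle up
    to (and slightly beyond) the addendum circle, as an alternative to
    the common 3-point approximation."""
    # Roll angle at which the involute reaches a given radius r:
    # t = sqrt((r/rb)**2 - 1), since r = rb * sqrt(1 + t**2).
    t_max = np.sqrt((addendum_radius / base_radius) ** 2 - 1.0)
    t = np.linspace(0.0, 1.05 * t_max, n_points)   # on and above the circles
    x = base_radius * (np.cos(t) + t * np.sin(t))
    y = base_radius * (np.sin(t) - t * np.cos(t))
    return np.column_stack([x, y])

# Example: gear with base radius 40 mm and addendum radius 47.5 mm.
print(involute_points(40.0, 47.5))
```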

  17. Automated measurement of CT noise in patient images with a novel structure coherence feature

    International Nuclear Information System (INIS)

    Chun, Minsoo; Kim, Jong Hyo; Choi, Young Hun

    2015-01-01

While the assessment of CT noise constitutes an important task for the optimization of scan protocols in clinical routine, the majority of noise measurements in practice still rely on manual operation, hence limiting their efficiency and reliability. This study presents an algorithm for the automated measurement of CT noise in patient images with a novel structure coherence feature. The proposed algorithm consists of a four-step procedure: subcutaneous fat tissue selection, calculation of the structure coherence feature, determination of homogeneous ROIs, and estimation of the average noise level. In an evaluation with 94 CT scans (16 517 images) of pediatric and adult patients, along with the participation of two radiologists, ROIs were placed on a homogeneous fat region with 99.46% accuracy, and the agreement of the automated noise measurements with the radiologists' reference noise measurements (PCC = 0.86) was substantially higher than the within- and between-rater agreements of noise measurements (PCC_within = 0.75, PCC_between = 0.70). In addition, the absolute noise level measurements matched closely the theoretical noise levels generated by a reduced-dose simulation technique. Our proposed algorithm has the potential to be used for examining the appropriateness of radiation dose and the image quality of CT protocols for research purposes as well as clinical routine. (paper)
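
The last two steps (homogeneous-ROI selection and noise averaging) can be sketched as follows, with the structure coherence test replaced by a plain local-variance threshold purely for illustration; the paper's actual coherence feature is more elaborate:

```python
import numpy as np

def estimate_noise(image, roi=16, max_std=20.0):
    """Estimate CT noise as the mean standard deviation over homogeneous ROIs.

    An ROI is accepted as homogeneous when its own std falls below max_std
    (a stand-in for the paper's structure coherence test).
    """
    stds = []
    h, w = image.shape
    for r in range(0, h - roi, roi):
        for c in range(0, w - roi, roi):
            patch = image[r:r + roi, c:c + roi].astype(float)
            if patch.std() < max_std:
                stds.append(patch.std())
    return float(np.mean(stds)) if stds else float("nan")

# Example on synthetic data: uniform region plus Gaussian noise (sigma = 10 HU).
rng = np.random.default_rng(0)
img = 50 + rng.normal(0, 10, size=(256, 256))
print(round(estimate_noise(img), 2))  # ~10
```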

  18. Medial structure generation for registration of anatomical structures

    DEFF Research Database (Denmark)

    Vera, Sergio; Gil, Debora; Kjer, Hans Martin

    2017-01-01

Medial structures (skeletons and medial manifolds) have shown capacity to describe shape in a compact way. In the field of medical imaging, they have been employed to enrich the description of organ anatomy, to improve segmentation, or to describe the organ position in relation to surrounding structures. Methods for generation of medial structures, however, are prone to the generation of medial artifacts (spurious branches) that traditionally need to be pruned before the medial structure can be used for further computations. The act of pruning can affect main sections of the medial surface...

  19. Structural considerations in steam generator replacement

    International Nuclear Information System (INIS)

    Bertheau, S.R.; Gazda, P.A.

    1991-01-01

Corrosion of the tubes and tube-support structures inside pressurized water reactor (PWR) steam generators has led many utilities to consider replacement of the generators. Such a project is a major undertaking for a utility and must be well planned to ensure an efficient and cost-effective effort. This paper addresses the major structural considerations associated with a steam generator replacement project: replacement options, such as total or partial generator replacement, along with their associated pipe cuts; the structural aspects of removal paths through the equipment hatch or through an opening in the containment wall, along with the related removal processes; onsite movement and storage of the generators; and the advantages and disadvantages of the removal alternatives. Other important considerations (e.g., licensing, radiological concerns, electrical requirements, facilities for management and onsite administrative activities, storage and fabrication activities, and offsite transportation) are not discussed in this paper, but should be carefully considered when undertaking a replacement project.

  20. Spanish generation market: structure, design and results

    International Nuclear Information System (INIS)

    Agosti, L.; Padilla, A. J.; Requejo, A.

    2007-01-01

This paper provides an overview of the structure, design and outcome of the Spanish generation market from 1998, when the market was liberalised, to date. More precisely, this paper reviews the history of the liberalisation process; describes the structure of the generation market and its evolution over time; analyses the existence of market power; and evaluates the outcome of the liberalisation process from the viewpoint of its impact on allocative efficiency, productive efficiency and dynamic efficiency. The paper concludes with a brief summary of recent regulatory reforms. (Author)

  1. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler.

    Science.gov (United States)

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O'Connor, Mary; Shapiro, Bruce A

    2008-10-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes.

  2. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler

    Science.gov (United States)

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  3. Automated Breast Ultrasound for Ductal Pattern Reconstruction: Ground Truth File Generation and CADe Evaluation

    Science.gov (United States)

    Manousaki, D.; Panagiotopoulou, A.; Bizimi, V.; Haynes, M. S.; Love, S.; Kallergi, M.

    2017-11-01

The purpose of this study was the generation of ground truth files (GTFs) of the breast ducts from 3D images of the Invenia™ Automated Breast Ultrasound System (ABUS) (GE Healthcare, Little Chalfont, UK) and the application of these GTFs to the optimization of the imaging protocol and the evaluation of a computer-aided detection (CADe) algorithm developed for automated duct detection. Six lactating, nursing volunteers were scanned with the ABUS before and right after breastfeeding their infants. An expert in breast ultrasound generated rough outlines of the milk-filled ducts in the transaxial slices of all image volumes, and the final GTFs were created using thresholding and smoothing tools in ImageJ. In addition, a CADe algorithm automatically segmented duct-like areas and its results were compared to the expert's GTFs by estimating the true positive fraction (TPF), or % overlap. The CADe output differed significantly from the expert's, but both detected a smaller than expected volume of the ducts due to insufficient contrast (ducts were only partially filled with milk), discontinuities, and artifacts. GTFs were used to modify the imaging protocol and improve the CADe method. In conclusion, electronic GTFs provide a valuable tool in the optimization of a tomographic imaging system, the imaging protocol, and CADe algorithms. Their generation, however, is an extremely time-consuming, strenuous process, particularly for multi-slice examinations, and alternatives based on phantoms or simulations are highly desirable.
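
The TPF/% overlap comparison between CADe output and an expert GTF reduces to a ratio of binary-mask voxel counts; a minimal sketch with synthetic masks (all shapes and names are hypothetical):

```python
import numpy as np

def true_positive_fraction(cade_mask, gtf_mask):
    """Percent overlap: fraction of ground-truth voxels also flagged by CADe."""
    gtf_voxels = gtf_mask.sum()
    if gtf_voxels == 0:
        return float("nan")
    return float(np.logical_and(cade_mask, gtf_mask).sum() / gtf_voxels)

# Toy 3D volumes standing in for segmented duct masks.
gtf = np.zeros((4, 8, 8), dtype=bool)
gtf[:, 2:6, 2:6] = True
cade = np.zeros_like(gtf)
cade[:, 3:6, 3:6] = True
print(true_positive_fraction(cade, gtf))  # 0.5625
```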

  4. Automation of labelling of Lipiodol with high-activity generator-produced 188Re

    International Nuclear Information System (INIS)

    Lepareur, Nicolas; Ardisson, Valerie; Noiret, Nicolas; Boucher, Eveline; Raoul, Jean-Luc; Clement, Bruno; Garin, Etienne

    2011-01-01

This work describes optimisation of the kit formulation for labelling of Lipiodol with high-activity generator-produced rhenium-188. Radiochemical purity (RCP) was 92.52±2.3% and the extraction yield was 98.56±1.2%. The synthesis has been automated with a TADDEO module (Comecer), giving a mean final yield of 52.68±9.6% and reducing the radiation burden to the radiochemist by 80%. Radiolabelled Lipiodol (188Re-SSS/Lipiodol) is stable for at least 7 days (RCP = 91.07±0.9%).

  5. Aptaligner: automated software for aligning pseudorandom DNA X-aptamers from next-generation sequencing data.

    Science.gov (United States)

    Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T; Volk, David E

    2014-06-10

    Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provides this feature as well as length error and noise level cutoff features, is parallelized to run on multiple central processing units (cores), and sorts sequences from a single chip into projects and subprojects.

  6. Automated Design and Analysis Tool for CEV Structural and TPS Components, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  7. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This developed...

  8. The structure and functions of an automated project management system for the centers of scientific and technical creativity of students

    OpenAIRE

    Dmitriev, V. M.; Gandzha, T. V.; Gandzha, V. V.; Panov, S. A.

    2013-01-01

This article discusses the possibility of automating student project work through the use of an automated project management system. The purpose, structure and formalism of the automated workplace of the student-designer (AWSD) are described, and its structural-functional diagram is shown.

  9. Kotai Antibody Builder: automated high-resolution structural modeling of antibodies.

    Science.gov (United States)

    Yamashita, Kazuo; Ikeda, Kazuyoshi; Amada, Karlou; Liang, Shide; Tsuchiya, Yuko; Nakamura, Haruki; Shirai, Hiroki; Standley, Daron M

    2014-11-15

Kotai Antibody Builder is a Web service for tertiary structural modeling of antibody variable regions. It consists of three main steps: hybrid template selection by sequence alignment and canonical rules, 3D rendering of alignments and CDR-H3 loop modeling. For the last step, in addition to the rule-based heuristics used to build the initial model, a refinement option is available that uses fragment assembly followed by knowledge-based scoring. Using targets from the Second Antibody Modeling Assessment, we demonstrate that Kotai Antibody Builder generates models with an overall accuracy equal to that of the best-performing semi-automated predictors using expert knowledge. Kotai Antibody Builder is available at http://kotaiab.org. Contact: standley@ifrec.osaka-u.ac.jp

  10. Automated system for generation of soil moisture products for agricultural drought assessment

    Science.gov (United States)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are used globally to forecast and provide early warning of drought and to monitor its prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources such as space-based data, ground data and collateral data at short intervals of time, where there may be limitations in processing power, availability of domain expertise, and specialized models and tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model, widely used for its sensitivity to soil conditions and rainfall parameters, is employed to arrive at soil moisture products. This model has been encoded into a "Fish-Bone" architecture using COM technologies and open source libraries for the best possible automation, to fulfil the need for a standard procedure for preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for the generation of soil moisture products, allowing users to concentrate on further enhancements and on the application of these parameters in related areas of research, without re-discovering the established models. The architecture relies mainly on available open source libraries for GIS and raster I/O operations on different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. The system is further automated, to the extent of user-free operation if required, with inbuilt chain processing for the generation of products every day at specified intervals. The operational software has inbuilt capabilities to automatically
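
For orientation, a bucket-type soil water balance is a simple daily recurrence: storage is topped up by rainfall, depleted by evapotranspiration, and capped at the soil's water-holding capacity. A minimal sketch, with illustrative parameter values only:

```python
def bucket_model(rainfall, et, capacity=100.0, sm0=50.0):
    """Daily soil water balance: sm[t] = clip(sm[t-1] + rain[t] - et[t], 0, capacity)."""
    sm, series = sm0, []
    for rain, evap in zip(rainfall, et):
        sm = min(max(sm + rain - evap, 0.0), capacity)
        series.append(sm)
    return series

# One week of synthetic rainfall/evapotranspiration (mm/day).
rain = [0, 12, 30, 0, 0, 5, 0]
et   = [4, 4, 5, 5, 4, 4, 4]
print([round(v, 1) for v in bucket_model(rain, et)])
```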

  11. A Parallel Multiblock Structured Grid Method with Automated Interblocked Unstructured Grids for Chemically Reacting Flows

    Science.gov (United States)

    Spiegel, Seth Christian

An automated method for using unstructured grids to patch non-C0 interfaces between structured blocks has been developed in conjunction with a finite-volume method for solving chemically reacting flows on unstructured grids. Although the standalone unstructured solver, FVFLO-NCSU, is capable of resolving flows for high-speed aeropropulsion devices with complex geometries, unstructured-mesh algorithms are inherently inefficient when compared to their structured counterparts. However, the advantages of structured algorithms in developing a flow solution in a timely manner can be negated by the amount of time required to develop a mesh for complex geometries. The global domain can be split up into numerous smaller blocks during the grid-generation process to alleviate some of the difficulties in creating these complex meshes. An even greater abatement can be found by allowing the nodes on abutting block interfaces to be non-matching or non-C0 continuous. One code capable of solving chemically reacting flows on these multiblock grids is VULCAN, which uses a nonconservative approach for patching non-C0 block interfaces. The developed automated unstructured-grid patching algorithm has been installed within VULCAN to provide it with the capability of a fully conservative approach for patching non-C0 block interfaces. Additionally, the FVFLO-NCSU solver algorithms have been deeply intertwined with the VULCAN source code to solve chemically reacting flows on these unstructured patches. Finally, the CGNS software library was added to the VULCAN postprocessor so structured and unstructured data can be stored in a single compact file. This final upgrade to VULCAN has been successfully installed and verified using test cases with particular interest towards those involving grids with non-C0 block interfaces.

  12. Continuous Automated Model EvaluatiOn (CAMEO) complementing the critical assessment of structure prediction in CASP12.

    Science.gov (United States)

    Haas, Jürgen; Barbato, Alessandro; Behringer, Dario; Studer, Gabriel; Roth, Steven; Bertoni, Martino; Mostaguir, Khaled; Gumienny, Rafal; Schwede, Torsten

    2018-03-01

Every second year, the community experiment "Critical Assessment of Techniques for Structure Prediction" (CASP) conducts an independent blind assessment of structure prediction methods, providing a framework for comparing the performance of different approaches and discussing the latest developments in the field. Yet developers of automated computational modeling methods clearly benefit from more frequent evaluations based on larger sets of data. The "Continuous Automated Model EvaluatiOn (CAMEO)" platform complements the CASP experiment by conducting fully automated blind prediction assessments based on the weekly pre-release of sequences of those structures which are going to be published in the next release of the Protein Data Bank (PDB). CAMEO publishes weekly benchmarking results based on models collected during a 4-day prediction window, on average assessing ca. 100 targets during a time frame of 5 weeks. CAMEO benchmarking data are generated consistently for all participating methods at the same point in time, enabling developers to benchmark and cross-validate their method's performance and directly refer to the benchmarking results in publications. In order to facilitate server development and promote shorter release cycles, CAMEO sends weekly emails with submission statistics and low-performance warnings. Many participants of CASP have successfully employed CAMEO when preparing their methods for upcoming community experiments. CAMEO offers a variety of scores to allow benchmarking of diverse aspects of structure prediction methods. By introducing new scoring schemes, CAMEO facilitates new development in areas of active research, for example, modeling quaternary structure, complexes, or ligand binding sites.

  13. Component-based modeling of systems for automated fault tree generation

    International Nuclear Information System (INIS)

    Majdara, Aref; Wakabayashi, Toshio

    2009-01-01

One of the challenges in the field of automated fault tree construction is to find an efficient modeling approach that can support modeling of different types of systems without ignoring any necessary details. In this paper, we present a new system modeling approach for computer-aided fault tree generation. In this method, every system model is composed of components and of different types of flows propagating through them. Each component has a function table that describes its input-output relations. For components having different operational states, there is also a state transition table. Each component can communicate with other components in the system only through its inputs and outputs. A trace-back algorithm is proposed that can be applied to the system model to generate the required fault trees. The system modeling approach and the fault tree construction algorithm are applied to a fire sprinkler system and the results are presented.
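
A toy rendering of the idea: each component's function table maps an output deviation back to input deviations or internal failures, and a trace-back expands these tables from the top event. Everything below is a hypothetical simplification of the paper's formalism:

```python
# Each entry maps an output deviation to the conditions that can cause it:
# either a deviation on an input flow or an internal failure of the component.
FUNCTION_TABLES = {
    "sprinkler": {"no_water_out": ["no_water_in", "nozzle_blocked"]},
    "valve":     {"no_water_in":  ["no_supply", "valve_stuck_closed"]},
    "supply":    {"no_supply":    ["pump_failure"]},
}

def trace_back(event, depth=0):
    """Expand a deviation into its causes, depth-first, printing a fault tree."""
    print("  " * depth + event)
    for table in FUNCTION_TABLES.values():
        for cause in table.get(event, []):
            trace_back(cause, depth + 1)

trace_back("no_water_out")
```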

  14. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  15. Application of a path sensitizing method on automated generation of test specifications for control software

    International Nuclear Information System (INIS)

    Morimoto, Yuuichi; Fukuda, Mitsuko

    1995-01-01

An automated generation method for test specifications has been developed for sequential control software in plant control equipment. Sequential control software can be represented as sequential circuits, and the control software implemented in control equipment is designed from these circuit diagrams. In logic tests of VLSIs, path sensitizing methods are widely used to generate test specifications, but such methods generate test specifications for a single time step only and cannot be directly applied to sequential control software. The basic idea of the proposed method is as follows. Specifications of each logic operator in the diagrams are defined in the software design process. Test specifications for each operator in the control software can therefore be determined from these specifications, and the validity of the software can be judged by inspecting all of the operators in the logic circuit diagrams. Candidates for sensitized paths, on which test data for each operator propagates, can be generated by the path sensitizing method. To confirm the feasibility of the method, it was experimentally applied to control software in digital control equipment. The program could generate test specifications exactly, and the feasibility of the method was confirmed. (orig.) (3 refs., 7 figs.)
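
The core of path sensitization can be illustrated with the Boolean difference: an input vector sensitizes a path from an input to the output exactly when flipping that input flips the output. A brute-force sketch over a small, made-up combinational function:

```python
from itertools import product

def f(a, b, c):
    """Example circuit: out = (a AND b) OR c."""
    return (a and b) or c

def sensitizing_vectors(func, n_inputs, target):
    """Vectors where flipping input `target` flips the output (Boolean difference = 1)."""
    vectors = []
    for bits in product([0, 1], repeat=n_inputs):
        flipped = list(bits)
        flipped[target] ^= 1
        if func(*bits) != func(*flipped):
            vectors.append(bits)
    return vectors

# Test data that propagates a fault on input a: requires b=1, c=0.
print(sensitizing_vectors(f, 3, target=0))  # [(0, 1, 0), (1, 1, 0)]
```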

  16. Automating methods to improve precision in Monte-Carlo event generation for particle colliders

    International Nuclear Information System (INIS)

    Gleisberg, Tanju

    2008-01-01

The subject of this thesis was the development of tools for the automated calculation of exact matrix elements, which are a key ingredient for the systematic improvement of precision and confidence in theoretical predictions. Part I of this thesis concentrates on the calculation of cross sections at tree level. A number of extensions have been implemented in the matrix element generator AMEGIC++, namely new interaction models such as effective loop-induced couplings of the Higgs boson to massless gauge bosons, required for a number of channels in the Higgs boson search at the LHC, and anomalous gauge couplings parameterizing a number of models beyond the SM. Further, a special treatment to deal with complicated decay chains of heavy particles has been constructed. A significant effort went into the implementation of methods to push the limits on particle multiplicities. Two recursive methods have been implemented: the Cachazo-Svrcek-Witten recursion and the colour-dressed Berends-Giele recursion. For the latter, the new module COMIX has been added to the SHERPA framework. The Monte-Carlo phase space integration techniques have been completely revised, which led to significantly reduced statistical error estimates when calculating cross sections and a greatly improved unweighting efficiency for event generation. Special integration methods have been developed to cope with the newly accessible final states. The event generation framework SHERPA directly benefits from these new developments, improving both precision and efficiency. Part II addressed the automation of QCD calculations at next-to-leading order. A code has been developed that, for the first time, fully automates the real correction part of an NLO calculation. To calculate the correction for an m-parton process obeying the Catani-Seymour dipole subtraction method, the following components are provided: 1. the corresponding m+1-parton tree level matrix elements, 2. a number of dipole subtraction terms to remove

  17. Automating methods to improve precision in Monte-Carlo event generation for particle colliders

    Energy Technology Data Exchange (ETDEWEB)

    Gleisberg, Tanju

    2008-07-01

The subject of this thesis was the development of tools for the automated calculation of exact matrix elements, which are a key ingredient for the systematic improvement of precision and confidence in theoretical predictions. Part I of this thesis concentrates on the calculation of cross sections at tree level. A number of extensions have been implemented in the matrix element generator AMEGIC++, namely new interaction models such as effective loop-induced couplings of the Higgs boson to massless gauge bosons, required for a number of channels in the Higgs boson search at the LHC, and anomalous gauge couplings parameterizing a number of models beyond the SM. Further, a special treatment to deal with complicated decay chains of heavy particles has been constructed. A significant effort went into the implementation of methods to push the limits on particle multiplicities. Two recursive methods have been implemented: the Cachazo-Svrcek-Witten recursion and the colour-dressed Berends-Giele recursion. For the latter, the new module COMIX has been added to the SHERPA framework. The Monte-Carlo phase space integration techniques have been completely revised, which led to significantly reduced statistical error estimates when calculating cross sections and a greatly improved unweighting efficiency for event generation. Special integration methods have been developed to cope with the newly accessible final states. The event generation framework SHERPA directly benefits from these new developments, improving both precision and efficiency. Part II addressed the automation of QCD calculations at next-to-leading order. A code has been developed that, for the first time, fully automates the real correction part of an NLO calculation. To calculate the correction for an m-parton process obeying the Catani-Seymour dipole subtraction method, the following components are provided: 1. the corresponding m+1-parton tree level matrix elements, 2. a number of dipole subtraction terms to remove

  18. Synthesis of generator based 68Ga-labeled biphosphonates by an automated module: Indian experience

    International Nuclear Information System (INIS)

    Kumar, Rajeev; Sharma, P.; Medhavi, S.; Pandey, A.K.; Tripathy, M.; Kumar, Rakesh; Bal, C.; Malhotra, A.; Meckel, M.; Rosch, F.

    2015-01-01

Full text of publication follows. Aim of the study: to share our experience regarding the synthesis and quality control of generator-based 68Ga-NOTA-biphosphonates for bone PET imaging using an automated module. Material and methods: the eluate of a 68Ge/68Ga generator was passed through a cation exchange resin (Strata X-C). 68Ga was adsorbed on the cartridge and the rest of the solvent passed to waste. A solution conventionally called N2 (a mixture of acetone, metal-free water and HCl) was used to release the concentrated and purified 68Ga from the Strata X-C into the 10 ml reaction vial. The reaction vial contained 20 μg of the precursor NOTA-biphosphonate dissolved in 1.5 ml of 0.25 M sodium acetate at pH 4. The reaction vessel was then heated at 95 °C for 15 minutes. After cooling, the solution was diluted by adding 3 ml of metal-free water. The product was transferred to the product vial through a 0.22 μm sterile filter. All synthesis steps were carried out in an automated module (Modular-Lab, Eckert and Ziegler, Germany). The total synthesis time was 18 minutes. During the whole procedure the radiation level was monitored around the hot cell at all four side walls every 3 minutes. Routine quality control tests were performed using radio-TLC, pH paper and a dose calibrator (for radiochemical binding, Rf, pH value and half-life, respectively). Results: the 68Ga-NOTA-biphosphonate yield ranged between 333 and 370 MBq from a five-month-old 1850 MBq generator. The 68Ga-NOTA-biphosphonate conjugate was prepared with very high radiochemical yield and purity (>99%). The product was stable for up to four hours at room temperature (checked by radio-TLC). During synthesis the radiation level around the hot cell was near background level (∼3 μSv/h). Summary: the generator-based PET radiotracer 68Ga-NOTA-biphosphonate can be synthesized with high radiochemical purity and good stability using an automated module. (authors)

  19. Automation in structural biology beamlines of the Photon Factory

    International Nuclear Information System (INIS)

    Igarashi, Noriyuki; Hiraki, Masahiko; Matsugaki, Naohiro; Yamada, Yusuke; Wakatsuki, Soichi

    2007-01-01

The Photon Factory currently operates four synchrotron beamlines for protein crystallography, and two more beamlines are scheduled to be constructed in the coming years. In recent years these beamlines have been upgraded and equipped with a fully automated beamline control system based on a robotic sample changer. The current system allows for remote operation, controlled from the user's area, of sample mounting, centering and data collection of pre-frozen crystals mounted in Hampton-type cryo-loops on a goniometer head. New intuitive graphical user interfaces have been developed to control the complete beamline operation. Furthermore, algorithms for automatic sample centering based on pattern matching and X-ray beam scanning are being developed and combined with newly developed diffraction evaluation programs in order to fully automate data collection. (author)

  20. Exploiting structure similarity in refinement: automated NCS and target-structure restraints in BUSTER

    Energy Technology Data Exchange (ETDEWEB)

    Smart, Oliver S., E-mail: osmart@globalphasing.com; Womack, Thomas O.; Flensburg, Claus; Keller, Peter; Paciorek, Włodek; Sharff, Andrew; Vonrhein, Clemens; Bricogne, Gérard [Global Phasing Ltd, Sheraton House, Castle Park, Cambridge CB3 0AX (United Kingdom)

    2012-04-01

Local structural similarity restraints (LSSR) provide a novel method for exploiting NCS or structural similarity to an external target structure. Two examples are given where BUSTER re-refinement of PDB entries with LSSR produces marked improvements, enabling further structural features to be modelled. Maximum-likelihood X-ray macromolecular structure refinement in BUSTER has been extended with restraints facilitating the exploitation of structural similarity. The similarity can be between two or more chains within the structure being refined, thus favouring NCS, or to a distinct 'target' structure that remains fixed during refinement. The local structural similarity restraints (LSSR) approach considers all distances less than 5.5 Å between pairs of atoms in the chain to be restrained. For each, the difference from the distance between the corresponding atoms in the related chain is found. LSSR applies a restraint penalty to each difference. A functional form that reaches a plateau for large differences is used to avoid the restraints distorting parts of the structure that are not similar. Because LSSR are local, there is no need to separate out domains. Some restraint pruning is still necessary, but this has been automated. LSSR have been available to academic users of BUSTER since 2009 through the easy-to-use -autoncs and -target target.pdb options. The use of LSSR is illustrated in the re-refinement of two PDB entries: one where -target enables the correct ligand-binding structure to be found, and one where -autoncs contributes to the location of an additional copy of the cyclic peptide ligand.

  1. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    Science.gov (United States)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.05), and the resulting index achieved an area under the ROC curve above 0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies can be avoided with this fully automated method.

  2. Automated facial recognition of manually generated clay facial approximations: Potential application in unidentified persons data repositories.

    Science.gov (United States)

    Parks, Connie L; Monson, Keith L

    2018-01-01

This research examined how accurately 2D images (i.e., photographs) of 3D clay facial approximations were matched to corresponding photographs of the approximated individuals using an objective automated facial recognition system. Irrespective of search filter (i.e., blind, sex, or ancestry) or rank class (R1, R10, R25, and R50) employed, few operationally informative results were observed. In only a single instance of 48 potential match opportunities was a clay approximation matched to a corresponding life photograph within the top 50 images (R50) of a candidate list, even with relatively small gallery sizes created from the application of search filters (e.g., sex or ancestry search restrictions). Increasing the candidate lists to include the top 100 images (R100) resulted in only two additional instances of a correct match. Although other untested variables (e.g., approximation method, 2D photographic process, and practitioner skill level) may have impacted the observed results, this study suggests that 2D images of manually generated clay approximations are not readily matched to life photos by automated facial recognition systems. Further investigation is necessary in order to identify the underlying cause(s), if any, of the poor recognition results observed in this study (e.g., potential inferior facial feature detection and extraction). Additional inquiry exploring prospective remedial measures (e.g., stronger feature differentiation) is also warranted, particularly given the prominent use of clay approximations in unidentified persons casework.
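
The rank-class evaluation amounts to asking whether the true identity appears within the top k entries of each candidate list; a compact sketch (the similarity scores and gallery are synthetic stand-ins):

```python
import numpy as np

def rank_k_hits(similarity, true_idx, k):
    """Count probes whose true match ranks within the top k of the candidate list."""
    hits = 0
    for probe, target in enumerate(true_idx):
        order = np.argsort(-similarity[probe])   # gallery sorted by descending score
        if target in order[:k]:
            hits += 1
    return hits

# 3 probe approximations scored against a gallery of 5 life photographs.
rng = np.random.default_rng(1)
sim = rng.random((3, 5))
truth = [0, 2, 4]
for k in (1, 3):
    print(f"R{k}: {rank_k_hits(sim, truth, k)}/3")
```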

  3. Modelling and simulating the forming of new dry automated lay-up reinforcements for primary structures

    Science.gov (United States)

    Bouquerel, Laure; Moulin, Nicolas; Drapier, Sylvain; Boisse, Philippe; Beraud, Jean-Marc

    2017-10-01

While weight has so far been the main driver for the development of prepreg-based composite solutions in aeronautics, a new weight-cost trade-off tends to drive choices for next-generation aircraft. As a response, Hexcel has designed a new dry reinforcement type for aircraft primary structures which combines the benefits of automation and out-of-autoclave process cost-effectiveness with mechanical performance competitive with prepreg solutions: HiTape® is a unidirectional (UD) dry carbon reinforcement with a thermoplastic veil on each side, designed for aircraft primary structures [1-3]. One privileged process route for HiTape® in high-volume automated processes consists in forming initially flat dry reinforcement stacks before resin infusion [4] or injection. Simulation of the forming step aims at predicting the geometry and mechanical properties of the formed stack (the so-called preform) for process optimisation. Extensive work has been carried out on the forming behaviour and simulation of prepregs and dry woven fabrics, but interest in dry non-woven reinforcements has emerged more recently. Some work has been done on non-crimp fabrics, but studies on the forming behaviour of UDs are rare and deal with UD prepregs only. Tension and bending in the fibre direction, along with inter-ply friction, have been identified as the main mechanisms controlling the HiTape® response during forming. Bending has been characterised using a modified Peirce's flexometer [5], and an inter-ply friction study is under development. Anisotropic hyperelastic constitutive models have been selected to represent the assumed decoupled deformation mechanisms, and model parameters are identified from the associated experimental results. For forming simulation, a continuous approach at the macroscopic scale has been selected first; simulation is carried out in the Zset framework [6] using appropriate shell finite elements.

  4. DG-AMMOS: a new tool to generate 3d conformation of small molecules using distance geometry and automated molecular mechanics optimization for in silico screening.

    Science.gov (United States)

    Lagorce, David; Pencheva, Tania; Villoutreix, Bruno O; Miteva, Maria A

    2009-11-13

Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches, in order to increase the effectiveness of the process and facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection, and most often the 3D structures of the small molecules have to be generated since compounds are usually delivered in 1D SMILES, CANSMILES or 2D SDF formats. Here, we describe the new open source program DG-AMMOS, which allows the generation of the 3D conformation of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega, which also generate 3D structures of small molecules, is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generator engine. DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that the generated structures are generally of equal quality or sometimes better than structures obtained by the other tested methods.
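
DG-AMMOS is a standalone program, but the same two-stage idea (distance-geometry embedding followed by molecular-mechanics minimization) can be sketched with the open-source RDKit toolkit; this is an analogous pipeline, not DG-AMMOS's own code:

```python
# Analogous 3D-generation pipeline using RDKit: distance-geometry embedding
# followed by force-field minimization (MMFF94 here, not DG-AMMOS's AMMOS engine).
from rdkit import Chem
from rdkit.Chem import AllChem

mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin, as a test molecule
mol = Chem.AddHs(mol)

AllChem.EmbedMolecule(mol, randomSeed=42)   # distance-geometry 3D embedding
AllChem.MMFFOptimizeMolecule(mol)           # molecular-mechanics energy minimization

print(Chem.MolToMolBlock(mol)[:200])        # first lines of the resulting 3D mol block
```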

  5. DG-AMMOS: A New tool to generate 3D conformation of small molecules using Distance Geometry and Automated Molecular Mechanics Optimization for in silico Screening

    Directory of Open Access Journals (Sweden)

    Villoutreix Bruno O

    2009-11-01

Background: Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches in order to increase the effectiveness of the process and facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection and most often the 3D structure of the small molecules has to be generated since compounds are usually delivered in 1D SMILES, CANSMILES or in 2D SDF formats. Results: Here, we describe the new open source program DG-AMMOS which allows the generation of the 3D conformation of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega generating the 3D of small molecules is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generator engine. Conclusion: DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformation of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that generated structures are generally of equal quality or sometimes better than structures obtained by other tested methods.

  6. Additive Construction with Mobile Emplacement (ACME) / Automated Construction of Expeditionary Structures (ACES) Materials Delivery System (MDS)

    Science.gov (United States)

    Mueller, R. P.; Townsend, I. I.; Tamasy, G. J.; Evers, C. J.; Sibille, L. J.; Edmunson, J. E.; Fiske, M. R.; Fikes, J. C.; Case, M.

    2018-01-01

    The purpose of the Automated Construction of Expeditionary Structures, Phase 3 (ACES 3) project is to incorporate the Liquid Goods Delivery System (LGDS) into the Dry Goods Delivery System (DGDS) structure to create an integrated and automated Materials Delivery System (MDS) for 3D printing structures with ordinary Portland cement (OPC) concrete. ACES 3 is a prototype for 3-D printing barracks for soldiers in forward bases, here on Earth. The LGDS supports ACES 3 by storing liquid materials, mixing recipe batches of liquid materials, and working with the Dry Goods Feed System (DGFS) previously developed for ACES 2, combining the materials that are eventually extruded out of the print nozzle. Automated Construction of Expeditionary Structures, Phase 3 (ACES 3) is a project led by the US Army Corps of Engineers (USACE) and supported by NASA. The equivalent 3D printing system for construction in space is designated Additive Construction with Mobile Emplacement (ACME) by NASA.

  7. Automated Generation of OCL Constraints: NL based Approach vs Pattern Based Approach

    Directory of Open Access Journals (Sweden)

    IMRAN SARWAR BAJWA

    2017-04-01

This paper presents an approach for the automated generation of software constraints. In this model, an SBVR (Semantics of Business Vocabulary and Rules) based semi-formal representation is obtained from the syntactic and semantic analysis of a natural language (NL) sentence, such as an English sentence. An SBVR representation is easy to translate to other formal languages, as SBVR is based on higher-order logic like other formal languages such as OCL (Object Constraint Language). The proposed model provides a systematic and powerful way of incorporating NL knowledge into formal languages. A prototype has been constructed in Java (as an Eclipse plug-in) as a proof of concept. Performance was tested on a few sample texts taken from existing research thesis reports and books.
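
To make the target representation concrete, here is a toy mapping from an SBVR-style structural rule to an OCL invariant string; the rule pattern and vocabulary are invented for illustration:

```python
# Toy translation of an SBVR-style rule into an OCL invariant string.
# Pattern assumed: "It is necessary that each <class> has <attr> <op> <value>"

def sbvr_to_ocl(context, attribute, operator, value):
    """Render a simple attribute-comparison rule as an OCL invariant."""
    return (f"context {context}\n"
            f"inv: self.{attribute} {operator} {value}")

# "It is necessary that each Customer has age greater than or equal to 18"
print(sbvr_to_ocl("Customer", "age", ">=", 18))
# context Customer
# inv: self.age >= 18
```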

  8. Toward the automated generation of genome-scale metabolic networks in the SEED.

    Science.gov (United States)

    DeJongh, Matthew; Formsma, Kevin; Boillot, Paul; Gould, John; Rycenga, Matthew; Best, Aaron

    2007-04-26

    Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative genome annotation and analysis. Our method sets the

  9. Toward the automated generation of genome-scale metabolic networks in the SEED

    Directory of Open Access Journals (Sweden)

    Gould John

    2007-04-01

Background: Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. Results: We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative

  10. TAPDANCE: An automated tool to identify and annotate transposon insertion CISs and associations between CISs from next generation sequence data

    Directory of Open Access Journals (Sweden)

    Sarver Aaron L

    2012-06-01

Background: Next generation sequencing approaches applied to the analysis of transposon insertion junction fragments generated in high-throughput forward genetic screens have created the need for clear informatics and statistical approaches to deal with the massive amount of data currently being generated. Previous approaches used to (1) map junction fragments within the genome and (2) identify Common Insertion Sites (CISs) within the genome are not practical due to the volume of data generated by current sequencing technologies, and they also required significant manual annotation. Results: We describe the Transposon Annotation Poisson Distribution Association Network Connectivity Environment (TAPDANCE) software, which automates the identification of CISs within transposon junction fragment insertion data. Starting with barcoded sequence data, the software identifies and trims sequences and maps putative genomic sequence to a reference genome using the bowtie short-read mapper. Poisson distribution statistics are then applied to assess and rank genomic regions showing significant enrichment for transposon insertion. Novel methods of counting insertions are used to ensure that the results presented have the expected characteristics of informative CISs. A persistent mySQL database is generated and utilized to keep track of sequences, mappings and common insertion sites. Additionally, associations between phenotypes and CISs are identified using Fisher's exact test with multiple-testing correction. In a case study using previously published data we show that the TAPDANCE software identifies CISs as previously described, prioritizes them based on p-value, allows holistic visualization of the data within genome browser software and identifies relationships present in the structure of the data. Conclusions: The TAPDANCE process is fully automated and performs similarly to previous labor-intensive approaches.
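
The Poisson enrichment test amounts to asking how surprising k insertions in a genomic window are under a genome-wide background rate; a minimal sketch using scipy (the window size and counts are made up):

```python
from scipy.stats import poisson

def cis_p_value(insertions_in_window, total_insertions, window_bp, genome_bp):
    """P(X >= k) for k insertions in a window under a uniform Poisson background."""
    expected = total_insertions * window_bp / genome_bp
    return poisson.sf(insertions_in_window - 1, expected)  # sf at k-1 gives P(X >= k)

# 12 insertions in a 10 kb window, 50,000 insertions genome-wide (~2.5 Gb genome).
print(cis_p_value(12, 50_000, 10_000, 2_500_000_000))
```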

  11. Some principles of automated control systems construction with project organizational structure

    OpenAIRE

    Kovalenko, Ihor I.; Puhachenko, Kateryna S.

    2013-01-01

The main principles of constructing automated control systems with project organizational structures are considered, and a process flow sheet for organizational systems control is proposed. The architectural elements of the organizational system are introduced and described. The instrumental tools of graphodynamic systems theory are used for the simulation modeling of hierarchical structures.

  12. Automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

    Science.gov (United States)

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of applications. Since the usefulness of a model for specific application is determined by its accuracy, model quality estimation is an essential component of protein structure prediction. Comparative protein modeling has become a routine approach in many areas of life science research since fully automated modeling systems allow also nonexperts to build reliable models. In this chapter, we describe practical approaches for automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

  13. Automated software system for checking the structure and format of ACM SIG documents

    Science.gov (United States)

    Mirza, Arsalan Rahman; Sah, Melike

    2017-04-01

Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents, and metadata about the documents is automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents in OWL (Web Ontology Language). The metadata is then extracted automatically as RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, a system evaluation and user study evaluations.
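
Since a .docx file is a ZIP archive whose main part is word/document.xml, the first step of such metadata extraction can be sketched with the Python standard library alone; the file name and the paragraph-style query are illustrative:

```python
# Read the OOXML body of a .docx file and list its paragraph style names,
# the kind of structural metadata a format checker would inspect.
import zipfile
import xml.etree.ElementTree as ET

W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

with zipfile.ZipFile("sample.docx") as docx:       # hypothetical input file
    root = ET.fromstring(docx.read("word/document.xml"))

for p in root.iter(f"{W}p"):
    style = p.find(f"{W}pPr/{W}pStyle")
    if style is not None:
        print(style.get(f"{W}val"))                # e.g. 'Title', 'Heading1'
```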

  14. Automated structure and flow measurement - a promising tool in nailfold capillaroscopy.

    Science.gov (United States)

    Berks, Michael; Dinsdale, Graham; Murray, Andrea; Moore, Tonia; Manning, Joanne; Taylor, Chris; Herrick, Ariane L

    2018-07-01

Despite increasing interest in nailfold capillaroscopy, objective measures of capillary structure and blood flow have been little studied. We aimed to test the hypothesis that structural measurements, capillary flow, and a combined measure have the predictive power to separate patients with systemic sclerosis (SSc) from those with primary Raynaud's phenomenon (PRP) and healthy controls (HC). 50 patients with SSc, 12 with PRP, and 50 HC were imaged using a novel capillaroscopy system that generates high-quality nailfold images and provides fully-automated measurements of capillary structure and blood flow (capillary density, mean width, maximum width, shape score, derangement and mean flow velocity). Population statistics summarise the differences between the three groups. Areas under ROC curves (A_Z) were used to measure classification accuracy when assigning individuals to SSc and HC/PRP groups. Statistically significant differences in group means were found between patients with SSc and both HC and patients with PRP, for all measurements, e.g. mean width (μm) ± SE: 15.0 ± 0.71, 12.7 ± 0.74 and 11.8 ± 0.23 for SSc, PRP and HC respectively. Combining the five structural measurements gave better classification (A_Z = 0.919 ± 0.026) than the best single measurement (mean width, A_Z = 0.874 ± 0.043), whilst adding flow further improved classification (A_Z = 0.930 ± 0.024). Structural and blood flow measurements are both able to distinguish patients with SSc from those with PRP/HC. Importantly, these hold promise as clinical trial outcome measures for treatments aimed at improving finger blood flow or microvascular remodelling.

  15. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    International Nuclear Information System (INIS)

    Rosato, Antonio; Vranken, Wim; Fogh, Rasmus H.; Ragan, Timothy J.; Tejero, Roberto; Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H.; Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H.; Kennedy, Michael; Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T.; Vuister, Geerten W.

    2015-01-01

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.
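
    The headline percentages are simple threshold counts over per-entry accuracies; a minimal worked example (with invented RMSD values, since the individual entries are not reproduced here):

      # Fraction of submitted entries within the CASD-NMR accuracy thresholds,
      # given each entry's RMSD (in Å) to the manually solved reference structure.
      rmsds = [0.9, 1.2, 1.4, 1.8, 2.2, 1.1, 2.7, 1.3]   # illustrative values only

      for threshold in (1.5, 2.5):
          frac = sum(r < threshold for r in rmsds) / len(rmsds)
          print(f"entries within {threshold} Å: {100 * frac:.0f} %")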

  16. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    Energy Technology Data Exchange (ETDEWEB)

    Rosato, Antonio [University of Florence, Department of Chemistry and Magnetic Resonance Center (Italy); Vranken, Wim [Vrije Universiteit Brussel, Structural Biology Brussels (Belgium); Fogh, Rasmus H.; Ragan, Timothy J. [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom); Tejero, Roberto [Universidad de Valencia, Departamento de Química Física (Spain); Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H. [University of Georgia, Complex Carbohydrate Research Center and Northeast Structural Genomics Consortium (United States); Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H. [University of Toronto, Department of Medical Biophysics, Cancer Genomics and Proteomics, Ontario Cancer Institute, Northeast Structural Genomics Consortium (Canada); Kennedy, Michael [Miami University, Department of Chemistry and Biochemistry, Northeast Structural Genomics Consortium (United States); Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, Northeast Structural Genomics Consortium, Rutgers (United States); Vuister, Geerten W., E-mail: gv29@le.ac.uk [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom)

    2015-08-15

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.

  17. Generating an Automated Test Suite by Variable Strength Combinatorial Testing for Web Services

    Directory of Open Access Journals (Sweden)

    Yin Li

    2016-09-01

    Testing Web Services has become the spotlight of software engineering as an important means to assure the quality of Web applications. Because Web services expose neither a graphical interface nor source code, they require automated testing methods, in which the efficient design and generation of the test suite is a key part. However, existing testing methods may produce redundant test suites with reduced fault-detecting ability, since they cannot handle scenarios where the strengths of the different interactions are not uniform. To solve this problem, a formal tree model based on WSDL is first constructed, with the actual interaction relationships of each node taken fully into account; variable strength combinatorial test suites are then generated based on a one-test-at-a-time strategy; finally, test cases are minimized according to constraint rules. The results show that, compared with conventional random testing, the proposed approach can detect more errors with the same number of test cases and yields smaller test suites than existing methods.
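
    The one-test-at-a-time strategy named above can be sketched, for the uniform-strength (pairwise) case, as a greedy loop that repeatedly picks the candidate test covering the most uncovered value pairs; the paper's variable-strength version would instead track tuples per interaction group. A hypothetical minimal implementation:

      from itertools import combinations, product

      def pairwise_suite(domains):
          """Greedy one-test-at-a-time covering array of strength 2.
          domains: one list of values per parameter."""
          uncovered = {((i, a), (j, b))
                       for i, j in combinations(range(len(domains)), 2)
                       for a in domains[i] for b in domains[j]}
          suite = []
          while uncovered:
              best, best_gain = None, -1
              for test in product(*domains):   # exhaustive; fine for small models
                  pairs = {((i, test[i]), (j, test[j]))
                           for i, j in combinations(range(len(test)), 2)}
                  gain = len(pairs & uncovered)
                  if gain > best_gain:
                      best, best_gain = test, gain
              suite.append(best)
              uncovered -= {((i, best[i]), (j, best[j]))
                            for i, j in combinations(range(len(best)), 2)}
          return suite

      print(pairwise_suite([[0, 1], ["GET", "POST"], ["xml", "json"]]))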

  18. A novel method for automated grid generation of ice shapes for local-flow analysis

    Science.gov (United States)

    Ogretim, Egemen; Huebsch, Wade W.

    2004-02-01

    Modelling a complex geometry, such as ice roughness, plays a key role in computational flow analysis over rough surfaces. This paper presents two enhancements in modelling roughness geometry for local flow analysis over an aerodynamic surface. The first enhancement is the use of the leading-edge region of an airfoil as a perturbation to a parabola surface. The reasons for using a parabola as the base geometry are that it resembles the airfoil leading edge in the vicinity of its apex and that it allows the use of a lower apparent Reynolds number. The second enhancement makes use of Fourier analysis to model complex ice roughness on the leading edge of airfoils. This modelling approach provides an analytical expression that describes the roughness geometry and the corresponding derivatives. The factors affecting the performance of the Fourier analysis were also investigated; the number of sine-cosine terms and the number of control points were shown to be of particular importance. Finally, these enhancements are incorporated into an automated grid generation method over the airfoil ice accretion surface. The validations of both enhancements demonstrate that they can improve the current capability of grid generation and computational flow field analysis around airfoils with ice roughness.
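
    The Fourier modelling of the roughness profile amounts to a linear least-squares fit of sine-cosine coefficients, after which the geometry and its derivatives are available analytically. A small numpy sketch with an invented roughness signal (not the paper's data):

      import numpy as np

      def fourier_fit(x, y, n_terms, period):
          """Least-squares fit of y(x) by a truncated Fourier series. Returns a
          callable surface model plus its coefficients; derivatives follow
          analytically from the same sine-cosine basis."""
          k = np.arange(1, n_terms + 1)

          def design(xq):
              w = 2 * np.pi * np.outer(np.atleast_1d(xq), k) / period
              return np.hstack([np.ones((w.shape[0], 1)), np.cos(w), np.sin(w)])

          coef, *_ = np.linalg.lstsq(design(x), y, rcond=None)
          return (lambda xq: design(xq) @ coef), coef

      x = np.linspace(0.0, 1.0, 200)
      rng = np.random.default_rng(1)
      rough = 0.02 * np.sin(14 * np.pi * x) + 0.01 * rng.normal(size=x.size)
      surface, coef = fourier_fit(x, rough, n_terms=20, period=1.0)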

  19. Automated As-Built Model Generation of Subway Tunnels from Mobile LiDAR Data

    Directory of Open Access Journals (Sweden)

    Mostafa Arastounia

    2016-09-01

    This study proposes fully-automated methods for as-built model generation of subway tunnels employing mobile Light Detection and Ranging (LiDAR) data. The employed dataset was acquired by a Velodyne HDL 32E and covers 155 m of a subway tunnel, containing six million points. First, the tunnel's main axis and cross sections are extracted. Next, a preliminary model is created by fitting an ellipse to each extracted cross section. The model is refined by employing residual analysis and Baarda's data snooping method to eliminate outliers. The final model is then generated by applying least squares adjustment to the outlier-free data. The obtained results indicate that the tunnel's main axis and 1551 cross sections at 0.1 m intervals are successfully extracted. Cross sections have an average semi-major axis of 7.8508 m with a standard deviation of 0.2 mm and a semi-minor axis of 7.7509 m with a standard deviation of 0.1 mm. The average normal distance of points from the constructed model (average absolute error) is 0.012 m. The developed algorithm is applicable to tunnels with any horizontal orientation and degree of curvature since it makes no assumptions, nor does it use any a priori knowledge, regarding the tunnel's curvature and horizontal orientation.
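
    The per-cross-section fitting step can be sketched as a linear least-squares fit of an axis-aligned ellipse with iterative rejection of large residuals; note that this simple residual test is only a stand-in for Baarda's data snooping, which tests standardized residuals individually:

      import numpy as np

      def fit_ellipse(x, y, max_iter=10, k=3.0):
          """Fit A*x^2 + B*y^2 + C*x + D*y = 1 to 2D points, iteratively
          dropping points whose algebraic residual exceeds k standard deviations.
          Returns (center, semi_major_axis_estimate, semi_minor_axis_estimate)."""
          keep = np.ones(x.size, dtype=bool)
          for _ in range(max_iter):
              M = np.column_stack([x[keep]**2, y[keep]**2, x[keep], y[keep]])
              p, *_ = np.linalg.lstsq(M, np.ones(keep.sum()), rcond=None)
              r = M @ p - 1.0                    # algebraic residuals
              bad = np.abs(r) > k * r.std()
              if not bad.any():
                  break
              idx = np.flatnonzero(keep)
              keep[idx[bad]] = False
          A, B, C, D = p
          xc, yc = -C / (2 * A), -D / (2 * B)    # ellipse center
          rhs = 1 + A * xc**2 + B * yc**2
          return (xc, yc), np.sqrt(rhs / A), np.sqrt(rhs / B)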

  20. Piezoelectric Structures and Low Power Generation Devices

    Directory of Open Access Journals (Sweden)

    Irinela CHILIBON

    2016-10-01

    A short overview of different piezoelectric structures and devices for generating renewable electricity under mechanical actions is presented. A vibrating piezoelectric device differs from a typical electrical power source in that it has capacitive rather than inductive source impedance, and may be driven by mechanical vibrations of varying amplitude. Several techniques have been developed to extract energy from the environment. Generally, "vibration energy" can be converted into electrical energy by three transduction mechanisms: electrostatic, electromagnetic and piezoelectric. The mechanical resonance frequency of piezoelectric bimorph transducers depends on geometric size (length, width, and thickness of each layer) and on the piezoelectric coefficients of the piezoelectric material. Manufacturing processes and intended applications of several energy harvesting devices are presented.
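
    As a worked example of the geometry dependence of the resonance frequency, the sketch below estimates the fundamental bending resonance of a cantilevered beam from Euler-Bernoulli theory; a real bimorph would require the composite bending stiffness of its layer stack, and all material values here are merely illustrative:

      import numpy as np

      # Illustrative PZT-like values, not taken from the article.
      L, w, t = 30e-3, 5e-3, 0.5e-3      # length, width, thickness (m)
      E, rho = 60e9, 7600                # Young's modulus (Pa), density (kg/m3)

      I = w * t**3 / 12                  # second moment of area
      m = rho * w * t                    # mass per unit length
      lam1 = 1.8751                      # first-mode eigenvalue for a cantilever
      f1 = lam1**2 / (2 * np.pi * L**2) * np.sqrt(E * I / m)
      print(f"fundamental resonance ~ {f1:.0f} Hz")   # ~250 Hz for these numbers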

  1. Development and verification testing of automation and robotics for assembly of space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  2. Automated reported system using structured data entry: Application to prostate US

    International Nuclear Information System (INIS)

    Kim, Bo Hyun; Paik, Chul Hwa; Lee, Won Yong

    2001-01-01

    To improve efficiency in producing and searching the radiological reports of prostate US in daily practice and clinical research, we developed an automated reporting system using structured data entry. The report database was established with appropriate fields, a structured data entry form for prostate US was created, and the rules for automated transformation of the entered data into a text report were decided. Two programmers coded the programs according to these rules. We have successfully developed an automated reporting system for prostate US using structured data entry. Patients' demographic information, the order information, and the contents of the main body and conclusion of the radiological report were included as individual fields in the database. The report contents were input by selecting the corresponding fields in the structured data entry form, which were then transformed into a text report. The automated reporting system using structured data entry is an efficient way to establish a radiological report database and could be successfully applied to prostate US. If its utility can be extended to other US examinations, it will become a useful tool for both radiological reporting and database management.
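
    The field-to-text transformation rules can be pictured as simple templating over the entered fields; the field names and phrasing below are invented, not the system's actual schema:

      # Illustrative only: a structured-entry record rendered as report text.
      FIELDS = {
          "volume_ml": 34.2,
          "echotexture": "heterogeneous",
          "lesion": "hypoechoic nodule",
          "lesion_zone": "left peripheral zone",
      }

      def render_report(f):
          lines = [
              f"The prostate gland measures {f['volume_ml']} mL in volume.",
              f"The parenchyma shows {f['echotexture']} echotexture.",
          ]
          if f.get("lesion"):   # optional fields simply drop out of the report
              lines.append(f"A {f['lesion']} is seen in the {f['lesion_zone']}.")
          return " ".join(lines)

      print(render_report(FIELDS))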

  3. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles.

    Science.gov (United States)

    Görgl, R.; Brandstätter, E.

    2016-03-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  4. Laser materials processing of complex components: from reverse engineering via automated beam path generation to short process development cycles

    Science.gov (United States)

    Görgl, Richard; Brandstätter, Elmar

    2017-01-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser cladding and laser-based additive manufacturing are given.

  5. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    Belman-Flores, J.M.; Riesco-Avila, J.M.; Gallegos-Munoz, A.; Navarro-Esbri, J.; Aceves, S.M.

    2009-01-01

    We have developed a computational method for the analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
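
    The loop count that determines the number of extra equations is, for a connected flow diagram, the cycle rank E − N + 1 of the graph of flows (E edges, N nodes); a small sketch with an invented two-stage cycle:

      # Cycle rank of the refrigerator's flow graph = number of independent loops,
      # hence the number of extra equations needed beyond the component balances.
      edges = [("evap", "comp1"), ("comp1", "intercool"), ("intercool", "comp2"),
               ("comp2", "cond"), ("cond", "valve1"), ("valve1", "intercool"),
               ("intercool", "valve2"), ("valve2", "evap")]

      nodes = {n for e in edges for n in e}
      n_components = 1                       # assume a connected diagram
      loops = len(edges) - len(nodes) + n_components
      print(f"independent loops: {loops}")   # -> 2 for this two-stage example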

  6. Assessment of Automated Data Analysis Application on VVER Steam Generator Tubing

    International Nuclear Information System (INIS)

    Picek, E.; Barilar, D.

    2006-01-01

    INETEC - Institute for Nuclear Technology has developed a software package named EddyOne, which includes an option for automated analysis of bobbin coil eddy current data. During its development and site use, some features were found to prevent the wide use of automatic analysis on VVER SG data. This article discusses these specific problems and evaluates possible solutions. With regard to the current state of automated analysis technology, an overview of the advantages and disadvantages of automated analysis on VVER SG is also given. (author)

  7. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    International Nuclear Information System (INIS)

    Girolamo, D.; Yuan, F. G.; Girolamo, L.

    2015-01-01

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damage by propagating ultrasonic waves normal to the surface. However, they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high-resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed to process the wave field and estimate spatially dependent wavenumber values, related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high-fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative for overcoming the limitations of conventional NDE technologies.

  8. Reproducibility, stability, and biological variability of thrombin generation using calibrated automated thrombography in healthy dogs.

    Science.gov (United States)

    Cuq, Benoît; Blois, Shauna L; Wood, R Darren; Monteith, Gabrielle; Abrams-Ogg, Anthony C; Bédard, Christian; Wood, Geoffrey A

    2018-06-01

    Thrombin plays a central role in hemostasis and thrombosis. Calibrated automated thrombography (CAT), a thrombin generation assay, may be a useful test for hemostatic disorders in dogs. To describe CAT results in a group of healthy dogs, and assess preanalytical variables and biological variability. Forty healthy dogs were enrolled. Lag time (Lag), time to peak (ttpeak), peak thrombin generation (peak), and endogenous thrombin potential (ETP) were measured. Direct jugular venipuncture and winged-needle catheter-assisted saphenous venipuncture were used to collect samples from each dog, and results were compared between methods. Sample stability at -80°C was assessed over 12 months in a subset of samples. Biological variability of CAT was assessed via nested ANOVA using samples obtained weekly from a subset of 9 dogs for 4 consecutive weeks. Samples for CAT were stable at -80°C over 12 months of storage. Samples collected via winged-needle catheter venipuncture showed poor repeatability compared to direct venipuncture samples; there was also poor agreement between the 2 sampling methods. Intra-individual variability of CAT parameters was below 25%; inter-individual variability ranged from 36.9% to 78.5%. Measurement of thrombin generation using CAT appears to be repeatable in healthy dogs, and samples are stable for at least 12 months when stored at -80°C. Direct venipuncture sampling is recommended for CAT. Low indices of individuality suggest that subject-based reference intervals are more suitable when interpreting CAT results. © 2018 American Society for Veterinary Clinical Pathology.
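
    Intra- and inter-individual variability of this kind can be estimated from repeated measures; the sketch below is a simplified two-level version of the nested ANOVA used in the study, with invented numbers:

      import numpy as np

      # Peak thrombin (nM) for 4 dogs x 4 weekly samples -- invented values.
      data = np.array([[61, 58, 65, 60],
                       [94, 99, 90, 97],
                       [42, 45, 40, 44],
                       [73, 70, 76, 71]], dtype=float)

      within_var = data.var(axis=1, ddof=1).mean()       # intra-individual variance
      # Between-dog variance, corrected for the within-dog contribution (n = 4).
      between_var = max(data.mean(axis=1).var(ddof=1) - within_var / 4, 0.0)
      grand_mean = data.mean()
      print(f"CV_intra = {100 * np.sqrt(within_var) / grand_mean:.1f} %")
      print(f"CV_inter = {100 * np.sqrt(between_var) / grand_mean:.1f} %")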

  9. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    Science.gov (United States)

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Automated simulation and study of spatial-structural design processes

    NARCIS (Netherlands)

    Davila Delgado, J.M.; Hofmeyer, H.; Stouffs, R.; Sariyildiz, S.

    2013-01-01

    A so-called "Design Process Investigation toolbox" (DPI toolbox) has been developed. It is a set of computational tools that simulate spatial-structural design processes. Its objectives are to study spatial-structural design processes and to support the involved actors. Two case-studies are ...

  11. HD-RNAS: An automated hierarchical database of RNA structures

    Directory of Open Access Journals (Sweden)

    Shubhra Sankar Ray

    2012-04-01

    One of the important goals of most biological investigations is to classify and organize experimental findings so that they are readily useful for deriving generalized rules. Although there is a huge amount of information on RNA structures in the PDB, there are redundant files, ambiguous synthetic sequences, etc. Moreover, a systematic hierarchical organization, reflecting RNA classification, is missing in the PDB. In this investigation, we have classified all the available RNA crystal structures from the PDB through a programmatic approach, making it a simple task to regularly update the classification as and when new structures are released. The classification can further determine (i) a non-redundant set of RNA structures and (ii) if available, a set of structures of identical sequence and function, which can highlight structural polymorphism, ligand-induced conformational alterations, etc. Presently, we have classified the available structures (2095 PDB entries having an RNA chain longer than 9 nucleotides, solved by X-ray crystallography or NMR spectroscopy) into nine functional classes. Structures of the same function and source are mostly similar, with subtle differences depending on their functional complexation. The web-server is available online at http://www.saha.ac.in/biop/www/HD-RNAS.html and is updated regularly.

  12. Amazon Forest Structure from IKONOS Satellite Data and the Automated Characterization of Forest Canopy Properties

    Science.gov (United States)

    Michael Palace; Michael Keller; Gregory P. Asner; Stephen Hagen; Bobby Braswell

    2008-01-01

    We developed an automated tree crown analysis algorithm using 1-m panchromatic IKONOS satellite images to examine forest canopy structure in the Brazilian Amazon. The algorithm was calibrated on the landscape level with tree geometry and forest stand data at the Fazenda Cauaxi (3.75° S, 48.37° W) in the eastern Amazon, and then compared with forest...

  13. Automated Modal Parameter Estimation of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Andersen, Palle; Brincker, Rune; Goursat, Maurice

    In this paper the problems of automatic modal parameter extraction for ambient-excited civil engineering structures are considered. Two different approaches for obtaining the modal parameters automatically are presented: the Frequency Domain Decomposition (FDD) technique and a correlation...

  14. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    Science.gov (United States)

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) of typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology for generating levels of EDR consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10^5 W/kg) are consistent with those obtained through previously published computational fluid dynamics (CFD) studies (2.0×10^5 W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature and EDR on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
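
    Assuming laminar Poiseuille flow in the capillary (so that ΔP = 32μLU/D² and the mean dissipation rate EDR = ΔP·Q/(ρV) = 32μU²/(ρD²)), the EDR for a given diameter and flow rate can be estimated as below; the values are illustrative, and at the very high EDRs quoted above the flow is no longer laminar, making this a rough lower bound:

      import numpy as np

      # Illustrative values only (not the paper's operating points).
      mu, rho = 1.0e-3, 1.0e3            # viscosity (Pa s) and density (kg/m3)
      D = 0.25e-3                        # capillary internal diameter (m)
      Q = 2.0e-6 / 60                    # flow rate: 2 mL/min in m3/s

      U = Q / (np.pi * D**2 / 4)         # mean velocity
      edr = 32 * mu * U**2 / (rho * D**2)   # laminar Poiseuille estimate, W/kg
      print(f"mean velocity {U:.2f} m/s, EDR ~ {edr:.2e} W/kg")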

  15. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    Science.gov (United States)

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  16. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA), through querying a representation of the system in a SysML model. This work builds on the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as fault trees, to efficiently portray system behavior and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
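
    The model-traversal idea can be caricatured as a walk over a component graph that emits one FMEA row per failure mode; the toy schema below is invented and merely stands in for querying the actual SysML model:

      import csv

      # Invented component model: failure modes plus direct propagation targets.
      model = {
          "battery": {"failure_modes": ["cell short"], "affects": ["power bus"]},
          "power bus": {"failure_modes": ["undervoltage"], "affects": ["flight computer"]},
          "flight computer": {"failure_modes": ["reboot loop"], "affects": []},
      }

      def downstream(component, seen=None):
          """Transitively collect everything a component's failure propagates to."""
          seen = seen if seen is not None else set()
          for nxt in model[component]["affects"]:
              if nxt not in seen:
                  seen.add(nxt)
                  downstream(nxt, seen)
          return seen

      with open("fmea.csv", "w", newline="") as f:
          w = csv.writer(f)
          w.writerow(["Component", "Failure mode", "System-level effects"])
          for comp, info in model.items():
              for mode in info["failure_modes"]:
                  w.writerow([comp, mode, "; ".join(sorted(downstream(comp))) or "none"])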

  17. Conceptual Model of an Application for Automated Generation of Webpage Mobile Versions

    Directory of Open Access Journals (Sweden)

    Todor Rachovski

    2017-11-01

    Accessing webpages through various types of mobile devices with different screen sizes and browsers has put new demands on web developers. The main challenge is the development of websites with responsive design that adapts to the mobile device used. The article presents a conceptual model of an application for automated generation of mobile pages. It has a five-layer architecture: database, database management layer, business logic layer, web services layer and presentation layer. The database stores all the data needed to run the application. The database management layer uses an ORM model to convert relational data into an object-oriented format and to control access to them. The business logic layer contains components that perform the actual work of building a mobile version of the page, including parsing, building a hierarchical model of the page and a number of transformations. The web services layer provides external applications with access to lower-level functionalities, and the presentation layer is responsible for choosing and applying the appropriate CSS. A web application that uses the proposed model was developed and experiments were conducted.

  18. Agent-based Modeling Automated: Data-driven Generation of Innovation Diffusion Models

    NARCIS (Netherlands)

    Jensen, T.; Chappin, E.J.L.

    2016-01-01

    Simulation modeling is useful to gain insights into the driving mechanisms of diffusion of innovations. This study aims to introduce automation to make the identification of such mechanisms with agent-based simulation modeling less costly in time and labor. We present a novel automation procedure in which ...

  19. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines.

    Science.gov (United States)

    Leung, Wai Yi; Marschall, Tobias; Paudel, Yogesh; Falquet, Laurent; Mei, Hailiang; Schönhuth, Alexander; Maoz Moss, Tiffanie Yael

    2015-03-25

    Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans also for other species. Objectives of this work included: a) creating an automated, standardized pipeline for SV prediction; b) identifying the best tool(s) for SV prediction through benchmarking; and c) providing a statistically sound method for merging SV calls. The SV-AUTOPILOT meta-tool platform is an automated pipeline for standardization of SV prediction and SV tool development in paired-end next-generation sequencing (NGS) analysis. SV-AUTOPILOT comes in the form of a virtual machine, which includes all datasets, tools and algorithms presented here. The virtual machine easily allows one to add, replace and update genomes, SV callers and post-processing routines and therefore provides an easy, out-of-the-box environment for complex SV discovery tasks. SV-AUTOPILOT was used to make a direct comparison between 7 popular SV tools on the Arabidopsis thaliana genome using the Landsberg (Ler) ecotype as a standardized dataset. Recall and precision measurements suggest that Pindel and Clever were the most adaptable to this dataset across all size ranges, while Delly performed well for SVs larger than 250 nucleotides. A novel, statistically sound merging process, which can control the false discovery rate, reduced the false positive rate on the Arabidopsis benchmark dataset used here by >60%. SV-AUTOPILOT provides a meta-tool platform for future SV tool development and the benchmarking of tools on other genomes using a standardized pipeline. It optimizes detection of SVs in non-human genomes using statistically robust merging. The benchmarking in this study has demonstrated the power of 7 different SV tools for analyzing different size classes and types of structural variants. The optional merge ...
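
    A common core of SV merging is grouping calls of the same type that overlap reciprocally above a cutoff and keeping groups with sufficient caller support; the paper's merge additionally controls the false discovery rate statistically, which is not reproduced here. A hypothetical minimal version:

      def reciprocal_overlap(a, b):
          """Fraction of mutual overlap between two intervals (start, end, ...)."""
          inter = max(0, min(a[1], b[1]) - max(a[0], b[0]))
          return min(inter / (a[1] - a[0]), inter / (b[1] - b[0]))

      def merge_calls(calls, min_support=2, min_ro=0.5):
          """calls: list of (start, end, caller); returns consensus intervals
          supported by at least min_support distinct callers."""
          calls = sorted(calls)
          merged, used = [], [False] * len(calls)
          for i, c in enumerate(calls):
              if used[i]:
                  continue
              group = [c]
              for j in range(i + 1, len(calls)):
                  if not used[j] and reciprocal_overlap(c, calls[j]) >= min_ro:
                      group.append(calls[j])
                      used[j] = True
              if len({g[2] for g in group}) >= min_support:
                  merged.append((min(g[0] for g in group), max(g[1] for g in group)))
          return merged

      print(merge_calls([(100, 600, "pindel"), (120, 640, "delly"), (5000, 5200, "clever")]))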

  20. From bacterial to human dihydrouridine synthase: automated structure determination

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Fiona, E-mail: fiona.whelan@york.ac.uk; Jenkins, Huw T., E-mail: fiona.whelan@york.ac.uk [The University of York, Heslington, York YO10 5DD (United Kingdom); Griffiths, Samuel C. [University of Oxford, Headington, Oxford OX3 7BN (United Kingdom); Byrne, Robert T. [Ludwig-Maximilians-University Munich, Feodor-Lynen-Strasse 25, 81377 Munich (Germany); Dodson, Eleanor J.; Antson, Alfred A., E-mail: fiona.whelan@york.ac.uk [The University of York, Heslington, York YO10 5DD (United Kingdom)

    2015-06-30

    The crystal structure of a human dihydrouridine synthase, an enzyme associated with lung cancer, with 18% sequence identity to a T. maritima enzyme, has been determined at 1.9 Å resolution by molecular replacement after extensive molecular remodelling of the template. The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr-rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer.

  1. From bacterial to human dihydrouridine synthase: automated structure determination

    International Nuclear Information System (INIS)

    Whelan, Fiona; Jenkins, Huw T.; Griffiths, Samuel C.; Byrne, Robert T.; Dodson, Eleanor J.; Antson, Alfred A.

    2015-01-01

    The crystal structure of a human dihydrouridine synthase, an enzyme associated with lung cancer, with 18% sequence identity to a T. maritima enzyme, has been determined at 1.9 Å resolution by molecular replacement after extensive molecular remodelling of the template. The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr-rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer.

  2. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Productivity rate (Q), or production rate, is one of the important indicator criteria for industrial engineers seeking to improve the system and the finished-goods output of a production or assembly line. Mathematical and statistical analysis of the productivity rate is needed to give a visual overview of failure factors and of possible improvements within the production line, especially for automated flow lines, where the analysis is complicated. A mathematical model of productivity rate for a linear-arrangement, serial-structure automated flow line with different failure rates and bottleneck machining time parameters is the basic model for this productivity analysis. This paper presents an engineering mathematical analysis method applied in an automotive company in Malaysia that operates an automated flow assembly line within the final assembly line to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages known as data collection, calculation and comparison, analysis, and sustainable improvement, is used to analyze productivity in the automated flow assembly line based on this mathematical model. The various failure rates that cause loss of productivity, together with bottleneck machining times, are quantified explicitly, and sustainable solutions for productivity improvement of this final assembly automated flow line are presented.
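
    A generic productivity model of this family (hedged: not necessarily the paper's exact equation) takes Q as the reciprocal of the bottleneck cycle time inflated by station failures:

      # Hedged, generic model:
      #   Q = 1 / [ (t_bottleneck + t_transport) * (1 + sum(lambda_i) * m_r) ]
      # where lambda_i are per-station failure rates (failures per cycle) and
      # m_r is the mean repair time expressed in cycle-time units.
      t_bottleneck = 12.0                       # s, slowest station's machining time
      t_transport = 3.0                         # s, transfer time between stations
      failure_rates = [0.004, 0.002, 0.007]     # failures per cycle, per station
      mean_repair_cycles = 20.0                 # mean repair time / cycle time

      cycle = (t_bottleneck + t_transport) * (1 + sum(failure_rates) * mean_repair_cycles)
      Q = 3600 / cycle                          # parts per hour
      print(f"productivity: {Q:.1f} parts/h")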

  3. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Directory of Open Access Journals (Sweden)

    Gregory R Johnson

    2015-12-01

    Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  4. Automated Structure Detection in HRTEM Images: An Example with Graphene

    DEFF Research Database (Denmark)

    Kling, Jens; Vestergaard, Jacob Schack; Dahl, Anders Bjorholm

    Graphene, as the forefather of 2D-materials, attracts much attention due to its extraordinary properties like transparency, flexibility and outstandingly high conductivity, together with a thickness of only one atom. The properties seem to be dependent on the atomic structure of graphene and theref...

  5. Automated detection of repeated structures in building facades

    Directory of Open Access Journals (Sweden)

    M. Previtali

    2013-10-01

    Automatic identification of high-level repeated structures in 3D point clouds of building façades is crucial for applications like digitalization and building modelling. Indeed, in many architectural styles building façades are governed by arrangements of objects into repeated patterns. In particular, façades are generally designed as the repetition of a few basic objects organized into interlaced and/or concatenated grid structures. Starting from this key observation, this paper presents an algorithm for Repeated Structure Detection (RSD) in 3D point clouds of building façades. The presented methodology consists of three main phases. First, in the point cloud segmentation stage (i), the building façade is decomposed into planar patches, which are classified by means of some weak prior knowledge of urban buildings formulated in a classification tree. Secondly, in the element clustering phase (ii), detected patches are grouped together by means of a similarity function, and pairwise transformations between patches are computed. Eventually, in the structure regularity estimation step (iii), the parameters of repeated grid patterns are calculated using Least-Squares optimization. The workability of the presented approach is tested on real data from urban scenes.

  6. Aspects of the design of the automated system for code generation of electrical items of technological equipment

    Directory of Open Access Journals (Sweden)

    Erokhin V.V.

    2017-09-01

    The article presents aspects of designing an automated system for generating code for the electrical elements of process equipment using CASE tools, and proposes an iterative technology for developing such systems. The proposed methodology combines Computer Associates' ERwin Data Modeler database development tool with the authors' tool for automatic code generation, ERwin Class Builder. The implemented design tool is a superstructure over ERwin Data Modeler that extends its functionality. ERwin Data Modeler works with logical and physical data models and allows one to generate a description of the database and DDL scripts.

  7. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility – High throughput sample evaluation and automation

    International Nuclear Information System (INIS)

    Theveneau, P; Baker, R; Barrett, R; Beteva, A; Bowler, M W; Carpentier, P; Caserotto, H; Sanctis, D de; Dobias, F; Flot, D; Guijarro, M; Giraud, T; Lentini, M; Leonard, G A; Mattenet, M; McSweeney, S M; Morawe, C; Nurizzo, D; McCarthy, A A; Nanao, M

    2013-01-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This 'first generation' of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small-angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, to the benefit of the entire Structural Biology community.

  8. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Aidan P. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Multiscale Science Dept.; Swiler, Laura P. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Optimization and Uncertainty Quantification Dept.; Trott, Christian R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Scalable Algorithms Dept.; Foiles, Stephen M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Materials and Data Science Dept.; Tucker, Garritt J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Materials and Data Science Dept.; Drexel Univ., Philadelphia, PA (United States). Dept. of Materials Science and Engineering

    2015-03-15

    Here, we present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.
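
    The core of a SNAP-style fit is ordinary weighted linear least squares of energies (and, in practice, forces and stresses) on bispectrum components; the sketch below uses synthetic stand-ins for the descriptors and the QM training data:

      import numpy as np

      # Linear weighted least-squares core of a SNAP-style fit.
      rng = np.random.default_rng(7)
      n_cfg, n_bispec = 500, 30
      B = rng.normal(size=(n_cfg, n_bispec))              # bispectrum components
      beta_true = rng.normal(size=n_bispec)
      E = B @ beta_true + 0.01 * rng.normal(size=n_cfg)   # "QM" energies + noise
      w = np.ones(n_cfg)                                  # per-configuration weights

      sw = np.sqrt(w)
      beta, *_ = np.linalg.lstsq(sw[:, None] * B, sw * E, rcond=None)
      print("max coefficient error:", np.abs(beta - beta_true).max())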

  9. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, A.P., E-mail: athomps@sandia.gov [Multiscale Science Department, Sandia National Laboratories, PO Box 5800, MS 1322, Albuquerque, NM 87185 (United States); Swiler, L.P., E-mail: lpswile@sandia.gov [Optimization and Uncertainty Quantification Department, Sandia National Laboratories, PO Box 5800, MS 1318, Albuquerque, NM 87185 (United States); Trott, C.R., E-mail: crtrott@sandia.gov [Scalable Algorithms Department, Sandia National Laboratories, PO Box 5800, MS 1322, Albuquerque, NM 87185 (United States); Foiles, S.M., E-mail: foiles@sandia.gov [Computational Materials and Data Science Department, Sandia National Laboratories, PO Box 5800, MS 1411, Albuquerque, NM 87185 (United States); Tucker, G.J., E-mail: gtucker@coe.drexel.edu [Computational Materials and Data Science Department, Sandia National Laboratories, PO Box 5800, MS 1411, Albuquerque, NM 87185 (United States); Department of Materials Science and Engineering, Drexel University, Philadelphia, PA 19104 (United States)

    2015-03-15

    We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.

  10. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    International Nuclear Information System (INIS)

    Thompson, A.P.; Swiler, L.P.; Trott, C.R.; Foiles, S.M.; Tucker, G.J.

    2015-01-01

    We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.

  11. Automated generation and ensemble-learned matching of X-ray absorption spectra

    Science.gov (United States)

    Zheng, Chen; Mathew, Kiran; Chen, Chi; Chen, Yiming; Tang, Hanmei; Dozier, Alan; Kas, Joshua J.; Vila, Fernando D.; Rehr, John J.; Piper, Louis F. J.; Persson, Kristin A.; Ong, Shyue Ping

    2018-03-01

    X-ray absorption spectroscopy (XAS) is a widely used materials characterization technique to determine oxidation states, coordination environment, and other local atomic structure information. Analysis of XAS relies on comparison of measured spectra to reliable reference spectra. However, existing databases of XAS spectra are highly limited both in terms of the number of reference spectra available and the breadth of chemistry coverage. In this work, we report the development of XASdb, a large database of computed reference XAS, and an Ensemble-Learned Spectra IdEntification (ELSIE) algorithm for the matching of spectra. XASdb currently hosts more than 800,000 K-edge X-ray absorption near-edge spectra (XANES) for over 40,000 materials from the open-science Materials Project database. We discuss a high-throughput automation framework for FEFF calculations, built on robust, rigorously benchmarked parameters. FEFF is a computer program that uses a real-space Green's function approach to calculate X-ray absorption spectra. We demonstrate that the ELSIE algorithm, which combines 33 weak "learners" comprising a set of preprocessing steps and a similarity metric, can achieve up to 84.2% accuracy in identifying the correct oxidation state and coordination environment of a test set of 19 K-edge XANES spectra encompassing a diverse range of chemistries and crystal structures. XASdb with the ELSIE algorithm has been integrated into a web application in the Materials Project, providing an important new public resource for the analysis of XAS to all materials researchers. Finally, the ELSIE algorithm itself has been made available as part of veidt, an open source machine-learning library for materials science.
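
    One "weak learner" in the ELSIE spirit can be sketched as interpolating two spectra onto a common energy grid, standardizing them, and scoring cosine similarity; this is only a candidate metric, not ELSIE itself, which combines 33 preprocessing/metric variants and votes across them:

      import numpy as np

      def spectrum_similarity(e1, mu1, e2, mu2, n=200):
          """Cosine similarity between two spectra (energies e, absorption mu)
          after interpolation onto a shared grid and standardization."""
          lo, hi = max(e1.min(), e2.min()), min(e1.max(), e2.max())
          grid = np.linspace(lo, hi, n)
          a = np.interp(grid, e1, mu1)
          b = np.interp(grid, e2, mu2)
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    Identification would then rank the database reference spectra by such scores and let the ensemble of learners vote on the best match.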

  12. An automated procedure for covariation-based detection of RNA structure

    International Nuclear Information System (INIS)

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs
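
    Covariation between alignment columns is commonly scored by mutual information, with high MI between distant columns taken as evidence for base pairing; whether this matches the authors' exact statistic is not stated here. A minimal sketch:

      from collections import Counter
      from math import log2

      def mutual_information(col_i, col_j):
          """MI between two alignment columns (one character per sequence)."""
          n = len(col_i)
          pi, pj = Counter(col_i), Counter(col_j)
          pij = Counter(zip(col_i, col_j))
          return sum((c / n) * log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
                     for (a, b), c in pij.items())

      # Two perfectly covarying columns (A-U <-> G-C) score 1 bit:
      print(mutual_information("AAGGAAGG", "UUCCUUCC"))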

  13. An automated procedure for covariation-based detection of RNA structure

    Energy Technology Data Exchange (ETDEWEB)

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs.

  14. Automated analysis of Physarum network structure and dynamics

    Science.gov (United States)

    Fricker, Mark D.; Akita, Dai; Heaton, Luke LM; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki

    2017-06-01

    We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015.

  15. Automated analysis of Physarum network structure and dynamics

    International Nuclear Information System (INIS)

    Fricker, Mark D; Heaton, Luke LM; Akita, Dai; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki

    2017-01-01

    We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015. (paper)

  16. An automated graphics tool for comparative genomics: the Coulson plot generator.

    Science.gov (United States)

    Field, Helen I; Coulson, Richard M R; Field, Mark C

    2013-04-27

    Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. We devised the Coulson Plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is delineated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable portable document format (PDF) or SVG file. CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format, making comparisons and identification of patterns significantly clearer. While the Coulson plot format is highly useful in comparative genomics, its
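
    The matrix-of-pies concept is easy to reproduce; the sketch below draws one pie per taxon whose filled sectors mark present subunits, using invented presence/absence data. CPG itself reads delimited text and writes editable PDF/SVG output, which this sketch does not attempt to replicate.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    data = {  # presence (1) / absence (0) of four subunits per taxon (invented)
        "Taxon A": [1, 1, 1, 1],
        "Taxon B": [1, 0, 1, 0],
        "Taxon C": [0, 0, 1, 1],
    }

    fig, axes = plt.subplots(1, len(data), figsize=(6, 2))
    for ax, (taxon, flags) in zip(axes, data.items()):
        # Equal sectors; a filled sector marks a present subunit, white marks absent.
        colors = ["tab:blue" if f else "white" for f in flags]
        ax.pie(np.ones(len(flags)), colors=colors, wedgeprops={"edgecolor": "black"})
        ax.set_title(taxon, fontsize=8)
    plt.savefig("coulson_sketch.pdf")
    ```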

  17. Automated crack detection in conductive smart-concrete structures using a resistor mesh model

    Science.gov (United States)

    Downey, Austin; D'Alessandro, Antonella; Ubertini, Filippo; Laflamme, Simon

    2018-03-01

    Various nondestructive evaluation techniques are currently used to automatically detect and monitor cracks in concrete infrastructure. However, these methods often lack scalability and cost-effectiveness over large geometries. A solution is the use of self-sensing carbon-doped cementitious materials. These self-sensing materials are capable of providing a measurable change in electrical output that can be related to their damage state. Previous work by the authors showed that a resistor mesh model could be used to track damage in structural components fabricated from electrically conductive concrete, where damage was located through the identification of high-resistance resistors in the mesh model. In this work, an automated damage detection strategy is introduced that places high-value resistors into the previously developed resistor mesh model using a sequential Monte Carlo method. Here, high-value resistors are used to mimic the internal condition of damaged cementitious specimens. The proposed automated damage detection method is experimentally validated using a 500 × 500 × 50 mm3 reinforced cement paste plate doped with multi-walled carbon nanotubes and exposed to 100 identical impact tests. Results demonstrate that the proposed Monte Carlo method is capable of detecting and localizing the most prominent damage in a structure, demonstrating that automated damage detection in smart-concrete structures is a promising strategy for real-time structural health monitoring of civil infrastructure.
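
    The forward model behind such a strategy can be sketched with a graph Laplacian: for a network of conductances, the two-point effective resistance follows from the Laplacian pseudoinverse, and a damaged (high-value) resistor shows up as an elevated measurement. The grid, values, and damage location below are illustrative assumptions, not the authors' experimental configuration or their sequential Monte Carlo code.

    ```python
    import numpy as np

    def effective_resistance(conductance, i, j):
        """Two-point effective resistance of a resistor network.

        conductance: symmetric matrix, conductance[a, b] = 1/R_ab (0 if no resistor).
        """
        L = np.diag(conductance.sum(axis=1)) - conductance   # graph Laplacian
        Lp = np.linalg.pinv(L)
        return Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]

    # 4-node chain of 1-ohm resistors: 0-1-2-3.
    G = np.zeros((4, 4))
    for a, b in [(0, 1), (1, 2), (2, 3)]:
        G[a, b] = G[b, a] = 1.0
    print(effective_resistance(G, 0, 3))   # 3.0 ohm in series

    G[1, 2] = G[2, 1] = 1e-3               # "damage": link 1-2 jumps to 1000 ohm
    print(effective_resistance(G, 0, 3))   # now dominated by the damaged link
    ```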

  18. Automated Generation of the Alaska Coastline Using High-Resolution Satellite Imagery

    Science.gov (United States)

    Roth, G.; Porter, C. C.; Cloutier, M. D.; Clementz, M. E.; Reim, C.; Morin, P. J.

    2015-12-01

    Previous campaigns to map Alaska's coast at high resolution have relied on airborne, marine, or ground-based surveying and manual digitization. The coarse temporal resolution, inability to scale geographically, and high cost of field data acquisition in these campaigns are inadequate for the scale and speed of recent coastal change in Alaska. Here, we leverage the Polar Geospatial Center (PGC) archive of DigitalGlobe, Inc. satellite imagery to produce a state-wide coastline at 2 meter resolution. We first select multispectral imagery based on time and quality criteria. We then extract the near-infrared (NIR) band from each processed image, and classify each pixel as water or land with a pre-determined NIR threshold value. Processing continues with vectorizing the water-land boundary, removing extraneous data, and attaching metadata. Final coastline raster and vector products maintain the original accuracy of the orthorectified satellite data, which is often within the local tidal range. The repeat frequency of coastline production can range from 1 month to 3 years, depending on factors such as satellite capacity, cloud cover, and floating ice. Shadows from trees or structures complicate the output and merit further data cleaning. The PGC's imagery archive, unique expertise, and computing resources enabled us to map the Alaskan coastline in a few months. The DigitalGlobe archive allows us to update this coastline as new imagery is acquired, and facilitates baseline data for studies of coastal change and improvement of topographic datasets. Our results are not simply a one-time coastline, but rather a system for producing multi-temporal, automated coastlines. Workflows and tools produced with this project can be freely distributed and utilized globally. Researchers and government agencies must now consider how they can incorporate and quality-control this high-frequency, high-resolution data to meet their mapping standards and research objectives.
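
    The core classification step reduces to a per-pixel NIR threshold followed by boundary tracing. The sketch below assumes the NIR band is already loaded as a 2-D array (e.g. via rasterio); the threshold value and the simple neighbour-based boundary test are placeholders, not the PGC production workflow.

    ```python
    import numpy as np

    def classify_water(nir_band, threshold=0.15):
        """Water absorbs NIR strongly, so low-reflectance pixels are water."""
        return nir_band < threshold          # True = water, False = land

    def coastline_pixels(water_mask):
        """Land pixels with at least one water neighbour lie on the boundary."""
        shifted = np.zeros_like(water_mask)
        for dy, dx in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            shifted |= np.roll(np.roll(water_mask, dy, axis=0), dx, axis=1)
        return ~water_mask & shifted

    nir = np.random.rand(512, 512)           # stand-in for a real NIR tile
    print(coastline_pixels(classify_water(nir)).sum(), "boundary pixels")
    ```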

  19. Mass Spectra-Based Framework for Automated Structural Elucidation of Metabolome Data to Explore Phytochemical Diversity

    Science.gov (United States)

    Matsuda, Fumio; Nakabayashi, Ryo; Sawada, Yuji; Suzuki, Makoto; Hirai, Masami Y.; Kanaya, Shigehiko; Saito, Kazuki

    2011-01-01

    A novel framework for automated elucidation of metabolite structures in liquid chromatography–mass spectrometer metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied for the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method. PMID:22645535

  20. Mass spectra-based framework for automated structural elucidation of metabolome data to explore phytochemical diversity

    Directory of Open Access Journals (Sweden)

    Fumio Matsuda

    2011-08-01

    A novel framework for automated elucidation of metabolite structures in liquid chromatography-mass spectrometer (LC-MS) metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied for the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method.

  1. Knowledge structure representation and automated updates in intelligent information management systems

    Science.gov (United States)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the large amounts of data entering the IIMS on a daily basis, information updates will need to be performed automatically, with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends, first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability, and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  2. Test generation for hierarchical functional switching structures

    OpenAIRE

    LYULKIN A.; LINNIK I.

    2003-01-01

    A problem of test generation for logic CMOS circuits is solved with regard to an extended class of faults. The well-known D-algorithm for test generation for stuck-at faults is extended to transistor stuck-open faults. It is shown that a test for transistor stuck-open faults may be constructed on the basis of the test for stuck-at faults. The problem of minimizing the length of the constructed test is also discussed.

  3. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Ricardo Andres Pizarro

    2016-12-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI, yielding irreproducible results from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.

  4. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm.

    Science.gov (United States)

    Pizarro, Ricardo A; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A; Goldman, Aaron L; Xiao, Ena; Luo, Qian; Berman, Karen F; Callicott, Joseph H; Weinberger, Daniel R; Mattay, Venkata S

    2016-01-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI yielding irreproducible results, from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed, by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.
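
    In outline, the classification step maps each volume's quality features to a pass/fail label with a supervised SVM. The scikit-learn sketch below uses random stand-ins for the in-house global and ROI features and for the investigator-determined labels.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1457, 20))     # one 20-feature row per 3D-MRI volume (synthetic)
    y = (X[:, 0] + 0.5 * rng.normal(size=1457) > 0).astype(int)   # synthetic labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
    print("accuracy:", clf.score(X_te, y_te))
    ```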

  5. Control room design with new automation structures

    Energy Technology Data Exchange (ETDEWEB)

    Gilson, W

    1984-01-01

    This brochure is concerned with the configuration of modern control rooms, taking new automation structures into account. Control room configuration is treated with reference to new process control systems, from the point of view of the requirements and performance known from process and power-station technology. Apart from general technical and ergonomic considerations, aspects of workload and work stress are dealt with in detail.

  6. The Impact of Automation on Employment: Just the Usual Structural Change?

    Directory of Open Access Journals (Sweden)

    Ben Vermeulen

    2018-05-01

    We study the projected impact of automation on employment in the forthcoming decade, both at the macro-level and in actual (types of) sectors. To this end, we unite an evolutionary economic model of multisectoral structural change with labor economic theory. We thus get a comprehensive framework of how displacement of labor in sectors of application is compensated by intra- and intersectoral countervailing effects and notably mopped up by newly created, labor-intensive sectors. We use several reputable datasets with expert projections on employment in occupations affected by automation (and notably by the introduction of robotics and AI) to pinpoint which sectors and occupations face employment shifts, and how. This reveals how potential job loss due to automation in “applying” sectors is counterbalanced by job creation in “making” sectors as well as in complementary and quaternary, spillover sectors. Finally, we study several macro-level scenarios on employment and find that mankind is facing “the usual structural change” rather than the “end of work”. We provide recommendations on policy instruments that enhance the dynamic efficiency of structural change.

  7. Understanding fine structure constants and three generations

    International Nuclear Information System (INIS)

    Bennett, D.L.; Nielsen, H.B.

    1988-02-01

    We put forward a model inspired by random dynamics that relates the smallness of the gauge coupling constants to the number of generations being 'large'. The new element in the present version of our model is the appearance of a free parameter χ that is a measure of the (presumably relatively minor) importance of a term in the plaquette action proportional to the trace in the (1/6, 2, 3) representation of the Standard Model. Calling N_gen the number of generations, the sets of allowed (N_gen, χN_gen) pairs obtained by imposing the three measured coupling constant values of the Standard Model form three lines. In addition to finding that these lines cross at a single point (as needed for a consistent fit), the intersection occurs with surprising accuracy at the integer N_gen = 3 (thereby predicting exactly three generations). It is also encouraging that the parameter χ turns out to be small and positive as expected. (orig.)

  8. Grayscale lithography-automated mask generation for complex three-dimensional topography

    Science.gov (United States)

    Loomis, James; Ratnayake, Dilan; McKenna, Curtis; Walsh, Kevin M.

    2016-01-01

    Grayscale lithography is a relatively underutilized technique that enables fabrication of three-dimensional (3-D) microstructures in photosensitive polymers (photoresists). By spatially modulating the ultraviolet (UV) dosage during the writing process, one can vary the depth to which the photoresist is developed. This means complex structures and bioinspired designs can readily be produced that would otherwise be cost prohibitive or too time intensive to fabricate. The main barrier to widespread grayscale implementation, however, stems from the laborious generation of the mask files required to create complex surface topography. We present a process and associated software utility for automatically generating grayscale mask files from 3-D models created within industry-standard computer-aided design (CAD) suites. By shifting the microelectromechanical systems (MEMS) design onus to commonly used CAD programs ideal for complex surfacing, engineering professionals already familiar with traditional 3-D CAD software can readily utilize their pre-existing skills to make valuable contributions to the MEMS community. Our conversion process is demonstrated by prototyping several samples on a laser pattern generator, capital equipment already in use in many foundries. Finally, an empirical calibration technique is shown that compensates for nonlinear relationships between UV exposure intensity and photoresist development depth, as well as a thermal reflow technique to help smooth microstructure surfaces.
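
    The empirical calibration step amounts to inverting a measured gray-level-to-depth curve. A minimal sketch with invented calibration numbers is shown below: each target depth in the 3-D model is mapped to the gray level (UV dose) expected to produce it.

    ```python
    import numpy as np

    # Calibration wafer: gray level written vs. resist depth developed (um, nonlinear).
    gray_levels = np.array([0, 32, 64, 96, 128, 160, 192, 224, 255])
    depths_um   = np.array([0.0, 0.3, 0.9, 1.8, 3.0, 4.4, 5.9, 7.4, 9.0])

    def depth_to_gray(target_depth_um):
        """Invert the monotonic calibration curve by linear interpolation."""
        return np.interp(target_depth_um, depths_um, gray_levels)

    heightmap = np.random.rand(256, 256) * 9.0   # stand-in for a CAD-derived depth map
    mask = depth_to_gray(heightmap).astype(np.uint8)
    print(mask.min(), mask.max())
    ```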

  9. Speciation analysis of arsenic in biological matrices by automated hydride generation-cryotrapping-atomic absorption spectrometry with multiple microflame quartz tube atomizer (multiatomizer).

    Science.gov (United States)

    This paper describes an automated system for the oxidation state specific speciation of inorganic and methylated arsenicals by selective hydride generation - cryotrapping - gas chromatography - atomic absorption spectrometry with the multiatomizer. The corresponding arsines are ge...

  10. A novel automated direct measurement method for total antioxidant capacity using a new generation, more stable ABTS radical cation.

    Science.gov (United States)

    Erel, Ozcan

    2004-04-01

    To develop a novel colorimetric and automated direct measurement method for total antioxidant capacity (TAC), a new generation, more stable, colored 2,2'-azinobis(3-ethylbenzothiazoline-6-sulfonic acid) radical cation (ABTS(*+)) was employed. The ABTS(*+) is decolorized by antioxidants according to their concentrations and antioxidant capacities. This change in color is measured as a change in absorbance at 660 nm. The process is applied to an automated analyzer and the assay is calibrated with Trolox. The novel assay is linear up to 6 mmol Trolox equivalent/l, its precision values are lower than 3%, and there is no interference from hemoglobin, bilirubin, EDTA, or citrate. The method developed is significantly correlated with the Randox total antioxidant status (TAS) assay (r = 0.897, P < 0.001) and provides a direct measure of total antioxidant capacity.
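
    The underlying arithmetic is a linear Trolox standard curve relating the absorbance decrease at 660 nm to antioxidant concentration. The sketch below shows the conversion with invented calibration values.

    ```python
    import numpy as np

    # Trolox standards (mmol/l) and the absorbance drop each produces (invented).
    trolox    = np.array([0.0, 1.0, 2.0, 4.0, 6.0])
    delta_abs = np.array([0.00, 0.11, 0.22, 0.45, 0.66])   # linear up to 6 mmol/l

    slope, intercept = np.polyfit(trolox, delta_abs, 1)

    def tac_mmol_per_l(abs_before, abs_after):
        """Total antioxidant capacity as Trolox equivalents."""
        return ((abs_before - abs_after) - intercept) / slope

    print(round(tac_mmol_per_l(1.20, 0.87), 2), "mmol Trolox equivalent/l")
    ```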

  11. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments. These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost-effective generation of reliable kinetic models useful for bioconversion process design, illustrated for a bioconversion yielding L-erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 mu L scale. The derived kinetic parameters were then verified in a second round of experiments, where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original experimental design.
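
    As a concrete example of turning microwell measurements into a kinetic model, the sketch below fits a Michaelis-Menten rate expression to invented initial-rate data with scipy; the paper's actual kinetic model and model-driven design procedure are richer than this stand-in.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(s, vmax, km):
        return vmax * s / (km + s)

    s = np.array([0.5, 1, 2, 5, 10, 20, 50])            # substrate, mM (invented)
    v = np.array([0.9, 1.6, 2.5, 3.8, 4.5, 4.9, 5.1])   # initial rate, mM/min (invented)

    (vmax, km), cov = curve_fit(michaelis_menten, s, v, p0=[5.0, 2.0])
    print(f"Vmax = {vmax:.2f} mM/min, Km = {km:.2f} mM")
    # The parameter covariance (cov) can then drive model-directed experimental
    # design, e.g. choosing the next substrate concentrations to measure.
    ```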

  12. Automated Generation of Phase Diagrams for Binary Systems with Azeotropic Behavior

    DEFF Research Database (Denmark)

    Cismondi, Martin; Michelsen, Michael Locht; Zabaloy, Marcelo S.

    2008-01-01

    In this work, we propose a computational strategy and methods for the automated calculation of complete loci of homogeneous azeotropy of binary mixtures and the related Pxy and Txy diagrams for models of the equation-of-state (EOS) type. The strategy consists of first finding the system...

  13. Planning linear construction projects: automated method for the generation of earthwork activities

    NARCIS (Netherlands)

    Askew, W.H.; Al-Jibouri, Saad H.S.; Mawdesley, M.J.; Patterson, D.E.

    2002-01-01

    Earthworks planning for road construction projects is a complex operation and the planning rules used are usually intuitive and not well defined. An approach to automate the earthworks planning process is described and the basic techniques that are used are outlined. A computer-based system has been

  14. Automated brain structure segmentation based on atlas registration and appearance models

    DEFF Research Database (Denmark)

    van der Lijn, Fedde; de Bruijne, Marleen; Klein, Stefan

    2012-01-01

    Accurate automated brain structure segmentation methods facilitate the analysis of large-scale neuroimaging studies. This work describes a novel method for brain structure segmentation in magnetic resonance images that combines information about a structure's location and appearance. The method was validated on data acquired with different magnetic resonance sequences, in which the hippocampus and cerebellum were segmented by an expert. Furthermore, the method is compared to two other segmentation techniques that were applied to the same data. Results show that the atlas- and appearance-based method produces accurate results

  15. Automated segmentation of pulmonary structures in thoracic computed tomography scans: a review

    International Nuclear Information System (INIS)

    Van Rikxoort, Eva M; Van Ginneken, Bram

    2013-01-01

    Computed tomography (CT) is the modality of choice for imaging the lungs in vivo. Sub-millimeter isotropic images of the lungs can be obtained within seconds, allowing the detection of small lesions and detailed analysis of disease processes. The high resolution of thoracic CT and the high prevalence of lung diseases require a high degree of automation in the analysis pipeline. The automated segmentation of pulmonary structures in thoracic CT has been an important research topic for over a decade now. This systematic review provides an overview of current literature. We discuss segmentation methods for the lungs, the pulmonary vasculature, the airways, including airway tree construction and airway wall segmentation, the fissures, the lobes and the pulmonary segments. For each topic, the current state of the art is summarized, and topics for future research are identified. (topical review)

  16. Analysis of Numerically Generated Wake Structures

    DEFF Research Database (Denmark)

    Ivanell, S.; Sørensen, Jens Nørkær; Mikkelsen, Robert Flemming

    2009-01-01

    In the actuator-line method, the blades are represented by lines along which body forces representing the loading are introduced. The body forces are determined by computing local angles of attack and using tabulated aerofoil coefficients. The advantage of using the actuator-line technique is that it is not necessary to resolve blade boundary layers; instead, the computational resources are devoted to simulating the dynamics of the flow structures. In the present study, approximately 5 million mesh points are used to resolve the wake structure in a 120-degree domain behind the turbine. The results from

  17. Automated Clustering Analysis of Immunoglobulin Sequences in Chronic Lymphocytic Leukemia Based on 3D Structural Descriptors

    DEFF Research Database (Denmark)

    Marcatili, Paolo; Mochament, Konstantinos; Agathangelidis, Andreas

    2016-01-01

    The experimental techniques used to determine the 3D structure of immunoglobulins (Igs) are extremely laborious and demanding. Hence, the ability to gain insight into the structure of Igs at large relies on the availability of tools and algorithms for producing accurate Ig structural models based on their primary sequence alone. These models can then be used to determine structural similarities between Igs. Previous efforts to achieve an optimal solution to this task were hindered mainly by the lack of efficient clustering methods based on the similarity of 3D structure descriptors. Here, we present a novel workflow for robust Ig 3D modeling and automated clustering. In this study, we used the structure prediction tools PIGS and I-TASSER for creating the 3D models and the TM-align algorithm to superpose them. The innovation of the current methodology resides in the usage of methods adapted from 3D content-based search methodologies to determine the local structural similarity of the models. We validated our protocol in chronic lymphocytic leukemia.

  18. Nano Structured Devices for Energy Generation

    DEFF Research Database (Denmark)

    Radziwon, Michal Jędrzej

    fluorescence polarimetry and X-ray diffractometry (XRD). Layer thicknesses of inverted α-6T / C60 bilayer organic solar cells fabricated at room temperature were optimized to obtain the model device for the performance enhancement studies. By variation of the substrate temperature during deposition of α-6T, the structures

  19. DEEP LEARNING AND IMAGE PROCESSING FOR AUTOMATED CRACK DETECTION AND DEFECT MEASUREMENT IN UNDERGROUND STRUCTURES

    Directory of Open Access Journals (Sweden)

    F. Panella

    2018-05-01

    This work presents the combination of Deep Learning (DL) and image processing to produce an automated crack recognition and defect measurement tool for civil structures. The authors focus on tunnel structures and their survey, and have developed an end-to-end tool for asset management of underground structures. In order to maintain the serviceability of tunnels, regular inspection is needed to assess their structural status. The traditional method of carrying out the survey is visual inspection: simple, but slow and relatively expensive, and the quality of the output depends on the ability and experience of the engineer as well as on the total workload (stress and tiredness may influence the ability to observe and record information). As a result of these issues, in the last decade there has been a desire to automate the monitoring using new methods of inspection. The present paper has the goal of combining DL with traditional image processing to create a tool able to detect, locate and measure structural defects.
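
    The deep-learning component can be pictured as a small convolutional classifier over image patches, with classical image processing applied afterwards to measure the detected defects. The PyTorch sketch below is an illustrative stand-in, not the authors' architecture.

    ```python
    import torch
    import torch.nn as nn

    class CrackPatchNet(nn.Module):
        """Toy CNN that labels 64x64 grayscale patches as crack / background."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 16 * 16, 2)   # 64 -> 32 -> 16 after pooling

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = CrackPatchNet()
    patches = torch.randn(8, 1, 64, 64)   # stand-in for tunnel-lining image patches
    print(model(patches).argmax(dim=1))   # 1 = crack, 0 = background (by convention)
    ```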

  20. Transire, a Program for Generating Solid-State Interface Structures

    Science.gov (United States)

    2017-09-14

    ARL-TR-8134, September 2017, US Army Research Laboratory: Transire, a Program for Generating Solid-State Interface Structures, by Caleb M Carlin and Berend C Rinderspacher, Weapons and Materials Research Directorate, ARL.

  1. Second harmonic generation in resonant optical structures

    Science.gov (United States)

    Eichenfield, Matt; Moore, Jeremy; Friedmann, Thomas A.; Olsson, Roy H.; Wiwi, Michael; Padilla, Camille; Douglas, James Kenneth; Hattar, Khalid Mikhiel

    2018-01-09

    An optical second-harmonic generator (or spontaneous parametric down-converter) includes a microresonator formed of a nonlinear optical medium. The microresonator supports at least two modes that can be phase matched at different frequencies so that light can be converted between them: A first resonant mode having substantially radial polarization and a second resonant mode having substantially vertical polarization. The first and second modes have the same radial order. The thickness of the nonlinear medium is less than one-half the pump wavelength within the medium.

  2. Automated WWER steam generator eddy current testing and plugging control system

    International Nuclear Information System (INIS)

    Gorecan, I.; Gortan, K.; Grzalja, I.

    2004-01-01

    The structural architecture of the system contains three main components, described as follows: the Manipulator Guidance System, the Eddy Current Testing System, and the Plugging System. The manipulator system has the task of positioning the end-effectors at the desired tube position. When the final position is reached, the Eddy Current testing system performs data acquisition. If defects are found, the plugging system installs a tube plug. Each system is composed of three layers. The first layer is the hardware layer, consisting of the motors driving the effectors along with the sensors needed to obtain positioning data, the pusher motors used to push the test probes into the tubes of the WWER steam generator, and the plugging hardware tool. The second layer is the control box, performing basic monitoring and control routines as an interconnection between the first and third layers. The highest layer is the control software, running on a PC, which is used as a human-machine interface. (author)

  3. Automated methodology for estimating waste streams generated from decommissioning contaminated facilities

    International Nuclear Information System (INIS)

    Toth, J.J.; King, D.A.; Humphreys, K.K.; Haffner, D.R.

    1994-01-01

    As part of the DOE Programmatic Environmental Impact Statement (PEIS), a viable way to determine aggregate waste volumes, cost, and direct labor hours for decommissioning and decontaminating facilities is required. In this paper, a methodology is provided for determining waste streams, cost and direct labor hours from remediation of contaminated facilities. The method is developed utilizing U.S. facility remediation data and information from several decommissioning programs, including reactor decommissioning projects. The method provides for rapid, consistent analysis for many facility types. Three remediation scenarios are considered for facility D&D: unrestricted land use, semi-restricted land use, and restricted land use. Unrestricted land use involves removing radioactive components, decontaminating the building surfaces, and demolishing the remaining structure. Semi-restricted land use involves removing transuranic contamination and immobilizing the contamination on-site. Restricted land use involves removing the transuranic contamination and leaving the building standing. In both semi-restricted and restricted land use scenarios, verification of containment with environmental monitoring is required. To use the methodology, facilities are placed in a building category depending upon the level of contamination, construction design, and function of the building. Unit volume and unit area waste generation factors are used to calculate waste volumes and estimate the amount of waste generated in each of the following classifications: low-level, transuranic, and hazardous waste. Unit factors for cost and labor hours are also applied to the result to estimate D&D cost and labor hours
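
    The unit-factor calculation itself is simple arithmetic: a building's floor area multiplied by per-category generation factors yields waste-volume, cost, and labor estimates. The sketch below uses invented factor values purely for illustration.

    ```python
    building = {"category": "contaminated lab", "floor_area_m2": 2500.0}

    unit_factors = {  # per m2 of floor area, by classification (invented values)
        "low_level_m3": 0.40, "transuranic_m3": 0.02, "hazardous_m3": 0.05,
        "cost_usd": 950.0, "labor_hours": 6.5,
    }

    estimate = {k: v * building["floor_area_m2"] for k, v in unit_factors.items()}
    for k, v in estimate.items():
        print(f"{k}: {v:,.1f}")
    ```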

  4. Rule-based programming and strategies for automated generation of detailed kinetic models for gas phase combustion of polycyclic hydrocarbon molecules

    Energy Technology Data Exchange (ETDEWEB)

    Ibanescu, L.

    2004-06-15

    The primary objective of this thesis is to explore the approach of using rule-based systems and strategies, for a complex problem of chemical kinetic: the automated generation of reaction mechanisms. The chemical reactions are naturally expressed as conditional rewriting rules. The control of the chemical reactions chaining is easy to describe using a strategies language, such as the one of the ELAN system, developed in the Protheo team. The thesis presents the basic concepts of the chemical kinetics, the chemical and computational problems related to the conception and validation of a reaction mechanism, and gives a general structure for the generator of reaction mechanisms called GasEI. Our research focuses on the primary mechanism generator. We give solutions for encoding the chemical species, the reactions and their chaining, and we present the prototype developed in ELAN. The representation of the chemical species uses the notion of molecular graphs, encoded by a term structure called GasEI terms. The chemical reactions are expressed by rewriting rules on molecular graphs, encoded by a set of conditional rewriting rules on GasEI terms. The strategies language of the ELAN system is used to express the reactions chaining in the primary mechanism generator. This approach is illustrated by coding ten generic reactions of the oxidizing pyrolysis. Qualitative chemical validations of the prototype show that our approach gives, for acyclic molecules, the same results as the existing mechanism generators, and for polycyclic molecules produces original results.
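
    The encoding of reactions as conditional rewriting rules can be shown in miniature: a species is a term, and a rule rewrites every matching subterm whose side condition holds. The sketch below is a deliberately toy, string-based illustration of the concept; GasEI itself operates on molecular-graph terms in ELAN, with a strategies language controlling how rules are chained.

    ```python
    import re

    def apply_rule(species, pattern, replacement, condition):
        """Apply one rewriting rule at every match site where its condition holds."""
        products = []
        for m in re.finditer(pattern, species):
            if condition(species):
                products.append(species[:m.start()] + replacement + species[m.end():])
        return products

    # Toy hydrogen abstraction: a terminal CH3 group loses an H, leaving a radical (*).
    h_abstraction = (r"CH3", "CH2*", lambda s: "*" not in s)   # condition: no di-radicals
    print(apply_rule("CH3-CH2-CH3", *h_abstraction))
    # ['CH2*-CH2-CH3', 'CH3-CH2-CH2*']
    ```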

  5. Rule-based programming and strategies for automated generation of detailed kinetic models for gas phase combustion of polycyclic hydrocarbon molecules

    Energy Technology Data Exchange (ETDEWEB)

    Ibanescu, L

    2004-06-15

    The primary objective of this thesis is to explore the approach of using rule-based systems and strategies, for a complex problem of chemical kinetic: the automated generation of reaction mechanisms. The chemical reactions are naturally expressed as conditional rewriting rules. The control of the chemical reactions chaining is easy to describe using a strategies language, such as the one of the ELAN system, developed in the Protheo team. The thesis presents the basic concepts of the chemical kinetics, the chemical and computational problems related to the conception and validation of a reaction mechanism, and gives a general structure for the generator of reaction mechanisms called GasEI. Our research focuses on the primary mechanism generator. We give solutions for encoding the chemical species, the reactions and their chaining, and we present the prototype developed in ELAN. The representation of the chemical species uses the notion of molecular graphs, encoded by a term structure called GasEI terms. The chemical reactions are expressed by rewriting rules on molecular graphs, encoded by a set of conditional rewriting rules on GasEI terms. The strategies language of the ELAN system is used to express the reactions chaining in the primary mechanism generator. This approach is illustrated by coding ten generic reactions of the oxidizing pyrolysis. Qualitative chemical validations of the prototype show that our approach gives, for acyclic molecules, the same results as the existing mechanism generators, and for polycyclic molecules produces original results.

  6. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing

    OpenAIRE

    Wittbrodt, Jonas N.; Liebel, Urban; Gehrig, Jochen

    2014-01-01

    Background The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. ...

  7. THE AUTOMATED GENERATION OF ENGINEERING KNOWLEDGE USING A DIGITAL ENGINEERING TOOL: AN INDUSTRIAL EVALUATION CASE STUDY

    OpenAIRE

    RAYMOND CW SUNG; JAMES M RITCHIE; THEODORE LIM; YING LIU; ZOE KOSMADOUDI

    2012-01-01

    In a knowledge-based economy, it will be crucial to capture expertise and rationale in working environments of all kinds, as the need develops to understand how people work, the intuitive processes they use as they carry out tasks and make decisions, and to determine the most effective methods and rationales for solving problems. Key outputs from this will be the capability to automate decision making activities and to support training and learning in competitive business environm...

  8. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    Science.gov (United States)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task for planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that need to be done manually or using semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, we need a geometric description of structural elements in order to obtain a structural model consisting of beam axes and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a square cross section and no bending. Deformations or damages such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.

  9. Automated determination of size and morphology information from soot transmission electron microscope (TEM)-generated images

    International Nuclear Information System (INIS)

    Wang, Cheng; Chan, Qing N.; Zhang, Renlin; Kook, Sanghoon; Hawkes, Evatt R.; Yeoh, Guan H.; Medwell, Paul R.

    2016-01-01

    The thermophoretic sampling of particulates from hot media, coupled with transmission electron microscope (TEM) imaging, is a combined approach that is widely used to derive morphological information. The identification and the measurement of the particulates, however, can be complex when the TEM images are of low contrast, noisy, and have non-uniform background signal level. The image processing method can also be challenging and time consuming, when the samples collected have large variability in shape and size, or have some degree of overlapping. In this work, a three-stage image processing sequence is presented to facilitate time-efficient automated identification and measurement of particulates from the TEM grids. The proposed processing sequence is first applied to soot samples that were thermophoretically sampled from a laminar non-premixed ethylene-air flame. The parameter values that are required to be set to facilitate the automated process are identified, and sensitivity of the results to these parameters is assessed. The same analysis process is also applied to soot samples that were acquired from an externally irradiated laminar non-premixed ethylene-air flame, which have different geometrical characteristics, to assess the morphological dependence of the proposed image processing sequence. Using the optimized parameter values, statistical assessments of the automated results reveal that the largest discrepancies that are associated with the estimated values of primary particle diameter, fractal dimension, and prefactor values of the aggregates for the tested cases, are approximately 3, 1, and 10 %, respectively, when compared with the manual measurements.

  10. Automated determination of size and morphology information from soot transmission electron microscope (TEM)-generated images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Cheng; Chan, Qing N., E-mail: qing.chan@unsw.edu.au; Zhang, Renlin; Kook, Sanghoon; Hawkes, Evatt R.; Yeoh, Guan H. [UNSW, School of Mechanical and Manufacturing Engineering (Australia); Medwell, Paul R. [The University of Adelaide, Centre for Energy Technology (Australia)

    2016-05-15

    The thermophoretic sampling of particulates from hot media, coupled with transmission electron microscope (TEM) imaging, is a combined approach that is widely used to derive morphological information. The identification and the measurement of the particulates, however, can be complex when the TEM images are of low contrast, noisy, and have non-uniform background signal level. The image processing method can also be challenging and time consuming, when the samples collected have large variability in shape and size, or have some degree of overlapping. In this work, a three-stage image processing sequence is presented to facilitate time-efficient automated identification and measurement of particulates from the TEM grids. The proposed processing sequence is first applied to soot samples that were thermophoretically sampled from a laminar non-premixed ethylene-air flame. The parameter values that are required to be set to facilitate the automated process are identified, and sensitivity of the results to these parameters is assessed. The same analysis process is also applied to soot samples that were acquired from an externally irradiated laminar non-premixed ethylene-air flame, which have different geometrical characteristics, to assess the morphological dependence of the proposed image processing sequence. Using the optimized parameter values, statistical assessments of the automated results reveal that the largest discrepancies that are associated with the estimated values of primary particle diameter, fractal dimension, and prefactor values of the aggregates for the tested cases, are approximately 3, 1, and 10 %, respectively, when compared with the manual measurements.

  11. PINE-SPARKY.2 for automated NMR-based protein structure research.

    Science.gov (United States)

    Lee, Woonghee; Markley, John L

    2018-05-01

    Nuclear magnetic resonance (NMR) spectroscopy, along with X-ray crystallography and cryoelectron microscopy, is one of the three major tools that enable the determination of atomic-level structural models of biological macromolecules. Of these, NMR has the unique ability to follow important processes in solution, including conformational changes, internal dynamics and protein-ligand interactions. As a means for facilitating the handling and analysis of spectra involved in these types of NMR studies, we have developed PINE-SPARKY.2, a software package that integrates and automates discrete tasks that previously required interaction with separate software packages. The graphical user interface of PINE-SPARKY.2 simplifies chemical shift assignment and verification, automated detection of secondary structural elements, predictions of flexibility and hydrophobic cores, and calculation of three-dimensional structural models. PINE-SPARKY.2 is available in the latest version of NMRFAM-SPARKY from the National Magnetic Resonance Facility at Madison (http://pine.nmrfam.wisc.edu/download_packages.html), the NMRbox Project (https://nmrbox.org) and to subscribers to the SBGrid (https://sbgrid.org). For a detailed description of the program, see http://www.nmrfam.wisc.edu/pine-sparky2.htm. whlee@nmrfam.wisc.edu or markley@nmrfam.wisc.edu. Supplementary data are available at Bioinformatics online.

  12. Automated delineation of brain structures in patients undergoing radiotherapy for primary brain tumors: From atlas to dose–volume histograms

    International Nuclear Information System (INIS)

    Conson, Manuel; Cella, Laura; Pacelli, Roberto; Comerci, Marco; Liuzzi, Raffaele; Salvatore, Marco; Quarantelli, Mario

    2014-01-01

    Purpose: To implement and evaluate a magnetic resonance imaging atlas-based automated segmentation (MRI-ABAS) procedure for cortical and sub-cortical grey matter areas definition, suitable for dose-distribution analyses in brain tumor patients undergoing radiotherapy (RT). Patients and methods: 3T-MRI scans performed before RT in ten brain tumor patients were used. The MRI-ABAS procedure consists of grey matter classification and atlas-based regions of interest definition. The Simultaneous Truth and Performance Level Estimation (STAPLE) algorithm was applied to structures manually delineated by four experts to generate the standard reference. Performance was assessed by comparing multiple geometrical metrics (including the Dice Similarity Coefficient, DSC). Dosimetric parameters from dose–volume histograms were also generated and compared. Results: Compared with manual delineation, MRI-ABAS showed excellent reproducibility [median DSC_ABAS = 1 (95% CI, 0.97–1.0) vs. DSC_MANUAL = 0.90 (0.73–0.98)], acceptable accuracy [DSC_ABAS = 0.81 (0.68–0.94) vs. DSC_MANUAL = 0.90 (0.76–0.98)], and an overall 90% reduction in delineation time. Dosimetric parameters obtained using MRI-ABAS were comparable with those obtained by manual contouring. Conclusions: The speed, reproducibility, and robustness of the process make MRI-ABAS a valuable tool for investigating radiation dose–volume effects in non-target brain structures, providing additional standardized data without additional time-consuming procedures
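
    For reference, the agreement metric reported above can be computed in a few lines; a minimal sketch of the Dice Similarity Coefficient between two binary segmentation masks:

    ```python
    import numpy as np

    def dice(a, b):
        """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    auto = np.zeros((64, 64), dtype=bool);   auto[20:40, 20:40] = True
    manual = np.zeros((64, 64), dtype=bool); manual[22:42, 20:40] = True
    print(round(dice(auto, manual), 3))      # 0.9 for these two shifted squares
    ```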

  13. Automating crystallographic structure solution and refinement of protein–ligand complexes

    International Nuclear Information System (INIS)

    Echols, Nathaniel; Moriarty, Nigel W.; Klei, Herbert E.; Afonine, Pavel V.; Bunkóczi, Gábor; Headd, Jeffrey J.; McCoy, Airlie J.; Oeffner, Robert D.; Read, Randy J.; Terwilliger, Thomas C.; Adams, Paul D.

    2013-01-01

    A software system for automated protein–ligand crystallography has been implemented in the Phenix suite. This significantly reduces the manual effort required in high-throughput crystallographic studies. High-throughput drug-discovery and mechanistic studies often require the determination of multiple related crystal structures that differ only in the bound ligands, point mutations in the protein sequence, and minor conformational changes. If performed manually, structure solution and refinement require extensive repetition of the same tasks for each structure. To accelerate this process and minimize manual effort, a pipeline encompassing all stages of ligand building and refinement, starting from integrated and scaled diffraction intensities, has been implemented in Phenix. The resulting system is able to successfully solve and refine large collections of structures in parallel without extensive user intervention prior to the final stages of model completion and validation

  14. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Tuszynski, Tobias; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Seese, Anita; Barthel, Henryk [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Rullmann, Michael; Hesse, Swen; Sabri, Osama [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Leipzig University Medical Centre, Integrated Treatment and Research Centre (IFB) Adiposity Diseases, Leipzig (Germany); Gertz, Hermann-Josef [Leipzig University Medical Centre, Department of Psychiatry, Leipzig (Germany); Lobsien, Donald [Leipzig University Medical Centre, Department of Neuroradiology, Leipzig (Germany)

    2016-06-15

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis. (orig.)
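
    The SUVR arithmetic common to all of the compared workflows is a ratio of mean uptakes: mean tracer signal in the target VOI divided by mean signal in the cerebellar-cortex reference VOI. The sketch below uses random stand-ins for a coregistered PET volume and two boolean VOI masks.

    ```python
    import numpy as np

    def suvr(pet, target_mask, reference_mask):
        """Standardized uptake value ratio of a target VOI against a reference VOI."""
        return pet[target_mask].mean() / pet[reference_mask].mean()

    pet = np.random.rand(64, 64, 64) + 1.0                     # stand-in PET volume
    target = np.zeros_like(pet, dtype=bool);     target[10:20, 10:20, 10:20] = True
    cerebellum = np.zeros_like(pet, dtype=bool); cerebellum[40:50, 40:50, 5:15] = True
    print(round(suvr(pet, target, cerebellum), 3))
    ```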

  15. Codon-Precise, Synthetic, Antibody Fragment Libraries Built Using Automated Hexamer Codon Additions and Validated through Next Generation Sequencing

    Directory of Open Access Journals (Sweden)

    Laura Frigotto

    2015-05-01

    We have previously described ProxiMAX, a technology that enables the fabrication of precise, combinatorial gene libraries via codon-by-codon saturation mutagenesis. ProxiMAX was originally performed using manual, enzymatic transfer of codons via blunt-end ligation. Here we present Colibra™: an automated, proprietary version of ProxiMAX used specifically for antibody library generation, in which double-codon hexamers are transferred during the saturation cycling process. The reduction in process complexity, resulting library quality and an unprecedented saturation of up to 24 contiguous codons are described. Utility of the method is demonstrated via fabrication of complementarity determining regions (CDR) in antibody fragment libraries and next generation sequencing (NGS) analysis of their quality and diversity.

  16. Variable structure unit vector control of electric power generation ...

    African Journals Online (AJOL)

    A variable structure Automatic Generation Control (VSAGC) scheme is proposed in this paper for the control of a single-area power system model dominated by steam-powered electric generating plants. Unlike existing VSAGC schemes, where the selection of the control function is based on a trial and error procedure, the ...

  17. Dynamic analysis of CHASNUPP steam generator structure during shipping

    International Nuclear Information System (INIS)

    Han Liangbi; Xu Jinkang; Zhou Meiwu; He Yinbiao

    1998-07-01

    The dynamic analysis of the CHASNUPP steam generator during shipping is described, including the simplified mathematical model, the acceleration power spectrum of ocean-wave-induced random vibration, the dynamic analysis of the steam generator structure under random loading, the applied computer code and the calculated results.

  18. ProDaMa: an open source Python library to generate protein structure datasets.

    Science.gov (United States)

    Armano, Giuliano; Manconi, Andrea

    2009-10-02

    The huge difference between the number of known sequences and known tertiary structures has justified the use of automated methods for protein analysis. Although a general methodology to solve these problems has not yet been devised, researchers are engaged in developing more accurate techniques and algorithms whose training plays a relevant role in determining their performance. From this perspective, particular importance is given to the training data used in experiments, and researchers are often engaged in the generation of specialized datasets that meet their requirements. To facilitate the task of generating specialized datasets we devised and implemented ProDaMa, an open source Python library that provides classes for retrieving, organizing, updating, analyzing, and filtering protein data. ProDaMa has been used to generate specialized datasets useful for secondary structure prediction and to develop a collaborative web application aimed at generating and sharing protein structure datasets. The library, the related database, and the documentation are freely available at the URL http://iasc.diee.unica.it/prodama.

  19. ProDaMa: an open source Python library to generate protein structure datasets

    Directory of Open Access Journals (Sweden)

    Manconi Andrea

    2009-10-01

    Full Text Available Abstract Background The huge difference between the number of known sequences and known tertiary structures has justified the use of automated methods for protein analysis. Although a general methodology to solve these problems has not yet been devised, researchers are engaged in developing more accurate techniques and algorithms whose training plays a relevant role in determining their performance. From this perspective, particular importance is given to the training data used in experiments, and researchers are often engaged in the generation of specialized datasets that meet their requirements. Findings To facilitate the task of generating specialized datasets we devised and implemented ProDaMa, an open source Python library that provides classes for retrieving, organizing, updating, analyzing, and filtering protein data. Conclusion ProDaMa has been used to generate specialized datasets useful for secondary structure prediction and to develop a collaborative web application aimed at generating and sharing protein structure datasets. The library, the related database, and the documentation are freely available at the URL http://iasc.diee.unica.it/prodama.
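
    To make the dataset-generation workflow concrete, the following minimal sketch filters a collection of protein records by experimental method, resolution and chain length, the kind of selection criteria such specialized datasets are built from. It is written in plain Python; the record fields and thresholds are illustrative assumptions and deliberately do not reproduce ProDaMa's actual class API.

```python
from dataclasses import dataclass

@dataclass
class ProteinRecord:
    pdb_id: str
    method: str        # e.g. "X-RAY DIFFRACTION" or "SOLUTION NMR"
    resolution: float  # in angstroms; NMR entries carry no meaningful value
    sequence: str

def build_dataset(records, max_resolution=2.5, min_length=50):
    """Keep high-resolution X-ray structures that are long enough to serve
    as training data, e.g. for secondary structure prediction."""
    return [
        r for r in records
        if r.method == "X-RAY DIFFRACTION"
        and r.resolution <= max_resolution
        and len(r.sequence) >= min_length
    ]

records = [
    ProteinRecord("1ABC", "X-RAY DIFFRACTION", 1.8, "M" * 120),
    ProteinRecord("2DEF", "SOLUTION NMR", 0.0, "M" * 90),
    ProteinRecord("3GHI", "X-RAY DIFFRACTION", 3.1, "M" * 200),
]
print([r.pdb_id for r in build_dataset(records)])  # -> ['1ABC']
```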

  20. Deep generative learning for automated EHR diagnosis of traditional Chinese medicine.

    Science.gov (United States)

    Liang, Zhaohui; Liu, Jun; Ou, Aihua; Zhang, Honglai; Li, Ziping; Huang, Jimmy Xiangji

    2018-05-04

    Computer-aided medical decision-making (CAMDM) is the method of utilizing massive EMR data as both empirical and evidence support for the decision procedures of healthcare activities. Well-developed information infrastructure, such as hospital information systems and disease surveillance systems, provides abundant data for CAMDM. However, the complexity of EMR data with abstract medical knowledge makes conventional models incompetent for the analysis. Thus a deep belief network (DBN)-based model is proposed to simulate the information analysis and decision-making procedure in medical practice. The purpose of this paper is to evaluate a deep learning architecture as an effective solution for CAMDM. A two-step model is applied in our study. In the first step, an optimized seven-layer deep belief network (DBN) is applied as an unsupervised learning algorithm to perform model training and acquire feature representations. Then a support vector machine model is applied on top of the DBN in the second, supervised learning step. There are two data sets used in the experiments. One is a plain text data set indexed by medical experts. The other is a structured dataset on primary hypertension. The data are randomly divided to generate the training set for the unsupervised learning and the testing set for the supervised learning. The model performance is evaluated by the statistics of mean and variance, and the average precision and coverage on the data sets. Two conventional shallow models (support vector machine/SVM and decision tree/DT) are applied as comparisons to show the superiority of our proposed approach. The deep learning (DBN + SVM) model outperforms the simple SVM and DT on the two data sets in terms of all the evaluation measures, which confirms our motivation that the deep model is good at capturing the key features with less dependence on manually built indexes. Our study shows the two-step deep learning model achieves high performance for medical
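
    A minimal sketch of the two-step idea with scikit-learn: an unsupervised layer learns a feature representation and an SVM is then trained on the transformed features. A single BernoulliRBM stands in for the paper's optimized seven-layer DBN, and the random arrays are placeholders for EMR-derived features, so this illustrates only the shape of the pipeline, not the authors' architecture.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 50))          # placeholder for 50 EMR-derived features
y = rng.integers(0, 2, size=200)   # placeholder diagnosis labels

model = Pipeline([
    ("scale", MinMaxScaler()),     # RBM expects inputs in [0, 1]
    ("rbm", BernoulliRBM(n_components=32, n_iter=20, random_state=0)),  # unsupervised step
    ("svm", SVC(kernel="rbf")),    # supervised step on the learned features
])
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```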

  1. Automated EEG sleep staging in the term-age baby using a generative modelling approach

    Science.gov (United States)

    Pillay, Kirubin; Dereymaeker, Anneleen; Jansen, Katrien; Naulaers, Gunnar; Van Huffel, Sabine; De Vos, Maarten

    2018-06-01

    Objective. We develop a method for automated four-state sleep classification of preterm and term-born babies at term-age of 38-40 weeks postmenstrual age (the age since the last menstrual cycle of the mother) using multichannel electroencephalogram (EEG) recordings. At this critical age, EEG differentiates from broader quiet sleep (QS) and active sleep (AS) stages to four, more complex states, and the quality and timing of this differentiation is indicative of the level of brain development. However, existing methods for automated sleep classification remain focussed only on QS and AS sleep classification. Approach. EEG features were calculated from 16 EEG recordings, in 30 s epochs, and personalized feature scaling was used to correct for some of the inter-recording variability, by standardizing each recording's feature data using its mean and standard deviation. Hidden Markov models (HMMs) and Gaussian mixture models (GMMs) were trained, with the HMM incorporating knowledge of the sleep state transition probabilities. The performance of the GMM and HMM (with and without scaling) was compared, and Cohen's kappa agreement was calculated between the estimates and clinicians' visual labels. Main results. For four-state classification, the HMM proved superior to the GMM. With the inclusion of personalized feature scaling, mean kappa (±standard deviation) was 0.62 (±0.16) compared to the GMM value of 0.55 (±0.15). Without feature scaling, kappas for the HMM and GMM dropped to 0.56 (±0.18) and 0.51 (±0.15), respectively. Significance. This is the first study to present a successful method for the automated staging of four states in term-age sleep using multichannel EEG. The results suggested a benefit in incorporating transition information using an HMM, and in correcting for inter-recording variability through personalized feature scaling. The timing and quality of these states are indicative of developmental delays in both preterm and term-born babies that may
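
    The pipeline lends itself to a compact sketch: standardize each recording's features against its own mean and standard deviation, pool the recordings, and fit a four-state hidden Markov model whose transition matrix encodes stage dynamics. The sketch below uses the third-party hmmlearn package and random arrays in place of real EEG features; unlike the paper, which incorporates clinically informed transition probabilities, hmmlearn estimates the transitions unsupervised, so this shows only the overall shape of the approach.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)
# Placeholder for per-epoch EEG feature vectors from three recordings.
recordings = [rng.random((120, 8)) for _ in range(3)]

# Personalized feature scaling: z-score each recording against itself.
scaled = [(r - r.mean(axis=0)) / r.std(axis=0) for r in recordings]

X = np.vstack(scaled)
lengths = [len(r) for r in scaled]  # epoch counts per recording

hmm = GaussianHMM(n_components=4, covariance_type="diag", n_iter=50, random_state=0)
hmm.fit(X, lengths)                 # learns emissions and state transitions jointly
states = hmm.predict(scaled[0])     # per-epoch sleep state sequence for one baby
print(states[:20])
```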

  2. The study of features of the structural organization of the automated information processing system of the collective type

    Science.gov (United States)

    Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.

    2018-05-01

    A comparative assessment of the channel capacity of different variants of the structural organization of automated information processing systems is carried out. A model for assessing information processing time as a function of the type of standard elements and their structural organization is developed.

  3. Instructional Topics in Educational Measurement (ITEMS) Module: Using Automated Processes to Generate Test Items

    Science.gov (United States)

    Gierl, Mark J.; Lai, Hollis

    2013-01-01

    Changes to the design and development of our educational assessments are resulting in the unprecedented demand for a large and continuous supply of content-specific test items. One way to address this growing demand is with automatic item generation (AIG). AIG is the process of using item models to generate test items with the aid of computer…

  4. Automated multiscale morphometry of muscle disease from second harmonic generation microscopy using tensor-based image processing.

    Science.gov (United States)

    Garbe, Christoph S; Buttgereit, Andreas; Schürmann, Sebastian; Friedrich, Oliver

    2012-01-01

    Practically all chronic diseases are characterized by tissue remodeling that alters organ and cellular function through changes to normal organ architecture. Some morphometric alterations become irreversible and account for disease progression even on cellular levels. Early diagnostics to categorize tissue alterations, as well as monitoring progression or remission of disturbed cytoarchitecture upon treatment in the same individual, are a new emerging field. They strongly challenge spatial resolution and require advanced imaging techniques and strategies for detecting morphological changes. We use a combined second harmonic generation (SHG) microscopy and automated image processing approach to quantify morphology in an animal model of inherited Duchenne muscular dystrophy (mdx mouse) with age. Multiphoton XYZ image stacks from tissue slices reveal vast morphological deviation in muscles from old mdx mice at different scales of cytoskeleton architecture: cell calibers are irregular, myofibrils within cells are twisted, and sarcomere lattice disruptions (detected as "verniers") are larger in number compared to samples from healthy mice. In young mdx mice, such alterations are only minor. The boundary-tensor approach, adapted and optimized for SHG data, is a suitable approach to allow quick quantitative morphometry in whole tissue slices. The overall detection performance of the automated algorithm compares very well with manual "by eye" detection, the latter being time-consuming and prone to subjective errors. Our algorithm outperforms manual detection in speed, with similar reliability. This approach will be an important prerequisite for the implementation of clinical image databases to diagnose and monitor specific morphological alterations in chronic (muscle) diseases. © 2011 IEEE

  5. Improved reliability, accuracy and quality in automated NMR structure calculation with ARIA

    Energy Technology Data Exchange (ETDEWEB)

    Mareuil, Fabien [Institut Pasteur, Cellule d' Informatique pour la Biologie (France); Malliavin, Thérèse E.; Nilges, Michael; Bardiaux, Benjamin, E-mail: bardiaux@pasteur.fr [Institut Pasteur, Unité de Bioinformatique Structurale, CNRS UMR 3528 (France)

    2015-08-15

    In biological NMR, assignment of NOE cross-peaks and calculation of atomic conformations are critical steps in the determination of reliable high-resolution structures. ARIA is an automated approach that performs NOE assignment and structure calculation in a concomitant manner in an iterative procedure. The log-harmonic shape for the distance restraint potential and the Bayesian weighting of distance restraints, recently introduced in ARIA, were shown to significantly improve the quality and the accuracy of determined structures. In this paper, we propose two modifications of the ARIA protocol: (1) the softening of the force field together with adapted hydrogen radii, which is meaningful in the context of the log-harmonic potential with Bayesian weighting, (2) a procedure that automatically adjusts the violation tolerance used in the selection of active restraints, based on the fitting of the structure to the input data sets. The new ARIA protocols were fine-tuned on a set of eight protein targets from the CASD-NMR initiative. As a result, the convergence problems previously observed for some targets were resolved and the obtained structures exhibited better quality. In addition, the new ARIA protocols were applied for the structure calculation of ten new CASD-NMR targets in a blind fashion, i.e. without knowing the actual solution. Even though optimisation of parameters and pre-filtering of unrefined NOE peak lists were necessary for half of the targets, ARIA consistently and reliably determined very precise and highly accurate structures for all cases. In the context of integrative structural biology, an increasing number of experimental methods are used that produce distance data for the determination of 3D structures of macromolecules, stressing the importance of methods that successfully make use of ambiguous and noisy distance data.

  6. Data Structure Analysis to Represent Basic Models of Finite State Automation

    Directory of Open Access Journals (Sweden)

    V. V. Gurenko

    2015-01-01

    Full Text Available Complex system engineering based on automaton models requires a reasoned selection of data structures to implement them. The problem of automaton representation and of selecting the data structures used for it has been understudied. An arbitrary choice of data structure for the software implementation of an automaton model leads to unnecessary computational burden and reduces the efficiency of the developed system. This article proposes an approach to the reasoned selection of data structures to represent the basic models of finite algorithmic automata and gives practical considerations based on it. Static and dynamic data structures are proposed for the three main ways to specify Mealy and Moore automata: a transition table, a coupling matrix and a transition graph. A three-dimensional array, a rectangular matrix and a matrix of lists are the static structures. The dynamic structures are list-oriented: two-level and three-level Iliffe vectors and a multi-linked list. These structures allow all required information about the components of a finite state automaton model to be stored - characteristic set cardinalities and the data of the transition and output functions. A criterion system is proposed for the comparative evaluation of data structures with respect to the algorithmic features of automata theory problems. The criteria focus on the space and time computational complexity of the operations performed in tasks such as equivalent automaton transformations, proving automaton equivalence and isomorphism, and automaton minimization. A comparative analysis of both static and dynamic data structures was carried out on the basis of the criterion system. The analysis showed the advantages of the three-dimensional array, the rectangular matrix and the two-level Iliffe vector - the structures that specify an automaton by its transition table. For these structures an experiment was carried out to measure the execution time of the automaton operations included in the criterion system. The analysis of the experiment results showed that a dynamic structure - two
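
    As an illustration of the transition-table representation discussed above, the sketch below encodes a small Mealy machine as static two-dimensional arrays indexed by state and input symbol: one array for the next-state function and one for the output function. The machine itself is a toy example invented for the demonstration.

```python
# States 0..1, input alphabet {'a': 0, 'b': 1}.
SYMBOL = {"a": 0, "b": 1}

# Transition table: next_state[state][symbol] (the static, array-based layout).
next_state = [
    [0, 1],  # from state 0: 'a' -> 0, 'b' -> 1
    [1, 0],  # from state 1: 'a' -> 1, 'b' -> 0
]
# Mealy output function: output[state][symbol].
output = [
    ["x", "y"],
    ["y", "x"],
]

def run(word, state=0):
    """Feed a word through the machine, collecting Mealy outputs."""
    out = []
    for ch in word:
        s = SYMBOL[ch]
        out.append(output[state][s])
        state = next_state[state][s]
    return state, "".join(out)

print(run("abba"))  # -> (0, 'xyxx'): final state and output word
```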

  7. An automated synthesis-purification-sample-management platform for the accelerated generation of pharmaceutical candidates.

    Science.gov (United States)

    Sutherland, J David; Tu, Noah P; Nemcek, Thomas A; Searle, Philip A; Hochlowski, Jill E; Djuric, Stevan W; Pan, Jeffrey Y

    2014-04-01

    A flexible and integrated flow-chemistry-synthesis-purification compound-generation and sample-management platform has been developed to accelerate the production of small-molecule organic-compound drug candidates in pharmaceutical research. Central to the integrated system is a Mitsubishi robot, which hands off samples throughout the process to the next station, including synthesis and purification, sample dispensing for purity and quantification analysis, dry-down, and aliquot generation.

  8. Sensitivity analysis of reactor safety parameters with automated adjoint function generation

    International Nuclear Information System (INIS)

    Kallfelz, J.M.; Horwedel, J.E.; Worley, B.A.

    1992-01-01

    A project at the Paul Scherrer Institute (PSI) involves the development of simulation models for the transient analysis of the reactors in Switzerland (STARS). This project, funded in part by the Swiss Federal Nuclear Safety Inspectorate, also involves the calculation and evaluation of certain transients for Swiss light water reactors (LWRs). For best-estimate analyses, a key element in quantifying reactor safety margins is uncertainty evaluation to determine the uncertainty in calculated integral values (responses) caused by modeling, calculational methodology, and input data (parameters). The work reported in this paper is a joint PSI/Oak Ridge National Laboratory (ORNL) application to a core transient analysis code of an ORNL software system for automated sensitivity analysis. The Gradient-Enhanced Software System (GRESS) is a software package that can in principle enhance any code so that it can calculate the sensitivity (derivative) to input parameters of any integral value (response) calculated in the original code. The studies reported are the first application of the GRESS capability to core neutronics and safety codes
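
    GRESS works by instrumenting source code so that derivatives propagate alongside the original computation. As a language-agnostic illustration of that idea (not of GRESS itself, which enhances FORTRAN codes), here is a minimal forward-mode automatic differentiation sketch using dual numbers; the response function is a made-up stand-in for a computed integral value.

```python
class Dual:
    """Number carrying a value and its derivative with respect to one input."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def response(k, phi):
    """Toy integral response depending on input parameters k and phi."""
    return 3.0 * k * phi + k * k

# Sensitivity of the response to k at (k, phi) = (2, 5):
k = Dual(2.0, 1.0)   # seed derivative dk/dk = 1
phi = Dual(5.0, 0.0)
r = response(k, phi)
print(r.val, r.der)  # 34.0 and d(response)/dk = 3*phi + 2*k = 19.0
```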

  9. FULLY AUTOMATED GENERATION OF ACCURATE DIGITAL SURFACE MODELS WITH SUB-METER RESOLUTION FROM SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    J. Wohlfeil

    2012-07-01

    Full Text Available Modern pixel-wise image matching algorithms like Semi-Global Matching (SGM) are able to compute high resolution digital surface models from airborne and spaceborne stereo imagery. Although image matching itself can be performed automatically, there are prerequisites, like high geometric accuracy, which are essential for ensuring the high quality of resulting surface models. Especially for line cameras, these prerequisites currently require laborious manual interaction using standard tools, which is a growing problem due to continually increasing demand for such surface models. The tedious work includes partly or fully manual selection of tie- and/or ground control points for ensuring the required accuracy of the relative orientation of images for stereo matching. It also includes masking of large water areas that seriously reduce the quality of the results. Furthermore, a good estimate of the depth range is required, since accurate estimates can seriously reduce the processing time for stereo matching. In this paper an approach is presented that allows performing all these steps fully automatically. It includes very robust and precise tie point selection, enabling the accurate calculation of the images' relative orientation via bundle adjustment. It is also shown how water masking and elevation range estimation can be performed automatically on the basis of freely available SRTM data. Extensive tests with a large number of different satellite images from QuickBird and WorldView are presented as proof of the robustness and reliability of the proposed method.

  10. Low cost home automation system: a support to the ecological electricity generation in Colombia

    Directory of Open Access Journals (Sweden)

    Elmer Alejandro Parada Prieto

    2016-09-01

    Full Text Available Context/Objective: In Colombia, consumption of residential electricity accounts for about 40% of the national demand; therefore, alternatives to reduce this consumption are needed. The goal of this study was to develop a home automation prototype to control the illumination of a household and to foster the efficient use of energy. Method: The system consists of independent control modules and an information manager module; the control module regulates the luminaires using a microcontroller and a presence sensor, and exchanges data by means of a radio frequency transceiver; the manager module allows access to the control modules from a Web interface. The prototype was implemented in a household located in the city of San José de Cúcuta, Colombia, during a 60-day period. Results: The operation of the system diminished the total electricity consumption by 3.75%, with a z-score of -1.93 obtained from the statistical analysis. Conclusions: We concluded that the prototype is inexpensive in comparison to similar technologies available in the national and international markets, and it reduces the waste of electrical energy due to the consumption habits of the residents in the case study.

  11. Automated Systematic Generation and Exploration of Flat Direction Phenomenology in Free Fermionic Heterotic String Theory

    Science.gov (United States)

    Greenwald, Jared

    Any good physical theory must resolve current experimental data as well as offer predictions for potential searches in the future. The Standard Model of particle physics, Grand Unified Theories, Minimal Supersymmetric Models and Supergravity are all attempts to provide such a framework. However, they all lack the ability to predict many of the parameters that each of the theories utilizes. String theory may yield a solution to this naturalness (or self-predictiveness) problem as well as offer a unified theory of gravity. Studies in particle physics phenomenology based on perturbative low energy analysis of various string theories can help determine the candidacy of such models. After a review of principles and problems leading up to our current understanding of the universe, we will discuss some of the best particle physics model building techniques that have been developed using string theory. This will culminate in the introduction of a novel approach to a computational, systematic analysis of the various physical phenomena that arise from these string models. We focus on the necessary assumptions, complexity and open questions that arise while making a fully-automated flat direction analysis program.

  12. Movie magic in the clinic: computer-generated characters for automated health counseling.

    Science.gov (United States)

    Bickmore, Timothy

    2008-11-06

    In this presentation, I demonstrate how many of the technologies used in movie special effects and games have been successfully used in health education and behavior change interventions. Computer-animated health counselors simulate human face-to-face dialogue as a computer interface medium, including not only verbal behavior but nonverbal conversational behavior such as hand gesture, body posture shifts, and facial display of emotion. This technology has now been successfully used in a wide range of health interventions for education and counseling of patients and consumers, including applications in physical activity promotion, medication adherence, and hospital discharge. These automated counselors have been deployed on home computers, hospital-based touch screen kiosks, and mobile devices with integrated health behavior sensing capability. Development of these agents is an interdisciplinary endeavor spanning the fields of character modeling and animation, computational linguistics, artificial intelligence, health communication and behavioral medicine. I will give demonstrations of several fielded systems, describe the technologies and methodologies underlying their development, and present results from five randomized controlled trials that have been completed or are in progress.

  13. Random generation of RNA secondary structures according to native distributions

    Directory of Open Access Journals (Sweden)

    Nebel Markus E

    2011-10-01

    Full Text Available Abstract Background Random biological sequences are a topic of great interest in genome analysis since, according to a powerful paradigm, they represent the background noise from which the actual biological information must differentiate. Accordingly, the generation of random sequences has been investigated for a long time. Similarly, random objects of a more complicated structure, like RNA molecules or proteins, are of interest. Results In this article, we present a new general framework for deriving algorithms for the non-uniform random generation of combinatorial objects according to the encoding and probability distribution implied by a stochastic context-free grammar. Briefly, the framework extends the well-known recursive method for (uniform) random generation and uses the popular framework of admissible specifications of combinatorial classes, introducing weighted combinatorial classes to allow for non-uniform generation by means of unranking. This framework is used to derive an algorithm for the generation of RNA secondary structures of a given fixed size. We address the random generation of these structures according to a realistic distribution obtained from real-life data by using a very detailed context-free grammar (one that models the class of RNA secondary structures by distinguishing between all known motifs in RNA structure). Compared to well-known sampling approaches used in several structure prediction tools (such as SFold), ours has two major advantages: Firstly, after a preprocessing step in time O(n²) for the computation of all weighted class sizes needed, with our approach a set of m random secondary structures of a given structure size n can be computed in worst-case time complexity O(m·n·log(n)) while other algorithms typically have a runtime in O(m·n²). Secondly, our approach works with integer arithmetic only, which is faster and saves us from all the discomforting details of using floating point arithmetic with
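
    The core idea of generating structures from a weighted grammar can be illustrated on a much smaller example than the paper's motif-level grammar. The sketch below samples dot-bracket RNA secondary structures from a toy stochastic context-free grammar (S → ε | .S | (S)S) by stochastic rule expansion; the rule probabilities are invented, and the paper's actual algorithm achieves its stated complexity through unranking of precomputed weighted class sizes rather than through naive recursive sampling.

```python
import random

# Toy stochastic grammar for dot-bracket structures:
#   S -> ""           with prob 0.40  (stop)
#   S -> "." S        with prob 0.35  (unpaired base)
#   S -> "(" S ")" S  with prob 0.25  (base pair enclosing a substructure)
P_STOP, P_DOT, P_PAIR = 0.40, 0.35, 0.25

def sample(rng, depth=0, max_depth=50):
    """Expand S stochastically; the depth cap guards against rare long runs.
    Expected branching 0.35*1 + 0.25*2 = 0.85 < 1, so expansion terminates."""
    if depth >= max_depth:
        return ""
    r = rng.random()
    if r < P_STOP:
        return ""
    if r < P_STOP + P_DOT:
        return "." + sample(rng, depth + 1)
    return "(" + sample(rng, depth + 1) + ")" + sample(rng, depth + 1)

rng = random.Random(42)
for _ in range(5):
    print(sample(rng) or "(empty)")
```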

  14. Automated Portfolio Optimization Based on a New Test for Structural Breaks

    Directory of Open Access Journals (Sweden)

    Tobias Berens

    2014-04-01

    Full Text Available We present a completely automated optimization strategy which combines the classical Markowitz mean-variance portfolio theory with a recently proposed test for structural breaks in covariance matrices. With respect to equity portfolios, global minimum-variance optimizations, which rely solely on the covariance matrix, have yielded considerable results in previous studies. However, financial assets cannot be assumed to have a constant covariance matrix over longer periods of time. Hence, we estimate the covariance matrix of the assets by respecting potential change points. The resulting approach resolves the issue of determining a sample for parameter estimation. Moreover, we investigate whether this approach is also appropriate for timing the reoptimizations. Finally, we apply the approach to two datasets and compare the results to relevant benchmark techniques by means of an out-of-sample study. It is shown that the new approach outperforms equally weighted portfolios and plain minimum-variance portfolios on average.
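
    The portfolio construction step has a closed form: the global minimum-variance weights are w = Σ⁻¹1 / (1ᵀΣ⁻¹1), with Σ estimated only from observations after the most recent detected break. The sketch below assumes the break index has already been delivered by a structural break test (here it is simply hard-coded) and uses synthetic returns in place of real data.

```python
import numpy as np

rng = np.random.default_rng(7)
returns = rng.normal(0.0005, 0.01, size=(500, 4))  # synthetic daily returns, 4 assets

break_idx = 300  # assumed output of a covariance structural break test
window = returns[break_idx:]                # estimate only on the post-break sample

sigma = np.cov(window, rowvar=False)        # covariance matrix estimate
ones = np.ones(sigma.shape[0])
w = np.linalg.solve(sigma, ones)            # Sigma^{-1} 1 without an explicit inverse
w /= w.sum()                                # normalize: w = Sigma^{-1}1 / (1' Sigma^{-1} 1)

print("GMV weights:", np.round(w, 3), "sum:", w.sum())
```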

  15. Absorption-reduced waveguide structure for efficient terahertz generation

    Energy Technology Data Exchange (ETDEWEB)

    Pálfalvi, L., E-mail: palfalvi@fizika.ttk.pte.hu [Institute of Physics, University of Pécs, Ifjúság ú. 6, 7624 Pécs (Hungary); Fülöp, J. A. [MTA-PTE High-Field Terahertz Research Group, Ifjúság ú. 6, 7624 Pécs (Hungary); Szentágothai Research Centre, University of Pécs, Ifjúság ú. 20, 7624 Pécs (Hungary); Hebling, J. [Institute of Physics, University of Pécs, Ifjúság ú. 6, 7624 Pécs (Hungary); MTA-PTE High-Field Terahertz Research Group, Ifjúság ú. 6, 7624 Pécs (Hungary); Szentágothai Research Centre, University of Pécs, Ifjúság ú. 20, 7624 Pécs (Hungary)

    2015-12-07

    An absorption-reduced planar waveguide structure is proposed for increasing the efficiency of terahertz (THz) pulse generation by optical rectification of femtosecond laser pulses with tilted-pulse-front in highly nonlinear materials with a large absorption coefficient. The structure functions as a waveguide both for the optical pump and for the generated THz radiation. Most of the THz power propagates inside the cladding with low THz absorption, thereby reducing losses and leading to the enhancement of the THz generation efficiency by up to more than one order of magnitude, as compared with a bulk medium. Such a source can be suitable for highly efficient THz pulse generation pumped by low-energy (nJ-μJ) pulses at high (MHz) repetition rates delivered by compact fiber lasers.

  16. Automated torso organ segmentation from 3D CT images using structured perceptron and dual decomposition

    Science.gov (United States)

    Nimura, Yukitaka; Hayashi, Yuichiro; Kitasaka, Takayuki; Mori, Kensaku

    2015-03-01

    This paper presents a method for torso organ segmentation from abdominal CT images using a structured perceptron and dual decomposition. Many methods have been proposed to enable automated extraction of organ regions from volumetric medical images. However, their empirical parameters have to be adjusted to obtain precise organ regions. This paper proposes an organ segmentation method using structured output learning. Our method utilizes a graphical model and binary features which represent the relationship between voxel intensities and organ labels. We also optimize the weights of the graphical model by structured perceptron and estimate the best organ label for a given image by dynamic programming and dual decomposition. The experimental results revealed that the proposed method can extract organ regions automatically using structured output learning. The error of organ label estimation was 4.4%. The Dice coefficients of the left lung, right lung, heart, liver, spleen, pancreas, left kidney, right kidney, and gallbladder were 0.91, 0.95, 0.77, 0.81, 0.74, 0.08, 0.83, 0.84, and 0.03, respectively.
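
    The learning rule at the heart of the structured perceptron is simple: decode the best labeling under the current weights, then add the feature vector of the true labeling and subtract that of the prediction. The sketch below applies it to a one-dimensional toy labeling task with Viterbi decoding over unary and pairwise features; the voxel-like observations and labels are invented, and the paper additionally uses dual decomposition for inference over a richer graphical model.

```python
from collections import defaultdict

LABELS = ["organ", "background"]

def features(xs, ys):
    """Joint feature counts: (observation, label) and (label, label) pairs."""
    phi = defaultdict(float)
    prev = "<s>"
    for x, y in zip(xs, ys):
        phi[("emit", x, y)] += 1.0
        phi[("trans", prev, y)] += 1.0
        prev = y
    return phi

def viterbi(xs, w):
    """Best label sequence under current weights, by dynamic programming."""
    score = {y: w[("emit", xs[0], y)] + w[("trans", "<s>", y)] for y in LABELS}
    back = []
    for x in xs[1:]:
        new, ptr = {}, {}
        for y in LABELS:
            best = max(LABELS, key=lambda p: score[p] + w[("trans", p, y)])
            new[y] = score[best] + w[("trans", best, y)] + w[("emit", x, y)]
            ptr[y] = best
        score, back = new, back + [ptr]
    y = max(LABELS, key=lambda l: score[l])
    seq = [y]
    for ptr in reversed(back):
        y = ptr[y]
        seq.append(y)
    return list(reversed(seq))

def train(data, epochs=10):
    w = defaultdict(float)
    for _ in range(epochs):
        for xs, ys in data:
            pred = viterbi(xs, w)
            if pred != ys:  # perceptron update on mistakes only
                for k, v in features(xs, ys).items():
                    w[k] += v
                for k, v in features(xs, pred).items():
                    w[k] -= v
    return w

data = [(["bright", "bright", "dark"], ["organ", "organ", "background"]),
        (["dark", "bright"], ["background", "organ"])]
w = train(data)
print(viterbi(["bright", "dark"], w))  # -> ['organ', 'background']
```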

  17. Software package to automate the design and production of translucent building structures made of pvc

    Directory of Open Access Journals (Sweden)

    Petrova Irina Yur’evna

    2016-08-01

    Full Text Available The article describes the features of the design and production of translucent building structures made of PVC. An analysis of the automation systems for this process currently available on the market is carried out, and their advantages and disadvantages are identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent building structures made of PVC is formulated, together with the basic entities involved in these business processes. The necessary functions for the main application and for the dealers' application are specified. The main application is based on the technological platform 1C: Enterprise 8.2. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these software products have client versions that are free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The features of the implementation of the developed software complex are described and the relevant charts are given. The scheme of system deployment and the protocols of data exchange between the 1C server, the 1C client and the dealer are presented. The functions supported by the 1C module and the .NET module are also described. The article describes the content of the class library developed for the .NET module. The specification of the integration of the two applications into a single software package is given. The features of the GUI organization are described and the corresponding screenshots are given. Possible ways of further developing the described software complex are presented and a conclusion is drawn about its competitiveness and the expediency of new research.

  18. Steam generation process control and automation; Automacao e controle no processo de geracao de vapor

    Energy Technology Data Exchange (ETDEWEB)

    Souza Junior, Jose Cleodon de; Silva, Walmy Andre C.M. da [PETROBRAS S.A., Natal, RN (Brazil)

    2004-07-01

    This paper describes the implementation of the Supervisory Control and Data Acquisition (SCADA) system in the steam generation process for injection in heavy oil fields of the Alto do Rodrigues Production Asset, developed by PETROBRAS/E and P/UN-RNCE. This Asset is located in the northeastern region of Brazil, in Rio Grande do Norte State. It addresses the steam generators for injection in oil wells and the upgrade project that installed remote terminal units and a new panel controlled by PLC, replaced all the pneumatic transmitters with electronic ones and incorporated steam quality and oxygen control, providing remote supervision of the process. It also discusses the improvements obtained in steam generation after the changes in the conception of the control and safety systems. (author)

  19. On the combination of molecular replacement and single-wavelength anomalous diffraction phasing for automated structure determination

    International Nuclear Information System (INIS)

    Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S.; Weiss, Manfred S.; Tucker, Paul A.

    2009-01-01

    The combination of molecular replacement and single-wavelength anomalous diffraction improves the performance of automated structure determination with Auto-Rickshaw. A combination of molecular replacement and single-wavelength anomalous diffraction phasing has been incorporated into the automated structure-determination platform Auto-Rickshaw. The complete MRSAD procedure includes molecular replacement, model refinement, experimental phasing, phase improvement and automated model building. The improvement over the standard SAD or MR approaches is illustrated by ten test cases taken from the JCSG diffraction data-set database. Poor MR or SAD phases with phase errors larger than 70° can be improved using the described procedure and a large fraction of the model can be determined in a purely automatic manner from X-ray data extending to better than 2.6 Å resolution

  20. The Value of Distributed Generation under Different Tariff Structures

    OpenAIRE

    Firestone, Ryan; Magnus Maribu, Karl; Marnay, Chris

    2006-01-01

    Distributed generation (DG) may play a key role in a modern energy system because it can improve energy efficiency. Reductions in the energy bill, and therefore DG attractiveness, depend on the electricity tariff structure; a system created before widespread adoption of distributed generation. Tariffs have been designed to recover costs equitably amongst customers with similar consumption patterns. Recently, electric utilities began to question the equity of this electricity pricing stru...

  1. Determining the helicity structure of third generation resonances

    International Nuclear Information System (INIS)

    Papaefstathiou, Andreas

    2011-11-01

    We examine methods that have been proposed for determining the helicity structure of decays of new resonances to third generation quarks and/or leptons. We present analytical and semi-analytical predictions and assess the applicability of the relevant variables in realistic reconstruction scenarios using Monte Carlo-generated events, including the effects of QCD radiation and multiple parton interactions, combinatoric ambiguities and fast detector simulation. (orig.)

  2. A generative, probabilistic model of local protein structure

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Mardia, Kanti V.; Taylor, Charles C.

    2008-01-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state...

  3. Third-generation dual-source CT of the neck using automated tube voltage adaptation in combination with advanced modeled iterative reconstruction: evaluation of image quality and radiation dose

    International Nuclear Information System (INIS)

    Scholtz, Jan-Erik; Wichmann, Julian L.; Huesers, Kristina; Albrecht, Moritz H.; Beeres, Martin; Bauer, Ralf W.; Vogl, Thomas J.; Bodelle, Boris

    2016-01-01

    To evaluate image quality and radiation dose in third-generation dual-source computed tomography (DSCT) of the neck using automated tube voltage adaptation (TVA) with advanced modelled iterative reconstruction (ADMIRE) algorithm. One hundred and sixteen patients were retrospectively evaluated. Group A (n = 59) was examined on second-generation DSCT with automated TVA and filtered back projection. Group B (n = 57) was examined on a third-generation DSCT with automated TVA and ADMIRE. Age, body diameter, attenuation of several anatomic structures, noise, signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), radiation dose (CTDI vol ) and size-specific dose estimates (SSDE) were assessed. Diagnostic acceptability was rated by three readers. Age (p = 0.87) and body diameter (p = 0.075) did not differ significantly. Tube voltage in Group A was set automatically to 100 kV for all patients (n = 59), and to 70 kV (n = 2), 80 kV (n = 5), and 90 kV (n = 50) in Group B. Noise was reduced and CNR was increased significantly (p < 0.001). Diagnostic acceptability was rated high in both groups, with better ratings in Group B (p < 0.001). SSDE was reduced by 34 % in Group B (20.38 ± 1.63 mGy vs. 13.04 ± 1.50 mGy, p < 0.001). Combination of automated TVA and ADMIRE in neck CT using third-generation DSCT results in a substantial radiation dose reduction with low noise and increased CNR. (orig.)

  4. Semi-Automated Air-Coupled Impact-Echo Method for Large-Scale Parkade Structure

    Directory of Open Access Journals (Sweden)

    Tyler Epp

    2018-03-01

    Full Text Available Structural Health Monitoring (SHM) has moved to data-dense systems, utilizing numerous sensor types to monitor infrastructure, such as bridges and dams, more regularly. One of the issues faced in this endeavour is the scale of the inspected structures and the time it takes to carry out testing. Installing automated systems that can provide measurements in a timely manner is one way of overcoming these obstacles. This study proposes an Artificial Neural Network (ANN) application that determines intact and damaged locations from a small training sample of impact-echo data, using air-coupled microphones from a reinforced concrete beam in lab conditions and data collected from a field experiment in a parking garage. The impact-echo testing in the field is carried out in a semi-autonomous manner to expedite the front end of the in situ damage detection testing. The use of an ANN removes the need for a user-defined cutoff value for the classification of intact and damaged locations when a least-square distance approach is used. It is postulated that this may contribute significantly to testing time reduction when monitoring large-scale civil Reinforced Concrete (RC) structures.

  5. Comparing the Structure-Function Relationship at the Macula With Standard Automated Perimetry and Microperimetry.

    Science.gov (United States)

    Rao, Harsha L; Januwada, Manideepak; Hussain, Raza S M; Pillutla, Lalitha N; Begum, Viquar U; Chaitanya, Aditya; Senthil, Sirisha; Garudadri, Chandra S

    2015-12-01

    To compare the structure-function relationship between ganglion cell-inner plexiform layer (GCIPL) thickness measurements using spectral-domain optical coherence tomography (SDOCT) and visual sensitivities measured using standard automated perimetry (SAP) and microperimetry (MP) at the macula in glaucoma. In a prospective study, 45 control eyes (29 subjects) and 60 glaucoma eyes (45 patients) underwent visual sensitivity estimation at the macula (central 10°) by SAP and MP, and GCIPL thickness measurement at the macula by SDOCT. Structure-function relationships between GCIPL thickness and visual sensitivity loss with SAP and MP at various macular sectors were assessed using the Hood and Kardon model. To compare the structure-function relationship with SAP and MP, we calculated the number of data points falling outside the 5th and the 95th percentile values of the Hood and Kardon model with each of the perimeters. The number of points falling outside the 5th and 95th percentile values of the Hood and Kardon model ranged from 28 (superior sector) to 48 (inferonasal sector) with SAP and 33 (superior sector) to 49 (inferonasal sector) with MP. The difference in the number of points falling outside the 5th and 95th percentile values with SAP and MP was statistically insignificant (P > 0.05, χ² test) for all the sectors. Visual sensitivity measurements of both SAP and MP demonstrated a similar relationship with the GCIPL measurements of SDOCT at the macula in glaucoma.

  6. Vibration based structural health monitoring of an arch bridge: From automated OMA to damage detection

    Science.gov (United States)

    Magalhães, F.; Cunha, A.; Caetano, E.

    2012-04-01

    In order to evaluate the usefulness of approaches based on modal parameter tracking for structural health monitoring of bridges, in September 2007 a dynamic monitoring system was installed in a concrete arch bridge in the city of Porto, Portugal. The implementation of algorithms to perform continuous on-line identification of modal parameters from structural responses to ambient excitation (automated Operational Modal Analysis) has made it possible to create a very complete database with the time evolution of the bridge's modal characteristics over more than 2 years. This paper describes the strategy that was followed to minimize the effects of environmental and operational factors on the bridge's natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, the identification of damage is attempted with control charts. In the end, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
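
    The anomaly detection strategy described, regressing out environmental effects and then monitoring the residuals with a control chart, can be sketched compactly. Below, a natural frequency series is regressed on temperature over a reference period, and residuals outside mean ± 3σ control limits are flagged. The synthetic data, including an injected 0.2% frequency drop, stands in for real monitoring records.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
temp = 15 + 10 * np.sin(np.linspace(0, 20, n)) + rng.normal(0, 1, n)
freq = 1.50 - 0.0008 * temp + rng.normal(0, 0.0005, n)  # temperature-driven frequency (Hz)
freq[800:] -= 0.002 * 1.50                              # simulated damage: a 0.2% frequency drop

ref = slice(0, 600)                                     # reference period assumed healthy
A = np.column_stack([temp[ref], np.ones(600)])
coef, *_ = np.linalg.lstsq(A, freq[ref], rcond=None)    # frequency vs. temperature fit

resid = freq - (coef[0] * temp + coef[1])               # environment-corrected residuals
mu, sigma = resid[ref].mean(), resid[ref].std()
flag = np.abs(resid - mu) > 3 * sigma                   # Shewhart-style control limits

print("alarm rate before damage:", flag[:800].mean())
print("alarm rate after damage: ", flag[800:].mean())
```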

  7. Automation of inspection methods for eddy current testing of steam generator tubes

    International Nuclear Information System (INIS)

    Meurgey, P.; Baumaire, A.

    1990-01-01

    Inspection of all the tubes of a steam generator while the reactor is shut down is required for some of these exchangers affected by stress corrosion cracking. Characterization of each crack in each tube is made possible by the development of software for processing the signals from an eddy current probe. The ESTELLE software allows a rapid increase in the number of tubes tested: more than 80,000 in 1989 [fr]

  8. Automating the generation of lexical patterns for processing free text in clinical documents.

    Science.gov (United States)

    Meng, Frank; Morioka, Craig

    2015-09-01

    Many tasks in natural language processing utilize lexical pattern-matching techniques, including information extraction (IE), negation identification, and syntactic parsing. However, it is generally difficult to derive patterns that achieve acceptable levels of recall while also remaining highly precise. We present a multiple sequence alignment (MSA)-based technique that automatically generates patterns, thereby leveraging language usage to determine the context of words that influence a given target. MSAs capture the commonalities among word sequences and are able to reveal areas of linguistic stability and variation. In this way, MSAs provide a systemic approach to generating lexical patterns that are generalizable, which will both increase recall levels and maintain high levels of precision. The MSA-generated patterns exhibited consistent F1, F0.5, and F2 scores compared to two baseline techniques for IE across four different tasks. Both baseline techniques performed well for some tasks and less well for others, but MSA was found to consistently perform at a high level for all four tasks. The performance of MSA on the four extraction tasks indicates the method's versatility. The results show that the MSA-based patterns are able to handle the extraction of individual data elements as well as relations between two concepts without the need for large amounts of manual intervention. We presented an MSA-based framework for generating lexical patterns that showed consistently high levels of both performance and recall over four different extraction tasks when compared to baseline methods. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
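
    The pattern-generalization step can be illustrated without a full alignment algorithm: given word sequences that have already been aligned (gaps marked with '-'), columns that agree across all sequences stay literal and disagreeing columns become wildcards. The aligned clinical-style phrases below are invented; a real system would compute the alignment itself, e.g. with a progressive MSA.

```python
def msa_to_pattern(aligned, wildcard="(\\w+)"):
    """Collapse a multiple sequence alignment of token lists into a regex-like
    lexical pattern: constant columns stay literal, variable columns generalize."""
    pattern = []
    for column in zip(*aligned):
        tokens = {t for t in column if t != "-"}  # ignore gap symbols
        if len(tokens) == 1:
            pattern.append(tokens.pop())          # stable context word
        else:
            pattern.append(wildcard)              # position varies -> wildcard
    return " ".join(pattern)

aligned = [
    ["no", "evidence", "of", "pneumonia"],
    ["no", "evidence", "of", "fracture"],
    ["no", "evidence", "of", "effusion"],
]
print(msa_to_pattern(aligned))  # -> no evidence of (\w+)
```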

  9. Evaluating an Automated Number Series Item Generator Using Linear Logistic Test Models

    Directory of Open Access Journals (Sweden)

    Bao Sheng Loe

    2018-04-01

    Full Text Available This study investigates the item properties of a newly developed Automatic Number Series Item Generator (ANSIG). The foundation of the ANSIG is based on five hypothesised cognitive operators. Thirteen item models were developed using the numGen R package and eleven were evaluated in this study. The 16-item ICAR (International Cognitive Ability Resource) short-form ability test was used to evaluate construct validity. The Rasch model and two Linear Logistic Test Models (LLTMs) were employed to estimate and predict the item parameters. Results indicate that a single factor determines the performance on tests composed of items generated by the ANSIG. Under the LLTM approach, all the cognitive operators were significant predictors of item difficulty. Moderate to high correlations were evident between the number series items and the ICAR test scores, with a high correlation found for the ICAR Letter-Numeric-Series type items, suggesting adequate nomothetic span. Extended cognitive research is, nevertheless, essential for the automatic generation of an item pool with predictable psychometric properties.
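
    Automatic item generation of this kind boils down to instantiating an item model (a rule plus randomized parameters) and withholding the continuation as the key. The sketch below generates number-series items from two illustrative operators, constant-step and alternating-step progressions; these are stand-ins invented for the example, not ANSIG's five hypothesised cognitive operators, and the code does not reproduce the numGen R package.

```python
import random

def arithmetic_item(rng, length=5):
    """Constant-difference series, e.g. 3 7 11 15 19 -> 23."""
    start, step = rng.randint(1, 20), rng.randint(2, 9)
    series = [start + i * step for i in range(length)]
    return series, series[-1] + step

def alternating_item(rng, length=6):
    """Two interleaved step sizes, e.g. +2, +6, +2, +6, ..."""
    value, steps = rng.randint(1, 20), (rng.randint(1, 4), rng.randint(5, 9))
    series = [value]
    for i in range(length - 1):
        value += steps[i % 2]
        series.append(value)
    return series, value + steps[(length - 1) % 2]

rng = random.Random(0)
for make_item in (arithmetic_item, alternating_item):
    stem, key = make_item(rng)
    print(" ".join(map(str, stem)), "->", key)
```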

  10. SeqReporter: automating next-generation sequencing result interpretation and reporting workflow in a clinical laboratory.

    Science.gov (United States)

    Roy, Somak; Durso, Mary Beth; Wald, Abigail; Nikiforov, Yuri E; Nikiforova, Marina N

    2014-01-01

    A wide repertoire of bioinformatics applications exist for next-generation sequencing data analysis; however, certain requirements of the clinical molecular laboratory limit their use: i) comprehensive report generation, ii) compatibility with existing laboratory information systems and computer operating system, iii) knowledgebase development, iv) quality management, and v) data security. SeqReporter is a web-based application developed using ASP.NET framework version 4.0. The client-side was designed using HTML5, CSS3, and Javascript. The server-side processing (VB.NET) relied on interaction with a customized SQL server 2008 R2 database. Overall, 104 cases (1062 variant calls) were analyzed by SeqReporter. Each variant call was classified into one of five report levels: i) known clinical significance, ii) uncertain clinical significance, iii) pending pathologists' review, iv) synonymous and deep intronic, and v) platform and panel-specific sequence errors. SeqReporter correctly annotated and classified 99.9% (859 of 860) of sequence variants, including 68.7% synonymous single-nucleotide variants, 28.3% nonsynonymous single-nucleotide variants, 1.7% insertions, and 1.3% deletions. One variant of potential clinical significance was re-classified after pathologist review. Laboratory information system-compatible clinical reports were generated automatically. SeqReporter also facilitated quality management activities. SeqReporter is an example of a customized and well-designed informatics solution to optimize and automate the downstream analysis of clinical next-generation sequencing data. We propose it as a model that may envisage the development of a comprehensive clinical informatics solution. Copyright © 2014 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  11. Temporal Dynamics of Health and Well-Being: A Crowdsourcing Approach to Momentary Assessments and Automated Generation of Personalized Feedback.

    Science.gov (United States)

    van der Krieke, Lian; Blaauw, Frank J; Emerencia, Ando C; Schenk, Hendrika M; Slaets, Joris P J; Bos, Elisabeth H; de Jonge, Peter; Jeronimus, Bertus F

    Recent developments in research and mobile health enable a quantitative idiographic approach in health research. The present study investigates the potential of an electronic diary crowdsourcing study in the Netherlands for (1) large-scale automated self-assessment for individual-based health promotion and (2) enabling research at both the between-persons and within-persons level. To illustrate the latter, we examined between-persons and within-persons associations between somatic symptoms and quality of life. A website provided the general Dutch population access to a 30-day (3 times a day) diary study assessing 43 items related to health and well-being, which gave participants personalized feedback. Associations between somatic symptoms and quality of life were examined with a linear mixed model. A total of 629 participants completed 28,430 assessments, with a mean (SD) of 45 (32) assessments per participant. Most participants (n = 517 [82%]) were women and 531 (84%) had high education. Almost 40% of the participants (n = 247) completed enough assessments (t = 68) to generate personalized feedback including temporal dynamics between well-being, health behavior, and emotions. Substantial between-person variability was found in the within-person association between somatic symptoms and quality of life. We successfully built an application for automated diary assessments and personalized feedback. The application was used by a sample of mainly highly educated women, which suggests that the potential of our intensive diary assessment method for large-scale health promotion is limited. However, a rich data set was collected that allows for group-level and idiographic analyses that can shed light on etiological processes and may contribute to the development of empirical-based health promotion solutions.
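
    A sketch of the within-person versus between-person analysis with statsmodels: a linear mixed model with a random intercept and a random slope for somatic symptoms per participant, so each person may have their own symptom-quality-of-life association. The simulated long-format diary data stands in for the study's actual assessments.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
rows = []
for pid in range(60):                          # 60 simulated participants
    slope = -0.5 + rng.normal(0, 0.3)          # person-specific association
    for _ in range(30):                        # ~30 diary assessments each
        symptoms = rng.normal(3, 1)
        qol = 7 + slope * symptoms + rng.normal(0, 0.8)
        rows.append({"id": pid, "symptoms": symptoms, "qol": qol})
df = pd.DataFrame(rows)

# Random intercept and random symptom slope per participant.
model = smf.mixedlm("qol ~ symptoms", df, groups=df["id"], re_formula="~symptoms")
fit = model.fit()
print(fit.summary())
```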

  12. Automated generation of massive image knowledge collections using Microsoft Live Labs Pivot to promote neuroimaging and translational research.

    Science.gov (United States)

    Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-07-15

    Massive datasets comprising high-resolution images, generated in neuro-imaging studies and in clinical imaging research, increasingly challenge our ability to analyze, share, and filter such images in clinical and basic translational research. Pivot collection exploratory analysis gives each user the ability to interact fully with massive amounts of visual data, with the sorting flexibility and speed needed to fluidly access, explore or analyze massive image data sets of high-resolution images and their associated meta-information, such as neuro-imaging databases from the Allen Brain Atlas. It is used for clustering, filtering, data sharing and classifying the visual data into various deep-zoom levels and meta-information categories to detect the underlying hidden patterns within the data set used. We deployed prototype Pivot collections using Linux CentOS running the Apache web server. We also tested the prototype Pivot collections on other operating systems such as Windows (the most common variants) and UNIX. It is demonstrated that the approach yields very good results when compared with other approaches used by some researchers for the generation, creation, and clustering of massive image collections, such as the coronal and horizontal sections of the mouse brain from the Allen Brain Atlas. Pivot visual analytics was used to analyze a prototype dataset of Dab2 co-expressed genes from the Allen Brain Atlas. The metadata along with the high-resolution images were automatically extracted using the Allen Brain Atlas API. They were then used to identify hidden information based on the various categories and conditions applied, using options generated from the automated collection. Metadata categories like chromosome, as well as data for individual cases like sex, age, and plane attributes of a particular gene, are used to filter, sort and determine whether other genes exist with characteristics similar to Dab2. And

  13. Learning Orthographic Structure With Sequential Generative Neural Networks.

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; Sperduti, Alessandro; Zorzi, Marco

    2016-04-01

    Learning the structure of event sequences is a ubiquitous problem in cognition and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in connectionist modeling. Here, we investigated a sequential version of the restricted Boltzmann machine (RBM), a stochastic recurrent neural network that extracts high-order structure from sensory data through unsupervised generative learning and can encode contextual information in the form of internal, distributed representations. We assessed whether this type of network can extract the orthographic structure of English monosyllables by learning a generative model of the letter sequences forming a word training corpus. We show that the network learned an accurate probabilistic model of English graphotactics, which can be used to make predictions about the letter following a given context as well as to autonomously generate high-quality pseudowords. The model was compared to an extended version of simple recurrent networks, augmented with a stochastic process that allows autonomous generation of sequences, and to non-connectionist probabilistic models (n-grams and hidden Markov models). We conclude that sequential RBMs and stochastic simple recurrent networks are promising candidates for modeling cognition in the temporal domain. Copyright © 2015 Cognitive Science Society, Inc.

  14. prepare_taxa_charts.py: A Python program to automate generation of publication ready taxonomic pie chart images from QIIME.

    Science.gov (United States)

    Lakhujani, Vijay; Badapanda, Chandan

    2017-06-01

    QIIME (Quantitative Insights Into Microbial Ecology) is one of the most popular open-source bioinformatics suites for performing metagenome, 16S rRNA amplicon and Internal Transcribed Spacer (ITS) data analysis. Although it is a very comprehensive and powerful tool, it lacks a method to provide publication-ready taxonomic pie charts. The script plot_taxa_summary.py bundled with QIIME generates an HTML file and a folder containing the taxonomic pie chart and legend as separate images. The images have randomly generated alphanumeric names. Therefore, it is difficult to associate a pie chart with its legend and the corresponding sample identifier. Even if the option to have the legend within the HTML file is selected while executing plot_taxa_summary.py, it is very tedious to crop a complete image (having both the pie chart and the legend) due to unequal image sizes. It takes a lot of time to manually prepare the pie charts of multiple samples for publication purposes. Moreover, there are chances of error while identifying the pie chart and legend pairs due to the random alphanumeric names of the images. To bypass all these bottlenecks and make this process efficient, we have developed a Python-based program, prepare_taxa_charts.py, to automate the renaming, cropping and merging of a taxonomic pie chart and the corresponding legend image into a single, good-quality, publication-ready image. This program not only augments the functionality of plot_taxa_summary.py but is also very fast in terms of CPU time and user friendly.
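
    The merging step itself is a small image-manipulation task. A minimal sketch with the Pillow library: place a pie chart and its legend side by side on a white canvas and save the combined figure. The file names are hypothetical placeholders, and this is not the actual prepare_taxa_charts.py implementation.

```python
from PIL import Image  # third-party package: Pillow

def merge_chart_and_legend(chart_path, legend_path, out_path):
    """Paste pie chart and legend side by side on a white background."""
    chart = Image.open(chart_path).convert("RGB")
    legend = Image.open(legend_path).convert("RGB")

    width = chart.width + legend.width
    height = max(chart.height, legend.height)
    canvas = Image.new("RGB", (width, height), "white")

    canvas.paste(chart, (0, (height - chart.height) // 2))            # vertically centred
    canvas.paste(legend, (chart.width, (height - legend.height) // 2))
    canvas.save(out_path, dpi=(300, 300))                             # print-quality output

# Hypothetical file names, for illustration only:
merge_chart_and_legend("sample1_pie.png", "sample1_legend.png", "sample1_combined.png")
```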

  15. ARBUS: A FORTRAN tool for generating tree structure diagrams

    International Nuclear Information System (INIS)

    Ferrero, C.; Zanger, M.

    1992-02-01

    The FORTRAN77 stand-alone code ARBUS has been designed to aid the user by providing a tree structure diagram generating utility for computer programs written in the FORTRAN language. This report is intended to describe the main purpose and features of ARBUS and to highlight some additional applications of the code by means of practical test cases. (orig.)

  16. Learning Orthographic Structure with Sequential Generative Neural Networks

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; Sperduti, Alessandro; Zorzi, Marco

    2016-01-01

    Learning the structure of event sequences is a ubiquitous problem in cognition and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in…

  17. Structure analysis and design of PCCV for new generation NPP

    International Nuclear Information System (INIS)

    Wang Mingdan; Wang Xiaowen; Huang Xiaolin; Xia Zufeng

    2005-01-01

    The paper documents the overall work done by Shanghai Nuclear Engineering Research and Design Institute (SNERDI) within the research and design scope of the new-generation advanced prestressed concrete containment vessel (PCCV). The work can be applied to the design of nuclear engineering structures and of general prestressed concrete structures in civil engineering. (authors)

  18. Automated Music Video Generation Using Multi-level Feature-based Segmentation

    Science.gov (United States)

    Yoon, Jong-Chul; Lee, In-Kwon; Byun, Siwoo

    The expansion of the home video market has created a requirement for video editing tools to allow ordinary people to assemble videos from short clips. However, professional skills are still necessary to create a music video, which requires a stream to be synchronized with pre-composed music. Because the music and the video are pre-generated in separate environments, even a professional producer usually requires a number of trials to obtain a satisfactory synchronization, which is something that most amateurs are unable to achieve.

  19. De novo protein structure generation from incomplete chemical shift assignments

    Energy Technology Data Exchange (ETDEWEB)

    Shen Yang [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States); Vernon, Robert; Baker, David [University of Washington, Department of Biochemistry and Howard Hughes Medical Institute (United States); Bax, Ad [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States)], E-mail: bax@nih.gov

    2009-02-15

    NMR chemical shifts provide important local structural information for proteins. Consistent structure generation from NMR chemical shift data has recently become feasible for proteins with sizes of up to 130 residues, and such structures are of a quality comparable to those obtained with the standard NMR protocol. This study investigates the influence of the completeness of chemical shift assignments on structures generated from chemical shifts. The Chemical-Shift-Rosetta (CS-Rosetta) protocol was used for de novo protein structure generation with various degrees of completeness of the chemical shift assignment, simulated by omission of entries in the experimental chemical shift data previously used for the initial demonstration of the CS-Rosetta approach. In addition, a new CS-Rosetta protocol is described that improves robustness of the method for proteins with missing or erroneous NMR chemical shift input data. This strategy, which uses traditional Rosetta for pre-filtering of the fragment selection process, is demonstrated for two paramagnetic proteins and also for two proteins with solid-state NMR chemical shift assignments.

  20. An automated eddy current in-service inspection system for nuclear steam generator tubing

    International Nuclear Information System (INIS)

    Wells, N.S.

    1981-06-01

    A prototype steam generator in-service inspection system incorporating remotely-controlled instrumentation linked by a digital transmission line to an instrument and control trailer outside the reactor containment has been designed and manufactured and is presently undergoing field tests. The (Monel 400) steam generator tubes are scanned two at a time using absolute eddy current probes controlled by two remotely-operated probe drives at a scanning speed of 0.5 m/s. The probes are positioned on the tubesheet by a light-weight (1.5 kg) microprocessor-operated tubesheet walker mechanism. Digitized control and data signals are transmitted up to 300 m to the control trailer. There the control and analysis computers extract the relevant signal information and present it in condensed form as labelled graphics on CRT consoles for on-line visual assessment. Hard copy output is also provided for each tube scanned (one per minute). Condensed data is archived on magnetic tapes for additional off-line analysis and comparisons with other inspections

  1. Development of an automated technique for the speciation of arsenic using flow injection hydride generation atomic absorption spectrometry (FI-HG-AAS)

    Energy Technology Data Exchange (ETDEWEB)

    Ruede, T.R. (Inst. of Petrography and Geochemistry, Univ. of Karlsruhe (Germany)); Puchelt, H. (Inst. of Petrography and Geochemistry, Univ. of Karlsruhe (Germany))

    1994-09-01

    An automated method for the determination of arsenic acid (AsV), arsenous acid (AsIII), monomethylarsonic acid (MMAA) and dimethylarsinic acid (DMAA) was developed using a commercially available flow injection hydride generation system. By carrying out the hydride generation in selected acid media, it is possible to determine As(III) alone, MMAA and DMAA as a sum and via their different sensitivities, and all four species together. (orig.)

  2. Methods for automated semantic definition of manufacturing structures (mBOM) in mechanical engineering companies

    Science.gov (United States)

    Stekolschik, Alexander, Prof.

    2017-10-01

    The bill of materials (BOM), which comprises all parts and assemblies of a product, is the core of any mechanical or electronic product. The flexible and integrated management of engineering (Engineering Bill of Materials [eBOM]) and manufacturing (Manufacturing Bill of Materials [mBOM]) structures is the key to the creation of modern products in mechanical engineering companies. This paper presents a method framework for the creation and control of eBOM and, especially, mBOM structures. The requirements, resulting from the need to differentiate between companies that produce serialized and engineered-to-order products, are considered in the analysis phase. The main part of the paper describes different approaches to the fully or partly automated creation of the mBOM. The first approach is the definition of part-selection rules in generic mBOM templates; for partly standardized products, the mBOM can be derived from the eBOM with this method, as sketched below. Another approach is the simultaneous use of semantic rules, options, and parameters in both structures. The implementation of the method framework (selection of use cases) in a standard product lifecycle management (PLM) system is part of the research.
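
    As a hedged illustration of the first approach (part-selection rules in generic mBOM templates), the Python sketch below derives a station-oriented mBOM from an eBOM item list. All data structures, rule fields, and station names are hypothetical; a PLM implementation would evaluate semantic rules against far richer part metadata.

        # Hypothetical eBOM items with the attributes the rules inspect.
        ebom = [
            {"part": "HOUSING-01", "type": "machined"},
            {"part": "PCB-07", "type": "electronic"},
            {"part": "PAINT-02", "type": "process_material"},
        ]

        # Generic mBOM template: each manufacturing station selects parts by rule.
        mbom_template = {
            "machining_station": lambda item: item["type"] == "machined",
            "smt_line": lambda item: item["type"] == "electronic",
            "finishing": lambda item: item["type"] == "process_material",
        }

        def derive_mbom(ebom_items, template):
            """Derive a station-oriented mBOM from the eBOM via selection rules."""
            return {station: [i["part"] for i in ebom_items if rule(i)]
                    for station, rule in template.items()}

        print(derive_mbom(ebom, mbom_template))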

  3. [Automated morphometric evaluation of the chromatin structure of liver cell nuclei after vagotomy].

    Science.gov (United States)

    Butusova, N N; Zhukotskiĭ, A V; Sherbo, I V; Gribkov, E N; Dubovaia, T K

    1989-05-01

    The morphometric analysis of interphase chromatin structure in hepatic cell nuclei was carried out on the automated television system for quantitative image analysis IBAS-2 (OPTON, FRG), using 50 optical and geometric parameters, at various times (1, 2 and 4 weeks) after vagotomy. It was determined that the supramolecular organization of chromatin undergoes the greatest changes one week after the operation, and that changes in the granular component are more informative than changes in the non-granular component (by a margin of 15-20%). It was also revealed that the chromatin components differ in tinctorial properties, which evidently depend on the physicochemical characteristics of the chromatin under various functional conditions of the cell. Correlation analysis revealed a group of morphometric indices of chromatin structure that are highly correlated with the level of chromatin transcription activity at various times after denervation; the correlation coefficients of these parameters are 0.85-0.97. In summary, vagus denervation of the liver causes changes in the morphofunctional organization of the chromatin.

  4. CONCEPT AND STRUCTURE OF AUTOMATED SYSTEM FOR MONITORING STUDENT LEARNING QUALITY

    Directory of Open Access Journals (Sweden)

    M. Yu. Kataev

    2017-01-01

    The article describes the concept and structure of an automated system for monitoring the quality of student learning, intended to support the organization and management of the learning process in a higher educational institution. The factors that affect the level of knowledge students obtain during training are shown, and on this basis the determining factors for assessing the level of knowledge are highlighted. It is proposed to manage individual training over any time interval on the basis of a generalized criterion combining a student's current progress, activity, and time spent on training. The block structure of the automated program system for continuous monitoring of each student's achievements is described; all functional blocks of the system are interconnected with the educational process. The main advantage of this system is that students have continuous access to materials about their own individual achievements and mistakes: from passive consumers of information they turn into active participants in their education, and can thus achieve greater effectiveness in personal vocational training. The information base of such a system has to be available not only to students and teachers, but also to the future employers of university graduates. Practical significance: the concept of the automated system for monitoring education results and the technique for processing the collected material presented in the article rest on a simple and obvious circumstance: a student with high progress spends more time on training and leads a more active lifestyle than fellow students, and will therefore, with high probability, be more successful in the chosen profession. Thus, complete, fully detailed and digitized information on the individual educational achievements of a future specialist is necessary not only for effective management of the educational process in higher education institutions, but also for employers interested in well-prepared, qualified and hard-working staff ready to take responsibility for their labour duties.
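
    As a minimal sketch, the generalized criterion described above can be read as a weighted combination of normalized indicators; the weights and the normalization below are illustrative assumptions, since the record does not specify them.

        def generalized_criterion(progress, activity, hours,
                                  weights=(0.5, 0.3, 0.2), max_hours=40.0):
            """Combine progress, activity and study time (each mapped to [0, 1])."""
            indicators = (progress, activity, min(hours / max_hours, 1.0))
            return sum(w * x for w, x in zip(weights, indicators))

        # A student with 80% progress, 60% activity and 30 h of study:
        print(round(generalized_criterion(0.8, 0.6, 30.0), 3))   # 0.73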

  5. Structural Design Optimization of Doubly-Fed Induction Generators Using GeneratorSE

    Energy Technology Data Exchange (ETDEWEB)

    Sethuraman, Latha [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fingersh, Lee J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Arthurs, Claire [Georgia Institute of Technology]

    2017-11-13

    A wind turbine with a larger rotor swept area can generate more electricity, however, this increases costs disproportionately for manufacturing, transportation, and installation. This poster presents analytical models for optimizing doubly-fed induction generators (DFIGs), with the objective of reducing the costs and mass of wind turbine drivetrains. The structural design for the induction machine includes models for the casing, stator, rotor, and high-speed shaft developed within the DFIG module in the National Renewable Energy Laboratory's wind turbine sizing tool, GeneratorSE. The mechanical integrity of the machine is verified by examining stresses, structural deflections, and modal properties. The optimization results are then validated using finite element analysis (FEA). The results suggest that our analytical model correlates with the FEA in some areas, such as radial deflection, differing by less than 20 percent. But the analytical model requires further development for axial deflections, torsional deflections, and stress calculations.

  6. MMS control system analysis using automated root-locus plot generation

    International Nuclear Information System (INIS)

    Hefler, J.W.

    1987-01-01

    Use of the Modular Modeling System (MMS) for control systems improvement has been impeded by the need to plot eigenvalues manually. This problem has been solved by an automatic eigenvalue plotting routine. A practical procedure for control systems analysis based upon automatically generated root-locus plots has been developed using the Advanced Continuous Simulation Language (ACSL)-based version of the Modular Modeling System. Examples are given of typical ACSL run-time statements. Actual root-locus and time history plots are shown for simple models (4 state variables). More complex models are discussed. The plots show the control systems response before and after the determination of tuning parameters using the methods described

  7. CFSAN SNP Pipeline: an automated method for constructing SNP matrices from next-generation sequence data

    Directory of Open Access Journals (Sweden)

    Steve Davis

    2015-08-01

    The analysis of next-generation sequence (NGS) data is often a fragmented, step-wise process. For example, multiple pieces of software are typically needed to map NGS reads, extract variant sites, and construct a DNA sequence matrix containing only single nucleotide polymorphisms (i.e., a SNP matrix) for a set of individuals. The management and chaining of these software pieces and their outputs can often be a cumbersome and difficult task. Here, we present CFSAN SNP Pipeline, which combines into a single package the mapping of NGS reads to a reference genome with Bowtie2, processing of the resulting mapping (BAM) files using SAMtools, identification of variant sites using VarScan, and production of a SNP matrix using custom Python scripts. We also introduce a Python package (CFSAN SNP Mutator) that, when given a reference genome, will generate variants of known position against which we validate our pipeline. We created 1,000 simulated Salmonella enterica sp. enterica Serovar Agona genomes at 100× and 20× coverage, each containing 500 SNPs, 20 single-base insertions and 20 single-base deletions. For the 100× dataset, the CFSAN SNP Pipeline recovered 98.9% of the introduced SNPs and had a false positive rate of 1.04 × 10⁻⁶; for the 20× dataset, 98.8% of SNPs were recovered and the false positive rate was 8.34 × 10⁻⁷. Based on these results, CFSAN SNP Pipeline is a robust and accurate tool that is among the first to combine into a single executable the myriad steps required to produce a SNP matrix from NGS data. Such a tool is useful to those working in applied settings (e.g., food safety traceback investigations) as well as to those interested in evolutionary questions.
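
    The Python sketch below illustrates the kind of step chaining the pipeline packages, using the tools named above. It is a simplified illustration rather than the pipeline's own code; sample names, the file layout, and most options are placeholders.

        import subprocess

        def call_snps(sample, ref_fasta, bt2_index):
            """Map, sort, pile up and call SNPs for one paired-end sample."""
            sam, bam, pileup = f"{sample}.sam", f"{sample}.sorted.bam", f"{sample}.pileup"
            # 1. Map reads to the reference genome with Bowtie2.
            subprocess.run(["bowtie2", "-x", bt2_index,
                            "-1", f"{sample}_R1.fastq", "-2", f"{sample}_R2.fastq",
                            "-S", sam], check=True)
            # 2. Coordinate-sort the alignments with SAMtools.
            subprocess.run(["samtools", "sort", "-o", bam, sam], check=True)
            # 3. Produce a pileup for the variant caller.
            with open(pileup, "w") as out:
                subprocess.run(["samtools", "mpileup", "-f", ref_fasta, bam],
                               stdout=out, check=True)
            # 4. Identify SNP sites with VarScan, emitting VCF.
            with open(f"{sample}.vcf", "w") as vcf:
                subprocess.run(["java", "-jar", "VarScan.jar", "mpileup2snp",
                                pileup, "--output-vcf", "1"], stdout=vcf, check=True)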

  8. Producing genome structure populations with the dynamic and automated PGS software.

    Science.gov (United States)

    Hua, Nan; Tjong, Harianto; Shin, Hanjun; Gong, Ke; Zhou, Xianghong Jasmine; Alber, Frank

    2018-05-01

    Chromosome conformation capture technologies such as Hi-C are widely used to investigate the spatial organization of genomes. Because genome structures can vary considerably between individual cells of a population, interpreting ensemble-averaged Hi-C data can be challenging, in particular for long-range and interchromosomal interactions. We pioneered a probabilistic approach for the generation of a population of distinct diploid 3D genome structures consistent with all the chromatin-chromatin interaction probabilities from Hi-C experiments. Each structure in the population is a physical model of the genome in 3D. Analysis of these models yields new insights into the causes and the functional properties of the genome's organization in space and time. We provide a user-friendly software package, called PGS, which runs on local machines (for practice runs) and high-performance computing platforms. PGS takes a genome-wide Hi-C contact frequency matrix, along with information about genome segmentation, and produces an ensemble of 3D genome structures entirely consistent with the input. The software automatically generates an analysis report, and provides tools to extract and analyze the 3D coordinates of specific domains. Basic Linux command-line knowledge is sufficient for using this software. A typical running time of the pipeline is ∼3 d with 300 cores on a computer cluster to generate a population of 1,000 diploid genome structures at topological-associated domain (TAD)-level resolution.

  9. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction

    Science.gov (United States)

    Puton, Tomasz; Kozlowski, Lukasz P.; Rother, Kristian M.; Bujnicki, Janusz M.

    2013-01-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on the average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On the average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in the new editions of CompaRNA benchmarks. PMID:23435231

  10. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction.

    Science.gov (United States)

    Puton, Tomasz; Kozlowski, Lukasz P; Rother, Kristian M; Bujnicki, Janusz M

    2013-04-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on the average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On the average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in the new editions of CompaRNA benchmarks.

  11. Generation of Elliptically Polarized Terahertz Waves from Antiferromagnetic Sandwiched Structure.

    Science.gov (United States)

    Zhou, Sheng; Zhang, Qiang; Fu, Shu-Fang; Wang, Xuan-Zhang; Song, Yu-Ling; Wang, Xiang-Guang; Qu, Xiu-Rong

    2018-04-01

    The generation of elliptically polarized electromagnetic waves by an antiferromagnetic (AF)/dielectric sandwiched structure in the terahertz range is studied. The frequency and external magnetic field can change the AF optical response, resulting in the generation of elliptical polarization. An especially useful geometry with a high degree of elliptical polarization is found in the case where the incident electromagnetic wave perpendicularly illuminates the sandwiched structure, the AF anisotropy axis is perpendicular to the wave-vector, and the external magnetic field points along the wave-vector. In the numerical calculations, the AF layer is FeF2 and the dielectric layers are ZnF2. Although the effect originates from the AF layer, it is also influenced by the sandwiched structure. We found that the ZnF2/FeF2/ZnF2 structure possesses optimal rotation of the principal axis and ellipticity, which can reach up to about three times that of a single FeF2 layer.

  12. AN AUTOMATED METHOD FOR 3D ROOF OUTLINE GENERATION AND REGULARIZATION IN AIRBORNE LASER SCANNER DATA

    Directory of Open Access Journals (Sweden)

    S. N. Perera

    2012-07-01

    In this paper, an automatic approach for the generation and regularization of 3D roof boundaries in airborne laser scanner data is presented. The workflow commences with segmentation of the point clouds. A classification step and a rule-based roof extraction step follow the planar segmentation. The roof extraction is then refined in order to minimize the effect of urban vegetation. Boundary points of the connected roof planes are extracted and fitted with series of straight line segments. Each line is then regularized with respect to the dominant building orientation. We introduce the use of cycle graphs to make the best use of topological information. Ridge lines and step edges are extracted to recognize the correct topological relationships among the roof faces. Inner roof corners are geometrically fitted based on the closed cycle graphs. The outer boundary is reconstructed using the same concept, but with the outermost cycle graph, taking the union of the sub-cycles. Intermediate line segments (outer bounds) are intersected to reconstruct the roof eave lines. Two test areas with two different point densities are tested with the developed approach. A performance analysis of the test results is provided to demonstrate the applicability of the method.

  13. Automated generation of node-splitting models for assessment of inconsistency in network meta-analysis.

    Science.gov (United States)

    van Valkenhoef, Gert; Dias, Sofia; Ades, A E; Welton, Nicky J

    2016-03-01

    Network meta-analysis enables the simultaneous synthesis of a network of clinical trials comparing any number of treatments. Potential inconsistencies between estimates of relative treatment effects are an important concern, and several methods to detect inconsistency have been proposed. This paper is concerned with the node-splitting approach, which is particularly attractive because of its straightforward interpretation, contrasting estimates from both direct and indirect evidence. However, node-splitting analyses are labour-intensive because each comparison of interest requires a separate model. It would be advantageous if node-splitting models could be estimated automatically for all comparisons of interest. We present an unambiguous decision rule to choose which comparisons to split, and prove that it selects only comparisons in potentially inconsistent loops in the network, and that all potentially inconsistent loops in the network are investigated. Moreover, the decision rule circumvents problems with the parameterisation of multi-arm trials, ensuring that model generation is trivial in all cases. Thus, our methods eliminate most of the manual work involved in using the node-splitting approach, enabling the analyst to focus on interpreting the results. © 2015 The Authors Research Synthesis Methods Published by John Wiley & Sons Ltd.
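
    One way to read the selection step is sketched below: a direct comparison is worth splitting only if an independent indirect path also connects the two treatments, i.e. the comparison lies on a loop of the evidence network. This toy version only tests connectivity after removing the direct edge; the published decision rule additionally handles the parameterisation of multi-arm trials, which is omitted here.

        from collections import defaultdict, deque

        def comparisons_to_split(edges):
            """Return the directly compared pairs that also have indirect evidence."""
            graph = defaultdict(set)
            for a, b in edges:
                graph[a].add(b)
                graph[b].add(a)

            def connected(src, dst, banned):
                seen, queue = {src}, deque([src])
                while queue:
                    node = queue.popleft()
                    for nxt in graph[node]:
                        if {node, nxt} == set(banned):   # skip the direct edge
                            continue
                        if nxt == dst:
                            return True
                        if nxt not in seen:
                            seen.add(nxt)
                            queue.append(nxt)
                return False

            return [(a, b) for a, b in edges if connected(a, b, banned=(a, b))]

        # Triangle A-B-C plus a dangling comparison C-D: only loop edges split.
        print(comparisons_to_split([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]))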

  14. Automated Kinematics Equations Generation and Constrained Motion Planning Resolution for Modular and Reconfigurable Robots

    Energy Technology Data Exchange (ETDEWEB)

    Pin, Francois G.; Love, Lonnie L.; Jung, David L.

    2004-03-29

    Contrary to the repetitive tasks performed by industrial robots, the tasks in most DOE missions such as environmental restoration or Decontamination and Decommissioning (D&D) can be characterized as "batches-of-one", in which robots must be capable of adapting to changes in constraints, tools, environment, criteria and configuration. No commercially available robot control code is suitable for use with such widely varying conditions. In this talk we present our development of a "generic code" to allow real-time (at loop rate) robot behavior adaptation to changes in task objectives, tools, number and type of constraints, modes of control or kinematics configuration. We present the analytical framework underlying our approach and detail the design of its two major modules: one for the automatic generation of the kinematics equations when the robot configuration or tools change, and one for motion planning under time-varying constraints. Sample problems illustrating the capabilities of the developed system are presented.

  15. Automated Generation of 3D Volcanic Gas Plume Models for Geobrowsers

    Science.gov (United States)

    Wright, T. E.; Burton, M.; Pyle, D. M.

    2007-12-01

    A network of five UV spectrometers on Etna automatically gathers column amounts of SO2 during daylight hours. Near-simultaneous scans from adjacent spectrometers, comprising 210 column amounts in total, are then converted to 2D slices showing the spatial distribution of the gas by tomographic reconstruction. The trajectory of the plume is computed using an automatically-submitted query to NOAA's HYSPLIT Trajectory Model. This also provides local estimates of air temperature, which are used to determine the atmospheric stability and therefore the degree to which the plume is dispersed by turbulence. This information is sufficient to construct an animated sequence of models which show how the plume is advected and diffused over time. These models are automatically generated in the Collada Digital Asset Exchange format and combined into a single file which displays the evolution of the plume in Google Earth. These models are useful for visualising and predicting the shape and distribution of the plume for civil defence, to assist field campaigns and as a means of communicating some of the work of volcano observatories to the public. The Simultaneous Algebraic Reconstruction Technique is used to create the 2D slices. This is a well-known method, based on iteratively updating a forward model (from 2D distribution to column amounts). Because it is based on a forward model, it also provides a simple way to quantify errors.
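
    A compact numpy sketch of the Simultaneous Algebraic Reconstruction Technique named above, for a linear forward model A that maps a discretized 2D gas distribution x to the measured column amounts b. The geometry, relaxation factor, and iteration count are illustrative assumptions, not the deployed system's values.

        import numpy as np

        def sart(A, b, iterations=50, relax=0.5):
            """Iteratively feed back ray residuals, normalized per row and column."""
            x = np.zeros(A.shape[1])
            row_sums = np.where(A.sum(axis=1) == 0, 1.0, A.sum(axis=1))
            col_sums = np.where(A.sum(axis=0) == 0, 1.0, A.sum(axis=0))
            for _ in range(iterations):
                residual = (b - A @ x) / row_sums
                x += relax * (A.T @ residual) / col_sums
                x = np.clip(x, 0.0, None)   # column amounts cannot be negative
            return x

        # Tiny example: three rays crossing four cells.
        A = np.array([[1.0, 1.0, 0.0, 0.0],
                      [0.0, 1.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0, 1.0]])
        true_x = np.array([0.0, 2.0, 1.0, 0.5])
        print(sart(A, A @ true_x).round(2))   # approaches a solution of A x = b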

  16. Automated graphic image generation system for effective representation of infectious disease surveillance data.

    Science.gov (United States)

    Inoue, Masashi; Hasegawa, Shinsaku; Suyama, Akihiko; Meshitsuka, Shunsuke

    2003-11-01

    Infectious disease surveillance schemes have been established to detect infectious disease outbreaks in the early stages, to identify the causative viral strains, and to rapidly assess related morbidity and mortality. To make such a scheme function well, two things are required. Firstly, it must have sufficient sensitivity and be timely, guaranteeing as short a delay as possible from collection to redistribution of information. Secondly, it must provide a good representation of the results of the surveillance. To this end, we have developed a database system that redistributes the information via the Internet. The feature of this system is that it automatically generates graphic images from the numerical data stored in the database, using Hypertext Preprocessor (PHP) scripts and the Graphics Drawing (GD) library. It dynamically displays the information as a map or bar chart, as well as numerically, according to the real-time demands of the users. This system will be a useful tool for medical personnel and researchers working on infectious disease problems and will save significant time in the redistribution of information.
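
    The deployed system renders its images with PHP and the GD library; purely for illustration (and to keep one language across the examples in this document), the same render-on-demand idea in Python with matplotlib might look like this, with invented counts:

        import matplotlib
        matplotlib.use("Agg")                     # render to a file; no display
        import matplotlib.pyplot as plt

        def render_weekly_bar_chart(counts, out_path="surveillance.png"):
            """Turn weekly case counts from the database into a bar-chart image."""
            weeks = sorted(counts)
            plt.figure(figsize=(6, 3))
            plt.bar(weeks, [counts[w] for w in weeks])
            plt.xlabel("Reporting week")
            plt.ylabel("Reported cases")
            plt.tight_layout()
            plt.savefig(out_path, dpi=150)
            plt.close()

        render_weekly_bar_chart({1: 12, 2: 19, 3: 31, 4: 22})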

  17. HPV-QUEST: A highly customized system for automated HPV sequence analysis capable of processing Next Generation sequencing data set.

    Science.gov (United States)

    Yin, Li; Yao, Jiqiang; Gardner, Brent P; Chang, Kaifen; Yu, Fahong; Goodenow, Maureen M

    2012-01-01

    Next Generation sequencing (NGS) applied to human papillomaviruses (HPV) can provide sensitive methods to investigate the molecular epidemiology of multiple-type HPV infection. A genotyping system with a comprehensive collection of updated HPV reference sequences and the capacity to handle NGS data sets has been lacking. HPV-QUEST was developed as an automated and rapid HPV genotyping system. The web-based HPV-QUEST subtyping algorithm was developed using HTML, PHP, the Perl scripting language, and MYSQL as the database backend. HPV-QUEST includes a database of annotated HPV reference sequences with updated nomenclature, covering 5 genera, 14 species and 150 mucosal and cutaneous types, against which BLAST-matched query sequences are genotyped. HPV-QUEST processes up to 10 megabases of sequences within 1 to 2 minutes. Results are reported in HTML, text and Excel formats and display the e-value, BLAST score, and local and coverage identities; they provide the genus, species, type, infection site and risk for the best-matched reference HPV sequence; and they produce output ready for additional analyses.
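
    As a small illustration of the genotype-calling idea (best annotated reference match per query), the Python sketch below keeps the top hit per query from tabular BLAST output. It assumes the standard 12-column outfmt 6 layout; the file name is a placeholder, and HPV-QUEST's actual scoring and annotation logic is considerably richer.

        import csv

        def best_hits(blast_tsv):
            """Keep, per query, the hit with the lowest e-value (highest score)."""
            best = {}
            with open(blast_tsv) as fh:
                for row in csv.reader(fh, delimiter="\t"):
                    query, subject = row[0], row[1]
                    evalue, bitscore = float(row[10]), float(row[11])
                    current = best.get(query)
                    if current is None or (evalue, -bitscore) < (current[1], -current[2]):
                        best[query] = (subject, evalue, bitscore)
            return best

        for query, (ref, ev, score) in best_hits("hpv_hits.tsv").items():
            print(f"{query}\tbest match: {ref}\te-value: {ev:g}\tscore: {score:g}")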

  18. CIF2Cell: Generating geometries for electronic structure programs

    Science.gov (United States)

    Björkman, Torbjörn

    2011-05-01

    The CIF2Cell program generates the geometrical setup for a number of electronic structure programs based on the crystallographic information in a Crystallographic Information Framework (CIF) file. The program will retrieve the space group number, Wyckoff positions and crystallographic parameters, make a sensible choice for Bravais lattice vectors (primitive or principal cell) and generate all atomic positions. Supercells can be generated and alloys are handled gracefully. The code currently has output interfaces to the electronic structure programs ABINIT, CASTEP, CPMD, Crystal, Elk, Exciting, EMTO, Fleur, RSPt, Siesta and VASP.

    Program summary:
    Program title: CIF2Cell
    Catalogue identifier: AEIM_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIM_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU GPL version 3
    No. of lines in distributed program, including test data, etc.: 12 691
    No. of bytes in distributed program, including test data, etc.: 74 933
    Distribution format: tar.gz
    Programming language: Python (versions 2.4-2.7)
    Computer: Any computer that can run Python (versions 2.4-2.7)
    Operating system: Any operating system that can run Python (versions 2.4-2.7)
    Classification: 7.3, 7.8, 8
    External routines: PyCIFRW [1]
    Nature of problem: Generate the geometrical setup of a crystallographic cell for a variety of electronic structure programs from data contained in a CIF file.
    Solution method: The CIF file is parsed using routines contained in the library PyCIFRW [1], and crystallographic as well as bibliographic information is extracted. The program then generates the principal cell from symmetry information, crystal parameters, space group number and Wyckoff sites. Reduction to a primitive cell is then performed, and the resulting cell is output to suitably named files along with documentation of the information source generated from any bibliographic information contained in the CIF file.
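
    A hedged sketch of the kind of CIF parsing that CIF2Cell delegates to PyCIFRW: read a file, then pull the space group number and cell parameters. The tag names follow the standard CIF dictionaries and the file name is a placeholder; CIF2Cell's own handling covers many more tags and corner cases.

        from CifFile import ReadCif

        def first_present(block, *tags):
            """Return the value of the first tag present in the CIF block."""
            for tag in tags:
                if tag in block:
                    return block[tag]
            return None

        cif = ReadCif("structure.cif")            # placeholder file name
        block = cif.first_block()
        spacegroup = first_present(block, "_space_group_IT_number",
                                   "_symmetry_Int_Tables_number")
        # Cell parameters may carry uncertainties such as "5.4306(2)"; strip them.
        cell = {tag: float(block[tag].split("(")[0])
                for tag in ("_cell_length_a", "_cell_length_b", "_cell_length_c",
                            "_cell_angle_alpha", "_cell_angle_beta",
                            "_cell_angle_gamma")}
        print(spacegroup, cell)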

  19. Generative probabilistic models extend the scope of inferential structure determination

    DEFF Research Database (Denmark)

    Olsson, Simon; Boomsma, Wouter; Frellsen, Jes

    2011-01-01

    Conventional methods for protein structure determination from NMR data rely on the ad hoc combination of physical forcefields and experimental data, along with heuristic determination of free parameters such as the weight of the experimental data relative to a physical forcefield. Recently, a theoretically rigorous approach that treats structure determination as a problem of Bayesian inference was introduced. Here we demonstrate that the use of generative probabilistic models instead of physical forcefields in this Bayesian formalism is not only conceptually attractive, but also improves precision and efficiency. Our results open new vistas for the use of sophisticated probabilistic models of biomolecular structure in inferential structure determination.

  20. The structure of carbosilane dendrimers of higher generations

    International Nuclear Information System (INIS)

    Rogachev, A.V.; Kuklin, A.I.; Chernyj, A.Yu.; Ozerin, A.N.; Muzafarov, A.M.; Tatarinova, E.A.; Gordelij, V.I.

    2008-01-01

    Using the small-angle neutron scattering method, we investigate the structure of carbosilane dendrimers of the ninth generation with a four-functional core and butyl terminal groups. It is shown that the dendrimers in question are monodisperse objects of anisometric form. The values of the partial volume and the mean scattering length density are determined with the contrast variation method. The studied dendrimers exhibit the same size and distribution of the scattering length density. It is found that about 20% of the interior dendrimer volume is permeable to solvent. Using a Monte Carlo simulation, we reconstruct the spatial distribution of scattering length density over the dendrimers and reveal changes in the excluded volume at different contrasts. The spatial structure features of carbosilane dendrimers of higher generations are discussed.

  1. Micromachined ultrasonic droplet generator based on a liquid horn structure

    Science.gov (United States)

    Meacham, J. M.; Ejimofor, C.; Kumar, S.; Degertekin, F. L.; Fedorov, A. G.

    2004-05-01

    A micromachined ultrasonic droplet generator is developed and demonstrated for drop-on-demand fluid atomization. The droplet generator comprises a bulk ceramic piezoelectric transducer for ultrasound generation, a reservoir for the ejection fluid, and a silicon micromachined liquid horn structure as the nozzle. The nozzles are formed using a simple batch microfabrication process that involves wet etching of (100) silicon in potassium hydroxide solution. Device operation is demonstrated by droplet ejection of water through 30 μm orifices at 1.49 and 2.30 MHz. The finite-element simulations of the acoustic fields in the cavity and electrical impedance of the device are in agreement with the measurements and indicate that the device utilizes cavity resonances in the 1-5 MHz range in conjunction with acoustic wave focusing by the pyramidally shaped nozzles to achieve low power operation.

  2. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs (PP-CPNs), a subclass of CPNs equipped with an explicit separation of process control flow, message passing, and access to shared and local data. We show how PP-CPNs cater for a four-phase, structure-based automatic code generation process directed by the control flow of processes. The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF).

  3. SABATPG-A Structural Analysis Based Automatic Test Generation System

    Institute of Scientific and Technical Information of China (English)

    李忠诚; 潘榆奇; 闵应骅

    1994-01-01

    A test pattern generation (TPG) system, SABATPG, is presented based on a generic structural model of large circuits. Three techniques (partial implication, aftereffect of identified undetectable faults, and shared sensitization with the new concepts of localization and aftereffect) are employed in the system to improve the FAN algorithm. Experiments on the 10 ISCAS benchmark circuits show that the computing time of SABATPG for test generation is 19.42% less than that of the FAN algorithm.

  4. Technology standards for structure, etc. concerning nuclear power generating facilities

    International Nuclear Information System (INIS)

    1977-01-01

    Based on the Ordinance for the Technology Standards concerning Nuclear Power Generating Facilities, technology standards are established for class 1 to 4 vessels (including reactor pressure vessels, reactor containment vessels, etc.), class 1 to 3 pipes, safety valves, and pressure test and monitoring test specimens. The standards specify materials, nondestructive tests, structures, shapes, shells, flanges, etc., for the vessels, the pipes and the other components. (Mori, K.)

  5. Application of an automated wireless structural monitoring system for long-span suspension bridges

    International Nuclear Information System (INIS)

    Kurata, M.; Lynch, J. P.; Linden, G. W. van der; Hipley, P.; Sheng, L.-H.

    2011-01-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  6. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges

    Science.gov (United States)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.

    2011-06-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  7. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis at a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested …
  8. Automated cloning methods.

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones using a robotic system designed for procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37°C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active stations on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass the protocols involved in generating purified coding region (PCR) fragments …

  9. Review on structured optical field generated from array beams

    Science.gov (United States)

    Hou, Tianyue; Zhou, Pu; Ma, Yanxing; Zhi, Dong

    2018-03-01

    Structured optical fields (SOFs), which include vortex beams, non-diffracting beams, cylindrical vector beams and so on, have been under intensive theoretical and experimental investigation in recent years. Current research generally focuses on their extraordinary properties (non-diffracting propagation, helical wavefronts, rotation of the electric field, et al.), which can be widely applied in micro-particle manipulation, super-resolution imaging, free-space communication and so on. There are two main technical routes: inner-cavity and outer-cavity (spatial light modulators, diffractive phase holograms, q-plates). To date, most SOFs generated by either route involve a single monolithic beam. As a novel technical route, SOF generation based on array beams offers more flexible degrees of freedom and power-scaling potential. In this paper, research achievements in SOF generation based on array beams are reviewed and discussed in detail. Moreover, an experiment generating an exotic beam from array beams is introduced, which illustrates that SOF generation from array beams is theoretically valid and experimentally feasible. SOFs generated from array beams are also beneficial for increasing capacity and improving data reception in long-distance free-space optical communication systems.

  10. Algorithms and data structures for automated change detection and classification of sidescan sonar imagery

    Science.gov (United States)

    Gendron, Marlin Lee

    During Mine Warfare (MIW) operations, MIW analysts perform change detection by visually comparing historical sidescan sonar imagery (SSI) collected by a sidescan sonar with recently collected SSI in an attempt to identify objects (which might be explosive mines) placed at sea since the last time the area was surveyed. This dissertation presents a data structure and three algorithms, developed by the author, that are part of an automated change detection and classification (ACDC) system. To reduce the amount of time needed to perform change detection, MIW analysts at the Naval Oceanographic Office are currently using ACDC. The dissertation's introductory chapter gives background information on change detection and ACDC, and describes how SSI is produced from raw sonar data. Chapter 2 presents the author's Geospatial Bitmap (GB) data structure, which is capable of storing information geographically and is utilized by the three algorithms. This chapter shows that a GB data structure used in a polygon-smoothing algorithm ran between 1.3 and 48.4 times faster than a sparse matrix data structure. Chapter 3 describes the GB clustering algorithm, which is the author's repeatable, order-independent method for clustering. Results from tests performed in this chapter show that the time to cluster a set of points is not affected by the distribution or the order of the points. In Chapter 4, the author presents his real-time computer-aided detection (CAD) algorithm that automatically detects mine-like objects on the seafloor in SSI. The author ran his GB-based CAD algorithm on real SSI data, and the results of these tests indicate that his real-time CAD algorithm performs comparably to or better than other non-real-time CAD algorithms. The author presents his computer-aided search (CAS) algorithm in Chapter 5. CAS helps MIW analysts locate mine-like features that are geospatially close to previously detected features. A comparison between the CAS and a great circle distance algorithm shows that the …

  11. Automated artery-venous classification of retinal blood vessels based on structural mapping method

    Science.gov (United States)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2012-03-01

    Retinal blood vessels show morphologic modifications in response to various retinopathies. The specific responses exhibited by arteries and veins may, however, provide more precise diagnostic information; e.g., diabetic retinopathy may be detected more accurately from venous dilatation than from average vessel dilatation. In order to analyze vessel-type-specific morphologic modifications, the classification of a vessel network into arteries and veins is required. We previously described a method for the identification and separation of retinal vessel trees, i.e., structural mapping. Here we propose artery-venous classification based on structural mapping and on the identification of color properties prominent to the vessel types. The mean and standard deviation of the green-channel intensity and of the hue-channel intensity are analyzed in a region of interest around each centerline pixel of a vessel. Using the vector of color properties extracted for each centerline pixel, the pixel is classified into one of two clusters (artery and vein) obtained by fuzzy C-means clustering. According to the proportion of clustered centerline pixels in a particular vessel, and utilizing the artery-venous crossing property of retinal vessels, each vessel is assigned the label of an artery or a vein. The classification results are compared with the manually annotated ground truth (gold standard). We applied the proposed method to a dataset of 15 retinal color fundus images, resulting in an accuracy of 88.28% correctly classified vessel pixels. The automated classification results match well with the gold standard, suggesting its potential for artery-venous classification and the respective morphology analysis.
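
    To make the clustering step concrete, below is a self-contained numpy implementation of two-cluster fuzzy C-means over the four color features named above (mean and standard deviation of green intensity and of hue). The feature values are invented for illustration; the published method further propagates labels along vessels and applies the artery-venous crossing property.

        import numpy as np

        def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
            """Return fuzzy memberships U (rows sum to 1) and cluster centers."""
            rng = np.random.default_rng(seed)
            U = rng.random((len(X), c))
            U /= U.sum(axis=1, keepdims=True)
            for _ in range(iters):
                W = U ** m                                    # fuzzified weights
                centers = (W.T @ X) / W.sum(axis=0)[:, None]
                d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
                U = 1.0 / d ** (2.0 / (m - 1.0))
                U /= U.sum(axis=1, keepdims=True)
            return U, centers

        # Rows: [mean green, std green, mean hue, std hue] per centerline pixel.
        features = np.array([[0.62, 0.05, 0.08, 0.01],   # brighter: artery-like
                             [0.60, 0.06, 0.09, 0.02],
                             [0.35, 0.04, 0.05, 0.01],   # darker: vein-like
                             [0.33, 0.05, 0.04, 0.02]])
        U, _ = fuzzy_c_means(features)
        print(U.argmax(axis=1))   # hard label (artery/vein cluster) per pixel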

  12. Structural materials issues for the next generation fission reactors

    Science.gov (United States)

    Chant, I.; Murty, K. L.

    2010-09-01

    Generation-IV reactor design concepts envisioned thus far cater to a common goal of providing safer, longer lasting, proliferation-resistant, and economically viable nuclear power plants. The foremost consideration in the successful development and deployment of Gen-IV reactor systems is the performance and reliability issues involving structural materials for both in-core and out-of-core applications. The structural materials need to endure much higher temperatures, higher neutron doses, and extremely corrosive environments, which are beyond the experience of current nuclear power plants. Materials under active consideration for use in different reactor components include various ferritic/martensitic steels, austenitic stainless steels, nickel-base superalloys, ceramics, composites, etc. This article addresses the material requirements for these advanced fission reactor types, specifically addressing structural materials issues depending on the specific application areas.

  13. H++ 3.0: automating pK prediction and the preparation of biomolecular structures for atomistic molecular modeling and simulations.

    Science.gov (United States)

    Anandakrishnan, Ramu; Aguilar, Boris; Onufriev, Alexey V

    2012-07-01

    The accuracy of atomistic biomolecular modeling and simulation studies depends on the accuracy of the input structures. Preparing these structures for an atomistic modeling task, such as a molecular dynamics (MD) simulation, can involve the use of a variety of different tools for: correcting errors, adding missing atoms, filling valences with hydrogens, predicting pK values for titratable amino acids, assigning predefined partial charges and radii to all atoms, and generating force field parameter/topology files for MD. Identifying, installing and effectively using the appropriate tools for each of these tasks can be difficult for novice users and time-consuming for experienced ones. H++ (http://biophysics.cs.vt.edu/) is a free open-source web server that automates the above key steps in the preparation of biomolecular structures for molecular modeling and simulations. H++ also performs extensive error and consistency checking, providing error/warning messages together with the suggested corrections. In addition to numerous minor improvements, the latest version of H++ includes several new capabilities and options: fixing erroneous (flipped) side-chain conformations for HIS, GLN and ASN, including a ligand in the input structure, processing nucleic acid structures, and generating a solvent box with a specified number of common ions for explicit solvent MD.

  14. TopoGen: A Network Topology Generation Architecture with application to automating simulations of Software Defined Networks

    CERN Document Server

    Laurito, Andres; The ATLAS collaboration

    2017-01-01

    Simulation is an important tool to validate the performance impact of control decisions in Software Defined Networks (SDN). Yet, the manual modeling of complex topologies that may change often during a design process can be a tedious error-prone task. We present TopoGen, a general purpose architecture and tool for systematic translation and generation of network topologies. TopoGen can be used to generate network simulation models automatically by querying information available at diverse sources, notably SDN controllers. The DEVS modeling and simulation framework facilitates a systematic translation of structured knowledge about a network topology into a formal modular and hierarchical coupling of preexisting or new models of network entities (physical or logical). TopoGen can be flexibly extended with new parsers and generators to grow its scope of applicability. This permits to design arbitrary workflows of topology transformations. We tested TopoGen in a network engineering project for the ATLAS detector ...
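
    As a toy illustration of the translation idea, the Python sketch below turns a topology description (a plain dict standing in for what an SDN controller would report) into one model instance per node plus a directed coupling per link endpoint. All names and the output form are hypothetical; TopoGen itself emits DEVS models.

        # Hypothetical topology, e.g. assembled from SDN controller queries.
        topology = {
            "switches": ["s1", "s2"],
            "hosts": ["h1", "h2"],
            "links": [("h1", "s1"), ("s1", "s2"), ("s2", "h2")],
        }

        def generate_couplings(topo):
            """Instantiate one model per node and couple the ports of each link."""
            models = {name: {"kind": "switch" if name in topo["switches"] else "host"}
                      for name in topo["switches"] + topo["hosts"]}
            couplings = [(f"{a}.out", f"{b}.in") for a, b in topo["links"]]
            couplings += [(f"{b}.out", f"{a}.in") for a, b in topo["links"]]
            return models, couplings

        models, couplings = generate_couplings(topology)
        print(len(models), "models;", len(couplings), "directed couplings")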

  15. TopoGen: A Network Topology Generation Architecture with application to automating simulations of Software Defined Networks

    CERN Document Server

    Laurito, Andres; The ATLAS collaboration

    2018-01-01

    Simulation is an important tool to validate the performance impact of control decisions in Software Defined Networks (SDN). Yet, the manual modeling of complex topologies that may change often during a design process can be a tedious error-prone task. We present TopoGen, a general purpose architecture and tool for systematic translation and generation of network topologies. TopoGen can be used to generate network simulation models automatically by querying information available at diverse sources, notably SDN controllers. The DEVS modeling and simulation framework facilitates a systematic translation of structured knowledge about a network topology into a formal modular and hierarchical coupling of preexisting or new models of network entities (physical or logical). TopoGen can be flexibly extended with new parsers and generators to grow its scope of applicability. This permits to design arbitrary workflows of topology transformations. We tested TopoGen in a network engineering project for the ATLAS detector ...

  16. Automated global structure extraction for effective local building block processing in XCS.

    Science.gov (United States)

    Butz, Martin V; Pelikan, Martin; Llorà, Xavier; Goldberg, David E

    2006-01-01

    Learning Classifier Systems (LCSs), such as the accuracy-based XCS, evolve distributed problem solutions represented by a population of rules. During evolution, features are specialized, propagated, and recombined to provide increasingly accurate subsolutions. Recently, it was shown that, as in conventional genetic algorithms (GAs), some problems require efficient processing of subsets of features to find problem solutions efficiently. In such problems, standard variation operators of genetic and evolutionary algorithms used in LCSs suffer from potential disruption of groups of interacting features, resulting in poor performance. This paper introduces efficient crossover operators to XCS by incorporating techniques derived from competent GAs: the extended compact GA (ECGA) and the Bayesian optimization algorithm (BOA). Instead of simple crossover operators such as uniform crossover or one-point crossover, ECGA or BOA-derived mechanisms are used to build a probabilistic model of the global population and to generate offspring classifiers locally using the model. Several offspring generation variations are introduced and evaluated. The results show that it is possible to achieve performance similar to runs with an informed crossover operator that is specifically designed to yield ideal problem-dependent exploration, exploiting provided problem structure information. Thus, we create the first competent LCSs, XCS/ECGA and XCS/BOA, that detect dependency structures online and propagate corresponding lower-level dependency structures effectively without any information about these structures given in advance.

  17. Application of Real-Time Automated Traffic Incident Response Plan Management System: A Web Structure for the Regional Highway Network in China

    Directory of Open Access Journals (Sweden)

    Yongfeng Ma

    2014-01-01

    Traffic incidents, caused by various factors, may lead to heavy traffic delays and harm the traffic capacity of downstream sections. Traffic incident management (TIM) systems have been widely developed to respond to traffic incidents intelligently and reduce the losses. Traffic incident response plans, as an important component of TIM, can effectively guide responders as to what to do, and how to do it, in traffic incidents. In this paper, a real-time automated traffic incident response plan management system was developed, which can generate and manage traffic incident response plans in a timely, automatic manner. A web application structure and a physical structure were designed to implement and expose these functions. A standard framework for data storage was also developed to save information about traffic incidents and the generated response plans. Furthermore, a confirmation survey and case-based reasoning (CBR) were introduced to identify traffic incidents and to generate traffic incident response plans automatically, respectively. Twenty-three traffic crash-related incidents were selected and three indicators were used to measure the system performance. Results showed that 20 of the 23 cases could be retrieved effectively and accurately. The system is practicable for generating traffic incident response plans and has been implemented in China.

  18. Direct evidence of intra- and interhemispheric corticomotor network degeneration in amyotrophic lateral sclerosis: an automated MRI structural connectivity study.

    Science.gov (United States)

    Rose, Stephen; Pannek, Kerstin; Bell, Christopher; Baumann, Fusun; Hutchinson, Nicole; Coulthard, Alan; McCombe, Pamela; Henderson, Robert

    2012-02-01

    Although the pathogenesis of amyotrophic lateral sclerosis (ALS) is uncertain, there is mounting neuroimaging evidence to suggest a mechanism involving the degeneration of multiple white matter (WM) motor and extramotor neural networks. This insight has been achieved, in part, by using MRI Diffusion Tensor Imaging (DTI) and the voxelwise analysis of anisotropy indices, along with DTI tractography to determine which specific motor pathways are involved in ALS pathology. Automated MRI structural connectivity analyses, which probe WM connections linking various functionally discrete cortical regions, have the potential to provide novel information about degenerative processes within multiple WM pathways. Our hypothesis is that measures of altered intra- and interhemispheric structural connectivity of the primary motor and somatosensory cortex will provide an improved assessment of corticomotor involvement in ALS. To test this hypothesis, we acquired High Angular Resolution Diffusion Imaging (HARDI) scans along with high resolution structural images (sMRI) on 15 patients with clinical evidence of upper and lower motor neuron involvement, and 20 matched control participants. Whole brain probabilistic tractography was applied to define specific WM pathways connecting discrete corticomotor targets generated from anatomical parcellation of sMRI of the brain. The integrity of these connections was interrogated by comparing the mean fractional anisotropy (FA) derived for each WM pathway. To assist in the interpretation of results, we measured the reproducibility of the FA summary measures over time (6 months) in control participants. We also incorporated into our analysis pipeline the evaluation and replacement of outlier voxels due to head motion and physiological noise. When assessing corticomotor connectivity, we found a significant reduction in mean FA within a number of intra- and interhemispheric motor pathways in ALS patients. The abnormal

  19. Generation of equipment response spectrum considering equipment-structure interaction

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Yoo, Kwang Hoon

    2005-01-01

    Floor response spectra for the dynamic response of subsystems such as equipment or piping in a nuclear power plant are usually generated without considering the dynamic interaction between the main structure and the subsystem. Since the dynamic structural response generally has a narrow-banded shape, the resulting floor response spectra developed for various locations in the structure usually have high spectral peak amplitudes in the narrow frequency bands corresponding to the natural frequencies of the structural system. The application of such spectra to the design of subsystems often leads to excessive design conservatism, especially when the equipment and structure are at resonance. Thus, in order to provide a rational and realistic design input for the dynamic analysis and design of equipment, dynamic equipment-structure interaction (ESI) should be considered in developing the equipment response spectrum, which is particularly important for equipment at the resonance condition. Many analytical methods have been proposed in the past for developing equipment response spectra considering ESI. However, most of these methods have not been adopted in practical applications because of either their complexity or their lack of rigor. On one hand, the mass ratio between the equipment and the structure has been used as an important parameter for obtaining equipment response spectra. Similarly, Tseng has proposed an analytical method for developing equipment response spectra using the mass ratio in the frequency domain. This method is analytically rigorous and can be easily validated. It is based on the dynamic substructuring method as applied to dynamic soil-structure interaction (SSI) analysis, and can relatively easily be implemented for practical applications without changing the current dynamic analysis and design practice for subsystems. The equipment response spectra derived in this study are also based on Tseng's proposed method.
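
    As a concrete illustration of the mass-ratio effect described above, the following is a minimal coupled two-degree-of-freedom sketch (not Tseng's frequency-domain substructuring method): an equipment oscillator tuned to the structure's frequency and driven by unit harmonic base acceleration. All masses, frequencies and damping values are invented for the demo.

    ```python
    import numpy as np

    def equipment_fra(m_s, m_e, f_s, f_e, zeta=0.05):
        """Frequency response of an equipment mass mounted on a structure,
        modelled as a coupled 2-DOF system under unit harmonic base
        acceleration. Returns frequencies and equipment amplitudes."""
        k_s, k_e = m_s * (2*np.pi*f_s)**2, m_e * (2*np.pi*f_e)**2
        c_s, c_e = 2*zeta*np.sqrt(k_s*m_s), 2*zeta*np.sqrt(k_e*m_e)
        M = np.array([[m_s, 0.0], [0.0, m_e]])
        C = np.array([[c_s + c_e, -c_e], [-c_e, c_e]])
        K = np.array([[k_s + k_e, -k_e], [-k_e, k_e]])
        freqs = np.linspace(0.1, 20.0, 2000)
        amps = []
        for f in freqs:
            w = 2*np.pi*f
            Z = -w**2 * M + 1j*w*C + K              # dynamic stiffness matrix
            x = np.linalg.solve(Z, -M @ np.ones(2))  # unit base acceleration
            amps.append(abs(x[1]))                   # equipment response
        return freqs, np.array(amps)

    # At resonance (f_e == f_s) the equipment peak drops sharply as the
    # equipment-to-structure mass ratio grows, an effect that uncoupled
    # (cascaded) floor spectra ignore.
    for ratio in (0.001, 0.01, 0.1):
        f, a = equipment_fra(m_s=1e5, m_e=ratio*1e5, f_s=5.0, f_e=5.0)
        print(f"mass ratio {ratio}: peak equipment amplitude {a.max():.3e}")
    ```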

  20. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  1. Automated pattern recognition to support geological mapping and exploration target generation: a case study from southern Namibia

    CSIR Research Space (South Africa)

    Eberle, D

    2015-06-01

    This paper demonstrates a methodology for the automatic joint interpretation of high resolution airborne geophysical and space-borne remote sensing data to support geological mapping in a largely automated, fast and objective manner. At the request...

  2. Corrosion of structural materials for Generation IV systems

    International Nuclear Information System (INIS)

    Balbaud-Celerier, F.; Cabet, C.; Courouau, J.L.; Martinelli, L.; Arnoux, P.

    2009-01-01

    The Generation IV International Forum aims at developing future-generation nuclear energy systems. Six systems have been selected for further consideration: the sodium-cooled fast reactor (SFR), gas-cooled fast reactor (GFR), lead-cooled fast reactor (LFR), molten salt reactor (MSR), supercritical water-cooled reactor (SCWR) and very high temperature reactor (VHTR). CEA, within the framework of a national program, of EC projects and of the GIF, contributes to structural materials development and research programs. In particular, corrosion studies are being performed in the complex environments of the GEN IV systems. Indeed, structural materials encounter very severe conditions with regard to corrosion: high temperatures and possibly aggressive chemical environments. The multiple environments considered therefore require a large diversity of materials. On the other hand, the similar working temperatures and neutron spectra imply similar families of materials across the various systems. In this paper, the status of the research performed at CEA on the corrosion behavior of structural materials in the different environments is presented. The materials studied are either metallic materials, such as austenitic (including Y-, La- or Ce-doped) and ferritic-martensitic steels, Ni-base alloys and ODS steels, or ceramics and composites. In all the environments studied, the scientific approach is identical: the objective is in all cases to understand the corrosion processes, in order to establish recommendations on the chemistry control of the coolant and to predict the long-term behavior of the materials through the development of corrosion models. (author)

  3. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers donated one blood sample, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined on eight runs, where plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis, which subsequently was used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14%, and 9-13% for PPP reagent. This variation can be reduced slightly by using an internal standard, but mainly for the ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and it is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, which is nevertheless considerably lower than the between-subject variation when PPPlow is used as the reagent.
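
    The two statistics at play, within-run analytical variation and internal-standard normalization, reduce to a few lines of arithmetic. The sketch below is generic (a plain coefficient of variation and a simple ratio scaling), not the paper's exact variance-component model; the numerical values are illustrative.

    ```python
    import numpy as np

    def analytical_cv(triplicates):
        """Within-run coefficient of variation (%) for a thrombin-generation
        parameter measured in triplicate."""
        reps = np.asarray(triplicates, dtype=float)
        return 100.0 * reps.std(ddof=1) / reps.mean()

    def normalize_to_internal_standard(sample_etp, standard_etp, standard_ref):
        """Scale a sample's ETP by the ratio of the internal-standard
        plasma's reference value to its value measured on this plate."""
        return sample_etp * (standard_ref / standard_etp)

    etp_triplicate = [1510, 1465, 1492]   # nM*min, made-up values
    print(f"CV = {analytical_cv(etp_triplicate):.1f}%")
    print(normalize_to_internal_standard(1480, standard_etp=1320, standard_ref=1400))
    ```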

  4. IsoMS: automated processing of LC-MS data generated by a chemical isotope labeling metabolomics platform.

    Science.gov (United States)

    Zhou, Ruokun; Tseng, Chiao-Li; Huan, Tao; Li, Liang

    2014-05-20

    A chemical isotope labeling or isotope coded derivatization (ICD) metabolomics platform uses a chemical derivatization method to introduce a mass tag to all of the metabolites having a common functional group (e.g., amine), followed by LC-MS analysis of the labeled metabolites. To apply this platform to metabolomics studies involving quantitative analysis of different groups of samples, automated data processing is required. Herein, we report a data processing method based on the use of a mass spectral feature unique to the chemical labeling approach, i.e., any differential-isotope-labeled metabolites are detected as peak pairs with a fixed mass difference in a mass spectrum. A software tool, IsoMS, has been developed to process the raw data generated from one or multiple LC-MS runs by peak picking, peak pairing, peak-pair filtering, and peak-pair intensity ratio calculation. The same peak pairs detected from multiple samples are then aligned to produce a CSV file that contains the metabolite information and peak ratios relative to a control (e.g., a pooled sample). This file can be readily exported for further data and statistical analysis, which is illustrated in an example of comparing the metabolomes of human urine samples collected before and after drinking coffee. To demonstrate that this method is reliable for data processing, five ¹³C₂-/¹²C₂-dansyl labeled metabolite standards were analyzed by LC-MS. IsoMS was able to detect these metabolites correctly. In addition, in the analysis of a ¹³C₂-/¹²C₂-dansyl labeled human urine, IsoMS detected 2044 peak pairs, and manual inspection of these peak pairs found 90 false peak pairs, representing a false positive rate of 4.4%. IsoMS for Windows running R is freely available for noncommercial use from www.mycompoundid.org/IsoMS.
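
    The peak-pairing idea rests on a fixed, known mass shift: ¹³C is heavier than ¹²C by 1.00336 Da, so two labeled carbons shift a peak by about 2.00671/charge m/z units. The sketch below detects such pairs in a centroided spectrum; the tolerance, input format and demo values are illustrative, not IsoMS defaults.

    ```python
    def find_peak_pairs(peaks, n_labels=2, charge=1, tol=0.005):
        """Detect light/heavy differential-isotope peak pairs separated by
        a fixed mass difference (n_labels * (13C - 12C) / charge)."""
        delta = n_labels * 1.00336 / charge
        peaks = sorted(peaks)                       # (mz, intensity) tuples
        pairs = []
        for i, (mz_light, int_light) in enumerate(peaks):
            target = mz_light + delta
            for mz_heavy, int_heavy in peaks[i + 1:]:
                if mz_heavy > target + tol:
                    break                            # sorted, no match ahead
                if abs(mz_heavy - target) <= tol:
                    pairs.append({'mz_light': mz_light, 'mz_heavy': mz_heavy,
                                  'ratio': int_light / int_heavy})
        return pairs

    spectrum = [(250.1010, 9.5e4), (252.1077, 9.1e4), (300.2000, 2.0e4)]
    print(find_peak_pairs(spectrum))   # pairs the first two peaks
    ```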

  5. An automated pulse labelling method for structure-activity relationship studies with antibacterial oxazolidinones.

    Science.gov (United States)

    Eustice, D C; Brittelli, D R; Feldman, P A; Brown, L J; Borkowski, J J; Slee, A M

    1990-01-01

    The 3-aryl-2-oxooxazolidinones are a new class of synthetic antibacterial agents that potently inhibit protein synthesis. An automated pulse labelling method with [3H]-lysine was developed with Bacillus subtilis to obtain additional quantitative activity data for structure-activity relationship studies with the oxazolidinones. Inhibition constants were calculated after a Logit fit of the data to the formula: % of control = 100/(1 + e^(-B(X - A))), where B is the slope of the model, X is the natural log of the inhibitor concentration and A is the natural log of the inhibitor concentration required to inhibit protein synthesis by 50% (ln IC50). When substituents at the 5-methyl position of the heterocyclic ring (B-substituent) were NHCOCH3, OH or Cl, the correlation coefficient was 0.87 between the MIC and IC50 values (for all compounds with MICs less than or equal to 16 micrograms/ml). The D-isomers of DuP 721 (A-substituent = CH3CO) and DuP 105 (A-substituent = CH3SO) gave MICs of 128 micrograms/ml and IC50s of greater than or equal to 50 micrograms/ml for protein synthesis, showing that only the L-isomers were active. By MIC testing, oxazolidinones with the B-substituent of NHCOCH3 and the A-substituent of CH3CO, NO2, CH3S, CH3SO2 or (CH3)2CH had comparable antibacterial potency; however, pulse labelling analysis showed that compounds with an A-substituent of CH3CO or NO2 were more potent inhibitors of protein synthesis.
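
    The Logit fit in the abstract is a standard nonlinear least-squares problem. Here is a minimal sketch using scipy's curve_fit; the dose-response values are invented for the demo, and B comes out negative for an inhibitor (response falls as concentration rises).

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logit_model(log_conc, A, B):
        """% of control = 100 / (1 + exp(-B * (X - A))), X = ln(conc),
        A = ln(IC50), as given in the abstract."""
        return 100.0 / (1.0 + np.exp(-B * (log_conc - A)))

    # Illustrative pulse-labelling data: inhibitor concentration (ug/ml)
    # versus % of control [3H]-lysine incorporation (made-up values).
    conc = np.array([0.25, 0.5, 1, 2, 4, 8, 16])
    pct_control = np.array([95, 88, 70, 48, 27, 12, 5])

    (A, B), _ = curve_fit(logit_model, np.log(conc), pct_control, p0=(np.log(2), -1))
    print(f"IC50 = {np.exp(A):.2f} ug/ml, slope B = {B:.2f}")
    ```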

  6. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    Science.gov (United States)

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions about the tissue structures. The regions of the biopsy representing interstitial fibrosis are deduced through the elimination of non-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground-truth image dataset was manually prepared in consultation with an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists demonstrated a good correlation in quantification results between the automated system and the pathologists' visual evaluation. Experiments investigating the variability among pathologists also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification.
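
    The quantification recipe, segment tissue structures by colour, remove the non-fibrosis structures, divide the remainder by the tissue area, can be sketched in a few lines. The two mask functions below are crude stand-ins for the paper's segmentation rules, and the thresholds are invented.

    ```python
    import numpy as np

    def fibrosis_percentage(rgb, fibrosis_mask_fn, tissue_mask_fn):
        """Quantify interstitial fibrosis as a percentage of the total
        biopsy (tissue) area, given two boolean segmentation functions."""
        tissue = tissue_mask_fn(rgb)                  # all pixels that are tissue
        fibrosis = fibrosis_mask_fn(rgb) & tissue     # fibrotic pixels in tissue
        return 100.0 * fibrosis.sum() / max(int(tissue.sum()), 1)

    # Toy masks for a trichrome-like stain in which fibrosis is blue-dominant;
    # real thresholds would be tuned against the ground-truth dataset.
    tissue_mask = lambda im: im.mean(axis=2) < 230              # not background
    fibrosis_mask = lambda im: im[..., 2] > im[..., 0] + 20     # blue >> red

    img = np.random.randint(0, 256, (512, 512, 3)).astype(int)  # dummy image
    print(f"fibrosis area: {fibrosis_percentage(img, fibrosis_mask, tissue_mask):.1f}%")
    ```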

  7. Laboratory Measurements of Electrostatic Solitary Structures Generated by Beam Injection

    International Nuclear Information System (INIS)

    Lefebvre, Bertrand; Chen, Li-Jen; Gekelman, Walter; Pribyl, Patrick; Vincena, Stephen; Kintner, Paul; Pickett, Jolene; Chiang, Franklin; Judy, Jack

    2010-01-01

    Electrostatic solitary structures are generated by injection of a suprathermal electron beam parallel to the magnetic field in a laboratory plasma. Electric microprobes with tips smaller than the Debye length (λ_De) enabled the measurement of positive potential pulses with half-widths of 4 to 25 λ_De and velocities of 1 to 3 times the background electron thermal speed. Nonlinear wave packets of similar velocities and scales are also observed, indicating that the two descend from the same mode, which is consistent with the electrostatic whistler mode and results from an instability likely driven by field-aligned currents.

  8. All dispenser printed flexible 3D structured thermoelectric generators

    Science.gov (United States)

    Cao, Z.; Shi, J. J.; Torah, R. N.; Tudor, M. J.; Beeby, S. P.

    2015-12-01

    This work presents a vertically fabricated 3D thermoelectric generator (TEG) produced by dispenser printing on a flexible polyimide substrate. This direct-write technology involves only the printing of electrodes, thermoelectric active materials and structural material, and needs no masks to transfer the patterns onto the substrate. The dimensions of a single thermoelectric element are 2 mm × 2 mm × 0.5 mm, while the distance between adjacent cubes is 1.2 mm. The polymer structural layer supports the electrodes, which are printed to connect the top ends of the thermoelectric material, and also ensures flexibility. The advantages and limitations of dispenser-printed 3D TEGs are also evaluated in this paper. The proposed method has the potential to be a low-cost and scalable fabrication solution for TEGs.

  9. Structural looseness investigation in slow rotating permanent magnet generators

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Mijatovic, Nenad; Sweeney, Christian Walsted

    2016-01-01

    Structural looseness in electric machines is a condition influencing the alignment of the machine and thus the overall bearing health. In this work, assessment of the above-mentioned failure mode is tested on a slow rotating (running speed equal to 0.7 Hz) permanent magnet generator (PMG), while collecting vibration and current data in order to cross-reference the indications from the two monitoring techniques. It is found that electric signature analysis shows no response even when two hold-down bolts are untightened, whereas the analysis results from the vibration data exhibit superior performance. The vibration-based condition indicators with the best response are the stator slot pass frequency, which can be directly related to the cogging torque in PMGs, and the 4th electric frequency harmonic, whose amplitudes increase due to the overall lower structural damping coefficient under looseness.

  10. Si Thermoelectric Power Generator with an Unconventional Structure

    Science.gov (United States)

    Sakamoto, Tatsuya; Iida, Tsutomu; Ohno, Yota; Ishikawa, Masashi; Kogo, Yasuo; Hirayama, Naomi; Arai, Koya; Nakamura, Takashi; Nishio, Keishi; Takanashi, Yoshifumi

    2014-06-01

    We examine the mechanical stability of an unconventional Mg2Si thermoelectric generator (TEG) structure. In this structure, the angle θ between the thermoelectric (TE) chips and the heat sink is less than 90°. We examined the tolerance to an external force of various Mg2Si TEG structures using a finite-element method (FEM) with the ANSYS code. The output power of the TEGs was also measured. First, for the FEM analysis, the mechanical properties of sintered Mg2Si TE chips, such as the bending strength and Young's modulus, were measured. Then, two-dimensional (2D) TEG models with various values of θ (90°, 75°, 60°, 45°, 30°, 15°, and 0°) were constructed in ANSYS. The x and y axes were defined as being in the horizontal and vertical directions of the substrate, respectively. In the analysis, the maximum tensile stress in the chip when a constant load was applied to the TEG model in the x direction was determined. Based on the analytical results, an appropriate structure was selected and a module fabricated. For the TEG fabrication, eight TE chips, each with dimensions of 3 mm × 3 mm × 10 mm and consisting of Sb-doped n-Mg2Si prepared by a plasma-activated sintering process, were assembled such that two chips were connected in parallel, and four pairs of these were connected in series on a footprint of 46 mm × 12 mm. The measured power generation characteristics and temperature distribution with temperature differences between 873 K and 373 K are discussed.

  11. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab®-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, to analyze single and multiple fault scenarios, and to automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are presented.

  12. Computational mesh generation for vascular structures with deformable surfaces

    International Nuclear Information System (INIS)

    Putter, S. de; Laffargue, F.; Breeuwer, M.; Vosse, F.N. van de; Gerritsen, F.A.; Philips Medical Systems, Best

    2006-01-01

    Computational blood flow and vessel wall mechanics simulations for vascular structures are becoming an important research tool for patient-specific surgical planning and intervention. An important step in the modelling process for patient-specific simulations is the creation of the computational mesh based on the segmented geometry. Most known solutions either require a large amount of manual processing or lead to a substantial difference between the segmented object and the actual computational domain. We have developed a chain of algorithms that lead to a closely related implementation of image segmentation with deformable models and 3D mesh generation. The resulting processing chain is very robust and leads both to an accurate geometrical representation of the vascular structure as well as high quality computational meshes. The chain of algorithms has been tested on a wide variety of shapes. A benchmark comparison of our mesh generation application with five other available meshing applications clearly indicates that the new approach outperforms the existing methods in the majority of cases. (orig.)

  13. A generating mechanism of spiral structure in barred galaxies

    International Nuclear Information System (INIS)

    Thielheim, K.O.; Wolff, H.

    1982-01-01

    The time-dependent response of non-interacting stars to growing oval distortions in disc galaxies is calculated by following their motion numerically and Fourier-analysing their positions. Long-lived spiral density waves are found for fast-growing perturbations as well as in cases in which the perturbation evolves only slowly, compared with a characteristic internal rotation period of the disc. This mechanism of driving a spiral structure in non-self-gravitating stellar discs provides an explanation for the long-lived global spiral patterns, observed in N-body experiments showing an evolving central bar, that is not based on the self-gravitation in the disc. In conjunction with the theory of Lynden-Bell according to which angular momentum transfer in the disc leads to a slow increase of the oval distortion, this effect provides a general mechanism for the generation of spiral structure in barred galaxies. In addition to stellar discs with velocity dispersion, cold discs, with the stars initially in circular motion, which bear great similarity to gaseous discs, are investigated. The linear epicyclic approximation is used to develop an analytical description of the generating mechanism. (author)

  14. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    International Nuclear Information System (INIS)

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-01

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds, with the exception of the time required for pre-cooling and warming up the tongs.

  15. Primordial Black Holes as Generators of Cosmic Structures

    Science.gov (United States)

    Carr, Bernard; Silk, Joseph

    2018-05-01

    Primordial black holes (PBHs) could provide the dark matter in various mass windows below 10² M⊙, and those of 30 M⊙ might explain the LIGO events. PBHs much larger than this might have important consequences even if they provide only a small fraction of the dark matter. In particular, they could generate cosmological structure either individually through the 'seed' effect or collectively through the 'Poisson' effect, thereby alleviating some problems associated with the standard CDM scenario. If the PBHs all have a similar mass and make a small contribution to the dark matter, then the seed effect dominates on small scales, in which case PBHs could generate the supermassive black holes in galactic nuclei or even galaxies themselves. If they have a similar mass and provide the dark matter, the Poisson effect dominates on all scales and the first bound clouds would form earlier than in the usual scenario, with interesting observational consequences. If the PBHs have an extended mass spectrum, which is more likely, they could fulfill all three roles - providing the dark matter, binding the first bound clouds and generating galaxies. In this case, the galactic mass function naturally has the observed form, with the galaxy mass being simply related to the black hole mass. The stochastic gravitational wave background from the PBHs in this scenario would extend continuously from the LIGO frequency to the LISA frequency, offering a potential goal for future surveys.

  16. File structure and organization in the automation system for operative account of equipment and materials in JINR

    International Nuclear Information System (INIS)

    Gulyaeva, N.D.; Markova, N.F.; Nikitina, V.I.; Tentyukova, G.N.

    1975-01-01

    The structure and organization of files in the information bank for the first variant of a JINR material and technical supply subsystem are described. An automated system for the operative stock-taking of equipment, based on the SDS-6200 computer, has been developed. Information is stored on magnetic discs. The arrangement of each file depends on its purpose and the structure of its data. Access to the files can be random or sequential. The files are divided into groups: primary document files, long-term reference files, and files on items that may change as a result of administrative decisions.

  17. Ensembles generated from crystal structures of single distant homologues solve challenging molecular-replacement cases in AMPLE.

    Science.gov (United States)

    Rigden, Daniel J; Thomas, Jens M H; Simkovic, Felix; Simpkin, Adam; Winn, Martyn D; Mayans, Olga; Keegan, Ronan M

    2018-03-01

    Molecular replacement (MR) is the predominant route to solution of the phase problem in macromolecular crystallography. Although routine in many cases, it becomes more effortful and often impossible when the available experimental structures typically used as search models are only distantly homologous to the target. Nevertheless, with current powerful MR software, relatively small core structures shared between the target and known structure, of 20-40% of the overall structure for example, can succeed as search models where they can be isolated. Manual sculpting of such small structural cores is rarely attempted and is dependent on the crystallographer's expertise and understanding of the protein family in question. Automated search-model editing has previously been performed on the basis of sequence alignment, in order to eliminate, for example, side chains or loops that are not present in the target, or on the basis of structural features (e.g. solvent accessibility) or crystallographic parameters (e.g. B factors). Here, based on recent work demonstrating a correlation between evolutionary conservation and protein rigidity/packing, novel automated ways to derive edited search models from a given distant homologue over a range of sizes are presented. A variety of structure-based metrics, many readily obtained from online webservers, can be fed to the MR pipeline AMPLE to produce search models that succeed with a set of test cases where expertly manually edited comparators, further processed in diverse ways with MrBUMP, fail. Further significant performance gains result when the structure-based distance geometry method CONCOORD is used to generate ensembles from the distant homologue. To our knowledge, this is the first such approach whereby a single structure is meaningfully transformed into an ensemble for the purposes of MR. Additional cases further demonstrate the advantages of the approach. CONCOORD is freely available and computationally inexpensive, so

  18. 'Ab initio' structure solution from electron diffraction data obtained by a combination of automated diffraction tomography and precession technique

    International Nuclear Information System (INIS)

    Mugnaioli, E.; Gorelik, T.; Kolb, U.

    2009-01-01

    Using a combination of our recently developed automated diffraction tomography (ADT) module with the precession electron diffraction technique (PED), quasi-kinematical 3D diffraction data sets of an inorganic salt (BaSO₄) were collected. The lattice cell parameters and their orientation within the data sets were found automatically. The extracted intensities were used for 'ab initio' structure analysis by direct methods. The data set covered almost the complete set of possible symmetrically equivalent reflections for an orthorhombic structure. The structure solution delivered in one step all heavy atoms (Ba, S) as well as the light atoms (O). Results of structure solution using direct methods, charge flipping and maximum entropy algorithms, as well as structure refinement for three different 3D electron diffraction data sets, are presented.

  19. Developing a Graphical User Interface to Automate the Estimation and Prediction of Risk Values for Flood Protective Structures using Artificial Neural Network

    Science.gov (United States)

    Hasan, M.; Helal, A.; Gabr, M.

    2014-12-01

    In this project, we focus on providing a computer-automated platform for a better assessment of the potential failures and retrofit measures of flood-protecting earth structures, e.g., dams and levees. Such structures play an important role during extreme flooding events as well as during normal operating conditions. Furthermore, they are part of other civil infrastructure such as water storage and hydropower generation. Hence, there is a clear need for accurate evaluation of stability and functionality levels during their service lifetime so that rehabilitation and maintenance costs are effectively guided. Among condition assessment approaches based on the factor of safety, the limit states (LS) approach utilizes numerical modeling to quantify the probability of potential failures. The parameters for LS numerical modeling include i) the geometry and side slopes of the embankment, ii) loading conditions in terms of the rate of rise and duration of high water levels in the reservoir, and iii) cycles of rising and falling water levels simulating the effect of consecutive storms throughout the service life of the structure. Sample data regarding the correlations of these parameters are available from previous research studies. We have unified these criteria and extended the risk assessment in terms of loss of life through the implementation of a graphical user interface that automates the input parameters, divides the data into training and testing sets, and then feeds them into an Artificial Neural Network (ANN) tool implemented in MATLAB. The ANN modeling allows us to predict risk values of flood protective structures based on user feedback quickly and easily. In the future, we expect to fine-tune the software by adding extensive data on variations of the parameters.
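
    The train/test/predict loop the abstract describes is a standard supervised-learning workflow. The sketch below uses scikit-learn rather than MATLAB, and both the levee features and the risk target are synthetic stand-ins for the study's dataset.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    # Illustrative features: [side slope, rate of rise (m/day),
    # high-water duration (days), number of storm cycles].
    rng = np.random.default_rng(0)
    X = rng.uniform([2, 0.1, 1, 1], [4, 2.0, 30, 20], size=(200, 4))
    y = 0.3 * X[:, 1] + 0.02 * X[:, 2] + 0.01 * X[:, 3] + rng.normal(0, 0.05, 200)

    # Split into training and testing sets, then fit a small ANN.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
    model.fit(X_train, y_train)
    print(f"R^2 on held-out set: {model.score(X_test, y_test):.2f}")
    ```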

  20. Sandwich-structured hollow fiber membranes for osmotic power generation

    KAUST Repository

    Fu, Feng Jiang; Zhang, Sui; Chung, Neal Tai-Shung

    2015-01-01

    In this work, a novel sandwich-structured hollow fiber membrane has been developed via a specially designed spinneret and optimized spinning conditions. With this specially designed spinneret, the outer layer, which is the most crucial part of the sandwich-structured membrane, is maintained the same as in the traditional dual-layer membrane. The inner substrate layer is separated into two layers: (1) an ultra-thin middle layer comprising a high molecular weight polyvinylpyrrolidone (PVP) additive to enhance integration with the outer polybenzimidazole (PBI) selective layer, and (2) an inner layer to provide strong mechanical strength for the membrane. Experimental results show that a high water permeability and good mechanical strength could be achieved without the expensive post-treatment process to remove PVP, which was necessary for the dual-layer pressure retarded osmosis (PRO) membranes. By optimizing the composition, the membrane shows a maximum power density of 6.23 W/m² at a hydraulic pressure of 22.0 bar when 1 M NaCl and 10 mM NaCl are used as the draw and feed solutions, respectively. To the best of our knowledge, this is the best phase inversion hollow fiber membrane with an outer selective PBI layer for osmotic power generation. In addition, this is the first work that shows how to fabricate sandwich-structured hollow fiber membranes for various applications. © 2015 Elsevier B.V.

  2. Automated tools to be used for ascertaining structural condition in South African hard rock mines

    CSIR Research Space (South Africa)

    Teleka, R

    2011-11-01

    in the mining operations and in the efforts to improve mine safety. If mines are safer, the belief is that more skilled labour will express interest in them, unlike the current situation. The purpose of this paper is to discuss the possibility of using automated...

  3. Structure of the automated educational-methodical complex for technical disciplines

    Directory of Open Access Journals (Sweden)

    Вячеслав Михайлович Дмитриев

    2010-12-01

    The article poses and solves the problem of automating and informatizing the process of training students on the basis of the introduced system-organizational forms, which have collectively received the name of educational-methodical complexes for a discipline.

  4. AUTOMATED VOXEL MODEL FROM POINT CLOUDS FOR STRUCTURAL ANALYSIS OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    G. Bitelli

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, to support special studies regarding materials and constructive characteristics, and finally for structural analysis. Proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is a point cloud. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented, with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. In order to achieve this result, a voxel model with variable resolution is produced. Different parameters are compared, and the different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
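
    The point-cloud-to-voxel conversion at the heart of the pipeline can be pictured with a short sketch. This is a minimal illustration only: it marks the voxels touched by surveyed points, whereas the paper's procedure additionally fills the interior volume and varies the resolution; the synthetic cloud and voxel size are invented.

    ```python
    import numpy as np

    def voxelize(points, voxel_size):
        """Convert a point cloud (N x 3 array, metres) into a boolean
        occupancy grid plus the grid origin."""
        mins = points.min(axis=0)
        idx = np.floor((points - mins) / voxel_size).astype(int)
        shape = idx.max(axis=0) + 1
        grid = np.zeros(shape, dtype=bool)
        grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
        return grid, mins

    # A finer voxel_size preserves detail where the geometry demands it,
    # echoing the variable-resolution model described in the abstract.
    cloud = np.random.rand(10000, 3) * [10.0, 10.0, 25.0]   # synthetic tower
    grid, origin = voxelize(cloud, voxel_size=0.5)
    print(grid.shape, int(grid.sum()), "occupied voxels")
    ```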

  5. Low dimension structures and devices for new generation photonic technology

    International Nuclear Information System (INIS)

    Zhang, D. H.; Tang, D. Y.; Chen, T. P.; Mei, T.; Yuan, X. C.

    2014-01-01

    Low-dimensional structures and devices are the key technological building blocks for a new generation of electronic and photonic technology. Such structures and devices show novel properties and can be integrated into systems for wide application in many areas, including the medical, biological and military fields, and the advancement of science. In this invited talk, I will present the main results achieved in our competitive research program, which aims to explore the application of mesoscopic structures in light sources, light manipulation and imaging, and to integrate them into advanced systems. On the light source side, we have for the first time developed graphene mode-locked lasers, which are in the process of commercialization. Nanocrystal Si embedded in dielectrics was formed by ion implantation and subsequent annealing. Si light-emitting devices with an external quantum efficiency of about 2.9×10⁻³ % for visible emission were demonstrated at room temperature, and the color of the emitted light can be tuned electrically from violet to white by varying the injected current. In light manipulation, loss compensation of surface plasmon polaritons (SPPs) using quantum well (QW) gain media was studied theoretically and demonstrated experimentally. The SPP propagation length was effectively elongated several times through electrical pumping. Single- and double-microring resonators based on silicon-on-insulator and III-V semiconductor technologies have been successfully fabricated, and they can be used as filters and switches in photonic circuits. In imaging, both SPPs and low-dimensional structures are investigated, and resolution far beyond the diffraction limit in the visible range has been realized. The integration of components from these three aspects into complete systems is under way.

  6. Rate Structures for Customers With Onsite Generation: Practice and Innovation

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, L.; Takahashi, K.; Weston, F.; Murray, C.

    2005-12-01

    Recognizing that innovation and good public policy do not always proclaim themselves, Synapse Energy Economics and the Regulatory Assistance Project, under a contract with the California Energy Commission (CEC) and the National Renewable Energy Laboratory (NREL), undertook a survey of state policies on rates for partial-requirements customers with onsite distributed generation. The survey investigated a dozen or so states. These varied in geography and the structures of their electric industries. By reviewing regulatory proceedings, tariffs, publications, and interviews, the researchers identified a number of approaches to standby and associated rates--many promising but some that are perhaps not--that deserve policymakers' attention if they are to promote the deployment of cost-effective DG in their states.

  7. The structure of relation algebras generated by relativizations

    CERN Document Server

    Givant, Steven R

    1994-01-01

    The foundation for an algebraic theory of binary relations was laid by De Morgan, Peirce, and Schröder during the second half of the nineteenth century. Modern development of the subject as a theory of abstract algebras, called "relation algebras", was undertaken by Tarski and his students. This book aims to analyze the structure of relation algebras that are generated by relativized subalgebras. As examples of their potential for applications, the main results are used to establish representation theorems for classes of relation algebras and to prove existence and uniqueness theorems for simple closures (i.e., for minimal simple algebras containing a given family of relation algebras as relativized subalgebras). This book is well written and accessible to those who are not specialists in this area. In particular, it contains two introductory chapters on the arithmetic and the algebraic theory of relation algebras. This book is suitable for use in graduate courses on algebras of binary relations or algebraic...

  8. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid disintermediation has changed the mode of access to services on the part of consumers. ICT-enabled services have further stimulated the perception of automated service quality, with renewed dimensions and subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy - CRM. The study was conducted at the largest public sector bank of India, State Bank of India (SBI), in Kolkata, which successfully completed its decade-long operational automation in the year 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  9. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    Science.gov (United States)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to the evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind their design, the nuances pertaining to founding and tuning have largely been empirical, conveyed from one generation to the next. Post-production assessment of bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future model designs. To establish confidence in the FEA process, it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically, to collect the data for an EMA, the vibratory response of the structure is measured with accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect, due to the non-contact nature of the technique, resulting in higher-accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution, resulting in a higher-confidence EMA. This is

  10. Generating inferences from knowledge structures based on general automata

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, E C

    1983-01-01

    The author shows that the model for knowledge structures for computers based on general automata accommodates procedures for establishing inferences. Algorithms are presented which generate inferences as output of a computer when its sentence input names appropriate knowledge elements contained in an associated knowledge structure already stored in the memory of the computer. The inferences are found to have either a single graph tuple or more than one graph tuple of associated knowledge. Six algorithms pertain to a single graph tuple and a seventh pertains to more than one graph tuple of associated knowledge. A named term is either the automaton, environment, auxiliary receptor, principal receptor, auxiliary effector, or principal effector. The algorithm pertaining to more than one graph tuple requires that the input sentence names the automaton, transformation response, and environment of one of the tuples of associated knowledge in a sequence of tuples. Interaction with the computer may be either in a conversation or examination mode. The algorithms are illustrated by an example. 13 references.
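
    The lookup the algorithms perform, match the terms named in an input sentence against stored tuples of associated knowledge, can be sketched loosely as follows. This is an illustrative simplification, not Koenig's formal general-automata model; the tuple fields and data are invented.

    ```python
    def infer(knowledge, named_terms):
        """Return the stored graph tuples whose components include every
        term named in the input sentence."""
        return [t for t in knowledge
                if all(term in t.values() for term in named_terms)]

    knowledge = [
        {'automaton': 'robot', 'response': 'lift crate', 'environment': 'warehouse'},
        {'automaton': 'robot', 'response': 'weld seam', 'environment': 'assembly line'},
    ]
    # An input sentence naming the automaton and environment retrieves the
    # tuple whose transformation response completes the inference.
    print(infer(knowledge, ['robot', 'warehouse']))
    ```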

  11. Influence of computer technology on the automation of oil and gas fields and on the companies' information structures

    Energy Technology Data Exchange (ETDEWEB)

    Graf, H.G.

    1984-02-01

    Exemplified by a Direct Digital Control System, the fundamentals of process automation are demonstrated. Description of the so-called ''General-purpose computers'' and their peripherals which are used in the mineral oil industry. Explanation of individual types of information processing such as data, process and text processing. Broad outline of typical applications of EDP Systems in the mineral oil/natural gas producing industries. Further chapters deal with the incompany information structure and with economic shaping of the information system.

  12. Study of geologic-structural situation around Semipalatinsk test site test - holes using space images automated decoding method

    International Nuclear Information System (INIS)

    Gorbunova, Eh.M.; Ivanchenko, G.N.

    2004-01-01

    Performance of underground nuclear explosions (UNE) leads to irreversible changes in geological environment around the boreholes. In natural environment it was detected inhomogeneity of rock massif condition changes, which depended on characteristics of the underground nuclear explosion, anisotropy of medium and presence of faulting. Application of automated selection and statistic analysis of unstretched lineaments in high resolution space images using special software pack LESSA allows specifying the geologic-structural features of Semipalatinsk Test Site (STS), ranging selected fracture zones, outlining and analyzing post-explosion zone surface deformations. (author)

  13. FigSum: automatically generating structured text summaries for figures in biomedical literature.

    Science.gov (United States)

    Agarwal, Shashank; Yu, Hong

    2009-11-14

    Figures are frequently used in biomedical articles to support research findings; however, they are often difficult to comprehend based on their legends alone and information from the full-text articles is required to fully understand them. Previously, we found that the information associated with a single figure is distributed throughout the full-text article the figure appears in. Here, we develop and evaluate a figure summarization system - FigSum, which aggregates this scattered information to improve figure comprehension. For each figure in an article, FigSum generates a structured text summary comprising one sentence from each of the four rhetorical categories - Introduction, Methods, Results and Discussion (IMRaD). The IMRaD category of sentences is predicted by an automated machine learning classifier. Our evaluation shows that FigSum captures 53% of the sentences in the gold standard summaries annotated by biomedical scientists and achieves an average ROUGE-1 score of 0.70, which is higher than a baseline system.
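
    The assembly step, keeping the best sentence per rhetorical category, reduces to a small selection loop. The sketch below is illustrative only: the classifier that assigns IMRaD labels and the figure-relevance scores are assumed to exist upstream (FigSum trains a machine-learning classifier for the former).

    ```python
    def figure_summary(sentences, imrad_labels, relevance_scores):
        """For each IMRaD category, keep the candidate sentence with the
        highest relevance to the figure; return them in IMRaD order."""
        best = {}
        for sent, label, score in zip(sentences, imrad_labels, relevance_scores):
            if label not in best or score > best[label][1]:
                best[label] = (sent, score)
        order = ('Introduction', 'Methods', 'Results', 'Discussion')
        return [best[c][0] for c in order if c in best]

    sents = ['X regulates Y.', 'Cells were stained.',
             'Figure 2 shows a 3-fold rise.', 'This supports the model.']
    labels = ['Introduction', 'Methods', 'Results', 'Discussion']
    print(figure_summary(sents, labels, [0.4, 0.7, 0.9, 0.5]))
    ```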

  14. Zero in on Key Open Problems in Automated NMR Protein Structure Determination

    KAUST Repository

    Abbas, Ahmed

    2015-11-12

    Nuclear magnetic resonance (NMR) is one of the main approaches for protein structure determination. The biggest advantage of this approach is that it can determine the three-dimensional structure of the protein in the solution phase. Thus, the natural dynamics of the protein can be studied. However, NMR protein structure determination is an expertise intensive and time-consuming process. If the structure determination process can be accelerated or even automated by computational methods, that will significantly advance the structural biology field. Our goal in this dissertation is to propose highly efficient and error tolerant methods that can work well on real and noisy data sets of NMR. Our first contribution in this dissertation is the development of a novel peak picking method (WaVPeak). First, WaVPeak denoises the NMR spectra using wavelet smoothing. A brute force method is then used to identify all the candidate peaks. After that, the volume of each candidate peak is estimated. Finally, the peaks are sorted according to their volumes. WaVPeak is tested on the same benchmark data set that was used to test the state-of-the-art method, PICKY. WaVPeak shows significantly better performance than PICKY in terms of recall and precision. Our second contribution is to propose an automatic method to select peaks produced by peak picking methods. This automatic method is used to overcome the limitations of fixed number-based methods. Our method is based on the Benjamini-Hochberg (B-H) algorithm. The method is used with both WaVPeak and PICKY to automatically select the number of peaks to return from out of hundreds of candidate peaks. The volume (in WaVPeak) and the intensity (in PICKY) are converted into p-values. Peaks that have p-values below some certain threshold are selected. Experimental results show that the new method is better than the fixed number-based method in terms of recall. To improve precision, we tried to eliminate false peaks using
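
    The Benjamini-Hochberg selection step is standard and easy to sketch. How WaVPeak volumes and PICKY intensities are converted to p-values is described in the dissertation; here the p-values are simply taken as given, and the demo values are invented.

    ```python
    import numpy as np

    def benjamini_hochberg_select(p_values, alpha=0.05):
        """Return the indices of candidate peaks that pass the B-H step-up
        procedure at false discovery rate alpha: find the largest k with
        p_(k) <= (k/m) * alpha and keep the k smallest p-values."""
        p = np.asarray(p_values)
        order = np.argsort(p)
        m = len(p)
        passed = np.nonzero(p[order] <= (np.arange(1, m + 1) / m) * alpha)[0]
        if passed.size == 0:
            return np.array([], dtype=int)
        return order[:passed[-1] + 1]

    p_vals = [0.001, 0.008, 0.039, 0.041, 0.28, 0.61]
    print(benjamini_hochberg_select(p_vals, alpha=0.05))   # -> [0 1]
    ```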

  15. Seamless integration of dose-response screening and flow chemistry: efficient generation of structure-activity relationship data of β-secretase (BACE1) inhibitors.

    Science.gov (United States)

    Werner, Michael; Kuratli, Christoph; Martin, Rainer E; Hochstrasser, Remo; Wechsler, David; Enderle, Thilo; Alanine, Alexander I; Vogel, Horst

    2014-02-03

    Drug discovery is a multifaceted endeavor encompassing as its core element the generation of structure-activity relationship (SAR) data by repeated chemical synthesis and biological testing of tailored molecules. Herein, we report on the development of a flow-based biochemical assay and its seamless integration into a fully automated system comprising flow chemical synthesis, purification and in-line quantification of compound concentration. This novel synthesis-screening platform enables to obtain SAR data on β-secretase (BACE1) inhibitors at an unprecedented cycle time of only 1 h instead of several days. Full integration and automation of industrial processes have always led to productivity gains and cost reductions, and this work demonstrates how applying these concepts to SAR generation may lead to a more efficient drug discovery process. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    Science.gov (United States)

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-10-17

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source code program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated with both physical and virtual inputs and outputs. Virtual components, such as the graphical user interfaces (GUIs) of equipment, are handled by means of the image recognition tools provided by the Sikuli scripting language, while their physical counterparts are handled using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results were obtained in terms of analytical performance. Similarly, the advantages derived from the assistance of open-source tools could be appreciated, mainly in terms of less operator intervention and cost savings.
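
    Sikuli scripts are written in Python (Jython) and drive a GUI purely through screenshots of its widgets. The fragment below is a hedged, Sikuli-style sketch of the kind of virtual-node handling the abstract describes, not DIOS itself: wait(), click() and exists() are SikuliX builtins (so it runs inside the SikuliX environment), and the .png file names are placeholders for images the user captures from the instrument's GUI.

    ```python
    # Runs inside SikuliX, where wait/click/exists are builtins.
    wait("acquire_button.png", 30)      # block until the instrument GUI is ready
    click("acquire_button.png")         # start an acquisition
    if exists("error_dialog.png", 5):   # recover from a known pop-up
        click("dismiss_button.png")
    wait("run_complete.png", 600)       # wait for the run to finish
    click("export_csv_button.png")      # trigger data export for this cycle
    ```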

  17. A Microfluidic Device for Preparing Next Generation DNA Sequencing Libraries and for Automating Other Laboratory Protocols That Require One or More Column Chromatography Steps

    Science.gov (United States)

    Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C.; Quake, Stephen R.; Burkholder, William F.

    2013-01-01

    Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation. PMID:23894273

  18. A microfluidic device for preparing next generation DNA sequencing libraries and for automating other laboratory protocols that require one or more column chromatography steps.

    Directory of Open Access Journals (Sweden)

    Swee Jin Tan

    Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process, which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation.
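
    As a rough sketch of how a protocol made of primitive steps (mixing, incubation, column packing, loading, washing, elution) might be expressed as an ordered list in software, the Python below defines a toy controller and runs the steps in sequence. The Controller class, its methods and all parameters are invented for illustration and do not reflect the actual device or its firmware.

        # Toy protocol sequencer: primitive steps composed in any order.
        class Controller:
            """Stand-in for a benchtop controller with temperature control."""
            def mix(self): print("mixing reagents")
            def incubate(self, minutes, temp_c): print(f"incubating {minutes} min at {temp_c} C")
            def pack_column(self): print("packing column matrix")
            def load(self): print("loading sample onto column")
            def wash(self): print("washing column")
            def elute(self): print("eluting product")

        def size_selection(c):
            c.pack_column(); c.load(); c.wash(); c.elute()

        ctrl = Controller()
        protocol = [
            ("end repair", lambda c: (c.mix(), c.incubate(30, 20))),
            ("dA tailing", lambda c: (c.mix(), c.incubate(30, 37))),
            ("linker ligation", lambda c: (c.mix(), c.incubate(15, 25))),
            ("size selection", size_selection),
        ]
        for name, step in protocol:
            print(f"-- {name}")
            step(ctrl)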

  19. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Science.gov (United States)

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728, doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626, doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  20. Adequacy and adjustment of electromechanical elements of an X-radiation generator for automation of the additional filtration system

    International Nuclear Information System (INIS)

    Alves Junior, Iremar; Santos, Lucas dos; Potiens, Maria da Penha A.; Vivolo, Vitor

    2011-01-01

    This paper dimensioned the filter-wheel components and assessed the adequacy of the additional filtrations for implantation of the OTW automated system, with complete replacement of the previously used filtration by a new set of machine-made filters for the radiation qualities already implanted at the Instrument Calibration Laboratory of IPEN, Sao Paulo, Brazil. Subsequently, measurements of kerma in air were performed in each quality to be used as reference values

  1. Exploring astrobiology using in silico molecular structure generation.

    Science.gov (United States)

    Meringer, Markus; Cleaves, H James

    2017-12-28

    The origin of life is typically understood as a transition from inanimate or disorganized matter to self-organized, 'animate' matter. This transition probably took place largely in the context of organic compounds, and most approaches, to date, have focused on using the organic chemical composition of modern organisms as the main guide for understanding this process. However, it has gradually come to be appreciated that biochemistry, as we know it, occupies a minute volume of the possible organic 'chemical space'. As the majority of abiotic syntheses appear to make a large set of compounds not found in biochemistry, as well as an incomplete subset of those that are, it is possible that life began with a significantly different set of components. Chemical graph-based structure generation methods allow for exhaustive in silico enumeration of different compound types and different types of 'chemical spaces' beyond those used by biochemistry, which can be explored to help understand the types of compounds biology uses, as well as to understand the nature of abiotic synthesis, and potentially design novel types of living systems.This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Authors.

  2. Exploring astrobiology using in silico molecular structure generation

    Science.gov (United States)

    Meringer, Markus; Cleaves, H. James

    2017-11-01

    The origin of life is typically understood as a transition from inanimate or disorganized matter to self-organized, `animate' matter. This transition probably took place largely in the context of organic compounds, and most approaches, to date, have focused on using the organic chemical composition of modern organisms as the main guide for understanding this process. However, it has gradually come to be appreciated that biochemistry, as we know it, occupies a minute volume of the possible organic `chemical space'. As the majority of abiotic syntheses appear to make a large set of compounds not found in biochemistry, as well as an incomplete subset of those that are, it is possible that life began with a significantly different set of components. Chemical graph-based structure generation methods allow for exhaustive in silico enumeration of different compound types and different types of `chemical spaces' beyond those used by biochemistry, which can be explored to help understand the types of compounds biology uses, as well as to understand the nature of abiotic synthesis, and potentially design novel types of living systems. This article is part of the themed issue 'Reconceptualizing the origins of life'.
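
    As a toy illustration of the chemical graph-based structure generation described above, the Python sketch below exhaustively enumerates the constitutional isomers of C3H8O as heavy-atom graphs under valence constraints, deduplicating by element-preserving isomorphism. Real generators such as MOLGEN handle rings, multiple bonds and far larger formulas; this minimal version assumes simple bonds and implicit hydrogens.

        # Requires: pip install networkx
        from itertools import combinations
        import networkx as nx

        atoms = ["C", "C", "C", "O"]        # heavy atoms; hydrogens implicit
        valence = {"C": 4, "O": 2}
        n_hydrogens = 8
        # the hydrogen count fixes the edge count: H = sum(valences) - 2 * n_edges
        n_edges = (sum(valence[a] for a in atoms) - n_hydrogens) // 2   # -> 3

        def build(edges):
            g = nx.Graph()
            g.add_nodes_from((i, {"element": a}) for i, a in enumerate(atoms))
            g.add_edges_from(edges)
            return g

        nm = nx.algorithms.isomorphism.categorical_node_match("element", None)
        kept = []
        for edges in combinations(combinations(range(len(atoms)), 2), n_edges):
            g = build(edges)
            if not nx.is_connected(g):
                continue
            if any(g.degree(i) > valence[a] for i, a in enumerate(atoms)):
                continue
            # keep only one representative per isomorphism class
            if not any(nx.is_isomorphic(g, h, node_match=nm) for h in kept):
                kept.append(g)

        print(len(kept), "constitutional isomers of C3H8O")   # prints 3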

  3. Structural materials for the next generation nuclear reactors - an overview

    International Nuclear Information System (INIS)

    Charit, I.; Murty, K.L.

    2007-01-01

    The Generation-IV reactors need to withstand much higher temperatures, greater neutron doses, a more severely corrosive environment and, above all, a substantially longer lifetime (60 years or more). Hence, for their successful deployment, significant research into structural materials is needed. Various potential candidate materials, such as austenitic stainless steels, oxide-dispersion strengthened steels, nickel-base superalloys and refractory alloys, are considered. Both baseline and irradiated mechanical, thermophysical and chemical properties are important. However, due to the long high-temperature exposure involved in most designs, creep and corrosion/oxidation will become the major performance-limiting factors. In this study we did not cover fabricability and weldability of the candidate materials. The pros and cons of each candidate can be summarized as follows: -) for austenitic stainless steels: limited thermal creep resistance at higher temperatures and poor swelling resistance at high temperatures; -) for ferritic-martensitic steels: excellent swelling resistance at higher burnups, but limited thermal creep strength at higher temperatures and radiation embrittlement at low temperatures; -) for Ni-base alloys: excellent thermal creep resistance at higher temperatures, but radiation embrittlement even at moderate doses and helium embrittlement at higher temperatures; and -) for refractory alloys: adequate swelling resistance up to high burnups, but fabrication difficulties, low-temperature radiation hardening and poor oxidation resistance

  4. Development of Reinforcement Learning Algorithm for Automation of Slide Gate Check Structure in Canals

    Directory of Open Access Journals (Sweden)

    K. Shahverdi

    2016-02-01

    Introduction: Considering water shortages and weak management in the agricultural water sector, the performance of irrigation networks needs to be improved to make optimal use of water. Recently, intelligent management of water conveyance and delivery, together with better control technologies, has been considered for improving the performance of irrigation networks and their operation. To this end, a mathematical model of the automatic control system and the related structures, connected with hydrodynamic models, is necessary. The main objective of this research is the development of a mathematical model of the RL upstream control algorithm inside the ICSS hydrodynamic model as a subroutine. Materials and Methods: In learning systems, a set of state-action rules called classifiers compete to control the system based on what the system receives from the environment. Five main elements of RL can be identified: an agent, an environment, a policy, a reward function, and a simulator. The learner (decision-maker) is called the agent. The thing it interacts with, comprising everything outside the agent, is called the environment. The agent selects an action based on the existing state of the environment. When the agent takes an action and performs it on the environment, the environment moves to a new state and a reward is assigned accordingly. The agent and the environment continually interact to maximize the reward. The policy is the set of state-action pairs with higher rewards; it defines the agent's behavior and says which action must be taken in which state. The reward function defines the goal in an RL problem: it defines what the good and bad events are for the agent. The higher the reward, the better the action. The simulator provides environment information. In irrigation canals, the agent is the check structure, the action and state are the check-structure adjustment and the water depth, respectively, and the environment comprises the hydraulic
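
    The agent-environment loop described above can be made concrete with a minimal tabular Q-learning sketch: the state is a discretised upstream water depth, the action a gate adjustment, and the reward penalises deviation from a target depth. The canal dynamics below are invented toy dynamics, not the ICSS hydrodynamic model.

        import random

        depths = range(11)       # discretised water-depth states
        actions = (-1, 0, 1)     # close gate a step, hold, open a step
        target = 5               # desired upstream depth
        Q = {(s, a): 0.0 for s in depths for a in actions}
        alpha, gamma, epsilon = 0.1, 0.9, 0.2

        def step(depth, action):
            """Toy environment: gate movement plus a random inflow disturbance."""
            new = min(max(depth + action + random.choice((-1, 0, 1)), 0), 10)
            return new, -abs(new - target)   # closer to target => higher reward

        state = 0
        for _ in range(5000):
            # epsilon-greedy policy: explore occasionally, otherwise act greedily
            if random.random() < epsilon:
                action = random.choice(actions)
            else:
                action = max(actions, key=lambda a: Q[(state, a)])
            nxt, reward = step(state, action)
            best_next = max(Q[(nxt, a)] for a in actions)
            Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
            state = nxt

        policy = {s: max(actions, key=lambda a: Q[(s, a)]) for s in depths}
        print(policy)            # learned gate adjustment for each depth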

  5. Development of structural diagram of automated dispatch control system for power consumption at non-ferrous metallurgy enterprises

    Science.gov (United States)

    Klyuev, R. V.; Bosikov, I. I.; Madaeva, M. Z.; A-V Turluev, R.

    2018-03-01

    The structural scheme of an automated power-consumption control system for an industrial enterprise is developed in this article. At a non-ferrous metallurgy enterprise, an energy inspection and a rank analysis of the electrical energy consumption of the main processing equipment were carried out. It is established that non-ferrous metallurgy enterprises are complex process systems consisting of thousands of jointly functioning technological facilities. For the most effective assessment of enterprise power consumption, it is reasonable to use an automated system of dispatch control of power consumption (ASDCPC). The paper presents the ASDCPC structural diagram, which allows on-line control and management of the energy and process parameters of the main production units and of the enterprise as a whole. As a result of introducing ASDCPC at the non-ferrous metallurgy enterprise, consumed active power during peak load hours was reduced by 20%, specific electricity consumption by 14%, and the energy component in the production cost of hard alloys by 3%.

  6. A “dose on demand” Biomarker Generator for automated production of [18F]F− and [18F]FDG

    International Nuclear Information System (INIS)

    Awasthi, V.; Watson, J.; Gali, H.; Matlock, G.; McFarland, A.; Bailey, J.; Anzellotti, A.

    2014-01-01

    The University of Oklahoma College of Pharmacy has installed the first Biomarker Generator (BG75), comprising a self-shielded 7.5-MeV proton beam positive-ion cyclotron and an aseptic automated chemistry production and quality control module for production of [18F]F− and clinical [18F]FDG. Performance, reliability, and safety of the system for the production of “dose on demand” were tested over several months. No-carrier-added [18F]F− was obtained through the 18O(p,n)18F nuclear reaction by irradiation (20–40 min) of a >95% enriched [18O]H2O target (280 μl) with a 7.5-MeV proton beam (3.5–5.0 μA). Automated quality control tests were performed on each dose. The HPLC-based analytical methods were validated against USP methods of quality control. [18F]FDG produced by BG75 was tested in a mouse tumor model implanted with H441 human lung adenocarcinoma cells. After initial installation and optimization, [18F]F− production has been consistent since March 2011, with a maximum production of 400 to 450 mCi in a day. The average yield is 0.61 mCi/min and 0.92 mCi/min at 3.8 µA and 5 µA, respectively. The current target window has held up for over 25 weeks against >400 bombardment cycles. [18F]FDG production has been consistent since June 2012, with an average of six doses/day in automated synthesis mode (RCY ≈ 50%). The release criteria included USP-specified limits for pH, residual solvents (acetonitrile/ethanol), kryptofix, radiochemical purity/identity, and a filter integrity test. The entire automated operation generated minimal radiation exposure hazard to the operator and environment. As expected, [18F]FDG produced by BG75 was found to delineate tumor volume in a mouse model of xenograft tumor. In summary, production and quality control of “[18F]FDG dose on demand” have been accomplished in an automated and safe manner by the first Biomarker Generator. The implementation of a cGMP quality system is under way towards

  7. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mei Zhan

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a

  8. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    Science.gov (United States)

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision

  9. Automated Protein Structure Modeling with SWISS-MODEL Workspace and the Protein Model Portal

    OpenAIRE

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of appl...

  10. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    Science.gov (United States)

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.
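
    At its core, automated hit calling from ligand-observed experiments reduces to comparing peak intensities measured with and without the target and flagging compounds attenuated beyond a threshold. The sketch below shows only that final decision step, with invented intensity ratios and an assumed 0.6 cutoff; the published procedure's spectral processing and analysis are far more involved.

        # Toy hit calling: I(+target)/I(-target) per fragment (invented values)
        ratios = {"frag01": 0.95, "frag02": 0.40, "frag03": 0.78}
        CUTOFF = 0.6             # assumed attenuation threshold
        hits = sorted(name for name, r in ratios.items() if r < CUTOFF)
        print(hits)              # ['frag02']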

  11. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    Science.gov (United States)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with
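
    As a small illustration of the automated data QA/QC checks mentioned above (unexpected values, sudden step changes), the pandas sketch below flags out-of-range observations and spikes in a flux series. The column name, limits and data are hypothetical, not AmeriFlux's actual rules.

        import pandas as pd

        def qaqc_flags(df, col, lo, hi, max_jump):
            out = df.copy()
            out["range_flag"] = ~df[col].between(lo, hi)         # implausible value
            out["spike_flag"] = df[col].diff().abs() > max_jump  # sudden step change
            return out

        obs = pd.DataFrame({"FC": [1.2, 1.3, 55.0, 1.1, -40.0]})  # toy CO2 flux series
        print(qaqc_flags(obs, "FC", lo=-30, hi=30, max_jump=20))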

  12. Oxidation state specific generation of arsines from methylated arsenicals based on L-cysteine treatment in buffered media for speciation analysis by hydride generation-automated cryotrapping-gas chromatography-atomic absorption spectrometry with the multiatomizer

    Energy Technology Data Exchange (ETDEWEB)

    Matousek, Tomas [Institute of Analytical Chemistry of the ASCR, v.v.i., Videnska 1083, 14220 Prague (Czech Republic)], E-mail: matousek@biomed.cas.cz; Hernandez-Zavala, Araceli [Center for Environmental Medicine, Asthma, and Lung Biology, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-7310 (United States); Svoboda, Milan; Langrova, Lenka [Institute of Analytical Chemistry of the ASCR, v.v.i., Videnska 1083, 14220 Prague (Czech Republic); Charles University, Faculty of Science, Albertov 8, 12840 Prague 2 (Czech Republic); Adair, Blakely M. [Pharmacokinetics Branch, Experimental Toxicology Division, National Health and Environmental Effects Laboratory, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Drobna, Zuzana [Department of Nutrition, School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-7461 (United States); Thomas, David J. [Pharmacokinetics Branch, Experimental Toxicology Division, National Health and Environmental Effects Laboratory, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Styblo, Miroslav [Center for Environmental Medicine, Asthma, and Lung Biology, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-7310 (United States); Department of Nutrition, School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-7461 (United States); Dedina, Jiri [Institute of Analytical Chemistry of the ASCR, v.v.i., Videnska 1083, 14220 Prague (Czech Republic)

    2008-03-15

    An automated system for hydride generation-cryotrapping-gas chromatography-atomic absorption spectrometry with the multiatomizer is described. Arsines are preconcentrated and separated in a Chromosorb-filled U-tube. An automated cryotrapping unit, employing the nitrogen gas formed upon heating in the detection phase to displace the cooling liquid nitrogen, has been developed. The conditions for separation of arsines in the Chromosorb-filled U-tube have been optimized. A complete separation of the signals from arsine, methylarsine, dimethylarsine, and trimethylarsine has been achieved within a 60 s reading window. The limits of detection for the methylated arsenicals tested were 4 ng l⁻¹. Selective hydride generation is applied for the oxidation-state-specific speciation analysis of inorganic and methylated arsenicals. The arsines are generated either exclusively from trivalent or from both tri- and pentavalent inorganic and methylated arsenicals, depending on the presence of L-cysteine as a prereductant and/or reaction modifier. A TRIS buffer reaction medium is proposed to overcome the narrow optimum concentration range observed for the L-cysteine modified reaction in HCl medium. The system provides uniform peak area sensitivity for all As species. Consequently, calibration with a single form of As is possible. This method permits high-throughput speciation analysis of the metabolites of inorganic arsenic in relatively complex biological matrices such as cell culture systems without sample pretreatment, thus preserving the distribution of tri- and pentavalent species.

  13. Structural Flexibility of Large Direct Drive Generators for Wind Turbines

    NARCIS (Netherlands)

    Shrestha, G.

    2013-01-01

    The trend in wind energy is towards large offshore wind farms. This trend has led to the demand for high reliability and large single unit wind turbines. Different energy conversion topologies such as multiple stage geared generators, single stage geared generators and gearless (direct drive)

  14. Automating tasks in protein structure determination with the clipper python module.

    Science.gov (United States)

    McNicholas, Stuart; Croll, Tristan; Burnley, Tom; Palmer, Colin M; Hoh, Soon Wen; Jenkins, Huw T; Dodson, Eleanor; Cowtan, Kevin; Agirre, Jon

    2018-01-01

    Scripting programming languages provide the fastest means of prototyping complex functionality. Those with a syntax and grammar resembling human language also greatly enhance the maintainability of the produced source code. Furthermore, the combination of a powerful, machine-independent scripting language with binary libraries tailored for each computer architecture allows programs to break free from the tight boundaries of efficiency traditionally associated with scripts. In the present work, we describe how an efficient C++ crystallographic library such as Clipper can be wrapped, adapted and generalized for use in both crystallographic and electron cryo-microscopy applications, scripted with the Python language. We shall also place an emphasis on best practices in automation, illustrating how this can be achieved with this new Python module. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  15. Structure of automated system for tracking the formation and burial of radioactive wastes

    International Nuclear Information System (INIS)

    Kozlov, A.A.

    1993-01-01

    Intermediate- and low-activity wastes are formed when radionuclides are used in science, industry, agriculture, and medicine. A centralized system, including territorial specialized complexes and radioactive-waste burial sites (RWBS), has been created for collection, processing, and long-term storage. At this time, however, the records kept of wastes for long-term storage and the assessment of their preparation for burial do not come up to current scientific and technical requirements at most RWBSs in Russia. It is necessary, therefore, to create an automated tracking system. Earlier studies considered the design of a system for monitoring and recording the handling of sources of ionizing radiation, which are the most hazardous part of the wastes. The newly proposed automated system incorporates distinctive functional elements and makes for higher quality waste processing and efficient data exchange. It performs such functions as recording the wastes earmarked for burial, processing, and long-term storage, and where they are stored in the RWBS; ensuring an optimum cycle of collection, transportation, processing, and long-term storage of wastes; recording planned monitored levels of discharges and ejections of substances at the RWBSs; recording the wastes delivered for storage and stored at RWBSs; making calculations, including an estimate of the costs of transport, processing, and storage of wastes for each enterprise, with allowance for penalties; classifying wastes according to processing methods and determining the optimum operating regime and technological facilities; identifying the parameters of wastes delivered for processing and burial; and predicting the deliveries of wastes to RWBSs, planning the construction of new special storage facilities and containers for temporary and long-term storage of wastes

  16. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  17. Automated calibration of laser spectrometer measurements of δ18O and δ2H values in water vapour using a Dew Point Generator.

    Science.gov (United States)

    Munksgaard, Niels C; Cheesman, Alexander W; Gray-Spence, Andrew; Cernusak, Lucas A; Bird, Michael I

    2018-06-30

    Continuous measurement of stable O and H isotope compositions in water vapour requires automated calibration for remote field deployments. We developed a new low-cost device for calibration of both water vapour mole fraction and isotope composition. We coupled a commercially available dew point generator (DPG) to a laser spectrometer and developed hardware for water and air handling along with software for automated operation and data processing. We characterised isotopic fractionation in the DPG, conducted a field test and assessed the influence of critical parameters on the performance of the device. An analysis time of 1 hour was sufficient to achieve memory-free analysis of two water vapour standards and the δ18O and δ2H values were found to be independent of water vapour concentration over a range of ≈20,000-33,000 ppm. The reproducibility of the standard vapours over a 10-day period was better than 0.14 ‰ and 0.75 ‰ for δ18O and δ2H values, respectively (1σ, n = 11) prior to drift correction and calibration. The analytical accuracy was confirmed by the analysis of a third independent vapour standard. The DPG distillation process requires that isotope calibration takes account of DPG temperature, analysis time, injected water volume and air flow rate. The automated calibration system provides high accuracy and precision and is a robust, cost-effective option for long-term field measurements of water vapour isotopes. The necessary modifications to the DPG are minor and easily reversible. Copyright © 2018 John Wiley & Sons, Ltd.
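
    The calibration step can be illustrated with a simple two-point linear mapping of measured delta values onto the reference scale defined by two vapour standards. The function below is a generic sketch with invented numbers, not the authors' processing software.

        def two_point_calibration(measured, std_measured, std_true):
            """Map raw delta values onto the reference scale via a line
            through two standards (std_measured -> std_true)."""
            (m1, m2), (t1, t2) = std_measured, std_true
            slope = (t2 - t1) / (m2 - m1)
            intercept = t1 - slope * m1
            return [slope * x + intercept for x in measured]

        # delta-18O example: two standards bracketing the samples (invented)
        raw = [-12.4, -15.1, -13.8]
        print(two_point_calibration(raw, std_measured=(-10.2, -20.5),
                                    std_true=(-9.8, -20.1)))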

  18. Analytical methods for large-scale sensitivity analysis using GRESS [GRadient Enhanced Software System] and ADGEN [Automated Adjoint Generator]

    International Nuclear Information System (INIS)

    Pin, F.G.

    1988-04-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and ADGEN now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed. 7 refs., 2 figs
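
    The principle behind gradient-enhanced compilers such as GRESS, namely propagating derivatives through a code alongside its values, can be illustrated with forward-mode automatic differentiation using dual numbers. The Python toy below differentiates an invented model function; the actual tools instrument FORTRAN source rather than working this way.

        class Dual:
            """Number carrying a value and its derivative with respect to one input."""
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv
            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)
            __radd__ = __add__
            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value * other.value,
                            self.deriv * other.value + self.value * other.deriv)
            __rmul__ = __mul__

        def model(k):            # stand-in for a code whose sensitivity we want
            return k * k + 3 * k + 1

        k = Dual(2.0, 1.0)       # seed dk/dk = 1
        y = model(k)
        print(y.value, y.deriv)  # 11.0 and d/dk = 2k + 3 = 7.0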

  19. Surface structure enhanced second harmonic generation in organic nanofibers

    DEFF Research Database (Denmark)

    Fiutowski, Jacek; Maibohm, Christian; Kostiučenko, Oksana

    Second-harmonic generation upon femtosecond laser irradiation of nonlinearly optically active nanofibers grown from nonsymmetrically functionalized para-quarterphenylene (CNHP4) molecules is investigated. Following growth on mica templates, the nanofibers have been transferred onto lithography...

  20. Complex Domains Call for Automation but Automation Requires More Knowledge and Learning

    DEFF Research Database (Denmark)

    Madsen, Erik Skov; Mikkelsen, Lars Lindegaard

    studies investigate operation and automation of oil and gas production in the North Sea. Semi-structured interviews, surveys, and observations are the main methods used. The paper provides a novel conceptual framework around which management may generate discussions about productivity and the need...

  1. Deposit3D: a tool for automating structure depositions to the Protein Data Bank

    International Nuclear Information System (INIS)

    Badger, J.; Hendle, J.; Burley, S. K.; Kissinger, C. R.

    2005-01-01

    This paper describes a Python script that may be used to gather all required structure-annotation information into an mmCIF file for upload through the RCSB PDB ADIT structure-deposition interface. Almost all successful protein structure-determination projects in the public sector culminate in a structure deposition to the Protein Data Bank (PDB). In order to expedite the deposition process, Deposit3D has been developed. This command-line script calculates or gathers all the required structure-deposition information and outputs this data into a mmCIF file for subsequent upload through the RCSB PDB ADIT interface. Deposit3D might be particularly useful for structural genomics pipeline projects because it allows workers involved with various stages of a structure-determination project to pool their different categories of annotation information before starting a deposition session

  2. Deposit3D: a tool for automating structure depositions to the Protein Data Bank

    Energy Technology Data Exchange (ETDEWEB)

    Badger, J., E-mail: jbadger@active-sight.com; Hendle, J.; Burley, S. K.; Kissinger, C. R. [SGX Inc., 10505 Roselle Street, San Diego, CA 92121 (United States)

    2005-09-01

    This paper describes a Python script that may be used to gather all required structure-annotation information into an mmCIF file for upload through the RCSB PDB ADIT structure-deposition interface. Almost all successful protein structure-determination projects in the public sector culminate in a structure deposition to the Protein Data Bank (PDB). In order to expedite the deposition process, Deposit3D has been developed. This command-line script calculates or gathers all the required structure-deposition information and outputs this data into a mmCIF file for subsequent upload through the RCSB PDB ADIT interface. Deposit3D might be particularly useful for structural genomics pipeline projects because it allows workers involved with various stages of a structure-determination project to pool their different categories of annotation information before starting a deposition session.
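
    A minimal sketch of the underlying idea, gathering annotation items and writing them out as mmCIF key-value pairs, is shown below. This is not the actual Deposit3D script; the tags are a small illustrative subset of a deposition's categories and all values are invented.

        # Collected annotation data (hypothetical values)
        annotations = {
            "_struct.title": "'Hypothetical example structure'",
            "_exptl.method": "'X-RAY DIFFRACTION'",
            "_cell.length_a": "52.3",
            "_cell.length_b": "61.7",
            "_cell.length_c": "78.9",
        }

        # Write a single data block of tag-value pairs in mmCIF syntax
        with open("deposit.cif", "w") as fh:
            fh.write("data_deposit\n")
            for tag, value in annotations.items():
                fh.write(f"{tag}  {value}\n")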

  3. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    International Nuclear Information System (INIS)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M; Noo, F

    2016-01-01

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT_wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range

  4. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    Energy Technology Data Exchange (ETDEWEB)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M [UCLA School of Medicine, Los Angeles, CA (United States); Noo, F [University of Utah, Salt Lake City, UT (United States)

    2016-06-15

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT-wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range
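
    The batch-queue pattern described above can be sketched as a small driver that fans a grid of dose/reconstruction combinations out to worker processes. The executable names and flags below are placeholders for illustration, not the pipeline's real interface.

        import itertools
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        doses = [1.0, 0.5, 0.25, 0.1]            # fraction of original dose
        recons = ["wfbp_smooth", "wfbp_sharp", "iterative"]

        def run_case(case):
            dose, recon = case
            # each stage is a separate command-line tool (placeholder names)
            subprocess.run(["simulate_dose", "--fraction", str(dose),
                            "--out", f"raw_{dose}.bin"], check=True)
            subprocess.run(["reconstruct", "--method", recon,
                            "--in", f"raw_{dose}.bin",
                            "--out", f"img_{dose}_{recon}.nii"], check=True)

        if __name__ == "__main__":
            with ProcessPoolExecutor(max_workers=4) as pool:
                list(pool.map(run_case, itertools.product(doses, recons)))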

  5. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rosas-Castor, J.M. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Portugal, L.; Ferrer, L. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Guzmán-Mar, J.L.; Hernández-Ramírez, A. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Cerdà, V. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinojosa-Reyes, L., E-mail: laura.hinojosary@uanl.edu.mx [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico)

    2015-05-18

    Highlights: • A fully automated flow-based modified-BCR extraction method was developed to evaluate the extractable As of soil. • The MSFIA–HG-AFS system included a UV photo-oxidation step for the degradation of organic species. • The accuracy and precision of the proposed method were found to be satisfactory. • The analysis time can be reduced up to eightfold by using the proposed flow-based BCR method. • The labile As (F1 + F2) was <50% of total As in soil samples from As-contaminated mining zones. - Abstract: A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil, based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L⁻¹ for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013–0.800, 0.011–0.900 and 0.079–1.400 mg L⁻¹ for F1, F2, and F3, respectively. The precision of the automated MSFIA–HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L⁻¹ As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural

  6. Structures technology for a new generation of rotorcraft

    Science.gov (United States)

    Bartlett, Felton D., Jr.

    1989-01-01

    This paper presents an overview of structures research at the U. S. Army Aerostructures Directorate. The objectives of this research are to investigate, explore, and demonstrate emerging technologies that will provide lighter, safer, more survivable, and more cost-effective structures for rotorcraft in the 1990s and beyond. The emphasis of today's R&D is to contribute proven structures technology to the U. S. rotorcraft industry and Army aviation that directly impacts tomorrow's fleet readiness and mission capabilities. The primary contributor toward meeting these challenges is the development of high-strength and durable composites to minimize structural weight while maximizing cost effectiveness. Special aviation issues such as delamination of dynamic components, impact damage to thin skins, crashworthiness, and affordable manufacturing need to be resolved before the full potential of composites technology can be realized. To that end, this paper highlights research into composites structural integrity, crashworthiness, and materials applications which addresses these issues.

  7. Gas jet structure influence on high harmonic generation

    OpenAIRE

    Grant-Jacob, James; Mills, Benjamin; Butcher, Thomas J.; Chapman, Richard T.; Brocklesby, William S.; Frey, Jeremy G.

    2011-01-01

    Gas jets used as sources for high harmonic generation (HHG) have a complex three-dimensional density and velocity profile. This paper describes how the profile influences the generation of extreme-UV light. As the position of the laser focus is varied along the jet flow axis, we show that the intensity of the output radiation varies by approximately three times, with the highest flux being observed when the laser is focused into the Mach disc. The work demonstrated here will aid in the optimi...

  8. Using Automated Processes to Generate Test Items And Their Associated Solutions and Rationales to Support Formative Feedback

    Directory of Open Access Journals (Sweden)

    Mark Gierl

    2015-08-01

    Automatic item generation is the process of using item models to produce assessment tasks using computer technology. An item model is similar to a template that highlights the elements in the task that must be manipulated to produce new items. The purpose of our study is to describe an innovative method for generating large numbers of diverse and heterogeneous items, along with their solutions and associated rationales, to support formative feedback. We demonstrate the method by generating items in two diverse content areas, mathematics and nonverbal reasoning.
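
    An item model can be pictured as a template whose surface elements vary under constraints, with the solution and rationale computed alongside the stem. The sketch below instantiates one invented arithmetic template; operational item models are considerably richer than this.

        import random

        def generate_item(rng):
            a, b = rng.randint(2, 9), rng.randint(2, 9)
            stem = f"A pump fills {a} tanks per hour. How many tanks does it fill in {b} hours?"
            solution = a * b
            rationale = f"Multiply the rate ({a} tanks/hour) by the time ({b} hours)."
            return stem, solution, rationale

        rng = random.Random(42)   # fixed seed for reproducible items
        for _ in range(3):
            stem, solution, rationale = generate_item(rng)
            print(stem, "->", solution, "|", rationale)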

  9. Three-dimensional Reciprocal Structures: Morphology, Concepts, Generative Rules

    DEFF Research Database (Denmark)

    Parigi, Dario; Pugnale, Alberto

    2012-01-01

    This paper presents seven different three-dimensional structures based on the principle of structural reciprocity with superimposition joints and standardized un-notched elements. Such a typology could be regarded as intrinsically three-dimensional, because the elements sit one on top of the other, causing every configuration to develop naturally out of the plane. The structures presented here were developed and built by the students of the Master of Science in “Architectural Design” during a two-week workshop organized at Aalborg University in the fall semester of 2011.

  10. Automated assignment and 3D structure calculations using combinations of 2D homonuclear and 3D heteronuclear NOESY spectra

    International Nuclear Information System (INIS)

    Oezguen, Numan; Adamian, Larisa; Xu Yuan; Rajarathnam, Krishna; Braun, Werner

    2002-01-01

    The NOAH/DIAMOD suite uses feedback filtering and self-correcting distance geometry to generate 3D structures from unassigned NOESY spectra. In this study we determined the minimum set of experiments needed to generate a high quality structure bundle. Different combinations of 3D 15N-edited and 13C-edited HSQC-NOESY and 2D homonuclear 1H-1H NOESY spectra of the 77 amino acid protein, myeloid progenitor inhibitory factor-1 (MPIF-1), were used as input for NOAH/DIAMOD calculations. The quality of the assignments of NOESY cross peaks and the accuracy of the automatically generated 3D structures were compared to those obtained with a conventional manual procedure. Combining data from two types of experiments synergistically increased the number of peaks assigned unambiguously in both individual spectra. As a general trend for the accuracy of the structures, we observed structural variations in the backbone fold of the final structures of about 2 Å for single spectral data, of 1 Å to 1.5 Å for double spectral data, and of 0.6 Å for triple spectral data sets. The quality of the assignments and 3D structures from the optimal data using all three spectra was similar to that obtained from traditional assignment methods, with structural variations within the bundle of 0.6 Å and 1.3 Å for backbone and heavy atoms, respectively. Almost all constraints (97%) of the automatic NOESY cross peak assignments were cross compatible with the structures from the conventional manual assignment procedure, and an even larger proportion (99%) of the manually derived constraints were compatible with the automatically determined 3D structures. The two mean structures determined by both methods differed only by 1.3 Å rmsd for the backbone atoms in the well-defined regions of the protein. Thus NOAH/DIAMOD analysis of spectra from labeled proteins provides a reliable method for high-throughput analysis of genomic targets

  11. Second harmonic generation from photonic structured GaN nanowalls

    Energy Technology Data Exchange (ETDEWEB)

    Soya, Takahiro; Inose, Yuta; Kunugita, Hideyuki; Ema, Kazuhiro; Yamano, Kouji; Kikuchi, Akihiko; Kishino, Katsumi, E-mail: t-soya@sophia.ac.j [Department of Engineering and Applied Sciences, Sophia University 7-1, Kioi-cho, Chiyoda-ku, Tokyo 102-8554 (Japan)

    2009-11-15

    We observed large enhancement of reflected second harmonic generation (SHG) using the one-dimensional photonic effect in regularly arranged InGaN/GaN single-quantum-well nanowalls. Using the effect when both fundamental and SH resonate with the photonic mode, we obtained enhancement of about 40 times compared with conditions far from resonance.

  12. Information Theoretic Secret Key Generation: Structured Codes and Tree Packing

    Science.gov (United States)

    Nitinawarat, Sirin

    2010-01-01

    This dissertation deals with a multiterminal source model for secret key generation by multiple network terminals with prior and privileged access to a set of correlated signals complemented by public discussion among themselves. Emphasis is placed on a characterization of secret key capacity, i.e., the largest rate of an achievable secret key,…

  13. Thrust generation and wake structure for flow across a pitching ...

    Indian Academy of Sciences (India)

    ... condition for the generation of thrust. The vortex strength is found to be invariant of the pitching frequency. Certain differences from the reported results are noted, which may be because of difference in the airfoil shape. These results can help improve understanding of the flow behavior as the low Reynolds number range ...

  14. AUTOMATION OF OPERATIONAL CONTROL OF DATA FLOWS OF THE METALLURGICAL ENTERPRISE ORGANIZATIONAL STRUCTURE

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2006-01-01

    A new method for creating models of the operational control of an enterprise is proposed. A computer variant of the organizational structure, based on an analysis of the loading dynamics of the control units, is presented and illustrated using the example of one of the organizational structures of the Belarusian metallurgical works.

  15. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system's functionality. The list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid; in a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and of the ability to make autonomous recovery should faults occur.
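
    The graph technique behind such structural analysis can be sketched in a few lines: unknown variables and functional relations form a bipartite graph, and relations left over by a maximum matching are redundant, providing the analytical redundancy used for fault diagnosis. The miniature system below is invented for illustration and does not use SaTool itself.

        # Requires: pip install networkx
        import networkx as nx

        # four functional relations over three unknown variables (toy system)
        relations = {"f1": ["x1", "x2"], "f2": ["x2", "x3"],
                     "f3": ["x1", "x3"], "f4": ["x3"]}
        G = nx.Graph((f, v) for f, vs in relations.items() for v in vs)

        matching = nx.bipartite.maximum_matching(G, top_nodes=relations.keys())
        matched = {f for f in relations if f in matching}
        redundant = set(relations) - matched
        print("redundant relations usable as fault-detection residuals:", redundant)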

  16. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of operators have called for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals are discussed. ► Generator eluate post-processing and the evolution to cassette-based systems are the major issues in automation. ► The impact of regulations on technological development is also considered

  17. Ferroelectric nanoparticle-embedded sponge structure triboelectric generators

    Science.gov (United States)

    Park, Daehoon; Shin, Sung-Ho; Yoon, Ick-Jae; Nah, Junghyo

    2018-05-01

    We report high-performance triboelectric nanogenerators (TENGs) employing ferroelectric nanoparticles (NPs) embedded in a sponge structure. The ferroelectric BaTiO3 NPs inside the sponge structure play an important role in increasing surface charge density by polarized spontaneous dipoles, enabling the packaging of TENGs even with a minimal separation gap. Since the friction surfaces are encapsulated in the packaged device structure, it suffers negligible performance degradation even at a high relative humidity of 80%. The TENGs also demonstrated excellent mechanical durability due to the elasticity and flexibility of the sponge structure. Consequently, the TENGs can reliably harvest energy even under harsh conditions. The approach introduced here is a simple, effective, and reliable way to fabricate compact and packaged TENGs for potential applications in wearable energy-harvesting devices.

  18. Automated detection and labeling of high-density EEG electrodes from structural MR images

    Science.gov (United States)

    Marino, Marco; Liu, Quanying; Brem, Silvia; Wenderoth, Nicole; Mantini, Dante

    2016-10-01

    Objective. Accurate knowledge about the positions of electrodes in electroencephalography (EEG) is very important for precise source localizations. Direct detection of electrodes from magnetic resonance (MR) images is particularly interesting, as it is possible to avoid errors of co-registration between electrode and head coordinate systems. In this study, we propose an automated MR-based method for electrode detection and labeling, particularly tailored to high-density montages. Approach. Anatomical MR images were processed to create an electrode-enhanced image in individual space. Image processing included intensity non-uniformity correction, background noise and goggles artifact removal. Next, we defined a search volume around the head where electrode positions were detected. Electrodes were identified as local maxima in the search volume and registered to the Montreal Neurological Institute standard space using an affine transformation. This allowed the matching of the detected points with the specific EEG montage template, as well as their labeling. Matching and labeling were performed by the coherent point drift method. Our method was assessed on 8 MR images collected in subjects wearing a 256-channel EEG net, using the displacement with respect to manually selected electrodes as the performance metric. Main results. The average displacement achieved by our method was significantly lower compared to alternative techniques, such as the photogrammetry technique. The maximum displacement was lower than 1 cm for more than 99% of the electrodes, which is typically considered an acceptable upper limit for errors in electrode positioning. Our method showed robustness and reliability, even in suboptimal conditions, such as in the case of net rotation, imprecisely gathered wires, electrode detachment from the head, and MR image ghosting. Significance. We showed that our method provides objective, repeatable and precise estimates of EEG electrode coordinates. We hope our work
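
    To make the detection step concrete, here is a minimal sketch, under assumptions not stated in the record (a pre-computed electrode-enhanced volume and a boolean search mask), of identifying electrode candidates as local intensity maxima with SciPy.

        import numpy as np
        from scipy.ndimage import maximum_filter

        def detect_electrode_candidates(volume, search_mask, size=5, threshold=0.5):
            """Return voxel coordinates of local maxima inside the search volume.

            volume      -- electrode-enhanced MR image (3D numpy array)
            search_mask -- boolean 3D array delimiting the shell around the head
            size        -- neighbourhood (voxels) defining a 'local' maximum
            threshold   -- minimum normalized intensity for a candidate
            """
            v = (volume - volume.min()) / (np.ptp(volume) + 1e-12)  # normalize to [0, 1]
            is_peak = (v == maximum_filter(v, size=size)) & search_mask & (v > threshold)
            return np.argwhere(is_peak)

        # Hypothetical usage with a synthetic volume:
        vol = np.random.rand(64, 64, 64)
        mask = np.ones_like(vol, dtype=bool)
        print(detect_electrode_candidates(vol, mask)[:5])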

  19. Automated tube voltage selection for radiation dose and contrast medium reduction at coronary CT angiography using 3rd generation dual-source CT

    Energy Technology Data Exchange (ETDEWEB)

    Mangold, Stefanie [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Eberhard-Karls University Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany); Wichmann, Julian L. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University Hospital Frankfurt, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany); Schoepf, U.J. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Poole, Zachary B.; Varga-Szemes, Akos; De Cecco, Carlo N. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Canstein, Christian [Siemens Medical Solutions, Malvern, PA (United States); Caruso, Damiano [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome ' ' Sapienza' ' , Department of Radiological Sciences, Oncology and Pathology, Rome (Italy); Bamberg, Fabian; Nikolaou, Konstantin [Eberhard-Karls University Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany)

    2016-10-15

    To investigate the relationship between automated tube voltage selection (ATVS) and body mass index (BMI) and its effect on image quality and radiation dose of coronary CT angiography (CCTA). We evaluated 272 patients who underwent CCTA with 3rd generation dual-source CT (DSCT). Prospectively ECG-triggered spiral acquisition was performed with automated tube current selection and advanced iterative reconstruction. Tube voltages were selected by ATVS (70-120 kV). BMI, effective dose (ED), and vascular attenuation in the coronary arteries were recorded. Signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. Five-point scales were used for subjective image quality analysis. Image quality was rated good to excellent in 98.9 % of examinations without significant differences for proximal and distal attenuation (all p ≥.0516), whereas image noise was rated significantly higher at 70 kV compared to ≥100 kV (all p <.0266). However, no significant differences were observed in SNR or CNR at 70-120 kV (all p ≥.0829). Mean ED at 70-120 kV was 1.5 ± 1.2 mSv, 2.4 ± 1.5 mSv, 3.6 ± 2.7 mSv, 5.9 ± 4.0 mSv, 7.9 ± 4.2 mSv, and 10.7 ± 4.1 mSv, respectively (all p ≤.0414). Correlation analysis showed a moderate association between tube voltage and BMI (r =.639). ATVS allows individual tube voltage adaptation for CCTA performed with 3rd generation DSCT, resulting in significantly decreased radiation exposure while maintaining image quality. (orig.)

  20. A hybrid computational-experimental approach for automated crystal structure solution

    Science.gov (United States)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  1. Semi-automated measurement of anatomical structures using statistical and morphological priors

    Science.gov (United States)

    Ashton, Edward A.; Du, Tong

    2004-05-01

    Rapid, accurate and reproducible delineation and measurement of arbitrary anatomical structures in medical images is a widely held goal, with important applications in both clinical diagnostics and, perhaps more significantly, pharmaceutical trial evaluation. This process requires the ability first to localize a structure within the body, and then to find a best approximation of the structure's boundaries within a given scan. Structures that are tortuous and small in cross section, such as the hippocampus in the brain or the abdominal aorta, present a particular challenge. Their apparent shape and position can change significantly from slice to slice, and accurate prior shape models for such structures are often difficult to form. In this work, we have developed a system that makes use of both a user-defined shape model and a statistical maximum likelihood classifier to identify and measure structures of this sort in MRI and CT images. Experiments show that this system can reduce analysis time by 75% or more with respect to manual tracing with no loss of precision or accuracy.

  2. Generation of human and structural capital: lessons from knowledge management

    OpenAIRE

    Agndal, Henrik; Nilsson, Ulf

    2006-01-01

    Interorganizational and social relationships can be seen as part of the intellectual capital of a firm. Existing frameworks of intellectual capital, however, fail to address how relationships should be managed to generate more intellectual capital. Drawing on the interaction approach and the fields of intellectual capital and knowledge management, this paper develops a framework for managing relationships. The framework is illustrated with a case study. It is also noted that firms can improve...

  3. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard; Gao, Xin; Li, Ming

    2011-01-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both N-labeled and C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.

  4. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard

    2011-03-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both N-labeled and C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.
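
    The two records above frame resonance assignment as an optimization problem. As a loose illustration only (the paper's integer programs are not reproduced here), the sketch below casts a toy spin-system-to-residue assignment as a linear assignment problem solved with the Hungarian algorithm; the score matrix and names are invented.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        # Toy compatibility scores: rows = spin systems, cols = residues.
        # Higher score = better match of predicted amino acid type / chemical shifts.
        scores = np.array([
            [0.9, 0.1, 0.3],
            [0.2, 0.8, 0.4],
            [0.3, 0.3, 0.7],
        ])

        # The Hungarian algorithm minimizes cost, so negate the scores.
        rows, cols = linear_sum_assignment(-scores)
        for spin, res in zip(rows, cols):
            print(f"spin system {spin} -> residue {res} (score {scores[spin, res]:.1f})")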

  5. An Automated Fluid-Structural Interaction Analysis of a Large Segmented Solid Rocket Motor

    National Research Council Canada - National Science Library

    Rex, Brian

    2003-01-01

    ... couple the ABAQUS structural solver with FLUENT, the computational fluid dynamics (CFD) solver. This iterative process automatically used the results of one solver as the inputs to the other solver until convergence to a solution was obtained...

  6. An Automated Fluid-Structural Interaction Analysis of a Large Segmented Solid Rocket Motor

    National Research Council Canada - National Science Library

    Rex, Brian

    2003-01-01

    .... The fluid-structural interaction (FSI) analysis of the ETM-3 motor used PYTHON, a powerful programming language, and FEM BUILDER, a pre- and post processor developed by ATK Thiokol Propulsion under contract to the AFRL, to automatically...
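
    Both records describe an automated loop that feeds one solver's results to the other until convergence. A minimal sketch of such a fixed-point coupling driver follows; the two solver functions are invented placeholders standing in for the structural and CFD codes, not actual ABAQUS or FLUENT APIs.

        def fluid_solver(displacement):
            """Placeholder for a CFD run: returns a pressure load on the interface."""
            return 1.0 / (1.0 + displacement)          # invented response

        def structural_solver(pressure):
            """Placeholder for a structural run: returns interface displacement."""
            return 0.5 * pressure                       # invented response

        def couple(tol=1e-8, max_iter=100, relax=0.7):
            d = 0.0                                     # initial interface displacement
            for i in range(max_iter):
                p = fluid_solver(d)                     # CFD with current geometry
                d_new = structural_solver(p)            # structure under new load
                if abs(d_new - d) < tol:
                    return d_new, i + 1
                d = relax * d_new + (1 - relax) * d     # under-relaxation for stability
            raise RuntimeError("FSI coupling did not converge")

        print(couple())

    The under-relaxation factor is the usual safeguard against the oscillation or divergence that naive fixed-point FSI iterations can exhibit.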

  7. Structural and leakage integrity assessment of WWER steam generator tubes

    International Nuclear Information System (INIS)

    Splichal, K.; Otruba, J.; Keilova, E.; Krhounek, V.; Turek, J.

    1996-01-01

    The leakage and plugging limits were derived for WWER steam generators based on leak and burst tests using tubes with axial part-through and through-wall defects. The following conclusions were arrived at: (i) The permissible primary-to-secondary leak rate with respect to the permissible through-wall defect size of WWER-440 and WWER-1000 steam generator tubes is 8 l/h. (ii) The primary-to-secondary leak rate is reduced by the blocking of the tube cracks by corrosion product particles and other substances. (iii) The rate of crack penetration through the tube wall is higher than the crack widening. (iv) The validity of the criterion of instability for tubes with through-wall cracks was confirmed experimentally. For the WWER-440 and WWER-1000 steam generators, the critical size of axial through-wall cracks, for the threshold primary-to-secondary pressure difference, is 13.8 and 12.0 mm, respectively. (v) The calculated leakage for the rupture of one tube and for the assumed extreme defects is two orders and one order of magnitude, respectively, higher than the proposed primary water leakage limit of 8 l/h. (vi) The experiments gave evidence that the use of the permissible thinning limit of 80% for the heat exchange tube plugging does not bring about uncontrollable leakage or unstable crack growth. This is consistent with experience gained at WWER-440 type nuclear power plants. 4 tabs., 5 figs., 9 refs

  8. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure control, UV absorbance measurements and automated data analysis. As little as 15 μl of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis...

  9. Design automation of load-bearing arched structures of roofs of tall buildings

    Science.gov (United States)

    Kulikov, Vladimir

    2018-03-01

    The article considers aspects of the possible use of arched roofs in the construction of skyscrapers. Tall buildings experience large loads from various environmental factors, and skyscrapers are subject to varied and complex types of deformation of their structural elements. The paper discusses issues related to the aerodynamics of the various structural elements of tall buildings, together with a technique for solving the governing systems of state equations by Simpson's method. The article also describes the optimization of the geometric parameters of the load-bearing elements of the arched roofs of skyscrapers.

  10. Generation of tripolar vortical structures on the beta plane

    DEFF Research Database (Denmark)

    Hesthaven, J.S.; Lynov, Jens-Peter; Juul Rasmussen, J.

    1993-01-01

    and oscillation of the tripolar structure may lead to increased mixing near the boundary of the vortex core. The translation of strong monopoles is found to be well described, even for times longer than the linear Rossby wave period, by a recent approximate theory for the evolution of an azimuthal perturbation...

  11. A new generation of computational tools in structural engineering

    International Nuclear Information System (INIS)

    Ebersolt, L.; Verpeaux, P.; Farvacque, M.; Combescure, A.

    1985-11-01

    The new concepts of operators and typed objects have considerably changed the programming capabilities for the finite element method in structural engineering. Their greater versatility offers many advantages, especially the user's ability to modify or improve an algorithm just by changing the dataset, without any help from the development team [fr

  12. Detailed requirements for a next generation nuclear data structure.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-07-05

    This document attempts to compile the requirements for the top levels of a hierarchical arrangement of nuclear data, such as found in the ENDF format. This set of requirements will be used to guide the development of a new data structure to replace the legacy ENDF format.

  13. Form-finding of shell structures generated from physical models

    NARCIS (Netherlands)

    Li, Q.; Su, Y; Wu, Y; Borgart, A.; Rots, J.G.

    2017-01-01

    Vector form intrinsic finite element is a recently developed and promising numerical method for the analysis of complicated structural behavior. Taking the cable-link element as an example, the framework of the vector form intrinsic finite element is explained first. Based on this, a constant strain

  14. WWER steam generator tube structural and leakage integrity

    International Nuclear Information System (INIS)

    Splichal, K.; Krhounek, Vl.; Otruba, J.; Ruscak, M.

    1998-01-01

    The integrity of heat exchange tubes may influence the lifetime of WWER steam generators and appears to be an important criterion for the evaluation of their safety and operational reliability. The basic requirements are to assure a very low probability of radioactive water leakage, preventing unstable crack growth and sudden tube rupture. These requirements led to the development of permissible limits for primary-to-secondary leak evaluation and heat exchange tube plugging. Stress corrosion cracking and pitting are the main corrosion damage of WWER heat exchange tubes and are initiated from the outer surface. Both the initiation and crack growth cause thinning of the tube wall and lead to part-thickness cracks and through-wall cracks, oriented preferentially in the axial direction. The paper presents the leakage and plugging limits for WWER steam generators, which have been determined from leak tests and burst tests. Tubes with axial part-through and through-wall defects have been used. The permissible value of the primary-to-secondary leak rate was evaluated with respect to the permissible axial through-wall defect size of WWER 440 and 1000 steam generator tubes. Blocking of the tube cracks by corrosion product particles and other compounds reduces the primary-to-secondary leak rate. The plugging limits involve the following factors: the permissible tube wall thickness, which determines further operation of the tubes with defects and assures their integrity under operating conditions, and the permissible size of a through-wall crack, which is sufficiently stable under normal and accident conditions in relation to the critical crack length. For the evaluation of burst tests of heat exchange tubes with longitudinal through-wall defects, the instability criterion has been used and the dependence of the normalised burst pressure on the normalised length of an axial through-wall defect has been determined. The validity of the criterion of instability for WWER tubes with through

  15. Application of X-ray digital radiography to online automated inspection of interior assembly structures of complex products

    International Nuclear Information System (INIS)

    Han Yueping; Han Yan; Li Ruihong; Wang Liming

    2009-01-01

    The paper proposes an application of X-ray digital radiography to online automated inspection and recognition of the interior assembly structures of complex products by means of the multiple views techniques. First, a vertical hybrid projection function (VHPF) is proposed as the recognition feature of a two-dimensional image. VHPF combines an integral projection function and a standard deviation function so that it can reflect the mean and the variance of the pixels in the vertical direction in an image. Secondly, by considering the different importance grades of objects inside the product and the independence of these objects along the circumference, the paper presents a hierarchical recognition method and uses a neural network system to speed up the computation process with parallel operations. Thirdly, using the whole-orientation features of one standard swatch and by extracting its maximal system of linear independence as the feature basis, the issue of blind areas for recognition is resolved. Based on this approach, the first domestic X-ray multi-view digital detection system has been developed and applied to the online detection of objects containing complicated assembly structures.
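
    The VHPF described above combines a column-wise integral projection with a column-wise standard deviation. A minimal NumPy sketch, under the assumption (not spelled out in the record) that the two normalized terms are combined by a weighted sum, is:

        import numpy as np

        def vhpf(image, alpha=0.5):
            """Vertical hybrid projection feature of a 2D grayscale image.

            Combines the mean (integral projection) and standard deviation of
            each pixel column; alpha weights the two terms. The exact weighting
            used in the cited system is not given, so this is illustrative only.
            """
            col_mean = image.mean(axis=0)              # integral projection per column
            col_std = image.std(axis=0)                # spread of pixels per column

            def norm(x):
                rng = np.ptp(x)
                return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

            return alpha * norm(col_mean) + (1 - alpha) * norm(col_std)

        # Example: the feature vector has one entry per image column.
        img = np.random.rand(128, 256)
        print(vhpf(img).shape)   # (256,)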

  16. Automated identification of RNA 3D modules with discriminative power in RNA structural alignments

    DEFF Research Database (Denmark)

    Theis, Corinna; Höner zu Siederdissen, Christian; Hofacker, Ivo L.

    2013-01-01

    Recent progress in predicting RNA structure is moving towards filling the 'gap' in 2D RNA structure prediction where, for example, predicted internal loops often form non-canonical base pairs. This is increasingly recognized with the steady increase of known RNA 3D modules. There is a general interest in identifying such modules where they are supported by comparative evidence. Subsequently, the modules, initially represented by a graph, are turned into models for the RMDetect program, which allows testing of their discriminative power using real and randomized Rfam alignments. An initial extraction of 22,495 3D modules from all PDB files results in 977 internal loop

  17. Automated method for structural segmentation of nasal airways based on cone beam computed tomography

    Science.gov (United States)

    Tymkovych, Maksym Yu.; Avrunin, Oleg G.; Paliy, Victor G.; Filzow, Maksim; Gryshkov, Oleksandr; Glasmacher, Birgit; Omiotek, Zbigniew; Dzierżak, Róża; Smailova, Saule; Kozbekova, Ainur

    2017-08-01

    The work is dedicated to the problem of segmentation of the human nasal airways using Cone Beam Computed Tomography. We propose a specialized approach to structured segmentation of the nasal airways that uses spatial information and symmetrization of the structures. The proposed stages can be used for constructing a virtual three-dimensional model of the nasal airways and for producing full-scale personalized atlases. During the research we built a virtual model of the nasal airways, which can be used for the construction of specialized medical atlases and for aerodynamics research.

  18. Spray structure as generated under homogeneous flash boiling nucleation regime

    International Nuclear Information System (INIS)

    Levy, M.; Levy, Y.; Sher, E.

    2014-01-01

    We show the effect of the initial pressure and temperature on the spatial distribution of droplet sizes and their velocity profile inside a spray cloud that is generated by a flash boiling mechanism under the homogeneous nucleation regime. We used TSI's Phase Doppler Particle Analyzer (PDPA) to characterize the spray. We conclude that the homogeneous nucleation process is strongly affected by the initial liquid temperature, while the initial pressure has only a minor effect. The spray shape is not affected by temperature or pressure under the homogeneous nucleation regime; the only visible effect is in the spray opacity. Finally, homogeneous nucleation may be easily achieved by using a simple atomizer construction, and thus is potentially suitable for fuel injection systems in combustors and engines. - Highlights: • We study the characteristics of a spray that is generated by a flash boiling process. • In this study, the flash boiling process occurs under the homogeneous nucleation regime. • We used a Phase Doppler Particle Analyzer (PDPA) to characterize the spray. • The SMD has been found to be strongly affected by the initial liquid temperature. • Homogeneous nucleation may be easily achieved by using a simple atomizer unit

  19. Generating XML schemas for DICOM structured reporting templates.

    Science.gov (United States)

    Zhao, Luyin; Lee, Kwok Pun; Hu, Jingkun

    2005-01-01

    In this paper, the authors describe a methodology to programmatically transform structured reporting (SR) templates defined by the Digital Imaging and Communications in Medicine (DICOM) standard into an XML schema representation. Such schemas can be used in the creation and validation of XML-encoded SR documents that use templates. Templates are a means to put additional constraints on an SR document to promote common formats for specific reporting applications or domains. As the use of templates becomes more widespread in the production of SR documents, it is important to ensure the validity of such documents. The work described in this paper is an extension of the authors' previous work on XML schema representation for DICOM SR; therefore, this paper inherits and partially modifies the structure defined in the earlier work.
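
    As a rough illustration of mapping a template definition to an XML schema, the sketch below emits a minimal XSD fragment from a dictionary describing invented template rows; it does not reproduce the authors' actual DICOM SR mapping rules.

        import xml.etree.ElementTree as ET

        XS = "http://www.w3.org/2001/XMLSchema"
        ET.register_namespace("xs", XS)

        def template_to_schema(name, rows):
            """Emit a trivial XSD: one element per template row, typed as string."""
            schema = ET.Element(f"{{{XS}}}schema")
            elem = ET.SubElement(schema, f"{{{XS}}}element", attrib={"name": name})
            ctype = ET.SubElement(elem, f"{{{XS}}}complexType")
            seq = ET.SubElement(ctype, f"{{{XS}}}sequence")
            for row in rows:
                ET.SubElement(seq, f"{{{XS}}}element", attrib={
                    "name": row["name"],
                    "type": "xs:string",
                    "minOccurs": "1" if row.get("required") else "0",
                })
            return ET.tostring(schema, encoding="unicode")

        # Invented template rows for illustration:
        rows = [{"name": "Finding", "required": True},
                {"name": "Measurement", "required": False}]
        print(template_to_schema("ChestCTReport", rows))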

  20. Residual Generation for the Ship Benchmark Using Structural Approach

    DEFF Research Database (Denmark)

    Cocquempot, V.; Izadi-Zamanabadi, Roozbeh; Staroswiecki, M

    1998-01-01

    The prime objective of fault-tolerant control (FTC) systems is to handle faults and discrepancies using appropriate accommodation policies. The issue of obtaining information about the various parameters and signals which have to be monitored for fault detection purposes becomes a rigorous task with the growing number of subsystems. The structural approach presented in this paper constitutes a general framework for providing such information when the system becomes complex. The methodology of the approach is illustrated on the ship propulsion benchmark.

  1. MATHEMATICAL SIMULATION AND AUTOMATION OF PROCESS ENGINEERING FOR WELDED STRUCTURE PRODUCTION

    Directory of Open Access Journals (Sweden)

    P. V. Zankovets

    2017-01-01

    Full Text Available Models and methods for the presentation of the database and knowledge base have been developed on the basis of the composition and structure of the data flow in the technological process of welding. The information in the data and knowledge base is presented as a multilevel hierarchical structure and is organized according to its functionality in the form of separate files, each containing a great number of tables. Using mathematical simulation and information technologies, an expert system has been developed for taking decisions in the design and process engineering of welded structure production. The system makes it possible to carry out technically substantiated selection of welded and welding materials, types of welded joints, welding methods, and welding parameters and modes. The developed system improves the quality of the accepted design decisions by reducing manual labour costs for work with normative reference documentation and for the analysis and evaluation of dozens of possible alternatives. The system also reduces the labour input of testing structures for technological effectiveness, ensures reduced material consumption of welded structures, and guarantees faultless formation of welded connections at this stage.

  2. Low-Cost Impact Detection and Location for Automated Inspections of 3D Metallic Based Structures

    Directory of Open Access Journals (Sweden)

    Carlos Morón

    2015-05-01

    Full Text Available This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material allows certain details of the impact to be determined automatically by measuring the time delays of acoustic wave propagation throughout the 3D structure. The location of strategic piezoelectric sensors on the structure and an electronic-computerized system allow us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection with the task of inspecting the point or zone at which the impact occurs. What is more, the proposed method can easily be integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach.
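
    The localization principle, time-difference-of-arrival across several piezoelectric sensors on a medium with a known acoustic wave speed, can be sketched as a small least-squares problem. The sensor layout, wave speed and solver below are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.optimize import least_squares

        # Sensor positions on the structure (m) and assumed wave speed (m/s).
        sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        c = 5000.0  # illustrative speed of sound in metal

        def residuals(p, toa):
            """Measured arrival times minus times predicted from candidate p.

            p = (x, y, t0): impact position and unknown impact instant.
            """
            x, y, t0 = p
            dist = np.linalg.norm(sensors - np.array([x, y]), axis=1)
            return (t0 + dist / c) - toa

        # Synthetic measurement: impact at (0.3, 0.7) at t0 = 0.
        true_pos = np.array([0.3, 0.7])
        toa = np.linalg.norm(sensors - true_pos, axis=1) / c

        sol = least_squares(residuals, x0=[0.5, 0.5, 0.0], args=(toa,))
        print("estimated impact position:", sol.x[:2])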

  3. Blind testing of routine, fully automated determination of protein structures from NMR data.

    NARCIS (Netherlands)

    Rosato, A.; Aramini, J.M.; Arrowsmith, C.; Bagaria, A.; Baker, D.; Cavalli, A.; Doreleijers, J.; Eletsky, A.; Giachetti, A.; Guerry, P.; Gutmanas, A.; Guntert, P.; He, Y.; Herrmann, T.; Huang, Y.J.; Jaravine, V.; Jonker, H.R.; Kennedy, M.A.; Lange, O.F.; Liu, G.; Malliavin, T.E.; Mani, R.; Mao, B.; Montelione, G.T.; Nilges, M.; Rossi, P.; Schot, G. van der; Schwalbe, H.; Szyperski, T.A.; Vendruscolo, M.; Vernon, R.; Vranken, W.F.; Vries, S.D. de; Vuister, G.W.; Wu, B.; Yang, Y.; Bonvin, A.M.

    2012-01-01

    The protocols currently used for protein structure determination by nuclear magnetic resonance (NMR) depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by

  4. Blind Testing of Routine, Fully Automated Determination of Protein Structures from NMR Data

    NARCIS (Netherlands)

    Rosato, A.; Aramini, J.M.; van der Schot, G.; de Vries, S.J.|info:eu-repo/dai/nl/304837717; Bonvin, A.M.J.J.|info:eu-repo/dai/nl/113691238

    2012-01-01

    The protocols currently used for protein structure determination by nuclear magnetic resonance (NMR) depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by

  5. INTEGRATED MODEL OF AUTOMATED PROCESS LIFECYCLE MANAGEMENT TRAINING THROUGH STRUCTURIZATION CONTENT OF HIGH SCHOOL AND ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Gennady G. Kulikov

    2015-01-01

    Full Text Available This article discusses, from a modern point of view, the development of methods for forming the structure of the lifecycle management process for specialist training, carried out by a university in conjunction with an industrial enterprise on the basis of a comprehensive content base of the department. The possibility of using IT to improve the efficiency of educational processes is also considered.

  6. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines

    NARCIS (Netherlands)

    W.Y. Leung; T. Marschall (Tobias); Y. Paudel; L. Falquet; H. Mei (Hailiang); A. Schönhuth (Alexander); T.Y. Maoz

    2015-01-01

    Background: Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans

  7. Using Structure-Based Organic Chemistry Online Tutorials with Automated Correction for Student Practice and Review

    Science.gov (United States)

    O'Sullivan, Timothy P.; Hargaden, Gráinne C.

    2014-01-01

    This article describes the development and implementation of an open-access organic chemistry question bank for online tutorials and assessments at University College Cork and Dublin Institute of Technology. SOCOT (structure-based organic chemistry online tutorials) may be used to supplement traditional small-group tutorials, thereby allowing…

  8. Generation of magnetic structures on the solar photosphere

    Energy Technology Data Exchange (ETDEWEB)

    Gangadhara, R. T.; Krishan, V. [Indian Institute of Astrophysics, Bangalore-560034 (India); Bhowmick, A. K.; Chitre, S. M., E-mail: ganga@iiap.res.in [Centre for Excellence in Basic Sciences, University of Mumbai, Mumbai-400098 (India)

    2014-06-20

    The lower solar atmosphere is a partially ionized plasma consisting of electrons, ions, and neutral atoms. In this, which is essentially a three-fluid system, the Hall effect arises from the treatment of the electrons and ions as two separate fluids and the ambipolar diffusion arises from the inclusion of neutrals as the third fluid. The Hall effect and ambipolar diffusion have been shown to be operational in a region beginning from near the photosphere up to the chromosphere. In a partially ionized plasma, the magnetic induction is subjected to ambipolar diffusion and the Hall drift in addition to the usual resistive dissipation. These nonlinear effects create sharp magnetic structures which then submit themselves to various relaxation mechanisms. A first-principles derivation of these effects in a three-fluid system and an analytic solution to the magnetic induction equation in a stationary state are presented, which in the general case includes the Hall effect, ambipolar diffusion, and ohmic dissipation. The temporal evolution of the magnetic field is then investigated under the combined as well as the individual effects of the Hall drift and ambipolar diffusion to demonstrate the formation of steep magnetic structures and the resultant current sheet formation. These structures have just the right features for the release of magnetic energy into the solar atmosphere.
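
    For reference, the induction equation the record alludes to, with ohmic, Hall and ambipolar terms for a partially ionized plasma, is commonly written as follows. This is a generic schematic, not reproduced from the paper itself; sign conventions and the definitions of the diffusivities η_O, η_H and η_A vary between references.

        \frac{\partial \mathbf{B}}{\partial t}
          = \nabla \times (\mathbf{v} \times \mathbf{B})
          - \nabla \times \Big[ \eta_O \, (\nabla \times \mathbf{B})
          + \eta_H \, (\nabla \times \mathbf{B}) \times \hat{\mathbf{b}}
          - \eta_A \, \big( (\nabla \times \mathbf{B}) \times \hat{\mathbf{b}} \big) \times \hat{\mathbf{b}} \Big],
        \qquad \hat{\mathbf{b}} = \mathbf{B} / |\mathbf{B}|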

  9. Automated Conflict Resolution For Air Traffic Control

    Science.gov (United States)

    Erzberger, Heinz

    2005-01-01

    The ability to detect and resolve conflicts automatically is considered to be an essential requirement for the next generation air traffic control system. While systems for automated conflict detection have been used operationally by controllers for more than 20 years, automated resolution systems have so far not reached the level of maturity required for operational deployment. Analytical models and algorithms for automated resolution have yet to be demonstrated under realistic traffic conditions to show that they can handle the complete spectrum of conflict situations encountered in actual operations. The resolution algorithm described in this paper was formulated to meet the performance requirements of the Automated Airspace Concept (AAC). The AAC, which was described in a recent paper [1], is a candidate for the next generation air traffic control system. The AAC's performance objectives are to increase safety and airspace capacity and to accommodate user preferences in flight operations to the greatest extent possible. In the AAC, resolution trajectories are generated by an automation system on the ground and sent to the aircraft autonomously via data link. The algorithm generating the trajectories must take into account the performance characteristics of the aircraft and the route structure of the airway system, and be capable of resolving all types of conflicts for properly equipped aircraft without requiring supervision and approval by a controller. Furthermore, the resolution trajectories should be compatible with the clearances, vectors and flight plan amendments that controllers customarily issue to pilots in resolving conflicts. The algorithm described herein, although formulated specifically to meet the needs of the AAC, provides a generic engine for resolving conflicts. Thus, it can be incorporated into any operational concept that requires a method for automated resolution, including concepts for autonomous air-to-air resolution.

  10. On-line low and high frequency acoustic leak detection and location for an automated steam generator protection system

    International Nuclear Information System (INIS)

    Gaubatz, D.C.; Gluekler, E.L.

    1990-01-01

    Two on-line acoustic leak detection systems were installed and operated on a 76 MW hockey stick steam generator in the Sodium Components Test Installation (SCTI) at the Energy Technology Engineering Center (ETEC) in Southern California. The low frequency system demonstrated the capability to detect and locate leaks, both intentional and unintentional. No false alarms were issued during the two year test program, even with adjacent blasting activities, pneumatic drilling, nearby shuttle rocket engine testing, scrams of the SCTI facility, thermal/hydraulic transient testing, and pump/control valve operations. For the high frequency system, the capability to detect water-into-sodium reactions was established utilizing frequencies as high as 300 kHz. The high frequency system appeared to be sensitive to noise generated by maintenance work and system valve operations. Subsequent development work, which is incomplete as of this date, showed much more promise for the high frequency system. (author). 13 figs

  11. Distortions in processed signals and their application in electronic design - III: An automated generator of communication jamming signals

    International Nuclear Information System (INIS)

    Njau, E.C.

    1987-10-01

    We describe the design and operational features of a simple electronic circuit that is capable of automatically generating a narrow bandwidth jamming signal around each frequency signal received from target transmitters. It is noted that jamming based upon this circuit is fairly difficult to nullify using some of the conventional ''counter jamming'' strategies since in this case the jamming signals are flexibly locked onto the spectral components of the received signals. (author). 3 refs, 3 figs

  12. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models are generally based upon specimen-specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity-based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
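
    Landmark-based morphing of the kind described can be sketched with a thin-plate-spline interpolator: corresponding landmarks on source and target anatomies define a smooth 3D warp applied to every node of the source mesh. The landmark coordinates and the use of SciPy's RBFInterpolator are illustrative assumptions, not the published pipeline.

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        # Corresponding anatomical landmarks (hypothetical coordinates, mm).
        src_landmarks = np.array([[0, 0, 0], [50, 0, 0], [0, 60, 0],
                                  [0, 0, 40], [30, 30, 20], [10, 50, 30.0]])
        tgt_landmarks = src_landmarks * 1.08 + np.array([2.0, -1.0, 0.5])  # toy target

        # Thin-plate-spline warp from source space to target space.
        warp = RBFInterpolator(src_landmarks, tgt_landmarks,
                               kernel="thin_plate_spline")

        # Morph all nodes of the source FE mesh (random stand-in here).
        src_nodes = np.random.rand(1000, 3) * 50
        morphed_nodes = warp(src_nodes)
        print(morphed_nodes.shape)  # (1000, 3)

    Mesh mapping would then be a second step, snapping the morphed surface nodes onto the bone boundary extracted from the target CT.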

  13. Photoactivation by visible light of CdTe quantum dots for inline generation of reactive oxygen species in an automated multipumping flow system

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, David S.M.; Frigerio, Christian; Santos, Joao L.M. [Requimte, Department of Chemical Sciences, Laboratory of Applied Chemistry, Faculty of Pharmacy, University of Porto, Rua de Jorge Viterbo Ferreira no. 228, 4050-313 Porto (Portugal); Prior, Joao A.V., E-mail: joaoavp@ff.up.pt [Requimte, Department of Chemical Sciences, Laboratory of Applied Chemistry, Faculty of Pharmacy, University of Porto, Rua de Jorge Viterbo Ferreira no. 228, 4050-313 Porto (Portugal)

    2012-07-20

    Highlights: ► CdTe quantum dots generate free radical species upon exposure to visible radiation. ► A high-power visible LED lamp was used as the photoirradiation element. ► The laboratory-made LED photocatalytic unit was implemented inline in a MPFS. ► Free radical species oxidize luminol, producing a strong chemiluminescence emission. ► Epinephrine scavenges free radical species, quenching the chemiluminescence emission. - Abstract: Quantum dots (QD) are semiconductor nanocrystals able to generate free radical species upon exposure to electromagnetic radiation, usually in the ultraviolet wavelength range. In this work, CdTe QD were used as highly reactive oxygen species (ROS) generators for the control of pharmaceutical formulations containing epinephrine. The developed approach was based on the chemiluminometric monitoring of the quenching effect of epinephrine on the oxidation of luminol by the produced ROS. Due to the relatively low energy band-gap of this chalcogenide, a high-power visible light emitting diode (LED) lamp was used as the photoirradiation element and assembled in a laboratory-made photocatalytic unit. Owing to the very short lifetime of ROS, and to ensure both reproducible generation and time-controlled reaction implementation and development, all reactional processes were implemented inline using an automated multipumping micro-flow system. A linear working range for epinephrine concentration of up to 2.28 × 10⁻⁶ mol L⁻¹ (r = 0.9953; n = 5) was verified. The determination rate was about 79 determinations per hour and the detection limit was about 8.69 × 10⁻⁸ mol L⁻¹. The results obtained in the analysis of epinephrine pharmaceutical formulations by using the proposed methodology were in good agreement with those furnished by the reference procedure, with

  14. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures

    International Nuclear Information System (INIS)

    Cavailloles, F.; Valette, H.; Hebert, J.-L.; Bazin, J.-P.; Di Paola, R.; Capderou, A.

    1987-01-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of the factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative parameters derived from the factors (cardiac output and mean transit time) were compared to those obtained by the region-of-interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results. (author)

  15. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures

    Energy Technology Data Exchange (ETDEWEB)

    Cavailloles, F.; Valette, H.; Hebert, J.-L.; Bazin, J.-P.; Di Paola, R.; Capderou, A.

    1987-05-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of the factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative parameters derived from the factors (cardiac output and mean transit time) were compared to those obtained by the region-of-interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results.

  16. A machine learning approach to automated structural network analysis: application to neonatal encephalopathy.

    Directory of Open Access Journals (Sweden)

    Etay Ziv

    Full Text Available Neonatal encephalopathy represents a heterogeneous group of conditions associated with life-long developmental disabilities and neurological deficits. Clinical measures and current anatomic brain imaging remain inadequate predictors of outcome in children with neonatal encephalopathy. Some studies have suggested that brain development and, therefore, brain connectivity may be altered in the subgroup of patients who subsequently go on to develop clinically significant neurological abnormalities. Large-scale structural brain connectivity networks constructed using diffusion tractography have been posited to reflect organizational differences in white matter architecture at the mesoscale, and thus offer a unique tool for characterizing brain development in patients with neonatal encephalopathy. In this manuscript we use diffusion tractography to construct structural networks for a cohort of patients with neonatal encephalopathy. We systematically map these networks to a high-dimensional space and then apply standard machine learning algorithms to predict neurological outcome in the cohort. Using nested cross-validation we demonstrate high prediction accuracy that is both statistically significant and robust over a broad range of thresholds. Our algorithm offers a novel tool to evaluate neonates at risk for developing neurological deficit. The described approach can be applied to any brain pathology that affects structural connectivity.
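
    A minimal sketch of the nested cross-validation scheme mentioned in the abstract, using scikit-learn on stand-in data; the feature construction, classifier and parameter grid are assumptions, not the authors' protocol.

        import numpy as np
        from sklearn.model_selection import GridSearchCV, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 500))   # stand-in: flattened connectivity features
        y = rng.integers(0, 2, size=60)  # stand-in: neurological outcome labels

        # Inner loop: hyperparameter search; outer loop: unbiased accuracy estimate.
        inner = GridSearchCV(
            make_pipeline(StandardScaler(), SVC()),
            param_grid={"svc__C": [0.1, 1, 10]},
            cv=3,
        )
        outer_scores = cross_val_score(inner, X, y, cv=5)
        print("nested CV accuracy: %.2f +/- %.2f"
              % (outer_scores.mean(), outer_scores.std()))

    Keeping the hyperparameter search inside the outer folds is what makes the reported accuracy an honest estimate rather than an optimistic one.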

  17. Automated polyp measurement based on colon structure decomposition for CT colonography

    Science.gov (United States)

    Wang, Huafeng; Li, Lihong C.; Han, Hao; Peng, Hao; Song, Bowen; Wei, Xinzhou; Liang, Zhengrong

    2014-03-01

    Accurate assessment of colorectal polyp size is of great significance for the early diagnosis and management of colorectal cancers. Due to the complexity of colon structure, polyps with diverse geometric characteristics grow from different landform surfaces. In this paper, we present a new colon decomposition approach for polyp measurement. We first apply an efficient maximum a posteriori expectation-maximization (MAP-EM) partial volume segmentation algorithm to achieve an effective electronic cleansing of the colon. The global colon structure is then decomposed into different kinds of morphological shapes, e.g. haustral folds or haustral wall. Meanwhile, the polyp location is identified by an automatic computer-aided detection algorithm. By integrating the colon structure decomposition with the computer-aided detection system, a patch volume of colon polyps is extracted. Thus, polyp size assessment can be achieved by finding abnormal protrusion on a relatively uniform morphological surface from the decomposed colon landform. We evaluated our method via physical phantom and clinical datasets. Experimental results demonstrate the feasibility of our method in consistently quantifying the size of polyp volume and, therefore, facilitating characterization for clinical management.

  18. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

    Full Text Available Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that, we consider, may be decreased through the automation of several processes. Thus, we present a framework for the semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time of data structure generation in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.
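
    To picture the kind of generation such a prototype performs, the sketch below emits CREATE TABLE statements for a toy star schema from a technical-metadata dictionary; the metadata layout and naming rules are invented for illustration.

        # Toy technical metadata for a star schema (invented layout).
        metadata = {
            "fact_sales": {
                "measures": {"amount": "DECIMAL(12,2)", "quantity": "INTEGER"},
                "dimensions": ["dim_date", "dim_product"],
            },
            "dim_date": {"attributes": {"full_date": "DATE", "month": "INTEGER"}},
            "dim_product": {"attributes": {"name": "VARCHAR(100)",
                                           "category": "VARCHAR(50)"}},
        }

        def dimension_ddl(name, spec):
            cols = [f"{name}_key INTEGER PRIMARY KEY"]
            cols += [f"{col} {typ}" for col, typ in spec["attributes"].items()]
            return f"CREATE TABLE {name} (\n  " + ",\n  ".join(cols) + "\n);"

        def fact_ddl(name, spec):
            cols = [f"{d}_key INTEGER REFERENCES {d}" for d in spec["dimensions"]]
            cols += [f"{col} {typ}" for col, typ in spec["measures"].items()]
            return f"CREATE TABLE {name} (\n  " + ",\n  ".join(cols) + "\n);"

        for table, spec in metadata.items():
            ddl = fact_ddl(table, spec) if "measures" in spec else dimension_ddl(table, spec)
            print(ddl)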

  19. GenRGenS: Software for Generating Random Genomic Sequences and Structures

    OpenAIRE

    Ponty, Yann; Termier, Michel; Denise, Alain

    2006-01-01

    GenRGenS is a software tool dedicated to randomly generating genomic sequences and structures. It handles several classes of models useful for sequence analysis, such as Markov chains, hidden Markov models, weighted context-free grammars, regular expressions and PROSITE expressions. GenRGenS is the only program that can handle weighted context-free grammars, thus allowing the user to model and to generate structured objects (such as RNA secondary structures) of any giv...

  20. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures.

    Science.gov (United States)

    Cavailloles, F; Bazin, J P; Capderou, A; Valette, H; Herbert, J L; Di Paola, R

    1987-05-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of the factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative parameters derived from the factors (cardiac output and mean transit time) were compared to those obtained by the region of interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results. A number of parameters of the cardio-pulmonary function can be assessed by first-pass radionuclide angiocardiography (RNA) [1,2]. Usually, they are calculated using time-activity curves (TAC) from regions of interest (ROI) drawn on the cardiac chambers and the lungs. This method has two main drawbacks: (1) the lack of inter- and intra-observer reproducibility; (2) the problem of crosstalk, which affects the evaluation of the cardio-pulmonary performance. The crosstalk on planar imaging is due to anatomical superimposition of the cardiac chambers and lungs. The activity measured in any ROI is the sum of the activity in several organs, and 'decontamination' of the TAC cannot easily be performed using the ROI method [3]. Factor analysis of dynamic structures (FADS) [4,5] can solve the two problems mentioned above. It provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. The resulting factors are estimates of the time evolution of the activity in each
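
    FADS decomposes a dynamic image series into a few temporal factors with associated spatial weights. As a loose analogy only (FADS itself uses a different factor-extraction algorithm not described here), the sketch below applies non-negative matrix factorization to synthetic time-activity curves of two overlapping structures.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(1)

        # Synthetic temporal behaviours of two overlapping "chambers" (30 frames).
        t = np.linspace(0, 1, 30)
        factor_early = np.exp(-((t - 0.3) / 0.1) ** 2)   # early-peaking curve
        factor_late = np.exp(-((t - 0.7) / 0.1) ** 2)    # late-peaking curve

        # 200 pixels, each a non-negative mixture of the two factors plus noise.
        weights = rng.random((200, 2))
        pixels = weights @ np.vstack([factor_early, factor_late])
        pixels += 0.01 * rng.random(pixels.shape)

        # Factor the pixel-by-time matrix into spatial weights x temporal factors.
        model = NMF(n_components=2, init="nndsvda", max_iter=500)
        spatial = model.fit_transform(pixels)      # (200, 2) factor images
        temporal = model.components_               # (2, 30) time-activity factors
        print(spatial.shape, temporal.shape)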

  1. Migraine Subclassification via a Data-Driven Automated Approach Using Multimodality Factor Mixture Modeling of Brain Structure Measurements.

    Science.gov (United States)

    Schwedt, Todd J; Si, Bing; Li, Jing; Wu, Teresa; Chong, Catherine D

    2017-07-01

    The current subclassification of migraine is according to headache frequency and aura status. The variability in migraine symptoms, disease course, and response to treatment suggests the presence of additional heterogeneity or subclasses within migraine. The study objective was to subclassify migraine via a data-driven approach, identifying latent factors by jointly exploiting multiple sets of brain structural features obtained via magnetic resonance imaging (MRI). Migraineurs (n = 66) and healthy controls (n = 54) had brain MRI measurements of cortical thickness, cortical surface area, and volumes for 68 regions. A multimodality factor mixture model was used to subclassify MRIs and to determine the brain structural factors that most contributed to the subclassification. Clinical characteristics of subjects in each subgroup were compared. Automated MRI classification divided the subjects into two subgroups. Migraineurs in subgroup #1 had more severe allodynia symptoms during migraines (6.1 ± 5.3 vs. 3.6 ± 3.2, P = .03), more years with migraine (19.2 ± 11.3 years vs 13 ± 8.3 years, P = .01), and higher Migraine Disability Assessment (MIDAS) scores (25 ± 22.9 vs 15.7 ± 12.2, P = .04). There were no differences in headache frequency or migraine aura status between the two subgroups. Data-driven subclassification of brain MRIs based upon structural measurements identified two subgroups. Amongst migraineurs, the subgroups differed in allodynia symptom severity, years with migraine, and migraine-related disability. Since allodynia is associated with this imaging-based subclassification of migraine, and prior publications suggest that allodynia impacts migraine treatment response and disease prognosis, future migraine diagnostic criteria could consider allodynia when defining migraine subgroups. © 2017 American Headache Society.

  2. Holographically generated structured illumination for cell stimulation in optogenetics

    Science.gov (United States)

    Schmieder, Felix; Büttner, Lars; Czarske, Jürgen; Torres, Maria Leilani; Heisterkamp, Alexander; Klapper, Simon; Busskamp, Volker

    2017-06-01

    In optogenetics, cells, e.g. neurons or cardiac cells, are genetically altered to produce, for example, the light-sensitive protein Channelrhodopsin-2. Illuminating these cells induces action potentials or contractions and therefore makes it possible to control electrical activity. Thus, light-induced cell stimulation can be used to gain insight into various biological processes. Many optogenetics studies, however, use only full-field illumination and thus gain no local information about their specimen. Using modern spatial light modulators (SLM) in conjunction with computer-generated holograms (CGH), cells may be stimulated locally, enabling research into the foundations of cell networks and cell communication. In our contribution, we present a digital holographic system for the patterned, spatially resolved stimulation of cell networks. We employ a fast ferroelectric liquid-crystal-on-silicon SLM to display CGHs at up to 1.7 kHz. With an effective working distance of 33 mm, we achieve a focus of 10 μm at a positioning accuracy of the individual foci of about 8 μm. We utilized our setup for the optogenetic stimulation of clusters of cardiac cells derived from induced pluripotent stem cells and were able to observe contractions correlated to both the temporal frequency and the spatial power distribution of the light incident on the cell clusters.

  3. Structural and leakage integrity assessment of WWER steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Splichal, K.; Otruba, J. [Nuclear Research Inst., Rez (Czech Republic)

    1997-12-31

    The integrity of heat exchange tubes may influence the life-time of WWER steam generators and appears to be an important criterion for the evaluation of their safety and operational reliability. The basic requirement is to assure a very low probability of radioactive water leakage, preventing unstable crack growth and sudden tube rupture. These requirements led to development of permissible limits for primary to secondary leak evolution and heat exchange tubes plugging based on eddy current test inspection. The stress corrosion cracking and pitting are the main corrosion damage of WWER heat exchange tubes and are initiated from the outer surface. They are influenced by water chemistry, temperature and tube wall stress level. They take place under crevice corrosion condition and are indicated especially (1) under the tube support plates, where up to 90-95 % of defects detected by the ECT method occur, and (2) on free spans under tube deposit layers. Both the initiation and crack growth cause thinning of the tube wall and lead to part thickness cracks and through-wall cracks, oriented above all in the axial direction. 10 refs.

  4. Structural and leakage integrity assessment of WWER steam generator tubes

    International Nuclear Information System (INIS)

    Splichal, K.; Otruba, J.

    1997-01-01

    The integrity of heat exchange tubes may influence the life-time of WWER steam generators and appears to be an important criterion for the evaluation of their safety and operational reliability. The basic requirement is to assure a very low probability of radioactive water leakage, preventing unstable crack growth and sudden tube rupture. These requirements led to development of permissible limits for primary to secondary leak evolution and heat exchange tubes plugging based on eddy current test inspection. The stress corrosion cracking and pitting are the main corrosion damage of WWER heat exchange tubes and are initiated from the outer surface. They are influenced by water chemistry, temperature and tube wall stress level. They take place under crevice corrosion condition and are indicated especially (1) under the tube support plates, where up to 90-95 % of defects detected by the ECT method occur, and (2) on free spans under tube deposit layers. Both the initiation and crack growth cause thinning of the tube wall and lead to part thickness cracks and through-wall cracks, oriented above all in the axial direction

  5. Structural and leakage integrity assessment of WWER steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Splichal, K; Otruba, J [Nuclear Research Inst., Rez (Czech Republic)

    1998-12-31

    The integrity of heat exchange tubes may influence the life-time of WWER steam generators and appears to be an important criterion for the evaluation of their safety and operational reliability. The basic requirement is to assure a very low probability of radioactive water leakage, preventing unstable crack growth and sudden tube rupture. These requirements led to development of permissible limits for primary to secondary leak evolution and heat exchange tubes plugging based on eddy current test inspection. The stress corrosion cracking and pitting are the main corrosion damage of WWER heat exchange tubes and are initiated from the outer surface. They are influenced by water chemistry, temperature and tube wall stress level. They take place under crevice corrosion condition and are indicated especially (1) under the tube support plates, where up to 90-95 % of defects detected by the ECT method occur, and (2) on free spans under tube deposit layers. Both the initiation and crack growth cause thinning of the tube wall and lead to part thickness cracks and through-wall cracks, oriented above all in the axial direction. 10 refs.

  6. Turbulence anisotropy and coherent structures in electromagnetically generated vortex patterns

    International Nuclear Information System (INIS)

    Kenjereš, S

    2011-01-01

    Numerical investigations addressing the influence of localised electromagnetic forcing on turbulent thermal convection of a weakly electrically conductive fluid in a wall-bounded rectangular enclosure are performed over a wide range of working parameters (10^4 ≤ Ra ≤ 5×10^5, Pr = 7). An asymmetrical electromagnetic forcing (EMF) is applied, originating from the combined effects of imposed magnetic fields (from an array of 5×7 permanent magnets with |b_0|_max = 1 T each, located beneath the lower thermally active wall) and electric fields (from two electrodes supplied with dc current of different intensities, 0 ≤ I ≤ 10 A). The subgrid turbulent stress is modelled by an electromagnetically extended Smagorinsky model, and the subgrid turbulent heat flux is represented by a simple gradient diffusion hypothesis. Simulations revealed two interesting findings: the electromagnetic forcing generated a significant overall heat transfer increase (more than 500% for lower values of Ra) compared to the neutral case, and the turbulence anisotropy was reduced in the central part of the enclosure.

  7. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented to automate a house's electricity and to provide a security system that detects the presence of unexpected behavior.

  8. Experimental evaluation of an automated endoscope reprocessor with in situ generation of peracetic acid for disinfection of semicritical devices.

    Science.gov (United States)

    Sattar, Syed A; Kibbee, Richard J; Tetro, Jason A; Rook, Tony A

    2006-11-01

    To evaluate the effectiveness of a high-level disinfection solution generated inside an endoscope processing system for decontaminating external and internal surfaces of experimentally contaminated heat-sensitive medical devices. The American Society for Testing and Materials Simulated-Use Test protocol (E1837-02), which incorporates a soil load in each inoculum, was used to evaluate the efficacy of the system when processing 4 common types of endoscopes contaminated separately with 5 types of nosocomial pathogens: Pseudomonas aeruginosa (ATCC 15442), spores of Clostridium difficile (ATCC 9689), a glutaraldehyde-resistant strain of Mycobacterium chelonae, a vancomycin-resistant strain of Enterococcus faecalis, and a methicillin-resistant strain of Staphylococcus aureus. Rinse solution samples from channels and from surfaces of the processed endoscopes were tested for any microbicidal residues. For all organisms tested, the baseline level of contamination of the endoscopes ranged from 5 log10 to greater than 7 log10 at each external surface site and internal channel. All tests showed reductions in viability of the test organisms to undetectable levels. All rinse solution samples from external and internal sites of the endoscopes proved to be free of any residual microbicidal activity. The endoscope reprocessor, with its processor-generated high-level disinfection solution, successfully reduced the numbers of selected, clinically relevant pathogens to undetectable levels both in the channels and on the outside surfaces of the 4 representative endoscopes tested in this study.

  9. Automated system for on-line determination of dimethylarsinic and inorganic arsenic by hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro, L.L.; Leal, L.O. [Renewable Energy and Environmental Protection Department, Advanced Materials Research Center (CIMAV), Chihuahua, Chihuahua (Mexico); Ferrer, L.; Cerda, V. [University of the Balearic Islands, Department of Chemistry, Palma de Mallorca (Spain)

    2012-09-15

    A multisyringe flow-injection approach has been coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS) with UV photo-oxidation for dimethylarsinic (DMA), inorganic As and total As determination, depending on the pre-treatment given to the sample (extraction or digestion). The implementation of a UV lamp allows on-line photo-oxidation of DMA and subsequent arsenic detection, whereas a bypass leads the flow directly to the HG-AFS system for inorganic arsenic determination. The DMA concentration is calculated as the difference between total inorganic arsenic and the measurement from the photo-oxidation step. The detection limits for DMA and inorganic arsenic were 0.09 and 0.47 μg L^-1, respectively. The repeatability values achieved were 2.4 and 1.8%, and the injection frequencies were 24 and 28 injections per hour for DMA and inorganic arsenic, respectively. The method was validated using the certified reference material BCR-627 (tuna muscle), with good agreement with the certified values. Satisfactory results for DMA and inorganic arsenic determination were obtained in several water matrices. The proposed method offers several advantages, such as a higher sampling frequency, low detection limits, and reduced reagent and sample consumption, which leads to lower waste generation. (orig.)

  10. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    Science.gov (United States)

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. Copyright © RSNA, 2011.
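    The work flow's core step is serializing workstation measurements into a standardized XML document. The sketch below shows the general pattern with Python's standard library; the element and attribute names are simplified placeholders, not the actual AIM schema, which is considerably richer.

```python
# Sketch: serializing a quantitative imaging annotation to XML.
# Element and attribute names are simplified placeholders, not the
# real AIM schema.
import xml.etree.ElementTree as ET

ann = ET.Element("ImageAnnotation", uid="1.2.3.4", dateTime="2011-01-01T00:00:00")
calc = ET.SubElement(ann, "Calculation", description="Lesion diameter")
ET.SubElement(calc, "Value", unit="mm").text = "14.2"
ET.SubElement(ann, "AnatomicEntity", label="Liver segment VII")

# Write the annotation; the same file could be stored in a local
# database for later data mining.
ET.ElementTree(ann).write("annotation.xml", xml_declaration=True, encoding="utf-8")
```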

  11. Fully automated reconstruction of three-dimensional vascular tree structures from two orthogonal views using computational algorithms and production rules

    Science.gov (United States)

    Liu, Iching; Sun, Ying

    1992-10-01

    A system for reconstructing 3-D vascular structure from two orthogonally projected images is presented. The formidable problem of matching segments between two views is solved using knowledge of the epipolar constraint and the similarity of segment geometry and connectivity. The knowledge is represented in a rule-based system, which also controls the operation of several computational algorithms for tracking segments in each image, representing 2-D segments with directed graphs, and reconstructing 3-D segments from matching 2-D segment pairs. Uncertain reasoning governs the interaction between segmentation and matching; it also provides a framework for resolving the matching ambiguities in an iterative way. The system was implemented in the C language and the C Language Integrated Production System (CLIPS) expert system shell. Using video images of a tree model, the standard deviation of reconstructed centerlines was estimated to be 0.8 mm (1.7 mm) when the view direction was parallel (perpendicular) to the epipolar plane. Feasibility of clinical use was shown using x-ray angiograms of a human chest phantom. The correspondence of vessel segments between two views was accurate. Computational time for the entire reconstruction process was under 30 s on a workstation. A fully automated system for two-view reconstruction that does not require the a priori knowledge of vascular anatomy is demonstrated.
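    For two orthogonal views, the epipolar constraint mentioned above takes a particularly simple form: matched points must share the coordinate along the common image axis. A minimal sketch with hypothetical pixel coordinates and tolerance:

```python
# Sketch: 3-D point recovery from two orthogonal projections.
# With view 1 projecting onto the (x, y) plane and view 2 onto the
# (z, y) plane, the epipolar constraint reduces to the two image
# points sharing the same y coordinate.
def reconstruct_point(p_view1, p_view2, tol=1.0):
    """p_view1 = (x, y) in image 1; p_view2 = (z, y) in image 2 (pixels)."""
    (x, y1), (z, y2) = p_view1, p_view2
    if abs(y1 - y2) > tol:               # violates the epipolar constraint
        raise ValueError("points cannot correspond")
    return (x, 0.5 * (y1 + y2), z)       # average y to reduce noise

print(reconstruct_point((12.0, 40.2), (7.5, 39.9)))  # -> (12.0, 40.05, 7.5)
```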

  12. Psi4 1.1: An Open-Source Electronic Structure Program Emphasizing Automation, Advanced Libraries, and Interoperability.

    Science.gov (United States)

    Parrish, Robert M; Burns, Lori A; Smith, Daniel G A; Simmonett, Andrew C; DePrince, A Eugene; Hohenstein, Edward G; Bozkaya, Uğur; Sokolov, Alexander Yu; Di Remigio, Roberto; Richard, Ryan M; Gonthier, Jérôme F; James, Andrew M; McAlexander, Harley R; Kumar, Ashutosh; Saitow, Masaaki; Wang, Xiao; Pritchard, Benjamin P; Verma, Prakash; Schaefer, Henry F; Patkowski, Konrad; King, Rollin A; Valeev, Edward F; Evangelista, Francesco A; Turney, Justin M; Crawford, T Daniel; Sherrill, C David

    2017-07-11

    Psi4 is an ab initio electronic structure program providing methods such as Hartree-Fock, density functional theory, configuration interaction, and coupled-cluster theory. The 1.1 release represents a major update meant to automate complex tasks, such as geometry optimization using complete-basis-set extrapolation or focal-point methods. Conversion of the top-level code to a Python module means that Psi4 can now be used in complex workflows alongside other Python tools. Several new features have been added with the aid of libraries providing easy access to techniques such as density fitting, Cholesky decomposition, and Laplace denominators. The build system has been completely rewritten to simplify interoperability with independent, reusable software components for quantum chemistry. Finally, a wide range of new theoretical methods and analyses have been added to the code base, including functional-group and open-shell symmetry adapted perturbation theory, density-fitted coupled cluster with frozen natural orbitals, orbital-optimized perturbation and coupled-cluster methods (e.g., OO-MP2 and OO-LCCD), density-fitted multiconfigurational self-consistent field, density cumulant functional theory, algebraic-diagrammatic construction excited states, improvements to the geometry optimizer, and the "X2C" approach to relativistic corrections, among many other improvements.
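    Since the 1.1 release exposes Psi4 as a Python module, a single-point calculation reduces to a few lines. A minimal sketch, assuming a working psi4 installation; the memory and basis choices are arbitrary:

```python
# Sketch: using Psi4 1.1 as a Python module (requires psi4 installed).
import psi4

psi4.set_memory("500 MB")
h2o = psi4.geometry("""
0 1
O
H 1 0.96
H 1 0.96 2 104.5
""")
# Single-point energy; "method/basis" strings follow Psi4 conventions.
e = psi4.energy("scf/cc-pvdz")
print(e)
```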

  13. Generation of shockwave and vortex structures at the outflow of a boiling water jet

    Science.gov (United States)

    Alekseev, M. V.; Lezhnin, S. I.; Pribaturin, N. A.; Sorokin, A. L.

    2014-12-01

    Results of numerical simulation for shock waves and generation of vortex structures during unsteady outflow of boiling liquid jet are presented. The features of evolution of shock waves and vortex structures formation during unsteady outflow of boiling water are compared with corresponding structures during unsteady gas outflow.

  14. The Matchmaker Exchange API: automating patient matching through the exchange of structured phenotypic and genotypic profiles.

    Science.gov (United States)

    Buske, Orion J; Schiettecatte, François; Hutton, Benjamin; Dumitriu, Sergiu; Misyura, Andriy; Huang, Lijia; Hartley, Taila; Girdea, Marta; Sobreira, Nara; Mungall, Chris; Brudno, Michael

    2015-10-01

    Despite the increasing prevalence of clinical sequencing, the difficulty of identifying additional affected families is a key obstacle to solving many rare diseases. There may only be a handful of similar patients worldwide, and their data may be stored in diverse clinical and research databases. Computational methods are necessary to enable finding similar patients across the growing number of patient repositories and registries. We present the Matchmaker Exchange Application Programming Interface (MME API), a protocol and data format for exchanging phenotype and genotype profiles to enable matchmaking among patient databases, facilitate the identification of additional cohorts, and increase the rate with which rare diseases can be researched and diagnosed. We designed the API to be straightforward and flexible in order to simplify its adoption on a large number of data types and workflows. We also provide a public test data set, curated from the literature, to facilitate implementation of the API and development of new matching algorithms. The initial version of the API has been successfully implemented by three members of the Matchmaker Exchange and was immediately able to reproduce previously identified matches and generate several new leads currently being validated. The API is available at https://github.com/ga4gh/mme-apis. © 2015 WILEY PERIODICALS, INC.
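    A match request in the MME API is a single POST of a patient profile. The hedged sketch below follows the general shape of the published spec, but the endpoint URL, token, and patient content are hypothetical; consult the spec at the GitHub link above for the authoritative format.

```python
# Sketch: a minimal Matchmaker Exchange /match request. Field names
# follow the published spec in outline, but the endpoint, token, and
# patient content here are hypothetical.
import json
import requests

payload = {
    "patient": {
        "id": "example-patient-1",
        "contact": {"name": "Jane Doe", "href": "mailto:jane@example.org"},
        "features": [{"id": "HP:0001156"}, {"id": "HP:0000252"}],  # HPO terms
        "genomicFeatures": [{"gene": {"id": "FGFR3"}}],
    }
}
resp = requests.post(
    "https://matchmaker.example.org/match",        # hypothetical endpoint
    headers={
        "X-Auth-Token": "…",                        # placeholder token
        "Content-Type": "application/vnd.ga4gh.matchmaker.v1.0+json",
        "Accept": "application/vnd.ga4gh.matchmaker.v1.0+json",
    },
    data=json.dumps(payload),
)
print(resp.status_code, resp.json().get("results", []))
```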

  15. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    Science.gov (United States)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
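    To make the DSM-plus-GA idea concrete, the sketch below evolves cluster assignments that minimize the number of dependencies crossing cluster boundaries. It is a generic Python illustration, not the paper's Excel-macro tool; the matrix size, population size, and operators are arbitrary choices.

```python
# Sketch: GA clustering of a Dependency Structure Matrix. A generic
# illustration; fitness simply counts inter-cluster dependencies.
import random

random.seed(1)
N, K = 8, 3                                    # elements, max clusters
DSM = [[random.randint(0, 1) for _ in range(N)] for _ in range(N)]

def fitness(assign):
    # dependencies crossing cluster boundaries (lower is better)
    return sum(DSM[i][j] for i in range(N) for j in range(N)
               if i != j and assign[i] != assign[j])

def crossover(a, b):
    cut = random.randrange(1, N)
    return a[:cut] + b[cut:]

def mutate(assign):
    a = assign[:]
    a[random.randrange(N)] = random.randrange(K)
    return a

pop = [[random.randrange(K) for _ in range(N)] for _ in range(30)]
for _ in range(200):                           # generations
    pop.sort(key=fitness)
    parents = pop[:10]                         # elitist selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
best = min(pop, key=fitness)
print(best, fitness(best))
```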

  16. Data structures and language elements for automated transport calculations for neutron and gamma radiation

    International Nuclear Information System (INIS)

    Rexer, G.

    1978-12-01

    Computer-aided design of nuclear shielding and irradiation facilities is characterized by studies of different design variants in order to determine which facilities are safe and still economical. The design engineer faces a very complex task, including the formulation of calculation models, the linking of programs and data, and the management of large data stores. Integrated modular program systems with centralized module and data management make it possible to treat these problems in a simpler and more automated manner. The paper describes a system of this type for the field of radiation transport and radiation shielding. The basis is the modular system RSYST II, which has a dynamic hierarchical scheme for structuring problem data in a central data base. (orig./RW) [de

  17. Generating spatial precipitation ensembles: impact of temporal correlation structure

    Science.gov (United States)

    Rakovec, O.; Hazenberg, P.; Torfs, P. J. J. F.; Weerts, A. H.; Uijlenhoet, R.

    2012-09-01

    Sound spatially distributed rainfall fields including a proper spatial and temporal error structure are of key interest for hydrologists to force hydrological models and to identify uncertainties in the simulated and forecasted catchment response. The current paper presents a temporally coherent error identification method based on time-dependent multivariate spatial conditional simulations, which are conditioned on preceding simulations. A sensitivity analysis and real-world experiment are carried out within the hilly region of the Belgian Ardennes. Precipitation fields are simulated for pixels of 10 km × 10 km resolution. Uncertainty analyses in the simulated fields focus on (1) the number of previous simulation hours on which the new simulation is conditioned, (2) the advection speed of the rainfall event, (3) the size of the catchment considered, and (4) the rain gauge density within the catchment. The results of the sensitivity analysis show that for typical advection speeds >20 km h−1, no uncertainty is added in terms of across-ensemble spread when conditioned on more than one or two previous hourly simulations. However, for the real-world experiment, additional uncertainty can still be added when conditioning on a larger number of previous simulations. This is because for actual precipitation fields, the dynamics exhibit a larger spatial and temporal variability. Moreover, when thinning the observation network by 50%, the added uncertainty increases only slightly, and the cross-validation shows that the simulations at the unobserved locations are unbiased. Finally, the first-order autocorrelation coefficients show clear temporal coherence in the time series of the areal precipitation using the time-dependent multivariate conditional simulations, which was not the case using the time-independent univariate conditional simulations. The presented work can be easily implemented within a hydrological calibration and data assimilation framework and can be used as an
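    The essential mechanism, conditioning each new hourly field on the preceding one(s) so the ensemble stays temporally coherent, can be caricatured with an AR(1) blend of spatially correlated Gaussian fields. A minimal sketch, with arbitrary grid size, correlation parameter, and smoothing kernel (the paper's method is a full geostatistical conditional simulation, not this shortcut):

```python
# Sketch: temporally coherent ensemble fields via an AR(1) blend of
# spatially correlated Gaussian fields. Grid size, rho, and the
# smoothing kernel are arbitrary stand-ins.
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
nx = ny = 20                                   # grid of 10 km x 10 km pixels
rho = 0.7                                      # hour-to-hour correlation

def correlated_field():
    # crude spatial correlation: smooth white noise with a box kernel
    noise = rng.normal(size=(nx, ny))
    kernel = np.ones((5, 5)) / 25.0
    return convolve2d(noise, kernel, mode="same", boundary="symm")

fields = [correlated_field()]
for _ in range(23):                            # 24 hourly fields in total
    fields.append(rho * fields[-1]
                  + np.sqrt(1 - rho**2) * correlated_field())
```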

  18. Generating spatial precipitation ensembles: impact of temporal correlation structure

    Directory of Open Access Journals (Sweden)

    O. Rakovec

    2012-09-01

    Full Text Available Sound spatially distributed rainfall fields including a proper spatial and temporal error structure are of key interest for hydrologists to force hydrological models and to identify uncertainties in the simulated and forecasted catchment response. The current paper presents a temporally coherent error identification method based on time-dependent multivariate spatial conditional simulations, which are conditioned on preceding simulations. A sensitivity analysis and real-world experiment are carried out within the hilly region of the Belgian Ardennes. Precipitation fields are simulated for pixels of 10 km × 10 km resolution. Uncertainty analyses in the simulated fields focus on (1) the number of previous simulation hours on which the new simulation is conditioned, (2) the advection speed of the rainfall event, (3) the size of the catchment considered, and (4) the rain gauge density within the catchment. The results of the sensitivity analysis show that for typical advection speeds >20 km h−1, no uncertainty is added in terms of across-ensemble spread when conditioned on more than one or two previous hourly simulations. However, for the real-world experiment, additional uncertainty can still be added when conditioning on a larger number of previous simulations. This is because for actual precipitation fields, the dynamics exhibit a larger spatial and temporal variability. Moreover, when thinning the observation network by 50%, the added uncertainty increases only slightly, and the cross-validation shows that the simulations at the unobserved locations are unbiased. Finally, the first-order autocorrelation coefficients show clear temporal coherence in the time series of the areal precipitation using the time-dependent multivariate conditional simulations, which was not the case using the time-independent univariate conditional simulations. The presented work can be easily implemented within a hydrological calibration and data assimilation

  19. Automated analysis of lightning leader speed, local flash rates and electric charge structure in thunderstorms

    Science.gov (United States)

    Van Der Velde, O. A.; Montanya, J.; López, J. A.

    2017-12-01

    A Lightning Mapping Array (LMA) maps radio pulses emitted by lightning leaders, displaying lightning flash development in the cloud in three dimensions. Over the last 10 years, about a dozen of these advanced systems have become operational in the United States and in Europe, often for severe weather monitoring or lightning research. We introduce new methods for the analysis of complex three-dimensional lightning data produced by LMAs and illustrate them with cases of a mid-latitude severe-weather-producing thunderstorm and a tropical thunderstorm in Colombia. The method is based on the characteristics of bidirectional leader development as observed in LMA data (van der Velde and Montanyà, 2013, JGR-Atmospheres), where mapped positive leaders were found to propagate at characteristic speeds around 2 × 10^4 m s^-1, while negative leaders typically propagate at speeds around 10^5 m s^-1. Here, we determine leader speed for every 1.5 x 1.5 x 0.75 km grid box in 3 ms time steps, using two time intervals (e.g., 9 ms and 27 ms) and circles (4.5 km and 2.5 km wide) in which a robust Theil-Sen fitting of the slope is performed for fast and slow leaders. The two are then merged such that important speed characteristics are optimally maintained in negative and positive leaders, and labeled with positive or negative polarity according to the resulting velocity. The method also counts how often leaders from a lightning flash initiate in or pass through each grid box. This "local flash rate" may be used in severe thunderstorm or NOx production studies and is more meaningful than LMA source density, which is biased by the detection efficiency. Additionally, in each grid box the median x, y and z components of the leader propagation vectors of all flashes result in a 3D vector grid which can be compared to vectors in numerical models of leader propagation in response to cloud charge structure. Finally, the charge region altitudes, thickness and rates are summarized
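    The central numerical step, a robust Theil-Sen fit of leader position against time inside a grid box, is easy to reproduce. A sketch with made-up LMA source positions; the 5×10^4 m/s polarity threshold is a rough midpoint between the two characteristic speeds quoted above, not a value from the paper:

```python
# Sketch: robust leader-speed estimate from LMA sources in one grid
# box, via Theil-Sen fitting of distance travelled against time.
# Input arrays and the polarity threshold are hypothetical.
import numpy as np
from scipy.stats import theilslopes

t = np.array([0.000, 0.003, 0.006, 0.009, 0.012])        # s
xyz = np.array([[0, 0, 6000], [55, 40, 6010],
                [120, 70, 6030], [170, 120, 6025],
                [240, 150, 6040]], dtype=float)           # m

d = np.linalg.norm(xyz - xyz[0], axis=1)                  # distance from first source
speed, intercept, lo, hi = theilslopes(d, t)
polarity = "negative" if speed > 5e4 else "positive"      # ~1e5 vs ~2e4 m/s
print(f"{speed:.0f} m/s -> {polarity} leader")
```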

  20. Generating Free-Form Grid Truss Structures from 3D Scanned Point Clouds

    Directory of Open Access Journals (Sweden)

    Hui Ding

    2017-01-01

    Full Text Available Reconstruction according to physical shape is a novel way to generate free-form grid truss structures. 3D scanning is an effective means of acquiring physical form information, and it generates dense point clouds on the surfaces of objects. However, generating grid truss structures from point clouds is still a challenge. Based on the advancing front technique (AFT), which is widely used in the Finite Element Method (FEM), a scheme for generating grid truss structures from 3D scanned point clouds is proposed in this paper. Based on the characteristics of point cloud data, a search box is adopted to reduce the search space during grid generation. A front advancing procedure suited to point clouds is established. The Delaunay method and the Laplacian method are used to improve the quality of the generated grids, and an adjustment strategy that locates grid nodes at appointed places is proposed. Several examples of generating grid truss structures from 3D scanned point clouds of seashells are carried out to verify the proposed scheme. Physical models of the grid truss structures generated in the examples are manufactured by 3D printing, which confirms the feasibility of the scheme.
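    The "search box" used above to limit neighbour searches during front advancing can be emulated with a spatial index. A hedged sketch using a k-d tree radius query on a random stand-in point cloud (the paper's exact data structure and search radius are not specified here):

```python
# Sketch: limiting neighbour search during front advancing with a
# k-d tree radius query; one way to get the effect of a search box.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
points = rng.random((10000, 3))               # stand-in for a scanned point cloud
tree = cKDTree(points)

front_node = np.array([0.5, 0.5, 0.5])
idx = tree.query_ball_point(front_node, r=0.05)   # candidates near the front
candidates = points[idx]
print(len(candidates))
```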

  1. BranchAnalysis2D/3D automates morphometry analyses of branching structures.

    Science.gov (United States)

    Srinivasan, Aditya; Muñoz-Estrada, Jesús; Bourgeois, Justin R; Nalwalk, Julia W; Pumiglia, Kevin M; Sheen, Volney L; Ferland, Russell J

    2018-01-15

    Morphometric analyses of biological features have become increasingly common in recent years with such analyses being subject to a large degree of observer bias, variability, and time consumption. While commercial software packages exist to perform these analyses, they are expensive, require extensive user training, and are usually dependent on the observer tracing the morphology. To address these issues, we have developed a broadly applicable, no-cost ImageJ plugin we call 'BranchAnalysis2D/3D', to perform morphometric analyses of structures with branching morphologies, such as neuronal dendritic spines, vascular morphology, and primary cilia. Our BranchAnalysis2D/3D algorithm allows for rapid quantification of the length and thickness of branching morphologies, independent of user tracing, in both 2D and 3D data sets. We validated the performance of BranchAnalysis2D/3D against pre-existing software packages using trained human observers and images from brain and retina. We found that the BranchAnalysis2D/3D algorithm outputs results similar to available software (i.e., Metamorph, AngioTool, Neurolucida), while allowing faster analysis times and unbiased quantification. BranchAnalysis2D/3D allows inexperienced observers to output results like a trained observer but more efficiently, thereby increasing the consistency, speed, and reliability of morphometric analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
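    As a flavour of the kind of measurement the plugin automates, the sketch below skeletonizes a binary 2-D mask and converts the skeleton pixel count into a length. This is a crude scikit-image illustration, not the ImageJ plugin's algorithm; the mask and pixel calibration are hypothetical.

```python
# Sketch: a crude 2-D branch-length measurement via skeletonization.
# Only illustrative of the morphometry involved; not the plugin's code.
import numpy as np
from skimage.morphology import skeletonize

mask = np.zeros((64, 64), dtype=bool)
mask[32, 10:54] = True                 # hypothetical horizontal branch
mask[16:48, 32] = True                 # hypothetical vertical branch

skel = skeletonize(mask)
pixel_size_um = 0.5                    # hypothetical calibration
total_length = skel.sum() * pixel_size_um
print(f"total branch length ~ {total_length:.1f} um")
```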

  2. THE NEW ONLINE METADATA EDITOR FOR GENERATING STRUCTURED METADATA

    Energy Technology Data Exchange (ETDEWEB)

    Devarakonda, Ranjeet [ORNL; Shrestha, Biva [ORNL; Palanisamy, Giri [ORNL; Hook, Leslie A [ORNL; Killeffer, Terri S [ORNL; Boden, Thomas A [ORNL; Cook, Robert B [ORNL; Zolly, Lisa [United States Geological Service (USGS); Hutchison, Viv [United States Geological Service (USGS); Frame, Mike [United States Geological Service (USGS); Cialella, Alice [Brookhaven National Laboratory (BNL); Lazer, Kathy [Brookhaven National Laboratory (BNL)

    2014-01-01

    Nobody is better suited to describe data than the scientist who created it. This description of a dataset is called metadata. In general terms, metadata represents the who, what, when, where, why and how of the dataset [1]. eXtensible Markup Language (XML) is the preferred output format for metadata, as it makes it portable and, more importantly, suitable for system discoverability. The newly developed ORNL Metadata Editor (OME) is a Web-based tool that allows users to create and maintain XML files containing key information, or metadata, about the research. Metadata include information about the specific projects, parameters, time periods, and locations associated with the data. Such information helps put the research findings in context. In addition, the metadata produced using OME will allow other researchers to find these data via metadata clearinghouses like Mercury [2][4]. OME is part of ORNL's Mercury software fleet [2][3]. It was jointly developed to support projects funded by the United States Geological Survey (USGS), U.S. Department of Energy (DOE), National Aeronautics and Space Administration (NASA) and National Oceanic and Atmospheric Administration (NOAA). OME's architecture provides a customizable interface to support project-specific requirements. Using this new architecture, the ORNL team developed OME instances for USGS's Core Science Analytics, Synthesis, and Libraries (CSAS&L), DOE's Next Generation Ecosystem Experiments (NGEE) and Atmospheric Radiation Measurement (ARM) Program, and the international Surface Ocean Carbon Dioxide ATlas (SOCAT). Researchers simply use the ORNL Metadata Editor to enter relevant metadata into a Web-based form. From the information on the form, the Metadata Editor can create an XML file on the server where the editor is installed or on the user's personal computer. Researchers can also use the ORNL Metadata Editor to modify existing XML metadata files. As an example, an NGEE Arctic scientist uses OME to register

  3. ESBL Detection: Comparison of a Commercially Available Chromogenic Test for Third Generation Cephalosporin Resistance and Automated Susceptibility Testing in Enterobacteriaceae.

    Directory of Open Access Journals (Sweden)

    Mohamed Ramadan El-Jade

    Full Text Available Rapid detection and reporting of third-generation cephalosporin resistance (3GC-R) and of extended-spectrum beta-lactamases in Enterobacteriaceae (ESBL-E) is a diagnostic and therapeutic priority to avoid inefficacy of the initial antibiotic regimen. In this study we evaluated a commercially available chromogenic screen for 3GC-R as a predictive and/or confirmatory test for ESBL and AmpC activity in clinical and veterinary Enterobacteriaceae isolates. The test was highly reliable in the prediction of cefotaxime and cefpodoxime resistance, but there was no correlation with ceftazidime and piperacillin/tazobactam minimal inhibitory concentrations. All human and porcine ESBL-E tested were detected, with the exception of one genetically positive but phenotypically negative isolate. By contrast, AmpC detection rates remained below 30%. Notably, exclusion of piperacillin/tazobactam-resistant, 3GC-susceptible K1+ Klebsiella isolates increased the sensitivity and specificity of the test for ESBL detection. Our data further imply that in regions with a low prevalence of AmpC- and K1-positive E. coli strains, chromogenic testing for 3GC-R can substitute for more time-consuming ESBL confirmatory testing in E. coli isolates that test positive in a Phoenix or VITEK2 ESBL screen. We therefore suggest a diagnostic algorithm that distinguishes 3GC-R screening from primary culture and species-dependent confirmatory ESBL testing by βLACTA™ and discuss the implications of MIC distribution results on the choice of antibiotic regimen.

  4. Generation of structural MR images from amyloid PET: Application to MR-less quantification.

    Science.gov (United States)

    Choi, Hongyoon; Lee, Dong Soo

    2017-12-07

    Structural magnetic resonance (MR) images concomitantly acquired with PET images can provide crucial anatomical information for precise quantitative analysis. However, in the clinical setting, not all subjects have a corresponding MR. Here, we developed a model to generate structural MR images from amyloid PET using deep generative networks, and we applied it to the quantification of cortical amyloid load without structural MR. Methods: We used florbetapir PET and structural MR data from the Alzheimer's Disease Neuroimaging Initiative database. The generative network was trained to generate realistic structural MR images from florbetapir PET images. After training, the model was applied to the quantification of cortical amyloid load. PET images were spatially normalized to the template space using the generated MR, and the standardized uptake value ratio (SUVR) of the target regions was then measured using predefined regions of interest. Real MR-based quantification was used as the gold standard to measure the accuracy of our approach. Other MR-less methods, namely normal PET template-based, multi-atlas PET template-based, and PET segmentation-based normalization/quantification methods, were also tested. We compared the performance of quantification using the generated MR with that of MR-based and MR-less quantification methods. Results: Generated MR images from florbetapir PET showed signal patterns visually similar to the real MR. The structural similarity index between real and generated MR was 0.91 ± 0.04. The mean absolute error of the SUVR of cortical composite regions estimated by the generated MR-based method was 0.04 ± 0.03, significantly smaller than for other MR-less methods (0.29 ± 0.12 for the normal PET template, 0.12 ± 0.07 for the multi-atlas PET template, and 0.08 ± 0.06 for the PET segmentation-based method). Bland-Altman plots revealed that the generated MR-based SUVR quantification was the closest to the SUVR values estimated by the real MR-based method. Conclusion
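    The paper's generator is a trained deep network; as a shape-only illustration, the sketch below defines a toy convolutional encoder-decoder that maps one PET slice to one MR-like slice in PyTorch. The layer sizes, and the absence of the training loop, are simplifications, not the authors' architecture.

```python
# Sketch: a toy encoder-decoder mapping a PET slice to an MR-like
# slice. Shows only the shape of such a translation model; the
# paper's network and training procedure are far richer.
import torch
import torch.nn as nn

class Pet2MrGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, pet):
        return self.decode(self.encode(pet))

g = Pet2MrGenerator()
fake_mr = g(torch.randn(1, 1, 128, 128))   # one 128x128 PET slice
print(fake_mr.shape)                        # torch.Size([1, 1, 128, 128])
```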

  5. Automated Generation of Geo-Referenced Mosaics From Video Data Collected by Deep-Submergence Vehicles: Preliminary Results

    Science.gov (United States)

    Rzhanov, Y.; Beaulieu, S.; Soule, S. A.; Shank, T.; Fornari, D.; Mayer, L. A.

    2005-12-01

    Many advances in understanding geologic, tectonic, biologic, and sedimentologic processes in the deep ocean are facilitated by direct observation of the seafloor. However, making such observations is both difficult and expensive. Optical systems (e.g., video, still camera, or direct observation) will always be constrained by the severe attenuation of light in the deep ocean, limiting the field of view to distances that are typically less than 10 meters. Acoustic systems can 'see' much larger areas, but at the cost of spatial resolution. Ultimately, scientists want to study and observe deep-sea processes in the same way we do land-based phenomena so that the spatial distribution and juxtaposition of processes and features can be resolved. We have begun development of algorithms that will, in near real-time, generate mosaics from video collected by deep-submergence vehicles. Mosaics consist of many more than 10 video frames and can cover hundreds of square meters. This work builds on a publicly available still and video mosaicking software package developed by Rzhanov and Mayer. Here we present the results of initial tests of data collection methodologies (e.g., transects across the seafloor and panoramas across features of interest), algorithm application, and GIS integration conducted during a recent cruise to the Eastern Galapagos Spreading Center (0 deg N, 86 deg W). We have developed a GIS database for the region that will act as a means to access and display mosaics within a geospatially-referenced framework. We have constructed numerous mosaics using both video and still imagery and assessed the quality of the mosaics (including registration errors) under different lighting conditions and with different navigation procedures. We have begun to develop algorithms for efficient and timely mosaicking of collected video as well as integration with navigation data for georeferencing the mosaics. Initial results indicate that operators must be properly versed in the control of the
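    Frame-to-frame registration is the computational heart of such mosaicking. A hedged sketch of one common approach (ORB features, brute-force matching, RANSAC homography) using OpenCV; the original package by Rzhanov and Mayer may use different features and blending:

```python
# Sketch: pairwise frame registration for video mosaicking with
# OpenCV. Real seafloor mosaicking must also handle lighting
# correction and navigation-based georeferencing.
import cv2
import numpy as np

def register(frame_a, frame_b):
    orb = cv2.ORB_create(2000)
    ka, da = orb.detectAndCompute(frame_a, None)
    kb, db = orb.detectAndCompute(frame_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(da, db)
    src = np.float32([ka[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kb[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H   # maps frame_a coordinates into frame_b's frame

# H matrices can then be chained across frames and used with
# cv2.warpPerspective to paste each frame into a common mosaic canvas.
```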

  6. Analog automatic test pattern generation for quasi-static structural test.

    NARCIS (Netherlands)

    Zjajo, A.; Pineda de Gyvez, J.

    2009-01-01

    A new structural, fault-oriented analog test generation methodology for detecting the presence of manufacturing-related defects is proposed. The output of the test generator consists of optimized test stimuli, fault coverage, and sampling instants that are sufficient to detect the failure

  7. Generation of Earthquake Ground Motion Considering Local Site Effects and Soil-Structure Interaction Analysis of Ancient Structures

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Kwan; Lee, J. S.; Yang, T. S.; Cho, J. R.; R, H. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-09-01

    In order to establish a correct correlation between damage and ground motion intensity, the mechanical characteristics of ancient structures need to be investigated. Since sedimentary basins were preferred dwelling sites in ancient times, it is necessary to perform SSI analysis to derive a correct correlation between the damage and ground motion intensity. The contents of the project are as follows: (1) generation of stochastic earthquake ground motion considering source mechanism and site effects; (2) analysis of the seismic response of sedimentary basins; (3) soil-structure interaction analysis of ancient structures; (4) investigation of the dynamic response characteristics of ancient structures considering soil-structure interaction effects. A procedure is presented for the generation of stochastic earthquake ground motion considering source mechanism and site effects. The simulation method proposed by Boore is used to generate the outcropping rock motion. The free-field motion at the soil site is obtained by a convolution analysis. For the study of wood structures, a nonlinear SDOF model is developed. The effects of soil-structure interaction on the behavior of the wood structures are found to be very minor, but the response can be significantly affected by the intensity and frequency content of the input motion. 13 refs., 6 tabs., 31 figs. (author)
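    Boore's stochastic method, cited above, shapes windowed white noise with a target Fourier amplitude spectrum. The sketch below keeps only that skeleton; the shaping window and the omega-squared-like target spectrum are crude placeholders, not a calibrated source/path/site model:

```python
# Sketch: bare bones of stochastic ground-motion simulation in the
# spirit of Boore's method: windowed white noise shaped in the
# frequency domain by a target spectrum (placeholder spectrum here).
import numpy as np

rng = np.random.default_rng(0)
n, dt = 2048, 0.01
t = np.arange(n) * dt

noise = rng.normal(size=n) * (t * np.exp(1 - t / 3.0) / 3.0)  # shaping window
spec = np.fft.rfft(noise)
f = np.fft.rfftfreq(n, dt)

f0 = 2.0                                       # hypothetical corner frequency (Hz)
target = (f / f0) ** 2 / (1 + (f / f0) ** 2)   # omega-squared-like shape
spec *= target / (np.abs(spec) + 1e-12)        # impose target amplitude, keep phase
accel = np.fft.irfft(spec, n)                  # synthetic acceleration history
```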

  8. Range and stability of structural colors generated by Morpho-inspired color reflectors.

    Science.gov (United States)

    Chung, Kyungjae; Shin, Jung H

    2013-05-01

    The range and stability of structural colors generated by Morpho-inspired color reflectors are investigated. We find that despite the internal randomness of such structures, which gives rise to their Morpho-like angle-independent iridescence, their colors under ambient lighting conditions can be predicted by simple transfer-matrix calculations of corresponding planar multilayer structures. By calculating the possible range of colors generated by multilayers of different structures and material combinations using such transfer-matrix methods, we find that low-refractive-index multilayers with intrastructure absorption, such as the melanin-containing chitin/air multilayer structure of the Morpho butterflies, can provide not only the purest structural colors with the largest color gamut, but also the highest stability of color against variations in multilayer structure.
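    The transfer-matrix calculation referred to above is compact enough to sketch. The snippet below computes the normal-incidence reflectance of a planar multilayer; the layer indices and thicknesses are hypothetical chitin/air-like values, and the intrastructure absorption discussed in the paper is omitted:

```python
# Sketch: normal-incidence transfer-matrix reflectance of a planar
# multilayer. Layer values are hypothetical; absorption is omitted.
import numpy as np

def reflectance(n_layers, d_layers, lam, n_in=1.0, n_out=1.52):
    """n_layers, d_layers: refractive index and thickness (nm) per layer."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / lam
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    num = n_in * M[0, 0] + n_in * n_out * M[0, 1] - M[1, 0] - n_out * M[1, 1]
    den = n_in * M[0, 0] + n_in * n_out * M[0, 1] + M[1, 0] + n_out * M[1, 1]
    return abs(num / den) ** 2

# chitin-like/air stack evaluated across the visible range
for lam in (400, 480, 560, 640):
    print(lam, reflectance([1.56, 1.0] * 4, [80, 120] * 4, lam))
```

Weighting such spectra by color-matching functions would then give the predicted color under a chosen illuminant.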

  9. Automated evaluation of liver fibrosis in thioacetamide, carbon tetrachloride, and bile duct ligation rodent models using second-harmonic generation/two-photon excited fluorescence microscopy.

    Science.gov (United States)

    Liu, Feng; Chen, Long; Rao, Hui-Ying; Teng, Xiao; Ren, Ya-Yun; Lu, Yan-Qiang; Zhang, Wei; Wu, Nan; Liu, Fang-Fang; Wei, Lai

    2017-01-01

    Animal models provide a useful platform for developing and testing new drugs to treat liver fibrosis. Accordingly, we developed a novel automated system to evaluate liver fibrosis in rodent models. This system uses second-harmonic generation (SHG)/two-photon excited fluorescence (TPEF) microscopy to assess a total of four mouse and rat models, using chemical treatment with either thioacetamide (TAA) or carbon tetrachloride (CCl4), and a surgical method, bile duct ligation (BDL). The results obtained by the new technique were compared with that using Ishak fibrosis scores and two currently used quantitative methods for determining liver fibrosis: the collagen proportionate area (CPA) and measurement of hydroxyproline (HYP) content. We show that 11 shared morphological parameters faithfully recapitulate Ishak fibrosis scores in the models, with high area under the receiver operating characteristic (ROC) curve (AUC) performance. The AUC values of 11 shared parameters were greater than that of the CPA (TAA: 0.758-0.922 vs 0.752-0.908; BDL: 0.874-0.989 vs 0.678-0.966) in the TAA mice and BDL rat models and similar to that of the CPA in the TAA rat and CCl4 mouse models. Similarly, based on the trends in these parameters at different time points, 9, 10, 7, and 2 model-specific parameters were selected for the TAA rats, TAA mice, CCl4 mice, and BDL rats, respectively. These parameters identified differences among the time points in the four models, with high AUC accuracy, and the corresponding AUC values of these parameters were greater compared with those of the CPA in the TAA rat and mouse models (rats: 0.769-0.894 vs 0.64-0.799; mice: 0.87-0.93 vs 0.739-0.836) and similar to those of the CPA in the CCl4 mouse and BDL rat models. Similarly, the AUC values of 11 shared parameters and model-specific parameters were greater than those of HYP in the TAA rats, TAA mice, and CCl4 mouse models and were similar to those of HYP in the BDL rat models. The automated
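    Each morphological parameter above is scored by its AUC against the Ishak reference. The trivial sketch below shows that evaluation step with scikit-learn on made-up data:

```python
# Sketch: scoring one SHG/TPEF morphological parameter against Ishak
# stage with an ROC AUC. The numbers are made up.
from sklearn.metrics import roc_auc_score

# 1 = advanced fibrosis, 0 = early; values = one collagen parameter
y_true = [0, 0, 0, 1, 1, 1, 1, 0, 1, 0]
values = [0.8, 1.1, 0.9, 2.4, 1.9, 2.8, 2.2, 1.3, 1.7, 1.0]
print(roc_auc_score(y_true, values))
```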

  10. Ground Reaction Forces Generated During Rhythmical Squats as a Dynamic Loads of the Structure

    Science.gov (United States)

    Pantak, Marek

    2017-10-01

    Dynamic forces generated by moving persons can lead to excessive vibration of long-span, slender, and lightweight structures such as floors, stairs, stadium stands and footbridges. These dynamic forces are generated during walking, running, jumping, and rhythmical body swaying in the vertical or horizontal direction, etc. This paper presents mathematical models of the Ground Reaction Forces (GRFs) generated during squats. The elaborated models were compared with GRFs measured during laboratory tests carried out by the author over a wide frequency range using a force platform. Moreover, the GRF models were evaluated in dynamic numerical analyses and dynamic field tests of an exemplary structure (a steel footbridge).
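    GRF models of this kind are conventionally written as a Fourier series, F(t) = G(1 + Σ_i a_i sin(2πift + φ_i)), with dynamic load factors a_i fitted to measurements. The sketch below evaluates such a series; the weight, frequency, and coefficients are placeholders, not the paper's fitted values:

```python
# Sketch: a Fourier-series GRF model of the common form
# F(t) = G * (1 + sum_i a_i * sin(2*pi*i*f*t + phi_i)).
# All numerical values are hypothetical placeholders.
import numpy as np

G = 800.0                      # body weight (N), hypothetical
f = 2.0                        # squat frequency (Hz), hypothetical
a = [0.6, 0.2, 0.1]            # dynamic load factors, hypothetical
phi = [0.0, 0.5, 1.0]          # phase lags (rad), hypothetical

t = np.linspace(0, 4, 2000)
F = G * (1 + sum(ai * np.sin(2 * np.pi * (i + 1) * f * t + p)
                 for i, (ai, p) in enumerate(zip(a, phi))))
print(F.max(), F.min())        # peak and minimum ground reaction force
```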

  11. Submicron hollow spot generation by solid immersion lens and structured illumination

    NARCIS (Netherlands)

    Kim, M.S.; Assafrao, A.C.; Scharf, T.; Wachters, A.J.H.; Pereira, S.F.; Urbach, H.P.; Brun, M.; Olivier, S.; Nicoletti, S.; Herzig, H.P.

    2012-01-01

    We report on the experimental and numerical demonstration of immersed submicron-size hollow focused spots, generated by structuring the polarization state of an incident light beam impinging on a micro-size solid immersion lens (μ-SIL) made of SiO2. Such structured focal spots are characterized by a

  12. Optimizing the financial structure and maximizing the future value of your generation project

    International Nuclear Information System (INIS)

    Arulampalam, G.; Letellier, M.

    2004-01-01

    This paper discusses ways of optimizing the financial structure and maximizing the future value of an electric power generation project. It outlines the project structure, the sponsor objectives, project finance lending criteria, project timeline, risk mitigation, bank and institutional financing, sponsor's role, impact of financing choices on project value, and impact of penalties and derivative products

  13. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system

    Energy Technology Data Exchange (ETDEWEB)

    Ikeya, Teppei [Goethe University Frankfurt am Main, Institute of Biophysical Chemistry, Center for Biomolecular Magnetic Resonance (Germany); Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune [Tokyo Metropolitan University, Graduate School of Science (Japan)], E-mail: kainosho@nmr.chem.metro-u.ac.jp; Guentert, Peter [Goethe University Frankfurt am Main, Institute of Biophysical Chemistry, Center for Biomolecular Magnetic Resonance (Germany)], E-mail: guentert@em.uni-frankfurt.de

    2009-08-15

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional 'through-bond' spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.

  14. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system

    International Nuclear Information System (INIS)

    Ikeya, Teppei; Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune; Guentert, Peter

    2009-01-01

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional 'through-bond' spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.

  15. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system.

    Science.gov (United States)

    Ikeya, Teppei; Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune; Güntert, Peter

    2009-08-01

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional "through-bond" spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.
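    The accuracy figures above are backbone RMSDs after optimal superposition. For reference, the sketch below implements the standard Kabsch alignment and RMSD in NumPy on random stand-in coordinates (76 residues, as in ubiquitin):

```python
# Sketch: backbone RMSD after optimal superposition (Kabsch
# algorithm). Coordinates are random stand-ins, not real structures.
import numpy as np

def kabsch_rmsd(P, Q):
    P = P - P.mean(axis=0)                     # center both coordinate sets
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)          # SVD of the covariance matrix
    d = np.sign(np.linalg.det(V @ Wt))
    D = np.diag([1.0, 1.0, d])                 # avoid improper rotation
    R = V @ D @ Wt                             # optimal rotation
    return np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1)))

rng = np.random.default_rng(0)
P = rng.normal(size=(76, 3))                   # e.g. 76 CA atoms of ubiquitin
Q = P + rng.normal(scale=0.5, size=P.shape)    # perturbed copy
print(kabsch_rmsd(P, Q))
```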

  16. Floating wind generators offshore wind farm: Implications for structural loads and control actions

    International Nuclear Information System (INIS)

    Garcia, E.; Morant, F.; Quiles, E.; Correcher, A.

    2009-01-01

    This paper describes the work currently carried out in the design of floating wind generators and their role in the future development of power generation in marine farms at depths exceeding 20 m. We discuss the main issues to be taken into account in the design of floating platforms, including the structural loads they must bear. From a control engineering standpoint, we also discuss strategies to reduce structural loads so that such systems achieve adequate durability and, therefore, economic viability. Finally, we outline modeling tools for floating wind turbines that can be used both in structural design and in the design of appropriate control algorithms.

  17. Molecular design chemical structure generation from the properties of pure organic compounds

    CERN Document Server

    Horvath, AL

    1992-01-01

    This book is a systematic presentation of the methods that have been developed for the application of molecular modeling to the design of new chemicals. The main feature of the compilation is the coordination of the various scientific disciplines required for the generation of new compounds. The five chapters deal with such areas as the structure and properties of organic compounds, relationships between structure and properties, and models for structure generation. The subject is covered in sufficient depth to provide readers with the necessary background to understand the modeling

  18. Structural integrity of power generating speed bumps made of concrete foam composite

    Science.gov (United States)

    Syam, B.; Muttaqin, M.; Hastrino, D.; Sebayang, A.; Basuki, W. S.; Sabri, M.; Abda, S.

    2018-02-01

    In this paper, concrete foam composite speed bumps were designed to generate electrical power by utilizing the movements of commuting vehicles on highways, streets, parking gates, and drive-thru stations of fast food restaurants. The speed bumps were subjected to the loadings generated by vehicles passing over the power-generating mechanical system. Here we mainly focus our discussion on the structural integrity of the speed bumps; the electrical power generation itself is discussed in another paper. One aspect of structural integrity is a structure's ability to support designed loads without breaking, and it includes the study of past structural failures in order to prevent failures in future designs. Concrete foam composites were used for the speed bumps; the reinforcement materials were selected from empty fruit bunches of oil palm. In this study, the speed bump materials and structure were subjected to various tests to obtain their physical and mechanical properties. To analyze the structural stability of the speed bumps, several models were produced and tested in our speed bump test station. We also conducted an FEM-based computer simulation to analyze the stress responses of the speed bump structures. It was found that speed bump type 1 significantly reduced the radial voltage. In addition, the speed bump equipped with a steel casing is also suitable for use as a component in generating electrical energy.

  19. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we classify the query stream with multiple binary classifiers. In addition, a multiclass classifier is developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
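    As a toy version of the binary stage described above, the sketch below trains a random forest on synthetic per-session features. The feature set (query rate, click-through rate, term entropy) is an illustrative choice, not the paper's:

```python
# Sketch: a binary bot-vs-human query-stream classifier on synthetic
# per-session features. Feature set and distributions are made up.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# columns: queries/minute, click-through rate, distinct-term entropy
human = np.column_stack([rng.gamma(2, 1, 500), rng.uniform(0.2, 0.9, 500),
                         rng.normal(3.0, 0.5, 500)])
bots = np.column_stack([rng.gamma(20, 5, 500), rng.uniform(0.0, 0.1, 500),
                        rng.normal(1.0, 0.5, 500)])
X = np.vstack([human, bots])
y = np.array([0] * 500 + [1] * 500)            # 0 = human, 1 = automated

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print(clf.score(Xte, yte))                     # held-out accuracy
```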

  20. Elucidating structural order and disorder phenomena in mullite-type Al4B2O9 by automated electron diffraction tomography

    International Nuclear Information System (INIS)

    Zhao, Haishuang; Krysiak, Yaşar; Hoffmann, Kristin; Barton, Bastian; Molina-Luna, Leopoldo; Neder, Reinhard B.; Kleebe, Hans-Joachim; Gesing, Thorsten M.; Schneider, Hartmut; Fischer, Reinhard X.

    2017-01-01

    The crystal structure and disorder phenomena of Al4B2O9, an aluminum borate from the mullite-type family, were studied using automated diffraction tomography (ADT), a recently established method for collection and analysis of electron diffraction data. Al4B2O9, prepared by sol-gel approach, crystallizes in the monoclinic space group C2/m. The ab initio structure determination based on three-dimensional electron diffraction data from single ordered crystals reveals that edge-connected AlO6 octahedra expanding along the b axis constitute the backbone. The ordered structure (A) was confirmed by TEM and HAADF-STEM images. Furthermore, disordered crystals with diffuse scattering along the b axis are observed. Analysis of the modulation pattern implies a mean superstructure (AAB) with a threefold b axis, where B corresponds to an A layer shifted by ½a and ½c. Diffraction patterns simulated for the AAB sequence including additional stacking disorder are in good agreement with experimental electron diffraction patterns. - Graphical abstract: Crystal structure and disorder phenomena of B-rich Al4B2O9 studied by automated electron diffraction tomography (ADT) and described by diffraction simulation using DISCUS. - Highlights: • Ab-initio structure solution by electron diffraction from single nanocrystals. • Detected modulation corresponding mainly to three-fold superstructure. • Diffuse diffraction streaks caused by stacking faults in disordered crystals. • Observed streaks explained by simulated electron diffraction patterns.