WorldWideScience

Sample records for automated structure generation

  1. Automated quadrilateral mesh generation for digital image structures

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

With the development of advanced imaging technology, digital images are widely used. This paper proposes an automatic quadrilateral mesh generation algorithm for multi-colour imaged structures. It takes an arbitrary digital image as input and performs the full pipeline automatically: removing noise, extracting and smoothing the boundary geometries between different colours, and generating an all-quad mesh with those boundaries as constraints. An application example is...
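The pixel-to-cell step of such a pipeline can be sketched as below. This is a minimal illustration with invented names, not the paper's algorithm: it emits one quadrilateral per pixel with corner nodes shared between neighbouring cells, tags each cell with its colour label, and omits the denoising and boundary-smoothing stages entirely.

```python
def quad_mesh_from_labels(labels):
    """labels: list of rows of colour labels. Returns (nodes, cells)."""
    rows, cols = len(labels), len(labels[0])
    node_id = {}            # (x, y) grid corner -> node index
    nodes = []

    def node(x, y):
        if (x, y) not in node_id:
            node_id[(x, y)] = len(nodes)
            nodes.append((x, y))
        return node_id[(x, y)]

    cells = []
    for y in range(rows):
        for x in range(cols):
            # counter-clockwise corner nodes of the pixel's quad
            quad = (node(x, y), node(x + 1, y),
                    node(x + 1, y + 1), node(x, y + 1))
            cells.append((quad, labels[y][x]))
    return nodes, cells

nodes, cells = quad_mesh_from_labels([[0, 0, 1],
                                      [0, 1, 1]])
print(len(nodes), len(cells))  # 12 grid corners, 6 quads
```

A real implementation would merge pixels into larger quads along the smoothed colour boundaries; sharing corner nodes as above is what makes the result a conforming mesh rather than a pile of disconnected quads.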

  2. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

The complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). The generation of complex parametric models requires new scientific knowledge of new digital technologies. These elements help store a vast quantity of information over the life cycle of buildings (LCB). The latest parametric applications do not provide advanced tools for this task, making the generation of models time-consuming. This paper presents a method capable of processing and creating complex parametric Building Information Models based on Non-Uniform Rational Basis Splines (NURBS) with multiple levels of detail (Mixed and Reverse LoD), built from accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure in the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Castel Masegra in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  3. Automated Narratives and Journalistic Text Generation: The Lead Organization Structure Translated into Code.

    Directory of Open Access Journals (Sweden)

    Márcio Carneiro dos Santos

    2016-07-01

It describes the experiment of building software capable of generating newspaper leads and titles automatically from information obtained from the Internet. This theoretical possibility, already noted by Lage at the end of the last century, rests on the relatively rigid and simple structure of this type of story, which makes it easy to represent, or translate, its syntax as instructions a computer can execute. The paper also discusses the relationship between society, technique, and technology, giving a brief history of the introduction of digital solutions in newsrooms and their impacts. The software was developed with the Python programming language and the NLTK (Natural Language Toolkit) library, using the results of the 2013 Brazilian Soccer Championship published on an internet portal as the data source.
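The rigid lead structure the abstract relies on can be sketched as a template fill over structured match data. This is a hypothetical illustration in the same spirit (team names, field names, and the sentence templates are invented, not taken from the paper):

```python
def make_lead(m):
    """Return a one-sentence lead from a structured match result dict."""
    if m["home_goals"] > m["away_goals"]:
        winner, loser = m["home"], m["away"]
        score = f'{m["home_goals"]}-{m["away_goals"]}'
    elif m["away_goals"] > m["home_goals"]:
        winner, loser = m["away"], m["home"]
        score = f'{m["away_goals"]}-{m["home_goals"]}'
    else:  # draw: a different template applies
        return (f'{m["home"]} and {m["away"]} drew '
                f'{m["home_goals"]}-{m["away_goals"]} on {m["date"]}.')
    return f'{winner} beat {loser} {score} on {m["date"]}.'

print(make_lead({"home": "Cruzeiro", "away": "Vitória",
                 "home_goals": 5, "away_goals": 3, "date": "2013-11-10"}))
# Cruzeiro beat Vitória 5-3 on 2013-11-10.
```

The point the paper makes is precisely this: because the lead's syntax is rigid, the hard work is scraping and structuring the data, not generating the sentence.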

  4. A script for automated 3-dimensional structure generation and conformer search from 2-dimensional chemical drawings.

    Science.gov (United States)

    Ishikawa, Yoshinobu

    2013-01-01

    Building 3-dimensional (3D) molecules is the starting point in molecular modeling. Conformer search and identification of a global energy minimum structure are often performed computationally during spectral analysis of data from NMR, IR, and VCD or during rational drug design through ligand-based, structure-based, and QSAR approaches. I herein report a convenient script that allows for automated building of 3D structures and conformer searching from 2-dimensional (2D) drawing of chemical structures. With this Bash shell script, which runs on Mac OS X and the Linux platform, the tasks are consecutively and iteratively executed without a 3D molecule builder via the command line interface of the free (academic) software OpenBabel, Balloon, and MOPAC2012. A large number of 2D chemical drawing files can be processed simultaneously, and the script functions with stereoisomers. Semi-empirical quantum chemical calculation ensures reliable ranking of the generated conformers on the basis of energy. In addition to an energy-sorted list of file names of the conformers, their Gaussian input files are provided for ab initio and density functional theory calculations to predict rigorous electronic energies, structures, and properties. This script is freely available to all scientists.
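The final ranking step the abstract describes can be sketched in Python (the actual tool is a Bash script driving OpenBabel, Balloon, and MOPAC2012; the energies, file names, and the Gaussian route card below are invented for illustration):

```python
def rank_conformers(energies):
    """energies: {conformer name: heat of formation, kcal/mol}.
    Returns names sorted lowest-energy first, as the script's sorted list."""
    return sorted(energies, key=energies.get)

def gaussian_header(name, route="#P B3LYP/6-31G(d) Opt"):
    """Emit a minimal Gaussian input header for a ranked conformer
    (charge 0, singlet assumed; geometry lines would follow)."""
    return f"%Chk={name}.chk\n{route}\n\n{name}\n\n0 1\n"

order = rank_conformers({"conf_a": -12.1, "conf_b": -15.4, "conf_c": -9.8})
print(order)  # ['conf_b', 'conf_a', 'conf_c']
```

Sorting on the semi-empirical energy is what makes the ranking "reliable" in the abstract's sense: the cheap MOPAC energies order the conformers before the expensive ab initio or DFT refinement is launched.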

  5. Automated Test Case Generation

    CERN Document Server

    CERN. Geneva

    2015-01-01

I would like to present the concept of automated test case generation. I work on it as part of my PhD, and I think it may also be interesting for other people. It is also the topic of a workshop paper that I am presenting in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of automated test case generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems (the oracle problem, combinatorial explosion, ...). Abstract of the paper: Over the last decade, code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise execution branch coverage. Only a few of them consider dependencies from outside the Code Under Test's scope such...

  6. Automated Chat Generator

    Science.gov (United States)

    2012-08-10

The system integrates WordNet with the chat generator. WordNet is a lexical resource that organizes nouns, adjectives, adverbs, and verbs into sets of cognitive synonyms. Sample generated chat output: <DELTA_TAO> Will keep an eye out [01:04] <BRAVO_TAO> Received intel from Sao Paulo. South Korean vessels are operating in this region. [01:11

  7. Automated grid generation from models of complex geologic structure and stratigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Gable, C.; Trease, H.; Cherry, T.

    1996-04-01

The construction of computational grids which accurately reflect complex geologic structure and stratigraphy for flow and transport models poses a formidable task. Even with an understanding of stratigraphy, material properties, and boundary and initial conditions, incorporating this data into a numerical model can be difficult and time consuming. Most GIS tools for representing complex geologic volumes and surfaces are not designed for producing optimal grids for flow and transport computation. We have developed a tool, GEOMESH, for generating finite element grids that maintain the geometric integrity of input volumes, surfaces, and geologic data and produce an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. GEOMESH also satisfies the constraint that the geometric coupling coefficients of the grid are positive for all elements. GEOMESH generates grids for two-dimensional cross sections and three-dimensional regional models, represents faults and fractures, and can embed finer grids representing tunnels and well bores into larger grids. GEOMESH also permits adaptive grid refinement in three dimensions. Tools to glue, merge, and insert grids demonstrate how complex grids can be built from simpler pieces. The resulting grid can be utilized by unstructured finite element or integrated finite difference computational physics codes.
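The Delaunay property that GEOMESH enforces can be illustrated in 2D with the classic empty-circumcircle predicate (a standard textbook test, not GEOMESH code): a triangulation is Delaunay when no point lies strictly inside any triangle's circumcircle.

```python
def in_circumcircle(a, b, c, d):
    """True if point d lies strictly inside the circumcircle of triangle
    (a, b, c), which must be listed counter-clockwise. Computed as the
    sign of the standard 3x3 in-circle determinant, with d translated
    to the origin."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = ((ax*ax + ay*ay) * (bx*cy - cx*by)
         - (bx*bx + by*by) * (ax*cy - cx*ay)
         + (cx*cx + cy*cy) * (ax*by - bx*ay))
    return det > 0

# unit right triangle, CCW; its circumcircle is centred at (0.5, 0.5)
print(in_circumcircle((0, 0), (1, 0), (0, 1), (0.5, 0.5)))  # True
print(in_circumcircle((0, 0), (1, 0), (0, 1), (2, 2)))      # False
```

A mesh generator repairs violations of this test by flipping the shared edge of the offending triangle pair; in 3D the same idea applies with circumspheres of tetrahedra.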

  8. A mathematical basis for automated structured grid generation with close coupling to the flow solver

    Energy Technology Data Exchange (ETDEWEB)

    Barnette, D.W.

    1998-02-01

The first two truncation error terms resulting from finite differencing the convection terms in the two-dimensional Navier-Stokes equations are examined for the purpose of constructing two-dimensional grid generation schemes. These schemes are constructed such that the resulting grid distributions drive the error terms to zero. Two sets of equations result, one for each error term, that show promise in generating grids that provide more accurate flow solutions and possibly faster convergence. One set yields an algebraic scheme that drives the first truncation term to zero; the other, a hyperbolic scheme that drives the second term to zero. Also discussed is the possibility of using the schemes sequentially to construct a grid in an iterative algorithm involving the flow solver. In essence, the process is envisioned to generate not only a flow field solution but the grid as well, rendering the approach a hands-off method for grid generation.

  9. Cassini Tour Atlas Automated Generation

    Science.gov (United States)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. A separate Tour Atlas was required for each trajectory so that Cassini scientists could analyze the science opportunities in each candidate tour quickly and thoroughly enough to select the optimal series of orbits for science return. Manually generating that number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  10. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment built on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML), supporting configuration design and parametric CFD grid generation. This report focuses on the work in the area of parametric CFD grid generation, which uses novel concepts for defining the interaction between the mesh topology and the geometry so as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  11. Generative Representations for Automated Design of Robots

    Science.gov (United States)

Hornby, Gregory S.; Lipson, Hod; Pollack, Jordan B.

    2007-01-01

according to a fitness criterion to yield a figure of merit that is fed back into the evolutionary subprocess of the next iteration. In comparison with prior approaches to automated evolutionary design of robots, the use of generative representations offers two advantages: First, a generative representation enables the reuse of components in regular and hierarchical ways and thereby serves as a systematic means of creating more complex modules out of simpler ones. Second, the evolved generative representation may capture intrinsic properties of the design problem, so that variations in the representations move through the design space more effectively than do equivalent variations in a nongenerative representation. This method has been demonstrated by using it to design robots that move, variously, by walking, rolling, or sliding. Some of the robots were built (see figure). Although these robots are very simple in comparison with robots designed by humans, their structures are more regular, modular, hierarchical, and complex than those of evolved designs of comparable functionality synthesized using nongenerative representations.

  12. Automated Word Puzzle Generation via Topic Dictionaries

    CERN Document Server

    Pinter, Balazs; Szabo, Zoltan; Lorincz, Andras

    2012-01-01

We propose a general method for automated word puzzle generation. Contrary to previous approaches in this novel field, the presented method does not rely on highly structured datasets obtained with serious human annotation effort: it only needs an unstructured and unannotated corpus (i.e., a document collection) as input. The method builds upon two additional pillars: (i) a topic model, which induces a topic dictionary from the input corpus (examples include latent semantic analysis, group-structured dictionaries, and latent Dirichlet allocation), and (ii) a semantic similarity measure on word pairs. Our method can (i) automatically generate a large number of proper word puzzles of different types, including the odd-one-out, choose-the-related-word, and separate-the-topics puzzles; (ii) easily create domain-specific puzzles by replacing the corpus component; and (iii) automatically generate puzzles with parameterizable levels of difficulty suitable for, e.g., beginners or intermedia...
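The odd-one-out puzzle type reduces to the similarity measure in pillar (ii): the answer is the word least similar, on average, to the others. A minimal sketch, with toy 2-dimensional vectors standing in for the learned topic representations (all data invented):

```python
import math

def cosine(u, v):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def odd_one_out(vectors):
    """vectors: {word: vector}. Returns the word with the lowest
    average similarity to all the other words."""
    def avg_sim(w):
        others = [o for o in vectors if o != w]
        return sum(cosine(vectors[w], vectors[o]) for o in others) / len(others)
    return min(vectors, key=avg_sim)

toy = {"cat":  (1.0, 0.1), "dog":  (0.9, 0.2),
       "wolf": (0.8, 0.3), "piano": (0.05, 1.0)}
print(odd_one_out(toy))  # piano
```

Difficulty can then be parameterized by how close the odd word's average similarity is to the rest: near-misses make harder puzzles.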

  13. Automated kinematic generator for surgical robotic systems.

    Science.gov (United States)

    Jung, David L; Dixon, Warren E; Pin, François G

    2004-01-01

Unlike traditional assembly-line robotic systems that have a fixed kinematic structure associated with a single tool for a structured task, next-generation robotic surgical assist systems will be required to use an array of end-effector tools. Once a robot is connected with a tool, the kinematic equations of motion are altered. Given the need to accommodate evolving surgical challenges and to alleviate the restrictions imposed by the confined minimally invasive environment, new surgical tools may resemble small flexible snakes rather than rigid, cable-driven instruments. Connecting to these developing articulated tools will significantly alter the overall kinematic structure of a robotic system. In this paper we present a technique for real-time automated generation and evaluation of manipulator kinematic equations that exhibits the combined advantages of existing methods (speed, and flexibility in the face of kinematic change) without their disadvantages.
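The reason attaching a tool "alters the kinematic equations" is that the forward kinematics is a product of per-link transforms, so a new tool simply appends factors to the chain. A planar sketch of that composition (a generic textbook formulation with invented link lengths, not the paper's method):

```python
import math

def link(theta, length):
    """Homogeneous 3x3 transform for one planar link: rotate by the
    joint angle theta, then translate by the link length along x."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, length * c],
            [s,  c, length * s],
            [0,  0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def end_effector(links):
    """links: list of (joint angle, link length). Returns the tip (x, y).
    Attaching a different tool just means appending its links."""
    T = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    for theta, length in links:
        T = matmul(T, link(theta, length))
    return T[0][2], T[1][2]

# two-link arm, both joints at 90 degrees: tip ends up at (-1, 1)
x, y = end_effector([(math.pi / 2, 1.0), (math.pi / 2, 1.0)])
print(round(x, 6), round(y, 6))  # -1.0 1.0
```

An articulated snake-like tool is the same chain with many short links, which is why regenerating the symbolic equations automatically, rather than by hand, matters.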

  14. Automated cognome construction and semi-automated hypothesis generation.

    Science.gov (United States)

    Voytek, Jessica B; Voytek, Bradley

    2012-06-30

    Modern neuroscientific research stands on the shoulders of countless giants. PubMed alone contains more than 21 million peer-reviewed articles with 40-50,000 more published every month. Understanding the human brain, cognition, and disease will require integrating facts from dozens of scientific fields spread amongst millions of studies locked away in static documents, making any such integration daunting, at best. The future of scientific progress will be aided by bridging the gap between the millions of published research articles and modern databases such as the Allen brain atlas (ABA). To that end, we have analyzed the text of over 3.5 million scientific abstracts to find associations between neuroscientific concepts. From the literature alone, we show that we can blindly and algorithmically extract a "cognome": relationships between brain structure, function, and disease. We demonstrate the potential of data-mining and cross-platform data-integration with the ABA by introducing two methods for semi-automated hypothesis generation. By analyzing statistical "holes" and discrepancies in the literature we can find understudied or overlooked research paths. That is, we have added a layer of semi-automation to a part of the scientific process itself. This is an important step toward fundamentally incorporating data-mining algorithms into the scientific method in a manner that is generalizable to any scientific or medical field.
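The association-extraction step behind the "cognome" can be sketched as term co-occurrence counting over abstracts. The vocabulary and abstracts below are invented for illustration; the paper's actual pipeline over 3.5 million abstracts is far richer:

```python
from itertools import combinations
from collections import Counter

# a tiny controlled vocabulary of neuroscientific terms (illustrative only)
TERMS = {"hippocampus", "memory", "dopamine", "reward"}

def cooccurrence(abstracts):
    """Count, over all abstracts, how often each pair of vocabulary
    terms appears together in the same abstract."""
    counts = Counter()
    for text in abstracts:
        present = sorted(t for t in TERMS if t in text.lower())
        counts.update(combinations(present, 2))
    return counts

docs = ["The hippocampus supports memory consolidation.",
        "Dopamine neurons encode reward prediction errors.",
        "Reward modulates hippocampus-dependent memory."]
c = cooccurrence(docs)
print(c[("hippocampus", "memory")])  # 2
```

The "holes" the abstract mentions are then pairs whose observed count is far below what their individual frequencies predict: candidate understudied links.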

  15. Automated Test Requirement Document Generation

    Science.gov (United States)

    1987-11-01

The report cites "Diagnostics Based on the Principles of Artificial Intelligence", 1984 International Test Conference, and includes a glossary of acronyms (AFSATCOM: Air Force Satellite Communication; AI: Artificial Intelligence; ASIC: Application-Specific Integrated Circuit). It notes that Built-In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be

  16. Semi-automated ontology generation and evolution

    Science.gov (United States)

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

Extending the notion of data models or object models, an ontology can provide rich semantic definitions not only for the meta-data but also for the instance data of domain knowledge, making these semantic definitions available in machine-readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, requesting user input only when unresolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self-learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical-to-ontological mappings, in the form of WordNet and the Suggested Upper Merged Ontology (SUMO), integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus: it reads in a document (or corpus) and prepares it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the part-of-speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending the WordNet lexicon and the SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user-assisted semantic-based ontology generation. This paper will describe the OGEP technology in the context of the architectural

  17. Automating defence generation for risk assessment

    NARCIS (Netherlands)

    Gadyatskaya, Olga

    2016-01-01

    Efficient risk assessment requires automation of its most tedious tasks: identification of vulnerabilities, attacks that can exploit these vulnerabilities, and countermeasures that can mitigate the attacks. E.g., the attack tree generation by policy invalidation approach looks at systematic automati

  18. Automated Liquibase Generator and Validator (ALGV)

    Directory of Open Access Journals (Sweden)

    Manik Jain

    2015-08-01

This paper presents an automation tool, ALGV (Automated Liquibase Generator and Validator), for the automated generation and verification of Liquibase scripts. Liquibase is one of the most efficient ways of applying and persisting changes to a database schema. Since its invention by Nathan Voxland it has become the de facto standard for database change management. The advantages of using Liquibase scripts over traditional SQL queries range from version control to reusing the same scripts over multiple database platforms. Irrespective of these advantages, manual creation of Liquibase scripts takes a lot of effort and is sometimes error-prone. ALGV helps to reduce the time-consuming Liquibase script generation, the manual typing effort, possible error occurrence, and the manual verification process and time by 75%. Automating the Liquibase generation process also removes the burden of recollecting the specific tags to be used for a particular change. Moreover, developers can concentrate on the business logic and business data rather than wasting their efforts writing files.
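The generation idea can be sketched as emitting a Liquibase `createTable` changeSet from a simple column specification. The `changeSet`, `createTable`, and `column` tags follow the standard Liquibase XML changelog schema; the input format and function name here are invented, and real ALGV presumably covers many more change types:

```python
def create_table_changeset(author, cid, table, columns):
    """columns: list of (name, type) pairs. Returns a Liquibase
    changeSet fragment for inclusion in a databaseChangeLog."""
    cols = "\n".join(
        f'        <column name="{name}" type="{ctype}"/>'
        for name, ctype in columns)
    return (f'<changeSet id="{cid}" author="{author}">\n'
            f'    <createTable tableName="{table}">\n'
            f'{cols}\n'
            f'    </createTable>\n'
            f'</changeSet>')

xml = create_table_changeset("alice", "1", "users",
                             [("id", "bigint"), ("email", "varchar(255)")])
print(xml)
```

A validator in the same spirit would parse the emitted XML and check tag nesting and required attributes before the changelog is ever run against a database.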

  19. Automated generation of lattice QCD Feynman rules

    Energy Technology Data Exchange (ETDEWEB)

    Hart, A.; Mueller, E.H. [Edinburgh Univ. (United Kingdom). SUPA School of Physics and Astronomy; von Hippel, G.M. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Horgan, R.R. [Cambridge Univ. (United Kingdom). DAMTP, CMS

    2009-04-15

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  20. A linear programming approach to reconstructing subcellular structures from confocal images for automated generation of representative 3D cellular models.

    Science.gov (United States)

    Wood, Scott T; Dean, Brian C; Dean, Delphine

    2013-04-01

    This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery.
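The core idea of representing the network as a weighted superposition of candidate fibers can be illustrated with a toy greedy variant (the paper itself solves this with linear programming; the flattened "image", candidate fibers, and greedy strategy below are invented for illustration):

```python
def discrepancy(target, model):
    """Sum of squared differences between target and model intensities."""
    return sum((t - m) ** 2 for t, m in zip(target, model))

def greedy_superposition(target, fibers, weight=1.0):
    """Repeatedly add any candidate fiber whose weighted contribution
    reduces the squared discrepancy, until none does."""
    model = [0.0] * len(target)
    chosen = []
    improved = True
    while improved:
        improved = False
        for i, fib in enumerate(fibers):
            trial = [m + weight * f for m, f in zip(model, fib)]
            if discrepancy(target, trial) < discrepancy(target, model):
                model, improved = trial, True
                chosen.append(i)
    return chosen, model

# flattened 2x3 "image": the two row fibers superpose to the target exactly
target = [1, 1, 1,
          1, 1, 1]
fibers = [[1, 1, 1, 0, 0, 0],   # top-row fiber
          [0, 0, 0, 1, 1, 1]]   # bottom-row fiber
chosen, model = greedy_superposition(target, fibers)
print(sorted(set(chosen)), model == [1, 1, 1, 1, 1, 1])  # [0, 1] True
```

The linear-programming formulation replaces this greedy loop with a global optimum over fiber weights, which is what makes the reconstructed network "representative" rather than merely plausible.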

  1. Automated branching pattern report generation for laparoscopic surgery assistance

    Science.gov (United States)

    Oda, Masahiro; Matsuzaki, Tetsuro; Hayashi, Yuichiro; Kitasaka, Takayuki; Misawa, Kazunari; Mori, Kensaku

    2015-05-01

This paper presents a method for generating branching pattern reports of abdominal blood vessels for laparoscopic gastrectomy. In gastrectomy, it is very important to understand the branching structure of the abdominal arteries and veins, which feed and drain specific abdominal organs including the stomach, the liver, and the pancreas. In current clinical practice, a surgeon creates a diagnostic report of the patient's anatomy. This report summarizes the branching patterns of the blood vessels related to the stomach, and the surgeon then decides on the actual operative procedure. This paper shows an automated method to generate a branching pattern report for abdominal blood vessels based on automated anatomical labeling. The report contains 3D renderings showing important blood vessels and descriptions of the branching patterns of each vessel. We have applied this method to fifty cases of 3D abdominal CT scans and confirmed that the proposed method can automatically generate branching pattern reports of abdominal arteries.
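Once the vessels are anatomically labelled, the textual part of such a report is a walk over the labelled branching tree. A minimal sketch (the tree encoding and formatting are invented; the vessel names are standard anatomy, not the paper's data):

```python
def branching_report(tree, root, depth=0):
    """tree: {vessel label: list of child branch labels}.
    Returns indented report lines, one vessel per line."""
    lines = ["  " * depth + root]
    for child in tree.get(root, []):
        lines += branching_report(tree, child, depth + 1)
    return lines

arteries = {"celiac artery": ["splenic artery", "common hepatic artery"],
            "common hepatic artery": ["gastroduodenal artery"]}
report = branching_report(arteries, "celiac artery")
print("\n".join(report))
```

The per-patient value of the automated report is exactly the variation this walk exposes: which branches are present, and from which parent each one arises.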

  2. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3D laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent on the interpretation and transformation of the resulting Point Cloud data into information that can be used in architectural and engineering design workflows. Our approach to this problem is, in contrast to existing ones which work on the level of points, based on the detection of building elements... design in BIM and simulations with the built environment.

  3. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin

    2016-01-01

In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3D laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent on the interpretation and transformation of the resulting Point Cloud data into information that can be used in architectural and engineering design workflows. Our approach to this problem is, in contrast to existing ones which work on the level of points, based on the detection of building elements such as walls, ceilings, doors, windows, and spaces, and the relations between these. We present use cases with our software prototype, evaluate the results, and discuss future work that will bring the research further towards the aim of automatically creating semantic links between the conception of building...

  4. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds

    Directory of Open Access Journals (Sweden)

    Christopher Watson

    2012-05-01

Unmanned Aerial Vehicles (UAVs) are an exciting new remote sensing tool capable of acquiring high resolution spatial data. Remote sensing with UAVs has the potential to provide imagery at an unprecedented spatial and temporal resolution. The small footprint of UAV imagery, however, makes it necessary to develop automated techniques to geometrically rectify and mosaic the imagery so that larger areas can be monitored. In this paper, we present a technique for geometric correction and mosaicking of UAV photography using feature matching and Structure from Motion (SfM) photogrammetric techniques. Images are processed to create three-dimensional point clouds, initially in an arbitrary model space. The point clouds are transformed into a real-world coordinate system using either a direct georeferencing technique based on estimated camera positions or a Ground Control Point (GCP) technique that uses automatically identified GCPs within the point cloud. The point cloud is then used to generate a Digital Terrain Model (DTM) required for rectification of the images. The georeferenced images are then joined together to form a mosaic of the study area. The absolute spatial accuracy of the direct technique was found to be 65–120 cm, whilst the GCP technique achieves an accuracy of approximately 10–15 cm.
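The model-space-to-world transformation at the heart of both georeferencing routes can be sketched in its simplest form: a scale plus translation fitted from two known points (rotation is omitted here for brevity; a full solution is a 7-parameter similarity transform, and all coordinates below are invented):

```python
import math

def fit_scale_translation(model_pts, world_pts):
    """Fit s and t such that world ~= s * model + t, from two
    corresponding point pairs (assumes no rotation between frames)."""
    (m1, m2), (w1, w2) = model_pts, world_pts
    scale = math.dist(w1, w2) / math.dist(m1, m2)
    t = (w1[0] - scale * m1[0], w1[1] - scale * m1[1])
    return scale, t

def to_world(pt, scale, t):
    """Apply the fitted transform to one model-space point."""
    return (scale * pt[0] + t[0], scale * pt[1] + t[1])

scale, t = fit_scale_translation([(0, 0), (1, 0)], [(100, 200), (110, 200)])
print(scale, to_world((0.5, 0.5), scale, t))  # 10.0 (105.0, 205.0)
```

In the direct route the correspondences come from estimated camera positions; in the GCP route they come from control points identified in the point cloud, which is why the GCP route reaches the higher accuracy reported above.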

  5. The automation of natural product structure elucidation.

    Science.gov (United States)

    Steinbeck, C

    2001-05-01

    The last two or three years have seen exciting developments in the field of computer-assisted structure elucidation (CASE) with a number of programs becoming commercially or freely available. This was the conditio sine qua non for CASE to be widely applied in the daily work of bench chemists and spectroscopists. A number of promising applications have been published in the area of structure generators, deterministic and stochastic CASE tools and property predictions, including the automatic distinction between natural products and artificial compounds, as well as the determination of 3-D structure from a connection table based on IR spectroscopy. Advancements in coupling techniques between chromatographic and spectroscopic methods demonstrate progress towards a fully automated structure elucidation or identification process starting at the earliest steps of obtaining crude extracts.

  6. Integrated, Automated Distributed Generation Technologies Demonstration

    Energy Technology Data Exchange (ETDEWEB)

Jensen, Kevin [Atk Launch Systems Inc., Brigham City, UT (United States)]

    2014-09-30

The purpose of the NETL project was to develop a diverse combination of distributed renewable generation technologies and controls and to demonstrate how the renewable generation could help manage substation peak demand at the ATK Promontory plant site. The Promontory plant site is located in the northwestern Utah desert, approximately 25 miles west of Brigham City, Utah. The plant encompasses 20,000 acres and has over 500 buildings. The ATK Promontory plant primarily manufactures solid propellant rocket motors for both commercial and government launch systems. The original project objectives focused on distributed generation: a 100 kW (kilowatt) wind turbine, a 100 kW new-technology waste heat generation unit, a 500 kW energy storage system, and an intelligent system-wide automation system to monitor and control the renewable energy devices and release the stored energy during the peak demand period. The original goal was to reduce peak demand from the electrical utility company, Rocky Mountain Power (RMP), by 3.4%. For a period of time we also sought to integrate our energy storage requirements with a flywheel storage system (500 kW) proposed for the Promontory/RMP substation. Ultimately the flywheel storage system could not meet our project timetable, so the storage requirement was switched to a battery storage system (300 kW). A secondary objective was to design and install a bi-directional customer/utility gateway application for real-time visibility and communications between RMP and ATK. This objective was not achieved because of technical issues with RMP, the ATK Information Technology Department's stringent requirements as a rocket motor manufacturing facility, and budget constraints. Of the original objectives, the following were achieved:
• Installation of a 100 kW wind turbine.
• Installation of a 300 kW battery storage system.
• An integrated control system installed to offset electrical demand by releasing stored energy from renewable sources
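The peak-shaving logic such a control system implements can be sketched as a toy dispatch rule: discharge stored energy whenever plant demand exceeds a threshold, limited by the battery's power rating and remaining energy. All numbers and the threshold policy below are invented for illustration; this is not the ATK control system:

```python
def shave_peaks(demand_kw, threshold_kw, batt_kw=300, batt_kwh=600):
    """demand_kw: hourly plant demand. Returns the hourly demand seen
    by the utility after battery discharge (1-hour timesteps assumed)."""
    energy = batt_kwh                       # remaining stored energy, kWh
    shaved = []
    for d in demand_kw:
        # discharge the excess over threshold, capped by power and energy
        discharge = min(max(d - threshold_kw, 0), batt_kw, energy)
        energy -= discharge
        shaved.append(d - discharge)
    return shaved

print(shave_peaks([900, 1200, 1400, 1000], threshold_kw=1100))
# [900, 1100, 1100, 1000]
```

Note the 1400 kW hour is only clipped to 1100 because the excess (300 kW) happens to equal the battery's power rating; a larger spike would exceed the cap, which is why the project's stated goal was a modest 3.4% peak reduction.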

  7. Integrated, Automated Distributed Generation Technologies Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Kevin [Atk Launch Systems Inc., Brigham City, UT (United States)

    2014-09-01

    The purpose of the NETL Project was to develop a diverse combination of distributed renewable generation technologies and controls and demonstrate how the renewable generation could help manage substation peak demand at the ATK Promontory plant site. The Promontory plant site is located in the northwestern Utah desert approximately 25 miles west of Brigham City, Utah. The plant encompasses 20,000 acres and has over 500 buildings. The ATK Promontory plant primarily manufactures solid propellant rocket motors for both commercial and government launch systems. The original project objectives focused on distributed generation: a 100 kW (kilowatt) wind turbine, a 100 kW new-technology waste heat generation unit, a 500 kW energy storage system, and an intelligent system-wide automation system to monitor and control the renewable energy devices and then release the stored energy during the peak demand period. The original goal was to reduce peak demand from the electrical utility company, Rocky Mountain Power (RMP), by 3.4%. For a period of time we also sought to integrate our energy storage requirements with a flywheel storage system (500 kW) proposed for the Promontory/RMP Substation. Ultimately the flywheel storage system could not meet our project timetable, so the storage requirement was switched to a battery storage system (300 kW). A secondary objective was to design and install a bi-directional customer/utility gateway application for real-time visibility and communications between RMP and ATK. This objective was not achieved because of technical issues with RMP, the ATK Information Technology Department’s stringent security requirements as a rocket motor manufacturing facility, and budget constraints. Of the original objectives, the following were achieved: • Installation of a 100 kW wind turbine. • Installation of a 300 kW battery storage system. • Installation of an integrated control system to offset electrical demand by releasing stored energy from renewable sources

  8. Automation Framework for Flight Dynamics Products Generation

    Science.gov (United States)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

    XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, which can be edited either directly or through the GUI.

  9. Loft: An Automated Mesh Generator for Stiffened Shell Aerospace Vehicles

    Science.gov (United States)

    Eldred, Lloyd B.

    2011-01-01

    Loft is an automated mesh generation code that is designed for aerospace vehicle structures. From user input, Loft generates meshes for wings, noses, tanks, fuselage sections, thrust structures, and so on. As a mesh is generated, each element is assigned properties to mark the part of the vehicle with which it is associated. This property assignment is an extremely powerful feature that enables detailed analysis tasks, such as load application and structural sizing. This report is presented in two parts. The first part is an overview of the code and its applications. The modeling approach that was used to create the finite element meshes is described. Several applications of the code are demonstrated, including a Next Generation Launch Technology (NGLT) wing-sizing study, a lunar lander stage study, a launch vehicle shroud shape study, and a two-stage-to-orbit (TSTO) orbiter. Part two of the report is the program user manual. The manual includes in-depth tutorials and a complete command reference.
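    The property-tagging idea described above can be sketched in a few lines. This is not Loft itself; the function and part names below are hypothetical, and the example only illustrates how each generated element can carry a label identifying the vehicle part it belongs to.

    ```python
    # Sketch (not Loft): generate a structured quadrilateral mesh for a
    # rectangular panel and tag each element with a part label, so that a
    # later analysis step can apply loads or sizing rules per part.

    def quad_mesh(nx, ny, width, height, part="skin"):
        """Return (nodes, elements) for an nx-by-ny structured quad mesh.

        nodes: list of (x, y) coordinates, row by row.
        elements: list of dicts holding corner node indices and a 'part' tag.
        """
        nodes = [(i * width / nx, j * height / ny)
                 for j in range(ny + 1) for i in range(nx + 1)]
        elements = []
        for j in range(ny):
            for i in range(nx):
                n0 = j * (nx + 1) + i  # lower-left corner of this quad
                quad = (n0, n0 + 1, n0 + nx + 2, n0 + nx + 1)  # counterclockwise
                elements.append({"nodes": quad, "part": part})
        return nodes, elements

    nodes, elems = quad_mesh(4, 2, 2.0, 1.0, part="fuselage_skin")
    print(len(nodes), len(elems), elems[0]["part"])
    ```

    A real mesher would vary the part tag per region (wing, tank, thrust structure); the tag is what makes downstream load application and sizing possible.
    
    
    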

  10. Automating the Generation of Heterogeneous Aviation Safety Cases

    Science.gov (United States)

    Denney, Ewen W.; Pai, Ganesh J.; Pohl, Josef M.

    2012-01-01

    A safety case is a structured argument, supported by a body of evidence, which provides a convincing and valid justification that a system is acceptably safe for a given application in a given operating environment. This report describes the development of a fragment of a preliminary safety case for the Swift Unmanned Aircraft System. The construction of the safety case fragment consists of two parts: a manually constructed system-level case, and an automatically constructed lower-level case, generated from formal proof of safety-relevant correctness properties. We provide a detailed discussion of the safety considerations for the target system, emphasizing the heterogeneity of sources of safety-relevant information, and use a hazard analysis to derive safety requirements, including formal requirements. We evaluate the safety case using three classes of metrics for measuring degrees of coverage, automation, and understandability. We then present our preliminary conclusions and make suggestions for future work.

  11. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Tomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  12. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good-quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  13. Automated Test Case Generation for an Autopilot Requirement Prototype

    Science.gov (United States)

    Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael

    2011-01-01

    Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution, which allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component of the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.
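    The input-plus-oracle idea can be illustrated with a brute-force stand-in for symbolic execution (a toy decision function is used here, not the ADEPT autopilot model; all names are invented): one input is found per distinct behavior, and the designed output is recorded as the oracle against which the implementation is later checked.

    ```python
    # Toy stand-in for symbolic-execution-based test generation: enumerate
    # inputs, keep one representative per distinct outcome, and pair it
    # with the design-level result as the test oracle.

    def mode_select(altitude, capture_armed):
        # Hypothetical three-path decision logic.
        if altitude < 0:
            return "invalid"
        if capture_armed and altitude < 100:
            return "capture"
        return "hold"

    def generate_tests():
        """Return one (input, oracle) pair per distinct observed outcome."""
        seen = {}
        for altitude in range(-50, 200, 10):
            for armed in (False, True):
                result = mode_select(altitude, armed)
                seen.setdefault(result, ((altitude, armed), result))
        return list(seen.values())

    tests = generate_tests()
    for inp, oracle in tests:
        assert mode_select(*inp) == oracle  # implementation matches oracle
    print(sorted(oracle for _, oracle in tests))
    ```

    True symbolic execution would derive these inputs from path constraints rather than enumeration, covering each feasible path exactly once.
    
    
    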

  14. Cerebellum engages in automation of verb-generation skill.

    Science.gov (United States)

    Yang, Zhi; Wu, Paula; Weng, Xuchu; Bandettini, Peter A

    2014-03-01

    Numerous studies have shown cerebellar involvement in item-specific association, a form of explicit learning. However, very few have demonstrated cerebellar participation in automation of non-motor cognitive tasks. Applying fMRI to a repeated verb-generation task, we sought to distinguish cerebellar involvement in learning of item-specific noun-verb association and automation of verb generation skill. The same set of nouns was repeated in six verb-generation blocks so that subjects practiced generating verbs for the nouns. The practice was followed by a novel block with a different set of nouns. The cerebellar vermis (IV/V) and the right cerebellar lobule VI showed decreased activation following practice; activation in the right cerebellar Crus I was significantly lower in the novel challenge than in the initial verb-generation task. Furthermore, activation in this region during well-practiced blocks strongly correlated with improvement of behavioral performance in both the well-practiced and the novel blocks, suggesting its role in the learning of general mental skills not specific to the practiced noun-verb pairs. Therefore, the cerebellum processes both explicit verbal associative learning and automation of cognitive tasks. Different cerebellar regions predominate in this processing: lobule VI during the acquisition of item-specific association, and Crus I during automation of verb-generation skills through practice.

  15. Automated Concurrent Blackboard System Generation in C++

    Science.gov (United States)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
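    The control structure of a blackboard system can be sketched in Python (the thesis tools generated C++/PVM code; this toy serial scheduler only illustrates how knowledge sources contribute opportunistically, and all names are invented):

    ```python
    # Minimal blackboard pattern: knowledge sources watch shared state and
    # fire when their trigger data is present; a scheduler iterates until
    # no source can contribute. The generated systems ran sources
    # concurrently over PVM; this sketch is serial.

    class Blackboard:
        def __init__(self):
            self.data = {}

    def ks_split(bb):   # knowledge source: tokenize raw text
        if "raw" in bb.data and "words" not in bb.data:
            bb.data["words"] = bb.data["raw"].split()
            return True
        return False

    def ks_count(bb):   # knowledge source: count tokens
        if "words" in bb.data and "count" not in bb.data:
            bb.data["count"] = len(bb.data["words"])
            return True
        return False

    def run(bb, sources):
        """Fire knowledge sources until a full pass makes no progress."""
        progress = True
        while progress:
            progress = any(ks(bb) for ks in sources)

    bb = Blackboard()
    bb.data["raw"] = "concurrent blackboard systems"
    run(bb, [ks_count, ks_split])  # registration order does not matter
    print(bb.data["count"])
    ```

    Only the bodies of `ks_split` and `ks_count` correspond to the user-supplied knowledge-source code; everything else is the kind of scaffolding the code generator emits.
    
    
    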

  16. Automated Layout Generation of Analogue and Mixed-Signal ASIC's

    DEFF Research Database (Denmark)

    Bloch, Rene

    The research and development carried out in this Ph.D. study focusses on two key areas of the design flow for analogue and mixed-signal integrated circuit design: the mixed-signal floorplanning and the analogue layout generation. A novel approach to floorplanning is presented which provides true interactive floorplanning capabilities due to a new implementation variant of a Genetic Algorithm. True interactive floorplanning allows the designer to communicate with existing floorplans during optimization. By entering the "ideas" and expertise of the designer into the optimization algorithm the automated ... flow. A new design flow for automated layout generation of general analogue integrated circuits is presented. The design flow provides an automated design path from a sized circuit schematic to the final layout containing the placed, but unrouted, devices of the circuit. The analogue circuit layout...

  17. Automated mass spectrum generation for new physics

    CERN Document Server

    Alloul, Adam; De Causmaecker, Karen; Fuks, Benjamin; Rausch de Traubenberg, Michel

    2013-01-01

    We describe an extension of the FeynRules package dedicated to the automatic generation of the mass spectrum associated with any Lagrangian-based quantum field theory. After introducing a simplified way to implement particle mixings, we present a new class of FeynRules functions allowing both for the analytical computation of all the model mass matrices and for the generation of a C++ package, dubbed ASperGe. This program can then be further employed for a numerical evaluation of the rotation matrices necessary to diagonalize the field basis. We illustrate these features in the context of the Two-Higgs-Doublet Model, the Minimal Left-Right Symmetric Standard Model and the Minimal Supersymmetric Standard Model.
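    For a 2x2 real symmetric mass matrix, the diagonalization that ASperGe performs numerically has a closed form; the sketch below is illustrative only (not FeynRules/ASperGe code) and shows the rotation angle and mass eigenvalues for M = [[a, b], [b, c]].

    ```python
    import math

    # Closed-form diagonalization of a 2x2 real symmetric mass matrix,
    # the kind of operation a spectrum generator performs numerically for
    # larger matrices: find the rotation that removes the off-diagonal term.

    def diagonalize_2x2(a, b, c):
        """Return (m1, m2, theta) with R(theta)^T M R(theta) = diag(m1, m2)."""
        theta = 0.5 * math.atan2(2.0 * b, a - c)
        avg = (a + c) / 2.0
        rad = math.hypot((a - c) / 2.0, b)
        return avg - rad, avg + rad, theta

    def rotated_offdiag(a, b, c, theta):
        """Off-diagonal entry of R(theta)^T M R(theta); should vanish."""
        ct, st = math.cos(theta), math.sin(theta)
        return (c - a) * ct * st + b * (ct * ct - st * st)

    m1, m2, theta = diagonalize_2x2(2.0, 1.0, 2.0)
    print(round(m1, 6), round(m2, 6),
          round(rotated_offdiag(2.0, 1.0, 2.0, theta), 12))
    ```

    The eigenvalues give the physical masses and `theta` parameterizes the rotation matrix relating the gauge and mass bases.
    
    
    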

  18. Automated Environment Generation for Software Model Checking

    Science.gov (United States)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  19. Automated Scenario Generation and Interaction Techniques

    Science.gov (United States)

    1981-06-01

    function. These routines are written in FORTRAN IV and utilize an overlay structure to minimize core requirements. Total resource in core is a result of... Munition Effectiveness Manual (JMEM). 6.8 COST, SCHEDULE, AND MANPOWER REQUIREMENTS: TALON Accounting estimates in the TALON gaming model are divided

  20. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    Energy Technology Data Exchange (ETDEWEB)

    Voet, Peter W.J., E-mail: p.voet@erasmusmc.nl [Department of Radiation Oncology, Erasmus Medical Center–Daniel den Hoed Cancer Center, Groene Hilledijk 301, Rotterdam 3075EA (Netherlands); Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M. [Department of Radiation Oncology, Erasmus Medical Center–Daniel den Hoed Cancer Center, Groene Hilledijk 301, Rotterdam 3075EA (Netherlands)

    2013-03-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine-tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The two plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing for the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  1. Automated cardiac sarcomere analysis from second harmonic generation images

    Science.gov (United States)

    Garcia-Canadilla, Patricia; Gonzalez-Tendero, Anna; Iruretagoyena, Igor; Crispi, Fatima; Torre, Iratxe; Amat-Roldan, Ivan; Bijnens, Bart H.; Gratacos, Eduard

    2014-05-01

    Automatic quantification of cardiac muscle properties in tissue sections might provide important information related to different types of diseases. Second harmonic generation (SHG) imaging provides a stain-free microscopy approach to image cardiac fibers that, combined with our methodology for automated measurement of the ultrastructure of muscle fibers, computes a reliable set of quantitative image features (sarcomere length, A-band length, thick-thin interaction length, and fiber orientation). We first evaluated the performance of our methodology on computer-generated muscle fibers that model artifacts present during image acquisition. We then evaluated it by comparing its output with manual measurements in SHG images from cardiac tissue of fetal and adult rabbits. The results showed good performance of our methodology at a high signal-to-noise ratio of 20 dB. We conclude that our automated measurements enable reliable characterization of cardiac fiber tissues and allow cardiac tissue to be studied systematically in a wide range of conditions.
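    Periodicity measurement of this kind can be illustrated with a simple autocorrelation estimate on a synthetic 1D intensity profile. This is a toy stand-in, not the paper's method; the pixel scale and profile are invented.

    ```python
    import math

    # Toy sarcomere-length estimate: find the strongest non-trivial lag of
    # the autocorrelation of an intensity profile sampled along a fiber.

    def estimate_period(profile):
        """Return the lag (in samples) with maximal autocorrelation."""
        n = len(profile)
        mean = sum(profile) / n
        x = [v - mean for v in profile]

        def ac(lag):
            return sum(x[i] * x[i + lag] for i in range(n - lag))

        # Skip lags 0 and 1 (the trivial self-similarity peak).
        return max(range(2, n // 2), key=ac)

    # Synthetic profile with a 20-pixel period (e.g. ~2 um sarcomeres at an
    # assumed 0.1 um/pixel sampling).
    profile = [math.sin(2 * math.pi * i / 20) for i in range(200)]
    print(estimate_period(profile))
    ```

    On real SHG data one would first extract the profile along the measured fiber orientation and convert the lag to micrometers using the pixel size.
    
    
    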

  2. Carbohydrate structure: the rocky road to automation.

    Science.gov (United States)

    Agirre, Jon; Davies, Gideon J; Wilson, Keith S; Cowtan, Kevin D

    2016-12-08

    With the introduction of intuitive graphical software, structural biologists who are not experts in crystallography are now able to build complete protein or nucleic acid models rapidly. In contrast, carbohydrates are in a wholly different situation: scant automation exists, with manual building attempts being sometimes toppled by incorrect dictionaries or refinement problems. Sugars are the most stereochemically complex family of biomolecules and, as pyranose rings, have clear conformational preferences. Despite this, all refinement programs may produce high-energy conformations at medium to low resolution, without any support from the electron density. This problem renders the affected structures unusable in glyco-chemical terms. Bringing structural glycobiology up to 'protein standards' will require a total overhaul of the methodology. Time is of the essence, as the community is steadily increasing the production rate of glycoproteins, and electron cryo-microscopy has just started to image them in precisely that resolution range where crystallographic methods falter most.

  3. A Recommendation Algorithm for Automating Corollary Order Generation

    Science.gov (United States)

    Klann, Jeffrey; Schadow, Gunther; McCoy, JM

    2009-01-01

    Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards. PMID:20351875
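    The item-based collaborative filtering step can be sketched on toy order data. The item names below are invented, and the association-rule interestingness augmentation the paper uses is omitted; similarity here is plain cosine over item co-occurrence counts.

    ```python
    from collections import defaultdict
    from itertools import combinations
    import math

    # Toy item-based collaborative filtering over order history: each order
    # is a set of items; items that frequently co-occur become suggestions.

    orders = [
        {"heparin", "ptt_lab"},
        {"heparin", "ptt_lab", "cbc"},
        {"warfarin", "inr_lab"},
        {"heparin", "cbc"},
    ]

    count = defaultdict(int)   # item -> number of orders containing it
    co = defaultdict(int)      # frozenset({a, b}) -> co-occurrence count
    for order in orders:
        for item in order:
            count[item] += 1
        for a, b in combinations(sorted(order), 2):
            co[frozenset((a, b))] += 1

    def similarity(a, b):
        """Cosine similarity of two items over the order history."""
        return co[frozenset((a, b))] / math.sqrt(count[a] * count[b])

    def suggest(item, k=2):
        """Top-k corollary suggestions for a given order item."""
        others = [x for x in count if x != item]
        return sorted(others, key=lambda x: similarity(item, x), reverse=True)[:k]

    print(sorted(suggest("heparin")))
    ```

    In the study, candidate pairs mined this way were further filtered by interestingness measures before expert review.
    
    
    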

  4. Automated Generation of User Guidance by Combining Computation and Deduction

    CERN Document Server

    Neuper, Walther

    2012-01-01

    Herewith, a fairly old concept is published for the first time and named "Lucas Interpretation". This has been implemented in a prototype, which has been proved useful in educational practice and has gained academic relevance with an emerging generation of educational mathematics assistants (EMA) based on Computer Theorem Proving (CTP). Automated Theorem Proving (ATP), i.e. deduction, is the most reliable technology used to check user input. However ATP is inherently weak in automatically generating solutions for arbitrary problems in applied mathematics. This weakness is crucial for EMAs: when ATP checks user input as incorrect and the learner gets stuck then the system should be able to suggest possible next steps. The key idea of Lucas Interpretation is to compute the steps of a calculation following a program written in a novel CTP-based programming language, i.e. computation provides the next steps. User guidance is generated by combining deduction and computation: the latter is performed by a specific l...

  5. A scheme on automated test data generation and its evaluation

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    By analyzing some existing test data generation methods, a new automated test data generation approach was developed. The linear predicate functions on a given path are used directly to construct a linear constraint system for the input variables. Only when a predicate function is nonlinear does its linear arithmetic representation need to be computed. If all the predicate functions on the given path are linear, either the desired test data or a guarantee that the path is infeasible can be obtained from the solution of the constraint system. Otherwise, iterative refinement of the input is required to obtain the desired test data. Theoretical analysis and test results show that the approach is simple and effective, and requires less computation. The scheme can also be used to generate path-based test data for programs with arrays and loops.
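    The all-linear case can be sketched as follows (the constraint encoding and example path are hypothetical, and a small brute-force search over a bounded box stands in for the paper's direct solution of the linear system):

    ```python
    from itertools import product

    # Path predicates over inputs (x, y) collected as linear constraints of
    # the form a*x + b*y <= c; a bounded integer search either returns a
    # satisfying input (test data) or reports no solution in the box.

    def find_test_data(constraints, box=range(-10, 11)):
        """constraints: list of (a, b, c) meaning a*x + b*y <= c."""
        for x, y in product(box, repeat=2):
            if all(a * x + b * y <= c for a, b, c in constraints):
                return (x, y)
        return None  # no witness in the box; the path may be infeasible

    # Path condition: x + y <= 5, x >= 2 (written -x <= -2), y <= x.
    path = [(1, 1, 5), (-1, 0, -2), (-1, 1, 0)]
    print(find_test_data(path) is not None)

    # Contradictory predicates x <= 0 and x >= 1: infeasible path.
    print(find_test_data([(1, 0, 0), (-1, 0, -1)]))
    ```

    A real implementation would solve the constraint system exactly (e.g. with linear programming), which also yields a definitive infeasibility proof rather than a bounded-search result.
    
    
    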

  6. Automated 3D model generation for urban environments [online

    OpenAIRE

    Frueh, Christian

    2007-01-01

    Abstract In this thesis, we present a fast approach to automated generation of textured 3D city models with both high details at ground level and complete coverage for bird’s-eye view. A ground-based facade model is acquired by driving a vehicle equipped with two 2D laser scanners and a digital camera under normal traffic conditions on public roads. One scanner is mounted horizontally and is used to determine the approximate component of relative motion along the move...

  7. Generators and automated generator systems for production and on-line injections of pet radiopharmaceuticals

    Science.gov (United States)

    Shimchuk, G.; Shimchuk, Gr; Pakhomov, G.; Avalishvili, G.; Zavrazhnov, G.; Polonsky-Byslaev, I.; Fedotov, A.; Polozov, P.

    2017-01-01

    One of the prospective directions of PET development is the use of generator-produced positron-emitting nuclides [1,2]. Introduction of this technology is financially promising, since it does not require an expensive dedicated accelerator and radiochemical laboratory in the medical institution, which considerably reduces the cost of PET diagnostics and makes it available to more patients. POZITOM-PRO RPC LLC developed and produced an 82Sr-82Rb generator; an automated injection system designed for automatic, fully controlled injections of the 82RbCl produced by this generator; and automated radiopharmaceutical synthesis units, based on 68Ga from a domestically manufactured 68Ge-68Ga generator, for preparing two pharmaceuticals: Ga-68-DOTA-TATE and Vascular Ga-68.
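    The physics that makes an 82Sr-82Rb generator practical can be shown with the parent-daughter Bateman relation. The half-life values below are approximate textbook figures, not taken from this record.

    ```python
    import math

    # Regrowth of the daughter (82Rb) after a generator elution, from the
    # Bateman equation for a parent-daughter pair (approximate half-lives:
    # 82Sr ~25.3 d, 82Rb ~75 s). Because the daughter is so short-lived,
    # the column returns to near secular equilibrium within minutes,
    # enabling rapid repeat injections.

    LAMBDA_SR = math.log(2) / (25.3 * 24 * 3600)  # 82Sr decay constant, 1/s
    LAMBDA_RB = math.log(2) / 75.0                # 82Rb decay constant, 1/s

    def rb_activity_fraction(t_seconds):
        """82Rb activity relative to the parent's initial activity,
        t seconds after an elution that removed all daughter."""
        lam_p, lam_d = LAMBDA_SR, LAMBDA_RB
        return (lam_d / (lam_d - lam_p)) * (math.exp(-lam_p * t_seconds)
                                            - math.exp(-lam_d * t_seconds))

    for minutes in (1, 2, 5, 10):
        print(minutes, round(rb_activity_fraction(60 * minutes), 3))
    ```

    After roughly four daughter half-lives (about five minutes) the eluate activity is back above ninety percent of equilibrium, which is why such generators support frequent on-line injections.
    
    
    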

  8. Generative Moire Structures

    Directory of Open Access Journals (Sweden)

    Adrian – Mihail Marian

    2006-01-01

    Full Text Available “GRAPHIC ON COMPUTER” – the work of the Czech Petar Milojevic, published in Titus Mocanu’s book “THE MODERN ART’S MORPHOLOGY”, in 1973, had great influence on me. I tried to discover the algorithm that generated this work. It was not so difficult to do and in a short time I was able to draw 12 such structures. In time, with interruptions, I have returned to this kind of works. In my personal exhibition “CYBERNETIC DESIGN” that took place at “M4-1-13-etopa” gallery of Pitesti, in March 1981, I have presented 8 such structures. To my joy, they had an impact on art lovers.

  9. Fully integrated, fully automated generation of short tandem repeat profiles

    Science.gov (United States)

    2013-01-01

    Background The generation of short tandem repeat profiles, also referred to as ‘DNA typing,’ is not currently performed outside the laboratory because the process requires highly skilled technical operators and a controlled laboratory environment and infrastructure with several specialized instruments. The goal of this work was to develop a fully integrated system for the automated generation of short tandem repeat profiles from buccal swab samples, to improve forensic laboratory process flow as well as to enable short tandem repeat profile generation to be performed in police stations and in field-forward military, intelligence, and homeland security settings. Results An integrated system was developed consisting of an injection-molded microfluidic BioChipSet cassette, a ruggedized instrument, and expert system software. For each of five buccal swabs, the system purifies DNA using guanidinium-based lysis and silica binding, amplifies 15 short tandem repeat loci and the amelogenin locus, electrophoretically separates the resulting amplicons, and generates a profile. No operator processing of the samples is required, and the time from swab insertion to profile generation is 84 minutes. All required reagents are contained within the BioChipSet cassette; these consist of a lyophilized polymerase chain reaction mix and liquids for purification and electrophoretic separation. Profiles obtained from fully automated runs demonstrate that the integrated system generates concordant short tandem repeat profiles. The system exhibits single-base resolution from 100 to greater than 500 bases, with inter-run precision with a standard deviation of ±0.05 - 0.10 bases for most alleles. The reagents are stable for at least 6 months at 22°C, and the instrument has been designed and tested to Military Standard 810F for shock and vibration ruggedization. A nontechnical user can operate the system within or outside the laboratory. Conclusions The integrated system represents the

  10. Automated Reference File Generation for HST STIS Using OPUS

    Science.gov (United States)

    Swam, Michael S.; Goodfrooij, Paul; Diaz-Miller, Rosa I.

    A project has been undertaken at the Space Telescope Science Institute (STScI) to automatically generate some types of instrument reference files as soon as the raw exposures are received from the Hubble Space Telescope (HST). This project will start by automatically producing bias and dark reference files for some of the modes of the Space Telescope Imaging Spectrograph's (STIS) CCD camera. Using database tables of planned and received exposures, scripts developed by the STIS instrument group, and the process automation and monitoring features of OPUS, this system will allow monitoring of reference file creation as it occurs on the production systems at STScI. This monitoring occurs through the OPUS Observation Manager (OMG), a Java application running on a potentially remote workstation or personal computer. The automation of the process will not only reduce considerably the time it takes to create these reference files, but also will allow the delivery of accurate bias and dark reference files for STIS shortly after the observations occur, resulting in a much better calibration of the data. This paper will describe the features of the system architecture and the software technologies used to implement them.

  11. Automated Generation of User Guidance by Combining Computation and Deduction

    Directory of Open Access Journals (Sweden)

    Walther Neuper

    2012-02-01

    Full Text Available Herewith, a fairly old concept is published for the first time and named "Lucas Interpretation". This has been implemented in a prototype, which has been proved useful in educational practice and has gained academic relevance with an emerging generation of educational mathematics assistants (EMA) based on Computer Theorem Proving (CTP). Automated Theorem Proving (ATP), i.e. deduction, is the most reliable technology used to check user input. However ATP is inherently weak in automatically generating solutions for arbitrary problems in applied mathematics. This weakness is crucial for EMAs: when ATP checks user input as incorrect and the learner gets stuck then the system should be able to suggest possible next steps. The key idea of Lucas Interpretation is to compute the steps of a calculation following a program written in a novel CTP-based programming language, i.e. computation provides the next steps. User guidance is generated by combining deduction and computation: the latter is performed by a specific language interpreter, which works like a debugger and hands over control to the learner at breakpoints, i.e. tactics generating the steps of calculation. The interpreter also builds up logical contexts providing ATP with the data required for checking user input, thus combining computation and deduction. The paper describes the concepts underlying Lucas Interpretation so that open questions can adequately be addressed, and prerequisites for further work are provided.

  12. Automated Discourse Generation Using Discourse Structure Relations

    Science.gov (United States)

    1993-06-01

    For Section 5.2: Thanks to the ISI Text Planning Group: Dr. Cécile Paris (USC/ISI), Dr. Julia Lavid (Universidad Complutense de Madrid), Ms. Elisabeth...and Dr. Julia Lavid (University of Madrid). For Section 6.1: Thanks to Prof. Donia Scott (Brighton University) and Dr. Dietmar Rösner (University of Ulm...1989. Enhancing Text Quality in a Question-Answering System. Unpublished manuscript, Pontifícia Universidade Católica do Rio de Janeiro. [Dobeš & Novák

  13. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    Science.gov (United States)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  14. Toward the Automated Generation of Components from Existing Source Code

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Yi, Q; Kumfert, G; Epperly, T; Dahlgren, T; Schordan, M; White, B

    2004-12-02

    A major challenge to achieving widespread use of software component technology in scientific computing is an effective migration strategy for existing, or legacy, source code. This paper describes initial work and challenges in automating the identification and generation of components using the ROSE compiler infrastructure and the Babel language interoperability tool. Babel enables calling interfaces expressed in the Scientific Interface Definition Language (SIDL) to be implemented in, and called from, an arbitrary combination of supported languages. ROSE is used to build specialized source-to-source translators that (1) extract a SIDL interface specification from information implicit in existing C++ source code and (2) transform Babel's output to include dispatches to the legacy code.

  15. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases ¹⁵N–¹H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.
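
The score-guided model selection in improvement (1) above reduces to a generic pattern: rank candidate models by a global metric and let the best ones guide the next iteration. The sketch below is an invented stand-in (the field names, the fake metric, and `select_models` are assumptions, not the actual DP-score computation):

```python
# Illustrative model-selection step: candidate structure models are ranked
# by a global accuracy metric (a stand-in for the discriminating power
# score) and only the top-scoring models seed the next interpretation round.

def select_models(models, dp_score, keep=2):
    """Keep the top-scoring models to guide the next iteration."""
    return sorted(models, key=dp_score, reverse=True)[:keep]

# Toy candidates scored by a fabricated quality field.
models = [{"id": "m1", "quality": 0.71},
          {"id": "m2", "quality": 0.88},
          {"id": "m3", "quality": 0.64}]
best = select_models(models, dp_score=lambda m: m["quality"])
```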

  16. Automated Generation of Web Services for Visualization Toolkits

    Science.gov (United States)

    Jensen, P. A.; Yuen, D. A.; Erlebacher, G.; Bollig, E. F.; Kigelman, D. G.; Shukh, E. A.

    2005-12-01

    The recent explosion in the size and complexity of geophysical data and an increasing trend for collaboration across large geographical areas demand the use of remote, full featured visualization toolkits. As the scientific community shifts toward grid computing to handle these increased demands, new web services are needed to assemble powerful distributed applications. Recent research has established the possibility of converting toolkits such as VTK [1] and Matlab [2] into remote visualization services. We are investigating an automated system to allow these toolkits to export their functions as web services under the standardized protocols SOAP and WSDL using pre-existing software (gSOAP [3]) and a custom compiler for Tcl-based scripts. The compiler uses a flexible parser and type inferring mechanism to convert the Tcl into a C++ program that allows the desired Tcl procedures to be exported as SOAP-accessible functions and the VTK rendering window to be captured offscreen and encapsulated for forwarding through a web service. Classes for a client-side Java applet to access the rendering window remotely are also generated. We will use this system to demonstrate the streamlined generation of a standards-compliant web service (suitable for grid deployment) from a Tcl script for VTK. References: [1] The Visualization Toolkit, http://www.vtk.org [2] Matlab, http://www.mathworks.com [3] gSOAP, http://www.cs.fsu.edu/~engelen/soap.html

  17. Methodical Approaches to Creation of Dividing Automation at Industrial Enterprises with Generating Power Plants

    Directory of Open Access Journals (Sweden)

    E. V. Kalentionok

    2010-01-01

    Full Text Available The paper considers the problem of creating dividing automation at industrial enterprises that have their own generating plants. It proposes algorithms for the action of dividing automation that ensure the minimum possible power imbalance when the generating plants switch to autonomous operation, together with possible parameters for its response.

  18. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While STPA has proved very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety requirements and other functional model-based requirements during early development of the system.
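
One way to picture the requirements-generation step described above: STPA-style tables pair each control action with contexts in which providing (or failing to provide) it is hazardous, and each hazardous combination yields one requirement. The sketch below is illustrative only; the function, the train-door example, and the requirement phrasing are assumptions, not material from the report.

```python
# Generate one safety requirement per hazardous (action, context, provided)
# combination, in the spirit of an STPA unsafe-control-action table.

def generate_requirements(actions, contexts, hazardous):
    """hazardous(action, context, provided) -> True if the combination is unsafe."""
    reqs = []
    for a in actions:
        for c in contexts:
            if hazardous(a, c, True):      # providing the action is unsafe here
                reqs.append(f"{a} must not be provided when {c}")
            if hazardous(a, c, False):     # withholding the action is unsafe here
                reqs.append(f"{a} must be provided when {c}")
    return reqs

# Toy model: opening train doors while moving is unsafe; failing to open
# them during an emergency evacuation is also unsafe.
unsafe = {("open doors", "train is moving", True),
          ("open doors", "evacuation is required", False)}
reqs = generate_requirements(["open doors"],
                             ["train is moving", "evacuation is required"],
                             lambda a, c, p: (a, c, p) in unsafe)
```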

  19. Automated Generation and Assessment of Autonomous Systems Test Cases

    Science.gov (United States)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

    This slide presentation reviews issues in the verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the work involved in the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage: for example, generating cases for all possible fault monitors and across all state change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, and generating predictions with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges. A good approach to address this problem is to automatically score the results.
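
The coverage-driven generation idea above (cases for all fault monitors, across all state-change boundaries, injected before, during, and after each change) is essentially a cross product. A minimal sketch, with invented monitor and transition names:

```python
# Enumerate test cases as the cross product of fault monitors, state-change
# boundaries, and injection phases, so no combination is overlooked.

import itertools

def generate_test_cases(fault_monitors, state_changes,
                        phases=("before", "during", "after")):
    """One case per (monitor, state change, injection phase) combination."""
    return [{"inject": m, "transition": s, "phase": p}
            for m, s, p in itertools.product(fault_monitors, state_changes, phases)]

cases = generate_test_cases(["thruster stuck", "sensor dropout"],
                            ["cruise->approach"])
# 2 monitors x 1 transition x 3 phases = 6 cases
```

In practice the case count grows quickly, which is exactly why the text argues that individually hand-written outcome predictions become impractical and automated scoring is needed.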

  20. Calibrated automated thrombin generation measurement in clotting plasma.

    Science.gov (United States)

    Hemker, H Coenraad; Giesen, Peter; Al Dieri, Raed; Regnault, Véronique; de Smedt, Eric; Wagenvoord, Rob; Lecompte, Thomas; Béguin, Suzette

    2003-01-01

    Calibrated automated thrombography displays the concentration of thrombin in clotting plasma with or without platelets (platelet-rich plasma/platelet-poor plasma, PRP/PPP) in up to 48 samples by monitoring the splitting of a fluorogenic substrate and comparing it to a constant known thrombin activity in a parallel, non-clotting sample. Thus, the non-linearity of the reaction rate with thrombin concentration is compensated for, and adding an excess of substrate can be avoided. Standard conditions were established at which acceptable experimental variation accompanies sensitivity to pathological changes. The within-experiment coefficient of variation of the area under the curve (the endogenous thrombin potential) is approximately 3%. In PPP, thrombin generation reflects the effect of anticoagulants (AVK, heparin(-likes), direct inhibitors). In PRP, it is diminished in von Willebrand's disease, but it also shows the effect of platelet inhibitors (e.g. aspirin and abciximab). Addition of activated protein C (APC) or thrombomodulin inhibits thrombin generation and reflects disorders of the APC system (congenital and acquired resistance, deficiencies and lupus antibodies) independent of concomitant inhibition of the procoagulant pathway, as for example by anticoagulants.
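
The calibration principle above can be pictured as a first-order ratio computation: the unknown sample's substrate-conversion rate is compared with that of a parallel well containing a known, constant thrombin activity. The function name and all numbers below are invented for illustration; the real method additionally corrects for the non-linearity discussed in the abstract.

```python
# First-order sketch of calibrated thrombin measurement: scale the known
# calibrator activity by the ratio of observed fluorescence rates.

def thrombin_concentration(sample_rate, calibrator_rate, calibrator_activity):
    """Convert a fluorescence conversion rate into thrombin activity (e.g. nM)."""
    return calibrator_activity * sample_rate / calibrator_rate

# If the calibrator well (100 nM) converts substrate at 50 units/min and
# the clotting sample at 20 units/min, the sample holds 40 nM thrombin.
conc = thrombin_concentration(20.0, 50.0, 100.0)
```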

  1. An automated approach to network features of protein structure ensembles.

    Science.gov (United States)

    Bhattacharyya, Moitrayee; Bhat, Chanda R; Vishveshwara, Saraswathi

    2013-10-01

    Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from PDB and simulations establishes a need to introduce a standalone, efficient program that assembles network concepts/parameters under one roof in an automated manner. Herein, we discuss the development/application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulation/NMR studies or from multiple X-ray structures. The novelty in network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction of a flexible weighting scheme in terms of residue pairwise cross-correlation/interaction energy in PSN-Ensemble brings dynamical/chemical knowledge into the network representation. Also, the results are mapped on a graphical display of the structure, allowing easy access to network analysis for the general biological community. The potential of PSN-Ensemble toward examining structural ensembles is exemplified using MD trajectories of a ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single-static structures of active/inactive states of β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html.
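
A toy version of the network construction described above: residues become nodes, and an edge is added when a pairwise interaction score crosses a cutoff, with the score retained as an edge weight (echoing PSN-Ensemble's weighting scheme). The residue names, scores, and cutoff are invented, and this is far simpler than the actual side-chain-interaction criterion.

```python
# Build a weighted protein structure network (PSN) as an adjacency dict:
# an edge links two residues when their interaction score meets the cutoff.

def build_psn(interactions, cutoff):
    """interactions: {(res_a, res_b): score}. Returns a weighted adjacency dict."""
    graph = {}
    for (a, b), score in interactions.items():
        if score >= cutoff:
            graph.setdefault(a, {})[b] = score
            graph.setdefault(b, {})[a] = score
    return graph

interactions = {("ALA5", "LEU20"): 4.2, ("ALA5", "GLY6"): 1.0,
                ("LEU20", "PHE44"): 3.7}
psn = build_psn(interactions, cutoff=2.0)
# The ALA5-GLY6 pair falls below the cutoff and is excluded from the network.
```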

  2. Automating generation of textual class definitions from OWL to English.

    Science.gov (United States)

    Stevens, Robert; Malone, James; Williams, Sandra; Power, Richard; Third, Allan

    2011-05-17

    Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort, and there is often a lag between specification of the main part of an ontology (logical descriptions and definitions of entities) and the development of the text-based definitions. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. The application described here uses NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities, so avoiding the bottleneck of authoring these definitions by hand. To produce the descriptions, the program collects all the axioms relating to a given entity, groups them according to common structure, realises each group through an English sentence, and assembles the resulting sentences into a paragraph, to form as 'coherent' a text as possible without human intervention. Sentence generation is accomplished using a generic grammar based on logical patterns in OWL, together with a lexicon for realising atomic entities. We have tested our output for the Experimental Factor Ontology (EFO) using a simple survey strategy to explore the fluency of the generated text and how well it conveys the underlying axiomatisation. Two rounds of survey and improvement show that overall the generated English definitions are found to convey the intended meaning of the axiomatisation in a satisfactory manner. The surveys also suggested that one form of generated English will not be universally liked; that intrusion of too much 'formal ontology' was not liked; and that too much explicit exposure of OWL semantics was also not liked. Our prototype tools can generate reasonable paragraphs of English text that can act as definitions. The definitions were found acceptable by our survey.
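
The collect-group-realise-assemble pipeline described above can be sketched very simply. The patterns, templates, and example entity below are invented assumptions, vastly simpler than the OWL-based grammar and lexicon the paper uses:

```python
# Miniature NLG pipeline: group an entity's axioms by common structure,
# realise each group as one English sentence, and join into a paragraph.

def realise(entity, axioms):
    """axioms: list of (pattern, filler) pairs, e.g. ('subClassOf', 'disease')."""
    groups = {}
    for pattern, filler in axioms:            # group axioms with common structure
        groups.setdefault(pattern, []).append(filler)
    templates = {"subClassOf": "{e} is a kind of {x}",
                 "partOf": "{e} is part of {x}"}
    sentences = [templates[p].format(e=entity, x=" and ".join(fs))
                 for p, fs in groups.items() if p in templates]
    return ". ".join(sentences) + "."

text = realise("carcinoma", [("subClassOf", "cancer"), ("partOf", "epithelium")])
```

Grouping before realisation is what lets several axioms of the same shape collapse into one aggregated sentence ("... is a kind of X and Y") rather than a list of repetitive fragments.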

  3. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well-defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (for PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software; it dynamically discovers and calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA g...
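
A miniature analogue of the instance-generation step described above: a device-type library plus a list of instances yields generated code from templates. The device types, parameters, and output format below are invented for illustration; the real UAB plug-ins target Siemens/Schneider PLC code and SCADA configuration.

```python
# Template-based code generation: each (device type, parameters) instance
# is rendered through the device-type library into one generated line.

TEMPLATES = {
    "AnalogInput": 'AI "{name}" RANGE {lo}..{hi}',
    "OnOffValve": 'VALVE "{name}" FAILSAFE {failsafe}',
}

def generate_instances(instances):
    """instances: list of (device_type, params) pairs -> generated code lines."""
    return [TEMPLATES[dtype].format(**params) for dtype, params in instances]

code = generate_instances([
    ("AnalogInput", {"name": "TT101", "lo": 0, "hi": 150}),
    ("OnOffValve", {"name": "PV205", "failsafe": "CLOSED"}),
])
```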

  4. Blind testing of routine, fully automated determination of protein structures from NMR data.

    Science.gov (United States)

    Rosato, Antonio; Aramini, James M; Arrowsmith, Cheryl; Bagaria, Anurag; Baker, David; Cavalli, Andrea; Doreleijers, Jurgen F; Eletsky, Alexander; Giachetti, Andrea; Guerry, Paul; Gutmanas, Aleksandras; Güntert, Peter; He, Yunfen; Herrmann, Torsten; Huang, Yuanpeng J; Jaravine, Victor; Jonker, Hendrik R A; Kennedy, Michael A; Lange, Oliver F; Liu, Gaohua; Malliavin, Thérèse E; Mani, Rajeswari; Mao, Binchen; Montelione, Gaetano T; Nilges, Michael; Rossi, Paolo; van der Schot, Gijs; Schwalbe, Harald; Szyperski, Thomas A; Vendruscolo, Michele; Vernon, Robert; Vranken, Wim F; Vries, Sjoerd de; Vuister, Geerten W; Wu, Bin; Yang, Yunhuang; Bonvin, Alexandre M J J

    2012-02-08

    The protocols currently used for protein structure determination by nuclear magnetic resonance (NMR) depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by using a specific computer program. To assess whether it is indeed possible to generate in a fully automated manner NMR structures adequate for deposition in the Protein Data Bank, we gathered 10 experimental data sets with unassigned nuclear Overhauser effect spectroscopy (NOESY) peak lists for various proteins of unknown structure, computed structures for each of them using different, fully automatic programs, and compared the results to each other and to the manually solved reference structures that were not available at the time the data were provided. This constitutes a stringent "blind" assessment similar to the CASP and CAPRI initiatives. This study demonstrates the feasibility of routine, fully automated protein structure determination by NMR.

  7. Automated Test Input Generation for Android: Are We There Yet?

    OpenAIRE

    Choudhary, Shauvik Roy; Gorla, Alessandra; Orso, Alessandro

    2015-01-01

    Mobile applications, often simply called "apps", are increasingly widespread, and we use them daily to perform a number of activities. Like all software, apps must be adequately tested to gain confidence that they behave correctly. Therefore, in recent years, researchers and practitioners alike have begun to investigate ways to automate apps testing. In particular, because of Android's open source nature and its large share of the market, a great deal of research has been performed on input g...

  8. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is then utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  9. Automated hexahedral mesh generation from biomedical image data: applications in limb prosthetics.

    Science.gov (United States)

    Zachariah, S G; Sanders, J E; Turkiyyah, G M

    1996-06-01

    A general method to generate hexahedral meshes for finite element analysis of residual limbs and similar biomedical geometries is presented. The method utilizes skeleton-based subdivision of cross-sectional domains to produce simple subdomains in which structured meshes are easily generated. Application to a below-knee residual limb and external prosthetic socket is described. The residual limb was modeled as consisting of bones, soft tissue, and skin. The prosthetic socket model comprised a socket wall with an inner liner. The geometries of these structures were defined using axial cross-sectional contour data from X-ray computed tomography, optical scanning, and mechanical surface digitization. A tubular surface representation, using B-splines to define the directrix and generator, is shown to be convenient for definition of the structure geometries. Conversion of cross-sectional data to the compact tubular surface representation is direct, and the analytical representation simplifies geometric querying and numerical optimization within the mesh generation algorithms. The element meshes remain geometrically accurate since boundary nodes are constrained to lie on the tubular surfaces. Several element meshes of increasing mesh density were generated for two residual limbs and prosthetic sockets. Convergence testing demonstrated that approximately 19 elements are required along a circumference of the residual limb surface for a simple linear elastic model. A model with the fibula absent compared with the same geometry with the fibula present showed differences suggesting higher distal stresses in the absence of the fibula. Automated hexahedral mesh generation algorithms for sliced data represent an advancement in prosthetic stress analysis since they allow rapid modeling of any given residual limb and optimization of mesh parameters.

  10. Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software

    Science.gov (United States)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, Spass, and e-setheo.
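
The rewriting-based simplification mentioned above can be illustrated with a miniature term rewriter: proof obligations are normalised by applying rules until none fire, before being handed to the prover. The term representation (operator triples) and the rules below are invented assumptions, not the paper's actual rule set.

```python
# Tiny rewriting-based simplifier for proof obligations. Terms are triples
# ("op", lhs, rhs) or atoms; rules eliminate trivially true subformulas.

RULES = [
    (("and", True, "x"), "x"),        # True /\ x  ->  x
    (("and", "x", True), "x"),        # x /\ True  ->  x
    (("imp", True, "x"), "x"),        # True => x  ->  x
]

def rewrite(term):
    """Apply the rules bottom-up; atoms pass through unchanged."""
    if not isinstance(term, tuple):
        return term
    term = tuple(rewrite(t) for t in term)   # simplify subterms first
    for (op, a, b), _ in RULES:
        if term[0] == op:
            if a is True and term[1] is True:
                return term[2]               # rule binds the left operand
            if b is True and term[2] is True:
                return term[1]               # rule binds the right operand
    return term

# (True /\ (True => P)) simplifies all the way down to P.
obligation = ("and", True, ("imp", True, "P"))
simplified = rewrite(obligation)
```

Even this toy shows why the paper cares about ordering the simplification stages: discharging trivial subgoals first shrinks the formulas the ATP ultimately has to solve.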

  11. An automated pipeline for cortical surface generation and registration of the cerebral cortex

    Science.gov (United States)

    Li, Wen; Ibanez, Luis; Gelas, Arnaud; Yeo, B. T. Thomas; Niethammer, Marc; Andreasen, Nancy C.; Magnotta, Vincent A.

    2011-03-01

    The human cerebral cortex is one of the most complicated structures in the body. It has a highly convoluted structure with much of the cortical sheet buried in sulci. Based on cytoarchitectural and functional imaging studies, it is possible to segment the cerebral cortex into several subregions. While it is only possible to differentiate the true anatomical subregions based on cytoarchitecture, the surface morphometry aligns closely with the underlying cytoarchitecture and provides features that allow the surface of the cortex to be parcellated based on the sulcal and gyral patterns that are readily visible on the MR images. We have developed a fully automated pipeline for the generation and registration of cortical surfaces in the spherical domain. The pipeline begins with the BRAINS AutoWorkup pipeline. Subsequently, topology correction and surface generation are performed to produce a genus-zero surface, which is mapped to a sphere. Several surface features are then calculated to drive the registration between the atlas surface and other datasets. A spherical diffeomorphic demons algorithm is used to co-register an atlas surface onto a subject surface. A lobar atlas of the cerebral cortex was created from a manual parcellation of the cortex. The atlas surface was then co-registered to five additional subjects using the spherical diffeomorphic demons algorithm. The labels from the atlas surface were warped onto the subject surfaces and compared to those of the manual raters. The average Dice overlap index was 0.89 across all regions.
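
The Dice overlap index used above to compare automated and manual labels has a compact definition: twice the shared volume divided by the sum of the two volumes. A set-based sketch over invented voxel coordinates:

```python
# Dice coefficient of two label regions represented as sets of voxel
# coordinates; 1.0 means perfect overlap, 0.0 means no overlap.

def dice(region_a, region_b):
    """Return the Dice overlap index of two voxel sets."""
    if not region_a and not region_b:
        return 1.0                     # two empty regions overlap trivially
    return 2 * len(region_a & region_b) / (len(region_a) + len(region_b))

auto_label = {(1, 1), (1, 2), (2, 1), (2, 2)}
manual_label = {(1, 1), (1, 2), (2, 1), (3, 3)}
overlap = dice(auto_label, manual_label)   # 2*3 / (4+4) = 0.75
```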

  12. Automated Generation of Kempe Linkage and Its Complexity

    Institute of Scientific and Technical Information of China (English)

    高小山; 朱长才

    1999-01-01

    It is a famous result of Kempe that a linkage can be designed to generate any given plane algebraic curve. In this paper, Kempe's result is improved to give a precise algorithm for generating Kempe linkages. We prove that for an algebraic plane curve of degree n, the Kempe linkage uses at most O(n^4) links. Efforts to implement a program that generates Kempe linkages and simulates the generation process of the plane curves are also presented in the paper.

  13. Automated Generation of Safety Requirements from Railway Interlocking Tables

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2012-01-01

    This paper describes a tool for extracting formal safety conditions from interlocking tables for railway interlocking systems. The tool has been applied to generate safety conditions for the interlocking system at Stenstrup station in Denmark, and the generated conditions were then checked to hold by the SAL model checker tool.
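
A toy rendering of the extraction described above: each row of an interlocking table (a route with the track sections it requires) yields a formal condition stating that setting the route implies those sections are vacant. The table contents, naming scheme, and condition syntax below are invented, not Stenstrup's actual layout or the tool's output language.

```python
# Turn interlocking-table rows into formal safety conditions of the shape
# "route set implies all required track sections vacant".

def safety_conditions(table):
    """table: {route: [required track sections]} -> list of condition strings."""
    return [f"route_{r}_set -> " + " & ".join(f"{s}_vacant" for s in secs)
            for r, secs in table.items()]

conds = safety_conditions({"R1": ["T01", "T02"], "R2": ["T02", "T03"]})
```

Conditions of this propositional shape are exactly what a model checker such as SAL can then be asked to verify against a model of the interlocking logic.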

  14. Automated analysis of fundamental features of brain structures.

    Science.gov (United States)

    Lancaster, Jack L; McKay, D Reese; Cykowski, Matthew D; Martinez, Michael J; Tan, Xi; Valaparla, Sunil; Zhang, Yi; Fox, Peter T

    2011-12-01

    Automated image analysis of the brain should include measures of fundamental structural features such as size and shape. We used principal axes (P-A) measurements to measure overall size and shape of brain structures segmented from MR brain images. The rationale was that quantitative volumetric studies of brain structures would benefit from shape standardization as had been shown for whole brain studies. P-A analysis software was extended to include controls for variability in position and orientation to support individual structure spatial normalization (ISSN). The rationale was that ISSN would provide a bias-free means to remove elementary sources of a structure's spatial variability in preparation for more detailed analyses. We studied nine brain structures (whole brain, cerebral hemispheres, cerebellum, brainstem, caudate, putamen, hippocampus, inferior frontal gyrus, and precuneus) from the 40-brain LPBA40 atlas. This paper provides the first report of anatomical positions and principal axes orientations within a standard reference frame, in addition to "shape/size related" principal axes measures, for the nine brain structures from the LPBA40 atlas. Analysis showed that overall size (mean volume) for internal brain structures was preserved using shape standardization while variance was reduced by more than 50%. Shape standardization provides increased statistical power for between-group volumetric studies of brain structures compared to volumetric studies that control only for whole brain size. To test ISSN's ability to control for spatial variability of brain structures we evaluated the overlap of 40 regions of interest (ROIs) in a standard reference frame for the nine different brain structures before and after processing. Standardizations of orientation or shape were ineffective when not combined with position standardization. The greatest reduction in spatial variability was seen for combined standardizations of position, orientation and shape.
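
Principal-axes (P-A) measurement, as used above, derives a structure's orientation from the eigenvectors of its coordinate covariance. A 2-D sketch using the closed-form angle of the leading eigenvector (the function and the synthetic points are illustrative assumptions; the paper works on 3-D segmented structures):

```python
# Orientation of the leading principal axis of a 2-D point cloud, via the
# closed-form eigenvector angle of the 2x2 covariance matrix.

import math

def principal_axis_angle(points):
    """Return the orientation (radians) of the leading principal axis."""
    n = len(points)
    mx = sum(x for x, _ in points) / n          # centroid (position control)
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    return 0.5 * math.atan2(2 * sxy, sxx - syy)  # leading eigenvector angle

# Points along the line y = x should give a 45-degree principal axis.
angle = principal_axis_angle([(0, 0), (1, 1), (2, 2), (3, 3)])
```

Subtracting the centroid and reporting the axis angle mirror the position and orientation controls that ISSN standardizes before shape comparison.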

  15. AUTOMATED DETECTION OF STRUCTURAL ALERTS (CHEMICAL FRAGMENTS) IN (ECO)TOXICOLOGY

    Directory of Open Access Journals (Sweden)

    Alban Lepailleur

    2013-02-01

    Full Text Available This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches of in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for defining its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data.

  16. Automated Diagnosis and Classification of Steam Generator Tube Defects

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Gabe V. Garcia

    2004-10-01

    A major cause of failure in nuclear steam generators is tube degradation. Tube defects are divided into seven categories, one of which is intergranular attack/stress corrosion cracking (IGA/SCC). Defects of this type usually begin on the outer surface of the tubes and propagate both inward and laterally. In many cases these defects occur at or near the tube support plates. Several different methods exist for the nondestructive evaluation of nuclear steam generator tubes for defect characterization.

  17. PYMORPH: automated galaxy structural parameter estimation using PYTHON

    Science.gov (United States)

    Vikram, Vinu; Wadadekar, Yogesh; Kembhavi, Ajit K.; Vijayagovindan, G. V.

    2010-12-01

    We present a new software pipeline - PYMORPH - for automated estimation of structural parameters of galaxies. Both parametric fits, through a two-dimensional bulge-disc decomposition, and structural parameter measurements such as concentration and asymmetry are supported. The pipeline is designed to be easy to use yet flexible; individual software modules can be replaced with ease. A find-and-fit mode is available so that all galaxies in an image can be measured with a single command. A parallel version of the PYMORPH pipeline runs on computer clusters, and a virtual observatory compatible, web-enabled interface is under development.

  18. A screened automated structural search with semiempirical methods

    CERN Document Server

    Ota, Yukihiro; Machida, Masahiko; Shiga, Motoyuki

    2016-01-01

    We developed an interface program between a program suite for an automated search of chemical reaction pathways, GRRM, and a program package of semiempirical methods, MOPAC. A two-step structural search is proposed as an application of this interface program. A screening test is first performed by semiempirical calculations. Subsequently, a reoptimization procedure is done by ab initio or density functional calculations. We apply this approach to ion adsorption on cellulose. The computational efficiency is also shown for a GRRM search. The interface program is suitable for the structural search of large molecular systems for which semiempirical methods are applicable.

  19. A telerobotic system for automated assembly of large space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Wise, Marion A.

    1990-01-01

    Future space missions such as polar platforms and antennas are anticipated to require large truss structures as their primary support system. During the past several years considerable research has been conducted to develop hardware and construction techniques suitable for astronaut assembly of truss structures in space. A research program has recently been initiated to develop the technology and to demonstrate the potential for automated in-space assembly of large erectable structures. The initial effort will be focused on automated assembly of a tetrahedral truss composed of 2-meter members. The facility is designed as a ground-based system to permit evaluation of assembly concepts and was not designed for space qualification. The system is intended to be used as a tool from which more sophisticated procedures and operations can be developed. The facility includes a truss structure, motion bases and a robot arm equipped with an end effector, along with the computer control systems that monitor and control the operations of the assembly facility.

  20. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    Science.gov (United States)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  1. Automated generation of formal safety conditions from railway interlocking tables

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2014-01-01

    This paper describes a tool for extracting formal safety conditions from interlocking tables for railway interlocking systems. The tool has been applied to generate safety conditions for the interlocking system at Stenstrup station in Denmark, and the SAL model checker tool has been used to check...

  2. Automated refinement of macromolecular structures at low resolution using prior information

    Science.gov (United States)

    Kovalevskiy, Oleg; Nicholls, Robert A.; Murshudov, Garib N.

    2016-01-01

    Since the ratio of the number of observations to adjustable parameters is small at low resolution, it is necessary to use complementary information for the analysis of such data. ProSMART is a program that can generate restraints for macromolecules using homologous structures, as well as generic restraints for the stabilization of secondary structures. These restraints are used by REFMAC5 to stabilize the refinement of an atomic model. However, the optimal refinement protocol varies from case to case, and it is not always obvious how to select appropriate homologous structure(s), or other sources of prior information, for restraint generation. After running extensive tests on a large data set of low-resolution models, the best-performing refinement protocols and strategies for the selection of homologous structures have been identified. These strategies and protocols have been implemented in the Low-Resolution Structure Refinement (LORESTR) pipeline. The pipeline performs auto-detection of twinning and selects the optimal scaling method and solvent parameters. LORESTR can either use user-supplied homologous structures, or run an automated BLAST search and download homologues from the PDB. The pipeline executes multiple model-refinement instances using different parameters in order to find the best protocol. Tests show that the automated pipeline improves R factors, geometry and Ramachandran statistics for 94% of the low-resolution cases from the PDB included in the test set. PMID:27710936

  3. Automated volumetric grid generation for finite element modeling of human hand joints

    Energy Technology Data Exchange (ETDEWEB)

    Hollerbach, K.; Underhill, K. [Lawrence Livermore National Lab., CA (United States); Rainsberger, R. [XYZ Scientific Applications, Inc., Livermore, CA (United States)

    1995-02-01

    We are developing techniques for finite element analysis of human joints. These techniques need to provide high quality results rapidly in order to be useful to a physician. The research presented here increases model quality and decreases user input time by automating the volumetric mesh generation step.

  4. Automated generation of program translation and verification tools using annotated grammars

    NARCIS (Netherlands)

    Ordonez Camacho, D.; Mens, K.; Brand, M.G.J. van den; Vinju, J.J.

    2010-01-01

    Automatically generating program translators from source and target language specifications is a non-trivial problem. In this paper we focus on the problem of automating the process of building translators between operations languages, a family of DSLs used to program satellite operations procedures

  5. Automated learning of generative models for subcellular location: building blocks for systems biology.

    Science.gov (United States)

    Zhao, Ting; Murphy, Robert F

    2007-12-01

    The goal of location proteomics is the systematic and comprehensive study of protein subcellular location. We have previously developed automated, quantitative methods to identify protein subcellular location families, but there have been no effective means of communicating their patterns to integrate them with other information for building cell models. We built generative models of subcellular location that are learned from a collection of images so that they not only represent the pattern, but also capture its variation from cell to cell. Our models contain three components: a nuclear model, a cell shape model and a protein-containing object model. We built models for six patterns that consist primarily of discrete structures. To validate the generated images, we showed that they are recognized with reasonable accuracy by a classifier trained on real images. We also showed that the model parameters themselves can be used as features to discriminate the classes. The models allow the synthesis of images with the expectation that they are drawn from the same underlying statistical distribution as the images used to train them. They can potentially be combined for many proteins to yield a high resolution location map in support of systems biology.

  6. Automated generation of curved planar reformations from MR images of the spine

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia); Ourselin, Sebastien [CSIRO ICT Centre, Autonomous Systems Laboratory, BioMedIA Lab, Locked Bag 17, North Ryde, NSW 2113 (Australia); Gomes, Lavier [Department of Radiology, Westmead Hospital, University of Sydney, Hawkesbury Road, Westmead NSW 2145 (Australia); Likar, Bostjan [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia); Pernus, Franjo [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2007-05-21

    A novel method for automated curved planar reformation (CPR) of magnetic resonance (MR) images of the spine is presented. The CPR images, generated by a transformation from image-based to spine-based coordinate system, follow the structural shape of the spine and allow the whole course of the curved anatomy to be viewed in individual cross-sections. The three-dimensional (3D) spine curve and the axial vertebral rotation, which determine the transformation, are described by polynomial functions. The 3D spine curve passes through the centres of vertebral bodies, while the axial vertebral rotation determines the rotation of vertebrae around the axis of the spinal column. The optimal polynomial parameters are obtained by a robust refinement of the initial estimates of the centres of vertebral bodies and axial vertebral rotation. The optimization framework is based on the automatic image analysis of MR spine images that exploits some basic anatomical properties of the spine. The method was evaluated on 21 MR images from 12 patients and the results provided a good description of spine anatomy, with mean errors of 2.5 mm and 1.7° for the position of the 3D spine curve and axial rotation of vertebrae, respectively. The generated CPR images are independent of the position of the patient in the scanner while comprising both anatomical and geometrical properties of the spine.

  7. Automated generation of curved planar reformations from MR images of the spine

    Science.gov (United States)

    Vrtovec, Tomaz; Ourselin, Sébastien; Gomes, Lavier; Likar, Boštjan; Pernuš, Franjo

    2007-05-01

    A novel method for automated curved planar reformation (CPR) of magnetic resonance (MR) images of the spine is presented. The CPR images, generated by a transformation from image-based to spine-based coordinate system, follow the structural shape of the spine and allow the whole course of the curved anatomy to be viewed in individual cross-sections. The three-dimensional (3D) spine curve and the axial vertebral rotation, which determine the transformation, are described by polynomial functions. The 3D spine curve passes through the centres of vertebral bodies, while the axial vertebral rotation determines the rotation of vertebrae around the axis of the spinal column. The optimal polynomial parameters are obtained by a robust refinement of the initial estimates of the centres of vertebral bodies and axial vertebral rotation. The optimization framework is based on the automatic image analysis of MR spine images that exploits some basic anatomical properties of the spine. The method was evaluated on 21 MR images from 12 patients and the results provided a good description of spine anatomy, with mean errors of 2.5 mm and 1.7° for the position of the 3D spine curve and axial rotation of vertebrae, respectively. The generated CPR images are independent of the position of the patient in the scanner while comprising both anatomical and geometrical properties of the spine.
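The polynomial description of the 3D spine curve can be sketched as a per-coordinate least-squares fit through the vertebral-body centres. The code below is a generic illustration under that reading (normal equations solved by Gaussian elimination, parameterised by normalised vertebra index); the paper's robust optimisation framework is considerably more involved.

```python
def polyfit(ts, ys, degree):
    """Least-squares polynomial fit via the normal equations
    (V^T V) a = V^T y, solved by Gaussian elimination with partial pivoting."""
    n = degree + 1
    A = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    b = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for i in reversed(range(n)):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, n))) / A[i][i]
    return coeffs  # a0 + a1*t + a2*t^2 + ...

def spine_curve(centres, degree=3):
    """Fit one polynomial per coordinate through the vertebral-body centres,
    parameterised by normalised vertebra index t in [0, 1]."""
    ts = [i / (len(centres) - 1) for i in range(len(centres))]
    return [polyfit(ts, [c[k] for c in centres], degree) for k in range(3)]
```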

  8. Automating the generation of finite element dynamical cores with Firedrake

    Science.gov (United States)

    Ham, David; Mitchell, Lawrence; Homolya, Miklós; Luporini, Fabio; Gibson, Thomas; Kelly, Paul; Cotter, Colin; Lange, Michael; Kramer, Stephan; Shipton, Jemma; Yamazaki, Hiroe; Paganini, Alberto; Kärnä, Tuomas

    2017-04-01

    The development of a dynamical core is an increasingly complex software engineering undertaking. As the equations become more complete, the discretisations more sophisticated and the hardware acquires ever more fine-grained parallelism and deeper memory hierarchies, the problem of building, testing and modifying dynamical cores becomes increasingly complex. Here we present Firedrake, a code generation system for the finite element method with specialist features designed to support the creation of geoscientific models. Using Firedrake, the dynamical core developer writes the partial differential equations in weak form in a high-level mathematical notation. Appropriate function spaces are chosen and time-stepping loops written at the same high level. When the programme is run, Firedrake generates high-performance C code for the resulting numerics, which is executed in parallel. Models in Firedrake typically take a tiny fraction of the lines of code required by traditional hand-coding techniques. They support more sophisticated numerics than are easily achieved by hand, and the resulting code is frequently higher performance. Critically, debugging, modifying and extending a model written in Firedrake is vastly easier than by traditional methods due to the small, highly mathematical code base. Firedrake supports a wide range of key features for dynamical core creation: a vast range of discretisations, including both continuous and discontinuous spaces and mimetic (C-grid-like) elements which optimally represent force balances in geophysical flows; high-aspect-ratio layered meshes suitable for ocean and atmosphere domains; curved elements for high-accuracy representations of the sphere; support for non-finite-element operators, such as parametrisations; access to PETSc, a world-leading library of programmable linear and nonlinear solvers; and high-performance adjoint models generated automatically by symbolically reasoning about the forward model.
This poster will present

  9. Automated Testcase Generation for Numerical Support Functions in Embedded Systems

    Science.gov (United States)

    Schumann, Johann; Schnieder, Stefan-Alexander

    2014-01-01

    We present a tool for the automatic generation of test stimuli for small numerical support functions, e.g., code for trigonometric functions, quaternions, filters, or table lookup. Our tool is based on KLEE to produce a set of test stimuli for full path coverage. We use a method of iterative deepening over abstractions to deal with floating-point values. During actual testing the stimuli exercise the code against a reference implementation. We illustrate our approach with results of experiments with low-level trigonometric functions, interpolation routines, and mathematical support functions from an open source UAS autopilot.
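The exercise-against-a-reference step can be sketched generically: generate stimuli covering the input domain (below, a plain grid stands in for the path-coverage stimuli a tool like KLEE would produce; this substitution is an assumption for illustration) and flag every input where the implementation under test disagrees with the reference beyond a tolerance.

```python
import math

def generate_stimuli(lo, hi, n):
    """Evenly spaced stimuli over [lo, hi], including both endpoints --
    a simple stand-in for coverage-driven test generation."""
    step = (hi - lo) / (n - 1)
    return [lo + i * step for i in range(n)]

def check_against_reference(impl, reference, stimuli, tol=1e-6):
    """Run each stimulus through both implementations; return the inputs
    on which they disagree by more than `tol`."""
    return [x for x in stimuli if abs(impl(x) - reference(x)) > tol]
```

For example, a small-angle approximation of sine passes near zero but is caught on stimuli further out in the domain.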

  10. An automated framework for hypotheses generation using literature

    Directory of Open Access Journals (Sweden)

    Abedi Vida

    2012-08-01

    Full Text Available Abstract Background In bio-medicine, exploratory studies and hypothesis generation often begin with researching existing literature to identify a set of factors and their association with diseases, phenotypes, or biological processes. Many scientists are overwhelmed by the sheer volume of literature on a disease when they plan to generate a new hypothesis or study a biological phenomenon. The situation is even worse for junior investigators, who often find it difficult to formulate new hypotheses or, more importantly, to corroborate that their hypothesis is consistent with existing literature. It is a daunting task to keep abreast of so much being published and to remember all combinations of direct and indirect associations. Fortunately there is a growing trend of using literature mining and knowledge discovery tools in biomedical research. However, there is still a large gap between the huge amount of effort and resources invested in disease research and the little effort spent harvesting the published knowledge. The proposed hypothesis generation framework (HGF) finds “crisp semantic associations” among entities of interest; this is a step towards bridging such gaps. Methodology The proposed HGF shares end goals similar to those of SWAN but is more holistic in nature; it was designed and implemented using scalable and efficient computational models of disease-disease interaction. The integration of mapping ontologies with latent semantic analysis is critical in capturing domain-specific direct and indirect “crisp” associations, and in making assertions about entities (such as disease X is associated with a set of factors Z). Results Pilot studies were performed using two diseases. A comparative analysis of the computed “associations” and “assertions” with curated expert knowledge was performed to validate the results. It was observed that the HGF is able to capture “crisp” direct and indirect associations, and provide knowledge
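The kind of indirect association such a framework targets can be illustrated with a minimal Swanson-style ABC sketch over term co-occurrence: if A co-occurs with B and B with C, but A never co-occurs with C directly, then A-C is a candidate hidden association. The actual HGF combines ontology mapping with latent semantic analysis, which this toy version does not attempt; the example terms are the classic fish-oil/Raynaud illustration.

```python
def cooccurrence(docs):
    """Build direct association sets: term -> set of co-mentioned terms."""
    assoc = {}
    for doc in docs:
        for term in doc:
            assoc.setdefault(term, set()).update(t for t in doc if t != term)
    return assoc

def indirect_hypotheses(assoc, a):
    """Swanson-style ABC discovery: terms C linked to A only through some
    intermediate B, i.e. candidate hidden associations."""
    direct = assoc.get(a, set())
    candidates = set()
    for b in direct:
        candidates.update(assoc.get(b, set()))
    return candidates - direct - {a}
```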

  11. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper provides critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be run independently in sequence from a graphical user interface (and the algorithm parameters can be readily changed) to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  12. ADVANTG An Automated Variance Reduction Parameter Generator, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, Scott W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevill, Aaron M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ibrahim, Ahmad M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daily, Charles R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wagner, John C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Jeffrey O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Grove, Robert E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-01

    The primary objective of ADVANTG is to reduce both the user effort and the computational time required to obtain accurate and precise tally estimates across a broad range of challenging transport applications. ADVANTG has been applied to simulations of real-world radiation shielding, detection, and neutron activation problems. Examples of shielding applications include material damage and dose rate analyses of the Oak Ridge National Laboratory (ORNL) Spallation Neutron Source and High Flux Isotope Reactor (Risner and Blakeman 2013) and the ITER Tokamak (Ibrahim et al. 2011). ADVANTG has been applied to a suite of radiation detection, safeguards, and special nuclear material movement detection test problems (Shaver et al. 2011). ADVANTG has also been used in the prediction of activation rates within light water reactor facilities (Pantelias and Mosher 2013). In these projects, ADVANTG was demonstrated to significantly increase the tally figure of merit (FOM) relative to an analog MCNP simulation. The ADVANTG-generated parameters were also shown to be more effective than manually generated geometry splitting parameters.
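The tally figure of merit used to quantify these gains is the standard Monte Carlo definition, FOM = 1/(R²T), where R is the relative error of the tally and T the computing time; the ratio of FOMs between an ADVANTG-assisted run and the analog run then gives the effective speedup. A minimal sketch of both quantities:

```python
def figure_of_merit(rel_error, cpu_time):
    """Monte Carlo tally figure of merit, FOM = 1 / (R^2 * T): roughly
    constant for a well-behaved run, since R^2 shrinks linearly with T."""
    return 1.0 / (rel_error ** 2 * cpu_time)

def speedup(fom_new, fom_analog):
    """FOM ratio = factor by which CPU time drops at equal statistical error."""
    return fom_new / fom_analog
```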

  13. Validation of fully automated VMAT plan generation for library-based plan-of-the-day cervical cancer radiotherapy

    NARCIS (Netherlands)

    A.W.M. Sharfo (Abdul Wahab M.); S. Breedveld (Sebastiaan); P.W.J. Voet (Peter W.J.); S.T. Heijkoop (Sabrina); J.W.M. Mens (Jan); M.S. Hoogeman (Mischa); B.J.M. Heijmen (Ben)

    2016-01-01

    Purpose: To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods: Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-

  14. Automation of the process of generation of the students insurance, applying RFID and GPRS technologies

    Directory of Open Access Journals (Sweden)

    Nelson Barrera-Lombana

    2013-07-01

    Full Text Available This article describes the design and implementation of a system that performs queries for various parameters against a web server using a GSM modem, Internet-based information exchange systems (ISS) and radio-frequency identification (RFID). The application is validated for use in automating the process of generating the student insurance; the hardware and software, developed by the Research Group in Robotics and Industrial Automation (GIRA) of UPTC, are used as the platform.

  15. Automated problem generation in Learning Management Systems: a tutorial

    Directory of Open Access Journals (Sweden)

    Jaime Romero

    2016-07-01

    Full Text Available The benefits of solving problems have been widely acknowledged in the literature. Implementing problem solving in e-learning platforms can ease both its management and the learning process itself. However, implementation can also become a very time-consuming task, particularly when the number of problems to generate is high. In this tutorial we describe a methodology that we have developed to alleviate the workload of producing a large number of problems in Moodle for an undergraduate business course. The methodology follows a six-step process, allows the evaluation of students' problem-solving skills, minimizes plagiarism and provides immediate feedback. We hope this tutorial encourages other educators to apply the six-step process, so that they and their students benefit from its advantages.
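The core of such a methodology, generating many numerically distinct but pedagogically equivalent problems, can be sketched as template instantiation: sample parameters from declared ranges, render the statement, and compute the answer per variant. The template, parameter names and ranges below are hypothetical, not taken from the tutorial.

```python
import random

def make_variants(template, param_ranges, answer_fn, n, seed=0):
    """Generate `n` problem variants: sample each parameter from its range,
    render the statement from the template and compute the answer, so every
    student receives a different but equivalent problem."""
    rng = random.Random(seed)  # fixed seed keeps the batch reproducible
    variants = []
    for _ in range(n):
        params = {name: rng.randint(lo, hi)
                  for name, (lo, hi) in param_ranges.items()}
        variants.append({"text": template.format(**params),
                         "answer": answer_fn(**params)})
    return variants
```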

  16. An automated algorithm for the generation of dynamically reconstructed trajectories

    Science.gov (United States)

    Komalapriya, C.; Romano, M. C.; Thiel, M.; Marwan, N.; Kurths, J.; Kiss, I. Z.; Hudson, J. L.

    2010-03-01

    The lack of long enough data sets is a major problem in the study of many real world systems. As it has been recently shown [C. Komalapriya, M. Thiel, M. C. Romano, N. Marwan, U. Schwarz, and J. Kurths, Phys. Rev. E 78, 066217 (2008)], this problem can be overcome in the case of ergodic systems if an ensemble of short trajectories is available, from which dynamically reconstructed trajectories can be generated. However, this method has some disadvantages which hinder its applicability, such as the need for estimation of optimal parameters. Here, we propose a substantially improved algorithm that overcomes the problems encountered by the former one, allowing its automatic application. Furthermore, we show that the new algorithm not only reproduces the short term but also the long term dynamics of the system under study, in contrast to the former algorithm. To exemplify the potential of the new algorithm, we apply it to experimental data from electrochemical oscillators and also to analyze the well-known problem of transient chaotic trajectories.
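The reconstruction idea, concatenating members of an ensemble of short trajectories into one long surrogate, can be sketched for a scalar time series: keep extending with a segment whose first point lies within a recurrence threshold eps of the current end point. This toy version picks the closest match deterministically; the published algorithm's segment selection and parameter handling are more sophisticated.

```python
def reconstruct(segments, length, eps=0.5):
    """Build a long surrogate trajectory from short segments: repeatedly
    append the segment whose starting point best matches the current end."""
    traj = list(segments[0])
    while len(traj) < length:
        end = traj[-1]
        candidates = [s for s in segments
                      if len(s) > 1 and abs(s[0] - end) <= eps]
        if not candidates:
            break  # no segment continues from here
        best = min(candidates, key=lambda s: abs(s[0] - end))
        traj.extend(best[1:])  # drop the matched first point
    return traj[:length]
```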

  17. Structured automated code checking through structural components and systems engineering

    NARCIS (Netherlands)

    Coenders, J.L.; Rolvink, A.

    2014-01-01

    This paper presents a proposal to employ the design computing methodology proposed as StructuralComponents (Rolvink et al [6] and van de Weerd et al [7]) as a method to perform a digital verification process to fulfil the requirements related to structural design and engineering as part of a buildin

  18. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    Science.gov (United States)

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. In this context, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Among the non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation.

  19. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    Directory of Open Access Journals (Sweden)

    Javier Juan-Albarracín

    Full Text Available Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. In this context, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Among the non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation.
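As a flavour of the non-structured baselines evaluated, here is a minimal 1-D K-means over voxel intensities. This is a pure-Python sketch only: the paper clusters multiparametric MR feature vectors and additionally evaluates Fuzzy K-means, GMM and the structured GHMRF.

```python
def kmeans_1d(values, k, iters=50):
    """Cluster scalar intensities into `k` classes with plain K-means:
    assign each value to its nearest centroid, then recompute centroids."""
    lo, hi = min(values), max(values)
    # Initialise centroids evenly across the intensity range
    centroids = [lo + (i + 0.5) * (hi - lo) / k for i in range(k)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: abs(v - centroids[c]))
                  for v in values]
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return centroids, labels
```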

  20. M073: Monte Carlo generated spectra for QA/QC of automated NAA routine

    Energy Technology Data Exchange (ETDEWEB)

    Jackman, K. R. (Kevin R.); Biegalski, S. R.

    2004-01-01

    A quality check for an automated system of analyzing large sets of neutron activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra are also visually reviewed to check the peak fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8, pulse height, tally with a detector model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix that was similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to expected results for QA/QC assurance.
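The SRM-based QA/QC comparison can be sketched as a k-sigma consistency test between measured and certified concentrations, combining both uncertainties in quadrature. The element names and values in the example are made up for illustration; the actual routine compares full spectra and peak fits.

```python
def qc_check(measured, certified, k=3.0):
    """Flag elements whose measured concentration differs from the
    certified value by more than k combined standard uncertainties.
    Both inputs map element -> (value, uncertainty)."""
    failures = []
    for elem, (val, unc) in measured.items():
        ref_val, ref_unc = certified[elem]
        combined = (unc ** 2 + ref_unc ** 2) ** 0.5
        if abs(val - ref_val) > k * combined:
            failures.append(elem)
    return failures
```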

  1. Semi-Automated Discovery of Application Session Structure

    Energy Technology Data Exchange (ETDEWEB)

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
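The Poisson-process grounding can be sketched as follows: if connections within a session arrive at rate λ, inter-arrival gaps are exponentially distributed, so a gap longer than the p-quantile −ln(1−p)/λ is unlikely to be intra-session and can be taken to start a new session. This is a one-host, single-application simplification of the paper's semi-automated mining, offered only to make the statistical idea concrete.

```python
import math

def session_threshold(arrival_times, p=0.99):
    """Cutoff gap: the p-quantile of an exponential fit to the observed
    inter-arrival times (Poisson-arrival assumption)."""
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    rate = len(gaps) / sum(gaps)
    return -math.log(1.0 - p) / rate

def split_sessions(arrival_times, cutoff):
    """Group connection start times into sessions: any gap above `cutoff`
    begins a new session."""
    sessions = [[arrival_times[0]]]
    for prev, t in zip(arrival_times, arrival_times[1:]):
        if t - prev > cutoff:
            sessions.append([])
        sessions[-1].append(t)
    return sessions
```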

  2. Automated Euler and Navier-Stokes Database Generation for a Glide-Back Booster

    Science.gov (United States)

    Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Mike J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejnil, Edward

    2004-01-01

    The past two decades have seen a sustained increase in the use of high fidelity Computational Fluid Dynamics (CFD) in basic research, aircraft design, and the analysis of post-design issues. As the fidelity of a CFD method increases, the number of cases that can be readily and affordably computed greatly diminishes. However, computer speeds now exceed 2 GHz, hundreds of processors are currently available and more affordable, and advances in parallel CFD algorithms scale more readily with large numbers of processors. All of these factors make it feasible to compute thousands of high fidelity cases. However, there still remains the overwhelming task of monitoring the solution process. This paper presents an approach to automate the CFD solution process. A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment, the NASA Information Power Grid (IPG), using 13 computers located at 4 different geographical sites. Process automation and web-based access to a MySQL database greatly reduce the user workload, removing much of the tedium and tendency for user input errors. The AeroDB framework is shown. The user submits/deletes jobs, monitors AeroDB's progress, and retrieves data and plots via a web portal. Once a job is in the database, a job launcher uses an IPG resource broker to decide which computers are best suited to run the job. Job/code requirements, the number of CPUs free on a remote system, and queue lengths are some of the parameters the broker takes into account. The Globus software provides secure services for user authentication, remote shell execution, and secure file transfers over an open network. AeroDB automatically decides when a job is completed. Currently, the Cart3D unstructured flow solver is used for the Euler equations, and the Overflow structured overset flow solver is used for the Navier-Stokes equations.

  3. CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.

    Science.gov (United States)

    Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali

    2016-01-13

    Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and the R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently. CANEapp performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app .

  4. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    ...with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach, as implemented in the PetriCode tool, to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite, resulting in 97% and 99% success rates for the client and server implementations, respectively. The tests show...

  5. Automated importance generation and biasing techniques for Monte Carlo shielding techniques by the TRIPOLI-3 code

    Energy Technology Data Exchange (ETDEWEB)

    Both, J.P.; Nimal, J.C.; Vergnaud, T. (CEA Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France). Service d' Etudes des Reacteurs et de Mathematiques Appliquees)

    1990-01-01

    We discuss an automated biasing procedure for generating the parameters necessary to achieve efficient biased Monte Carlo shielding calculations. The biasing techniques considered here are the exponential transform and collision biasing, derived from the concept of a biased game based on the importance function. We use a simple model of the importance function with exponential attenuation as the distance to the detector increases. This importance function is generated on a three-dimensional mesh covering the geometry, using graph-theory algorithms. This scheme is currently being implemented in the third version of the neutron and gamma-ray transport code TRIPOLI-3. (author).

  6. Automated test data generation for branch testing using incremental genetic algorithm

    Indian Academy of Sciences (India)

    T MANIKUMAR; A JOHN SANJEEV KUMAR; R MARUTHAMUTHU

    2016-09-01

    The cost of software testing can be reduced by automated test data generation that finds a minimal set of data with maximum coverage. Search-based software testing (SBST) is one of the techniques recently used for automated testing tasks. SBST makes use of a control flow graph (CFG) and meta-heuristic search algorithms to accomplish the process. This paper focuses on test data generation for branch coverage. A major drawback in using meta-heuristic techniques is that the CFG paths have to be traversed from the start node to the end node for each automated test datum. Such traversal can be improved by branch ordering together with elitism, but the population size and the number of iterations must still be kept unchanged to keep all the branches alive. In this paper, we present an incremental genetic algorithm (IGA) for branch coverage testing. Initially, a classical genetic algorithm (GA) is used to construct a population with the best parents for each branch node, and the IGA is then started with these parents as the initial population. Hence, it is not necessary to maintain a huge population size and a large number of iterations to cover all the branches. The performance is analyzed with five benchmark programs from the literature. The experimental results indicate that the proposed IGA search technique outperforms other meta-heuristic search techniques in terms of memory usage and scalability.
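
    The kind of search such techniques perform can be sketched with a toy genetic algorithm that minimises a hand-written branch-distance function for a hypothetical predicate (an illustration of fitness-guided branch coverage, not the paper's IGA):

```python
import random

def branch_distance(x, y):
    """Branch distance for the hypothetical predicate (x == 42 and y > x):
    0 means the target branch is taken; larger values are further away."""
    return abs(x - 42) + max(0, x - y + 1)

def genetic_search(pop_size=40, generations=500, seed=1):
    """Tiny GA over integer pairs: elitist selection, crossover that
    takes x from one parent and y from another, small integer mutation,
    minimising branch distance until the target branch is covered."""
    rng = random.Random(seed)
    pop = [(rng.randint(-50, 50), rng.randint(-50, 50))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: branch_distance(*ind))
        if branch_distance(*pop[0]) == 0:
            return pop[0]                        # branch covered
        parents = pop[:pop_size // 2]            # elitism: keep best half
        children = []
        while len(children) < pop_size - len(parents):
            (x1, _), (_, y2) = rng.sample(parents, 2)  # crossover
            x = x1 + rng.randint(-3, 3)                # mutation
            y = y2 + rng.randint(-3, 3)
            children.append((x, y))
        pop = parents + children
    return None

print(genetic_search())
```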

  7. Integral Compressor/Generator/Fan Unitary Structure

    OpenAIRE

    Dreiman, Nelik

    2016-01-01

    INTEGRAL COMPRESSOR / GENERATOR / FAN UNITARY STRUCTURE. Dr. Nelik Dreiman, Consultant, P.O. Box 144, Tipton, MI. An extremely compact, and therefore space-saving, single compressor/generator/cooling-fan structure of short axial length and light weight has been developed to provide generation of electrical power with simultaneous operation of the compressor when line power is unavailable, or to function as a regular AC compressor powered by a power line. The generators and ai...

  8. International Normalized Ratio (INR), coagulation factor activities and calibrated automated thrombin generation - influence of 24 h storage at ambient temperature

    DEFF Research Database (Denmark)

    Christensen, T D; Jensen, C; Larsen, T B

    2010-01-01

    International Normalized Ratio (INR) measurements are used to monitor oral anticoagulation therapy with coumarins. Single coagulation factor activities and calibrated automated thrombin (CAT) generation are considered as more advanced methods for evaluating overall haemostatic capacity. The aims...

  9. Automated Monte Carlo biasing for photon-generated electrons near surfaces.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Crawford, Martin James; Kensek, Ronald Patrick

    2009-09-01

    This report describes efforts to automate the biasing of coupled electron-photon Monte Carlo particle transport calculations. The approach was based on weight-window biasing. Weight-window settings were determined using adjoint-flux Monte Carlo calculations. A variety of algorithms were investigated for adaptivity of the Monte Carlo tallies. Tree data structures were used to investigate spatial partitioning. Functional-expansion tallies were used to investigate higher-order spatial representations.
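
    Weight-window biasing itself follows a standard split/roulette rule, sketched below with illustrative window bounds (not the report's actual settings): particles above the window are split into comparable copies, particles below it play Russian roulette, and total weight is conserved (exactly for splitting, on average for roulette).

```python
import random

def apply_weight_window(weight, w_low, w_high, rng=random):
    """Weight-window check for one particle: split above the window,
    Russian roulette below it, pass through inside it.  Returns the
    list of surviving particle weights."""
    if weight > w_high:                 # split into n copies inside the window
        n = int(weight / w_high) + 1
        return [weight / n] * n
    if weight < w_low:                  # Russian roulette
        survival_weight = w_low         # survivors restored to the window edge
        if rng.random() < weight / survival_weight:
            return [survival_weight]
        return []                       # particle killed
    return [weight]

print(apply_weight_window(5.0, 0.5, 2.0))  # split into 3 copies of weight 5/3
print(apply_weight_window(1.0, 0.5, 2.0))  # inside the window: unchanged
```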

  10. Computer vision: automating DEM generation of active lava flows and domes from photos

    Science.gov (United States)

    James, M. R.; Varley, N. R.; Tuffen, H.

    2012-12-01

    Accurate digital elevation models (DEMs) form fundamental data for assessing many volcanic processes. We present a photo-based approach developed within the computer vision community to produce DEMs from a consumer-grade digital camera and freely available software. Two case studies, based on the Volcán de Colima lava dome and the Puyehue Cordón-Caulle obsidian flow, highlight the advantages of the technique in terms of the minimal expertise required, the speed of data acquisition and the automated processing involved. The reconstruction procedure combines structure-from-motion and multi-view stereo algorithms (SfM-MVS) and can generate dense 3D point clouds (millions of points) from multiple photographs of a scene taken from different positions. Processing is carried out by automated software (e.g. http://blog.neonascent.net/archives/bundler-photogrammetry-package/). SfM-MVS reconstructions are initially un-scaled and un-oriented, so additional geo-referencing software has been developed. Although this step requires the presence of some control points, the SfM-MVS approach has significantly easier image acquisition and control requirements than traditional photogrammetry, facilitating its use in a broad range of difficult environments. At Colima, the lava dome surface was reconstructed from recent and archive images taken during light-aircraft overflights (2007-2011). Scaling and geo-referencing were carried out using features identified in web-sourced ortho-imagery obtained as a basemap layer in ArcMap; no ground-based measurements were required. Average surface measurement densities are typically 10-40 points per m². Over mean viewing distances of ~500-2500 m (for different surveys), the RMS error on the control features is ~1.5 m. The derived DEMs (with 1-m grid resolution) are sufficient to quantify volumetric change, as well as to highlight the structural evolution of the upper surface of the dome following an explosion in June 2011. At Puyehue Cordón-Caulle...
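
    The scaling part of geo-referencing an un-scaled reconstruction can be illustrated minimally (hypothetical control coordinates; a full geo-referencing solution also recovers rotation and translation): the scale factor follows from the ratio of control-point separations measured in the two frames.

```python
import math

def scale_from_control(model_pts, world_pts):
    """Estimate the scale of an un-scaled SfM reconstruction from
    control points, as the ratio of mean pairwise distances in the
    geo-referenced (world) frame to those in the model frame."""
    def mean_pairwise_distance(pts):
        n, total, count = len(pts), 0.0, 0
        for i in range(n):
            for j in range(i + 1, n):
                total += math.dist(pts[i], pts[j])
                count += 1
        return total / count
    return mean_pairwise_distance(world_pts) / mean_pairwise_distance(model_pts)

# Hypothetical control features: model frame vs. ortho-imagery coordinates
model = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
world = [(100.0, 100.0, 0.0), (150.0, 100.0, 0.0), (100.0, 200.0, 0.0)]
print(scale_from_control(model, world))  # ≈ 50.0: world frame = model × 50
```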

  11. Automated effective band structures for defective and mismatched supercells

    Science.gov (United States)

    Brommer, Peter; Quigley, David

    2014-12-01

    In plane-wave density functional theory codes, defects and incommensurate structures are usually represented in supercells. However, interpretation of E versus k band structures is most effective within the primitive cell, where comparison to ideal structures and spectroscopy experiments is most natural. Popescu and Zunger recently described a method to derive effective band structures (EBS) from supercell calculations in the context of random alloys. In this paper, we present bs_sc2pc, an implementation of this method in the CASTEP code, which generates an EBS using the structural data of the supercell and the underlying primitive cell, with symmetry considerations handled automatically. We demonstrate the functionality of our implementation in three test cases illustrating the efficacy of this scheme for capturing the effect of vacancies, substitutions and lattice mismatch on effective primitive cell band structures.

  12. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature, and these routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools, which allow building original software to aid various engineering activities. In this paper, original software developed to automate engineering tasks at the stage of a product's geometrical shape design is presented. The software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears; moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in standard CAD system tools. This stems from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle, respectively. In the Generator module the involute curve is drawn through 11 points located on and above the base and addendum circles, and therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing gear wheel modelling time to several seconds. During the conducted research an analysis of the differences between the standard 3-point and 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
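
    The multi-point involute sampling described above can be sketched from the standard parametric involute equations (the radii and point count below are illustrative, not the module's defaults):

```python
import math

def involute_points(base_radius, tip_radius, n_points=11):
    """Sample n_points along the involute of a base circle, from the
    base circle out to the tip (addendum) radius, using the parametric
    form x = rb(cos t + t sin t), y = rb(sin t - t cos t)."""
    # roll angle at which the involute reaches the tip radius
    t_max = math.sqrt((tip_radius / base_radius) ** 2 - 1.0)
    pts = []
    for k in range(n_points):
        t = t_max * k / (n_points - 1)
        x = base_radius * (math.cos(t) + t * math.sin(t))
        y = base_radius * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts

pts = involute_points(base_radius=40.0, tip_radius=44.0)
radii = [math.hypot(x, y) for x, y in pts]
print(round(radii[0], 3), round(radii[-1], 3))  # → 40.0 44.0
```

    The first and last sampled points lie exactly on the base and addendum circles, since the radius along an involute is rb·sqrt(1 + t²).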

  13. Uniform generation of combinatorial structures

    Energy Technology Data Exchange (ETDEWEB)

    Zito, M.; Pu, I.; Amos, M.; Gibbons, A. [Univ. of Warwick, Coventry (United Kingdom)

    1996-12-31

    We describe several RNC algorithms for generating graphs and subgraphs uniformly at random. For example, unlabelled undirected graphs are generated in O(log^3 n lg lg n) time using O(εn^1.5/lg^3 n lg lg n) processors if their number is known in advance, and in O(lg n) time using O(εn^2/lg n) processors otherwise. In both cases the error probability is the inverse of a polynomial in ε. Thus ε may be chosen to trade off processors for error probability. Also, for an arbitrary graph, we describe RNC algorithms for the uniform generation of its subgraphs that are either non-simple paths or spanning trees. The work measure for the subgraph algorithms is essentially determined by the transitive-closure bottleneck. As for sequential algorithms, the general notion of constructing generators from counters also applies to parallel algorithms, although this approach is not employed by all the algorithms of this paper.
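
    For the labelled (rather than unlabelled) case, uniform generation is straightforward and can be sketched sequentially (a stand-in illustration, not the paper's parallel RNC algorithms): including each possible edge independently with probability 1/2 makes every labelled graph equally likely.

```python
import random

def uniform_labelled_graph(n, rng=random.Random(42)):
    """Draw a labelled undirected graph on n vertices uniformly at
    random: each of the C(n,2) possible edges is included independently
    with probability 1/2, so all 2^C(n,2) graphs are equally likely.
    (Uniform generation of *unlabelled* graphs is harder, since it must
    account for isomorphism classes.)"""
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < 0.5]

g = uniform_labelled_graph(5)
print(g)  # an edge list over vertices 0..4, at most C(5,2) = 10 edges
```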

  14. Automated web service composition supporting conditional branch structures

    Science.gov (United States)

    Wang, Pengwei; Ding, Zhijun; Jiang, Changjun; Zhou, Mengchu

    2014-01-01

    The creation of value-added services by automatic composition of existing ones is gaining significant momentum as the potential silver bullet in service-oriented architecture. However, service composition faces two difficulties. First, users' needs present such characteristics as diversity, uncertainty and personalisation; second, the existing services run in a real-world environment that is highly complex and dynamically changing. These difficulties may cause the emergence of nondeterministic choices in the process of service composition, which goes beyond what existing automated service composition techniques can handle. In most existing methods, the process model of a composite service includes sequence constructs only. This article presents a method to introduce conditional branch structures into the process model of a composite service when needed, in order to satisfy users' diverse and personalised needs and to adapt to the dynamic changes of the real-world environment. UML activity diagrams are used to represent dependencies in the composite service. Two types of user preferences, ignored by previous work, are considered in this article, and a simple programming-language-style expression is adopted to describe them. Two different algorithms are presented to deal with different situations. A real-life case is provided to illustrate the proposed concepts and methods.

  15. ATTEMPTS TO AUTOMATE THE PROCESS OF GENERATION OF ORTHOIMAGES OF OBJECTS OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    J. S. Markiewicz

    2015-02-01

    At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. The orthoimage is a cartometric form of photographic presentation of information in a two-dimensional reference system. The paper discusses the automation of orthoimage generation based on TLS data and digital images. At present, attempts are made to apply modern technologies not only for the needs of surveys, but also during data processing. This paper presents attempts at utilising appropriate algorithms and the author's application for automatic generation of the projection plane, for the needs of acquiring intensity orthoimages from TLS data. Such planes are defined manually in the majority of popular TLS data processing applications. A separate issue related to RGB image generation is the orientation of digital images in relation to scans; this is important in particular when scans and photographs are not taken simultaneously. The paper presents experiments concerning the utilisation of the SIFT algorithm for automatic matching of the intensity orthoimages and digital (RGB) photographs. Satisfactory results have been obtained, both for the automation process and for the quality of the resulting orthoimages.

  16. Vacuum pressure generation via microfabricated converging-diverging nozzles for operation of automated pneumatic logic.

    Science.gov (United States)

    Christoforidis, Theodore; Werner, Erik M; Hui, Elliot E; Eddington, David T

    2016-08-01

    Microfluidic devices with integrated pneumatic logic enable automated fluid handling without requiring external control instruments. These chips offer the additional advantage that they may be powered by vacuum and do not require an electricity source. This work describes a microfluidic converging-diverging (CD) nozzle optimized to generate vacuum at low input pressures, making it suitable for microfluidic applications including powering integrated pneumatic logic. It was found that efficient vacuum pressure was generated for high aspect ratios of the CD nozzle constriction (throat) width to height and a diverging angle of 3.6°. Specifically, for an inlet pressure of 42.2 psia (290.8 kPa) and a volumetric flow rate of approximately 1700 sccm, a vacuum pressure of 8.03 psia (55.3 kPa) was generated. To demonstrate the capabilities of our converging-diverging nozzle device, we connected it to a vacuum-powered peristaltic pump driven by integrated pneumatic logic and obtained tunable flow rates from 0 to 130 μL/min. Finally, we demonstrate a proof-of-concept system for use where electricity and vacuum pressure are not readily available, by powering a CD nozzle with a bicycle tire pump and pressure regulator. This system is able to produce a stable vacuum sufficient to drive pneumatic logic, and could be applied to power automated microfluidics in limited-resource settings.

  17. Aptaligner: automated software for aligning pseudorandom DNA X-aptamers from next-generation sequencing data.

    Science.gov (United States)

    Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T; Volk, David E

    2014-06-10

    Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provides this feature as well as length error and noise level cutoff features, is parallelized to run on multiple central processing units (cores), and sorts sequences from a single chip into projects and subprojects.

  18. Automation of labelling of Lipiodol with high-activity generator-produced 188Re.

    Science.gov (United States)

    Lepareur, Nicolas; Ardisson, Valérie; Noiret, Nicolas; Boucher, Eveline; Raoul, Jean-Luc; Clément, Bruno; Garin, Etienne

    2011-02-01

    This work describes optimisation of the kit formulation for labelling of Lipiodol with high-activity generator-produced rhenium-188. Radiochemical purity (RCP) was 92.52±2.3% and extraction yield was 98.56±1.2%. The synthesis has been automated with a TADDEO module (Comecer) giving a mean final yield of 52.68±9.6%, and reducing radiation burden to the radiochemist by 80%. Radiolabelled Lipiodol ((188)Re-SSS/Lipiodol) is stable for at least 7 days (RCP=91.07±0.9%).

  19. Automation of labelling of Lipiodol with high-activity generator-produced (188)Re

    Energy Technology Data Exchange (ETDEWEB)

    Lepareur, Nicolas, E-mail: n.lepareur@rennes.fnclcc.f [Service de Medecine Nucleaire, Centre Regional de Lutte Contre le Cancer Eugene Marquis, CS 44229, 35042 Rennes (France); INSERM U-991, Foie, Metabolismes et Cancer, 35033 Rennes (France); Universite Europeenne de Bretagne, Rennes (France); Ardisson, Valerie [Service de Medecine Nucleaire, Centre Regional de Lutte Contre le Cancer Eugene Marquis, CS 44229, 35042 Rennes (France); INSERM U-991, Foie, Metabolismes et Cancer, 35033 Rennes (France); Universite Europeenne de Bretagne, Rennes (France); Noiret, Nicolas [Universite Europeenne de Bretagne, Rennes (France); Ecole Nationale Superieure de Chimie de Rennes, UMR CNRS 6226, Chimie Organique et Supramoleculaire, Avenue du General Leclerc, CS 50837, 35708 Rennes Cedex 7 (France); Boucher, Eveline; Raoul, Jean-Luc [INSERM U-991, Foie, Metabolismes et Cancer, 35033 Rennes (France); Universite Europeenne de Bretagne, Rennes (France); Service d' Oncologie Digestive, Centre Regional de Lutte Contre le Cancer Eugene Marquis, CS 44229, 35042 Rennes (France); Clement, Bruno [INSERM U-991, Foie, Metabolismes et Cancer, 35033 Rennes (France); Garin, Etienne [Service de Medecine Nucleaire, Centre Regional de Lutte Contre le Cancer Eugene Marquis, CS 44229, 35042 Rennes (France); INSERM U-991, Foie, Metabolismes et Cancer, 35033 Rennes (France); Universite Europeenne de Bretagne, Rennes (France)

    2011-02-15

    This work describes optimisation of the kit formulation for labelling of Lipiodol with high-activity generator-produced rhenium-188. Radiochemical purity (RCP) was 92.52±2.3% and extraction yield was 98.56±1.2%. The synthesis has been automated with a TADDEO module (Comecer) giving a mean final yield of 52.68±9.6%, and reducing radiation burden to the radiochemist by 80%. Radiolabelled Lipiodol ((188)Re-SSS/Lipiodol) is stable for at least 7 days (RCP=91.07±0.9%).

  20. Substitute Valuations: Generation and Structure

    CERN Document Server

    Hajek, Bruce

    2007-01-01

    Substitute valuations (in some contexts called gross substitute valuations) are prominent in combinatorial auction theory. An algorithm is given in this paper for generating a substitute valuation through Monte Carlo simulation. In addition, the geometry of the set of all substitute valuations for a fixed number of goods K is investigated. The set consists of a union of polyhedrons, and the maximal polyhedrons are identified for K=4. It is shown that the maximum dimension of the maximal polyhedrons increases with K nearly as fast as 2^K.

  1. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.
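
    The weighted-consensus idea can be illustrated with a toy per-base combiner (hypothetical weights and intervals; EVM's actual consensus operates on gene-structure elements such as exons and introns, not raw bases): each evidence source votes for the bases it supports with its configured weight, and sufficiently supported bases enter the consensus.

```python
def weighted_consensus(predictions, length, threshold):
    """Toy weighted-consensus combiner: each evidence source votes for
    the bases it covers with its weight, and bases whose summed weight
    reaches `threshold` enter the consensus structure."""
    support = [0.0] * length
    for start, end, weight in predictions:   # half-open interval [start, end)
        for pos in range(start, end):
            support[pos] += weight
    return [pos for pos in range(length) if support[pos] >= threshold]

# Hypothetical evidence: two ab initio predictors and one protein alignment
evidence = [(10, 30, 1.0), (12, 28, 1.0), (15, 40, 5.0)]
consensus = weighted_consensus(evidence, length=50, threshold=6.0)
print(consensus[0], consensus[-1])  # → 15 29
```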

  2. Automated construction of lightweight, simple, field-erected structures

    Science.gov (United States)

    Leonard, R. S.

    1980-01-01

    The feasibility of automating construction processes, which could result in mobile construction robots, is examined. The construction of a large photovoltaic power plant with a peak power output of 100 MW is used as a demonstration. The reasons to automate the construction process, a conventional construction scenario as the reference for evaluation, and a list of potential cost benefits of using robots are presented. The technical feasibility of using robots to construct SPS ground stations is addressed.

  3. Next generation structural silicone glazing

    Directory of Open Access Journals (Sweden)

    Charles D. Clift

    2015-06-01

    This paper presents an advanced engineering evaluation, using nonlinear analysis of hyperelastic material, that provides significant improvement to structural silicone glazing (SSG) design in high-performance curtain wall systems. Very high cladding wind pressures required in hurricane zones often result in bulky SSG profile dimensions. Architectural desire for aesthetically slender curtain wall framing sight-lines, in combination with a desire to reduce aluminium usage, led to optimization of silicone material geometry for better stress distribution. To accomplish accurate simulation of predicted behaviour under structural load, robust stress-strain curves of the silicone material are essential. The silicone manufacturer provided physical property testing via a specialized laboratory protocol. A series of rigorous curve-fit techniques were then applied to closely model the test data in the finite element computer analysis, which accounts for the nonlinear strain of hyperelastic silicone. Comparison of this advanced design technique to traditional SSG design highlights differences in stress distribution contours in the silicone material. Simplified structural engineering per the traditional SSG design method does not provide accurate forecasting of material and stress optimization as shown in the advanced design. Full-scale specimens subject to structural load testing were used to verify the design capacity, not only for high wind pressure values, but also for debris impact per ASTM E1886 and ASTM E1996. Also, construction of the test specimens allowed development of SSG installation techniques necessitated by the unique geometry of the silicone profile. Finally, correlation of physical test results with theoretical simulations is made, so evaluation of design confidence is possible. This design technique will introduce significant engineering advancement to the curtain wall industry.

  4. Automated system for generation of soil moisture products for agricultural drought assessment

    Science.gov (United States)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are being used globally for forecasting / early warning of drought and for monitoring drought prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources like space-based data, ground data and collateral data in short intervals of time, where there may be limitations in terms of processing power, availability of domain expertise, and specialized models & tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model is widely used to arrive at soil moisture products, being popular for its sensitivity to soil conditions and rainfall parameters. This model has been encoded into a "Fish-Bone" architecture using COM technologies and open-source libraries for best possible automation, to fulfil the need for a standard procedure for preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for generation of soil moisture products, allowing users to concentrate on further enhancements and implementation of these parameters in related areas of research without re-discovering the established models. The emphasis of the architecture is mainly on available open-source libraries for GIS and raster I/O operations for different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for everyday generation of products at specified intervals. The operational software has inbuilt capabilities to automatically...
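
    A bucket-type soil water balance can be sketched minimally (a generic one-layer bucket with illustrative parameters, not the system's operational model): each day the store gains rainfall, loses evapotranspiration scaled by relative wetness, and spills anything above the soil's water-holding capacity.

```python
def bucket_soil_moisture(rainfall, pet, capacity=150.0, initial=75.0):
    """One-bucket soil water balance: daily soil moisture (mm) from
    daily rainfall and potential evapotranspiration (PET), with actual
    ET proportional to relative wetness and the store bounded by the
    soil's water-holding capacity."""
    store, series = initial, []
    for rain, et_demand in zip(rainfall, pet):
        store += rain
        store -= et_demand * (store / capacity)  # actual ET ∝ wetness
        store = min(max(store, 0.0), capacity)   # bounded bucket
        series.append(store)
    return series

# Hypothetical daily rainfall and potential ET (mm)
rain = [0, 20, 0, 0, 35, 0, 0]
pet = [5, 5, 6, 6, 4, 5, 5]
print([round(s, 1) for s in bucket_soil_moisture(rain, pet)])
```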

  5. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved, such as the difficulty of checking safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as an intermediate model (IM) to transform PLC programs written in the ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  6. Automated Generation of Digital Terrain Model using Point Clouds of Digital Surface Model in Forest Area

    Directory of Open Access Journals (Sweden)

    Yoshikazu Kamiya

    2011-04-01

    Full Text Available At present, most digital data acquisition methods generate a Digital Surface Model (DSM) and not a Digital Elevation Model (DEM). Conversion from DSM to DEM still has some drawbacks, especially the removal of off-terrain point clouds and the subsequent generation of the DEM within these spaces, even though the methods are automated. In this paper we intended to overcome this issue by attempting to project off-terrain point clouds onto the terrain in forest areas using Artificial Neural Networks (ANN) instead of removing them and then filling the gaps by interpolation. Five sites were tested and accuracies assessed. They all give almost the same results. In conclusion, the ANN has the ability to obtain the DEM by projecting the DSM point clouds, and greater DEM accuracies were obtained. If the hollow areas resulting from the removal of DSM point clouds are larger, the accuracies are reduced.

  7. Automated Design and Analysis Tool for CEV Structural and TPS Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  8. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This developed...

  9. The structure and functions of an automated project management system for the centers of scientific and technical creativity of students

    OpenAIRE

    2013-01-01

    This article discusses the possibility of automating students' project work through the use of an automated project management system. The purpose, structure and formalism of the automated workplace of the student-designer (AWSD) are described, and its structural-functional diagram is shown.

  10. Automating methods to improve precision in Monte-Carlo event generation for particle colliders

    Energy Technology Data Exchange (ETDEWEB)

    Gleisberg, Tanju

    2008-07-01

    The subject of this thesis was the development of tools for the automated calculation of exact matrix elements, which are a key to the systematic improvement of precision and confidence in theoretical predictions. Part I of this thesis concentrates on the calculation of cross sections at tree level. A number of extensions have been implemented in the matrix element generator AMEGIC++, namely new interaction models such as effective loop-induced couplings of the Higgs boson with massless gauge bosons, required for a number of channels in the Higgs boson search at the LHC, and anomalous gauge couplings, parameterizing a number of models beyond the SM. Furthermore, a special treatment of complicated decay chains of heavy particles has been constructed. A significant effort went into the implementation of methods to push the limits on particle multiplicities. Two recursive methods have been implemented, the Cachazo-Svrcek-Witten recursion and the colour-dressed Berends-Giele recursion. For the latter, the new module COMIX has been added to the SHERPA framework. The Monte Carlo phase space integration techniques have been completely revised, which led to significantly reduced statistical error estimates when calculating cross sections and a greatly improved unweighting efficiency for the event generation. Special integration methods have been developed to cope with the newly accessible final states. The event generation framework SHERPA directly benefits from these new developments, improving its precision and efficiency. Part II addressed the automation of QCD calculations at next-to-leading order. A code has been developed that, for the first time, fully automates the real-correction part of an NLO calculation. To calculate the correction for an m-parton process obeying the Catani-Seymour dipole subtraction method, the following components are provided: 1. the corresponding m+1-parton tree-level matrix elements, 2. a number of dipole subtraction terms to remove

  11. Structured grid generator on parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Muramatsu, Kazuhiro; Murakami, Hiroyuki; Higashida, Akihiro; Yanagisawa, Ichiro

    1997-03-01

    A general-purpose structured grid generator for parallel computers, which generates large-scale structured grids efficiently, has been developed. The generator is applicable to Cartesian, cylindrical and BFC (Boundary-Fitted Curvilinear) coordinates. In the case of BFC grids, there are three adaptable topologies: L-type, O-type and multi-block type, the last of which enables any combination of L- and O-grids. Internal BFC grid points can be automatically generated and smoothed by either an algebraic supplemental method or a partial differential equation method. The partial differential equation solver is implemented on parallel computers because it consumes a large portion of the overall execution time. Therefore, high-speed processing of large-scale grid generation can be realized by use of parallel computers. Generated grid data can be adjusted to the domain decomposition for parallel analysis. (author)
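
The algebraic route to internal BFC grid points can be sketched with classical transfinite interpolation, where interior points are blended from the four boundary curves. This is a textbook illustration under assumed inputs, not the generator's actual algorithm:

```python
# Generic transfinite interpolation (Gordon-Hall) for a 2D structured grid:
# interior points are blended from four boundary curves given as (x, y) lists.
# Textbook sketch, not the generator's actual algorithm.

def transfinite_grid(bottom, top, left, right):
    """Return grid[j][i] = (x, y) blending the four boundary point lists."""
    ni, nj = len(bottom), len(left)

    def blend(i, j, k):
        s, t = i / (ni - 1), j / (nj - 1)
        return ((1 - t) * bottom[i][k] + t * top[i][k]
                + (1 - s) * left[j][k] + s * right[j][k]
                - (1 - s) * (1 - t) * bottom[0][k] - s * (1 - t) * bottom[-1][k]
                - (1 - s) * t * top[0][k] - s * t * top[-1][k])

    return [[(blend(i, j, 0), blend(i, j, 1)) for i in range(ni)]
            for j in range(nj)]

# Unit square: the interior point falls at the centre, boundaries are kept.
grid = transfinite_grid([(0, 0), (0.5, 0), (1, 0)], [(0, 1), (0.5, 1), (1, 1)],
                        [(0, 0), (0, 0.5), (0, 1)], [(1, 0), (1, 0.5), (1, 1)])
print(grid[1][1])   # centre of the unit square
```

In a production generator such algebraically seeded points would then be smoothed, e.g. by the elliptic (partial differential equation) method the abstract mentions.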

  12. Automated panning and screening procedure on microplates for antibody generation from phage display libraries.

    Science.gov (United States)

    Turunen, Laura; Takkinen, Kristiina; Söderlund, Hans; Pulli, Timo

    2009-03-01

    Antibody phage display technology is well established and widely used for selecting specific antibodies against desired targets. Using conventional manual methods, it is laborious to perform multiple selections with different antigens simultaneously. Furthermore, manual screening of the positive clones requires much effort. The authors describe optimized and automated procedures for these processes using a magnetic bead processor for the selection and a robotic station for the screening step. Both steps are performed in a 96-well microplate format. In addition, in adapting the antibody phage display technology to the automated platform, polyethylene glycol precipitation of the enriched phage pool proved unnecessary. For screening, an enzyme-linked immunosorbent assay protocol suitable for a robotic station was developed. This system was set up using human gamma-globulin as a model antigen to select antibodies from a VTT naive human single-chain antibody (scFv) library. In total, 161 gamma-globulin-selected clones were screened, and according to fingerprinting analysis, 9 of the 13 analyzed clones were different. The system was further tested using testosterone-bovine serum albumin (BSA) and beta-estradiol-BSA as antigens with the same library. In total, 1536 clones were screened from 4 rounds of selection with both antigens, and 29 different testosterone-BSA and 23 beta-estradiol-BSA binding clones were found and verified by sequencing. This automated antibody phage display procedure increases the throughput of generating wide panels of target-binding antibody candidates and allows the selection and screening of antibodies against several different targets in parallel with high efficiency.

  13. Towards fully automated structure-based function prediction in structural genomics: a case study.

    Science.gov (United States)

    Watson, James D; Sanderson, Steve; Ezersky, Alexandra; Savchenko, Alexei; Edwards, Aled; Orengo, Christine; Joachimiak, Andrzej; Laskowski, Roman A; Thornton, Janet M

    2007-04-13

    As the global Structural Genomics projects have picked up pace, the number of structures annotated in the Protein Data Bank as hypothetical protein or unknown function has grown significantly. A major challenge now involves the development of computational methods to assign functions to these proteins accurately and automatically. As part of the Midwest Center for Structural Genomics (MCSG) we have developed a fully automated functional analysis server, ProFunc, which performs a battery of analyses on a submitted structure. The analyses combine a number of sequence-based and structure-based methods to identify functional clues. After the first stage of the Protein Structure Initiative (PSI), we review the success of the pipeline and the importance of structure-based function prediction. As a dataset, we have chosen all structures solved by the MCSG during the 5 years of the first PSI. Our analysis suggests that two of the structure-based methods are particularly successful and provide examples of local similarity that is difficult to identify using current sequence-based methods. No one method is successful in all cases, so, through the use of a number of complementary sequence and structural approaches, the ProFunc server increases the chances that at least one method will find a significant hit that can help elucidate function. Manual assessment of the results is a time-consuming process and subject to individual interpretation and human error. We present a method based on the Gene Ontology (GO) schema using GO-slims that can allow the automated assessment of hits with a success rate approaching that of expert manual assessment.
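
The GO-slim idea behind the automated assessment can be sketched in miniature: a predicted and a curated term are judged consistent when they share a GO-slim ancestor. The tiny ontology and term names below are invented for illustration and are not real GO identifiers:

```python
# Toy sketch of GO-slim-based assessment: a predicted and a curated GO term
# are judged consistent if they share a GO-slim ancestor. The miniature
# ontology and term names below are invented, not real GO identifiers.

PARENTS = {                                  # child -> parents (tiny DAG)
    "GO:protein_kinase": ["GO:kinase"],
    "GO:kinase": ["GO:catalytic"],
    "GO:catalytic": ["GO:molecular_function"],
    "GO:binding": ["GO:molecular_function"],
}
SLIM = {"GO:catalytic", "GO:binding"}        # the chosen GO-slim subset

def slim_terms(term):
    """All GO-slim ancestors of a term (the term itself included)."""
    hits, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t in SLIM:
            hits.add(t)
        stack.extend(PARENTS.get(t, []))
    return hits

def consistent(predicted, curated):
    return bool(slim_terms(predicted) & slim_terms(curated))

print(consistent("GO:protein_kinase", "GO:kinase"))   # prints True
```

Collapsing both annotations onto a coarse slim vocabulary is what makes the comparison automatable, at the cost of ignoring fine-grained distinctions below the slim level.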

  14. Automating gene library synthesis by structure-based combinatorial protein engineering: examples from plant sesquiterpene synthases.

    Science.gov (United States)

    Dokarry, Melissa; Laurendon, Caroline; O'Maille, Paul E

    2012-01-01

    Structure-based combinatorial protein engineering (SCOPE) is a homology-independent recombination method to create multiple-crossover gene libraries by assembling defined combinations of structural elements ranging from single mutations to domains of protein structure. SCOPE was originally inspired by DNA shuffling, which mimics recombination during meiosis, where mutations from parental genes are "shuffled" to create novel combinations in the resulting progeny. DNA shuffling utilizes sequence identity between parental genes to mediate template-switching events (the annealing and extension of one parental gene fragment on another) in PCR reassembly reactions to generate crossovers and hence recombination between parental genes. In light of the conservation of protein structure and the degeneracy of sequence, SCOPE was developed to enable the "shuffling" of distantly related genes with no requirement for sequence identity. The central principle involves the use of oligonucleotides that encode crossover regions to choreograph template-switching events during PCR assembly of gene fragments to create chimeric genes. This approach was initially developed to create libraries of hybrid DNA polymerases from distantly related parents, and was later extended to create a combinatorial mutant library of sesquiterpene synthases to explore the catalytic landscapes underlying the functional divergence of related enzymes. This chapter presents a simplified protocol of SCOPE that can be integrated with different mutagenesis techniques and is suitable for automation by liquid-handling robots. Two examples are presented to illustrate the application of SCOPE to create gene libraries using plant sesquiterpene synthases as the model system. In the first example, we outline how to create an active-site library as a series of complex mixtures of diverse mutants. In the second example, we outline how to create a focused library as an array of individual clones to distil minimal combinations of

  15. Concurrent combined verification: reducing false positives in automated NMR structure verification through the evaluation of multiple challenge control structures.

    Science.gov (United States)

    Golotvin, Sergey S; Pol, Rostislav; Sasaki, Ryan R; Nikitina, Asya; Keyes, Philip

    2012-06-01

    Automated structure verification using (1)H NMR data or a combination of (1)H and heteronuclear single-quantum correlation (HSQC) data is gaining more interest as a routine application for qualitative evaluation of large compound libraries produced by synthetic chemistry. The goal of this automated software method is to identify a manageable subset of compounds and data that require human review. In practice, the automated method will flag structure and data combinations that exhibit some inconsistency (i.e., improbable chemical shifts, conflicts in multiplicity, or overestimated and underestimated integration values) and validate those that appear consistent. One drawback of this approach is that no automated system can guarantee that all passing structures are indeed correct. The major reason for this is that approaches using only (1)H, or even (1)H and HSQC, spectra often do not provide sufficient information to properly distinguish between similar structures. Therefore, current implementations of automated structure verification systems allow, in principle, false positive results. Presented in this work is a method that greatly reduces the probability of an automated validation system passing incorrect structures (i.e., false positives). This novel method was applied to automatically validate 127 non-proprietary compounds from several commercial sources. Also presented is the impact of this approach on false positive and false negative results.

  16. A Structured Light Scanner for Hyper Flexible Industrial Automation

    DEFF Research Database (Denmark)

    Hansen, Kent; Pedersen, Jeppe; Sølund, Thomas

    2014-01-01

    A current trend in industrial automation implies a need for doing automatic scene understanding, from optical 3D sensors, which in turn imposes a need for a lightweight and reliable 3D optical sensor to be mounted on a collaborative robot e.g., Universal Robot UR5 or Kuka LWR. Here, we empirically...

  17. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    Science.gov (United States)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P 0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  18. Automated COBOL code generation for SNAP-I CAI development and maintenance procedures

    Energy Technology Data Exchange (ETDEWEB)

    Buhrmaster, M.A.; Duncan, L.D.; Hume, R.; Huntley, A.F.

    1988-07-01

    In designing and implementing a computer aided instruction (CAI) prototype for the Navy Management System Support Office (NAVMASSO) as part of the Shipboard Nontactical ADP Program (SNAP), Data Systems Engineering Organization (DSEO) personnel developed techniques for automating the production of COBOL source code for CAI applications. This report discusses the techniques applied, which incorporate the use of a database management system (DBMS) to store, access, and manipulate the data necessary for producing COBOL source code automatically. The objective in developing the code generation techniques is to allow future applications to be produced in an efficient and reliable manner. This report covers the standards and conventions defined, the database tables created, and the host language interface program used for generating COBOL source files. The approach was responsible for producing 85 percent of an 830,000-line COBOL application in approximately one year. The code generation program generated transaction processing routines to be executed under the DM6TP NAVMASSO distributed processing environment on Honeywell DPS-6 minicomputers, representing the standard SNAP-I environment.
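
The database-driven generation idea can be sketched in miniature: field metadata, as it might be stored in DBMS tables, drives a template that emits COBOL source text. The record name, field list and PIC clauses below are invented for illustration, not taken from the NAVMASSO system:

```python
# Miniature sketch of database-driven COBOL generation: field metadata
# (as it might be stored in DBMS tables) drives a template that emits a
# record definition. The record name, fields and PIC clauses are invented.

FIELDS = [("CUST-ID", "9(6)"), ("CUST-NAME", "X(30)"), ("BALANCE", "S9(7)V99")]

def cobol_record(record_name, fields):
    """Emit a COBOL 01-level record with 05-level elementary items."""
    lines = [f"       01  {record_name}."]
    for name, pic in fields:
        lines.append(f"           05  {name:<20} PIC {pic}.")
    return "\n".join(lines)

print(cobol_record("CUSTOMER-REC", FIELDS))
```

Scaling this up, the same metadata can drive templates for working-storage sections, screen handling and transaction routines, which is how a generator can account for the bulk of a large application.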

  19. A Logic Design Automation System for Generating Logic Diagram from Hardware Description

    Institute of Scientific and Technical Information of China (English)

    刘明业; 郭书明; 杨淮; 贾良玉; 洪恩宇

    1989-01-01

    This paper discusses a logic design automation system (LODAS) implemented on the APOLLO DOMAIN workstation. LODAS can generate VLSI logic diagrams from a hardware description. The system accepts many kinds of input description, such as DDL or AHPL language descriptions, functional arrays (truth tables), covering arrays, Boolean equations or state transition tables. The system first simulates the functional description to verify the functional description of the system designed; the translator then translates the functional description into register transfer equations, Boolean equations and state transition equations automatically. Logic synthesis software partitions the translation result into a series of blocks, and transforms every small block into a multi-level NAND/NOR network according to the fan-in and fan-out restrictions.

  1. Automated determinations of selenium in thermal power plant wastewater by sequential hydride generation and chemiluminescence detection.

    Science.gov (United States)

    Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei

    2016-02-01

    After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which could contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to the control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium was performed separately in the cold reactor. Sample transfers between the reactors were carried out pneumatically by switching three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provided suitable performance for controlling selenium in wastewater to around the allowable limit (0.1 mg L(-1)). This system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry.

  2. Automated Mosaicking of Multiple 3d Point Clouds Generated from a Depth Camera

    Science.gov (United States)

    Kim, H.; Yoon, W.; Kim, T.

    2016-06-01

    In this paper, we propose a method for automated mosaicking of multiple 3D point clouds generated from a depth camera. A depth camera generates depth data by the ToF (Time of Flight) method and intensity data from the intensity of the returned signal. The depth camera used in this paper was an SR4000 from MESA Imaging. This camera generates a depth map and an intensity map of 176 x 144 pixels. The generated depth map stores physical depth data with mm precision. The generated intensity map contains noisy texture data. We used the texture maps for extracting tiepoints, and the depth maps for assigning z coordinates to tiepoints and for point cloud mosaicking. There are four steps in the proposed mosaicking method. In the first step, we acquired multiple 3D point clouds by rotating the depth camera and capturing data per rotation. In the second step, we estimated 3D-3D transformation relationships between subsequent point clouds. For this, 2D tiepoints were extracted automatically from the two corresponding intensity maps. They were converted into 3D tiepoints using the depth maps. We used a 3D similarity transformation model for estimating the 3D-3D transformation relationships. In the third step, we converted the local 3D-3D transformations into a global transformation for all point clouds with respect to a reference one. In the last step, the extent of the single depth map mosaic was calculated and depth values per mosaic pixel were determined by a ray tracing method. For the experiments, 8 depth maps and intensity maps were used. After the four steps, an output mosaicked depth map of 454 x 144 pixels was generated. It is expected that the proposed method would be useful for developing an effective 3D indoor mapping method in the future.
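
The third step, converting local pairwise transformations into a global transformation per cloud relative to the reference, amounts to chaining matrix compositions. A minimal pure-Python sketch with 4x4 homogeneous matrices (illustrative only, not the authors' code):

```python
# Composing local pairwise transforms into global ones relative to the
# reference cloud, using 4x4 homogeneous matrices. Illustrative sketch only.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def globals_from_locals(local_transforms):
    """local_transforms[k] maps cloud k+1 into cloud k's frame; return the
    transforms mapping every cloud into the frame of cloud 0 (the reference)."""
    identity = [[float(i == j) for j in range(4)] for i in range(4)]
    out, acc = [identity], identity
    for t in local_transforms:
        acc = matmul(acc, t)        # global for cloud k+1 = global_k * local_k
        out.append(acc)
    return out

tx = [[1, 0, 0, 1], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]  # +1 along x
g = globals_from_locals([tx, tx])
print(g[2][0][3])   # cloud 2 lies 2 units along x from the reference
```

Because errors accumulate along the chain, longer sequences are usually refined afterwards, e.g. by a global adjustment over all tiepoints.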

  3. Toward the automated generation of genome-scale metabolic networks in the SEED

    Directory of Open Access Journals (Sweden)

    Gould John

    2007-04-01

    Full Text Available Abstract Background Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. Results We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative

  4. DG-AMMOS: A New tool to generate 3D conformation of small molecules using Distance Geometry and Automated Molecular Mechanics Optimization for in silico Screening

    Directory of Open Access Journals (Sweden)

    Villoutreix Bruno O

    2009-11-01

    Full Text Available Abstract Background Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches, in order to increase the effectiveness of the process and to facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection, and most often the 3D structures of the small molecules have to be generated, since compounds are usually delivered in 1D SMILES, CANSMILES or 2D SDF formats. Results Here, we describe the new open source program DG-AMMOS, which allows the generation of the 3D conformation of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega for generating the 3D structures of small molecules is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generator engine. Conclusion DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that generated structures are generally of equal quality or sometimes better than structures obtained by the other tested methods.

  5. TAPDANCE: An automated tool to identify and annotate transposon insertion CISs and associations between CISs from next generation sequence data

    Directory of Open Access Journals (Sweden)

    Sarver Aaron L

    2012-06-01

    Full Text Available Abstract Background Next generation sequencing approaches applied to the analyses of transposon insertion junction fragments generated in high-throughput forward genetic screens have created the need for clear informatics and statistical approaches to deal with the massive amount of data currently being generated. Previous approaches utilized to (1) map junction fragments within the genome and (2) identify Common Insertion Sites (CISs) within the genome are not practical due to the volume of data generated by current sequencing technologies. Previous approaches applied to this problem also required significant manual annotation. Results We describe the Transposon Annotation Poisson Distribution Association Network Connectivity Environment (TAPDANCE) software, which automates the identification of CISs within transposon junction fragment insertion data. Starting with barcoded sequence data, the software identifies and trims sequences and maps putative genomic sequence to a reference genome using the bowtie short read mapper. Poisson distribution statistics are then applied to assess and rank genomic regions showing significant enrichment for transposon insertion. Novel methods of counting insertions are used to ensure that the results presented have the expected characteristics of informative CISs. A persistent mySQL database is generated and utilized to keep track of sequences, mappings and common insertion sites. Additionally, associations between phenotypes and CISs are also identified using Fisher's exact test with multiple testing correction. In a case study using previously published data, we show that the TAPDANCE software identifies CISs as previously described, prioritizes them based on p-value, allows holistic visualization of the data within genome browser software and identifies relationships present in the structure of the data. Conclusions The TAPDANCE process is fully automated and performs similarly to previous labor-intensive approaches
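
The Poisson test behind CIS calling can be sketched as follows: given the genome-wide insertion rate, how unlikely is observing k or more insertions in a single window? The window size and counts below are invented for illustration:

```python
# Sketch of the Poisson enrichment test behind CIS calling. Window size and
# counts are invented for illustration; not the TAPDANCE implementation.

import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k))

def cis_pvalue(k_in_window, window_bp, total_insertions, genome_bp):
    lam = total_insertions * window_bp / genome_bp   # expected hits in window
    return poisson_sf(k_in_window, lam)

# 8 insertions in a 10 kb window of a 2.5 Gb genome with 5,000 insertions total
print(cis_pvalue(8, 10_000, 5_000, 2_500_000_000) < 1e-6)   # prints True
```

In practice such per-window p-values must still be corrected for the number of windows tested, in the same spirit as the multiple-testing correction the abstract applies to phenotype-CIS associations.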

  6. Description and recognition of regular and distorted secondary structures in proteins using the automated protein structure analysis method.

    Science.gov (United States)

    Ranganathan, Sushilee; Izotov, Dmitry; Kraka, Elfi; Cremer, Dieter

    2009-08-01

    The Automated Protein Structure Analysis (APSA) method, which describes the protein backbone as a smooth line in three-dimensional space and characterizes it by curvature kappa and torsion tau as functions of arc length s, was applied to 77 proteins to determine all secondary structural units via specific kappa(s) and tau(s) patterns. A total of 533 alpha-helices and 644 beta-strands were recognized by APSA, whereas DSSP gives 536 and 651 units, respectively. Kinks and distortions were quantified, and the boundaries (entry and exit) of secondary structures were classified. Similarity between proteins can be easily quantified using APSA, as was demonstrated for the roll architecture of the proteins ubiquitin and spinach ferredoxin. A twenty-by-twenty comparison of all-alpha domains showed that the curvature-torsion patterns generated by APSA provide an accurate and meaningful similarity measurement for secondary, supersecondary, and tertiary protein structure. APSA is shown to accurately reflect the conformation of the backbone, effectively reducing three-dimensional structure information to two-dimensional representations that are easy to interpret and understand.
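
A discrete analogue of the curvature kappa(s) can be computed from three consecutive backbone points as the Menger curvature, i.e. the inverse radius of the circle through them. This is a generic geometric sketch, not the APSA implementation:

```python
# Discrete analogue of curvature kappa(s): the Menger curvature of the circle
# through three consecutive 3D backbone points. Generic sketch, not APSA.

import math

def curvature(p0, p1, p2):
    def sub(u, v): return tuple(a - b for a, b in zip(u, v))
    def norm(u): return math.sqrt(sum(x * x for x in u))
    def cross(u, v):
        return (u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0])
    a, b = sub(p1, p0), sub(p2, p0)
    area2 = norm(cross(a, b))                     # twice the triangle area
    denom = norm(a) * norm(sub(p2, p1)) * norm(b)
    return 2.0 * area2 / denom if denom else 0.0  # 4*Area / (|a||b||c|)

print(curvature((1, 0, 0), (0, 1, 0), (-1, 0, 0)))   # ~1.0: unit circle
```

Sliding this window along the backbone yields a kappa(s)-style profile in which helices show roughly constant curvature and strands near-zero curvature.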

  7. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    Directory of Open Access Journals (Sweden)

    Vahid Rastgoo

    2014-04-01

    Full Text Available This paper presents an approach for the automated generation of a requirements ontology from UML diagrams in service-oriented architecture (SOA). The goal is to facilitate software engineering processes such as software design, software reuse and service discovery. The proposed method is based on four conceptual layers. The first layer includes requirements obtained from stakeholders; the second designs service-oriented diagrams from the data in the first layer and extracts their XMI code. The third layer includes a requirements ontology and a protocol ontology to describe the behavior of services and the relationships between them semantically. Finally, the fourth layer standardizes the concepts present in the ontologies of the previous layer. The generated ontology goes beyond an ordinary domain ontology because it considers the behavior of services as well as their hierarchical relationships. Experimental results conducted on a set of UML4SOA diagrams in different scopes demonstrate the improvement offered by the proposed approach from different points of view, such as completeness of the requirements ontology, automatic generation and consideration of SOA.

  8. Structure of microprocessor-based automation system of oil pumping station “Alexandrovskaya”

    Directory of Open Access Journals (Sweden)

    Dmitriyenko Margarita A.

    2014-01-01

    Full Text Available The structure of the microprocessor-based automation system (MBAS) of the oil pumping station (OPS) «Alexandrovskaya», located in Tomsk region and forming part of the Oil Transporting Joint Stock Company «Transneft», was developed in accordance with the requirements of the guidance document «Complex of typical design choices for automation of OPSs and crude storages on the basis of modern standard solutions and components».

  9. ArrayIDer: automated structural re-annotation pipeline for DNA microarrays

    Directory of Open Access Journals (Sweden)

    McCarthy Fiona M

    2009-01-01

    Full Text Available Abstract Background Systems biology modeling from microarray data requires the most contemporary structural and functional array annotation. However, microarray annotations, especially for non-commercial, non-traditional biomedical model organisms, are often dated. In addition, most microarray analysis tools do not readily accept EST clone names, which are abundantly represented on arrays. Manual re-annotation of microarrays is impracticable, so we developed a computational re-annotation tool (ArrayIDer to retrieve the most recent accession mapping files from public databases based on EST clone names or accessions and rapidly generate database accessions for entire microarrays. Results We utilized the Fred Hutchinson Cancer Research Centre 13K chicken cDNA array – a widely-used non-commercial chicken microarray – to demonstrate the principle that ArrayIDer could markedly improve annotation. We structurally re-annotated 55% of the entire array. Moreover, we decreased non-chicken functional annotations two-fold. One beneficial consequence of our re-annotation was to identify 290 pseudogenes, of which 66 were previously incorrectly annotated. Conclusion ArrayIDer allows rapid automated structural re-annotation of entire arrays and provides multiple accession types for use in subsequent functional analysis. This information is especially valuable for systems biology modeling in the non-traditional biomedical model organisms.
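The core step ArrayIDer automates, mapping stale EST clone identifiers to current database accessions via downloaded mapping files, amounts to a lookup of the following shape; the clone names and accessions below are hypothetical, not taken from the 13K chicken array:

```python
# Illustrative re-annotation: map EST clone names printed on an array to
# up-to-date database accessions via a lookup table. ArrayIDer builds
# the real table from current public-database mapping files.
def reannotate(array_probes, clone_to_accession):
    """Return {probe: accession}, keeping the old identifier whenever no
    newer mapping exists."""
    return {p: clone_to_accession.get(p, p) for p in array_probes}

# Hypothetical mapping file contents and probe list:
mapping = {"clone.0001.a1": "NM_000001", "clone.0002.c6": "XM_000002"}
probes = ["clone.0001.a1", "clone.0002.c6", "clone.9999.zz"]
print(reannotate(probes, mapping))
```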

  10. Automated As-Built Model Generation of Subway Tunnels from Mobile LiDAR Data.

    Science.gov (United States)

    Arastounia, Mostafa

    2016-09-13

    This study proposes fully-automated methods for as-built model generation of subway tunnels employing mobile Light Detection and Ranging (LiDAR) data. The employed dataset is acquired by a Velodyne HDL 32E and covers 155 m of a subway tunnel containing six million points. First, the tunnel's main axis and cross sections are extracted. Next, a preliminary model is created by fitting an ellipse to each extracted cross section. The model is refined by employing residual analysis and Baarda's data snooping method to eliminate outliers. The final model is then generated by applying least squares adjustment to outlier-free data. The obtained results indicate that the tunnel's main axis and 1551 cross sections at 0.1 m intervals are successfully extracted. Cross sections have an average semi-major axis of 7.8508 m with a standard deviation of 0.2 mm and semi-minor axis of 7.7509 m with a standard deviation of 0.1 mm. The average normal distance of points from the constructed model (average absolute error) is also 0.012 m. The developed algorithm is applicable to tunnels with any horizontal orientation and degree of curvature since it makes no assumptions, nor does it use any a priori knowledge regarding the tunnel's curvature and horizontal orientation.
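A minimal sketch of the fit-and-refine idea from this record, using a circular cross section as a simplified stand-in for the ellipse fit and a single residual-screening pass in place of Baarda's data snooping:

```python
import math

def fit_circle(pts):
    """Kasa algebraic circle fit: least-squares solution of
    x^2 + y^2 = a*x + b*y + c via its 3x3 normal equations."""
    M = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y in pts:
        row = (x, y, 1.0)
        z = x * x + y * y
        for i in range(3):
            rhs[i] += z * row[i]
            for j in range(3):
                M[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting, then back-substitution.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 3):
                M[r][c] -= f * M[col][c]
            rhs[r] -= f * rhs[col]
    sol = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        sol[r] = (rhs[r] - sum(M[r][c] * sol[c] for c in range(r + 1, 3))) / M[r][r]
    a, b, c = sol
    cx, cy = a / 2.0, b / 2.0
    return cx, cy, math.sqrt(c + cx * cx + cy * cy)

def fit_with_rejection(pts, k=3.0):
    """Fit, screen out points whose radial residual exceeds k standard
    deviations (a crude stand-in for Baarda's data snooping), refit."""
    cx, cy, r = fit_circle(pts)
    res = [math.hypot(x - cx, y - cy) - r for x, y in pts]
    mean = sum(res) / len(res)
    sd = math.sqrt(sum((e - mean) ** 2 for e in res) / len(res))
    kept = [p for p, e in zip(pts, res) if abs(e - mean) <= k * sd]
    return fit_circle(kept)

# Synthetic cross section: 36 points on a circle of radius 7.8 m around
# (5, 2), plus one gross outlier at the centre.
ring = [(5 + 7.8 * math.cos(2 * math.pi * i / 36),
         2 + 7.8 * math.sin(2 * math.pi * i / 36)) for i in range(36)]
cx, cy, r = fit_with_rejection(ring + [(5.0, 2.0)])
print(round(cx, 4), round(cy, 4), round(r, 4))
```

The published method fits a full ellipse and applies least-squares adjustment to the outlier-free data; the circle special case above keeps the sketch self-contained.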

  11. Compact Structural Test Generation for Analog Macros

    NARCIS (Netherlands)

    Kaal, V.; Kerkhoff, Hans G.

    1997-01-01

    A structural, fault-model based methodology for the generation of compact high-quality test sets for analog macros is presented. Results are shown for an IV-converter macro design. Parameters of so-called test configurations are optimized for detection of faults in a fault-list and an optimal

  12. Finite element based electrostatic-structural coupled analysis with automated mesh morphing

    Energy Technology Data Exchange (ETDEWEB)

    OWEN,STEVEN J.; ZHULIN,V.I.; OSTERGAARD,D.F.

    2000-02-29

    A co-simulation tool based on finite element principles has been developed to solve coupled electrostatic-structural problems. An automated mesh morphing algorithm has been employed to update the field mesh after structural deformation. The co-simulation tool has been successfully applied to model the hysteretic behavior of a MEMS switch.

  13. Laser materials processing of complex components: from reverse engineering via automated beam path generation to short process development cycles

    Science.gov (United States)

    Görgl, Richard; Brandstätter, Elmar

    2017-01-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser cladding and laser-based additive manufacturing are given.

  14. A NOVEL METHOD FOR AUTOMATION OF 3D HYDRO BREAK LINE GENERATION FROM LIDAR DATA USING MATLAB

    OpenAIRE

    Toscano, G. J.; U. Gopalam; V. Devarajan

    2013-01-01

    Water body detection is necessary to generate hydro break lines, which are in turn useful in creating deliverables such as TINs, contours, DEMs from LiDAR data. Hydro flattening follows the detection and delineation of water bodies (lakes, rivers, ponds, reservoirs, streams etc.) with hydro break lines. Manual hydro break line generation is time consuming and expensive. Accuracy and processing time depend on the number of vertices marked for delineation of break lines. Automation wit...

  15. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles.

    Science.gov (United States)

    Görgl, R.; Brandstätter, E.

    2016-03-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  16. ELECTRIC WELDING EQUIPMENT AND AUTOMATION OF WELDING IN CONSTRUCTION,

    Science.gov (United States)

    WELDING, *ARC WELDING, AUTOMATION, CONSTRUCTION, INDUSTRIES, POWER EQUIPMENT, GENERATORS, POWER TRANSFORMERS, RESISTANCE WELDING, SPOT WELDING, MACHINES, AUTOMATIC, STRUCTURES, WIRING DIAGRAMS, USSR.

  17. Automated High Throughput Drug Target Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  18. An automated tetrahedral mesh generator for computer simulation in Odontology based on the Delaunay's algorithm

    Directory of Open Access Journals (Sweden)

    Mauro Massayoshi Sakamoto

    2008-01-01

    Full Text Available In this work, a software package based on Delaunay's algorithm is described. The main feature of this package is its capability to discretize geometric domains of teeth, taking into account their complex inner structures and materials of different hardness. Usually, the mesh generators reported in the literature treat molars and other teeth using simplified geometric models, or even consider the teeth as homogeneous structures.
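At the heart of any Delaunay-based mesher is the empty-circumcircle criterion: no mesh node may lie inside the circumcircle (circumsphere in 3D) of any element. A minimal 2D version of that predicate, as a self-contained sketch:

```python
def in_circumcircle(a, b, c, d):
    """Return True if d lies strictly inside the circumcircle of
    triangle (a, b, c), given in counter-clockwise order. A Delaunay
    mesher rejects (or flips) any triangle for which some fourth node
    makes this predicate True."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    # Standard 3x3 in-circle determinant, expanded by hand.
    det = ((ax * ax + ay * ay) * (bx * cy - cx * by)
           - (bx * bx + by * by) * (ax * cy - cx * ay)
           + (cx * cx + cy * cy) * (ax * by - bx * ay))
    return det > 0

# Triangle on three corners of the unit square:
tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
print(in_circumcircle(*tri, (0.9, 0.9)))   # node too close: not Delaunay
print(in_circumcircle(*tri, (1.5, 1.5)))   # safely outside
```

Production meshers add exact-arithmetic fallbacks for near-degenerate cases; this floating-point version only illustrates the criterion itself.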

  19. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    Energy Technology Data Exchange (ETDEWEB)

    Rosato, Antonio [University of Florence, Department of Chemistry and Magnetic Resonance Center (Italy); Vranken, Wim [Vrije Universiteit Brussel, Structural Biology Brussels (Belgium); Fogh, Rasmus H.; Ragan, Timothy J. [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom); Tejero, Roberto [Universidad de Valencia, Departamento de Química Física (Spain); Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H. [University of Georgia, Complex Carbohydrate Research Center and Northeast Structural Genomics Consortium (United States); Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H. [University of Toronto, Department of Medical Biophysics, Cancer Genomics and Proteomics, Ontario Cancer Institute, Northeast Structural Genomics Consortium (Canada); Kennedy, Michael [Miami University, Department of Chemistry and Biochemistry, Northeast Structural Genomics Consortium (United States); Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, Northeast Structural Genomics Consortium, Rutgers (United States); Vuister, Geerten W., E-mail: gv29@le.ac.uk [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom)

    2015-08-15

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in either curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.

  20. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA) through querying a representation of the system in a SysML model. This work builds off the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as Fault Trees, to efficiently portray system behavior, and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
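The traversal idea, walking a system model so that each (component, failure mode) pair yields an FMEA row whose effects column lists everything downstream, can be sketched with a toy model; the component names and failure modes below are hypothetical, not from the SMAP model:

```python
# A toy system model: each component lists its failure modes and the
# components it feeds, so propagated effects can be found by traversal.
SYSTEM = {
    "battery":   {"modes": ["cell short"], "feeds": ["power_bus"]},
    "power_bus": {"modes": ["overvoltage"], "feeds": ["radio"]},
    "radio":     {"modes": ["no downlink"], "feeds": []},
}

def downstream(component, model):
    """Collect every component reachable from `component` (the scope of
    a fault's propagated effects)."""
    seen, stack = set(), list(model[component]["feeds"])
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(model[c]["feeds"])
    return sorted(seen)

def generate_fmea(model):
    """One FMEA row per (component, failure mode); the effects column is
    the set of downstream components the fault can reach."""
    return [
        {"item": comp, "failure_mode": mode, "effects": downstream(comp, model)}
        for comp in sorted(model)
        for mode in model[comp]["modes"]
    ]

for row in generate_fmea(SYSTEM):
    print(row)
```

The actual work queries a SysML model with control-system layering and design authority; this sketch only shows why a machine traversal is more exhaustive than ad hoc inspection.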

  1. Automated scene generated background context for near-nadir look angles

    Science.gov (United States)

    Tucker, Jonathan D.; Stanfill, S. Robert

    2016-05-01

    Multi-INT fusion of GEOINT and IMINT can enable performance optimization of target detection and target tracking problem domains, amongst others. Contextual information, which defines the relationship of foreground to background scene content, is a source of GEOINT for which various online repositories exist today, including but not limited to Open Street Maps (OSM) and the United States Geological Survey (USGS). However, as the nature of the world's landscape is dynamic and ever-changing, such contextual information can easily become stagnant and irrelevant if not maintained. In this paper we discuss our approach to providing the latest relevant context by performing automated scene-generated background context segmentation and classification for near-nadir look angles for the purpose of defining roadways or parking lots, buildings, and natural areas. This information can be used in a variety of ways, including augmenting context data from repositories, performing mission pre-planning, and supporting real-time missions, such that GEOINT and IMINT fusion can occur and enable significant performance advantages in target detection and tracking applications in all areas of the world.

  2. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    Science.gov (United States)

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.
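The batch-parallel pattern described here, one fixed per-sample pipeline fanned out over many samples, can be sketched as follows; the pipeline steps are placeholders, not QuickNGS's actual tool chain:

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder per-sample steps; a real workflow would invoke an aligner
# and a read counter here.
def align(sample):
    return sample + ".bam"

def count_reads(bam):
    return bam + ".counts"

def analyze(sample):
    """The fixed per-sample pipeline: align, then count."""
    return count_reads(align(sample))

def run_batch(samples, workers=4):
    """Fan the per-sample pipeline out over a worker pool; results come
    back in input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze, samples))

print(run_batch(["sample%d" % i for i in range(1, 4)]))
```

QuickNGS couples this kind of fan-out with a back-end database and cluster resources; the sketch shows only the scheduling shape that keeps hands-on time per project low.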

  3. Remote automated multi-generational growth and observation of an animal in low Earth orbit.

    Science.gov (United States)

    Oczypok, Elizabeth A; Etheridge, Timothy; Freeman, Jacob; Stodieck, Louis; Johnsen, Robert; Baillie, David; Szewczyk, Nathaniel J

    2012-03-07

    The ultimate survival of humanity is dependent upon colonization of other planetary bodies. Key challenges to such habitation are (patho)physiologic changes induced by known, and unknown, factors associated with long-duration and distance space exploration. However, we currently lack biological models for detecting and studying these changes. Here, we use a remote automated culture system to successfully grow an animal in low Earth orbit for six months. Our observations, over 12 generations, demonstrate that the multi-cellular soil worm Caenorhabditis elegans develops from egg to adulthood and produces progeny with identical timings in space as on the Earth. Additionally, these animals display normal rates of movement when fully fed, comparable declines in movement when starved, and appropriate growth arrest upon starvation and recovery upon re-feeding. These observations establish C. elegans as a biological model that can be used to detect changes in animal growth, development, reproduction and behaviour in response to environmental conditions during long-duration spaceflight. This experimental system is ready to be incorporated on future, unmanned interplanetary missions and could be used to study cost-effectively the effects of such missions on these biological processes and the efficacy of new life support systems and radiation shielding technologies.

  5. Development and verification testing of automation and robotics for assembly of space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  6. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Directory of Open Access Journals (Sweden)

    Gregory R Johnson

    2015-12-01

    Full Text Available Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  7. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Science.gov (United States)

    Johnson, Gregory R; Li, Jieyue; Shariff, Aabid; Rohde, Gustavo K; Murphy, Robert F

    2015-12-01

    Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  8. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    Energy Technology Data Exchange (ETDEWEB)

    Girolamo, D., E-mail: dgirola@ncsu.edu; Yuan, F. G. [National Institute of Aerospace, Integrated Structural Health Management Laboratory, Hampton, VA 23666 and North Carolina State University, Department of Mechanical and Aerospace Engineering, Raleigh, NC 27695 (United States); Girolamo, L. [North Carolina State University, Department of Mechanical and Aerospace Engineering, Raleigh, NC 27695 (United States)

    2015-03-31

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damages by propagating ultrasonic waves normal to the surface. However they usually require physical contact with the structure and are time-consuming and labor-intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler Vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed for processing the wave field and estimating spatially dependent wavenumber values, related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high-fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative to overcome limitations of conventional NDE technologies.

  9. HD-RNAS: An automated hierarchical database of RNA structures

    Directory of Open Access Journals (Sweden)

    Shubhra Sankar Ray

    2012-04-01

    Full Text Available One of the important goals of most biological investigations is to classify and organize the experimental findings so that they are readily useful for deriving generalized rules. Although there is a huge amount of information on RNA structures in PDB, there are redundant files, ambiguous synthetic sequences, etc. Moreover, a systematic hierarchical organization, reflecting RNA classification, is missing in PDB. In this investigation, we have classified all the available RNA crystal structures from PDB through a programmatic approach. Hence, it is now a simple task to update the classification regularly as new structures are released. The classification can further determine (i) a non-redundant set of RNA structures and (ii) if available, a set of structures of identical sequence and function, which can highlight structural polymorphism, ligand-induced conformational alterations, etc. Presently, we have classified the available structures (2095 PDB entries having an RNA chain longer than 9 nucleotides, solved by X-ray crystallography or NMR spectroscopy) into nine functional classes. Structures with the same function and source are mostly similar, with subtle differences depending on their functional complexation. The web-server is available online at http://www.saha.ac.in/biop/www/HD-RNAS.html and is updated regularly.

  10. A Framework for Semi-Automated Generation of a Virtual Combine Harvester

    DEFF Research Database (Denmark)

    Hermann, Dan; Bilde, M.L.; Andersen, Nils Axel

    2016-01-01

    This paper describes a generic data-driven model of the threshing, separation and cleaning process in a combine harvester. The aim is a model that describes the actual material flow and sensor values for relevant actuator configurations and measured environmental disturbances, in order to facilitate Hardware In the Loop (HIL) simulation and sensor-based material flow estimation. A modular data-driven model structure is chosen as it maintains the actual steady-state values and facilitates verification and debugging using laboratory and field data. The overall model structure, the model generation procedure, and the estimation of parameters from field data are described, and simulation results are presented.

  11. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    Science.gov (United States)

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
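The bulk-average EDR such capillary devices target can be estimated from measured pressure drop and flow rate, on the assumption that all pumping power is dissipated in the wetted volume; the operating numbers below are hypothetical, not the authors' calibration:

```python
import math

# Mean energy dissipation rate (EDR) in a capillary: all pumping power
# delta_p * Q is assumed dissipated in the wetted volume
# V = pi * d^2 / 4 * L. This is an illustrative bulk-average estimate;
# the cited 2.4e5 W/kg figure comes from the authors' own device
# characterisation and CFD comparison, not from these numbers.
def mean_edr(delta_p, flow_rate, diameter, length, density=1000.0):
    """Return the volume-averaged EDR in W/kg (SI units in, SI out)."""
    volume = math.pi * diameter ** 2 / 4.0 * length
    return delta_p * flow_rate / (density * volume)

# Hypothetical operating point: 4 bar across a 0.5 mm x 50 mm capillary
# at 1 mL/s of aqueous broth.
print(f"{mean_edr(4e5, 1e-6, 0.5e-3, 0.05):.3g} W/kg")
```

In practice the feed-zone-matching EDR is tuned by varying capillary diameter and flow rate, exactly as the record describes.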

  12. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Full Text Available Productivity rate (Q) or production rate is one of the important criteria by which industrial engineers improve the system and finished-good output of a production or assembly line. Mathematical and statistical analysis of the productivity rate is required to give a visual overview of the failure factors and to guide further improvement within the production line, especially for automated flow lines, which are complicated. A mathematical model of the productivity rate for a linear-arrangement, serial-structure automated flow line with different failure rates and bottleneck machining time parameters is the basic model for this productivity analysis. This paper presents an engineering mathematical analysis method applied in an automotive company in Malaysia that operates an automated flow assembly line in final assembly to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages (data collection, calculation and comparison, analysis, and sustainable improvement), is used to analyze productivity in the automated flow assembly line based on the particular mathematical model. The variety of failure rates that cause loss of productivity and the bottleneck machining time are presented in mathematical figures, and a sustainable solution for productivity improvement of this final-assembly automated flow line is given.
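The kind of productivity model described, a bottleneck-set cycle time discounted by station availability, can be sketched generically; this is a textbook-style formulation under stated assumptions, not the paper's exact DCAS model:

```python
# Generic sketch of a productivity-rate model for a serial automated
# line: the ideal cycle is set by the bottleneck station plus auxiliary
# time, and each station's failures discount it through availability.
def productivity_rate(bottleneck_time, aux_time, failure_rates, mean_repair_time):
    """Parts per unit time for a serial line.

    bottleneck_time  -- slowest station's machining time per cycle
    aux_time         -- transport/load time per cycle
    failure_rates    -- failures per cycle, one entry per station
    mean_repair_time -- average downtime per failure (same time unit)
    """
    cycle = bottleneck_time + aux_time
    availability = 1.0 / (1.0 + sum(failure_rates) * mean_repair_time / cycle)
    return availability / cycle

# Hypothetical 5-station line, all times in minutes:
q = productivity_rate(0.5, 0.1, [0.002] * 5, 12.0)
print(f"{q:.3f} parts/min")
```

With these illustrative numbers the lost time per cycle is 5 × 0.002 × 12 = 0.12 min on top of a 0.6 min cycle, so the line runs at 1/0.72 parts per minute rather than the ideal 1/0.6.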

  13. Retargetable Code Generation based on Structural Processor Descriptions

    OpenAIRE

    Leupers, Rainer; Marwedel, Peter

    1998-01-01

    Design automation for embedded systems comprising both hardware and software components demands code generators integrated into electronic CAD systems. These code generators provide the necessary link between software synthesis tools in HW/SW codesign systems and embedded processors. General-purpose compilers for standard processors are often insufficient, because they do not provide flexibility with respect to different target processors and also suffer from inferior code quality....

  14. Automated Modal Parameter Estimation of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Andersen, Palle; Brincker, Rune; Goursat, Maurice

    In this paper the problem of automatic modal parameter extraction from ambient-excited civil engineering structures is considered. Two different approaches for obtaining the modal parameters automatically are presented: the Frequency Domain Decomposition (FDD) technique and a correlation...

  15. Ab Initio structure determination of vaterite by automated electron diffraction.

    Science.gov (United States)

    Mugnaioli, Enrico; Andrusenko, Iryna; Schüler, Timo; Loges, Niklas; Dinnebier, Robert E; Panthöfer, Martin; Tremel, Wolfgang; Kolb, Ute

    2012-07-09

    "This is a mineral about which there has been much discussion" is a typical statement about vaterite in older standard textbooks of inorganic chemistry. This polymorph of CaCO3, first described by H. Vater in 1897, plays key roles in weathering and biomineralization processes, but occurs only in the form of nanosized crystals, unsuitable for structure determination. Its structure has now been solved by automated electron diffraction tomography from 50 nm-sized nanocrystals.

  16. A second-generation device for automated training and quantitative behavior analyses of molecularly-tractable model organisms.

    Directory of Open Access Journals (Sweden)

    Douglas Blackiston

    Full Text Available A deep understanding of cognitive processes requires functional, quantitative analyses of the steps leading from genetics and the development of nervous system structure to behavior. Molecularly-tractable model systems such as Xenopus laevis and planaria offer an unprecedented opportunity to dissect the mechanisms determining the complex structure of the brain and CNS. A standardized platform that facilitated quantitative analysis of behavior would make a significant impact on evolutionary ethology, neuropharmacology, and cognitive science. While some animal tracking systems exist, the available systems do not allow automated training (feedback to individual subjects in real time), which is necessary for operant conditioning assays. The lack of standardization in the field, and the numerous technical challenges facing the development of a versatile system with the necessary capabilities, comprise a significant barrier keeping molecular developmental biology labs from integrating behavior analysis endpoints into their pharmacological and genetic perturbations. Here we report the development of a second-generation system that is a highly flexible, powerful machine vision and environmental control platform. In order to enable multidisciplinary studies aimed at understanding the roles of genes in brain function and behavior, and to aid other laboratories that do not have the facilities to undertake complex engineering development, we describe the device and the problems that it overcomes. We also present sample data using frog tadpoles and flatworms to illustrate its use. Having solved significant engineering challenges in its construction, the resulting design is a relatively inexpensive instrument of wide relevance for several fields, and will accelerate interdisciplinary discovery in pharmacology, neurobiology, regenerative medicine, and cognitive science.

  17. From bacterial to human dihydrouridine synthase: automated structure determination

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Fiona, E-mail: fiona.whelan@york.ac.uk; Jenkins, Huw T., E-mail: fiona.whelan@york.ac.uk [The University of York, Heslington, York YO10 5DD (United Kingdom); Griffiths, Samuel C. [University of Oxford, Headington, Oxford OX3 7BN (United Kingdom); Byrne, Robert T. [Ludwig-Maximilians-University Munich, Feodor-Lynen-Strasse 25, 81377 Munich (Germany); Dodson, Eleanor J.; Antson, Alfred A., E-mail: fiona.whelan@york.ac.uk [The University of York, Heslington, York YO10 5DD (United Kingdom)

    2015-06-30

    The crystal structure of a human dihydrouridine synthase, an enzyme associated with lung cancer, with 18% sequence identity to a T. maritima enzyme, has been determined at 1.9 Å resolution by molecular replacement after extensive molecular remodelling of the template. The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr-rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer.

  18. Stochastic Generator of Chemical Structure. 3. Reaction Network Generation

    Energy Technology Data Exchange (ETDEWEB)

    FAULON,JEAN-LOUP; SAULT,ALLEN G.

    2000-07-15

    A new method to generate chemical reaction networks is proposed. The distinctive feature of the method is that network generation and mechanism reduction are performed simultaneously using sampling techniques. Our method is tested on hydrocarbon thermal cracking. Results and theoretical arguments demonstrate that our method scales in polynomial time while other, deterministic network generators scale in exponential time. This finding offers the possibility of investigating complex reacting systems such as those studied in petroleum refining and combustion.
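
    The sampling idea can be caricatured in a few lines: instead of enumerating every reaction deterministically, draw reaction events at random and collect the species and reactions encountered. The toy "cracking" rule below, in which an alkane of carbon number n splits into two smaller fragments, is purely illustrative and far simpler than the generator described in the paper.

```python
import random

def sample_network(start_carbons=10, n_events=200, seed=42):
    """Sample a toy cracking network; species are tracked by carbon number only."""
    rng = random.Random(seed)
    species = {start_carbons}
    reactions = set()
    for _ in range(n_events):
        n = rng.choice([s for s in species if s >= 2])  # pick a crackable species
        k = rng.randint(1, n - 1)                       # random C-C bond scission
        reactions.add((n, k, n - k))                    # C_n -> C_k + C_(n-k)
        species.update({k, n - k})
    return species, reactions

species, reactions = sample_network()
print(f"{len(species)} species, {len(reactions)} sampled reactions")
```

    Because only sampled reactions are kept, the network and the reduced mechanism emerge together, which is the intuition behind the polynomial-time scaling claimed in the abstract.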

  19. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    Science.gov (United States)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified highly abstracted models, and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator, Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements, by leveraging the python-based open-source documentation system called Sphinx with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations, and integrated plotting through matplotlib. This allows the documentation, as well as the input files for test, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features, and discuss important benefits of this approach for Amanzi. In addition, we'll show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).

  20. Automated Generation of the Alaska Coastline Using High-Resolution Satellite Imagery

    Science.gov (United States)

    Roth, G.; Porter, C. C.; Cloutier, M. D.; Clementz, M. E.; Reim, C.; Morin, P. J.

    2015-12-01

    Previous campaigns to map Alaska's coast at high resolution have relied on airborne, marine, or ground-based surveying and manual digitization. The coarse temporal resolution, inability to scale geographically, and high cost of field data acquisition in these campaigns are inadequate for the scale and speed of recent coastal change in Alaska. Here, we leverage the Polar Geospatial Center (PGC) archive of DigitalGlobe, Inc. satellite imagery to produce a state-wide coastline at 2 meter resolution. We first select multispectral imagery based on time and quality criteria. We then extract the near-infrared (NIR) band from each processed image, and classify each pixel as water or land with a pre-determined NIR threshold value. Processing continues with vectorizing the water-land boundary, removing extraneous data, and attaching metadata. Final coastline raster and vector products maintain the original accuracy of the orthorectified satellite data, which is often within the local tidal range. The repeat frequency of coastline production can range from 1 month to 3 years, depending on factors such as satellite capacity, cloud cover, and floating ice. Shadows from trees or structures complicate the output and merit further data cleaning. The PGC's imagery archive, unique expertise, and computing resources enabled us to map the Alaskan coastline in a few months. The DigitalGlobe archive allows us to update this coastline as new imagery is acquired, and provides baseline data for studies of coastal change and improvement of topographic datasets. Our results are not simply a one-time coastline, but rather a system for producing multi-temporal, automated coastlines. Workflows and tools produced with this project can be freely distributed and utilized globally. Researchers and government agencies must now consider how they can incorporate and quality-control this high-frequency, high-resolution data to meet their mapping standards and research objectives.
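
    The water/land classification step described above can be sketched as a simple NIR threshold, since water absorbs strongly in the near infrared. The threshold value and the tiny synthetic reflectance tile below are hypothetical, and the vectorization, cleaning, and metadata steps are omitted.

```python
import numpy as np

def classify_water(nir_band, threshold=0.1):
    """Return a boolean mask: True where a pixel is classified as water."""
    return nir_band < threshold

# Synthetic 4x4 NIR reflectance tile: left half "water", right half "land".
nir = np.array([[0.02, 0.04, 0.35, 0.40],
                [0.03, 0.05, 0.30, 0.38],
                [0.02, 0.06, 0.33, 0.41],
                [0.01, 0.04, 0.36, 0.39]])
mask = classify_water(nir)
print(mask.sum(), "water pixels of", mask.size)
```

    The water-land boundary of this mask is what the pipeline then vectorizes into the coastline product; shadowed pixels can fall below the threshold too, which is why the abstract flags shadows as needing extra cleaning.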

  1. Automated structural health monitoring based on adaptive kernel spectral clustering

    Science.gov (United States)

    Langone, Rocco; Reynders, Edwin; Mehrkanoon, Siamak; Suykens, Johan A. K.

    2017-06-01

    Structural health monitoring refers to the process of measuring damage-sensitive variables to assess the functionality of a structure. In principle, vibration data can capture the dynamics of the structure and reveal possible failures, but environmental and operational variability can mask this information. Thus, an effective outlier detection algorithm can be applied only after having performed data normalization (i.e. filtering) to eliminate external influences. Instead, in this article we propose a technique which unifies the data normalization and damage detection steps. The proposed algorithm, called adaptive kernel spectral clustering (AKSC), is initialized and calibrated in a phase when the structure is undamaged. The calibration process is crucial to ensure detection of early damage and minimize the number of false alarms. After the calibration, the method can automatically identify new regimes which may be associated with possible faults. These regimes are discovered by means of two complementary damage (i.e. outlier) indicators. The proposed strategy is validated with a simulated example and with real-life natural frequency data from the Z24 pre-stressed concrete bridge, which was progressively damaged at the end of a one-year monitoring period.

  2. Comparison between a second generation automated multicapillary electrophoresis system with an automated agarose gel electrophoresis system for the detection of M-components.

    Science.gov (United States)

    Larsson, Anders; Hansson, Lars-Olof

    2008-01-01

    During the last decade, capillary electrophoresis (CE) has emerged as an interesting alternative to traditional analysis of serum, plasma and urine proteins by agarose gel electrophoresis. Initially there was a considerable difference in resolution between the two methods, but the quality of CE has improved significantly. We thus wanted to compare a second generation of automated multicapillary instruments (Capillarys, Sebia, Paris, France) with the high resolution (HR) buffer for serum or plasma protein analysis against an automated agarose gel electrophoresis system for the detection of M-components. The comparison between the two systems was performed with patient samples with and without M-components, and included 76 serum samples with M-components > 1 g/L. There was total agreement between the two methods for detection of these M-components. When studying samples containing oligoclonal bands/small M-components, there were differences between the two systems. The capillary electrophoresis system detected a slightly higher number of samples with oligoclonal bands, but the two systems found oligoclonal bands in different samples. With respect to resolution, the agarose gel electrophoresis system yielded slightly better resolution in the alpha and beta regions, but an experienced interpreter is required to benefit from the increased resolution. The capillary electrophoresis system has shorter turn-around times and a bar-code reader that allows positive sample identification. The Capillarys in combination with HR buffer gives better resolution of the alpha and beta regions than the same instrument with the beta1-beta2+ buffer or the Paragon CZE2000 (Beckman), which was the first generation of capillary electrophoresis systems.

  3. An automated procedure for covariation-based detection of RNA structure

    Energy Technology Data Exchange (ETDEWEB)

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs.
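
    A common way to quantify covariation between alignment columns is mutual information, which is high when two positions vary in a correlated way, as base-paired positions do. The sketch below uses a tiny synthetic alignment; the published procedure is considerably more elaborate, weighing supporting covariation evidence against counter-evidence across many sequences.

```python
import math
from collections import Counter

def mutual_information(col_i, col_j):
    """Mutual information (bits) between two alignment columns."""
    n = len(col_i)
    pi, pj = Counter(col_i), Counter(col_j)
    pij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (a, b), count in pij.items():
        p_ab = count / n
        mi += p_ab * math.log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

# Columns 0 and 2 co-vary as Watson-Crick partners (G-C / A-U); column 1 is random.
seqs = ["GAC", "GUC", "ACU", "AGU", "GCC", "AAU"]
cols = list(zip(*seqs))
print(round(mutual_information(cols[0], cols[2]), 3),
      round(mutual_information(cols[0], cols[1]), 3))
```

    The perfectly covarying pair scores the maximum for a two-state column (1 bit), while the unrelated column scores much lower, which is the signal a covariation-based structure detector looks for.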

  4. Speciation analysis of arsenic in biological matrices by automated hydride generation-cryotrapping-atomic absorption spectrometry with multiple microflame quartz tube atomizer (multiatomizer).

    Science.gov (United States)

    This paper describes an automated system for the oxidation state specific speciation of inorganic and methylated arsenicals by selective hydride generation-cryotrapping-gas chromatography-atomic absorption spectrometry with the multiatomizer. The corresponding arsines are ge...

  5. Alleviating the Collision States and Fleet Optimization by Introducing a New Generation of Automated Guided Vehicle Systems

    Directory of Open Access Journals (Sweden)

    Parham Azimi

    2011-01-01

    Full Text Available The aim of the current research is to propose a new generation of automated guided vehicle systems for alleviating collision states in material handling systems where the automated guided vehicles' movements are allowed to be both unidirectional and bidirectional. The objective function is to maximize the average annual profit in an FMS system using a simulation method. Despite several studies in this field, this criterion has rarely been examined. The current study includes some new changes in AGV design for preventing common problems such as congestion and deadlock, based on real profit/cost analysis in a flexible manufacturing system. To this end, experiments have been carried out to study the effects of several empty-vehicle dispatching rules on average annual profit. The results show that the proposed framework is efficient and robust enough for industrial environments.

  6. Novel structural descriptors for automated colon cancer detection and grading.

    Science.gov (United States)

    Rathore, Saima; Hussain, Mutawarra; Aksam Iftikhar, Muhammad; Jalil, Abdul

    2015-09-01

    The histopathological examination of tissue specimens is necessary for the diagnosis and grading of colon cancer. However, the process is subjective and leads to significant inter/intra observer variation in diagnosis as it mainly relies on the visual assessment of histopathologists. Therefore, a reliable computer-aided technique, which can automatically classify normal and malignant colon samples, and determine grades of malignant samples, is required. In this paper, we propose a novel colon cancer diagnostic (CCD) system, which initially classifies colon biopsy images into normal and malignant classes, and then automatically determines the grades of colon cancer for malignant images. To this end, various novel structural descriptors, which mathematically model and quantify the variation among the structure of normal colon tissues and malignant tissues of various cancer grades, have been employed. Radial basis function (RBF) kernel of support vector machines (SVM) has been employed as classifier in order to classify/grade colon samples based on these descriptors. The proposed system has been tested on 92 malignant and 82 normal colon biopsy images. The classification performance has been measured in terms of various performance measures, and quite promising performance has been observed. Compared with previous techniques, the proposed system has demonstrated better cancer detection (classification accuracy=95.40%) and grading (classification accuracy=93.47%) capability. Therefore, the proposed CCD system can provide a reliable second opinion to the histopathologists.

  7. Heuristic Approach of Automated Test Data Generation for Program having Array of Different Dimensions and Loops with Variable Number of Iteration

    OpenAIRE

    Hitesh Tahbildar; Bichitra Kalita

    2010-01-01

    Normally, program execution spends most of its time in loops. Automated test data generation devotes special attention to loops for better coverage. Automated test data generation for programs having loops with a variable number of iterations and variable-length arrays is a challenging problem, because the number of paths may increase exponentially with array size for some programming constructs, such as merge sort. We propose a method that finds heuristics for different type...

  8. Automated Generation of Phase Diagrams for Binary Systems with Azeotropic Behavior

    DEFF Research Database (Denmark)

    Cismondi, Martin; Michelsen, Michael Locht; Zabaloy, Marcelo S.

    2008-01-01

    In this work, we propose a computational strategy and methods for the automated calculation of complete loci of homogeneous azeotropy of binary mixtures and the related Pxy and Txy diagrams for models of the equation-of-state (EOS) type. The strategy consists of first finding the system's azeotro...

  9. Grayscale lithography-automated mask generation for complex three-dimensional topography

    Science.gov (United States)

    Loomis, James; Ratnayake, Dilan; McKenna, Curtis; Walsh, Kevin M.

    2016-01-01

    Grayscale lithography is a relatively underutilized technique that enables fabrication of three-dimensional (3-D) microstructures in photosensitive polymers (photoresists). By spatially modulating ultraviolet (UV) dosage during the writing process, one can vary the depth at which photoresist is developed. This means complex structures and bioinspired designs can readily be produced that would otherwise be cost prohibitive or too time intensive to fabricate. The main barrier to widespread grayscale implementation, however, stems from the laborious generation of mask files required to create complex surface topography. We present a process and associated software utility for automatically generating grayscale mask files from 3-D models created within industry-standard computer-aided design (CAD) suites. By shifting the microelectromechanical systems (MEMS) design onus to commonly used CAD programs ideal for complex surfacing, engineering professionals already familiar with traditional 3-D CAD software can readily utilize their pre-existing skills to make valuable contributions to the MEMS community. Our conversion process is demonstrated by prototyping several samples on a laser pattern generator, capital equipment already in use in many foundries. Finally, an empirical calibration technique is shown that compensates for nonlinear relationships between UV exposure intensity and photoresist development depth, as well as a thermal reflow technique to help smooth microstructure surfaces.
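
    The core conversion, mapping a target height map to grayscale dose levels through a calibration curve, can be sketched as follows. The gamma-style curve here is a hypothetical stand-in for the empirical, nonlinear UV-dose/development-depth calibration the paper describes.

```python
import numpy as np

def height_to_gray(height_um, max_depth_um=10.0, gamma=0.7, levels=256):
    """Map development depth (um) to a grayscale level; gamma models nonlinearity."""
    depth = np.clip(height_um, 0.0, max_depth_um) / max_depth_um
    return np.round((depth ** gamma) * (levels - 1)).astype(np.uint8)

# A linear 1-D height ramp becomes a nonlinear grayscale ramp.
ramp = np.linspace(0.0, 10.0, 5)
print(height_to_gray(ramp))
```

    In a real workflow this lookup would be inverted from measured calibration samples rather than assumed, and applied per pixel to the rasterized CAD surface to produce the mask file.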

  10. A Novel Method for Automation of 3D Hydro Break Line Generation from LiDAR Data Using MATLAB

    Science.gov (United States)

    Toscano, G. J.; Gopalam, U.; Devarajan, V.

    2013-08-01

    Water body detection is necessary to generate hydro break lines, which are in turn useful in creating deliverables such as TINs, contours, and DEMs from LiDAR data. Hydro flattening follows the detection and delineation of water bodies (lakes, rivers, ponds, reservoirs, streams etc.) with hydro break lines. Manual hydro break line generation is time consuming and expensive; accuracy and processing time depend on the number of vertices marked for delineation of the break lines. Automation with minimal human intervention is desired for this operation. This paper proposes a novel histogram analysis of LiDAR elevation data and LiDAR intensity data to automatically detect water bodies. Detection of water bodies using elevation information was verified by checking against LiDAR intensity data, since the spectral reflectance of water bodies is very small compared with that of land and vegetation in the near-infrared wavelength range. Detection of water bodies using LiDAR intensity data was likewise verified by checking against LiDAR elevation data. False detections were removed using morphological operations and 3D break lines were generated. Finally, a comparison of automatically generated break lines with their semi-automated/manual counterparts was performed to assess the accuracy of the proposed method, and the results were discussed.
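
    The histogram idea can be illustrated in miniature: water returns cluster tightly at a single elevation, so a dominant spike in the elevation histogram is a water-level candidate. The synthetic data and thresholds below are illustrative only; the actual method also cross-checks against NIR intensity and applies morphological cleaning.

```python
import numpy as np

rng = np.random.default_rng(0)
land_z = rng.normal(120.0, 5.0, 800)      # varied terrain heights (m)
water_z = rng.normal(100.0, 0.05, 400)    # flat lake surface (m)
z = np.concatenate([land_z, water_z])

counts, edges = np.histogram(z, bins=100)
peak = np.argmax(counts)                   # dominant spike = water-level candidate
water_level = 0.5 * (edges[peak] + edges[peak + 1])
mask = np.abs(z - water_level) < 1.0       # returns near the candidate level
print(f"candidate water level ~ {water_level:.1f} m, {mask.sum()} returns")
```

    The flatness of the water surface is what concentrates hundreds of returns into one or two histogram bins, making the spike easy to separate from the broad terrain distribution.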

  11. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Ricardo Andres Pizarro

    2016-12-01

    Full Text Available High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI, yielding irreproducible results through both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm to the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.
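
    The supervised-classification idea can be sketched with a numpy-only kernel perceptron standing in for the RBF-kernel SVM used in the study (in practice one would call scikit-learn's sklearn.svm.SVC). The two "quality features" (noise level, ghosting score) and all data below are synthetic.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """RBF (Gaussian) kernel matrix between row-vector sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def train_kernel_perceptron(X, y, gamma=1.0, epochs=20):
    """Learn dual coefficients alpha; a lightweight stand-in for an SVM fit."""
    K = rbf(X, X, gamma)
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        for i in range(len(X)):
            score = (alpha * y) @ K[:, i]
            if y[i] * score <= 0:        # misclassified: strengthen point i
                alpha[i] += 1.0
    return alpha

rng = np.random.default_rng(1)
good = rng.normal([0.2, 0.2], 0.05, size=(30, 2))   # clean volumes
bad = rng.normal([0.8, 0.7], 0.05, size=(30, 2))    # artifact-laden volumes
X, y = np.vstack([good, bad]), np.array([1] * 30 + [-1] * 30)

alpha = train_kernel_perceptron(X, y)
pred = np.sign(rbf(X, X) @ (alpha * y))
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

    A real pipeline would of course evaluate on held-out volumes rather than training data, which is how the reported ~80% accuracy on 1457 volumes should be read.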

  12. Mass spectra-based framework for automated structural elucidation of metabolome data to explore phytochemical diversity

    Directory of Open Access Journals (Sweden)

    Fumio eMatsuda

    2011-08-01

    Full Text Available A novel framework for automated elucidation of metabolite structures in liquid chromatography-mass spectrometry (LC-MS) metabolome data was constructed by integrating databases. High-resolution tandem mass spectra automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied to metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method.
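
    At its simplest, the database-search step reduces to matching a measured accurate mass against candidate metabolites within a ppm tolerance. The tiny dictionary below is a hypothetical stand-in for KNApSAcK/ReSpect queries, and the [M+H]+ masses are approximate, for illustration only.

```python
def search(mz, database, tol_ppm=5.0):
    """Return names of candidates whose mass is within tol_ppm of mz."""
    return [name for name, mass in database.items()
            if abs(mz - mass) / mass * 1e6 <= tol_ppm]

# Approximate monoisotopic [M+H]+ masses (illustrative values).
db = {"kaempferol [M+H]+": 287.0550,
      "quercetin [M+H]+": 303.0499,
      "sinapic acid [M+H]+": 225.0757}
print(search(287.0557, db))
```

    Real pipelines then score the tandem (MS/MS) fragment spectrum against database spectra to discriminate isomers that an accurate mass alone cannot separate.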

  13. Knowledge structure representation and automated updates in intelligent information management systems

    Science.gov (United States)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  14. Rule-based programming and strategies for automated generation of detailed kinetic models for gas phase combustion of polycyclic hydrocarbon molecules

    Energy Technology Data Exchange (ETDEWEB)

    Ibanescu, L.

    2004-06-15

    The primary objective of this thesis is to explore the use of rule-based systems and strategies for a complex problem in chemical kinetics: the automated generation of reaction mechanisms. Chemical reactions are naturally expressed as conditional rewriting rules, and the chaining of chemical reactions is easy to describe using a strategies language such as that of the ELAN system, developed in the Protheo team. The thesis presents the basic concepts of chemical kinetics and the chemical and computational problems related to the conception and validation of a reaction mechanism, and gives a general structure for the reaction mechanism generator called GasEI. Our research focuses on the primary mechanism generator. We give solutions for encoding the chemical species, the reactions and their chaining, and we present the prototype developed in ELAN. The representation of chemical species uses the notion of molecular graphs, encoded by a term structure called GasEI terms. Chemical reactions are expressed by rewriting rules on molecular graphs, encoded as conditional rewriting rules on GasEI terms. The strategies language of the ELAN system is used to express reaction chaining in the primary mechanism generator. This approach is illustrated by encoding ten generic reactions of oxidizing pyrolysis. Qualitative chemical validations of the prototype show that our approach gives, for acyclic molecules, the same results as existing mechanism generators, and for polycyclic molecules produces original results.

  15. TCRep 3D: an automated in silico approach to study the structural properties of TCR repertoires.

    Directory of Open Access Journals (Sweden)

    Antoine Leimgruber

    Full Text Available TCRep 3D is an automated systematic approach for TCR-peptide-MHC class I structure prediction, based on homology and ab initio modeling. It has been considerably generalized from former studies to be applicable to large repertoires of TCR. First, the locations of the complementarity-determining regions (CDRs) of the target sequences are automatically identified by a sequence alignment strategy against a database of TCR Vα and Vβ chains. A structure-based alignment ensures automated identification of CDR3 loops. The CDRs are then modeled in the environment of the complex, in an ab initio approach based on a simulated annealing protocol. During this step, dihedral restraints are applied to drive the CDR1 and CDR2 loops towards their canonical conformations, described by Al-Lazikani et al. We developed a new automated algorithm that determines additional restraints to iteratively converge towards TCR conformations making frequent hydrogen bonds with the pMHC. We demonstrated that our approach outperforms popular scoring methods (Anolea, Dope and Modeller) in predicting relevant CDR conformations. Finally, this modeling approach has been successfully applied to experimentally determined sequences of TCR that recognize the NY-ESO-1 cancer testis antigen. This analysis revealed a mechanism of selection of TCR through the presence of a single conserved amino acid in all CDR3β sequences. The important structural modifications predicted in silico, and the associated dramatic loss of experimental binding affinity upon mutation of this amino acid, show the good correspondence between the predicted structures and their biological activities. To our knowledge, this is the first systematic approach developed for large-scale TCR repertoire structural modeling.

  16. Integrating automated structured analysis and design with Ada programming support environments

    Science.gov (United States)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance; it can also lead to the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct and aid in finding obscure coding errors; however, they cannot detect errors in specifications or poor designs. TEAMWORK, an automated system for structured analysis and design that can be integrated with an APSE to support software system development from specification through implementation, is described. These tools complement each other, helping developers improve quality and productivity and reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with either of these systems used by itself.

  17. Automated Gel Size Selection to Improve the Quality of Next-generation Sequencing Libraries Prepared from Environmental Water Samples.

    Science.gov (United States)

    Uyaguari-Diaz, Miguel I; Slobodan, Jared R; Nesbitt, Matthew J; Croxen, Matthew A; Isaac-Renton, Judith; Prystajecky, Natalie A; Tang, Patrick

    2015-04-17

    Next-generation sequencing of environmental samples can be challenging because of the variable DNA quantity and quality in these samples. High-quality DNA libraries are needed for optimal results from next-generation sequencing. Environmental samples such as water may contain DNA of low quality and quantity, as well as contaminants that co-precipitate with DNA. The mechanical and enzymatic processes involved in extraction and library preparation may further damage the DNA. Gel size selection enables purification and recovery of DNA fragments of a defined size for sequencing applications. Nevertheless, this task is one of the most time-consuming steps in the DNA library preparation workflow. The protocol described here enables complete automation of agarose gel loading, electrophoretic analysis, and recovery of targeted DNA fragments. In this study, we describe a high-throughput approach to prepare high-quality DNA libraries from freshwater samples that can also be applied to other environmental samples. We used an indirect approach to concentrate bacterial cells from environmental freshwater samples; DNA was extracted using a commercially available DNA extraction kit, and DNA libraries were prepared using a commercial transposon-based protocol. DNA fragments of 500 to 800 bp were gel size selected using Ranger Technology, an automated electrophoresis workstation. Sequencing of the size-selected DNA libraries demonstrated significant improvements to read length and quality of the sequencing reads.

  18. Automated Generation of Machine Verifiable and Readable Proofs: A Case Study of Tarski's Geometry

    OpenAIRE

    Stojanovic Durdevic, Sana; Narboux, Julien; Janicic, Predrag

    2015-01-01

    International audience; The power of state-of-the-art automated and interactive theorem provers has reached the level at which a significant portion of non-trivial mathematical content can be formalized almost fully automatically. In this paper we present our framework for the formalization of mathematical knowledge that can produce machine verifiable proofs (for different proof assistants) but also human-readable (nearly textbook-like) proofs. As a case study, we focus on one of the twent...

  19. A new Matlab coder for generating Structured Text Language from matrix expression for PLC and PAC controllers

    Science.gov (United States)

    Buciakowski, Mariusz; Witczak, Piotr

    2017-01-01

    This paper presents a new Matlab toolbox for the synthesis of Structured Text (ST) code for Programmable Logic Controllers (PLCs) and Programmable Automation Controllers (PACs). The tool can directly generate IEC 61131-3 Structured Text from a Matlab script for selected Integrated Development Environments (IDEs). The generated code can be verified and compared with the results obtained from Matlab simulation. The code can then be compiled in an IDE and uploaded to a PLC or PAC controller for final verification. This approach leaves all available Matlab toolboxes at the programmer's disposal, allowing fast and easy synthesis of the developed algorithms.
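The core task of such a coder — turning a matrix expression into scalar IEC 61131-3 loops — can be illustrated with a minimal sketch. The generator below (in Python, standing in for the Matlab toolbox) emits ST for an elementwise matrix addition; the variable names and loop layout are assumptions for illustration, not the toolbox's actual output.

```python
def st_matrix_add(n, m, a="A", b="B", c="C"):
    """Emit IEC 61131-3 Structured Text computing C := A + B elementwise
    on n-by-m arrays. Names and layout are illustrative assumptions."""
    lines = [
        f"FOR i := 1 TO {n} DO",
        f"  FOR j := 1 TO {m} DO",
        f"    {c}[i, j] := {a}[i, j] + {b}[i, j];",
        "  END_FOR;",
        "END_FOR;",
    ]
    return "\n".join(lines)

st_code = st_matrix_add(3, 3)
print(st_code)
```

A real coder would additionally declare the `ARRAY [1..n, 1..m] OF REAL` variables and handle products, transposes, and operator precedence.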

  20. A Knowledge Based Approach for Automated Modelling of Extended Wing Structures in Preliminary Aircraft Design

    OpenAIRE

    Dorbath, Felix; Nagel, Björn; Gollnick, Volker

    2011-01-01

    This paper introduces the concept of the ELWIS model generator for Finite Element models of aircraft wing structures. The physical modelling of the structure is extended beyond the wing primary structures, to increase the level of accuracy for aircraft which diverge from existing configurations. Also the impact of novel high lift technologies on structural masses can be captured already in the early stages of design by using the ELWIS models. The ELWIS model generator is able to c...

  1. Automated Calculation Scheme for alpha^n Contributions of QED to Lepton g-2: Generating Renormalized Amplitudes for Diagrams without Lepton Loops

    CERN Document Server

    Aoyama, T; Kinoshita, T; Nio, M

    2006-01-01

    Among the 12672 Feynman diagrams contributing to the electron anomalous magnetic moment at the tenth order, 6354 have no lepton loops, i.e., they are of quenched type. Because the renormalization structure of these diagrams is very complicated, some automation scheme is indispensable for calculating them. We developed an algorithm that writes down FORTRAN programs for numerical evaluation of these diagrams, in which the counterterms needed to subtract ultraviolet subdivergences are generated according to Zimmermann's forest formula. Thus far we have crudely evaluated the integrals of 2232 tenth-order vertex diagrams which require vertex renormalization only. The remaining 4122 diagrams, which have ultraviolet-divergent self-energy subdiagrams and infrared-divergent subdiagrams, are being evaluated by giving a small mass lambda to the photons to control the infrared problem.

  2. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing

    Science.gov (United States)

    2014-01-01

    Background The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. Results Here, we provide a cost effective method for the production of 96-well plate compatible zebrafish orientation tools using a desktop 3D printer. The printed tools enable the positioning and orientation of zebrafish embryos within cavities formed in agarose. Their applicability is demonstrated by acquiring lateral and dorsal views of zebrafish embryos arrayed within microtiter plates using an automated screening microscope. This enables the consistent visualization of morphological phenotypes and reporter gene expression patterns. Conclusions The designs are refined versions of previously demonstrated devices with added functionality and strongly reduced production costs. All corresponding 3D models are freely available and digital design can be easily shared electronically. In combination with the increasingly widespread usage of 3D printers, this provides access to the developed tools to a wide range of zebrafish users. Finally, the design files can serve as templates for other additive and subtractive fabrication methods. PMID:24886511

  3. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    Science.gov (United States)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task for planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that need to be done manually or using semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of the roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, we need a geometric description of structural elements in order to obtain a structural model consisting of beam axis and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a quadratic cross section and no bending. Deformations or damages such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.
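The central step from millions of surface points to a structural model — recovering a beam axis from a scanned beam segment — can be sketched with a principal-component line fit. This is a generic technique under the assumption of a straight beam, not the authors' full pipeline (which also extracts cross sections and connections).

```python
import numpy as np

def beam_axis(points):
    """Fit a beam axis to a point-cloud segment: the line through the
    centroid along the first principal direction of the points."""
    centroid = points.mean(axis=0)
    # First right-singular vector of the centered cloud = axis direction.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[0]

# Synthetic "beam": points scattered tightly around the x-axis.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 5.0, size=(200, 1))
points = np.hstack([t, 0.01 * rng.standard_normal((200, 2))])
centroid, direction = beam_axis(points)
```

As the abstract notes, this kind of fit degrades for bent or damaged beams, where a single straight axis no longer describes the member.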

  4. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Tuszynski, Tobias; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Seese, Anita; Barthel, Henryk [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Rullmann, Michael; Hesse, Swen; Sabri, Osama [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Leipzig University Medical Centre, Integrated Treatment and Research Centre (IFB) Adiposity Diseases, Leipzig (Germany); Gertz, Hermann-Josef [Leipzig University Medical Centre, Department of Psychiatry, Leipzig (Germany); Lobsien, Donald [Leipzig University Medical Centre, Department of Neuroradiology, Leipzig (Germany)

    2016-06-15

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis. (orig.)
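The two quantities the study compares — SUVRs against a cerebellar reference and Cohen's d effect sizes between groups — are simple to compute. The sketch below uses made-up SUVR values, not the study's data.

```python
import numpy as np

def suvr(region_uptake, reference_uptake):
    """Standardized uptake value ratio against a reference region
    (here, the cerebellar cortex)."""
    return np.asarray(region_uptake) / np.asarray(reference_uptake)

def cohens_d(a, b):
    """Effect size: difference of means over the pooled standard deviation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Illustrative composite-neocortex SUVRs (made up, not the study's data):
ad_suvr = [1.8, 1.9, 2.0, 1.7]   # Alzheimer's dementia group
hc_suvr = [1.2, 1.3, 1.1, 1.25]  # healthy controls
effect = cohens_d(ad_suvr, hc_suvr)
```

The study's group comparison additionally used Mann-Whitney U tests and Pearson correlations between automated and hand-drawn VOIs, which follow the same per-region pattern.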

  5. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    Energy Technology Data Exchange (ETDEWEB)

    Li, Fenglei [Iowa State Univ., Ames, IA (United States)

    2006-08-09

    The purposes of our research were: (1) To develop an economical, easy to use, automated, high throughput system for large scale protein crystallization screening. (2) To develop a new protein crystallization method with high screening efficiency, low protein consumption and complete compatibility with high throughput screening system. (3) To determine the structure of lactate dehydrogenase complexed with NADH by x-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated large scale protein crystallization screening can be performed in a high throughput manner with low cost, easy operation. The overall system integrates liquid dispensing, crystallization and detection and serves as a whole solution to protein crystallization screening. The system can dispense protein and multiple different precipitants in nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible light detector for detecting protein crystallization screening results. This detection scheme has capability of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high throughput and non-destructive manner. The entire system from liquid dispensing, crystallization to crystal detection is essentially parallel, high throughput and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption and compatibility with automation and high throughput. In this crystallization method, a gas permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for given amount of protein. In addition

  6. Automated brain structure segmentation based on atlas registration and appearance models

    DEFF Research Database (Denmark)

    van der Lijn, Fedde; de Bruijne, Marleen; Klein, Stefan;

    2012-01-01

    Accurate automated brain structure segmentation methods facilitate the analysis of large-scale neuroimaging studies. This work describes a novel method for brain structure segmentation in magnetic resonance images that combines information about a structure's location and appearance. The spatial model is implemented by registering multiple atlas images to the target image and creating a spatial probability map. The structure's appearance is modeled by a classifier based on Gaussian scale-space features. These components are combined with a regularization term in a Bayesian framework that is globally optimized using graph cuts. The incorporation of the appearance model enables the method to segment structures with complex intensity distributions and increases its robustness against errors in the spatial model. The method is tested in cross-validation experiments on two datasets acquired...
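The combination of spatial prior and appearance model can be sketched per voxel as a Bayesian product. The numbers below are made up, and the paper's graph-cut regularization term is deliberately omitted; this only illustrates how the two probability sources interact.

```python
import numpy as np

# Per-voxel posterior for a binary structure label, combining a spatial
# prior (from registered atlases) with an appearance-based probability
# (classifier output). Illustrative values only.
spatial_prior = np.array([0.9, 0.5, 0.1])   # P(structure) from the atlas map
appearance = np.array([0.6, 0.8, 0.7])      # P(structure | intensity features)

inside = spatial_prior * appearance
outside = (1.0 - spatial_prior) * (1.0 - appearance)
posterior = inside / (inside + outside)     # normalized per voxel
labels = posterior > 0.5                    # hard segmentation
```

Note the third voxel: a confident appearance score (0.7) is overruled by a strong spatial prior against the structure (0.1), which is exactly the robustness interplay the abstract describes.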

  7. Heuristic Approach of Automated Test Data Generation for Program having Array of Different Dimensions and Loops with Variable Number of Iteration

    CERN Document Server

    Tahbildar, Hitesh

    2010-01-01

    Program execution normally spends most of its time in loops, so automated test data generation devotes special attention to loops for better coverage. Automated test data generation for programs having loops with a variable number of iterations and variable-length arrays is a challenging problem, because the number of paths may increase exponentially with the array size for some programming constructs, such as merge sort. We propose a method that finds heuristics for different types of programming constructs with loops and arrays. Linear search, bubble sort, merge sort, and matrix multiplication programs are included in an attempt to highlight the difference in execution between a single loop over a variable-length array and nested loops over one- and two-dimensional arrays. We use two parameters/heuristics to predict the minimum number of iterations required for generating automated test data: the longest path level (kL) and the saturation level (kS). The proceedings of our work includes the instrum...
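The saturation-level idea can be sketched as a stop rule for random test-data generation: keep generating inputs until kS consecutive inputs add no new branch coverage. The instrumentation below is a minimal illustration on bubble sort, not the paper's method; the branch identities and the kS threshold are simplified assumptions.

```python
import random

def bubble_sort_traced(a):
    """Bubble sort instrumented to record which branch outcomes execute."""
    branches = set()
    a = list(a)
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            taken = a[j] > a[j + 1]
            branches.add(taken)       # record the comparison outcome
            if taken:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, branches

def generate_until_saturated(size, k_s=5, seed=1):
    """Random test-data generation with a saturation-level stop rule:
    halt after k_s consecutive inputs that cover no new branch."""
    rng = random.Random(seed)
    covered, stale, trials = set(), 0, 0
    while stale < k_s:
        data = [rng.randint(0, 9) for _ in range(size)]
        _, branches = bubble_sort_traced(data)
        trials += 1
        if branches - covered:
            covered |= branches
            stale = 0
        else:
            stale += 1
    return covered, trials
```

For constructs like merge sort, where path count grows exponentially with array size, such a stop rule bounds the search instead of enumerating paths.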

  8. Instructional Topics in Educational Measurement (ITEMS) Module: Using Automated Processes to Generate Test Items

    Science.gov (United States)

    Gierl, Mark J.; Lai, Hollis

    2013-01-01

    Changes to the design and development of our educational assessments are resulting in the unprecedented demand for a large and continuous supply of content-specific test items. One way to address this growing demand is with automatic item generation (AIG). AIG is the process of using item models to generate test items with the aid of computer…
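The item-model idea behind AIG can be sketched as a stem template with variable slots plus a rule for the answer key. The template and slot values below are invented for illustration; real AIG systems add distractor models and cognitive constraints on slot combinations.

```python
from itertools import product

# A toy "item model": a stem template with variable slots plus a rule
# for the answer key.
STEM = "A train travels {speed} km/h for {hours} hours. How far does it travel?"

def generate_items(speeds, hours):
    """Instantiate the item model over the Cartesian product of slot values."""
    items = []
    for s, h in product(speeds, hours):
        items.append({"stem": STEM.format(speed=s, hours=h), "key_km": s * h})
    return items

bank = generate_items([60, 80, 100], [2, 3])  # 3 x 2 = 6 generated items
```

Even this toy model shows how a single authored template yields a continuous supply of content-parallel items, which is the demand the module addresses.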

  9. The next-generation Hybrid Capture High-Risk HPV DNA assay on a fully automated platform.

    Science.gov (United States)

    Eder, Paul S; Lou, Jianrong; Huff, John; Macioszek, Jerzy

    2009-07-01

    A next-generation diagnostic system has been developed at QIAGEN. The QIAensemble system consists of an analytical subsystem (JE2000) that utilizes a re-engineered Hybrid Capture chemistry (NextGen) to maintain the high level of clinical sensitivity established by the digene High-Risk HPV DNA Test (HC2), while creating improved analytical specificity as shown both in plasmid-based analyses and in processing of clinical specimens. Limit-of-detection and cross-reactivity experiments were performed using plasmid DNA constructs containing multiple high-risk (HR) and low-risk (LR) HPV types. Cervical specimens collected into a novel specimen collection medium, DCM, were used to measure stability of specimens, as well as analytical specificity. Signal carryover, instrument precision, and specimen reproducibility were measured on the prototype JE2000 system using the automated NextGen assay. The Limit of Detection (LOD) is HPV 16 plasmid in the automated assay. No cross-reactivity (signal above cutoff) was detected on the automated system from any of 13 LR types tested at 10^7 copies per assay. Within-plate, plate-to-plate, and day-to-day performance in the prototype system yielded a CV of 20%. No indication of target carryover was found when samples containing up to 10^9 copies/ml of HPV DNA type 16 were processed on the JE2000 instrument. In an agreement study with HC2, 1038 donor cervical specimens were tested in both the manual NextGen assay and HC2 to evaluate agreement between the two tests. After eliminating discrepant specimens that were adjudicated by HR-HPV genotyping, the adjudicated positive agreement was 98.5% (95% CI: 94.6, 99.6). The JE2000 prototype system automates NextGen assay processing, yielding accurate, reproducible, and highly specific results with both plasmid analytical model tests and cervical specimens collected in DCM. The final system will process more than 2000 specimens in an 8-hour shift, with fully continuous loading.

  10. Analysis of Numerically Generated Wake Structures

    DEFF Research Database (Denmark)

    Ivanell, S.; Sørensen, Jens Nørkær; Mikkelsen, Robert Flemming;

    2009-01-01

    Direct numerical simulations of the Navier-Stokes equations are performed to achieve a better understanding of the behaviour of wakes generated by wind turbines. The simulations are performed by combining the in-house developed computer code EllipSys3D with the actuator-line methodology...

  11. Low cost home automation system: a support to the ecological electricity generation in Colombia

    Directory of Open Access Journals (Sweden)

    Elmer Alejandro Parada Prieto

    2016-09-01

    Full Text Available Context/Objective: In Colombia, residential electricity consumption accounts for about 40% of the national demand; therefore, alternatives to reduce this consumption are needed. The goal of this study was to develop a home automation prototype to control the illumination of a household and to foster the efficient use of energy. Method: The system consists of independent control modules and an information manager module; each control module regulates the luminaires using a microcontroller and a presence sensor, and exchanges data by means of a radio frequency transceiver; the manager module allows access to the control modules from a Web interface. The prototype was implemented in a household located in the city of San José de Cúcuta, Colombia, over a 60-day period. Results: The operation of the system diminished total electricity consumption by 3.75%, with a z-score of -1.93 obtained from the statistical analysis. Conclusions: We concluded that the prototype is inexpensive in comparison to similar technologies available in the national and international markets, and that it reduces the waste of electrical energy due to the consumption habits of the residents in the case study.

  12. An automated process for generating archival data files from MATLAB figures

    Science.gov (United States)

    Wallace, G. M.; Greenwald, M.; Stillerman, J.

    2016-10-01

    A new directive from the White House Office of Science and Technology Policy requires that all publications supported by federal funding agencies (e.g. Department of Energy Office of Science, National Science Foundation) include machine-readable datasets for figures and tables. An automated script was developed at the PSFC to make this process easier for authors using the MATLAB plotting environment to create figures. All relevant data (x, y, z, errorbars) and metadata (line style, color, symbol shape, labels) are contained within the MATLAB .fig file created when saving a figure. The export_fig script extracts data and metadata from a .fig file and exports it into an HDF5 data file with no additional user input required. Support is included for a number of plot types including 2-D and 3-D line, contour, and surface plots, quiver plots, bar graphs, and histograms. This work supported by US Department of Energy cooperative agreement DE-FC02-99ER54512 using the Alcator C-Mod tokamak, a DOE Office of Science user facility.
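A MATLAB .fig file is a MAT-file holding a tree of graphics objects, so the extraction step amounts to walking that tree and collecting each line's data and metadata. The sketch below mimics the tree with a hand-made dict and serializes to JSON as a stand-in; the real export_fig script reads the actual MATLAB structure and writes HDF5.

```python
import json

# Hand-made stand-in for the graphics-object tree inside a .fig file.
fig = {"children": [
    {"type": "line",
     "properties": {"XData": [1, 2, 3], "YData": [2.0, 4.1, 6.2],
                    "Color": [0, 0, 1], "DisplayName": "run 1"}},
]}

def export_lines(fig_tree):
    """Collect data and metadata from every line object in the tree."""
    records = []
    for child in fig_tree["children"]:
        if child["type"] == "line":
            props = child["properties"]
            records.append({"x": props["XData"], "y": props["YData"],
                            "label": props.get("DisplayName", ""),
                            "color": props.get("Color")})
    return records

archive = json.dumps(export_lines(fig), indent=2)  # HDF5 in the real tool
```

Because style properties (color, label) travel with the data, the archival file stays self-describing, which is the point of the directive.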

  13. Generation and Performance of Automated Jarosite Mineral Detectors for Vis/NIR Spectrometers at Mars

    Science.gov (United States)

    Gilmore, M. S.; Bornstein, B.; Merrill, M. D.; Castano, R.; Greenwood, J. P.

    2005-01-01

    Sulfate salt discoveries at the Eagle and Endurance craters in Meridiani Planum by the Mars Exploration Rover Opportunity have proven mineralogically the existence and involvement of water in Mars past. Visible and near infrared spectrometers like the Mars Express OMEGA, the Mars Reconnaissance Orbiter CRISM and the 2009 Mars Science Laboratory Rover cameras are powerful tools for the identification of water-bearing salts and other high priority minerals at Mars. The increasing spectral resolution and rover mission lifetimes represented by these missions currently necessitate data compression in order to ease downlink restrictions. On board data processing techniques can be used to guide the selection, measurement and return of scientifically important data from relevant targets, thus easing bandwidth stress and increasing scientific return. We have developed an automated support vector machine (SVM) detector operating in the visible/near-infrared (VisNIR, 300-2500 nm) spectral range trained to recognize the mineral jarosite (typically KFe3(SO4)2(OH)6), positively identified by the Mossbauer spectrometer at Meridiani Planum. Additional information is included in the original extended abstract.
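The paper trains an SVM; a common simpler baseline for onboard mineral flagging is a continuum-removed band-depth test on a diagnostic absorption. The sketch below uses that baseline on synthetic spectra with a Gaussian absorption near the ~2265 nm Fe-OH feature of jarosite; the band positions and the 0.05 detection threshold are illustrative assumptions, not the paper's detector.

```python
import numpy as np

def band_depth(wl, refl, left, center, right):
    """Continuum-removed band depth: 1 - R(center)/continuum(center),
    where the continuum is a straight line between the two shoulders."""
    r_left, r_center, r_right = np.interp([left, center, right], wl, refl)
    frac = (center - left) / (right - left)
    continuum = r_left + frac * (r_right - r_left)
    return 1.0 - r_center / continuum

# Synthetic Vis/NIR spectra (300-2500 nm): a flat spectrum and one with
# a Gaussian absorption near 2265 nm.
wl = np.linspace(300.0, 2500.0, 221)
flat = np.full_like(wl, 0.5)
jarosite_like = flat - 0.15 * np.exp(-((wl - 2265.0) ** 2) / (2 * 30.0 ** 2))

is_detection = band_depth(wl, jarosite_like, 2150.0, 2265.0, 2380.0) > 0.05
```

An SVM trained on whole spectra, as in the paper, generalizes this by learning the discriminating shape rather than relying on one hand-picked band.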

  14. Structured Beam Generation with a Single Metasurface

    CERN Document Server

    Yue, Fuyong; Xin, Jingtao; Gerardot, Brian; Li, Jensen; Chen, Xianzhong

    2016-01-01

    Despite a plethora of applications ranging from quantum memories to high-resolution lithography, the current technologies to generate vector vortex beams (VVBs) suffer from less efficient energy use, poor resolution, low damage threshold, bulky size and complicated experimental setup, preventing further practical applications. We propose and experimentally demonstrate an approach to generate VVBs with a single metasurface by locally tailoring phase and transverse polarization distribution. This method features the spin-orbit coupling and the superposition of the converted part with an additional phase pickup and the residual part without a phase change. By maintaining the equal components for the converted part and the residual part, the cylindrically polarized vortex beams carrying orbital angular momentum are experimentally demonstrated based on a single metasurface at subwavelength scale. The proposed approach provides unprecedented freedom in engineering the properties of optical waves with the high-effic...

  15. Characteristic flow patterns generated by macrozoobenthic structures

    Science.gov (United States)

    Friedrichs, M.; Graf, G.

    2009-02-01

    A laboratory flume channel, equipped with an acoustic Doppler flow sensor and a bottom scanning laser, was used for detailed, non-intrusive flow measurements (at 2 cm s^-1 and 10 cm s^-1) around solitary biogenic structures, combined with high-resolution mapping of the structure shape and position. The structures were replicates of typical macrozoobenthic species commonly found in the Mecklenburg Bight and with a presumed influence on both the near-bed current regime and sediment transport dynamics: a worm tube, a snail shell, a mussel, a sand mound, a pit, and a cross-stream track furrow. The flow was considerably altered locally by the different protruding structures (worm tube, snail, mussel and mound). They reduced the horizontal approach velocity by 72% to 79% in the wake zone at about 1-2 cm height, and the flow was deflected around the structures with vertical and lateral velocities of up to 10% and 20% of the free-stream velocity respectively in a region adjacent to the structures. The resulting flow separation (at flow Reynolds numbers of about 4000 and 20,000, respectively) divided an outer deflection region from an inner region with characteristic vortices and the wake region. All protruding structures showed this general pattern, but also produced individual characteristics. Conversely, the depressions (track and pit) only had a weak influence on the local boundary layer flow, combined with a considerable flow reduction within their cavities (between 29% and 53% of the free-stream velocity). A longitudinal vortex formed, below which a stagnant space was found. The average height affected by the structure-related mass flow rate deficit for the two velocities was 1.6 cm and 1.3 cm respectively (80% of height and 64%) for the protruding structures and 0.6 cm and 0.9 cm (90% and 127% of depth) for the depressions. Marine benthic soft-bottom macrozoobenthos species are expected to benefit from the flow modifications they induce, particularly in terms of

  16. Steam generation process control and automation; Automacao e controle no processo de geracao de vapor

    Energy Technology Data Exchange (ETDEWEB)

    Souza Junior, Jose Cleodon de; Silva, Walmy Andre C.M. da [PETROBRAS S.A., Natal, RN (Brazil)

    2004-07-01

    This paper describes the implementation of a Supervisory Control and Data Acquisition (SCADA) system in the steam generation process for injection in heavy oil fields of the Alto do Rodrigues Production Asset, developed by PETROBRAS E&P/UN-RNCE. This Asset is located in the northeastern region of Brazil, in Rio Grande do Norte State. The paper addresses the steam generators for injection in oil wells and the upgrade project, which installed remote terminal units and a new panel controlled by a PLC, replaced all pneumatic transmitters with electronic ones, and incorporated steam quality and oxygen control, providing remote supervision of the process. It also discusses the improvements obtained in steam generation after the changes in the conception of the control and safety systems. (author)

  17. Development of Closed-Loop Simulation Methods for a Next-Generation Terminal Area Automation System

    Science.gov (United States)

    Robinson, John E., III; Isaacson, Douglas R.

    2002-01-01

    A next-generation air traffic decision support tool, known as the Active Final Approach Spacing Tool (aFAST), will generate heading, speed and altitude commands to achieve more precise separation of aircraft in the terminal area. The techniques used to analyze the performance of earlier generation decision support tools are not adequate to analyze the performance of aFAST. This paper summarizes the development of a new and innovative fully closed-loop testing method for aFAST. This method, called trajectory feedback testing, closes each aircraft's control loop inside of the aFAST scheduling algorithm. Validation of trajectory feedback testing by examination of the variation of aircraft time-of-arrival predictions between schedule updates and the variation of aircraft excess separation distances between simulation runs is presented.

  18. Fast, Automated, Scalable Generation of Textured 3D Models of Indoor Environments

    Science.gov (United States)

    2014-12-18

    The report builds on earlier work on indoor localization and visualization using a human-operated backpack system (Indoor Positioning and Indoor Navigation, IPIN 2010). Applications of the generated textured 3D indoor models include walk-throughs of environments, gaming entertainment, augmented reality, indoor navigation, and energy simulation analysis. These applications rely on similar techniques to recover point estimates of wall positions; when generating a mesh of indoor environments, one application allows for imagery

  19. Nano Structured Devices for Energy Generation

    DEFF Research Database (Denmark)

    Radziwon, Michal Jędrzej

    This work focuses on enhancing the efficiency of α-sexithiophene / buckminsterfullerene (α-6T / C60) inverted bilayer organic solar cells by introducing crystalline nanostructures in the electron donor layer. In order to utilize the charge carrier mobility anisotropy in crystalline α-6T... fluorescence polarimetry and X-ray diffractometry (XRD). Layer thicknesses of inverted α-6T / C60 bilayer organic solar cells fabricated at room temperature were optimized to obtain the model device for the performance enhancement studies. By variation of the substrate temperature during deposition of α-6T, the structures... structures in solar cells, the orientation of the individual molecules should favor charge transport perpendicular to the substrate plane. Such orientation is realized from α-6T molecules lying on the substrate, which additionally infers the preferred orientation of the transition dipole for maximal light...

  20. A computer generator for randomly layered structures

    Institute of Scientific and Technical Information of China (English)

    YU Jia-shun; HE Zhen-hua

    2004-01-01

    An algorithm is introduced in this paper for the synthesis of randomly layered earth models. Under the assumption that the layering and the physical parameters for a layer are random variables with truncated normal distributions, random numbers sampled from the distributions can be used to construct the layered structure and determine physical parameters for the layers. To demonstrate its application, random models were synthesized for the modelling of seismic ground motion amplification of a site with uncertainties in its model parameters.
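A minimal sketch of this kind of generator, assuming an illustrative two-parameter layer description (thickness and shear-wave velocity) and simple rejection sampling for the truncated normals; the paper does not specify its parameterisation, so all names and values below are invented:

```python
import random

def truncated_normal(mean, std, low, high, rng):
    """Rejection-sample a normal variate truncated to [low, high]."""
    while True:
        x = rng.gauss(mean, std)
        if low <= x <= high:
            return x

def random_layered_model(layer_specs, rng):
    """Draw one random layered earth model from per-layer
    truncated-normal specs (illustrative parameterisation)."""
    return [
        {
            "thickness_m": truncated_normal(*spec["thickness"], rng),
            "vs_m_per_s": truncated_normal(*spec["velocity"], rng),
        }
        for spec in layer_specs
    ]

# Three random layers over a half-space; tuples are (mean, std, low, high).
specs = [
    {"thickness": (5, 1, 3, 7),    "velocity": (200, 30, 150, 250)},
    {"thickness": (10, 2, 6, 14),  "velocity": (400, 50, 300, 500)},
    {"thickness": (20, 4, 12, 28), "velocity": (800, 80, 600, 1000)},
]
rng = random.Random(42)
models = [random_layered_model(specs, rng) for _ in range(100)]
```

Each realisation could then be fed to a 1D site-response code, propagating the model-parameter uncertainty into ground-motion amplification estimates.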

  1. Automated Assignment of MS/MS Cleavable Cross-Links in Protein 3D-Structure Analysis

    Science.gov (United States)

    Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H.; Schäfer, Mathias; Sinz, Andrea

    2015-01-01

    CID-MS/MS cleavable cross-linkers hold an enormous potential for an automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The characteristic fragment ion patterns created can easily be used for an automated assignment and discrimination of cross-linked products. To date, there are only a few software solutions available that make use of these properties, but none allows for an automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows an automatic screening of characteristic fragment ions, considering static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for a correct assignment of cross-links, and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access to an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at www.StavroX.com.

  2. Automating the generation of lexical patterns for processing free text in clinical documents.

    Science.gov (United States)

    Meng, Frank; Morioka, Craig

    2015-09-01

    Many tasks in natural language processing utilize lexical pattern-matching techniques, including information extraction (IE), negation identification, and syntactic parsing. However, it is generally difficult to derive patterns that achieve acceptable levels of recall while also remaining highly precise. We present a multiple sequence alignment (MSA)-based technique that automatically generates patterns, thereby leveraging language usage to determine the context of words that influence a given target. MSAs capture the commonalities among word sequences and are able to reveal areas of linguistic stability and variation. In this way, MSAs provide a systematic approach to generating lexical patterns that are generalizable, which both increases recall and maintains high levels of precision. The MSA-generated patterns exhibited consistent F1, F0.5 and F2 scores compared to two baseline techniques for IE across four different tasks. Both baseline techniques performed well for some tasks and less well for others, but MSA was found to consistently perform at a high level for all four tasks. The performance of MSA on the four extraction tasks indicates the method's versatility. The results show that the MSA-based patterns are able to handle the extraction of individual data elements as well as relations between two concepts without the need for large amounts of manual intervention. We presented an MSA-based framework for generating lexical patterns that showed consistently high levels of both performance and recall over four different extraction tasks when compared to baseline methods.
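The core idea, stable alignment columns becoming literals and variable columns becoming wildcards, can be illustrated on a toy, pre-aligned set of token sequences. The alignment step itself and the gap-handling policy here are simplifications invented for illustration; the paper's MSA machinery is far richer:

```python
def pattern_from_alignment(alignment, wildcard="*"):
    """Collapse a token-level multiple alignment into a lexical pattern:
    columns where all non-gap tokens agree become literals, variable
    columns become wildcards (a design choice, not the paper's exact rule)."""
    pattern = []
    for col in zip(*alignment):
        tokens = {t for t in col if t != "-"}  # ignore gap symbols
        if len(tokens) == 1:
            pattern.append(tokens.pop())
        else:
            pattern.append(wildcard)
    return pattern

# Hypothetical pre-aligned snippets from radiology-style sentences.
aligned = [
    ["no", "evidence", "of", "acute", "fracture"],
    ["no", "sign",     "of", "acute", "bleed"],
    ["no", "evidence", "of", "-",     "pneumonia"],
]
```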

  3. Intelligent Automated Process Planning and Code Generation for Computer-Controlled Inspection

    Science.gov (United States)

    1994-01-01

    Report excerpt fragments: Lisp code such as (setf x2 (multiply-vector-and-matrix x2 rz)), with Figure 24 showing how coordinate frame xyz is rotated into its setup orientation by three ordered rotations; and reference-list entries including LeClair, Steven R., 1991, "The Rapid Design System: Memory-Driven Feature-Based Design"; a Manufacturing Review article on code generation for PCB assembly; and Takefuji, Yoshiyasu, 1992, Neural Network Parallel Computing (Boston, MA: Kluwer).

  4. Improved reliability, accuracy and quality in automated NMR structure calculation with ARIA

    Energy Technology Data Exchange (ETDEWEB)

    Mareuil, Fabien [Institut Pasteur, Cellule d' Informatique pour la Biologie (France); Malliavin, Thérèse E.; Nilges, Michael; Bardiaux, Benjamin, E-mail: bardiaux@pasteur.fr [Institut Pasteur, Unité de Bioinformatique Structurale, CNRS UMR 3528 (France)

    2015-08-15

    In biological NMR, assignment of NOE cross-peaks and calculation of atomic conformations are critical steps in the determination of reliable high-resolution structures. ARIA is an automated approach that performs NOE assignment and structure calculation in a concomitant manner in an iterative procedure. The log-harmonic shape for the distance restraint potential and the Bayesian weighting of distance restraints, recently introduced in ARIA, were shown to significantly improve the quality and the accuracy of determined structures. In this paper, we propose two modifications of the ARIA protocol: (1) the softening of the force field together with adapted hydrogen radii, which is meaningful in the context of the log-harmonic potential with Bayesian weighting; (2) a procedure that automatically adjusts the violation tolerance used in the selection of active restraints, based on the fitting of the structure to the input data sets. The new ARIA protocols were fine-tuned on a set of eight protein targets from the CASD-NMR initiative. As a result, the convergence problems previously observed for some targets were resolved and the obtained structures exhibited better quality. In addition, the new ARIA protocols were applied to the structure calculation of ten new CASD-NMR targets in a blind fashion, i.e. without knowing the actual solution. Even though optimisation of parameters and pre-filtering of unrefined NOE peak lists were necessary for half of the targets, ARIA consistently and reliably determined very precise and highly accurate structures for all cases. In the context of integrative structural biology, an increasing number of experimental methods are used that produce distance data for the determination of 3D structures of macromolecules, stressing the importance of methods that successfully make use of ambiguous and noisy distance data.

  5. SeqReporter: automating next-generation sequencing result interpretation and reporting workflow in a clinical laboratory.

    Science.gov (United States)

    Roy, Somak; Durso, Mary Beth; Wald, Abigail; Nikiforov, Yuri E; Nikiforova, Marina N

    2014-01-01

    A wide repertoire of bioinformatics applications exists for next-generation sequencing data analysis; however, certain requirements of the clinical molecular laboratory limit their use: i) comprehensive report generation, ii) compatibility with existing laboratory information systems and computer operating systems, iii) knowledgebase development, iv) quality management, and v) data security. SeqReporter is a web-based application developed using the ASP.NET framework version 4.0. The client side was designed using HTML5, CSS3 and JavaScript. The server-side processing (VB.NET) relied on interaction with a customized SQL Server 2008 R2 database. Overall, 104 cases (1062 variant calls) were analyzed by SeqReporter. Each variant call was classified into one of five report levels: i) known clinical significance, ii) uncertain clinical significance, iii) pending pathologist's review, iv) synonymous and deep intronic, and v) platform- and panel-specific sequence errors. SeqReporter correctly annotated and classified 99.9% (859 of 860) of sequence variants, including 68.7% synonymous single-nucleotide variants, 28.3% nonsynonymous single-nucleotide variants, 1.7% insertions and 1.3% deletions. One variant of potential clinical significance was re-classified after pathologist review. Laboratory information system-compatible clinical reports were generated automatically. SeqReporter also facilitated quality management activities. SeqReporter is an example of a customized and well-designed informatics solution that optimizes and automates the downstream analysis of clinical next-generation sequencing data, and we propose it as a model for the development of comprehensive clinical informatics solutions.

  6. Subsystem of automated generating of «Portfolio in figures» on the basis of the index-rating system of estimation of students’ activity

    Directory of Open Access Journals (Sweden)

    Константин Васильевич Рочев

    2013-12-01

    Full Text Available This article describes the capabilities of the index-rating system for evaluating students' activity, in particular its use for automated generation of a portfolio in the traditional text format and of a «Portfolio in figures» that presents a student's results relative to other students.

  7. Toward Automated FAÇADE Texture Generation for 3d Photorealistic City Modelling with Smartphones or Tablet Pcs

    Science.gov (United States)

    Wang, S.

    2012-07-01

    An automated model-image fitting algorithm is proposed in this paper for generating façade texture images from pictures taken by smartphones or tablet PCs. Façade texture generation requires tremendous labour and has thus been the bottleneck of 3D photo-realistic city modelling. With the advanced development of micro electro mechanical systems (MEMS), a camera, a global positioning system (GPS) receiver and a gyroscope (G-sensors) can all be integrated into a smartphone or a tablet PC. These sensors bring the possibility of direct georeferencing for the pictures taken by smartphones or tablet PCs. Since the accuracy of these sensors cannot be compared with that of surveying instruments, the image position and orientation derived from them are not accurate enough for photogrammetric measurements. This paper adopts the least-squares model-image fitting (LSMIF) algorithm to iteratively improve the image's exterior orientation. The image position from GPS and the image orientation from the gyroscope are treated as initial values. By fitting the projection of the wireframe model to the extracted edge pixels in the image, the image exterior orientation elements are solved when the optimal fit is achieved. With exact exterior orientation elements, the wireframe model of the building can be correctly projected onto the image and, therefore, the façade texture image can be extracted from the picture.

  8. prepare_taxa_charts.py: A Python program to automate generation of publication ready taxonomic pie chart images from QIIME.

    Science.gov (United States)

    Lakhujani, Vijay; Badapanda, Chandan

    2017-06-01

    QIIME (Quantitative Insights Into Microbial Ecology) is one of the most popular open-source bioinformatics suites for performing metagenome, 16S rRNA amplicon and Internal Transcribed Spacer (ITS) data analysis. Although it is a very comprehensive and powerful tool, it lacks a method to provide publication-ready taxonomic pie charts. The script plot_taxa_summary.py bundled with QIIME generates an HTML file and a folder containing the taxonomic pie charts and legends as separate images. The images have randomly generated alphanumeric names. Therefore, it is difficult to associate each pie chart with its legend and the corresponding sample identifier. Even if the option to have the legend within the HTML file is selected while executing plot_taxa_summary.py, it is very tedious to crop a complete image (having both the pie chart and the legend) due to unequal image sizes. It requires a lot of time to manually prepare the pie charts for multiple samples for publication purposes. Moreover, there is a chance of error when identifying a pie chart and legend pair due to the random alphanumeric names of the images. To bypass all these bottlenecks and make the process efficient, we have developed a Python-based program, prepare_taxa_charts.py, to automate the renaming, cropping and merging of each taxonomic pie chart and corresponding legend image into a single, good-quality, publication-ready image. This program not only augments the functionality of plot_taxa_summary.py but is also very fast in terms of CPU time and user-friendly.

  9. Data Structure Analysis to Represent Basic Models of Finite State Automation

    Directory of Open Access Journals (Sweden)

    V. V. Gurenko

    2015-01-01

    Full Text Available Complex system engineering based on automaton models requires a reasoned selection of the data structures used to implement them. The problem of automaton representation and of selecting the data structure for it has been understudied. An arbitrary choice of data structure for the software implementation of an automaton model leads to unnecessary computational burden and reduces the efficiency of the developed system. This article proposes an approach to the reasoned selection of data structures to represent the basic models of finite state automata and gives practical considerations based on it. Static and dynamic data structures are proposed for the three main ways of specifying Mealy and Moore automata: a transition table, a coupling matrix and a transition graph. A three-dimensional array, a rectangular matrix and a matrix of lists are the static structures. The dynamic structures are list-oriented: two-level and three-level Iliffe vectors and a multi-linked list. These structures can store all the required information about the components of a finite state automaton model: the cardinalities of its characteristic sets and the data of its transition and output functions. A criterion system is proposed for the comparative evaluation of data structures with respect to the algorithmic features of automata-theory problems. The criteria focus on the space and time computational complexity of operations performed in tasks such as equivalent automaton conversions, proofs of automaton equivalence and isomorphism, and automaton minimization. A comparative analysis based on this criterion system was carried out for both static and dynamic structures. The analysis showed the advantages of the three-dimensional array, the rectangular matrix and the two-level Iliffe vector, i.e. the structures that specify an automaton by its transition table. For these structures an experiment was conducted to measure the execution time of the automaton operations included in the criterion system. The analysis of the experimental results showed that a dynamic structure - two
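For concreteness, the transition-table assignment of a Mealy automaton can be held either as a static nested array indexed by state and input symbol, or as a sparse dict-of-dicts; the three-state machine below is a hypothetical example, not one from the article:

```python
# Static structure: TABLE[state][symbol_index] -> (next_state, output).
ALPHABET = {"a": 0, "b": 1}
TABLE = [
    [(1, "x"), (0, "y")],   # state 0
    [(2, "y"), (0, "x")],   # state 1
    [(2, "x"), (1, "y")],   # state 2
]

# Equivalent dynamic (sparse) structure: dict of dicts keyed by symbol,
# convenient when the transition function is partial.
SPARSE = {
    s: {sym: TABLE[s][i] for sym, i in ALPHABET.items()}
    for s in range(len(TABLE))
}

def run_mealy(word, start=0):
    """Run the Mealy automaton on a word; return (final state, output)."""
    state, out = start, []
    for ch in word:
        state, sym = TABLE[state][ALPHABET[ch]]
        out.append(sym)
    return state, "".join(out)
```

The static table gives O(1) transitions at a fixed memory cost over the full state-symbol grid, while the dict form trades constant factors for flexibility, mirroring the static/dynamic trade-off the article evaluates.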

  10. Nano Structured Devices for Energy Generation

    DEFF Research Database (Denmark)

    Radziwon, Michal Jędrzej

    This work focuses on the enhancement of α-sexithiophene / buckminsterfullerene (α-6T / C60) inverted bilayer organic solar cell efficiency by the introduction of crystalline nanostructures in the electron donor layer. In order to utilize the charge carrier mobility anisotropy in crystalline α-6T structures in solar cells, the orientation of the individual molecules should favor charge transport perpendicular to the substrate plane. Such orientation is realized from α-6T molecules lying on the substrate, which additionally infers the preferred orientation of the transition dipole for maximal light absorption. The deposition temperatures and a shutter were controlled by the supervisory control and data acquisition (SCADA) system, which was implemented in the LabVIEW environment; the temperatures, process pressure and deposition rate were stored for future analysis. The structures were varied through the substrate temperature during the deposition of α-6T.

  11. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction

    OpenAIRE

    2013-01-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods, implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of the evaluated methods.

  12. Automated Music Video Generation Using Multi-level Feature-based Segmentation

    Science.gov (United States)

    Yoon, Jong-Chul; Lee, In-Kwon; Byun, Siwoo

    The expansion of the home video market has created a requirement for video editing tools to allow ordinary people to assemble videos from short clips. However, professional skills are still necessary to create a music video, which requires a stream to be synchronized with pre-composed music. Because the music and the video are pre-generated in separate environments, even a professional producer usually requires a number of trials to obtain a satisfactory synchronization, which is something that most amateurs are unable to achieve.

  13. Human/autonomy collaboration for the automated generation of intelligence products

    Science.gov (United States)

    DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert

    2017-05-01

    Intelligence Analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables analysts to provide only a constrained English specification of what the intelligence product should be. Using HTN planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. Therefore, the analyst can quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.

  14. Software package to automate the design and production of translucent building structures made of pvc

    Directory of Open Access Journals (Sweden)

    Petrova Irina Yur’evna

    2016-08-01

    Full Text Available The article describes the features of the design and production of translucent building structures made of PVC. The automation systems for this process currently on the market are analysed, and their advantages and disadvantages are identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent PVC building structures is formulated, together with the basic entities involved in those business processes. The necessary functions are specified for the main application and for the dealers' application. The main application is based on the technological platform 1C:Enterprise 8.2. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these products have client versions that are free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The implementation of the developed software complex is described and the relevant charts are given. The deployment scheme and the protocols of data exchange between the 1C server, the 1C client and a dealer are presented, along with the functions supported by the 1C module and the .NET module. The article describes the class library developed for the .NET module and specifies how the two applications are integrated into a single software package. The organisation of the GUI is described, with corresponding screenshots. Possible directions for further development of the software complex are presented, and a conclusion is drawn about its competitiveness and the expediency of further research.

  15. ProDaMa: an open source Python library to generate protein structure datasets

    Directory of Open Access Journals (Sweden)

    Manconi Andrea

    2009-10-01

    Full Text Available Abstract Background The huge difference between the number of known sequences and known tertiary structures has justified the use of automated methods for protein analysis. Although a general methodology to solve these problems has not yet been devised, researchers are engaged in developing more accurate techniques and algorithms whose training plays a relevant role in determining their performance. From this perspective, particular importance is given to the training data used in experiments, and researchers are often engaged in the generation of specialized datasets that meet their requirements. Findings To facilitate the task of generating specialized datasets we devised and implemented ProDaMa, an open source Python library that provides classes for retrieving, organizing, updating, analyzing, and filtering protein data. Conclusion ProDaMa has been used to generate specialized datasets useful for secondary structure prediction and to develop a collaborative web application aimed at generating and sharing protein structure datasets. The library, the related database, and the documentation are freely available at the URL http://iasc.diee.unica.it/prodama.

  16. A structural study of cyanotrichite from Dachang by conventional and automated electron diffraction

    Science.gov (United States)

    Ventruti, Gennaro; Mugnaioli, Enrico; Capitani, Giancarlo; Scordari, Fernando; Pinto, Daniela; Lausi, Andrea

    2015-09-01

    The crystal structure of cyanotrichite, with general formula Cu4Al2(SO4)(OH)12·2H2O, from the Dachang deposit (China) was studied by means of conventional transmission electron microscopy, automated electron diffraction tomography (ADT) and synchrotron X-ray powder diffraction (XRPD). ADT revealed the presence of two different cyanotrichite-like phases. The same phases were also recognized in the XRPD pattern, allowing all peaks to be indexed and leading, after refinement, to the following cell parameters: (1) a = 12.417(2) Å, b = 2.907(1) Å, c = 10.157(1) Å and β = 98.12(1)°; (2) a = 12.660(2) Å, b = 2.897(1) Å, c = 10.162(1) Å and β = 92.42(1)°. Only for the former phase, labelled cyanotrichite-98, was a partial structure, corresponding to the [Cu4Al2(OH)12]2+ cluster, obtained ab initio by direct methods in space group C2/m on the basis of the electron diffraction data. Geometric and charge-balance considerations allowed the complete structure model of the cyanotrichite-98 phase to be reached. The sulfate group and the water molecule are found to be statistically disordered over two possible positions, while keeping the average structure consistent with the C-centring symmetry, in agreement with the ADT results.

  17. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    Science.gov (United States)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe, and a transducer to measure the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations σ(z) = σ_c0 + Σ_{n=1..N} [σ_n(z) + a_n·z + b_n·z²] and dσ/dz = Σ_{n=1..N} [dσ_n(z)/dz + Fr_n(z)], where σ_c0 and σ_n are the plastic deformation stresses for the surface and the nth soil structure (e.g. soil crust, layer, horizon or void) respectively, and Fr_n(z)·dz is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance dz through the nth layer. Both σ_n(z) and Fr_n(z) are related to soil structure; they determine the form of σ(z) measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the
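One possible reading of the layered model, assumed here for illustration, is that within the nth structure the resistance is the surface term plus that structure's plastic deformation stress with linear and quadratic depth terms measured from its top, and that a pore is a layer whose coefficients are all zero. All layer data below are invented:

```python
def penetration_resistance(z, layers, sigma_c0=0.05):
    """Evaluate a layered PR profile at depth z (metres, MPa).
    layers: list of (top_depth, sigma_n, a_n, b_n), sorted by top_depth.
    Only the structure containing z contributes (an assumed simplification
    of the model); a pore returns the surface term sigma_c0 alone."""
    current = None
    for top, s, a, b in layers:
        if z >= top:
            current = (top, s, a, b)
    if current is None:
        return sigma_c0
    top, s, a, b = current
    dz = z - top
    return sigma_c0 + s + a * dz + b * dz * dz

layers = [
    (0.0, 0.4, 2.0, 0.0),   # biological crust: stiff, linear hardening
    (2.0, 0.0, 0.0, 0.0),   # pore/void: zero deformation stress
    (2.5, 0.2, 0.5, 0.1),   # underlying loose layer
]
```

Scanning such a profile in depth steps reproduces the qualitative signature the model describes: a drop to σ_c0 flags a pore, while a jump in the offset or slope flags a new layer.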

  18. On random field Completely Automated Public Turing Test to Tell Computers and Humans Apart generation.

    Science.gov (United States)

    Kouritzin, Michael A; Newton, Fraser; Wu, Biao

    2013-04-01

    Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
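As a toy stand-in for this construction, the sketch below embeds a per-site target marginal (the "word") and a neighbour-coupling term (a crude proxy for the site-to-site covariances) into the conditional probabilities of a Gibbs sweep over a binary field; all parameters and the 7x7 letter mask are invented for illustration:

```python
import math
import random

def gibbs_sweep(field, target, beta=0.3, gamma=2.0, rng=random):
    """One Gibbs sweep over a +/-1 field. Conditional log-odds mix
    neighbour agreement (beta) and a per-site target marginal (gamma);
    a toy version of embedding marginals/covariances into conditionals."""
    h, w = len(field), len(field[0])
    for i in range(h):
        for j in range(w):
            nb = sum(field[x][y]
                     for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                     if 0 <= x < h and 0 <= y < w)
            logit = beta * nb + gamma * target[i][j]
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * logit))
            field[i][j] = 1 if rng.random() < p_plus else -1

# Target "word": a crude letter T on a 7x7 grid (+1 = ink, -1 = blank).
target = [[1 if (i == 0 or j == 3) else -1 for j in range(7)]
          for i in range(7)]
rng = random.Random(0)
field = [[rng.choice((-1, 1)) for _ in range(7)] for _ in range(7)]
for _ in range(50):
    gibbs_sweep(field, target, rng=rng)
match = sum(f == t for fr, tr in zip(field, target)
            for f, t in zip(fr, tr)) / 49
```

After repeated sweeps the field becomes a noisy but recognizable realisation of the target; the residual per-site randomness is what provides resistance to character recognition in the paper's scheme.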

  19. CFSAN SNP Pipeline: an automated method for constructing SNP matrices from next-generation sequence data

    Directory of Open Access Journals (Sweden)

    Steve Davis

    2015-08-01

    Full Text Available The analysis of next-generation sequence (NGS) data is often a fragmented step-wise process. For example, multiple pieces of software are typically needed to map NGS reads, extract variant sites, and construct a DNA sequence matrix containing only single nucleotide polymorphisms (i.e., a SNP matrix) for a set of individuals. The management and chaining of these software pieces and their outputs can often be a cumbersome and difficult task. Here, we present CFSAN SNP Pipeline, which combines into a single package the mapping of NGS reads to a reference genome with Bowtie2, the processing of the resulting mapping (BAM) files using SAMtools, the identification of variant sites using VarScan, and the production of a SNP matrix using custom Python scripts. We also introduce a Python package (CFSAN SNP Mutator) that, when given a reference genome, will generate variants of known position against which we validate our pipeline. We created 1,000 simulated Salmonella enterica subsp. enterica serovar Agona genomes at 100× and 20× coverage, each containing 500 SNPs, 20 single-base insertions and 20 single-base deletions. For the 100× dataset, the CFSAN SNP Pipeline recovered 98.9% of the introduced SNPs and had a false positive rate of 1.04 × 10⁻⁶; for the 20× dataset 98.8% of SNPs were recovered and the false positive rate was 8.34 × 10⁻⁷. Based on these results, CFSAN SNP Pipeline is a robust and accurate tool that is among the first to combine into a single executable the myriad steps required to produce a SNP matrix from NGS data. Such a tool is useful to those working in an applied setting (e.g., food safety traceback investigations) as well as for those interested in evolutionary questions.
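Conceptually, the final matrix-building step reduces to merging per-sample variant calls over the union of variant positions and filling non-variant samples from the reference. The function and all data below are invented for illustration; in the pipeline itself the calls come from the Bowtie2/SAMtools/VarScan stages:

```python
def build_snp_matrix(reference, calls):
    """Merge per-sample SNP calls into aligned sequences over the union
    of variant positions (simplified sketch of the matrix step)."""
    positions = sorted({p for sample_calls in calls.values()
                        for p in sample_calls})
    matrix = {sample: "".join(sample_calls.get(p, reference[p])
                              for p in positions)
              for sample, sample_calls in calls.items()}
    return positions, matrix

reference = {10: "A", 25: "C", 40: "G"}   # position -> reference base
calls = {                                  # sample -> {position: alt base}
    "sample1": {10: "T"},
    "sample2": {25: "T", 40: "A"},
    "sample3": {},
}
positions, matrix = build_snp_matrix(reference, calls)
```

Only positions variant in at least one sample are kept, so the resulting strings are directly usable as a SNP alignment for phylogenetic analysis.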

  20. PASS2: an automated database of protein alignments organised as structural superfamilies

    Directory of Open Access Journals (Sweden)

    Sowdhamini Ramanathan

    2004-04-01

    Full Text Available Abstract Background The functional selection and three-dimensional structural constraints of proteins in nature often relate to the retention of significant structural similarity between proteins of similar fold and function despite poor sequence identity. Organization of structure-based sequence alignments for distantly related proteins provides a map of the conserved and critical regions of the protein universe that is useful for the analysis of folding principles, for the evolutionary unification of protein families and for maximizing the information return from experimental structure determination. The Protein Alignment organised as Structural Superfamily (PASS2) database represents continuously updated structural alignments for evolutionarily related, sequentially distant proteins. Description The automated and updated version of PASS2 is in direct correspondence with SCOP 1.63 and consists of sequences having less than 40% identity among themselves. Protein domains have been grouped into 628 multi-member superfamilies and 566 single-member superfamilies. Structure-based sequence alignments for the superfamilies have been obtained using COMPARER, while initial equivalences have been derived from a preliminary superposition using LSQMAN or STAMP 4.0. The final sequence alignments have been annotated for structural features using JOY4.0. The database is supplemented with sequence relatives belonging to different genomes, conserved spatially interacting and structural motifs, probabilistic hidden Markov models of superfamilies based on the alignments, and useful links to other databases. Probabilistic models and sensitive position-specific profiles obtained from reliable superfamily alignments aid annotation of remote homologues and are useful tools in structural and functional genomics. PASS2 presents the phylogeny of its members based on both sequence and structural dissimilarities.
Clustering of members allows us to understand diversification of

  1. Structured adaptive grid generation using algebraic methods

    Science.gov (United States)

    Yang, Jiann-Cherng; Soni, Bharat K.; Roger, R. P.; Chan, Stephen C.

    1993-01-01

    The accuracy of a numerical algorithm depends not only on the formal order of approximation but also on the distribution of grid points in the computational domain. Grid adaptation is a procedure which allows optimal grid redistribution as the solution progresses. It offers the prospect of accurate flow field simulations without the use of an excessively fine, computationally expensive grid. Grid adaptive schemes are divided into two basic categories: differential and algebraic. The differential method is based on a variational approach where a function which contains a measure of grid smoothness, orthogonality and volume variation is minimized by using a variational principle. This approach provides a solid mathematical basis for the adaptive method, but the Euler-Lagrange equations must be solved in addition to the original governing equations. On the other hand, the algebraic method requires much less computational effort, but the grid may not be smooth. The algebraic techniques are based on devising an algorithm where the grid movement is governed by estimates of the local error in the numerical solution. This is achieved by requiring points in large-error regions to attract other points and points in low-error regions to repel other points. The development of a fast, efficient, and robust algebraic adaptive algorithm for structured flow simulation applications is presented. This development is accomplished in a three-step process. The first step is to define an adaptive weighting mesh (distribution mesh) on the basis of the equidistribution law applied to the flow field solution. The second, and probably the most crucial, step is to redistribute grid points in the computational domain according to the aforementioned weighting mesh. The third and last step is to reevaluate the flow property by an appropriate search/interpolate scheme at the new grid locations.
The adaptive weighting mesh provides the information on the desired concentration
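    The redistribution step based on the equidistribution law can be sketched in one dimension: given a positive weight per interval (e.g. derived from solution gradients), new grid points are placed so each new interval carries an equal share of the cumulative weight. The inputs below are hypothetical and not the paper's data structures.

    ```python
    import bisect

    def equidistribute(x, w):
        """Redistribute 1-D grid points so that the (positive) weight w,
        given per interval of x, is equally distributed over each new
        interval. Minimal sketch of the algebraic adaptation step."""
        n = len(x)
        # cumulative weight: the "distribution mesh"
        W = [0.0]
        for i in range(n - 1):
            W.append(W[-1] + w[i] * (x[i + 1] - x[i]))
        total = W[-1]
        new_x = [x[0]]
        for k in range(1, n - 1):
            target = total * k / (n - 1)
            # locate the interval containing the target cumulative weight
            j = max(0, min(bisect.bisect_left(W, target) - 1, n - 2))
            frac = (target - W[j]) / (W[j + 1] - W[j])
            new_x.append(x[j] + frac * (x[j + 1] - x[j]))
        new_x.append(x[-1])
        return new_x
    ```

    With weights concentrated in the right half of the domain, the new points cluster there while the endpoints stay fixed.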

  2. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  3. Automated generation of node-splitting models for assessment of inconsistency in network meta-analysis.

    Science.gov (United States)

    van Valkenhoef, Gert; Dias, Sofia; Ades, A E; Welton, Nicky J

    2016-03-01

    Network meta-analysis enables the simultaneous synthesis of a network of clinical trials comparing any number of treatments. Potential inconsistencies between estimates of relative treatment effects are an important concern, and several methods to detect inconsistency have been proposed. This paper is concerned with the node-splitting approach, which is particularly attractive because of its straightforward interpretation, contrasting estimates from both direct and indirect evidence. However, node-splitting analyses are labour-intensive because each comparison of interest requires a separate model. It would be advantageous if node-splitting models could be estimated automatically for all comparisons of interest. We present an unambiguous decision rule to choose which comparisons to split, and prove that it selects only comparisons in potentially inconsistent loops in the network, and that all potentially inconsistent loops in the network are investigated. Moreover, the decision rule circumvents problems with the parameterisation of multi-arm trials, ensuring that model generation is trivial in all cases. Thus, our methods eliminate most of the manual work involved in using the node-splitting approach, enabling the analyst to focus on interpreting the results.
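    The core of the decision rule is that a comparison is only worth splitting if it lies on a loop, i.e. if independent indirect evidence remains after removing the direct comparison. A minimal sketch (not the authors' exact rule, which additionally handles multi-arm trial parameterisation; `trials` is a hypothetical list of treatment tuples):

    ```python
    from collections import defaultdict

    def splittable_comparisons(trials):
        """Return treatment comparisons with both direct and independent
        indirect evidence, i.e. those lying on a loop of the network."""
        # direct comparisons: every treatment pair co-occurring in a trial
        edges = set()
        for arms in trials:
            for i, a in enumerate(arms):
                for b in arms[i + 1:]:
                    edges.add(frozenset((a, b)))

        def connected(src, dst, banned):
            # depth-first search over comparisons, ignoring the direct edge
            adj = defaultdict(set)
            for e in edges:
                if e == banned:
                    continue
                u, v = tuple(e)
                adj[u].add(v)
                adj[v].add(u)
            seen, stack = {src}, [src]
            while stack:
                u = stack.pop()
                if u == dst:
                    return True
                for v in adj[u] - seen:
                    seen.add(v)
                    stack.append(v)
            return False

        return {tuple(sorted(e)) for e in edges
                if connected(*sorted(e), banned=e)}
    ```

    For a triangle A-B-C plus a pendant edge C-D, only the three triangle comparisons are selected; C-D has no indirect evidence and is left alone.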

  4. Automated metric characterization of urban structure using building decomposition from very high resolution imagery

    Science.gov (United States)

    Heinzel, Johannes; Kemper, Thomas

    2015-03-01

    Classification approaches for urban areas are mostly of a qualitative and semantic nature. They produce interpreted classes similar to those from land cover and land use classifications. As a complement to those classes, quantitative measures directly derived from the image could lead to a metric characterization of the urban area. While these metrics lack qualitative interpretation, they are able to provide an objective measure of the urban structures. Such quantitative measures are especially important in rapidly growing cities since, besides growth in area, they can provide structural information for specific areas and detect changes. Rustenburg, which serves as the test area for the present study, is amongst the fastest growing cities in South Africa. It reveals a heterogeneous face of housing and building structures reflecting social and/or economic differences, often linked to the spatial distribution of industrial and local mining sites. Up-to-date coverage with aerial photographs is provided by aerial surveys at regular intervals. Recent satellite systems also provide imagery with suitable resolution. Using such a set of very high resolution images, a fully automated algorithm has been developed which outputs metric classes by systematically combining important measures of building structure. The measurements are gained by decomposition of buildings directly from the imagery and by using methods from mathematical morphology. The decomposed building objects serve as the basis for the computation of grid statistics. Finally, a systematic combination of the single features leads to combined metric classes. For the dominant urban structures, verification results indicate an overall accuracy of at least 80% at the single feature level and 70% for the combined classes.
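    The mathematical-morphology step can be illustrated with a toy binary opening (erosion followed by dilation), which keeps the dominant compact parts of a footprint and removes thin protrusions and noise. The input format (strings of '0'/'1') and the 3x3 structuring element are illustrative choices, not the paper's actual operators.

    ```python
    def opening(mask):
        """Morphological opening of a binary raster with a 3x3 square
        structuring element: a toy version of building decomposition."""
        rows, cols = len(mask), len(mask[0])
        grid = [[c == "1" for c in row] for row in mask]

        def on(g, r, c):
            return 0 <= r < rows and 0 <= c < cols and g[r][c]

        def erode(g):
            # a pixel survives only if its whole 3x3 neighbourhood is set
            return [[all(on(g, r + dr, c + dc)
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1))
                     for c in range(cols)] for r in range(rows)]

        def dilate(g):
            # a pixel turns on if any neighbour in the 3x3 window is set
            return [[any(on(g, r + dr, c + dc)
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1))
                     for c in range(cols)] for r in range(rows)]

        opened = dilate(erode(grid))
        return ["".join("1" if v else "0" for v in row) for row in opened]
    ```

    A 3x3 building block survives the opening unchanged, while an isolated single pixel is removed.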

  5. Identifying relevant biomarkers of brain injury from structural MRI: Validation using automated approaches in children with unilateral cerebral palsy.

    Science.gov (United States)

    Pagnozzi, Alex M; Dowson, Nicholas; Doecke, James; Fiori, Simona; Bradley, Andrew P; Boyd, Roslyn N; Rose, Stephen

    2017-01-01

    Previous studies have proposed that the early elucidation of brain injury from structural Magnetic Resonance Images (sMRI) is critical for the clinical assessment of children with cerebral palsy (CP). Although distinct aetiologies, including cortical maldevelopments, white and grey matter lesions and ventricular enlargement, have been categorised, these injuries are commonly only assessed in a qualitative fashion. As a result, sMRI remains relatively underexploited for clinical assessments, despite its widespread use. In this study, several automated and validated techniques to automatically quantify these three classes of injury were generated in a large cohort of children (n = 139) aged 5-17, including 95 children diagnosed with unilateral CP. Using a feature selection approach on a training data set (n = 97) to find severity of injury biomarkers predictive of clinical function (motor, cognitive, communicative and visual function), cortical shape and regional lesion burden were most often chosen associated with clinical function. Validating the best models on the unseen test data (n = 42), correlation values ranged between 0.545 and 0.795 (p<0.008), indicating significant associations with clinical function. The measured prevalence of injury, including ventricular enlargement (70%), white and grey matter lesions (55%) and cortical malformations (30%), were similar to the prevalence observed in other cohorts of children with unilateral CP. These findings support the early characterisation of injury from sMRI into previously defined aetiologies as part of standard clinical assessment. Furthermore, the strong and significant association between quantifications of injury observed on structural MRI and multiple clinical scores accord with empirically established structure-function relationships.

  6. Automated pattern recognition to support geological mapping and exploration target generation - A case study from southern Namibia

    Science.gov (United States)

    Eberle, Detlef; Hutchins, David; Das, Sonali; Majumdar, Anandamayee; Paasche, Hendrik

    2015-06-01

    to the result obtained from unsupervised fuzzy clustering. Furthermore, a comparison of the a posteriori probability of class assignment with the trustworthiness values provided by fuzzy clustering also indicates only slight differences. These observed differences can be explained by the exponential class probability term, which tends to deliver either fairly high or low probability values. The methodology and results presented here demonstrate that automated, objective pattern recognition can contribute substantially to geological mapping of large study areas and to mineral exploration target generation. This methodology is considered well suited to a number of African countries whose large territories have recently been covered by high resolution airborne geophysical data, but where existing geological mapping is poor, incomplete or outdated.

  7. Mutation Sampling Technique for the Generation of Structural Test Data

    CERN Document Server

    Scholive, M; Robach, C; Flottes, M L; Rouzeyre, B

    2011-01-01

    Our goal is to produce validation data that can be used as an efficient (pre) test set for structural stuck-at faults. In this paper, we detail an original test-oriented mutation sampling technique used for generating such data and we present a first evaluation on these validation data with regard to a structural test.

  8. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction.

    Science.gov (United States)

    Puton, Tomasz; Kozlowski, Lukasz P; Rother, Kristian M; Bujnicki, Janusz M

    2013-04-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods, implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in new editions of the CompaRNA benchmarks.
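    A benchmark of this kind ultimately scores each prediction against a reference structure, typically via the sensitivity and positive predictive value (PPV) of the predicted base pairs. The sketch below computes these from dot-bracket strings; the server's exact aggregation may differ.

    ```python
    def pair_metrics(reference, predicted):
        """Sensitivity and PPV of predicted base pairs versus a
        reference secondary structure, both in dot-bracket notation."""
        def pairs(db):
            stack, out = [], set()
            for i, ch in enumerate(db):
                if ch == "(":
                    stack.append(i)
                elif ch == ")":
                    out.add((stack.pop(), i))
            return out

        ref, pred = pairs(reference), pairs(predicted)
        tp = len(ref & pred)                 # correctly predicted pairs
        sensitivity = tp / len(ref) if ref else 1.0
        ppv = tp / len(pred) if pred else 1.0
        return sensitivity, ppv
    ```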

  9. Material specific effects and limitations during ps-laser generation of micro structures

    Science.gov (United States)

    Hildenhagen, J.; Engelhardt, U.; Smarra, M.; Dickmann, K.

    2012-01-01

    The use of picosecond lasers for microstructuring, especially in combination with scanner optics, leads to undesired effects at increasing ablation depths. The cavity edges slope at angles between 50° and 85°, depending on the material. With highly reflective substrates, ditches of up to 20% of the total depth can form at the bottom of the structure. In certain materials, diverse substructures such as holes, channels, or grooves can also develop. These can partially degrade the precision of the ablation geometry. A systematic study of the specific ablation characteristics is needed to achieve a defined structure depth. Given the large number of influential parameters, automating such measurements is worthwhile. For a study of eight different materials (high-alloy steels, copper, titanium, aluminum, PMMA, Al2O3 ceramics, silicon and fused quartz), an industrial ps-laser coupled with a chromatic sensor for distance measurement was used. Hence a direct acquisition of the generated structures as well as an automatic evaluation of the parameters is possible. Furthermore, online quality control and local post-processing can be implemented. In this way the generation of complex structures with higher precision is possible.

  10. Random generation of RNA secondary structures according to native distributions

    Directory of Open Access Journals (Sweden)

    Nebel Markus E

    2011-10-01

    Full Text Available Abstract Background Random biological sequences are a topic of great interest in genome analysis since, according to a powerful paradigm, they represent the background noise from which the actual biological information must differentiate. Accordingly, the generation of random sequences has been investigated for a long time. Similarly, random objects of a more complicated structure like RNA molecules or proteins are of interest. Results In this article, we present a new general framework for deriving algorithms for the non-uniform random generation of combinatorial objects according to the encoding and probability distribution implied by a stochastic context-free grammar. Briefly, the framework extends the well-known recursive method for (uniform) random generation and uses the popular framework of admissible specifications of combinatorial classes, introducing weighted combinatorial classes to allow for non-uniform generation by means of unranking. This framework is used to derive an algorithm for the generation of RNA secondary structures of a given fixed size. We address the random generation of these structures according to a realistic distribution obtained from real-life data by using a very detailed context-free grammar (that models the class of RNA secondary structures by distinguishing between all known motifs in RNA structure). Compared to well-known sampling approaches used in several structure prediction tools (such as SFold), ours has two major advantages: Firstly, after a preprocessing step in time O(n²) for the computation of all weighted class sizes needed, with our approach a set of m random secondary structures of a given structure size n can be computed in worst-case time complexity O(m·n·log(n)) while other algorithms typically have a runtime in O(m·n²). Secondly, our approach works with integer arithmetic only, which is faster and saves us from the discomforting details of floating point arithmetic.
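    The basic idea of non-uniform generation from a stochastic context-free grammar can be sketched with a toy grammar S → (S)S | .S | ε, expanded by sampling productions according to their probabilities. This is only an illustration of SCFG sampling; the paper's method uses weighted unranking over a far more detailed grammar and guarantees a fixed structure size, which this sketch does not.

    ```python
    import random

    def sample_structure(p_pair=0.3, p_unpaired=0.4, max_len=60, rng=None):
        """Sample a dot-bracket string from the toy SCFG
        S -> (S)S | .S | eps, capped at max_len characters."""
        rng = rng or random.Random()
        out = []

        def expand_S(budget):
            # returns the number of characters emitted for this S
            if budget <= 0:
                return 0
            r = rng.random()
            if r < p_pair and budget >= 2:
                out.append("(")
                used = 2 + expand_S(budget - 2)   # enclosed substructure
                out.append(")")
                used += expand_S(budget - used)   # trailing substructure
                return used
            if r < p_pair + p_unpaired:
                out.append(".")
                return 1 + expand_S(budget - 1)
            return 0                              # epsilon production

        expand_S(max_len)
        return "".join(out)
    ```

    By construction every "(" receives a matching ")", so any sample is a well-formed secondary structure.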

  11. Automated pattern recognition to support geological mapping and exploration target generation: a case study from southern Namibia

    CSIR Research Space (South Africa)

    Eberle, D

    2015-06-01

    Full Text Available This paper demonstrates a methodology for the automatic joint interpretation of high resolution airborne geophysical and space-borne remote sensing data to support geological mapping in a largely automated, fast and objective manner. At the request...

  12. TopoGen: A Network Topology Generation Architecture with application to automating simulations of Software Defined Networks

    CERN Document Server

    Laurito, Andres; The ATLAS collaboration

    2017-01-01

    Simulation is an important tool to validate the performance impact of control decisions in Software Defined Networks (SDN). Yet, the manual modeling of complex topologies that may change often during a design process can be a tedious, error-prone task. We present TopoGen, a general purpose architecture and tool for systematic translation and generation of network topologies. TopoGen can be used to generate network simulation models automatically by querying information available at diverse sources, notably SDN controllers. The DEVS modeling and simulation framework facilitates a systematic translation of structured knowledge about a network topology into a formal modular and hierarchical coupling of preexisting or new models of network entities (physical or logical). TopoGen can be flexibly extended with new parsers and generators to grow its scope of applicability. This makes it possible to design arbitrary workflows of topology transformations. We tested TopoGen in a network engineering project for the ATLAS detector ...
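    The systematic-translation idea can be sketched as a pure function from a controller's topology listing to a simple simulation model: one entity per node, one coupling per link. The input dictionary format is hypothetical; TopoGen itself targets DEVS models and real controller APIs.

    ```python
    def build_topology(controller_view):
        """Translate a (hypothetical) SDN controller topology listing
        into a flat simulation model of entities and couplings."""
        entities = {n["id"]: {"kind": n.get("kind", "switch"), "ports": []}
                    for n in controller_view["nodes"]}
        couplings = []
        for link in controller_view["links"]:
            src, dst = link["src"], link["dst"]
            port = len(entities[src]["ports"])   # next free output port
            entities[src]["ports"].append(dst)
            couplings.append((src, port, dst))
        return {"entities": entities, "couplings": couplings}
    ```

    Because the translation is deterministic, regenerating the model after a topology change is as cheap as re-querying the controller.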

  13. Automated Structure-Activity Relationship Mining: Connecting Chemical Structure to Biological Profiles.

    Science.gov (United States)

    Wawer, Mathias J; Jaramillo, David E; Dančík, Vlado; Fass, Daniel M; Haggarty, Stephen J; Shamji, Alykhan F; Wagner, Bridget K; Schreiber, Stuart L; Clemons, Paul A

    2014-06-01

    Understanding the structure-activity relationships (SARs) of small molecules is important for developing probes and novel therapeutic agents in chemical biology and drug discovery. Increasingly, multiplexed small-molecule profiling assays allow simultaneous measurement of many biological response parameters for the same compound (e.g., expression levels for many genes or binding constants against many proteins). Although such methods promise to capture SARs with high granularity, few computational methods are available to support SAR analyses of high-dimensional compound activity profiles. Many of these methods are not generally applicable or reduce the activity space to scalar summary statistics before establishing SARs. In this article, we present a versatile computational method that automatically extracts interpretable SAR rules from high-dimensional profiling data. The rules connect chemical structural features of compounds to patterns in their biological activity profiles. We applied our method to data from novel cell-based gene-expression and imaging assays collected on more than 30,000 small molecules. Based on the rules identified for this data set, we prioritized groups of compounds for further study, including a novel set of putative histone deacetylase inhibitors.
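    A deliberately simplified stand-in for the rule-mining idea: for each structural feature flag, compare the mean activity profiles of compounds with and without the feature, and report assay dimensions whose mean shift is large. The feature flags, profiles, and threshold below are hypothetical, not the paper's actual representation.

    ```python
    from statistics import mean

    def sar_rules(compounds, threshold=1.0):
        """Extract simple structure-activity rules as
        (feature, assay_index, mean_shift) triples."""
        features = {f for c in compounds for f in c["features"]}
        rules = []
        for f in sorted(features):
            with_f = [c["profile"] for c in compounds if f in c["features"]]
            without = [c["profile"] for c in compounds if f not in c["features"]]
            if not with_f or not without:
                continue
            for k in range(len(with_f[0])):
                shift = mean(p[k] for p in with_f) - mean(p[k] for p in without)
                if abs(shift) >= threshold:
                    rules.append((f, k, round(shift, 3)))
        return rules
    ```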

  14. Absorption-reduced waveguide structure for efficient terahertz generation

    Energy Technology Data Exchange (ETDEWEB)

    Pálfalvi, L., E-mail: palfalvi@fizika.ttk.pte.hu [Institute of Physics, University of Pécs, Ifjúság ú. 6, 7624 Pécs (Hungary); Fülöp, J. A. [MTA-PTE High-Field Terahertz Research Group, Ifjúság ú. 6, 7624 Pécs (Hungary); Szentágothai Research Centre, University of Pécs, Ifjúság ú. 20, 7624 Pécs (Hungary); Hebling, J. [Institute of Physics, University of Pécs, Ifjúság ú. 6, 7624 Pécs (Hungary); MTA-PTE High-Field Terahertz Research Group, Ifjúság ú. 6, 7624 Pécs (Hungary); Szentágothai Research Centre, University of Pécs, Ifjúság ú. 20, 7624 Pécs (Hungary)

    2015-12-07

    An absorption-reduced planar waveguide structure is proposed for increasing the efficiency of terahertz (THz) pulse generation by optical rectification of femtosecond laser pulses with tilted-pulse-front in highly nonlinear materials with large absorption coefficient. The structure functions as waveguide both for the optical pump and the generated THz radiation. Most of the THz power propagates inside the cladding with low THz absorption, thereby reducing losses and leading to the enhancement of the THz generation efficiency by up to more than one order of magnitude, as compared with a bulk medium. Such a source can be suitable for highly efficient THz pulse generation pumped by low-energy (nJ-μJ) pulses at high (MHz) repetition rates delivered by compact fiber lasers.

  15. Automated Synthetic Scene Generation

    Science.gov (United States)

    2014-07-01

    [Abstract garbled in source; recoverable fragment:] For applications such as environmental modeling, navigation, city planning, games and entertainment, and military planning, scenes have the additional requirement to be accurately attributed with visible color.

  16. Automated foveola localization in retinal 3D-OCT images using structural support vector machine prediction.

    Science.gov (United States)

    Liu, Yu-Ying; Ishikawa, Hiroshi; Chen, Mei; Wollstein, Gadi; Schuman, Joel S; Rehg, James M

    2012-01-01

    We develop an automated method to determine the foveola location in macular 3D-OCT images in either healthy or pathological conditions. Structural Support Vector Machine (S-SVM) is trained to directly predict the location of the foveola, such that the score at the ground truth position is higher than that at any other position by a margin scaling with the associated localization loss. This S-SVM formulation directly minimizes the empirical risk of localization error, and makes efficient use of all available training data. It deals with the localization problem in a more principled way compared to the conventional binary classifier learning that uses zero-one loss and random sampling of negative examples. A total of 170 scans were collected for the experiment. Our method localized 95.1% of testing scans within the anatomical area of the foveola. Our experimental results show that the proposed method can effectively identify the location of the foveola, facilitating diagnosis around this important landmark.
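    The margin-rescaled objective described above can be written down directly: the score at the ground-truth position must exceed the score at every other candidate position by the associated localization loss. A sketch of the loss evaluation only (the weight update of S-SVM training is omitted; `scores` and `loss` are hypothetical inputs):

    ```python
    def structured_hinge(scores, truth, loss):
        """Margin-rescaled structured hinge loss for localization:
        max over y != truth of score(y) + loss(y, truth) - score(truth),
        clamped at zero."""
        worst = max(scores[y] + loss(y, truth) for y in scores if y != truth)
        return max(0.0, worst - scores[truth])
    ```

    When the ground-truth score already beats every loss-augmented competitor, the hinge is zero and that example contributes no gradient.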

  17. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges

    Science.gov (United States)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.

    2011-06-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  18. The automation of science.

    Science.gov (United States)

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-03

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge.

  19. A generative, probabilistic model of local protein structure

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Mardia, Kanti V.; Taylor, Charles C.;

    2008-01-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state...

  20. Generating function approach to reliability analysis of structural systems

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The generating function approach is an important tool for performance assessment of multi-state systems. Aiming at strength reliability analysis of structural systems, the generating function approach is introduced and developed. Static reliability models of statically determinate and indeterminate systems, as well as fatigue reliability models, are built by constructing special generating functions, which are used to describe probability distributions of strength (resistance), stress (load) and fatigue life, and by defining composite operators of generating functions and their performance structure functions. When composition operators are executed, computational costs can be substantially reduced by collecting like terms. The results of theoretical analysis and numerical simulation show that the generating function approach can be widely used for probability modeling of large complex systems with hierarchical structures owing to its unified form, compact expression, ease of computer implementation and high generality. Because the new method considers twin loads giving rise to component failure dependency, it can provide a theoretical reference and act as a powerful tool for static and dynamic reliability analysis of civil engineering structures and mechanical equipment systems with multi-mode damage coupling.
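    The composite operator plus collecting-like-terms idea can be sketched with discrete distributions stored as {value: probability} dicts: combining two generating functions under a performance structure function and summing the probabilities of equal results. The distributions below are toy examples, not the paper's models.

    ```python
    from collections import defaultdict

    def compose(gf_a, gf_b, op):
        """Composite operator of two 'generating functions' (discrete
        PMFs as {value: probability} dicts): combine values with the
        performance structure function op and collect like terms."""
        out = defaultdict(float)
        for va, pa in gf_a.items():
            for vb, pb in gf_b.items():
                out[op(va, vb)] += pa * pb   # collecting like terms
        return dict(out)
    ```

    For a series system the structure function is `min` (system strength equals the weakest component); composing the resulting strength distribution with a load distribution under `lambda s, l: l > s` yields the failure probability directly.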

  1. Structural Learning of Attack Vectors for Generating Mutated XSS Attacks

    Directory of Open Access Journals (Sweden)

    Yi-Hsun Wang

    2010-09-01

    Full Text Available Web applications suffer from cross-site scripting (XSS) attacks resulting from incomplete or incorrect input sanitization. Learning the structure of attack vectors could enrich the variety of manifestations in generated XSS attacks. In this study, we focus on generating more threatening XSS attacks for the state-of-the-art detection approaches that can find potential XSS vulnerabilities in Web applications, and propose a mechanism for structural learning of attack vectors with the aim of generating mutated XSS attacks in a fully automatic way. Mutated XSS attack generation depends on the analysis of attack vectors and the structural learning mechanism. For the kernel of the learning mechanism, we use a Hidden Markov model (HMM) as the structure of the attack vector model to capture the implicit manner of the attack vector; this manner benefits from the syntax meanings that are labeled by the proposed tokenizing mechanism. Bayes' theorem is used to determine the number of hidden states in the model for generalizing the structure model. The paper makes the following contributions: (1) it automatically learns the structure of attack vectors from practical data analysis to build a structural model of attack vectors; (2) it mimics the manners and elements of attack vectors to extend the ability of testing tools to identify XSS vulnerabilities; (3) it helps verify the flaws of blacklist sanitization procedures of Web applications. We evaluated the proposed mechanism using Burp Intruder with a dataset collected from public XSS archives. The results show that mutated XSS attack generation can identify potential vulnerabilities.
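    Generation from a learned HMM amounts to walking the hidden states and emitting one token per step. The tiny model below mimics the token structure of a vector like "<img onerror=alert(1)>"; its states, tokens and probabilities are illustrative and not learned from real attack data.

    ```python
    import random

    def sample_hmm(trans, emit, start, steps, rng=None):
        """Generate a token sequence from an HMM given transition and
        emission distributions (dicts of dicts of probabilities)."""
        rng = rng or random.Random()

        def pick(dist):
            r, acc = rng.random(), 0.0
            for item, p in dist.items():
                acc += p
                if r < acc:
                    return item
            return item  # numerical-edge fallback

        state, out = start, []
        for _ in range(steps):
            out.append(pick(emit[state]))   # emit a token
            state = pick(trans[state])      # move to the next state
        return out

    # Toy model: TAG -> EVENT -> PAYLOAD token structure
    trans = {"TAG": {"EVENT": 1.0}, "EVENT": {"PAYLOAD": 1.0},
             "PAYLOAD": {"TAG": 1.0}}
    emit = {"TAG": {"<img": 0.5, "<svg": 0.5},
            "EVENT": {"onerror=": 1.0},
            "PAYLOAD": {"alert(1)>": 1.0}}
    ```

    Varying the emission tables per state is what lets the model mutate the surface form while preserving the learned structure.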

  2. Injecting Structured Data to Generative Topic Model in Enterprise Settings

    Science.gov (United States)

    Xiao, Han; Wang, Xiaojie; Du, Chao

    Enterprises have accumulated both structured and unstructured data steadily as computing resources improve. However, previous research on enterprise data mining often treats these two kinds of data independently and omits their mutual benefits. We explore an approach to incorporate a common type of structured data (i.e. the organigram) into a generative topic model. Our approach, the Partially Observed Topic model (POT), not only considers the unstructured words, but also takes into account the structured information in its generation process. By integrating the structured data implicitly, the mixed topics over a document are partially observed during the Gibbs sampling procedure. This allows POT to learn topics pertinently and directionally, which makes it easy to tune and suitable for end-use applications. We evaluate our proposed new model on a real-world dataset and show improved expressiveness over traditional LDA. In the task of document classification, POT also demonstrates more discriminative power than LDA.

  3. H++ 3.0: automating pK prediction and the preparation of biomolecular structures for atomistic molecular modeling and simulations.

    Science.gov (United States)

    Anandakrishnan, Ramu; Aguilar, Boris; Onufriev, Alexey V

    2012-07-01

    The accuracy of atomistic biomolecular modeling and simulation studies depends on the accuracy of the input structures. Preparing these structures for an atomistic modeling task, such as molecular dynamics (MD) simulation, can involve the use of a variety of different tools for: correcting errors, adding missing atoms, filling valences with hydrogens, predicting pK values for titratable amino acids, assigning predefined partial charges and radii to all atoms, and generating force field parameter/topology files for MD. Identifying, installing and effectively using the appropriate tools for each of these tasks can be difficult for novice users and time-consuming for experienced ones. H++ (http://biophysics.cs.vt.edu/) is a free open-source web server that automates the above key steps in the preparation of biomolecular structures for molecular modeling and simulations. H++ also performs extensive error and consistency checking, providing error/warning messages together with suggested corrections. In addition to numerous minor improvements, the latest version of H++ includes several new capabilities and options: fixing erroneous (flipped) side chain conformations for HIS, GLN and ASN, including a ligand in the input structure, processing nucleic acid structures, and generating a solvent box with a specified number of common ions for explicit solvent MD.

  4. Application of Real-Time Automated Traffic Incident Response Plan Management System: A Web Structure for the Regional Highway Network in China

    Directory of Open Access Journals (Sweden)

    Yongfeng Ma

    2014-01-01

    Full Text Available Traffic incidents, caused by various factors, may lead to heavy traffic delays and reduce the traffic capacity of downstream sections. Traffic incident management (TIM) systems have been developed widely to respond to traffic incidents intelligently and reduce losses. Traffic incident response plans, as an important component of TIM, can effectively guide responders as to what to do and how to do it in traffic incidents. In this paper, a real-time automated traffic incident response plan management system was developed, which can generate and manage traffic incident response plans timely and automatically. A web application structure and a physical structure were designed to implement and expose these functions. A standard framework for data storage was also developed to save information about traffic incidents and generated response plans. Furthermore, a confirmation survey and case-based reasoning (CBR) were introduced to identify traffic incidents and to generate traffic incident response plans automatically, respectively. Twenty-three traffic crash-related incidents were selected and three indicators were used to measure system performance. Results showed that 20 of the 23 cases could be retrieved effectively and accurately. The system is practical for generating traffic incident response plans and has been implemented in China.
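    The CBR retrieval step can be sketched as weighted attribute matching against a stored case base, returning the response plan of the best-matching past incident. The attribute names, weights, and similarity measure below are hypothetical; the deployed system's measure is not specified in the abstract.

    ```python
    def retrieve_plan(case_base, incident, weights):
        """Case-based reasoning retrieval: score stored cases by
        weighted attribute match and return the best case's plan."""
        def similarity(case):
            # booleans coerce to 0/1, so this sums the matched weights
            return sum(w * (case["attrs"].get(attr) == incident.get(attr))
                       for attr, w in weights.items())
        best = max(case_base, key=similarity)
        return best["plan"]
    ```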

  5. A Structural Algorithm for Complex Natural Languages Parse Generation

    Directory of Open Access Journals (Sweden)

    Enikuomehin, A. O.

    2013-06-01

    Full Text Available In artificial intelligence, the study of how humans understand natural languages is cognitively based, and such science is essential to the development of modern embedded robotic systems. Such systems should have the capability to process natural languages and generate meaningful output. Unlike machines, humans can understand a natural-language sentence thanks to an innate facility, which they use to process it. Robotics requires appropriate parse systems to be developed in order to handle language-based operations. In this paper, we present a new method of generating parse structures for complex natural language using algorithmic processes. The paper explores the process of generating meaning via parse structures and improves on existing results using a well-established parsing scheme. The resulting algorithm was implemented in Java, and a natural-language interface for parse generation is presented. The results further show that tokenizing sentences into their respective units affects the parse structure in the first instance and the semantic representation on a larger scale. Efforts were made to limit the rules used in generating the grammar, since natural-language rules are almost infinite depending on the language set.

  6. Learning Orthographic Structure With Sequential Generative Neural Networks.

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; Sperduti, Alessandro; Zorzi, Marco

    2016-04-01

    Learning the structure of event sequences is a ubiquitous problem in cognition and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in connectionist modeling. Here, we investigated a sequential version of the restricted Boltzmann machine (RBM), a stochastic recurrent neural network that extracts high-order structure from sensory data through unsupervised generative learning and can encode contextual information in the form of internal, distributed representations. We assessed whether this type of network can extract the orthographic structure of English monosyllables by learning a generative model of the letter sequences forming a word training corpus. We show that the network learned an accurate probabilistic model of English graphotactics, which can be used to make predictions about the letter following a given context as well as to autonomously generate high-quality pseudowords. The model was compared to an extended version of simple recurrent networks, augmented with a stochastic process that allows autonomous generation of sequences, and to non-connectionist probabilistic models (n-grams and hidden Markov models). We conclude that sequential RBMs and stochastic simple recurrent networks are promising candidates for modeling cognition in the temporal domain.
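    The generative-sequence-model idea can be illustrated with a far simpler stand-in for the sequential RBM: a bigram letter model that both predicts the next letter given a context and autonomously samples pseudowords. The training words below are invented; this is a sketch of the modeling idea, not the paper's network.

```python
import random
from collections import defaultdict

START, END = "^", "$"  # word-boundary markers

def train_bigrams(words):
    """Estimate P(next letter | current letter) from a word corpus."""
    counts = defaultdict(lambda: defaultdict(int))
    for w in words:
        seq = [START] + list(w) + [END]
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def predict(model, context):
    """Most probable letter following a single-letter context."""
    return max(model[context], key=model[context].get)

def generate(model, rng):
    """Sample a pseudoword letter by letter from the learned distribution."""
    out, cur = [], START
    while True:
        letters, probs = zip(*model[cur].items())
        cur = rng.choices(letters, probs)[0]
        if cur == END:
            return "".join(out)
        out.append(cur)

model = train_bigrams(["cat", "can", "cap", "con"])
print(predict(model, START))  # → c (every training word starts with 'c')
```

    The RBM and stochastic simple recurrent networks in the study play the same two roles, prediction and autonomous generation, but with distributed internal representations that capture higher-order context than a bigram can.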

  7. Structural looseness investigation in slow rotating permanent magnet generators

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Mijatovic, Nenad; Sweeney, Christian Walsted;

    2016-01-01

    Structural looseness in electric machines is a condition influencing the alignment of the machine and thus the overall bearing health. In this work, assessment of the above-mentioned failure mode is tested on a slow rotating (running speed equal to 0.7 Hz) permanent magnet generator (PMG), while...

  8. Learning Orthographic Structure with Sequential Generative Neural Networks

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; Sperduti, Alessandro; Zorzi, Marco

    2016-01-01

    Learning the structure of event sequences is a ubiquitous problem in cognition and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in…

  9. Structural Learning of Attack Vectors for Generating Mutated XSS Attacks

    CERN Document Server

    Wang, Yi-Hsun; Lee, Hahn-Ming; 10.4204/EPTCS.35.2

    2010-01-01

    Web applications suffer from cross-site scripting (XSS) attacks resulting from incomplete or incorrect input sanitization. Learning the structure of attack vectors could enrich the variety of manifestations in generated XSS attacks. In this study, we focus on generating more threatening XSS attacks for the state-of-the-art detection approaches that can find potential XSS vulnerabilities in Web applications, and propose a mechanism for structural learning of attack vectors with the aim of generating mutated XSS attacks in a fully automatic way. Mutated XSS attack generation depends on the analysis of attack vectors and the structural learning mechanism. For the kernel of the learning mechanism, we use a hidden Markov model (HMM) as the structure of the attack vector model to capture the implicit manner of the attack vector; this manner benefits from the syntax meanings that are labeled by the proposed tokenizing mechanism. Bayes' theorem is used to determine the number of hidden states in the model...

  10. Virtual Screening and Structure Generation Applied to Drug Design

    Institute of Scientific and Technical Information of China (English)

    FAN B.T.; CHEN H. F.; XIE L.; YUAN S. G.; A. PANAYE; J-P. DOUCET

    2004-01-01

    The methods of computer-aided drug design can be divided into two categories according to whether or not the structures of receptors are known1, corresponding to two principal strategies: (1) searching for bio-active ligands against virtual combinatorial libraries and calculating the affinity energy between ligand and receptor by docking; (2) QSAR and 3D-structure data-mining. The 3D-QSAR method is now applied widely to drug discovery, but it is generally limited to refining the structures of known bio-active compounds. During the process of drug design, we usually hold the prejudice that certain groups or structural fragments will or will not play important roles in the activity. This can sometimes be misleading and prevent us from obtaining the expected results. The method of first generating diverse structures, then screening out the promising structures by means of a computational method or QSAR model, is an efficient way for drug discovery. We developed an efficient virtual and rational drug design method. It combines virtual bioactive compound generation using genetic algorithms with a 3D-QSAR model and docking. Using this method, one can generate many highly diverse molecules and find virtual active lead compounds. The method was validated by a study on a set of anti-tumor drugs, colchicine analogs2. With the constraints of the pharmacophore determined by DISCO, 97 virtual bioactive compounds were generated, and their anti-tumor activities were predicted by CoMFA. Eight structures with high activity were selected and screened by the 3D-QSAR model. The most active generated structure was further investigated by modifying its structure in order to increase the activity (see fig. 1). This drug design method could also avoid the conflict between the insufficiency of active structures and the great quantity of compounds needed for high-throughput screening. This method has also been applied to anti-HIV drug design. We have equally developed another approach of virtual

  11. Survey on the consciousness structure toward nuclear power generation

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, M.; Yoshida, T. (Nomura Research Institute, Kamakura, Kanagawa (Japan))

    1981-05-01

    A survey of popular consciousness toward nuclear power generation was carried out by means of a direct questionnaire to 1600 persons, aged 20 to 69, in power demand areas (Tokyo and Osaka) and power supply areas (sites of nuclear power generation) from early February to early March, 1980; the recovery rate was 74.4% (1190 persons). The results are described together with their interpretation. The purpose is to clarify the structure of popular consciousness toward nuclear energy, in particular nuclear power generation, and the nature of its acceptance. That is, it was surveyed how the general public in the power supply and power demand areas regard nuclear power generation concerning its need and safety, and further how these attitudes are constituted and how they vary.

  12. Effect of Structural Modification on Second Harmonic Generation in Collagen

    Energy Technology Data Exchange (ETDEWEB)

    Stoller, P C; Reiser, K M; Celliers, P M; Rubenchik, A M

    2003-04-04

    The effects of structural perturbation on second harmonic generation in collagen were investigated. Type I collagen fascicles obtained from rat tails were structurally modified by increasing nonenzymatic cross-linking, by thermal denaturation, by collagenase digestion, or by dehydration. Changes in polarization dependence were observed in the dehydrated samples. Surprisingly, no changes in polarization dependence were observed in highly crosslinked samples, despite significant alterations in packing structure. Complete thermal denaturation and collagenase digestion produced samples with no detectable second harmonic signal. Prior to loss of signal, no change in polarization dependence was observed in partially heated or digested collagen.

  13. Effect of structural modification on second harmonic generation in collagen

    Science.gov (United States)

    Stoller, Patrick C.; Reiser, Karen M.; Celliers, Peter M.; Rubenchik, Alexander M.

    2003-07-01

    The effects of structural perturbation on second harmonic generation in collagen were investigated. Type I collagen fascicles obtained from rat tails were structurally modified by increasing nonenzymatic cross-linking, by thermal denaturation, by collagenase digestion, or by dehydration. Changes in polarization dependence were observed in the dehydrated samples. Surprisingly, no changes in polarization dependence were observed in highly crosslinked samples, despite significant alterations in packing structure. Complete thermal denaturation and collagenase digestion produced samples with no detectable second harmonic signal. Prior to loss of signal, no change in polarization dependence was observed in partially heated or digested collagen.

  14. Expanding the mammalian phenotype ontology to support automated exchange of high throughput mouse phenotyping data generated by large-scale mouse knockout screens.

    Science.gov (United States)

    Smith, Cynthia L; Eppig, Janan T

    2015-01-01

    A vast array of data is about to emerge from the large scale high-throughput mouse knockout phenotyping projects worldwide. It is critical that this information is captured in a standardized manner, made accessible, and is fully integrated with other phenotype data sets for comprehensive querying and analysis across all phenotype data types. The volume of data generated by the high-throughput phenotyping screens is expected to grow exponentially, thus, automated methods and standards to exchange phenotype data are required. The IMPC (International Mouse Phenotyping Consortium) is using the Mammalian Phenotype (MP) ontology in the automated annotation of phenodeviant data from high throughput phenotyping screens. 287 new term additions with additional hierarchy revisions were made in multiple branches of the MP ontology to accurately describe the results generated by these high throughput screens. Because these large scale phenotyping data sets will be reported using the MP as the common data standard for annotation and data exchange, automated importation of these data to MGI (Mouse Genome Informatics) and other resources is possible without curatorial effort. Maximum biomedical value of these mutant mice will come from integrating primary high-throughput phenotyping data with secondary, comprehensive phenotypic analyses combined with published phenotype details on these and related mutants at MGI and other resources.

  15. Code Generation for Embedded Software for Modeling Clear Box Structures

    Directory of Open Access Journals (Sweden)

    V. Chandra Prakash

    2011-09-01

    Full Text Available Cleanroom Software Engineering (CRSE) recommends that the code related to application systems be generated either manually or through code generation models, or be represented as a hierarchy of clear box structures. CRSE has even advocated that the code be developed using the state models that model the internal behavior of the systems. No framework has been recommended by any author by which the clear boxes are designed using code generation methods. Code generation is one of the important quality issues addressed in Cleanroom Software Engineering. It has been found that CRSE can be used for life-cycle management of embedded systems when hardware-software co-design is built in as part and parcel of CRSE, by adding suitable models to CRSE and redefining it accordingly. The design of embedded systems involves code generation in respect of both hardware and embedded software. In this paper, a framework is proposed by which the embedded software is generated. The method is unique in that it considers various aspects of code generation, including code segments, code functions, classes, globalization, variable propagation, etc. The proposed framework has been applied to a pilot project and the experimental results are presented.

  16. Brain anatomical structure segmentation by hybrid discriminative/generative models.

    Science.gov (United States)

    Tu, Z; Narr, K L; Dollar, P; Dinov, I; Thompson, P M; Toga, A W

    2008-04-01

    In this paper, a hybrid discriminative/generative model for brain anatomical structure segmentation is proposed. The learning aspect of the approach is emphasized. In the discriminative appearance models, various cues such as intensity and curvatures are combined to locally capture the complex appearances of different anatomical structures. A probabilistic boosting tree (PBT) framework is adopted to learn multiclass discriminative models that combine hundreds of features across different scales. On the generative model side, both global and local shape models are used to capture the shape information about each anatomical structure. The parameters to combine the discriminative appearance and generative shape models are also automatically learned. Thus, low-level and high-level information is learned and integrated in a hybrid model. Segmentations are obtained by minimizing an energy function associated with the proposed hybrid model. Finally, a grid-face structure is designed to explicitly represent the 3-D region topology. This representation handles an arbitrary number of regions and facilitates fast surface evolution. Our system was trained and tested on a set of 3-D magnetic resonance imaging (MRI) volumes and the results obtained are encouraging.

  17. Structures for distributed automation systems in the future; Strukturen kuenftiger verteilter leittechnischer Systeme am Beispiel der Feldtechnik

    Energy Technology Data Exchange (ETDEWEB)

    Doebrich, U.; Heidel, R. [Siemens AG, Karlsruhe (Germany)

    2000-07-01

    Due to the introduction of the fieldbus, an improved communication infrastructure is provided in the field. The field devices cooperate with each other in distributed applications over the bus. In order to use the field devices optimally in the automation application, their behavior may have to be adjusted. This adjustment may have some impact on the structure of each of the components. Besides that, other technical trends may have consequences for the components and the architecture of the automation system for distributed applications. (orig.) [German original: With the fieldbus, a substantially improved communication infrastructure is available in automation technology. In order to use it optimally for automation applications, the properties of the individual components must be coordinated with one another. In addition, trends can be observed that will influence the structure of the control systems. These trends and their possible consequences for future control system components are described. (orig.)]

  18. Benchmarks on automated system and software generation higher flexibility increased productivity and shorter time-to-market by ScaPable software

    Science.gov (United States)

    Gerlich, Rainer

    2002-07-01

    "ScaPable" is an acronym derived from "scalable" and "portable". The attribute "scalable" indicates that specific application software can automatically be built from scratch and verified without writing any statement in a programming language like C, thereby covering a large variety of embedded and/or distributed applications. The term "portable" addresses the capability to automatically port parts of such an application from one physical node to another (the processor and operating system type may change), requiring only the names of the nodes, their processor type and operating system. This way the infrastructure of an embedded/distributed system can be built just by providing the literals and figures which define the system interaction, communication, topology and performance. Moreover, dedicated application software, as needed for on-board command handling, data acquisition and processing, and telemetry handling, can be built from generic templates. The generation times range from less than one second up to about twenty minutes on a PC/Linux platform (800 MHz). Owing to this extremely short generation time, risks can be identified early because the executable application is immediately available for validation. A rough estimation shows that one hour of automated system and software generation is equivalent to about 5 to 50 man-years. Currently, about 50% of a typical space embedded system can be covered by the available automated approach; however, the more it is applied, the more can be covered by automation. A system is constructed by applying a formal transformation to the small amount of information delivered by the user. This approach is not limited to the space domain, although the first industrial application was a space project. Quite different domains can take advantage of such principles of system construction. This paper explains the approach, compares it with other approaches, and provides figures on productivity, duration of system generation and reliability.

  19. Developing a Graphical User Interface to Automate the Estimation and Prediction of Risk Values for Flood Protective Structures using Artificial Neural Network

    Science.gov (United States)

    Hasan, M.; Helal, A.; Gabr, M.

    2014-12-01

    In this project, we focus on providing a computer-automated platform for a better assessment of the potential failures and retrofit measures of flood-protecting earth structures, e.g., dams and levees. Such structures play an important role during extreme flooding events as well as during normal operating conditions. Furthermore, they are part of other civil infrastructure such as water storage and hydropower generation. Hence, there is a clear need for accurate evaluation of stability and functionality levels during their service lifetime so that rehabilitation and maintenance costs are effectively guided. Among condition assessment approaches based on the factor of safety, the limit states (LS) approach utilizes numerical modeling to quantify the probability of potential failures. The parameters for LS numerical modeling include i) geometry and side slopes of the embankment, ii) loading conditions in terms of the rate of rise and duration of high water levels in the reservoir, and iii) cycles of rising and falling water levels simulating the effect of consecutive storms throughout the service life of the structure. Sample data regarding the correlations of these parameters are available through previous research studies. We have unified these criteria and extended the risk assessment in terms of loss of life through the implementation of a graphical user interface that automates the input of parameters, divides the data into training and testing sets, and then feeds them into an Artificial Neural Network (ANN) tool through MATLAB programming. The ANN modeling allows us to predict risk values of flood protective structures based on user feedback quickly and easily. In the future, we expect to fine-tune the software by adding extensive data on variations of parameters.
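    The train/test split and ANN regression workflow described above can be sketched as follows. The data here are synthetic and the tiny one-hidden-layer network is an illustrative stand-in for the MATLAB ANN tool, not the project's model.

```python
import numpy as np

# Synthetic stand-ins for embankment parameters (e.g. slope, water-rise rate,
# storm cycles) and a toy "risk" target that is a weighted sum of them.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2])[:, None]

X_train, X_test = X[:150], X[150:]   # split into training and testing sets
y_train, y_test = y[:150], y[150:]

W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)          # hidden layer
    return h, h @ W2 + b2             # linear output = predicted risk

losses = []
lr = 0.1
for _ in range(1000):                 # plain batch gradient descent on MSE
    h, pred = forward(X_train)
    err = pred - y_train
    losses.append(float((err ** 2).mean()))
    gW2 = h.T @ err / len(err); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)  # backprop through tanh
    gW1 = X_train.T @ dh / len(err); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

_, test_pred = forward(X_test)        # predicted risk values for unseen cases
print("final train MSE:", losses[-1])
```

    The GUI described in the abstract wraps exactly these steps: collecting the parameters, performing the split, and feeding the sets to the ANN for risk prediction.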

  20. CIF2Cell: Generating geometries for electronic structure programs

    Science.gov (United States)

    Björkman, Torbjörn

    2011-05-01

    The CIF2Cell program generates the geometrical setup for a number of electronic structure programs based on the crystallographic information in a Crystallographic Information Framework (CIF) file. The program will retrieve the space group number, Wyckoff positions and crystallographic parameters, make a sensible choice for Bravais lattice vectors (primitive or principal cell) and generate all atomic positions. Supercells can be generated and alloys are handled gracefully. The code currently has output interfaces to the electronic structure programs ABINIT, CASTEP, CPMD, Crystal, Elk, Exciting, EMTO, Fleur, RSPt, Siesta and VASP.
    Program summary
    Program title: CIF2Cell
    Catalogue identifier: AEIM_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIM_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU GPL version 3
    No. of lines in distributed program, including test data, etc.: 12 691
    No. of bytes in distributed program, including test data, etc.: 74 933
    Distribution format: tar.gz
    Programming language: Python (versions 2.4-2.7)
    Computer: Any computer that can run Python (versions 2.4-2.7)
    Operating system: Any operating system that can run Python (versions 2.4-2.7)
    Classification: 7.3, 7.8, 8
    External routines: PyCIFRW [1]
    Nature of problem: Generate the geometrical setup of a crystallographic cell for a variety of electronic structure programs from data contained in a CIF file.
    Solution method: The CIF file is parsed using routines contained in the library PyCIFRW [1], and crystallographic as well as bibliographic information is extracted. The program then generates the principal cell from symmetry information, crystal parameters, space group number and Wyckoff sites. Reduction to a primitive cell is then performed, and the resulting cell is output to suitably named files along with documentation of the information source generated from any bibliographic information contained in the CIF
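    The core step of generating all atomic positions from symmetry information can be sketched as applying each space-group operator (rotation plus translation, modulo lattice translations) to a representative Wyckoff position and keeping the distinct images. The operator set below is a small invented example, not taken from CIF2Cell's code.

```python
import numpy as np

# Toy set of symmetry operators: identity plus three mirror/rotation-like
# operations, each given as (rotation matrix, translation vector).
ops = [
    (np.eye(3), np.zeros(3)),
    (np.diag([-1, -1, 1]), np.zeros(3)),
    (np.diag([-1, 1, 1]), np.zeros(3)),
    (np.diag([1, -1, 1]), np.zeros(3)),
]

def orbit(position, ops, tol=1e-6):
    """All distinct images of a fractional position under the operators."""
    images = []
    for rot, trans in ops:
        p = (rot @ np.asarray(position, dtype=float) + trans) % 1.0
        if not any(np.allclose(p, q, atol=tol) for q in images):
            images.append(p)
    return images

sites = orbit([0.25, 0.25, 0.0], ops)
print(len(sites))  # → 4 (a general position has the full orbit)
```

    A special position such as the origin maps onto itself under every operator, so its orbit collapses to a single site; this is why the multiplicity of a Wyckoff position depends on where it sits in the cell.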

  1. Structured Reporting Method for ePR Generation

    Directory of Open Access Journals (Sweden)

    Arash Ebrahimi

    2007-08-01

    Full Text Available Appropriate electronic medical report-making software helps physicians to personally generate records for paper printing and ePR access. Flat data-sheets with check-boxes, as already used in traditional medical paper reports, do not satisfy today's physicians' demands for more professional reports. Alternatively, Structured Reporting (SR), a modified version of the flat check-box based reporting method, is being selected. In this method, items are nested in a hierarchical tree so that each reporting item includes several substitutions. Hence, the computer generates professional sentences with a logical pre-defined combination of selected items. In our work, on the basis of the SR method, we provided a solution for the reporting of endoscopy procedures that is accepted by several gastroenterologists as proper software. In addition, successful results in the generation of ePRs using SR have newly been achieved in cardiology.

  2. Nanosecond pulsed laser generation of holographic structures on metals

    Science.gov (United States)

    Wlodarczyk, Krystian L.; Ardron, Marcus; Weston, Nick J.; Hand, Duncan P.

    2016-03-01

    A laser-based process for the generation of phase holographic structures directly onto the surface of metals is presented. This process uses 35 ns long laser pulses at a wavelength of 355 nm to generate optically smooth surface deformations on a metal. The laser-induced surface deformations (LISDs) are produced by either localized laser melting or the combination of melting and evaporation. The geometry (shape and dimensions) of the LISDs depends on the laser processing parameters, in particular the pulse energy, as well as on the chemical composition of the metal. In this paper, we explain the mechanism of LISD formation on various metals, such as stainless steel, pure nickel and nickel-chromium Inconel® alloys. In addition, we provide information about the design and fabrication process of the phase holographic structures and demonstrate their use as robust markings for the identification and traceability of high-value metal goods.

  3. Structural materials for the next generation of technologies

    CERN Document Server

    Van de Voorde, Marcel Hubert

    1996-01-01

    1. Overview of advanced technologies, i.e. aerospace-aeronautics, automobile, energy technology, accelerator engineering, etc., and the need for new structural materials. 2. Familiarisation with polymers, metals and alloys, structural ceramics, composites and surface engineering. The study of modern materials processing, generation of a materials database, engineering properties including NDE, radiation damage, etc. 3. Development of new materials for the next generation of technologies, including the spin-off of materials developed for space and military purposes to industrial applications. 4. Materials selection for modern accelerator engineering. 5. Materials research in Europe, USA and Japan. Materials R & D programmes sponsored by the European Union and the collaboration of CERN in EU-sponsored programmes.

  4. Harmonic and subharmonic acoustic wave generation in finite structures.

    Science.gov (United States)

    Alippi, A; Bettucci, A; Germano, M; Passeri, D

    2006-12-22

    The generation of harmonic and subharmonic vibrations is considered in a finite one-dimensional structure, as produced by the nonlinear acoustic characteristics of the medium. The equation of motion is considered, where a general function of the displacement and its derivatives acts as the forcing term for (sub)harmonic generation, and a series of 'selection rules' is found, depending on the sample constraints. The localization of the nonlinear term is also considered, which mimics the presence of defects or cracks in the structure, together with the spatial distribution of subharmonic modes. Experimental evidence is given for the power-law dependence of the harmonic modes vs. the fundamental mode displacement amplitude, and subharmonic mode distribution with hysteretic effects is also reported in a cylindrical sample of piezoelectric material.

  5. SABATPG-A Structural Analysis Based Automatic Test Generation System

    Institute of Scientific and Technical Information of China (English)

    李忠诚; 潘榆奇; 闵应骅

    1994-01-01

    A TPG system, SABATPG, is presented, based on a generic structural model of large circuits. Three techniques, partial implication, aftereffect of identified undetectable faults, and shared sensitization with new concepts of localization and aftereffect, are employed in the system to improve the FAN algorithm. Experiments on the 10 ISCAS benchmark circuits show that the computing time of SABATPG for test generation is 19.42% less than that of the FAN algorithm.

  6. Generating a 2D Representation of a Complex Data Structure

    Science.gov (United States)

    James, Mark

    2006-01-01

    A computer program, designed to assist in the development and debugging of other software, generates a two-dimensional (2D) representation of a possibly complex n-dimensional (where n is an integer >2) data structure or abstract rank-n object in that other software. The nature of the 2D representation is such that it can be displayed on a non-graphical output device and distributed by non-graphical means.
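    The idea of flattening a rank-n object into a 2D text form suitable for a non-graphical output device can be sketched with a recursive renderer. This is an illustrative stand-in for the program described, not its actual algorithm.

```python
# Render an arbitrarily nested (rank-n) structure as plain indented text,
# one element per line, so it can be displayed or logged without graphics.
def render(obj, indent=0):
    pad = "  " * indent
    if isinstance(obj, (list, tuple)):
        lines = [pad + "["]
        for item in obj:
            lines.extend(render(item, indent + 1))
        lines.append(pad + "]")
        return lines
    return [pad + repr(obj)]

text = "\n".join(render([[1, 2], [3, [4, 5]]]))
print(text)
```

    Because the output is plain lines of text, it can be distributed and diffed by non-graphical means, which is the debugging use case the record describes.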

  7. Structure of the automated educational-methodical complex for technical disciplines

    Directory of Open Access Journals (Sweden)

    Вячеслав Михайлович Дмитриев

    2010-12-01

    Full Text Available The article poses and solves the problem of automating and informatizing the process of student training on the basis of the introduced system-organizational forms, which are collectively known as educational-methodical complexes for a discipline.

  8. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    Science.gov (United States)

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-09-25

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source code program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated both with physical and virtual inputs and outputs. Virtual components, such as graphical user interfaces (GUIs) of equipment, are handled by means of image recognition tools provided by Sikuli scripting language, while handling of their physical counterparts is achieved using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results, in terms of analytical performance, were obtained. Similarly, advantages derived from open-source tools assistance could be appreciated, mainly in terms of lesser intervention of operators and cost savings.

  9. A microfluidic device for preparing next generation DNA sequencing libraries and for automating other laboratory protocols that require one or more column chromatography steps.

    Directory of Open Access Journals (Sweden)

    Swee Jin Tan

    Full Text Available Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation.

  10. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Full Text Available Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed the access mode of services on the part of consumers. ICT-enabled services have further stimulated the perception of automated service quality with renewed dimensions and its subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy, CRM. The study was conducted at the largest public sector bank of India, State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in the year 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  11. Structural materials issues for the next generation fission reactors

    Science.gov (United States)

    Chant, I.; Murty, K. L.

    2010-09-01

    Generation-IV reactor design concepts envisioned thus far cater to a common goal of providing safer, longer lasting, proliferation-resistant, and economically viable nuclear power plants. The foremost consideration in the successful development and deployment of Gen-IV reactor systems is the performance and reliability issues involving structural materials for both in-core and out-of-core applications. The structural materials need to endure much higher temperatures, higher neutron doses, and extremely corrosive environments, which are beyond the experience of the current nuclear power plants. Materials under active consideration for use in different reactor components include various ferritic/martensitic steels, austenitic stainless steels, nickel-base superalloys, ceramics, composites, etc. This article addresses the material requirements for these advanced fission reactor types, specifically addressing structural materials issues depending on the specific application areas.

  12. Automated Voxel Model from Point Clouds for Structural Analysis of Cultural Heritage

    Science.gov (United States)

    Bitelli, G.; Castellazzi, G.; D'Altri, A. M.; De Miranda, S.; Lambertini, A.; Selvaggi, I.

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings with different complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired from laser scanning or digital photogrammetry, to a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim to decrease operator intervention in the workflow and obtain a better description of the structure. In order to achieve this result a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
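
    The point-cloud-to-voxel conversion step can be illustrated with a minimal occupancy-grid sketch in NumPy. This is a uniform-resolution toy under stated assumptions, not the paper's procedure, which additionally fills the interior of the structure and varies the voxel resolution:

```python
import numpy as np

def voxelize(points, voxel_size):
    """Convert an (N, 3) point cloud into a boolean occupancy grid.

    A uniform-resolution toy; a real pipeline would also fill the
    interior volume and map occupied voxels to finite elements.
    """
    pts = np.asarray(points, dtype=float)
    origin = pts.min(axis=0)                          # grid anchor point
    idx = np.floor((pts - origin) / voxel_size).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)  # tight bounding grid
    grid[tuple(idx.T)] = True                         # mark occupied voxels
    return grid, origin

# Three points; the two right-most fall into the same 0.5-unit voxel.
pts = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.1, 0.0, 0.0]]
grid, origin = voxelize(pts, voxel_size=0.5)
```

    An FEM-oriented workflow would then instantiate one hexahedral element per occupied voxel.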

  13. Structuring and extracting knowledge for the support of hypothesis generation in molecular biology

    NARCIS (Netherlands)

    Roos, M.; Marshall, M.S.; Gibson, A.P.; Schuemie, M.; Meij, E.; Katrenko, S.; van Hage, W.R.; Krommydas, K.; Adriaans, P.W.

    2009-01-01

    Background: Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement of automated support is exemplified by the difficulty of considering all relevant facts that are contained in the millions of documents available from PubMed.

  14. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    Science.gov (United States)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to the evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind their design, the nuances pertaining to founding and tuning have largely been empirical, and conveyed from one generation to the next. Post-production assessment for bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future model designs. To establish confidence in the FEA process it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically, to collect the data for an EMA, the vibratory response of the structure is measured with the application of accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect due to the non-contact nature of the technique; resulting in higher accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution resulting in a higher confidence EMA.

  15. Parallel and Streaming Generation of Ghost Data for Structured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Isenburg, M; Lindstrom, P; Childs, H

    2008-04-15

    Parallel simulations decompose large domains into many blocks. A fundamental requirement for subsequent parallel analysis and visualization is the presence of ghost data that supplements each block with a layer of adjacent data elements from neighboring blocks. The standard approach for generating ghost data requires all blocks to be in memory at once. This becomes impractical when there are fewer processors - and thus less aggregate memory - available for analysis than for simulation. We describe an algorithm for generating ghost data for structured grids that uses many fewer processors than previously possible. Our algorithm stores as little as one block per processor in memory and can run on as few processors as are available (possibly just one). The key idea is to slightly change the size of the original blocks by declaring parts of them to be ghost data, and by later padding adjacent blocks with this data.
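
    The basic idea of supplementing each block with a layer of adjacent data can be sketched in 1D with NumPy. This is a toy under stated assumptions; the paper's streaming algorithm instead re-labels parts of already-decomposed blocks as ghost data rather than reading from a global array:

```python
import numpy as np

def block_with_ghost(global_field, lo, hi, depth=1):
    """Return the block global_field[lo:hi] padded with up to `depth`
    ghost cells taken from neighboring data, clipped at the domain
    boundary (boundary blocks simply get fewer ghost cells)."""
    g_lo = max(lo - depth, 0)
    g_hi = min(hi + depth, len(global_field))
    return global_field[g_lo:g_hi]

field = np.arange(10)            # a 1D "simulation" domain
blocks = [(0, 5), (5, 10)]       # two non-overlapping blocks
ghosted = [block_with_ghost(field, lo, hi) for lo, hi in blocks]
```

    Each ghosted block now overlaps its neighbor by one cell on each shared face, which is what stencil-based analysis and visualization operators require.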

  16. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis

    Science.gov (United States)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku

    2014-03-01

    In this paper, we propose an automated biliary tract extraction method from abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No extraction method has been reported for the automated extraction of the biliary tract from common contrast CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures, and intensities of the IHBD are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter based on a local intensity structure analysis method using the eigenvalues of the Hessian matrix for the IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 cases of CT volumes. The average Dice coefficient of the extraction results was 66.7%.
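
    A 2D toy version of a Hessian-eigenvalue filter for dark linear structures might look as follows. The function name, parameters, and synthetic image are illustrative assumptions; the paper's DLSE filter applies this kind of analysis to 3D CT volumes with its own tuning:

```python
import numpy as np
from scipy import ndimage

def dark_line_enhance(image, sigma=1.5):
    """Enhance dark linear structures via Hessian eigenvalues (2D toy).

    A dark line has a large positive second derivative across it, so we
    keep the larger Hessian eigenvalue wherever it is positive.
    """
    im = ndimage.gaussian_filter(np.asarray(image, dtype=float), sigma)
    gy, gx = np.gradient(im)          # d/d(row), d/d(col)
    hxy, hxx = np.gradient(gx)        # d2/dxdy, d2/dx2
    hyy, _ = np.gradient(gy)          # d2/dy2
    trace = hxx + hyy
    delta = np.sqrt((hxx - hyy) ** 2 + 4.0 * hxy ** 2)
    lam_max = 0.5 * (trace + delta)   # larger eigenvalue of the 2x2 Hessian
    return np.clip(lam_max, 0.0, None)

# Synthetic test image: a dark vertical line on a bright background.
img = np.full((32, 32), 200.0)
img[:, 15] = 50.0
resp = dark_line_enhance(img)
```

    The response is large along the dark line and near zero in flat regions, which is the behavior a candidate-region extraction step thresholds on.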

  17. Generation of nanometer structures on surfaces of ionic solids generated by laser and electron beam irradiation

    Science.gov (United States)

    Dawes, M. L.; Langford, S. C.; Dickinson, J. Thomas

    2001-03-01

    Radiation effects on hydrated single crystals are poorly understood. We find that dense arrays of nanoscale conical structures, with aspect ratios on the order of 200, are produced when single crystal brushite (CaHPO4·2H2O) is exposed to energetic electrons (2 keV). Other three-dimensional nanostructures are generated by exposing brushite to excimer laser irradiation. We show that the mechanism involves: (a) photo/electron stimulated decomposition of the matrix, and (b) thermally stimulated migration of water (in this case, crystalline) and ionic material. We have isolated these factors to some extent and present plausible mechanisms for structure formation. In addition, we have recently exposed non-hydrated ionic crystals to radiation in the presence of background water (p_water ~ 10^-7 Torr), which produces exceedingly fine structures (sub-10 nm). The optical and luminescence properties of these features will be presented. An example of a “stealth surface” will be given with possible applications for the laser generation of x-rays.

  18. Zero in on Key Open Problems in Automated NMR Protein Structure Determination

    KAUST Repository

    Abbas, Ahmed

    2015-11-12

    Nuclear magnetic resonance (NMR) is one of the main approaches for protein structure determination. The biggest advantage of this approach is that it can determine the three-dimensional structure of the protein in the solution phase. Thus, the natural dynamics of the protein can be studied. However, NMR protein structure determination is an expertise intensive and time-consuming process. If the structure determination process can be accelerated or even automated by computational methods, that will significantly advance the structural biology field. Our goal in this dissertation is to propose highly efficient and error tolerant methods that can work well on real and noisy data sets of NMR. Our first contribution in this dissertation is the development of a novel peak picking method (WaVPeak). First, WaVPeak denoises the NMR spectra using wavelet smoothing. A brute force method is then used to identify all the candidate peaks. After that, the volume of each candidate peak is estimated. Finally, the peaks are sorted according to their volumes. WaVPeak is tested on the same benchmark data set that was used to test the state-of-the-art method, PICKY. WaVPeak shows significantly better performance than PICKY in terms of recall and precision. Our second contribution is to propose an automatic method to select peaks produced by peak picking methods. This automatic method is used to overcome the limitations of fixed number-based methods. Our method is based on the Benjamini-Hochberg (B-H) algorithm. The method is used with both WaVPeak and PICKY to automatically select the number of peaks to return from out of hundreds of candidate peaks. The volume (in WaVPeak) and the intensity (in PICKY) are converted into p-values. Peaks that have p-values below some certain threshold are selected. Experimental results show that the new method is better than the fixed number-based method in terms of recall. To improve precision, we tried to eliminate false peaks using
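
    The Benjamini-Hochberg step-up selection described in this abstract can be sketched as follows. The p-values here are invented for illustration; how the authors map peak volumes and intensities to p-values is not detailed in the abstract:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Indices of hypotheses accepted by the B-H step-up procedure:
    find the largest k with p_(k) <= (k/m) * alpha and keep the k
    smallest p-values."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k_max = rank
    return sorted(order[:k_max])

# Toy p-values standing in for converted peak volumes/intensities.
pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
selected = benjamini_hochberg(pvals)   # keeps the two smallest p-values
```

    Unlike a fixed-number cutoff, the number of peaks returned adapts to how strong the evidence is across the whole candidate list.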

  19. Automatic structures and growth functions for finitely generated abelian groups

    CERN Document Server

    Kamei, Satoshi

    2011-01-01

    In this paper, we consider the formal power series whose n-th coefficient is the number of copies of a given finite graph in the ball of radius n centred at the identity element in the Cayley graph of a finitely generated group, and call it the growth function. Epstein, Iano-Fletcher, and Zwick proved that the growth function is a rational function if the group has a geodesic automatic structure. We compute the growth function in the case where the group is abelian and see that the denominator of the rational function is determined by the rank of the group.
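
    For the simplest instance, counting copies of the one-vertex graph (i.e. ball sizes) in the Cayley graph of Z^2 with the standard generating set, the rationality claim can be checked by brute force. The closed form 2n^2 + 2n + 1 used below is a standard fact, corresponding to the rational series (1 + x)^2 / (1 - x)^3:

```python
from itertools import product

def ball_size_Z2(n):
    """Number of elements of Z^2 within word length n of the identity
    for the standard generating set, i.e. copies of the one-vertex
    graph in the radius-n ball of the Cayley graph."""
    return sum(1 for x, y in product(range(-n, n + 1), repeat=2)
               if abs(x) + abs(y) <= n)

# Brute-force coefficients of the growth series of Z^2; they match the
# closed form 2n^2 + 2n + 1, the expansion of (1 + x)^2 / (1 - x)^3.
sizes = [ball_size_Z2(n) for n in range(6)]
```

    The denominator (1 - x)^3 has degree rank + 1 = 3 here, in line with the abstract's observation that the denominator is determined by the rank.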

  20. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    Science.gov (United States)

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.

  1. AG-NGS: a powerful and user-friendly computing application for the semi-automated preparation of next-generation sequencing libraries using open liquid handling platforms.

    Science.gov (United States)

    Callejas, Sergio; Álvarez, Rebeca; Benguria, Alberto; Dopazo, Ana

    2014-01-01

    Next-generation sequencing (NGS) is becoming one of the most widely used technologies in the field of genomics. Library preparation is one of the most critical, hands-on, and time-consuming steps in the NGS workflow. Each library must be prepared in an independent well, increasing the number of hours required for a sequencing run and the risk of human-introduced error. Automation of library preparation is the best option to avoid these problems. With this in mind, we have developed automatic genomics NGS (AG-NGS), a computing application that allows an open liquid handling platform to be transformed into a library preparation station without losing the potential of an open platform. Implementation of AG-NGS does not require programming experience, and the application has also been designed to minimize implementation costs. Automated library preparation with AG-NGS generated high-quality libraries from different samples, demonstrating its efficiency, and all quality control parameters fell within the range of optimal values.

  2. The scheme of combined application of optimization and simulation models for formation of an optimum structure of an automated control system of space systems

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Nikiforov, A. Yu; Zelenkov, P. V.

    2016-11-01

    With the development of automated control systems for space systems, new classes of spacecraft have appeared that require improvement of their structure and expansion of their functions. When designing an automated control system for space systems, various tasks arise, such as determining the location of elements and subsystems in space, hardware selection, and the distribution of the set of functions performed by the system units, all under certain constraints on the quality of control and the connectivity of components. The problem of synthesizing the structure of an automated control system for space systems is formalized using discrete variables at various levels of system detail. A sequence of tasks and stages for forming the structure of an automated control system for space systems is developed. The authors have developed and proposed a scheme for the combined application of optimization and simulation models to ensure a rational distribution of functions between the automated control system complex and the rest of the system units. The proposed approach makes it possible to select hardware rationally, taking into account the different requirements for the operation of automated control systems of space systems.

  3. Structuring and extracting knowledge for the support of hypothesis generation in molecular biology

    Directory of Open Access Journals (Sweden)

    van Hage Willem

    2009-10-01

    Full Text Available Abstract Background Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement of automated support is exemplified by the difficulty of considering all relevant facts that are contained in the millions of documents available from PubMed. The Semantic Web provides tools for sharing prior knowledge, while information retrieval and information extraction techniques enable its extraction from literature. Their combination makes prior knowledge available for computational analysis and inference. While some tools provide complete solutions that limit the control over the modeling and extraction processes, we seek a methodology that supports control by the experimenter over these critical processes. Results We describe progress towards automated support for the generation of biomolecular hypotheses. Semantic Web technologies are used to structure and store knowledge, while a workflow extracts knowledge from text. We designed minimal proto-ontologies in OWL for capturing different aspects of a text mining experiment: the biological hypothesis, text and documents, text mining, and workflow provenance. The models fit a methodology that allows focus on the requirements of a single experiment while supporting reuse and posterior analysis of extracted knowledge from multiple experiments. Our workflow is composed of services from the 'Adaptive Information Disclosure Application' (AIDA) toolkit as well as a few others. The output is a semantic model with putative biological relations, with each relation linked to the corresponding evidence. Conclusion We demonstrated a 'do-it-yourself' approach for structuring and extracting knowledge in the context of experimental research on biomolecular mechanisms. The methodology can be used to bootstrap the construction of semantically rich biological models using the results of knowledge extraction.

  4. Effective theory for the cosmological generation of structure

    CERN Document Server

    Bojowald, Martin

    2008-01-01

    The current understanding of structure formation in the early universe is mainly built on a magnification of quantum fluctuations in an initial vacuum state during an early phase of accelerated universe expansion. One usually describes this process by solving equations for a quantum state of matter on a given expanding background space-time, followed by decoherence arguments for the emergence of classical inhomogeneities from the quantum fluctuations. Here, we formulate the coupling of quantum matter fields to a dynamical gravitational background in an effective framework which allows the inclusion of back-reaction effects. It is shown how quantum fluctuations couple to classical inhomogeneities and can thus manage to generate cosmic structure in an evolving background. Several specific effects follow from a qualitative analysis of the back-reaction, including a likely reduction of the overall amplitude of power in the cosmic microwave background, the occurrence of small non-Gaussianities, and a possible supp...

  5. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Science.gov (United States)

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  6. Task of generation of variants of subsystems in to the automated hydrometeorological system on basis of morphological synthesis

    OpenAIRE

    Доронина, Юлия Валентиновна

    2011-01-01

    The aspects of generating variants of subsystems are examined for the hydrometeorological system. Principles of variant generation are demonstrated on the basis of morphological synthesis and formulations of a genetic algorithm.

  7. Managing ambiguity in reference generation: the role of surface structure.

    Science.gov (United States)

    Khan, Imtiaz H; van Deemter, Kees; Ritchie, Graeme

    2012-04-01

    This article explores the role of surface ambiguities in referring expressions, and how the risk of such ambiguities should be taken into account by an algorithm that generates referring expressions, if these expressions are to be optimally effective for a hearer. We focus on the ambiguities that arise when adjectives occur in coordinated structures. The central idea is to use statistical information about lexical co-occurrence to estimate which interpretation of a phrase is most likely for human readers, and to avoid generating phrases where misunderstandings are likely. Various aspects of the problem were explored in three experiments in which responses by human participants provided evidence about which reading was most likely for certain phrases, which phrases were deemed most suitable for particular referents, and the speed at which various phrases were read. We found a preference for ''clear'' expressions over ''unclear'' ones, but if several of the expressions are ''clear,'' then brief expressions are preferred over non-brief ones even though the brief ones are syntactically ambiguous and the non-brief ones are not; the notion of clarity was made precise using Kilgarriff's Word Sketches. We outline an implemented algorithm that generates noun phrases conforming to our hypotheses.
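
    The co-occurrence idea can be sketched with a toy decision rule. All counts, word pairs, and the threshold below are invented for illustration; the paper derives its statistics from Kilgarriff's Word Sketches and validates against human readings:

```python
# Invented co-occurrence counts for illustration only.
COOC = {
    ("striped", "shirt"): 95,
    ("striped", "tie"): 40,
    ("striped", "wall"): 2,
}

def risky_coordination(adj, noun1, noun2, ratio=5.0):
    """Flag 'adj noun1 and noun2' as risky when the adjective co-occurs
    far more with noun1 than with noun2, so readers may fail to
    distribute the adjective over both nouns (a toy decision rule)."""
    c1 = COOC.get((adj, noun1), 0) + 1    # add-one smoothing
    c2 = COOC.get((adj, noun2), 0) + 1
    return c1 / c2 > ratio

risky = risky_coordination("striped", "shirt", "wall")   # True
safe = risky_coordination("striped", "shirt", "tie")     # False
```

    A generator following this rule would avoid emitting the risky coordination and fall back to an unambiguous paraphrase.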

  8. Combinatorial parallel synthesis and automated screening of a novel class of liquid crystalline materials.

    Science.gov (United States)

    Deeg, Oliver; Kirsch, Peer; Pauluth, Detlef; Bäuerle, Peter

    2002-12-07

    Combinatorial parallel synthesis has led to the rapid generation of a single-compound library of novel fluorinated quaterphenyls. Subsequent automated screening revealed liquid crystalline (LC) behaviour and gave qualitative relationships of molecular structures and solid state properties.

  9. Automated protein structure modeling in CASP9 by I-TASSER pipeline combined with QUARK-based ab initio folding and FG-MD-based structure refinement.

    Science.gov (United States)

    Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang

    2011-01-01

    I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In CASP9 experiments, two new algorithms, QUARK and fragment-guided molecular dynamics (FG-MD), were added to the I-TASSER pipeline for improving the structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from the PDB structures to guide molecular dynamics simulation and improve the local structure of the predicted model, including hydrogen-bonding networks, torsion angles, and steric clashes. Despite considerable progress in both template-based and template-free structure modeling, significant improvements on protein target classification, domain parsing, model selection, and ab initio folding of β-proteins are still needed to further improve the I-TASSER pipeline. Copyright © 2011 Wiley-Liss, Inc.

  10. Semantics-based Automated Web Testing

    OpenAIRE

    Hai-Feng Guo; Qing Ouyang; Harvey Siy

    2015-01-01

    We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-direct...
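
    The grammar-based test generation that TAO builds on can be sketched with a toy random expander. The grammar, nonterminal names, and query-string shape below are invented for illustration; TAO's actual grammar formalism and its denotational-semantics evaluation are considerably richer:

```python
import random

# Toy grammar: a query is "field=value", or two queries joined by "&".
GRAMMAR = {
    "<query>": [["<field>", "=", "<value>"],
                ["<query>", "&", "<query>"]],
    "<field>": [["user"], ["page"]],
    "<value>": [["1"], ["42"]],
}

def generate(symbol, rng, depth=0):
    """Randomly expand `symbol` into a concrete test-input string."""
    if symbol not in GRAMMAR:
        return symbol                      # terminal symbol
    rules = GRAMMAR[symbol]
    if depth > 4:                          # force termination when deep:
        rules = rules[:1]                  # the first rule is non-recursive
    rule = rng.choice(rules)
    return "".join(generate(s, rng, depth + 1) for s in rule)

rng = random.Random(0)                     # seeded for reproducibility
tests = [generate("<query>", rng) for _ in range(5)]
```

    Pairing each generated input with a semantics-derived expected outcome is what turns this into combined test and oracle generation.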

  11. Warehouse automation

    OpenAIRE

    Pogačnik, Jure

    2017-01-01

    An automated high bay warehouse is commonly used for storing a large number of materials with high throughput. In an automated warehouse, pallet movements are mainly performed by a number of automated devices such as conveyor systems, trolleys, and stacker cranes. From the introduction of the material into the automated warehouse system to its dispatch, the system requires no operator input or intervention, since all material movements are done automatically. This allows the automated warehouse to op...

  12. Generations.

    Science.gov (United States)

    Chambers, David W

    2005-01-01

    Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession.

  13. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry.

    Science.gov (United States)

    Rosas-Castor, J M; Portugal, L; Ferrer, L; Guzmán-Mar, J L; Hernández-Ramírez, A; Cerdà, V; Hinojosa-Reyes, L

    2015-05-18

    A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L(-1) for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013-0.800, 0.011-0.900 and 0.079-1.400 mg L(-1) for F1, F2, and F3, respectively. The precision of the automated MSFIA-HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L(-1) As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural soil samples from an arsenic-contaminated mining zone to evaluate its extractability. The frequency of analysis of the proposed method was eight times higher than that of the conventional BCR method (6 vs 48 h), and the kinetics of lixiviation were established for each fraction.

  14. Sandwich-structured hollow fiber membranes for osmotic power generation

    KAUST Repository

    Fu, Feng Jiang

    2015-11-01

    In this work, a novel sandwich-structured hollow fiber membrane has been developed via a specially designed spinneret and optimized spinning conditions. With this specially designed spinneret, the outer layer, which is the most crucial part of the sandwich-structured membrane, is maintained the same as in the traditional dual-layer membrane. The inner substrate layer is separated into two layers: (1) an ultra-thin middle layer comprising a high molecular weight polyvinylpyrrolidone (PVP) additive to enhance integration with the outer polybenzimidazole (PBI) selective layer, and (2) an inner layer to provide strong mechanical strength for the membrane. Experimental results show that a high water permeability and good mechanical strength could be achieved without the expensive post-treatment process to remove PVP, which was necessary for the dual-layer pressure retarded osmosis (PRO) membranes. By optimizing the composition, the membrane shows a maximum power density of 6.23 W/m2 at a hydraulic pressure of 22.0 bar when 1 M NaCl and 10 mM NaCl are used as the draw and feed solutions, respectively. To the best of our knowledge, this is the best phase inversion hollow fiber membrane with an outer selective PBI layer for osmotic power generation. In addition, this is the first work that shows how to fabricate sandwich-structured hollow fiber membranes for various applications. © 2015 Elsevier B.V.
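
    In PRO, power density is commonly computed as the product of water flux and hydraulic pressure difference. A quick unit-conversion sketch (the flux below is back-calculated from the reported numbers, not stated in the abstract):

```python
def pro_power_density(flux_lmh, dp_bar):
    """Power density W (W/m^2) from water flux Jw in L/(m^2 h) and
    hydraulic pressure difference dP in bar.

    Unit conversion: 1 L/(m^2 h) = 1e-3/3600 m/s and 1 bar = 1e5 Pa,
    so W = Jw * dP / 36 in these units.
    """
    return flux_lmh * dp_bar / 36.0

# Back-calculating the water flux implied by the reported performance
# (6.23 W/m^2 at 22.0 bar); this flux is inferred here, not reported.
implied_flux = 6.23 * 36.0 / 22.0        # ~10.2 L/(m^2 h)
```
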

  15. Analysis of the control structure of wind energy generation systems based on a permanent magnet synchronous generator

    OpenAIRE

    Carranza Castillo, Oscar; Figueres Amorós, Emilio; Garcerá Sanfeliú, Gabriel; González Medina, Raul

    2013-01-01

    This paper presents an analysis of the two usual control structures for variable speed, fixed pitch wind energy generation systems, namely speed and torque control, to determine the most appropriate structure for improving both the robustness and reliability of this kind of distributed generator. The study considers all the elements of a typical wind power generation system and has been carried out in a general way, so that the conclusions are independent of the kind of the AC/DC converter that ...

  16. ARBUS: A FORTRAN tool for generating tree structure diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Ferrero, C. [Kernforschungszentrum Karlsruhe GmbH (Germany). Hauptabteilung Ingenieurtechnik; Zanger, M.

    1992-02-01

    The FORTRAN77 stand-alone code ARBUS has been designed to aid the user by providing a tree structure diagram generating utility for computer programs written in the FORTRAN language. This report describes the main purpose and features of ARBUS and highlights some additional applications of the code by means of practical test cases. (orig.)
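The call-tree idea behind ARBUS can be illustrated with a minimal sketch (hypothetical code, not part of ARBUS): scan FORTRAN source for SUBROUTINE/PROGRAM headers and CALL statements, build a call graph, and print it as an indented tree.

```python
import re
from collections import defaultdict

def build_call_graph(source):
    """Map each routine to the routines it calls, based on simple
    pattern matching of PROGRAM/SUBROUTINE and CALL statements (a
    rough sketch; real FORTRAN parsing must also handle comments,
    functions, continuation lines, etc.)."""
    graph = defaultdict(list)
    current = None
    for line in source.splitlines():
        stmt = line.strip().upper()
        m = re.match(r"(?:SUBROUTINE|PROGRAM)\s+(\w+)", stmt)
        if m:
            current = m.group(1)
            continue
        m = re.match(r"CALL\s+(\w+)", stmt)
        if m and current:
            graph[current].append(m.group(1))
    return graph

def print_tree(graph, root, depth=0, seen=()):
    """Print the call tree rooted at `root`, two spaces per level."""
    print("  " * depth + root)
    if root in seen:          # guard against recursive call chains
        return
    for callee in graph.get(root, []):
        print_tree(graph, callee, depth + 1, seen + (root,))

src = """
      PROGRAM MAIN
      CALL INIT
      CALL SOLVE
      END
      SUBROUTINE SOLVE
      CALL STEP
      END
"""
print_tree(build_call_graph(src), "MAIN")
# MAIN
#   INIT
#   SOLVE
#     STEP
```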

  17. Rate Structures for Customers With Onsite Generation: Practice and Innovation

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, L.; Takahashi, K.; Weston, F.; Murray, C.

    2005-12-01

    Recognizing that innovation and good public policy do not always proclaim themselves, Synapse Energy Economics and the Regulatory Assistance Project, under a contract with the California Energy Commission (CEC) and the National Renewable Energy Laboratory (NREL), undertook a survey of state policies on rates for partial-requirements customers with onsite distributed generation (DG). The survey covered a dozen or so states, varied in geography and in the structures of their electric industries. By reviewing regulatory proceedings, tariffs and publications, and by conducting interviews, the researchers identified a number of approaches to standby and associated rates--many promising, some perhaps less so--that deserve policymakers' attention if they are to promote the deployment of cost-effective DG in their states.

  18. Generative Benchmark Models for Mesoscale Structures in Multilayer Networks

    CERN Document Server

    Bazzi, Marya; Arenas, Alex; Howison, Sam D; Porter, Mason A

    2016-01-01

    Multilayer networks allow one to represent diverse and interdependent connectivity patterns --- e.g., time-dependence, multiple subsystems, or both --- that arise in many applications and which are difficult or awkward to incorporate into standard network representations. In the study of multilayer networks, it is important to investigate "mesoscale" (i.e., intermediate-scale) structures, such as dense sets of nodes known as "communities" that are connected sparsely to each other, to discover network features that are not apparent at the microscale or the macroscale. A variety of methods and algorithms are available to identify communities in multilayer networks, but they differ in their definitions and/or assumptions of what constitutes a community, and many scalable algorithms provide approximate solutions with little or no theoretical guarantee on the quality of their approximations. Consequently, it is crucial to develop generative models of networks to use as a common test of community-detection tools. I...

  19. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rosas-Castor, J.M. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Portugal, L.; Ferrer, L. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Guzmán-Mar, J.L.; Hernández-Ramírez, A. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Cerdà, V. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinojosa-Reyes, L., E-mail: laura.hinojosary@uanl.edu.mx [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico)

    2015-05-18

    Highlights: • A fully automated flow-based modified-BCR extraction method was developed to evaluate the extractable As of soil. • The MSFIA–HG-AFS system included a UV photo-oxidation step for organic species degradation. • The accuracy and precision of the proposed method were found satisfactory. • The analysis time can be reduced by up to a factor of eight using the proposed flow-based BCR method. • The labile As (F1 + F2) was <50% of total As in soil samples from As-contaminated mining zones. - Abstract: A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil, based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L⁻¹ for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013–0.800, 0.011–0.900 and 0.079–1.400 mg L⁻¹ for F1, F2, and F3, respectively. The precision of the automated MSFIA–HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L⁻¹ As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural

  20. Active Data Archive Product Tracking and Automated SPASE Metadata Generation in Support of the Heliophysics Data Environment

    Science.gov (United States)

    Bargatze, L. F.

    2013-12-01

    The understanding of solar interaction with the Earth and other bodies in the solar system is a primary goal of Heliophysics as outlined in the NASA Science Mission Directorate Science Plan. Heliophysics researchers need access to a vast collection of satellite and ground-based observations coupled with numerical simulation data to study complex processes, some of which, as in the case of space weather, pose danger to physical elements of modern society. The infrastructure of the Heliophysics data environment plays a vital role in furthering the understanding of space physics processes by providing researchers with means for data discovery and access. The Heliophysics data environment is highly dynamic, with thousands of data products involved. Access to data is facilitated via the Heliophysics Virtual Observatories (VxO), but routine access is possible only if the VxO SPASE metadata repositories contain accurate and up-to-date information. The Heliophysics Data Consortium has the stated goal of providing routine access to all relevant data products inclusively. Currently, only a small fraction of the data products relevant to Heliophysics studies have been described and registered in a VxO repository. And, for those products that have been described in SPASE, there is a significant time lag from when new data become available to when VxO metadata are updated to provide access. It is possible to utilize automated tools to shorten the response time of VxO data product registration via active data archive product tracking. Such a systematic approach is designed to address data access reliability by embracing the highly dynamic nature of the Heliophysics data environment. For example, the CDAWEB data repository located at the NASA Space Physics Data Facility maintains logs of the data products served to the community. These files include two that pertain to full directory list information, updated daily, and a set of SHA1SUM hash value files, one for each of more

  1. Using Automated Processes to Generate Test Items And Their Associated Solutions and Rationales to Support Formative Feedback

    Directory of Open Access Journals (Sweden)

    Mark Gierl

    2015-08-01

    Automatic item generation is the process of using item models to produce assessment tasks using computer technology. An item model is similar to a template that highlights the elements in the task that must be manipulated to produce new items. The purpose of our study is to describe an innovative method for generating large numbers of diverse and heterogeneous items along with their solutions and associated rationales to support formative feedback. We demonstrate the method by generating items in two diverse content areas, mathematics and nonverbal reasoning
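The item-model concept described above can be sketched in a few lines; the template, its manipulable elements and the solution rule below are invented for illustration, not taken from the study.

```python
import itertools

# A hypothetical item model: a template whose placeholders are the
# manipulable elements. Every combination of element values yields a
# new item together with its solution and a rationale for feedback.
template = "A train travels {speed} km/h for {hours} h. How far does it go?"
elements = {"speed": [60, 80], "hours": [2, 3]}

def generate_items(template, elements):
    keys = list(elements)
    items = []
    for combo in itertools.product(*(elements[k] for k in keys)):
        values = dict(zip(keys, combo))
        distance = values["speed"] * values["hours"]
        items.append({
            "stem": template.format(**values),
            "solution": distance,
            "rationale": f"distance = speed x time = "
                         f"{values['speed']} x {values['hours']} = {distance} km",
        })
    return items

items = generate_items(template, elements)
print(len(items))  # 4 items from the 2 x 2 element combinations
```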

  2. Automated tube voltage selection for radiation dose and contrast medium reduction at coronary CT angiography using 3rd generation dual-source CT

    Energy Technology Data Exchange (ETDEWEB)

    Mangold, Stefanie [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Eberhard-Karls University Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany); Wichmann, Julian L. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University Hospital Frankfurt, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany); Schoepf, U.J. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Poole, Zachary B.; Varga-Szemes, Akos; De Cecco, Carlo N. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Canstein, Christian [Siemens Medical Solutions, Malvern, PA (United States); Caruso, Damiano [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome 'Sapienza', Department of Radiological Sciences, Oncology and Pathology, Rome (Italy); Bamberg, Fabian; Nikolaou, Konstantin [Eberhard-Karls University Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany)

    2016-10-15

    To investigate the relationship between automated tube voltage selection (ATVS) and body mass index (BMI) and its effect on image quality and radiation dose of coronary CT angiography (CCTA). We evaluated 272 patients who underwent CCTA with 3rd generation dual-source CT (DSCT). Prospectively ECG-triggered spiral acquisition was performed with automated tube current selection and advanced iterative reconstruction. Tube voltages were selected by ATVS (70-120 kV). BMI, effective dose (ED), and vascular attenuation in the coronary arteries were recorded. Signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. Five-point scales were used for subjective image quality analysis. Image quality was rated good to excellent in 98.9 % of examinations without significant differences for proximal and distal attenuation (all p ≥ .0516), whereas image noise was rated significantly higher at 70 kV compared to ≥100 kV (all p < .0266). However, no significant differences were observed in SNR or CNR at 70-120 kV (all p ≥ .0829). Mean ED at 70-120 kV was 1.5 ± 1.2 mSv, 2.4 ± 1.5 mSv, 3.6 ± 2.7 mSv, 5.9 ± 4.0 mSv, 7.9 ± 4.2 mSv, and 10.7 ± 4.1 mSv, respectively (all p ≤ .0414). Correlation analysis showed a moderate association between tube voltage and BMI (r = .639). ATVS allows individual tube voltage adaptation for CCTA performed with 3rd generation DSCT, resulting in significantly decreased radiation exposure while maintaining image quality. (orig.)

  3. DEFINITION OF A SEMANTIC PLATAFORM FOR AUTOMATED CODE GENERATION BASED ON UML CLASS DIAGRAMS AND DSL SEMANTIC ANNOTATIONS

    Directory of Open Access Journals (Sweden)

    ANDRÉS MUÑETÓN

    2012-01-01

    This paper proposes a semantic platform of services that implement the steps of a method for automatic code generation. The method is based on semantic information and on MDA (model-driven architecture). Code generation is achieved by semantically relating operations in UML (unified modeling language) class diagrams to implemented operations. Operations are matched by querying for implemented operations that have the same postcondition as the operation under implementation. The resulting code is a sequence of invocations of implemented operations that, together, achieve the postcondition of the operation under implementation. The semantics are specified by means of a DSL (domain-specific language), also defined in this article. The platform services and the method are validated through a case study.

  4. Enzyme engineering: A synthetic biology approach for more effective library generation and automated high-throughput screening

    Science.gov (United States)

    Ebert, Maximilian C. C. J. C.; Mugford, Paul F.; Pelletier, Joelle N.

    2017-01-01

    The Golden Gate strategy entails the use of type IIS restriction enzymes, which cut outside of their recognition sequence. It enables unrestricted design of unique DNA fragments that can be readily and seamlessly recombined. Although successfully employed in other synthetic biology applications, here we demonstrate its advantageous use in engineering a biocatalyst. Hot-spots for mutations were identified in three distinct regions of Candida antarctica lipase A (Cal-A), the biocatalyst chosen as a target to demonstrate the versatility of this recombination method. The three corresponding gene segments were subjected to the most appropriate method of mutagenesis (targeted or random). Their straightforward reassembly allowed combining products of different mutagenesis methods in a single round for rapid production of a series of diverse libraries, thus facilitating directed evolution. Screening to improve discrimination of short-chain versus long-chain fatty acid substrates was aided by development of a general, automated method for visual discrimination of the hydrolysis of varied substrates by whole cells. PMID:28178357

  5. On the structure of finitely generated shift-invariant subspaces

    OpenAIRE

    Kazarian, K. S.

    2016-01-01

    A characterization of finitely generated shift-invariant subspaces is given when the generators are g-minimal. An algorithm is given for determining the coefficients in the well-known representation of the Fourier transform of an element of the finitely generated shift-invariant subspace as a linear combination of Fourier transforms of the generators. An estimate for the norms of those coefficients is derived. For the proof, a sort of orthogonalization procedure for generators is used wh...

  6. Accounting Automation

    OpenAIRE

    Laynebaril1

    2017-01-01

    Accounting Automation. Please respond to the following: Imagine you are a consultant hired to convert a manual accounting system to an automated system. Suggest the key advantages and disadvantages of automating a manual accounting system. Identify the most important step in the conversion process. Provide a rationale for your response. ...

  7. Automated retrieval of forest structure variables based on multi-scale texture analysis of VHR satellite imagery

    Science.gov (United States)

    Beguet, Benoit; Guyon, Dominique; Boukir, Samia; Chehata, Nesrine

    2014-10-01

    The main goal of this study is to design a method to describe the structure of forest stands from Very High Resolution satellite imagery, relying on typical variables such as crown diameter, tree height, trunk diameter, tree density and tree spacing. The emphasis is placed on automating the identification of the most relevant image features for the forest structure retrieval task, exploiting both spectral and spatial information. Our approach is based on linear regressions between the forest structure variables to be estimated and various spectral and Haralick texture features. The main drawback of this well-known texture representation is its underlying parameters, which are extremely difficult to set due to the spatial complexity of the forest structure. To tackle this major issue, an automated feature selection process is proposed, based on statistical modeling and exploring a wide range of parameter values. It provides texture measures of diverse spatial parameters, hence implicitly inducing a multi-scale texture analysis. A new feature selection technique, which we call Random PRiF, is proposed. It relies on random sampling in feature space, carefully addresses the multicollinearity issue in multiple-linear regression, and ensures accurate prediction of forest variables. Our automated forest variable estimation scheme was tested on Quickbird and Pléiades panchromatic and multispectral images, acquired at different periods on the maritime pine stands of two sites in South-Western France. It outperforms two well-established variable subset selection techniques and has been successfully applied to identify the best texture features in modeling the five considered forest structure variables. The RMSE of all predicted forest variables is improved by combining multispectral and panchromatic texture features, with various parameterizations, highlighting the potential of a multi-resolution approach for retrieving forest structure
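The core of the Random PRiF idea, random sampling of feature subsets scored by multiple-linear regression, can be sketched as follows; the data, subset size and scoring are simplified assumptions for illustration, not the authors' implementation (which also guards against multicollinearity).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 40 samples, 8 candidate texture features; features 0 and 3
# drive the response (an assumption for this illustration only).
X = rng.normal(size=(40, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.1, size=40)

def rmse_of_subset(X, y, subset):
    """Least-squares fit on a feature subset; returns training RMSE."""
    A = np.column_stack([X[:, list(subset)], np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sqrt(np.mean((A @ coef - y) ** 2)))

# Random sampling in feature space: draw many small subsets and keep
# the best-scoring one.
best = min(
    (tuple(sorted(int(i) for i in rng.choice(8, size=2, replace=False)))
     for _ in range(200)),
    key=lambda s: rmse_of_subset(X, y, s),
)
print(best)  # very likely (0, 3), the truly informative features
```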

  8. Caractérisation des convertisseurs matriciels : I. Structure de l'automate de commande rapprochée

    Science.gov (United States)

    François, B.; Cambronne, J. P.; Hautier, J. P.

    1996-05-01

    This paper details a design method for the control of static converters. After recalling the useful modeling concepts, the authors establish a functional description of the knowledge model which, completed with the notion of mean conversion value, enables the definition of a generalized control model. Using informational graphs, inversion of this model leads systematically to a control structure composed of functional blocks that together constitute the equipment required to control the converter; this set is called the "Automate de Commande Rapprochée" (A.C.R.). This unified design method is illustrated in the context of direct polyphase matrix converters operating with Pulse Width Modulation (P.W.M.).

  9. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds.

    Science.gov (United States)

    Farine, Damien R; Firth, Josh A; Aplin, Lucy M; Crates, Ross A; Culina, Antica; Garroway, Colin J; Hinde, Camilla A; Kidd, Lindall R; Milligan, Nicole D; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C

    2015-04-01

    Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission-fusion dynamics, can interact to drive phenotypic structure in animal populations.
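The null-model approach described above, comparing an observed assortment statistic against values obtained after permuting phenotypes across group memberships, can be sketched with synthetic data (the statistic, group sizes and sexes below are illustrative, not the study's data).

```python
import random

random.seed(1)

# Synthetic population: 20 individuals with alternating sexes, observed
# in four groups of four, each group perfectly even in sex ratio.
sexes = {i: ("F" if i % 2 == 0 else "M") for i in range(20)}
groups = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]

def sex_ratio_spread(groups, sexes):
    """Variance of the per-group female proportion: 0 means all groups
    have identical sex ratios (maximally even)."""
    ratios = [sum(sexes[i] == "F" for i in g) / len(g) for g in groups]
    mean = sum(ratios) / len(ratios)
    return sum((r - mean) ** 2 for r in ratios) / len(ratios)

observed = sex_ratio_spread(groups, sexes)

# Node-permutation null model: shuffle sexes over individuals while
# keeping the observed group memberships fixed.
ids = list(sexes)
null = []
for _ in range(1000):
    shuffled = dict(zip(ids, random.sample(list(sexes.values()), len(ids))))
    null.append(sex_ratio_spread(groups, shuffled))

# Fraction of permutations at least as even as observed: a small value
# indicates groups are more evenly mixed by sex than expected by chance.
p = sum(n <= observed for n in null) / len(null)
print(observed)  # 0.0: perfectly even sex ratios in the observed groups
```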

  10. Automation synthesis modules review.

    Science.gov (United States)

    Boschi, S; Lodi, F; Malizia, C; Cicoria, G; Marengo, M

    2013-06-01

    The introduction of (68)Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived (68)Ge/(68)Ga generator has been at the basis of the development of (68)Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for (68)Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on regulatory compliance. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation system. Going in to the details I briefly present a real time designed and implemented software and hardware oriented house automation research project, capable of automating house's electricity and providing a security system to detect the presence of unexpected behavior.

  12. Automated Standard Hazard Tool

    Science.gov (United States)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is reflected in positive reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  13. Expanding Lipidome Coverage Using LC-MS/MS Data-Dependent Acquisition with Automated Exclusion List Generation

    Science.gov (United States)

    Koelmel, Jeremy P.; Kroeger, Nicholas M.; Gill, Emily L.; Ulmer, Candice Z.; Bowden, John A.; Patterson, Rainey E.; Yost, Richard A.; Garrett, Timothy J.

    2017-05-01

    Untargeted omics analyses aim to comprehensively characterize biomolecules within a biological system. Changes in the presence or quantity of these biomolecules can indicate important biological perturbations, such as those caused by disease. With current technological advancements, the entire genome can now be sequenced; however, in the burgeoning field of lipidomics, only a subset of lipids can be identified. The recent emergence of high resolution tandem mass spectrometry (HR-MS/MS), in combination with ultra-high performance liquid chromatography, has resulted in increased coverage of the lipidome. Nevertheless, identifications from MS/MS are generally limited by the number of precursors that can be selected for fragmentation during chromatographic elution. Therefore, we developed the software IE-Omics to automate iterative exclusion (IE), where precursors selected in data-dependent topN analyses are excluded in sequential injections. In each sequential injection, unique precursors are fragmented until HR-MS/MS spectra of all ions above a user-defined intensity threshold are acquired. IE-Omics was applied to lipidomic analyses in Red Cross plasma and substantia nigra tissue. Coverage of the lipidome was drastically improved using IE. When applying IE-Omics to Red Cross plasma and substantia nigra lipid extracts in positive ion mode, 69% and 40% more molecular identifications were obtained, respectively. In addition, applying IE-Omics to a lipidomics workflow increased the coverage of trace species, including odd-chained and short-chained diacylglycerides and oxidized lipid species. By increasing the coverage of the lipidome, applying IE to a lipidomics workflow increases the probability of finding biomarkers and provides additional information for determining etiology of disease.
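The iterative-exclusion loop at the heart of this approach can be sketched as follows; the intensity values and parameters are invented for illustration and this is not the published IE-Omics software.

```python
def iterative_exclusion(intensities, top_n, threshold):
    """Simulate IE for data-dependent acquisition: in each injection the
    top-N most intense precursors not yet on the exclusion list are
    'fragmented', then added to the list, until no precursor above the
    intensity threshold remains."""
    excluded = set()
    injections = []
    while True:
        candidates = [
            mz for mz, inten in sorted(
                intensities.items(), key=lambda kv: -kv[1]
            )
            if mz not in excluded and inten >= threshold
        ]
        if not candidates:
            break
        selected = candidates[:top_n]
        injections.append(selected)
        excluded.update(selected)
    return injections

# Hypothetical precursor m/z values with intensities; the last one
# falls below the threshold and is never selected.
spectrum = {400.2: 9e5, 512.4: 7e5, 623.5: 5e5, 701.3: 2e5, 810.1: 5e3}
runs = iterative_exclusion(spectrum, top_n=2, threshold=1e4)
print(runs)  # [[400.2, 512.4], [623.5, 701.3]]
```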

  14. Architectural Benevolent Builders (ABB): A Global System Automating Integration of structured and Semistructured Sources

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    We investigate mechanisms that merge and automate the interoperability of heterogeneous traditional information systems together with the World Wide Web as one world. In particular, we introduce the ABB system, which employs the Benevolent Builders (BB)--assertions, integration rules, the ABB-network graph and agents--to activate the components' versatility and reconcile the semantics involved in data sharing, so as to keep pace with rapidly evolving computer technology in the present and future information age. ABB is a global application system whose operation spans local databases to the Internet. The first three BB are passive objects, whereas the agent has a strong versatility to perceive events, perform actions, communicate, make commitments, and satisfy claims. Owing to the BB's intelligence, ABB is also capable of filtering out and processing only the relevant operational sources, such as preferences (i.e., a customer's interests), from the sites. The ABB's richness in knowledge and flexibility to accommodate various data models allow it to link system to system or firm to firm regardless of the field, such as engineering, insurance, medicine, space science, and education, to mention a few.

  15. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    Science.gov (United States)

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Viksit, Kumar; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study. We compared the results of our method with those of time-of-flight techniques. Our method performs better than time-of-flight techniques. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods.
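The dynamic-programming tracking idea, finding a continuous maximal-energy path through a space-time displacement map so that isolated outliers and noise are ignored, can be sketched as follows (a simplified stand-in for the paper's method, on invented data).

```python
import numpy as np

def dp_track(energy):
    """Track the ridge of maximal energy through a (rows, cols) map,
    allowing row changes of at most +/-1 per column, via dynamic
    programming. Returns one row index per column."""
    rows, cols = energy.shape
    cost = energy.copy()
    back = np.zeros((rows, cols), dtype=int)
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(r - 1, 0), min(r + 2, rows)
            prev = int(np.argmax(cost[lo:hi, c - 1])) + lo
            back[r, c] = prev
            cost[r, c] += cost[prev, c - 1]
    # Backtrack from the best endpoint in the last column.
    path = [int(np.argmax(cost[:, -1]))]
    for c in range(cols - 1, 0, -1):
        path.append(int(back[path[-1], c]))
    return path[::-1]

# Synthetic displacement map: a diagonal wave-front ridge plus one
# isolated outlier that a naive per-column peak picker would follow.
E = np.zeros((4, 4))
for i in range(4):
    E[i, i] = 1.0       # the true ridge
E[0, 2] = 1.5           # outlier off the ridge
print(dp_track(E))      # [0, 1, 2, 3]: the ridge, not the outlier
```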

  16. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibration and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following in the NRU reactor inspection at Chalk River.

  17. Application of Distributed Control System in Automation of a 200 MW Power Generating Set

    Institute of Scientific and Technical Information of China (English)

    李维群; 陈勇; 赖建明; 王志兵; 孙德利

    2000-01-01

    This paper justifies the automation retrofit of a 200 MW power generating set, presents the retrofit's technical scheme and special technical features, describes the technical problems solved and the operational results with typical examples, and offers some exploratory suggestions for future automation retrofits.

  18. PRESAGE: Protecting Structured Address Generation against Soft Errors

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    2016-12-28

    Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently to soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false-positive rates, and low computational overhead. Unfortunately, efficient detectors for faults during address generation have not been widely researched (especially in the context of indexing large arrays). We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to place detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
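The key insight, making address computations propagate an incurred error so that a single detector at the loop exit can catch it, can be illustrated with a toy sketch (not the actual PRESAGE compiler pass; the fault injection and detector below are invented for illustration).

```python
def strided_sum(data, start, stride, count, flip_at=None):
    """Sum `count` strided elements. Each index is derived from the
    previous one, so a bit-flip in any address computation propagates
    to the final index, where one detector at the loop exit catches it."""
    idx = start
    total = 0
    for i in range(count):
        if i == flip_at:        # simulate a soft error: flip bit 2 of idx
            idx ^= 0b100
        total += data[idx % len(data)]
        idx += stride           # next address derived from the current one
    # Detector at the loop exit: the final index must match the
    # closed-form value; any propagated corruption breaks the match.
    expected_final = start + count * stride
    detected = (idx != expected_final)
    return total, detected

data = list(range(100))
_, detected = strided_sum(data, start=0, stride=3, count=10)
print(detected)          # False: no fault injected
_, detected = strided_sum(data, start=0, stride=3, count=10, flip_at=4)
print(detected)          # True: the corruption propagated to the exit
```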

  19. Method and system for automated on-chip material and structural certification of MEMS devices

    Energy Technology Data Exchange (ETDEWEB)

    Sinclair, Michael B.; DeBoer, Maarten P.; Smith, Norman F.; Jensen, Brian D.; Miller, Samuel L.

    2003-05-20

    A new approach toward MEMS quality control and materials characterization is provided by a combined test structure measurement and mechanical response modeling approach. Simple test structures are cofabricated with the MEMS devices being produced. These test structures are designed to isolate certain types of physical response, so that measurement of their behavior under applied stress can be easily interpreted as quality control and material properties information.

  20. Automated detection and labeling of high-density EEG electrodes from structural MR images

    Science.gov (United States)

    Marino, Marco; Liu, Quanying; Brem, Silvia; Wenderoth, Nicole; Mantini, Dante

    2016-10-01

    Objective. Accurate knowledge about the positions of electrodes in electroencephalography (EEG) is very important for precise source localizations. Direct detection of electrodes from magnetic resonance (MR) images is particularly interesting, as it is possible to avoid errors of co-registration between electrode and head coordinate systems. In this study, we propose an automated MR-based method for electrode detection and labeling, particularly tailored to high-density montages. Approach. Anatomical MR images were processed to create an electrode-enhanced image in individual space. Image processing included intensity non-uniformity correction, background noise and goggles artifact removal. Next, we defined a search volume around the head where electrode positions were detected. Electrodes were identified as local maxima in the search volume and registered to the Montreal Neurological Institute standard space using an affine transformation. This allowed the matching of the detected points with the specific EEG montage template, as well as their labeling. Matching and labeling were performed by the coherent point drift method. Our method was assessed on 8 MR images collected in subjects wearing a 256-channel EEG net, using the displacement with respect to manually selected electrodes as performance metric. Main results. Average displacement achieved by our method was significantly lower compared to alternative techniques, such as the photogrammetry technique. The maximum displacement was for more than 99% of the electrodes lower than 1 cm, which is typically considered an acceptable upper limit for errors in electrode positioning. Our method showed robustness and reliability, even in suboptimal conditions, such as in the case of net rotation, imprecisely gathered wires, electrode detachment from the head, and MR image ghosting. Significance. We showed that our method provides objective, repeatable and precise estimates of EEG electrode coordinates. 
We hope our work
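
    The electrode-detection step described above (local maxima within a search volume) can be sketched as follows; the threshold and the strict 26-neighborhood criterion are assumptions for illustration, and the real pipeline additionally corrects intensity non-uniformity and restricts the search to a shell around the head:

```python
import numpy as np

def detect_electrodes(volume, threshold):
    """Return voxels that exceed `threshold` and are the unique maximum
    of their 26-voxel neighborhood -- a minimal stand-in for detecting
    electrode artifacts in an electrode-enhanced MR volume."""
    peaks = []
    nx, ny, nz = volume.shape
    for x in range(1, nx - 1):
        for y in range(1, ny - 1):
            for z in range(1, nz - 1):
                v = volume[x, y, z]
                if v <= threshold:
                    continue
                nbhd = volume[x - 1:x + 2, y - 1:y + 2, z - 1:z + 2]
                # strict maximum: centre voxel is the single largest value
                if v >= nbhd.max() and (nbhd == v).sum() == 1:
                    peaks.append((x, y, z))
    return peaks
```

    The detected coordinates would then be registered to standard space and matched to the montage template (e.g., by coherent point drift) for labeling.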

  1. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    Science.gov (United States)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data that can be created by the system. This paper reports on the progress of the creation of cyberinfrastructure tools which hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental condition (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties, etc.). Sample data interrogation clients include those for the detection of faulty sensors and automated modal parameter extraction.
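
    A faulty-sensor interrogation client of the kind mentioned above might, in its simplest form, flag dead (flatlined) and abnormally noisy channels; the criteria and thresholds below are invented for illustration, not taken from the paper:

```python
import statistics

def faulty_sensors(records, flat_tol=1e-9, spike_factor=10.0):
    """Toy interrogation client: flag a channel 'dead' if its signal is
    essentially constant, and 'noisy' if its standard deviation exceeds
    `spike_factor` times the median std across channels.
    `records` maps channel name -> list of samples."""
    stds = {ch: statistics.pstdev(x) for ch, x in records.items()}
    median_std = statistics.median(stds.values())
    faults = {}
    for ch, s in stds.items():
        if s < flat_tol:
            faults[ch] = "dead"
        elif median_std > 0 and s > spike_factor * median_std:
            faults[ch] = "noisy"
    return faults
```

    In a deployment like the one described, such a client would pull channel data through the database API rather than from in-memory lists.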

  2. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation.
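
    The landmark-based morphing step can be sketched as a least-squares affine map fitted from source landmarks to target landmarks and then applied to every mesh node. This is a minimal sketch under the assumption of a single global affine transform; the paper's pipeline additionally refines the result by mapping nodes onto the bone boundary:

```python
import numpy as np

def landmark_affine(src_landmarks, tgt_landmarks):
    """Fit the 12-parameter affine map that carries source landmarks
    onto target landmarks in the least-squares sense."""
    src = np.asarray(src_landmarks, float)
    tgt = np.asarray(tgt_landmarks, float)
    X = np.hstack([src, np.ones((len(src), 1))])    # n x 4 homogeneous
    M, *_ = np.linalg.lstsq(X, tgt, rcond=None)     # 4 x 3 map
    return M

def morph_nodes(nodes, M):
    """Apply the fitted affine map to every node of the source mesh."""
    nodes = np.asarray(nodes, float)
    X = np.hstack([nodes, np.ones((len(nodes), 1))])
    return X @ M
```

    With at least four non-coplanar landmark pairs the fit is well posed; more landmarks give a least-squares compromise over the whole pelvis.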

  3. Photoactivation by visible light of CdTe quantum dots for inline generation of reactive oxygen species in an automated multipumping flow system

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, David S.M.; Frigerio, Christian; Santos, Joao L.M. [Requimte, Department of Chemical Sciences, Laboratory of Applied Chemistry, Faculty of Pharmacy, University of Porto, Rua de Jorge Viterbo Ferreira no. 228, 4050-313 Porto (Portugal); Prior, Joao A.V., E-mail: joaoavp@ff.up.pt [Requimte, Department of Chemical Sciences, Laboratory of Applied Chemistry, Faculty of Pharmacy, University of Porto, Rua de Jorge Viterbo Ferreira no. 228, 4050-313 Porto (Portugal)

    2012-07-20

    Highlights: ► CdTe quantum dots generate free radical species upon exposure to visible radiation. ► A high power visible LED lamp was used as photoirradiation element. ► The laboratory-made LED photocatalytic unit was implemented inline in a MPFS. ► Free radical species oxidize luminol producing a strong chemiluminescence emission. ► Epinephrine scavenges free radical species quenching chemiluminescence emission. - Abstract: Quantum dots (QD) are semiconductor nanocrystals able to generate free radical species upon exposure to an electromagnetic radiation, usually in the ultraviolet wavelength range. In this work, CdTe QD were used as highly reactive oxygen species (ROS) generators for the control of pharmaceutical formulations containing epinephrine. The developed approach was based on the chemiluminometric monitoring of the quenching effect of epinephrine on the oxidation of luminol by the produced ROS. Due to the relatively low energy band-gap of this chalcogenide, a high power visible light emitting diode (LED) lamp was used as photoirradiation element and assembled in a laboratory-made photocatalytic unit. Owing to the very short lifetime of ROS, and to ensure both reproducible generation and time-controlled reaction implementation and development, all reactional processes were implemented inline by using an automated multipumping micro-flow system. A linear working range for epinephrine concentration of up to 2.28 × 10⁻⁶ mol L⁻¹ (r = 0.9953; n = 5) was verified. The determination rate was about 79 determinations per hour and the detection limit was about 8.69 × 10⁻⁸ mol L⁻¹. The results obtained in the analysis of epinephrine pharmaceutical formulations by using the proposed methodology were in good agreement with those furnished by the reference procedure, with
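
    The linear working range and detection limit quoted above come from a standard calibration workflow, which can be sketched as an ordinary least-squares line plus the conventional 3-sigma detection-limit criterion (the paper does not state which criterion it used, so the 3-sigma rule here is an assumption):

```python
import statistics

def fit_calibration(conc, signal):
    """Ordinary least-squares fit of signal = slope*conc + intercept."""
    mx, my = statistics.fmean(conc), statistics.fmean(signal)
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    return slope, my - slope * mx

def detection_limit(blank_sd, slope):
    """3-sigma detection limit: smallest concentration whose signal
    exceeds the blank by three standard deviations."""
    return 3.0 * blank_sd / slope
```

    For a quenching assay like this one, the "signal" would be the chemiluminescence decrease relative to the blank.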

  4. Organotin speciation in environmental matrices by automated on-line hydride generation-programmed temperature vaporization-capillary gas chromatography-mass spectrometry detection.

    Science.gov (United States)

    Serra, H; Nogueira, J M F

    2005-11-11

    In the present contribution, a new automated on-line hydride generation methodology was developed for dibutyltin and tributyltin speciation at the trace level, using a programmable temperature-vaporizing inlet followed by capillary gas chromatography coupled to mass spectrometry in selected ion-monitoring mode acquisition (PTV-GC/MS(SIM)). The methodology involves a sequence defined by two running methods, the first configured for hydride generation with sodium tetrahydroborate as derivatising agent and the second configured for speciation purposes, using a conventional autosampler and data acquisition controlled by the instrument's software. From the method-development experiments it was established that the injector configuration has a great effect on the speciation, in particular the initial inlet temperature (-20 degrees C; He: 150 ml/min), the injection volume (2 microl) and the solvent characteristics in the solvent venting mode. Under optimized conditions, a remarkable instrumental performance was obtained, including very good precision (RSD) and accuracy verified against a certified reference material (CRM 462, Nr. 330; dibutyltin: 68 ± 12 ng/g; tributyltin: 54 ± 15 ng/g on dry mass basis), using liquid-liquid extraction (LLE) and solid-phase extraction (SPE) sample enrichment and multiple injections (2 x 5 microl) for sensitivity enhancement. The methodology evidenced high reproducibility, is easy to work up and sensitive, and proved to be a suitable alternative to the currently dedicated analytical systems for organotin speciation in environmental matrices at the trace level.

  5. Generation Of Manufacturing Routing And Operations Using Structured Knowledge As Basis To Application Of Computer Aided In Process Planning

    Science.gov (United States)

    Oswaldo, Luiz Agostinho

    2011-01-01

    The development of computer-aided resources for automating the generation of manufacturing routings and operations has mainly been accomplished by searching for similarities between existing ones, resulting in standard process routings grouped by analysis of similarities between parts or routings. This article proposes a methodology for developing manufacturing routings and detailing operations whose steps define the initial, intermediate and final operations, starting from the rough piece and proceeding to the final specifications, which must have a one-to-one relationship with the part design specifications. Each step uses so-called rules of precedence to link and chain the routing operations. The rules of precedence order and prioritize knowledge of the various manufacturing processes, taking into account the theories of machining, forging, assembly and heat treatment, as well as the theories of tolerance accumulation and process capability, among others. The availability of manufacturing databases covering process tolerances, deviations of the machine tool, cutting tool, fixturing devices and workpiece, and process capabilities is also reinforced. Stating and applying rules of precedence, which link and join manufacturing concepts in a logical and structured way within the methodology steps, makes it viable to use structured knowledge, instead of the tacit knowledge currently available in manufacturing engineering departments, in the generation of manufacturing routings and operations. Consequently, the development of computer-aided process planning will be facilitated by the structured knowledge applied with this methodology.
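
    Chaining operations under rules of precedence is, computationally, a topological ordering problem. A minimal sketch using Kahn's algorithm is shown below; the operation names are invented for illustration and the real methodology encodes far richer process knowledge than bare ordering constraints:

```python
from collections import deque

def order_operations(operations, precedence_rules):
    """Order routing operations so that every rule of precedence
    (a, b) -- read 'a must precede b' -- is respected (Kahn's algorithm).
    Raises ValueError if the rules are contradictory (contain a cycle)."""
    succ = {op: [] for op in operations}
    indeg = {op: 0 for op in operations}
    for a, b in precedence_rules:
        succ[a].append(b)
        indeg[b] += 1
    ready = deque(op for op in operations if indeg[op] == 0)
    routing = []
    while ready:
        op = ready.popleft()
        routing.append(op)
        for nxt in succ[op]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                ready.append(nxt)
    if len(routing) != len(operations):
        raise ValueError("contradictory precedence rules (cycle)")
    return routing
```

    A cycle check is essential here: contradictory precedence rules would otherwise silently drop operations from the routing.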

  6. Definitive Metabolite Identification Coupled with Automated Ligand Identification System (ALIS) Technology: A Novel Approach to Uncover Structure-Activity Relationships and Guide Drug Design in a Factor IXa Inhibitor Program.

    Science.gov (United States)

    Zhang, Ting; Liu, Yong; Yang, Xianshu; Martin, Gary E; Yao, Huifang; Shang, Jackie; Bugianesi, Randal M; Ellsworth, Kenneth P; Sonatore, Lisa M; Nizner, Peter; Sherer, Edward C; Hill, Susan E; Knemeyer, Ian W; Geissler, Wayne M; Dandliker, Peter J; Helmy, Roy; Wood, Harold B

    2016-03-10

    A potent and selective Factor IXa (FIXa) inhibitor was subjected to a series of liver microsomal incubations, which generated a number of metabolites. Using automated ligand identification system-affinity selection (ALIS-AS) methodology, metabolites in the incubation mixture were prioritized by their binding affinities to the FIXa protein. Microgram quantities of the metabolites of interest were then isolated through microisolation analytical capabilities, and structurally characterized using MicroCryoProbe heteronuclear 2D NMR techniques. The isolated metabolites recovered from the NMR experiments were then submitted directly to an in vitro FIXa enzymatic assay. The order of the metabolites' binding affinity to the Factor IXa protein from the ALIS assay was completely consistent with the enzymatic assay results. This work showcases an innovative and efficient approach to uncover structure-activity relationships (SARs) and guide drug design via microisolation-structural characterization and ALIS capabilities.

  7. Expert system for elucidation of structures of organic compounds——Structural generator of ESESOC-II

    Institute of Scientific and Technical Information of China (English)

    胡昌玉; 许禄

    1995-01-01

    An expert system for the elucidation of the structures of organic compounds, ESESOC-II, has been designed. It is composed of three parts: spectroscopic data analysis, the structure generator, and evaluation of the candidate structures. The heart of ESESOC is the structure generator, which accepts specific types of information (e.g. molecular formula and substructure constraints) and produces an exhaustive and irredundant list of candidate structures. The scheme for structure generation is given, in which a depth-first search strategy is used to fill the bonding adjacency matrix (BAM) and a new method is introduced to remove duplicates.
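
    The two essential properties of such a generator, exhaustiveness and irredundancy, can be demonstrated in miniature. The sketch below exhaustively enumerates single-bonded carbon skeletons for a formula (hydrogens implicit) and removes duplicates via a canonical form; it is a brute-force analogue for illustration only, not the BAM depth-first search used by ESESOC-II:

```python
from itertools import combinations, permutations

def alkane_skeletons(n_carbon, n_hydrogen):
    """Enumerate non-isomorphic single-bonded carbon skeletons matching
    the formula C{n_carbon}H{n_hydrogen} (valence 4, hydrogens implicit).
    Duplicates are removed by keeping only the lexicographically
    smallest relabelling of each bond set (canonical form)."""
    n_bonds, rem = divmod(4 * n_carbon - n_hydrogen, 2)
    if rem or n_bonds < n_carbon - 1:
        return []                       # no valid skeleton
    pairs = list(combinations(range(n_carbon), 2))
    seen = set()
    for bonds in combinations(pairs, n_bonds):
        deg = [0] * n_carbon
        for a, b in bonds:
            deg[a] += 1
            deg[b] += 1
        if max(deg) > 4:                # carbon valence exceeded
            continue
        # connectivity check via union-find
        parent = list(range(n_carbon))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        for a, b in bonds:
            parent[find(a)] = find(b)
        if len({find(i) for i in range(n_carbon)}) > 1:
            continue
        canon = min(
            tuple(sorted(tuple(sorted((p[a], p[b]))) for a, b in bonds))
            for p in permutations(range(n_carbon))
        )
        seen.add(canon)
    return sorted(seen)
```

    For C4H10 this yields the two butane isomers, and for C5H12 the three pentane isomers; real generators replace the brute-force canonicalization with far more efficient schemes.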

  8. Automated docking to multiple target structures: incorporation of protein mobility and structural water heterogeneity in AutoDock.

    Science.gov (United States)

    Osterberg, Fredrik; Morris, Garrett M; Sanner, Michel F; Olson, Arthur J; Goodsell, David S

    2002-01-01

    Protein motion and heterogeneity of structural waters are approximated in ligand-docking simulations, using an ensemble of protein structures. Four methods of combining multiple target structures within a single grid-based lookup table of interaction energies are tested. The method is evaluated using complexes of 21 peptidomimetic inhibitors with human immunodeficiency virus type 1 (HIV-1) protease. Several of these structures show motion of an arginine residue, which is essential for binding of large inhibitors. A structural water is also present in 20 of the structures, but it must be absent in the remaining one for proper binding. Mean and minimum methods perform poorly, but two weighted average methods permit consistent and accurate ligand docking, using a single grid representation of the target protein structures. Copyright 2001 Wiley-Liss, Inc.
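
    The grid-combination schemes compared above can be sketched as follows. The 'mean' and 'min' methods mirror the poorly performing schemes in the abstract; the 'weighted' method shown is a Boltzmann-style energy-weighted average, which is one plausible form of the paper's weighted-average schemes (the exact weighting used there may differ), with kT in kcal/mol at room temperature:

```python
import numpy as np

def combine_grids(grids, method="weighted", kT=0.592):
    """Combine per-structure interaction-energy grids into a single
    lookup table over the same grid points."""
    G = np.stack([np.asarray(g, float) for g in grids])
    if method == "mean":
        return G.mean(axis=0)
    if method == "min":
        return G.min(axis=0)
    # energy-weighted average: favorable (low) energies dominate
    w = np.exp(-G / kT)
    return (G * w).sum(axis=0) / w.sum(axis=0)
```

    The weighted average behaves like a soft minimum: where one conformation offers a much more favorable energy, it dominates the combined grid, while near-degenerate energies are averaged.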

  9. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard

    2011-03-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both N-labeled and C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.

  10. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...... of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  11. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system......’s properties. Measured and controlled quantities in the system are related to variables through functional relations, which need only be stated as names; their explicit composition need not be described to the tool. The user enters a list of these relations that together describe the entirety of the system....... The list of such variables and functional relations constitutes the system’s structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes...
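
    A classical computation on such a structure graph is a maximum matching between unknown variables and functional relations: relations left over once every unknown is matched are redundant and can serve as residual generators for fault diagnosis. The sketch below is a schematic reading of that idea (SaTool's actual algorithms are more elaborate), using depth-first augmenting paths:

```python
def maximum_matching(relations):
    """Match unknown variables to distinct functional relations.
    `relations` maps a relation name to the list of unknown variables
    it involves. Returns (matched relations, redundant relations)."""
    var_to_rel = {}

    def try_assign(rel, visited):
        for v in relations[rel]:
            if v in visited:
                continue
            visited.add(v)
            # take a free variable, or displace its current relation
            if v not in var_to_rel or try_assign(var_to_rel[v], visited):
                var_to_rel[v] = rel
                return True
        return False

    matched = sum(try_assign(r, set()) for r in relations)
    return matched, len(relations) - matched
```

    With two unknowns covered by four relations, two relations are redundant, indicating analytical redundancy that can be exploited for fault detection.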

  12. CASD-NMR: critical assessment of automated structure determination by NMR

    NARCIS (Netherlands)

    Rosato, A.; van der Schot, G.; Bonvin, A.M.J.J.

    2009-01-01

    NMR spectroscopy is currently the only technique for determining the solution structure of biological macromolecules. This typically requires both the assignment of resonances and a labor-intensive analysis of multidimensional nuclear Overhauser effect spectroscopy (NOESY) spectra, in which peaks

  13. 2nd Generation RLV Airframe Structures and Materials

    Science.gov (United States)

    Johnson, Theodore F.

    2000-01-01

    The goals and objectives of the project summarized in this viewgraph presentation are the following: (1) Develop and demonstrate verified airframe and cryotank structural design and analysis technologies, including damage tolerance, safety, reliability, and residual strength technologies, robust nonlinear shell and cryotank analysis technologies, high-fidelity analysis and design technologies for local structural detail features and joints, and high-fidelity analysis technologies for sandwich structures; (2) Demonstrate low cost, robust materials and processing, including polymeric matrix composite (PMC) and metallic materials and processing, and refractory composite and metallic hot structures materials and processing; (3) Develop and demonstrate robust airframe structures and validated integrated airframe structural concepts, including low cost fabrication and joining, operations efficient designs and inspection techniques (non-destructive evaluation), scale-up and integrated thermal structure tests, and airframe structures IVHM; (4) Demonstrate low cost, robust repair techniques; and (5) Develop verified integrated airframe structural concepts, including integrated structural concepts.

  14. Measurements and automated mechanism generation modeling of OH production in photolytically initiated oxidation of the neopentyl radical.

    Science.gov (United States)

    Petway, Sarah V; Ismail, Huzeifa; Green, William H; Estupiñan, Edgar G; Jusinski, Leonard E; Taatjes, Craig A

    2007-05-17

    Production of OH in the reaction of the neopentyl radical with O2 has been measured by a laser photolysis/cw absorption method for various pressures and oxygen concentrations at 673, 700, and 725 K. The MIT Reaction Mechanism Generator (RMG) was used to automatically generate a model for this system, and the predicted OH concentration profiles are compared to present and literature experimental results. Several reactions significantly affect the OH profile. The experimental data provide useful constraints on the rate coefficient for the formally direct chemical activation reaction of neopentyl radical with O2 to form OH: (CH3)3CCH2 + O2 → OH + 3,3-dimethyloxetane (Rxn 1). At 673 K and 60 Torr, log k1 (cm³ molecule⁻¹ s⁻¹) = −13.7 ± 0.5. Absolute absorbance measurements on OH and I indicate that the branching ratio for R + O2 to OH is about 0.03 under these conditions. The data suggest that the ab initio neopentyl + O2 potential energy surface of Sun and Bozzelli is accurate to within 2 kcal mol⁻¹.

  15. Automated generation of IMRT treatment plans for prostate cancer patients with metal hip prostheses: comparison of different planning strategies.

    Science.gov (United States)

    Voet, Peter W J; Dirkx, Maarten L P; Breedveld, Sebastiaan; Heijmen, Ben J M

    2013-07-01

    To compare IMRT planning strategies for prostate cancer patients with metal hip prostheses. All plans were generated fully automatically (i.e., no human trial-and-error interactions) using iCycle, the authors' in-house developed algorithm for multicriterial selection of beam angles and optimization of fluence profiles, allowing objective comparison of planning strategies. For 18 prostate cancer patients (eight with bilateral hip prostheses, ten with a right-sided unilateral prosthesis), two planning strategies were evaluated: (i) full exclusion of beams containing beamlets that would deliver dose to the target after passing a prosthesis (IMRTremove) and (ii) exclusion of those beamlets only (IMRTcut). Plans with optimized coplanar and noncoplanar beam arrangements were generated. Differences in PTV coverage and sparing of organs at risk (OARs) were quantified. The impact of beam number on plan quality was evaluated. Especially for patients with bilateral hip prostheses, IMRTcut significantly improved rectum and bladder sparing compared to IMRTremove. For 9-beam coplanar plans, rectum V60Gy reduced by 17.5% ± 15.0% (maximum 37.4%, p = 0.036) and rectum Dmean by 9.4% ± 7.8% (maximum 19.8%, p = 0.036). Further improvements in OAR sparing were achievable by using noncoplanar beam setups, reducing rectum V60Gy by another 4.6% ± 4.9% (p = 0.012) for noncoplanar 9-beam IMRTcut plans. Large reductions in rectum dose delivery were also observed when increasing the number of beam directions in the plans. For bilateral implants, the rectum V60Gy was 37.3% ± 12.1% for coplanar 7-beam plans and reduced on average by 13.5% (maximum 30.1%, p = 0.012) for 15 directions. iCycle was able to automatically generate high quality plans for prostate cancer patients with prostheses. Excluding only beamlets that passed through the prostheses (IMRTcut strategy) significantly improved OAR sparing. Noncoplanar beam arrangements and, to a larger extent, increasing the number of
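
    The organ-at-risk metrics quoted above, V60Gy and Dmean, come from the dose-volume histogram. A minimal sketch of their computation from per-voxel doses is shown below; clinical DVHs are of course computed on the planning system's full dose grid:

```python
def dvh_metrics(dose_gy):
    """Compute V60Gy (percent of structure voxels receiving >= 60 Gy)
    and Dmean (mean dose) from a flat list of per-voxel doses in Gy."""
    n = len(dose_gy)
    v60 = 100.0 * sum(1 for d in dose_gy if d >= 60.0) / n
    dmean = sum(dose_gy) / n
    return v60, dmean
```

    Comparing these scalars between competing plans is exactly how the rectum-sparing differences in the abstract are quantified.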

  16. Automated generation of IMRT treatment plans for prostate cancer patients with metal hip prostheses: Comparison of different planning strategies

    Energy Technology Data Exchange (ETDEWEB)

    Voet, Peter W. J.; Dirkx, Maarten L. P.; Breedveld, Sebastiaan; Heijmen, Ben J. M. [Erasmus MC - Daniel den Hoed Cancer Center, Department of Radiation Oncology, Groene Hilledijk 301, 3075EA Rotterdam (Netherlands)

    2013-07-15

    Purpose: To compare IMRT planning strategies for prostate cancer patients with metal hip prostheses. Methods: All plans were generated fully automatically (i.e., no human trial-and-error interactions) using iCycle, the authors' in-house developed algorithm for multicriterial selection of beam angles and optimization of fluence profiles, allowing objective comparison of planning strategies. For 18 prostate cancer patients (eight with bilateral hip prostheses, ten with a right-sided unilateral prosthesis), two planning strategies were evaluated: (i) full exclusion of beams containing beamlets that would deliver dose to the target after passing a prosthesis (IMRTremove) and (ii) exclusion of those beamlets only (IMRTcut). Plans with optimized coplanar and noncoplanar beam arrangements were generated. Differences in PTV coverage and sparing of organs at risk (OARs) were quantified. The impact of beam number on plan quality was evaluated. Results: Especially for patients with bilateral hip prostheses, IMRTcut significantly improved rectum and bladder sparing compared to IMRTremove. For 9-beam coplanar plans, rectum V60Gy reduced by 17.5% ± 15.0% (maximum 37.4%, p = 0.036) and rectum Dmean by 9.4% ± 7.8% (maximum 19.8%, p = 0.036). Further improvements in OAR sparing were achievable by using noncoplanar beam setups, reducing rectum V60Gy by another 4.6% ± 4.9% (p = 0.012) for noncoplanar 9-beam IMRTcut plans. Large reductions in rectum dose delivery were also observed when increasing the number of beam directions in the plans. For bilateral implants, the rectum V60Gy was 37.3% ± 12.1% for coplanar 7-beam plans and reduced on average by 13.5% (maximum 30.1%, p = 0.012) for 15 directions. Conclusions: iCycle was able to automatically generate high quality plans for prostate cancer patients with prostheses. Excluding only beamlets that passed through the prostheses (IMRTcut strategy) significantly improved

  17. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.;

    2009-01-01

    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments...... It incorporates a model-driven approach to the experimental design that minimises the number of experiments to be performed, while still generating accurate values of kinetic parameters. The approach has been illustrated with the transketolase-mediated asymmetric synthesis of L...... In comparison with conventional methodology, the modelling approach enabled a nearly 4-fold decrease in the number of experiments, while the microwell experimentation enabled a 45-fold decrease in material requirements and a significant increase in experimental throughput. The approach......
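
    At its core, kinetic model generation means estimating parameters that minimise the misfit between a rate law and the measured microwell rates. The sketch below does this for single-substrate Michaelis-Menten kinetics via a simple grid search; this is purely illustrative, as the paper's kinetic model for the transketolase system is more complex, and real workflows use gradient-based nonlinear regression rather than grids:

```python
def fit_michaelis_menten(s_conc, rates, vmax_grid, km_grid):
    """Pick the (Vmax, Km) pair on a grid minimising the squared error
    of v = Vmax*S/(Km + S) against measured rates."""
    def sse(vmax, km):
        return sum((vmax * s / (km + s) - v) ** 2
                   for s, v in zip(s_conc, rates))
    return min(((vm, km) for vm in vmax_grid for km in km_grid),
               key=lambda p: sse(*p))
```

    Model-driven experimental design then chooses the next substrate concentrations so that they maximally constrain the parameters that are still uncertain.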

  18. Greater Buyer Effectiveness through Automation

    Science.gov (United States)

    1989-01-01

    FOB = free on board; FPAC = Federal Procurement Automation Council; FPDS = Federal Procurement Data System; 4GL = fourth generation language; GAO = General... Procurement Automation Council (FPAC), entitled Compendium of Automated Procurement Systems in Federal Agencies. The FPAC inventory attempted to identify... In some cases we have updated descriptions of systems identified by the FPAC study, but many of the newer systems are identified here for the first

  19. Automated test bench for simulation of radiation electrification of spacecraft structural dielectrics

    Science.gov (United States)

    Vladimirov, A. M.; Bezhayev, A. Yu; Zykov, V. M.; Isaychenko, V. I.; Lukashchuk, A. A.; Lukonin, S. E.

    2017-01-01

    The paper describes the test bench “Prognoz-2”, designed at the Testing Center, Institute of Non-Destructive Testing, Tomsk Polytechnic University, which can be used for ground testing of individual samples of spacecraft structural materials (e.g. thermal control coatings, cover glasses for solar batteries, or ceramics of the plasma thruster discharge channel) and of whole spacecraft units or instruments (e.g. instruments of solar and stellar orientation, or correcting plasma thrusters) exposed to radiation electrification factors, and to verify mathematical models of the radiation electrification of structural dielectrics under the impact of space factors in different orbits.

  20. Surface structure enhanced second harmonic generation in organic nanofibers

    DEFF Research Database (Denmark)

    Fiutowski, Jacek; Maibohm, Christian; Kostiucenko, Oksana;

    Second-harmonic generation upon femtosecond laser irradiation of nonlinearly optically active nanofibers grown from nonsymmetrically functionalized para-quarterphenylene (CNHP4) molecules is investigated. Following growth on mica templates, the nanofibers have been transferred onto lithography-defined regular arrays of gold square nanostructures. These nanostructure arrays induce local field enhancement, which significantly lowers the threshold for second harmonic generation in the nanofibers.

  1. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis. The use of the Matlab(R)-based implementation is presented, and special features are introduced that were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  2. Blind testing of routine, fully automated determination of protein structures from NMR data.

    NARCIS (Netherlands)

    Rosato, A.; Aramini, J.M.; Arrowsmith, C.; Bagaria, A.; Baker, D.; Cavalli, A.; Doreleijers, J.; Eletsky, A.; Giachetti, A.; Guerry, P.; Gutmanas, A.; Guntert, P.; He, Y.; Herrmann, T.; Huang, Y.J.; Jaravine, V.; Jonker, H.R.; Kennedy, M.A.; Lange, O.F.; Liu, G.; Malliavin, T.E.; Mani, R.; Mao, B.; Montelione, G.T.; Nilges, M.; Rossi, P.; Schot, G. van der; Schwalbe, H.; Szyperski, T.A.; Vendruscolo, M.; Vernon, R.; Vranken, W.F.; Vries, S.D. de; Vuister, G.W.; Wu, B.; Yang, Y.; Bonvin, A.M.

    2012-01-01

    The protocols currently used for protein structure determination by nuclear magnetic resonance (NMR) depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by us

  3. Using Structure-Based Organic Chemistry Online Tutorials with Automated Correction for Student Practice and Review

    Science.gov (United States)

    O'Sullivan, Timothy P.; Hargaden, Gra´inne C.

    2014-01-01

    This article describes the development and implementation of an open-access organic chemistry question bank for online tutorials and assessments at University College Cork and Dublin Institute of Technology. SOCOT (structure-based organic chemistry online tutorials) may be used to supplement traditional small-group tutorials, thereby allowing…

  5. Automated Clustering Analysis of Immunoglobulin Sequences in Chronic Lymphocytic Leukemia Based on 3D Structural Descriptors

    DEFF Research Database (Denmark)

    Marcatili, Paolo; Mochament, Konstantinos; Agathangelidis, Andreas

    2016-01-01

    Imunoglobulins (Igs) are crucial for the defense against pathogens, but they are also important in many clinical and biotechnological applications. Their characteristics, and ultimately their function, depend on their three-dimensional (3D) structure; however, the procedures to experimentally det...

  6. Low-Cost Impact Detection and Location for Automated Inspections of 3D Metallic Based Structures

    Directory of Open Access Journals (Sweden)

    Carlos Morón

    2015-05-01

    This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material will allow certain details of the impact to be automatically determined by measuring the time delays of acoustic wave propagation throughout the 3D structure. The location of strategic piezoelectric sensors on the structure and an electronic-computerized system has allowed us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection and the task of inspecting the point or zone at which this impact occurs. What is more, the proposed method can be easily integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach.

  7. Low-Cost Impact Detection and Location for Automated Inspections of 3D Metallic Based Structures

    Science.gov (United States)

    Morón, Carlos; Portilla, Marina P.; Somolinos, José A.; Morales, Rafael

    2015-01-01

    This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material will allow certain details of the impact to be automatically determined by measuring the time delays of acoustic wave propagation throughout the 3D structure. The location of strategic piezoelectric sensors on the structure and an electronic-computerized system has allowed us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection and the task of inspecting the point or zone at which this impact occurs. What is more, the proposed method can be easily integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach. PMID:26029951
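The localization principle described in this record, namely that relative acoustic arrival times at fixed sensors on a homogeneous material constrain the impact point, can be sketched numerically. The sensor layout, wave speed and brute-force grid search below are illustrative assumptions, not the authors' instrumentation:

```python
# Hypothetical sketch of impact localization from acoustic arrival-time delays,
# assuming a homogeneous material with a known, constant wave speed.
import itertools
import math

WAVE_SPEED = 5000.0  # m/s, illustrative value for a metallic structure
SENSORS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # sensor positions (m)

def arrival_delays(impact, sensors=SENSORS, v=WAVE_SPEED):
    """Arrival time of each sensor relative to the first-hit sensor."""
    times = [math.dist(impact, s) / v for s in sensors]
    t0 = min(times)
    return [t - t0 for t in times]

def locate(delays, step=0.01):
    """Grid search for the point whose predicted delays best match the data."""
    best, best_err = None, float("inf")
    n = int(1.0 / step) + 1
    for i, j in itertools.product(range(n), repeat=2):
        p = (i * step, j * step)
        err = sum((a - b) ** 2 for a, b in zip(arrival_delays(p), delays))
        if err < best_err:
            best, best_err = p, err
    return best

true_impact = (0.37, 0.62)
est = locate(arrival_delays(true_impact))  # recovers the impact point
```

In practice a closed-form or least-squares multilateration would replace the grid search, but the sketch shows why time differences alone suffice once the wave speed is known.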

  8. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines

    NARCIS (Netherlands)

    Leung, W.Y.; Marschall, T.; Paudel, Y.; Falquet, L.; Mei, H.; Schönhuth, A.; Maoz, T.Y.

    2015-01-01

    Background Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans also for othe

  9. Integrated Model of Automated Training Process Lifecycle Management through Structurization of the Content of a Higher School and Organizations

    Directory of Open Access Journals (Sweden)

    Gennady G. Kulikov

    2015-01-01

    This article discusses, from a modern point of view, the issue of developing methods of forming the structure of lifecycle management for the process of specialist training carried out jointly by a university and an industrial enterprise on the basis of a comprehensive departmental content base. It also considers the possibility of using IT to improve the efficiency of educational processes.

  10. Semi-automated structural analysis of high resolution magnetic and gamma-ray spectrometry airborne surveys

    Science.gov (United States)

    Debeglia, N.; Martelet, G.; Perrin, J.; Truffert, C.; Ledru, P.; Tourlière, B.

    2005-08-01

    A user-controlled procedure was implemented for the structural analysis of geophysical maps. Local edge segments are first extracted using a suitable edge detector function, then linked into straight discontinuities and, finally, organised in complex boundary lines best delineating geophysical features. Final boundary lines may be attributed by a geologist to lithological contacts and/or structural geological features. Tests of some edge detectors, (i) horizontal gradient magnitude (HGM), (ii) various orders of the analytic signal ( An), reduced to the pole or not, (iii) enhanced horizontal derivative (EHD), (iv) composite analytic signal (CAS), were performed on synthetic magnetic data (with and without noise). As a result of these comparisons, the horizontal gradient appears to remain the best operator for the analysis of magnetic data. Computation of gradients in the frequency domain, including filtering and upward continuation of noisy data, is well-suited to the extraction of magnetic gradients associated with deep sources, while space-domain smoothing and differentiation techniques are generally preferable in the case of shallow magnetic sources, or for gamma-ray spectrometry analysis. Algorithms for edge extraction, segment linking, and line following can be controlled by choosing an adequate edge detector and processing parameters, which allows adaptation to a desired scale of interpretation. Tests on synthetic and real case data demonstrate the adaptability of the procedure and its ability to produce a basic layer for multi-data analysis. The method was applied to the interpretation of high-resolution airborne magnetic and gamma-ray spectrometry data collected in northern Namibia. It allowed the delineation of dyke networks concealed by superficial weathering and demonstrated the presence of lithological variations in alluvial flows. The outputs from the structural analysis procedure are compatible with standard GIS software and enable the geologist to (i) compare
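As a minimal illustration of the horizontal gradient magnitude (HGM) operator that the comparison above favours for magnetic data, the sketch below outlines the edges of a synthetic block anomaly; the grid and anomaly values are invented for the example:

```python
# Sketch of the HGM edge detector on a gridded potential-field map.
import numpy as np

def hgm(grid, dx=1.0, dy=1.0):
    """Horizontal gradient magnitude sqrt((dF/dx)^2 + (dF/dy)^2)."""
    gy, gx = np.gradient(grid, dy, dx)  # derivatives along rows, then columns
    return np.hypot(gx, gy)

# Synthetic "magnetic" map: a block anomaly whose edges the HGM should outline.
field = np.zeros((64, 64))
field[20:44, 20:44] = 100.0
edges = hgm(field)
# HGM peaks along the block boundary and vanishes inside and outside it.
```

Edge segments would then be extracted by thresholding and thinning the `edges` array, the step the procedure above follows with segment linking and line following.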

  11. Automated generation of a World Wide Web-based data entry and check program for medical applications.

    Science.gov (United States)

    Kiuchi, T; Kaihara, S

    1997-02-01

    The World Wide Web-based form is a promising method for the construction of an on-line data collection system for clinical and epidemiological research. It is, however, laborious to prepare a common gateway interface (CGI) program for each project, which the World Wide Web server needs to handle the submitted data. In medicine, it is even more laborious because the CGI program must check deficits, type, ranges, and logical errors (bad combination of data) of entered data for quality assurance as well as data length and meta-characters of the entered data to enhance the security of the server. We have extended the specification of the hypertext markup language (HTML) form to accommodate information necessary for such data checking and we have developed software named AUTOFORM for this purpose. The software automatically analyzes the extended HTML form and generates the corresponding ordinary HTML form, 'Makefile', and C source of CGI programs. The resultant CGI program checks the entered data through the HTML form, records them in a computer, and returns them to the end-user. AUTOFORM drastically reduces the burden of development of the World Wide Web-based data entry system and allows the CGI programs to be more securely and reliably prepared than had they been written from scratch.
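A toy analogue of this generate-the-checker idea (not the AUTOFORM software itself): derive a validation routine from a declarative field specification, so that type, range and deficit checks never have to be written by hand. The field names and rules are hypothetical:

```python
# Illustrative generator of a data-check routine from a declarative field spec,
# in the spirit of deriving CGI validation code from an extended HTML form.
FIELDS = {
    "age":    {"type": int,   "min": 0,   "max": 120, "required": True},
    "weight": {"type": float, "min": 0.5, "max": 500, "required": False},
}

def make_validator(spec):
    """Build a function that checks submitted form data against the spec."""
    def validate(data):
        errors = []
        for name, rule in spec.items():
            raw = data.get(name)
            if raw in (None, ""):
                if rule.get("required"):
                    errors.append(f"{name}: missing required value")
                continue
            try:
                value = rule["type"](raw)       # type check / conversion
            except ValueError:
                errors.append(f"{name}: expected {rule['type'].__name__}")
                continue
            if not rule["min"] <= value <= rule["max"]:
                errors.append(f"{name}: out of range")
        return errors
    return validate

check = make_validator(FIELDS)
```

A real system would also emit the matching HTML form and guard against over-long inputs and meta-characters, as the abstract notes.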

  12. ESBL Detection: Comparison of a Commercially Available Chromogenic Test for Third Generation Cephalosporine Resistance and Automated Susceptibility Testing in Enterobacteriaceae.

    Science.gov (United States)

    El-Jade, Mohamed Ramadan; Parcina, Marijo; Schmithausen, Ricarda Maria; Stein, Christoph; Meilaender, Alina; Hoerauf, Achim; Molitor, Ernst; Bekeredjian-Ding, Isabelle

    2016-01-01

    Rapid detection and reporting of third generation cephalosporine resistance (3GC-R) and of extended spectrum betalactamases in Enterobacteriaceae (ESBL-E) is a diagnostic and therapeutic priority to avoid inefficacy of the initial antibiotic regimen. In this study we evaluated a commercially available chromogenic screen for 3GC-R as a predictive and/or confirmatory test for ESBL and AmpC activity in clinical and veterinary Enterobacteriaceae isolates. The test was highly reliable in the prediction of cefotaxime and cefpodoxime resistance, but there was no correlation with ceftazidime and piperacillin/tazobactam minimal inhibitory concentrations. All human and porcine ESBL-E tested were detected, with the exception of one genetically positive but phenotypically negative isolate. By contrast, AmpC detection rates lay below 30%. Notably, exclusion of piperacillin/tazobactam-resistant, 3GC-susceptible K1+ Klebsiella isolates increased the sensitivity and specificity of the test for ESBL detection. Our data further imply that in regions with a low prevalence of AmpC- and K1-positive E. coli strains, chromogenic testing for 3GC-R can substitute for more time-consuming ESBL confirmatory testing in E. coli isolates tested positive by the Phoenix or VITEK2 ESBL screen. We therefore suggest a diagnostic algorithm that distinguishes 3GC-R screening from primary culture and species-dependent confirmatory ESBL testing by βLACTA™ and discuss the implications of MIC distribution results on the choice of antibiotic regimen.

  14. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs... The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF).

  15. Computer generation of structural models of amorphous Si and Ge

    Science.gov (United States)

    Wooten, F.; Winer, K.; Weaire, D.

    1985-04-01

    We have developed and applied a computer algorithm that generates realistic random-network models of a-Si with periodic boundary conditions. These are the first models to have correlation functions that show no serious discrepancy with experiment. The algorithm provides a much-needed systematic approach to model construction that can be used to generate models of a large class of amorphous materials.

  16. Automating unambiguous NOE data usage in NVR for NMR protein structure-based assignments.

    Science.gov (United States)

    Akhmedov, Murodzhon; Çatay, Bülent; Apaydın, Mehmet Serkan

    2015-12-01

    Nuclear Magnetic Resonance (NMR) Spectroscopy is an important technique that allows determining protein structure in solution. An important problem in protein structure determination using NMR spectroscopy is the mapping of peaks to corresponding amino acids, also known as the assignment problem. Structure-Based Assignment (SBA) is an approach to solve this problem using a template structure that is homologous to the target. Our previously developed approach Nuclear Vector Replacement-Binary Integer Programming (NVR-BIP) computed the optimal solution for small proteins, but was unable to solve the assignments of large proteins. NVR-Ant Colony Optimization (ACO) extended the applicability of the NVR approach for such proteins. One of the input data utilized in these approaches is the Nuclear Overhauser Effect (NOE) data. NOE is an interaction observed between two protons if the protons are located close in space. These protons could be amide protons, protons attached to the alpha-carbon atom in the backbone of the protein, or side chain protons. NVR only uses backbone protons. In this paper, we reformulate the NVR-BIP model to distinguish the type of proton in NOE data and use the corresponding proton coordinates in the extended formulation. In addition, the threshold value over interproton distances is set in a standard manner for all proteins by extracting the NOE upper bound distance information from the data. We also convert NOE intensities into distance thresholds. Our new approach thus handles the NOE data correctly and without manually determined parameters. We accordingly adapt NVR-ACO solution methodology to these changes. Computational results show that our approaches obtain optimal solutions for small proteins. For the large proteins our ant colony optimization-based approach obtains promising results.
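Converting NOE intensities into distance thresholds, as the reformulated model above does, is commonly based on the isolated spin-pair approximation. The sketch below uses that generic calibration with illustrative reference values; it is not necessarily the paper's exact scheme:

```python
# Generic NOE intensity-to-distance calibration (isolated spin-pair
# approximation): the NOE intensity scales roughly as r^-6, so a peak's
# distance follows from a reference peak of known distance.
def noe_distance(intensity, ref_intensity=1.0, ref_distance=2.5):
    """r = r_ref * (I_ref / I)^(1/6): weaker NOEs imply larger distances."""
    return ref_distance * (ref_intensity / intensity) ** (1.0 / 6.0)

# A peak 5x weaker than the reference maps to a modestly larger upper bound,
# reflecting the steep r^-6 dependence.
threshold = noe_distance(0.2)
```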

  17. A machine learning approach to automated structural network analysis: application to neonatal encephalopathy.

    Directory of Open Access Journals (Sweden)

    Etay Ziv

    Neonatal encephalopathy represents a heterogeneous group of conditions associated with life-long developmental disabilities and neurological deficits. Clinical measures and current anatomic brain imaging remain inadequate predictors of outcome in children with neonatal encephalopathy. Some studies have suggested that brain development and, therefore, brain connectivity may be altered in the subgroup of patients who subsequently go on to develop clinically significant neurological abnormalities. Large-scale structural brain connectivity networks constructed using diffusion tractography have been posited to reflect organizational differences in white matter architecture at the mesoscale, and thus offer a unique tool for characterizing brain development in patients with neonatal encephalopathy. In this manuscript we use diffusion tractography to construct structural networks for a cohort of patients with neonatal encephalopathy. We systematically map these networks to a high-dimensional space and then apply standard machine learning algorithms to predict neurological outcome in the cohort. Using nested cross-validation we demonstrate high prediction accuracy that is both statistically significant and robust over a broad range of thresholds. Our algorithm offers a novel tool to evaluate neonates at risk for developing neurological deficit. The described approach can be applied to any brain pathology that affects structural connectivity.
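The nested cross-validation scheme used above for outcome prediction can be sketched on synthetic data. The toy k-NN classifier and two-class data below stand in for the study's connectome features and are purely illustrative:

```python
# Schematic nested cross-validation: the inner loop tunes a hyper-parameter
# (k of a toy k-NN) on training folds only; the outer loop estimates accuracy
# on held-out folds, avoiding optimistic bias from tuning on test data.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(2, 1, (50, 4))])
y = np.array([0] * 50 + [1] * 50)

def knn_accuracy(Xtr, ytr, Xte, yte, k):
    """Fraction of test points whose k nearest training points vote correctly."""
    hits = 0
    for x, label in zip(Xte, yte):
        d = np.linalg.norm(Xtr - x, axis=1)
        votes = ytr[np.argsort(d)[:k]]
        hits += np.bincount(votes).argmax() == label
    return hits / len(yte)

def split_folds(indices, n_folds, seed=0):
    shuffled = np.random.default_rng(seed).permutation(indices)
    return np.array_split(shuffled, n_folds)

def nested_cv(X, y, ks=(1, 3, 5), n_outer=5, n_inner=3):
    scores = []
    for outer in split_folds(np.arange(len(y)), n_outer):
        train = np.setdiff1d(np.arange(len(y)), outer)
        def inner_score(k):  # evaluate k using the training part only
            return np.mean([knn_accuracy(X[np.setdiff1d(train, inner)],
                                         y[np.setdiff1d(train, inner)],
                                         X[inner], y[inner], k)
                            for inner in split_folds(train, n_inner)])
        best_k = max(ks, key=inner_score)
        scores.append(knn_accuracy(X[train], y[train], X[outer], y[outer], best_k))
    return float(np.mean(scores))

accuracy = nested_cv(X, y)  # unbiased estimate: tuning never sees outer folds
```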

  18. Automated System of Study Nonlinear Processes in Electro-vacuum Devices with Open Resonant Periodic Structures

    Directory of Open Access Journals (Sweden)

    G.S. Vorobyov

    2014-04-01

    The article describes the experimental equipment and the results of investigations of nonlinear processes occurring during the excitation of electromagnetic oscillations in resonant electron beam devices such as the orotron, a generator of diffraction radiation. These devices now find wide application in physics and microwave technology. A technique for experimental research is presented, based on the use of universal electro-vacuum equipment, a diffraction radiation analyzer and a microprocessor system for collecting and processing data. Experimental results are given for the energy and frequency characteristics of the most common modes of oscillation excitation in open resonant systems such as the orotron. Optimum modes for the excitation of oscillations in such devices are recommended.

  19. Automated tracing of open-field coronal structures for an optimized large-scale magnetic field reconstruction

    Science.gov (United States)

    Uritsky, V. M.; Davila, J. M.; Jones, S. I.

    2014-12-01

    Solar Probe Plus and Solar Orbiter will provide detailed measurements in the inner heliosphere magnetically connected with the topologically complex and eruptive solar corona. Interpretation of these measurements will require accurate reconstruction of the large-scale coronal magnetic field. In a related presentation by S. Jones et al., we argue that such reconstruction can be performed using photospheric extrapolation methods constrained by white-light coronagraph images. Here, we present the image-processing component of this project dealing with an automated segmentation of fan-like coronal loop structures. In contrast to the existing segmentation codes designed for detecting small-scale closed loops in the vicinity of active regions, we focus on the large-scale geometry of the open-field coronal features observed at significant radial distances from the solar surface. The coronagraph images used for the loop segmentation are transformed into a polar coordinate system and undergo radial detrending and initial noise reduction. The preprocessed images are subject to an adaptive second order differentiation combining radial and azimuthal directions. An adjustable thresholding technique is applied to identify candidate coronagraph features associated with the large-scale coronal field. A blob detection algorithm is used to extract valid features and discard noisy data pixels. The obtained features are interpolated using higher-order polynomials which are used to derive empirical directional constraints for magnetic field extrapolation procedures based on photospheric magnetograms.
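The radial detrending and adaptive thresholding steps described above can be sketched on a synthetic (radius, azimuth) image. The power-law falloff, the faint fan-like feature and the MAD-based threshold below are illustrative stand-ins for the actual pipeline:

```python
# Sketch: detect a faint radial feature against a steep coronal falloff by
# detrending each radius and thresholding adaptively (all values illustrative).
import numpy as np

rng = np.random.default_rng(1)
n_r, n_phi = 100, 180                                  # (radius, azimuth) grid
radius = np.linspace(1.5, 6.0, n_r)[:, None]           # radial coordinate
background = radius ** -3.0                            # steep radial falloff
feature = np.zeros((n_r, n_phi))
feature[:, 88:92] = 0.3 * background                   # faint fan-like ray
image = background + feature + rng.normal(0.0, 1e-4, (n_r, n_phi))

# Radial detrending: divide each row (fixed radius) by its median brightness,
# flattening the falloff so features at all radii become comparable.
detrended = image / np.median(image, axis=1, keepdims=True)

# Adaptive threshold: flag pixels well above the per-radius scatter (MAD).
mad = np.median(np.abs(detrended - 1.0), axis=1, keepdims=True)
mask = detrended > 1.0 + 5.0 * mad
```

In the full procedure the flagged pixels would then pass through blob detection and polynomial interpolation to yield directional constraints for the field extrapolation.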

  20. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: either an interpretation of the physical model of human interactions, or behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.

  1. Automated reconstruction of neuronal morphology based on local geometrical and global structural models.

    Science.gov (United States)

    Zhao, Ting; Xie, Jun; Amat, Fernando; Clack, Nathan; Ahammad, Parvez; Peng, Hanchuan; Long, Fuhui; Myers, Eugene

    2011-09-01

    Digital reconstruction of neurons from microscope images is an important and challenging problem in neuroscience. In this paper, we propose a model-based method to tackle this problem. We first formulate a model structure, then develop an algorithm for computing it by carefully taking into account morphological characteristics of neurons, as well as the image properties under typical imaging protocols. The method has been tested on the data sets used in the DIADEM competition and produced promising results for four out of the five data sets.

  2. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V.; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines, supporting collection, storage, administration, processing, preservation, communication, etc.

  3. The Matchmaker Exchange API: automating patient matching through the exchange of structured phenotypic and genotypic profiles.

    Science.gov (United States)

    Buske, Orion J; Schiettecatte, François; Hutton, Benjamin; Dumitriu, Sergiu; Misyura, Andriy; Huang, Lijia; Hartley, Taila; Girdea, Marta; Sobreira, Nara; Mungall, Chris; Brudno, Michael

    2015-10-01

    Despite the increasing prevalence of clinical sequencing, the difficulty of identifying additional affected families is a key obstacle to solving many rare diseases. There may only be a handful of similar patients worldwide, and their data may be stored in diverse clinical and research databases. Computational methods are necessary to enable finding similar patients across the growing number of patient repositories and registries. We present the Matchmaker Exchange Application Programming Interface (MME API), a protocol and data format for exchanging phenotype and genotype profiles to enable matchmaking among patient databases, facilitate the identification of additional cohorts, and increase the rate with which rare diseases can be researched and diagnosed. We designed the API to be straightforward and flexible in order to simplify its adoption on a large number of data types and workflows. We also provide a public test data set, curated from the literature, to facilitate implementation of the API and development of new matching algorithms. The initial version of the API has been successfully implemented by three members of the Matchmaker Exchange and was immediately able to reproduce previously identified matches and generate several new leads currently being validated. The API is available at https://github.com/ga4gh/mme-apis.
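A minimal match-request payload in the spirit of the MME API might look as follows. The field names follow the published v1.0 schema as best understood here, and all identifiers are hypothetical, so the repository linked above remains the authoritative reference:

```python
# Illustrative MME-style match request: a phenotypic profile (HPO terms) plus
# a candidate gene, serialized as JSON for POSTing to a partner's /match
# endpoint. Identifiers and contact details are made up for the example.
import json

request = {
    "patient": {
        "id": "example-patient-1",                       # hypothetical identifier
        "contact": {"name": "Example Clinic", "href": "mailto:clinic@example.org"},
        "features": [{"id": "HP:0000252"}],              # HPO term (microcephaly)
        "genomicFeatures": [{"gene": {"id": "EFTUD2"}}], # candidate gene
    }
}
payload = json.dumps(request, indent=2)
```

Matching services compare such profiles across databases and return scored candidate patients, which is how the exchange surfaces additional affected families.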

  4. Three-dimensional Reciprocal Structures: Morphology, Concepts, Generative Rules

    DEFF Research Database (Denmark)

    Parigi, Dario; Pugnale, Alberto

    2012-01-01

    This paper presents seven different three-dimensional structures based on the principle of structural reciprocity with superimposition joints and standardized un-notched elements. Such a typology could be regarded as intrinsically three-dimensional because elements sit one on top of the oth...

  5. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    Science.gov (United States)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
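The DSM-plus-GA combination can be sketched compactly in Python rather than as an Excel macro. The dependency matrix, the fitness function (penalizing cross-cluster couplings and oversized clusters), and the GA settings below are all illustrative assumptions:

```python
# Compact genetic algorithm that clusters a small Dependency Structure Matrix.
import random

random.seed(42)

# DSM[i][j] = 1 means element i depends on element j (illustrative matrix with
# two obvious groups: {0, 1, 2} and {3, 4, 5}).
DSM = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
N, N_CLUSTERS = len(DSM), 2

def cost(assign):
    """Crossing dependencies (heavily penalized) plus a squared-size penalty
    that discourages putting everything into one big cluster."""
    crossings = sum(DSM[i][j] for i in range(N) for j in range(N)
                    if assign[i] != assign[j])
    sizes = [assign.count(c) for c in range(N_CLUSTERS)]
    return 10 * crossings + sum(s * s for s in sizes)

def evolve(pop_size=40, generations=100):
    pop = [[random.randrange(N_CLUSTERS) for _ in range(N)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:             # single-gene mutation
                child[random.randrange(N)] = random.randrange(N_CLUSTERS)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()  # recovers the two coupled groups as separate clusters
```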

  7. Information Theoretic Secret Key Generation: Structured Codes and Tree Packing

    Science.gov (United States)

    Nitinawarat, Sirin

    2010-01-01

    This dissertation deals with a multiterminal source model for secret key generation by multiple network terminals with prior and privileged access to a set of correlated signals complemented by public discussion among themselves. Emphasis is placed on a characterization of secret key capacity, i.e., the largest rate of an achievable secret key,…

  8. Second harmonic generation from photonic structured GaN nanowalls

    Energy Technology Data Exchange (ETDEWEB)

    Soya, Takahiro; Inose, Yuta; Kunugita, Hideyuki; Ema, Kazuhiro; Yamano, Kouji; Kikuchi, Akihiko; Kishino, Katsumi, E-mail: t-soya@sophia.ac.j [Department of Engineering and Applied Sciences, Sophia University 7-1, Kioi-cho, Chiyoda-ku, Tokyo 102-8554 (Japan)

    2009-11-15

    We observed a large enhancement of reflected second harmonic generation (SHG) using the one-dimensional photonic effect in regularly arranged InGaN/GaN single-quantum-well nanowalls. When both the fundamental and the SH resonate with the photonic mode, we obtained an enhancement of about 40 times compared with conditions far from resonance.

  9. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure control, UV absorbance measurements and automated data analysis. As little as 15 μl of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis cycle can be performed in less than 3 min. Bovine serum albumin was used as a model protein to characterize the mixing efficiency and sample consumption of the system. The N2 fragment of an adaptor protein (p120-RasGAP) was used to demonstrate how the device can be used to survey the structural space...

  10. An Online Structural Plasticity Rule for Generating Better Reservoirs

    OpenAIRE

    Roy, Subhrajit; Basu, Arindam

    2016-01-01

    In this article, a novel neuro-inspired low-resolution online unsupervised learning rule is proposed to train the reservoir or liquid of Liquid State Machine. The liquid is a sparsely interconnected huge recurrent network of spiking neurons. The proposed learning rule is inspired from structural plasticity and trains the liquid through formation and elimination of synaptic connections. Hence, the learning involves rewiring of the reservoir connections similar to structural plasticity observed...

  11. Structures' validation profiles in Transmission of Imaging and Data (TRIAD) for automated National Clinical Trials Network (NCTN) clinical trial digital data quality assurance.

    Science.gov (United States)

    Giaddui, Tawfik; Yu, Jialu; Manfredi, Denise; Linnemann, Nancy; Hunter, Joanne; O'Meara, Elizabeth; Galvin, James; Bialecki, Brian; Xiao, Ying

    2016-01-01

Transmission of Imaging and Data (TRIAD) is a standards-based system built by the American College of Radiology to provide the seamless exchange of images and data for accreditation of clinical trials and registries. Scripts of structure-name validation profiles created in TRIAD are used in the automated submission process. It is essential for users to understand the logistics of these scripts for successful submission of radiation therapy cases with fewer iterations.

  12. Automation or De-automation

    Science.gov (United States)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  13. Alternative similarity renormalization group generators in nuclear structure calculations

    CERN Document Server

    Dicaire, Nuiok M; Navratil, Petr

    2014-01-01

    The similarity renormalization group (SRG) has been successfully applied to soften interactions for ab initio nuclear calculations. In almost all practical applications in nuclear physics, an SRG generator with the kinetic energy operator is used. With this choice, a fast convergence of many-body calculations can be achieved, but at the same time substantial three-body interactions are induced even if one starts from a purely two-nucleon (NN) Hamiltonian. Three-nucleon (3N) interactions can be handled by modern many-body methods. However, it has been observed that when including initial chiral 3N forces in the Hamiltonian, the SRG transformations induce a non-negligible four-nucleon interaction that cannot be currently included in the calculations for technical reasons. Consequently, it is essential to investigate alternative SRG generators that might suppress the induction of many-body forces while at the same time might preserve the good convergence. In this work we test two alternative generators with oper...

  14. Generation, structure and reactivity of arynes: A theoretical study

    Indian Academy of Sciences (India)

    Peter G S Dkhar; R H Duncan Lyngdoh

    2000-04-01

The semiempirical AM1 SCF-MO method is used to study the benzyne mechanism for aromatic nucleophilic substitution of various -substituted chlorobenzenes and 3-chloropyridine. The calculations predict that most of the fixed substituents studied here would induce the formation of 2,3-arynes through their electron-withdrawing resonance or inductive effects. The geometry and electronic structure of the 2,3- and 3,4-arynes investigated here confirm the generally accepted -benzyne structure postulated for arynes. The sites of nucleophilic addition to arynes as predicted here are in fair agreement with expectation and experimental findings.

  15. Generating Counterexamples for Structural Inductions by Exploiting Nonstandard Models

    Science.gov (United States)

    Blanchette, Jasmin Christian; Claessen, Koen

    Induction proofs often fail because the stated theorem is noninductive, in which case the user must strengthen the theorem or prove auxiliary properties before performing the induction step. (Counter)model finders are useful for detecting non-theorems, but they will not find any counterexamples for noninductive theorems. We explain how to apply a well-known concept from first-order logic, nonstandard models, to the detection of noninductive invariants. Our work was done in the context of the proof assistant Isabelle/HOL and the counterexample generator Nitpick.

  16. Automated Generation of Attack Trees

    DEFF Research Database (Denmark)

    Vigo, Roberto; Nielson, Flemming; Nielson, Hanne Riis

    2014-01-01

Attack trees are widely used to represent threat scenarios in a succinct and intuitive manner, suitable for conveying security information to non-experts. The manual construction of such objects relies on the creativity and experience of specialists, and therefore it is error-prone and impractical...

  17. Local structure of numerically generated worm hole spacetime.

    Science.gov (United States)

    Siino, M.

The author investigates the evolution of the apparent horizons in a numerically generated worm hole spacetime. The behavior of the apparent horizons is affected by the dynamics of the matter field. By using the local mass of the system, he interprets the evolution of the worm hole structure.

  18. Local Structure of Numerically Generated Worm Hole Spacetime

    OpenAIRE

    Nambu, Yasusada; Siino, Masaru

    1993-01-01

We investigate the evolution of the apparent horizons in a numerically generated worm hole spacetime. The behavior of the apparent horizons is affected by the dynamics of the matter field. By using the local mass of the system, we interpret the evolution of the worm hole structure. Figures are available by mail to author.

  19. Detailed requirements for a next generation nuclear data structure.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-07-05

    This document attempts to compile the requirements for the top-levels of a hierarchical arrangement of nuclear data such as found in the ENDF format. This set of requirements will be used to guide the development of a new data structure to replace the legacy ENDF format.

  20. Local Structure of Numerically Generated Worm Hole Spacetime

    CERN Document Server

    Nambu, Y; Nambu, Yasusada; Siino, Masaru

    1993-01-01

We investigate the evolution of the apparent horizons in a numerically generated worm hole spacetime. The behavior of the apparent horizons is affected by the dynamics of the matter field. By using the local mass of the system, we interpret the evolution of the worm hole structure. Figures are available by mail to author.

  1. Generative probabilistic models extend the scope of inferential structure determination

    DEFF Research Database (Denmark)

    Olsson, Simon; Boomsma, Wouter; Frellsen, Jes

    2011-01-01

    rigorous approach was developed which treats structure determination as a problem of Bayesian inference. In this case, the forcefields are brought in as a prior distribution in the form of a Boltzmann factor. Due to high computational cost, the approach has been only sparsely applied in practice. Here, we...

  2. Generation of tripolar vortical structures on the beta plane

    DEFF Research Database (Denmark)

    Hesthaven, J.S.; Lynov, Jens-Peter; Juul Rasmussen, J.

    1993-01-01

    and oscillation of the tripolar structure may lead to increased mixing near the boundary of the vortex core. The translation of strong monopoles is found to be well described, even for times longer than the linear Rossby wave period, by a recent approximate theory for the evolution of an azimuthal perturbation...

  3. Platelet-induced thrombin generation by the calibrated automated thrombogram assay is increased in patients with essential thrombocythemia and polycythemia vera.

    Science.gov (United States)

Panova-Noeva, Marina; Marchetti, Marina; Spronk, Henri Maria; Russo, Laura; Diani, Erika; Finazzi, Guido; Salmoiraghi, Silvia; Rambaldi, Alessandro; Barbui, Tiziano; Ten Cate, Hugo; Falanga, Anna

    2011-04-01

The platelet contribution to the thrombophilic state of patients with myeloproliferative neoplasms (MPNs), i.e., essential thrombocythemia (ET) and polycythemia vera (PV), remains uncertain. In this study we aimed to characterize the thrombin generation (TG) potential expressed by platelets from these subjects, compare it to that of normal platelets, and identify factors that might be responsible for platelet TG. In a group of 140 MPN patients (80 ET and 60 PV) and 72 healthy subjects, we measured the global procoagulant potential of platelet-rich plasma (PRP) utilizing the TG assay by the calibrated automated thrombogram (CAT). To characterize the procoagulant contribution of platelets in PRP, the TG of both isolated platelets and platelet-poor plasma was measured, and the platelet surface expression of TF was determined. Finally, the activation status of platelets was assessed by the levels of P-selectin expressed on the platelet surface. MPN patients had significantly increased PRP and isolated-platelet TG potential compared to controls, which was associated with platelet activation. Patients carrying the JAK2V617F mutation showed the highest values of TG and of platelet surface TF and P-selectin. Platelet TG potential was significantly lower in hydroxyurea (HU)-treated compared to non-HU-treated patients and was lowest in HU-treated JAK2V617F carriers. In subjects not receiving HU, platelet TG increased significantly with increasing JAK2V617F allele burden (P < 0.05). This study demonstrates a platelet-dependent form of hypercoagulability in MPN patients, particularly in carriers of the JAK2V617F mutation; cytoreductive therapy with HU significantly affects this prothrombotic phenotype.

  4. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
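
As a rough illustration of the grammar-based test generation that TAO builds on, the sketch below randomly expands a context-free grammar into test inputs. The grammar, symbols, and oracle here are invented for this sketch and are not taken from TAO itself.

```python
import random

# A toy context-free grammar: nonterminals map to lists of productions.
# Each production is a sequence of symbols (nonterminals or terminal strings).
GRAMMAR = {
    "<expr>": [["<term>", "+", "<term>"], ["<term>"]],
    "<term>": [["<num>"], ["(", "<expr>", ")"]],
    "<num>": [["1"], ["2"], ["42"]],
}

def generate(symbol="<expr>", rng=None, depth=0, max_depth=8):
    """Randomly expand a grammar symbol into a terminal string."""
    rng = rng or random.Random(0)
    if symbol not in GRAMMAR:
        return symbol  # terminal symbol: emit as-is
    options = GRAMMAR[symbol]
    # Bias towards the shortest production when too deep, to guarantee termination.
    prod = min(options, key=len) if depth >= max_depth else rng.choice(options)
    return "".join(generate(s, rng, depth + 1, max_depth) for s in prod)

if __name__ == "__main__":
    rng = random.Random(7)
    for _ in range(5):
        t = generate("<expr>", rng)
        # Toy oracle: every generated string must be a valid integer expression.
        assert isinstance(eval(t), int)
```

Here the oracle is simply Python's own evaluator; TAO instead derives expected behaviour from a denotational semantics attached to the grammar.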

  5. Automated Integrated Analog Filter Design Issues

    OpenAIRE

    2015-01-01

An analysis of modern automated integrated analog circuit design methods and their use in integrated filter design is presented. Modern automated analog circuit tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are suited only to gmC and switched-current filter topologies. Here, an algorithm for active RC integrated filter design is proposed that can be used in automated filter design. The algorithm is t...

  6. Current Generation in Double-Matrix Structure: A Theoretical Simulation

    OpenAIRE

    Bogdan Lukiyanets; Dariya Matulka; Ivan Grygorchak

    2013-01-01

Peculiarities of the kinetic characteristics of a supramolecular system, in particular a double-matrix structure, observed as the guest content in the host matrix changes, are investigated. Results obtained within the framework of a time-independent one-dimensional Schrödinger equation with a three-barrier potential qualitatively explain the experimental data. They indicate the importance of size quantization in the system and of the correlation between the energy and geometric characteristics of both guest and hos...

  7. Residual Generation for the Ship Benchmark Using Structural Approach

    DEFF Research Database (Denmark)

    Cocquempot, V.; Izadi-Zamanabadi, Roozbeh; Staroswiecki, M

    1998-01-01

The prime objective of Fault-tolerant Control (FTC) systems is to handle faults and discrepancies using appropriate accommodation policies. The issue of obtaining information about various parameters and signals, which have to be monitored for fault detection purposes, becomes a rigorous task with the growing number of subsystems. The structural approach, presented in this paper, constitutes a general framework for providing information when the system becomes complex. The methodology of this approach is illustrated on the ship propulsion benchmark.

  8. Turbulence and Vorticity in Galaxy Clusters Generated by Structure Formation

    CERN Document Server

    Vazza, F; Brüggen, M; Brunetti, G; Gheller, C; Porter, D; Ryu, D

    2016-01-01

Turbulence is a key ingredient for the evolution of the intracluster medium, whose properties can be predicted with high resolution numerical simulations. We present initial results on the generation of solenoidal and compressive turbulence in the intracluster medium during the formation of a small-size cluster using highly resolved, non-radiative cosmological simulations, with a refined monitoring in time. In this first of a series of papers, we closely look at one simulated cluster whose formation was distinguished by a merger around $z \sim 0.3$. We separate laminar gas motions, turbulence and shocks with dedicated filtering strategies and distinguish the solenoidal and compressive components of the gas flows using Hodge-Helmholtz decomposition. Solenoidal turbulence dominates the dissipation of turbulent motions ($\sim 95\%$) in the central cluster volume at all epochs. The dissipation via compressive modes is found to be more important ($\sim 30\%$ of the total) only at large radii ($\geq 0.5~r_{\rm vi...
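
The Hodge-Helmholtz decomposition mentioned above can be sketched for a 2-D periodic velocity field by projecting each Fourier mode onto its longitudinal (compressive) direction; the grid, test field, and NumPy implementation below are illustrative assumptions, not the filtering pipeline of the paper.

```python
import numpy as np

def helmholtz_decompose(vx, vy):
    """Split a 2-D periodic velocity field into compressive (curl-free)
    and solenoidal (divergence-free) parts via spectral projection."""
    kx = np.fft.fftfreq(vx.shape[0])[:, None]
    ky = np.fft.fftfreq(vx.shape[1])[None, :]
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0  # avoid division by zero; the mean mode has no direction
    fx, fy = np.fft.fft2(vx), np.fft.fft2(vy)
    # Longitudinal projection (v . k-hat) k-hat gives the compressive part.
    div = (fx * kx + fy * ky) / k2
    cx, cy = div * kx, div * ky
    cx[0, 0] = cy[0, 0] = 0.0
    comp = np.real(np.fft.ifft2(cx)), np.real(np.fft.ifft2(cy))
    sol = vx - comp[0], vy - comp[1]
    return comp, sol
```

A purely compressive field (a gradient) is returned entirely in `comp`, while a shear flow lands entirely in `sol`, which is a quick self-check for the projection.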

  9. Top-down characterization of nucleic acids modified by structural probes using high-resolution tandem mass spectrometry and automated data interpretation.

    Science.gov (United States)

    Kellersberger, Katherine A; Yu, Eizadora; Kruppa, Gary H; Young, Malin M; Fabris, Daniele

    2004-05-01

A new program called MS2Links was developed for the automated reduction and interpretation of fragmentation data obtained from modified nucleic acids. Based on an algorithm that searches for plausible isotopic patterns, the data reduction module is capable of discriminating legitimate signals from noise spikes of comparable intensity. The fragment identification module calculates the monoisotopic mass of ion products expected from a certain sequence and user-defined covalent modifications, which are finally matched with the signals selected by the data reduction program. Considering that MS2Links can generate similar fragment libraries for peptides and their covalent conjugates with other peptides or nucleic acids, this program provides an integrated platform for the structural investigation of protein-nucleic acid complexes based on cross-linking strategies and top-down ESI-FTMS.
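
The fragment-matching step described above can be sketched as follows: compute theoretical monoisotopic fragment masses for a sequence, then assign observed masses within a ppm tolerance. The residue masses, the b-ion-like prefix fragments, and the 10 ppm tolerance are simplifying assumptions for illustration, not MS2Links internals.

```python
# Standard monoisotopic residue masses (Da) for a few amino acids.
RESIDUE_MASS = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276}

def fragment_masses(seq):
    """Monoisotopic masses of all b-ion-like prefixes of a peptide (residues only)."""
    masses, total = {}, 0.0
    for i, aa in enumerate(seq, start=1):
        total += RESIDUE_MASS[aa]
        masses[seq[:i]] = total
    return masses

def match_peaks(observed, theoretical, tol_ppm=10.0):
    """Assign each observed mass to any theoretical fragment within tolerance."""
    hits = {}
    for m in observed:
        for frag, mt in theoretical.items():
            if abs(m - mt) / mt * 1e6 <= tol_ppm:
                hits[m] = frag
    return hits
```

For example, a peak at 128.0586 Da matches the "GA" prefix of "GASP" to well within 10 ppm.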

  10. Materials Testing and Automation

    Science.gov (United States)

    Cooper, Wayne D.; Zweigoron, Ronald B.

    1980-07-01

    The advent of automation in materials testing has been in large part responsible for recent radical changes in the materials testing field: Tests virtually impossible to perform without a computer have become more straightforward to conduct. In addition, standardized tests may be performed with enhanced efficiency and repeatability. A typical automated system is described in terms of its primary subsystems — an analog station, a digital computer, and a processor interface. The processor interface links the analog functions with the digital computer; it includes data acquisition, command function generation, and test control functions. Features of automated testing are described with emphasis on calculated variable control, control of a variable that is computed by the processor and cannot be read directly from a transducer. Three calculated variable tests are described: a yield surface probe test, a thermomechanical fatigue test, and a constant-stress-intensity range crack-growth test. Future developments are discussed.

  11. Web matrices: structural properties and generating combinatorial identities

    CERN Document Server

    Dukes, Mark

    2016-01-01

    In this paper we present new results for the combinatorics of web diagrams and web worlds. These are discrete objects that arise in the physics of calculating scattering amplitudes in non-abelian gauge theories. Web-colouring and web-mixing matrices (collectively known as web matrices) are indexed by ordered pairs of web-diagrams and contain information relating the number of colourings of the first web diagram that will produce the second diagram. We introduce the black diamond product on power series and show how it determines the web-colouring matrix of disjoint web worlds. Furthermore, we show that combining known physical results with the black diamond product gives a new technique for generating combinatorial identities. Due to the complicated action of the product on power series, the resulting identities appear highly non-trivial. We present two results to explain repeated entries that appear in the web matrices. The first of these shows how diagonal web matrix entries will be the same if the comparab...

  12. Structural and leakage integrity assessment of WWER steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

Splichal, K.; Otruba, J. [Nuclear Research Inst., Rez (Czech Republic)]

    1997-12-31

The integrity of the heat-exchange tubes may influence the lifetime of WWER steam generators and appears to be an important criterion for the evaluation of their safety and operational reliability. The basic requirement is to assure a very low probability of radioactive water leakage, preventing unstable crack growth and sudden tube rupture. These requirements led to the development of permissible limits for primary-to-secondary leak evolution and for heat-exchange tube plugging based on eddy current test inspection. Stress corrosion cracking and pitting are the main forms of corrosion damage to WWER heat-exchange tubes and initiate from the outer surface. They are influenced by water chemistry, temperature and the stress level in the tube wall. They take place under crevice-corrosion conditions and occur especially (1) under the tube support plates, where up to 90-95% of the defects detected by the ECT method are found, and (2) on free spans under tube deposit layers. Both crack initiation and growth cause thinning of the tube wall and lead to part-thickness and through-wall cracks, oriented above all in the axial direction. 10 refs.

  13. Generating Free-Form Grid Truss Structures from 3D Scanned Point Clouds

    Directory of Open Access Journals (Sweden)

    Hui Ding

    2017-01-01

Reconstruction according to physical shape is a novel way to generate free-form grid truss structures. 3D scanning is an effective means of acquiring physical form information, and it generates dense point clouds on the surfaces of objects. However, generating grid truss structures from point clouds is still a challenge. Based on the advancing front technique (AFT), which is widely used in the Finite Element Method (FEM), a scheme for generating grid truss structures from 3D scanned point clouds is proposed in this paper. Based on the characteristics of point cloud data, a search box is adopted to reduce the search space in grid generation. A front-advancing procedure suited to point clouds is established. The Delaunay method and the Laplacian method are used to improve the quality of the generated grids, and an adjustment strategy that locates grid nodes at appointed places is proposed. Several examples of generating grid truss structures from 3D scanned point clouds of seashells are carried out to verify the proposed scheme. Physical models of the grid truss structures generated in the examples are manufactured by 3D printing, confirming the feasibility of the scheme.
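
One of the grid-quality steps mentioned above, Laplacian smoothing, can be sketched as repeatedly moving each free node towards the centroid of its neighbours. The node/edge representation, relaxation factor, and iteration count below are assumptions of this sketch, not the paper's implementation.

```python
import numpy as np

def laplacian_smooth(nodes, edges, fixed, iterations=20, alpha=0.5):
    """Laplacian smoothing of a grid.

    nodes: (N, 2) or (N, 3) float array of coordinates;
    edges: list of (i, j) index pairs; fixed: indices held in place
    (e.g. boundary nodes, so the grid keeps its outline).
    """
    nodes = nodes.astype(float).copy()
    neigh = {i: [] for i in range(len(nodes))}
    for i, j in edges:
        neigh[i].append(j)
        neigh[j].append(i)
    for _ in range(iterations):
        for i, nb in neigh.items():
            if i in fixed or not nb:
                continue
            centroid = nodes[nb].mean(axis=0)
            # Relax the node part-way towards its neighbourhood centroid.
            nodes[i] += alpha * (centroid - nodes[i])
    return nodes
```

For instance, a badly placed interior node connected to the four corners of a unit square relaxes to the square's centre while the fixed corners stay put.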

  14. Automated Composite Column Wrapping

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    The Automated Composite Column Wrapping is performed by a patented machine known as Robo-Wrapper. Currently there are three versions of the machine available for bridge retrofit work depending on the size of the columns being wrapped. Composite column retrofit jacket systems can be structurally just as effective as conventional steel jacketing in improving the seismic response characteristics of substandard reinforced concrete columns.

  15. Generating spatial precipitation ensembles: impact of temporal correlation structure

    Directory of Open Access Journals (Sweden)

    O. Rakovec

    2012-09-01

Sound spatially distributed rainfall fields including a proper spatial and temporal error structure are of key interest for hydrologists to force hydrological models and to identify uncertainties in the simulated and forecasted catchment response. The current paper presents a temporally coherent error identification method based on time-dependent multivariate spatial conditional simulations, which are conditioned on preceding simulations. A sensitivity analysis and real-world experiment are carried out within the hilly region of the Belgian Ardennes. Precipitation fields are simulated for pixels of 10 km × 10 km resolution. Uncertainty analyses in the simulated fields focus on (1) the number of previous simulation hours on which the new simulation is conditioned, (2) the advection speed of the rainfall event, (3) the size of the catchment considered, and (4) the rain gauge density within the catchment. The results of the sensitivity analysis show that for typical advection speeds >20 km h−1, no uncertainty is added in terms of across-ensemble spread when conditioning on more than one or two previous hourly simulations. However, for the real-world experiment, additional uncertainty can still be added when conditioning on a larger number of previous simulations, because the dynamics of actual precipitation fields exhibit a larger spatial and temporal variability. Moreover, when the observation network is thinned by 50%, the added uncertainty increases only slightly, and the cross-validation shows that the simulations at the unobserved locations are unbiased. Finally, the first-order autocorrelation coefficients show clear temporal coherence in the time series of the areal precipitation using the time-dependent multivariate conditional simulations, which was not the case using the time-independent univariate conditional simulations. The presented work can be easily implemented within a hydrological calibration and data assimilation
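
The idea of conditioning each simulation on preceding ones can be illustrated in its simplest temporal form with an AR(1) process. This toy generator (Gaussian, single pixel, invented parameters) only demonstrates how a prescribed lag-1 autocorrelation is imposed; the paper's method is a multivariate spatial conditional simulation, which this sketch does not reproduce.

```python
import numpy as np

def ar1_ensemble(n_members, n_steps, rho, sigma=1.0, seed=0):
    """Generate ensemble time series with lag-1 autocorrelation rho.

    Each new step is conditioned on the previous one (AR(1)) - the simplest
    analogue of conditioning a new simulation on preceding simulations.
    """
    rng = np.random.default_rng(seed)
    x = np.empty((n_members, n_steps))
    x[:, 0] = rng.normal(0.0, sigma, n_members)
    # Innovation variance chosen so the marginal variance stays sigma^2.
    innov_sd = sigma * np.sqrt(1.0 - rho**2)
    for t in range(1, n_steps):
        x[:, t] = rho * x[:, t - 1] + rng.normal(0.0, innov_sd, n_members)
    return x
```

Estimating the lag-1 correlation of a long generated series recovers `rho`, mirroring the temporal-coherence check in the paper's autocorrelation analysis.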

  16. Smashing the Stovepipe: Leveraging the GMSEC Open Architecture and Advanced IT Automation to Rapidly Prototype, Develop and Deploy Next-Generation Multi-Mission Ground Systems

    Science.gov (United States)

    Swenson, Paul

    2017-01-01

Satellite/payload ground systems are typically highly customized to a specific mission's use cases and utilize hundreds (or thousands!) of specialized point-to-point interfaces for data flows and file transfers. Documentation and tracking of these complex interfaces require extensive time to develop and extremely high staffing costs; implementation and testing are even more cost-prohibitive, and documentation often lags behind implementation, resulting in inconsistencies down the road. With expanding threat vectors, IT security, information assurance and operational security have become key ground system architecture drivers. New Federal security-related directives are generated on a daily basis, imposing new requirements on current/existing ground systems; these mandated activities and data calls typically carry little or no additional funding for implementation. As a result, ground system sustaining engineering groups and information technology staff continually struggle to keep up with the rolling tide of security. Advancing security concerns and shrinking budgets are pushing these large stove-piped ground systems to begin sharing resources, i.e. operational/sysadmin staff, IT security baselines, architecture decisions, or even networks and hosting infrastructure. Refactoring these existing ground systems into multi-mission assets proves extremely challenging due to the typically very tight coupling between legacy components; as a result, many "multi-mission" operations environments end up simply sharing compute resources and networks due to the difficulty of refactoring into true multi-mission systems. Utilizing continuous integration and rapid system-deployment technologies in conjunction with an open-architecture messaging approach allows system engineers and architects to worry less about the low-level details of interfaces between components and configuration of systems. GMSEC messaging is inherently designed to support multi-mission requirements, and

  17. THE NEW ONLINE METADATA EDITOR FOR GENERATING STRUCTURED METADATA

    Energy Technology Data Exchange (ETDEWEB)

    Devarakonda, Ranjeet [ORNL; Shrestha, Biva [ORNL; Palanisamy, Giri [ORNL; Hook, Leslie A [ORNL; Killeffer, Terri S [ORNL; Boden, Thomas A [ORNL; Cook, Robert B [ORNL; Zolly, Lisa [United States Geological Service (USGS); Hutchison, Viv [United States Geological Service (USGS); Frame, Mike [United States Geological Service (USGS); Cialella, Alice [Brookhaven National Laboratory (BNL); Lazer, Kathy [Brookhaven National Laboratory (BNL)

    2014-01-01

Nobody is better suited to describe data than the scientist who created it. This description of the data is called metadata. In general terms, metadata represents the who, what, when, where, why and how of the dataset [1]. eXtensible Markup Language (XML) is the preferred output format for metadata, as it makes it portable and, more importantly, suitable for system discoverability. The newly developed ORNL Metadata Editor (OME) is a Web-based tool that allows users to create and maintain XML files containing key information, or metadata, about the research. Metadata include information about the specific projects, parameters, time periods, and locations associated with the data. Such information helps put the research findings in context. In addition, the metadata produced using OME will allow other researchers to find these data via metadata clearinghouses like Mercury [2][4]. OME is part of ORNL's Mercury software fleet [2][3]. It was jointly developed to support projects funded by the United States Geological Survey (USGS), U.S. Department of Energy (DOE), National Aeronautics and Space Administration (NASA) and National Oceanic and Atmospheric Administration (NOAA). OME's architecture provides a customizable interface to support project-specific requirements. Using this new architecture, the ORNL team developed OME instances for USGS's Core Science Analytics, Synthesis, and Libraries (CSAS&L), DOE's Next Generation Ecosystem Experiments (NGEE) and Atmospheric Radiation Measurement (ARM) Program, and the international Surface Ocean Carbon Dioxide ATlas (SOCAT). Researchers simply use the ORNL Metadata Editor to enter relevant metadata into a Web-based form. From the information on the form, the Metadata Editor can create an XML file on the server where the editor is installed or on the user's personal computer. Researchers can also use the ORNL Metadata Editor to modify existing XML metadata files. As an example, an NGEE Arctic scientist uses OME to register
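
The kind of XML record a metadata editor produces can be sketched with the Python standard library. The element names and example values below are hypothetical illustrations, not the actual OME schema or its output.

```python
import xml.etree.ElementTree as ET

def build_metadata(title, creator, start, end, bbox):
    """Build a minimal (hypothetical) metadata record as an XML string."""
    root = ET.Element("metadata")
    ET.SubElement(root, "title").text = title
    ET.SubElement(root, "creator").text = creator
    cov = ET.SubElement(root, "coverage")
    # Temporal and spatial coverage carried as attributes.
    ET.SubElement(cov, "temporal", start=start, end=end)
    ET.SubElement(cov, "spatial", west=str(bbox[0]), south=str(bbox[1]),
                  east=str(bbox[2]), north=str(bbox[3]))
    return ET.tostring(root, encoding="unicode")

# Example values are invented for illustration.
xml_doc = build_metadata("Soil CO2 flux", "J. Doe", "2013-01-01", "2013-12-31",
                         (-157.4, 71.2, -156.4, 71.4))
```

A real editor would validate the record against a schema before publishing it to a clearinghouse; this sketch only shows the serialization step.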

  18. Automated Building Extraction from High-Resolution Satellite Imagery in Urban Areas Using Structural, Contextual, and Spectral Information

    Directory of Open Access Journals (Sweden)

    Curt H. Davis

    2005-08-01

High-resolution satellite imagery provides an important new data source for building extraction. We demonstrate an integrated strategy for identifying buildings in 1-meter resolution satellite imagery of urban areas. Buildings are extracted using structural, contextual, and spectral information. First, a series of geodesic opening and closing operations are used to build a differential morphological profile (DMP) that provides image structural information. Building hypotheses are generated and verified through shape analysis applied to the DMP. Second, shadows are extracted using the DMP to provide reliable contextual information to hypothesize the position and size of adjacent buildings. Seed building rectangles are verified and grown on a finely segmented image. Next, bright buildings are extracted using spectral information. The extraction results from the different information sources are combined after independent extraction. Performance evaluation of the building extraction on an urban test site using IKONOS satellite imagery of the City of Columbia, Missouri, is reported. With the combination of structural, contextual, and spectral information, 72.7% of the building areas are extracted with a quality percentage of 58.8%.
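
A differential morphological profile can be sketched as the differences between successive grey openings of the image: each band records how much bright structure disappears at that scale. Note this sketch uses plain flat openings rather than the geodesic opening-by-reconstruction the paper describes, and the structuring-element sizes are arbitrary.

```python
import numpy as np

def dilate(img, size):
    """Grey dilation with a flat (2*size+1) x (2*size+1) structuring element."""
    padded = np.pad(img, size, mode="edge")
    out = np.full_like(img, -np.inf, dtype=float)
    h, w = img.shape
    for dy in range(2 * size + 1):
        for dx in range(2 * size + 1):
            out = np.maximum(out, padded[dy:dy + h, dx:dx + w])
    return out

def erode(img, size):
    # Erosion is dilation of the negated image.
    return -dilate(-img, size)

def opening(img, size):
    return dilate(erode(img, size), size)

def dmp(img, sizes=(1, 2, 3)):
    """Differential morphological profile: per-scale loss of bright structure."""
    profile = [img.astype(float)] + [opening(img, s) for s in sizes]
    return [profile[i] - profile[i + 1] for i in range(len(sizes))]
```

A bright blob smaller than the structuring element shows up strongly in the corresponding DMP band, while larger structures pass through unchanged, which is the property the shape analysis exploits.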

  19. The application of rhetorical structure theory to interactive news program generation from digital archives

    NARCIS (Netherlands)

    Lindley, C.A.; Davis, J.R.; Nack, F.-M.; Rutledge, L.

    2001-01-01

    Rhetorical structure theory (RST) provides a model of textual function based upon rhetoric. Initially developed as a model of text coherence, RST has been used extensively in text generation research, and has more recently been proposed as a basis for multimedia presentation generation. This pap

  20. Automated spectral classification and the GAIA project

    Science.gov (United States)

    Lasala, Jerry; Kurtz, Michael J.

    1995-01-01

Two-dimensional spectral types for each of the stars observed in the global astrometric interferometer for astrophysics (GAIA) mission would provide additional information for galactic structure and stellar evolution studies, as well as helping in the identification of unusual objects and populations. Classifying the large quantity of generated spectra requires that automated techniques be implemented. Approaches for automatic classification are reviewed, and a metric-distance method is discussed. In tests, the metric-distance method produced spectral types with mean errors comparable to those of human classifiers working at similar resolution. Data and equipment requirements for an automated classification survey are discussed. A program of auxiliary observations is proposed to yield spectral types and radial velocities for the GAIA-observed stars.
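
At its core, a metric-distance method reduces to nearest-template matching: assign the spectral type whose template minimizes a distance to the observed spectrum. The Euclidean metric and toy templates below are assumptions of this sketch; a real survey would use a metric tailored to spectral features and resolution.

```python
import numpy as np

def classify(spectrum, templates):
    """Metric-distance classification: return the label of the template
    closest (Euclidean distance) to the observed spectrum."""
    best, best_d = None, np.inf
    for label, tmpl in templates.items():
        d = np.linalg.norm(spectrum - tmpl)
        if d < best_d:
            best, best_d = label, d
    return best
```

With templates for each spectral class on a common wavelength grid, a noisy observation is assigned to the class it most resembles.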

  1. Spatial properties of entangled photon pairs generated in nonlinear layered structures

    CERN Document Server

    Perina, Jan

    2011-01-01

    A spatial quantum model of spontaneous parametric down-conversion in nonlinear layered structures is developed, expanding the interacting vectorial fields into monochromatic plane waves. A two-photon spectral amplitude depending on the signal- and idler-field frequencies and propagation directions is used to derive transverse profiles of the emitted fields as well as their spatial correlations. Intensity spatial profiles and their spatial correlations are mainly determined by the positions of transmission peaks formed in these structures with photonic bands. A method for geometry optimization of the structures with respect to the efficiency of the nonlinear process is suggested. Several structures composed of GaN/AlN layers are analyzed as typical examples. They allow the generation of photon pairs correlated in several emission directions. Photon-pair generation rates increasing better than the second power of the number of layers can be reached. Structures that efficiently generate photon pairs showing anti-bun...

  2. Imaging of the Space-time Structure of a Vortex Generator in Supersonic Flow

    Institute of Scientific and Technical Information of China (English)

    WANG Dengpan; XIA Zhixun; ZHAO Yuxin; WANG Bo; ZHAO Yanhui

    2012-01-01

    The fine space-time structure of a vortex generator (VG) in supersonic flow is studied with the nanoparticle-based planar laser scattering (NPLS) method in a quiet supersonic wind tunnel. The fine coherent structure at the symmetrical plane of the flow field around the VG is imaged with NPLS. The spatial structure and temporal evolution characteristics of the vortical structure are analyzed, which demonstrate periodic evolution and similar geometry, and the characteristics of rapid movement and slow change. Because the NPLS system yields flow images at high temporal and spatial resolution, the position of a large-scale structure can be extracted precisely from these images. The position and velocity of the large-scale structures can be evaluated with edge detection and correlation algorithms. The shocklet structures induced by vortices are imaged, from which the generation and development of shocklets are discussed in this paper.
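
The correlation step for tracking large-scale structures can be sketched as FFT-based cross-correlation between consecutive NPLS frames; this is a generic integer-pixel sketch, not the paper's exact algorithm:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the integer pixel displacement of frame_a relative to frame_b
    via FFT-based cross-correlation (assumes the structure is similar in both)."""
    fa = np.fft.fft2(frame_a - frame_a.mean())
    fb = np.fft.fft2(frame_b - frame_b.mean())
    corr = np.fft.ifft2(fa * np.conj(fb)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the frame size to negative values.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

Dividing the estimated shift by the interframe time then gives the convection velocity of the imaged structure.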

  3. Automated Tape Placement in Large Composite Cylinder Structure%大型复合材料筒形结构自动铺带技术

    Institute of Scientific and Technical Information of China (English)

    张蕾; 王俊锋; 刘伟; 熊艳丽; 范佳

    2011-01-01

    Automated tape placement for a large composite cylinder structure was studied using domestic T300/605 hot-melt prepreg. The tape-laying angle was optimized, with fine adjustment of the ply angle, to achieve full-coverage placement on the cylindrical structure and avoid gaps or overlaps. On this basis, an automated tape placement process experiment on a large composite cylinder structure was carried out; nondestructive testing of the test article and sampled mechanical and physical property tests were performed. The results show that the prepreg drapes well, the gaps or overlaps between tapes formed by automated placement are less than 1 mm, and the deviation of the ply angle from the theoretical angle is less than 0.2°. The forming quality of the test article is good, indicating that automated tape placement can satisfy the high-quality forming requirements of large composite structures.

  4. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  5. Conformer generation with OMEGA: algorithm and validation using high quality structures from the Protein Databank and Cambridge Structural Database.

    Science.gov (United States)

    Hawkins, Paul C D; Skillman, A Geoffrey; Warren, Gregory L; Ellingson, Benjamin A; Stahl, Matthew T

    2010-04-26

    Here, we present the algorithm and validation for OMEGA, a systematic, knowledge-based conformer generator. The algorithm consists of three phases: assembly of an initial 3D structure from a library of fragments; exhaustive enumeration of all rotatable torsions using values drawn from a knowledge-based list of angles, thereby generating a large set of conformations; and sampling of this set by geometric and energy criteria. Validation of conformer generators like OMEGA has often been undertaken by comparing computed conformer sets to experimental molecular conformations from crystallography, usually from the Protein Databank (PDB). Such an approach is fraught with difficulty due to the systematic problems with small molecule structures in the PDB. Methods are presented to identify a diverse set of small molecule structures from cocomplexes in the PDB that has maximal reliability. A challenging set of 197 high-quality, carefully selected ligand structures from well-solved models was obtained using these methods. This set will provide a sound basis for comparison and validation of conformer generators in the future. Validation results from this set are compared to the results using structures of a set of druglike molecules extracted from the Cambridge Structural Database (CSD). OMEGA is found to perform very well in reproducing the crystallographic conformations from both these data sets using two complementary metrics of success.
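
The enumerate-then-filter phase can be illustrated with a toy torsion driver; the angle library and energy function below are placeholders, not OMEGA's actual knowledge base:

```python
import itertools

# Illustrative torsion-angle library keyed by bond type (not OMEGA's list).
TORSION_ANGLES = {
    "sp3-sp3": [60.0, 180.0, 300.0],
    "sp3-sp2": [0.0, 90.0, 180.0, 270.0],
}

def enumerate_conformers(rotatable_bonds, energy, max_keep=200, window=10.0):
    """Exhaustively enumerate torsion combinations, then keep the low-energy set.

    rotatable_bonds: list of bond-type keys into TORSION_ANGLES.
    energy: callable mapping a torsion-angle tuple to a (toy) strain energy.
    """
    pool = itertools.product(*(TORSION_ANGLES[b] for b in rotatable_bonds))
    scored = sorted((energy(t), t) for t in pool)
    e_min = scored[0][0]
    # Energy-window filter, then truncate to the requested ensemble size.
    return [t for e, t in scored if e - e_min <= window][:max_keep]
```

A real generator would also apply geometric deduplication (e.g. an RMSD cutoff) before truncating, which this sketch omits.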

  6. Parametric 'route structure' generation and analysis: An interactive design system application for urban design

    NARCIS (Netherlands)

    Beirao, J.N.; Nourian Ghadikolaee, P.; Van Walderveen, B.

    2011-01-01

    Marshall (2005) developed the concept of characteristic structure of a street network as a characteristic set of indicators extracted from the street network through a process which he called “route structure analysis”. In this paper we propose an integrated process for street network generation and

  7. Submicron hollow spot generation by solid immersion lens and structured illumination

    NARCIS (Netherlands)

    Kim, M.S.; Assafrao, A.C.; Scharf, T.; Wachters, A.J.H.; Pereira, S.F.; Urbach, H.P.; Brun, M.; Olivier, S.; Nicoletti, S.; Herzig, H.P.

    2012-01-01

    We report on the experimental and numerical demonstration of immersed submicron-size hollow focused spots, generated by structuring the polarization state of an incident light beam impinging on a micro-size solid immersion lens (μ-SIL) made of SiO2. Such structured focal spots are characterized by a

  8. Completely Integrable Hamiltonian Systems Generated by Poisson Structures in R3

    Institute of Scientific and Technical Information of China (English)

    LEI De-Chao; ZHANG Xiang

    2005-01-01

    The completely integrable Hamiltonian systems have been applied intensively in physics and mechanics. We generate a family of completely integrable Hamiltonian systems from some kinds of exact Poisson structures in R3 by the realization of the Poisson algebra. Moreover, we prove that there is a Poisson algebra which cannot be realized by an exact Poisson structure.

  11. Strategies for Inclusion of Structural Mass Estimates in the Direct-Drive Generator Optimization Process

    DEFF Research Database (Denmark)

    Henriksen, Matthew Lee; Jensen, Bogi Bech

    2013-01-01

    Usage of a lookup table containing the structural mass and air gap deformation for direct-drive wind turbines of various dimensions is demonstrated. The development of the table is described in detail. Optimal generator designs while both neglecting and considering the structural mass are also...

  12. Scaling Out and Evaluation of OBSecAn, an Automated Section Annotator for Semi-Structured Clinical Documents, on a Large VA Clinical Corpus.

    Science.gov (United States)

    Tran, Le-Thuy T; Divita, Guy; Redd, Andrew; Carter, Marjorie E; Samore, Matthew; Gundlapalli, Adi V

    2015-01-01

    "Identifying and labeling" (annotating) sections improves the effectiveness of extracting information stored in the free text of clinical documents. OBSecAn, an automated ontology-based section annotator, was developed to identify and label sections of semi-structured clinical documents from the Department of Veterans Affairs (VA). In the first step, the algorithm reads and parses the document to obtain and store information regarding sections in a structure that supports the hierarchy of sections. The second stage detects and corrects errors in the parsed structure. The third stage produces the section annotation output using the final parsed tree. In this study, we present the OBSecAn method, scale it to a million-document corpus, and evaluate its performance in identifying family history sections. We identify high-yield sections for this use case from note titles such as primary care and demonstrate a median rate of 99% in correctly identifying a family history section.
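
The first parsing stage can be sketched with a simple header-pattern scanner; the regex below is a hypothetical stand-in for OBSecAn's ontology of section titles, and the output is flat rather than hierarchical:

```python
import re

# Hypothetical header pattern: a capitalized phrase ending in a colon on its
# own line. OBSecAn matches section names against an ontology instead.
SECTION_RE = re.compile(r"^(?P<name>[A-Z][A-Za-z /]+):\s*$", re.MULTILINE)

def annotate_sections(text):
    """Split a semi-structured note into (section name, body) pairs."""
    matches = list(SECTION_RE.finditer(text))
    sections = []
    for i, m in enumerate(matches):
        start = m.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(text)
        sections.append((m.group("name").strip(), text[start:end].strip()))
    return sections
```

Annotating sections first means a downstream extractor can restrict a family-history query to the "Family History" body instead of the whole note.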

  13. Structure_threader: An improved method for automation and parallelization of programs structure, fastStructure and MavericK on multicore CPU systems.

    Science.gov (United States)

    Pina-Martins, Francisco; Silva, Diogo N; Fino, Joana; Paulo, Octávio S

    2017-08-04

    Structure_threader is a program to parallelize multiple runs of genetic clustering software that does not make use of multithreading technology (structure, fastStructure and MavericK) on multicore computers. Our approach was benchmarked across multiple systems and displayed great speed improvements relative to the single-threaded implementation, scaling very close to linearly with the number of physical cores used. Structure_threader was compared to previous software written for the same task, ParallelStructure and StrAuto, and proved to be the faster wrapper (up to 25% faster) under all tested scenarios. Furthermore, Structure_threader can perform several automatic and convenient operations, assisting the user in assessing the most biologically likely value of 'K' via implementations such as the "Evanno" or "Thermodynamic Integration" tests, and automatically draw the "meanQ" plots (static or interactive) for each value of K (or even combined plots). Structure_threader is written in Python 3 and licensed under the GPLv3. It can be downloaded free of charge at https://github.com/StuntsPT/Structure_threader. © 2017 John Wiley & Sons Ltd.
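
The wrapper idea (farming out independent single-threaded runs, one per (K, replicate) pair, across cores) can be sketched as follows; `run_structure` is a hypothetical stand-in for invoking the actual clustering binary:

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

def run_structure(task):
    """Hypothetical stand-in for one single-threaded clustering run; a real
    wrapper would shell out to the structure/fastStructure/MavericK binary."""
    k, replicate = task
    return (k, replicate, f"results_K{k}_rep{replicate}")

def run_all(k_values, replicates, workers=4):
    """Run every (K, replicate) combination concurrently, one worker per run.

    Threads suffice here because the real work happens in external processes;
    the runs are independent, so speedup is near-linear in physical cores.
    """
    tasks = list(itertools.product(k_values, range(1, replicates + 1)))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_structure, tasks))
```

After all runs finish, a wrapper like this would collect the per-K likelihoods to drive an Evanno-style choice of K.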

  14. Molecular design chemical structure generation from the properties of pure organic compounds

    CERN Document Server

    Horvath, AL

    1992-01-01

    This book is a systematic presentation of the methods that have been developed for the interpretation of molecular modeling to the design of new chemicals. The main feature of the compilation is the co-ordination of the various scientific disciplines required for the generation of new compounds. The five chapters deal with such areas as structure and properties of organic compounds, relationships between structure and properties, and models for structure generation. The subject is covered in sufficient depth to provide readers with the necessary background to understand the modeling

  15. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

    Full Text Available Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought to occur to a knowledgeable reader is probably: why this old topic again? What is new to discuss? But everyone agrees that automation testing today is not what it was ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and hybrid frameworks that facilitate testing applications developed on various platforms and technologies. Automation has undoubtedly advanced, but so have the myths associated with it. The change in people's perspective and knowledge of automation has altered the terrain. This article reflects the author's points of view and experience regarding the transformation of the original myths into new versions, and how they are derived; it also provides his thoughts on the new generation of myths.

  16. Band-gap nonlinear optical generation: The structure of internal optical field and the structural light focusing

    Energy Technology Data Exchange (ETDEWEB)

    Zaytsev, Kirill I., E-mail: kirzay@gmail.com; Katyba, Gleb M.; Yakovlev, Egor V.; Yurchenko, Stanislav O., E-mail: st.yurchenko@mail.ru [Bauman Moscow State Technical University, 2nd Baumanskaya str. 5, Moscow 105005 (Russian Federation); Gorelik, Vladimir S. [P. N. Lebedev Physics Institute of the Russian Academy of Sciences, Leninskiy Prospekt 53, Moscow 119991 (Russian Federation)

    2014-06-07

    A novel approach for the enhancement of nonlinear optical effects inside globular photonic crystals (PCs) is proposed and systematically studied via numerical simulations. The enhanced optical harmonic generation is associated with two- and three-dimensional PC pumping at wavelengths corresponding to different PC band-gaps. The interactions between light and the PC are numerically simulated using the finite-difference time-domain technique for solving Maxwell's equations. Both empty and infiltrated two-dimensional PC structures are considered. A significant enhancement of harmonic generation is predicted owing to highly efficient PC pumping based on the structural light focusing effect inside the PC structure. It is shown that highly efficient harmonic generation can be attained for both empty and infiltrated two- and three-dimensional PCs. We demonstrate a twofold enhancement of the parametric decay efficiency, a one-order-of-magnitude enhancement of second harmonic generation, and a two-order-of-magnitude enhancement of third harmonic generation in PC structures in comparison to nonlinear generation in the corresponding homogeneous media. Obviously, the nonlinear processes must be allowed by the molecular symmetry. The criteria of nonlinear process efficiency are specified and calculated as a function of the pumping wavelength position relative to the PC globule diameter. The obtained criterion curves exhibit oscillating characteristics, which indicates that highly efficient generation corresponds to pumping of the various PC band-gaps. The highest efficiency of nonlinear conversion could be reached for PC pumping with femtosecond optical pulses, so that the local peak intensity would be maximized. Possible applications of the observed phenomenon are also discussed.

  17. Three-dimensional structural imaging of starch granules by second-harmonic generation circular dichroism.

    Science.gov (United States)

    Zhuo, G-Y; Lee, H; Hsu, K-J; Huttunen, M J; Kauranen, M; Lin, Y-Y; Chu, S-W

    2014-03-01

    Chirality is one of the most fundamental and essential structural properties of biological molecules. Many important biological molecules including amino acids and polysaccharides are intrinsically chiral. Conventionally, chiral species can be distinguished by interaction with circularly polarized light, and circular dichroism is one of the best-known approaches for chirality detection. As a linear optical process, circular dichroism suffers from very low signal contrast and lack of spatial resolution in the axial direction. It has been demonstrated that by incorporating nonlinear interaction with circularly polarized excitation, second-harmonic generation circular dichroism can provide much higher signal contrast. However, previous circular dichroism and second-harmonic generation circular dichroism studies are mostly limited to probing chiralities at surfaces and interfaces. It is known that second-harmonic generation, as a second-order nonlinear optical effect, provides excellent optical sectioning capability when combined with a laser-scanning microscope. In this work, we combine the axial resolving power of second-harmonic generation and the chiral sensitivity of second-harmonic generation circular dichroism to realize three-dimensional chiral detection in biological tissues. Within the point spread function of a tight focus, second-harmonic generation circular dichroism could arise from the macroscopic supramolecular packing as well as the microscopic intramolecular chirality, so our aim is to clarify the origins of the second-harmonic generation circular dichroism response in complicated three-dimensional biological systems. The sample we use is starch granules, whose second-harmonic generation-active molecule is amylopectin, with both microscopic chirality due to its helical structure and macroscopic chirality due to its crystallized packing. We found that in a starch granule, the second-harmonic generation for right-handed circularly polarized excitation is

  18. Heating automation

    OpenAIRE

    Tomažič, Tomaž

    2013-01-01

    This degree paper presents the usage and operation of peripheral devices with a microcontroller for heating automation. The main goal is to build a quality control system for heating three floors of a house and, with that, increase the efficiency of the heating devices and lower heating expenses. The system needs to control a heat pump, a furnace, a boiler pump, two floor-heating pumps, and two radiator pumps. For this work, we chose an stm32f4-discovery development kit with five temperature sensors, an LCD disp...

  19. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  20. Marketing automation

    OpenAIRE

    Raluca Dania TODOR

    2017-01-01

    The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the...

  1. State recognition of the viscoelastic sandwich structure based on the adaptive redundant second generation wavelet packet transform, permutation entropy and the wavelet support vector machine

    Science.gov (United States)

    Qu, Jinxiu; Zhang, Zhousuo; Wen, Jinpeng; Guo, Ting; Luo, Xue; Sun, Chuang; Li, Bing

    2014-08-01

    The viscoelastic sandwich structure is widely used in mechanical equipment, yet the structure always suffers from damage during long-term service. Therefore, state recognition of the viscoelastic sandwich structure is very necessary for monitoring structural health states and keeping the equipment running with high reliability. Through the analysis of vibration response signals, this paper presents a novel method for this task based on the adaptive redundant second generation wavelet packet transform (ARSGWPT), permutation entropy (PE) and the wavelet support vector machine (WSVM). In order to tackle the non-linearity existing in the structure vibration response, the PE is introduced to reveal the state changes of the structure. In the case of complex non-stationary vibration response signals, in order to obtain more effective information regarding the structural health states, the ARSGWPT, which can adaptively match the characteristics of a given signal, is proposed to process the vibration response signals, and then multiple PE features are extracted from the resultant wavelet packet coefficients. The WSVM, which can benefit from the conventional SVM as well as wavelet theory, is applied to classify the various structural states automatically. In this study, to achieve accurate and automated state recognition, the ARSGWPT, PE and WSVM are combined for signal processing, feature extraction and state classification, respectively. To demonstrate the effectiveness of the proposed method, a typical viscoelastic sandwich structure is designed, and the different degrees of preload on the structure are used to characterize the various looseness states. The test results show that the proposed method can reliably recognize the different looseness states of the viscoelastic sandwich structure, and the WSVM can achieve a better classification performance than the conventional SVM. Moreover, the superiority of the proposed ARSGWPT in processing the complex vibration response
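
The permutation entropy feature used in this record is straightforward to compute; below is a minimal Bandt-Pompe implementation (the ARSGWPT and the WSVM are beyond the scope of this sketch):

```python
import math
import numpy as np

def permutation_entropy(signal, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (Bandt-Pompe scheme).

    Each length-`order` window is reduced to its ordinal pattern; the entropy
    of the pattern distribution is normalized by log(order!) to lie in [0, 1].
    """
    counts = {}
    n = len(signal) - (order - 1) * delay
    for i in range(n):
        window = signal[i:i + order * delay:delay]
        pattern = tuple(np.argsort(window))  # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n
    entropy = -np.sum(probs * np.log(probs))
    return entropy / math.log(math.factorial(order))
```

A monotone signal yields entropy 0 (one pattern), while broadband noise approaches 1, which is what makes PE a useful state indicator for non-linear vibration responses.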

  2. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    Science.gov (United States)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  3. GenEx: A simple generator structure for exclusive processes in high energy collisions

    CERN Document Server

    Kycia, R A; Staszewski, R; Turnau, J

    2014-01-01

    A simple C++ class structure for the construction of Monte Carlo event generators which can produce unweighted events within relativistic phase space is presented. The generator is self-adapting to the provided matrix element and acceptance cuts. The program is designed especially for exclusive processes and includes, as an example of such an application, an implementation of the model for exclusive production of meson pairs $pp \rightarrow p M^+M^- p $ in high energy proton-proton collisions.

  4. Stochastic generation of explicit pore structures by thresholding Gaussian random fields

    Energy Technology Data Exchange (ETDEWEB)

    Hyman, Jeffrey D., E-mail: jhyman@lanl.gov [Program in Applied Mathematics, University of Arizona, Tucson, AZ 85721-0089 (United States); Computational Earth Science, Earth and Environmental Sciences (EES-16), and Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, NM 87544 (United States); Winter, C. Larrabee, E-mail: winter@email.arizona.edu [Program in Applied Mathematics, University of Arizona, Tucson, AZ 85721-0089 (United States); Department of Hydrology and Water Resources, University of Arizona, Tucson, AZ 85721-0011 (United States)

    2014-11-15

    We provide a description and computational investigation of an efficient method to stochastically generate realistic pore structures. Smolarkiewicz and Winter introduced this specific method in pore-resolving simulations of Darcy flows (Smolarkiewicz and Winter, 2010 [1]) without giving a complete formal description or analysis of the method, or indicating how to control the parameterization of the ensemble. We address both issues in this paper. The method consists of two steps. First, a realization of a correlated Gaussian field, or topography, is produced by convolving a prescribed kernel with an initial field of independent, identically distributed random variables. The intrinsic length scales of the kernel determine the correlation structure of the topography. Next, a sample pore space is generated by applying a level threshold to the Gaussian field realization: points are assigned to the void phase or the solid phase depending on whether the topography over them is above or below the threshold. Hence, the topology and geometry of the pore space depend on the form of the kernel and the level threshold. Manipulating these two user-prescribed quantities allows good control of pore space observables, in particular the Minkowski functionals. Extensions of the method to generate media with multiple pore structures and preferential flow directions are also discussed. To demonstrate its usefulness, the method is used to generate a pore space with physical and hydrological properties similar to a sample of Berea sandstone. Highlights: •An efficient method to stochastically generate realistic pore structures is provided. •Samples are generated by applying a level threshold to a Gaussian field realization. •Two user-prescribed quantities determine the topology and geometry of the pore space. •Multiple pore structures and preferential flow directions can be produced. •A pore space based on Berea sandstone is generated.
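
The two-step method reads almost directly as code; the Gaussian kernel and the quantile-based threshold choice below are one concrete parameterization, not necessarily the one used in the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def generate_pore_space(shape, correlation_length, porosity, seed=0):
    """Binary pore structure from a thresholded, correlated Gaussian field.

    Step 1: convolve iid standard-normal noise with a Gaussian kernel whose
    width sets the correlation length of the topography.
    Step 2: threshold the field at the (1 - porosity) quantile, so the void
    fraction matches the target porosity.
    """
    rng = np.random.default_rng(seed)
    field = gaussian_filter(rng.standard_normal(shape), sigma=correlation_length)
    threshold = np.quantile(field, 1.0 - porosity)
    return field > threshold  # True = void (pore), False = solid
```

Passing a 3-tuple `shape` produces volumetric samples, and anisotropic kernels (a per-axis `sigma`) give the preferential flow directions mentioned above.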

  5. Structure Refinement for Vulnerability Estimation Models using Genetic Algorithm Based Model Generators

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available In this paper, a method for model structure refinement is proposed and applied to the estimation of the cumulative number of vulnerabilities over time. Security as a quality characteristic is presented and defined. Vulnerabilities are defined and their importance is assessed. Existing models used for estimating the number of vulnerabilities are enumerated and their structure inspected. The principles of genetic model generators are inspected. Model structure refinement is defined in comparison with model refinement, and a method for model structure refinement is proposed. A case study shows how the method is applied and presents the obtained results.
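
A genetic model-structure generator of the kind discussed can be sketched as a search over subsets of candidate model terms; the operators and fitness function below are illustrative, not the paper's generator:

```python
import random

def refine_structure(terms, fitness, generations=50, pop_size=20, seed=1):
    """Toy genetic search over model structures (subsets of candidate terms).

    fitness: callable scoring a tuple of included terms (lower is better),
    e.g. fit error plus a complexity penalty. Selection, crossover and
    mutation are deliberately minimal.
    """
    rng = random.Random(seed)

    def random_mask():
        return tuple(rng.random() < 0.5 for _ in terms)

    def mutate(mask):  # flip one randomly chosen inclusion bit
        i = rng.randrange(len(mask))
        return mask[:i] + (not mask[i],) + mask[i + 1:]

    def crossover(a, b):  # single-point crossover of two masks
        cut = rng.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def score(mask):
        return fitness(tuple(t for t, keep in zip(terms, mask) if keep))

    pop = [random_mask() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score)
        parents = pop[: pop_size // 2]  # truncation selection keeps the elite
        children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    best = min(pop, key=score)
    return tuple(t for t, keep in zip(terms, best) if keep)
```

In a vulnerability-estimation setting the terms would be basis functions of time and the fitness a regression error on the cumulative vulnerability counts.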

  6. Generational influences in academic emergency medicine: structure, function, and culture (Part II).

    Science.gov (United States)

    Mohr, Nicholas M; Smith-Coggins, Rebecca; Larrabee, Hollynn; Dyne, Pamela L; Promes, Susan B

    2011-02-01

    Strategies for approaching generational issues that affect teaching and learning, mentoring, and technology in emergency medicine (EM) have been reported. Tactics to address generational influences involving the structure and function of the academic emergency department (ED), organizational culture, and EM schedule have not been published. Through a review of the literature and consensus by modified Delphi methodology of the Society for Academic Emergency Medicine Aging and Generational Issues Task Force, the authors have developed this two-part series to address generational issues present in academic EM. Understanding generational characteristics and mitigating strategies can address some common issues encountered in academic EM. By understanding the differences and strengths of each of the cohorts in academic EM departments and considering simple mitigating strategies, faculty leaders can maximize their cooperative effectiveness and face the challenges of a new millennium.

  7. Direct CO(2) laser-based generation of holographic structures on the surface of glass.

    Science.gov (United States)

    Wlodarczyk, Krystian L; Weston, Nicholas J; Ardron, Marcus; Hand, Duncan P

    2016-01-25

    A customized CO2 laser micromachining system was used for the generation of phase holographic structures directly on the surface of fused silica (Corning HPFS® 7980) and Borofloat® 33 (Schott AG) glass. This process used pulses of 10 µs duration and a nominal wavelength of 10.59 µm. The pulse energy delivered to the glass workpiece was controlled by an acousto-optic modulator. The laser-generated structures were optically smooth and crack free. We demonstrated their use as diffractive optical elements (DOEs), which could be exploited as anti-counterfeiting markings embedded into valuable glass components and products.

  8. Discontinuous space variant sub-wavelength structures for generating radially polarized light in visible region

    Science.gov (United States)

    Ghadyani, Z.; Dmitriev, S.; Lindlein, N.; Leuchs, G.; Rusina, O.; Harder, I.

    2011-08-01

    A discontinuous space-variant sub-wavelength dielectric grating is designed and fabricated for generating radially polarized light in the visible region (λ = 632.8 nm). The design is based on sub-wavelength silicon nitride structures introducing a retardation of π/2 by form birefringence, with space-variant orientation of the optical axis. The pattern is divided into concentric ring segments with constant structural parameters, thereby reducing electron-beam writing time significantly. The design avoids the technological challenges encountered in the generation of a continuous space-variant grating while maintaining good quality of the resulting polarization mode.

  9. Generation of Hierarchically Ordered Structures on a Polymer Film by Electrohydrodynamic Structure Formation.

    Science.gov (United States)

    Tian, Hongmiao; Shao, Jinyou; Hu, Hong; Wang, Li; Ding, Yucheng

    2016-06-29

    The extensive applications of hierarchical structures in optoelectronics, micro/nanofluidics, energy conservation, etc., have led to the development of a variety of approaches for their fabrication, which can be categorized as bottom-up or top-down strategies. Current bottom-up and top-down strategies bear a complementary relationship to each other due to their processing characteristics, i.e., the advantages of one method correspond to the disadvantages of the other, and vice versa. Here we propose a novel method based on electrohydrodynamic structure formation, aimed at combining the main advantages of the two strategies. The method allows the fabrication of a hierarchically ordered structure with well-defined geometry and high mechanical durability on a polymer film, through a simple and low-cost process also suitable for mass production. In this approach, upon application of an electric field between a template and a substrate sandwiching an air gap and a polymer film, the polymer is pulled toward the template and further flows into the template cavities, resulting in a hierarchical structure with primary and secondary patterns determined by electrohydrodynamic instability and by the template features, respectively. In this work, the fabrication of a hierarchical structure by electrohydrodynamic structure formation is studied using numerical simulations and experimental tests. The proposed method is then employed for the one-step fabrication of a hierarchical structure exhibiting a gradual transition in the periodicity of the primary structure using a slanted template and a flat polymer film, which exhibits excellent performance in controllable wettability.

  10. De Novo generation of molecular structures using optimization to select graphs on a given lattice

    DEFF Research Database (Denmark)

    Bywater, R.P.; Poulsen, Thomas Agersten; Røgen, Peter;

    2004-01-01

    A recurrent problem in organic chemistry is the generation of new molecular structures that conform to some predetermined set of structural constraints that are imposed in an endeavor to build certain required properties into the newly generated structure. An example of this is the pharmacophore...... model, used in medicinal chemistry to guide de novo design or selection of suitable structures from compound databases. We propose here a method that efficiently links up a selected number of required atom positions while at the same time directing the emergent molecular skeleton to avoid forbidden...... positions. The linkage process takes place on a lattice whose unit step length and overall geometry is designed to match typical architectures of organic molecules. We use an optimization method to select from the many different graphs possible. The approach is demonstrated in an example where crystal...

  11. Generation of 3-Dimensional Polymer Structures in Liquid Crystalline Devices using Direct Laser Writing

    OpenAIRE

    Tartan, CC; Salter, PS; Wilkinson, TD; Booth, M; Morris, S.; Elston, SJ

    2017-01-01

    Direct laser writing is a powerful nonlinear fabrication technique that provides high intensities in the focal plane of a sample to engineer multidimensional structures with submicron feature sizes. Dielectrically and optically anisotropic soft matter is of particular interest when considering a host medium in which exotic topological characteristics may be generated. In this manuscript, we adopt a novel approach for direct laser writing of polymeric structures, whereby the photo-sensitive re...

  12. An assessment of the geotemperature conditions of Bazhenov oil generation (Koltogor mezodepression and its framing structures)

    Science.gov (United States)

    Stotskiy, V. V.; Isaev, V. I.; Fomin, M. A.

    2016-09-01

    The thermal history of oil source rocks of Bazhenov deposits was reconstructed in 8 cross-sections of representative wells of Koltogor mezodepression and the structures framing it. The tectonic history and geotemperature were reconstructed for well profiles, some located in the depression zone and others in the positive structures. A comparative analysis of the geotemperature conditions accompanying the generation of Bazhenov oils was performed.

  13. The Power Structure of 2-Generator 2-Groups of Class Two

    Institute of Scientific and Technical Information of China (English)

    Friedrich L. Kluempen

    2002-01-01

    In 1934, Hall introduced regular p-groups. Any regular 2-group is abelian. A regular p-group is power closed, exponent closed, strongly semi-p-abelian, and an exact power margin group. The study of these properties and the relationships among them constitutes the investigation of the power structure of a p-group. In this paper, we classify 2-generator 2-groups of nilpotency class two according to their power structure.

  14. Generator Start-Up Automated Planning for Electric Power System Restoration

    Directory of Open Access Journals (Sweden)

    Leonel Francisco Aleaga Loaiza

    2015-06-01

    Full Text Available The choice of the generator startup sequence directly affects the generation capacity available during the power system restoration process. In this paper, a method based on automated planning is used to calculate the startup sequence of generating units in the electric power system restoration process. An action-based formulation is presented that involves several complex factors, such as the combinatorial nature of the problem, expert knowledge, several constraints and time-varying conditions that must be met, and the optimization of several numerical resources. Test results on the IEEE 39-bus system show that the method is very efficient at obtaining accurate and optimized plans to restore the generation system, using an automated planning algorithm based on heuristic search with continuous-time reasoning capabilities.
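The startup-sequencing idea in this record can be illustrated with a minimal greedy sketch (the unit data, field names, and greedy rule below are hypothetical; the paper's actual method is a full automated planner with heuristic search, not this simplification):

```python
def startup_sequence(units, horizon):
    """Greedy sketch: each step, start the highest-value unit whose cranking
    power requirement is covered by capacity already on line. Black-start
    units need no external cranking power and can go first."""
    available = 0.0  # cranking power currently available
    started = []
    # Prefer units with the best capacity-to-cranking-power ratio.
    pending = sorted(units, key=lambda u: -u["capacity"] / max(u["cranking"], 1))
    for _ in range(horizon):
        for u in pending:
            if u["black_start"] or u["cranking"] <= available:
                available += u["capacity"]
                started.append(u["name"])
                pending.remove(u)
                break
        else:
            break  # no startable unit remains this step
    return started
```

A real restoration planner must additionally respect time-varying constraints (ramp rates, maximum start intervals, load pickup), which is what motivates the action-based planning formulation.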

  15. Homology Modeling: Generating Structural Models to Understand Protein Function and Mechanism

    Science.gov (United States)

    Ramachandran, Srinivas; Dokholyan, Nikolay V.

    Geneticists and molecular and cell biologists routinely uncover new proteins important in specific biological processes/pathways. However, either the molecular functions or the functional mechanisms of many of these proteins are unclear due to a lack of knowledge of their atomic structures. Yet, determining experimental structures of many proteins presents technical challenges. The current methods for obtaining atomic-resolution structures of biomolecules (X-ray crystallography and NMR spectroscopy) require pure preparations of proteins at concentrations much higher than those at which the proteins exist in a physiological environment. Additionally, NMR has size limitations, with current technology limited to the determination of structures of proteins with masses of up to 15 kDa. For these reasons, atomic structures of many medically and biologically important proteins do not exist. However, the structures of these proteins are essential for several purposes, including in silico drug design [1], understanding the effects of disease mutations [2], and designing experiments to probe the functional mechanisms of proteins. Comparative modeling has gained importance as a tool for bridging the gap between sequence and structure space, allowing researchers to build structural models of proteins that are difficult to crystallize or for which structure determination by NMR spectroscopy is not tractable. Comparative modeling, or homology modeling, exploits the fact that two proteins whose sequences are evolutionarily connected display similar structural features [3]. Thus, the known structure of a protein (template) can be used to generate a molecular model of the protein (query) whose experimental structure is not known.

  16. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  17. Automated measurement of Drosophila wings

    Directory of Open Access Journals (Sweden)

    Mezey Jason

    2003-12-01

    Full Text Available Abstract Background Many studies in evolutionary biology and genetics are limited by the rate at which phenotypic information can be acquired. The wings of Drosophila species are a favorable target for automated analysis because of the many interesting questions in evolution and development that can be addressed with them, and because of their simple structure. Results We have developed an automated image analysis system (WINGMACHINE) that measures the positions of all the veins and the edges of the wing blade of Drosophilid flies. A video image is obtained with the aid of a simple suction device that immobilizes the wing of a live fly. Low-level processing is used to find the major intersections of the veins. High-level processing then optimizes the fit of an a priori B-spline model of wing shape. WINGMACHINE allows the measurement of 1 wing per minute, including handling, imaging, analysis, and data editing. The repeatabilities of 12 vein intersections averaged 86% in a sample of flies of the same species and sex. Comparison of 2400 wings of 25 Drosophilid species shows that wing shape is quite conservative within the group, but that almost all taxa are diagnosably different from one another. Wing shape retains some phylogenetic structure, although some species have shapes very different from closely related species. The WINGMACHINE system facilitates artificial selection experiments on complex aspects of wing shape. We selected on an index which is a function of 14 separate measurements of each wing. After 14 generations, we achieved a 15 S.D. difference between up- and down-selected treatments. Conclusion WINGMACHINE enables rapid, highly repeatable measurements of wings in the family Drosophilidae. Our approach to image analysis may be applicable to a variety of biological objects that can be represented as a framework of connected lines.

  18. RandSpg: An open-source program for generating atomistic crystal structures with specific spacegroups

    Science.gov (United States)

    Avery, Patrick; Zurek, Eva

    2017-04-01

    A new algorithm, RANDSPG, that can be used to generate trial crystal structures with specific space groups and compositions is described. The program has been designed for systems where the atoms are independent of one another, and it is therefore primarily suited towards inorganic systems. The structures that are generated adhere to user-defined constraints such as: the lattice shape and size, stoichiometry, set of space groups to be generated, and factors that influence the minimum interatomic separations. In addition, the user can optionally specify if the most general Wyckoff position is to be occupied or constrain select atoms to specific Wyckoff positions. Extensive testing indicates that the algorithm is efficient and reliable. The library is lightweight, portable, dependency-free and is published under a license recognized by the Open Source Initiative. A web interface for the algorithm is publicly accessible at http://xtalopt.openmolecules.net/randSpg/randSpg.html. RANDSPG has also been interfaced with the XTALOPT evolutionary algorithm for crystal structure prediction, and it is illustrated that the use of symmetric lattices in the first generation of randomly created individuals decreases the number of structures that need to be optimized to find the global energy minimum.
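The core rejection-sampling idea behind such trial-structure generators — place atoms at random and enforce minimum interatomic separations — can be sketched in a few lines (a simplified cubic-cell version with illustrative names; the real RANDSPG additionally enforces space-group symmetry, Wyckoff-position constraints, and periodic boundary conditions):

```python
import math
import random

def random_structure(n_atoms, cell=10.0, min_sep=2.0, max_tries=1000, seed=0):
    """Generate n_atoms random Cartesian positions inside a cubic cell,
    rejecting any candidate closer than min_sep to an already placed atom."""
    rng = random.Random(seed)  # seeded for reproducible trial structures
    coords = []
    for _ in range(n_atoms):
        for _ in range(max_tries):
            p = [rng.uniform(0.0, cell) for _ in range(3)]
            if all(math.dist(p, q) >= min_sep for q in coords):
                coords.append(p)
                break
        else:
            return None  # could not place this atom; caller retries or relaxes
    return coords
```

Returning `None` on failure mirrors the practical need, noted in the abstract, for the generator to be reliable: dense compositions can make a naive placement infeasible, which symmetry-aware placement helps avoid.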

  19. Quasi-phase-matching bandwidth for second harmonic generation in crystals with a regular domain structure

    NARCIS (Netherlands)

    Dmitriev, VG; Yur'ev, YV

    1999-01-01

    The question of the quasi-phase-matching bandwidth for second harmonic generation in crystals with a regular domain structure is considered in terms of the approximation of a constant field of the fundamental-frequency radiation and in the nonlinear conversion regime.
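In the constant-field (low-conversion) approximation that the record mentions, second-harmonic conversion efficiency follows a sinc² dependence on the residual phase mismatch, and the quasi-phase-matching bandwidth is the width of that curve. A minimal numerical sketch (symbols and names are illustrative, not taken from the paper):

```python
import math

def shg_efficiency(delta_k, length):
    """Normalized low-conversion SHG efficiency versus residual phase
    mismatch delta_k (1/m) over interaction length (m):
        eta ∝ sinc^2(delta_k * length / 2).
    For a regular domain structure of period Lambda, the residual mismatch
    includes the grating vector: delta_k = k2 - 2*k1 - 2*pi*m/Lambda."""
    x = delta_k * length / 2.0
    return 1.0 if x == 0.0 else (math.sin(x) / x) ** 2
```

Scanning `delta_k` (or, via dispersion, the fundamental wavelength) through zero and measuring where the efficiency falls to half its peak gives the quasi-phase-matching bandwidth for a given crystal length.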

  20. A general protocol for the generation of Nanobodies for structural biology

    DEFF Research Database (Denmark)

    Pardon, Els; Laeremans, Toon; Triest, Sarah

    2014-01-01

    There is growing interest in using antibodies as auxiliary tools to crystallize proteins. Here we describe a general protocol for the generation of Nanobodies to be used as crystallization chaperones for the structural investigation of diverse conformational states of flexible (membrane) proteins...

  1. Next Generation Nuclear Plant Structures, Systems, and Components Safety Classification White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Pete Jordan

    2010-09-01

    This white paper outlines the relevant regulatory policy and guidance for a risk-informed approach for establishing the safety classification of Structures, Systems, and Components (SSCs) for the Next Generation Nuclear Plant and sets forth certain facts for review and discussion in order facilitate an effective submittal leading to an NGNP Combined Operating License application under 10 CFR 52.

  2. Automated urinalysis.

    Science.gov (United States)

    Carlson, D A; Statland, B E

    1988-09-01

    Many sources of variation affect urinalysis testing. These are due to physiologic changes in the patient, therapeutic interventions, and collection, transportation, and storage of urine specimens. There are problems inherent to the manual performance of this high-volume test. Procedures are poorly standardized across the United States, and even within the same laboratory there can be significant technologist-to-technologist variability. The methods used can perturb the specimen so that recovery of analytes is less than 100 per cent in the aliquot examined. The absence of significant automation of the entire test, with the one exception of the Yellow IRIS, is unusual in the clinical laboratory setting, where most other hematology and chemistry testing has been fully automated. Our evaluation of the Yellow IRIS found that this system is an excellent way to improve the quality of the results and thereby physician acceptance. There is a positive impact for those centers using this instrument, both for the laboratory and for the hospital.

  3. Variable Structure Control of DFIG for Wind Power Generation and Harmonic Current Mitigation

    Directory of Open Access Journals (Sweden)

    BELMADANI, B.

    2010-11-01

    Full Text Available This paper focuses on wind energy conversion system (WECS) analysis and control for power generation along with problems related to the mitigation of harmonic pollution in the grid using a variable structure control of the doubly fed induction generator (DFIG). A control approach based on the so-called sliding mode control (SMC) that is both efficient and suitable is used for power generation control and harmonic-current compensation. The WECS then behaves as an active power filter (APF). The method aims at improving the overall efficiency, dynamic performance and robustness of the wind power generation system. Simulation results obtained on a 20-kW, 380-V, 50-Hz DFIG confirm the effectiveness of the proposed approach.

  4. Generation of laser-induced periodic surface structures on transparent material-fused silica

    Science.gov (United States)

    Schwarz, Simon; Rung, Stefan; Hellmann, Ralf

    2016-05-01

    We report on a comparison between simulated and experimental results for the generation of laser-induced periodic surface structures with low spatial frequency on dielectrics. Using the established efficacy factor theory extended by a Drude model, we determine the required carrier density for the generation of low spatial frequency LIPSS (LSFL) and forecast their periodicity and orientation. In a subsequent calculative step, we determine the fluence of ultrashort laser pulses necessary to excite this required carrier density in due consideration of the pulse number dependent ablation threshold. The latter calculation is based on a rate equation including photo- and avalanche ionization and derives appropriate process parameters for a selective generation of LSFL. Exemplarily, we apply this approach to the generation of LSFL on fused silica using a 1030 nm femtosecond laser. The experimental results for the orientation and spatial periodicity of LSFL reveal excellent agreement with the simulation.

  5. Combining photocatalytic hydrogen generation and capsule storage in graphene based sandwich structures

    Science.gov (United States)

    Yang, Li; Li, Xiyu; Zhang, Guozhen; Cui, Peng; Wang, Xijun; Jiang, Xiang; Zhao, Jin; Luo, Yi; Jiang, Jun

    2017-07-01

    The challenge of safe hydrogen storage has limited the practical application of solar-driven photocatalytic water splitting. It is hard to isolate hydrogen from oxygen products during water splitting to avoid unwanted reverse reaction or explosion. Here we propose a multi-layer structure where a carbon nitride is sandwiched between two graphene sheets modified by different functional groups. First-principles simulations demonstrate that such a system can harvest light and deliver photo-generated holes to the outer graphene-based sheets for water splitting and proton generation. Driven by electrostatic attraction, protons penetrate through graphene to react with electrons on the inner carbon nitride to generate hydrogen molecule. The produced hydrogen is completely isolated and stored with a high-density level within the sandwich, as no molecules could migrate through graphene. The ability of integrating photocatalytic hydrogen generation and safe capsule storage has made the sandwich system an exciting candidate for realistic solar and hydrogen energy utilization.

  6. Development of Control Structure for Hybrid Wind Generators with Active Power Capability

    Directory of Open Access Journals (Sweden)

    Mehdi Niroomand

    2014-01-01

    Full Text Available A hierarchical control structure is proposed for hybrid energy systems (HES) which consist of a wind energy system (WES) and an energy storage system (ESS). The proposed multilevel control structure consists of four blocks: reference generation and mode selection, power balancing, control algorithms, and switching control blocks. A high performance power management strategy is used for the system. Also, the proposed system is analyzed as an active power filter (APF) with the ability to control the voltage, to compensate the harmonics, and to deliver active power. The HES is designed with a parallel DC coupled structure. Simulation results are shown for verification of the theoretical analysis.

  7. Beyond the Twilight Zone: automated prediction of structural properties of proteins by recursive neural networks and remote homology information.

    Science.gov (United States)

    Mooney, Catherine; Pollastri, Gianluca

    2009-10-01

    The prediction of 1D structural properties of proteins is an important step toward the prediction of protein structure and function, not only in the ab initio case but also when homology information to known structures is available. Despite this the vast majority of 1D predictors do not incorporate homology information into the prediction process. We develop a novel structural alignment method, SAMD, which we use to build alignments of putative remote homologues that we compress into templates of structural frequency profiles. We use these templates as additional input to ensembles of recursive neural networks, which we specialise for the prediction of query sequences that show only remote homology to any Protein Data Bank structure. We predict four 1D structural properties - secondary structure, relative solvent accessibility, backbone structural motifs, and contact density. Secondary structure prediction accuracy, tested by five-fold cross-validation on a large set of proteins allowing less than 25% sequence identity between training and test set and query sequences and templates, exceeds 82%, outperforming its ab initio counterpart, other state-of-the-art secondary structure predictors (Jpred 3 and PSIPRED) and two other systems based on PSI-BLAST and COMPASS templates. We show that structural information from homologues improves prediction accuracy well beyond the Twilight Zone of sequence similarity, even below 5% sequence identity, for all four structural properties. Significant improvement over the extraction of structural information directly from PDB templates suggests that the combination of sequence and template information is more informative than templates alone.

  8. Molecular dynamics simulations with replica-averaged structural restraints generate structural ensembles according to the maximum entropy principle.

    Science.gov (United States)

    Cavalli, Andrea; Camilloni, Carlo; Vendruscolo, Michele

    2013-03-07

    In order to characterise the dynamics of proteins, a well-established method is to incorporate experimental parameters as replica-averaged structural restraints into molecular dynamics simulations. Here, we justify this approach in the case of interproton distance information provided by nuclear Overhauser effects by showing that it generates ensembles of conformations according to the maximum entropy principle. These results indicate that the use of replica-averaged structural restraints in molecular dynamics simulations, given a force field and a set of experimental data, can provide an accurate approximation of the unknown Boltzmann distribution of a system.
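The key point of this record — the restraint acts on the average over replicas, not on each replica individually — can be written as a one-line penalty (a simplified linear average with illustrative names; NOE-derived distances are in practice averaged as r⁻⁶):

```python
def replica_restraint_energy(replica_distances, d_exp, k=1.0):
    """Harmonic penalty on the replica-averaged interproton distance:
        E = k * (<d> - d_exp)^2.
    Because only the ensemble average is biased toward the experimental
    value, the restrained simulation samples the maximum-entropy
    distribution consistent with that value (the paper's central result)."""
    avg = sum(replica_distances) / len(replica_distances)
    return k * (avg - d_exp) ** 2
```

Note that individual replicas may deviate freely from `d_exp` as long as their mean matches it, which is exactly what distinguishes ensemble-averaged restraints from per-conformer restraints.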

  9. An improved method for the determination of trace levels of arsenic and antimony in geological materials by automated hydride generation-atomic absorption spectroscopy

    Science.gov (United States)

    Crock, J.G.; Lichte, F.E.

    1982-01-01

    An improved, automated method for the determination of arsenic and antimony in geological materials is described. After digestion of the material in sulfuric, nitric, hydrofluoric and perchloric acids, a hydrochloric acid solution of the sample is automatically mixed with reducing agents, acidified with additional hydrochloric acid, and treated with a sodium tetrahydroborate solution to form arsine and stibine. The hydrides are decomposed in a heated quartz tube in the optical path of an atomic absorption spectrometer. The absorbance peak height for arsenic or antimony is measured. Interferences that exist are minimized to the point where most geological materials including coals, soils, coal ashes, rocks and sediments can be analyzed directly without use of standard additions. The relative standard deviation of the digestion and the instrumental procedure is less than 2% at the 50 μg l-1 As or Sb level. The reagent-blank detection limit is 0.2 μg l-1 As or Sb. © 1982.

  10. Implementation of a semi-automated strategy for the annotation of metabolomic fingerprints generated by liquid chromatography-high resolution mass spectrometry from biological samples.

    Science.gov (United States)

    Courant, Frédérique; Royer, Anne-Lise; Chéreau, Sylvain; Morvan, Marie-Line; Monteau, Fabrice; Antignac, Jean-Philippe; Le Bizec, Bruno

    2012-11-07

    -eight compounds were identified successfully in the generated chemical phenotypes, among which five were found to be candidate markers of the administration of these anabolic agents, demonstrating the efficiency of the developed strategy to reveal and confirm metabolite structures according to the high-throughput objective expected from these integrative biological approaches.

  11. FIJI Macro 3D ART VeSElecT: 3D Automated Reconstruction Tool for Vesicle Structures of Electron Tomograms.

    Science.gov (United States)

    Kaltdorf, Kristin Verena; Schulze, Katja; Helmprobst, Frederik; Kollmannsberger, Philip; Dandekar, Thomas; Stigloher, Christian

    2017-01-01

    Automatic image reconstruction is critical to cope with steadily increasing data from advanced microscopy. We describe here the Fiji macro 3D ART VeSElecT which we developed to study synaptic vesicles in electron tomograms. We apply this tool to quantify vesicle properties (i) in embryonic Danio rerio 4 and 8 days post fertilization (dpf) and (ii) to compare Caenorhabditis elegans N2 neuromuscular junctions (NMJ) wild-type and its septin mutant (unc-59(e261)). We demonstrate development-specific and mutant-specific changes in synaptic vesicle pools in both models. We confirm the functionality of our macro by applying our 3D ART VeSElecT on zebrafish NMJ, showing smaller vesicles in 8 dpf embryos than at 4 dpf, which was validated by manual reconstruction of the vesicle pool. Furthermore, we analyze the impact of the C. elegans septin mutant unc-59(e261) on vesicle pool formation and vesicle size. Automated vesicle registration and characterization was implemented in Fiji as two macros (registration and measurement). This flexible arrangement allows in particular reducing false positives by an optional manual revision step. Preprocessing and contrast enhancement work on image stacks of 1 nm/pixel in x and y direction. Semi-automated cell selection was integrated. 3D ART VeSElecT removes interfering components, detects vesicles by 3D segmentation and calculates vesicle volume and diameter (spherical approximation, inner/outer diameter). Results are collected in color using the RoiManager plugin, including the possibility of manual removal of non-matching confounder vesicles. Detailed evaluation considered performance (detected vesicles) and specificity (true vesicles) as well as precision and recall. We furthermore show gains in segmentation and morphological filtering compared to learning-based methods. 3D ART VeSElecT shows small error rates, and it can be up to 68 times faster than manual annotation.
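The spherical approximation mentioned in this record, used to report a vesicle diameter from a segmented volume, inverts directly (a trivial sketch with an illustrative function name; the macro itself also distinguishes inner and outer diameters):

```python
import math

def sphere_diameter_from_volume(v):
    """Invert V = (4/3) * pi * (d/2)^3 to recover the diameter d of an
    equivalent sphere from a segmented vesicle volume v."""
    return 2.0 * (3.0 * v / (4.0 * math.pi)) ** (1.0 / 3.0)
```

With voxel volumes known (here, image stacks at 1 nm/pixel in x and y), summing segmented voxels and applying this inversion yields the reported per-vesicle diameters.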

  12. Automated Integrated Analog Filter Design Issues

    Directory of Open Access Journals (Sweden)

    Karolis Kiela

    2015-07-01

    Full Text Available An analysis of modern automated integrated analog circuit design methods and their use in integrated filter design is presented. Current automated analog circuit design tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are suited only to gm-C and switched-current filter topologies. Here, an algorithm for active RC integrated filter design is proposed that can be used in automated filter design. The algorithm is tested by designing an integrated active RC filter in a 65 nm CMOS technology.

  13. Detailed simulation of structural color generation inspired by the Morpho butterfly.

    Science.gov (United States)

    Steindorfer, Michael A; Schmidt, Volker; Belegratis, Maria; Stadlober, Barbara; Krenn, Joachim R

    2012-09-10

    The brilliance and variety of structural colors found in nature have become a major scientific topic in recent years. Rapid-prototyping processes enable the fabrication of such structures, but technical exploitation requires a profound understanding of the structural features and material properties governing the generation of reflected color. This paper presents an extensive simulation of the reflectance spectra of a simplified 2D Morpho butterfly wing model utilizing the finite-difference time-domain method. The structural parameters are optimized for reflection in a given spectral range. A comparison to simpler models, such as a plane dielectric layer stack, provides an understanding of the origin of the reflection behavior. We find that the wavelength of the reflection maximum is mainly set by the lateral dimensions of the structures. Furthermore, small variations of the vertical dimensions leave the spectral position of the reflectance maximum unchanged, potentially reducing grating effects.
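The simpler comparison model named in this record — a plane dielectric layer stack — can be evaluated with the standard transfer-matrix method rather than full FDTD (a minimal normal-incidence sketch; the indices and thicknesses in the usage example are illustrative, not the Morpho wing parameters):

```python
import cmath
import math

def stack_reflectance(n_list, d_list, wavelength):
    """Normal-incidence reflectance of a plane dielectric layer stack via
    the standard characteristic (transfer) matrix method.
    n_list: refractive indices [ambient, layer_1, ..., layer_N, substrate]
    d_list: thicknesses of the N inner layers (same units as wavelength)"""
    M = [[1.0, 0.0], [0.0, 1.0]]  # running product of layer matrices
    for n, d in zip(n_list[1:-1], d_list):
        phi = 2.0 * math.pi * n * d / wavelength  # phase thickness of layer
        m = [[cmath.cos(phi), 1j * cmath.sin(phi) / n],
             [1j * n * cmath.sin(phi), cmath.cos(phi)]]
        M = [[M[0][0] * m[0][0] + M[0][1] * m[1][0],
              M[0][0] * m[0][1] + M[0][1] * m[1][1]],
             [M[1][0] * m[0][0] + M[1][1] * m[1][0],
              M[1][0] * m[0][1] + M[1][1] * m[1][1]]]
    n0, ns = n_list[0], n_list[-1]
    r = ((M[0][0] + M[0][1] * ns) * n0 - (M[1][0] + M[1][1] * ns)) / \
        ((M[0][0] + M[0][1] * ns) * n0 + (M[1][0] + M[1][1] * ns))
    return abs(r) ** 2
```

For example, a quarter-wave high/low-index stack tuned to 450 nm reflects strongly at its design wavelength, which is the one-dimensional analogue of the Morpho wing's layered ridges; the lateral structure treated by FDTD is what this simple model cannot capture.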

  14. Structural Evaluation of a PGSFR Steam Generator for a Steady State Condition

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang-Gyu; Kim, Jong-Bum; Kim, Hoe-Woong; Koo, Gyeong-Hoi [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In this study, design loads for the design condition and the normal-operating steady state condition were classified, and structural analyses for each set of design loads were carried out. Structural integrity under each service level was then evaluated according to the ASME design code. The structural analyses of the steam generator are carried out and its structural integrity under the given service levels is evaluated per the ASME Code rules. The design loads for the design condition and the normal-operating steady state condition are classified, and the stresses calculated from the stress analyses are linearized and summarized by stress component. As a result, the SG structure satisfies the design criteria for both service levels. Although the steam header is designed as a thick hemisphere, its design margin is not large even for the steady state condition alone. Thus, additional evaluation considering various operating events will follow.

  15. Generation of arbitrary radially polarized array beams by modulating the correlation structure

    CERN Document Server

    Zhu, Shijun; Li, Zhenhua

    2016-01-01

    We demonstrate a convenient approach for simultaneously manipulating the amplitude and polarization of light beams by means of the modulation of the correlation structure. As an illustration, we constructed a periodic correlation structure that can generate an arbitrary radially polarized array (RPA) beam of a radial or rectangular symmetry array in the focal plane from a radially polarized (RP) beam. The physical realizability conditions for such source and the far-field beam condition are derived. It is illustrated that the beamlet shape and the state of polarization (SOP) can be effectively controlled by the initial correlation structure and the coherence width. Furthermore, by designing the source correlation structure, a tunable OK-shaped RPA beam and an optical cage are demonstrated, which can find widespread applications in non-destructive manipulation of particles and living biological cells. The arbitrariness in the design of correlation structure prompted us to find more convenient approaches for co...

  16. Automated discovery of structural features of the optic nerve head on the basis of image and genetic data

    Science.gov (United States)

    Christopher, Mark; Tang, Li; Fingert, John H.; Scheetz, Todd E.; Abramoff, Michael D.

    2014-03-01

    Evaluation of optic nerve head (ONH) structure is a commonly used clinical technique for both diagnosis and monitoring of glaucoma. Glaucoma is associated with characteristic changes in the structure of the ONH. We present a method for computationally identifying ONH structural features using both imaging and genetic data from a large cohort of participants at risk for primary open angle glaucoma (POAG). Using 1054 participants from the Ocular Hypertension Treatment Study, ONH structure was measured by application of a stereo correspondence algorithm to stereo fundus images. In addition, the genotypes of several known POAG genetic risk factors were considered for each participant. ONH structural features were discovered using both a principal component analysis approach to identify the major modes of variance within structural measurements and a linear discriminant analysis approach to capture the relationship between genetic risk factors and ONH structure. The identified ONH structural features were evaluated based on the strength of their associations with genotype and development of POAG by the end of the OHTS study. ONH structural features with strong associations with genotype were identified for each of the genetic loci considered. Several identified ONH structural features were significantly associated with POAG development, and incorporating genetic risk status was found to substantially increase performance of early POAG prediction. These results suggest that incorporating both imaging and genetic data into ONH structural modeling significantly improves the ability to explain POAG-related changes to ONH structure.

  17. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  18. The effect of gender and age structure on municipal waste generation in Poland

    Energy Technology Data Exchange (ETDEWEB)

    Talalaj, Izabela Anna, E-mail: izabela.tj@gmail.com; Walery, Maria, E-mail: m.walery@pb.edu.pl

    2015-06-15

    Highlights: • An effect of gender and age structure on municipal waste generation was presented. • The waste accumulation index is influenced by the number of unemployed women. • A greater share of women in society contributes to greater waste production. • A model describing the analyzed dependences was determined. - Abstract: In this study the effect of gender and age structure on municipal waste generation was investigated. Data from a 10-year period, 2001 to 2010, were taken into consideration. The following parameters of gender and age structure were analyzed: the numbers of men and women, the female-to-male ratio, the numbers of working-, pre-working- and post-working-age men/women, and the numbers of unemployed men/women. The results showed a strong correlation of the annual per capita waste generation rate with the number of unemployed women (r = 0.70) and the female-to-male ratio (r = 0.81). This indicates that the waste generation rate depends more on the ratio of men to women than on the absolute size of each group. Using regression analysis, a model describing the dependence between the female-to-male ratio, the number of unemployed women and waste quantity was determined. The model explains 70% of the variation in waste quantity. The results obtained can be used both to improve waste management and to gain a fuller understanding of gender-related behavior.
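
    The two-predictor regression reported above can be sketched with an ordinary least-squares fit. The fragment below is purely illustrative: the predictor and response values are synthetic stand-ins (the study's actual figures are not reproduced here), and the fitted coefficients are not those of the paper.

    ```python
    import numpy as np

    # Synthetic stand-ins for the study's predictors: female-to-male
    # ratio and number of unemployed women, for 10 annual observations.
    rng = np.random.default_rng(1)
    ratio = rng.uniform(1.00, 1.10, size=10)
    unemployed = rng.uniform(50_000, 90_000, size=10)
    # Hypothetical response: per capita waste generation (kg/year).
    waste = 120.0 * ratio + 0.001 * unemployed + rng.normal(0, 0.5, 10)

    # Least-squares fit with an intercept column, then R^2.
    X = np.column_stack([np.ones_like(ratio), ratio, unemployed])
    coef, *_ = np.linalg.lstsq(X, waste, rcond=None)
    pred = X @ coef
    r2 = 1 - np.sum((waste - pred) ** 2) / np.sum((waste - waste.mean()) ** 2)
    ```

    On real data, the R² value plays the role of the "70% of waste quantity variation" explained by the study's model.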

  19. Coupling of electromagnetic and structural dynamics for a wind turbine generator

    Science.gov (United States)

    Matzke, D.; Rick, S.; Hollas, S.; Schelenz, R.; Jacobs, G.; Hameyer, K.

    2016-09-01

    This contribution presents a model interface of a wind turbine generator to represent the reciprocal effects between the mechanical and the electromagnetic system. To this end, a multi-body-simulation (MBS) model in Simpack is set up and coupled with a quasi-static electromagnetic (EM) model of the generator in Matlab/Simulink via co-simulation. Due to a lack of data regarding the structural properties of the generator, the modal properties of the MBS model are fitted with respect to results of an experimental modal analysis (EMA) on the reference generator. The method used and the results of this approach are presented in this paper. The MBS model and the interface are set up in such a way that the EM forces can be applied to the structure and the response of the structure can be fed back to the EM model. The results of this co-simulation clearly show an influence of the feedback of the mechanical response, mainly damping in the torsional degree of freedom and effects due to eccentricity in the radial direction. The accuracy of these results will be validated via test bench measurements and presented in future work. Furthermore, it is suggested that the EM model be adjusted in future work so that transient effects are represented.

  20. Stochastic generation of explicit pore structures by thresholding Gaussian random fields

    Science.gov (United States)

    Hyman, Jeffrey D.; Winter, C. Larrabee

    2014-11-01

    We provide a description and computational investigation of an efficient method to stochastically generate realistic pore structures. Smolarkiewicz and Winter introduced this method in pore-resolving simulations of Darcy flows (Smolarkiewicz and Winter, 2010 [1]) without giving a complete formal description or analysis of the method, or indicating how to control the parameterization of the ensemble. We address both issues in this paper. The method consists of two steps. First, a realization of a correlated Gaussian field, or topography, is produced by convolving a prescribed kernel with an initial field of independent, identically distributed random variables. The intrinsic length scales of the kernel determine the correlation structure of the topography. Next, a sample pore space is generated by applying a level threshold to the Gaussian field realization: points are assigned to the void phase or the solid phase depending on whether the topography over them is above or below the threshold. Hence, the topology and geometry of the pore space depend on the form of the kernel and the level threshold. Manipulating these two user-prescribed quantities allows good control of pore space observables, in particular the Minkowski functionals. Extensions of the method to generate media with multiple pore structures and preferential flow directions are also discussed. To demonstrate its usefulness, the method is used to generate a pore space with physical and hydrological properties similar to a sample of Berea sandstone.
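
    The two-step construction described above (convolve an i.i.d. field with a kernel, then apply a level threshold) can be sketched in a few lines. This is an illustrative NumPy implementation, not the authors' code; it uses a periodic FFT convolution with a Gaussian kernel and picks the threshold as an empirical quantile so the void fraction of the realization matches a target porosity.

    ```python
    import numpy as np

    def gaussian_smooth(field, sigma):
        # Periodic convolution with a Gaussian kernel via the FFT; the
        # kernel length scale sigma sets the correlation structure.
        fx = np.fft.fftfreq(field.shape[0])
        fy = np.fft.fftfreq(field.shape[1])
        transfer = np.exp(-2 * (np.pi * sigma) ** 2
                          * (fx[:, None] ** 2 + fy[None, :] ** 2))
        return np.fft.ifft2(np.fft.fft2(field) * transfer).real

    def generate_pore_space(shape=(256, 256), corr_len=4.0, porosity=0.3, seed=0):
        # Step 1: topography = smoothed field of i.i.d. normal variables.
        rng = np.random.default_rng(seed)
        topography = gaussian_smooth(rng.standard_normal(shape), corr_len)
        # Step 2: level threshold; choosing it as an empirical quantile
        # pins the void fraction of this realization to the target porosity.
        level = np.quantile(topography, 1.0 - porosity)
        return topography > level  # True = void phase, False = solid phase

    pores = generate_pore_space()
    ```

    Varying `corr_len` and `porosity` is the control over pore-space observables that the abstract attributes to the kernel and the threshold.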

  1. Complex Domains Call for Automation but Automation Requires More Knowledge and Learning

    DEFF Research Database (Denmark)

    Madsen, Erik Skov; Mikkelsen, Lars Lindegaard

    This paper discusses productivity in relation to tacit knowledge and domain complexity. The study is based on five very different case studies; three studies are conducted in Denmark, Mexico, and the Czech Republic in relation to knowledge transfer when relocating manufacturing facilities. Two...... studies investigate operation and automation of oil and gas production in the North Sea. Semi-structured interviews, surveys, and observations are the main methods used. The paper provides a novel conceptual framework around which management may generate discussions about productivity and the need...

  2. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  3. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation and the demand for customized products and services on one side, and the need to achieve a constructive dialogue with customers, an immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to evolve substantially.

  4. Nonlinear Kinetic Development of the Weibel Instability and the generation of electrostatic coherent structures

    CERN Document Server

    Palodhi, L; Pegoraro, F; 10.1088/0741-3335/51/12/125006

    2010-01-01

    The nonlinear evolution of the Weibel instability driven by the anisotropy of the electron distribution function in a collisionless plasma is investigated in a spatially one-dimensional configuration with a Vlasov code in a two-dimensional velocity space. It is found that the electromagnetic fields generated by this instability cause a strong deformation of the electron distribution function in phase space, corresponding to highly filamented magnetic vortices. Eventually, these deformations lead to the generation of short wavelength Langmuir modes that form highly localized electrostatic structures corresponding to jumps of the electrostatic potential.

  5. Fluid-Structure Interaction Effects Modeling for the Modal Analysis of a Steam Generator Tube Bundle

    Energy Technology Data Exchange (ETDEWEB)

    Sigrist, J.F. [DCNS Prop, Serv Tech et Sci, F-44620 La Montagne, (France); Broc, D. [CEA Saclay, Serv Etud Mecan et Sism, F-91191 Gif Sur Yvette, (France)

    2009-07-01

    Seismic analysis of steam generators is of paramount importance in the safety assessment of nuclear installations. These analyses require, in particular, the calculation of the frequency, mode shape, and effective modal mass of the system eigenmodes. As fluid-structure interaction (FSI) effects can significantly affect the dynamic behavior of immersed structures, the numerical modeling of the steam generator has to take FSI into account. A complete model of a heat exchanger (including pressure vessel, tubes, and fluid) is not accessible to the engineer for industrial design studies. In the past decades, homogenization methods have been studied and developed in order to model the tubes and fluid as an equivalent continuous medium, thus avoiding the tedious task of meshing all structure and fluid sub-domains within the tube bundle. Few of these methods have nonetheless been implemented in industrial finite element codes. In a previous paper (Sigrist, 2007, 'Fluid-Structure Interaction Effects Modeling for the Modal Analysis of a Nuclear Pressure Vessel', J. Pressure Vessel Technol., 123, p. 1-6), a homogenization method was applied to an industrial case for the modal analysis of a nuclear reactor with internal structures and coupling effects modeling. The present paper aims at investigating the extension of the proposed method to the dynamic analysis of tube bundles with fluid-structure interaction modeling. The homogenization method is compared with the classical coupled method in terms of eigenfrequencies, eigenmodes, and effective modal masses. (authors)

  6. Automation of electroweak NLO corrections in general models

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Jean-Nicolas [Universitaet Wuerzburg (Germany)

    2016-07-01

    I discuss the automated generation of scattering amplitudes in general quantum field theories at next-to-leading order in perturbation theory. The work is based on Recola, a highly efficient one-loop amplitude generator for the Standard Model, which I have extended so that it can deal with general quantum field theories. Internally, Recola computes off-shell currents, and for new models new rules for off-shell currents emerge, which are derived from the Feynman rules. My work relies on the UFO format, which can be obtained from a suitable model builder, e.g. FeynRules. I have developed tools to derive the necessary counterterm structures and to perform the renormalization within Recola in an automated way. I describe the procedure using the example of the two-Higgs-doublet model.

  7. Automated Clustering of Similar Amendments

    CERN Document Server

    CERN. Geneva

    2016-01-01

    The Italian Senate is clogged by computer-generated amendments. This talk will describe a simple strategy to cluster them in an automated fashion, so that the appropriate Senate procedures can be used to get rid of them in one sweep.
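
    A minimal sketch of one such clustering strategy, assuming the computer-generated amendments differ mainly in inserted numbers (the talk does not specify its exact method, and the sample texts below are hypothetical):

    ```python
    import re
    from collections import defaultdict

    def cluster_amendments(amendments):
        """Group near-identical amendments under a normalised key.

        Machine-generated amendments typically differ only in inserted
        numbers, so replacing digit runs with a placeholder and collapsing
        whitespace makes each batch fall into a single cluster. Real
        texts may need fuzzier matching (e.g. shingling or MinHash).
        """
        clusters = defaultdict(list)
        for text in amendments:
            key = re.sub(r"\d+", "#", text.lower())
            key = re.sub(r"\s+", " ", key).strip()
            clusters[key].append(text)
        return list(clusters.values())

    # Hypothetical sample batch: two generated variants plus one distinct text.
    batch = [
        "Al comma 1, sostituire la parola 'dieci' con: 12",
        "Al comma 1, sostituire la parola 'dieci' con: 13",
        "Dopo il comma 2, aggiungere il seguente comma.",
    ]
    clusters = cluster_amendments(batch)
    ```

    Here the first two amendments share a key and collapse into one cluster, so the batch reduces to two groups that can each be handled by a single procedural motion.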

  8. Discovery of new natural products by application of X-hitting, a novel algorithm for automated comparison of full UV-spectra, combined with structural determination by NMR spectroscopy

    DEFF Research Database (Denmark)

    Larsen, Thomas Ostenfeld; Petersen, Bent O.; Duus, Jens Øllgaard;

    2005-01-01

    X-hitting, a newly developed algorithm for automated comparison of UV data, has been used for the tracking of two novel spiro-quinazoline metabolites, lapatins A (1) and B (2), in a screening study targeting quinazolines. The structures of 1 and 2 were elucidated by analysis of spectroscopic data, primarily 2D NMR....

  9. Automated Content Detection for Cassini Images

    Science.gov (United States)

    Stanboli, A.; Bue, B.; Wagstaff, K.; Altinok, A.

    2017-06-01

    NASA missions generate numerous images that are organized in increasingly large archives. Image archives are currently not searchable by image content. We present an automated content detection prototype that can enable content-based search.

  10. Generating a hexagonal lattice wave-field with a gradient basis structure

    CERN Document Server

    Kumar, Manish

    2016-01-01

    We present a new, single-step approach for generating a hexagonal lattice wave-field with a gradient local basis structure. We achieve this by coherently superposing two (or more) hexagonal lattice wave-fields which differ in their basis structures. The basis of the resultant lattice wave-field depends strongly on the relative strengths of the constituent wave-fields, and a desired spatial modulation of the basis structure is thus obtained by controlling the spatial modulation of the relative strengths of the constituent wave-fields. The experimental realization of the gradient lattice is achieved by using a phase-only spatial light modulator (SLM) in an optical 4f Fourier filter setup, where the SLM displays a numerically calculated gradient phase mask. The presented method is wavelength independent and completely scalable, making it very promising for micro-fabrication of corresponding structures.
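
    As a rough numerical sketch of the superposition idea (not the authors' optical setup): three unit-amplitude plane waves with transverse wave vectors 120° apart produce one hexagonal lattice, and blending two such lattices with a spatially varying weight yields a field whose local basis structure changes gradually across the aperture. All parameters below (grid size, lattice period, phase offset) are illustrative choices.

    ```python
    import numpy as np

    n = 256
    k = 2 * np.pi / 16.0                       # illustrative lattice wavenumber
    x = np.linspace(-64, 64, n)
    X, Y = np.meshgrid(x, x, indexing="ij")

    def hex_field(phase=0.0):
        # Sum of three unit plane waves whose wave vectors are 120 degrees
        # apart; the optional phase rotates the basis structure.
        angles = np.deg2rad([0.0, 120.0, 240.0]) + phase
        return sum(np.exp(1j * k * (np.cos(a) * X + np.sin(a) * Y))
                   for a in angles)

    # Gradient weight ramps from 0 on the left edge to 1 on the right, so
    # the local basis morphs from one lattice variant to the other.
    w = (X - X.min()) / (X.max() - X.min())
    field = (1 - w) * hex_field() + w * hex_field(phase=np.pi / 3)
    intensity = np.abs(field) ** 2
    ```

    Plotting `intensity` shows the hexagonal spot pattern with a basis that varies smoothly along x, which is the effect the abstract attributes to spatially modulating the relative strengths of the constituent wave-fields.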

  11. Generation of structurally novel short carotenoids and study of their biological activity

    DEFF Research Database (Denmark)

    Kim, Se Hyeuk; Kim, Moon S.; Lee, Bun Y.

    2016-01-01

    Recent research interest in phytochemicals has consistently driven the efforts in the metabolic engineering field toward microbial production of various carotenoids. In spite of systematic studies, the possibility of using C30 carotenoids as biologically functional compounds has not been explored...... thus far. Here, we generated 13 novel structures of C30 carotenoids and one C35 carotenoid, including acyclic, monocyclic, and bicyclic structures, through directed evolution and combinatorial biosynthesis, in Escherichia coli. Measurement of the radical scavenging activity of various C30 carotenoid...... structures revealed that acyclic C30 carotenoids showed higher radical scavenging activity than did DL-α-tocopherol. We could assume high potential biological activity of the novel structures of C30 carotenoids as well, based on the neuronal differentiation activity observed for the monocyclic C30 carotenoid...

  12. Algorithms for the automatic generation of 2-D structured multi-block grids

    Science.gov (United States)

    Schoenfeld, Thilo; Weinerfelt, Per; Jenssen, Carl B.

    1995-01-01

    Two different approaches to the fully automatic generation of structured multi-block grids in two dimensions are presented. The work aims to simplify the user interaction necessary for the definition of a multi-block grid topology. The first approach is based on an advancing front method commonly used for the generation of unstructured grids. The original algorithm has been modified toward the generation of large quadrilateral elements. The second method is based on the divide-and-conquer paradigm, with the global domain recursively partitioned into sub-domains. For either method, each of the resulting blocks is then meshed using transfinite interpolation and elliptic smoothing. The applicability of these methods to practical problems is demonstrated for typical geometries of fluid dynamics.
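
    Transfinite interpolation of a single block, the final meshing step in both approaches, reduces to the classical Coons-patch formula. Below is a minimal NumPy sketch (not the paper's implementation), assuming the four boundary curves are supplied as point arrays with matching corners:

    ```python
    import numpy as np

    def transfinite_interpolation(bottom, top, left, right):
        """Coons-patch transfinite interpolation of a structured 2-D block.

        bottom/top are (ni, 2) arrays, left/right are (nj, 2) arrays of
        boundary points; the curves must agree at the four corners.
        Returns an (ni, nj, 2) array of grid points filling the block.
        """
        ni, nj = len(bottom), len(left)
        u = np.linspace(0, 1, ni)[:, None, None]
        v = np.linspace(0, 1, nj)[None, :, None]
        B, T = bottom[:, None, :], top[:, None, :]
        L, R = left[None, :, :], right[None, :, :]
        # Bilinear blend of the boundaries minus the corner correction.
        return ((1 - v) * B + v * T + (1 - u) * L + u * R
                - (1 - u) * (1 - v) * bottom[0] - u * v * top[-1]
                - u * (1 - v) * bottom[-1] - (1 - u) * v * top[0])

    # Sanity check on the unit square: TFI reproduces the Cartesian grid.
    s = np.linspace(0, 1, 5)
    bottom = np.column_stack([s, np.zeros(5)])
    top = np.column_stack([s, np.ones(5)])
    left = np.column_stack([np.zeros(5), s])
    right = np.column_stack([np.ones(5), s])
    grid = transfinite_interpolation(bottom, top, left, right)
    ```

    With curved boundaries the same formula yields a smooth structured mesh, which elliptic smoothing then improves further.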

  13. Analysis of public consciousness structure and consideration of information supply against the nuclear power generation

    Energy Technology Data Exchange (ETDEWEB)

    Shimooka, Hiroshi [Institute of Applied Energy, Tokyo (Japan)

    2001-01-01

    The Energy Engineering Research Institute carried out six questionnaire surveys analyzing the structure of public consciousness toward nuclear power generation between fiscal years 1986 and 1999, obtaining a large amount of information on public perceptions of nuclear power. Because the JCO criticality accident of September 1999, the first accident in Japan to claim victims, was expected to change public consciousness after the fiscal year 1998 survey, the same questionnaire as in the previous fiscal year was administered to the same respondents after the accident, in order to analyze how evaluations of nuclear power generation, behavior-determining factors, and so forth were changed by the accident. In this paper, referring to the results of past questionnaires, the results and analyses of the questionnaires carried out before and after the JCO criticality accident are presented, and the provision of information is considered in their light. (G.K.)

  14. The mesh-matching algorithm: an automatic 3D mesh generator for Finite element structures

    CERN Document Server

    Couteau, B.; Payan, Yohan; Lavallée, Stéphane

    2000-01-01

    Several authors have employed Finite Element Analysis (FEA) for stress and strain analysis in orthopaedic biomechanics. Unfortunately, the use of three-dimensional models is time consuming, and consequently the number of analyses that can be performed is limited. The authors have investigated a new method allowing automatic 3D mesh generation for structures as complex as bone. This method, called the Mesh-Matching (M-M) algorithm, automatically generates customized 3D meshes of bones from an already existing model. The M-M algorithm has been used to generate FE models of ten proximal human femora from an initial one which had been experimentally validated. The new meshes demonstrated satisfactory results.

  15. Sphingomyelinase D activity in model membranes: structural effects of in situ generation of ceramide-1-phosphate

    DEFF Research Database (Denmark)

    Stock, Roberto; Brewer, Jonathan R.; Wagner, Kerstin

    2012-01-01

    membranes were studied both by in situ generation of this lipid using a recombinant sphingomyelinase D from the spider Loxosceles laeta and by pre-mixing it with sphingomyelin and cholesterol. The systems of choice were large unilamellar vesicles for bulk studies (enzyme kinetics, fluorescence spectroscopy...... sphingomyelin were examined. The findings indicate that: 1) ceramide-1-phosphate (particularly lauroyl ceramide-1-phosphate) can be incorporated into sphingomyelin bilayers in a concentration-dependent manner and generates coexistence of liquid disordered/solid ordered domains, 2) the activity...... of sphingomyelinase D is clearly influenced by the supramolecular organization of its substrate in membranes and, 3) in situ ceramide-1-phosphate generation by enzymatic activity profoundly alters the lateral structure and morphology of the target membranes....

  16. Sensitivity of echo enabled harmonic generation to sinusoidal electron beam energy structure

    Directory of Open Access Journals (Sweden)

    E. Hemsing

    2017-06-01

    Full Text Available We analytically examine the bunching factor spectrum of a relativistic electron beam with sinusoidal energy structure that then undergoes an echo-enabled harmonic generation (EEHG transformation to produce high harmonics. The performance is found to be described primarily by a simple scaling parameter. The dependence of the bunching amplitude on fluctuations of critical parameters is derived analytically, and compared with simulations. Where applicable, EEHG is also compared with high gain harmonic generation (HGHG and we find that EEHG is generally less sensitive to several types of energy structure. In the presence of intermediate frequency modulations like those produced by the microbunching instability, EEHG has a substantially narrower intrinsic bunching pedestal.

  17. Defects and defect generation in oxide layer of ion implanted silicon-silicon dioxide structures

    CERN Document Server

    Baraban, A P

    2002-01-01

    The mechanism of defect generation in the oxide layer of Si-SiO₂ structures as a result of implantation of argon ions with 130 keV energy at doses of 10¹³ to 3.2 × 10¹⁷ cm⁻² is studied. The Si-SiO₂ structures were produced by thermal oxidation of silicon at a temperature of 950 °C. The investigation was based on the electroluminescence technique and on measurement of high-frequency capacitance-voltage characteristics. An increase of the implantation dose was found to broaden the distribution of luminescence centres and to shift its maximum closer to the boundary with silicon. Ion implantation was also shown to increase the density of surface states at the Si-SiO₂ interface. A model of defect generation resulting from Ar ion implantation into Si-SiO₂ structures is proposed.

  18. Creation Myths of Generative Grammar and the Mathematics of Syntactic Structures

    Science.gov (United States)

    Pullum, Geoffrey K.

    Syntactic Structures (Chomsky [6]) is widely believed to have laid the foundations of a cognitive revolution in linguistic science, and to have presented (i) the first use in linguistics of powerful new ideas regarding grammars as generative systems, (ii) a proof that English was not a regular language, (iii) decisive syntactic arguments against context-free phrase structure grammar description, and (iv) a demonstration of how transformational rules could provide a formal solution to those problems. None of these things are true. This paper offers a retrospective analysis and evaluation.

  19. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of real-world process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...... of the system being modelled. From these calculations, a comprehensive fault tree is generated. Further, we show that annotating the model with rewards (data) allows the expected mean values of reward structures to be calculated at points of failure....

  20. Robust Automated Image Co-Registration of Optical Multi-Sensor Time Series Data: Database Generation for Multi-Temporal Landslide Detection

    Directory of Open Access Journals (Sweden)

    Robert Behling

    2014-03-01

    Full Text Available Reliable multi-temporal landslide detection over longer periods of time requires multi-sensor time series data characterized by high internal geometric stability, as well as high relative and absolute accuracy. For this purpose, a new methodology for fully automated co-registration has been developed, allowing efficient and robust spatial alignment of standard orthorectified data products originating from a multitude of optical satellite remote sensing data of varying spatial resolution. Correlation-based co-registration uses worldwide available terrain-corrected Landsat Level 1T time series data as the spatial reference, ensuring global applicability. The developed approach has been applied to a multi-sensor time series of 592 remote sensing datasets covering an approximately 12,000 km2 area in Southern Kyrgyzstan (Central Asia) strongly affected by landslides. The database contains images acquired during the last 26 years by Landsat (ETM), ASTER, SPOT and RapidEye sensors. Analysis of the spatial shifts obtained from co-registration has revealed sensor-specific alignments ranging between 5 m and more than 400 m. Overall accuracy assessment of these alignments has resulted in a high relative image-to-image accuracy of 17 m (RMSE) and a high absolute accuracy of 23 m (RMSE) for the whole co-registered database, making it suitable for multi-temporal landslide detection at a regional scale in Southern Kyrgyzstan.