WorldWideScience

Sample records for parser initially analyzes

  1. SEMSIN SEMANTIC AND SYNTACTIC PARSER

    Directory of Open Access Journals (Sweden)

    K. K. Boyarsky

    2015-09-01

    Full Text Available The paper deals with the principle of operation of the SemSin semantic and syntactic parser, which creates a dependency tree for Russian-language sentences. The parser consists of four blocks: a dictionary, a morphological analyzer, production rules, and a lexical analyzer. An important logical part of the parser is the pre-syntactical module, which harmonizes and complements the morphological analysis results, separates the text paragraphs into individual sentences, and also carries out predisambiguation. A characteristic feature of the presented parser is its open style of control: parsing is driven by a set of production rules. A varied set of commands supports both morphological and semantic-syntactic analysis of the sentence. The paper presents the order in which the rules are applied and examples of their operation. A specific feature of the rules is that decisions to establish syntactic links are made while simultaneously removing morphological and semantic ambiguity. The lexical analyzer executes the commands and rules and runs the parser in either manual or automatic text analysis mode. In the first case, the analysis is performed interactively, with step-by-step execution of the rules and inspection of the resulting parse tree. In the second case, the analysis results are written to an XML file. Active use of syntactic and semantic dictionary information significantly reduces the ambiguity of parsing. In addition to annotating the text, the parser can also be used as a tool for information extraction from natural language texts.
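
    The record above describes rules that establish a syntactic link and remove morphological ambiguity in the same step. The following minimal Python sketch illustrates that idea on an invented toy rule with English-glossed tokens; it is not SemSin's actual rule language or data model, which the abstract does not show.

        # Toy illustration (not SemSin's rule format): a single production rule that
        # both selects a morphological reading and establishes a dependency link.
        from dataclasses import dataclass

        @dataclass
        class Token:
            form: str
            readings: list        # candidate (pos, case) analyses from morphology
            head: int = -1        # index of the governing token once a link is made
            label: str = ""

        def rule_prep_governs_noun(tokens):
            """If a preposition is followed by a noun with a reading in the case the
            preposition governs, keep only that reading and attach the noun to it."""
            governs = {"in": "locative", "into": "accusative"}   # toy lexicon
            for i, tok in enumerate(tokens[:-1]):
                nxt = tokens[i + 1]
                if tok.form in governs:
                    kept = [r for r in nxt.readings if r == ("noun", governs[tok.form])]
                    if kept:
                        nxt.readings = kept                    # ambiguity removed
                        nxt.head, nxt.label = i, "prep-obj"    # link established

        tokens = [Token("in", [("prep", None)]),
                  Token("forest", [("noun", "locative"), ("noun", "accusative")])]
        rule_prep_governs_noun(tokens)
        print(tokens[1].readings, tokens[1].head, tokens[1].label)
        # [('noun', 'locative')] 0 prep-obj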

  2. LRSYS, PASCAL LR(1) Parser Generator System

    International Nuclear Information System (INIS)

    O'Hair, K.

    1991-01-01

    Description of program or function: LRSYS is a complete LR(1) parser generator system written entirely in a portable subset of Pascal. The system, LRSYS, includes a grammar analyzer program (LR) which reads a context-free (BNF) grammar as input and produces LR(1) parsing tables as output, a lexical analyzer generator (LEX) which reads regular expressions created by the REG process as input and produces lexical tables as output, and various parser skeletons that get merged with the tables to produce complete parsers (SMAKE). Current parser skeletons include Pascal, FORTRAN 77, and C. In addition, the CRAY1, DEC VAX11 version contains LRLTRAN and CFT FORTRAN 77 skeletons. Other language skeletons can easily be added to the system. LRSYS is based on the LR program (NESC Abstract 822).
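
    The generator above emits LR(1) parsing tables that are merged with a language skeleton; at run time the skeleton is essentially a table-driven shift-reduce loop. The sketch below shows such a loop in Python for the toy grammar E -> E "+" "n" | "n", with hand-written ACTION/GOTO tables standing in for what a generator like LRSYS would compute from the BNF.

        # Toy shift-reduce driver for the grammar  E -> E "+" "n" | "n".
        # The ACTION/GOTO tables are written by hand purely for illustration;
        # a generator such as LRSYS derives them automatically from the grammar.

        ACTION = {
            (0, "n"): ("shift", 1),
            (1, "+"): ("reduce", 2), (1, "$"): ("reduce", 2),   # E -> n
            (2, "+"): ("shift", 3),  (2, "$"): ("accept", None),
            (3, "n"): ("shift", 4),
            (4, "+"): ("reduce", 1), (4, "$"): ("reduce", 1),   # E -> E + n
        }
        GOTO = {(0, "E"): 2}
        RULES = {1: ("E", 3), 2: ("E", 1)}   # rule number -> (lhs, length of rhs)

        def parse(tokens):
            stack, pos, tokens = [0], 0, tokens + ["$"]
            while True:
                act = ACTION.get((stack[-1], tokens[pos]))
                if act is None:
                    return False                        # syntax error
                kind, arg = act
                if kind == "shift":
                    stack.append(arg); pos += 1
                elif kind == "reduce":
                    lhs, n = RULES[arg]
                    del stack[len(stack) - n:]          # pop the handle
                    stack.append(GOTO[(stack[-1], lhs)])
                else:
                    return True                         # accept

        print(parse(["n", "+", "n"]))   # True
        print(parse(["n", "+", "+"]))   # False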

  3. Telugu dependency parsing using different statistical parsers

    Directory of Open Access Journals (Sweden)

    B. Venkata Seshu Kumari

    2017-01-01

    Full Text Available In this paper we explore different statistical dependency parsers for parsing Telugu. We consider five popular dependency parsers, namely MaltParser, MSTParser, TurboParser, ZPar and Easy-First Parser. We experiment with different parser and feature settings and show the impact of these settings. We also provide a detailed analysis of the performance of all the parsers on the major dependency labels. We report our results on the test data of the Telugu dependency treebank provided in the ICON 2010 tools contest on Indian languages dependency parsing. We obtain state-of-the-art performance of 91.8% in unlabeled attachment score and 70.0% in labeled attachment score. To the best of our knowledge, ours is the only work that has explored all five of these popular dependency parsers and compared their performance under different feature settings for Telugu.
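
    The unlabeled and labeled attachment scores quoted above are token-level accuracies over predicted heads and dependency labels. A small sketch of the computation on toy data (not the Telugu treebank):

        # Unlabeled (UAS) and labeled (LAS) attachment scores: the share of tokens
        # whose predicted head (and, for LAS, also label) matches the gold treebank.

        def attachment_scores(gold, pred):
            # gold, pred: aligned lists of (head_index, label) per token
            total = len(gold)
            uas = sum(g[0] == p[0] for g, p in zip(gold, pred)) / total
            las = sum(g == p for g, p in zip(gold, pred)) / total
            return uas, las

        gold = [(2, "nsubj"), (0, "root"), (2, "obj")]
        pred = [(2, "nsubj"), (0, "root"), (2, "iobj")]
        print(attachment_scores(gold, pred))   # (1.0, 0.666...)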

  4. Parser Macros for Scala

    OpenAIRE

    Duhem, Martin; Burmako, Eugene

    2015-01-01

    Parser macros are a new kind of macro that allows developers to create new language constructs and to define their own syntax for using them. In this report, we present why parser macros are useful and the kinds of problems that they help to solve. We will also see how they are implemented and gain insight into how they take advantage of scala.meta, the new metaprogramming toolkit for Scala. Finally, we will discuss the current limitations of parser macros and what is left for futu...

  5. Practical, general parser combinators

    NARCIS (Netherlands)

    A. Izmaylova (Anastasia); A. Afroozeh (Ali); T. van der Storm (Tijs)

    2016-01-01

    Parser combinators are a popular approach to parsing where context-free grammars are represented as executable code. However, conventional parser combinators do not support left recursion, and can have worst-case exponential runtime. These limitations hinder the expressivity and
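
    In the combinator style referred to above, each grammar rule is an ordinary function from an input position to a set of parse results, and larger parsers are composed from smaller ones. A naive Python sketch follows; like the conventional combinators the abstract criticizes, it does not support left-recursive rules.

        # Minimal parser combinators: a parser maps (text, pos) to a list of
        # (value, next_pos) results; an empty list means failure.

        def seq(p1, p2):
            def p(text, pos):
                return [((v1, v2), j) for v1, i in p1(text, pos) for v2, j in p2(text, i)]
            return p

        def alt(p1, p2):
            def p(text, pos):
                return p1(text, pos) + p2(text, pos)
            return p

        def digit(text, pos):
            return [(text[pos], pos + 1)] if pos < len(text) and text[pos].isdigit() else []

        # digits -> digit digits | digit   (right recursion, so naive combinators cope)
        def digits(text, pos):
            return alt(seq(digit, digits), digit)(text, pos)

        print([r for r in digits("42", 0) if r[1] == 2])   # complete parses of "42"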

  6. Lazy functional parser combinators in Java

    NARCIS (Netherlands)

    Swierstra, D.S.; Dijkstra, A.

    2001-01-01

    A parser is a program that checks if a text is a sentence of the language as described by a grammar. Traditionally, the program text of a parser is generated from a grammar description, after which it is compiled and subsequently run. The language accepted by such a parser is, by the nature of

  7. Expression and cut parser for CMS event data

    International Nuclear Information System (INIS)

    Lista, Luca; Jones, Christopher D; Petrucciani, Giovanni

    2010-01-01

    We present a parser to evaluate expressions and Boolean selections that is applied to CMS event data for event filtering and analysis purposes. The parser is based on a Boost Spirit grammar definition and uses Reflex dictionaries for class introspection. The parser allows for a natural definition of expressions and cuts in users' configurations, and provides good runtime performance compared to other existing parsers.
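
    A cut parser of this kind turns user-written selection strings into predicates applied to event objects. The CMS implementation uses Boost Spirit and Reflex in C++; as a language-neutral illustration of the same idea, the Python sketch below whitelists a small expression subset and evaluates it against an object's attributes. The class and attribute names are invented for the example.

        # Illustrative cut evaluator (not the CMS implementation): parse a user cut
        # string into a Python AST, allow only a small safe subset of node types,
        # and evaluate it against an event object's attributes.

        import ast

        ALLOWED = (ast.Expression, ast.BoolOp, ast.And, ast.Or, ast.Compare,
                   ast.Name, ast.Load, ast.Constant, ast.Call, ast.UnaryOp, ast.USub,
                   ast.Gt, ast.GtE, ast.Lt, ast.LtE, ast.Eq, ast.NotEq)

        def make_cut(expr):
            tree = ast.parse(expr, mode="eval")
            for node in ast.walk(tree):
                if not isinstance(node, ALLOWED):
                    raise ValueError(f"disallowed syntax: {type(node).__name__}")
            code = compile(tree, "<cut>", "eval")
            def cut(obj):
                env = {n: getattr(obj, n) for n in dir(obj) if not n.startswith("_")}
                env["abs"] = abs
                return bool(eval(code, {"__builtins__": {}}, env))
            return cut

        class Muon:                 # hypothetical event object for the example
            pt, eta = 25.0, -1.3

        passes = make_cut("pt > 20 and abs(eta) < 2.4")
        print(passes(Muon()))       # True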

  8. The ModelCC Model-Driven Parser Generator

    Directory of Open Access Journals (Sweden)

    Fernando Berzal

    2015-01-01

    Full Text Available Syntax-directed translation tools require the specification of a language by means of a formal grammar. This grammar must conform to the specific requirements of the parser generator to be used. This grammar is then annotated with semantic actions for the resulting system to perform its desired function. In this paper, we introduce ModelCC, a model-based parser generator that decouples language specification from language processing, avoiding some of the problems caused by grammar-driven parser generators. ModelCC receives a conceptual model as input, along with constraints that annotate it. It is then able to create a parser for the desired textual syntax and the generated parser fully automates the instantiation of the language conceptual model. ModelCC also includes a reference resolution mechanism so that ModelCC is able to instantiate abstract syntax graphs, rather than mere abstract syntax trees.

  9. Towards automated processing of clinical Finnish: sublanguage analysis and a rule-based parser.

    Science.gov (United States)

    Laippala, Veronika; Ginter, Filip; Pyysalo, Sampo; Salakoski, Tapio

    2009-12-01

    In this paper, we present steps taken towards more efficient automated processing of clinical Finnish, focusing on daily nursing notes in a Finnish Intensive Care Unit (ICU). First, we analyze ICU Finnish as a sublanguage, identifying its specific features facilitating, for example, the development of a specialized syntactic analyser. The identified features include frequent omission of finite verbs, limitations in allowed syntactic structures, and domain-specific vocabulary. Second, we develop a formal grammar and a parser for ICU Finnish, thus providing better tools for the development of further applications in the clinical domain. The grammar is implemented in the LKB system in a typed feature structure formalism. The lexicon is automatically generated based on the output of the FinTWOL morphological analyzer adapted to the clinical domain. As an additional experiment, we study the effect of using Finnish constraint grammar to reduce the size of the lexicon. The parser construction thus makes efficient use of existing resources for Finnish. The grammar currently covers 76.6% of ICU Finnish sentences, producing highly accurate best-parse analyses with an F-score of 91.1%. We find that building a parser for the highly specialized domain sublanguage is not only feasible, but also surprisingly efficient, given an existing morphological analyzer with broad vocabulary coverage. The resulting parser enables a deeper analysis of the text than was previously possible.

  10. Techniques for Automated Testing of Lola Industrial Robot Language Parser

    Directory of Open Access Journals (Sweden)

    M. M. Lutovac

    2014-06-01

    Full Text Available The accuracy of parsing execution directly affects the accuracy of semantic analysis, optimization and object code generation. Therefore, parser testing represents the basis of compiler testing. It should include tests not only for correct and expected cases, but also for unexpected and invalid ones. Techniques for testing the parser, as well as algorithms and tools for test sentence generation, are discussed in this paper. A methodology for initial testing of a newly developed compiler is proposed. Generation of negative test sentences by modifying the original language grammar is described. Positive and negative test cases generated by Grow, Purdom’s algorithm with and without length control, the CDRC-P algorithm and the CDRC-P algorithm with length control are applied to the testing of the L-IRL robot programming language. For this purpose, two different tools for the generation of test sentences are used. Based on the presented analysis of possible solutions, the appropriate method can be chosen for testing the parser for smaller grammars with many recursive rules.
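
    Positive test sentences of the kind discussed above are produced by repeatedly expanding grammar rules; negative cases then come from mutating the grammar. The sketch below shows only the basic, length-limited random derivation loop on an invented toy grammar; it is not the Grow, Purdom or CDRC-P algorithm itself.

        # Random, length-limited sentence generation from a toy CFG.  Real test
        # generators (Grow, Purdom's algorithm, CDRC-P) control rule coverage much
        # more carefully; this only illustrates the basic derivation loop.

        import random

        GRAMMAR = {
            "STMT": [["move", "EXPR"], ["repeat", "NUM", "STMT"]],
            "EXPR": [["NUM"], ["NUM", "+", "EXPR"]],
            "NUM":  [["1"], ["2"], ["3"]],
        }

        def generate(symbol="STMT", depth=6):
            if symbol not in GRAMMAR:                 # terminal symbol
                return [symbol]
            options = GRAMMAR[symbol]
            # near the depth limit, take the shortest production to force termination
            rhs = min(options, key=len) if depth <= 0 else random.choice(options)
            return [tok for sym in rhs for tok in generate(sym, depth - 1)]

        random.seed(0)
        for _ in range(3):
            print(" ".join(generate()))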

  11. Cross-lingual parser selection for low-resource languages

    DEFF Research Database (Denmark)

    Agic, Zeljko

    2017-01-01

    In multilingual dependency parsing, transferring delexicalized models provides unmatched language coverage and competitive scores, with minimal requirements. Still, selecting the single best parser for any target language poses a challenge. Here, we propose a lean method for parser selection. It offers top performance, and it does so without disadvantaging the truly low-resource languages. We consistently select appropriate source parsers for our target languages in a realistic cross-lingual parsing experiment.

  12. An efficient implementation of the head-corner parser

    NARCIS (Netherlands)

    vanNoord, G

    This paper describes an efficient and robust implementation of a bidirectional, head-driven parser for constraint-based grammars. This parser is developed for the OVIS system: a Dutch spoken dialogue system in which information about public transport can be obtained by telephone. After a review of

  13. Design and Implementation of High Level Trigger Configuration Exporter and Parser

    CERN Document Server

    Abdulwahhab, Husam

    2015-01-01

    This paper serves as a description of the project that was developed at CMS during the summer. The initial task of the project was the design, implementation and development of a configuration exporter from an Oracle database to a Python file. Next was the development of a parser that reads all the necessary information from the Python configuration file created by the exporter and stores it in memory as an efficient, easy-to-access and easy-to-manipulate cache. The final task of the project was the implementation of a system that handles requests from the client, a web interface, and replies with the appropriate data organized in a way that can be viewed on the interface.

  14. A memory-based shallow parser for spoken Dutch

    NARCIS (Netherlands)

    Canisius, S.V.M.; van den Bosch, A.; Decadt, B.; Hoste, V.; De Pauw, G.

    2004-01-01

    We describe the development of a Dutch memory-based shallow parser. The availability of large treebanks for Dutch, such as the one provided by the Spoken Dutch Corpus, allows memory-based learners to be trained on examples of shallow parsing taken from the treebank, and act as a shallow parser after

  15. Manual for the ELL(2) parser generator and tree generator generator

    OpenAIRE

    Heckmann, Reinhold

    1986-01-01

    Regular right part grammars extended by tree generator specifications are interpreted by a combined parser generator and tree generator that produces an ELL(2) parser. This parser is able to translate programs of the specified language into abstract syntax trees according to the tree specifications in the generator input.

  16. Historical Post Office Directory Parser (POD Parser) Software From the AddressingHistory Project

    Directory of Open Access Journals (Sweden)

    Nicola Osborne

    2014-07-01

    Full Text Available The POD Parser is Python software for parsing the OCR’d (optical character recognised) text of digitised historical Scottish Post Office Directories (PODs) to produce a consistent structured format for the data and for geocoding each address. The software was developed as part of the AddressingHistory project which sought to combine digitised historic directories with digitised and georeferenced historic maps. The software has potential for reuse in multiple research contexts where historical post office directory data is relevant, and is therefore particularly of use in historical research into social, economic or demographic trends. The POD Parser is currently designed for use with Scottish directories but is extensible, perhaps with some adaptation, to use with other similarly formatted materials such as the English Trade Directories.
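
    The core parsing step is splitting each OCR'd directory line into structured fields before geocoding. The sketch below assumes a simplified, comma-separated "surname, forename, occupation, address" layout invented for illustration; the real POD Parser deals with much messier OCR output and varying directory layouts.

        # Hypothetical directory-line splitter: records in the invented layout
        # "surname, forename, occupation, address" are parsed into named fields and
        # written out as CSV.  The real POD Parser handles far messier OCR output.

        import csv, io, re

        LINE = re.compile(r"^(?P<surname>[^,]+),\s*(?P<forename>[^,]+),"
                          r"\s*(?P<occupation>[^,]+),\s*(?P<address>.+)$")

        def parse_entry(line):
            m = LINE.match(line.strip())
            return m.groupdict() if m else None

        entries = ["Smith, John, baker, 12 High Street",
                   "Macdonald, Agnes, dressmaker, 3 Canongate"]
        rows = [e for e in (parse_entry(l) for l in entries) if e]

        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=["surname", "forename", "occupation", "address"])
        writer.writeheader()
        writer.writerows(rows)
        print(out.getvalue())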

  17. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.

  18. The CLaC Discourse Parser at CoNLL-2015

    OpenAIRE

    Laali, Majid; Davoodi, Elnaz; Kosseim, Leila

    2017-01-01

    This paper describes our submission (kosseim15) to the CoNLL-2015 shared task on shallow discourse parsing. We used the UIMA framework to develop our parser and used ClearTK to add machine learning functionality to the UIMA framework. Overall, our parser achieves a result of 17.3 F1 on the identification of discourse relations on the blind CoNLL-2015 test set, ranking in sixth place.

  19. Policy-Based Management Natural Language Parser

    Science.gov (United States)

    James, Mark

    2009-01-01

    The Policy-Based Management Natural Language Parser (PBEM) is a rules-based approach to enterprise management that can be used to automate certain management tasks. This parser simplifies the management of a given endeavor by establishing policies to deal with situations that are likely to occur. Policies are operating rules that can be referred to as a means of maintaining order, security, consistency, or other ways of successfully furthering a goal or mission. PBEM provides a way of managing configuration of network elements, applications, and processes via a set of high-level rules or business policies rather than managing individual elements, thus switching the control to a higher level. This software allows unique management rules (or commands) to be specified and applied to a cross-section of the Global Information Grid (GIG). This software embodies a parser that is capable of recognizing and understanding conversational English. Because all possible dialect variants cannot be anticipated, a unique capability was developed that parses based on conversational intent rather than the exact way the words are used. This software can increase productivity by enabling a user to converse with the system in conversational English to define network policies. PBEM can be used in both manned and unmanned science-gathering programs. Because policy statements can be domain-independent, this software can be applied equally to a wide variety of applications.

  20. ImageParser: a tool for finite element generation from three-dimensional medical images

    Directory of Open Access Journals (Sweden)

    Yamada T

    2004-10-01

    Full Text Available Abstract Background The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image, partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. Results The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with customer-defined segmentation information.

  1. Designing a Constraint Based Parser for Sanskrit

    Science.gov (United States)

    Kulkarni, Amba; Pokar, Sheetal; Shukl, Devanand

    Verbal understanding (śābdabodha) of any utterance requires the knowledge of how words in that utterance are related to each other. Such knowledge is usually available in the form of cognition of grammatical relations. Generative grammars describe how a language codes these relations. Thus the knowledge of what information various grammatical relations convey is available from the generation point of view and not the analysis point of view. In order to develop a parser based on any grammar one should then know precisely the semantic content of the grammatical relations expressed in a language string, the clues for extracting these relations and finally whether these relations are expressed explicitly or implicitly. Based on the design principles that emerge from this knowledge, we model the parser as finding a directed tree, given a graph with nodes representing the words and edges representing the possible relations between them. Further, we also use the Mīmāṃsā constraint of ākāṅkṣā (expectancy) to rule out non-solutions and sannidhi (proximity) to prioritize the solutions. We have implemented a parser based on these principles and its performance was found to be satisfactory, giving us the confidence to extend its functionality to handle complex sentences.

  2. Thermo-msf-parser: an open source Java library to parse and visualize Thermo Proteome Discoverer msf files.

    Science.gov (United States)

    Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart

    2011-08-05

    The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented in a graphical user interface, allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.

  3. On Minimizing Training Corpus for Parser Acquisition

    National Research Council Canada - National Science Library

    Hwa, Rebecca

    2001-01-01

    In this work, we consider selecting training examples with the tree-entropy metric. Our goal is to assess how well this selection technique can be applied for training different types of parsers...

  4. A Protocol for Annotating Parser Differences. Research Report. ETS RR-16-02

    Science.gov (United States)

    Bruno, James V.; Cahill, Aoife; Gyawali, Binod

    2016-01-01

    We present an annotation scheme for classifying differences in the outputs of syntactic constituency parsers when a gold standard is unavailable or undesired, as in the case of texts written by nonnative speakers of English. We discuss its automated implementation and the results of a case study that uses the scheme to choose a parser best suited…

  5. Parser Adaptation for Social Media by Integrating Normalization

    NARCIS (Netherlands)

    van der Goot, Rob; van Noord, Gerardus

    This work explores normalization for parser adaptation. Traditionally, normalization is used as a separate pre-processing step. We show that integrating the normalization model into the parsing algorithm is beneficial. This way, multiple normalization candidates can be leveraged, which improves

  6. "cba to check the spelling" investigating parser performance on discussion forum posts

    OpenAIRE

    Foster, Jennifer

    2010-01-01

    We evaluate the Berkeley parser on text from an online discussion forum. We evaluate the parser output with and without gold tokens and spellings (using Sparseval and Parseval), and we compile a list of problematic phenomena for this domain. The Parseval f-score for a small development set is 77.56. This increases to 80.27 when we apply a set of simple transformations to the input sentences and to the Wall Street Journal (WSJ) training sections.

  7. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm)

    Directory of Open Access Journals (Sweden)

    Christian Lester D. Gimeno

    2017-11-01

    Full Text Available This research study focused on the development of software that helps students design, write, validate and run their pseudocode in a semi Integrated Development Environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer (lexer), a syntax analyzer (parser) using a recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads the pseudocode source as a sequence of symbols or characters called lexemes. The lexemes are then analyzed by the lexer, which matches patterns for valid tokens and passes them to the syntax analyzer or parser. The syntax analyzer or parser takes those valid tokens and builds meaningful commands, using the recursive descent parsing algorithm, in the form of an abstract syntax tree. The generation of the abstract syntax tree is based on the grammar rules created by the researcher, expressed in Extended Backus-Naur Form. The interpreter takes the generated abstract syntax tree and starts the evaluation, or interpretation, to produce the pseudocode output. The software was evaluated using white-box testing by several ICT professionals and black-box testing by several computer science students, based on the International Organization for Standardization (ISO) 9126 software quality standards. The overall results of the evaluation, both for white-box and black-box testing, were described as “Excellent” in terms of functionality, reliability, usability, efficiency, maintainability and portability.
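
    As a small illustration of the lexer-to-recursive-descent pipeline described above, the sketch below tokenizes an arithmetic expression, builds an abstract syntax tree with one function per grammar rule, and evaluates it. The grammar is a toy one, not the study's pseudocode grammar.

        # Recursive descent for: expr -> term (("+"|"-") term)* ; term -> NUMBER
        # Each grammar rule becomes one function; the result is a nested AST.

        import re

        def tokenize(src):
            return re.findall(r"\d+|[+\-]", src)

        class Parser:
            def __init__(self, tokens):
                self.toks, self.pos = tokens, 0

            def peek(self):
                return self.toks[self.pos] if self.pos < len(self.toks) else None

            def expr(self):
                node = self.term()
                while self.peek() in ("+", "-"):
                    op = self.toks[self.pos]; self.pos += 1
                    node = (op, node, self.term())       # left-associative AST
                return node

            def term(self):
                tok = self.peek()
                if tok is None or not tok.isdigit():
                    raise SyntaxError(f"number expected, got {tok!r}")
                self.pos += 1
                return int(tok)

        def evaluate(node):
            if isinstance(node, int):
                return node
            op, left, right = node
            return evaluate(left) + evaluate(right) if op == "+" else evaluate(left) - evaluate(right)

        ast = Parser(tokenize("12 + 3 - 4")).expr()
        print(ast, "=", evaluate(ast))   # ('-', ('+', 12, 3), 4) = 11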

  8. Combining shallow and deep processing for a robust, fast, deep-linguistic dependency parser

    OpenAIRE

    Schneider, G

    2004-01-01

    This paper describes Pro3Gres, a fast, robust, broad-coverage parser that delivers deep-linguistic grammatical relation structures as output, which are closer to predicate-argument structures and more informative than pure constituency structures. The parser stays as shallow as is possible for each task, combining shallow and deep-linguistic methods by integrating chunking and by expressing the majority of long-distance dependencies in a context-free way. It combines statistical and rule-base...

  9. Processing the ITU vocabulary: revisions and adaptations to the Pisa syntactic-semantic parser

    OpenAIRE

    Peters, Carol; Federici, Stefano; Montemagni, Simonetta; Calzolari, Nicoletta

    1993-01-01

    The first version of the Pisa syntactic-semantic parser was described in detail in Deliverable 4, Section 2 and Appendices 2, 3, and 4. The scope of this report is to discuss the testing of the parser on the sample set of vocabulary which has been selected from the ITU Corpus (see Deliverable 6.1) and to illustrate the revisions and extensions that are now being implemented. The report therefore concentrates on presenting analysis and extraction activities. We need to specify clearly all the k...

  10. Constructing a Parser for a given Deterministic Syntax Graph: A ...

    African Journals Online (AJOL)

    The rules of graph to program translation were laid down and followed religiously to arrive at the required program. ... The last part of the work is the translation from BNF into parser driven data structures that is ...

  11. On different approaches to syntactic analysis into bi-lexical dependencies: An empirical comparison of direct, PCFG-based, and HPSG-based parsers

    Directory of Open Access Journals (Sweden)

    Angelina Ivanova

    2016-04-01

    Full Text Available We compare three different approaches to parsing into syntactic, bi-lexical dependencies for English: a ‘direct’ data-driven dependency parser, a statistical phrase structure parser, and a hybrid, ‘deep’ grammar-driven parser. The analyses from the latter two are post-converted to bi-lexical dependencies. Through this ‘reduction’ of all three approaches to syntactic dependency parsers, we determine empirically what performance can be obtained for a common set of dependency types for English, across a broad variety of domains. In doing so, we observe what trade-offs apply along three dimensions: accuracy, efficiency, and resilience to domain variation. Our results suggest that the hand-built grammar in one of our parsers helps in both accuracy and cross-domain parsing performance, but these accuracy gains do not necessarily translate to improvements in the downstream task of negation resolution.

  12. MASCOT HTML and XML parser: an implementation of a novel object model for protein identification data.

    Science.gov (United States)

    Yang, Chunguang G; Granite, Stephen J; Van Eyk, Jennifer E; Winslow, Raimond L

    2006-11-01

    Protein identification using MS is an important technique in proteomics as well as a major generator of proteomics data. We have designed the protein identification data object model (PDOM) and developed a parser based on this model to facilitate the analysis and storage of these data. The parser works with HTML or XML files saved or exported from MASCOT MS/MS ions search in peptide summary report or MASCOT PMF search in protein summary report. The program creates PDOM objects, eliminates redundancy in the input file, and has the capability to output any PDOM object to a relational database. This program facilitates additional analysis of MASCOT search results and aids the storage of protein identification information. The implementation is extensible and can serve as a template to develop parsers for other search engines. The parser can be used as a stand-alone application or can be driven by other Java programs. It is currently being used as the front end for a system that loads HTML and XML result files of MASCOT searches into a relational database. The source code is freely available at http://www.ccbm.jhu.edu and the program uses only free and open-source Java libraries.

  13. "gnparser": a powerful parser for scientific names based on Parsing Expression Grammar.

    Science.gov (United States)

    Mozzherin, Dmitry Y; Myltsev, Alexander A; Patterson, David J

    2017-05-26

    Scientific names in biology act as universal links. They allow us to cross-reference information about organisms globally. However, variations in spelling of scientific names greatly diminish their ability to interconnect data. Such variations may include abbreviations, annotations, misspellings, etc. Authorship is a part of a scientific name and may also differ significantly. To match all possible variations of a name we need to divide them into their elements and classify each element according to its role. We refer to this as 'parsing' the name. Parsing categorizes a name's elements into those that are stable and those that are prone to change. Names are matched first by combining them according to their stable elements. Matches are then refined by examining their varying elements. This two-stage process dramatically improves the number and quality of matches. It is especially useful for the automatic data exchange within the context of "Big Data" in biology. We introduce Global Names Parser (gnparser). It is a Java tool written in the Scala language (a language for the Java Virtual Machine) to parse scientific names. It is based on a Parsing Expression Grammar. The parser can be applied to scientific names of any complexity. It assigns a semantic meaning (such as genus name, species epithet, rank, year of publication, authorship, annotations, etc.) to all elements of a name. It is able to work with nested structures as in the names of hybrids. gnparser performs with ≈99% accuracy and processes 30 million name-strings/hour per CPU thread. The gnparser library is compatible with Scala, Java, R, Jython, and JRuby. The parser can be used as a command line application, as a socket server, a web-app or as a RESTful HTTP-service. It is released under an open-source MIT license. Global Names Parser (gnparser) is a fast, high precision tool for biodiversity informaticians and biologists working with large numbers of scientific names. It can replace expensive and error
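
    To show the kind of element classification the abstract describes (genus, species epithet, authorship, year), here is a toy regex-based splitter for simple binomials. It is only illustrative; gnparser itself uses a full Parsing Expression Grammar and handles far more variation (hybrids, annotations, misspellings).

        # Toy scientific-name splitter: "Genus epithet Author, 1758" is divided into
        # semantically labelled parts.  gnparser's real PEG covers many more forms;
        # this handles only simple binomials.

        import re

        NAME = re.compile(r"^(?P<genus>[A-Z][a-z]+)\s+(?P<epithet>[a-z]+)"
                          r"(?:\s+(?P<authorship>[A-Z][^\d,]*?))?(?:,?\s*(?P<year>\d{4}))?$")

        def parse_name(s):
            m = NAME.match(s.strip())
            return {k: v for k, v in m.groupdict().items() if v} if m else None

        print(parse_name("Homo sapiens Linnaeus, 1758"))
        # {'genus': 'Homo', 'epithet': 'sapiens', 'authorship': 'Linnaeus', 'year': '1758'}
        print(parse_name("Parus major"))
        # {'genus': 'Parus', 'epithet': 'major'}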

  14. The power and limits of a rule-based morpho-semantic parser.

    Science.gov (United States)

    Baud, R H; Rassinoux, A M; Ruch, P; Lovis, C; Scherrer, J R

    1999-01-01

    The advent of the Electronic Patient Record (EPR) implies an increasing amount of medical texts readily available for processing, as soon as convenient tools are made available. The chief application is text analysis, from which one can derive other disciplines like indexing for retrieval, knowledge representation, translation and inferencing for medical intelligent systems. Prerequisites for a convenient analyzer of medical texts are: building the lexicon, developing a semantic representation of the domain, having a large corpus of texts available for statistical analysis, and finally mastering robust and powerful parsing techniques in order to satisfy the constraints of the medical domain. This article aims at presenting an easy-to-use parser ready to be adapted to different settings. It describes its power together with its practical limitations as experienced by the authors.

  15. Polish Semantic Parser

    Directory of Open Access Journals (Sweden)

    Agnieszka Grudzinska

    2000-01-01

    Full Text Available The amount of information transferred by computers grows very rapidly, outgrowing the average person's capacity to absorb it. This implies a growing demand for computer programs able to perform an introductory classification, or even selection, of information directed to a particular receiver. Due to the complexity of the problem, we restricted it to understanding short newspaper notes. Among the many conceptions formulated so far, the conceptual dependency theory worked out by Roger Schank has been chosen. It is a formal language for describing the semantics of an utterance, integrated with a text understanding algorithm. A substantial part of each text transformation system is a semantic parser of the Polish language. It is the module which, first and alone, has access to the text in Polish. It plays the role of the element that finds relations between words of the Polish language and the formal notation. It translates sentences written in the language used by people into the language of the theory. The presented structure of knowledge units and the shape of the understanding process algorithms are universal by virtue of the theory. On the other hand, the defined knowledge units and the rules used in the algorithms are only examples, because they are constructed in order to understand short newspaper notes.

  16. Parsley: a Command-Line Parser for Astronomical Applications

    Science.gov (United States)

    Deich, William

    Parsley is a sophisticated keyword + value parser, packaged as a library of routines that offers an easy method for providing command-line arguments to programs. It makes it easy for the user to enter values, and it makes it easy for the programmer to collect and validate the user's entries. Parsley is tuned for astronomical applications: for example, dates entered in Julian, Modified Julian, calendar, or several other formats are all recognized without special effort by the user or by the programmer; angles can be entered using decimal degrees or dd:mm:ss; time-like intervals as decimal hours, hh:mm:ss, or a variety of other units. Vectors of data are accepted as readily as scalars.
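
    Accepting the same quantity in several notations, as described above, comes down to a small value parser behind each keyword. The sketch below (independent of Parsley's actual API) accepts an angle either as decimal degrees or as dd:mm:ss and returns decimal degrees.

        # Accept an angle either as decimal degrees ("12.5") or as sexagesimal
        # dd:mm:ss ("12:30:00", "-3:15:00") and return decimal degrees.

        def parse_angle(text):
            text = text.strip()
            if ":" not in text:
                return float(text)
            sign = -1.0 if text.startswith("-") else 1.0
            d, m, s = (float(x) for x in text.lstrip("+-").split(":"))
            return sign * (d + m / 60.0 + s / 3600.0)

        for value in ("12.5", "12:30:00", "-3:15:00"):
            print(value, "->", parse_angle(value))
        # 12.5 -> 12.5, 12:30:00 -> 12.5, -3:15:00 -> -3.25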

  17. GBParsy: A GenBank flatfile parser library with high speed

    Directory of Open Access Journals (Sweden)

    Kim Yeon-Ki

    2008-07-01

    Full Text Available Abstract Background The GenBank flatfile (GBF) format is one of the most popular sequence file formats because of its detailed sequence features and ease of readability. To use the data in the file by a computer, a parsing process is required, performed according to a given grammar for the sequence and the description in a GBF. Currently, several parser libraries for the GBF have been developed. However, with the accumulation of DNA sequence information from eukaryotic chromosomes, parsing a eukaryotic genome sequence with these libraries inevitably takes a long time, due to the large GBF file and its correspondingly large genomic nucleotide sequence and related feature information. Thus, there is a significant need to develop a parsing program with high speed and efficient use of system memory. Results We developed GBParsy, a C language-based library that parses GBF files. The parsing speed was maximized by using content-specified functions in place of regular expressions, which are flexible but slow. In addition, we optimized an algorithm related to memory usage so that it also increased parsing performance and the efficiency of memory usage. GBParsy is at least 5–100× faster than current parsers in benchmark tests. Conclusion GBParsy is estimated to extract annotated information from almost 100 Mb of a GenBank flatfile of chromosomal sequence information within a second. Thus, it should be used for a variety of applications such as on-time visualization of a genome at a web site.
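
    A GenBank flatfile interleaves keyword-tagged header lines, a FEATURES table and an ORIGIN block holding the numbered sequence, and a parser walks these sections line by line. The minimal Python sketch below pulls out just the locus name, feature keys and sequence from a tiny hand-made record; GBParsy itself does this in optimized C and covers the full format.

        # Minimal GenBank flatfile walk: collect the locus name, feature keys and
        # the sequence from the ORIGIN block.  Real parsers (GBParsy, Biopython)
        # additionally handle locations, qualifiers, multi-record files, etc.

        def parse_gbf(lines):
            record = {"locus": None, "features": [], "sequence": []}
            in_origin = False
            for line in lines:
                if line.startswith("LOCUS"):
                    record["locus"] = line.split()[1]
                elif line.startswith("ORIGIN"):
                    in_origin = True
                elif line.startswith("//"):
                    break
                elif in_origin:
                    # sequence lines: strip position numbers and spaces
                    record["sequence"].append("".join(c for c in line if c.isalpha()))
                elif line[:1].isspace():
                    key = line[:21].strip()        # feature keys live in columns 6-20
                    if key and not key.startswith("/"):
                        record["features"].append(key)
            record["sequence"] = "".join(record["sequence"])
            return record

        sample = [
            "LOCUS       TESTSEQ       12 bp    DNA     linear   PLN 01-JAN-2008",
            "FEATURES             Location/Qualifiers",
            "     source          1..12",
            '                     /organism="Oryza sativa"',
            "     gene            1..12",
            "ORIGIN",
            "        1 atgcatgcat gc",
            "//",
        ]
        print(parse_gbf(sample))
        # {'locus': 'TESTSEQ', 'features': ['source', 'gene'], 'sequence': 'atgcatgcatgc'}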

  18. A domain specific language for the automatic generation of parsers classes for text protocols

    OpenAIRE

    Kistel, Thomas; Vandenhouten, Ralf

    2014-01-01

    ABNF is a language for defining a formal syntax for technical specifications and is frequently used to describe the textual messages of Internet protocols. The options for automatically generating parser classes from ABNF specifications are currently very limited, since ABNF describes only the transfer syntax and production rules of text messages. The lack of variable-name definitions within an ABNF specification does not make it possible to meaningfully...

  19. ACPYPE - AnteChamber PYthon Parser interfacE.

    Science.gov (United States)

    Sousa da Silva, Alan W; Vranken, Wim F

    2012-07-23

    ACPYPE (or AnteChamber PYthon Parser interfacE) is a wrapper script around the ANTECHAMBER software that simplifies the generation of small molecule topologies and parameters for a variety of molecular dynamics programmes like GROMACS, CHARMM and CNS. It is written in the Python programming language and was developed as a tool for interfacing with other Python based applications such as the CCPN software suite (for NMR data analysis) and ARIA (for structure calculations from NMR data). ACPYPE is open source code, under GNU GPL v3, and is available as a stand-alone application at http://www.ccpn.ac.uk/acpype and as a web portal application at http://webapps.ccpn.ac.uk/acpype. We verified the topologies generated by ACPYPE in three ways: by comparing with default AMBER topologies for standard amino acids; by generating and verifying topologies for a large set of ligands from the PDB; and by recalculating the structures for 5 protein-ligand complexes from the PDB. ACPYPE is a tool that simplifies the automatic generation of topology and parameters in different formats for different molecular mechanics programmes, including calculation of partial charges, while being object oriented for integration with other applications.

  20. The Accelerator Markup Language and the Universal Accelerator Parser

    International Nuclear Information System (INIS)

    Sagan, D.; Forster, M.; Cornell U., LNS; Bates, D.A.; LBL, Berkeley; Wolski, A.; Liverpool U.; Cockcroft Inst. Accel. Sci. Tech.; Schmidt, F.; CERN; Walker, N.J.; DESY; Larrieu, T.; Roblin, Y.; Jefferson Lab; Pelaia, T.; Oak Ridge; Tenenbaum, P.; Woodley, M.; SLAC; Reiche, S.; UCLA

    2006-01-01

    A major obstacle to collaboration on accelerator projects has been the sharing of lattice description files between modeling codes. To address this problem, a lattice description format called Accelerator Markup Language (AML) has been created. AML is based upon the standard eXtensible Markup Language (XML) format; this provides the flexibility for AML to be easily extended to satisfy changing requirements. In conjunction with AML, a software library, called the Universal Accelerator Parser (UAP), is being developed to speed the integration of AML into any program. The UAP is structured to make it relatively straightforward (by giving appropriate specifications) to read and write lattice files in any format. This will allow programs that use the UAP code to read a variety of different file formats. Additionally, this will greatly simplify conversion of files from one format to another. Currently, besides AML, the UAP supports the MAD lattice format

  1. A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.

    1998-04-01

    This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs

  2. ACPYPE - AnteChamber PYthon Parser interfacE

    Directory of Open Access Journals (Sweden)

    Sousa da Silva Alan W

    2012-07-01

    Full Text Available Abstract Background ACPYPE (or AnteChamber PYthon Parser interfacE) is a wrapper script around the ANTECHAMBER software that simplifies the generation of small molecule topologies and parameters for a variety of molecular dynamics programmes like GROMACS, CHARMM and CNS. It is written in the Python programming language and was developed as a tool for interfacing with other Python based applications such as the CCPN software suite (for NMR data analysis) and ARIA (for structure calculations from NMR data). ACPYPE is open source code, under GNU GPL v3, and is available as a stand-alone application at http://www.ccpn.ac.uk/acpype and as a web portal application at http://webapps.ccpn.ac.uk/acpype. Findings We verified the topologies generated by ACPYPE in three ways: by comparing with default AMBER topologies for standard amino acids; by generating and verifying topologies for a large set of ligands from the PDB; and by recalculating the structures for 5 protein–ligand complexes from the PDB. Conclusions ACPYPE is a tool that simplifies the automatic generation of topology and parameters in different formats for different molecular mechanics programmes, including calculation of partial charges, while being object oriented for integration with other applications.

  3. MLS-Net and SecureParser®: A New Method for Securing and Segregating Network Data

    Directory of Open Access Journals (Sweden)

    Robert A. Johnson

    2008-10-01

    Full Text Available A new method of network security and virtualization is presented which allows the consolidation of multiple network infrastructures dedicated to single security levels or communities of interest onto a single, virtualized network. An overview of the state of the art of network security protocols is presented, including the use of SSL, IPSec, and HAIPE IS, followed by a discussion of the SecureParser® technology and MLS-Net architecture, which in combination allow the virtualization of local network enclaves.

  4. Experienced physicians benefit from analyzing initial diagnostic hypotheses

    Science.gov (United States)

    Bass, Adam; Geddes, Colin; Wright, Bruce; Coderre, Sylvain; Rikers, Remy; McLaughlin, Kevin

    2013-01-01

    Background Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on diagnostic performance of nephrologists and nephrology residents. Methods We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. Results After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of final diagnoses (31.1% vs. 65.6%, p < 0.001, and 40.0% vs. 70.0%, p < 0.001, respectively). We found a significant interaction between experience and analytic processing strategy (p = 0.002): nephrology residents had significantly increased odds of diagnostic success when using scheme-inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.007), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.20). Discussion Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation of the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy, or the changes in knowledge structure with experience. PMID:26451203

  5. Experienced physicians benefit from analyzing initial diagnostic hypotheses

    Directory of Open Access Journals (Sweden)

    Adam Bass

    2013-03-01

    Full Text Available Background: Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on diagnostic performance of nephrologists and nephrology residents. Methods: We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. Results: After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of final diagnoses (31.1% vs. 65.6%, p < 0.001, and 40.0% vs. 70.0%, p < 0.001, respectively). We found a significant interaction between experience and analytic processing strategy (p = 0.002): nephrology residents had significantly increased odds of diagnostic success when using scheme-inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.007), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.2). Discussion: Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation of the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy, or the changes in knowledge structure with experience.

  6. Analyzing Digital Library Initiatives: 5S Theory Perspective

    Science.gov (United States)

    Isah, Abdulmumin; Mutshewa, Athulang; Serema, Batlang; Kenosi, Lekoko

    2015-01-01

    This article traces the historical development of Digital Libraries (DLs), examines some DL initiatives in developed and developing countries and uses 5S Theory as a lens for analyzing the focused DLs. The analysis shows that present-day systems, in both developed and developing nations, are essentially content and user centric, with low level…

  7. jmzReader: A Java parser library to process and visualize multiple text and XML-based mass spectrometry data formats.

    Science.gov (United States)

    Griss, Johannes; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2012-03-01

    We here present the jmzReader library: a collection of Java application programming interfaces (APIs) to parse the most commonly used peak list and XML-based mass spectrometry (MS) data formats: DTA, MS2, MGF, PKL, mzXML, mzData, and mzML (based on the already existing API jmzML). The library is optimized to be used in conjunction with mzIdentML, the recently released standard data format for reporting protein and peptide identifications, developed by the HUPO proteomics standards initiative (PSI). mzIdentML files do not contain spectra data but contain references to different kinds of external MS data files. As a key functionality, all parsers implement a common interface that supports the various methods used by mzIdentML to reference external spectra. Thus, when developing software for mzIdentML, programmers no longer have to support multiple MS data file formats but only this one interface. The library (which includes a viewer) is open source and, together with detailed documentation, can be downloaded from http://code.google.com/p/jmzreader/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Microsoft Biology Initiative: .NET Bioinformatics Platform and Tools

    Science.gov (United States)

    Diaz Acosta, B.

    2011-01-01

    The Microsoft Biology Initiative (MBI) is an effort in Microsoft Research to bring new technology and tools to the area of bioinformatics and biology. This initiative is comprised of two primary components, the Microsoft Biology Foundation (MBF) and the Microsoft Biology Tools (MBT). MBF is a language-neutral bioinformatics toolkit built as an extension to the Microsoft .NET Framework—initially aimed at the area of Genomics research. Currently, it implements a range of parsers for common bioinformatics file formats; a range of algorithms for manipulating DNA, RNA, and protein sequences; and a set of connectors to biological web services such as NCBI BLAST. MBF is available under an open source license, and executables, source code, demo applications, documentation and training materials are freely downloadable from http://research.microsoft.com/bio. MBT is a collection of tools that enable biology and bioinformatics researchers to be more productive in making scientific discoveries.

  9. ULTRA: Universal Grammar as a Universal Parser.

    Science.gov (United States)

    Medeiros, David P

    2018-01-01

    A central concern of generative grammar is the relationship between hierarchy and word order, traditionally understood as two dimensions of a single syntactic representation. A related concern is directionality in the grammar. Traditional approaches posit process-neutral grammars, embodying knowledge of language, put to use with infinite facility both for production and comprehension. This has crystallized in the view of Merge as the central property of syntax, perhaps its only novel feature. A growing number of approaches explore grammars with different directionalities, often with more direct connections to performance mechanisms. This paper describes a novel model of universal grammar as a one-directional, universal parser. Mismatch between word order and interpretation order is pervasive in comprehension; in the present model, word order is language-particular and interpretation order (i.e., hierarchy) is universal. These orders are not two dimensions of a unified abstract object (e.g., precedence and dominance in a single tree); rather, both are temporal sequences, and UG is an invariant real-time procedure (based on Knuth's stack-sorting algorithm) transforming word order into hierarchical order. This shift in perspective has several desirable consequences. It collapses linearization, displacement, and composition into a single performance process. The architecture provides a novel source of brackets (labeled unambiguously and without search), which are understood not as part-whole constituency relations, but as storage and retrieval routines in parsing. It also explains why neutral word order within single syntactic cycles avoids 213-like permutations. The model identifies cycles as extended projections of lexical heads, grounding the notion of phase. This is achieved with a universal processor, dispensing with parameters. The empirical focus is word order in noun phrases. This domain provides some of the clearest evidence for 213-avoidance as a cross
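
    The parsing procedure described above builds on Knuth's single-stack sorting: items are pushed in surface (word) order and popped whenever the top of the stack is the next item needed in the target (interpretation) order. A compact sketch of that procedure follows; word orders that a single stack cannot reorder are rejected, which is the permutation-avoidance restriction the abstract alludes to.

        # Knuth-style single-stack sorting: read items in surface order, push each
        # onto a stack, and pop to the output whenever the stack top is the next
        # item needed in the target order.  Returns the reordered output, or None
        # if this word order cannot be mapped onto the target order with one stack.

        def stack_sort(order):
            target = iter(sorted(order))
            need = next(target)
            stack, output = [], []
            for item in order:
                stack.append(item)
                while stack and stack[-1] == need:
                    output.append(stack.pop())
                    need = next(target, None)
            return output if not stack else None

        print(stack_sort([2, 1, 4, 3]))   # [1, 2, 3, 4]  (sortable order)
        print(stack_sort([2, 3, 1]))      # None          (blocked by the stack)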

  10. Development of an event-driven parser for active document and web-based nuclear design system

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yong Soo

    2005-02-15

    Nuclear design works consist of extensive unit job modules in which many computer codes are used. Each unit module requires time-consuming and error-prone input preparation, code run, output analysis and quality assurance process. The task of safety evaluation of the reload core is especially man-power intensive and time-consuming due to the large amount of calculations and data exchanges. The purpose of this study is to develop a new nuclear design system called Innovative Design Processor (IDP) in order to minimize human effort and maximize design quality and productivity, and then to achieve an ultimately optimized core loading pattern. Two new basic principles of IDP are the document-oriented design and the web based design. Contrary to the conventional code-oriented or procedure-oriented design, the document-oriented design is human-oriented in that the final document is automatically prepared with complete analysis, tables and plots, if the designer writes a design document called an active document and feeds it to a parser. This study defined a number of active components and developed an event-driven parser for the active document in HTML (Hypertext Markup Language) or XML (Extensible Markup Language). The active documents can be created on the web, which is another framework of IDP. Using a proper mix of server-side and client-side programming under the HAMP (HP-UX/Apache/MySQL/PHP) environment, the document-oriented design process on the web is modeled as a design wizard for the designer's convenience and platform independence. This automation using IDP was tested for the reload safety evaluation of Korea Standard Nuclear Power Plant (KSNP) type PWRs. Great time saving was confirmed and IDP can complete several-month jobs in a few days. A more optimized core loading pattern, therefore, can be obtained since it takes little time to do the reload safety evaluation tasks with several core loading pattern candidates. Since the technology is also applicable to

  11. Development of an event-driven parser for active document and web-based nuclear design system

    International Nuclear Information System (INIS)

    Park, Yong Soo

    2005-02-01

    Nuclear design works consist of extensive unit job modules in which many computer codes are used. Each unit module requires time-consuming and error-prone input preparation, code run, output analysis and quality assurance process. The task of safety evaluation of the reload core is especially man-power intensive and time-consuming due to the large amount of calculations and data exchanges. The purpose of this study is to develop a new nuclear design system called Innovative Design Processor (IDP) in order to minimize human effort and maximize design quality and productivity, and then to achieve an ultimately optimized core loading pattern. Two new basic principles of IDP are the document-oriented design and the web based design. Contrary to the conventional code-oriented or procedure-oriented design, the document-oriented design is human-oriented in that the final document is automatically prepared with complete analysis, tables and plots, if the designer writes a design document called an active document and feeds it to a parser. This study defined a number of active components and developed an event-driven parser for the active document in HTML (Hypertext Markup Language) or XML (Extensible Markup Language). The active documents can be created on the web, which is another framework of IDP. Using a proper mix of server-side and client-side programming under the HAMP (HP-UX/Apache/MySQL/PHP) environment, the document-oriented design process on the web is modeled as a design wizard for the designer's convenience and platform independence. This automation using IDP was tested for the reload safety evaluation of Korea Standard Nuclear Power Plant (KSNP) type PWRs. Great time saving was confirmed and IDP can complete several-month jobs in a few days. A more optimized core loading pattern, therefore, can be obtained since it takes little time to do the reload safety evaluation tasks with several core loading pattern candidates. Since the technology is also applicable to the
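
    The event-driven parser described in the two records above reacts to markup elements (the "active components") as they are encountered in the design document, rather than building the whole document tree first. The Python sketch below shows that style with the standard library expat parser on an invented fragment; the element and attribute names are illustrative only, not the actual IDP markup.

        # Event-driven (SAX-style) walk over an XML "active document".  Handlers
        # fire as tags are encountered; here we just collect hypothetical <calc>
        # components the way an active-document processor might queue calculations.
        # The element names are made up for the sketch, not the IDP's real markup.

        import xml.parsers.expat

        queued = []

        def start_element(name, attrs):
            if name == "calc":                    # an "active component"
                queued.append((attrs.get("code"), attrs.get("input")))

        doc = """<design>
          <title>Reload safety evaluation</title>
          <calc code="depletion" input="cycle12.inp"/>
          <calc code="power-dist" input="cycle12.map"/>
        </design>"""

        parser = xml.parsers.expat.ParserCreate()
        parser.StartElementHandler = start_element
        parser.Parse(doc, True)
        print(queued)   # [('depletion', 'cycle12.inp'), ('power-dist', 'cycle12.map')]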

  12. Analyzing Ambiguity of Context-Free Grammars

    DEFF Research Database (Denmark)

    Brabrand, Claus; Giegerich, Robert; Møller, Anders

    2010-01-01

    It has been known since 1962 that the ambiguity problem for context-free grammars is undecidable. Ambiguity in context-free grammars is a recurring problem in language design and parser generation, as well as in applications where grammars are used as models of real-world physical structures. We observe that there is a simple linguistic characterization of the grammar ambiguity problem, and we show how to exploit this by presenting an ambiguity analysis framework based on conservative language approximations. As a concrete example, we propose a technique based on local regular approximations...

  13. Analyzing Ambiguity of Context-Free Grammars

    DEFF Research Database (Denmark)

    Brabrand, Claus; Giegerich, Robert; Møller, Anders

    2007-01-01

    It has been known since 1962 that the ambiguity problem for context-free grammars is undecidable. Ambiguity in context-free grammars is a recurring problem in language design and parser generation, as well as in applications where grammars are used as models of real-world physical structures. We observe that there is a simple linguistic characterization of the grammar ambiguity problem, and we show how to exploit this to conservatively approximate the problem based on local regular approximations and grammar unfoldings. As an application, we consider grammars that occur in RNA analysis...

  14. Análisis, optimización, mejora y aplicación del análisis de dependencias. Analyzing, enhancing, optimizing and applying dependency analysis

    OpenAIRE

    Ballesteros Martínez, Miguel

    2012-01-01

    Statistical dependency parsers have been greatly improved in recent years. This has been possible thanks to machine-learning-based systems that show high accuracy. These systems make it possible to generate parsers for any language for which a suitable corpus is available, without requiring a great effort from the end user. MaltParser is one of these systems. In this thesis we have used state-of-the-art systems to show a...

  15. The time course of processing difficulties with non-WH extraction in Danish

    DEFF Research Database (Denmark)

    Poulsen, Mads

    Danish, a V2-language, allows liberal extraction of non-WH elements from a variety of clause types to sentence-initial position, e.g. from relative clauses. Extractions from complement clauses, see (1), are more frequent (Jensen 2001) than extractions from adverbial clauses as in (2). (1)    De......-verbs), the parser doesn’t expect to have linked all mentioned arguments with the main verb at the clause boundary. For clauses with intransitive verbs, on the other hand, the parser expects to be able to find a role for all constituents within the clause, i.e. the parser should experience difficulties...

  16. Governance of extended lifecycle in large-scale eHealth initiatives: analyzing variability of enterprise architecture elements.

    Science.gov (United States)

    Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika

    2013-01-01

    The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals using interrelated projects related to the national ePrescription in Finland.

  17. Task planning systems with natural language interface

    International Nuclear Information System (INIS)

    Kambayashi, Shaw; Uenaka, Junji

    1989-12-01

    In this report, a natural language analyzer and two different task planning systems are described. In 1988, we introduced a Japanese language analyzer named CS-PARSER for the input interface of the task planning system in the Human Acts Simulation Program (HASP). For the purpose of high-speed analysis, we modified the dictionary system of the CS-PARSER by describing it in the C language. The new dictionary system is found to be very useful for high-speed analysis and efficient maintenance of the dictionary. For the study of the task planning problem, we modified a story generating system named Micro TALE-SPIN to generate stories written in Japanese sentences. We also constructed a planning system with a natural language interface by using the CS-PARSER. Task planning processes and related knowledge bases of these systems are explained. A concept design for a new task planning system will also be discussed, based on evaluations of the above-mentioned systems. (author)

  18. Is human sentence parsing serial or parallel? Evidence from event-related brain potentials.

    Science.gov (United States)

    Hopf, Jens-Max; Bader, Markus; Meng, Michael; Bayer, Josef

    2003-01-01

    In this ERP study we investigate the processes that occur in syntactically ambiguous German sentences at the point of disambiguation. Whereas most psycholinguistic theories agree on the view that processing difficulties arise when parsing preferences are disconfirmed (so-called garden-path effects), important differences exist with respect to theoretical assumptions about the parser's recovery from a misparse. A key distinction can be made between parsers that compute all alternative syntactic structures in parallel (parallel parsers) and parsers that compute only a single preferred analysis (serial parsers). To distinguish empirically between parallel and serial parsing models, we compare ERP responses to garden-path sentences with ERP responses to truly ungrammatical sentences. Garden-path sentences contain a temporary and ultimately curable ungrammaticality, whereas truly ungrammatical sentences remain so permanently--a difference which gives rise to different predictions in the two classes of parsing architectures. At the disambiguating word, ERPs in both sentence types show negative shifts of similar onset latency, amplitude, and scalp distribution in an initial time window between 300 and 500 ms. In a following time window (500-700 ms), the negative shift to garden-path sentences disappears at right central parietal sites, while it continues in permanently ungrammatical sentences. These data are taken as evidence for a strictly serial parser. The absence of a difference in the early time window indicates that temporary and permanent ungrammaticalities trigger the same kind of parsing responses. Later differences can be related to successful reanalysis in garden-path but not in ungrammatical sentences. Copyright 2003 Elsevier Science B.V.

  19. Solving LR Conflicts Through Context Aware Scanning

    Science.gov (United States)

    Leon, C. Rodriguez; Forte, L. Garcia

    2011-09-01

    This paper presents a new algorithm to compute the exact list of tokens expected by any LR syntax analyzer at any point of the scanning process. The lexer can, at any time, compute the exact list of valid tokens to return only tokens in this set. In the case that more than one matching token is in the valid set, the lexer can resort to a nested LR parser to disambiguate. Allowing nested LR parsing requires some slight modifications when building the LR parsing tables. We also show how LR parsers can parse conflictive and inherently ambiguous languages using a combination of nested parsing and context aware scanning. These expanded lexical analyzers can be generated from high level specifications.
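
    A minimal sketch of the core idea follows, under the assumption of a hand-written toy ACTION table (a real system derives the table from the grammar and must also follow chains of reductions, which this sketch omits): the scanner asks the parser which tokens have an action in the current state and only tries those patterns.

```python
# Toy illustration of context-aware scanning with a hand-written LR ACTION
# table for a tiny expression grammar; states and token names are invented.
import re

ACTION = {
    (0, "NUM"): ("shift", 2), (0, "ID"): ("shift", 3),
    (2, "PLUS"): ("shift", 4), (2, "$"): ("accept", None),
    (3, "PLUS"): ("shift", 4), (3, "$"): ("accept", None),
    (4, "NUM"): ("shift", 2), (4, "ID"): ("shift", 3),
}

PATTERNS = {"NUM": re.compile(r"\d+"), "ID": re.compile(r"[a-z]+"),
            "PLUS": re.compile(r"\+")}

def expected_tokens(state):
    """Exactly the tokens for which the automaton has an action in `state`."""
    return {token for (s, token) in ACTION if s == state}

def scan(text, pos, state):
    """Longest match at `pos`, considering only tokens valid in `state`."""
    candidates = [(m.end() - pos, name, m.group())
                  for name in expected_tokens(state) & PATTERNS.keys()
                  for m in [PATTERNS[name].match(text, pos)] if m]
    return max(candidates) if candidates else None

print(expected_tokens(0))          # {'NUM', 'ID'}
print(scan("12+ab", 0, state=0))   # (2, 'NUM', '12')
```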

  20. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Science.gov (United States)

    Kizach, Johannes; Nyvad, Anne Mette; Christensen, Ken Ramshøj

    2013-01-01

    Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  1. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Directory of Open Access Journals (Sweden)

    Johannes Kizach

    Full Text Available Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  2. A Semantic Analysis Method for Scientific and Engineering Code

    Science.gov (United States)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
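
    The following toy sketch conveys the flavour of such semantic declarations, using physical units as the declared property and a single consistency check. The declaration syntax and the product-only checker are inventions for illustration, far simpler than the multiple expert parsers described above.

```python
# Toy flavour of declaration-driven semantic checking: primitive variables
# carry unit declarations and a tiny checker verifies that an assignment is
# dimensionally consistent.  Syntax and scope are invented for illustration.
from collections import Counter

def parse_unit(text):
    """'m/s' -> {'m': 1, 's': -1}; only products and quotients of base units."""
    exponents, sign = Counter(), 1
    for part in text.replace("/", " / ").replace("*", " * ").split():
        if part in ("*", "/"):
            sign = -1 if part == "/" else 1
        else:
            exponents[part] += sign
    return exponents

DECLS = {"v": parse_unit("m/s"), "t": parse_unit("s"), "d": parse_unit("m")}

def product_units(*names):
    total = Counter()
    for name in names:
        total.update(DECLS[name])
    return Counter({u: n for u, n in total.items() if n != 0})

# check the statement  d = v * t
lhs, rhs = DECLS["d"], product_units("v", "t")
print("consistent" if lhs == rhs else f"unit mismatch: {dict(lhs)} vs {dict(rhs)}")
```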

  3. Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives

    DEFF Research Database (Denmark)

    Angelini, Marco; Ferro, Nicola; Larsen, Birger

    2014-01-01

    Evaluation initiatives have been widely credited with contributing highly to the development and advancement of information access systems, by providing a sustainable platform for conducting the very demanding activity of comparable experimental evaluation in a large scale. Measuring the impact...

  4. An acetone breath analyzer using cavity ringdown spectroscopy: an initial test with human subjects under various situations

    International Nuclear Information System (INIS)

    Wang, Chuji; Surampudi, Anand B

    2008-01-01

    We have developed a portable breath acetone analyzer using cavity ringdown spectroscopy (CRDS). The instrument was initially tested by measuring the absorbance of breath gases at a single wavelength (266 nm) from 32 human subjects under various conditions. A background subtraction method, implemented to obtain absorbance differences, from which an upper limit of breath acetone concentration was obtained, is described. The upper limits of breath acetone concentration in the four Type 1 diabetes (T1D) subjects, tested after a 14 h overnight fast, range from 0.80 to 3.97 parts per million by volume (ppmv), higher than the mean acetone concentration (0.49 ppmv) in non-diabetic healthy breath reported in the literature. The preliminary results show that the instrument can tell distinctive differences between the breath from individuals who are healthy and those with T1D. On-line monitoring of breath gases in healthy people post-exercise, post-meals and post-alcohol-consumption was also conducted. This exploratory study demonstrates the first CRDS-based acetone breath analyzer and its potential application for point-of-care, non-invasive, diabetic monitoring

  5. Grammar Engineering Support for Precedence Rule Recovery and Compatibility Checking

    NARCIS (Netherlands)

    Bouwers, E.; Bravenboer, M.; Visser, E.

    2007-01-01

    A wide range of parser generators are used to generate parsers for programming languages. The grammar formalisms that come with parser generators provide different approaches for defining operator precedence. Some generators (e.g. YACC) support precedence declarations, others require the grammar to

  6. Transparent parsing : Head-driven processing of verb-final structures

    NARCIS (Netherlands)

    Mulders, I.C.M.C.

    2002-01-01

    The conceptual guideline underlying this study is that the goal of processing theory should be to construct a transparent parser. A transparent parser is a parser which employs only properties and relations that are available in the grammar, without resorting to processing-specific notions. Under

  7. The Mystro system: A comprehensive translator toolkit

    Science.gov (United States)

    Collins, W. R.; Noonan, R. E.

    1985-01-01

    Mystro is a system that facilitates the construction of compilers, assemblers, code generators, query interpreters, and similar programs. It provides features to encourage the use of iterative enhancement. Mystro was developed in response to the needs of NASA Langley Research Center (LaRC) and enjoys a number of advantages over similar systems. There are other programs available that can be used in building translators. These typically build parser tables, usually supply the source of a parser and parts of a lexical analyzer, but provide little or no aid for code generation. In general, only the front end of the compiler is addressed. Mystro, on the other hand, emphasizes tools for both ends of a compiler.

  8. Inducing Head-Driven PCFGs with Latent Heads: Refining a Tree-bank Grammar for Parsing

    NARCIS (Netherlands)

    Prescher, D.; Gama, J.; Camacho, R.; Brazdil, P.; Jorge, A.; Torgo, L.

    2005-01-01

    Although state-of-the-art parsers for natural language are lexicalized, it was recently shown that an accurate unlexicalized parser for the Penn tree-bank can be simply read off a manually refined tree-bank. While lexicalized parsers often suffer from sparse data, manual mark-up is costly and

  9. Syntactic Reconstruction and Reanalysis, Semantic Dead Ends, and Prefrontal Cortex

    DEFF Research Database (Denmark)

    Christensen, Ken Ramshøj

    2010-01-01

    have been to Paris than […] to Oslo), using pseudo-elliptical structures (‘dead ends’) as control (More people have been to Paris than I have). (ii) Reanalysis in the face of structural ambiguity in syntactic ‘garden paths’, where the parser initially assigns an incorrect structure and is forced...

  10. MAD parsing and conversion code

    International Nuclear Information System (INIS)

    Mokhov, Dmitri N.

    2000-01-01

    The authors describe design and implementation issues while developing an embeddable MAD language parser. Two working applications of the parser are also described, namely, a MAD-to-C++ converter and a C++ factory. The report contains some relevant details about the parser and examples of converted code. It also describes some of the problems that were encountered and the solutions found for them.

  11. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    It also describes the working of the Punjabi Shallow Parser used to process the input sentence, which performs the tasks of tokenization, morphological analysis, part-of-speech tagging and chunking. This paper also considers the seven phases used in the process of EnConversion of input Punjabi text to a UNL representation.

  12. Dependency Grammar in Lithuanian Language Processing

    OpenAIRE

    Grigonytė, Gintarė

    2006-01-01

    Language processing for Lithuanian is still at quite an early stage, and there is therefore a high demand for automated tools such as taggers, parsers, word sense disambiguators, etc. During the last 10 years only a few researchers have attempted to create a parser for the Lithuanian language, and none of these parsers is in practical use today. The process of designing and implementing a rule-based parser for the Lithuanian language is presented in this paper. Rules and constraints of the formal grammar foll...

  13. 40 CFR 86.1322-84 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3... columns is one form of corrective action which may be taken.) (b) Initial and periodic calibration. Prior... calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with...

  14. Parsing Universal Dependencies without training

    DEFF Research Database (Denmark)

    Martínez Alonso, Héctor; Agic, Zeljko; Plank, Barbara

    2017-01-01

    We present UDP, the first training-free parser for Universal Dependencies (UD). Our algorithm is based on PageRank and a small set of specific dependency head rules. UDP features two-step decoding to guarantee that function words are attached as leaf nodes. The parser requires no training, and it is competitive with a delexicalized transfer system. UDP offers a linguistically sound unsupervised alternative to cross-lingual parsing for UD. The parser has very few parameters and is distinctly robust to domain change across languages.
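
    A loose, illustrative sketch of the ranking idea follows (not the published UDP head rules, POS handling, or two-step decoding): score words with PageRank over a simple adjacency graph and attach every word to the nearest higher-ranked word, with the top-ranked word as root.

```python
# PageRank-style word ranking over an adjacency-window graph, then greedy
# head attachment to the nearest higher-ranked word.  Window size and the
# attachment rule are illustrative assumptions, not UDP's decoding.
def pagerank(n, edges, damping=0.85, iters=50):
    rank = [1.0 / n] * n
    out = [sum(1 for a, _ in edges if a == i) or 1 for i in range(n)]
    for _ in range(iters):
        new = [(1 - damping) / n] * n
        for a, b in edges:
            new[b] += damping * rank[a] / out[a]
        rank = new
    return rank

def parse(words, window=2):
    n = len(words)
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and abs(i - j) <= window]
    rank = pagerank(n, edges)
    heads = []
    for i in range(n):
        better = [j for j in range(n) if rank[j] > rank[i]]
        heads.append(min(better, key=lambda j: abs(i - j)) if better else -1)  # -1 = root
    return list(zip(words, heads, rank))

for word, head, score in parse(["the", "parser", "needs", "no", "training"]):
    print(f"{word:10s} head={head:2d} rank={score:.3f}")
```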

  15. Investigating AI with BASIC and Logo: Helping the Computer to Understand INPUTS.

    Science.gov (United States)

    Mandell, Alan; Lucking, Robert

    1988-01-01

    Investigates using the microcomputer to develop a sentence parser to simulate intelligent conversation used in artificial intelligence applications. Compares the ability of LOGO and BASIC for this use. Lists and critiques several LOGO and BASIC parser programs. (MVL)

  16. Defense Acquisition Initiatives Review: An Assessment of Extant Initiatives

    National Research Council Canada - National Science Library

    Porter, Gene; Berteau, David; Christle, Gary; Mandelbaum, Jay; Diehl, Richard

    2005-01-01

    ...) to identify and analyze a subset of initiatives that the team finds to have potential for near term management emphasis that could provide visible improvements to the much criticized Defense acquisition system...

  17. Subdomain sensitive statistical parsing using raw corpora

    NARCIS (Netherlands)

    Plank, B.; Sima'an, K.

    2008-01-01

    Modern statistical parsers are trained on large annotated corpora (treebanks). These treebanks usually consist of sentences addressing different subdomains (e.g. sports, politics, music), which implies that the statistics gathered by current statistical parsers are mixtures of subdomains of language

  18. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... any flow rate into the reaction chamber. This includes, but is not limited to, sample capillary, ozone... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new...

  19. Faster, Practical GLL Parsing

    NARCIS (Netherlands)

    A. Afroozeh (Ali); A. Izmaylova (Anastasia)

    2015-01-01

    Generalized LL (GLL) parsing is an extension of recursive-descent (RD) parsing that supports all context-free grammars in cubic time and space. GLL parsers have the direct relationship with the grammar that RD parsers have, and therefore, compared to GLR, are easier to understand, debug,

  20. Memory Retrieval in Parsing and Interpretation

    Science.gov (United States)

    Schlueter, Ananda Lila Zoe

    2017-01-01

    This dissertation explores the relationship between the parser and the grammar in error-driven retrieval by examining the mechanism underlying the illusory licensing of subject-verb agreement violations ("agreement attraction"). Previous work motivates a two-stage model of agreement attraction in which the parser predicts the verb's…

  1. Ambiguity Detection Methods for Context-Free Grammars

    NARCIS (Netherlands)

    H.J.S. Basten (Bas)

    2007-01-01

    The Meta-Environment enables the creation of grammars using the SDF formalism. From these grammars an SGLR parser can be generated. One of the advantages of these parsers is that they can handle the entire class of context-free grammars (CFGs). The grammar developer does not have to

  2. Parsing Universal Dependencies without training

    NARCIS (Netherlands)

    Martínez Alonso, Héctor; Agić, Željko; Plank, Barbara; Søgaard, Anders

    2017-01-01

    We propose UDP, the first training-free parser for Universal Dependencies (UD). Our algorithm is based on PageRank and a small set of head attachment rules. It features two-step decoding to guarantee that function words are attached as leaf nodes. The parser requires no training, and it is

  3. A Lexical Analysis Tool with Ambiguity Support

    OpenAIRE

    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
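
    In the spirit of such a lexical analysis graph (with invented token definitions, not Lamb's actual specification language), every token match can be recorded as an edge from its start offset to its end offset, and each path through the graph is one candidate token sequence for the parser to accept or reject:

```python
# Minimal lexical analysis graph: edges map start offsets to token matches,
# and every path from 0 to len(text) is one possible tokenization.
import re

TOKENS = [("FOR", re.compile(r"for")), ("ID", re.compile(r"[a-z]+")),
          ("NUM", re.compile(r"\d+")), ("WS", re.compile(r"\s+"))]

def lexical_graph(text):
    edges = {}                                    # start -> [(end, name, lexeme)]
    for pos in range(len(text)):
        for name, pattern in TOKENS:
            m = pattern.match(text, pos)
            if m and m.end() > pos:
                edges.setdefault(pos, []).append((m.end(), name, m.group()))
    return edges

def sequences(edges, pos, end):
    if pos == end:
        yield []
        return
    for nxt, name, lexeme in edges.get(pos, []):
        for rest in sequences(edges, nxt, end):
            yield [(name, lexeme)] + rest

text = "for1"
graph = lexical_graph(text)
for seq in sequences(graph, 0, len(text)):
    print(seq)
# both [('FOR', 'for'), ('NUM', '1')] and [('ID', 'for'), ('NUM', '1')] appear
```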

  4. A study of the transferability of influenza case detection systems between two large healthcare systems.

    Directory of Open Access Journals (Sweden)

    Ye Ye

    Full Text Available This study evaluates the accuracy and transferability of Bayesian case detection systems (BCD) that use clinical notes from the emergency department (ED) to detect influenza cases. A BCD uses natural language processing (NLP) to infer the presence or absence of clinical findings from ED notes, which are fed into a Bayesian network classifier (BN) to infer patients' diagnoses. We developed BCDs at the University of Pittsburgh Medical Center (BCDUPMC) and Intermountain Healthcare in Utah (BCDIH). At each site, we manually built a rule-based NLP and trained a Bayesian network classifier from over 40,000 ED encounters between Jan. 2008 and May 2010 using feature selection, machine learning, and an expert debiasing approach. Transferability of a BCD in this study may be impacted by seven factors: development (source) institution, development parser, application (target) institution, application parser, NLP transfer, BN transfer, and classification task. We employed an ANOVA analysis to study their impacts on BCD performance. Both BCDs discriminated well between influenza and non-influenza on local test cases (AUCs > 0.92). When tested for transferability using the other institution's cases, BCDUPMC discriminations declined minimally (AUC decreased from 0.95 to 0.94, p<0.01), and BCDIH discriminations declined more (from 0.93 to 0.87, p<0.0001). We attributed the BCDIH decline to the lower recall of the IH parser on UPMC notes. The ANOVA analysis showed five significant factors: development parser, application institution, application parser, BN transfer, and classification task. We demonstrated high influenza case detection performance in two large healthcare systems in two geographically separated regions, providing evidentiary support for the use of automated case detection from routinely collected electronic clinical notes in national influenza surveillance. The transferability could be improved by training the Bayesian network classifier locally and increasing the

  5. PEG parsing in less space using progressive tabling and dynamic analysis

    DEFF Research Database (Denmark)

    Henglein, Fritz; Rasmussen, Ulrik Terp

    2017-01-01

    -case constant and worst-case linear memory use. Furthermore, semantic actions are scheduled before the parser has seen the end of the input. The scheduling is conservative in the sense that no action has to be "undone" in the case of backtracking. The time complexity is O(dmn) where m is the size of the parser...

  6. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO₂, O₃, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO₂⁻, PO₄³⁻, S²⁻, and SO₄²⁻. In many cases, standard methods could be adapted to the centrifugal analyzer; in others, new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than with conventional instrumentation

  7. Nuclear plant analyzer program for Bulgaria

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for use by the Bulgarian technical community in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. The current NPA includes models for a VVER-440 Model 230 and a VVER-1000 Model 320 and is operational on an IBM RISC6000 workstation. The RELAP5/MOD2 computer code has been used for the calculation of the reactor responses to the interactive commands initiated by the NPA operator. The interactive capabilities of the NPA have been developed to provide considerable flexibility in the plant actions that can be initiated by the operator. The current capabilities for both the VVER-440 and VVER-1000 models include: (1) scram initiation; (2) reactor coolant pump trip; (3) high pressure safety injection system initiation; (4) low pressure safety injection system initiation; (5) pressurizer safety valve opening; (6) steam generator relief/safety valve opening; (7) feedwater system initiation and trip; (8) turbine trip; and (9) emergency feedwater initiation. The NPA has the capability to display the results of the simulations in various forms that are determined by the model developer. Results displayed on the reactor mask are shown through the user defined, digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperature of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask. This paper provides a description of the structure of the NPA, a discussion of the simulation models used for the VVER-440 and the VVER-1000, and an overview of the NPA capabilities. Typical results obtained using both simulation models will be discussed

  8. The time course of syntactic activation during language processing: a model based on neuropsychological and neurophysiological data.

    Science.gov (United States)

    Friederici, A D

    1995-09-01

    This paper presents a model describing the temporal and neurotopological structure of syntactic processes during comprehension. It postulates three distinct phases of language comprehension, two of which are primarily syntactic in nature. During the first phase the parser assigns the initial syntactic structure on the basis of word category information. These early structural processes are assumed to be subserved by the anterior parts of the left hemisphere, as event-related brain potentials show this area to be maximally activated when phrase structure violations are processed and as circumscribed lesions in this area lead to an impairment of the on-line structural assignment. During the second phase lexical-semantic and verb-argument structure information is processed. This phase is neurophysiologically manifest in a negative component in the event-related brain potential around 400 ms after stimulus onset which is distributed over the left and right temporo-parietal areas when lexical-semantic information is processed and over left anterior areas when verb-argument structure information is processed. During the third phase the parser tries to map the initial syntactic structure onto the available lexical-semantic and verb-argument structure information. In case of an unsuccessful match between the two types of information reanalyses may become necessary. These processes of structural reanalysis are correlated with a centroparietally distributed late positive component in the event-related brain potential.(ABSTRACT TRUNCATED AT 250 WORDS)

  9. The METAFRONT System

    DEFF Research Database (Denmark)

    Brabrand, Claus; Schwartzbach, Michael Ignatieff

    2007-01-01

    We present the metafront tool for specifying flexible, safe, and efficient syntactic transformations between languages defined by context-free grammars. The transformations are guaranteed to terminate and to map grammatically legal input to grammatically legal output. We rely on a novel parser algorithm that is designed to support gradual extensions of a grammar by allowing productions to remain in a natural style and by statically reporting ambiguities and errors in terms of individual productions as they are being added. Our tool may be used as a parser generator in which the resulting parser automatically supports a flexible, safe, and efficient macro processor, or as an extensible lightweight compiler generator for domain-specific languages. We show substantial examples of both kinds.

  10. jmzML, an open-source Java API for mzML, the PSI standard for MS data.

    Science.gov (United States)

    Côté, Richard G; Reisinger, Florian; Martens, Lennart

    2010-04-01

    We here present jmzML, a Java API for the Proteomics Standards Initiative mzML data standard. Based on the Java Architecture for XML Binding and an XPath-based XML indexer for random-access parsing, jmzML can handle arbitrarily large files in minimal memory, allowing easy and efficient processing of mzML files using the Java programming language. jmzML also automatically resolves internal XML references on-the-fly. The library (which includes a viewer) can be downloaded from http://jmzml.googlecode.com.

  11. DBPQL: A view-oriented query language for the Intel Data Base Processor

    Science.gov (United States)

    Fishwick, P. A.

    1983-01-01

    An interactive query language (DBPQL) for the Intel Data Base Processor (DBP) is defined. DBPQL includes a parser generator package which permits the analyst to easily create and manipulate the query statement syntax and semantics. The prototype language, DBPQL, includes trace and performance commands to aid the analyst when implementing new commands and analyzing the execution characteristics of the DBP. The DBPQL grammar file and associated key procedures are included as an appendix to this report.

  12. Open Source Software Projects Needing Security Investments

    Science.gov (United States)

    2015-06-19

    modtls, BouncyCastle, gpg, otr, axolotl. 7. Static analyzers: Clang, Frama-C. 8. Nginx. 9. OpenVPN . It was noted that the funding model may be similar...to OpenSSL, where consulting funds the company. It was also noted that OpenVPN needs to correctly use OpenSSL in order to be secure, so focusing on...Dovecot 4. Other high-impact network services: OpenSSH, OpenVPN , BIND, ISC DHCP, University of Delaware NTPD 5. Core infrastructure data parsers

  13. New project? Don't analyze--act.

    Science.gov (United States)

    Schlesinger, Leonard A; Kiefer, Charles F; Brown, Paul B

    2012-03-01

    In a predictable world, getting a new initiative off the ground typically involves analyzing the market, creating a forecast, and writing a business plan. But what about in an unpredictable environment? The authors recommend looking to those who are experts in navigating extreme uncertainty while minimizing risk: serial entrepreneurs. These business leaders act, learn, and build their way into the future. Managers in traditional organizations can do the same, starting with smart, low-risk steps that follow simple rules: Use the means at hand; stay within an acceptable loss; secure only the commitment needed for the next step; bring along only volunteers; link the initiative to a business imperative; produce early results; and manage expectations. Momentum is gained by continuing to act based on what is learned at each step. The launch of Clorox's Green Works product line is discussed as an example.

  14. Creating Parsing Lexicons from Semantic Lexicons Automatically and Its Applications

    National Research Council Canada - National Science Library

    Ayan, Necip F; Dorr, Bonnie

    2002-01-01

    ...). We also present the effects of using such a lexicon on the parser performance. The advantage of automating the process is that the same technique can be applied directly to lexicons we have for other languages, for example, Arabic, Chinese, and Spanish. The results indicate that our method will help us generate parsing lexicons which can be used by a broad-coverage parser that runs on different languages.

  15. Mixed-Initiative Clustering

    Science.gov (United States)

    Huang, Yifen

    2010-01-01

    Mixed-initiative clustering is a task where a user and a machine work collaboratively to analyze a large set of documents. We hypothesize that a user and a machine can both learn better clustering models through enriched communication and interactive learning from each other. The first contribution of this thesis is providing a framework of…

  16. Exploiting multiple sources of information in learning an artificial language: human data and modeling.

    Science.gov (United States)

    Perruchet, Pierre; Tillmann, Barbara

    2010-03-01

    This study investigates the joint influences of three factors on the discovery of new word-like units in a continuous artificial speech stream: the statistical structure of the ongoing input, the initial word-likeness of parts of the speech flow, and the contextual information provided by the earlier emergence of other word-like units. Results of an experiment conducted with adult participants show that these sources of information have strong and interactive influences on word discovery. The authors then examine the ability of different models of word segmentation to account for these results. PARSER (Perruchet & Vinter, 1998) is compared to the view that word segmentation relies on the exploitation of transitional probabilities between successive syllables, and with the models based on the Minimum Description Length principle, such as INCDROP. The authors submit arguments suggesting that PARSER has the advantage of accounting for the whole pattern of data without ad-hoc modifications, while relying exclusively on general-purpose learning principles. This study strengthens the growing notion that nonspecific cognitive processes, mainly based on associative learning and memory principles, are able to account for a larger part of early language acquisition than previously assumed. Copyright © 2009 Cognitive Science Society, Inc.
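
    For context, the transitional-probability account that PARSER is contrasted with can be sketched in a few lines: estimate P(next syllable | current syllable) from the stream and posit a word boundary wherever the probability dips below both of its neighbours. The syllable stream below is made up, and this is the competing TP view, not the PARSER model itself.

```python
# Transitional-probability segmentation heuristic: boundaries at local TP minima.
from collections import Counter

stream = ("bu pa da ti bu li ro go la tu bu pa da ro go la bu li ro "
          "ti bu li bu pa da go la tu").split()

pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])
tp = [pair_counts[(a, b)] / first_counts[a] for a, b in zip(stream, stream[1:])]

boundaries = [i + 1 for i in range(1, len(tp) - 1)
              if tp[i] < tp[i - 1] and tp[i] < tp[i + 1]]   # local TP minima

words, start = [], 0
for b in boundaries + [len(stream)]:
    words.append("".join(stream[start:b]))
    start = b
print(words)
```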

  17. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run-time unit and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, the command and control logic and reactor protection systems. The run-time unit carries out the dialogue between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility

  18. Syntactic analysis in sentence comprehension: effects of dependency types and grammatical constraints.

    Science.gov (United States)

    De Vincenzi, M

    1996-01-01

    This paper presents three experiments on the parsing of Italian wh-questions that manipulate the wh-type (who vs. which-N) and the wh extraction site (main clause, dependent clause with or without complementizer). The aim of these manipulations is to see whether the parser is sensitive to the type of dependencies being processed and whether the processing effects can be explained by a unique processing principle, the minimal chain principle (MCP; De Vincenzi, 1991). The results show that the parser, following the MCP, prefers structures with fewer and less complex chains. In particular: (1) There is a processing advantage for the wh-subject extractions, the structures with less complex chains; (2) there is a processing dissociation between the who and which questions; (3) the parser respects the principle that governs the well-formedness of the empty categories (ECP).

  19. A syntactic component for Vietnamese language processing

    Directory of Open Access Journals (Sweden)

    Phuong Le-Hong

    2015-06-01

    Full Text Available This paper presents the development of a syntactic component for the Vietnamese language. We first discuss the construction of a lexicalized tree-adjoining grammar using an automatic extraction approach. We then present the construction and evaluation of a deep syntactic parser based on the extracted grammar. This is a complete system integrating the necessary tools to process Vietnamese text; it takes raw text as input and produces syntactic structures. A dependency annotation scheme for Vietnamese and an algorithm for extracting dependency structures from derivation trees are also proposed. At present, this is the first Vietnamese parsing system capable of producing both constituency and dependency analyses, with encouraging performance: 69.33% and 73.21% accuracy for constituency and dependency analysis, respectively. The parser also compares favourably to a statistical parser which is trained and tested on the same data sets.
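
    One common recipe for deriving dependencies from constituency trees is sketched below (the head-marking scheme and the toy English tree are invented; this is not the authors' algorithm or their Vietnamese annotation scheme): mark a head child per constituent, percolate lexical heads upward, and attach the heads of non-head children to the head of the head child.

```python
# Head-percolation sketch: turn a head-annotated constituency tree into
# (dependent, governor) pairs.
def head_of(tree):
    """tree = (label, head_index, [children]) or a plain token string."""
    if isinstance(tree, str):
        return tree
    label, head_index, children = tree
    return head_of(children[head_index])

def dependencies(tree, deps=None):
    if deps is None:
        deps = []
    if isinstance(tree, str):
        return deps
    label, head_index, children = tree
    head = head_of(children[head_index])
    for i, child in enumerate(children):
        if i != head_index:
            deps.append((head_of(child), head))   # (dependent, governor)
        dependencies(child, deps)
    return deps

# (S (NP he) (VP (V reads) (NP (DET the) (N book))))  with VP, V and N as heads
tree = ("S", 1, [("NP", 0, ["he"]),
                 ("VP", 0, [("V", 0, ["reads"]),
                            ("NP", 1, [("DET", 0, ["the"]), ("N", 0, ["book"])])])])
print(dependencies(tree))
# [('he', 'reads'), ('book', 'reads'), ('the', 'book')]
```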

  20. Partial dependency parsing for Irish

    OpenAIRE

    Uí Dhonnchadha, Elaine; van Genabith, Josef

    2010-01-01

    In this paper we present a partial dependency parser for Irish, in which Constraint Grammar (CG) rules are used to annotate dependency relations and grammatical functions in unrestricted Irish text. Chunking is performed using a regular-expression grammar which operates on the dependency tagged sentences. As this is the first implementation of a parser for unrestricted Irish text (to our knowledge), there were no guidelines or precedents available. Therefore deciding what constitutes a syntac...

  1. Tackling Error Propagation through Reinforcement Learning: A Case of Greedy Dependency Parsing

    OpenAIRE

    Le, Minh; Fokkens, Antske

    2017-01-01

    Error propagation is a common problem in NLP. Reinforcement learning explores erroneous states during training and can therefore be more robust when mistakes are made early in a process. In this paper, we apply reinforcement learning to greedy dependency parsing which is known to suffer from error propagation. Reinforcement learning improves accuracy of both labeled and unlabeled dependencies of the Stanford Neural Dependency Parser, a high performance greedy parser, while maintaining its eff...

  2. Contextual Semantic Parsing using Crowdsourced Spatial Descriptions

    OpenAIRE

    Dukes, Kais

    2014-01-01

    We describe a contextual parser for the Robot Commands Treebank, a new crowdsourced resource. In contrast to previous semantic parsers that select the most-probable parse, we consider the different problem of parsing using additional situational context to disambiguate between different readings of a sentence. We show that multiple semantic analyses can be searched using dynamic programming via interaction with a spatial planner, to guide the parsing process. We are able to parse sentences in...

  3. Extracting BI-RADS Features from Portuguese Clinical Texts.

    Science.gov (United States)

    Nassif, Houssam; Cunha, Filipe; Moreira, Inês C; Cruz-Correia, Ricardo; Sousa, Eliana; Page, David; Burnside, Elizabeth; Dutra, Inês

    2012-01-01

    In this work we build the first BI-RADS parser for Portuguese free texts, modeled after existing approaches to extract BI-RADS features from English medical records. Our concept finder uses a semantic grammar based on the BI-RADS lexicon and on iteratively transferred expert knowledge. We compare the performance of our algorithm to manual annotation by a specialist in mammography. Our results show that our parser's performance is comparable to the manual method.
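
    A very small concept finder in the same spirit is sketched below, with a handful of regular expressions standing in for the semantic grammar; the patterns and the English example are illustrative only, since the paper targets Portuguese reports and a much richer BI-RADS feature set.

```python
# Keyword/regex concept finder standing in for a BI-RADS semantic grammar.
import re

PATTERNS = {
    "birads_category": re.compile(r"\bbi-?rads\s*(?:category\s*)?([0-6])\b", re.I),
    "mass":            re.compile(r"\b(mass|nodule)\b", re.I),
    "calcification":   re.compile(r"\b(micro)?calcifications?\b", re.I),
    "breast_density":  re.compile(r"\b(dense|fatty)\s+breast", re.I),
}

def extract_features(report):
    found = {}
    for feature, pattern in PATTERNS.items():
        m = pattern.search(report)
        if m:
            found[feature] = m.group(0)
    return found

text = ("Dense breast tissue. Irregular mass in the upper outer quadrant "
        "with pleomorphic microcalcifications. BI-RADS category 4.")
print(extract_features(text))
```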

  4. Automated detection of analyzable metaphase chromosome cells depicted on scanned digital microscopic images

    Science.gov (United States)

    Qiu, Yuchen; Wang, Xingwei; Chen, Xiaodong; Li, Yuhua; Liu, Hong; Li, Shibo; Zheng, Bin

    2010-02-01

    Visually searching for analyzable metaphase chromosome cells under microscopes is quite time-consuming and difficult. To improve detection efficiency, consistency, and diagnostic accuracy, an automated microscopic image scanning system was developed and tested to directly acquire digital images with sufficient spatial resolution for clinical diagnosis. A computer-aided detection (CAD) scheme was also developed and integrated into the image scanning system to search for and detect the regions of interest (ROI) that contain analyzable metaphase chromosome cells in the large volume of scanned images acquired from one specimen. Thus, the cytogeneticists only need to observe and interpret a limited number of ROIs. In this study, the high-resolution microscopic image scanning and CAD performance was investigated and evaluated using nine sets of images scanned from either bone marrow (three) or blood (six) specimens for diagnosis of leukemia. The automated CAD-selection results were compared with the visual selection. In the experiment, the cytogeneticists first visually searched for the analyzable metaphase chromosome cells from specimens under microscopes. The specimens were also automatically scanned, after which the CAD scheme was applied to detect and save ROIs containing analyzable cells while deleting the others. The automatically selected ROIs were then examined by a panel of three cytogeneticists. From the scanned images, CAD selected more analyzable cells than the cytogeneticists' initial visual examinations in both blood and bone marrow specimens. In general, CAD had higher performance in analyzing blood specimens. Even in the three bone marrow specimens, CAD selected 50, 22, and 9 ROIs, respectively. Besides matching the 9, 7, and 5 analyzable cells found in the initial visual selection for these three specimens, the cytogeneticists also identified 41, 15, and 4 new analyzable cells among the CAD-selected ROIs, which had been missed in the initial visual search. This experiment showed the feasibility of

  5. Natural-Language Parser for PBEM

    Science.gov (United States)

    James, Mark

    2010-01-01

    A computer program called "Hunter" accepts, as input, a colloquial-English description of a set of policy-based-management rules, and parses that description into a form usable by policy-based enterprise management (PBEM) software. PBEM is a rules-based approach suitable for automating some management tasks. PBEM simplifies the management of a given enterprise through establishment of policies addressing situations that are likely to occur. Hunter was developed to have a unique capability to extract the intended meaning instead of focusing on parsing the exact ways in which individual words are used.

  6. Radiofrequency initiation and radiofrequency sustainment of laser initiated seeded high pressure plasma

    International Nuclear Information System (INIS)

    Paller, Eric S.; Scharer, John E.; Akhtar, Kamran; Kelly, Kurt; Ding, Guowen

    2001-01-01

    We examine radiofrequency initiation of high-pressure (1-70 Torr) inductive plasma discharges in argon, nitrogen, air and organic seed gas mixtures. Millimeter wave interferometry, optical emission and antenna wave impedance measurements for double half-turn helix and helical inductive antennas are used to interpret the rf/plasma coupling, measure the densities in the range of 10¹² cm⁻³ and analyze the ionization and excited states of the gas mixtures. We have also carried out 193 nm excimer laser initiation of an organic gas seed plasma which is sustained at higher pressures (150 Torr) by radiofrequency coupling at 2.8 kW power levels

  7. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
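
    A minimal transliteration of the split/analyze/meta-analyze recipe follows (the paper demonstrates the procedures in R; the sketch below uses Python, synthetic data, and a simple mean as the per-split estimate): split the data, estimate the same quantity in every split, then pool the estimates with inverse-variance, fixed-effect weights.

```python
# Split/analyze/meta-analyze on synthetic data with fixed-effect pooling.
import random, statistics, math

random.seed(0)
big_data = [random.gauss(2.0, 1.0) for _ in range(200_000)]

def split(data, n_chunks):
    size = len(data) // n_chunks
    return [data[i * size:(i + 1) * size] for i in range(n_chunks)]

def analyze(chunk):
    mean = statistics.fmean(chunk)
    se = statistics.stdev(chunk) / math.sqrt(len(chunk))
    return mean, se

def meta_analyze(estimates):
    weights = [1 / se**2 for _, se in estimates]
    pooled = sum(w * m for (m, _), w in zip(estimates, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

estimates = [analyze(chunk) for chunk in split(big_data, n_chunks=10)]
pooled, pooled_se = meta_analyze(estimates)
print(f"pooled mean = {pooled:.4f} +/- {1.96 * pooled_se:.4f}")
```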

  8. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  9. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.

  10. A multi-analyzer crystal spectrometer (MAX) for pulsed neutron sources

    International Nuclear Information System (INIS)

    Tajima, K.; Ishikawa, Y.; Kanai, K.; Windsor, C.G.; Tomiyoshi, S.

    1982-03-01

    The paper describes the principle and initial performance of a multi-analyzer crystal spectrometer (MAX) recently installed at the KENS spallation neutron source at Tsukuba. The spectrometer is able to make time of flight scans along a desired direction in reciprocal space, covering a wide range of the energy transfers corresponding to the fifteen analyzer crystals. The constant Q or constant E modes of operation can be performed. The spectrometer is particularly suited for studying collective excitations such as phonons and magnons to high energy transfers using single crystal samples. (author)

  11. Personality Traits and Training Initiation Process: Intention, Planning, and Action Initiation.

    Science.gov (United States)

    Laguna, Mariola; Purc, Ewelina

    2016-01-01

    The article aims at investigating the role of personality traits in relation to training initiation. Training initiation is conceptualized as a goal realization process, and explained using goal theories. There are three stages of the process analyzed: intention to undertake training, plan formulation, and actual training undertaking. Two studies tested the relationships between five personality traits, defined according to the five factor model, and the stages of the goal realization process. In Study 1, which explains training intention and training plans' formulation, 155 employees participated. In Study 2, which was time-lagged with two measurement points, and which explains intention, plans, and training actions undertaken, the data from 176 employees was collected at 3 month intervals. The results of these studies show that personality traits, mainly openness to experience, predict the training initiation process to some degree: intention, plans, and actual action initiation. The findings allow us to provide recommendations for practitioners responsible for human resource development. The assessment of openness to experience in employees helps predict their motivation to participate in training activities. To increase training motivation it is vital to strengthen intentions to undertake training, and to encourage training action planning.

  12. Initialization Errors in Quantum Data Base Recall

    OpenAIRE

    Natu, Kalyani

    2016-01-01

    This paper analyzes the relationship between initialization error and recall of a specific memory in the Grover algorithm for quantum database search. It is shown that the correct memory is obtained with high probability even when the initial state is far removed from the correct one. The analysis is done by relating the variance of error in the initial state to the recovery of the correct memory and the surprising result is obtained that the relationship between the two is essentially linear.

  13. XML Transformations

    Directory of Open Access Journals (Sweden)

    Felician ALECU

    2012-04-01

    Full Text Available XSLT style sheets are designed to transform XML documents into something else. The two most popular parsers of the moment are the Document Object Model (DOM) and the Simple API for XML (SAX). DOM is an official recommendation of the W3C (available at http://www.w3.org/TR/REC-DOM-Level-1), while SAX is a de facto standard. A good parser should be fast, space efficient, rich in functionality and easy to use.
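
    The contrast between the two parser families is easy to show with standard-library tools (the sample document is invented): DOM materializes the whole tree and allows random access, while SAX streams events and keeps nothing the handler does not store.

```python
# The same document read with the two parser families mentioned above.
import xml.dom.minidom
import xml.sax

doc = b"<catalog><book id='1'>XSLT</book><book id='2'>XQuery</book></catalog>"

# DOM: random access to the complete in-memory tree
tree = xml.dom.minidom.parseString(doc)
titles = [node.firstChild.data for node in tree.getElementsByTagName("book")]
print("DOM:", titles)

# SAX: event callbacks; only what the handler stores is kept
class BookHandler(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.titles, self._in_book = [], False
    def startElement(self, name, attrs):
        self._in_book = (name == "book")
        if self._in_book:
            self.titles.append("")
    def characters(self, content):
        if self._in_book:
            self.titles[-1] += content
    def endElement(self, name):
        self._in_book = False

handler = BookHandler()
xml.sax.parseString(doc, handler)
print("SAX:", handler.titles)
```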

  14. Inhomogeneous inflation: The initial-value problem

    International Nuclear Information System (INIS)

    Laguna, P.; Kurki-Suonio, H.; Matzner, R.A.

    1991-01-01

    We present a spatially three-dimensional study for solving the initial-value problem in general relativity for inhomogeneous cosmologies. We use York's conformal approach to solve the constraint equations of Einstein's field equations for scalar field sources and find the initial data which will be used in the evolution. This work constitutes the first stage in the development of a code to analyze the effects of matter and spacetime inhomogeneities on inflation

  15. NATO and EU/European Defense Initiatives: Competitive or Complementary

    National Research Council Canada - National Science Library

    Muckel, Hubert

    2006-01-01

    ... This paper analyzes the current status of NATO and European Union (EU) defense initiatives, examines the national objectives and interests of European key players and the US, and evaluates whether the NATO and EU defense initiatives are competitive or complementary.

  16. Time-reversal asymmetry: polarization and analyzing power in nuclear reactions

    International Nuclear Information System (INIS)

    Rioux, C.; Roy, R.; Slobodrian, R.J.; Conzett, H.E.

    1984-01-01

    Measurements of the proton polarization in the reactions ⁷Li(³He, p⃗)⁹Be and ⁹Be(³He, p⃗)¹¹B and of the analyzing powers in the inverse reactions, initiated by polarized protons at the same center-of-mass energies, show significant differences. This implies the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction ²H(³He, p⃗)⁴He and its inverse have also been investigated and show smaller differences. A discussion of instrumental asymmetries is presented

  17. Time asymmetry: Polarization and analyzing power in the nuclear reactions

    International Nuclear Information System (INIS)

    Rioux, C.; Roy, R.; Slobodrian, R.J.; Conzett, H.E.

    1983-01-01

    Measurements of the proton polarization in the reactions 7Li(3He,p-vector)9Be and 9Be(3He,p-vector)11B and of the analyzing powers of the inverse reactions, initiated by polarized protons at the same c.m. energies, show significant differences which imply the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction 2H(3He,p-vector)4He and its inverse have also been investigated and show some smaller differences. A discussion of the instrumental asymmetries is presented. (orig.)

  18. The Myth of Peer Influence in Adolescent Smoking Initiation

    Science.gov (United States)

    Arnett, Jeffrey Jensen

    2007-01-01

    The widespread belief that peer influence is the primary cause of adolescent smoking initiation is examined and called into question. Correlational and longitudinal studies purporting to demonstrate peer influence are analyzed, and their limitations described. Qualitative interview studies of adolescent smoking initiation are presented as…

  19. Research on Initiation Sensitivity of Solid Explosive and Planer Initiation System

    Directory of Open Access Journals (Sweden)

    N Matsuo

    2016-09-01

    Full Text Available Firstly, a growing number of techniques are being demanded for complex processing, so various explosive initiation methods and highly accurate control of detonation are needed. In this research, attention is focused on metal foil explosion driven by a high current as a method of easily obtaining linear or planar initiation, and the main evaluation concerns the ability of a metal foil explosion to initiate an explosive. The explosion power was evaluated by optically observing the underwater shock wave generated by the metal foil explosion. Secondly, high-energy explosive processing has several applications, such as shock compaction, explosive welding, food processing and explosive forming. In these explosive applications a highly sensitive explosive has mainly been used. Such an explosive is dangerous, since it can detonate suddenly, so in developing explosives safety is the most important consideration, along with low manufacturing cost and explosive characteristics. In this work, we have focused on the initiation sensitivity of a solid explosive and performed a numerical analysis of sympathetic detonation. The numerical analysis is carried out with LS-DYNA 3D (a commercial code). To describe the initiation reaction of an explosive, the Lee-Tarver equation was used, and the impact detonation process was analyzed with an ALE code. The simulation model is a quarter of a circular cylinder. The donor explosive (SEP) was used as the initiating explosive. When the donor explosive is detonated, a shock wave is generated and propagates through the PMMA, air and metallic layers in turn. While passing through the layers the shock wave is attenuated, and it finally acts on the acceptor explosive, Comp. B. Here, we evaluate the initiation of the acceptor explosive and discuss the detonation pressure, the reaction rate of the acceptor explosive and the attenuation of the impact pressure.

  20. The initiating events in the Loviisa nuclear power plant history

    International Nuclear Information System (INIS)

    Sjoblom, K.

    1987-01-01

    During the 16 reactor years of Loviisa nuclear power plant operation no serious incident has endangered the high level of safety. The initiating events of plant incidents have been analyzed in order to get a view of plant operational safety experience. The initiating events have been placed in categories similar to those that EPRI uses. However, because of the very small number of scrams the study was extended to also cover transients with a relatively low safety importance in order to get more comprehensive statistics. Human errors, which contributed to 15% of the transients, were a special subject in this study. The conditions under which human failures occurred, and the nature and root causes of the human failures that caused the initiating events were analyzed. For future analyses it was noticed that it would be beneficial to analyze incidents immediately, to consult with the persons directly involved and to develop an international standard format for incident analyses

  1. 32 CFR 989.4 - Initial considerations.

    Science.gov (United States)

    2010-07-01

    ... alternatives analyzed in the environmental documents. (f) Pursue the objective of furthering foreign policy and... ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.4 Initial considerations. Air Force personnel will: (a... CATEX from environmental impact analysis (appendix B). (c) Make environmental documents, comments, and...

  2. Time asymmetry: Polarization and analyzing power in the nuclear reactions

    Energy Technology Data Exchange (ETDEWEB)

    Rioux, C.; Roy, R.; Slobodrian, R.J. (Laval Univ., Quebec City (Canada). Lab. de Physique Nucleaire); Conzett, H.E. (California Univ., Berkeley (USA). Lawrence Berkeley Lab.)

    1983-02-28

    Measurements of the proton polarization in the reactions 7Li(3He,p-vector)9Be and 9Be(3He,p-vector)11B and of the analyzing powers of the inverse reactions, initiated by polarized protons at the same c.m. energies, show significant differences which imply the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction 2H(3He,p-vector)4He and its inverse have also been investigated and show some smaller differences. A discussion of the instrumental asymmetries is presented.

  3. Systematic Review about Personal Growth Initiative

    Directory of Open Access Journals (Sweden)

    Clarissa Pinto Pizarro de Freitas

    Full Text Available The present study aimed to conduct a systematic review of publications about personal growth initiative. A literature review was carried out in the Bireme, Index Psi, LILACS, PePSIC, Pubmed - Publisher's Medline, Wiley Online Library, PsycINFO, OneFile, SciVerse ScienceDirect, ERIC, Emerald Journals, PsycARTICLES - American Psychological Association, Directory of Open Access Journals - DOAJ, SAGE Journals, SpringerLink, PLoS, IngentaConnect, IEEE Journals & Magazines and SciELO databases. The literature review was performed from December 2014 to January 2015, without stipulating date limits for the publication of the articles. A total of 53 studies were found, seven were excluded, and 46 were analyzed. The studies aimed to investigate the psychometric properties of the Personal Growth Initiative Scale and the Personal Growth Initiative Scale II. The relations between personal growth initiative and other constructs were also evaluated. Furthermore, the studies investigated the impact of interventions to promote personal growth initiative. The results of these studies showed that personal growth initiative was positively related to levels of well-being, self-esteem and other positive dimensions, and negatively to anxiety, depression and other negative factors.

  4. Implementación y pruebas de REsource LOcation And Discovery (RELOAD) Parser and Encoder

    OpenAIRE

    Jiménez Bolonio, Jaime Antonio

    2009-01-01

    The widely used client/server paradigm is being complemented and even replaced by other Peer-to-Peer (P2P) approaches. P2P networks offer a decentralized system for distributing information, are more stable, and represent a solution to the scalability problem. At the same time, the Session Initiation Protocol (SIP), a signalling protocol initially designed for client/server architectures, has been widely adopted...

  5. Interactive nuclear plant analyzer for VVER-440 reactor

    International Nuclear Information System (INIS)

    Shier, W.; Horak, W.; Kennett, R.

    1992-05-01

    This document discusses an interactive nuclear plant analyzer (NPA) which has been developed for a VVER-440, Model 213 reactor for use in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. This NPA is operational on an IBM RISC-6000 workstation and utilizes the RELAP5/MOD2 computer code for the calculation of the VVER-440 reactor response to the interactive commands initiated by the NPA operator

  6. Inclusion-initiated fracture model for ceramics

    International Nuclear Information System (INIS)

    Sung, J.; Nicholson, P.S.

    1990-01-01

    The fracture of ceramics initiating from a typical inclusion is analyzed. The inclusion is considered to have a thermal expansion coefficient and fracture toughness lower than those of the matrix and a Young's modulus higher than that of the matrix. Inclusion-initiated fracture is modeled for a spherical inclusion using a weight function method to compute the residual stress intensity factor for a part-through elliptical crack. The results are applied to an α-Al2O3 inclusion embedded in a tetragonal ZrO2 ceramic. The strength predictions agree well with experimental data.

  7. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  8. Dionex series 8000 on-line analyzer, Sequoyah Nuclear Power Plant. Final report

    International Nuclear Information System (INIS)

    1986-03-01

    This project was initiated to develop a custom-designed online water analyzer (ion chromatograph) for secondary water chemistry control in TVA's nuclear plants. This water analyzer development was conducted pursuant to a cooperative research agreement with the Dionex Corporation. Dionex developed and installed a dual channel, six stream analyzer on the secondary side of TVA's Sequoyah Nuclear Plant. The analyzer was developed for real time detection of sodium, chloride, and sulfate in any of the six sampling streams. The analyzer is providing Sequoyah's plant personnel with reliable secondary water chemistry data in a much more timely manner than the past grab sampling techniques. Results on the performance of the analyzer show that it is performing above and beyond the expectations of plant personnel. Since its installation at Sequoyah, there have been 29 units ordered from Dionex including 1 unit for Sequoyah, 5 units for Browns Ferry, and 23 units for other utilities. In the future, the analyzer will allow plant staffs to take corrective action before corrosive conditions occur or before having to derate a unit

  9. Dependency Parsing with Transformed Feature

    Directory of Open Access Journals (Sweden)

    Fuxiang Wu

    2017-01-01

    Full Text Available Dependency parsing is an important subtask of natural language processing. In this paper, we propose an embedding feature transforming method for graph-based parsing, transform-based parsing, which directly utilizes the inner similarity of the features to extract information from all feature strings including the un-indexed strings and alleviate the feature sparse problem. The model transforms the extracted features to transformed features via applying a feature weight matrix, which consists of similarities between the feature strings. Since the matrix is usually rank-deficient because of similar feature strings, it would influence the strength of constraints. However, it is proven that the duplicate transformed features do not degrade the optimization algorithm: the margin infused relaxed algorithm. Moreover, this problem can be alleviated by reducing the number of the nearest transformed features of a feature. In addition, to further improve the parsing accuracy, a fusion parser is introduced to integrate transformed and original features. Our experiments verify that both transform-based and fusion parser improve the parsing accuracy compared to the corresponding feature-based parser.
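
    The core idea, a feature vector mapped through a matrix of string-to-string similarities so that similar (possibly un-indexed) feature strings share weight, can be sketched in a few lines of Python; the feature strings and similarity values below are invented for illustration and are not the paper's data or its exact similarity measure.

      import numpy as np

      features = ["pos=NN", "pos=NNS", "word=dog", "word=dogs"]
      # W[i][j] holds the similarity between feature strings i and j (toy values).
      W = np.array([[1.0, 0.8, 0.0, 0.0],
                    [0.8, 1.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 0.7],
                    [0.0, 0.0, 0.7, 1.0]])

      x = np.array([1.0, 0.0, 0.0, 1.0])   # features fired by one parsing decision
      x_transformed = W @ x                # each fired feature also activates its near neighbours
      print(dict(zip(features, x_transformed.round(2))))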

  10. [The role of animacy in European Portuguese relative clause attachment: evidence from production and comprehension tasks].

    Science.gov (United States)

    Soares, Ana Paula; Fraga, Isabel; Comesaña, Montserrat; Piñeiro, Ana

    2010-11-01

    This work presents an analysis of the role of animacy in attachment preferences of relative clauses to complex noun phrases in European Portuguese (EP). The study of how the human parser solves this kind of syntactic ambiguity has been the focus of extensive research. However, what is known about EP is both limited and puzzling. Additionally, as recent studies have stressed the importance of extra-syntactic variables in this process, two experiments were carried out to assess EP attachment preferences considering four animacy conditions: Study 1 used a sentence-completion task, and Study 2 a self-paced reading task. Both studies indicate a significant preference for high attachment in EP. Furthermore, they showed that this preference was modulated by the animacy of the host NP: if the first host was inanimate and the second one was animate, the parser's preference changed to a low-attachment preference. These findings shed light on previous results regarding EP and strengthen the idea that, even in early stages of processing, the parser seems to be sensitive to extra-syntactic information.

  11. Overview of the ArbiTER edge plasma eigenvalue code

    Science.gov (United States)

    Baver, Derek; Myra, James; Umansky, Maxim

    2011-10-01

    The Arbitrary Topology Equation Reader, or ArbiTER, is a flexible eigenvalue solver that is currently under development for plasma physics applications. The ArbiTER code builds on the equation parser framework of the existing 2DX code, extending it to include a topology parser. This will give the code the capability to model problems with complicated geometries (such as multiple X-points and scrape-off layers) or model equations with arbitrary numbers of dimensions (e.g. for kinetic analysis). In the equation parser framework, model equations are not included in the program's source code. Instead, an input file contains instructions for building a matrix from profile functions and elementary differential operators. The program then executes these instructions in a sequential manner. These instructions may also be translated into analytic form, thus giving the code transparency as well as flexibility. We will present an overview of how the ArbiTER code is to work, as well as preliminary results from early versions of this code. Work supported by the U.S. DOE.
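
    The equation-parser idea, where model terms are assembled from an input file rather than hard-coded, can be illustrated with a toy interpreter; the instruction format, operator names and profile below are invented for illustration and bear no relation to ArbiTER's actual input language.

      import numpy as np

      N = 8

      def d2dx2(n):        # elementary second-derivative stencil on a unit grid
          return (np.diag(-2.0 * np.ones(n)) +
                  np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))

      operators = {"d2dx2": d2dx2, "identity": np.eye}
      profile = np.linspace(1.0, 2.0, N)            # a made-up profile function

      # "Input file": build  L = d2dx2 + profile * identity,  one instruction per entry.
      instructions = [("add", "d2dx2", None),
                      ("add", "identity", profile)]

      L = np.zeros((N, N))
      for verb, name, coeff in instructions:        # instructions executed sequentially
          term = operators[name](N)
          if coeff is not None:
              term = np.diag(coeff) @ term
          if verb == "add":
              L += term

      print(np.linalg.eigvals(L).round(3))          # the assembled matrix then goes to an eigensolver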

  12. Extracting Various Classes of Data From Biological Text Using the Concept of Existence Dependency.

    Science.gov (United States)

    Taha, Kamal

    2015-11-01

    One of the key goals of biological natural language processing (NLP) is the automatic information extraction from biomedical publications. Most current constituency and dependency parsers overlook the semantic relationships between the constituents comprising a sentence and may not be well suited for capturing complex long-distance dependences. We propose in this paper a hybrid constituency-dependency parser for biological NLP information extraction called EDCC. EDCC aims at enhancing the state of the art of biological text mining by applying novel linguistic computational techniques that overcome the limitations of current constituency and dependency parsers outlined earlier, as follows: 1) it determines the semantic relationship between each pair of constituents in a sentence using novel semantic rules; and 2) it applies a semantic relationship extraction model that extracts information from different structural forms of constituents in sentences. EDCC can be used to extract different types of data from biological texts for purposes such as protein function prediction, genetic network construction, and protein-protein interaction detection. We evaluated the quality of EDCC by comparing it experimentally with six systems. Results showed marked improvement.

  13. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system which allows hands off operation. Thus it may be placed on the experiment side of the high voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample rate changing, which may be intermixed with multiple post trigger operations with variable length blocks using normal, peak to peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation both at PPPL and in general

  14. Energy efficiency initiatives: Indian experience

    Energy Technology Data Exchange (ETDEWEB)

    Dey, Dipankar [ICFAI Business School, Kolkata, (IBS-K) (India)

    2007-07-01

    India, with a population of over 1.10 billion, is one of the fastest growing economies of the world. As domestic sources of different conventional commercial energy are drying up, dependence on foreign energy sources is increasing. There exists a huge potential for saving energy in India. After the first 'oil shock' (1973), the government of India realized the need for conservation of energy and a 'Petroleum Conservation Action Group' was formed in 1976. Since then many initiatives aiming at energy conservation and improving energy efficiency have been undertaken (the establishment of the Petroleum Conservation Research Association in 1978; the notification of the Eco labelling scheme in 1991; the formation of the Bureau of Energy Efficiency in 2002). But no such initiative was successful. In this paper an attempt has been made to analyze the changing importance of the energy conservation/efficiency measures which have been initiated in India between 1970 and 2005. The present study tries to analyze the limitations and the reasons for the failure of those initiatives. The probable reasons are: the fuel pricing mechanism (including subsidies), political factors, corruption and unethical practices, the influence of oil and related industry lobbies - both internal and external, the economic situation and the prolonged protection of domestic industries. Further, as India is opening its economy, the study explores the opportunities that the globally competitive market would offer to improve the overall energy efficiency of the economy. The study suggests that the Bureau of Energy Efficiency (BEE) - the newly formed nodal agency for improving the energy efficiency of the economy - may be made an autonomous institution where intervention from the politicians would be very low. For proper implementation of different initiatives to improve energy efficiency, BEE should involve the civil societies (NGOs) more closely, from the inception to the implementation stage of the programs. The paper also

  15. Donatus: uma interface amigável para o estudo da sintaxe formal utilizando a biblioteca em Python do NLTK

    Directory of Open Access Journals (Sweden)

    Leonel Figueiredo de Alencar

    2012-12-01

    Full Text Available This work aims, first of all, to demonstrate the usefulness of the CFG and the FCFG in the study of formal syntax. Applying parsers based on these formalisms to the analysis of a corpus can reveal consequences of a given analysis that would otherwise go unnoticed. NLTK is a toolkit for NLP in Python that makes it possible to build parsers with different architectures. However, non-trivial use of this library for automatic syntactic analysis requires programming knowledge. To give non-programmers access to the implementation and testing of parsers, we developed Donatus, a user-friendly graphical interface to the parsing facilities of NLTK, endowed with additional resources that make it attractive to programmers as well. As an example of how the tool works, and to demonstrate its relevance to formal syntactic investigation, we compare implementations of two alternative analyses of adjectival modification in Portuguese. The first approach, based on traditional X-bar Theory, produced a large number of spurious ambiguities. This problem was avoided by a parser based on an approach within the Minimalist Program. Without the computer as a resource, this difference between the two approaches would not be easily revealed.
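
    A minimal NLTK sketch of the kind of CFG parsing that Donatus exposes through its interface; the toy Portuguese grammar below is invented and is not the article's X-bar or Minimalist analysis.

      import nltk

      grammar = nltk.CFG.fromstring("""
          S   -> NP VP
          NP  -> Det N | Det N Adj
          VP  -> V NP
          Det -> 'o' | 'um'
          N   -> 'menino' | 'livro'
          Adj -> 'novo'
          V   -> 'leu'
      """)

      parser = nltk.ChartParser(grammar)
      for tree in parser.parse("o menino leu um livro novo".split()):
          print(tree)   # prints the single parse tree licensed by this toy grammar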

  17. Complex-radical copolymerization of vinyl monomers on organoelemental initiators

    International Nuclear Information System (INIS)

    Grishin, D.F.

    1993-01-01

    Data on the regularities of initiation and growth in the (co)polymerization of polar vinyl monomers on organoelemental initiators, organoboron ones in particular, are generalized. The effect of organometallic compounds and some phenol-type inhibitors on the rate of acrylate (co)polymerization is analyzed in terms of the change in the electron-acceptor properties (electrophilicity) of the macroradicals.

  18. Semi-automated ontology generation and evolution

    Science.gov (United States)

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

    Extending the notion of data models or object models, ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, only requesting user input when un-resolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical to ontological mappings in the form of WordNet, and Suggested Upper Merged Ontology (SUMO) integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts of speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user assisted semantic-based ontology generation. This paper will describe the OGEP technology in the context of the architectural
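
    The lexical-analysis and WordNet-grounding steps attributed to the Corpus Parser and Corpus Reasoner can be approximated with NLTK; the sketch below is only a rough analogue, not OGEP code, and it assumes the punkt, tagger and wordnet data packages have been downloaded.

      import nltk
      from nltk.corpus import wordnet as wn

      sentence = "The radar tracks aircraft over the coastline."
      tagged = nltk.pos_tag(nltk.word_tokenize(sentence))   # annotate words with parts of speech

      for word, tag in tagged:
          if tag.startswith("NN"):                          # ground nouns in the WordNet lexicon
              synsets = wn.synsets(word, pos=wn.NOUN)
              if synsets:
                  print(word, "->", synsets[0].definition())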

  19. PATMA: parser of archival tissue microarray

    Directory of Open Access Journals (Sweden)

    Lukasz Roszkowiak

    2016-12-01

    Full Text Available Tissue microarrays are commonly used in modern pathology for cancer tissue evaluation, as it is a very potent technique. Tissue microarray slides are often scanned to perform computer-aided histopathological analysis of the tissue cores. For processing the image, splitting the whole virtual slide into images of individual cores is required. The only way to distinguish cores corresponding to specimens in the tissue microarray is through their arrangement. Unfortunately, distinguishing the correct order of cores is not a trivial task as they are not labelled directly on the slide. The main aim of this study was to create a procedure capable of automatically finding and extracting cores from archival images of the tissue microarrays. This software supports the work of scientists who want to perform further image processing on single cores. The proposed method is an efficient and fast procedure, working in fully automatic or semi-automatic mode. A total of 89% of punches were correctly extracted with automatic selection. With an addition of manual correction, it is possible to fully prepare the whole slide image for extraction in 2 min per tissue microarray. The proposed technique requires minimum skill and time to parse big array of cores from tissue microarray whole slide image into individual core images.

  20. Modeling Algorithms in SystemC and ACL2

    Directory of Open Access Journals (Sweden)

    John W. O'Leary

    2014-06-01

    Full Text Available We describe the formal language MASC, based on a subset of SystemC and intended for modeling algorithms to be implemented in hardware. By means of a special-purpose parser, an algorithm coded in SystemC is converted to a MASC model for the purpose of documentation, which in turn is translated to ACL2 for formal verification. The parser also generates a SystemC variant that is suitable as input to a high-level synthesis tool. As an illustration of this methodology, we describe a proof of correctness of a simple 32-bit radix-4 multiplier.

  1. MEDINA: MECCA Development in Accelerators – KPP Fortran to CUDA source-to-source Pre-processor

    Directory of Open Access Journals (Sweden)

    Michail Alvanos

    2017-04-01

    Full Text Available The global climate model ECHAM/MESSy Atmospheric Chemistry (EMAC) is a modular global model that simulates climate change and air quality scenarios. The application includes different sub-models for the calculation of chemical species concentrations, their interaction with land and sea, and the human interaction. The paper presents a source-to-source parser that enables support for Graphics Processing Units (GPUs) by the Kinetic Pre-Processor (KPP) general-purpose open-source software tool. The requirements of the host system are also described. The source code of the source-to-source parser is available under the MIT License.

  2. Data classification based on the hybrid intellectual technology

    Directory of Open Access Journals (Sweden)

    Demidova Liliya

    2018-01-01

    Full Text Available In this paper a data classification technique involving the sequential application of SVM and Parzen classifiers is suggested. The Parzen classifier is applied to data which can be either correctly or erroneously classified by the SVM classifier and which lie in experimentally defined subareas near the hyperplane that separates the classes. Herewith, the SVM classifier is used with the default parameter values, and the optimal parameter values of the Parzen classifier are determined using a genetic algorithm. Experimental results confirming the effectiveness of the proposed hybrid intellectual data classification technology are presented.
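
    The two-stage idea can be sketched with scikit-learn: a default-parameter SVM classifies everything, and a Parzen-window (kernel density) classifier re-decides only the samples that fall inside a band around the separating hyperplane. The band width and kernel bandwidth below are arbitrary placeholders, and the genetic-algorithm tuning of the Parzen parameters described in the paper is omitted.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.svm import SVC
      from sklearn.neighbors import KernelDensity

      X, y = make_classification(n_samples=400, n_features=5, random_state=0)
      svm = SVC().fit(X, y)                      # SVM with default parameter values

      margin = np.abs(svm.decision_function(X))
      near = margin < 0.5                        # "subarea" near the separating hyperplane

      # One Parzen-window density per class, fitted on the confidently classified points.
      kde = {c: KernelDensity(bandwidth=1.0).fit(X[(y == c) & ~near])
             for c in np.unique(y)}

      pred = svm.predict(X)
      for i in np.where(near)[0]:                # re-decide only the uncertain samples
          scores = {c: kde[c].score_samples(X[i:i + 1])[0] for c in kde}
          pred[i] = max(scores, key=scores.get)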

  3. Diagnostics on LALR(k) conflicts based on a method for LR(k) testing

    DEFF Research Database (Denmark)

    Kristensen, Bent Bruun; Madsen, Ole Lehrmann

    1981-01-01

    A user of an LALR(k) parser generator system may have difficulties in understanding how a given LALR(k) conflict is generated. This is especially difficult if the conflict does not correspond to an LR(k) conflict. A practical method for giving informative diagnostics on LALR(k) conflicts is presented.

  4. MATH: A Scientific Tool for Numerical Methods Calculation and Visualization

    Directory of Open Access Journals (Sweden)

    Henrich Glaser-Opitz

    2016-02-01

    Full Text Available MATH is an easy-to-use application for various numerical method calculations, with a graphical user interface and an integrated plotting tool, written in Qt with extensive use of the Qwt library for plotting options and of the Gsl and MuParser libraries as numerical and parser helper libraries. It can be found at http://sourceforge.net/projects/nummath. MATH is a convenient tool for use in the education process because of its capability of showing every important step in the solution process, which helps students understand how a solution is obtained. MATH also enables fast comparison of the speed and precision of similar methods.

  5. Development of a process analyzer for trace uranium

    International Nuclear Information System (INIS)

    Hiller, J.M.

    1990-01-01

    A process analyzer, based on time-resolved laser-induced luminescence, is being developed for the Department of Energy's Oak Ridge Y-12 Plant for the ultra-trace determination of uranium. The present instrument has a detection limit of 1 μg/L; the final instrument will have a detection limit near 1 ng/L for continuous environmental monitoring. Time-resolved luminescence decay is used to enhance sensitivity, reduce interferences, and eliminate the need for standard addition. The basic analyzer sequence is: a pulse generator triggers the laser; the laser beam strikes a photodiode which initiates data acquisition and synchronizes the timing, nearly simultaneously, laser light strikes the sample; intensity data are collected under control of the gated photon counter; and the cycle repeats as necessary. Typically, data are collected in 10 μs intervals over 700 μs (several luminescence half-lives). The final instrument will also collect and prepare samples, calibrate itself, reduce the raw data, and transmit reduced data to the control station(s)
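
    The time-resolved part of the measurement amounts to fitting an exponential decay to the gated photon counts; the sketch below uses the 10 μs gates over 700 μs quoted above, but the count and lifetime values are synthetic placeholders, not instrument data.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(0)
      t = np.arange(0.0, 700.0, 10.0)                                  # gate start times, microseconds
      counts = 5000.0 * np.exp(-t / 250.0) + rng.poisson(20, t.size)   # synthetic decay + background

      def decay(t, amplitude, lifetime, background):
          return amplitude * np.exp(-t / lifetime) + background

      (amplitude, lifetime, background), _ = curve_fit(decay, t, counts,
                                                       p0=(counts[0], 200.0, 10.0))
      # The fitted amplitude (counts at t = 0) tracks the uranium signal, while the
      # lifetime helps separate its luminescence from faster-decaying interferences.
      print(f"lifetime = {lifetime:.0f} us, amplitude = {amplitude:.0f} counts")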

  6. 3002 Humidified Tandem Differential Mobility Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Uin, Janek [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Brechtel Manufacturing Inc. (BMI) Humidified Tandem Differential Mobility Analyzer (HT-DMA Model 3002) (Brechtel and Kreidenweis 2000a,b, Henning et al. 2005, Xerxes et al. 2014) measures how aerosol particles of different initial dry sizes grow or shrink when exposed to changing relative humidity (RH) conditions. It uses two different mobility analyzers (DMA) and a humidification system to make the measurements. One DMA selects a narrow size range of dry aerosol particles, which are exposed to varying RH conditions in the humidification system. The second (humidified) DMA scans the particle size distribution output from the humidification system. Scanning a wide range of particle sizes enables the second DMA to measure changes in size or growth factor (growth factor = humidified size/dry size), due to water uptake by the particles. A Condensation Particle Counter (CPC) downstream of the second DMA counts particles as a function of selected size in order to obtain the number size distribution of particles exposed to different RH conditions.

  7. Tubular Initial Conditions and Ridge Formation

    Directory of Open Access Journals (Sweden)

    M. S. Borysova

    2013-01-01

    Full Text Available The 2D azimuth and rapidity structure of the two-particle correlations in relativistic A+A collisions is altered significantly by the presence of sharp inhomogeneities in the superdense matter formed in such processes. The causality constraints force one to associate the long-range longitudinal correlations observed in a narrow angular interval, the so-called (soft) ridge, with peculiarities of the initial conditions of the collision process. This study's objective is to analyze whether multiform initial tubular structures, undergoing the subsequent hydrodynamic evolution and gradual decoupling, can form the soft ridges. Motivated by the flux-tube scenarios, the initial energy density distribution contains different numbers of high-density tube-like boost-invariant inclusions that form a bumpy structure in the transverse plane. The influence of various structures of such initial conditions in the most central A+A events on the collective evolution of matter, the resulting spectra, the angular particle correlations and the v_n coefficients is studied in the framework of the hydrokinetic model (HKM).

  8. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  9. The data quality analyzer: a quality control program for seismic data

    Science.gov (United States)

    Ringler, Adam; Hagerty, M.T.; Holland, James F.; Gonzales, A.; Gee, Lind S.; Edwards, J.D.; Wilson, David; Baker, Adam

    2015-01-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several initiatives underway to enhance and track the quality of data produced from ASL seismic stations and to improve communication about data problems to the user community. The Data Quality Analyzer (DQA) is one such development and is designed to characterize seismic station data quality in a quantitative and automated manner.

  10. Self-help initiatives and rural development in Ibesikpo community of ...

    African Journals Online (AJOL)

    This study investigates the impact of self-help initiatives on rural development in Ibesikpo community of Akwa Ibom State, Nigeria. Self help initiatives were defined in terms of provision of employment, education and health-care. A sample size of 369 rural dwellers was drawn and data were analyzed using simple regression ...

  11. Interactive nuclear plant analyzer for the VVER-440 reactor

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for a VVER-440 model 213 reactor for use in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. This NPA is operational on an IBM RISC-6000 workstation and utilizes the RELAP5/MOD2 computer code for the calculation of the VVER-440 reactor response to the interactive commands initiated by the NPA operator. Results of the interactive calculation can be viewed through the user-defined digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperatures of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask

  12. Mapping equality in access : the case of Bogota's sustainable transport initiatives

    NARCIS (Netherlands)

    Teunissen, Thijs; Sarmiento, Olga; Zuidgeest (Former Assistant Professor), Mark; Brussel, M.J.G.

    2015-01-01

    To enhance social equity, three important sustainable transportation initiatives have been introduced in Bogotá. Spatial information and GIS have been used to analyze levels of inequality in access to these initiatives. The results show that the TransMilenio BRT offers equal access for all

  13. Education Policy and Family Values: A Critical Analysis of Initiatives from the Right

    Science.gov (United States)

    Kumashiro, Kevin K.

    2009-01-01

    This article analyzes current education policy initiatives from the political Right in the United States, focusing on initiatives at the federal level (standards and testing), the state level (funding), the local level (alternative certification), and the campus level (censorship). Each initiative has received wide bipartisan and public support,…

  14. Implication of the dominant design in electronic initiation systems in the South African mining industry

    CSIR Research Space (South Africa)

    Smit, FC

    1998-11-01

    Full Text Available This article analyzes an emerging technological innovation, namely, electronic initiation systems for mining explosives in South Africa. The concept of electronic initiation is presenting itself as a challenge to traditional initiation systems...

  15. THE RELEVANCE OF SUBSIDIARY INITIATIVES FOR BRAZILIAN MULTINATIONALS

    Directory of Open Access Journals (Sweden)

    Moacir de Miranda Oliveira Junior

    2009-07-01

    Full Text Available The purpose of this paper is to analyze relationship patterns between the headquarters and subsidiaries of Brazilian Multinational Enterprises (BrMNEs). The key construct for that investigation is Subsidiary Initiative, which comprises Subsidiary Entrepreneurial Orientation, Autonomy, Integration, Local Competitive Context and Business Network. A survey was carried out in a sample of 65 subsidiaries of 29 BrMNEs. The main outcome is that subsidiaries are highly integrated and receive Entrepreneurial Orientation from Headquarters (HQs), but Initiative is limited. Actually, the main determinants of subsidiaries' initiatives are the Local Context and Business Networking in the host country. This apparent paradox may be explained by what we call 'rebellious subsidiaries', which take initiatives based on their business environment and connections, regardless of their HQs' directions or delegation of autonomy.

  16. Initiation devices, initiation systems including initiation devices and related methods

    Energy Technology Data Exchange (ETDEWEB)

    Daniels, Michael A.; Condit, Reston A.; Rasmussen, Nikki; Wallace, Ronald S.

    2018-04-10

    Initiation devices may include at least one substrate, an initiation element positioned on a first side of the at least one substrate, and a spark gap electrically coupled to the initiation element and positioned on a second side of the at least one substrate. Initiation devices may include a plurality of substrates where at least one substrate of the plurality of substrates is electrically connected to at least one adjacent substrate of the plurality of substrates with at least one via extending through the at least one substrate. Initiation systems may include such initiation devices. Methods of igniting energetic materials include passing a current through a spark gap formed on at least one substrate of the initiation device, passing the current through at least one via formed through the at least one substrate, and passing the current through an explosive bridge wire of the initiation device.

  17. The initial value problem of scalar-tensor theories of gravity

    Energy Technology Data Exchange (ETDEWEB)

    Salgado, Marcelo; Martinez del Rio, David [Instituto de Ciencias Nucleares Universidad Nacional Autonoma de Mexico Apdo. Postal 70-543 Mexico 04510 D.F. (Mexico)

    2007-11-15

    The initial value problem of scalar-tensor theories of gravity (STT) is analyzed in the physical (Jordan) frame using a 3+1 decomposition of spacetime. A first-order strongly hyperbolic system is obtained for which the well-posedness of the Cauchy problem can be established. We provide two simple applications of the 3+1 system of equations: one to static and spherically symmetric spacetimes, which allows the construction of unstable initial data (compact objects) for which the subsequent black hole formation and scalar gravitational wave emission can be analyzed, and another to homogeneous and isotropic spacetimes, which permits the study of the dynamics of the Universe in the framework of STT.

  18. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e.g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature resistant radiation source (85Kr, 3H, or 63Ni).

  19. The US proliferation security initiative (PSI); L'initiative americaine de securite contre la proliferation (PSI)

    Energy Technology Data Exchange (ETDEWEB)

    Gregoire, B

    2004-10-01

    The proliferation security initiative (PSI), launched by President Bush on May 31, 2003, aims at intercepting any transfer of mass destruction weapons, of their vectors and related equipments, towards or coming from countries or organizations suspected to have a proliferation activity. This initiative, which involves coercive means to fight against proliferation, raises international lawfulness and legal questions, the answers of which are today under construction. This article analyzes the place of the European Union in the PSI, the action means (optimization of existing means, cooperation between intelligence and interception services), and the PSI stakes (lawfulness with respect to the international law, bilateral agreements, draft boarding agreement, sustain of the United Nations, widening of the partnership and of the field of action). (J.S.)

  20. QUEST/Ada: Query utility environment for software testing of Ada

    Science.gov (United States)

    Brown, David B.

    1989-01-01

    Results of research and development efforts are presented for Task 1, Phase 2 of a general project entitled, The Development of a Program Analysis Environment for Ada. A prototype of the QUEST/Ada system was developed to collect data to determine the effectiveness of the rule-based testing paradigm. The prototype consists of five parts: the test data generator, the parser/scanner, the test coverage analyzer, a symbolic evaluator, and a data management facility, known as the Librarian. These components are discussed at length. Also presented is an experimental design for the evaluations, an overview of the project, and a schedule for its completion.

  1. Regulation of protein translation initiation in response to ionizing radiation

    International Nuclear Information System (INIS)

    Trivigno, Donatella; Bornes, Laura; Huber, Stephan M; Rudner, Justine

    2013-01-01

    Proliferating tumor cells require continuous protein synthesis. De novo synthesis of most proteins is regulated through cap-dependent translation. Cellular stress such as ionizing radiation (IR) blocks cap-dependent translation, resulting in a shut-down of global protein translation which saves resources and energy needed for the stress response. At the same time, levels of proteins required for the stress response are maintained or even increased. The study aimed to analyze the regulation of signaling pathways controlling protein translation in response to IR and the impact on Mcl-1, an anti-apoptotic and radioprotective protein, whose levels rapidly decline upon IR. Protein levels and processing were analyzed by Western blot. The assembly of the translational pre-initiation complex was examined by immunoprecipitation and pull-down experiments with 7-methyl GTP agarose. To analyze IR-induced cell death, dissipation of the mitochondrial membrane potential and DNA fragmentation were determined by flow cytometry. Protein levels of the different initiation factors were down-regulated using an RNA interference approach. IR induced caspase-dependent cleavage of the translational initiation factors eIF4G1, eIF3A, and eIF4B, resulting in disassembly of the cap-dependent initiation complex. In addition, the DAP5-dependent initiation complex that regulates IRES-dependent translation was disassembled in response to IR. Moreover, IR resulted in dephosphorylation of 4EBP1, an inhibitor of cap-dependent translation upstream of caspase activation. However, knock-down of eIF4G1, eIF4B, DAP5, or 4EBP1 did not affect the IR-induced decline of the anti-apoptotic protein Mcl-1. Our data show that cap-dependent translation is regulated at several levels in response to IR. However, the experiments indicate that the IR-induced Mcl-1 decline is not a consequence of translational inhibition in Jurkat cells.

  2. Regulation of protein translation initiation in response to ionizing radiation

    Directory of Open Access Journals (Sweden)

    Trivigno Donatella

    2013-02-01

    Full Text Available Background: Proliferating tumor cells require continuous protein synthesis. De novo synthesis of most proteins is regulated through cap-dependent translation. Cellular stress such as ionizing radiation (IR) blocks cap-dependent translation, resulting in a shut-down of global protein translation which saves resources and energy needed for the stress response. At the same time, levels of proteins required for the stress response are maintained or even increased. The study aimed to analyze the regulation of signaling pathways controlling protein translation in response to IR and the impact on Mcl-1, an anti-apoptotic and radioprotective protein, whose levels rapidly decline upon IR. Methods: Protein levels and processing were analyzed by Western blot. The assembly of the translational pre-initiation complex was examined by immunoprecipitation and pull-down experiments with 7-methyl GTP agarose. To analyze IR-induced cell death, dissipation of the mitochondrial membrane potential and DNA fragmentation were determined by flow cytometry. Protein levels of the different initiation factors were down-regulated using an RNA interference approach. Results: IR induced caspase-dependent cleavage of the translational initiation factors eIF4G1, eIF3A, and eIF4B, resulting in disassembly of the cap-dependent initiation complex. In addition, the DAP5-dependent initiation complex that regulates IRES-dependent translation was disassembled in response to IR. Moreover, IR resulted in dephosphorylation of 4EBP1, an inhibitor of cap-dependent translation upstream of caspase activation. However, knock-down of eIF4G1, eIF4B, DAP5, or 4EBP1 did not affect the IR-induced decline of the anti-apoptotic protein Mcl-1. Conclusion: Our data show that cap-dependent translation is regulated at several levels in response to IR. However, the experiments indicate that the IR-induced Mcl-1 decline is not a consequence of translational inhibition in Jurkat cells.

  3. Analysis on the Initial Cracking Parameters of Cross-Measure Hydraulic Fracture in Underground Coal Mines

    Directory of Open Access Journals (Sweden)

    Yiyu Lu

    2015-07-01

    Full Text Available Initial cracking pressure and locations are important parameters in conducting cross-measure hydraulic fracturing to enhance coal seam permeability in underground coal mines, and they are significantly influenced by the in-situ stress and the occurrence of the coal seam. In this study, the stress state around cross-measure fracturing boreholes was analyzed using an in-situ stress coordinate transformation, and a mathematical model was then developed to evaluate the initial cracking parameters of the borehole assuming the maximum tensile stress criterion. Subsequently, the influences of the in-situ stress and the coal seam occurrence on the initial cracking pressure and locations in underground coal mines were analyzed using the proposed model. Finally, the proposed model was verified with field test data. The results suggest that the initial cracking pressure increases with the depth of cover and the coal seam dip angle, but decreases as the azimuth of the major principal stress increases. The results also indicate that the initial cracking locations are concentrated in the second and fourth quadrants in polar coordinates, and shift toward the strike of the coal seam as the coal seam dip angle and the azimuth of the maximum principal stress increase. The field investigation revealed a trend consistent with the developed model, namely that the initial cracking pressure increases with the coal seam dip angle. Therefore, the proposed mathematical model provides theoretical insight for analyzing the initial cracking parameters during cross-measure hydraulic fracturing in underground coal mines.
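
    The in-situ stress coordinate transformation at the heart of the model is an ordinary tensor rotation; the following numpy sketch uses invented principal stresses and angles purely to show the operation, not the paper's data or its full borehole solution.

      import numpy as np

      sigma_principal = np.diag([20.0, 15.0, 10.0])      # MPa, principal in-situ stresses (toy values)
      dip, azimuth = np.radians(30.0), np.radians(45.0)  # invented seam dip / major-stress azimuth

      Rz = np.array([[np.cos(azimuth), -np.sin(azimuth), 0.0],
                     [np.sin(azimuth),  np.cos(azimuth), 0.0],
                     [0.0, 0.0, 1.0]])
      Ry = np.array([[np.cos(dip), 0.0, np.sin(dip)],
                     [0.0, 1.0, 0.0],
                     [-np.sin(dip), 0.0, np.cos(dip)]])
      R = Ry @ Rz

      sigma_borehole = R @ sigma_principal @ R.T         # stress tensor expressed in the borehole frame
      print(np.round(sigma_borehole, 2))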

  4. Сoncept of national legislative initiative and its types

    Directory of Open Access Journals (Sweden)

    А. Л. Крутько

    2015-11-01

    Full Text Available National legislative initiative is a new instrument for expressing the popular will in comparison with other forms of direct democracy. In most developed democracies this institution is regulated at the constitutional or legislative level, but in modern Ukraine its constitutional and legal regulation is absent, owing to disregard of its possibilities and a lack of understanding of its essence. Paper objective. The aim of this article is to analyze in detail the definition of 'national legislative initiative' and to determine its basic types according to theoretical insights and current foreign law. Recent research and publications analysis. Domestic and foreign scholars such as V.N. Rudenko, O.M. Mudra, V.M. Shapoval, V.F. Nesterovich and J. F. Zimmerman have carried out scientific research on the institution of the national legislative initiative, and their works were foundational at the time of writing. Paper main body. With the help of a comprehensive explanatory dictionary and a new encyclopedic dictionary, the etymology of the concept 'initiative', which is characterized as the basis, was traced, and the meanings of 'legislative initiative', 'national initiative' and 'national legislative initiative' were established. It was argued that 'national initiative' cannot be identified with 'national legislative initiative'. The current definitions of the national legislative initiative are analyzed in the article. It is noted that the suggested terms are limited to identifying the institution's apparent features and withhold its essence. This is precisely why four types of realization of the national legislative initiative are briefly examined, in order to determine the definition comprehensively; these types depend on the role the legislator assigns to the citizens, who are the main actors of the initiative. On the basis of this analysis the author provides his own definition of the 'national legislative initiative'. The author notes that the proposed definition was not

  5. Mass spectrometer calibration of Cosmic Dust Analyzer

    Science.gov (United States)

    Ahrens, Thomas J.; Gupta, Satish C.; Jyoti, G.; Beauchamp, J. L.

    2003-02-01

    The time-of-flight (TOF) mass spectrometer (MS) of the Cosmic Dust Analyzer (CDA) instrument aboard the Cassini spacecraft is expected to be placed in orbit about Saturn to sample submicrometer-diameter ring particles and impact ejecta from Saturn's satellites. The CDA measures a mass spectrum of each particle that impacts the chemical analyzer sector of the instrument. Particles impact a Rh target plate at velocities of 1-100 km/s and produce some 10^-8 to 10^-5 times the particle mass of positive, singly charged ions. These are analyzed via a TOF MS. In initial tests, a pulsed N2 laser acting on samples of kamacite, pyrrhotite, serpentine, olivine, and the Murchison meteorite induced bursts of ions, which were detected with a microchannel plate and a charge-sensitive amplifier (CSA). Pulses from the N2 laser (10^11 W/cm^2) are assumed to simulate particle impact. Using aluminum alloy as a test sample, each pulse produces a charge of ~4.6 pC (mostly Al+1), whereas irradiation of a stainless steel target produces a ~2.8 pC (Fe+1) charge. Thus the present system yields ~10^-5% of the laser energy in the resulting ions. A CSA signal indicates that at the position of the microchannel plate, the ion detector geometry is such that some 5% of the laser-induced ions are collected in the CDA geometry. Employing a multichannel plate detector in this MS yields, for Al-Mg-Cu alloy and kamacite targets, well-defined peaks at 24 (Mg+1), 27 (Al+1), and 64 (Cu+1) and 56 (Fe+1), 58 (Ni+1), and 60 (Ni+1) dalton, respectively.
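
    The mass peaks quoted above follow from the usual time-of-flight relation t = L * sqrt(m / (2qU)); the drift length and accelerating voltage in this sketch are nominal placeholder values, not CDA instrument constants.

      import numpy as np

      L = 0.23          # drift length in metres (placeholder value)
      U = 1000.0        # accelerating potential in volts (placeholder value)
      e = 1.602e-19     # elementary charge, C
      amu = 1.661e-27   # atomic mass unit, kg

      def flight_time(mass_amu, charge=1):
          """Drift time of an ion accelerated through U and flying a field-free length L."""
          return L * np.sqrt(mass_amu * amu / (2.0 * charge * e * U))

      for ion, m in [("Mg+", 24), ("Al+", 27), ("Fe+", 56), ("Cu+", 64)]:
          print(ion, f"{flight_time(m) * 1e6:.2f} microseconds")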

  6. Building pathway graphs from BioPAX data in R.

    Science.gov (United States)

    Benis, Nirupama; Schokker, Dirkjan; Kramer, Frank; Smits, Mari A; Suarez-Diez, Maria

    2016-01-01

    Biological pathways are increasingly available in the BioPAX format which uses an RDF model for data storage. One can retrieve the information in this data model in the scripting language R using the package rBiopaxParser, which converts the BioPAX format to one readable in R. It also has a function to build a regulatory network from the pathway information. Here we describe an extension of this function. The new function allows the user to build graphs of entire pathways, including regulated as well as non-regulated elements, and therefore provides a maximum of information. This function is available as part of the rBiopaxParser distribution from Bioconductor.

  7. The US proliferation security initiative (PSI)

    International Nuclear Information System (INIS)

    Gregoire, B.

    2004-01-01

    The proliferation security initiative (PSI), launched by President Bush on May 31, 2003, aims at intercepting any transfer of weapons of mass destruction, of their delivery vectors and of related equipment, towards or coming from countries or organizations suspected of having a proliferation activity. This initiative, which involves coercive means to fight against proliferation, raises questions of international lawfulness and legality, the answers to which are still under construction. This article analyzes the place of the European Union in the PSI, the means of action (optimization of existing means, cooperation between intelligence and interception services), and the stakes of the PSI (lawfulness with respect to international law, bilateral agreements, draft boarding agreements, support of the United Nations, widening of the partnership and of the field of action). (J.S.)

  8. An Additional Method for Analyzing the Reversible Inhibition of an Enzyme Using Acid Phosphatase as a Model

    OpenAIRE

    Baumhardt, Jordan M.; Dorsey, Benjamin M.; McLauchlan, Craig C.; Jones, Marjorie A.

    2015-01-01

    Using wheat germ acid phosphatase and sodium orthovanadate as a competitive inhibitor, a novel method for analyzing reversible inhibition was carried out. Our alternative approach involves plotting the initial velocity at which product is formed as a function of the ratio of substrate concentration to inhibitor concentration at a constant enzyme concentration and constant assay conditions. The concept of initial concentrations driving equilibrium leads to the chosen axes. Three apparent const...

  9. Computer-controlled system for plasma ion energy auto-analyzer

    International Nuclear Information System (INIS)

    Wu Xianqiu; Chen Junfang; Jiang Zhenmei; Zhong Qinghua; Xiong Yuying; Wu Kaihua

    2003-01-01

    A computer-controlled system for a plasma ion energy auto-analyzer was studied for rapid, online measurement of the plasma ion energy distribution. The system intelligently controls all the equipment via an RS-232 port, a printer port and a home-built circuit. The software, designed in the LabVIEW G language, automatically fulfils all of the tasks such as system initialization, adjustment of the scanning voltage, measurement of weak currents, data processing, graphic export, etc. Using the system, only a few minutes are needed to acquire the whole ion energy distribution, which rapidly provides important parameters for plasma processing techniques used in semiconductor devices and microelectronics

  10. Resolving Lexical Ambiguity in a Deterministic Parser

    OpenAIRE

    Milne, Robert W.

    1983-01-01

    This work is an investigation into part of the human sentence parsing mechanism (HSPM), where parsing implies syntactic and non-syntactic analysis. It is hypothesised that the HSPM consists of at least two processors. We will call the first processor the syntactic processor, and the second will be known as the non-syntactic processor. For normal sentence processing, the two processors are controlled by a 'normal component', whilst when an error occurs, they are controlled by a...

  11. The Decompositioning of Volatile-Matter of Tanjung Enim Coal by using Thermogravimetry Analyzer (TGA

    Directory of Open Access Journals (Sweden)

    Nukman Nukman

    2010-10-01

    Full Text Available Coal is a natural material and a kind of energy source. The decomposition of coal can be analyzed by heat treatment using a thermogravimetry analyzer, so the decomposition of the volatile matter for three kinds of Tanjung Enim coal could be determined. The activation energy values found differ: for Semi Anthracite, Bituminous and Sub Bituminous coal the initial temperatures are 60.8 °C, 70.7 °C and 97.8 °C, and the final temperatures are 893.8 °C, 832 °C and 584.6 °C, respectively.

  12. Combat Wound Initiative program.

    Science.gov (United States)

    Stojadinovic, Alexander; Elster, Eric; Potter, Benjamin K; Davis, Thomas A; Tadaki, Doug K; Brown, Trevor S; Ahlers, Stephen; Attinger, Christopher E; Andersen, Romney C; Burris, David; Centeno, Jose; Champion, Hunter; Crumbley, David R; Denobile, John; Duga, Michael; Dunne, James R; Eberhardt, John; Ennis, William J; Forsberg, Jonathan A; Hawksworth, Jason; Helling, Thomas S; Lazarus, Gerald S; Milner, Stephen M; Mullick, Florabel G; Owner, Christopher R; Pasquina, Paul F; Patel, Chirag R; Peoples, George E; Nissan, Aviram; Ring, Michael; Sandberg, Glenn D; Schaden, Wolfgang; Schultz, Gregory S; Scofield, Tom; Shawen, Scott B; Sheppard, Forest R; Stannard, James P; Weina, Peter J; Zenilman, Jonathan M

    2010-07-01

    The Combat Wound Initiative (CWI) program is a collaborative, multidisciplinary, and interservice public-private partnership that provides personalized, state-of-the-art, and complex wound care via targeted clinical and translational research. The CWI uses a bench-to-bedside approach to translational research, including the rapid development of a human extracorporeal shock wave therapy (ESWT) study in complex wounds after establishing the potential efficacy, biologic mechanisms, and safety of this treatment modality in a murine model. Additional clinical trials include the prospective use of clinical data, serum and wound biomarkers, and wound gene expression profiles to predict wound healing/failure and additional clinical patient outcomes following combat-related trauma. These clinical research data are analyzed using machine-based learning algorithms to develop predictive treatment models to guide clinical decision-making. Future CWI directions include additional clinical trials and study centers and the refinement and deployment of our genetically driven, personalized medicine initiative to provide patient-specific care across multiple medical disciplines, with an emphasis on combat casualty care.

  13. Aperture Valve for the Mars Organic Molecule Analyzer (MOMA)

    Science.gov (United States)

    Hakun, Claef F.; Engler, Charles D.; Barber, Willie E.; Canham, John S.

    2014-01-01

    NASA's participation in the multi-nation ExoMars 2018 Rover mission includes a critical astrobiology Mass Spectrometer Instrument on the Rover called the Mars Organic Molecule Analyzer (MOMA). The Aperture Valve is a critical electromechanical valve used by the Mass Spectrometer to facilitate the transfer of ions from Martian soil to the Mass Spectrometer for analysis. The MOMA Aperture Valve development program is discussed in terms of the initial valve design and the subsequent improvements that resulted from prototype testing. The initial Aperture Valve concept seemed promising, based on calculations and perceived merits. However, performance results of this design were disappointing, due to delamination of the TiN and DLC coatings applied to the titanium base metals, causing debris from the coatings to seize the valve. While peer reviews and design trade studies are important forums to vet a concept design, results from testing should not be underestimated. Despite the lack of development progress toward meeting requirements, valuable information from the weaknesses discovered in the initial valve design was used to develop a second, more robust Aperture Valve. Based on a check-ball design, the ETU flight valve required significantly less surface area to create the seal. Moreover, PVD coatings were eliminated in favor of hardened, nonmagnetic, corrosion-resistant alloys. Test results were impressive, with the valve achieving a sealing leak rate five orders of magnitude better than end-of-life requirements. Cycle life was equally impressive, achieving 280,000 cycles without failure.

  14. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but one must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  15. Analyzing relationship between ERP utilization and lean manufacturing maturity of Turkish SMEs

    DEFF Research Database (Denmark)

    Iris, Cagatay; Cebeci, Ufuk

    2014-01-01

    ... with the rapid development of information technology (IT) and progress in modern production management strategies have emerged. Obtained results show that Turkish SMEs have widely initiated lean production practices; however, applications are at an initial level in most cases. In respect to ERP systems ... is not adequate to make an inference about the relationship between ERP and lean practices. Hence, a relational model is developed to analyze the correlation between the use of ERP and lean manufacturing implementation in white goods manufacturing SMEs of Istanbul, Turkey. Design/methodology/approach - The examination ... Purpose - The purpose of this paper is to understand how effectively Turkish small and medium size enterprises (SMEs) use enterprise resource planning (ERP) systems in the module aspect and to assess their adherence to lean manufacturing requirements. Obtaining each efficiency result separately ...

  16. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  17. Factors associated with hookah use initiation among adolescents.

    Science.gov (United States)

    Reveles, Caroline C; Segri, Neuber J; Botelho, Clovis

    2013-01-01

    to determine the prevalence and to analyze factors associated with hookah use initiation among adolescents. This was a cross-sectional study, in which questionnaires were collected from 495 students attending public and private schools of the urban area of the city of Várzea Grande, in the state of Mato Grosso, Brazil. Data were analyzed through descriptive, bivariate, and multiple Poisson regression analyses. A total of 19.7% students had tried a hookah. The use of hookah was associated with the final period of adolescence [PR=6.54 (2.79, 15.32)]; enrollment in private schools [PR=2.23 (1.73, 2.88)]; and presence of work activities [PR=1.80 (1.17, 2.78)]. The proportion of adolescents that had tried a hookah was high. The influence of age, work activities, and class period on smoking initiation using the hookah was observed. Preventive measures encompassing all forms of tobacco smoking should be targeted at adolescents in the school environment, aiming at tobacco use control. Copyright © 2013 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  18. Rethinking sexual initiation: pathways to identity formation among gay and bisexual Mexican male youth.

    Science.gov (United States)

    Carrillo, Héctor; Fontdevila, Jorge

    2011-12-01

    The topic of same-sex sexual initiation has generally remained understudied in the literature on sexual identity formation among sexual minority youth. This article analyzes the narratives of same-sex sexual initiation provided by 76 gay and bisexual Mexican immigrant men who participated in interviews for the Trayectos Study, an ethnographic study of sexuality and HIV risk. These participants were raised in a variety of locations throughout Mexico, where they also realized their same-sex attraction and initiated their sexual lives with men. We argue that Mexican male same-sex sexuality is characterized by three distinct patterns of sexual initiation--one heavily-based on gender roles, one based on homosociality, and one based on object choice--which inform the men's interpretations regarding sexual roles, partner preferences, and sexual behaviors. We analyzed the social factors and forms of cultural/sexual socialization that lead sexual minority youth specifically to each of these three patterns of sexual initiation. Our findings confirm the importance of studying same-sex sexual initiation as a topic in its own right, particularly as a tool to gain a greater understanding of the diversity of same-sex sexual experiences and sexual identities within and among ethnic/cultural groups.

  19. The Global Reporting Initiative: What Value is Added?

    OpenAIRE

    W. Richard Sherman

    2011-01-01

    This paper explores the extent to which the Global Reporting Initiative G3 Reporting Framework adds value to the external reporting of a company's financial, environmental and social performance. This inquiry takes the form of analyzing the content of the published sustainability reports of well-known companies to compare and contrast the information communicated in these reports.

  20. Study on diagnostic plant analyzer method for support of emergency operation

    International Nuclear Information System (INIS)

    Yoshikawa, H.; Gofuku, A.; Itoh, K.; Wakabayashi, J.

    1986-01-01

    Methods for a time-critical diagnostic plant analyzer are investigated which would serve as support for the emergency operation of a nuclear power plant. A faster-than-real-time simulator code, TOKRAC, is developed for analyzing PWR primary loop thermo-hydraulics in a small-break LOCA, and it is applied in a numerical experiment on the initial phase of the TMI-2 accident. TOKRAC showed good agreement with a RELAP4/MOD6 calculation and with the plant record, while running in as little as one-tenth of real time. A real-time estimator of the SG heat transfer rate based on a Kalman filter is proposed and its applicability is verified using LOFT ATWS experimental data. With regard to integrating these methods into the software system of an operation support center, a new concept of a module-based simulation system is proposed which aims at offering a flexible, human-cognition-oriented environment for the development of various analytical tools

  1. Degradation of hydrocarbons in soil samples analyzed within accepted analytical holding times

    International Nuclear Information System (INIS)

    Jackson, J.; Thomey, N.; Dietlein, L.F.

    1992-01-01

    Samples which are collected in conjunction with subsurface investigations at leaking petroleum storage tank sites and petroleum refineries are routinely analyzed for benzene, toluene, ethylbenzene, xylenes (BTEX), and total petroleum hydrocarbons (TPH). Water samples are preserved by the addition of hydrochloric acid and maintained at four degrees centigrade prior to analysis. This is done to prevent bacterial degradation of hydrocarbons. Chemical preservation is not presently performed on soil samples. Instead, the samples are cooled and maintained at four degrees centigrade. This study was done to measure the degree of degradation of hydrocarbons in soil samples which are analyzed within accepted holding times. Soil samples were collected and representative subsamples were prepared from the initial sample. Subsamples were analyzed in triplicate for BTEX and TPH throughout the length of the approved holding times to measure the extent of sample constituent degradation prior to analysis. Findings imply that for sandy soils, BTEX and TPH concentrations can be highly dependent upon the length of time which elapses between sample collection and analysis

  2. Care initiation area yields dramatic results.

    Science.gov (United States)

    2009-03-01

    The ED at Gaston Memorial Hospital in Gastonia, NC, has achieved dramatic results in key department metrics with a Care Initiation Area (CIA) and a physician in triage. Here's how the ED arrived at this winning solution: Leadership was trained in and implemented the Kaizen method, which eliminates redundant or inefficient process steps. Simulation software helped determine additional space needed by analyzing arrival patterns and other key data. After only two days of meetings, new ideas were implemented and tested.

  3. Understanding of Navy Technical Language via Statistical Parsing

    National Research Council Canada - National Science Library

    Rowe, Neil C

    2004-01-01

    ... (both on word senses and word-sense pairs) from a training corpus with a statistical parser. Innovations of our approach are in statistical inheritance of binary co-occurrence probabilities and in weighting...

  4. Thromboelastography platelet mapping in healthy dogs using 1 analyzer versus 2 analyzers.

    Science.gov (United States)

    Blois, Shauna L; Banerjee, Amrita; Wood, R Darren; Park, Fiona M

    2013-07-01

    The objective of this study was to describe the results of thromboelastography platelet mapping (TEG-PM) carried out using 2 techniques in 20 healthy dogs. Maximum amplitudes (MA) generated by thrombin (MAthrombin), fibrin (MAfibrin), adenosine diphosphate (ADP) receptor activity (MAADP), and thromboxane A2 (TxA2) receptor activity (stimulated by arachidonic acid, MAAA) were recorded. Thromboelastography platelet mapping was carried out according to the manufacturer's guidelines (2-analyzer technique) and using a variation of this method employing only 1 analyzer (1-analyzer technique) on 2 separate blood samples obtained from each dog. Mean [± standard deviation (SD)] MA values for the 1-analyzer/2-analyzer techniques were: MAthrombin = 51.9 mm (± 7.1)/52.5 mm (± 8.0); MAfibrin = 20.7 mm (± 21.8)/23.0 mm (± 26.1); MAADP = 44.5 mm (± 15.6)/45.6 mm (± 17.0); and MAAA = 45.7 mm (± 11.6)/45.0 mm (± 15.4). Mean (± SD) percentage aggregation due to ADP receptor activity was 70.4% (± 32.8)/67.6% (± 33.7). Mean percentage aggregation due to TxA2 receptor activity was 77.3% (± 31.6)/78.1% (± 50.2). Results of TEG-PM were not significantly different for the 1-analyzer and 2-analyzer methods. High correlation was found between the 2 methods for MAfibrin [concordance correlation coefficient (r) = 0.930]; moderate correlation was found for MAthrombin (r = 0.70) and MAADP (r = 0.57); correlation between the 2 methods for MAAA was lower (r = 0.32). Thromboelastography platelet mapping (TEG-PM) should be further investigated to determine if it is a suitable method for measuring platelet dysfunction in dogs with thrombopathy.
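
    For readers who want to see how the percentage-aggregation figures relate to the MA components, the sketch below applies the relation commonly used in TEG platelet mapping; the formula and variable names are assumptions for illustration, not taken from this study, and because the study averaged per-dog ratios, plugging in group means only approximates the reported 70.4%.

```python
# Assumed TEG-PM relation (illustrative): percentage aggregation for an agonist channel
# is often expressed as (MA_agonist - MA_fibrin) / (MA_thrombin - MA_fibrin) * 100.
def percent_aggregation(ma_agonist: float, ma_fibrin: float, ma_thrombin: float) -> float:
    """Estimate percentage aggregation for an agonist channel (ADP or AA)."""
    return (ma_agonist - ma_fibrin) / (ma_thrombin - ma_fibrin) * 100.0

# Group means (mm) from the 1-analyzer technique; prints ~76%, close to but not equal
# to the reported 70.4% because the study averaged the ratio across individual dogs.
print(round(percent_aggregation(ma_agonist=44.5, ma_fibrin=20.7, ma_thrombin=51.9), 1))
```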

  5. Initial Characteristics and Mentoring Satisfaction of College Women Mentoring Youth: Implications for Training

    Science.gov (United States)

    Foukal, Martha D.; Lawrence, Edith C.; Williams, Joanna L.

    2016-01-01

    Being a youth mentor is popular among college students, yet little is known about how their initial characteristics are related to mentoring satisfaction. Survey data from college women enrolled in a youth mentoring program (n = 158) and a comparison group (n = 136) were analyzed to determine how initial characteristics of youth mentors (a) differ…

  6. Epidemiological Patterns of Initial and Subsequent Injuries in Collegiate Football Athletes.

    Science.gov (United States)

    Williams, Jacob Z; Singichetti, Bhavna; Li, Hongmei; Xiang, Henry; Klingele, Kevin E; Yang, Jingzhen

    2017-04-01

    A body of epidemiological studies has examined football injuries and associated risk factors among collegiate athletes. However, few existing studies specifically analyzed injury risk in terms of initial or subsequent injuries. To determine athlete-exposures (AEs) and rates of initial and subsequent injury among collegiate football athletes. Descriptive epidemiological study. Injury and exposure data collected from collegiate football players from two Division I universities (2007-2011) were analyzed. Rate of initial injury was calculated as the number of initial injuries divided by the total number of AEs for initial injuries, while the rate for subsequent injury was calculated as the number of subsequent injuries divided by the total number of AEs for subsequent injury. Poisson regression was used to determine injury rate ratio (subsequent vs initial injury), with adjustment for other covariates. The total AEs during the study period were 67,564, resulting in an overall injury rate of 35.2 per 10,000 AEs. Rates for initial and subsequent injuries were 31.7 and 45.3 per 10,000 AEs, respectively, with a rate ratio (RR) of 1.4 for rate of subsequent injury vs rate of initial injury (95% CI, 1.1-1.9). Rate of injury appeared to increase with each successive injury. RR during games was 1.8 (95% CI, 1.1-3.0). The rate of subsequent injuries to the head, neck, and face was 10.9 per 10,000 AEs, nearly double the rate of initial injuries to the same sites (RR = 2.0; 95% CI, 1.1-3.5). For wide receivers, the rate of subsequent injuries was 2.2 times the rate of initial injuries (95% CI, 1.3-3.8), and for defensive linemen, the rate of subsequent injuries was 2.1 times the rate of initial injuries (95% CI, 1.1-3.9). The method used in this study allows for a more accurate determination of injury risk among football players who have already been injured at least once. Further research is warranted to better identify which specific factors contribute to this increased risk
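
    To make the rate arithmetic concrete, the short sketch below reproduces the overall-rate and rate-ratio calculations from athlete-exposures; the injury count is back-calculated from the reported rate and is therefore approximate rather than a figure quoted in the study.

```python
# Injury rates per 10,000 athlete-exposures (AEs) and the subsequent-vs-initial rate ratio.
def rate_per_10k(injuries: float, exposures: float) -> float:
    return injuries / exposures * 10_000

total_aes = 67_564
overall_rate = 35.2                                    # reported, per 10,000 AEs
approx_injuries = overall_rate * total_aes / 10_000    # ~238 injuries (back-calculated)

rate_initial, rate_subsequent = 31.7, 45.3             # reported, per 10,000 AEs
rate_ratio = rate_subsequent / rate_initial            # ~1.4, matching the reported RR

print(round(approx_injuries), round(rate_per_10k(approx_injuries, total_aes), 1), round(rate_ratio, 1))
```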

  7. Relativistic effects in the calibration of electrostatic electron analyzers. I. Toroidal analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Keski Rahkonen, O [Helsinki University of Technology, Espoo (Finland). Laboratory of Physics; Krause, M O [Oak Ridge National Lab., Tenn. (USA)

    1978-02-01

    Relativistic correction terms up to the second order are derived for the kinetic energy of an electron travelling along the circular central trajectory of a toroidal analyzer. Furthermore, a practical energy calibration equation of the spherical sector plate analyzer is written for the variable-plate-voltage recording mode. Accurate measurements with a spherical analyzer performed using kinetic energies from 600 to 2100 eV are in good agreement with this theory, showing that our approximation (neglect of fringing fields, and of source and detector geometry) is realistic enough for actual calibration purposes.

  8. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, some CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD-converters

  9. Employability and personal initiative as antecedents of job satisfaction.

    Science.gov (United States)

    Gamboa, Juan Pablo; Gracia, Francisco; Ripoll, Pilar; Peiró, José María

    2009-11-01

    In a changing and flexible labour market it is important to clarify the role of environmental and personal variables that contribute to obtaining adequate levels of job satisfaction. The aim of the present study is to analyze the direct effects of employability and personal initiative on intrinsic, extrinsic and social job satisfaction, clarifying their cumulative and interactive effects. The study has been carried out in a sample of 1319 young Spanish workers. Hypotheses were tested by means of the moderated hierarchical regression analysis. Results show that employability and personal initiative predict in a cumulative way the intrinsic, extrinsic and social job satisfaction. Moreover, the interaction between employability and personal initiative increases the prediction of these two variables on intrinsic and extrinsic job satisfaction. Results also indicate that higher values of employability when initiative is also high are associated to higher levels of intrinsic and extrinsic satisfaction. These results have implications for theory and practice in a context of new employment relations.

  10. Selection of initial events of accelerator driven subcritical system

    International Nuclear Information System (INIS)

    Wang Qianglong; Hu Liqin; Wang Jiaqun; Li Yazhou; Yang Zhiyi

    2013-01-01

    The Probabilistic Safety Assessment (PSA) is an important tool in reactor safety analysis and a significant reference for the design and operation of a reactor. The selection of initial events is the origin and foundation of the PSA for a reactor. The Accelerator Driven Subcritical System (ADS) has advanced design characteristics, complicated subsystems and little engineering and operating experience, which makes it much more difficult to identify the initial events of the ADS. Based on the current design project of the ADS, the system's safety characteristics and special issues were analyzed in this article. After a series of deductions with a Master Logic Diagram (MLD), and considering the related experience of other advanced research reactors, a preliminary list of initial events was finally compiled, which provides the foundation for the subsequent safety assessment. (authors)

  11. Nurse-Initiated Telephone Follow Up after Ureteroscopic Stone Surgery.

    Science.gov (United States)

    Tackitt, Helen M; Eaton, Samuel H; Lentz, Aaron C

    2016-01-01

    This article presents findings of a quality improvement (QI) project using the DMAIC (define, measure, analyze, improve, and control) model designed to decrease the rate of emergency department (ED) visits and nurse advice line calls after ureteroscopic stone surgery. Results indicated that nurse-initiated follow-up phone calls can decrease ED visits.

  12. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...

  13. Retrieval Interference in Syntactic Processing: The Case of Reflexive Binding in English.

    Science.gov (United States)

    Patil, Umesh; Vasishth, Shravan; Lewis, Richard L

    2016-01-01

    It has been proposed that in online sentence comprehension the dependency between a reflexive pronoun such as himself/herself and its antecedent is resolved using exclusively syntactic constraints. Under this strictly syntactic search account, Principle A of the binding theory-which requires that the antecedent c-command the reflexive within the same clause that the reflexive occurs in-constrains the parser's search for an antecedent. The parser thus ignores candidate antecedents that might match agreement features of the reflexive (e.g., gender) but are ineligible as potential antecedents because they are in structurally illicit positions. An alternative possibility accords no special status to structural constraints: in addition to using Principle A, the parser also uses non-structural cues such as gender to access the antecedent. According to cue-based retrieval theories of memory (e.g., Lewis and Vasishth, 2005), the use of non-structural cues should result in increased retrieval times and occasional errors when candidates partially match the cues, even if the candidates are in structurally illicit positions. In this paper, we first show how the retrieval processes that underlie the reflexive binding are naturally realized in the Lewis and Vasishth (2005) model. We present the predictions of the model under the assumption that both structural and non-structural cues are used during retrieval, and provide a critical analysis of previous empirical studies that failed to find evidence for the use of non-structural cues, suggesting that these failures may be Type II errors. We use this analysis and the results of further modeling to motivate a new empirical design that we use in an eye tracking study. The results of this study confirm the key predictions of the model concerning the use of non-structural cues, and are inconsistent with the strictly syntactic search account. These results present a challenge for theories advocating the infallibility of the human

  14. Hawaii Energy and Environmental Technologies (HEET) Initiative

    Science.gov (United States)

    2011-12-01

    ... polymer electrolyte fuel cell (PEMFC) performance. This work was performed to support the DOE manufacturing initiative for PEMFC production. The work ... performed by exposing the MEA cathode to 10 ppm SO2 in N2 at a certain potential and under typical PEMFC operating conditions for a certain time, then ... adsorbate by analyzing the electrochemical reduction and oxidation potential and charge. As for the in-situ SO2 adsorption experiments, a PEMFC under

  15. Teachers’ perceptions of their own initiative: Collective initiative vs. personal initiative

    Directory of Open Access Journals (Sweden)

    Džinović Vladimir

    2013-01-01

    Full Text Available Current trends in education demand that teachers exhibit proactive behaviour and assume responsibility for the implementation of changes in school practice. In that sense, it is important to study how teachers perceive their own initiative and to gain insight into the activities where such initiative is demonstrated. This study was conceived as mixed-methods research. The qualitative study implied forming four focus groups with subject teachers and class teachers (N=38), while the quantitative study entailed surveying 1441 teachers in forty primary schools in Serbia using a questionnaire constructed on the basis of the qualitative data. Data from the focus groups were processed by qualitative thematic analysis, while the questionnaire data were processed by principal component analysis and univariate analysis of variance. The findings of the study show that teachers mostly demonstrate initiative through cooperative activities that include planning of joint teaching as well as conducting joint projects within the school and with actors in the local community. Teachers are least ready to demonstrate personal initiative and initiative aimed at accomplishing considerable changes in school work. The concluding part includes recommendations for encouraging teachers' personal initiative and building an organizational culture that would support such initiative. [Projects of the Ministry of Science of the Republic of Serbia, No. 47008: Improving the Quality and Accessibility of Education in the Modernization Processes of Serbia, and No. 179034: From Encouraging Initiative, Cooperation and Creativity in Education to New Roles and Identities in Society]

  16. Building Software Tools for Combat Modeling and Analysis

    National Research Council Canada - National Science Library

    Yuanxin, Chen

    2004-01-01

    ... (Meta-Language for Combat Simulations) and its associated parser and C++ code generator were designed to reduce the amount of time and developmental efforts needed to build sophisticated real world combat simulations. A C++...

  17. SGML/XML to HTML Conversion System and Method for Frame-Based Viewer

    National Research Council Canada - National Science Library

    Lepore, Marcus A

    2008-01-01

    ..., such as with Java or JavaScript. A parser allows for multi-layered HTML documents composed of dynamically sizing framesets which combine transformed SGML and external components, such as graphics and multimedia, with executable mini...

  18. Initiation preference at a yeast origin of replication.

    Science.gov (United States)

    Brewer, B J; Fangman, W L

    1994-04-12

    Replication origins in the yeast Saccharomyces cerevisiae are identified as autonomous replication sequence (ARS) elements. To examine the effect of origin density on replication initiation, we have analyzed the replication of a plasmid that contains two copies of the same origin, ARS1. The activation of origins and the direction that replication forks move through flanking sequences can be physically determined by analyzing replication intermediates on two-dimensional agarose gels. We find that only one of the two identical ARSs on the plasmid initiates replication on any given plasmid molecule; that is, this close spacing of ARSs results in an apparent interference between the potential origins. Moreover, in the particular plasmid that we constructed, one of the two identical copies of ARS1 is used four times more frequently than the other one. These results show that the plasmid context is critical for determining the preferred origin. This origin preference is also exhibited when the tandem copies of ARS1 are introduced into a yeast chromosome. The sequences responsible for establishing the origin preference have been identified by deletion analysis and are found to reside in a portion of the yeast URA3 gene.

  19. A Software for the Analysis of Scripted Dialogs Based on Surface Markers

    Directory of Open Access Journals (Sweden)

    Sylvain Delisle

    2003-04-01

    Full Text Available Most information systems that deal with natural language texts do not tolerate much deviation from their idealized and simplified model of language. Spoken dialog, however, is notoriously ungrammatical. Because the MAREDI project focuses in particular on the automatic analysis of scripted dialogs, we needed to develop a robust capacity to analyze transcribed spoken language. This paper presents the main elements of our approach, which is based on exploiting surface markers as the best route to the semantics of the conversation modelled. We highlight the foundations of our particular conversational model and give an overview of the MAREDI system. The latter consists of three key modules, which are (1) a connectionist network to recognise speech acts, (2) a robust syntactic parser, and (3) a semantic analyzer. These three modules are fully implemented in Prolog and C++ and have been packaged into an integrated software system.

  20. Effects of Initial Stance of Quadruped Trotting on Walking Stability

    Directory of Open Access Journals (Sweden)

    Peisun Ma

    2008-11-01

    Full Text Available It is very important for a quadruped walking machine to keep its stability in high-speed walking. It has been shown that the moment around the supporting diagonal line of a quadruped in trotting gait largely influences walking stability. In this paper, the moment around the supporting diagonal line of a quadruped in trotting gait is modeled and its effects on body attitude are analyzed. The degree of influence varies with different initial stances of the quadruped, and we obtain the optimal initial stance of the quadruped in trotting gait with maximal walking stability. Simulation results are presented.

  1. Studies on the ANN implementation in the macro BIM cost analyzes

    Directory of Open Access Journals (Sweden)

    Michał Juszczyk

    2017-06-01

    Full Text Available The paper presents an approach which combines the concept of macro-level BIM-based cost analyses with the application of artificial intelligence tools, namely artificial neural networks. The discussion and foundations of the proposed approach are introduced in the paper to clarify the core of the problem. An exemplary case study reports the results of initial studies on the application of neural networks for the purposes of BIM-based cost analysis of a building's floor structural frame. The results obtained justify the proposed application of neural networks as a supportive mathematical tool for the problem presented in the paper.
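
    As a loose illustration of the kind of supportive tool described above, the sketch below fits a small feed-forward network that maps macro-level BIM quantities to a cost estimate; the feature names, synthetic data, and network size are hypothetical and are not drawn from the case study.

```python
# Hypothetical sketch: regress floor structural frame cost on macro-level BIM quantities.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Assumed features: [floor area in m^2, concrete volume in m^3, reinforcement tonnage in t]
X = rng.uniform([200, 50, 5], [2000, 600, 60], size=(200, 3))
# Synthetic "cost" in thousands of currency units, with noise, only to make the example run.
y = 0.12 * X[:, 0] + 0.30 * X[:, 1] + 2.5 * X[:, 2] + rng.normal(0, 5, 200)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
model.fit(X, y)

new_floor = np.array([[850.0, 240.0, 22.0]])   # a new design whose frame cost we want to estimate
print(f"estimated cost (thousands): {model.predict(new_floor)[0]:.0f}")
```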

  2. Influence of the crystalline orientations on microcrack initiation in low-cycle fatigue

    Energy Technology Data Exchange (ETDEWEB)

    Mu, P. [Univ Lille Nord de France, F-59000 Lille (France); ECLille, LML, F-59650 Villeneuve d’Ascq (France); CNRS, UMR 8107, UMR 8579 (France); Aubin, V., E-mail: veronique.aubin@ecp.fr [ECP, MSSMat, F-92295 Châtenay-Malabry (France); CNRS, UMR 8107, UMR 8579 (France); Alvarez-Armas, I.; Armas, A. [IFIR, CONICET, Universidad Nacional de Rosario (Argentina)

    2013-06-20

    The present study aims at analyzing crack initiation in an austenitic stainless steel under low-cycle fatigue. A fatigue test was carried out using a polished specimen. The surface of the specimen was observed in situ during the fatigue test in order to establish the time of slip activity or crack initiation. After a number of cycles sufficient to initiate small cracks, the test was stopped and the surface was observed by scanning electron microscopy. The electron backscattered diffraction technique (EBSD) was used to identify the orientations of surface grains in the central zone of the fatigue specimen. Crack-initiation sites and the slip systems associated with the initiated microcracks were identified. The criterion of the maximum Schmid factor explains two-thirds of the cracks initiated in slip systems; however, if the slip band favorably oriented with respect to this criterion makes an angle of around 45° to the loading direction, a crack may initiate in another slip system.

  3. A Novel Flood Forecasting Method Based on Initial State Variable Correction

    Directory of Open Access Journals (Sweden)

    Kuang Li

    2017-12-01

    Full Text Available The influence of initial state variables on flood forecasting accuracy by using conceptual hydrological models is analyzed in this paper and a novel flood forecasting method based on correction of initial state variables is proposed. The new method is abbreviated as ISVC (Initial State Variable Correction. The ISVC takes the residual between the measured and forecasted flows during the initial period of the flood event as the objective function, and it uses a particle swarm optimization algorithm to correct the initial state variables, which are then used to drive the flood forecasting model. The historical flood events of 11 watersheds in south China are forecasted and verified, and important issues concerning the ISVC application are then discussed. The study results show that the ISVC is effective and applicable in flood forecasting tasks. It can significantly improve the flood forecasting accuracy in most cases.
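
    The core ISVC idea, tuning the initial state so that simulated flow matches observed flow over the early part of the event, can be sketched as below; the toy linear-reservoir model, the residual window, and the hand-rolled particle swarm routine are illustrative assumptions, not the paper's actual hydrological model or implementation.

```python
# Illustrative sketch of Initial State Variable Correction (ISVC): correct the initial
# storage S0 of a toy linear-reservoir model so that simulated flow matches "observed"
# flow during the first few time steps of the flood event.
import numpy as np

def simulate(s0, rainfall, k=0.3):
    """Toy conceptual model: linear reservoir with outflow = k * storage."""
    s, flows = s0, []
    for p in rainfall:
        s += p
        q = k * s
        s -= q
        flows.append(q)
    return np.array(flows)

def residual(s0, rainfall, observed, window=6):
    """Objective: squared error over the initial period of the event."""
    sim = simulate(s0, rainfall)[:window]
    return float(np.sum((sim - observed[:window]) ** 2))

def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Very small particle swarm optimizer over a single scalar decision variable."""
    rng = np.random.default_rng(1)
    lo, hi = bounds
    x = rng.uniform(lo, hi, n_particles)
    v = np.zeros(n_particles)
    pbest = x.copy()
    pbest_val = np.array([objective(xi) for xi in x])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(xi) for xi in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest

rainfall = np.array([0, 2, 5, 8, 4, 1, 0, 0, 0, 0], dtype=float)
observed = simulate(s0=12.0, rainfall=rainfall)        # stand-in for gauged flows
s0_corrected = pso(lambda s0: residual(s0, rainfall, observed), bounds=(0.0, 50.0))
print(round(float(s0_corrected), 2))                    # recovers ~12.0
```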

  4. Fulltext PDF

    Indian Academy of Sciences (India)

    of Punjabi Shallow Parser used for the processing of the input sentence, which per- ... by Jain & Damani (2009) for English to UNL conversion. ... system, traditional rules and dictionary based algorithms with statistical machine learning have.

  5. Breaking the Resource Bottleneck for Multilingual Parsing

    National Research Council Canada - National Science Library

    Hwa, Rebecca; Resnik, Philip; Weinberg, Amy

    2005-01-01

    ...-quality English resources. We present a large-scale experiment showing that Chinese dependency trees can be induced by using an English parser, a word alignment package, and a large corpus of sentence-aligned bilingual text...

  6. Gravitational radiation from stellar collapse: The initial burst

    International Nuclear Information System (INIS)

    Shapiro, S.L.

    1977-01-01

    The burst of gravitational radiation emitted during the initial collapse and rebound of a homogeneous, uniformly rotating spheroid with internal pressure is analyzed numerically. The surface of the collapsing spheroid is assumed to start at rest from infinity with negligible eccentricity (''zero-energy collapse''). The adopted internal pressure function is constant on self-similar spheroidal surfaces, and its central value is described by a polytropic law with index n ≤ 3. The Newtonian equations of motion are integrated numerically to follow the initial collapse and rebound of the configuration for the special case in which the collapse is time-reversal invariant about the moment of maximum compression, and the total energy and frequency spectrum of the emitted quadrupole radiation are computed. The results are employed to estimate the (approx. minimum) total energy and frequency distribution of the initial burst of gravitational radiation emitted during the formation of low-mass (M ≲ M_sun) neutron stars and during the collapse of supermassive gas clouds

  7. High-frequency acoustic spectrum analyzer based on polymer integrated optics

    Science.gov (United States)

    Yacoubian, Araz

    This dissertation presents an acoustic spectrum analyzer based on nonlinear polymer-integrated optics. The device is used in a scanning heterodyne geometry by zero biasing a Michelson interferometer. It is capable of detecting vibrations from DC to the GHz range. Initial low frequency experiments show that the device is an effective tool for analyzing an acoustic spectrum even in noisy environments. Three generations of integrated sensors are presented, starting with a very lossy (86 dB total insertion loss) initial device that detects vibrations as low as λ/10, and second and third generation improvements with a final device of 44 dB total insertion loss. The sensor was further tested for detecting a pulsed laser-excited vibration and resonances due to the structure of the sample. The data are compared to the acoustic spectrum measured using a low loss passive fiber interferometer detection scheme which utilizes a high speed detector. The peaks present in the passive detection scheme are clearly visible with our sensor data, which have a lower noise floor. Hybrid integration of GHz electronics is also investigated in this dissertation. A voltage controlled oscillator (VCO) is integrated on a polymer device using a new approach. The VCO is shown to operate as specified by the manufacturer, and the RF signal is efficiently launched onto the micro-strip line used for EO modulation. In the future this technology can be used in conjunction with the presented sensor to produce a fully integrated device containing high frequency drive electronics controlled by low DC voltage. Issues related to device fabrication, loss analysis, RF power delivery to drive circuitry, efficient poling of large area samples, and optimizing poling conditions are also discussed throughout the text.

  8. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming computer language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., ''sensors'' and ''cameras'') upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a ''search'' mode, wherein all paths are found from any specified physical location to another specified location which satisfy user chosen ''intruder detection'' probability and elapsed time criteria (i.e., the program finds the ''weakest paths'' from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straight-forward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs
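
    The ''weakest path'' search described above is, at bottom, a best-path problem over a graph of locations weighted by detection probability; the sketch below re-expresses that idea in Python rather than Prolog, with an invented facility layout, so the node names and probabilities are purely hypothetical.

```python
# Hypothetical sketch of a "weakest path" search: find the route through a facility that
# minimizes the overall probability of detection. Runs Dijkstra on edge weights of
# -log(1 - p_detect), so summed path costs correspond to multiplied non-detection odds.
import heapq
import math

# (from, to, probability that this segment's sensors detect passage) - invented data
edges = [
    ("fence", "yard", 0.30), ("yard", "door_a", 0.60), ("yard", "door_b", 0.20),
    ("door_a", "hallway", 0.50), ("door_b", "hallway", 0.70), ("hallway", "vault", 0.90),
]

graph = {}
for a, b, p in edges:
    graph.setdefault(a, []).append((b, p))

def weakest_path(start, goal):
    queue = [(0.0, start, [start])]   # (accumulated -log(non-detection), node, route)
    settled = {}
    while queue:
        cost, node, route = heapq.heappop(queue)
        if node == goal:
            return math.exp(-cost), route      # overall non-detection probability, route
        if settled.get(node, float("inf")) <= cost:
            continue
        settled[node] = cost
        for nxt, p_detect in graph.get(node, []):
            heapq.heappush(queue, (cost - math.log(1.0 - p_detect), nxt, route + [nxt]))
    return 0.0, []

p_undetected, route = weakest_path("fence", "vault")
print(route, f"P(undetected) = {p_undetected:.3f}")
```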

  9. Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.

    Science.gov (United States)

    Edwards, H. P.; And Others

    1982-01-01

    Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)

  10. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  11. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    Automation is discussed of extraction spectrophotometric determination of uranium in a solution. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design is described of an analyzer consisting of an analytical unit and a control unit. The analyzer performance promises increased productivity of labour, improved operating and hygiene conditions, and mainly more accurate results of analyses. (J.C.)

  12. Transnational Higher Education and Sustainable Development: Current Initiatives and Future Prospects

    Science.gov (United States)

    Koehn, Peter H.

    2012-01-01

    Tertiary educational institutions increasingly are relied upon for sustainable development initiatives. This policy research note analyzes newly available data regarding seven key dimensions of 295 transnational sustainable development projects involving US universities. Comparative regional analysis of the projects profiled in the APLU/AAU…

  13. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer, the second-order focus is obtained at an arbitrary injection angle. This kind of analyzer with a small injection angle will have the advantage of a small operational voltage, compared to the Proca and Green analyzer, where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for the precise energy measurement of high-energy particles in the MeV range. (author)

  14. China’s economic interests in the “One Belt, One Road” initiative

    Directory of Open Access Journals (Sweden)

    Silin Yakov

    2017-01-01

    Full Text Available The article examines the “One Belt – One Road” initiative of China aimed at the development of transport and logistics infrastructure on the trade route from China to Europe. The authors pay special attention to the history of the Silk Road, which serves as an ideological basis for the modern initiative. The scale of the new project allows the authors to expect that its impact on the international trade will be comparable with the contribution of the historical Silk Road to the development of the global economy as we know it. The authors analyze the prospects of the development and implementation of the initiative in terms of China’s economic interests. The most significant threats associated with the initiative are identified.

  15. Formative Assessment, Communication Skills and ICT in Initial Teacher Training

    Science.gov (United States)

    Romero-Martín, M. Rosario; Castejón-Oliva, Francisco-Javier; López-Pastor, Víctor-Manuel; Fraile-Aranda, Antonio

    2017-01-01

    The purpose of this study is to analyze the perception of students, graduates, and lecturers in relation to systems of formative and shared assessment and to the acquisition of teaching competences regarding communication and the use of Information and Communications Technology (ICT) in initial teacher education (ITE) on degrees in Primary…

  16. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  17. Light at Night Markup Language (LANML): XML Technology for Light at Night Monitoring Data

    Science.gov (United States)

    Craine, B. L.; Craine, E. R.; Craine, E. M.; Crawford, D. L.

    2013-05-01

    Light at Night Markup Language (LANML) is a standard, based upon XML, useful in acquiring, validating, transporting, archiving and analyzing multi-dimensional light at night (LAN) datasets of any size. The LANML standard can accommodate a variety of measurement scenarios including single spot measures, static time-series, web based monitoring networks, mobile measurements, and airborne measurements. LANML is human-readable, machine-readable, and does not require a dedicated parser. In addition LANML is flexible; ensuring future extensions of the format will remain backward compatible with analysis software. The XML technology is at the heart of communicating over the internet and can be equally useful at the desktop level, making this standard particularly attractive for web based applications, educational outreach and efficient collaboration between research groups.
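
    Because LANML is XML-based and, as noted above, does not require a dedicated parser, a general-purpose XML library is enough to read a record; the element and attribute names in the snippet below are invented for illustration, since the published LANML schema is not reproduced here.

```python
# Illustrative only: parse a hypothetical LANML-style record using the standard library.
# Element and attribute names are assumptions, not the published LANML schema.
import xml.etree.ElementTree as ET

sample = """
<lanml version="1.0">
  <site id="tucson-01" lat="32.2226" lon="-110.9747"/>
  <measurement utc="2013-05-01T04:30:00Z" sky_brightness="21.35" unit="mag/arcsec2"/>
  <measurement utc="2013-05-01T04:45:00Z" sky_brightness="21.32" unit="mag/arcsec2"/>
</lanml>
"""

root = ET.fromstring(sample)
site = root.find("site")
print("site:", site.get("id"), site.get("lat"), site.get("lon"))
for m in root.findall("measurement"):
    print(m.get("utc"), float(m.get("sky_brightness")), m.get("unit"))
```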

  18. Initial results with the Berkeley on-line mass separator-RAMA

    International Nuclear Information System (INIS)

    Cerny, J.; Moltz, D.M.; Evans, H.C.; Vieira, D.J.; Parry, R.F.; Wouters, J.M.; Gough, R.A.; Zisman, M.S.

    1977-11-01

    Initial performance is described for a reasonably fast and universal (having little or no chemical selectivity) on-line mass analysis system used to expand capabilities in studying nuclei far from stability. The system is termed RAMA, an acronym for Recoil Atom Mass Analyzer. Basically, this system utilizes the helium-jet method to transport activity to a Sidenius hollow-cathode ion source which is coupled to a mass spectrometer. Initial experiments and planned improvements are discussed. Transport efficiencies of between 10 and 60 percent have routinely been achieved, though the latter is much more typical when conditions are optimized

  19. The Double Star Orbit Initial Value Problem

    Science.gov (United States)

    Hensley, Hagan

    2018-04-01

    Many precise algorithms exist to find a best-fit orbital solution for a double star system given a good enough initial value. Desmos is an online graphing calculator tool with extensive capabilities to support animations and defining functions. It can provide a useful visual means of analyzing double star data to arrive at a best guess approximation of the orbital solution. This is a necessary requirement before using a gradient-descent algorithm to find the best-fit orbital solution for a binary system.

  20. Subsurface crack initiation and propagation mechanisms in gigacycle fatigue

    International Nuclear Information System (INIS)

    Huang Zhiyong; Wagner, Daniele; Bathias, Claude; Paris, Paul C.

    2010-01-01

    In the very high cycle regime (N_f > 10^7 cycles) cracks can nucleate on inclusions, 'supergrains' and pores, which leads to fish-eye propagation around the defect. The initiation stage at an inclusion or other defect accounts for almost the entire crack growth lifetime, perhaps much more than 99% of this lifetime in many cases. Integration of the Paris law allows one to predict the number of cycles to crack initiation. A cyclic plastic zone exists around the crack, and recording the surface temperature of the sample during the test may allow one to follow crack propagation and determine the number of cycles to crack initiation. A thermo-mechanical model has been developed. In this study several fish-eyes from various materials have been observed by scanning electron microscopy, and the fractographic results analyzed as they relate to the mechanical and thermo-mechanical models.
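
    The statement that integrating the Paris law predicts the cycle count can be made explicit; the form below is the generic textbook relation (crack length a, stress range Δσ, geometry factor Y, material constants C and m), not the specific fit used by the authors.

    $$ \frac{da}{dN} = C\,(\Delta K)^{m}, \qquad \Delta K = Y\,\Delta\sigma\,\sqrt{\pi a} $$

    $$ N = \int_{a_i}^{a_f} \frac{da}{C\,\bigl(Y\,\Delta\sigma\,\sqrt{\pi a}\,\bigr)^{m}} = \frac{a_i^{\,1-m/2} - a_f^{\,1-m/2}}{C\,\bigl(Y\,\Delta\sigma\,\sqrt{\pi}\,\bigr)^{m}\,\bigl(m/2 - 1\bigr)} \qquad (m \neq 2) $$

    With the initial crack size a_i set by the defect that nucleates the fish-eye and a_f the final crack size, the integral gives the number of growth cycles between those two sizes.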

  1. Testing the initial-final mass relationship of white dwarfs

    International Nuclear Information System (INIS)

    Catalan, S; Isern, J; Garcia-Berro, E; Ribas, I

    2009-01-01

    In this contribution we revisit the initial-final mass relationship of white dwarfs, which links the mass of a white dwarf with that of its progenitor in the main-sequence. Although this function is of paramount importance to several fields in modern astrophysics, it is still not well constrained either from the theoretical or the observational points of view. We present here a revision of the present semi-empirical initial-final mass relationship using all the available data and including our recent results obtained from studying white dwarfs in common proper motion pairs. We have also analyzed the results obtained so far to provide some clues on the dependence of this relationship on metallicity. Finally, we have also performed an indirect test of the initial-final mass relationship by studying its effect on the luminosity function and on the mass distribution of white dwarfs.

  2. Assessment of engineering plant analyzer with Peach Bottom 2 stability tests

    International Nuclear Information System (INIS)

    Rohatgi, U.S.; Mallen, A.N.; Cheng, H.S.; Wulff, W.

    1992-01-01

    The Engineering Plant Analyzer (EPA) has been developed to simulate plant transients for Boiling Water Reactors (BWRs). Recently, this code has been used to simulate the LaSalle-2 instability event, which was initiated by a failure in the feedwater heater. The simulation was performed for the scram conditions and for a postulated failure of the scram. In order to assess the capability of the EPA to simulate oscillatory flows as observed in the LaSalle event, the EPA has been benchmarked against the available data from the Peach Bottom 2 (PB2) instability tests PT1, PT2, and PT4. This document provides a description of these tests

  3. Predictors of HbA1c levels in patients initiating metformin.

    Science.gov (United States)

    Martono, Doti P; Hak, Eelko; Lambers Heerspink, Hiddo; Wilffert, Bob; Denig, Petra

    2016-12-01

    The aim was to assess demographic and clinical factors as predictors of short (6 months) and long term (18 months) HbA1c levels in diabetes patients initiating metformin treatment. We conducted a cohort study including type 2 diabetes patients who received their first metformin prescription between 2007 and 2013 in the Groningen Initiative to Analyze Type 2 Diabetes Treatment (GIANTT) database. The primary outcome was HbA1c level at follow-up adjusted for baseline HbA1c; the secondary outcome was failing to achieve the target HbA1c level of 53 mmol/mol. Associations were analyzed by linear and logistic regression. Multiple imputation was used for missing data. Additional analyses stratified by dose and adherence level were conducted. The cohort included 6050 patients initiating metformin. Baseline HbA1c at target consistently predicted better HbA1c outcomes. Longer diabetes duration and lower total cholesterol level at baseline were predictors for higher HbA1c levels at 6 months. At 18 months, cholesterol level was not a predictor. Longer diabetes duration was also associated with not achieving the target HbA1c at follow-up. The association for longer diabetes duration was especially seen in patients starting on low dose treatment. No consistent associations were found for comorbidity and comedication. Diabetes duration was a relevant predictor of HbA1c levels after 6 and 18 months of follow-up in patients initiating metformin treatment. Given the study design, no causal inference can be made. Our study suggests that prompt treatment intensification may be needed in patients who have a longer diabetes duration at treatment initiation.

  4. A study of the transferability of influenza case detection systems between two large healthcare systems.

    Science.gov (United States)

    Ye, Ye; Wagner, Michael M; Cooper, Gregory F; Ferraro, Jeffrey P; Su, Howard; Gesteland, Per H; Haug, Peter J; Millett, Nicholas E; Aronis, John M; Nowalk, Andrew J; Ruiz, Victor M; López Pineda, Arturo; Shi, Lingyun; Van Bree, Rudy; Ginter, Thomas; Tsui, Fuchiang

    2017-01-01

    This study evaluates the accuracy and transferability of Bayesian case detection systems (BCD) that use clinical notes from the emergency department (ED) to detect influenza cases. A BCD uses natural language processing (NLP) to infer the presence or absence of clinical findings from ED notes, which are fed into a Bayesian network classifier (BN) to infer patients' diagnoses. We developed BCDs at the University of Pittsburgh Medical Center (BCDUPMC) and Intermountain Healthcare in Utah (BCDIH). At each site, we manually built a rule-based NLP parser and trained a Bayesian network classifier from over 40,000 ED encounters between January 2008 and May 2010 using feature selection, machine learning, and an expert debiasing approach. Transferability of a BCD in this study may be impacted by seven factors: development (source) institution, development parser, application (target) institution, application parser, NLP transfer, BN transfer, and classification task. We employed an ANOVA analysis to study their impacts on BCD performance. Both BCDs discriminated well between influenza and non-influenza on local test cases (AUCs > 0.92). When tested for transferability using the other institution's cases, BCDUPMC discrimination declined only minimally (AUC decreased from 0.95 to 0.94). Overall, both systems demonstrated good detection performance in two large healthcare systems in two geographically separated regions, providing evidentiary support for the use of automated case detection from routinely collected electronic clinical notes in national influenza surveillance. Transferability could be improved by training the Bayesian network classifier locally and increasing the accuracy of the NLP parser.
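    To make the detection pipeline concrete, the sketch below feeds binary findings (of the kind an NLP parser would extract from ED notes) into a probabilistic classifier; a Bernoulli naive Bayes model stands in for the study's full Bayesian network, and the finding names and training rows are invented.

```python
# Minimal sketch: binary clinical findings (of the kind an NLP parser extracts from
# ED notes) feeding a probabilistic classifier. A Bernoulli naive Bayes model is a
# stand-in for the study's Bayesian network; findings and encounters are invented.
import numpy as np
from sklearn.naive_bayes import BernoulliNB

FINDINGS = ["cough", "fever", "myalgia", "sore_throat"]  # hypothetical finding set

# Each row: presence/absence of findings for one ED encounter; label 1 = influenza.
X_train = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
])
y_train = np.array([1, 1, 0, 0, 1, 0])

clf = BernoulliNB().fit(X_train, y_train)

# A new encounter whose findings were inferred from the free-text note.
new_encounter = np.array([[1, 1, 0, 0]])
print(dict(zip(clf.classes_, clf.predict_proba(new_encounter)[0])))
```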

  5. FLIP for FLAG model visualization

    Energy Technology Data Exchange (ETDEWEB)

    Wooten, Hasani Omar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    A graphical user interface has been developed for FLAG users. FLIP (FLAG Input deck Parser) provides users with an organized view of FLAG models and a means for efficiently and easily navigating and editing nodes, parameters, and variables.

  6. Subgingival temperature and microbiota in initial periodontitis.

    Science.gov (United States)

    Maiden, M F; Tanner, A C; Macuch, P J; Murray, L; Kent, R L

    1998-10-01

    The association between subgingival temperature, other clinical characteristics, and the subgingival microbiota was examined in adult subjects with initial periodontitis and differing levels of gingival inflammation. Forty-three subjects were measured at 6 sites per tooth for pocket depth, attachment level, presence of plaque, gingival redness, bleeding on probing and subgingival temperature at 3-month intervals for 1 year. Subgingival plaque was sampled from 15 initial active periodontitis sites (10 subjects), 121 gingivitis sites (20 subjects) and 202 healthy sites (13 subjects), and included the 5 hottest and 5 coldest sites in each subject. Plaque samples were analyzed for 13 subgingival species using whole-genomic DNA probes. The major influences on the subgingival microbiota were the clinical status of sites, pocket depth, and the presence of supragingival plaque. No significant association between species and site temperature was observed. Initial active sites were associated with Bacteroides forsythus and Campylobacter rectus, and had a higher mean subgingival temperature and deeper mean pocket depth than inactive sites. A weak association between pocket depth and site temperature was noted. The major influence on the subgingival temperature of sites was the anterior-to-posterior anatomical temperature gradient in the mandible and maxilla.

  7. Reforming Higher Education in "Transition": Between National and International Reform Initiatives--The Case of Slovenia

    Science.gov (United States)

    Zgaga, Pavel; Miklavic, Klemen

    2011-01-01

    The article analyzes the last two decades of higher education reforms in Slovenia. During the "period of transition," they were led by national as well as international initiatives. At an early stage, the national initiatives were mainly based on criticisms of the last reform made by the former regime, although the generation of new…

  8. Microgamma Scan System for analyzing radial isotopic profiles of irradiated transmutation fuels

    International Nuclear Information System (INIS)

    Hilton, Bruce A.; McGrath, Christopher A.

    2008-01-01

    The U. S. Global Nuclear Energy Partnership / Advanced Fuel Cycle Initiative (GNEP/AFCI) is developing metallic transmutation alloys as a fuel form to transmute the long-lived transuranic actinide isotopes contained in spent nuclear fuel into shorter-lived fission products. A micro-gamma scan system is being developed to analyze the radial distribution of fission products, such as Cs-137, Cs-134, Ru-106, and Zr-95, in irradiated fuel cross-sections. The micro-gamma scan system consists of a precision linear stage with integrated sample holder and a tungsten alloy collimator, which interfaces with the Idaho National Laboratory (INL) Analytical Laboratory Hot Cell (ALHC) Gamma Scan System high purity germanium detector, multichannel analyzer, and removable collimators. A simplified model of the micro-gamma scan system was developed in MCNP (Monte-Carlo N-Particle Transport Code) and used to investigate the system performance and to interpret data from the scoping studies. Preliminary measurements of the micro-gamma scan system are discussed. (authors)

  9. Multiple modes of action potential initiation and propagation in mitral cell primary dendrite

    DEFF Research Database (Denmark)

    Chen, Wei R; Shen, Gongyu Y; Shepherd, Gordon M

    2002-01-01

    Recordings were combined with computational modeling to analyze action-potential initiation and propagation in the primary dendrite. In response to depolarizing current injection or distal olfactory nerve input, fast Na(+) action potentials were recorded along the entire length of the primary dendritic trunk. With weak-to-moderate olfactory nerve input, an action potential was initiated near the soma and then back-propagated into the primary dendrite. As olfactory nerve input increased, the initiation site suddenly shifted to the distal primary dendrite. Multi-compartmental modeling indicated that this abrupt shift of the spike-initiation site reflected an independent thresholding mechanism in the distal dendrite. When strong olfactory nerve excitation was paired with strong inhibition to the mitral cell basal secondary dendrites, a small fast prepotential was recorded at the soma, which indicated that an action potential was initiated...

  10. A Mitigation Approach to Counter Initial Ranging Based DoS Attacks on IEEE 802.16-2009

    International Nuclear Information System (INIS)

    Saleem, Y.; Asif, K.H.; Ahmad, T.; Bashir, K.

    2013-01-01

    The growth in the number of wirelessly accessed devices in recent years needs no further evidence. Security is nowadays the main concern for researchers working on 802.16e. In the layered architecture, the security sub-layer resides above the physical layer and provides security at the link layer. This paper discusses the security threats that are present and still unsolved at the initial network entry stage. A mitigation approach to counter initial-ranging-based DoS attacks on IEEE 802.16-2009 is presented in this paper. Furthermore, the existing solutions to the initial ranging vulnerability are analyzed and their limitations are discussed. The proposed solution was checked against these limitations to ensure their absence. Moreover, the solution was implemented in OMNeT++ and the results were analyzed to confirm its practicality and efficiency. (author)

  11. Low-energy particle experiments-electron analyzer (LEPe) onboard the Arase spacecraft

    Science.gov (United States)

    Kazama, Yoichi; Wang, Bo-Jhou; Wang, Shiang-Yu; Ho, Paul T. P.; Tam, Sunny W. Y.; Chang, Tzu-Fang; Chiang, Chih-Yu; Asamura, Kazushi

    2017-12-01

    In this report, we describe the low-energy electron instrument LEPe (low-energy particle experiments-electron analyzer) onboard the Arase (ERG) spacecraft. The instrument measures a three-dimensional distribution function of electrons with energies of ˜ 19 eV-19 keV. Electrons in this energy range dominate in the inner magnetosphere, and measurement of such electrons is important in terms of understanding the magnetospheric dynamics and wave-particle interaction. The instrument employs a toroidal tophat electrostatic energy analyzer with a passive 6-mm aluminum shield. To minimize background radiation effects, the analyzer has a background channel, which monitors counts produced by background radiation. Background counts are then subtracted from measured counts. Electronic components are radiation tolerant, and 5-mm-thick shielding of the electronics housing ensures that the total dose is less than 100 kRad for the one-year nominal mission lifetime. The first in-space measurement test was done on February 12, 2017, showing that the instrument functions well. On February 27, the first all-instrument run test was done, and the LEPe instrument measured an energy dispersion event probably related to a substorm injection occurring immediately before the instrument turn-on. These initial results indicate that the instrument works fine in space, and the measurement performance is good for science purposes.

  12. A translator writing system for microcomputer high-level languages and assemblers

    Science.gov (United States)

    Collins, W. R.; Knight, J. C.; Noonan, R. E.

    1980-01-01

    In order to implement high-level languages whenever possible, a translator writing system of advanced design was developed. It is intended for routine production use by many programmers working on different projects. As well as a fairly conventional parser generator, it includes a system for the rapid generation of table-driven code generators. The parser generator was developed from a prototype version. The translator writing system includes various tools for the management of the source text of a compiler under construction. In addition, it supplies various default source code sections so that its output is always compilable and executable. The system thereby encourages iterative enhancement as a development methodology by ensuring an executable program from the earliest stages of a compiler development project. The translator writing system includes a PASCAL/48 compiler, three assemblers, and two compilers for a subset of HAL/S.

  13. Polystyrene/magnesium hydroxide nanocomposite particles prepared by surface-initiated in-situ polymerization

    International Nuclear Information System (INIS)

    Liu Hui; Yi Jianhong

    2009-01-01

    In order to avoid their agglomeration and incompatibility with the hydrophobic polystyrene substrate, magnesium hydroxide nanoparticles were encapsulated by surface-initiated in-situ polymerization of styrene. The process comprised two steps: electrostatic adsorption of the initiator and polymerization of the monomer on the surface of magnesium hydroxide. It was found that a high adsorption ratio in the electrostatic adsorption of the initiator could be attained only in the acidic region, and the adsorption was a typical physical process. Compared to traditional in-situ polymerization, a higher grafting ratio was obtained in surface-initiated in-situ polymerization, which can be attributed to weaker steric hindrance. Both Fourier transform infrared spectroscopy (FTIR) and transmission electron microscopy (TEM) indicated that polystyrene/magnesium hydroxide nanocomposite particles had been successfully prepared by surface-initiated in-situ polymerization. The resulting samples were also analyzed and characterized by means of contact angle testing, dispersibility evaluation and thermogravimetric analysis.

  14. A Morphological Parser For Afrikaans | de Stadler | Stellenbosch ...

    African Journals Online (AJOL)


  15. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of Nuclear Power Plant Analyzers in the USA is described. There are two different types of Analyzers under development in the USA: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermal-hydraulics are described. (author)

  16. Funding Initiatives | Women in Science | Initiatives | Indian Academy ...

    Indian Academy of Sciences (India)

    The Fellowship Scheme for Women Scientists for societal programmes is initiative of the ...

  17. Electrical Stimulation of Coleopteran Muscle for Initiating Flight.

    Science.gov (United States)

    Choo, Hao Yu; Li, Yao; Cao, Feng; Sato, Hirotaka

    2016-01-01

    Some researchers have long been interested in reconstructing natural insects into steerable robots or vehicles. However, until recently, these so-called cyborg insects, biobots, or living machines existed only in science fiction. Owing to recent advances in nano/micro manufacturing, data processing, and anatomical and physiological biology, we can now stimulate living insects to induce user-desired motor actions and behaviors. To improve the practicality and applicability of airborne cyborg insects, a reliable and controllable flight initiation protocol is required. This study demonstrates an electrical stimulation protocol that initiates flight in a beetle (Mecynorrhina torquata, Coleoptera). A reliable stimulation protocol was determined by analyzing a pair of dorsal longitudinal muscles (DLMs), flight muscles that oscillate the wings. DLM stimulation was achieved with a high success rate (> 90%), rapid response time (< 1.0 s), and small variation (< 0.33 s), indicating little habituation. Notably, the stimulation of DLMs caused no crucial damage to free flight ability. In contrast, stimulation of the optic lobes, which was earlier demonstrated as a successful flight initiation protocol, destabilized the beetle in flight. Thus, DLM stimulation is a promising, secure protocol for inducing flight in cyborg insects or biobots.

  18. Initial 12-h operative fluid volume is an independent risk factor for pleural effusion after hepatectomy.

    Science.gov (United States)

    Cheng, Xiang; Wu, Jia-Wei; Sun, Ping; Song, Zi-Fang; Zheng, Qi-Chang

    2016-12-01

    Pleural effusion after hepatectomy is associated with significant morbidity and prolonged hospital stays. Several studies have addressed the risk factors for postoperative pleural effusion. However, there is no research concerning the role of the initial 12-h operative fluid volume. The aim of this study was to evaluate whether the initial 12-h operative fluid volume during liver resection is an independent risk factor for pleural effusion after hepatectomy. In this study, we retrospectively analyzed clinical data of 470 patients consecutively undergoing elective hepatectomy between January 2011 and December 2012. We prospectively collected and retrospectively analyzed baseline and clinical data, including preoperative, intraoperative, and postoperative variables. Univariate and multivariate analyses were carried out to identify whether the initial 12-h operative fluid volume was an independent risk factor for pleural effusion after hepatectomy. The multivariate analysis identified 2 independent risk factors for pleural effusion: operative time [odds ratio (OR)=10.2] and initial 12-h operative fluid volume (OR=1.0003). Threshold effect analyses revealed that the initial 12-h operative fluid volume was positively correlated with the incidence of pleural effusion when it exceeded 4636 mL. We conclude that the initial 12-h operative fluid volume during liver resection and operative time are independent risk factors for pleural effusion after hepatectomy. Perioperative intravenous fluids should be restricted appropriately.

  19. Community Participation and Benefits in REDD+: A Review of Initial Outcomes and Lessons

    OpenAIRE

    David J. Ganz; Jill Blockhus; Kathleen Lawlor; Erin Myers Madeira

    2013-01-01

    The advent of initiatives to reduce emissions from deforestation and degradation and enhance forest carbon stocks (REDD+) in developing countries has raised much concern regarding impacts on local communities. To inform this debate, we analyze the initial outcomes of those REDD+ projects that systematically report on their socio-economic dimensions. To categorize and compare projects, we develop a participation and benefits framework that considers REDD+’s effects on local populations’ opport...

  20. Data Mining Approaches for Habitats and Stopovers Discovery of Migratory Birds

    Directory of Open Access Journals (Sweden)

    Qiang Xu

    2013-03-01

    Full Text Available This paper focuses on using data mining technology to efficiently and accurately discover habitats and stopovers of migratory birds. The three methods we used are as follows: 1. a density-based clustering method, detecting stopovers of birds during their migration through density-based clustering of location points; 2. a location histories parser method, detecting areas that have been overstayed by migratory birds during a set time period by setting time and distance thresholds; and 3. a time-parameterized line segment clustering method, clustering directed line segments to analyze shared segments of migratory pathways of different migratory birds and discover the habitats and stopovers of these birds. Finally, we analyzed the migration data of the bar-headed goose in the Qinghai Lake Area through the three above methods and verified the effectiveness of the three methods and, by comparison, identified the scope and context of the use of these three methods respectively.
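    The first of these methods lends itself to a short sketch: density-based clustering (here DBSCAN from scikit-learn) applied to GPS fixes to pick out stopovers. The coordinates and parameters below are invented; a real analysis would use projected coordinates or a haversine metric rather than raw degrees.

```python
# Illustrative sketch of the density-based clustering step described above: cluster
# GPS fixes so that dense groups of points become candidate stopovers. Coordinates,
# eps and min_samples are made-up example values.
import numpy as np
from sklearn.cluster import DBSCAN

# (longitude, latitude) fixes for one bar-headed goose track (hypothetical values)
fixes = np.array([
    [99.10, 36.90], [99.11, 36.91], [99.10, 36.92],    # dense cluster -> stopover
    [99.90, 37.50],                                     # isolated fix in transit
    [100.40, 38.10], [100.41, 38.11], [100.42, 38.10],  # second dense cluster
])

labels = DBSCAN(eps=0.05, min_samples=3).fit_predict(fixes)
for cluster_id in sorted(set(labels) - {-1}):           # -1 marks noise (transit fixes)
    centroid = fixes[labels == cluster_id].mean(axis=0)
    print(f"stopover {cluster_id}: centroid {centroid}")
```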

  1. Monitoring beryllium during site cleanup and closure using a real-time analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Schlager, R.J.; Sappey, A.D.; French, P.D. [ADA Technologies, Inc., Englewood, CO (United States)

    1998-12-31

    Beryllium metal has a number of unique properties that have been exploited for use in commercial and government applications. Airborne beryllium particles can represent a significant human health hazard if deposited in the lungs. These particles can cause immunologically-mediated chronic granulomatous lung disease (chronic beryllium disease). Traditional methods of monitoring airborne beryllium involve collecting samples of air within the work area using a filter. The filter then undergoes chemical analysis to determine the amount of beryllium collected during the sampling period. These methods are time-consuming and results are known only after a potential exposure has occurred. The need for monitoring exposures in real time has prompted government and commercial companies to develop instrumentation that will allow for the real time assessment of short-term exposures so that adequate protection for workers in contaminated environments can be provided. Such an analyzer provides a tool that will allow government and commercial sites to be cleaned up in a more safe and effective manner since exposure assessments can be made instantaneously. This paper describes the development and initial testing of an analyzer for monitoring airborne beryllium using a technique known as Laser-Induced Breakdown Spectroscopy (LIBS). Energy from a focused, pulsed laser is used to vaporize a sample and create an intense plasma. The light emitted from the plasma is analyzed to determine the quantity of beryllium in the sampled air. A commercial prototype analyzer has been fabricated and tested in a program conducted by Lawrence Livermore National Laboratory, Los Alamos National Laboratory, Lovelace Respiratory Research Institute, and ADA Technologies, Inc. Design features of the analyzer and preliminary test results are presented.

  2. Competing initiatives: a new tobacco industry strategy to oppose statewide clean indoor air ballot measures.

    Science.gov (United States)

    Tung, Gregory J; Hendlin, Yogi H; Glantz, Stanton A

    2009-03-01

    To describe how the tobacco and gaming industries opposed clean indoor air voter initiatives in 2006, we analyzed media records and government and other publicly available documents and conducted interviews with knowledgeable individuals. In an attempt to avoid strict "smoke free" regulations pursued by health groups via voter initiatives in Arizona, Ohio, and Nevada, in 2006, the tobacco and gaming industries sponsored competing voter initiatives for alternative laws. Health groups succeeded in defeating the pro-tobacco competing initiatives because they were able to dispel confusion and create a head-to-head competition by associating each campaign with its respective backer and instructing voters to vote "no" on the pro-tobacco initiative in addition to voting "yes" on the health group initiative.

  3. The Implementation of Social Responsiveness Initiatives: Case of Lithuania

    Directory of Open Access Journals (Sweden)

    Valentinas Navickas

    2015-03-01

    Full Text Available A concept of social responsibility reflects public concerns and issues for a specific time, and these change with time. Various stakeholders such as consumers, customers, employees, trade unions, communities, non-governmental organizations, foundations, donors and investors are more and more interested in the activities of companies (organizations) and influence them in a variety of ways. Companies, for their part, also look for ways to meet the expectations of the public in the area of social responsibility. Corporate social responsiveness is the ability of business to respond to social pressure. The article analyzes the implementation of social responsiveness initiatives as organizational programs. Social responsiveness is understood as the action dimension of corporate social responsibility. The paper deals with the implementation of social responsiveness initiatives in Lithuania. Having researched social responsiveness initiatives as organizational programs, the authors found that active development of corporate social responsiveness positively influences the relationship between business and society and contributes to the sustainable development of a region or country.

  4. Construction of spacetimes from initial data

    International Nuclear Information System (INIS)

    Isenberg, J.A.

    1979-01-01

    As relativistic effects become more accessible to physical experiment and observation, it becomes important to be able to theoretically analyze the behavior of relativistic model systems designed to incorporate such measurable effects. This dissertation describes in detail the initial value (IV) procedure for carrying out such analyses (i.e., for ''building spacetimes''). We report progress--of the author as well as others--in all of these areas: (1) The generalized Bergmann-Dirac (BD) procedure can be used to systematically translate any theory into 3+1 form. (2) The York procedure turns the constraints of Einstein's theory into a set of four elliptic equations for four unknowns (with the rest of the initial data ''relatively free''). (3) The maximal and K-foliation schemes appear to give preferred kinematics for the generic spacetimes one might build. We discuss the sense in which these foliations are preferred, and compare them with others. We then show how to find maximal and K-surfaces, both in a given spacetime (e.g. Schwarzschild) and in one being built from scratch. (4) Many physically interesting systems have symmetries which considerably simplify the equations. After discussing how, in general, one can build symmetries into initial data, and how one can use them to simplify the analysis, we look at a particular example symmetry: spacetimes with two space-like translation Killing Vectors. (''2T'')
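    For reference, the constraint equations that the York procedure turns into elliptic equations are, in standard 3+1 (ADM) notation and for the vacuum case, the following; this is textbook material quoted here for orientation rather than taken from the dissertation itself.

```latex
% Vacuum 3+1 (ADM) constraints on the initial slice:
% gamma_{ij} spatial metric, K_{ij} extrinsic curvature, K its trace,
% R spatial scalar curvature, D_j the spatial covariant derivative.
\begin{align}
  \mathcal{H} &\equiv R + K^{2} - K_{ij}K^{ij} = 0
      && \text{(Hamiltonian constraint)} \\
  \mathcal{M}^{i} &\equiv D_{j}\!\left(K^{ij} - \gamma^{ij} K\right) = 0
      && \text{(momentum constraints)}
\end{align}
```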

  5. Faster Scannerless GLR parsing

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); G.R. Economopoulos (Giorgos Robert); P. Klint (Paul)

    2008-01-01

    Analysis and renovation of large software portfolios requires syntax analysis of multiple, usually embedded, languages and this is beyond the capabilities of many standard parsing techniques. The traditional separation between lexer and parser falls short due to the limitations of

  6. Faster scannerless GLR parsing

    NARCIS (Netherlands)

    G.R. Economopoulos (Giorgos Robert); P. Klint (Paul); J.J. Vinju (Jurgen); O. de Moor; M.I. Schwartzbach

    2009-01-01

    Analysis and renovation of large software portfolios requires syntax analysis of multiple, usually embedded, languages and this is beyond the capabilities of many standard parsing techniques. The traditional separation between lexer and parser falls short due to the limitations of

  7. Comparison of Context-free Grammars Based on Parsing Generated Test Data

    NARCIS (Netherlands)

    B. Fischer (Bernd); R. Lämmel (Ralf); V. Zaytsev (Vadim); U. Aßmann; J. Saraiva; A.M. Sloane

    2011-01-01

    There exist a number of software engineering scenarios that essentially involve equivalence or correspondence assertions for some of the context-free grammars in the scenarios. For instance, when applying grammar transformations during parser development---be it for the sake of

  8. Syntactic discriminative language model rerankers for statistical machine translation

    NARCIS (Netherlands)

    Carter, S.; Monz, C.

    2011-01-01

    This article describes a method that successfully exploits syntactic features for n-best translation candidate reranking using perceptrons. We motivate the utility of syntax by demonstrating the superior performance of parsers over n-gram language models in differentiating between Statistical

  9. The development on the methodology of the initiating event frequencies for liquid metal reactor KALIMER

    International Nuclear Information System (INIS)

    Jeong, K. S.; Yang, Z. A.; Ah, Y. B.; Jang, W. P.; Jeong, H. Y.; Ha, K. S.; Han, D. H.

    2002-01-01

    In this paper, the PSA methodologies of PRISM, light water reactors, and pressurized heavy water reactors are analyzed, and a methodology for the initiating events of KALIMER is suggested. Also, the reliability assessment of the assumptions for the pipe corrosion frequency is set up. The reliability assessment of the passive safety system, one of the main safety systems of KALIMER, is discussed and analyzed.

  10. Initial data for N black holes

    International Nuclear Information System (INIS)

    Kulkarni, A.D.

    1984-01-01

    The N-body problem in general relativity is of enormous difficulty, especially in the nonlinear regime, where radiation is important. It is now possible to study this problem by treating it as a Cauchy problem and by using large-scale computers to develop numerical models. With this motivation, the Cauchy formulation of the Einstein equations is described. It consists of the Hamiltonian and momentum constraint equations, and the evolution equations. The constraints are analyzed using the conformal technique. The first step in this approach is to set up initial data compatible with the constraints. The N-body data, in general, will depend on the stress-energy tensor of the bodies. Hence this issue is bypassed by considering the matter-free representation of particles in terms of the geometries of certain non-Euclidean manifolds. Problems such as the dynamics of a binary system of black holes are more interesting. They require data representing holes with nonzero momenta. Hence the extrinsic curvature of the initial hypersurface cannot be taken to be zero.

  11. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  12. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input

  13. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, the resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open-source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  14. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-off between gaining significantly greater computational speed and central memory, and the loss of performance due to many more simultaneous users, is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full-scope nuclear power plant simulator.

  15. Oculomotor evidence for top-down control following the initial saccade.

    Directory of Open Access Journals (Sweden)

    Alisha Siebold

    Full Text Available The goal of the current study was to investigate how salience-driven and goal-driven processes unfold during visual search over multiple eye movements. Eye movements were recorded while observers searched for a target, which was located on (Experiment 1) or defined as (Experiment 2) a specific orientation singleton. This singleton could either be the most, medium, or least salient element in the display. Results were analyzed as a function of response time separately for initial and second eye movements. Irrespective of the search task, initial saccades elicited shortly after the onset of the search display were primarily salience-driven, whereas initial saccades elicited after approximately 250 ms were completely unaffected by salience. Initial saccades were increasingly guided in line with task requirements with increasing response times. Second saccades were completely unaffected by salience and were consistently goal-driven, irrespective of response time. These results suggest that stimulus salience affects the visual system only briefly after a visual image enters the brain and has no effect thereafter.

  16. Importance of proper initial treatment of moderate and major burns

    Directory of Open Access Journals (Sweden)

    Vulović Dejan

    2008-01-01

    Full Text Available Background/Aim. Burns are common injuries whose frequency depends on human factors, the development of protective measures, industry and traffic, and eventual wars. Organized treatment of major burn injuries has tremendous medical, social and economic importance. The aim of this study was to analyze the initial treatment of major and moderate burns, to compare it with the current recommendations, and to underline the importance of organized management of burns. Methods. In a prospective study, 547 adult patients with major burns were analyzed over a period of eight years, with the emphasis on initial hospital admission and emergency care for burns greater than 10% of total body surface area (TBSA). Results. In the different groups of major burns, the percentage of hospital admission was: 81.5 in burns greater than 10% TBSA, 37.7 in burns of the functional areas, 54.5 in III degree burns, 81.6 in electrical burns, 55.9 in chemical burns, 61.9 in inhalation injury, 41.0 in burns in patients with greater risk, and 100 in burns with concomitant trauma. In the group of 145 patients with burns greater than 10% TBSA, intravenous fluids were given in 87 patients, analgesics in 45, corticosteroids in 29, antibiotics in 23, and oxygen administration in 14. In the same group, wound irrigation was done in 14.4%, removal of clothing and shoes in 29.6%, elevation of the legs in 8.9%, and prevention of hypothermia in 7.6% of the victims. There were no initial estimations of burn extent (percentage of a burn), notes about the patient and injury, or tetanus immunizations. Conclusion. Based on these findings, it is concluded that there should be many more initial hospital admissions for major burns, and that the necessary steps in the emergency care of burns greater than 10% TBSA should be taken more frequently. On the other hand, unnecessary or wrong steps should be avoided in the initial burn treatment.

  17. Criteria for initiation of delamination in quasi-static punch-shear tests of a carbon-fiber composite material.

    Energy Technology Data Exchange (ETDEWEB)

    Chin, Eric Brian [Sandia National Lab. (SNL-CA), Livermore, CA (United States); English, Shawn Allen [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Briggs, Timothy [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-09-01

    Various phenomenological delamination initiation criteria are analyzed in quasi-static punch-shear tests conducted on six different geometries. These six geometries are modeled and analyzed using elastic, large-deformation finite element analysis. Analysis output is post-processed to assess different delamination initiation criteria, and their applicability to each of the geometries. These criteria are compared to test results to assess whether or not they are appropriate based on what occurred in testing. Further, examinations of CT scans and ultrasonic images of test specimens are conducted in the appendix to determine the sequence of failure in each test geometry.
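    The report does not name the specific criteria here, but the post-processing step it describes typically amounts to evaluating a stress-based initiation index at each integration point; the sketch below uses a commonly cited quadratic interlaminar stress criterion as one illustrative example, with invented stresses and allowables.

```python
# One commonly used phenomenological delamination-initiation check (a quadratic
# interlaminar stress criterion) evaluated on finite-element output. Illustrative
# only: the report does not state which criteria it compares, and the stresses and
# allowables below are invented.
import numpy as np

def quadratic_delamination_index(s33, t13, t23, Zt=60.0, S13=90.0, S23=90.0):
    """Return a failure index; initiation is predicted where the index >= 1.
    Only tensile through-thickness stress contributes (Macaulay bracket)."""
    s33_t = np.maximum(s33, 0.0)
    return (s33_t / Zt) ** 2 + (t13 / S13) ** 2 + (t23 / S23) ** 2

# Interlaminar stresses (MPa) at a few integration points from a punch-shear model
s33 = np.array([12.0, -8.0, 45.0, 70.0])
t13 = np.array([30.0, 55.0, 80.0, 20.0])
t23 = np.array([10.0, 40.0, 35.0, 15.0])

index = quadratic_delamination_index(s33, t13, t23)
print("failure index:", np.round(index, 2))
print("initiation predicted at points:", np.where(index >= 1.0)[0])
```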

  18. Stability of BUN and creatinine determinations on the Siemens Advia 1800 analyzer.

    Science.gov (United States)

    Qin, Jia; Wang, Huiying; Rets, Anton; Harari, Saul; Alexis, Herol; Eid, Ikram; Pincus, Matthew R

    2013-11-01

    Serum creatinine values of patients tend to change as a result of the use of different blanks for creatinine determinations on the Advia 1650. After upgrading the analyzer to the Advia 1800, creatinine values tended to be more reproducible. As part of a quality assurance investigation to test the reproducibility of creatinine values, we determined serial creatinine values in the sera of 13 patients whose initial values were either in the reference range or elevated (range 0.58-7.8 mg/dl). These values were determined concurrently with serum blood urea nitrogen (BUN) determinations (range 6.0-84.4 mg/dl), as these two analytes are used together in the evaluation of renal function. We determined BUN and creatinine values using the glutamate dehydrogenase-linked enzyme assay system and the Jaffe method, respectively. We find that all values for creatinine on samples stored at 4 °C were reproducible, as were the corresponding BUN values, which is revealed by low values for the coefficients of variation (CVs), that is, a mean CV of 4.55% for creatinine and 2.52% for BUN. One sample with a relatively high CV (10.6%) for creatinine was found to have an initial value of 1.1 mg/dl, in the reference range; but, on repeat determinations, the obtained levels were as high as 1.5 mg/dl, above the reference range. BUN values for this sample remained in the reference range, suggesting that no renal disease was present. We conclude that creatinine and BUN determinations are stable, but occasional spurious creatinine values can occur on the Advia 1800 analyzer. © 2013 Wiley Periodicals, Inc.

  19. Low-cycle fatigue of welded joints: coupled initiation propagation model

    International Nuclear Information System (INIS)

    Madi, Yazid; Recho, Naman; Matheron, Philippe

    2004-01-01

    This paper deals with the low-cycle fatigue (LC) design of welded structures, the aim being the critical analysis of the rule used in the RCC-MR [Design and construction rules for mechanical components of FBR nuclear islands, AFCEN, 1993], for the design and construction of fast breeder reactors. The study takes into account the evolution of the material behavior laws and damage accumulation during the fatigue loading. The adopted model consists of analyzing separately the behavior and the damage evolutions. It allows us to determine the damage ratio corresponding to initiation and propagation of a significant crack in order to determine the life duration. This model suggests the existence of a threshold level of loading, above which micro-cracks initiate. The initiation fatigue life can then be neglected below the threshold level. This work shows also that the RCC-MR rules are valid below this threshold load level

  20. Generic packet descriptions : Verified Parsing and Pretty Printing of Low-Level Data

    NARCIS (Netherlands)

    van Geest, Marcell; Swierstra, Wouter

    2017-01-01

    Complex protocols describing the communication or storage of binary data are difficult to describe precisely. This paper presents a collection of data types for describing binary data formats; the corresponding parser and pretty printer are generated automatically from a data description. By
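    A loose, dynamically typed analogue of this idea is sketched below: a single field description drives both a parser and a pretty printer so the two cannot drift apart. The header layout is hypothetical, and the sketch deliberately ignores the dependently typed guarantees that the paper itself is about.

```python
# Loose Python analogue of the idea above: derive both a parser and a pretty printer
# from one field description so they stay consistent. The header layout is invented.
import struct
from collections import OrderedDict

# One description drives both directions: (field name, struct format code)
HEADER_DESC = [("version", "B"), ("flags", "B"), ("length", "H"), ("checksum", "I")]
FMT = ">" + "".join(code for _, code in HEADER_DESC)  # big-endian

def parse(data: bytes) -> "OrderedDict[str, int]":
    values = struct.unpack(FMT, data[:struct.calcsize(FMT)])
    return OrderedDict(zip((name for name, _ in HEADER_DESC), values))

def pretty(fields: "OrderedDict[str, int]") -> bytes:
    return struct.pack(FMT, *(fields[name] for name, _ in HEADER_DESC))

packet = pretty(OrderedDict(version=1, flags=2, length=512, checksum=0xDEADBEEF))
assert parse(packet) == OrderedDict(version=1, flags=2, length=512, checksum=0xDEADBEEF)
print(parse(packet))
```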

  1. Does syntax help discourse segmentation? Not so much

    DEFF Research Database (Denmark)

    Braud, Chloé Elodie; Lacroix, Ophélie; Søgaard, Anders

    2017-01-01

    Discourse segmentation is the first step in building discourse parsers. Most work on discourse segmentation does not scale to real-world discourse parsing across languages, for two reasons: (i) models rely on constituent trees, and (ii) experiments have relied on gold standard identification...

  2. Faster scannerless GLR parsing

    NARCIS (Netherlands)

    Economopoulos, G.R.; Klint, P.; Vinju, J.J.; Moor, de O.; Schwartzbach, M.I.

    2009-01-01

    Analysis and renovation of large software portfolios requires syntax analysis of multiple, usually embedded, languages and this is beyond the capabilities of many standard parsing techniques. The traditional separation between lexer and parser falls short due to the limitations of tokenization based

  3. The Effect of Semantic Transparency on the Processing of Morphologically Derived Words: Evidence from Decision Latencies and Event-Related Potentials

    Science.gov (United States)

    Jared, Debra; Jouravlev, Olessia; Joanisse, Marc F.

    2017-01-01

    Decomposition theories of morphological processing in visual word recognition posit an early morpho-orthographic parser that is blind to semantic information, whereas parallel distributed processing (PDP) theories assume that the transparency of orthographic-semantic relationships influences processing from the beginning. To test these…

  4. The METAFRONT System

    DEFF Research Database (Denmark)

    Brabrand, Claus; Schwartzbach, Michael Ignatieff; Vangaard, Mads

    2003-01-01

    We present the metafront tool for specifying flexible, safe, and efficient syntactic transformations between languages defined by context-free grammars. The transformations are guaranteed to terminate and to map grammatically legal input to grammatically legal output. We rely on a novel parser al...

  5. The performance of the ATLAS initial detector layout for B-physics channels

    International Nuclear Information System (INIS)

    Epp, B.; Ghete, V.M.; Kuhn, D.; Zhang, Y.J.

    2004-01-01

    At the start-up of LHC one expects parts of the ATLAS detector to be missing. This layout is called initial layout, whereas the fully staged detector is called complete layout. B-physics channels were simulated, reconstructed and analyzed using the software tools of ATLAS data challenge-1 (DC1). The performance of the detector with respect to quantities relevant to the analysis of the B_s → D_s π channel and the validation of the full chain generation-simulation-reconstruction-analysis were evaluated for the initial and complete layout. (author)

  6. Spectral methods for a nonlinear initial value problem involving pseudo differential operators

    International Nuclear Information System (INIS)

    Pasciak, J.E.

    1982-01-01

    Spectral methods (Fourier methods) for approximating the solution of a nonlinear initial value problem involving pseudo differential operators are defined and analyzed. A semidiscrete approximation to the nonlinear equation based on an L^2 projection is described. The semidiscrete L^2 approximation is shown to be a priori stable and convergent under sufficient decay and smoothness assumptions on the initial data. It is shown that the semidiscrete method converges with infinite order, that is, higher order decay and smoothness assumptions imply higher order error bounds. Spectral schemes based on spatial collocation are also discussed
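    As a concrete (if much simpler) instance of such a Fourier semidiscretization, the sketch below discretizes the viscous Burgers equation on a periodic domain; the equation, grid size and viscosity are chosen only for illustration and are not taken from the paper.

```python
# Sketch of a Fourier (spectral) semidiscretization of a nonlinear initial value problem.
# The paper treats general pseudo differential operators; here the viscous Burgers
# equation u_t + u*u_x = nu*u_xx on a periodic domain stands in as a concrete example.
import numpy as np
from scipy.integrate import solve_ivp

N, nu = 128, 0.05
x = 2 * np.pi * np.arange(N) / N
k = np.fft.fftfreq(N, d=1.0 / N) * 1j            # i*k, the symbol of d/dx

def rhs(t, u):
    u_hat = np.fft.fft(u)
    u_x  = np.real(np.fft.ifft(k * u_hat))        # spectral first derivative
    u_xx = np.real(np.fft.ifft(k**2 * u_hat))     # spectral second derivative
    return -u * u_x + nu * u_xx

u0 = np.sin(x)                                    # smooth periodic initial data
sol = solve_ivp(rhs, (0.0, 1.0), u0, method="RK45", rtol=1e-6, atol=1e-9)
print("max |u| at t=1:", np.abs(sol.y[:, -1]).max())
```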

  7. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach to obtain a real-time, high-throughput and low-cost Multi Channel Analyzer. The performance of the developed system had a severe problem: the resulting spectrum suffered from a lack of smoothness; it was very noisy and full of spikes and surges, and it was therefore impossible to use this spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance.

  8. Final Scientific/Technical Report. A closed path methane and water vapor gas analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Liukang [LI-COR Inc., Lincoln, NE (United States); McDermitt, Dayle [LI-COR Inc., Lincoln, NE (United States); Anderson, Tyler [LI-COR Inc., Lincoln, NE (United States); Riensche, Brad [LI-COR Inc., Lincoln, NE (United States); Komissarov, Anatoly [LI-COR Inc., Lincoln, NE (United States); Howe, Julie [LI-COR Inc., Lincoln, NE (United States)

    2012-02-01

    Robust, economical, low-power and reliable closed-path methane (CH4), carbon dioxide (CO2), and water vapor (H2O) analyzers suitable for long-term measurements are not readily available commercially. Such analyzers are essential for quantifying the amount of CH4 and CO2 released from various ecosystems (wetlands, rice paddies, forests, etc.) and other surface contexts (e.g. landfills, animal husbandry lots, etc.), and for understanding the dynamics of the atmospheric CH4 and CO2 budget and their impact on climate change and global warming. The purpose of this project is to develop a closed-path methane, carbon dioxide gas and water vapor analyzer capable of long-term measurements in remote areas for global climate change and environmental research. The analyzer will be capable of being deployed over a wide range of ecosystems to understand methane and carbon dioxide exchange between the atmosphere and the surface. Measurements of methane and carbon dioxide exchange need to be made all year-round with limited maintenance requirements. During this Phase II effort, we successfully completed the design of the electronics, optical bench, trace gas detection method and mechanical infrastructure. We are using the technologies of two vertical cavity surface emitting lasers, a multiple-pass Herriott optical cell, wavelength modulation spectroscopy and direct absorption to measure methane, carbon dioxide, and water vapor. We also have designed the instrument application software, Field Programmable Gate Array (FPGA), along with partial completion of the embedded software. The optical bench has been tested in a lab setting with very good results. Major sources of optical noise have been identified and through design, the optical noise floor is approaching -60dB. Both laser modules can be temperature controlled to help maximize the stability of the analyzer. Additionally, a piezo electric transducer has been

  9. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a- chip includes materials to extract 3- methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on- a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  10. Analysis of SCC initiation/propagation behavior of stainless steels in LWR environments

    International Nuclear Information System (INIS)

    Saito, Koichi; Kuniya, Jiro

    1999-01-01

    This paper presents a method to analyze initiation and propagation behavior of stress corrosion cracking (SCC) of stainless steels on the basis of a new prediction algorithm in which the initiation period and propagation period of SCC under irradiation conditions are considered from a practical viewpoint. The prediction algorithm is based on three ideas: (1) threshold neutron fluence of radiation-enhanced SCC (RESCC), (2) equivalent critical crack depth, and (3) threshold stress intensity factor for SCC (K_ISCC). SCC initiation/propagation behavior in light water reactor (LWR) environments is analyzed by incorporating model equations on irradiation hardening, irradiation-enhanced electrochemical potentiokinetic reactivation (EPR) and irradiation stress relaxation, phenomena peculiar to neutron irradiation. The analytical method is applied to predict the crack growth behavior of a semi-elliptical surface crack in a flat plane that has an arbitrary residual stress profile; the specimens are sensitized type 304 stainless steels which had been subjected to neutron irradiation in high temperature water. SCC growth behavior of a semi-elliptical surface crack was greatly dependent on the distribution of residual stress in the flat plane. When the residual stress at the surface of the flat plane was relatively small, the method predicted that SCC propagation did not take place. (author)

  11. Initial teacher training. His musical impact on children in terms of refuge

    Directory of Open Access Journals (Sweden)

    Ana Rita Castañeda-de Liendo

    2017-04-01

    Full Text Available Initial teacher training has historically cultivated the preparation of teachers to carry out government projects promoted by the social groups in power throughout the historical development of Venezuela. This article analyzes initial teacher training in Venezuela, drawing on the historical-logical method to reveal what characterizes it, specifically under the working conditions of temporary shelters, and develops its axiological dynamics through musical training, which shapes the current and future behavior of those being trained. There is an insufficient number of documents that address the methodological work involved in preparing teachers of preschool-age children living in temporary shelters; this work contributes to the improvement of that situation.

  12. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits having inverse-function characteristics of the calibration functions which correspond to the radiation measuring subsystems. A weighting adder forms a desired linear combination of the outputs of the linearizing circuits. Operators for operating between two or more different linear combinations are included.

  13. Analysis of compaction initiation in human embryos by using time-lapse cinematography.

    Science.gov (United States)

    Iwata, Kyoko; Yumoto, Keitaro; Sugishima, Minako; Mizoguchi, Chizuru; Kai, Yoshiteru; Iba, Yumiko; Mio, Yasuyuki

    2014-04-01

    To analyze the initiation of compaction in human embryos in vitro by using time-lapse cinematography (TLC), with the goal of determining the precise timing of compaction and clarifying the morphological changes underlying the compaction process. One hundred and fifteen embryos donated by couples with no further need for embryo-transfer were used in this study. Donated embryos were thawed and processed, and then their morphological behavior during the initiation of compaction was dynamically observed via time-lapse cinematography (TLC) for 5 days. Although the initiation of compaction occurred throughout the period from the 4-cell to 16-cell stage, 99 (86.1 %) embryos initiated compaction at the 8-cell stage or later, with initiation at the 8-cell stage being most frequent (22.6 %). Of these 99 embryos, 49.5 % developed into good-quality blastocysts. In contrast, of the 16 (13.9 %) embryos that initiated compaction prior to the 8-cell stage, only 18.8 % developed into good-quality blastocysts. Embryos that initiated compaction before the 8-cell stage showed significantly higher numbers of multinucleated blastomeres, due to asynchronism in nuclear division at the third mitotic division resulting from cytokinetic failure. The initiation of compaction primarily occurs at the third mitotic division or later in human embryos. Embryos that initiate compaction before the 8-cell stage are usually associated with aberrant embryonic development (i.e., cytokinetic failure accompanied by karyokinesis).

  14. A high-precision instrument for analyzing nonlinear dynamic behavior of bearing cage

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Z., E-mail: zhaohui@nwpu.edu.cn; Yu, T. [School of Aeronautics, Northwestern Polytechnical University, Xi’an 710072 (China); Chen, H. [Xi’an Aerospace Propulsion Institute, Xi’an 710100 (China); Li, B. [State Key Laboratory for Manufacturing and Systems Engineering, Xi’an Jiaotong University, Xi’an 710054 (China)

    2016-08-15

    The high-precision ball bearing is fundamental to the performance of complex mechanical systems. As the speed increases, the cage behavior becomes a key factor in influencing the bearing performance, especially life and reliability. This paper develops a high-precision instrument for analyzing nonlinear dynamic behavior of the bearing cage. The trajectory of the rotational center and non-repetitive run-out (NRRO) of the cage are used to evaluate the instability of cage motion. This instrument applied an aerostatic spindle to support and spin test the bearing to decrease the influence of system error. Then, a high-speed camera is used to capture images when the bearing works at high speeds. A 3D trajectory tracking software TEMA Motion is used to track the spot which marked the cage surface. Finally, by developing the MATLAB program, a Lissajous’ figure was used to evaluate the nonlinear dynamic behavior of the cage with different speeds. The trajectory of rotational center and NRRO of the cage with various speeds are analyzed. The results can be used to predict the initial failure and optimize cage structural parameters. In addition, the repeatability precision of instrument is also validated. In the future, the motorized spindle will be applied to increase testing speed and image processing algorithms will be developed to analyze the trajectory of the cage.

  15. A high-precision instrument for analyzing nonlinear dynamic behavior of bearing cage

    International Nuclear Information System (INIS)

    Yang, Z.; Yu, T.; Chen, H.; Li, B.

    2016-01-01

    The high-precision ball bearing is fundamental to the performance of complex mechanical systems. As speed increases, cage behavior becomes a key factor influencing bearing performance, especially life and reliability. This paper develops a high-precision instrument for analyzing the nonlinear dynamic behavior of the bearing cage. The trajectory of the rotational center and the non-repetitive run-out (NRRO) of the cage are used to evaluate the instability of cage motion. The instrument uses an aerostatic spindle to support and spin the test bearing, reducing the influence of system error, and a high-speed camera to capture images while the bearing runs at high speed. The 3D trajectory-tracking software TEMA Motion tracks a spot marked on the cage surface. Finally, a MATLAB program generates Lissajous figures that are used to evaluate the nonlinear dynamic behavior of the cage at different speeds. The trajectory of the rotational center and the NRRO of the cage at various speeds are analyzed. The results can be used to predict initial failure and to optimize cage structural parameters. The repeatability of the instrument is also validated. In future work, a motorized spindle will be used to increase the testing speed, and image-processing algorithms will be developed to analyze the cage trajectory.

  16. Historical civilian nuclear accident based Nuclear Reactor Condition Analyzer

    Science.gov (United States)

    McCoy, Kaylyn Marie

    There are significant challenges to successfully monitoring multiple processes within a nuclear reactor facility, as evidenced by historical civilian nuclear incidents that shared similar initiating conditions and sequences of events. Because the nuclear industry currently lacks tools that monitor internal sensors across multiple processes for patterns of failure, this study developed a program directed at that gap by monitoring these systems simultaneously. The inclusion of digital sensor technology in the nuclear industry has appreciably increased the ability of computer systems to manipulate sensor signals, making it possible to meet these monitoring challenges. One such manipulation of signal data is explored in this study. The Nuclear Reactor Condition Analyzer (NRCA) program developed for this research, with the assistance of the Nuclear Regulatory Commission's Graduate Fellowship, uses one-norm distance and kernel-weighting equations to normalize all nuclear reactor parameters under analysis. This normalization allows the program to set more consistent parameter-value thresholds, simplifying the assessment of the condition of the reactor under scrutiny. The product of this research provides a means for the nuclear industry to implement a safety and monitoring program that can oversee the system parameters of a nuclear power reactor facility, such as a nuclear power plant.
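
    The abstract states that NRCA normalizes heterogeneous reactor parameters with one-norm distance and kernel-weighting equations so that consistent thresholds can be applied, but the exact equations are not given. The Python sketch below is a minimal reading of that idea under assumed reference values and scales; the parameter names, the Gaussian-like kernel, and the single alarm threshold are all assumptions for illustration, not the NRCA formulation.

        import numpy as np

        def normalized_deviation(values, reference, scale):
            """Map heterogeneous reactor parameters onto a common scale.

            values, reference, scale : 1-D arrays, one entry per monitored parameter.
            Returns a score in (0, 1] per parameter: 1 at the reference value, falling toward 0 far away.
            """
            l1 = np.abs(values - reference) / scale   # one-norm distance, per parameter, in units of its scale
            return np.exp(-l1)                        # kernel weighting (Gaussian-like decay)

        # Hypothetical snapshot: [core temperature, loop pressure, flow rate]
        ref   = np.array([300.0, 15.5, 1200.0])
        scale = np.array([ 10.0,  0.5,   50.0])
        now   = np.array([304.0, 15.4, 1110.0])
        score = normalized_deviation(now, ref, scale)
        alarm = score < 0.2                           # one consistent threshold across all parameters
        print(score, alarm)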

  17. Electrical Stimulation of Coleopteran Muscle for Initiating Flight.

    Directory of Open Access Journals (Sweden)

    Hao Yu Choo

    Full Text Available Some researchers have long been interested in reconstructing natural insects into steerable robots or vehicles. However, until recently, these so-called cyborg insects, biobots, or living machines existed only in science fiction. Owing to recent advances in nano/micro manufacturing, data processing, and anatomical and physiological biology, we can now stimulate living insects to induce user-desired motor actions and behaviors. To improve the practicality and applicability of airborne cyborg insects, a reliable and controllable flight initiation protocol is required. This study demonstrates an electrical stimulation protocol that initiates flight in a beetle (Mecynorrhina torquata, Coleoptera). A reliable stimulation protocol was determined by analyzing a pair of dorsal longitudinal muscles (DLMs), the flight muscles that oscillate the wings. DLM stimulation was achieved with a high success rate (>90%), rapid response time (<1.0 s), and small variation (<0.33 s), indicating little habituation. Notably, stimulation of the DLMs caused no crucial damage to free-flight ability. In contrast, stimulation of the optic lobes, which was earlier demonstrated as a successful flight initiation protocol, destabilized the beetle in flight. Thus, DLM stimulation is a promising, secure protocol for inducing flight in cyborg insects or biobots.

  18. Threshold amounts of organic carbon needed to initiate reductive dechlorination in groundwater systems

    Science.gov (United States)

    Chapelle, Francis H.; Thomas, Lashun K.; Bradley, Paul M.; Rectanus, Heather V.; Widdowson, Mark A.

    2012-01-01

    Aquifer sediment and groundwater chemistry data from 15 Department of Defense facilities located throughout the United States were collected and analyzed with the goal of estimating the amount of natural organic carbon needed to initiate reductive dechlorination in groundwater systems. Aquifer sediments were analyzed for hydroxylamine and NaOH-extractable organic carbon, yielding a probable underestimate of potentially bioavailable organic carbon (PBOC). Aquifer sediments were also analyzed for total organic carbon (TOC) using an elemental combustion analyzer, yielding a probable overestimate of bioavailable carbon. Concentrations of PBOC correlated linearly with TOC with a slope near one. However, concentrations of PBOC were consistently five to ten times lower than TOC. When mean concentrations of dissolved oxygen observed at each site were plotted versus PBOC, it showed that anoxic conditions were initiated at approximately 200 mg/kg of PBOC. Similarly, the accumulation of reductive dechlorination daughter products relative to parent compounds increased at a PBOC concentration of approximately 200 mg/kg. Concentrations of total hydrolysable amino acids (THAA) in sediments also increased at approximately 200 mg/kg, and bioassays showed that sediment CO2 production correlated positively with THAA. The results of this study provide an estimate for threshold amounts of bioavailable carbon present in aquifer sediments (approximately 200 mg/kg of PBOC; approximately 1,000 to 2,000 mg/kg of TOC) needed to support reductive dechlorination in groundwater systems.

  19. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in real time during the development of software-defined radios (SDRs). This test technique gains importance as modern receivers provide soft-decision symbol synchronization, as radio links are challenged to push more data and more protocol overhead through noisier channels, and as SDRs use error-correction codes that approach Shannon's theoretical limit of performance.
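
    The abstract's central claim is that soft-decision receiver outputs carry far more diagnostic information than the hard-decision BER alone. The Python sketch below illustrates that point on synthetic closed-loop data (known transmitted bits, noisy BPSK soft outputs): the hard-decision BER is a single number, while the conditional statistics of the soft outputs expose the effective decision distance and noise level. The signal model and the effective-SNR expression are assumptions for illustration, not the SDA's implementation.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic closed-loop setup: known transmitted bits, noisy BPSK soft outputs
        bits = rng.integers(0, 2, 100_000)
        tx = 2.0 * bits - 1.0
        soft = 0.9 * tx + 0.45 * rng.standard_normal(bits.size)   # gain + noise stand in for the channel

        # Traditional figure of merit: hard-decision bit error rate
        hard = (soft > 0).astype(int)
        ber = np.mean(hard != bits)

        # Soft-decision view: conditional statistics of the receiver output
        mu = soft[bits == 1].mean() - soft[bits == 0].mean()       # decision distance between the two symbols
        sigma = soft[bits == 1].std()                              # noise spread around one symbol
        esn0_db = 20 * np.log10(mu / (2 * sigma))                  # effective SNR implied by the soft outputs
        print(f"BER = {ber:.2e}, effective SNR ~ {esn0_db:.2f} dB")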

  20. Uncertainty in decision models analyzing cost-effectiveness : The joint distribution of incremental costs and effectiveness evaluated with a nonparametric bootstrap method

    NARCIS (Netherlands)

    Hunink, Maria; Bult, J.R.; De Vries, J; Weinstein, MC

    1998-01-01

    Purpose. To illustrate the use of a nonparametric bootstrap method in the evaluation of uncertainty in decision models analyzing cost-effectiveness. Methods. The authors reevaluated a previously published cost-effectiveness analysis that used a Markov model comparing initial percutaneous
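
    The method named in the abstract, a nonparametric bootstrap of the joint distribution of incremental costs and effectiveness, can be sketched in a few lines. The Python example below resamples hypothetical patient-level data for two strategies and returns bootstrap replicates on the cost-effectiveness plane; the data, strategy labels, and summary probability are invented for illustration and do not reproduce the Markov-model analysis the authors reevaluated.

        import numpy as np

        rng = np.random.default_rng(42)

        def bootstrap_ice(cost_a, eff_a, cost_b, eff_b, n_boot=5000):
            """Nonparametric bootstrap of incremental cost and effectiveness (strategy B vs A)."""
            pairs = []
            for _ in range(n_boot):
                ia = rng.integers(0, len(cost_a), len(cost_a))   # resample patients with replacement
                ib = rng.integers(0, len(cost_b), len(cost_b))
                d_cost = cost_b[ib].mean() - cost_a[ia].mean()
                d_eff = eff_b[ib].mean() - eff_a[ia].mean()
                pairs.append((d_cost, d_eff))
            return np.array(pairs)   # joint distribution on the cost-effectiveness plane

        # Hypothetical patient-level costs and effects for two strategies
        cost_a, eff_a = rng.gamma(2.0, 4000, 200), rng.normal(5.0, 1.0, 200)
        cost_b, eff_b = rng.gamma(2.0, 4500, 200), rng.normal(5.3, 1.0, 200)
        joint = bootstrap_ice(cost_a, eff_a, cost_b, eff_b)
        print("P(more effective and less costly):",
              np.mean((joint[:, 1] > 0) & (joint[:, 0] < 0)))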

  1. Investigation on the Crack Initiation of V-Shaped Notch Tip in Precision Cropping

    Directory of Open Access Journals (Sweden)

    Lijun Zhang

    2014-01-01

    Full Text Available The crack initiation at the V-shaped notch tip has a very important influence on the cross-section quality and the cropping time for every segment of metal bar in the course of low-stress precision cropping. Using the finite element method, the influence of the machining precision of the V-shaped notch bottom corner on the crack initiation location is analyzed, and it is shown that the crack initiation point is located where the equivalent stress change rate on the V-shaped notch surface is maximal. A criterion for judging the crack initiation direction is presented, and the corresponding crack initiation angle can be calculated by means of the displacement extrapolation method. The actual crack initiation angle of the metal bar was measured using a microscopic measurement system. A formula for the crack initiation life of the V-shaped notch tip is derived, which mainly includes the stress concentration factor of the V-shaped notch, the tensile properties of the metal material, and the cyclic loading conditions. The experimental results show that the theoretical analyses of the crack initiation location, direction, and time presented in this paper are correct. It is also shown that the crack initiation time accounts for about 80% of the cropping time for every segment of the metal bar.

  2. JAKEF, Gradient or Jacobian Function from Objective Function or Vector Function

    International Nuclear Information System (INIS)

    Hillstrom, K.E.

    1988-01-01

    1 - Description of program or function: JAKEF is a language processor that accepts as input a single- or double-precision ANSI standard 1977 FORTRAN subroutine defining an objective function f(x), or a vector function F(x), and produces as output a single- or double-precision ANSI standard 1977 FORTRAN subroutine defining the gradient of f(x), or the Jacobian of F(x). 2 - Method of solution: JAKEF is a four-pass compiler consisting of a lexical preprocessor, a parser, a tree-building and flow analysis pass, and a differentiator and output construction pass. The lexical preprocessor reworks the input FORTRAN program to give it a recognizable lexical structure. The parser transforms the pre-processed input into a string of tokens in a post-fix representation of the program tree. The tree-building and flow analysis pass constructs a tree out of the post-fix token string. The differentiator identifies relevant assignment statements; then, if necessary, it analyzes them into component statements governed by a single differentiation rule and augments each of these statements with a call to a member of the run-time support package which implements the differentiation rule. After completing the construction of the main body of the routine, JAKEF inserts calls to support package routines that complete the differentiation. This results in a modified program tree in a form compatible with FORTRAN rules. 3 - Restrictions on the complexity of the problem: Statement functions and EQUIVALENCE statements that involve the independent variables are not handled correctly. Variables, constants, or functions of type COMPLEX are not recognized. Character sub-string expressions and alternate returns are not permitted.
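
    JAKEF itself transforms FORTRAN 77 source, and its actual output is not reproduced here. As a language-agnostic illustration of the idea described above, decomposing each assignment into component statements governed by a single differentiation rule and propagating derivatives alongside values, the following Python sketch implements a tiny forward-mode analogue; the Dual class and the example function are assumptions for illustration only.

        import math

        class Dual:
            """A value paired with its gradient w.r.t. the independent variables,
            a forward-mode analogue of attaching one differentiation rule per statement."""
            def __init__(self, val, grad):
                self.val, self.grad = val, grad
            def __mul__(self, other):                       # product rule
                return Dual(self.val * other.val,
                            [a * other.val + b * self.val
                             for a, b in zip(self.grad, other.grad)])
            def sin(self):                                  # chain rule for SIN
                return Dual(math.sin(self.val),
                            [math.cos(self.val) * g for g in self.grad])

        # f(x1, x2) = x1 * sin(x2), split into single-rule component statements t1, t2
        x1 = Dual(2.0, [1.0, 0.0])
        x2 = Dual(0.5, [0.0, 1.0])
        t1 = x2.sin()
        t2 = x1 * t1
        print(t2.val, t2.grad)   # value of f and its gradient [sin(x2), x1*cos(x2)]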

  3. Authoritative parenting, child competencies, and initiation of cigarette smoking.

    Science.gov (United States)

    Jackson, C; Bee-Gates, D J; Henriksen, L

    1994-01-01

    School-based social influence programs to prevent adolescent smoking are having limited success in the long term. Intervening earlier in the process of smoking onset, during the childhood years, may be required to prevent adolescent smoking. Child socialization variables, specifically parenting behaviors and child competencies, may be important to understanding the earliest phase of smoking onset. This study tested hypotheses of association between authoritative parenting behaviors, enhanced child competencies, and relatively low rates of initiation of cigarette smoking. Analyzing cross-sectional survey data from 937 students in Grades 3 to 8, we found general support for the study hypotheses: Authoritative parenting was positively associated with child competencies; children's competency levels were inversely related to their rates of smoking intention, initiation, and experimentation; authoritative parenting was inversely related to rates of child smoking intention and behaviors; and authoritative parenting and parent smoking status had independent associations with child initiation of cigarette smoking. These results indicate that child socialization variables merit further investigation for their potential role in the development of early intervention programs for smoking prevention.

  4. Parsing with subdomain instance weighting from raw corpora

    NARCIS (Netherlands)

    Plank, B.; Sima'an, K.

    2008-01-01

    The treebanks that are used for training statistical parsers consist of hand-parsed sentences from a single source/domain like newspaper text. However, newspaper text concerns different subdomains of language use (e.g. finance, sports, politics, music), which implies that the statistics gathered by

  5. Parsing with Subdomain Instance Weighting from Raw Corpora

    NARCIS (Netherlands)

    Plank, Barbara; Sima'an, Khalil

    2008-01-01

    The treebanks that are used for training statistical parsers consist of hand-parsed sentences from a single source/domain like newspaper text. However, newspaper text concerns different subdomains of language use (e.g. finance, sports, politics, music), which implies that the statistics gathered by

  6. Joint part-of-speech and dependency projection from multiple sources

    DEFF Research Database (Denmark)

    Johannsen, Anders Trærup; Agic, Zeljko; Søgaard, Anders

    2016-01-01

    for multiple tasks from multiple source languages, relying on parallel corpora available for hundreds of languages. When training POS taggers and dependency parsers on jointly projected POS tags and syntactic dependencies using our algorithm, we obtain better performance than a standard approach on 20...

  7. The adhesive strength and initial viscosity of denture adhesives.

    Science.gov (United States)

    Han, Jian-Min; Hong, Guang; Dilinuer, Maimaitishawuti; Lin, Hong; Zheng, Gang; Wang, Xin-Zhi; Sasaki, Keiichi

    2014-11-01

    To examine the initial viscosity and adhesive strength of modern denture adhesives in vitro. Three cream-type denture adhesives (Poligrip S, Corect Cream, Liodent Cream; PGS, CRC, LDC) and three powder-type denture adhesives (Poligrip Powder, New Faston, Zanfton; PGP, FSN, ZFN) were used in this study. The initial viscosity was measured using a controlled-stress rheometer. The adhesive strength was measured according to ISO-10873 recommended procedures. All data were analyzed independently by one-way analysis of variance combined with a Student-Newman-Keuls multiple comparison test at a 5% level of significance. The initial viscosity of all the cream-type denture adhesives was lower than the powder-type adhesives. Before immersion in water, all the powder-type adhesives exhibited higher adhesive strength than the cream-type adhesives. However, the adhesive strength of cream-type denture adhesives increased significantly and exceeded the powder-type denture adhesives after immersion in water. For powder-type adhesives, the adhesive strength significantly decreased after immersion in water for 60 min, while the adhesive strength of the cream-type adhesives significantly decreased after immersion in water for 180 min. Cream-type denture adhesives have lower initial viscosity and higher adhesive strength than powder type adhesives, which may offer better manipulation properties and greater efficacy during application.

  8. Effect of initial grain size on dynamic recrystallization in high purity austenitic stainless steels

    International Nuclear Information System (INIS)

    El Wahabi, M.; Gavard, L.; Montheillet, F.; Cabrera, J.M.; Prado, J.M.

    2005-01-01

    The influence of initial microstructure on discontinuous dynamic recrystallization (DDRX) has been investigated by using high purity and ultra high purity austenitic stainless steels with various initial grain sizes. After uniaxial compression tests at constant strain rates and various temperatures, the steady-state microstructure, or the state corresponding to the maximum strain (ε = 1) attained in the test, was analyzed by scanning electron microscopy aided by automated electron backscattering diffraction. Measurements of the recrystallized grain size d_rec and the twin boundary fraction f_TB were carried out. The mechanical behavior was also investigated by comparing experimental stress-strain curves for various initial grain sizes. DDRX kinetics was described by the classical Avrami equation. It was concluded that larger initial grain sizes delayed the onset of DDRX in the two alloys. It was also observed that the softening process progressed faster for smaller initial grain sizes. The effect of initial grain size is larger in the HP material and becomes more pronounced at low temperature

  9. Design and numerical simulation of a 3-D electron plasma analyzer that resolves both energy and elevation angle

    International Nuclear Information System (INIS)

    Weiss, L.A.; Sablik, M.J.; Winningham, J.D.; Frahm, R.A.; Reiff, P.H.

    1989-01-01

    The Comet Rendezvous and Asteroid Flyby Mission (CRAF) will include, as one of its complement of thirteen scientific instruments, a plasma electron analyzer capable of providing 3-dimensional measurements of the energy and angular distribution of electrons in the solar wind, asteroidal and cometary environments. After initial instrument selection, mission planners at JPL suggested that an instrument capable of performing angular scanning electronically rather than mechanically be investigated. This paper describes the computer design of the new CRAF plasma electron detector, consisting of an electronic scanning component, called the 'elevation analyzer', and an energy analyzing component based on the Soft Particle Spectrometer (SPS) and its successor, the Spectrographic Particle Imager (SPI). Numerical simulation of each component's operation - consisting of ray-tracing particles through the electrostatic field of each analyzer and collecting statistics on those particles successfully transmitted - is used to determine the energy and angular response functions of each component and the design dimensions that optimize these responses. (orig.)

  10. Initial Analyses of Change Detection Capabilities and Data Redundancies in the Long Term Resource Monitoring Program

    National Research Council Canada - National Science Library

    Lubinski, Kenneth

    2001-01-01

    Evaluations of Long Term Resource Monitoring Program sampling designs for water quality, fish, aquatic vegetation, and macroinvertebrates were initiated in 1999 by analyzing data collected since 1992...

  11. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally analyzing data happens via batch-processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: A cloud-based approach. It aims to make it a productive and interactive environment through the combination of FCC and SWAN software.

  12. Effect of initial perturbation amplitude on Richtmyer-Meshkov flows induced by strong shocks

    Energy Technology Data Exchange (ETDEWEB)

    Dell, Z.; Abarzhi, S. I., E-mail: snezhana.abarzhi@gmail.com, E-mail: sabarji@andrew.cmu.edu [Mellon College of Science and Carnegie Mellon University – Qatar, Carnegie Mellon University, Pittsburgh, Pennsylvania 15231 (United States); Stellingwerf, R. F. [Stellingwerf Consulting, Huntsville, Alabama 35803 (United States)

    2015-09-15

    We systematically study the effect of the initial perturbation on Richtmyer-Meshkov (RM) flows induced by strong shocks in fluids with contrasting densities. Smooth Particle Hydrodynamics simulations are employed. A broad range of shock strengths and density ratios is considered. The amplitude of the initial single-mode sinusoidal perturbation of the interface varies from 0% to 100% of its wavelength. The simulation results are compared, wherever possible, with four rigorous theories, and with other experiments and simulations, achieving good quantitative and qualitative agreement. Our study is focused on the early-time dynamics of the Richtmyer-Meshkov instability (RMI). We analyze the initial growth-rate of RMI immediately after the shock passage, when the perturbation amplitude increases linearly with time. For the first time, to the authors' knowledge, we find that the initial growth-rate of RMI is a non-monotone function of the initial perturbation amplitude, thus restraining the amount of energy that can be deposited by the shock at the interface. The maximum value of the initial growth-rate depends on the shock strength and the density ratio, whereas the corresponding value of the initial perturbation amplitude depends only slightly on the shock strength and density ratio.
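
    For reference (not stated in the abstract), the classical impulsive-model estimate for the post-shock linear growth rate of a single-mode perturbation, often used as the small-amplitude baseline against which amplitude effects such as the non-monotone dependence reported here are measured, is

        \dot{a}_0 \;\simeq\; k \, A^{+} \, \Delta u \, a_0^{+} ,

    where k is the perturbation wavenumber, A⁺ and a₀⁺ are the post-shock Atwood number and perturbation amplitude, and Δu is the velocity jump imparted to the interface by the shock. The simulations described above probe the regime where k·a₀ is not small, which is precisely where this linear scaling breaks down.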

  13. Onboard Processing on PWE OFA/WFC (Onboard Frequency Analyzer/Waveform Capture) aboard the ERG (ARASE) Satellite

    Science.gov (United States)

    Matsuda, S.; Kasahara, Y.; Kojima, H.; Kasaba, Y.; Yagitani, S.; Ozaki, M.; Imachi, T.; Ishisaka, K.; Kurita, S.; Ota, M.; Kumamoto, A.; Tsuchiya, F.; Yoshizumi, M.; Matsuoka, A.; Teramoto, M.; Shinohara, I.

    2017-12-01

    Exploration of energization and Radiation in Geospace (ERG) is a mission for understanding particle acceleration, loss mechanisms, and the dynamic evolution of space storms in the context of cross-energy and cross-regional coupling [Miyoshi et al., 2012]. The ERG (ARASE) satellite was launched on December 20, 2016, and successfully inserted into orbit. The Plasma Wave Experiment (PWE) is one of the science instruments on board the ERG satellite; it measures the electric and magnetic fields in the inner magnetosphere. PWE consists of three sub-components: EFD (Electric Field Detector), OFA/WFC (Onboard Frequency Analyzer and Waveform Capture), and HFA (High Frequency Analyzer). In particular, OFA/WFC measures electric and magnetic field spectra and waveforms from a few Hz to 20 kHz. OFA/WFC processes signals detected by a pair of dipole wire-probe antennas (WPT) and tri-axis magnetic search coils (MSC) installed on board the satellite. The PWE-OFA subsystem calculates and produces three kinds of data: OFA-SPEC (power spectrum), OFA-MATRIX (spectral matrix), and OFA-COMPLEX (complex spectrum). They are processed continuously 24 hours per day, and all data are sent to the ground. OFA-MATRIX and OFA-COMPLEX are used for polarization analyses and direction finding of plasma waves. The PWE-WFC subsystem measures raw (64 kHz sampled) and down-sampled (1 kHz sampled) burst waveforms detected by the WPT and MSC sensors. It is activated by command, by automatic triggering, or by scheduling. The initial check-out process of the PWE was successfully completed, and initial data have been obtained. In this presentation, we introduce the onboard processing techniques of PWE OFA/WFC and the initial results.
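
    The abstract names the three onboard OFA products (a power spectrum, a spectral matrix, and a complex spectrum) computed from the WPT and MSC waveforms. The Python sketch below shows, for assumed two-channel waveform data, what those quantities amount to mathematically: a windowed FFT, a periodogram-style power spectral density, and the cross-spectral matrix used for polarization analysis and direction finding. The sampling rate, window, and normalization are assumptions for illustration, not the flight software.

        import numpy as np

        def ofa_like_products(waveforms, fs, nfft=1024):
            """Compute spectral products analogous to OFA-SPEC / OFA-MATRIX / OFA-COMPLEX.

            waveforms : array of shape (n_channels, n_samples), e.g. E-field and B-field components
            fs        : sampling frequency in Hz
            """
            win = np.hanning(nfft)
            seg = waveforms[:, :nfft] * win                       # one windowed analysis frame
            spec = np.fft.rfft(seg, axis=1)                       # complex spectrum per channel
            power = (np.abs(spec) ** 2) / (fs * np.sum(win ** 2)) # power spectral density per channel
            # Spectral matrix: cross-spectra between all channel pairs, per frequency bin
            matrix = np.einsum('if,jf->ijf', spec, np.conj(spec))
            freqs = np.fft.rfftfreq(nfft, 1 / fs)
            return freqs, power, matrix, spec

        # Synthetic two-channel input: a 5 kHz wave seen on both sensors with a phase offset
        fs, t = 65536.0, np.arange(4096) / 65536.0
        wave = np.vstack([np.sin(2 * np.pi * 5e3 * t), np.sin(2 * np.pi * 5e3 * t + 0.7)])
        freqs, psd, smat, cspec = ofa_like_products(wave, fs)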

  14. Snail1 induces epithelial-to-mesenchymal transition and tumor initiating stem cell characteristics

    International Nuclear Information System (INIS)

    Dang, Hien; Ding, Wei; Emerson, Dow; Rountree, C Bart

    2011-01-01

    Tumor initiating stem-like cells (TISCs) are a subset of neoplastic cells that possess distinct survival mechanisms and self-renewal characteristics crucial for tumor maintenance and propagation. The induction of epithelial-mesenchymal transition (EMT) by TGFβ has recently been linked to the acquisition of TISC characteristics in breast cancer. In HCC, a TISC and EMT phenotype correlates with a worse prognosis. In this work, our aim is to elucidate the underlying mechanism by which cells acquire tumor initiating characteristics after EMT. Gene and protein expression assays and a Nanog-promoter luciferase reporter were utilized in epithelial and mesenchymal phenotype liver cancer cell lines. EMT was analyzed with migration/invasion assays. TISC characteristics were analyzed with tumor-sphere self-renewal and chemotherapy resistance assays. An in vivo tumor assay was performed to investigate the role of Snail1 in tumor initiation. TGFβ induced EMT in epithelial cells through the up-regulation of Snail1 in Smad-dependent signaling. Mesenchymal liver cancer cells post-EMT demonstrate TISC characteristics such as tumor-sphere formation but are not resistant to cytotoxic therapy. The inhibition of Snail1 in mesenchymal cells results in decreased Nanog promoter luciferase activity and loss of self-renewal characteristics in vitro. These changes confirm the direct role of Snail1 in some TISC traits. In vivo, the down-regulation of Snail1 reduced tumor growth but was not sufficient to eliminate tumor initiation. In summary, TGFβ induces EMT and TISC characteristics through Snail1 and Nanog up-regulation. In mesenchymal cells post-EMT, Snail1 directly regulates Nanog expression, and loss of Snail1 regulates tumor growth without affecting tumor initiation

  15. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology. ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  16. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and the analytical time and speed per test.

  17. Design, development and implementation of a simple program ...

    African Journals Online (AJOL)

    The method adopted included sentence analysis, which involved the recognition of sentences and sentence structures; the construction of syntax graphs, which reflect the flow of control during the parsing of a sentence; and the corresponding parser, which reads the text into an internal, more abstract representation.

  18. O acesso semântico no parsing sintático: evidências experimentais

    Directory of Open Access Journals (Sweden)

    Marcus Maia

    2001-02-01

    Full Text Available

    This study presents evidence in favor of the hypothesis that the parser makes fast and efficient use of certain types of lexical information associated with the verb during on-line processing.

  19. Integrating Syntax, Semantics, and Discourse DARPA Natural Language Understanding Program. Volume 2. Appendices.

    Science.gov (United States)

    1987-05-14

    Memo No. 43, Paoli Research Center, System Development Corporation, 1986. L. Hirschman and K. Puder, Restriction Grammar in Prolog. In Pr... of as...causes and results of SAC failures. 3. METHODOLOGY The essential feature of our parser which facilitates the collecting of syntactic patterns is the

  20. SUBTLE: Situation Understanding Bot through Language and Environment

    Science.gov (United States)

    2016-01-06

    using the Bikel parser (Bikel, 2004); these parses are then post-processed using the null element (understood subject) restoration system of Gabbard ...trol. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp 1988–1993 Gabbard R, Marcus M, Kulick S (2006) Fully

  1. Duplicates and translation of nested SQL queries into XRA

    NARCIS (Netherlands)

    N.Th. Verbrugge

    1990-01-01

    The PRISMA/DB system contains a parser to translate the database language SQL into eXtended Relational Algebra (XRA). The early definition of XRA, which has a multi-set semantics, proves inadequate for translating SQL according to its nested-iteration semantics. The prime cause is that

  2. An Alternate Approach to Optimal L2-Error Analysis of Semidiscrete Galerkin Methods for Linear Parabolic Problems with Nonsmooth Initial Data

    KAUST Repository

    Goswami, Deepjyoti; Pani, Amiya K.

    2011-01-01

    In this article, we propose and analyze an alternate proof of a priori error estimates for semidiscrete Galerkin approximations to a general second order linear parabolic initial and boundary value problem with rough initial data. Our analysis

  3. Avaliação da anotação semântica do PALAVRAS e sua pós-edição manual para o Corpus Summ-it

    Directory of Open Access Journals (Sweden)

    Élen Cátia Tomazela

    2011-01-01

    Full Text Available This article presents an evaluation of the automatic semantic annotation produced by the PALAVRAS parser and of its manual post-editing for a corpus of Portuguese texts, the Summ-it Corpus. The post-editing was aimed at improving a linguistic model for automatic text summarization and sought to assign semantic tags to the lexical items that are more adequate than those employed by the parser. This task was carried out by linguists, and the problematic cases are presented in this article; they lead to considerations about the PALAVRAS tagging model itself. The revised corpus will be made available to the community and may be useful for several Natural Language Processing applications.

  4. How Far Is Stanford from Prague (and vice versa)? Comparing Two Dependency-based Annotation Schemes by Network Analysis

    Directory of Open Access Journals (Sweden)

    Marco Passarotti

    2016-07-01

    Full Text Available The paper evaluates the differences between two currently leading annotation schemes for dependency treebanks. By relying on four treebanks, we demonstrate that the treatment of conjunctions and adpositions represents the core difference between the two schemes and that this impacts the topological properties of the linguistic networks induced from the treebanks. We also show that such properties are reflected in the performances of four probabilistic dependency parsers trained on the treebanks.

  5. Avaliação da anotação semântica do PALAVRAS e sua pós-edição manual para o Corpus Summ-it

    Directory of Open Access Journals (Sweden)

    Élen Cátia Tomazela

    2011-01-01

    Full Text Available This article presents an evaluation of the automatic semantic annotation produced by the PALAVRAS parser and of its manual post-editing for a corpus of Portuguese texts, the Summ-it Corpus. The post-editing was aimed at improving a linguistic model for automatic text summarization and sought to assign semantic tags to the lexical items that are more adequate than those employed by the parser. This task was carried out by linguists, and the problematic cases are presented in this article; they lead to considerations about the PALAVRAS tagging model itself. The revised corpus will be made available to the community and may be useful for several Natural Language Processing applications.

  6. Translation of PLC Programs to x86 for Simulation and Verification

    CERN Document Server

    Sallai, Gyula

    2017-01-01

    PLC programs are written in special languages, variants of the languages defined in the IEC 61131 standard. These programs cannot be directly executed on personal computers (on x86 architecture). To perform simulation of the PLC program or diagnostics during development, either a real PLC or a PLC simulator has to be used. However, these solutions are often inflexible and they do not provide appropriate performance. By generating x86-representations (semantically equivalent programs which can be executed on PCs, e.g. written in C, C++ or Java) of the PLC programs, some of these challenges could be met. PLCverif is a PLC program verification tool developed at CERN which includes a parser for Siemens PLC programs. In this work, we describe a code generator based on this parser of PLCverif. This work explores the possibilities and challenges of generating programs in widely-used general purpose languages from PLC programs, and provides a proof-of-concept code generation implementation. The presented solution dem...
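
    As a toy illustration of the general idea described above (emitting a semantically equivalent, PC-executable representation of a PLC program), the Python sketch below maps a couple of Structured-Text-style Boolean assignments to C source text. The accepted input syntax, the operator mapping, and the emitted C skeleton are assumptions for illustration; PLCverif's parser and the code generator described in the abstract target Siemens PLC languages and are far more complete.

        import re

        def st_assignment_to_c(st_line):
            """Translate a single Structured-Text-style assignment ('x := a AND b;') into a C statement."""
            m = re.match(r"\s*(\w+)\s*:=\s*(.+);\s*$", st_line)
            if not m:
                raise ValueError(f"unsupported statement: {st_line!r}")
            target, expr = m.groups()
            # Minimal operator mapping; real PLC languages need a full parser, not regexes
            expr = (expr.replace(" AND ", " && ")
                        .replace(" OR ", " || ")
                        .replace("NOT ", "!"))
            return f"{target} = {expr};"

        program = [
            "motor_on := start_btn AND NOT stop_btn;",
            "alarm := over_temp OR over_pressure;",
        ]
        body = "\n    ".join(st_assignment_to_c(line) for line in program)
        print("void cycle(void) {\n    " + body + "\n}")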

  7. The effects of initial rise and axial loads on MEMS arches

    KAUST Repository

    Tella, Sherif Adekunle

    2017-04-07

    Arch microbeams have been utilized and proposed for many uses over the past few years due to their large tunability and bistability. However, recent experimental data have shown different mechanical behavior of arches when subjected to axial loads. This paper aims to investigate in depth the influence of the competing effects of initial rise and axial loads on the mechanical behavior of micromachined arches, mainly their static deflection and resonant frequencies. Based on analytical solutions, the static response and eigenvalue problems are analyzed for various values of initial rise and axial load. Universal curves showing the variation of the first three resonance frequencies of the arch are generated for various values of initial rise under both tensile and compressive axial loads. This study shows that increasing the tensile or compressive axial load for different values of initial rise may lead either to an increase in the stiffness of the beam or to an initial decrease in stiffness that later increases as the axial load grows, depending on whether the initial rise of the arch or the axial load dominates. The obtained universal curves represent useful design tools to predict the tunability of arches under axial loads for various values of initial rise. The use of the universal curves is demonstrated with an experimental case study. An analytical formulation is developed to predict the point of minimum where the trend of the resonance frequency versus axial load changes qualitatively due to the competing effects of axial load and initial curvature.

  8. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has many applications and is thus a very significant and useful tool, but it can also be dangerous for living beings exposed to uncontrolled doses. Because it cannot be perceived by any of the human senses, radiation detectors and additional devices are required to detect, quantify, and classify it. A multichannel analyzer separates the pulses of different heights generated in a detector into a certain number of channels, determined by the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were written in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA; for the internal control of the multichannel analyzer, the application for the ARM processor was written in C. In the second phase, the virtual instrument was developed for the management, control, and visualization of the results. The data obtained with the system were displayed graphically as a histogram showing the measured spectrum. The design of the multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable to those of commercial multichannel analyzers. (Author)
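
    The core function described above, sorting detector pulse heights into a number of channels fixed by the ADC resolution, amounts to a histogramming step. The Python sketch below is a behavioral model under assumed inputs (already-digitized pulse heights and a 12-bit ADC); it is not the VHDL implementation, and the synthetic photopeak position is arbitrary.

        import numpy as np

        def mca_histogram(pulse_heights, adc_bits=12):
            """Behavioral model of a multichannel analyzer: bin pulse heights into 2**adc_bits channels."""
            n_channels = 2 ** adc_bits
            # Assume pulse heights are already digitized to integer ADC codes in [0, n_channels)
            codes = np.clip(pulse_heights.astype(int), 0, n_channels - 1)
            spectrum = np.bincount(codes, minlength=n_channels)
            return spectrum

        # Synthetic spectrum: a flat background plus a Gaussian photopeak near an arbitrary channel
        rng = np.random.default_rng(1)
        pulses = np.concatenate([rng.integers(0, 4096, 50_000),
                                 rng.normal(1330, 12, 20_000)])
        spectrum = mca_histogram(pulses, adc_bits=12)
        print(spectrum.argmax(), spectrum.max())   # channel of the photopeak and its count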

  9. No-hair conjectures, primordial shear and protoinflationary initial conditions

    CERN Document Server

    Giovannini, Massimo

    2014-01-01

    Anisotropic inflationary background geometries are analyzed in the context of an extended gauge action where the electric and magnetic susceptibilities are not bound to coincide and depend on the inflaton field. After deriving various classes of solutions with electric and magnetic hairs, we discuss the problem of the initial boundary conditions of the shear parameter and consider a globally neutral plasma as a possible relic of a preinflationary stage of expansion. While electric hairs are washed out by the finite value of the protoinflationary conductivity, magnetic hairs can persist and introduce a tiny amount of shear causing a different inflationary rate of expansion along orthogonal spatial directions. The plasma interactions are a necessary criterion to discriminate between physical and unphysical initial conditions but they are not strictly sufficient to warrant the stability of a given magnetic solution.

  10. Analyzing solid waste management practices for the hotel industry

    Directory of Open Access Journals (Sweden)

    S.T. Pham Phu

    2018-01-01

    Full Text Available The current study aims to analyze the waste characteristics and management practices of the hotel industry in Hoi An, a tourism city in the center of Vietnam. Solid waste from 120 hotels was sampled, face-to-face interviews were conducted, and statistical methods were used to analyze the data. The results showed that the mean waste generation rate of the hotels was 2.28 kg/guest/day and was strongly correlated with internal influencing factors such as capacity, room price, the presence of a garden, and the level of restaurant service. The differences in the waste generation rates of the hotels were shown to be statistically significant: the larger the hotel, the higher the waste generation rate. The waste composition of the hotels was 58.5% biodegradable waste, 25.8% recyclables, and 15.7% other waste. Relative differences in the waste composition of the hotels by climate, hotel features, and guest type are explained; the larger the hotel, the higher the percentage of biodegradable waste and the lower the proportion of recyclables. This study also revealed that the hoteliers' implementation of waste management practices has already produced quite positive results, with 76% sorting, 39% recycling, 29% reduction, and 0.8% composting, and that the rate of waste management practice was proportional to the scale of the hotel. This study provides information on the waste management practices of the hotel industry and contributes to the overall assessment of municipal solid waste management in Hoi An city.

  11. Teachers’ assessments of demonstration of student initiative

    Directory of Open Access Journals (Sweden)

    Komlenović Đurđica

    2012-01-01

    Full Text Available This paper explores student initiative, or student engagement in activities in the school environment, as an aspect of students' functioning that is assumed to be a prerequisite for their contribution to the quality of instruction and better use of the possibilities for education and development in the school environment. We approach this topic from the teachers' perspective, since our aim is to observe how teachers assess the initiative of their students (how important it is, how it is manifested, and how present it is in different segments of school activities). In the first part of the paper we analyze the construct "student initiative" and the similar construct "student engagement". In the second part of the paper we present the results of a study in which primary school teachers (N=182) from the territory of Serbia expressed their views on student initiative. Teachers' answers to open- and close-ended questions from the questionnaire (19 items in total) were processed with quantitative and qualitative methods. The results indicate that the majority of teachers believed that student initiative was a very important general feature of behavior in the school environment, independent of age, which was most present in the domain of peer socializing and the relationship with teachers, and least present in the very domains of student functioning that teachers deemed most desirable (mastering the curriculum, regulation of disciplinary issues). [Projects of the Ministry of Science of the Republic of Serbia, No. 179034: From encouraging initiative, cooperation and creativity in education to new roles and identities in society, and No. 47008: Improving the quality and accessibility of education in the modernization processes of Serbia]

  12. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    Science.gov (United States)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to

  13. Japan's English-Medium Instruction Initiatives and the Globalization of Higher Education

    Science.gov (United States)

    Rose, Heath; McKinley, Jim

    2018-01-01

    This article analyzes a recent initiative of Japan's Ministry of Education, which aims to internationalize higher education in Japan. The large-investment project "Top Global University Project" (TGUP) has emerged to create globally oriented universities, to increase the role of foreign languages in higher education, and to foster global…

  14. Determinants of smoking initiation among women in five European countries: a cross-sectional survey

    LENUS (Irish Health Repository)

    Oh, Debora L

    2010-02-17

    Abstract Background The rate of smoking and lung cancer among women is rising in Europe. The primary aim of this study was to determine why women begin smoking in five different European countries at different stages of the tobacco epidemic and to determine if smoking is associated with certain characteristics and/or beliefs about smoking. Methods A cross-sectional telephone survey on knowledge and beliefs about tobacco was conducted as part of the Women in Europe Against Lung Cancer and Smoking (WELAS) Project. A total of 5 000 adult women from France, Ireland, Italy, Czech Republic, and Sweden were interviewed, with 1 000 from each participating country. All participants were asked questions about demographics, knowledge and beliefs about smoking, and their tobacco use background. Current and former smokers also were asked questions about smoking initiation. Basic statistics on the cross-sectional data were reported with chi-squared and ANOVA p-values. Logistic regression was used to analyze ever versus never smokers. Linear regression analyses were used to analyze age of smoking initiation. Results Being older, being divorced, having friends/family who smoke, and having parents who smoke were all significantly associated with ever smoking, though the strength of the associations varied by country. The most frequently reported reason for initiating smoking was friends smoking, with 62.3% of ever smokers reporting friends as one of the reasons why they began smoking. Mean age of smoking initiation was 18.2 years and over 80% of participants started smoking by the age of 20. The highest levels of young initiators were in Sweden with 29.3% of women initiating smoking at age 14-15 and 12.0% initiating smoking younger than age 14. The lowest level of young initiators was in the Czech Republic with 13.7% of women initiating smoking at age 14-15 and 1.4% of women initiating smoking younger than age 14. Women who started smoking because their friends smoked or to look

  15. Determinants of smoking initiation among women in five European countries: a cross-sectional survey.

    LENUS (Irish Health Repository)

    Oh, Debora L

    2010-02-17

    ABSTRACT: BACKGROUND: The rate of smoking and lung cancer among women is rising in Europe. The primary aim of this study was to determine why women begin smoking in five different European countries at different stages of the tobacco epidemic and to determine if smoking is associated with certain characteristics and/or beliefs about smoking. METHODS: A cross-sectional telephone survey on knowledge and beliefs about tobacco was conducted as part of the Women in Europe Against Lung Cancer and Smoking (WELAS) Project. A total of 5 000 adult women from France, Ireland, Italy, Czech Republic, and Sweden were interviewed, with 1 000 from each participating country. All participants were asked questions about demographics, knowledge and beliefs about smoking, and their tobacco use background. Current and former smokers also were asked questions about smoking initiation. Basic statistics on the cross-sectional data were reported with chi-squared and ANOVA p-values. Logistic regression was used to analyze ever versus never smokers. Linear regression analyses were used to analyze age of smoking initiation. RESULTS: Being older, being divorced, having friends/family who smoke, and having parents who smoke were all significantly associated with ever smoking, though the strength of the associations varied by country. The most frequently reported reason for initiating smoking was friends smoking, with 62.3% of ever smokers reporting friends as one of the reasons why they began smoking. Mean age of smoking initiation was 18.2 years and over 80% of participants started smoking by the age of 20. The highest levels of young initiators were in Sweden with 29.3% of women initiating smoking at age 14-15 and 12.0% initiating smoking younger than age 14. The lowest level of young initiators was in the Czech Republic with 13.7% of women initiating smoking at age 14-15 and 1.4% of women initiating smoking younger than age 14. Women who started smoking because their friends smoked or to

  16. Initialized Fractional Calculus

    Science.gov (United States)

    Lorenzo, Carl F.; Hartley, Tom T.

    2000-01-01

    This paper demonstrates the need for a nonconstant initialization for the fractional calculus and establishes a basic definition set for the initialized fractional differintegral. This definition set allows the formalization of an initialized fractional calculus. Two basis calculi are considered: the Riemann-Liouville and the Grunwald fractional calculi. Two forms of initialization, terminal and side, are developed.
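
    As a minimal point of reference (the paper itself develops the full definition set and the terminal and side forms of the initialization), the Riemann-Liouville fractional integral of order q > 0 starting at time c, and its initialized counterpart accounting for the function's history before c, can be written as

        {}_{c}d_t^{-q} f(t) = \frac{1}{\Gamma(q)} \int_{c}^{t} (t-\tau)^{q-1} f(\tau)\, d\tau ,
        \qquad
        {}_{a}D_t^{-q} f(t) = {}_{c}d_t^{-q} f(t) + \psi(f, -q, a, c, t),

    where the initialization function ψ carries the effect of f on the interval [a, c] and, as the paper argues, need not be a constant.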

  17. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  18. THE ANALYSIS OF THE COMMODITY PRICE FORECASTING SUCCESS CONSIDERING DIFFERENT LENGTHS OF THE INITIAL CONDITION DRIFT

    Directory of Open Access Journals (Sweden)

    Marcela Lascsáková

    2015-09-01

    Full Text Available In this paper a numerical model based on an exponential approximation of commodity stock exchange prices was derived. Price prognoses for aluminium on the London Metal Exchange were determined as the numerical solution of a Cauchy initial value problem for a first-order ordinary differential equation. To make the numerical model more accurate, the value of the initial condition was modified using the observed stock exchange price. By analyzing the forecasting success of the chosen types of initial condition drift, the drift providing the most accurate prognoses of commodity price movements was determined. The suggested modification of the original model made the commodity price prognoses more accurate.
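
    The abstract describes prognoses obtained as the numerical solution of a Cauchy problem for a first-order ODE under an exponential approximation, with the initial condition periodically re-anchored (drifted) to the observed exchange price; the exact equations and drift lengths are not reproduced here. The Python sketch below implements one plausible reading of that scheme (exponential trend y' = k·y, explicit Euler stepping, re-initialization to the latest observed price every few steps); the growth rate, the re-anchoring interval, and the price series are assumptions for illustration.

        import numpy as np

        def forecast_with_reinitialization(observed, k, drift_every=5, horizon=1):
            """Forecast prices by solving y' = k*y (exponential trend) with explicit Euler steps,
            re-anchoring the initial condition to the latest observed price every `drift_every` steps."""
            forecasts = np.empty_like(observed, dtype=float)
            y = observed[0]
            for i in range(len(observed)):
                if i % drift_every == 0:        # initial condition drift: restart from the exchange price
                    y = observed[i]
                y = y * (1.0 + k * horizon)     # one Euler step of y' = k*y over the forecast horizon
                forecasts[i] = y
            return forecasts

        # Hypothetical daily aluminium prices (USD/t) and an assumed growth rate
        prices = np.array([2100, 2112, 2125, 2119, 2140, 2155, 2149, 2160, 2172, 2168], float)
        pred = forecast_with_reinitialization(prices, k=0.004)
        mape = np.mean(np.abs(pred[:-1] - prices[1:]) / prices[1:]) * 100   # one-step-ahead error
        print(f"MAPE over the sample: {mape:.2f}%")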

  19. Salvage of relapse of patients with Hodgkin's disease in clinical stages I or II who were staged with laparotomy and initially treated with radiotherapy alone. A report from the international database on Hodgkin's disease

    DEFF Research Database (Denmark)

    Specht, L.; Horwich, A.; Ashley, S.

    1994-01-01

    PURPOSE: To analyze presentation variables that might indicate a high or low likelihood of success of the treatment of patients relapsing after initial radiotherapy of Hodgkin's disease in clinical Stages I or II who were staged with laparotomy. METHODS AND MATERIALS: Data were analyzed on 681 patients in the International Database on Hodgkin's Disease who were initially in clinical Stages I or II, who were staged with laparotomy, and who relapsed after initial treatment with irradiation alone. Factors analyzed for outcome after first relapse included initial stage, age, sex, histology...

  20. Productive Language Use with IT'S ENGLISH

    NARCIS (Netherlands)

    Kanselaar, G.; Jaspers, J.G.M.; Kok, W.A.M.

    1993-01-01

    Based on the results of a study in 1989, a new Computer-Assisted Instruction program for foreign language teaching of English has been developed. Main features of this program are the communicative approach, a 70,000-word dictionary, sound, and a syntactic parser. An evaluation study was carried out

  1. Robo-Sensei's NLP-Based Error Detection and Feedback Generation

    Science.gov (United States)

    Nagata, Noriko

    2009-01-01

    This paper presents a new version of Robo-Sensei's NLP (Natural Language Processing) system which updates the version currently available as the software package "ROBO-SENSEI: Personal Japanese Tutor" (Nagata, 2004). Robo-Sensei's NLP system includes a lexicon, a morphological generator, a word segmentor, a morphological parser, a syntactic…

  2. Zebu

    DEFF Research Database (Denmark)

    Burgy, Laurent; Réveillère, Laurent; Lawall, Julia

    2011-01-01

    -handling layer to the needs of a given application. The Zebu compiler first checks the annotated specification for inconsistencies, and then generates a protocol-handling layer according to the annotations. This protocol-handling layer is made up of a set of data structures that represent a message, a parser...

  3. Grammar as a Programming Language. Artificial Intelligence Memo 391.

    Science.gov (United States)

    Rowe, Neil

    Student projects that involve writing generative grammars in the computer language, "LOGO," are described in this paper, which presents a grammar-running control structure that allows students to modify and improve the grammar interpreter itself while learning how a simple kind of computer parser works. Included are procedures for…

  4. Sorry Dave, I’m Afraid I Can’t Do That: Explaining Unachievable Robot Tasks using Natural Language

    Science.gov (United States)

    2013-06-24

    processing components used by Brooks et al. [6]: the Bikel parser [3] combined with the null element (understood subject) restoration of Gabbard et al...Intelligent Robots and Systems (IROS), pages 1988–1993, 2010. [12] Ryan Gabbard, Mitch Marcus, and Seth Kulick. Fully parsing the Penn Treebank. In Human

  5. [Sexual initiation, masculinity and health: narratives of young men].

    Science.gov (United States)

    Rebello, Lúcia Emilia Figueiredo de Sousa; Gomes, Romeu

    2009-01-01

    The main objective of this study was to analyze the narratives of young university students about the experience of sexual initiation. The theoretical and conceptual references used were the sexual scripts of our society that inform people about when, how, where and with whom they should have their sexual experiences, indicating how to act sexually and the reasons why they have to practice some kind of sexual activity. The method used was a qualitative study of narratives from the perspective of dialectic hermeneutics. The methodological design involves the comprehension of sceneries, contexts, environments and characters of the narratives about sexual initiation. The analysis refers to narratives of university students in the city of Rio de Janeiro. Among the meanings of sexual initiation, we emphasize sexual intercourse, the demarcation of a stage of life, the awakening to the opposite sex and the discovery of the body. We observed that the young men's narratives were coherent with what is considered masculine, present in the discourse of different generations. It is concluded that the young men should be encouraged to participate in actions combining health and education aimed at promotion of sexual and reproductive health.

  6. Student perception of initial transition into a nursing program: A mixed methods research study.

    Science.gov (United States)

    McDonald, Meghan; Brown, Janine; Knihnitski, Crystal

    2018-05-01

    Transition into undergraduate education programs is stressful and impacts students' well-being and academic achievement. Previous research indicates nursing students experience stress, depression, anxiety, and poor lifestyle habits which interfere with learning. However, nursing students' experience of transition into nursing programs has not been well studied. Incongruence exists between this lack of research and the desire to foster student success. This study analyzed students' experiences of initial transition into a nursing program. An embedded mixed method design. A single site of a direct-entry, four year baccalaureate Canadian nursing program. All first year nursing students enrolled in the fall term of 2016. This study combined the Student Adaptation to College Questionnaire (SACQ) with a subset of participants participating in qualitative focus groups. Quantitative data was analyzed using descriptive statistics to identify statistically significant differences in full-scale and subscale scores. Qualitative data was analyzed utilizing thematic analysis. Significant differences were seen between those who moved to attend university and those who did not, with those who moved scoring lower on the Academic Adjustment subscale. Focus group thematic analysis highlighted how students experienced initial transition into a baccalaureate nursing program. Identified themes included reframing supports, splitting focus/finding focus, negotiating own expectations, negotiating others' expectations, and forming identity. These findings form the Undergraduate Nursing Initial Transition (UNIT) Framework. Significance of this research includes applications in faculty development and program supports to increase student success in the first year of nursing and to provide foundational success for ongoing nursing practice. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Analyzing the Mechanical Behavior of Polymer and Composite Materials by Means of Unique Method of Deformation Calorimetry

    Science.gov (United States)

    Bessonova, N. P.; Chvalun, S. N.

    2018-06-01

    Results are presented from long-term investigations of a wide range of polymer systems, varying from elastomers and thermoplastic elastomers to plastics and fibers. The thermophysical properties of both initial and modifying additive-containing polysiloxanes, block copolymers, and polyolefins that differ in chemical nature, structure, and composition are analyzed. It is shown that deformation calorimetry allows the simultaneous registration of mechanical (from 5 × 10⁻³ kg) and thermal effects (at a sensitivity of 2 × 10⁻⁷ J/s), and the determination of changes in enthalpy, internal energy, and intra- and intermolecular contributions to the formation of the tensile stress response. In other words, it provides a unique opportunity to analyze the deformation mechanism of the investigated systems and its dependence on the changing parameters.
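
    The quantity extracted from the two simultaneous measurements is the internal energy change via the first law, ΔU = Q + W, from which the intra- and intermolecular (energetic and entropic) contributions to the tensile response are separated. A minimal numeric illustration of that bookkeeping follows; the sign convention and the numbers are assumptions for illustration only, not values from the study.

```python
def internal_energy_change(heat_j, work_j):
    """First law for one deformation step: dU = Q + W
    (Q = heat absorbed by the sample, W = mechanical work done on the sample)."""
    return heat_j + work_j

# Hypothetical stretching step: 0.15 J of work done on the sample while it releases 0.12 J of heat
print(internal_energy_change(heat_j=-0.12, work_j=0.15))   # dU = +0.03 J
```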

  8. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built onto large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of usability and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive licence.
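
    A minimal usage sketch is given below. It assumes the package's published Python interface (an Analyzer class with an analyze() method returning candidate readings); the exact class name, constructor options, and result fields should be checked against the DEMorphy documentation, since they may differ between versions.

```python
# Minimal usage sketch for DEMorphy (assumed interface: demorphy.Analyzer with an
# analyze() method returning candidate analyses; check the package documentation,
# since names, options and result fields may differ between versions).
from demorphy import Analyzer

analyzer = Analyzer(char_subs_allowed=True)   # assumed option allowing ae/oe/ue substitutions

for word in ["Katzen", "gegangen", "kleinsten"]:
    for analysis in analyzer.analyze(word):
        # each candidate carries the lemma plus morphological tags (case, number, gender, ...)
        print(word, analysis)
```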

  9. THE ROLE AND SIGNIFICANCE OF HOMEWORK IN INITIAL MATHEMATICS TEACHING

    Directory of Open Access Journals (Sweden)

    Sead Rešić

    2013-02-01

    Full Text Available This thesis elaborates on the role and importance of homework in the initial stages of teaching mathematics. The aim is to determine and analyze the degree of burden on students with homework. The following tasks were performed as a starting point for this research: determining the degree of correlation between the time a student spends on weekly homework and the amount of homework determined by the pedagogical norm, determining the level of parents' participation in helping students with homework, and determining the degree of correlation between the differentiation of homework and the students' motivation for doing homework. Homework plays an important role in the initial stages of teaching mathematics, and takes up a significant place in the process of studying and teaching mathematics. The results, analysis, and conclusions are presented following the research.

  10. Initial investment to 3D printing technologies in a construction company

    Directory of Open Access Journals (Sweden)

    Cernohorsky, Zdenek

    2017-06-01

    Full Text Available This article deals with an initial investment in 3D printing technologies in a construction company. The investment refers to the use of building information models and their integration with 3D printing technology within a construction company. The first part introduces, in general terms, a 3D printing scheme for a construction company from a lifecycle perspective. As part of this scheme, the ideal variant of an initial investment, namely a pilot project, is considered. The second part discusses the pilot project in more detail, covering the activities it should comprise and the cost categories to be analyzed; these categories correspond to particular lifecycle stages of the pilot project. The third part provides a summary. The article can serve as a handout for a construction company considering an initial investment in 3D printing.

  11. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    The diffraction enhanced imaging (DEI) technique is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize the scattered radiation to obtain 'pure absorption' and 'pure refraction' images.
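
    The statistical-moments idea can be illustrated as follows: given a stack of images acquired at several analyzer-crystal angles, the zeroth moment of the per-pixel angular profile yields an apparent-absorption image, the centroid shift yields a refraction-angle image, and the excess variance yields a scattering image. The sketch below is a generic illustration of that moments approach, not the author's exact procedure; array names and units are assumptions.

```python
import numpy as np

def moment_images(stack, angles, flat_profile):
    """Per-pixel moments of the analyzer rocking profile.

    stack        : (n_angles, H, W) intensities measured through the object
    angles       : (n_angles,) analyzer angular positions (e.g. microradians)
    flat_profile : (n_angles,) intensities measured without the object
    """
    stack = np.asarray(stack, dtype=float)
    angles = np.asarray(angles, dtype=float)

    # zeroth moment: total transmitted intensity -> apparent absorption image
    m0 = np.trapz(stack, angles, axis=0)
    m0_flat = np.trapz(flat_profile, angles)
    absorption = -np.log(np.clip(m0 / m0_flat, 1e-12, None))

    # first moment: centroid of the profile -> refraction-angle image
    centroid = np.trapz(stack * angles[:, None, None], angles, axis=0) / m0
    centroid_flat = np.trapz(flat_profile * angles, angles) / m0_flat
    refraction = centroid - centroid_flat

    # second central moment: broadening of the profile -> scattering image
    var = np.trapz(stack * (angles[:, None, None] - centroid) ** 2, angles, axis=0) / m0
    var_flat = np.trapz(flat_profile * (angles - centroid_flat) ** 2, angles) / m0_flat
    scattering = var - var_flat

    return absorption, refraction, scattering
```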

  12. Delay of Treatment Initiation Does Not Adversely Affect Survival Outcome in Breast Cancer.

    Science.gov (United States)

    Yoo, Tae-Kyung; Han, Wonshik; Moon, Hyeong-Gon; Kim, Jisun; Lee, Jun Woo; Kim, Min Kyoon; Lee, Eunshin; Kim, Jongjin; Noh, Dong-Young

    2016-07-01

    Previous studies examining the relationship between time to treatment and survival outcome in breast cancer have shown inconsistent results. The aim of this study was to analyze the overall impact of delay of treatment initiation on patient survival and to determine whether certain subgroups require more prompt initiation of treatment. This study is a retrospective analysis of stage I-III patients who were treated in a single tertiary institution between 2005 and 2008. Kaplan-Meier survival analysis and Cox proportional hazards regression model were used to evaluate the impact of interval between diagnosis and treatment initiation in breast cancer and various subgroups. A total of 1,702 patients were included. Factors associated with longer delay of treatment initiation were diagnosis at another hospital, medical comorbidities, and procedures performed before admission for surgery. An interval between diagnosis and treatment initiation as a continuous variable or with a cutoff value of 15, 30, 45, and 60 days had no impact on disease-free survival (DFS). Subgroup analyses for hormone-responsiveness, triple-negative breast cancer, young age, clinical stage, and type of initial treatment showed no significant association between longer delay of treatment initiation and DFS. Our results show that an interval between diagnosis and treatment initiation of 60 days or shorter does not appear to adversely affect DFS in breast cancer.
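
    The type of analysis described (the diagnosis-to-treatment interval entered into Kaplan-Meier and Cox proportional hazards models for disease-free survival) can be sketched with the lifelines package. The column names and toy data below are hypothetical and serve only to show the shape of such an analysis, not the study's dataset or results.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Hypothetical analysis dataset: one row per patient (toy values, not study data).
df = pd.DataFrame({
    "dfs_months": [60, 24, 48, 36, 55, 12, 70, 30],   # time to recurrence or censoring
    "recurrence": [0, 1, 0, 1, 0, 1, 0, 1],           # event indicator for DFS
    "delay_days": [10, 70, 25, 40, 14, 90, 20, 55],   # diagnosis-to-treatment interval
    "stage":      [1, 3, 2, 2, 1, 3, 1, 2],
})

# Kaplan-Meier curves split at a 30-day delay cutoff (in a real analysis these
# would be plotted and compared with a log-rank test).
km = KaplanMeierFitter()
for late, grp in df.groupby(df["delay_days"] > 30):
    km.fit(grp["dfs_months"], grp["recurrence"], label=f"delay > 30 d: {late}")

# Cox proportional hazards model with the delay entered as a continuous covariate.
cph = CoxPHFitter()
cph.fit(df, duration_col="dfs_months", event_col="recurrence")
cph.print_summary()   # hazard ratio for delay_days tests its association with DFS
```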

  13. The distribution of synonymous codon choice in the translation initiation region of dengue virus.

    Directory of Open Access Journals (Sweden)

    Jian-hua Zhou

    Full Text Available Dengue is the most common arthropod-borne viral (Arboviral illness in humans. The genetic features concerning the codon usage of dengue virus (DENV were analyzed by the relative synonymous codon usage, the effective number of codons and the codon adaptation index. The evolutionary distance between DENV and the natural hosts (Homo sapiens, Pan troglodytes, Aedes albopictus and Aedes aegypti was estimated by a novel formula. Finally, the synonymous codon usage preference for the translation initiation region of this virus was also analyzed. The result indicates that the general trend of the 59 synonymous codon usage of the four genotypes of DENV are similar to each other, and this pattern has no link with the geographic distribution of the virus. The effect of codon usage pattern of Aedes albopictus and Aedes aegypti on the formation of codon usage of DENV is stronger than that of the two primates. Turning to the codon usage preference of the translation initiation region of this virus, some codons pairing to low tRNA copy numbers in the two primates have a stronger tendency to exist in the translation initiation region than those in the open reading frame of DENV. Although DENV, like other RNA viruses, has a high mutation to adapt its hosts, the regulatory features about the synonymous codon usage have been 'branded' on the translation initiation region of this virus in order to hijack the translational mechanisms of the hosts.
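
    Of the three indices named, the relative synonymous codon usage (RSCU) is the simplest to write down: for each codon it is the observed count divided by the count expected if all synonymous codons of that amino acid were used equally. A minimal sketch follows, with the synonymous-family table abbreviated to three amino acids and a hypothetical coding-sequence fragment.

```python
from collections import Counter

# Synonymous codon families for a few amino acids (abbreviated; a full RSCU
# computation would cover all 59 synonymous codons of the standard code).
FAMILIES = {
    "Leu": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "Val": ["GTT", "GTC", "GTA", "GTG"],
    "Lys": ["AAA", "AAG"],
}

def rscu(coding_sequence):
    """Relative synonymous codon usage: observed count / mean count of the family."""
    codons = [coding_sequence[i:i + 3] for i in range(0, len(coding_sequence) - 2, 3)]
    counts = Counter(codons)
    values = {}
    for family in FAMILIES.values():
        total = sum(counts[c] for c in family)
        expected = total / len(family) if total else 0.0
        for c in family:
            values[c] = counts[c] / expected if expected else 0.0
    return values

# Hypothetical fragment of an open reading frame, used only to exercise the function.
print(rscu("CTGCTGGTTAAATTACTGGTGAAG"))
```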

  14. Learning service and development of emotional competencies in initial teacher training

    Directory of Open Access Journals (Sweden)

    Mayka García García

    2017-02-01

    Full Text Available The main objective of this work is to make visible and analyze the emotional development of students in the Early Childhood Education degree at the University of Cádiz who are involved in Service Learning experiences, carried out within a pathway pursuing the curricular institutionalization of this approach. To this end, we used a qualitative methodology in which the personal accounts of students –as tools for gathering information– were analyzed through a predefined category system. The results, illustrated through the students' own voices, show the development of their emotional skills, which allows us to conclude that Service Learning (ApS) scaffolds this dimension in initial teacher training, favoring students' intrapersonal and interpersonal development.

  15. Speed up of XML parsers with PHP language implementation

    Science.gov (United States)

    Georgiev, Bozhidar; Georgieva, Adriana

    2012-11-01

    In this paper, the authors introduce PHP5's XML implementation and show how to read, parse, and write a short and uncomplicated XML file using SimpleXML in a PHP environment. The possibilities for combining the PHP5 language with the XML standard are described. The details of the parsing process with SimpleXML are also explained. A practical project, PHP-XML-MySQL, presents the advantages of XML implementation in PHP modules. This approach allows comparatively simple searching of hierarchical XML data by means of PHP software tools. The proposed project includes a database, which can be extended with new data and new XML parsing functions.

  16. Analyzing octopus movements using three-dimensional reconstruction.

    Science.gov (United States)

    Yekutieli, Yoram; Mitelman, Rea; Hochner, Binyamin; Flash, Tamar

    2007-09-01

    Octopus arms, as well as other muscular hydrostats, are characterized by a very large number of degrees of freedom and a rich motion repertoire. Over the years, several attempts have been made to elucidate the interplay between the biomechanics of these organs and their control systems. Recent developments in electrophysiological recordings from both the arms and brains of behaving octopuses mark significant progress in this direction. The next stage is relating these recordings to the octopus arm movements, which requires an accurate and reliable method of movement description and analysis. Here we describe a semiautomatic computerized system for 3D reconstruction of an octopus arm during motion. It consists of two digital video cameras and a PC computer running custom-made software. The system overcomes the difficulty of extracting the motion of smooth, nonrigid objects in poor viewing conditions. Some of the trouble is explained by the problem of light refraction in recording underwater motion. Here we use both experiments and simulations to analyze the refraction problem and show that accurate reconstruction is possible. We have used this system successfully to reconstruct different types of octopus arm movements, such as reaching and bend initiation movements. Our system is noninvasive and does not require attaching any artificial markers to the octopus arm. It may therefore be of more general use in reconstructing other nonrigid, elongated objects in motion.

  17. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  18. The Evaluation of the Initial Shear Modulus of Selected Cohesive Soils

    Science.gov (United States)

    Gabryś, Katarzyna; Szymański, Alojzy

    2015-06-01

    The paper concerns the evaluation of the initial stiffness of selected cohesive soils based on laboratory tests. The research materials used in this study were clayey soils taken from the area of the road embankment No. WD-18, on the 464th km of the S2 express-way, Konotopa-Airport route, Warsaw. The initial stiffness is represented here by the shear modulus (Gmax) determined during resonant column tests. In the article, a number of empirical formulas from the literature for defining the initial value of the shear modulus of the soils being examined were adopted in order to analyze the data set. However, a large discrepancy between laboratory test results and the values of Gmax calculated from the empirical relationships resulted in the rejection of these proposals. They are inaccurate and do not allow for an exact evaluation of soil stiffness for the selected cohesive soils. Hence, the authors proposed their own empirical formula that enables the evaluation of the test soils' Gmax in an easy and uncomplicated way. This unique formula describes mathematically the effect of certain soil parameters, namely mean effective stress (p') and void ratio (e), on the initial soil stiffness.
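
    The record does not reproduce the authors' fitted formula, but the general shape of such relations is well known: Gmax is expressed as a material constant times a void-ratio function times a power of mean effective stress. The sketch below uses a generic Hardin-type form with placeholder coefficients loosely based on classic sand correlations; it is illustrative only and is not the formula proposed in the paper.

```python
def gmax_hardin_type(p_eff_kpa, e, A=3230.0, n=0.5):
    """Generic Hardin-type small-strain stiffness relation (illustrative only).

    Gmax = A * F(e) * (p')**n   with   F(e) = (2.97 - e)**2 / (1 + e)

    A and n are material constants that have to be fitted to the test data; the
    defaults are placeholder values loosely based on classic sand correlations,
    not the coefficients proposed in the paper.
    """
    f_e = (2.97 - e) ** 2 / (1.0 + e)
    return A * f_e * p_eff_kpa ** n   # Gmax in kPa for p' in kPa

# Example: mean effective stress p' = 100 kPa, void ratio e = 0.65
print(gmax_hardin_type(100.0, 0.65))
```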

  19. Machine Translation Using Constraint-Based Synchronous Grammar

    Institute of Scientific and Technical Information of China (English)

    WONG Fai; DONG Mingchui; HU Dongcheng

    2006-01-01

    A synchronous grammar based on the formalism of context-free grammar was developed by generalizing the first component of the production that models the source text. Unlike other synchronous grammars, the grammar allows multiple target productions to be associated with a single production rule, which can be used to guide a parser to infer different possible translational equivalences for a recognized input string according to the feature constraints of the symbols in the pattern. An extended generalized LR algorithm was adapted to the parsing of the proposed formalism to analyze the syntactic structure of a language. The grammar was used as the basis for building a machine translation system for Portuguese-to-Chinese translation. The empirical results show that the grammar is more expressive when modeling the translational equivalences of parallel texts for machine translation and grammar rewriting applications.
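
    The central idea summarized above, a single source-side production associated with several candidate target productions that are selected by feature constraints on the pattern symbols, can be sketched as a small data structure. This is an illustration of the formalism as described, not the system's actual implementation; all names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class TargetProduction:
    rhs: list                                          # target-side symbols, indexed to source symbols
    constraints: dict = field(default_factory=dict)    # feature constraints that must hold

@dataclass
class SyncProduction:
    lhs: str                                           # source-side nonterminal
    source_rhs: list                                   # source-side pattern
    targets: list = field(default_factory=list)        # several candidate target productions

# One source rule with two translation alternatives, chosen by a feature of NP#1.
rule = SyncProduction(
    lhs="S",
    source_rhs=["NP#1", "VP#2"],
    targets=[
        TargetProduction(rhs=["NP#1", "VP#2"], constraints={"NP#1.animate": True}),
        TargetProduction(rhs=["VP#2", "NP#1"], constraints={"NP#1.animate": False}),
    ],
)

def compatible_targets(rule, features):
    """Return the target productions whose constraints are satisfied by the parsed features."""
    return [t for t in rule.targets
            if all(features.get(k) == v for k, v in t.constraints.items())]

print(compatible_targets(rule, {"NP#1.animate": True}))
```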

  20. Proceedings of the of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011)

    DEFF Research Database (Denmark)

    This volume contains the proceedings of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011), held in Saarbrücken, Germany on March 26 & 27, 2011. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) and organized...... in cooperation with ACM SIGPLAN. LDTA is an application and tool-oriented workshop focused on grammarware---software based on grammars in some form. Grammarware applications are typically language processing applications and traditional examples include parsers, program analyzers, optimizers and translators......, as well as techniques and tools, to the test in a new way in the form of the LDTA Tool Challenge. Tool developers were invited to participate in the Challenge by developing solutions to a range of language processing tasks over a simple but evolving set of imperative programming languages. Tool challenge...

  1. B-physics performance with Initial and Complete Inner detector layouts in Data Challenge-1

    CERN Document Server

    Benekos, N C; Bouhova-Thacker, E; Epp, B; Ghete, V M; Jones, R; Kartvelishvili, V G; Lagouri, T; Laporte, J F; Nairz, A; Nikitine, N; Reznicek, P; Sivoklokov, S Yu; Smizanska, M; Testa, M; Toms, K

    2004-01-01

    The B-physics performance for the Initial and the Complete Inner Detector layouts is presented. Selected types of B-physics events were simulated, reconstructed and analyzed using the software tools of ATLAS Data Challenge-1 (DC1). The results were compared to those obtained with an older ATLAS detector design, the so-called TDR layout. Within the limitations of the DC1 software tools, an attempt was made to evaluate the performance loss due to missing detector parts in the Initial layout in comparison with the Complete detector.

  2. Zeroing in on methicillin-resistant Staphylococcus aureus: US Department of Veterans Affairs' MRSA Prevention Initiative.

    Science.gov (United States)

    Kralovic, Stephen M; Evans, Martin E; Simbartl, Loretta A; Ambrose, Meredith; Jain, Rajiv; Roselle, Gary A

    2013-05-01

    Implementation of a methicillin-resistant Staphylococcus aureus (MRSA) Prevention Initiative within US Department of Veterans Affairs medical facilities was associated with a significant reduction in MRSA health care-associated infection (HAI) rates nationwide. The first 36 months of data from the Initiative were analyzed to determine how many facilities reported zero MRSA HAIs each month. From October 2007 through September 2010, there was a 37.6% increase nationwide in the number of facilities achieving zero MRSA HAIs each month. Published by Mosby, Inc.

  3. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  4. Initiating Event Rates at U.S. Nuclear Power Plants. 1988 - 2013

    International Nuclear Information System (INIS)

    Schroeder, John A.; Bower, Gordon R.

    2014-01-01

    Analyzing initiating event rates is important because it indicates performance among plants and also provides inputs to several U.S. Nuclear Regulatory Commission (NRC) risk-informed regulatory activities. This report presents an analysis of initiating event frequencies at U.S. commercial nuclear power plants since each plant's low-power license date. The evaluation is based on the operating experience from fiscal year 1988 through 2013 as reported in licensee event reports. Staff engineers with nuclear power plant experience reviewed each event report issued since the last update to this report for the presence of valid scrams or reactor trips at power. To be included in the study, an event had to meet all of the following criteria: it includes an unplanned reactor trip (not a scheduled reactor trip on the daily operations schedule), the sequence of events starts when the reactor is critical and at or above the point of adding heat, it occurs at a U.S. commercial nuclear power plant (excluding Fort St. Vrain and LaCrosse), and it is reported by a licensee event report. This report displays occurrence rates (baseline frequencies) for the categories of initiating events that contribute to the NRC's Industry Trends Program. Sixteen initiating event groupings are trended and displayed. Initiators are plotted separately for initiating events with different occurrence rates for boiling water reactors and pressurized water reactors. p-values are given for the possible presence of a trend over the most recent 10 years.
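
    As a small illustration of how such occurrence rates and their uncertainty are commonly expressed (events per unit of reactor operating time, with an exact Poisson confidence interval), the sketch below uses the standard chi-square interval. It is a generic frequentist calculation with made-up numbers, not the report's actual estimation or trending procedure.

```python
from scipy.stats import chi2

def poisson_rate_interval(n_events, exposure_years, conf=0.90):
    """Point estimate and two-sided confidence interval for a Poisson occurrence rate."""
    alpha = 1.0 - conf
    rate = n_events / exposure_years
    lower = 0.5 * chi2.ppf(alpha / 2, 2 * n_events) / exposure_years if n_events > 0 else 0.0
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / exposure_years
    return rate, lower, upper

# Hypothetical numbers: 12 unplanned scrams observed over 800 reactor critical years.
print(poisson_rate_interval(12, 800.0))
```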

  5. Low molecular weight heparin versus unfractionated heparin in the initial treatment of venous thromboembolism

    NARCIS (Netherlands)

    Hettiarachchi, R. J.; Prins, M. H.; Lensing, A. W.; Buller, H. R.

    1998-01-01

    In this review, we analyze data from randomized trials in which low molecular weight heparin was compared with unfractionated heparin, both to estimate the treatment effect of low molecular weight heparin in the initial treatment of venous thromboembolism and to evaluate the effect of the varied

  6. Distorted Pattern Recognition and Analysis with the Help of IEf Graph Representation

    Directory of Open Access Journals (Sweden)

    Adam Sedziwy

    2002-01-01

    Full Text Available An algorithm for distorted pattern recognition is presented. It is a generalization of M. Flasinski's results (Pattern Recognition, 27, 1-16, 1992). The new formalism allows both qualitative and quantitative distortion analysis. It also enlarges parser flexibility by extending the set of patterns which may be recognized.

  7. The Case of the Khoekhoegowab Dictionary

    African Journals Online (AJOL)

    Lexicographic databases are attractive to builders of computer applications of various kinds. The NDP5 database is currently used by a postgraduate student in South Africa to extract morphological data for a Master's thesis on the development of a morphological parser for Khoekhoegowab. A Khoekhoe spell checker is ...

  8. Improving Precision of Generated ASTs

    DEFF Research Database (Denmark)

    Winther, Johnni

    The parser-generator is an essential tool in grammarware and its output, the parse tree in form of the concrete or abstract syntax tree, often forms the basis for the whole structure of the grammarware application. Several tools for Java encode the parse tree in a class hierarchy generated to model...

  9. Author Details

    African Journals Online (AJOL)

    Constructing a Parser for a given Deterministic Syntax Graph: A Procedural Approach Abstract · Vol 5, No 3 (2009) - Articles Removing the Restrictions Imposed on Finite State Machines-Employing the Push Down Machine Abstract · Vol 6, No 4 (2010) - Articles Design, development and implementation of a simple program ...

  10. Multimedia CALLware: The Developer's Responsibility.

    Science.gov (United States)

    Dodigovic, Marina

    The early computer-assisted-language-learning (CALL) programs were silent and mostly limited to screen or printer supported written text as the prevailing communication resource. The advent of powerful graphics, sound and video combined with AI-based parsers and sound recognition devices gradually turned the computer into a rather anthropomorphic…

  11. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  12. Automation and robotics for the Space Exploration Initiative: Results from Project Outreach

    Science.gov (United States)

    Gonzales, D.; Criswell, D.; Heer, E.

    1991-01-01

    A total of 52 submissions were received in the Automation and Robotics (A&R) area during Project Outreach. About half of the submissions (24) contained concepts that were judged to have high utility for the Space Exploration Initiative (SEI) and were analyzed further by the robotics panel. These 24 submissions are analyzed here. Three types of robots were proposed in the high scoring submissions: structured task robots (STRs), teleoperated robots (TORs), and surface exploration robots. Several advanced TOR control interface technologies were proposed in the submissions. Many A&R concepts or potential standards were presented or alluded to by the submitters, but few specific technologies or systems were suggested.

  13. Love Me Tinder: Online Identity Performance and Romantic Relationship Initiation

    OpenAIRE

    Villani, Anna Marie; Hvass, Charlotte Colstrup; Lund-Larsen, Ida; Rørhøj, Jacob Mark; Vintersborg, Kathrine Mosbæk; Hansen, Mathias Constant Bek; Bengtsson, Teresa Imaya

    2015-01-01

    Our project explores Tinder – a mobile dating application we view as a product of our time, a modern phenomenon. Specifically, we look at online identity and romantic relationship initiation, taking into account Tinder’s evolution as an application, the users, and subjectivities. Through semiotics and image rhetoric, we analyze the taglines and photographs users post in their profiles. We relate this to performance theory, viewing Tinder as a virtual stage. We delve deeper into Tinder, honing...

  14. Initial growth of Schizolobium parahybae in Brazilian Cerrado soil under liming and mineral fertilization

    Directory of Open Access Journals (Sweden)

    Ademilson Coneglian

    Full Text Available ABSTRACT High prices and the scarcity of hardwoods require the use of alternative wood sources, such as the Guapuruvu (Schizolobium parahybae, an arboreal species native to the Atlantic Forest, which has fast growth and high market potential. However, there is no information on its cultivation in the Brazilian Cerrado. Thus, this study aimed to analyze the contribution of mineral fertilization and liming in a Cerrado soil on the initial growth of Schizolobium parahybae. The experiment was set in a randomized block design, with 4 treatments (Cerrado soil; soil + liming; soil + fertilizer; and soil + fertilizer + liming and 15 replicates. The following variables were analyzed: plant height, stem diameter, number of leaves, total, shoot, leaf, root and stem dry matter, and root/shoot ratio. The obtained data were subjected to the analysis of variance, Tukey test and regression analysis. During the initial growth, Schizolobium parahybae can be cultivated in a Brazilian Cerrado soil only under mineral fertilization, with no need for soil liming.

  15. Analyzing medical costs with time-dependent treatment: The nested g-formula.

    Science.gov (United States)

    Spieker, Andrew; Roy, Jason; Mitra, Nandita

    2018-04-16

    As medical expenses continue to rise, methods to properly analyze cost outcomes are becoming of increasing relevance when seeking to compare average costs across treatments. Inverse probability weighted regression models have been developed to address the challenge of cost censoring in order to identify intent-to-treat effects (i.e., to compare mean costs between groups on the basis of their initial treatment assignment, irrespective of any subsequent changes to their treatment status). In this paper, we describe a nested g-computation procedure that can be used to compare mean costs between two or more time-varying treatment regimes. We highlight the relative advantages and limitations of this approach when compared with existing regression-based models. We illustrate the utility of this approach as a means to inform public policy by applying it to a simulated data example motivated by costs associated with cancer treatments. Simulations confirm that inference regarding intent-to-treat effects versus the joint causal effects estimated by the nested g-formula can lead to markedly different conclusions regarding differential costs. Therefore, it is essential to prespecify the desired target of inference when choosing between these two frameworks. The nested g-formula should be considered as a useful, complementary tool to existing methods when analyzing cost outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
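
    A toy two-period version of the parametric g-computation idea described above can be sketched as follows: model the intermediate covariate given the past, model the cumulative cost given the full history, then standardize over the covariate distribution under a fixed treatment regime. The variable names, models, and simulated data are hypothetical; with linear models the plug-in mean of the intermediate covariate suffices for the standardized mean.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 5000

# Simulated history: baseline covariate L1, first treatment A1, intermediate
# covariate L2 (affected by A1), second treatment A2, and cumulative cost Y.
L1 = rng.normal(size=n)
A1 = rng.binomial(1, 0.5, size=n)
L2 = 0.5 * L1 + 0.8 * A1 + rng.normal(size=n)
A2 = rng.binomial(1, 1.0 / (1.0 + np.exp(-L2)))
Y = 10 + 2 * L1 + 3 * A1 + 1.5 * L2 + 4 * A2 + rng.normal(size=n)

# Step 1: model the intermediate covariate given the past.
m_L2 = LinearRegression().fit(np.column_stack([L1, A1]), L2)
# Step 2: model the outcome (cost) given the full treatment/covariate history.
m_Y = LinearRegression().fit(np.column_stack([L1, A1, L2, A2]), Y)

def g_formula_mean_cost(a1, a2):
    """Standardized mean cost under the static regime (A1=a1, A2=a2)."""
    a1v = np.full(n, float(a1))
    # With linear models the predicted mean of L2 can stand in for draws from
    # its distribution; a nonlinear model would require simulating L2.
    L2_hat = m_L2.predict(np.column_stack([L1, a1v]))
    a2v = np.full(n, float(a2))
    return m_Y.predict(np.column_stack([L1, a1v, L2_hat, a2v])).mean()

print("always treat:", g_formula_mean_cost(1, 1))
print("never treat :", g_formula_mean_cost(0, 0))
```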

  16. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal within nearly 22 ns time intervals, records the result in a memory unit, and then slowly reads the information out to the computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude, rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs

  17. Alcohol industry corporate social responsibility initiatives and harmful drinking: a systematic review.

    Science.gov (United States)

    Mialon, Melissa; McCambridge, Jim

    2018-04-25

    There is growing awareness of the detrimental effects of alcohol industry commercial activities, and concern about possible adverse impacts of its corporate social responsibility (CSR) initiatives, on public health. The aims of this systematic review were to summarize and examine what is known about CSR initiatives undertaken by alcohol industry actors in respect of harmful drinking globally. We searched for peer-reviewed studies published since 1980 of alcohol industry CSR initiatives in seven electronic databases. The basic search strategy was organized around the three constructs of 'alcohol', 'industry' and 'corporate social responsibility'. We performed the searches on 21 July 2017. Data from included studies were analyzed inductively, according to the extent to which they addressed specified research objectives. A total of 21 studies were included. We identified five types of CSR initiatives relevant to the reduction of harmful drinking: alcohol information and education provision; drink driving prevention; research involvement; policy involvement and the creation of social aspects organizations. Individual companies appear to undertake different CSR initiatives than do industry-funded social aspects organizations. There is no robust evidence that alcohol industry CSR initiatives reduce harmful drinking. There is good evidence, however, that CSR initiatives are used to influence the framing of the nature of alcohol-related issues in line with industry interests. This research literature is at an early stage of development. Alcohol policy measures to reduce harmful drinking are needed, and the alcohol industry CSR initiatives studied so far do not contribute to the attainment of this goal.

  18. Prognostic factors of a good response to initial therapy in children and adolescents with differentiated thyroid cancer

    Directory of Open Access Journals (Sweden)

    Fernanda Vaisman

    2011-01-01

    Full Text Available BACKGROUND: Therapeutic approaches in pediatric populations are based on adult data because there is a lack of appropriate data for children. Consequently, there are many controversies regarding the proper treatment of pediatric patients. OBJECTIVE: The present study was designed to evaluate patients with differentiated thyroid carcinoma diagnosed before 20 years of age and to determine the factors associated with the response to the initial therapy. METHODS: Sixty-five patients, treated in two tertiary-care referral centers in Rio de Janeiro between 1980 and 2005, were evaluated. Information about clinical presentation and the response to initial treatment was analyzed, and patients had their risk stratified according to the Tumor-Node-Metastasis; Age-Metastasis-Extracapsular-Size; distant Metastasis-Age-Completeness of primary tumor resection-local Invasion-Size; and American Thyroid Association classifications. RESULTS: Patients' ages ranged from 4 to 20 years (median 14). The mean follow-up was 12.6 years. Lymph node metastasis was found in 61.5% and indicated a poor response to initial therapy, with a significant impact on the time to achieving disease-free status (p = 0.014 for response to initial therapy and p < 0.0001 for disease-free status in follow-up). Distant metastasis was a predictor of a poor response to initial therapy in these patients (p = 0.014). The risk stratification systems we analyzed were useful for high-risk patients because they had a high sensitivity and negative predictive value in determining the response to initial therapy. CONCLUSIONS: Metastases, both lymph nodal and distant, are important predictors of the persistence of disease after initial therapy in children and adolescents with differentiated thyroid cancer.

  19. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without coming up against the problems of antigen-antibody complex and free antigen separation. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific of the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a capacity between the bottom of the tube and its lower end. The component has a large developed surface and is so shaped that it allows the solution to be analyzed to have access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific of the biological compound [fr

  20. Renin-angiotensin-aldosterone system inhibitors lower hemoglobin and hematocrit only in renal transplant recipients with initially higher levels.

    Science.gov (United States)

    Mikolasevic, I; Zaputovic, L; Zibar, L; Begic, I; Zutelija, M; Klanac, A; Majurec, I; Simundic, T; Minazek, M; Orlic, L

    2016-04-01

    We have analyzed the effects of renin-angiotensin-aldosterone system (RAAS) inhibitors on the evolution of hemoglobin (Hb) and hematocrit (Htc) levels, as well as on the evaluation of kidney graft function, in stable renal transplant recipients (RTRs) with respect to initially higher or lower Hb and Htc values. The study group comprised 270 RTRs with stable graft function. Besides other prescribed antihypertensive therapy, 169 of them had been taking RAAS inhibitors. We wanted to analyze the effect of the use of RAAS inhibitors on Hb and Htc in patients with initially higher or lower Hb/Htc values. For this analysis, only RTRs that were taking RAAS inhibitors were stratified into two groups: one with higher Hb and Htc (initial Hb≥150g/L and Htc≥45%) and another one with lower Hb and Htc (initial Hb<150g/L and Htc<45%) values. Thirty-four RTRs with initially higher Hb and 41 RTRs with initially higher Htc had a statistically significant decrease in Hb (p=0.006) and Htc (p<0.0001) levels after 12 months of follow-up. In the group of patients with initially lower Hb (135 RTRs) and Htc (128 RTRs) there was a significant increase in Hb (p=0.0001) and Htc (p=0.004) levels through the observed period. The use of RAAS inhibitors has been associated with a trend toward slowing renal insufficiency in RTRs (p=0.03). RAAS inhibitors lower Hb and Htc only in RTRs with initially higher levels. In patients with initially lower Hb and Htc levels, the use of these drugs is followed by a beneficial impact on erythropoiesis and kidney graft function. Copyright © 2015 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  1. The successful implementation of STEM initiatives in lower income schools

    Science.gov (United States)

    Bakshi, Leena

    The purpose of this study was to examine the leadership strategies utilized by superintendents, district administrators and school principals and the impact of these identified strategies on implementing STEM initiatives specifically for lower-income students. This study set out to determine (a) What role does district leadership play in the implementation of STEM initiatives in lower income secondary schools; (b) What internal systems of accountability exist in successful lower income secondary schools' STEM programs; (c) What leadership strategies are used to implement STEM curriculum initiatives; (d) How do school and district leadership support staff in order to achieve student engagement in STEM Initiative curriculum. This study used a mixed-methods approach to determine the impact of leadership strategies utilized by superintendents, district administrators and school principals on implementing STEM initiatives. Quantitative data analyzed survey questionnaires to determine the degree of correlation between the school districts that have demonstrated the successful implementation of STEM initiatives at the school and district levels. Qualitative data was collected using highly structured participant interviews and purposeful sampling of four district superintendents, one district-level administrator and five school leaders to capture the key strategies in implementing STEM initiatives in lower income secondary schools. Through the process of triangulation, the results of the study revealed that superintendents and principals should consider the characteristics of effective STEM initiatives that have shown a considerable degree of correlation with positive outcomes for lower income students. These included the leadership strategies of personnel's making decisions about the district's and school's instructional direction and an emphasis on the conceptual development of scientific principles using the Next Generation Science Standards coupled with the Common Core

  2. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available with comparable low costs right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, no control room equipment; and (5) coupling of computers by telecommunication via telephone

  3. Mexican-American mothers’ initiation and understanding of home oral hygiene for young children

    Science.gov (United States)

    HOEFT, Kristin S.; BARKER, Judith C.; MASTERSON, Erin E.

    2012-01-01

    Purpose To investigate caregiver beliefs and behaviors as key issues in the initiation of home oral hygiene routines. Oral hygiene helps reduce the prevalence of early childhood caries, which is disproportionately high among Mexican-American children. Methods Interviews were conducted with a convenience sample of 48 Mexican-American mothers of young children in a low income, urban neighborhood. Interviews were digitally recorded, translated, transcribed, coded and analyzed using standard qualitative procedures. Results The average age of tooth brushing initiation was 1.8±0.8 years; only a small proportion of parents (13%) initiated oral hygiene in accord with American Dental Association (ADA) recommendations. Mothers initiated 2 forms of oral hygiene: infant oral hygiene and regular tooth brushing. For the 48% of children who participated in infant oral hygiene, mothers were prompted by pediatrician and social service (WIC) professionals. For regular tooth brushing initiation, a set of maternal beliefs exist about when this oral hygiene practice becomes necessary for children. Beliefs are mainly based on a child’s dental maturity, interest, capacity and age/size. Conclusions Most (87%) of the urban Mexican-American mothers in the study do not initiate oral hygiene practices in compliance with ADA recommendations. These findings have implications for educational messages. PMID:19947134

  4. Reversibility and irreversibility from an initial value formulation

    International Nuclear Information System (INIS)

    Muriel, A.

    2013-01-01

    From a time evolution equation for the single particle distribution function derived from the N-particle distribution function (A. Muriel, M. Dresden, Physica D 101 (1997) 297), an exact solution for the 3D Navier–Stokes equation – an old problem – has been found (A. Muriel, Results Phys. 1 (2011) 2). In this Letter, a second exact conclusion from the above-mentioned work is presented. We analyze the time symmetry properties of a formal, exact solution for the single-particle distribution function contracted from the many-body Liouville equation. This analysis must be done because group theoretic results on time reversal symmetry of the full Liouville equation (E.C.G. Sudarshan, N. Mukunda, Classical Mechanics: A Modern Perspective, Wiley, 1974) no longer apply automatically to the single particle distribution function contracted from the formal solution of the N-body Liouville equation. We find the following result: if the initial momentum distribution is even in the momentum, the single particle distribution is reversible. If there is any asymmetry in the initial momentum distribution, no matter how small, the system is irreversible.

  5. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  6. Unilateral initiatives

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    This paper reports on arms control which is generally thought of in terms of formal negotiations with an opponent, with the resulting agreements embodied in a treaty. This is not surprising, since arms control discussions between opponents are both important and politically visible. There are, however, strong reasons for countries to consider and frequently take unilateral initiatives. To do so is entirely consistent with the established major precepts of arms control which state that arms control is designed to reduce the risk of war, the costs of preparing for war, and the death and destruction if war should come. Unilateral initiatives on what weapons are purchased, which ones are eliminated and how forces are deployed can all relate to these objectives. There are two main categories of motives for unilateral initiatives in arms control. In one category, internal national objectives are the dominant, often sole, driving force; the initiative is undertaken for our own good

  7. Debris flow-induced topographic changes: effects of recurrent debris flow initiation.

    Science.gov (United States)

    Chen, Chien-Yuan; Wang, Qun

    2017-08-12

    Chushui Creek in Shengmu Village, Nantou County, Taiwan, was analyzed for recurrent debris flow using numerical modeling and geographic information system (GIS) spatial analysis. The two-dimensional water flood and mudflow simulation program FLO-2D was used to simulate debris flows induced by rainfall during Typhoon Herb in 1996 and Typhoon Mindulle in 2004. Changes in topographic characteristics after the debris flows were simulated in terms of initiation hydrological characteristics, magnitude, and affected area. Changes in topographic characteristics included those in elevation, slope, aspect, stream power index (SPI), topographic wetness index (TWI), and hypsometric curve integral (HI), all of which were analyzed using GIS spatial analysis. The results show that the SPI and peak discharge in the basin increased after a recurrence of debris flow. The TWI was higher in 2003 than in 2004 and indicated higher potential of landslide initiation when the slope of the basin was steeper. The HI revealed that the basin was in its mature stage and was shifting toward the old stage. Numerical simulation demonstrated that the parameters mean depth, maximum depth, affected area, mean flow rate, maximum flow rate, and peak flow discharge were increased after recurrent debris flow, and peak discharge occurred quickly.
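
    The two terrain indices named, SPI and TWI, have standard raster definitions in terms of specific catchment area and local slope; a minimal sketch under those standard definitions is given below. It is not the study's actual GIS workflow, and the grid values are synthetic.

```python
import numpy as np

def twi_spi(flow_acc, slope_deg, cell_size=10.0):
    """Topographic wetness index and stream power index from raster grids.

    flow_acc  : flow accumulation (number of upslope cells)
    slope_deg : local slope in degrees
    Standard forms: TWI = ln(a / tan b), SPI = a * tan b,
    with a = specific catchment area and b = slope angle.
    """
    a = (flow_acc + 1.0) * cell_size            # specific catchment area per unit contour width
    tan_b = np.tan(np.radians(slope_deg))
    tan_b = np.clip(tan_b, 1e-6, None)          # avoid division by zero on flat cells
    twi = np.log(a / tan_b)
    spi = a * tan_b
    return twi, spi

# Tiny synthetic example
acc = np.array([[0.0, 3.0], [10.0, 50.0]])
slp = np.array([[2.0, 5.0], [10.0, 20.0]])
print(twi_spi(acc, slp))
```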

  8. Tall Buildings Initiative

    Science.gov (United States)

    2017 TBI Guidelines Version 2.03 Now Available. PEER has an initiative to develop design criteria that will ensure safe and usable tall buildings following future earthquakes. Download the primary product of this initiative: Guidelines for Performance-Based Seismic Design.

  9. The design and implementation of an adequate recovery ...

    African Journals Online (AJOL)

    This study considered what happens when the compiler encounters an ill-formed construct. Ordinarily the parser is supposed to issue an error message that could terminate the process of scanning the sentence. However, if this happens nothing has been achieved. My motivation is to have a compiler that would issue an ...

  10. AutoMap User’s Guide 2012

    Science.gov (United States)

    2012-06-11

    Excerpts from the guide's front matter and license table: Fonts; Java Licenses. Fonts resource: http://salrc.uchicago.edu/resources/fonts/available/urdu/ (23 OCT 09). The Java Licenses table contains entries such as hibernate-core-3.3.2.GA.jar (Hibernate Core, LGPL v2.1) and htmlparser.jar (HTML Parser, http://htmlparser.sourceforge.net/, Common Public License 1.0).

  11. IRIG 106 Chapter 10 Programmers Handbook

    Science.gov (United States)

    2016-08-16

    ...szDataItem. Specific parsers are called for specific attribute types, indicated by the first letter of the code name. After all TMATS attributes are... EnI106Status I106_CALL_DECL enI106Ch10Close(int iHandle) { // If handles have been init'ed then bail if...

  12. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    Science.gov (United States)

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-04-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
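
    As a generic reminder of what the CRLB is (not the paper's ABI-specific derivation): for independent Poisson-distributed measurements whose means depend on a scalar parameter, the Fisher information sums the squared derivative of each mean divided by that mean, and the CRLB is its inverse. The sketch below uses a made-up one-parameter rocking-curve-like model purely to exercise the formula.

```python
import numpy as np

def crlb_poisson(theta, mean_fn, d_mean_fn, n_points):
    """Cramer-Rao lower bound for one scalar parameter with independent Poisson data.

    mean_fn(theta, i)   : expected count at measurement i
    d_mean_fn(theta, i) : derivative of that mean with respect to theta
    """
    fisher = sum(d_mean_fn(theta, i) ** 2 / mean_fn(theta, i) for i in range(n_points))
    return 1.0 / fisher

# Hypothetical one-parameter model: counts follow a Gaussian-shaped rocking curve
# of the analyzer position, and theta shifts its centre (a refraction-like parameter).
angles = np.linspace(-3.0, 3.0, 11)     # eleven analyzer angular positions
flux = 1.0e4                            # mean photons per measurement

def mean(theta, i):
    return flux * np.exp(-0.5 * (angles[i] - theta) ** 2)

def d_mean(theta, i):
    return (angles[i] - theta) * mean(theta, i)

print(crlb_poisson(0.1, mean, d_mean, len(angles)))   # lowest achievable variance of the estimate
```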

  13. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques

  14. 76 FR 37781 - Initiation of Antidumping and Countervailing Duty Administrative Reviews and Request for...

    Science.gov (United States)

    2011-06-28

    .... Caterpillar Logistics Services China Ltd. Caterpillar Mexico, S.A. de C.V. Glory Ltd. Hagglunds Ltd. Hino... this notice of initiation had no exports, sales, or entries during the period of review (``POR''), it... government control of its export activities to be entitled to a separate rate, the Department analyzes each...

  15. Analysis of the temporal program of replication initiation in yeast chromosomes.

    Science.gov (United States)

    Friedman, K L; Raghuraman, M K; Fangman, W L; Brewer, B J

    1995-01-01

    The multiple origins of eukaryotic chromosomes vary in the time of their initiation during S phase. In the chromosomes of Saccharomyces cerevisiae the presence of a functional telomere causes nearby origins to delay initiation until the second half of S phase. The key feature of telomeres that causes the replication delay is the telomeric sequence (C(1-3)A/G(1-3)T) itself and not the proximity of the origin to a DNA end. A second group of late replicating origins has been found at an internal position on chromosome XIV. Four origins, spanning approximately 140 kb, initiate replication in the second half of S phase. At least two of these internal origins maintain their late replication time on circular plasmids. Each of these origins can be separated into two functional elements: those sequences that provide origin function and those that impose late activation. Because the assay for determining replication time is costly and laborious, it has not been possible to analyze in detail these 'late' elements. We report here the development of two new assays for determining replication time. The first exploits the expression of the Escherichia coli dam methylase in yeast and the characteristic period of hemimethylation that transiently follows the passage of a replication fork. The second uses quantitative hybridization to detect two-fold differences in the amount of specific restriction fragments as a function of progress through S phase. The novel aspect of this assay is the creation in vivo of a non-replicating DNA sequence by site-specific pop-out recombination. This non-replicating fragment acts as an internal control for copy number within and between samples. Both of these techniques are rapid and much less costly than the more conventional density transfer experiments that require CsCl gradients to detect replicated DNA. With these techniques it should be possible to identify the sequences responsible for late initiation, to search for other late replicating
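
    The copy-number logic of the quantitative hybridization assay can be written down in a few lines: after normalizing a fragment's signal to the engineered non-replicating internal control, the ratio runs from about 1 (unreplicated) to about 2 (fully replicated), and the time at which it crosses 1.5 is a natural summary of replication time. The sketch below uses synthetic numbers, not data from the study.

```python
import numpy as np

def replication_time(times, fragment_signal, control_signal):
    """Estimate when a restriction fragment replicates during S phase.

    Signals are hybridization intensities; the non-replicating control corrects
    for loading, so the normalized ratio runs from ~1 (unreplicated) to ~2 (replicated).
    The replication time is taken as the crossing of 1.5 (requires a monotone rise).
    """
    ratio = np.asarray(fragment_signal, float) / np.asarray(control_signal, float)
    ratio = ratio / ratio[0]                      # anchor the pre-S-phase sample at 1.0
    return float(np.interp(1.5, ratio, times))    # linear interpolation to the 1.5 crossing

# Synthetic time course (minutes into S phase) for an early-firing origin fragment.
t = np.array([0, 10, 20, 30, 40, 50])
frag = np.array([100, 118, 150, 185, 198, 200])
ctrl = np.array([100, 100, 100, 100, 100, 100])
print(replication_time(t, frag, ctrl))
```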

  16. Averse to Initiative: Risk Management’s Effect on Mission Command

    Science.gov (United States)

    2017-05-25

    Master's thesis covering JUN 2016 – MAY 2017. ...decision weights and potential implications of educating leaders with Prospect Theory. Prospect Theory sheds light on how to look at loss and opportunity... the gain – causing risk aversion. Leaders educated in how to analyze a problem using Prospect Theory can rationally approach gains and losses. For the...

  17. Online communication predicts Belgian adolescents' initiation of romantic and sexual activity.

    Science.gov (United States)

    Vandenbosch, Laura; Beyens, Ine; Vangeel, Laurens; Eggermont, Steven

    2016-04-01

    Online communication is associated with offline romantic and sexual activity among college students. Yet, it is unknown whether online communication is associated with the initiation of romantic and sexual activity among adolescents. This two-wave panel study investigated whether chatting, visiting dating websites, and visiting erotic contact websites predicted adolescents' initiation of romantic and sexual activity. We analyzed two-wave panel data from 1163 Belgian adolescents who participated in the MORES Study. We investigated the longitudinal impact of online communication on the initiation of romantic relationships and sexual intercourse using logistic regression analyses. The odds ratios of initiating a romantic relationship among romantically inexperienced adolescents who frequently used chat rooms, dating websites, or erotic contact websites were two to three times larger than those of non-users. Among sexually inexperienced adolescents who frequently used chat rooms, dating websites, or erotic contact websites, the odds ratios of initiating sexual intercourse were two to five times larger than that among non-users, even after a number of other relevant factors were introduced. This is the first study to demonstrate that online communication predicts the initiation of offline sexual and romantic activity as early as adolescence. Practitioners and parents need to consider the role of online communication in adolescents' developing sexuality. • Adolescents increasingly communicate online with peers. • Online communication predicts romantic and sexual activity among college students. What is New: • Online communication predicts adolescents' offline romantic activity over time. • Online communication predicts adolescents' offline sexual activity over time.

  18. Optimal initial fuel distribution in a thermal reactor for maximum energy production

    International Nuclear Information System (INIS)

    Moran-Lopez, J.M.

    1983-01-01

    Using the fuel burnup as the objective function, the aim is to determine the initial distribution of fuel in a reactor that yields the maximum possible energy. To this end, keeping the initial fuel mass fixed, the results for different initial fuel and control-poison configurations are analyzed and the corresponding running times compared. One-dimensional, two-energy-group theory is applied to a reflected cylindrical reactor using U-235 as fuel and light water as moderator and reflector. Fissions in both the fast and thermal groups are considered. The reactor is divided into several annular regions, and the constant-flux approximation in each depletion step is then used to solve the fuel and fission-product poison differential equations in each region. The computer code OPTIME was developed to determine the time variation of core properties during the fuel cycle. At each depletion step, OPTIME calls ODMUG [12], a criticality search program, from which the spatially averaged neutron fluxes and control poison cross sections are obtained.
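
    The constant-flux approximation mentioned above is what makes each depletion step tractable: with the flux held fixed over the step, the fuel burnup equation dN/dt = -sigma_a * phi * N has a simple exponential solution in each region. A minimal sketch of that single step follows; the cross section, fluxes, and step length are placeholder values, and the actual OPTIME/ODMUG coupling is not reproduced.

```python
import numpy as np

def deplete_step(n_u235, sigma_a, phi, dt):
    """One constant-flux depletion step: dN/dt = -sigma_a * phi * N."""
    return n_u235 * np.exp(-sigma_a * phi * dt)

# Placeholder data: U-235 number density per annular region (atoms/cm^3),
# thermal absorption cross section (cm^2), region-averaged flux (n/cm^2/s).
regions = np.array([5e20, 4e20, 3e20])
sigma_a = 600e-24
flux = np.array([2e13, 3e13, 4e13])       # would come from the criticality search
dt = 30 * 24 * 3600                        # 30-day depletion step, in seconds

regions_next = deplete_step(regions, sigma_a, flux, dt)
print(regions_next / regions)              # fraction of fuel remaining per region
```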

  19. Aircraft accident investigation: the decision-making in initial action scenario.

    Science.gov (United States)

    Barreto, Marcia M; Ribeiro, Selma L O

    2012-01-01

    In the complex aeronautical environment, efforts in terms of operational safety involve the adoption of both proactive and reactive measures. The investigation process begins right after the occurrence of an aeronautical accident, through the initial action. Thus, it is in this crisis scenario that the person responsible for the initial action makes decisions and gathers the information necessary for the subsequent phases of the investigation process. Within this scenario, which is a natural environment, research has shown the fragility of rational models of decision making. The theoretical perspective of naturalistic decision making constitutes a breakthrough in the understanding of the decision problems posed by the real world. The purpose of this study was to verify whether the initial action after the occurrence of an accident, and the decision-making strategies used by the investigators responsible for this activity, are characteristic of the naturalistic decision-making theoretical approach. To meet this objective, a descriptive study was undertaken with a sample of professionals who work in this activity. The data collected through individual interviews were analyzed, and the results demonstrated that the initial-action environment, which includes restricted time, dynamic conditions, the presence of multiple actors, stress, and insufficient information, is characteristic of naturalistic decision making. They also demonstrated that, when the investigators make their decisions, they use their experience along with mental simulation, intuition, improvisation, metaphors, and analogous cases as strategies, all of them related to the naturalistic approach to decision making, in order to satisfy the needs of the situation and reach the objectives of the initial action in the accident scenario.

  20. Initiation and persistence to statin treatment in patients with diabetes receiving glucose-lowering medications 1997- 2006

    DEFF Research Database (Denmark)

    Dominguez, H; Schramm, T K; Norgaard, M L

    2009-01-01

    AIMS: Since 2001 guidelines recommend statin treatment in most patients with diabetes. We investigated secular changes in initiation and persistence to statin treatment during a 10-year period in a nationwide cohort of patients initiating glucose-lowering medication (GLM). METHODS: All Danish citizens 30 years and older who claimed prescriptions of GLM between 1997 and 2006 were identified from nationwide registers of drug dispensing from pharmacies and hospitalizations, and followed until 2006. Statin treatment was registered if a prescription was claimed during the period. By logistic regression we analyzed factors related to initiation and persistence to statin treatment. RESULTS: In total 128,106 patients were included. In 1997 only 7% of the patients receiving GLM claimed statins within the first year after GLM initiation. Despite increasing statin prescriptions the following years...

  1. Adapting adaptation: the English eco-town initiative as governance process

    Directory of Open Access Journals (Sweden)

    Daniel Tomozeiu

    2014-06-01

    Full Text Available Climate change adaptation and mitigation have become key policy drivers in the UK under its Climate Change Act of 2008. At the same time, urbanization has been high on the agenda, given the pressing need for substantial additional housing, particularly in southeast England. These twin policy objectives were brought together in the UK government's 'eco-town' initiative for England launched in 2007, which has since resulted in four eco-town projects currently under development. We critically analyze the eco-town initiative's policy evolution and early planning phase from a multilevel governance perspective by focusing on the following two interrelated aspects: (1) the evolving governance structures and resulting dynamics arising from the development of the eco-town initiative at UK governmental level, and the subsequent partial devolution to local stakeholders, including local authorities and nongovernmental actors, under the new 'localism' agenda; and (2) the effect of these governance dynamics on the conceptual and practical approach to adaptation through the emerging eco-town projects. As such, we problematize the impact of multilevel governance relations, and competing governance strategies and leadership, on shaping eco-town and related adaptation strategies and practice.

  2. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  3. Filtering observations without the initial guess

    Science.gov (United States)

    Chin, T. M.; Abbondanza, C.; Gross, R. S.; Heflin, M. B.; Parker, J. W.; Soja, B.; Wu, X.

    2017-12-01

    Noisy geophysical observations sampled irregularly over space and time are often numerically "analyzed" or "filtered" before scientific usage. The standard analysis and filtering techniques based on the Bayesian principle require an "a priori" joint distribution of all the geophysical parameters of interest. However, such prior distributions are seldom known fully in practice, and best-guess mean values (e.g., "climatology" or "background" data if available) accompanied by some arbitrarily set covariance values are often used in their place. It is therefore desirable to be able to exploit efficient (time-sequential) Bayesian algorithms like the Kalman filter while not being forced to provide a prior distribution (i.e., an initial mean and covariance). An example of this is the estimation of the terrestrial reference frame (TRF), where the requirement for numerical precision is such that any use of a priori constraints on the observation data needs to be minimized. We will present the Information Filter algorithm, a variant of the Kalman filter that does not require an initial distribution, and apply the algorithm (and an accompanying smoothing algorithm) to the TRF estimation problem. We show that the information filter allows temporal propagation of partial information on the distribution (the marginal distribution of a transformed version of the state vector), instead of the full distribution (mean and covariance) required by the standard Kalman filter. The information filter appears to be a natural choice for the task of filtering observational data in general cases where a prior assumption on the initial estimate is not available and/or desirable. For application to data assimilation problems, reduced-order approximations of both the information filter and square-root information filter (SRIF) have been published, and the former has previously been applied to a regional configuration of the HYCOM ocean general circulation model. Such approximation approaches are also briefed in the
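
    A compact way to see why the information filter needs no prior is to work with the information matrix Y = P^-1 and information vector y = Y x: initializing Y = 0 encodes "no prior information," and each measurement simply adds H^T R^-1 H and H^T R^-1 z. The sketch below shows that measurement update for a static state; it is a generic textbook form under assumed dimensions, not the TRF implementation described in the abstract.

```python
import numpy as np

def information_update(Y, y, H, R, z):
    """Add one measurement z = H x + v, v ~ N(0, R), in information form."""
    Rinv = np.linalg.inv(R)
    Y_new = Y + H.T @ Rinv @ H          # information matrix update
    y_new = y + H.T @ Rinv @ z          # information vector update
    return Y_new, y_new

dim = 2
Y = np.zeros((dim, dim))                # no prior: zero information
y = np.zeros(dim)

# Two hypothetical scalar observations of the 2-D state.
for H, z in [(np.array([[1.0, 0.0]]), np.array([1.2])),
             (np.array([[0.0, 1.0]]), np.array([-0.7]))]:
    Y, y = information_update(Y, y, H, np.array([[0.1]]), z)

x_hat = np.linalg.solve(Y, y)           # recover the state estimate once Y is invertible
print(x_hat)
```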

  4. Imaging Reporters for Proteasome Activity Identify Tumor- and Metastasis-Initiating Cells

    Directory of Open Access Journals (Sweden)

    Amanda C. Stacer

    2015-08-01

    Full Text Available Tumor-initiating cells, also designated as cancer stem cells, are proposed to constitute a subpopulation of malignant cells central to tumorigenesis, metastasis, and treatment resistance. We analyzed the activity of the proteasome, the primary organelle for targeted protein degradation, as a marker of tumor- and metastasis-initiating cells. Using human and mouse breast cancer cells expressing a validated fluorescent reporter, we found a small subpopulation of cells with low proteasome activity that divided asymmetrically to produce daughter cells with low or high proteasome activity. Breast cancer cells with low proteasome activity had greater local tumor formation and metastasis in immunocompromised and immunocompetent mice. To allow flexible labeling of cells, we also developed a new proteasome substrate based on HaloTag technology. Patient-derived glioblastoma cells with low proteasome activity measured by the HaloTag reporter show key phenotypes associated with tumor-initiating cells, including expression of a stem cell transcription factor, reconstitution of the original starting population, and enhanced neurosphere formation. We also show that patient-derived glioblastoma cells with low proteasome activity have higher frequency of tumor formation in mouse xenografts. These studies support proteasome function as a tool to investigate tumor- and metastasis-initiating cancer cells and a potential biomarker for outcomes in patients with several different cancers.

  5. Reduction of initial shock in decadal predictions using a new initialization strategy

    Science.gov (United States)

    He, Yujun; Wang, Bin; Liu, Mimi; Liu, Li; Yu, Yongqiang; Liu, Juanjuan; Li, Ruizhe; Zhang, Cheng; Xu, Shiming; Huang, Wenyu; Liu, Qun; Wang, Yong; Li, Feifei

    2017-08-01

    A novel full-field initialization strategy based on the dimension-reduced projection four-dimensional variational data assimilation (DRP-4DVar) is proposed to alleviate the well-known initial shock occurring in the early years of decadal predictions. It generates consistent initial conditions, which best fit the monthly mean oceanic analysis data along the coupled model trajectory in 1-month windows. Three indices to measure the initial shock intensity are also proposed. Results indicate that this method does reduce the initial shock in decadal predictions by the Flexible Global Ocean-Atmosphere-Land System model, Grid-point version 2 (FGOALS-g2), compared with the three-dimensional variational data assimilation-based nudging full-field initialization for the same model, and is comparable to or even better than the initialization strategies used by other models in the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Better hindcasts of global mean surface air temperature anomalies can be obtained than in other FGOALS-g2 experiments. Due to the good model response to external forcing and the reduction of initial shock, higher decadal prediction skill is achieved than in other CMIP5 models.

  6. Susceptibility of bovine dental enamel with initial erosion lesion to new erosive challenges.

    Science.gov (United States)

    Oliveira, Gabriela Cristina de; Tereza, Guida Paola Genovez; Boteon, Ana Paula; Ferrairo, Brunna Mota; Gonçalves, Priscilla Santana Pinto; Silva, Thiago Cruvinel da; Honório, Heitor Marques; Rios, Daniela

    2017-01-01

    This in vitro study evaluated the impact of initial erosion on the susceptibility of enamel to further erosive challenge. Thirty bovine enamel blocks were selected by surface hardness and randomized into two groups (n = 15): GC, a group composed of enamel blocks without an erosion lesion, and GT, a group composed of enamel blocks with an initial erosion lesion. The baseline profile of each block was determined using the profilometer. The initial erosion was produced by immersing the blocks into HCl 0.01 M, pH 2.3, for 30 seconds, under stirring. The erosive cycling consisted of immersion of the blocks in hydrochloric acid (0.01 M, pH 2.3) for 2 minutes, followed by immersion in artificial saliva for 120 minutes. This procedure was repeated 4 times a day for 5 days, and the blocks were kept in artificial saliva overnight. After erosive cycling, final profile measurement was performed. Profilometry measured the enamel loss by the superposition of the initial and final profiles. Data were analyzed by t-test (p < 0.05). The presence of initial erosion on bovine dental enamel does not enhance its susceptibility to new erosive challenges.

  7. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  8. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  9. Wh-filler-gap dependency formation guides reflexive antecedent search

    Directory of Open Access Journals (Sweden)

    Michael eFrazier

    2015-10-01

    Full Text Available Prior studies on online sentence processing have shown that the parser can resolve non-local dependencies rapidly and accurately. This study investigates the interaction between the processing of two such non-local dependencies: wh-filler-gap dependencies (WhFGD) and reflexive-antecedent dependencies. We show that reflexive-antecedent dependency resolution is sensitive to the presence of a WhFGD, and argue that the filler-gap dependency established by WhFGD resolution is selected online as the antecedent of a reflexive dependency. We investigate the processing of constructions like (1), where two NPs might be possible antecedents for the reflexive, namely which cowgirl and Mary. Even though Mary is linearly closer to the reflexive, the only grammatically licit antecedent for the reflexive is the more distant wh-NP, which cowgirl. 1. Which cowgirl did Mary expect to have injured herself due to negligence? Four eye-tracking text-reading experiments were conducted on examples like (1), differing in whether the embedded clause was non-finite (1 and 3) or finite (2 and 4), and in whether the tail of the wh-dependency intervened between the reflexive and its closest overt antecedent (1 and 2) or the wh-dependency was associated with a position earlier in the sentence (3 and 4). The results of Experiments 1 and 2 indicate the parser accesses the result of WhFGD formation during reflexive antecedent search. The resolution of a wh-dependency alters the representation that reflexive antecedent search operates over, allowing the grammatical but linearly distant antecedent to be accessed rapidly. In the absence of a long-distance WhFGD (Exp. 3 and 4), wh-NPs were not found to impact reading times of the reflexive, indicating that the parser's ability to select distant wh-NPs as reflexive antecedents crucially involves syntactic structure.

  10. Semi-autonomous inline water analyzer: design of a common light detector for bacterial, phage, and immunological biosensors.

    Science.gov (United States)

    Descamps, Elodie C T; Meunier, Damien; Brutesco, Catherine; Prévéral, Sandra; Franche, Nathalie; Bazin, Ingrid; Miclot, Bertrand; Larosa, Philippe; Escoffier, Camille; Fantino, Jean-Raphael; Garcia, Daniel; Ansaldi, Mireille; Rodrigue, Agnès; Pignol, David; Cholat, Pierre; Ginet, Nicolas

    2017-01-01

    The use of biosensors as sensitive and rapid alert systems is a promising perspective to monitor accidental or intentional environmental pollution, but their implementation in the field is limited by the lack of adapted inline water monitoring devices. We describe here the design and initial qualification of an analyzer prototype able to accommodate three types of biosensors based on entirely different methodologies (immunological, whole-cell, and bacteriophage biosensors), but whose responses rely on the emission of light. We developed a custom light detector and a reaction chamber compatible with the specificities of the three systems and resulting in statutory detection limits. The water analyzer prototype resulting from the COMBITOX project can be situated at level 4 on the Technology Readiness Level (TRL) scale and this technical advance paves the way to the use of biosensors on-site.

  11. Diel cycling of zinc in a stream impacted by acid rock drainage: Initial results from a new in situ Zn analyzer

    Science.gov (United States)

    Chapin, T.P.; Nimick, D.A.; Gammons, C.H.; Wanty, R.B.

    2007-01-01

    Recent work has demonstrated that many trace metals undergo dramatic diel (24-h) cycles in near neutral pH streams with metal concentrations reproducibly changing up to 500% during the diel period (Nimick et al., 2003). To examine diel zinc cycles in streams affected by acid rock drainage, we have developed a novel instrument, the Zn-DigiScan, to continuously monitor in situ zinc concentrations in near real-time. Initial results from a 3-day deployment at Fisher Creek, Montana have demonstrated the ability of the Zn-DigiScan to record diel Zn cycling at levels below 100 μg/l. Longer deployments of this instrument could be used to examine the effects of episodic events such as rainstorms and snowmelt pulses on zinc loading in streams affected by acid rock drainage. © Springer Science+Business Media B.V. 2006.

  12. FILLED GAP EFFECT AND SEMANTIC PLAUSIBILITY IN BRAZILIAN PORTUGUESE SENTENCE PROCESSING

    OpenAIRE

    Marcus Maia

    2014-01-01

    The Filled Gap Effect (FGE) is investigated in Brazilian Portuguese through eye-tracking and self-paced reading experiments. Results detect the presence of FGE, indicating that the parser is strictly syntactic in the early stage of processing. The final measures in the two experiments present discrepant results, motivating a discussion on possible good-enough effects.

  13. Experimental study of living free radical polymerization using trifunctional initiator and polymerization mediated by nitroxide

    International Nuclear Information System (INIS)

    Galhardo, Eduardo; Lona, Liliane M.F.

    2009-01-01

    Controlled free radical polymerization, or living free radical polymerization, has received increasing attention as a technique for the production of polymers with a highly controlled microstructure. In particular, narrow molecular weight distributions are obtained, with polydispersity very close to one. In this research, controlled polymerization mediated by nitroxide was investigated using a cyclic trifunctional peroxide. As far as we know, the literature contains only publications dealing with NMRP using mono- and bifunctional initiators. It is believed that the trifunctional peroxide can increase the rate of polymerization, since more free radicals are generated compared with initiators of lower functionality. Furthermore, because the initiator is cyclic, branches are not generated in the chains, which theoretically prevents an increase in the polydispersity of the polymer. The effect of the dissociation constant of the trifunctional initiator on the reaction rate was analyzed. (author)

  14. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  15. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  16. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  17. Sintaxe X-barra: uma aplicação computacional

    Directory of Open Access Journals (Sweden)

    Gabriel de Ávila Othero

    2009-04-01

    Full Text Available http://dx.doi.org/10.5007/1984-8420.2008v9nespp15 In this paper we present a computational application of X-bar theory (cf. HAEGEMAN, 1994; MIOTO et al., 2004) through the program Grammar Play, a syntactic parser written in Prolog. Grammar Play analyzes simple declarative sentences of Brazilian Portuguese, identifying their constituent structure. Its grammar is implemented in Prolog, using DCGs, and follows the framework proposed by X-bar theory. The parser is a first attempt to expand the coverage of similar analyzers, such as those sketched in Pagani (2004) and Othero (2004). The goals guiding the present version of Grammar Play are to implement computationally coherent linguistic models applied to the description of Portuguese and to create a computational tool that can be used didactically in introductory syntax and linguistics classes, for example.

  18. Evolution of the Generic Lock System at Jefferson Lab

    International Nuclear Information System (INIS)

    Brian Bevins; Yves Roblin

    2003-01-01

    The Generic Lock system is a software framework that allows highly flexible feedback control of large distributed systems. It allows system operators to implement new feedback loops between arbitrary process variables quickly and with no disturbance to the underlying control system. Several different types of feedback loops are provided and more are being added. This paper describes the further evolution of the system since it was first presented at ICALEPCS 2001 and reports on two years of successful use in accelerator operations. The framework has been enhanced in several key ways. Multiple-input, multiple-output (MIMO) lock types have been added for accelerator orbit and energy stabilization. The general purpose Proportional-Integral-Derivative (PID) locks can now be tuned automatically. The generic lock server now makes use of the Proxy IOC (PIOC) developed at Jefferson Lab to allow the locks to be monitored from any EPICS Channel Access aware client. (Previously clients had to be Cdev aware.) The dependency on the Qt XML parser has been replaced with the freely available Xerces DOM parser from the Apache project
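
    As background to the PID locks mentioned above, the sketch below shows the generic discrete proportional-integral-derivative update that such a software feedback loop applies between a setpoint and a readback process variable. It is a textbook form with made-up gains, not the Jefferson Lab Generic Lock code or its EPICS/PIOC interface.

```python
class PidLock:
    """Generic discrete PID feedback: drive a readback PV toward a setpoint."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def correction(self, setpoint, readback):
        error = setpoint - readback
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical use: nudge a corrector setpoint to hold an orbit reading at 0.0.
lock = PidLock(kp=0.5, ki=0.1, kd=0.0, dt=1.0)
orbit_reading = 0.8
corrector = 0.0
for _ in range(5):
    corrector += lock.correction(0.0, orbit_reading)
    orbit_reading *= 0.5                # stand-in for the machine's response
print(corrector)
```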

  19. Identifying the null subject: evidence from event-related brain potentials.

    Science.gov (United States)

    Demestre, J; Meltzer, S; García-Albea, J E; Vigil, A

    1999-05-01

    Event-related brain potentials (ERPs) were recorded during spoken language comprehension to study the on-line effects of gender agreement violations in controlled infinitival complements. Spanish sentences were constructed in which the complement clause contained a predicate adjective marked for syntactic gender. By manipulating the gender of the antecedent (i.e., the controller) of the implicit subject while holding constant the gender of the adjective, pairs of grammatical and ungrammatical sentences were created. The detection of such a gender agreement violation would indicate that the parser had established the coreference relation between the null subject and its antecedent. The results showed a complex biphasic ERP (i.e., an early negativity with prominence at anterior and central sites, followed by a centroparietal positivity) in the violating condition as compared to the non-violating conditions. The brain reacts to NP-adjective gender agreement violations within a few hundred milliseconds of their occurrence. The data imply that the parser has properly coindexed the null subject of an infinitive clause with its antecedent.

  20. Conformal changes of metrics and the initial-value problem of general relativity

    International Nuclear Information System (INIS)

    Mielke, E.W.

    1977-01-01

    Conformal techniques are reviewed with respect to applications to the initial-value problem of general relativity. Invariant transverse traceless decompositions of tensors, one of its main tools, are related to representations of the group of 'conformeomorphisms' acting on the space of all Riemannian metrics on M. Conformal vector fields, a kernel in the decomposition, are analyzed on compact manifolds with constant scalar curvature. The realization of arbitrary functions as scalar curvature of conformally equivalent metrics, a generalization of Yamabe's (Osaka Math. J.; 12:12 (1960)) conjecture, is applied to the Hamiltonian constraint and to the issue of positive energy of gravitational fields. Various approaches to the solution of the initial-value equations produced by altering the scaling behaviour of the second fundamental form are compared. (author)
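
    To make the conformal approach concrete, the schematic equations below show the York-type treatment of the Hamiltonian constraint: conformally rescaling the spatial metric and transverse-traceless decomposing the extrinsic curvature turns the constraint into an elliptic (Lichnerowicz-type) equation for the conformal factor. Signs and the scaling chosen for the matter term vary between conventions, so this is a schematic form rather than the one used in the reviewed paper.

```latex
% Hamiltonian constraint and its conformal (Lichnerowicz--York) form, schematically.
% Conformal rescaling: g_{ij} = \phi^4 \bar{g}_{ij}; traceless part of the extrinsic
% curvature rescaled as \bar{A}^{ij} = \phi^{10} A^{ij}.
\begin{align}
  R + K^2 - K_{ij}K^{ij} &= 16\pi\rho, \\
  8\bar{\Delta}\phi - \bar{R}\,\phi - \tfrac{2}{3}K^2\phi^{5}
    + \bar{A}_{ij}\bar{A}^{ij}\,\phi^{-7} + 16\pi\rho\,\phi^{5} &= 0 .
\end{align}
```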

  1. Algorithms to analyze the quality test parameter values of seafood in the proposed ontology based seafood quality analyzer and miner (ONTO SQAM) model

    Directory of Open Access Journals (Sweden)

    Vinu Sherimon

    2017-07-01

    Full Text Available Ensuring the quality of food, particularly seafood, has increasingly become an important issue nowadays. Quality Management Systems empower any organization to identify, measure, control, and improve the quality of the products manufactured, which will eventually lead to improved business performance. With the advent of new technologies, intelligent systems are now being developed. To ensure the quality of seafood, an ontology based seafood quality analyzer and miner (ONTO SQAM) model is proposed. The knowledge is represented using an ontology, in which the domain concepts are defined. This paper presents the initial part of the proposed model: the analysis of quality test parameter values. Two algorithms are proposed to perform the analysis, a Comparison algorithm and a Data Store Updater algorithm. The algorithms ensure that the values of various quality tests are in the acceptable range. Real data sets taken from different seafood companies in Kerala, India, and validated by the Marine Product Export Development Authority of India (MPEDA), are used for the experiments. The performance of the algorithms is evaluated using standard performance metrics such as precision, recall, and accuracy. The results obtained show that all three measures achieved good results.
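
    Although the paper's Comparison and Data Store Updater algorithms are not reproduced here, the sketch below shows the general shape of such an analysis: check each quality-test value against an acceptable range, then score the resulting classifications with precision, recall, and accuracy against expert labels. The parameter names, ranges, and labels are hypothetical.

```python
# Hypothetical acceptable ranges for seafood quality tests (not from the paper).
ACCEPTABLE = {"histamine_ppm": (0.0, 50.0), "ph": (6.0, 6.8), "tvbn_mg_100g": (0.0, 30.0)}

def within_range(test, value):
    lo, hi = ACCEPTABLE[test]
    return lo <= value <= hi

def scores(predicted, actual):
    """Precision, recall, accuracy for binary 'acceptable' labels."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    fp = sum(p and not a for p, a in zip(predicted, actual))
    fn = sum(not p and a for p, a in zip(predicted, actual))
    tn = sum(not p and not a for p, a in zip(predicted, actual))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    accuracy = (tp + tn) / len(actual)
    return precision, recall, accuracy

samples = [("histamine_ppm", 12.0, True), ("ph", 7.1, False), ("tvbn_mg_100g", 25.0, True)]
pred = [within_range(t, v) for t, v, _ in samples]
truth = [label for _, _, label in samples]
print(scores(pred, truth))
```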

  2. Initial high-power testing of the ATF [Advanced Toroidal Facility] ECH [electron cyclotron heating] system

    International Nuclear Information System (INIS)

    White, T.L.; Bigelow, T.S.; Kimrey, H.D. Jr.

    1987-01-01

    The Advanced Toroidal Facility (ATF) is a moderate aspect ratio torsatron that will utilize 53.2 GHz 200 kW Electron Cyclotron Heating (ECH) to produce nearly current-free target plasmas suitable for subsequent heating by strong neutral beam injection. The initial configuration of the ECH system from the gyrotron to ATF consists of an optical arc detector, three bellows, a waveguide mode analyzer, two TiO2 mode absorbers, two 90° miter bends, two waveguide pumpouts, an insulating break, a gate valve, and miscellaneous straight waveguide sections feeding a launcher radiating in the TE02 mode. Later, a focusing Vlasov launcher will be added to beam the ECH power to the saddle point in ATF magnetic geometry for optimum power deposition. The ECH system has several unique features; namely, the entire ECH system is evacuated, the ECH system is broadband, forward power is monitored by a newly developed waveguide mode analyzer, phase correcting miter bends will be employed, and the ECH system will be capable of operating short pulse to cw. Initial high-power tests show that the overall system efficiency is 87%. The waveguide mode analyzer shows that the gyrotron mode output consists of 13% TE01, 82.6% TE02, 2.5% TE03, and 1.9% TE04. 4 refs

  3. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), was examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  4. Singularity, initial conditions and quantum tunneling in modern cosmology

    International Nuclear Information System (INIS)

    Khalatnikov, I M; Kamenshchik, A Yu

    1998-01-01

    The key problems of modern cosmology, such as the cosmological singularity, initial conditions, and the quantum tunneling hypothesis, are discussed. The relationship between the latest cosmological trends and L D Landau's old ideas is analyzed. Particular attention is given to the oscillatory approach to singularity; quantum tunneling processes determining the wave function of the Universe in the presence of a complex scalar field; and the role of quantum corrections in these processes. The classical dynamics of closed models with a real scalar field is investigated from the standpoint of chaotic, fractal, and singularity-avoiding properties. (special issue)

  5. Systematic Review of Smoking Initiation among Asian Adolescents, 2005-2015: Utilizing the Frameworks of Triadic Influence and Planned Behavior.

    Science.gov (United States)

    Talip, Tajidah; Murang, Zaidah; Kifli, Nurolaini; Naing, Lin

    2016-01-01

    This review presents various factors influencing smoking initiation among Asian adolescents and provides a conceptual framework for further analysis of these factors. Future studies should adopt a standard measure of smoking initiation and should analyze interactions and the intensity of relationships between different factors or variables in the conceptual model. This will in turn consolidate the understanding of the different factors affecting smoking initiation and will help to improve interventions in this area.

  6. Adherence of private health system hospitals to dissemination of outcomes according to the Global Reporting Initiative (GRI) model.

    Science.gov (United States)

    Machado, Celso; César, Robson Danúbio da Silva; Souza, Maria Tereza Saraiva de

    2017-01-01

    To verify whether there is an analogy between the indicators of the Global Reporting Initiative adopted by hospitals in the private healthcare system. Documentary research supported by reports that are electronically available on the websites of the companies surveyed. The organizations surveyed showed significant adherence of their economic, social, and environmental indicators to the model proposed by the Global Reporting Initiative, revealing a common field of shared indicators among them. There is similarity between the indicators adopted by the companies, but one of the hospitals analyzed had a greater number of indicators converging with the Global Reporting Initiative.

  7. Access to Specialized Care Through Telemedicine in Limited-Resource Country: Initial 1,065 Teleconsultations in Albania.

    Science.gov (United States)

    Latifi, Rifat; Gunn, Jayleen K L; Bakiu, Evis; Boci, Arian; Dasho, Erion; Olldashi, Fatos; Pipero, Pellumb; Stroster, John A; Qesteri, Orland; Kucani, Julian; Sulo, Ardi; Oshafi, Manjola; Osmani, Kalterina L; Dogjani, Agron; Doarn, Charles R; Shatri, Zhaneta; Kociraj, Agim; Merrell, Ronald C

    2016-12-01

    To analyze the initial experience of the nationwide clinical telemedicine program of Albania, as a model of implementation of telemedicine using an "Initiate-Build-Operate-Transfer" strategy. This was a retrospective study of prospectively collected data from teleconsultations in Albania between January 1, 2014 and August 26, 2015, delivered synchronously, asynchronously, or by a combination of both methods. Patients' demographics, mode of consultation, clinical specialty, hospitals providing referral and consultation, time from initial call to completion of consultation, and patient disposition following teleconsultation were analyzed. Challenges of the newly created program were identified and analyzed as well. There were 1,065 teleconsultations performed altogether during the study period. Ninety-one patients with autism managed via telemedicine were not included in this analysis and will be reported separately. Of the 974 teleconsults, the majority were for radiology, neurotrauma, and stroke (55%, 16%, and 10%, respectively). Asynchronous technology accounted for nearly two-thirds of all teleconsultations (63.7%), followed by combined (24.3%) and then synchronous (12.0%). Of the 974 cases, only 20.0% of patients in 2014 and 22.72% of patients in 2015 were transferred to a tertiary hospital. A majority (98.5%) of all teleconsultations were conducted within the country itself. The Integrated Telemedicine and e-Health program of Albania has become a useful tool to improve access to high-quality healthcare, particularly in highly demanding specialty disciplines. A number of challenges were identified, and these should serve as lessons for other countries in their quest to establish nationwide telemedicine programs.

  8. Evaluation of strength and failure of brittle rock containing initial cracks under lithospheric conditions

    Science.gov (United States)

    Li, Xiaozhao; Qi, Chengzhi; Shao, Zhushan; Ma, Chao

    2018-02-01

    Natural brittle rock contains numerous randomly distributed microcracks. Crack initiation, growth, and coalescence play a predominant role in the evaluation of the strength and failure of brittle rocks. A new analytical method is proposed to predict the strength and failure of brittle rocks containing initial microcracks. The formulation of this method is based on an improved wing crack model and a suggested micro-macro relation. In this improved wing crack model, the crack angle is introduced explicitly as a variable, and an analytical stress-crack relation accounting for the crack angle effect is obtained. Coupling the proposed stress-crack relation with the suggested micro-macro relation, which describes the relation between crack growth and axial strain, yields the stress-strain constitutive relation used to predict rock strength and failure. Considering different initial microcrack sizes, friction coefficients, and confining pressures, the effects of crack angle on the tensile wedge force acting on the initial crack interface are studied, and the effects of crack angle on the stress-strain constitutive relation of rocks are also analyzed. The strength and crack initiation stress under different crack angles are discussed, and the value of the most unfavorable angle for triggering crack initiation and rock failure is found. The analytical results are similar to published results, verifying the rationality of the proposed analytical method.

  9. The Ties that Bind: Presidential Involvement with the Development of NCAA Division I Initial Eligibility Legislation.

    Science.gov (United States)

    Covell, Dan; Barr, Carol A.

    2001-01-01

    Provides a chronology of college presidential efforts to deal with conflicts related to reconciliation of academic mission and athletic success through development of National Collegiate Athletic Association (NCAA) initial eligibility academic legislation. Analyzes these efforts in terms of maintaining congruence within the constituency-based…

  10. The initial decrease in effective peritoneal surface area is not caused by an increase in hematocrit

    NARCIS (Netherlands)

    Struijk, D. G.; Krediet, R. T.; Koomen, G. C.; Boeschoten, E. W.; Hoek, F. J.; Arisz, L.

    1993-01-01

    The possible relationship between initial changes in functional characteristics of the peritoneal membrane in time and hemoglobin (Hb) or hematocrit (Ht) was analyzed as part of a prospective longitudinal study. The patients were investigated twice: the first time within 3 months after the start of

  11. Susceptibility of bovine dental enamel with initial erosion lesion to new erosive challenges.

    Directory of Open Access Journals (Sweden)

    Gabriela Cristina de Oliveira

    Full Text Available This in vitro study evaluated the impact of initial erosion on the susceptibility of enamel to further erosive challenge. Thirty bovine enamel blocks were selected by surface hardness and randomized into two groups (n = 15): GC, a group composed of enamel blocks without an erosion lesion, and GT, a group composed of enamel blocks with an initial erosion lesion. The baseline profile of each block was determined using the profilometer. The initial erosion was produced by immersing the blocks into HCl 0.01 M, pH 2.3, for 30 seconds, under stirring. The erosive cycling consisted of immersion of the blocks in hydrochloric acid (0.01 M, pH 2.3) for 2 minutes, followed by immersion in artificial saliva for 120 minutes. This procedure was repeated 4 times a day for 5 days, and the blocks were kept in artificial saliva overnight. After erosive cycling, final profile measurement was performed. Profilometry measured the enamel loss by the superposition of the initial and final profiles. Data were analyzed by t-test (p < 0.05). The result showed no statistically significant difference between groups (GC = 14.60 ± 2.86 μm and GT = 14.69 ± 2.21 μm). The presence of initial erosion on bovine dental enamel does not enhance its susceptibility to new erosive challenges.
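
    The statistical step in this study is a two-sample t-test on the profilometric enamel-loss values of the two groups. A minimal sketch with scipy follows; the arrays hold illustrative values generated around the reported group means, not the study's raw measurements.

```python
import numpy as np
from scipy import stats

# Illustrative enamel-loss values (micrometres) for 15 blocks per group.
rng = np.random.default_rng(1)
gc_loss = rng.normal(loc=14.6, scale=2.9, size=15)   # blocks without initial erosion
gt_loss = rng.normal(loc=14.7, scale=2.2, size=15)   # blocks with initial erosion

t_stat, p_value = stats.ttest_ind(gc_loss, gt_loss)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# p >= 0.05 would indicate no detectable difference in susceptibility, as reported.
```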

  12. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system, and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It also allows the instrument to vary its FIA operation modes, giving it the functionality of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in the aqueous solution by adding cetyl-pyridium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput rate of 30-90 h⁻¹ and reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant

  13. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  14. Annotating Logical Forms for EHR Questions.

    Science.gov (United States)

    Roberts, Kirk; Demner-Fushman, Dina

    2016-05-01

    This paper discusses the creation of a semantically annotated corpus of questions about patient data in electronic health records (EHRs). The goal is to provide the training data necessary for semantic parsers to automatically convert EHR questions into a structured query. A layered annotation strategy is used which mirrors a typical natural language processing (NLP) pipeline. First, questions are syntactically analyzed to identify multi-part questions. Second, medical concepts are recognized and normalized to a clinical ontology. Finally, logical forms are created using a lambda calculus representation. We use a corpus of 446 questions asking for patient-specific information. From these, 468 specific questions are found containing 259 unique medical concepts and requiring 53 unique predicates to represent the logical forms. We further present detailed characteristics of the corpus, including inter-annotator agreement results, and describe the challenges automatic NLP systems will face on this task.
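
    As an illustration of the kind of lambda-calculus logical form such an annotation targets, a patient-specific EHR question might be represented as below. The example question and predicate names are hypothetical and are not drawn from the 446-question corpus.

```latex
% Hypothetical EHR question and a lambda-calculus logical form for it.
% Question: "What was the dose of the patient's last warfarin prescription?"
\lambda x.\, \exists p.\, \mathrm{prescription}(p) \wedge \mathrm{medication}(p,\mathrm{warfarin})
  \wedge \mathrm{latest}(p) \wedge \mathrm{dose}(p, x)
```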

  15. Single-molecule packaging initiation in real time by a viral DNA packaging machine from bacteriophage T4.

    Science.gov (United States)

    Vafabakhsh, Reza; Kondabagil, Kiran; Earnest, Tyler; Lee, Kyung Suk; Zhang, Zhihong; Dai, Li; Dahmen, Karin A; Rao, Venigalla B; Ha, Taekjip

    2014-10-21

    Viral DNA packaging motors are among the most powerful molecular motors known. A variety of structural, biochemical, and single-molecule biophysical approaches have been used to understand their mechanochemistry. However, packaging initiation has been difficult to analyze because of its transient and highly dynamic nature. Here, we developed a single-molecule fluorescence assay that allowed visualization of packaging initiation and reinitiation in real time and quantification of motor assembly and initiation kinetics. We observed that a single bacteriophage T4 packaging machine can package multiple DNA molecules in bursts of activity separated by long pauses, suggesting that it switches between active and quiescent states. Multiple initiation pathways were discovered including, unexpectedly, direct DNA binding to the capsid portal followed by recruitment of motor subunits. Rapid succession of ATP hydrolysis was essential for efficient initiation. These observations have implications for the evolution of icosahedral viruses and regulation of virus assembly.

  16. Faraday cup for analyzing multi-ion plasma

    International Nuclear Information System (INIS)

    Fujita, Takao

    1987-01-01

    A compact and convenient ion analyzer (a kind of a Faraday cup) is developed in order to analyze weakly ionized multi-ion plasmas. This Faraday cup consists of three mesh electrodes and a movable ion collector. With a negative gate pulse superimposed on the ion retarding bias, ions are analyzed by means of time-of-flight. The identification of ion species and measurements of ion density and ion temperature are studied. (author)
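
    The time-of-flight identification rests on elementary kinematics: an ion of mass m and charge q accelerated through an effective potential V reaches speed v = sqrt(2qV/m), so its arrival time over a drift length L scales as sqrt(m/q). The sketch below evaluates that relation for a few ion species; the drift length and bias are placeholder numbers, not parameters of the instrument described here.

```python
import numpy as np

E_CHARGE = 1.602e-19     # C
AMU = 1.661e-27          # kg

def arrival_time(mass_amu, charge_states, drift_length_m, accel_volts):
    """Time of flight t = L * sqrt(m / (2 q V))."""
    m = mass_amu * AMU
    q = charge_states * E_CHARGE
    return drift_length_m * np.sqrt(m / (2.0 * q * accel_volts))

# Placeholder geometry: 5 cm drift, 20 V effective accelerating bias.
for name, m_amu in [("H+", 1.0), ("He+", 4.0), ("Ar+", 40.0)]:
    t = arrival_time(m_amu, 1, 0.05, 20.0)
    print(f"{name}: {t * 1e6:.2f} microseconds")
```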

  17. Initial-boundary value problems associated with the Ablowitz-Ladik system

    Science.gov (United States)

    Xia, Baoqiang; Fokas, A. S.

    2018-02-01

    We employ the Ablowitz-Ladik system as an illustrative example in order to demonstrate how to analyze initial-boundary value problems for integrable nonlinear differential-difference equations via the unified transform (Fokas method). In particular, we express the solutions of the integrable discrete nonlinear Schrödinger and integrable discrete modified Korteweg-de Vries equations in terms of the solutions of appropriate matrix Riemann-Hilbert problems. We also discuss in detail, for both the above discrete integrable equations, the associated global relations and the process of eliminating of the unknown boundary values.
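
    For reference, the Ablowitz-Ladik lattice referred to here is the integrable discretization of the nonlinear Schrödinger equation shown schematically below; the sign of the nonlinear term distinguishes the focusing and defocusing cases, and normalization conventions differ between authors.

```latex
% Ablowitz--Ladik (integrable discrete NLS) lattice, schematic normalization.
i\,\frac{d q_n}{dt} + q_{n+1} - 2 q_n + q_{n-1} \pm |q_n|^2\,(q_{n+1} + q_{n-1}) = 0,
\qquad n \in \mathbb{Z}.
```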

  18. FILLED GAP EFFECT AND SEMANTIC PLAUSIBILITY IN BRAZILIAN PORTUGUESE SENTENCE PROCESSING

    Directory of Open Access Journals (Sweden)

    Marcus Maia

    2014-12-01

    Full Text Available The Filled Gap Effect (FGE) is investigated in Brazilian Portuguese through eye-tracking and self-paced reading experiments. Results detect the presence of FGE, indicating that the parser is strictly syntactic in the early stage of processing. The final measures in the two experiments present discrepant results, motivating a discussion on possible good-enough effects.

  19. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed and the calculated radioxenon concentrations and raw gamma-ray spectra automatically transmitted to data centers

  20. Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms

    Science.gov (United States)

    Kwok, Kwan S.; Driessen, Brian J.; Phillips, Cynthia A.; Tovey, Craig A.

    1997-09-01

    This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. We wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which we must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. We have found these mobile robot problems to be a very interesting application of network optimization methods, and we expect this to be a fruitful area for future research.
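
    The objective described above, minimizing the maximum distance traveled by any robot, is a bottleneck assignment problem. One standard way to solve it, sketched below with hypothetical positions, is to binary-search over candidate distance thresholds and test each one for a perfect bipartite matching; this is an illustrative approach and not necessarily the algorithm used in the paper.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import maximum_bipartite_matching

def bottleneck_assignment(robots, slots):
    """Assign robots to grid slots minimizing the maximum travel distance."""
    d = np.linalg.norm(robots[:, None, :] - slots[None, :, :], axis=2)
    candidates = np.unique(d)
    lo, hi = 0, len(candidates) - 1
    best = None
    while lo <= hi:                       # binary search over distance thresholds
        mid = (lo + hi) // 2
        graph = csr_matrix((d <= candidates[mid]).astype(int))
        match = maximum_bipartite_matching(graph, perm_type='column')
        if np.all(match >= 0):            # perfect matching exists at this threshold
            best = (candidates[mid], match.copy())
            hi = mid - 1
        else:
            lo = mid + 1
    return best                           # (bottleneck distance, matching array)

rng = np.random.default_rng(0)
robots = rng.uniform(0, 100, size=(100, 2))          # random initial positions
xx, yy = np.meshgrid(np.linspace(5, 95, 10), np.linspace(5, 95, 10))
slots = np.column_stack([xx.ravel(), yy.ravel()])    # 10 x 10 surveillance grid
max_dist, assignment = bottleneck_assignment(robots, slots)
print(max_dist)
```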

  1. Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms

    International Nuclear Information System (INIS)

    Kwok, K.S.; Driessen, B.J.; Phillips, C.A.; Tovey, C.A.

    1997-01-01

    This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. The authors wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which they must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. They have found these mobile robot problems to be a very interesting application of network optimization methods, and they expect this to be a fruitful area for future research

  2. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze in a discrete-time context and with a finite outcome space American options starting with the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  3. Community Road Safety Initiatives for the Minerals Industry

    Directory of Open Access Journals (Sweden)

    Tim Horberry

    2013-12-01

    Full Text Available Major companies in the minerals industry are increasingly recognizing that their operations have an impact in the wider community. Regarding transportation issues, this impact extends beyond purely the safety of company vehicle fleets to consideration of Community Road Safety (CRS concerns, which address the driving, walking, and riding practices of community members in a locale with increased heavy vehicle traffic. Our assessment here of national and international trends in approaches to road safety awareness and associated road safety strategies is meant to inform companies in the minerals industry of developments that can influence the design of their road safety initiatives. The review begins by considering the overall road safety context and the dominant “safe systems” framework employed internationally. Thereafter, it considers what is typically included in CRS initiatives for the minerals industry. Three case studies are then presented to highlight approaches that feature exemplary collaboration, design, implementation, or impact. Thereafter, we analyze lessons learnt by key researchers and practitioners in the CRS field. Finally, we conclude that best CRS practices for the minerals industry rely on eleven factors, including for example collaboration with local entities and stepwise implementation.

  4. Development of remote controlled electron probe micro analyzer with crystal orientation analyzer

    International Nuclear Information System (INIS)

    Honda, Junichi; Matsui, Hiroki; Harada, Akio; Obata, Hiroki; Tomita, Takeshi

    2012-07-01

    The advanced utilization of Light Water Reactor (LWR) fuel is being pursued in Japan to reduce the power generating cost and the volume of nuclear wastes. The electric power companies have continued efforts toward burnup extension and thermal power uprating of commercial fuel. The government needs to accumulate detailed information on the newest technologies in order to establish regulations and guidelines for the safety of advanced nuclear fuels. A remote controlled Electron Probe Micro Analyzer (EPMA) fitted with a crystal orientation analyzer has been developed at the Japan Atomic Energy Agency (JAEA) to study the behavior of high burnup fuels under accident conditions. The effects of the cladding microstructure on fuel behavior can be evaluated more conveniently and quantitatively with this EPMA. A commercial EPMA model has been modified to be airtight and earthquake resistant, in compliance with the government safety regulations for handling highly radioactive materials. This paper describes the specifications of the EPMA, which were specialised for post irradiation examination, and the results of cold mock-up tests to confirm its performance and reliability. (author)

  5. Comparative evaluation of Plateletworks, Multiplate analyzer and Platelet function analyzer-200 in cardiology patients.

    Science.gov (United States)

    Kim, Jeeyong; Cho, Chi Hyun; Jung, Bo Kyeung; Nam, Jeonghun; Seo, Hong Seog; Shin, Sehyun; Lim, Chae Seung

    2018-04-14

    The objective of this study was to comparatively evaluate three commercial whole-blood platelet function analyzer systems: Platelet Function Analyzer-200 (PFA; Siemens Canada, Mississauga, Ontario, Canada), Multiplate analyzer (MP; Roche Diagnostics International Ltd., Rotkreuz, Switzerland), and Plateletworks Combo-25 kit (PLW; Helena Laboratories, Beaumont, TX, USA). Venipuncture was performed on 160 patients who visited a department of cardiology. Pairwise agreement among the three platelet function assays was assessed using Cohen's kappa coefficient and percent agreement within the reference limit. Kappa values with the same agonists were poor between PFA-collagen (COL; agonist)/adenosine diphosphate (ADP) and MP-ADP (-0.147), PFA-COL/ADP and PLW-ADP (0.089), MP-ADP and PLW-ADP (0.039), PFA-COL/ADP and MP-COL (-0.039), and between PFA-COL/ADP and PLW-COL (-0.067). Nonetheless, kappa values for the same assay principle with a different agonist were slightly higher between PFA-COL/ADP and PFA-COL/EPI (0.352), MP-ADP and MP-COL (0.235), and between PLW-ADP and PLW-COL (0.247). The range of percent agreement values was 38.7% to 73.8%. Therefore, measurements of platelet function by more than one method are needed to obtain a reliable interpretation, considering the low kappa coefficients and modest percent agreement rates among the three platelet function tests.
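
    For readers unfamiliar with the agreement statistics used here, the following sketch (with entirely hypothetical dichotomized data, not the study's measurements) shows how Cohen's kappa and percent agreement are computed for a pair of assays:

```python
# Illustrative sketch: pairwise agreement between two platelet-function assays
# after dichotomizing each result as normal (0) vs impaired (1) against its own
# reference limit. Data below are hypothetical.
from collections import Counter

def cohens_kappa(a, b):
    """a, b: equal-length lists of 0/1 classifications from two assays."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed (percent) agreement
    pa, pb = Counter(a), Counter(b)
    pe = sum(pa[c] / n * pb[c] / n for c in (0, 1))     # agreement expected by chance
    return (po - pe) / (1 - pe), po

# hypothetical dichotomized results for 10 patients
pfa = [0, 1, 1, 0, 0, 1, 0, 0, 1, 0]
mp  = [0, 0, 1, 0, 1, 1, 0, 1, 1, 0]
kappa, agreement = cohens_kappa(pfa, mp)
print(f"kappa = {kappa:.3f}, percent agreement = {agreement:.1%}")
```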

  6. Factors Related to Initiating Interpersonal Contacts on Internet Dating Sites: A View From the Social Exchange Theory

    Directory of Open Access Journals (Sweden)

    Rivka Shtatfeld

    2009-12-01

    Full Text Available The purpose of this study was to identify factors that influence dating-site users to initiate contact with potential romantic partners. The study was carried out by observing online behaviors and analyzing the profiles and authentic messages of these users (N = 106) over seven months. Contacts made by and with the research participants were analyzed in terms of the relationships between initiators' and receivers' demographic variables (marital status, age, level of education, income, writing skills, and stated physical appearance). In addition, the relationship between contacting partners and site accessibility was examined. The findings revealed that dating-site users initiated contact primarily with those having a similar marital status or slightly better characteristics (income, education, writing skills). In regard to writing skills, it was found that skilled writers attracted more contacts than did less skilled writers. However, the factor that was found to be most significantly related to initiating contact was the length of time that elapsed from last connection to the site, which implies the perceived accessibility of potential romantic partners. The findings were explained in terms of the Social Exchange Theory: people are attracted to those who grant them rewards.

  7. Popular Legislative Initiative for Spain Surrogacy: A Study of the Role of Notary in Contract Surrogacy

    Directory of Open Access Journals (Sweden)

    Lorena Sales Pallarés

    2016-12-01

    Full Text Available This article analyzes the Popular Legislative Initiative to regulate surrogacy in Spain. The initiative proposes to regulate this contractual arrangement while guaranteeing the rights of all parties involved in the process by entrusting oversight to the figure of the notary. The article therefore analyzes this notarial tutelage of the surrogacy contract. It considers whether the functions of the notary make this proposal feasible, or whether changes would also be needed in the draft law on notarial functions.

  8. Methyl-Analyzer--whole genome DNA methylation profiling.

    Science.gov (United States)

    Xin, Yurong; Ge, Yongchao; Haghighi, Fatemeh G

    2011-08-15

    Methyl-Analyzer is a python package that analyzes genome-wide DNA methylation data produced by the Methyl-MAPS (methylation mapping analysis by paired-end sequencing) method. Methyl-MAPS is an enzymatic-based method that uses both methylation-sensitive and -dependent enzymes covering >80% of CpG dinucleotides within mammalian genomes. It combines enzymatic-based approaches with high-throughput next-generation sequencing technology to provide whole genome DNA methylation profiles. Methyl-Analyzer processes and integrates sequencing reads from methylated and unmethylated compartments and estimates CpG methylation probabilities at single base resolution. Methyl-Analyzer is available at http://github.com/epigenomics/methylmaps. Sample dataset is available for download at http://epigenomicspub.columbia.edu/methylanalyzer_data.html. fgh3@columbia.edu Supplementary data are available at Bioinformatics online.
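
    The snippet below is a conceptual sketch of the underlying estimation step only, not Methyl-Analyzer's actual API: a per-CpG methylation probability derived from read counts in the methylated and unmethylated compartments, here with an assumed Beta prior and made-up counts:

```python
# Conceptual sketch only -- NOT Methyl-Analyzer's code or API. It illustrates
# estimating a per-CpG methylation probability from read counts observed in the
# methylated and unmethylated sequencing compartments.
def methylation_probability(methylated_reads, unmethylated_reads,
                            prior_alpha=1.0, prior_beta=1.0):
    """Posterior mean of a Beta-Binomial estimate of the methylation level."""
    total = methylated_reads + unmethylated_reads
    return (methylated_reads + prior_alpha) / (total + prior_alpha + prior_beta)

# hypothetical counts at three CpG sites
sites = {"chr1:10468": (14, 2), "chr1:10470": (3, 11), "chr1:10483": (0, 0)}
for cpg, (m, u) in sites.items():
    print(cpg, round(methylation_probability(m, u), 3))
```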

  9. Research on Initiation Sensitivity of Solid Explosive and Planer Initiation System

    OpenAIRE

    N Matsuo; M Otuka; H Hamasima; K Hokamoto; S Itoh

    2016-01-01

    Recently, many techniques have come to be demanded for complex processing; various explosive initiation methods and highly accurate control of detonation are needed. In this research, attention is focused on metal foil explosion driven by a high current as a method to obtain linear or planar initiation easily, and the main evaluation of metal foil explosion for initiating explosives was conducted. The explosion power was evaluated by optically observing the underwater shock wave generated ...

  10. Analyzing power of polarized protons interactions with carbon nuclei at 0.71-3.61 GeV

    International Nuclear Information System (INIS)

    Anoshina, E.V.; Bodyagin, V.A.; Vardanyan, I.N.; Gribushin, A.M.; Ershov, A.A.; Kruglov, N.A.; Sarycheva, L.I.

    1997-01-01

    For the first time, an experiment in a polarized proton beam was carried out at the JINR synchrophasotron. Beams of polarized protons with energy T_p = 0.71-3.61 GeV, polarization P_p ≅ 0.5 and intensity I_p ≅ 10^6 particles/spill have been formed, their characteristics were investigated, and the possibility of using those beams as a starting point for physical and methodical investigations has been shown. The proton-carbon interaction analyzing power at energies of 1.46 and 3.61 GeV has been measured for two values of the scattering angle. 22 refs., 3 figs
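
    For context, the analyzing power is conventionally obtained from the left-right scattering asymmetry measured with a beam of known polarization; a standard textbook form (not quoted from this paper) is:

```latex
% Standard definition of the analyzing power from the measured left-right
% asymmetry; P_b is the beam polarization, N_L and N_R the (normalized) counts
% scattered to the left and right at a given angle theta.
\[
  \varepsilon = \frac{N_L - N_R}{N_L + N_R}, \qquad
  A_y(\theta) = \frac{\varepsilon}{P_b}.
\]
```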

  11. Defects level evaluation of LiTiZn ferrite ceramics using temperature dependence of initial permeability

    Science.gov (United States)

    Malyshev, A. V.; Petrova, A. B.; Sokolovskiy, A. N.; Surzhikov, A. P.

    2018-06-01

    A method for evaluating the integral defect level and chemical homogeneity of ferrite ceramics, based on analysis of the temperature dependence of the initial permeability, is suggested. A phenomenological expression for the description of this dependence is proposed and an interpretation of its main parameters is given. It is shown that the main criterion of the integral defect level of ferrite ceramics is the relation of two parameters correlating with the elastic stress value in the material. A maximum of the initial permeability close to the Curie point can also serve as an indicator of structural perfection. The temperature dependences of the initial permeability have been analyzed for samples sintered in laboratory conditions and for an industrial ferrite product. The proposed method allows the integral defect level of soft ferrite products to be controlled and has high sensitivity compared to typical X-ray methods.

  12. [Evaluation of the Initial Stage Career Exploration Inventory (ISCEI)].

    Science.gov (United States)

    Adachi, Tomoko

    2010-06-01

    The Initial Stage Career Exploration Inventory (ISCEI) was designed to assess career exploration among students in the early stage of making career decisions. The reliability, validity, and applicability of the ISCEI were investigated. In Study 1, responses on the ISCEI from student participants (n = 294 : 69 men, 225 women) were factor analyzed. The results suggested a 3-factor structure consisting of "self-understanding," "information gathering" and "learning from others." Comparison between the ISCEI and self-improvement motive, vocational decisions, and career decision-making self-efficacy scales from the Career Exploration Survey (CES) indicated that the ISCEI had sufficient construct validity. Study 2 investigated the applicability of the ISCEI. The responses of student participants (n = 859 : 451 men, 408 women) on the ISCEI indicated high "self-understanding," neutral "information gathering," and comparatively low "learning from others" scores, which were similarly related to the CES as in Study 1. These findings indicate that the ISCEI can be used as a tool for understanding career exploration among students in the initial stage of making career decisions.

  13. Trait emotional intelligence in initial teacher training

    Directory of Open Access Journals (Sweden)

    David Molero

    2017-02-01

    Full Text Available This study analyzes the emotional intelligence (EI) of teachers during their initial training following the trait EI model, namely the wellness model of Bar-On (2002; 2006). 460 students of the University of Jaen (Spain) participated (age in years M=22.57, SD=±3.39), who responded to the EQ-i Short Form scale, Spanish version (López-Zafra, Pulido-Martos, & Berrios-Martos, 2014), which includes 4 factors (Interpersonal, Adaptability, Stress management and Intrapersonal). There are significant differences (p<.05) on various factors based on gender, age, degree of participants and their educational level. The variables considered in the regression analysis that most predict global EI are Stress Management, followed by Adaptability, Intrapersonal and Interpersonal. The results are consistent with those obtained in other studies in similar contexts.

  14. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film
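
    A worked example of the underlying relation (not the instrument's own software; the field and radius values are made up): the proton momentum follows from p = qBr, and the kinetic energy from the relativistic energy relation:

```python
# Hedged sketch: recovering proton kinetic energy from the deflection radius
# measured on the scintillation screen of a magnetic analyzer, using p = qBr
# and the relativistic energy relation E^2 = (pc)^2 + (m c^2)^2.
import math

PROTON_REST_ENERGY_MEV = 938.272   # m_p c^2
C = 299_792_458.0                  # speed of light, m/s
E_CHARGE = 1.602176634e-19         # elementary charge, C

def proton_kinetic_energy_mev(b_tesla, radius_m):
    pc_joule = E_CHARGE * b_tesla * radius_m * C          # p*c in joules
    pc_mev = pc_joule / (E_CHARGE * 1e6)                  # convert to MeV
    total = math.sqrt(pc_mev**2 + PROTON_REST_ENERGY_MEV**2)
    return total - PROTON_REST_ENERGY_MEV                 # kinetic energy in MeV

# e.g. a 0.5 T field and a 0.3 m bending radius (hypothetical numbers)
print(round(proton_kinetic_energy_mev(0.5, 0.3), 2), "MeV")
```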

  15. Radon-induced DNA damage and apoptosis analyzed by flow cytometry

    International Nuclear Information System (INIS)

    Meenakshi, C.; Mohankumar, Mary N.

    2012-01-01

    Natural radiation is the major source of human exposure to ionizing radiation, and its largest contributing component to effective doses arises from inhalation of 222Rn and its radioactive progeny. 222Rn, a chemically inert gas produced naturally from radium in rocks and soil, is a proven source of lung cancer, especially in closed environments such as mines and poorly ventilated homes. Much of the data on the effect of radon in humans comes from epidemiological studies, often masked by confounding factors such as age, smoking and lifestyle. Radiation carcinogenesis is initiated by DNA damage, and flow cytometry is a versatile, fast and accurate technique for the analysis of DNA damage as it allows a large number of individual cells to be analyzed in a few minutes. An attempt was made to detect DNA damage and apoptosis by flow cytometry after exposing human blood cells in vitro to radon. Blood samples were collected from apparently healthy individuals and exposed in vitro to radon doses ranging between 1-5 mGy using a simple, portable irradiation assembly designed and tested at the Radiological Safety Division of the Indira Gandhi Centre for Atomic Research. Cultures were initiated by the addition of phytohemagglutinin, and cells were processed, stained and analyzed for DNA damage and apoptosis by flow cytometry. CV values indicative of DNA damage were plotted against dose and were observed to increase in a dose dependent manner 3 h after irradiation. However, no such response was observed at 24 h and 48 h. Nevertheless, the percentage of apoptotic cells increased steadily with dose at 24 and 48 h post-exposure. DNA breaks appear to be rejoined after about 24 h of irradiation. However, apoptotic cells increased with time and dose, suggesting elimination of highly damaged cells. Further experiments are needed to identify apoptotic cells as a biomarker of radiation exposure and risk. (author)

  16. Associations between a voluntary restaurant menu designation initiative and patron purchasing behavior.

    Science.gov (United States)

    Sosa, Erica T; Biediger-Friedman, Lesli; Banda, Martha

    2014-03-01

    Restaurant initiatives provide an efficient opportunity to impact large numbers of patrons. The purpose of this study is to measure patron purchasing behaviors during the ¡Por Vida! menu designation initiative. This study used a cross-sectional design and survey data to assess 23 restaurants throughout Bexar County and 152 restaurant patrons. The Patron Awareness Questionnaire assessed if patrons noticed the logo; believed nutrition, cost, and taste were important in making purchasing decisions; and purchased a ¡Por Vida! item. Descriptive statistics, Spearman correlations, and logistic regression were used to analyze the data. Most (93.4%) patrons considered taste very important when deciding what to eat. Cost was very important to 63.8% and nutrition was very important to 55.9% of the sample. The strongest predictors of purchasing a ¡Por Vida! item were the patrons' ages being between 18 and 35 years (odds ratio = 1.474; confidence interval = 0.017, 0.812; p …). Menu designation initiatives can potentially influence patron purchasing behaviors among a segment of the population when the logo is visible.

  17. Study on the Formation and Initial Transport for Non-Homogeneous Debris Flow

    Directory of Open Access Journals (Sweden)

    An Ping Shu

    2017-04-01

    Full Text Available Non-homogeneous debris flows generally occur during the rainy seasons in Southwest China and have received considerable attention in the literature. Given the complexity of debris-flow dynamics, experimental approaches have proven effective in revealing the formative mechanism of debris flow and in quantifying the relations between the various influencing factors and debris-flow formation and the subsequent transport processes. Therefore, a flume-based experimental study was performed at the Debris Flow Observation and Research Station of Jiangjia Gully in Yunnan Province to theoretically analyze favorable conditions for debris-flow formation and initial transport, selecting the median particle size d50, flow rate Q, vertical grading coefficient ψ, slope S, and initial soil water content W as the five variables for investigation. An optimal combination of these variables was made through an orthogonal experimental design to determine their relative importance for the occurrence and initial mobilization behavior of a debris flow and to further enhance our insight into debris-flow triggering and transport mechanisms.

  18. Dioxin Exposure Initiative

    Science.gov (United States)

    The Dioxin Exposure Initiative (DEI) is no longer active. This page contains a summary of the dioxin exposure initiative with illustrations, contact and background information. Originally supported by scientist Matthew Lorber, who retired in Mar 2017.

  19. Rap Music Use, Perceived Peer Behavior, and Sexual Initiation Among Ethnic Minority Youth.

    Science.gov (United States)

    Johnson-Baker, Kimberly A; Markham, Christine; Baumler, Elizabeth; Swain, Honora; Emery, Susan

    2016-03-01

    Research shows that rap music use is associated with risky sexual behavior in ethnic minority youth; however, it is unknown whether rap music use impacts sexual initiation specifically and, if so, which factors mediate this impact. Thus, we investigated the longitudinal relationship between hours spent listening to rap music in seventh grade and sexual initiation in ninth grade. We also examined the role of perceived peer sexual behavior as a potential mediator of this relationship. We analyzed data from students (n = 443) enrolled in a school-based randomized controlled trial of a sexual health education curriculum collected at baseline and at 18-month follow-up. Rap music use and perceived peer sexual behavior were assessed in seventh grade, whereas sexual initiation was assessed in ninth grade. Univariate, multivariate, and mediation analyses were conducted. At baseline, rap music use was significantly associated with race/ethnicity, parental music rules, and sexual behavior, but not with gender or parental education. Rap music use was a significant predictor of sexual initiation on univariate analysis but not multivariate analysis. Mediation analysis showed that the association between hours spent listening to rap music and sexual initiation was significantly mediated by perceived peer sexual behavior. Rap music use in early adolescence significantly impacts sexual initiation in late adolescence, partially mediated by perceived peer sexual behavior. More research is needed to understand how rap music influences perceptions of peer sexual behavior, which, in turn, influence early sexual initiation. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  20. Which Triple Aim related measures are being used to evaluate population management initiatives? An international comparative analysis.

    Science.gov (United States)

    Hendrikx, Roy J P; Drewes, Hanneke W; Spreeuwenberg, Marieke; Ruwaard, Dirk; Struijs, Jeroen N; Baan, Caroline A

    2016-05-01

    Population management (PM) initiatives are introduced in order to create sustainable health care systems. These initiatives should focus on the continuum of health and well-being of a population by introducing interventions that integrate various services. To be successful they should pursue the Triple Aim, i.e. simultaneously improve population health and quality of care while reducing costs per capita. This study explores how PM initiatives measure the Triple Aim in practice. An exploratory search was combined with expert consultations to identify relevant PM initiatives. These were analyzed based on general characteristics, utilized measures and related selection criteria. In total 865 measures were used by 20 PM initiatives. All quality of care domains were included by at least 11 PM initiatives, while most domains of population health and costs were included by less than 7 PM initiatives. Although their goals showed substantial overlap, the measures applied showed few similarities between PM initiatives and were predominantly selected based on local priority areas and data availability. Most PM initiatives do not measure the full scope of the Triple Aim. Additionally, variety between measures limits comparability between PM initiatives. Consensus on the coverage of Triple Aim domains and a set of standardized measures could further both the inclusion of the various domains as well as the comparability between PM initiatives. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. Identifying criteria for multimodel software process improvement solutions : based on a review of current problems and initiatives

    NARCIS (Netherlands)

    Kelemen, Z.D.; Kusters, R.J.; Trienekens, J.J.M.

    2012-01-01

    In this article, we analyze current initiatives in multimodel software process improvement and identify criteria for multimodel solutions. With multimodel, we mean the simultaneous usage of more than one quality approach (e.g. standards, methods, techniques to improve software processes). This paper

  2. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  3. Strengthening Multipayer Collaboration: Lessons From the Comprehensive Primary Care Initiative.

    Science.gov (United States)

    Anglin, Grace; Tu, H A; Liao, Kristie; Sessums, Laura; Taylor, Erin Fries

    2017-09-01

    Policy Points: Collaboration across payers to align financial incentives, quality measurement, and data feedback to support practice transformation is critical, but challenging due to competitive market dynamics and competing institutional priorities. The Centers for Medicare & Medicaid Services or other entities convening multipayer initiatives can build trust with other participants by clearly outlining each participant's role and the parameters of collaboration at the outset of the initiative. Multipayer collaboration can be improved if participating payers employ neutral, proactive meeting facilitators; develop formal decision-making processes; seek input on decisions from practice representatives; and champion the initiative within their organizations. With increasing frequency, public and private payers are joining forces to align goals and resources for primary care transformation. However, sustaining engagement and achieving coordination among payers can be challenging. The Comprehensive Primary Care (CPC) initiative is one of the largest multipayer initiatives ever tested. Drawing on the experience of the CPC initiative, this paper examines the factors that influence the effectiveness of multipayer collaboration. This paper draws largely on semistructured interviews with CPC-participating payers and payer conveners that facilitated CPC discussions and on observation of payer meetings. We coded and analyzed these qualitative data to describe collaborative dynamics and outcomes and assess the factors influencing them. We found that several factors appeared to increase the likelihood of successful payer collaboration: contracting with effective, neutral payer conveners; leveraging the support of payer champions, and seeking input on decisions from practice representatives. The presence of these factors helped some CPC regions overcome significant initial barriers to achieve common goals. We also found that leadership from the Centers for Medicare & Medicaid

  4. Did our current initial treatment practice change after EAU/ESPU vesicoureteral reflux risk grouping?

    Science.gov (United States)

    Tokat, Eda; Gurocak, Serhat; Ure, Iyimser; Acar, Cenk; Sınık, Zafer; Tan, Mustafa Ozgur

    2018-06-02

    The "European Association of Urology (EAU) Guidelines on Vesicoureteral Reflux (VUR) in Children (September 2012)" established a risk classification by analyzing and defining risk factors for each patient. In this study we aimed to investigate how our initial treatment procedures were affected by the EAU/ESPU guideline vesicoureteral reflux risk grouping and to compare the early clinical results of treatments performed before and after the risk classification in our patients with VUR. 334 renal units with regular clinical follow-up treated for VUR between 2009 and 2017 were retrospectively reviewed. Preoperative clinical parameters such as grade and laterality of reflux, presence of renal scar, initial and follow-up treatments, and findings of medical treatment and surgical procedures were analyzed. The initial medical and surgical methods were compared by categorizing patients according to risk groups before and after 2013. Mean age and follow-up duration were 71.4 (6-216) months and 47 (4-141) months, respectively. Among the preoperative parameters, only high EAU risk group (p = 0.01) and treating lower urinary tract symptoms (p …) were significant, whereas age, sex, and presence of renal scar at DMSA did not affect the success of treatment significantly. While no significant difference in medical and surgical treatment rates was observed after the risk grouping system in the low risk group, the percentage of patients treated with surgical methods initially was significantly decreased in the moderate and high risk groups (p = 0.002 and p = 0.012, respectively). We determined that VUR risk grouping did not change clinical success significantly in all risk groups. Although the EAU/ESPU VUR risk classification changed our current practice in terms of initial treatment method, this different approach did not seem to affect early clinical success positively. There is still an absolute need for studies with larger sample size and long ...

  5. Risk factors associated with recurrent hemorrhage after the initial improvement of colonic diverticular bleeding.

    Science.gov (United States)

    Nishikawa, Hiroki; Maruo, Takanori; Tsumura, Takehiko; Sekikawa, Akira; Kanesaka, Takashi; Osaki, Yukio

    2013-03-01

    We elucidated risk factors contributing to recurrent hemorrhage after initial improvement of colonic diverticular bleeding. 172 consecutive hospitalized patients diagnosed with colonic diverticular bleeding were analyzed. Recurrent hemorrhage after initial improvement of colonic diverticular bleeding was the main outcome measure. We analyzed factors contributing to recurrent hemorrhage risk in univariate and multivariate analyses. The length of the observation period after improvement of colonic diverticular bleeding was 26.4 ± 14.6 months (range, 1-79 months). The cumulative recurrent hemorrhage rate in all patients at 1 and 2 years was 34.8% and 41.8%, respectively. By univariate analysis, age > 70 years (P = 0.021), BMI > 25 kg/m2 (P = 0.013), the use of anticoagulant drugs (P = 0.034), the use of NSAIDs (P = 0.040), history of hypertension (P = 0.011), history of smoking (P = 0.030) and serum creatinine level > 1.5 mg/dL (P …) were associated with recurrent bleeding. By multivariate analysis, age > 70 years (hazard ratio (HR), 1.905; 95% confidence interval (CI), 1.067-3.403; P = 0.029), history of hypertension (HR, 0.493; 95% CI, 0.245-0.993; P = 0.048) and serum creatinine level > 1.5 mg/dL (HR, …; 95% CI, 0.288-0.964; P = 0.044) were shown to be significant independent risk factors. Close observation after the initial improvement of colonic diverticular bleeding is needed, especially in elderly patients or patients with a history of hypertension or renal deficiency.

  6. Axi-symmetric generalized thermoelastic diffusion problem with two-temperature and initial stress under fractional order heat conduction

    International Nuclear Information System (INIS)

    Deswal, Sunita; Kalkal, Kapil Kumar; Sheoran, Sandeep Singh

    2016-01-01

    A mathematical model of fractional order two-temperature generalized thermoelasticity with diffusion and initial stress is proposed to analyze the transient wave phenomenon in an infinite thermoelastic half-space. The governing equations are derived in cylindrical coordinates for a two dimensional axi-symmetric problem. The analytical solution is procured by employing the Laplace and Hankel transforms for time and space variables respectively. The solutions are investigated in detail for a time dependent heat source. By using numerical inversion method of integral transforms, we obtain the solutions for displacement, stress, temperature and diffusion fields in physical domain. Computations are carried out for copper material and displayed graphically. The effect of fractional order parameter, two-temperature parameter, diffusion, initial stress and time on the different thermoelastic and diffusion fields is analyzed on the basis of analytical and numerical results. Some special cases have also been deduced from the present investigation.
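
    For reference, the transforms mentioned are, in their usual form, a Laplace transform over time and a Hankel transform over the radial coordinate (the specific orders applied to each field variable depend on the problem and are not reproduced here):

```latex
% Usual definitions of the Laplace transform in time t and the Hankel transform
% of order n in the radial coordinate r, as applied to a field f(r,t):
\[
  \bar{f}(r,s) = \int_{0}^{\infty} f(r,t)\, e^{-st}\, dt, \qquad
  \tilde{\bar{f}}(\xi,s) = \int_{0}^{\infty} r\, \bar{f}(r,s)\, J_n(\xi r)\, dr,
\]
% with the corresponding inverse Hankel transform
\[
  \bar{f}(r,s) = \int_{0}^{\infty} \xi\, \tilde{\bar{f}}(\xi,s)\, J_n(\xi r)\, d\xi .
\]
```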

  7. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.; )

    2006-01-01

    The spectra analyzer is intended for dynamic spectral analysis of signals from physical installations and for noise filtering. A recurrence Fourier transformation algorithm is used in the digital dynamic analyzer. It is realized on the basis of a fast-logic FPGA matrix and a dedicated ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The functional speed is 20 ns [ru]
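
    The "recurrence Fourier transformation" referred to here is commonly implemented as a sliding DFT, in which each of the N spectral bins is updated in constant time as every new sample arrives. The sketch below is an illustrative software model only (not the FPGA/ADSP firmware); the 512-bin size mirrors the figure quoted above:

```python
# Illustrative model of a recurrence ("sliding") DFT: when a new sample enters
# and the oldest sample leaves an N-point window, each spectral bin X_k is
# updated in O(1) instead of recomputing a full N-point transform.
import cmath
import math

def sliding_dft(samples, n_bins=512):
    window = [0.0] * n_bins
    bins = [0j] * n_bins
    twiddle = [cmath.exp(2j * cmath.pi * k / n_bins) for k in range(n_bins)]
    for x in samples:
        oldest = window.pop(0)
        window.append(x)
        # recurrence: X_k <- (X_k + x_new - x_old) * e^{j 2 pi k / N}
        bins = [(bins[k] + x - oldest) * twiddle[k] for k in range(n_bins)]
        yield bins                        # current spectrum after each sample

# example: a tone at 37 cycles per 512-sample window
signal = [math.sin(2 * math.pi * 37 * n / 512) for n in range(2048)]
for spectrum in sliding_dft(signal):
    pass
print("magnitude of bin 37:", round(abs(spectrum[37]), 1))
```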

  8. A new automatic analyzer for uranium determination

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyan; Zhang Lan

    1992-08-01

    An intelligent automatic analyzer for uranium based on the principle of flow injection analysis (FIA) has been developed. It can directly determine uranium solutions in the range of 0.02 to 500 mg/L without any pre-processing. A chromatographic column packed with extractant, in which the trace uranium is concentrated and separated and which has a special ability to enrich uranium, is connected to the manifold of the analyzer. The analyzer is suited for trace uranium determination in various samples. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) is used as the color reagent. Uranium is determined in aqueous solution by adding a cationic surfactant, cetyl-pyridinium bromide (PCB). The rate of analysis is 30 to 90 samples per hour. The relative standard deviation of determination is 1% ∼ 2%. The analyzer has been used in factories and laboratories, and the results are satisfactory. The determination range can easily be changed by using a multi-function auto-injection valve that changes the injection volume of the sample and the channels. Thus, it can adopt various FIA operation modes to meet the needs of FIA determination of other substances. The analyzer has universal functions.

  9. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pacemaker generator function analyzer. 870.3630... (CONTINUED) MEDICAL DEVICES CARDIOVASCULAR DEVICES Cardiovascular Prosthetic Devices § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is...

  10. The comparison of automated urine analyzers with manual microscopic examination for urinalysis automated urine analyzers and manual urinalysis.

    Science.gov (United States)

    İnce, Fatma Demet; Ellidağ, Hamit Yaşar; Koseoğlu, Mehmet; Şimşek, Neşe; Yalçın, Hülya; Zengin, Mustafa Osman

    2016-08-01

    Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. 209 urine samples were analyzed by the Iris iQ200 ELITE (İris Diagnostics, USA), Dirui FUS-200 (DIRUI Industrial Co., China) automatic urine sediment analyzers and by manual microscopic examination. The degree of concordance (Kappa coefficient) and the rates within the same grading were evaluated. For erythrocytes, leukocytes, epithelial cells, bacteria, crystals and yeasts, the degree of concordance between the two instruments was better than the degree of concordance between the manual microscopic method and the individual devices. There was no concordance between all methods for casts. The results from the automated analyzers for erythrocytes, leukocytes and epithelial cells were similar to the result of microscopic examination. However, in order to avoid any error or uncertainty, some images (particularly: dysmorphic cells, bacteria, yeasts, casts and crystals) have to be analyzed by manual microscopic examination by trained staff. Therefore, the software programs which are used in automatic urine sediment analysers need further development to recognize urinary shaped elements more accurately. Automated systems are important in terms of time saving and standardization.

  11. Thermal study of sintered (Th-U)O2 MOX pellet by a commercial thermo-gravimetric analyzer coupled with an evolved gas analyzer

    International Nuclear Information System (INIS)

    Mahanty, B.N.; Khan, F.A.; Karande, A.; Prakash, A.; Afzal, Md.; Panakkal, J.P.; Kamath, H.S.

    2010-01-01

    Full text: Fabrication of (Th-U)O2 MOX pellets by the impregnation agglomerate pelletization (IAP) process is being explored in the Advanced Fuel Fabrication Facility, BARC, Tarapur for the forthcoming Advanced Heavy Water Reactor (AHWR). High temperature thermal study of this fuel is important in order to understand the behaviour of the fuel at the operational temperature of the reactor. In this study, fabrication of ThO2-3%UO2 was carried out by the impregnation agglomerate pelletization process, and the pellets were subsequently sintered in a reducing or air atmosphere. The degassed pellets were broken into small pieces and subjected to high temperature (1050 deg C-1250 deg C) heating under high purity argon gas in a commercial thermal analyzer. The evolved gases were then qualitatively analyzed by a quadrupole mass analyzer. The pellet sintered in reducing atmosphere (IAP-R) shows an increase in weight after the analysis, whereas the pellet sintered in oxidizing atmosphere (IAP-O) shows a decrease in final weight. The IAP-R pellet may become slightly hyper-stoichiometric on heating due to the presence of a small amount of oxygen in the high purity argon gas. This is further supported by the mass spectrum at m/z 32 (O2+), which shows a decrease in signal intensity as the temperature of analysis increases. The sharp decrease of the signal intensity at m/z 32 (O2+) starting at 920 deg C may be attributed to the formation of SO2 (m/z = 64) and CO2 (m/z = 44) gases. On the other hand, the IAP-O pellet, being hyper-stoichiometric initially, may lose weight by forming water on reaction of the excess oxygen with the small amount of hydrogen present in the high purity argon gas. This is supported by the appearance of a small peak at m/z 18 (H2O+) in the mass spectrum. The formation of SO2 and CO2 gases started at a higher temperature in the case of the IAP-O pellet compared to the IAP-R pellet. This may be due to the higher density achieved in case of

  12. Complete motif analysis of sequence requirements for translation initiation at non-AUG start codons.

    Science.gov (United States)

    Diaz de Arce, Alexander J; Noderer, William L; Wang, Clifford L

    2018-01-25

    The initiation of mRNA translation from start codons other than AUG was previously believed to be rare and of relatively low impact. More recently, evidence has suggested that as much as half of all translation initiation utilizes non-AUG start codons, codons that deviate from AUG by a single base. Furthermore, non-AUG start codons have been shown to be involved in regulation of expression and disease etiology. Yet the ability to gauge expression based on the sequence of a translation initiation site (start codon and its flanking bases) has been limited. Here we have performed a comprehensive analysis of translation initiation sites that utilize non-AUG start codons. By combining genetic-reporter, cell-sorting, and high-throughput sequencing technologies, we have analyzed the expression associated with all possible variants of the -4 to +4 positions of non-AUG translation initiation site motifs. This complete motif analysis revealed that 1) with the right sequence context, certain non-AUG start codons can generate expression comparable to that of AUG start codons, 2) sequence context affects each non-AUG start codon differently, and 3) initiation at non-AUG start codons is highly sensitive to changes in the flanking sequences. Complete motif analysis has the potential to be a key tool for experimental and diagnostic genomics. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
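
    As a sketch of the combinatorial space involved (hypothetical helper code, not the authors' pipeline; it assumes the start codon occupies positions +1 to +3, so the motif comprises four upstream bases, the codon, and one downstream base):

```python
# Sketch of the motif space analyzed: every codon one base away from AUG
# ("near-cognate" start codons) combined with all possible -4..-1 and +4
# flanking bases of a translation initiation site.
from itertools import product

BASES = "ACGU"

def near_aug_codons():
    """All codons differing from AUG at exactly one position."""
    codons = set()
    for pos in range(3):
        for b in BASES:
            codon = list("AUG")
            if b != codon[pos]:
                codon[pos] = b
                codons.add("".join(codon))
    return sorted(codons)

def initiation_site_variants(codon):
    """All -4..+4 motifs: four upstream bases, the start codon, one downstream base."""
    for up in product(BASES, repeat=4):
        for down in BASES:
            yield "".join(up) + codon + down

codons = near_aug_codons()
n_variants = len(codons) * 4**4 * 4
print(len(codons), "near-AUG codons,", n_variants, "initiation-site motifs in total")
```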

  13. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.

  14. Characteristics of patients initiating raloxifene compared to those initiating bisphosphonates

    Directory of Open Access Journals (Sweden)

    Wang Sara

    2008-12-01

    Full Text Available Abstract Background Both raloxifene and bisphosphonates are indicated for the prevention and treatment of postmenopausal osteoporosis, however these medications have different efficacy and safety profiles. It is plausible that physicians would prescribe these agents to optimize the benefit/risk profile for individual patients. The objective of this study was to compare demographic and clinical characteristics of patients initiating raloxifene with those of patients initiating bisphosphonates for the prevention and treatment of osteoporosis. Methods This study was conducted using a retrospective cohort design. Female beneficiaries (45 years and older) with at least one claim for raloxifene or a bisphosphonate in 2003 through 2005 and continuous enrollment in the previous 12 months and subsequent 6 months were identified using a collection of large national commercial, Medicare supplemental, and Medicaid administrative claims databases (MarketScan®). Patients were divided into two cohorts, a combined commercial/Medicare cohort and a Medicaid cohort. Within each cohort, characteristics (demographic, clinical, and resource utilization) of patients initiating raloxifene were compared to those of patients initiating bisphosphonate therapy. Group comparisons were made using chi-square tests for proportions of categorical measures and Wilcoxon rank-sum tests for continuous variables. Logistic regression was used to simultaneously examine factors independently associated with initiation of raloxifene versus a bisphosphonate. Results Within both the commercial/Medicare and Medicaid cohorts, raloxifene patients were younger, had fewer comorbid conditions, and fewer pre-existing fractures than bisphosphonate patients. Raloxifene patients in both cohorts were less likely to have had a bone mineral density (BMD) screening in the previous year than were bisphosphonate patients, and were also more likely to have used estrogen or estrogen/progestin therapy in the

  15. The Influence of the National truth campaign on smoking initiation.

    Science.gov (United States)

    Farrelly, Matthew C; Nonnemaker, James; Davis, Kevin C; Hussin, Altijani

    2009-05-01

    States and national organizations spend millions annually on antismoking campaigns aimed at youth. Much of the evidence for their effectiveness is based on cross-sectional studies. This study was designed to evaluate the effectiveness of a prominent national youth smoking-prevention campaign in the U.S. known as truth that was launched in February 2000. A nationally representative cohort of 8904 adolescents aged 12-17 years who were interviewed annually from 1997 to 2004 was analyzed in 2008. A quasi-experimental design was used to relate changes in smoking initiation to variable levels of exposure to antismoking messages over time and across 210 media markets in the U.S. A discrete-time hazard model was used to quantify the influence of media market delivery of TV commercials on smoking initiation, controlling for confounding influences. Based on the results of the hazard model, the number of youth nationally who were prevented from smoking from 2000 through 2004 was estimated. Exposure to the truth campaign is associated with a decreased risk of smoking initiation (relative risk=0.80, p=0.001). Through 2004, approximately 450,000 adolescents were prevented from trying smoking nationwide. Factors negatively associated with initiation include African-American race (relative risk=0.44, p<0.001), Hispanic ethnicity (relative risk=0.74, p<0.001), completing high school (relative risk=0.69, p<0.001), and living with both parents at baseline (OR=0.79, p<0.001). The current study strengthens the available evidence for antismoking campaigns as a viable strategy for preventing youth smoking.

  16. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrous oxide gas analyzer. 868.1700 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide...

  17. METHODOLOGICAL BACKGROUND OF EXPERT ESTIMATION OF INITIAL DATA COMPLETENESS AND QUALITY ACCORDING TO THE CERTIFIED INFORMATION SECURITY SYSTEM

    Directory of Open Access Journals (Sweden)

    V. K. Fisenko

    2015-01-01

    Full Text Available The problem of certification of information security systems is analyzed and the tasks of initial data analysis are set out. The objectives, indices and decision-making criteria, as well as the challenges to be addressed, are formulated. It is shown that, in order to improve quality and reduce the time and cost of preparation for certification, it is reasonable to use a software system for automating the analysis of the initial data presented by the owner of the information system.

  18. Optimal Error Estimates of Two Mixed Finite Element Methods for Parabolic Integro-Differential Equations with Nonsmooth Initial Data

    KAUST Repository

    Goswami, Deepjyoti; Pani, Amiya K.; Yadav, Sangita

    2013-01-01

    In the first part of this article, a new mixed method is proposed and analyzed for parabolic integro-differential equations (PIDE) with nonsmooth initial data. Compared to the standard mixed method for PIDE, the present method does not bank on a
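
    A representative model problem of the kind treated by such methods (the form is assumed for illustration and not quoted from the article) is:

```latex
% A representative parabolic integro-differential model problem: find u(t) with
\[
  u_t + A(t)\,u = \int_0^t B(t,s)\,u(s)\,ds + f(t), \qquad u(0) = u_0,
\]
% where A(t) is a second-order elliptic operator, B(t,s) a second-order operator
% with smooth coefficients, and u_0 the (possibly nonsmooth) initial datum.
```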

  19. The Organization of Knowledge in a Multi-Lingual, Integrated Parser.

    Science.gov (United States)

    1984-11-01

    ... of Israeli rule. Spanish sample: "Dos alcaldes que simpatizan con la Organización para la Liberación de Palestina fueron mutilados durante las explosiones" ("Two mayors who sympathize with the Palestine Liberation Organization were maimed during the explosions"). Parser trace: NAME ISRAEL ... Total time: 97618 asecs. NIL. Translation: "Explosions in 4 cities of the West Bank ... 2 Arab mayors today during ... the worst ... Israel." German sample: [garbled in source]

  20. Biomechanical aspects of initial intraosseous stability and implant design: a quantitative micro-morphometric analysis.

    Science.gov (United States)

    Akça, Kivanç; Chang, Ting-Ling; Tekdemir, Ibrahim; Fanuscu, Mete I

    2006-08-01

    The objective of this biomechanical study was to explore the effect of bone micro-morphology on the initial intraosseous stability of implants with different designs. Straumann and Astra Tech dental implants were placed into anterior and posterior regions of the completely edentulous maxilla and mandible of a human cadaver. Experiments were undertaken to quantify initial implant stability and bone micro-morphology. Installation torque values (ITVs) and implant stability quotients (ISQs) were measured to determine initial intraosseous implant stability. For quantification of relative bone volume and micro-architecture, sectioned implant-bone and bone core specimens of each implant placement site were consecutively scanned and trabecular bone was analyzed in a micro-computed tomography (micro-CT) unit. Experimental outcomes were evaluated for correlations among implant designs, initial intraosseous implant stability and bone micro-structural parameters. ITVs correlated more strongly with bone volume fraction (BV/TV) than ISQs did, at the 88.1% and 68.9% levels, respectively. Correlations between ITVs and micro-morphometric parameters were significant at the 95% confidence level (P < 0.05), whereas the implant designs used were not significant at the 95% confidence level (P > 0.05). Bone micro-morphology has a prevailing effect over implant design on initial intraosseous implant stability, and ITV is more sensitive in terms of revealing biomechanical properties at the bone-implant interface in comparison with ISQ.

  1. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps necessary for creating an 'analyzing instrument', based on an open source software package called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
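
    As a hedged sketch of one possible preprocessing step (the log format, column names, and attribute choice are assumptions, not taken from the article), a Windows-style event log exported as CSV can be converted to an ARFF file that Weka loads directly:

```python
# Hedged sketch: convert lines of a Windows-style system log (assumed CSV export
# with columns Time, Source, Level) into an ARFF file for Weka experiments.
import csv

ARFF_HEADER = """@relation system_log
@attribute hour numeric
@attribute source string
@attribute level {Information,Warning,Error}
@data
"""

def log_to_arff(log_path, arff_path):
    with open(log_path, newline="", encoding="utf-8") as src, \
         open(arff_path, "w", encoding="utf-8") as dst:
        dst.write(ARFF_HEADER)
        for row in csv.DictReader(src):
            # e.g. "2008-01-15 13:42:07" -> hour "13" (assumed timestamp layout)
            hour = row["Time"].split(":")[0].split(" ")[-1]
            dst.write(f'{hour},"{row["Source"]}",{row["Level"]}\n')

# log_to_arff("system_events.csv", "system_events.arff")  # then open the ARFF in Weka
```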

  2. The comparison of automated urine analyzers with manual microscopic examination for urinalysis automated urine analyzers and manual urinalysis

    Directory of Open Access Journals (Sweden)

    Fatma Demet İnce

    2016-08-01

    Full Text Available Objectives: Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. Design and methods: 209 urine samples were analyzed by the Iris iQ200 ELITE (İris Diagnostics, USA), Dirui FUS-200 (DIRUI Industrial Co., China) automatic urine sediment analyzers and by manual microscopic examination. The degree of concordance (Kappa coefficient) and the rates within the same grading were evaluated. Results: For erythrocytes, leukocytes, epithelial cells, bacteria, crystals and yeasts, the degree of concordance between the two instruments was better than the degree of concordance between the manual microscopic method and the individual devices. There was no concordance between all methods for casts. Conclusion: The results from the automated analyzers for erythrocytes, leukocytes and epithelial cells were similar to the results of microscopic examination. However, in order to avoid any error or uncertainty, some images (particularly dysmorphic cells, bacteria, yeasts, casts and crystals) have to be analyzed by manual microscopic examination by trained staff. Therefore, the software programs which are used in automatic urine sediment analysers need further development to recognize urinary shaped elements more accurately. Automated systems are important in terms of time saving and standardization. Keywords: Urinalysis, Autoanalysis, Microscopy

  3. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.; Carlson, R.L.

    1987-01-01

    A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of its physical protection system. It utilizes the artificial intelligence capabilities available in the Prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the security analyzer is a tool that aids the protection planner as well as the protection operations staff.
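
    The original analyzer is written in Prolog; purely to illustrate the path-probing idea in a different language (the facility graph, detection probabilities, and threshold below are all hypothetical), a depth-first search can enumerate attack paths whose cumulative detection probability falls below a chosen criterion:

```python
# Rough illustration of the attack-path probing idea (the original tool is in
# Prolog; this sketch and the facility data are hypothetical). Each edge carries
# a detection probability; paths from outside to the target whose overall
# detection probability is below the chosen criterion are reported as weaknesses.
FACILITY = {                       # edge: (next_area, detection_probability)
    "offsite":    [("fence", 0.2), ("gate", 0.9)],
    "fence":      [("yard", 0.3)],
    "gate":       [("yard", 0.6)],
    "yard":       [("vault door", 0.5)],
    "vault door": [("vault", 0.4)],
}

def weak_paths(graph, start, target, max_detection, path=None, p_miss=1.0):
    path = (path or []) + [start]
    if start == target:
        detection = 1.0 - p_miss
        if detection < max_detection:
            yield path, detection
        return
    for nxt, p_detect in graph.get(start, []):
        if nxt not in path:        # no revisiting areas
            yield from weak_paths(graph, nxt, target, max_detection,
                                  path, p_miss * (1.0 - p_detect))

for route, p in weak_paths(FACILITY, "offsite", "vault", max_detection=0.95):
    print(" -> ".join(route), f"(cumulative detection {p:.2f})")
```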

  4. Sustainable Agricultural Marketing Initiatives

    Directory of Open Access Journals (Sweden)

    Hakan Adanacıoğlu

    2015-07-01

    Full Text Available Sustainable marketing is a holistic approach that puts equal emphasis on environmental, social equity, and economic concerns in the development of marketing strategies. The purpose of the study is to examine and discuss the sustainable agricultural marketing initiatives practiced throughout the world and in Turkey, and to put forth suggestions to further improve the performance of agricultural marketing initiatives in Turkey. Some of the sustainable agricultural marketing initiatives practiced around the world are carried out through civil organizations. Furthermore, some of these initiatives have also been launched by farmers, consumers, food processors and retailers. Long-term strategies to increase these initiatives should be determined, because successful examples of sustainable agricultural marketing initiatives in Turkey are scarce and have not spread widely. In this context, first of all, the support provided by the government to improve agricultural marketing systems, such as EU funds for rural development, should be compatible with the goals of sustainable marketing. For this purpose, it should be examined whether all proposed projects related to agricultural marketing meet the social, economic, and environmental principles of sustainable marketing. It is important that supporting organizations, especially civil society organisations, take an active role for faster dissemination and adoption of sustainable agricultural marketing practices in Turkey. These organizations may provide technical assistance in preparing successful project proposals and training to farm groups. In addition, other organizations, such as local administrations, producers' associations and cooperatives, can contribute to the success of sustainable agricultural marketing initiatives. The use of direct marketing strategies and vertical integration attempts in sustainable agricultural marketing initiatives that will likely be implemented in Turkey is

  5. Community Rates of Breastfeeding Initiation.

    Science.gov (United States)

    Grubesic, Tony H; Durbin, Kelly M

    2016-11-01

    Breastfeeding initiation rates vary considerably across racial and ethnic groups, maternal age, and education level, yet there are limited data concerning the influence of geography on community rates of breastfeeding initiation. This study aimed to describe how community rates of breastfeeding initiation vary in geographic space, highlighting "hot spots" and "cool spots" of initiation and exploring the potential connections between race, socioeconomic status, and urbanization levels on these patterns. Birth certificate data from the Kentucky Department of Health for 2004-2010 were combined with county-level geographic base files, Census 2010 demographic and socioeconomic data, and Rural-Urban Continuum Codes to conduct a spatial statistical analysis of community rates of breastfeeding initiation. Between 2004 and 2010, the average rate of breastfeeding initiation for Kentucky increased from 43.84% to 49.22%. Simultaneously, the number of counties identified as breastfeeding initiation hot spots also increased, displaying a systematic geographic pattern in doing so. Cool spots of breastfeeding initiation persisted in rural, Appalachian Kentucky. Spatial regression results suggested that unemployment, income, race, education, location, and the availability of International Board Certified Lactation Consultants are connected to breastfeeding initiation. Not only do spatial analytics facilitate the identification of breastfeeding initiation hot spots and cool spots, but they can be used to better understand the landscape of breastfeeding initiation and help target breastfeeding education and/or support efforts.

  6. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system that includes hardware and software capabilities, and operates from the acquisition of the 4 head position signals of a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. The system includes analog processing of the position signals from the camera, digitization and subsequent processing of the energy signal in a multichannel analyzer, sending of data to a computer via a standard USB port, and processing of the data in a personal computer to obtain the final histogram. The circuits are composed of an analog processing board and a universal kit with a microcontroller and programmable gate array. (Author)
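
    As a hedged software model of the acquisition chain (hypothetical signal values and channel scale, not the instrument's firmware), summing the four head position signals per event and histogramming the sums reproduces the multichannel energy spectrum described above:

```python
# Hedged sketch: in an Anger-type gamma camera the four position signals are
# summed to obtain the event energy; accumulating those sums into channels
# yields the multichannel-analyzer spectrum. All numbers here are simulated.
import random

N_CHANNELS = 1024
KEV_PER_CHANNEL = 1.0

def acquire_spectrum(n_events=100_000):
    spectrum = [0] * N_CHANNELS
    for _ in range(n_events):
        # simulated 140 keV photopeak (e.g. Tc-99m) with detector resolution
        energy = random.gauss(140.0, 6.0)
        # split the light among the four position channels (arbitrary split)
        w = [random.random() for _ in range(4)]
        signals = [energy * wi / sum(w) for wi in w]
        total = sum(signals)                   # Anger sum = event energy
        channel = int(total / KEV_PER_CHANNEL)
        if 0 <= channel < N_CHANNELS:
            spectrum[channel] += 1
    return spectrum

spec = acquire_spectrum()
print("peak channel:", spec.index(max(spec)))
```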

  7. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  8. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites. PhosphoSiteAnalyzer provides an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and then applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net .

  9. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  10. Initiatives and outcomes of green supply chain management implementation by Chinese manufacturers.

    Science.gov (United States)

    Zhu, Qinghua; Sarkis, Joseph; Lai, Kee-hung

    2007-10-01

    This paper aims to explore the green supply chain management (GSCM) initiatives (implementation) of various manufacturing industrial sectors in China and examine the links between GSCM initiatives and performance outcomes. We conducted a survey to collect data from four typical manufacturing industrial sectors in China, namely, power generating, chemical/petroleum, electrical/electronic and automobile, and received 171 valid organizational responses for data analysis. Analysis of variance (ANOVA) was used to analyze the data. The results are consistent with our prediction that the different manufacturing industry types display different levels of GSCM implementation and outcomes. We specifically found that the electrical/electronic industry has relatively higher levels of GSCM implementation and achieves better performance outcomes than the other three manufacturer types. Implications of the results are discussed and suggestions for further research on the implementation of GSCM are offered.
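
    The group comparison described here (one-way ANOVA across the four industry types) can be reproduced in a few lines. The sketch below uses made-up Likert-style GSCM implementation scores, not the survey data, and is only meant to show the mechanics of the test.

```python
from scipy import stats

# Illustrative GSCM-implementation scores (e.g., 1-5 Likert means) for
# respondents grouped by industry; the numbers are invented.
power      = [3.1, 2.8, 3.4, 2.9, 3.0]
chemical   = [3.3, 3.0, 3.6, 3.2, 3.1]
electronic = [4.0, 4.2, 3.8, 4.1, 3.9]
automobile = [3.2, 3.5, 3.1, 3.4, 3.3]

# One-way ANOVA: do mean implementation levels differ across the four sectors?
f_stat, p_value = stats.f_oneway(power, chemical, electronic, automobile)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```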

  11. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-off between gaining significantly greater computational speed and central memory and losing performance to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator.

  12. Diffusion-weighted MR imaging of neuro-Behcet's disease: initial and follow-up studies

    International Nuclear Information System (INIS)

    Heo, Suk Hee; Seo, Jeong Jin; Kim, Heung Joong; Chang, Nam Gyu; Shin, Sang Soo; Jeong, Yong Yeon; Jeong Gwang Woo; Kang, Heoung Keun

    2005-01-01

    To assess the usefulness of diffusion-weighted MR imaging (DWI) and the apparent diffusion coefficient (ADC) in the initial and follow-up studies of patients with neuro-Behcet's disease. Six patients diagnosed with neuro-Behcet's disease were the subjects of this study. Initial and follow-up MR imaging was obtained in all six patients. Initial and follow-up DWI were also obtained in four of the six patients, with only an initial DWI in the other two. The DWI were obtained using multi-shot echo planar imaging on a 1.5T MR unit, with two gradient steps (b values of 0 and 1000 s/mm2). The ADC values and ADC maps were obtained using commercial software. The locations and signal intensities of the lesions were analyzed on conventional MRI and DWI, respectively. The ADC values of the lesions were calculated on the initial and follow-up DWI and compared with those of the normal contralateral regions. The initial DWI showed iso-signal intensities in four of the six patients, with high signal intensities in the other two. In five of the six patients, including three of the four that showed iso-signal intensities and the two that showed high signal intensities on the initial DWI, the ADC values of the involved lesions were higher than those of the normal contralateral regions. In three of the four that showed iso-signal intensities, the ADC values of the lesions were decreased and normalized on the follow-up DWI. Obtaining DWI and ADC values in patients with neuro-Behcet's disease may be helpful in understanding the pathophysiology and in the differential diagnosis of this disease.
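
    With two b values (0 and 1000 s/mm2), the ADC of a region follows from the mono-exponential diffusion signal model, ADC = ln(S_b0 / S_b1) / (b1 - b0). A minimal sketch of that computation, using made-up signal intensities rather than values from the study:

```python
import numpy as np

def adc_two_point(s_b0, s_b1, b0=0.0, b1=1000.0):
    """Two-point ADC estimate (mm^2/s) from signal intensities at b0 and b1 (s/mm^2)."""
    s_b0 = np.asarray(s_b0, dtype=float)
    s_b1 = np.asarray(s_b1, dtype=float)
    return np.log(s_b0 / s_b1) / (b1 - b0)

# illustrative region-of-interest values: lesion vs. contralateral normal tissue
lesion_adc = adc_two_point(s_b0=900.0, s_b1=380.0)   # ~0.86e-3 mm^2/s
normal_adc = adc_two_point(s_b0=850.0, s_b1=420.0)   # ~0.70e-3 mm^2/s
print(f"lesion ADC = {lesion_adc:.2e} mm^2/s")
print(f"normal ADC = {normal_adc:.2e} mm^2/s")
```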

  13. Analyzing the effects of Energy Action Plans on electricity consumption in Covenant of Mayors signatory municipalities in Andalusia

    International Nuclear Information System (INIS)

    Pablo-Romero, María del P.; Pozo-Barajas, Rafael; Sánchez-Braza, Antonio

    2016-01-01

    The Covenant of Mayors (CM) is an initiative by which towns, cities and regions voluntarily commit to reduce their CO2 emissions beyond the European Union climate targets, through policies promoting energy saving and renewable energy. The aim of this paper is to analyze whether joining the CM is reducing municipalities' electricity consumption, and therefore their emissions. For this purpose, the evolution of total, household and public administration electricity consumption from 2001 to 2012 is analyzed using panel data econometric techniques. The analysis covers municipalities in Andalusia, the Spanish region with the most signatories. The results show that the CM is having a positive effect on electricity consumption reductions, since municipalities show greater rates of reduction in electricity consumption after signing the CM. It may therefore be considered appropriate to promote policies that give municipalities incentives to join the CM and develop their action plans, as this can reduce their electricity consumption. - Highlights: • We analyze whether joining the CM is reducing municipalities' electricity consumption. • Results show a positive influence of the CM in reducing electricity consumption. • Promoting policies that incentivize joining the CM is appropriate.
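
    One common way to frame such an analysis is a two-way fixed-effects (within) panel regression of log consumption on a post-signing indicator. The sketch below applies the within transformation by hand on a small, entirely made-up balanced panel; it only illustrates the mechanics and is not the paper's actual specification.

```python
import numpy as np

# Illustrative panel: log electricity consumption for 4 municipalities over
# 2001-2012, with a dummy = 1 in the years after a municipality signed the
# Covenant of Mayors. All values are invented.
munis  = np.repeat(np.arange(4), 12)                      # 4 municipalities x 12 years
years  = np.tile(np.arange(2001, 2013), 4)
rng    = np.random.default_rng(1)
signed = ((munis < 2) & (years >= 2009)).astype(float)    # two signatories from 2009
y      = 5.0 + 0.02 * (years - 2001) - 0.04 * signed + rng.normal(0, 0.01, 48)

def within_demean(v, groups):
    """Subtract group means; for a balanced panel, demeaning by municipality
    and then by year gives the two-way within transformation."""
    out = v.astype(float).copy()
    for g in np.unique(groups):
        out[groups == g] -= out[groups == g].mean()
    return out

y_t = within_demean(within_demean(y, munis), years)
x_t = within_demean(within_demean(signed, munis), years)
beta = (x_t @ y_t) / (x_t @ x_t)
print(f"estimated effect of signing the CM on log consumption: {beta:.3f}")
```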

  14. Peritoneal dialysis technique success during the initial 90 days of therapy.

    Science.gov (United States)

    Guest, Steven; Hayes, Andrew C; Story, Kenneth; Davis, Ira D

    2012-01-01

    Comparisons of technique success by peritoneal dialysis (PD) modality have typically excluded the initial 90 days of therapy. We analyzed a database of 51,469 new PD starts from 2004 to 2008 in the United States. The analysis concentrated on the initial 90 days of therapy to determine technique success and the impact of the continuous ambulatory PD (CAPD) and automated PD (APD) modalities. Overall, 13.3% of patients stopped PD within 90 days. Of patients starting directly on APD, 14.3% stopped PD within 90 days. Of patients starting on CAPD, 12.6% stopped PD within 90 days, and 63.4% changed to APD within 90 days. Only 3.3% of the latter patients failed to reach 90 days of therapy. By comparison, technique failure occurred in 28.8% of those initiating with and remaining on CAPD. We conclude that initial training to perform CAPD, with timely transfer to APD within the first 3 months, was associated with the greatest technique success at 90 days. The reasons for that success are unclear, and further research should be directed to determining factors responsible. It is possible that patients trained initially to CAPD but converted to APD have a greater understanding of the total therapy, which improves confidence. Those converted to APD may be more appreciative of the lifestyle benefits of APD, which translates into improved compliance; alternatively, technical factors associated with APD may be responsible. Those technical factors may include improved catheter function in the recumbent position during APD or the reduced infection risk associated with just 2 connect/disconnect procedures in APD compared with 8 in CAPD.
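
    The headline comparison (90-day technique failure by starting modality) is a proportion comparison that can be checked with a simple contingency-table test. The counts in the sketch below are made up to mirror the reported percentages and are not the study's actual cell counts.

```python
from scipy.stats import chi2_contingency

# Illustrative 2x2 table: rows = starting modality, columns = [stopped PD
# within 90 days, continued past 90 days]. Counts are invented to roughly
# reproduce the reported 14.3% (APD) vs. 12.6% (CAPD) failure rates.
table = [[1430,  8570],    # direct APD starts
         [5200, 36069]]    # CAPD starts

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4g}")
```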

  15. Structure of a Complete Mediator-RNA Polymerase II Pre-Initiation Complex.

    Science.gov (United States)

    Robinson, Philip J; Trnka, Michael J; Bushnell, David A; Davis, Ralph E; Mattei, Pierre-Jean; Burlingame, Alma L; Kornberg, Roger D

    2016-09-08

    A complete, 52-protein, 2.5 million dalton, Mediator-RNA polymerase II pre-initiation complex (Med-PIC) was assembled and analyzed by cryo-electron microscopy and by chemical cross-linking and mass spectrometry. The resulting complete Med-PIC structure reveals two components of functional significance that were absent from previous structures: a protein kinase complex and the Mediator-activator interaction region. It thereby shows how the kinase and its target, the C-terminal domain of the polymerase, control Med-PIC interaction and transcription. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. A “New Deal” for the profession : Regulatory initiatives, changing knowledge conceptions and the Committee on Accounting Procedure

    NARCIS (Netherlands)

    Detzen, Dominic

    Purpose: The purpose of this paper is to analyze how “New Deal” regulatory initiatives, primarily the Securities Acts and the Securities and Exchange Commission (SEC), changed US auditors’ professional knowledge conception, culminating in the 1938 expansion of the Committee on Accounting Procedure

  17. Crystallographic investigation of grain selection during initial solidification

    International Nuclear Information System (INIS)

    Esaka, H; Shinozuka, K; Kataoka, Y

    2016-01-01

    Normally, a macroscopic solidified structure consists of chill, columnar and equiaxed zones. In the chill zone, many fine grains nucleate on the mold surface and grow along their own preferred growth directions. Only a few of them continue to grow, because of grain selection. In order to understand the grain selection process, a crystallographic investigation of the zone of initial solidification was carried out in this study. 10 g of Al-6 wt%Si alloy was melted at 850 °C and poured onto a thick copper plate. A longitudinal cross section of the solidified shell was observed by SEM and analyzed by EBSD. The EBSD mapping reveals that the crystallographic orientation was random in the region of initial solidification. Further, some grains are elongated along their <100> direction. Columnar grains, whose growth directions are almost parallel to the heat flow direction, develop via grain selection. Here, a dendrite whose growth direction is close to the heat flow direction overgrows dendrites whose growth directions are far from it. However, we sometimes observed that a dendrite with a large zenith angle overgrew the others. It can be deduced that the time of nucleation on the mold surface is not constant. (paper)

  18. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  19. N-glycans released from glycoproteins using a commercial kit and comprehensively analyzed with a hypothetical database

    Directory of Open Access Journals (Sweden)

    Xue Sun

    2017-04-01

    The glycosylation of proteins is responsible for their structural and functional roles in many cellular activities. This work describes a strategy that combines efficient release, labeling and liquid chromatography-mass spectral analysis with the use of a comprehensive database to analyze N-glycans. The analytical method described relies on a recently commercialized kit in which quick deglycosylation is followed by rapid labeling and cleanup of the labeled glycans. This greatly improves the separation, mass spectrometry (MS) analysis and fluorescence detection of N-glycans. A hypothetical database, constructed using GlycResoft, provides all compositional possibilities of N-glycans based on the common sugar residues found in N-glycans. In its initial version this database contains >8,700 N-glycans, is compatible with MS instrument software and can be expanded. N-glycans from four different well-studied glycoproteins were analyzed by this strategy. The results provided much more accurate and comprehensive data than had been previously reported. This strategy was then used to analyze the N-glycans present on the membrane glycoproteins of gastric carcinoma cells with different degrees of differentiation. Accurate and comprehensive N-glycan data from those cells were obtained efficiently, and their differences corresponding to the differentiation states were compared. Thus, the novel strategy developed greatly improves the accuracy, efficiency and comprehensiveness of N-glycan analysis.
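
    A compositional database of this kind can be generated by enumerating counts of the common monosaccharide residues and computing the resulting masses. The sketch below is a simplified illustration of that idea, not the GlycResoft-built database used in the study; the count limits and the single core-structure constraint are assumptions made only for the example.

```python
import itertools

# Monoisotopic residue masses (Da) of the common N-glycan building blocks.
RESIDUES = {"HexNAc": 203.0794, "Hex": 162.0528,
            "dHex": 146.0579, "NeuAc": 291.0954}
WATER = 18.0106  # added once per released, free-reducing-end glycan

def compositional_database(max_counts):
    """Enumerate hypothetical N-glycan compositions and their neutral masses.

    Only a crude biosynthetic constraint is applied here (the N-glycan core
    requires at least 2 HexNAc and 3 Hex); a real database applies far more
    detailed rules.
    """
    names = list(RESIDUES)
    ranges = [range(max_counts[n] + 1) for n in names]
    for counts in itertools.product(*ranges):
        comp = dict(zip(names, counts))
        if comp["HexNAc"] < 2 or comp["Hex"] < 3:
            continue
        mass = WATER + sum(RESIDUES[n] * c for n, c in comp.items())
        yield comp, mass

db = list(compositional_database({"HexNAc": 8, "Hex": 12, "dHex": 4, "NeuAc": 5}))
print(len(db), "candidate compositions")
print(db[0])
```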

  20. The basic research on the CDA initiation phase for a metallic fuel FBR

    International Nuclear Information System (INIS)

    Hirano, Go; Hirakawa, Naohiro; Kawada, Ken-ichi; Niwa, Hazime

    1998-03-01

    A metallic fuel with a novel design has recently received a great deal of interest as an advanced fuel option to substitute for MOX fuel; however, its behavior during transients has not been studied in many respects. Therefore, to show the basic tendencies of the behavior and the energy released during a CDA (core disruptive accident) in a metallic fuel FBR, and to prepare basic knowledge for considering the adoption of this advanced fuel, Tohoku University and the Power Reactor and Nuclear Fuel Development Corporation carried out a joint research project. (1) Target and results of the analysis: The accident initiator considered is a LOF accident with ATWS. The LOF analysis was performed for a metallic fuel 600 MWe homogeneous two-region core at the beginning of cycle, both for an ordinary metallic fuel core and for a metallic fuel core with ZrH pins. To apply the code to the analysis of a metallic fueled reactor, it was mainly necessary to change the constants of the input parameters; these changes were made by assuming appropriate models. Basic LOF cases and a total-blackout case that assumed the use of electromagnetic pumps were analyzed. The results show that the basic LOF cases for a metallic fuel core and all the cases for a metallic fuel core with ZrH pins avoid becoming prompt-critical and transfer mildly to the transient phase. (2) Improvement of the CDA initiation phase analysis code: At present, it is difficult for the code to handle the large material movement in the core during the transient. Therefore, the nuclear calculation model in the code was improved by using adiabatic space-dependent kinetics. The results of a sample case (a metallic fueled core at the beginning of cycle) show that this improvement is appropriate. (3) Conclusion: The behavior at CDA of a metallic fueled core of a fast reactor was analyzed using the CDA initiation phase analysis code, and knowledge of the important characteristics of the CDA initiation phase was obtained.

  1. Piezoelectrically Initiated Pyrotechnic Igniter

    Science.gov (United States)

    Quince, Asia; Dutton, Maureen; Hicks, Robert; Burnham, Karen

    2013-01-01

    This innovation consists of a pyrotechnic initiator and piezoelectric initiation system. The device will be capable of being initiated mechanically; resisting initiation by EMF, RF, and EMI (electromagnetic field, radio frequency, and electromagnetic interference, respectively); and initiating in water environments and space environments. Current devices of this nature are initiated by the mechanical action of a firing pin against a primer. Primers historically are prone to failure. These failures are commonly known as misfires or hang-fires. In many cases, the primer shows the dent where the firing pin struck the primer, but the primer failed to fire. In devices such as "T" handles, which are commonly used to initiate the blowout of canopies, loss of function of the device may result in loss of crew. In devices such as flares or smoke generators, failure can result in failure to spot a downed pilot. The piezoelectrically initiated ignition system consists of a pyrotechnic device that plugs into a mechanical system (activator), which on activation, generates a high-voltage spark. The activator, when released, will strike a stack of electrically linked piezo crystals, generating a high-voltage, low-amperage current that is then conducted to the pyro-initiator. Within the initiator, an electrode releases a spark that passes through a pyrotechnic first-fire mixture, causing it to combust. The combustion of the first-fire initiates a primary pyrotechnic or explosive powder. If used in a "T" handle, the primary would ramp the speed of burn up to the speed of sound, generating a shock wave that would cause a high explosive to go "high order." In a flare or smoke generator, the secondary would produce the heat necessary to ignite the pyrotechnic mixture. The piezo activator subsystem is redundant in that a second stack of crystals would be struck at the same time with the same activation force, doubling the probability of a first-strike spark generation. If the first stack fails to produce a spark, the second provides a redundant path to initiation.

  2. Scintiscans data analyzer model AS-10

    International Nuclear Information System (INIS)

    Malesa, J.; Wierzbicki, W.

    1975-01-01

    The operating principle and construction of a device for analyzing scintiscan data by ''square root scaling'' are presented. The device is equipped with an MK-125 cassette tape recorder, made in Poland, which serves as a scintiscan data bank, and with three scintiscan data analysis programs. Cassettes of two types, C-60 and C-90, are used, with recording times of 2 x 30 min and 2 x 45 min, respectively. The results of the scintiscan data analysis are printed by an electric typewriter as figures in the form of a digital scintigram. (author)

  3. Student initiative: A conceptual analysis

    Directory of Open Access Journals (Sweden)

    Polovina Nada

    2014-01-01

    In the description and scientific consideration of the attitude of children and youth towards their education and development, the concept of student initiative has been gaining ground lately, and it is hence the subject of analysis in this paper. The analysis is important because of the discrepancy between the increased efforts of the key educational policy holders to promote the idea about the importance of the development of student initiative and the rare acceptance of this idea among theoreticians, researchers and practitioners dealing with the education and development of children and youth. By concretising the features of initiative student behaviour, our aim was, on the one hand, to observe the structural determinants and scientific status of the very concept of an initiative student, and, on the other, to contribute to the understanding of initiative behaviour in practice. In the first part of the paper we deal with different notions and concretisations of the features of initiative behaviour of children and youth, which includes the consideration of: basic student initiative, academic student initiative, individual student initiative, the capacity for initiative and personal development initiative. In the second part of the paper, we discuss the relations of the concept of student initiative with similar general concepts (activity/passivity, proactivity, agency) and with concepts immediately related to the school environment (student involvement, student participation). The results of our analysis indicate that the concept of student initiative has particular features that differentiate it from similar concepts, and the potential to reach the status of a scientific concept, bearing in mind the initial empirical specifications and general empirical verifiability of the yet unverified determinants of the concept. In the concluding part of the paper, we discuss the implications of the conceptual analysis for further research, as well as for practice.

  4. Experimental investigation on the initial expansion stage of vacuum arc on cup-shaped TMF contacts

    Science.gov (United States)

    Wang, Ting; Xiu, Shixin; Liu, Zixi; Zhang, Yanzhe; Feng, Dingyu

    2018-02-01

    Arc behavior and measures to control it directly affect the properties of vacuum circuit breakers. Nowadays, transverse magnetic field (TMF) contacts are widely used for medium voltages. A magnetic field perpendicular to the current direction between the TMF contacts makes the arc move, transmitting its energy to the whole contact and avoiding excessive local ablation. Previous research on TMF arc behavior concentrated mainly on the arc movement and less on the initial stage (from arc ignition to an unstable arc column). A significant number of experimental results suggest that there is a short period of arc stagnation after ignition. The duration of this arc stagnation and the arc characteristics during this stage affect the subsequent arc motion and even the breaking property of interrupters. The present study examines the arc characteristics in this initial stage. Experiments were carried out in a demountable vacuum chamber with cup-shaped TMF contacts. Using a high-speed camera, both single-point arc ignition mode and multiple-point arc ignition (MPAI) mode were observed. The experimental data show that the probability of MPAI mode occurring is related to the arc current. The influences of arc-ignition mode, arc current, and contact diameter on the initial expansion process were investigated. In addition, simulations were performed to analyze the multiple arc expansion process mechanically. Based on the experimental phenomena and simulation results, the mechanism of the arc expansion motion was analyzed.

  5. Using Outcomes to Analyze Patients Rather than Patients to Analyze Outcomes: A Step toward Pragmatism in Benefit:risk Evaluation

    Science.gov (United States)

    Evans, Scott R.; Follmann, Dean

    2016-01-01

    In the future, clinical trials will have an increased emphasis on pragmatism, providing a practical description of the effects of new treatments in realistic clinical settings. Accomplishing pragmatism requires better summaries of the totality of the evidence in ways that clinical trials consumers (patients, physicians, insurers) find transparent and allow for informed benefit:risk decision-making. The current approach to the analysis of clinical trials is to analyze efficacy and safety separately and then combine these analyses into a benefit:risk assessment. Many assume that this will effectively describe the impact on patients. But this approach is suboptimal for evaluating the totality of effects on patients. We discuss methods for benefit:risk assessment that have greater pragmatism than methods that separately analyze efficacy and safety. These include the concepts of within-patient analyses and composite benefit:risk endpoints with a goal of understanding how to analyze one patient before trying to figure out how to analyze many. We discuss the desirability of outcome ranking (DOOR) and introduce the partial credit strategy using an example in a clinical trial evaluating the effects of a new antibiotic. As part of the example we introduce a strategy to engage patients as a resource to inform benefit:risk analyses consistent with the goal of measuring and weighing outcomes that are most important from the patient’s perspective. We describe a broad vision for the future of clinical trials consistent with increased pragmatism. Greater focus on using endpoints to analyze patients rather than patients to analyze endpoints particularly in late-phase/stage clinical trials is an important part of this vision. PMID:28435515
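
    The DOOR approach ranks each patient's overall outcome on an ordinal desirability scale and then summarizes the trial as the probability that a randomly chosen patient on the new treatment has a more desirable outcome than one on control, with ties counted as one half. A minimal sketch with invented ranks (not data from the antibiotic example):

```python
import numpy as np

def door_probability(treatment_ranks, control_ranks):
    """P(random treatment patient has a more desirable outcome than a random
    control patient) + 0.5 * P(tie), estimated by all pairwise comparisons.

    Ranks are ordinal desirability-of-outcome levels (higher = better),
    e.g. 1 = death ... 5 = cure without adverse events.
    """
    t = np.asarray(treatment_ranks)[:, None]
    c = np.asarray(control_ranks)[None, :]
    wins = (t > c).mean()
    ties = (t == c).mean()
    return wins + 0.5 * ties

treatment = [5, 4, 4, 3, 5, 2, 4, 5]   # hypothetical new-antibiotic arm
control   = [4, 3, 5, 2, 3, 1, 4, 3]   # hypothetical comparator arm
print(f"DOOR probability = {door_probability(treatment, control):.2f}")
```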

  6. Mechanisms of sharp wave initiation and ripple generation.

    Science.gov (United States)

    Schlingloff, Dániel; Káli, Szabolcs; Freund, Tamás F; Hájos, Norbert; Gulyás, Attila I

    2014-08-20

    Replay of neuronal activity during hippocampal sharp wave-ripples (SWRs) is essential in memory formation. To understand the mechanisms underlying the initiation of irregularly occurring SWRs and the generation of periodic ripples, we selectively manipulated different components of the CA3 network in mouse hippocampal slices. We recorded EPSCs and IPSCs to examine the buildup of neuronal activity preceding SWRs and analyzed the distribution of time intervals between subsequent SWR events. Our results suggest that SWRs are initiated through a combined refractory and stochastic mechanism. SWRs initiate when firing in a set of spontaneously active pyramidal cells triggers a gradual, exponential buildup of activity in the recurrent CA3 network. We showed that this tonic excitatory envelope drives reciprocally connected parvalbumin-positive basket cells, which start ripple-frequency spiking that is phase-locked through reciprocal inhibition. The synchronized GABA(A) receptor-mediated currents give rise to a major component of the ripple-frequency oscillation in the local field potential and organize the phase-locked spiking of pyramidal cells. Optogenetic stimulation of parvalbumin-positive cells evoked full SWRs and EPSC sequences in pyramidal cells. Even with excitation blocked, tonic driving of parvalbumin-positive cells evoked ripple oscillations. Conversely, optogenetic silencing of parvalbumin-positive cells interrupted the SWRs or inhibited their occurrence. Local drug applications and modeling experiments confirmed that the activity of parvalbumin-positive perisomatic inhibitory neurons is both necessary and sufficient for ripple-frequency current and rhythm generation. These interneurons are thus essential in organizing pyramidal cell activity not only during gamma oscillation, but, in a different configuration, during SWRs. Copyright © 2014 the authors.

  7. Sonographic findings of thyroid cancer initially assessed as no suspicious malignancy

    International Nuclear Information System (INIS)

    Kim, Do Youn; Kang, Seok Seon; Ji, Eun Kyung; Kwon, Tae Hee; Park, Hae Lin; Shim, Jeong Yun

    2008-01-01

    To review retrospectively the imaging findings of thyroid cancers initially assessed as showing no suspicion of malignancy. Of 338 nodules confirmed to be thyroid cancer, this study included 39 nodules in 38 patients (mean age, 39 years; 36 women and 2 men) assessed as showing no suspicion of malignancy on initial sonography. We retrospectively evaluated the sonographic findings by shape, margin, echogenicity, calcification, cystic degeneration and peripheral hypoechoic rim, and analyzed whether the findings differed according to nodule size (cutoff: 1 cm). The most frequent sonographic findings were ovoid-to-round shape (90%), well-defined smooth margin (64%), hypoechogenicity (54%), no calcification (92%), no cystic degeneration (77%) and a peripheral hypoechoic rim (56%). Findings suspicious for malignancy were taller-than-wide shape (10%), well-defined spiculated margin (36%), marked hypoechogenicity (10%) and microcalcifications (8%). Isoechogenicity, cystic degeneration and a peripheral hypoechoic rim were more common in nodules larger than 1 cm, whereas a well-defined spiculated margin was more common in nodules smaller than 1 cm. In retrospect, 56% showed no finding suspicious for malignancy. Although nodules assessed as not suspicious for malignancy on initial US often had retrospectively suspicious findings, many nodules still showed no suspicious finding. Suspicious findings were overlooked because of their equivocal appearance in small nodules, isoechogenicity, cystic degeneration or a peripheral hypoechoic rim. Careful observation is needed.

  8. Parental smoking, exposure to secondhand smoke at home, and smoking initiation among young children.

    Science.gov (United States)

    Wang, Man Ping; Ho, Sai Yin; Lam, Tai Hing

    2011-09-01

    To investigate the associations of parental smoking and secondhand smoke (SHS) exposure at home with smoking initiation among young children in Hong Kong. A prospective school-based survey of Hong Kong primary 2-4 students was conducted at baseline in 2006 and followed up in 2008. Self-administered anonymous questionnaires were used to collect information about smoking, SHS exposure at home, parental smoking, and sociodemographic characteristics. Cross-sectional and prospective associations of SHS exposure at home and parental smoking with student smoking were analyzed using logistic regression, adjusting for potential confounders. The cross-sectional association between parental smoking and ever smoking was significant after adjustment for sociodemographic characteristics but became insignificant after adjusting for home SHS exposure. Home SHS exposure mediated the association between parental smoking and student smoking (p = .03). Prospectively, parental smoking was not associated with smoking initiation after adjusting for home SHS exposure. Each 1-day increase in home SHS exposure significantly predicted a 16% excess risk of smoking initiation after adjusting for parental smoking. The prospective effect of parental smoking on smoking initiation was significantly mediated by baseline home SHS exposure. SHS exposure at home thus predicted smoking initiation of young Chinese children in Hong Kong independent of parental smoking status, while the effect of parental smoking on smoking initiation was mediated through SHS exposure at home. To prevent children from smoking as well as from the harm of SHS exposure, parents and other family members should quit smoking or at least reduce smoking at home.

  9. Handheld Fluorescence Microscopy based Flow Analyzer.

    Science.gov (United States)

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence based clinical microscopy and diagnostics is a manual, labour intensive and time consuming procedure. The article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC labeled fluorescent beads per minute. This demonstrates the high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
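
    Motion blur along the flow direction can be modelled as convolution with a line-shaped point-spread function, so one simple computational remedy is frequency-domain (Wiener-style) deconvolution. The sketch below is a generic illustration of that idea on a synthetic 'bead' image; it is not the deblurring algorithm used by the authors, and the blur length and regularization constant are made up.

```python
import numpy as np

def motion_psf(length, shape):
    """Horizontal motion-blur kernel embedded in an array of the given shape."""
    psf = np.zeros(shape)
    psf[0, :length] = 1.0 / length
    return psf

def wiener_deblur(blurred, psf, k=0.01):
    """Frequency-domain Wiener-style deconvolution (regularized inverse filter)."""
    H = np.fft.fft2(psf)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F_hat))

# toy example: a bright 'bead' blurred by motion along the flow direction
img = np.zeros((64, 64))
img[32, 32] = 1.0
psf = motion_psf(length=9, shape=img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
restored = wiener_deblur(blurred, psf)
print("peak before/after restoration:",
      blurred.max().round(3), restored.max().round(3))
```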

  10. Analysis of molten fuel-coolant interaction during a reactivity-initiated accident experiment

    International Nuclear Information System (INIS)

    El-Genk, M.S.; Hobbins, R.R.

    1981-01-01

    The results of a reactivity-initiated accident experiment, designated RIA-ST-4, are discussed and analyzed with regard to molten fuel-coolant interaction (MFCI). In this experiment, extensive amounts of molten UO2 fuel and zircaloy cladding were produced and fragmented upon mixing with the coolant. Coolant pressurization up to 35 MPa and coolant overheating in excess of 940 K occurred after fuel rod failure. The initial coolant conditions were similar to those in boiling water reactors during a hot startup (that is, coolant pressure of 6.45 MPa, coolant temperature of 538 K, and coolant flow rate of 85 cm3/s). It is concluded that the high coolant pressure recorded in the RIA-ST-4 experiment was caused by an energetic MFCI and was not due to gas release from the test rod at failure, Zr/water reaction, or to UO2 fuel vapor pressure. The high coolant temperature indicated the presence of superheated steam, which may have formed during the expansion of the working fluid back to the initial coolant pressure; yet, the thermal-to-mechanical energy conversion ratio is estimated to be only 0.3%.

  11. Recognition of speaker-dependent continuous speech with KEAL

    Science.gov (United States)

    Mercier, G.; Bigorgne, D.; Miclet, L.; Le Guennec, L.; Querre, M.

    1989-04-01

    A description of the speaker-dependent continuous speech recognition system KEAL is given. An unknown utterance is recognized by means of the following procedures: acoustic analysis, phonetic segmentation and identification, and word and sentence analysis. The combination of feature-based, speaker-independent coarse phonetic segmentation with speaker-dependent statistical classification techniques is one of the main design features of the acoustic-phonetic decoder. The lexical access component is essentially based on a statistical dynamic programming technique which aims at matching a phonemic lexical entry containing various phonological forms against a phonetic lattice. Sentence recognition is achieved by use of a context-free grammar and a parsing algorithm derived from Earley's parser. A speaker adaptation module allows some of the system parameters to be adjusted by matching known utterances with their acoustical representation. The task to be performed, described by its vocabulary and its grammar, is given as a parameter of the system. Continuously spoken sentences extracted from a 'pseudo-Logo' language are analyzed and the results are presented.
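
    Sentence recognition in KEAL relies on a context-free grammar and an Earley-derived parsing algorithm. The sketch below is a textbook Earley recognizer over a toy grammar whose terminals stand for word categories; the grammar and inputs are invented for illustration and are not KEAL's 'pseudo-Logo' grammar (which is additionally matched against a lattice rather than a flat word string).

```python
from collections import namedtuple

# A minimal Earley recognizer for a context-free grammar.
Item = namedtuple("Item", "lhs rhs dot origin")

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["det", "noun"], ["noun"]],
    "VP": [["verb", "NP"], ["verb"]],
}

def earley_recognize(words, grammar, start="S"):
    chart = [set() for _ in range(len(words) + 1)]
    chart[0].add(Item("GAMMA", (start,), 0, 0))
    for i in range(len(words) + 1):
        agenda = list(chart[i])
        while agenda:
            item = agenda.pop()
            if item.dot < len(item.rhs):
                sym = item.rhs[item.dot]
                if sym in grammar:                          # predictor
                    for rhs in grammar[sym]:
                        new = Item(sym, tuple(rhs), 0, i)
                        if new not in chart[i]:
                            chart[i].add(new); agenda.append(new)
                elif i < len(words) and words[i] == sym:    # scanner
                    chart[i + 1].add(Item(item.lhs, item.rhs,
                                          item.dot + 1, item.origin))
            else:                                           # completer
                for prev in list(chart[item.origin]):
                    if (prev.dot < len(prev.rhs)
                            and prev.rhs[prev.dot] == item.lhs):
                        new = Item(prev.lhs, prev.rhs,
                                   prev.dot + 1, prev.origin)
                        if new not in chart[i]:
                            chart[i].add(new); agenda.append(new)
    return Item("GAMMA", (start,), 1, 0) in chart[len(words)]

print(earley_recognize(["det", "noun", "verb", "noun"], GRAMMAR))  # True
print(earley_recognize(["verb", "det"], GRAMMAR))                  # False
```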

  12. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    The Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, reducing the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  13. Methods for Analyzing Pipe Networks

    DEFF Research Database (Denmark)

    Nielsen, Hans Bruun

    1989-01-01

    It is found advantageous to formulate the flow equations in terms of pipe discharges rather than in terms of energy heads. The behavior of some iterative methods is compared in the initial phase with large errors. It is explained why the linear theory method oscillates when the iteration gets close to the solution, and it is further demonstrated that this method offers good starting values for a Newton-Raphson iteration.
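
    The Newton-Raphson step on discharge-based loop equations can be illustrated on the smallest possible network: two pipes in parallel whose head losses must balance. The resistances, exponent and total discharge in the sketch below are made up; it only shows the iteration, not the paper's formulation.

```python
# Two pipes in parallel carry a total discharge Q between the same two nodes,
# so their head losses must be equal: r1*q1**2 = r2*(Q - q1)**2.
# Solve for q1 with Newton-Raphson.
r1, r2, Q = 4.0, 9.0, 1.0

def f(q1):           # head-loss imbalance around the loop
    return r1 * q1 ** 2 - r2 * (Q - q1) ** 2

def df(q1):          # derivative of the imbalance with respect to q1
    return 2 * r1 * q1 + 2 * r2 * (Q - q1)

q1 = 0.5 * Q         # starting value (even split)
for _ in range(20):
    step = f(q1) / df(q1)
    q1 -= step
    if abs(step) < 1e-10:
        break

print(f"q1 = {q1:.4f}, q2 = {Q - q1:.4f}")   # exact split satisfies q1/q2 = sqrt(r2/r1)
```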

  14. AC Initiation System.

    Science.gov (United States)

    An ac initiation system is described which uses three ac transmission signals interlocked for safety by frequency, phase, and power discrimination. The ac initiation system is pre-armed by the application of two ac signals having the proper phases, and it activates a load when an ac power signal of the proper frequency and power level is applied. (Author)

  15. Two models of minimalist, incremental syntactic analysis.

    Science.gov (United States)

    Stabler, Edward P

    2013-07-01

    Minimalist grammars (MGs) and multiple context-free grammars (MCFGs) are weakly equivalent in the sense that they define the same languages, a large mildly context-sensitive class that properly includes context-free languages. But in addition, for each MG, there is an MCFG which is strongly equivalent in the sense that it defines the same language with isomorphic derivations. However, the structure-building rules of MGs but not MCFGs are defined in a way that generalizes across categories. Consequently, MGs can be exponentially more succinct than their MCFG equivalents, and this difference shows in parsing models too. An incremental, top-down beam parser for MGs is defined here, sound and complete for all MGs, and hence also capable of parsing all MCFG languages. But since the parser represents its grammar transparently, the relative succinctness of MGs is again evident. Although the determinants of MG structure are narrowly and discretely defined, probabilistic influences from a much broader domain can influence even the earliest analytic steps, allowing frequency and context effects to come early and from almost anywhere, as expected in incremental models. Copyright © 2013 Cognitive Science Society, Inc.

  16. Lexical and sublexical units in speech perception.

    Science.gov (United States)

    Giroux, Ibrahima; Rey, Arnaud

    2009-03-01

    Saffran, Newport, and Aslin (1996a) found that human infants are sensitive to statistical regularities corresponding to lexical units when hearing an artificial spoken language. Two sorts of segmentation strategies have been proposed to account for this early word-segmentation ability: bracketing strategies, in which infants are assumed to insert boundaries into continuous speech, and clustering strategies, in which infants are assumed to group certain speech sequences together into units (Swingley, 2005). In the present study, we test the predictions of two computational models instantiating each of these strategies (simple recurrent networks: Elman, 1990; and PARSER: Perruchet & Vinter, 1998) in an experiment where we compare the lexical and sublexical recognition performance of adults after hearing 2 or 10 min of an artificial spoken language. The results are consistent with PARSER's predictions and the clustering approach, showing that performance on words is better than performance on part-words only after 10 min. This result suggests that word segmentation abilities are not merely due to stronger associations between sublexical units but to the emergence of stronger lexical representations during the development of speech perception processes. Copyright © 2009, Cognitive Science Society, Inc.
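
    For contrast with PARSER's chunk-building (clustering) account, the bracketing family of strategies can be illustrated by Saffran-style transitional-probability segmentation: estimate P(next syllable | current syllable) from the stream and posit a word boundary wherever that probability dips. The sketch below uses an invented three-word artificial language, not the materials of the study, and is not an implementation of either PARSER or a simple recurrent network.

```python
import random
from collections import defaultdict

def transitional_probabilities(syllables):
    """P(next | current) estimated from a continuous syllable stream."""
    pair_counts, syl_counts = defaultdict(int), defaultdict(int)
    for a, b in zip(syllables, syllables[1:]):
        pair_counts[(a, b)] += 1
        syl_counts[a] += 1
    return {pair: c / syl_counts[pair[0]] for pair, c in pair_counts.items()}

def bracket(syllables, tps, threshold=0.5):
    """Insert a word boundary wherever the forward TP dips below the threshold."""
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if tps[(a, b)] < threshold:
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words

# artificial language with three tri-syllabic words, concatenated at random
random.seed(0)
lexicon = [("tu", "pi", "ro"), ("go", "la", "bu"), ("da", "ko", "ti")]
stream = [s for _ in range(200) for s in random.choice(lexicon)]

tps = transitional_probabilities(stream)
print(bracket(stream, tps, threshold=0.5)[:8])
```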

  17. A qualitative study of methamphetamine initiation in Cape Town, South Africa

    Science.gov (United States)

    Hobkirk, Andréa L.; Watt, Melissa H.; Myers, Bronwyn; Skinner, Donald; Meade, Christina S.

    2015-01-01

    Background Despite a significant rise in methamphetamine use in low- and middle-income countries, there has been little empirical examination of the factors that contribute to individuals’ initiation of methamphetamine use in these settings. The goal of this study was to qualitatively examine factors associated with methamphetamine initiation in South Africa. Methods In-depth interviews were conducted with 30 active methamphetamine users (13 women and 17 men) in Cape Town, South Africa. Interviews included narrative descriptions of the circumstances surrounding methamphetamine initiation. Interviews were audio recorded, transcribed, and translated. Transcripts were analyzed with document memos, data display matrices, and a constant comparison technique to identify themes. Results On average, participants began regularly using methamphetamine around age 21 and had used for seven years. Four major themes emerged related to the initiation of methamphetamine use. The prevalence of methamphetamine users and distributors made the drug convenient and highly accessible to first time users. Methamphetamine has increased in popularity and is considered “trendy”, which contributes to social pressure from friends, and less often, family members to initiate use. Initiation is further fueled by a lack of opportunities for recreation and employment, which leads to boredom and curiosity about the rumored positive effects of the drug. Young people also turn to methamphetamine use and distribution through gang membership as an attempt to generate income in impoverished communities with limited economic opportunities. Finally, participants described initiating methamphetamine as a means of coping with the cumulative stress and psychological burden provoked by the high rates of violence and crime in areas of Cape Town. Conclusion The findings highlight the complex nature of methamphetamine initiation in low- and middle-income countries like South Africa. There is a need for

  18. A qualitative study of methamphetamine initiation in Cape Town, South Africa.

    Science.gov (United States)

    Hobkirk, Andréa L; Watt, Melissa H; Myers, Bronwyn; Skinner, Donald; Meade, Christina S

    2016-04-01

    Despite a significant rise in methamphetamine use in low- and middle-income countries, there has been little empirical examination of the factors that contribute to individuals' initiation of methamphetamine use in these settings. The goal of this study was to qualitatively examine factors associated with methamphetamine initiation in South Africa. In-depth interviews were conducted with 30 active methamphetamine users (13 women and 17 men) in Cape Town, South Africa. Interviews included narrative descriptions of the circumstances surrounding methamphetamine initiation. Interviews were audio recorded, transcribed, and translated. Transcripts were analyzed with document memos, data display matrices, and a constant comparison technique to identify themes. On average, participants began regularly using methamphetamine around age 21 and had used for seven years. Four major themes emerged related to the initiation of methamphetamine use. The prevalence of methamphetamine users and distributors made the drug convenient and highly accessible to first time users. Methamphetamine has increased in popularity and is considered "trendy", which contributes to social pressure from friends, and less often, family members to initiate use. Initiation is further fueled by a lack of opportunities for recreation and employment, which leads to boredom and curiosity about the rumored positive effects of the drug. Young people also turn to methamphetamine use and distribution through gang membership as an attempt to generate income in impoverished communities with limited economic opportunities. Finally, participants described initiating methamphetamine as a means of coping with the cumulative stress and psychological burden provoked by the high rates of violence and crime in areas of Cape Town. The findings highlight the complex nature of methamphetamine initiation in low- and middle-income countries like South Africa. There is a need for community-level interventions to address the

  19. Concepts and realization of the KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Moritz, H.; Hummel, R.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is a real time simulator developed from KWU computer programs for transient and safety analysis ('engineering simulator'). The NPA has no control room; the hardware consists only of commercially available data processing devices. The KWU NPA makes available all simulator operating features, such as initial conditions, free operator action and multiple malfunctions as well as freeze, snapshot, backtrack and playback, which have proved to be useful training aids in training simulators of all technical disciplines. The simulation program itself runs on a large mainframe computer, a Control Data CYBER 176 or CYBER 990 in the KWU computing center, under the interactive component INTERCOM of the operating system NOS/BE. It transmits the time-dependent engineering data roughly once a second to a SIEMENS 300-R30E process computer using telecommunication by telephone. The computers are coupled by an emulation of the communication protocol Mode 4A running on the R30 computer. To this emulation a program-to-program interface via a circular buffer on the R30 was added. In the process computer the data are processed and displayed graphically on 4 colour screens (560x512 pixels, 8 colours) by means of the process monitoring system DISIT. All activities at the simulator, including operator actions, are performed locally by the operator at the screens by means of function keys or dialog. (orig.)

  20. Analyzing the Risk of Well Plug Failure after Abandonment

    International Nuclear Information System (INIS)

    Mainguy, M.; Longuemare, P.; Audibert, A.; Lecolier, E.

    2007-01-01

    All oil and gas wells will have to be plugged and abandoned at some time. The plugging and abandonment procedure must provide effective isolation of the well fluids all along the well to reduce environmental risks of contamination and to prevent costly remedial jobs. Previous works have analyzed the plug behavior when submitted to local pressure or thermal changes, but no work has looked at the effects of external pressure, thermal and stress changes resulting from a global equilibrium restoration in a hydrocarbon reservoir once production has stopped. This work estimates those changes after abandonment on a reservoir field case, using a reservoir simulator in conjunction with a geomechanical simulator. Such simulations provide the pressure and thermal changes and the maximum effective stress changes in the reservoir cap rock, where critical plugs are put in place to isolate the production intervals. These changes are used as loads in a wellbore stress model that explicitly models an injector well and predicts stress rearrangements in the plug after abandonment. Results obtained with the wellbore stress model for a conventional class G cement plug show that the main risk of failure is tensile failure, because of the low tensile strength of the cement. In fact, soft sealing materials or an initially pre-stressed plug appear to be better adapted to the changes in downhole conditions that may occur after well plugging and abandonment. (authors)