WorldWideScience

Sample records for parser initially analyzes

  1. SEMSIN SEMANTIC AND SYNTACTIC PARSER

    Directory of Open Access Journals (Sweden)

    K. K. Boyarsky

    2015-09-01

    Full Text Available The paper presents the principle of operation of the SemSin semantic-syntactic parser, which builds a dependency tree for Russian sentences. The parser consists of four blocks: a dictionary, a morphological analyzer, production rules, and a lexical analyzer. An important logical part of the parser is the pre-syntactic module, which harmonizes and complements the morphological analysis results, splits text paragraphs into individual sentences, and also carries out preliminary disambiguation. A characteristic feature of the parser is its open control: parsing is driven by a set of production rules. A varied set of commands enables both morphological and semantic-syntactic analysis of the sentence. The paper presents the sequence in which the rules are applied and examples of their work. A specific feature of the rules is that they establish syntactic links while simultaneously removing morphological and semantic ambiguity. The lexical analyzer executes the commands and rules, and runs the parser in manual or automatic text-analysis mode. In the first case, the analysis is performed interactively, with the possibility of step-by-step execution of the rules and inspection of the resulting parse tree. In the second case, the analysis results are written to an XML file. Active use of syntactic and semantic dictionary information makes it possible to significantly reduce parsing ambiguity. In addition to marking up the text, the parser is also usable as a tool for information extraction from natural-language texts.
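
The rule mechanism described above can be caricatured in a few lines. This is a hypothetical Python sketch (SemSin's actual rule language, dictionary, and command set are not reproduced in the abstract): a rule that attaches an adjective to a following noun and, at the same time, prunes the case analyses on which the two words do not agree.

```python
# Hypothetical sketch of a link-plus-disambiguation rule, in the spirit of
# the SemSin description above; the names and token format are invented.

def agree_rule(tokens, links):
    """If an adjective precedes a noun and they share a case analysis,
    attach the adjective to the noun and drop disagreeing analyses."""
    for i in range(len(tokens) - 1):
        a, n = tokens[i], tokens[i + 1]
        if a["pos"] == "ADJ" and n["pos"] == "NOUN":
            common = set(a["cases"]) & set(n["cases"])
            if common:
                a["cases"] = sorted(common)   # morphological disambiguation
                n["cases"] = sorted(common)
                links.append((i + 1, i))      # noun governs the adjective
    return tokens, links

tokens = [
    {"form": "novaja", "pos": "ADJ",  "cases": ["nom", "acc"]},
    {"form": "kniga",  "pos": "NOUN", "cases": ["nom"]},
]
links = []
agree_rule(tokens, links)
```

The point of the sketch is the combination the abstract emphasizes: establishing a dependency link and removing ambiguity are a single rule action, not two passes.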

  2. Practical, general parser combinators

    NARCIS (Netherlands)

    A. Izmaylova (Anastasia); A. Afroozeh (Ali); T. van der Storm (Tijs)

    2016-01-01

    Parser combinators are a popular approach to parsing where context-free grammars are represented as executable code. However, conventional parser combinators do not support left recursion, and can have worst-case exponential runtime. These limitations hinder the expressivity and …
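
The combinator idea can be shown in miniature. This is a generic Python sketch, not the paper's implementation (the paper's combinators are general: they additionally handle left recursion and avoid the exponential blow-up, which this naive version does not). Each parser is a function from an input string to a list of (result, remaining-input) pairs; grammars are built by composing such functions.

```python
# Naive parser combinators: a parser maps a string to a list of
# (result, rest) pairs; an empty list means the parse failed.

def char(c):
    """Parse exactly the single character c."""
    return lambda s: [(c, s[1:])] if s[:1] == c else []

def seq(p, q):
    """Run p, then q on what p left over; pair up the results."""
    return lambda s: [((r1, r2), rest2)
                      for r1, rest1 in p(s)
                      for r2, rest2 in q(rest1)]

def alt(p, q):
    """Try both alternatives, keeping all successful parses."""
    return lambda s: p(s) + q(s)

# Grammar: 'ab' | 'ac'
p = alt(seq(char("a"), char("b")), seq(char("a"), char("c")))
```

A left-recursive definition such as `expr = alt(seq(expr, ...), ...)` would recurse forever here, which is exactly the limitation the abstract refers to.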

  3. LRSYS, PASCAL LR(1) Parser Generator System

    International Nuclear Information System (INIS)

    O'Hair, K.

    1991-01-01

    Description of program or function: LRSYS is a complete LR(1) parser generator system written entirely in a portable subset of Pascal. The system, LRSYS, includes a grammar analyzer program (LR) which reads a context-free (BNF) grammar as input and produces LR(1) parsing tables as output, a lexical analyzer generator (LEX) which reads regular expressions created by the REG process as input and produces lexical tables as output, and various parser skeletons that get merged with the tables to produce complete parsers (SMAKE). Current parser skeletons include Pascal, FORTRAN 77, and C. In addition, the CRAY1, DEC VAX11 version contains LRLTRAN and CFT-FORTRAN 77 skeletons. Other language skeletons can easily be added to the system. LRSYS is based on the LR program (NESC Abstract 822).
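
The lexical-analyzer-generator step can be approximated compactly. The sketch below is illustrative Python, not LRSYS's LEX/REG table format (which the abstract does not show): a token specification given as regular expressions is compiled into a single scanner, which is the essence of turning regular expressions into lexical tables.

```python
import re

# Token specification as (name, regex) pairs, combined into one master
# pattern with named groups -- a rough analog of generating lexical tables
# from regular expressions.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/()]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Yield (token-name, lexeme) pairs, discarding whitespace."""
    for m in MASTER.finditer(text):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

tokens = list(tokenize("x + 42"))
```

The generated token stream is what an LR(1) driver, merged from a parser skeleton and the parsing tables, would then consume.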

  4. Parser Macros for Scala

    OpenAIRE

    Duhem, Martin; Burmako, Eugene

    2015-01-01

    Parser macros are a new kind of macros that allow developers to create new language constructs and to define their own syntax for using them. In this report, we present why parser macros are useful and the kind of problems that they help to solve. We will also see how they are implemented and gain insight about how they take advantage from scala.meta, the new metaprogramming toolkit for Scala. Finally, we will discuss what are the current limitations of parser macros and what is left for futu...

  5. Polish Semantic Parser

    Directory of Open Access Journals (Sweden)

    Agnieszka Grudzinska

    2000-01-01

    Full Text Available The amount of information transferred by computers grows very rapidly, outgrowing the average person's capacity to receive it. This implies a growing demand for computer programs able to perform an introductory classification, or even selection, of information directed to a particular receiver. Due to the complexity of the problem, we restricted it to understanding short newspaper notes. Among the many conceptions formulated so far, the conceptual dependency theory worked out by Roger Schank has been chosen. It is a formal language for describing the semantics of an utterance, integrated with a text-understanding algorithm. A substantial part of each text-transformation system is a semantic parser of the Polish language. It is the module which, as the first and the only one, has access to the text in Polish. It plays the role of an element which finds relations between words of the Polish language and the formal notation. It translates sentences written in the language used by people into the language of the theory. The presented structure of knowledge units and the shape of the understanding-process algorithms are universal by virtue of the theory. On the other hand, the defined knowledge units and the rules used in the algorithms are only examples, because they are constructed in order to understand short newspaper notes.

  6. Telugu dependency parsing using different statistical parsers

    Directory of Open Access Journals (Sweden)

    B. Venkata Seshu Kumari

    2017-01-01

    Full Text Available In this paper we explore different statistical dependency parsers for parsing Telugu. We consider five popular dependency parsers, namely MaltParser, MSTParser, TurboParser, ZPar and Easy-First Parser. We experiment with different parser and feature settings and show the impact of different settings. We also provide a detailed analysis of the performance of all the parsers on major dependency labels. We report our results on the test data of the Telugu dependency treebank provided in the ICON 2010 tools contest on Indian languages dependency parsing. We obtain state-of-the-art performance of 91.8% in unlabeled attachment score and 70.0% in labeled attachment score. To the best of our knowledge, ours is the only work which explored all five popular dependency parsers and compared their performance under different feature settings for Telugu.
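
The two scores reported above have standard definitions, sketched here in Python: unlabeled attachment score (UAS) is the fraction of tokens whose predicted head is correct; labeled attachment score (LAS) additionally requires the correct dependency label.

```python
# Standard UAS/LAS computation over per-token (head, label) pairs.

def attachment_scores(gold, pred):
    """gold, pred: lists of (head-index, label) per token."""
    n = len(gold)
    uas = sum(g[0] == p[0] for g, p in zip(gold, pred)) / n
    las = sum(g == p for g, p in zip(gold, pred)) / n
    return uas, las

gold = [(2, "nsubj"), (0, "root"), (2, "obj")]
pred = [(2, "nsubj"), (0, "root"), (2, "iobj")]  # right head, wrong label
uas, las = attachment_scores(gold, pred)
```

This makes the gap in the reported figures (91.8% UAS vs. 70.0% LAS) easy to interpret: many tokens receive the right head but the wrong relation label.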

  7. Lazy functional parser combinators in Java

    NARCIS (Netherlands)

    Swierstra, D.S.; Dijkstra, A.

    2001-01-01

    A parser is a program that checks if a text is a sentence of the language as described by a grammar. Traditionally, the program text of a parser is generated from a grammar description, after which it is compiled and subsequently run. The language accepted by such a parser is, by the nature of

  8. Expression and cut parser for CMS event data

    International Nuclear Information System (INIS)

    Lista, Luca; Jones, Christopher D; Petrucciani, Giovanni

    2010-01-01

    We present a parser to evaluate expressions and Boolean selections that is applied on CMS event data for event filtering and analysis purposes. The parser is based on Boost Spirit grammar definition, and uses Reflex dictionaries for class introspection. The parser allows for a natural definition of expressions and cuts in users' configurations, and provides good runtime performance compared to other existing parsers.
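
A hedged analog of the expression-and-cut idea can be written against Python's own syntax tree (the CMS parser itself is built on Boost Spirit with Reflex introspection in C++; the sketch below is illustrative, not its API): a selection string is evaluated against an event's fields, allowing only comparisons, boolean logic, arithmetic, and `abs()`.

```python
import ast
import operator as op

# Whitelisted node handlers: anything else is rejected.
BIN = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul, ast.Div: op.truediv}
CMP = {ast.Gt: op.gt, ast.Lt: op.lt, ast.GtE: op.ge, ast.LtE: op.le, ast.Eq: op.eq}
CALLS = {"abs": abs}

def evaluate(expr, event):
    """Evaluate a cut string like 'pt > 10 and abs(eta) < 2.5'
    against an event given as a dict of field values."""
    def ev(node):
        if isinstance(node, ast.Expression):
            return ev(node.body)
        if isinstance(node, ast.BoolOp):          # and / or
            vals = [ev(v) for v in node.values]
            return all(vals) if isinstance(node.op, ast.And) else any(vals)
        if isinstance(node, ast.Compare):         # supports chained a < b < c
            left = ev(node.left)
            for cmp_op, comp in zip(node.ops, node.comparators):
                right = ev(comp)
                if not CMP[type(cmp_op)](left, right):
                    return False
                left = right
            return True
        if isinstance(node, ast.BinOp):
            return BIN[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.Call):
            return CALLS[node.func.id](*[ev(a) for a in node.args])
        if isinstance(node, ast.Name):            # event field lookup
            return event[node.id]
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("disallowed syntax")
    return ev(ast.parse(expr, mode="eval"))

passed = evaluate("pt > 10 and abs(eta) < 2.5", {"pt": 25.0, "eta": -1.3})
```

Reusing the host language's parser and whitelisting node types mirrors the design trade-off in the abstract: natural-looking cut strings in user configurations, with the grammar kept small enough to stay fast and safe.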

  9. On Minimizing Training Corpus for Parser Acquisition

    National Research Council Canada - National Science Library

    Hwa, Rebecca

    2001-01-01

    In this work, we consider selecting training examples with the tree-entropy metric. Our goal is to assess how well this selection technique can be applied for training different types of parsers...

  10. Techniques for Automated Testing of Lola Industrial Robot Language Parser

    Directory of Open Access Journals (Sweden)

    M. M. Lutovac

    2014-06-01

    Full Text Available The accuracy of parsing directly affects the accuracy of semantic analysis, optimization, and object-code generation. Therefore, parser testing represents the basis of compiler testing. It should include tests not only for correct and expected cases, but also for unexpected and invalid ones. Techniques for testing the parser, as well as algorithms and tools for generating test sentences, are discussed in this paper. A methodology for initial testing of a newly developed compiler is proposed. Generation of negative test sentences by modifying the original language grammar is described. Positive and negative test cases generated by the Grow algorithm, Purdom’s algorithm with and without length control, the CDRC-P algorithm, and the CDRC-P algorithm with length control are applied to the testing of the L-IRL robot programming language. For this purpose, two different tools for generating test sentences are used. Based on the presented analysis of possible solutions, an appropriate method can be chosen for testing parsers for smaller grammars with many recursive rules.

  11. Policy-Based Management Natural Language Parser

    Science.gov (United States)

    James, Mark

    2009-01-01

    The Policy-Based Management Natural Language Parser (PBEM) is a rules-based approach to enterprise management that can be used to automate certain management tasks. This parser simplifies the management of a given endeavor by establishing policies to deal with situations that are likely to occur. Policies are operating rules that can be referred to as a means of maintaining order, security, consistency, or other ways of successfully furthering a goal or mission. PBEM provides a way of managing configuration of network elements, applications, and processes via a set of high-level rules or business policies rather than managing individual elements, thus switching the control to a higher level. This software allows unique management rules (or commands) to be specified and applied to a cross-section of the Global Information Grid (GIG). This software embodies a parser that is capable of recognizing and understanding conversational English. Because all possible dialect variants cannot be anticipated, a unique capability was developed that parses based on conversational intent rather than on the exact way the words are used. This software can increase productivity by enabling a user to converse with the system in conversational English to define network policies. PBEM can be used in both manned and unmanned science-gathering programs. Because policy statements can be domain-independent, this software can be applied equally to a wide variety of applications.

  12. Designing a Constraint Based Parser for Sanskrit

    Science.gov (United States)

    Kulkarni, Amba; Pokar, Sheetal; Shukl, Devanand

    Verbal understanding (śābdabodha) of any utterance requires knowledge of how the words in that utterance are related to each other. Such knowledge is usually available in the form of cognition of grammatical relations. Generative grammars describe how a language codes these relations. Thus the knowledge of what information various grammatical relations convey is available from the generation point of view and not the analysis point of view. In order to develop a parser based on any grammar, one should then know precisely the semantic content of the grammatical relations expressed in a language string, the clues for extracting these relations, and finally whether these relations are expressed explicitly or implicitly. Based on the design principles that emerge from this knowledge, we model the parser as finding a directed tree, given a graph with nodes representing the words and edges representing the possible relations between them. Further, we also use the Mīmāṃsā constraint of ākāṅkṣā (expectancy) to rule out non-solutions and sannidhi (proximity) to prioritize the solutions. We have implemented a parser based on these principles, and its performance was found to be satisfactory, giving us confidence to extend its functionality to handle complex sentences.
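
The search formulation described above can be sketched naively: find a directed tree over the word graph, where an expectancy-style filter restricts each word's candidate heads, and a proximity-style preference ranks the surviving trees by total word distance. This is illustrative Python under those assumptions, not the authors' implementation.

```python
from itertools import product

def is_tree(heads):
    """heads[i] = parent index of word i, or -1 for the root.
    A valid tree has exactly one root and no cycles."""
    if sum(h == -1 for h in heads) != 1:
        return False
    for i in range(len(heads)):
        seen, j = set(), i
        while heads[j] != -1:
            if j in seen:          # cycle detected
                return False
            seen.add(j)
            j = heads[j]
    return True

def parse(candidates):
    """candidates[i]: allowed heads for word i (the expectancy filter).
    Return the (cost, heads) tree minimizing total word distance
    (a crude stand-in for the proximity preference)."""
    best = None
    for heads in product(*candidates):
        if is_tree(list(heads)):
            cost = sum(abs(i - h) for i, h in enumerate(heads) if h != -1)
            if best is None or cost < best[0]:
                best = (cost, heads)
    return best

# Three words; word 1 is the verb (root); words 0 and 2 may attach to it.
best = parse([[1], [-1], [1]])
```

The exhaustive enumeration is exponential and only for illustration; the structure of the problem (candidate edges in, one directed tree out) is the point.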

  13. Towards automated processing of clinical Finnish: sublanguage analysis and a rule-based parser.

    Science.gov (United States)

    Laippala, Veronika; Ginter, Filip; Pyysalo, Sampo; Salakoski, Tapio

    2009-12-01

    In this paper, we present steps taken towards more efficient automated processing of clinical Finnish, focusing on daily nursing notes in a Finnish Intensive Care Unit (ICU). First, we analyze ICU Finnish as a sublanguage, identifying its specific features facilitating, for example, the development of a specialized syntactic analyser. The identified features include frequent omission of finite verbs, limitations in allowed syntactic structures, and domain-specific vocabulary. Second, we develop a formal grammar and a parser for ICU Finnish, thus providing better tools for the development of further applications in the clinical domain. The grammar is implemented in the LKB system in a typed feature structure formalism. The lexicon is automatically generated based on the output of the FinTWOL morphological analyzer adapted to the clinical domain. As an additional experiment, we study the effect of using Finnish constraint grammar to reduce the size of the lexicon. The parser construction thus makes efficient use of existing resources for Finnish. The grammar currently covers 76.6% of ICU Finnish sentences, producing highly accurate best-parse analyses with an F-score of 91.1%. We find that building a parser for the highly specialized domain sublanguage is not only feasible, but also surprisingly efficient, given an existing morphological analyzer with broad vocabulary coverage. The resulting parser enables a deeper analysis of the text than was previously possible.

  14. Cross-lingual parser selection for low-resource languages

    DEFF Research Database (Denmark)

    Agic, Zeljko

    2017-01-01

    In multilingual dependency parsing, transferring delexicalized models provides unmatched language coverage and competitive scores, with minimal requirements. Still, selecting the single best parser for any target language poses a challenge. Here, we propose a lean method for parser selection. It offers top performance, and it does so without disadvantaging the truly low-resource languages. We consistently select appropriate source parsers for our target languages in a realistic cross-lingual parsing experiment.

  15. Analyzing Digital Library Initiatives: 5S Theory Perspective

    Science.gov (United States)

    Isah, Abdulmumin; Mutshewa, Athulang; Serema, Batlang; Kenosi, Lekoko

    2015-01-01

    This article traces the historical development of Digital Libraries (DLs), examines some DL initiatives in developed and developing countries and uses 5S Theory as a lens for analyzing the focused DLs. The analysis shows that present-day systems, in both developed and developing nations, are essentially content and user centric, with low level…

  16. Experienced physicians benefit from analyzing initial diagnostic hypotheses

    Science.gov (United States)

    Bass, Adam; Geddes, Colin; Wright, Bruce; Coderre, Sylvain; Rikers, Remy; McLaughlin, Kevin

    2013-01-01

    Background: Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on diagnostic performance of nephrologists and nephrology residents. Methods: We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. Results: After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of final diagnoses (31.1% vs. 65.6%, p < 0.001, and 40.0% vs. 70.0%, p < 0.001, respectively). We found a significant interaction between experience and analytic processing strategy (p = 0.002): nephrology residents had significantly increased odds of diagnostic success when using scheme-inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.007), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.20). Discussion: Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation of the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy, or the changes in knowledge structure with experience. PMID:26451203

  17. An efficient implementation of the head-corner parser

    NARCIS (Netherlands)

    vanNoord, G

    This paper describes an efficient and robust implementation of a bidirectional, head-driven parser for constraint-based grammars. This parser is developed for the OVIS system: a Dutch spoken dialogue system in which information about public transport can be obtained by telephone. After a review of

  18. A memory-based shallow parser for spoken Dutch

    NARCIS (Netherlands)

    Canisius, S.V.M.; van den Bosch, A.; Decadt, B.; Hoste, V.; De Pauw, G.

    2004-01-01

    We describe the development of a Dutch memory-based shallow parser. The availability of large treebanks for Dutch, such as the one provided by the Spoken Dutch Corpus, allows memory-based learners to be trained on examples of shallow parsing taken from the treebank, and act as a shallow parser after

  19. The ModelCC Model-Driven Parser Generator

    Directory of Open Access Journals (Sweden)

    Fernando Berzal

    2015-01-01

    Full Text Available Syntax-directed translation tools require the specification of a language by means of a formal grammar. This grammar must conform to the specific requirements of the parser generator to be used. This grammar is then annotated with semantic actions for the resulting system to perform its desired function. In this paper, we introduce ModelCC, a model-based parser generator that decouples language specification from language processing, avoiding some of the problems caused by grammar-driven parser generators. ModelCC receives a conceptual model as input, along with constraints that annotate it. It is then able to create a parser for the desired textual syntax and the generated parser fully automates the instantiation of the language conceptual model. ModelCC also includes a reference resolution mechanism so that ModelCC is able to instantiate abstract syntax graphs, rather than mere abstract syntax trees.

  20. Experienced physicians benefit from analyzing initial diagnostic hypotheses

    Directory of Open Access Journals (Sweden)

    Adam Bass

    2013-03-01

    Full Text Available Background: Most incorrect diagnoses involve at least one cognitive error, of which premature closure is the most prevalent. While metacognitive strategies can mitigate premature closure in inexperienced learners, these are rarely studied in experienced physicians. Our objective here was to evaluate the effect of analytic information processing on diagnostic performance of nephrologists and nephrology residents. Methods: We asked nine nephrologists and six nephrology residents at the University of Calgary and Glasgow University to diagnose ten nephrology cases. We provided presenting features along with contextual information, after which we asked for an initial diagnosis. We then primed participants to use either hypothetico-deductive reasoning or scheme-inductive reasoning to analyze the remaining case data and generate a final diagnosis. Results: After analyzing initial hypotheses, both nephrologists and residents improved the accuracy of final diagnoses (31.1% vs. 65.6%, p < 0.001, and 40.0% vs. 70.0%, p < 0.001, respectively). We found a significant interaction between experience and analytic processing strategy (p = 0.002): nephrology residents had significantly increased odds of diagnostic success when using scheme-inductive reasoning (odds ratio [95% confidence interval] 5.69 [1.59, 20.33], p = 0.007), whereas the performance of experienced nephrologists did not differ between strategies (odds ratio 0.57 [0.23, 1.39], p = 0.2). Discussion: Experienced nephrologists and nephrology residents can improve their performance by analyzing initial diagnostic hypotheses. The explanation of the interaction between experience and the effect of different reasoning strategies is unclear, but may relate to preferences in reasoning strategy, or the changes in knowledge structure with experience.

  1. Historical Post Office Directory Parser (POD Parser) Software from the AddressingHistory Project

    Directory of Open Access Journals (Sweden)

    Nicola Osborne

    2014-07-01

    Full Text Available The POD Parser is Python software for parsing the OCR’d (optical-character-recognised) text of digitised historical Scottish Post Office Directories (PODs), producing a consistent structured format for the data and geocoding each address. The software was developed as part of the AddressingHistory project, which sought to combine digitised historic directories with digitised and georeferenced historic maps. The software has potential for reuse in multiple research contexts where historical post office directory data is relevant, and is therefore particularly useful in historical research into social, economic, or demographic trends. The POD Parser is currently designed for use with Scottish directories but is extensible, perhaps with some adaptation, to other similarly formatted materials such as the English Trade Directories.
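
The core parsing task can be illustrated with a toy entry format. The layout below ("Surname, Forename, occupation, address") is hypothetical, chosen only for illustration; real OCR'd POD entries are more varied and noisy, and this is not the AddressingHistory code.

```python
import re

# Hypothetical directory-entry format: "Surname, Forename, occupation, address".
ENTRY = re.compile(
    r"^(?P<surname>[^,]+),\s*(?P<forename>[^,]+),\s*"
    r"(?P<occupation>[^,]+),\s*(?P<address>.+)$"
)

def parse_entry(line):
    """Return a dict of named fields, or None if the line does not match."""
    m = ENTRY.match(line.strip())
    return m.groupdict() if m else None

rec = parse_entry("Smith, John, baker, 12 High Street")
```

The extracted address field is what a geocoding step would then resolve against a georeferenced historic map.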

  2. A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.

    1998-04-01

    This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility database for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.

  3. Parser Adaptation for Social Media by Integrating Normalization

    NARCIS (Netherlands)

    van der Goot, Rob; van Noord, Gerardus

    This work explores normalization for parser adaptation. Traditionally, normalization is used as separate pre-processing step. We show that integrating the normalization model into the parsing algorithm is beneficial. This way, multiple normalization candidates can be leveraged, which improves

  4. Thermo-msf-parser: an open source Java library to parse and visualize Thermo Proteome Discoverer msf files.

    Science.gov (United States)

    Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart

    2011-08-05

    The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented as a graphical user interface allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.

  5. Design and Implementation of High Level Trigger Configuration Exporter and Parser

    CERN Document Server

    Abdulwahhab, Husam

    2015-01-01

    This paper describes a project developed at CMS during the summer. The initial task was the design and implementation of a configuration exporter from an Oracle database to a Python file. Next was the development of a parser that reads all the necessary information from the Python configuration file created by the exporter, and stores the information in memory in the form of an efficient, easy-to-access cache. The final task of the project was the implementation of a system that handles requests from the client, a web interface, and replies with the appropriate data organized in a way that can be viewed on the interface.

  6. ImageParser: a tool for finite element generation from three-dimensional medical images

    Directory of Open Access Journals (Sweden)

    Yamada T

    2004-10-01

    Full Text Available Abstract Background: The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image, partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods: A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. Results: ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion: ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with customer-defined segmentation information.

  7. Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives

    DEFF Research Database (Denmark)

    Angelini, Marco; Ferro, Nicola; Larsen, Birger

    2014-01-01

    Evaluation initiatives have been widely credited with contributing highly to the development and advancement of information access systems, by providing a sustainable platform for conducting the very demanding activity of comparable experimental evaluation in a large scale. Measuring the impact...

  8. ULTRA: Universal Grammar as a Universal Parser.

    Science.gov (United States)

    Medeiros, David P

    2018-01-01

    A central concern of generative grammar is the relationship between hierarchy and word order, traditionally understood as two dimensions of a single syntactic representation. A related concern is directionality in the grammar. Traditional approaches posit process-neutral grammars, embodying knowledge of language, put to use with infinite facility both for production and comprehension. This has crystallized in the view of Merge as the central property of syntax, perhaps its only novel feature. A growing number of approaches explore grammars with different directionalities, often with more direct connections to performance mechanisms. This paper describes a novel model of universal grammar as a one-directional, universal parser. Mismatch between word order and interpretation order is pervasive in comprehension; in the present model, word order is language-particular and interpretation order (i.e., hierarchy) is universal. These orders are not two dimensions of a unified abstract object (e.g., precedence and dominance in a single tree); rather, both are temporal sequences, and UG is an invariant real-time procedure (based on Knuth's stack-sorting algorithm) transforming word order into hierarchical order. This shift in perspective has several desirable consequences. It collapses linearization, displacement, and composition into a single performance process. The architecture provides a novel source of brackets (labeled unambiguously and without search), which are understood not as part-whole constituency relations, but as storage and retrieval routines in parsing. It also explains why neutral word order within single syntactic cycles avoids 213-like permutations. The model identifies cycles as extended projections of lexical heads, grounding the notion of phase. This is achieved with a universal processor, dispensing with parameters. The empirical focus is word order in noun phrases. This domain provides some of the clearest evidence for 213-avoidance as a cross
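
The stack-sorting procedure the model builds on is easy to state. Below is a generic Python rendering of Knuth's one-pass stack sort, not the paper's parser: each input symbol is pushed onto a stack, after first popping any stack items smaller than it. A permutation comes out fully sorted exactly when it avoids a forbidden three-element pattern (231 in the convention used here; the mirror convention gives 213, the pattern the abstract discusses).

```python
# Knuth's one-pass stack sort: push each element, popping smaller
# stack items first; drain the stack at the end.

def stack_sort(perm):
    stack, out = [], []
    for x in perm:
        while stack and stack[-1] < x:
            out.append(stack.pop())
        stack.append(x)
    while stack:
        out.append(stack.pop())
    return out

ok = stack_sort([2, 1, 3])    # pattern-avoiding: comes out sorted
bad = stack_sort([2, 3, 1])   # contains the forbidden pattern: does not
```

In the paper's terms, the stack's push and pop operations double as storage and retrieval routines in parsing, which is why the pattern-avoidance constraint reappears as a constraint on neutral word order.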

  9. Manual for the ELL(2) parser generator and tree generator generator

    OpenAIRE

    Heckmann, Reinhold

    1986-01-01

    Regular right part grammars extended by tree generator specifications are interpreted by a combined parser generator and tree generator that produces an ELL(2) parser. This parser is able to translate programs of the specified language into abstract syntax trees according to the tree specifications in the generator input.

  10. Constructing a Parser for a given Deterministic Syntax Graph: A ...

    African Journals Online (AJOL)

    The rules of graph-to-program translation were laid down and followed religiously to arrive at the required program. ... The last part of the work is the translation from BNF into parser-driven data structures that is ...

  11. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
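
The claimed idea can be sketched abstractly. All names below are invented for illustration (the patent abstract publishes no API): a user-supplied parser function inspects each file before storage, may veto files that fail its semantic requirements, and may return metadata that is stored alongside the file.

```python
# Illustrative sketch of user-parser-mediated storage; names are invented.

def store_files(files, parser, nodes):
    """files: {name: bytes}; parser(name, data) -> metadata dict or None;
    nodes: list of per-node dicts standing in for storage nodes."""
    placed = {}
    for name, data in files.items():
        meta = parser(name, data)
        if meta is None:                      # semantic requirement not met
            continue
        node = hash(name) % len(nodes)        # trivial placement policy
        nodes[node][name] = (data, meta)      # metadata stored with the file
        placed[name] = node
    return placed

def sample_parser(name, data):
    # Keep only checkpoint files; record their size as searchable metadata.
    return {"bytes": len(data)} if name.endswith(".ckpt") else None

nodes = [{}, {}]
placed = store_files({"a.ckpt": b"xx", "log.txt": b"yyy"}, sample_parser, nodes)
```

The stored metadata is what would later support searching for files without re-reading their contents.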

  12. A Protocol for Annotating Parser Differences. Research Report. ETS RR-16-02

    Science.gov (United States)

    Bruno, James V.; Cahill, Aoife; Gyawali, Binod

    2016-01-01

    We present an annotation scheme for classifying differences in the outputs of syntactic constituency parsers when a gold standard is unavailable or undesired, as in the case of texts written by nonnative speakers of English. We discuss its automated implementation and the results of a case study that uses the scheme to choose a parser best suited…

  13. The power and limits of a rule-based morpho-semantic parser.

    Science.gov (United States)

    Baud, R H; Rassinoux, A M; Ruch, P; Lovis, C; Scherrer, J R

    1999-01-01

    The advent of the Electronic Patient Record (EPR) means that an increasing amount of medical text is readily available for processing, as soon as convenient tools are made available. The chief application is text analysis, from which one can drive other disciplines such as indexing for retrieval, knowledge representation, translation and inferencing for medical intelligent systems. Prerequisites for a convenient analyzer of medical texts are: building the lexicon, developing a semantic representation of the domain, having a large corpus of texts available for statistical analysis, and finally mastering robust and powerful parsing techniques in order to satisfy the constraints of the medical domain. This article presents an easy-to-use parser ready to be adapted to different settings. It describes its power together with its practical limitations as experienced by the authors.

  14. Parsley: a Command-Line Parser for Astronomical Applications

    Science.gov (United States)

    Deich, William

    Parsley is a sophisticated keyword + value parser, packaged as a library of routines that offers an easy method for providing command-line arguments to programs. It makes it easy for the user to enter values, and it makes it easy for the programmer to collect and validate the user's entries. Parsley is tuned for astronomical applications: for example, dates entered in Julian, Modified Julian, calendar, or several other formats are all recognized without special effort by the user or by the programmer; angles can be entered using decimal degrees or dd:mm:ss; time-like intervals as decimal hours, hh:mm:ss, or a variety of other units. Vectors of data are accepted as readily as scalars.
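
As a rough illustration of the multi-format value parsing that Parsley automates, the sketch below accepts an angle either as decimal degrees or as sexagesimal dd:mm:ss and normalizes it to decimal degrees. The function name and its exact behavior are assumptions for illustration, not Parsley's API.

```python
def parse_angle(text: str) -> float:
    """Parse an angle given as decimal degrees ('12.5') or as
    dd:mm:ss ('12:30:00'); return decimal degrees either way."""
    parts = text.strip().split(":")
    if len(parts) == 1:
        return float(parts[0])          # plain decimal degrees
    if len(parts) > 3:
        raise ValueError(f"not an angle: {text!r}")
    sign = -1.0 if text.lstrip().startswith("-") else 1.0
    dd = abs(float(parts[0]))
    mm = float(parts[1])
    ss = float(parts[2]) if len(parts) == 3 else 0.0
    return sign * (dd + mm / 60.0 + ss / 3600.0)
```

The same dispatch-on-shape idea extends naturally to time-like intervals (decimal hours vs. hh:mm:ss) and the several date formats the abstract mentions.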

  15. The Accelerator Markup Language and the Universal Accelerator Parser

    International Nuclear Information System (INIS)

    Sagan, D.; Forster, M.; Cornell U., LNS; Bates, D.A.; LBL, Berkeley; Wolski, A.; Liverpool U.; Cockcroft Inst. Accel. Sci. Tech.; Schmidt, F.; CERN; Walker, N.J.; DESY; Larrieu, T.; Roblin, Y.; Jefferson Lab; Pelaia, T.; Oak Ridge; Tenenbaum, P.; Woodley, M.; SLAC; Reiche, S.; UCLA

    2006-01-01

    A major obstacle to collaboration on accelerator projects has been the sharing of lattice description files between modeling codes. To address this problem, a lattice description format called Accelerator Markup Language (AML) has been created. AML is based upon the standard eXtensible Markup Language (XML) format; this provides the flexibility for AML to be easily extended to satisfy changing requirements. In conjunction with AML, a software library, called the Universal Accelerator Parser (UAP), is being developed to speed the integration of AML into any program. The UAP is structured to make it relatively straightforward (by giving appropriate specifications) to read and write lattice files in any format. This will allow programs that use the UAP code to read a variety of different file formats. Additionally, this will greatly simplify conversion of files from one format to another. Currently, besides AML, the UAP supports the MAD lattice format.

  16. Combining shallow and deep processing for a robust, fast, deep-linguistic dependency parser

    OpenAIRE

    Schneider, G

    2004-01-01

    This paper describes Pro3Gres, a fast, robust, broad-coverage parser that delivers deep-linguistic grammatical relation structures as output, which are closer to predicate-argument structures and more informative than pure constituency structures. The parser stays as shallow as is possible for each task, combining shallow and deep-linguistic methods by integrating chunking and by expressing the majority of long-distance dependencies in a context-free way. It combines statistical and rule-base...

  17. "cba to check the spelling" investigating parser performance on discussion forum posts

    OpenAIRE

    Foster, Jennifer

    2010-01-01

    We evaluate the Berkeley parser on text from an online discussion forum. We evaluate the parser output with and without gold tokens and spellings (using Sparseval and Parseval), and we compile a list of problematic phenomena for this domain. The Parseval f-score for a small development set is 77.56. This increases to 80.27 when we apply a set of simple transformations to the input sentences and to the Wall Street Journal (WSJ) training sections.

  18. The CLaC Discourse Parser at CoNLL-2015

    OpenAIRE

    Laali, Majid; Davoodi, Elnaz; Kosseim, Leila

    2017-01-01

    This paper describes our submission (kosseim15) to the CoNLL-2015 shared task on shallow discourse parsing. We used the UIMA framework to develop our parser and used ClearTK to add machine learning functionality to the UIMA framework. Overall, our parser achieves a result of 17.3 F1 on the identification of discourse relations on the blind CoNLL-2015 test set, ranking in sixth place.

  19. ACPYPE - AnteChamber PYthon Parser interfacE

    Directory of Open Access Journals (Sweden)

    Sousa da Silva Alan W

    2012-07-01

    Background: ACPYPE (or AnteChamber PYthon Parser interfacE) is a wrapper script around the ANTECHAMBER software that simplifies the generation of small molecule topologies and parameters for a variety of molecular dynamics programmes like GROMACS, CHARMM and CNS. It is written in the Python programming language and was developed as a tool for interfacing with other Python based applications such as the CCPN software suite (for NMR data analysis) and ARIA (for structure calculations from NMR data). ACPYPE is open source code, under GNU GPL v3, and is available as a stand-alone application at http://www.ccpn.ac.uk/acpype and as a web portal application at http://webapps.ccpn.ac.uk/acpype. Findings: We verified the topologies generated by ACPYPE in three ways: by comparing with default AMBER topologies for standard amino acids; by generating and verifying topologies for a large set of ligands from the PDB; and by recalculating the structures for 5 protein–ligand complexes from the PDB. Conclusions: ACPYPE is a tool that simplifies the automatic generation of topology and parameters in different formats for different molecular mechanics programmes, including calculation of partial charges, while being object oriented for integration with other applications.

  20. ACPYPE - AnteChamber PYthon Parser interfacE.

    Science.gov (United States)

    Sousa da Silva, Alan W; Vranken, Wim F

    2012-07-23

    ACPYPE (or AnteChamber PYthon Parser interfacE) is a wrapper script around the ANTECHAMBER software that simplifies the generation of small molecule topologies and parameters for a variety of molecular dynamics programmes like GROMACS, CHARMM and CNS. It is written in the Python programming language and was developed as a tool for interfacing with other Python based applications such as the CCPN software suite (for NMR data analysis) and ARIA (for structure calculations from NMR data). ACPYPE is open source code, under GNU GPL v3, and is available as a stand-alone application at http://www.ccpn.ac.uk/acpype and as a web portal application at http://webapps.ccpn.ac.uk/acpype. We verified the topologies generated by ACPYPE in three ways: by comparing with default AMBER topologies for standard amino acids; by generating and verifying topologies for a large set of ligands from the PDB; and by recalculating the structures for 5 protein-ligand complexes from the PDB. ACPYPE is a tool that simplifies the automatic generation of topology and parameters in different formats for different molecular mechanics programmes, including calculation of partial charges, while being object oriented for integration with other applications.

  1. Processing the ITU vocabulary: revisions and adaptations to the Pisa syntactic-semantic parser

    OpenAIRE

    Peters, Carol; Federici, Stefano; Montemagni, Simonetta; Calzolari, Nicoletta

    1993-01-01

    The first version of the Pisa syntactic-semantic parser was described in detail in Deliverable 4, Section 2 and Appendices 2,3, and 4. The scope of this report is to discuss the testing of the parser on the sample set of vocabulary which has been selected from the ITU Corpus (see Deliverable 6.1) and to illustrate the revisions and extensions that are now being implemented. The report therefore concentrates on presenting analysis and extraction activities. We need to specify clearly all the k...

  2. MASCOT HTML and XML parser: an implementation of a novel object model for protein identification data.

    Science.gov (United States)

    Yang, Chunguang G; Granite, Stephen J; Van Eyk, Jennifer E; Winslow, Raimond L

    2006-11-01

    Protein identification using MS is an important technique in proteomics as well as a major generator of proteomics data. We have designed the protein identification data object model (PDOM) and developed a parser based on this model to facilitate the analysis and storage of these data. The parser works with HTML or XML files saved or exported from MASCOT MS/MS ions search in peptide summary report or MASCOT PMF search in protein summary report. The program creates PDOM objects, eliminates redundancy in the input file, and has the capability to output any PDOM object to a relational database. This program facilitates additional analysis of MASCOT search results and aids the storage of protein identification information. The implementation is extensible and can serve as a template to develop parsers for other search engines. The parser can be used as a stand-alone application or can be driven by other Java programs. It is currently being used as the front end for a system that loads HTML and XML result files of MASCOT searches into a relational database. The source code is freely available at http://www.ccbm.jhu.edu and the program uses only free and open-source Java libraries.

  3. A domain specific language for the automatic generation of parsers classes for text protocols

    OpenAIRE

    Kistel, Thomas; Vandenhouten, Ralf

    2014-01-01

    ABNF is a language for defining a formal syntax for technical specifications and is frequently used to describe the textual messages of Internet protocols. The options for automatically generating parser classes from ABNF specifications are currently very limited, because ABNF describes only the transfer syntax and production rules of text messages. The lack of variable-name definitions within an ABNF specification does not allow meaningful...

  4. Governance of extended lifecycle in large-scale eHealth initiatives: analyzing variability of enterprise architecture elements.

    Science.gov (United States)

    Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika

    2013-01-01

    The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals using interrelated projects related to the national ePrescription in Finland.

  5. "gnparser": a powerful parser for scientific names based on Parsing Expression Grammar.

    Science.gov (United States)

    Mozzherin, Dmitry Y; Myltsev, Alexander A; Patterson, David J

    2017-05-26

    Scientific names in biology act as universal links. They allow us to cross-reference information about organisms globally. However, variations in spelling of scientific names greatly diminish their ability to interconnect data. Such variations may include abbreviations, annotations, misspellings, etc. Authorship is a part of a scientific name and may also differ significantly. To match all possible variations of a name, we need to divide them into their elements and classify each element according to its role. We refer to this as 'parsing' the name. Parsing categorizes a name's elements into those that are stable and those that are prone to change. Names are matched first by combining them according to their stable elements. Matches are then refined by examining their varying elements. This two-stage process dramatically improves the number and quality of matches. It is especially useful for the automatic data exchange within the context of "Big Data" in biology. We introduce Global Names Parser (gnparser). It is a Java tool written in the Scala language (a language for the Java Virtual Machine) to parse scientific names. It is based on a Parsing Expression Grammar. The parser can be applied to scientific names of any complexity. It assigns a semantic meaning (such as genus name, species epithet, rank, year of publication, authorship, annotations, etc.) to all elements of a name. It is able to work with nested structures as in the names of hybrids. gnparser performs with ≈99% accuracy and processes 30 million name-strings/hour per CPU thread. The gnparser library is compatible with Scala, Java, R, Jython, and JRuby. The parser can be used as a command line application, as a socket server, a web-app or as a RESTful HTTP-service. It is released under an open source MIT license. Global Names Parser (gnparser) is a fast, high precision tool for biodiversity informaticians and biologists working with large numbers of scientific names. It can replace expensive and error
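
gnparser's actual grammar is a full Parsing Expression Grammar covering hybrids, annotations, and nested authorship; the toy sketch below only illustrates the underlying idea of assigning semantic roles (genus, species epithet, authorship, year) to a name's elements. The regular expression and function name are illustrative assumptions, not gnparser code, and cover only simple binomials.

```python
import re

# Toy role-assignment for a simple binomial name with optional
# authorship and year, e.g. "Homo sapiens Linnaeus, 1758".
NAME_RE = re.compile(
    r"^(?P<genus>[A-Z][a-z]+)\s+"
    r"(?P<species>[a-z]+)"
    r"(?:\s+(?P<authorship>[A-Z][A-Za-z.]*))?"
    r"(?:,\s*(?P<year>\d{4}))?$"
)

def parse_name(name: str) -> dict:
    """Split a name-string into labeled elements; raise on unparsed input."""
    m = NAME_RE.match(name.strip())
    if m is None:
        raise ValueError(f"unparsed name: {name!r}")
    # Keep only the elements actually present in this name.
    return {k: v for k, v in m.groupdict().items() if v is not None}
```

Matching on the stable elements (genus, species) first and refining on the varying ones (authorship, year) is then a matter of comparing the corresponding dict entries.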

  6. GBParsy: A GenBank flatfile parser library with high speed

    Directory of Open Access Journals (Sweden)

    Kim Yeon-Ki

    2008-07-01

    Background: GenBank flatfile (GBF) format is one of the most popular sequence file formats because of its detailed sequence features and ease of readability. To use the data in the file by a computer, a parsing process is required and is performed according to a given grammar for the sequence and the description in a GBF. Currently, several parser libraries for the GBF have been developed. However, with the accumulation of DNA sequence information from eukaryotic chromosomes, parsing a eukaryotic genome sequence with these libraries inevitably takes a long time, due to the large GBF file and its correspondingly large genomic nucleotide sequence and related feature information. Thus, there is a significant need for a parsing program with high speed and efficient use of system memory. Results: We developed a library, GBParsy, which is C language-based and parses GBF files. The parsing speed was maximized by using content-specified functions in place of regular expressions, which are flexible but slow. In addition, we optimized an algorithm related to memory usage so that it also increased parsing performance and efficiency of memory usage. GBParsy is at least 5-100x faster than current parsers in benchmark tests. Conclusion: GBParsy is estimated to extract annotated information from almost 100 Mb of a GenBank flatfile of chromosomal sequence information within a second. Thus, it should be used for a variety of applications such as on-time visualization of a genome at a web site.
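
GBParsy itself is written in C; the Python sketch below only illustrates its stated design choice of content-specified handling instead of regular expressions. Here that means dispatching on the fixed 12-column keyword field of a GBF header line rather than pattern-matching every line. Function and variable names are illustrative, and real GBF parsing (FEATURES, ORIGIN, etc.) is far more involved.

```python
def parse_gbf_header(lines):
    """Collect top-level GBF header fields, joining continuation lines.

    GBF keywords occupy columns 1-12; a line whose first 12 columns are
    blank continues the previous field, so no regex is needed.
    """
    record = {}
    key = None
    for line in lines:
        keyword, rest = line[:12].strip(), line[12:].rstrip("\n")
        if keyword:          # new top-level field
            key = keyword
            record[key] = rest
        elif key is not None:  # continuation line: same field, indented
            record[key] += " " + rest.strip()
    return record
```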

  7. On different approaches to syntactic analysis into bi-lexical dependencies: An empirical comparison of direct, PCFG-based, and HPSG-based parsers

    Directory of Open Access Journals (Sweden)

    Angelina Ivanova

    2016-04-01

    We compare three different approaches to parsing into syntactic, bi-lexical dependencies for English: a 'direct' data-driven dependency parser, a statistical phrase structure parser, and a hybrid, 'deep' grammar-driven parser. The analyses from the latter two are post-converted to bi-lexical dependencies. Through this 'reduction' of all three approaches to syntactic dependency parsers, we determine empirically what performance can be obtained for a common set of dependency types for English, across a broad variety of domains. In doing so, we observe what trade-offs apply along three dimensions: accuracy, efficiency, and resilience to domain variation. Our results suggest that the hand-built grammar in one of our parsers helps in both accuracy and cross-domain parsing performance, but these accuracy gains do not necessarily translate to improvements in the downstream task of negation resolution.

  8. An acetone breath analyzer using cavity ringdown spectroscopy: an initial test with human subjects under various situations

    International Nuclear Information System (INIS)

    Wang, Chuji; Surampudi, Anand B

    2008-01-01

    We have developed a portable breath acetone analyzer using cavity ringdown spectroscopy (CRDS). The instrument was initially tested by measuring the absorbance of breath gases at a single wavelength (266 nm) from 32 human subjects under various conditions. A background subtraction method, implemented to obtain absorbance differences, from which an upper limit of breath acetone concentration was obtained, is described. The upper limits of breath acetone concentration in the four Type 1 diabetes (T1D) subjects, tested after a 14 h overnight fast, range from 0.80 to 3.97 parts per million by volume (ppmv), higher than the mean acetone concentration (0.49 ppmv) in non-diabetic healthy breath reported in the literature. The preliminary results show that the instrument can tell distinctive differences between the breath from individuals who are healthy and those with T1D. On-line monitoring of breath gases in healthy people post-exercise, post-meals and post-alcohol-consumption was also conducted. This exploratory study demonstrates the first CRDS-based acetone breath analyzer and its potential application for point-of-care, non-invasive, diabetic monitoring

  9. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm

    Directory of Open Access Journals (Sweden)

    Christian Lester D. Gimeno

    2017-11-01

    This research study focused on the development of software that helps students design, write, validate and run their pseudocode in a semi-Integrated Development Environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer (lexer), a syntax analyzer (parser) using a recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads pseudocode source as a sequence of symbols or characters, grouping them into lexemes. The lexemes are then analyzed by the lexer, which matches patterns for valid tokens and passes them to the syntax analyzer, or parser. The syntax analyzer takes those valid tokens and builds meaningful commands, using the recursive descent parsing algorithm, in the form of an abstract syntax tree. The generation of the abstract syntax tree is based on the grammar rule created by the researcher and expressed in Extended Backus-Naur Form. The interpreter takes the generated abstract syntax tree and evaluates it to produce the pseudocode output. The software was evaluated using white-box testing by several ICT professionals and black-box testing by several computer science students, based on the International Organization for Standardization (ISO) 9126 software quality standards. The overall results of the evaluation, for both white-box and black-box testing, were described as "Excellent" in terms of functionality, reliability, usability, efficiency, maintainability and portability.
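
The lexer -> recursive-descent parser -> interpreter pipeline described above can be illustrated with a self-contained arithmetic-expression example. The grammar, written in EBNF as in the study, is an assumption chosen for brevity; the study's own pseudocode grammar is of course richer.

```python
import re

# Grammar (EBNF):
#   expr   = term   { ("+" | "-") term } ;
#   term   = factor { ("*" | "/") factor } ;
#   factor = NUMBER | "(" expr ")" ;
TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    """Lexer: turn source characters into (kind, value) tokens."""
    for num, op in TOKEN.findall(src):
        yield ("NUM", int(num)) if num else ("OP", op)

class Parser:
    """Recursive-descent parser: one method per grammar rule, building an AST."""
    def __init__(self, tokens):
        self.tokens = list(tokens)
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else (None, None)

    def eat(self, expected=None):
        kind, val = self.peek()
        if expected is not None and val != expected:
            raise SyntaxError(f"expected {expected!r}, got {val!r}")
        self.pos += 1
        return val

    def expr(self):
        node = self.term()
        while self.peek()[1] in ("+", "-"):
            node = (self.eat(), node, self.term())   # left-associative
        return node

    def term(self):
        node = self.factor()
        while self.peek()[1] in ("*", "/"):
            node = (self.eat(), node, self.factor())
        return node

    def factor(self):
        kind, _ = self.peek()
        if kind == "NUM":
            return self.eat()
        self.eat("(")
        node = self.expr()
        self.eat(")")
        return node

def evaluate(node):
    """Interpreter: walk the AST and compute its value."""
    if isinstance(node, int):
        return node
    op, left, right = node
    l, r = evaluate(left), evaluate(right)
    return {"+": l + r, "-": l - r, "*": l * r, "/": l / r}[op]
```

Each nonterminal in the EBNF maps directly to one method, which is exactly what makes recursive descent a natural fit for a hand-written, grammar-driven interpreter like the one the study describes.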

  10. jmzReader: A Java parser library to process and visualize multiple text and XML-based mass spectrometry data formats.

    Science.gov (United States)

    Griss, Johannes; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2012-03-01

    Here we present the jmzReader library: a collection of Java application programming interfaces (APIs) to parse the most commonly used peak list and XML-based mass spectrometry (MS) data formats: DTA, MS2, MGF, PKL, mzXML, mzData, and mzML (based on the already existing API jmzML). The library is optimized to be used in conjunction with mzIdentML, the recently released standard data format for reporting protein and peptide identifications, developed by the HUPO proteomics standards initiative (PSI). mzIdentML files do not contain spectra data but contain references to different kinds of external MS data files. As a key functionality, all parsers implement a common interface that supports the various methods used by mzIdentML to reference external spectra. Thus, when developing software for mzIdentML, programmers no longer have to support multiple MS data file formats but only this one interface. The library (which includes a viewer) is open source and, together with detailed documentation, can be downloaded from http://code.google.com/p/jmzreader/.
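
The key design point, a single interface implemented by every format-specific parser, can be sketched as follows. Note that this sketch is in Python rather than Java, and all class and method names are illustrative assumptions; jmzReader's actual API differs.

```python
from abc import ABC, abstractmethod

class SpectrumSource(ABC):
    """The one interface spectrum-consuming code programs against."""
    @abstractmethod
    def get_spectrum(self, spec_id: str) -> list:
        """Return the (m/z, intensity) peak list for one spectrum."""

class MgfSource(SpectrumSource):
    """MGF-style source: spectra referenced by string id."""
    def __init__(self, spectra: dict):
        self._spectra = spectra
    def get_spectrum(self, spec_id):
        return self._spectra[spec_id]

class MzXmlSource(SpectrumSource):
    """mzXML-style source: spectra referenced by scan number."""
    def __init__(self, scans: dict):
        self._scans = scans
    def get_spectrum(self, spec_id):
        return self._scans[int(spec_id)]

def peak_count(source: SpectrumSource, spec_id: str) -> int:
    # Works with any format behind the common interface.
    return len(source.get_spectrum(spec_id))
```

Code that resolves mzIdentML spectrum references only ever sees SpectrumSource, which is the "one interface instead of many formats" property the abstract emphasizes.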

  11. MLS-Net and SecureParser®: A New Method for Securing and Segregating Network Data

    Directory of Open Access Journals (Sweden)

    Robert A. Johnson

    2008-10-01

    A new method of network security and virtualization is presented which allows the consolidation of multiple network infrastructures dedicated to single security levels or communities of interest onto a single, virtualized network. An overview of the state of the art of network security protocols is presented, including the use of SSL, IPSec, and HAIPE IS, followed by a discussion of the SecureParser® technology and MLS-Net architecture, which in combination allow the virtualization of local network enclaves.

  12. Development of an event-driven parser for active document and web-based nuclear design system

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yong Soo

    2005-02-15

    Nuclear design works consist of extensive unit job modules in which many computer codes are used. Each unit module requires time-consuming and error-prone input preparation, code runs, output analysis and a quality assurance process. The task of safety evaluation of the reload core is especially the most man-power intensive and time-consuming due to the large amount of calculations and data exchanges. The purpose of this study is to develop a new nuclear design system called Innovative Design Processor (IDP) in order to minimize human effort and maximize design quality and productivity, and then to achieve an ultimately optimized core loading pattern. Two new basic principles of IDP are the document-oriented design and the web-based design. Contrary to the conventional code-oriented or procedure-oriented design, the document-oriented design is human-oriented in that the final document is automatically prepared with complete analysis, tables and plots, if the designer writes a design document called an active document and feeds it to a parser. This study defined a number of active components and developed an event-driven parser for the active document in HTML (Hypertext Markup Language) or XML (Extensible Markup Language). The active documents can be created on the web, which is another framework of IDP. Using a proper mix of server-side and client-side programming under the HAMP (HP-UX/Apache/MySQL/PHP) environment, the document-oriented design process on the web is modeled as a design wizard for the designer's convenience and platform independence. This automation using IDP was tested for the reload safety evaluation of Korea Standard Nuclear Power Plant (KSNP) type PWRs. Great time saving was confirmed, and IDP can complete several-month jobs in a few days. A more optimized core loading pattern, therefore, can be obtained, since it takes little time to do the reload safety evaluation tasks with several core loading pattern candidates. Since the technology is also applicable to

  13. Development of an event-driven parser for active document and web-based nuclear design system

    International Nuclear Information System (INIS)

    Park, Yong Soo

    2005-02-01

    Nuclear design works consist of extensive unit job modules in which many computer codes are used. Each unit module requires time-consuming and error-prone input preparation, code runs, output analysis and a quality assurance process. The task of safety evaluation of the reload core is especially the most man-power intensive and time-consuming due to the large amount of calculations and data exchanges. The purpose of this study is to develop a new nuclear design system called Innovative Design Processor (IDP) in order to minimize human effort and maximize design quality and productivity, and then to achieve an ultimately optimized core loading pattern. Two new basic principles of IDP are the document-oriented design and the web-based design. Contrary to the conventional code-oriented or procedure-oriented design, the document-oriented design is human-oriented in that the final document is automatically prepared with complete analysis, tables and plots, if the designer writes a design document called an active document and feeds it to a parser. This study defined a number of active components and developed an event-driven parser for the active document in HTML (Hypertext Markup Language) or XML (Extensible Markup Language). The active documents can be created on the web, which is another framework of IDP. Using a proper mix of server-side and client-side programming under the HAMP (HP-UX/Apache/MySQL/PHP) environment, the document-oriented design process on the web is modeled as a design wizard for the designer's convenience and platform independence. This automation using IDP was tested for the reload safety evaluation of Korea Standard Nuclear Power Plant (KSNP) type PWRs. Great time saving was confirmed, and IDP can complete several-month jobs in a few days. A more optimized core loading pattern, therefore, can be obtained, since it takes little time to do the reload safety evaluation tasks with several core loading pattern candidates.
Since the technology is also applicable to the

  14. Implementación y pruebas de REsource LOcation And Discovery (RELOAD) Parser and Encoder

    OpenAIRE

    Jiménez Bolonio, Jaime Antonio

    2009-01-01

    The widely used client/server paradigm is being complemented and even replaced by other Peer-to-Peer (P2P) approaches. P2P networks offer a decentralized system of information distribution, are more stable, and represent a solution to the scalability problem. At the same time, the Session Initiation Protocol (SIP), a signalling protocol initially designed for client/server architectures, has been widely adopted...

  15. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high-performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample rate changing, which may be intermixed with multiple post-trigger operations with variable-length blocks using normal, peak-to-peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation both at PPPL and in general.

  16. Diel cycling of zinc in a stream impacted by acid rock drainage: Initial results from a new in situ Zn analyzer

    Science.gov (United States)

    Chapin, T.P.; Nimick, D.A.; Gammons, C.H.; Wanty, R.B.

    2007-01-01

    Recent work has demonstrated that many trace metals undergo dramatic diel (24-h) cycles in near-neutral-pH streams, with metal concentrations reproducibly changing by up to 500% during the diel period (Nimick et al., 2003). To examine diel zinc cycles in streams affected by acid rock drainage, we have developed a novel instrument, the Zn-DigiScan, to continuously monitor in situ zinc concentrations in near real-time. Initial results from a 3-day deployment at Fisher Creek, Montana have demonstrated the ability of the Zn-DigiScan to record diel Zn cycling at levels below 100 µg/l. Longer deployments of this instrument could be used to examine the effects of episodic events such as rainstorms and snowmelt pulses on zinc loading in streams affected by acid rock drainage.

  17. Impact of reduced-radiation dual-energy protocols using 320-detector row computed tomography for analyzing urinary calculus components: initial in vitro evaluation.

    Science.gov (United States)

    Cai, Xiangran; Zhou, Qingchun; Yu, Juan; Xian, Zhaohui; Feng, Youzhen; Yang, Wencai; Mo, Xukai

    2014-10-01

    To evaluate the impact of reduced-radiation dual-energy (DE) protocols using 320-detector row computed tomography on the differentiation of urinary calculus components. A total of 58 urinary calculi were placed into the same phantom and underwent DE scanning with 320-detector row computed tomography. Each calculus was scanned 4 times with the DE protocols using 135 kV and 80 kV tube voltage and different tube current combinations, including 100 mA and 570 mA (group A), 50 mA and 290 mA (group B), 30 mA and 170 mA (group C), and 10 mA and 60 mA (group D). The acquisition data of all 4 groups were then analyzed by stone DE analysis software, and the results were compared with x-ray diffraction analysis. Noise, contrast-to-noise ratio, and radiation dose were compared. Calculi were correctly identified in 56 of 58 stones (96.6%) using group A and B protocols. However, only 35 stones (60.3%) and 16 stones (27.6%) were correctly diagnosed using group C and D protocols, respectively. Mean noise increased significantly and mean contrast-to-noise ratio decreased significantly from groups A to D. The group B protocol thus allowed accurate calculus component analysis while reducing patient radiation exposure to 1.81 mSv. Further reduction of tube currents may compromise diagnostic accuracy.

  18. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits having the inverse-function characteristics of the calibration functions that correspond to the radiation measuring subsystems. A weighting adder forms a desired linear combination of the outputs of the linearizing circuits. Operators for operating on two or more different linear combinations are included.

  19. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  20. Microsoft Biology Initiative: .NET Bioinformatics Platform and Tools

    Science.gov (United States)

    Diaz Acosta, B.

    2011-01-01

    The Microsoft Biology Initiative (MBI) is an effort in Microsoft Research to bring new technology and tools to the area of bioinformatics and biology. This initiative is comprised of two primary components, the Microsoft Biology Foundation (MBF) and the Microsoft Biology Tools (MBT). MBF is a language-neutral bioinformatics toolkit built as an extension to the Microsoft .NET Framework—initially aimed at the area of Genomics research. Currently, it implements a range of parsers for common bioinformatics file formats; a range of algorithms for manipulating DNA, RNA, and protein sequences; and a set of connectors to biological web services such as NCBI BLAST. MBF is available under an open source license, and executables, source code, demo applications, documentation and training materials are freely downloadable from http://research.microsoft.com/bio. MBT is a collection of tools that enable biology and bioinformatics researchers to be more productive in making scientific discoveries.

  1. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
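    The co-occurrence idea can be sketched compactly: given a binary visible/occluded label per surface cluster and a cluster adjacency list, the normalized 2x2 co-occurrence matrix already distinguishes coherent from fragmented visibility. A hypothetical Python sketch (the paper operates on clusters of triangles in real scenes, not this toy strip):

```python
def visibility_cooccurrence(labels, adjacency):
    """2x2 co-occurrence matrix over adjacent surface clusters.

    labels[i] is 1 if cluster i is visible, 0 if occluded;
    adjacency is a list of (i, j) neighbor pairs.
    Returns a normalized 4-vector usable as a classifier feature.
    """
    counts = [[0, 0], [0, 0]]
    for i, j in adjacency:
        counts[labels[i]][labels[j]] += 1
        counts[labels[j]][labels[i]] += 1  # symmetric relation
    total = sum(sum(row) for row in counts) or 1
    return [counts[a][b] / total for a in (0, 1) for b in (0, 1)]

# A strip of 5 clusters, alternating visible/occluded: all mass lands in
# the mixed (0,1)/(1,0) cells, signalling a fragmented configuration.
labels = [1, 0, 1, 0, 1]
adj = [(0, 1), (1, 2), (2, 3), (3, 4)]
print(visibility_cooccurrence(labels, adj))
```

    A coherent configuration (one visible region, one occluded region) would instead concentrate mass in the (0,0) and (1,1) cells, and a classifier can be trained on such feature vectors.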

  2. Determining initial enrichment, burnup, and cooling time of pressurized-water-reactor spent fuel assemblies by analyzing passive gamma spectra measured at the Clab interim-fuel storage facility in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Favalli, A., E-mail: afavalli@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM (United States); Vo, D. [Los Alamos National Laboratory, Los Alamos, NM (United States); Grogan, B. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Jansson, P. [Uppsala University, Uppsala (Sweden); Liljenfeldt, H. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Mozin, V. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Schwalbach, P. [European Commission, DG Energy, Euratom Safeguards Luxemburg, Luxemburg (Luxembourg); Sjöland, A. [Swedish Nuclear Fuel and Waste Management Company, Stockholm (Sweden); Tobin, S.J.; Trellue, H. [Los Alamos National Laboratory, Los Alamos, NM (United States); Vaccaro, S. [European Commission, DG Energy, Euratom Safeguards Luxemburg, Luxemburg (Luxembourg)

    2016-06-01

    The purpose of the Next Generation Safeguards Initiative (NGSI)–Spent Fuel (SF) project is to strengthen the technical toolkit of safeguards inspectors and/or other interested parties. The NGSI–SF team is working to achieve the following technical goals more easily and efficiently than in the past using nondestructive assay measurements of spent fuel assemblies: (1) verify the initial enrichment, burnup, and cooling time of facility declaration; (2) detect the diversion or replacement of pins; (3) estimate the plutonium mass [which is also a function of the variables in (1)]; (4) estimate the decay heat; and (5) determine the reactivity of spent fuel assemblies. Since August 2013, a set of measurement campaigns has been conducted at the Central Interim Storage Facility for Spent Nuclear Fuel (Clab), in collaboration with Swedish Nuclear Fuel and Waste Management Company (SKB). One purpose of the measurement campaigns was to acquire passive gamma spectra with high-purity germanium and lanthanum bromide scintillation detectors from Pressurized Water Reactor and Boiling Water Reactor spent fuel assemblies. The absolute {sup 137}Cs count rate and the {sup 154}Eu/{sup 137}Cs, {sup 134}Cs/{sup 137}Cs, {sup 106}Ru/{sup 137}Cs, and {sup 144}Ce/{sup 137}Cs isotopic ratios were extracted; these values were used to construct corresponding model functions (which describe each measured quantity’s behavior over various combinations of burnup, cooling time, and initial enrichment) and then were used to determine those same quantities in each measured spent fuel assembly. The results obtained in comparison with the operator declared values, as well as the methodology developed, are discussed in detail in the paper.

  3. Determining initial enrichment, burnup, and cooling time of pressurized-water-reactor spent fuel assemblies by analyzing passive gamma spectra measured at the Clab interim-fuel storage facility in Sweden

    Science.gov (United States)

    Favalli, A.; Vo, D.; Grogan, B.; Jansson, P.; Liljenfeldt, H.; Mozin, V.; Schwalbach, P.; Sjöland, A.; Tobin, S. J.; Trellue, H.; Vaccaro, S.

    2016-06-01

    The purpose of the Next Generation Safeguards Initiative (NGSI)-Spent Fuel (SF) project is to strengthen the technical toolkit of safeguards inspectors and/or other interested parties. The NGSI-SF team is working to achieve the following technical goals more easily and efficiently than in the past using nondestructive assay measurements of spent fuel assemblies: (1) verify the initial enrichment, burnup, and cooling time of facility declaration; (2) detect the diversion or replacement of pins; (3) estimate the plutonium mass [which is also a function of the variables in (1)]; (4) estimate the decay heat; and (5) determine the reactivity of spent fuel assemblies. Since August 2013, a set of measurement campaigns has been conducted at the Central Interim Storage Facility for Spent Nuclear Fuel (Clab), in collaboration with Swedish Nuclear Fuel and Waste Management Company (SKB). One purpose of the measurement campaigns was to acquire passive gamma spectra with high-purity germanium and lanthanum bromide scintillation detectors from Pressurized Water Reactor and Boiling Water Reactor spent fuel assemblies. The absolute 137Cs count rate and the 154Eu/137Cs, 134Cs/137Cs, 106Ru/137Cs, and 144Ce/137Cs isotopic ratios were extracted; these values were used to construct corresponding model functions (which describe each measured quantity's behavior over various combinations of burnup, cooling time, and initial enrichment) and then were used to determine those same quantities in each measured spent fuel assembly. The results obtained in comparison with the operator declared values, as well as the methodology developed, are discussed in detail in the paper.
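    The model-function inversion can be illustrated with the simplest of the listed observables: the 134Cs/137Cs ratio, which for a fixed discharge value decays with cooling time at the difference of the two decay constants. A one-variable Python sketch (pure decay only; the actual model functions also depend on burnup and initial enrichment, and the discharge ratio here is assumed, not measured):

```python
import math

T_HALF_CS134 = 2.065   # years
T_HALF_CS137 = 30.17   # years

def ratio_at(cooling_time, ratio_at_discharge):
    """134Cs/137Cs ratio after cooling_time years (pure-decay model)."""
    lam = math.log(2) / T_HALF_CS134 - math.log(2) / T_HALF_CS137
    return ratio_at_discharge * math.exp(-lam * cooling_time)

def cooling_time_from_ratio(measured_ratio, ratio_at_discharge):
    """Invert the model function to recover cooling time."""
    lam = math.log(2) / T_HALF_CS134 - math.log(2) / T_HALF_CS137
    return math.log(ratio_at_discharge / measured_ratio) / lam

t = cooling_time_from_ratio(ratio_at(12.0, 0.9), 0.9)
print(round(t, 3))  # recovers the 12-year cooling time
```

    The short 134Cs half-life relative to 137Cs is what makes this ratio a sensitive cooling-time clock over the first couple of decades.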

  4. Analyzing Ambiguity of Context-Free Grammars

    DEFF Research Database (Denmark)

    Brabrand, Claus; Giegerich, Robert; Møller, Anders

    2010-01-01

    It has been known since 1962 that the ambiguity problem for context-free grammars is undecidable. Ambiguity in context-free grammars is a recurring problem in language design and parser generation, as well as in applications where grammars are used as models of real-world physical structures. We observe that there is a simple linguistic characterization of the grammar ambiguity problem, and we show how to exploit this by presenting an ambiguity analysis framework based on conservative language approximations. As a concrete example, we propose a technique based on local regular approximations...

  5. Analyzing Ambiguity of Context-Free Grammars

    DEFF Research Database (Denmark)

    Brabrand, Claus; Giegerich, Robert; Møller, Anders

    2007-01-01

    It has been known since 1962 that the ambiguity problem for context-free grammars is undecidable. Ambiguity in context-free grammars is a recurring problem in language design and parser generation, as well as in applications where grammars are used as models of real-world physical structures. We observe that there is a simple linguistic characterization of the grammar ambiguity problem, and we show how to exploit this to conservatively approximate the problem based on local regular approximations and grammar unfoldings. As an application, we consider grammars that occur in RNA analysis...
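    Because the general problem is undecidable, any concrete check is either conservative (as in this work) or a semi-decision procedure. The brute-force semi-decision is easy to state: enumerate strings up to a bound and count parse trees with a CYK-style dynamic program, reporting a witness if any string has more than one tree. A sketch for a grammar in Chomsky normal form:

```python
from itertools import product

# CNF grammar: A -> B C or A -> 'a'. Classic ambiguous example:
# S -> S S | 'a'   (every string a^n with n >= 3 has multiple parse trees)
grammar = {"S": [("S", "S"), ("a",)]}
start, terminals = "S", ["a"]

def count_parses(word):
    """CYK-style count of distinct parse trees for word (tuple of terminals)."""
    n = len(word)
    table = [[{} for _ in range(n + 1)] for _ in range(n)]  # table[i][span]
    for i, ch in enumerate(word):
        for head, bodies in grammar.items():
            table[i][1][head] = sum(1 for b in bodies if b == (ch,))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            for head, bodies in grammar.items():
                total = 0
                for b in bodies:
                    if len(b) == 2:
                        for k in range(1, span):
                            total += (table[i][k].get(b[0], 0)
                                      * table[i + k][span - k].get(b[1], 0))
                table[i][span][head] = total
    return table[0][n].get(start, 0)

def find_ambiguous_string(max_len):
    """Semi-decision: search for a witness string with more than one tree."""
    for n in range(1, max_len + 1):
        for w in product(terminals, repeat=n):
            if count_parses(w) > 1:
                return "".join(w)
    return None

print(find_ambiguous_string(4))
```

    Such enumeration can only ever confirm ambiguity; the conservative approximations of the paper are needed to certify its absence.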

  6. Heat stress-induced loss of eukaryotic initiation factor 5A (eIF-5A) in a human pancreatic cancer cell line, MIA PaCa-2, analyzed by two-dimensional gel electrophoresis.

    Science.gov (United States)

    Takeuchi, Kana; Nakamura, Kazuyuki; Fujimoto, Masanori; Kaino, Seiji; Kondoh, Satoshi; Okita, Kiwamu

    2002-02-01

    Alterations of intracellular proteins during the process of heat stress-induced cell death of a human pancreatic cancer cell line, MIA PaCa-2, were investigated using two-dimensional gel electrophoresis (2-DE), agarose gel electrophoresis, and cell biology techniques. Incubation of MIA PaCa-2 at 45 degrees C for 30 min decreased the cell growth rate and cell viability without causing chromosomal DNA fragmentation. Incubation at 51 degrees C for 30 min suppressed cell growth and again led to death without DNA fragmentation. The cell death was associated with the loss of an intracellular protein of M(r) 17,500 and pI 5.2 on 2-DE gel. This protein was determined to be eukaryotic initiation factor SA (eIF-5A) by microsequencing of the N-terminal region of peptide fragments obtained by cyanogen bromide treatment of the protein blotted onto a polyvinylidene difluoride (PVDF) membrane. The sequences detected were QXSALRKNGFVVLKGRP and STSKTGXHGHAKVHLVGID, which were homologous with the sequence of eIF-5A from Gln 20 to Pro 36 and from Ser 43 to Asp 61, respectively. Furthermore, the result of sequencing suggested that the protein was an active form of hypusinated eIF-5A, because Lys 46 could be detected but not Lys 49, which is the site for hypusination. These results suggest that loss of the active form of eIF-5A is an important factor in the irreversible process of heat stress-induced death of MIA PaCa-2 cells.

  7. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers, and a description and justification of the selected implementation. In the end are charact...

  8. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e.g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature resistant radiation source (⁸⁵Kr, ³H, or ⁶³Ni)

  9. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of nuclear power plant analyzers in the USA is described. Two different types of analyzers are under development: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermal-hydraulics are described. (author)

  10. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  11. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  12. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.
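    A common class of such damage-indicating parameters measures the impulsiveness of the vibration signal; kurtosis is the simplest example, rising when localized tooth damage injects periodic transients into an otherwise smooth meshing signal. A sketch with a synthetic signal (the actual diagnostic algorithms used on the test stand are not specified in this record):

```python
import math
import random

def kurtosis(signal):
    """Normalized fourth moment: ~1.5 for a pure sinusoid, ~3 for Gaussian
    noise, and much higher when impulsive damage transients are present."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    m4 = sum((x - mean) ** 4 for x in signal) / n
    return m4 / var ** 2

random.seed(1)
# Healthy gear mesh: a tone plus mild broadband noise (synthetic).
healthy = [math.sin(2 * math.pi * 50 * t / 1000) + random.gauss(0, 0.1)
           for t in range(1000)]
# Inject periodic impulses as a crude model of a damaged tooth.
damaged = [x + (5.0 if i % 100 == 0 else 0.0) for i, x in enumerate(healthy)]
print(round(kurtosis(healthy), 2), round(kurtosis(damaged), 2))
```

    In practice such indicators are computed on the residual signal after removing the regular meshing components, but the thresholding idea is the same.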

  13. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but we must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  14. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    Automation is discussed of extraction spectrophotometric determination of uranium in a solution. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design is described of an analyzer consisting of an analytical unit and a control unit. The analyzer performance promises increased productivity of labour, improved operating and hygiene conditions, and mainly more accurate results of analyses. (J.C.)

  15. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze American options in a discrete-time context with a finite outcome space, starting from the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  16. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  17. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifugal fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  18. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
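    The gap between the hard- and soft-decision views can be shown in a few lines: for BPSK over an additive-noise channel, the hard decision keeps only the sign of each received sample, while the soft values retain the decision margin that an analyzer like the SDA can mine. An illustrative Python sketch (the SNR and modulation here are assumptions, not taken from the SDA itself):

```python
import random
import statistics

random.seed(0)
EBN0_LINEAR = 2.0  # illustrative SNR (~3 dB), not from the article

# BPSK over AWGN: transmit +/-1, receive noisy soft values.
bits = [random.randint(0, 1) for _ in range(20000)]
sigma = (1 / (2 * EBN0_LINEAR)) ** 0.5
soft = [(1.0 if b else -1.0) + random.gauss(0, sigma) for b in bits]

# Hard decision collapses each sample to one bit, yielding only a BER...
hard_errors = sum(1 for b, s in zip(bits, soft) if (s > 0) != (b == 1))
ber = hard_errors / len(bits)

# ...while soft statistics expose the decision margin directly.
margin = statistics.mean(abs(s) for s in soft)
print(f"BER={ber:.4f}  mean |soft decision|={margin:.2f}")
```

    Distributions of the soft values (not just their signs) reveal degradations such as synchronization jitter or gain imbalance long before the BER alone makes them visible.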

  19. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available with comparable low costs right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, no control room equipment; and (5) coupling of computers by telecommunication via telephone

  20. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images
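    The separation into apparent absorption and apparent refraction images is conventionally done from two images taken on opposite half-slopes of the analyzer rocking curve, linearizing the curve at each working point. A schematic per-pixel Python sketch (the rocking-curve samples and slopes below are illustrative, not measured values):

```python
def dei_separate(i_low, i_high, r_low, r_high, dr_low, dr_high):
    """Solve the linearized DEI equations for one pixel:
        I_low  = I_R * (r_low  + dr_low  * dtheta)
        I_high = I_R * (r_high + dr_high * dtheta)
    Unknowns: apparent absorption I_R and refraction angle dtheta,
    treated as the 2x2 linear system in (I_R, I_R * dtheta).
    """
    det = r_low * dr_high - r_high * dr_low
    i_r = (i_low * dr_high - i_high * dr_low) / det
    dtheta = (r_low * i_high - r_high * i_low) / det / i_r
    return i_r, dtheta

# Illustrative rocking-curve reflectivities and slopes at the two
# half-slope positions (slopes in 1/microradian).
r_lo, r_hi, dr_lo, dr_hi = 0.5, 0.5, 40.0, -40.0
true_ir, true_dt = 0.8, 0.002  # assumed absorption, refraction (microrad)
i_lo = true_ir * (r_lo + dr_lo * true_dt)
i_hi = true_ir * (r_hi + dr_hi * true_dt)
ir, dt = dei_separate(i_lo, i_hi, r_lo, r_hi, dr_lo, dr_hi)
print(round(ir, 6), round(dt, 6))
```

    The round trip recovers the assumed absorption and refraction values; the 'pure' images discussed in the record additionally account for the scattering that this linearized model ignores.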

  1. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  2. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites... PhosphoSiteAnalyzer provides an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and hereafter applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net ....

  3. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), were examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO₂ laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  4. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world's capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China's Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles, and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  5. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that, the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  6. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)
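    The throughput and waste figures quoted in the record imply that almost all of the waste stream is carrier and reagent rather than sample, which is worth making explicit:

```python
# Figures quoted in the record: 30-40 samples/h, 0.074 mL consumed per
# sample, waste generated at about 1.5 mL/min.
samples_per_hour = 35          # mid-range throughput
waste_ml_per_min = 1.5
sample_ml = 0.074

waste_per_sample = waste_ml_per_min * 60 / samples_per_hour
print(f"~{waste_per_sample:.2f} mL waste per sample, "
      f"of which only {sample_ml} mL is sample")
```

    Roughly 2.6 mL of waste per sample against 0.074 mL of consumed sample shows why waste minimization centers on the carrier and reagent streams.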

  7. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  8. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  9. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability for long-term reliability of large scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities to levels of a few ppM. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppM in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. A LN₂ cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results of this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described

  10. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) and LWCC(TM) are trademarks of World Precision Instruments, Inc.
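    The sensitivity gain from user-selectable path lengths follows directly from the Beer-Lambert law: absorbance scales linearly with path, so a long waveguide cell lowers the minimum detectable concentration proportionally. A sketch (the absorptivity and resolvable-absorbance values are assumptions for illustration):

```python
def absorbance(epsilon, conc, path_cm):
    """Beer-Lambert law: A = epsilon * c * l."""
    return epsilon * conc * path_cm

def min_detectable_conc(epsilon, path_cm, a_min=0.001):
    """Smallest concentration giving a resolvable absorbance a_min."""
    return a_min / (epsilon * path_cm)

# Assumed molar absorptivity for a dissolved absorber, L mol^-1 cm^-1.
eps = 100.0
for path in (1.0, 10.0, 100.0):  # conventional cuvette vs long waveguide
    print(f"{path:6.1f} cm  ->  {min_detectable_conc(eps, path):.1e} mol/L")
```

    A hundredfold longer path buys a hundredfold lower detection limit, which is the core argument for long-path cells in low-absorbance coastal waters.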

  11. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results: We develop the software, PDA, for the analysis of pooled-DNA data. PDA is originally implemented with the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion: PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
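    The p-value-combination idea behind the multipoint test can be sketched with Fisher's method applied over a sliding window of single-point p-values (PDA's actual statistics are more elaborate; this is a minimal illustration with made-up p-values):

```python
import math

def fisher_combine(pvalues):
    """Fisher's method: X = -2 * sum(ln p) ~ chi-square with 2m df.
    The chi-square survival function has a closed form for even df."""
    x = -2.0 * sum(math.log(p) for p in pvalues)
    m = len(pvalues)
    term, sf = 1.0, 0.0
    for i in range(m):          # sf = exp(-x/2) * sum_{i<m} (x/2)^i / i!
        sf += term
        term *= (x / 2) / (i + 1)
    return math.exp(-x / 2) * sf

def sliding_window_scan(pvalues, window):
    """Combined p-value for each window of consecutive markers."""
    return [fisher_combine(pvalues[i:i + window])
            for i in range(len(pvalues) - window + 1)]

# Made-up single-point p-values along a chromosome region.
single_point = [0.8, 0.04, 0.03, 0.05, 0.9, 0.7]
print([round(p, 4) for p in sliding_window_scan(single_point, 3)])
```

    A run of individually borderline markers combines into one strongly significant window, which is the signal a marker-by-marker scan can miss.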

  12. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter, which are controlled by software performing irradiation control, loss-free gamma-spectrometry, spectra evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals, the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials in excellent agreement with consensus values. (author)
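The loss-free counting correction mentioned above amounts, in its simplest form, to weighting recorded counts by the inverse of the live-time fraction. The sketch below illustrates only that arithmetic; the numbers are invented, and the real system applies the correction per event in hardware:

```python
def corrected_counts(recorded, live_fraction):
    """Scale recorded counts by 1/live_fraction.

    At 99% counting losses the live fraction is 0.01, so each stored
    event stands in for roughly 100 true events.
    """
    if not 0.0 < live_fraction <= 1.0:
        raise ValueError("live fraction must be in (0, 1]")
    return recorded / live_fraction

# Illustrative: 512 recorded counts at 1% live time (99% losses).
true_estimate = corrected_counts(recorded=512, live_fraction=0.01)
```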

  13. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a standard error of less than 5 percent. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.
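The abstract describes determining oil, water and gas fractions from NIR spectra. Under a linear (Beer-Lambert-style) mixing assumption, such fractions can be estimated by least squares against pure-component reference spectra; the spectra below are synthetic, and this is a generic sketch rather than the device's actual algorithm:

```python
import numpy as np

# Rows: wavelengths; columns: reference absorbances for oil, water, gas.
# These pure-component spectra are invented for illustration.
pure = np.array([
    [0.9, 0.1, 0.0],
    [0.4, 0.7, 0.1],
    [0.1, 0.9, 0.2],
    [0.2, 0.3, 0.8],
])

# Synthesize a "measured" spectrum from known fractions.
true_fractions = np.array([0.5, 0.4, 0.1])
measured = pure @ true_fractions

# Least-squares unmixing recovers the component fractions.
est, *_ = np.linalg.lstsq(pure, measured, rcond=None)
```

In practice the measured spectrum is noisy and the fit would be constrained (non-negative, summing to one), but the linear-unmixing core is the same.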

  14. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  15. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  16. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1-in-100-year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.
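One common way flood analyzers turn scenarios like the 1-in-100-year event into a single risk number is expected annual damage: the integral of damage over exceedance probability. The sketch below uses trapezoidal integration over invented (return period, damage) pairs and is not taken from the Aqueduct tool itself:

```python
def expected_annual_damage(points):
    """Approximate expected annual damage by trapezoidal integration.

    points: [(return_period_years, damage), ...] sorted by return period.
    A 100-year flood has exceedance probability 1/100 per year.
    """
    pts = [(1.0 / rp, d) for rp, d in points]  # convert to probabilities
    pts.sort(reverse=True)                     # most frequent events first
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(pts, pts[1:]):
        ead += (p1 - p2) * (d1 + d2) / 2.0     # trapezoid between scenarios
    return ead

# Invented damages (US$ billion) for 10-, 100-, and 1000-year floods.
ead = expected_annual_damage([(10, 5.0), (100, 60.0), (1000, 200.0)])
```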

  17. Natural-Language Parser for PBEM

    Science.gov (United States)

    James, Mark

    2010-01-01

    A computer program called "Hunter" accepts, as input, a colloquial-English description of a set of policy-based-management rules, and parses that description into a form usable by policy-based enterprise management (PBEM) software. PBEM is a rules-based approach suitable for automating some management tasks. PBEM simplifies the management of a given enterprise through establishment of policies addressing situations that are likely to occur. Hunter was developed to have a unique capability to extract the intended meaning instead of focusing on parsing the exact ways in which individual words are used.

  18. PATMA: parser of archival tissue microarray

    Directory of Open Access Journals (Sweden)

    Lukasz Roszkowiak

    2016-12-01

    Full Text Available Tissue microarrays are commonly used in modern pathology for cancer tissue evaluation, as it is a very potent technique. Tissue microarray slides are often scanned to perform computer-aided histopathological analysis of the tissue cores. For processing the image, splitting the whole virtual slide into images of individual cores is required. The only way to distinguish cores corresponding to specimens in the tissue microarray is through their arrangement. Unfortunately, distinguishing the correct order of cores is not a trivial task as they are not labelled directly on the slide. The main aim of this study was to create a procedure capable of automatically finding and extracting cores from archival images of the tissue microarrays. This software supports the work of scientists who want to perform further image processing on single cores. The proposed method is an efficient and fast procedure, working in fully automatic or semi-automatic mode. A total of 89% of punches were correctly extracted with automatic selection. With an addition of manual correction, it is possible to fully prepare the whole slide image for extraction in 2 min per tissue microarray. The proposed technique requires minimum skill and time to parse a big array of cores from a tissue microarray whole slide image into individual core images.
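The core-ordering problem described above (cores are identified only by their grid arrangement) can be sketched generically: group detected centroids into rows by a y-tolerance, then sort each row by x. The coordinates and tolerance below are synthetic, and this is not PATMA's actual procedure:

```python
def order_cores(centroids, row_tol):
    """Return core centroids in reading order (top-to-bottom, left-to-right).

    centroids: list of (x, y) tuples; row_tol: maximum y-distance between
    consecutive centroids that still belong to the same grid row.
    """
    rows = []
    for c in sorted(centroids, key=lambda p: p[1]):  # scan down the slide
        if rows and abs(rows[-1][-1][1] - c[1]) <= row_tol:
            rows[-1].append(c)       # close enough in y: same row
        else:
            rows.append([c])         # start a new row
    return [c for row in rows for c in sorted(row, key=lambda p: p[0])]

# Six invented centroids forming a 2x3 grid with slight jitter.
cores = [(210, 14), (12, 10), (110, 12), (15, 95), (115, 99), (205, 101)]
ordered = order_cores(cores, row_tol=20)
```

Once the cores are in reading order, each image crop can be matched to its specimen label in the array design file.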

  19. Resolving Lexical Ambiguity in a Deterministic Parser

    OpenAIRE

    Milne, Robert W.

    1983-01-01

    This work is an investigation into part of the human sentence parsing mechanism (HSPM), where parsing implies syntactic and non-syntactic analysis. It is hypothesised that the HSPM consists of at least two processors. We will call the first processor the syntactic processor, and the second will be known as the non-syntactic processor. For normal sentence processing, the two processors are controlled by a 'normal component', whilst when an error occurs, they are controlled by a...

  20. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, some CAMAC modules have been developed. The modules are the following: 64 K analyzer memory, 32 K analyzer memory, 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD-converters

  1. Manufacturing Initiative

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Manufacturing Technologies (AMT) Project supports multiple activities within the Administration's National Manufacturing Initiative. A key component of...

  2. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer the second-order focus is obtained at an arbitrary injection angle. This kind of analyzer with a small injection angle will have an advantage of small operational voltage, compared to the Proca and Green analyzer where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for the precise energy measurement of high energy particles in the MeV range. (author)

  3. Unilateral initiatives

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    This paper reports on arms control which is generally thought of in terms of formal negotiations with an opponent, with the resulting agreements embodied in a treaty. This is not surprising, since arms control discussions between opponents are both important and politically visible. There are, however, strong reasons for countries to consider and frequently take unilateral initiatives. To do so is entirely consistent with the established major precepts of arms control which state that arms control is designed to reduce the risk of war, the costs of preparing for war, and the death and destruction if war should come. Unilateral initiatives on what weapons are purchased, which ones are eliminated and how forces are deployed can all relate to these objectives. There are two main categories of motives for unilateral initiatives in arms control. In one category, internal national objectives are the dominant, often sole, driving force; the initiative is undertaken for our own good

  4. Ports Initiative

    Science.gov (United States)

    EPA's Ports Initiative works in collaboration with the port industry, communities, and government to improve environmental performance and increase economic prosperity. This effort helps people near ports breathe cleaner air and live better lives.

  5. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The performance of the developed system had a severe problem: the resulting spectrum suffered from a lack of smoothness; it was very noisy and full of spikes and surges, and therefore it was impossible to use this spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance

  6. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  7. Nuclear plant analyzer program for Bulgaria

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for use by the Bulgarian technical community in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. The current NPA includes models for a VVER-440 Model 230 and a VVER-1000 Model 320 and is operational on an IBM RISC6000 workstation. The RELAP5/MOD2 computer code has been used for the calculation of the reactor responses to the interactive commands initiated by the NPA operator. The interactive capabilities of the NPA have been developed to provide considerable flexibility in the plant actions that can be initiated by the operator. The current capabilities for both the VVER-440 and VVER-1000 models include: (1) scram initiation; (2) reactor coolant pump trip; (3) high pressure safety injection system initiation; (4) low pressure safety injection system initiation; (5) pressurizer safety valve opening; (6) steam generator relief/safety valve opening; (7) feedwater system initiation and trip; (8) turbine trip; and (9) emergency feedwater initiation. The NPA has the capability to display the results of the simulations in various forms that are determined by the model developer. Results displayed on the reactor mask are shown through the user-defined digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperature of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask. This paper provides a description of the structure of the NPA, a discussion of the simulation models used for the VVER-440 and the VVER-1000, and an overview of the NPA capabilities. Typical results obtained using both simulation models will be discussed

  8. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  9. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, analyzing data happens via batch processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: a cloud-based approach offering a productive and interactive environment through the combination of FCC and SWAN software.

  10. 40 CFR 86.1322-84 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3... columns is one form of corrective action which may be taken.) (b) Initial and periodic calibration. Prior... calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with...

  11. Mixed-Initiative Clustering

    Science.gov (United States)

    Huang, Yifen

    2010-01-01

    Mixed-initiative clustering is a task where a user and a machine work collaboratively to analyze a large set of documents. We hypothesize that a user and a machine can both learn better clustering models through enriched communication and interactive learning from each other. The first contribution of this thesis is providing a framework of…

  12. Initial Study

    DEFF Research Database (Denmark)

    Torp, Kristian

    2009-01-01

    increased. In the initial study presented here, the time it takes to pass an intersection is studied in detail. Two major signal-controlled four-way intersections in the center of the city Aalborg are studied in detail to estimate the congestion levels in these intersections, based on the time it takes...

  13. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified by several points of view: the kind of labeled antibodies or enzymes, detection methods, the number of tests per unit time, analytical time and speed per run. In practice, it is important for us to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, the detection methods, the number of tests per unit time and analytical time and speed per test.

  14. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built onto large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of usability and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive licence.

  15. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input

  16. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  17. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  18. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without coming up against the problems of antigen-antibody complex and free antigen separation. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific to the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a capacity between the bottom of the tube and its lower end. The component has a large developed surface and is so shaped that it allows the solution to be analyzed to have access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific to the biological compound [fr]

  19. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has different applications, so it is a very significant and useful tool, which in turn can be dangerous for living beings if they are exposed to uncontrolled doses. However, due to its characteristics, it cannot be perceived by any of the human senses, so radiation detectors and additional devices are required to detect, quantify and classify it. A multichannel analyzer is responsible for separating the different pulse heights that are generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of the work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core that the System on Chip Zynq contains, and the virtual instrument was developed on the LabVIEW graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA, and for the internal control of the multichannel analyzer the application was generated for the ARM processor in C language. For the second phase, the virtual instrument was developed for the management, control and visualization of the results. The data obtained as a result of the development of the system were observed graphically in a histogram showing the spectrum measured. The design of the multichannel analyzer embedded in FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which allowed determining that the spectra obtained are comparable with those from commercial multichannel analyzers. (Author)
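The channel-separation step described above can be sketched in a few lines: each digitized pulse height is shifted down to the histogram resolution and the corresponding channel is incremented. The bit widths and pulse heights below are illustrative, not those of the Zynq implementation:

```python
ADC_BITS = 12    # 4096 possible ADC codes (illustrative)
CHANNELS = 1024  # histogram resolution (illustrative)

# Dropping the low-order bits maps 4096 ADC codes onto 1024 channels.
SHIFT = ADC_BITS - (CHANNELS.bit_length() - 1)

def accumulate(pulse_heights, channels=CHANNELS):
    """Build a pulse-height spectrum: one increment per detected pulse."""
    spectrum = [0] * channels
    for h in pulse_heights:
        spectrum[h >> SHIFT] += 1
    return spectrum

# A few invented digitized pulse heights.
spectrum = accumulate([0, 3, 4, 4095, 2048])
```

An FPGA implementation does the same shift-and-increment in hardware, typically against a dual-port RAM so the host can read the spectrum while acquisition continues.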

  20. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  1. Defense Acquisition Initiatives Review: An Assessment of Extant Initiatives

    National Research Council Canada - National Science Library

    Porter, Gene; Berteau, David; Christle, Gary; Mandelbaum, Jay; Diehl, Richard

    2005-01-01

    ...) to identify and analyze a subset of initiatives that the team finds to have potential for near term management emphasis that could provide visible improvements to the much criticized Defense acquisition system...

  2. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming computer language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., ''sensors'' and ''cameras'') upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a ''search'' mode, wherein all paths are found from any specified physical location to another specified location which satisfy user chosen ''intruder detection'' probability and elapsed time criteria (i.e., the program finds the ''weakest paths'' from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security Analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs
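The ''weakest path'' search described in function (3) can be sketched with a standard graph algorithm: weighting each sensor-covered link by -log(1 - p_detect) turns "maximize the intruder's chance of passing undetected" into a shortest-path problem. The facility graph and probabilities below are invented, and the original tool is written in Prolog, not Python:

```python
import heapq
import math

def weakest_path(graph, start, goal):
    """Find the path with the highest probability of evading detection.

    graph: {node: [(neighbor, p_detect), ...]}.
    Returns (p_undetected, path); additive cost -log(1 - p) makes Dijkstra
    maximize the product of per-link survival probabilities.
    """
    best = {start: 0.0}
    heap = [(0.0, start, [start])]
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == goal:
            return math.exp(-cost), path
        for nxt, p in graph.get(node, []):
            ncost = cost - math.log(1.0 - p)
            if ncost < best.get(nxt, float("inf")):
                best[nxt] = ncost
                heapq.heappush(heap, (ncost, nxt, path + [nxt]))
    return 0.0, []

# Invented facility: each edge carries its detection probability.
facility = {"outside": [("yard", 0.3), ("gate", 0.9)],
            "yard": [("vault", 0.5)],
            "gate": [("vault", 0.1)]}
p, path = weakest_path(facility, "outside", "vault")
```

A real analyzer would also filter candidate paths by the elapsed-time criterion mentioned in the abstract; this sketch covers only the probability side.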

  3. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run time unit and an analysis unit. The model builder is intended to build simulation models which describe on the one hand the geometric structure and initial conditions of a given plant and on the other hand command control logics and reactor protection systems. The run time unit carries out the dialog between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility

  4. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  5. Therapy Talk: Analyzing Therapeutic Discourse

    Science.gov (United States)

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  6. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004. Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  7. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  8. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film

  9. Methods for Analyzing Pipe Networks

    DEFF Research Database (Denmark)

    Nielsen, Hans Bruun

    1989-01-01

    …to formulate the flow equations in terms of pipe discharges than in terms of energy heads. The behavior of some iterative methods is compared in the initial phase with large errors. It is explained why the linear theory method oscillates when the iteration gets close to the solution, and it is further demonstrated that this method offers good starting values for a Newton-Raphson iteration.
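
The paper's own network formulation is not reproduced in this record; as a minimal illustration of Newton-Raphson on a pipe-flow equation, consider two parallel pipes sharing a total discharge Q, each with head loss h = r·q² (resistances and flows below are invented):

```python
def parallel_flows(r1, r2, q_total, tol=1e-10):
    """Newton-Raphson for f(q1) = r1*q1^2 - r2*(Q - q1)^2 = 0,
    i.e. equal head loss across two parallel pipes."""
    q1 = q_total / 2.0  # crude starting value (an even split)
    for _ in range(50):
        f = r1 * q1**2 - r2 * (q_total - q1)**2
        df = 2.0 * r1 * q1 + 2.0 * r2 * (q_total - q1)  # f'(q1)
        step = f / df
        q1 -= step
        if abs(step) < tol:
            break
    return q1, q_total - q1
```

For r1 = 1, r2 = 4, Q = 1 the analytic answer is q1 = Q·√r2/(√r1 + √r2) = 2/3, which the iteration recovers in a handful of steps.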

  10. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine-language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO2, O3, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO2⁻, PO4³⁻, S²⁻, and SO4²⁻. In many cases standard methods could be adapted to the centrifugal analyzer; in others new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than with conventional instrumentation
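
As an illustration of the fixed-time kinetic technique mentioned above (the rate law and numbers are invented, not taken from the record): for a first-order reaction the signal at a fixed time t is proportional to the initial analyte concentration, so an unknown can be read off against a standard measured at the same t:

```python
import math

def signal(c0, k, t):
    """Product signal for first-order consumption of the analyte:
    S(t) = c0 * (1 - exp(-k*t)), proportional to c0 at any fixed t."""
    return c0 * (1.0 - math.exp(-k * t))

def fixed_time_concentration(s_sample, s_standard, c_standard):
    """Fixed-time method: at the same t the signals scale with c0,
    so c_sample = c_standard * s_sample / s_standard."""
    return c_standard * s_sample / s_standard
```

Because both measurements share the same (1 − e^(−kt)) factor, the rate constant cancels and never needs to be known.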

  11. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system that includes hardware and software capabilities, and operates on the four head-position signals acquired from a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. This system includes analog processing of the position signals from the camera, digitization, and subsequent processing of the energy signal in a multichannel analyzer, sending data to a computer via a standard USB port and processing the data on a personal computer to obtain the final histogram. The circuits are composed of an analog processing board and a universal kit with a microcontroller and programmable gate array. (Author)
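
The signal chain described above (combine the four head-position signals into an energy value, then bin it in a multichannel analyzer) can be sketched as follows; the Anger-style energy sum, channel count, and energy range are assumptions for illustration, not the instrument's actual parameters:

```python
def energy(xp, xm, yp, ym):
    """Anger-camera style: total event energy is the sum of the
    four position signals (x+, x-, y+, y-)."""
    return xp + xm + yp + ym

def histogram(energies, n_channels=128, e_max=256.0):
    """A minimal multichannel analyzer: bin event energies into
    equal-width channels spanning [0, e_max)."""
    counts = [0] * n_channels
    for e in energies:
        ch = min(int(e / e_max * n_channels), n_channels - 1)
        counts[ch] += 1
    return counts
```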

  12. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  13. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method verifies a previous vulnerability assessment and introduces a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility

  14. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

    Online Social Networks (OSNs) have in recent years acquired a huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to build a solid substrate of connections and relationships among people using the Web. In this preliminary work, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  15. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabVIEW that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabVIEW vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, described more fully in Section 6.0 below.
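
Since the emitted light is proportional to the SO2 concentration, converting raw fluorescence counts to a mixing ratio reduces to a linear calibration. A minimal least-squares sketch with invented calibration points (this is not the 43i-TLE's actual calibration procedure):

```python
# Hypothetical calibration: fluorescence counts at known SO2 mixing ratios (ppb).
ppb    = [0.0, 10.0, 20.0, 40.0]
counts = [5.0, 105.0, 205.0, 405.0]   # counts ~ slope * [SO2] + dark offset

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope*x + offset."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, offset = fit_line(ppb, counts)

def to_ppb(raw_counts):
    """Invert the calibration for a raw instrument reading."""
    return (raw_counts - offset) / slope
```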

  16. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is offset by the tremendously useful fundamental engineering data gained. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  17. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It can also switch the FIA operation modes, giving the instrument the functions of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in aqueous solution with the addition of cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput of 30-90 h⁻¹ and reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant

  18. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is offset by the tremendously useful fundamental engineering data gained. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  19. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
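
The paper's demonstrations are in R; as a language-agnostic illustration, here is a minimal Python sketch of the split/analyze/meta-analyze idea, where each per-split "analysis" estimates a mean and the estimates are pooled by fixed-effect inverse-variance weighting (the synthetic data and the pooling scheme are assumptions, not the paper's exact procedure):

```python
import random
import statistics

random.seed(0)
data = [random.gauss(50, 10) for _ in range(10_000)]  # stand-in for a big dataset

def split(xs, k):
    """Split the data into k roughly equal, interleaved subsets."""
    return [xs[i::k] for i in range(k)]

def analyze(xs):
    """Per-split estimate (mean) and its squared standard error."""
    m = statistics.mean(xs)
    se2 = statistics.variance(xs) / len(xs)
    return m, se2

def meta_analyze(results):
    """Fixed-effect inverse-variance pooling of the split estimates."""
    weights = [1.0 / se2 for _, se2 in results]
    return (sum(w * m for w, (m, _) in zip(weights, results))
            / sum(weights))

pooled = meta_analyze([analyze(s) for s in split(data, 10)])
```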

  20. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  1. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.; Carlson, R.L.

    1987-01-01

    A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of the physical protection system. It utilizes the artificial intelligence capabilities available in the prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the security analyzer is a tool that aids the protection planner as well as the protection operations staff

  2. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... any flow rate into the reaction chamber. This includes, but is not limited to, sample capillary, ozone... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new...

  3. Openness initiative

    International Nuclear Information System (INIS)

    Duncan, S.S.

    1995-01-01

    Although antinuclear campaigns seem to be effective, public communication and education efforts on low-level radioactive waste have mixed results. Attempts at public information programs on low-level radioactive waste still focus on influencing public opinion. A question then is: "Is it preferable to have a program focus on public education that will empower individuals to make informed decisions rather than trying to influence them in their decisions?" To address this question, a case study with both quantitative and qualitative data will be used. The Ohio Low-Level Radioactive Waste Education Program has a goal to provide people with information they want/need to make their own decisions. The program initiated its efforts by conducting a statewide survey to determine information needed by people and where they turned for that information. This presentation reports data from the survey and then explores the program development process in which programs were designed and presented using the information. Pre and post data from the programs reveal attitude and knowledge shifts

  4. Openness initiative

    Energy Technology Data Exchange (ETDEWEB)

    Duncan, S.S. [Los Alamos National Lab., NM (United States)

    1995-12-31

    Although antinuclear campaigns seem to be effective, public communication and education efforts on low-level radioactive waste have mixed results. Attempts at public information programs on low-level radioactive waste still focus on influencing public opinion. A question then is: "Is it preferable to have a program focus on public education that will empower individuals to make informed decisions rather than trying to influence them in their decisions?" To address this question, a case study with both quantitative and qualitative data will be used. The Ohio Low-Level Radioactive Waste Education Program has a goal to provide people with information they want/need to make their own decisions. The program initiated its efforts by conducting a statewide survey to determine information needed by people and where they turned for that information. This presentation reports data from the survey and then explores the program development process in which programs were designed and presented using the information. Pre and post data from the programs reveal attitude and knowledge shifts.

  5. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The 'COMBUSTIMETRO' technology examines fuel through the performance of an engine: the role of the fuel is to produce energy for the combustion engine, and the energy produced is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel feed and ignition timing. Its operation is monitored by sensors (lambda probe, RPM and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  6. Initiative hard coal; Initiative Steinkohle

    Energy Technology Data Exchange (ETDEWEB)

    Leonhardt, J.

    2007-08-02

    In order to reduce the European Union's dependence on hard coal imports, the author submitted suggestions to the director for conventional energy sources (Directorate-General for Energy and Transport) of the European Community, which met with a positive response. These suggestions are summarized in a paper, 'Initiative Hard Coal'. After clarifying the starting situation and defining the target, the prerequisites for better use of hard coal deposits as a raw material in the European Union are set out. On that basis, concrete measures are suggested. In addition to the condition of the deposits, these concern new mining techniques and mining-economic developments, together with tasks for the mining-machinery industry. (orig.)

  7. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of the chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  8. Charge Analyzer Responsive Local Oscillations

    Science.gov (United States)

    Krause, Linda Habash; Thornton, Gary

    2015-01-01

    The first transatlantic radio transmission, demonstrated by Marconi in December of 1901, revealed the essential role of the ionosphere for radio communications. This ionized layer of the upper atmosphere controls the amount of radio power transmitted through, reflected off of, and absorbed by the atmospheric medium. Low-frequency radio signals can propagate long distances around the globe via repeated reflections off of the ionosphere and the Earth's surface. Higher-frequency radio signals can punch through the ionosphere to be received at orbiting satellites. However, any turbulence in the ionosphere can distort these signals, compromising the performance or even availability of space-based communication and navigation systems. The physics associated with this distortion effect is analogous to the situation when underwater images are distorted by convecting air bubbles. In fact, these ionospheric features are often called 'plasma bubbles' since they exhibit behavior similar to that of underwater air bubbles. These events, instigated by solar and geomagnetic storms, can cause communication and navigation outages that last for hours. To help understand and predict these outages, a world-wide community of space scientists and technologists is devoted to researching this topic. One aspect of this research is to develop instruments capable of measuring ionospheric plasma bubbles. Figure 1 shows a photo of the Charge Analyzer Responsive to Local Oscillations (CARLO), a new instrument under development at NASA Marshall Space Flight Center (MSFC). It is a frequency-domain ion spectrum analyzer designed to measure the distributions of ionospheric turbulence from 1 Hz to 10 kHz (i.e., spatial scales from a few kilometers down to a few centimeters). This frequency range is important since it focuses on turbulence scales that affect VHF/UHF satellite communications, GPS systems, and over-the-horizon radar systems. CARLO is based on the flight-proven Plasma Local

  9. Radiation energy detector and analyzer

    International Nuclear Information System (INIS)

    Roberts, T.G.

    1981-01-01

    A radiation detector array and a method for measuring the spectral content of radiation. The radiation sensor or detector is an array or stack of thin solid-electrolyte batteries. The batteries, arranged in a stack, may be composed of independent battery cells or may be arranged so that adjacent cells share a common terminal surface. This common surface is possible since the polarity of each battery with respect to an adjacent battery is unrestricted, allowing a reduction in the component parts of the assembly and reducing the overall stack length. Additionally, a test jig or chamber allowing rapid measurement of the voltage across each battery is disclosed. A multichannel recorder and display may be used to indicate the voltage gradient change across the cells, or a small computer may be used for rapidly converting these voltage readings to a graph of radiation intensity versus wavelength or energy. The behavior of the batteries when used as a radiation detector and analyzer is such that the voltage measurements can be made at leisure after the detector array has been exposed to the radiation, and it is not necessary to make rapid measurements as is now done

  10. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer-based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics, allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-Windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by each code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  11. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    …as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business

  12. Mass spectrometer calibration of Cosmic Dust Analyzer

    Science.gov (United States)

    Ahrens, Thomas J.; Gupta, Satish C.; Jyoti, G.; Beauchamp, J. L.

    2003-02-01

    The time-of-flight (TOF) mass spectrometer (MS) of the Cosmic Dust Analyzer (CDA) instrument aboard the Cassini spacecraft is expected to be placed in orbit about Saturn to sample submicrometer-diameter ring particles and impact ejecta from Saturn's satellites. The CDA measures a mass spectrum of each particle that impacts the chemical analyzer sector of the instrument. Particles impact a Rh target plate at velocities of 1-100 km/s and produce some 10⁻⁸ to 10⁻⁵ times the particle mass of positive-valence, singly charged ions. These are analyzed via a TOF MS. Initial tests employed a pulsed N2 laser acting on samples of kamacite, pyrrhotite, serpentine, olivine, and Murchison meteorite; the induced bursts of ions were detected with a microchannel plate and a charge-sensitive amplifier (CSA). Pulses from the N2 laser (10¹¹ W/cm²) are assumed to simulate particle impact. Using aluminum alloy as a test sample, each pulse produces a charge of ~4.6 pC (mostly Al⁺), whereas irradiation of a stainless steel target produces a ~2.8 pC (Fe⁺) charge. Thus the present system yields ~10⁻⁵% of the laser energy in resulting ions. A CSA signal indicates that, at the position of the microchannel plate, the ion detector geometry is such that some 5% of the laser-induced ions are collected in the CDA geometry. Employing a multichannel plate detector in this MS yields, for Al-Mg-Cu alloy and kamacite targets, well-defined peaks at 24 (Mg⁺), 27 (Al⁺), and 64 (Cu⁺) and 56 (Fe⁺), 58 (Ni⁺), and 60 (Ni⁺) dalton, respectively.
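
As background to the TOF measurement (this derivation is general physics, not taken from the record): a singly charged ion accelerated through a potential V satisfies qV = ½mv², so its flight time over a drift length L scales as √m, which is how the mass peaks are separated. A sketch with assumed, illustrative instrument parameters:

```python
import math

E_CHARGE = 1.602176634e-19      # elementary charge, C
AMU = 1.66053906660e-27         # atomic mass unit, kg

def flight_time(m_amu, charge=1, drift_len=1.0, volts=1000.0):
    """Time (s) for an ion of mass m_amu to traverse drift_len meters
    after acceleration through `volts`: qV = 1/2 m v^2  =>  t = L*sqrt(m/(2qV))."""
    m = m_amu * AMU
    v = math.sqrt(2.0 * charge * E_CHARGE * volts / m)
    return drift_len / v

# Lighter ions arrive first: Mg+ (24) before Al+ (27) before Fe+ (56).
t_mg, t_al, t_fe = (flight_time(m) for m in (24, 27, 56))
```

Since t ∝ √m, the Fe⁺/Mg⁺ arrival-time ratio is √(56/24) regardless of the assumed voltage and drift length.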

  13. Initiation devices, initiation systems including initiation devices and related methods

    Energy Technology Data Exchange (ETDEWEB)

    Daniels, Michael A.; Condit, Reston A.; Rasmussen, Nikki; Wallace, Ronald S.

    2018-04-10

    Initiation devices may include at least one substrate, an initiation element positioned on a first side of the at least one substrate, and a spark gap electrically coupled to the initiation element and positioned on a second side of the at least one substrate. Initiation devices may include a plurality of substrates where at least one substrate of the plurality of substrates is electrically connected to at least one adjacent substrate of the plurality of substrates with at least one via extending through the at least one substrate. Initiation systems may include such initiation devices. Methods of igniting energetic materials include passing a current through a spark gap formed on at least one substrate of the initiation device, passing the current through at least one via formed through the at least one substrate, and passing the current through an explosive bridge wire of the initiation device.

  14. Análisis, optimización, mejora y aplicación del análisis de dependencias. Analyzing, enhancing, optimizing and applying dependency analysis

    OpenAIRE

    Ballesteros Martínez, Miguel

    2012-01-01

    Statistical dependency parsers have improved substantially in recent years, thanks to machine-learning-based systems that achieve high accuracy. These systems make it possible to generate parsers for any language for which a suitable corpus is available, without requiring great effort from the end user. MaltParser is one of these systems. In this thesis we have used state-of-the-art systems to show a ...

  15. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed; the trade-off between gaining significantly greater computational speed and central memory and losing performance to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that provides advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times; benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility of implementing the super-minicomputer version of the NPA, with the RELAP5 simulation code, on the Black Fox full-scope nuclear power plant simulator is addressed.

  16. Combat Wound Initiative program.

    Science.gov (United States)

    Stojadinovic, Alexander; Elster, Eric; Potter, Benjamin K; Davis, Thomas A; Tadaki, Doug K; Brown, Trevor S; Ahlers, Stephen; Attinger, Christopher E; Andersen, Romney C; Burris, David; Centeno, Jose; Champion, Hunter; Crumbley, David R; Denobile, John; Duga, Michael; Dunne, James R; Eberhardt, John; Ennis, William J; Forsberg, Jonathan A; Hawksworth, Jason; Helling, Thomas S; Lazarus, Gerald S; Milner, Stephen M; Mullick, Florabel G; Owner, Christopher R; Pasquina, Paul F; Patel, Chirag R; Peoples, George E; Nissan, Aviram; Ring, Michael; Sandberg, Glenn D; Schaden, Wolfgang; Schultz, Gregory S; Scofield, Tom; Shawen, Scott B; Sheppard, Forest R; Stannard, James P; Weina, Peter J; Zenilman, Jonathan M

    2010-07-01

    The Combat Wound Initiative (CWI) program is a collaborative, multidisciplinary, and interservice public-private partnership that provides personalized, state-of-the-art, and complex wound care via targeted clinical and translational research. The CWI uses a bench-to-bedside approach to translational research, including the rapid development of a human extracorporeal shock wave therapy (ESWT) study in complex wounds after establishing the potential efficacy, biologic mechanisms, and safety of this treatment modality in a murine model. Additional clinical trials include the prospective use of clinical data, serum and wound biomarkers, and wound gene expression profiles to predict wound healing/failure and additional clinical patient outcomes following combat-related trauma. These clinical research data are analyzed using machine-based learning algorithms to develop predictive treatment models to guide clinical decision-making. Future CWI directions include additional clinical trials and study centers and the refinement and deployment of our genetically driven, personalized medicine initiative to provide patient-specific care across multiple medical disciplines, with an emphasis on combat casualty care.

  17. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed; the trade-off between gaining significantly greater computational speed and central memory and losing performance to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that provides advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times; benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility of implementing the super-minicomputer version of the NPA, with the RELAP5 simulation code, on the Black Fox full-scope nuclear power plant simulator is addressed.

  18. Food-initiated outbreak of methicillin-resistant Staphylococcus aureus analyzed by pheno- and genotyping

    NARCIS (Netherlands)

    J.A.J.W. Kluytmans (Jan); R. Hollis; S. Messer; L. Herwaldt; J. Bruining (Hans); M. Heck; J. Rost; N. van Leeuwen; W.H.F. Goessens (Wil); W.B. van Leeuwen (Willem)

    1995-01-01

    textabstractAn outbreak of methicillin-resistant Staphylococcus aureus (MRSA) involving 27 patients and 14 health-care workers (HCW) was studied. The outbreak started in the hematology unit of the University Hospital Rotterdam, Dijkzigt, The Netherlands, and spread to

  19. Time asymmetry: Polarization and analyzing power in the nuclear reactions

    International Nuclear Information System (INIS)

    Rioux, C.; Roy, R.; Slobodrian, R.J.; Conzett, H.E.

    1983-01-01

    Measurements of the proton polarization in the reactions ⁷Li(³He, p vector)⁹Be and ⁹Be(³He, p vector)¹¹B and of the analyzing powers of the inverse reactions, initiated by polarized protons at the same c.m. energies, show significant differences which imply the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction ²H(³He, p vector)⁴He and its inverse have also been investigated and show some smaller differences. A discussion of the instrumental asymmetries is presented. (orig.)

  20. Initialized Fractional Calculus

    Science.gov (United States)

    Lorenzo, Carl F.; Hartley, Tom T.

    2000-01-01

    This paper demonstrates the need for a nonconstant initialization for the fractional calculus and establishes a basic definition set for the initialized fractional differintegral. This definition set allows the formalization of an initialized fractional calculus. Two basis calculi are considered: the Riemann-Liouville and the Grunwald fractional calculi. Two forms of initialization, terminal and side, are developed.
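
    For context, the conventional (uninitialized) Riemann-Liouville fractional integral that this work generalizes, and the initialized form with an initialization function ψ, can be sketched as follows; this is the standard Lorenzo-Hartley formulation reproduced from memory, not quoted from the paper:

    ```latex
    % Uninitialized Riemann-Liouville fractional integral of order q > 0,
    % starting at t = c:
    {}_{c}d_{t}^{-q} f(t)
      = \frac{1}{\Gamma(q)} \int_{c}^{t} (t-\tau)^{q-1} f(\tau)\, d\tau

    % Initialized differintegral: add an initialization function \psi that
    % carries the history of f on the interval [a, c]:
    {}_{c}D_{t}^{-q} f(t) = {}_{c}d_{t}^{-q} f(t) + \psi(f, -q, a, c, t)
    ```

    The paper's central observation is that ψ is in general a function of t rather than a constant, which is why a constant-initialization convention is insufficient.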

  1. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    While Modern Standard Arabic (MSA) has many resources, Arabic dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology). ADAM is a poor man's solution for quickly developing morphological analyzers for dialectal Arabic. It has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  2. A Morphological Parser For Afrikaans | de Stadler | Stellenbosch ...

    African Journals Online (AJOL)

    Stellenbosch Papers in Linguistics Plus, Vol 22 (1992). Full text available as a PDF download.

  3. Speed up of XML parsers with PHP language implementation

    Science.gov (United States)

    Georgiev, Bozhidar; Georgieva, Adriana

    2012-11-01

    In this paper, the authors introduce PHP5's XML implementation and show how to read, parse, and write a short and uncomplicated XML file using SimpleXML in a PHP environment. The possibilities for joint use of the PHP5 language and the XML standard are described, and the details of the parsing process with SimpleXML are explained. A practical PHP-XML-MySQL project presents the advantages of XML implementation in PHP modules. This approach allows comparatively simple searching of hierarchical XML data by means of PHP software tools. The proposed project includes a database, which can be extended with new data and new XML parsing functions.
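
    The read-parse-modify-write cycle the paper walks through can be sketched analogously in Python's standard ElementTree (rather than PHP's SimpleXML), with a made-up document:

    ```python
    import xml.etree.ElementTree as ET

    # A short, uncomplicated XML document of the kind discussed (invented content).
    doc = """<library>
      <book id="1"><title>XML Basics</title></book>
      <book id="2"><title>PHP and XML</title></book>
    </library>"""

    root = ET.fromstring(doc)                            # read + parse in one call
    titles = [b.findtext("title") for b in root.findall("book")]

    root[0].find("title").text = "XML Fundamentals"      # modify the tree in place
    serialized = ET.tostring(root, encoding="unicode")   # write it back out
    ```

    PHP's SimpleXML offers the same one-liner feel (`simplexml_load_string` plus property access), which is the convenience the paper is highlighting.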

  4. Initialization Errors in Quantum Data Base Recall

    OpenAIRE

    Natu, Kalyani

    2016-01-01

    This paper analyzes the relationship between initialization error and recall of a specific memory in the Grover algorithm for quantum database search. It is shown that the correct memory is obtained with high probability even when the initial state is far removed from the correct one. The analysis relates the variance of the error in the initial state to the recovery of the correct memory, and the surprising result is obtained that the relationship between the two is essentially linear.
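
    The robustness claim is easy to probe numerically. A small state-vector simulation (assumptions: a 64-item database, one marked item, Gaussian amplitude noise on the initial state; an illustration, not the paper's actual analysis):

    ```python
    import numpy as np

    def grover_success_prob(n_items, marked, init_state, n_iter):
        """Run Grover iterations from an arbitrary initial state and return
        the probability of measuring the marked item afterwards."""
        psi = np.asarray(init_state, dtype=float)
        psi = psi / np.linalg.norm(psi)               # normalize the start state
        s = np.full(n_items, 1.0 / np.sqrt(n_items))  # uniform state for diffusion
        for _ in range(n_iter):
            psi[marked] *= -1.0                       # oracle: flip marked amplitude
            psi = 2.0 * s * np.dot(s, psi) - psi      # diffusion: reflect about s
        return float(psi[marked] ** 2)

    N = 64
    k = int(round(np.pi / 4 * np.sqrt(N)))            # near-optimal iteration count

    rng = np.random.default_rng(0)
    p_perfect = grover_success_prob(N, 3, np.ones(N), k)
    p_noisy = grover_success_prob(N, 3, 1.0 + 0.2 * rng.standard_normal(N), k)
    # Even with 20% amplitude noise on the initial state, recall stays high.
    ```

    The component of the noisy state orthogonal to the marked item contributes nothing to the final measurement probability, which is the geometric reason recall degrades gracefully.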

  5. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal within nearly 22 ns time intervals, records it in a memory unit, and then slowly reads the information out to a computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs

  6. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  7. Dioxin Exposure Initiative

    Science.gov (United States)

    The Dioxin Exposure Initiative (DEI) is no longer active. This page contains a summary of the Dioxin Exposure Initiative with illustrations, contact and background information. Originally supported by scientist Matthew Lorber, who retired in March 2017.

  8. On-Demand Urine Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  9. Low Gravity Drug Stability Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  10. New high voltage parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Kawasumi, Y.; Masai, K.; Iguchi, H.; Fujisawa, A.; Abe, Y.

    1992-01-01

    A new modification of the parallel plate analyzer for 500 keV heavy ions, designed to eliminate the effect of the intense UV and visible radiation, has been successfully implemented. Its principle and results are discussed. (author)

  11. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches and analytical software tools for analyzing economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  12. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.; )

    2006-01-01

    The spectrum analyzer is intended for the dynamic spectral analysis of signals from physical installations and for noise filtering. A recurrence Fourier transformation algorithm is used in the digital dynamic analyzer. It is realized on a fast FPGA logic matrix and a specialized ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz, the number of calculated spectral coefficients is not less than 512, and the functional speed is 20 ns.
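
    A recurrence (sliding) Fourier transform of the kind named here updates the window spectrum sample-by-sample instead of recomputing it from scratch; a compact sketch of the generic algorithm (not the FPGA implementation described):

    ```python
    import cmath

    def sliding_dft(x, N):
        """Recurrence (sliding) DFT: after a direct DFT of the first N samples,
        each one-sample slide of the window costs only O(N) updates."""
        X = [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
             for k in range(N)]                      # spectrum of window x[0:N]
        yield list(X)
        twiddle = [cmath.exp(2j * cmath.pi * k / N) for k in range(N)]
        for m in range(len(x) - N):
            for k in range(N):
                # Drop x[m], add x[m+N], then rotate by one bin phase step.
                X[k] = (X[k] - x[m] + x[m + N]) * twiddle[k]
            yield list(X)

    x = [0.3, -1.2, 0.5, 2.0, -0.7, 1.1, 0.0, 0.9]
    spectra = list(sliding_dft(x, 4))                # 5 spectra: x[0:4] ... x[4:8]
    ```

    The per-sample update is one subtraction, one addition, and one complex rotation per bin, which is why such recurrences map well onto fixed-latency FPGA pipelines.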

  13. FST Based Morphological Analyzer for Hindi Language

    OpenAIRE

    Deepak Kumar; Manjeet Singh; Seema Shukla

    2012-01-01

    Hindi being a highly inflectional language, an FST (Finite State Transducer) based approach is the most efficient way to develop a morphological analyzer for it. The work presented in this paper uses the SFST (Stuttgart Finite State Transducer) tool for generating the FST. A lexicon of root words is created, and rules are then added for generating inflectional and derivational words from these root words. The morphological analyzer developed was used in a Part Of Speech (POS) tagger based on Stanford...
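
    The lexicon-plus-rules design can be illustrated with a toy analyzer in the FST spirit; the transliterated Hindi forms and feature labels below are illustrative assumptions, not output of the SFST grammar the paper describes:

    ```python
    # Toy morphological analysis: a root lexicon plus ordered suffix rules.
    LEXICON = {"lark": "boy", "kitab": "book"}      # root -> gloss (assumed)
    SUFFIX_RULES = [                                # surface suffix -> features
        ("on", "Number=Plural|Case=Oblique"),
        ("e", "Number=Plural|Case=Direct"),
        ("a", "Number=Singular|Case=Direct"),
        ("", "Number=Singular|Case=Direct"),
    ]

    def analyze(word):
        """Return every (root, features) pair whose root + suffix spells `word`."""
        analyses = []
        for suffix, feats in SUFFIX_RULES:
            stem = word[: len(word) - len(suffix)] if suffix else word
            if word.endswith(suffix) and stem in LEXICON:
                analyses.append((stem, feats))
        return analyses
    ```

    A compiled FST performs the same stem/suffix decomposition in a single pass over the input, with the rules and lexicon composed into one transducer.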

  14. Tall Buildings Initiative

    Science.gov (United States)

    PEER's Tall Buildings Initiative develops design criteria that will ensure safe and usable tall buildings following future earthquakes. The primary product of this initiative, the Guidelines for Performance-Based Seismic Design (TBI Guidelines Version 2.03), is now available for download.

  15. Energy efficiency initiatives: Indian experience

    Energy Technology Data Exchange (ETDEWEB)

    Dey, Dipankar [ICFAI Business School, Kolkata, (IBS-K) (India)

    2007-07-01

    India, with a population of over 1.10 billion, is one of the fastest growing economies of the world. As domestic sources of different conventional commercial energy are drying up, dependence on foreign energy sources is increasing. There exists a huge potential for saving energy in India. After the first 'oil shock' (1973), the government of India realized the need for conservation of energy, and a 'Petroleum Conservation Action Group' was formed in 1976. Since then many initiatives aimed at energy conservation and improved energy efficiency have been undertaken (the establishment of the Petroleum Conservation Research Association in 1978; the notification of the Eco labelling scheme in 1991; the formation of the Bureau of Energy Efficiency in 2002), but no such initiative was successful. In this paper an attempt has been made to analyze the changing importance of the energy conservation/efficiency measures initiated in India between 1970 and 2005. The present study tries to analyze the limitations of those initiatives and the reasons for their failure. The probable reasons are: the fuel pricing mechanism (including subsidies), political factors, corruption and unethical practices, the influence of oil and related industry lobbies - both internal and external, the economic situation, and the prolonged protection of domestic industries. Further, as India is opening its economy, the study explores the opportunities that a globally competitive market would offer to improve the overall energy efficiency of the economy. The study suggests that the Bureau of Energy Efficiency (BEE) - the newly formed nodal agency for improving the energy efficiency of the economy - be made an autonomous institution with minimal political intervention. For proper implementation of the different initiatives to improve energy efficiency, BEE should involve civil society organizations (NGOs) more, from the inception to the implementation stage of the programs. The paper also

  16. A new automatic analyzer for uranium determination

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyan; Zhang Lan

    1992-08-01

    An intelligent automatic analyzer for uranium, based on the principle of flow injection analysis (FIA), has been developed. It can directly determine uranium in solution in the range of 0.02 to 500 mg/L without any pre-processing. A chromatographic column loaded with an extractant, in which the trace uranium is concentrated and separated, gives the instrument a special ability to enrich uranium and is connected to the manifold of the analyzer, making it suited for trace uranium determination in various samples. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) is used as the color reagent, and uranium is determined in aqueous solution after adding a cationic surfactant, cetylpyridinium bromide (CPB). The rate of analysis is 30 to 90 samples per hour, and the relative standard deviation of determination is 1% ∼ 2%. The analyzer has been used in factories and laboratories, and the results are satisfactory. The determination range can easily be changed with a multi-function auto-injection valve that changes the injection volume of the sample and the channels, so the analyzer can adopt various FIA operation modes to meet the needs of FIA determination of other substances.

  17. Strategies for Analyzing Data from Intact Groups.

    Science.gov (United States)

    Cross, Lawrence H.; Lane, Carolyn E.

    Action research often necessitates the use of intact groups for the comparison of educational treatments or programs. This paper considers several analytical methods that might be used for such situations when pretest scores indicate that these intact groups differ significantly initially. The methods considered include gain score analysis of…

  18. Analyzing Aptitudes for Learning: Inductive Reasoning.

    Science.gov (United States)

    Pellegrino, James W.; Glaser, Robert

    A major focus of the psychology of instruction is understanding and facilitating the changes in cognition and performance that occur as an individual moves from low to higher competence in a domain of knowledge and skill. A new program of research which examines the initial state of the learner as a component of this transition in competence is…

  19. New project? Don't analyze--act.

    Science.gov (United States)

    Schlesinger, Leonard A; Kiefer, Charles F; Brown, Paul B

    2012-03-01

    In a predictable world, getting a new initiative off the ground typically involves analyzing the market, creating a forecast, and writing a business plan. But what about in an unpredictable environment? The authors recommend looking to those who are experts in navigating extreme uncertainty while minimizing risk: serial entrepreneurs. These business leaders act, learn, and build their way into the future. Managers in traditional organizations can do the same, starting with smart, low-risk steps that follow simple rules: Use the means at hand; stay within an acceptable loss; secure only the commitment needed for the next step; bring along only volunteers; link the initiative to a business imperative; produce early results; and manage expectations. Momentum is gained by continuing to act based on what is learned at each step. The launch of Clorox's Green Works product line is discussed as an example.

  20. Interactive nuclear plant analyzer for the VVER-440 reactor

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for a VVER-440 model 213 reactor for use in the training of plant personnel, the development and verification of plant operating procedures, and the analysis of various anticipated operational occurrences and accident scenarios. This NPA is operational on an IBM RISC-6000 workstation and utilizes the RELAP5/MOD2 computer code for the calculation of the VVER-440 reactor response to the interactive commands initiated by the NPA operator. Results of the interactive calculation can be displayed through the user-defined digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperatures of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask.

  1. Scintiscans data analyzer model AS-10

    International Nuclear Information System (INIS)

    Malesa, J.; Wierzbicki, W.

    1975-01-01

    The operating principle and construction of a device for analyzing scintiscan data by ''square root scaling'' are presented. The device is equipped with a cassette tape recorder (type MK-125, made in Poland) serving as a scintiscan data bank, and with three scintiscan data analysis programs. Cassettes of two types, C-60 and C-90, are used, with recording times of 2 x 30 min and 2 x 45 min, respectively. Results of the scintiscan data analysis are printed by an electric typewriter as a digital scintigram. (author)

  2. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    To facilitate occupational safety and health there is a need to develop instruments to monitor and analyze nanoparticles in industry, research and urban environments. The aim of this Ph.D. project was to develop new sensors that can analyze engineered nanoparticles. Two sensors were studied: (i) a miniaturized toxicity sensor based on electrochemistry and (ii) a photothermal spectrometer based on tensile-stressed mechanical resonators (string resonators). Miniaturization of a toxicity sensor targeting engineered nanoparticles was explored. This concept was based on the results of the biodurability test...

  3. Analyzing Web Behavior in Indoor Retail Spaces

    OpenAIRE

    Ren, Yongli; Tomko, Martin; Salim, Flora; Ong, Kevin; Sanderson, Mark

    2015-01-01

    We analyze 18 million rows of Wi-Fi access logs collected over a one year period from over 120,000 anonymized users at an inner-city shopping mall. The anonymized dataset gathered from an opt-in system provides users' approximate physical location, as well as Web browsing and some search history. Such data provides a unique opportunity to analyze the interaction between people's behavior in physical retail spaces and their Web behavior, serving as a proxy to their information needs. We find: ...

  4. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an 'analyzing instrument', based on an open source software package called the Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
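
    A first step of the kind the article describes is turning raw log lines into feature rows that a data-mining tool such as Weka could consume; a minimal sketch (the Windows-style log format below is an assumption, not the article's actual data):

    ```python
    import re
    from collections import Counter

    LOG_LINES = [
        "2008-01-12 09:15:02 INFO  Service started",
        "2008-01-12 09:15:07 WARN  Low disk space",
        "2008-01-12 09:16:41 ERROR Service crashed",
        "2008-01-12 09:17:00 INFO  Service restarted",
    ]
    PATTERN = re.compile(r"^(\S+) (\S+) (\w+)\s+(.*)$")  # date time level message

    rows = []
    for line in LOG_LINES:
        m = PATTERN.match(line)
        if m:
            date, time, level, message = m.groups()
            rows.append({"time": time, "level": level, "message": message})

    level_counts = Counter(r["level"] for r in rows)     # simple aggregate feature
    ```

    From rows like these, attributes (level, hour, message keywords) can be exported, e.g. as an ARFF file, for clustering or classification in Weka.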

  5. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    Cheng, H.S.; Wulff, W.; Mallen, A.N.; Lekach, S.V.; Stritar, A.; Cerbone, R.J.

    1985-01-01

    Advanced technology for high-speed interactive nuclear power plant simulations is of great value for timely resolution of safety issues, for plant monitoring, and for computer-aided emergency responses to an accident. Presented is the methodology employed at BNL to develop a BWR plant analyzer capable of simulating severe plant transients at much faster than real-time process speeds. Five modeling principles are established and a criterion is given for selecting numerical procedures and efficient computers to achieve the very high simulation speeds. Typical results are shown to demonstrate the modeling fidelity of the BWR plant analyzer

  6. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  7. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    Marcus, E.; Elhanani, I.; Nir, J.; Ellenbogen, M.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of the pulse, to reject pile-up, and to eliminate processing dead time. The estimated peaks of valid pulses are transferred to a microcontroller system that creates the histogram and controls the Human Machine Interface (HMI)
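
    The processing chain described (peak estimation, pile-up rejection, histogramming) can be sketched in software; the threshold, pulse shapes, and channel count below are illustrative assumptions, not the FPGA algorithm itself:

    ```python
    import numpy as np

    def mca_histogram(pulses, n_channels=512, vmax=1.0, threshold=0.1):
        """Toy digital MCA: estimate each sampled pulse's peak height, reject
        pile-up (more than one above-threshold region), and histogram peaks."""
        hist = np.zeros(n_channels, dtype=int)
        for samples in pulses:
            s = np.asarray(samples, dtype=float)
            above = (s > threshold * vmax).astype(int)
            rising_edges = np.count_nonzero(np.diff(above) == 1)
            if rising_edges != 1:
                continue                     # pile-up or empty record: reject
            ch = min(int(s.max() / vmax * n_channels), n_channels - 1)
            hist[ch] += 1
        return hist

    t = np.arange(50)
    clean = 0.5 * np.exp(-(t - 25) ** 2 / 32.0)            # single shaped pulse
    pileup = 0.4 * np.exp(-(t - 15) ** 2 / 32.0) \
           + 0.4 * np.exp(-(t - 35) ** 2 / 32.0)           # two overlapping pulses
    hist = mca_histogram([clean, pileup])                  # pile-up gets rejected
    ```

    A hardware MCA performs the same peak estimate and veto inside the FPGA between ADC samples, which is how processing dead time is eliminated.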

  8. Interactive nuclear plant analyzer for VVER-440 reactor

    International Nuclear Information System (INIS)

    Shier, W.; Horak, W.; Kennett, R.

    1992-05-01

    This document discusses an interactive nuclear plant analyzer (NPA) which has been developed for a VVER-440, Model 213 reactor for use in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. This NPA is operational on an IBM RISC-6000 workstation and utilizes the RELAP5/MOD2 computer code for the calculation of the VVER-440 reactor response to the interactive commands initiated by the NPA operator

  9. Time asymmetry: Polarization and analyzing power in the nuclear reactions

    Energy Technology Data Exchange (ETDEWEB)

    Rioux, C.; Roy, R.; Slobodrian, R.J. (Laval Univ., Quebec City (Canada). Lab. de Physique Nucleaire); Conzett, H.E. (California Univ., Berkeley (USA). Lawrence Berkeley Lab.)

    1983-02-28

    Measurements of the proton polarization in the reactions ⁷Li(³He, p vector)⁹Be and ⁹Be(³He, p vector)¹¹B and of the analyzing powers of the inverse reactions, initiated by polarized protons at the same c.m. energies, show significant differences which imply the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction ²H(³He, p vector)⁴He and its inverse have also been investigated and show some smaller differences. A discussion of the instrumental asymmetries is presented.

  10. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilize time-varying pressure drop measurements. A fast-response pressure transducer measures the pressure drop across the overall bed, or across some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.
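
    One simple way to turn a fluctuating pressure-drop signal into a turbulence indicator is to normalize its RMS fluctuation by its mean; the signals and scaling below are illustrative assumptions, not the patented signal processing:

    ```python
    import numpy as np

    def fluidization_index(dp_signal):
        """Illustrative turbulence indicator: RMS of the fluctuating (AC) part
        of the bed pressure-drop signal, normalized by the mean pressure drop."""
        dp = np.asarray(dp_signal, dtype=float)
        return float(np.std(dp) / np.mean(dp))

    rng = np.random.default_rng(1)
    t = np.arange(0.0, 10.0, 0.01)
    quiescent = 5.0 + 0.05 * rng.standard_normal(t.size)      # smooth fluidization
    bubbling = (5.0 + 1.0 * np.sin(2 * np.pi * 3.0 * t)
                + 0.5 * rng.standard_normal(t.size))          # turbulent bed
    # The index rises sharply as pressure-drop fluctuations grow.
    ```

    A control loop can then compare such an index against a setpoint and adjust, for example, the fluidizing gas flow.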

  11. SINDA, Systems Improved Numerical Differencing Analyzer

    Science.gov (United States)

    Fink, L. C.; Pan, H. M. Y.; Ishimoto, T.

    1972-01-01

    A computer program has been written to analyze a group of 100-node areas and then provide for the summation of any number of 100-node areas to obtain a temperature profile. SINDA program options offer the user a variety of methods for the solution of thermal analog models presented in network format.

  12. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us with enormous amounts of static sequence information, understanding sophisticated biology continues to require integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments, all together, to analyze biological architecture on various levels - which is precisely the origin of the systems biology field. This review discusses the subject, its characteristics, and research directions in systems biology,...

  13. Analyzing the Acoustic Beat with Mobile Devices

    Science.gov (United States)

    Kuhn, Jochen; Vogt, Patrik; Hirth, Michael

    2014-01-01

    In this column, we have previously presented various examples of how physical relationships can be examined by analyzing acoustic signals using smartphones or tablet PCs. In this example, we explore the acoustic phenomenon of beats, which is produced by the overlapping of two tones with a small frequency difference Δf. The…

  14. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides in the atmosphere: 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h). These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating the radioxenon from a nuclear detonation from that from nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers

  15. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every company or institution wants to utilize its resources in the most efficient way, and in order to do so it has to have a good structure. A new way to analyze company structure by utilizing the natural social network existing within the company, together with an example of its application to Enron, is presented in this paper.

  16. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  17. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  18. Emerging Pathogens Initiative (EPI)

    Data.gov (United States)

    Department of Veterans Affairs — The Emerging Pathogens Initiative (EPI) database contains emerging pathogens information from the local Veterans Affairs Medical Centers (VAMCs). The EPI software...

  19. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating the risk at nuclear facilities was developed. • Five measures reflecting all factors relevant to assessing risk were developed. • Attributes covering NMAC and nuclear security culture are included among the analysis attributes. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risks. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure, and specific values are assigned in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether or not a state has properly established a legal and regulatory framework (based on international standards). These questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had previously been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities. This measure comprises radiological, economic, and social damage. Social and
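    The effectiveness relation described above can be sketched directly: system effectiveness P(E) is conventionally taken as the product of the probability of interruption P(I) and the probability of neutralization P(N). In the paper these probabilities come from KINAC's TESS code; the adversary paths and numbers below are invented for illustration.

```python
# Sketch of physical-protection system effectiveness: P(E) = P(I) * P(N).
# Path names and probabilities are hypothetical; the paper computes P(I)
# and P(N) with the TESS code.
def system_effectiveness(p_interruption, p_neutralization):
    for p in (p_interruption, p_neutralization):
        if not 0.0 <= p <= 1.0:
            raise ValueError("probabilities must lie in [0, 1]")
    return p_interruption * p_neutralization

def worst_path(paths):
    """Return the adversary path with the lowest P(E): the weakest link."""
    return min(paths, key=lambda p: system_effectiveness(p["PI"], p["PN"]))

paths = [
    {"name": "fence-west", "PI": 0.95, "PN": 0.90},
    {"name": "loading-dock", "PI": 0.80, "PN": 0.70},
]
```

Ranking paths by P(E) is what lets an assessment flag where upgrades matter most.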

  20. Development of pulse neutron coal analyzer

    International Nuclear Information System (INIS)

    Jing Shiwie; Gu Deshan; Qiao Shuang; Liu Yuren; Liu Linmao; Jing Shiwei

    2005-01-01

    This article introduces the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector, and a 4096-channel multichannel analyzer were applied in this system. The multiple linear regression method employed to process the data solved the problem of interference among multiple elements. The prototype (model MZ-MKFY) has been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as low calorific value, total water, ash content, volatile content, and sulfur content, with precision acceptable to the coal industry, are presented
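    The multiple-linear-regression step can be sketched with synthetic data: gamma-ray peak areas from several elements are regressed jointly against a known coal parameter, which untangles inter-element interference that a single-peak calibration cannot. All numbers below are synthetic; the actual calibration data are not given in the abstract.

```python
# Minimal sketch of the regression step: peak areas for several elements
# (columns of X) are fitted jointly to a calibration property y, e.g. the
# calorific value, by ordinary least squares. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(100, 1000, size=(20, 4))          # peak areas for 4 elements
true_coef = np.array([0.01, -0.005, 0.002, 0.02]) # invented sensitivities
y = X @ true_coef + 3.0                           # calibration values (arb. units)

# Fit intercept + coefficients by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef                                   # predicted property values
```

Because all element channels enter the fit together, a count contribution that one element's peak "leaks" into another's window is absorbed into the coefficients rather than biasing a single-channel calibration.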

  1. Real time speech formant analyzer and display

    Science.gov (United States)

    Holland, George E.; Struve, Walter S.; Homer, John F.

    1987-01-01

    A speech analyzer for interpretation of sound includes a sound input which converts the sound into a signal representing the sound. The signal is passed through a plurality of frequency pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and then are prepared for visual display in continuous real time. Parameters from the inputted sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor which is interfaced with a television set for display of the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user.

  2. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  3. Miniature multichannel analyzer for process monitoring

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.; Wiig, L.G.; Ianakiev, K.D.

    1993-01-01

    A new, 4,000-channel analyzer has been developed for gamma-ray spectroscopy applications. A design philosophy of hardware and software building blocks has been combined with design goals of simplicity, compactness, portability, and reliability. The result is a miniature, modular multichannel analyzer (MMMCA), which offers a solution to a variety of nondestructive assay (NDA) needs in many areas of general application, independent of computer platform or operating system. Detector-signal analog electronics, the bias supply, and batteries are included in the virtually pocket-size, low-power MMMCA unit. The MMMCA features digital setup and control, automated data reduction, and automated quality assurance. Areas of current NDA applications include on-line continuous (process) monitoring, process material holdup measurements, and field inspections

  4. Testing the Application for Analyzing Structured Entities

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2011-01-01

    Full Text Available The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the application on components and as a whole are established. A testing strategy for different objectives is proposed. The behavior of users during the testing period is analyzed. A statistical analysis of user behavior in processes of infinite resource access is carried out.

  5. A new approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method verifies a previous vulnerability assessment and introduces a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility.

  6. Real-time airborne particle analyzer

    Science.gov (United States)

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the gas-filled conduit. Preferably, the gas-filled conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  7. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    The radionuclide analysis in nuclear power plants, practiced for the purposes of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system, and monitoring the radioactive waste effluent, is an important job. Important as it is, it requires considerable expert labor, because the samples to be analyzed are multifarious and very large in number, and in addition, this job depends heavily on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been worked out. The results of its performance test at an operating power plant have proved that the development has fairly accomplished its objectives and that the system is quite useful. The developmental work was carried out through cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years, starting in 1974. (auth.)

  8. AC Initiation System.

    Science.gov (United States)

    An ac initiation system is described which uses three ac transmission signals interlocked for safety by frequency, phase, and power discrimination. The ac initiation system is pre-armed by the application of two ac signals having the proper phases, and activates a load when an ac power signal of the proper frequency and power level is applied. (Author)

  9. Neutral Particle Analyzer Diagnostic on NSTX

    International Nuclear Information System (INIS)

    Medley, S.S.; Roquemore, A.L.

    2004-01-01

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously, with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV, and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector

  10. Analyzing Gender Stereotyping in Bollywood Movies

    OpenAIRE

    Madaan, Nishtha; Mehta, Sameep; Agrawaal, Taneea S; Malhotra, Vrinda; Aggarwal, Aditi; Saxena, Mayank

    2017-01-01

    The presence of gender stereotypes in many aspects of society is a well-known phenomenon. In this paper, we focus on studying such stereotypes and bias in Hindi movie industry (Bollywood). We analyze movie plots and posters for all movies released since 1970. The gender bias is detected by semantic modeling of plots at inter-sentence and intra-sentence level. Different features like occupation, introduction of cast in text, associated actions and descriptions are captured to show the pervasiv...

  11. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously, with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV, and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  12. A seal analyzer for testing container integrity

    International Nuclear Information System (INIS)

    McDaniel, P.; Jenkins, C.

    1988-01-01

    This paper reports on the development of a laboratory and production seal analyzer that offers a rapid, nondestructive method of assuring the seal integrity of virtually any type of single- or double-sealed container. The system can test a broad range of metal cans, drums and trays, membrane-lidded vessels, flexible pouches, aerosol containers, and glass or metal containers with twist-top lids that are used in the chemical/pesticide (hazardous materials/waste), beverage, food, medical and pharmaceutical industries

  13. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity in any symbolical sequence. The ID method is shown to have advantages in comparison with the Fourier transform, the wavelet transform, and the dynamic programming method for finding latent periodicity. Examples of latent periods for poetic texts, DNA sequences, and amino acid sequences are presented. A possible origin of latent periodicity for different symbolical sequences is discussed

  14. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: Studying and analyzing the undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance the ways we teach engineering leadership. The research has great insights that might assist engineering programs to improve curricula for the purpose of better engineering preparation to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  15. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are inflated GAMLSS model and generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  16. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. It entered China in 1989 and has become China's leading brand of chocolate in

  17. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement, and friendship and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust, and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring, and creating fake profiles on online social networks, exploring and analyzing these negative activities becomes the need of the hour. Usually negative ties are treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each type of tie. This paper presents a survey on analyzing negative ties in social networks through various types of network analysis techniques used for examining ties, such as status, centrality, and power measures. Because of the difference in the characteristics of flow in positive- and negative-tie networks, some of these measures are not applicable to negative ties. This paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches are reviewed and compared to determine the best approach for appropriately identifying negative ties in online networks. It is found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and further, new measures should be developed based upon the negative clique concept.
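    The negative-tie measures named above can be sketched concretely. Assuming the usual formulations (negative degree as the count of negative ties a node receives, and PN centrality computed from positive and negative adjacency matrices P and N as PN = (I - A/(2n-2))^{-1}·1 with A = P - 2N, following Everett and Borgatti), a toy three-node network looks like this; the network itself is invented.

```python
# Sketch of two negative-tie measures on a toy directed network.
# P holds positive ties, N negative ties; both formulations here are the
# commonly cited ones and are assumptions, not taken from the survey itself.
import numpy as np

def negative_degree(N):
    """Number of negative ties each node receives (column sums)."""
    return N.sum(axis=0)

def pn_centrality(P, N):
    """PN centrality: solve (I - A/(2n-2)) x = 1 with A = P - 2N."""
    n = P.shape[0]
    A = P - 2 * N
    return np.linalg.solve(np.eye(n) - A / (2 * n - 2), np.ones(n))

P = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], float)  # 0 and 1 are friends
N = np.array([[0, 0, 0], [0, 0, 1], [1, 0, 0]], float)  # negative ties involve node 2
```

On this toy network the node entangled in negative ties ends up with the lowest PN score, which is exactly the "outsider" signal the survey discusses.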

  18. Testing the Application for Analyzing Structured Entities

    OpenAIRE

    Ion IVAN; Bogdan VINTILA

    2011-01-01

    The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the applicat...

  19. Evaluation of the Air Void Analyzer

    Science.gov (United States)

    2013-07-01

    (Record text consists of reference-list fragments rather than an abstract: "…concrete using image analysis: Petrography of cementitious materials. ASTM STP 1215. S.M. DeHayes and D. Stark, eds. Philadelphia, PA: American…"; "…Administration (FHWA). 2006. Priority, market-ready technologies and innovations: Air Void Analyzer. Washington D.C. PDF file."; "Germann Instruments (GI). 2011…"; "…tests and properties of concrete and concrete-making materials. STP 169D. West Conshohocken, PA: ASTM International."; "Magura, D.D. 1996. Air void…")

  20. Semantic analyzability in children's understanding of idioms.

    Science.gov (United States)

    Gibbs, R W

    1991-06-01

    This study investigated the role of semantic analyzability in children's understanding of idioms. Kindergartners and first, third, and fourth graders listened to idiomatic expressions either alone or at the end of short story contexts. Their task was to explain verbally the intended meanings of these phrases and then to choose their correct idiomatic interpretations. The idioms presented to the children differed in their degree of analyzability. Some idioms were highly analyzable or decomposable, with the meanings of their parts contributing independently to their overall figurative meanings. Other idioms were nondecomposable because it was difficult to see any relation between a phrase's individual components and the idiom's figurative meaning. The results showed that younger children (kindergartners and first graders) understood decomposable idioms better than they did nondecomposable phrases. Older children (third and fourth graders) understood both kinds of idioms equally well in supporting contexts, but were better at interpreting decomposable idioms than they were at understanding nondecomposable idioms without contextual information. These findings demonstrate that young children better understand idiomatic phrases whose individual parts independently contribute to their overall figurative meanings.

  1. Handheld Fluorescence Microscopy based Flow Analyzer.

    Science.gov (United States)

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence based clinical microscopy and diagnostics is a manual, labour intensive and time consuming procedure. This article outlines a cost-effective, high throughput alternative to conventional fluorescence imaging techniques. With system level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC labeled fluorescent beads per minute, demonstrating the high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
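    The computational deblurring step described above can be illustrated with a toy one-dimensional example. The abstract does not specify the algorithm used; this sketch assumes motion blur modeled as convolution with a box kernel along the flow direction and inverts it with a Wiener filter, with the kernel length and signal-to-noise parameter invented for illustration.

```python
# Toy sketch of motion-blur removal for a single image row: blur is modeled
# as circular convolution with a 1-D box kernel and undone by a Wiener
# filter in the frequency domain. Parameters are illustrative assumptions.
import numpy as np

def wiener_deblur(row, kernel_len=5, snr=100.0):
    n = len(row)
    kernel = np.zeros(n)
    kernel[:kernel_len] = 1.0 / kernel_len     # box blur along flow direction
    H = np.fft.fft(kernel)
    G = np.fft.fft(row)
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)  # regularized inverse filter
    return np.real(np.fft.ifft(W * G))

signal = np.zeros(64)
signal[30] = 1.0                               # a point-like bead
box = np.concatenate([np.ones(5) / 5, np.zeros(59)])
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(box)))
restored = wiener_deblur(blurred)
```

Deconvolution re-concentrates the smeared intensity, which is what restores morphological features of fast-moving cells or beads.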

  2. A Raman-Based Portable Fuel Analyzer

    Science.gov (United States)

    Farquharson, Stuart

    2010-08-01

    Fuel is the single most important supply during war. Consider that the US Military employs over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory. Unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that establish fuel usability. The analyzer uses Raman spectroscopy to measure fuel samples without preparation in 2 minutes. The challenge, however, is that, as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot be simply correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach is laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm-1 wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.
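    The chemometric correlation step can be sketched in miniature. The models built from the 800 measured fuels are not given here, so the sketch below uses principal-component regression on synthetic spectra purely to illustrate the idea of compressing many collinear Raman channels before regressing a fuel property on them; PLS or other factor methods would work similarly.

```python
# Hedged sketch of a chemometric calibration: Raman spectra (many
# correlated wavenumber channels) are compressed to principal-component
# scores, and a fuel property is regressed on the scores. All data are
# synthetic; the actual fuels, channels, and properties are assumptions.
import numpy as np

rng = np.random.default_rng(1)
spectra = rng.normal(size=(50, 200))       # 50 fuels x 200 Raman channels
prop = spectra[:, :3] @ np.array([1.0, -0.5, 0.25]) + 40.0  # synthetic property

# Center, decompose, and regress on the leading component scores.
Xc = spectra - spectra.mean(axis=0)
yc = prop - prop.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 10                                     # retained components
scores = U[:, :k] * s[:k]
b, *_ = np.linalg.lstsq(scores, yc, rcond=None)

pred = scores @ b + prop.mean()
rmse = float(np.sqrt(np.mean((pred - prop) ** 2)))
```

The factor compression is what makes calibration stable when there are far more spectral channels than calibration fuels.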

  3. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    Fasching, G.E.; Patton, G.H.

    1975-01-01

    A method and apparatus to reduce the drift of single channel analyzers are described. Essentially, this invention employs a time-sharing or multiplexing technique to ensure that the outputs from two single channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source or other voltage changes. The multiplexing is accomplished when a flip-flop, actuated by a clock, changes state to switch the outputs from the individual SCAs before these outputs are sent to a ratio counting scaler. In the particular system embodiment disclosed to illustrate this invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends this signal, after amplification, to an SCA system that contains the invention. Therein, at least two single channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer so that the output from the multiplexer contains counts falling within two distinct segments of the region. By dividing the counts from the multiplexer by each other, the percentage of sulfur within the coal sample under observation may be determined. (U.S.)
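    The drift-cancellation idea behind the ratio-counting scheme can be shown numerically: dividing the counts from the two spectral windows removes any multiplicative drift common to both SCAs. The count values below are invented.

```python
# Sketch of ratio counting: a drift that scales both windows equally
# (e.g. a shifted threshold reference affecting both SCAs) cancels in the
# ratio. Count values are invented for illustration.
def window_ratio(counts_a, counts_b):
    if counts_b == 0:
        raise ZeroDivisionError("empty reference window")
    return counts_a / counts_b

nominal = window_ratio(12000, 30000)       # ratio with no drift
drift = 0.93                               # common multiplicative drift
drifted = window_ratio(12000 * drift, 30000 * drift)
# nominal and drifted agree: the ratio is insensitive to common drift
```

A single-window measurement would read 7% low under the same drift, which is exactly the error the multiplexed ratio removes.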

  4. Sustainable Agricultural Marketing Initiatives

    Directory of Open Access Journals (Sweden)

    Hakan Adanacıoğlu

    2015-07-01

    Full Text Available Sustainable marketing is a holistic approach that puts equal emphasis on environmental, social equity, and economic concerns in the development of marketing strategies. The purpose of the study is to examine and discuss the sustainable agricultural marketing initiatives practiced throughout the world and in Turkey, and to put forth suggestions to further improve the performance of agricultural marketing initiatives in Turkey. Some of the sustainable agricultural marketing initiatives practiced around the world are carried out through civil organizations; others have been launched by farmers, consumers, food processors, and retailers. Long-term strategies to increase these initiatives should be determined, because successful examples of sustainable agricultural marketing initiatives are scarce and have not spread in Turkey. In this context, first of all, the support provided by the government to improve agricultural marketing systems, such as EU funds for rural development, should be compatible with the goals of sustainable marketing. For this purpose, it should be examined whether all proposed projects related to agricultural marketing meet the social, economic, and environmental principles of sustainable marketing. It is important that supporting organizations, especially civil society organizations, take an active role in the faster dissemination and adoption of sustainable agricultural marketing practices in Turkey. These organizations may provide technical assistance in preparing successful project proposals and training to farm groups. In addition, other organizations, such as local administrations, producers' associations, and cooperatives, can contribute to the success of sustainable agricultural marketing initiatives. The use of direct marketing strategies and vertical integration attempts in sustainable agricultural marketing initiatives that will likely be implemented in Turkey is

  5. Aperture Valve for the Mars Organic Molecule Analyzer (MOMA)

    Science.gov (United States)

    Hakun, Claef F.; Engler, Charles D.; Barber, Willie E.; Canham, John S.

    2014-01-01

    NASA's participation in the multi-nation ExoMars 2018 Rover mission includes a critical astrobiology mass spectrometer instrument on the Rover called the Mars Organic Molecule Analyzer (MOMA). The Aperture Valve is a critical electromechanical valve used by the mass spectrometer to facilitate the transfer of ions from Martian soil to the mass spectrometer for analysis. The MOMA Aperture Valve development program is discussed in terms of the initial valve design and the subsequent improvements that resulted from prototype testing. The initial Aperture Valve concept seemed promising, based on calculations and perceived merits. However, performance results of this design were disappointing, due to delamination of the TiN and DLC coatings applied to the titanium base metals, which caused debris from the coatings to seize the valve. While peer reviews and design trade studies are important forums for vetting a concept design, results from testing should not be underestimated. Despite the lack of development progress toward meeting requirements, valuable information from the weaknesses discovered in the initial valve design was used to develop a second, more robust Aperture Valve. Based on a check-ball design, the ETU flight valve has significantly less surface area creating the seal. Moreover, PVD coatings were eliminated in favor of hardened, nonmagnetic, corrosion-resistant alloys. Test results were impressive, with the valve achieving a sealing leak rate five orders of magnitude better than end-of-life requirements. Cycle life was equally impressive, achieving 280,000 cycles without failure.

  6. Inhomogeneous inflation: The initial-value problem

    International Nuclear Information System (INIS)

    Laguna, P.; Kurki-Suonio, H.; Matzner, R.A.

    1991-01-01

    We present a spatially three-dimensional study for solving the initial-value problem in general relativity for inhomogeneous cosmologies. We use York's conformal approach to solve the constraint equations of Einstein's field equations for scalar field sources and find the initial data which will be used in the evolution. This work constitutes the first stage in the development of a code to analyze the effects of matter and spacetime inhomogeneities on inflation

  7. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC, formerly recognized as the Idaho Chemical Processing Plant), which is on DOE's INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  8. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analyzing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality, including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy’s code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy’s functionality and the future goals of the project. We also encourage interested users to become involved in further developing IRISpy.
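
    The slicing-and-reducing pattern described above can be sketched with a minimal stand-in class. This is an illustration of the design idea only, not the real IRISpy API (the class and method names below are hypothetical):

```python
import numpy as np

class SpectrogramCube:
    """Minimal sketch of a sliceable data object (hypothetical, not the
    real IRISpy API): slicing returns a new cube so reductions chain."""
    def __init__(self, data, wavelengths):
        self.data = np.asarray(data)              # (time, slit, wavelength)
        self.wavelengths = np.asarray(wavelengths)

    def __getitem__(self, item):
        # Slice the data; keep the wavelength-axis metadata consistent
        # when a full 3-axis slice is given.
        sliced = self.data[item]
        wl = self.wavelengths
        if isinstance(item, tuple) and len(item) == 3:
            wl = self.wavelengths[item[2]]
        return SpectrogramCube(sliced, wl)

    def mean_spectrum(self):
        # Average over time and slit position -> 1-D spectrum.
        return self.data.mean(axis=(0, 1))

cube = SpectrogramCube(np.ones((4, 8, 16)), np.linspace(1332, 1358, 16))
sub = cube[:, 2:6, :8]          # 4 slit rows, first 8 spectral bins
print(sub.data.shape)           # (4, 4, 8)
```

Because slicing returns another cube, inspection steps (crop, then average, then plot) can be chained without the user tracking array axes by hand.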

  9. 1996 environmental initiatives report

    International Nuclear Information System (INIS)

    1996-01-01

    Progress by Consumers Gas in addressing environmental challenges was reviewed. Proposed environmental initiatives for the next fiscal year and beyond were introduced. Proposed initiatives were placed into three priority categories (high, medium or low), which together with the environmental management framework form the utility's overall environmental agenda. High on the list of environmental priorities for the company are atmospheric air emissions, planning and construction practices, energy conservation and efficiency, environmental compliance, and methane emissions. The present state of the initiatives by the various company divisions and regions, compiled from the respective business plans, was reported. 21 figs

  10. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    Full Text Available This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the times of events, and has prompted efforts to identify events and to solve problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment must be accurate, so that a series of continuous and updated data is always at hand. In this case, Grid is the best answer for using the data and resources of organizations through shared processing.

  11. The kpx, a program analyzer for parallelization

    International Nuclear Information System (INIS)

    Matsuyama, Yuji; Orii, Shigeo; Ota, Toshiro; Kume, Etsuo; Aikawa, Hiroshi.

    1997-03-01

    The kpx is a program analyzer, developed as a common technological basis for promoting parallel processing. The kpx consists of three tools. The first is ktool, which shows how much execution time is spent in program segments. The second is ptool, which shows parallelization overhead on the Paragon system. The last is xtool, which shows parallelization overhead on the VPP system. The kpx, designed to work with any FORTRAN code on any UNIX computer, has been confirmed to work well after testing on Paragon, SP2, SR2201, VPP500, VPP300, Monte-4, SX-4 and T90. (author)

  12. A low power Multi-Channel Analyzer

    International Nuclear Information System (INIS)

    Anderson, G.A.; Brackenbush, L.W.

    1993-06-01

    The instrumentation used in nuclear spectroscopy is generally large, is not portable, and requires a lot of power. Key components of these counting systems are the computer and the Multi-Channel Analyzer (MCA). To assist in performing measurements requiring portable systems, a small, very low power MCA has been developed at Pacific Northwest Laboratory (PNL). This MCA is interfaced with a Hewlett Packard palmtop computer for portable applications. The MCA can also be connected to an IBM/PC for data storage and analysis. In addition, a real-time display mode allows the user to view the spectra as they are collected.
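
    At its core, an MCA sorts detector pulses by height into channels, i.e., it accumulates a histogram. A minimal sketch with simulated pulse data (the peak position, channel count and event numbers are illustrative assumptions, not PNL hardware values):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated pulse heights (volts): a photopeak near 0.66 V over a flat
# background continuum. Hypothetical values for illustration only.
pulses = np.concatenate([
    rng.normal(0.66, 0.02, 5000),     # photopeak events
    rng.uniform(0.0, 1.0, 2000),      # background continuum
])

# A multi-channel analyzer is, at heart, a pulse-height histogram:
n_channels = 1024
counts, edges = np.histogram(pulses, bins=n_channels, range=(0.0, 1.0))

peak_channel = int(np.argmax(counts))
print(peak_channel)   # near channel 676 (0.66 V out of 1.0 V full scale)
```

The "real-time display mode" mentioned above then amounts to re-plotting `counts` as events accumulate.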

  13. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
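
    The explicit transient option mentioned above amounts to a forward-Euler update of the nodal temperatures. A one-dimensional sketch (not SPAR itself; grid, boundary values and step count are illustrative assumptions) showing the update and its stability constraint:

```python
import numpy as np

# Sketch: one explicit forward-Euler step of 1-D transient conduction,
#   T_i(n+1) = T_i(n) + Fo * (T_{i-1} - 2*T_i + T_{i+1}),
# stable for Fourier number Fo = alpha*dt/dx**2 <= 0.5.
def explicit_step(T, fo):
    assert fo <= 0.5, "explicit scheme unstable for Fo > 0.5"
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + fo * (T[:-2] - 2.0 * T[1:-1] + T[2:])
    return T_new  # boundary nodes held fixed (Dirichlet)

T = np.zeros(11)
T[0] = 100.0                      # hot wall; cold interior and far wall
for _ in range(200):
    T = explicit_step(T, fo=0.4)
print(np.round(T[:4], 1))         # profile relaxing toward a linear ramp
```

Implicit methods trade this per-step stability limit for the cost of a linear solve, which is why both families are offered for transient problems.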

  14. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a light-weight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guesswork out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self-contained, portable and built for field use. Both visual and auditory cues are provided to the operator.

  15. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed

  16. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    Full Text Available The paper presents the theoretical and methodological aspects of research on the development of argumentation in elementary school children. It presents a theoretical framework detailing psychological mechanisms responsible for the acquisition and transfer of argumentative discourse and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  17. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.
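
    The equidistant and Uhrig modulation schemes compared in the experiment differ only in their pulse timings: the Uhrig sequence places the j-th of n pulses at t_j = T·sin²(jπ/(2n+2)), while the equidistant (CPMG-style) sequence spaces them uniformly. A short sketch generating both timing sets:

```python
import math

def cpmg_times(n, T):
    """Equidistant (CPMG-style) pulse times over total evolution time T."""
    return [T * (j - 0.5) / n for j in range(1, n + 1)]

def uhrig_times(n, T):
    """Uhrig dynamical-decoupling pulse times: t_j = T*sin^2(j*pi/(2n+2))."""
    return [T * math.sin(j * math.pi / (2 * n + 2)) ** 2 for j in range(1, n + 1)]

n, T = 5, 1.0
print([round(t, 3) for t in cpmg_times(n, T)])    # [0.1, 0.3, 0.5, 0.7, 0.9]
print([round(t, 3) for t in uhrig_times(n, T)])   # [0.067, 0.25, 0.5, 0.75, 0.933]
```

The Uhrig timings crowd the pulses toward the ends of the interval, which reshapes the filter function the sequence applies to the noise spectrum.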

  18. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)]

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.
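
    Since each independent reading is repeated roughly four times on the 1-s grid, a user can collapse the record back to independent samples by averaging in non-overlapping 4-s blocks. A small sketch (the concentration values are hypothetical):

```python
import numpy as np

# The analyzer repeats each independent O3 reading ~4 times on the 1-s
# time base; averaging non-overlapping 4-s blocks recovers one value per
# true measurement.
def block_average(x, block=4):
    x = np.asarray(x, dtype=float)
    n = (len(x) // block) * block           # drop a ragged tail, if any
    return x[:n].reshape(-1, block).mean(axis=1)

one_second = [30.1, 30.1, 30.1, 30.1, 29.8, 29.8, 29.8, 29.8]  # ppbv
print(block_average(one_second))            # one value per independent reading
```

This assumes the repeats are aligned with the 4-s blocks; in practice one would locate the switching boundaries from the housekeeping data first.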

  19. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  20. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    Full Text Available A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable for differentiating laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators of water quality, but also a main concern for public health, especially affecting people living in high-burden, resource-limited settings.

  1. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error-prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  2. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as their low values often reach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar amount of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold with no change in apparent Rd when the two leaves with higher stomatal density faced outside. These results showed a clear effect of the position of stomata on ΔCO2. Therefore, it can be concluded that leaf position is important for improving respiration measurements, increasing ΔCO2 without affecting the respiration results per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.

  4. Solar Probe ANalyzer for Ions - Laboratory Performance

    Science.gov (United States)

    Livi, R.; Larson, D. E.; Kasper, J. C.; Korreck, K. E.; Whittlesey, P. L.

    2017-12-01

    The Parker Solar Probe (PSP) mission is a heliospheric satellite that will orbit the Sun closer than any prior mission to date, with a first perihelion of 35 solar radii (RS) eventually decreasing to about 10 RS. PSP includes the Solar Wind Electrons Alphas and Protons (SWEAP) instrument suite, which in turn consists of four instruments: the Solar Probe Cup (SPC) and three Solar Probe ANalyzers (SPAN) for ions and electrons. Together, this suite will take local measurements of particles and electromagnetic fields within the Sun's corona. SPAN-Ai has completed flight calibration and spacecraft integration and is set to be launched in July of 2018. The main mode of operation consists of an electrostatic analyzer (ESA) at its aperture followed by a Time-of-Flight section to measure the energy and mass per charge (m/q) of the ambient ions. SPAN-Ai's main objective is to measure solar wind ions within an energy range of 5 eV - 20 keV, a mass/q between 1-60 [amu/q] and a field of view of 240°x120°. Here we will show flight calibration results and performance.
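
    An ESA selects ions by energy per charge, so each E/q step maps directly to a speed for an assumed species. For singly charged protons, v = sqrt(2E/m); a quick sketch over the instrument's stated 5 eV - 20 keV range:

```python
import math

# Sketch: converting an electrostatic analyzer's energy-per-charge setting
# into proton speed, v = sqrt(2*E/m) for singly charged H+.
EV = 1.602176634e-19        # J per eV
M_P = 1.67262192e-27        # proton mass, kg

def proton_speed_km_s(energy_ev):
    return math.sqrt(2.0 * energy_ev * EV / M_P) / 1e3

# Typical solar-wind protons (~1 keV) vs the 5 eV and 20 keV band edges:
for e in (5, 1000, 20000):
    print(f"{e:>6} eV -> {proton_speed_km_s(e):8.1f} km/s")
```

A 1 keV proton moves at roughly 440 km/s, a typical solar-wind bulk speed, which is why the band is centered where it is; heavier species at the same E/q move proportionally slower, which the Time-of-Flight section then separates by m/q.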

  5. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Full Text Available Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there is significant difference between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
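
    The three indices can be computed from survey tallies. The exact formulas vary between studies, so the formulation below (1-4 rating scales, FI = Σ(a·n_a)/(4N)·100, SI likewise, II = FI·SI/100) and the tallies themselves are assumptions for illustration, not the paper's data:

```python
# One common formulation (an assumption here; conventions vary by study):
# respondents rate each cause's frequency and severity on a 1..4 scale.
def weighted_index(ratings_count):
    """ratings_count[a] = number of respondents giving score a (a = 1..4)."""
    n_total = sum(ratings_count.values())
    s = sum(a * n for a, n in ratings_count.items())
    return 100.0 * s / (4 * n_total)

freq = {1: 3, 2: 6, 3: 14, 4: 10}   # hypothetical tallies for one cause, N = 33
sev  = {1: 2, 2: 5, 3: 12, 4: 14}

fi = weighted_index(freq)           # Frequency Index (%)
si = weighted_index(sev)            # Severity Index (%)
ii = fi * si / 100.0                # Importance Index (%)
print(round(fi, 1), round(si, 1), round(ii, 1))   # 73.5 78.8 57.9
```

Ranking causes by `ii` across all respondents is what produces the "top ten" list described above.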

  6. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including lack of access to a correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases: increasing international cooperation in scientific research, gaining and sharing scientific knowledge, and developing tools for extracting and sharing that knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, the overlap among sources was analyzed; at a second level, the presence of rare disease terms in target sources included in the UMLS was analyzed, working at the term and concept level. We found that MeSH has the best representation of rare disease terms.
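
    At the term level, the first-stage overlap analysis reduces to set operations over the terminologies. A toy sketch with hypothetical term sets (the real study works on the full UMLS-integrated sources):

```python
# Hypothetical term sets standing in for the terminology sources:
mesh  = {"marfan syndrome", "cystic fibrosis", "alkaptonuria", "gaucher disease"}
icd10 = {"marfan syndrome", "cystic fibrosis", "gaucher disease"}
omim  = {"marfan syndrome", "alkaptonuria", "fabry disease"}

def jaccard(a, b):
    """Overlap measure: |intersection| / |union|."""
    return len(a & b) / len(a | b)

print(sorted(mesh & icd10 & omim))        # ['marfan syndrome']
print(round(jaccard(mesh, icd10), 2))     # 0.75
```

Working at the concept level instead would first map each term to its UMLS concept identifier, so that synonymous terms count as one entry before the same set arithmetic is applied.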

  7. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of video recordings or stroboscopic photos. This could be an interesting approach to studying kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
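
    Whether the tracked points come from a video or a simulation, the data-modeling step is a least-squares fit of an equation of motion to positions. A sketch with synthetic projectile data (noise-free, with assumed launch parameters, for clarity):

```python
import numpy as np

# Fit y(t) = y0 + v0*t - 0.5*g*t^2 to tracked positions, as one would to
# Tracker-exported data; here synthetic points with g = 9.81 m/s^2.
t = np.linspace(0.0, 1.0, 21)
y = 1.0 + 4.0 * t - 0.5 * 9.81 * t**2       # hypothetical y0 = 1 m, v0 = 4 m/s

coeffs = np.polyfit(t, y, deg=2)            # returns [a2, a1, a0]
g_fit = -2.0 * coeffs[0]                    # a2 = -g/2
v0_fit, y0_fit = coeffs[1], coeffs[2]
print(round(g_fit, 2), round(v0_fit, 2), round(y0_fit, 2))   # 9.81 4.0 1.0
```

With real tracked data the recovered g deviates from 9.81 m/s² by the measurement noise, which is itself a useful classroom discussion point.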

  8. Automatic analyzing device for chlorine ion

    International Nuclear Information System (INIS)

    Sugibayashi, Shinji; Morikawa, Yoshitake; Fukase, Kazuo; Kashima, Hiromasa.

    1997-01-01

    The present invention provides a device for automatically analyzing trace amounts of chlorine ions contained in the feedwater, condensate and reactor water of a BWR type power plant. Namely, zero adjustment or span calibration in this device is conducted as follows. (1) A standard chlorine ion liquid is supplied from a tank to a mixer by a constant volume pump, and the liquid is diluted and mixed with purified water to form a standard liquid. (2) The pH of the standard liquid is adjusted by a pH adjuster. (3) The standard liquid is supplied to an electrode cell to conduct zero adjustment or span calibration. Chlorine ions in a specimen are measured by the device of the present invention as follows. (1) The specimen is supplied to a head tank through a line filter. (2) The pH of the specimen is adjusted by a pH adjuster. (3) The specimen is supplied to an electrode cell to electrically measure the concentration of chlorine ions in the specimen. The device of the present invention can automatically analyze trace amounts of chlorine ions with high accuracy, thereby improving sensitivity and reducing operator burden and radiation exposure. (I.S.)
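
    The zero-adjustment/span-calibration steps amount to a two-point linear calibration of the electrode signal. A sketch with hypothetical readings and units (none of these numbers come from the patent):

```python
# Sketch of the zero/span idea: a two-point calibration maps the raw
# electrode reading to concentration. All values here are hypothetical.
def make_calibration(zero_reading, span_reading, span_conc):
    slope = span_conc / (span_reading - zero_reading)
    return lambda reading: slope * (reading - zero_reading)

# Zero standard reads 0.02 (raw units); span standard of 100 ppb reads 1.02.
to_ppb = make_calibration(zero_reading=0.02, span_reading=1.02, span_conc=100.0)
print(round(to_ppb(0.52), 3))   # 50.0
```

Re-running the zero and span steps periodically, as the device does automatically, keeps `slope` and the zero offset current as the electrode drifts.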

  9. Plutonium solution analyzer. Revised February 1995

    International Nuclear Information System (INIS)

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%--0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40--240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4--4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated); the sample is instead handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  10. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after administering an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of 1% in the 13CO2 concentration of the exhaled breath have to be detected at a concentration level for this isotope of about 500 ppm.
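
    The underlying arithmetic compares the 13CO2/12CO2 ratio before and after the substrate, conventionally expressed as a delta-over-baseline (DOB, per mil) against the VPDB standard ratio. A sketch with hypothetical sample ratios (the VPDB value is the standard one; everything else is illustrative):

```python
# Sketch of the 13C-breath-test arithmetic: compare the 13CO2/12CO2 ratio
# of the two cells via delta notation relative to the VPDB standard.
R_VPDB = 0.0112372                    # 13C/12C of the VPDB reference

def delta_per_mil(r_sample):
    return (r_sample / R_VPDB - 1.0) * 1000.0

r_before = 0.011237                   # baseline breath sample (hypothetical)
r_after  = 0.011293                   # after the 13C-labelled substrate
dob = delta_per_mil(r_after) - delta_per_mil(r_before)
print(round(dob, 2))                  # 4.98
```

A DOB rise of a few per mil between the two cells is the kind of small ratio change, on the order of the 1% effect quoted above, that the instrument must resolve.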

  11. Research Programs & Initiatives

    Science.gov (United States)

    CGH develops international initiatives and collaborates with other NCI divisions, NCI-designated Cancer Centers, and other countries to support cancer control planning, encourage capacity building, and support cancer research and research networks.

  12. Nursing Home Quality Initiative

    Data.gov (United States)

    U.S. Department of Health & Human Services — This Nursing Home Quality Initiative (NHQI) website provides consumer and provider information regarding the quality of care in nursing homes. NHQI discusses quality...

  13. Surgical Critical Care Initiative

    Data.gov (United States)

    Federal Laboratory Consortium — The Surgical Critical Care Initiative (SC2i) is a USU research program established in October 2013 to develop, translate, and validate biology-driven critical care....

  14. Global Methane Initiative

    Science.gov (United States)

    The Global Methane Initiative promotes cost-effective, near-term methane recovery through partnerships between developed and developing countries, with participation from the private sector, development banks, and nongovernmental organizations.

  15. Medical Errors Reduction Initiative

    National Research Council Canada - National Science Library

    Mutter, Michael L

    2005-01-01

    The Valley Hospital of Ridgewood, New Jersey, is proposing to extend a limited but highly successful specimen management and medication administration medical errors reduction initiative on a hospital-wide basis...

  16. RAS Initiative - Community Outreach

    Science.gov (United States)

    Through community and technical collaborations, workshops and symposia, and the distribution of reference reagents, the RAS Initiative seeks to increase the sharing of knowledge and resources essential to defeating cancers caused by mutant RAS genes.

  17. RAS Initiative - Events

    Science.gov (United States)

    The NCI RAS Initiative has organized multiple events with outside experts to discuss how the latest scientific and technological breakthroughs can be applied to discover vulnerabilities in RAS-driven cancers.

  18. PESP Landscaping Initiative

    Science.gov (United States)

    Landscaping practices can positively or negatively affect local environments and human health. The Landscaping Initiative seeks to enhance benefits of landscaping while reducing need for pesticides, fertilizers, etc., by working with partners.

  19. About the RAS Initiative

    Science.gov (United States)

    The RAS Initiative, a "hub and spoke" model, connects researchers to better understand and target the more than 30% of cancers driven by mutations in RAS genes. Includes oversight and contact information.

  20. The Yekaterinburg headache initiative

    DEFF Research Database (Denmark)

    Lebedeva, Elena R; Olesen, Jes; Osipova, Vera V

    2013-01-01

    for a demonstrational interventional project in Russia, undertaken within the Global Campaign against Headache. The initiative proposes three actions: 1) raise awareness of need for improvement; 2) design and implement a three-tier model (from primary care to a single highly specialized centre with academic affiliation......) for efficient and equitable delivery of headache-related health care; 3) develop a range of educational initiatives aimed at primary-care physicians, non-specialist neurologists, pharmacists and the general public to support the second action. RESULTS AND CONCLUSION: We set these proposals in a context...... of a health-care needs assessment, and as a model for all Russia. We present and discuss early progress of the initiative, justify the investment of resources required for implementation and call for the political support that full implementation requires. The more that the Yekaterinburg headache initiative...

  1. The RAS Initiative

    Science.gov (United States)

    NCI established the RAS Initiative to explore innovative approaches for attacking the proteins encoded by mutant forms of RAS genes and to ultimately create effective, new therapies for RAS-related cancers.

  2. Piezoelectrically Initiated Pyrotechnic Igniter

    Science.gov (United States)

    Quince, Asia; Dutton, Maureen; Hicks, Robert; Burnham, Karen

    2013-01-01

    This innovation consists of a pyrotechnic initiator and piezoelectric initiation system. The device will be capable of being initiated mechanically; resisting initiation by EMF, RF, and EMI (electromagnetic field, radio frequency, and electromagnetic interference, respectively); and initiating in water environments and space environments. Current devices of this nature are initiated by the mechanical action of a firing pin against a primer. Primers historically are prone to failure. These failures are commonly known as misfires or hang-fires. In many cases, the primer shows the dent where the firing pin struck the primer, but the primer failed to fire. In devices such as "T" handles, which are commonly used to initiate the blowout of canopies, loss of function of the device may result in loss of crew. In devices such as flares or smoke generators, failure can result in failure to spot a downed pilot. The piezoelectrically initiated ignition system consists of a pyrotechnic device that plugs into a mechanical system (activator), which on activation, generates a high-voltage spark. The activator, when released, will strike a stack of electrically linked piezo crystals, generating a high-voltage, low-amperage current that is then conducted to the pyro-initiator. Within the initiator, an electrode releases a spark that passes through a pyrotechnic first-fire mixture, causing it to combust. The combustion of the first-fire initiates a primary pyrotechnic or explosive powder. If used in a "T" handle, the primary would ramp the speed of burn up to the speed of sound, generating a shock wave that would cause a high explosive to go "high order." In a flare or smoke generator, the secondary would produce the heat necessary to ignite the pyrotechnic mixture. The piezo activator subsystem is redundant in that a second stack of crystals would be struck at the same time with the same activation force, doubling the probability of a first strike spark generation. If the first

  3. Supply Chain Initiatives Database

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-11-01

    The Supply Chain Initiatives Database (SCID) presents innovative approaches to engaging industrial suppliers in efforts to save energy, increase productivity and improve environmental performance. This comprehensive and freely-accessible database was developed by the Institute for Industrial Productivity (IIP). IIP acknowledges Ecofys for their valuable contributions. The database contains case studies searchable according to the types of activities buyers are undertaking to motivate suppliers, target sector, organization leading the initiative, and program or partnership linkages.

  4. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and involve mouse-clicks in each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
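
    The tip-elongation step described above can be sketched as a greedy walk through the skeleton graph. This is an illustrative simplification (the actual Root System Analyzer evaluates all candidate paths and is implemented in Matlab); the toy skeleton graph, the coordinates, and all function names below are hypothetical:

```python
import math

def unit(a, b):
    """Unit vector pointing from node a to node b (2-D coordinates)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    n = math.hypot(dx, dy)
    return (dx / n, dy / n)

def elongate(coords, adj, tip, heading, increment):
    """Greedily extend a root from `tip` through the skeleton graph `adj`,
    preferring the edge that deviates least from the current heading,
    until the accumulated length reaches `increment`."""
    path, grown, current = [tip], 0.0, tip
    while grown < increment:
        candidates = [n for n in adj[current] if n not in path]
        if not candidates:
            break
        # straightest continuation: maximize cosine with the current heading
        best = max(candidates,
                   key=lambda n: heading[0] * unit(coords[current], coords[n])[0]
                               + heading[1] * unit(coords[current], coords[n])[1])
        grown += math.dist(coords[current], coords[best])
        heading = unit(coords[current], coords[best])
        path.append(best)
        current = best
    return path

# Tiny skeleton: a junction at B where the root can go straight (C) or turn (D)
coords = {'A': (0, 0), 'B': (1, 0), 'C': (2, 0), 'D': (1, 1)}
adj = {'A': ['B'], 'B': ['A', 'C', 'D'], 'C': ['B'], 'D': ['B']}
print(elongate(coords, adj, 'B', (1, 0), 1.0))  # -> ['B', 'C']
```

    The greedy rule stands in for the paper's optimal-path search over all possible continuations, which additionally uses the growth model to set the elongation increment.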

  5. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and a clinical isolate of Aggregatibacter actinomycetemcomitans from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time, compared to traditional methods, of fastidious buccal microorganisms associated with the etiology of periodontitis.

  6. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  7. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    Snider, D.M.; Wagner, K.L.; Grush, W.H.; Jones, K.R.

    1995-01-01

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of nuclear safety analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays.

  8. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    Ehrlich, B.J.; Hall, R.C.; Thiede, P.W.

    1979-01-01

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds, and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultraviolet, infrared or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing the conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography, and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  9. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept; analyzes basic infrastructure requirements; identifies related infrastructure issues, concerns, and vulnerabilities; and offers recommended solutions.

  10. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the Eurobarometer sample survey, which was requested by the European Commission. The social climate index is used to measure the perceptions of the population, taking into account their personal situation and their perspective at the national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, over a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  11. Analyzing Demand: Hegemonic Masculinity and Feminine Prostitution

    Directory of Open Access Journals (Sweden)

    Beatriz Ranea Triviño

    2016-12-01

    Full Text Available In this article, we present an exploratory study of the relationship between the construction of hegemonic masculinity and the consumption of female prostitution. We focused on the experiences, attitudes and perceptions of young heterosexual men who have paid for sex. Using a qualitative method of analysis, we conducted six semi-structured interviews with men between 18 and 35 years old. The analysis of the interviews shows differences in the frequency of payment for sexual services, the diversity of motivations, the spaces where prostitutes are sought, and opinions on prostitution and prostitutes. The main conclusion of this study is that the discourses of the interviewees reproduce gender stereotypes and sexual gender roles. It is suggested that prostitution can be interpreted as a scenario where these men perform their hegemonic masculinity.

  12. Using wavelet features for analyzing gamma lines

    International Nuclear Information System (INIS)

    Medhat, M.E.; Abdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Uzhinskii, V.V.

    2004-01-01

    Data processing methods for analyzing gamma ray spectra with symmetric bell-shaped peak forms are considered. In many cases the peak form is a symmetric bell shape; in particular, the Gaussian is used most often, for many physical reasons. The problem is how to evaluate the parameters of such peaks, i.e. their positions, amplitudes and half-widths, both for single peaks and for overlapping peaks. Using wavelet features, with the Marr wavelet (Mexican hat) as a correlation kernel, it is possible to estimate the optimal wavelet parameters and to locate peaks in the spectrum. Comparison of the proposed method with others shows the better quality of the wavelet transform method.
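
    The correlation idea can be illustrated with a small sketch (not the authors' code): a zero-mean Mexican hat kernel correlated with a synthetic spectrum suppresses the smooth background, so the response peaks at the line position. The spectrum parameters below are invented for the example:

```python
import numpy as np

def mexican_hat(width, sigma):
    """Discrete Marr (Mexican hat) wavelet sampled on [-width, width]."""
    t = np.arange(-width, width + 1, dtype=float)
    return (1 - (t / sigma) ** 2) * np.exp(-t ** 2 / (2 * sigma ** 2))

# Synthetic spectrum: a Gaussian peak at channel 120 on a linear background
channels = np.arange(256, dtype=float)
spectrum = (50.0 + 0.1 * channels
            + 400.0 * np.exp(-((channels - 120.0) ** 2) / (2 * 4.0 ** 2)))

# Correlate with the Mexican hat (symmetric kernel, so convolution == correlation);
# the near-zero-mean kernel cancels the background, leaving the line response
response = np.convolve(spectrum, mexican_hat(20, 4.0), mode='same')
print(int(np.argmax(response)))  # -> 120
```

    Matching the kernel width `sigma` to the expected peak half-width is what the abstract refers to as estimating the optimal wavelet parameters.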

  13. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  14. Analyzer of neutron flux in real time

    International Nuclear Information System (INIS)

    Rojas S, A.S.; Carrillo M, R.A.; Balderas, E.G.

    1999-01-01

    Based on a study of real neutron flux signals from instability events that occurred at the Laguna Verde nuclear power plant, where the core oscillation phenomena of the reactor lie in the 0 to 2.5 Hz range, the possibility has emerged of developing surveillance and diagnostic equipment capable of analyzing the behavior of the core in this frequency range in real time. An important method for monitoring the stability of the reactor core is the power spectral density, which allows the frequencies and amplitudes contained in the signals to be determined. An instrument was implemented in LabVIEW graphical programming with a 16-channel data acquisition card, running in a Windows 95/98 environment. (Author)
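
    The core computation of such a monitor, the power spectral density estimate, can be sketched with a simple periodogram. The signal below is synthetic (a 1.2 Hz oscillation plus noise), not plant data:

```python
import numpy as np

# Synthetic "neutron flux" signal: a 1.2 Hz core oscillation plus noise,
# sampled at 10 Hz for 100 s (values are illustrative only)
fs, duration = 10.0, 100.0
t = np.arange(0, duration, 1 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)

# Power spectral density via the periodogram: |FFT|^2, one-sided
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * t.size)

dominant = freqs[np.argmax(psd[1:]) + 1]   # skip the DC bin
print(round(float(dominant), 1))  # -> 1.2
```

    A real instrument would average periodograms over successive windows (Welch's method) to reduce the variance of the estimate.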

  15. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-10-01

    The Nuclear Plant Analyzer (NPA) is being developed as the US Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  16. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  17. Nuclear plant analyzer development at INEL

    International Nuclear Information System (INIS)

    Laats, E.T.; Russell, K.D.; Stewart, H.D.

    1983-01-01

    The Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC) has sponsored development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes the status of the NPA project at the INEL after one year of development. When completed, the NPA will be an integrated network of analytical tools for performing reactor plant analyses. Development of the NPA in FY-1983 progressed along two parallel pathways, namely conceptual planning and software development. Regarding NPA planning, an extensive effort was conducted to define the functional requirements of the NPA, the conceptual design, and hardware needs. Regarding software development in FY-1983, all work was aimed toward demonstrating the basic concept and feasibility of the NPA. Nearly all software was developed and resides on the INEL twin Control Data Corporation 176 mainframe computers.

  18. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
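
    For a sense of what "exact distribution of the shortest path length" means, the brute-force baseline (complete enumeration, which the conditional-factoring algorithm is designed to beat) can be computed directly on a toy network. The network topology and edge-length probabilities below are invented for the example:

```python
from itertools import product
from collections import defaultdict
import heapq

def shortest_path_length(nodes, edges, lengths, s, t):
    """Dijkstra on one fixed realization of the edge lengths."""
    dist = {n: float('inf') for n in nodes}
    dist[s] = 0.0
    heap = [(0.0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for i, (a, b) in enumerate(edges):
            if a == u and d + lengths[i] < dist[b]:
                dist[b] = d + lengths[i]
                heapq.heappush(heap, (dist[b], b))
    return dist[t]

# Tiny directed network s -> a -> t and s -> t; each edge independently
# has length 1 or 2 with probability 1/2
nodes = ['s', 'a', 't']
edges = [('s', 'a'), ('a', 't'), ('s', 't')]
outcomes = [(1, 0.5), (2, 0.5)]

dist_of_length = defaultdict(float)
for combo in product(outcomes, repeat=len(edges)):
    lengths = [v for v, _ in combo]
    prob = 1.0
    for _, p in combo:
        prob *= p
    dist_of_length[shortest_path_length(nodes, edges, lengths, 's', 't')] += prob

print(sorted(dist_of_length.items()))  # -> [(1.0, 0.5), (2.0, 0.5)]
```

    Enumeration costs one shortest-path computation per joint realization, which grows exponentially in the number of stochastic edges; the factoring approach conditions on selected edges to decompose the network into smaller subnetworks instead.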

  19. RELAP5 nuclear plant analyzer capabilities

    International Nuclear Information System (INIS)

    Wagner, R.J.; Ransom, V.H.

    1982-01-01

    An interactive execution capability has been developed for the RELAP5 code which permits it to be used as a Nuclear Plant Analyzer. This capability has been demonstrated using a simplified primary and secondary loop model of a PWR. A variety of loss-of-feedwater accidents have been simulated using this model. The computer execution time on a CDC Cyber 176 is one half of the transient simulation time, so that the results can be displayed in real time. The results of the demonstration problems are displayed in digital form on a color schematic of the plant model using a Tektronix 4027 CRT terminal. The interactive feature allows the user to enter commands in much the same manner as a reactor operator.

  20. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach...

  1. Diffractive interference optical analyzer (DiOPTER)

    Science.gov (United States)

    Sasikumar, Harish; Prasad, Vishnu; Pal, Parama; Varma, Manoj M.

    2016-03-01

    This report demonstrates a method for high-resolution refractometric measurements using what we have termed a Diffractive Interference Optical Analyzer (DiOpter). The setup consists of a laser, a polarizer, a transparent diffraction grating and Si photodetectors. The sensor is based on the differential response of diffracted orders to bulk refractive index changes. In these setups, the differential read-out of the diffracted orders suppresses signal drifts and enables time-resolved determination of refractive index changes in the sample cell. A remarkable feature of this device is that under appropriate conditions, the measurement sensitivity can be enhanced by more than two orders of magnitude due to interference between multiply reflected diffracted orders. A noise-equivalent limit of detection (LoD) of 6×10⁻⁷ RIU was achieved in glass. This work focuses on devices with an integrated sample well, made on low-cost PDMS. As the detection methodology is experimentally straightforward, it can be used across a wide array of applications, ranging from detecting changes in surface adsorbates via binding reactions to estimating refractive index (and hence concentration) variations in bulk samples. An exciting prospect of this technique is the potential integration of this device with smartphones using a simple interface based on a transmission-mode configuration. In a transmission configuration, we were able to achieve an LoD of 4×10⁻⁴ RIU, which is sufficient to explore several applications in food quality testing and related fields. We envision the future of this platform as a personal handheld optical analyzer for applications ranging from environmental sensing to healthcare and quality testing of food products.
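
    The differential read-out principle, suppressing common-mode drift by combining two detector signals, can be illustrated with synthetic data. The signal model below is a hypothetical simplification, not the DiOpter optics:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical detector currents from two diffracted orders: both share a
# slow multiplicative common-mode drift (laser power, temperature), while
# the refractive-index change of interest moves them in opposite directions
drift = 1.0 + 0.05 * np.linspace(0, 1, n)
signal = 0.01 * (np.arange(n) >= 250)        # step when the analyte arrives
i1 = drift * (1.0 + signal) + 1e-4 * rng.standard_normal(n)
i2 = drift * (1.0 - signal) + 1e-4 * rng.standard_normal(n)

# Differential read-out: the normalized difference cancels the common
# multiplicative drift but keeps the antisymmetric signal
differential = (i1 - i2) / (i1 + i2)
print(round(float(differential[250:].mean()), 3))  # -> 0.01
```

    Note that the drift factor cancels exactly in the normalized difference, which is why the differential scheme tolerates source-intensity fluctuations that would swamp a single-detector measurement.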

  2. Relativistic effects in the calibration of electrostatic electron analyzers. I. Toroidal analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Keski Rahkonen, O [Helsinki University of Technology, Espoo (Finland). Laboratory of Physics; Krause, M O [Oak Ridge National Lab., Tenn. (USA)

    1978-02-01

    Relativistic correction terms up to the second order are derived for the kinetic energy of an electron travelling along the circular central trajectory of a toroidal analyzer. Furthermore, a practical energy calibration equation for the spherical sector plate analyzer is written for the variable-plate-voltage recording mode. Accurate measurements with a spherical analyzer, performed using kinetic energies from 600 to 2100 eV, are in good agreement with this theory, showing that our approximation (neglect of fringing fields, and of source and detector geometry) is realistic enough for actual calibration purposes.
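
    For orientation, the exact relativistic energy-momentum relation suggests the form such second-order correction terms take. Assuming the electrostatic analyzer selects on pv/2 (a common idealization, not necessarily the authors' exact derivation), with T the kinetic energy and m the electron rest mass:

```latex
% From p^2 c^2 = T^2 + 2 T m c^2 and v = p c^2 / (T + m c^2):
\[
  pv \;=\; \frac{T\,\bigl(T + 2mc^{2}\bigr)}{T + mc^{2}},
  \qquad
  \frac{pv}{2} \;=\; T\left(1 - \frac{T}{2mc^{2}} + \frac{T^{2}}{2m^{2}c^{4}} - \cdots\right)
\]
```

    At T = 2 keV the first-order term T/(2mc²) is about 0.2%, i.e. a shift of roughly 4 eV, which is why relativistic corrections already matter in the 600-2100 eV range studied.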

  3. Evaluation of containment hydrogen and oxygen analyzers

    International Nuclear Information System (INIS)

    Booth, H.R.; Stanley, L.

    1993-02-01

    This report contains information concerning the operation and calibration of detectors utilized at US nuclear power plants for determining the concentration of hydrogen and oxygen within the containment structure. A study was prompted by reports that several plants had experienced problems in operating, calibrating, and maintaining the detectors supplied by various vendors. A survey of all nuclear power plants was conducted to identify the specific problems. Discussions were held with key vendors concerning these problems. The major area of interest centered around problems associated with calibration of the detectors. Many variations from plant to plant concerning calibration accuracies, calibration time periods, and frequencies were identified. Another area of prime consideration involved variations in maintenance of the equipment. Some plants devoted considerable effort to in-house maintenance of equipment while others relied heavily on the vendor for such maintenance. A workshop was conducted with key utility and vendor personnel in attendance to discuss the findings of the survey. It was resolved that a much improved, coordinated effort between the vendors and utilities would be initiated as a means to resolve existing problems.

  4. Plant-bacterium interactions analyzed by proteomics

    Directory of Open Access Journals (Sweden)

    Amber eAfroz

    2013-02-01

    Full Text Available The evolution of the plant immune response has resulted in a highly effective defense system that is able to resist potential attack by microbial pathogens. The primary immune response is referred to as pathogen-associated molecular pattern triggered immunity and has evolved to recognize common features of microbial pathogens. In response to the delivery of pathogen effector proteins, plants acquired R proteins to fight against pathogen attack. The R-dependent defense response is important; understanding the biochemical and cellular mechanisms underlying these interactions will enable molecular and transgenic approaches for crops with increased biotic resistance. Proteomic analyses are particularly useful for understanding the mechanisms of the host plant's defense against pathogen attack. Recent advances in the field of proteome analyses have initiated a new research area, i.e., the analysis of more complex microbial communities and their interaction with plants. Such areas hold great potential to elucidate not only the interactions between bacteria and their host plants, but also bacteria-bacteria interactions among different bacterial taxa: symbiotic, pathogenic, and commensal bacteria. During biotic stress, plant hormonal signaling pathways prioritize defense over other cellular functions. Some plant pathogens take advantage of the hormone-dependent regulatory system by mimicking hormones that interfere with host immune responses to promote virulence. In this review, we discuss the crosstalk that plays an important role in the response to pathogens with different infection strategies, as revealed by proteomic approaches.

  5. Implication of the dominant design in electronic initiation systems in the South African mining industry

    CSIR Research Space (South Africa)

    Smit, FC

    1998-11-01

    Full Text Available This article analyzes an emerging technological innovation, namely electronic initiation systems for mining explosives in South Africa. The concept of electronic initiation presents itself as a challenge to traditional initiation systems...

  6. International EUREKA: Initialization Segment

    International Nuclear Information System (INIS)

    1982-02-01

    The Initialization Segment creates the starting description of the uranium market. The starting description includes the international boundaries of trade, the geologic provinces, resources, reserves, production, uranium demand forecasts, and existing market transactions. The Initialization Segment is designed to accept information of various degrees of detail, depending on what is known about each region. It must transform this information into the specific data structure required by the Market Segment of the model, filling in gaps in the information through a predetermined sequence of defaults and built-in assumptions. A principal function of the Initialization Segment is to create diagnostic messages indicating any inconsistencies in the data and explaining which assumptions were used to organize the data base. This permits the user to manipulate the data base until the user is satisfied that all the assumptions used are reasonable and that any inconsistencies are resolved in a satisfactory manner.

  7. Initiating events frequency determination

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2004-01-01

    The paper describes work performed for the Nuclear Power Station (NPS). The work relates to the periodic initiating event frequency update for the Probabilistic Safety Assessment (PSA). Data for all relevant NPS initiating events (IE) were reviewed. The main focus was on events occurring during the most recent operating history (i.e., the last four years). The final IE frequencies were estimated by incorporating both NPS experience and nuclear industry experience. Each event was categorized according to the NPS individual plant examination (IPE) initiating event grouping approach. For the majority of the IE groups, few or no events have occurred at the NPS. For those IE groups with few or no NPS events, the final estimate was made by means of a Bayesian update with general nuclear industry values. Exceptions are rare loss-of-coolant-accident (LOCA) events, where evaluation of engineering aspects is used in order to determine frequency. (author)

  8. Plasma diagnostics with a retarding potential analyzer

    International Nuclear Information System (INIS)

    Jack, T.M.

    1996-01-01

    The plasma rocket is located at NASA Johnson Space Center. To produce a thrust in space, an inert gas is ionized into a plasma and heated in the linear section of a tokamak fusion device. The magnetic field used to contain the plasma has a magnitude of 2--10 kGauss. The plasma plume has a variable thrust and specific impulse. A high temperature retarding potential analyzer (RPA) is being developed to characterize the plasma in the plume and at the edge of the magnetically contained plasma. The RPA measures the energy and density of ions or electrons entering into its solid angle of collection. An oscilloscope displays the ion flux versus the collected current. All measurements are made relative to the facility ground. Testing of this device involves the determination of its output parameters, sensitivity, and responses to a wide range of energies and densities. Each grid will be tested individually by changing only its voltage and observing the output from the RPA. To verify that the RPA is providing proper output, it is compared to the output from a Langmuir or Faraday probe

  9. Analyzing the development of Indonesia shrimp industry

    Science.gov (United States)

    Wati, L. A.

    2018-04-01

    This research aimed to analyze the development of the shrimp industry in Indonesia. Porter's Diamond theory was used for the analysis. Porter's Diamond is a framework for industry analysis and business strategy development. It explains competitive advantage through four main determinants, namely (1) factor conditions, (2) demand conditions, (3) related and supporting industries, and (4) firm strategy, structure and rivalry, coupled with two supporting components (government regulation and the factor of chance). The development of the Indonesian shrimp industry is fairly good, as explained by the Porter Diamond analysis. The results of this research show that the two supporting components (government regulation and the factor of chance) have a positive effect; related and supporting industries have a negative effect; firm strategy and structure have a negative effect; rivalry has a positive effect; and factor conditions have a positive effect (except for science and technology resources).

  10. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    Full Text Available As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes, including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. We are motivated by this rising challenge and have built a visualization tool for exploring generic spatiotemporal data, i.e., records containing time, location information, and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies by demonstrating and evaluating it over publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
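
    A minimal version of the "anomaly grid" idea scores the latest time slice of each spatial cell against that cell's own history. The Poisson count data below are hypothetical, invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical spatiotemporal counts: a 10x10 spatial grid observed over
# 50 time steps, with an anomaly injected at cell (3, 7) in the last step
counts = rng.poisson(5.0, size=(50, 10, 10)).astype(float)
counts[-1, 3, 7] += 40

# Anomaly grid: z-score of the latest time step against each cell's history
mean = counts[:-1].mean(axis=0)
std = counts[:-1].std(axis=0) + 1e-9      # guard against zero variance
z = (counts[-1] - mean) / std

hotspot = tuple(int(i) for i in np.unravel_index(np.argmax(z), z.shape))
print(hotspot)  # -> (3, 7)
```

    The resulting z grid is exactly the kind of per-cell score that can be rendered as colored anomaly grids or anomaly bars on a map.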

  11. Alternative approach to analyzing occupational mortality data

    International Nuclear Information System (INIS)

    Gilbert, E.S.; Buchanan, J.A.

    1984-01-01

    It is widely recognized that analyzing occupational mortality by calculating standardized mortality ratios based on death rates from the general population is subject to a number of limitations. An alternative approach described in this report takes advantage of the fact that comparisons of mortality by subgroup and assessments of trends in mortality are often of equal or greater interest than overall assessments, and that such comparisons do not require an external control. A computer program, MOX (Mortality and Occupational Exposure), is available for performing the needed calculations for several diseases. MOX was written to assess the effect of radiation exposure on Hanford nuclear workers. For this application, analyses have been based on cumulative exposure computed (by MOX) from annual records of radiation exposure obtained from personal dosimeter readings. The program provides tests for differences and trends among subcategories defined by variables such as length of employment, job category, or exposure measurements, and also controls for age, calendar year, and several other potentially confounding variables. 29 references, 2 tables

  12. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units: 15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine whether changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.
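
    A steady-state heat-loss calculation underlies all of the sizing methods compared above; the sketch below shows the basic UA·ΔT form. It is illustrative only: it is not MJ8, and the 0.018 BTU/ft³·°F air heat-capacity constant and the idea of crediting internal gains at design conditions are stated assumptions.

```python
def design_heating_load(ua_envelope, ach, volume_ft3, t_in=70.0, t_out=-2.0,
                        internal_gains=0.0):
    """Simplified steady-state design heating load in BTU/h.

    ua_envelope: sum of U*A over envelope surfaces (BTU/h-F).
    ach: design air changes per hour; volume_ft3: conditioned volume (ft^3).
    internal_gains: steady internal gains (BTU/h); crediting them is the kind
    of adjustment the study suggests matters in superinsulated homes.
    """
    dt = t_in - t_out
    q_envelope = ua_envelope * dt
    q_infiltration = 0.018 * ach * volume_ft3 * dt  # sensible air-change loss
    return max(q_envelope + q_infiltration - internal_gains, 0.0)
```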

  13. A framework to analyze emissions implications of ...

    Science.gov (United States)

    Future year emissions depend highly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes while considering existing regulations and future regulatory uncertainty, and to evaluate the resulting emissions growth patterns. The framework integrates EPA’s energy systems model with an economic input-output (I/O) life cycle assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies to represent the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the U.S. economy is constructed to link up with MARKAL. The I/O model enables the user to change input requirements (e.g., energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end users a mechanism for modeling change in the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
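
    The I/O linkage described above rests on the standard Leontief relation: with technical-coefficient matrix A and final demand d, total output x solves x = Ax + d, i.e. x = (I - A)^-1 d. A dependency-free 2x2 sketch (illustrative, not the EPA implementation):

```python
def leontief_output(a, d):
    """Solve (I - A) x = d for a 2x2 technical-coefficient matrix A by
    Cramer's rule; a[i][j] is input from sector i per unit output of sector j."""
    m = [[1 - a[0][0], -a[0][1]],
         [-a[1][0], 1 - a[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    x0 = (m[1][1] * d[0] - m[0][1] * d[1]) / det
    x1 = (m[0][0] * d[1] - m[1][0] * d[0]) / det
    return [x0, x1]
```

Raising an energy-intensity coefficient in A, as the framework's users can, propagates through (I - A)^-1 to every sector's output.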

  14. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

    Full Text Available The article under the title "ANALYSIS OF THE PENSION SYSTEM OF THE USSR" deals with numerous aspects of the development of the pension system of the former USSR. Since the improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing the country's own historical experience is essential in order to create a sound and efficient pension system in Russia. The study presented in the article aims to execute an in-depth analysis of legislation on the Soviet pension system with a view to recreating the architecture of the pension system of the USSR. In addition, the study draws on the official statistics for the period in question to make a qualified and fundamental conclusion on the efficiency of the Soviet pension system. The evolution of the pension system, traced through statistical data, evidently demonstrates the efficiency of the Soviet pension system. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the current pension system of the Russian Federation.

  15. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Full Text Available Information technologies have produced new ways of distributing and consuming music, mainly among youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, are perceived. Specifically, we aimed to study youth's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our results, after content analysis, were poor due to low spontaneous recall. Therefore, we repeated the study in a more focused way. The information gathered this time allowed us not only to better understand how all these organizations are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  16. Analyzing the Existing Undergraduate Engineering Leadership Skills

    Directory of Open Access Journals (Sweden)

    Hamed M. Almalki

    2016-12-01

    Full Text Available Purpose: To study and analyze undergraduate engineering students' leadership skills in order to discover their potential leadership strengths and weaknesses. This study unveils potential ways to enhance how we teach engineering leadership, and offers insights that might assist engineering programs in improving curricula for the purpose of better preparing engineers to meet industry demands. Methodology and Findings: 441 undergraduate engineering students in two undergraduate engineering programs were surveyed to assess their leadership skills. The results in both programs reveal that undergraduate engineering students lag behind in visionary leadership skills compared to the directing, including, and cultivating leadership styles. Recommendation: A practical framework is proposed to enhance the lacking leadership skills by utilizing the Matrix of Change (MOC) and the Balanced Scorecard (BSC) to capture the best leadership scenarios and to design a virtual simulation environment targeted at the lacking skills, in this case visionary leadership. The virtual simulation will then be used to provide experiential learning by replacing human beings with avatars that can be managed or dramatized by real people, enabling the creation of live, practical, measurable, and customizable leadership development programs.

  17. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of location and type for protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.
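
    A distance-based residue interaction test in the spirit of PIADA can be sketched as follows. The abstract does not give the algorithm's details, so the any-atom-pair rule and the 4.5 Å cutoff (a common convention) are assumptions, not PSAIA's actual parameters.

```python
from math import dist  # Python 3.8+

def residues_interact(atoms_a, atoms_b, cutoff=4.5):
    """atoms_a, atoms_b: lists of (x, y, z) atom coordinates for two residues.
    The residues form an interaction pair if any inter-residue atom pair
    lies within `cutoff` Angstroms."""
    return any(dist(p, q) <= cutoff for p in atoms_a for q in atoms_b)
```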

  18. A Methodology to Analyze Photovoltaic Tracker Uptime

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Matthew T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Dan [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-17

    A metric is developed to analyze the daily performance of single-axis photovoltaic (PV) trackers. The metric relies on comparing correlations between the daily time series of the PV power output and an array of simulated plane-of-array irradiances for the given day. Mathematical thresholds and a logic sequence are presented, so the daily tracking metric can be applied in an automated fashion on large-scale PV systems. The results of applying the metric are visually examined against the time series of the power output data for a large number of days and for various systems. The visual inspection results suggest that overall, the algorithm is accurate in identifying stuck or functioning trackers on clear-sky days. Visual inspection also shows that there are days not classified by the metric where the power output data may be sufficient to identify a stuck tracker. Based on the daily tracking metric, uptime results are calculated for 83 different inverters at 34 PV sites. The mean tracker uptime is calculated at 99% based on 2 different calculation methods. The daily tracking metric clearly has limitations, but as there are no existing metrics in the literature, it provides a valuable tool for flagging stuck trackers.
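
    The daily metric can be pictured as follows: correlate the day's power output with simulated plane-of-array irradiance for a properly tracking surface and for a fixed (stuck) surface, then label the day by the stronger correlation. The function names and the 0.9 threshold are illustrative assumptions, not the published thresholds.

```python
def _pearson(xs, ys):
    # Pearson correlation coefficient for equal-length, non-constant series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def classify_tracker_day(power, poa_tracking, poa_stuck, min_r=0.9):
    """Label a day by whichever simulated irradiance better explains power."""
    r_track = _pearson(power, poa_tracking)
    r_stuck = _pearson(power, poa_stuck)
    if max(r_track, r_stuck) < min_r:
        return "unclassified"  # e.g. a heavily clouded day
    return "tracking" if r_track >= r_stuck else "stuck"
```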

  19. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

    The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes, such as RELAP5 and TRAC-BWR, with well-developed computer graphics programs and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experimental data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper

  20. 32 CFR 989.4 - Initial considerations.

    Science.gov (United States)

    2010-07-01

    ... alternatives analyzed in the environmental documents. (f) Pursue the objective of furthering foreign policy and... ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.4 Initial considerations. Air Force personnel will: (a... CATEX from environmental impact analysis (appendix B). (c) Make environmental documents, comments, and...

  1. RCRA facility stabilization initiative

    International Nuclear Information System (INIS)

    1995-02-01

    The RCRA Facility Stabilization Initiative was developed as a means of implementing the Corrective Action Program's management goals recommended by the RIS for stabilizing actual or imminent releases from solid waste management units that threaten human health and the environment. The overall goal of stabilization is, as situations warrant, to control or abate threats to human health and/or the environment from releases at RCRA facilities, and/or to prevent or minimize the further spread of contamination while long-term remedies are pursued. The Stabilization Initiative is a management philosophy and should not be confused with stabilization technologies

  2. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. 
We have implemented the
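
    Of the diagnostics listed, the time-lagged correlation map is the easiest to sketch; the version below is a minimal single-pair illustration (CMDA's actual implementation operates on gridded fields, and the function name is assumed).

```python
def lagged_correlation(x, y, lag):
    """Pearson correlation of x(t) with y(t + lag) for equal-length series."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5
```

Sweeping `lag` over a range and plotting the result per grid cell yields the correlation map.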

  3. Analyzing wildfire exposure on Sardinia, Italy

    Science.gov (United States)

    Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella

    2014-05-01

    We used simulation modeling based on the minimum travel time (MTT) algorithm to analyze wildfire exposure of key ecological, social, and economic features on Sardinia, Italy. Sardinia is the second largest island of the Mediterranean Basin, and in the last fifty years it has experienced large and dramatic wildfires, which caused losses and threatened urban interfaces, forests and natural areas, and agricultural production. Historical fire and environmental data for the period 1995-2009 were used as input to estimate fine-scale burn probability, conditional flame length, and potential fire size in the study area. For this purpose, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distribution of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92×10⁻³, with a mean value of 6.48×10⁻⁵. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in modeled outputs revealed a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas, and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat terrain were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to inform wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean Basin.
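
    The burn-probability estimates quoted above have a simple Monte Carlo form: a cell's burn probability is the fraction of the simulated fires in which it burned. A sketch of that aggregation step (the MTT fire-spread model itself is not reproduced here):

```python
from collections import Counter

def burn_probability(burned_cell_sets, n_fires):
    """burned_cell_sets: one set of burned cell ids per simulated fire.
    Returns {cell: fraction of fires in which the cell burned}."""
    counts = Counter()
    for cells in burned_cell_sets:
        counts.update(cells)
    return {cell: c / n_fires for cell, c in counts.items()}
```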

  4. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers whose interests are tangled and vary widely across academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc.) and non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also has a different academic stance in their preferences regarding academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurturing young researchers and to improving the standard of research and education, as well as to boosting academia-industry collaboration. In particular, as open innovation is increasingly in need of the involvement of university researchers, to establish a successful approach to enticing researchers into enterprises' research, companies must comprehend the behavior of university researchers, who have multiple complex motivations. The paper explores academic researchers' behaviors by optimizing their utility functions, i.e., the satisfaction obtained from their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort to do the research), and Contribution (finding meaning in the research). Most of the previous research utilized empirical methods to study researcher motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researcher behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researcher behavior. Keywords: Academia-Industry, researcher behavior, Ulrich model's 3C.

  5. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
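
    The likelihood-ratio rule at the heart of the policy described above can be sketched with independent Gaussian score models. This is a deliberate simplification (the paper fits a richer joint model over all 12 similarity scores), and every parameter below is illustrative.

```python
from math import exp, pi, sqrt

def gauss_pdf(x, mu, sigma):
    # Univariate normal density.
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

def is_genuine(scores, genuine_params, imposter_params, threshold=1.0):
    """Accept when prod_i P(s_i | genuine) / P(s_i | imposter) > threshold.
    genuine_params / imposter_params: per-score (mu, sigma) pairs."""
    lr = 1.0
    for s, (mg, sg), (mi, si) in zip(scores, genuine_params, imposter_params):
        lr *= gauss_pdf(s, mg, sg) / gauss_pdf(s, mi, si)
    return lr > threshold
```

Raising `threshold` trades a lower false accept rate for a higher false reject rate, which is the FRR/FAR trade-off the paper optimizes.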

  6. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods do not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules represented as graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
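
    The simplest instance of the label-aware entropic measures discussed above is the Shannon entropy of a graph's vertex-label distribution; the paper's actual indices are more elaborate, so treat this as an illustrative baseline.

```python
from collections import Counter
from math import log2

def label_entropy(vertex_labels):
    """Shannon entropy (bits) of the vertex-label distribution: 0 when all
    vertices share one label, larger when labels are more evenly mixed."""
    counts = Counter(vertex_labels)
    n = len(vertex_labels)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

Two graphs with identical topology but different atom labels can thus receive different index values, which is exactly the discriminating power the unlabeled measures lack.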

  7. The SEED Initiative

    Science.gov (United States)

    Teich, Carolyn R.

    2011-01-01

    Committed to fulfilling the promise of the green economy, the American Association of Community Colleges (AACC) launched the Sustainability Education and Economic Development (SEED) initiative (www.theseedcenter.org) in October 2010. The project advances sustainability and clean energy workforce development practices at community colleges by…

  8. Major New Initiatives

    Indian Academy of Sciences (India)

    Major New Initiatives. Multi-party multi-rate video conferencing OOPS. Live Lecture OOPS. Rural ATM Machine Vortex. Finger print detection HP-IITM. Medical Diagnostic kit NeuroSynaptic. LCD projection system TeNeT. Web Terminal MeTeL Midas. Entertainment ...

  9. Monolithic exploding foil initiator

    Science.gov (United States)

    Welle, Eric J; Vianco, Paul T; Headley, Paul S; Jarrell, Jason A; Garrity, J. Emmett; Shelton, Keegan P; Marley, Stephen K

    2012-10-23

    A monolithic exploding foil initiator (EFI), or slapper detonator, and the method for making the monolithic EFI, wherein the exploding bridge and the dielectric from which the flyer will be generated are integrated directly onto the header. In some embodiments, the barrel is also integrated directly onto the header.

  10. Thromboelastography platelet mapping in healthy dogs using 1 analyzer versus 2 analyzers.

    Science.gov (United States)

    Blois, Shauna L; Banerjee, Amrita; Wood, R Darren; Park, Fiona M

    2013-07-01

    The objective of this study was to describe the results of thromboelastography platelet mapping (TEG-PM) carried out using 2 techniques in 20 healthy dogs. Maximum amplitudes (MA) generated by thrombin (MAthrombin), fibrin (MAfibrin), adenosine diphosphate (ADP) receptor activity (MAADP), and thromboxane A2 (TxA2) receptor activity (stimulated by arachidonic acid, MAAA) were recorded. Thromboelastography platelet mapping was carried out according to the manufacturer's guidelines (2-analyzer technique) and using a variation of this method employing only 1 analyzer (1-analyzer technique) on 2 separate blood samples obtained from each dog. Mean [± standard deviation (SD)] MA values for the 1-analyzer/2-analyzer techniques were: MAthrombin = 51.9 mm (± 7.1)/52.5 mm (± 8.0); MAfibrin = 20.7 mm (± 21.8)/23.0 mm (± 26.1); MAADP = 44.5 mm (± 15.6)/45.6 mm (± 17.0); and MAAA = 45.7 mm (± 11.6)/45.0 mm (± 15.4). Mean (± SD) percentage aggregation due to ADP receptor activity was 70.4% (± 32.8)/67.6% (± 33.7). Mean percentage aggregation due to TxA2 receptor activity was 77.3% (± 31.6)/78.1% (± 50.2). Results of TEG-PM were not significantly different for the 1-analyzer and 2-analyzer methods. High correlation was found between the 2 methods for MAfibrin [concordance correlation coefficient (r) = 0.930]; moderate correlation was found for MAthrombin (r = 0.70) and MAADP (r = 0.57); correlation between the 2 methods for MAAA was lower (r = 0.32). Thromboelastography platelet mapping (TEG-PM) should be further investigated to determine if it is a suitable method for measuring platelet dysfunction in dogs with thrombopathy.
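
    The concordance correlation coefficient reported above can be computed as Lin's CCC, which penalizes both scatter and systematic offset between the two methods; the population (biased) variance form is shown.

```python
def concordance_ccc(xs, ys):
    """Lin's concordance correlation coefficient between paired measurements."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

Unlike Pearson's r, a constant offset between the 1-analyzer and 2-analyzer readings lowers the CCC even when the two track each other perfectly.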

  11. Next generation initiation techniques

    Science.gov (United States)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The
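
    Of the current-generation techniques above, Newtonian relaxation is the simplest to sketch: a nudging term proportional to the observation-minus-model difference is added to the model tendency. The toy decay dynamics and parameter names below are illustrative assumptions, not an operational scheme.

```python
def nudge(state, obs, dt, tau, steps, decay=0.0):
    """Relax a scalar model state toward an observation with timescale tau."""
    for _ in range(steps):
        tendency = -decay * state          # toy model dynamics
        tendency += (obs - state) / tau    # Newtonian-relaxation (nudging) term
        state += tendency * dt
    return state
```

A small `tau` forces the model hard toward the observations; a large `tau` lets the model dynamics dominate, which is the tuning trade-off in operational nudging.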

  12. Historical civilian nuclear accident based Nuclear Reactor Condition Analyzer

    Science.gov (United States)

    McCoy, Kaylyn Marie

    There are significant challenges to successfully monitoring multiple processes within a nuclear reactor facility. Evidence for this observation can be seen in the historical civilian nuclear incidents that have occurred with similar initiating conditions and sequences of events. Because the nuclear industry currently lacks monitoring of internal sensors across multiple processes for patterns of failure, this study has developed a program directed at that charge: an innovation that monitors these systems simultaneously. The inclusion of digital sensor technology within the nuclear industry has appreciably increased computer systems' capabilities to manipulate sensor signals, making it possible to meet these monitoring challenges. One such manipulation of signal data has been explored in this study. The Nuclear Reactor Condition Analyzer (NRCA) program developed for this research, with the assistance of the Nuclear Regulatory Commission's Graduate Fellowship, utilizes one-norm distance and kernel weighting equations to normalize all nuclear reactor parameters under the program's analysis. This normalization allows the program to set more consistent parameter value thresholds for a more simplified approach to analyzing the condition of the nuclear reactor under its scrutiny. The product of this research provides a means for the nuclear industry to implement a safety and monitoring program that can oversee the system parameters of a nuclear power reactor facility, such as a nuclear power plant.
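
    The one-norm distance and kernel weighting mentioned above can be sketched as follows; the abstract does not give the NRCA's equations, so the Gaussian kernel, the reference-state formulation, and all names are assumptions made for illustration.

```python
from math import exp

def one_norm_distance(readings, reference):
    """Sum of absolute deviations between current readings and a reference
    (nominal) state vector."""
    return sum(abs(r - ref) for r, ref in zip(readings, reference))

def kernel_score(readings, reference, bandwidth=1.0):
    """Map the one-norm distance to a (0, 1] similarity with a Gaussian
    kernel; values near 1 mean the plant state is close to the reference,
    which supports a single consistent alarm threshold across parameters."""
    d = one_norm_distance(readings, reference)
    return exp(-(d / bandwidth) ** 2)
```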

  13. Development of a process analyzer for trace uranium

    International Nuclear Information System (INIS)

    Hiller, J.M.

    1990-01-01

    A process analyzer, based on time-resolved laser-induced luminescence, is being developed for the Department of Energy's Oak Ridge Y-12 Plant for the ultra-trace determination of uranium. The present instrument has a detection limit of 1 μg/L; the final instrument will have a detection limit near 1 ng/L for continuous environmental monitoring. Time-resolved luminescence decay is used to enhance sensitivity, reduce interferences, and eliminate the need for standard addition. The basic analyzer sequence is: a pulse generator triggers the laser; the laser beam strikes a photodiode, which initiates data acquisition and synchronizes the timing; nearly simultaneously, laser light strikes the sample; intensity data are collected under control of the gated photon counter; and the cycle repeats as necessary. Typically, data are collected in 10 μs intervals over 700 μs (several luminescence half-lives). The final instrument will also collect and prepare samples, calibrate itself, reduce the raw data, and transmit reduced data to the control station(s).
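As an illustration of the gated time-resolved scheme above, the sketch below fits a single-exponential decay to photon counts collected in 10 μs gates spanning 700 μs. The amplitude and lifetime are invented for the example; the instrument's actual decay model and fitting procedure are not given in the abstract.

```python
import numpy as np

# Gate centres: 10 us intervals spanning 700 us (as stated in the abstract)
t = np.arange(5.0, 700.0, 10.0)            # microseconds

# Synthetic luminescence decay: amplitude and lifetime are invented values
true_amp, true_tau = 5000.0, 200.0         # counts, microseconds
counts = true_amp * np.exp(-t / true_tau)

# Linear least-squares fit on log(counts): log N = log A - t / tau
slope, intercept = np.polyfit(t, np.log(counts), 1)
fitted_tau = -1.0 / slope                  # recovered lifetime (us)
fitted_amp = np.exp(intercept)             # recovered amplitude (counts)
```

The recovered lifetime is what distinguishes the uranium luminescence from faster-decaying interferents, which is how time resolution reduces interference.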

  14. 3002 Humidified Tandem Differential Mobility Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Uin, Janek [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Brechtel Manufacturing Inc. (BMI) Humidified Tandem Differential Mobility Analyzer (HT-DMA Model 3002) (Brechtel and Kreidenweis 2000a,b, Henning et al. 2005, Xerxes et al. 2014) measures how aerosol particles of different initial dry sizes grow or shrink when exposed to changing relative humidity (RH) conditions. It uses two different mobility analyzers (DMA) and a humidification system to make the measurements. One DMA selects a narrow size range of dry aerosol particles, which are exposed to varying RH conditions in the humidification system. The second (humidified) DMA scans the particle size distribution output from the humidification system. Scanning a wide range of particle sizes enables the second DMA to measure changes in size or growth factor (growth factor = humidified size/dry size), due to water uptake by the particles. A Condensation Particle Counter (CPC) downstream of the second DMA counts particles as a function of selected size in order to obtain the number size distribution of particles exposed to different RH conditions.
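The growth factor defined above is a simple ratio of the humidified to the dry mobility diameter; a minimal sketch with invented particle sizes:

```python
def growth_factor(humidified_nm, dry_nm):
    """Hygroscopic growth factor = humidified size / dry size."""
    return humidified_nm / dry_nm

# Hypothetical scan: a 100 nm dry particle swelling under high RH
gf = growth_factor(humidified_nm=160.0, dry_nm=100.0)
```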

  15. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  16. analyzers in overweight/obese renal patients

    Directory of Open Access Journals (Sweden)

    Mariusz Kusztal

    2015-05-01

    Full Text Available Bioelectrical impedance analysis (BIA) is an affordable, non-invasive and fast alternative method to assess body composition. The purpose of this study was to compare two different tetrapolar BIA devices for estimating body fluid volumes and body cell mass (BCM) in a clinical setting among patients with kidney failure. All double measurements were performed by multi-frequency (MF) and single-frequency (SF) BIA analyzers: a Body Composition Monitor (Fresenius Medical Care, Germany) and BIA-101 (Akern, Italy), respectively. All procedures were conducted according to the manufacturers' instructions (dedicated electrodes, measurement sites, positions, etc.). Total body water (TBW), extracellular water (ECW), intracellular water (ICW) and BCM were compared. The study included 39 chronic kidney disease patients (stage III-V) with a mean age of 45.8 ± 8 years (21 men and 18 women) who had a wide range of BMI [17-34 kg/m2 (mean 26.6 ± 5)]. A comparison of results from patients with BMI <25 vs ≥25 revealed a significant discrepancy in measurements between the two BIA devices. Namely, in the group with BMI <25 (n=16), acceptable correlations were obtained in TBW (r 0.99; p<0.01), ICW (0.92; p<0.01), BCM (0.68; p<0.01), and ECW (0.96; p<0.05), but those with BMI ≥25 (n=23) showed a discrepancy (lower correlations) in TBW (r 0.82; p<0.05), ICW (0.78; p<0.05), BCM (0.52; p<0.05), and ECW (0.76; p<0.01). Since estimates of TBW, ICW and BCM by the present BIA devices do not differ in patients with BMI <25, they might be interchangeable. This does not hold true for overweight/obese renal patients.
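The device comparison above rests on Pearson correlations between paired estimates. A minimal sketch of that calculation, with invented paired TBW values (not the study's data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.corrcoef(x, y)[0, 1]

# Hypothetical paired total-body-water estimates (litres) from the two devices
tbw_mf = np.array([38.1, 42.5, 35.0, 40.2, 44.8])   # multi-frequency monitor
tbw_sf = np.array([38.4, 42.1, 35.3, 40.6, 44.5])   # single-frequency BIA-101

r = pearson_r(tbw_mf, tbw_sf)
```

In the study this computation was done separately for the BMI <25 and BMI ≥25 subgroups, which is how the discrepancy in the overweight/obese group was exposed.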

  17. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same
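CMDA's approach of wrapping an existing science routine behind a web-service-style interface might look roughly like the sketch below. The function names, JSON fields, and seasonal-mean routine are invented for illustration; they are not CMDA's actual API.

```python
import json

def seasonal_mean(values, months, season=("12", "01", "02")):
    """Science core: mean of a variable over the chosen season (DJF here)."""
    picked = [v for v, m in zip(values, months) if m in season]
    return sum(picked) / len(picked)

def web_service(request_json):
    """Thin wrapper turning the science routine into a JSON-in/JSON-out
    handler, in the spirit of CMDA's Python wrapper interface
    (hypothetical field names, not the real CMDA request format)."""
    req = json.loads(request_json)
    result = seasonal_mean(req["values"], req["months"])
    return json.dumps({"status": "ok", "seasonal_mean": result})

reply = web_service(json.dumps({
    "values": [280.1, 281.0, 279.5, 285.2],   # e.g. monthly temperatures (K)
    "months": ["12", "01", "02", "07"],
}))
```

Keeping the science core free of any web-framework code is what makes it possible to expose the same routine through a browser interface or a cloud deployment without modification.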

  18. The Albuquerque Seismological Laboratory Data Quality Analyzer

    Science.gov (United States)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust the weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data are stored at the level of one sensor's channel per day. The database is populated by Seedscan, which reads and processes miniSEED data to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run as a scheduled task or on demand, computing the metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation.
As Seedscan is scheduled to run every night, data quality analysts are able to then use the
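Aggregating per-metric scores into a single adjustable-weighted station grade, as the DQA website does, can be sketched as follows; the metric names, scores, and weights here are hypothetical, and the DQA's actual grading formula is not given in the abstract.

```python
def station_grade(metrics, weights):
    """Aggregate per-metric scores (0-100) into one weighted station grade.

    Weights are user-adjustable, mirroring the DQA website's ability to
    re-weight a station's grade (hypothetical metric names and weights).
    """
    total_w = sum(weights[name] for name in metrics)
    return sum(metrics[name] * weights[name] for name in metrics) / total_w

metrics = {"availability": 99.2, "timing_quality": 95.0, "gap_count": 88.0}
weights = {"availability": 2.0, "timing_quality": 1.0, "gap_count": 1.0}
grade = station_grade(metrics, weights)
```

Because the weights are applied at display time rather than baked into the stored metric values, the same database rows can yield different grades for different analysts' priorities.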

  19. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    summarize the information, there is a mechanism of data grouping, which provides the number of entries and the maximum, minimum and average values for different groups of records. Results. This technology has been tested in monitoring the requirements for services of additional professional education and in defining the educational needs of teachers and executives of educational organizations of the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours. The survey was conducted during the month. Conclusion. The proposed technology allows information to be collected in relational form in a short time and then analyzed without the need for programming, with flexible assignment of the form's operating logic.
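The grouping mechanism described above (count, minimum, maximum and average per group of records) can be sketched as below; the field names and sample records are illustrative, not taken from the survey.

```python
from collections import defaultdict

def group_summary(records, key, field):
    """Per-group count, min, max and mean over a numeric field,
    as in the data-grouping mechanism described in the abstract
    (illustrative field names)."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec[field])
    return {
        g: {"count": len(v), "min": min(v), "max": max(v),
            "mean": sum(v) / len(v)}
        for g, v in groups.items()
    }

# Hypothetical survey responses grouped by municipality
responses = [
    {"municipality": "Irkutsk", "score": 4},
    {"municipality": "Irkutsk", "score": 2},
    {"municipality": "Bratsk", "score": 5},
]
summary = group_summary(responses, "municipality", "score")
```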

  20. Florida Hydrogen Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Block, David L

    2013-06-30

    The Florida Hydrogen Initiative (FHI) was a research, development and demonstration hydrogen and fuel cell program. The FHI program objectives were to develop Florida's hydrogen and fuel cell infrastructure and to assist DOE in its hydrogen and fuel cell activities. The FHI program funded 12 RD&D projects as follows: Hydrogen Refueling Infrastructure and Rental Car Strategies -- L. Lines, Rollins College. This project analyzes strategies for Florida's early stage adaptation of hydrogen-powered public transportation. In particular, the report investigates urban and statewide networks of refueling stations and the feasibility of establishing a hydrogen rental-car fleet based in Orlando. Methanol Fuel Cell Vehicle Charging Station at Florida Atlantic University -- M. Fuchs, EnerFuel, Inc. The project objectives were to design and demonstrate a 10 kWnet proton exchange membrane fuel cell stationary power plant operating on methanol, to achieve an electrical energy efficiency of 32% and to demonstrate a transient response time of less than 3 milliseconds. Assessment of Public Understanding of the Hydrogen Economy Through Science Center Exhibits -- J. Newman, Orlando Science Center. The project objective was to design and build an interactive Science Center exhibit called "H2Now: the Great Hydrogen Xchange". On-site Reformation of Diesel Fuel for Hydrogen Fueling Station Applications -- A. Raissi, Florida Solar Energy Center. This project developed an on-demand forecourt hydrogen production technology by catalytically converting high-sulfur hydrocarbon fuels to an essentially sulfur-free gas. The removal of sulfur from reformate is critical since most catalysts used for the steam reformation have limited sulfur tolerance. Chemochromic Hydrogen Leak Detectors for Safety Monitoring -- N. Mohajeri and N. Muradov, Florida Solar Energy Center. This project developed and demonstrated a cost-effective and highly selective chemochromic (visual) hydrogen leak detector for safety

  1. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. "Basically all organisms on our planet generate energy through the Krebs Cycle," explains Mike Flynn, research scientist at NASA's Ames Research Center.
This metabolic process breaks down sugars for energy

  2. Funding Initiatives | Women in Science | Initiatives | Indian Academy ...

    Indian Academy of Sciences (India)

    Home; Initiatives; Women in Science; Funding Initiatives ... The Fellowship Scheme for Women Scientists for societal programmes is an initiative of the ...

  3. Sustaining Participatory Design Initiatives

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Dindler, Christian

    2014-01-01

    While many participatory design (PD) projects succeed in establishing new organisational initiatives or creating technology that is attuned to the people affected, the issue of how such results are sustained after the project ends remains an important challenge. We explore the challenge of sustaining PD initiatives beyond the individual project and discuss implications for PD practice. First, based on current PD literature, we distinguish between four ideal typical forms of sustainability: maintaining, scaling, replicating and evolving. Second, we demonstrate from a case study how these various forms of sustainability may be pursued in PD practice and how they can become a resource in reflecting on PD activities. Finally, we discuss implications for PD practice, suggesting that a nuanced conception of sustainability and how it may relate to PD practice are useful resources for designers.

  4. Initial management of breastfeeding.

    Science.gov (United States)

    Sinusas, K; Gagliardi, A

    2001-09-15

    Breast milk is widely accepted as the ideal source of nutrition for infants. In order to ensure success in breastfeeding, it is important that it be initiated as early as possible during the neonatal period. This is facilitated by skin-to-skin contact between the mother and infant immediately following birth. When possible, the infant should be allowed to root and latch on spontaneously within the first hour of life. Many common nursery routines such as weighing the infant, administration of vitamin K and application of ocular antibiotics can be safely delayed until after the initial breastfeeding. Postpartum care practices that improve breastfeeding rates include rooming-in, anticipatory guidance about breastfeeding problems and the avoidance of formula supplementation and pacifiers.

  5. Self-initiated expatriates

    DEFF Research Database (Denmark)

    Selmer, Jan; Lauring, Jakob

    2014-01-01

    Purpose – As it has been suggested that adult third-culture kids may be more culturally adaptable than others, they have been labelled “the ideal” expatriates. In this article, we explore the adjustment of self-initiated expatriate academics in Hong Kong, comparing adult third-culture kids with adult mono-culture kids. Design/methodology/approach – We use survey results from 267 self-initiated expatriate academics in Hong Kong. Findings – Exploratory results show that adult third-culture kids had a higher extent of general adjustment. No significant results were found in relation to interaction adjustment and job adjustment. We also found that recent expatriate experiences generally had a positive association with the adjustment of adult mono-culture kids, but this association only existed in terms of general adjustment for adult third-culture kids. Originality/value – Once corroborated...

  6. INITIAL TRAINING OF RESEARCHERS

    Directory of Open Access Journals (Sweden)

    Karina Alejandra Cruz-Pallares

    2015-07-01

    Full Text Available The document presents results of a research project that used as its strategy a complementary training project with thirty-three students of a Bachelor's Degree in Primary School 1997 (DPS, 1997) of an Education Faculty for the initial training of investigators, applied by four teachers who are members of the academic research group in Mexico; it developed through an action research methodology. Highlighted in the results is the strengthening of the competence in reading, understanding and writing scientific texts, which is analogous to the first feature of the graduate profile, called intellectual skills. Among the conclusions it is emphasized that the initial training of teachers is a task that is quite interesting, challenging and complex, as is the complex educational phenomenon itself.

  7. Hanford tanks initiative plan

    International Nuclear Information System (INIS)

    McKinney, K.E.

    1997-01-01

    The Hanford Tanks Initiative (HTI) is a five-year project resulting from the technical and financial partnership of the U.S. Department of Energy's Office of Waste Management (EM-30) and Office of Science and Technology Development (EM-50). The HTI project accelerates activities to gain key technical, cost performance, and regulatory information on two high-level waste tanks. The HTI will provide a basis for design and regulatory decisions affecting the remainder of the Tank Waste Remediation System's tank waste retrieval program.

  8. UNLV Nuclear Hydrogen Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Hechanova, Anthony E.; Johnson, Allen; O'Toole, Brendan; Trabia, Mohamed; Peterson, Per

    2012-10-25

    Evaluation of the crack growth rate (CGR) of Alloy 617 and Alloy 276 under constant K at ambient temperature has been completed. Creep deformation of Alloy 230 at different temperature ranges and load levels has been completed, and heat-to-heat variation has been noticed. A creep deformation study of Alloy 276 has been completed under an applied initial stress level of 10% of yield stress at 950°C. The grain size evaluation of the tested creep specimens of Alloy 276 has been completed.

  9. Feedback stabilization initiative

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    Much progress has been made in attaining high confinement regimes in magnetic confinement devices. These operating modes tend to be transient, however, due to the onset of MHD instabilities, and their stabilization is critical for improved performance at steady state. This report describes the Feedback Stabilization Initiative (FSI), a broad-based, multi-institutional effort to develop and implement methods for raising the achievable plasma betas through active MHD feedback stabilization. A key element in this proposed effort is the Feedback Stabilization Experiment (FSX), a medium-sized, national facility that would be specifically dedicated to demonstrating beta improvement in reactor relevant plasmas by using a variety of MHD feedback stabilization schemes.

  10. Feedback stabilization initiative

    International Nuclear Information System (INIS)

    1997-06-01

    Much progress has been made in attaining high confinement regimes in magnetic confinement devices. These operating modes tend to be transient, however, due to the onset of MHD instabilities, and their stabilization is critical for improved performance at steady state. This report describes the Feedback Stabilization Initiative (FSI), a broad-based, multi-institutional effort to develop and implement methods for raising the achievable plasma betas through active MHD feedback stabilization. A key element in this proposed effort is the Feedback Stabilization Experiment (FSX), a medium-sized, national facility that would be specifically dedicated to demonstrating beta improvement in reactor relevant plasmas by using a variety of MHD feedback stabilization schemes

  11. Initiation of slug flow

    Energy Technology Data Exchange (ETDEWEB)

    Hanratty, T.J.; Woods, B.D. [Univ. of Illinois, Urbana, IL (United States)

    1995-12-31

    The initiation of slug flow in a horizontal pipe can be predicted either by considering the stability of a slug or by considering the stability of a stratified flow. Measurements of the shedding rate of slugs are used to define necessary conditions for the existence of a slug. Recent results show that slugs develop from an unstable stratified flow through the evolution of small-wavelength waves into large-wavelength waves that have the possibility of growing to form a slug. The mechanism appears to be quite different for fluids with viscosities close to that of water than for fluids with large viscosities (20 centipoise).

  12. Stirling to Flight Initiative

    Science.gov (United States)

    Hibbard, Kenneth E.; Mason, Lee S.; Ndu, Obi; Smith, Clayton; Withrow, James P.

    2016-01-01

    Flight (S2F) initiative with the objective of developing a 100-500 We Stirling generator system. Additionally, a different approach is being devised for this initiative to avoid pitfalls of the past, and apply lessons learned from the recent ASRG experience. Two key aspects of this initiative are a Stirling System Technology Maturation Effort, and a Surrogate Mission Team (SMT) intended to provide clear mission pull and requirements context. The S2F project seeks to lead directly into a DOE flight system development of a new SRG. This paper will detail the proposed S2F initiative, and provide specifics on the key efforts designed to pave a forward path for bringing Stirling technology to flight.

  13. Initiating statistical maintenance optimization

    International Nuclear Information System (INIS)

    Doyle, E. Kevin; Tuomi, Vesa; Rowley, Ian

    2007-01-01

    Since the 1980s, maintenance optimization has been centered around various formulations of Reliability Centered Maintenance (RCM). Several such optimization techniques have been implemented at the Bruce Nuclear Station. Further cost refinement of the Station's preventive maintenance strategy includes evaluation of statistical optimization techniques. A review of successful pilot efforts in this direction is provided, as well as initial work with graphical analysis. The present situation regarding data sourcing, the principal impediment to the use of stochastic methods in previous years, is discussed. The use of Crow/AMSAA (Army Materiel Systems Analysis Activity) plots is demonstrated from the point of view of justifying expenditures in optimization efforts. (author)

  14. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    Science.gov (United States)

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-04-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
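For reference, the scalar form of the Cramér-Rao bound invoked above states that any unbiased estimator of a parameter satisfies

```latex
\operatorname{var}(\hat{\theta}) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}
\ln p(\mathbf{x};\theta)\right)^{\!2}\right],
```

where $I(\theta)$ is the Fisher information of the measured data $\mathbf{x}$. In the ABI setting, $\theta$ stands for one of the absorption, refraction, or scattering parameters and $p(\mathbf{x};\theta)$ models the intensities recorded at the sampled analyzer-crystal angles; the multi-parameter case replaces $I(\theta)$ with the inverse of the Fisher information matrix.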

  15. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques

  16. Comparative evaluation of Plateletworks, Multiplate analyzer and Platelet function analyzer-200 in cardiology patients.

    Science.gov (United States)

    Kim, Jeeyong; Cho, Chi Hyun; Jung, Bo Kyeung; Nam, Jeonghun; Seo, Hong Seog; Shin, Sehyun; Lim, Chae Seung

    2018-04-14

    The objective of this study was to comparatively evaluate three commercial whole-blood platelet function analyzer systems: Platelet Function Analyzer-200 (PFA; Siemens Canada, Mississauga, Ontario, Canada), Multiplate analyzer (MP; Roche Diagnostics International Ltd., Rotkreuz, Switzerland), and Plateletworks Combo-25 kit (PLW; Helena Laboratories, Beaumont, TX, USA). Venipuncture was performed on 160 patients who visited a department of cardiology. Pairwise agreement among the three platelet function assays was assessed using Cohen's kappa coefficient and percent agreement within the reference limit. Kappa values with the same agonists were poor between PFA-collagen (COL; agonist)/adenosine diphosphate (ADP) and MP-ADP (-0.147), PFA-COL/ADP and PLW-ADP (0.089), MP-ADP and PLW-ADP (0.039), PFA-COL/ADP and MP-COL (-0.039), and between PFA-COL/ADP and PLW-COL (-0.067). Nonetheless, kappa values for the same assay principle with a different agonist were slightly higher between PFA-COL/ADP and PFA-COL/EPI (0.352), MP-ADP and MP-COL (0.235), and between PLW-ADP and PLW-COL (0.247). The range of percent agreement values was 38.7% to 73.8%. Therefore, measurements of platelet function by more than one method were needed to obtain a reliable interpretation, considering the low kappa coefficients and modest percent agreement rates among the three different platelet function tests.
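Cohen's kappa, used above to assess pairwise agreement, corrects observed agreement for the agreement expected by chance. A minimal sketch with invented normal/abnormal calls (not the study's data):

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' paired categorical calls."""
    assert len(a) == len(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                 # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n**2    # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical normal (0) / abnormal (1) calls from two analyzers
pfa = [0, 0, 1, 1, 0, 1, 0, 0, 1, 0]
mp  = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
kappa = cohens_kappa(pfa, mp)
```

Here the two analyzers agree on 7 of 10 samples (70%), yet kappa is only 0.4 once chance agreement (50%) is removed, which illustrates why the study pairs kappa with raw percent agreement.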

  17. Inclusion-initiated fracture model for ceramics

    International Nuclear Information System (INIS)

    Sung, J.; Nicholson, P.S.

    1990-01-01

    The fracture of ceramics initiating from a typical inclusion is analyzed. The inclusion is considered to have a thermal expansion coefficient and fracture toughness lower than those of the matrix and a Young's modulus higher than that of the matrix. Inclusion-initiated fracture is modeled for a spherical inclusion using a weight function method to compute the residual stress intensity factor for a part-through elliptical crack. The results are applied to an α-Al2O3 inclusion embedded in a tetragonal ZrO2 ceramic. The strength predictions agree well with experimental data.

  18. The LHCb Starterkit initiative

    CERN Document Server

    Puig Navarro, Albert

    2017-01-01

    The vast majority of high-energy physicists use and produce software every day. Software skills are usually acquired on the go and dedicated training courses are rare. The LHCb Starterkit is a new training format for getting LHCb collaborators started in effectively using software to perform their research. The initiative, combining courses and online tutorials, focuses on teaching basic skills for research computing, as well as LHCb software specifics. Unlike traditional tutorials, we focus on starting with the basics, presenting all the material live with a high degree of interactivity, and giving priority to understanding the tools as opposed to handing out recipes that work “as if by magic”. The LHCb Starterkit was started by young members of the collaboration inspired by the principles of Software Carpentry, and the material is created in a collaborative fashion using the tools we teach. Three successful entry-level workshops, as well as two advanced ones, have taken place since the start of the initiative i...

  19. Initiatives for proliferation prevention

    International Nuclear Information System (INIS)

    1997-04-01

    Preventing the proliferation of weapons of mass destruction is a central part of US national security policy. A principal instrument of the Department of Energy's (DOE's) program for securing weapons of mass destruction technology and expertise and removing incentives for scientists, engineers and technicians in the newly independent states (NIS) of the former Soviet Union to go to rogue countries or assist terrorist groups is the Initiatives for Proliferation Prevention (IPP). IPP was initiated pursuant to the 1994 Foreign Operations Appropriations Act. IPP is a nonproliferation program with a commercialization strategy. IPP seeks to enhance US national security and to achieve nonproliferation objectives by engaging scientists, engineers and technicians from former NIS weapons institutes, redirecting their activities into cooperatively-developed, commercially viable non-weapons related projects. These projects lead to commercial and economic benefits for both the NIS and the US. IPP projects are funded in Russia, Ukraine, Kazakhstan and Belarus. This booklet offers an overview of the IPP program as well as a sampling of some of the projects which are currently underway.

  20. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate), regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus, novel, platform independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  1. Analyzing solid waste management practices for the hotel industry

    Directory of Open Access Journals (Sweden)

    S.T. Pham Phu

    2018-01-01

    Full Text Available The current study aims to analyze waste characteristics and management practices of the hotel industry in Hoi An, a tourism city in the center of Vietnam. Solid wastes from 120 hotels were sampled, face-to-face interviews were conducted, and statistical methods were applied to analyze the data. The results showed that the mean waste generation rate of the hotels was 2.28 kg/guest/day and strongly correlated with internal influencing factors such as capacity, room price, the presence of a garden, and the level of restaurant. The differences in the waste generation rates of the hotels were shown to be statistically significant: the larger the hotel, the higher the waste generation rate. Moreover, the waste composition of the hotels was 58.5% biodegradable waste, 25.8% recyclables, and 15.7% others. The relative differences in the waste composition of the hotels by climate, hotel features, and type of guest were explained: the larger the hotel, the higher the percentage of biodegradable waste and the lower the proportion of recyclables. This study also revealed that the hoteliers' implementation of waste management practices has initially reaped quite positive achievements, with 76% sorting, 39% recycling, 29% reduction, and 0.8% composting; the rate of waste management practices was proportional to the scale of the hotel. This study provides information on the waste management practices of the hotel industry and contributes to the overall assessment of municipal solid waste management in Hoi An city.

  2. Evaluation Program initiative

    International Nuclear Information System (INIS)

    Rich, B.L.

    1987-01-01

    The purpose of this paper is to provide the Department of Energy's (DOE) safeguards and security community with some insights on an important management initiative by the Office of Security Evaluations (OSE). The paper will present the ''what, where, who, when, and why'' of a new Evaluation Program. The Evaluation Program will be comprised of a continuing series of regular and special evaluations of DOE safeguards and security programs and policies. The evaluations will be integrative and ''crosscutting,'' i.e. will extend across DOE organizational lines. Evaluations will be offered as positive advisories to DOE managers with safeguards and security responsibilities and will not be rated. They will complement the ongoing OSE Inspection Program of inspections conducted by OSE's Inspection Division. The purpose for the evaluations is to establish an accurate and current assessment of the effectiveness and status of safeguards and security programs and policies and to provide DOE managers with required information on program and policy effectiveness

  3. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1982-09-01

    Report III, Volume 1 contains those specifications numbered A through J, as follows: General Specifications (A); Specifications for Pressure Vessels (C); Specifications for Tanks (D); Specifications for Exchangers (E); Specifications for Fired Heaters (F); Specifications for Pumps and Drivers (G); and Specifications for Instrumentation (J). The standard specifications of Bechtel Petroleum Incorporated have been amended as necessary to reflect the specific requirements of the Breckinridge Project, and the more stringent specifications of Ashland Synthetic Fuels, Inc. These standard specifications are available to the Initial Effort (Phase Zero) work performed by all contractors and subcontractors. Report III, Volume 1 also contains the unique specifications prepared for Plants 8, 15, and 27. These specifications will be substantially reviewed during Phase I of the project, and modified as necessary for use during the engineering, procurement, and construction of this project.

  4. The climate technology initiative

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Adam [International Energy Agency, Climate Technology Initiative, Paris (France)

    2000-12-01

    The CTI (Climate Technology Initiative) aims to promote those technologies which cause the minimum of harm to the environment: reducing emissions of greenhouse gases and supporting those countries most vulnerable to climate change are priorities. A strong case for cogeneration is made and it is pointed out that both the European Union and the USA aim to double their cogeneration capacity by 2010. The CTI holds training courses and seminars all over the world where the barriers to the expansion of climate-friendly technology are discussed. The article also mentions the CTI Co-operation Technology Implementation Plan, research and development, its website and search engine, its presence at all UNFCCC events and its awards programme.

  5. The climate technology initiative

    International Nuclear Information System (INIS)

    Smith, Adam

    2000-01-01

    The CTI (Climate Technology Initiative) aims to promote those technologies which cause the minimum of harm to the environment: reducing emissions of greenhouse gases and supporting those countries most vulnerable to climate change are priorities. A strong case for cogeneration is made and it is pointed out that both the European Union and the USA aim to double their cogeneration capacity by 2010. The CTI holds training courses and seminars all over the world where the barriers to the expansion of climate-friendly technology are discussed. The article also mentions the CTI Co-operation Technology Implementation Plan, research and development, its website and search engine, its presence at all UNFCCC events and its awards programme

  6. Initial brain aging

    DEFF Research Database (Denmark)

    Thomsen, Kirsten; Yokota, Takashi; Hasan-Olive, Md Mahdi

    2018-01-01

    Brain aging is accompanied by declining mitochondrial respiration. We hypothesized that mitochondrial morphology and dynamics would reflect this decline. Using hippocampus and frontal cortex of a segmental progeroid mouse model lacking Cockayne syndrome protein B (CSB(m/m)) and C57Bl/6 (WT......) controls and comparing young (2-5 months) to middle-aged mice (13-14 months), we found that complex I-linked state 3 respiration (CI) was reduced at middle age in CSB(m/m) hippocampus, but not in CSB(m/m) cortex or WT brain. In hippocampus of both genotypes, mitochondrial size heterogeneity increased....... Mitochondrial DNA content was lower, and hypoxia-induced factor 1α mRNA was greater at both ages in CSB(m/m) compared to WT brain. Our findings show that decreased CI and increased mitochondrial size heterogeneity are highly associated and point to declining mitochondrial quality control as an initial event...

  7. Materials Genome Initiative

    Science.gov (United States)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification to more efficiently integrate new materials in existing NASA projects and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/ processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes

  8. The Ombudperson Initiative Group

    CERN Multimedia

    Laura Stewart

    Following many discussions that took place at some of the ATLAS Women's Network lunch gatherings, a few ATLAS women joined forces with similarly concerned CERN staff women to form a small group last Fall to discuss the need for a CERN-wide Ombudsperson. This has since evolved into the Ombudsperson Initiative Group (OIG) currently composed of the following members: Barbro Asman, Stockholm University; Pierre Charrue, CERN AB; Anna Cook, CERN IT; Catherine Delamare, CERN and IT Ombudsperson; Paula Eerola, Lund University; Pauline Gagnon, Indiana University; Eugenia Hatziangeli, CERN AB; Doreen Klem, CERN IT; Bertrand Nicquevert, CERN TS and Laura Stewart, CERN AT. On June 12, members of the OIG met with representatives of Human Resources (HR) and the Equal Opportunity Advisory Panel (EOAP) to discuss the proposal drafted by the OIG. The meeting was very positive. Everybody agreed that the current procedures at CERN applicable in the event of conflict required a thorough review, and that a professionally trai...

  9. Instrumented Pipeline Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Thomas Piro; Michael Ream

    2010-07-31

    This report summarizes technical progress achieved during the cooperative agreement between Concurrent Technologies Corporation (CTC) and the U.S. Department of Energy to address the need for a low-cost monitoring and inspection sensor system as identified in the Department of Energy (DOE) National Gas Infrastructure Research & Development (R&D) Delivery Reliability Program Roadmap. The Instrumented Pipeline Initiative (IPI) achieved the objective by researching technologies for monitoring pipeline delivery integrity through a ubiquitous network of sensors and controllers to detect and diagnose incipient defects, leaks, and failures. This report is organized by tasks as detailed in the Statement of Project Objectives (SOPO). Each section states the objective and approach before detailing the results of the work.

  10. The Enzyme Function Initiative.

    Science.gov (United States)

    Gerlt, John A; Allen, Karen N; Almo, Steven C; Armstrong, Richard N; Babbitt, Patricia C; Cronan, John E; Dunaway-Mariano, Debra; Imker, Heidi J; Jacobson, Matthew P; Minor, Wladek; Poulter, C Dale; Raushel, Frank M; Sali, Andrej; Shoichet, Brian K; Sweedler, Jonathan V

    2011-11-22

    The Enzyme Function Initiative (EFI) was recently established to address the challenge of assigning reliable functions to enzymes discovered in bacterial genome projects; in this Current Topic, we review the structure and operations of the EFI. The EFI includes the Superfamily/Genome, Protein, Structure, Computation, and Data/Dissemination Cores that provide the infrastructure for reliably predicting the in vitro functions of unknown enzymes. The initial targets for functional assignment are selected from five functionally diverse superfamilies (amidohydrolase, enolase, glutathione transferase, haloalkanoic acid dehalogenase, and isoprenoid synthase), with five superfamily specific Bridging Projects experimentally testing the predicted in vitro enzymatic activities. The EFI also includes the Microbiology Core that evaluates the in vivo context of in vitro enzymatic functions and confirms the functional predictions of the EFI. The deliverables of the EFI to the scientific community include (1) development of a large-scale, multidisciplinary sequence/structure-based strategy for functional assignment of unknown enzymes discovered in genome projects (target selection, protein production, structure determination, computation, experimental enzymology, microbiology, and structure-based annotation), (2) dissemination of the strategy to the community via publications, collaborations, workshops, and symposia, (3) computational and bioinformatic tools for using the strategy, (4) provision of experimental protocols and/or reagents for enzyme production and characterization, and (5) dissemination of data via the EFI's Website, http://enzymefunction.org. 
The realization of multidisciplinary strategies for functional assignment will begin to define the full metabolic diversity that exists in nature and will impact basic biochemical and evolutionary understanding, as well as a wide range of applications of central importance to industrial, medicinal, and pharmaceutical efforts.

  11. Green Power Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Patrick Barry [Univ. of Iowa, Iowa City, IA (United States)

    2013-01-28

    National energy policy supports the gathering of more detailed and authoritative data on the introduction of renewable bio-based fuels into new and existing district energy systems via the application of biomass gasification. The University of Iowa developed a biomass-fueled, university-scale steam generation system based on biomass gasification technologies. The system serves as a state-of-the-art research and educational facility in the emerging application of gasification in steam generation. The facility, which includes a smaller down-draft gasifier and a larger multi-stage biomass boiler, was designed to operate primarily on wood-based fuels, but has provisions for testing other biomass fuel sources produced within a 100-mile radius, providing enough flexibility to meet the fluctuating local supply of biomass from industry and Midwest agriculture. The equipment was installed in an existing, staffed facility. The down-draft gasifier unit is operated by College of Engineering staff and students, under the direct technical supervision of qualified Utilities plant staff. The Green Power Initiative also includes a substantial, innovative educational component. In addition to an onsite, graduate-level research program in biomass fuels, the investigators have integrated undergraduate and graduate level teaching – through classroom studies and experiential learning – and applied research into a biomass-based, university-scale, functioning power plant. University of Iowa is unique in that it currently has multiple renewable energy technologies deployed, including significant biomass combustion (oat hulls) at its Main Power Plant and a new reciprocating engine based renewable district energy system. This project complements and supports the national energy policy and State of Iowa initiatives in ethanol and biodiesel. Byproducts of ethanol and biodiesel processes (distiller grains) as well as industry residues (oat hulls, wood chips, construction and demolition

  12. Systematic Review about Personal Growth Initiative

    Directory of Open Access Journals (Sweden)

    Clarissa Pinto Pizarro de Freitas

    Full Text Available The present study aimed to conduct a systematic review of publications about personal growth initiative. A literature review was carried out in the Bireme, Index Psi, LILACS, PePSIC, Pubmed - Publisher's Medline, Wiley Online Library, PsycINFO, OneFile, SciVerse ScienceDirect, ERIC, Emerald Journals, PsycARTICLES - American Psychological Association, Directory of Open Access Journals - DOAJ, SAGE Journals, SpringerLink, PLoS, IngentaConnect, IEEE Journals & Magazines and SciELO databases. The literature review was performed from December of 2014 to January of 2015, without stipulating date limits for the publication of the articles. Fifty-three studies were found, seven were excluded, and 46 were analyzed. The studies aimed to investigate the psychometric properties of the Personal Growth Initiative Scale and the Personal Growth Initiative Scale II. The relations between personal growth initiative and other constructs were also evaluated. Furthermore, the studies investigated the impact of interventions to promote personal growth initiative. Results of these studies showed that personal growth initiative was positively related to levels of well-being, self-esteem and other positive dimensions, and negatively to anxiety, depression and other negative factors.

  13. Development of remote controlled electron probe micro analyzer with crystal orientation analyzer

    International Nuclear Information System (INIS)

    Honda, Junichi; Matsui, Hiroki; Harada, Akio; Obata, Hiroki; Tomita, Takeshi

    2012-07-01

    The advanced utilization of Light Water Reactor (LWR) fuel is progressing in Japan to reduce the power generating cost and the volume of nuclear wastes. The electric power companies have continued the approach to burnup extension and to raising the thermal power of the commercial fuel. The government should accumulate detailed information on the newest technologies to establish the regulations and guidelines for the safety of the advanced nuclear fuels. A remote controlled Electron Probe Micro Analyzer (EPMA) equipped with a crystal orientation analyzer has been developed in the Japan Atomic Energy Agency (JAEA) to study the behavior of high burnup fuels under accident conditions. The effects of the cladding microstructure on the fuel behavior will be evaluated more conveniently and quantitatively by this EPMA. The commercial model of the EPMA has been modified to be airtight and earthquake-resistant in compliance with the government safety regulations for handling highly radioactive elements. This paper describes the specifications of the EPMA, which were specialised for post irradiation examination, and the test results of the cold mock-up to confirm their performances and reliabilities. (author)

  14. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO .NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  15. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO .NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  16. Concepts and realization of the KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Moritz, H.; Hummel, R.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is a real time simulator developed from KWU computer programs for transient and safety analysis ('engineering simulator'). The NPA has no control room; the hardware consists only of commercially available data processing devices. The KWU NPA makes available all simulator operating features, such as initial conditions, free operator action and multiple malfunctions as well as freeze, snapshot, backtrack and playback, which have proved useful as training support in training simulators of all technical disciplines. The simulation program itself runs on a large mainframe computer, a Control Data CYBER 176 or CYBER 990, in the KWU computing center under the interactive component INTERCOM of the operating system NOS/BE. It transmits the time dependent engineering data roughly once a second to a SIEMENS 300-R30E process computer using telecommunication by telephone. The computers are coupled by an emulation of the communication protocol Mode 4A running on the R30 computer. To this emulation a program-to-program interface via a circular buffer on the R30 was added. In the process computer, data are processed and displayed graphically on 4 colour screens (560x512 pixels, 8 colours) by means of the process monitoring system DISIT. All activities at the simulator, including operator actions, are performed locally by the operator at the screens by means of function keys or dialog. (orig.)

  17. Analyzing octopus movements using three-dimensional reconstruction.

    Science.gov (United States)

    Yekutieli, Yoram; Mitelman, Rea; Hochner, Binyamin; Flash, Tamar

    2007-09-01

    Octopus arms, as well as other muscular hydrostats, are characterized by a very large number of degrees of freedom and a rich motion repertoire. Over the years, several attempts have been made to elucidate the interplay between the biomechanics of these organs and their control systems. Recent developments in electrophysiological recordings from both the arms and brains of behaving octopuses mark significant progress in this direction. The next stage is relating these recordings to the octopus arm movements, which requires an accurate and reliable method of movement description and analysis. Here we describe a semiautomatic computerized system for 3D reconstruction of an octopus arm during motion. It consists of two digital video cameras and a PC computer running custom-made software. The system overcomes the difficulty of extracting the motion of smooth, nonrigid objects in poor viewing conditions. Part of the difficulty stems from light refraction when recording motion underwater. Here we use both experiments and simulations to analyze the refraction problem and show that accurate reconstruction is possible. We have used this system successfully to reconstruct different types of octopus arm movements, such as reaching and bend initiation movements. Our system is noninvasive and does not require attaching any artificial markers to the octopus arm. It may therefore be of more general use in reconstructing other nonrigid, elongated objects in motion.
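The refraction problem mentioned above arises because a ray from a camera in air bends at the air-water interface, so an underwater point appears displaced from its true position. A hedged sketch of the underlying geometry via Snell's law (a generic illustration, not the authors' reconstruction algorithm):

```python
import math

def refracted_angle(theta_air_deg, n_air=1.0, n_water=1.333):
    """Angle of the ray below the water surface, measured from the
    surface normal, via Snell's law: n_air*sin(t1) = n_water*sin(t2)."""
    s = n_air / n_water * math.sin(math.radians(theta_air_deg))
    return math.degrees(math.asin(s))

# A ray hitting the surface at 30 degrees from the normal bends
# toward the normal in the denser medium (water).
theta_w = refracted_angle(30.0)
print(round(theta_w, 2))
```

Because the bending depends on the incidence angle, a reconstruction that ignores it accumulates viewpoint-dependent errors, which is why the refraction correction matters for accurate 3D tracking.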

  18. Analyzing the Risk of Well Plug Failure after Abandonment

    International Nuclear Information System (INIS)

    Mainguy, M.; Longuemare, P.; Audibert, A.; Lecolier, E.

    2007-01-01

    All oil and gas wells will have to be plugged and abandoned at some time. The plugging and abandonment procedure must provide effective isolation of the well fluids all along the well to reduce environmental risks of contamination and prevent costly remedial jobs. Previous works have analyzed the plug behavior when submitted to local pressure or thermal changes, but no work has examined the effects of the external pressure, thermal and stress changes resulting from global equilibrium restoration in a hydrocarbon reservoir once production has stopped. This work estimates those changes after abandonment on a reservoir field case using a reservoir simulator in conjunction with a geomechanical simulator. Such simulations provide the pressure and thermal changes and the maximum effective stress changes in the reservoir cap rock, where critical plugs are put in place for isolating the production intervals. These changes are used as loads in a well bore stress model that explicitly models an injector well and predicts stress rearrangements in the plug after abandonment. Results obtained with the well bore stress model for a conventional class G cement plug show that the main risk of failure is tensile failure because of the low tensile strength of the cement. In fact, soft sealing materials or an initially pre-stressed plug appear to be better adapted to the downhole condition changes that may occur after well plugging and abandonment. (authors)

  19. Analyzing EFL Teachers’ Initial Job Motivation and Factors Effecting Their Motivation in Fezalar Educational Institutions in Iraq

    Directory of Open Access Journals (Sweden)

    Selcuk Koran

    2015-02-01

    Full Text Available Teacher motivation is one of the primary variables of students’ high performance. It has been observed that students whose teachers are highly motivated are more engaged in the learning process. Therefore, it is mostly the teacher who determines the level of success or failure in achieving the institution’s goal in the educational process. Thus, teachers are expected by administrations to demonstrate highly motivated job performance. However, some teachers seem naturally enthusiastic about teaching while others need to be stimulated, inspired and challenged. There are several factors that provide teachers with the necessary motivation to work effectively; these factors can be emotional, financial, physical or academic. This study is an attempt to find out what motivates teachers to enter this profession, since the reasons for entering this job have a significant influence on their commitment to it; to investigate the factors responsible for the high or low motivation of language teachers in Fezalar Educational Institutions (FEI), a Turkish private institution that operates in Iraq; and to ascertain the degree to which intrinsic and extrinsic motivational factors impact teachers in their work situation. Based on a review of recent research on motivation in general, and on language teacher motivation in particular, and relying on a qualitative and quantitative study of the issue, a detailed analysis of some aspects of foreign language teacher motivation is presented in the article. Keywords: teacher motivation, job satisfaction, foreign language teaching, L2 teacher motivation

  20. Analyzing EFL Teachers' Initial Job Motivation and Factors Effecting Their Motivation in Fezalar Educational Institutions in Iraq

    Science.gov (United States)

    Koran, Selcuk

    2015-01-01

    Teacher motivation is one of the primary variables of students' high performance. It has been observed that students whose teachers are highly motivated are more engaged in the learning process. Therefore, it's mostly the teacher who determines the level of success or failure in achieving the institution's goal in the educational process. Thus, teachers…

  1. Care initiation area yields dramatic results.

    Science.gov (United States)

    2009-03-01

    The ED at Gaston Memorial Hospital in Gastonia, NC, has achieved dramatic results in key department metrics with a Care Initiation Area (CIA) and a physician in triage. Here's how the ED arrived at this winning solution: Leadership was trained in and implemented the Kaizen method, which eliminates redundant or inefficient process steps. Simulation software helped determine additional space needed by analyzing arrival patterns and other key data. After only two days of meetings, new ideas were implemented and tested.

  2. Hawaii Energy and Environmental Technologies (HEET) Initiative

    Science.gov (United States)

    2011-12-01

    polymer electrolyte fuel cells (PEMFCs) performance. This work was performed to support the DOE manufacturing initiative for PEMFC production. The work...performed by exposing the MEA cathode to 10 ppm SO2 in N2 at certain potential and typical operating conditions of a PEMFC for certain time, then...adsorbate by analyzing the electrochemical reduction and oxidation potential and charge. As for the in-situ SO2 adsorption experiments, a PEMFC under

  3. The data quality analyzer: a quality control program for seismic data

    Science.gov (United States)

    Ringler, Adam; Hagerty, M.T.; Holland, James F.; Gonzales, A.; Gee, Lind S.; Edwards, J.D.; Wilson, David; Baker, Adam

    2015-01-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several initiatives underway to enhance and track the quality of data produced from ASL seismic stations and to improve communication about data problems to the user community. The Data Quality Analyzer (DQA) is one such development and is designed to characterize seismic station data quality in a quantitative and automated manner.

  4. Initial Egyptian ECMO experience

    Directory of Open Access Journals (Sweden)

    Akram Abdelbary

    2016-04-01

    Results: A total of twelve patients received ECMO between January 2014 and June 2015. The mean age was 35.9 years (range 13–65 years), 8 males, with VV ECMO in 10 patients and VA ECMO in 2 patients. Of the ten VV ECMO patients, one had H1N1 pneumonia, one had advanced vasculitic lung disease, four had bacterial pneumonia, two had traumatic lung contusions, one had organophosphorus poisoning, and one had an undiagnosed etiology leading to severe ARDS. Lung injury score range was 3–3.8, PaO2/FiO2 20–76, and mechanical ventilation duration before ECMO 1–14 days. Cannulation was femoro-jugular in 7 patients, femoro-femoral in 2 patients and femoro-subclavian in 1 patient; all patients were initially sedated and paralyzed for 2–4 days and ventilated on pressure controlled ventilation with Pmax of 25 cm H2O and PEEP of 10 cm H2O. VA ECMO patients were cannulated percutaneously using a femoro-femoral approach. One patient showed no neurologic recovery and died after 24 h; the other had CABG on ECMO, but the heart did not recover and the patient died after 9 days. Heparin intravenous infusion was used initially in all patients and changed to bivalirudin in 2 patients due to possible HIT. Pump flow ranged from 2.6 to 6.5 L/min. Average support time was 12 days (range 2–24 days). Seven patients (63.3% were successfully separated from ECMO and survived to hospital discharge. Hospital length of stay ranged from 3 to 42 days; tracheostomy was done percutaneously in 5 patients and surgically in 3. Gastrointestinal bleeding occurred in 6 patients, VAP in 7 patients, neurologic complications in 1 patient with complete recovery, cardiac arrhythmias in 3 patients, pneumothorax in 9 patients, and deep venous thrombosis in 2 patients.

  5. MONTANA PALLADIUM RESEARCH INITIATIVE

    Energy Technology Data Exchange (ETDEWEB)

    Peters, John; McCloskey, Jay; Douglas, Trevor; Young, Mark; Snyder, Stuart; Gurney, Brian

    2012-05-09

    Project Objective: The overarching objective of the Montana Palladium Research Initiative is to perform scientific research on the properties and uses of palladium in the context of the U.S. Department of Energy's Hydrogen, Fuel Cells and Infrastructure Technologies Program. The purpose of the research is to explore palladium as a possible alternative to platinum in hydrogen-economy applications. To achieve this objective, the Initiative's activities will focus on several cutting-edge research approaches across a range of disciplines, including metallurgy, biomimetics, instrumentation development, and systems analysis. Background: Platinum-group elements (PGEs) play significant roles in processing hydrogen, an element with high potential to address the need in the U.S. and the world for inexpensive, reliable, clean energy. Platinum, however, is a very expensive component of current and planned systems, so less-expensive alternatives with similar physical properties are being sought. To this end, several tasks have been defined under the rubric of the Montana Palladium Research Initiative. This broad swath of activities will allow progress on several fronts. The membrane-related activities of Task 1 employ state-of-the-art and leading-edge technologies to develop new, ceramic-substrate metallic membranes for the production of high-purity hydrogen, and develop techniques for the production of thin, defect-free platinum group element catalytic membranes for energy production and pollution control. The biomimetic work in Task 2 explores the use of substrate-attached hydrogen-producing enzymes and the encapsulation of palladium in virion-based protein coats to determine their utility for distributed hydrogen production. Task 3 work involves developing laser-induced breakdown spectroscopy (LIBS) as a real-time, in situ diagnostic technique to characterize PGE nanoparticles for process monitoring and control. The systems engineering work in Task 4

  6. Complex-radical copolymerization of vinyl monomers on organoelemental initiators

    International Nuclear Information System (INIS)

    Grishin, D.F.

    1993-01-01

    Data on regularities of the initiation and growth of the (co)polymerization of polar vinyl series monomers on organo-elemental initiators, organo-boron in particular, are generalized. The effect of organo-metallic compounds and some phenol-type inhibitors on the rate of acrylate (co)polymerization is analyzed from the viewpoint of the change in electron-acceptor properties (electrophilicity) of macroradicals

  7. The Myth of Peer Influence in Adolescent Smoking Initiation

    Science.gov (United States)

    Arnett, Jeffrey Jensen

    2007-01-01

    The widespread belief that peer influence is the primary cause of adolescent smoking initiation is examined and called into question. Correlational and longitudinal studies purporting to demonstrate peer influence are analyzed, and their limitations described. Qualitative interview studies of adolescent smoking initiation are presented as…

  8. NATO and EU/European Defense Initiatives: Competitive or Complementary

    National Research Council Canada - National Science Library

    Muckel, Hubert

    2006-01-01

    .... This paper analyzes the current status of NATO and the European Union (EU) defense initiatives, examines the national objectives and interests of European key players and the US, and evaluates the aspects of competitiveness or complementarity of NATO and EU defense initiatives.

  9. TFTR initial operations

    International Nuclear Information System (INIS)

    Young, K.M.; Bell, M.; Blanchard, W.R.

    1984-01-01

    TFTR (Tokamak Fusion Test Reactor) has operated since December 1982 with ohmically heated plasmas. Routine operation with feedback control of plasma current, position and density has been obtained for plasmas with Isub(p) approx.= 800 kA, a = 68 cm, R = 250 cm, and Bsub(t) = 27 kG. A maximum plasma current of 1 MA was achieved with q approx.= 2.5. Energy confinement times of approx. 150 msec were measured for hydrogen and deuterium plasmas with n-barsub(e) approx.= 2 x 10^13 cm^-3, Tsub(e)(0) approx.= 1.5 keV, Tsub(i)(0) approx.= 1.5 keV and Zsub(eff) approx.= 3. The preliminary results suggest a size-cubed scaling from PLT, and are consistent with Alcator C scaling where tau approx. nR^2 a. Initial measurements of plasma disruption characteristics indicate current decay rates of approx. 800 kA in 8 ms, which is within the TFTR design requirement of 3 MA in 3 ms. (author)

  10. The new childcare initiative

    CERN Multimedia

    Cigdem Issever

    The ATLAS Women's Network recently sent out a general mailing to all ATLAS and CMS members to announce a new initiative aimed at improving childcare facilities for Users coming to CERN. Several people have expressed the need that CERN should provide or facilitate affordable day care for children of temporary visitors at CERN. The ATLAS Women's Network is now forming a child care task force from concerned people and invites all those interested to join this effort. You can do so by either adding your name to the mailing list cern-users-childcare@cern.ch in Simba or by contacting Cigdem.Issever@cern.NOSPAM.ch and Pauline.Gagnon@cern.NOSPAM.ch. More than 50 people have already joined this effort. Those who have joined the mailing list will soon receive all the details about the next conference call meeting which has been scheduled for Thursday October 25th from 16:30 to 18:00 CERN time. The preliminary agenda is the following: Summary of our first contact of ATLAS and CMS (5 min) Discussion about the co-conv...

  11. AECL's new environmental initiatives

    International Nuclear Information System (INIS)

    McDonnell, F.N.

    1993-01-01

    AECL's research and development expenditures in environmental sciences and waste management technology are about $50 M per year. The main focus of these programs is the Nuclear Fuel Waste Management Program. This research is supplemented by activities in support of laboratory, Environmental Authority and internal waste management requirements, as well as provision of non-nuclear services. AECL intends to become more involved in performing environmental research and development with broader application. The goal is to achieve a relationship with Canadian industry that would involve a substantial portion of AECL's environmental research capabilities. The research directions and priorities of the resulting partnership would be set by the private sector in accordance with their needs and requirements. It is expected that the activities associated with this new environmental initiative will start small and grow in response to perceived needs. AECL is now increasing its non-nuclear research efforts by targeting those markets that appear most attractive. The thrust can be divided into three broad categories: environmental research, environmental services, and environmental products. (Author)

  12. TFTR initial operations

    International Nuclear Information System (INIS)

    Young, K.M.; Bell, M.; Blanchard, W.R.

    1983-11-01

    The Tokamak Fusion Test Reactor (TFTR) has operated since December 1982 with ohmically heated plasmas. Routine operation with feedback control of plasma current, position, and density has been obtained for plasmas with I/sub p/ approx. = 800 kA, a = 68 cm, R = 250 cm, and B/sub t/ = 27 kG. A maximum plasma current of 1 MA was achieved with q approx. = 2.5. Energy confinement times of approx. 150 msec were measured for hydrogen and deuterium plasmas with anti n/sub e/ approx. = 2 x 10^13 cm^-3, T/sub e/ (0) approx. = 1.5 keV, T/sub i/ (0) approx. = 1.5 keV, and Z/sub eff/ approx. = 3. The preliminary results suggest a size-cubed scaling from PLT and are consistent with Alcator C scaling where tau approx. nR^2 a. Initial measurements of plasma disruption characteristics indicate current decay rates of approx. 800 kA in 8 ms, which is within the TFTR design requirement of 3 MA in 3 ms

  13. European nuclear education initiatives

    International Nuclear Information System (INIS)

    Glatz, Jean-Paul

    2011-01-01

    Whatever option regarding their future nuclear energy development is chosen by European Union Member States, the availability of a sufficient number of well trained and experienced staff is key for the responsible use of nuclear energy. This is true in all areas including design, construction, operation, decommissioning, fuel cycle and waste management as well as radiation protection. The high average age of existing experts, leading to a significant wave of retirements, induces a real risk of the loss of nuclear competencies in the coming years. Therefore the demand for hiring skilled employees is rising. The challenge of ensuring a sufficient number of qualified staff in the nuclear sector has been acknowledged widely among the different stakeholders, in particular the nuclear industry, national regulatory authorities and Technical Support Organisations (TSOs). Already the EURATOM Treaty refers explicitly to the obligation for the Commission to carry out training actions. Recently initiatives have been launched at EU level to facilitate and strengthen the efforts of national stakeholders. The European Nuclear Education Network (ENEN) Association aims at preservation and further development of expertise in the nuclear field through higher education and training. The goal of the European Nuclear Energy Leadership Academy (ENELA) is to educate future leaders in the nuclear field to ensure the further development of sustainable European nuclear energy solutions. The European Nuclear Energy Forum (ENEF) is a platform operated by the European Commission for a broad discussion on the opportunities and risks of nuclear energy. The nuclear programs under investigation in the Joint Research Center (JRC) are increasingly contributing to Education and Training (E and T) initiatives, promoting better cooperation between key players and universities as well as operators and regulatory bodies in order to mutually optimise their training programmes. Another objective is to increase

  14. The MEGAPIE Initiative

    International Nuclear Information System (INIS)

    Salvatores, M.; Bauer, G.S.; Heusener, G.

    2000-10-01

    MEGAPIE (Megawatt Pilot Experiment) is a joint initiative by Commissariat a l'Energie Atomique (CEA), France, Forschungszentrum Karlsruhe (FZK), Germany, and Paul Scherrer Institut (PSI), Switzerland, to design, build, operate and explore a liquid lead-bismuth spallation target for 1 MW of beam power, taking advantage of the existing spallation neutron facility SINQ at PSI. Such a target, based on a eutectic mixture with a melting point as low as 125 °C and a boiling point as high as 1670 °C, is the preferred concept in several studies aiming at utilising accelerators to drive subcritical assemblies in order to transmute long-lived nuclear waste into shorter-lived isotopes in an effort to ease problems of long-term storage and final disposal. MEGAPIE will be an essential step towards demonstrating the feasibility of coupling a high power accelerator, a spallation target and a subcritical assembly. It will specifically address one of the most critical issues, namely the behaviour of a liquid metal target under realistic operating conditions. As an intensely instrumented pilot experiment it will provide valuable data for benchmarking of frequently used computer codes and will make it possible to gain important experience in the safe handling of components that have been irradiated with PbBi. It will be installed at the ring cyclotron at PSI with 590 MeV proton energy and a continuous current of 1.8 mA. The basic concept of the MEGAPIE target as well as the definition of the project phases and of the supporting research and development activities at the participating laboratories are described in the present report

  15. Initial Cladding Condition

    International Nuclear Information System (INIS)

    Siegmann, E.

    2000-01-01

    The purpose of this analysis is to describe the condition of commercial Zircaloy clad fuel as it is received at the Yucca Mountain Project (YMP) site. Most commercial nuclear fuel is encased in Zircaloy cladding. This analysis is developed to describe cladding degradation from the expected failure modes. This includes reactor operation impacts including incipient failures, potential degradation after reactor operation during spent fuel storage in pool and dry storage and impacts due to transportation. Degradation modes include cladding creep, and delayed hydride cracking during dry storage and transportation. Mechanical stresses from fuel handling and transportation vibrations are also included. This Analysis and Model Report (AMR) does not address any potential damage to assemblies that might occur at the YMP surface facilities. Ranges and uncertainties have been defined. This analysis will be the initial boundary condition for the analysis of cladding degradation inside the repository. In accordance with AP-2.13Q, ''Technical Product Development Planning'', a work plan (CRWMS M andO 2000c) was developed, issued, and utilized in the preparation of this document. There are constraints, caveats and limitations to this analysis. This cladding degradation analysis is based on commercial Pressurized Water Reactor (PWR) fuel with Zircaloy cladding but is applicable to Boiling Water Reactor (BWR) fuel. Reactor operating experience for both PWRs and BWRs is used to establish fuel reliability from reactor operation. It is limited to fuel exposed to normal operation and anticipated operational occurrences (i.e. events which are anticipated to occur within a reactor lifetime), and not to fuel that has been exposed to severe accidents. Fuel burnup projections have been limited to the current commercial reactor licensing environment with restrictions on fuel enrichment, oxide coating thickness and rod plenum pressures. 
The information provided in this analysis will be used in

  16. Hazardous material reduction initiative

    International Nuclear Information System (INIS)

    Nichols, D.H.

    1995-02-01

    The Hazardous Material Reduction Initiative (HMRI) explores using the review of purchase requisitions to reduce both the use of hazardous materials and the generation of regulated and nonregulated wastes. Based on an 11-month program implemented at the Hanford Site, hazardous material use and waste generation were effectively reduced by using a centralized procurement control program known as HMRI. As expected, several changes to the original proposal were needed during the development/testing phase of the program to accommodate changing and actual conditions found at the Hanford Site. The current method requires a central receiving point within the Procurement Organization to review all purchase requisitions for potentially Occupational Safety and Health Administration (OSHA) hazardous products. Those requisitions (approximately 4% to 6% of the total) are then forwarded to Pollution Prevention personnel for evaluation under HMRI. The first step is to determine if the requested item can be filled by existing or surplus material. The requisitions that cannot be filled by existing or surplus material are then sorted into two groups based on applicability to the HMRI project. For example, laboratory requests for analytical reagents or standards are excluded and the purchase requisitions are returned to Procurement for normal processing because, although regulated, there is little opportunity for source reduction due to the strict protocols followed. Each item is then checked to determine whether or not it is regulated. Regulated items are prioritized based on hazardous contents, quantity requested, and end use. Copies of these requisitions are made and the originals are returned to Procurement within 1 hr. Since changes to the requisition can be made at later stages during procurement, the HMRI fulfills one of its original premises in that it does not slow the procurement process

  17. Nursing Facility Initiative Annual Report

    Data.gov (United States)

    U.S. Department of Health & Human Services — This annual report summarizes impacts from the Initiative to Reduce Avoidable Hospitalizations among Nursing Facility Residents in 2014. This initiative is designed...

  18. National Take-Back Initiative

    Science.gov (United States)

    ... Disposal Information Drug and Chemical Information E-commerce Initiatives Federal Agencies & Related Links Federal Register Notices National ...

  19. The national geomagnetic initiative

    Science.gov (United States)

    1993-01-01

    The Earth's magnetic field, through its variability over a spectrum of spatial and temporal scales, contains fundamental information on the solid Earth and geospace environment (the latter comprising the atmosphere, ionosphere, and magnetosphere). Integrated studies of the geomagnetic field have the potential to address a wide range of important processes in the deep mantle and core, asthenosphere, lithosphere, oceans, and the solar-terrestrial environment. These studies have direct applications to important societal problems, including resource assessment and exploration, natural hazard mitigation, safe navigation, and the maintenance and survivability of communications and power systems on the ground and in space. Studies of the Earth's magnetic field are supported by a variety of federal and state agencies as well as by private industry. Both basic and applied research is presently supported by several federal agencies, including the National Science Foundation (NSF), U.S. Geological Survey (USGS), U.S. Department of Energy (DOE), National Oceanic and Atmospheric Administration (NOAA), National Aeronautics and Space Administration (NASA), and U.S. Department of Defense (DOD) (through the Navy, Air Force, and Defense Mapping Agency). Although each agency has a unique, well-defined mission in geomagnetic studies, many areas of interest overlap. For example, NASA, the Navy, and USGS collaborate closely in the development of main field reference models. NASA, NSF, and the Air Force collaborate in space physics. These interagency linkages need to be strengthened. Over the past decade, new opportunities for fundamental advances in geomagnetic research have emerged as a result of three factors: well-posed, first-order scientific questions; increased interrelation of research activities dealing with geomagnetic phenomena; and recent developments in technology. These new opportunities can be exploited through a national geomagnetic initiative to define objectives and

  20. Initial Radionuclide Inventories

    Energy Technology Data Exchange (ETDEWEB)

    H. Miller

    2004-09-19

    The purpose of this analysis is to provide an initial radionuclide inventory (in grams per waste package) and associated uncertainty distributions for use in the Total System Performance Assessment for the License Application (TSPA-LA) in support of the license application for the repository at Yucca Mountain, Nevada. This document is intended for use in postclosure analysis only. Bounding waste stream information and data were collected that capture probable limits. For commercially generated waste, this analysis considers alternative waste stream projections to bound the characteristics of wastes likely to be encountered using arrival scenarios that potentially impact the commercial spent nuclear fuel (CSNF) waste stream. For TSPA-LA, this radionuclide inventory analysis considers U.S. Department of Energy (DOE) high-level radioactive waste (DHLW) glass and two types of spent nuclear fuel (SNF): CSNF and DOE-owned (DSNF). These wastes are placed in two groups of waste packages: the CSNF waste package and the codisposal waste package (CDSP), which are designated to contain DHLW glass and DSNF, or DHLW glass only. The radionuclide inventory for naval SNF is provided separately in the classified ''Naval Nuclear Propulsion Program Technical Support Document'' for the License Application. As noted previously, the radionuclide inventory data presented here is intended only for TSPA-LA postclosure calculations. It is not applicable to preclosure safety calculations. Safe storage, transportation, and ultimate disposal of these wastes require safety analyses to support the design and licensing of repository equipment and facilities. These analyses will require radionuclide inventories to represent the radioactive source term that must be accommodated during handling, storage and disposition of these wastes. This analysis uses the best available information to identify the radionuclide inventory that is expected at the last year of last emplacement

  1. Citizen participation and citizen initiatives

    International Nuclear Information System (INIS)

    Matthoefer, H.

    1977-01-01

    Contents: Social conditions for citizen initiatives - technical change and employment - crisis behaviour - socio-psychological analysis of political planning; legitimation - presentation and criticism - conditions for citizen initiatives coming into being within the field of tension between citizen and administration - legal problems of citizen initiatives - environmental protection in the energy discussion; participation; models. (HP)

  2. NLM Emergency Access Initiative: FAQs

    Science.gov (United States)

    FAQ What is the Emergency Access Initiative? The Emergency Access Initiative (EAI) is a collaborative partnership between NLM and participating publishers to

  3. BOA, Beam Optics Analyzer A Particle-In-Cell Code

    International Nuclear Information System (INIS)

    Bui, Thuc

    2007-01-01

    The program was tasked with implementing time-dependent analysis of charged particles in an existing finite element code with adaptive meshing, called Beam Optics Analyzer (BOA). BOA was initially funded by a DOE Phase II program to use the finite element method with adaptive meshing to track particles in unstructured meshes. It uses modern programming techniques and state-of-the-art data structures, so that new methods, features and capabilities are easily added and maintained. This Phase II program was funded to implement plasma simulations in BOA and extend its capabilities to model thermal electrons, secondary emissions, and self magnetic field, and to implement more comprehensive post-processing and a feature-rich GUI. The program was successful in implementing thermal electrons, secondary emissions, and self magnetic field calculations. The BOA GUI was also upgraded significantly, and CCR is receiving interest from the microwave tube and semiconductor equipment industry in the code. Implementation of PIC analysis was partially successful. Computational resource requirements for modeling more than 2000 particles begin to exceed the capability of most readily available computers. Modern plasma analysis typically requires modeling approximately 2 million particles or more. The problem is that tracking many particles in an unstructured mesh that is adapting becomes inefficient; in particular, memory requirements become excessive. This probably makes particle tracking in unstructured meshes currently unfeasible with commonly available computer resources. Consequently, Calabazas Creek Research, Inc. is exploring hybrid codes where the electromagnetic fields are solved on the unstructured, adaptive mesh while particles are tracked on a fixed mesh. Efficient interpolation routines should be able to transfer information between nodes of the two meshes. If successfully developed, this could provide high accuracy and reasonable computational efficiency.

  4. Community Rates of Breastfeeding Initiation.

    Science.gov (United States)

    Grubesic, Tony H; Durbin, Kelly M

    2016-11-01

    Breastfeeding initiation rates vary considerably across racial and ethnic groups, maternal age, and education level, yet there are limited data concerning the influence of geography on community rates of breastfeeding initiation. This study aimed to describe how community rates of breastfeeding initiation vary in geographic space, highlighting "hot spots" and "cool spots" of initiation and exploring the potential connections between race, socioeconomic status, and urbanization levels on these patterns. Birth certificate data from the Kentucky Department of Health for 2004-2010 were combined with county-level geographic base files, Census 2010 demographic and socioeconomic data, and Rural-Urban Continuum Codes to conduct a spatial statistical analysis of community rates of breastfeeding initiation. Between 2004 and 2010, the average rate of breastfeeding initiation for Kentucky increased from 43.84% to 49.22%. Simultaneously, the number of counties identified as breastfeeding initiation hot spots also increased, displaying a systematic geographic pattern in doing so. Cool spots of breastfeeding initiation persisted in rural, Appalachian Kentucky. Spatial regression results suggested that unemployment, income, race, education, location, and the availability of International Board Certified Lactation Consultants are connected to breastfeeding initiation. Not only do spatial analytics facilitate the identification of breastfeeding initiation hot spots and cool spots, but they can be used to better understand the landscape of breastfeeding initiation and help target breastfeeding education and/or support efforts.

  5. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  6. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  7. Portable Programmable Multifunction Body Fluids Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both simple...

  8. Radiofrequency initiation and radiofrequency sustainment of laser initiated seeded high pressure plasma

    International Nuclear Information System (INIS)

    Paller, Eric S.; Scharer, John E.; Akhtar, Kamran; Kelly, Kurt; Ding, Guowen

    2001-01-01

    We examine radiofrequency initiation of high pressure (1-70 Torr) inductive plasma discharges in argon, nitrogen, air and organic seed gas mixtures. Millimeter wave interferometry, optical emission and antenna wave impedance measurements for double half-turn helix and helical inductive antennas are used to interpret the rf/plasma coupling, measure the densities in the range of 10^12 cm^-3 and analyze the ionization and excited states of the gas mixtures. We have also carried out 193 nm excimer laser initiation of an organic gas seed plasma which is sustained at higher pressures (150 Torr) by radiofrequency coupling at 2.8 kW power levels

  9. 21 CFR 870.3640 - Indirect pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indirect pacemaker generator function analyzer... Indirect pacemaker generator function analyzer. (a) Identification. An indirect pacemaker generator function analyzer is an electrically powered device that is used to determine pacemaker function or...

  10. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pacemaker generator function analyzer. 870.3630... (CONTINUED) MEDICAL DEVICES CARDIOVASCULAR DEVICES Cardiovascular Prosthetic Devices § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is...

  11. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrous oxide gas analyzer. 868.1700 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide...

  12. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement of...

  13. Hardware Realization of an Ethernet Packet Analyzer Search Engine

    Science.gov (United States)

    2000-06-30

    specific for the home automation industry. This analyzer will be at the gateway of a network and analyze Ethernet packets as they go by. It will keep... home automation and not the computer network. This system is a stand-alone real-time network analyzer capable of decoding Ethernet protocols. The

  14. Initial Analyses of Change Detection Capabilities and Data Redundancies in the Long Term Resource Monitoring Program

    National Research Council Canada - National Science Library

    Lubinski, Kenneth

    2001-01-01

    Evaluations of Long Term Resource Monitoring Program sampling designs for water quality, fish, aquatic vegetation, and macroinvertebrates were initiated in 1999 by analyzing data collected since 1992...

  15. Analyzing the Effects of Urban Combat on Daily Casualty Rates

    National Research Council Canada - National Science Library

    Yazilitas, Hakan

    2004-01-01

    .... The available data set contains measurements about the battles like initial strengths, daily casualties, terrain, front width, linear density, attacker's and defender's country, and armor losses...

  16. A Task-Based Approach to Analyzing Processes

    National Research Council Canada - National Science Library

    Stone, Brice

    1999-01-01

    As much of corporate America has embraced business process reengineering, the Government Performance and Results Act of 1993 and the Department of Defense Corporate Information Management Initiative...

  17. A multi-analyzer crystal spectrometer (MAX) for pulsed neutron sources

    International Nuclear Information System (INIS)

    Tajima, K.; Ishikawa, Y.; Kanai, K.; Windsor, C.G.; Tomiyoshi, S.

    1982-03-01

    The paper describes the principle and initial performance of a multi-analyzer crystal spectrometer (MAX) recently installed at the KENS spallation neutron source at Tsukuba. The spectrometer is able to make time-of-flight scans along a desired direction in reciprocal space, covering a wide range of energy transfers corresponding to the fifteen analyzer crystals. The constant-Q or constant-E modes of operation can be performed. The spectrometer is particularly suited for studying collective excitations such as phonons and magnons to high energy transfers using single crystal samples. (author)

  18. Time-reversal asymmetry: polarization and analyzing power in nuclear reactions

    International Nuclear Information System (INIS)

    Rioux, C.; Roy, R.; Slobodrian, R.J.; Conzett, H.E.

    1984-01-01

    Measurements of the proton polarization in the reactions ^7Li(^3He, p(pol))^9Be and ^9Be(^3He, p(pol))^11B and of the analyzing powers in the inverse reactions, initiated by polarized protons at the same center-of-mass energies, show significant differences. This implies the failure of the polarization-analyzing-power theorem and, prima facie, of time-reversal invariance in these reactions. The reaction ^2H(^3He, p(pol))^4He and its inverse have also been investigated and show smaller differences. A discussion of instrumental asymmetries is presented.

  19. The Decomposition of the Volatile Matter of Tanjung Enim Coal by Using a Thermogravimetry Analyzer (TGA)

    Directory of Open Access Journals (Sweden)

    Nukman Nukman

    2010-10-01

    Full Text Available Coal is a natural material and a kind of energy source. The decomposition of coal can be analyzed by heat treatment using a thermogravimetry analyzer. The decomposition of the volatile matter of three kinds of Tanjung Enim coal could thus be determined. The activation energy values found differ; for semi-anthracite, bituminous and sub-bituminous coal the initial temperatures are 60.8 °C, 70.7 °C and 97.8 °C, and the final temperatures are 893.8 °C, 832 °C and 584.6 °C, respectively.
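
    Decomposition kinetics of this kind are commonly described by the Arrhenius law, k = A exp(-Ea / (R T)). The sketch below uses the sub-bituminous onset and end temperatures quoted in the abstract, while the pre-exponential factor and activation energy are illustrative assumptions, not values from the study:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_rate(A, Ea_kJ, T_celsius):
    """Rate constant k = A * exp(-Ea / (R*T)), with T in Celsius converted to K."""
    T = T_celsius + 273.15
    return A * math.exp(-Ea_kJ * 1e3 / (R * T))

# Illustrative kinetic parameters (assumed, not taken from the paper)
A = 1e8    # pre-exponential factor, 1/s
Ea = 120.0 # activation energy, kJ/mol

# The rate constant rises steeply between the quoted onset and end temperatures
k_initial = arrhenius_rate(A, Ea, 97.8)   # sub-bituminous initial temperature
k_final   = arrhenius_rate(A, Ea, 584.6)  # sub-bituminous final temperature
```

    A lower activation energy gives a faster rate at the same temperature, which is how the differing Ea values for the three coals translate into different decomposition behavior.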

  20. Inflation with generalized initial conditions

    International Nuclear Information System (INIS)

    Albrecht, A.; Brandenberger, R.; Matzner, R.

    1987-01-01

    In many current models of the early Universe a scalar field phi which is only very weakly coupled to other quantum fields is used to generate inflation. In such models there are no forces which could thermalize the scalar field, and previous assumptions about its preinflation "initial" conditions must be abandoned. In this paper the onset of inflation is studied classically for more general initial conditions of the scalar field configuration. In particular, initial conditions with a nonvanishing spatial average of phi, with phi chosen at random in each initial horizon volume, and with random initial momenta are considered. We identify and discuss several mechanisms that can drive these more general initial conditions toward an inflationary state. The analysis is done in one spatial dimension.

  1. EXPERIENCES WITH IDEA PROMOTING INITIATIVES

    DEFF Research Database (Denmark)

    Gish, Liv

    2011-01-01

    In new product development a central activity is to provide new ideas. Over the last decades, industrial practice has gained experience with stimulating employee creativity and establishing idea promoting initiatives. Such initiatives are often labeled Idea Management, a research field... with a growing interest. In this paper I examine three different idea promoting initiatives carried out in Grundfos, a leading pump manufacturer. In the analysis I address what understandings of idea work are inscribed in the initiatives and what role these initiatives play in the organization with respect... understandings of idea work are inscribed in the idea promoting initiatives as they to some degree have to fit with the understandings embedded in practice in order to work...

  2. Tubular Initial Conditions and Ridge Formation

    Directory of Open Access Journals (Sweden)

    M. S. Borysova

    2013-01-01

    Full Text Available The 2D azimuth and rapidity structure of the two-particle correlations in relativistic A+A collisions is altered significantly by the presence of sharp inhomogeneities in the superdense matter formed in such processes. The causality constraints force one to associate the long-range longitudinal correlations observed in a narrow angular interval, the so-called "soft" ridge, with peculiarities of the initial conditions of the collision process. This study's objective is to analyze whether multiform initial tubular structures, undergoing subsequent hydrodynamic evolution and gradual decoupling, can form the soft ridges. Motivated by the flux-tube scenarios, the initial energy density distribution contains different numbers of high-density tube-like boost-invariant inclusions that form a bumpy structure in the transverse plane. The influence of various structures of such initial conditions in the most central A+A events on the collective evolution of matter, the resulting spectra, angular particle correlations and v_n coefficients is studied in the framework of the hydrokinetic model (HKM).

  3. The US proliferation security initiative (PSI)

    International Nuclear Information System (INIS)

    Gregoire, B.

    2004-01-01

    The Proliferation Security Initiative (PSI), launched by President Bush on May 31, 2003, aims at intercepting any transfer of weapons of mass destruction, their delivery vectors and related equipment, towards or coming from countries or organizations suspected of proliferation activity. This initiative, which involves coercive means to fight proliferation, raises questions of international lawfulness and legality whose answers are still under construction. This article analyzes the place of the European Union in the PSI, the means of action (optimization of existing means, cooperation between intelligence and interception services), and the stakes of the PSI (lawfulness with respect to international law, bilateral agreements, the draft boarding agreement, support from the United Nations, widening of the partnership and of the field of action). (J.S.)

  4. Dionex series 8000 on-line analyzer, Sequoyah Nuclear Power Plant. Final report

    International Nuclear Information System (INIS)

    1986-03-01

    This project was initiated to develop a custom-designed online water analyzer (ion chromatograph) for secondary water chemistry control in TVA's nuclear plants. This water analyzer development was conducted pursuant to a cooperative research agreement with the Dionex Corporation. Dionex developed and installed a dual channel, six stream analyzer on the secondary side of TVA's Sequoyah Nuclear Plant. The analyzer was developed for real time detection of sodium, chloride, and sulfate in any of the six sampling streams. The analyzer is providing Sequoyah's plant personnel with reliable secondary water chemistry data in a much more timely manner than the past grab sampling techniques. Results on the performance of the analyzer show that it is performing above and beyond the expectations of plant personnel. Since its installation at Sequoyah, there have been 29 units ordered from Dionex including 1 unit for Sequoyah, 5 units for Browns Ferry, and 23 units for other utilities. In the future, the analyzer will allow plant staffs to take corrective action before corrosive conditions occur or before having to derate a unit

  5. Initiation into Adolescent Marijuana Use.

    Science.gov (United States)

    Brook, Judith S.; And Others

    1980-01-01

    This longitudinal study examined the relationship of three domains (personality/attitudinal orientations, peer relationships, and family socialization factors) with initiation into adolescent marijuana use. (Author/DB)

  6. The Double Star Orbit Initial Value Problem

    Science.gov (United States)

    Hensley, Hagan

    2018-04-01

    Many precise algorithms exist to find a best-fit orbital solution for a double star system given a good enough initial value. Desmos is an online graphing calculator tool with extensive capabilities to support animations and defining functions. It can provide a useful visual means of analyzing double star data to arrive at a best guess approximation of the orbital solution. This is a necessary requirement before using a gradient-descent algorithm to find the best-fit orbital solution for a binary system.
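
    The role of the initial guess can be illustrated with a toy gradient-descent fit. This is a generic one-parameter least-squares sketch, not the actual orbital-element fit; the model (y = a*x) and the data are invented purely to show how descent refines an initial estimate:

```python
def gradient_descent_fit(xs, ys, a0, lr=0.01, steps=500):
    """Fit y = a*x by least squares, starting from the initial guess a0."""
    a = a0
    n = len(xs)
    for _ in range(steps):
        # gradient of the mean squared error with respect to a
        grad = sum(2 * (a * x - y) * x for x, y in zip(xs, ys)) / n
        a -= lr * grad
    return a

# Synthetic data generated by y = 1.5*x; descent recovers the slope from a0 = 1.0
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.5 * x for x in xs]
a_fit = gradient_descent_fit(xs, ys, a0=1.0)
```

    The orbital case works the same way in principle, only over several orbital elements at once, which is why a visually obtained best-guess starting point matters so much.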

  7. Automated detection of analyzable metaphase chromosome cells depicted on scanned digital microscopic images

    Science.gov (United States)

    Qiu, Yuchen; Wang, Xingwei; Chen, Xiaodong; Li, Yuhua; Liu, Hong; Li, Shibo; Zheng, Bin

    2010-02-01

    Visually searching for analyzable metaphase chromosome cells under microscopes is quite time-consuming and difficult. To improve detection efficiency, consistency, and diagnostic accuracy, an automated microscopic image scanning system was developed and tested to directly acquire digital images with sufficient spatial resolution for clinical diagnosis. A computer-aided detection (CAD) scheme was also developed and integrated into the image scanning system to search for and detect the regions of interest (ROI) that contain analyzable metaphase chromosome cells in the large volume of scanned images acquired from one specimen. Thus, the cytogeneticists only need to observe and interpret a limited number of ROIs. In this study, the high-resolution microscopic image scanning and CAD performance was investigated and evaluated using nine sets of images scanned from either bone marrow (three) or blood (six) specimens for diagnosis of leukemia. The automated CAD selection results were compared with the visual selection. In the experiment, the cytogeneticists first visually searched for the analyzable metaphase chromosome cells from specimens under microscopes. The specimens were also automatically scanned, and the CAD scheme was then applied to detect and save ROIs containing analyzable cells while deleting the others. The automatically selected ROIs were then examined by a panel of three cytogeneticists. From the scanned images, CAD selected more analyzable cells than the initial visual examinations by the cytogeneticists in both blood and bone marrow specimens. In general, CAD had higher performance in analyzing blood specimens. Even in the three bone marrow specimens, CAD selected 50, 22, and 9 ROIs, respectively. Besides matching the initial visual selection of 9, 7, and 5 analyzable cells in these three specimens, the cytogeneticists also selected 41, 15, and 4 new analyzable cells, which had been missed in the initial visual search. This experiment showed the feasibility of
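
    The bone marrow counts quoted in the abstract can be tallied directly: per specimen, the CAD-selected ROIs equal the visually matched cells plus the newly found ones, and CAD surfaced 60 additional analyzable cells overall.

```python
# Numbers from the three bone marrow specimens in the abstract
cad_rois     = [50, 22, 9]  # ROIs selected by the CAD scheme
visual_match = [9, 7, 5]    # cells matching the initial visual selection
newly_found  = [41, 15, 4]  # analyzable cells missed by the initial visual search

# Per specimen, matched + newly found accounts for every CAD-selected ROI
per_specimen = [v + n for v, n in zip(visual_match, newly_found)]
extra_total = sum(newly_found)  # 60 additional analyzable cells overall
```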

  8. Student initiative: A conceptual analysis

    Directory of Open Access Journals (Sweden)

    Polovina Nada

    2014-01-01

    Full Text Available In the description and scientific consideration of the attitude of children and youth towards their education and development, the concept of student initiative has been gaining ground lately, and it is hence the subject of analysis in this paper. The analysis is important because of the discrepancy between the increased efforts of the key educational policy holders to promote the idea about the importance of the development of student initiative and rare acceptance of this idea among theoreticians, researchers and practitioners dealing with the education and development of children and youth. By concretising the features of initiative student behaviour, our aim was, on the one hand, to observe the structural determinants and scientific status of the very concept of an initiative student, and, on the other, to contribute to the understanding of the initiative behaviour in practice. In the first part of the paper we deal with different notions and concretisations of the features of initiative behaviour of children and youth, which includes the consideration of: basic student initiative, academic student initiative, individual student initiative, the capacity for initiative and personal development initiative. In the second part of the paper, we discuss the relations of the concept of student initiative with the similar general concepts (activity/passivity, proactivity, agency and the concepts immediately related to school environment (student involvement, student participation. The results of our analysis indicate that the concept of student initiative has: particular features that differentiate it from similar concepts; the potential to reach the status of a scientific concept, bearing in mind the initial empirical specifications and general empirical verifiability of the yet unverified determinants of the concept. In the concluding part of the paper, we discuss the implications of the conceptual analysis for further research, as well as for

  9. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process covers several items, among others: precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
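
    In practice, the precision item of such a verification plan reduces to computing a coefficient of variation (CV) from replicate runs and comparing it with a limit chosen by the laboratory. A minimal sketch; the replicate values and the 2% limit are purely illustrative, not a standard from the paper:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: (sample SD / mean) * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical WBC replicates (10^9/L) from one control sample
replicates = [6.1, 6.2, 6.0, 6.1, 6.2]
cv = cv_percent(replicates)
passes = cv <= 2.0  # verification limit chosen here for illustration only
```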

  10. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    OpenAIRE

    Jaehyo Jung; Jihoon Lee; Siho Shin; Youn Tae Kim

    2017-01-01

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signa...

  11. Faraday cup for analyzing multi-ion plasma

    International Nuclear Information System (INIS)

    Fujita, Takao

    1987-01-01

    A compact and convenient ion analyzer (a kind of a Faraday cup) is developed in order to analyze weakly ionized multi-ion plasmas. This Faraday cup consists of three mesh electrodes and a movable ion collector. With a negative gate pulse superimposed on the ion retarding bias, ions are analyzed by means of time-of-flight. The identification of ion species and measurements of ion density and ion temperature are studied. (author)
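
    Time-of-flight identification rests on t = L / v with v = sqrt(2qV/m): after the gate pulse opens, heavier ions arrive at the collector later. A sketch with an assumed drift length and accelerating potential (not instrument values from the paper):

```python
import math

E_CHARGE = 1.602e-19  # elementary charge, C
AMU = 1.661e-27       # atomic mass unit, kg

def time_of_flight(mass_amu, charge=1, accel_volts=50.0, drift_m=0.2):
    """Drift time of an ion accelerated through accel_volts over drift_m."""
    m = mass_amu * AMU
    v = math.sqrt(2 * charge * E_CHARGE * accel_volts / m)
    return drift_m / v

t_N2 = time_of_flight(28)  # N2+ ion
t_Ar = time_of_flight(40)  # Ar+ ion, heavier so it arrives later
ratio = t_Ar / t_N2        # equals sqrt(40/28), independent of L and V
```

    Because the flight-time ratio depends only on the mass ratio, the species in a multi-ion plasma can be identified from arrival times even without knowing L and V precisely.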

  12. Career Technical Education Pathways Initiative

    Science.gov (United States)

    California Community Colleges, Chancellor's Office, 2013

    2013-01-01

    California's education system--the largest in the United States--is an essential resource for ensuring strong economic growth in the state. The Career Technical Education Pathways Initiative (referred to as the Initiative in this report), which became law in 2005, brings together community colleges, K-12 school districts, employers, organized…

  13. The metabolomics standards initiative (MSI)

    NARCIS (Netherlands)

    Fiehn, O.; Robertson, D.; Griffin, J.; Werf, M. van der; Nikolau, B.; Morrison, N.; Sumner, L.W.; Goodacre, R.; Hardy, N.W.; Taylor, C.; Fostel, J.; Kristal, B.; Kaddurah-Daouk, R.; Mendes, P.; Ommen, B. van; Lindon, J.C.; Sansone, S.-A.

    2007-01-01

    In 2005, the Metabolomics Standards Initiative has been formed. An outline and general introduction is provided to inform about the history, structure, working plan and intentions of this initiative. Comments on any of the suggested minimal reporting standards are welcome to be sent to the open

  14. Behavioral Initiatives in Broad Perspective.

    Science.gov (United States)

    California Univ., Los Angeles. Center for Mental Health in Schools.

    This booklet is a technical assistance sampler addressing the issues of student misbehavior, discipline problems, and behavioral initiatives. The term behavioral initiative is defined, disciplining children with disabilities is discussed, and a cautionary note concerning ignoring students' reasons for misbehavior is presented. A brief entitled…

  15. Local initiative extrapolated to nation

    DEFF Research Database (Denmark)

    Wittchen, Kim Bjarne; Kragh, Jesper; Brøgger, Morten

    In the municipality of Sønderborg, in the southern part of Jutland, there is a shining example initiated in 2007, ProjectZero, of a local initiative that has resulted in extensive energy savings in residential buildings and at the same time created local workplaces. The intention with the pilot

  16. Northern Eurasia Future Initiative (NEFI)

    DEFF Research Database (Denmark)

    Groisman, Pavel; Shugart, Herman; Kicklighter, David

    2017-01-01

    The Northern Eurasia Future Initiative (NEFI) has been designed as an essential continuation of the Northern Eurasia Earth Science Partnership Initiative (NEESPI), which was launched in 2004. NEESPI sought to elucidate all aspects of ongoing environmental change, to inform societies and, thus, to better...

  17. Comprehensive School Safety Initiative Report

    Science.gov (United States)

    National Institute of Justice, 2014

    2014-01-01

    The National Institute of Justice (NIJ) developed the Comprehensive School Safety Initiative in consultation with federal partners and Congress. It is a research-focused initiative designed to increase the safety of schools nationwide through the development of knowledge regarding the most effective and sustainable school safety interventions and…

  18. Initial conditions for chaotic inflation

    International Nuclear Information System (INIS)

    Brandenberger, R.; Kung, J.; Feldman, H.

    1991-01-01

    In contrast to many other inflationary Universe models, chaotic inflation does not depend on fine-tuned initial conditions. Within the context of linear perturbation theory, it is shown that chaotic inflation is stable against both metric and matter perturbations. Neglecting gravitational perturbations, it is shown that chaotic inflation is an attractor in initial condition space. (orig.)

  19. Self-initiated expatriate academics

    DEFF Research Database (Denmark)

    Selmer, Jan; Lauring, Jakob

    2013-01-01

    In this chapter we examine self-initiated expatriate academics. Universities are to an increasing extent looking for talent beyond national boundaries. Accordingly, self-initiated expatriate academics represent a fast growing group of highly educated professionals who gain employment abroad...

  20. LG-2 Scintrex manual. Fluorescence analyzer

    International Nuclear Information System (INIS)

    Pirelli, H.

    1987-01-01

    The Scintrex LG-2 Fluorescence Analyzer selectively detects the presence of certain fluorescent minerals through induced UV photoluminescence and provides quantitative information on their distribution.

  1. Polarized 3He gas circulating technologies for neutron analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Watt, David W. [Xemed, LLC, Durham, NH (United States)

    2017-10-02

    We outline our project to develop a circulating polarized helium-3 system for developing large, quasi-continuously operating neutron analyzers. The project consisted of five areas: 1) Development of robust external-cavity narrowed diode laser output with spectral line width < 0.17 nm and power of 2000 W. 2) Development of large glass polarizing cells using cell surface treatments to obtain long relaxation lifetimes. 3) Refinements of the circulation system with an emphasis on gas purification and materials testing. 4) Design/fabrication of a new polarizer system. 5) Preliminary testing of the new polarizer. 1. Developed robust high-power narrowed laser. The optical configuration of the laser was discussed in the proposal and is reviewed in the body of this report. The external cavity is configured to mutually lock the wavelength of five 10-bar laser stacks. All the logistical milestones were met, and critical subsystems (laser stack manifold and power divider, external laser cavity, and output telescope) were assembled and tested at low power. Each individual bar is narrowed to ~0.05 nm; when combined, the laser has a cumulative spectral width of 0.17 nm across the entire beam due to variations of the bars' central wavelengths by +/- 0.1 nm, which is similar to that of Volume Bragg Grating narrowed laser bars. This configuration eliminates the free-running "pedestal" that occurs in other external-cavity diode lasers. The full-scale laser was completed in 2016 and was used in both the older and newer helium polarizers. This laser was operated at 75% power for periods of up to 8 hours. Once installed, the spectrum became slightly broader (~0.25 nm) at full power; this is likely due to very slight misalignments that occurred during handling. 2. Developed the processes to create uniform sintered sol-gel coatings. Our work on cell development comprised: 1) Production of large GE180 cells and exploration of different means of cell preparation, and 2) Development of

  2. The initiating events in the Loviisa nuclear power plant history

    International Nuclear Information System (INIS)

    Sjoblom, K.

    1987-01-01

    During the 16 reactor years of Loviisa nuclear power plant operation no serious incident has endangered the high level of safety. The initiating events of plant incidents have been analyzed in order to get a view of plant operational safety experience. The initiating events have been placed in categories similar to those that EPRI uses. However, because of the very small number of scrams the study was extended to also cover transients with a relatively low safety importance in order to get more comprehensive statistics. Human errors, which contributed to 15% of the transients, were a special subject in this study. The conditions under which human failures occurred, and the nature and root causes of the human failures that caused the initiating events were analyzed. For future analyses it was noticed that it would be beneficial to analyze incidents immediately, to consult with the persons directly involved and to develop an international standard format for incident analyses

  3. 40 CFR 91.314 - Analyzer accuracy and specifications.

    Science.gov (United States)

    2010-07-01

    .... (3) Zero drift. The analyzer zero-response drift during a one-hour period must be less than two percent of full-scale chart deflection on the lowest range used. The zero-response is defined as the mean... calibration or span gas. (2) Noise. The analyzer peak-to-peak response to zero and calibration or span gases...

  4. A data mining approach to analyze occupant behavior motivation

    NARCIS (Netherlands)

    Ren, X.; Zhao, Y.; Zeiler, W.; Boxem, G.; Li, T.

    2017-01-01

    Occupants' behavior can have a significant impact on the performance of the built environment. Methods of analyzing people's behavior have not been adequately developed. The traditional methods such as surveys or interviews are not efficient. This study proposed a data-driven method to analyze the

  5. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort needed for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  6. Control of a pulse height analyzer using an RDX workstation

    International Nuclear Information System (INIS)

    Montelongo, S.; Hunt, D.N.

    1984-12-01

    The Nuclear Chemistry Division of Lawrence Livermore National Laboratory is in the midst of upgrading its radiation counting facilities to automate data acquisition and quality control. This upgrade requires control of a pulse height analyzer (PHA) from an interactive LSI-11/23 workstation running RSX-11M. The PHA is a microcomputer-based multichannel analyzer system providing data acquisition, storage, display, manipulation and input/output from up to four independent acquisition interfaces. Control of the analyzer includes reading and writing energy spectra, issuing commands, and servicing device interrupts. The analyzer communicates with the host system over a 9600-baud serial line using the Digital Data Communications Message Protocol (DDCMP), a link-level protocol. We relieved the RSX workstation CPU of the DDCMP overhead by implementing a DEC-compatible, in-house-designed DMA serial line board (the ISL-11) to communicate with the analyzer. An RSX I/O device driver was written to complete the path between the analyzer and the RSX system by providing the link between the communication board and an application task. The I/O driver is written to handle several ISL-11 cards operating in parallel, thus providing support for control of multiple analyzers from a single workstation. The RSX device driver, its design and use by application code controlling the analyzer, and its operating environment will be discussed
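
    DDCMP protects its header and data fields with CRC-16 (polynomial x^16 + x^15 + x^2 + 1). A minimal bit-reflected implementation of that CRC, given here as a generic illustration of the link-level error checking rather than code from the driver described above:

```python
def crc16(data: bytes, poly=0xA001, init=0x0000):
    """CRC-16 with polynomial x^16 + x^15 + x^2 + 1 (0x8005, reflected 0xA001)."""
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ poly  # shift out LSB, fold in the polynomial
            else:
                crc >>= 1
    return crc

# Standard CRC-16/ARC check value for the nine-byte test string
checksum = crc16(b"123456789")  # 0xBB3D
```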

  7. Analyzing FCS Professionals in Higher Education: A Case Study

    Science.gov (United States)

    Hall, Scott S.; Harden, Amy; Pucciarelli, Deanna L.

    2016-01-01

    A national study of family and consumer sciences (FCS) professionals in higher education was analyzed as a case study to illustrate procedures useful for investigating issues related to FCS. The authors analyzed response rates of more than 1,900 FCS faculty and administrators by comparing those invited to participate and the 345 individuals who…

  8. A multichannel analyzer computer system for simultaneously measuring 64 spectra

    International Nuclear Information System (INIS)

    Jin Yuheng; Wan Yuqing; Zhang Jiahong; Li Li; Chen Guozhu

    2000-01-01

    The author introduces a multichannel analyzer computer system for simultaneously measuring 64 spectra with 64 independent coded inputs. The system is developed for a double-chopper neutron scattering time-of-flight spectrometer. The system structure, coding method, operating principle and performance are presented. The system can also be used for other nuclear physics experiments which need a multichannel analyzer with independent coded inputs
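
    The idea of coded independent inputs can be sketched as routing each (channel code, time bin) event into its own histogram. The 64-channel count comes from the abstract; the bin count and the event stream below are invented for illustration:

```python
def accumulate(events, n_channels=64, n_bins=1024):
    """Build one time-of-flight spectrum per coded input channel."""
    spectra = [[0] * n_bins for _ in range(n_channels)]
    for channel, tof_bin in events:
        spectra[channel][tof_bin] += 1  # one count in that channel's TOF bin
    return spectra

# Invented events: (6-bit channel code, TOF bin index)
events = [(0, 10), (0, 10), (63, 500), (17, 3)]
spectra = accumulate(events)
```

    Because each event carries its channel code, all 64 spectra accumulate in parallel from a single event stream, which is what "simultaneously measuring 64 spectra" amounts to.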

  9. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with a computer-based multi-channel analyzer, a new kind of browser-based multi-channel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed

  10. A Morphological Analyzer for Vocalized or Not Vocalized Arabic Language

    Science.gov (United States)

    El Amine Abderrahim, Med; Breksi Reguig, Fethi

    This research shows the realization of a morphological analyzer for the Arabic language (vocalized or not vocalized). This analyzer is based upon our object model for Arabic Natural Language Processing (NLP) and can be exploited by NLP applications such as machine translation, orthographical correction and the search for information.

  11. Analyzing Population Genetics Data: A Comparison of the Software

    Science.gov (United States)

    Choosing a software program for analyzing population genetic data can be a challenge without prior knowledge of the methods used by each program. There are numerous web sites listing programs by type of data analyzed, type of analyses performed, or other criteria. Even with programs categorized in ...

  12. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 90.318 Section 90.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the...

  13. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 91.318 Section 91.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of...

  14. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 89.321 Section 89.321 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The chemiluminescent...

  15. THE EXPERIENCE OF COMPARISON OF STATIC SECURITY CODE ANALYZERS

    Directory of Open Access Journals (Sweden)

    Alexey Markov

    2015-09-01

    Full Text Available This work presents a methodological approach to the comparison of static security code analyzers. It substantiates comparing static analyzers on the efficiency and functionality indicators stipulated in international regulatory documents. The test data for assessing the efficiency of static analyzers consist of synthetic sets of open-source software that contain vulnerabilities. We substantiated criteria for quality assessment of static security code analyzers subject to the standards NIST SP 500-268 and SATEC. We carried out experiments that allowed us to assess a number of Russian proprietary software tools and open-source tools. We concluded that it is of paramount importance to develop a Russian regulatory framework for testing software security (firstly, for controlling undocumented features) and for evaluating the quality of static security code analyzers.

  16. Initial Assessment of Whiplash Patients

    Directory of Open Access Journals (Sweden)

    R Gunzburg

    2003-01-01

    Full Text Available The article looks at how, for severe trauma, the outcome of treatment depends on the initial medical care. This is now also accepted for whiplash-associated disorders, underlining the importance of a proper initial assessment. Once major injury has been excluded and the diagnosis of whiplash-associated disorder has been established, initial treatment of whiplash in the emergency room can be started. The four key points to remember are described: reassuring the patient about the likely evolution, no soft collar, nonsteroidal anti-inflammatory drugs, and early mobilisation.

  17. Canada's family violence initiative: partnerships

    Directory of Open Access Journals (Sweden)

    Elaine Scott

    1994-01-01

    Full Text Available Under Canada's four-year, $136 million Family Violence Initiative, the federal government is calling upon all Canadians to work in partnerships towards the elimination of family violence - child abuse, violence against women, and elder (senior abuse. Family violence is a complex problem and requires the efforts of all Canadians to resolve it. One of the key themes of the Initiative - a multidisciplinary approach to the problem of family violence - is reflected in the selection and development of projects. Activities funded by the seven federal departments and agencies involved in the Initiative emphasize partnerships with the professional, voluntary, corporate, non-government and government sectors.

  18. Learning from the Ilulissat Initiative

    DEFF Research Database (Denmark)

    Rahbek-Clemmensen, Jon; Thomasen, Gry

    In May 2018, ten years will have passed since the representatives from the five Arctic coastal states (Canada, Denmark, Norway, Russia and the United States) and the Home Rule government of Greenland met in Ilulissat. Learning from the Ilulissat Initiative examines how the initiative came about, and how it has affected...... strong economic interests in maintaining peaceful Arctic relations. Northern forums therefore give policymakers a rare opportunity to meet, communicate, and influence a key region. The Ilulissat meeting was the result of a joint Danish‒Greenlandic initiative and is often hailed as one of the most...

  19. Research on Initiation Sensitivity of Solid Explosive and Planer Initiation System

    Directory of Open Access Journals (Sweden)

    N Matsuo

    2016-09-01

    Full Text Available Firstly, many techniques are now demanded for complex processing; various explosive initiation methods and highly accurate control of detonation are needed. In this research, attention is focused on metal foil explosion using high current as a method to obtain linear or planar initiation easily, and the main evaluation of metal foil explosion for initiating an explosive was conducted. The explosion power was evaluated by optically observing the underwater shock wave generated by the metal foil explosion. Secondly, high-energy explosive processing has several applications, such as shock compaction, explosive welding, food processing and explosive forming. In these explosive applications, a highly sensitive explosive has mainly been used. A highly sensitive explosive is dangerous, since it can explode suddenly. So, in developing explosives, safety is the most important consideration, along with low manufacturing cost and explosive characteristics. In this work, we have focused on the initiation sensitivity of a solid explosive and performed numerical analysis of sympathetic detonation. The numerical analysis is calculated with LS-DYNA 3D (a commercial code). To understand the initiation reaction of an explosive, the Lee-Tarver equation was used, and the impact detonation process was analyzed by an ALE code. The configuration of the simulation model is a quarter of a circular cylinder. The donor explosive (SEP) was used as the initiating explosive. When the donor explosive is detonated, a shock wave is generated and propagates through PMMA, air and metallic layers in order. While passing through the layers, the shock wave is attenuated and finally influences the acceptor explosive, Comp. B. Here, we evaluate the initiation of the acceptor explosive and discuss the detonation pressure, the reaction rate of the acceptor explosive and the attenuation of the impact pressure.

  20. Construction of spacetimes from initial data

    International Nuclear Information System (INIS)

    Isenberg, J.A.

    1979-01-01

    As relativistic effects become more accessible to physical experiment and observation, it becomes important to be able to theoretically analyze the behavior of relativistic model systems designed to incorporate such measurable effects. This dissertation describes in detail the initial value (IV) procedure for carrying out such analyses (i.e., for ''building spacetimes''). We report progress--of the author as well as others--in all of these areas: (1) The generalized Bergmann-Dirac (BD) procedure can be used to systematically translate any theory into 3+1 form. (2) The York procedure turns the constraints of Einstein's theory into a set of four elliptic equations for four unknowns (with the rest of the initial data ''relatively free''). (3) The maximal and K-foliation schemes appear to give preferred kinematics for the generic spacetimes one might build. We discuss the sense in which these foliations are preferred, and compare them with others. We then show how to find maximal and K-surfaces, both in a given spacetime (e.g. Schwarzschild) and in one being built from scratch. (4) Many physically interesting systems have symmetries which considerably simplify the equations. After discussing how, in general, one can build symmetries into initial data, and how one can use them to simplify the analysis, we look at a particular example symmetry: spacetimes with two space-like translation Killing Vectors. (''2T'')

  1. Subgingival temperature and microbiota in initial periodontitis.

    Science.gov (United States)

    Maiden, M F; Tanner, A C; Macuch, P J; Murray, L; Kent, R L

    1998-10-01

    The association between subgingival temperature, other clinical characteristics, and the subgingival microbiota was examined in adult subjects with initial periodontitis and differing levels of gingival inflammation. 43 subjects were measured at 6 sites per tooth for pocket depth, attachment level, presence of plaque, gingival redness, bleeding on probing and subgingival temperature at 3-month intervals for 1 year. Subgingival plaque was sampled from 15 initial active periodontitis sites (10 subjects), 121 gingivitis sites (20 subjects) and 202 healthy sites (13 subjects), and included the 5 hottest and 5 coldest sites in each subject. Plaque samples were analyzed for 13 subgingival species using whole-genomic DNA probes. The major influences on the subgingival microbiota were the clinical status of sites, pocket depth, and the presence of supragingival plaque. No significant association between species and site temperature was observed. Initial active sites were associated with Bacteroides forsythus and Campylobacter rectus, and had a higher mean subgingival temperature and deeper mean pocket depth than inactive sites. A weak association between pocket depth and site temperature was noted. The major influence on subgingival temperature of sites was the anterior-to-posterior anatomical temperature gradient in the mandible and maxilla.

  2. The Luxembourg Space Resources Initiative

    Science.gov (United States)

    Link, M.

    2017-09-01

    This keynote talk by M. Link from the Directorate of ICT and Space Affairs, Ministry of the Economy, The Government of the Grand Duchy of Luxembourg, will provide an overview of Luxembourg's in-space resource utilization initiative.

  3. Integrated landscape initiatives in Europe

    DEFF Research Database (Denmark)

    García-Martín, María; Bieling, Claudia; Hart, Abigail

    2016-01-01

    Landscapes are linked to human well-being in a multitude of ways, some of which are challenged by global market forces and traditional management approaches. In response to this situation there has been a rise in local initiatives to sustain the values of landscape. The aim of this paper is to pr...... searches and canvassing of European umbrella organisations; followed by an online survey of representatives from the identified initiatives (n = 71). Our results show that the most relevant characteristics of integrated landscape initiatives in Europe are: a holistic approach to landscape management...

  4. FY 10 Multifamily Initial Endorsements

    Data.gov (United States)

    Department of Housing and Urban Development — In FY 2010, HUD's Multifamily's 18 Hubs initially endorsed 1011 loans totaling $11.3 billion and providing 170,672 units/ beds. FY 10's $11.3 billion is the highest...

  5. Smart roadside initiative : user manual.

    Science.gov (United States)

    2015-09-01

    This document provides the user instructions for the Smart Roadside Initiative (SRI) applications including mobile and web-based SRI applications. These applications include smartphone-enabled information exchange and notification, and software compo...

  6. Smart roadside initiative : final report.

    Science.gov (United States)

    2015-09-01

    This is the Final Report for the Smart Roadside Initiative (SRI) prototype system deployment project. The SRI prototype was implemented at weigh stations in Grass Lake, Michigan and West Friendship, Maryland. The prototype was developed to integrate ...

  7. Global Peace Operations Initiative (GPOI)

    National Research Council Canada - National Science Library

    Kim-Mitchell, Elena

    2005-01-01

    .... In terms of manpower, the initiative aims at deploying 75,000 peace support operations (PSO) troops worldwide over the next 5 years, primarily to Africa, but also to Latin America, Europe, and Asia...

  8. Experimental analysis of a new retarding field energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Yu-Xiang [Shanghai Institute of Mechanical and Electrical Engineering, No. 3888, Yuanjiang Road, Minhang District, Shanghai 201109 (China); Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Shu-Qing; Li, Xian-Xia; Shen, Hong-Li; Huang, Ming-Guang [Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Pu-Kun, E-mail: pkliu@pku.edu.cn [School of Electronics Engineering and Computer Science, Peking University, No. 5, Yiheyuan Road, Haidian District, Beijing 100871 (China)

    2015-06-11

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). This analyzer has an aperture plate to sample electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depressed collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated enough to escape. The direct-current (DC) beams of the K-band space TWTs with the MDC removed can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The slope in the current curve caused by the reflection of secondary electrons on the copper collector of the analyzer is discussed. The experimental analysis shows this RFEA has a good energy resolution that satisfies the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The reflection effect of secondary electrons on the copper collector of the analyzer is discussed.

  9. Canada's family violence initiative: partnerships

    OpenAIRE

    Scott,Elaine

    1994-01-01

    Under Canada's four-year, $136 million Family Violence Initiative, the federal government is calling upon all Canadians to work in partnerships towards the elimination of family violence - child abuse, violence against women, and elder (senior) abuse. Family violence is a complex problem and requires the efforts of all Canadians to resolve it. One of the key themes of the Initiative - a multidisciplinary approach to the problem of family violence - is reflected in the selection and developmen...

  10. Student initiative: A conceptual analysis

    OpenAIRE

    Polovina Nada

    2014-01-01

    In the description and scientific consideration of the attitude of children and youth towards their education and development, the concept of student initiative has been gaining ground lately, and it is hence the subject of analysis in this paper. The analysis is important because of the discrepancy between the increased efforts of the key educational policy holders to promote the idea about the importance of the development of student initiative and rare a...

  11. Analysis and discussion on the experimental data of electrolyte analyzer

    Science.gov (United States)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

    In the subsequent verification of electrolyte analyzers, we found that an instrument can achieve good repeatability and stability in repeated measurements over a short period of time, in line with the verification regulation's requirements for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results of different manufacturers differ greatly. To find and solve this problem, to help enterprises improve product quality, and to obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.
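The verification statistics described in this record (repeatability and indication error against a reference) can be sketched as follows; the readings and the 5.00 mmol/L reference value are hypothetical illustrations, not data from the study:

```python
import statistics

def indication_error(readings, reference):
    """Relative indication error (%) of the mean reading versus a
    certified reference value, plus repeatability as the sample
    standard deviation of repeated measurements."""
    mean = statistics.mean(readings)
    error_pct = 100.0 * (mean - reference) / reference
    repeatability = statistics.stdev(readings)
    return mean, error_pct, repeatability

# Ten hypothetical repeated K+ measurements (mmol/L) against a 5.00 mmol/L reference
readings = [5.21, 5.19, 5.22, 5.20, 5.18, 5.21, 5.20, 5.19, 5.22, 5.20]
mean, err, rep = indication_error(readings, 5.00)
print(f"mean={mean:.3f}  indication error={err:+.2f}%  repeatability={rep:.3f}")
```

Good repeatability (small standard deviation) can coexist with a large indication error (biased mean), which is the pattern the record reports.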

  12. Transit time spreads in biased paracentric hemispherical deflection analyzers

    International Nuclear Information System (INIS)

    Sise, Omer; Zouros, Theo J.M.

    2016-01-01

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  13. Transit time spreads in biased paracentric hemispherical deflection analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Sise, Omer, E-mail: omersise@sdu.edu.tr [Dept. of Science Education, Faculty of Education, Suleyman Demirel Univ., 32260 Isparta (Turkey); Zouros, Theo J.M. [Dept. of Physics, Univ. of Crete, P.O. Box 2208, GR 71003 Heraklion (Greece); Tandem Lab, INPP, NCSR Demokritos, P.O. Box 60228, GR 15310 Ag. Paraskevi (Greece)

    2016-02-15

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  14. Transit time spreads in biased paracentric hemispherical deflection analyzers

    Science.gov (United States)

    Sise, Omer; Zouros, Theo J. M.

    2016-02-01

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  15. Emergency response training with the BNL plant analyzer

    International Nuclear Information System (INIS)

    Cheng, H.S.; Guppy, J.G.; Mallen, A.N.; Wulff, W.

    1987-01-01

    Presented is the experience in using the BNL Plant Analyzer for NRC emergency response training on simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for emergency response training are summarized. A closed-loop simulation of all the key systems of the power plant in question was found essential to the realism of the emergency drills conducted at NRC. The faster-than-real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of emergency response training.

  16. Teachers’ perceptions of their own initiative: Collective initiative vs. personal initiative

    Directory of Open Access Journals (Sweden)

    Džinović Vladimir

    2013-01-01

    Full Text Available Current trends in education demand from teachers to exhibit proactive behaviour and assume responsibility for the implementation of changes in school practice. In that sense, it is important to study how teachers perceive their own initiative and to gain insight into the activities where such initiative is demonstrated. This study has been conceived as a mixed-methods research. The qualitative study implied forming four focus groups with subject teachers and class teachers (N=38, while the quantitative study entailed surveying 1441 teachers in forty primary schools in Serbia using the questionnaire constructed based on qualitative data. Data from focus groups were processed by qualitative thematic analysis, while the questionnaire data were processed by principal component analysis and univariate analysis of variance. The findings of the study have shown that teachers mostly demonstrate initiative through cooperative activities that include planning of joint teaching as well as conducting joint projects within school and with the local community actors. Teachers are least ready to demonstrate personal initiative and the initiative aimed at accomplishing considerable changes in school work. The concluding part includes the recommendations for encouraging teachers’ personal initiative and building organizational culture that would support such initiative. [Project of the Ministry of Science of the Republic of Serbia, no. 47008: Improving the Quality and Accessibility of Education in the Modernization Processes of Serbia, and no. 179034: From Encouraging Initiative, Cooperation and Creativity in Education to New Roles and Identities in Society

  17. An Additional Method for Analyzing the Reversible Inhibition of an Enzyme Using Acid Phosphatase as a Model

    OpenAIRE

    Baumhardt, Jordan M.; Dorsey, Benjamin M.; McLauchlan, Craig C.; Jones, Marjorie A.

    2015-01-01

    Using wheat germ acid phosphatase and sodium orthovanadate as a competitive inhibitor, a novel method for analyzing reversible inhibition was carried out. Our alternative approach involves plotting the initial velocity at which product is formed as a function of the ratio of substrate concentration to inhibitor concentration at a constant enzyme concentration and constant assay conditions. The concept of initial concentrations driving equilibrium leads to the chosen axes. Three apparent const...
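The plotting approach this record describes (initial velocity as a function of the [S]/[I] ratio) can be sketched numerically, assuming standard Michaelis-Menten kinetics with a competitive inhibitor; the Vmax, Km and Ki values below are illustrative, not the paper's measured constants:

```python
def initial_velocity(s, i, vmax=100.0, km=2.0, ki=0.5):
    """Michaelis-Menten rate in the presence of a competitive inhibitor:
    v0 = Vmax*[S] / (Km*(1 + [I]/Ki) + [S]). Constants are illustrative."""
    return vmax * s / (km * (1.0 + i / ki) + s)

# Fix the inhibitor concentration and vary the [S]/[I] ratio,
# mirroring the record's choice of axes (v0 vs. [S]/[I]).
inhibitor = 1.0
ratios = [0.5, 1, 2, 5, 10, 20]
velocities = [initial_velocity(r * inhibitor, inhibitor) for r in ratios]

for r, v in zip(ratios, velocities):
    print(f"[S]/[I] = {r:5.1f}  ->  v0 = {v:6.2f}")
```

Because the inhibition is competitive, v0 rises monotonically with the [S]/[I] ratio and saturates toward Vmax, which is what makes this ratio a natural x-axis.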

  18. Concept of national legislative initiative and its types

    Directory of Open Access Journals (Sweden)

    А. Л. Крутько

    2015-11-01

    Full Text Available The national legislative initiative is a new instrument for expressing the popular will compared to other forms of direct democracy. In most developed democracies this institution is regulated at the constitutional or legislative level, but in modern Ukraine its constitutional legal regulation is absent, owing to disregard of its possibilities and a lack of understanding of its essence. Paper objective. This article aims to analyze in detail the definition of «national legislative initiative» and to determine its basic types according to theoretical insights and current foreign law. Recent research and publications analysis. Domestic and foreign scholars such as V.N. Rudenko, O.M. Mudra, V.M. Shapoval, V.F. Nesterovich, J. F. Zimmerman and others have studied the institution of the national legislative initiative; their works were foundational at the time of writing. Paper main body. With the help of a large definition dictionary and a new encyclopedic dictionary, the etymology of the concept «initiative», characterized as the basis, was established, along with the meanings of «legislative initiative», «national initiative» and «national legislative initiative». It was argued that «national initiative» cannot be identified with «national legislative initiative». The current definitions of the national legislative initiative are analyzed in the article. It is noted that the suggested terms are limited to identifying the institute's apparent indicators and withhold its essence. This is precisely why four types of realization of the national legislative initiative are briefly examined, for a complete determination of the definition; these types depend on what role the legislator assigns to the citizens, who are the main actors of the initiative. On the basis of this analysis the author provides his own definition of «the national legislative initiative». The author notes that the proposed definition was not

  19. Intermittency in multiparticle production analyzed by means of stochastic theories

    International Nuclear Information System (INIS)

    Bartl, A.; Suzuki, N.

    1990-01-01

    Intermittency in multiparticle production is described by means of probability distributions derived from pure birth stochastic equations. The UA1, TASSO, NA22 and cosmic ray data are analyzed. 24 refs., 1 fig. (Authors)
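As a hedged illustration of the distributions involved: a pure-birth (Yule-Furry) process started from a single particle yields a geometric multiplicity distribution, and the normalized factorial moments, the usual intermittency observables, can be evaluated from it numerically. The mean multiplicity below is illustrative:

```python
import math

def geometric_pmf(n, mean):
    """Multiplicity distribution of a Yule-Furry (pure birth) process
    started from a single particle: P(n) = (1/m) * (1 - 1/m)**(n-1), n >= 1."""
    p = 1.0 / mean
    return p * (1.0 - p) ** (n - 1)

def normalized_factorial_moment(q, mean, n_max=10000):
    """F_q = <n(n-1)...(n-q+1)> / <n>**q, evaluated by direct summation."""
    num = sum(
        geometric_pmf(n, mean) * math.prod(range(n - q + 1, n + 1))
        for n in range(q, n_max + 1)
    )
    return num / mean ** q

mean = 20.0
# Analytic value for the geometric distribution: F_2 = 2*(1 - 1/mean)
print(normalized_factorial_moment(2, mean))
```

For a Poisson distribution all F_q equal 1, so F_2 approaching 2 here signals the broader-than-Poisson fluctuations that intermittency analyses probe.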

  20. Automated Real-Time Clearance Analyzer (ARCA), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The Automated Real-Time Clearance Analyzer (ARCA) addresses the future safety need for Real-Time System-Wide Safety Assurance (RSSA) in aviation and progressively...

  1. Triple Isotope Water Analyzer for Extraplanetary Studies, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to employ Off-Axis ICOS to develop triple-isotope water analyzers for lunar and other extraplanetary exploration. This instrument...

  2. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

    This paper analyzes the root causes of safety-related software faults. Software faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  3. Mini Total Organic Carbon Analyzer (miniTOCA)

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this development is to create a prototype hand-held, 1 to 2 liter size battery-powered Total Organic Carbon Analyzer (TOCA). The majority of...

  4. The quality infrastructure measuring, analyzing, and improving library services

    CERN Document Server

    Murphy, Sarah Anne

    2013-01-01

    Summarizing specific tools for measuring service quality alongside tips for using these tools most effectively, this book helps libraries of all kinds take a programmatic approach to measuring, analyzing, and improving library services.

  5. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    International Nuclear Information System (INIS)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V.

    1994-01-01

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented

  6. Analyzing radial acceleration with a smartphone acceleration sensor

    Science.gov (United States)

    Vogt, Patrik; Kuhn, Jochen

    2013-03-01

    This paper continues the sequence of experiments using the acceleration sensor of smartphones (for description of the function and the use of the acceleration sensor, see Ref. 1) within this column, in this case for analyzing the radial acceleration.
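For reference, the quantity being analyzed in this record is the centripetal (radial) acceleration a = ω²r = (2π/T)²r. A minimal sketch with illustrative values:

```python
import math

def radial_acceleration(radius_m, period_s):
    """Centripetal acceleration a = omega**2 * r = (2*pi/T)**2 * r,
    the quantity a smartphone's accelerometer reports when mounted
    at radius r on a platform rotating with period T."""
    omega = 2.0 * math.pi / period_s
    return omega ** 2 * radius_m

# e.g. a phone 10 cm from the axis, one revolution per second
print(radial_acceleration(0.10, 1.0))
```

Doubling the radius at a fixed period doubles the acceleration, which is the linear dependence such smartphone experiments typically verify.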

  7. The Photo-Pneumatic CO2 Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  8. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    Energy Technology Data Exchange (ETDEWEB)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V. [State Research Center, Kiev (Ukraine)

    1994-12-31

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented.

  9. Methyl-Analyzer--whole genome DNA methylation profiling.

    Science.gov (United States)

    Xin, Yurong; Ge, Yongchao; Haghighi, Fatemeh G

    2011-08-15

    Methyl-Analyzer is a python package that analyzes genome-wide DNA methylation data produced by the Methyl-MAPS (methylation mapping analysis by paired-end sequencing) method. Methyl-MAPS is an enzymatic-based method that uses both methylation-sensitive and -dependent enzymes covering >80% of CpG dinucleotides within mammalian genomes. It combines enzymatic-based approaches with high-throughput next-generation sequencing technology to provide whole genome DNA methylation profiles. Methyl-Analyzer processes and integrates sequencing reads from methylated and unmethylated compartments and estimates CpG methylation probabilities at single base resolution. Methyl-Analyzer is available at http://github.com/epigenomics/methylmaps. Sample dataset is available for download at http://epigenomicspub.columbia.edu/methylanalyzer_data.html. fgh3@columbia.edu Supplementary data are available at Bioinformatics online.
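As a toy illustration of the estimation idea (a per-CpG methylation probability from methylated- and unmethylated-compartment read counts); this is not Methyl-Analyzer's actual API, and the pseudocount smoothing is an assumption of this sketch:

```python
def methylation_probability(methylated_reads, unmethylated_reads, pseudocount=1):
    """Toy per-CpG methylation estimate from compartment read counts,
    with a pseudocount to stabilize low-coverage sites.
    Illustrative only; NOT Methyl-Analyzer's actual implementation."""
    total = methylated_reads + unmethylated_reads + 2 * pseudocount
    return (methylated_reads + pseudocount) / total

# Hypothetical CpG sites: (methylated-compartment reads, unmethylated-compartment reads)
sites = [(18, 2), (3, 17), (0, 0)]
probs = [methylation_probability(m, u) for m, u in sites]
print(probs)
```

The pseudocount keeps a zero-coverage site at an uninformative 0.5 rather than an undefined or extreme value, one common way to handle the uneven coverage such sequencing data exhibits.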

  10. Analyzed method for calculating the distribution of electrostatic field

    International Nuclear Information System (INIS)

    Lai, W.

    1981-01-01

    An analyzed method for calculating the distribution of the electrostatic field under any given axial gradient in tandem accelerators is described. This method possesses satisfactory accuracy compared with the results of numerical calculation.

  11. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  12. NRC nuclear-plant-analyzer concept and status at INEL

    International Nuclear Information System (INIS)

    Aguilar, F.; Wagner, R.J.

    1982-01-01

    The Office of Research of the US NRC has proposed development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes how we at the INEL envision the nuclear-plant analyzer. The paper also describes a pilot RELAP5 plant-analyzer project completed during the past year, as well as current work. A great deal of analysis is underway to determine nuclear-steam-system response. System transient analysis being so complex, there is a need to present analytical results in a way that the interconnections among phenomena and all the nuances of the transient are apparent. There is a need for the analyst to dynamically control system calculations to simulate plant operation in order to perform 'what if' studies, as well as a need to perform system analysis within hours of a plant emergency to diagnose the state of the stricken plant and formulate recovery actions. The NRC-proposed nuclear-plant analyzer can meet these needs.

  13. AmAMorph: Finite State Morphological Analyzer for Amazighe

    Directory of Open Access Journals (Sweden)

    Fatima Zahra Nejme

    2016-03-01

    This paper presents AmAMorph, a morphological analyzer for the Amazighe language built with the NooJ linguistic development environment. The paper begins with the development of large-coverage Amazighe lexicons. The electronic lexicons, named ‘NAmLex’, ‘VAmLex’ and ‘PAmLex’ (for ‘Noun Amazighe Lexicon’, ‘Verb Amazighe Lexicon’ and ‘Particles Amazighe Lexicon’), link inflectional, morphological, and syntactic-semantic information to the list of lemmas. Automated inflectional and derivational routines are applied to each lemma, producing the inflected forms. To our knowledge, AmAMorph is the first morphological analyzer for Amazighe. It identifies the component morphemes of word forms using large-coverage morphological grammars. Along with a description of how the analyzer is implemented, the paper gives an evaluation of the analyzer.
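
The morpheme identification step can be illustrated with a toy lemma-plus-suffix lookup. The entries below are invented placeholders, not real Amazighe morphology, and the code sketches only the general idea, not NooJ's finite-state machinery:

```python
# Toy lexicon in the spirit of NAmLex/VAmLex: each lemma carries a category
# and a map from inflectional suffix to grammatical feature.
# (All entries are invented placeholders for illustration.)
LEXICON = {
    "argaz": {"cat": "Noun", "suffixes": {"": "sg", "en": "pl"}},
    "ddu":   {"cat": "Verb", "suffixes": {"": "imperative", "igh": "1sg.perf"}},
}

def analyze(surface_form):
    """Return every (lemma, category, feature) reading of a surface form
    obtainable as lemma + suffix under the toy lexicon."""
    readings = []
    for lemma, entry in LEXICON.items():
        for suffix, feature in entry["suffixes"].items():
            if surface_form == lemma + suffix:
                readings.append((lemma, entry["cat"], feature))
    return readings
```

A real analyzer compiles such lexicons into finite-state transducers so that lookup stays fast even with millions of inflected forms; the dictionary scan above is only for clarity.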

  14. Radiometric flow injection analysis with an ASIA (Ismatec) analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Myint, U; Win, N; San, K; Han, B; Myoe, K M [Yangon Univ. (Myanmar). Dept. of Chemistry; Toelgyessy, J [Slovak Technical Univ., Bratislava (Slovakia). Dept. of Environmental Science

    1994-07-01

    Radiometric Flow Injection Analysis of a radioactive ([sup 131]I) sample is described. For analysis an ASIA (Ismatec) analyzer with a NaI(Tl) scintillation detector was used. (author) 5 refs.; 3 figs.

  15. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  16. Quality Performance of Drugs Analyzed in the Drug Analysis and ...

    African Journals Online (AJOL)

    ICT TEAM

    performance of drug samples analyzed therein. Previous reports have ... wholesalers, non-governmental organizations, hospitals, analytical ..... a dispute concerning discharge of waste water ... Healthcare Industry in Kenya, December. 2008.

  17. Generating and analyzing non-diffracting vector vortex beams

    CSIR Research Space (South Africa)

    Li, Y

    2013-08-01

    Single-order Bessel beams and superposition cases are studied. The polarization and the azimuthal modes of the generated beams are analyzed. The results of modal decompositions on polarization components are in good agreement with theory. We demonstrate...

  18. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  19. Analyzing Spread of Influence in Social Networks for Transportation Applications

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  20. Analyzing Spread of Influence in Social Networks for Transportation Application.

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  1. Airspace Analyzer for Assessing Airspace Directional Permeability, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool that gives the user (an airline or Air Traffic Service Provider (ATSP)) the ability to analyze flight-level-by-flight-level permeability...

  2. Giessen polarization facility. III. Multi-detector analyzing system

    Energy Technology Data Exchange (ETDEWEB)

    Krause, H H; Stock, R; Arnold, W; Berg, H; Huttel, E; Ulbricht, J; Clausnitzer, G [Giessen Univ. (Germany, F.R.). Strahlenzentrum

    1977-06-15

    An analyzing system based on a PDP 11 computer and a digital multiplexer is described. It accepts signals simultaneously from 16 detectors with individual ADCs. For measurements of analyzing powers, the polarization of the ion beam can be switched to zero at a frequency of 1 kHz. The switching operation additionally controls the handling of the detector pulses. The software contains special programs for the analysis of polarization experiments.
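
With the beam polarization switched between p and zero, the analyzing power follows from the ratio of the two count rates. A minimal sketch, assuming the standard relation N_on/N_off = 1 + p * A_y (the function name is ours, not from the paper):

```python
def analyzing_power(n_pol, n_unpol, beam_polarization):
    """Analyzing power from counts taken with the beam polarization
    switched on (n_pol) and switched to zero (n_unpol), assuming

        N_on / N_off = 1 + p * A_y   =>   A_y = (N_on / N_off - 1) / p
    """
    return (n_pol / n_unpol - 1.0) / beam_polarization

# Counts from one detector over equal live times, beam polarization 0.8
A = analyzing_power(11000, 10000, beam_polarization=0.8)  # about 0.125
```

Fast switching is what makes this ratio robust: both count rates see the same slow drifts in beam current and detector gain.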

  3. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

    The purpose of our work is to analyze travel websites; more precisely, to determine whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that Romanian online tourism websites for the Romanian market have the features listed on similar websites in France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  4. Tests of the Royce ultrasonic interface level analyzer

    International Nuclear Information System (INIS)

    WITWER, K.S.

    1999-01-01

    This document describes testing carried out in 1995 on the Royce Interface Level Analyzer in the 305 Building Engineering Testing Laboratory, 300 Area. The analyzer was shown to locate the solid-liquid interface of two different simulants effectively under various conditions, and it could still do so after being irradiated with over 5 million rads of gamma radiation from a cobalt-60 source

  5. Evaluation of haematology analyzer CELL-DYN 3700 SL

    Directory of Open Access Journals (Sweden)

    Enver Suljević

    2003-05-01

    Determination of the full blood count and differential white blood cell count is included in the program of all medical laboratories at the primary, secondary and tertiary health care levels. Today, haematological tests are performed almost exclusively on haematology analyzers. Automation of haematology laboratories is a response to the large demand for haematological testing, the need for timely reporting of results, and the possibilities offered by modern techniques. This work is an evaluation of the laser haematology analyzer Cell-Dyn 3700 SL. It investigates the reliability of test results through the following parameters: precision, accuracy, sensitivity and specificity of the determination methods. It also explores the influence of sample transfer and the correlation with the haematology analyzer MAXM Retti. The haematology parameters investigated are: white blood cells (WBC), neutrophils (NEU), lymphocytes (LYM), monocytes (MONO), eosinophils (EOS), basophils (BASO), red blood cells (RBC), haemoglobin (HGB), haematocrit (HCT), mean corpuscular volume (MCV), mean corpuscular haemoglobin (MCH), mean corpuscular haemoglobin concentration (MCHC), red cell distribution width (RDW), platelets (PLT), mean platelet volume (MPV), plateletcrit (PCT), and platelet distribution width (PDW). The results confirm that the precision of the analyzer fulfils the reproducibility criteria for WBC, RBC, HGB, MCV, MCH, MCHC, and PLT. Correlation coefficient values (r) obtained through statistical analysis, i.e. linear regression comparing the two analyzers, are adequate except for MCHC (r = 0.64), which is in accordance with literature data. Accuracy was tested by the haematology analyzer method and the microscopic differentiation method; the correlation coefficients for granulocytes, lymphocytes and monocytes indicate the accuracy of the methods. Sensitivity and specificity parameters fulfil the analytical criteria. It is confirmed that the haematology analyzer Cell-Dyn 3700 SL is reliable for
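
The comparison statistics used in evaluations like this, a correlation coefficient and a least-squares line between two analyzers measuring the same samples, can be computed directly. A self-contained sketch; the function name is ours:

```python
def pearson_r_and_fit(x, y):
    """Pearson correlation coefficient and least-squares line y = a + b*x,
    as used when comparing two analyzers on the same set of samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    r = sxy / (sxx * syy) ** 0.5
    b = sxy / sxx          # slope
    a = my - b * mx        # intercept
    return r, a, b
```

For method comparison a high r alone is not sufficient; the slope and intercept (and, in practice, a Bland-Altman difference plot) show whether the two analyzers actually agree in value.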

  6. Teachers’ assessments of demonstration of student initiative

    Directory of Open Access Journals (Sweden)

    Komlenović Đurđica

    2012-01-01

    This paper explores student initiative, i.e. student engagement in activities in the school environment, as an aspect of students' functioning that is assumed to be a prerequisite for their contribution to the quality of instruction and for better use of the possibilities for education and development in the school environment. We approach this topic from the teachers' perspective, since our aim is to observe how teachers assess the initiative of their students (how important it is, how it is manifested, how present it is in different segments of school activities). In the first part of the paper we analyze the construct "student initiative" and the similar construct "student engagement". In the second part we present the results of a study in which primary school teachers (N=182) from the territory of Serbia expressed their views on student initiative. Teachers' answers to open- and close-ended questions from the questionnaire (19 items in total) were processed with quantitative and qualitative methods. The results indicate that the majority of teachers believed that student initiative is a very important general feature of behavior in the school environment, independent of age, which is most present in the domain of peer socializing and the relationship with teachers, and least present in the very domains of student functioning that teachers deemed most desirable (mastering the curriculum, regulation of disciplinary issues). [Projects of the Ministry of Science of the Republic of Serbia, no. 179034: From encouraging initiative, cooperation and creativity in education to new roles and identities in society, and no. 47008: Improving the quality and accessibility of education in the modernization processes of Serbia]

  7. Magnetic systems for wide-aperture neutron polarizers and analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Gilev, A.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Pleshanov, N.K., E-mail: pnk@pnpi.spb.ru [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Bazarov, B.A.; Bulkin, A.P.; Schebetov, A.F. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Syromyatnikov, V.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Physical Department, St. Petersburg State University, Ulyanovskaya, 1, Petrodvorets, St. Petersburg 198504 (Russian Federation); Tarnavich, V.V.; Ulyanov, V.A. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation)

    2016-10-11

    Requirements on the field uniformity in neutron polarizers are analyzed in view of the fact that neutron polarizing coatings have been improved during the past decade. The design of magnetic systems that meet the new requirements is optimized by numerical simulations. Magnetic systems for wide-aperture multichannel polarizers and analyzers are presented, including (a) the polarizer to be built at channel 4-4′ of the reactor PIK (Gatchina, Russia) for high-flux experiments with a 100×150 mm² beam of polarized cold neutrons; (b) the fan analyzer covering a 150×100 mm² window of the detector at the Magnetism Reflectometer (SNS, ORNL, USA); (c) the polarizer and (d) the fan analyzer covering a 220×110 mm² window of the detector at the reflectometer NERO, which is being transferred to PNPI (Russia) from HZG (Germany). Deviations of the field from the vertical did not exceed 2°. The polarizing efficiency of the analyzer at the Magnetism Reflectometer reached 99%, a record level for wide-aperture supermirror analyzers.

  8. Education Policy and Family Values: A Critical Analysis of Initiatives from the Right

    Science.gov (United States)

    Kumashiro, Kevin K.

    2009-01-01

    This article analyzes current education policy initiatives from the political Right in the United States, focusing on initiatives at the federal level (standards and testing), the state level (funding), the local level (alternative certification), and the campus level (censorship). Each initiative has received wide bipartisan and public support,…

  9. Integrated Initiating Event Performance Indicators

    International Nuclear Information System (INIS)

    S. A. Eide; Dale M. Rasmuson; Corwin L. Atwood

    2005-01-01

    The U.S. Nuclear Regulatory Commission Industry Trends Program (ITP) collects and analyzes industry-wide data, assesses the safety significance of results, and communicates results to Congress and other stakeholders. This paper outlines potential enhancements to the ITP to comprehensively cover the Initiating Events Cornerstone of Safety; future work will address other cornerstones of safety. The proposed Tier 1 activity involves collecting data on ten categories of risk-significant initiating events, trending the results, and comparing early performance with prediction limits (allowable numbers of events, above which NRC action may occur). Tier 1 results would be used to monitor industry performance at the level of individual categories of initiating events. The proposed Tier 2 activity involves integrating the information for individual categories of initiating events into a single risk-based indicator, termed the Baseline Risk Index for Initiating Events (BRIIE). The BRIIE would be evaluated yearly and compared against a threshold, and BRIIE results would be reported to Congress on a yearly basis.
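
A prediction limit of the kind described (an allowable number of events above which action may follow) can be sketched as an upper percentile of a Poisson distribution with the baseline mean. This is an illustrative assumption, not the ITP's actual statistical procedure:

```python
from math import exp, factorial

def poisson_prediction_limit(baseline_mean, coverage=0.95):
    """Smallest event count k whose cumulative Poisson probability under
    the baseline mean reaches `coverage`; an observed count above k would
    flag the category for attention."""
    k = 0
    cumulative = exp(-baseline_mean)  # P(N = 0)
    while cumulative < coverage:
        k += 1
        cumulative += exp(-baseline_mean) * baseline_mean ** k / factorial(k)
    return k

# A category with a baseline of 2 events/year tolerates up to 5 events
# before exceeding the 95% prediction limit.
limit = poisson_prediction_limit(2.0)
```

The same machinery, run per category and then weighted by risk significance, is the spirit of combining Tier 1 counts into a single index.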

  10. Characteristics of patients initiating raloxifene compared to those initiating bisphosphonates

    Directory of Open Access Journals (Sweden)

    Wang Sara

    2008-12-01

    Abstract Background Both raloxifene and bisphosphonates are indicated for the prevention and treatment of postmenopausal osteoporosis; however, these medications have different efficacy and safety profiles. It is plausible that physicians would prescribe these agents to optimize the benefit/risk profile for individual patients. The objective of this study was to compare the demographic and clinical characteristics of patients initiating raloxifene with those of patients initiating bisphosphonates for the prevention and treatment of osteoporosis. Methods This study used a retrospective cohort design. Female beneficiaries (45 years and older) with at least one claim for raloxifene or a bisphosphonate in 2003 through 2005, and with continuous enrollment in the previous 12 months and subsequent 6 months, were identified using a collection of large national commercial, Medicare supplemental, and Medicaid administrative claims databases (MarketScan®). Patients were divided into two cohorts, a combined commercial/Medicare cohort and a Medicaid cohort. Within each cohort, the characteristics (demographic, clinical, and resource utilization) of patients initiating raloxifene were compared to those of patients initiating bisphosphonate therapy. Group comparisons were made using chi-square tests for proportions of categorical measures and Wilcoxon rank-sum tests for continuous variables. Logistic regression was used to simultaneously examine factors independently associated with initiation of raloxifene versus a bisphosphonate. Results Within both the commercial/Medicare and Medicaid cohorts, raloxifene patients were younger, had fewer comorbid conditions, and had fewer pre-existing fractures than bisphosphonate patients. Raloxifene patients in both cohorts were less likely to have had a bone mineral density (BMD) screening in the previous year than bisphosphonate patients, and were more likely to have used estrogen or estrogen/progestin therapy in the

  11. The comparison of automated urine analyzers with manual microscopic examination for urinalysis

    OpenAIRE

    İnce, Fatma Demet; Ellidağ, Hamit Yaşar; Köseoğlu, Mehmet; Şimşek, Neşe; Yalçın, Hülya; Zengin, Mustafa Osman

    2016-01-01

    Objectives: Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. Design and methods: 209 urine samples were analyzed by the Iris iQ200 ELITE (İris Diagnostics, USA), Dirui...

  12. Applications of Electrostatic Lenses to Electron Guns and Energy Analyzers

    International Nuclear Information System (INIS)

    Sise, O.

    2004-01-01

    Focal properties and geometries are given for several types of electrostatic lens systems commonly needed in electron impact studies. One type is an electron gun, which focuses electrons over a wide range of energies onto a fixed point such as a target; the other is an analyzer system, which focuses scattered electrons of variable energy onto a fixed position such as the entrance plane of an analyzer. There are many different types and geometries of these lenses for controlling and focusing electron beams. In this presentation we discuss the criteria used for the design of the electrostatic lenses associated with electron guns and energy analyzers, and we determine the fundamental relationships governing the operation and behaviour of multi-element electrostatic lenses containing five, six and seven elements. Focusing of the electron beam is achieved by applying suitable voltages to the series of lens elements. The design of the lens system for the electron gun was based on the requirement that the beam at the target have a small spot size and zero beam angle, that is, afocal mode. For the energy analyzer systems we considered the entrance of the hemispherical analyzer, which determines the energy of the electron beam, and discussed the focusing condition of this lens system

  13. Health Services Cost Analyzing in Tabriz Health Centers 2008

    Directory of Open Access Journals (Sweden)

    Massumeh gholizadeh

    2015-08-01

    Background and objectives: Health services cost analysis is an important management tool for evidence-based decision making in health systems. This study was conducted to analyze costs and to identify the contribution of different factors to the total cost of the health services provided in urban health centers in Tabriz. Material and methods: This was a descriptive, analytic, cross-sectional study; the Activity Based Costing (ABC) method was used for the cost analysis. The statistical population comprised the urban community health centers in Tabriz, and a multi-stage sampling method was used to collect data. Excel software was used for data analysis, and the results are presented in tables and graphs. Results: The study shows the share of different factors in the cost of various health services. Human resources accounted for 58% of the costs of health services in Tabriz urban health centers, physical space for 8%, and medical equipment for 1.3%. Conclusion: Since human resources account for the largest share of health service costs in Tabriz urban health centers, balancing workload with staff numbers, institutionalizing performance-based management, and using multidisciplinary staff may reduce the cost of services.
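
The cost-share computation at the heart of such an ABC analysis is a simple normalization over cost pools. A sketch with invented figures chosen only to reproduce the reported proportions:

```python
def cost_shares(cost_pools):
    """Share of each activity-based cost pool in the total cost."""
    total = sum(cost_pools.values())
    return {name: cost / total for name, cost in cost_pools.items()}

# Illustrative figures (not the study's raw data); the study reports shares
# of roughly 58% (staff), 8% (space) and 1.3% (equipment).
shares = cost_shares({"staff": 580, "space": 80, "equipment": 13, "other": 327})
```

The analytical work in ABC is upstream of this step: tracing each resource cost to the activities that consume it before the pools are summed.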

  14. Test of a two-dimensional neutron spin analyzer

    International Nuclear Information System (INIS)

    Falus, Peter; Vorobiev, Alexei; Krist, Thomas

    2006-01-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at Institute Laue Langevin (ILL). The spin analyzer, which was built in Berlin, selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and small angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible that, due to the stacked construction, the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities
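
The flipping ratio quoted above translates into a polarizing efficiency via the usual definition P = (R - 1)/(R + 1); using that relation here is our assumption, since the abstract reports only R:

```python
def polarizing_efficiency(flipping_ratio):
    """Polarizing efficiency P from a flipping ratio R, using the
    common definition P = (R - 1) / (R + 1)."""
    return (flipping_ratio - 1.0) / (flipping_ratio + 1.0)

# A corrected flipping ratio of 47 (the figure quoted for 60% of the
# detector area) corresponds to P = 46/48, about 0.958.
P = polarizing_efficiency(47.0)
```

An unpolarized beam gives R = 1 and hence P = 0, which is why the ratio is "corrected" for background and imperfect flipping before being converted.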

  15. Test of a two-dimensional neutron spin analyzer

    Science.gov (United States)

    Falus, Péter; Vorobiev, Alexei; Krist, Thomas

    2006-11-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at Institute Laue Langevin (ILL). The spin analyzer, which was built in Berlin selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and small angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible, that due to the stacked construction the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities.

  16. Analyzing Innovation Systems (Burkina Faso) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project aims to improve the efficiency of the nascent innovation system in Burkina Faso by strengthening exchanges between researchers, inventors and innovators and public ... The Science Granting Councils Initiative in Sub-Saharan Africa wins the science diplomacy prize.

  17. 40 CFR 92.119 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... concentrations. (5) Perform a linear least square regression on the data generated. Use an equation of the form y... periodic calibration: (a) Initial and periodic optimization of detector response. Prior to introduction... to find the linear chart deflection (z) for each calibration gas concentration (y). (7) Determine the...
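
Step (5) of the excerpt calls for a linear least-squares regression over the calibration data; a minimal sketch with illustrative numbers (not taken from the regulation):

```python
def linear_least_squares(x, y):
    """Fit y = m*x + b by ordinary least squares, the regression step
    the calibration procedure calls for."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - m * mx
    return m, b

# Chart deflection z recorded for known calibration-gas concentrations y
# (illustrative values only).
m, b = linear_least_squares([0, 25, 50, 75, 100], [0.1, 24.8, 50.2, 75.1, 99.8])
```

Once the line is fitted, an unknown sample's concentration is read off by inverting it: y = (z - b) / m when z is the measured deflection.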

  18. Some problems in recording and analyzing South African English ...

    African Journals Online (AJOL)

    ... english, etymology, family names, folk etymology, french, german, hebrew, initialisms, latin, lexicography, misprints, nonce forms, overdefinition, personal names, place names, postal terms, prepositions, productivization, reflexive pronouns, slang, slips of the tongue, south african english, spelling, status and usage labels, ...

  19. Personality Traits and Training Initiation Process: Intention, Planning, and Action Initiation.

    Science.gov (United States)

    Laguna, Mariola; Purc, Ewelina

    2016-01-01

    The article investigates the role of personality traits in training initiation. Training initiation is conceptualized as a goal realization process and explained using goal theories. Three stages of the process are analyzed: intention to undertake training, plan formulation, and actual undertaking of training. Two studies tested the relationships between five personality traits, defined according to the five-factor model, and the stages of the goal realization process. Study 1, which explains training intention and the formulation of training plans, involved 155 employees. In Study 2, which was time-lagged with two measurement points and which explains intention, plans, and training actions undertaken, data from 176 employees were collected at 3-month intervals. The results of these studies show that personality traits, mainly openness to experience, predict the training initiation process to some degree: intention, plans, and actual action initiation. The findings allow us to provide recommendations for practitioners responsible for human resource development. Assessing openness to experience in employees helps predict their motivation to participate in training activities; to increase training motivation, it is vital to strengthen intentions to undertake training and to encourage the planning of training actions.

  20. Formative Assessment, Communication Skills and ICT in Initial Teacher Training

    Science.gov (United States)

    Romero-Martín, M. Rosario; Castejón-Oliva, Francisco-Javier; López-Pastor, Víctor-Manuel; Fraile-Aranda, Antonio

    2017-01-01

    The purpose of this study is to analyze the perception of students, graduates, and lecturers in relation to systems of formative and shared assessment and to the acquisition of teaching competences regarding communication and the use of Information and Communications Technology (ICT) in initial teacher education (ITE) on degrees in Primary…