WorldWideScience

Sample records for rapidly analyze complex

  1. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  2. Developing rapid methods for analyzing upland riparian functions and values.

    Science.gov (United States)

    Hruby, Thomas

    2009-06-01

    Regulators protecting riparian areas need to understand the integrity, health, beneficial uses, functions, and values of this resource. Up to now most methods providing information about riparian areas are based on analyzing condition or integrity. These methods, however, provide little information about functions and values. Different methods are needed that specifically address this aspect of riparian areas. In addition to information on functions and values, regulators have very specific needs that include: an analysis at the site scale, low cost, usability, and inclusion of policy interpretations. To meet these needs a rapid method has been developed that uses a multi-criteria decision matrix to categorize riparian areas in Washington State, USA. Indicators are used to identify the potential of the site to provide a function, the potential of the landscape to support the function, and the value the function provides to society. To meet legal needs fixed boundaries for assessment units are established based on geomorphology, the distance from "Ordinary High Water Mark" and different categories of land uses. Assessment units are first classified based on ecoregions, geomorphic characteristics, and land uses. This simplifies the data that need to be collected at a site, but it requires developing and calibrating a separate model for each "class." The approach to developing methods is adaptable to other locations as its basic structure is not dependent on local conditions.

  3. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  4. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers.

  5. Rapid Mission Design for Dynamically Complex Environments

    Data.gov (United States)

    National Aeronautics and Space Administration — Designing trajectories in dynamically complex environments is very challenging and easily becomes an intractable problem. More complex planning implies potentially...

  6. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
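
    As a minimal illustration of the dissimilarity measure named above (not code from the paper), the sketch below computes the square root of the Jensen-Shannon divergence between two degree distributions using SciPy; the degree sequences and the helper function are illustrative assumptions.

    ```python
    # Minimal sketch: sqrt of the Jensen-Shannon divergence between two
    # network degree distributions (the dissimilarity measure cited above).
    import numpy as np
    from scipy.spatial.distance import jensenshannon  # returns sqrt(JSD)

    def degree_distribution(degrees, max_degree):
        """Empirical degree distribution padded to a common support."""
        hist = np.bincount(degrees, minlength=max_degree + 1)
        return hist / hist.sum()

    # Hypothetical degree sequences of one network at two evolution stages.
    deg_t0 = np.array([1, 2, 2, 3, 3, 3, 4, 5])
    deg_t1 = np.array([1, 1, 2, 2, 2, 3, 6, 7])
    k_max = int(max(deg_t0.max(), deg_t1.max()))

    p = degree_distribution(deg_t0, k_max)
    q = degree_distribution(deg_t1, k_max)
    print("sqrt Jensen-Shannon divergence:", jensenshannon(p, q, base=2))
    ```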

  7. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.au [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.br [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.au [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.br [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  8. Rapid Complexation of Aptamers by Their Specific Antidotes

    Directory of Open Access Journals (Sweden)

    Heidi Stoll

    2017-06-01

    Nucleic acid ligands, aptamers, harbor the unique characteristics of small molecules and antibodies. The specificity and high affinity of aptamers enable their binding to different targets, such as small molecules, proteins, or cells. Chemical modifications of aptamers allow increased bioavailability. A further great benefit of aptamers is the antidote (AD)-mediated controllability of their effect. In this study, the AD-mediated complexation and neutralization of the thrombin-binding aptamer NU172 and the Toll-like receptor 9 (TLR9)-binding R10-60 aptamer were determined. The time required for the generation of aptamer/AD complexes was analyzed at 37 °C in human serum using gel electrophoresis. Afterwards, the blocking of the aptamers' effects was analyzed by determining the activated clotting time (ACT) in the case of the NU172 aptamer, or the expression of the immune-activation-related genes IFN-1β, IL-6, CXCL-10, and IL-1β in the case of the R10-60 aptamer. Gel electrophoresis analyses demonstrated the rapid complexation of the NU172 and R10-60 aptamers by complementary AD binding after just 2 min of incubation in human serum. A rapid neutralization of the anticoagulant activity of NU172 was also demonstrated in fresh human whole blood 5 min after addition of the AD. Furthermore, the TLR9-mediated activation of PMDC05 cells was interrupted after the addition of the R10-60 AD. Using these two different aptamers, the rapid antagonizability of aptamers was demonstrated in different environments: whole blood containing numerous proteins, cells, and different small molecules; serum; or cell culture media. Thus, nucleic acid ADs are promising molecules, which offer several possibilities for different in vivo applications, such as antagonizing aptamer-based drugs, immobilization, or delivery of oligonucleotides to defined locations.

  9. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicate the interpretation and understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques. However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug release.

  10. Program for Analyzing Flows in a Complex Network

    Science.gov (United States)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
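
    The abstract above describes solving the coupled conservation equations with a combination of Newton-Raphson and successive-substitution methods. The sketch below, which is not GFSSP code, applies a Newton-Raphson iteration to mass conservation at a single internal node of a toy two-branch network; the flow law, coefficients, and boundary pressures are assumed values.

    ```python
    # Minimal sketch: Newton-Raphson solution of mass conservation at one
    # internal node, with branch flow Q = C * sign(dP) * sqrt(|dP|).
    import numpy as np

    P_in, P_out = 200.0e3, 100.0e3   # boundary pressures [Pa] (assumed)
    C1, C2 = 1.0e-4, 2.0e-4          # branch flow coefficients (assumed)

    def branch_flow(c, dp):
        return c * np.sign(dp) * np.sqrt(abs(dp))

    def mass_imbalance(p_node):
        # inflow from the upstream branch minus outflow through the downstream branch
        return branch_flow(C1, P_in - p_node) - branch_flow(C2, p_node - P_out)

    p = 150.0e3                       # initial guess for the internal node pressure
    for _ in range(50):
        f = mass_imbalance(p)
        h = 1.0                       # finite-difference step for the Jacobian
        dfdp = (mass_imbalance(p + h) - mass_imbalance(p - h)) / (2 * h)
        step = f / dfdp
        p -= step
        if abs(step) < 1e-6:
            break
    print("node pressure [Pa]:", p, "residual [kg/s]:", mass_imbalance(p))
    ```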

  11. Collecting and Analyzing Stakeholder Feedback for Signing at Complex Interchanges

    Science.gov (United States)

    2014-10-01

    The purpose of this project was to identify design constraints related to signing, markings, and geometry for complex interchanges, and then to identify useful topics for future research that will yield findings that can address those design issues. ...

  12. ITRA 084 - a microprocessor controlled rapid analyzer in mining and metallurgy

    International Nuclear Information System (INIS)

    Kliem, V.; Kreher, M.; Boy, N.

    1986-01-01

    A new rapid analyzer of the ITRA series has been developed at the Freiberg Research Institute of Non-Ferrous Metals for single- and multi-element analysis in mining and non-ferrous metallurgy. ITRA 084 is an efficient microprocessor-controlled on-line X-ray fluorescence analyzer based on the main principles used successfully in earlier instruments (isotope excitation, four-channel modification, balance filter method). A U880 single-chip microcomputer provides the central control of the device, including the execution of an extensive program for matrix correction. The efficiency of the analyzer is demonstrated using measured values.

  13. Analyzing the Implicit Computational Complexity of object-oriented programs

    OpenAIRE

    Marion , Jean-Yves; Péchoux , Romain

    2008-01-01

    A sup-interpretation is a tool which provides upper bounds on the size of the values computed by the function symbols of a program. Sup-interpretations have shown their interest to deal with the complexity of first order functional programs. This paper is an attempt to adapt the framework of sup-interpretations to a fragment of object-oriented programs, including loop and while constructs and methods with side effects. We give a criterion, called brotherly criterion, w...

  14. Principles of big data preparing, sharing, and analyzing complex information

    CERN Document Server

    Berman, Jules J

    2013-01-01

    Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources, when the data objects are endo

  15. Application of Microwave Irradiation to Rapid Organic Inclusion Complex

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Microwave irradiation has been used in chemical laboratories for moisture analysis and wet ashing procedures of biological and geological materials for a number of years [1]. More recently, microwave irradiation has also been widely used for rapid organic synthesis [2]. However, there have not yet been any reports concerning the utilisation of microwave ovens for the routine preparation of organic inclusion complexes in chemical research.

  16. Towards a theoretical framework for analyzing complex linguistic networks

    CERN Document Server

    Lücking, Andy; Banisch, Sven; Blanchard, Philippe; Job, Barbara

    2016-01-01

    The aim of this book is to advocate and promote network models of linguistic systems that are both based on thorough mathematical models and substantiated in terms of linguistics. In this way, the book contributes first steps towards establishing a statistical network theory as a theoretical basis of linguistic network analysis at the border of the natural sciences and the humanities. This book addresses researchers who want to get familiar with theoretical developments, computational models and their empirical evaluation in the field of complex linguistic networks. It is intended for all those who are interested in statistical models of linguistic systems from the point of view of network research. This includes all relevant areas of linguistics, ranging from phonological, morphological and lexical networks on the one hand to syntactic, semantic and pragmatic networks on the other. In this sense, the volume concerns readers from many disciplines such as physics, linguistics, computer science and information science...

  17. Analyzing complex networks through correlations in centrality measurements

    International Nuclear Information System (INIS)

    Ricardo Furlan Ronqui, José; Travieso, Gonzalo

    2015-01-01

    Many real world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example we show also that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network. (paper)
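
    A minimal sketch of the centrality correlation profile idea described above, using networkx on a small model graph; the choice of the four centralities and the example graph are assumptions, not the paper's exact setup.

    ```python
    # Minimal sketch: Pearson correlations between all pairs of centrality
    # measures, forming a "centrality correlation profile" of the network.
    from itertools import combinations
    import numpy as np
    import networkx as nx

    G = nx.watts_strogatz_graph(n=200, k=6, p=0.1, seed=1)

    centralities = {
        "degree": nx.degree_centrality(G),
        "betweenness": nx.betweenness_centrality(G),
        "closeness": nx.closeness_centrality(G),
        "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
    }

    nodes = list(G.nodes())
    vectors = {name: np.array([c[v] for v in nodes]) for name, c in centralities.items()}

    profile = {
        (a, b): np.corrcoef(vectors[a], vectors[b])[0, 1]
        for a, b in combinations(vectors, 2)
    }
    for pair, r in profile.items():
        print(pair, round(r, 3))
    ```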

  18. Rapid multiplex high resolution melting method to analyze inflammatory related SNPs in preterm birth

    Directory of Open Access Journals (Sweden)

    Pereyra Silvana

    2012-01-01

    Background: Complex traits like cancer, diabetes, obesity or schizophrenia arise from an intricate interaction between genetic and environmental factors. Complex disorders often cluster in families without a clear-cut pattern of inheritance. Genome-wide association studies focus on the detection of tens or hundreds of individual markers contributing to complex diseases. In order to test whether a subset of single nucleotide polymorphisms (SNPs) from candidate genes is associated with a condition of interest in a particular individual or group of people, new techniques are needed. High-resolution melting (HRM) analysis is a new method in which polymerase chain reaction (PCR) and mutation scanning are carried out simultaneously in a closed tube, making the procedure fast, inexpensive and easy. Preterm birth (PTB) is considered a complex disease, where genetic and environmental factors interact to bring about the delivery of a newborn before 37 weeks of gestation. It is accepted that inflammation plays an important role in pregnancy and PTB. Methods: Here, we used real-time PCR followed by HRM analysis to simultaneously identify several gene variations involved in inflammatory pathways in preterm labor. SNPs from the TLR4, IL6, IL1 beta and IL12RB genes were analyzed in a case-control study. The results were confirmed either by sequencing or by PCR followed by restriction fragment length polymorphism analysis. Results: We were able to simultaneously recognize the variations of four genes with accuracy similar to other methods. In order to obtain non-overlapping melting temperatures, the key step in this strategy was primer design. Genotypic frequencies found for each SNP are in concordance with those previously described in similar populations. None of the studied SNPs were associated with PTB. Conclusions: Several gene variations related to the same inflammatory pathway were screened using a new flexible, fast and inexpensive method with the purpose of analyzing

  19. Rapid visual and spectrophotometric nitrite detection by cyclometalated ruthenium complex.

    Science.gov (United States)

    Lo, Hoi-Shing; Lo, Ka-Wai; Yeung, Chi-Fung; Wong, Chun-Yuen

    2017-10-16

    Quantitative determination of the nitrite ion (NO2−) is of great importance in environmental and clinical investigations. A rapid visual and spectrophotometric assay for NO2− detection was developed based on a newly designed ruthenium complex, [Ru(npy)([9]aneS3)(CO)](ClO4) (denoted as RuNPY; npy = 2-(1-naphthyl)pyridine, [9]aneS3 = 1,4,7-trithiacyclononane). This complex traps NO+ produced in acidified NO2− solution, and yields an observable color change within 1 min at room temperature. The assay features an excellent dynamic range (1-840 μmol L−1) and high selectivity, and its limit of detection (0.39 μmol L−1) is well below the guideline values for drinking water recommended by the WHO and the U.S. EPA. Practical use of this assay in tap water and human urine was successfully demonstrated. Overall, the rapidity and selectivity of this assay overcome the problems suffered by the commonly used modified Griess assays for nitrite determination.

  20. A rapid and cost-effective fluorescence detection in tube (FDIT) method to analyze protein phosphorylation.

    Science.gov (United States)

    Jin, Xiao; Gou, Jin-Ying

    2016-01-01

    Protein phosphorylation is one of the most important post-translational modifications catalyzed by protein kinases in living organisms. The advance of genome sequencing has provided information on the protein kinase families of many organisms, including both model and non-model plants. The development of proteomics technologies has also enabled scientists to efficiently reveal a large number of protein phosphorylation events in an organism. However, kinases and phosphorylation targets still need to be connected to illustrate the complicated network in life. Here we adapted Pro-Q® Diamond (Pro-Q® Diamond Phosphoprotein Gel Stain), a widely used phosphoprotein gel-staining fluorescence dye, to establish a rapid, economical and non-radioactive fluorescence detection in tube (FDIT) method to analyze phosphorylated proteins. Taking advantage of the high sensitivity and specificity of Pro-Q® Diamond, the FDIT method is also demonstrated to be rapid and reliable, with a suitable linear range for in vitro protein phosphorylation. A significant and satisfactory protein kinase reaction was detected in as little as 15 min for Wheat Kinase START 1.1 (WKS1.1) acting on a thylakoid ascorbate peroxidase (tAPX), an established phosphorylation target in our earlier study. The FDIT method saves up to 95% of the dye consumed in a gel-staining method. The FDIT method is remarkably quick, highly reproducible, unambiguous and capable of being scaled up to dozens of samples. The FDIT method could serve as a simple and sensitive alternative procedure for determining protein kinase reactions with zero radiation exposure, complementing other widely used radioactive and in-gel assays.

  1. A rapid and cost-effective fluorescence detection in tube (FDIT) method to analyze protein phosphorylation

    Directory of Open Access Journals (Sweden)

    Xiao Jin

    2016-11-01

    Background: Protein phosphorylation is one of the most important post-translational modifications catalyzed by protein kinases in living organisms. The advance of genome sequencing has provided information on the protein kinase families of many organisms, including both model and non-model plants. The development of proteomics technologies has also enabled scientists to efficiently reveal a large number of protein phosphorylation events in an organism. However, kinases and phosphorylation targets still need to be connected to illustrate the complicated network in life. Results: Here we adapted Pro-Q® Diamond (Pro-Q® Diamond Phosphoprotein Gel Stain), a widely used phosphoprotein gel-staining fluorescence dye, to establish a rapid, economical and non-radioactive fluorescence detection in tube (FDIT) method to analyze phosphorylated proteins. Taking advantage of the high sensitivity and specificity of Pro-Q® Diamond, the FDIT method is also demonstrated to be rapid and reliable, with a suitable linear range for in vitro protein phosphorylation. A significant and satisfactory protein kinase reaction was detected in as little as 15 min for Wheat Kinase START 1.1 (WKS1.1) acting on a thylakoid ascorbate peroxidase (tAPX), an established phosphorylation target in our earlier study. Conclusion: The FDIT method saves up to 95% of the dye consumed in a gel-staining method. The FDIT method is remarkably quick, highly reproducible, unambiguous and capable of being scaled up to dozens of samples. The FDIT method could serve as a simple and sensitive alternative procedure for determining protein kinase reactions with zero radiation exposure, complementing other widely used radioactive and in-gel assays.

  2. A rapid automatic analyzer and its methodology for effective bentonite content based on image recognition technology

    Directory of Open Access Journals (Sweden)

    Wei Long

    2016-09-01

    Fast and accurate determination of the effective bentonite content in used clay-bonded sand is very important for selecting the correct mixing ratio and mixing process to obtain high-performance molding sand. Currently, the effective bentonite content is determined by testing the methylene blue absorbed in used clay-bonded sand, which is usually a manual operation with several disadvantages, including a complicated process, long testing time and low accuracy. A rapid automatic analyzer of the effective bentonite content in used clay-bonded sand was developed based on image recognition technology. The instrument consists of auto-stirring, auto liquid removal, auto titration, step-rotation and image acquisition components, and a processor. The principle of the image recognition method is first to decompose the color images into three-channel gray images based on the difference in photosensitivity of the light blue and dark blue in the three channels of red, green and blue, then to perform gray-value subtraction and gray-level transformation on the gray images, and finally to extract the outer-circle light blue halo and the inner-circle blue spot and calculate their area ratio. The titration process can be judged to have reached the end-point when the area ratio is higher than the set value.
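
    The end-point criterion described above (area ratio of the outer light-blue halo to the inner dark-blue spot, obtained by channel subtraction) can be sketched roughly as below; the thresholds, the helper name, and the placeholder image are illustrative assumptions, not the instrument's actual algorithm.

    ```python
    # Minimal sketch: halo-to-spot area ratio from an RGB image array.
    import numpy as np

    def halo_to_spot_ratio(rgb, diff_threshold=30, dark_threshold=90):
        """rgb: HxWx3 uint8 array of a titration spot photograph (assumed input)."""
        r = rgb[:, :, 0].astype(np.int16)
        b = rgb[:, :, 2].astype(np.int16)
        blue_excess = b - r                        # gray-value subtraction of channels
        blueish = blue_excess > diff_threshold     # all blue-stained pixels
        dark_spot = blueish & (b < dark_threshold) # inner dark-blue spot
        light_halo = blueish & ~dark_spot          # outer light-blue halo
        spot_area = dark_spot.sum()
        return light_halo.sum() / spot_area if spot_area else float("inf")

    # Usage: a ratio above a preset value signals the titration end-point.
    image = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)  # placeholder
    print("halo/spot area ratio:", halo_to_spot_ratio(image))
    ```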

  3. Rapid Point of Care Analyzer for the Measurement of Cyanide in Blood

    Science.gov (United States)

    Ma, Jian; Ohira, Shin-Ichi; Mishra, Santosh K.; Puanngam, Mahitti; Dasgupta, Purnendu K.; Mahon, Sari B.; Brenner, Matthew; Blackledge, William; Boss, Gerry R.

    2011-01-01

    A simple, sensitive optical analyzer for the rapid determination of cyanide in blood in point-of-care applications is described. HCN is liberated by the addition of 20% H3PO4 and is absorbed by a paper filter impregnated with borate-buffered (pH 9.0) hydroxoaquocobinamide (hereinafter called cobinamide). Cobinamide on the filter changes color from orange (λmax = 510 nm) to violet (λmax = 583 nm) upon reaction with cyanide. This color change is monitored in the transmission mode by a light-emitting diode (LED) with a 583 nm emission maximum and a photodiode detector. The observed rate of color change increases 10-fold when the cobinamide solution for filter impregnation is prepared in borate buffer rather than in water. The use of a second LED emitting at 653 nm and alternate pulsing of the LEDs improve the limit of detection 4-fold, to ~0.5 μM for a 1 mL blood sample. Blood cyanide levels of imminent concern (≥10 μM) can be accurately measured in ~2 min. The response is proportional to the mass of cyanide in the sample; smaller sample volumes can be successfully used, with a proportionate change in the concentration LODs. Bubbling air through the blood-acid mixture was found effective for mixing the acid with the sample and liberating HCN. A small amount of ethanol added to the top of the blood was found to be the most effective means of preventing frothing during aeration. The relative standard deviation (RSD) for repetitive determination of blood samples containing 9 μM CN was 1.09% (n = 5). The technique was compared blind with a standard microdiffusion-spectrophotometric method used for the determination of cyanide in rabbit blood. The results showed good correlation (slope 1.05, r2 0.9257); independent calibration standards were used.
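
    As a rough illustration of the two-LED transmission measurement described above (not the published calibration), the sketch below converts paired photodiode readings into a differential absorbance and maps it to a cyanide concentration with an assumed linear calibration; the slope, intercept, and readings are placeholders.

    ```python
    # Minimal sketch: differential absorbance from two LED channels mapped to
    # a cyanide concentration via an assumed linear calibration.
    import numpy as np

    def absorbance(intensity, intensity_blank):
        return -np.log10(intensity / intensity_blank)

    def cyanide_concentration(i_583, i_653, i0_583, i0_653, slope=25.0, intercept=0.0):
        """A(583) - A(653) suppresses drift common to both channels;
        slope is in umol/L per absorbance unit (assumed)."""
        d_abs = absorbance(i_583, i0_583) - absorbance(i_653, i0_653)
        return slope * d_abs + intercept

    # Hypothetical photodiode readings before (i0) and after HCN exposure.
    print("estimated cyanide [umol/L]:",
          cyanide_concentration(i_583=0.62, i_653=0.97, i0_583=0.90, i0_653=1.00))
    ```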

  4. The ESTER particle and plasma analyzer complex for the Phobos mission

    Energy Technology Data Exchange (ETDEWEB)

    Afonin, V.V.; Shutte, N.M. (AN SSSR, Moscow (USSR). Inst. Kosmicheskikh Issledovanij); McKenna-Lawlor, S.; Rusznyak, P. (Space Technology Ireland Ltd., Maynooth (Ireland)); Kiraly, P.; Szabo, L.; Szalai, S.; Szucs, I.T.; Varhalmi, L. (Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics); Marsden, R. (European Space Agency, Noordwijk (Netherlands). Space Science Dept.); Richter, A.; Witte, M. (Max-Planck-Institut fuer Aeronomie, Katlenburg-Lindau (Germany, F.R.))

    1990-05-01

    The ESTER particle and plasma analyzer system for the Phobos Mission comprised a complex of three instruments (LET, SLED and HARP) serviced by a common Data Processing Unit. An account is provided of this complex, its objectives and its excellent performance in space.

  5. Rapid measurement of macronutrients in breast milk: How reliable are infrared milk analyzers?

    Science.gov (United States)

    Fusch, Gerhard; Rochow, Niels; Choi, Arum; Fusch, Stephanie; Poeschl, Susanna; Ubah, Adelaide Obianuju; Lee, Sau-Young; Raja, Preeya; Fusch, Christoph

    2016-01-01

    Background & aims: Significant biological variation in the macronutrient content of breast milk is an important barrier that needs to be overcome to meet the nutritional needs of preterm infants. To analyze macronutrient content, commercial infrared milk analyzers have been proposed as efficient and practical tools. Since milk analyzers were originally developed for the dairy industry, they must be validated using a significant number of human milk samples that represent the broad range of variation in macronutrient content in preterm and term milk. The aim of this study was to validate two milk analyzers for breast milk analysis against reference methods and to determine an effective sample pretreatment. Current evidence for the influence of (i) aliquoting, (ii) storage time and (iii) temperature, and (iv) vessel wall adsorption on the stability and availability of macronutrients in frozen breast milk is reviewed. Methods: Breast milk samples (n = 1188) were collected from 63 mothers of preterm and term infants. Two milk analyzers, (A) a near-infrared milk analyzer (Unity SpectraStar, USA) and (B) a mid-infrared milk analyzer (Miris, Sweden), were compared to reference methods, i.e. ether extraction, elemental analysis, and UPLC-MS/MS for fat, protein, and lactose, respectively. Results: For fat analysis, (A) measured precisely but not accurately (y = 0.55x + 1.25, r2 = 0.85), whereas (B) measured precisely and accurately (y = 0.93x + 0.18, r2 = 0.86). For protein analysis, (A) was precise but not accurate (y = 0.55x + 0.54, r2 = 0.67), while (B) was both precise and accurate (y = 0.78x + 0.05, r2 = 0.73). For lactose analysis, both devices (A) and (B) showed two distinct concentration levels and therefore measured neither accurately nor precisely (y = 0.02x + 5.69, r2 = 0.01 and y = −0.09x + 6.62, r2 = 0.02, respectively). Macronutrient levels were unchanged in two independent samples of stored breast milk (−20 °C measured with IR; −80
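
    The accuracy/precision comparison reported above reduces to regressing device readings against the reference method; a minimal sketch with illustrative (made-up) data follows, where a slope near 1 and an intercept near 0 indicate accuracy and a high r2 indicates precision.

    ```python
    # Minimal sketch: device-vs-reference regression for a macronutrient.
    import numpy as np

    reference = np.array([2.1, 2.8, 3.4, 3.9, 4.6, 5.2])   # g/dL fat, reference method (illustrative)
    device = np.array([2.2, 2.7, 3.5, 3.8, 4.4, 5.3])      # g/dL fat, milk analyzer (illustrative)

    slope, intercept = np.polyfit(reference, device, 1)
    r2 = np.corrcoef(reference, device)[0, 1] ** 2
    print(f"device = {slope:.2f} * reference + {intercept:.2f}, r2 = {r2:.2f}")
    ```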

  6. Analyzing Integrated Cost-Schedule Risk for Complex Product Systems R&D Projects

    Directory of Open Access Journals (Sweden)

    Zhe Xu

    2014-01-01

    The vast majority of research efforts in project risk management tend to assess cost risk and schedule risk independently. However, project cost and time are related in reality, and the relationship between them should be analyzed directly. We propose an integrated cost and schedule risk assessment model for complex product systems R&D projects. Graphical evaluation review technique (GERT), Monte Carlo simulation, and probability distribution theory are utilized to establish the model. In addition, statistical analysis and regression analysis techniques are employed to analyze the simulation outputs. Finally, a complex product systems R&D project is modeled as an example by the proposed approach, and the simulation outputs are analyzed to illustrate the effectiveness of the risk assessment model. It seems that integrating cost and schedule risk assessment can provide more reliable risk estimation results.
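
    As a toy illustration of sampling cost and schedule jointly rather than independently (not the paper's GERT network), the sketch below draws duration and cost samples for two sequential activities whose cost depends on their sampled duration; all distributions and coefficients are assumed.

    ```python
    # Minimal sketch: joint Monte Carlo sample of project duration and cost.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Triangular duration distributions (min, mode, max) in weeks - assumed values.
    d1 = rng.triangular(8, 10, 15, n)
    d2 = rng.triangular(12, 16, 24, n)
    duration = d1 + d2

    # Cost model: fixed cost plus duration-dependent terms (assumed coefficients).
    cost = 500 + 35 * d1 + 50 * d2 + rng.normal(0, 40, n)

    print("P(duration > 30 weeks):", np.mean(duration > 30))
    print("P(cost > 1900 and duration > 30):", np.mean((cost > 1900) & (duration > 30)))
    print("corr(duration, cost):", np.corrcoef(duration, cost)[0, 1].round(2))
    ```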

  7. Rapid automated radiochemical analyzer for determination of targeted radionuclides in nuclear process streams

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Egorov, Oleg; Devol, Timothy A.

    2008-01-01

    Some industrial process-scale plants require the monitoring of specific radionuclides as an indication of the composition of their feed streams or as indicators of plant performance. In this process environment, radiochemical measurements must be fast, accurate, and reliable. Manual sampling, sample preparation, and analysis of process fluids are highly precise and accurate, but tend to be expensive and slow. Scientists at Pacific Northwest National Laboratory (PNNL) have assembled and characterized a fully automated prototype Process Monitor instrument which was originally designed to rapidly measure Tc-99 in the effluent streams of the Waste Treatment Plant at Hanford, WA. The system is capable of a variety of tasks: extraction of a precise volume of sample, sample digestion/analyte redox adjustment, column-based chemical separations, flow-through radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results can be immediately calculated and electronically reported. It is capable of performing a complete analytical cycle in less than 15 minutes. The system is highly modular and can be adapted to a variety of sample types and analytical requirements. It exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  8. Pharmacogenetic testing for clopidogrel using the rapid INFINITI analyzer: a dose-escalation study.

    Science.gov (United States)

    Gladding, Patrick; White, Harvey; Voss, Jamie; Ormiston, John; Stewart, Jim; Ruygrok, Peter; Bvaldivia, Badi; Baak, Ruth; White, Catherine; Webster, Mark

    2009-11-01

    Our aim was to assess whether a higher clopidogrel maintenance dose has a greater antiplatelet effect in CYP2C19*2 allele carriers compared with noncarriers. Clopidogrel is a prodrug that is biotransformed by the cytochrome P450 enzymes CYP2C19, 2C9, 3A4, 2B6, and 1A2. The CYP2C19*2 loss-of-function variant has been associated with a reduced antiplatelet response to clopidogrel and a 3-fold risk of stent thrombosis. Forty patients on standard maintenance-dose clopidogrel (75 mg), for 9.4 ± 9.2 weeks, were enrolled into a dose-escalation study. Platelet function was assessed at baseline and after 1 week of 150 mg once daily using the VerifyNow platelet function analyzer (Accumetrics Ltd., San Diego, California). Genomic DNA was hybridized to a BioFilmChip microarray on the INFINITI analyzer (AutoGenomics Inc., Carlsbad, California) and analyzed for the CYP2C19*2, *4, *17 and CYP2C9*2, *3 polymorphisms. Platelet inhibition increased over 1 week, by a mean of +8.6 ± 13.5% (p = 0.0003). Carriers of the CYP2C19*2 allele had significantly reduced platelet inhibition at baseline (median 18%, range 0% to 72%) compared with wild-type (wt) carriers (median 59%, range 11% to 95%, p = 0.01) and at 1 week (p = 0.03). CYP2C19*2 allele carriers had an increase in platelet inhibition (mean +9 ± 11%, p = 0.03) and a reduction in platelet reactivity (mean −26 ± 38 platelet response units, p = 0.04) with the higher dose. Together, CYP2C19*2 and CYP2C9*3 loss-of-function carriers had a greater change in platelet inhibition with 150 mg daily than wt/wt (+10.9% vs. +0.7%, p = 0.04). Increasing the dose of clopidogrel in patients with nonresponder polymorphisms can increase the antiplatelet response. Personalizing clopidogrel dosing using pharmacogenomics may be an effective method of optimizing treatment.

  9. Analyzing the Impact of Highways Associated with Farmland Loss under Rapid Urbanization

    Directory of Open Access Journals (Sweden)

    Jie Song

    2016-06-01

    Highway construction has accelerated urban growth and induced direct and indirect changes to land use. Although many studies have analyzed the relationship between highway construction and local development, relatively little attention has been paid to clarifying the various impacts of highways associated with farmland loss. This paper integrates GIS spatial analysis, remote sensing, buffer analysis and landscape metrics to analyze the landscape pattern change induced by direct and indirect highway impacts. This paper explores the interaction between the impact of highways and farmland loss, using the case of a highly urbanized traffic hub in eastern China, the Hang-Jia-Hu Plain. Our results demonstrate that the Hang-Jia-Hu Plain experienced extensive highway construction during 1990–2010, with a clear acceleration of expressway development since 2000. This unprecedented highway construction has directly fragmented the regional landscape and indirectly disturbed it by attracting a large amount of built-up land converted from farmland during the last two decades. In the highway-effect zone, serious farmland loss initially occurred in the urban region and then spread to the rural region. Moreover, we found that the discontinuous expansion of built-up land scattered the farmland in the rural region and the expressway-effect zone. Furthermore, farmland protection policies in the 1990s had the effect of controlling the total area of farmland loss. However, the cohesive farmland structure was still fragmented by the direct and indirect impacts of highway construction. We suggest that an overall farmland protection system should be established to enhance spatial control and mitigate the adverse impacts caused by highway construction. This work improves the understanding of regional sustainable development, and provides a scientific basis for balancing urban development with farmland protection in decision-making processes.

  10. Mobile Complex For Rapid Diagnosis of the Technological System Elements

    Directory of Open Access Journals (Sweden)

    Gavrilin Alexey

    2016-01-01

    The article shows the relevance of developing new informing and measuring tools and technologies. It reviews a mobile complex for runtime diagnostics of the technological system “machine-tool–instrument–detail”. It was found that the use of the complex allows identification of the frequency range in which resonance of the technological system elements may appear, and thus allows a conclusion to be drawn on the technical state of the diagnosed object. It is concluded that the above-mentioned mobile complex has good prospects for use in vibration diagnostics.

  11. On complex adaptive systems and terrorism [rapid communication

    Science.gov (United States)

    Ahmed, E.; Elgazzar, A. S.; Hegazi, A. S.

    2005-03-01

    Complex adaptive systems (CAS) are ubiquitous in nature. They are basic in the social sciences. An overview of CAS is given, with emphasis on the occurrence of bad side effects of seemingly “wise” decisions. An application to terrorism is then given. Some conclusions on how to deal with this phenomenon are proposed.

  12. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    Science.gov (United States)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.

  13. Analyzing the causation of a railway accident based on a complex network

    Science.gov (United States)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accident based on the complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from the overall perspective. As a case, the “7.23” China—Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, greatly reduces the occurrence of railway accidents.

  14. Analyzing the causation of a railway accident based on a complex network

    International Nuclear Information System (INIS)

    Ma Xin; Li Ke-Ping; Luo Zi-Yan; Zhou Jin

    2014-01-01

    In this paper, a new model is constructed for the causation analysis of railway accident based on the complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from the overall perspective. As a case, the “7.23” China—Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, greatly reduces the occurrence of railway accidents. (interdisciplinary physics and related areas of science and technology)

  15. Topographical memory analyzed in mice using the Hamlet test, a novel complex maze.

    Science.gov (United States)

    Crouzier, Lucie; Gilabert, Damien; Rossel, Mireille; Trousse, Françoise; Maurice, Tangui

    2018-03-01

    The Hamlet test is an innovative device providing a complex environment for testing topographic memory in mice. Animals were trained in groups for weeks in a small village with a central agora and streets expanding from it towards five functionalized houses, where they could drink, eat, hide, run, or interact with a stranger mouse. Memory was tested by depriving mice of water or food and analyzing their ability to locate the Drink/Eat house. Exploration and memory were analyzed in different strains and genders, and after different training periods and delays. After 2 weeks of training, differences in exploration patterns were observed between strains, but not genders. Neuroanatomical structures activated by training, identified using FosB/ΔFosB immunolabelling, showed an involvement of the hippocampus-subiculum-parahippocampal gyrus axis and dopaminergic structures. Training increased hippocampal neurogenesis (cell proliferation and neuronal maturation) and modified the amnesic efficacy of muscarinic or nicotinic cholinergic antagonists. Moreover, topographical disorientation in Alzheimer's disease was addressed using intracerebroventricular injection of amyloid β25-35 peptide in trained mice. When retested after 7 days, Aβ25-35-treated mice showed memory impairment. The Hamlet test specifically allows analysis of topographical memory in mice, based on a complex environment. It offers an innovative tool for various ethological or pharmacological research needs. For instance, it allowed examination of topographical disorientation, a warning sign in Alzheimer's disease.

  16. Portable Microplate Analyzer with a Thermostatic Chamber Based on a Smartphone for On-site Rapid Detection.

    Science.gov (United States)

    Wan, Zijian; Zhong, Longjie; Pan, Yuxiang; Li, Hongbo; Zou, Quchao; Su, Kaiqi; Wang, Ping

    2017-01-01

    The microplate method provides an efficient way to use modern detection technology. However, there are some difficulties concerning on-site detection, such as being non-portable and time-consuming. In this work, a novel portable microplate analyzer with a thermostatic chamber based on a smartphone was designed for rapid on-site detection. The analyzer, with a wide-angle lens and an optical filter, provides a proper environment for the microplate. A smartphone app, iPlate Monitor, was used for RGB analysis of the images. After a consistency experiment with a microtiter plate reader (MTPR), the normalized calibration curves were y = 0.7276x + 0.0243 (R2 = 0.9906) and y = 0.3207x + 0.0094 (R2 = 0.9917) with a BCA protein kit, as well as y = 0.182x + 0.0134 (R2 = 0.994) and y = 0.0674x + 0.0003 (R2 = 0.9988) with a glucose kit. The times needed to meet the detection requirement were 15 and 10 min for the BCA protein kit and the glucose kit at 37 °C; in contrast, more than 30 and 20 min were required at ambient temperature. Meanwhile, the analyzer also showed good repeatability of detection.
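
    A heavily simplified sketch of how one of the reported calibration curves (y = 0.7276x + 0.0243 for the BCA protein kit) could be inverted to turn a well's color signal into a concentration estimate; how the phone image is reduced to the signal y is an assumption here, not the app's actual processing.

    ```python
    # Minimal sketch: well color signal -> concentration via a calibration curve.
    import numpy as np

    def well_signal(rgb_pixels):
        """Normalized color signal of one well from its RGB pixel block
        (green-channel attenuation relative to full scale, as an assumption)."""
        mean_rgb = rgb_pixels.reshape(-1, 3).mean(axis=0)
        return 1.0 - mean_rgb[1] / 255.0

    def bca_concentration(y, slope=0.7276, intercept=0.0243):
        # Invert the reported normalized calibration curve y = slope*x + intercept.
        return (y - intercept) / slope

    pixels = np.random.randint(0, 256, size=(40, 40, 3), dtype=np.uint8)  # placeholder well image
    y = well_signal(pixels)
    print("signal:", round(y, 3), "-> concentration (relative units):", round(bca_concentration(y), 3))
    ```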

  17. Employing the Hilbert-Huang Transform to analyze observed natural complex signals: Calm wind meandering cases

    Science.gov (United States)

    Martins, Luis Gustavo Nogueira; Stefanello, Michel Baptistella; Degrazia, Gervásio Annes; Acevedo, Otávio Costa; Puhales, Franciano Scremin; Demarco, Giuliano; Mortarini, Luca; Anfossi, Domenico; Roberti, Débora Regina; Costa, Felipe Denardin; Maldaner, Silvana

    2016-11-01

    In this study we analyze natural complex signals employing the Hilbert-Huang spectral analysis. Specifically, low wind meandering meteorological data are decomposed into turbulent and non turbulent components. These non turbulent movements, responsible for the absence of a preferential direction of the horizontal wind, provoke negative lobes in the meandering autocorrelation functions. The meandering characteristic time scales (meandering periods) are determined from the spectral peak provided by the Hilbert-Huang marginal spectrum. The magnitudes of the temperature and horizontal wind meandering period obtained agree with the results found from the best fit of the heuristic meandering autocorrelation functions. Therefore, the new method represents a new procedure to evaluate meandering periods that does not employ mathematical expressions to represent observed meandering autocorrelation functions.
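
    A minimal sketch of the Hilbert marginal spectrum step described above: given intrinsic mode functions (IMFs) from an empirical mode decomposition performed elsewhere, instantaneous amplitude is accumulated over instantaneous frequency, and the meandering period is read off the spectral peak. The synthetic IMF and the bin count are illustrative assumptions.

    ```python
    # Minimal sketch: Hilbert marginal spectrum from precomputed IMFs.
    import numpy as np
    from scipy.signal import hilbert

    def marginal_spectrum(imfs, dt, n_bins=200):
        """Accumulate instantaneous amplitude over instantaneous-frequency bins."""
        freqs, amps = [], []
        for imf in imfs:
            analytic = hilbert(imf)
            amplitude = np.abs(analytic)[1:]
            phase = np.unwrap(np.angle(analytic))
            inst_freq = np.diff(phase) / (2 * np.pi * dt)
            keep = inst_freq > 0
            freqs.append(inst_freq[keep])
            amps.append(amplitude[keep])
        freqs, amps = np.concatenate(freqs), np.concatenate(amps)
        bins = np.linspace(0, freqs.max(), n_bins + 1)
        spectrum, _ = np.histogram(freqs, bins=bins, weights=amps)
        centers = 0.5 * (bins[:-1] + bins[1:])
        return centers, spectrum

    # Hypothetical low-frequency IMF sampled at dt = 1 s with a 600 s oscillation.
    t = np.arange(0, 3600, 1.0)
    imfs = [np.sin(2 * np.pi * t / 600.0)]
    f, s = marginal_spectrum(imfs, dt=1.0)
    print("meandering period [s]:", 1.0 / f[np.argmax(s)])
    ```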

  18. Internet of THings Area Coverage Analyzer (ITHACA) for Complex Topographical Scenarios

    Directory of Open Access Journals (Sweden)

    Raúl Parada

    2017-10-01

    The number of connected devices is increasing worldwide, not only in contexts like the Smart City, but also in rural areas, to provide advanced features like smart farming or smart logistics. Thus, wireless network technologies to efficiently allocate Internet of Things (IoT) and Machine to Machine (M2M) communications are necessary. Traditional cellular networks like the Global System for Mobile communications (GSM) are widely used worldwide for IoT environments. Nevertheless, Low Power Wide Area Networks (LP-WAN) are becoming widespread as infrastructure for present and future IoT and M2M applications. Based also on a subscription service, the LP-WAN technology SIGFOX™ may compete with cellular networks in the M2M and IoT communications market, for instance in those projects where deploying the whole communications infrastructure is too complex or expensive. For decision makers to decide on the most suitable technology for each specific application, signal coverage is among the key features. Unfortunately, besides simulated coverage maps, decision makers do not have real coverage maps for SIGFOX™, as they can be found for cellular networks. Thereby, we propose the Internet of THings Area Coverage Analyzer (ITHACA), a signal analyzer prototype to provide automated signal coverage maps and analytics for LP-WAN. Experiments performed on the Gran Canaria Island, Spain (with both urban and complex topographic rural environments), returned a real SIGFOX™ service availability above 97% and more than 11% additional coverage with respect to the company-provided simulated maps. We expect that ITHACA may help decision makers to deploy the most suitable technologies for future IoT and M2M projects.

  19. Comparisons of complex network based models and real train flow model to analyze Chinese railway vulnerability

    International Nuclear Information System (INIS)

    Ouyang, Min; Zhao, Lijing; Hong, Liu; Pan, Zhezhe

    2014-01-01

    Recently numerous studies have applied complex network based models to study the performance and vulnerability of infrastructure systems under various types of attacks and hazards. But how effective are these models to capture their real performance response is still a question worthy of research. Taking the Chinese railway system as an example, this paper selects three typical complex network based models, including purely topological model (PTM), purely shortest path model (PSPM), and weight (link length) based shortest path model (WBSPM), to analyze railway accessibility and flow-based vulnerability and compare their results with those from the real train flow model (RTFM). The results show that the WBSPM can produce the train routines with 83% stations and 77% railway links identical to the real routines and can approach the RTFM the best for railway vulnerability under both single and multiple component failures. The correlation coefficient for accessibility vulnerability from WBSPM and RTFM under single station failures is 0.96 while it is 0.92 for flow-based vulnerability; under multiple station failures, where each station has the same failure probability fp, the WBSPM can produce almost identical vulnerability results with those from the RTFM under almost all failures scenarios when fp is larger than 0.62 for accessibility vulnerability and 0.86 for flow-based vulnerability
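
    As a toy illustration (not the Chinese railway data) of the weight-based shortest path viewpoint above, the sketch below computes a simple weighted-efficiency accessibility indicator before and after a single-station failure; the graph, the link lengths, and the choice of indicator are assumptions.

    ```python
    # Minimal sketch: weighted-efficiency accessibility under node failure.
    import itertools
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("A", "B", 120), ("B", "C", 80), ("C", "D", 200),
        ("B", "D", 260), ("D", "E", 90), ("A", "E", 400),
    ])  # weights play the role of link lengths in km (assumed)

    def weighted_efficiency(graph):
        """Average inverse shortest-path length over all node pairs
        (0 contribution for disconnected pairs), used here as a simple
        accessibility indicator."""
        nodes = list(graph.nodes())
        pairs = list(itertools.combinations(nodes, 2))
        total = 0.0
        for u, v in pairs:
            if nx.has_path(graph, u, v):
                total += 1.0 / nx.shortest_path_length(graph, u, v, weight="weight")
        return total / len(pairs) if pairs else 0.0

    print("baseline accessibility:", weighted_efficiency(G))
    H = G.copy()
    H.remove_node("B")               # single-station failure scenario
    print("after failure of B:", weighted_efficiency(H))
    ```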

  20. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    Science.gov (United States)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

    The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influences the risk of the failure of a single institution triggering a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposure) in the derivatives market between 202 financial intermediaries based in USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital) but is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.

  1. Rapid Monitoring of Mercury in Air from an Organic Chemical Factory in China Using a Portable Mercury Analyzer

    Directory of Open Access Journals (Sweden)

    Akira Yasutake

    2011-01-01

    A chemical factory using a production technology of acetaldehyde with mercury catalysis was located southeast of Qingzhen City in Guizhou Province, China. Previous research showed heavy mercury pollution throughout an extensive downstream area. A current investigation of the mercury distribution in ambient air, soils, and plants suggests that mobile mercury species in soils created elevated mercury concentrations in ambient air and vegetation. Mercury concentrations of up to 600 ng/m3 in air over the contaminated area provided evidence of the mercury transformation to volatile Hg(0). Mercury analysis of soil and plant samples demonstrated that the mercury concentrations in soil in vaporized and plant-absorbable forms were higher in the southern area, which was closer to the factory. Our results suggest that air monitoring using a portable mercury analyzer can be a convenient and useful method for the rapid detection and mapping of mercury pollution in advanced field surveys.

  2. Extractive Atmospheric Pressure Photoionization (EAPPI) Mass Spectrometry: Rapid Analysis of Chemicals in Complex Matrices.

    Science.gov (United States)

    Liu, Chengyuan; Yang, Jiuzhong; Wang, Jian; Hu, Yonghua; Zhao, Wan; Zhou, Zhongyue; Qi, Fei; Pan, Yang

    2016-10-01

    Extractive atmospheric pressure photoionization (EAPPI) mass spectrometry was designed for rapid qualitative and quantitative analysis of chemicals in complex matrices. In this method, an ultrasonic nebulization system was applied to sample extraction, nebulization, and vaporization. Mixed with a gaseous dopant, vaporized analytes were ionized through ambient photon-induced ion-molecule reactions, and were mass-analyzed by a high resolution time-of-flight mass spectrometer (TOF-MS). After careful optimization and testing with pure sample solutions, EAPPI was successfully applied to the fast screening of capsules, soil, natural products, and viscous compounds. Analysis was completed within a few seconds without the need for preseparation. Moreover, the quantification capability of EAPPI for matrices was evaluated by analyzing six polycyclic aromatic hydrocarbons (PAHs) in soil. The correlation coefficients (R2) for the standard curves of all six PAHs were above 0.99, and the detection limits were in the range of 0.16-0.34 ng/mg. In addition, EAPPI could also be used to monitor organic chemical reactions in real time.

  3. Design patterns for instructional materials that foster proficiency at analyzing and interpreting complex geoscience data

    Science.gov (United States)

    Kastens, K. A.; Krumhansl, R.

    2016-12-01

    The Next Generation Science Standards incorporate a stronger emphasis on having students work with data than did prior standards. This emphasis is most obvious in Practice 4: Analyzing and Interpreting Data, but also permeates performance expectations built on Practice 2 when students test models, Practice 6 when students construct explanations, and Practice 7 when student test claims with evidence. To support curriculum developers who wish to guide high school students towards more sophisticated engagement with complex data, we analyzed a well-regarded body of instructional materials designed for use in introductory college courses (http://serc.carleton.edu/integrate/teaching_materials/). Our analysis sought design patterns that can be reused for a variety of topics at the high school or college level. We found five such patterns, each of which was used in at least half of the modules analyzed. We describe each pattern, provide an example, and hypothesize a theory of action that could explain how the sequence of activities leverages known perceptual, cognitive and/or social processes to foster learning from and about data. In order from most to least frequent, the observed design patterns are as follows: In Data Puzzles, students respond to guiding questions about high-value snippets of data pre-selected and sequenced by the curriculum developer to lead to an Aha! inference. In Pooling Data to See the Big Picture, small groups analyze different instances of analogous phenomenon (e.g. different hurricanes, or different divergent plate boundaries) and pool their insights to extract the commonalities that constitute the essence of that phenomenon. In Make a Decision or Recommendation, students combine geoscience data with other factors (such as economic or environmental justice concerns) to make a decision or recommendation about a human or societal action. In Predict-Observe-Explain, students make a prediction about what the Earth will look like under conditions

  4. [Development of a portable mid-infrared rapid analyzer for oil concentration in water based on MEMS linear sensor array].

    Science.gov (United States)

    Gao, Zhi-fan; Zeng, Li-bo; Shi, Lei; Li, Kai; Yang, Yuan-zhou; Wu, Qiong-shui

    2014-06-01

    Aiming at existing problems of traditional spectral oil analyzers, such as weak environmental adaptability, low analytical efficiency, and poor measurement repeatability, this paper presents the design of a portable mid-infrared rapid analyzer for oil concentration in water. To reduce the volume of the instrument, a non-symmetrical folding M-type Czerny-Turner optical structure was adopted in the core optical path. With a periodically rotating chopper, controlled by a digital PID algorithm, applied for infrared light modulation, the modulation accuracy reached ±0.5%. Unlike traditional grating-scanning spectrophotometers, this instrument uses a fixed grating for light dispersion, which avoids grating-rotation error during measurement. A new MEMS infrared linear sensor array was applied for detection of the modulated spectral signals, which improved the measurement efficiency remarkably. Optical simulation and experimental results indicate that the spectral range is 2800-3200 cm⁻¹, the spectral resolution is 6 cm⁻¹ (at 3130 cm⁻¹), and the signal-to-noise ratio is up to 5200:1. The acquisition time is 13 ms per spectrogram, and the standard deviation of absorbance is less than 3 × 10⁻³. These performances fully meet the requirements of oil concentration measurement. Compared with traditional infrared spectral oil analyzers, the instrument demonstrated in this paper is smaller, more efficient, more precise, and better isolated against vibration and moisture. It is especially suitable for environmental monitoring departments performing real-time field measurements of oil concentration in water, and hence has broad application prospects in water quality monitoring.

  5. A Hybrid DGTD-MNA Scheme for Analyzing Complex Electromagnetic Systems

    KAUST Repository

    Li, Peng

    2015-01-07

    A hybrid electromagnetics (EM)-circuit simulator for analyzing complex systems consisting of EM devices loaded with nonlinear multi-port lumped circuits is described. The proposed scheme splits the computational domain into two subsystems: EM and circuit subsystems, where field interactions are modeled using Maxwell and Kirchhoff equations, respectively. Maxwell equations are discretized using a discontinuous Galerkin time domain (DGTD) scheme while Kirchhoff equations are discretized using a modified nodal analysis (MNA)-based scheme. The coupling between the EM and circuit subsystems is realized at the lumped ports, where related EM fields and circuit voltages and currents are allowed to "interact" via numerical flux. To account for nonlinear lumped circuit elements, the standard Newton-Raphson method is applied at every time step. Additionally, a local time-stepping scheme is developed to improve the efficiency of the hybrid solver. Numerical examples consisting of EM systems loaded with single and multiport linear/nonlinear circuit networks are presented to demonstrate the accuracy, efficiency, and applicability of the proposed solver.
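
    As a rough illustration of the Newton-Raphson step mentioned above for nonlinear lumped elements, the sketch below solves a single nonlinear port (an ideal diode in series with a resistor, driven by the port voltage delivered by the EM subsystem) at one time step. It is not the authors' solver; all parameter values and function names are assumptions.

      # Minimal sketch (not the authors' DGTD-MNA code): Newton-Raphson solution
      # of one nonlinear lumped port at a single time step.
      import math

      I_S = 1e-12      # diode saturation current (A), assumed
      V_T = 0.02585    # thermal voltage (V), assumed
      R = 50.0         # series resistance (ohm), assumed

      def residual(i, v_port):
          # Kirchhoff voltage residual f(i) = v_port - R*i - v_diode(i)
          return v_port - R * i - V_T * math.log(i / I_S + 1.0)

      def jacobian(i):
          # df/di of the scalar residual above
          return -R - V_T / (i + I_S)

      def newton_port_current(v_port, i0=1e-6, tol=1e-12, max_iter=50):
          # Newton-Raphson iteration for the port current
          i = i0
          for _ in range(max_iter):
              di = -residual(i, v_port) / jacobian(i)
              i = max(i + di, 1e-18)   # keep the argument of the log positive
              if abs(di) < tol:
                  break
          return i

      # current drawn by the lumped load for a 1.0 V port voltage
      print(newton_port_current(1.0))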

  6. Modeling and Analyzing Operational Decision-Making Synchronization of C2 Organization in Complex Environment

    Directory of Open Access Journals (Sweden)

    Zou Zhigang

    2013-01-01

    Full Text Available In order to improve the capability of operational decision-making synchronization (ODMS) in a command and control (C2) organization, the paper proposes that ODMS is a negotiation process of situation cognition with three phases, "situation cognition, situation interaction and decision-making synchronization", in a complex environment, and then gives a quantitative ODMS model and strategies. Firstly, measure indexes for the three steps above are given based on the time consumed in negotiation, and three patterns are proposed for timely, high-quality negotiation during situation interaction. Secondly, an ODMS model with two stages under continuously changing situations is put forward, and ODMS strategies are analyzed under environmental influence and time restrictions. Thirdly, simulation cases are given to validate the ODMS process under different continuously changing situations; the results of this model satisfy the actual restrictions better than previous models, and the ODMS process can be adjusted more reasonably to improve ODMS capability. We then discuss the case and summarize the influence factors on ODMS in the C2 organization as organization structure, shared information resources, negotiation patterns, and allocation of decision rights.

  7. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    Science.gov (United States)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of
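
    To make the supernode/superedge idea concrete, the toy sketch below partitions the nodes of a small property graph by one node property and aggregates edge weights between the resulting groups. It only illustrates the general notion of aggregation across a partition; it does not use or reproduce the authors' deep-graph implementation, and all data are invented.

      # Minimal sketch (assumed toy data, not the authors' implementation): group
      # nodes into "supernodes" by a property and sum edge weights between groups
      # into "superedges".
      from collections import defaultdict

      # nodes with heterogeneous properties; the 'region' property defines the partition
      nodes = {
          "a": {"region": "north", "value": 1.0},
          "b": {"region": "north", "value": 2.0},
          "c": {"region": "south", "value": 3.0},
          "d": {"region": "south", "value": 4.0},
      }
      # weighted edges between individual nodes
      edges = {("a", "c"): 0.5, ("a", "d"): 1.5, ("b", "c"): 2.0, ("b", "a"): 0.7}

      # supernodes: aggregate a node property per group
      supernodes = defaultdict(lambda: {"count": 0, "value_sum": 0.0})
      for props in nodes.values():
          g = props["region"]
          supernodes[g]["count"] += 1
          supernodes[g]["value_sum"] += props["value"]

      # superedges: sum edge weights between (ordered) pairs of groups
      superedges = defaultdict(float)
      for (u, v), w in edges.items():
          superedges[(nodes[u]["region"], nodes[v]["region"])] += w

      print(dict(supernodes))   # per-group node statistics
      print(dict(superedges))   # aggregated group-to-group interactions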

  8. Use of multiple singular value decompositions to analyze complex intracellular calcium ion signals

    KAUST Repository

    Martinez, Josue G.; Huang, Jianhua Z.; Burghardt, Robert C.; Barhoumi, Rola; Carroll, Raymond J.

    2009-01-01

    We compare calcium ion signaling (Ca²⁺) between two exposures; the data are presented as movies, or, more prosaically, time series of images. This paper describes novel uses of singular value decompositions (SVD) and weighted versions of them (WSVD) to extract the signals from such movies, in a way that is semi-automatic and tuned closely to the actual data and their many complexities. These complexities include the following. First, the images themselves are of no interest: all interest focuses on the behavior of individual cells across time.

  9. ReaderBench: A Multi-lingual Framework for Analyzing Text Complexity

    NARCIS (Netherlands)

    Dascalu, Mihai; Gutu, Gabriel; Ruseti, Stefan; Paraschiv, Ionut Cristian; Dessus, Philippe; McNamara, Danielle S.; Crossley, Scott; Trausan-Matu, Stefan

    2017-01-01

    Assessing textual complexity is a difficult but important endeavor, especially for adapting learning materials to students’ and readers’ levels of understanding. With the continuous growth of information technologies spanning various research fields, automated assessment tools have

  10. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    National Research Council Canada - National Science Library

    Edge, Kenneth S

    2007-01-01

    .... In addition to developing protection trees, this research improves the existing concept of attack trees and develops rule sets for the manipulation of metrics used in the security of complex systems...

  11. Decomposition of overlapping protein complexes: A graph theoretical method for analyzing static and dynamic protein associations

    Directory of Open Access Journals (Sweden)

    Guimarães Katia S

    2006-04-01

    Full Text Available Abstract Background Most cellular processes are carried out by multi-protein complexes, groups of proteins that bind together to perform a specific task. Some proteins form stable complexes, while other proteins form transient associations and are part of several complexes at different stages of a cellular process. A better understanding of this higher-order organization of proteins into overlapping complexes is an important step towards unveiling functional and evolutionary mechanisms behind biological networks. Results We propose a new method for identifying and representing overlapping protein complexes (or larger units called functional groups) within a protein interaction network. We develop a graph-theoretical framework that enables automatic construction of such a representation. We illustrate the effectiveness of our method by applying it to the TNFα/NF-κB and pheromone signaling pathways. Conclusion The proposed representation helps in understanding the transitions between functional groups and allows for tracking a protein's path through a cascade of functional groups. Therefore, depending on the nature of the network, our representation is capable of elucidating temporal relations between functional groups. Our results show that the proposed method opens a new avenue for the analysis of protein interaction networks.
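
    The sketch below is only a loose illustration of overlapping group detection in a toy protein-interaction graph, using clique percolation from networkx so that one protein can belong to several groups; it is not the decomposition method proposed by the authors, and the protein names are made up.

      # Illustrative sketch only (not the authors' method): overlapping groups in
      # a toy protein-interaction graph via clique percolation.
      import networkx as nx
      from networkx.algorithms.community import k_clique_communities

      G = nx.Graph()
      G.add_edges_from([
          ("P1", "P2"), ("P1", "P3"), ("P2", "P3"),   # complex-like triangle 1
          ("P3", "P4"), ("P3", "P5"), ("P4", "P5"),   # triangle 2 sharing P3
          ("P5", "P6"),                               # transient association
      ])

      groups = list(k_clique_communities(G, 3))
      for i, grp in enumerate(groups, start=1):
          print(f"group {i}: {sorted(grp)}")

      # proteins appearing in more than one group are candidates for transient,
      # overlapping associations
      membership = {}
      for i, grp in enumerate(groups, start=1):
          for p in grp:
              membership.setdefault(p, []).append(i)
      print({p: g for p, g in membership.items() if len(g) > 1})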

  12. High-Throughput and Rapid Screening of Low-Mass Hazardous Compounds in Complex Samples.

    Science.gov (United States)

    Wang, Jing; Liu, Qian; Gao, Yan; Wang, Yawei; Guo, Liangqia; Jiang, Guibin

    2015-07-07

    Rapid screening and identification of hazardous chemicals in complex samples is of extreme importance for public safety and environmental health studies. In this work, we report a new method for high-throughput, sensitive, and rapid screening of low-mass hazardous compounds in complex media without complicated sample preparation procedures. The method is based on size-selective enrichment on ordered mesoporous carbon followed by matrix-assisted laser desorption/ionization-time-of-flight mass spectrometry analysis with graphene as a matrix. The ordered mesoporous carbon CMK-8 can exclude interferences from large molecules in complex samples (e.g., human serum, urine, and environmental water samples) and efficiently enrich a wide variety of low-mass hazardous compounds. The method can work at very low concentrations, down to parts-per-trillion (ppt) levels, and it is much faster and more facile than conventional methods. It was successfully applied to rapidly screen and identify unknown toxic substances such as perfluorochemicals in human serum samples from athletes and workers. Therefore, this method not only can sensitively detect target compounds but also can identify unknown hazardous compounds in complex media.

  13. Methodological issues in analyzing human communication – the complexities of multimodality

    DEFF Research Database (Denmark)

    Høegh, Tina

    2017-01-01

    This chapter develops a multimodal method for transcribing speech, communication, and performance. The chapter discusses the methodological solutions to the complex translation of speech, language rhythm and gesture in time and space into the two-dimensional format of a piece of paper. The focus...

  14. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela

    2017-08-29

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  15. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela; Bender, Miriam; Ombao, Hernando

    2017-01-01

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be
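
    For readers unfamiliar with the basic setup, the sketch below fits a classical interrupted time series (segmented regression) with a level shift and slope change at the intervention point, using ordinary least squares on simulated data. It does not implement the robust estimator proposed in the paper.

      # Minimal sketch, not the authors' robust estimator: classical interrupted
      # time-series regression with a level shift and slope change, fitted by OLS
      # on simulated monthly outcomes.
      import numpy as np

      rng = np.random.default_rng(0)
      n, t0 = 48, 24                      # 48 monthly outcomes, intervention at month 24
      t = np.arange(n)
      post = (t >= t0).astype(float)      # indicator of the post-intervention period
      y = 10 + 0.1 * t + 2.0 * post + 0.3 * post * (t - t0) + rng.normal(0, 0.5, n)

      # design: intercept, baseline trend, level change, post-intervention trend change
      X = np.column_stack([np.ones(n), t, post, post * (t - t0)])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      labels = ["intercept", "pre-trend", "level change", "trend change"]
      for name, b in zip(labels, beta):
          print(f"{name:13s} {b: .3f}")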

  16. Rapid screening for nuclear genes mutations in isolated respiratory chain complex I defects.

    Science.gov (United States)

    Pagniez-Mammeri, Hélène; Lombes, Anne; Brivet, Michèle; Ogier-de Baulny, Hélène; Landrieu, Pierre; Legrand, Alain; Slama, Abdelhamid

    2009-04-01

    Complex I (reduced nicotinamide adenine dinucleotide (NADH):ubiquinone oxidoreductase) deficiency is the most common cause of respiratory chain defects. The molecular bases of complex I deficiencies are rarely identified because of the dual genetic origin of this multi-enzymatic complex (nuclear DNA and mitochondrial DNA) and the lack of phenotype-genotype correlation. We used a rapid method to screen patients with isolated complex I deficiencies for nuclear gene mutations by Surveyor nuclease digestion of cDNAs. Eight complex I nuclear genes, among the most frequently mutated (NDUFS1, NDUFS2, NDUFS3, NDUFS4, NDUFS7, NDUFS8, NDUFV1 and NDUFV2), were studied in 22 cDNA fragments spanning their coding sequences in 8 patients with a biochemically proven complex I deficiency. Single nucleotide polymorphisms and missense mutations were detected in 18.7% of the cDNA fragments by Surveyor nuclease treatment. Molecular defects were detected in 3 patients. Surveyor nuclease screening is a reliable method for genotyping nuclear complex I deficiencies, is easy to interpret, and limits the number of sequencing reactions. Its use will enhance the possibility of prenatal diagnosis and help us to better understand complex I molecular defects.

  17. The use of synthetic spectra to test the preparedness to evaluate and analyze complex gamma spectra

    International Nuclear Information System (INIS)

    Nikkinen, M

    2001-10-01

    This is the report of two exercises that were run under the NKS BOK-1.1 sub-project. In these exercises, synthetic gamma spectra were developed to practice the analysis of difficult spectra typically seen after a severe nuclear accident. The spectra were analyzed twice: first, participants were given a short time to report results, resembling an actual emergency preparedness situation; then a longer period of time was allowed to tune the laboratory analysis results for quality assurance purposes. The exercise proved that it is possible to move measurement data from one laboratory to another if a second opinion on the analysis is needed. It was also felt that this kind of exercise would enhance the experience the laboratories have in analyzing accident data. Participants expressed the need for additional exercises of this type, as this is an inexpensive and easy way to practice quick emergency response situations not normally seen in daily laboratory routines. (au)

  18. Analyzing complex wake-terrain interactions and its implications on wind-farm performance.

    Science.gov (United States)

    Tabib, Mandar; Rasheed, Adil; Fuchs, Franz

    2016-09-01

    Rotating wind turbine blades generate complex wakes involving vortices (helical tip vortices, root vortices, etc.). These wakes are regions of high velocity deficit and high turbulence intensity, and they tend to degrade the performance of downstream turbines. Hence, a conservative inter-turbine distance of up to 10 turbine diameters (10D) is sometimes used in wind-farm layout (particularly in cases of flat terrain). This ensures that wake effects will not reduce the overall wind-farm performance, but it leads to a larger land footprint for establishing a wind farm. In the case of complex terrain, within a short distance (say 10D) the nearby terrain can rise in altitude enough to influence the wake dynamics. This wake-terrain interaction can happen either (a) indirectly, through an interaction of the wake (both the near tip vortex and the far-wake large-scale vortex) with terrain-induced turbulence (especially the smaller eddies generated by small ridges within the terrain), or (b) directly, by obstructing the wake region partially or fully in its flow path. Hence, an enhanced understanding of wake development due to wake-terrain interaction will help in wind-farm design. To this end the current study involves: (1) understanding the numerics for successful simulation of vortices, (2) understanding the fundamental vortex-terrain interaction mechanism through studies devoted to the interaction of a single vortex with different terrains, and (3) relating the influence of vortex-terrain interactions to the performance of a wind farm by studying a multi-turbine wind-farm layout under different terrains. The results on the interaction of terrain and vortex show a much faster decay of the vortex for complex terrain compared to flatter terrain. The potential reasons identified to explain the observation are (a) the formation of secondary vortices in the flow and their interaction with the primary vortex and (b) enhanced vorticity diffusion due to increased terrain-induced turbulence. The implications of

  19. Use of multiple singular value decompositions to analyze complex intracellular calcium ion signals

    KAUST Repository

    Martinez, Josue G.

    2009-12-01

    We compare calcium ion signaling (Ca²⁺) between two exposures; the data are presented as movies, or, more prosaically, time series of images. This paper describes novel uses of singular value decompositions (SVD) and weighted versions of them (WSVD) to extract the signals from such movies, in a way that is semi-automatic and tuned closely to the actual data and their many complexities. These complexities include the following. First, the images themselves are of no interest: all interest focuses on the behavior of individual cells across time, and thus, the cells need to be segmented in an automated manner. Second, the cells themselves have 100+ pixels, so that they form 100+ curves measured over time, and data compression is required to extract the features of these curves. Third, some of the pixels in some of the cells are subject to image saturation due to bit depth limits, and this saturation needs to be accounted for if one is to normalize the images in a reasonably unbiased manner. Finally, the Ca²⁺ signals have oscillations or waves that vary with time, and these signals need to be extracted. Thus, our aim is to show how to use multiple weighted and standard singular value decompositions to detect, extract and clarify the Ca²⁺ signals. Our signal extraction methods then lead to simple although finely focused statistical methods to compare Ca²⁺ signals across experimental conditions.
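
    The sketch below illustrates only the core idea of compressing a segmented cell's pixels-by-time matrix with an SVD to recover a shared time course; it omits the weighting, saturation handling, and normalization steps described above, and the data are simulated.

      # Minimal sketch of the generic idea (not the authors' full WSVD pipeline):
      # extract the dominant temporal Ca2+ signal of one segmented cell from the
      # leading singular triplet of its pixels-by-time intensity matrix.
      import numpy as np

      rng = np.random.default_rng(1)
      n_pixels, n_frames = 120, 300
      t = np.linspace(0, 30, n_frames)                       # seconds, assumed frame times
      true_signal = 1.0 + 0.4 * np.sin(2 * np.pi * 0.2 * t)  # oscillating Ca2+ wave
      pixel_gains = rng.uniform(0.5, 1.5, n_pixels)          # per-pixel brightness
      cell = np.outer(pixel_gains, true_signal) + rng.normal(0, 0.05, (n_pixels, n_frames))

      # rank-1 SVD approximation: the first right singular vector is the shared time course
      U, s, Vt = np.linalg.svd(cell - cell.mean(axis=1, keepdims=True), full_matrices=False)
      time_course = s[0] * Vt[0]   # compressed representation of the cell's signal

      print("variance explained by first component:", s[0]**2 / np.sum(s**2))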

  20. Taking advantage of local structure descriptors to analyze interresidue contacts in protein structures and protein complexes.

    Science.gov (United States)

    Martin, Juliette; Regad, Leslie; Etchebest, Catherine; Camproux, Anne-Claude

    2008-11-15

    Interresidue contacts in protein structures and at protein-protein interfaces are classically described by the amino acid types of the interacting residues, and the local structural context of the contact, if any, is described using secondary structures. In this study, we present an alternative analysis of interresidue contacts using local structures defined by the structural alphabet introduced by Camproux et al. This structural alphabet allows a 3D structure to be described as a sequence of prototype fragments called structural letters, of 27 different types. Each residue can then be assigned to a particular local structure, even in loop regions. The analysis of interresidue contacts within protein structures, defined using Voronoï tessellations, reveals that pairwise contact specificity is greater in terms of structural letters than amino acids. Using a simple heuristic based on specificity score comparison, we find that 74% of the long-range contacts within protein structures are better described using structural letters than amino acid types. The investigation is extended to a set of protein-protein complexes, showing that similar global rules apply as for intraprotein contacts, with 64% of the interprotein contacts best described by local structures. We then present an evaluation of pairing functions integrating structural letters for decoy scoring and show that some complexes could benefit from the use of structural letter-based pairing functions.

  1. Framework based on communicability and flow to analyze complex network dynamics

    Science.gov (United States)

    Gilson, M.; Kouvaris, N. E.; Deco, G.; Zamora-López, G.

    2018-05-01

    Graph theory constitutes a widely used and established field providing powerful tools for the characterization of complex networks. The intricate topology of networks can also be investigated by means of the collective dynamics observed in the interactions of self-sustained oscillations (synchronization patterns) or propagation-like processes such as random walks. However, networks are often inferred from real data forming dynamic systems, which are different from those employed to reveal their topological characteristics. This stresses the necessity for a theoretical framework dedicated to the mutual relationship between the structure and dynamics in complex networks, as the two sides of the same coin. Here we propose a rigorous framework based on the network response over time (i.e., the Green function) to study interactions between nodes across time. For this purpose we define the flow that describes the interplay between the network connectivity and external inputs. This multivariate measure relates to the concepts of graph communicability and the map equation. We illustrate our theory using the multivariate Ornstein-Uhlenbeck process, which describes stable and non-conservative dynamics, but the formalism can be adapted to other local dynamics for which the Green function is known. We provide applications to classical network examples, such as small-world ring and hierarchical networks. Our theory defines a comprehensive framework that is canonically related to directed and weighted networks, thus paving the way to revise the standards for network analysis, from the pairwise interactions between nodes to the global properties of networks, including community detection.
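
    As a simplified illustration of the network response (Green function) for the multivariate Ornstein-Uhlenbeck case, the sketch below computes expm(J t) for a small weighted network; the connectivity values, leakage convention, and time horizon are assumptions, not the authors' parameterization.

      # Minimal sketch under simplifying assumptions (not the authors' code):
      # Green function of a multivariate Ornstein-Uhlenbeck process on a toy
      # weighted network, G(t) = expm(J t).
      import numpy as np
      from scipy.linalg import expm

      # assumed 4-node connectivity; C[i, j] is the weight of the link i -> j
      C = np.array([[0.0, 0.8, 0.0, 0.0],
                    [0.0, 0.0, 0.6, 0.0],
                    [0.3, 0.0, 0.0, 0.9],
                    [0.0, 0.0, 0.0, 0.0]])
      tau = 1.0                          # leakage time constant, assumed
      J = -np.eye(4) / tau + C.T         # Jacobian of dx/dt = J x (+ noise)

      t = 1.5                            # time horizon, arbitrary units
      G = expm(J * t)                    # network response (Green function) at time t

      # response of every node to a unit impulse injected at node 0
      print(G[:, 0])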

  2. A system to analyze the complex physiological states of coal solubilizing fungi

    Energy Technology Data Exchange (ETDEWEB)

    Hoelker, U.; Moenkemann, H.; Hoefer, M. [Universitaet Bonn, Bonn (Germany). Botanisches Institut

    1997-11-01

    The mechanism by which some microorganisms solubilize brown coal is still unknown. The paper discusses the deuteromycetes Fusarium oxysporum and Trichoderma atroviride as a suitable test system to analyse the complex fungal physiology relating to coal solubilization. The two fungi can occur in two different, growth-substrate-controlled physiological states: a coal-solubilizing one, when cells are grown on glutamate or gluconate as substrate, and a non-solubilizing one, when grown on carbohydrates. When grown on carbohydrates, F. oxysporum produces the pigment bikaverin. Purified bikaverin also inhibits coal solubilization by T. atroviride. The ability to solubilize coal is constitutive in F. oxysporum, while in T. atroviride it has to be induced. 10 refs., 3 figs., 3 tabs.

  3. A new theoretical approach to analyze complex processes in cytoskeleton proteins.

    Science.gov (United States)

    Li, Xin; Kolomeisky, Anatoly B

    2014-03-20

    Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because the mean-field models employed neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on the analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.

  4. A Complex Network Model for Analyzing Railway Accidents Based on the Maximal Information Coefficient

    International Nuclear Information System (INIS)

    Shao Fu-Bo; Li Ke-Ping

    2016-01-01

    Identifying important influencing factors is a key issue in railway accident analysis. In this paper, employing the maximal information coefficient (MIC), a good measure of dependence for two-variable relationships that can capture a wide range of associations, a complex network model for railway accident analysis is designed in which nodes denote factors of railway accidents and edges are generated between two factors whose MIC values are larger than or equal to the dependent criterion. The variation of the network structure is studied: as the dependent criterion increases, the network becomes approximately scale-free. Moreover, employing the proposed network, important influencing factors are identified, and we find that the annual track density-gross tonnage factor is an important factor, being a cut vertex when the dependent criterion is equal to 0.3. From the network, it is also found that railway development is unbalanced across different states, which is consistent with the facts. (paper)
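
    As a toy illustration of the construction described above, the sketch below builds a factor network from a small matrix of placeholder MIC values, adds an edge whenever the value meets the dependent criterion, and lists the cut vertices; the factor names and MIC values are invented, and the MIC computation itself is not shown.

      # Illustrative sketch (placeholder MIC values, not real accident data):
      # threshold a pairwise MIC matrix into a factor network and find cut vertices.
      import networkx as nx

      factors = ["track density", "gross tonnage", "weather", "signal failure"]
      # symmetric matrix of assumed MIC values between factors
      mic = [[1.0, 0.45, 0.10, 0.20],
             [0.45, 1.0, 0.05, 0.35],
             [0.10, 0.05, 1.0, 0.32],
             [0.20, 0.35, 0.32, 1.0]]
      criterion = 0.3

      G = nx.Graph()
      G.add_nodes_from(factors)
      for i in range(len(factors)):
          for j in range(i + 1, len(factors)):
              if mic[i][j] >= criterion:
                  G.add_edge(factors[i], factors[j], weight=mic[i][j])

      print("edges:", list(G.edges(data="weight")))
      print("cut vertices:", list(nx.articulation_points(G)))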

  5. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  6. A Nanometer Aerosol Size Analyzer (nASA) for Rapid Measurement of High-concentration Size Distributions

    International Nuclear Information System (INIS)

    Han, H.-S.; Chen, D.-R.; Pui, David Y.H.; Anderson, Bruce E.

    2000-01-01

    We have developed a fast-response nanometer aerosol size analyzer (nASA) that is capable of scanning 30 size channels between 3 and 100 nm in a total time of 3 s. The analyzer includes a bipolar charger (Po-210), an extended-length nanometer differential mobility analyzer (Nano-DMA), and an electrometer (TSI 3068). This combination of components provides particle size spectra at a scan rate of 0.1 s per channel free of uncertainties caused by response-time-induced smearing. The nASA thus offers a fast response for aerosol size distribution measurements in high-concentration conditions and also eliminates the need for applying a de-smearing algorithm to resulting data. In addition, because of its thermodynamically stable means of particle detection, the nASA is useful for applications requiring measurements over a broad range of sample pressures and temperatures. Indeed, experimental transfer functions determined for the extended-length Nano-DMA using the tandem differential mobility analyzer (TDMA) technique indicate the nASA provides good size resolution at pressures as low as 200 Torr. Also, as was demonstrated in tests to characterize the soot emissions from the J85-GE engine of a T-38 aircraft, the broad dynamic concentration range of the nASA makes it particularly suitable for studies of combustion or particle formation processes. Further details of the nASA performance as well as results from calibrations, laboratory tests and field applications are presented below

  7. A climate for speciation: rapid spatial diversification within the Sorex cinereus complex of shrews

    Science.gov (United States)

    Hope, Andrew G.; Speer, Kelly A.; Demboski, John R.; Talbot, Sandra L.; Cook, Joseph A.

    2012-01-01

    The cyclic climate regime of the late Quaternary caused dramatic environmental change at high latitudes. Although these events may have been brief in periodicity from an evolutionary standpoint, multiple episodes of allopatry and divergence have been implicated in rapid radiations of a number of organisms. Shrews of the Sorex cinereus complex have long challenged taxonomists due to similar morphology and parapatric geographic ranges. Here, multi-locus phylogenetic and demographic assessments using a coalescent framework were combined to investigate spatiotemporal evolution of 13 nominal species with a widespread distribution throughout North America and across Beringia into Siberia. For these species, we first test a hypothesis of recent differentiation in response to Pleistocene climate versus more ancient divergence that would coincide with pre-Pleistocene perturbations. We then investigate the processes driving diversification over multiple continents. Our genetic analyses highlight novel diversity within these morphologically conserved mammals and clarify relationships between geographic distribution and evolutionary history. Demography within and among species indicates both regional stability and rapid expansion. Ancestral ecological differentiation coincident with early cladogenesis within the complex enabled alternating and repeated episodes of allopatry and expansion where successive glacial and interglacial phases each promoted divergence. The Sorex cinereus complex constitutes a valuable model for future comparative assessments of evolution in response to cyclic environmental change.

  8. Application of adjustable pulse lasers to studying rapid reaction kinetics of excited lanthanide complexing

    Energy Technology Data Exchange (ETDEWEB)

    Gruzdev, V.P. (Gosudarstvennyj Opticheskij Inst., Leningrad (USSR))

    1983-12-01

    Using some europium(III) ion complexes, new possibilities opened up by the application of adjustable pulse lasers for studying rapid reactions of electron-excited metal ion complexing are demonstrated. The rhodamine 6Zh pulse laser is used as a source of nonequilibrium photoexcitation of an array of Eu³⁺ complexes in the luminescent kinetic spectroscopy method. The following results are obtained: for the first time, the rate of the reaction of acetate ion substitution for water molecules of an excited (⁵D₀) Eu³⁺ ion was measured to be (0.7±0.2)×10⁷ s⁻¹; using direct experiments, the lower limit for the rate of transition of one isomeric form of the excited Eu·EDTA complex into another in aqueous solution was determined to be 5×10⁵ s⁻¹ at 295 K; the kinetics of excitation energy migration between aqueous solvates of Eu³⁺ and Eu·EDTA complexes is investigated.

  9. Rapid identification of pork for halal authentication using the electronic nose and gas chromatography mass spectrometer with headspace analyzer.

    Science.gov (United States)

    Nurjuliana, M; Che Man, Y B; Mat Hashim, D; Mohamed, A K S

    2011-08-01

    The volatile compounds of pork, other meats and meat products were studied using an electronic nose and a gas chromatography mass spectrometer with headspace analyzer (GCMS-HS) for halal verification. The zNose™ was successfully employed for identification and differentiation of pork and pork sausages from beef, mutton and chicken meats and sausages, which was achieved using a visual odor pattern called VaporPrint™, derived from the frequency of the surface acoustic wave (SAW) detector of the electronic nose. GCMS-HS was employed to separate and analyze the headspace gases from samples into peaks corresponding to individual compounds for the purpose of identification. Principal component analysis (PCA) was applied for data interpretation. Analysis by PCA was able to cluster and discriminate pork from other types of meats and sausages. It was shown that PCA could provide a good separation of the samples, with 67% of the total variance accounted for by PC1. Copyright © 2011 Elsevier Ltd. All rights reserved.
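
    The sketch below shows the generic PCA step on a simulated feature matrix of volatile-compound peak areas, projecting samples onto the first two principal components; the data, labels, and separation are invented and do not reproduce the study's measurements.

      # Minimal sketch with simulated data (not the study's measurements):
      # project volatile-compound features onto the first two principal components.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(42)
      labels = ["pork"] * 5 + ["beef"] * 5 + ["chicken"] * 5
      # 15 samples x 8 volatile-compound features; pork is shifted on a few features
      X = rng.normal(0, 1, (15, 8))
      X[:5, :3] += 3.0

      pca = PCA(n_components=2)
      scores = pca.fit_transform(X)
      print("explained variance ratio:", pca.explained_variance_ratio_)
      for lab, (pc1, pc2) in zip(labels, scores):
          print(f"{lab:8s} PC1={pc1: .2f}  PC2={pc2: .2f}")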

  10. Rapid

    Directory of Open Access Journals (Sweden)

    Nahla M. Wassim

    2013-01-01

    Full Text Available Aedes caspius mosquitoes are incriminated as a potential reservoir of “Rift Valley Fever Virus” (RVF) during interepizootic periods in Egypt. Ae. caspius contains two distinct forms which are morphologically indistinguishable but differ in physiology and behavior: form (a) requires a blood meal for each egg batch (anautogeny) and is unable to mate in confined spaces (eurygamous); form (b) lays egg batches without a blood meal (autogenous) and can mate in confined spaces (stenogamous). In this work, we collected the autogenous and anautogenous forms of Ae. caspius from two different breeding habitats in the Qalyubia Governorate. Analysis of the Drosophila ace-orthologous acetylcholinesterase gene revealed that a single polymorphic region characterized each species. Based on this region, specific primers were used to amplify the entire section of intron II and sections of exon 2 and exon 3 of the ace-2 gene to differentiate the complex species of mosquitoes. The amplicons of the anautogenous form were 441 bp in size, 116 bp larger than those of the autogenous form of Ae. caspius. High rates of point mutations were found: deletion/insertion events accounted for 120 bases, transversion mutations for 44 bases, and transition mutations for a comparable 43 bases. The genetic distance between the two forms was 0.01.

  11. Evaluation of a Fully Automated Analyzer for Rapid Measurement of Water Vapor Sorption Isotherms for Applications in Soil Science

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per

    2014-01-01

    The characterization and description of important soil processes such as water vapor transport, volatilization of pesticides, and hysteresis require accurate means for measuring the soil water characteristic (SWC) at low water potentials. Until recently, measurement of the SWC at low water potentials was constrained by hydraulic decoupling and long equilibration times when pressure plates or single-point, chilled-mirror instruments were used. A new, fully automated Vapor Sorption Analyzer (VSA) helps to overcome these challenges and allows faster measurement of highly detailed water vapor

  12. A Rapid Convergent Low Complexity Interference Alignment Algorithm for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Lihui Jiang

    2015-07-01

    Full Text Available Interference alignment (IA) is a novel technique that can effectively eliminate the interference and approach the sum capacity of wireless sensor networks (WSNs) when the signal-to-noise ratio (SNR) is high, by casting the desired signal and interference into different signal subspaces. The traditional alternating minimization interference leakage (AMIL) algorithm for IA shows good performance in high SNR regimes; however, the complexity of the AMIL algorithm increases dramatically as the number of users and antennas increases, posing limits to its applications in practical systems. In this paper, a novel IA algorithm, called the directional quartic optimal (DQO) algorithm, is proposed to minimize the interference leakage with rapid convergence and low complexity. The properties of the AMIL algorithm are investigated, and it is discovered that the difference between two consecutive iteration results of the AMIL algorithm will approximately point to the convergence solution when the precoding and decoding matrices obtained from the intermediate iterations are sufficiently close to their convergence values. Based on this important property, the proposed DQO algorithm employs a line search procedure so that it can converge to the destination directly. In addition, the optimal step size can be determined analytically by optimizing a quartic function. Numerical results show that the proposed DQO algorithm can suppress the interference leakage more rapidly than the traditional AMIL algorithm, and can achieve the same level of sum rate as the AMIL algorithm with far fewer iterations and less execution time.

  13. A Rapid Convergent Low Complexity Interference Alignment Algorithm for Wireless Sensor Networks.

    Science.gov (United States)

    Jiang, Lihui; Wu, Zhilu; Ren, Guanghui; Wang, Gangyi; Zhao, Nan

    2015-07-29

    Interference alignment (IA) is a novel technique that can effectively eliminate the interference and approach the sum capacity of wireless sensor networks (WSNs) when the signal-to-noise ratio (SNR) is high, by casting the desired signal and interference into different signal subspaces. The traditional alternating minimization interference leakage (AMIL) algorithm for IA shows good performance in high SNR regimes; however, the complexity of the AMIL algorithm increases dramatically as the number of users and antennas increases, posing limits to its applications in practical systems. In this paper, a novel IA algorithm, called the directional quartic optimal (DQO) algorithm, is proposed to minimize the interference leakage with rapid convergence and low complexity. The properties of the AMIL algorithm are investigated, and it is discovered that the difference between two consecutive iteration results of the AMIL algorithm will approximately point to the convergence solution when the precoding and decoding matrices obtained from the intermediate iterations are sufficiently close to their convergence values. Based on this important property, the proposed DQO algorithm employs a line search procedure so that it can converge to the destination directly. In addition, the optimal step size can be determined analytically by optimizing a quartic function. Numerical results show that the proposed DQO algorithm can suppress the interference leakage more rapidly than the traditional AMIL algorithm, and can achieve the same level of sum rate as the AMIL algorithm with far fewer iterations and less execution time.
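
    The analytical step-size idea can be illustrated in isolation: along the search direction the leakage is a quartic in the step size, so the optimal step is the real root of the cubic derivative that gives the smallest quartic value. The coefficients below are assumed, not derived from an actual IA cost function.

      # Minimal sketch of the step-size idea (assumed coefficients, not a real
      # IA leakage function): minimize a quartic by solving its cubic derivative.
      import numpy as np

      # quartic cost q(s) = a4*s^4 + a3*s^3 + a2*s^2 + a1*s + a0 (assumed values)
      a = np.array([2.0, -3.0, -1.0, 4.0, 5.0])       # a4, a3, a2, a1, a0

      dq = np.polyder(a)                              # cubic derivative coefficients
      critical = np.roots(dq)                         # up to three critical points
      real_crit = critical[np.isclose(critical.imag, 0)].real

      best_step = min(real_crit, key=lambda s: np.polyval(a, s))
      print("optimal step size:", best_step, "cost:", np.polyval(a, best_step))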

  14. DC-Analyzer-facilitated combinatorial strategy for rapid directed evolution of functional enzymes with multiple mutagenesis sites.

    Science.gov (United States)

    Wang, Xiong; Zheng, Kai; Zheng, Huayu; Nie, Hongli; Yang, Zujun; Tang, Lixia

    2014-12-20

    Iterative saturation mutagenesis (ISM) has been shown to be a powerful method for directed evolution. In this study, the approach was modified (termed M-ISM) by combining the single-site saturation mutagenesis method with a DC-Analyzer-facilitated combinatorial strategy, aiming to evolve novel biocatalysts efficiently in cases where multiple sites are targeted simultaneously. Initially, all target sites were explored individually by constructing single-site saturation mutagenesis libraries. Next, the top two to four variants in each library were selected and combined using the DC-Analyzer-facilitated combinatorial strategy. In addition to site-saturation mutagenesis, iterative saturation mutagenesis also needed to be performed. The advantages of M-ISM over ISM were that the screening effort was greatly reduced and the entire M-ISM procedure was less time-consuming. The M-ISM strategy was successfully applied to the randomization of halohydrin dehalogenase from Agrobacterium radiobacter AD1 (HheC) when five interesting sites were targeted simultaneously. After screening 900 clones in total, six positive mutants were obtained. These mutants exhibited 4.0- to 9.3-fold higher k_cat values than the wild-type HheC toward 1,3-dichloro-2-propanol (1,3-DCP). With the ISM strategy, in contrast, the best hit showed a 5.9-fold higher k_cat value toward 1,3-DCP than the wild-type HheC, and was obtained only after screening 4000 clones from four rounds of mutagenesis. Therefore, M-ISM could serve as a simple and efficient version of ISM for the randomization of target genes with multiple positions of interest.

  15. Using iMCFA to Perform the CFA, Multilevel CFA, and Maximum Model for Analyzing Complex Survey Data.

    Science.gov (United States)

    Wu, Jiun-Yu; Lee, Yuan-Hsuan; Lin, John J H

    2018-01-01

    To construct CFA, MCFA, and maximum MCFA with LISREL v.8 and below, we provide iMCFA (integrated Multilevel Confirmatory Analysis) to examine the potential multilevel factorial structure in complex survey data. Modeling a multilevel structure for complex survey data is complicated because building a multilevel model is not an infallible statistical strategy unless the hypothesized model is close to the real data structure. Methodologists have suggested using different modeling techniques to investigate the potential multilevel structure of survey data. Using iMCFA, researchers can visually set the between- and within-level factorial structure to fit MCFA, CFA and/or MAX MCFA models for complex survey data. iMCFA can then yield between- and within-level variance-covariance matrices, calculate intraclass correlations, perform the analyses, and generate the outputs for the respective models. The summary of the analytical outputs from LISREL is gathered and tabulated for further model comparison and interpretation. iMCFA also provides LISREL syntax for the different models for researchers' future use. An empirical and a simulated multilevel dataset with complex and simple structures in the within or between level were used to illustrate the usability and effectiveness of the iMCFA procedure for analyzing complex survey data. The analytic results of iMCFA using Muthen's limited information estimator were compared with those of Mplus using Full Information Maximum Likelihood regarding the effectiveness of different estimation methods.

  16. Analyzing the spatial patterns and drivers of ecosystem services in rapidly urbanizing Taihu Lake Basin of China

    Science.gov (United States)

    Ai, Junyong; Sun, Xiang; Feng, Lan; Li, Yangfan; Zhu, Xiaodong

    2015-09-01

    Quantifying and mapping the distribution patterns of ecosystem services can help to ascertain which services should be protected and where investments should be directed to improve synergies and reduce tradeoffs. Moreover, the indicators of urbanization that affect the provision of ecosystem services must be identified to determine which approach to adopt in formulating policies related to these services. This paper presents a case study that maps the distribution of multiple ecosystem services and analyzes the ways in which they interact. The relationship between the supply of ecosystem services and the socio-economic development in the Taihu Lake Basin of eastern China is also revealed. Results show a significant negative relationship between crop production and tourism income (p < 0.005) and a positive relationship between crop production, nutrient retention, and carbon sequestration (p < 0.005). The negative effects of the urbanization process on providing and regulating services are also identified through a comparison of the ecosystem services in large and small cities. Regression analysis was used to compare and elucidate the relative significance of the selected urbanization factors to ecosystem services. The results indicate that urbanization level is the most substantial factor inversely correlated with crop production (R² = 0.414) and nutrient retention services (R² = 0.572). Population density is the most important factor that negatively affects carbon sequestration (R² = 0.447). The findings of this study suggest the potential relevance of ecosystem service dynamics to urbanization management and decision making.

  17. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    Science.gov (United States)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R² = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R² = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
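
    As a generic illustration of the calibration arithmetic behind the figures quoted above (not the paper's data), the sketch below fits a linear calibration of analyte-to-internal-standard response ratio against concentration and reports R²; every number is invented.

      # Minimal sketch with invented numbers (not the paper's data): fit a linear
      # calibration of response ratio vs. concentration and compute R^2.
      import numpy as np

      conc = np.array([300, 600, 1200, 1800, 2500], dtype=float)   # ng/mL, assumed
      ratio = np.array([0.31, 0.63, 1.18, 1.82, 2.47])             # assumed responses

      slope, intercept = np.polyfit(conc, ratio, 1)
      pred = slope * conc + intercept
      ss_res = np.sum((ratio - pred) ** 2)
      ss_tot = np.sum((ratio - ratio.mean()) ** 2)
      r2 = 1 - ss_res / ss_tot

      print(f"slope={slope:.4e}  intercept={intercept:.3f}  R^2={r2:.4f}")
      # an unknown's concentration would then be (ratio - intercept) / slope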

  18. Rapid spread of complex change: a case study in inpatient palliative care

    Directory of Open Access Journals (Sweden)

    Filipski Marta I

    2009-12-01

    Full Text Available Abstract Background Based on positive findings from a randomized controlled trial, Kaiser Permanente's national executive leadership group set an expectation that all Kaiser Permanente and partner hospitals would implement a consultative model of interdisciplinary, inpatient-based palliative care (IPC). Within one year, the number of IPC consultations program-wide increased almost tenfold from baseline, and the number of teams nearly doubled. We report here results from a qualitative evaluation of the IPC initiative after a year of implementation; our purpose was to understand factors supporting or impeding the rapid and consistent spread of a complex program. Methods Quality improvement study using a case study design and qualitative analysis of in-depth semi-structured interviews with 36 national, regional, and local leaders. Results Compelling evidence of impacts on patient satisfaction and quality of care generated 'pull' among adopters, expressed as a remarkably high degree of conviction about the value of the model. Broad leadership agreement gave rise to sponsorship and support that permeated the organization. A robust social network promoted knowledge exchange and built on an existing network with a strong interest in palliative care. Resource constraints, pre-existing programs of a different model, and ambiguous accountability for implementation impeded spread. Conclusions A complex, hospital-based, interdisciplinary intervention in a large health care organization spread rapidly due to a synergy between organizational 'push' strategies and grassroots-level pull. The combination of push and pull may be especially important when the organizational context or the practice to be spread is complex.

  19. Impact of rapid maxillary expansion on nasomaxillary complex volume in mouth-breathers

    Directory of Open Access Journals (Sweden)

    Mario Cappellette Jr.

    Full Text Available ABSTRACT Objective: To assess the volumetric changes that occur in the nasomaxillary complex of mouth-breathing patients with transverse maxillary deficiency subjected to rapid maxillary expansion (RME). Methods: This was a controlled, prospective intervention study involving 38 mouth-breathing patients presenting with transverse maxillary deficiency, regardless of malocclusion type or race. Twenty-three of them comprised the experimental group, composed of 11 boys (47.8%) and 12 girls (52.2%), with a mean age of 9.6 years (range 6.4 to 14.2 years, standard deviation 2.3 years); the other 15 comprised the control group, composed of 9 boys (60%) and 6 girls (40%), with a mean age of 10.5 years (range 8.0 to 13.6 years, standard deviation 1.9 years). All patients underwent computed tomography (CT) according to a standard protocol: an initial CT (T1) and a CT three months thereafter (T2); the patients in the experimental group were treated with RME using a Hyrax expander for the correction of maxillary deficiency during the T1-T2 interval. The CT scans were manipulated using Dolphin® Imaging version 11.7 software for total and partial volumetric assessment of the nasomaxillary complex. Results: In the experimental group there was a significant increase in the size of the structures of interest compared to the control group, both overall and in specific regions. Conclusions: Rapid maxillary expansion (RME) provided a significant expansion in all the structures of the nasomaxillary complex (nasal cavity, oropharynx, and right and left maxillary sinuses).

  20. Actin-Sorting Nexin 27 (SNX27)-Retromer Complex Mediates Rapid Parathyroid Hormone Receptor Recycling*

    Science.gov (United States)

    McGarvey, Jennifer C.; Xiao, Kunhong; Bowman, Shanna L.; Mamonova, Tatyana; Zhang, Qiangmin; Bisello, Alessandro; Sneddon, W. Bruce; Ardura, Juan A.; Jean-Alphonse, Frederic; Vilardaga, Jean-Pierre; Puthenveedu, Manojkumar A.; Friedman, Peter A.

    2016-01-01

    The G protein-coupled parathyroid hormone receptor (PTHR) regulates mineral-ion homeostasis and bone remodeling. Upon parathyroid hormone (PTH) stimulation, the PTHR internalizes into early endosomes and subsequently traffics to the retromer complex, a sorting platform on early endosomes that promotes recycling of surface receptors. The C terminus of the PTHR contains a type I PDZ ligand that binds PDZ domain-containing proteins. Mass spectrometry identified sorting nexin 27 (SNX27) in isolated endosomes as a PTHR binding partner. PTH treatment enriched endosomal PTHR. SNX27 contains a PDZ domain and serves as a cargo selector for the retromer complex. VPS26, VPS29, and VPS35 retromer subunits were isolated with PTHR in endosomes from cells stimulated with PTH. Molecular dynamics and protein binding studies establish that PTHR and SNX27 interactions depend on the PDZ recognition motif in PTHR and the PDZ domain of SNX27. Depletion of either SNX27 or VPS35 or actin depolymerization decreased the rate of PTHR recycling following agonist stimulation. Mutating the PDZ ligand of PTHR abolished the interaction with SNX27 but did not affect the overall rate of recycling, suggesting that PTHR may directly engage the retromer complex. Coimmunoprecipitation and overlay experiments show that both intact and mutated PTHR bind retromer through the VPS26 protomer and sequentially assemble a ternary complex with PTHR and SNX27. SNX27-independent recycling may involve N-ethylmaleimide-sensitive factor, which binds both PDZ intact and mutant PTHRs. We conclude that PTHR recycles rapidly through at least two pathways, one involving the ASRT complex of actin, SNX27, and retromer and another possibly involving N-ethylmaleimide-sensitive factor. PMID:27008860

  1. [Automated RNA amplification for the rapid identification of Mycobacterium tuberculosis complex in respiratory specimens].

    Science.gov (United States)

    Drouillon, V; Houriez, F; Buze, M; Lagrange, P; Herrmann, J-L

    2006-01-01

    Rapid and sensitive detection of the Mycobacterium tuberculosis complex (MTB) directly on clinical respiratory specimens is essential for the correct management of patients suspected of tuberculosis. For this purpose, PCR-based kits are available to detect MTB in respiratory specimens, but most of them need at least 4 hours to be completed. New methods based on the TRC method (TRC: Transcription Reverse transcription Concerted--TRCRapid M. tuberculosis--Tosoh Bioscience, Tokyo, Japan) and a dedicated monitor have been developed. A new kit (TRC Rapid M. tuberculosis and real-time monitor TRCRapid-160, Tosoh Corporation, Japan), enabling one-step amplification and real-time detection of MTB 16S rRNA by a combination of an intercalative-dye oxazole yellow-linked DNA probe and isothermal RNA amplification directly on respiratory specimens, was tested in our laboratory. 319 respiratory specimens were tested in this preliminary study and the results were compared to smear and culture. Fourteen had a positive culture for MTB. Among these samples, the smear was positive in 11 cases (78.6%) and the TRC process was positive in 8 cases (57.1%). The overall sensitivity of TRC compared to smear-positive samples is 73%. These first results demonstrated that rapid identification of MTB is possible (less than 2 processing hours for 14 specimens and about 1 hour for 1 specimen) in most smear-positive samples using ready-to-use reagents for real-time detection of MTB rRNA in clinical samples. New pretreatment and extraction reagent kits to increase the stability of the sputum RNA and the extraction efficiency are now being tested in our laboratory.

  2. A new rapid immunohistochemical staining technique using the EnVision antibody complex.

    Science.gov (United States)

    Kämmerer, U; Kapp, M; Gassel, A M; Richter, T; Tank, C; Dietl, J; Ruck, P

    2001-05-01

    Rapid immunohistochemical investigation, in addition to staining with hematoxylin and eosin, would be useful during intraoperative frozen section diagnosis in some cases. This study was undertaken to investigate whether the recently described EnVision system, a highly sensitive two-step immunohistochemical technique, could be modified for rapid immunostaining of frozen sections. Forty-five primary antibodies were tested on frozen sections from various different tissues. After fixation in acetone for 1 min and air-drying, the sections were incubated for 3 min each with the primary antibody, the EnVision complex (a large number of secondary antibodies and horseradish peroxidase coupled to a dextran backbone), and the chromogen (3,3'-diaminobenzidine or 3-amino-9-ethylcarbazole). All reactions were carried out at 37°C. Specific staining was seen with 38 antibodies (including HMB-45 and antibodies against keratin, vimentin, leukocyte common antigen, smooth muscle actin, synaptophysin, CD34, CD3, CD20, and prostate-specific antigen). A modification of the EnVision method allows the detection of a broad spectrum of antigens in frozen sections in less than 13 min. This method could be a useful new tool in frozen section diagnosis and research. (J Histochem Cytochem 49:623-630, 2001)

  3. Complex fluids under microflow probed by SAXS: rapid microfabrication and analysis

    International Nuclear Information System (INIS)

    Martin, Hazel P; Luckham, Paul F; Cabral, Joao T; Brooks, Nicholas J; Seddon, John M; Terrill, Nick J; Kowalski, Adam J

    2010-01-01

    We report a combined microfluidic and online synchrotron small-angle X-ray scattering (SAXS) study of complex surfactant mixtures under flow. We investigate the influence of a series of flow constrictions, generating well-defined, periodic extensional flow fields, on the microstructure of two model surfactant mixtures containing SDS and CTAC. Specifically, the lamellar spacing, orientation and structural order are reported and correlated with the imposed flow field: geometry, flow velocity and residence time. The design, fabrication and operation of a microfluidic system using rapid prototyping is described in detail. We show that polydimethylsiloxane (PDMS), ubiquitous in microfabrication, provides a suitable matrix for SAXS microdevices provided that (i) the PDMS thickness is kept to a minimum while retaining structural integrity (∼1000 μm) and (ii) scattering from the structure of interest is sufficiently decoupled from the amorphous background scattering. The combination of SAXS and microfluidics provides unprecedented opportunities to elucidate the non-equilibrium structure formation and relaxation of complex fluids, demonstrated here for concentrated surfactant mixtures.
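    The lamellar spacing reported in such SAXS studies follows from the position of the first-order Bragg peak via d = 2π/q1. The short sketch below shows the conversion; the peak positions are invented for illustration and are not data from this study.

        import numpy as np

        # Minimal sketch: lamellar d-spacing from the first-order SAXS peak, d = 2*pi/q1.
        # The q1 values are hypothetical, not measurements from the paper.
        q1 = np.array([0.155, 0.160, 0.158])   # peak positions (1/Angstrom)
        d = 2 * np.pi / q1                     # lamellar repeat distance (Angstrom)
        for qi, di in zip(q1, d):
            print(f"q1 = {qi:.3f} 1/A  ->  d = {di:.1f} A")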

  4. Complex fluids under microflow probed by SAXS: rapid microfabrication and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Hazel P; Luckham, Paul F; Cabral, Joao T [Department of Chemical Engineering, Imperial College London, Exhibition Road, London SW7 2AZ (United Kingdom); Brooks, Nicholas J; Seddon, John M [Department of Chemistry, Imperial College London, Exhibition Road, London SW7 2AZ (United Kingdom); Terrill, Nick J [Diamond Light Source Ltd., Diamond House, Harwell Science and Innovation Campus, Chilton, Didcot, Oxfordshire, OX11 0DE (United Kingdom); Kowalski, Adam J, E-mail: j.cabral@imperial.ac.u [Unilever Research and Development, Port Sunlight Laboratory, Bebington, Wirral, CH63 3JW (United Kingdom)

    2010-10-01

    We report a combined microfluidic and online synchrotron small-angle X-ray scattering (SAXS) study of complex surfactant mixtures under flow. We investigate the influence of a series of flow constrictions, generating well-defined, periodic extensional flow fields, on the microstructure of two model surfactant mixtures containing SDS and CTAC. Specifically, the lamellar spacing, orientation and structural order are reported and correlated with the imposed flow field: geometry, flow velocity and residence time. The design, fabrication and operation of a microfluidic system using rapid prototyping is described in detail. We show that polydimethylsiloxane (PDMS), ubiquitous in microfabrication, provides a suitable matrix for SAXS microdevices provided that (i) the PDMS thickness is kept to a minimum while retaining structural integrity (∼1000 μm) and (ii) scattering from the structure of interest is sufficiently decoupled from the amorphous background scattering. The combination of SAXS and microfluidics provides unprecedented opportunities to elucidate the non-equilibrium structure formation and relaxation of complex fluids, demonstrated here for concentrated surfactant mixtures.

  5. Analyzing discourse and text complexity for learning and collaborating a cognitive approach based on natural language processing

    CERN Document Server

    Dascălu, Mihai

    2014-01-01

    With the advent and increasing popularity of Computer Supported Collaborative Learning (CSCL) and e-learning technologies, the need for automatic assessment, and for teacher/tutor support, of the two tightly intertwined activities of comprehension of reading materials and of collaboration among peers has grown significantly. In this context, a polyphonic model of discourse derived from Bakhtin's work is used as a paradigm for analyzing both general texts and CSCL conversations in a unique framework focused on different facets of textual cohesion. As a specific feature of our analysis, the individual learning perspective is focused on the identification of reading strategies and on providing a multi-dimensional textual complexity model, whereas the collaborative learning dimension is centered on the evaluation of participants' involvement, as well as on collaboration assessment. Our approach, based on advanced Natural Language Processing techniques, provides a qualitative estimation of the learning process and enhance...

  6. Rapid prototyping of a complex model for the manufacture of plaster molds for slip casting ceramic

    Directory of Open Access Journals (Sweden)

    D. P. C. Velazco

    2014-12-01

    Computer-assisted design (CAD) has been well known for several decades and employed in ceramic manufacturing almost since its beginning, but usually only in the early, ideation stages of a project, not in the prototyping or manufacturing stages. Rapid prototyping machines, also known as 3D printers, can produce real pieces in a few hours from high-strength plastic materials, with great precision and similarity to the original, based on digital models produced either by modeling with specific design software or by digitizing existing parts with so-called 3D scanners. The main objective of this work is to develop the methodology used in the entire process of building a ceramic piece, combining traditional techniques with the new technologies for the manufacture of prototypes, and to take advantage of the benefits that this reproduction technology allows. The experience was based on the generation of a complex piece, in digital format, which served as the model. A regular 15 cm icosahedron presented features complex enough that producing the model by the traditional techniques of ceramics (manual or mechanical) was not advisable. From this digital model, a plaster mold was made in the traditional way in order to slip cast clay-based slurries, freely dried in air, then fired and glazed in the traditional way. This experience confirmed the working hypothesis and opens up new lines of work, at both academic and technological levels, that will be explored in the near future. This technology provides a wide range of options for addressing the formal aspect of a piece in design, architecture, industrial design, traditional pottery, ceramic art, etc., amplifying the formal possibilities and saving time, and therefore cost, when producing the necessary and appropriate matrixes.

  7. Rapid phenolic O-glycosylation of small molecules and complex unprotected peptides in aqueous solvent

    Science.gov (United States)

    Wadzinski, Tyler J.; Steinauer, Angela; Hie, Liana; Pelletier, Guillaume; Schepartz, Alanna; Miller, Scott J.

    2018-06-01

    Glycosylated natural products and synthetic glycopeptides represent a significant and growing source of biochemical probes and therapeutic agents. However, methods that enable the aqueous glycosylation of endogenous amino acid functionality in peptides without the use of protecting groups are scarce. Here, we report a transformation that facilitates the efficient aqueous O-glycosylation of phenolic functionality in a wide range of small molecules, unprotected tyrosine, and tyrosine residues embedded within a range of complex, fully unprotected peptides. The transformation, which uses glycosyl fluoride donors and is promoted by Ca(OH)2, proceeds rapidly at room temperature in water, with good yields and selective formation of unique anomeric products depending on the stereochemistry of the glycosyl donor. High functional group tolerance is observed, and the phenol glycosylation occurs selectively in the presence of virtually all side chains of the proteinogenic amino acids with the singular exception of Cys. This method offers a highly selective, efficient, and operationally simple approach for the protecting-group-free synthesis of O-aryl glycosides and Tyr-O-glycosylated peptides in water.

  8. Development and operation of an integrated sampling probe and gas analyzer for turbulent mixing studies in complex supersonic flows

    Science.gov (United States)

    Wiswall, John D.

    -temporal characteristic scales of the flow on the resulting time-area-averaged concentration measurements. Two series of experiments were performed to verify the probe's design; the first used Schlieren photography and verified that the probe sampled from the supersonic flowfield isokinetically. The second series involved traversing the probe across a free mixing layer of air and helium, to obtain both mean concentration and high frequency measurements. High-frequency data was statistically analyzed and inspection of the Probability Density Function (PDF) of the hot-film response was instrumental to interpret how well the resulting average mixing measurements represent these types of complex flows. The probe is minimally intrusive, has accuracy comparable to its predecessors, has an improved frequency response for mean concentration measurements, and samples from a very small area in the flowfield.

  9. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases.

    Science.gov (United States)

    Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M

    2006-04-21

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation between large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN) and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods to approach association studies with large numbers of predictor variables. GPNN on the other hand may be a useful approach to select and model important predictors, but its performance to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset of predictors with an important contribution to disease. The combinatorial methods give more insight in combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association
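    As an illustration of the random-forests approach discussed above, the sketch below ranks a large set of SNP predictors by their contribution to a simulated binary disease outcome. The genotype matrix, the two influential SNPs, and all parameter choices are invented for illustration and are not taken from the commentary.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Simulated genotypes coded 0/1/2 (minor-allele counts); not real study data.
        rng = np.random.default_rng(0)
        n_subjects, n_snps = 500, 1000
        genotypes = rng.integers(0, 3, size=(n_subjects, n_snps))

        # Two hypothetical SNPs influence disease risk; the rest are noise.
        risk = 0.8 * genotypes[:, 10] + 0.6 * genotypes[:, 42] + rng.normal(size=n_subjects)
        disease = (risk > np.median(risk)).astype(int)

        forest = RandomForestClassifier(n_estimators=300, random_state=0)
        forest.fit(genotypes, disease)

        # Rank SNPs by importance and keep a candidate subset for follow-up modeling.
        top = np.argsort(forest.feature_importances_)[::-1][:10]
        print("Top-ranked SNP indices:", top)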

  10. Rapid Genome-wide Single Nucleotide Polymorphism Discovery in Soybean and Rice via Deep Resequencing of Reduced Representation Libraries with the Illumina Genome Analyzer

    Directory of Open Access Journals (Sweden)

    Stéphane Deschamps

    2010-07-01

    Massively parallel sequencing platforms have allowed for the rapid discovery of single nucleotide polymorphisms (SNPs) among related genotypes within a species. We describe the creation of reduced representation libraries (RRLs) using an initial digestion of nuclear genomic DNA with a methylation-sensitive restriction endonuclease, followed by a secondary digestion with a 4 bp-recognition restriction endonuclease. This strategy allows for the enrichment of hypomethylated genomic DNA, which has been shown to be rich in genic sequences, while the secondary digestion serves to increase the number of common loci resequenced between individuals. Deep resequencing of these RRLs performed with the Illumina Genome Analyzer led to the identification of 2618 SNPs in rice and 1682 SNPs in soybean for two representative genotypes in each species. A subset of these SNPs was validated via Sanger sequencing, exhibiting validation rates of 96.4 and 97.0% in rice and soybean, respectively. Comparative analysis of the read distribution relative to annotated genes in the reference genome assemblies indicated that the RRL strategy primarily sampled genic regions in both species. The massively parallel sequencing of methylation-sensitive RRLs for genome-wide SNP discovery can be applied across a wide range of plant species having sufficient reference genomic sequence.

  11. MDcons: Intermolecular contact maps as a tool to analyze the interface of protein complexes from molecular dynamics trajectories

    KAUST Repository

    Abdel-Azeim, Safwat

    2014-05-06

    Background: Molecular dynamics (MD) simulations of protein complexes suffer from a lack of specific tools in the analysis step. Analyses of MD trajectories of protein complexes generally rely on classical measures, such as the RMSD, RMSF and gyration radius, conceived and developed for single macromolecules. In practice, however, researchers engaged in simulating the dynamics of a protein complex are mainly interested in characterizing the conservation/variation of its biological interface. Results: On these bases, herein we propose a novel approach to the analysis of MD trajectories or other conformational ensembles of protein complexes, MDcons, which uses the conservation of inter-residue contacts at the interface as a measure of the similarity between different snapshots. A "consensus contact map" is also provided, where the conservation of the different contacts is drawn in a grey scale. Finally, the interface area of the complex is monitored during the simulations. To show its utility, we used this novel approach to study two protein-protein complexes with interfaces of comparable size, both dominated by hydrophilic interactions but having binding affinities at the extremes of the experimental range. MDcons is demonstrated to be extremely useful for analysing the MD trajectories of the investigated complexes, adding important insight into the dynamic behavior of their biological interface. Conclusions: MDcons specifically allows the user to highlight and characterize the dynamics of the interface in protein complexes and can thus be used as a complementary tool for the analysis of MD simulations of both experimental and predicted structures of protein complexes.
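    The consensus-contact-map idea can be sketched compactly: for each snapshot, record which inter-chain residue pairs are in contact, then average the resulting binary maps over the ensemble. The sketch below is not the MDcons implementation; it uses random placeholder coordinates and an assumed 5 Å contact cutoff purely to show the bookkeeping.

        import numpy as np

        # Sketch of a "consensus contact map": mark inter-chain residue pairs whose
        # distance is below a cutoff in each frame, then average over frames.
        # Coordinates are random placeholders, not an actual MD trajectory.
        rng = np.random.default_rng(1)
        n_frames, n_res_a, n_res_b, cutoff = 20, 30, 25, 5.0  # cutoff in Angstrom

        def contact_map(coords_a, coords_b, cutoff):
            # coords_*: (n_res, 3) representative coordinates (e.g., C-beta) per residue
            d = np.linalg.norm(coords_a[:, None, :] - coords_b[None, :, :], axis=-1)
            return d < cutoff

        maps = []
        for _ in range(n_frames):
            coords_a = rng.uniform(0, 40, size=(n_res_a, 3))
            coords_b = rng.uniform(0, 40, size=(n_res_b, 3))
            maps.append(contact_map(coords_a, coords_b, cutoff))

        consensus = np.mean(maps, axis=0)  # fraction of frames in which each contact occurs
        print("Most conserved contact:", np.unravel_index(consensus.argmax(), consensus.shape))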

  12. A complex-plane strategy for computing rotating polytropic models - Numerical results for strong and rapid differential rotation

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1990-01-01

    In this paper, a numerical method, called complex-plane strategy, is implemented in the computation of polytropic models distorted by strong and rapid differential rotation. The differential rotation model results from a direct generalization of the classical model, in the framework of the complex-plane strategy; this generalization yields very strong differential rotation. Accordingly, the polytropic models assume extremely distorted interiors, while their boundaries are slightly distorted. For an accurate simulation of differential rotation, a versatile method, called multiple partition technique is developed and implemented. It is shown that the method remains reliable up to rotation states where other elaborate techniques fail to give accurate results. 11 refs

  13. Protein complexes in the archaeon Methanothermobacter thermautotrophicus analyzed by blue native/SDS-PAGE and mass spectrometry.

    NARCIS (Netherlands)

    Farhoud, M.H.; Wessels, H.C.T.; Steenbakkers, P.J.M.; Mattijssen, S.; Wevers, R.A.; Engelen, B.G.M. van; Jetten, M.S.M.; Smeitink, J.A.M.; Heuvel, L.P.W.J. van den; Keltjens, J.T.M.

    2005-01-01

    Methanothermobacter thermautotrophicus is a thermophilic archaeon that produces methane as the end product of its primary metabolism. The biochemistry of methane formation has been extensively studied and is catalyzed by individual enzymes and proteins that are organized in protein complexes.

  14. Chess players' eye movements reveal rapid recognition of complex visual patterns: Evidence from a chess-related visual search task.

    Science.gov (United States)

    Sheridan, Heather; Reingold, Eyal M

    2017-03-01

    To explore the perceptual component of chess expertise, we monitored the eye movements of expert and novice chess players during a chess-related visual search task that tested anecdotal reports that a key differentiator of chess skill is the ability to visualize the complex moves of the knight piece. Specifically, chess players viewed an array of four minimized chessboards, and they rapidly searched for the target board that allowed a knight piece to reach a target square in three moves. On each trial, there was only one target board (i.e., the "Yes" board), and for the remaining "lure" boards, the knight's path was blocked on either the first move (the "Easy No" board) or the second move (i.e., "the Difficult No" board). As evidence that chess experts can rapidly differentiate complex chess-related visual patterns, the experts (but not the novices) showed longer first-fixation durations on the "Yes" board relative to the "Difficult No" board. Moreover, as hypothesized, the task strongly differentiated chess skill: Reaction times were more than four times faster for the experts relative to novices, and reaction times were correlated with within-group measures of expertise (i.e., official chess ratings, number of hours of practice). These results indicate that a key component of chess expertise is the ability to rapidly recognize complex visual patterns.

  15. Rapid qualitative research methods during complex health emergencies: A systematic review of the literature.

    Science.gov (United States)

    Johnson, Ginger A; Vindrola-Padros, Cecilia

    2017-09-01

    The 2013-2016 Ebola outbreak in West Africa highlighted both the successes and limitations of social science contributions to emergency response operations. An important limitation was the rapid and effective communication of study findings. A systematic review was carried out to explore how rapid qualitative methods have been used during global health emergencies, to understand which methods are commonly used, how they are applied, and the difficulties faced by social science researchers in the field. We also assess their value and benefit for health emergencies. The review findings are used to propose recommendations for qualitative research in this context. Peer-reviewed articles and grey literature were identified through six online databases. An initial search was carried out in July 2016 and updated in February 2017. The PRISMA checklist was used to guide the reporting of methods and findings. The articles were assessed for quality using the MMAT and AACODS checklists. From an initial search yielding 1444 articles, 22 articles met the criteria for inclusion. Thirteen of the articles were qualitative studies and nine used a mixed-methods design. The purposes of the rapid studies included: the identification of causes of the outbreak, and assessment of infrastructure, control strategies, health needs and health facility use. The studies varied in duration (from 4 days to 1 month). The main limitations identified by the authors were: the low quality of the collected data, small sample sizes, and little time for cross-checking facts with other data sources to reduce bias. Rapid qualitative methods were seen as beneficial in highlighting context-specific issues that need to be addressed locally, population-level behaviors influencing health service use, and organizational challenges in response planning and implementation. Recommendations for carrying out rapid qualitative research in this context included the early designation of community leaders as a point of

  16. A rapid method based on hot water extraction and liquid chromatography-tandem mass spectrometry for analyzing tetracycline antibiotic residues in cheese.

    Science.gov (United States)

    Bogialli, Sara; Coradazzi, Cristina; Di Corcia, Antonio; Lagana, Aldo; Sergi, Manuel

    2007-01-01

    A rapid, specific, and sensitive procedure for determining residues of 4 widely used tetracycline antibiotics and 3 of their 4-epimers in cheese is presented. The method is based on the matrix solid-phase dispersion (MSPD) technique followed by liquid chromatography/tandem mass spectrometry (LC/MS/MS). After dispersing samples of mozzarella, asiago, parmigiano, gruyere, emmenthal, and camembert on sand, target compounds were eluted from the MSPD column by passing 6 mL of water heated to 70 °C through it. After acidification and filtration, 200 µL of the aqueous extract was injected directly into the LC column. For analyte identification and quantification, MS data acquisition was performed in the multireaction monitoring mode, selecting 2 precursor ion-to-product ion transitions for each target compound. Hot water appeared to be an efficient extractant, as absolute recoveries were no lower than 78%. Using demeclocycline as a surrogate analyte, recoveries of analytes added to the 6 types of cheese at the 30 ng/g level were 96-117%, with relative standard deviations (RSD) not higher than 9%. Statistical analysis of the mean recovery data showed that the extraction efficiency did not depend on the type of cheese analyzed, indicating that the method could be applied to other cheese types not considered here. At the lowest concentration considered, 10 ng/g, the accuracy of the method ranged between 90 and 107%, with RSDs not larger than 12%. Based on a signal-to-noise ratio of 10, limits of quantitation were estimated to be 1-2 ng/g.
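    The recovery and precision figures quoted in such validation work reduce to simple statistics over replicate spiked samples. The sketch below shows the arithmetic with invented replicate values at the 30 ng/g spiking level; it is illustrative only and does not reproduce the study's data.

        import numpy as np

        # Hypothetical replicate results for a sample spiked at 30 ng/g (not study data).
        spiked_level = 30.0                                    # ng/g
        measured = np.array([29.1, 31.5, 30.8, 28.6, 32.0])    # ng/g

        recovery_pct = measured.mean() / spiked_level * 100
        rsd_pct = measured.std(ddof=1) / measured.mean() * 100
        print(f"mean recovery = {recovery_pct:.1f}%, RSD = {rsd_pct:.1f}%")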

  17. A vacuum manifold for rapid world-to-chip connectivity of complex PDMS microdevices.

    Science.gov (United States)

    Cooksey, Gregory A; Plant, Anne L; Atencia, Javier

    2009-05-07

    The lack of simple interfaces for microfluidic devices with a large number of inlets significantly limits production and utilization of these devices. In this article, we describe the fabrication of a reusable manifold that provides rapid world-to-chip connectivity. A vacuum network milled into a rigid manifold holds microdevices and prevents leakage of fluids injected into the device from ports in the manifold. A number of different manifold designs were explored, and all performed similarly, yielding an average of 100 kPa (15 psi) fluid holding pressure. The wide applicability of this manifold concept is demonstrated by interfacing with a 51-inlet microfluidic chip containing 144 chambers and hundreds of embedded pneumatic valves. Due to the speed of connectivity, the manifolds are ideal for rapid prototyping and are well suited to serve as "universal" interfaces.

  18. Analyzing suitability for urban expansion under rapid coastal urbanization with remote sensing and GIS techniques: a case study of Lianyungang, China

    DEFF Research Database (Denmark)

    Zhao, Wenjun; Zhu, Xiaodong; Reenberg, Anette

    2010-01-01

    Beginning in 2000, Lianyungang's urbanization entered a period of rapid growth, spatially as well as economically. Rapid and intensive expansion of "construction land" imposed increasing pressures on regional environment. With the support of remote sensing data and GIS tools, this paper reports a...

  19. MDcons: Intermolecular contact maps as a tool to analyze the interface of protein complexes from molecular dynamics trajectories

    KAUST Repository

    Abdel-Azeim, Safwat; Chermak, Edrisse; Vangone, Anna; Oliva, Romina; Cavallo, Luigi

    2014-01-01

    of the similarity between different snapshots. A "consensus contact map" is also provided, where the conservation of the different contacts is drawn in a grey scale. Finally, the interface area of the complex is monitored during the simulations. To show its utility

  20. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases

    NARCIS (Netherlands)

    Heidema, A.G.; Boer, J.M.A.; Nagelkerke, N.; Mariman, E.C.M.; A, van der D.L.; Feskens, E.J.M.

    2006-01-01

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods

  1. Analyzing the Risk of Fire in a Hospital Complex by the “Fire Risk Assessment Method for Engineering” (FRAME)

    Directory of Open Access Journals (Sweden)

    Sarsangi V.* MSc,

    2016-08-01

    Aims: The occurrence of fire in residential buildings, commercial complexes, and large and small industries causes physical, environmental and financial damage in many communities. Fire safety in hospitals is particularly sensitive, since society takes responsibility for the care of the sick. The goal of this study was to apply the Fire Risk Assessment Method for Engineering (FRAME) in a hospital complex and assess the level of fire risk. Materials & Methods: This descriptive study was conducted in the Kashan Shahid Beheshti hospital in 2013. FRAME is designed on the basis of empirical and scientific knowledge and experiment and has acceptable reliability for assessing building fire risk. Excel software was used to calculate the risk level, and the fire risk (R) was finally calculated separately for the different units. Findings: The calculated R values were less than 1 for the health, autoclave, office of nursing, and infection control units. R1 values were greater than 1 for all units. R2 values were less than 1 for the office of nursing and infection control units. Conclusion: FRAME is an acceptable tool for assessing the fire risk of buildings; the fire risk is high in the Shahid Beheshti Hospital Complex of Kashan, and damage could be intolerable in the case of fire.

  2. The membrane attack complex of the complement system is essential for rapid wallerian degeneration

    NARCIS (Netherlands)

    Ramaglia, Valeria; King, Rosalind Helen Mary; Nourallah, Michelle; Wolterman, Ruud; de Jonge, Rosalein; Ramkema, Marja; Vigar, Miriam Ann; van der Wetering, Sandra; Morgan, Brian Paul; Troost, Dirk; Baas, Frank

    2007-01-01

    The complement (C) system plays an important role in myelin breakdown during Wallerian degeneration (WD). The pathway and mechanism involved are, however, not clear. In a crush injury model of the sciatic nerve, we show that C6, necessary for the assembly of the membrane attack complex (MAC), is

  3. Rapid identification of clinical members of Fusarium fujikuroi complex using MALDI-TOF MS

    NARCIS (Netherlands)

    Al-Hatmi, Abdullah Ms; Normand, Anne-Cécile; van Diepeningen, Anne D; Hendrickx, Marijke; de Hoog, G Sybren; Piarroux, Renaud

    2015-01-01

    AIM: To develop a matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) method for identification of Fusarium species within the Fusarium fujikuroi complex for use in clinical microbiology laboratories. MATERIALS & METHODS: A total of 24 reference and 60 clinical and

  4. Laser rapid forming technology of high-performance dense metal components with complex structure

    Science.gov (United States)

    Huang, Weidong; Chen, Jing; Li, Yanming; Lin, Xin

    2005-01-01

    Laser rapid forming (LRF) is a new and advanced manufacturing technology developed by combining high-power laser cladding with rapid prototyping (RP) to realize net-shape forming of high-performance, dense metal components without dies. Recently we have developed a set of LRF equipment. LRF experiments were carried out on this equipment to systematically investigate the influence of processing parameters on forming characteristics, using titanium alloys, superalloys, stainless steel, and copper alloys as the cladding powder materials. Because the solid components are built up additively, layer by layer, the microstructure of laser-formed components is made up of columnar grains or columnar dendrites that grow epitaxially from the substrate. Mechanical testing showed that the mechanical properties of laser-formed samples are similar to, or even exceed, those of forgings and are much better than those of castings. This paper shows that LRF technology provides a new solution for some difficult processing problems in the high-tech fields of the aviation, spaceflight and automobile industries.

  5. The effects of micro-implant assisted rapid palatal expansion (MARPE) on the nasomaxillary complex--a finite element method (FEM) analysis.

    Science.gov (United States)

    MacGinnis, Matt; Chu, Howard; Youssef, George; Wu, Kimberley W; Machado, Andre Wilson; Moon, Won

    2014-08-29

    Orthodontic palatal expansion appliances have been widely used with satisfactory and, most often, predictable clinical results. Recently, clinicians have successfully utilized micro-implants with palatal expander designs, anchored to the palate, to achieve more efficient skeletal expansion and to decrease undesired dental effects. The purpose of the study was to use the finite element method (FEM) to determine the stress distribution and displacement within the craniofacial complex when simulated conventional and micro-implant-assisted rapid palatal expansion (MARPE) forces are applied to the maxilla. The simulated stress distribution produced within the palate and maxillary buttresses, in addition to the displacement and rotation of the maxilla, could then be analyzed to determine whether micro-implants aid in skeletal expansion. A three-dimensional (3D) mesh model of the cranium with associated maxillary sutures was developed using computed tomography (CT) images and Mimics modeling software. To compare transverse expansion stresses in rapid palatal expansion (RPE) and MARPE, expansion forces were distributed to differing points on the maxilla and evaluated with ANSYS simulation software. The stresses from forces applied to the maxillary teeth are distributed mainly along the trajectories of the three maxillary buttresses. In comparison, the MARPE showed tension and compression directed to the palate, with less rotation and tipping of the maxillary complex. In addition, the conventional hyrax displayed a rotation of the maxilla around the teeth, as opposed to around the midpalatal suture with the MARPE. These data suggest that the MARPE causes the maxilla to bend laterally while preventing unwanted rotation of the complex. In conclusion, the MARPE may be beneficial for hyperdivergent patients, or those who have already experienced closure of the midpalatal suture, who require palatal expansion and would be worsened by buccal tipping of the teeth.

  6. A Strategy for Rapid Construction of Blood Vessel-Like Structures with Complex Cell Alignments.

    Science.gov (United States)

    Wang, Nuoxin; Peng, Yunhu; Zheng, Wenfu; Tang, Lixue; Cheng, Shiyu; Yang, Junchuan; Liu, Shaoqin; Zhang, Wei; Jiang, Xingyu

    2018-04-17

    A method is developed that can rapidly produce blood vessel-like structures by bonding cell-laden electrospinning (ES) films layer by layer using fibrin glue within 90 min. This strategy allows control of cell type, cell orientation, and material composition in separate layers. Furthermore, ES films with thicker fibers (polylactic-co-glycolic acid, fiber diameter: ≈3.7 µm) are used as cell-seeding layers to facilitate the cell in-growth; those with thinner fibers (polylactic acid, fiber diameter: ≈1.8 µm) are used as outer reinforcing layers to improve the mechanical strength and reduce the liquid leakage of the scaffold. Cells grow, proliferate, and migrate well in the multilayered structure. This design aims at a new type of blood vessel substitute with flexible control of parameters and implementation of functions.

  7. Use of a field model to analyze probable fire environments encountered within the complex geometries of nuclear power plants

    International Nuclear Information System (INIS)

    Boccio, J.L.; Usher, J.L.; Singhal, A.K.; Tam, L.T.

    1985-08-01

    A fire in a nuclear power plant (NPP) can damage equipment needed to safely operate the plant and thereby either directly cause an accident or else reduce the plant's margin of safety. The development of a field-model fire code to analyze the probable fire environments encountered within NPP is discussed. A set of fire tests carried out under the aegis of the US Nuclear Regulatory Commission (NRC) is described. The results of these tests are then utilized to validate the field model

  8. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    Science.gov (United States)

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system.

  9. Labeled EF-Tus for rapid kinetic studies of pretranslocation complex formation

    DEFF Research Database (Denmark)

    Liu, Wei; Kavaliauskas, Darius; Schrader, Jared

    2014-01-01

    The universally conserved translation elongation factor EF-Tu delivers aminoacyl(aa)-tRNA in the form of an aa-tRNA·EF-Tu·GTP ternary complex (TC) to the ribosome where it binds to the cognate mRNA codon within the ribosomal A-site, leading to formation of a pretranslocation (PRE) complex. Here we...... describe preparation of QSY9 and Cy5 derivatives of the variant E348C-EF-Tu that are functional in translation elongation. Together with fluorophore derivatives of aa-tRNA and of ribosomal protein L11, located within the GTPase associated center (GAC), these labeled EF-Tus allow development of two new FRET...... assays that permit the dynamics of distance changes between EF-Tu and both L11 (Tu-L11 assay) and aa-tRNA (Tu-tRNA assay) to be determined during the decoding process. We use these assays to examine: (i) the relative rates of EF-Tu movement away from the GAC and from aa-tRNA during decoding, (ii...

  10. Rapid and specific purification of Argonaute-small RNA complexes from crude cell lysates.

    Science.gov (United States)

    Flores-Jasso, C Fabián; Salomon, William E; Zamore, Phillip D

    2013-02-01

    Small interfering RNAs (siRNAs) direct Argonaute proteins, the core components of the RNA-induced silencing complex (RISC), to cleave complementary target RNAs. Here, we describe a method to purify active RISC containing a single, unique small RNA guide sequence. We begin by capturing RISC using a complementary 2'-O-methyl oligonucleotide tethered to beads. Unlike other methods that capture RISC but do not allow its recovery, our strategy purifies active, soluble RISC in good yield. The method takes advantage of the finding that RISC partially paired to a target through its siRNA guide dissociates more than 300 times faster than a fully paired siRNA in RISC. We use this strategy to purify fly Ago1- and Ago2-RISC, as well as mouse AGO2-RISC. The method can discriminate among RISCs programmed with different guide strands, making it possible to deplete and recover specific RISC populations. Endogenous microRNA:Argonaute complexes can also be purified from cell lysates. Our method scales readily and takes less than a day to complete.

  11. Silica nanoparticles doped with an iridium(III) complex for rapid and fluorometric detection of cyanide

    International Nuclear Information System (INIS)

    Mu, Juanjuan; Feng, Qingyue; Chen, Xiudan; Li, Jing; Wang, Huili; Li, Mei-Jin

    2015-01-01

    We describe a nanosensor for the sensitive and selective detection of cyanide anions. The chloride-bridged Ir(III) complex [Ir(C^N)2(µ-Cl)]2 (Irpq, where the cyclometalating C^N ligand pq is 2-phenylquinoline) was doped into silica nanoparticles (SiNPs) with a typical size of about 30 nm. The intensity of the yellow emission of the doped SiNPs (under 410 nm excitation) was strongly enhanced on addition of cyanide ions due to the replacement of chloride by cyanide. The method can detect cyanide ions in the 12.5 to 113 μM concentration range, and the limit of detection is 1.66 μM (at an S/N ratio of 3). The method is simple, sensitive and fast, and this makes it a candidate probe for the fast optical determination of cyanide. (author)
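    A limit of detection quoted at an S/N ratio of 3 is conventionally estimated as three times the blank standard deviation divided by the calibration slope. The sketch below shows that calculation with hypothetical calibration points and blank noise; the numbers are not the paper's data.

        import numpy as np

        # 3-sigma limit-of-detection estimate from a linear calibration (hypothetical data).
        conc_uM = np.array([12.5, 25, 50, 75, 100])     # cyanide standards (µM)
        intensity = np.array([110, 215, 430, 640, 850]) # emission enhancement (a.u.)
        blank_sd = 4.7                                  # std. dev. of the blank signal (a.u.)

        slope, intercept = np.polyfit(conc_uM, intensity, 1)
        lod = 3 * blank_sd / slope                      # LOD at S/N = 3
        print(f"slope = {slope:.2f} a.u./µM, LOD ≈ {lod:.2f} µM")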

  12. Evaluation of a rapid immunochromatographic assay kit using monoclonal MPT64 antibodies for identification of the Mycobacterium tuberculosis complex

    International Nuclear Information System (INIS)

    Satti, L.; Ikram, A.; Malik, N.

    2010-01-01

    To evaluate the performance of the rapid immunochromatographic kit MPT64 Ag for identification of the Mycobacterium tuberculosis complex from various Mycobacterium tuberculosis culture-positive specimens. Department of Microbiology, Armed Forces Institute of Pathology Rawalpindi, from August 2008 through March 2009. Eighty-four Mycobacterium tuberculosis-positive cultures on BACTEC 460 and MGIT 960, one ATCC 25177 MTB strain, three institutional control MTB strains, two institutional control MOTT strains, and 20 different bacterial isolates were tested. Tests were performed according to the instruction manual. Of the 84 tested samples, MPT64 showed a positive result in 80 cultures. Only four positive cultures did not display any band on the MPT64 kit. These four strains were reconfirmed as Mycobacterium tuberculosis by a PCR method. The MOTT control strains and all 20 bacterial isolates were negative for the band. The sensitivity and specificity of the ICT assay in our study were 95.2% and 100%, respectively. The rapid MPT64 kit is a good diagnostic tool for differentiating between the Mycobacterium tuberculosis complex and MOTT, with 100% specificity. The technique is simple and can provide prompt information to clinicians to initiate early and appropriate antituberculosis therapy. (author)
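    The sensitivity and specificity quoted above follow directly from the counts given in the abstract (80 of 84 MTB-positive cultures detected; the 2 MOTT controls and 20 other bacterial isolates taken as the negative panel), as the short check below shows.

        # Worked check of the reported MPT64 ICT assay performance, using the abstract's counts.
        true_positive = 80   # MTB-positive cultures with a band
        false_negative = 4   # MTB-positive cultures missed by the kit
        true_negative = 22   # 2 MOTT control strains + 20 other bacterial isolates, all negative
        false_positive = 0

        sensitivity = true_positive / (true_positive + false_negative)
        specificity = true_negative / (true_negative + false_positive)
        print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")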

  13. Analyzing Katana referral hospital as a complex adaptive system: agents, interactions and adaptation to a changing environment.

    Science.gov (United States)

    Karemere, Hermès; Ribesse, Nathalie; Marchal, Bruno; Macq, Jean

    2015-01-01

    This study deals with the adaptation of Katana referral hospital, in the eastern Democratic Republic of Congo, to a changing environment that has been affected for more than a decade by intermittent armed conflicts. Its objective is to generate theoretical propositions for approaching the analysis of hospital governance differently, with the aim of assessing hospital performance and how to improve it. The methodology is a case study using mixed methods (qualitative and quantitative) for data collection. It uses (1) hospital data to measure the output of the hospital, (2) a literature review to identify, among other things, events and interventions recorded in the history of the hospital during the study period, and (3) information from individual interviews to validate the interpretation of the results of the previous two data sources and to understand the responsiveness of the referral hospital's management team during times of change. The study yields four theoretical propositions: (1) interaction between key agents is a positive force driving adaptation if the actors share the same vision; (2) the strength of the interaction between agents is largely based on the nature of institutional arrangements, which in turn are shaped by the actors themselves; (3) the owner and the management team play a decisive role in the implementation of effective institutional arrangements and the establishment of positive interactions between agents; (4) analyzing the recipient population's perception of the health services provided allows the health services offered to be better tailored and adapted to the population's needs and expectations. The research shows that it is not enough to provide (financial and technical) support to a hospital for it to operate and adapt to a changing environment; it must also be animated, considering that it is a complex adaptive system, and this animation is nothing other than the induction of positive interaction between agents.

  14. An Enduring Rapidly Moving Storm as a Guide to Saturn's Equatorial Jet's Complex Structure

    Science.gov (United States)

    Sanchez-Lavega, A.; Garcia-Melendo, E.; Perez-Hoyos, S.; Hueso, R.; Wong, M. H.; Simon, A.; Sanz-Requena, J. F.; Antunano, A.; Barrado-Izagirre, N.; Garate-Lopez, I.; hide

    2016-01-01

    Saturn has an intense and broad eastward equatorial jet with a complex three-dimensional structure mixed with time variability. The equatorial region experiences strong seasonal insolation variations enhanced by ring shadowing, and three of the six known giant planetary-scale storms have developed in it. These factors make Saturn's equator a natural laboratory to test models of jets in giant planets. Here we report on a bright equatorial atmospheric feature imaged in 2015 that moved steadily at a high speed of 450 m/s, not measured since 1980-1981, with other equatorial clouds moving within an ample range of velocities. Radiative transfer models show that these motions occur at three altitude levels within the upper haze and clouds. We find that the peak of the jet (latitudes 10°N to 10°S) suffers intense vertical shears reaching +2.5 m/s per km, two orders of magnitude higher than the meridional shears, and temporal variability above the 1 bar altitude level.

  15. COMPLEX GAS KINEMATICS IN COMPACT, RAPIDLY ASSEMBLING STAR-FORMING GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Amorin, R.; Vilchez, J. M.; Perez-Montero, E. [Instituto de Astrofisica de Andalucia-CSIC, Glorieta de la Astronomia S/N, E-18008 Granada (Spain); Haegele, G. F.; Firpo, V. [Facultad de Ciencias Astronomicas y Geofisicas, Universidad de la Plata, Paseo del Bosque S/N, 1900 La Plata (Argentina); Papaderos, P., E-mail: amorin@iaa.es [Centro de Astrofisica and Faculdade de Ciencias, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal)

    2012-08-01

    Deep, high-resolution spectroscopic observations have been obtained for six compact, strongly star-forming galaxies at redshift z ≈ 0.1-0.3, most of them also known as green peas. Remarkably, these galaxies show complex emission-line profiles in the spectral region including Hα, [N II] λλ6548, 6584, and [S II] λλ6717, 6731, consisting of the superposition of different kinematical components on a spatial extent of few kiloparsecs: a very broad line emission underlying more than one narrower component. For at least two of the observed galaxies some of these multiple components are resolved spatially in their two-dimensional spectra, whereas for another one a faint detached Hα blob lacking stellar continuum is detected at the same recessional velocity ≈7 kpc away from the galaxy. The individual narrower Hα components show high intrinsic velocity dispersion (σ ≈ 30-80 km s⁻¹), suggesting together with unsharped masking Hubble Space Telescope images that star formation proceeds in an ensemble of several compact and turbulent clumps, with relative velocities of up to ≈500 km s⁻¹. The broad underlying Hα components indicate in all cases large expansion velocities (full width zero intensity ≥1000 km s⁻¹) and very high luminosities (up to ≈10⁴² erg s⁻¹), probably showing the imprint of energetic outflows from supernovae. These intriguing results underline the importance of green peas for studying the assembly of low-mass galaxies near and far.

  16. Analyzing suitability for urban expansion under rapid coastal urbanization with remote sensing and GIS techniques: a case study of Lianyungang, China

    Science.gov (United States)

    Zhao, Wenjun; Zhu, Xiaodong; Reenberg, Anette; Sun, Xiang

    2010-10-01

    Beginning in 2000, Lianyungang's urbanization entered a period of rapid growth, spatially as well as economically. The rapid and intensive expansion of construction land imposed increasing pressures on the regional environment. With the support of remote sensing data and GIS tools, this paper presents a "present-capacity-potential" integrated suitability analysis framework to characterize and evaluate the suitability of urban expansion in Lianyungang. We found that during the rapid coastal urbanization process from 2000 to 2008, physical expansion in the study area combined high-density expansion with sprawling development. The land-use conversion driven by urbanization and industrialization occurred not only in the city districts but also in the surrounding areas that were spatially absorbed by urban growth, closely associated with and greatly influenced by the explosive growth of industrial establishments. Over-consumption of land resources in areas with low environmental carrying capacity, particularly the eastern coastal area, should be strictly controlled. Compared to conventional land suitability analysis methods, the proposed integrated approach can better assess the potential environmental impacts of urban expansion and provide guidance for decision makers.
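    The published "present-capacity-potential" framework is not reproduced here, but integrated suitability analyses of this kind are, at their core, weighted overlays of criterion rasters. The sketch below is a generic weighted-overlay example with invented layers, weights, and class breaks, intended only to illustrate the mechanics.

        import numpy as np

        # Generic weighted-overlay suitability sketch; the three criterion rasters and
        # their weights are placeholders, not the study's actual layers.
        rng = np.random.default_rng(2)
        shape = (100, 100)                 # raster grid cells
        present_use = rng.random(shape)    # current land-use suitability, scaled 0-1
        capacity = rng.random(shape)       # environmental carrying capacity, 0-1
        potential = rng.random(shape)      # development potential, 0-1

        weights = {"present": 0.3, "capacity": 0.4, "potential": 0.3}
        suitability = (weights["present"] * present_use
                       + weights["capacity"] * capacity
                       + weights["potential"] * potential)

        # Classify into suitability zones for mapping (illustrative class breaks).
        zones = np.digitize(suitability, bins=[0.4, 0.6, 0.8])
        print("cells per zone:", np.bincount(zones.ravel()))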

  17. Rapid, topology-based particle tracking for high-resolution measurements of large complex 3D motion fields.

    Science.gov (United States)

    Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian

    2018-04-03

    Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
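    The core idea of the tracking algorithm described above, feature vectors built from each particle's nearest neighbours, can be sketched with a k-d tree. The example below uses a simplified descriptor (sorted neighbour distances rather than full relative positions) and synthetic particle positions; it is not the published implementation.

        import numpy as np
        from scipy.spatial import cKDTree

        # Sketch of topology-based matching: describe each particle by its neighbourhood,
        # then match descriptors between two frames. Positions are synthetic.
        rng = np.random.default_rng(3)
        frame0 = rng.uniform(0, 100, size=(500, 3))
        frame1 = frame0 + np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.05, frame0.shape)

        def neighbor_features(points, k=8):
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k + 1)           # first neighbour is the point itself
            rel = points[idx[:, 1:]] - points[:, None, :]  # relative neighbour positions
            return np.sort(np.linalg.norm(rel, axis=-1), axis=1)  # sorted neighbour distances

        f0, f1 = neighbor_features(frame0), neighbor_features(frame1)
        matches = cKDTree(f1).query(f0, k=1)[1]            # nearest descriptor in frame 1
        displacements = frame1[matches] - frame0
        print("median displacement:", np.median(displacements, axis=0))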

  18. Dual-harmonic auto voltage control for the rapid cycling synchrotron of the Japan Proton Accelerator Research Complex

    Directory of Open Access Journals (Sweden)

    Fumihiko Tamura

    2008-07-01

    Dual-harmonic operation, in which the accelerating cavities are driven by the superposition of the fundamental and the second-harmonic rf voltage, is useful for the acceleration of ultrahigh-intensity proton beams in the rapid cycling synchrotron (RCS) of the Japan Proton Accelerator Research Complex (J-PARC). However, precise and fast voltage control of the harmonics is necessary to realize dual-harmonic acceleration. We developed the dual-harmonic auto voltage control system for the J-PARC RCS. We describe the details of the design and the implementation. Various tests of the system were performed with the RCS rf system, and a preliminary beam test has been done. We report the test results.
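    The superposed accelerating voltage in dual-harmonic operation can be written as V(φ) = V1 sin(φ) + V2 sin(2φ + Δφ), a second-harmonic component added to the fundamental (commonly used to flatten the bunch). The sketch below evaluates such a waveform; the amplitude ratio and relative phase are illustrative values, not J-PARC RCS set points.

        import numpy as np

        # Dual-harmonic accelerating voltage over one fundamental rf period.
        phi = np.linspace(0, 2 * np.pi, 1000)   # rf phase
        v1 = 1.0                                # fundamental amplitude (arbitrary units)
        v2 = 0.5                                # second-harmonic amplitude (assumed ratio)
        delta_phi = np.pi                       # relative phase of the second harmonic (assumed)

        v_total = v1 * np.sin(phi) + v2 * np.sin(2 * phi + delta_phi)
        print("peak of the superposed voltage:", round(v_total.max(), 3))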

  19. Beam commissioning of the 3-GeV rapid cycling synchrotron of the Japan Proton Accelerator Research Complex

    Directory of Open Access Journals (Sweden)

    H. Hotchi

    2009-04-01

    The 3-GeV rapid cycling synchrotron (RCS) of the Japan Proton Accelerator Research Complex (J-PARC) was commissioned in October 2007 and successfully accomplished 3 GeV acceleration on October 31. Six run cycles through February 2008 were dedicated to commissioning the RCS, during which the initial machine parameter tuning and various underlying beam studies were completed. Since May 2008 the RCS beam has been delivered to the downstream facilities for their beam commissioning. In this paper we describe the beam tuning and study results following our beam commissioning scenario, and the beam performance and operational experience obtained in the first commissioning phase through June 2008.

  20. A novel method for rapid and reliable detection of complex vertebral malformation and bovine leukocyte adhesion deficiency in Holstein cattle

    Directory of Open Access Journals (Sweden)

    Zhang Yi

    2012-07-01

    Background: Complex vertebral malformation (CVM) and bovine leukocyte adhesion deficiency (BLAD) are two autosomal recessive lethal genetic defects that occur frequently in Holstein cattle and are identifiable by single nucleotide polymorphisms. The objective of this study was to develop a rapid and reliable genotyping assay to screen active Holstein sires and determine the carrier frequency of CVM and BLAD in the Chinese dairy cattle population. Results: We developed real-time PCR-based assays for discrimination of wild-type and defective alleles, so that carriers can be detected. Only one step was required after DNA extraction from the sample, and the time consumption was about 2 hours. A total of 587 Chinese Holstein bulls were assayed, and fifty-six CVM carriers and eight BLAD carriers were identified, corresponding to heterozygous carrier frequencies of 9.54% and 1.36%, respectively. The pedigree analysis showed that most of the carriers could be traced back to common ancestors, Osborndale Ivanhoe for BLAD and Pennstate Ivanhoe Star for CVM. Conclusions: These results demonstrate that real-time PCR is a simple, rapid and reliable assay for detection of the BLAD and CVM defective alleles. The high frequency of the CVM allele suggests that implementing a routine testing system is necessary to gradually eradicate the deleterious gene from the Chinese Holstein population.
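    The carrier frequencies reported above follow directly from the counts in the abstract (56 CVM carriers and 8 BLAD carriers among 587 bulls), as the short check below shows.

        # Worked check of the carrier frequencies reported for the Chinese Holstein bulls.
        total_bulls = 587
        cvm_carriers = 56
        blad_carriers = 8

        print(f"CVM carrier frequency:  {cvm_carriers / total_bulls:.2%}")
        print(f"BLAD carrier frequency: {blad_carriers / total_bulls:.2%}")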

  1. Pulse-Driven Capacitive Lead Ion Detection with Reduced Graphene Oxide Field-Effect Transistor Integrated with an Analyzing Device for Rapid Water Quality Monitoring.

    Science.gov (United States)

    Maity, Arnab; Sui, Xiaoyu; Tarman, Chad R; Pu, Haihui; Chang, Jingbo; Zhou, Guihua; Ren, Ren; Mao, Shun; Chen, Junhong

    2017-11-22

    Rapid and real-time detection of heavy metals in water with a portable microsystem is a growing demand in the fields of environmental monitoring, food safety, and future cyber-physical infrastructure. Here, we report a novel ultrasensitive pulse-driven, capacitance-based lead ion sensor that uses a self-assembled graphene oxide (GO) monolayer deposition strategy to recognize heavy metal ions in water. The overall field-effect transistor (FET) structure consists of a thermally reduced graphene oxide (rGO) channel with a thin Al₂O₃ passivation layer as a top gate, combined with sputtered gold nanoparticles that link with the glutathione (GSH) probe to attract Pb²⁺ ions in water. Using a preprogrammed microcontroller, chemo-capacitance-based detection of lead ions has been demonstrated with this FET sensor. The sensor shows a rapid response (∼1-2 s), negligible signal drift, and a low limit of detection (LOD); the total measurement time, including water stabilization followed by lead ion testing and calculation, is much shorter than for common FET resistance/current measurements (∼minutes) and other conventional methods, such as optical and inductively coupled plasma methods (∼hours). An approximately linear operational range (5-20 ppb) around 15 ppb (the maximum contaminant limit set by the US Environmental Protection Agency (EPA) for lead in drinking water) makes it especially suitable for drinking-water quality monitoring. The validity of the pulse method is confirmed by quantifying Pb²⁺ in various real water samples, such as tap, lake, and river water, with an accuracy of ∼75%. This capacitance measurement strategy is promising and can be readily extended to various FET-based sensor devices for other targets.

  2. Rapid presumptive identification of the Mycobacterium tuberculosis-bovis complex by radiometric determination of heat stable urease

    International Nuclear Information System (INIS)

    Gandy, J.H.; Pruden, E.L.; Cox, F.R.

    1983-01-01

    Simple and rapid Bactec methodologies for the determination of neat (unaltered) and heat-stable urease activity of mycobacteria are presented. Clinical isolates (63) and stock cultures (32)--consisting of M. tuberculosis (19), M. bovis (5), M. kansasii (15), M. marinum (4), M. simiae (3), M. scrofulaceum (16), M. gordonae (6), M. szulgai (6), M. flavescens (1), M. gastri (1), M. intracellulare (6), M. fortuitum-chelonei complex (12), and M. smegmatis (1)--were tested for neat urease activity by Bactec radiometry. Mycobacterial isolates (50-100 mg wet weight) were incubated at 35 °C for 30 minutes with 1 µCi of ¹⁴C-urea. Urease-positive mycobacteria gave Bactec growth index (GI) values greater than 100 units, whereas urease-negative species gave values less than 10 GI units. Eighty-three isolates possessing neat urease activity were heated at 80 °C for 30 minutes, followed by incubation at 35 °C for 30 minutes with 1 µCi of ¹⁴C-urea. The Mycobacterium tuberculosis-bovis complex demonstrated heat-stable urease activity (GI greater than 130 units) and could be distinguished from mycobacteria other than tuberculosis (MOTT), which gave GI values equal to or less than 40 units.

  3. Versatile, ultra-low sample volume gas analyzer using a rapid, broad-tuning ECQCL and a hollow fiber gas cell

    Science.gov (United States)

    Kriesel, Jason M.; Makarem, Camille N.; Phillips, Mark C.; Moran, James J.; Coleman, Max L.; Christensen, Lance E.; Kelly, James F.

    2017-05-01

    We describe a versatile mid-infrared (Mid-IR) spectroscopy system developed to measure the concentration of a wide range of gases with an ultra-low sample size. The system combines a rapidly-swept external cavity quantum cascade laser (ECQCL) with a hollow fiber gas cell. The ECQCL has sufficient spectral resolution and reproducibility to measure gases with narrow features (e.g., water, methane, ammonia, etc.), and also the spectral tuning range needed to measure volatile organic compounds (VOCs), (e.g., aldehydes, ketones, hydrocarbons), sulfur compounds, chlorine compounds, etc. The hollow fiber is a capillary tube having an internal reflective coating optimized for transmitting the Mid-IR laser beam to a detector. Sample gas introduced into the fiber (e.g., internal volume = 0.6 ml) interacts strongly with the laser beam, and despite relatively modest path lengths (e.g., L ~ 3 m), the requisite quantity of sample needed for sensitive measurements can be significantly less than what is required using conventional IR laser spectroscopy systems. Example measurements are presented including quantification of VOCs relevant for human breath analysis with a sensitivity of ~2 picomoles at a 1 Hz data rate.
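
    The claim that a 0.6 ml cell supports picomole-level quantification can be checked with the ideal gas law: the number of moles of analyte in the fiber is simply the total gas content scaled by the mixing ratio. The short sketch below does this arithmetic; the pressure, temperature, and example mixing ratios are assumptions for illustration, not values from the paper.

    ```python
    # Estimate the amount of analyte contained in the hollow-fiber gas cell
    # (internal volume ~0.6 mL, from the abstract) via the ideal gas law.
    # Pressure, temperature, and the example mixing ratios are assumptions.
    R = 8.314          # J mol^-1 K^-1
    P = 101325.0       # Pa (assume ~1 atm sample pressure)
    T = 296.0          # K (assume room temperature)
    V = 0.6e-6         # m^3 (0.6 mL internal fiber volume)

    total_moles = P * V / (R * T)   # total gas in the cell, ~2.5e-5 mol

    for ppm in (0.1, 1.0, 10.0):    # hypothetical analyte mixing ratios
        analyte_pmol = total_moles * ppm * 1e-6 * 1e12
        print(f"{ppm:5.1f} ppm analyte -> ~{analyte_pmol:8.1f} pmol in the cell")
    ```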

  4. Versatile, ultra-low sample volume gas analyzer using a rapid, broad-tuning ECQCL and a hollow fiber gas cell

    Energy Technology Data Exchange (ETDEWEB)

    Kriesel, Jason M.; Makarem, Camille N.; Phillips, Mark C.; Moran, James J.; Coleman, Max; Christensen, Lance; Kelly, James F.

    2017-05-05

    We describe a versatile mid-infrared (Mid-IR) spectroscopy system developed to measure the concentration of a wide range of gases with an ultra-low sample size. The system combines a rapidly-swept external cavity quantum cascade laser (ECQCL) with a hollow fiber gas cell. The ECQCL has sufficient spectral resolution and reproducibility to measure gases with narrow features (e.g., water, methane, ammonia, etc.), and also the spectral tuning range needed to measure volatile organic compounds (VOCs), (e.g., aldehydes, ketones, hydrocarbons), sulfur compounds, chlorine compounds, etc. The hollow fiber is a capillary tube having an internal reflective coating optimized for transmitting the Mid-IR laser beam to a detector. Sample gas introduced into the fiber (e.g., internal volume = 0.6 ml) interacts strongly with the laser beam, and despite relatively modest path lengths (e.g., L ~ 3 m), the requisite quantity of sample needed for sensitive measurements can be significantly less than what is required using conventional IR laser spectroscopy systems. Example measurements are presented including quantification of VOCs relevant for human breath analysis with a sensitivity of ~2 picomoles at a 1 Hz data rate.

  5. New Statistical Method to Analyze Three-Dimensional Landmark Configurations Obtained with Cone-Beam CT: Basic Features and Clinical Application for Rapid Maxillary Expansion

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Jennifer; Lagravere, Manuel O.; Major, Paul W.; Heo, Giseon [University of Alberta, Edmonton (Canada)]

    2012-03-15

    To describe a statistical method for analyzing three-dimensional landmark configuration data and apply it to an orthodontic data set comparing two types of rapid maxillary expansion (RME) treatments. Landmark configurations obtained from cone-beam CT scans were used to represent patients in the two RME treatment groups and a control group over four time points. A method using tools from persistent homology and dimensionality reduction is presented and used to identify variability between the subjects. The analysis was in agreement with previous results using conventional methods, which found significant differences between treatment groups and the control, but no distinction between the types of treatment. Additionally, it was found that second molar eruption varied considerably between the subjects, which had not been evaluated in previous analyses. This method of analysis allows entire configurations to be considered as a whole, and does not require specific inter-landmark distances or angles to be selected. Sources of variability present themselves, without having to be individually sought after. This method is suggested as an additional tool for the analysis of landmark configuration data.
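
    The approach treats each whole landmark configuration as a single data point and combines persistent homology with dimensionality reduction. As an illustration of the dimensionality-reduction step only, the sketch below flattens 3D landmark configurations and extracts principal components with plain numpy; the landmark data are random placeholders, and the persistent-homology features used in the paper are not reproduced here.

    ```python
    import numpy as np

    # Illustrative dimensionality reduction on whole 3D landmark configurations.
    # Each subject is described by k landmarks in 3D, flattened to a 3k-vector;
    # PCA then exposes the main modes of variability without hand-picking
    # inter-landmark distances or angles. Data below are random placeholders.
    rng = np.random.default_rng(0)
    n_subjects, n_landmarks = 30, 12
    configs = rng.normal(size=(n_subjects, n_landmarks, 3))

    X = configs.reshape(n_subjects, -1)          # flatten to (subjects, 3k)
    X -= X.mean(axis=0)                          # center each coordinate
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U * S                               # PCA scores per subject
    explained = S**2 / np.sum(S**2)

    print("variance explained by first 3 components:", np.round(explained[:3], 3))
    print("first subject's scores on PC1-PC3:", np.round(scores[0, :3], 3))
    ```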

  6. Development of a Novel Cu(II) Complex Modified Electrode and a Portable Electrochemical Analyzer for the Determination of Dissolved Oxygen (DO) in Water

    Directory of Open Access Journals (Sweden)

    Salvatore Gianluca Leonardi

    2016-04-01

    Full Text Available The development of an electrochemical dissolved oxygen (DO) sensor based on a novel Cu(II) complex-modified screen printed carbon electrode is reported. The voltammetric behavior of the modified electrode was investigated at different scan rates and oxygen concentrations in PBS (pH = 7). An increase of cathodic current (at about −0.4 V vs. Ag/AgCl) with the addition of oxygen was observed. The modified Cu(II) complex electrode was demonstrated for the determination of DO in water using chronoamperometry. A small-size and low-power-consumption home-made portable electrochemical analyzer, based on custom electronics for sensor interfacing and operating in voltammetry and amperometry modes, has also been designed and fabricated. Its performance in the monitoring of DO in water was compared with that of a commercial instrument.

  7. Rapid extraction and quantitative detection of the herbicide diuron in surface water by a hapten-functionalized carbon nanotubes based electrochemical analyzer.

    Science.gov (United States)

    Sharma, Priyanka; Bhalla, Vijayender; Tuteja, Satish; Kukkar, Manil; Suri, C Raman

    2012-05-21

    A solid phase extraction micro-cartridge containing a non-polar polystyrene absorbent matrix was coupled with an electrochemical immunoassay analyzer (EIA) and used for the ultra-sensitive detection of the phenyl urea herbicide diuron in real samples. The EIA was fabricated by using carboxylated carbon nanotubes (CNTs) functionalized with a hapten molecule (an amine-functionalized diuron derivative). Screen printed electrodes (SPE) were modified with these haptenized CNTs, and specific in-house generated anti-diuron antibodies were used for bio-interface development. The immunodetection was realized in a competitive electrochemical immunoassay format using alkaline phosphatase-labeled secondary anti-IgG antibody. The addition of 1-naphthyl phosphate substrate resulted in the production of an electrochemically active product, 1-naphthol, which was monitored by using differential pulse voltammetry (DPV). The assay exhibited excellent sensitivity and specificity, having a dynamic response range of 0.01 pg mL(-1) to 10 μg mL(-1) for diuron with a limit of detection of around 0.1 pg mL(-1) (n = 3) in standard water samples. The micro-cartridge coupled hapten-CNTs modified SPE provided an effective and efficient electrochemical immunoassay for the real-time monitoring of pesticide samples with a very high degree of sensitivity.
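
    Competitive immunoassays of this kind are usually calibrated by fitting the electrochemical signal against log concentration with a sigmoidal model; the abstract does not state which model was used, so the four-parameter logistic fit and the synthetic DPV currents below are illustrative assumptions covering the reported 0.01 pg mL(-1) to 10 μg mL(-1) range.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Four-parameter logistic (4PL) calibration, a common model for competitive
    # immunoassays. The paper's fitting model is not stated, so this is an
    # illustrative assumption; the data points below are synthetic.
    def four_pl(logc, top, bottom, ec50_log, slope):
        return bottom + (top - bottom) / (1.0 + 10 ** ((logc - ec50_log) * slope))

    # Hypothetical DPV peak currents (uA) for diuron standards (pg/mL).
    conc_pg_ml = np.array([0.01, 0.1, 1, 10, 100, 1e3, 1e4, 1e5, 1e6, 1e7])
    signal_ua = np.array([9.8, 9.5, 8.9, 7.6, 5.8, 4.1, 2.9, 2.2, 1.9, 1.8])

    logc = np.log10(conc_pg_ml)
    params, _ = curve_fit(four_pl, logc, signal_ua, p0=[10, 1.5, 2.0, 1.0])
    top, bottom, ec50_log, slope = params
    print(f"fitted IC50 ~ {10**ec50_log:.1f} pg/mL, slope ~ {slope:.2f}")
    ```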

  8. Using threshold regression to analyze survival data from complex surveys: With application to mortality linked NHANES III Phase II genetic data.

    Science.gov (United States)

    Li, Yan; Xiao, Tao; Liao, Dandan; Lee, Mei-Ling Ting

    2018-03-30

    The Cox proportional hazards (PH) model is a common statistical technique used for analyzing time-to-event data. The assumption of PH, however, is not always appropriate in real applications. In cases where the assumption is not tenable, threshold regression (TR) and other survival methods, which do not require the PH assumption, are available and widely used. These alternative methods generally assume that the study data constitute simple random samples. In particular, TR has not been studied in the setting of complex surveys that involve (1) differential selection probabilities of study subjects and (2) intracluster correlations induced by multistage cluster sampling. In this paper, we extend TR procedures to account for complex sampling designs. The pseudo-maximum likelihood estimation technique is applied to estimate the TR model parameters. Computationally efficient Taylor linearization variance estimators that consider both the intracluster correlation and the differential selection probabilities are developed. The proposed methods are evaluated by using simulation experiments with various complex designs and illustrated empirically by using mortality-linked Third National Health and Nutrition Examination Survey Phase II genetic data. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Nanobodies: site-specific labeling for super-resolution imaging, rapid epitope-mapping and native protein complex isolation

    Science.gov (United States)

    Pleiner, Tino; Bates, Mark; Trakhanov, Sergei; Lee, Chung-Tien; Schliep, Jan Erik; Chug, Hema; Böhning, Marc; Stark, Holger; Urlaub, Henning; Görlich, Dirk

    2015-01-01

    Nanobodies are single-domain antibodies of camelid origin. We generated nanobodies against the vertebrate nuclear pore complex (NPC) and used them in STORM imaging to locate individual NPC proteins. For this, we introduced cysteines at specific positions in the nanobody sequence and labeled the resulting proteins with fluorophore-maleimides. As nanobodies are normally stabilized by disulfide-bonded cysteines, this appears counterintuitive. Yet, our analysis showed that this caused no folding problems. Compared to traditional NHS ester-labeling of lysines, the cysteine-maleimide strategy resulted in far less background in fluorescence imaging, better preserved epitope recognition, and is site-specific. We also devised a rapid epitope-mapping strategy, which relies on crosslinking mass spectrometry and the introduced ectopic cysteines. Finally, we used different anti-nucleoporin nanobodies to purify the major NPC building blocks – each in a single step, with native elution and, as demonstrated, in excellent quality for structural analysis by electron microscopy. The presented strategies are applicable to any nanobody and nanobody-target. DOI: http://dx.doi.org/10.7554/eLife.11349.001 PMID:26633879

  10. Pick a Color MARIA: Adaptive Sampling Enables the Rapid Identification of Complex Perovskite Nanocrystal Compositions with Defined Emission Characteristics.

    Science.gov (United States)

    Bezinge, Leonard; Maceiczyk, Richard M; Lignos, Ioannis; Kovalenko, Maksym V; deMello, Andrew J

    2018-06-06

    Recent advances in the development of hybrid organic-inorganic lead halide perovskite (LHP) nanocrystals (NCs) have demonstrated their versatility and potential application in photovoltaics and as light sources through compositional tuning of optical properties. That said, due to their compositional complexity, the targeted synthesis of mixed-cation and/or mixed-halide LHP NCs still represents an immense challenge for traditional batch-scale chemistry. To address this limitation, we herein report the integration of a high-throughput segmented-flow microfluidic reactor and a self-optimizing algorithm for the synthesis of NCs with defined emission properties. The algorithm, named Multiparametric Automated Regression Kriging Interpolation and Adaptive Sampling (MARIA), iteratively computes optimal sampling points at each stage of an experimental sequence to reach a target emission peak wavelength based on spectroscopic measurements. We demonstrate the efficacy of the method through the synthesis of multinary LHP NCs, (Cs/FA)Pb(I/Br)3 (FA = formamidinium) and (Rb/Cs/FA)Pb(I/Br)3 NCs, using MARIA to rapidly identify reagent concentrations that yield user-defined photoluminescence peak wavelengths in the green-red spectral region. The procedure returns a robust model around a target output in far fewer measurements than systematic screening of parametric space and additionally enables the prediction of other spectral properties, such as full-width at half-maximum and intensity, for conditions yielding NCs with similar emission peak wavelengths.
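
    MARIA couples kriging (Gaussian-process) regression with adaptive selection of the next experimental condition so that the measured emission peak converges on a target wavelength. The sketch below is a generic, one-variable version of such a loop built on scikit-learn's Gaussian process regressor; the toy response function, kernel choice, and acquisition rule are assumptions, not the published algorithm.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Generic kriging-with-adaptive-sampling loop in the spirit of MARIA.
    # A hidden response maps a (normalized) halide ratio to an emission peak;
    # the surrogate proposes the next sample to approach a target wavelength.
    def hidden_response(x):          # stand-in for the microfluidic experiment
        return 520.0 + 160.0 * x     # nm, monotone toy model (assumption)

    target_nm = 620.0
    X = np.array([[0.1], [0.9]])                     # initial experiments
    y = hidden_response(X).ravel()

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
    candidates = np.linspace(0.0, 1.0, 201).reshape(-1, 1)

    for step in range(5):
        gp.fit(X, y)
        mu, sigma = gp.predict(candidates, return_std=True)
        # Acquisition: favour points predicted near the target, plus exploration.
        score = -np.abs(mu - target_nm) + 2.0 * sigma
        x_next = candidates[np.argmax(score)]
        y_next = hidden_response(x_next)[0]
        X = np.vstack([X, [x_next]])
        y = np.append(y, y_next)
        print(f"step {step}: sampled x={x_next[0]:.3f}, peak={y_next:.1f} nm")
    ```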

  11. Transverse effects on the nasomaxillary complex one year after rapid maxillary expansion as the only intervention: A controlled study

    Directory of Open Access Journals (Sweden)

    Carolina da Luz Baratieri

    2014-10-01

    Full Text Available The aim of this study was to assess, by means of cone-beam computed tomography (CBCT) scans, the transverse effects on the nasomaxillary complex in patients submitted to rapid maxillary expansion (RME) using a Haas expander in comparison with untreated individuals. This prospective controlled clinical study assessed 30 subjects (18 boys and 12 girls) with mixed dentition and during pubertal growth. The treated group was submitted to RME with a Haas expander, retention for six months and a six-month follow-up after removal. The control group matched the treated group in terms of age and sex distribution. CBCT scans were taken at treatment onset and one year after the expander was activated. Maxillary first molar (U6) width, right and left U6 angulation, maxillary alveolar width, maxillary basal width, palatal alveolar width, palatal base width, right and left alveolar angulation, palatal area, nasal base width, nasal cavity width and inferior nasal cavity area on the posterior, middle and anterior coronal slices were measured with Dolphin Imaging Software® 11.5, except for the first two variables, which were measured only on the posterior slice. All transverse dimensions increased significantly (P < 0.05). Results suggest that the increase of molar, maxillary, palatal and nasal transverse dimensions was stable in comparison to the control group one year after treatment with RME.

  12. Health Systems Research in a Complex and Rapidly Changing Context: Ethical Implications of Major Health Systems Change at Scale.

    Science.gov (United States)

    MacGregor, Hayley; Bloom, Gerald

    2016-12-01

    This paper discusses health policy and systems research in complex and rapidly changing contexts. It focuses on ethical issues at stake for researchers working with government policy makers to provide evidence to inform major health systems change at scale, particularly when the dynamic nature of the context and ongoing challenges to the health system can result in unpredictable outcomes. We focus on situations where 'country ownership' of HSR is relatively well established, where there is significant involvement of local researchers, and where close ties and relationships with policy makers are often present. We frame our discussion around two country case studies with which we are familiar, namely China and South Africa, and discuss the implications for conducting 'embedded' research. We suggest that reflexivity is an important concept for health system researchers, who need to think carefully about positionality and their normative stance and to use such reflection to ensure that they can negotiate to retain autonomy, whilst also contributing evidence for health system change. A research process informed by the notion of reflexive practice and iterative learning will require a longitudinal review at key points in the research timeline. Such review should include the convening of a deliberative process and should involve a range of stakeholders, including those most likely to be affected by the intended and unintended consequences of change. © 2016 The Authors Developing World Bioethics Published by John Wiley & Sons Ltd.

  13. Rapid molecular cytogenetic analysis of X-chromosomal microdeletions: Fluorescence in situ hybridization (FISH) for complex glycerol kinase deficiency

    Energy Technology Data Exchange (ETDEWEB)

    Worley, K.C.; Lindsay, E.A.; McCabe, E.R.B. [Baylor College of Medicine, Houston, TX (United States)] [and others]

    1995-07-17

    Diagnosis of X-chromosomal microdeletions has relied upon the traditional methods of Southern blotting and DNA amplification, with carrier identification requiring time-consuming and unreliable dosage calculations. In this report, we describe rapid molecular cytogenetic identification of deleted DNA in affected males with the Xp21 contiguous gene syndrome (complex glycerol kinase deficiency, CGKD) and female carriers for this disorder. CGKD deletions involve the genes for glycerol kinase, Duchenne muscular dystrophy, and/or adrenal hypoplasia congenita. We report an improved method for diagnosis of deletions in individuals with CGKD and for identification of female carriers within their families using fluorescence in situ hybridization (FISH) with a cosmid marker (cosmid 35) within the glycerol kinase gene. When used in combination with an Xq control probe, affected males demonstrate a single signal from the control probe, while female carriers demonstrate a normal chromosome with two signals, as well as a deleted chromosome with a single signal from the control probe. FISH analysis for CGKD provides the advantages of speed and accuracy for evaluation of submicroscopic X-chromosome deletions, particularly in identification of female carriers. In addition to improving carrier evaluation, FISH will make prenatal diagnosis of CGKD more readily available. 17 refs., 2 figs.

  14. Influence of metal loading and humic acid functional groups on the complexation behavior of trivalent lanthanides analyzed by CE-ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Kautenburger, Ralf, E-mail: r.kautenburger@mx.uni-saarland.de [Institute of Inorganic Solid State Chemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 3-5, D-66125 Saarbrücken (Germany); Hein, Christina; Sander, Jonas M. [Institute of Inorganic Solid State Chemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 3-5, D-66125 Saarbrücken (Germany); Beck, Horst P. [Institute of Inorganic and Analytical Chemistry and Radiochemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 5, D-66125 Saarbrücken (Germany)

    2014-03-01

    Highlights: • Free and complexed HA-Ln species are separated by CE-ICP-MS. • Weaker and stronger HA-binding sites for Ln-complexation can be detected. • Complexation by original and modified humic acid (HA) with blocked phenolic hydroxyl- and carboxyl-groups is compared. • Stronger HA-binding sites for Ln³⁺ can be assumed as chelating complexes. • Chelates consist of trivalent Ln and a combination of both OH- and COOH-groups. Abstract: The complexation behavior of Aldrich humic acid (AHA) and a modified humic acid (AHA-PB) with blocked phenolic hydroxyl groups for trivalent lanthanides (Ln) is compared, and their influence on the mobility of Ln(III) in an aquifer is analyzed. As speciation technique, capillary electrophoresis (CE) was hyphenated with inductively coupled plasma mass spectrometry (ICP-MS). For metal loading experiments, 25 mg L⁻¹ of AHA and different concentrations of Eu(III) and Gd(III) (c(Eu+Gd) = 100–6000 μg L⁻¹) in 10 mM NaClO₄ at pH 5 were applied. By CE-ICP-MS, three Ln-fractions, assumed to be uncomplexed, weakly and strongly AHA-complexed metal, can be detected. For the used Ln/AHA-ratios, conservative complex stability constants log β(LnAHA) decrease from 6.33 (100 μg L⁻¹ Ln³⁺) to 4.31 (6000 μg L⁻¹ Ln³⁺) with growing Ln-content. In order to verify the postulated weaker and stronger humic acid binding sites for trivalent Eu and Gd, a modified AHA with blocked functional groups was used. For these experiments, 500 μg L⁻¹ Eu and 25 mg L⁻¹ AHA and AHA-PB in 10 mM NaClO₄ at pH values ranging from 3 to 10 were applied. With AHA-PB, where 84% of the phenolic OH-groups and 40% of the COOH-groups were blocked, Eu complexation was significantly lower, especially at the strong binding sites. The log β-values decrease from 6.11 (pH 10) to 5.61 at pH 3 (AHA) and for AHA-PB from 6.01 (pH 7) to 3.94 at pH 3. As a potential consequence, particularly humic acids with a high amount of

  15. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample rate changing, which may be intermixed with multiple post-trigger operations with variable length blocks using normal, peak-to-peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation both at PPPL and in general.

  16. A coiled coil trigger site is essential for rapid binding of synaptobrevin to the SNARE acceptor complex

    DEFF Research Database (Denmark)

    Wiederhold, Katrin; Kloepper, Tobias H; Walter, Alexander M

    2010-01-01

    Exocytosis from synaptic vesicles is driven by stepwise formation of a tight alpha-helical complex between the fusing membranes. The complex is composed of the three SNAREs: synaptobrevin 2, SNAP-25, and syntaxin 1a. An important step in complex formation is fast binding of vesicular synaptobrevin...

  17. Analyzing Clickstreams

    DEFF Research Database (Denmark)

    Andersen, Jesper; Giversen, Anders; Jensen, Allan H.

    in modern enterprises. In the data warehousing approach, selected information is extracted in advance and stored in a repository. This approach is used because of its high performance. However, in many situations a logical (rather than physical) integration of data is preferable. Previous web-based data......On-Line Analytical Processing (OLAP) enables analysts to gain insight into data through fast and interactive access to a variety of possible views on information, organized in a dimensional model. The demand for data integration is rapidly becoming larger as more and more information sources appear.... Extensible Markup Language (XML) is fast becoming the new standard for data representation and exchange on the World Wide Web. The rapid emergence of XML data on the web, e.g., business-to-business (B2B) ecommerce, is making it necessary for OLAP and other data analysis tools to handle XML data as well...

  18. Rapid formation of complexity in the total synthesis of natural products enabled by oxabicyclo[2.2.1]heptene building blocks.

    Science.gov (United States)

    Schindler, Corinna S; Carreira, Erick M

    2009-11-01

    This critical review showcases examples of rapid formation of complexity in total syntheses starting from 7-oxabicyclo[2.2.1]hept-5-ene derivatives. An overview of methods allowing synthetic access to these building blocks is provided and their application in recently developed synthetic transformations to structurally complex systems is illustrated. In addition, the facile access to a novel oxabicyclo[2.2.1]heptene derived building block is presented which significantly enlarges the possibilities of previously known chemical transformations and is highlighted in the enantioselective route to the core of the banyaside and suomilide natural products (107 references).

  19. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample and linearizing circuits having inverse function characteristics of calibration functions which correspond to the radiation measuring subsystems. A weighing adder operates a desirable linear combination of the outputs of the linearizing circuits. Operators for operating between two or more different linear combinations are included.

  20. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  1. Nuclear and plastid haplotypes suggest rapid diploid and polyploid speciation in the N Hemisphere Achillea millefolium complex (Asteraceae

    Directory of Open Access Journals (Sweden)

    Guo Yan-Ping

    2012-01-01

    Full Text Available Abstract Background Species complexes or aggregates consist of a set of closely related species often of different ploidy levels, whose relationships are difficult to reconstruct. The N Hemisphere Achillea millefolium aggregate exhibits complex morphological and genetic variation and a broad ecological amplitude. To understand its evolutionary history, we study sequence variation at two nuclear genes and three plastid loci across the natural distribution of this species complex and compare the patterns of such variations to the species tree inferred earlier from AFLP data. Results Among the diploid species of A. millefolium agg., gene trees of the two nuclear loci, ncpGS and SBP, and the combined plastid fragments are incongruent with each other and with the AFLP tree likely due to incomplete lineage sorting or secondary introgression. In spite of the large distributional range, no isolation by distance is found. Furthermore, there is evidence for intragenic recombination in the ncpGS gene. An analysis using a probabilistic model for population demographic history indicates large ancestral effective population sizes and short intervals between speciation events. Such a scenario explains the incongruence of the gene trees and species tree we observe. The relationships are particularly complex in the polyploid members of A. millefolium agg. Conclusions The present study indicates that the diploid members of A. millefolium agg. share a large part of their molecular genetic variation. The findings of little lineage sorting and lack of isolation by distance are likely due to short intervals between speciation events and close proximity of ancestral populations. While previous AFLP data provide species trees congruent with earlier morphological classification and phylogeographic considerations, the present sequence data are not suited to recover the relationships of diploid species in A. millefolium agg. For the polyploid taxa many hybrid links and

  2. Complex systems thinking in emergency medicine: A novel paradigm for a rapidly changing and interconnected health care landscape.

    Science.gov (United States)

    Widmer, Matthew A; Swanson, R Chad; Zink, Brian J; Pines, Jesse M

    2017-12-27

    The specialty of emergency medicine is experiencing the convergence of a number of transformational forces in the United States, including health care reform, technological advancements, and societal shifts. These bring both opportunity and uncertainty. 21ST CENTURY CHALLENGES: Persistent challenges such as the opioid epidemic, rising health care costs, misaligned incentives, patients with multiple chronic diseases, and emergency department crowding continue to plague the acute, unscheduled care system. The traditional approach to health care practice and improvement, reductionism, is not adequate for the complexity of the twenty-first century. Reductionist thinking will likely continue to produce unintended consequences and suboptimal outcomes. Complex systems thinking provides a perspective and set of tools better suited for the challenges and opportunities facing public health in general, and emergency medicine more specifically. This article introduces complex systems thinking and argues for its application in the context of emergency medicine by drawing on the history of the circumstances surrounding the formation of the specialty and by providing examples of its application to several practice challenges. © 2017 John Wiley & Sons, Ltd.

  3. Alu polymerase chain reaction: A method for rapid isolation of human-specific sequences from complex DNA sources

    International Nuclear Information System (INIS)

    Nelson, D.L.; Ledbetter, S.A.; Corbo, L.; Victoria, M.F.; Ramirez-Solis, R.; Webster, T.D.; Ledbetter, D.H.; Caskey, C.T.

    1989-01-01

    Current efforts to map the human genome are focused on individual chromosomes or smaller regions and frequently rely on the use of somatic cell hybrids. The authors report the application of the polymerase chain reaction to direct amplification of human DNA from hybrid cells containing regions of the human genome in rodent cell backgrounds using primers directed to the human Alu repeat element. They demonstrate Alu-directed amplification of a fragment of the human HPRT gene from both hybrid cell and cloned DNA and identify through sequence analysis the Alu repeats involved in this amplification. They also demonstrate the application of this technique to identify the chromosomal locations of large fragments of the human X chromosome cloned in a yeast artificial chromosome and the general applicability of the method to the preparation of DNA probes from cloned human sequences. The technique allows rapid gene mapping and provides a simple method for the isolation and analysis of specific chromosomal regions

  4. Rapid kinetics of iron responsive element (IRE) RNA/iron regulatory protein 1 and IRE-RNA/eIF4F complexes respond differently to metal ions.

    Science.gov (United States)

    Khan, Mateen A; Ma, Jia; Walden, William E; Merrick, William C; Theil, Elizabeth C; Goss, Dixie J

    2014-06-01

    Metal ion binding was previously shown to destabilize IRE-RNA/IRP1 equilibria and enhance IRE-RNA/eIF4F equilibria. In order to understand the relative importance of kinetics and stability, we now report rapid rates of protein/RNA complex assembly and dissociation for two IRE-RNAs with IRP1, and quantitatively different metal ion response kinetics that coincide with the different iron responses in vivo. kon for FRT IRE-RNA binding to IRP1 was eight times faster than for ACO2 IRE-RNA. Mn(2+) decreased kon and increased koff for IRP1 binding to both FRT and ACO2 IRE-RNA, with a larger effect for FRT IRE-RNA. In order to further understand IRE-mRNA regulation in terms of kinetics and stability, eIF4F kinetics with FRT IRE-RNA were determined. kon for eIF4F binding to FRT IRE-RNA in the absence of metal ions was 5-times slower than IRP1 binding to FRT IRE-RNA. Mn(2+) increased the association rate for eIF4F binding to FRT IRE-RNA, so that at 50 µM Mn(2+) eIF4F bound more than 3-times faster than IRP1. The IRP1/IRE-RNA complex has a much shorter lifetime than the eIF4F/IRE-RNA complex, which suggests that both the rate of assembly and the stability of the complexes are important, and allows this regulatory system to respond rapidly to changes in cellular iron. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
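
    Because the argument hinges on both assembly rate and complex stability, it helps to recall that the equilibrium dissociation constant follows directly from the measured rate constants, Kd = koff/kon, while the complex lifetime scales as 1/koff. The sketch below applies these relations to purely illustrative rate values; the paper's actual numbers are not reproduced.

    ```python
    # Relation between measured rate constants and complex stability:
    # Kd = koff / kon, and the complex lifetime is ~1/koff.
    # The rate constants below are illustrative placeholders, not published values.
    complexes = {
        "IRP1 / FRT IRE-RNA (no metal)": {"kon": 1.0e7, "koff": 1.0e-3},
        "IRP1 / FRT IRE-RNA (+Mn2+)":    {"kon": 2.0e6, "koff": 1.0e-2},
        "eIF4F / FRT IRE-RNA (+Mn2+)":   {"kon": 6.0e6, "koff": 5.0e-4},
    }

    for name, k in complexes.items():
        kd_nM = k["koff"] / k["kon"] * 1e9       # M -> nM
        lifetime_s = 1.0 / k["koff"]             # mean complex lifetime in s
        print(f"{name:33s} Kd ~ {kd_nM:7.2f} nM, lifetime ~ {lifetime_s:7.0f} s")
    ```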

  5. A rapid-acting, long-acting insulin formulation based on a phospholipid complex loaded PHBHHx nanoparticles.

    Science.gov (United States)

    Peng, Qiang; Zhang, Zhi-Rong; Gong, Tao; Chen, Guo-Qiang; Sun, Xun

    2012-02-01

    The application of poly(hydroxybutyrate-co-hydroxyhexanoate) (PHBHHx) for sustained and controlled delivery of hydrophilic insulin was made possible by preparing insulin phospholipid complex loaded biodegradable PHBHHx nanoparticles (INS-PLC-NPs). The INS-PLC-NPs produced by a solvent evaporation method showed a spherical shape with a mean particle size, zeta potential and entrapment efficiency of 186.2 nm, -38.4 mV and 89.73%, respectively. In vitro studies demonstrated that only 20% of insulin was released within 31 days, with a burst release of 5.42% in the first 8 h. The hypoglycaemic effect in STZ-induced diabetic rats lasted for more than 3 days after the subcutaneous injection of INS-PLC-NPs, which significantly prolonged the therapeutic effect compared with the administration of insulin solution. The pharmacological bioavailability (PA) of INS-PLC-NPs relative to insulin solution was over 350%, indicating that the bioavailability of insulin was significantly enhanced by INS-PLC-NPs. Therefore, the INS-PLC-NPs system is promising to serve as a long-lasting insulin release formulation, by which patient compliance can be enhanced significantly. This study also showed that phospholipid complex loaded biodegradable nanoparticles (PLC-NPs) have great potential to be used as a sustained delivery system for hydrophilic proteins to be encapsulated in hydrophobic polymers. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. A new million-channel analyzer for complex nuclear spectroscopy studies and its application in measurements of the β decay of 149Pr

    International Nuclear Information System (INIS)

    Tenten, W.

    1978-11-01

    A million-channel analyzer with CAMAC instrumentation and PDP-11 control computer was developed and tested using the case of the β decay of 149Pr and the γ decays of 149Nd. A level scheme for 149Nd was developed. (WL) [de]

  7. What is science? Thinking about doctoral Business Administration students’ perceptions analyzed from the perspective of Edgar Morin and the paradigm of Complexity

    Directory of Open Access Journals (Sweden)

    Giancarlo Dal Bo

    2015-09-01

    Full Text Available Discussions about the paradigms that shape science are important in order to promote reflection among researchers as to their role in society. The Cartesian-Newtonian paradigm lies both in the natural and social sciences, both of which initially adopted it, before gradually bringing it into question due to the depletion of its explanatory power for current phenomena. Some authors propose the paradigm of complexity, which would, based on the features exposed in this paper, be better suited to providing a broad understanding of the process of knowledge construction. Through literature review and quantitative research conducted with students of the Doctoral Program in Business Administration at two Higher Education Institutions in Rio Grande do Sul, this paper attempts to identify the prevailing perceptions regarding the epistemological and paradigmatic positions adopted in the sciences and challenge them with the complexity paradigm proposed by Edgar Morin.

  8. Evaluation of a rapid radiometric differentiation test for the Mycobacterium tuberculosis complex by selective inhibition with p-nitro-alpha-acetylamino-beta-hydroxypropiophenone

    International Nuclear Information System (INIS)

    Laszlo, A.; Siddiqi, S.H.

    1984-01-01

    This study is an evaluation of a rapid technique for the differentiation of the Mycobacterium tuberculosis complex from other mycobacteria, using p-nitro-alpha-acetylamino-beta-hydroxypropiophenone (NAP) as a selective inhibitory agent. A total of 416 coded cultures, 234 cultures belonging to the M. tuberculosis complex and 182 cultures belonging to 35 other mycobacterial species, were tested in two laboratories for p-nitro-alpha-acetylamino-beta-hydroxypropiophenone inhibition at concentrations of 5 and 10 micrograms of NAP per ml in Middlebrook 7H12 liquid medium. Two testing modes were compared: the indirect, in which a large bacterial inoculum was used from an isolated culture on a solid medium, and the direct, which used a small inoculum from 7H12 medium. A decrease or no increase in daily 14CO2 output as measured by a BACTEC system was considered evidence of inhibition. The data presented show that a concentration of 5 micrograms of NAP per ml can effectively separate the M. tuberculosis complex from other mycobacterial species in 4 to 6 days. The direct test data show that, unlike other conventional biochemical tests, it does not require a heavy inoculum of mycobacteria and can therefore be performed soon after growth is detected by the radiometric method.

  9. Analyzing coastal turbidity under complex terrestrial loads characterized by a 'stress connectivity matrix' with an atmosphere-watershed-coastal ocean coupled model

    Science.gov (United States)

    Yamamoto, Takahiro; Nadaoka, Kazuo

    2018-04-01

    Atmospheric, watershed and coastal ocean models were integrated to provide a holistic analysis approach for coastal ocean simulation. The coupled model was applied to coastal ocean in the Philippines, where terrestrial sediment loads provided from several adjacent watersheds play a major role in influencing coastal turbidity and are partly responsible for the coastal ecosystem degradation. The coupled model was validated using weather and hydrologic measurements to examine its potential applicability. The results revealed that the coastal water quality may be governed by the loads not only from the adjacent watershed but also from the distant watershed via coastal currents. This important feature of the multiple linkages can be quantitatively characterized by a "stress connectivity matrix", which indicates the complex underlying structure of environmental stresses in the coastal ocean. The multiple stress connectivity concept shows the potential advantage of the integrated modelling approach for coastal ocean assessment, which may also help compensate for the lack of measured data, especially in tropical basins.
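
    A stress connectivity matrix can be read as a linear map from watershed sediment loads to stress at coastal zones: entry (i, j) weights how much of watershed j's load reaches zone i via direct discharge and along-shore transport. The minimal sketch below illustrates the idea; the matrix entries, loads, and zone names are made-up placeholders, not results from the study.

    ```python
    import numpy as np

    # Minimal 'stress connectivity matrix' sketch: rows are coastal zones,
    # columns are watersheds; entry (i, j) weights how much of watershed j's
    # sediment load contributes to turbidity stress in zone i (via direct
    # discharge and along-shore transport). All numbers are placeholders.
    connectivity = np.array([
        [0.60, 0.25, 0.05],   # zone A: dominated by its adjacent watershed
        [0.10, 0.55, 0.20],   # zone B
        [0.05, 0.15, 0.70],   # zone C: also receives distant loads via currents
    ])
    loads_t_per_day = np.array([120.0, 80.0, 200.0])   # watershed sediment loads

    stress = connectivity @ loads_t_per_day
    for zone, s in zip("ABC", stress):
        print(f"coastal zone {zone}: turbidity stress index ~ {s:6.1f}")
    ```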

  10. The "adjuvant effect" of the polymorphic B-G antigens of the chicken major histocompatibility complex analyzed using purified molecules incorporated in liposomes

    DEFF Research Database (Denmark)

    Salomonsen, J; Eriksson, H; Skjødt, K

    1991-01-01

    The polymorphic B-G region of the chicken major histocompatibility complex has previously been shown to mediate an "adjuvant effect" on the humoral response to other erythrocyte alloantigens. We demonstrate here that B-G molecules purified with monoclonal antibodies exert this adjuvant effect on the production of alloantibodies to chicken class I (B-F) molecules, when the two are in the same liposome. The adjuvant effect may in part be mediated by antibodies, since the antibody response to B-G molecules occurs much faster than the response to B-F molecules, and conditions in which antibodies to B-G are present increase the speed of the response to B-F molecules. We also found that the presence of B-G molecules in separate liposomes results in a lack of response to B-F molecules. In the light of this and other data, we consider the possible roles for the polymorphic B-G molecules, particularly...

  11. Rapid, automated, nonradiometric susceptibility testing of Mycobacterium tuberculosis complex to four first-line antituberculous drugs used in standard short-course chemotherapy

    DEFF Research Database (Denmark)

    Johansen, Isik Somuncu; Thomsen, Vibeke Østergaard; Marjamäki, Merja

    2004-01-01

    The increasing prevalence of drug-resistant tuberculosis necessitates rapid and accurate susceptibility testing. The nonradiometric BACTEC Mycobacteria Growth Indicator Tube 960 (MGIT) system for susceptibility testing was evaluated on 222 clinical Mycobacterium tuberculosis complex isolates for isoniazid, rifampin, and ethambutol. Fifty-seven of the isolates were tested for pyrazinamide. Results were compared to those of the radiometric BACTEC 460 system, and discrepancies were resolved by the agar proportion method. We found an overall agreement of 99.0% for isoniazid, 99.5% for rifampin, 98.2% for ethambutol, and 100% for pyrazinamide. After resolution of discrepancies, MGIT yielded no false susceptibility for rifampin and isoniazid. Although turnaround times were comparable, MGIT provides an advantage as inoculation can be done on any weekday as the growth is monitored automatically. The automated...

  12. Rapid modeling of complex multi-fault ruptures with simplistic models from real-time GPS: Perspectives from the 2016 Mw 7.8 Kaikoura earthquake

    Science.gov (United States)

    Crowell, B.; Melgar, D.

    2017-12-01

    The 2016 Mw 7.8 Kaikoura earthquake is one of the most complex earthquakes in recent history, rupturing across at least 10 disparate faults with varying faulting styles, and exhibiting intricate surface deformation patterns. The complexity of this event has motivated the need for multidisciplinary geophysical studies to get at the underlying source physics to better inform earthquake hazards models in the future. However, events like Kaikoura beg the question of how well (or how poorly) such earthquakes can be modeled automatically in real-time and still satisfy the general public and emergency managers. To investigate this question, we perform a retrospective real-time GPS analysis of the Kaikoura earthquake with the G-FAST early warning module. We first perform simple point source models of the earthquake using peak ground displacement scaling and a coseismic offset based centroid moment tensor (CMT) inversion. We predict ground motions based on these point sources as well as simple finite faults determined from source scaling studies, and validate against true recordings of peak ground acceleration and velocity. Secondly, we perform a slip inversion based upon the CMT fault orientations and forward model near-field tsunami maximum expected wave heights to compare against available tide gauge records. We find remarkably good agreement between recorded and predicted ground motions when using a simple fault plane, with the majority of disagreement in ground motions being attributable to local site effects, not earthquake source complexity. Similarly, the near-field tsunami maximum amplitude predictions match tide gauge records well. We conclude that even though our models for the Kaikoura earthquake are devoid of rich source complexities, the CMT driven finite fault is a good enough "average" source and provides useful constraints for rapid forecasting of ground motion and near-field tsunami amplitudes.
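
    G-FAST's point-source step uses peak ground displacement (PGD) scaling, in which log10(PGD) is modeled as a function of magnitude and distance and then inverted for magnitude from the observed GPS displacements. The sketch below shows the general form of such an inversion; the regression coefficients and station data are illustrative placeholders, not the operational values.

    ```python
    import numpy as np

    # Peak-ground-displacement (PGD) magnitude scaling of the form
    #   log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km),
    # inverted for Mw by least squares over several GPS stations.
    # Coefficients and station observations below are illustrative placeholders.
    A, B, C = -5.0, 1.2, -0.17

    # Hypothetical station observations: (PGD in cm, hypocentral distance in km)
    obs = np.array([
        [85.0,  60.0],
        [40.0, 120.0],
        [12.0, 250.0],
    ])

    pgd_cm, r_km = obs[:, 0], obs[:, 1]
    # For each station, log10(PGD) - A = Mw * (B + C*log10(R)); solve for Mw.
    g = B + C * np.log10(r_km)                 # per-station design coefficient
    d = np.log10(pgd_cm) - A
    mw = np.sum(g * d) / np.sum(g * g)         # least-squares magnitude estimate
    print(f"least-squares magnitude estimate: Mw ~ {mw:.2f}")
    ```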

  13. Role of the GenoType® Mycobacterium common mycobacteria/additional species assay for rapid differentiation between Mycobacterium tuberculosis complex and different species of non-tuberculous mycobacteria

    Directory of Open Access Journals (Sweden)

    Amresh Kumar Singh

    2013-01-01

    Full Text Available Background: Mycobacterium tuberculosis complex (MTBC) and non-tuberculous mycobacteria (NTM) may or may not have the same clinical presentations, but the treatment regimens are always different. Laboratory differentiation between MTBC and NTM by routine methods is time consuming and cumbersome to perform. We have evaluated the role of the GenoType® Mycobacterium common mycobacteria/additional species (CM/AS) assay for differentiation between MTBC and different species of NTM in clinical isolates from tuberculosis (TB) cases. Materials and Methods: A total of 1080 clinical specimens were collected from January 2010 to June 2012. Diagnosis was performed by Ziehl-Neelsen staining followed by culture in the BacT/ALERT 3D system (bioMerieux, France). A total of 219 culture-positive clinical isolates (BacT/ALERT® MP cultures) were selected for differentiation by the p-nitrobenzoic acid (PNB) sensitivity test and the BIO-LINE SD Ag MPT64 TB test, considered as the gold standard tests. Final identification and differentiation between MTBC and different species of NTM were further confirmed by the GenoType® Mycobacterium CM/AS assay (Hain Lifescience, Nehren, Germany). Results: Of the 219 BacT/ALERT® MP culture-positive isolates, 153 (69.9%) were identified as MTBC by PNB and 159 (72.6%) as MTBC by the GenoType® Mycobacterium CM/AS assay; the remaining 60 (27.4%) were considered NTM species. The GenoType® Mycobacterium CM/AS assay proved 99.3% sensitive and 98.3% specific for rapid differentiation of MTBC and NTM. The most common NTM species were Mycobacterium fortuitum, 20 (33.3%), among rapidly growing mycobacteria and Mycobacterium intracellulare, 11 (18.3%), among slowly growing mycobacteria. Conclusion: The GenoType® Mycobacterium assay allows rapid and accurate identification of NTM species as compared with different phenotypic and molecular diagnostic tools and helps in the management of infections caused by different mycobacteria.

  14. MALDI-TOF MS enables the rapid identification of the major molecular types within the Cryptococcus neoformans/C. gattii species complex.

    Directory of Open Access Journals (Sweden)

    Carolina Firacative

    Full Text Available BACKGROUND: The Cryptococcus neoformans/C. gattii species complex comprises two sibling species that are divided into eight major molecular types, C. neoformans VNI to VNIV and C. gattii VGI to VGIV. These genotypes differ in host range, epidemiology, virulence, antifungal susceptibility and geographic distribution. The currently used phenotypic and molecular identification methods for the species/molecular types are time consuming and expensive. As Matrix-Assisted Laser Desorption Ionization-Time-of-Flight Mass Spectrometry (MALDI-TOF MS) offers an effective alternative for the rapid identification of microorganisms, the objective of this study was to examine its potential for the identification of C. neoformans and C. gattii strains at the intra- and inter-species level. METHODOLOGY: Protein extracts obtained via the formic acid extraction method of 164 C. neoformans/C. gattii isolates, including four inter-species hybrids, were studied. RESULTS: The obtained mass spectra correctly identified 100% of all studied isolates, grouped each isolate according to the currently recognized species, C. neoformans and C. gattii, and detected potential hybrids. In addition, all isolates were clearly separated according to their major molecular type, generating greater spectral differences among the C. neoformans molecular types than the C. gattii molecular types, most likely reflecting a closer phylogenetic relationship between the latter. The number of colonies used and the incubation length did not affect the results. No spectra were obtained from intact yeast cells. An extended validated spectral library containing spectra of all eight major molecular types was established. CONCLUSIONS: MALDI-TOF MS is a rapid identification tool for the correct recognition of the two currently recognized human pathogenic Cryptococcus species and offers a simple method for the separation of the eight major molecular types and the detection of hybrid strains within this species complex.

  15. Rapid colorimetric assay for detection of Listeria monocytogenes in food samples using LAMP formation of DNA concatemers and gold nanoparticle-DNA probe complex

    Science.gov (United States)

    Wachiralurpan, Sirirat; Sriyapai, Thayat; Areekit, Supatra; Sriyapai, Pichapak; Augkarawaritsawong, Suphitcha; Santiwatanakul, Somchai; Chansiri, Kosum

    2018-04-01

    Listeria monocytogenes is a major foodborne pathogen of global health concern. Herein, the rapid diagnosis of L. monocytogenes has been achieved using loop-mediated isothermal amplification (LAMP) based on the phosphatidylcholine-phospholipase C gene (plcB). Colorimetric detection was then performed through the formation of DNA concatemers and a gold nanoparticle/DNA probe complex (GNP/DNA probe). The overall detection process was accomplished within approximately 1 h with no need for complicated equipment. The limits of detection for L. monocytogenes in the forms of purified genomic DNA and pure culture were 800 fg and 2.82 CFU mL-1, respectively. No cross reactions were observed with closely related bacterial species. The LAMP-GNP/DNA probe assay was applied to the detection of 200 raw chicken meat samples and compared to routine standard methods. The data revealed that the specificity, sensitivity and accuracy were 100%, 90.20% and 97.50%, respectively. The present assay was 100% in conformity with the LAMP-agarose gel electrophoresis assay. Five samples that were negative by both assays appeared to have the pathogen at below the level of detection. The assay can be applied as a rapid direct screening method for L. monocytogenes.

  16. EEG Differences in Two Clinically Similar Rapid Dementias: Voltage-Gated Potassium Channel Complex-Associated Autoimmune Encephalitis and Creutzfeldt-Jakob Disease.

    Science.gov (United States)

    Freund, Brin; Probasco, John C; Cervenka, Mackenzie C; Sutter, Raoul; Kaplan, Peter W

    2018-05-01

    Distinguishing treatable causes of rapidly progressive dementia from those that are incurable is vital. Creutzfeldt-Jakob disease (CJD) and voltage-gated potassium channel complex-associated autoimmune encephalitis (VGKC AE) are 2 such conditions with disparate outcomes and responses to treatment. To determine the differences in electroencephalography between CJD and VGKC AE, we performed a retrospective review of medical records and examined clinical data, neuroimaging, and electroencephalographs performed in patients admitted for evaluation of rapidly progressive dementia diagnosed with CJD and VGKC AE at the Johns Hopkins Hospital and Bayview Medical Center between January 1, 2007 and December 31, 2015. More patients in the VGKC AE group had seizures (12/17) than those with CJD (3/14; P = .008). Serum sodium levels were lower in those with VGKC AE (P = .001). Cerebrospinal fluid (CSF) white blood cell count was higher in VGKC AE (P = .008). CSF protein 14-3-3 (P = .018) was more commonly detected in CJD, and tau levels were higher in those with CJD. Electroencephalographic findings differed between CJD and VGKC AE, and electroencephalography can aid in their diagnoses. Performing serial EEGs better delineates these conditions.

  17. Comparison of net CO2 fluxes measured with open- and closed-path infrared gas analyzers in an urban complex environment

    DEFF Research Database (Denmark)

    Järvi, L.; Mammarella, I.; Eugster, W.

    2009-01-01

    and their suitability to accurately measure CO2 exchange in such non-ideal landscape. In addition, this study examined the effect of open-path sensor heating on measured fluxes in urban terrain, and these results were compared with similar measurements made above a temperate beech forest in Denmark. The correlation...... between the two fluxes was good (R2 = 0.93) at the urban site, but during the measurement period the open-path net surface exchange (NSE) was 17% smaller than the closed-path NSE, indicating apparent additional uptake of CO2 by open-path measurements. At both sites, sensor heating corrections evidently...... improved the performance of the open-path analyzer by reducing discrepancies in NSE at the urban site to 2% and decreasing the difference in NSE from 67% to 7% at the forest site. Overall, the site-specific approach gave the best results at both sites and, if possible, it should be preferred in the sensor...

  18. A rapid screening procedure for the analysis of proliferation compounds in complex matrices using solid phase microextraction (SPME) and SPME with in-situ derivatization

    International Nuclear Information System (INIS)

    Alcaraz, A.; Hulsey, S.S.; Andresen, B.D.

    1995-01-01

    A variety of methods have been established using advanced chromatographic techniques and new detection systems for the analysis of chemical signatures associated with nuclear and chemical weapon (CW) proliferation. Most of these analytical methods are used in the laboratory and seldom applied in the field. The Chemical Weapons Convention (an international treaty to ban chemical weapons) may require the rapid on-site analysis of environmental samples which contain CW agents, their precursors, and/or their degradation products. In addition to the fact that certain countries are involved in CW non-compliance, there is a current uncertainty regarding nuclear proliferation. This also creates new demands on sample work-up and analytical instrumentation use in the field. The isolation and identification of unique chemical signatures in complex samples such as soils, waste tanks, and decontamination solutions would determine non-compliance. However, a primary area of detection research continues to be sample preparation. Most of the established sample cleanup technologies involve liquid/liquid, Soxhlet, or most recently, solid phase extraction (SPE). Despite the success of these traditional sample preparation techniques, they are time consuming and require multi-step procedures (especially when preparing samples for gas chromatographic mass-spectrometric analysis). The goal of this work is to demonstrate the advantages of utilizing SPME and SPME in-situ derivatization techniques to eliminate time consuming steps necessary to prepare a sample for on-site GC-MS. The authors' approach was to compare two SPME fibers and to develop methods to facilitate the isolation of polar and moderately polar proliferation compounds from complex environmental samples. This work will help to evaluate current SPME technologies for use during on-site environmental monitoring analysis

  19. Beam loss reduction by injection painting in the 3-GeV rapid cycling synchrotron of the Japan Proton Accelerator Research Complex

    Directory of Open Access Journals (Sweden)

    H. Hotchi

    2012-04-01

    Full Text Available The 3-GeV rapid cycling synchrotron (RCS) of the Japan Proton Accelerator Research Complex was commissioned in October 2007. Via the initial beam tuning and a series of underlying beam studies with low-intensity beams, since December 2009, we have intermittently been performing beam tuning experiments with higher-intensity beams including the injection painting technique. By optimizing the injection painting parameters, we have successfully achieved a 420 kW-equivalent output intensity at a low-level intensity loss of less than 1%. Also the corresponding numerical simulation well reproduced the observed dependence of the beam loss on the painting parameters, and captured a characteristic behavior of the high-intensity beam in the injection painting process. In this paper, we present the experimental results obtained in the course of the RCS beam power ramp-up, especially on the beam loss reduction achieved by employing the injection painting, together with the numerical simulation results.

  20. Development and validation of rapid ion-pair RPLC method for simultaneous determination of certain B-complex vitamins along with vitamin C.

    Science.gov (United States)

    Patil, Suyog S; Srivastava, Ashiwini K

    2012-01-01

    A rapid, simple, and accurate ion-pair RPLC method has been developed for simultaneous analysis of vitamin C and major B-complex vitamins. An RP C18 column thermostated at 30 degrees C was used with gradient elution of a mobile phase comprising 10 mM potassium dihydrogen phosphate buffer (containing 3 mM sodium hexane-1-sulfonate, adjusted to pH 2.80 with o-phosphoric acid) and methanol at a flow rate of 1.0 mL/min to achieve the best possible separation and resolution of all vitamins in about 11.00 min. The detection was performed at 274 nm. The method has been implemented successfully for simultaneous determination of vitamins present in 12 multivitamin/multimineral pharmaceutical preparations, as well as in human urine. Typical validation characteristics were evaluated in accordance with International Conference on Harmonization guidelines. Good linearity over the investigated concentration levels was observed. Intraday repeatability was satisfactory for all vitamins. The method can be used for assay of these vitamins over a wide concentration range with good precision and accuracy; hence, it would be appropriate for routine QC as well as clinical analysis.

  1. Application of computer-aided three-dimensional skull model with rapid prototyping technique in repair of zygomatico-orbito-maxillary complex fracture.

    Science.gov (United States)

    Li, Wei Zhong; Zhang, Mei Chao; Li, Shao Ping; Zhang, Lei Tao; Huang, Yu

    2009-06-01

    With the advent of CAD/CAM and rapid prototyping (RP), a technical revolution in oral and maxillofacial trauma was promoted to benefit treatment, repair of maxillofacial fractures and reconstruction of maxillofacial defects. For a patient with zygomatico-facial collapse deformity resulting from a zygomatico-orbito-maxillary complex (ZOMC) fracture, CT scan data were processed using Mimics 10.0 for three-dimensional (3D) reconstruction. The reduction design was aided by 3D virtual imaging and the 3D skull model was reproduced using the RP technique. In line with the design from Mimics, presurgery was performed on the 3D skull model, and the semi-coronal incision was used for reduction of the ZOMC fracture, based on the outcome of the presurgery. Postoperative CT images revealed significant correction of the zygomatic collapse, elevation of the zygomatic arch, and well-restored facial symmetry. The CAD/CAM and RP technique is a relatively useful tool that can assist surgeons in reconstruction of the maxillofacial skeleton, especially in repairs of ZOMC fractures.

  2. Effects of Test Paper Drying and Reaction Periods on Silver Ion-Arsine Complex Colour Development for a Simple and Rapid Arsenic (V) Determination

    International Nuclear Information System (INIS)

    Khim, O.K.; Wan Md Zin Wan Yunus; Abdul Ghapor Hussin; Mansor Ahmad; Ahmad Farid Mohd Azmi

    2015-01-01

    Arsenic is a toxic element that exists in different forms in nature and can be accumulated by various biota and environmental media. Current techniques for the environmental monitoring of arsenic are usually sophisticated, time consuming and inappropriate for on-site analyses. We are developing a simple and rapid colorimetric quantitative method based on a coloured complex formed between silver ions impregnated on a filter paper and arsine gas, which is produced by reduction of arsenic ions with hydrogen generated from the reaction of zinc and sulfamic acid in the sample. In this report we describe the effects of the drying period of the silver ion-impregnated filter paper and of the exposure period of this test paper to the arsine gas. The data obtained are digitized and used to develop a model for arsenic (V) ion estimation. The study reveals that when 4.0 g of sulfamic acid and 2.0 g of zinc powder are used to reduce a 50 ml arsenic solution sample, the drying and exposure periods needed are 20 seconds and 10 minutes, respectively. The best fitted model relating the arsenic (V) concentration (Ac) to the red colour intensity value (R) is Ac = 120.1 - 1.071R. This model can accurately estimate the arsenic (V) concentration from 0 to 100 μg/l. (author)
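
    The reported calibration is a simple linear relation between the digitized red-channel intensity of the test paper and the arsenic (V) concentration. Below is a minimal sketch of how such a model might be applied in code, using the coefficients quoted above; the function name and the example reading are illustrative assumptions, not part of the original report.

```python
# Sketch: applying the reported linear calibration Ac = 120.1 - 1.071*R to
# estimate arsenic(V) concentration from a digitized red-channel intensity.
# The function name and the example reading are illustrative assumptions.

def arsenic_concentration_ug_per_l(red_intensity: float) -> float:
    """Estimate arsenic(V) concentration (ug/L) from the red colour intensity R.

    Valid only within the calibrated range of roughly 0-100 ug/L; values
    outside that range are extrapolations of the reported model.
    """
    return 120.1 - 1.071 * red_intensity

if __name__ == "__main__":
    reading = 75.0  # hypothetical red-channel value from the digitized paper
    conc = arsenic_concentration_ug_per_l(reading)
    print(f"Estimated As(V) concentration: {conc:.1f} ug/L")
```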

  3. Direct observation of the phase space footprint of a painting injection in the Rapid Cycling Synchrotron at the Japan Proton Accelerator Research Complex

    Directory of Open Access Journals (Sweden)

    P. K. Saha

    2009-04-01

    The 3 GeV Rapid Cycling Synchrotron (RCS) at the Japan Proton Accelerator Research Complex is nearly at the operational stage with regard to the beam commissioning aspects. Recently, the design painting injection study has commenced with the aim of high output beam power at extraction. In order to observe the phase space footprint of the painting injection, a method was developed utilizing a beam position monitor (BPM) in the so-called single-pass mode. The turn-by-turn phase space coordinates of the circulating beam, directly measured using a pair of BPMs positioned entirely in drift space, and the calculated transfer matrices from the injection point to the pair of BPMs were used together over several successive turns to obtain the phase space footprint of the painting injection. There are two such pairs of BPMs placed at two different locations in the RCS, and the results from both agreed and were quite consistent with what was expected.

  4. Direct observation of the phase space footprint of a painting injection in the Rapid Cycling Synchrotron at the Japan Proton Accelerator Research Complex

    Science.gov (United States)

    Saha, P. K.; Shobuda, Y.; Hotchi, H.; Hayashi, N.; Takayanagi, T.; Harada, H.; Irie, Y.

    2009-04-01

    The 3 GeV Rapid Cycling Synchrotron (RCS) at the Japan Proton Accelerator Research Complex is nearly at the operational stage with regard to the beam commissioning aspects. Recently, the design painting injection study has commenced with the aim of high output beam power at extraction. In order to observe the phase space footprint of the painting injection, a method was developed utilizing a beam position monitor (BPM) in the so-called single-pass mode. The turn-by-turn phase space coordinates of the circulating beam, directly measured using a pair of BPMs positioned entirely in drift space, and the calculated transfer matrices from the injection point to the pair of BPMs were used together over several successive turns to obtain the phase space footprint of the painting injection. There are two such pairs of BPMs placed at two different locations in the RCS, and the results from both agreed and were quite consistent with what was expected.
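
    The reconstruction described in these two records relies on two facts: a pair of BPMs separated by a pure drift yields both position and angle each turn, and a transfer matrix maps those coordinates back to the injection point. The sketch below illustrates that arithmetic; the drift length, the transfer matrix, and the turn-by-turn readings are invented placeholders, not the actual RCS optics or data.

```python
import numpy as np

# Sketch of the single-pass footprint reconstruction described above: two BPMs
# in a common drift give (x, x') each turn, and the inverse of the transfer
# matrix from the injection point to the first BPM maps the coordinates back
# to the injection point. All numbers below are illustrative placeholders.

L_DRIFT = 2.0  # metres between the two BPMs (assumed)

# Assumed 2x2 transfer matrix (x, x') from the injection point to BPM1
M_INJ_TO_BPM1 = np.array([[1.2, 3.5],
                          [-0.1, 0.5]])

def phase_space_at_bpm1(x1: float, x2: float) -> np.ndarray:
    """Position and angle at BPM1 from the two drift-space BPM readings."""
    xp = (x2 - x1) / L_DRIFT  # angle from the position difference over the drift
    return np.array([x1, xp])

def footprint_at_injection(x1: float, x2: float) -> np.ndarray:
    """Map the measured coordinates back to the injection point (M^-1 * v)."""
    return np.linalg.solve(M_INJ_TO_BPM1, phase_space_at_bpm1(x1, x2))

if __name__ == "__main__":
    # Hypothetical turn-by-turn readings (mm) at the two BPMs
    turns = [(3.1, 4.0), (2.7, 3.2), (1.9, 2.1)]
    for n, (x1, x2) in enumerate(turns, start=1):
        x, xp = footprint_at_injection(x1, x2)
        print(f"turn {n}: x = {x:.2f} mm, x' = {xp:.3f} mrad")
```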

  5. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypassing the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  6. Combining blue native polyacrylamide gel electrophoresis with liquid chromatography tandem mass spectrometry as an effective strategy for analyzing potential membrane protein complexes of Mycobacterium bovis bacillus Calmette-Guérin

    Directory of Open Access Journals (Sweden)

    Li Weijun

    2011-01-01

    Background: Tuberculosis is an infectious bacterial disease in humans caused primarily by Mycobacterium tuberculosis, which infects one-third of the world's total population. The Mycobacterium bovis bacillus Calmette-Guérin (BCG) vaccine has been widely used to prevent tuberculosis worldwide since 1921. Membrane proteins play important roles in various cellular processes, and the protein-protein interactions involved in these processes may provide further information about molecular organization and cellular pathways. However, membrane proteins are notoriously under-represented in traditional two-dimensional polyacrylamide gel electrophoresis (2-D PAGE), and little is known about mycobacterial membrane and membrane-associated protein complexes. Here we investigated M. bovis BCG by an alternative proteomic strategy coupling blue native PAGE to liquid chromatography tandem mass spectrometry (LC-MS/MS) to characterize potential protein-protein interactions in membrane fractions. Results: Using this approach, we analyzed the native molecular composition of protein complexes in BCG membrane fractions. As a result, 40 proteins (including 12 integral membrane proteins), organized in 9 different gel bands, were unambiguously identified. The proteins identified were experimentally confirmed using 2-D SDS PAGE. We identified MmpL8 and four neighboring proteins that were involved in lipid transport complexes, and all subunits of the ATP synthase complex in their monomeric states. Two phenolphthiocerol synthases and three arabinosyltransferases belonging to individual operons were obtained in different gel bands. Furthermore, two giant multifunctional enzymes, Pks7 and Pks8, and four mycobacterial Hsp family members were determined. Additionally, seven ribosomal proteins involved in the polyribosome complex and two subunits of the succinate dehydrogenase complex were also found. Notably, some proteins with high hydrophobicity or multiple transmembrane helices were identified well in our work.

  7. Combining blue native polyacrylamide gel electrophoresis with liquid chromatography tandem mass spectrometry as an effective strategy for analyzing potential membrane protein complexes of Mycobacterium bovis bacillus Calmette-Guérin.

    Science.gov (United States)

    Zheng, Jianhua; Wei, Candong; Zhao, Lina; Liu, Liguo; Leng, Wenchuan; Li, Weijun; Jin, Qi

    2011-01-18

    Tuberculosis is an infectious bacterial disease in humans caused primarily by Mycobacterium tuberculosis, which infects one-third of the world's total population. The Mycobacterium bovis bacillus Calmette-Guérin (BCG) vaccine has been widely used to prevent tuberculosis worldwide since 1921. Membrane proteins play important roles in various cellular processes, and the protein-protein interactions involved in these processes may provide further information about molecular organization and cellular pathways. However, membrane proteins are notoriously under-represented in traditional two-dimensional polyacrylamide gel electrophoresis (2-D PAGE), and little is known about mycobacterial membrane and membrane-associated protein complexes. Here we investigated M. bovis BCG by an alternative proteomic strategy coupling blue native PAGE to liquid chromatography tandem mass spectrometry (LC-MS/MS) to characterize potential protein-protein interactions in membrane fractions. Using this approach, we analyzed the native molecular composition of protein complexes in BCG membrane fractions. As a result, 40 proteins (including 12 integral membrane proteins), which were organized in 9 different gel bands, were unambiguously identified. The proteins identified have been experimentally confirmed using 2-D SDS PAGE. We identified MmpL8 and four neighboring proteins that were involved in lipid transport complexes, and all subunits of the ATP synthase complex in their monomeric states. Two phenolphthiocerol synthases and three arabinosyltransferases belonging to individual operons were obtained in different gel bands. Furthermore, two giant multifunctional enzymes, Pks7 and Pks8, and four mycobacterial Hsp family members were determined. Additionally, seven ribosomal proteins involved in the polyribosome complex and two subunits of the succinate dehydrogenase complex were also found. Notably, some proteins with high hydrophobicity or multiple transmembrane helices were identified well in our work. In this

  8. Performance of the new automated Abbott RealTime MTB assay for rapid detection of Mycobacterium tuberculosis complex in respiratory specimens.

    Science.gov (United States)

    Chen, J H K; She, K K K; Kwong, T-C; Wong, O-Y; Siu, G K H; Leung, C-C; Chang, K-C; Tam, C-M; Ho, P-L; Cheng, V C C; Yuen, K-Y; Yam, W-C

    2015-09-01

    The automated high-throughput Abbott RealTime MTB real-time PCR assay has recently been launched for Mycobacterium tuberculosis complex (MTBC) clinical diagnosis. This study evaluated its performance. We first compared its diagnostic performance with the Roche Cobas TaqMan MTB assay on 214 clinical respiratory specimens. Prospective analysis of a total of 520 specimens was then performed to further evaluate the Abbott assay. The Abbott assay showed a lower limit of detection at 22.5 AFB/ml, which was more sensitive than the Cobas assay (167.5 AFB/ml). The two assays demonstrated a significant difference in diagnostic performance (McNemar's test; P = 0.0034), in which the Abbott assay presented a significantly higher area under the curve (AUC) than the Cobas assay (1.000 vs 0.880; P = 0.0002). The Abbott assay demonstrated extremely low PCR inhibition on clinical respiratory specimens. The automated Abbott assay required only a very short manual handling time (0.5 h), which could help to improve laboratory management. In the prospective analysis, the overall estimates of sensitivity and specificity of the Abbott assay were both 100% among smear-positive specimens, whereas for smear-negative specimens they were 96.7% and 96.1%, respectively. No cross-reactivity with non-tuberculosis mycobacterial species was observed. The superiority in sensitivity of the Abbott assay for detecting MTBC in smear-negative specimens could further minimize the risk of false-negative MTBC detection. The new Abbott RealTime MTB assay has good diagnostic performance and can be a useful tool for rapid MTBC detection in clinical laboratories.
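
    The paired comparison reported here (McNemar's test on the same specimens tested by both assays) can be illustrated with a short sketch. The continuity-corrected chi-square form shown below is one common variant of the test, and the discordant-pair counts are invented, not the study's data.

```python
from scipy.stats import chi2

# Sketch of McNemar's test for paired diagnostic results, as used to compare
# two assays run on the same specimens. The discordant counts below are
# illustrative placeholders, not counts from the study.

def mcnemar_chi2(b: int, c: int) -> tuple[float, float]:
    """Continuity-corrected McNemar statistic and p-value.

    b = specimens positive by assay A only, c = positive by assay B only.
    """
    stat = (abs(b - c) - 1) ** 2 / (b + c)
    p = chi2.sf(stat, df=1)
    return stat, p

if __name__ == "__main__":
    stat, p = mcnemar_chi2(b=12, c=2)  # hypothetical discordant pairs
    print(f"McNemar chi-square = {stat:.2f}, p = {p:.4f}")
```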

  9. Simulation, measurement, and mitigation of beam instability caused by the kicker impedance in the 3-GeV rapid cycling synchrotron at the Japan Proton Accelerator Research Complex

    Science.gov (United States)

    Saha, P. K.; Shobuda, Y.; Hotchi, H.; Harada, H.; Hayashi, N.; Kinsho, M.; Tamura, F.; Tani, N.; Yamamoto, M.; Watanabe, Y.; Chin, Yong Ho; Holmes, J. A.

    2018-02-01

    The transverse impedance of eight extraction pulsed kicker magnets is a strong beam instability source in the 3-GeV rapid cycling synchrotron (RCS) at the Japan Proton Accelerator Research Complex. Significant beam instability occurs even at half of the designed 1 MW beam power when the chromaticity (ξ ) is fully corrected for the entire acceleration cycle by using ac sextupole (SX) fields. However, if ξ is fully corrected only at the injection energy by using dc SX fields, the beam is stable. In order to study realistic beam instability scenarios, including the effect of space charge and to determine practical measures to accomplish 1 MW beam power, we enhance the orbit particle tracking code to incorporate all realistic time-dependent machine parameters, including the time dependence of the impedance itself. The beam stability properties beyond 0.5 MW beam power are found to be very sensitive to a number of parameters in both simulations and measurements. In order to stabilize a beam at 1 MW beam power, two practical measures based on detailed and systematic simulation studies are determined, namely, (i) proper manipulation of the betatron tunes during acceleration and (ii) reduction of the dc SX field to reduce the ξ correction even at injection. The simulation results are well reproduced by measurements, and, as a consequence, an acceleration to 1 MW beam power is successfully demonstrated. In this paper, details of the orbit simulation and the corresponding experimental results up to 1 MW of beam power are presented. To further increase the RCS beam power, beam stability issues and possible measures beyond 1 MW beam power are also considered.

  10. Rapid detection and quantification of cell free cytomegalovirus by a high-speed centrifugation-based microculture assay: comparison to longitudinally analyzed viral DNA load and pp67 late transcript during lactation.

    Science.gov (United States)

    Hamprecht, Klaus; Witzel, Simone; Maschmann, Jens; Dietz, Klaus; Baumeister, Andrea; Mikeler, Elfriede; Goelz, Rangmar; Speer, Christian P; Jahn, Gerhard

    2003-12-01

    Human cytomegalovirus (HCMV) is reactivated in nearly every seropositive breastfeeding mother during lactation [Lancet 357 (2001) 513]. Conventional tissue culture (TC) and low-speed centrifugation-enhanced microtiter culture methods are not able to detect HCMV in milk during all stages of lactation. The aim was to develop a sensitive and quantitative microculture technique to describe the dynamics of HCMV reactivation in different milk compartments during lactation. Milk samples were collected longitudinally from seropositive breastfeeding mothers of preterm infants. Native milk samples were separated into fraction 1 (aqueous extract of milk fat), fraction 2 (cell- and fat-free milk whey) and fraction 3 (milk cells). Each of these fractions was screened qualitatively (TC, nPCR, pp67 late mRNA) and quantitatively (high-speed centrifugation-based microculture, quantitative PCR). Prior to low-speed centrifugation-enhanced inoculation, virus particles were concentrated by high-speed centrifugation (60 min at 50,000 x g, 4°C). Using fraction 2 we were able to describe the dynamics of viral reactivation during lactation. We present the course of the quantitative virolactia and DNAlactia and the qualitative detection of HCMV pp67 late mRNA in milk whey of four mothers (three transmitters and one non-transmitter). In all these cases virolactia followed a unimodal and self-limited course. Peak levels of virolactia for the transmitters (T1: day 44; T2: day 43; T3: day 50) were closely related to the onset of viruria of the corresponding preterm infants (U1: day 39; U2a/U2b: day 44/57; U3: day 60). The courses of viral load coincided with the courses of DNA load. We present a rapid and highly sensitive microculture method for the quantification of cell free HCMV from milk whey and aqueous extracts of milk fat. Viral reactivation during lactation follows a unimodal course. Our findings have strong implications for quality control of any virus inactivation procedure.

  11. An integrated in silico approach to analyze the involvement of single amino acid polymorphisms in FANCD1/BRCA2-PALB2 and FANCD1/BRCA2-RAD51 complex.

    Science.gov (United States)

    Doss, C George Priya; Nagasundaram, N

    2014-11-01

    Fanconi anemia (FA) is an autosomal recessive human disease characterized by genomic instability and a marked increase in cancer risk. The importance of the FANCD1 gene is underlined by the fact that deleterious amino acid substitutions in it were found to confer susceptibility to hereditary breast and ovarian cancers. Attaining experimental knowledge about the possible disease-associated substitutions is laborious and time consuming. Recently introduced in silico tools for analyzing genome variation can identify deleterious variants efficiently. In this study, we conducted an in silico analysis of deleterious non-synonymous SNPs at both the functional and structural levels in the breast cancer and FA susceptibility gene BRCA2/FANCD1. To identify and characterize deleterious mutations, five in silico tools based on two different prediction approaches, namely pathogenicity prediction (SIFT, PolyPhen, and PANTHER) and protein stability prediction (I-Mutant 2.0 and MuStab), were employed. Based on the deleterious scores that overlapped in these in silico approaches, and on the availability of three-dimensional structures, structural analysis was carried out on the major mutations of the native protein coded by the FANCD1/BRCA2 gene. In this work, we report the results of the first molecular dynamics (MD) simulation study performed to analyze the time-dependent structural changes of the native and mutated protein complexes (G25R, W31C, W31R in FANCD1/BRCA2-PALB2, and F1524V, V1532F in FANCD1/BRCA2-RAD51). Analysis of the MD trajectories indicated that the predicted deleterious variants alter the structural behavior of the BRCA2-PALB2 and BRCA2-RAD51 protein complexes. In addition, statistical analysis was employed to test the significance of these in silico tool predictions. Based on these predictions, we conclude that the identification of disease-related SNPs by in silico methods, in combination with MD

  12. Development of a triple hyphenated HPLC-radical scavenging detection-DAD-SPE-NMR system for the rapid identification of antioxidants in complex plant extracts

    NARCIS (Netherlands)

    Pukalskas, A.; Beek, van T.A.; Waard, de P.

    2005-01-01

    A rapid method for the simultaneous detection and identification of radical scavenging compounds in plant extracts was developed by combining an HPLC with on-line radical scavenging using DPPH as a model radical and an HPLC-DAD-SPE-NMR system. Using this method a commercial rosemary extract was

  13. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software had been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results: We developed the software PDA for the analysis of pooled-DNA data. PDA was originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion: PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
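
    Two of the calculations named in this record, allele frequency estimation corrected by a coefficient of preferential amplification and sliding-window combination of per-marker p-values, are sketched below. This is an illustration of the general idea only, not PDA's actual algorithm or code; the peak heights, the coefficient k, and the p-values are invented.

```python
import math
from scipy.stats import chi2

# Sketch of two pooled-DNA calculations mentioned above: allele frequency
# estimation corrected by a coefficient of preferential amplification (k),
# and Fisher's combination of p-values over a sliding window. This is an
# illustration of the general idea, not PDA's actual implementation.

def allele_frequency(height_a: float, height_b: float, k: float = 1.0) -> float:
    """Estimate the frequency of allele A in the pool.

    k is the coefficient of preferential amplification (signal of A relative
    to B for a heterozygote); k = 1 means no preferential amplification.
    """
    return height_a / (height_a + k * height_b)

def fisher_combined_p(p_values: list[float]) -> float:
    """Fisher's method: combine independent per-marker p-values."""
    stat = -2.0 * sum(math.log(p) for p in p_values)
    return chi2.sf(stat, df=2 * len(p_values))

def sliding_window_p(p_values: list[float], window: int = 3) -> list[float]:
    """Combined p-value for each window of consecutive markers."""
    return [fisher_combined_p(p_values[i:i + window])
            for i in range(len(p_values) - window + 1)]

if __name__ == "__main__":
    print(allele_frequency(480.0, 520.0, k=1.2))           # hypothetical peak heights
    print(sliding_window_p([0.04, 0.30, 0.01, 0.20, 0.05]))  # hypothetical p-values
```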

  14. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  15. Removal of radioactive cesium from surface soils solidified using polyion complex. Rapid communication for decontamination test at Iitate-mura in Fukushima Prefecture

    International Nuclear Information System (INIS)

    Naganawa, Hirochika; Yanase, Nobuyuki; Mitamura, Hisayoshi; Nagano, Tetsushi; Yoshida, Zenko; Kumazawa, Noriyuki; Saitoh, Hiroshi; Kashima, Kaoru; Fukuda, Tatsuya; Tanaka, Shun-ichi

    2011-01-01

    We tested the decontamination of surface soils on three types of agricultural land in the Nagadoro district of Iitate-mura (village) in Fukushima Prefecture, which is highly contaminated by deposits of radionuclides from the plume released from the Fukushima Daiichi nuclear power plant. The decontamination method consisted of peeling off surface soils solidified with a polyion complex, which was formed from a salt solution of polycations and polyanions. Two types of polyion complex solution were applied to an upland field in a plastic greenhouse, a pasture, and a paddy field. The decontamination efficiency for the surface soils reached 90%, and dust release was effectively suppressed during the removal of the surface soils. (author)

  16. Rapid Two-Step Procedure for Large-Scale Purification of Pediocin-Like Bacteriocins and Other Cationic Antimicrobial Peptides from Complex Culture Medium

    OpenAIRE

    Uteng, Marianne; Hauge, Håvard Hildeng; Brondz, Ilia; Nissen-Meyer, Jon; Fimland, Gunnar

    2002-01-01

    A rapid and simple two-step procedure suitable for both small- and large-scale purification of pediocin-like bacteriocins and other cationic peptides has been developed. In the first step, the bacterial culture was applied directly on a cation-exchange column (1-ml cation exchanger per 100-ml cell culture). Bacteria and anionic compounds passed through the column, and cationic bacteriocins were subsequently eluted with 1 M NaCl. In the second step, the bacteriocin fraction was applied on a lo...

  17. Rapid extensional unroofing of a granite-migmatite dome with relics of high-pressure rocks, the Podolsko complex, Bohemian Massif

    Czech Academy of Sciences Publication Activity Database

    Žák, J.; Sláma, Jiří; Burjak, M.

    2017-01-01

    Vol. 154, No. 2 (2017), pp. 354-380, ISSN 0016-7568. Institutional support: RVO:67985831. Keywords: anisotropy of magnetic susceptibility (AMS) * granite-migmatite dome * exhumation * metamorphic core complex * U-Pb zircon geochronology. Subject RIV: DB - Geology; Mineralogy. OECD field: Geology. Impact factor: 1.965, year: 2016

  18. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of complex biosystem analyzing technology; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei kaiseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The aim is to utilize the sophisticated functions of complex biosystems. In the research and development of technologies for effectively utilizing unexploited resources and substances such as seaweeds and algae, seaweeds are added to seawater and left for two weeks to obtain a microbial suspension; the suspension is then spread on a carrageenan culture medium, and carrageenan-decomposing microbes are obtained. In the research and development of technologies for utilizing microbe/fauna-flora complex systems, technologies for exploring and analyzing microbes are studied. For this purpose, 48 kinds of sponges and 300 kinds of bacteria symbiotic with the sponges are sampled in Malaysia. Of these, 15 exhibit enzyme inhibition and Artemia salina lethality activities. In the development of technologies for analyzing the functions of microbes engaged in the production of resources and substances useful for animals and plants, 150 kinds of micro-algae are subjected to screening using protease- and chitinase-inhibiting activities as indexes, and it is found that an extract of Isochrysis galbana displays an intense inhibitory activity. The alga is cultured in quantity, the active component is isolated from 20 g of dried alga, and its constitution is determined. (NEDO)

  19. Rapid Two-Step Procedure for Large-Scale Purification of Pediocin-Like Bacteriocins and Other Cationic Antimicrobial Peptides from Complex Culture Medium

    Science.gov (United States)

    Uteng, Marianne; Hauge, Håvard Hildeng; Brondz, Ilia; Nissen-Meyer, Jon; Fimland, Gunnar

    2002-01-01

    A rapid and simple two-step procedure suitable for both small- and large-scale purification of pediocin-like bacteriocins and other cationic peptides has been developed. In the first step, the bacterial culture was applied directly on a cation-exchange column (1-ml cation exchanger per 100-ml cell culture). Bacteria and anionic compounds passed through the column, and cationic bacteriocins were subsequently eluted with 1 M NaCl. In the second step, the bacteriocin fraction was applied on a low-pressure, reverse-phase column and the bacteriocins were detected as major optical density peaks upon elution with propanol. More than 80% of the activity that was initially in the culture supernatant was recovered in both purification steps, and the final bacteriocin preparation was more than 90% pure as judged by analytical reverse-phase chromatography and capillary electrophoresis. PMID:11823243

  20. Rapid spin-state interconversion in the bis-complex of tris-(1-pyrazolyl) methane with Fe(II) studied by Moessbauer spectroscopy

    International Nuclear Information System (INIS)

    Winkler, H.; Trautwein, A.X.; Toftlund, H.

    1992-01-01

    Dynamic spin equilibrium is observed in a complex of the [Fe(II)-N6] type above room temperature. The Moessbauer lineshapes as a function of temperature can be understood by means of the random-frequency-modulation model. Taking into account the different Lamb-Moessbauer factors of the low- and high-spin states yields the true populations of the two spin states. The transition rates follow an Arrhenius law rather well. With appropriate assumptions an activation energy ΔE_LH = 18(1) kJ/mol is deduced. (orig.)
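
    The activation energy quoted above comes from fitting the temperature dependence of the spin-interconversion rates to an Arrhenius law, k(T) = A·exp(-ΔE/RT). The sketch below evaluates that relation with the reported ΔE_LH = 18 kJ/mol; the pre-exponential factor and the temperatures are arbitrary placeholders, not values from the paper.

```python
import math

# Sketch of the Arrhenius law used to describe the low-spin -> high-spin
# interconversion rate, k(T) = A * exp(-dE / (R * T)), with the reported
# activation energy of 18 kJ/mol. The pre-exponential factor A is an
# arbitrary placeholder, not a value from the paper.

R = 8.314            # gas constant, J mol^-1 K^-1
DELTA_E = 18.0e3     # reported activation energy, J mol^-1
A_PREFACTOR = 1.0e9  # s^-1, assumed for illustration only

def rate(temperature_k: float) -> float:
    """Interconversion rate at a given temperature (s^-1)."""
    return A_PREFACTOR * math.exp(-DELTA_E / (R * temperature_k))

if __name__ == "__main__":
    for t in (300.0, 320.0, 340.0):
        print(f"T = {t:.0f} K  ->  k ~ {rate(t):.3e} s^-1")
```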

  1. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...
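
    The detection approach described here, configurable signatures defined by regular expressions applied to web server requests or log entries, can be illustrated with a short sketch. The signature patterns and the sample log lines below are invented for illustration; they are not the rule set of the actual prototype.

```python
import re

# Sketch of signature-based detection of injection-flaw attacks, as described
# above: each signature is a configurable regular expression matched against
# request lines taken from a web server log. Signatures and sample entries
# are invented for illustration.

SIGNATURES = {
    "sql_injection": re.compile(r"(\bunion\b.+\bselect\b|'\s*or\s*'1'\s*=\s*'1)", re.I),
    "xss": re.compile(r"<\s*script\b", re.I),
    "path_traversal": re.compile(r"\.\./"),
}

def scan_request(request_line: str) -> list[str]:
    """Return the names of all signatures that match the request line."""
    return [name for name, pattern in SIGNATURES.items()
            if pattern.search(request_line)]

if __name__ == "__main__":
    log_lines = [
        "GET /index.php?id=1 HTTP/1.1",
        "GET /index.php?id=1' OR '1'='1 HTTP/1.1",
        "GET /download?file=../../etc/passwd HTTP/1.1",
    ]
    for line in log_lines:
        hits = scan_request(line)
        if hits:
            print(f"ALERT {hits}: {line}")
```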

  2. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e.g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature resistant radiation source (85Kr, 3H, or 63Ni)

  3. Rapid determination of some psychotropic drugs in complex matrices by tandem dispersive liquid-liquid microextraction followed by high performance liquid chromatography.

    Science.gov (United States)

    Asghari, Alireza; Fahimi, Ebrahim; Bazregar, Mohammad; Rajabi, Maryam; Boutorabi, Leila

    2017-05-01

    Simple and rapid determinations of some psychotropic drugs in pharmaceutical wastewater and human plasma samples were successfully accomplished via tandem dispersive liquid-liquid microextraction combined with high performance liquid chromatography-ultraviolet detection (TDLLME-HPLC-UV). TDLLME of the three psychotropic drugs clozapine, chlorpromazine, and thioridazine was easily performed through two consecutive dispersive liquid-liquid microextractions. With this convenient method, proper sample preconcentration and clean-up were achieved in about 7 min. In order to achieve the best extraction efficiency, the effective parameters involved were optimized. The optimal experimental conditions consisted of 100 μL of CCl4 (as the extraction organic solvent), and pH values of 13 and 2 for the donor and acceptor phases, respectively. Under these optimum experimental conditions, the proposed TDLLME-HPLC-UV technique provided good linearity in the range of 5-3000 ng/mL for the three psychotropic drugs, with coefficients of determination (R²) higher than 0.996. The limits of quantification (LOQs) and limits of detection (LODs) obtained were 5.0 ng/mL and 1.0-1.5 ng/mL, respectively. Also, proper enrichment factors (EFs) of 96, 99, and 88 for clozapine, chlorpromazine, and thioridazine, respectively, and good extraction repeatabilities (relative standard deviations below 9.3%, n = 5) were obtained. Copyright © 2017 Elsevier B.V. All rights reserved.
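
    The figures of merit quoted in this record (linearity, LODs, LOQs, enrichment factors, RSDs) follow from a standard calibration treatment. The sketch below shows one common way such quantities are computed from calibration data; the concentrations, signals, and phase concentrations are invented for illustration and are not the study's measurements.

```python
import numpy as np

# Sketch of common calibration figures of merit behind the values quoted
# above: slope/intercept and R^2 from a linear fit, LOD/LOQ from the residual
# standard deviation, and the enrichment factor as the ratio of analyte
# concentration in the acceptor phase to that in the original sample.
# All numbers below are invented for illustration.

conc = np.array([5, 50, 250, 1000, 3000], dtype=float)  # ng/mL standards
signal = np.array([0.9, 8.8, 44.1, 176.0, 529.5])        # peak areas (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
ss_res = np.sum((signal - pred) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

sigma = np.sqrt(ss_res / (len(conc) - 2))  # residual standard deviation
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope

enrichment_factor = 0.96 / 0.01  # C_acceptor / C_sample, hypothetical values

print(f"R^2 = {r_squared:.4f}, LOD = {lod:.1f} ng/mL, LOQ = {loq:.1f} ng/mL")
print(f"Enrichment factor ~ {enrichment_factor:.0f}")
```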

  4. Radiation energy detector and analyzer

    International Nuclear Information System (INIS)

    Roberts, T.G.

    1981-01-01

    A radiation detector array and a method for measuring the spectral content of radiation. The radiation sensor or detector is an array or stack of thin solid-electrolyte batteries. The batteries, arranged in a stack, may be composed of independent battery cells or may be arranged so that adjacent cells share a common terminal surface. This common surface is possible since the polarity of each battery with respect to an adjacent battery is unrestricted, allowing a reduction in the component parts of the assembly and reducing the overall stack length. Additionally, a test jig or chamber allowing rapid measurement of the voltage across each battery is disclosed. A multichannel recorder and display may be used to indicate the voltage gradient change across the cells, or a small computer may be used for rapidly converting these voltage readings to a graph of radiation intensity versus wavelength or energy. The behavior of the batteries when used as a radiation detector and analyzer is such that the voltage measurements can be made at leisure after the detector array has been exposed to the radiation, and it is not necessary to make rapid measurements as is now done

  5. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of Nuclear Power Plant Analyzers in the USA is described. There are two different types of Analyzers under development in the USA: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermalhydraulics are described. (author)

  6. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  7. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  8. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  9. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without coming up against the problems of separating the antigen-antibody complex from the free antigen. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific to the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a space between the bottom of the tube and the lower end of the component. The component has a large developed surface and is shaped so that it allows the solution to be analyzed to have access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific to the biological compound [fr

  10. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input

  11. Rapid identification and quantification of Campylobacter coli and Campylobacter jejuni by real-time PCR in pure cultures and in complex samples

    Directory of Open Access Journals (Sweden)

    Denis Martine

    2011-05-01

    Background: Campylobacter spp., especially Campylobacter jejuni (C. jejuni) and Campylobacter coli (C. coli), are recognized as the leading human foodborne pathogens in developed countries. Livestock animals carrying Campylobacter pose an important risk for human contamination. Pigs are known to be frequently colonized with Campylobacter, especially C. coli, and to excrete high numbers of this pathogen in their faeces. Molecular tools, notably real-time PCR, provide an effective, rapid, and sensitive alternative to culture-based methods for the detection of C. coli and C. jejuni in various substrates. In order to serve as a diagnostic tool supporting Campylobacter epidemiology, we developed a quantitative real-time PCR method for species-specific detection and quantification of C. coli and C. jejuni directly in faecal, feed, and environmental samples. Results: With a sensitivity of 10 genome copies and a linear range of seven to eight orders of magnitude, the C. coli and C. jejuni real-time PCR assays allowed a precise quantification of purified DNA from C. coli and C. jejuni. The assays were highly specific and showed a 6-log-linear dynamic range of quantification with a quantitative detection limit of approximately 2.5 × 10² CFU/g of faeces, 1.3 × 10² CFU/g of feed, and 1.0 × 10³ CFU/m² for the environmental samples. Compared to the results obtained by culture, both the C. coli and C. jejuni real-time PCR assays exhibited a specificity of 96.2%, with kappa values of 0.94 and 0.89, respectively. For faecal samples of experimentally infected pigs, the coefficients of correlation between the C. coli or C. jejuni real-time PCR assay and culture enumeration were R² = 0.90 and R² = 0.93, respectively. Conclusion: The C. coli and C. jejuni real-time quantitative PCR assays developed in this study provide a method capable of directly detecting and quantifying C. coli and C. jejuni in faeces, feed, and environmental samples. These assays represent a new
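
    Quantification in an assay of this kind rests on a standard curve relating the quantification cycle (Ct) to log10 copy number, followed by scaling back to the amount of sample processed. The sketch below shows that arithmetic; the slope, intercept, elution and template volumes, and sample mass are assumptions for illustration, not the parameters of this assay.

```python
# Sketch of the quantification step behind a real-time PCR assay like the one
# described above: a standard curve Ct = slope * log10(copies) + intercept is
# inverted to estimate copies per reaction, then scaled to genome copies per
# gram of faeces or feed. Slope, intercept, and the volume/mass factors below
# are assumptions for illustration, not the assay's actual parameters.

SLOPE = -3.32      # Ct per log10(copies), ~100% PCR efficiency assumed
INTERCEPT = 38.0   # Ct corresponding to a single copy, assumed

def copies_per_reaction(ct: float) -> float:
    return 10 ** ((ct - INTERCEPT) / SLOPE)

def copies_per_gram(ct: float, elution_volume_ul: float = 100.0,
                    template_volume_ul: float = 5.0,
                    sample_mass_g: float = 0.25) -> float:
    per_rxn = copies_per_reaction(ct)
    return per_rxn * (elution_volume_ul / template_volume_ul) / sample_mass_g

if __name__ == "__main__":
    ct_value = 28.5  # hypothetical measurement
    print(f"{copies_per_gram(ct_value):.2e} genome copies per gram")
```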

  12. PCR Assay Based on the gyrB Gene for Rapid Identification of Acinetobacter baumannii-calcoaceticus Complex at Species Level.

    Science.gov (United States)

    Teixeira, Aline B; Barin, Juliana; Hermes, Djuli M; Barth, Afonso L; Martins, Andreza F

    2017-05-01

    The genus Acinetobacter comprises more than 50 species, four of which are closely related and difficult to distinguish by either phenotypic or genotypic methods: the Acinetobacter calcoaceticus-baumannii complex (ABC). Correct identification at the species level is necessary mainly for epidemiological reasons. We evaluated a multiplex PCR targeting the gyrB gene to identify the species of the ABC, using sequencing of the ITS 16S-23S fragment as the gold standard. Isolates identified as Acinetobacter calcoaceticus-baumannii from three hospitals in southern Brazil in 2011 were included in this study. A total of 117 isolates were obtained, and 106 (90.6%) were confirmed as A. baumannii, 6 (5.1%) as A. nosocomialis and 4 (3.4%) as A. pittii by the gyrB PCR. Only one isolate did not yield a product in the gyrB PCR; this isolate was identified as Acinetobacter genospecies 10 by sequencing of the ITS. We also noted that the non-A. baumannii isolates were recovered from the respiratory tract (8; 72.7%), blood (2; 18.2%) and urine (1; 9.1%), suggesting that these species can cause serious infection. These findings show that the multiplex PCR of the gyrB gene is a feasible and simple method to identify isolates of the ABC at the species level. © 2016 Wiley Periodicals, Inc.

  13. Rapid and efficient visible light photocatalytic dye degradation using AFe2O4 (A = Ba, Ca and Sr) complex oxides

    International Nuclear Information System (INIS)

    Vijayaraghavan, T.; Suriyaraj, S.P.; Selvakumar, R.; Venkateswaran, R.; Ashok, Anuradha

    2016-01-01

    Highlights: • Alkaline earth ferrites AFe2O4 (A = Ba, Ca and Sr) were synthesized by the sol–gel method. • The visible light photocatalytic activity of these ferrites was studied using congo red dye degradation. • BaFe2O4 exhibited the best photocatalytic activity under visible light (xenon lamp) irradiation; CaFe2O4 was the best photocatalyst under natural sunlight irradiation. - Abstract: The photocatalytic activity of spinel type complex oxides has been investigated in this study. Alkaline earth ferrites AFe2O4 (A = Ba, Ca, Sr) were synthesized by the sol–gel method. Structural characterizations reveal that the synthesized ferrites have orthorhombic crystal structures with different space groups and cell dimensions when they have different alkaline earth metals in their A site. All the synthesized ferrites exhibited bandgaps in the range 2.14–2.19 eV. Their photocatalytic activities were studied using congo red dye under sunlight and xenon lamp radiation. The substitution of Ba, Ca and Sr at the A site of these ferrites had varying impact on the dye degradation process. Under xenon lamp irradiation, BaFe2O4 exhibited the highest percentage of dye degradation (92% after 75 min). However, CaFe2O4 showed the fastest degradation of the dye (70% within 15 min). In the absence of irradiation, SrFe2O4 showed the highest dye adsorption (44% after 75 min).

  14. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concepts of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, and we must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  15. Rapid and efficient visible light photocatalytic dye degradation using AFe2O4 (A = Ba, Ca and Sr) complex oxides

    Energy Technology Data Exchange (ETDEWEB)

    Vijayaraghavan, T. [PSG Institute of Advanced Studies, Coimbatore 641004 (India); Suriyaraj, S.P.; Selvakumar, R. [Nanobiotechnology Laboratory, PSG Institute of Advanced Studies, Coimbatore 641004 (India); Venkateswaran, R. [PSG Institute of Advanced Studies, Coimbatore 641004 (India); Ashok, Anuradha, E-mail: anu@psgias.ac.in [PSG Institute of Advanced Studies, Coimbatore 641004 (India)

    2016-08-15

    Highlights: • Alkaline earth ferrites AFe2O4 (A = Ba, Ca and Sr) were synthesized by the sol–gel method. • The visible light photocatalytic activity of these ferrites was studied using congo red dye degradation. • BaFe2O4 exhibited the best photocatalytic activity under visible light (xenon lamp) irradiation; CaFe2O4 was the best photocatalyst under natural sunlight irradiation. - Abstract: The photocatalytic activity of spinel type complex oxides has been investigated in this study. Alkaline earth ferrites AFe2O4 (A = Ba, Ca, Sr) were synthesized by the sol–gel method. Structural characterizations reveal that the synthesized ferrites have orthorhombic crystal structures with different space groups and cell dimensions when they have different alkaline earth metals in their A site. All the synthesized ferrites exhibited bandgaps in the range 2.14–2.19 eV. Their photocatalytic activities were studied using congo red dye under sunlight and xenon lamp radiation. The substitution of Ba, Ca and Sr at the A site of these ferrites had varying impact on the dye degradation process. Under xenon lamp irradiation, BaFe2O4 exhibited the highest percentage of dye degradation (92% after 75 min). However, CaFe2O4 showed the fastest degradation of the dye (70% within 15 min). In the absence of irradiation, SrFe2O4 showed the highest dye adsorption (44% after 75 min).
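
    The degradation percentages quoted in both copies of this record come from comparing the dye's absorbance at its absorption maximum before and after irradiation. A minimal sketch of that calculation follows; the absorbance values are invented and merely chosen to give a figure in the reported range.

```python
# Sketch of the percentage-degradation calculation behind figures such as
# "92% after 75 min": the congo red absorbance at its absorption maximum is
# compared before (A0) and after (At) irradiation. Values are invented.

def degradation_percent(a0: float, at: float) -> float:
    """Photocatalytic degradation efficiency in percent."""
    return (a0 - at) / a0 * 100.0

if __name__ == "__main__":
    a0 = 1.25  # initial absorbance of the dye solution (assumed)
    at = 0.10  # absorbance after 75 min of irradiation (assumed)
    print(f"degradation = {degradation_percent(a0, at):.1f} %")
```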

  16. Rapid Continuous Multimaterial Extrusion Bioprinting

    NARCIS (Netherlands)

    Liu, Wanjun; Zhang, Yu Shrike; Heinrich, Marcel A.; De Ferrari, F; Jang, HL; Bakht, SM; Alvarez, MM; Yang, J; Li, YC; Trujillo-de Stantiago, G; Miri, AK; Zhu, K; Khoshakhlagh, P; Prakash, G; Cheng, H; Guan, X; Zhong, Z; Ju, J; Zhu, GH; Jin, X; Ryon Shin, Su; Dokmeci, M.R.; Khademhosseini, Ali

    The development of a multimaterial extrusion bioprinting platform is reported. This platform is capable of depositing multiple coded bioinks in a continuous manner with fast and smooth switching among different reservoirs for rapid fabrication of complex constructs, through digitally controlled

  17. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.; Carlson, R.L.

    1987-01-01

    A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of the physical protection system. It utilizes the artificial intelligence capabilities available in the Prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the security analyzer is a tool that aids the protection planner as well as the protection operations staff
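
    The core of the analyzer described here is a search over a model of the facility for attack paths that satisfy given criteria, which the tool encodes as Prolog facts and rules. The sketch below illustrates the same idea as a simple depth-first path search in Python; the facility graph, sensor states, and criterion are entirely made up and are not taken from the report.

```python
# Sketch of the attack-path search idea behind the Prolog security analyzer:
# a depth-first search over a facility graph for paths from an entry point to
# a target that avoid currently active sensors. The graph, sensor states, and
# criterion are invented; the real tool expresses this as Prolog rules.

FACILITY = {
    "outside": ["fence_gate", "fence_gap"],
    "fence_gate": ["yard"],
    "fence_gap": ["yard"],
    "yard": ["door_a", "door_b"],
    "door_a": ["vault"],
    "door_b": ["vault"],
}

ACTIVE_SENSORS = {"fence_gate", "door_a"}  # e.g. the door_b sensor is down

def attack_paths(node, target, path=None):
    """Yield every sensor-free path from node to target."""
    path = (path or []) + [node]
    if node == target:
        yield path
        return
    for nxt in FACILITY.get(node, []):
        if nxt in ACTIVE_SENSORS or nxt in path:
            continue
        yield from attack_paths(nxt, target, path)

if __name__ == "__main__":
    for p in attack_paths("outside", "vault"):
        print(" -> ".join(p))
```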

  18. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  19. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  20. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    Automation of the extraction spectrophotometric determination of uranium in solution is discussed. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design is described of an analyzer consisting of an analytical unit and a control unit. The analyzer promises increased labour productivity, improved operating and hygiene conditions, and above all more accurate analytical results. (J.C.)

  1. Influences of oceanic islands and the Pleistocene on the biogeography and evolution of two groups of Australasian parrots (Aves: Psittaciformes: Eclectus roratus, Trichoglossus haematodus complex. Rapid evolution and implications for taxonomy and conservation

    Directory of Open Access Journals (Sweden)

    Braun Michael P.

    2016-08-01

    The Australasian region is a centre of biodiversity and endemism, mainly owing to its tropical climate in combination with its large number of islands. During the Pleistocene, islands of the Sahul Shelf (Australia, New Guinea, Aru Islands) were part of the same land mass, while islands within the Wallacea (Lesser Sunda Islands, Moluccas, Sulawesi, etc.) remained isolated. We investigated biogeographical avian diversification patterns of two species complexes across the Wallacea and the Sahul Shelf: the Eclectus Parrot Eclectus roratus Wagler, 1832, and the Rainbow Lorikeet Trichoglossus haematodus Linnaeus, 1771. Both species are represented by a large number of described geographical subspecies. We used mitochondrial cytochrome b (cyt b) sequences for phylogenetic and network analyses to detect the biogeographic roles of islands and avian diversification patterns. The number of threatened taxa in this region is increasing rapidly, and there is an urgent need for (sub)species conservation in this region. Our study provides the first genetic evidence for treating several island taxa as distinct species. Similar genetic patterns were detected in both species complexes. Genetic diversification was higher across the islands of the Wallacea than across the islands of the Sahul Shelf. Divergence in E. roratus can be dated to about 1.38 million years ago, whereas in the younger T. haematodus it was about 0.80 million years ago. Long-distance dispersal was the most likely event behind the distribution patterns across the Wallacea and Sahul Shelf. The geographic origin of the species complex Eclectus roratus spp. is supposed to be Wallacean, but that of the species complex Trichoglossus haematodus spp. is supposed to be non-Wallacean. Trichoglossus euteles, so far considered a distinct species, clearly belongs to the Trichoglossus haematodus complex. The only case of sympatry in the complex is the distribution of T. (h.) euteles and T. h. capistratus on Timor, which means a
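
    Divergence times such as those quoted above are commonly obtained by dividing pairwise cyt b sequence divergence by an assumed substitution rate (a molecular clock). The sketch below shows that arithmetic under an assumed rate of roughly 2% sequence divergence per million years, a figure often used for avian cyt b; the divergence values are illustrative and were simply chosen so that the assumed clock reproduces the times quoted in the record.

```python
# Sketch of simple molecular-clock dating behind divergence estimates like
# those quoted above: time = pairwise sequence divergence / clock rate.
# The 2%/Myr rate is an assumed avian cyt b clock, and the divergence values
# below are illustrative, not measurements from the study.

CLOCK_RATE = 0.02  # proportion of sites diverging per million years (assumed)

def divergence_time_myr(pairwise_divergence: float) -> float:
    """Divergence time in million years from pairwise sequence divergence."""
    return pairwise_divergence / CLOCK_RATE

if __name__ == "__main__":
    for name, d in [("Eclectus roratus clades", 0.0276),
                    ("Trichoglossus haematodus clades", 0.0160)]:
        print(f"{name}: ~{divergence_time_myr(d):.2f} Myr")
```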

  2. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze American options in a discrete-time context with a finite outcome space, starting from the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  3. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  4. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  5. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    New types of disclosure and reporting are argued to be vital in order to convey a transparent picture of the true state of the company. However, they are unfortunately not without problems as these types of information are somewhat more complex than the information provided in the traditional...... stakeholders in a form that corresponds to the stakeholders understanding, then disclosure and interpretation of key performance indicators will also be facilitated....

  6. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.

  7. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available with comparable low costs right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, no control room equipment; and (5) coupling of computers by telecommunication via telephone

  8. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging technique derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images with superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images

  9. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  10. Rapid deployment intrusion detection system

    International Nuclear Information System (INIS)

    Graham, R.H.

    1997-01-01

    A rapidly deployable security system is one that provides intrusion detection, assessment, communications, and annunciation capabilities; is easy to install and configure; can be rapidly deployed; and is reusable. A rapidly deployable intrusion detection system (RADIDS) has many potential applications within the DOE Complex: back-up protection for failed zones in a perimeter intrusion detection and assessment system, intrusion detection and assessment capabilities in temporary locations, protection of assets during Complex reconfiguration, and protection in hazardous locations. Many DOE user-need documents have indicated an interest in a rapidly deployable intrusion detection system. The purpose of the RADIDS project is to design, develop, and implement such a system. 2 figs

  11. Symmetry in Complex Networks

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2011-01-01

    Full Text Available In this paper, we analyze a few interrelated concepts about graphs, such as their degree, entropy, or their symmetry/asymmetry levels. These concepts prove useful in the study of different types of Systems, and particularly, in the analysis of Complex Networks. A System can be defined as any set of components functioning together as a whole. A systemic point of view allows us to isolate a part of the world, and so, we can focus on those aspects that interact more closely than others. Network Science analyzes the interconnections among diverse networks from different domains: physics, engineering, biology, semantics, and so on. Current developments in the quantitative analysis of Complex Networks, based on graph theory, have been rapidly translated to studies of brain network organization. The brain's systems have complex network features—such as the small-world topology, highly connected hubs and modularity. These networks are not random. The topology of many different networks shows striking similarities, such as the scale-free structure, with the degree distribution following a Power Law. How can very different systems have the same underlying topological features? Modeling and characterizing these networks, looking for their governing laws, are the current lines of research. So, we will dedicate this Special Issue paper to show measures of symmetry in Complex Networks, and highlight their close relation with measures of information and entropy.
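
    As a concrete illustration of the measures discussed in this record (degree distributions, entropy, scale-free structure), the sketch below computes the Shannon entropy of the degree distribution for a random graph and for a preferential-attachment graph. It assumes the networkx package is available and is not taken from the paper itself.

```python
# Sketch: degree distribution entropy for two classic graph models.
# Assumes networkx is installed; illustrates the measures, not the paper's exact method.
import math
from collections import Counter

import networkx as nx

def degree_entropy(graph):
    """Shannon entropy (bits) of the graph's degree distribution."""
    degrees = [d for _, d in graph.degree()]
    counts = Counter(degrees)
    n = graph.number_of_nodes()
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random_graph = nx.erdos_renyi_graph(n=1000, p=0.01, seed=1)   # roughly Poisson degrees
scale_free = nx.barabasi_albert_graph(n=1000, m=5, seed=1)    # power-law-like degrees

print("Erdos-Renyi degree entropy :", round(degree_entropy(random_graph), 3))
print("Barabasi-Albert entropy    :", round(degree_entropy(scale_free), 3))
```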

  12. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites... The software implements an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and applies hereafter advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility... and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net.

  13. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), was examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  14. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world's capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China's Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  15. Spectrometric microbiological analyzer

    Science.gov (United States)

    Schlager, Kenneth J.; Meissner, Ken E.

    1996-04-01

    Currently, there are four general approaches to microbiological analysis, i.e., the detection, identification and quantification of micro-organisms: (1) Traditional culturing and staining procedures, metabolic fermentations and visual morphological characteristics; (2) Immunological approaches employing microbe-specific antibodies; (3) Biotechnical techniques employing DNA probes and related genetic engineering methods; and (4) Physical measurement techniques based on the biophysical properties of micro-organisms. This paper describes an instrumentation development in the fourth of the above categories, physical measurement, that uses a combination of fluorometric and light scatter spectra to detect and identify micro-organisms at the species level. A major advantage of this approach is the rapid turnaround possible in medical diagnostic or water testing applications. Fluorometric spectra serve to define the biochemical characteristics of the microbe, and light scatter spectra the size and shape morphology. Together, the two spectra define a 'fingerprint' for each species of microbe for detection, identification and quantification purposes. A prototype instrument has been developed and tested under NASA sponsorship based on fluorometric spectra alone. This instrument demonstrated identification and quantification capabilities at the species level. The paper reports on test results using this instrument, and the benefits of employing a combination of fluorometric and light scatter spectra.
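
    The 'fingerprint' idea described here can be sketched as a concatenation of the two spectra into one feature vector that is matched against a reference library. The example below uses synthetic stand-in spectra and made-up species names purely for illustration; it is not the instrument's actual algorithm.

```python
# Sketch: combine a fluorometric spectrum and a light-scatter spectrum into one
# "fingerprint" and identify an unknown by nearest neighbor against a library.
# All spectra and species names are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
species = ["E. coli", "B. subtilis", "S. aureus"]   # hypothetical library entries

def fingerprint(seed):
    r = np.random.default_rng(seed)
    fluorescence = r.random(50)          # stand-in fluorometric spectrum
    scatter = r.random(30)               # stand-in light-scatter spectrum
    return np.concatenate([fluorescence, scatter])

library = {name: fingerprint(i) for i, name in enumerate(species)}
unknown = library["B. subtilis"] + 0.05 * rng.standard_normal(80)   # noisy measurement

best = min(library, key=lambda name: np.linalg.norm(unknown - library[name]))
print("closest reference fingerprint:", best)
```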

  16. Immunization of chickens with an agonistic monoclonal anti-chicken CD40 antibody-hapten complex: rapid and robust IgG response induced by a single subcutaneous injection.

    Science.gov (United States)

    Chen, Chang-Hsin; Abi-Ghanem, Daad; Waghela, Suryakant D; Chou, Wen-Ko; Farnell, Morgan B; Mwangi, Waithaka; Berghman, Luc R

    2012-04-30

    Producing diagnostic antibodies in chicken egg yolk represents an alternate animal system that offers many advantages including high productivity at low cost. Despite being an excellent counterpart to mammalian antibodies, chicken IgG from yolk still represents an underused resource. The potential of agonistic monoclonal anti-CD40 antibodies (mAb) as a powerful immunological adjuvant has been demonstrated in mammals, but not in chickens. We recently reported an agonistic anti-chicken CD40 mAb (designated mAb 2C5) and showed that it may have potential as an immunological adjuvant. In this study, we examined the efficacy of targeting a short peptide to chicken CD40 [expressed by the antigen-presenting cells (APCs)] in enhancing an effective IgG response in chickens. For this purpose, an immune complex consisting of one streptavidin molecule, two directionally biotinylated mAb 2C5 molecules, and two biotinylated peptide molecules was produced. Chickens were immunized subcutaneously with doses of this complex ranging from 10 to 90 μg per injection once, and relative quantification of the peptide-specific IgG response showed that the mAb 2C5-based complex was able to elicit a strong IgG response as early as four days post-immunization. This demonstrates that CD40-targeting antigen to chicken APCs can significantly enhance antibody responses and induce immunoglobulin isotype-switching. This immunization strategy holds promise for rapid production of hapten-specific IgG in chickens. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  18. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  19. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  20. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability for long-term reliability of large scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities to levels of a few ppM. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppM in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results from this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described

  1. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  2. ROBOT TASK SCENE ANALYZER

    International Nuclear Information System (INIS)

    Hamel, William R.; Everett, Steven

    2000-01-01

    Environmental restoration and waste management (ER and WM) challenges in the United States Department of Energy (DOE), and around the world, involve radiation or other hazards which will necessitate the use of remote operations to protect human workers from dangerous exposures. Remote operations carry the implication of greater costs since remote work systems are inherently less productive than contact human work due to the inefficiencies/complexities of teleoperation. To reduce costs and improve quality, much attention has been focused on methods to improve the productivity of combined human operator/remote equipment systems; the achievements to date are modest at best. The most promising avenue in the near term is to supplement conventional remote work systems with robotic planning and control techniques borrowed from manufacturing and other domains where robotic automation has been used. Practical combinations of teleoperation and robotic control will yield telerobotic work systems that outperform currently available remote equipment. It is believed that practical telerobotic systems may increase remote work efficiencies significantly. Increases of 30% to 50% have been conservatively estimated for typical remote operations. It is important to recognize that the basic hardware and software features of most modern remote manipulation systems can readily accommodate the functionality required for telerobotics. Further, several of the additional system ingredients necessary to implement telerobotic control--machine vision, 3D object and workspace modeling, automatic tool path generation and collision-free trajectory planning--are existent

  3. ROBOT TASK SCENE ANALYZER

    Energy Technology Data Exchange (ETDEWEB)

    William R. Hamel; Steven Everett

    2000-08-01

    Environmental restoration and waste management (ER and WM) challenges in the United States Department of Energy (DOE), and around the world, involve radiation or other hazards which will necessitate the use of remote operations to protect human workers from dangerous exposures. Remote operations carry the implication of greater costs since remote work systems are inherently less productive than contact human work due to the inefficiencies/complexities of teleoperation. To reduce costs and improve quality, much attention has been focused on methods to improve the productivity of combined human operator/remote equipment systems; the achievements to date are modest at best. The most promising avenue in the near term is to supplement conventional remote work systems with robotic planning and control techniques borrowed from manufacturing and other domains where robotic automation has been used. Practical combinations of teleoperation and robotic control will yield telerobotic work systems that outperform currently available remote equipment. It is believed that practical telerobotic systems may increase remote work efficiencies significantly. Increases of 30% to 50% have been conservatively estimated for typical remote operations. It is important to recognize that the basic hardware and software features of most modern remote manipulation systems can readily accommodate the functionality required for telerobotics. Further, several of the additional system ingredients necessary to implement telerobotic control--machine vision, 3D object and workspace modeling, automatic tool path generation and collision-free trajectory planning--are existent.

  4. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter which are controlled by software performing irradiation control, loss-free gamma-spectrometry, spectra evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials in excellent agreement with consensus values. (author)

  5. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  6. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
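
    The record's adaptation of co-occurrence matrices to visibility analysis can be illustrated on a toy binary visibility map. The grid, offset, and the contrast feature in the sketch below are illustrative assumptions rather than the paper's exact formulation, which operates on clusters of triangular surfaces.

```python
# Sketch: a 2x2 co-occurrence matrix over a binary visibility grid for one offset,
# plus a simple contrast feature. Toy data; not the paper's cluster-based setup.
import numpy as np

def cooccurrence(visibility, dy=0, dx=1):
    """Normalized 2x2 co-occurrence matrix for a non-negative (dy, dx) offset."""
    h, w = visibility.shape
    a = visibility[:h - dy, :w - dx]     # reference cells
    b = visibility[dy:, dx:]             # offset cells
    m = np.zeros((2, 2))
    for i in (0, 1):
        for j in (0, 1):
            m[i, j] = np.sum((a == i) & (b == j))
    return m / m.sum()

vis = (np.random.default_rng(0).random((16, 16)) > 0.4).astype(int)  # toy visibility map
m = cooccurrence(vis)
contrast = sum(abs(i - j) * m[i, j] for i in range(2) for j in range(2))
print(m)
print("contrast feature:", round(contrast, 3))
```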

  7. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central labbased chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  8. Beam loss caused by edge focusing of injection bump magnets and its mitigation in the 3-GeV rapid cycling synchrotron of the Japan Proton Accelerator Research Complex

    Directory of Open Access Journals (Sweden)

    H. Hotchi

    2016-01-01

    Full Text Available In the 3-GeV rapid cycling synchrotron of the Japan Proton Accelerator Research Complex, transverse injection painting is utilized not only to suppress space-charge induced beam loss in the low energy region but also to mitigate foil scattering beam loss during charge-exchange injection. The space-charge induced beam loss is well minimized by the combination of modest transverse painting and full longitudinal painting. But, for sufficiently mitigating the foil scattering part of beam loss, the transverse painting area has to be further expanded. However, such a wide-ranging transverse painting had not been realized until recently due to beta function beating caused by edge focusing of pulsed injection bump magnets during injection. This beta function beating additionally excites random betatron resonances through a distortion of the lattice superperiodicity, and its resultant deterioration of the betatron motion stability causes significant extra beam loss when expanding the transverse painting area. To solve this issue, we newly installed pulse-type quadrupole correctors to compensate the beta function beating. This paper presents recent experimental results on this correction scheme for suppressing the extra beam loss, while discussing the beam loss and its mitigation mechanisms with the corresponding numerical simulations.

  9. Innovation Networks New Approaches in Modelling and Analyzing

    CERN Document Server

    Pyka, Andreas

    2009-01-01

    The science of graphs and networks has become by now a well-established tool for modelling and analyzing a variety of systems with a large number of interacting components. Starting from the physical sciences, applications have spread rapidly to the natural and social sciences, as well as to economics, and are now further extended, in this volume, to the concept of innovations, viewed broadly. In an abstract, systems-theoretical approach, innovation can be understood as a critical event which destabilizes the current state of the system, and results in a new process of self-organization leading to a new stable state. The contributions to this anthology address different aspects of the relationship between innovation and networks. The various chapters incorporate approaches in evolutionary economics, agent-based modeling, social network analysis and econophysics and explore the epistemic tension between insights into economics and society-related processes, and the insights into new forms of complex dynamics.

  10. JINR rapid communications

    International Nuclear Information System (INIS)

    1998-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate records on decays of excited strange mesons in the extended NJL model, production of heavy evaporation residues in the reactions induced by an extracted 48Ca beam on a 208Pb target, scaling behaviour of the tensor analyzing power (Ayy) in the inelastic scattering of relativistic deuterons, two-photon collisions at very low Q^2 from LEP2: forthcoming results, a high magnetic field uniformity superconducting magnet for a movable polarized target, a multichannel time-to-digital converter for a drift detector, and wavelet analysis: application to Gaussian signals

  11. JINR rapid communications

    International Nuclear Information System (INIS)

    1995-01-01

    The present collection of rapid communications from JINR, Dubna, contains eight separate reports on the measurement of charge radii for Ti nuclei, spectroscopy of 13Be, concentrations of hadrons and quark-gluon plasma in the mixed phase, experimental results on one-spin pion asymmetry in the d↑ + A → π±(90°) + X process, new results on cumulative pion and proton production in p-D collisions, investigation of charge exchange reactions, the study of the tensor analyzing power in cumulative particle production on a deuteron beam, and evidence for the excited states of the S = -2 stable light dibaryon. 32 figs., 6 tabs

  12. A seal analyzer for testing container integrity

    International Nuclear Information System (INIS)

    McDaniel, P.; Jenkins, C.

    1988-01-01

    This paper reports on the development of a laboratory and production seal analyzer that offers a rapid, nondestructive method of assuring the seal integrity of virtually any type of single or double sealed container. The system can test a broad range of metal cans, drums and trays, membrane-lidded vessels, flexible pouches, aerosol containers, and glass or metal containers with twist-top lids that are used in the chemical/pesticide (hazardous materials/waste), beverage, food, medical and pharmaceutical industries

  13. Solid-state thermal decomposition of the [Co(NH3)5CO3]NO3·0.5H2O complex: A simple, rapid and low-temperature synthetic route to Co3O4 nanoparticles

    International Nuclear Information System (INIS)

    Farhadi, Saeid; Safabakhsh, Jalil

    2012-01-01

    Highlights: ► [Co(NH3)5CO3]NO3·0.5H2O complex was used for preparing pure Co3O4 nanoparticles. ► Co3O4 nanoparticles were prepared at a low temperature of 175 °C. ► Co3O4 nanoparticles show a weak ferromagnetic behaviour at room temperature. ► The method is simple, low-cost and suitable for the production of Co3O4. - Abstract: Co3O4 nanoparticles were easily prepared via the decomposition of the pentammine(carbonato)cobalt(III) nitrate precursor complex [Co(NH3)5CO3]NO3·0.5H2O at low temperature (175 °C). The product was characterized by thermal analysis, X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FT-IR), UV–visible spectroscopy, transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDX), Raman spectroscopy, Brunauer–Emmett–Teller (BET) specific surface area measurements and magnetic measurements. The FT-IR, XRD, Raman and EDX results indicated that the synthesized Co3O4 nanoparticles are highly pure and have a single phase. The TEM analysis revealed nearly uniform and quasi-spherical Co3O4 nanoparticles with an average particle size of approximately 10 nm. The optical absorption spectrum of the Co3O4 nanoparticles showed two direct band gaps of 2.18 and 3.52 eV with a red shift in comparison with previously reported values. The prepared Co3O4 nanoparticles showed a weak ferromagnetic behaviour that could be attributed to uncompensated surface spins and/or finite-size effects. Using the present method, Co3O4 nanoparticles can be produced without expensive organic solvents and complicated equipment. This simple, rapid, safe and low-cost synthetic route can be extended to the synthesis of other transition-metal oxides.

  14. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, some CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD-converters

  15. Rapid shallow breathing

    Science.gov (United States)

    Tachypnea; Breathing - rapid and shallow; Fast shallow breathing; Respiratory rate - rapid and shallow ... Shallow, rapid breathing has many possible medical causes, including: Asthma Blood clot in an artery in the ...

  16. Rapid Geophysical Surveyor

    International Nuclear Information System (INIS)

    Roybal, L.G.; Carpenter, G.S.; Josten, N.E.

    1993-01-01

    The Rapid Geophysical Surveyor (RGS) is a system designed to rapidly and economically collect closely-spaced geophysical data used for characterization of US Department of Energy waste sites. Geophysical surveys of waste sites are an important first step in the remediation and closure of these sites; especially older sites where historical records are inaccurate and survey benchmarks have changed because of refinements in coordinate controls and datum changes. Closely-spaced data are required to adequately differentiate pits, trenches, and soil vault rows whose edges may be only a few feet from each other. A prototype vehicle designed to collect magnetic field data was built at the Idaho National Engineering Laboratory (INEL) during the summer of 1992. The RGS was funded by the Buried Waste Integrated Demonstration program. This vehicle was demonstrated at the Subsurface Disposal Area (SDA) within the Radioactive Waste Management Complex at the INEL in September 1992. Magnetic data were collected over two areas in the SDA, with a total survey area of about 1.7 acres. Data were collected at a nominal density of 2 1/2 in. along survey lines spaced 1-ft apart. Over 350,000 data points were collected over a 6 day period corresponding to about 185 worker-days using conventional ground survey techniques

  17. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer, the second-order focus is obtained at an arbitrary injection angle. This kind of analyzer with a small injection angle will have the advantage of a small operational voltage, compared to the Proca and Green analyzer where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for the precise energy measurement of high energy particles in the MeV range. (author)

  18. JINR rapid communications

    International Nuclear Information System (INIS)

    1997-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate reports on investigation of the tensor analyzing power Ayy in the reaction A(d polarized, p)X at large transverse momenta of the proton, double-differential ionization cross section calculations for fast collisions of ions and atoms, a study of two-photon interactions tagged at an average ⟨Q^2⟩ of 90 GeV^2, cluster and single-particle distributions in nucleus-nucleus interactions, the Coulomb interaction of charged pions in CC- and CTa-collisions at 4.2 A GeV/c, influence of nitrogen and oxygen gas admixtures on the response of the DELPHI HCAL and MUS detectors, and automation of physics research on the basis of open standards

  19. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  20. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The developed system had a severe performance problem: the resulting spectrum lacked smoothness and was very noisy, full of spikes and surges, and therefore could not be used for analyzing the cement substance. This paper describes the work carried out to improve the system performance
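
    A generic remedy for the symptom described, a spectrum full of isolated spikes and surges, is median filtering followed by light smoothing. The sketch below assumes NumPy and SciPy are available and is not necessarily the fix the authors implemented.

```python
# Sketch: despiking and smoothing a noisy synthetic spectrum.
# Generic illustration of the symptom/remedy, not the paper's actual enhancement.
import numpy as np
from scipy.signal import medfilt
from scipy.ndimage import uniform_filter1d

rng = np.random.default_rng(0)
channels = np.arange(1024)
true_spectrum = 200 * np.exp(-0.5 * ((channels - 400) / 30.0) ** 2) + 20
noisy = rng.poisson(true_spectrum).astype(float)
spike_channels = rng.choice(channels, 25, replace=False)
noisy[spike_channels] += rng.integers(300, 800, size=25)   # add surges/spikes

despiked = medfilt(noisy, kernel_size=5)                   # removes isolated spikes
smoothed = uniform_filter1d(despiked, size=7)              # light smoothing

print("max residual vs. true spectrum:", float(np.max(np.abs(smoothed - true_spectrum))))
```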

  1. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  2. On the rapid melt quenching

    International Nuclear Information System (INIS)

    Usatyuk, I.I.; Novokhatskij, I.A.; Kaverin, Yu.F.

    1994-01-01

    Specific features of the instrumentation of the traditionally employed melt-spinning (rapid quenching) method were analyzed and its disadvantages discussed. The necessity of upgrading the method as applied to the problems of studying the fine structure of molten metals and glasses was substantiated. The principal flowsheet of an experimental facility for extremely rapid quenching of metal melts is described, and the specifics of its original functional units are considered. The sequence and character of all the principal stages of the developed method are discussed. 18 refs.; 3 figs.

  3. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  4. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally analyzing data happens via batch-processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: A cloud-based approach. It aims to make it a productive and interactive environment through the combination of FCC and SWAN software.

  5. Rapid geophysical surveyor

    International Nuclear Information System (INIS)

    Roybal, L.G.; Carpenter, G.S.; Josten, N.E.

    1993-01-01

    The Rapid Geophysical Surveyor (RGS) is a system designed to rapidly and economically collect closely-spaced geophysical data used for characterization of Department of Energy (DOE) waste sites. Geophysical surveys of waste sites are an important first step in the remediation and closure of these sites; especially older sites where historical records are inaccurate and survey benchmarks have changed due to refinements in coordinate controls and datum changes. Closely-spaced data are required to adequately differentiate pits, trenches, and soil vault rows whose edges may be only a few feet from each other. A prototype vehicle designed to collect magnetic field data was built at the Idaho National Engineering Laboratory (INEL) during the summer of 1992. The RGS was one of several projects funded by the Buried Waste Integrated Demonstration (BWID) program. This vehicle was demonstrated at the Subsurface Disposal Area (SDA) within the Radioactive Waste Management Complex (RWMC) on the INEL in September of 1992. Magnetic data were collected over two areas in the SDA, with a total survey area of about 1.7 acres. Data were collected at a nominal density of 2 1/2 inches along survey lines spaced 1 foot apart. Over 350,000 data points were collected over a 6 day period corresponding to about 185 man-days using conventional ground survey techniques. This report documents the design and demonstration of the RGS concept including the presentation of magnetic data collected at the SDA. The surveys were able to show pit and trench boundaries and determine details of their spatial orientation never before achieved

  6. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe the recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and the analytical time and speed per test.

  7. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built on large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of use and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive licence.

  8. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  9. Rapid flow imaging method

    International Nuclear Information System (INIS)

    Pelc, N.J.; Spritzer, C.E.; Lee, J.N.

    1988-01-01

    A rapid, phase-contrast MR method for imaging flow has been implemented. The method, called VIGRE (velocity imaging with gradient recalled echoes), consists of two interleaved, narrow flip angle, gradient-recalled acquisitions. One is flow compensated while the second has a specified flow encoding (both peak velocity and direction) that causes signals to contain additional phase in proportion to velocity in the specified direction. Complex image data from the first acquisition are used as a phase reference for the second, yielding immunity from phase accumulation due to causes other than motion. Images with pixel values equal to MΔΘ, where M is the magnitude of the flow-compensated image and ΔΘ is the phase difference at the pixel, are produced. The magnitude weighting provides additional vessel contrast, suppresses background noise, maintains the flow direction information, and still allows quantitative data to be retrieved. The method has been validated with phantoms and is undergoing initial clinical evaluation. Early results are extremely encouraging
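
    The image formation described in the record, a magnitude-weighted phase difference M·ΔΘ computed against the flow-compensated reference, can be sketched on synthetic data. The velocity pattern and the phase-per-velocity scaling below are illustrative assumptions, not acquisition parameters from the paper.

```python
# Sketch: form a magnitude-weighted phase-difference image, M * delta-theta, from a
# flow-compensated and a flow-encoded complex acquisition. Synthetic toy data only.
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)
velocity = np.zeros(shape)
velocity[24:40, 24:40] = 0.5                  # toy "vessel" with flow (arbitrary units)

magnitude = 1.0 + 0.05 * rng.standard_normal(shape)
venc_scale = np.pi / 1.0                      # assumed phase per unit velocity (up to VENC)

compensated = magnitude * np.exp(1j * 0.2)                       # common background phase
encoded = magnitude * np.exp(1j * (0.2 + venc_scale * velocity))

# Using the compensated image as a phase reference removes non-motion phase.
delta_theta = np.angle(encoded * np.conj(compensated))
flow_image = np.abs(compensated) * delta_theta    # pixel value = M * delta-theta

print("mean flow value in vessel :", round(float(flow_image[24:40, 24:40].mean()), 3))
print("mean flow value outside   :", round(float(flow_image[:16, :16].mean()), 3))
```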

  10. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a- chip includes materials to extract 3- methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on- a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  11. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has many applications, making it a significant and useful tool, but it can also be dangerous for living beings exposed to uncontrolled doses. Because it cannot be perceived by any of the human senses, radiation detectors and additional devices are required to detect, quantify and classify it. A multichannel analyzer is responsible for sorting the different pulse heights generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq system-on-chip; the virtual instrument was developed on the LabVIEW graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA, and for the internal control of the multichannel analyzer an application was written for the ARM processor in C. In the second phase, the virtual instrument was developed for the management, control and visualization of the results. The data obtained were displayed graphically in a histogram showing the measured spectrum. The multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable with those from commercial multichannel analyzers. (Author)
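
    In software terms, the core function of a multichannel analyzer is to sort digitized pulse heights into 2^n channels and accumulate a spectrum. The sketch below does exactly that on synthetic pulse data; the channel count, the Cs-137-like peak position and the noise model are assumptions, and the record's implementation is of course FPGA/ARM hardware, not Python.

```python
# Sketch: a software multichannel analyzer, i.e. a histogram of pulse heights over
# 2**n_bits channels. Pulse data are synthetic; values are ADC units.
import numpy as np

n_bits = 10                                    # assumed ADC resolution -> 1024 channels
n_channels = 2 ** n_bits

rng = np.random.default_rng(0)
background = rng.integers(0, n_channels, 20_000)                    # broad background
photopeak = rng.normal(loc=662, scale=8, size=5_000).astype(int)    # Cs-137-like line
pulses = np.clip(np.concatenate([background, photopeak]), 0, n_channels - 1)

spectrum, _ = np.histogram(pulses, bins=n_channels, range=(0, n_channels))
peak_channel = int(np.argmax(spectrum[200:]) + 200)   # ignore low-channel noise region
print("peak near channel:", peak_channel, "counts:", int(spectrum[peak_channel]))
```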

  12. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  13. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming computer language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., ''sensors'' and ''cameras'') upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a ''search'' mode, wherein all paths are found from any specified physical location to another specified location which satisfy user chosen ''intruder detection'' probability and elapsed time criteria (i.e., the program finds the ''weakest paths'' from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straight-forward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs
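
    The 'search' mode described above can be sketched as a path enumeration over a small facility graph whose edges carry a detection probability and a traversal time; paths whose cumulative detection probability and elapsed time stay below chosen limits are the 'weakest paths'. The graph, probabilities, and thresholds below are made-up examples, and the actual tool is written in Prolog over an entity-relationship knowledge base, not Python.

```python
# Sketch: enumerate "weakest paths" from a start location to a target, keeping only
# paths that satisfy chosen detection-probability and elapsed-time criteria.
# The facility layout and edge values are fictitious.

# edge: (neighbor, detection probability on that segment, traversal time in seconds)
facility = {
    "fence":  [("yard", 0.30, 20)],
    "yard":   [("door_a", 0.60, 15), ("door_b", 0.40, 40)],
    "door_a": [("vault", 0.90, 10)],
    "door_b": [("vault", 0.70, 10)],
}

def weak_paths(graph, start, goal, max_p_detect=0.9, max_time=90, path=None, p_miss=1.0, t=0):
    path = path or [start]
    if start == goal:
        if 1.0 - p_miss <= max_p_detect and t <= max_time:
            yield path, 1.0 - p_miss, t
        return
    for nxt, p_det, dt in graph.get(start, []):
        if nxt not in path:                       # simple paths only
            yield from weak_paths(graph, nxt, goal, max_p_detect, max_time,
                                  path + [nxt], p_miss * (1.0 - p_det), t + dt)

for p, p_detect, t in sorted(weak_paths(facility, "fence", "vault"), key=lambda x: x[1]):
    print(" -> ".join(p), f"P(detect)={p_detect:.2f}", f"time={t}s")
```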

  14. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed

  15. Development of a rapid in vitro protein refolding assay which discriminates between peptide-bound and peptide-free forms of recombinant porcine major histocompatibility class I complex (SLA-I)

    DEFF Research Database (Denmark)

    Oleksiewicz, M.B.; Kristensen, B.; Ladekjaer-Mikkelsen, A.S.

    2002-01-01

    The extracellular domains of swine leukocyte antigen class I (SLA-I, major histocompatibility complex protein class 1) were cloned and sequenced for two haplotypes (114 and H7) which do not share any alleles based on serological typing, and which are the most important in Danish farmed pigs...

  16. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  17. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  18. Therapy Talk: Analyzing Therapeutic Discourse

    Science.gov (United States)

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  19. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004.Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  20. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film

  1. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
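
    The calibration step described above reduces to minimizing a weighted sum of squared differences between simulated and measured flows and drawdowns, plus a regularization penalty that limits how far hydraulic conductivity may drift within a lithologic unit. The Python sketch below is only an editorial illustration of such an objective, with hypothetical names throughout; it is not the AnalyzeHOLE code or an actual PEST control file.

        import numpy as np

        def calibration_objective(k_by_layer, simulate, obs_flow, obs_drawdown,
                                  lithology_of_layer, w_flow=1.0, w_dd=1.0, w_reg=0.1):
            """Weighted least-squares misfit plus a regularization penalty.

            simulate(k_by_layer) is assumed to return (sim_flow, sim_drawdown)
            from the axisymmetric flow model; all arguments are hypothetical.
            """
            sim_flow, sim_dd = simulate(k_by_layer)
            misfit = (w_flow * np.sum((np.asarray(sim_flow) - obs_flow) ** 2)
                      + w_dd * np.sum((np.asarray(sim_dd) - obs_drawdown) ** 2))

            # Regularization: penalize deviation of log-K from the mean of its lithology.
            log_k = np.log10(np.asarray(k_by_layer, dtype=float))
            penalty = 0.0
            for lith in set(lithology_of_layer):
                idx = [i for i, l in enumerate(lithology_of_layer) if l == lith]
                penalty += np.sum((log_k[idx] - log_k[idx].mean()) ** 2)
            return misfit + w_reg * penalty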

  2. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and an Aggregatibacter actinomycetemcomitans strain clinically isolated from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of the strains. This study demonstrated that FTIR can be used to decrease the identification time, compared to traditional methods, for fastidious buccal microorganisms associated with the etiology of periodontitis.
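
    As a hedged illustration of the cluster-analysis step, the sketch below restricts each spectrum to the three diagnostic wavenumber windows, vector-normalizes it, and applies hierarchical (Ward) clustering; the use of SciPy, the normalization choice, and all variable names are editorial assumptions, not the authors' software.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def cluster_spectra(spectra, wavenumbers,
                            windows=((3500, 2800), (1484, 1420), (1000, 750))):
            """Cluster FTIR spectra (rows = samples, NumPy arrays) on selected windows."""
            mask = np.zeros_like(wavenumbers, dtype=bool)
            for high, low in windows:
                mask |= (wavenumbers <= high) & (wavenumbers >= low)
            X = spectra[:, mask]
            X = X / np.linalg.norm(X, axis=1, keepdims=True)   # vector normalization
            Z = linkage(X, method="ward")                      # hierarchical clustering
            return fcluster(Z, t=3, criterion="maxclust")      # one cluster per strain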

  3. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  4. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  5. Portable Diagnostics and Rapid Germination

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, Zachary Spencer [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-12-01

    In the Bioenergy and Defense Department of Sandia National Laboratories, characterization of BaDx (the Bacillus anthracis diagnostic cartridge) was performed and rapid germination chemistry was investigated. BaDx was tested with complex sample matrices inoculated with Bacillus anthracis, and the trials proved that BaDx will detect Bacillus anthracis in a variety of media, such as dirt, serum, blood, milk, and horse fluids. The dimensions of the device were altered to accommodate an E. coli or Listeria lateral flow immunoassay, and using a laser printer, BaDx devices were manufactured to identify E. coli and Listeria. Initial testing with the E. coli versions of BaDx indicates that the device will be viable as a portable diagnostic cartridge. The device would be more effective with faster bacterial germination; hence studies were performed on the use of rapid germination chemistry. Trials with calcium dipicolinic acid displayed increased cell germination, as shown by control studies using a microplate reader. Upon lyophilization the rapid germination chemistry failed to change growth patterns, indicating that the calcium dipicolinic acid was not solubilized under the conditions tested. Although incompatible with the portable diagnostic device, the experiments proved that the rapid germination chemistry was effective in increasing cell germination.

  6. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system that includes hardware and software capabilities, and operates on the four head position signals acquired from a gamma camera detector. The result is the spectrum of the energy delivered by the nuclear radiation reaching the camera detector head. The system includes analog processing of the position signals from the camera, digitization and subsequent processing of the energy signal in a multichannel analyzer, transmission of the data to a computer via a standard USB port, and processing of the data on a personal computer to obtain the final histogram. The circuits are composed of an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)
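
    As a rough, hedged illustration of the final software stage, the sketch below sums the four digitized head position signals into a per-event energy value (the usual Anger-camera convention, assumed here rather than stated in the record) and bins the events into a multichannel histogram; all names and parameters are hypothetical.

        import numpy as np

        def energy_spectrum(a, b, c, d, n_channels=1024, full_scale=1.0):
            """Sum four digitized position signals per event and bin into an MCA histogram."""
            energy = np.asarray(a) + np.asarray(b) + np.asarray(c) + np.asarray(d)
            counts, edges = np.histogram(energy, bins=n_channels, range=(0.0, full_scale))
            return counts, edges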

  7. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many...... new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research...... strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject....

  8. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operation Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility

  9. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

    Online Social Networks (OSNs) have during the last years acquired huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to building a solid substrate of connections and relationships among people using the Web. In this preliminary work, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  10. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  11. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed; it can also switch among FIA operation modes, giving the instrument the functions of a universal analyzer. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in the aqueous solution with the addition of cetyl-pyridium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L-1 can be directly determined without any pretreatment. A sample throughput rate of 30-90 h-1 and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant

  12. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  13. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
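
    To make the proposed procedure concrete, here is a minimal editorial sketch (the article's own demonstrations are in R; this Python version, the correlation example, and the column names are assumptions): the data are split, each split yields an effect estimate, and the estimates are pooled with a fixed-effect, inverse-variance-weighted meta-analysis of Fisher-z transformed correlations.

        import numpy as np
        import pandas as pd

        def split_analyze_meta_analyze(df, x, y, n_splits=10, seed=0):
            """Split a large dataset, analyze each split, then meta-analyze the estimates."""
            shuffled = df.sample(frac=1.0, random_state=seed).reset_index(drop=True)
            splits = np.array_split(shuffled, n_splits)

            zs, ws = [], []
            for part in splits:                      # "analyze" step: one estimate per split
                r = part[x].corr(part[y])
                n = len(part)
                zs.append(np.arctanh(r))             # Fisher z-transform of the correlation
                ws.append(n - 3)                     # inverse of the sampling variance 1/(n-3)

            zs, ws = np.array(zs), np.array(ws, dtype=float)
            z_pooled = np.sum(ws * zs) / np.sum(ws)  # fixed-effect pooled estimate
            se = np.sqrt(1.0 / np.sum(ws))
            ci = (np.tanh(z_pooled - 1.96 * se), np.tanh(z_pooled + 1.96 * se))
            return np.tanh(z_pooled), ci

    In use, one would call split_analyze_meta_analyze(big_df, x="predictor", y="outcome", n_splits=100) on a dataset too large to analyze in one pass and obtain a pooled correlation with an approximate 95% confidence interval; the variable names here are placeholders, not anything from the article's datasets.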

  14. Direct Application of the INNO-LiPA Rif.TB Line-Probe Assay for Rapid Identification of Mycobacterium tuberculosis Complex Strains and Detection of Rifampin Resistance in 360 Smear-Positive Respiratory Specimens from an Area of High Incidence of Multidrug-Resistant Tuberculosis

    Science.gov (United States)

    Viveiros, Miguel; Leandro, Clara; Rodrigues, Liliana; Almeida, Josefina; Bettencourt, Rosário; Couto, Isabel; Carrilho, Lurdes; Diogo, José; Fonseca, Ana; Lito, Luís; Lopes, João; Pacheco, Teresa; Pessanha, Mariana; Quirim, Judite; Sancho, Luísa; Salfinger, Max; Amaral, Leonard

    2005-01-01

    The INNO-LiPA Rif.TB assay for the identification of Mycobacterium tuberculosis complex strains and the detection of rifampin (RIF) resistance has been evaluated with 360 smear-positive respiratory specimens from an area of high incidence of multidrug-resistant tuberculosis (MDR-TB). The sensitivity when compared to conventional identification/culture methods was 82.2%, and the specificity was 66.7%; the sensitivity and specificity were 100.0% and 96.9%, respectively, for the detection of RIF resistance. This assay has the potential to provide rapid information that is essential for the effective management of MDR-TB. PMID:16145166
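
    The reported performance figures follow from the standard definitions; the sketch below shows the arithmetic, using hypothetical cell counts chosen only so that the ratios reproduce the reported 82.2% and 66.7% (the actual 2x2 counts are not given in this record).

        def sensitivity_specificity(tp, fn, tn, fp):
            """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
            return tp / (tp + fn), tn / (tn + fp)

        # Hypothetical counts, for illustration only:
        sens, spec = sensitivity_specificity(tp=148, fn=32, tn=120, fp=60)
        print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")   # 82.2%, 66.7%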

  15. Rapid Prototyping Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The ARDEC Rapid Prototyping (RP) Laboratory was established in December 1992 to provide low cost RP capabilities to the ARDEC engineering community. The Stratasys,...

  16. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  17. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The 'COMBUSTIMETRO' technology evaluates a fuel through the performance of an engine, since the role of the fuel is to produce energy for the combustion engine in an amount that is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel feed and fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  18. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly recognized as the Idaho Chemical Processing Plant) which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  19. Effects of Chain Length and Degree of Unsaturation of Fatty Acids on Structure and in Vitro Digestibility of Starch-Protein-Fatty Acid Complexes.

    Science.gov (United States)

    Zheng, Mengge; Chao, Chen; Yu, Jinglin; Copeland, Les; Wang, Shuo; Wang, Shujun

    2018-02-28

    The effects of chain length and degree of unsaturation of fatty acids (FAs) on structure and in vitro digestibility of starch-protein-FA complexes were investigated in model systems. Studies with the rapid visco analyzer (RVA) showed that the formation of ternary complex resulted in higher viscosities than those of binary complex during the cooling and holding stages. The results of differential scanning calorimetry (DSC), Raman, and X-ray diffraction (XRD) showed that the structural differences for ternary complexes were much less than those for binary complexes. Starch-protein-FA complexes presented lower in vitro enzymatic digestibility compared with starch-FAs complexes. We conclude that shorter chain and lower unsaturation FAs favor the formation of ternary complexes but decrease the thermal stability of these complexes. FAs had a smaller effect on the ordered structures of ternary complexes than on those of binary complexes and little effect on enzymatic digestibility of both binary and ternary complexes.

  20. Rapid Tooling via Stereolithography

    OpenAIRE

    Montgomery, Eva

    2006-01-01

    Approximately three years ago, composite stereolithography (SL) resins were introduced to the marketplace, offering performance features beyond what traditional SL resins could offer. In particular, the high heat deflection temperatures and high stiffness of these highly filled resins have opened the door to several new rapid prototyping (RP) applications, including wind tunnel test modelling and, more recently, rapid tooling.

  1. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  2. Rapid improvement teams.

    Science.gov (United States)

    Alemi, F; Moore, S; Headrick, L; Neuhauser, D; Hekelman, F; Kizys, N

    1998-03-01

    Suggestions, most of which are supported by empirical studies, are provided on how total quality management (TQM) teams can be used to bring about faster organizationwide improvements. Ideas are offered on how to identify the right problem, have rapid meetings, plan rapidly, collect data rapidly, and make rapid whole-system changes. Suggestions for identifying the right problem include (1) postpone benchmarking when problems are obvious, (2) define the problem in terms of customer experience so as not to blame employees nor embed a solution in the problem statement, (3) communicate with the rest of the organization from the start, (4) state the problem from different perspectives, and (5) break large problems into smaller units. Suggestions for having rapid meetings include (1) choose a nonparticipating facilitator to expedite meetings, (2) meet with each team member before the team meeting, (3) postpone evaluation of ideas, and (4) rethink conclusions of a meeting before acting on them. Suggestions for rapid planning include reducing time spent on flowcharting by focusing on the future, not the present. Suggestions for rapid data collection include (1) sample patients for surveys, (2) rely on numerical estimates by process owners, and (3) plan for rapid data collection. Suggestions for rapid organizationwide implementation include (1) change membership on cross-functional teams, (2) get outside perspectives, (3) use unfolding storyboards, and (4) go beyond self-interest to motivate lasting change in the organization. Additional empirical investigations of time saved as a consequence of the strategies provided are needed. If organizations solve their problems rapidly, fewer unresolved problems may remain.

  3. Development of a rapid in vitro protein refolding assay which discriminates between peptide-bound and peptide-free forms of recombinant porcine major histocompatibility class I complex (SLA-I)

    DEFF Research Database (Denmark)

    Oleksiewicz, M.B.; Kristensen, B.; Ladekjaer-Mikkelsen, A.S.

    2002-01-01

    The extracellular domains of swine leukocyte antigen class I (SLA-I, major histocompatibility complex protein class 1) were cloned and sequenced for two haplotypes (114 and H7) which do not share any alleles based on serological typing, and which are the most important in Danish farmed pigs.... The extracellular domain of SLA-I was connected to porcine beta2 microglobulin by glycine-rich linkers. The engineered single-chain proteins, consisting of fused SLA-I and beta2 microglobulin, were overexpressed as inclusion bodies in Escherichia coli. Also, variants were made of the single-chain proteins..., by linking them through glycine-rich linkers to peptides representing T-cell epitopes from classical swine fever virus (CSFV) and foot-and-mouth disease virus (FMDV). An in vitro refold assay was developed, using a monoclonal anti-SLA antibody (PT85A) to gauge refolding. The single best-defined, SLA...

  4. Ranking in evolving complex networks

    Science.gov (United States)

    Liao, Hao; Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng; Zhou, Ming-Yang

    2017-05-01

    Complex networks have emerged as a simple yet powerful framework to represent and analyze a wide range of complex systems. The problem of ranking the nodes and the edges in complex networks is critical for a broad range of real-world problems because it affects how we access online information and products, how success and talent are evaluated in human activities, and how scarce resources are allocated by companies and policymakers, among others. This calls for a deep understanding of how existing ranking algorithms perform, and which are their possible biases that may impair their effectiveness. Many popular ranking algorithms (such as Google's PageRank) are static in nature and, as a consequence, they exhibit important shortcomings when applied to real networks that rapidly evolve in time. At the same time, recent advances in the understanding and modeling of evolving networks have enabled the development of a wide and diverse range of ranking algorithms that take the temporal dimension into account. The aim of this review is to survey the existing ranking algorithms, both static and time-aware, and their applications to evolving networks. We emphasize both the impact of network evolution on well-established static algorithms and the benefits from including the temporal dimension for tasks such as prediction of network traffic, prediction of future links, and identification of significant nodes.
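
    To make the static-versus-time-aware contrast concrete, the editorial sketch below computes ordinary PageRank with NetworkX alongside a simple recency-biased variant whose teleportation vector decays exponentially with node age; the decay form, parameter values, and names are illustrative assumptions, not an algorithm taken from the review.

        import numpy as np
        import networkx as nx

        def static_and_time_aware_pagerank(G, node_time, now, tau=30.0, alpha=0.85):
            """Static PageRank versus a recency-biased personalized PageRank.

            node_time maps each node to its appearance time; tau sets how quickly
            older nodes lose teleportation weight (both are hypothetical inputs).
            """
            static = nx.pagerank(G, alpha=alpha)

            weights = {n: np.exp(-(now - node_time[n]) / tau) for n in G}
            total = sum(weights.values())
            personalization = {n: w / total for n, w in weights.items()}
            time_aware = nx.pagerank(G, alpha=alpha, personalization=personalization)
            return static, time_aware

    Nodes that arrived recently receive a larger share of the teleportation probability in the second ranking, which is one simple way of counteracting the bias of static scores toward old, well-connected nodes.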

  5. Charge Analyzer Responsive Local Oscillations

    Science.gov (United States)

    Krause, Linda Habash; Thornton, Gary

    2015-01-01

    The first transatlantic radio transmission, demonstrated by Marconi in December of 1901, revealed the essential role of the ionosphere for radio communications. This ionized layer of the upper atmosphere controls the amount of radio power transmitted through, reflected off of, and absorbed by the atmospheric medium. Low-frequency radio signals can propagate long distances around the globe via repeated reflections off of the ionosphere and the Earth's surface. Higher frequency radio signals can punch through the ionosphere to be received at orbiting satellites. However, any turbulence in the ionosphere can distort these signals, compromising the performance or even availability of space-based communication and navigation systems. The physics associated with this distortion effect is analogous to the situation when underwater images are distorted by convecting air bubbles. In fact, these ionospheric features are often called 'plasma bubbles' since they exhibit behavior similar to that of underwater air bubbles. These events, instigated by solar and geomagnetic storms, can cause communication and navigation outages that last for hours. To help understand and predict these outages, a world-wide community of space scientists and technologists is devoted to researching this topic. One aspect of this research is to develop instruments capable of measuring the ionospheric plasma bubbles. Figure 1 shows a photo of the Charge Analyzer Responsive to Local Oscillations (CARLO), a new instrument under development at NASA Marshall Space Flight Center (MSFC). It is a frequency-domain ion spectrum analyzer designed to measure the distributions of ionospheric turbulence from 1 Hz to 10 kHz (i.e., spatial scales from a few kilometers down to a few centimeters). This frequency range is important since it focuses on turbulence scales that affect VHF/UHF satellite communications, GPS systems, and over-the-horizon radar systems. CARLO is based on the flight-proven Plasma Local

  6. Rapid response systems.

    Science.gov (United States)

    Lyons, Patrick G; Edelson, Dana P; Churpek, Matthew M

    2018-07-01

    Rapid response systems are commonly employed by hospitals to identify and respond to deteriorating patients outside of the intensive care unit. Controversy exists about the benefits of rapid response systems. We aimed to review the current state of the rapid response literature, including evolving aspects of afferent (risk detection) and efferent (intervention) arms, outcome measurement, process improvement, and implementation. Articles written in English and published in PubMed. Rapid response systems are heterogeneous, with important differences among afferent and efferent arms. Clinically meaningful outcomes may include unexpected mortality, in-hospital cardiac arrest, length of stay, cost, and processes of care at end of life. Both positive and negative interventional studies have been published, although the two largest randomized trials involving rapid response systems - the Medical Early Response and Intervention Trial (MERIT) and the Effect of a Pediatric Early Warning System on All-Cause Mortality in Hospitalized Pediatric Patients (EPOCH) trial - did not find a mortality benefit with these systems, albeit with important limitations. Advances in monitoring technologies, risk assessment strategies, and behavioral ergonomics may offer opportunities for improvement. Rapid responses may improve some meaningful outcomes, although these findings remain controversial. These systems may also improve care for patients at the end of life. Rapid response systems are expected to continue evolving with novel developments in monitoring technologies, risk prediction informatics, and work in human factors. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by that code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  8. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    Snider, D.M.; Wagner, K.L.; Grush, W.H.; Jones, K.R.

    1995-01-01

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of these analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays

  9. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT to collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  10. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
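
    For scale, the complete-enumeration baseline that conditional factoring improves upon can be written directly; the hedged sketch below (hypothetical data structures, independent discrete arc lengths, a path assumed to exist) enumerates every joint realization of a small stochastic network and accumulates the exact distribution of the shortest source-target path length.

        import itertools
        import networkx as nx

        def shortest_path_length_distribution(nodes, arcs, source, target):
            """Exact shortest-path-length distribution by complete enumeration.

            arcs maps (u, v) -> list of (length, probability) outcomes for that arc.
            Feasible only for tiny networks; conditional factoring reduces the
            effort on larger ones by decomposing them into simpler subnetworks.
            """
            arc_list = list(arcs.items())
            dist = {}
            for combo in itertools.product(*[outcomes for _, outcomes in arc_list]):
                prob = 1.0
                G = nx.DiGraph()
                G.add_nodes_from(nodes)
                for ((u, v), _), (length, p) in zip(arc_list, combo):
                    G.add_edge(u, v, weight=length)
                    prob *= p
                sp_len = nx.shortest_path_length(G, source, target, weight="weight")
                dist[sp_len] = dist.get(sp_len, 0.0) + prob
            return dist

        # Hypothetical example:
        # arcs = {("s", "a"): [(1, 0.5), (3, 0.5)], ("a", "t"): [(2, 1.0)], ("s", "t"): [(6, 1.0)]}
        # shortest_path_length_distribution(["s", "a", "t"], arcs, "s", "t") -> {3: 0.5, 5: 0.5}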

  11. Rapid Discrimination for Traditional Complex Herbal Medicines from Different Parts, Collection Time, and Origins Using High-Performance Liquid Chromatography and Near-Infrared Spectral Fingerprints with Aid of Pattern Recognition Methods

    Directory of Open Access Journals (Sweden)

    Haiyan Fu

    2015-01-01

    Full Text Available As an effective method, the fingerprint technique, which emphasizes the whole composition of samples, has already been used in various fields, especially in identifying and assessing the quality of herbal medicines. High-performance liquid chromatography (HPLC) and near-infrared (NIR) spectroscopy, with their characteristic reliability, versatility, precision, and simple measurement, play an important role among the fingerprint techniques. In this paper, a supervised pattern recognition method based on the PLSDA algorithm applied to HPLC and NIR data has been established to identify Hibiscus mutabilis L. and Berberidis radix, two common kinds of herbal medicines. Comparing principal component analysis (PCA), linear discriminant analysis (LDA), and particularly partial least squares discriminant analysis (PLSDA) under different preprocessing of the NIR spectral variables, the PLSDA model performed well in the analysis of the samples as well as the chromatograms. Most importantly, this pattern recognition method based on HPLC and NIR can be used to identify different collection parts, collection times, and different origins, or various species belonging to the same genus of herbal medicine, which proves it to be a promising approach for the identification of the complex information of herbal medicines.
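
    A minimal, hedged PLS-DA sketch in Python: scikit-learn has no dedicated PLS-DA class, so the common workaround of regressing one-hot class labels with PLSRegression and classifying by the largest predicted response is shown; the preprocessing, component count, and names are editorial assumptions, not the paper's workflow.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.preprocessing import StandardScaler

        def plsda_fit_predict(X_train, y_train, X_test, n_components=5):
            """PLS-DA: PLS regression on one-hot class labels, then argmax classification."""
            y_train = np.asarray(y_train)
            classes = np.unique(y_train)
            Y = (y_train[:, None] == classes[None, :]).astype(float)   # one-hot encoding

            scaler = StandardScaler().fit(X_train)
            pls = PLSRegression(n_components=n_components)
            pls.fit(scaler.transform(X_train), Y)

            scores = pls.predict(scaler.transform(X_test))
            return classes[np.argmax(scores, axis=1)]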

  12. Implementation of a gel dosimeter for dosimetric verification of treatments with RapidArcTM

    International Nuclear Information System (INIS)

    Cortes, H.; Vasquez, J.; Plazas, M.

    2014-08-01

    Gel dosimetry offers advantages over other dosimetric systems because of its potential for analyzing information in three dimensions (3D). This work seeks a further alternative for the verification of highly complex treatments such as RapidArc TM. A MAGIC-type gel was prepared and characterized, and was irradiated according to a RapidArc TM plan calculated in the Eclipse Treatment Planning System (TPS) using the Anisotropic Analytical Algorithm (AAA) for a beam with an accelerating potential of 6 MV. The dosimeter was characterized using magnetic resonance images, based on the correlation between T2 and dose. The dose distribution curves were analyzed in two dimensions (2D) using the OmniPro-I'mRT program and were compared with the curves obtained from the TPS using the 2D gamma criterion. The comparison showed that the gel represents a valid option within the acceptable ranges for quality assurance in radiotherapy. (Author)

  13. ITHNA.SYS: An Integrated Thermal Hydraulic and Neutronic Analyzer SYStem for NUR research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Mazidi, S., E-mail: samirmazidi@gmail.com [Division Physique et Applications Nucléaires, Centre de Recherche Nucléaire de Draria (CRND), BP 43 Sebala, Draria, Alger (Algeria); Meftah, B., E-mail: b_meftah@yahoo.com [Division Physique et Applications Nucléaires, Centre de Recherche Nucléaire de Draria (CRND), BP 43 Sebala, Draria, Alger (Algeria); Belgaid, M., E-mail: belgaidm@yahoo.com [Faculté de Physique, Université Houari Boumediene, USTHB, BP 31, Bab Ezzouar, Alger (Algeria); Letaim, F., E-mail: fletaim@yahoo.fr [Faculté des Sciences et Technologies, Université d’El-oued, PO Box 789, El-oued (Algeria); Halilou, A., E-mail: hal_rane@yahoo.fr [Division Réacteur NUR, Centre de Recherche Nucléaire de Draria, BP 43 Sebala, Draria, Alger (Algeria)

    2015-08-15

    Highlights: • We develop a neutronic and thermal hydraulic MTR reactor analyzer. • The analyzer allows a rapid determination of the reactor core parameters. • Some NUR reactor parameters have been analyzed. - Abstract: This paper introduces the Integrated Thermal Hydraulic and Neutronic Analyzer SYStem (ITHNA.SYS) that has been developed for the Algerian research reactor NUR. It is used both as an operating aid tool and as a core physics engineering analysis tool. The system embeds three modules of the MTR-PC software package developed by INVAP SE: the cell calculation code WIMSD, the core calculation code CITVAP and the program TERMIC for thermal hydraulic analysis of a material testing reactor (MTR) core in forced convection. ITHNA.SYS operates both in on-line and off-line modes. In the on-line mode, the system is linked, via the computer parallel port, to the data acquisition console of the reactor control room and allows a real time monitoring of major physical and safety parameters of the NUR core. PC-based ITHNA.SYS provides a viable and convenient way of using an accumulated and often complex reactor physics stock of knowledge and frees the user from the intricacy of adequate reactor core modeling. This guarantees an accurate, though rapid, determination of a variety of neutronic and thermal hydraulic parameters of importance for the operation and safety analysis of the NUR research reactor. Instead of the several hours usually required, the processing time for the determination of such parameters is now reduced to a few seconds. Validation of the system was performed with respect to experimental measurements and to calculations using reference codes. ITHNA.SYS can be easily adapted to accommodate other kinds of MTR reactors.

  14. Single Molecule Analysis Research Tool (SMART): an integrated approach for analyzing single molecule data.

    Directory of Open Access Journals (Sweden)

    Max Greenfeld

    Full Text Available Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
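
    The hidden Markov modeling step can be illustrated outside of SMART itself with the hmmlearn library, used here purely as a stand-in; the two-state Gaussian model, the trace shape, and the parameter choices below are editorial assumptions rather than the package's own fitting routine.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        def fit_two_state_hmm(trace, n_states=2, seed=0):
            """Fit a Gaussian HMM to a single-molecule intensity/FRET trace.

            trace is a 1-D array of observations from one molecule; returns the
            fitted model and the most likely hidden-state path (Viterbi decoding).
            """
            X = np.asarray(trace, dtype=float).reshape(-1, 1)
            model = GaussianHMM(n_components=n_states, covariance_type="full",
                                n_iter=200, random_state=seed)
            model.fit(X)
            return model, model.predict(X)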

  15. Complexity explained

    CERN Document Server

    Erdi, Peter

    2008-01-01

    This book explains why complex systems research is important in understanding the structure, function and dynamics of complex natural and social phenomena. Readers will learn the basic concepts and methods of complex system research.

  16. Test plan for demonstration of Rapid Transuranic Monitoring Laboratory

    International Nuclear Information System (INIS)

    McIsaac, C.V.; Sill, C.W.; Gehrke, R.J.; Killian, E.W.; Watts, K.D.

    1993-06-01

    This plan describes tests to demonstrate the capability of the Rapid Transuranic Monitoring Laboratory (RTML) to monitor airborne alpha-emitting radionuclides and analyze soil, smear, and filter samples for alpha- and gamma-emitting radionuclides under field conditions. The RTML will be tested during June 1993 at a site adjacent to the Cold Test Pit at the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. Measurement systems installed in the RTML that will be demonstrated include two large-area ionization chamber alpha spectrometers, an x-ray/gamma-ray spectrometer, and four alpha continuous air monitors. Test objectives, requirements for data quality, experimental apparatus and procedures, and safety and logistics issues are described

  17. Complex chemistry

    International Nuclear Information System (INIS)

    Kim, Bong Gon; Kim, Jae Sang; Kim, Jin Eun; Lee, Boo Yeon

    2006-06-01

    This book introduces complex chemistry in ten chapters, covering the historical development of complex chemistry, Werner's coordination theory and subsequent developments, the nomenclature of complexes with its concepts and definitions, chemical formulas of coordination compounds, stereochemical notation, stereochemistry and isomerism, electronic structure and bonding theory of complexes, structural characterization of complexes by methods such as NMR and XAFS, equilibria and reactions in solution, organometallic chemistry, bioinorganic chemistry, materials chemistry of complexes, complex design, and computational chemistry.

  18. Rapid world modeling

    International Nuclear Information System (INIS)

    Little, Charles; Jensen, Ken

    2002-01-01

    Sandia National Laboratories has designed and developed systems capable of large-scale, three-dimensional mapping of unstructured environments in near real time. This mapping technique is called rapid world modeling and has proven invaluable when used by prototype systems consisting of sensory detection devices mounted on mobile platforms. These systems can be deployed into previously unmapped environments and transmit real-time 3-D visual images to operators located remotely. This paper covers a brief history of the rapid world modeling system, its implementation on mobile platforms, and the current state of the technology. Applications to the nuclear power industry are discussed. (author)

  19. JINR rapid communications

    International Nuclear Information System (INIS)

    1998-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate records on relativistic multiparticle processes in the central rapidity region at asymptotically high energies, a new experimental study of charged K→3π decays, pre-Cherenkov radiation as a phenomenon of 'light barrier', stable S=-2 H dibaryon found in Dubna, calculation of Green functions and gluon top in some unambiguous gauges, a method of a fast selection of inelastic nucleus-nucleus collisions for the CMS experiment and the manifestation of jet quenching in differential distributions of the total transverse energy in nucleus-nucleus collisions

  20. Rapid microbiology - raising awareness.

    Science.gov (United States)

    Bailie, Jonathan

    2016-01-01

    A 'high-level overview' of some of the emerging rapid microbiology technologies designed to help healthcare engineering and infection control teams working in hospitals and other healthcare facilities more rapidly identify potentially hazardous levels of waterborne microorganisms in their water systems, enabling them to take prompt remedial action, and a look at some of the 'pros and cons' of such testing techniques, was given by Nalco technical director, Howard Barnes, the vice-chair of the Legionella Control Association (LCA), at a recent LCA open day. HEJ editor, Jonathan Bailie, reports.

  1. JINR rapid communications

    International Nuclear Information System (INIS)

    1998-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate records on invisible Z-boson width and restrictions on next-to-minimal supersymmetric standard model, cosmic test of honeycomb drift chambers, fission of 209 Bi, 232 Th, 235 U, 238 U and 237 Np in a spallation neutron field, rapid screening of spontaneous and radiation-induced structural changes at the vestigial gene of Drosophila melanogaster by polymerase chain reaction, gamma-ray multiplicities in sub-barrier fission of 226 Th and the decay constants of the scalar and pseudoscalar mesons in the quark models with quasilocal interaction

  2. Classroom Evaluation of a Rapid Prototyping System.

    Science.gov (United States)

    Tennyson, Stephen A.; Krueger, Thomas J.

    2001-01-01

    Introduces rapid prototyping which creates virtual models through a variety of automated material additive processes. Relates experiences using JP System 5 in freshman and sophomore engineering design graphics courses. Analyzes strengths and limitations of the JP System 5 and discusses how to use it effectively. (Contains 15 references.)…

  3. Navigate the Digital Rapids

    Science.gov (United States)

    Lindsay, Julie; Davis, Vicki

    2010-01-01

    How can teachers teach digital citizenship when the digital landscape is changing so rapidly? How can teachers teach proper online social interactions when the students are outside their classroom and thus outside their control? Will encouraging students to engage in global collaborative environments land teachers in hot water? These are the…

  4. Customer-experienced rapid prototyping

    Science.gov (United States)

    Zhang, Lijuan; Zhang, Fu; Li, Anbo

    2008-12-01

    In order to describe GIS requirements accurately and understand them quickly, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, and proposes the idea of Customer-Experienced Rapid Prototyping (CE-RP), describing its process and framework in detail from the perspective of the characteristics of modern GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is carried out by two roles: customer and developer. The main purpose of the CE-RP is to produce unified and authorized requirements data models shared between the customer and the software developer.

  5. Rapid learning: a breakthrough agenda.

    Science.gov (United States)

    Etheredge, Lynn M

    2014-07-01

    A "rapid-learning health system" was proposed in a 2007 thematic issue of Health Affairs. The system was envisioned as one that uses evidence-based medicine to quickly determine the best possible treatments for patients. It does so by drawing on electronic health records and the power of big data to access large volumes of information from a variety of sources at high speed. The foundation for a rapid-learning health system was laid during 2007-13 by workshops, policy papers, large public investments in databases and research programs, and developing learning systems. Challenges now include implementing a new clinical research system with several hundred million patients, modernizing clinical trials and registries, devising and funding research on national priorities, and analyzing genetic and other factors that influence diseases and responses to treatment. Next steps also should aim to improve comparative effectiveness research; build on investments in health information technology to standardize handling of genetic information and support information exchange through apps and software modules; and develop new tools, data, and information for clinical decision support. Further advances will require commitment, leadership, and public-private and global collaboration. Project HOPE—The People-to-People Health Foundation, Inc.

  6. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    Full Text Available As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes, including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. We are motivated by this rising challenge and build a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool can guide users to find potential anomalies by demonstrating and evaluating it on publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
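
    A hedged sketch of the anomaly-grid idea: records are aggregated into space-time cells, and a cell is flagged when its aggregated value is an outlier relative to that cell's own history. The gridding scheme, z-score threshold, and column names are editorial assumptions, not the tool's algorithm.

        import pandas as pd

        def anomaly_grid(df, value="value", lat="lat", lon="lon", time="time",
                         cell=1.0, freq="D", z_thresh=3.0):
            """Flag space-time cells whose aggregated value is a > z_thresh outlier."""
            d = df.copy()
            d["cell_lat"] = (d[lat] // cell) * cell          # spatial binning
            d["cell_lon"] = (d[lon] // cell) * cell
            d["period"] = pd.to_datetime(d[time]).dt.to_period(freq)   # temporal binning

            agg = (d.groupby(["cell_lat", "cell_lon", "period"])[value]
                     .sum().rename("total").reset_index())
            stats = agg.groupby(["cell_lat", "cell_lon"])["total"].agg(["mean", "std"])
            agg = agg.join(stats, on=["cell_lat", "cell_lon"])
            agg["z"] = (agg["total"] - agg["mean"]) / agg["std"]
            return agg[agg["z"].abs() > z_thresh]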

  7. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command-line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of location and type for protein-protein interaction sites. XML formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.

  8. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

    The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state of the art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes (such as RELAP5 and TRAC-BWR), well-developed computer graphics programs, and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experiment data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures, during off-normal operation, for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  9. Analyzing Preservice Teachers' Attitudes towards Technology

    Science.gov (United States)

    Akturk, Ahmet Oguz; Izci, Kemal; Caliskan, Gurbuz; Sahin, Ismail

    2015-01-01

    Rapid developments in technology in the present age have made it necessary for communities to follow technological developments and adapt themselves to them. One of the fields most rapidly affected by these developments is undoubtedly education. Determination of the attitudes of preservice teachers, who live in an age of…

  10. Rapid genotyping using pyrene-perylene locked nucleic acid complexes

    DEFF Research Database (Denmark)

    Kumar, Santhosh T.; Myznikova, Anna; Samokhina, Evgeniya

    2013-01-01

    We have developed an assay for single strand DNA and RNA detection which is based on novel pyrene-perylene FRET pairs attached to short LNA/DNA probes. The assay is based on ratiometric emission upon binding of target DNA/RNA by three combinations of fluorescent LNA/DNA reporter strands. Specific...... geometry of the pyrene fluorophore attached to the 2'-amino group of 2'-amino-LNA in position 4 allows for the first time to efficiently utilize dipole-dipole orientation parameter for sensing of single-nucleotide polymorphisms (SNPs) in nucleic acid targets by FRET. Using novel probes, SNP detection......-perylene FRET pairs, e.g., in imaging and clinical diagnostics....

  11. Serum metabolomics of slow vs. rapid motor progression Parkinson's disease: a pilot study.

    Directory of Open Access Journals (Sweden)

    James R Roede

    Full Text Available Progression of Parkinson's disease (PD is highly variable, indicating that differences between slow and rapid progression forms could provide valuable information for improved early detection and management. Unfortunately, this represents a complex problem due to the heterogeneous nature of humans with regard to demographic characteristics, genetics, diet, environmental exposures and health behaviors. In this pilot study, we employed high resolution mass spectrometry-based metabolic profiling to investigate the metabolic signatures of slow versus rapidly progressing PD present in human serum. Archival serum samples from PD patients obtained within 3 years of disease onset were analyzed via dual chromatography-high resolution mass spectrometry, with data extraction by xMSanalyzer and used to predict rapid or slow motor progression of these patients during follow-up. Statistical analyses, such as false discovery rate analysis and partial least squares discriminant analysis, yielded a list of statistically significant metabolic features and further investigation revealed potential biomarkers. In particular, N8-acetyl spermidine was found to be significantly elevated in the rapid progressors compared to both control subjects and slow progressors. Our exploratory data indicate that a fast motor progression disease phenotype can be distinguished early in disease using high resolution mass spectrometry-based metabolic profiling and that altered polyamine metabolism may be a predictive marker of rapidly progressing PD.

  12. Serum metabolomics of slow vs. rapid motor progression Parkinson's disease: a pilot study.

    Science.gov (United States)

    Roede, James R; Uppal, Karan; Park, Youngja; Lee, Kichun; Tran, Vilinh; Walker, Douglas; Strobel, Frederick H; Rhodes, Shannon L; Ritz, Beate; Jones, Dean P

    2013-01-01

    Progression of Parkinson's disease (PD) is highly variable, indicating that differences between slow and rapid progression forms could provide valuable information for improved early detection and management. Unfortunately, this represents a complex problem due to the heterogeneous nature of humans with regard to demographic characteristics, genetics, diet, environmental exposures and health behaviors. In this pilot study, we employed high resolution mass spectrometry-based metabolic profiling to investigate the metabolic signatures of slow versus rapidly progressing PD present in human serum. Archival serum samples from PD patients obtained within 3 years of disease onset were analyzed via dual chromatography-high resolution mass spectrometry, with data extraction by xMSanalyzer and used to predict rapid or slow motor progression of these patients during follow-up. Statistical analyses, such as false discovery rate analysis and partial least squares discriminant analysis, yielded a list of statistically significant metabolic features and further investigation revealed potential biomarkers. In particular, N8-acetyl spermidine was found to be significantly elevated in the rapid progressors compared to both control subjects and slow progressors. Our exploratory data indicate that a fast motor progression disease phenotype can be distinguished early in disease using high resolution mass spectrometry-based metabolic profiling and that altered polyamine metabolism may be a predictive marker of rapidly progressing PD.
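    As a rough illustration of the statistical screening described in the two records above (not the authors' actual pipeline), the sketch below runs a per-feature Welch t-test between rapid and slow progressors and applies Benjamini-Hochberg false discovery rate control; the 5% FDR level and the input layout are assumptions.

```python
# Hedged sketch of feature screening: per-feature t-tests followed by
# Benjamini-Hochberg FDR control. Thresholds and data layout are illustrative.
import numpy as np
from scipy import stats

def significant_features(rapid, slow, names, fdr=0.05):
    """rapid, slow: arrays of shape (n_samples, n_features); returns names passing FDR."""
    _, pvals = stats.ttest_ind(rapid, slow, axis=0, equal_var=False)
    m = len(pvals)
    order = np.argsort(pvals)
    keep = np.zeros(m, dtype=bool)
    # Benjamini-Hochberg step-up: find the largest k with p_(k) <= (k/m) * fdr,
    # then keep the k smallest p-values.
    thresh = (np.arange(1, m + 1) / m) * fdr
    passed = np.where(pvals[order] <= thresh)[0]
    if passed.size:
        keep[order[: passed.max() + 1]] = True
    return [name for name, kept in zip(names, keep) if kept]
```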

  13. Rapid road repair vehicle

    Science.gov (United States)

    Mara, Leo M.

    1998-01-01

    Disclosed is a rapid road repair vehicle capable of moving over a surface to be repaired at near normal posted traffic speeds to scan for and find, at that high rate of speed, imperfections in the pavement surface; prepare the surface imperfection for repair by air pressure and vacuum cleaning; apply a correct amount of the correct patching material to effect the repair; smooth the resulting repaired surface; and catalog the location and quality of the repairs for maintenance records of the road surface. The rapid road repair vehicle can repair surface imperfections at lower cost, with improved quality, and at a higher rate of speed than was heretofore possible, with significantly reduced exposure to the safety and health hazards associated with this kind of road repair activity in the past.

  14. Rapidly processable radiographic material

    International Nuclear Information System (INIS)

    Brabandere, L.A. de; Borginon, H.A.; Pattyn, H.A.; Pollet, R.J.

    1981-01-01

    A new rapidly processable radiographic silver halide material is described for use in mammography and non-destructive testing of industrial materials. The radiographic material is used for direct exposure to penetrating radiation without the use of fluorescent-intensifying screens. It consists of a transparent support with a layer of hydrophilic colloid silver halide emulsion on one or both sides. Examples of the preparation of three different silver halide emulsions are given including the use of different chemical sensitizers. These new radiographic materials have good resistance to the formation of pressure marks in rapid processing apparatus and they have improved sensitivity for direct exposure to penetrating radiation compared to conventional radiographic emulsions. (U.K.)

  15. Rapid manufacturing for microfluidics

    CSIR Research Space (South Africa)

    Land, K

    2012-10-01

    Full Text Available Microfluidics refers to the manipulation of very small volumes of fluid.... Microfluidics is at the forefront of developing solutions for drug discovery, diagnostics (from glucose tests to malaria and TB testing) and environmental diagnostics (E-coli monitoring of drinking water). In order to quickly implement new designs, a rapid...

  16. Tiber Personal Rapid Transit

    Directory of Open Access Journals (Sweden)

    Diego Carlo D'agostino

    2011-02-01

    Full Text Available The project "Tiber Personal Rapid Transit" was presented by the author at the Rome City Vision Competition 2010, an ideas competition which challenges architects, engineers, designers, students and creative individuals to develop visionary urban proposals with the intention of stimulating and supporting the contemporary city, in this case Rome. The Tiber PRT proposal tries to answer the competition questions with a provocative idea: a Personal Rapid Transit system on the Tiber river banks. The project is located in the central section of the Tiber river and aims at the renewal of the river banks through the insertion of a Personal Rapid Transit infrastructure. The project area includes the riverbank of the Tiber from the Rome Trastevere RFI station to Piazza del Popolo, an area where the main tourist and leisure attractions are located. The intervention area is currently not used by city users and residents and constitutes a strong barrier in the heart of the historic city.

  17. Rapid MR imaging

    International Nuclear Information System (INIS)

    Edelman, R.R.; Buxton, R.B.; Brady, T.J.

    1988-01-01

    Conventional magnetic resonance (MR) imaging methods typically require several minutes to produce an image, but the periods of respiration, cardiac motion and peristalsis are on the order of seconds or less. The need to reduce motion artifact, as well as the need to reduce imaging time for patient comfort and efficiency, have provided a strong impetus for the development of rapid imaging methods. For abdominal imaging, motion artifacts due to respiration can be significantly reduced by collecting the entire image during one breath hold. For other applications, such as following the kinetics of administered contrast agents, rapid imaging is essential to achieve adequate time resolution. A shorter imaging time entails a cost in image signal/noise (S/N), but improvements in recent years in magnet homogeneity, gradient and radiofrequency coil design have led to steady improvements in S/N and consequently in image quality. For many clinical applications the available S/N is greater than needed, and a trade-off of lower S/N for a shorter imaging time is acceptable. In this chapter, the authors consider the underlying principles of rapid imaging as well as clinical applications of these methods. The bulk of this review concentrates on short TR imaging, but methods that provide a more modest decrease in imaging time, as well as those that dramatically shorten the imaging time to tens of milliseconds, are also discussed.

  18. Rapid classification of biological components

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Vicki S.; Barrett, Karen B.; Key, Diane E.

    2013-10-15

    A method is disclosed for analyzing a biological sample by antibody profiling for identifying forensic samples or for detecting the presence of an analyte. In an illustrative embodiment of the invention, the analyte is a drug, such as marijuana, cocaine (crystalline tropane alkaloid), methamphetamine, methyltestosterone, or mesterolone. The method involves attaching antigens to a surface of a solid support in a preselected pattern to form an array wherein the locations of the antigens are known; contacting the array with the biological sample such that a portion of antibodies in the sample reacts with and binds to antigens in the array, thereby forming immune complexes; washing away antibodies that do not form immune complexes; and detecting the immune complexes, thereby forming an antibody profile. Forensic samples are identified by comparing a sample from an unknown source with a sample from a known source. Further, an assay, such as a test for illegal drug use, can be coupled to a test for identity such that the results of the assay can be positively correlated to a subject's identity.
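    The patent text above describes comparing an unknown sample's antibody profile against a profile from a known source. Purely as an illustration of that comparison step (not the patented method itself), the sketch below encodes a profile as the set of array positions where immune complexes were detected and compares two profiles with a Jaccard similarity; the 0.8 match threshold is invented.

```python
# Illustrative profile comparison; the encoding and threshold are assumptions,
# not part of the disclosed method.
def jaccard(profile_a, profile_b):
    """profile_a, profile_b: sets of array positions where immune complexes were detected."""
    if not profile_a and not profile_b:
        return 1.0
    return len(profile_a & profile_b) / len(profile_a | profile_b)

def same_source(unknown, known, threshold=0.8):
    """Declare a match when the profiles overlap strongly enough."""
    return jaccard(unknown, known) >= threshold

unknown_sample = {1, 4, 7, 9, 12}
known_sample = {1, 4, 7, 9, 12, 15}
print(same_source(unknown_sample, known_sample))  # True: Jaccard ~ 0.83 exceeds 0.8
```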

  19. Rapid classification of biological components

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Vicki S. (Idaho Falls, ID); Barrett, Karen B. (Meridian, ID); Key, Diane E. (Idaho Falls, ID)

    2010-03-23

    A method is disclosed for analyzing a biological sample by antibody profiling for identifying forensic samples or for detecting the presence of an analyte. In an illustrative embodiment of the invention, the analyte is a drug, such as marijuana, cocaine (crystalline tropane alkaloid), methamphetamine, methyltestosterone, or mesterolone. The method involves attaching antigens to a surface of a solid support in a preselected pattern to form an array wherein the locations of the antigens are known; contacting the array with the biological sample such that a portion of antibodies in the sample reacts with and binds to antigens in the array, thereby forming immune complexes; washing away antibodies that do not form immune complexes; and detecting the immune complexes, thereby forming an antibody profile. Forensic samples are identified by comparing a sample from an unknown source with a sample from a known source. Further, an assay, such as a test for illegal drug use, can be coupled to a test for identity such that the results of the assay can be positively correlated to a subject's identity.

  20. Rapid classification of biological components

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Vicki S. (Idaho Falls, ID); Barrett, Karen B. (Meridian, ID); Key, Diane E. (Idaho Falls, ID)

    2010-03-23

    A method is disclosed for analyzing a biological sample by antibody profiling for identifying forensic samples or for detecting the presence of an analyte. In an illustrative embodiment of the invention, the analyte is a drug, such as marijuana, cocaine (crystalline tropane alkaloid), methamphetamine, methyltestosterone, or mesterolone. The method involves attaching antigens to a surface of a solid support in a preselected pattern to form an array wherein the locations of the antigens are known; contacting the array with the biological sample such that a portion of antibodies in the sample reacts with and binds to antigens in the array, thereby forming immune complexes; washing away antibodies that do not form immune complexes; and detecting the immune complexes, thereby forming an antibody profile. Forensic samples are identified by comparing a sample from an unknown source with a sample from a known source. Further, an assay, such as a test for illegal drug use, can be coupled to a test for identity such that the results of the assay can be positively correlated to a subject's identity.

  1. Rapid classification of biological components

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Vicki S.; Barrett, Karen B.; Key, Diane E.

    2006-01-24

    A method is disclosed for analyzing a biological sample by antibody profiling for identifying forensic samples or for detecting the presence of an analyte. In an illustrative embodiment of the invention, the analyte is a drug, such as marijuana, cocaine, methamphetamine, methyltestosterone, or mesterolone. The method involves attaching antigens to the surface of a solid support in a preselected pattern to form an array wherein the locations of the antigens are known; contacting the array with the biological sample such that a portion of antibodies in the sample reacts with and binds to antigens in the array, thereby forming immune complexes; washing away antibodies that do not form immune complexes; and detecting the immune complexes, thereby forming an antibody profile. Forensic samples are identified by comparing a sample from an unknown source with a sample from a known source. Further, an assay, such as a test for illegal drug use, can be coupled to a test for identity such that the results of the assay can be positively correlated to the subject's identity.

  2. (II) complexes

    African Journals Online (AJOL)

    activities of Schiff base tin (II) complexes. Neelofar1 ... Conclusion: All synthesized Schiff bases and their Tin (II) complexes showed high antimicrobial and ...... Singh HL. Synthesis and characterization of tin (II) complexes of fluorinated Schiff bases derived from amino acids. Spectrochim Acta Part A: Molec Biomolec.

  3. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    International Nuclear Information System (INIS)

    Daily, Jeffrey A.

    2015-01-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore's law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or 'homologous') on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores

  4. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore’s law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores
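    The dissertation abstracts above refer to optimal pairwise alignment as the basis for homology detection. For readers unfamiliar with it, the sketch below is a minimal serial Smith-Waterman local alignment score computed in linear memory; the scoring scheme is illustrative and unrelated to the dissertation's parallel implementation or parameters.

```python
# Minimal Smith-Waterman local alignment score (score only, linear memory).
# Scoring scheme (match=2, mismatch=-1, gap=-1) is illustrative.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, rows):
        curr = [0] * cols
        for j in range(1, cols):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # local alignment: scores never drop below zero
            curr[j] = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))  # 12 for these toy sequences
```

    Running this exhaustively over all pairs of millions of sequences is exactly the quadratic workload that motivates the work-stealing, data-replication and exact-matching-filter techniques described above.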

  5. Analyzing wildfire exposure on Sardinia, Italy

    Science.gov (United States)

    Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella

    2014-05-01

    We used simulation modeling based on the minimum travel time algorithm (MTT) to analyze wildfire exposure of key ecological, social and economic features on Sardinia, Italy. Sardinia is the second largest island of the Mediterranean Basin, and in the last fifty years experienced large and dramatic wildfires, which caused losses and threatened urban interfaces, forests and natural areas, and agricultural production. Historical fires and environmental data for the period 1995-2009 were used as input to estimate fine scale burn probability, conditional flame length, and potential fire size in the study area. With this purpose, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distribution of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92x10^-3, with a mean value of 6.48x10^-5. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in modeled outputs revealed a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat areas were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to inform wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean basin.
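    As a schematic of how burn probability can be estimated from repeated fire-spread simulations like those described above, the sketch below counts, for each landscape cell, the fraction of simulated fires in which it burned. The MTT spread model itself is abstracted away behind a function, and the toy fire generator is purely illustrative.

```python
# Hedged sketch: per-cell burn probability as the fraction of simulated fires that
# burned the cell. The spread model is a stand-in, not the MTT algorithm.
from collections import Counter
import random

def estimate_burn_probability(simulate_fire, n_fires=100_000, seed=0):
    """simulate_fire(rng) -> set of burned (row, col) cells for one random ignition/weather draw."""
    rng = random.Random(seed)
    burned_counts = Counter()
    for _ in range(n_fires):
        for cell in simulate_fire(rng):
            burned_counts[cell] += 1
    return {cell: count / n_fires for cell, count in burned_counts.items()}

def toy_fire(rng):
    """Toy stand-in for a fire spread simulator: burns a small square at a random ignition point."""
    r, c = rng.randrange(50), rng.randrange(50)
    return {(r + dr, c + dc) for dr in range(3) for dc in range(3)}

bp = estimate_burn_probability(toy_fire, n_fires=1000)
print(max(bp.values()))  # highest estimated burn probability on the toy landscape
```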

  6. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers whose interests are tangled and vary widely across academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc.) and non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also has a different academic stance in his or her preferences about academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurturing young researchers, to improving the standard of research and education, and to boosting academia-industry collaboration. In particular, as open innovation is increasingly in need of the involvement of university researchers, to establish a successful approach to entice researchers into enterprises' research, companies must comprehend the behavior of university researchers, who have multiple complex motivations. The paper explores academic researchers' behavior through optimizing their utility functions, i.e., the satisfaction obtained from their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort to do the research), and Contribution (finding meaning in the research). Most of the previous research utilized empirical methods to study researcher motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researcher behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researcher behavior. Keywords: Academia-Industry, researcher behavior, Ulrich model's 3C.
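    To make the utility-maximization framing concrete, the sketch below is a hypothetical toy model, not the paper's actual formulation: research output is a Cobb-Douglas function of the 3C, and a grid search finds the effort split across Competence, Commitment and Contribution that maximizes it under a fixed effort budget. The exponents and budget are invented.

```python
# Toy utility-maximization over the 3C; functional form, exponents and budget are
# hypothetical illustrations only.
import itertools

def utility(competence, commitment, contribution, alpha=(0.4, 0.4, 0.2)):
    """Research output (taken here as the researcher's utility) as a Cobb-Douglas function of the 3C."""
    return (competence ** alpha[0]) * (commitment ** alpha[1]) * (contribution ** alpha[2])

def best_effort_allocation(budget=10, step=1):
    """Grid-search the effort split across the 3C that maximizes utility."""
    best = None
    for comp, comm in itertools.product(range(step, budget, step), repeat=2):
        contrib = budget - comp - comm
        if contrib <= 0:
            continue
        u = utility(comp, comm, contrib)
        if best is None or u > best[0]:
            best = (u, comp, comm, contrib)
    return best

# With these settings the optimum allocation mirrors the exponents: (4, 4, 2).
print(best_effort_allocation())
```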

  7. Communication complexity and information complexity

    Science.gov (United States)

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result does not only strengthen the lower bound on communication complexity of disjointness by making it more exact, but it also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information
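    For readers unfamiliar with the central quantity of the thesis, the standard definition of internal information cost and information complexity of a two-party function can be written as follows (notation may differ slightly from the thesis):

```latex
% Internal information cost of a protocol \pi for f(X,Y) under input distribution \mu,
% and the information complexity of f at error \varepsilon.
\[
  \mathrm{IC}_\mu(\pi) \;=\; I(\Pi ; X \mid Y) \;+\; I(\Pi ; Y \mid X),
\qquad
  \mathrm{IC}_\mu(f,\varepsilon) \;=\; \inf_{\substack{\pi \text{ computes } f \\ \text{with error} \le \varepsilon}} \mathrm{IC}_\mu(\pi),
\]
```

    where \(\Pi\) denotes the protocol transcript; intuitively, it measures how much each party learns about the other's input from the conversation.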

  8. JINR rapid communications

    International Nuclear Information System (INIS)

    1996-01-01

    The present collection of rapid communications from JINR, Dubna, contains five separate reports on analytic QCD running coupling with finite IR behaviour and universal ᾱ_s(0) value, quark condensate in the interacting pion-nucleon medium at finite temperature and baryon number density, γ-π⁰ discrimination with a shower maximum detector using neural networks for the solenoidal tracker at RHIC, off-specular neutron reflection from magnetic media with nondiagonal reflectivity matrices and molecular cytogenetics of radiation-induced gene mutations in Drosophila melanogaster. 21 fig., 1 tab

  9. JINR rapid communications

    International Nuclear Information System (INIS)

    1999-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate records on additional conditions on eigenvectors in solving inverse problem for two-dimensional Schroedinger equation, on an absolute calibration of deuteron beam polarization at LHE, determination of the vector component of the polarization of the JINR synchrophasotron deuteron beam, wavelet-analysis: criterion of reliable signal selection, on asymptotics in inclusive production of antinuclei and nuclear fragments, use of neutron activation analysis at the IBR-2 reactor for atmospheric monitoring and impulse method for temperature measurement of silicon detectors

  10. JINR rapid communications

    International Nuclear Information System (INIS)

    1995-01-01

    The present collection of rapid communications from JINR, Dubna, contains six separate reports on Monte Carlo simulation of silicon detectors for the ALICE experiment at LHC, a study of single tagged multihadronic γγ* events at an average Q² of 90 GeV², epithermal neutron activation analysis of moss, lichen and pine needles in atmospheric deposition monitoring, the theory of neutrino oscillation, coupled quadrupole and monopole vibrations of large amplitude and test of the Ellis-Jaffe sum rule using parametrization of the measured lepton-proton asymmetry. 21 figs., 18 tabs

  11. Rapidly variable relativistic absorption

    Science.gov (United States)

    Parker, M.; Pinto, C.; Fabian, A.; Lohfink, A.; Buisson, D.; Alston, W.; Jiang, J.

    2017-10-01

    I will present results from the 1.5Ms XMM-Newton observing campaign on the most X-ray variable AGN, IRAS 13224-3809. We find a series of nine absorption lines with a velocity of 0.24c from an ultra-fast outflow. For the first time, we are able to see extremely rapid variability of the UFO features, and can link this to the X-ray variability from the inner accretion disk. We find a clear flux dependence of the outflow features, suggesting that the wind is ionized by increasing X-ray emission.

  12. Rapid Transformation in a Dual Identity Defense University

    National Research Council Canada - National Science Library

    Sekerka, Leslie E; Zolin, Roxanne; Simon, Cary

    2005-01-01

    .... Transcripts were thematically analyzed. The findings contributed to the development of a model to depict the effects of a specialized management identity that employs a deletion strategy using coercion to effect rapid transformation...

  13. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  14. NRC nuclear-plant-analyzer concept and status at INEL

    International Nuclear Information System (INIS)

    Aguilar, F.; Wagner, R.J.

    1982-01-01

    The Office of Research of the US NRC has proposed development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes how we at the INEL envision the nuclear-plant analyzer. The paper also describes a pilot RELAP5 plant-analyzer project completed during the past year and current work. A great deal of analysis is underway to determine nuclear-steam-system response. System transient analysis being so complex, there is the need to present analytical results in a way that interconnections among phenomena and all the nuances of the transient are apparent. There is the need for the analyst to dynamically control system calculations to simulate plant operation in order to perform "what if" studies, as well as the need to perform system analysis within hours of a plant emergency to diagnose the state of the stricken plant and formulate recovery actions. The NRC-proposed nuclear-plant analyzer can meet these needs

  15. Complexity Plots

    KAUST Repository

    Thiyagalingam, Jeyarajan

    2013-06-01

    In this paper, we present a novel visualization technique for assisting the observation and analysis of algorithmic complexity. In comparison with conventional line graphs, this new technique is not sensitive to the units of measurement, allowing multivariate data series of different physical quantities (e.g., time, space and energy) to be juxtaposed together conveniently and consistently. It supports multivariate visualization as well as uncertainty visualization. It enables users to focus on algorithm categorization by complexity classes, while reducing visual impact caused by constants and algorithmic components that are insignificant to complexity analysis. It provides an effective means for observing the algorithmic complexity of programs with a mixture of algorithms and black-box software through visualization. Through two case studies, we demonstrate the effectiveness of complexity plots in complexity analysis in research, education and application. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and Blackwell Publishing Ltd.

  16. Rapid automated nuclear chemistry

    International Nuclear Information System (INIS)

    Meyer, R.A.

    1979-01-01

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC

  17. JINR rapid communications

    International Nuclear Information System (INIS)

    1999-01-01

    The present collection of rapid communications from JINR, DUBNA, contains eight separate records on symmetry in modern physics (dedicated to the 100th anniversary of the birth of academician V.A.Fock), the double φ-meson production investigation on the Serpukhov accelerator, two-leptonic η-meson decays and SUSY without R parity, charge form factors and alpha-cluster internal structure of ¹²C, increasing of muon-track reconstruction efficiency in ME1/1 Dubna prototype for the CMS/LHC, study of photon-structure function F_2^γ in the reaction e⁺e⁻ → e⁺e⁻ + hadrons at LEP2, jets reconstruction possibility in pAu and AuAu interactions at STAR RHIC and high-vacuum nondispersable gas absorber

  18. Rapid thermal pulse annealing

    International Nuclear Information System (INIS)

    Miller, M.G.; Koehn, B.W.; Chaplin, R.L.

    1976-01-01

    Characteristics of recovery processes have been investigated for cases of heating a sample to successively higher temperatures by means of isochronal annealing or by using rapid pulse annealing. The recovery spectrum shows the same features independent of which annealing procedure is used. In order to determine which technique provides the best resolution, a study was made of how two independent first-order processes are separated for different heating rates and time increments of the annealing pulses. It is shown that the pulse anneal method offers definite advantages over isochronal annealing when annealing for short time increments. Experimental data obtained by means of the pulse anneal technique are given for the various substages of stage I of aluminium. (author)

  19. JINR rapid communications

    International Nuclear Information System (INIS)

    1996-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate reports on the identification of events with a secondary vertex in the experiment EXCHARM, the zero degree calorimeter for CERN WA-98 experiment, a new approach to increase the resource of installation elements for super-high energy physics, a method of the in-flight production of exotic systems in the charge-exchange reactions, the neutron activation analysis for monitoring northern terrestrial ecosystems, a search for ²⁸O and study of the neutron-rich nuclei near the neutron closure N=20, a search for new neutron-rich nuclei with a 70A MeV ⁴⁸Ca beam. 33 figs., 4 tabs

  20. JINR Rapid Communications. Collection

    International Nuclear Information System (INIS)

    1994-01-01

    The present collection of rapid communications from JINR, Dubna, contains nine separate reports on quasi-classical description of one-nucleon transfer reactions with heavy ions, elastic and inelastic scattering in the high energy approximation, experimental study of fission and evaporation cross sections for the ⁶He + ²⁰⁹Bi reaction, d↑ + ¹²C → p + X at Θ_p = 0° in the region of high internal momenta in the deuteron, the Nuclotron internal targets, actively screened superconducting magnets, the use of a polarized target in backward elastic dp scattering, application of transputers in the data acquisition system of the INESS-ALPHA spectrometer, narrow dibaryon resonances with isotopic spin I=2. 93 refs., 27 figs., 4 tabs

  1. JINR Rapid Communications. Collection

    International Nuclear Information System (INIS)

    1994-01-01

    The present collection of rapid communications from JINR, Dubna, contains eight separate reports on Lorentz transformations with superluminal velocities, the photochromic effect in HTSC films, the investigation of hypernuclei at the Nuclotron accelerator, a new hadron jet finding algorithm in the four-dimensional velocity space, investigations of neutral particle production by relativistic nuclei on the LHE 90-channel γ-spectrometer (results and perspectives), coherent meson production in the dp → ³HeX reaction, the relativistic projectile nuclei fragmentation and A-dependence of nucleon Fermi-momenta, energy spectra of γ-quanta from d-propane interactions at momentum P_d = 1.25 GeV/c per nucleon. 86 refs., 26 figs., 4 tabs

  2. JINR rapid communications

    International Nuclear Information System (INIS)

    1999-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate records on measurements of the total cross section difference Δσ_L(np) at 1.59, 1.79, and 2.20 GeV, the estimation of angular distributions of double charged spectator fragments in nucleus-nucleus interactions at superhigh energies, simulation dE/dx analysis results for the silicon inner tracking system of the ALICE set-up at the LHC accelerator, high-multiplicity processes, triggering of high-multiplicity events using calorimetry, ORBIT-3.0 - a computer code for simulation and correction of the closed orbit and first turn in synchrotrons, and determination of memory performance

  3. JINR rapid communications

    International Nuclear Information System (INIS)

    1999-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate records on yields of the rare-earth neutron-deficient isotopes in the reactions of Mo isotopes with ⁴⁰Ca ions, observations of slow components of solitonic-type wave structure excited by an e-beam in a massive copper sample, development and investigation of low-mass multilayer drift chambers (MDC-2) for the inner part of the HADES spectrometer, temperature measurement of the uranium sample irradiated with secondary neutrons, edge effects in multiwire proportional chambers, the influence of the dielectric frame, an object-oriented framework for the hadronic Monte-Carlo event generators and uranium-238 as a source for electronuclear power production. 32 figs., 3 tabs

  4. JINR rapid communications

    International Nuclear Information System (INIS)

    1997-01-01

    The present collection of rapid communications from JINR, Dubna, contains nine separate reports on collective energy dissipation and fluctuations in elastoplastic systems, diagnostics system of the circulating beam of the NUCLOTRON based on microchannel plates, time-of-flight detector for WA98 CERN experiment, fractal structure formation on the surfaces of solids subjected to high intensity electron and ion treatment, production of nuclei in ³²,³⁴,³⁶S-induced reactions in the energy range 6-75 MeV/A, rare-earth elements in soil and pine needle from northern terrestrial ecosystems, 'thermal' multifragmentation in p + Au collisions at relativistic energies, search for effects of the OZI rule violation in φ and ω mesons production in polarized deuteron beam interaction with polarized proton target (project DPHE3) and fast detector for triggering on charged particle multiplicity for relativistic nucleus-nucleus collisions

  5. JINR rapid communications

    International Nuclear Information System (INIS)

    1997-01-01

    The present collection of rapid communications from JINR, Dubna, contains seven separate reports on observation of transversal handedness in the diffractive production of pion triples, a possible experiment on the research of dibaryon states, the Cherenkov beam counter system of the CERES/NA45 spectrometer for investigations with 160 GeV/n lead ions, a profile-based gaseous detector with capacitive pad readout as the prototype of the shower maximum detector for the end-cap electromagnetic calorimeter for the STAR experiment, what DELPHI can get with an upgraded position for the very small angle tagger, estimation of the radiation environment and the shielding aspect for the point 2 area of the LHC and the orthopositronium decay puzzle

  6. Rapid chemical separations

    CERN Document Server

    Trautmann, N

    1976-01-01

    A survey is given on the progress of fast chemical separation procedures during the last few years. Fast, discontinuous separation techniques are illustrated by a procedure for niobium. The use of such techniques for the chemical characterization of the heaviest known elements is described. Other rapid separation methods from aqueous solutions are summarized. The application of the high speed liquid chromatography to the separation of chemically similar elements is outlined. The use of the gas jet recoil transport method for nuclear reaction products and its combination with a continuous solvent extraction technique and with a thermochromatographic separation is presented. Different separation methods in the gas phase are briefly discussed and the attachment of a thermochromatographic technique to an on-line mass separator is shown. (45 refs).

  7. JINR rapid communications

    International Nuclear Information System (INIS)

    1997-01-01

    The present collection of rapid communications from JINR, Dubna, contains nine separate reports on effects arising from charged particles overcoming the light velocity barrier, deformable templates for circle recognition, scintillation detectors for precise time measurements, atomic form factors and incoherent scattering functions of atoms and ions with the number of electrons N ≤ 10, experimental set-up ANOMALON for measurement of relativistic nuclear fragmentation cross sections, superconducting dipole magnet for ALICE dimuon arm spectrometer, analysis of transverse mass dependence of Bose-Einstein correlation radii using the DELPHI data, low-energy theorem in softly broken supersymmetry and study of the characteristics of particles in reactions π⁻, p, d, He, C + C with total disintegration of the carbon nucleus

  8. JINR rapid communications

    International Nuclear Information System (INIS)

    1998-01-01

    The present collection of rapid communications from JINR, Dubna, contains six separate records on test of a threshold aerogel Cherenkov counter with cosmic particles, first results of a study of the transversal dimension of the region of cumulative particle production in d + C and d + Cu reactions at an energy of 2 GeV/nucleon, the evidence of the σ[0⁺(0⁺⁺)] meson at a mass of M_π⁺π⁻ = 750 ± 5 MeV/c² observed in π⁺π⁻ combinations from the reaction np → npπ⁺π⁻ at an incident momentum of P_n = 5.20 ± 0.16 GeV/c, inclusive spectra of protons and π⁻ mesons emitted in ⁴HeC and ¹²CC interactions with total disintegration of nuclei, heavy quark-antiquark pair production by double pomeron exchange in pp and AA collisions at the CMS and global features of nucleus-nucleus collisions in the ultrarelativistic domain

  9. Wind energy system time-domain (WEST) analyzers

    Science.gov (United States)

    Dreier, M. E.; Hoffman, J. A.

    1981-01-01

    A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high speed digital controllers and analog calculation. This model was combined with other math models of elastic supports, control systems, a power train and gimballed rotor kinematics. A stroboscopic display system graphically depicting distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube is included. Limited correlation efforts showed good comparison between the results of this analyzer and other sophisticated digital simulations. The digital simulation results were successfully correlated with test data.

  10. Diversity, Community Composition, and Dynamics of Nonpigmented and Late-Pigmenting Rapidly Growing Mycobacteria in an Urban Tap Water Production and Distribution System

    OpenAIRE

    Dubrou, S.; Konjek, J.; Macheras, E.; Welté, B.; Guidicelli, L.; Chignon, E.; Joyeux, M.; Gaillard, J. L.; Heym, B.; Tully, T.; Sapriel, G.

    2013-01-01

    Nonpigmented and late-pigmenting rapidly growing mycobacteria (RGM) have been reported to commonly colonize water production and distribution systems. However, there is little information about the nature and distribution of RGM species within the different parts of such complex networks or about their clustering into specific RGM species communities. We conducted a large-scale survey between 2007 and 2009 in the Parisian urban tap water production and distribution system. We analyzed 1,418 w...

  11. Rapid Refresh (RAP) [13 km

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Rapid Refresh (RAP) numerical weather model took the place of the Rapid Update Cycle (RUC) on May 1, 2012. Run by the National Centers for Environmental...

  12. Rapid Refresh (RAP) [20 km

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Rapid Refresh (RAP) numerical weather model took the place of the Rapid Update Cycle (RUC) on May 1, 2012. Run by the National Centers for Environmental...

  13. Rapid population growth.

    Science.gov (United States)

    1972-01-01

    At the current rate of population growth, world population by 2000 is expected to reach 7 billion or more, with developing countries accounting for some 5.4 billion, and economically advanced nations accounting for 1.6 billion. The 'population explosion' is the result of falling mortality rates and continuing high birth rates. Many European countries, and Japan, have already completed what is termed the demographic transition, that is, birth rates have fallen to below 20 births per 1000 population, death rates to 10/1000 population, and annual growth rates are 1% or less; annual growth rates for less developed countries ranged from 2 to 3.5%. Less developed countries can be divided into 3 groups: 1) countries with both high birth and death rates; 2) countries with high birth rates and low death rates; and 3) countries with intermediate and declining birth rates and low death rates. Rapid population growth has serious economic consequences. It encourages inequities in income distribution; it limits the rate of growth of gross national product by holding down the level of savings and capital investments; it exerts pressure on agricultural production and land; and it creates unemployment problems. In addition, the quality of education for an increasing number of children is adversely affected, as high proportions of children reduce the amount that can be spent on the education of each child out of the educational budget; the cost and adequacy of health and welfare services are affected in a similar way. Other serious consequences of rapid population growth are maternal death and illness, and physical and mental retardation of children of very poor families. It is very urgent that over a billion births be prevented in the next 30 years to reduce the annual population growth rate from the current 2% to 1% per year.
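    The arithmetic behind the rates quoted above can be made explicit: the natural growth rate is the gap between the crude birth and death rates, and the doubling time follows from exponential growth.

```latex
% Natural growth rate from crude birth and death rates, and the implied doubling time.
\[
  r \;=\; \frac{20 - 10}{1000} \;=\; 0.01 \;=\; 1\%\ \text{per year},
\qquad
  T_{\text{double}} \;\approx\; \frac{\ln 2}{r} \;\approx\; 69\ \text{years},
\]
\[
  \text{whereas } r = 2\%\text{--}3.5\% \ \Rightarrow\ T_{\text{double}} \approx 20\text{--}35\ \text{years}.
\]
```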

  14. Rapidly developing market regions : Brazil

    International Nuclear Information System (INIS)

    Britto, A.

    1997-01-01

    Brazil and the State of Rio Grande do Sul are experiencing a period of rapid industrial development. Global investment has been forecast to reach $240 billion over the next five to seven years. This level of development is likely to result in a sharp increase in the consumption of plastic products made from olefins and from aromatic products. Accordingly, Copesul, the centre of raw materials for the State complex, is expected to increase its production of ethane from 685 tonnes to 1.13 million tonnes after 1999. The government has established a program of incentives to stimulate investment in third generation industries. Also, the State petrochemical industry has been rendered more competitive as a result of the purchase of the latest generation equipment. The principal challenges that exist for the petrochemical industry in Brazil and for that matter, around the world, are to reduce production costs and to preserve the natural environment. Another challenge, also world-wide, is to address the issue of plastic residues and to eliminate such residues through plastic recycling programs

  15. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    Full Text Available Purpose of the study. Scientific and educational organizations traditionally use e-mail with Microsoft Excel spreadsheets and Microsoft Word documents for operational data collection. The disadvantages of this approach include the lack of control over the correctness of the data input and the complexity of processing the information received due to the non-relational data model. There are online services that make it possible to organize data collection in relational form. The disadvantages of these systems are the absence of thesaurus support, a limited set of data input controls, limited control over the operation of the input form, and the fact that most of the systems are shareware. Thus, an Internet technology for data collection and analysis is required that allows the data model to be defined quickly and data collection to be implemented automatically in accordance with this model. Materials and methods. The article describes the technology developed and tested for operational data collection and analysis using the "Faramant" system. Operation of the "Faramant" system is based on a document model, which includes three components: a description of the data structure, its visualization, and the logic of how the form works. All stages of the technology are performed by the user in the browser. The main stage of the proposed technology is the definition of the data model as a set of relational tables. To create a table within the system, the user specifies its name and a list of fields. For each field, the user specifies its name, the control used for data input, and the logic of its operation. Controls are used to ensure correct data input depending on the data type. Based on the model, the "Faramant" system automatically creates an input form that users can use to enter information. To change the form visualization, you can use the form template. The data can be viewed page by page in a table. For table rows, you can apply different filters. To
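    As a rough sketch of the document-model idea described above, a relational table can be declared as a name plus typed fields with input controls, against which submitted rows are validated. All table, field and control names here are hypothetical, not the actual Faramant schema.

```python
# Hypothetical table declaration and row validation in the spirit of the described
# document model; names, controls and limits are invented for illustration.
publications_table = {
    "name": "publications",
    "fields": [
        {"name": "title",   "control": "text",     "required": True},
        {"name": "year",    "control": "integer",  "required": True, "min": 1900, "max": 2100},
        {"name": "journal", "control": "dropdown", "required": False,
         "choices": ["Journal A", "Journal B", "Other"]},
    ],
}

def validate_row(table, row):
    """Return a list of validation errors for one submitted row."""
    errors = []
    for field in table["fields"]:
        value = row.get(field["name"])
        if value is None:
            if field["required"]:
                errors.append(f"{field['name']} is required")
            continue
        if field["control"] == "integer":
            if not isinstance(value, int) or not (field["min"] <= value <= field["max"]):
                errors.append(f"{field['name']} must be an integer in [{field['min']}, {field['max']}]")
        elif field["control"] == "dropdown" and value not in field["choices"]:
            errors.append(f"{field['name']} must be one of {field['choices']}")
    return errors

print(validate_row(publications_table, {"title": "On data collection", "year": 2016}))  # []
```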

  16. Correlation of mitochondrial protein expression in complexes I to V with natural and induced forms of canine idiopathic dilated cardiomyopathy.

    Science.gov (United States)

    Lopes, Rosana; Solter, Philip F; Sisson, D David; Oyama, Mark A; Prosek, Robert

    2006-06-01

    To identify qualitative and quantitative differences in cardiac mitochondrial protein expression in complexes I to V between healthy dogs and dogs with natural or induced dilated cardiomyopathy (DCM). Left ventricle samples were obtained from 7 healthy dogs, 7 Doberman Pinschers with naturally occurring DCM, and 7 dogs with DCM induced by rapid right ventricular pacing. Fresh and frozen mitochondrial fractions were isolated from the left ventricular free wall and analyzed by 2-dimensional electrophoresis. Protein spots that increased or decreased in density by 2-fold or greater between groups were analyzed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry or quadrupole selecting, quadrupole collision cell, time-of-flight mass spectrometry. A total of 22 altered mitochondrial proteins were identified in complexes I to V. Ten and 12 were found in complex I and complexes II to V, respectively. Five were mitochondrial encoded, and 17 were nuclear encoded. Most altered mitochondrial proteins in tissue specimens from dogs with naturally occurring DCM were associated with complexes I and V, whereas in tissue specimens from dogs subjected to rapid ventricular pacing, complexes I and IV were more affected. In the experimentally induced form of DCM, only nuclear-encoded subunits were changed in complex I. In both disease groups, the 22-kd subunit was downregulated. Natural and induced forms of DCM resulted in altered mitochondrial protein expression in complexes I to V. However, subcellular differences between the experimental and naturally occurring forms of DCM may exist.

  17. Modeling Complex Time Limits

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2013-01-01

    Full Text Available In this paper we analyze the complexity of time limits that can be found especially in the regulated processes of public administration. First we review the most popular process modeling languages. An example scenario based on current Czech legislation is defined and then captured in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support the capturing of time limits only partially. This causes trouble for analysts and unnecessary complexity in the models. Given the unsatisfying results of the contemporary process modeling languages, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages, the PSD process modeling language is presented, which supports the defined lifecycles of a time limit natively and therefore allows keeping the models simple and easy to understand.

  18. Spectroscopic studies of fluorescent complexes of tyrosine 8-hydroxyquinoline and tyrosine-8-hydroxyquinaldine in aqueous phase

    International Nuclear Information System (INIS)

    Jakhrani, M.A.; Kazi, T.G.

    2002-01-01

    A new method has been developed by preparing complexes through the condensation of tyrosine with 8-hydroxyquinoline (Oxine) and 8-hydroxyquinaldine (Quinaldine), respectively, producing fluorescent products. The products obtained have been investigated for identification and quantitative estimation using different spectroscopic techniques, including the fluorescence activity of the newly synthesized products. 8-hydroxyquinaldine and 8-hydroxyquinoline (Oxine) condensed with tyrosine separately produce water-soluble fluorescent complexes. The complexes have been investigated for the identification and quantitative estimation of amino acids. Identification of amino acids at the nanomole level or below has become possible through the fluorometric activity of these complexes at different excitation and emission wavelengths. The fluorometric activity of the complexes has been observed to be 100 to 1000 times higher than that of the assay method involving ninhydrin and an amino acid analyzer. The method adopted in our laboratory is rapid and versatile, with good reproducibility, and provides excellent results for adoption by analytical, agricultural and biomedical laboratories to estimate amino acids and metals in composite matrices. (author)

  19. Complexity Theory

    Science.gov (United States)

    Lee, William H K.

    2016-01-01

    A complex system consists of many interacting parts, generates new collective behavior through self-organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self-organization, stochastic processes, turbulence, and genetic algorithms.

  20. Analyzing Equity Capital Programs of Banks for Cooperatives

    OpenAIRE

    Ismail Ahmad; Ken D. Duft; Ron C. Mittelhammer

    1986-01-01

    Characteristics of Banks for Cooperatives term loan and equity capital programs contribute toward complex intermittent exchanges of positive and negative cash flows between the cooperative lender and borrower and complicate the analysis of the net present value and effective interest of the financing project. A multiperiod linear program was developed to analyze the effect of variations in equity capital program components on the present value of the financing project. Furthermore, the concep...

  1. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology. ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  2. Rapid Evaporation of microbubbles

    Science.gov (United States)

    Gautam, Jitendra; Esmaeeli, Asghar

    2008-11-01

    When a liquid is heated to a temperature far above its boiling point, it evaporates abruptly. Boiling of liquid at high temperatures can be explosive and destructive, and poses a potential hazard for a host of industrial processes. Explosive boiling may occur if a cold and volatile liquid is brought into contact with a hot and non-volatile liquid, or if a liquid is superheated or depressurized rapidly. Such possibilities are realized, for example, in the depressurization of low boiling point liquefied natural gas (LNG) in pipelines or storage tanks as a result of a leak. While boiling of highly heated liquids can be destructive at the macroscale, the (nearly) instantaneous pace of the process and the release of a large amount of kinetic energy make the phenomenon extremely attractive at the microscale, where it is possible to utilize the released energy to drive micromechanical systems. For instance, there is currently a growing interest in micro-explosion of liquid for the generation of microbubbles for actuation purposes. The aim of the current study is to gain a fundamental understanding of the subject using direct numerical simulations. In particular, we seek to investigate the boundary between stable and unstable nucleus growth in terms of the degree of liquid superheat and to compare the dynamics of unstable and stable growth.

  3. JINR rapid communications

    International Nuclear Information System (INIS)

    1995-01-01

    The present collection of rapid communications from JINR, Dubna, contains twelve separate reports on an estimation of the possibility of fusion reactions in water molecules, an analysis of pion spectra of the charge-exchange reaction Mg(t,³He), the results of simulation of e⁺e⁻ pair production and detection in the ALICE experiment, the data on the edge effects in multiwire proportional chambers, standard and nonstandard applications of wavelet analysis, the design and study of a light readout system for the scintillator shower maximum detector for the endcap electromagnetic calorimeter for the STAR experiment at RHIC, a study of multiparticle azimuthal correlations in high energy interactions, coherent multifragmentation of relativistic nuclei, superposition of neutrino eigenstates and neutrino oscillation, simulation results and suggestions for a possible design of a gaseous shower maximum detector for the endcap electromagnetic calorimeter for the STAR experiment at RHIC, determination of the sizes of the pion emission region in np-interactions at Pₙ = (5.2±0.16) GeV/c using the interference correlation method for identical particles, and inelasticity of nucleus-nucleus collisions in the CMS experiment. 65 figs., 19 tabs

  4. Rapid Polymer Sequencer

    Science.gov (United States)

    Stolc, Viktor (Inventor); Brock, Matthew W (Inventor)

    2013-01-01

    Method and system for rapid and accurate determination of each of a sequence of unknown polymer components, such as nucleic acid components. A self-assembling monolayer of a selected substance is optionally provided on an interior surface of a pipette tip, and the interior surface is immersed in a selected liquid. A selected electrical field is impressed in a longitudinal direction, or in a transverse direction, in the tip region, a polymer sequence is passed through the tip region, and a change in an electrical current signal is measured as each polymer component passes through the tip region. Each of the measured changes in electrical current signals is compared with a database of reference electrical change signals, with each reference signal corresponding to an identified polymer component, to identify the unknown polymer component with a reference polymer component. The nanopore preferably has a pore inner diameter of no more than about 40 nm and is prepared by heating and pulling a very small section of a glass tubing.
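
    The identification step described above, comparing each measured change in the electrical current signal with a database of reference signals, can be illustrated with a very small sketch. The reference values and the nearest-match rule below are invented for illustration and are not the patented algorithm.

```python
# Minimal sketch of matching measured current changes against a reference
# database to identify polymer components. Reference values are hypothetical.
reference_signatures = {   # assumed mean current drops (nA) per nucleotide
    "A": 12.0,
    "C": 15.5,
    "G": 18.2,
    "T": 10.1,
}

def identify(measured_drop_nA: float) -> str:
    """Return the reference component whose signature is closest to the measurement."""
    return min(reference_signatures, key=lambda k: abs(reference_signatures[k] - measured_drop_nA))

# A series of measured current changes is mapped to a polymer sequence.
measured = [11.8, 15.9, 10.3, 18.0]
print("".join(identify(m) for m in measured))   # -> "ACTG"
```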

  5. Dependency visualization for complex system understanding

    Energy Technology Data Exchange (ETDEWEB)

    Smart, J. Allison Cory [Univ. of California, Davis, CA (United States)

    1994-09-01

    With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now being sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed graph based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.

  6. Managing Complexity

    DEFF Research Database (Denmark)

    Maylath, Bruce; Vandepitte, Sonia; Minacori, Patricia

    2013-01-01

    This article discusses the largest and most complex international learning-by-doing project to date - a project involving translation from Danish and Dutch into English and editing into American English alongside a project involving writing, usability testing, and translation from English into Dutch and into French. The complexity of the undertaking proved to be a central element in the students' learning, as the collaboration closely resembles the complexity of international documentation workplaces of language service providers. © Association of Teachers of Technical Writing.

  7. Complex variables

    CERN Document Server

    Fisher, Stephen D

    1999-01-01

    The most important topics in the theory and application of complex variables receive a thorough, coherent treatment in this introductory text. Intended for undergraduates or graduate students in science, mathematics, and engineering, this volume features hundreds of solved examples, exercises, and applications designed to foster a complete understanding of complex variables as well as an appreciation of their mathematical beauty and elegance. Prerequisites are minimal; a three-semester course in calculus will suffice to prepare students for discussions of these topics: the complex plane, basic

  8. The algorithm and program complex for splitting into parts the records of acoustic waves recorded during the operation of a plasma actuator flush-mounted in a model plane nozzle, with the purpose of analyzing their robust spectral and correlation characteristics

    International Nuclear Information System (INIS)

    Chernousov, A D; Malakhov, D V; Skvortsova, N N

    2014-01-01

    The development of new technologies for reducing aircraft engine noise, including directional action on the noise based on the interaction of plasma disturbances and sound-generating pulsations, is currently an acute problem. One class of devices built on this principle is being developed at GPI RAS: plasma actuators (groups of interconnected gaps arranged around the perimeter of the nozzle) of various shapes and forms. In this paper an algorithm was developed which allows impulses to be separated from the experimental data acquired during the operation of a plasma actuator flush-mounted in the model plane nozzle. The algorithm can be adjusted manually for a variety of situations (operation of the actuator in a nozzle with or without airflow, adjustment to different frequencies and pulse durations of the actuator). A program complex was also developed on the basis of MatLab software, designed for building robust spectral and autocovariance functions of the acoustic signals recorded during the experiments with the model of a nozzle with a working actuator

  9. Softball Complex

    Science.gov (United States)

    Ellis, Jim

    1977-01-01

    The Parks and Recreation Department of Montgomery, Alabama, has developed a five-field softball complex as part of a growing community park with facilities for camping, golf, aquatics, tennis, and picnicking. (MJB)

  10. Lecithin Complex

    African Journals Online (AJOL)

    1Department of Food Science and Engineering, Xinyang College of Agriculture and ... Results: The UV and IR spectra of the complex showed an additive effect of polydatin-lecithin, in which .... Monochromatic Cu Ka radiation (wavelength =.

  11. Solid-state thermal decomposition of the [Co(NH₃)₅CO₃]NO₃·0.5H₂O complex: A simple, rapid and low-temperature synthetic route to Co₃O₄ nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Farhadi, Saeid, E-mail: sfarhad2001@yahoo.com [Department of Chemistry, Lorestan University, Khorramabad 68135-465 (Iran, Islamic Republic of); Safabakhsh, Jalil [Department of Chemistry, Lorestan University, Khorramabad 68135-465 (Iran, Islamic Republic of)

    2012-02-25

    Highlights: ► The [Co(NH₃)₅CO₃]NO₃·0.5H₂O complex was used for preparing pure Co₃O₄ nanoparticles. ► Co₃O₄ nanoparticles were prepared at a low temperature of 175 °C. ► Co₃O₄ nanoparticles show a weak ferromagnetic behaviour at room temperature. ► The method is simple, low-cost and suitable for the production of Co₃O₄. - Abstract: Co₃O₄ nanoparticles were easily prepared via the decomposition of the pentammine(carbonato)cobalt(III) nitrate precursor complex [Co(NH₃)₅CO₃]NO₃·0.5H₂O at low temperature (175 °C). The product was characterized by thermal analysis, X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FT-IR), UV-visible spectroscopy, transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDX), Raman spectroscopy, Brunauer-Emmett-Teller (BET) specific surface area measurements and magnetic measurements. The FT-IR, XRD, Raman and EDX results indicated that the synthesized Co₃O₄ nanoparticles are highly pure and have a single phase. The TEM analysis revealed nearly uniform and quasi-spherical Co₃O₄ nanoparticles with an average particle size of approximately 10 nm. The optical absorption spectrum of the Co₃O₄ nanoparticles showed two direct band gaps of 2.18 and 3.52 eV with a red shift in comparison with previously reported values. The prepared Co₃O₄ nanoparticles showed a weak ferromagnetic behaviour that could be attributed to uncompensated surface spins and/or finite-size effects. Using the present method, Co₃O₄ nanoparticles can be produced without expensive organic solvents and complicated equipment. This simple, rapid, safe and low-cost synthetic route can be extended to the synthesis of other

  12. Relatively Inexpensive Rapid Prototyping of Small Parts

    Science.gov (United States)

    Swan, Scott A.

    2003-01-01

    Parts with complex three-dimensional shapes and with dimensions up to 8 by 8 by 10 in. (20.3 by 20.3 by 25.4 cm) can be made as unitary pieces of a room-temperature-curing polymer, with relatively little investment in time and money, by a process now in use at Johnson Space Center. The process is one of a growing number of processes and techniques that are known collectively as the art of rapid prototyping. The main advantages of this process over other rapid-prototyping processes are greater speed and lower cost: There is no need to make paper drawings and take them to a shop for fabrication, and thus no need for the attendant paperwork and organizational delays. Instead, molds for desired parts are made automatically on a machine that is guided by data from a computer-aided design (CAD) system and can reside in an engineering office.

  13. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.; O'Hara, Matthew J.

    2005-01-01

    The production of nuclear weapons materials has generated a legacy of nuclear waste and contaminated environmental sites. (1) The remediation of radiochemically contaminated sites and processing of stored wastes into stable waste forms will require characterization procedures throughout all phases of these activities. (2) Consequently the demands for radiochemical analysis will increase rapidly in the future. Methods are required to do these analyses rapidly and precisely. To meet these characterization needs, new automated techniques are required in order to provide improved precision, consistent analytical protocols, reduced worker exposure to toxic and/or radioactive samples, greater sample throughput, reduced costs, and reduced secondary waste generation. Furthermore, methods are required that provide automatic analyses in settings other than analytical laboratories

  14. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal within nearly 22 ns time intervals, records it in a memory unit, and then slowly reads the information out to a computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs

  15. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  16. Resilience and Complexity

    DEFF Research Database (Denmark)

    Dahlberg, Rasmus

    2015-01-01

    This paper explores two key concepts: resilience and complexity. The first is understood as an emergent property of the latter, and their inter-relatedness is discussed using a three tier approach. First, by exploring the discourse of each concept, next, by analyzing underlying relationships and...... robust. Robustness is a property of simple or complicated systems characterized by predictable behavior, enabling the system to bounce back to its normal state following a perturbation. Resilience, however, is an emergent property of complex adaptive systems. It is suggested that this distinction...

  17. On-Demand Urine Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  18. Low Gravity Drug Stability Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  19. New high voltage parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Kawasumi, Y.; Masai, K.; Iguchi, H.; Fujisawa, A.; Abe, Y.

    1992-01-01

    A new modification of the parallel plate analyzer for 500 keV heavy ions, intended to eliminate the effect of the intense UV and visible radiation, has been successfully implemented. Its principle and results are discussed. (author)

  20. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches and analytical software tools for analyzing economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  1. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.

    2006-01-01

    The spectrum analyzer is intended for the dynamic spectral analysis of signals from physical installations and for noise filtering. The recurrence Fourier transformation algorithm is used in the digital dynamic analyzer. It is realized on the basis of a fast-logic FPGA matrix and a specialized ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The functional speed is 20 ns [ru]

  2. FST Based Morphological Analyzer for Hindi Language

    OpenAIRE

    Deepak Kumar; Manjeet Singh; Seema Shukla

    2012-01-01

    Hindi being a highly inflectional language, FST (Finite State Transducer) based approach is most efficient for developing a morphological analyzer for this language. The work presented in this paper uses the SFST (Stuttgart Finite State Transducer) tool for generating the FST. A lexicon of root words is created. Rules are then added for generating inflectional and derivational words from these root words. The Morph Analyzer developed was used in a Part Of Speech (POS) Tagger based on Stanford...

  3. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of "Classical Complex Analysis" is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and - as a particular highlight - the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  4. To Internationalize Rapidly from Inception: Crowdsource

    Directory of Open Access Journals (Sweden)

    Nirosh Kannangara

    2012-10-01

    Full Text Available Technology entrepreneurs continuously search for tools to accelerate the internationalization of their startups. For the purpose of internationalizing rapidly from inception, we propose that technology startups use crowdsourcing to internalize the tacit knowledge embodied in members of a crowd distributed across various geographies. For example, a technology startup can outsource to a large crowd the definition of a customer problem that occurs across various geographies, the development of the best solution to the problem, and the identification of attractive business expansion opportunities. In this article, we analyze how three small firms use crowdsourcing, discuss the benefits of crowdsourcing, and offer six recommendations to technology entrepreneurs interested in using crowdsourcing to rapidly internationalize their startups from inception.

  5. Subgroup complexes

    CERN Document Server

    Smith, Stephen D

    2011-01-01

    This book is intended as an overview of a research area that combines geometries for groups (such as Tits buildings and generalizations), topological aspects of simplicial complexes from p-subgroups of a group (in the spirit of Brown, Quillen, and Webb), and combinatorics of partially ordered sets. The material is intended to serve as an advanced graduate-level text and partly as a general reference on the research area. The treatment offers optional tracks for the reader interested in buildings, geometries for sporadic simple groups, and G-equivariant equivalences and homology for subgroup complexes.

  6. Complex manifolds

    CERN Document Server

    Morrow, James

    2006-01-01

    This book, a revision and organization of lectures given by Kodaira at Stanford University in 1965-66, is an excellent, well-written introduction to the study of abstract complex (analytic) manifolds-a subject that began in the late 1940's and early 1950's. It is largely self-contained, except for some standard results about elliptic partial differential equations, for which complete references are given. -D. C. Spencer, MathSciNet The book under review is the faithful reprint of the original edition of one of the most influential textbooks in modern complex analysis and geometry. The classic

  7. Isolation and mass spectrometry of transcription factor complexes.

    Science.gov (United States)

    Sebastiaan Winkler, G; Lacomis, Lynne; Philip, John; Erdjument-Bromage, Hediye; Svejstrup, Jesper Q; Tempst, Paul

    2002-03-01

    Protocols are described that enable the isolation of novel proteins associated with a known protein and the subsequent identification of these proteins by mass spectrometry. We review the basics of nanosample handling and of two complementary approaches to mass analysis, and provide protocols for the entire process. The protein isolation procedure is rapid and based on two high-affinity chromatography steps. The method does not require previous knowledge of complex composition or activity and permits subsequent biochemical characterization of the isolated factor. As an example, we provide the procedures used to isolate and analyze yeast Elongator, a histone acetyltransferase complex important for transcript elongation, which led to the identification of three novel subunits.

  8. A new automatic analyzer for uranium determination

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyan; Zhang Lan

    1992-08-01

    An intelligent automatic analyzer for uranium based on the principle of flow injection analysis (FIA) has been developed. It can directly determine uranium in solution in the range of 0.02 to 500 mg/L without any pretreatment. A chromatographic column packed with extractant, in which the trace uranium is concentrated and separated and which has a special ability to enrich uranium, is connected to the manifold of the analyzer. The analyzer is suited for trace uranium determination in various samples. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) is used as the color reagent. Uranium is determined in aqueous solution by adding a cationic surfactant, cetyl-pyridinium bromide (PCB). The rate of analysis is 30 to 90 samples per hour. The relative standard deviation of determination is 1% ∼ 2%. The analyzer has been used in factories and in the laboratory, and the results are satisfactory. The determination range can easily be changed by using a multi-function auto-injection valve that changes the injection volume of the sample and the channels. Thus, the analyzer can adopt various FIA operation modes to meet the needs of FIA determination of other substances. The analyzer has universal functions

  9. Program for analyzing power boost tests

    International Nuclear Information System (INIS)

    Wills, C.A.

    1982-03-01

    A rapid increase of power in a reactor produces a failure in the fuel. Experiments to study the conditions in the NRU reactor after such failures have been planned and carried out. Given the concentrations of specified isotopes at a number of times over the length of an experiment (as produced, for example, by the program SARGS) and the power history of the reactor, this program calculates the release rates, escape rate coefficients, and fractional releases for the isotopes. These values may optionally be printed and plotted. Decay schemes for a limited number of mass numbers are implemented. The program is written in FORTRAN and runs on the CDC 6600 - CYBER 170 system

  10. Rapid prototyping: a promising method

    NARCIS (Netherlands)

    Haverman, T.M.; Karagozoglu, K.H.; Prins, H.; Schulten, E.A.J.M.; Forouzanfar, T.

    2013-01-01

    Rapid prototyping is a method which makes it possible to produce a three-dimensional model based on two-dimensional imaging. Various rapid prototyping methods are available for modelling, such as stereolithography, selective laser sintering, direct laser metal sintering, two-photon polymerization,

  11. Multilevel Analysis in Analyzing Speech Data

    Science.gov (United States)

    Guddattu, Vasudeva; Krishna, Y.

    2011-01-01

    The speech produced by human vocal tract is a complex acoustic signal, with diverse applications in phonetics, speech synthesis, automatic speech recognition, speaker identification, communication aids, speech pathology, speech perception, machine translation, hearing research, rehabilitation and assessment of communication disorders and many…

  12. Analyzing Grid Log Data with Affinity Propagation

    NARCIS (Netherlands)

    Modena, G.; van Someren, M.W.; Ali, M; Bosse, T.; Hindriks, K.V.; Hoogendoorn, M.; Jonker, C.M; Treur, J.

    2013-01-01

    In this paper we present an unsupervised learning approach to detect meaningful job traffic patterns in Grid log data. Manual anomaly detection on modern Grid environments is troublesome given their increasing complexity, the distributed, dynamic topology of the network and heterogeneity of the jobs
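
    As a hedged illustration of the clustering step named in this record, the sketch below applies scikit-learn's Affinity Propagation to a toy matrix of job features; the feature encoding (runtime, transferred megabytes, exit code) is an assumption for illustration, not the paper's actual feature set or preprocessing.

```python
# Cluster Grid job records with Affinity Propagation; exemplars of each
# cluster stand in for recurring traffic patterns. Feature choice is assumed.
import numpy as np
from sklearn.cluster import AffinityPropagation

# Each row is one job: [runtime_s, MB_transferred, exit_code]
jobs = np.array([
    [120,  500, 0],
    [115,  480, 0],
    [3600,  20, 1],
    [3550,  25, 1],
    [118,  510, 0],
])

model = AffinityPropagation(random_state=0).fit(jobs)
print(model.labels_)            # cluster assignment per job
print(model.cluster_centers_)   # exemplar jobs representing each pattern
```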

  13. Complex Networks

    CERN Document Server

    Evsukoff, Alexandre; González, Marta

    2013-01-01

    In the last decade we have seen the emergence of a new inter-disciplinary field focusing on the understanding of networks which are dynamic, large, open, and have a structure sometimes called random-biased. The field of Complex Networks is helping us better understand many complex phenomena such as the spread of diseases, protein interactions, and social relationships, to name but a few. Studies in Complex Networks are gaining attention due to some major scientific breakthroughs proposed by network scientists helping us understand and model interactions contained in large datasets. In fact, if we could point to one event leading to the widespread use of complex network analysis, it would be the availability of online databases. Theories of Random Graphs from Erdös and Rényi from the late 1950s led us to believe that most networks had random characteristics. The work on large online datasets told us otherwise. Starting with the work of Barabási and Albert as well as Watts and Strogatz in the late 1990s, we now know th...

  14. glbase: a framework for combining, analyzing and displaying heterogeneous genomic and high-throughput sequencing data.

    Science.gov (United States)

    Hutchins, Andrew Paul; Jauch, Ralf; Dyla, Mateusz; Miranda-Saavedra, Diego

    2014-01-01

    Genomic datasets and the tools to analyze them have proliferated at an astonishing rate. However, such tools are often poorly integrated with each other: each program typically produces its own custom output in a variety of non-standard file formats. Here we present glbase, a framework that uses a flexible set of descriptors that can quickly parse non-binary data files. glbase includes many functions to intersect two lists of data, including operations on genomic interval data and support for the efficient random access to huge genomic data files. Many glbase functions can produce graphical outputs, including scatter plots, heatmaps, boxplots and other common analytical displays of high-throughput data such as RNA-seq, ChIP-seq and microarray expression data. glbase is designed to rapidly bring biological data into a Python-based analytical environment to facilitate analysis and data processing. In summary, glbase is a flexible and multifunctional toolkit that allows the combination and analysis of high-throughput data (especially next-generation sequencing and genome-wide data), and which has been instrumental in the analysis of complex data sets. glbase is freely available at http://bitbucket.org/oaxiom/glbase/.
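
    The genomic interval intersection that glbase automates can be illustrated with a plain-Python sketch; this is not the glbase API (see the project URL in the record for that), only a minimal example of intersecting two lists of intervals.

```python
# Toy intersection of two lists of genomic intervals, illustrating the kind of
# operation described in the record. Interval values are invented examples.
def overlaps(a, b):
    """True if two (chrom, start, end) intervals overlap."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

peaks = [("chr1", 100, 200), ("chr1", 500, 650), ("chr2", 40, 90)]
genes = [("chr1", 150, 400), ("chr2", 10, 50)]

shared = [p for p in peaks if any(overlaps(p, g) for g in genes)]
print(shared)   # peaks that intersect at least one gene
```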

  15. glbase: a framework for combining, analyzing and displaying heterogeneous genomic and high-throughput sequencing data

    Directory of Open Access Journals (Sweden)

    Andrew Paul Hutchins

    2014-01-01

    Full Text Available Genomic datasets and the tools to analyze them have proliferated at an astonishing rate. However, such tools are often poorly integrated with each other: each program typically produces its own custom output in a variety of non-standard file formats. Here we present glbase, a framework that uses a flexible set of descriptors that can quickly parse non-binary data files. glbase includes many functions to intersect two lists of data, including operations on genomic interval data and support for the efficient random access to huge genomic data files. Many glbase functions can produce graphical outputs, including scatter plots, heatmaps, boxplots and other common analytical displays of high-throughput data such as RNA-seq, ChIP-seq and microarray expression data. glbase is designed to rapidly bring biological data into a Python-based analytical environment to facilitate analysis and data processing. In summary, glbase is a flexible and multifunctional toolkit that allows the combination and analysis of high-throughput data (especially next-generation sequencing and genome-wide data, and which has been instrumental in the analysis of complex data sets. glbase is freely available at http://bitbucket.org/oaxiom/glbase/.

  16. Scintiscans data analyzer model AS-10

    International Nuclear Information System (INIS)

    Malesa, J.; Wierzbicki, W.

    1975-01-01

    The operating principle and construction elements of a device for scintiscan data analysis by ''square root scaling'' are presented. The device is equipped with an MK-125 cassette tape recorder, made in Poland, serving as a scintiscan data bank, and with three scintiscan data analysis programs. Cassettes of two types, C-60 and C-90, are used, with recording times of 2 x 30 min and 2 x 45 min, respectively. The results of the scintiscan data analysis are printed by an electric typewriter as figures in the form of a digital scintigram. (author)

  17. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    To facilitate occupational safety and health there is a need to develop instruments to monitor and analyze nanoparticles in industry, research and urban environments. The aim of this Ph.D. project was to develop new sensors that can analyze engineered nanoparticles. Two sensors were studied: (i) a miniaturized toxicity sensor based on electrochemistry and (ii) a photothermal spectrometer based on tensile-stressed mechanical resonators (string resonators). Miniaturization of a toxicity sensor targeting engineered nanoparticles was explored. This concept was based on the results of the biodurability test...

  18. Analyzing Web Behavior in Indoor Retail Spaces

    OpenAIRE

    Ren, Yongli; Tomko, Martin; Salim, Flora; Ong, Kevin; Sanderson, Mark

    2015-01-01

    We analyze 18 million rows of Wi-Fi access logs collected over a one year period from over 120,000 anonymized users at an inner-city shopping mall. The anonymized dataset gathered from an opt-in system provides users' approximate physical location, as well as Web browsing and some search history. Such data provides a unique opportunity to analyze the interaction between people's behavior in physical retail spaces and their Web behavior, serving as a proxy to their information needs. We find: ...

  19. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an 'analyzing instrument', based on an open source software package called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
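
    A minimal sketch of the preprocessing such an 'analyzing instrument' needs is shown below: raw log lines are parsed into an ARFF file that Weka can load. The log format and the chosen attributes are assumptions, not the article's actual setup.

```python
# Parse a few Windows-style log lines (assumed format) and write a small ARFF
# file for Weka. Attribute names and the log layout are illustrative only.
import re

log_lines = [
    "2008-01-12 10:04:31 WARN  Service X restarted",
    "2008-01-12 10:05:02 ERROR Disk quota exceeded",
    "2008-01-12 10:07:45 INFO  User login ok",
]

pattern = re.compile(r"^(\S+) (\S+) (\w+)\s+(.*)$")

with open("syslog.arff", "w") as arff:
    arff.write("@relation syslog\n")
    arff.write("@attribute time string\n")
    arff.write("@attribute level {INFO,WARN,ERROR}\n")
    arff.write("@attribute message string\n")
    arff.write("@data\n")
    for line in log_lines:
        date, time, level, message = pattern.match(line).groups()
        arff.write(f"'{time}',{level},'{message}'\n")
```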

  20. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    Cheng, H.S.; Wulff, W.; Mallen, A.N.; Lekach, S.V.; Stritar, A.; Cerbone, R.J.

    1985-01-01

    Advanced technology for high-speed interactive nuclear power plant simulations is of great value for timely resolution of safety issues, for plant monitoring, and for computer-aided emergency responses to an accident. Presented is the methodology employed at BNL to develop a BWR plant analyzer capable of simulating severe plant transients at much faster than real-time process speeds. Five modeling principles are established and a criterion is given for selecting numerical procedures and efficient computers to achieve the very high simulation speeds. Typical results are shown to demonstrate the modeling fidelity of the BWR plant analyzer

  1. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    Marcus, E.; Elhanani, I.; Nir, J.; Ellenbogen, M.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate, using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of the pulse, to reject pile-up and to eliminate processing dead time. The estimated peaks of valid pulses are transferred to a microcontroller system that creates the histogram and controls the Human Machine Interface (HMI)
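
    The digital processing chain described in this record can be sketched as follows: estimate the height of each sampled, shaped pulse and accumulate a histogram of pulse heights. The maximum-minus-baseline peak estimate and the channel scaling below are simplifications; the FPGA algorithm in the paper additionally handles pile-up rejection and dead-time elimination.

```python
# Toy digital MCA: estimate pulse heights from sampled pulses and fill a
# histogram (spectrum). Pulse shape and scaling are invented for illustration.
import numpy as np

N_CHANNELS = 1024
spectrum = np.zeros(N_CHANNELS, dtype=int)

def process_pulse(samples, volts_per_channel=0.005):
    """Estimate pulse height above baseline and increment the matching channel."""
    peak = samples.max() - samples[:8].mean()
    channel = int(peak / volts_per_channel)
    if 0 <= channel < N_CHANNELS:
        spectrum[channel] += 1

# Feed the histogram with a few synthetic shaped pulses.
t = np.linspace(0, 1, 64)
for amplitude in (1.0, 1.02, 2.5):
    process_pulse(amplitude * np.exp(-((t - 0.5) / 0.1) ** 2))

print(spectrum.nonzero()[0])    # channels that accumulated counts
```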

  2. Improving and analyzing signage within a healthcare setting.

    Science.gov (United States)

    Rousek, J B; Hallbeck, M S

    2011-11-01

    Healthcare facilities are increasingly utilizing pictograms rather than text signs to help direct people. The purpose of this study was to analyze a wide variety of standardized healthcare pictograms and the effects of color contrasts and complexity for participants with both normal and impaired vision. Fifty (25 males, 25 females) participants completed a signage recognition questionnaire and identified pictograms while wearing vision simulators to represent specific visual impairment. The study showed that certain color contrasts, complexities and orientations can help or hinder comprehension of signage for people with and without visual impairment. High contrast signage with consistent pictograms involving human figures (not too detailed or too abstract) is most identifiable. Standardization of healthcare signage is recommended to speed up and aid the cognitive thought process in detecting signage and determining meaning. These fundamental signage principles are critical in producing an efficient, universal wayfinding system for healthcare facilities. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  3. Can Complexity be Planned?

    Directory of Open Access Journals (Sweden)

    Ilona Koutny

    2015-04-01

    Full Text Available The long accepted complexity invariance of human languages has become controversial within the last decade. In investigations of the problem, both creole and planned languages have often been neglected. After a presentation of the scope of the invariance problem and the proposition of the natural to planned language continuum, this article will discuss the contribution of planned languages. It will analyze the complexity of Esperanto at the phonological, morphological, syntactic and semantic levels, using linguistic data bases. The role of the L2 speech community and development of the language will also be taken into account when discussing the endurance of the same level of simplicity of this planned international language. The author argues that complexity can be variable and to some extent planned and maintained.

  4. Theoretical studies on rapid fluctuations in solar flares

    International Nuclear Information System (INIS)

    Vlahos, L.

    1986-01-01

    Rapid fluctuations in the emission of solar bursts may have many different origins, e.g., the acceleration process can have a pulsating structure, the propagation of energetic electrons and ions can be interrupted from plasma instabilities and finally the electromagnetic radiation produced by the interaction of electrostatic and electromagnetic waves may have a pulsating behavior in time. In two separate studies the conditions for rapid fluctuations in solar flare driven emission were analyzed

  5. Theoretical studies on rapid fluctuations in solar flares

    Science.gov (United States)

    Vlahos, Loukas

    1986-01-01

    Rapid fluctuations in the emission of solar bursts may have many different origins e.g., the acceleration process can have a pulsating structure, the propagation of energetic electrons and ions can be interrupted from plasma instabilities and finally the electromagnetic radiation produced by the interaction of electrostatic and electromagnetic waves may have a pulsating behavior in time. In two separate studies the conditions for rapid fluctuations in solar flare driven emission were analyzed.

  6. Hypoxia targeting copper complexes

    International Nuclear Information System (INIS)

    Dearling, J.L.

    1998-11-01

    The importance and incidence of tumour hypoxia, its measurement and current treatments available, including pharmacological and radiopharmacological methods of targeting hypoxia, are discussed. A variety of in vitro and in vivo methods for imposing hypoxia have been developed and are reviewed. Copper, its chemistry, biochemistry and radiochemistry, the potential for use of copper radionuclides and its use to date in this field is considered with particular reference to the thiosemicarbazones. Their biological activity, metal chelation, in vitro and in vivo studies of their radiocopper complexes and the potential for their use as hypoxia targeting radiopharmaceuticals is described. The reduction of the copper(II) complex to copper(I), its pivotal importance in their biological behaviour, and the potential for manipulation of this to effect hypoxia selectivity are described. An in vitro method for assessing the hypoxia selectivity of radiopharmaceuticals is reported. The rapid deoxygenation and high viability of a mammalian cell culture in this system is discussed and factors which may affect the cellular uptake of a radiopharmaceutical are described. The design, synthesis and complexation with copper and radiocopper of a range of bis(thiosemicarbazones) is reported. Synthesis of these compounds is simple, giving high yields of pure products. The characteristics of the radiocopper complexes (⁶⁴Cu), including lipophilicity and redox activity, are reported (reduction potentials in the range −0.314 to −0.590 V). High cellular uptakes of the radiocopper complexes of the ligands, in hypoxic and normoxic EMT6 and CHO320 cells, were observed. Extremes of selectivity are shown, ranging from the hypoxia selective ⁶⁴Cu(II)ATSM to the normoxic cell selective ⁶⁴Cu(II)GTS. The selectivities observed are compared with the physico-chemical characteristics of the complexes. A good correlation exists between the selectivity of the complex and its Cu(II)/Cu(I) reduction potential, with hypoxia

  7. Designing experiments and analyzing data a model comparison perspective

    CERN Document Server

    Maxwell, Scott E

    2013-01-01

    Through this book's unique model comparison approach, students and researchers are introduced to a set of fundamental principles for analyzing data. After seeing how these principles can be applied in simple designs, students are shown how these same principles also apply in more complicated designs. Drs. Maxwell and Delaney believe that the model comparison approach better prepares students to understand the logic behind a general strategy of data analysis appropriate for various designs; and builds a stronger foundation, which allows for the introduction of more complex topics omitt
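
    The model comparison logic can be made concrete with a small sketch: a restricted model (intercept only) is compared with a fuller model (a group effect) through an F test. The toy data and the use of statsmodels are illustrative assumptions, not examples from the book.

```python
# Compare a restricted and a full linear model with an F test (model
# comparison approach). Data values are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

data = pd.DataFrame({
    "group": ["control"] * 4 + ["treatment"] * 4,
    "score": [5, 6, 7, 6, 9, 10, 8, 11],
})

restricted = smf.ols("score ~ 1", data=data).fit()     # simpler model
full = smf.ols("score ~ C(group)", data=data).fit()    # augmented model

# The F statistic quantifies the reduction in error per added parameter.
print(anova_lm(restricted, full))
```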

  8. Analyzing high school students' reasoning about electromagnetic induction

    Science.gov (United States)

    Jelicic, Katarina; Planinic, Maja; Planinsic, Gorazd

    2017-06-01

    Electromagnetic induction is an important, yet complex, physics topic that is a part of Croatian high school curriculum. Nine Croatian high school students of different abilities in physics were interviewed using six demonstration experiments from electromagnetism (three of them concerned the topic of electromagnetic induction). Students were asked to observe, describe, and explain the experiments. The analysis of students' explanations indicated the existence of many conceptual and reasoning difficulties with the basic concepts of electromagnetism, and especially with recognizing and explaining the phenomenon of electromagnetic induction. Three student mental models of electromagnetic induction, formed during the interviews, which reoccurred among students, are described and analyzed within the knowledge-in-pieces framework.

  9. Organization of a multichannel analyzer for gamma ray spectrometry

    International Nuclear Information System (INIS)

    Robinet, Genevieve

    1988-06-01

    This report describes the software organization of a medium-scale multichannel analyzer for qualitative and quantitative measurements of the gamma rays emitted by radioactive samples. The first part reviews the basics of radioactivity, the principle of gamma ray detection, and the data processing used for interpretation of a nuclear spectrum. The second part first describes the general organization of the software and then gives some details on interactivity, multidetector capabilities, and the integration of complex algorithms for peak search and nuclide identification; problems encountered during the design phase are mentioned and solutions are given. Basic ideas are presented for further developments, such as an expert system which should improve interpretation of the results. The present software has been integrated in a manufactured multichannel analyzer named 'POLYGAM NU416'. [fr]

  10. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the overall bed pressure drop, or over some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.
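
    Purely as an illustration of turning a time-varying bed pressure drop signal into a single fluidization-quality number, the sketch below computes the normalized standard deviation of the fluctuations; the patented analyzer's actual signal processing is not reproduced here.

```python
# Simple fluctuation index from a synthetic pressure-drop signal. The sampling
# rate, signal model and metric are assumptions for illustration only.
import numpy as np

fs = 200.0                                   # assumed transducer sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
pressure_drop = 5.0 + 0.4 * np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.randn(t.size)

fluctuation_index = pressure_drop.std() / pressure_drop.mean()
print(f"normalized pressure-drop fluctuation: {fluctuation_index:.3f}")
```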

  11. SINDA, Systems Improved Numerical Differencing Analyzer

    Science.gov (United States)

    Fink, L. C.; Pan, H. M. Y.; Ishimoto, T.

    1972-01-01

    A computer program has been written to analyze a group of 100-node areas and then provide for the summation of any number of 100-node areas to obtain a temperature profile. The SINDA program options offer the user a variety of methods for the solution of thermal analog models presented in network format.

  12. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us with enormous amounts of static sequence information, understanding sophisticated biology continues to require integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments, all together, to analyze the biological architecture on various levels, which is precisely the origin of the systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology,...

  13. Analyzing the Acoustic Beat with Mobile Devices

    Science.gov (United States)

    Kuhn, Jochen; Vogt, Patrik; Hirth, Michael

    2014-01-01

    In this column, we have previously presented various examples of how physical relationships can be examined by analyzing acoustic signals using smartphones or tablet PCs. In this example, we will be exploring the acoustic phenomenon of small beats, which is produced by the overlapping of two tones with a low difference in frequency Δf. The…
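
    The relationship at the heart of the phenomenon is that superposing two tones of nearly equal frequency produces an amplitude envelope at the difference frequency Δf = |f1 - f2|. The short sketch below generates such a superposition with arbitrary example frequencies.

```python
# Superpose two nearby tones and expose the beat envelope at |f1 - f2|.
# The frequencies and sampling rate are arbitrary example values.
import numpy as np

f1, f2 = 440.0, 444.0                 # Hz; beats expected at 4 Hz
fs = 8000.0                           # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
signal = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# The envelope |2 cos(pi * (f1 - f2) * t)| repeats at the difference frequency.
envelope = np.abs(2 * np.cos(np.pi * (f1 - f2) * t))
print("expected beat frequency:", abs(f1 - f2), "Hz")
print("peak superposed amplitude:", round(float(signal.max()), 2))
```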

  14. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that due to nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m3/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m3 of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed and the calculated radioxenon concentrations and raw gamma-ray spectra automatically transmitted to data centers

  15. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. A new way to analyze company structure by utilizing the natural social network existing within the company, together with an example of its usage on the Enron company, is presented in this paper.
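
    As a hedged sketch of the idea, the example below builds a tiny intra-company network (the edge list is invented; the analysis in the paper uses the real Enron data) and ranks people by betweenness centrality with networkx.

```python
# Build a toy intra-company social network and rank people by centrality.
# The edge list is invented purely for illustration.
import networkx as nx

emails = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
          ("carol", "dave"), ("dave", "erin")]

g = nx.Graph(emails)
centrality = nx.betweenness_centrality(g)

# People who connect otherwise separate parts of the organization stand out.
for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{person:6s} {score:.2f}")
```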

  16. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO₂, O₃, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO₂⁻, PO₄³⁻, S²⁻, and SO₄²⁻. In many cases, standard methods could be adapted to the centrifugal analyzer, in others new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than with conventional instrumentation

  17. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  18. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  19. How Rapid is Rapid Prototyping? Analysis of ESPADON Programme Results

    Directory of Open Access Journals (Sweden)

    Ian D. Alston

    2003-05-01

    Full Text Available New methodologies, engineering processes, and support environments are beginning to emerge for embedded signal processing systems. The main objectives are to enable defence industry to field state-of-the-art products in less time and with lower costs, including retrofits and upgrades, based predominately on commercial off the shelf (COTS components and the model-year concept. One of the cornerstones of the new methodologies is the concept of rapid prototyping. This is the ability to rapidly and seamlessly move from functional design to the architectural design to the implementation, through automatic code generation tools, onto real-time COTS test beds. In this paper, we try to quantify the term “rapid” and provide results, the metrics, from two independent benchmarks, a radar and sonar beamforming application subset. The metrics show that the rapid prototyping process may be sixteen times faster than a conventional process.

  20. Rapid Prototyping and its Application in Dentistry

    Directory of Open Access Journals (Sweden)

    V. N. V. Madhav

    2013-01-01

    Full Text Available Medical implants and biological models have three main characteristics: low volume, complex shape, and the possibility of customization. These characteristics suit very well the Rapid Prototyping (RP) and Rapid Manufacturing (RM) processes. RP/RM processes fabricate parts layer-by-layer from a 3D model until the complete shape is finished. Biocompatible materials, such as Titanium and Titanium alloys, Zirconium, Cobalt Chromium, PEEK, etc., are used for the fabrication process. Reverse Engineering (RE) technology greatly affects RP/RM processes. RE is used to capture or scan images of the limb, cranium, tooth, and other biological objects. Three common methods to obtain the images are 3D laser scanning, Computed Tomography (CT), and Magnetic Resonance Imaging (MRI). The main RP/RM techniques used in Dentistry are Stereolithography Apparatus (SLA), Fused Deposition Modeling (FDM), Selective Laser Sintering (SLS), and inkjet printing. This article reviews the changing scenario of technology in dentistry with special emphasis on Rapid Prototyping and its various applications in Dentistry.

  1. Managing Complexity

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Posse, Christian; Malard, Joel M.

    2004-08-01

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically-based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in the use of physical analogs for understanding the behavior of some econophysical systems and in deriving stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood.

  2. CTG Analyzer: A graphical user interface for cardiotocography.

    Science.gov (United States)

    Sbrollini, Agnese; Agostinelli, Angela; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most commonly used test for establishing the good health of the fetus during pregnancy and labor. CTG consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions (UC; mmHg). FHR is characterized by baseline, baseline variability, tachycardia, bradycardia, accelerations and decelerations, while the UC signal is characterized by the presence of contractions and the contraction period. These parameters are usually evaluated by visual inspection. However, visual analysis of CTG recordings has a well-demonstrated poor reproducibility, owing to the complexity of the physiological phenomena affecting fetal heart rhythm and to its dependence on the clinician's experience. Computerized tools in support of clinicians represent a possible solution for improving the correctness of CTG interpretation. This paper proposes CTG Analyzer as a graphical tool for automatic and objective analysis of CTG tracings. CTG Analyzer was developed under MATLAB®; it is a very intuitive and user-friendly graphical user interface. The FHR time series and the UC signal are represented one under the other, on a grid with reference lines, as usually done for CTG reports printed on paper. Colors help identification of FHR and UC features. Automatic analysis is based on fixed feature definitions provided by the FIGO guidelines, and on other arbitrary settings whose default values can be changed by the user. Eventually, CTG Analyzer provides a report file listing all the quantitative results of the analysis. Thus, CTG Analyzer represents a potentially useful graphical tool for automatic and objective analysis of CTG tracings.
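
    As a rough illustration of the kind of rule-based feature extraction such a tool performs, the sketch below flags candidate FHR accelerations using the common 15 bpm / 15 s criterion; the sampling rate, the fixed baseline, and the detection logic are simplifying assumptions, not CTG Analyzer's actual algorithm.

```python
import numpy as np

def candidate_accelerations(fhr, baseline, fs=4.0, rise_bpm=15.0, min_dur_s=15.0):
    """Flag candidate accelerations: excursions >= rise_bpm above a fixed
    baseline lasting at least min_dur_s (simplified FIGO-style rule).
    fhr: 1-D array of FHR samples (bpm); fs: samples per second."""
    above = fhr >= baseline + rise_bpm
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if (i - start) / fs >= min_dur_s:
                events.append((start / fs, i / fs))  # (onset s, offset s)
            start = None
    if start is not None and (len(above) - start) / fs >= min_dur_s:
        events.append((start / fs, len(above) / fs))
    return events

# Example: synthetic 10-minute trace with one injected 20 s, +20 bpm excursion
fs = 4.0
t = np.arange(0, 600, 1 / fs)
fhr = 140 + 5 * np.sin(2 * np.pi * t / 60)   # baseline with mild variability
fhr[(t >= 300) & (t < 320)] += 20            # injected acceleration
print(candidate_accelerations(fhr, baseline=140.0, fs=fs))
```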

  3. The development of MOSA-II multichannel optical spectrum analyzer

    International Nuclear Information System (INIS)

    Guo Li; Yang Zhoujing; Fang Shuyao

    1989-01-01

    The MOSA-II Multichannel Optical Spectrum Analyzer is a high-performance, easy-to-use measurement system for extremely rapid spectral data acquisition, processing and presentation. It consists of four parts: a vidicon, the data acquisition and timing circuit, the correction circuit for the geometric distortion and the non-uniform distortion of the vidicon, and an IBM PC/XT with a color plotter. The system has the following functions: single spectrum acquisition, continuous acquisition of multiple spectra, noise reduction, mathematical operations on the spectra (including addition, subtraction, multiplication, and division), geometric transformation of the spectra, and 3D presentation of the spectra on both the color screen and the plotter. The absolute sensitivity of the system is 18 photons/(s·mm^2). The access time for data acquisition is 64 μs per channel, the spectral range is 1800-8000 Å, the geometric distortion is <2%, and the amplitude error is <5%

  4. Search for rapid variability of 53 Cam

    International Nuclear Information System (INIS)

    Zverko, J.

    1982-01-01

    Photoelectric observations of the magnetic Ap star 53 Cam made at the Skalnate Pleso Observatory in 1978 and 1979 are analyzed from the point of view of rapid variability. The observations were made with an intermediate-passband filter with an effective wavelength of 526 nm. Besides the differences m_53Cam - m_Comp, the behaviour of the deflections for the comparison star during the observation runs was also investigated. A strong correlation between the behaviour of the comparison star and the variable star light curve was found, and its appearance differs from night to night depending on atmospheric conditions. Each observation run is analyzed in detail, and it is concluded that all observed variations are only apparent and due to the variability of atmospheric extinction above the observation site. (author)
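
    The extinction check described above is essentially a correlation test between the comparison-star deflections and the variable-star measurements; a minimal sketch of that test, with entirely invented magnitudes, is given below.

```python
import numpy as np

# Hypothetical instrumental magnitudes from one observing run (not real data)
m_var = np.array([7.12, 7.15, 7.11, 7.18, 7.14, 7.20, 7.13])   # 53 Cam
m_cmp = np.array([8.02, 8.05, 8.01, 8.08, 8.04, 8.10, 8.03])   # comparison star

diff = m_var - m_cmp                 # differential light curve m_53Cam - m_Comp
r = np.corrcoef(m_cmp, m_var)[0, 1]  # Pearson correlation coefficient

# A strong correlation between the comparison-star deflections and the
# variable-star measurements suggests the variations track atmospheric
# extinction rather than intrinsic rapid variability.
print(f"correlation between comparison and variable: {r:.2f}")
print(f"differential scatter (mag): {diff.std():.4f}")
```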

  5. Continuous online nuclear analyzer of coal

    International Nuclear Information System (INIS)

    Rogers, R.S.C.; Bozorgmanesh, H.; Gozani, T.; Brown, T.

    1980-01-01

    Since CONAC is a relatively new concept in coal quality measurement, the present paper concentrates primarily on instrument development. The basic principles of elemental composition, moisture content and Btu measurements are described and typical measurement results presented. Then, since CONAC is under development specifically for quality control purposes, its advantages and potential applications in the coal circuit are discussed. The CONAC development work showed the principles of CONAC operation (PNAA, moisture and Btu determinations) to be powerful and versatile tools for the analysis of coal; CONAC-type instrumentation is being developed for batch, laboratory usage. Such a device will enable a user to perform rapid coal sample analyses in a nondestructive fashion, leaving samples intact for further evaluation. It is felt that a batch laboratory CONAC-type device will be of great use in the mining industry, where the analysis of borehole samples is important in the evaluation of coal reserves and in the planning of mining operations

  6. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating the risk at nuclear facilities was developed. • Five measures reflecting all factors that should be considered in assessing risk were developed. • Attributes covering NMAC and nuclear security culture are included among the analysis attributes. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of the factors that determine risk. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure and specific values are given in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether or not a state has properly established a legal and regulatory framework (based on international standards); these questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task and no methodology had been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities; this measure comprises radiological, economic, and social damage. Social and
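
    In the vulnerability-assessment literature this effectiveness measure is commonly combined multiplicatively, P(E) = P(I) × P(N); the sketch below is a minimal numerical illustration with assumed probabilities, not values produced by the TESS code.

```python
# Minimal sketch of the physical-protection effectiveness relation
# P(E) = P(I) * P(N), with assumed (illustrative) probabilities.
p_interruption = 0.85    # P(I): response force interrupts the adversary in time
p_neutralization = 0.90  # P(N): interrupted adversary is neutralized

p_effectiveness = p_interruption * p_neutralization
print(f"P(E) = {p_effectiveness:.3f}")   # 0.765 for these assumed inputs
```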

  7. Development of pulse neutron coal analyzer

    International Nuclear Information System (INIS)

    Jing Shiwie; Gu Deshan; Qiao Shuang; Liu Yuren; Liu Linmao; Jing Shiwei

    2005-01-01

    This article introduces the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector, and a 4096-channel multichannel analyzer were applied in this system. The multiple linear regression method employed to process the data resolved the problem of interference among multiple elements. The prototype (model MZ-MKFY) has been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as lower calorific value, total water content, ash content, volatile content, and sulfur content, with precision acceptable to the coal industry, are presented
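
    As a minimal illustration of the regression step, the sketch below fits a multiple linear model from invented gamma-ray window counts to a reference coal property; the counts, the choice of windows, and the target property are assumptions, not the instrument's actual calibration.

```python
import numpy as np

# Rows: calibration samples; columns: net counts in gamma-ray windows
# associated with different elements (invented numbers for illustration).
X = np.array([
    [1200.0,  310.0,  95.0],
    [ 980.0,  420.0, 120.0],
    [1430.0,  260.0,  80.0],
    [1100.0,  390.0, 140.0],
    [1350.0,  300.0, 100.0],
])
y = np.array([23.1, 26.4, 20.8, 27.0, 22.5])   # reference ash content (%)

# Least-squares fit of ash% = X @ b + c; the joint fit accounts for
# inter-element interference among the windows.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

new_sample = np.array([1250.0, 330.0, 110.0, 1.0])
print("predicted ash content (%):", new_sample @ coef)
```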

  8. Real time speech formant analyzer and display

    Science.gov (United States)

    Holland, George E.; Struve, Walter S.; Homer, John F.

    1987-01-01

    A speech analyzer for the interpretation of sound includes a sound input which converts the sound into a signal representing the sound. The signal is passed through a plurality of frequency pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and are then prepared for visual display in continuous real time. Parameters from the inputted sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor which is interfaced with a television set for display of the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user.
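
    To make the band-splitting step concrete, the sketch below filters a synthetic signal into a few frequency bands and reports each band's RMS level, a crude software analog of the formant filters and frequency-to-voltage converters described; the band edges, filter order, and sampling rate are arbitrary assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfilt

fs = 16000  # sampling rate (Hz), assumed
bands = [(300, 900), (900, 2500), (2500, 3500)]  # rough formant regions (assumed)

def band_levels(signal, fs, bands, order=4):
    """Filter the signal into bands and return the RMS level of each band,
    a crude stand-in for a formant channel's voltage output."""
    levels = []
    for lo, hi in bands:
        sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
        levels.append(float(np.sqrt(np.mean(sosfilt(sos, signal) ** 2))))
    return levels

# Example: a synthetic vowel-like signal with energy near 700 Hz and 1200 Hz
t = np.arange(0, 0.1, 1 / fs)
x = np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)
print(band_levels(x, fs, bands))
```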

  9. Miniature multichannel analyzer for process monitoring

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.; Wiig, L.G.; Ianakiev, K.D.

    1993-01-01

    A new, 4,000-channel analyzer has been developed for gamma-ray spectroscopy applications. A design philosophy of hardware and software building blocks has been combined with design goals of simplicity, compactness, portability, and reliability. The result is a miniature, modular multichannel analyzer (MMMCA), which offers solutions to a variety of nondestructive assay (NDA) needs in many areas of general application, independent of computer platform or operating system. Detector-signal analog electronics, the bias supply, and batteries are included in the virtually pocket-size, low-power MMMCA unit. The MMMCA features digital setup and control, automated data reduction, and automated quality assurance. Areas of current NDA applications include on-line continuous (process) monitoring, process material holdup measurements, and field inspections

  10. Testing the Application for Analyzing Structured Entities

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2011-01-01

    Full Text Available The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the application, both on components and as a whole, are established. A testing strategy for different objectives is proposed. The behavior of users during the testing period is analyzed. A statistical analysis of user behavior in processes of infinite resource access is carried out.

  11. A new approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility,

  12. Real-time airborne particle analyzer

    Science.gov (United States)

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the gas-filled conduit. Preferably, the gas-filled conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  13. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyser: a model builder, a run-time unit and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, command and control logic and reactor protection systems. The run-time unit carries out the dialogue between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility

  14. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    The radionuclide analysis in nuclear power plants, practiced for the purpose of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system and monitoring the radioactive waste effluent, is an important job. Important as it is, it requires considerable labor by experts, because the samples to be analyzed are multifarious and very large in number and, in addition, the job depends largely on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been worked out. The results of its performance test at an operating power plant have proved that the development has largely accomplished its objectives and that the system is indeed useful. The developmental work was carried out in cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years, from 1974 to this year. (auth.)

  15. Neutral Particle Analyzer Diagnostic on NSTX

    International Nuclear Information System (INIS)

    Medley, S.S.; Roquemore, A.L.

    2004-01-01

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector

  16. Analyzing Gender Stereotyping in Bollywood Movies

    OpenAIRE

    Madaan, Nishtha; Mehta, Sameep; Agrawaal, Taneea S; Malhotra, Vrinda; Aggarwal, Aditi; Saxena, Mayank

    2017-01-01

    The presence of gender stereotypes in many aspects of society is a well-known phenomenon. In this paper, we focus on studying such stereotypes and bias in the Hindi movie industry (Bollywood). We analyze movie plots and posters for all movies released since 1970. The gender bias is detected by semantic modeling of plots at the inter-sentence and intra-sentence level. Different features like occupation, introduction of cast in text, associated actions and descriptions are captured to show the pervasiv...

  17. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  18. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed

  19. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: Studying and analyzing the undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance the ways we teach engineering leadership. The research has great insights that might assist engineering programs to improve curricula for the purpose of better engineering preparation to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  20. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are inflated GAMLSS model and generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  1. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction: In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. It entered China in 1989 and has become China's leading brand of chocolate in

  2. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities becomes the need of the hour. Usually negative ties are treated in the same way as positive ties in many theories such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through various types of network analysis techniques that are used for examining ties, such as status, centrality and power measures. Due to the difference in the characteristics of flow in positive and negative tie networks, some of these measures are not applicable to negative ties. This paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches have been reviewed and compared to determine the best approach that can appropriately identify negative ties in online networks. It has been found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified and, further, new measures should be developed based upon the negative clique concept.
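
    As a small illustration of the degree-based measures mentioned above, the sketch below computes positive and negative degree on an invented signed edge list; it is not the PN centrality computation itself, only the simplest signed-degree counting.

```python
# Minimal sketch of degree-based measures on a signed network.
# Edges carry +1 (friendship/endorsement) or -1 (distrust/opposition);
# the toy graph below is invented for illustration.
edges = [
    ("a", "b", +1), ("a", "c", +1), ("b", "c", +1),
    ("c", "d", -1), ("d", "e", -1), ("a", "d", -1),
]

pos_deg, neg_deg = {}, {}
for u, v, sign in edges:
    for node in (u, v):
        pos_deg.setdefault(node, 0)
        neg_deg.setdefault(node, 0)
    target = pos_deg if sign > 0 else neg_deg
    target[u] += 1
    target[v] += 1

# Nodes with high negative degree and low positive degree are candidate
# "outsiders" in the sense discussed above.
for node in sorted(pos_deg):
    print(node, "positive:", pos_deg[node], "negative:", neg_deg[node])
```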

  3. Testing the Application for Analyzing Structured Entities

    OpenAIRE

    Ion IVAN; Bogdan VINTILA

    2011-01-01

    The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the applicat...

  4. Evaluation of the Air Void Analyzer

    Science.gov (United States)

    2013-07-01

    (Abstract not available; the record text contains only citation fragments, including: S.M. DeHayes and D. Stark, eds., Petrography of Cementitious Materials, ASTM STP 1215, Philadelphia, PA; Federal Highway Administration (FHWA), 2006, Priority, Market-Ready Technologies and Innovations: Air Void Analyzer, Washington, D.C.; Germann Instruments (GI), 2011; ASTM STP 169D, Tests and Properties of Concrete and Concrete-Making Materials, West Conshohocken, PA: ASTM International; Magura, D.D., 1996.)

  5. Semantic analyzability in children's understanding of idioms.

    Science.gov (United States)

    Gibbs, R W

    1991-06-01

    This study investigated the role of semantic analyzability in children's understanding of idioms. Kindergartners and first, third, and fourth graders listened to idiomatic expressions either alone or at the end of short story contexts. Their task was to explain verbally the intended meanings of these phrases and then to choose their correct idiomatic interpretations. The idioms presented to the children differed in their degree of analyzability. Some idioms were highly analyzable or decomposable, with the meanings of their parts contributing independently to their overall figurative meanings. Other idioms were nondecomposable because it was difficult to see any relation between a phrase's individual components and the idiom's figurative meaning. The results showed that younger children (kindergartners and first graders) understood decomposable idioms better than they did nondecomposable phrases. Older children (third and fourth graders) understood both kinds of idioms equally well in supporting contexts, but were better at interpreting decomposable idioms than they were at understanding nondecomposable idioms without contextual information. These findings demonstrate that young children better understand idiomatic phrases whose individual parts independently contribute to their overall figurative meanings.

  6. Handheld Fluorescence Microscopy based Flow Analyzer.

    Science.gov (United States)

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence based clinical microscopy and diagnostics is a manual, labour intensive and time consuming procedure. The article outlines a cost-effective, high throughput alternative to conventional fluorescence imaging techniques. With system level integration of custom-designed microfluidics and optics, we demonstrate fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC labeled fluorescent beads per minute. This demonstrates high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing concentration of the sample solution and flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing throughput of about 480 beads per second.

  7. A Raman-Based Portable Fuel Analyzer

    Science.gov (United States)

    Farquharson, Stuart

    2010-08-01

    Fuel is the single most important supply during war. Consider that the US Military is employing over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory. Unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that establish whether a fuel can be used. The analyzer uses Raman spectroscopy to measure fuel samples without preparation in 2 minutes. The challenge, however, is that as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot be simply correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach is laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm^-1 wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.
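
    As an illustration of the chemometric step, the sketch below fits a partial least squares (PLS) model, a common choice for spectra-to-property calibration, to invented stand-in spectra; the abstract does not state which chemometric method was actually used, and all numbers here are synthetic.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Invented stand-ins: 80 "Raman spectra" (200 points each) and one fuel
# property loosely encoded in two spectral regions plus noise.
X = rng.normal(size=(80, 200))
y = 40 + 5 * X[:, 50] - 3 * X[:, 120] + rng.normal(scale=0.5, size=80)

model = PLSRegression(n_components=5)
model.fit(X[:60], y[:60])              # calibration set
pred = model.predict(X[60:]).ravel()   # validation set

rmse = np.sqrt(np.mean((pred - y[60:]) ** 2))
print(f"validation RMSE: {rmse:.2f}")
```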

  8. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    Fasching, G.E.; Patton, G.H.

    1975-01-01

    A method and apparatus to reduce the drift of single channel analyzers are described. Essentially, this invention employs a time-sharing or multiplexing technique to ensure that the outputs from two single channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source or other voltage changes. The multiplexing is accomplished when a flip-flop, actuated by a clock, changes state to switch between the outputs of the individual SCAs before these outputs are sent to a ratio-counting scaler. In the particular system embodiment disclosed to illustrate this invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends the signal, after amplification, to an SCA system that contains the invention. Therein, at least two single channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer so that the output from the multiplexer contains counts falling within two distinct segments of the region. By dividing the counts from the multiplexer by each other, the percentage of sulfur within the coal sample under observation may be determined. (U.S.)
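
    A minimal numerical sketch of the ratio idea described above; the window counts and the linear sulfur calibration are invented for illustration, the point being only that common-mode drift cancels in the ratio.

```python
# Counts from two single channel analyzer windows are divided so that
# common-mode drift cancels; the ratio is then mapped to sulfur content
# through an assumed (made-up) linear calibration.
counts_window_a = 48213   # counts in the sulfur-sensitive spectral window
counts_window_b = 91540   # counts in the reference window

ratio = counts_window_a / counts_window_b

# Hypothetical calibration: sulfur% = slope * ratio + intercept
slope, intercept = 6.2, -1.1
sulfur_percent = slope * ratio + intercept
print(f"window ratio = {ratio:.3f}, estimated sulfur = {sulfur_percent:.2f}%")
```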

  9. [Electromyography Analysis of Rapid Eye Movement Sleep Behavior Disorder].

    Science.gov (United States)

    Nakano, Natsuko; Kinoshita, Fumiya; Takada, Hiroki; Nakayama, Meiho

    2018-01-01

    Polysomnography (PSG), which records physiological phenomena including brain waves, breathing status, and muscle tonus, is the gold standard for the diagnosis of sleep disorders. However, measurement and analysis are complex for several specific sleep disorders, such as rapid eye movement (REM) sleep behavior disorder (RBD). Normally, brain waves during REM sleep show a waking-like pattern while the skeletal and antigravity muscles are relaxed. In patients with RBD, however, these muscles are activated during REM sleep. These activated muscle movements during REM, so-called REM without atonia (RWA), recorded by PSG, may be related to a neurodegenerative disease such as Parkinson's disease. Thus, careful analysis of RWA is significant not only physiologically but also clinically. Manual visual scoring of RWA is time-consuming, so quantitative studies on RWA are rarely reported. A software program developed in Microsoft Office Excel® was used to semiautomatically analyze the RWA ratio extracted from PSG, for comparison with manual visual scoring. In addition, a quantitative muscle tonus study was carried out to evaluate the effect of medication on RBD patients. Using this new software program, we were able to analyze RWA in the same cases in approximately 15 min, compared with 60 min for manual visual scoring. The software can not only quantify RWA easily but also classify RWA waves as either phasic or tonic bursts. We consider that this software program will support physicians and scientists in their future research on RBD. We are planning to offer this software program free of charge to physicians and scientists.
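
    As a simplified illustration of what an RWA ratio can look like computationally, the sketch below scores the fraction of REM mini-epochs whose chin-EMG amplitude exceeds a baseline-derived threshold; the epoch length, threshold rule, and data are assumptions, not the Excel implementation described above.

```python
import numpy as np

def rwa_ratio(emg_rms, is_rem, threshold):
    """Fraction of REM mini-epochs whose chin-EMG RMS amplitude exceeds a
    baseline-derived threshold; a simplified stand-in for an RWA ratio.
    emg_rms: per-mini-epoch RMS values; is_rem: boolean mask of REM epochs."""
    rem_values = emg_rms[is_rem]
    if rem_values.size == 0:
        return 0.0
    return float(np.mean(rem_values > threshold))

# Invented example: 3-second mini-epochs over a short REM period
emg_rms = np.array([2.0, 2.1, 6.5, 7.2, 2.2, 8.0, 2.0, 2.3])
is_rem = np.array([True] * 8)
baseline = 2.1                 # e.g. lowest sustained atonic level (assumed)
print(rwa_ratio(emg_rms, is_rem, threshold=2 * baseline))  # -> 0.375
```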

  10. A Hybrid DGTD-MNA Scheme for Analyzing Complex Electromagnetic Systems

    KAUST Repository

    Li, Peng; Jiang, Li-Jun; Bagci, Hakan

    2015-01-01

    lumped circuit elements, the standard Newton-Raphson method is applied at every time step. Additionally, a local time-stepping scheme is developed to improve the efficiency of the hybrid solver. Numerical examples consisting of EM systems loaded

  11. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    Science.gov (United States)

    2007-07-01

    (Abstract not available; the record text contains only citation fragments, including: American Heritage Dictionary of the English Language, 4th ed., Houghton Mifflin Company; Schneier, B., Beyond Fear: Thinking Sensibly About Security in an Uncertain World, New York: Copernicus Books, 2003; Schneier, B. [Sch99], "Modeling Security…".)

  12. Analyzing the complexity of cone production in longleaf pine by multiscale entropy

    Science.gov (United States)

    Xiongwen Chen; Qinfeng Guo; Dale G. Brockway

    2016-01-01

    The longleaf pine (Pinus palustris Mill.) forests are important ecosystems in the southeastern USA because of their ecological and economic value. Since European settlement, longleaf pine ecosystems have dramatically declined in extent, to the degree that they are now listed as endangered ecosystems. Its sporadic seed production, which...

  13. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    National Research Council Canada - National Science Library

    Edge, Kenneth S

    2007-01-01

    .... Attack trees by themselves do not provide enough decision support to system defenders. This research develops the concept of using protection trees to offer a detailed risk analysis of a system...

  14. A method to analyze, sort, and retain viability of obligate anaerobic microorganisms from complex microbial communities.

    Science.gov (United States)

    Thompson, Anne W; Crow, Matthew J; Wadey, Brian; Arens, Christina; Turkarslan, Serdar; Stolyar, Sergey; Elliott, Nicholas; Petersen, Timothy W; van den Engh, Ger; Stahl, David A; Baliga, Nitin S

    2015-10-01

    A high speed flow cytometric cell sorter was modified to maintain a controlled anaerobic environment. This technology enabled coupling of the precise high-throughput analytical and cell separation capabilities of flow cytometry to the assessment of cell viability of evolved lineages of obligate anaerobic organisms from cocultures.

  15. Colour reconnections and rapidity gaps

    International Nuclear Information System (INIS)

    Loennblad, Leif

    1996-01-01

    I argue that the success of recently proposed models describing events with large rapidity gaps in DIS at HERA in terms of non-perturbative colour exchange is heavily reliant on suppression of perturbative gluon emission in the proton direction. There is little or no physical motivation for such suppression and I show that a model without this suppression cannot describe the rapidity gap events at HERA. (author)

  16. Modeling and Analyzing Electric Vehicle Charging

    DEFF Research Database (Denmark)

    Andersen, Ove; Krogh, Benjamin Bjerre; Thomsen, Christian

    2016-01-01

    , such as wind turbines. To both enable a smart grid and the use of renewable energy, it is essential to know when and where an EV is plugged into the power grid and what battery capacity is available. In this paper, we present a generic spatio-temporal data-warehouse model for storing detailed information...... on all aspects of charging EVs, including integration with the electricity prices from a spot market. The proposed data warehouse is fully implemented and currently contains 2.5 years of charging data from 176 EVs. We describe the data warehouse model and the implementation including complex operations......

  17. Dilepton distributions at backward rapidities

    International Nuclear Information System (INIS)

    Betemps, M. A.; Ducati, M. B. Gay; Oliveira, E. G. de

    2006-01-01

    The dilepton production at backward rapidities in pAu and pp collisions at RHIC and LHC energies is investigated in the dipole approach. The results are shown through the nuclear modification ratio R_pA, considering transverse momentum and rapidity spectra. The dilepton modification ratio presents interesting behavior at backward rapidities when compared with the already known forward ones, since it is related to the large-x kinematical region that is being probed. The rapidity dependence of the nuclear modification ratio in dilepton production is strongly dependent on the Bjorken-x behavior of the nuclear structure function ratio R_F2 = F2^A / F2^p. The R_pA transverse momentum dependence at backward rapidities is modified due to the large-x nuclear effects: at RHIC energies, for instance, the ratio R_pA is reduced as p_T increases, presenting an opposite behavior when compared with the forward one. It implies that dilepton production at backward rapidities should carry information on the nuclear effects at large Bjorken x, and that it is useful to investigate the p_T dependence of the observables in this kinematical regime

  18. Rapid colorimetric assay for gentamicin injection.

    Science.gov (United States)

    Tarbutton, P

    1987-01-01

    A rapid colorimetric method for determining gentamicin concentration in commercial preparations of gentamicin sulfate injection was developed. Methods currently available for measuring gentamicin concentration via its colored complex with cupric ions in alkaline solution were modified to reduce the time required for a single analysis. The alkaline copper tartrate (ACT) reagent solution was prepared such that each milliliter contained 100 µmol cupric sulfate, 210 µmol potassium sodium tartrate, and 1.25 mmol sodium hydroxide. The assay involves mixing 0.3 mL gentamicin sulfate injection 40 mg/mL (of gentamicin), 1.0 mL ACT reagent, and 0.7 mL water; the absorbance of the resulting solution at 560 nm was used to calculate the gentamicin concentration in the sample. For injections containing 10 mg/mL of gentamicin, the amount of the injection was increased to 0.5 mL and water decreased to 0.5 mL. The concentration of gentamicin in samples representing 11 lots of gentamicin sulfate injection 40 mg/mL and 8 lots of gentamicin sulfate injection 10 mg/mL was determined. The specificity, reproducibility, and accuracy of the assay were assessed. The colored complex was stable for at least two hours. Gentamicin concentration ranged from 93.7 to 108% and from 95 to 109% of the stated label value of the 40 mg/mL and the 10 mg/mL injections, respectively. No components of the preservative system present in the injections interfered with the assay. Since other aminoglycosides produced a colored complex, the assay is not specific for gentamicin. The assay was accurate and reproducible over the range of 4-20 mg of gentamicin. This rapid and accurate assay can be easily applied in the hospital pharmacy setting.
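
    A minimal sketch of how such a colorimetric assay is typically quantified, using an invented linear calibration of absorbance at 560 nm against gentamicin standards; the absorbance values are illustrative, not the published data.

```python
import numpy as np

# Invented calibration data: absorbance at 560 nm of the copper-gentamicin
# complex for standards of known gentamicin content (mg per assay).
standard_mg = np.array([4.0, 8.0, 12.0, 16.0, 20.0])
absorbance = np.array([0.110, 0.218, 0.325, 0.441, 0.552])

slope, intercept = np.polyfit(standard_mg, absorbance, 1)  # linear fit

# Unknown sample: 0.3 mL of a nominal 40 mg/mL injection (12 mg expected)
a_sample = 0.330
mg_found = (a_sample - intercept) / slope
percent_of_label = 100 * mg_found / 12.0
print(f"found {mg_found:.2f} mg, {percent_of_label:.1f}% of label claim")
```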

  19. Complex variables

    CERN Document Server

    Flanigan, Francis J

    2010-01-01

    A caution to mathematics professors: Complex Variables does not follow conventional outlines of course material. One reviewer noting its originality wrote: ""A standard text is often preferred [to a superior text like this] because the professor knows the order of topics and the problems, and doesn't really have to pay attention to the text. He can go to class without preparation."" Not so here-Dr. Flanigan treats this most important field of contemporary mathematics in a most unusual way. While all the material for an advanced undergraduate or first-year graduate course is covered, discussion

  20. Rapid single flux quantum logic in high temperature superconductor technology

    NARCIS (Netherlands)

    Shunmugavel, K.

    2006-01-01

    A Josephson junction is the basic element of rapid single flux quantum logic (RSFQ) circuits. A high operating speed and low power consumption are the main advantages of RSFQ logic over semiconductor electronic circuits. To realize complex RSFQ circuits in HTS technology one needs a reproducible

  1. A Simple and Rapid Complexometric Determination of Thallium(III ...

    African Journals Online (AJOL)

    A simple, rapid and selective complexometric method is proposed for the determination of thallium(III), using mercaptoethane (EtSH) as a demasking agent. The sample solution containing Tl(III) is first complexed with excess EDTA and the surplus EDTA is removed by titration at pH 5–6 with zinc sulphate solution using ...

  2. Is a Universal Science of Complexity Conceivable?

    Science.gov (United States)

    West, Geoffrey B.

    Over the past quarter of a century, terms like complex adaptive system, the science of complexity, emergent behavior, self-organization, and adaptive dynamics have entered the literature, reflecting the rapid growth in collaborative, trans-disciplinary research on fundamental problems in complex systems ranging across the entire spectrum of science from the origin and dynamics of organisms and ecosystems to financial markets, corporate dynamics, urbanization and the human brain...

  3. Analyzing critical material demand: A revised approach.

    Science.gov (United States)

    Nguyen, Ruby Thuy; Fishman, Tomer; Zhao, Fu; Imholte, D D; Graedel, T E

    2018-07-15

    Apparent consumption has been widely used as a metric to estimate material demand. However, with technology advancement and complexity of material use, this metric has become less useful in tracking material flows, estimating recycling feedstocks, and conducting life cycle assessment of critical materials. We call for future research efforts to focus on building a multi-tiered consumption database for the global trade network of critical materials. This approach will help track how raw materials are processed into major components (e.g., motor assemblies) and eventually incorporated into complete pieces of equipment (e.g., wind turbines). Foreseeable challenges would involve: 1) difficulty in obtaining a comprehensive picture of trade partners due to business sensitive information, 2) complexity of materials going into components of a machine, and 3) difficulty maintaining such a database. We propose ways to address these challenges such as making use of digital design, learning from the experience of building similar databases, and developing a strategy for financial sustainability. We recommend that, with the advancement of information technology, small steps toward building such a database will contribute significantly to our understanding of material flows in society and the associated human impacts on the environment.

  4. Rapid Inspection of Aerospace Structures - Is It Autonomous Yet?

    Science.gov (United States)

    Bar-Cohen, Yoseph; Backes, Paul; Joffe, Benjamin

    1996-01-01

    The trend toward increased usage of aging aircraft has added a great deal of urgency to the ongoing need for low-cost, rapid, simple-to-operate, reliable and efficient NDE methods for the detection and characterization of flaws in aircraft structures. In many cases the problem of inspection is complex due to the limitations of current technology and the need to disassemble aircraft structures and test them in lab conditions. To overcome these limitations, reliable field inspection tools are being developed for rapid NDE of large and complex-shaped structures that can operate in harsh, hostile and remote conditions with minimal human interaction. In recent years, to address the need for rapid inspection in field conditions, numerous portable scanners were developed using NDE methods including ultrasonics, shearography, and thermography. This paper is written with emphasis on ultrasonic NDE scanners, their evolution and the expected direction of growth.

  5. Complex dynamics

    CERN Document Server

    Carleson, Lennart

    1993-01-01

    Complex dynamics is today very much a focus of interest. Though several fine expository articles were available, by P. Blanchard and by M. Yu. Lyubich in particular, until recently there was no single source where students could find the material with proofs. For anyone in our position, gathering and organizing the material required a great deal of work going through preprints and papers and in some cases even finding a proof. We hope that the results of our efforts will be of help to others who plan to learn about complex dynamics and perhaps even lecture. Meanwhile books in the field are beginning to appear. The Stony Brook course notes of J. Milnor were particularly welcome and useful. Still we hope that our special emphasis on the analytic side will satisfy a need. This book is a revised and expanded version of notes based on lectures of the first author at UCLA over several Winter Quarters, particularly 1986 and 1990. We owe Chris Bishop a great deal of gratitude for supervising the production of cour...

  6. GOoDA: The Generic Optimization Data Analyzer

    International Nuclear Information System (INIS)

    Calafiura, P; Vitillo, R A; Eranian, S; Levinthal, D; Kama, S

    2012-01-01

    Modern superscalar, out-of-order microprocessors dominate large scale server computing. Monitoring their activity during program execution has become complicated due to the complexity of the microarchitectures and their IO interactions. Recent processors have thousands of performance monitoring events. These are required to actually provide coverage for all of the complex interactions and performance issues that can occur. Knowing which data to collect and how to interpret the results has become an unreasonable burden for code developers whose tasks are already hard enough. It becomes the task of the analysis tool developer to bridge this gap. To address this issue, a generic decomposition of how a microprocessor uses the consumed cycles allows code developers to quickly understand which of the myriad microarchitectural complexities they are battling, without requiring a detailed knowledge of the microarchitecture. When this approach is intrinsically integrated into a performance data analysis tool, it enables software developers to take advantage of a microarchitectural methodology that has previously been available only to experts. The Generic Optimization Data Analyzer (GOoDA) project integrates this expertise into a profiling tool in order to lower the required expertise of the user and, being designed from the ground up with large-scale object-oriented applications in mind, it will be particularly useful for large HENP codebases

  7. Mass spectrometer calibration of Cosmic Dust Analyzer

    Science.gov (United States)

    Ahrens, Thomas J.; Gupta, Satish C.; Jyoti, G.; Beauchamp, J. L.

    2003-02-01

    The time-of-flight (TOF) mass spectrometer (MS) of the Cosmic Dust Analyzer (CDA) instrument aboard the Cassini spacecraft is expected to be placed in orbit about Saturn to sample submicrometer-diameter ring particles and impact ejecta from Saturn's satellites. The CDA measures a mass spectrum of each particle that impacts the chemical analyzer sector of the instrument. Particles impact a Rh target plate at velocities of 1-100 km/s and produce some 10^-8 to 10^-5 times the particle mass of positive-valence, singly charged ions. These are analyzed via a TOF MS. In initial tests, a pulsed N2 laser acting on samples of kamacite, pyrrhotite, serpentine, olivine, and Murchison meteorite induced bursts of ions, which were detected with a microchannel plate and a charge-sensitive amplifier (CSA). Pulses from the N2 laser (10^11 W/cm^2) are assumed to simulate particle impact. Using an aluminum alloy as a test sample, each pulse produces a charge of ~4.6 pC (mostly Al^+1), whereas irradiation of a stainless steel target produces a ~2.8 pC (Fe^+1) charge. Thus the present system yields ~10^-5% of the laser energy in resulting ions. A CSA signal indicates that, at the position of the microchannel plate, the ion detector geometry is such that some 5% of the laser-induced ions are collected in the CDA geometry. Employing a multichannel plate detector in this MS yields, for Al-Mg-Cu alloy and kamacite targets, well-defined peaks at 24 (Mg^+1), 27 (Al^+1), and 64 (Cu^+1) and at 56 (Fe^+1), 58 (Ni^+1), and 60 (Ni^+1) daltons, respectively.
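
    For orientation, the mass peaks quoted above follow from the ideal time-of-flight relation t = L·sqrt(m / (2qU)); the sketch below evaluates it for the listed ions using an assumed drift length and accelerating voltage, which are not taken from the instrument.

```python
import numpy as np

AMU = 1.66053906660e-27   # kg per atomic mass unit
Q_E = 1.602176634e-19     # elementary charge, C

def flight_time(mass_amu, charge=1, drift_length=0.23, voltage=1000.0):
    """Ideal time of flight t = L * sqrt(m / (2 q U)) for a singly charged
    ion; drift_length (m) and voltage (V) are assumed, not instrument values."""
    m = mass_amu * AMU
    return drift_length * np.sqrt(m / (2 * charge * Q_E * voltage))

for species, mass in [("Mg+", 24), ("Al+", 27), ("Fe+", 56), ("Cu+", 64)]:
    print(f"{species}: {flight_time(mass) * 1e6:.2f} us")
```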

  8. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality, including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and the future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  9. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    Full Text Available This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the times of events, and has prompted efforts to identify events and to solve problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment should be accurate, and consequently a series of continuous and updated data must be at hand. In this case, Grid is the best answer for using the data and resources of organizations through common processing.

  10. The kpx, a program analyzer for parallelization

    International Nuclear Information System (INIS)

    Matsuyama, Yuji; Orii, Shigeo; Ota, Toshiro; Kume, Etsuo; Aikawa, Hiroshi.

    1997-03-01

    The kpx is a program analyzer, developed as a common technological basis for promoting parallel processing. The kpx consists of three tools. The first is ktool, which shows how much execution time is spent in program segments. The second is ptool, which shows parallelization overhead on the Paragon system. The last is xtool, which shows parallelization overhead on the VPP system. The kpx, designed to work with any FORTRAN code on any UNIX computer, is confirmed to work well after testing on Paragon, SP2, SR2201, VPP500, VPP300, Monte-4, SX-4 and T90. (author)

  11. A low power Multi-Channel Analyzer

    International Nuclear Information System (INIS)

    Anderson, G.A.; Brackenbush, L.W.

    1993-06-01

    The instrumentation used in nuclear spectroscopy is generally large, not portable, and requires a lot of power. Key components of these counting systems are the computer and the Multi-Channel Analyzer (MCA). To assist in performing measurements requiring portable systems, a small, very low power MCA has been developed at Pacific Northwest Laboratory (PNL). This MCA is interfaced with a Hewlett Packard palmtop computer for portable applications. The MCA can also be connected to an IBM/PC for data storage and analysis. In addition, a real-time display mode allows the user to view the spectra as they are collected

  12. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
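
    As a rough illustration of the modified Newton-Raphson idea mentioned above (the Jacobian is evaluated once and reused across iterations), the sketch below solves a single-node nonlinear heat balance with conduction and radiation terms; the node properties are invented and this is not the SPAR solver itself.

```python
# Modified Newton-Raphson iteration on a one-node nonlinear steady-state
# heat balance (conduction to a sink plus radiation to space); all node
# properties below are assumed for illustration.
SIGMA = 5.670e-8                          # Stefan-Boltzmann constant, W/m^2/K^4
Q, H_A, EPS_A = 120.0, 1.5, 0.8 * 0.05    # load (W), hA (W/K), emissivity*area (m^2)
T_SINK, T_ENV = 290.0, 4.0                # sink and environment temperatures (K)

def residual(T):
    return Q - H_A * (T - T_SINK) - EPS_A * SIGMA * (T**4 - T_ENV**4)

def d_residual(T):
    return -H_A - 4.0 * EPS_A * SIGMA * T**3

T = 300.0                      # initial guess (K)
slope = d_residual(T)          # "modified": derivative evaluated once and reused
for _ in range(50):
    dT = -residual(T) / slope
    T += dT
    if abs(dT) < 1e-6:
        break
print(f"steady-state temperature: {T:.2f} K")
```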

  13. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a light weight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guess-work out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self contained, portable and built for field use. Both visual and auditory cues are provided to the operator.

  14. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    Full Text Available The paper presents the theoretical and methodological aspects of research on the development of argument- ation in elementary school children. It presents a theoretical framework detailing psychological mechanisms responsible for the acquisition and transfer of argumentative discourse and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  15. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  16. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  17. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  18. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    Full Text Available A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is able to differentiate laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators of water quality, but also a major public-health concern, especially for people living in high-burden, resource-limited settings.
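
    One common realization of "multivariate data analysis for pattern recognition" is dimensionality reduction followed by a classifier. The sketch below shows that idea with scikit-learn on randomly generated stand-in voltammograms; it is an assumption-laden illustration, not the authors' actual software or data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical data: each row is a flattened voltammogram (current samples
# from the two working electrodes); each label marks E. coli contamination.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))          # 60 samples x 200 current readings
y = rng.integers(0, 2, size=60)         # 0 = clean, 1 = contaminated

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```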

  19. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is resolved.

  20. An image analyzer system for the analysis of nuclear traces

    International Nuclear Information System (INIS)

    Cuapio O, A.

    1990-10-01

    Within the nuclear track project, techniques are being developed for the detection of nuclear reactions with low cross sections (not detectable by conventional methods), for the study of accidental and personal neutron dosimeters, and for other applications. All these studies are based on the fact that charged particles leave latent tracks in dielectric materials which, when etched with appropriate chemical solutions, are revealed and become visible under an optical microscope. From the analysis of the different track shapes, it is possible to obtain information on the characteristic parameters of the incident particles (charge, mass and energy). From the track density, it is possible to obtain information on the flux of the incident radiation and, consequently, on the received dose. To carry out this analysis, different systems have been designed and coupled, which has allowed the solution of various problems. Nevertheless, it has become clear that to make this activity more versatile it is necessary to have an image analyzer system that allows images to be digitized, processed and displayed more rapidly. The present document presents a proposal to acquire the components necessary to assemble an image analyzer system in support of the aforementioned project. (Author)

  1. Sensor gas analyzer for acetone determination in expired air

    Science.gov (United States)

    Baranov, Vitaly V.

    2001-05-01

    Diseases and changes in lifestyle alter the concentration and composition of expired air. Our adaptable gas analyzer is intended for the selective analysis of expired air and can be adapted by the user (a physician or a patient) to current diagnostic and analytical tasks. Having analyzed the existing trends in the development of noninvasive diagnostics, we have chosen the method of noninvasive acetone detection in expired air, where the acetone concentration correlates with blood and urine glucose concentrations. The appearance of acetone in expired air is indicative of disorders that may be caused not only by diabetes but also by an improper diet, incorrect athletic training, etc. To control these disorders one should know the acetone concentration in the human body. This knowledge allows one to judge the state of the patient, choose a correct diet that will not damage the patient's health, assess the efficiency and results of athletic training, and address the artificial pancreas problem. Our device provides highly accurate analysis, rapid diagnostics and authentic acetone quantification in the patient's body at any time, aimed at predicting the patient's state and assessing the efficiency of the therapy used. Clinical implementation of the device will improve the health and save the lives of many thousands of diabetes sufferers.

  2. Regional modeling approach for analyzing harmonic stability in radial power electronics based power system

    DEFF Research Database (Denmark)

    Yoon, Changwoo; Bai, Haofeng; Wang, Xiongfei

    2015-01-01

    Stability analysis of a distributed power generation system becomes complex when there are many grid inverters in the system. In order to analyze system stability, the overall network impedance has to be lumped and analyzed one by one. However, using a unified bulky transfer-fu...... and then it is expanded for generalizing its concept to an overall radial structured network....

  3. Cosmic Complexity

    Science.gov (United States)

    Mather, John C.

    2012-01-01

    What explains the extraordinary complexity of the observed universe, on all scales from quarks to the accelerating universe? My favorite explanation (which I certainly did not invent) is that the fundamental laws of physics produce natural instability, energy flows, and chaos. Some call the result the Life Force, some note that the Earth is a living system itself (Gaia, a "tough bitch" according to Margulis), and some conclude that the observed complexity requires a supernatural explanation (of which we have many). But my dad was a statistician (of dairy cows) and he told me about cells and genes and evolution and chance when I was very small. So a scientist must look for an explanation of how nature's laws and statistics brought us into conscious existence. And how is it that seemingly improbable events are actually happening all the time? Well, the physicists have countless examples of natural instability, in which energy is released to power change from simplicity to complexity. One of the most common to see is that cooling water vapor below the freezing point produces snowflakes, no two alike, and all complex and beautiful. We see it often, so we are not amazed. But physicists have observed so many kinds of these changes from one structure to another (we call them phase transitions) that the Nobel Prize in 1992 could be awarded for understanding the mathematics of their common features. Now for a few examples of how the laws of nature produce the instabilities that lead to our own existence. First, the Big Bang (what an insufficient name!) apparently came from an instability, in which the "false vacuum" eventually decayed into the ordinary vacuum we have today, plus the most fundamental particles we know, the quarks and leptons. So the universe as a whole started with an instability. Then, a great expansion and cooling happened, and the loose quarks, finding themselves unstable too, bound themselves together into today's less elementary particles like protons and neutrons.

  4. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as their low values often reach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar number of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the two leaves with higher stomatal density faced outside. These results showed a clear effect of stomatal position on ΔCO2. Therefore, it can be concluded that leaf position is important to guarantee improved respiration measurements, increasing ΔCO2 without affecting the respiration results per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.
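
    For orientation, in an open-flow system the apparent respiration rate per leaf area follows from the molar air flow and the CO2 differential. The sketch below shows that simplified bookkeeping only (it ignores the water-vapor dilution correction that commercial analyzers apply); the flow, ΔCO2 and leaf-area values are illustrative, not data from the paper.

```python
def apparent_rd(flow_mol_s: float, delta_co2_umol_mol: float,
                leaf_area_m2: float) -> float:
    """Apparent dark respiration (umol CO2 m-2 s-1) in an open system,
    ignoring the water-vapor dilution correction used by commercial
    gas exchange analyzers."""
    return flow_mol_s * delta_co2_umol_mol / leaf_area_m2

# Illustrative numbers only (not from the paper):
# 500 umol s-1 air flow, 1 ppm CO2 differential, 6 cm2 of enclosed leaf.
print(apparent_rd(flow_mol_s=500e-6, delta_co2_umol_mol=1.0,
                  leaf_area_m2=6e-4))   # ~0.83 umol m-2 s-1
```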

  6. Solar Probe ANalyzer for Ions - Laboratory Performance

    Science.gov (United States)

    Livi, R.; Larson, D. E.; Kasper, J. C.; Korreck, K. E.; Whittlesey, P. L.

    2017-12-01

    The Parker Solar Probe (PSP) mission is a heliospheric satellite that will orbit the Sun closer than any prior mission to date, with a first-orbit perihelion of 35 solar radii (RS) and a final perihelion below 10 RS. PSP includes the Solar Wind Electrons Alphas and Protons (SWEAP) instrument suite, which in turn consists of four instruments: the Solar Probe Cup (SPC) and three Solar Probe ANalyzers (SPAN) for ions and electrons. Together, this suite will take local measurements of particles and electromagnetic fields within the Sun's corona. SPAN-Ai has completed flight calibration and spacecraft integration and is set to be launched in July of 2018. The main mode of operation consists of an electrostatic analyzer (ESA) at its aperture followed by a time-of-flight section to measure the energy and mass per charge (m/q) of the ambient ions. SPAN-Ai's main objective is to measure solar wind ions within an energy range of 5 eV - 20 keV, a mass/q between 1 and 60 amu/q, and a field of view of 240°x120°. Here we will show flight calibration results and performance.

  7. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Full Text Available Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts who represent owners, consultants, and contractors' organizations. The Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared against the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
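
    The exact index formulas vary between delay-analysis studies, and the paper's own formulation may differ. As a hedged illustration, the sketch below uses one common convention: each index expressed as a percentage of the maximum Likert rating, with Importance Index = Frequency Index × Severity Index / 100; the ratings are made-up.

```python
import numpy as np

def index_pct(scores, max_score: int = 4) -> float:
    """Index (%) = sum of ratings / (max rating * number of responses) * 100.
    One common convention; the paper's exact formulation may differ."""
    scores = np.asarray(scores, dtype=float)
    return scores.sum() / (max_score * scores.size) * 100.0

# Hypothetical 1-4 Likert ratings from respondents for one delay cause.
frequency_ratings = [4, 3, 4, 2, 3, 4, 3, 3]
severity_ratings  = [3, 4, 4, 3, 2, 4, 4, 3]

fi = index_pct(frequency_ratings)
si = index_pct(severity_ratings)
ii = fi * si / 100.0          # Importance Index
print(f"FI={fi:.1f}%  SI={si:.1f}%  II={ii:.1f}%")
```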

  8. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including the lack of access to a correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. This situation could be changed by implementing a comprehensive approach to rare diseases, increasing international cooperation in scientific research, gaining and sharing scientific knowledge about them, and developing tools for extracting and sharing that knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At a first level, the overlap among sources was analyzed; at a second level, the presence of rare disease terms in target sources included in the UMLS was analyzed, working at the term and concept level. We found that MeSH has the best representation of rare disease terms.
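
    At the term level, the overlap analysis reduces to set operations over the term lists extracted from each terminology. The sketch below illustrates this with toy term sets; the real study works on the UMLS-integrated sources and concept identifiers, not these hand-picked strings.

```python
# Toy term sets standing in for rare-disease terms extracted from each
# source vocabulary; the real analysis works on UMLS concept identifiers.
sources = {
    "MeSH":      {"gaucher disease", "fabry disease", "pompe disease"},
    "ICD-10":    {"gaucher disease", "fabry disease"},
    "SNOMED CT": {"gaucher disease", "fabry disease", "pompe disease",
                  "niemann-pick disease"},
    "OMIM":      {"gaucher disease", "niemann-pick disease"},
}

# Pairwise overlap (Jaccard index) between source vocabularies.
names = list(sources)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        inter = sources[a] & sources[b]
        union = sources[a] | sources[b]
        print(f"{a} vs {b}: {len(inter)}/{len(union)} "
              f"(Jaccard {len(inter) / len(union):.2f})")

# Terms present in every source vs. terms unique to a single source.
common = set.intersection(*sources.values())
print("in all sources:", common)
```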

  9. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software package, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recordings or stroboscopic photos. This could be an interesting approach for studying kinematics and dynamics problems in physics education, in particular when there is no or only limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
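
    As an illustration of this kind of "virtual experiment", the sketch below integrates the standard dynamics of a uniform cylinder launched with pure sliding on a rough surface and writes the resulting positions to a file that could be loaded as data for analysis. It is a generic textbook model with made-up parameters, not the motion solver used by the author.

```python
import numpy as np

# Uniform cylinder launched with pure sliding (v0 > 0, omega0 = 0) on a
# rough horizontal surface. Kinetic friction decelerates the centre of
# mass and spins the cylinder up until rolling without slipping begins.
mu, g, r = 0.3, 9.81, 0.05        # friction coefficient, gravity, radius (m)
v, omega = 2.0, 0.0               # initial translational / angular velocity
x, theta, t, dt = 0.0, 0.0, 0.0, 1e-3

rows = []
while t < 1.0:
    if v > omega * r + 1e-9:      # still slipping
        a, alpha = -mu * g, 2 * mu * g / r    # I = (1/2) m r^2
    else:                          # rolling without slipping, no driving force
        a, alpha = 0.0, 0.0
        omega = v / r
    v += a * dt
    omega += alpha * dt
    x += v * dt
    theta += omega * dt
    t += dt
    rows.append((t, x, theta))

np.savetxt("cylinder_xy.csv", rows, delimiter=",",
           header="t,x,theta", comments="")
```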

  10. Automatic analyzing device for chlorine ion

    International Nuclear Information System (INIS)

    Sugibayashi, Shinji; Morikawa, Yoshitake; Fukase, Kazuo; Kashima, Hiromasa.

    1997-01-01

    The present invention provides a device for automatically analyzing a trace amount of chlorine ions contained in the feedwater, condensate and reactor water of a BWR type power plant. Namely, zero-adjustment or span calibration in this device is conducted as follows. (1) A standard chlorine ion liquid is supplied from a tank to a mixer by a constant volume pump, and the liquid is diluted and mixed with purified water to form a standard liquid. (2) The pH of the standard liquid is adjusted by a pH adjuster. (3) The standard liquid is supplied to an electrode cell to conduct zero adjustment or span calibration. Chlorine ions in a specimen are measured by the device of the present invention as follows. (1) The specimen is supplied to a head tank through a line filter. (2) The pH of the specimen is adjusted by a pH adjuster. (3) The specimen is supplied to an electrode cell to electrically measure the concentration of the chlorine ions in the specimen. The device of the present invention can automatically analyze trace amounts of chlorine ions with high accuracy, thereby improving sensitivity and reducing the operator's burden and radiation exposure. (I.S.)

  11. Plutonium solution analyzer. Revised February 1995

    International Nuclear Information System (INIS)

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated); it is instead handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  12. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, a breath analysis allows one to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after ingestion of an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of 1% in the 13CO2 concentration of the exhaled breath have to be detected at a concentration level of this isotopologue of about 500 ppm.
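
    The diagnostic quantity in a 13C breath test is usually the change in the 13CO2/12CO2 ratio between the baseline and post-substrate samples, expressed in delta-per-mil units relative to the PDB standard. The sketch below shows that arithmetic with illustrative ratios; the numbers and the clinical cut-off remark are assumptions for illustration, not values from the paper.

```python
R_PDB = 0.0112372   # 13C/12C ratio of the PDB reference standard

def delta_permil(ratio_13_12: float) -> float:
    """delta 13C (per mil) relative to the PDB standard."""
    return (ratio_13_12 / R_PDB - 1.0) * 1000.0

# Illustrative isotopologue ratios (13CO2/12CO2) measured by the analyzer
# before and after the labelled substrate is taken; not data from the paper.
ratio_baseline = 0.011118
ratio_post     = 0.011174

dob = delta_permil(ratio_post) - delta_permil(ratio_baseline)
print(f"delta-over-baseline = {dob:.1f} per mil")
# A DOB above a clinical cut-off (a few per mil) is commonly read as
# evidence of Helicobacter pylori urease activity.
```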

  13. Improving physics instruction by analyzing video games

    Science.gov (United States)

    Beatty, Ian D.

    2013-01-01

    Video games can be very powerful teaching systems, and game designers have become adept at optimizing player engagement while scaffolding development of complex skills and situated knowledge. One implication is that we might create games to teach physics. Another, which I explore here, is that we might learn to improve classroom physics instruction by studying effective games. James Gee, in his book What Video Games Have to Teach Us About Learning and Literacy (2007), articulates 36 principles that make good video games highly effective as learning environments. In this theoretical work, I identify 16 themes running through Gee's principles, and explore how these themes and Gee's principles could be applied to the design of an on-campus physics course. I argue that the process pushes us to confront aspects of learning that physics instructors and even physics education researchers generally neglect, and suggest some novel ideas for course design.

  14. Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization

    Science.gov (United States)

    Daglis, I.; Balasis, G.; Bourdarie, S.; Horne, R.; Khotyaintsev, Y.; Mann, I.; Santolik, O.; Turner, D.; Anastasiadis, A.; Georgiou, M.; Giannakis, O.; Papadimitriou, C.; Ropokis, G.; Sandberg, I.; Angelopoulos, V.; Glauert, S.; Grison, B., Kersten T.; Kolmasova, I.; Lazaro, D.; Mella, M.; Ozeke, L.; Usanova, M.

    2013-09-01

    We present the concept, objectives and expected impact of the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) project, which is being implemented by a consortium of seven institutions (five European, one Canadian and one US) with support from the European Community's Seventh Framework Programme. The MAARBLE project employs multi-spacecraft monitoring of the geospace environment, complemented by ground-based monitoring, in order to analyze and assess the physical mechanisms leading to radiation belt particle energization and loss. Particular attention is paid to the role of ULF/VLF waves. A database containing properties of the waves is being created and will be made available to the scientific community. Based on the wave database, a statistical model of the wave activity, dependent on the level of geomagnetic activity, solar wind forcing, and magnetospheric region, will be developed. Multi-spacecraft particle measurements will be incorporated into data assimilation tools, leading to new understanding of the causal relationships between ULF/VLF waves and radiation belt dynamics. Data assimilation techniques have proven to be a valuable tool in the field of radiation belts, able to guide 'the best' estimate of the state of a complex system. The MAARBLE collaborative research project has received funding from the European Union's Seventh Framework Programme (FP7-SPACE-2011-1) under grant agreement no. 284520.
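
    A statistical wave model "dependent on the level of geomagnetic activity" typically amounts to binning observed wave power by an activity index. The sketch below shows that kind of binning on synthetic data, using Kp as a generic index; it is a schematic illustration and not MAARBLE code (the project may parameterize by other indices and by magnetospheric region as well).

```python
import numpy as np

# Synthetic stand-ins for a wave database: one ULF power value (arbitrary
# units) and the prevailing Kp index for each measurement interval.
rng = np.random.default_rng(1)
kp = rng.uniform(0, 9, size=5000)
wave_power = 10 ** (0.3 * kp + rng.normal(0, 0.5, size=5000))

bins = np.arange(0, 10)                      # Kp bins 0-1, 1-2, ..., 8-9
which = np.digitize(kp, bins) - 1
for b in range(len(bins) - 1):
    sel = wave_power[which == b]
    print(f"Kp {bins[b]}-{bins[b+1]}: median power {np.median(sel):.1f}, "
          f"n={sel.size}")
```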

  15. Data acquisition and analysis system for the ion microprobe mass analyzer

    International Nuclear Information System (INIS)

    Darby, D.M.; Cristy, S.S.

    1979-02-01

    A computer was interfaced to an ion microprobe mass analyzer for more rapid data acquisition and analysis. The interface is designed to allow data acquisition independent of the computer. A large data analysis package was developed and implemented. Performance of the computerized system was evaluated and compared to manual operation

  16. Development of turbine cycle performance analyzer using intelligent data mining

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Gyun Young

    2004-02-15

    In recent years, the performance enhancement of the turbine cycle in nuclear power plants has been highlighted because of the worldwide deregulation environment. In particular, the first target of operating plants became the reduction of operating cost in order to compete with other power plants. It is known that the overhaul interval is closely related to operating cost. Through field investigation, the author identified that rapid and reliable performance tests, analysis, and diagnosis play an important role in the control of the overhaul interval. First, a technical road map was proposed to clearly set up the objectives. The controversial issues were summarized into data gathering, analysis tools, and diagnosis methods. The author proposed an integrated solution on the basis of intelligent data mining techniques. For reliable data gathering, a state analyzer composed of statistical regression, wavelet analysis, and a neural network was developed. The role of the state analyzer is to estimate unmeasured data and to increase the reliability of the collected data. For advanced performance analysis, a performance analysis toolbox was developed. The purpose of this tool is to make the analysis process easier and more accurate by providing three novel heat balance diagrams. This tool includes the state analyzer and a turbine cycle simulation code. In the diagnosis module, a probabilistic technique based on a Bayesian network model and a deterministic technique based on an algebraic model are provided together. This balances the uncertainty of the diagnosis process against pin-pointing capability. All the modules were validated with simulated data as well as actual test data, and some modules are used in industrial applications. There is much to be improved in the turbine cycle in order to increase plant availability. This study was accomplished to raise awareness of the importance of the turbine cycle and to propose solutions on the basis of academic as well as industrial needs.
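
    The core idea of the state analyzer is to estimate an unmeasured quantity from correlated measured channels. The sketch below illustrates that with a plain ridge regression on synthetic stand-in data; the thesis's actual analyzer combines statistical regression with wavelet analysis and a neural network, so treat this as a minimal conceptual sketch only.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Synthetic stand-in: several measured turbine-cycle channels (pressures,
# temperatures, flows) and one channel we pretend is unmeasured at run time.
rng = np.random.default_rng(42)
measured = rng.normal(size=(2000, 6))
unmeasured = measured @ np.array([0.8, -0.3, 0.5, 0.1, 0.0, 0.2]) \
             + 0.05 * rng.normal(size=2000)

X_train, X_test, y_train, y_test = train_test_split(
    measured, unmeasured, random_state=0)

estimator = Ridge(alpha=1.0).fit(X_train, y_train)
print("R^2 on held-out data:", estimator.score(X_test, y_test))
# The predicted value can stand in for the missing sensor, and large
# residuals on known channels can flag unreliable measurements.
```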

  17. Development of turbine cycle performance analyzer using intelligent data mining

    International Nuclear Information System (INIS)

    Heo, Gyun Young

    2004-02-01

    In recent years, the performance enhancement of the turbine cycle in nuclear power plants has been highlighted because of the worldwide deregulation environment. In particular, the first target of operating plants became the reduction of operating cost in order to compete with other power plants. It is known that the overhaul interval is closely related to operating cost. Through field investigation, the author identified that rapid and reliable performance tests, analysis, and diagnosis play an important role in the control of the overhaul interval. First, a technical road map was proposed to clearly set up the objectives. The controversial issues were summarized into data gathering, analysis tools, and diagnosis methods. The author proposed an integrated solution on the basis of intelligent data mining techniques. For reliable data gathering, a state analyzer composed of statistical regression, wavelet analysis, and a neural network was developed. The role of the state analyzer is to estimate unmeasured data and to increase the reliability of the collected data. For advanced performance analysis, a performance analysis toolbox was developed. The purpose of this tool is to make the analysis process easier and more accurate by providing three novel heat balance diagrams. This tool includes the state analyzer and a turbine cycle simulation code. In the diagnosis module, a probabilistic technique based on a Bayesian network model and a deterministic technique based on an algebraic model are provided together. This balances the uncertainty of the diagnosis process against pin-pointing capability. All the modules were validated with simulated data as well as actual test data, and some modules are used in industrial applications. There is much to be improved in the turbine cycle in order to increase plant availability. This study was accomplished to raise awareness of the importance of the turbine cycle and to propose solutions on the basis of academic as well as industrial needs

  18. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and require mouse-clicks on each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. The output of Root System Analyzer is a data structure that includes, for each root, an identification number, the branching order, the time of emergence, and the parent root.
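
    The first two steps of the pipeline (skeletonizing the segmented image and building a graph representation) can be prototyped with standard libraries. The sketch below uses scikit-image and networkx on a hypothetical binary image file; it is not the Matlab implementation described in the abstract, and the file name is an assumption.

```python
import numpy as np
import networkx as nx
from skimage.io import imread
from skimage.morphology import skeletonize

# Hypothetical input: a binary (segmented) image of the root system.
mask = imread("root_segmented.png", as_gray=True) > 0.5
skeleton = skeletonize(mask)

# Build a graph: one node per skeleton pixel, edges between 8-connected
# neighbours, weighted by Euclidean distance so that path lengths
# approximate root length.
graph = nx.Graph()
pixel_set = {tuple(p) for p in np.argwhere(skeleton)}
for r, c in pixel_set:
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) == (0, 0):
                continue
            nb = (r + dr, c + dc)
            if nb in pixel_set:
                graph.add_edge((r, c), nb, weight=np.hypot(dr, dc))

# Tips (degree 1) and branching points (degree > 2) are natural anchors
# for later assigning edges to individual roots.
tips = [n for n, d in graph.degree() if d == 1]
branches = [n for n, d in graph.degree() if d > 2]
print(f"{len(tips)} tips, {len(branches)} branching points")
```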

  19. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    Ehrlich, B.J.; Hall, R.C.; Thiede, P.W.

    1979-01-01

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultra-violet, infra-red or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  20. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzing basic infrastructure requirements, identifying related infrastructure issues, concerns, and vulnerabilities, and offering recommended solutions.