WorldWideScience

Sample records for automated mass action

  1. Mining Repair Actions for Guiding Automated Program Fixing

    OpenAIRE

    Martinez, Matias; Monperrus, Martin

    2012-01-01

    Automated program fixing consists of generating source code in order to fix bugs in an automated manner. Our intuition is that automated program fixing can imitate human-based program fixing. Hence, we present a method to mine repair actions from software repositories. A repair action is a small semantic modification on code such as adding a method call. We then decorate repair actions with a probability distribution also learnt from software repositories. Our probabilistic repair models enab...

  2. Automated Intelligent Assistant for mass spectrometry operation

    International Nuclear Information System (INIS)

    Filby, E.E.; Rankin, R.A.; Yoshida, D.E.

    1991-01-01

    The Automated Intelligent Assistant is designed to ensure that our mass spectrometers produce timely, high-quality measurement data. The design combines instrument interfacing and expert system technology to automate an adaptable set-point damage prevention strategy. When shutdowns occur, the Assistant can help guide troubleshooting efforts. Stored real-time data will help our development program upgrade and improve the system, and also make it possible to re-run previously observed instrument problems as "live" training exercises for the instrument operators. Initial work has focused on implementing the Assistant for the instrument's ultra-high vacuum components. 14 refs., 5 figs.

  3. Impact of automation on mass spectrometry.

    Science.gov (United States)

    Zhang, Yan Victoria; Rockwood, Alan

    2015-10-23

    Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity, and multiplex capability, and it can measure analytes in various matrices. Despite these advantages, LC-MS/MS remains costly and labor intensive, and its throughput is limited. This specialized technology requires highly trained personnel and has therefore largely been limited to large institutions, academic organizations and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on the features essential for total LC-MS/MS automation, and proposes a step-wise, incremental approach to achieve total automation by reducing human intervention, increasing throughput, and eventually integrating the LC-MS/MS system into automated clinical laboratory operations. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Mass Action and Conservation of Current

    Directory of Open Access Journals (Sweden)

    Eisenberg Robert S.

    2016-10-01

    The law of mass action does not force a series of chemical reactions to have the same current flow everywhere. Interruption of a far-away current does not stop current everywhere in a series of chemical reactions (analyzed according to the law of mass action), which therefore does not obey Maxwell's equations. An additional constraint and equation is needed to enforce global continuity of current. The additional constraint is introduced in this paper in the special case that the chemical reaction describes spatial movement through narrow channels. In that case, a fully consistent treatment is possible using different models of charge movement. The general case must be dealt with by variational methods that enforce consistency of all the physical laws involved. Violations of current continuity arise away from equilibrium, when current flows and the law of mass action is applied to a non-equilibrium situation, different from the systems considered when the law was originally derived. Device design in the chemical world is difficult because simple laws are not obeyed in that way. Rate constants of the law of mass action are found experimentally to change from one set of conditions to another. The law of mass action is not robust in most cases and cannot serve the same role that circuit models do in our electrical technology. Robust models and device designs in the chemical world will not be possible until continuity of current is embedded in a generalization of the law of mass action using a consistent variational model of energy and dissipation.
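
    The paper's central claim, that mass-action kinetics does not enforce equal current through every step of a series of reactions, can be illustrated with a toy numerical sketch. The rate constants and initial concentrations below are illustrative, not from the paper: for the chain A → B → C, the step fluxes k1[A] and k2[B] generally differ away from steady state.

    ```python
    # Mass-action kinetics for the chain A -> B -> C.
    # The flux ("current") through each step is j1 = k1*A and j2 = k2*B;
    # nothing in the law of mass action forces j1 == j2 away from steady state.

    k1, k2 = 1.0, 0.5        # illustrative rate constants (1/s)
    A, B, C = 1.0, 0.0, 0.0  # initial concentrations
    dt, steps = 1e-3, 5000   # simple forward-Euler integration to t = 5 s

    for _ in range(steps):
        j1 = k1 * A          # flux A -> B
        j2 = k2 * B          # flux B -> C
        A += -j1 * dt
        B += (j1 - j2) * dt
        C += j2 * dt

    j1, j2 = k1 * A, k2 * B
    print(f"j1 = {j1:.4f}, j2 = {j2:.4f}")  # the two step "currents" differ
    ```

    Total mass A + B + C stays constant, yet the two step fluxes are unequal whenever the system is away from steady state, which is the discontinuity of "current" the abstract describes.
    
    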

  5. Computer automation of an accelerator mass spectrometry system

    International Nuclear Information System (INIS)

    Gressett, J.D.; Maxson, D.L.; Matteson, S.; McDaniel, F.D.; Duggan, J.L.; Mackey, H.J.; North Texas State Univ., Denton, TX; Anthony, J.M.

    1989-01-01

    The determination of trace impurities in electronic materials using accelerator mass spectrometry (AMS) requires efficient automation of the beam transport and mass discrimination hardware. The ability to choose between a variety of charge states, isotopes and injected molecules is necessary to provide survey capabilities similar to those available on conventional mass spectrometers. This paper will discuss automation hardware and software for flexible, high-sensitivity trace analysis of electronic materials, e.g. Si, GaAs and HgCdTe. Details regarding settling times will be presented, along with proof-of-principle experimental data. Potential and present applications will also be discussed. (orig.)

  6. Computer automated mass spectrometer for isotope analysis on gas samples

    International Nuclear Information System (INIS)

    Pamula, A.; Kaucsar, M.; Fatu, C.; Ursu, D.; Vonica, D.; Bendea, D.; Muntean, F.

    1998-01-01

    A low resolution, high precision instrument was designed and realized in the mass spectrometry laboratory of the Institute of Isotopic and Molecular Technology, Cluj-Napoca. The paper presents the vacuum system, the sample inlet system, the ion source, the magnetic analyzer and the ion collector. The instrument is almost completely automated. The analog-to-digital conversion circuits, the local control microcomputer, the automation systems, and the performance checks are described. (authors)

  7. A portable, automated, inexpensive mass and balance calibration system

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1987-01-01

    Reliable mass measurements are essential for a nuclear production facility or process control laboratory. DOE Order 5630.2 requires that traceable standards be used to calibrate and monitor equipment used for nuclear material measurements. To ensure the reliability of mass measurements and to comply with DOE traceability requirements, a portable, automated mass and balance calibration system is used at the Savannah River Plant. Automation is achieved using an EPSON HX-20 notebook computer, which can be operated via RS232C interfacing to electronic balances or function with manual data entry if computer interfacing is not feasible. This economical, comprehensive, user-friendly system has three main functions in a mass measurement control program (MMCP): balance certification, calibration of mass standards, and daily measurement of traceable standards. The balance certification program tests for accuracy, precision, sensitivity, linearity, and corner-loading versus specific requirements. The mass calibration program allows rapid calibration of inexpensive mass standards traceable to certified Class S standards. This MMCP permits daily measurement of traceable standards to monitor the reliability of balances during routine use. The automated system verifies balance calibration, stores results for future use, and provides a printed control chart of the stored data. Another feature of the system permits three different weighing routines that accommodate our need for varying degrees of reliability in routine weighing operations. 1 ref
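
    The daily measurement of traceable standards described above amounts to comparing repeated weighings of a certified standard against a tolerance and simple control statistics. A minimal sketch of such a check follows; all values are illustrative, since the actual SRS tolerances and routines are not given in this abstract.

    ```python
    # Daily check of a balance against a traceable mass standard:
    # compare the mean of repeated weighings to the certified value,
    # and compute the spread for a control chart. Numbers illustrative.

    CERTIFIED_G = 100.00000      # certified value of the standard (g)
    TOLERANCE_G = 0.00050        # allowed deviation of the mean (g)

    weighings = [100.00012, 99.99985, 100.00021, 99.99998, 100.00007]

    mean = sum(weighings) / len(weighings)
    variance = sum((w - mean) ** 2 for w in weighings) / (len(weighings) - 1)
    std = variance ** 0.5        # sample standard deviation (precision)

    accuracy_ok = abs(mean - CERTIFIED_G) <= TOLERANCE_G
    print(f"mean = {mean:.5f} g, std = {std:.5f} g, accuracy_ok = {accuracy_ok}")
    ```

    A real MMCP would also chart each day's mean against historical control limits and flag drift; this sketch shows only the per-day accuracy/precision computation.
    
    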

  9. Automation of a thermal ionisation mass spectrometer

    International Nuclear Information System (INIS)

    Pamula, A.; Leuca, M.; Albert, S.; Benta, Adriana

    2001-01-01

    A thermal ionization mass spectrometer was upgraded in order to be monitored by a PC. A PC-LMP-16 National Instruments data acquisition board was used for the ion current channel and the Hall signal channel. A dedicated interface was built to allow commands from the computer to the current supply of the analyzing magnet and to the high voltage unit of the mass spectrometer. A software application was developed to perform adjustment of the spectrometer, magnetic scanning and mass spectra acquisition, data processing and isotope ratio determination. The apparatus is used for 235U/238U isotope ratio determination near the natural abundance. A peak-jumping technique is applied to choose between the 235U and 238U signals by switching the high voltage applied to the ion source between two preset values. This avoids the delay between the acquisition of the peaks of interest that would appear in the case of a 'pure' magnetic scan. Corrections are applied for mass discrimination effects and a statistical treatment of the data is performed. (authors)

  10. Automated, parallel mass spectrometry imaging and structural identification of lipids

    DEFF Research Database (Denmark)

    Ellis, Shane R.; Paine, Martin R.L.; Eijkel, Gert B.

    2018-01-01

    We report a method that enables automated data-dependent acquisition of lipid tandem mass spectrometry data in parallel with a high-resolution mass spectrometry imaging experiment. The method does not increase the total image acquisition time and is combined with automatic structural assignments. … This lipidome-per-pixel approach automatically identified and validated 104 unique molecular lipids and their spatial locations from rat cerebellar tissue.

  11. Automated mass spectrum generation for new physics

    CERN Document Server

    Alloul, Adam; De Causmaecker, Karen; Fuks, Benjamin; Rausch de Traubenberg, Michel

    2013-01-01

    We describe an extension of the FeynRules package dedicated to the automatic generation of the mass spectrum associated with any Lagrangian-based quantum field theory. After introducing a simplified way to implement particle mixings, we present a new class of FeynRules functions allowing both for the analytical computation of all the model mass matrices and for the generation of a C++ package, dubbed ASperGe. This program can then be further employed for a numerical evaluation of the rotation matrices necessary to diagonalize the field basis. We illustrate these features in the context of the Two-Higgs-Doublet Model, the Minimal Left-Right Symmetric Standard Model and the Minimal Supersymmetric Standard Model.
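
    ASperGe itself is not reproduced here, but its numerical core, diagonalizing a symmetric mass(-squared) matrix to obtain the mass eigenvalues and the rotation matrix, can be sketched for the 2×2 case. The matrix entries below are arbitrary illustrative values, not taken from any of the models named above.

    ```python
    import math

    # Illustrative symmetric 2x2 mass-squared matrix M2 = [[a, c], [c, b]]
    # (arbitrary values in arbitrary units).
    a, b, c = 4.0, 3.0, 1.0

    # Rotation angle that diagonalizes M2: tan(2*theta) = 2c / (a - b).
    theta = 0.5 * math.atan2(2.0 * c, a - b)

    # Mass-squared eigenvalues: (a + b)/2 +/- sqrt(((a - b)/2)^2 + c^2).
    mean = 0.5 * (a + b)
    half = math.hypot(0.5 * (a - b), c)
    m2_heavy, m2_light = mean + half, mean - half

    # Check: rotating by theta kills the off-diagonal entry of M2.
    cos, sin = math.cos(theta), math.sin(theta)
    off_diag = (b - a) * sin * cos + c * (cos * cos - sin * sin)
    print(m2_light, m2_heavy, off_diag)
    ```

    For larger mass matrices a tool like ASperGe performs the same task numerically with a general eigensolver; the 2×2 closed form above just makes the rotation-matrix construction explicit.
    
    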

  12. Perspectives on bioanalytical mass spectrometry and automation in drug discovery.

    Science.gov (United States)

    Janiszewski, John S; Liston, Theodore E; Cole, Mark J

    2008-11-01

    The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.

  13. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    … of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time … infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in resolution and accuracy … infusion analyses of crude extracts to find the relationships between species of the terverticillate Penicillium group, and also that the ions responsible for the segregation can be identified. Furthermore, the method can automate the detection of unique species and unique metabolites …

  14. The development of a completely automated oxygen isotope mass spectrometer

    International Nuclear Information System (INIS)

    Ahern, T.K.

    1980-01-01

    A completely automated mass spectrometer system has been developed to measure the oxygen isotope ratio of carbon dioxide samples. The system has an accuracy of 0.03 percent, and is capable of analyzing more than 100 samples a day. The system uses an Interdata minicomputer as the primary controller. The intelligence of the system is contained within hardware circuits, software within the minicomputer, and firmware written for a Motorola 6802 microprocessor. A microprocessor-based inlet system controller maximizes the throughput of carbon dioxide samples within the inlet system. The inlet system normally contains four different aliquots of carbon dioxide and introduces these samples to the mass spectrometer through a single admittance leak. The system has been used in the analysis of 111 samples of ice taken from the Steele glacier
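
    Oxygen isotope results from such an instrument are conventionally reported in delta notation relative to a reference ratio; the quoted accuracy of 0.03 percent corresponds to roughly 0.3 permil on that scale. A minimal sketch of the conversion follows; the sample ratio is illustrative, and the VSMOW 18O/16O value is assumed as the reference.

    ```python
    # Delta notation for an oxygen isotope measurement, in permil:
    #   delta18O = (R_sample / R_standard - 1) * 1000,
    # where R is the 18O/16O ratio. The reference is the VSMOW ratio;
    # the sample ratio below is illustrative.

    R_STANDARD = 0.0020052   # 18O/16O of VSMOW
    r_sample = 0.0019992     # measured 18O/16O (illustrative)

    delta18O = (r_sample / R_STANDARD - 1.0) * 1000.0
    print(f"delta18O = {delta18O:.2f} permil")
    ```
    
    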

  15. Automated spike preparation system for Isotope Dilution Mass Spectrometry (IDMS)

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1990-01-01

    Isotope Dilution Mass Spectrometry (IDMS) is a method frequently employed to measure dissolved, irradiated nuclear materials. A known quantity of a unique isotope of the element to be measured (referred to as the "spike") is added to the solution containing the analyte. The resulting solution is chemically purified and then analyzed by mass spectrometry. By measuring the response for each isotope, including the unique spike isotope, and relating this to the known quantity of the spike, the quantity of the nuclear material can be determined. An automated spike preparation system was developed at the Savannah River Site (SRS) to dispense spikes for use in IDMS analytical methods. Prior to this development, technicians weighed each individual spike manually to achieve the accuracy required. This procedure was time-consuming and subjected the master stock solution to evaporation. The new system employs a high precision SMI Model 300 Unipump dispenser interfaced with an electronic balance and a portable Epson HX-20 notebook computer to automate spike preparation
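
    For a two-isotope element and a single spike, the IDMS calculation described above reduces to the textbook isotope-ratio formula sketched below. All numbers are illustrative, not from the SRS procedure.

    ```python
    # Single-spike IDMS for a two-isotope element. R denotes the
    # reference/spike isotope ratio (e.g. 238U/233U if 233U is the
    # spike isotope). All values illustrative.

    R_sample = 137.8    # isotope ratio in the unspiked sample
    R_spike = 0.01      # isotope ratio in the spike (nearly pure spike isotope)
    R_mix = 5.0         # measured ratio in the spiked blend
    n_spike = 1.0e-6    # moles of spike isotope added (known from spike prep)

    # Moles of the spike isotope contributed by the sample itself:
    n_sample_spike_isotope = n_spike * (R_spike - R_mix) / (R_mix - R_sample)

    # Moles of the reference isotope in the sample:
    n_sample_reference = R_sample * n_sample_spike_isotope
    print(f"reference isotope in sample: {n_sample_reference:.3e} mol")
    ```

    This is why spike quantities must be known so accurately: the result scales linearly with n_spike, so any dispensing error propagates directly into the assay.
    
    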

  16. Automating ActionScript Projects with Eclipse and Ant

    CERN Document Server

    Koning, Sidney

    2011-01-01

    Automating repetitive programming tasks is easier than many Flash/AS3 developers think. With the Ant build tool, the Eclipse IDE, and this concise guide, you can set up your own "ultimate development machine" to code, compile, debug, and deploy projects faster. You'll also get started with versioning systems, such as Subversion and Git. Create a consistent workflow for multiple machines, or even complete departments, with the help of extensive Ant code samples. If you want to work smarter and take your skills to a new level, this book will get you on the road to automation with Ant.

  17. Hypervascular mediastinal masses: Action points for radiologists

    Energy Technology Data Exchange (ETDEWEB)

    Cabral, Fernanda C.; Trotman-Dickenson, Beatrice; Madan, Rachna, E-mail: rmadan@partners.org

    2015-03-15

    Highlights: •An algorithm combining clinical data and radiologic features of hypervascular mediastinal masses is proposed to determine further evaluation and subsequently guide treatment. •Characteristic features and known associations with syndromes and genetic mutations assist in achieving a diagnosis. •MRI and functional imaging can be very helpful in the evaluation of hypervascular mediastinal masses. •Identification of hypervascularity within a mediastinal mass should alert the radiologist and clinician; percutaneous CT-guided biopsies should preferably be avoided and tissue sampling attempted surgically, with better control of post-procedure hemorrhage. -- Abstract: Hypervascular mediastinal masses are a distinct group of rare diseases that include a subset of benign and malignant entities. Characteristic features and known associations with syndromes and genetic mutations assist in achieving a diagnosis. Imaging allows an understanding of the vascularity of the lesion and should alert the radiologist and clinician to potential hemorrhagic complications and the need to avoid percutaneous CT-guided biopsies. In such cases, pre-procedure embolization and surgical biopsy may be considered for better control of post-procedure hemorrhage. The purpose of this article is to describe and illustrate the clinical features and radiologic spectrum of hypervascular mediastinal masses, and to discuss the associated clinical and genetic syndromes. We will present an imaging algorithm to determine further evaluation and subsequently guide treatment.

  19. Automated mass correction and data interpretation for protein open-access liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Wagner, Craig D; Hall, John T; White, Wendy L; Miller, Luke A D; Williams, Jon D

    2007-02-01

    Characterization of recombinant protein purification fractions and final products by liquid chromatography-mass spectrometry (LC/MS) is requested more frequently each year. A protein open-access (OA) LC/MS system was developed in our laboratory to meet this demand. This paper compares the system that we originally implemented in our facilities in 2003 to the one now in use, and discusses, in more detail, recent enhancements that have improved its robustness, reliability, and data reporting capabilities. The system utilizes instruments equipped with reversed-phase chromatography and an orthogonal-acceleration time-of-flight mass spectrometer fitted with an electrospray source. Sample analysis requests are made using a simple form on a web-enabled laboratory information management system (LIMS). This distributed form is accessible from any intranet-connected company desktop computer. Automated data acquisition and processing are performed using a combination of in-house (OA-Self Service, OA-Monitor, and OA-Analysis Engine) and vendor-supplied programs (AutoLynx and OpenLynx) located on acquisition computers and off-line processing workstations. Analysis results are then reported via the same web-based LIMS. Also presented are solutions to problems not addressed by commercially available, small-molecule OA-LC/MS systems. These include automated transformation of mass-to-charge (m/z) spectra to mass spectra and automated data interpretation that considers minor variants to the protein sequence, such as common post-translational modifications (PTMs). Currently, our protein OA-LC/MS platform runs on five LC/MS instruments located in three separate GlaxoSmithKline R&D sites in the US and UK. To date, more than 8000 protein OA-LC/MS samples have been analyzed. With these user-friendly and highly automated OA systems in place, mass spectrometry plays a key role in assessing the quality of recombinant proteins, either produced at our facilities or bought from external …

  20. Automating the Detection of Reflection-on-Action

    Science.gov (United States)

    Saucerman, Jenny; Ruis, A. R.; Shaffer, David Williamson

    2017-01-01

    Learning to solve "complex problems"--problems whose solutions require the application of more than basic facts and skills--is critical to meaningful participation in the economic, social, and cultural life of the digital age. In this paper, we use a theoretical understanding of how professionals use reflection-in-action to solve complex…

  1. UV Photodissociation Action Spectroscopy of Haloanilinium Ions in a Linear Quadrupole Ion Trap Mass Spectrometer

    Science.gov (United States)

    Hansen, Christopher S.; Kirk, Benjamin B.; Blanksby, Stephen J.; O'Hair, Richard. A. J.; Trevitt, Adam J.

    2013-06-01

    UV-vis photodissociation action spectroscopy is becoming increasingly prevalent because of advances in, and commercial availability of, ion trapping technologies and tunable laser sources. This study outlines in detail an instrumental arrangement, combining a commercial ion-trap mass spectrometer and tunable nanosecond pulsed laser source, for performing fully automated photodissociation action spectroscopy on gas-phase ions. The components of the instrumentation are outlined, including the optical and electronic interfacing, in addition to the control software for automating the experiment and performing online analysis of the spectra. To demonstrate the utility of this ensemble, the photodissociation action spectra of 4-chloroanilinium, 4-bromoanilinium, and 4-iodoanilinium cations are presented and discussed. Multiple photoproducts are detected in each case and the photoproduct yields are followed as a function of laser wavelength. It is shown that the wavelength-dependent partitioning of the halide loss, H loss, and NH3 loss channels can be broadly rationalized in terms of the relative carbon-halide bond dissociation energies and processes of energy redistribution. The photodissociation action spectrum of (phenyl)Ag2+ is compared with a literature spectrum as a further benchmark.
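
    A common reduction step in such automated experiments converts per-wavelength photoproduct counts into an action spectrum by normalizing the fragment yield to photon flux, which is proportional to pulse energy times wavelength. The sketch below uses made-up counts; actual normalization conventions vary between laboratories.

    ```python
    # Photodissociation action spectrum sketch: at each wavelength,
    # yield = fragment / (fragment + precursor), normalized by relative
    # photon flux (proportional to pulse_energy * wavelength).
    # All counts and energies are illustrative.

    data = [
        # (wavelength_nm, precursor_counts, fragment_counts, pulse_energy_mJ)
        (260.0, 9000.0, 1000.0, 1.2),
        (270.0, 8000.0, 2000.0, 1.0),
        (280.0, 9500.0, 500.0, 1.1),
    ]

    spectrum = []
    for wl, precursor, fragment, energy in data:
        yield_frac = fragment / (fragment + precursor)
        photon_flux = energy * wl          # relative photon flux
        spectrum.append((wl, yield_frac / photon_flux))

    print(spectrum)   # (wavelength, normalized photofragment yield) pairs
    ```
    
    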

  2. On the network thermodynamics of mass action chemical reaction networks

    NARCIS (Netherlands)

    Schaft, A.J. van der; Rao, S.; Jayawardhana, B.

    In this paper we elaborate on the mathematical formulation of mass action chemical reaction networks as recently given in van der Schaft, Rao, Jayawardhana (2012). We show how the reference chemical potentials define a specific thermodynamical equilibrium, and we discuss the port-Hamiltonian …

  3. Open complex-balanced mass action chemical reaction networks

    NARCIS (Netherlands)

    Rao, Shodhan; van der Schaft, Arjan; Jayawardhana, Bayu

    We consider open chemical reaction networks, i.e. ones with inflows and outflows. We assume that all the inflows to the network are constant and all outflows obey the mass action kinetics rate law. We define a complex-balanced open reaction network as one that admits a complex-balanced steady state.
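
    The assumed rate laws can be made concrete with a toy open network: a constant inflow into species A, a mass-action reaction A → B, and a mass-action outflow from B. All parameters are illustrative; at steady state the inflow fixes A* = inflow/k_reaction and B* = inflow/k_outflow.

    ```python
    # Toy open reaction network: (inflow) -> A -> B -> (outflow).
    # The inflow is constant; the reaction and the outflow follow the
    # mass action rate law (rates proportional to concentrations).

    inflow = 1.0          # constant inflow rate into A
    k_reaction = 2.0      # A -> B rate constant
    k_outflow = 0.5       # outflow rate constant for B
    A, B = 0.0, 0.0
    dt = 1e-3

    # Forward-Euler integration to t = 50, long enough to reach steady state.
    for _ in range(50000):
        dA = inflow - k_reaction * A
        dB = k_reaction * A - k_outflow * B
        A += dA * dt
        B += dB * dt

    # Expected steady state: A* = inflow/k_reaction, B* = inflow/k_outflow.
    print(A, B)
    ```
    
    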

  4. Balance between automation and human actions in nuclear power plant operation. Results of international cooperation

    International Nuclear Information System (INIS)

    Sun, B.; Olmstead, R.; Oudiz, A.; Jenkinson, J.; Kossilov, A.

    1990-01-01

    Automation has long been an established feature of power plants. In some applications, the use of automation has been the significant factor which has enabled plant technology to progress to its current state. Societal demands for increased levels of safety have led to greater use of redundancy and diversity and this, in turn, has increased levels of automation. However, possibly the greatest contributory factor in increased automation has resulted from improvements in information technology. Much recent attention has been focused on the concept of inherently safe reactors, which may simplify safety system requirements and information and control system complexity. The allocation of tasks between man and machine may be one of the most critical activities in the design of new nuclear plants and major retro-fits, and it therefore warrants a design approach which is commensurate in quality with the high levels of safety and production performance sought from nuclear plants. Facing this climate, in 1989 the International Atomic Energy Agency (IAEA) formed an advisory group from member countries with extensive experience in nuclear power plant automation. The task of this group was to advise on the appropriate balance between manual and automatic actions in plant operation. (author)

  6. The mass-action-law theory of micellization revisited.

    Science.gov (United States)

    Rusanov, Anatoly I

    2014-12-09

    Among numerous definitions of the critical micelle concentration (CMC), there is one related to the constant K of the mass action law as CMC = K^(1/(1-n)) (n is the aggregation number). In this paper, this definition is generalized to multicomponent micelles, and the mass-action-law theory of micellization is developed on the basis of this definition together with the analysis of a multiple-equilibrium polydisperse micellar system. This variant of the theory of micellization looks more consistent than the earlier one. In addition, two thermodynamic findings are reported: the stability conditions for micellar systems and the dependence of aggregation numbers on the surfactant concentrations. The growth of the monomer concentration with the total surfactant concentration is shown to be a thermodynamic rule only in the case of a single sort of aggregative particles, or when a single surfactant is added to a mixture; the stability condition takes a more complex form when a mixture of aggregative particles is added. For the aggregation number of a micelle, a thermodynamic rule is deduced: it increases with the total surfactant concentration. However, if the monomer concentration increases slowly, the aggregation number increases much more slowly, and the more pronounced the maximum corresponding to a micelle on the distribution hypersurface (a curve in the one-component case), the more slowly it grows. This provides grounding for the quasi-chemical approximation in the mass-action-law theory (the constancy of aggregation numbers).
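
    The definition CMC = K^(1/(1-n)) can be checked directly: with the mass action law [A_n] = K[A]^n for the aggregation equilibrium n A ⇌ A_n, the micelle concentration equals the monomer concentration exactly when [A] = K^(1/(1-n)). A sketch with illustrative K and n values:

    ```python
    # Mass-action micellization: n A <-> A_n, with [A_n] = K * [A]**n.
    # At the CMC defined as K**(1/(1 - n)), the micelle concentration
    # equals the monomer concentration:
    #   K * c**n = c  exactly when  c = K**(1/(1 - n)).
    # K and n below are illustrative (dimensionless concentration units).

    n = 50           # aggregation number
    K = 1.0e120      # association constant

    cmc = K ** (1.0 / (1 - n))
    micelle_at_cmc = K * cmc ** n
    print(cmc, micelle_at_cmc)   # equal, by the definition above
    ```

    Below this concentration the micelle term K[A]^n is vanishingly small, and above it the term grows steeply with [A], which is why this definition marks the sharp onset of micellization.
    
    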

  7. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms was designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  8. Unified mass-action theory for virus neutralization and radioimmunology

    International Nuclear Information System (INIS)

    Trautman, R.

    1976-01-01

    All ideas implicit in the papers since 1953 on applying mass-action thermodynamics to antibody-antigen reactions are unified by the use of: (a) the intermediary concept of extent of reaction; (b) the concept of intrinsic association constant; (c) a statistical analysis for probable complexes; and (d) identification of the complex or complexes that contribute to the bioassay. Several general theoretical examples are given that show the limitations of linear interpretations of equilibrium data. Two practical examples from the literature illustrate foot-and-mouth disease virus and influenza virus neutralization. (Auth.)

  9. Mass deformed world-sheet action of semi local vortices

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yunguo [School of Space Science and Physics, Shandong University at Weihai,264209 Weihai (China); Shandong Provincial Key Laboratory of Optical Astronomy and Solar-Terrestrial Environment,264209 Weihai (China)

    2014-02-10

    The mass-deformed effective world-sheet theory of semi-local vortices was constructed via the field-theoretical method. From the Euler-Lagrange equations, the Ansätze for both the gauge field and the adjoint scalar were solved; this ensures that the zero modes of the vortices are minimal excitations of the system. Up to the 1/g{sup 2} order, all profiles are solved. The mass-deformed effective action was obtained by integrating out the transverse plane of the vortex string. The effective theory interpolates between the local vortex and the lump. Respecting certain normalization conditions, the effective theory shows a Seiberg-like duality, which agrees with the result of the Kähler quotient construction.

  10. Commissioning of an automated microphotometer used in spark-source mass spectrometry

    International Nuclear Information System (INIS)

    Pearton, D.C.G.; Heron, C.

    1983-01-01

    A description of the automated microphotometer and its operation is given, including measurement under computer control. Speed and precision tests indicate that the system is superior in every respect to manual reading of photoplates by an analyst in spark-source mass spectrometry.

  11. Electrically evoked compound action potentials artefact rejection by independent component analysis: procedure automation.

    Science.gov (United States)

    Akhoun, Idrick; McKay, Colette; El-Deredy, Wael

    2015-01-15

    Independent component analysis (ICA) successfully separated electrically evoked compound action potentials (ECAPs) from the stimulation artefact and noise (ECAP-ICA, Akhoun et al., 2013). This paper shows how to automate the ECAP-ICA artefact cancellation process. Raw ECAPs without artefact rejection were consecutively recorded for each stimulation condition from at least 8 intra-cochlear electrodes. First, amplifier-saturated recordings were discarded, and the data from different stimulus conditions (different current levels) were concatenated temporally. The key aspect of the automation procedure was the sequential deductive source categorisation after ICA was applied with a restriction to 4 sources. The stereotypical aspect of the 4 sources enables their automatic classification, on theoretical and empirical grounds, as two artefact components, a noise component and the sought ECAP. The automatic procedure was tested using 8 cochlear implant (CI) users and one to four stimulus electrodes. The artefact and noise sources were successively identified and discarded, leaving the ECAP as the remaining source. The automated ECAP-ICA procedure successfully extracted the correct ECAPs, compared to the standard clinical forward-masking paradigm, in 22 out of 26 cases. ECAP-ICA does not require extracting the ECAP from a combination of distinct buffers, as is the case with regular methods, and it does not have the possible bias of traditional artefact rejections such as alternate-polarity or forward-masking paradigms. The ECAP-ICA procedure bears clinical relevance, for example as the artefact rejection sub-module of automated ECAP-threshold detection techniques, which are common features of CI clinical fitting software. Copyright © 2014. Published by Elsevier B.V.

  12. Mass Action Stoichiometric. Simulation for Cell Factory Design

    DEFF Research Database (Denmark)

    Matos, Marta R. A.

    The focus of this thesis is kinetic models of cell metabolism. Kinetic models have been plagued by the lack of kinetic data. ...-state, giving information only about the reactions' fluxes, while the latter take into account enzyme dynamics, which makes it possible to model substrate-level enzyme regulation and get information about metabolite concentrations and reaction fluxes over time, although at the cost of introducing more parameters. In this work we start by developing a software package to create a model ensemble for individual enzymes in metabolism, where we decompose each reaction into elementary steps, using mass action kinetics to model each step. The resulting rate constants are then fitted to kinetic data (kcat, Km, Ki, etc.). We then use the package as the basis to build a system-level kinetic model. To do so, we take two different approaches, and in both we drop the assumption that χfree ≈ χtot...
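    The elementary-step decomposition described in this record can be illustrated with a toy sketch (not the thesis software): the scheme E + S ⇌ ES → E + P written as two mass-action steps and integrated with a forward-Euler loop. All rate constants and concentrations below are invented.

```python
# Toy mass-action decomposition of E + S <-> ES -> E + P into elementary
# steps, integrated with forward Euler. All values are illustrative.
k1, km1, k2 = 100.0, 50.0, 10.0      # elementary rate constants
E, S, ES, P = 1.0, 10.0, 0.0, 0.0    # initial concentrations
dt = 1e-4

for _ in range(200_000):             # integrate to t = 20 time units
    v1 = k1 * E * S - km1 * ES       # reversible binding step (mass action)
    v2 = k2 * ES                     # irreversible catalytic step
    E  += (-v1 + v2) * dt
    S  += -v1 * dt
    ES += (v1 - v2) * dt
    P  += v2 * dt

print(P, S + ES + P)                 # product formed; substrate mass balance
```

    Note that the stoichiometry conserves S + ES + P and E + ES exactly, which is a useful sanity check for any such integration.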

  13. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.
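    The reproducibility figure quoted above is a total coefficient of variation; as a reminder of the computation (with made-up replicate values, not the study's data):

```python
# Sketch: total coefficient of variation (CV%) for replicate tacrolimus
# measurements, as used above to assess reproducibility. Data are invented.
from statistics import mean, stdev

replicates = [5.2, 5.4, 5.3, 5.25, 5.35, 5.3]  # microg/L, illustrative
cv_percent = 100.0 * stdev(replicates) / mean(replicates)
print(round(cv_percent, 2))
```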

  14. Automated grouping of action potentials of human embryonic stem cell-derived cardiomyocytes.

    Science.gov (United States)

    Gorospe, Giann; Zhu, Renjun; Millrod, Michal A; Zambidis, Elias T; Tung, Leslie; Vidal, Rene

    2014-09-01

    Methods for obtaining cardiomyocytes from human embryonic stem cells (hESCs) are improving at a significant rate. However, the characterization of these cardiomyocytes (CMs) is evolving at a relatively slower rate. In particular, there is still uncertainty in classifying the phenotype (ventricular-like, atrial-like, nodal-like, etc.) of an hESC-derived cardiomyocyte (hESC-CM). While previous studies identified the phenotype of a CM based on electrophysiological features of its action potential, the criteria for classification were typically subjective and differed across studies. In this paper, we use techniques from signal processing and machine learning to develop an automated approach to discriminate the electrophysiological differences between hESC-CMs. Specifically, we propose a spectral grouping-based algorithm to separate a population of CMs into distinct groups based on the similarity of their action potential shapes. We applied this method to a dataset of optical maps of cardiac cell clusters dissected from human embryoid bodies. While some of the nine cell clusters in the dataset are presented with just one phenotype, the majority of the cell clusters are presented with multiple phenotypes. The proposed algorithm is generally applicable to other action potential datasets and could prove useful in investigating the purification of specific types of CMs from an electrophysiological perspective.
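    A minimal sketch of grouping by action-potential shape (a generic spectral-clustering toy on synthetic traces, not the authors' exact algorithm): build a similarity graph from waveform distances and split it by the sign of the Fiedler vector of the graph Laplacian.

```python
# Minimal spectral-grouping sketch (not the paper's algorithm): synthetic
# "action potential" traces are grouped by the sign of the Fiedler vector
# (second-smallest Laplacian eigenvector) of a similarity graph.
import numpy as np

t = np.linspace(0, 1, 100)
slow_ap = [np.exp(-3 * t) + 0.02 * i for i in range(3)]    # plateau-like shapes
fast_ap = [np.exp(-20 * t) + 0.02 * i for i in range(3)]   # spike-like shapes
X = np.array(slow_ap + fast_ap)

d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
W = np.exp(-d2 / d2.mean())                          # Gaussian similarity
L = np.diag(W.sum(1)) - W                            # graph Laplacian
fiedler = np.linalg.eigh(L)[1][:, 1]                 # eigh sorts eigenvalues ascending
labels = (fiedler > 0).astype(int)
print(labels)
```

    With well-separated shape families the Fiedler vector is nearly piecewise-constant, so thresholding at zero recovers the two groups regardless of its arbitrary sign.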

  15. Law of mass action for co-precipitation; Loi d'action de masse de la co-precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Vitorge, P

    2008-07-01

    Coprecipitation is often understood as the incorporation of elements at trace concentrations into (initially pure) solid compounds. Coprecipitation has typically been used to identify radioactive isotopes. Coprecipitation can lower solubility compared with the solubility controlled by the pure compounds; for this reason it is also important for geochemistry, waste management and de-pollution studies. The solid obtained by coprecipitation is a new homogeneous solid phase called a solid solution. The two formulas needed to calculate the aqueous solubility controlled by the ideal AB_b(1-x)C_cx solid solution are K_s,B^(1-x) * K_s,C^x = [A^zA] * [B^zB]^(b(1-x)) * [C^zC]^(cx) / ((1-x)^(b(1-x)) * x^(cx)) and K_s,C / K_s,B = (1-x)^b * [C^zC]^c / ([B^zB]^b * x^c), where K_s,B and K_s,C are the classical constant solubility products of the AB_b and AC_c end-members, and the b and c values are calculated from the charges (z_i) of the ions and from charge balance. This report is essentially written to provide a thermodynamic demonstration of the law of mass action, in an attempt to confirm the scientific basis for solubility calculations in geosciences (typically, retention of radionuclides by coprecipitation) and to facilitate such calculations. Note that the law of mass action is here a set of 2 equations (not only 1) for ideal or near-ideal systems. Since they are consistent with the phase rule, no extra formula (besides mass balance) is needed to calculate the concentrations of all the species in both phases, namely [A^zA], [B^zB], [C^zC] and especially x.
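    The two formulas quoted in this record can be checked numerically from the end-member equilibria, taking the ideal-solution activities of the AB_b and AC_c components as (1-x)^b and x^c respectively; this reading is an assumption of the sketch, and all numeric values below are invented.

```python
# Numeric check of the two ideal solid-solution relations for AB_b(1-x)C_cx,
# assuming end-member equilibria [A][B]^b = Ks_B*(1-x)^b and
# [A][C]^c = Ks_C*x^c. All values are illustrative.
Ks_B, Ks_C = 1e-10, 4e-9     # end-member solubility products (made up)
b, c, x, A = 2, 1, 0.3, 1e-4  # stoichiometries, solid composition, [A]

B = (Ks_B * (1 - x) ** b / A) ** (1 / b)   # aqueous ion concentrations
C = (Ks_C * x ** c / A) ** (1 / c)

lhs1 = Ks_B ** (1 - x) * Ks_C ** x
rhs1 = (A * B ** (b * (1 - x)) * C ** (c * x)
        / ((1 - x) ** (b * (1 - x)) * x ** (c * x)))
lhs2 = Ks_C / Ks_B
rhs2 = (1 - x) ** b * C ** c / (B ** b * x ** c)
print(lhs1 / rhs1, lhs2 / rhs2)   # both ratios should be ~1
```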

  16. Mass Spectra-Based Framework for Automated Structural Elucidation of Metabolome Data to Explore Phytochemical Diversity

    Science.gov (United States)

    Matsuda, Fumio; Nakabayashi, Ryo; Sawada, Yuji; Suzuki, Makoto; Hirai, Masami Y.; Kanaya, Shigehiko; Saito, Kazuki

    2011-01-01

    A novel framework for automated elucidation of metabolite structures in liquid chromatography–mass spectrometer metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied for the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method. PMID:22645535

  17. Mass spectra-based framework for automated structural elucidation of metabolome data to explore phytochemical diversity

    Directory of Open Access Journals (Sweden)

    Fumio eMatsuda

    2011-08-01

    A novel framework for automated elucidation of metabolite structures in liquid chromatography-mass spectrometer (LC-MS) metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied for the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method.

  18. AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source

    Science.gov (United States)

    Nightingale, J. W.; Dye, S.; Massey, Richard J.

    2018-05-01

    This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single-component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two components are geometrically aligned. The complexity of the light and mass models is automatically chosen via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.
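    The Sersic function used above for the lens light can be sketched directly; the b_n ≈ 2n - 1/3 normalization is a common approximation (valid for n above roughly 0.5), and the parameter values below are illustrative, not fitted.

```python
# Sketch of the Sersic surface-brightness profile used for lens light
# modeling: I(r) = I_e * exp(-b_n * ((r/R_e)^(1/n) - 1)), with the common
# approximation b_n ~ 2n - 1/3. Parameter values are illustrative.
from math import exp

def sersic(r, I_e=1.0, R_e=1.0, n=4.0):
    b_n = 2.0 * n - 1.0 / 3.0    # approximate normalization for n >~ 0.5
    return I_e * exp(-b_n * ((r / R_e) ** (1.0 / n) - 1.0))

# By construction the profile equals I_e at the effective radius R_e
# and decreases monotonically outward.
print(sersic(1.0), sersic(2.0) < sersic(1.0))
```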

  19. Estimating Regional Mass Balance of Himalayan Glaciers Using Hexagon Imagery: An Automated Approach

    Science.gov (United States)

    Maurer, J. M.; Rupper, S.

    2013-12-01

    Currently there is much uncertainty regarding the present and future state of Himalayan glaciers, which supply meltwater for river systems vital to more than 1.4 billion people living throughout Asia. Previous assessments of regional glacier mass balance in the Himalayas using various remote sensing and field-based methods give inconsistent results, and most assessments are over relatively short (e.g., single decade) timescales. This study aims to quantify multi-decadal changes in volume and extent of Himalayan glaciers through efficient use of the large database of declassified 1970-80s era Hexagon stereo imagery. Automation of the DEM extraction process provides an effective workflow for many images to be processed and glacier elevation changes quantified with minimal user input. The tedious procedure of manual ground control point selection necessary for block-bundle adjustment (as ephemeral data is not available for the declassified images) is automated using the Maximally Stable Extremal Regions algorithm, which matches image elements between raw Hexagon images and georeferenced Landsat 15 meter panchromatic images. Additional automated Hexagon DEM processing, co-registration, and bias correction allow for direct comparison with modern ASTER and SRTM elevation data, thus quantifying glacier elevation and area changes over several decades across largely inaccessible mountainous regions. As consistent methodology is used for all glaciers, results will likely reveal significant spatial and temporal patterns in regional ice mass balance. Ultimately, these findings could have important implications for future water resource management in light of environmental change.
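    Once DEM differencing yields elevation changes, the region-wide result reduces to a simple geodetic mass-balance calculation. The sketch below uses invented numbers and an assumed column density of 850 kg/m³ (a common convention, not a value from this abstract).

```python
# Geodetic mass balance from DEM differencing (illustrative numbers):
# mean elevation change over the epoch, converted to water equivalent
# with an assumed ice column density of 850 kg/m^3.
mean_dh_m = -8.0       # mean glacier elevation change over the epoch (made up)
years     = 32.0       # e.g. a 1975-2007 comparison interval (made up)
rho_ice   = 850.0      # kg/m^3, common density assumption
rho_water = 1000.0

mb_mwe_per_yr = mean_dh_m / years * rho_ice / rho_water  # m w.e. per year
print(mb_mwe_per_yr)
```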

  20. A fully automated mass spectrometer for the analysis of organic solids

    International Nuclear Information System (INIS)

    Hillig, H.; Kueper, H.; Riepe, W.

    1979-01-01

    Automation of a mass spectrometer-computer system makes it possible to process up to 30 samples without attention after sample loading. An automatic sample changer introduces the samples successively into the ion source by means of a direct inlet probe. A process control unit determines the operation sequence. Computer programs are available for the hardware support, system supervision and evaluation of the spectrometer signals. The most essential precondition for automation - automatic evaporation of the sample material by electronic control of the total ion current - is confirmed to be satisfactory. The system operates routinely overnight in an industrial laboratory, so that day work can be devoted to difficult analytical problems. The cost of routine analyses is halved. (Auth.)

  1. Robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming.

    Science.gov (United States)

    Baran, Richard; Northen, Trent R

    2013-10-15

    Untargeted metabolite profiling using liquid chromatography and mass spectrometry coupled via electrospray ionization is a powerful tool for the discovery of novel natural products, metabolic capabilities, and biomarkers. However, the elucidation of the identities of uncharacterized metabolites from spectral features remains challenging. A critical step in the metabolite identification workflow is the assignment of redundant spectral features (adducts, fragments, multimers) and calculation of the underlying chemical formula. Inspection of the data by experts using computational tools solving partial problems (e.g., chemical formula calculation for individual ions) can be performed to disambiguate alternative solutions and provide reliable results. However, manual curation is tedious and not readily scalable or standardized. Here we describe an automated procedure for the robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming optimization (RAMSI). Chemical rules among related ions are expressed as linear constraints and both the spectra interpretation and chemical formula calculation are performed in a single optimization step. This approach is unbiased in that it does not require predefined sets of neutral losses and positive and negative polarity spectra can be combined in a single optimization. The procedure was evaluated with 30 experimental mass spectra and was found to effectively identify the protonated or deprotonated molecule ([M + H](+) or [M - H](-)) while being robust to the presence of background ions. RAMSI provides a much-needed standardized tool for interpreting ions for subsequent identification in untargeted metabolomics workflows.
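    Chemical formula calculation, one sub-problem that RAMSI casts as a MILP, can be illustrated with a plain brute-force search (a simplified stand-in, not RAMSI itself): enumerate CHNO formulas whose monoisotopic mass lies within a ppm tolerance of a measured neutral mass, with a ring/double-bond-equivalent rule as a simple chemical constraint.

```python
# Brute-force chemical formula calculation for one measured neutral mass
# (a simplified stand-in for one sub-problem RAMSI solves as a MILP):
# enumerate CcHhNnOo formulas within 5 ppm of the target monoisotopic mass.
MASS = {"C": 12.0, "H": 1.00782503, "N": 14.00307401, "O": 15.99491462}

def formulas(target, ppm=5.0):
    tol = target * ppm * 1e-6
    hits = []
    for c in range(1, 21):
        for h in range(0, 41):
            for n in range(0, 6):
                rdbe = c - h / 2.0 + n / 2.0 + 1   # ring/double-bond equivalents
                if rdbe < 0:
                    continue                       # chemically implausible
                rest = target - MASS["C"] * c - MASS["H"] * h - MASS["N"] * n
                o = round(rest / MASS["O"])        # best oxygen count
                if 0 <= o <= 20 and abs(rest - o * MASS["O"]) <= tol:
                    hits.append((c, h, n, o))
    return hits

print(formulas(180.06339))   # glucose C6H12O6 is among the hits
```

    A MILP formulation expresses the same mass window and valence rules as linear constraints over the element counts, which scales far better once adducts and fragments are tied together.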

  2. Automated work-flow for processing high-resolution direct infusion electrospray ionization mass spectral fingerprints

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    2007-01-01

    The use of mass spectrometry (MS) is pivotal in analyses of the metabolome and presents a major challenge for subsequent data processing. While the last few years have given new high-performance instruments, there has not been a comparable development in data processing. In this paper we discuss an automated data processing pipeline to compare large numbers of fingerprint spectra from direct infusion experiments analyzed by high resolution MS. We describe some of the intriguing problems that have to be addressed, starting with the conversion and pre-processing of the raw data to the final data...
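    One pre-processing step such a pipeline needs, reducing raw infusion spectra to comparable fingerprints, can be sketched as binning onto a common m/z grid followed by a cosine comparison. The peak lists below are invented toy data, not from the paper.

```python
# Toy pre-processing step for direct-infusion fingerprints: bin peak lists
# onto a common m/z grid, then compare spectra by cosine similarity.
from math import floor, sqrt

def binned(peaks, width=1.0, mz_max=500):
    vec = [0.0] * int(mz_max / width)
    for mz, intensity in peaks:
        vec[int(floor(mz / width))] += intensity   # sum intensity per bin
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

s1 = binned([(121.05, 80.0), (279.09, 100.0), (301.07, 40.0)])
s2 = binned([(121.06, 75.0), (279.10, 95.0), (415.21, 10.0)])
print(cosine(s1, s2))
```

    Real pipelines face the harder versions of this: drifting calibration, overlapping peaks and resolution-dependent bin widths, which is exactly the pre-processing discussion the record points to.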

  3. Automated sample preparation using membrane microtiter extraction for bioanalytical mass spectrometry.

    Science.gov (United States)

    Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H

    1997-01-01

    The development and application of membrane solid-phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and eliminating the need for the time-consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance analytical precision, a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min, about 30 times faster than manual solvent extraction or single-cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography/mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.

  4. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    International Nuclear Information System (INIS)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F.; Prasanna, P.G.S.

    2007-01-01

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and medical
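    The chain-of-custody idea behind the barcode scanning described above can be sketched as a barcode-keyed event log; the station names and barcode below are invented for illustration, not AFRRI's actual LIMS schema.

```python
# Minimal sketch of a barcode-keyed chain-of-custody log, illustrating the
# sample-tracking idea. Station names and the barcode are invented.
from datetime import datetime, timezone

custody_log = {}

def scan(barcode, station):
    """Record one processing step for a sample at a scanning station."""
    custody_log.setdefault(barcode, []).append(
        (station, datetime.now(timezone.utc).isoformat()))

scan("AFRRI-000123", "blood-collection")
scan("AFRRI-000123", "metaphase-harvester")
scan("AFRRI-000123", "slide-stainer")

stations = [s for s, _ in custody_log["AFRRI-000123"]]
print(stations)
```

    Keeping the log append-only with timestamps at each critical step is what lets the system both detect transcription errors and reconstruct a sample's positive chain of custody.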

  5. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    Energy Technology Data Exchange (ETDEWEB)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States); Prasanna, P.G.S. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States)], E-mail: prasanna@afrri.usuhs.mil

    2007-07-15

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and

  6. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    Science.gov (United States)

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most possible protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
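    The protection factors mentioned above come from modeling deuterium uptake; a single-amide sketch (ignoring back exchange and the multi-site fitting the paper's MassAnalyzer software performs) shows how one observed uptake maps to a protection factor. The k_int, P and t values are invented.

```python
# Single-amide HDX sketch: fractional deuterium uptake follows
# D(t) = 1 - exp(-(k_int / P) * t), so a protection factor can be read
# back from one observed uptake. All values are illustrative; real data
# involve many amides plus back-exchange correction.
from math import exp, log

k_int = 10.0       # intrinsic exchange rate, 1/min (illustrative)
P_true = 500.0     # protection factor
t = 60.0           # labeling time, min

D = 1.0 - exp(-(k_int / P_true) * t)    # simulated fractional uptake
P_est = k_int * t / -log(1.0 - D)       # invert the uptake equation
print(P_est)
```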

  7. Automated thermochemolysis reactor for detection of Bacillus anthracis endospores by gas chromatography–mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dan [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602 (United States); Rands, Anthony D.; Losee, Scott C. [Torion Technologies, American Fork, UT 84003 (United States); Holt, Brian C. [Department of Statistics, Brigham Young University, Provo, UT 84602 (United States); Williams, John R. [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602 (United States); Lammert, Stephen A. [Torion Technologies, American Fork, UT 84003 (United States); Robison, Richard A. [Department of Microbiology and Molecular Biology, Brigham Young University, Provo, UT 84602 (United States); Tolley, H. Dennis [Department of Statistics, Brigham Young University, Provo, UT 84602 (United States); Lee, Milton L., E-mail: milton_lee@byu.edu [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602 (United States)

    2013-05-02

    Graphical abstract: -- Highlights: •An automated sample preparation system for Bacillus anthracis endospores was developed. •A thermochemolysis method was applied to produce and derivatize biomarkers for Bacillus anthracis detection. •The autoreactor controlled the precise delivery of reagents, and TCM reaction times and temperatures. •Solid phase microextraction was used to extract biomarkers, and GC–MS was used for final identification. •This autoreactor was successfully applied to the identification of Bacillus anthracis endospores. -- Abstract: An automated sample preparation system was developed and tested for the rapid detection of Bacillus anthracis endospores by gas chromatography–mass spectrometry (GC–MS) for eventual use in the field. This reactor is capable of automatically processing suspected bio-threat agents to release and derivatize unique chemical biomarkers by thermochemolysis (TCM). The system automatically controls the movement of sample vials from one position to another, crimping of septum caps onto the vials, precise delivery of reagents, and TCM reaction times and temperatures. The specific operations of introduction of sample vials, solid phase microextraction (SPME) sampling, injection into the GC–MS system, and ejection of used vials from the system were performed manually in this study, although they can be integrated into the automated system. Manual SPME sampling is performed by following visual and audible signal prompts for inserting the fiber into and retracting it from the sampling port. A rotating carousel design allows for simultaneous sample collection, reaction, biomarker extraction and analysis of sequential samples. Dipicolinic acid methyl ester (DPAME), 3-methyl-2-butenoic acid methyl ester (a fragment of anthrose) and two methylated sugars were used to compare the performance of the autoreactor with manual TCM. Statistical algorithms were used to construct reliable bacterial endospore signatures, and 24

  8. Automated thermochemolysis reactor for detection of Bacillus anthracis endospores by gas chromatography–mass spectrometry

    International Nuclear Information System (INIS)

    Li, Dan; Rands, Anthony D.; Losee, Scott C.; Holt, Brian C.; Williams, John R.; Lammert, Stephen A.; Robison, Richard A.; Tolley, H. Dennis; Lee, Milton L.

    2013-01-01

    Graphical abstract: -- Highlights: •An automated sample preparation system for Bacillus anthracis endospores was developed. •A thermochemolysis method was applied to produce and derivatize biomarkers for Bacillus anthracis detection. •The autoreactor controlled the precise delivery of reagents, and TCM reaction times and temperatures. •Solid phase microextraction was used to extract biomarkers, and GC–MS was used for final identification. •This autoreactor was successfully applied to the identification of Bacillus anthracis endospores. -- Abstract: An automated sample preparation system was developed and tested for the rapid detection of Bacillus anthracis endospores by gas chromatography–mass spectrometry (GC–MS) for eventual use in the field. This reactor is capable of automatically processing suspected bio-threat agents to release and derivatize unique chemical biomarkers by thermochemolysis (TCM). The system automatically controls the movement of sample vials from one position to another, crimping of septum caps onto the vials, precise delivery of reagents, and TCM reaction times and temperatures. The specific operations of introduction of sample vials, solid phase microextraction (SPME) sampling, injection into the GC–MS system, and ejection of used vials from the system were performed manually in this study, although they can be integrated into the automated system. Manual SPME sampling is performed by following visual and audible signal prompts for inserting the fiber into and retracting it from the sampling port. A rotating carousel design allows for simultaneous sample collection, reaction, biomarker extraction and analysis of sequential samples. Dipicolinic acid methyl ester (DPAME), 3-methyl-2-butenoic acid methyl ester (a fragment of anthrose) and two methylated sugars were used to compare the performance of the autoreactor with manual TCM. Statistical algorithms were used to construct reliable bacterial endospore signatures, and 24

  9. The infection rate of Daphnia magna by Pasteuria ramosa conforms with the mass-action principle.

    Science.gov (United States)

    Regoes, R R; Hottinger, J W; Sygnarski, L; Ebert, D

    2003-10-01

    In simple epidemiological models that describe the interaction between hosts with their parasites, the infection process is commonly assumed to be governed by the law of mass action, i.e. it is assumed that the infection rate depends linearly on the densities of the host and the parasite. The mass-action assumption, however, can be problematic if certain aspects of the host-parasite interaction are very pronounced, such as spatial compartmentalization, host immunity which may protect from infection with low doses, or host heterogeneity with regard to susceptibility to infection. As deviations from a mass-action infection rate have consequences for the dynamics of the host-parasite system, it is important to test for the appropriateness of the mass-action assumption in a given host-parasite system. In this paper, we examine the relationship between the infection rate and the parasite inoculum for the water flea Daphnia magna and its bacterial parasite Pasteuria ramosa. We measured the fraction of infected hosts after exposure to 14 different doses of the parasite. We find that the observed relationship between the fraction of infected hosts and the parasite dose is largely consistent with an infection process governed by the mass-action principle. However, we have evidence for a subtle but significant deviation from a simple mass-action infection model, which can be explained either by some antagonistic effects of the parasite spores during the infection process, or by heterogeneity in the hosts' susceptibility with regard to infection.
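
The mass-action assumption tested in this study implies a simple dose-response curve: if each of d spores acts independently with per-spore infection rate β, the expected fraction of infected hosts is 1 - exp(-βd), which is linear in the dose at low doses and saturates at high doses. A minimal sketch (the value of β below is purely illustrative):

```python
import math

def infected_fraction(dose, beta):
    """Expected fraction of hosts infected after exposure to `dose`
    parasite spores under the mass-action (independent-action) model:
    each spore independently establishes infection at rate `beta`."""
    return 1.0 - math.exp(-beta * dose)

# At low doses the curve is approximately linear in the dose,
# while at high doses it saturates towards 1.
low = infected_fraction(10, 1e-3)       # ~ beta * dose for small doses
high = infected_fraction(100000, 1e-3)  # close to 1
```

Deviations from this curve, such as the antagonism or host heterogeneity mentioned in the abstract, would show up as systematic misfit of this one-parameter model across the dose range.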

  10. Highly Reproducible Automated Proteomics Sample Preparation Workflow for Quantitative Mass Spectrometry.

    Science.gov (United States)

    Fu, Qin; Kowalski, Michael P; Mastali, Mitra; Parker, Sarah J; Sobhani, Kimia; van den Broek, Irene; Hunter, Christie L; Van Eyk, Jennifer E

    2018-01-05

    Sample preparation for protein quantification by mass spectrometry requires multiple processing steps including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions where technical variations can be introduced and propagated. We established an automated sample preparation workflow with a total processing time for 96 samples of 5 h, including a 2 h incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, mean intraday and interday coefficients of variation (CVs) were assessed for 5 serum and 5 plasma samples over 5 days; samples repeated on 3 separate days had total CVs below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. An automated trypsin digestion workflow yields uniformly processed samples in less than 5 h. Reproducible quantification of peptides was observed across replicates, days, instruments, and laboratory sites, demonstrating the broad applicability of this approach.
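
The reproducibility metric used throughout this abstract, the coefficient of variation, is simply the standard deviation expressed as a percentage of the mean. A minimal sketch with hypothetical replicate peak areas:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (CV%): sample standard deviation
    as a percentage of the mean of the replicate measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Five replicate peak-area measurements of one peptide
# (hypothetical numbers, not from the study above)
replicates = [1020.0, 985.0, 1001.0, 990.0, 1010.0]
cv = cv_percent(replicates)  # well below a 20% acceptance threshold
```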

  11. Balance between automation and human actions in NPP operation: Results of international co-operation

    International Nuclear Information System (INIS)

    Bastl, W.; Jenkinson, J.; Kossilov, A.; Olmstead, R.A.; Oudiz, A.; Sun, B.

    1991-01-01

    Automation is an essential feature of NPPs. The degree of automation can be seen to be increasing, owing to technical and social factors, but also as a result of advances in information technology. Deciding upon the appropriate level of automation, the allocation of functions to man, or to a machine or a combination of both may be one of the most critical aspects of NPP design. It is important that automation is carried out in a sufficiently systematic way. There appears to be need for additional guidance in this key area. In 1989 the International Atomic Energy Agency formed an advisory group to consider this problem. The group has proposed a methodology for allocating functions between man and machine. This methodology, which is described in the paper, takes account of the factors which influence the allocation process, considers viable approaches to automation and gives guidance on decision making. In addition, areas where future research may be justified are discussed. (author). 8 refs, 1 fig

  12. Pep2Path: automated mass spectrometry-guided genome mining of peptidic natural products.

    Directory of Open Access Journals (Sweden)

    Marnix H Medema

    2014-09-01

    Nonribosomally and ribosomally synthesized bioactive peptides constitute a source of molecules of great biomedical importance, including antibiotics such as penicillin, immunosuppressants such as cyclosporine, and cytostatics such as bleomycin. Recently, an innovative mass-spectrometry-based strategy, peptidogenomics, has been pioneered to effectively mine microbial strains for novel peptidic metabolites. Even though mass-spectrometric peptide detection can be performed quite fast, true high-throughput natural product discovery approaches have still been limited by the inability to rapidly match the identified tandem mass spectra to the gene clusters responsible for the biosynthesis of the corresponding compounds. With Pep2Path, we introduce a software package to fully automate the peptidogenomics approach through the rapid Bayesian probabilistic matching of mass spectra to their corresponding biosynthetic gene clusters. Detailed benchmarking of the method shows that the approach is powerful enough to correctly identify gene clusters even in data sets that consist of hundreds of genomes, which also makes it possible to match compounds from unsequenced organisms to closely related biosynthetic gene clusters in other genomes. Applying Pep2Path to a data set of compounds without known biosynthesis routes, we were able to identify candidate gene clusters for the biosynthesis of five important compounds. Notably, one of these clusters was detected in a genome from a different subphylum of Proteobacteria than that in which the molecule had first been identified. All in all, our approach paves the way towards high-throughput discovery of novel peptidic natural products. Pep2Path is freely available from http://pep2path.sourceforge.net/, implemented in Python, licensed under the GNU General Public License v3 and supported on MS Windows, Linux and Mac OS X.

  13. An Automated, High-Throughput Method for Interpreting the Tandem Mass Spectra of Glycosaminoglycans

    Science.gov (United States)

    Duan, Jiana; Jonathan Amster, I.

    2018-05-01

    The biological interactions between glycosaminoglycans (GAGs) and other biomolecules are heavily influenced by structural features of the glycan. The structure of GAGs can be assigned using tandem mass spectrometry (MS2), but analysis of these data, to date, requires manual interpretation, a slow process that presents a bottleneck to the broader deployment of this approach to solving biologically relevant problems. Automated interpretation remains a challenge, as GAG biosynthesis is not template-driven, and therefore, one cannot predict structures from genomic data, as is done with proteins. The lack of a structure database, a consequence of the non-template biosynthesis, requires a de novo approach to interpretation of the mass spectral data. We propose a model for rapid, high-throughput GAG analysis by using an approach in which candidate structures are scored for the likelihood that they would produce the features observed in the mass spectrum. To make this approach tractable, a genetic algorithm is used to greatly reduce the search-space of isomeric structures that are considered. The time required for analysis is significantly reduced compared to an approach in which every possible isomer is considered and scored. The model is coded in a software package using the MATLAB environment. This approach was tested on tandem mass spectrometry data for long-chain, moderately sulfated chondroitin sulfate oligomers that were derived from the proteoglycan bikunin. The bikunin data were previously interpreted manually. Our approach examines glycosidic fragments to localize SO3 modifications to specific residues and yields the same structures reported in the literature, only much more quickly.
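
The candidate-scoring idea described above (score each isomeric structure by how well its predicted glycosidic fragments match the observed spectrum) can be sketched with toy numbers. The residue and sulfate masses below are illustrative round placeholders, not real GAG chemistry, and the scoring is a deliberately simple stand-in for the paper's likelihood model:

```python
# Toy masses (hypothetical round numbers, not real GAG chemistry)
RESIDUE_MASS = 100.0   # backbone residue
SULFATE_MASS = 80.0    # one SO3 modification

def predicted_fragments(sulfation):
    """Glycosidic-cleavage fragment masses for a chain whose i-th
    residue carries sulfation[i] SO3 groups: the cumulative mass of
    each prefix of the chain."""
    masses, total = [], 0.0
    for n_so3 in sulfation:
        total += RESIDUE_MASS + n_so3 * SULFATE_MASS
        masses.append(total)
    return masses

def score(sulfation, observed, tol=0.5):
    """Fraction of predicted fragments matched by an observed peak
    within tolerance: a crude stand-in for likelihood scoring."""
    preds = predicted_fragments(sulfation)
    hits = sum(any(abs(p - o) <= tol for o in observed) for p in preds)
    return hits / len(preds)

# Spectrum simulated from the "true" sulfation pattern (1, 0, 1),
# whose prefix masses are 180, 280 and 460
observed = [180.0, 280.1, 459.9]
candidates = [(1, 0, 1), (0, 1, 1), (1, 1, 0)]
best = max(candidates, key=lambda s: score(s, observed))
```

In the real problem the candidate space is far too large to enumerate, which is where the genetic algorithm mentioned in the abstract comes in: it searches this space while the scoring function ranks the survivors.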

  14. Automated, feature-based image alignment for high-resolution imaging mass spectrometry of large biological samples

    NARCIS (Netherlands)

    Broersen, A.; Liere, van R.; Altelaar, A.F.M.; Heeren, R.M.A.; McDonnell, L.A.

    2008-01-01

    High-resolution imaging mass spectrometry of large biological samples is the goal of several research groups. In mosaic imaging, the most common method, the large sample is divided into a mosaic of small areas that are then analyzed with high resolution. Here we present an automated alignment

  15. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-05

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on very high precision but with comparably low sample throughput. Here, we describe the combination of automated sample combustion in an elemental analyzer (EA) coupled online to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method.

  16. Geena 2, improved automated analysis of MALDI/TOF mass spectra.

    Science.gov (United States)

    Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo

    2016-03-02

    Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related elaborations can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows a complete control over all parameters, the Bright Search Interface, which leaves to the user the possibility to tune parameters for alignment of spectra, and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other for the identification of a predictor of
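
Two of the pre-processing steps listed in this abstract, normalization against an internal standard and averaging of replicate spectra, can be sketched as follows. Spectra are represented as (m/z, intensity) lists; this is an illustrative sketch, not Geena 2's actual PHP/Perl code:

```python
def normalize(spectrum, standard_mz, tol=0.2):
    """Scale all intensities so that the internal-standard peak
    (the most intense peak within `tol` of `standard_mz`) becomes 1."""
    ref = max((i for mz, i in spectrum if abs(mz - standard_mz) <= tol),
              default=None)
    if ref is None:
        raise ValueError("internal standard peak not found")
    return [(mz, i / ref) for mz, i in spectrum]

def average(spectra):
    """Average replicate spectra that share an identical m/z axis."""
    mzs = [mz for mz, _ in spectra[0]]
    return [
        (mz, sum(s[k][1] for s in spectra) / len(spectra))
        for k, mz in enumerate(mzs)
    ]

# Two replicate spectra of the same sample, normalized to a
# hypothetical internal standard at m/z 500, then averaged
rep1 = normalize([(500.0, 200.0), (1000.0, 400.0)], standard_mz=500.0)
rep2 = normalize([(500.0, 100.0), (1000.0, 240.0)], standard_mz=500.0)
avg = average([rep1, rep2])  # intensities now comparable across replicates
```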

  17. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    The ongoing penetration of building automation by information technology is far from saturated. Today's systems must not only be reliable and fault tolerant; they must also address energy efficiency and flexibility in overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing towards energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re-)actions based on the analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy optimal systems. A recently developed model for environment recognition and decision-making processes, which is based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting goals.

  18. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part B, Characterization; robotics/automation

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part B of Volume 3 and contains the Characterization and Robotics/Automation sections

  19. Overlap valence quarks on a twisted mass sea. A case study for mixed action lattice QCD

    International Nuclear Information System (INIS)

    Cichy, Krzysztof; Herdoiza, Gregorio; UAM/CSIC Univ. Autonoma de Madrid

    2012-11-01

    We discuss a Lattice QCD mixed action investigation employing Wilson maximally twisted mass sea and overlap valence fermions. Using four values of the lattice spacing, we demonstrate that the overlap Dirac operator assumes a point-like locality in the continuum limit. We also show that by adopting suitable matching conditions for the sea and valence theories a consistent continuum limit for the pion decay constant and light baryon masses can be obtained. Finally, we confront results for sea-valence mixed meson masses and the valence scalar correlator with corresponding expressions of chiral perturbation theory. This allows us to extract low energy constants of mixed action chiral perturbation which characterize the strength of unitarity violations in our mixed action setup.

  20. Overlap valence quarks on a twisted mass sea. A case study for mixed action lattice QCD

    Energy Technology Data Exchange (ETDEWEB)

    Cichy, Krzysztof [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Poznan Univ. (Poland). Faculty of Physics; Drach, Vincent; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Garcia-Ramos, Elena [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Humboldt-Universitaet, Berlin (Germany); Herdoiza, Gregorio [UAM/CSIC Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; UAM/CSIC Univ. Autonoma de Madrid (Spain). Inst. de Fisica Teorica; Collaboration: European Twisted Mass Collaboration

    2012-11-15

    We discuss a Lattice QCD mixed action investigation employing Wilson maximally twisted mass sea and overlap valence fermions. Using four values of the lattice spacing, we demonstrate that the overlap Dirac operator assumes a point-like locality in the continuum limit. We also show that by adopting suitable matching conditions for the sea and valence theories a consistent continuum limit for the pion decay constant and light baryon masses can be obtained. Finally, we confront results for sea-valence mixed meson masses and the valence scalar correlator with corresponding expressions of chiral perturbation theory. This allows us to extract low energy constants of mixed action chiral perturbation which characterize the strength of unitarity violations in our mixed action setup.

  1. Possibilities of the common research-development action in the field of automated logistical engines

    Directory of Open Access Journals (Sweden)

    Pap Lajos

    2003-12-01

    The paper briefly presents the R&D cooperation of the Department of Materials Handling and Logistics and the Departments of Automation. The main fields of cooperation are introduced. Different kinds of linear motor (LM) drives are being developed and tested for warehouse and rolling conveyor systems. Modern control strategies using AI methods are being investigated and tested for automated guided vehicles. Wireless communication methods are being investigated and developed for mobile material handling devices. Application possibilities of voice recognition and image processing are being tested for the control of material handling robots and devices. Applications of process visualization programs are being developed and investigated. A multi-level industrial communication system is being developed for the laboratories of the cooperating departments.

  2. Negative-Parity Baryon Masses Using O(a)-improved Fermion Action

    Energy Technology Data Exchange (ETDEWEB)

    M. Gockeler; R. Horsley; D. Pleiter; P.E.L. Rakow; G. Schierholz; C.M. Maynard; D.G. Richards

    2001-06-01

    We present a calculation of the mass of the lowest-lying negative-parity J = 1/2⁻ state in quenched QCD. Results are obtained using a non-perturbatively O(a)-improved clover fermion action, and a splitting is found between the masses of the nucleon and its parity partner. The calculation is performed on two lattice volumes and at three lattice spacings, enabling a study of both finite-volume and finite lattice-spacing uncertainties. A comparison is made with results obtained using the unimproved Wilson fermion action.

  3. Negative-parity baryon masses using an O(a)-improved fermion action

    International Nuclear Information System (INIS)

    Goeckeler, M.; Rakow, P.E.L.; Maynard, C.M.; Richards, D.G.; Old Dominion Univ., Norfolk, VA

    2001-06-01

    We present a calculation of the mass of the lowest-lying negative-parity J = 1/2⁻ state in quenched QCD. Results are obtained using a non-perturbatively O(a)-improved clover fermion action, and a splitting is found between the masses of the nucleon and its parity partner. The calculation is performed on two lattice volumes and at three lattice spacings, enabling a study of both finite-volume and finite lattice-spacing uncertainties. A comparison is made with results obtained using the unimproved Wilson fermion action. (orig.)

  4. A new method for automated discontinuity trace mapping on rock mass 3D surface model

    Science.gov (United States)

    Li, Xiaojun; Chen, Jianqin; Zhu, Hehua

    2016-04-01

    This paper presents an automated discontinuity trace mapping method on a 3D surface model of rock mass. Feature points of discontinuity traces are first detected using the Normal Tensor Voting Theory, which is robust to noisy point cloud data. Discontinuity traces are then extracted from feature points in four steps: (1) trace feature point grouping, (2) trace segment growth, (3) trace segment connection, and (4) redundant trace segment removal. A sensitivity analysis is conducted to identify optimal values for the parameters used in the proposed method. The optimal triangular mesh element size is between 5 cm and 6 cm; the angle threshold in the trace segment growth step is between 70° and 90°; the angle threshold in the trace segment connection step is between 50° and 70°, and the distance threshold should be at least 15 times the mean triangular mesh element size. The method is applied to the excavation face trace mapping of a drill-and-blast tunnel. The results show that the proposed discontinuity trace mapping method is fast and effective and could be used as a supplement to traditional direct measurement of discontinuity traces.
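
The trace segment growth step (step 2 above), with its angle threshold, can be illustrated by a greedy 2D sketch: starting from a seed point, repeatedly attach the nearest unused feature point unless it would bend the trace by more than the threshold. This is a simplified stand-in for the paper's 3D mesh-based procedure, with made-up coordinates:

```python
import math

def angle_deg(u, v):
    """Angle in degrees between two nonzero 2D vectors."""
    dot = u[0] * v[0] + u[1] * v[1]
    cos = dot / (math.hypot(*u) * math.hypot(*v))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def grow_segment(points, start, angle_threshold_deg=80.0):
    """Greedy trace-segment growth: append the nearest remaining
    feature point while the turn it introduces stays below the
    angle threshold; stop at the first sharp turn."""
    remaining = [p for p in points if p != start]
    segment = [start]
    while remaining:
        last = segment[-1]
        cand = min(remaining, key=lambda p: math.dist(p, last))
        if len(segment) >= 2:
            prev = segment[-2]
            direction = (last[0] - prev[0], last[1] - prev[1])
            step = (cand[0] - last[0], cand[1] - last[1])
            if angle_deg(direction, step) > angle_threshold_deg:
                break  # candidate bends the trace too sharply
        segment.append(cand)
        remaining.remove(cand)
    return segment

# Three collinear feature points plus one off at a right angle:
# the right-angle point exceeds the threshold and is rejected
points = [(0, 0), (1, 0), (2, 0), (2, 2)]
segment = grow_segment(points, start=(0, 0))
```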

  5. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    Science.gov (United States)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  6. On the Mathematical Structure of Balanced Chemical Reaction Networks Governed by Mass Action Kinetics

    NARCIS (Netherlands)

    Schaft, Arjan van der; Rao, Shodhan; Jayawardhana, Bayu

    2013-01-01

    Motivated by recent progress on the interplay between graph theory, dynamics, and systems theory, we revisit the analysis of chemical reaction networks described by mass action kinetics. For reaction networks possessing a thermodynamic equilibrium we derive a compact formulation exhibiting at the

  7. Variable elimination in chemical reaction networks with mass-action kinetics

    DEFF Research Database (Denmark)

    Feliu, Elisenda; Wiuf, C.

    2012-01-01

    We consider chemical reaction networks taken with mass-action kinetics. The steady states of such a system are solutions to a system of polynomial equations. Even for small systems the task of finding the solutions is daunting. We develop an algebraic framework and procedure for linear elimination...
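
The mass-action kinetics these records analyze prescribe that each reaction proceeds at a rate proportional to the product of its reactant concentrations; steady states are exactly the solutions of the resulting polynomial equations. A minimal sketch for the reversible network A + B ⇌ C, integrated by explicit Euler with illustrative rate constants:

```python
def step(state, k1, k2, dt):
    """One Euler step for A + B <-> C under mass-action kinetics:
    forward rate k1*[A]*[B], backward rate k2*[C]."""
    a, b, c = state
    flux = k1 * a * b - k2 * c   # net forward reaction rate
    return (a - flux * dt, b - flux * dt, c + flux * dt)

# Integrate from [A] = [B] = 1, [C] = 0 towards the steady state
state = (1.0, 1.0, 0.0)
for _ in range(200000):
    state = step(state, k1=2.0, k2=1.0, dt=0.001)
a, b, c = state

# At the steady state the polynomial equation k1*a*b - k2*c = 0 holds
residual = 2.0 * a * b - 1.0 * c
```

Here the conservation law [A] + [C] = 1 reduces the polynomial system to a single quadratic, whose positive root is [A] = 0.5; this is the kind of variable elimination the abstract above describes in general.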

  8. On the graph and systems analysis of reversible chemical reaction networks with mass action kinetics

    NARCIS (Netherlands)

    Rao, Shodhan; Jayawardhana, Bayu; Schaft, Arjan van der

    2012-01-01

    Motivated by the recent progresses on the interplay between the graph theory and systems theory, we revisit the analysis of reversible chemical reaction networks described by mass action kinetics by reformulating it using the graph knowledge of the underlying networks. Based on this formulation, we

  9. Application of mass spectrometric techniques to delineate the modes-of-action of anticancer metallodrugs

    NARCIS (Netherlands)

    Hartinger, Christian G.; Groessl, Michael; Meier, Samuel M.; Casini, Angela; Dyson, Paul J.

    2013-01-01

    Mass spectrometry (MS) has emerged as an important tool for studying anticancer metallodrugs in complex biological samples and for characterising their interactions with biomolecules and potential targets on a molecular level. The exact modes-of-action of these coordination compounds and especially

  10. Technical note: CT-guided biopsy of lung masses using an automated guiding apparatus

    International Nuclear Information System (INIS)

    Chellathurai, Amarnath; Kanhirat, Saneej; Chokkappan, Kabilan; Swaminathan, Thiruchendur S; Kulasekaran, Nadhamuni

    2009-01-01

    Automated guiding apparatuses for CT-guided biopsies are now available. We report our experience with an indigenous system to guide lung biopsies. This system gave results similar to those obtained with the manual technique. Automated planning also appears to be technically easier: it requires fewer needle passes, consumes less time, and requires fewer check scans.

  11. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions, about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm × 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied to the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  12. A totally automated data acquisition/reduction system for routine treatment of mass spectroscopic data by factor analysis

    International Nuclear Information System (INIS)

    Tway, P.C.; Love, L.J.C.; Woodruff, H.B.

    1980-01-01

    Target transformation factor analysis is applied to typical data from gas chromatography-mass spectrometry and solid-probe mass spectrometry to determine rapidly the number of components in unresolved or partially resolved peaks. This technique allows the detection of hidden impurities which often make interpretation or quantification impossible. The error theory of Malinowski is used to assess the reliability of the results. The totally automated system uses a commercially available g.c.-m.s. data system interfaced to the large computer, and the number of components under a peak can be determined routinely and rapidly. (Auth.)
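    The eigenvalue step at the heart of this approach can be illustrated with a small numerical sketch: the number of significant eigenvalues of the scan-by-channel data matrix estimates the number of co-eluting components. All data below are simulated, and the relative-eigenvalue cutoff is a simple stand-in for Malinowski's error-theory significance test.

```python
import numpy as np

def estimate_n_components(data, rel_cutoff=1e-3):
    """Estimate the number of chemical components under an unresolved peak
    from the eigenvalues of the data matrix (abstract factor analysis).
    `rel_cutoff` is an assumed relative-eigenvalue threshold standing in
    for Malinowski's error-theory test."""
    s = np.linalg.svd(data, compute_uv=False)  # rows: scans, cols: m/z channels
    ev = s ** 2                                # eigenvalues of data' @ data
    return int(np.sum(ev / ev.sum() > rel_cutoff))

# Simulate a GC peak hiding two components: two overlapping elution
# profiles, each with its own (random) mass spectrum, plus noise.
t = np.linspace(0.0, 1.0, 200)
profiles = np.stack([np.exp(-((t - 0.40) / 0.08) ** 2),
                     np.exp(-((t - 0.55) / 0.08) ** 2)], axis=1)  # 200 scans x 2
rng = np.random.default_rng(0)
spectra = rng.random((2, 50))                                     # 2 x 50 channels
data = profiles @ spectra + 1e-4 * rng.standard_normal((200, 50))
n = estimate_n_components(data)  # detects the hidden second component
```

    On this synthetic mixture the two signal eigenvalues dominate the noise eigenvalues by many orders of magnitude, so the hidden impurity is found even though the two elution profiles overlap heavily.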

  13. Automated knowledge acquisition - an aid to understanding the actions of a skilled operator

    International Nuclear Information System (INIS)

    Wright, R.M.

    1990-01-01

    The operator of an accelerator experiment often appears to be a highly skilled magician. The diagnostics used, such as oscilloscope traces, may be only indirectly related to the state of the accelerator. The operator adjusts the knobs so that the experiment runs well, but is not always able to describe in words exactly what is happening. The tool described in this paper, acc-plot-tool, was developed as a supplement to note-taking while attempting to develop an expert system that might function as an apprentice operator. During an experiment, a historical record is made of all operator actions and the resulting changes in the state of the equipment, including digitized records of the oscilloscope traces used to make decisions. These records are transformed into a set of graphical objects in the knowledge engineering environment (KEE). A flexible set of KEE functions has been developed, allowing both graphical and numerical manipulation of the data. The acc-plot-tool is not only useful in helping to understand the operator's actions, but also in discovering correlations between operator actions and system-parameter changes, in detecting equipment failures and in the mathematical modeling of the system. Examples will be given from ion-source operation. (orig.)

  14. Estimation of heterogeneity in malaria transmission by stochastic modelling of apparent deviations from mass action kinetics

    Directory of Open Access Journals (Sweden)

    Smith Thomas A

    2008-01-01

    Background: Quantifying heterogeneity in malaria transmission is a prerequisite for accurate predictive mathematical models, but the variance in field measurements of exposure overestimates true micro-heterogeneity because it is inflated to an uncertain extent by sampling variation. Descriptions of field data also suggest that the rate of Plasmodium falciparum infection is not proportional to the intensity of challenge by infectious vectors, which appears to violate the principle of mass action implied by malaria biology. Micro-heterogeneity may be the reason for this anomaly. It is proposed that the level of micro-heterogeneity can be estimated from statistical models that estimate the amount of variation in transmission most compatible with a mass-action model for the relationship of infection to exposure. Methods: The relationship between the entomological inoculation rate (EIR) for falciparum malaria and infection risk was reanalysed using published data for cohorts of children in Saradidi (western Kenya). Infection risk was treated as binomially distributed, and measurement-error (Poisson and negative binomial) models were considered for the EIR. Models were fitted using Bayesian Markov chain Monte Carlo algorithms, and model fit was compared for models that assume either mass-action kinetics, facilitation, competition or saturation of the infection process with increasing EIR. Results: The proportion of inocula that resulted in infection in Saradidi was inversely related to the measured intensity of challenge; models of facilitation therefore showed a poor fit to the data. When sampling error in the EIR was neglected, either competition or saturation needed to be incorporated in the model in order to give a good fit. Negative binomial models for the error in exposure could achieve a comparable fit while incorporating the more parsimonious and biologically plausible mass-action assumption. Models that assume negative binomial micro
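    The competing models can be made concrete with a short sketch: under homogeneous mass action the infection risk after exposure E is 1 − exp(−bE), while gamma-distributed (negative-binomial) heterogeneity in exposure gives the marginal risk 1 − (1 + bE/k)^(−k). The apparent per-inoculum efficacy inferred under a homogeneity assumption then declines with measured EIR even though the underlying kinetics are mass action. Parameter values below are illustrative, not fitted to the Saradidi data.

```python
import math

def risk_mass_action(b, eir):
    """Homogeneous mass action: infection risk after exposure to `eir`
    infectious bites, each infecting independently (success ~ b)."""
    return 1.0 - math.exp(-b * eir)

def risk_heterogeneous(b, eir, k):
    """Marginal risk when individual exposure is gamma-distributed around
    the measured EIR with shape k (negative-binomial mixing)."""
    return 1.0 - (1.0 + b * eir / k) ** (-k)

def apparent_efficacy(b, eir, k):
    """Per-inoculum success probability a naive homogeneous fit would infer."""
    return -math.log(1.0 - risk_heterogeneous(b, eir, k)) / eir

b, k = 0.05, 0.5  # illustrative per-bite success and heterogeneity shape
for eir in (1.0, 10.0, 100.0):
    print(eir, round(apparent_efficacy(b, eir, k), 4))  # declines with EIR
```

    The declining apparent efficacy mimics "saturation" of the infection process, which is exactly the anomaly the abstract attributes to unmodelled micro-heterogeneity.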

  15. Automated detection of masses on whole breast volume ultrasound scanner: false positive reduction using deep convolutional neural network

    Science.gov (United States)

    Hiramatsu, Yuya; Muramatsu, Chisako; Kobayashi, Hironobu; Hara, Takeshi; Fujita, Hiroshi

    2017-03-01

    Breast cancer screening with mammography and ultrasonography is expected to improve sensitivity compared with mammography alone, especially for women with dense breasts. An automated breast volume scanner (ABVS) provides operator-independent whole-breast data that facilitate double reading and comparison with past exams, the contralateral breast, and multimodality images. However, the large volumetric data involved in screening practice increase radiologists' workload. Therefore, our goal is to develop a computer-aided detection scheme for breast masses in ABVS data to assist radiologists' diagnosis and comparison with mammographic findings. In this study, a false positive (FP) reduction scheme using a deep convolutional neural network (DCNN) was investigated. For training the DCNN, true positive and FP samples were obtained from the result of our initial mass detection scheme using the vector convergence filter. Regions of interest including the detected regions were extracted from the multiplanar reconstruction slices. We investigated methods to select effective FP samples for training the DCNN. Based on free-response receiver operating characteristic analysis, simple random sampling from the entire candidate set was most effective in this study. Using the DCNN, the number of FPs could be reduced by 60%, while retaining 90% of true masses. The result indicates the potential usefulness of DCNN for FP reduction in automated mass detection on ABVS images.
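    The reported operating point (90% of true masses retained, 60% of FPs removed) corresponds to thresholding the classifier's output score. A minimal sketch with simulated candidate scores (the score distributions below are hypothetical, standing in for DCNN outputs on candidates from the initial detection stage):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical classifier scores: true masses score high, false positives
# from the initial candidate-detection stage score low.
tp_scores = rng.normal(0.8, 0.10, 100)
fp_scores = rng.normal(0.3, 0.15, 1000)

def fp_reduction_at_sensitivity(tp_scores, fp_scores, keep=0.90):
    """Pick the threshold that keeps `keep` of true detections and report
    the fraction of false positives removed at that threshold."""
    thr = np.quantile(tp_scores, 1.0 - keep)  # keep top 90% of true masses
    removed = np.mean(fp_scores < thr)
    return thr, removed

thr, removed = fp_reduction_at_sensitivity(tp_scores, fp_scores)
```

    Sweeping `keep` over [0, 1] and recording `removed` at each setting traces out the kind of free-response operating curve the study evaluates.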

  16. Preclusion of switch behavior in reaction networks with mass-action kinetics

    DEFF Research Database (Denmark)

    Feliu, Elisenda; Wiuf, C.

    2012-01-01

    We study networks taken with mass-action kinetics and provide a Jacobian criterion that applies to an arbitrary network to preclude the existence of multiple positive steady states within any stoichiometric class for any choice of rate constants. We are concerned with the characterization ... precludes the existence of degenerate steady states. Further, we relate injectivity of a network to that of the network obtained by adding outflow, or degradation, reactions for all species.
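    The flavor of such a Jacobian criterion can be illustrated numerically on a toy open network. The paper's criterion is symbolic and holds for all rate constants; the sketch below merely samples random positive states for one (hypothetical) network and checks that the mass-action Jacobian determinant never changes sign, the kind of sign constancy that rules out the degenerate steady states needed for multistationarity.

```python
import numpy as np

def mass_action_jacobian(x, stoich, orders, rates):
    """Jacobian of f(x) = stoich @ v(x), with mass-action rates
    v_j = rates[j] * prod_i x_i ** orders[i, j] (x strictly positive)."""
    v = rates * np.prod(x[:, None] ** orders, axis=0)
    dv = orders * v[None, :] / x[:, None]  # dv[i, j] = d v_j / d x_i
    return stoich @ dv.T

# Toy open network (hypothetical): 0 -> A, A -> B, B -> 0, A + B -> 0.
stoich = np.array([[1, -1,  0, -1],    # net change of A per reaction
                   [0,  1, -1, -1]])   # net change of B per reaction
orders = np.array([[0,  1,  0,  1],    # kinetic order in A
                   [0,  0,  1,  1]])   # kinetic order in B
rates = np.array([1.0, 2.0, 0.5, 3.0])

# Sample random positive states; a constant determinant sign across the
# positive orthant is the numerical shadow of an injectivity criterion.
rng = np.random.default_rng(0)
signs = {np.sign(np.linalg.det(
            mass_action_jacobian(rng.uniform(0.01, 10.0, 2),
                                 stoich, orders, rates)))
         for _ in range(200)}
```

    For this network the determinant is k2·k3 + 2·k2·k4·A + k3·k4·B, positive for all positive concentrations, so the sampled signs collapse to a single value.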

  17. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia; Lagani, Vincenzo; Heinze-Deml, Christina; Schmidt, Angelika; Tegner, Jesper; Tsamardinos, Ioannis

    2017-01-01

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway databases. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  18. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia

    2017-03-31

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway databases. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  19. The Impact of Automated Notification on Follow-up of Actionable Tests Pending at Discharge: a Cluster-Randomized Controlled Trial.

    Science.gov (United States)

    Dalal, Anuj K; Schaffer, Adam; Gershanik, Esteban F; Papanna, Ranganath; Eibensteiner, Katyuska; Nolido, Nyryan V; Yoon, Cathy S; Williams, Deborah; Lipsitz, Stuart R; Roy, Christopher L; Schnipper, Jeffrey L

    2018-03-12

    Follow-up of tests pending at discharge (TPADs) is poor. We previously demonstrated a twofold increase in awareness of any TPAD by attendings and primary care physicians (PCPs) using an automated email intervention. OBJECTIVE: To determine whether automated notification improves documented follow-up for actionable TPADs. DESIGN: Cluster-randomized controlled trial. SUBJECTS: Attendings and PCPs caring for adult patients discharged from general medicine and cardiology services with at least one actionable TPAD between June 2011 and May 2012. INTERVENTION: An automated system that notifies discharging attendings and network PCPs of finalized TPADs by email. MAIN MEASURES: The primary outcome was the proportion of actionable TPADs with documented action, determined by independent physician review of the electronic health record (EHR). Secondary outcomes included documented acknowledgment, 30-day readmissions, and adjusted median days to documented follow-up. Of the 3378 TPADs sampled, 253 (7.5%) were determined to be actionable by physician review. Of these, 150 (123 patients discharged by 53 attendings) and 103 (90 patients discharged by 44 attendings) were assigned to intervention and usual care groups, respectively, and underwent chart review. The proportion of actionable TPADs with documented action was 60.7 vs. 56.3% (p = 0.82) in the intervention vs. usual care groups, similar for documented acknowledgment. The proportion of patients with actionable TPADs readmitted within 30 days was 22.8 vs. 31.1% in the intervention vs. usual care groups (p = 0.24). The adjusted median days [95% CI] to documented action was 9 [6.2, 11.8] vs. 14 [10.2, 17.8] (p = 0.04) in the intervention vs. usual care groups, similar for documented acknowledgment. In sub-group analysis, the intervention had greater impact on documented action for patients with network PCPs compared with usual care (70 vs. 50%, p = 0.03). Automated notification of actionable TPADs shortened time to

  20. Optimization of Reversed-Phase Peptide Liquid Chromatography Ultraviolet Mass Spectrometry Analyses Using an Automated Blending Methodology

    Science.gov (United States)

    Chakraborty, Asish B.; Berger, Scott J.

    2005-01-01

    The balance between chromatographic performance and mass spectrometric response has been evaluated using an automated series of experiments where separations are produced by the real-time automated blending of water with organic and acidic modifiers. In this work, the concentration effects of two acidic modifiers (formic acid and trifluoroacetic acid) were studied on the separation selectivity, ultraviolet, and mass spectrometry detector response, using a complex peptide mixture. Peptide retention selectivity differences were apparent between the two modifiers, and under the conditions studied, trifluoroacetic acid produced slightly narrower (more concentrated) peaks, but significantly higher electrospray mass spectrometry suppression. Trifluoroacetic acid suppression of electrospray signal and influence on peptide retention and selectivity was dominant when mixtures of the two modifiers were analyzed. Our experimental results indicate that in analyses where the analyzed components are roughly equimolar (e.g., a peptide map of a recombinant protein), the selectivity of peptide separations can be optimized by choice and concentration of acidic modifier, without compromising the ability to obtain effective sequence coverage of a protein. In some cases, these selectivity differences were explored further, and a rational basis for differentiating acidic modifier effects from the underlying peptide sequences is described. PMID:16522853

  1. E-FUSRAP: AUTOMATING THE CASE FILE FOR THE FORMERLY UTILIZED SITES REMEDIAL ACTION PROGRAM

    International Nuclear Information System (INIS)

    Mackenzie, D.; Marshall, K.

    2003-01-01

    The Department of Energy's (DOE) Office of Site Closure, EM-30, houses the document library pertaining to sites that are related to the Formerly Utilized Sites Remedial Action Program (FUSRAP) and regularly addresses ongoing information demands, primarily from Freedom of Information Act (FOIA) requests, interested members of the public, the DOE, and other Federal Agencies. To address these demands more efficiently, DOE has begun to implement a new multi-phase, information management process known as e-FUSRAP. The first phase of e-FUSRAP, the development of the Considered Sites Database, summarizes and allows public access to complex information on over 600 sites considered as candidates for FUSRAP. The second phase of e-FUSRAP, the development of the Document Indexing Database, will create an internal index of more than 10,000 documents in the FUSRAP library's case file, allowing more effective management and retrieval of case file documents. Together, the phases of e-FUSRAP will allow EM-30 to become an innovative leader in enhancing public information sources

  2. E-FUSRAP: AUTOMATING THE CASE FILE FOR THE FORMERLY UTILIZED SITES REMEDIAL ACTION PROGRAM

    Energy Technology Data Exchange (ETDEWEB)

    Mackenzie, D.; Marshall, K.

    2003-02-27

    The Department of Energy's (DOE) Office of Site Closure, EM-30, houses the document library pertaining to sites that are related to the Formerly Utilized Sites Remedial Action Program (FUSRAP) and regularly addresses ongoing information demands, primarily from Freedom of Information Act (FOIA) requests, interested members of the public, the DOE, and other Federal Agencies. To address these demands more efficiently, DOE has begun to implement a new multi-phase, information management process known as e-FUSRAP. The first phase of e-FUSRAP, the development of the Considered Sites Database, summarizes and allows public access to complex information on over 600 sites considered as candidates for FUSRAP. The second phase of e-FUSRAP, the development of the Document Indexing Database, will create an internal index of more than 10,000 documents in the FUSRAP library's case file, allowing more effective management and retrieval of case file documents. Together, the phases of e-FUSRAP will allow EM-30 to become an innovative leader in enhancing public information sources.

  3. Hardware and software for automating the process of studying high-speed gas flows in wind tunnels of short-term action

    Science.gov (United States)

    Yakovlev, V. V.; Shakirov, S. R.; Gilyov, V. M.; Shpak, S. I.

    2017-10-01

    In this paper, we propose a variant of constructing automation systems for aerodynamic experiments on the basis of modern, domestically developed hardware and software. The structure of a universal control and data-collection system for performing experiments in wind tunnels of continuous, periodic or short-term action is proposed. The proposed hardware and software development tools of ICT SB RAS and ITAM SB RAS, as well as subsystems based on them, can be applied to a wide range of scientific and experimental installations, as well as to the automation of technological processes in production.

  4. A semi-automated approach to derive elevation time-series and calculate glacier mass balance from historical aerial imagery

    Science.gov (United States)

    Whorton, E.; Headman, A.; Shean, D. E.; McCann, E.

    2017-12-01

    Understanding the implications of glacier recession on water resources in the western U.S. requires quantifying glacier mass change across large regions over several decades. Very few glaciers in North America have long-term continuous field measurements of glacier mass balance. However, systematic aerial photography campaigns began in 1957 on many glaciers in the western U.S. and Alaska. These historical, vertical aerial stereo-photographs documenting glacier evolution have recently become publicly available. Digital elevation models (DEMs) of the transient glacier surface preserved in each imagery timestamp can be derived, then differenced to calculate glacier volume and mass change to improve regional geodetic solutions of glacier mass balance. In order to batch process these data, we use Python-based algorithms and Agisoft Photoscan structure from motion (SfM) photogrammetry software to semi-automate DEM creation, and orthorectify and co-register historical aerial imagery in a high-performance computing environment. Scanned photographs are rotated to reduce scaling issues, cropped to the same size to remove fiducials, and batch histogram equalization is applied to improve image quality and aid pixel-matching algorithms using the Python library OpenCV. Processed photographs are then passed to Photoscan through the Photoscan Python library to create DEMs and orthoimagery. To extend the period of record, the elevation products are co-registered to each other, airborne LiDAR data, and DEMs derived from sub-meter commercial satellite imagery. With the exception of the placement of ground control points, the process is entirely automated with Python. Current research is focused on: one, applying these algorithms to create geodetic mass balance time series for the 90 photographed glaciers in Washington State and two, evaluating the minimal amount of positional information required in Photoscan to prevent distortion effects that cannot be addressed during co-registration.
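    The core differencing step reduces to a few lines: subtract co-registered DEMs, convert the summed elevation change to a volume, and apply a density assumption to get a specific mass balance. The sketch below uses a synthetic uniform-thinning example; the 850 kg m^-3 volume-to-mass conversion is a commonly assumed value, not one taken from this abstract.

```python
import numpy as np

def geodetic_mass_balance(dem_early, dem_late, cell_area_m2, years,
                          density=850.0):
    """Glacier-wide specific mass balance (m w.e. per year) from two
    co-registered DEM grids; `density` (kg m^-3) is an assumed
    volume-to-mass conversion factor."""
    dh = dem_late - dem_early                  # per-cell elevation change, m
    valid = np.isfinite(dh)
    dv = np.nansum(dh[valid]) * cell_area_m2   # glacier volume change, m^3
    area = valid.sum() * cell_area_m2          # glacier area, m^2
    return dv * density / 1000.0 / area / years

# Synthetic example: uniform 30 m of thinning over 60 years on a 10 m grid.
dem_1957 = np.full((100, 100), 2500.0)
dem_2017 = dem_1957 - 30.0
b_dot = geodetic_mass_balance(dem_1957, dem_2017, cell_area_m2=100.0, years=60.0)
```

    With real DEM pairs the `valid` mask would also exclude cells outside the glacier outline and any data gaps left by the SfM processing.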

  5. Automated CD-SEM recipe creation technology for mass production using CAD data

    Science.gov (United States)

    Kawahara, Toshikazu; Yoshida, Masamichi; Tanaka, Masashi; Ido, Sanyu; Nakano, Hiroyuki; Adachi, Naokaka; Abe, Yuichi; Nagatomo, Wataru

    2011-03-01

    Critical Dimension Scanning Electron Microscope (CD-SEM) recipe creation requires preparing a sample for pattern-matching registration and then creating the recipe on the CD-SEM using that sample, which hinders the reduction of test production cost and time in semiconductor manufacturing factories. From the perspective of cost reduction and improved test production efficiency, automated CD-SEM recipe creation without sample preparation or manual operation has become important in production lines. For automated CD-SEM recipe creation, we have introduced RecipeDirector (RD), which enables recipe creation using Computer-Aided Design (CAD) data and text data that includes measurement information. We have developed a system that automatically creates the CAD data and the text data necessary for recipe creation on RD; and, to eliminate manual operation, we have enhanced RD so that all measurement information can be specified in the text data. As a result, we have established an automated CD-SEM recipe creation system that requires no sample preparation or manual operation. For the introduction of the CD-SEM recipe creation system using RD to the production lines, the accuracy of the pattern matching was an issue: the design templates for matching, created from the CAD data, differ visually in shape from the SEM images. Thus, a robust pattern-matching algorithm that accounts for this shape difference was needed. Adding image processing of the matching templates and shape processing of the CAD patterns in the lower layer has enabled robust pattern matching. This paper describes the automated CD-SEM recipe creation technology for production lines without sample preparation or manual operation using RD, as applied at Sony Semiconductor Kyusyu Corporation Kumamoto Technology Center (SCK Corporation Kumamoto TEC).
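    Brightness and contrast differ systematically between CAD-derived templates and SEM images, which is one reason naive intensity matching fails. Normalized cross-correlation is a standard remedy, sketched here in plain NumPy; this toy handles only linear intensity differences, not the shape differences the RD enhancements address.

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Brute-force NCC map; the peak location is the best match. NCC is
    invariant to linear brightness/contrast changes between the template
    and the image."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t * t).sum())
    out = np.full((image.shape[0] - th + 1, image.shape[1] - tw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            w = image[i:i + th, j:j + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * tn
            if denom > 0:
                out[i, j] = (wz * t).sum() / denom
    return out

rng = np.random.default_rng(0)
scene = rng.random((60, 60))
template = scene[20:30, 35:45].copy()  # cut the template before shifting
scene = 0.5 * scene + 0.2              # global brightness/contrast change
ncc = normalized_cross_correlation(scene, template)
peak = np.unravel_index(np.argmax(ncc), ncc.shape)  # recovers (20, 35)
```

    Production matchers use far faster FFT-based correlation and add shape-aware preprocessing, but the invariance argument is the same.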

  6. Integration of biotechnology, visualisation technology and robot technology for automated mass propagation af elite trees

    DEFF Research Database (Denmark)

    Find, Jens

    ... for the production of Christmas trees, and Sitka spruce has gained renewed interest as a fast-growing species for the production of biofuels. These species are used as model systems for the development of automated plant production based on robot and visualisation technology. The commercial aspect of the project aims at: 1) the market for cloned elite plants in the forestry sector and 2) the market for robot technology in the production of plants for the forestry sector.

  7. Micronucleus test for radiation biodosimetry in mass casualty events: Evaluation of visual and automated scoring

    Energy Technology Data Exchange (ETDEWEB)

    Bolognesi, Claudia, E-mail: claudia.bolognesi@istge.i [Environmental Carcinogenesis Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Balia, Cristina; Roggieri, Paola [Environmental Carcinogenesis Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Cardinale, Francesco [Clinical Epidemiology Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Department of Health Sciences, University of Genoa, Genoa (Italy); Bruzzi, Paolo [Clinical Epidemiology Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Sorcinelli, Francesca [Environmental Carcinogenesis Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Laboratory of Genetics, Histology and Molecular Biology Section, Army Medical and Veterinary Research Center, Via Santo Stefano Rotondo 4, 00184 Roma (Italy); Lista, Florigio [Laboratory of Genetics, Histology and Molecular Biology Section, Army Medical and Veterinary Research Center, Via Santo Stefano Rotondo 4, 00184 Roma (Italy); D'Amelio, Raffaele [Sapienza, Università di Roma II Facoltà di Medicina e Chirurgia and Ministero della Difesa, Direzione Generale Sanità Militare (Italy); Righi, Enzo [Frascati National Laboratories, National Institute of Nuclear Physics, Via Enrico Fermi 40, 00044 Frascati, Rome (Italy)

    2011-02-15

    In the case of a large-scale nuclear or radiological incident, a reliable estimate of dose is an essential tool for providing timely assessment of radiation exposure and for making life-saving medical decisions. Cytogenetics is considered the 'gold standard' for biodosimetry. The dicentric analysis (DA) represents the most specific cytogenetic bioassay. The micronucleus test (MN) applied in interphase in peripheral lymphocytes is an alternative and simpler approach. A dose-effect calibration curve for the MN frequency in peripheral lymphocytes from 27 adult donors was established after in vitro irradiation at a dose range of 0.15-8 Gy of ¹³⁷Cs gamma rays (dose rate 6 Gy min⁻¹). Dose prediction by visual scoring in a dose-blinded study (0.15-4.0 Gy) revealed a high level of accuracy (R = 0.89). The scoring of MN is time-consuming and requires adequate skills and expertise. Automated image analysis is a feasible approach that reduces analysis time and increases the accuracy of dose estimation by decreasing the variability due to subjective evaluation. A good correlation (R = 0.705) between visual and automated scoring with visual correction was observed over the dose range 0-2 Gy. Almost perfect discrimination power was observed for exposure to 1-2 Gy, and satisfactory power for 0.6 Gy. This threshold level can be considered sufficient for identification of sublethally exposed individuals by the automated CBMN assay.
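    Dose-effect calibration curves of this kind are conventionally fitted with the linear-quadratic model Y = c + αD + βD²; inverting the fitted curve then converts an observed micronucleus yield into a dose estimate. A sketch with made-up coefficients (illustrative only, not the study's fitted values):

```python
import numpy as np

# Hypothetical calibration: micronucleus yield per 1000 binucleated cells
# versus acute gamma dose, following the linear-quadratic shape
# Y = c + alpha*D + beta*D^2.
doses = np.array([0.0, 0.15, 0.3, 0.6, 1.0, 2.0, 4.0])
c, alpha, beta = 12.0, 25.0, 30.0
yields = c + alpha * doses + beta * doses ** 2

beta_f, alpha_f, c_f = np.polyfit(doses, yields, 2)  # refit the curve

def estimate_dose(observed_yield):
    """Invert the fitted linear-quadratic curve (positive root of the
    quadratic) to turn a scored MN yield into an estimated dose in Gy."""
    a, b, c0 = beta_f, alpha_f, c_f - observed_yield
    return (-b + np.sqrt(b * b - 4.0 * a * c0)) / (2.0 * a)
```

    In practice the fit would be a maximum-likelihood fit with Poisson (or overdispersed) errors and the dose estimate would carry a confidence interval, but the inversion step is the same.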

  8. An Automated High Performance Capillary Liquid Chromatography Fourier Transform Ion Cyclotron Resonance Mass Spectrometer for High-Throughput Proteomics

    International Nuclear Information System (INIS)

    Belov, Mikhail E.; Anderson, Gordon A.; Wingerd, Mark A.; Udseth, Harold R.; Tang, Keqi; Prior, David C.; Swanson, Kenneth R.; Buschbach, Michael A.; Strittmatter, Eric F.; Moore, Ronald J.; Smith, Richard D.

    2004-01-01

    We report on a fully automated 9.4 tesla Fourier transform ion cyclotron resonance (FTICR) mass spectrometer coupled to reversed-phase chromatography for high-throughput proteomic studies. Modifications made to the front-end of a commercial FTICR instrument--a dual-ESI-emitter ion source; a dual-channel electrodynamic ion funnel; and collisional-cooling, selection and accumulation quadrupoles--significantly improved the sensitivity, dynamic range and mass measurement accuracy of the mass spectrometer. A high-pressure capillary liquid chromatography (LC) system was incorporated with an autosampler that enabled 24 h/day operation. A novel method for accumulating ions in the ICR cell was also developed. Unattended operation of the instrument revealed the exceptional reproducibility (1-5% deviation in elution times for peptides from a bacterial proteome), repeatability (10-20% deviation in detected abundances for peptides from the same aliquot analyzed a few weeks apart) and robustness (high-throughput operation for 5 months without downtime) of the LC/FTICR system. When combined with modulated-ion-energy gated trapping, the internal calibration of FTICR mass spectra decreased the dispersion of mass measurement errors for peptide identifications in conjunction with high-resolution capillary LC separations to < 5 ppm over a dynamic range of 10³ for each spectrum.

  9. Quantum back-action-evading measurement of motion in a negative mass reference frame

    Science.gov (United States)

    Møller, Christoffer B.; Thomas, Rodrigo A.; Vasilakis, Georgios; Zeuthen, Emil; Tsaturyan, Yeghishe; Balabas, Mikhail; Jensen, Kasper; Schliesser, Albert; Hammerer, Klemens; Polzik, Eugene S.

    2017-07-01

    Quantum mechanics dictates that a continuous measurement of the position of an object imposes a random quantum back-action (QBA) perturbation on its momentum. This randomness translates with time into position uncertainty, thus leading to the well known uncertainty on the measurement of motion. As a consequence of this randomness, and in accordance with the Heisenberg uncertainty principle, the QBA puts a limitation—the so-called standard quantum limit—on the precision of sensing of position, velocity and acceleration. Here we show that QBA on a macroscopic mechanical oscillator can be evaded if the measurement of motion is conducted in the reference frame of an atomic spin oscillator. The collective quantum measurement on this hybrid system of two distant and disparate oscillators is performed with light. The mechanical oscillator is a vibrational ‘drum’ mode of a millimetre-sized dielectric membrane, and the spin oscillator is an atomic ensemble in a magnetic field. The spin oriented along the field corresponds to an energetically inverted spin population and realizes a negative-effective-mass oscillator, while the opposite orientation corresponds to an oscillator with positive effective mass. The QBA is suppressed by -1.8 decibels in the negative-mass setting and enhanced by 2.4 decibels in the positive-mass case. This hybrid quantum system paves the way to entanglement generation and distant quantum communication between mechanical and spin systems and to sensing of force, motion and gravity beyond the standard quantum limit.

  10. Quantum back-action-evading measurement of motion in a negative mass reference frame.

    Science.gov (United States)

    Møller, Christoffer B; Thomas, Rodrigo A; Vasilakis, Georgios; Zeuthen, Emil; Tsaturyan, Yeghishe; Balabas, Mikhail; Jensen, Kasper; Schliesser, Albert; Hammerer, Klemens; Polzik, Eugene S

    2017-07-12

    Quantum mechanics dictates that a continuous measurement of the position of an object imposes a random quantum back-action (QBA) perturbation on its momentum. This randomness translates with time into position uncertainty, thus leading to the well known uncertainty on the measurement of motion. As a consequence of this randomness, and in accordance with the Heisenberg uncertainty principle, the QBA puts a limitation-the so-called standard quantum limit-on the precision of sensing of position, velocity and acceleration. Here we show that QBA on a macroscopic mechanical oscillator can be evaded if the measurement of motion is conducted in the reference frame of an atomic spin oscillator. The collective quantum measurement on this hybrid system of two distant and disparate oscillators is performed with light. The mechanical oscillator is a vibrational 'drum' mode of a millimetre-sized dielectric membrane, and the spin oscillator is an atomic ensemble in a magnetic field. The spin oriented along the field corresponds to an energetically inverted spin population and realizes a negative-effective-mass oscillator, while the opposite orientation corresponds to an oscillator with positive effective mass. The QBA is suppressed by -1.8 decibels in the negative-mass setting and enhanced by 2.4 decibels in the positive-mass case. This hybrid quantum system paves the way to entanglement generation and distant quantum communication between mechanical and spin systems and to sensing of force, motion and gravity beyond the standard quantum limit.

  11. Automated solid-phase extraction of herbicides from water for gas chromatographic-mass spectrometric analysis

    Science.gov (United States)

    Meyer, M.T.; Mills, M.S.; Thurman, E.M.

    1993-01-01

    An automated solid-phase extraction (SPE) method was developed for the pre-concentration of chloroacetanilide and triazine herbicides, and two triazine metabolites, from 100-ml water samples. Breakthrough experiments for the C18 SPE cartridge show that the two triazine metabolites are not fully retained and that increasing flow-rate decreases their retention. Standard curve r² values of 0.998-1.000 for each compound were consistently obtained, and a quantitation level of 0.05 μg/l was achieved for each compound tested. More than 10,000 surface and ground water samples have been analyzed by this method.

  12. Metabolite Identification Using Automated Comparison of High-Resolution Multistage Mass Spectral Trees

    NARCIS (Netherlands)

    Rojas-Cherto, M.; Peironcely, J.E.; Kasper, P.T.; Hooft, van der J.J.J.; Vos, de R.C.H.; Vreeken, R.; Hankemeier, T.; Reijmers, T.

    2012-01-01

    Multistage mass spectrometry (MSn) generating so-called spectral trees is a powerful tool in the annotation and structural elucidation of metabolites and is increasingly used in the area of accurate mass LC/MS-based metabolomics to identify unknown, but biologically relevant, compounds. As a
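    One way to see the idea: represent each MSn spectral tree as nested {m/z: subtree} mappings and score the fraction of fragment nodes that match within a mass tolerance, level by level. This is a toy metric with hypothetical m/z values; the actual tree-comparison method in the paper is more elaborate.

```python
def count_nodes(tree):
    """Total number of fragment nodes in a nested {mz: subtree} tree."""
    return sum(1 + count_nodes(sub) for sub in tree.values())

def matched_nodes(t1, t2, tol=0.005):
    """Greedy level-by-level matching of fragments within `tol` Da; a
    fragment only counts if its parent fragment also matched."""
    used, m = set(), 0
    for mz1, sub1 in t1.items():
        for mz2, sub2 in t2.items():
            if mz2 not in used and abs(mz1 - mz2) <= tol:
                used.add(mz2)
                m += 1 + matched_nodes(sub1, sub2, tol)
                break
    return m

def tree_similarity(t1, t2, tol=0.005):
    """Dice-style score: shared fragment nodes over total nodes."""
    total = count_nodes(t1) + count_nodes(t2)
    return 2.0 * matched_nodes(t1, t2, tol) / total if total else 1.0

# Hypothetical MS2/MS3 trees for an unknown and a candidate metabolite.
unknown   = {195.088: {138.066: {110.071: {}}, 42.034: {}}}
candidate = {195.087: {138.066: {110.072: {}}, 83.060: {}}}
sim = tree_similarity(unknown, candidate)  # three of four nodes match per tree
```

    Ranking database candidates by such a score is the essence of annotating an unknown from its fragmentation tree, with the tolerance tied to the instrument's mass accuracy.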

  13. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia; Lagani, Vincenzo; Heinze-Deml, Christina; Schmidt, Angelika; Tegner, Jesper; Tsamardinos, Ioannis

    2017-01-01

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway database. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  14. Gravity-Assist Trajectories to the Ice Giants: An Automated Method to Catalog Mass-or Time-Optimal Solutions

    Science.gov (United States)

    Hughes, Kyle M.; Knittel, Jeremy M.; Englander, Jacob A.

    2017-01-01

    This work presents an automated method of calculating mass (or time) optimal gravity-assist trajectories without a priori knowledge of the flyby-body combination. Since gravity assists are particularly crucial for reaching the outer Solar System, we use the Ice Giants, Uranus and Neptune, as example destinations for this work. Catalogs are also provided that list the most attractive trajectories found over launch dates ranging from 2024 to 2038. The tool developed to implement this method, called the Python EMTG Automated Trade Study Application (PEATSA), iteratively runs the Evolutionary Mission Trajectory Generator (EMTG), a NASA Goddard Space Flight Center in-house trajectory optimization tool. EMTG finds gravity-assist trajectories with impulsive maneuvers using a multiple-shooting structure along with stochastic methods (such as monotonic basin hopping) and may be run with or without an initial guess provided. PEATSA runs instances of EMTG in parallel over a grid of launch dates. After each set of runs completes, the best results within a neighborhood of launch dates are used to seed all other cases in that neighborhood---allowing the solutions across the range of launch dates to improve over each iteration. The results here are compared against trajectories found using a grid-search technique, and PEATSA is found to outperform the grid-search results for most launch years considered.
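The neighborhood-seeding loop described above can be sketched abstractly. In this toy Python stand-in, `run_case` is a hypothetical noisy local search replacing an actual EMTG run, and the per-date optimum is invented; only the seeding pattern mirrors the PEATSA idea:

```python
# Toy sketch of PEATSA-style iteration: each "launch date" case is re-run
# seeded with the best result found in a neighborhood of dates, so good
# solutions propagate across the grid over successive iterations.
import random

random.seed(0)
dates = list(range(15))             # grid of launch dates
best = {d: 10.0 for d in dates}     # best "delta-v" found per date so far

def run_case(date, seed_value):
    # Stand-in for an optimizer run: a noisy search that improves on the
    # seed it starts from, but never beats an (invented) per-date optimum.
    candidate = seed_value - random.uniform(0.0, 1.0)
    true_floor = float(abs(date - 7))
    return max(candidate, true_floor)

for _ in range(20):                 # outer PEATSA-style iterations
    for d in dates:
        neighborhood = [best[n] for n in dates if abs(n - d) <= 2]
        seed = min(neighborhood)    # seed from the best nearby solution
        best[d] = min(best[d], run_case(d, seed))
```

The key design choice, as in PEATSA, is that a good solution found at one launch date seeds its neighbors, so the whole grid improves together instead of each date searching in isolation.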

  15. Automation and Control of an Imaging Internal Laser Desorption Fourier Transform Mass Spectrometer (I2LD-FTMS)

    Energy Technology Data Exchange (ETDEWEB)

    McJunkin, Timothy R; Tranter, Troy Joseph; Scott, Jill Rennee

    2002-06-01

    This paper describes the automation of an imaging internal source laser desorption Fourier transform mass spectrometer (I2LD-FTMS). The I2LD-FTMS consists of a laser-scanning device [Scott and Tremblay, Rev. Sci. Instrum. 2002, 73, 1108–1116] that has been integrated with a laboratory-built FTMS using a commercial data acquisition system (ThermoFinnigan FT/MS, Bremen, Germany). A new user interface has been developed in National Instruments' (Austin, Texas) graphical programming language LabVIEW to control the motors of the laser positioning system and the commercial FTMS data acquisition system. A macro-scripting feature of the FTMS software is exploited to control the FTMS from outside its own graphical user interface. The new user interface also allows the user to configure target locations. Automation of the data analysis, along with data display using commercial graphing software, is also described.

  16. Hexicon 2: automated processing of hydrogen-deuterium exchange mass spectrometry data with improved deuteration distribution estimation.

    Science.gov (United States)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L; Hamprecht, Fred A; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge at http://hx2.mpimf-heidelberg.mpg.de.
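The iterative deconvolution idea can be illustrated with a Richardson-Lucy-style update, a common choice for this kind of estimation (Hexicon 2's actual algorithm may differ): the observed isotope envelope is modelled as the natural isotope pattern convolved with the unknown deuteration distribution. All numbers below are invented:

```python
# Richardson-Lucy-style recovery of a deuteration distribution from a
# noiseless synthetic isotope envelope.
def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def richardson_lucy(observed, kernel, n_bins, iterations=2000):
    d = [1.0 / n_bins] * n_bins              # flat initial guess
    for _ in range(iterations):
        model = convolve(kernel, d)
        ratio = [o / m if m > 0 else 0.0 for o, m in zip(observed, model)]
        d = [d[j] * sum(kernel[i] * ratio[i + j]
                        for i in range(len(kernel)))
             for j in range(n_bins)]
        total = sum(d)
        d = [x / total for x in d]           # renormalize each iteration
    return d

natural = [0.7, 0.2, 0.1]                    # natural isotope pattern
true_dist = [0.1, 0.3, 0.4, 0.2]             # "true" deuteration levels
observed = convolve(natural, true_dist)      # noiseless synthetic data
estimate = richardson_lucy(observed, natural, n_bins=4)
```

The multiplicative update keeps the estimate non-negative and normalized, which is part of why such schemes are robust to noisy data, as the abstract notes.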

  17. An improved, automated whole air sampler and gas chromatography mass spectrometry analysis system for volatile organic compounds in the atmosphere

    Science.gov (United States)

    Lerner, Brian M.; Gilman, Jessica B.; Aikin, Kenneth C.; Atlas, Elliot L.; Goldan, Paul D.; Graus, Martin; Hendershot, Roger; Isaacman-VanWertz, Gabriel A.; Koss, Abigail; Kuster, William C.; Lueb, Richard A.; McLaughlin, Richard J.; Peischl, Jeff; Sueper, Donna; Ryerson, Thomas B.; Tokarek, Travis W.; Warneke, Carsten; Yuan, Bin; de Gouw, Joost A.

    2017-01-01

    Volatile organic compounds were quantified during two aircraft-based field campaigns using highly automated, whole air samplers with expedited post-flight analysis via a new custom-built, field-deployable gas chromatography-mass spectrometry instrument. During flight, air samples were pressurized with a stainless steel bellows compressor into electropolished stainless steel canisters. The air samples were analyzed using a novel gas chromatograph system designed specifically for field use which eliminates the need for liquid nitrogen. Instead, a Stirling cooler is used for cryogenic sample pre-concentration at temperatures as low as -165 °C. The analysis system was fully automated on a 20 min cycle to allow for unattended processing of an entire flight of 72 sample canisters within 30 h, thereby reducing typical sample residence times in the canisters to less than 3 days. The new analytical system is capable of quantifying a wide suite of C2 to C10 organic compounds at part-per-trillion sensitivity. This paper describes the sampling and analysis systems, along with the data analysis procedures which include a new peak-fitting software package for rapid chromatographic data reduction. Instrument sensitivities, uncertainties and system artifacts are presented for 35 trace gas species in canister samples. Comparisons of reported mixing ratios from each field campaign with measurements from other instruments are also presented.
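The stated throughput is easy to sanity-check: at one 20 min analysis cycle per canister, a 72-canister flight occupies 24 h of instrument time, consistent with the 30 h turnaround the abstract reports. In Python:

```python
# Back-of-envelope throughput check using the figures from the abstract.
cycle_min = 20                          # automated cycle per canister
canisters = 72                          # one flight's worth of samples
analysis_h = cycle_min * canisters / 60
slack_h = 30 - analysis_h               # margin within the 30 h turnaround
# 24 h of instrument time, leaving ~6 h for transport, setup and
# calibration runs before the 3-day canister residence limit matters.
```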

  18. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia

    2017-09-29

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway database. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  19. Hexicon 2: Automated Processing of Hydrogen-Deuterium Exchange Mass Spectrometry Data with Improved Deuteration Distribution Estimation

    Science.gov (United States)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L.; Hamprecht, Fred A.; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge at http://hx2.mpimf-heidelberg.mpg.de.

  20. Structural Characterization of Laboratory Made Tholins by IRMPD Action Spectroscopy and Ultrahigh Resolution Mass Spectrometry

    Science.gov (United States)

    Thissen, R.; Somogyi, A.; Vuitton, V.; Bégué, D.; Lemaire, J.; Steinmetz, V.

    2011-10-01

    The complex organic material that is found on the surface and within the haze layer of Titan is attributed to chemistry occurring in its thick N2/CH4 atmosphere. Although several groups produce the so-called tholins in various laboratory settings, and these have been investigated using analytical methods including UV/Vis, fluorescence, IR, and MS [1-5], these very complex organic mixtures still hold many unanswered questions, especially regarding their potential for prebiotic chemistry. In addition to tholin characterization and analysis, we recently investigated quantitatively the hydrolysis kinetics of tholins in pure and NH3-containing water at different temperatures [7,8]. Our groups at UJF (Grenoble) and at U of Arizona (Tucson) have been collaborating on mass spectral analyses of tholin samples for several years [9]. Here, we report our most recent results on the structural characterization of tholins by infrared multiphoton dissociation (IRMPD) action spectroscopy [10] and ultrahigh resolution MS. IRMPD action spectroscopy is a recently developed technique that uses IR photons of variable wavelengths to activate ions trapped inside an ion trap. When photons are absorbed at a given wavelength, the selected ion fragments, and this fragmentation is monitored as a function of wavelength, analogous to an absorption spectrum (impossible to record otherwise because of the much reduced ion density). This technique can therefore be used to determine IR spectra of ions in the gas phase, and provides very acute structural information. IRMPD action spectroscopy is often used to distinguish between structural isomers of isobaric ions. The drawback is that it requires high-power lasers. Only two Free Electron Lasers (FELs) in the world allow spectra to be recorded with reasonable resolution (20-25 cm-1). IRMPD action spectra of selected ions from tholins will be presented and discussed together with observed fragmentation processes that reveal structural

  1. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
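Theoretical isotope distribution modelling, one of the processing steps named above, can be sketched with a carbon-only binomial model; real deisotoping tools such as Decon2LS use full elemental or averagine compositions, so this simplification is illustrative only:

```python
# Carbon-only isotope-envelope sketch: the chance of k heavy carbons among
# n follows Binomial(n, P13C), with P13C the natural 13C abundance.
from math import comb

P13C = 0.0107  # natural 13C abundance

def carbon_isotope_pattern(n_carbons, max_peaks=5):
    return [comb(n_carbons, k) * P13C ** k * (1 - P13C) ** (n_carbons - k)
            for k in range(max_peaks)]

peptide_like = carbon_isotope_pattern(50)  # ~50 carbons, peptide-sized
```

Deisotoping works in the opposite direction: it matches observed peak envelopes against such theoretical patterns to infer charge state and monoisotopic mass.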

  2. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  3. Semi-automated digital measurement as the method of choice for beta cell mass analysis.

    Directory of Open Access Journals (Sweden)

    Violette Coppens

    Pancreas injury by partial duct ligation (PDL) activates beta cell differentiation and proliferation in adult mouse pancreas but remains controversial regarding the anticipated increase in beta cell volume. Several reports unable to show beta cell volume augmentation in PDL pancreas used automated digital image analysis software. We hypothesized that fully automatic beta cell morphometry without manual micrograph artifact remediation introduces bias and therefore might be responsible for the reported discrepancies and controversy. However, our present results prove that standard digital image processing with automatic thresholding is sufficiently robust, albeit less sensitive and less adequate, to demonstrate a significant increase in beta cell volume in PDL versus Sham-operated pancreas. We therefore conclude that other confounding factors, such as quality of surgery, selection of samples based on relative abundance of the transcription factor Neurogenin 3 (Ngn3), and tissue processing, give rise to inter-laboratory inconsistencies in beta cell volume quantification in PDL pancreas.
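Automatic thresholding of the kind debated here can be sketched with Otsu's method on a toy image; this pure-Python stand-in (synthetic pixel values, no real micrograph, not any particular commercial package) shows how a stained-area fraction, the proxy for beta cell volume, is derived:

```python
# Otsu's method: pick the grey-level threshold that maximizes the
# between-class variance, then report the stained-area fraction.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    best_t, best_var = 0, -1.0
    for t in range(1, levels):
        w0 = sum(hist[:t])
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0 = sum(i * hist[i] for i in range(t)) / w0
        mu1 = sum(i * hist[i] for i in range(t, levels)) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy "micrograph": dim background (~30) and bright stained cells (~200).
image = [30] * 900 + [200] * 100
t = otsu_threshold(image)
stained_fraction = sum(1 for p in image if p >= t) / len(image)
```

On real micrographs, artifacts (folds, debris, staining gradients) shift this threshold, which is exactly the bias the manual-remediation step is meant to remove.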

  4. Pharmacokinetic Studies of Chinese Medicinal Herbs Using an Automated Blood Sampling System and Liquid Chromatography-mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Yu-Tse Wu

    2012-01-01

    The safety of herbal products is one of the major concerns for the modernization of traditional Chinese medicine, and pharmacokinetic data on medicinal herbs guide the rational use of herbal formulas. This article reviews the advantages of automated blood sampling (ABS) systems for pharmacokinetic studies. In addition, three commonly used sample preparative methods, protein precipitation, liquid-liquid extraction and solid-phase extraction, are introduced. Furthermore, the definition, causes and evaluation of matrix effects in liquid chromatography-mass spectrometry (LC/MS) analysis are demonstrated. Finally, we present our previous works as practical examples of the application of ABS systems and LC/MS for the pharmacokinetic studies of Chinese medicinal herbs.
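Matrix-effect evaluation in LC/MS is commonly quantified with post-extraction spiking ratios; a minimal sketch with illustrative peak areas (the exact validation scheme in any given study may differ):

```python
# Post-extraction spike scheme: compare analyte peak areas in neat
# solvent, in blank matrix spiked after extraction, and in matrix
# spiked before extraction. All areas below are invented.
def matrix_effect_pct(area_post_spiked, area_neat):
    # >100% indicates ion enhancement, <100% ion suppression
    return 100.0 * area_post_spiked / area_neat

def recovery_pct(area_pre_spiked, area_post_spiked):
    # extraction recovery, independent of the matrix effect
    return 100.0 * area_pre_spiked / area_post_spiked

me = matrix_effect_pct(85000.0, 100000.0)   # 85% -> 15% suppression
re = recovery_pct(76500.0, 85000.0)         # 90% extraction recovery
```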

  5. RADARS, a bioinformatics solution that automates proteome mass spectral analysis, optimises protein identification, and archives data in a relational database.

    Science.gov (United States)

    Field, Helen I; Fenyö, David; Beavis, Ronald C

    2002-01-01

    RADARS, a rapid, automated, data archiving and retrieval software system for high-throughput proteomic mass spectral data processing and storage, is described. The majority of mass spectrometer data files are compatible with RADARS, for consistent processing. The system automatically takes unprocessed data files, identifies proteins via in silico database searching, then stores the processed data and search results in a relational database suitable for customized reporting. The system is robust, used in 24/7 operation, accessible to multiple users of an intranet through a web browser, can be monitored over a Virtual Private Network, and is secure. RADARS is scalable for use on one or many computers, and is suited to multiple processor systems. It can incorporate any local database in FASTA format, and can search protein and DNA databases online. A key feature is a suite of visualisation tools (many available gratis), allowing facile manipulation of spectra by hand annotation, reanalysis, and access to all procedures. We also describe the use of Sonar MS/MS, a novel, rapid search engine requiring 40 MB RAM per process for searches against a genomic or EST database translated in all six reading frames. RADARS reduces the cost of analysis through its efficient algorithms: Sonar MS/MS can identify proteins without accurate knowledge of the parent ion mass and without protein tags. Statistical scoring methods provide close-to-expert accuracy and bring robust data analysis to the non-expert user.
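The in silico database-search step that RADARS automates can be caricatured as peptide-mass-fingerprint matching; the toy database, masses and tolerance below are invented and do not reflect Sonar MS/MS scoring:

```python
# Score each protein by how many observed peptide masses fall within a
# tolerance of its theoretical tryptic peptide masses.
def pmf_score(observed, theoretical, tol=0.2):
    return sum(1 for m in observed
               if any(abs(m - t) <= tol for t in theoretical))

database = {
    "PROT_A": [842.5, 1045.6, 1479.8, 2211.1],   # hypothetical entries
    "PROT_B": [912.4, 1300.7, 1988.0],
}
observed = [842.51, 1479.75, 2211.08]
best = max(database, key=lambda p: pmf_score(observed, database[p]))
```

Production search engines add probabilistic scoring on top of such raw match counts, which is what the abstract's "close-to-expert accuracy" refers to.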

  6. Approaches towards the automated interpretation and prediction of electrospray tandem mass spectra of non-peptidic combinatorial compounds.

    Science.gov (United States)

    Klagkou, Katerina; Pullen, Frank; Harrison, Mark; Organ, Andy; Firth, Alistair; Langley, G John

    2003-01-01

    Combinatorial chemistry is widely used within the pharmaceutical industry as a means of rapid identification of potential drugs. With the growth of combinatorial libraries, mass spectrometry (MS) became the key analytical technique because of its speed of analysis, sensitivity, accuracy and ability to be coupled with other analytical techniques. In the majority of cases, electrospray mass spectrometry (ES-MS) has become the default ionisation technique. However, due to the absence of fragment ions in the resulting spectra, tandem mass spectrometry (MS/MS) is required to provide structural information for the identification of an unknown analyte. This work discusses the first steps of an investigation into the fragmentation pathways taking place in electrospray tandem mass spectrometry. The ultimate goal for this project is to set general fragmentation rules for non-peptidic, pharmaceutical, combinatorial compounds. As an aid, an artificial intelligence (AI) software package is used to facilitate interpretation of the spectra. This initial study has focused on determining the fragmentation rules for some classes of compound types that fit the remit as outlined above. Based on studies carried out on several combinatorial libraries of these compounds, it was established that different classes of drug molecules follow unique fragmentation pathways. In addition to these general observations, the specific ionisation processes and the fragmentation pathways involved in the electrospray mass spectra of these systems were explored. The ultimate goal will be to incorporate our findings into the computer program and allow identification of an unknown, non-peptidic compound following insertion of its ES-MS/MS spectrum into the AI package. The work herein demonstrates the potential benefit of such an approach in addressing the issue of high-throughput, automated MS/MS data interpretation. Copyright 2003 John Wiley & Sons, Ltd.

  7. An atomistic vision of the Mass Action Law: Prediction of carbon/oxygen defects in silicon

    Energy Technology Data Exchange (ETDEWEB)

    Brenet, G.; Timerkaeva, D.; Caliste, D.; Pochet, P. [CEA, INAC-SP2M, Atomistic Simulation Laboratory, F-38000 Grenoble (France); Univ. Grenoble Alpes, INAC-SP2M, L-Sim, F-38000 Grenoble (France); Sgourou, E. N.; Londos, C. A. [University of Athens, Solid State Physics Section, Panepistimiopolis Zografos, Athens 157 84 (Greece)

    2015-09-28

    We introduce an atomistic description of the kinetic Mass Action Law to predict concentrations of defects and complexes. We demonstrate in this paper that this approach accurately predicts carbon/oxygen related defect concentrations in silicon upon annealing. The model requires binding and migration energies of the impurities and complexes, here obtained from density functional theory (DFT) calculations. Vacancy-oxygen complex kinetics are studied as a model system during both isochronal and isothermal annealing. Results are in good agreement with experimental data, confirming the success of the methodology. More importantly, it gives access to the sequence of chain reactions by which oxygen- and carbon-related complexes are created in silicon. Beyond the case of silicon, the understanding of such intricate reactions is key to developing point-defect engineering strategies to control defects and thus semiconductor properties.
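The kinetic Mass Action Law for a single pairing reaction, say V + O ⇌ VO, can be sketched with forward-Euler integration; the rate constants and concentrations here are illustrative, not the DFT-derived values of the paper:

```python
# Forward-Euler integration of mass-action kinetics for V + O <-> VO.
# The net pairing rate is kf*[V][O] - kr*[VO]; each pairing event
# consumes one V and one O, so the totals V+VO and O+VO are conserved.
def integrate(cv, co, cvo, kf, kr, dt, steps):
    for _ in range(steps):
        rate = kf * cv * co - kr * cvo
        cv -= rate * dt
        co -= rate * dt
        cvo += rate * dt
    return cv, co, cvo

cv, co, cvo = integrate(cv=1.0, co=2.0, cvo=0.0,
                        kf=1.0, kr=0.1, dt=1e-3, steps=20000)
# at equilibrium kf*cv*co = kr*cvo, i.e. cv*co/cvo = kr/kf
```

Chaining many such reactions, with Arrhenius rates built from the DFT binding and migration energies, is the essence of the approach the abstract describes.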

  8. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
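The exponential fitting of dissolution profiles mentioned above can be sketched by linearizing C(t) = C_inf*(1 - exp(-k*t)) against the known plateau; the data points below are synthetic, generated with the reported ibuprofen rate k = 0.43 h⁻¹, and the fitting scheme is one simple choice among several:

```python
# First-order release fit: ln(1 - C/C_inf) = -k*t is a line through the
# origin, so k follows from a one-parameter least-squares slope.
from math import exp, log

def fit_rate(times_h, conc, c_inf):
    xs, ys = [], []
    for t, c in zip(times_h, conc):
        if 0 < c < c_inf:              # points usable after linearization
            xs.append(t)
            ys.append(log(1.0 - c / c_inf))
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return -slope

k_true, c_inf = 0.43, 2.0              # 1/h and mM, synthetic
times = [0.5, 1.0, 2.0, 4.0, 6.0, 8.0]
conc = [c_inf * (1.0 - exp(-k_true * t)) for t in times]
k_fit = fit_rate(times, conc, c_inf)
```

With noisy 10 h extraction-MS data, a nonlinear fit with both C_inf and k free would be the more robust choice; the linearized form shown here is just the most transparent.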

  9. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    International Nuclear Information System (INIS)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L.

    2015-01-01

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates the

  10. Effective action and electromagnetic response of topological superconductors and Majorana-mass Weyl fermions

    Science.gov (United States)

    Stone, Michael; Lopes, Pedro L. e. S.

    2016-05-01

    Motivated by an apparent paradox in [X.-L. Qi, E. Witten, and S.-C. Zhang, Phys. Rev. B 87, 134519 (2013), 10.1103/PhysRevB.87.134519], we use the method of gauged Wess-Zumino-Witten functionals to construct an effective action for a Weyl fermion with a Majorana mass that arises from coupling to a charged condensate. We obtain expressions for the current induced by an external gauge field and observe that the topological part of the current is only one-third of what might have been expected from the gauge anomaly. The anomaly is not changed by the induced mass gap, however. The topological current is supplemented by a conventional supercurrent that provides the remaining two-thirds of the anomaly once the equation of motion for the Goldstone mode is satisfied. We apply our formula for the current to resolve the apparent paradox and also to the chiral magnetic effect (CME), where it predicts a reduction of the CME current to one-third of its value for a free Weyl gas in thermal equilibrium. We attribute this reduction to a partial cancellation of the CME by a chiral vortical effect current arising from the persistent rotation of the fluid induced by the external magnetic field.

  11. Towards automated discrimination of lipids versus peptides from full scan mass spectra

    Directory of Open Access Journals (Sweden)

    Piotr Dittwald

    2014-09-01

    Full Text Available Although physicochemical fractionation techniques play a crucial role in the analysis of complex mixtures, they are not necessarily the best solution to separate specific molecular classes, such as lipids and peptides. Any physical fractionation step such as, for example, those based on liquid chromatography, will introduce its own variation and noise. In this paper we investigate to what extent the high sensitivity and resolution of contemporary mass spectrometers offers viable opportunities for computational separation of signals in full scan spectra. We introduce an automatic method that can discriminate peptide from lipid peaks in full scan mass spectra, based on their isotopic properties. We systematically evaluate which features maximally contribute to a peptide versus lipid classification. The selected features are subsequently used to build a random forest classifier that enables almost perfect separation between lipid and peptide signals without requiring ion fragmentation and classical tandem MS-based identification approaches. The classifier is trained on in silico data, but is also capable of discriminating signals in real world experiments. We evaluate the influence of typical data inaccuracies of common classes of mass spectrometry instruments on the optimal set of discriminant features. Finally, the method is successfully extended towards the classification of individual lipid classes from full scan mass spectral features, based on input data defined by the Lipid Maps Consortium.
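The classification step described above can be sketched with scikit-learn. The features and class distributions below are simulated stand-ins (the paper derives its features from isotopic properties of in silico spectra and selects them systematically), so this illustrates the approach rather than reproducing it:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical isotope-derived features per peak: monoisotopic m/z,
# A+1/A isotope intensity ratio, and a mass-defect-like quantity.
rng = np.random.default_rng(1)
n = 400
pep = np.column_stack([rng.uniform(800, 3000, n),   # simulated peptide peaks
                       rng.normal(0.55, 0.08, n),
                       rng.normal(0.50, 0.05, n)])
lip = np.column_stack([rng.uniform(400, 1000, n),   # simulated lipid peaks
                       rng.normal(0.35, 0.08, n),
                       rng.normal(0.75, 0.05, n)])
X = np.vstack([pep, lip])
y = np.array([1] * n + [0] * n)  # 1 = peptide, 0 = lipid

# Random forest trained and evaluated by 5-fold cross-validation.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold accuracy: {scores.mean():.2f}")
```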

  12. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Vilém Guryča

    2014-03-01

    Full Text Available The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the prerequisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility.

  13. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software

    Science.gov (United States)

    Liang, Zhidan; McGuinness, Kenneth N.; Crespo, Alejandro; Zhong, Wendy

    2018-05-01

    Disulfide bond formation is critical for maintaining the structural stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of the tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides. A thorough and in-depth understanding of how disulfide-linked peptides fragment in the mass spectrometer is key to developing software to interpret the tandem mass spectra of these peptides. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a computer algorithm that was newly developed based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved. [Figure not available: see fulltext.]

  14. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software.

    Science.gov (United States)

    Liang, Zhidan; McGuinness, Kenneth N; Crespo, Alejandro; Zhong, Wendy

    2018-01-25

    Disulfide bond formation is critical for maintaining the structural stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of the tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides. A thorough and in-depth understanding of how disulfide-linked peptides fragment in the mass spectrometer is key to developing software to interpret the tandem mass spectra of these peptides. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a computer algorithm that was newly developed based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved. Graphical Abstract.

  15. Formularity: Software for Automated Formula Assignment of Natural and Other Organic Matter from Ultrahigh-Resolution Mass Spectra.

    Science.gov (United States)

    Tolić, Nikola; Liu, Yina; Liyu, Andrey; Shen, Yufeng; Tfaily, Malak M; Kujawinski, Elizabeth B; Longnecker, Krista; Kuo, Li-Jung; Robinson, Errol W; Paša-Tolić, Ljiljana; Hess, Nancy J

    2017-12-05

    Ultrahigh resolution mass spectrometry, such as Fourier transform ion cyclotron resonance mass spectrometry (FT ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work, we describe software Formularity with a user-friendly interface for CIA function and newly developed search function Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in nature as well as anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. Tap water and HOC spike in Suwannee River NOM were used to assess HOC identification in complex environmental samples. Strategies for reconciliation of CIA and IPA assignments were discussed. Software and sample databases with documentation are freely available.
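As a rough illustration of automated formula assignment (not the CIA or IPA algorithms themselves, which additionally apply chemical filtering rules and isotopic-pattern matching), one can enumerate elemental compositions whose monoisotopic mass matches a measured neutral mass within a ppm tolerance:

```python
from itertools import product

# Monoisotopic element masses in u. Element set restricted to C/H/O/N
# for brevity; the tools discussed above handle more elements.
MASS = {"C": 12.0, "H": 1.00782503, "O": 15.9949146, "N": 14.0030740}

def assign_formulas(neutral_mass, ppm=1.0, max_atoms=(40, 80, 20, 5)):
    """Enumerate C/H/O/N formulas matching a neutral monoisotopic mass."""
    max_c, max_h, max_o, max_n = max_atoms
    tol = neutral_mass * ppm * 1e-6
    hits = []
    for c, o, n in product(range(1, max_c + 1),
                           range(max_o + 1), range(max_n + 1)):
        rem = (neutral_mass - c * MASS["C"]
               - o * MASS["O"] - n * MASS["N"])
        h = round(rem / MASS["H"])  # best integer hydrogen count
        if 0 < h <= max_h and abs(rem - h * MASS["H"]) <= tol:
            hits.append("C%d" % c + "H%d" % h
                        + ("O%d" % o if o else "")
                        + ("N%d" % n if n else ""))
    return hits

# Glucose: C6H12O6, neutral monoisotopic mass 180.06339 u.
print(assign_formulas(180.06339))  # ['C6H12O6'] at 1 ppm
```

At low mass a 1 ppm window usually yields a unique CHNO candidate; the ambiguity (and hence the need for the chemical rules CIA applies) grows rapidly with mass.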

  16. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    Energy Technology Data Exchange (ETDEWEB)

    Milliard, Alex; Durand-Jezequel, Myriam [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada); Lariviere, Dominic, E-mail: dominic.lariviere@chm.ulaval.ca [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada)

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs), and experimental data show no evidence of cross-contamination of crucibles when LiBO{sub 2}/LiBr melts were used. The use of a M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg{sup -1} for 5-300 mg of sample.

  17. Formularity: Software for Automated Formula Assignment of Natural and Other Organic Matter from Ultrahigh-Resolution Mass Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Tolic, Nikola; Liu, Yina; Liyu, Andrey V.; Shen, Yufeng; Tfaily, Malak M.; Kujawinski, Elizabeth B.; Longnecker, Krista; Kuo, Li-Jung; Robinson, Errol W.; Pasa Tolic, Ljiljana; Hess, Nancy J.

    2017-11-13

    Ultrahigh-resolution mass spectrometry, such as Fourier transform ion-cyclotron resonance mass spectrometry (FT-ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work we describe a user-friendly interface for CIA, titled Formularity, which includes additional functionality to perform searches of formulas based on an Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in nature as well as anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. HOC spikes in NOM and tap water were used to assess HOC identification in natural and anthropogenic matrices. Strategies for reconciliation of CIA and IPA assignments are discussed. Software and sample databases with documentation are freely available from the PNNL OMICS software repository https://omics.pnl.gov/software/formularity.

  18. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    Science.gov (United States)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified cross a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  19. SAIDE: A Semi-Automated Interface for Hydrogen/Deuterium Exchange Mass Spectrometry.

    Science.gov (United States)

    Villar, Maria T; Miller, Danny E; Fenton, Aron W; Artigues, Antonio

    2010-01-01

    Deuterium/hydrogen exchange in combination with mass spectrometry (DH MS) is a sensitive technique for detection of changes in protein conformation and dynamics. Since temperature, pH and timing control are the key elements for reliable and efficient measurement of hydrogen/deuterium content in proteins and peptides, we have developed a small, semiautomatic interface for deuterium exchange that interfaces the HPLC pumps with a mass spectrometer. This interface is relatively inexpensive to build, and provides efficient temperature and timing control in all stages of enzyme digestion, HPLC separation and mass analysis of the resulting peptides. We have tested this system with a series of standard tryptic peptides reconstituted in a solvent containing increasing concentration of deuterium. Our results demonstrate the use of this interface results in minimal loss of deuterium due to back exchange during HPLC desalting and separation. For peptides reconstituted in a buffer containing 100% deuterium, and assuming that all amide linkages have exchanged hydrogen with deuterium, the maximum loss of deuterium content is only 17% of the label, indicating the loss of only one deuterium molecule per peptide.

  20. Connecting imaging mass spectrometry and magnetic resonance imaging-based anatomical atlases for automated anatomical interpretation and differential analysis.

    Science.gov (United States)

    Verbeeck, Nico; Spraggins, Jeffrey M; Murphy, Monika J M; Wang, Hui-Dong; Deutch, Ariel Y; Caprioli, Richard M; Van de Plas, Raf

    2017-07-01

    Imaging mass spectrometry (IMS) is a molecular imaging technology that can measure thousands of biomolecules concurrently without prior tagging, making it particularly suitable for exploratory research. However, the data size and dimensionality often make thorough extraction of relevant information impractical. To help guide and accelerate IMS data analysis, we recently developed a framework that integrates IMS measurements with anatomical atlases, opening up opportunities for anatomy-driven exploration of IMS data. One example is the automated anatomical interpretation of ion images, where empirically measured ion distributions are automatically decomposed into their underlying anatomical structures. While offering significant potential, IMS-atlas integration has thus far been restricted to the Allen Mouse Brain Atlas (AMBA) and mouse brain samples. Here, we expand the applicability of this framework by extending towards new animal species and a new set of anatomical atlases retrieved from the Scalable Brain Atlas (SBA). Furthermore, as many SBA atlases are based on magnetic resonance imaging (MRI) data, a new registration pipeline was developed that enables direct non-rigid IMS-to-MRI registration. These developments are demonstrated on protein-focused FTICR IMS measurements from coronal brain sections of a Parkinson's disease (PD) rat model. The measurements are integrated with an MRI-based rat brain atlas from the SBA. The new rat-focused IMS-atlas integration is used to perform automated anatomical interpretation and to find differential ions between healthy and diseased tissue. IMS-atlas integration can serve as an important accelerator in IMS data exploration, and with these new developments it can now be applied to a wider variety of animal species and modalities. This article is part of a Special Issue entitled: MALDI Imaging, edited by Dr. Corinna Henkel and Prof. Peter Hoffmann. Copyright © 2017. Published by Elsevier B.V.
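The automated anatomical interpretation mentioned above, decomposing a measured ion distribution into contributions from known anatomical structures, can be illustrated with a non-negative least-squares fit over structure masks. The 1-D masks and ion image below are toy data; the actual framework also involves atlas registration and more elaborate modeling:

```python
import numpy as np
from scipy.optimize import nnls

# Toy 1-D "tissue" of 100 pixels with three anatomical structure masks.
masks = np.zeros((3, 100))
masks[0, :40] = 1.0    # structure A
masks[1, 40:70] = 1.0  # structure B
masks[2, 70:] = 1.0    # structure C

# Simulated ion image: abundant in A, absent in B, weak in C.
rng = np.random.default_rng(2)
ion_image = 5.0 * masks[0] + 0.5 * masks[2] + rng.normal(0, 0.05, 100)

# Decompose: non-negative structure weights whose mixture best
# reconstructs the measured ion distribution.
weights, residual = nnls(masks.T, ion_image)
print(np.round(weights, 2))  # approximately [5.0, 0.0, 0.5]
```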

  1. Computer-aided detection system for masses in automated whole breast ultrasonography: development and evaluation of the effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeoung Hyun [Dept. of Radiology, Ewha Womans University Mokdong Hospital, Ewha Womans University School of Medicine, Seoul (Korea, Republic of); Cha, Joo Hee; Kim, Nam Kug; Chang, Young Jun; Kim, Hak Hee [Dept. of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Ko, Myung Su [Health Screening and Promotion Center, Asan Medical Center, Seoul (Korea, Republic of); Choi, Young Wook [Korea Electrotechnology Research Institute, Ansan (Korea, Republic of)

    2014-04-15

    The aim of this study was to evaluate the performance of a proposed computer-aided detection (CAD) system in automated breast ultrasonography (ABUS). Eighty-nine two-dimensional images (20 cysts, 42 benign lesions, and 27 malignant lesions) were obtained from 47 patients who underwent ABUS (ACUSON S2000). After boundary detection and removal, we detected mass candidates by using the proposed adjusted Otsu's threshold; the threshold was adaptive to the variations of pixel intensities in an image. Then, the detected candidates were segmented. Features of the segmented objects were extracted and used for training/testing in the classification. In our study, a support vector machine classifier was adopted. Eighteen features were used to determine whether the candidates were true lesions or not. A five-fold cross validation was repeated 20 times for the performance evaluation. The sensitivity and the false positive rate per image were calculated, and the classification accuracy was evaluated for each feature. In the classification step, the sensitivity of the proposed CAD system was 82.67% (SD, 0.02%). The false positive rate was 0.26 per image. In the detection/segmentation step, the sensitivities for benign and malignant mass detection were 90.47% (38/42) and 92.59% (25/27), respectively. In the five-fold cross-validation, the standard deviation of pixel intensities for the mass candidates was the most frequently selected feature, followed by the vertical position of the centroids. In the univariate analysis, each feature had 50% or higher accuracy. The proposed CAD system can be used for lesion detection in ABUS and may be useful in improving the screening efficiency.
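The mass-candidate detection step above relies on intensity thresholding. Below is a standard Otsu threshold implemented with NumPy as a sketch of the underlying idea; the paper's "adjusted" Otsu variant, adaptive to per-image intensity variation, is not reproduced here:

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Standard Otsu threshold: pick the cut maximizing between-class variance."""
    hist, edges = np.histogram(image, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)            # background class probability
    w1 = 1.0 - w0                # foreground class probability
    mu = np.cumsum(p * centers)  # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    return centers[np.nanargmax(between)]

# Bimodal toy image: dark background plus a brighter "mass" region.
rng = np.random.default_rng(3)
img = np.concatenate([rng.normal(50, 10, 5000), rng.normal(150, 10, 500)])
t = otsu_threshold(img)
print(round(t))  # threshold falls between the two intensity modes
```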

  2. Automated Morphological and Morphometric Analysis of Mass Spectrometry Imaging Data: Application to Biomarker Discovery

    Science.gov (United States)

    Picard de Muller, Gaël; Ait-Belkacem, Rima; Bonnel, David; Longuespée, Rémi; Stauber, Jonathan

    2017-12-01

    Mass spectrometry imaging datasets are mostly analyzed in terms of average intensity in regions of interest. However, biological tissues have different morphologies with several sizes, shapes, and structures. The important biological information, contained in this highly heterogeneous cellular organization, could be hidden by analyzing the average intensities. Finding an analytical process of morphology would help to find such information, describe tissue model, and support identification of biomarkers. This study describes an informatics approach for the extraction and identification of mass spectrometry image features and its application to sample analysis and modeling. For the proof of concept, two different tissue types (healthy kidney and CT-26 xenograft tumor tissues) were imaged and analyzed. A mouse kidney model and tumor model were generated using morphometric - number of objects and total surface - information. The morphometric information was used to identify m/z that have a heterogeneous distribution. It seems to be a worthwhile pursuit as clonal heterogeneity in a tumor is of clinical relevance. This study provides a new approach to find biomarker or support tissue classification with more information. [Figure not available: see fulltext.

  3. Identification of triacylglycerol using automated annotation of high resolution multistage mass spectral trees.

    Science.gov (United States)

    Wang, Xiupin; Peng, Qingzhi; Li, Peiwu; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen; Zhang, Liangxiao

    2016-10-12

    High complexity of identification for non-target triacylglycerols (TAGs) is a major challenge in lipidomics analysis. To identify non-target TAGs, a powerful tool named accurate MS(n) spectrometry generating so-called ion trees is used. In this paper, we presented a technique for efficient structural elucidation of TAGs on MS(n) spectral trees produced by LTQ Orbitrap MS(n), which was implemented as an open source software package, or TIT. The TIT software was used to support automatic annotation of non-target TAGs on MS(n) ion trees from a self-built fragment ion database. This database includes 19108 simulate TAG molecules from a random combination of fatty acids and corresponding 500582 self-built multistage fragment ions (MS ≤ 3). Our software can identify TAGs using a "stage-by-stage elimination" strategy. By utilizing the MS(1) accurate mass and referenced RKMD, the TIT software can discriminate unique elemental composition candidates. The regiospecific isomers of fatty acyl chains will be distinguished using MS(2) and MS(3) fragment spectra. We applied the algorithm to the selection of 45 TAG standards and demonstrated that the molecular ions could be 100% correctly assigned. Therefore, the TIT software could be applied to TAG identification in complex biological samples such as mouse plasma extracts. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Determination of thermodynamic potentials and the aggregation number for micelles with the mass-action model by isothermal titration calorimetry

    DEFF Research Database (Denmark)

    Olesen, Niels Erik; Westh, Peter; Holm, René

    2015-01-01

    of micelles with ITC were compared to a mass-action model (MAM) of reaction type: n⋅S⇌Mn. This analysis can provide guidelines for future ITC studies of systems behaving in accordance with this model such as micelles and proteins that undergo self-association to oligomers. Micelles with small aggregation...
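The closed mass-action model of reaction type n·S ⇌ Mₙ referenced above implies a standard equilibrium relation (a textbook mass-action result, not taken from this record):

```latex
% Closed association model:  n S  <=>  M_n .
% The Law of Mass Action gives the equilibrium constant and the
% standard free energy of micellization:
\[
K \;=\; \frac{[\mathrm{M}_n]}{[\mathrm{S}]^{\,n}},
\qquad
\Delta G^{\circ} \;=\; -RT \ln K .
\]
% For large n this pins the free monomer concentration [S] near the
% critical micelle concentration once micelles M_n begin to form.
```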

  5. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements.

    Science.gov (United States)

    Orton, Daniel J; Tfaily, Malak M; Moore, Ronald J; LaMarche, Brian L; Zheng, Xueyun; Fillmore, Thomas L; Chu, Rosalie K; Weitz, Karl K; Monroe, Matthew E; Kelly, Ryan T; Smith, Richard D; Baker, Erin S

    2018-01-02

    To better understand disease conditions and environmental perturbations, multiomic studies combining proteomic, lipidomic, and metabolomic analyses are vastly increasing in popularity. In a multiomic study, a single sample is typically extracted in multiple ways, and various analyses are performed using different instruments, most often based upon mass spectrometry (MS). Thus, one sample becomes many measurements, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injections. While some FIA systems have been created to address these challenges, many have limitations such as costly consumables, low pressure capabilities, limited pressure monitoring, and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at a range of flow rates (∼50 nL/min to 500 μL/min) to accommodate both low- and high-flow MS ionization sources. This system also functions at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system, and results showed a highly robust and reproducible platform capable of providing consistent performance over many days without carryover, as long as washing buffers specific to each molecular analysis were utilized.

  6. Automated Assessment of Left Ventricular Function and Mass Using Heart Deformation Analysis: Initial Experience in 160 Older Adults.

    Science.gov (United States)

    Lin, Kai; Collins, Jeremy D; Lloyd-Jones, Donald M; Jolly, Marie-Pierre; Li, Debiao; Markl, Michael; Carr, James C

    2016-03-01

    To assess the performance of automated quantification of left ventricular function and mass based on heart deformation analysis (HDA) in asymptomatic older adults. This study complied with Health Insurance Portability and Accountability Act regulations. Following the approval of the institutional review board, 160 asymptomatic older participants were recruited for cardiac magnetic resonance imaging, including two-dimensional cine images covering the entire left ventricle in short-axis view. Data analysis included the calculation of left ventricular ejection fraction (LVEF), left ventricular mass (LVM), and cardiac output (CO) using HDA and standard global cardiac function analysis (delineation of end-systolic and end-diastolic left ventricle epi- and endocardial borders). The agreement between methods was evaluated using the intraclass correlation coefficient (ICC) and coefficient of variation (CoV). HDA had a shorter processing time than the standard method (1.5 ± 0.3 min/case vs. 5.8 ± 1.4 min/case). There was a systematic bias toward lower LVEF with HDA compared to the standard technique (62.8% ± 8.3% vs. 69.3% ± 6.7%). Conversely, HDA overestimated LVM (114.8 ± 30.1 g vs. 100.2 ± 29.0 g). HDA has the potential to measure LVEF, CO, and LVM without the need for user interaction, based on standard cardiac two-dimensional cine images. Copyright © 2015 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  7. Cato Guldberg and Peter Waage, the history of the Law of Mass Action, and its relevance to clinical pharmacology.

    Science.gov (United States)

    Ferner, Robin E; Aronson, Jeffrey K

    2016-01-01

    We have traced the historical link between the Law of Mass Action and clinical pharmacology. The Law evolved from the work of the French chemist Claude Louis Berthollet, was first formulated by Cato Guldberg and Peter Waage in 1864, and was later clarified by the Dutch chemist Jacobus van 't Hoff in 1877. It has profoundly influenced our qualitative and quantitative understanding of a number of physiological and pharmacological phenomena. According to the Law of Mass Action, the velocity of a chemical reaction depends on the concentrations of the reactants. At equilibrium, the concentrations of the chemicals involved bear a constant relation to each other, described by the equilibrium constant, K. The Law of Mass Action is relevant to various physiological and pharmacological concepts, including: concentration-effect curves, dose-response curves, and ligand-receptor binding curves, all of which are important in describing the pharmacological actions of medications; the Langmuir adsorption isotherm, which describes the binding of medications to proteins; activation curves for transmembrane ion transport; enzyme inhibition; and the Henderson-Hasselbalch equation, which describes the relation between pH, as a measure of acidity, and the concentrations of the contributory acids and bases. Guldberg and Waage recognized the importance of dynamic equilibrium, while others failed to do so. Their ideas, over 150 years old, are embedded in and still relevant to clinical pharmacology. Here we explain the ideas, and in a subsequent paper we show how they are relevant to understanding adverse drug reactions. © 2015 The British Pharmacological Society.
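In standard notation (a textbook statement consistent with the description above), for a generic reversible reaction aA + bB ⇌ cC + dD the Law reads:

```latex
% Forward and reverse rates by the Law of Mass Action:
%   v_f = k_f [A]^a [B]^b ,   v_r = k_r [C]^c [D]^d .
% At equilibrium v_f = v_r, so the concentrations satisfy
\[
K \;=\; \frac{k_f}{k_r}
  \;=\; \frac{[\mathrm{C}]^{c}\,[\mathrm{D}]^{d}}
             {[\mathrm{A}]^{a}\,[\mathrm{B}]^{b}} .
\]
```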

  8. Automated gravimetric sample pretreatment using an industrial robot for the high-precision determination of plutonium by isotope dilution mass spectrometry

    International Nuclear Information System (INIS)

    Surugaya, Naoki; Hiyama, Toshiaki; Watahiki, Masaru

    2008-01-01

    A robotized sample-preparation method for the determination of Pu, which is recovered by extraction reprocessing of spent nuclear fuel, by isotope dilution mass spectrometry (IDMS) is described. The automated system uses a six-axis industrial robot, whose motion is very fast, accurate, and flexible, installed in a glove box. The automation of the weighing and dilution steps enables operator-unattended sample pretreatment for the high-precision analysis of Pu in aqueous solutions. Using the developed system, the Pu concentration in a HNO3 medium was successfully determined using a set of subsequent mass spectrometric measurements. The relative uncertainty in determining the Pu concentration by IDMS using this system was estimated to be less than 0.1% (k=2), equal to that expected of a skilled analyst. The operation time required was the same as that for a skilled operator. (author)

  9. Novel heparan sulfate assay by using automated high-throughput mass spectrometry: Application to monitoring and screening for mucopolysaccharidoses.

    Science.gov (United States)

    Shimada, Tsutomu; Kelly, Joan; LaMarr, William A; van Vlies, Naomi; Yasuda, Eriko; Mason, Robert W; Mackenzie, William; Kubaski, Francyne; Giugliani, Roberto; Chinen, Yasutsugu; Yamaguchi, Seiji; Suzuki, Yasuyuki; Orii, Kenji E; Fukao, Toshiyuki; Orii, Tadao; Tomatsu, Shunji

    2014-01-01

    Mucopolysaccharidoses (MPS) are caused by deficiency of one of a group of specific lysosomal enzymes, resulting in excessive accumulation of glycosaminoglycans (GAGs). We previously developed GAG assay methods using liquid chromatography tandem mass spectrometry (LC-MS/MS); however, it takes 4-5 min per sample for analysis. For the large numbers of samples in a screening program, a more rapid process is desirable. The automated high-throughput mass spectrometry (HT-MS/MS) system (RapidFire) integrates a solid phase extraction robot to concentrate and desalt samples prior to direct injection into the MS/MS without chromatographic separation, thereby allowing each sample to be processed within 10 s (enabling screening of more than one million samples per year). The aim of this study was to develop a higher throughput system to assay heparan sulfate (HS) using HT-MS/MS, and to compare its reproducibility, sensitivity and specificity with conventional LC-MS/MS. HS levels were measured in the blood (plasma and serum) from control subjects and patients with MPS II, III, or IV and in dried blood spots (DBS) from newborn controls and patients with MPS I, II, or III. Results obtained from HT-MS/MS showed 1) that there was a strong correlation of levels of disaccharides derived from HS in the blood, between those calculated using conventional LC-MS/MS and HT-MS/MS, 2) that levels of HS in the blood were significantly elevated in patients with MPS II and III, but not in MPS IVA, 3) that the level of HS in patients with a severe form of MPS II was higher than that in an attenuated form, 4) that reduction of blood HS level was observed in MPS II patients treated with enzyme replacement therapy or hematopoietic stem cell transplantation, and 5) that levels of HS in newborn DBS were elevated in patients with MPS I, II or III, compared to those of control newborns. 
In conclusion, HT-MS/MS provides much higher throughput than LC-MS/MS-based methods with similar sensitivity and specificity
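A quick sanity check of the throughput figures quoted above (10 s per sample for HT-MS/MS versus 4-5 min per sample for LC-MS/MS), assuming uninterrupted round-the-clock operation:

```python
# Samples per year at the stated cycle times, assuming continuous
# operation with no downtime (an idealized upper bound).
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 s

ht_per_year = SECONDS_PER_YEAR / 10        # HT-MS/MS: 10 s per sample
lc_per_year = SECONDS_PER_YEAR / (4 * 60)  # LC-MS/MS: 4 min per sample

print(int(ht_per_year), int(lc_per_year))  # -> 3153600 131400
```

Even at the faster 4 min LC-MS/MS cycle time, the chromatographic method falls well short of one million samples per year, while the 10 s cycle clears it roughly threefold.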

  10. Automated multi-plug filtration cleanup for liquid chromatographic-tandem mass spectrometric pesticide multi-residue analysis in representative crop commodities.

    Science.gov (United States)

    Qin, Yuhong; Zhang, Jingru; Zhang, Yuan; Li, Fangbing; Han, Yongtao; Zou, Nan; Xu, Haowei; Qian, Meiyuan; Pan, Canping

    2016-09-02

    An automated multi-plug filtration cleanup (m-PFC) method on modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) extracts was developed. The automated device was designed to reduce the labor-intensive manual workload in the cleanup steps. It could control the volume and the speed of pulling and pushing cycles accurately. In this work, m-PFC was based on multi-walled carbon nanotubes (MWCNTs) mixed with other sorbents and anhydrous magnesium sulfate (MgSO4) in a packed tip for analysis of pesticide multi-residues in crop commodities, followed by liquid chromatography with tandem mass spectrometric (LC-MS/MS) detection. It was validated by analyzing 25 pesticides in six representative matrices spiked at two concentration levels of 10 and 100 μg/kg. Salts, sorbents, the m-PFC procedure, automated pulling and pushing volume, automated pulling speed, and pushing speed were optimized for each matrix. After optimization, two general automated m-PFC methods were introduced for relatively simple (apple, citrus fruit, peanut) and relatively complex (spinach, leek, green tea) matrices. Spike recoveries were within 83-108%, with 1-14% RSD, for most analytes in the tested matrices. Matrix-matched calibrations were performed with coefficients of determination >0.997 between concentration levels of 10 and 1000 μg/kg. The developed method was successfully applied to the determination of pesticide residues in market samples. Copyright © 2016 Elsevier B.V. All rights reserved.
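The validation figures above (recovery within a stated window, RSD per analyte) follow the standard definitions; a minimal sketch with hypothetical replicate data, not values from the study:

```python
# Spike recovery and relative standard deviation (RSD) as used in
# pesticide-residue method validation. Replicate values are hypothetical.
from statistics import mean, stdev

def recovery_percent(measured_mean, spiked):
    """Recovery (%) = mean measured concentration / spiked concentration x 100."""
    return 100.0 * measured_mean / spiked

def rsd_percent(values):
    """RSD (%) = sample standard deviation / mean x 100."""
    return 100.0 * stdev(values) / mean(values)

# Five hypothetical replicates of a 10 ug/kg spike (measured, ug/kg):
replicates = [9.1, 9.4, 8.8, 9.6, 9.0]
print(round(recovery_percent(mean(replicates), 10.0), 1))  # recovery, %
print(round(rsd_percent(replicates), 1))                   # RSD, %
```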

  11. The mass-action law based algorithm for cost-effective approach for cancer drug discovery and development.

    Science.gov (United States)

    Chou, Ting-Chao

    2011-01-01

    The mass-action law based system analysis via mathematical induction and deduction leads to a generalized theory and algorithm that allows computerized simulation of dose-effect dynamics with small-size experiments using a small number of data points in vitro, in animals, and in humans. The median-effect equation of the mass-action law, deduced from over 300 mechanism-specific equations, has been shown to be the unified theory that serves as the common link for complicated biomedical systems. After using the median-effect principle as the common denominator, its applications are mechanism-independent, drug unit-independent, and dynamic order-independent, and can be used generally for single-drug analysis or for multiple drug combinations in constant or non-constant ratios. Since the "median" is the common link and universal reference point in biological systems, these general features enable computerized quantitative bio-informatics for econo-green bio-research in broad disciplines. Specific applications of the theory, especially relevant to drug discovery, drug combination, and clinical trials, have been cited or illustrated in terms of algorithms, experimental design and computerized simulation for data analysis. Lessons learned from cancer research during the past fifty years provide a valuable opportunity to reflect, to improve the conventional divergent approach, and to introduce a new convergent avenue, based on the mass-action law principle, for efficient cancer drug discovery and low-cost drug development.
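The median-effect equation referred to above is commonly written fa/fu = (D/Dm)^m, with fa the fraction affected, fu = 1 - fa, Dm the median-effect dose and m the sigmoidicity coefficient. A minimal sketch of the forward and inverted forms (all numbers hypothetical):

```python
# Chou's median-effect equation: fa/fu = (D/Dm)^m.
# Rearranged, fa = 1 / (1 + (Dm/D)^m); inverting gives the dose needed
# for a target effect level. Parameter values below are hypothetical.

def fraction_affected(dose, dm, m):
    """fa = 1 / (1 + (Dm/D)^m), the dose-effect form of the equation."""
    return 1.0 / (1.0 + (dm / dose) ** m)

def dose_for_effect(fa, dm, m):
    """Invert the equation: D = Dm * (fa/(1-fa))^(1/m)."""
    return dm * (fa / (1.0 - fa)) ** (1.0 / m)

# At the median-effect dose, half the system is affected by definition:
print(fraction_affected(2.0, dm=2.0, m=1.5))  # -> 0.5
```

The inversion is what makes the equation useful with small experiments: a straight-line fit of log(fa/fu) versus log(D) yields m and Dm, after which any dose-effect point can be interpolated.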

  12. Prediction of body mass index status from voice signals based on machine learning for automated medical applications.

    Science.gov (United States)

    Lee, Bum Ju; Kim, Keun Ho; Ku, Boncho; Jang, Jun-Su; Kim, Jong Yeol

    2013-05-01

    The body mass index (BMI) provides essential medical information related to body weight for the treatment and prognosis prediction of diseases such as cardiovascular disease, diabetes, and stroke. We propose a method for the prediction of normal, overweight, and obese classes based only on the combination of voice features that are associated with BMI status, independently of weight and height measurements. A total of 1568 subjects were divided into 4 groups according to age and gender differences. We performed statistical analyses by analysis of variance (ANOVA) and Scheffe test to find significant features in each group. We predicted BMI status (normal, overweight, and obese) by a logistic regression algorithm and two ensemble classification algorithms (bagging and random forests) based on statistically significant features. In the Female-2030 group (females aged 20-40 years), classification experiments using an imbalanced (original) data set gave area under the receiver operating characteristic curve (AUC) values of 0.569-0.731 by logistic regression, whereas experiments using a balanced data set gave AUC values of 0.893-0.994 by random forests. AUC values in Female-4050 (females aged 41-60 years), Male-2030 (males aged 20-40 years), and Male-4050 (males aged 41-60 years) groups by logistic regression in imbalanced data were 0.585-0.654, 0.581-0.614, and 0.557-0.653, respectively. AUC values in Female-4050, Male-2030, and Male-4050 groups in balanced data were 0.629-0.893 by bagging, 0.707-0.916 by random forests, and 0.695-0.854 by bagging, respectively. In each group, we found discriminatory features showing statistical differences among normal, overweight, and obese classes. The results showed that the classification models built by logistic regression in imbalanced data were better than those built by the other two algorithms, and significant features differed according to age and gender groups. Our results could support the development of BMI diagnosis
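The AUC values reported above summarize ranking performance; a minimal sketch of the metric itself, using the Mann-Whitney formulation (AUC equals the probability that a randomly chosen positive case scores above a randomly chosen negative one), with hypothetical scores:

```python
# Area under the ROC curve via pairwise comparison of classifier scores.
# Ties count as one half. Scores below are hypothetical, not study data.

def auc(pos_scores, neg_scores):
    """AUC = P(score of random positive > score of random negative)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical predicted probabilities for "obese" vs "normal" subjects:
obese = [0.9, 0.8, 0.7, 0.6]
normal = [0.5, 0.4, 0.65, 0.2]
print(auc(obese, normal))  # -> 0.9375
```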

  13. Automation of dimethylation after guanidination labeling chemistry and its compatibility with common buffers and surfactants for mass spectrometry-based shotgun quantitative proteome analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang, E-mail: Liang.Li@ualberta.ca

    2013-07-25

    Graphical abstract: Highlights: •Dimethylation after guanidination (2MEGA) uses inexpensive reagents for isotopic labeling of peptides. •2MEGA can be optimized and automated for labeling peptides with high efficiency. •2MEGA is compatible with several commonly used cell lysis and protein solubilization reagents. •The automated 2MEGA labeling method can be used to handle a variety of protein samples for relative proteome quantification. Abstract: Isotope labeling liquid chromatography–mass spectrometry (LC–MS) is a major analytical platform for quantitative proteome analysis. Incorporation of the isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol for handling biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention, and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. It was found that better than 94% desired labeling could be obtained under all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation of the peptide amines. This work illustrates that the automated 2MEGA labeling process can be used to handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis.

  14. Balance between automation and human actions in nuclear power plant operation. Results of international cooperation; Equilibre entre automatisation et action humaine dans la conduite des centrale nucleaires, resultats de la cooperation internationale

    Energy Technology Data Exchange (ETDEWEB)

    Sun, B [CEA Centre d' Etudes Nucleaires de Fontenay-aux-Roses, 92 (France). Dept. d' Analyse de Surete; Bastl, W [Gesellschaft fuer Reaktorsicherheit m.b.H. (GRS), Garching (Germany); Olmstead, R [Atomic Energy of Canada Ltd, Mississauga (Canada); Oudiz, A [Electric Power Research Inst., Palo Alto, CA (United States); Jenkinson, J [Nuclear Electric PLC, Gloucester (United Kingdom); Kossilov, A [International Atomic Energy Agency, Vienna (Austria)

    1990-07-01

    Automation has long been an established feature of power plants. In some applications, the use of automation has been the significant factor that has enabled plant technology to progress to its current state. Societal demands for increased levels of safety have led to greater use of redundancy and diversity, and this, in turn, has increased levels of automation. However, possibly the greatest contributory factor in increased automation has been improvements in information technology. Much recent attention has been focused on the concept of inherently safe reactors, which may simplify safety system requirements and information and control system complexity. The allocation of tasks between man and machine may be one of the most critical activities in the design of new nuclear plants and major retro-fits, and it therefore warrants a design approach that is commensurate in quality with the high levels of safety and production performance sought from nuclear plants. Facing this climate, in 1989 the International Atomic Energy Agency (IAEA) formed an advisory group from member countries with extensive experience in nuclear power plant automation. The task of this group was to advise on the appropriate balance between manual and automatic actions in plant operation. (author)

  15. Tandem mass spectrometric analysis of Aspergillus niger pectin methylesterase: mode of action on fully methylesterified oligogalacturonates

    NARCIS (Netherlands)

    Kester, H.C.M.; Esteban Warren, M.; Orlando, R.; Benen, J.A.E.; Bergmann, C.; Visser, J.

    2000-01-01

    The substrate specificity and the mode of action of Aspergillus niger pectin methylesterase (PME) was determined using both fully methyl-esterified oligogalacturonates with degrees of polymerization (DP) 2–6 and chemically synthesized monomethyl trigalacturonates. The enzymic activity on the

  16. Supersymmetric Dirac-Born-Infeld action with self-dual mass term

    International Nuclear Information System (INIS)

    Nishino, Hitoshi; Rajpoot, Subhash; Reed, Kevin

    2005-01-01

    We introduce a Dirac-Born-Infeld action for a self-dual N = 1 supersymmetric vector multiplet in three dimensions. This action is based on the supersymmetric generalized self-duality in odd dimensions developed originally by Townsend, Pilch and van Nieuwenhuizen. Even though such a self-duality had been supposed to be very difficult to generalize to a supersymmetrically interacting system, we show that the Dirac-Born-Infeld action is actually compatible with supersymmetry and self-duality in three dimensions, although the original self-duality receives corrections from the Dirac-Born-Infeld action. The interactions can be further generalized to arbitrary (non)polynomial interactions. As a by-product, we also show that a third-rank field strength leads to a more natural formulation of self-duality in 3D. We also show an interesting role played by the third-rank field strength in leading to supersymmetry breaking, in addition to accommodating a Chern-Simons form.

  17. Automated analysis of non-mass-enhancing lesions in breast MRI based on morphological, kinetic, and spatio-temporal moments and joint segmentation-motion compensation technique

    Science.gov (United States)

    Hoffmann, Sebastian; Shutler, Jamie D.; Lobbes, Marc; Burgeth, Bernhard; Meyer-Bäse, Anke

    2013-12-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) represents an established method for the detection and diagnosis of breast lesions. While mass-like enhancing lesions can be easily categorized according to the Breast Imaging Reporting and Data System (BI-RADS) MRI lexicon, a majority of diagnostically challenging lesions, the so-called non-mass-like enhancing lesions, remain both qualitatively and quantitatively difficult to analyze. Thus, the evaluation of kinetic and/or morphological characteristics of non-masses represents a challenging task for an automated analysis and is of crucial importance for advancing current computer-aided diagnosis (CAD) systems. Compared to the well-characterized mass-enhancing lesions, non-masses have ill-defined, blurred tumor borders and a kinetic behavior that is not easily generalizable and thus not discriminative for malignant and benign non-masses. To overcome these difficulties and pave the way for novel CAD systems for non-masses, we will evaluate several kinetic and morphological descriptors separately, together with a novel technique, the Zernike velocity moments, to capture the joint spatio-temporal behavior of these lesions, and additionally consider the impact of non-rigid motion compensation on a correct diagnosis.

  18. ParticipACTION: A mass media campaign targeting parents of inactive children; knowledge, saliency, and trialing behaviours

    Directory of Open Access Journals (Sweden)

    Gauvin Lise

    2009-12-01

    Full Text Available Abstract. Background: In late 2007, Canada's ParticipACTION national physical activity mass media campaign was re-launched, with an initial campaign targeting parents of elementary school-aged children. The campaign informed them about the risks of physical inactivity for children and youth. The purpose of this study was to assess campaign awareness and understanding following the campaign, and to identify whether exposure to this campaign was likely associated with behaviour change. Methods: A convenience sample of 1,500 adults was recruited through an existing panel (n = 60,000) of Canadian adults to participate in online surveys. Initial campaign exposure included "prompted" and "unprompted" recall of specific physical activity messages from the 2007 ParticipACTION campaign, knowledge of the benefits of PA, saliency, and initial trial behaviours to help their children become more active. Results: One quarter of respondents showed unprompted recall of specific message content from the ParticipACTION campaign, and prompted recall was 57%. Message recall and understanding were associated with knowledge about physical activity, which in turn was related to high saliency. Saliency was associated with each of the physical activity-related trial behaviours asked about. Conclusion: Campaign awareness and understanding were high following this ParticipACTION campaign and were associated with intermediate campaign outcomes, including saliency and trial behaviours. This is relevant to campaign evaluations, as it suggests that an initial focus on influencing awareness and understanding is likely to lead to more substantial change in campaign endpoints.

  19. Enantioselective determination of methylphenidate and ritalinic acid in whole blood from forensic cases using automated solid-phase extraction and liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Thomsen, Ragnar; B. Rasmussen, Henrik; Linnet, Kristian

    2012-01-01

    A chiral liquid chromatography tandem mass spectrometry (LC–MS-MS) method was developed and validated for quantifying methylphenidate and its major metabolite ritalinic acid in blood from forensic cases. Blood samples were prepared in a fully automated system by protein precipitation followed...... methylphenidate was not determined to be related to the cause of death, the femoral blood concentration of d-methylphenidate ranged from 5 to 58 ng/g, and from undetected to 48 ng/g for l-methylphenidate (median d/l-ratio 5.9). Ritalinic acid was present at concentrations 10–20 times higher with roughly equal...

  20. The antimicrobial action of low-molar-mass chitosan, chitosan derivates and chitooligosaccharides on bifidobacteria

    Czech Academy of Sciences Publication Activity Database

    Šimůnek, Jiří; Koppová, Ingrid; Lukáš, Filip; Tishchenko, Galina; Belzecki, G.

    2010-01-01

    Roč. 55, č. 4 (2010), s. 379-382 ISSN 0015-5632 R&D Projects: GA ČR(CZ) GA525/08/0803 Institutional research plan: CEZ:AV0Z50450515; CEZ:AV0Z40500505 Keywords : chitooligosaccharides * low-molar-mass chitosan Subject RIV: EE - Microbiology, Virology Impact factor: 0.977, year: 2010

  1. Use of mass and toxicity balances in risk-based corrective action decisions at contaminated sites

    International Nuclear Information System (INIS)

    Sevigny, J.H.; Lintott, D.; Wrubleski, R.M.; Drury, C.R.

    1997-01-01

    The contaminated groundwater at a sour gas plant facility was studied to identify the chemicals of environmental concern. Simple mass balance principles were used to determine the proportion of organic carbon, organic nitrogen and Microtox(R) toxicity that can be attributed to two process chemicals that have contaminated several sour gas plants in western Canada. The two process chemicals are sulfolane and diisopropanolamine (DIPA). The organic carbon balance was calculated by determining the molar contribution of sulfolane and DIPA relative to the mass of carboxylic acid-corrected dissolved organic carbon. Organic carbon balances ranged from 44 to 96 per cent. The organic nitrogen balance was calculated by determining the molar contribution of DIPA relative to the mass of ammonium ion-corrected dissolved Kjeldahl nitrogen. The nitrogen balances were highly variable, ranging from 8 to 48 per cent, for samples with organic nitrogen concentrations between 10 and 32 mg/L. The Microtox(R) toxicity balance was calculated by determining the proportions of toxicity that could be accounted for by pure phase sulfolane and DIPA. The Microtox(R) toxicity balance for samples that showed significant toxicity ranged from 71 to 122 per cent.
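The organic carbon balance described above can be sketched as follows, comparing the carbon contributed by sulfolane (C4H8O2S) and DIPA (C6H15NO2) with the measured dissolved organic carbon (DOC); the concentrations are hypothetical and the molecular masses approximate:

```python
# Organic carbon balance: per cent of measured DOC accounted for by the
# two process chemicals. Sample concentrations are hypothetical; molar
# masses of sulfolane (~120.17 g/mol) and DIPA (~133.19 g/mol) are
# approximate.

C = 12.011  # atomic mass of carbon, g/mol

def carbon_fraction(n_carbons, molar_mass):
    """Mass fraction of carbon in a compound."""
    return n_carbons * C / molar_mass

def carbon_balance(sulfolane_mg_l, dipa_mg_l, doc_mg_l):
    """Per cent of DOC accounted for by sulfolane (4 C) and DIPA (6 C)."""
    contributed = (sulfolane_mg_l * carbon_fraction(4, 120.17)
                   + dipa_mg_l * carbon_fraction(6, 133.19))
    return 100.0 * contributed / doc_mg_l

# Hypothetical groundwater sample: 50 mg/L sulfolane, 30 mg/L DIPA,
# 40 mg/L carboxylic-acid-corrected DOC:
print(round(carbon_balance(50.0, 30.0, 40.0), 1))  # balance, %
```

A balance near 100% indicates the two process chemicals account for essentially all the organic carbon; values well below 100% point to other organic contaminants or degradation products.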

  2. Use of mass and toxicity balances in risk-based corrective action decisions at contaminated sites

    Energy Technology Data Exchange (ETDEWEB)

    Sevigny, J.H. [Komex Consultants Ltd., Calgary, AB (Canada); Lintott, D. [HydroQual Consultants, Inc., Calgary, AB (Canada); Wrubleski, R.M.; Drury, C.R. [Shell Canada Ltd., Calgary, AB (Canada). Calgary Research Centre

    1997-12-31

    The contaminated groundwater at a sour gas plant facility was studied to identify the chemicals of environmental concern. Simple mass balance principles were used to determine the proportion of organic carbon, organic nitrogen and Microtox(R) toxicity that can be attributed to two process chemicals that have contaminated several sour gas plants in western Canada. The two process chemicals are sulfolane and diisopropanolamine (DIPA). The organic carbon balance was calculated by determining the molar contribution of sulfolane and DIPA relative to the mass of carboxylic acid-corrected dissolved organic carbon. Organic carbon balances ranged from 44 to 96 per cent. The organic nitrogen balance was calculated by determining the molar contribution of DIPA relative to the mass of ammonium ion-corrected dissolved Kjeldahl nitrogen. The nitrogen balances were highly variable, ranging from 8 to 48 per cent, for samples with organic nitrogen concentrations between 10 and 32 mg/L. The Microtox(R) toxicity balance was calculated by determining the proportions of toxicity that could be accounted for by pure phase sulfolane and DIPA. The Microtox(R) toxicity balance for samples that showed significant toxicity ranged from 71 to 122 per cent.

  3. Single-core magnetic markers in rotating magnetic field based homogeneous bioassays and the law of mass action

    Energy Technology Data Exchange (ETDEWEB)

    Dieckhoff, Jan, E-mail: j.dieckhoff@tu-bs.de [Institut fuer Elektrische Messtechnik und Grundlagen der Elektrotechnik, TU Braunschweig, Braunschweig (Germany); Schrittwieser, Stefan; Schotter, Joerg [Molecular Diagnostics, AIT Austrian Institute of Technology, Vienna (Austria); Remmer, Hilke; Schilling, Meinhard; Ludwig, Frank [Institut fuer Elektrische Messtechnik und Grundlagen der Elektrotechnik, TU Braunschweig, Braunschweig (Germany)

    2015-04-15

    In this work, we report on the effect of the magnetic nanoparticle (MNP) concentration on the quantitative detection of proteins in solution with a rotating magnetic field (RMF) based homogeneous bioassay. Here, the phase lag between 30 nm iron oxide single-core particles and the RMF is analyzed with a fluxgate-based measurement system. As a test analyte, anti-human IgG is applied, which binds to the protein G functionalized MNP shell and causes a change of the phase lag. The measured phase lag changes for a fixed MNP and a varying analyte concentration are modeled with logistic functions. A change of the MNP concentration results in a nonlinear shift of the logistic function with the analyte concentration. This effect results from the law of mass action. Furthermore, the bioassay results are used to determine the association constant of the binding reaction. Highlights: • A rotating magnetic field based homogeneous bioassay concept was presented. • Here, single-core iron oxide nanoparticles are applied as markers. • The impact of the particle concentration on the bioassay results is investigated. • The relation between particle concentration and bioassay sensitivity is nonlinear. • This finding can be reasonably explained by the law of mass action.
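The nonlinear dependence on marker concentration follows from the mass-action equilibrium P + A <-> PA with association constant Ka; a minimal sketch solving that equilibrium exactly (all concentrations and the Ka value are hypothetical, not fitted assay data):

```python
# Mass-action binding equilibrium: Ka = [PA] / ([P][A]) with
# [P] = P0 - [PA] and [A] = A0 - [PA] gives a quadratic in [PA].
# Solving it shows why the occupied fraction of particle sites, and
# hence the assay's dose-response midpoint, shifts with the particle
# concentration. All values below are hypothetical.
import math

def bound_complex(p_total, a_total, ka):
    """Equilibrium [PA] from Ka*(P0-[PA])*(A0-[PA]) = [PA] (smaller root)."""
    b = ka * (p_total + a_total) + 1.0
    disc = b * b - 4.0 * ka * ka * p_total * a_total
    return (b - math.sqrt(disc)) / (2.0 * ka)

# Fraction of binding sites occupied at two marker concentrations
# (arbitrary units, Ka = 10): the same analyte dose occupies a smaller
# fraction of sites when more particles are present.
for p0 in (0.1, 1.0):
    pa = bound_complex(p0, a_total=0.5, ka=10.0)
    print(p0, round(pa / p0, 3))
```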

  4. Creatine Supplementation and Skeletal Muscle Metabolism for Building Muscle Mass - Review of the Potential Mechanisms of Action.

    Science.gov (United States)

    Farshidfar, Farnaz; Pinder, Mark A; Myrie, Semone B

    2017-01-01

    Creatine, a very popular supplement among athletic populations, is of growing interest for clinical applications. Since over 90% of creatine is stored in skeletal muscle, the effect of creatine supplementation on muscle metabolism is a widely studied area. While numerous studies over the past few decades have shown that creatine supplementation has many favorable effects on skeletal muscle physiology and metabolism, including enhancing muscle mass (growth/hypertrophy), the underlying mechanisms are poorly understood. This report reviews studies addressing the mechanisms of action of creatine supplementation on skeletal muscle growth/hypertrophy. Early research proposed that the osmotic effect of creatine supplementation serves as a cellular stressor (osmosensing) that acts as an anabolic stimulus for protein synthesis signal pathways. Other reports indicated that creatine directly affects muscle protein synthesis via modulation of components in the mammalian target of rapamycin (mTOR) pathway. Creatine may also directly affect the myogenic process (formation of muscle tissue) by altering secretion of myokines, such as myostatin and insulin-like growth factor-1, and expression of myogenic regulatory factors, resulting in enhanced satellite cell mitotic activity and differentiation into myofibers. Overall, there is still no clear understanding of the mechanisms of action regarding how creatine affects muscle mass/growth, but current evidence suggests it may exert its effects through multiple approaches, with converging impacts on protein synthesis and myogenesis. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. Testing the hierarchy of effects model: ParticipACTION's serial mass communication campaigns on physical activity in Canada.

    Science.gov (United States)

    Craig, C L; Bauman, A; Reger-Nash, B

    2010-03-01

    The hierarchy of effects (HOE) model is often used in planning mass-reach communication campaigns to promote health, but has rarely been empirically tested. This paper examines Canada's 30-year ParticipACTION campaign to promote physical activity (PA). A cohort from the nationally representative 1981 Canada Fitness Survey was followed up in 1988 and 2002-2004. Modelling of these data tested whether the mechanisms of campaign effects followed the theoretical framework proposed in the HOE. Campaign awareness was measured in 1981. Outcome expectancy, attitudes, decision balance and future intention were asked about in 1988. PA was assessed at all time points. Logistic regression was used to sequentially test mediating and moderating variables, adjusting for age, sex and education. No selection bias was observed; however, relatively fewer respondents than non-respondents smoked or were underweight at baseline. Among those inactive at baseline, campaign awareness predicted outcome expectancy, which in turn predicted positive attitude to PA. Positive attitudes predicted high decision balance, which predicted future intention. Future intention mediated the relationship between decision balance and sufficient activity. Among those sufficiently active at baseline, awareness was unrelated to outcome expectancy and inversely related to positive attitude. These results lend support to the HOE model, in that the effects of ParticipACTION's serial mass media campaigns were consistent with the sequential rollout of its messages, which in turn was associated with achieving an active lifestyle among those initially insufficiently active. This provides support for an often-used theoretical framework for designing health promotion media campaigns.

  6. Mass

    International Nuclear Information System (INIS)

    Quigg, Chris

    2007-01-01

    In the classical physics we inherited from Isaac Newton, mass does not arise, it simply is. The mass of a classical object is the sum of the masses of its parts. Albert Einstein showed that the mass of a body is a measure of its energy content, inviting us to consider the origins of mass. The protons we accelerate at Fermilab are prime examples of Einsteinian matter: nearly all of their mass arises from stored energy. Missing mass led to the discovery of the noble gases, and a new form of missing mass leads us to the notion of dark matter. Starting with a brief guided tour of the meanings of mass, the colloquium will explore the multiple origins of mass. We will see how far we have come toward understanding mass, and survey the issues that guide our research today.

  7. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part B, Remedial action, robotics/automation, waste management

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration (ER) and waste management (WM) problems at the Oak Ridge K-25 Site. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remediation, decontamination, and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This Volume 3B provides the Technology Evaluation Data Sheets (TEDS) for ER/WM activities (remedial action, robotics/automation, waste management) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than each technology in Vol. 2. The TEDS are arranged alphanumerically by the TEDS code number in the upper right corner of each data sheet. Volume 3 can be used in two ways: (1) technologies identified from Vol. 2 can be referenced directly in Vol. 3 by using the TEDS codes, and (2) technologies and general technology areas (alternatives) can be located in the index in the front of this volume.

  8. Effects of mass media action on the Axelrod model with social influence

    Science.gov (United States)

    Rodríguez, Arezky H.; Moreno, Y.

    2010-07-01

    The use of dyadic interaction between agents, in combination with homophily (the principle that “likes attract”), in the Axelrod model for the study of cultural dissemination has two important problems: the prediction of monoculture in large societies and an extremely narrow window of noise levels in which diversity with local convergence is obtained. Recently, the inclusion of social influence has proven to overcome them [A. Flache and M. W. Macy, e-print arXiv:0808.2710]. Here, we extend the Axelrod model with social influence interaction for the study of mass media effects through the inclusion of a superagent which acts over the whole system and has non-null overlap with each agent of the society. The dependence on different parameters, such as the initial social diversity, size effects, mass media strength, and noise, is outlined. Our results might be relevant in several socioeconomic contexts and for the study of the emergence of collective behavior in complex social systems.

  9. Comparison of retraction phenomenon and BI-RADS-US descriptors in differentiating benign and malignant breast masses using an automated breast volume scanner.

    Science.gov (United States)

    Zheng, Feng-Yang; Yan, Li-Xia; Huang, Bei-Jian; Xia, Han-Sheng; Wang, Xi; Lu, Qing; Li, Cui-Xian; Wang, Wen-Ping

    2015-11-01

    To compare the diagnostic values of retraction phenomenon in the coronal planes and descriptors in the Breast Imaging Reporting and Data System-Ultrasound (BI-RADS-US) lexicon in differentiating benign and malignant breast masses using an automated breast volume scanner (ABVS). Two hundred and eight female patients with 237 pathologically proven breast masses (120 benign and 117 malignant) were included in this study. ABVS was performed for each mass after preoperative localization by conventional ultrasonography (US). Multivariate logistic regression analysis was performed to assess independent variables for malignancy prediction. Diagnostic performance was evaluated through receiver operating characteristic (ROC) curve analysis. Retraction phenomenon (odds ratio [OR]: 76.70; 95% confidence interval [CI]: 12.55, 468.70; P<0.001) was the strongest independent predictor of malignant masses, followed by microlobulated margins (OR: 55.87; 95% CI: 12.56, 248.44; P<0.001), angular margins (OR: 36.44; 95% CI: 4.55, 292.06; P=0.001), calcifications (OR: 5.53; 95% CI: 1.34, 22.88; P=0.018) and patient age (OR: 1.10; 95% CI: 1.03, 1.17; P=0.004). Mass shape, orientation, echo pattern, indistinct margins, spiculated margins, and mass size were not significantly associated with breast malignancy. The area under the ROC curve (Az) for microlobulated margins and retraction phenomenon was higher than that for the other significant independent predictors. Az, sensitivity, and specificity were 0.877 (95% CI: 0.829, 0.926) and 0.838 (95% CI: 0.783, 0.892), 82.9% and 70.1%, and 92.5% and 98.3%, respectively, for microlobulated margins and retraction phenomenon. Retraction phenomenon and microlobulated margins have high diagnostic value in the differentiation of benign and malignant breast masses using an ABVS. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
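The odds ratios and confidence intervals above are the standard outputs of a multivariate logistic regression; a minimal sketch of how an OR and its 95% CI are recovered from a fitted coefficient and its standard error (the coefficient and SE below are hypothetical, not the study's fit):

```python
# Odds ratio from a logistic regression coefficient:
# OR = exp(beta); 95% CI = exp(beta +/- 1.96 * SE).
# The beta and SE values below are hypothetical.
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower 95% bound, upper 95% bound)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for a per-year age effect:
or_, lo, hi = odds_ratio_ci(beta=0.0953, se=0.031)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # -> 1.1 1.04 1.17
```

Because the CI is built on the log-odds scale and then exponentiated, it is asymmetric around the OR, which is why intervals like 12.55-468.70 around an OR of 76.70 are expected rather than suspicious.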

  10. Detection and identification of drugs and toxicants in human body fluids by liquid chromatography-tandem mass spectrometry under data-dependent acquisition control and automated database search.

    Science.gov (United States)

    Oberacher, Herbert; Schubert, Birthe; Libiseller, Kathrin; Schweissgut, Anna

    2013-04-03

Systematic toxicological analysis (STA) aims to detect and identify all substances of toxicological relevance (i.e. drugs, drugs of abuse, poisons and/or their metabolites) in biological material. Gas chromatography-mass spectrometry (GC/MS) in particular represents a competent and commonly applied screening and confirmation tool. Herein, we present an untargeted liquid chromatography-tandem mass spectrometry (LC/MS/MS) assay designed to complement existing GC/MS screening for the detection and identification of drugs in blood, plasma and urine samples. Solid-phase extraction was accomplished on mixed-mode cartridges. LC was based on gradient elution on a miniaturized C18 column. High-resolution electrospray ionization-MS/MS in positive ion mode with data-dependent acquisition control was used to generate tandem mass spectral information that enabled compound identification via automated library search in the "Wiley Registry of Tandem Mass Spectral Data, MSforID". The fitness of the developed LC/MS/MS method for application in STA in terms of selectivity, detection capability and reliability of identification (sensitivity/specificity) was demonstrated with blank samples, certified reference materials, proficiency test samples, and authentic casework samples. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. A simple automated solid-phase extraction procedure for measurement of 25-hydroxyvitamin D3 and D2 by liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Knox, Susan; Harris, John; Calton, Lisa; Wallace, A Michael

    2009-05-01

Measurement of 25-hydroxyvitamin D3 (25OHD3) and D2 (25OHD2) is challenging. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) methods have been described, but they are often complex and difficult to automate. We have developed a simplified procedure involving automated solid-phase extraction (SPE). Internal standard (hexadeuterated 25-hydroxyvitamin D3) was added to serum or plasma, followed by protein precipitation with methanol. Following centrifugation, a robotic instrument (CTC PAL [Presearch] for ITSP SPE [MicroLiter Analytical Supplies, Inc]) performed a six-step SPE procedure and the purified samples were injected into the LC-MS/MS. Quantification of 25OHD3 and 25OHD2 was by electrospray ionization MS/MS in multiple-reaction monitoring mode. The lower limit of quantitation was 4.0 nmol/L for 25OHD3 and 7.5 nmol/L for 25OHD2. Within- and between-assay precision was below 10% over the concentration range of 22.5-120 nmol/L for D3 and 17.5-70 nmol/L for D2 (n = 10). The calibration was linear up to 2500 nmol/L (r = 0.99). Recoveries ranged between 89% and 104% for both metabolites and no ion suppression was observed. The results compared well (r = 0.96) with the IDS-OCTEIA 25-hydroxyvitamin D enzyme immunoassay for samples containing less than 125 nmol/L; at higher concentrations the immunodiagnostic system (IDS) method showed positive bias. Our simplified sample preparation and automated SPE method is suitable for the measurement of 25OHD3 and D2 in a routine laboratory environment. The system can process up to 300 samples per day with no cumbersome solvent evaporation step and minimal operator intervention.
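The quantitation step described here is internal-standard (isotope-dilution) calibration: the analyte/internal-standard peak-area ratio is regressed against calibrator concentrations, and unknowns are read off the fit. A minimal sketch of that arithmetic, with invented calibrator values (not the paper's data):

```python
# Hypothetical internal-standard calibration: area ratio vs concentration.
# Calibrator levels and peak areas below are invented for illustration.

def fit_calibration(concs, area_ratios):
    """Least-squares line (slope, intercept) through calibration points."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(area_ratios) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(concs, area_ratios)) / \
            sum((x - mx) ** 2 for x in concs)
    return slope, my - slope * mx

def quantify(area_analyte, area_is, slope, intercept):
    """Convert an unknown's analyte/IS area ratio into a concentration."""
    return (area_analyte / area_is - intercept) / slope

# Invented calibrators at 25, 50 and 100 nmol/L with perfectly linear ratios
slope, intercept = fit_calibration([25, 50, 100], [0.5, 1.0, 2.0])
conc = quantify(area_analyte=1.5e5, area_is=1.0e5, slope=slope, intercept=intercept)
print(round(conc, 1))  # 75.0 (nmol/L) for an area ratio of 1.5
```

Because the deuterated internal standard co-elutes with the analyte, the area ratio largely cancels out extraction losses and ion suppression, which is why recoveries stay near 100%.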

  12. Differential fragmentation patterns of pectin oligogalacturonides observed by nanoelectrospray quadrupole ion-trap mass spectrometry using automated spectra interpretation

    DEFF Research Database (Denmark)

    Mutenda, Kudzai E; Matthiesen, Rune; Roepstorff, Peter

    2007-01-01

Oligogalacturonides of different degrees of polymerization (DP) and methyl esterification (DE) were structurally analyzed by nanoESI quadrupole ion-trap mass spectrometry. The fragmentation patterns of the oligogalacturonides were compared using the program 'Virtual Expert Mass Spectrometrist...... with free carboxylic acid groups underwent higher water loss compared to fully methyl-esterified oligogalacturonides under the same fragmentation conditions. Cross-ring cleavage, in which fragmentation occurs across the ring system of the galacturonate residue and signified by unique mass losses...... water loss than methyl-esterified ones will be postulated. In addition, the VEMS program was extended to automatically interpret and assign the fragment ion peaks generated in this study....

  13. Effects of prey type on specific dynamic action, growth, and mass conversion efficiencies in the horned frog, Ceratophrys cranwelli.

    Science.gov (United States)

    Grayson, Kristine L; Cook, Leslie W; Todd, M Jason; Pierce, D; Hopkins, William A; Gatten, Robert E; Dorcas, Michael E

    2005-07-01

    To be most energetically profitable, predators should ingest prey with the maximal nutritional benefit while minimizing the cost of processing. Therefore, when determining the quality of prey items, both the cost of processing and nutritional content must be considered. Specific dynamic action (SDA), the increase in metabolic rate associated with feeding in animals, is a significant processing cost that represents the total cost of digestion and assimilation of nutrients from prey. We examined the effects of an invertebrate diet (earthworms) and a vertebrate diet (newborn mice) on mass conversion efficiencies, growth, and SDA in the Chacoan horned frog, Ceratophrys cranwelli. We found the earthworm diet to be significantly lower in lipid, protein, and energy content when compared to the diet of newborn mice. Growth and mass conversion efficiencies were significantly higher in frogs fed newborn mice. However, mean SDA did not differ between frogs fed the two diets, a finding that contradicts many studies that indicate SDA increases with the protein content of the meal. Together, our results indicate that future studies evaluating the effect of meal type on bioenergetics of herpetofauna are warranted and may provide significant insight into the underlying factors driving SDA.

  14. Determination of Low Concentrations of Acetochlor in Water by Automated Solid-Phase Extraction and Gas Chromatography with Mass-Selective Detection

    Science.gov (United States)

    Lindley, C.E.; Stewart, J.T.; Sandstrom, M.W.

    1996-01-01

A sensitive and reliable gas chromatographic/mass spectrometric (GC/MS) method for determining acetochlor in environmental water samples was developed. The method involves automated extraction of the herbicide from a filtered 1 L water sample through a C18 solid-phase extraction column, elution from the column with hexane-isopropyl alcohol (3 + 1), and concentration of the extract with nitrogen gas. The herbicide is quantitated by capillary-column GC/MS with selected-ion monitoring of 3 characteristic ions. The single-operator method detection limit for reagent water samples is 0.0015 µg/L. Mean recoveries ranged from about 92 to 115% for 3 water matrixes fortified at 0.05 and 0.5 µg/L. Average single-operator precision, over the course of 1 week, was better than 5%.

  15. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  16. Automated Ambient Desorption-Ionization Platform for Surface Imaging Integrated with a Commercial Fourier Transform Ion Cyclotron Resonance Mass Spectrometer

    Czech Academy of Sciences Publication Activity Database

    Pól, Jaroslav; Vidová, Veronika; Kruppa, G.; Kobliha, Václav; Novák, Petr; Lemr, Karel; Kotiaho, T.; Kostiainen, R.; Havlíček, Vladimír; Volný, Michael

    2009-01-01

    Roč. 81, č. 20 (2009), s. 8479-8487 ISSN 0003-2700 R&D Projects: GA MŠk LC07017 Institutional research plan: CEZ:AV0Z50200510 Keywords : ATMOSPHERIC-PRESSURE PHOTOIONIZATION * COMPREHENSIVE CLASSIFICATION-SYSTEM * mass spectrometry Subject RIV: EE - Microbiology, Virology Impact factor: 5.214, year: 2009

  17. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Daniel J. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Tfaily, Malak M. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Moore, Ronald J. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; LaMarche, Brian L. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Zheng, Xueyun [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Fillmore, Thomas L. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Chu, Rosalie K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Weitz, Karl K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Monroe, Matthew E. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Kelly, Ryan T. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Smith, Richard D. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Baker, Erin S. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States

    2017-12-13

    To better understand disease conditions and environmental perturbations, multi-omic studies (i.e. proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover as long as washing buffers specific to each molecular analysis were utilized.
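The throughput range quoted above (24 to 1200 samples per day) follows directly from the per-sample cycle time of the flow-injection loop. A back-of-envelope sketch; the cycle times below are inferred for illustration, not taken from the paper:

```python
# Throughput in samples/day is just seconds-per-day over seconds-per-sample.

def samples_per_day(cycle_seconds):
    """Analytical throughput implied by a fixed per-sample cycle time."""
    return 86400 / cycle_seconds

print(samples_per_day(3600))  # 24.0  -> a one-hour cycle gives the slow end quoted
print(samples_per_day(72))    # 1200.0 -> a 72-second cycle gives the fast end
```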

  18. Killing of targets by effector CD8 T cells in the mouse spleen follows the law of mass action

    Energy Technology Data Exchange (ETDEWEB)

    Ganusov, Vitaly V [Los Alamos National Laboratory

    2009-01-01

In contrast with antibody-based vaccines, it has been difficult to measure the efficacy of T cell-based vaccines and to correlate the efficacy of CD8 T cell responses with protection against viral infections. In part, this difficulty is due to poor understanding of the in vivo efficacy of CD8 T cells produced by vaccination. Using a recently developed experimental method of in vivo cytotoxicity, we have investigated quantitative aspects of killing of peptide-pulsed targets by effector and memory CD8 T cells, specific to three epitopes of lymphocytic choriomeningitis virus (LCMV), in the mouse spleen. By analyzing data on killing of targets with varying numbers of epitope-specific effector and memory CD8 T cells, we find that killing of targets by effectors follows the law of mass action; that is, the death rate of peptide-pulsed targets is proportional to the frequency of CTLs in the spleen. In contrast, killing of targets by memory CD8 T cells does not follow the mass action law, because the death rate of targets saturates at high frequencies of memory CD8 T cells. For both effector and memory cells, we also find little support for a killing term in which the death rate of targets decreases with target cell density. Interestingly, our analysis suggests that at low CD8 T cell frequencies, memory CD8 T cells are more efficient on a per capita basis at killing peptide-pulsed targets than effectors, but at high frequencies, effectors are more efficient killers than memory T cells. Comparison of the estimated killing efficacy of effector T cells with the value predicted from theoretical physics, based on the motility of T cells in lymphoid tissues, suggests that the limiting step in the killing of peptide-pulsed targets is delivering the lethal hit and not finding the target. Our results thus form a basis for quantitative understanding of the process of killing of virus-infected cells by T cell responses in tissues and can be used to correlate the
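The qualitative contrast between the two killing terms in this abstract can be sketched numerically: under mass action the target death rate grows without bound in the CTL frequency, while a saturating term levels off. This is a schematic illustration only; the rate constant `k` and saturation constant `h` are invented, not the paper's fitted values.

```python
import math

# Fraction of peptide-pulsed targets surviving to time t under two killing
# terms: mass action (rate ~ k*E) vs saturating (rate ~ k*E / (1 + E/h)).
# Parameter values are illustrative, not fitted.

def surviving_mass_action(E, k=0.5, t=4.0):
    """Mass action: target death rate proportional to CTL frequency E."""
    return math.exp(-k * E * t)

def surviving_saturating(E, k=0.5, h=1.0, t=4.0):
    """Saturating killing: death rate levels off at high CTL frequency."""
    return math.exp(-k * E * t / (1.0 + E / h))

for E in (0.1, 1.0, 10.0):  # CTL frequency, arbitrary units
    print(E, round(surviving_mass_action(E), 3), round(surviving_saturating(E), 3))
# At E = 10 nearly all targets die under mass action, but roughly 16% still
# survive under the saturating term - the signature the paper attributes to
# memory CD8 T cells.
```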

  19. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  20. Automated Online Solid-Phase Derivatization for Sensitive Quantification of Endogenous S-Nitrosoglutathione and Rapid Capture of Other Low-Molecular-Mass S-Nitrosothiols.

    Science.gov (United States)

    Wang, Xin; Garcia, Carlos T; Gong, Guanyu; Wishnok, John S; Tannenbaum, Steven R

    2018-02-06

S-Nitrosothiols (RSNOs) constitute a circulating endogenous reservoir of nitric oxide and have important biological activities. In this study, an online coupling of solid-phase derivatization (SPD) with liquid chromatography-mass spectrometry (LC-MS) was developed and applied to the analysis of low-molecular-mass RSNOs. A derivatizing-reagent-modified polymer monolithic column was prepared and adapted for online SPD-LC-MS. Analytes from the LC autosampler flowed through the monolithic column for derivatization and then directly into the LC-MS for analysis. This integration of online derivatization, LC separation, and MS detection facilitated system automation, allowing rapid, laborsaving, and sensitive detection of RSNOs. S-Nitrosoglutathione (GSNO) was quantified using this automated online method with good linearity (R² = 0.9994); the limit of detection was 0.015 nM. The online SPD-LC-MS method was used to determine GSNO levels in mouse samples; 138 ± 13.2 nM of endogenous GSNO was detected in mouse plasma. In addition, the GSNO concentrations in liver (64.8 ± 11.3 pmol/mg protein), kidney (47.2 ± 6.1 pmol/mg protein), heart (8.9 ± 1.8 pmol/mg protein), muscle (1.9 ± 0.3 pmol/mg protein), hippocampus (5.3 ± 0.9 pmol/mg protein), striatum (6.7 ± 0.6 pmol/mg protein), cerebellum (31.4 ± 6.5 pmol/mg protein), and cortex (47.9 ± 4.6 pmol/mg protein) were successfully quantified. Because the derivatization was complete within 8 min and was followed directly by LC-MS detection, samples could be analyzed rapidly compared with the offline manual method. Other low-molecular-mass RSNOs, such as S-nitrosocysteine and S-nitrosocysteinylglycine, were captured by rapid precursor-ion scanning, showing that the proposed method is a potentially powerful tool for the capture, identification, and quantification of RSNOs in biological samples.

  1. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    Science.gov (United States)

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The selectivity and sensitivity required for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) were achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.

  2. Automated correlation and classification of secondary ion mass spectrometry images using a k-means cluster method.

    Science.gov (United States)

    Konicek, Andrew R; Lefman, Jonathan; Szakal, Christopher

    2012-08-07

    We present a novel method for correlating and classifying ion-specific time-of-flight secondary ion mass spectrometry (ToF-SIMS) images within a multispectral dataset by grouping images with similar pixel intensity distributions. Binary centroid images are created by employing a k-means-based custom algorithm. Centroid images are compared to grayscale SIMS images using a newly developed correlation method that assigns the SIMS images to classes that have similar spatial (rather than spectral) patterns. Image features of both large and small spatial extent are identified without the need for image pre-processing, such as normalization or fixed-range mass-binning. A subsequent classification step tracks the class assignment of SIMS images over multiple iterations of increasing n classes per iteration, providing information about groups of images that have similar chemistry. Details are discussed while presenting data acquired with ToF-SIMS on a model sample of laser-printed inks. This approach can lead to the identification of distinct ion-specific chemistries for mass spectral imaging by ToF-SIMS, as well as matrix-assisted laser desorption ionization (MALDI), and desorption electrospray ionization (DESI).
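The core idea of the abstract above, grouping ion images whose pixel-intensity distributions look alike via k-means, can be sketched compactly. This is an illustrative reimplementation of the general technique, not the authors' algorithm: each image is reduced to a normalized intensity histogram and a small k-means groups the histograms. The images, bin count, and k are invented for the demo.

```python
import numpy as np

# Synthetic stand-ins for ion-specific SIMS images: three "bright-feature"
# images and three "dark" images (values and shapes are invented).
rng = np.random.default_rng(0)
bright = rng.normal(200, 10, (3, 32, 32))
dark = rng.normal(50, 10, (3, 32, 32))
images = np.concatenate([bright, dark])

# Feature vector per image: normalized 16-bin pixel-intensity histogram.
feats = np.stack([np.histogram(im, bins=16, range=(0, 255))[0] / im.size
                  for im in images])

def kmeans(X, k, iters=20):
    """Tiny k-means with deterministic init (centers spread across the data)."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

labels = kmeans(feats, k=2)
print(labels)  # the first three images share one class, the last three the other
```

The published method goes further (binary centroid images, a spatial correlation step, and tracking class assignments as k grows), but the histogram-clustering step above is the entry point.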

  3. Deviation from the kinetic law of mass action for reactions induced by binary encounters in liquid solutions

    International Nuclear Information System (INIS)

    Doktorov, Alexander B; Kipriyanov, Alexey A

    2007-01-01

In considering the irreversible chemical reaction A + B → C + B in liquid solutions, two many-particle approaches to the derivation of binary non-Markovian kinetic equations are compared: simple superposition decoupling and a method of extracting 'pair' channels from three-particle correlation evolution. It is shown that both methods provide an almost identical description of this reaction. However, in studies of reversible reactions in liquid solutions only the channel extraction method gives a correct, physically clear description of the reaction, though it consists of a sequence of steps: the development of integral encounter theory (IET), the effective pairs approximation (EPA), modified encounter theory (MET), and the final regular form (RF) of kinetic equations. It is shown that the rate equations often encountered in the literature correspond to the independence of transient channels of 'scattering' in the bimolecular reversible reaction (A + B ⇌ C + B), while the independent transient channel of 'decay' in the reversible reaction A + B ⇌ C is defined solely by a time integral convolution. In the general case transient channels in non-Markovian theory are not independent, and their interference manifests itself as a non-Markovian inhomogeneous source in binary non-Markovian kinetic equations in regular form. Based on the derived equations, new universal (model-independent) kinetics of chemical equilibrium attainment have been obtained. It is shown that these kinetics can differ essentially from the kinetics corresponding to the kinetic law of mass action of formal chemical kinetics.
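As a schematic contrast (textbook notation, not the authors' derivation), the kinetic law of mass action assigns the reaction A + B → C + B a memoryless rate with a constant k, whereas a non-Markovian kinetic equation of the kind discussed above replaces the rate constant by a memory kernel acting through a time convolution over the pair's history:

```latex
% Classical (Markovian) mass-action rate law:
\frac{d[A]}{dt} = -k\,[A](t)\,[B](t)
% Non-Markovian counterpart with a memory kernel K (schematic):
\frac{d[A]}{dt} = -\int_0^{t} K(t-\tau)\,[A](\tau)\,[B](\tau)\,d\tau
```

The convolution form reduces to the mass-action law when K(t − τ) collapses to k·δ(t − τ), which is the limit in which the abstract's "deviation from the kinetic law of mass action" vanishes.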

  4. A Novel High-Molecular-Mass Bacteriocin Produced by Enterococcus faecium: Biochemical Features and Mode of Action.

    Science.gov (United States)

    Vasilchenko, A S; Vasilchenko, A V; Valyshev, A V; Rogozhin, E A

    2018-02-08

Discovery of a novel bacteriocin is always an event in science, since cultivation of most bacterial species remains a general problem in microbiology; this is reflected in the fact that the number of known bacteriocins is roughly tenfold smaller than the number of known antimicrobial peptides. We cultivated Enterococcus faecium on a simplified medium to reduce the number of purification steps. This approach allowed us to purify a novel high-molecular-mass bacteriocin produced by E. faecium ICIS 7. The novelty of this bacteriocin, named enterocin-7, was confirmed by N-terminal sequencing and by comparing its structural-functional properties with available data. Purified enterocin-7 has an N-terminal sequence of amino acid residues with no homology in the UniProt/SwissProt/TrEMBL databases: NH2 - Asp - Ala - His - Leu - Ser - Glu - Val - Ala - Glu - Arg - Phe - Glu - Asp - Leu - Gly. The isolated thermostable protein has a molecular mass of 65 kDa, which places it in class III of bacteriocin classification schemes. Enterocin-7 displayed a broad spectrum of activity against some Gram-positive and Gram-negative microorganisms. Fluorescent microscopy and spectroscopy showed that enterocin-7 acts by membrane permeabilization, which takes effect within a few minutes.

  5. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    Science.gov (United States)

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
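The data-reduction idea behind OCTpy, keeping only analytes whose peak areas change substantially between comparative samples, can be sketched as follows. This is a hypothetical reimplementation of the concept, not OCTpy's code: the table rows, the 5x fold-change threshold, and the `floor` guard are all invented for illustration.

```python
# Hypothetical peak-area comparison filter for comparative samples
# (e.g. soil extracts pre- vs post-bioremediation).

def select_changed_analytes(peaks, fold=5.0, floor=1.0):
    """peaks: list of (name, area_pre, area_post).
    Return analytes that formed (area rose >= fold) or degraded
    (area fell >= fold) between the two samples."""
    hits = []
    for name, pre, post in peaks:
        ratio = (post + floor) / (pre + floor)  # floor avoids divide-by-zero
        if ratio >= fold or ratio <= 1.0 / fold:
            hits.append(name)
    return hits

peaks = [("pyrene",        9.0e5, 1.2e5),   # degraded ~8x
         ("anthraquinone", 2.0e4, 4.1e5),   # formed ~20x
         ("phenanthrene",  5.0e5, 4.0e5)]   # essentially unchanged
print(select_changed_analytes(peaks))  # ['pyrene', 'anthraquinone']
```

A filter like this is how a candidate list shrinks to a reviewable fraction of the software's initial suggestions; the real tool additionally parses the ChromaTOF® Statistical Compare export format.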

  6. Automated determination of aliphatic primary amines in wastewater by simultaneous derivatization and headspace solid-phase microextraction followed by gas chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Llop, Anna; Pocurull, Eva; Borrull, Francesc

    2010-01-22

This paper presents a fully automated method for determining ten primary amines in wastewater at ng/L levels. The method is based on simultaneous derivatization with pentafluorobenzaldehyde (PFBAY) and headspace solid-phase microextraction (HS-SPME) followed by gas chromatography coupled to ion trap tandem mass spectrometry (GC-IT-MS-MS). The influence of the main factors on the efficiency of derivatization and of HS-SPME is described in detail and optimized by a central composite design. For all species, the highest enrichment factors were achieved using an 85 µm polyacrylate (PA) fiber exposed in the headspace of stirred water samples (750 rpm) at pH 12, containing 360 g/L of NaCl, at 40 °C for 15 min. Under optimized conditions, the proposed method achieved detection limits ranging from 10 to 100 ng/L (except for cyclohexylamine). The optimized method was then used to determine the presence of primary amines in various types of wastewater samples, such as influent and effluent wastewater from municipal and industrial wastewater treatment plants (WWTPs) and a potable water treatment plant. Although the analysis of these samples revealed the presence of up to 1500 µg/L of certain primary amines in influent industrial wastewater, the concentration of these compounds in the effluent and in municipal and potable water was substantially lower, at low µg/L levels. The new derivatization-HS-SPME-GC-IT-MS-MS method is suitable for the fast, reliable and inexpensive determination of primary amines in wastewater in an automated procedure. Copyright 2009 Elsevier B.V. All rights reserved.

  7. Proteomic analysis of Bacillus thuringiensis at different growth phases by using an automated online two-dimensional liquid chromatography-tandem mass spectrometry strategy.

    Science.gov (United States)

    Huang, Shaoya; Ding, Xuezhi; Sun, Yunjun; Yang, Qi; Xiao, Xiuqing; Cao, Zhenping; Xia, Liqiu

    2012-08-01

The proteome of a new Bacillus thuringiensis subsp. kurstaki strain, 4.0718, from the middle vegetative (T1), early sporulation (T2), and late sporulation (T3) phases was analyzed using an integrated liquid chromatography (LC)-based protein identification system. The system comprised two-dimensional (2D) LC coupled with nanoscale electrospray ionization (ESI) tandem mass spectrometry (MS/MS) on a high-resolution hybrid mass spectrometer with an automated data analysis system. After deletion of redundant proteins from the different batches and B. thuringiensis subspecies, 918, 703, and 778 proteins were identified in the respective three phases. Their molecular masses ranged from 4.6 kDa to 477.4 kDa, and their isoelectric points ranged from 4.01 to 11.84. Function clustering revealed that most of the proteins in the three phases were functional metabolic proteins, followed by proteins participating in cell processes. Small-molecule and macromolecular metabolic proteins were further classified according to the Kyoto Encyclopedia of Genes and Genomes and the BioCyc metabolic pathway database. Three protoxins (Cry2Aa, Cry1Aa, and Cry1Ac) as well as a series of potential intracellular active factors were detected. Many significant proteins related to spore and crystal formation, including sporulation proteins, help proteins, chaperones, and so on, were identified. The expression patterns of two identified proteins, CotJc and glutamine synthetase, were validated by Western blot analysis, which further confirmed the MS results. This study is the first to use shotgun technology to research the proteome of B. thuringiensis. Valuable experimental data are provided regarding the methodology of analyzing the B. thuringiensis proteome (which can be used to produce insecticidal crystal proteins) and have been added to the related protein database.

  8. Comparison of conventional and automated breast volume ultrasound in the description and characterization of solid breast masses based on BI-RADS features.

    Science.gov (United States)

    Kim, Hyunji; Cha, Joo Hee; Oh, Ha-Yeun; Kim, Hak Hee; Shin, Hee Jung; Chae, Eun Young

    2014-07-01

To compare the performance of radiologists in the use of conventional ultrasound (US) and automated breast volume ultrasound (ABVU) for the characterization of benign and malignant solid breast masses based on Breast Imaging Reporting and Data System (BI-RADS) criteria. Conventional US and ABVU images were obtained in 87 patients with 106 solid breast masses (52 cancers, 54 benign lesions). Three experienced radiologists who were blinded to all examination results independently characterized the lesions and reported a BI-RADS assessment category and a level of suspicion of malignancy. The results were analyzed by calculation of Cohen's κ coefficient and by receiver operating characteristic (ROC) analysis. Assessment of the agreement of conventional US and ABVU indicated that the posterior echo feature was the most discordant of the seven features (κ = 0.371 ± 0.225) and that orientation had the greatest agreement (κ = 0.608 ± 0.210). The final assessment showed substantial agreement (κ = 0.773 ± 0.104). The areas under the ROC curves (Az) for conventional US and ABVU did not differ significantly for any individual reader, but the mean Az values of conventional US and ABVU by multi-reader multi-case analysis were significantly different (conventional US 0.991, ABVU 0.963; 95% CI -0.0471 to -0.0097). The means for sensitivity, specificity, positive predictive value, and negative predictive value of conventional US and ABVU did not differ significantly. There was substantial inter-observer agreement in the final assessment of solid breast masses by conventional US and ABVU. ROC analysis comparing the performance of conventional US and ABVU indicated a marginally significant difference in mean Az, but not in mean sensitivity, specificity, positive predictive value, or negative predictive value.

  9. Interest Convergence or Divergence? A Critical Race Analysis of Asian Americans, Meritocracy, and Critical Mass in the Affirmative Action Debate

    Science.gov (United States)

    Park, Julie J.; Liu, Amy

    2014-01-01

    We use the Critical Race Theory frameworks of interest convergence and divergence to critique the anti-affirmative action movement's co-option of Asian Americans. Past discussions of affirmative action and Asian Americans mainly concentrate on how Asian Americans are affected by affirmative action, whether positively or negatively. We demonstrate…

  10. Negative chemical ionization gas chromatography coupled to hybrid quadrupole time-of-flight mass spectrometry and automated accurate mass data processing for determination of pesticides in fruit and vegetables.

    Science.gov (United States)

    Besil, Natalia; Uclés, Samanta; Mezcúa, Milagros; Heinzen, Horacio; Fernández-Alba, Amadeo R

    2015-08-01

    Gas chromatography coupled to high resolution hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS), operating in negative chemical ionization (NCI) mode and combining full scan with MSMS experiments using accurate mass analysis, has been explored for the automated determination of pesticide residues in fruit and vegetables. Seventy compounds were included in this approach, 50 % of which are not approved by EU legislation. Overall, 76 % of the analytes could be identified at 1 μg kg(-1). Recovery studies were performed at three concentration levels (1, 5, and 10 μg kg(-1)). Seventy-seven percent of the detected pesticides at the lowest level yielded recoveries within the 70 %-120 % range, whereas 94 % could be quantified at 5 μg kg(-1) and 100 % at 10 μg kg(-1). Good repeatability, expressed as relative standard deviation (RSD), was obtained. A home-made database was developed and applied to automatic accurate mass data processing. Measured mass accuracies of the generated ions were mainly less than 5 ppm for at least one diagnostic ion. When only one ion was obtained in the single-stage NCI-MS, a representative product ion from MSMS experiments was used as the identification criterion. A total of 30 real samples were analyzed; 67 % of the samples were positive for 12 different pesticides in the range 1.0-1321.3 μg kg(-1).
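
The 5 ppm accurate-mass criterion described above is a simple relative-error test on each diagnostic ion. A minimal sketch (the m/z values in the usage note are hypothetical; only the 5 ppm tolerance comes from the abstract):

```python
def ppm_error(measured_mz, theoretical_mz):
    # signed relative mass error in parts per million
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

def ion_matches(measured_mz, theoretical_mz, tol_ppm=5.0):
    # an ion is accepted as a diagnostic match when |error| <= tolerance
    return abs(ppm_error(measured_mz, theoretical_mz)) <= tol_ppm
```

For instance, a measured m/z of 326.0160 against a theoretical 326.0152 gives about 2.5 ppm and is accepted, whereas a 30 ppm deviation is rejected.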

  11. Using the theory of reasoned action to determine physicians' intention to measure body mass index in children and adolescents.

    Science.gov (United States)

    Khanna, Rahul; Kavookjian, Jan; Scott, Virginia Ginger; Kamal, Khalid M; Miller, Lesley-Ann N; Neal, William A

    2009-06-01

    Over the past few decades, childhood obesity has become a major public health issue in the United States. Numerous public and professional organizations recommend that physicians periodically screen for obesity in children and adolescents using the body mass index (BMI). However, studies have shown that physicians infrequently measure BMI in children and adolescents. The purpose of this study was to use the theory of reasoned action (TRA) to explain physicians' intentions to measure BMI in children and adolescents. The study objectives were to (1) determine if attitude and subjective norm predict physicians' intention to measure BMI in children and adolescents; (2) determine if family physicians and pediatricians differ in terms of theoretical factors; and (3) assess differences in behavioral beliefs, outcome evaluations, normative beliefs, and motivation to comply among physicians based on their level of intention to measure BMI. A cross-sectional mailed survey of 2590 physicians (family physicians and pediatricians) practicing in 4 states was conducted. A self-administered questionnaire was designed that included items related to the TRA constructs. The association between the theoretical constructs was examined using correlation and regression analyses. Student's t test was used to determine differences between family physicians and pediatricians on theoretical constructs and to compare the underlying beliefs of nonintenders with intenders. The usable response rate was 22.8%. Less than half (44%) of the physicians strongly intended to measure BMI in children and adolescents. Together, the TRA constructs attitude and subjective norm explained up to 49.9% of the variance in intention. Pediatricians had a significantly (P<.01) higher intention to measure BMI as compared to family physicians. There were significant (P<.01) behavioral and normative belief differences between physicians who intend and those who do not intend to measure BMI. 
The TRA is a useful model for explaining physicians' intention to measure BMI in children and adolescents.
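
The variance-explained figure reported above (up to 49.9%) comes from regressing intention on attitude and subjective norm and reporting R². A pure-Python ordinary-least-squares sketch with invented scores (real TRA analyses use standard statistical software):

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting (small systems only)
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def r_squared(attitude, norm, intention):
    # OLS fit of intention on attitude and subjective norm, with intercept
    X = [[1.0, a, s] for a, s in zip(attitude, norm)]
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
    Xty = [sum(row[i] * y for row, y in zip(X, intention)) for i in range(3)]
    b0, b1, b2 = solve(XtX, Xty)
    pred = [b0 + b1 * a + b2 * s for a, s in zip(attitude, norm)]
    ybar = sum(intention) / len(intention)
    ss_res = sum((y - p) ** 2 for y, p in zip(intention, pred))
    ss_tot = sum((y - ybar) ** 2 for y in intention)
    return 1 - ss_res / ss_tot  # share of variance explained
```

With perfectly linear data, R² approaches 1; the study's 0.499 means roughly half the variance in intention is accounted for by the two TRA constructs.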

  12. Simulating groundwater flow and runoff for the Oro Moraine aquifer system. Part II. Automated calibration and mass balance calculations

    Science.gov (United States)

    Beckers, J.; Frind, E. O.

    2001-03-01

    A steady-state groundwater model of the Oro Moraine aquifer system in Central Ontario, Canada, is developed. The model is used to identify the role of baseflow in the water balance of the Minesing Swamp, a 70 km² wetland of international significance. Lithologic descriptions are used to develop a hydrostratigraphic conceptual model of the aquifer system. The numerical model uses long-term averages to represent temporal variations of the flow regime and includes a mechanism to redistribute recharge in response to near-surface geologic heterogeneity. The model is calibrated to water level and streamflow measurements through inverse modeling. Observed baseflow and runoff quantities validate the water mass balance of the numerical model and provide information on the fraction of the water surplus that contributes to groundwater flow. The inverse algorithm is used to compare alternative model zonation scenarios, illustrating the power of non-linear regression in calibrating complex aquifer systems. The adjoint method is used to identify sensitive recharge areas for groundwater discharge to the Minesing Swamp. Model results suggest that nearby urban development will have a significant impact on baseflow to the swamp. Although the direct baseflow contribution makes up only a small fraction of the total inflow to the swamp, it provides an important steady influx of water over relatively large portions of the wetland. Urban development will also impact baseflow to the headwaters of local streams. The model provides valuable insight into crucial characteristics of the aquifer system although definite conclusions regarding details of its water budget are difficult to draw given current data limitations. The model therefore also serves to guide future data collection and studies of sub-areas within the basin.
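
Inverse calibration of the kind described above amounts to adjusting model parameters until simulated heads match observations. A toy sketch of the objective function (the "simulator" and candidate grid here are stand-ins, not the actual Oro Moraine model, which uses non-linear regression rather than grid search):

```python
def calibrate(candidates, simulate, observed_heads):
    # choose the parameter value minimizing the sum of squared residuals
    # between simulated and observed water levels
    def sse(p):
        return sum((s - o) ** 2 for s, o in zip(simulate(p), observed_heads))
    return min(candidates, key=sse)

# toy "model": heads respond linearly to a recharge multiplier (hypothetical)
observed = [2.0, 4.0, 6.0]
best = calibrate([0.5, 1.0, 2.0, 4.0],
                 lambda p: [p * 1.0, p * 2.0, p * 3.0],
                 observed)
```

Real codes replace the grid search with gradient-based non-linear regression (e.g. Gauss-Newton), but the residual-minimization objective is the same.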

  13. Epsilon-Q: An Automated Analyzer Interface for Mass Spectral Library Search and Label-Free Protein Quantification.

    Science.gov (United States)

    Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Paik, Young-Ki

    2017-12-01

    Mass spectrometry (MS) is a widely used proteome analysis tool for biomedical science. In an MS-based bottom-up proteomic approach to protein identification, sequence database (DB) searching has been routinely used because of its simplicity and convenience. However, searching a sequence DB with multiple variable modification options can increase processing time and false-positive errors in large, complicated MS data sets. Spectral library searching is an alternative solution, avoiding the limitations of sequence DB searching and allowing the detection of more peptides with high sensitivity. Unfortunately, this technique has lower proteome coverage, limiting the detection of novel and complete peptide sequences in biological samples. To solve these problems, we previously developed the "Combo-Spec Search" method, which manually uses multiple reference and simulated spectral libraries to analyze whole proteomes in a biological sample. In this study, we have developed a new analytical interface tool called "Epsilon-Q" to enhance the functions of both the Combo-Spec Search method and label-free protein quantification. Epsilon-Q automatically performs multiple spectral library searches, class-specific false-discovery-rate control, and result integration. It has a user-friendly graphical interface and demonstrates good performance in identifying and quantifying proteins by supporting standard MS data formats and spectrum-to-spectrum matching powered by SpectraST. Furthermore, when the Epsilon-Q interface is combined with the Combo-Spec Search method (together, the Epsilon-Q system), it shows a synergistic effect, outperforming other sequence DB search engines in identifying and quantifying low-abundance proteins in biological samples. The Epsilon-Q system can be a versatile tool for comparative proteome analysis based on multiple spectral libraries and label-free quantification.
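
Spectral library searching of the kind Epsilon-Q automates typically scores a query spectrum against each library spectrum with a normalized dot product over binned peaks. A minimal sketch of that core idea (SpectraST's actual scoring is considerably more elaborate):

```python
import math

def bin_spectrum(peaks, bin_width=1.0):
    # peaks: list of (m/z, intensity); sum intensities falling in the same bin
    binned = {}
    for mz, intensity in peaks:
        key = int(mz / bin_width)
        binned[key] = binned.get(key, 0.0) + intensity
    return binned

def dot_score(query, library):
    # cosine similarity between two binned spectra (1.0 = identical shape)
    num = sum(v * library.get(k, 0.0) for k, v in query.items())
    nq = math.sqrt(sum(v * v for v in query.values()))
    nl = math.sqrt(sum(v * v for v in library.values()))
    return num / (nq * nl) if nq and nl else 0.0
```

A query is assigned to the library entry with the highest score; class-specific FDR control then thresholds those scores separately for each library class.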

  14. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob analysis. The proposed system detects any human head appearing in the side mirrors; the detected head is then tracked and recorded for further action.
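
Of the image-processing steps listed, background subtraction is the simplest to sketch: flag pixels whose difference from a reference background frame exceeds a threshold. A toy version on grayscale pixel grids (the threshold and frames are illustrative; a real system works on camera streams):

```python
def background_subtract(frame, background, threshold=25):
    # mark pixels whose absolute difference from the background
    # exceeds the threshold (1 = foreground, 0 = background)
    return [[1 if abs(p - q) > threshold else 0
             for p, q in zip(frow, brow)]
            for frow, brow in zip(frame, background)]
```

The resulting foreground mask is what later stages (shape detection, blob analysis) operate on.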

  15. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  16. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project, designed and implemented to automate a house's electricity and provide a security system that detects the presence of unexpected behavior.

  17. New Evidence for the Mechanism of Action of a Type-2 Diabetes Drug Using a Magnetic Bead-Based Automated Biosensing Platform

    DEFF Research Database (Denmark)

    Uddin, Rokon; Nur-E-Habiba; Rena, Graham

    2017-01-01

    The mechanism of action (MOA) of the first-line type-2 diabetes drug metformin remains unclear despite its widespread usage. However, recent evidence suggests that the mitochondrial copper (Cu)-binding action of metformin may contribute toward the drug's MOA. Here, we present a novel biosensing platform for probing the Cu-binding component of metformin's blood-glucose-lowering action. In this assay, cysteine-functionalized magnetic beads were agglutinated in the presence of Cu due to cysteine's Cu-chelation property. Addition of clinically relevant doses of metformin resulted in disaggregation of the Cu-bridged bead clusters, whereas the effect…

  18. Planejamento de ações para automação inteligente da manufatura Action planning for intelligent manufacturing automation

    Directory of Open Access Journals (Sweden)

    Flavio Tonidandel

    2002-12-01

    This paper investigates the use of the FAR-OFF system in the field of Manufacturing Automation. The FAR-OFF system shares features with planning systems based on heuristic search, which have produced excellent results in the planning area over recent years. However, instead of being a generative planning system, FAR-OFF is a case-based planner that guarantees stability in solving problems within an acceptable amount of time. The results of its application in the logistics domain show that it is a promising system for intelligent automation.

  19. Automation for tsetse mass rearing for use in sterile insect technique programmes. Final report of a co-ordinated research project 1995-2001

    International Nuclear Information System (INIS)

    2003-05-01

    The rearing of tsetse flies for the sterile insect technique has been a laborious procedure in the past. The purpose of this co-ordinated research project (CRP), 'Automation for tsetse mass rearing for use in sterile insect technique programmes', was to develop appropriate semiautomated procedures to simplify the rearing, reduce the cost and standardize the product. Two main objectives were accomplished. The first was to simplify the handling of adults at emergence. This was achieved by allowing the adults to emerge directly into the production cages. Selection of the appropriate environmental conditions and timing allowed the manipulation of the emergence pattern to achieve the desired ratio of four females to one male with minimal un-emerged females remaining mixed with the male pupae. Tests demonstrated that putting the sexes together at emergence, leaving the males in the production cages, and using a ratio of 4:1 (3:1 for a few species) did not adversely affect pupal production. This has resulted in a standardized system for the self-stocking of production cages. The second was to reduce the labour involved in feeding the flies. Three distinct systems were developed and tested in sequence. The first tsetse production unit (TPU 1) was a fully automated system, but the fly survival and fecundity were unacceptably low. From this, a simpler TPU 2 was developed and tested, where 63 large cages were held on a frame that could be moved as a single unit to the feeding location. TPU 2 was tested in various locations and found to satisfy the basic requirements, and the adoption of Plexiglas pupal collection slopes resolved much of the problem due to light distribution. However, the cage-holding frame was heavy and difficult to position on the feeding frame, and the movement disturbed the flies. TPU 2 was superseded by TPU 3, in which the cages remain stationary at all times, and the blood is brought to the flies. The blood feeding system is mounted on rails to make it…

  20. The attraction of sugar : An association between body mass index and impaired avoidance action tendencies of sweet snacks

    NARCIS (Netherlands)

    Maas, J.; Woud, M.L.; Keijsers, G.P.J.; Rinck, M.; Becker, E. S.; Wiers, R. W.

    2017-01-01

    The present study investigated implicit approach-avoidance action tendencies towards snack foods (pictorial Approach-Avoidance Task), and implicit approach-avoidance associations (verbal approach-avoidance Single-Target IAT) and affective associations (verbal positive-negative Single-Target IAT)

  1. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine, with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system.
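
A verifier of this kind agrees with a monitored action only when that action is derivable from the rule base. A toy forward-chaining sketch over propositional facts (the fact and rule names are invented for illustration; the actual system uses an interactive theorem prover, not this naive loop):

```python
def forward_chain(facts, rules):
    # rules: list of (set_of_premises, conclusion); derive to a fixpoint
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

def verify_action(action, facts, rules):
    # the verifier agrees with an action only if it is derivable;
    # otherwise it flags a disagreement with the monitored software
    return action in forward_chain(facts, rules)
```

When `verify_action` returns False, the verifier would surface the disagreement to the operator instead of staying transparent.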

  2. Analytical and clinical performance of the new Fujirebio 25-OH vitamin D assay, a comparison with liquid chromatography-tandem mass spectrometry (LC-MS/MS) and three other automated assays

    OpenAIRE

    Saleh, Lanja; Mueller, Daniel; von Eckardstein, Arnold

    2015-01-01

    BACKGROUND: We evaluated the analytical and clinical performance of the new Lumipulse® G 25-OH vitamin D assay from Fujirebio, and compared it to a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method and three other commercial automated assays. METHODS: Total 25 hydroxy vitamin D (25(OH)D) levels were measured in 100 selected serum samples from our routine analysis with Fujirebio 25(OH)D assay. The results were compared with those obtained with LC-MS/MS and three other automat...

  3. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  4. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there are probably steric and cooperative influences on binding. An alternative, more flexible mathematical model, based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen, has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance is stressed of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity. Published methods for automated data reduction of Scatchard plots…
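
The recommended curve, a third-order polynomial in the square root of concentration, can be sketched as follows. For brevity this fits four calibration points exactly; a real assay would fit more standards by least squares, and the numbers below are invented:

```python
import math

def solve(A, b):
    # Gaussian elimination with partial pivoting (small systems only)
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def fit_sqrt_cubic(concentrations, responses):
    # cubic in x = sqrt(concentration) through four calibration points
    xs = [math.sqrt(c) for c in concentrations]
    A = [[x ** j for j in range(4)] for x in xs]
    return solve(A, responses)

def predict(coeffs, concentration):
    # evaluate the fitted standard curve at a given concentration
    x = math.sqrt(concentration)
    return sum(c * x ** j for j, c in enumerate(coeffs))
```

Unknowns are then read off the fitted curve, restricted to the portion of the curve with adequate dose response, as the abstract advises.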

  5. Direct analysis by time-of-flight secondary ion mass spectrometry reveals action of bacterial laccase-mediator systems on both hardwood and softwood samples.

    Science.gov (United States)

    Goacher, Robyn E; Braham, Erick J; Michienzi, Courtney L; Flick, Robert M; Yakunin, Alexander F; Master, Emma R

    2017-12-29

    The modification and degradation of lignin play a vital role in carbon cycling as well as production of biofuels and bioproducts. The possibility of using bacterial laccases for the oxidation of lignin offers a route to utilize existing industrial protein expression techniques. However, bacterial laccases are most frequently studied on small model compounds that do not capture the complexity of lignocellulosic materials. This work studied the action of laccases from Bacillus subtilis and Salmonella typhimurium (EC 1.10.3.2) on ground wood samples from yellow birch (Betula alleghaniensis) and red spruce (Picea rubens). The ability of bacterial laccases to modify wood can be facilitated by small molecule mediators. Herein, 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS), gallic acid and sinapic acid mediators were tested. Direct analysis of the wood samples was achieved by time-of-flight secondary ion mass spectrometry (ToF-SIMS), a surface sensitive mass spectrometry technique that has characteristic peaks for H, G and S lignin. The action of the bacterial laccases on both wood samples was demonstrated and revealed a strong mediator influence. The ABTS mediator led to delignification, evident in an overall increase of polysaccharide peaks in the residual solid, along with equal loss of G and S-lignin peaks. The gallic acid mediator demonstrated minimal laccase activity. Meanwhile, the sinapic acid mediator altered the S/G peak ratio consistent with mediator attaching to the wood solids. The current investigation demonstrates the action of bacterial laccase-mediator systems directly on woody materials, and the potential of using ToF-SIMS to uncover the fundamental and applied role of bacterial enzymes in lignocellulose conversion. © 2017 Scandinavian Plant Physiology Society.

  6. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on a Distribution Automation (DA) system that enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  7. Action spectroscopy of SrCl⁺ using an integrated ion trap time-of-flight mass spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Puri, Prateek, E-mail: teek24@ucla.edu; Schowalter, Steven J.; Hudson, Eric R. [Department of Physics and Astronomy, University of California, Los Angeles, California 90095 (United States); Kotochigova, Svetlana; Petrov, Alexander [Department of Physics, Temple University, Philadelphia, Pennsylvania 19122 (United States)

    2014-07-07

    The photodissociation cross-section of SrCl⁺ is measured in the spectral range of 36 000–46 000 cm⁻¹ using a modular time-of-flight mass spectrometer (TOF-MS). By irradiating a sample of trapped SrCl⁺ molecular ions with a pulsed dye laser, X¹Σ⁺ state molecular ions are electronically excited to the repulsive wall of the A¹Π state, resulting in dissociation. Using the TOF-MS, the product fragments are detected and the photodissociation cross-section is determined for a broad range of photon energies. Detailed ab initio calculations of the SrCl⁺ molecular potentials and spectroscopic constants are also performed and are found to be in good agreement with experiment. The spectroscopic constants for SrCl⁺ are also compared to those of another alkaline-earth halide, BaCl⁺, in order to highlight structural differences between the two molecular ions. This work represents the first spectroscopy and ab initio calculations of SrCl⁺.
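
In the time-of-flight detection used here, ions accelerated through a potential separate because flight time scales with the square root of m/z, which is how the product fragments are assigned. A minimal sketch of the kinematics (the drift length and voltage are illustrative numbers, not the instrument's parameters):

```python
import math

AMU = 1.66053906660e-27       # kg per dalton
E_CHARGE = 1.602176634e-19    # C, elementary charge

def flight_time(mz, drift_length=1.0, voltage=3000.0):
    # t = L * sqrt(m / (2 q V)): an ion of mass m and charge q,
    # accelerated through `voltage`, drifts over `drift_length`
    mass = mz * AMU       # singly charged ion assumed, m/z in Da
    charge = E_CHARGE
    return drift_length * math.sqrt(mass / (2.0 * charge * voltage))
```

A fragment with four times the m/z arrives twice as late, so distinct dissociation products show up as distinct peaks in the arrival-time spectrum.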

  8. Non-perturbative renormalization in coordinate space for Nf=2 maximally twisted mass fermions with tree-level Symanzik improved gauge action

    Energy Technology Data Exchange (ETDEWEB)

    Cichy, Krzysztof [DESY, Zeuthen (Germany). NIC; Adam Mickiewicz Univ., Poznan (Poland). Faculty of Physics; Jansen, Karl [DESY, Zeuthen (Germany). NIC; Korcyl, Piotr [DESY, Zeuthen (Germany). NIC; Jagiellonian Univ., Krakow (Poland). M. Smoluchowski Inst. of Physics

    2012-07-15

    We present results of a lattice QCD application of a coordinate space renormalization scheme for the extraction of renormalization constants for flavour non-singlet bilinear quark operators. The method consists in the analysis of the small-distance behaviour of correlation functions in Euclidean space and has several theoretical and practical advantages, in particular: it is gauge invariant, easy to implement and has relatively low computational cost. The values of renormalization constants in the X-space scheme can be converted to the MS scheme via 4-loop continuum perturbative formulae. Our results for Nf=2 maximally twisted mass fermions with tree-level Symanzik improved gauge action are compared to the ones from the RI-MOM scheme and show full agreement with this method. (orig.)


  10. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction in preanalytical errors (from 11.7 to 0.4% of the tubes), and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).

  11. Studies on the regioselectivity and kinetics of the action of trypsin on proinsulin and its derivatives using mass spectrometry.

    Science.gov (United States)

    Gardner, Qurra-tul-Ann Afza; Younas, Hooria; Akhtar, Muhammad

    2013-01-01

    Human M-proinsulin was cleaved by trypsin at the R(31)R(32)-E(33) and K(64)R(65)-G(66) bonds (B/C and C/A junctions), showing the same cleavage specificity as exhibited by prohormone convertases 1 and 2 respectively. Buffalo/bovine M-proinsulin was also cleaved by trypsin at the K(59)R(60)-G(61) bond but at the B/C junction cleavage occurred at the R(31)R(32)-E(33) as well as the R(31)-R(32)E(33) bond. Thus, the human isoform in the native state, with a 31 residue connecting C-peptide, seems to have a unique structure around the B/C and C/A junctions and cleavage at these sites is predominantly governed by the structure of the proinsulin itself. In the case of both the proinsulin species the cleavage at the B/C junction was preferred (65%) over that at the C/A junction (35%) supporting the earlier suggestion of the presence of some form of secondary structure at the C/A junction. Proinsulin and its derivatives, as natural substrates for trypsin, were used and mass spectrometric analysis showed that the k(cat.)/K(m) values for the cleavage were most favourable for the scission of the bonds at the two junctions (1.02±0.08×10(5)s(-1)M(-1)) and the cleavage of the K(29)-T(30) bond of M-insulin-RR (1.3±0.07×10(5)s(-1)M(-1)). However, the K(29)-T(30) bond in M-insulin, insulin as well as M-proinsulin was shielded from attack by trypsin (k(cat.)/K(m) values around 1000s(-1)M(-1)). Hence, as the biosynthetic path follows the sequence; proinsulin→insulin-RR→insulin, the K(29)-T(30) bond becomes shielded, exposed then shielded again respectively. Copyright © 2012 Elsevier B.V. All rights reserved.
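
Under substrate concentrations well below Km, cleavage follows pseudo-first-order kinetics with observed rate constant (kcat/Km)[E], so the kcat/Km values above translate directly into relative cleavage rates. A small sketch using the abstract's values (the trypsin concentration is an arbitrary illustrative number):

```python
import math

def observed_rate_constant(kcat_over_km, enzyme_conc):
    # for [S] << Km: v = (kcat/Km) [E] [S], i.e. k_obs = (kcat/Km) [E]
    return kcat_over_km * enzyme_conc

def half_life(k_obs):
    # substrate half-life under first-order decay
    return math.log(2) / k_obs

# kcat/Km from the abstract: junction cleavage ~1.02e5, shielded
# K29-T30 bond ~1e3 (both in s^-1 M^-1)
E = 1e-7  # mol/L, hypothetical enzyme concentration
k_fast = observed_rate_constant(1.02e5, E)
k_slow = observed_rate_constant(1.0e3, E)
```

The roughly 100-fold ratio between the two rate constants is what makes the junction bonds the preferred cleavage sites while K29-T30 remains effectively shielded.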

  12. Development and characterization of a novel plug and play liquid chromatography-mass spectrometry (LC-MS) source that automates connections between the capillary trap, column, and emitter.

    Science.gov (United States)

    Bereman, Michael S; Hsieh, Edward J; Corso, Thomas N; Van Pelt, Colleen K; Maccoss, Michael J

    2013-06-01

    We report the development and characterization of a novel, vendor-neutral ultra-high pressure-compatible (~10,000 p.s.i.) LC-MS source. This device is the first to make automated connections with user-packed capillary traps, columns, and capillary emitters. The source uses plastic rectangular inserts (referred to here as cartridges) where individual components (i.e. trap, column, or emitter) can be exchanged independent of one another in a plug and play manner. Automated robotic connections are made between the three cartridges using linear translation powered by stepper motors to axially compress each cartridge by applying a well controlled constant compression force to each commercial LC fitting. The user has the versatility to tailor the separation (e.g. the length of the column, type of stationary phase, and mode of separation) to the experimental design of interest in a cost-effective manner. The source is described in detail, and several experiments are performed to evaluate the robustness of both the system and the exchange of the individual trap and emitter cartridges. The standard deviation in the retention time of four targeted peptides from a standard digest interlaced with a soluble Caenorhabditis elegans lysate ranged between 3.1 and 5.3 s over 3 days of analyses. Exchange of the emitter cartridge was found to have an insignificant effect on the abundance of various peptides. In addition, the trap cartridge can be replaced with minimal effects on retention time (<20 s).

  13. Validation of an automated solid-phase extraction method for the analysis of 23 opioids, cocaine, and metabolites in urine with ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Ramírez Fernández, María del Mar; Van Durme, Filip; Wille, Sarah M R; di Fazio, Vincent; Kummer, Natalie; Samyn, Nele

    2014-06-01

    The aim of this work was to automate a sample preparation procedure extracting morphine, hydromorphone, oxymorphone, norcodeine, codeine, dihydrocodeine, oxycodone, 6-monoacetyl-morphine, hydrocodone, ethylmorphine, benzoylecgonine, cocaine, cocaethylene, tramadol, meperidine, pentazocine, fentanyl, norfentanyl, buprenorphine, norbuprenorphine, propoxyphene, methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine from urine samples. Samples were extracted by solid-phase extraction (SPE) with cation exchange cartridges using a TECAN Freedom Evo 100 base robotic system, including a hydrolysis step prior to extraction when required. Block modules were carefully selected in order to use the same consumable material as in manual procedures, reducing cost and/or manual sample transfers. Moreover, the present configuration included pressure-monitored pipetting, which increased pipetting accuracy and detected sampling errors. The compounds were then separated in a chromatographic run of 9 min using a BEH Phenyl analytical column on an ultra-performance liquid chromatography-tandem mass spectrometry system. Optimization of the SPE was performed with different wash conditions and elution solvents. Intra- and inter-day relative standard deviations (RSDs) were within ±15% and bias was within ±15% for most of the compounds. Recovery was >69%, and no carry-over in the automated SPE system was observed. The extracted samples were stable for 72 h in the autosampler (4°C). This method was applied to authentic samples (from forensic and toxicology cases) and to proficiency testing schemes containing cocaine, heroin, buprenorphine and methadone, offering fast and reliable results. Automation resulted in improved precision and accuracy and minimal operator intervention, leading to safer sample handling and less time-consuming procedures.
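The ±15% precision and bias criteria above are standard validation figures of merit. As a hedged illustration, the sketch below computes intra-day RSD and bias from replicate quality-control results; all numbers are invented for illustration, not taken from the study:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def bias_percent(values, nominal):
    """Mean deviation (%) of replicates from the nominal spiked concentration."""
    return 100.0 * (statistics.mean(values) - nominal) / nominal

# Hypothetical replicate results (ng/mL) for a QC sample spiked at 50 ng/mL
replicates = [48.9, 51.2, 50.4, 49.5, 52.0]
print(f"RSD  = {rsd_percent(replicates):.1f}%")          # acceptance: within ±15%
print(f"bias = {bias_percent(replicates, 50.0):.1f}%")   # acceptance: within ±15%
```
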

  14. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1% ammonia (25%) in water/0.1% ammonia (25%) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96% of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.

  15. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  16. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  17. Automated digital image analysis of islet cell mass using Nikon's inverted eclipse Ti microscope and software to improve engraftment may help to advance the therapeutic efficacy and accessibility of islet transplantation across centers.

    Science.gov (United States)

    Gmyr, Valery; Bonner, Caroline; Lukowiak, Bruno; Pawlowski, Valerie; Dellaleau, Nathalie; Belaich, Sandrine; Aluka, Isanga; Moermann, Ericka; Thevenet, Julien; Ezzouaoui, Rimed; Queniat, Gurvan; Pattou, Francois; Kerr-Conte, Julie

    2015-01-01

    Reliable assessment of islet viability, mass, and purity is required prior to transplanting an islet preparation into patients with type 1 diabetes. The standard method for quantifying human islet preparations is direct microscopic analysis of dithizone-stained islet samples, but this technique may be susceptible to inter-/intraobserver variability, which may induce false positive/negative islet counts. Here we describe a simple, reliable, automated digital image analysis (ADIA) technique for accurately quantifying islets in terms of total islet number, islet equivalent number (IEQ), and islet purity before transplantation. Islets were isolated and purified from n = 42 human pancreata according to the automated method of Ricordi et al. For each preparation, three islet samples were stained with dithizone and expressed as IEQ number. Islets were analyzed manually by microscopy or automatically quantified using Nikon's inverted Eclipse Ti microscope with built-in NIS-Elements Advanced Research (AR) software. The ADIA method significantly increased the number of islet preparations eligible for engraftment compared to the standard manual method. Image analysis utilizing the Nikon Instruments software is an unbiased, simple, and reliable tool to comprehensively assess the individual size of each islet preparation prior to transplantation. Implementation of this technology to improve engraftment may help to advance the therapeutic efficacy and accessibility of islet transplantation across centers.
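The abstract reports results as islet equivalents (IEQ) without spelling out the conversion; by the standard convention, one IEQ is the volume of a 150 μm diameter islet, so each measured islet contributes (d/150)³ equivalents. A minimal sketch under that assumption (the diameters are invented for illustration):

```python
def islet_equivalents(diameters_um):
    """Convert measured islet diameters (um) to islet equivalents (IEQ).

    One IEQ is defined as the volume of a 150-um-diameter islet, so an
    islet of diameter d contributes (d / 150)**3 equivalents.
    """
    return sum((d / 150.0) ** 3 for d in diameters_um)

# Hypothetical diameters (um) from one dithizone-stained sample
sample = [75, 150, 150, 300]
print(f"{len(sample)} islets = {islet_equivalents(sample):.3f} IEQ")  # -> 10.125 IEQ
```

Note how the single 300 μm islet dominates the IEQ total despite being one of four islets, which is exactly why islet number and IEQ are reported separately.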

  18. Determination of 1-chloro-4-[2,2,2-trichloro-1-(4-chlorophenyl)ethyl]benzene and related compounds in marine pore water by automated thermal desorption-gas chromatography/mass spectrometry using disposable optical fiber

    Science.gov (United States)

    Eganhouse, Robert P.; DiFilippo, Erica L

    2015-01-01

    A method is described for determination of ten DDT-related compounds in marine pore water based on equilibrium solid-phase microextraction (SPME) using commercial polydimethylsiloxane-coated optical fiber with analysis by automated thermal desorption-gas chromatography/mass spectrometry (TD-GC/MS). Thermally cleaned fiber was directly exposed to sediments and allowed to reach equilibrium under static conditions at the in situ field temperature. Following removal, fibers were rinsed, dried and cut into appropriate lengths for storage in leak-tight containers at -20°C. Analysis by TD-GC/MS under full scan (FS) and selected ion monitoring (SIM) modes was then performed. Pore-water method detection limits in FS and SIM modes were estimated at 0.05-2.4 ng/L and 0.7-16 pg/L, respectively. Precision of the method, including contributions from fiber handling, was less than 10%. Analysis of independently prepared solutions containing eight DDT compounds yielded concentrations that were within 6.9 ± 5.5% and 0.1 ± 14% of the actual concentrations in FS and SIM modes, respectively. The use of optical fiber with automated analysis allows for studies at high temporal and/or spatial resolution as well as for monitoring programs over large spatial and/or long temporal scales with adequate sample replication. This greatly enhances the flexibility of the technique and improves the ability to meet quality control objectives at significantly lower cost.
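For equilibrium SPME of this kind, the freely dissolved pore-water concentration is commonly back-calculated from the mass sorbed to the coating as C_free = n_fiber / (K_fw × V_fiber). The abstract does not give its calibration details, so the sketch below uses that generic relation with invented numbers:

```python
def porewater_concentration(n_fiber_pg, k_fw, v_fiber_L):
    """Freely dissolved pore-water concentration from an equilibrium SPME
    measurement: C_free = n_fiber / (K_fw * V_fiber).

    n_fiber_pg : analyte mass extracted by the PDMS coating (pg)
    k_fw       : fiber(PDMS)-water partition coefficient (dimensionless)
    v_fiber_L  : volume of exposed PDMS coating (L)
    """
    return n_fiber_pg / (k_fw * v_fiber_L)

# Illustrative numbers (not from the study): 10 pg of a DDT-family analyte
# on the fiber, log K_fw = 5.7, and 1 uL of PDMS coating
c_free = porewater_concentration(10.0, 10 ** 5.7, 1e-6)  # pg/L
print(f"C_free ≈ {c_free:.0f} pg/L")
```

The pg/L-level result illustrates why the SIM detection limits quoted above can sit in the picogram-per-liter range.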

  19. Analytical and clinical performance of the new Fujirebio 25-OH vitamin D assay, a comparison with liquid chromatography-tandem mass spectrometry (LC-MS/MS) and three other automated assays.

    Science.gov (United States)

    Saleh, Lanja; Mueller, Daniel; von Eckardstein, Arnold

    2016-04-01

    We evaluated the analytical and clinical performance of the new Lumipulse® G 25-OH vitamin D assay from Fujirebio and compared it to a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method and three other commercial automated assays. Total 25-hydroxyvitamin D (25(OH)D) levels were measured in 100 selected serum samples from our routine analysis with the Fujirebio 25(OH)D assay. The results were compared with those obtained with LC-MS/MS and three other automated 25(OH)D assays (Abbott, Beckman, and Roche). The accuracy of each assay tested was evaluated against a Labquality reference serum panel for 25(OH)D (Ref!25OHD; University of Ghent). Intra- and inter-day imprecision of the Fujirebio 25(OH)D assay was within acceptable limits. The Lumipulse® G 25-OH vitamin D assay demonstrated a good correlation with LC-MS/MS and some immunoassays. The performance of the assay is well suited for routine 25(OH)D measurement in clinical serum samples. A correction for the observed negative bias vs. LC-MS/MS could be considered.

  20. Mass-action model analysis of the apparent molar volume and heat capacity of pluronics in water and liposome suspensions at 25 °C.

    Science.gov (United States)

    Quirion, François; Meilleur, Luc; Lévesque, Isabelle

    2013-07-09

    Pluronics are block copolymers composed of a central block of polypropylene oxide and two side chains of polyethylene oxide. They are used in water to generate aggregates and gels or added to phospholipid suspensions to prepare microparticles for drug delivery applications. The structure of these systems has been widely investigated. However, little is known about the mechanisms leading to these structures. This investigation compares the apparent molar volumes and heat capacities of Pluronics F38, F108, F127, P85, P104, and P103 at 25 °C in water and in the presence of lecithin liposomes. The changes in molar volumes, heat capacities, and enthalpies generated by a mass-action model are in good agreement with the loss of hydrophobic hydration of the polypropylene oxide central block of the Pluronics. However, the molecularity of the endothermic transitions is much smaller than the aggregation numbers reported in the literature for the same systems. It is suggested that Pluronics go through dehydration of their central block to form unimolecular or small entities having a hydrophobic polypropylene oxide core. In water, these entities would assemble athermally to form larger aggregates. In the presence of liposomes, they would be transferred into the hydrophobic lecithin bilayers of the liposomes. Light transmission experiments suggest that the liposome suspensions are significantly altered only when the added Pluronics are in the dehydrated state.
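A mass-action model of the kind invoked above treats aggregation as the equilibrium n·A ⇌ Aₙ; an observed apparent molar property is then the aggregation-weighted average of monomer and aggregate contributions. The sketch below solves the mass balance for the free-monomer fraction by bisection; the values of K, n, and the concentration are illustrative, not taken from the study:

```python
def monomer_fraction(c_total, K, n, tol=1e-12):
    """Solve the mass-action equilibrium n*A <-> A_n for the free-monomer
    fraction alpha, using the mass balance c_total = [A] + n*K*[A]**n
    with [A] = alpha * c_total (all concentrations on a monomer basis).
    """
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        alpha = 0.5 * (lo + hi)
        a = alpha * c_total
        # The mass balance is monotonically increasing in alpha
        if a + n * K * a ** n > c_total:
            hi = alpha
        else:
            lo = alpha
    return 0.5 * (lo + hi)

# Illustrative dimerization: K = 1000 (in mol-based units), c_total = 0.01 M
alpha = monomer_fraction(0.01, 1000.0, 2)
print(f"free-monomer fraction ≈ {alpha:.3f}")  # -> 0.200
```

For n = 2 this can be checked against the quadratic solution, which gives α = 0.2 exactly for these inputs; the bisection generalizes to the larger molecularities discussed in the abstract.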

  1. Automated N-glycan profiling of a mutant Trypanosoma rangeli sialidase expressed in Pichia pastoris, using tandem mass spectrometry and bioinformatics

    DEFF Research Database (Denmark)

    Li, Haiying; Rasmussen, Morten I; Larsen, Martin R

    2015-01-01

    A mutant Trypanosoma rangeli sialidase, Tr7, expressed in Pichia pastoris, exhibits significant trans-sialidase activity, and has been used for analytical-scale production of sialylated human milk oligosaccharides. Mass spectrometry-based site-specific N-glycoprofiling of Tr7 showed that heteroge...

  2. Development of an automated on-line pepsin digestion-liquid chromatography-tandem mass spectrometry configuration for the rapid analysis of protein adducts of chemical warfare agents

    NARCIS (Netherlands)

    Carol-Visser, J.; van der Schans, M.; Fidder, A.; Huist, A.G.; van Baar, B.L.M.; Irth, H.; Noort, D.

    2008-01-01

    Rapid monitoring and retrospective verification are key issues in protection against and non-proliferation of chemical warfare agents (CWA). Such monitoring and verification are adequately accomplished by the analysis of persistent protein adducts of these agents. Liquid chromatography-mass

  3. The Spitzer Survey of Stellar Structure in Galaxies (S4G): Precise Stellar Mass Distributions from Automated Dust Correction at 3.6 μm

    Science.gov (United States)

    Querejeta, Miguel; Meidt, Sharon E.; Schinnerer, Eva; Cisternas, Mauricio; Muñoz-Mateos, Juan Carlos; Sheth, Kartik; Knapen, Johan; van de Ven, Glenn; Norris, Mark A.; Peletier, Reynier; Laurikainen, Eija; Salo, Heikki; Holwerda, Benne W.; Athanassoula, E.; Bosma, Albert; Groves, Brent; Ho, Luis C.; Gadotti, Dimitri A.; Zaritsky, Dennis; Regan, Michael; Hinz, Joannah; Gil de Paz, Armando; Menendez-Delmestre, Karin; Seibert, Mark; Mizusawa, Trisha; Kim, Taehyun; Erroz-Ferrer, Santiago; Laine, Jarkko; Comerón, Sébastien

    2015-07-01

    The mid-infrared is an optimal window to trace stellar mass in nearby galaxies, and the 3.6 μm IRAC band has been exploited to this effect, but such mass estimates can be biased by dust emission. We present our pipeline to reveal the old stellar flux at 3.6 μm and obtain stellar mass maps for more than 1600 galaxies available from the Spitzer Survey of Stellar Structure in Galaxies (S4G). This survey consists of images in two infrared bands (3.6 and 4.5 μm), and we use the Independent Component Analysis (ICA) method presented in Meidt et al. to separate the dominant light from old stars and the dust emission that can significantly contribute to the observed 3.6 μm flux. We exclude from our ICA analysis galaxies with low signal-to-noise ratio (S/N < 10) and those with original [3.6]-[4.5] colors compatible with an old stellar population, indicative of little dust emission (mostly early Hubble types, which can directly provide good mass maps). For the remaining 1251 galaxies to which ICA was successfully applied, we find that as much as 10%-30% of the total light at 3.6 μm typically originates from dust, and locally it can reach even higher values. This contamination fraction shows a correlation with specific star formation rates, confirming that the dust emission that we detect is related to star formation. Additionally, we have used our large sample of mass estimates to calibrate a relationship between effective mass-to-light ratio (M/L) and observed [3.6]-[4.5] color: log(M/L) = -0.339(±0.057) × ([3.6]-[4.5]) - 0.336(±0.002). Our final pipeline products have been made public through IRSA, providing the astronomical community with an unprecedentedly large set of stellar mass maps ready to use for scientific applications.
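The quoted M/L calibration can be applied directly. A minimal sketch (the coefficients come from the abstract; the example color is ours):

```python
def stellar_mass_to_light(color_36_45):
    """Effective 3.6 um mass-to-light ratio from the S4G calibration:
    log10(M/L) = -0.339 * ([3.6]-[4.5]) - 0.336
    """
    return 10 ** (-0.339 * color_36_45 - 0.336)

# A dust-free old stellar population with [3.6]-[4.5] = 0:
print(f"M/L(3.6 um) = {stellar_mass_to_light(0.0):.2f}")  # -> 0.46
```

Redder [3.6]-[4.5] colors (dustier, more star-forming pixels) yield a lower effective M/L, which is the sense of the dust correction described above.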

  4. Murine Automated Urine Sampler (MAUS), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal outlines planned development for a low-power, low-mass automated urine sample collection and preservation system for small mammals, capable of...

  5. An automated, open-source pipeline for mass production of digital elevation models (DEMs) from very-high-resolution commercial stereo satellite imagery

    Science.gov (United States)

    Shean, David E.; Alexandrov, Oleg; Moratto, Zachary M.; Smith, Benjamin E.; Joughin, Ian R.; Porter, Claire; Morin, Paul

    2016-06-01

    We adapted the automated, open source NASA Ames Stereo Pipeline (ASP) to generate digital elevation models (DEMs) and orthoimages from very-high-resolution (VHR) commercial imagery of the Earth. These modifications include support for rigorous and rational polynomial coefficient (RPC) sensor models, sensor geometry correction, bundle adjustment, point cloud co-registration, and significant improvements to the ASP code base. We outline a processing workflow for ˜0.5 m ground sample distance (GSD) DigitalGlobe WorldView-1 and WorldView-2 along-track stereo image data, with an overview of ASP capabilities, an evaluation of ASP correlator options, benchmark test results, and two case studies of DEM accuracy. Output DEM products are posted at ˜2 m with direct geolocation accuracy of a few meters. Although individual stereo pairs can be processed on a local workstation, the methods presented here were developed for large-scale batch processing in a high-performance computing environment. We are leveraging these resources to produce dense time series and regional mosaics for the Earth's polar regions.

  6. Study of the effect of positive ions impinging sensitive emulsions in mass spectrography; Etude de l'action des ions positifs sur les emulsions sensibles en spectrographie de masse

    Energy Technology Data Exchange (ETDEWEB)

    Cavard, A [Commissariat a l' Energie Atomique, Grenoble (France). Centre d' Etudes Nucleaires

    1969-08-01

    Experimental relationships have been established between the blackening of emulsions by impinging ions and the following parameters: the number, mass and energy of the impinging particles. Ions of medium energy (about 20 keV) give rise to a latent image, probably made of small specks of metallic silver located at the surface or in the bulk of the silver halide grain. A specific developer for ion-sensitive emulsions was perfected; sensitivity and detection threshold are increased by a factor of two to three compared with the values observed using a classical developer. Low-energy particles, sputtered from the superficial layers of the emulsion by the impinging 20 keV ions, produce a latent image close to the surface of the silver halide grain. An oxidizing process bleaches this superficial latent image and thereby reduces background fog. The improved signal-over-background ratio makes it possible to observe lines that are undetectable when the plates are developed by the usual process. (author)

  7. Marketing automation processes as a way to improve contemporary marketing of a company

    Directory of Open Access Journals (Sweden)

    Witold Świeczak

    2013-09-01

    Full Text Available The main aim of this article is to identify the possibilities that the processes included in a marketing automation system offer to contemporary companies. The publication deals with the key aspects of this issue: it shows how the importance of the organization changes and how its value increases as a result of using the tools provided by marketing automation. The article defines the factors and processes that influence the effective course of actions taken as part of marketing automation. Marketing automation is a completely new reality: it gives up communication based on mass distribution of uniform content in favour of genuinely personalized, individual and fully automated communication. It is a kind of coexistence in which the sales and marketing departments cooperate closely to achieve the best result, and a situation in which marketing can clearly demonstrate its contribution to the income generated by the company. Marketing automation also means huge analytical possibilities and a real increase in a company's value, since the system becomes the source of information about clients and about all marketing and sales processes taking place in a company. The introduction of a marketing automation system alters not only the current functioning of a marketing department, but also marketers themselves. In fact, everything that a marketing automation system provides, primarily the accumulated unique knowledge of the client, is a critical marketing asset of every modern enterprise.

  8. An off-line automated preconcentration system with ethylenediaminetriacetate chelating resin for the determination of trace metals in seawater by high-resolution inductively coupled plasma mass spectrometry.

    Science.gov (United States)

    Minami, Tomoharu; Konagaya, Wataru; Zheng, Linjie; Takano, Shotaro; Sasaki, Masanobu; Murata, Rena; Nakaguchi, Yuzuru; Sohrin, Yoshiki

    2015-01-07

    A novel automated off-line preconcentration system for trace metals (Al, Mn, Fe, Co, Ni, Cu, Zn, Cd, and Pb) in seawater was developed by improving a commercially available solid-phase extraction system, SPE-100 (Hiranuma Sangyo). The chelating resin used was NOBIAS Chelate-PA1 (Hitachi High-Technologies), with ethylenediaminetriacetic acid and iminodiacetic acid functional groups. Parts of the 8-way valve made of alumina and zirconia in the original SPE-100 system were replaced with parts made of polychlorotrifluoroethylene in order to reduce contamination by trace metals. The eluent path was altered for back-flush elution of trace metals. We optimized the cleaning procedures for the chelating resin column and the flow lines of the preconcentration system, and developed a preconcentration procedure that required less labor and gave superior performance compared to manual preconcentration (Sohrin et al.). The nine trace metals were simultaneously and quantitatively preconcentrated from ∼120 g of seawater, eluted with ∼15 g of 1 M HNO3, and determined by HR-ICP-MS using the calibration curve method. The single-step preconcentration removed more than 99.998% of Na, K, Mg, Ca, and Sr from seawater. The procedural blanks and detection limits were lower than the lowest concentrations in seawater for Mn, Ni, Cu, and Pb, while they were as low as the lowest concentrations in seawater for Al, Fe, Co, Zn, and Cd. The accuracy and precision of this method were confirmed by the analysis of reference seawater samples (CASS-5, NASS-5, GEOTRACES GS, and GD) and of seawater samples for vertical distribution in the western North Pacific Ocean. Copyright © 2014 Elsevier B.V. All rights reserved.
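The enrichment implied by the quoted masses, under the quantitative recovery the abstract reports, is a simple ratio (our arithmetic, not a figure stated in the study):

```python
# Masses quoted in the abstract for the off-line preconcentration step
sample_g = 120.0   # seawater processed (g)
eluate_g = 15.0    # 1 M HNO3 eluate (g)

# Assuming quantitative (100%) recovery, a seawater concentration C_sw
# appears in the eluate at enrichment * C_sw, so the HR-ICP-MS detection
# limit improves by the same factor when expressed in seawater terms.
enrichment = sample_g / eluate_g
print(f"nominal enrichment = {enrichment:.0f}x")  # -> 8x
```
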

  9. Autonomy, Automation, and Systems

    Science.gov (United States)

    Turner, Philip R.

    1987-02-01

    Aerospace industry interest in autonomy and automation, given fresh impetus by the national goal of establishing a Space Station, is becoming a major item of research and technology development. The promise of new technology arising from research in Artificial Intelligence (AI) has focused much attention on its potential in autonomy and automation. These technologies can improve performance in autonomous control functions that involve planning, scheduling, and fault diagnosis of complex systems. There are, however, many aspects of system and subsystem design in an autonomous system that impact AI applications, but do not directly involve AI technology. Development of a system control architecture, establishment of an operating system within the design, providing command and sensory data collection features appropriate to automated operation, and the use of design analysis tools to support system engineering are specific examples of major design issues. Aspects such as these must also receive attention and technology development support if we are to implement complex autonomous systems within the realistic limitations of mass, power, cost, and available flight-qualified technology that are all-important to a flight project.

  10. Automated generation of lattice QCD Feynman rules

    Energy Technology Data Exchange (ETDEWEB)

    Hart, A.; Mueller, E.H. [Edinburgh Univ. (United Kingdom). SUPA School of Physics and Astronomy; von Hippel, G.M. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Horgan, R.R. [Cambridge Univ. (United Kingdom). DAMTP, CMS

    2009-04-15

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  11. Automated generation of lattice QCD Feynman rules

    International Nuclear Information System (INIS)

    Hart, A.; Mueller, E.H.; Horgan, R.R.

    2009-04-01

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  12. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  13. An Automated Sample Preparation Instrument to Accelerate Positive Blood Cultures Microbial Identification by MALDI-TOF Mass Spectrometry (Vitek®MS)

    Directory of Open Access Journals (Sweden)

    Patrick Broyer

    2018-05-01

    Full Text Available Sepsis is the leading cause of death among patients in intensive care units (ICUs), requiring an early diagnosis to introduce efficient therapeutic intervention. Rapid identification (ID) of a causative pathogen is key to guide directed antimicrobial selection and was recently shown to reduce hospitalization length in ICUs. Direct processing of positive blood cultures by MALDI-TOF MS technology is one of several currently available tools used to generate rapid microbial ID. However, all recently published protocols are still manual and time consuming, requiring dedicated technician availability and specific strategies for batch processing. We present here a new prototype instrument for automated preparation of Vitek®MS slides directly from positive blood culture broth, based on an "all-in-one" extraction strip. This benchtop instrument was evaluated on 111 and 22 organisms processed using artificially inoculated blood culture bottles in the BacT/ALERT® 3D system (SA/SN blood culture bottles) or the BacT/ALERT® VirtuoTM system (FA/FN Plus bottles), respectively. Overall, this new preparation station provided reliable and accurate Vitek MS species-level identification of 87% (Gram-negative bacteria = 85%, Gram-positive bacteria = 88%, and yeast = 100%) when used with BacT/ALERT® 3D and of 84% (Gram-negative bacteria = 86%, Gram-positive bacteria = 86%, and yeast = 75%) with Virtuo® instruments, respectively. The prototype was then evaluated in a clinical microbiology laboratory on 102 clinical blood culture bottles and compared to routine laboratory ID procedures. Overall, the correlation of ID on monomicrobial bottles was 83% (Gram-negative bacteria = 89%, Gram-positive bacteria = 79%, and yeast = 78%), demonstrating roughly equivalent performance between the manual and automated extraction methods. This prototype instrument exhibited a high level of performance regardless of bottle type or BacT/ALERT system. Furthermore, blood culture workflow could

  14. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  15. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  16. LipidMatch: an automated workflow for rule-based lipid identification using untargeted high-resolution tandem mass spectrometry data.

    Science.gov (United States)

    Koelmel, Jeremy P; Kroeger, Nicholas M; Ulmer, Candice Z; Bowden, John A; Patterson, Rainey E; Cochran, Jason A; Beecher, Christopher W W; Garrett, Timothy J; Yost, Richard A

    2017-07-10

    Lipids are ubiquitous and serve numerous biological functions; thus lipids have been shown to have great potential as candidates for elucidating biomarkers and pathway perturbations associated with disease. Methods expanding coverage of the lipidome increase the likelihood of biomarker discovery and could lead to a more comprehensive understanding of disease etiology. We introduce LipidMatch, an R-based tool for lipid identification in liquid chromatography tandem mass spectrometry workflows. LipidMatch currently has over 250,000 lipid species spanning 56 lipid types contained in in silico fragmentation libraries. Unique fragmentation libraries, compared to other open source software, include oxidized lipids, bile acids, sphingosines, and previously uncharacterized adducts, including ammoniated cardiolipins. LipidMatch uses rule-based identification. For each lipid type, the user can select which fragments must be observed for identification. Rule-based identification allows for correct annotation of lipids based on the fragments observed, unlike typical identification based solely on spectral similarity scores, where over-reporting structural details that are not conferred by fragmentation data is common. Another unique feature of LipidMatch is ranking lipid identifications for a given feature by the sum of fragment intensities. For each lipid candidate, the intensities of experimental fragments with exact mass matches to expected in silico fragments are summed. The lipid identifications with the greatest summed intensity using this ranking algorithm were comparable to other lipid identification software annotations, MS-DIAL and Greazy. For example, for features with identifications from all three software tools, 92% of LipidMatch identifications by fatty acyl constituents were corroborated by at least one other software tool in positive ion mode and 98% in negative ion mode. LipidMatch allows users to annotate lipids across a wide range of high resolution tandem mass spectrometry
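A hedged sketch of the ranking idea described above: candidates are scored by summing the intensities of experimental fragments that exact-mass-match expected in-silico fragments, then ranked by that sum. The names, m/z values, and tolerance below are illustrative only, not LipidMatch's actual implementation:

```python
def summed_matched_intensity(spectrum, library_fragments, tol_mz=0.005):
    """Sum the intensities of experimental fragments whose m/z matches an
    expected in-silico fragment within +/- tol_mz.

    spectrum          : list of (mz, intensity) pairs
    library_fragments : list of theoretical fragment m/z values
    """
    return sum(
        intensity
        for mz, intensity in spectrum
        if any(abs(mz - frag) <= tol_mz for frag in library_fragments)
    )

def rank_candidates(spectrum, candidates):
    """Rank lipid candidates for one feature by summed fragment intensity."""
    scored = [(summed_matched_intensity(spectrum, frags), name)
              for name, frags in candidates.items()]
    return sorted(scored, reverse=True)

# Toy spectrum and two hypothetical candidates (values illustrative only)
spectrum = [(184.0733, 5000.0), (281.2481, 1200.0), (255.2324, 300.0)]
candidates = {
    "PC(16:0_18:1)": [184.0733, 281.2481],  # both expected fragments observed
    "PC(16:0_18:2)": [184.0733, 279.2324],  # one expected fragment missing
}
print(rank_candidates(spectrum, candidates))
```

The candidate whose expected fragments are all observed accumulates the larger summed intensity and ranks first, which is the behavior the abstract describes.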

  17. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers methods for building an automation plan and designing automation facilities; the automation of chip-producing processes, including the basics of cutting, NC machining, and chip handling; automation units such as drilling, tapping, boring, milling, and slide units; hydraulics, including its characteristics and basic hydraulic circuits; pneumatics; and the kinds of automation and their application to processing, assembly, transportation, automatic machines, and factory automation.

  18. Automated and high confidence protein phosphorylation site localization using complementary collision-activated dissociation and electron transfer dissociation tandem mass spectrometry

    DEFF Research Database (Denmark)

    Hansen, Thomas A; Sylvester, Marc; Jensen, Ole N

    2012-01-01

    Peptide fragmentation and the loss of labile phosphate groups complicate identification of the site of the phosphate motif. Here, we have implemented and evaluated a novel approach for phospho-site localization by the combined use of peptide tandem mass spectrometry data obtained using both collision-activated dissociation and electron transfer dissociation, an approach termed the Cscore. The scoring algorithm used in the Cscore was adapted from the widely used Ascore method. The analytical benefit of integrating the product ion information of both ETD and CAD data is evident in increased confidence in phospho-site localization and the number of assigned phospho-sites at a fixed false-localization rate. The average calculated Cscore from a large data set (>7000 phosphopeptide MS/MS spectra) was ∼32 compared to ∼23 and ∼17 for the Ascore using collision-activated dissociation (CAD) or electron transfer dissociation (ETD), respectively.
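
An Ascore-style localization score is a binomial probability argument: how likely is it that the observed number of site-determining ions matched by chance? The sketch below illustrates that idea and one plausible way to combine CAD and ETD evidence (summing the two log-scale scores, i.e. multiplying the chance probabilities). The per-peak match probability of 0.05 and the combination rule are assumptions for illustration; the published Ascore/Cscore details differ.

```python
import math

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of matching k or more peaks."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

def ascore_like(n_site_ions, n_matched, p_match=0.05):
    """Ascore-style value: -10*log10 of the chance probability of matching
    n_matched of n_site_ions site-determining ions."""
    return -10.0 * math.log10(binomial_tail(n_site_ions, n_matched, p_match))

def combined_score(cad, etd):
    """Illustrative CAD+ETD combination: treat the two spectra of the same
    precursor as independent evidence (assumption, not the actual Cscore)."""
    return ascore_like(*cad) + ascore_like(*etd)
```

Matching 6 of 10 site-determining ions in CAD and 5 of 8 in ETD then yields a combined score well above either single-spectrum score.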

  19. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  20. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  1. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to evolve substantially.

  2. Cocrystal solubility-pH and drug solubilization capacity of sodium dodecyl sulfate – mass action model for data analysis and simulation to improve design of experiments

    Directory of Open Access Journals (Sweden)

    Alex Avdeef

    2018-06-01

    Full Text Available This review discusses the disposition of the anionic surfactant sodium dodecyl sulfate (SDS; i.e., sodium lauryl sulfate) to solubilize sparingly-soluble drugs above the surfactant critical micelle concentration (CMC), as quantitated by the solubilization capacity (k). A compilation of 101 published SDS k values of mostly poorly-soluble drug molecules was used to develop a prediction model as a function of the drug’s intrinsic solubility, S0, and its calculated H-bond acceptor/donor potential. In almost all cases, the surfactant was found to solubilize the neutral form of the drug. Using the mass action model, the k values were converted to drug-micelle stoichiometric binding constants, Kn, corresponding to drug-micelle equilibria in drug-saturated solutions. An in-depth case study (data from published sources) considered the micellization reactions as a function of pH of a weak base, B (pKa 3.58, S0 52 μg/mL), where at pH 1 the BH.SDS salt was predicted to precipitate both below and above the CMC. At low SDS concentrations, two drug salts were predicted to co-precipitate: BH.Cl and BH.SDS. Solubility products of both were determined from the analysis of the reported solubility-surfactant data. Above the CMC, in a rare example, the charged form of the drug (BH+) appeared to be strongly solubilized by the surfactant. The constant for that reaction was also determined. At pH 7, the reactions were simpler, as only the neutral form of the drug was solubilized, to a significantly lesser extent than at pH 1. Case studies also featured examples of solubilization of solids in the form of cocrystals. For many cocrystal systems studied in aqueous solution, the anticipated supersaturated state is not long-lasting, as the drug component precipitates to a thermodynamically stable form, thus lowering the amount of the active ingredient available for intestinal absorption. Use of surfactant can prevent this. A recently-described method for predicting the
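
The solubilization relation underlying the k values can be stated compactly. This is the generic textbook form of the micellar solubilization and mass action picture, not necessarily the review's exact parameterization:

```latex
% Above the CMC, micellar solubilization adds linearly to the intrinsic
% solubility S_0, with the solubilization capacity k as the slope:
S(C_{\mathrm{SDS}}) = S_0 + k\,\bigl(C_{\mathrm{SDS}} - \mathrm{CMC}\bigr),
  \qquad C_{\mathrm{SDS}} > \mathrm{CMC}
\quad\Longrightarrow\quad
k = \frac{S(C_{\mathrm{SDS}}) - S_0}{C_{\mathrm{SDS}} - \mathrm{CMC}}

% The mass action view treats solubilization as drug--micelle binding,
% with [D] = S_0 in a drug-saturated solution:
\mathrm{D} + \mathrm{M} \rightleftharpoons \mathrm{DM},
  \qquad K_n = \frac{[\mathrm{DM}]}{[\mathrm{D}]\,[\mathrm{M}]}
```

Converting a measured slope k into a binding constant Kn in this framework is what allows the compiled SDS data to be compared across drugs.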

  3. Automated and sensitive determination of four anabolic androgenic steroids in urine by online turbulent flow solid-phase extraction coupled with liquid chromatography-tandem mass spectrometry: a novel approach for clinical monitoring and doping control.

    Science.gov (United States)

    Guo, Feng; Shao, Jing; Liu, Qian; Shi, Jian-Bo; Jiang, Gui-Bin

    2014-07-01

    A novel method for automated and sensitive analysis of testosterone, androstenedione, methyltestosterone and methenolone in urine samples by online turbulent flow solid-phase extraction coupled with high performance liquid chromatography-tandem mass spectrometry was developed. The optimization and validation of the method were discussed in detail. The Turboflow C18-P SPE column showed the best extraction efficiency for all the analytes. Nanogram per liter (ng/L) level of AAS could be determined directly and the limits of quantification (LOQs) were 0.01 ng/mL, which were much lower than normally concerned concentrations for these typical anabolic androgenic steroids (AAS) (0.1 ng/mL). The linearity range was from the LOQ to 100 ng/mL for each compound, with the coefficients of determination (r(2)) ranging from 0.9990 to 0.9999. The intraday and interday relative standard deviations (RSDs) ranged from 1.1% to 14.5% (n=5). The proposed method was successfully applied to the analysis of urine samples collected from 24 male athletes and 15 patients of prostate cancer. The proposed method provides an alternative practical way to rapidly determine AAS in urine samples, especially for clinical monitoring and doping control. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. An automated, open-source (NASA Ames Stereo Pipeline) workflow for mass production of high-resolution DEMs from commercial stereo satellite imagery: Application to mountain glaciers in the contiguous US

    Science.gov (United States)

    Shean, D. E.; Arendt, A. A.; Whorton, E.; Riedel, J. L.; O'Neel, S.; Fountain, A. G.; Joughin, I. R.

    2016-12-01

    We adapted the open source NASA Ames Stereo Pipeline (ASP) to generate digital elevation models (DEMs) and orthoimages from very-high-resolution (VHR) commercial imagery of the Earth. These modifications include support for rigorous and rational polynomial coefficient (RPC) sensor models, sensor geometry correction, bundle adjustment, point cloud co-registration, and significant improvements to the ASP code base. We outline an automated processing workflow for 0.5 m GSD DigitalGlobe WorldView-1/2/3 and GeoEye-1 along-track and cross-track stereo image data. Output DEM products are posted at 2, 8, and 32 m. While ASP can be used to process individual stereo pairs on a local workstation, the methods presented here were developed for large-scale batch processing in a high-performance computing environment. We have leveraged these resources to produce dense time series and regional mosaics for the Earth's ice sheets. We are now processing and analyzing all available 2008-2016 commercial stereo DEMs over glaciers and perennial snowfields in the contiguous US. We are using these records to study long-term, interannual, and seasonal volume change and glacier mass balance. This analysis will provide a new assessment of regional climate change, and will offer basin-scale analyses of snowpack evolution and snow/ice melt runoff for water resource applications.
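
The geodetic mass-balance step the abstract describes amounts to differencing two co-registered DEMs, converting elevation change to volume, and scaling by an assumed density. A minimal sketch; the grids, 2 m posting, and the 850 kg/m3 ice+firn density are illustrative assumptions, not values from the paper:

```python
# Sketch of geodetic mass balance from DEM differencing.
ICE_DENSITY = 850.0  # kg/m^3, a commonly assumed ice+firn conversion density

def mass_balance(dem_t0, dem_t1, pixel_area_m2, density=ICE_DENSITY):
    """Total mass change (kg) between two elevation grids (m, same shape)."""
    dh_sum = sum(b - a
                 for row0, row1 in zip(dem_t0, dem_t1)
                 for a, b in zip(row0, row1))
    volume_change = dh_sum * pixel_area_m2  # m^3
    return volume_change * density

# Toy 2x2 grids at 2 m posting (pixel area 4 m^2), elevations in meters
dem_2008 = [[100.0, 101.0], [102.0, 103.0]]
dem_2016 = [[ 98.0,  99.5], [101.0, 102.0]]
dm = mass_balance(dem_2008, dem_2016, pixel_area_m2=4.0)
```

The negative result corresponds to net thinning over the toy grid; in practice the same operation is run over co-registered raster mosaics rather than Python lists.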

  5. Quantification of five compounds with heterogeneous physicochemical properties (morphine, 6-monoacetylmorphine, cyamemazine, meprobamate and caffeine) in 11 fluids and tissues, using automated solid-phase extraction and gas chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bévalot, Fabien; Bottinelli, Charline; Cartiser, Nathalie; Fanton, Laurent; Guitton, Jérôme

    2014-06-01

    An automated solid-phase extraction (SPE) protocol followed by gas chromatography coupled with tandem mass spectrometry was developed for quantification of caffeine, cyamemazine, meprobamate, morphine and 6-monoacetylmorphine (6-MAM) in 11 biological matrices [blood, urine, bile, vitreous humor, liver, kidney, lung and skeletal muscle, brain, adipose tissue and bone marrow (BM)]. The assay was validated for linearity, within- and between-day precision and accuracy, limits of quantification, selectivity, extraction recovery (ER), sample dilution and autosampler stability on BM. For the other matrices, partial validation was performed (limits of quantification, linearity, within-day precision, accuracy, selectivity and ER). The lower limits of quantification were 12.5 ng/mL(ng/g) for 6-MAM, morphine and cyamemazine, 100 ng/mL(ng/g) for meprobamate and 50 ng/mL(ng/g) for caffeine. Analysis of real-case samples demonstrated the performance of the assay in forensic toxicology to investigate challenging cases in which, for example, blood is not available or in which analysis in alternative matrices could be relevant. The SPE protocol was also assessed as an extraction procedure that could target other relevant analytes of interest. The extraction procedure was applied to 12 molecules of forensic interest with various physicochemical properties (alimemazine, alprazolam, amitriptyline, citalopram, cocaine, diazepam, levomepromazine, nordazepam, tramadol, venlafaxine, pentobarbital and phenobarbital). All drugs were able to be detected at therapeutic concentrations in blood and in the alternate matrices.

  6. Analysis of urinary 8-isoprostane as an oxidative stress biomarker by stable isotope dilution using automated online in-tube solid-phase microextraction coupled with liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Mizuno, Keisuke; Kataoka, Hiroyuki

    2015-08-10

    We have developed a simple and sensitive method for the determination of the oxidative stress biomarker 8-isoprostane (8-IP) in human urine by automated online in-tube solid-phase microextraction (SPME) coupled with liquid chromatography-tandem mass spectrometry (LC-MS/MS) using a Zorbax Eclipse XDB-C8 column and 0.1% formic acid/methanol (25/75, v/v) as a mobile phase. Electrospray MS/MS for 8-IP was performed on an API 4000 triple quadrupole mass spectrometer in negative ion mode. The optimum in-tube SPME conditions were 20 draw/eject cycles with a sample size of 40 μL using a Carboxen 1006 PLOT capillary column for the extraction. The extracted compounds were easily desorbed from the capillary by passage of the mobile phase, and no carryover was observed. Total analysis time of this method including online extraction and analysis was about 30 min for each sample. The in-tube SPME LC-MS/MS method showed good linearity in the concentration range of 20-1000 pg/mL with a correlation coefficient r = 0.9999 for 8-IP using a stable isotope-labeled internal standard, 8-IP-d4. The detection limit of 8-IP was 3.3 pg/mL and the proposed method showed 42-fold higher sensitivity than the direct injection method. The intra-day and inter-day precisions (relative standard deviations) were below 5.0% and 8.5% (n = 5), respectively. This method was applied successfully to the analysis of urine samples without pretreatment or interference peaks. The recovery rates of 8-IP spiked into urine samples were above 92%. This method is useful for assessing the effects of oxidative stress and antioxidant intake. Copyright © 2015 Elsevier B.V. All rights reserved.
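
Stable isotope-dilution quantification of the kind used here regresses the analyte/internal-standard peak-area ratio against standard concentration, then reads unknowns off the fitted line. A minimal sketch; the calibration levels and area ratios below are made-up illustrative values, not the paper's data:

```python
# Sketch of isotope-dilution calibration (analyte / d4-IS area ratio vs conc).

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

concs = [20.0, 100.0, 500.0, 1000.0]  # pg/mL calibration levels
ratios = [0.05, 0.21, 1.01, 2.01]     # analyte / 8-IP-d4 area ratios (made up)
slope, intercept = fit_line(concs, ratios)

def quantify(ratio):
    """Back-calculate concentration (pg/mL) from an observed area ratio."""
    return (ratio - intercept) / slope
```

Ratioing against a co-eluting labeled standard is what makes the calibration robust to extraction recovery and ionization variability, which is why the method tolerates urine injected without pretreatment.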

  7. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
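
The STD-to-state-program translation described above can be illustrated with a table-driven dispatcher: each state maps incoming events to an (action, next-state) pair, and execution is driven entirely by external events. The state and event names below are hypothetical, and this Python sketch only mimics the shape of the generated code, not the Los Alamos SNL or its C output:

```python
# Table-driven sketch of an event-driven state program derived from an STD.

def make_machine(table, start):
    """Build a dispatcher over a {state: {event: (action, next_state)}} table."""
    state = [start]   # single-element list so the closure can mutate it
    log = []          # records the actions each transition performs
    def dispatch(event):
        action, nxt = table[state[0]][event]
        log.append(action)   # stand-in for the transition's side effect
        state[0] = nxt
        return nxt
    return dispatch, state, log

TABLE = {
    "idle":    {"start_cmd": ("open_valve", "running")},
    "running": {"fault":     ("close_valve", "idle"),
                "stop_cmd":  ("close_valve", "idle")},
}

dispatch, state, log = make_machine(TABLE, "idle")
dispatch("start_cmd")
dispatch("fault")
```

Because dispatch returns immediately after each event, many such machines can share one host processor, which is the property the article attributes to the generated state programs.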

  8. Automated digital magnetofluidics

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, J; Garcia, A A; Marquez, M [Harrington Department of Bioengineering Arizona State University, Tempe AZ 85287-9709 (United States)], E-mail: tony.garcia@asu.edu

    2008-08-15

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller which has a RISC-based, clock multiplying processor with DSP functions, accepting encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands at rates as fast as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  9. Automated digital magnetofluidics

    Science.gov (United States)

    Schneider, J.; Garcia, A. A.; Marquez, M.

    2008-08-01

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller which has a RISC-based, clock multiplying processor with DSP functions, accepting encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands at rates as fast as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  10. Determination of anabolic steroids in human urine by automated in-tube solid-phase microextraction coupled with liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Saito, Keita; Yagi, Katsuharu; Ishizaki, Atsushi; Kataoka, Hiroyuki

    2010-09-05

    A simple, rapid and sensitive method was developed for determining the presence of seven anabolic steroids (boldenone, nandrolone, testosterone, methyltestosterone, epiandrosterone, androsterone, and stanozolol) in human urine. Glucuronide-conjugates of these compounds were hydrolyzed with beta-glucuronidase. The anabolic steroids were analyzed by on-line in-tube solid-phase microextraction (SPME) coupled with liquid chromatography-mass spectrometry (LC-MS). The steroids were separated within 14 min by high performance liquid chromatography using a Chromolith RP-18e column and 5 mM ammonium formate/methanol (35/65, v/v) as a mobile phase at a flow rate of 1.0 mL/min. Electrospray ionization conditions in the positive ion mode were optimized for the MS detection of these compounds. The optimum in-tube SPME conditions were 20 draw/eject cycles with a sample size of 40 microL using a Supel-Q PLOT capillary column for the extraction. The extracted compounds could be desorbed readily from the capillary column by flow of the mobile phase, and no carryover was observed. Using the in-tube SPME LC-MS with SIM mode detection, good linearity of the calibration curve (r>0.995) was obtained in the concentration range of 0.5-20 ng/mL, except for stanozolol. The detection limits (S/N=3) of anabolic steroids were in the range 9-182 pg/mL and the proposed method showed 20-33-fold higher sensitivity than the direct injection method. The within-day and between-day precisions were below 4.0% and 7.3% (n=5), respectively. This method was applied successfully to the analysis of urine samples without the interference peaks. The recovery rates of anabolic steroids spiked into urine samples were above 85%. This method is useful to analyze the urinary levels of these compounds in anti-doping tests. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  12. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  13. Automation and decision support in interactive consumer products.

    OpenAIRE

    Sauer, J.; Rüttinger, B.

    2007-01-01

    This article presents two empirical studies (n=30, n=48) that are concerned with different forms of automation in interactive consumer products. The goal of the studies was to evaluate the effectiveness of two types of automation: perceptual augmentation (i.e. supporting users' action selection and implementation). Furthermore, the effectiveness of non-product information (i.e. labels attached to product) in supporting automation design was evaluated. The findings suggested greater benefits f...

  14. Automated solid-phase extraction-liquid chromatography-tandem mass spectrometry analysis of 6-acetylmorphine in human urine specimens: application for a high-throughput urine analysis laboratory.

    Science.gov (United States)

    Robandt, P V; Bui, H M; Scancella, J M; Klette, K L

    2010-10-01

    An automated solid-phase extraction-liquid chromatography- tandem mass spectrometry (SPE-LC-MS-MS) method using the Spark Holland Symbiosis Pharma SPE-LC coupled to a Waters Quattro Micro MS-MS was developed for the analysis of 6-acetylmorphine (6-AM) in human urine specimens. The method was linear (R² = 0.9983) to 100 ng/mL, with no carryover at 200 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision calculated as percent coefficient of variation (%CV) and evaluated by analyzing five specimens at 10 ng/mL over nine batches (n = 45) was 3.6%. Intrarun precision evaluated from 0 to 100 ng/mL ranged from 1.0 to 4.4%CV. Other opioids (codeine, morphine, oxycodone, oxymorphone, hydromorphone, hydrocodone, and norcodeine) did not interfere in the detection, quantification, or chromatography of 6-AM or the deuterated internal standard. The quantified values for 41 authentic human urine specimens previously found to contain 6-AM by a validated gas chromatography (GC)-MS method were compared to those obtained by the SPE-LC-MS-MS method. The SPE-LC-MS-MS procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. The time required for extraction and analysis was reduced by approximately 50% when compared to a validated 6-AM procedure using manual SPE and GC-MS analysis.
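
The interrun and intrarun precision figures quoted above are coefficient-of-variation calculations over replicate control results. A minimal sketch with made-up replicate values at the 10 ng/mL control level (not the paper's data):

```python
import statistics

def percent_cv(values):
    """Coefficient of variation as a percentage (sample SD / mean * 100)."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# e.g. five replicate 6-AM results (ng/mL) for one 10 ng/mL control
replicates = [9.8, 10.1, 10.4, 9.9, 10.2]
cv = percent_cv(replicates)
```

Pooling such replicates across nine batches (n = 45) is how an interrun %CV like the reported 3.6% is obtained.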

  15. Simultaneous determination of polycyclic aromatic hydrocarbons and their chlorination by-products in drinking water and the coatings of water pipes by automated solid-phase microextraction followed by gas chromatography-mass spectrometry.

    Science.gov (United States)

    Tillner, Jocelyn; Hollard, Caroline; Bach, Cristina; Rosin, Christophe; Munoz, Jean-François; Dauchy, Xavier

    2013-11-08

    In this study, an automated method for the simultaneous determination of polycyclic aromatic hydrocarbons (PAHs) and their chlorination by-products in drinking water was developed based on online solid-phase microextraction-gas chromatography-mass spectrometry. The main focus was the optimisation of the solid-phase microextraction step. The influence of the agitation rate, type of fibre, desorption time, extraction time, extraction temperature, desorption temperature, and solvent addition was examined. The method was developed and validated using a mixture of 17 PAHs, 11 potential chlorination by-products (chlorinated and oxidised PAHs) and 6 deuterated standards. The limit of quantification was 10 ng/L for all target compounds. The validated method was used to analyse drinking water samples from three different drinking water distribution networks and the presumably coal tar-based pipe coatings of two pipe sections. A number of PAHs were detected in all three networks although individual compositions varied. Several PAH chlorination by-products (anthraquinone, fluorenone, cyclopenta[d,e,f]phenanthrenone, 3-chlorofluoranthene, and 1-chloropyrene) were also found, their presence correlating closely with that of their respective parent compounds. Their concentrations were always below 100 ng/L. In the coatings, all PAHs targeted were detected although concentrations varied between the two coatings (76-12,635 mg/kg and 12-6295 mg/kg, respectively). A number of chlorination by-products (anthraquinone, fluorenone, cyclopenta[d,e,f]phenanthrenone, 3-chlorofluoranthene, and 1-chloropyrene) were also detected (from 40 to 985 mg/kg), suggesting that the reaction of PAHs with disinfectant agents takes place in the coatings and not in the water phase after migration. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Determination of the oxidative stress biomarker urinary 8-hydroxy-2'-deoxyguanosine by automated on-line in-tube solid-phase microextraction coupled with liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Kataoka, Hiroyuki; Mizuno, Keisuke; Oda, Eri; Saito, Akihiro

    2016-04-15

    A simple and sensitive method for the determination of 8-hydroxy-2'-deoxyguanosine (8-OHdG), a marker of oxidative DNA damage in human urine, was developed using automated on-line in-tube solid-phase microextraction (SPME) coupled with stable isotope-dilution liquid chromatography-tandem mass spectrometry (LC-MS/MS). Creatinine was also analyzed simultaneously to normalize urine volume by the in-tube SPME LC-MS/MS method, and 8-OHdG and creatinine were separated within 3 min using a Zorbax Eclipse XDB-C8 column. Electrospray MS/MS for these compounds was performed on an API 4000 triple quadruple mass spectrometer in the positive ion mode by multiple reaction monitoring. The optimum in-tube SPME conditions were 20 draw/eject cycles of 40 μL of sample at a flow rate of 200 μL/min using a Carboxen 1006 PLOT capillary column as an extraction device. The extracted compounds were easily desorbed from the capillary by passage of the mobile phase, and no carryover was observed. The calibration curve for 8-OHdG using its stable isotope-labeled internal standard was linear in the range of 0.05-10 ng/mL, and the detection limit was 8.3 pg/mL. The intra-day and inter-day precision (relative standard deviations) were below 3.1% and 9.6% (n=5), respectively. This method was applied successfully to the analysis of urine samples without any other pretreatment and interference peaks, with good recovery rates above 91% in spiked urine samples. The limits of quantification of 8-OHdG and creatinine in 0.1 mL urine samples were about 0.32 and 0.69 ng/mL (S/N=10), respectively. This method was utilized to assess the effects of smoking, green tea drinking and alcohol drinking on the urinary excretion of 8-OHdG. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Minima of the Action Integral of the Newtonian 4-Body Problem with Equal Masses in R^3: 'Hip-Hop' Orbits

    Science.gov (United States)

    Chenciner, Alain; Venturelli, Andrea

    2000-09-01

    We consider the problem of 4 bodies of equal masses in R^3 for the Newtonian 1/r potential. We address the question of the absolute minima of the action integral among (anti)symmetric loops of class H^1 whose period is fixed. It is the simplest case for which the results of [4] (corrected in [5]) do not apply: the minima cannot be the relative equilibria whose configuration is an absolute minimum of the potential among the configurations having a given moment of inertia with respect to their center of mass. This is because the regular tetrahedron cannot have a relative equilibrium motion in R^3 (see [2]). We show that the absolute minima of the action are not homographic motions. We also show that if we force the configuration to admit a certain type of symmetry of order 4, the absolute minimum is a collisionless orbit whose configuration ‘hesitates’ between the central configuration of the square and the one of the tetrahedron. We call these orbits ‘hip-hop’. A similar result holds in case of a symmetry of order 3 where the central configuration of the equilateral triangle with a body at the center of mass replaces the square.
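
The functional being minimized is the standard Lagrangian action of the N-body problem, restricted here to 4 equal masses and T-periodic H^1 loops:

```latex
% Action functional for 4 bodies of equal mass m with the Newtonian 1/r
% potential, minimized over (anti)symmetric T-periodic loops of class H^1:
\mathcal{A}(x) = \int_0^T \left[\,
    \sum_{i=1}^{4} \frac{m}{2}\,\lvert \dot{x}_i(t)\rvert^{2}
    \;+\; \sum_{1 \le i < j \le 4} \frac{G\,m^{2}}{\lvert x_i(t) - x_j(t)\rvert}
  \right] \mathrm{d}t
```

The kinetic term penalizes fast motion while the (negative of the) potential term penalizes close approaches, which is why minimizers can avoid collisions while 'hesitating' between the square and tetrahedral central configurations.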

  18. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  19. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  20. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  1. Automated technological equipment-robot complexes

    International Nuclear Information System (INIS)

    Zhitomirskii, S.V.; Samorodskikh, B.L.

    1984-01-01

    This paper surveys the types of automated technological equipment-robot complexes. The principal elements of such complexes are described. Complexes fall into two principal groups: those using simultaneously acting robots and those using successively acting robots. The wide variety of successively acting robots is then described.

  2. 78 FR 75528 - Federal Government Participation in the Automated Clearing House

    Science.gov (United States)

    2013-12-12

    ... Participation in the Automated Clearing House AGENCY: Bureau of the Fiscal Service, Treasury. ACTION: Notice of... Service (Service) is proposing to amend its regulation governing the use of the Automated Clearing House... Automated Clearing House, Electronic funds transfer, Financial institutions, Fraud, and Incorporation by...

  3. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Full Text Available Automated lines are widely applied in industry, especially for mass production with low product variety. Productivity is one of the most important criteria for an automated line, as well as for industry generally, since it directly determines output and profit. Productivity in industry is forecast using a mathematical model so that customer demand can be met accurately. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this mathematical model of productivity with availability cannot match actual productivity closely enough, because some loss parameters are not considered, the model must be enhanced to include them. This paper presents the investigation of productivity-loss parameters using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for further improving the mathematical model of productivity with availability, toward a robust mathematical model of productivity for automated lines.
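
A productivity-with-availability calculation of the general form discussed above scales the ideal cycle output rate by the line's availability. The sketch below uses a simple serial-line assumption (the line stops whenever any station is down, so station availabilities multiply); the formula and all numbers are illustrative, not the paper's exact model:

```python
# Sketch of productivity scaled by availability for a serial automated line.

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of one station: MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def productivity(cycle_time_s, stations_mtbf_mttr):
    """Parts per hour: ideal rate times the product of station availabilities
    (assumption: any single station failure stops the whole line)."""
    a_line = 1.0
    for mtbf, mttr in stations_mtbf_mttr:
        a_line *= availability(mtbf, mttr)
    return 3600.0 / cycle_time_s * a_line

# 30 s cycle, two stations with hypothetical MTBF/MTTR (hours)
q = productivity(30.0, [(100.0, 2.0), (80.0, 4.0)])
```

Here the ideal 120 parts/hour rate is reduced to roughly 112 parts/hour by the two stations' downtime, which is exactly the kind of gap between modeled and actual productivity the loss-parameter investigation targets.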

  4. Reduced bone mass in obese young rats through PPAR omega suppression of wnt/beta-catenin signaling and direct action of free fatty acids (NEFA)

    Science.gov (United States)

    The relationship of obesity to skeletal development is unclear. We utilized total enteral nutrition to feed high and low fat diets (HFD and LFD) to rats for 4 wks to produce obesity. Weight gain was matched but fat mass, serum leptin and NEFA were increased by HFD (P < 0.05). HFD lowered total bone ...

  5. Differential employment rates in the journalism and mass communication labor force based on gender, race and ethnicity: Exploring the impact of affirmative action

    NARCIS (Netherlands)

    Becker, L.B.; Lauf, E.; Lowrey, W.

    1999-01-01

    This paper examines whether gender, race, and ethnicity are associated with employment in the journalism and mass communication labor market and, if discrepancies in employment exist, what explanations might be offered for them. The data show strong evidence that race and ethnicity are associated with

  6. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  7. MARC and the Library Service Center: Automation at Bargain Rates.

    Science.gov (United States)

    Pearson, Karl M.

    Despite recent research and development in the field of library automation, libraries have been unable to reap the benefits promised by technology due to the high cost of building and maintaining their own computer-based systems. Time-sharing and disc mass storage devices will bring automation costs, if spread over a number of users, within the…

  8. Digital computers in action

    CERN Document Server

    Booth, A D

    1965-01-01

    Digital Computers in Action is an introduction to the basics of digital computers as well as their programming and various applications in fields such as mathematics, science, engineering, economics, medicine, and law. Other topics include engineering automation, process control, special purpose games-playing devices, machine translation and mechanized linguistics, and information retrieval. This book consists of 14 chapters and begins by discussing the history of computers, from the idea of performing complex arithmetical calculations to the emergence of a modern view of the structure of a ge

  9. Application of ultraperformance liquid chromatography/mass spectrometry-based metabonomic techniques to analyze the joint toxic action of long-term low-level exposure to a mixture of organophosphate pesticides on rat urine profile.

    Science.gov (United States)

    Du, Longfei; Wang, Hong; Xu, Wei; Zeng, Yan; Hou, Yurong; Zhang, Yuqiu; Zhao, Xiujuan; Sun, Changhao

    2013-07-01

    In previously published articles, we evaluated the toxicity of four organophosphate (OP) pesticides (dichlorvos, dimethoate, acephate, and phorate) to rats using metabonomic technology at their corresponding no-observed-adverse-effect levels (NOAELs). The results showed that a single pesticide elicits no toxic response. This study aimed to determine whether chronic exposure to a mixture of the above four pesticides, each at its corresponding NOAEL, can lead to joint toxic action in rats using the same technology. The pesticides were administered daily to rats through drinking water for 24 weeks, and the metabonomic profiles of rat urine were analyzed by ultraperformance liquid chromatography/mass spectrometry. The mixture of the four pesticides showed joint toxic action at the NOAEL of each pesticide: 16 metabolites changed statistically significantly in all treated groups compared with the control group. Dimethylphosphate and dimethyldithiophosphate, detected exclusively in all treated groups, can be used as early, sensitive biomarkers of exposure to a mixture of the OP pesticides. Moreover, exposure to the OP pesticides resulted in increased 7-methylguanine, ribothymidine, cholic acid, 4-pyridoxic acid, kynurenine, and indoxyl sulfate levels, as well as decreased hippuric acid, creatinine, uric acid, gentisic acid, C18-dihydrosphingosine, phytosphingosine, suberic acid, and citric acid. The results indicated that the mixture of OP pesticides induced DNA damage and oxidative stress, disturbed lipid metabolism, and interfered with the tricarboxylic acid cycle. Ensuring food safety therefore requires not only the toxicology test data of each pesticide for calculating the acceptable daily intake but also consideration of joint toxic action.

  10. Effective action and brane running

    International Nuclear Information System (INIS)

    Brevik, Iver; Ghoroku, Kazuo; Yahiro, Masanobu

    2004-01-01

    We address the renormalized effective action for a Randall-Sundrum brane running in 5D bulk space. The running behavior of the brane action is obtained by shifting the brane position without changing the background and fluctuations. After an appropriate renormalization, we obtain an effective, low energy brane world action, in which the effective 4D Planck mass is independent of the running position. We address some implications for this effective action

  11. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  12. Development of design principles for automated systems in transport control.

    Science.gov (United States)

    Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa

    2012-01-01

    This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.

  13. Power, speed & automation with Adobe Photoshop

    CERN Document Server

    Scott, Geoff

    2012-01-01

    This is a must for the serious Photoshop user! Power, Speed & Automation explores how to customize and automate Photoshop to increase your speed and productivity. With numerous step-by-step instructions, the authors (two of Adobe's own software developers!) walk you through the steps to best tailor Photoshop's interface to your personal workflow; write and apply Actions; and use batching and scripts to process large numbers of images quickly and automatically. You will learn how to build your own dialogs and panels to improve your production workflows in Photoshop, the secrets of changing

  14. Automation warning system against driver falling asleep in-traffic

    Directory of Open Access Journals (Sweden)

    Dymov I. S.

    2017-12-01

    Full Text Available The paper is devoted to the development of a new automated system for recognizing and warning against a driver falling asleep in traffic. Monitoring the physical condition of professional drivers en route is considered both in terms of the efficiency and quality of its determination and in terms of improving overall road safety. Existing, widely used devices for detecting the transition to sleep in drivers are analyzed and their advantages and disadvantages identified. The main negative factor preventing mass adoption of earlier warning systems is found to be the need to put on a monitoring device of one kind or another before starting to drive. The project research carried out here proposes combined monitoring of the driver's physical and physiological condition as a new method of warning against falling asleep in traffic. The proposed algorithmic variants can be used in long-haul trucks and passenger vehicles. Two versions of automatic monitoring of the driver's physical condition are considered. The first uses biometric sensors for pulse, body temperature, and hands-on-wheel pressure; the second uses tracking cameras. In both versions the monitoring devices are installed inside the vehicle and exert no physical, and hence no irritating, action on the driver. A software approach to rejecting false triggering of the devices has been developed. The paper presents flow diagrams of the automatic systems and the logical structure of analysis and decision-making, proposes a set of stimuli intended to awaken the driver, and concludes that the projected automation systems are promising from an engineering standpoint.
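    As a hedged illustration of the first, sensor-based variant described above, the decision logic might combine pulse, body-temperature, and wheel-pressure readings and alert when several independent indicators agree. All function names and thresholds below are hypothetical assumptions for illustration, not the paper's actual design.

```python
# Hypothetical decision logic for a driver-drowsiness warning system.
# Thresholds and sensor names are illustrative assumptions, not the
# paper's actual design.

def drowsiness_alert(pulse_bpm, body_temp_c, wheel_pressure_n,
                     pulse_floor=55, temp_floor=36.0, pressure_floor=5.0):
    """Return True when enough independent indicators suggest sleep onset.

    Requiring two of three indicators is a simple way to reject the
    false positives of any single sensor (the "false operation
    rejection" the abstract mentions).
    """
    indicators = [
        pulse_bpm < pulse_floor,            # heart rate drops as the driver dozes
        body_temp_c < temp_floor,           # peripheral temperature falls
        wheel_pressure_n < pressure_floor,  # grip on the wheel relaxes
    ]
    return sum(indicators) >= 2

print(drowsiness_alert(50, 35.8, 12.0))  # two indicators agree -> True
print(drowsiness_alert(72, 36.6, 12.0))  # no indicators       -> False
```

    A camera-based variant would replace the sensor predicates with outputs of an eye-closure or head-pose classifier, but the two-of-three voting structure could stay the same.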

  15. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  16. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones using a robotic system designed for procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations, together with a robot that transports source plates between stations on the automation system. Protocols for the automated generation of bacterial expression clones fall into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass the protocols involved in generating purified coding-region fragments by PCR

  18. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  19. Automation in the clinical microbiology laboratory.

    Science.gov (United States)

    Novak, Susan M; Marlowe, Elizabeth M

    2013-09-01

    Imagine a clinical microbiology laboratory where a patient's specimens are placed on a conveyor belt and sent down an automation line for processing and plating. Technologists need only log onto a computer to view images of a culture and send a colony to a mass spectrometer for identification. Once a pathogen is identified, the system knows to send the colony for susceptibility testing. This is the future of the clinical microbiology laboratory. This article outlines the operational and staffing challenges facing clinical microbiology laboratories and the evolution of automation that is shaping the way laboratory medicine will be practiced in the future. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Automated remedial assessment methodology software system

    International Nuclear Information System (INIS)

    Whiting, M.; Wilkins, M.; Stiles, D.

    1994-11-01

    The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies
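    As a hedged illustration of the mass-balance module mentioned above, a treatment train can be modelled as each technology removing a fraction of the incoming contaminant mass and routing it to a secondary-waste stream that needs further treatment. The function name and efficiencies below are invented for illustration; they are not ARAM's actual data or code.

```python
# Illustrative contaminant mass balance for a chain of treatment steps.
# Each step removes a fraction of the incoming mass and routes it to a
# secondary-waste stream requiring further treatment. The efficiencies
# are invented, not ARAM's data.

def apply_treatment_train(mass_in_kg, removal_fractions):
    """Return (residual mass, list of secondary-waste masses per step)."""
    residual = mass_in_kg
    secondary_waste = []
    for frac in removal_fractions:
        removed = residual * frac
        secondary_waste.append(removed)   # this mass needs further treatment
        residual -= removed               # this mass passes to the next step
    return residual, secondary_waste

# 100 kg of contaminant through a 90%-efficient step, then a 50% step.
residual, waste = apply_treatment_train(100.0, [0.90, 0.50])
print(residual)    # 100 * 0.1 * 0.5 = 5.0 kg remaining
print(sum(waste))  # 95.0 kg routed to secondary treatment
```

    Cost and labor modules would then be driven by the per-step masses, since treating the secondary-waste streams is itself a cost item.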

  1. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  2. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  3. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  4. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. The reasons range from poor planning to cost overruns in the process. This article describes some principles that can guide teams in automating these tests.

  5. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  6. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  7. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses

  8. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test-tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and archiving of results, are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process.

  9. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  10. If this then that: an introduction to automated task services.

    Science.gov (United States)

    Hoy, Matthew B

    2015-01-01

    This article explores automated task services, a type of website that allows users to create rules that are triggered by activity on one website and perform a task on another site. The most well-known automated task service is If This Then That (IFTTT), but recently a large number of these services have sprung up. These services can be used to connect websites, apps, business services, and even devices such as phones and home automation equipment. This allows for millions of possible combinations of rules, triggers, and actions. Librarians can put these services to use in many ways, from automating social media postings to remembering to bring their umbrella when rain is in the forecast. A list of popular automated task services is included, as well as a number of ideas for using these services in libraries.
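    The trigger-action rules these services offer can be sketched in a few lines. The rule structure below is a generic illustration of the pattern, not IFTTT's actual API; the event fields and actions are invented.

```python
# Minimal sketch of an if-this-then-that rule engine. Event fields and
# actions are illustrative; real services wire these to web APIs such
# as weather feeds, RSS, and social media.

class Rule:
    def __init__(self, trigger, action):
        self.trigger = trigger  # predicate over an event dict ("if this")
        self.action = action    # callable run when the trigger fires ("then that")

def run_rules(rules, event, log):
    for rule in rules:
        if rule.trigger(event):
            rule.action(event, log)

log = []
rules = [
    # Librarian examples from the article: weather reminder, social posting.
    Rule(trigger=lambda e: e.get("forecast") == "rain",
         action=lambda e, log: log.append("Reminder: bring your umbrella")),
    Rule(trigger=lambda e: e.get("new_blog_post"),
         action=lambda e, log: log.append(f"Tweet: {e['new_blog_post']}")),
]

run_rules(rules, {"forecast": "rain"}, log)
run_rules(rules, {"new_blog_post": "Library hours update"}, log)
print(log)
```

    Each additional trigger or action multiplies the number of possible rules, which is why these services advertise millions of combinations.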

  11. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via the Internet. The system was used for 14C, 10Be, 26Al and 129I measurements
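    The abstract does not describe the tuning algorithm in detail. One generic way to make a search robust to noisy readings, shown here purely as an illustration, is to average repeated measurements at each candidate setting before comparing, as in this coordinate-search sketch. The objective function, step sizes, and noise level are invented stand-ins, not VERA's actual tuning code.

```python
import random

# Illustrative noise-robust coordinate search: average several noisy
# measurements per candidate point before comparing. The objective is
# a stand-in for a transmitted-beam-current reading.

def noisy_measure(f, x, repeats=25):
    # Averaging repeats reduces the noise standard deviation by sqrt(repeats).
    return sum(f(x) + random.gauss(0, 0.05) for _ in range(repeats)) / repeats

def coordinate_search(f, x0, step=0.5, shrink=0.5, iters=40):
    x = list(x0)
    for _ in range(iters):
        improved = False
        best = noisy_measure(f, x)
        for i in range(len(x)):
            for delta in (step, -step):
                cand = list(x)
                cand[i] += delta
                if noisy_measure(f, cand) > best:   # maximise the signal
                    x, best, improved = cand, noisy_measure(f, cand), True
        if not improved:
            step *= shrink                          # refine once no move helps
    return x

# Stand-in objective with its peak at (1, -2).
f = lambda x: -((x[0] - 1) ** 2 + (x[1] + 2) ** 2)
random.seed(0)
opt = coordinate_search(f, [0.0, 0.0])
print(opt)  # close to [1, -2]
```

    Real beamline tuning would replace `f` with a hardware readback and add limits on magnet and lens set-points, but the average-then-compare structure is the part that confers noise robustness.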

  12. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, the availability of the requisite expertise, and timeliness are often cited as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of the systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
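    One of the tasks named above, meta-analysis calculation, is concrete enough to sketch. Below is a standard fixed-effect inverse-variance meta-analysis in plain Python; the study effects and variances are invented numbers for illustration, not data from any review.

```python
import math

# Fixed-effect inverse-variance meta-analysis: each study contributes
# an effect estimate weighted by the inverse of its variance. The
# study values below are invented for illustration.

def fixed_effect_meta(effects, variances):
    """Return (pooled effect, standard error, 95% confidence interval)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))          # SE of the pooled estimate
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, se, ci

effects = [0.30, 0.45, 0.25]    # e.g. log risk ratios from three trials
variances = [0.04, 0.09, 0.02]  # their sampling variances
pooled, se, ci = fixed_effect_meta(effects, variances)
print(round(pooled, 3), round(se, 3))
```

    A real pipeline of the kind envisaged would feed this step from automated data extraction and would typically also offer random-effects models to account for between-study heterogeneity.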

  13. Negotiating action

    Science.gov (United States)

    2017-12-01

    After years of working towards a climate accord, the Paris Agreement of 2015 marked the shift from negotiating to reach consensus on climate action to implementation of such action. The challenge now is to ensure transparency in the processes and identify the details of what is required.

  14. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1994-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October - December 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  15. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-11-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July - September 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  16. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-05-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1990) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. Also included are a number of enforcement actions that had been previously resolved but not published in this NUREG. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  17. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1989-06-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1989) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. Also included are a number of enforcement actions that had been previously resolved but not published in this NUREG. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  18. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes that often interact across several branched areas. This reduces clarity for the operating and maintenance staff, while increasing the possibility of errors. The synthesis and organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only be fully and correctly exploited, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio to a definite efficiency. (orig.) [de

  19. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australian antigen by radioimmunoassay were discussed. Samples were processed as follows: blood serum was dispensed by an automated sampler into test tubes and incubated under controlled time and temperature; the first counting was omitted; labelled antibody was dispensed to the serum after washing; samples were incubated and then centrifuged; radioactivities in the precipitate were counted by an auto-well counter; and measurements were tabulated by an automated typewriter. Not only the well-type counter but also a position counter was studied. (Kanao, N.)

  20. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  1. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes for everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure. The book first shows you the simplest way to achieve a given task, then explains every step in detail so that you can build your knowledge of how things work. Finally, it shows you additional things to consider for each approach. That way, you can learn step by step and build profound knowledge of how to go about your configuration management.

  2. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented, and some of the reasons for the success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  3. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Development of an automated PCB inspection system that meets the needs of industry is a challenging task. In this paper a case study is presented of a proposed system for migrating from a manual PCB inspection process to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision and followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.

  4. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes that often interact across several branched areas. This reduces clarity for the operating and maintenance staff, while increasing the possibility of errors. The synthesis and organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only be fully and correctly exploited, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio to a definite efficiency. (orig.) [de

  5. Critical Mass

    CERN Multimedia

    AUTHOR|(CDS)2070299

    2017-01-01

    Critical Mass is a cycling event typically held on the last Friday of every month; its purpose is not usually formalized beyond the direct action of meeting at a set location and time and traveling as a group through city or town streets on bikes. The event originated in 1992 in San Francisco; by the end of 2003, it was being held in over 300 cities around the world. At CERN it is held once a year in conjunction with the national Swiss campaign "Bike to work".

  6. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Contemporary organizations face big data security challenges in the cyber environment due to modern threats and a current business model that relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Today's data classification and data obfuscation selection processes (encryption, masking, or tokenization) suffer from the human involvement in the process. Organizations need to shrink the data security domain by classifying information based on its importance, conducting risk assessments, and using the most cost-effective data obfuscation technique. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system has the capability to learn from previous experience, thus improving itself and reducing the risk of wrong data classification.
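    The classify-then-obfuscate pipeline described in this abstract can be sketched as follows. This is a minimal illustrative toy: the keyword rules stand in for the paper's NLP classifier, and the sensitivity labels and technique mapping are invented for the example, not taken from the paper.

    ```python
    import re

    # Hypothetical keyword rules standing in for the paper's NLP classifier.
    RULES = {
        "restricted": [r"\bpassword\b", r"\bssn\b", r"\bcredit card\b"],
        "confidential": [r"\bsalary\b", r"\bcontract\b"],
    }

    # Mapping from sensitivity class to obfuscation technique, mirroring the
    # encryption/masking/tokenization choice named in the abstract.
    TECHNIQUE = {
        "restricted": "encryption",
        "confidential": "tokenization",
        "internal": "masking",
    }

    def classify(text: str) -> str:
        """Return the highest sensitivity class whose patterns match."""
        lowered = text.lower()
        for label, patterns in RULES.items():
            if any(re.search(p, lowered) for p in patterns):
                return label
        return "internal"

    def protect(text: str) -> tuple[str, str]:
        """Classify the text and pick the corresponding obfuscation technique."""
        label = classify(text)
        return label, TECHNIQUE[label]

    label, technique = protect("Employee SSN and password list")
    # ('restricted', 'encryption')
    ```

    A learning component, as proposed in the paper, would replace the fixed `RULES` table with a trained text classifier updated from feedback on past decisions.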

  7. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  8. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last 48...

  9. Automated External Defibrillator

    Science.gov (United States)

    ... leads to a 10 percent reduction in survival. Training To Use an Automated External Defibrillator Learning how to use an AED and taking a CPR (cardiopulmonary resuscitation) course are helpful. However, if trained ...

  10. Planning for Office Automation.

    Science.gov (United States)

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  11. Fixed automated spray technology.

    Science.gov (United States)

    2011-04-19

    This research project evaluated the construction and performance of Boschung's Fixed Automated Spray Technology (FAST) system. The FAST system automatically sprays de-icing material on the bridge when icing conditions are about to occur. The FA...

  12. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  13. Automation Interface Design Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Our research makes its contributions at two levels. At one level, we addressed the problems of interaction between humans and computers/automation in a particular...

  14. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  15. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues, and the careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on regulations. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches to 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered

  16. Effects of Automation Types on Air Traffic Controller Situation Awareness and Performance

    Science.gov (United States)

    Sethumadhavan, A.

    2009-01-01

    The Joint Planning and Development Office has proposed the introduction of automated systems to help air traffic controllers handle the increasing volume of air traffic in the next two decades (JPDO, 2007). Because fully automated systems leave operators out of the decision-making loop (e.g., Billings, 1991), it is important to determine the right level and type of automation that will keep air traffic controllers in the loop. This study examined the differences in the situation awareness (SA) and collision detection performance of individuals when they worked with information acquisition, information analysis, decision and action selection, and action implementation automation to control air traffic (Parasuraman, Sheridan, & Wickens, 2000). When the automation was unreliable, the time taken to detect an upcoming collision was significantly longer for all the automation types compared with the information acquisition automation. This poor performance following automation failure was mediated by SA, with lower SA yielding poorer performance. Thus, the costs associated with automation failure are greater when automation is applied to higher-order stages of information processing. The results have practical implications for automation design and the development of SA training programs.

  17. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  18. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  19. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
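    The dependency-driven workflow idea behind such a manager can be sketched as a toy scheduler that runs tasks in dependency order. This is an illustrative sketch only; the class names and structure are invented for the example and are not the actual Taxi API.

    ```python
    # Toy workflow manager: run tasks only after their dependencies complete,
    # in the spirit of generate-then-measure lattice pipelines.
    class Task:
        def __init__(self, name, action, depends_on=()):
            self.name = name
            self.action = action
            self.depends_on = list(depends_on)
            self.done = False

    def run(tasks):
        """Execute tasks in dependency order via a simple topological pass."""
        completed = []
        pending = list(tasks)
        while pending:
            progressed = False
            for task in list(pending):
                if all(dep.done for dep in task.depends_on):
                    task.action()
                    task.done = True
                    completed.append(task.name)
                    pending.remove(task)
                    progressed = True
            if not progressed:
                raise RuntimeError("cyclic or unsatisfiable dependencies")
        return completed

    log = []
    gen = Task("generate_config", lambda: log.append("gauge config"))
    meas = Task("measure_observable", lambda: log.append("plaquette"),
                depends_on=[gen])
    order = run([meas, gen])
    # order == ['generate_config', 'measure_observable']
    ```

    A production manager additionally persists task state, so that a crashed or preempted pipeline can resume without regenerating finished ensembles.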

  20. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.

  1. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Management

  2. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades have been a time of major changes in marketing. Digitalization has become a permanent part of marketing and has at the same time enabled efficient collection of data. Personalization and customization of content play a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation, more information about the customers is gathered ...

  3. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.

  4. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  5. Shielded cells transfer automation

    International Nuclear Information System (INIS)

    Fisher, J.J.

    1984-01-01

    Nuclear waste from shielded cells is removed, packaged, and transferred manually in many nuclear facilities. Radiation exposure is absorbed by operators during these operations and limited only through procedural controls. Technological advances in automation using robotics have allowed a production waste removal operation to be automated to reduce radiation exposure. The robotic system bags waste containers out of a glove box and transfers them to a shielded container. Operators control the system from outside the work area via television cameras. 9 figures

  6. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
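    The pattern-scanning core of such a monitor can be sketched in a few lines. This is an illustrative Python analogue of the NAWK approach described in the abstract; the log lines and alert patterns are hypothetical, since the actual signatures are not given.

    ```python
    import re

    # Hypothetical signatures standing in for the switch-log patterns the
    # NAWK scripts watched for; the real patterns are not published here.
    ALERT_PATTERNS = [
        (re.compile(r"failed login from (\S+)"), "possible intrusion"),
        (re.compile(r"trunk (\S+) down"), "hardware fault"),
    ]

    def scan(lines):
        """Scan a stream of log lines, yielding (line, alert) for each match."""
        for line in lines:
            for pattern, alert in ALERT_PATTERNS:
                if pattern.search(line):
                    yield line, alert

    log = [
        "12:01 call completed normally",
        "12:02 failed login from 10.0.0.9",
    ]
    alerts = list(scan(log))
    # [('12:02 failed login from 10.0.0.9', 'possible intrusion')]
    ```

    In a continuous 24/7 deployment, `scan` would consume a live stream (e.g. a tailed log file) and each yielded alert would trigger a notification instead of being collected in a list.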

  7. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension promises to reduce the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application
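    The daughter-inclusive screening idea can be sketched as a sum-of-fractions check, where a parent's contribution is augmented by its significant daughters. All nuclide names, screening levels, and ingrowth fractions below are hypothetical placeholders for illustration, not NCRP values or the report's actual method.

    ```python
    # Hypothetical screening data: levels (e.g. pCi/L) and daughter ingrowth
    # fractions; placeholders only, not values from NCRP or this report.
    SCREENING_LEVEL = {"parent": 5.0, "daughter": 0.5}
    INGROWTH = {"parent": [("daughter", 0.9)]}

    def screening_ratio(nuclide, concentration):
        """Sum-of-fractions ratio including daughters; > 1.0 fails screening."""
        ratio = concentration / SCREENING_LEVEL[nuclide]
        for child, fraction in INGROWTH.get(nuclide, []):
            # Daughter activity is taken as a fixed fraction of the parent's.
            ratio += concentration * fraction / SCREENING_LEVEL[child]
        return ratio

    passes = screening_ratio("parent", 0.4) <= 1.0
    ```

    With these placeholder numbers the ratio is 0.4/5.0 + 0.4·0.9/0.5 = 0.8, so the nuclide passes; note that without the daughter term it would have screened out at a much higher concentration, which is the effect the report's extension aims to capture realistically.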

  8. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-09-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April--June 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  9. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-05-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  10. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-02-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1990) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  11. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1989) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  12. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-11-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1990) and includes copies of letters, notices, and orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  13. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-08-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April--June 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  14. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-09-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April--June 1990) and includes copies of letters, notices, and orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  15. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-06-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  16. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-05-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  17. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-12-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  18. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  19. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-07-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April--June 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  20. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-11-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  1. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  2. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1989-12-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1989) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  3. Validation of a fully automated solid‐phase extraction and ultra‐high‐performance liquid chromatography–tandem mass spectrometry method for quantification of 30 pharmaceuticals and metabolites in post‐mortem blood and brain samples

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Nedahl, Michael; Johansen, Sys Stybe

    2018-01-01

    In this study, we present the validation of an analytical method capable of quantifying 30 commonly encountered pharmaceuticals and metabolites in whole blood and brain tissue from forensic cases. Solid‐phase extraction was performed by a fully automated robotic system, thereby minimising manual...... labour and human error while increasing sample throughput, robustness, and traceability. The method was validated in blood in terms of selectivity, linear range, matrix effect, extraction recovery, process efficiency, carry‐over, stability, precision, and accuracy. Deuterated analogues of each analyte....../kg. Thus, the linear range covered both therapeutic and toxic levels. The method showed acceptable accuracy and precision, with accuracies ranging from 80 to 118% and precision below 19% for the majority of the analytes. Linear range, matrix effect, extraction recovery, process efficiency, precision...

  4. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck

    2013-01-01

    , and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operation for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase...... extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C18 column using a 6.5 min 0.1 % ammonia (25...

  5. Automation of a thermogravimetric equipment

    International Nuclear Information System (INIS)

    Mussio, L.; Castiglioni, J.; Diano, W.

    1987-01-01

A low-cost automation of some instruments by means of simple electronic circuits and an Apple IIe-type microcomputer is discussed. The electronic circuits described are: a) a position detector including phototransistors connected as a differential amplifier; b) a current source that, using the error signal of the position detector, changes the current through the coil of an electromagnetic balance to restore its zero position; c) a proportional temperature controller, zero-volt switching, to drive a furnace to a desired temperature; d) an interface between the temperature regulator and the microcomputer to control the temperature regulator by software; e) a multiplexer for an analog input of a commercial interface. These circuits are applied in a thermogravimetric equipment used also for vapour adsorption. A program in block diagram form is included and is able to record changes in mass, time and furnace temperature, and to drive the temperature regulator in order to obtain the heating rates or the temperature plateaux needed for the experiment. (author) [pt
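The proportional control scheme in item (c) can be sketched as a simple feedback loop. The gains, the first-order furnace model, and all names below are illustrative assumptions for this note, not the equipment's actual circuit or firmware:

```python
# Sketch of proportional temperature control: heating power is proportional
# to the setpoint error, and a crude first-order furnace model responds to it.
# All constants here are made up for illustration.

def proportional_step(temp, setpoint, kp=0.05):
    """Heating power proportional to the setpoint error, clamped to [0, 1]."""
    error = setpoint - temp
    return max(0.0, min(1.0, kp * error))

def simulate_furnace(setpoint, steps=600, temp=20.0, gain=2.0, loss=0.002):
    """Crude furnace model: power heats the furnace, the room cools it."""
    for _ in range(steps):
        power = proportional_step(temp, setpoint)
        temp += gain * power - loss * (temp - 20.0)
    return temp
```

Note that proportional-only control settles slightly below the setpoint (a steady-state offset), which is why practical regulators usually add integral action on top of the proportional term.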

  6. Single-run determination of polybrominated diphenyl ethers (PBDEs) di- to deca-brominated in fish meal, fish oil and fish feed by isotope dilution: Application of automated sample purification and gas chromatography/ion trap tandem mass spectrometry (GC/ITMS)

    International Nuclear Information System (INIS)

    Blanco, Sonia Lucia; Vieites, Juan M.

    2010-01-01

The present paper describes the application of automated cleanup and fractionation procedures of the Power Prep system (Fluid Management Systems) for the determination of polybrominated diphenyl ethers (PBDEs) in feeding stuffs and fish meal and oil. Gas chromatography (GC) separation followed by ion trap tandem mass spectrometry detection in EI mode (ITMS) allowed the analysis of di- to deca-BDEs in the sample matrices used in fish aquaculture. The method developed enabled the determination of 26 native PBDE congeners and 11 ¹³C₁₂-labelled congeners, including deca-BDE 209, in a single-run analysis, using isotope dilution. The automated cleanup, consisting of a succession of multilayer silica and basic alumina columns previously applied by Wyrzykowska et al. (2009) in combustion flue gas, was successfully applied in our complex matrices. The method allowed an increase in productivity, i.e. less time was required to process samples, and simultaneous purification of several samples was achieved at a time, reducing analyst dedication and human error input. Average recoveries of 43-96% were obtained. GC/ITMS can overcome the complexity originating from the sample matrix, eliminating matrix effects by tandem MS, to enable the detection of congeners penta- to nona-BDEs where interferent masses were present. The provisional detection limits, estimated in the samples, were 5-30 pg for di-, tri-, tetra-, and penta-BDEs, 20-65 pg for hexa-, hepta-, octa- and nona-BDEs, and 105 pg for deca-BDE. Reduction of deca-BDE 209 blank values is of concern to ongoing research. Good accuracy was obtained by application of the whole procedure, representing an efficient, low-cost and fast alternative for routine analyses.

  7. Single-run determination of polybrominated diphenyl ethers (PBDEs) di- to deca-brominated in fish meal, fish oil and fish feed by isotope dilution: application of automated sample purification and gas chromatography/ion trap tandem mass spectrometry (GC/ITMS).

    Science.gov (United States)

    Blanco, Sonia Lucía; Vieites, Juan M

    2010-07-05

The present paper describes the application of automated cleanup and fractionation procedures of the Power Prep system (Fluid Management Systems) for the determination of polybrominated diphenyl ethers (PBDEs) in feeding stuffs and fish meal and oil. Gas chromatography (GC) separation followed by ion trap tandem mass spectrometry detection in EI mode (ITMS) allowed the analysis of di- to deca-BDEs in the sample matrices used in fish aquaculture. The method developed enabled the determination of 26 native PBDE congeners and 11 (13)C(12)-labelled congeners, including deca-BDE 209, in a single-run analysis, using isotope dilution. The automated cleanup, consisting of a succession of multilayer silica and basic alumina columns previously applied by Wyrzykowska et al. (2009) [28] in combustion flue gas, was successfully applied in our complex matrices. The method allowed an increase in productivity, i.e. less time was required to process samples, and simultaneous purification of several samples was achieved at a time, reducing analyst dedication and human error input. Average recoveries of 43-96% were obtained. GC/ITMS can overcome the complexity originating from the sample matrix, eliminating matrix effects by tandem MS, to enable the detection of congeners penta- to nona-BDEs where interferent masses were present. The provisional detection limits, estimated in the samples, were 5-30 pg for di-, tri-, tetra-, and penta-BDEs, 20-65 pg for hexa-, hepta-, octa- and nona-BDEs, and 105 pg for deca-BDE. Reduction of deca-BDE 209 blank values is of concern to ongoing research. Good accuracy was obtained by application of the whole procedure, representing an efficient, low-cost and fast alternative for routine analyses. Copyright 2010 Elsevier B.V. All rights reserved.

  8. 77 FR 63771 - Implementation of Full-Service Intelligent Mail Requirements for Automation Prices

    Science.gov (United States)

    2012-10-17

    ... large amount of additional data. Also, it requires each of my customers to have their own CRID. Today... Automation Prices AGENCY: Postal Service TM . ACTION: Proposed rule. SUMMARY: The Postal Service is proposing...]) throughout various sections to modify eligibility requirements for mailers to obtain automation prices for...

  9. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Science.gov (United States)

    2013-01-11

    ... [Docket No. APHIS-2012-0041] Notification of Deletion of a System of Records; Automated Trust Funds Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security...

  10. Using Automated Planning for Traffic Signals Control

    Directory of Open Access Journals (Sweden)

    Matija Gulić

    2016-08-01

Full Text Available Solving traffic congestion represents a high-priority issue in many big cities. Traditional traffic control systems are mainly based on pre-programmed, reactive and local techniques. This paper presents an autonomic system that uses automated planning techniques instead. These techniques are easily configurable and modifiable, and can reason about the future implications of actions that change the default traffic-light behaviour. The proposed implemented system includes some autonomic properties, since it monitors the current traffic state, detects if the system is degrading its performance, sets up new sets of goals to be achieved by the planner, triggers the planner that generates plans with control actions, and executes the selected courses of action. The results obtained in several artificial and real-world data-based simulation scenarios show that the proposed system can efficiently solve traffic congestion.
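The monitor-detect-plan-execute cycle this abstract describes can be sketched as a toy loop. The degradation test and the proportional green-split "planner" below are illustrative stand-ins, not the paper's actual planner:

```python
# Toy autonomic traffic-control cycle: monitor queue lengths, detect
# degraded performance, and only then replan the green-time allocation.
# Thresholds and the planner heuristic are made-up illustrations.

def detect_degradation(queue_lengths, threshold=10):
    """Degraded if the mean queue across approaches exceeds the threshold."""
    return sum(queue_lengths) / len(queue_lengths) > threshold

def plan_green_times(queue_lengths, cycle=60):
    """Stand-in planner: split the signal cycle proportionally to queues."""
    total = sum(queue_lengths) or 1
    return [round(cycle * q / total) for q in queue_lengths]

def control_step(queue_lengths, default=(30, 30)):
    """One cycle: keep the default plan unless performance has degraded."""
    if detect_degradation(queue_lengths):
        return plan_green_times(queue_lengths)
    return list(default)
```

The point of the structure, as in the paper's system, is that planning is triggered only when monitoring detects degradation, rather than on every cycle.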

  11. Working together on automated vehicle guidance AVG : preliminary business plan, abridged version.

    NARCIS (Netherlands)

    Awareness (ed.)

    1998-01-01

This plan describes the questions which will have to be answered in the short term, and the actions which need to be taken in a phased and structured manner to gain insight into the potential of automated vehicle guidance (AVG).

  12. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

Full Text Available The article focuses on the possibility of automation of taxiing, which is the part of a flight which, under adverse weather conditions, greatly reduces the operational usability of an airport, and is the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on information from visual perception. The article primarily deals with possible ways of obtaining navigational information and its automatic transfer to the controls. Analyzed and assessed were currently available technologies such as computer vision, Light Detection and Ranging, and Global Navigation Satellite Systems, which are useful for navigation, and their general implementation into an airplane was designed. Obstacles to the implementation were identified, too. The result is a proposed combination of systems along with their installation into the airplane’s systems so that it is possible to use automated taxiing.

  13. ALLocator: an interactive web platform for the analysis of metabolomic LC-ESI-MS datasets, enabling semi-automated, user-revised compound annotation and mass isotopomer ratio analysis.

    Science.gov (United States)

    Kessler, Nikolas; Walter, Frederik; Persicke, Marcus; Albaum, Stefan P; Kalinowski, Jörn; Goesmann, Alexander; Niehaus, Karsten; Nattkemper, Tim W

    2014-01-01

    Adduct formation, fragmentation events and matrix effects impose special challenges to the identification and quantitation of metabolites in LC-ESI-MS datasets. An important step in compound identification is the deconvolution of mass signals. During this processing step, peaks representing adducts, fragments, and isotopologues of the same analyte are allocated to a distinct group, in order to separate peaks from coeluting compounds. From these peak groups, neutral masses and pseudo spectra are derived and used for metabolite identification via mass decomposition and database matching. Quantitation of metabolites is hampered by matrix effects and nonlinear responses in LC-ESI-MS measurements. A common approach to correct for these effects is the addition of a U-13C-labeled internal standard and the calculation of mass isotopomer ratios for each metabolite. Here we present a new web-platform for the analysis of LC-ESI-MS experiments. ALLocator covers the workflow from raw data processing to metabolite identification and mass isotopomer ratio analysis. The integrated processing pipeline for spectra deconvolution "ALLocatorSD" generates pseudo spectra and automatically identifies peaks emerging from the U-13C-labeled internal standard. Information from the latter improves mass decomposition and annotation of neutral losses. ALLocator provides an interactive and dynamic interface to explore and enhance the results in depth. Pseudo spectra of identified metabolites can be stored in user- and method-specific reference lists that can be applied on succeeding datasets. The potential of the software is exemplified in an experiment, in which abundance fold-changes of metabolites of the l-arginine biosynthesis in C. glutamicum type strain ATCC 13032 and l-arginine producing strain ATCC 21831 are compared. Furthermore, the capability for detection and annotation of uncommon large neutral losses is shown by the identification of (γ-)glutamyl dipeptides in the same strains
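The mass isotopomer ratio correction described above can be sketched roughly: for each metabolite, the intensity of the unlabeled peak is divided by that of the co-eluting, fully U-13C-labelled internal-standard peak. This is an illustrative calculation with assumed peak-matching logic, not ALLocator's implementation:

```python
# Sketch of a mass isotopomer ratio: I(12C analyte) / I(U-13C standard).
# Peak matching by m/z tolerance; all names and values are illustrative.

C13_C12_MASS_DIFF = 1.00336  # mass increase per 13C substitution

def isotopomer_ratio(peaks, mz_unlabeled, n_carbons, tol=0.01):
    """peaks: list of (mz, intensity) tuples. Returns I(12C) / I(U-13C)."""
    mz_labeled = mz_unlabeled + n_carbons * C13_C12_MASS_DIFF

    def intensity_at(target):
        # Sum all peak intensities within the m/z tolerance window.
        return sum(i for mz, i in peaks if abs(mz - target) <= tol)

    labeled = intensity_at(mz_labeled)
    if labeled == 0:
        raise ValueError("internal-standard peak not found")
    return intensity_at(mz_unlabeled) / labeled
```

Because analyte and standard co-elute and ionize together, matrix effects largely cancel in this ratio, which is the rationale for the U-13C internal standard.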

  14. ALLocator: an interactive web platform for the analysis of metabolomic LC-ESI-MS datasets, enabling semi-automated, user-revised compound annotation and mass isotopomer ratio analysis.

    Directory of Open Access Journals (Sweden)

    Nikolas Kessler

    Full Text Available Adduct formation, fragmentation events and matrix effects impose special challenges to the identification and quantitation of metabolites in LC-ESI-MS datasets. An important step in compound identification is the deconvolution of mass signals. During this processing step, peaks representing adducts, fragments, and isotopologues of the same analyte are allocated to a distinct group, in order to separate peaks from coeluting compounds. From these peak groups, neutral masses and pseudo spectra are derived and used for metabolite identification via mass decomposition and database matching. Quantitation of metabolites is hampered by matrix effects and nonlinear responses in LC-ESI-MS measurements. A common approach to correct for these effects is the addition of a U-13C-labeled internal standard and the calculation of mass isotopomer ratios for each metabolite. Here we present a new web-platform for the analysis of LC-ESI-MS experiments. ALLocator covers the workflow from raw data processing to metabolite identification and mass isotopomer ratio analysis. The integrated processing pipeline for spectra deconvolution "ALLocatorSD" generates pseudo spectra and automatically identifies peaks emerging from the U-13C-labeled internal standard. Information from the latter improves mass decomposition and annotation of neutral losses. ALLocator provides an interactive and dynamic interface to explore and enhance the results in depth. Pseudo spectra of identified metabolites can be stored in user- and method-specific reference lists that can be applied on succeeding datasets. The potential of the software is exemplified in an experiment, in which abundance fold-changes of metabolites of the l-arginine biosynthesis in C. glutamicum type strain ATCC 13032 and l-arginine producing strain ATCC 21831 are compared. Furthermore, the capability for detection and annotation of uncommon large neutral losses is shown by the identification of (γ-glutamyl dipeptides in

  15. Automated toxicological screening reports of modified Agilent MSD Chemstation combined with Microsoft Visual Basic application programs.

    Science.gov (United States)

    Choe, Sanggil; Kim, Suncheun; Choi, Hyeyoung; Choi, Hwakyoung; Chung, Heesun; Hwang, Bangyeon

    2010-06-15

Agilent GC-MS MSD Chemstation offers automated library search reports for toxicological screening using the total ion chromatogram (TIC) and mass spectra in normal mode. Numerous peaks appear in the chromatogram of a biological specimen such as blood or urine, and often large migrating peaks obscure small target peaks; in addition, target peaks of low abundance regularly give wrong library search results or low matching scores. As a result, the retention time and mass spectrum of every peak in the chromatogram have to be checked for relevance. These repeated actions are very tedious and time-consuming for toxicologists. The MSD Chemstation software operates using a number of macro files which give commands and instructions on how to work on and extract data from the chromatogram and spectra. These macro files are developed with the software's own compiler. All the original macro files can be modified and new macro files can be added to the original software by users. To get more accurate results with a more convenient method and to save time in data analysis, we developed new macro files for report generation and inserted new menus in the Enhanced Data Analysis program. Toxicological screening reports generated by these new macro files are in text mode or graphic mode, and these reports can be generated with three different automated subtraction options. Text reports have a Brief mode and a Full mode, and graphic reports have the option with or without a mass spectrum mode. Matched mass spectra and matching scores for detected compounds are printed in reports by the modified library searching modules. We have also developed an independent application program named DrugMan. This program manages the drug groups, lists and parameters that are in use in MSD Chemstation. The incorporation of DrugMan with the modified macro modules provides a powerful tool for toxicological screening and saves a lot of valuable time on toxicological work. (c) 2010 Elsevier Ireland Ltd. All rights reserved.
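The manual check the macros automate — comparing each peak's retention time and library match score against a target list — can be sketched in a few lines. The names, thresholds and data shapes here are hypothetical illustrations, not Chemstation's proprietary macro language:

```python
# Sketch of screening-report filtering: keep only peaks whose library match
# score and retention time agree with a target drug list. Illustrative only.

def screen_peaks(peaks, targets, min_score=80, rt_tol=0.2):
    """peaks: list of (name, retention_time, score) from a library search.
    targets: {drug_name: expected_retention_time}. Returns confirmed hits."""
    hits = []
    for name, rt, score in peaks:
        expected = targets.get(name)
        if expected is None:
            continue  # not a target compound
        if score >= min_score and abs(rt - expected) <= rt_tol:
            hits.append(name)
    return hits
```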

  16. Action Refinement

    NARCIS (Netherlands)

    Gorrieri, R.; Rensink, Arend; Bergstra, J.A.; Ponse, A.; Smolka, S.A.

    2001-01-01

    In this chapter, we give a comprehensive overview of the research results in the field of action refinement during the past 12 years. The different approaches that have been followed are outlined in detail and contrasted to each other in a uniform framework. We use two running examples to discuss

  17. Mass-spectrometric mining of Hadean zircons by automated SHRIMP multi-collector and single-collector U/Pb zircon age dating: The first 100,000 grains

    Science.gov (United States)

    Holden, Peter; Lanc, Peter; Ireland, Trevor R.; Harrison, T. Mark; Foster, John J.; Bruce, Zane

    2009-09-01

The identification and retrieval of a large population of ancient zircons (>4 Ga; Hadean) is of utmost priority if models of the early evolution of Earth are to be rigorously tested. We have developed a rapid and accurate U-Pb zircon age determination protocol utilizing a fully automated multi-collector ion microprobe, the ANU SHRIMP II, to screen and date these zircons. Unattended data acquisition relies on the calibration of a digitized sample map to the Sensitive High Resolution Ion MicroProbe (SHRIMP) sample-stage co-ordinate system. High-precision positioning of individual grains can be produced through optical image processing of a specified mount location. The focal position of the mount can be optimized through a correlation between secondary-ion steering and the spot position on the target. For the Hadean zircon project, sample mounts are photographed and sample locations (normally grain centers) are determined off-line. The sample is loaded, reference points are calibrated, and the target positions are then visited sequentially. In SHRIMP II multiple-collector mode, zircons are initially screened (ca. 5 s data acquisition) through their 204Pb-corrected 207Pb/206Pb ratio; suitable candidates are then analyzed in a longer routine to obtain better measurement statistics, U/Pb, and concentration data. In SHRIMP I and SHRIMP RG, we have incorporated the automated analysis protocol into single-collector measurements. These routines have been used to analyze over 100,000 zircons from the Jack Hills quartzite. Of these, ca. 7% have an age greater than 3.8 Ga, the oldest grain being 4372 ± 6 Ma (2σ), and this age is part of a group of analyses around 4350 Ma which we interpret as the age when continental crust first began to coalesce in this region. In multi-collector mode, the analytical time taken for a single mount with 400 zircons is approximately 6 h, whereas in single-collector mode the analytical time is ca. 17 h. With this productivity, we can produce
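The sample-map-to-stage calibration step can be illustrated with a small affine fit: three digitized reference points determine the transform (rotation, scale, shear, translation), after which any grain centre on the map converts to a stage position. This is a generic sketch under assumed names, not the ANU control software:

```python
# Sketch of calibrating a digitized sample map to stage coordinates via an
# affine transform fitted from three reference point pairs. Illustrative only.

def _solve3(a, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col and m[r][col] != 0.0:
                f = m[r][col] / m[col][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def affine_from_refs(map_pts, stage_pts):
    """Fit x' = a*x + b*y + c and y' = d*x + e*y + f from 3 point pairs,
    returning a function that maps a map coordinate to a stage coordinate."""
    A = [[x, y, 1.0] for x, y in map_pts]
    abc = _solve3(A, [p[0] for p in stage_pts])
    def_ = _solve3(A, [p[1] for p in stage_pts])

    def to_stage(x, y):
        return (abc[0] * x + abc[1] * y + abc[2],
                def_[0] * x + def_[1] * y + def_[2])
    return to_stage
```

With more than three reference points, a real system would fit the same six coefficients by least squares, averaging out digitization error.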

  18. Control and automation systems

    International Nuclear Information System (INIS)

    Schmidt, R.; Zillich, H.

    1986-01-01

    A survey is given of the development of control and automation systems for energy uses. General remarks about control and automation schemes are followed by a description of modern process control systems along with process control processes as such. After discussing the particular process control requirements of nuclear power plants the paper deals with the reliability and availability of process control systems and refers to computerized simulation processes. The subsequent paragraphs are dedicated to descriptions of the operating floor, ergonomic conditions, existing systems, flue gas desulfurization systems, the electromagnetic influences on digital circuits as well as of light wave uses. (HAG) [de

  19. Automated nuclear materials accounting

    International Nuclear Information System (INIS)

    Pacak, P.; Moravec, J.

    1982-01-01

    An automated state system of accounting for nuclear materials data was established in Czechoslovakia in 1979. A file was compiled of 12 programs in the PL/1 language. The file is divided into four groups according to logical associations, namely programs for data input and checking, programs for handling the basic data file, programs for report outputs in the form of worksheets and magnetic tape records, and programs for book inventory listing, document inventory handling and materials balance listing. A similar automated system of nuclear fuel inventory for a light water reactor was introduced for internal purposes in the Institute of Nuclear Research (UJV). (H.S.)

  20. Automating the CMS DAQ

    International Nuclear Information System (INIS)

    Bauer, G; Darlea, G-L; Gomez-Ceballos, G; Bawej, T; Chaze, O; Coarasa, J A; Deldicque, C; Dobson, M; Dupont, A; Gigi, D; Glege, F; Gomez-Reino, R; Hartl, C; Hegeman, J; Masetti, L; Behrens, U; Branson, J; Cittolin, S; Holzner, A; Erhan, S

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  1. Altering user' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  2. An automated method for the analysis of phenolic acids in plasma based on ion-pairing micro-extraction coupled on-line to gas chromatography/mass spectrometry with in-liner derivatisation

    NARCIS (Netherlands)

    Peters, S.; Kaal, E.; Horsting, I.; Janssen, H.-G.

    2012-01-01

    A new method is presented for the analysis of phenolic acids in plasma based on ion-pairing ‘Micro-extraction in packed sorbent’ (MEPS) coupled on-line to in-liner derivatisation-gas chromatography-mass spectrometry (GC-MS). The ion-pairing reagent served a dual purpose. It was used both to improve

  3. LIBRARY AUTOMATION IN NIGERAN UNIVERSITIES

    African Journals Online (AJOL)

facilitate services and access to information in libraries is widely acceptable. ... Moreover, Ugah (2001) reports that the automation process at the Abubakar ... blueprint in 1987 and a turn-key system of automation was suggested for the library.

  4. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ...of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ...between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  5. Multilaboratory Validation of First Action Method 2016.04 for Determination of Four Arsenic Species in Fruit Juice by High-Performance Liquid Chromatography-Inductively Coupled Plasma-Mass Spectrometry.

    Science.gov (United States)

    Kubachka, Kevin; Heitkemper, Douglas T; Conklin, Sean

    2017-07-01

    Before being designated AOAC First Action Official MethodSM 2016.04, the U.S. Food and Drug Administration's method, EAM 4.10 High Performance Liquid Chromatography-Inductively Coupled Plasma-Mass Spectrometric Determination of Four Arsenic Species in Fruit Juice, underwent both a single-laboratory validation and a multilaboratory validation (MLV) study. Three federal and five state regulatory laboratories participated in the MLV study, which is the primary focus of this manuscript. The method was validated for inorganic arsenic (iAs) measured as the sum of the two iAs species arsenite [As(III)] and arsenate [As(V)], dimethylarsinic acid (DMA), and monomethylarsonic acid (MMA) by analyses of 13 juice samples, including three apple juice, three apple juice concentrate, four grape juice, and three pear juice samples. In addition, two water Standard Reference Materials (SRMs) were analyzed. The method LODs and LOQs obtained among the eight laboratories were approximately 0.3 and 2 ng/g, respectively, for each of the analytes and were adequate for the intended purpose of the method. Each laboratory analyzed method blanks, fortified method blanks, reference materials, triplicate portions of each juice sample, and duplicate fortified juice samples (one for each matrix type) at three fortification levels. In general, repeatability and reproducibility of the method was ≤15% RSD for each species present at a concentration >LOQ. The average recovery of fortified analytes for all laboratories ranged from 98 to 104% iAs, DMA, and MMA for all four juice sample matrixes. The average iAs results for SRMs 1640a and 1643e agreed within the range of 96-98% of certified values for total arsenic.
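The validation statistics quoted above (fortification recovery, repeatability and reproducibility as %RSD) follow from standard formulas, sketched here with made-up sample values:

```python
# Sketch of the recovery and %RSD calculations used in method validation.
# Input values are illustrative, not data from the MLV study.

from statistics import mean, stdev

def percent_recovery(measured, native, spiked_amount):
    """Recovery of a fortified sample: (measured - native) / added * 100."""
    return (measured - native) / spiked_amount * 100.0

def percent_rsd(values):
    """Relative standard deviation, in percent of the mean."""
    return stdev(values) / mean(values) * 100.0
```

For example, a juice with 2 ng/g native inorganic arsenic, fortified with 10 ng/g and measured at 12 ng/g, gives 100% recovery; triplicate results of 9, 10 and 11 ng/g give 10% RSD.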

  6. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed...

  7. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  8. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. It also investigates the very important issue of co-designing plant-structures and dynamic controllers in automated design of Mechatronic...

  9. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  10. Automated conflict resolution issues

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  11. Automated gamma counters

    International Nuclear Information System (INIS)

    Regener, M.

    1977-01-01

This is a report on the most recent developments in the full automation of gamma counting in RIA, in particular by Messrs. Kontron. The development targets were flexibility in sample capacity and shape of test tubes, the possibility of using different radioisotopes for labelling due to an optimisation of the detector system, and the use of microprocessors to substitute software for hardware. (ORU) [de

  12. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

Full Text Available Myths in automation of software testing is a topic of discussion that echoes throughout the validation side of the software industry. Probably the first thought that appears in a knowledgeable reader's mind would be: Why this old topic again? What's new to discuss about the matter? But everyone agrees that automation testing today is undoubtedly not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the implementation of testing applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in perspective and knowledge of people on automation has altered the terrain. This article reflects the author's points of view and experience regarding the transformation of the original myths into new versions, and how they are derived; it also provides his thoughts on the new generation of myths.

  14. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

    The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, and especially the technique envisaging the use of short-lived isotopes, are given. The equipment possibilities to increase dataway carrying capacity, using modern computers for the automation of the analysis and data processing procedure, are shown

  15. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

    computer that can switch between predefined settings. Sometimes the computer can be controlled remotely over the internet, so that the status of the home can be seen from a computer or perhaps even from a mobile phone. While the applications mentioned are classics within home automation, additional functionality has appeared...

  16. Automation of radioimmunoassays

    International Nuclear Information System (INIS)

    Goldie, D.J.; West, P.M.; Ismail, A.A.A.

    1979-01-01

    A short account is given of recent developments in automation of the RIA technique. Difficulties encountered in the incubation, separation and quantitation steps are summarized. Published references are given to a number of systems, both discrete and continuous flow, and details are given of a system developed by the present authors. (U.K.)

  17. Microcontroller for automation application

    Science.gov (United States)

    Cooper, H. W.

    1975-01-01

    The description of a microcontroller currently being developed for automation application was given. It is basically an 8-bit microcomputer with a 40K byte random access memory/read only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  18. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  19. Automated Inadvertent Intruder Application

    International Nuclear Information System (INIS)

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. 
The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
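    The pathway dose arithmetic described above (dose conversion factors applied to a controlled set of input parameters) reduces to a simple sum. The sketch below is a generic illustration of that kind of calculation; the nuclide names, concentrations, and factor values are hypothetical placeholders, not values from the SRNL application:

    ```python
    def scenario_dose(concentrations, dose_conversion_factors):
        """Total dose for one intruder scenario: the sum over radionuclides of
        soil concentration (pCi/g) times a pathway dose conversion factor
        (mrem/yr per pCi/g). All names and numbers are illustrative only."""
        return sum(conc * dose_conversion_factors[nuclide]
                   for nuclide, conc in concentrations.items())

    # Hypothetical inputs for an agriculture-type scenario
    concentrations = {"Cs-137": 2.0, "Sr-90": 0.5}   # pCi/g
    dcf = {"Cs-137": 0.8, "Sr-90": 1.2}              # mrem/yr per pCi/g
    print(scenario_dose(concentrations, dcf))        # 1.6 + 0.6 mrem/yr
    ```

    An automated application then wraps such formulas in tested code fed from a controlled parameter source, removing the cell-by-cell copying errors that motivated the move away from the spreadsheet.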

  20. Levels of automation and user control - evaluation of a turbine automation interface

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Jonas (Chalmers Univ. of Technology (Sweden))

    2008-10-15

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nykoeping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was used to assess how the operators are affected by use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty of dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operations lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty of following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step-mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (author)

  2. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  3. Typing of Ochrobactrum anthropi clinical isolates using automated repetitive extragenic palindromic-polymerase chain reaction DNA fingerprinting and matrix-assisted laser desorption/ionization-time-of-flight mass spectrometry.

    Science.gov (United States)

    Quirino, Angela; Pulcrano, Giovanna; Rametti, Linda; Puccio, Rossana; Marascio, Nadia; Catania, Maria Rosaria; Matera, Giovanni; Liberto, Maria Carla; Focà, Alfredo

    2014-03-22

    Ochrobactrum anthropi (O. anthropi) is a non-fermenting gram-negative bacillus usually found in the environment. Nevertheless, during the past decade it has been identified as pathogenic to immunocompromised patients. In this study, we assessed the usefulness of the automated repetitive extragenic palindromic-polymerase chain reaction (rep-PCR-based DiversiLab™ system, bioMérieux, France) and of matrix-assisted laser desorption/ionization-time-of-flight mass spectrometry (MALDI-TOF MS) for typing of twenty-three O. anthropi clinical isolates that we found over a four-month period (from April 2011 to August 2011) in bacteraemic patients admitted to the same operative unit of our hospital. Pulsed-field gel electrophoresis (PFGE), commonly accepted as the gold-standard technique for typing, was also used. Analysis was carried out using the Pearson correlation coefficient to determine the distance matrix and the unweighted pair group method with arithmetic mean (UPGMA) to generate the dendrogram. Rep-PCR analysis identified four different patterns: three that clustered together with 97% or more pattern similarity, and one whose members showed < 95% pattern similarity. Interestingly, strains isolated later (from 11/06/2011 to 24/08/2011) displayed a pattern with 99% similarity. MALDI-TOF MS evaluation clustered the twenty-three strains of O. anthropi into a single group containing four distinct subgroups, each comprising the majority of strains clustering below 5 distance levels, indicating a high similarity between the isolates. Our results indicate that these isolates are clonally related and the methods used afforded a valuable contribution to the epidemiology, prevention and control of the infections caused by this pathogen.
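    The clustering analysis described above (Pearson correlation to build the distance matrix, UPGMA to build the dendrogram) can be sketched generically. This is an illustrative re-implementation, not the DiversiLab™ analysis software, and the fingerprint intensity profiles below are invented:

    ```python
    import math

    def pearson_percent_similarity(x, y):
        """Pearson correlation between two fingerprint intensity profiles,
        expressed as a percent similarity (the 97%/99% figures above are
        values of this kind)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return 100.0 * cov / (sx * sy)

    def upgma(dist, labels):
        """Minimal UPGMA: repeatedly merge the closest pair of clusters,
        averaging inter-cluster distances weighted by cluster size.
        Returns the dendrogram topology as nested tuples."""
        clusters = {i: (labels[i], 1) for i in range(len(labels))}
        d = {(i, j): dist[i][j] for i in clusters for j in clusters if i < j}
        next_id = len(labels)
        while len(clusters) > 1:
            i, j = min(d, key=d.get)                      # closest pair
            (node_i, ni), (node_j, nj) = clusters.pop(i), clusters.pop(j)
            for k in list(clusters):
                dik = d.pop((min(i, k), max(i, k)))
                djk = d.pop((min(j, k), max(j, k)))
                d[(min(next_id, k), max(next_id, k))] = (ni * dik + nj * djk) / (ni + nj)
            del d[(i, j)]
            clusters[next_id] = ((node_i, node_j), ni + nj)
            next_id += 1
        return next(iter(clusters.values()))[0]

    # Two invented profiles of closely related isolates
    print(round(pearson_percent_similarity([5, 1, 4, 2], [5, 1, 4, 3]), 1))  # 96.2
    print(upgma([[0, 2, 6], [2, 0, 6], [6, 6, 0]], ["A", "B", "C"]))  # A and B pair first
    ```

    In practice the UPGMA input would be a distance matrix derived from the pairwise similarities, e.g. dist[i][j] = 1 - r_ij / 100.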

  4. The surveillance state of behavioral automation

    Science.gov (United States)

    Schaefer, Andreas T; Claridge-Chang, Adam

    2012-01-01

    Genetics’ demand for increased throughput is driving automatization of behavior analysis far beyond experimental workhorses like circadian monitors and the operant conditioning box. However, the new automation is not just faster: it is also allowing new kinds of experiments, many of which erase the boundaries of the traditional neuroscience disciplines (psychology, ethology and physiology) while producing insight into problems that were otherwise opaque. Ironically, a central theme of current automatization is to improve observation of animals in increasingly naturalistic environments. This is not just a return to 19th century priorities: the new observational methods provide unprecedented quantitation of actions and ever-closer integration with experimentation. PMID:22119142

  5. Efficient Temporal Action Localization in Videos

    KAUST Repository

    Alwassel, Humam

    2018-04-17

    State-of-the-art temporal action detectors inefficiently search the entire video for specific actions. Despite the encouraging progress these methods achieve, it is crucial to design automated approaches that only explore parts of the video which are the most relevant to the actions being searched. To address this need, we propose the new problem of action spotting in videos, which we define as finding a specific action in a video while observing a small portion of that video. Inspired by the observation that humans are extremely efficient and accurate in spotting and finding action instances in a video, we propose Action Search, a novel Recurrent Neural Network approach that mimics the way humans spot actions. Moreover, to address the absence of data recording the behavior of human annotators, we put forward the Human Searches dataset, which compiles the search sequences employed by human annotators spotting actions in the AVA and THUMOS14 datasets. We consider temporal action localization as an application of the action spotting problem. Experiments on the THUMOS14 dataset reveal that our model is not only able to explore the video efficiently (observing on average 17.3% of the video) but it also accurately finds human activities with 30.8% mAP (0.5 tIoU), outperforming state-of-the-art methods.
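    The 0.5 tIoU matching criterion quoted above can be made concrete in a few lines; the example segments are invented:

    ```python
    def temporal_iou(seg_a, seg_b):
        """Temporal Intersection-over-Union between two (start, end) segments,
        in seconds: the matching criterion behind the '0.5 tIoU' threshold."""
        inter = max(0.0, min(seg_a[1], seg_b[1]) - max(seg_a[0], seg_b[0]))
        union = (seg_a[1] - seg_a[0]) + (seg_b[1] - seg_b[0]) - inter
        return inter / union if union > 0 else 0.0

    # A predicted segment counts as a correct localization at threshold 0.5
    # only if its tIoU with a ground-truth instance is at least 0.5.
    print(temporal_iou((2.0, 8.0), (5.0, 11.0)))  # 3 / 9, below the threshold
    ```

    mAP at 0.5 tIoU is then the mean of per-class average precision where detections are scored as correct or not by this overlap test.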

  6. Automation at NRCN Dosimetry Laboratory

    International Nuclear Information System (INIS)

    Abraham, A.; Arad, I.; Mesing, M.; Levinson, S.; Weinstein, M.; Pelled, O.; Broida, A.; German, U.

    2014-01-01

    Running a dosimetric service based on TLD technology such as at the Nuclear Research Centre Negev (NRCN) requires a large group of workers to carry out simple mechanical actions such as opening and closing TLD badges, placing and removing TLD cards from the badges and operating the TLD reader. These actions can be automated to free human resources for other assignments and to improve quality assurance. At NRCN a project was undertaken to design and build a robotic system based on a manipulator arm. The design was based on the experience gained with an earlier prototype (1,2). The system stores the TLD badges in specially designed boxes, which are transported and stored in computer-defined bins. The robotic arm loads and unloads TLD cards to the badges, and loads/unloads the cards to a magazine for the TLD reader. At the NRCN each badge is assigned to a specific worker and bears a sticker containing the worker's personal details, also in machine-readable form (barcode). In order to establish a proper QA check, a barcode reader records the information on the badge and on the TLD card placed in this badge and checks their compatibility with the information contained in the main database. Besides the TLD card loading/unloading station, there is a contamination check station, a card cleaning station and a UV irradiation box used to reduce the history-dependent residual dose. The system was installed at the NRCN dosimetry laboratory. It was successfully tested for several hundred cycles and will become operational in the first quarter of 2014. As far as we know, there is no similar product available for automatic handling in a TLD laboratory.
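    The barcode compatibility check described above amounts to a lookup against the main database. The schema below is a hypothetical illustration, not the NRCN system's actual data model:

    ```python
    # Hypothetical assignment table: badge barcode -> (worker id, TLD card barcode)
    assignments = {
        "BADGE-0042": ("worker-17", "CARD-9001"),
        "BADGE-0043": ("worker-18", "CARD-9002"),
    }

    def card_matches_badge(badge_code, card_code):
        """QA gate before reading: the card found inside a badge must be the
        card the main database assigns to that badge's worker."""
        record = assignments.get(badge_code)
        return record is not None and record[1] == card_code

    print(card_matches_badge("BADGE-0042", "CARD-9001"))  # True
    print(card_matches_badge("BADGE-0042", "CARD-9002"))  # wrong card -> False
    ```

    A mismatch would stop the robot from feeding the card to the reader, so a mis-assembled badge is caught before a dose is attributed to the wrong worker.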

  7. Rapid automated nuclear chemistry

    International Nuclear Information System (INIS)

    Meyer, R.A.

    1979-01-01

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC

  8. Automated optical assembly

    Science.gov (United States)

    Bala, John L.

    1995-08-01

    Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs using precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The increased cost savings inherent in the utilization of optical-grade polymers outweighs almost every advantage of using glass for high volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to successfully compete with the low-cost labor employed in the Far East, as well as to produce an equivalent product.

  9. Automated breeder fuel fabrication

    International Nuclear Information System (INIS)

    Goldmann, L.H.; Frederickson, J.R.

    1983-01-01

    The objective of the Secure Automated Fabrication (SAF) Project is to develop remotely operated equipment for the processing and manufacturing of breeder reactor fuel pins. The SAF line will be installed in the Fuels and Materials Examination Facility (FMEF). The FMEF is presently under construction at the Department of Energy's (DOE) Hanford site near Richland, Washington, and is operated by the Westinghouse Hanford Company (WHC). The fabrication and support systems of the SAF line are designed for computer-controlled operation from a centralized control room. Remote and automated fuel fabrication operations will result in: reduced radiation exposure to workers; enhanced safeguards; improved product quality; near real-time accountability; and increased productivity. The present schedule calls for installation of SAF line equipment in the FMEF beginning in 1984, with qualifying runs starting in 1986 and production commencing in 1987. 5 figures

  10. Automated multiple failure FMEA

    International Nuclear Information System (INIS)

    Price, C.J.; Taylor, N.S.

    2002-01-01

    Failure mode and effects analysis (FMEA) is typically performed by a team of engineers working together. In general, they will only consider single point failures in a system. Consideration of all possible combinations of failures is impractical for all but the simplest example systems. Even if the task of producing the FMEA report for the full multiple failure scenario were automated, it would still be impractical for the engineers to read, understand and act on all of the results. This paper shows how approximate failure rates for components can be used to select the most likely combinations of failures for automated investigation using simulation. The important information can be automatically identified from the resulting report, making it practical for engineers to study and act on the results. The strategy described in the paper has been applied to a range of electrical subsystems, and the results have confirmed that the strategy described here works well for realistically complex systems
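    The selection strategy described above, using approximate component failure rates to pick the most likely failure combinations for simulation, can be sketched as follows; the component names and rates are invented:

    ```python
    from itertools import combinations

    def most_likely_failure_sets(rates, max_order=3, top_n=5):
        """Rank single and multiple failure combinations by approximate
        probability (product of individual failure rates), keeping only
        the top_n candidates worth passing to simulation."""
        candidates = []
        for order in range(1, max_order + 1):
            for combo in combinations(rates, order):
                p = 1.0
                for name in combo:
                    p *= rates[name]
                candidates.append((p, combo))
        candidates.sort(key=lambda c: c[0], reverse=True)
        return candidates[:top_n]

    # Hypothetical per-demand failure probabilities for an electrical subsystem
    rates = {"relay": 1e-3, "sensor": 5e-4, "lamp": 2e-3, "fuse": 1e-5}
    for p, combo in most_likely_failure_sets(rates):
        print(f"{'+'.join(combo)}: {p:.1e}")
    ```

    Only the shortlisted combinations are then injected into the simulation, keeping both the compute cost and the resulting report small enough for engineers to act on.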

  11. Automated drawing generation system

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Kawahata, Junichi; Yoshida, Naoto; Ono, Satoru

    1991-01-01

    Since automated CAD drawing generation systems still require human intervention, improvements were focussed on an interactive processing section (data input and correcting operation) which necessitates a vast amount of work. As a result, human intervention was eliminated, the original objective of a computerized system. This is the first step taken towards complete automation. The effects of development and commercialization of the system are as described below. (1) The interactive processing time required for generating drawings was improved. It was determined that introduction of the CAD system has reduced the time required for generating drawings. (2) The difference in skills between workers preparing drawings has been eliminated and the quality of drawings has been made uniform. (3) The extent of knowledge and experience demanded of workers has been reduced. (author)

  12. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots, over 100 PB of storage space on disk or tape. Monitoring of status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  13. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    that the system can detect the misbehaving parties who caused that failure. Accountability is an intuitively stronger property than verifiability as the latter only rests on the possibility of detecting the failure of a goal. A plethora of accountability and verifiability definitions have been proposed...... in the literature. Those definitions are either very specific to the protocols in question, hence not applicable in other scenarios, or too general and widely applicable but requiring complicated and hard to follow manual proofs. In this paper, we advance formal definitions of verifiability and accountability...... that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions...

  14. Automated planning through abstractions in dynamic and stochastic environments

    OpenAIRE

    Martínez Muñoz, Moisés

    2016-01-01

    International Mention in the doctoral degree. Generating sequences of actions - plans - for an automatic system, like a robot, using Automated Planning is particularly difficult in stochastic and/or dynamic environments. These plans are composed of actions whose execution, in certain scenarios, might fail, which in turn prevents the execution of the rest of the actions in the plan. Also, in some environments, plans must be generated fast, both at the start of the execution and after every ex...

  15. Automation and Mankind

    Science.gov (United States)

    1960-08-07

    limited by the capabilities of the human organism in the matter of control of its processes. In our time, the speeds of technological processes are... in many cases limited by conditions of control. The speed of human reaction is limited and therefore, at present, only processes of a relatively... forward. It can be foreseen that automation will completely free man from work under conditions of high temperatures, pressures, and pollution...

  16. Automated Cooperative Trajectories

    Science.gov (United States)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  17. Automating ASW fusion

    OpenAIRE

    Pabelico, James C.

    2011-01-01

    Approved for public release; distribution is unlimited. This thesis examines ASW eFusion, an anti-submarine warfare (ASW) tactical decision aid (TDA) that utilizes Kalman filtering to improve battlespace awareness by simplifying and automating the track management process involved in anti-submarine warfare (ASW) watchstanding operations. While this program can currently help the ASW commander manage uncertainty and make better tactical decisions, the program has several limitations. Comman...

  18. Longwall automation 2

    Energy Technology Data Exchange (ETDEWEB)

    David Hainsworth; David Reid; Con Caris; J.C. Ralston; C.O. Hargrave; Ron McPhee; I.N. Hutchinson; A. Strange; C. Wesner [CSIRO (Australia)

    2008-05-15

    This report covers a nominal two-year extension to the Major Longwall Automation Project (C10100). Production standard implementation of Longwall Automation Steering Committee (LASC) automation systems has been achieved at Beltana and Broadmeadow mines. The systems are now used on a 24/7 basis and have provided production benefits to the mines. The LASC Information System (LIS) has been updated and has been implemented successfully in the IT environment of major coal mining houses. This enables 3D visualisation of the longwall environment and equipment to be accessed on line. A simulator has been specified and a prototype system is now ready for implementation. The Shearer Position Measurement System (SPMS) has been upgraded to a modular commercial production standard hardware solution. A compact hardware solution for visual face monitoring has been developed, an approved enclosure for a thermal infrared camera has been produced and software for providing horizon control through faulted conditions has been delivered. The incorporation of the LASC Cut Model information into OEM horizon control algorithms has been bench and underground tested. A prototype system for shield convergence monitoring has been produced and studies to identify techniques for coal flow optimisation and void monitoring have been carried out. Liaison with equipment manufacturers has been maintained and technology delivery mechanisms for LASC hardware and software have been established.

  19. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  20. Action Learning: Avoiding Conflict or Enabling Action

    Science.gov (United States)

    Corley, Aileen; Thorne, Ann

    2006-01-01

    Action learning is based on the premise that action and learning are inextricably entwined, and it is this potential, to enable action, which has contributed to the growth of action learning within education and management development programmes. However, has this growth in action learning led to an evolution or a dilution of Revans' classical…

  1. Systems Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model User's Manual.

    Science.gov (United States)

    1982-06-01

    In order to examine specific Automated Guideway Transit (AGT) developments and concepts, and to build a better knowledge base for future decision-making, the Urban Mass Transportation Administration (UMTA) undertook a new program of studies and techn...

  2. An Analysis of Automated Solutions for the Certification and Accreditation of Navy Medicine Information Assets

    National Research Council Canada - National Science Library

    Gonzales, Dominic V

    2005-01-01

    ... improve Navy Medicine's current C&A security posture. The primary research reviewed C&A policy and included a comparative analysis of two cutting-edge automated C&A tools, namely Xacta and eMASS...

  3. AUTOMATED INADVERTENT INTRUDER APPLICATION

    International Nuclear Information System (INIS)

    Koffman, L.; Lee, P.; Cook, J.; Wilhite, E.

    2007-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct exposure scenarios, referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell, and finding these errors was tedious and time-consuming. This weakness led to the specification of functional requirements for a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements.
The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
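    The "straightforward algebraic calculation" described above lends itself to a compact sketch. Everything below is illustrative: the nuclides, dose conversion factors, and intake rates are invented placeholders, not SRS parameters or the application's actual inputs.

```python
# Hedged sketch of a pathway dose calculation: dose = concentration x
# intake x dose conversion factor, summed over a scenario's pathways.
# All numeric values are illustrative placeholders, not SRS data.

DCF_SV_PER_BQ = {"Tc-99": 6.4e-10, "Sr-90": 2.8e-8}   # ingestion DCFs (illustrative)

def pathway_dose(concentration_bq_per_kg, intake_kg_per_yr, dcf_sv_per_bq):
    """Dose (Sv/yr) for one exposure pathway."""
    return concentration_bq_per_kg * intake_kg_per_yr * dcf_sv_per_bq

def scenario_dose(pathways):
    """Total scenario dose: the sum over its exposure pathways."""
    return sum(pathway_dose(c, i, DCF_SV_PER_BQ[n]) for n, c, i in pathways)

# Agriculture-like scenario: (nuclide, soil concentration Bq/kg, intake kg/yr)
dose = scenario_dose([("Tc-99", 50.0, 100.0), ("Sr-90", 5.0, 100.0)])
```

One advantage over a spreadsheet is visible even at this scale: the formula exists in exactly one place, so a design check reviews one function rather than every copied cell.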

  4. Application Filters for TCP/IP Industrial Automation Protocols

    Science.gov (United States)

    Batista, Aguinaldo B.; Kobayashi, Tiago H.; Medeiros, João Paulo S.; Brito, Agostinho M.; Motta Pires, Paulo S.

    The use of firewalls is a common approach to securing Automation Technology (AT) networks from Information Technology (IT) networks. This work proposes a filtering system for TCP/IP-based automation networks in which only certain kinds of industrial traffic are permitted. Any network traffic that does not conform to a proper industrial protocol pattern, or to specific rules for its actions, is considered abnormal and must be blocked. As a case study, we developed a seventh-layer (application-layer) firewall with the ability to block spurious traffic, using an IP packet queueing engine and a regular expression library.
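    A minimal sketch of the whitelisting idea described above, assuming a Modbus/TCP-like frame layout; the regex, allowed function codes, and sample frames are invented for illustration and are not the authors' filter rules.

```python
# Hedged sketch of an application-layer whitelist filter: admit only
# payloads matching a known industrial-protocol pattern, drop the rest.
import re

# Modbus/TCP-like header: transaction id (2 bytes), protocol id 0x0000,
# length (2 bytes), unit id, then a function code from a small allowed set.
ALLOWED = re.compile(rb"^.{2}\x00\x00.{2}.[\x01\x02\x03\x04]", re.DOTALL)

def filter_packet(payload: bytes) -> str:
    """Return 'ACCEPT' for conforming frames, 'DROP' for anything else."""
    return "ACCEPT" if ALLOWED.match(payload) else "DROP"

read_coils = b"\x00\x01\x00\x00\x00\x06\x11\x01\x00\x13\x00\x25"  # function 0x01
telnet_ish = b"\xff\xfd\x18\xff\xfd\x20"                          # not Modbus-like
```

In a deployment this predicate would sit behind a userspace packet-queueing hook so that non-matching frames are rejected before reaching the PLC.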

  5. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would improve robustness, efficiency, flexibility, and manoeuvrability.

  6. Technology assessment of automation trends in the modular home industry

    Science.gov (United States)

    Phil Mitchell; Robert Russell Hurst

    2009-01-01

    This report provides an assessment of technology used in manufacturing modular homes in the United States, and that used in the German prefabricated wooden home industry. It is the first step toward identifying the research needs in automation and manufacturing methods that will facilitate mass customization in the home manufacturing industry. Within the United States...

  7. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. 
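    A toy sketch of the two capabilities named above, single-reaction inference and automated network reconstruction, under heavy simplifying assumptions: glycans are linear strings and each enzyme is reduced to a suffix-rewrite rule, which is far cruder than the framework's actual specificity model.

```python
# Hedged sketch: enzymes carry a substrate pattern and a product rewrite;
# the reaction network is grown by applying every enzyme to every
# reachable glycan (breadth-first). Names and rules are illustrative.
from collections import deque

class Enzyme:
    def __init__(self, name, substrate_suffix, product_suffix):
        self.name = name
        self.substrate_suffix = substrate_suffix  # required terminal residue(s)
        self.product_suffix = product_suffix      # residue(s) after catalysis

    def act(self, glycan):
        """Single-reaction inference: product if the glycan is a substrate."""
        if glycan.endswith(self.substrate_suffix):
            return glycan[: -len(self.substrate_suffix)] + self.product_suffix
        return None

def reconstruct(seeds, enzymes):
    """Automated network reconstruction: explore all enzyme applications."""
    edges, seen, queue = [], set(seeds), deque(seeds)
    while queue:
        g = queue.popleft()
        for e in enzymes:
            p = e.act(g)
            if p is not None:
                edges.append((g, e.name, p))
                if p not in seen:
                    seen.add(p)
                    queue.append(p)
    return seen, edges

# Toy O-glycan-like chain: GalNAc -> GalNAc-Gal -> GalNAc-Gal-NeuAc
enzymes = [Enzyme("C1GalT1", "GalNAc", "GalNAc-Gal"),
           Enzyme("ST3Gal1", "Gal", "Gal-NeuAc")]
species, reactions = reconstruct({"GalNAc"}, enzymes)
```

The returned edge list is exactly the graph on which connectivity and subset-model queries of the kind described above would operate.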

  8. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme

  9. Validation and application of a high-performance liquid chromatography-tandem mass spectrometric method for simultaneous quantification of lopinavir and ritonavir in human plasma using semi-automated 96-well liquid-liquid extraction.

    Science.gov (United States)

    Wang, Perry G; Wei, Jack S; Kim, Grace; Chang, Min; El-Shourbagy, Tawakol

    2006-10-20

    Kaletra is an important antiretroviral drug developed by Abbott Laboratories. It is composed of lopinavir (low-pin-a-veer) and ritonavir (ri-toe-na-veer), both of which are human immunodeficiency virus (HIV) protease inhibitors that have substantially reduced the morbidity and mortality associated with HIV-1 infection. We have developed and validated an assay, using liquid chromatography coupled with atmospheric pressure chemical ionization tandem mass spectrometry (LC/MS/MS), for the routine quantification of lopinavir and ritonavir in human plasma, in which lopinavir and ritonavir can be simultaneously analyzed with high throughput. The sample preparation consisted of liquid-liquid extraction with a mixture of hexane:ethyl acetate (1:1, v/v), using 100 microL of plasma. Chromatographic separation was performed on a Waters Symmetry C(18) column (150 mm x 3.9 mm, particle size 5 microm) with isocratic reverse-phase elution using a mobile phase of 70:30 (v/v) acetonitrile: 2 mM ammonium acetate aqueous solution containing 0.01% formic acid (v/v) at a flow rate of 1.0 mL/min. A Waters Symmetry C(18) guard column (20 mm x 3.9 mm, particle size 5 microm) was connected prior to the analytical column, and a guard column back wash was performed to reduce analytical column contamination using a mixture of tetrahydrofuran (THF), methanol and water (45:45:10, v/v/v). The analytical run was 4 min. The use of a 96-well plate autosampler allowed a batch size of up to 73 study samples. A triple-quadrupole mass spectrometer was operated in positive ion mode and multiple reaction monitoring (MRM) was used for drug quantification. The method was validated over the concentration ranges of 19-5,300 ng/mL for lopinavir and 11-3,100 ng/mL for ritonavir. A-86093 was used as an internal standard (I.S.). The relative standard deviation (RSD) was <6% for both lopinavir and ritonavir. Mean accuracies were within the designed limits (+/-15%). The robust and rapid LC
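    The acceptance arithmetic behind figures such as "RSD <6%" and "within +/-15%" can be sketched directly; the QC replicate values below are invented, not data from this validation.

```python
# Hedged sketch of bioanalytical QC acceptance: percent accuracy against a
# nominal concentration and relative standard deviation of replicates.
from statistics import mean, stdev

def rsd_percent(values):
    """Relative standard deviation (%CV) of replicate measurements."""
    return 100.0 * stdev(values) / mean(values)

def accuracy_percent(values, nominal):
    """Mean measured concentration as a percentage of the nominal value."""
    return 100.0 * mean(values) / nominal

qc_lopinavir = [985.0, 1010.0, 1002.0, 968.0, 1021.0]  # ng/mL, nominal 1000
acc = accuracy_percent(qc_lopinavir, 1000.0)
rsd = rsd_percent(qc_lopinavir)
accepted = abs(acc - 100.0) <= 15.0 and rsd < 6.0
```

The same two numbers, computed per QC level, are what decide whether an analytical run passes in a validated assay of this kind.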

  10. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade‐induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005‐2010. In particular, we establish a causal effect where firms that have specialized in product types for which...... the Chinese exports to the world market has risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in scale and scope of automation have faster...... productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation....

  11. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  12. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  13. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. During the past few decades the automation systems have become more common and are used today from big industrial solutions to homes of private customers. With the growing need for ecologic and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating etc. Auto...

  14. Guidelines for Automation Project Execution

    OpenAIRE

    Takkinen, Heidi

    2011-01-01

    The purpose of this Master’s thesis was to create instructions for executing an automation project. Sarlin Oy Ab needed directions on how to execute an automation project. Sarlin is starting up a new business area offering total project solutions for customers. Sarlin focuses on small and minor automation projects on domestic markets. The thesis represents issues related to project execution starting from the theory of the project to its kick-off and termination. Site work is one importan...

  15. A new thermal ionisation mass spectrometer

    International Nuclear Information System (INIS)

    Haines, C.; Merren, T.O.; Unsworth, W.D.

    1979-01-01

    The Isomass 54E, a new thermal ionisation mass spectrometer for precise measurement of isotopic composition, is described in detail. It combines the fruits of three development projects, viz. automation, energy filters and extended geometry, with existing Micromass expertise and experience. The hardware and software used for the automation, as well as the energy filter used, are explained. The 'extended geometry' ion optical system adopted for better performance is discussed in detail. (K.B.)

  16. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  17. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems.

  18. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  19. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner.

  20. Calculated trends and the atmospheric abundance of 1,1,1,2-tetrafluoroethane, 1,1-dichloro-1-fluoroethane, and 1-chloro-1,1-difluoroethane using automated in-situ gas chromatography-mass spectrometry measurements recorded at Mace Head, Ireland, from October 1994 to March 1997

    Science.gov (United States)

    Simmonds, P. G.; O'Doherty, S.; Huang, J.; Prinn, R.; Derwent, R. G.; Ryall, D.; Nickless, G.; Cunnold, D.

    1998-01-01

    The first in-situ measurements by automated gas chromatograph-mass spectrometer are reported for 1,1,1,2-tetrafluoroethane (HFC-134a), 1,1-dichloro-1-fluoroethane, (HCFC-141b), and 1-chloro-1,1-difluoroethane, (HCFC-142b). These compounds are steadily replacing the chlorofluorocarbons (CFCs) as refrigerants, foam-blowing agents, and solvents. The concentrations of all three compounds are shown to be rapidly increasing in the atmosphere, with 134a increasing at a rate of 2.05±0.02 ppt yr-1 over the 30 months of observations. Similarly, 141b and 142b increased at rates of 2.49±0.03 and 1.24±0.02 ppt yr-1, respectively, over the same period. The concentrations recorded at the atmospheric research station at Mace Head, Ireland, on January 1, 1996, the midpoint of the time series, were 3.67 ppt (134a), 7.38 ppt (141b), and 8.78 ppt (142b). From these observations we optimally estimate the HCFC and HFC emissions using a 12-box global model and OH concentrations derived from global 1,1,1-trichloroethane (CCl3CH3) measurements. Comparing two methods of estimating emissions with independent industry estimates shows satisfactory agreement for 134a and 141b, while for 142b, industry estimates are less than half those required to explain our observations.
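    A sketch of how a trend such as 2.05 ppt yr-1 is extracted from a monitoring time series by linear regression. The synthetic series below is constructed to mimic the reported HFC-134a slope and January 1996 value; it is not the Mace Head record, and the actual emission estimates used a 12-box global model rather than a simple fit.

```python
# Hedged sketch: fit a linear growth rate to 30 months of synthetic
# concentration data mimicking the reported HFC-134a behaviour.
import numpy as np

months = np.arange(30)                # ~Oct 1994 .. Mar 1997, monthly samples
t_years = months / 12.0
conc = 1.11 + 2.05 * t_years          # ppt; slope/offset chosen to mimic 134a
conc = conc + 0.02 * np.sin(months)   # small wiggle standing in for variability

slope, intercept = np.polyfit(t_years, conc, 1)   # trend in ppt per year
jan96 = intercept + slope * 1.25      # fitted value 15 months into the series
```

The fitted slope recovers the imposed ~2.05 ppt yr-1 growth rate, and the fitted January 1996 value lands near the reported 3.67 ppt midpoint concentration.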

  1. Automating CPM-GOMS

    Science.gov (United States)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the
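    The scheduling idea behind CPM-GOMS can be sketched as a critical-path computation over a small operator network; the operator names, dependencies, and durations below are invented for illustration, not values from the Apex tool.

```python
# Hedged sketch: operators occupy cognitive/perceptual/motor resources for
# fixed durations, dependencies interleave them, and the predicted task
# time is the critical path through the resulting PERT network.

def critical_path(durations, deps):
    """Latest 'earliest finish' over all operators in a dependency DAG."""
    finish = {}
    def ef(op):
        if op not in finish:
            finish[op] = durations[op] + max(
                (ef(d) for d in deps.get(op, [])), default=0)
        return finish[op]
    return max(ef(op) for op in durations)

# A move-and-click operator chain (times in ms, illustrative):
durations = {"perceive-target": 100, "cognitive-initiate": 50,
             "motor-move-cursor": 300, "perceive-cursor-at-target": 100,
             "motor-click": 100}
deps = {"cognitive-initiate": ["perceive-target"],
        "motor-move-cursor": ["cognitive-initiate"],
        "perceive-cursor-at-target": ["motor-move-cursor"],
        "motor-click": ["perceive-cursor-at-target"]}
predicted_ms = critical_path(durations, deps)
```

With genuinely parallel branches in `deps`, the same function yields the overlap-aware predictions that make hand-built CPM-GOMS models so tedious to construct.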

  2. Automating dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.; Moch, S.; Uwer, P.

    2008-07-01

    We report on automating the Catani-Seymour dipole subtraction which is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. The automatization rests on three essential steps: the creation of the dipole terms, the calculation of the color linked squared Born matrix elements, and the evaluation of different helicity amplitudes. The routines have been tested for a number of complex processes, such as the real emission process gg→t anti tggg. (orig.)
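    For reference, the subtraction structure being automated is the standard Catani-Seymour decomposition of an NLO cross section (a textbook formula, not one specific to this report):

```latex
\sigma^{\mathrm{NLO}}
  = \int_{m+1}\Big[\big(\mathrm{d}\sigma^{R}\big)_{\epsilon=0}
                 - \big(\mathrm{d}\sigma^{A}\big)_{\epsilon=0}\Big]
  + \int_{m}\Big[\mathrm{d}\sigma^{V} + \int_{1}\mathrm{d}\sigma^{A}\Big]_{\epsilon=0}
```

Here the auxiliary cross section dσ^A is a sum of dipole terms that reproduces the soft and collinear singularities of the real-emission contribution dσ^R pointwise, and each dipole is built from color-linked squared Born matrix elements, which is why the three steps listed in the abstract suffice to automate the construction.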

  3. Automating dipole subtraction

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, K.; Moch, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Uwer, P. [Karlsruhe Univ. (T.H.) (Germany). Inst. fuer Theoretische Teilchenphysik

    2008-07-15

    We report on automating the Catani-Seymour dipole subtraction which is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. The automatization rests on three essential steps: the creation of the dipole terms, the calculation of the color linked squared Born matrix elements, and the evaluation of different helicity amplitudes. The routines have been tested for a number of complex processes, such as the real emission process gg→t anti tggg. (orig.)

  4. Fossil power plant automation

    International Nuclear Information System (INIS)

    Divakaruni, S.M.; Touchton, G.

    1991-01-01

    This paper elaborates on issues facing the utilities industry and seeks to address how new computer-based control and automation technologies resulting from recent microprocessor evolution, can improve fossil plant operations and maintenance. This in turn can assist utilities to emerge stronger from the challenges ahead. Many presentations at the first ISA/EPRI co-sponsored conference are targeted towards improving the use of computer and control systems in the fossil and nuclear power plants and we believe this to be the right forum to share our ideas

  5. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...

  6. Automating quantum experiment control

    Science.gov (United States)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
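    A minimal sketch of run-time ion routing as graph search: zones are nodes, allowed transports are edges, and recovery after qubit loss is just re-planning from a reload zone. The zone layout is invented and far simpler than a real surface-electrode trap geometry.

```python
# Hedged sketch: breadth-first search over an invented trap-zone graph,
# standing in for the automated routing described above.
from collections import deque

def route(trap, start, goal):
    """Shortest sequence of transport steps between trap zones."""
    prev, queue = {start: None}, deque([start])
    while queue:
        z = queue.popleft()
        if z == goal:
            path = []
            while z is not None:
                path.append(z)
                z = prev[z]
            return path[::-1]
        for nxt in trap[z]:
            if nxt not in prev:
                prev[nxt] = z
                queue.append(nxt)
    return None   # goal unreachable from start

# Linear trap with a junction leading to a reload zone:
trap = {"load": ["A"], "A": ["load", "B"], "B": ["A", "junction"],
        "junction": ["B", "C", "reload"], "C": ["junction"],
        "reload": ["junction"]}
recovery = route(trap, "reload", "C")   # bring in a fresh ion after loss
```

Separating the algorithm (graph search) from the geometry (the `trap` dictionary) mirrors the compile-time/run-time split described in the abstract: retargeting to new hardware means swapping the graph, not the router.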

  7. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  8. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has process-tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver- related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
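    The orchestration pattern described above, sequencing application programs and handing each stage's products to the next, can be sketched in a few lines. Stage names, inputs, and products are invented; the real MAS drives the navigation and sequencing programs via PERL scripts.

```python
# Hedged sketch of a "single button" pipeline driver: run stages in order,
# pass accumulated products forward, stop at the first failure.

def run_pipeline(stages, inputs):
    """Each stage sees all products so far; None signals a stage failure."""
    products = dict(inputs)
    for name, stage in stages:
        out = stage(products)
        if out is None:
            return ("FAILED", name, products)
        products.update(out)
    return ("OK", None, products)

stages = [
    ("design",   lambda p: {"dv_mps": 1.25} if "tracking" in p else None),
    ("sequence", lambda p: {"commands": ["burn_start", "burn_stop"]}),
    ("predict",  lambda p: {"report": f"dv={p['dv_mps']} m/s, "
                                      f"{len(p['commands'])} cmds"}),
]
status, failed_at, products = run_pipeline(stages, {"tracking": "od_solution"})
```

Replacing the serial hand-offs between team members with explicit data products is what turns a two-week process into one a small team can review in minutes.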

  9. Automated screening for retinopathy

    Directory of Open Access Journals (Sweden)

    A. S. Rodin

    2014-07-01

    Full Text Available Retinal pathology is a common cause of irreversible loss of central vision among the senior population. Detection of the earliest signs of retinal disease can be facilitated by viewing retinal images available from telemedicine networks. To facilitate the processing of retinal images, screening software applications based on image-recognition technology are currently at various stages of development. Purpose: To develop and implement computerized image-recognition software that can be used as a decision-support technology for retinal image screening for various types of retinopathies. Methods: The software application for retinal image recognition was developed in the C++ language. It was tested on a dataset of 70 images with various types of pathological features (age-related macular degeneration, chorioretinitis, central serous chorioretinopathy and diabetic retinopathy). Results: It was shown that the system can achieve a sensitivity of 73% and a specificity of 72%. Conclusion: Automated detection of macular lesions using the proposed software can significantly reduce the manual grading workload. In addition, automated detection of retinal lesions can be implemented as a clinical decision-support system for telemedicine screening. It is anticipated that further development of this technology can become part of a diagnostic image-analysis system for electronic health records.

  10. Work Planning Automation at a Mechanical Subdivision

    OpenAIRE

    Dzindzelėta, Vytautas

    2005-01-01

    Work planning automation: installation possibilities and future outlook at a mechanical subdivision. The aim is to study how work planning has changed before and after the automation process and to analyse the automation process methodology.

  11. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
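
    The inference-from-test-history idea described above can be made concrete with a small sketch. This is a hypothetical illustration, not the actual SAAB algorithm: it blacklists a storage area after a sustained run of failed monitoring tests and whitelists it again after sustained successes. The window size and thresholds are invented for demonstration.

```python
from collections import deque

class StorageAreaMonitor:
    """Toy history-based blacklisting in the spirit of SAAB (parameters invented)."""

    def __init__(self, window=10, fail_threshold=0.8, ok_threshold=0.9):
        self.history = deque(maxlen=window)   # True = monitoring test passed
        self.fail_threshold = fail_threshold
        self.ok_threshold = ok_threshold
        self.blacklisted = False

    def record(self, passed: bool) -> bool:
        """Record one test outcome and return the current blacklist status."""
        self.history.append(passed)
        if len(self.history) < self.history.maxlen:
            return self.blacklisted  # not enough evidence yet
        pass_rate = sum(self.history) / len(self.history)
        if not self.blacklisted and (1 - pass_rate) >= self.fail_threshold:
            self.blacklisted = True       # automatic outage handling
        elif self.blacklisted and pass_rate >= self.ok_threshold:
            self.blacklisted = False      # recovery observed, lift the blacklist
        return self.blacklisted
```

    In this form, operators only need to follow up on areas whose status flips, matching the abstract's point that human actions are restricted to reporting and problem follow-up.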

  12. Drilling Automation Tests At A Lunar/Mars Analog Site

    Science.gov (United States)

    Glass, B.; Cannon, H.; Hanagud, S.; Lee, P.; Paulsen, G.

    2006-01-01

    Future in-situ lunar/martian resource utilization and characterization, as well as the scientific search for life on Mars, will require access to the subsurface and hence drilling. Drilling on Earth is hard - an art form more than an engineering discipline. The limited mass, energy and manpower in planetary drilling situations makes application of terrestrial drilling techniques problematic. The Drilling Automation for Mars Exploration (DAME) project is developing drilling automation and robotics for projected use in missions to the Moon and Mars in the 2011-15 period. This has been tested recently, drilling in permafrost at a lunar/martian analog site (Haughton Crater, Devon Island, Canada).

  13. Automation for mineral resource development

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, A.W.; Turner, D.R. (eds.)

    1986-01-01

    A total of 55 papers were presented at the symposium under the following headings: automation and the future of mining; modelling and control of mining processes; transportation for mining; automation and the future of metallurgical processes; modelling and control of metallurgical processes; and general aspects. Fifteen papers have been abstracted separately.

  14. Opening up Library Automation Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  15. Resins production: batch plant automation

    International Nuclear Information System (INIS)

    Banti, M.; Mauri, G.

    1996-01-01

    Companies that pursue plant automation without external resources have at their disposal flexible, custom, easy-to-use DCSs that are open towards PLCs. This article explains why Hoechst followed this approach in automating new resin-production plants

  16. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    . Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...

  17. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  18. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation...... is applied to nearly all types of measurements today....

  19. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information, and those generated by automated processes. We categorize these features into two classes, either an interpretation of the physical model of human interactions, or as behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. Performance analyses are then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
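
    The two feature classes the abstract distinguishes can be illustrated with a toy sketch. This is not the paper's classifier: it uses one physical-model feature (inter-query timing, since a human cannot issue queries arbitrarily fast) and one behavioral feature (query volume), with thresholds that are purely illustrative assumptions.

```python
def extract_features(timestamps):
    """timestamps: sorted query times (seconds) for one user session."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "query_count": len(timestamps),
        "min_gap": min(gaps) if gaps else float("inf"),
        "mean_gap": sum(gaps) / len(gaps) if gaps else float("inf"),
    }

def is_automated(timestamps, max_rate_gap=0.5, max_count=100):
    """Flag a session that queries faster than a human plausibly could,
    or that issues an implausible number of queries."""
    f = extract_features(timestamps)
    return f["min_gap"] < max_rate_gap or f["query_count"] > max_count

# A human-like session: a few queries, many seconds apart.
human = [0.0, 8.2, 35.0, 61.7]
# A bot-like session: queries fired every 100 ms.
bot = [i * 0.1 for i in range(50)]
```

    A real system would combine many such features in trained binary and multiclass classifiers, as the paper describes, rather than fixed thresholds.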

  20. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
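
    The selection step described above can be illustrated with a toy example. Real ATA formulates item selection as a mixed-integer program solved with dedicated software; the exhaustive search below only makes the objective (total information at a target ability level) and a content constraint concrete. All item data are invented.

```python
from itertools import combinations

items = [
    # (item id, content area, information at the target ability level)
    ("i1", "algebra",  0.52), ("i2", "algebra",  0.61),
    ("i3", "geometry", 0.44), ("i4", "geometry", 0.58),
    ("i5", "algebra",  0.30), ("i6", "geometry", 0.49),
]

def assemble(form_length=4, min_per_area=2):
    """Pick the form maximizing total information subject to content constraints."""
    best, best_info = None, -1.0
    for form in combinations(items, form_length):
        # Content constraint: at least min_per_area items from each area.
        areas = [it[1] for it in form]
        if any(areas.count(a) < min_per_area for a in ("algebra", "geometry")):
            continue
        info = sum(it[2] for it in form)  # objective: total information
        if info > best_info:
            best, best_info = form, info
    return [it[0] for it in best], round(best_info, 2)
```

    With a realistic bank of thousands of items, brute force is infeasible, which is exactly why mixed-integer programming solvers are used in practice.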

  1. Automated evaluation of ultrasonic indications

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de]

  2. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch

    1997-01-01

    The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  3. Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  4. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  5. Automated cell type discovery and classification through knowledge transfer

    Science.gov (United States)

    Lee, Hao-Chih; Kosoy, Roman; Becker, Christine E.

    2017-01-01

    Abstract Motivation: Recent advances in mass cytometry allow simultaneous measurements of up to 50 markers at single-cell resolution. However, the high dimensionality of mass cytometry data introduces computational challenges for automated data analysis and hinders translation of new biological understanding into clinical applications. Previous studies have applied machine learning to facilitate processing of mass cytometry data. However, manual inspection is still inevitable and is becoming the barrier to reliable large-scale analysis. Results: We present a new algorithm called Automated Cell-type Discovery and Classification (ACDC) that fully automates the classification of canonical cell populations and highlights novel cell types in mass cytometry data. Evaluations on real-world data show ACDC provides accurate and reliable estimations compared to manual gating results. Additionally, ACDC automatically classifies previously ambiguous cell types to facilitate discovery. Our findings suggest that ACDC substantially improves both reliability and interpretability of results obtained from high-dimensional mass cytometry profiling data. Availability and Implementation: A Python package (Python 3) and analysis scripts for reproducing the results are available at https://bitbucket.org/dudleylab/acdc. Contact: brian.kidd@mssm.edu or joel.dudley@mssm.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158442
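
    The marker-table idea behind such cell-type classification can be sketched in a few lines. This toy scores each cell's binarized marker expression against a prior table of canonical cell types and assigns the best match, or "unknown" when no type matches well. The markers, types and threshold are invented; ACDC itself couples such a table with clustering and semi-supervised refinement rather than per-cell matching alone.

```python
MARKER_TABLE = {
    # cell type: {marker: expected state (1 = expressed, 0 = absent)}
    "T cell":   {"CD3": 1, "CD19": 0, "CD14": 0},
    "B cell":   {"CD3": 0, "CD19": 1, "CD14": 0},
    "Monocyte": {"CD3": 0, "CD19": 0, "CD14": 1},
}

def classify(cell, threshold=1.0):
    """cell: {marker: binarized expression}. Return the best-matching type."""
    def score(expected):
        # Fraction of markers whose observed state agrees with the table.
        agree = sum(cell[m] == s for m, s in expected.items())
        return agree / len(expected)
    best = max(MARKER_TABLE, key=lambda t: score(MARKER_TABLE[t]))
    return best if score(MARKER_TABLE[best]) >= threshold else "unknown"
```

    The "unknown" branch corresponds loosely to the abstract's point that ambiguous cells are surfaced for discovery rather than forced into a canonical class.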

  6. On the least action principle in cosmology

    NARCIS (Netherlands)

    Nusser, A; Branchini, E

    2000-01-01

    Given the present distribution of mass tracing objects in an expanding universe, we develop and test a fast method for recovering their past orbits using the least action principle. In this method, termed FAM for fast action minimization, the orbits are expanded in a set of orthogonal time basis functions.

  7. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
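
    A minimal sketch of the bookkeeping-to-input step that the Automator replaces: each per-assembly record becomes one input file. The field names and the input syntax below are invented for illustration; they are not actual ORIGAMI keywords.

```python
import os
import tempfile

# Hypothetical per-assembly bookkeeping records (values invented).
assemblies = [
    {"id": "A01", "enrichment_wt_pct": 4.2, "heavy_metal_kg": 450.0, "sfp_years": 12.5},
    {"id": "A02", "enrichment_wt_pct": 3.6, "heavy_metal_kg": 445.0, "sfp_years": 8.0},
]

def write_inputs(records, outdir):
    """Write one input file per assembly record and return the file paths."""
    paths = []
    for rec in records:
        path = os.path.join(outdir, f"{rec['id']}.inp")
        with open(path, "w") as f:
            f.write(f"assembly {rec['id']}\n")
            f.write(f"  enrichment {rec['enrichment_wt_pct']}\n")
            f.write(f"  heavy_metal {rec['heavy_metal_kg']}\n")
            f.write(f"  pool_residence {rec['sfp_years']}\n")
        paths.append(path)
    return paths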

  8. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.

  9. Sloan Digital Sky Survey photometric telescope automation and observing software

    International Nuclear Information System (INIS)

    Neilsen, Eric H., Jr. (neilsen@fnal.gov)

    2002-01-01

    The photometric telescope (PT) provides observations necessary for the photometric calibration of the Sloan Digital Sky Survey (SDSS). Because the attention of the observing staff is occupied by the operation of the 2.5 meter telescope which takes the survey data proper, the PT must reliably take data with little supervision. In this paper we describe the PT's observing program, MOP, which automates most tasks necessary for observing. MOP's automated target selection is closely modeled on the actions a human observer might take, and is built upon a user interface that can be (and has been) used for manual operation. This results in an interface that makes it easy for an observer to track the activities of the automating procedures and intervene with minimum disturbance when necessary. MOP selects targets from the same list of standard star and calibration fields presented to the user, and chooses standard star fields covering ranges of airmass, color, and time necessary to monitor atmospheric extinction and produce a photometric solution. The software determines when additional standard star fields are unnecessary, and selects survey calibration fields according to availability and priority. Other automated features of MOP, such as maintaining the focus and keeping a night log, are also built around still functional manual interfaces, allowing the observer to be as active in observing as desired; MOP's automated features may be used as tools for manual observing, ignored entirely, or allowed to run the telescope with minimal supervision when taking routine data
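
    The airmass-coverage logic of automated target selection can be sketched as follows. This is an assumption-laden toy, not MOP itself: it picks the candidate standard-star field whose current airmass falls in the least-covered airmass bin, mimicking an observer filling out the extinction solution. The bins and field names are invented.

```python
# Invented airmass bins over which extinction monitoring needs coverage.
AIRMASS_BINS = [(1.0, 1.3), (1.3, 1.7), (1.7, 2.2)]

def bin_of(airmass):
    """Return the index of the airmass bin containing this value, or None."""
    for i, (lo, hi) in enumerate(AIRMASS_BINS):
        if lo <= airmass < hi:
            return i
    return None

def pick_field(candidates, observed_airmasses):
    """candidates: {field name: current airmass}. Pick a field in the
    airmass bin that has the fewest observations so far."""
    counts = [0] * len(AIRMASS_BINS)
    for a in observed_airmasses:
        b = bin_of(a)
        if b is not None:
            counts[b] += 1
    def key(name):
        b = bin_of(candidates[name])
        return counts[b] if b is not None else float("inf")
    # Sort names first so ties break deterministically.
    return min(sorted(candidates), key=key)
```

    As in MOP, the same selection function could back both an automated loop and a manual interface that merely suggests the next target.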

  10. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
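
    The fully automated response described above amounts to mapping a received demand-response signal to a preprogrammed local action with no manual intervention. The sketch below illustrates that idea only; the JSON field names and the action table are invented and do not follow the actual OpenADR 1.0 schema.

```python
import json

# Hypothetical preprogrammed responses for a building control system.
ACTIONS = {
    "normal":   "run normally",
    "moderate": "raise cooling setpoint 2F",
    "high":     "shed non-critical loads",
}

def handle_signal(payload: str) -> str:
    """Parse a (hypothetical) demand-response event and return the action."""
    event = json.loads(payload)
    level = event.get("signal_level", "normal")
    return ACTIONS.get(level, "run normally")

# Example: a 'moderate' price/reliability event arrives from the server.
signal = json.dumps({"event_id": "dr-101", "signal_level": "moderate"})
```

    The point of an open, interoperable data model is that any vendor's automation client can implement this receive-and-dispatch loop against the same signal format.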

  11. Instrument design and automation

    International Nuclear Information System (INIS)

    Wernlund, R.F.

    1984-01-01

    The ion mobility spectrometer-mass spectrometer (IMS-MS) is described; it consists of two separate instruments coupled in tandem: an ion mobility spectrometer and a quadrupole mass spectrometer. The two instruments operate at different pressures in a synergistic manner, supplying both drift time and mass information about ions which are formed at atmospheric pressure in the ion mobility spectrometer tube. Two types of ion intensity signals are presented to the data processor. The IMS produces an analog voltage with major components from dc to 5 kHz. The mass spectrometer signal output resides in the pulse count rate derived from a series of TTL-level pulses where each pulse represents the arrival of a single ion. The hardware, software, interfacing capabilities and basic data acquisition program are described in detail

  12. Spatial and temporal analysis of mass movement using dendrochronology

    NARCIS (Netherlands)

    Braam, R.R.; Weiss, E.E.J.; Burrough, P.A.

    1987-01-01

    Tree growth and inclination on sloping land are affected by mass movement. Suitable analysis of tree growth and tree form can therefore provide considerable information on mass movement activity. This paper reports a new, automated method for studying the temporal and spatial aspects of mass movement.

  13. Robust automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  14. Printing quality control automation

    Science.gov (United States)

    Trapeznikova, O. V.

    2018-04-01

    One of the most important problems in standardizing the offset printing process is controlling print quality and automating that control. To solve the problem, software has been developed that takes into account the specifics of the printing system components and their behavior during the printing process. To characterize the distribution of the ink layer on the printed substrate, the deviation of the ink-layer thickness on the sheet from a nominal surface is proposed. Constructing surface projections of the color-gamut bodies from the geometric data makes it possible to visualize the color-reproduction gamut of printing systems across brightness ranges and specific color sectors, which provides a qualitative comparison of systems by their reproduction of individual colors over varying ranges of brightness.

  15. Automated electronic filter design

    CERN Document Server

    Banerjee, Amal

    2017-01-01

    This book describes a novel, efficient and powerful scheme for designing and evaluating the performance characteristics of any electronic filter designed with predefined specifications. The author explains techniques that enable readers to eliminate complicated manual, and thus error-prone and time-consuming, steps of traditional design techniques. The presentation includes demonstration of efficient automation, using an ANSI C language program, which accepts any filter design specification (e.g. Chebyshev low-pass filter, cut-off frequency, pass-band ripple etc.) as input and generates as output a SPICE (Simulation Program with Integrated Circuit Emphasis) format netlist. Readers then can use this netlist to run simulations with any version of the popular SPICE simulator, increasing accuracy of the final results, without violating any of the key principles of the traditional design scheme.

  16. Berkeley automated supernova search

    Energy Technology Data Exchange (ETDEWEB)

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.

  17. Automated asteroseismic peak detections

    DEFF Research Database (Denmark)

    de Montellano, Andres Garcia Saravia Ortiz; Hekker, S.; Themessl, N.

    2018-01-01

    Space observatories such as Kepler have provided data that can potentially revolutionize our understanding of stars. Through detailed asteroseismic analyses we are capable of determining fundamental stellar parameters and reveal the stellar internal structure with unprecedented accuracy. However......, such detailed analyses, known as peak bagging, have so far been obtained for only a small percentage of the observed stars while most of the scientific potential of the available data remains unexplored. One of the major challenges in peak bagging is identifying how many solar-like oscillation modes are visible...... of detected oscillation modes. The algorithm presented here opens the possibility for detailed and automated peak bagging of the thousands of solar-like oscillators observed by Kepler....

  18. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical composi- tions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions...... for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...... of the successive notes and intervals, various sets of musical parameters may be invoked. In this chapter, a method is presented that allows for these heterogeneous patterns to be discovered. Motivic repetition with local ornamentation is detected by reconstructing, on top of “surface-level” monodic voices, longer...

  19. Berkeley automated supernova search

    International Nuclear Information System (INIS)

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982

  20. (No) Security in Automation!?

    CERN Document Server

    Lüders, S

    2008-01-01

    Modern Information Technologies like Ethernet, TCP/IP, web servers or FTP are nowadays increasingly used in distributed control and automation systems. Thus, information from the factory floor is now directly available at the management level (From Shop-Floor to Top-Floor) and can be manipulated from there. Despite the benefits coming with this (r)evolution, new vulnerabilities are inherited, too: worms and viruses spread within seconds via Ethernet and attackers are becoming interested in control systems. Unfortunately, control systems lack the standard security features that usual office PCs have. This contribution will elaborate on these problems, discuss the vulnerabilities of modern control systems and present international initiatives for mitigation.

  1. [Automated anesthesia record systems].

    Science.gov (United States)

    Heinrichs, W; Mönk, S; Eberle, B

    1997-07-01

    The introduction of electronic anaesthesia documentation systems was attempted as early as 1979, although their efficient application has become reality only in the past few years. The advantages of the electronic protocol are apparent: continuous high-quality documentation, comparability of data due to the availability of a data bank, reduction in the workload of the anaesthetist and availability of additional data. Disadvantages of the electronic protocol have also been discussed in the literature. By going through the process of entering data on the course of the anaesthetic procedure on the protocol sheet, the information is mentally absorbed and evaluated by the anaesthetist. This information may, however, be lost when the data are recorded fully automatically, without active involvement on the part of the anaesthetist. Recent publications state that by using intelligent alarms and/or integrated displays manual record keeping is no longer necessary for anaesthesia vigilance. The technical design of automated anaesthesia records depends on the integration of network technology into the hospital. It will be appropriate to connect the systems to the internet, but safety requirements have to be followed strictly. Concerning the database, client-server architecture as well as language standards like SQL should be used. Object-oriented databases will be available in the near future. Another future goal of automated anaesthesia record systems will be the use of knowledge-based technologies within these systems. Drug interactions, disease-related anaesthetic techniques and other information sources can be integrated. At this time, almost none of the commercially available systems has matured to a point where their purchase can be recommended without reservation. There is still a lack of standards for the subsequent exchange of data and a solution to a number of ergonomic problems still remains to be found. Nevertheless, electronic anaesthesia protocols will be required in

  2. Automation of the dicentric chromosome assay and related assays

    International Nuclear Information System (INIS)

    Balajee, Adayabalam S.; Dainiak, Nicholas

    2016-01-01

    Dicentric Chromosome Assay (DCA) is considered to be the 'gold standard' for personalized dose assessment in humans after accidental or incidental radiation exposure. Although this technique is superior to other cytogenetic assays in terms of specificity and sensitivity, its potential application to radiation mass casualty scenarios is highly restricted because DCA is time consuming and labor intensive when performed manually. Therefore, it is imperative to develop high-throughput automation techniques to make DCA suitable for radiological triage scenarios. At the Cytogenetic Biodosimetry Laboratory in Oak Ridge, efforts are underway to develop high-throughput automation of DCA. The current status of the development of various automated cytogenetic techniques for meeting the biodosimetry needs of radiological/nuclear incidents will be discussed.

  3. Taking action against violence.

    Science.gov (United States)

    Kunz, K

    1996-05-01

    A significant increase in violent crimes in recent years has forced Icelandic men to take action against violence. Television was seen as a major contributory factor in increasing violence: surveys indicate that 10-15 years after television broadcasting commences in a particular society, the incidence of crime can be expected to double. While the majority of individuals arrested for violent crimes are men, being male does not necessarily mean being violent. The Men's Committee of the Icelandic Equal Rights Council initiated a week-long information and education campaign under the theme "Men Against Violence". This campaign involved several events, including an art exhibit, speeches on violence in families and on treatment sought by those likely to resort to violence, distribution of a booklet among students in secondary schools, and a mass media campaign to raise public awareness of this pressing problem.

  4. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not generally been applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content: approximately 8 x 10^9 bits per 14 x 17 inch film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics applied in manufacturing and some NDT modalities should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 x 17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (Film Digital Radiography System), is moving toward 50 micron (approximately 16 lines/mm) resolution, which is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system. (author)

  5. Improving mass measurement accuracy in mass spectrometry based proteomics by combining open source tools for chromatographic alignment and internal calibration.

    Science.gov (United States)

    Palmblad, Magnus; van der Burgt, Yuri E M; Dalebout, Hans; Derks, Rico J E; Schoenmaker, Bart; Deelder, André M

    2009-05-02

    Accurate mass determination enhances peptide identification in mass spectrometry-based proteomics. Here we describe the combination of two previously published open source software tools to improve mass measurement accuracy in Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS). The first program, msalign, aligns one MS/MS dataset with one FTICRMS dataset. The second program, recal2, uses peptides identified from the MS/MS data for automated internal calibration of the FTICR spectra, resulting in sub-ppm mass measurement errors.

  6. DAPs: Deep Action Proposals for Action Understanding

    KAUST Repository

    Escorcia, Victor; Caba Heilbron, Fabian; Niebles, Juan Carlos; Ghanem, Bernard

    2016-01-01

    action proposals from long videos. We show how to take advantage of the vast capacity of deep learning models and memory cells to retrieve from untrimmed videos temporal segments, which are likely to contain actions. A comprehensive evaluation indicates

  7. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection methods, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important for us to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.

  8. Programmable Automated Welding System (PAWS)

    Science.gov (United States)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  9. Man-machine interface versus full automation

    International Nuclear Information System (INIS)

    Hatton, V.

    1984-01-01

    As accelerators grow in size and complexity of operation, there is an increasing economic as well as operational incentive for the controls and operations teams to use computers in the man-machine interface. At first the computer network replaced the traditional controls racks filled with knobs, buttons and digital displays of voltages and potentiometer readings; the computer system provided the operator with an extension of his hands and eyes. It was quickly found that much more could be achieved. Where previously it was necessary for the human operator to decide the order of the actions to be executed by the computer as a result of a visual indication of malfunctioning of the accelerator, now the operation is coming more and more under the direct control of the computer system. Expert knowledge is programmed into the system to help the non-specialist make decisions and to safeguard the equipment. Machine physics concepts have been incorporated, and critical machine parameters can be optimized easily by physicists or operators without any detailed knowledge of the intervening medium or of the equipment being controlled. As confidence grows and reliability improves, more and more automation can be added. How far can this process of automation replace the skilled operator? Can the accelerators of tomorrow be run like the ever-increasing robotic assembly plants of today? How is the role of the operator changing in this new environment?

  10. Givental action and trivialisation of circle action

    NARCIS (Netherlands)

    Dotsenko, V.; Shadrin, S.; Vallette, B.

    2015-01-01

    In this paper, we show that the Givental group action on genus zero cohomological field theories, also known as formal Frobenius manifolds or hypercommutative algebras, naturally arises in the deformation theory of Batalin-Vilkovisky algebras. We prove that the Givental action is equal to an action

  11. Default mode contributions to automated information processing.

    Science.gov (United States)

    Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A

    2017-11-28

    Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.

  12. WIRELESS HOME AUTOMATION SYSTEM BASED ON MICROCONTROLLER

    Directory of Open Access Journals (Sweden)

    MUNA H. SALEH

    2017-11-01

    Full Text Available This paper presents the development of a Global System for Mobile communications (GSM)-based controller for a home air-conditioner, as part of a home automation system. The main aim of the prototype development is to reduce electricity wastage. A GSM module was used for receiving Short Message Service (SMS) messages from the user's mobile phone, which automatically enable the controller to take further action such as switching the home air-conditioner ON or OFF. The system controls the air-conditioner based on the temperature reading from a sensor. At regular intervals, the temperature sensor sends the reading to the Micro Controller Unit (MCU) through ZigBee. Based on the temperature reading, the MCU sends an ON or OFF signal to the switch. Additionally, the system allows the user to operate or shut down the air-conditioner remotely through SMS.
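The control loop described in this record (sensor reading → MCU decision → ON/OFF signal to the switch) can be sketched as a threshold rule with hysteresis. The thresholds, function name, and hysteresis band below are illustrative assumptions of this sketch, not details taken from the paper:

```python
# Hypothetical sketch of the MCU decision rule: switch the air-conditioner
# ON above an upper temperature threshold and OFF below a lower one.
# The hysteresis band between the thresholds avoids rapid ON/OFF cycling
# around a single setpoint.

def ac_command(temp_c, currently_on, on_above=26.0, off_below=24.0):
    """Return 'ON' or 'OFF' for the air-conditioner switch."""
    if temp_c >= on_above:
        return "ON"
    if temp_c <= off_below:
        return "OFF"
    # Inside the hysteresis band: keep the current state.
    return "ON" if currently_on else "OFF"
```

In a deployment, the MCU would call such a rule on every sensor reading received over ZigBee and forward the result to the switch.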

  13. An Automation Survival Guide for Media Centers.

    Science.gov (United States)

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  14. Towards automated identification of metabolites using mass spectral trees

    NARCIS (Netherlands)

    Rojas-Chertó, Miquel

    2014-01-01

    The detailed description of the chemical compounds present in organisms, organs/tissues, biofluids and cells is the key to understand the complexity of biological systems. The small molecules (metabolites) are known to be very diverse in structure and function. However, the identification of the

  15. Automated mass spectrometric analysis of urinary and plasma serotonin

    NARCIS (Netherlands)

    de Jong, Wilhelmina H. A.; Wilkens, Marianne H. L. I.; de Vries, Elisabeth G. E.; Kema, Ido P.

    Serotonin emerges as crucial neurotransmitter and hormone in a growing number of different physiologic processes. Besides extensive serotonin production previously noted in patients with metastatic carcinoid tumors, serotonin now is implicated in liver cell regeneration and bone formation. The aim

  16. The Automator: Intelligent control system monitoring

    International Nuclear Information System (INIS)

    M. Bickley; D.A. Bryan; K.S. White

    1999-01-01

    A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points, and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented
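The core idea of the abstract, user-configured rules that watch control points and trigger actions when a condition fires, might be sketched as follows. The rule representation and names here are hypothetical illustrations, not the actual Automator design:

```python
# Minimal sketch of a user-configurable monitor: each rule pairs a
# predicate on a control point's value with an action to run when it fires.

def run_monitor(points, rules):
    """points: dict name -> value; rules: list of (point, predicate, action).
    Returns the list of action results triggered, in order."""
    triggered = []
    for name, predicate, action in rules:
        if name in points and predicate(points[name]):
            triggered.append(action(points[name]))
    return triggered

# Example rules: raise an alarm on high vacuum pressure, act on magnet temperature.
rules = [
    ("vacuum_pressure", lambda v: v > 1e-6, lambda v: f"ALARM: pressure {v:g}"),
    ("magnet_temp", lambda v: v > 60, lambda v: "ACTION: reduce current"),
]
print(run_monitor({"vacuum_pressure": 2e-6, "magnet_temp": 45}, rules))
```

In the real tool such rules would come from configuration files rather than inline lambdas, which is what makes the engine user-configurable without recompilation.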

  17. Decision Making In A High-Tech World: Automation Bias and Countermeasures

    Science.gov (United States)

    Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management system computers not only fly the aircraft, but also calculate fuel-efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air traffic controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self-reports, in studies using pilots in flight simulations, and in non-flight decision-making contexts with student samples. Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and

  18. A LARGE LIFE INSURANCE COMPANY AUTOMATES. WORKFORCE IMPLICATIONS OF COMPUTER CONVERSION. AUTOMATION PROGRAM REPORT, NUMBER 3.

    Science.gov (United States)

    CIBARICH, AUGUST L.; AND OTHERS

    This was one of 20 demonstration projects initiated in 11 states in 1961-63 to gain experience with labor market problems arising from changing technology and mass layoffs. The fundamental aim was to combine action and research to demonstrate what the state employment service could do in areas where the labor market was rapidly changing.…

  19. Home automation with Intel Galileo

    CERN Document Server

    Dundar, Onur

    2015-01-01

    This book is for anyone who wants to learn Intel Galileo for home automation and cross-platform software development. No knowledge of programming with Intel Galileo is assumed, but knowledge of the C programming language is essential.

  20. Strategic Transit Automation Research Plan

    Science.gov (United States)

    2018-01-01

    Transit bus automation could deliver many potential benefits, but transit agencies need additional research and policy guidance to make informed deployment decisions. Although funding and policy constraints may play a role, there is also a reasonable...

  1. The Evaluation of Automated Systems

    National Research Council Canada - National Science Library

    McDougall, Jeffrey

    2004-01-01

    .... The Army has recognized this change and is adapting to operate in this new environment. It has developed a number of automated tools to assist leaders in the command and control of their organizations...

  2. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  3. Automation of the testing procedure

    International Nuclear Information System (INIS)

    Haas, H.; Fleischer, M.; Bachner, E.

    1979-01-01

    For the judgement of technologies applied and the testing of specific components of the HTR primary circuit, complex test procedures and data evaluations are required. Extensive automation of these test procedures is indispensable. (orig.) [de

  4. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used in the working face include the shearer and the self-advancing frame. The shearer has been changed from the radio-controlled model to a microcomputer-operated machine, while automating various functions. In addition, a system for comprehensively examining operating and natural conditions in the working face is being developed for further automation. The self-advancing frame has been modified from the sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units and valves must be made smaller and more reliable. The system will be controlled above ground in the future, provided that the machines in the working face are remote-controlled at the gate while relevant data are transmitted above ground from this system. Thus, an automated working face will be realized. (2 figs, 1 photo)

  5. Synthesis of Automated Vehicle Legislation

    Science.gov (United States)

    2017-10-01

    This report provides a synthesis of issues addressed by state legislation regarding automated vehicles (AV); AV technologies are rapidly evolving and many states have developed legislation to govern AV testing and deployment and to assure safety on p...

  6. Fully automated parallel oligonucleotide synthesizer

    Czech Academy of Sciences Publication Activity Database

    Lebl, M.; Burger, Ch.; Ellman, B.; Heiner, D.; Ibrahim, G.; Jones, A.; Nibbe, M.; Thompson, J.; Mudra, Petr; Pokorný, Vít; Poncar, Pavel; Ženíšek, Karel

    2001-01-01

    Roč. 66, č. 8 (2001), s. 1299-1314 ISSN 0010-0765 Institutional research plan: CEZ:AV0Z4055905 Keywords : automated oligonucleotide synthesizer Subject RIV: CC - Organic Chemistry Impact factor: 0.778, year: 2001

  7. Automation and Human Resource Management.

    Science.gov (United States)

    Taft, Michael

    1988-01-01

    Discussion of the automation of personnel administration in libraries covers (1) new developments in human resource management systems; (2) system requirements; (3) software evaluation; (4) vendor evaluation; (5) selection of a system; (6) training and support; and (7) benefits. (MES)

  8. 75 FR 43537 - Mortgagee Review Board: Administrative Actions

    Science.gov (United States)

    2010-07-26

    ... Mortgage Corp., Inc., Margate, FL. 365. Florida Business Finance Corp., Jacksonville, FL (Titles 1 & 2... accurately identify Academy as the owner of the Web site; and failed to register the fictitious business name... requirements. 5. Automated Finance Corporation, Calabasas, CA [Docket No. 09-9825-MR] Action: On October 30...

  9. Impulsive action and motivation.

    Science.gov (United States)

    Frijda, Nico H

    2010-07-01

    This paper explores the way in which emotions are causal determinants of action. It argues that emotional events, as appraised by the individual, elicit changes in motive states (called states of action readiness), which in turn may (or may not) cause action. Actions can be elicited automatically, without prior intention (called impulsive actions), or intentionally. Impulsive actions reflect the simplest and biologically most general form in which emotions can cause action, since they require no reflection, no foresight, and no planning. Impulsive actions are determined conjointly by the nature of action readiness, the affordances perceived in the eliciting event as appraised, and the individual's action repertoire. Those actions from one's repertoire are performed that both match the perceived affordances and the aim of the state of action readiness. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. Anesthesiology, automation, and artificial intelligence.

    Science.gov (United States)

    Alexander, John C; Joshi, Girish P

    2018-01-01

    There have been many attempts to incorporate automation into the practice of anesthesiology, though none have been successful. Fundamentally, these failures are due to the underlying complexity of anesthesia practice and the inability of rule-based feedback loops to fully master it. Recent innovations in artificial intelligence, especially machine learning, may usher in a new era of automation across many industries, including anesthesiology. It would be wise to consider the implications of such potential changes before they have been fully realized.

  11. Virtual Machine in Automation Projects

    OpenAIRE

    Xing, Xiaoyuan

    2010-01-01

    Virtual machines, as an engineering tool, have recently been introduced into automation projects at Tetra Pak Processing System AB. The goal of this paper is to examine how to better utilize virtual machines for automation projects. This paper designs different project scenarios using virtual machines. It analyzes the installability, performance and stability of virtual machines from the test results. Technical solutions concerning virtual machines are discussed, such as the conversion with physical...

  12. Evolution of Home Automation Technology

    OpenAIRE

    Mohd. Rihan; M. Salim Beg

    2009-01-01

    In modern society home and office automation has become increasingly important, providing ways to interconnect various home appliances. This interconnection results in faster transfer of information within homes/offices, leading to better home management and improved user experience. Home automation, in essence, is a technology that integrates various electrical systems of a home to provide enhanced comfort and security. Users are granted convenient and complete control over all the electrical home appliances...

  13. Automated measuring systems. Automatisierte Messsysteme

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-01

    Microprocessors have become a regular component of automated measuring systems. Experts offer their experience and basic information in 24 lectures and 10 poster presentations. The focus is on the following: automated measuring, computer and microprocessor use, sensor technique, actuator technique, communication, interfaces, man-system interaction, disturbance tolerance and availability, as well as applications. A discussion meeting is dedicated to the theme complex of sensor digital signals, sensor interfaces and the sensor bus.

  14. Machine learning (Aprendizaje automático)

    OpenAIRE

    Moreno, Antonio

    1994-01-01

    This book introduces the basic concepts of one of the most actively studied branches of artificial intelligence: machine learning. Topics covered include inductive learning, analogical reasoning, explanation-based learning, neural networks, genetic algorithms, case-based reasoning, and theoretical approaches to machine learning.

  15. Safeguards through secure automated fabrication

    International Nuclear Information System (INIS)

    DeMerschman, A.W.; Carlson, R.L.

    1982-01-01

    Westinghouse Hanford Company, a prime contractor for the U.S. Department of Energy, is constructing the Secure Automated Fabrication (SAF) line for fabrication of mixed oxide breeder fuel pins. Fuel processing by automation, which provides a separation of personnel from fuel handling, will provide a means whereby advanced safeguards concepts will be introduced. Remote operations and the inter-tie between the process computer and the safeguards computer are discussed

  16. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  17. Manned spacecraft automation and robotics

    Science.gov (United States)

    Erickson, Jon D.

    1987-01-01

    The Space Station holds promise of being a showcase user and driver of advanced automation and robotics technology. The author addresses the advances in automation and robotics from the Space Shuttle - with its high-reliability redundancy management and fault tolerance design and its remote manipulator system - to the projected knowledge-based systems for monitoring, control, fault diagnosis, planning, and scheduling, and the telerobotic systems of the future Space Station.

  18. Home Automation and Security System

    OpenAIRE

    Surinder Kaur,; Rashmi Singh; Neha Khairwal; Pratyk Jain

    2016-01-01

    Easy Home, or home automation, plays a very important role in the modern era because of its flexibility: it can be used in different places with high precision, which saves money and time by reducing human effort. The prime focus of this technology is to control household equipment such as lights, fans, doors and air-conditioners automatically. This research paper has detailed information on a home automation and security system using Arduino and GSM, and on how we can control home appliances using an Android application....

  19. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  20. BARD: Better Automated Redistricting

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2011-08-01

    Full Text Available BARD is the first (and, at the time of writing, only) open source software package for general redistricting and redistricting analysis. BARD provides methods to create, display, compare, edit, automatically refine, evaluate, and profile political districting plans. BARD aims to provide a framework for scientific analysis of redistricting plans and to facilitate wider public participation in the creation of new plans. BARD facilitates map creation and refinement through command-line, graphical user interface, and automatic methods. Since redistricting is a computationally complex partitioning problem not amenable to an exact optimization solution, BARD implements a variety of selectable metaheuristics that can be used to refine existing or randomly-generated redistricting plans based on user-determined criteria. Furthermore, BARD supports automated generation of redistricting plans and profiling of plans by assigning different weights to various criteria, such as district compactness or equality of population. This functionality permits exploration of trade-offs among criteria. The intent of a redistricting authority may be explored by examining these trade-offs and inferring which reasonably observable plans were not adopted. Redistricting is a computationally intensive problem for even modest-sized states. Performance is thus an important consideration in BARD's design and implementation. The program implements performance enhancements such as evaluation caching, explicit memory management, and distributed computing across snow clusters.

  1. Automated uranium titration system

    International Nuclear Information System (INIS)

    Takahashi, M.; Kato, Y.

    1983-01-01

    An automated titration system based on the Davies-Gray method has been developed for accurate determination of uranium. The system consists of a potentiometric titrator with precise burettes, a sample changer, an electronic balance and a desk-top computer with a printer. Fifty-five titration vessels are loaded in the sample changer. The first three contain the standard solution for standardizing potassium dichromate titrant, and the next two and the last two contain the control samples for data quality assurance. The other forty-eight measurements are carried out for sixteen unknown samples. Sample solution containing about 100 mg uranium is taken in a titration vessel. At the pretreatment position, uranium (VI) is reduced to uranium (IV) by iron (II). After the valency adjustment, the vessel is transferred to the titration position. The rate of titrant addition is automatically controlled to be slower near the end-point. The last figure (0.01 mL) of the equivalent titrant volume for uranium is calculated from the potential change. The results obtained with this system on 100 mg uranium gave a precision of 0.2% (RSD,n=3) and an accuracy of better than 0.1%. Fifty-five titrations are accomplished in 10 hours. (author)
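The record's end-point determination "calculated from the potential change" can be illustrated numerically: a common generic approach takes the equivalence volume where the potential changes most steeply per unit of added titrant. This is a sketch of that general idea, not the instrument's actual algorithm:

```python
# Generic end-point estimate for a potentiometric titration: pick the
# titrant volume at which the measured potential rises most steeply
# (the maximum of dE/dV over adjacent titration points).

def end_point(volumes, potentials):
    """volumes, potentials: parallel lists of titrant volume (mL) and
    electrode potential (V). Returns the estimated equivalence volume."""
    best_i, best_slope = 1, float("-inf")
    for i in range(1, len(volumes)):
        slope = (potentials[i] - potentials[i - 1]) / (volumes[i] - volumes[i - 1])
        if slope > best_slope:
            best_slope, best_i = slope, i
    # Midpoint of the steepest interval approximates the equivalence volume.
    return (volumes[best_i] + volumes[best_i - 1]) / 2
```

Slowing titrant addition near the end-point, as the system does, places more data points in the steep region and so sharpens this estimate.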

  2. Automated asteroseismic peak detections

    Science.gov (United States)

    García Saravia Ortiz de Montellano, Andrés; Hekker, S.; Themeßl, N.

    2018-05-01

    Space observatories such as Kepler have provided data that can potentially revolutionize our understanding of stars. Through detailed asteroseismic analyses we are capable of determining fundamental stellar parameters and revealing the stellar internal structure with unprecedented accuracy. However, such detailed analyses, known as peak bagging, have so far been carried out for only a small percentage of the observed stars, while most of the scientific potential of the available data remains unexplored. One of the major challenges in peak bagging is identifying how many solar-like oscillation modes are visible in a power density spectrum. Identification of oscillation modes is usually done by visual inspection, which is time-consuming and introduces a degree of subjectivity. Here, we present a peak-detection algorithm especially suited for the detection of solar-like oscillations. It reliably characterizes the solar-like oscillations in a power density spectrum and estimates their parameters without human intervention. Furthermore, we provide a metric characterizing the false positive and false negative rates, giving further information about the reliability of a detected oscillation mode or the significance of a lack of detected oscillation modes. The algorithm presented here opens the possibility for detailed and automated peak bagging of the thousands of solar-like oscillators observed by Kepler.
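
    The peak-detection algorithm itself is described in the paper rather than in this abstract; purely as an illustration of the core idea (flagging local maxima that stand far above the local background of a power density spectrum), a crude thresholding detector might look like this in Python. The window size and signal-to-background ratio are invented parameters, and a real peak-bagging code would fit mode profiles and model the granulation background instead.

```python
import numpy as np

def detect_peaks(freq, power, snr=8.0, window=101):
    """Flag local maxima whose height exceeds `snr` times the local
    median background -- a crude stand-in for a full peak-bagging fit."""
    half = window // 2
    peaks = []
    for i in range(1, len(power) - 1):
        if power[i] <= power[i - 1] or power[i] <= power[i + 1]:
            continue  # not a local maximum
        lo, hi = max(0, i - half), min(len(power), i + half + 1)
        background = np.median(power[lo:hi])
        if power[i] > snr * background:
            peaks.append(freq[i])
    return peaks
```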

  3. Particle Accelerator Focus Automation

    Science.gov (United States)

    Lopes, José; Rocha, Jorge; Redondo, Luís; Cruz, João

    2017-08-01

    The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator based on the Van de Graaff machine, which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams with currents of a few μA and energies up to 2 MeV/q. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of total remote control in safe conditions.
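
    The scanning cycle described above can be pictured as a coarse sweep over the lens bias range followed by a fine sweep around the best coarse point, with the beam-stopper current as feedback. The Python sketch below is a schematic of that idea only; `read_current` stands in for the actual measurement, and the real system is implemented in LabVIEW.

```python
def focus_scan(read_current, v_min, v_max, coarse_step, fine_step):
    """Two-pass scan: a coarse sweep over the full bias range, then a
    fine sweep around the coarse maximum, using the beam-stopper
    current as feedback to pick the focusing voltage."""
    def sweep(lo, hi, step):
        best_v, best_i = lo, float("-inf")
        v = lo
        while v <= hi:
            i = read_current(v)   # feedback: current on the beam stopper
            if i > best_i:
                best_v, best_i = v, i
            v += step
        return best_v
    coarse = sweep(v_min, v_max, coarse_step)
    return sweep(max(v_min, coarse - coarse_step),
                 min(v_max, coarse + coarse_step), fine_step)
```

    For a unimodal current-versus-voltage curve, the two-pass sweep lands on the maximizing bias without scanning the whole range at fine resolution.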

  4. Particle Accelerator Focus Automation

    Directory of Open Access Journals (Sweden)

    Lopes José

    2017-08-01

    Full Text Available The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator based on the Van de Graaff machine, which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams with currents of a few μA and energies up to 2 MeV/q. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of total remote control in safe conditions.

  5. Automated ISS Flight Utilities

    Science.gov (United States)

    Offermann, Jan Tuzlic

    2016-01-01

    During my internship at NASA Johnson Space Center, I worked in the Space Radiation Analysis Group (SRAG), where I was tasked with a number of projects focused on the automation of tasks and activities related to the operation of the International Space Station (ISS). Since I worked on a number of projects, I have written short sections below giving a description of each, followed by more general remarks on the internship experience. My first project is titled "General Exposure Representation EVADOSE", also known as "GEnEVADOSE". This project involved the design and development of a C++/ROOT framework focused on radiation exposure for extravehicular activity (EVA) planning for the ISS. The utility helps mission managers plan EVAs by displaying information on the cumulative radiation doses that crew will receive during an EVA as a function of the egress time and duration of the activity. SRAG uses a utility called EVADOSE, employing a model of the space radiation environment in low Earth orbit to predict these doses, since astronauts outside the ISS have less shielding from charged particles such as electrons and protons. However, EVADOSE output is cumbersome to work with, and prior to GEnEVADOSE, querying data and producing graphs of ISS trajectories and cumulative doses versus egress time required manual work in Microsoft Excel. GEnEVADOSE automates all this work, reading in EVADOSE output file(s) along with a user-supplied plaintext file of input parameters. GEnEVADOSE will output a text file containing all the necessary dosimetry for each proposed EVA egress time, for each specified EVADOSE file. It also plots cumulative dose versus egress time and the ISS trajectory, and displays all of this information in an auto-generated presentation made in LaTeX.
New features have also been added, such as best-case scenarios (egress times corresponding to the least dose), interpolated curves for trajectories, and the ability to query any time in the
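
    As an illustration of the "best-case scenario" feature mentioned above (finding the egress time with the least dose) combined with interpolation between tabulated points, a minimal Python reconstruction could look like the following; the function and its inputs are hypothetical stand-ins, since GEnEVADOSE itself is a C++/ROOT framework.

```python
def best_egress(times, doses, query_step=0.1):
    """Linearly interpolate cumulative EVA dose between tabulated egress
    times and return the egress time with the least predicted dose."""
    def dose_at(t):
        for i in range(len(times) - 1):
            if times[i] <= t <= times[i + 1]:
                f = (t - times[i]) / (times[i + 1] - times[i])
                return doses[i] + f * (doses[i + 1] - doses[i])
        raise ValueError("query outside tabulated range")
    best_t, best_d = times[0], doses[0]
    t = times[0]
    while t <= times[-1]:
        d = dose_at(t)
        if d < best_d:
            best_t, best_d = t, d
        t = round(t + query_step, 10)  # avoid float drift in the sweep
    return best_t, best_d
```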

  6. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloudless night, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or brighter, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  7. Mass discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Broeckman, A. [Rijksuniversiteit Utrecht (Netherlands)

    1978-12-15

    In thermal ionization mass spectrometry, the phenomenon of mass discrimination has led to the use of a correction factor for isotope-ratio measurements. The correction factor is defined as the measured ratio divided by the true or accepted value of this ratio. In fact, this factor corrects for systematic errors of the whole procedure; however, mass discrimination is often attributed to the mass spectrometer alone.
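
    In symbols, the correction factor is K = R_measured / R_true, and subsequent measured ratios are divided by K to remove the procedure's systematic bias. A minimal numeric sketch (the isotope-ratio values are invented for illustration):

```python
def correction_factor(measured_ratio, accepted_ratio):
    """K = measured / true; K captures the systematic bias of the whole procedure."""
    return measured_ratio / accepted_ratio

def correct(measured_ratio, k):
    """Remove the bias from a later measurement by dividing by K."""
    return measured_ratio / k
```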

  8. Negative mass

    International Nuclear Information System (INIS)

    Hammond, Richard T

    2015-01-01

    Some physical aspects of negative mass are examined. Several unusual properties, such as the ability of negative mass to penetrate any armor, are analysed. Other surprising effects include the bizarre system of negative mass chasing positive mass, naked singularities and the violation of cosmic censorship, wormholes, and quantum mechanical results as well. In addition, a brief look into the implications for strings is given. (paper)

  9. An automation model of Effluent Treatment Plant

    Directory of Open Access Journals (Sweden)

    Luiz Alberto Oliveira Lima Roque

    2012-07-01

    Full Text Available Population growth and the intensification of industrial activities have increased the deterioration of natural resources. Industrial, hospital and residential wastes are dumped directly into landfills without processing, polluting soils. This has later consequences, because the liquid substance resulting from the putrefaction of organic material seeps through the soil to reach water bodies. Cities arise without planning, and industrial and household wastes are discharged into rivers, lakes and oceans without proper treatment, affecting water resources. It is well known that in the next century there will be fierce competition for fresh water on the planet, probably due to its scarcity. Demographic expansion has occurred without proper sanitary planning, degrading oceans, lakes and rivers. Thus, a large percentage of the world population suffers from diseases related to water pollution. Accordingly, it can be concluded that sewage treatment is essential to human survival and to preserving rivers, lakes and oceans. An Effluent Treatment Plant (ETP) treats wastewater to reduce its pollution to acceptable levels before sending it to the oceans or rivers. To automate the operation of an ETP, motors, sensors, logic blocks, timers and counters are needed. These functions are achieved with programmable logic controllers (PLC) and supervisory systems. The Ladder language is used to program the controllers and is a pillar of Automation and Control Engineering. The supervisory systems allow process information to be monitored, while the PLCs are responsible for control and data acquisition. In the age we live in, process automation is used on an increasing scale in order to provide higher quality, raise productivity and improve the proposed activities. Therefore, an automatic ETP will improve performance and efficiency in handling large volumes of sewage. Considering the growing importance of environmental awareness with special emphasis
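
    The abstract does not reproduce any of the plant's ladder programs; as a flavour of the kind of logic a PLC scan cycle evaluates, here is one hypothetical seal-in rung for a lift pump, transcribed into Python (sensor and coil names are invented for illustration):

```python
def pump_rung(level_high, level_low, pump_on):
    """One PLC scan of a seal-in rung: start the pump when the high-level
    sensor trips, latch while running, and stop once the low-level
    sensor reports the tank is drained (level_low is True)."""
    return (level_high or pump_on) and not level_low
```

    In ladder notation this is a parallel branch of the start contact and the coil's own seal-in contact, in series with a normally closed stop contact.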

  10. Mass screening in breast cancer

    International Nuclear Information System (INIS)

    Strax, P.

    1977-01-01

    Some questions about mass screening in breast cancer are answered, it being concluded that: 1. mass screening for the detection of early breast cancer is the only means with proven potential for lowering the death rate of the disease; 2. mammography is an important - if not the most important - modality in mass screening; 3. the new film-screen combinations generally available are capable of producing mammograms of excellent quality with radiation doses down to 0.1 rad to the breast. The risk of malignant changes from such a dose, even when given periodically, is negligible. New equipment, to be available shortly, will use the new film-screen combinations in an automated manner, which must reduce costs in time, film, personnel and processing by more than 50%. This would make mass screening more practical. (M.A.) [pt

  11. Nominal Mass?

    Science.gov (United States)

    Attygalle, Athula B; Pavlov, Julius

    2017-08-01

    The current IUPAC-recommended definition of the term "nominal mass," based on the most abundant naturally occurring stable isotope of an element, is flawed. We propose that nominal mass should instead be defined as the sum of the integer masses of the protons and neutrons in any chemical species. In this way, all isotopes and isotopologues can be assigned a definitive identifier.
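
    Under the proposed definition, the nominal mass of a species is simply the sum of integer (proton + neutron) masses over its atoms, so each isotopologue gets its own identifier. A minimal sketch (the small nuclide table is illustrative):

```python
# integer masses (protons + neutrons) for a few common nuclides
NOMINAL = {"H": 1, "C": 12, "C13": 13, "N": 14, "O": 16, "S": 32}

def nominal_mass(formula):
    """Sum of integer nuclide masses, e.g. {'C': 1, 'O': 2} -> 44 for CO2;
    the isotopologue {'C13': 1, 'O': 2} gives 45 instead."""
    return sum(NOMINAL[nuclide] * count for nuclide, count in formula.items())
```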

  12. Significant NRC Enforcement Actions

    Data.gov (United States)

    Nuclear Regulatory Commission — This dataset provides a list of Nuclear Regulatory Commission (NRC) issued significant enforcement actions. These actions, referred to as "escalated", are issued by...

  13. Automated gravity gradient tensor inversion for underwater object detection

    International Nuclear Information System (INIS)

    Wu, Lin; Tian, Jinwen

    2010-01-01

    Underwater abnormal object detection is a current need for the navigation security of autonomous underwater vehicles (AUVs). In this paper, an automated gravity gradient tensor inversion algorithm is proposed for the purpose of passive underwater object detection. Full-tensor gravity gradient anomalies induced by an object in the partial area can be measured with the technique of gravity gradiometry on an AUV. The automated algorithm then uses these anomalies in an inverse method to estimate the mass and barycentre location of the arbitrarily shaped object. A few tests on simple synthetic models are illustrated in order to evaluate the feasibility and accuracy of the new algorithm. Moreover, the method is applied to a complicated model of an abnormal object with gradiometer and AUV noise, and interference from a neighbouring illusive smaller object. In all cases tested, the estimated mass and barycentre location parameters are found to be in good agreement with the actual values
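
    The forward model behind such an inversion is the gravity gradient tensor of a compact source; for a point mass it has a closed form, and a toy inversion can recover mass and barycentre by brute-force search over candidate parameters. The sketch below is a deliberately simplified stand-in for the paper's automated algorithm (which handles arbitrary shapes and noise):

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def point_mass_tensor(obs, src, mass):
    """Full gravity gradient tensor of a point mass at `src`,
    observed at `obs` (both 3-vectors in metres)."""
    d = np.asarray(src, float) - np.asarray(obs, float)
    r = np.linalg.norm(d)
    return G * mass * (3.0 * np.outer(d, d) - r**2 * np.eye(3)) / r**5

def invert(observations, tensors, depths, masses, xy):
    """Brute-force grid search over candidate (x, y, depth, mass),
    minimising the summed squared tensor misfit."""
    best, best_err = None, float("inf")
    for x in xy:
        for y in xy:
            for z in depths:
                for m in masses:
                    err = sum(
                        np.sum((point_mass_tensor(o, (x, y, z), m) - t) ** 2)
                        for o, t in zip(observations, tensors))
                    if err < best_err:
                        best, best_err = (x, y, z, m), err
    return best
```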

  14. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation)

    Science.gov (United States)

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-01-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing, and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA), through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-tandem mass spectrometry (LC-MS/MS). LC-MS/MS has been automated, but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology.
It is a given that automation will continue to evolve in the clinical laboratory

  15. Factory automation for heavy electric equipment

    International Nuclear Information System (INIS)

    Rokutani, Takashi; Ninomiya, Iwao; Hatayama, Naokatsu; Kato, Hiroshi; Yano, Hideaki.

    1986-01-01

    The heightening of productivity in factories manufacturing heavy electric equipment has so far been advanced through the rationalization of direct work, such as NC machines and robots, and the adoption of FMS (flexible manufacturing systems). However, as CAD advances, effective utilization of CAD data and expansion toward future CIM (computer integrated manufacturing) have come to be demanded. At the Hitachi Works of Hitachi Ltd., it was decided to advance comprehensive rationalization by adopting an FA (factory automation) system. Steam turbine blades, pipings for nuclear power plants and motor coils were taken up as the objects, since these are important parts, and the FA projects for these three sections were planned simultaneously for the purposes of drastically raising the QA level, improving accuracy, shortening processes, synchronizing with the field installation schedule, and creating a safe working environment. When the automation of non-mass-production factories is promoted, there are unmanned factories combining FMS lines for relatively many products, and those characterized by FMC, which shortens preparation time for small-lot products; this is an example of the former. The system constitution for FA and the production management combined with it are described. The high reliability of the optical network was regarded as important. (Kako, I.)

  16. Automated Core Design

    International Nuclear Information System (INIS)

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2005-01-01

    Multistate searching methods are a subfield of distributed artificial intelligence that aims to provide both principles for construction of complex systems involving multiple states and mechanisms for coordination of independent agents' actions. This paper proposes a multistate searching algorithm with reinforcement learning for the automatic core design of a boiling water reactor. The characteristics of this algorithm are that the coupling structure and the coupling operation suitable for the assigned problem are assumed and an optimal solution is obtained by mutual interference in multistate transitions using multiagents. Calculations in an actual plant confirmed that the proposed algorithm increased the convergence ability of the optimization process
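
    The multistate, multiagent reinforcement-learning scheme itself is not detailed in this record. As a generic illustration of stochastic search over a core loading pattern (not the authors' method), a single-agent simulated-annealing swap search over assembly positions looks like this; the `cost` function standing in for a core-physics evaluation is supplied by the caller:

```python
import math
import random

def anneal(layout, cost, iters=5000, t0=1.0, seed=1):
    """Annealed swap search over a loading pattern. `layout` is a list of
    assembly reactivities by position; `cost` scores a candidate pattern
    (lower is better). Worsening swaps are accepted with a probability
    that shrinks as the temperature falls."""
    rng = random.Random(seed)
    cur = list(layout)
    cur_c = cost(cur)
    best, best_c = list(cur), cur_c
    for step in range(iters):
        t = t0 * (1.0 - step / iters) + 1e-9   # linear cooling schedule
        i, j = rng.randrange(len(cur)), rng.randrange(len(cur))
        cur[i], cur[j] = cur[j], cur[i]
        c = cost(cur)
        if c <= cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur_c = c
            if c < best_c:
                best, best_c = list(cur), c
        else:
            cur[i], cur[j] = cur[j], cur[i]    # undo rejected swap
    return best, best_c
```

    A real design code would replace `cost` with a neutronics evaluation (e.g. a power-peaking factor from a core simulator), which is where most of the computational effort lies.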

  17. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... National Customs Automation Program (NCAP) test concerning the simplified entry functionality in the... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of...

  18. SAIL: automating interlibrary loan.

    Science.gov (United States)

    Lacroix, E M

    1994-01-01

    The National Library of Medicine (NLM) initiated the System for Automated Interlibrary Loan (SAIL) pilot project to study the feasibility of using imaging technology linked to the DOCLINE system to deliver copies of journal articles. During the project, NLM converted a small number of print journal issues to electronic form, linking the captured articles to the MEDLINE citation unique identifier. DOCLINE requests for these journals that could not be filled by network libraries were routed to SAIL. Nearly 23,000 articles from sixty-four journals recently selected for indexing in Index Medicus were scanned to convert them to electronic images. During fiscal year 1992, 4,586 scanned articles were used to fill 10,444 interlibrary loan (ILL) requests, and more than half of these were used only once. Eighty percent of all the articles were not requested at all. The total cost per article delivered was $10.76, substantially more than it costs to process a photocopy request. Because conversion costs were the major component of the total SAIL cost, and most of the articles captured for the project were not requested, this model was not cost-effective. Data on SAIL journal article use was compared with all ILL requests filled by NLM for the same period. Eighty-eight percent of all articles requested from NLM were requested only once. The results of the SAIL project demonstrated that converting journal articles to electronic images and storing them in anticipation of repeated requests would not meet NLM's objective to improve interlibrary loan. PMID:8004020

  19. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers belong to the Electric Power System equipment whose reliability influences, to a great extent, the reliability of Power Plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure of a breaker to switch off a short circuit, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breakers' reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. But this demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, about their failures, testing and repair, advanced software and computer technologies, and a specific automated information system (AIS). The new AIS, with the AISV logo, was developed at the "Reliability of power equipment" department of AzRDSI of Energy. The main features of AISV are: to provide data-base security and accuracy; to carry out systematic control of breakers' conformity with operating conditions; to estimate individual reliability values and the characteristics of their change for a given combination of characteristics; and to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for its realization.

  20. Automation of solar plants

    Energy Technology Data Exchange (ETDEWEB)

    Yebra, L.J.; Romero, M.; Martinez, D.; Valverde, A. [CIEMAT - Plataforma Solar de Almeria, Tabernas (Spain); Berenguel, M. [Almeria Univ. (Spain). Departamento de Lenguajes y Computacion

    2004-07-01

    This work overviews some of the main activities and research lines being carried out within the scope of the specific collaboration agreement between the Plataforma Solar de Almeria-CIEMAT (PSA-CIEMAT) and the Automatic Control, Electronics and Robotics research group of the Universidad de Almeria (TEP197), titled ''Development of control systems and tools for thermosolar plants'', and the projects financed by the MCYT DPI2001-2380-C02-02 and DPI2002-04375-C03. The research is driven by the need to improve the efficiency of the process through which the energy provided by the sun is totally or partially used as an energy source, as well as to reduce the costs associated with the operation and maintenance of the installations that use this energy source. The final objective is to develop different automatic control systems and techniques aimed at improving the competitiveness of solar plants. The paper summarizes different objectives and automatic control approaches that are being implemented in different facilities at the PSA-CIEMAT: central receiver systems and the solar furnace. For each of these facilities, a systematic procedure is being followed, composed of several steps: (i) development of dynamic models using the newest modeling technologies (both for simulation and control purposes); (ii) development of fully automated data acquisition and control systems, including software tools facilitating the analysis of data and the application of knowledge to the controlled plants; and (iii) synthesis of advanced controllers using techniques successfully applied in the process industry and development of new, optimized control algorithms for solar plants. These aspects are summarized in this work. (orig.)