WorldWideScience

Sample records for automated mass action

  1. Mining Repair Actions for Guiding Automated Program Fixing

    OpenAIRE

    Martinez, Matias; Monperrus, Martin

    2012-01-01

    Automated program fixing consists of generating source code in order to fix bugs in an automated manner. Our intuition is that automated program fixing can imitate human-based program fixing. Hence, we present a method to mine repair actions from software repositories. A repair action is a small semantic modification on code such as adding a method call. We then decorate repair actions with a probability distribution also learnt from software repositories. Our probabilistic repair models enab...

  2. Impact of automation on mass spectrometry.

    Science.gov (United States)

    Zhang, Yan Victoria; Rockwood, Alan

    2015-10-23

    Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity, and multiplex capability, and it can measure analytes in various matrices. Despite these advantages, LC-MS/MS remains costly and labor-intensive and has limited throughput. This specialized technology requires highly trained personnel and has therefore largely been limited to large institutions, academic organizations and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on essential features for LC-MS/MS total automation and proposes a step-wise, incremental approach to achieve total automation by reducing human intervention, increasing throughput and eventually integrating the LC-MS/MS system into automated clinical laboratory operations. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Computer automation of an accelerator mass spectrometry system

    International Nuclear Information System (INIS)

    Gressett, J.D.; Maxson, D.L.; Matteson, S.; McDaniel, F.D.; Duggan, J.L.; Mackey, H.J.; North Texas State Univ., Denton, TX; Anthony, J.M.

    1989-01-01

    The determination of trace impurities in electronic materials using accelerator mass spectrometry (AMS) requires efficient automation of the beam transport and mass discrimination hardware. The ability to choose between a variety of charge states, isotopes and injected molecules is necessary to provide survey capabilities similar to that available on conventional mass spectrometers. This paper will discuss automation hardware and software for flexible, high-sensitivity trace analysis of electronic materials, e.g. Si, GaAs and HgCdTe. Details regarding settling times will be presented, along with proof-of-principle experimental data. Potential and present applications will also be discussed. (orig.)

  4. A portable, automated, inexpensive mass and balance calibration system

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1987-01-01

    Reliable mass measurements are essential for a nuclear production facility or process control laboratory. DOE Order 5630.2 requires that traceable standards be used to calibrate and monitor equipment used for nuclear material measurements. To ensure the reliability of mass measurements and to comply with DOE traceability requirements, a portable, automated mass and balance calibration system is used at the Savannah River Plant. Automation is achieved using an EPSON HX-20 notebook computer, which can operate via RS232C interfacing to electronic balances or with manual data entry if computer interfacing is not feasible. This economical, comprehensive, user-friendly system has three main functions in a mass measurement control program (MMCP): balance certification, calibration of mass standards, and daily measurement of traceable standards. The balance certification program tests for accuracy, precision, sensitivity, linearity, and corner loading against specific requirements. The mass calibration program allows rapid calibration of inexpensive mass standards traceable to certified Class S standards. The MMCP permits daily measurement of traceable standards to monitor the reliability of balances during routine use. The automated system verifies balance calibration, stores results for future use, and provides a printed control chart of the stored data. The system also offers three different weighing routines that accommodate the need for varying degrees of reliability in routine weighing operations.
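The daily measurement-control function described above amounts to a control-chart check: a weighing of a traceable standard must fall within limits derived from earlier calibration data. This is an illustrative sketch only; the function names and the 3-sigma limit are assumptions, not details of the SRP system.

```python
def control_limits(calibration_weighings, k=3.0):
    """Mean +/- k*sigma control limits from historical weighings of a standard."""
    n = len(calibration_weighings)
    mean = sum(calibration_weighings) / n
    var = sum((x - mean) ** 2 for x in calibration_weighings) / (n - 1)
    return mean - k * var ** 0.5, mean + k * var ** 0.5

def balance_in_control(todays_weighing, calibration_weighings):
    """Daily check: does today's weighing of the standard fall within limits?"""
    lo, hi = control_limits(calibration_weighings)
    return lo <= todays_weighing <= hi

history = [100.0002, 100.0001, 99.9999, 100.0000, 100.0003, 99.9998]  # grams
print(balance_in_control(100.0001, history))  # True: within limits
print(balance_in_control(100.0100, history))  # False: gross error flagged
```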

  5. A portable, automated, inexpensive mass and balance calibration system

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1987-01-01

    Reliable mass measurements are essential for a nuclear production facility or process control laboratory. DOE Order 5630.2 requires that traceable standards be used to calibrate and monitor equipment used for nuclear material measurements. To ensure the reliability of mass measurements and to comply with DOE traceability requirements, a portable, automated mass and balance calibration system is used at the Savannah River Plant. Automation is achieved using an EPSON HX-20 notebook computer, which can operate via RS232C interfacing to electronic balances or with manual data entry if computer interfacing is not feasible. This economical, comprehensive, user-friendly system has three main functions in a mass measurement control program (MMCP): balance certification, calibration of mass standards, and daily measurement of traceable standards. The balance certification program tests for accuracy, precision, sensitivity, linearity, and corner loading against specific requirements. The mass calibration program allows rapid calibration of inexpensive mass standards traceable to certified Class S standards. The MMCP permits daily measurement of traceable standards to monitor the reliability of balances during routine use. The automated system verifies balance calibration, stores results for future use, and provides a printed control chart of the stored data. The system also offers three different weighing routines that accommodate the need for varying degrees of reliability in routine weighing operations. 1 ref.

  6. UV Photodissociation Action Spectroscopy of Haloanilinium Ions in a Linear Quadrupole Ion Trap Mass Spectrometer

    Science.gov (United States)

    Hansen, Christopher S.; Kirk, Benjamin B.; Blanksby, Stephen J.; O'Hair, Richard A. J.; Trevitt, Adam J.

    2013-06-01

    UV-vis photodissociation action spectroscopy is becoming increasingly prevalent because of advances in, and commercial availability of, ion trapping technologies and tunable laser sources. This study outlines in detail an instrumental arrangement, combining a commercial ion-trap mass spectrometer and tunable nanosecond pulsed laser source, for performing fully automated photodissociation action spectroscopy on gas-phase ions. The components of the instrumentation are outlined, including the optical and electronic interfacing, in addition to the control software for automating the experiment and performing online analysis of the spectra. To demonstrate the utility of this ensemble, the photodissociation action spectra of 4-chloroanilinium, 4-bromoanilinium, and 4-iodoanilinium cations are presented and discussed. Multiple photoproducts are detected in each case and the photoproduct yields are followed as a function of laser wavelength. It is shown that the wavelength-dependent partitioning of the halide loss, H loss, and NH3 loss channels can be broadly rationalized in terms of the relative carbon-halide bond dissociation energies and processes of energy redistribution. The photodissociation action spectrum of (phenyl)Ag2+ is compared with a literature spectrum as a further benchmark.
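The wavelength-by-wavelength workflow described above reduces, at each laser setting, to computing a power-normalized photoproduct yield from the mass spectrum. The following is a hedged sketch with assumed names and synthetic intensities; nothing here is from the authors' control software.

```python
def photo_yield(precursor, products, pulse_energy_mj):
    """Fraction of ion signal converted to photoproducts, per mJ of pulse energy."""
    total = precursor + sum(products.values())
    return (sum(products.values()) / total) / pulse_energy_mj

# hypothetical intensities at one wavelength for a haloanilinium precursor
scan = {"precursor": 9000.0,
        "products": {"halide_loss": 600.0, "H_loss": 300.0, "NH3_loss": 100.0},
        "pulse_energy_mj": 2.0}
y = photo_yield(scan["precursor"], scan["products"], scan["pulse_energy_mj"])
print(round(y, 3))  # fractional photoproduct yield per mJ
```

Repeating this at each wavelength, channel by channel, produces the action spectrum for each loss pathway.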

  7. Mass Action and Conservation of Current

    Directory of Open Access Journals (Sweden)

    Eisenberg Robert S.

    2016-10-01

    The law of mass action does not force a series of chemical reactions to have the same current flow everywhere. Interruption of far-away current does not stop current everywhere in a series of chemical reactions (analyzed according to the law of mass action), and so the law does not obey Maxwell's equations. An additional constraint and equation are needed to enforce global continuity of current. The additional constraint is introduced in this paper in the special case in which the chemical reaction describes spatial movement through narrow channels. In that case, a fully consistent treatment is possible using different models of charge movement. The general case must be dealt with by variational methods that enforce consistency of all the physical laws involved. Violations of current continuity arise away from equilibrium, when current flows and the law of mass action is applied to a non-equilibrium situation, different from the systems considered when the law was originally derived. Device design in the chemical world is difficult because simple laws are not obeyed in that way. Rate constants of the law of mass action are found experimentally to change from one set of conditions to another. The law of mass action is not robust in most cases and cannot serve the same role that circuit models do in our electrical technology. Robust models and device designs in the chemical world will not be possible until continuity of current is embedded in a generalization of the law of mass action using a consistent variational model of energy and dissipation.
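The central claim, that mass-action kinetics does not by itself enforce equal current through consecutive reactions, can be seen with a minimal numeric example (my illustration, not the paper's): for A -> B -> C the two reaction fluxes J1 = k1*[A] and J2 = k2*[B] need not be equal away from steady state.

```python
# Consecutive first-order reactions A -> B -> C under the law of mass action.
k1, k2 = 1.0, 1.0
A, B = 1.0, 0.0   # initial state: all material in A
J1 = k1 * A       # flux ("current") through the first reaction
J2 = k2 * B       # flux through the second reaction
print(J1, J2)     # 1.0 0.0 -> unequal currents; nothing enforces continuity
```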

  8. Automated Intelligent Assistant for mass spectrometry operation

    International Nuclear Information System (INIS)

    Filby, E.E.; Rankin, R.A.; Yoshida, D.E.

    1991-01-01

    The Automated Intelligent Assistant is designed to ensure that our mass spectrometers produce timely, high-quality measurement data. The design combines instrument interfacing and expert system technology to automate an adaptable set-point damage prevention strategy. When shutdowns occur, the Assistant can help guide troubleshooting efforts. Stored real-time data will help our development program upgrade and improve the system, and also make it possible to re-run previously observed instrument problems as ''live'' training exercises for the instrument operators. Initial work has focused on implementing the Assistant for the instrument's ultra-high vacuum components. 14 refs., 5 figs

  9. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time...... infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in data processing techniques to fully exploit the gain in resolution and accuracy...... infusion analyses of crude extracts to find the relationship between species from several terverticillate Penicillium species, and also that the ions responsible for the segregation can be identified. Furthermore, the method can automate the detection of unique species and unique metabolites....

  10. Automated mass correction and data interpretation for protein open-access liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Wagner, Craig D; Hall, John T; White, Wendy L; Miller, Luke A D; Williams, Jon D

    2007-02-01

    Characterization of recombinant protein purification fractions and final products by liquid chromatography-mass spectrometry (LC/MS) is requested more frequently each year. A protein open-access (OA) LC/MS system was developed in our laboratory to meet this demand. This paper compares the system that we originally implemented in our facilities in 2003 to the one now in use, and discusses, in more detail, recent enhancements that have improved its robustness, reliability, and data reporting capabilities. The system utilizes instruments equipped with reversed-phase chromatography and an orthogonal accelerated time-of-flight mass spectrometer fitted with an electrospray source. Sample analysis requests are accomplished using a simple form on a web-enabled laboratory information management system (LIMS). This distributed form is accessible from any intranet-connected company desktop computer. Automated data acquisition and processing are performed using a combination of in-house (OA-Self Service, OA-Monitor, and OA-Analysis Engine) and vendor-supplied programs (AutoLynx, and OpenLynx) located on acquisition computers and off-line processing workstations. Analysis results are then reported via the same web-based LIMS. Also presented are solutions to problems not addressed on commercially available, small-molecule OA-LC/MS systems. These include automated transforming of mass-to-charge (m/z) spectra to mass spectra and automated data interpretation that considers minor variants to the protein sequence-such as common post-translational modifications (PTMs). Currently, our protein OA-LC/MS platform runs on five LC/MS instruments located in three separate GlaxoSmithKline R&D sites in the US and UK. To date, more than 8000 protein OA-LC/MS samples have been analyzed. With these user-friendly and highly automated OA systems in place, mass spectrometry plays a key role in assessing the quality of recombinant proteins, either produced at our facilities or bought from external
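The automated transformation of m/z spectra to mass spectra mentioned above rests on standard charge-state deconvolution for multiply protonated proteins: adjacent electrospray peaks at charges z and z+1 determine both the charge and the neutral mass. The sketch below is the textbook two-peak version, an illustration only, not the OA-Analysis Engine's algorithm.

```python
PROTON = 1.00728  # proton mass, Da

def neutral_mass(mz_z, mz_z1):
    """Infer charge and neutral mass from two adjacent charge-state peaks.

    mz_z  : m/z of the peak carrying z protons
    mz_z1 : m/z of the adjacent peak carrying z+1 protons (lower m/z)
    """
    # M = z*(mz_z - p) = (z+1)*(mz_z1 - p)  =>  z = (mz_z1 - p)/(mz_z - mz_z1)
    z = round((mz_z1 - PROTON) / (mz_z - mz_z1))
    return z, z * (mz_z - PROTON)

# synthetic peaks for a 10 kDa protein observed at z = 10 and z = 11
z, mass = neutral_mass(1001.00728, 910.09819)
print(z, round(mass, 2))  # charge and deconvoluted neutral mass
```

In practice the whole charge-state envelope is combined (and averaged) rather than a single pair of peaks.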

  11. Computer automated mass spectrometer for isotope analysis on gas samples

    International Nuclear Information System (INIS)

    Pamula, A.; Kaucsar, M.; Fatu, C.; Ursu, D.; Vonica, D.; Bendea, D.; Muntean, F.

    1998-01-01

    A low-resolution, high-precision instrument was designed and built in the mass spectrometry laboratory of the Institute of Isotopic and Molecular Technology, Cluj-Napoca. The paper presents the vacuum system, the sample inlet system, the ion source, the magnetic analyzer and the ion collector. The instrument is almost completely automated. The analog-to-digital conversion circuits, the local control microcomputer, the automation systems and the performance checks are described. (authors)

  12. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software tool that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate the command steps for the MapMaker program. A set of web forms was designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or map drawings and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  13. Automated, parallel mass spectrometry imaging and structural identification of lipids

    DEFF Research Database (Denmark)

    Ellis, Shane R.; Paine, Martin R.L.; Eijkel, Gert B.

    2018-01-01

    We report a method that enables automated data-dependent acquisition of lipid tandem mass spectrometry data in parallel with a high-resolution mass spectrometry imaging experiment. The method does not increase the total image acquisition time and is combined with automatic structural assignments....... This lipidome-per-pixel approach automatically identified and validated 104 unique molecular lipids and their spatial locations from rat cerebellar tissue....

  14. Perspectives on bioanalytical mass spectrometry and automation in drug discovery.

    Science.gov (United States)

    Janiszewski, John S; Liston, Theodore E; Cole, Mark J

    2008-11-01

    The use of high-speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process, from early-stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree, including both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skill sets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.

  15. The Impact of Automated Notification on Follow-up of Actionable Tests Pending at Discharge: a Cluster-Randomized Controlled Trial.

    Science.gov (United States)

    Dalal, Anuj K; Schaffer, Adam; Gershanik, Esteban F; Papanna, Ranganath; Eibensteiner, Katyuska; Nolido, Nyryan V; Yoon, Cathy S; Williams, Deborah; Lipsitz, Stuart R; Roy, Christopher L; Schnipper, Jeffrey L

    2018-03-12

    Follow-up of tests pending at discharge (TPADs) is poor. We previously demonstrated a twofold increase in awareness of any TPAD by attendings and primary care physicians (PCPs) using an automated email intervention. OBJECTIVE: To determine whether automated notification improves documented follow-up for actionable TPADs. DESIGN: Cluster-randomized controlled trial. SUBJECTS: Attendings and PCPs caring for adult patients discharged from general medicine and cardiology services with at least one actionable TPAD between June 2011 and May 2012. INTERVENTION: An automated system that notifies discharging attendings and network PCPs of finalized TPADs by email. MAIN MEASURES: The primary outcome was the proportion of actionable TPADs with documented action, determined by independent physician review of the electronic health record (EHR). Secondary outcomes included documented acknowledgment, 30-day readmissions, and adjusted median days to documented follow-up. Of the 3378 TPADs sampled, 253 (7.5%) were determined to be actionable by physician review. Of these, 150 (123 patients discharged by 53 attendings) and 103 (90 patients discharged by 44 attendings) were assigned to intervention and usual care groups, respectively, and underwent chart review. The proportion of actionable TPADs with documented action was 60.7 vs. 56.3% (p = 0.82) in the intervention vs. usual care groups, similar for documented acknowledgment. The proportion of patients with actionable TPADs readmitted within 30 days was 22.8 vs. 31.1% in the intervention vs. usual care groups (p = 0.24). The adjusted median days [95% CI] to documented action was 9 [6.2, 11.8] vs. 14 [10.2, 17.8] (p = 0.04) in the intervention vs. usual care groups, similar for documented acknowledgment. In sub-group analysis, the intervention had greater impact on documented action for patients with network PCPs compared with usual care (70 vs. 50%, p = 0.03). Automated notification of actionable TPADs shortened time to

  16. Commissioning of an automated microphotometer used in spark-source mass spectrometry

    International Nuclear Information System (INIS)

    Pearton, D.C.G.; Heron, C.

    1983-01-01

    A description of the automated microphotometer and its operation is given, which includes measurement under computer control. Speed and precision tests indicate that the system is superior in every respect to that in which an analyst reads photoplates in spark-source mass spectrometry

  17. Automated spike preparation system for Isotope Dilution Mass Spectrometry (IDMS)

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1990-01-01

    Isotope Dilution Mass Spectrometry (IDMS) is a method frequently employed to measure dissolved, irradiated nuclear materials. A known quantity of a unique isotope of the element to be measured (referred to as the ''spike'') is added to the solution containing the analyte. The resulting solution is chemically purified then analyzed by mass spectrometry. By measuring the magnitude of the response for each isotope and the response for the ''unique spike'' then relating this to the known quantity of the ''spike'', the quantity of the nuclear material can be determined. An automated spike preparation system was developed at the Savannah River Site (SRS) to dispense spikes for use in IDMS analytical methods. Prior to this development, technicians weighed each individual spike manually to achieve the accuracy required. This procedure was time-consuming and subjected the master stock solution to evaporation. The new system employs a high precision SMI Model 300 Unipump dispenser interfaced with an electronic balance and a portable Epson HX-20 notebook computer to automate spike preparation
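The quantitation step this record describes is the standard two-isotope IDMS algebra: a known spike amount plus a measured mixture isotope ratio yields the analyte amount. A minimal sketch follows, with generic variable names of my own rather than anything from the SRS system.

```python
def idms_amount(n_spike, x1, x2, s1, s2, r_mix):
    """Moles of analyte from a measured mixture isotope ratio.

    n_spike : moles of spike added
    x1, x2  : atom fractions of isotopes 1 and 2 in the sample element
    s1, s2  : atom fractions of isotopes 1 and 2 in the spike
    r_mix   : measured isotope-1/isotope-2 ratio of the spiked mixture
    """
    # From n_x*x1 + n_s*s1 = r_mix*(n_x*x2 + n_s*s2), solved for n_x:
    return n_spike * (r_mix * s2 - s1) / (x1 - r_mix * x2)

# round-trip check: 2 mol of analyte spiked with 1 mol of spike
x1, x2 = 0.90, 0.10   # natural-like sample composition
s1, s2 = 0.05, 0.95   # spike enriched in isotope 2
r_mix = (2 * x1 + 1 * s1) / (2 * x2 + 1 * s2)
print(idms_amount(1.0, x1, x2, s1, s2, r_mix))  # ~2.0
```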

  18. AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source

    Science.gov (United States)

    Nightingale, J. W.; Dye, S.; Massey, Richard J.

    2018-05-01

    This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single-component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two components are geometrically aligned. The complexity of the light and mass models is chosen automatically via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.
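The Sersic functions used for the lens-light model have a standard closed form, I(r) = I_e * exp(-b_n * ((r/R_e)^(1/n) - 1)). Below is a minimal sketch of that profile (not AutoLens code), using the common approximation b_n ~ 2n - 1/3 for the normalization constant.

```python
import math

def sersic(r, I_e, R_e, n):
    """Sersic surface-brightness profile: I(R_e) = I_e by construction."""
    b_n = 2.0 * n - 1.0 / 3.0  # approximate; the exact b_n solves a gamma-function equation
    return I_e * math.exp(-b_n * ((r / R_e) ** (1.0 / n) - 1.0))

# at r = R_e the exponent vanishes, so the profile returns I_e exactly
print(sersic(1.0, 1.0, 1.0, 4.0))  # de Vaucouleurs-like case (n = 4)
```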

  19. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.

  20. A fully automated mass spectrometer for the analysis of organic solids

    International Nuclear Information System (INIS)

    Hillig, H.; Kueper, H.; Riepe, W.

    1979-01-01

    Automation of a mass spectrometer-computer system makes it possible to process up to 30 samples without attention after sample loading. An automatic sample changer introduces the samples successively into the ion source by means of a direct inlet probe. A process control unit determines the operation sequence. Computer programs are available for the hardware support, system supervision and evaluation of the spectrometer signals. The most essential precondition for automation - automatic evaporation of the sample material by electronic control of the total ion current - is confirmed to be satisfactory. The system operates routinely overnight in an industrial laboratory, so that day work can be devoted to difficult analytical problems. The cost of routine analyses is halved. (Auth.)
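The "electronic control of the total ion current" that the authors identify as the essential precondition is, in essence, closed-loop control of sample evaporation. A minimal proportional-control sketch follows; the gains, units, and names are purely illustrative assumptions, not details of the described system.

```python
def heater_step(total_ion_current, setpoint, power, gain=0.01, max_power=100.0):
    """One proportional-control update of probe heater power (arbitrary units)."""
    error = setpoint - total_ion_current
    return min(max(power + gain * error, 0.0), max_power)

p = 10.0
# ion current below setpoint -> controller raises heater power to evaporate faster
p = heater_step(total_ion_current=800.0, setpoint=1000.0, power=p)
print(p)  # power increased toward the setpoint
```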

  1. Balance between automation and human actions in nuclear power plant operation. Results of international cooperation

    International Nuclear Information System (INIS)

    Sun, B.; Olmstead, R.; Oudiz, A.; Jenkinson, J.; Kossilov, A.

    1990-01-01

    Automation has long been an established feature of power plants. In some applications, the use of automation has been the significant factor that has enabled plant technology to progress to its current state. Societal demands for increased levels of safety have led to greater use of redundancy and diversity and this, in turn, has increased levels of automation. However, possibly the greatest contributory factor in increased automation has been improvements in information technology. Much recent attention has been focused on the concept of inherently safe reactors, which may simplify safety system requirements and information and control system complexity. The allocation of tasks between man and machine may be one of the most critical activities in the design of new nuclear plants and major retrofits, and it therefore warrants a design approach which is commensurate in quality with the high levels of safety and production performance sought from nuclear plants. In this climate, in 1989 the International Atomic Energy Agency (IAEA) formed an advisory group from member countries with extensive experience in nuclear power plant automation. The task of this group was to advise on the appropriate balance between manual and automatic actions in plant operation.

  2. Balance between automation and human actions in nuclear power plant operation. Results of international cooperation

    International Nuclear Information System (INIS)

    Sun, B.; Olmstead, R.; Oudiz, A.; Jenkinson, J.; Kossilov, A.

    1990-01-01

    Automation has long been an established feature of power plants. In some applications, the use of automation has been the significant factor that has enabled plant technology to progress to its current state. Societal demands for increased levels of safety have led to greater use of redundancy and diversity and this, in turn, has increased levels of automation. However, possibly the greatest contributory factor in increased automation has been improvements in information technology. Much recent attention has been focused on the concept of inherently safe reactors, which may simplify safety system requirements and information and control system complexity. The allocation of tasks between man and machine may be one of the most critical activities in the design of new nuclear plants and major retrofits, and it therefore warrants a design approach which is commensurate in quality with the high levels of safety and production performance sought from nuclear plants. In this climate, in 1989 the International Atomic Energy Agency (IAEA) formed an advisory group from member countries with extensive experience in nuclear power plant automation. The task of this group was to advise on the appropriate balance between manual and automatic actions in plant operation. (author)

  3. Mass Spectra-Based Framework for Automated Structural Elucidation of Metabolome Data to Explore Phytochemical Diversity

    Science.gov (United States)

    Matsuda, Fumio; Nakabayashi, Ryo; Sawada, Yuji; Suzuki, Makoto; Hirai, Masami Y.; Kanaya, Shigehiko; Saito, Kazuki

    2011-01-01

    A novel framework for automated elucidation of metabolite structures in liquid chromatography–mass spectrometer metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied for the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method. PMID:22645535

  4. Mass spectra-based framework for automated structural elucidation of metabolome data to explore phytochemical diversity

    Directory of Open Access Journals (Sweden)

    Fumio Matsuda

    2011-08-01

    A novel framework for automated elucidation of metabolite structures in liquid chromatography-mass spectrometry (LC-MS) metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied to the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method.
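The database-search step described above is, at its core, an accurate-mass lookup within a ppm tolerance. A hedged sketch of that step follows; the compound names and [M+H]+ masses are textbook values for illustration, and nothing here is taken from KNApSAcK, ReSpect, or PRIMe.

```python
def match_by_mass(observed_mz, database, tol_ppm=5.0):
    """Return (name, ppm error) for database entries within a ppm tolerance."""
    hits = []
    for name, exact_mass in database.items():
        ppm = abs(observed_mz - exact_mass) / exact_mass * 1e6
        if ppm <= tol_ppm:
            hits.append((name, ppm))
    return sorted(hits, key=lambda h: h[1])

# hypothetical [M+H]+ monoisotopic masses (Da)
db = {"kaempferol": 287.0550, "luteolin": 287.0550, "quercetin": 303.0499}
print(match_by_mass(287.0556, db))  # two isobaric flavonoids within 5 ppm
```

Note that accurate mass alone cannot separate the two isobaric hits; this is exactly why the framework also uses tandem (MS/MS) spectra for structural elucidation.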

  5. Negative-Parity Baryon Masses Using O(a)-improved Fermion Action

    Energy Technology Data Exchange (ETDEWEB)

    M. Gockeler; R. Horsley; D. Pleiter; P.E.L. Rakow; G. Schierholz; C.M. Maynard; D.G. Richards

    2001-06-01

    We present a calculation of the mass of the lowest-lying negative-parity J = 1/2^- state in quenched QCD. Results are obtained using a non-perturbatively O(a)-improved clover fermion action, and a splitting is found between the masses of the nucleon and its parity partner. The calculation is performed on two lattice volumes, and at three lattice spacings, enabling a study of both finite-volume and finite lattice-spacing uncertainties. A comparison is made with results obtained using the unimproved Wilson fermion action.

  6. The development of a completely automated oxygen isotope mass spectrometer

    International Nuclear Information System (INIS)

    Ahern, T.K.

    1980-01-01

    A completely automated mass spectrometer system has been developed to measure the oxygen isotope ratio of carbon dioxide samples. The system has an accuracy of 0.03 percent, and is capable of analyzing more than 100 samples a day. The system uses an Interdata minicomputer as the primary controller. The intelligence of the system is contained within hardware circuits, software within the minicomputer, and firmware written for a Motorola 6802 microprocessor. A microprocessor-based inlet system controller maximizes the throughput of carbon dioxide samples within the inlet system. The inlet system normally contains four different aliquots of carbon dioxide and introduces these samples to the mass spectrometer through a single admittance leak. The system has been used in the analysis of 111 samples of ice taken from the Steele glacier

  7. The mass-action-law theory of micellization revisited.

    Science.gov (United States)

    Rusanov, Anatoly I

    2014-12-09

    Among the numerous definitions of the critical micelle concentration (CMC), there is one related to the constant K of the mass action law as CMC = K^(1/(1-n)) (n is the aggregation number). In this paper, this definition is generalized to multicomponent micelles, and a mass-action-law theory of micellization is developed based on this definition and on the analysis of a multiple-equilibrium polydisperse micellar system. This variant of the theory of micellization is more consistent than the earlier one. In addition, two thermodynamic findings are reported: the stability conditions for micellar systems and the dependence of aggregation numbers on the surfactant concentrations. The growth of the monomer concentration with the total surfactant concentration is shown to be a thermodynamic rule only in the case of a single sort of aggregative particles or when a single surfactant is added to a mixture; the stability condition takes a more complex form when a mixture of aggregative particles is added. For the aggregation number of a micelle, a thermodynamic rule is deduced according to which it increases with the total surfactant concentration. However, if the monomer concentration increases slowly, the aggregation number increases much more slowly still, and the more slowly, the more pronounced the maximum corresponding to a micelle on the distribution hypersurface (a curve in the one-component case). This provides grounding for the quasi-chemical approximation in the mass-action-law theory (the constancy of aggregation numbers).
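    The CMC definition above can be checked numerically. A minimal sketch, assuming a single-component aggregation equilibrium nA ⇌ A_n with mass-action constant K = [A_n]/[A]^n; the values of K and n below are illustrative, not taken from the paper:

    ```python
    import math

    def cmc_from_mass_action(K: float, n: int) -> float:
        """CMC defined through the mass-action constant K as CMC = K**(1/(1-n))."""
        return K ** (1.0 / (1.0 - n))

    # Consistency check: at c = CMC the micelle concentration [A_n] = K * c**n
    # equals the monomer concentration c itself.
    n, K = 50, 1e75                     # illustrative aggregation number and constant
    c = cmc_from_mass_action(K, n)
    micelle = K * c ** n
    ```

    The check works because K·c^n = K·K^(n/(1-n)) = K^(1/(1-n)) = c, so this CMC is exactly the concentration at which micellar and monomeric forms are equally abundant.
    
    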

  8. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system

  9. Automated sample preparation using membrane microtiter extraction for bioanalytical mass spectrometry.

    Science.gov (United States)

    Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H

    1997-01-01

    The development and application of membrane solid-phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and obviating the need for the time-consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision, a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min, about 30 times faster than manual solvent extraction or single-cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high-performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid-phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.

  10. Estimating Regional Mass Balance of Himalayan Glaciers Using Hexagon Imagery: An Automated Approach

    Science.gov (United States)

    Maurer, J. M.; Rupper, S.

    2013-12-01

    Currently there is much uncertainty regarding the present and future state of Himalayan glaciers, which supply meltwater for river systems vital to more than 1.4 billion people living throughout Asia. Previous assessments of regional glacier mass balance in the Himalayas using various remote sensing and field-based methods give inconsistent results, and most assessments are over relatively short (e.g., single decade) timescales. This study aims to quantify multi-decadal changes in volume and extent of Himalayan glaciers through efficient use of the large database of declassified 1970-80s era Hexagon stereo imagery. Automation of the DEM extraction process provides an effective workflow for many images to be processed and glacier elevation changes quantified with minimal user input. The tedious procedure of manual ground control point selection necessary for block-bundle adjustment (as ephemeris data are not available for the declassified images) is automated using the Maximally Stable Extremal Regions algorithm, which matches image elements between raw Hexagon images and georeferenced Landsat 15 meter panchromatic images. Additional automated Hexagon DEM processing, co-registration, and bias correction allow for direct comparison with modern ASTER and SRTM elevation data, thus quantifying glacier elevation and area changes over several decades across largely inaccessible mountainous regions. As consistent methodology is used for all glaciers, results will likely reveal significant spatial and temporal patterns in regional ice mass balance. Ultimately, these findings could have important implications for future water resource management in light of environmental change.
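    Once the historical and modern DEMs are co-registered, the mass-balance step reduces to differencing the two elevation grids over a glacier mask. A minimal sketch with toy 3x3 grids; all elevations and the mask are hypothetical:

    ```python
    # Toy DEMs (elevations in metres) and a glacier mask; real inputs would be
    # co-registered rasters of identical extent and resolution.
    dem_1975 = [[5000, 5010, 5020],
                [5005, 5015, 5025],
                [5010, 5020, 5030]]
    dem_2007 = [[5000, 5002, 5020],
                [5005, 5006, 5025],
                [5010, 5011, 5030]]
    mask =     [[0, 1, 0],
                [0, 1, 0],
                [0, 1, 0]]            # 1 = glacier pixel

    # Per-pixel elevation change on glacier pixels, then the glacier-wide mean.
    changes = [dem_2007[i][j] - dem_1975[i][j]
               for i in range(3) for j in range(3) if mask[i][j]]
    mean_dh = sum(changes) / len(changes)   # mean elevation change (m)
    ```

    Multiplying the mean elevation change by glacier area and an assumed ice density would convert this to a geodetic mass balance; the thinning values here are purely illustrative.
    
    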

  11. Negative-parity baryon masses using an O(a)-improved fermion action

    International Nuclear Information System (INIS)

    Goeckeler, M.; Rakow, P.E.L.; Maynard, C.M.; Richards, D.G.; Old Dominion Univ., Norfolk, VA

    2001-06-01

    We present a calculation of the mass of the lowest-lying negative-parity J = 1/2 - state in quenched QCD. Results are obtained using a non-perturbatively O(a)-improved clover fermion action, and a splitting is found between the masses of the nucleon, and its parity partner. The calculation is performed on two lattice volumes and at three lattice spacings, enabling a study of both finite-volume and finite lattice-spacing uncertainties. A comparison is made with results obtained using the unimproved Wilson fermion action. (orig.)

  12. Electrically evoked compound action potentials artefact rejection by independent component analysis: procedure automation.

    Science.gov (United States)

    Akhoun, Idrick; McKay, Colette; El-Deredy, Wael

    2015-01-15

    Independent component analysis (ICA) successfully separated electrically evoked compound action potentials (ECAPs) from the stimulation artefact and noise (ECAP-ICA, Akhoun et al., 2013). This paper shows how to automate the ECAP-ICA artefact cancellation process. Raw ECAPs without artefact rejection were consecutively recorded for each stimulation condition from at least 8 intra-cochlear electrodes. First, amplifier-saturated recordings were discarded, and the data from different stimulus conditions (different current levels) were concatenated temporally. The key aspect of the automation procedure was the sequential deductive source categorisation after ICA was applied with a restriction to 4 sources. The stereotypical aspect of the 4 sources enables their automatic classification as two artefact components, a noise component, and the sought ECAP, based on theoretical and empirical considerations. The automatic procedure was tested using 8 cochlear implant (CI) users and one to four stimulus electrodes. The artefact and noise sources were successively identified and discarded, leaving the ECAP as the remaining source. The automated ECAP-ICA procedure successfully extracted the correct ECAPs, compared to the standard clinical forward-masking paradigm, in 22 out of 26 cases. ECAP-ICA does not require extracting the ECAP from a combination of distinct buffers, as is the case with regular methods. It is an alternative that avoids the possible bias of traditional artefact rejections such as alternate-polarity or forward-masking paradigms. The ECAP-ICA procedure bears clinical relevance, for example as the artefact rejection sub-module of automated ECAP-threshold detection techniques, which are common features of CI clinical fitting software. Copyright © 2014. Published by Elsevier B.V.

  13. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  14. Open complex-balanced mass action chemical reaction networks

    NARCIS (Netherlands)

    Rao, Shodhan; van der Schaft, Arjan; Jayawardhana, Bayu

    We consider open chemical reaction networks, i.e. ones with inflows and outflows. We assume that all the inflows to the network are constant and all outflows obey the mass action kinetics rate law. We define a complex-balanced open reaction network as one that admits a complex-balanced steady state.
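    As a concrete instance of such an open network, consider a constant inflow of species A, a mass-action conversion A → B, and a mass-action outflow of B. A minimal forward-Euler sketch; the rate constants are illustrative, not from the paper:

    ```python
    # Open network: 0 --f--> A --k1--> B --k2--> 0
    # Mass-action ODEs: dA/dt = f - k1*A,  dB/dt = k1*A - k2*B
    f, k1, k2 = 2.0, 1.0, 0.5
    A, B, dt = 0.0, 0.0, 1e-3
    for _ in range(200_000):          # integrate to t = 200
        dA = f - k1 * A
        dB = k1 * A - k2 * B
        A += dA * dt
        B += dB * dt
    # Mass action predicts the steady state A* = f/k1 = 2, B* = f/k2 = 4.
    ```

    The simulated concentrations converge to the steady state at which every species' net production rate vanishes, the balancing condition the record's complex-balanced definition refines.
    
    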

  15. Automated grouping of action potentials of human embryonic stem cell-derived cardiomyocytes.

    Science.gov (United States)

    Gorospe, Giann; Zhu, Renjun; Millrod, Michal A; Zambidis, Elias T; Tung, Leslie; Vidal, Rene

    2014-09-01

    Methods for obtaining cardiomyocytes from human embryonic stem cells (hESCs) are improving at a significant rate. However, the characterization of these cardiomyocytes (CMs) is evolving at a relatively slower rate. In particular, there is still uncertainty in classifying the phenotype (ventricular-like, atrial-like, nodal-like, etc.) of an hESC-derived cardiomyocyte (hESC-CM). While previous studies identified the phenotype of a CM based on electrophysiological features of its action potential, the criteria for classification were typically subjective and differed across studies. In this paper, we use techniques from signal processing and machine learning to develop an automated approach to discriminate the electrophysiological differences between hESC-CMs. Specifically, we propose a spectral grouping-based algorithm to separate a population of CMs into distinct groups based on the similarity of their action potential shapes. We applied this method to a dataset of optical maps of cardiac cell clusters dissected from human embryoid bodies. While some of the nine cell clusters in the dataset are presented with just one phenotype, the majority of the cell clusters are presented with multiple phenotypes. The proposed algorithm is generally applicable to other action potential datasets and could prove useful in investigating the purification of specific types of CMs from an electrophysiological perspective.
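    A heavily simplified stand-in for the grouping idea: build a pairwise-similarity graph over action-potential traces and take its connected components as groups. The paper's spectral algorithm is more sophisticated, and the traces below are synthetic:

    ```python
    def dissimilarity(a, b):
        """Mean squared difference between two equal-length AP traces."""
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

    def group_traces(traces, threshold):
        """Connected components of the graph linking traces with small dissimilarity."""
        unassigned = set(range(len(traces)))
        groups = []
        while unassigned:
            seed = unassigned.pop()
            component, frontier = {seed}, [seed]
            while frontier:
                i = frontier.pop()
                near = {j for j in unassigned
                        if dissimilarity(traces[i], traces[j]) < threshold}
                unassigned -= near
                component |= near
                frontier.extend(near)
            groups.append(sorted(component))
        return groups

    # Two synthetic "phenotypes": long-plateau vs short-plateau action potentials.
    long_ap = [1.0] * 8 + [0.0] * 2
    short_ap = [1.0] * 3 + [0.0] * 7
    traces = [long_ap, short_ap, list(long_ap), list(short_ap)]
    groups = group_traces(traces, threshold=0.1)
    ```

    On real optical-mapping data the traces would first be normalized and aligned; here identical toy traces fall into the same component and the two phenotypes separate.
    
    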

  16. Automated, feature-based image alignment for high-resolution imaging mass spectrometry of large biological samples

    NARCIS (Netherlands)

    Broersen, A.; Liere, van R.; Altelaar, A.F.M.; Heeren, R.M.A.; McDonnell, L.A.

    2008-01-01

    High-resolution imaging mass spectrometry of large biological samples is the goal of several research groups. In mosaic imaging, the most common method, the large sample is divided into a mosaic of small areas that are then analyzed with high resolution. Here we present an automated alignment

  17. The infection rate of Daphnia magna by Pasteuria ramosa conforms with the mass-action principle.

    Science.gov (United States)

    Regoes, R R; Hottinger, J W; Sygnarski, L; Ebert, D

    2003-10-01

    In simple epidemiological models that describe the interaction of hosts with their parasites, the infection process is commonly assumed to be governed by the law of mass action, i.e. it is assumed that the infection rate depends linearly on the densities of the host and the parasite. The mass-action assumption, however, can be problematic if certain aspects of the host-parasite interaction are very pronounced, such as spatial compartmentalization, host immunity which may protect against infection at low doses, or host heterogeneity with regard to susceptibility to infection. As deviations from a mass-action infection rate have consequences for the dynamics of the host-parasite system, it is important to test for the appropriateness of the mass-action assumption in a given host-parasite system. In this paper, we examine the relationship between the infection rate and the parasite inoculum for the water flea Daphnia magna and its bacterial parasite Pasteuria ramosa. We measured the fraction of infected hosts after exposure to 14 different doses of the parasite. We find that the observed relationship between the fraction of infected hosts and the parasite dose is largely consistent with an infection process governed by the mass-action principle. However, we have evidence for a subtle but significant deviation from a simple mass-action infection model, which can be explained either by some antagonistic effects of the parasite spores during the infection process, or by heterogeneity in the hosts' susceptibility with regard to infection.
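    Under mass action, each of the d spores in an inoculum independently initiates infection with a small per-spore probability, so the expected fraction of infected hosts follows the dose-response curve 1 − exp(−βd). A minimal sketch; β and the dose range are hypothetical:

    ```python
    import math

    def infected_fraction(dose: float, beta: float) -> float:
        """Expected fraction of infected hosts at a given spore dose under mass action."""
        return 1.0 - math.exp(-beta * dose)

    beta = 1e-4                              # illustrative per-spore infectivity
    doses = [10 ** k for k in range(2, 6)]   # 1e2 .. 1e5 spores per host
    fractions = [infected_fraction(d, beta) for d in doses]
    ```

    Fitting β to measured infection fractions across doses, and checking for systematic departures from this curve, is the kind of test for the mass-action assumption the record describes.
    
    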

  18. Law of mass action for co-precipitation; Loi d'action de masse de la co-precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Vitorge, P

    2008-07-01

    Coprecipitation is often understood as the incorporation of elements at trace concentrations into initially pure solid compounds. Coprecipitation has typically been used to identify radioactive isotopes. Coprecipitation can result in a solubility lower than that controlled by the pure compounds; for this reason it is also important for geochemistry, waste management and de-pollution studies. The solid obtained by coprecipitation is a new homogeneous solid phase called a solid solution. The two formulae needed to calculate the aqueous solubility controlled by the ideal AB_b(1-x)C_cx solid solution are K_s,B^(1-x) * K_s,C^x = [A^zA] * [B^zB]^(b(1-x)) * [C^zC]^(cx) / ((1-x)^(b(1-x)) * x^(cx)) and K_s,C / K_s,B = (1-x)^b * [C^zC]^c / ([B^zB]^b * x^c), where K_s,B and K_s,C are the classical constant solubility products of the AB_b and AC_c end-members, and the b and c values are calculated from the charges z_i of the ions and from charge balance. This report is essentially written to provide a thermodynamic demonstration of the law of mass action, in an attempt to confirm the scientific bases for solubility calculations in geosciences (typically, retention of radionuclides by coprecipitation) and to facilitate such calculations. Note that the law of mass action is here a set of 2 equations (not only 1) for ideal or near-ideal systems. Since they are consistent with the phase rule, no extra formula (besides mass balance) is needed to calculate the concentrations of all the species in both phases, namely [A^zA], [B^zB], [C^zC] and especially x.
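    For the simplest case b = c = 1 (solid solution AB_(1-x)C_x), the two mass-action equations reduce to K_s,B·(1-x) = [A][B] and K_s,C·x = [A][C], so the ratio equation fixes x from the aqueous ratio [C]/[B] and the first equation then gives [A]. A sketch under that assumption; all numerical values are illustrative:

    ```python
    # Ideal solid solution AB_(1-x)C_x with b = c = 1:
    #   K_sB * (1 - x) = [A][B]      and      K_sC * x = [A][C]
    K_sB, K_sC = 1e-9, 1e-12     # end-member solubility products (illustrative)
    B, C = 1e-3, 1e-3            # aqueous concentrations of B and C (illustrative)

    # Ratio equation: K_sC / K_sB = (1 - x) * C / (x * B)  =>  solve for x.
    q = (K_sC / K_sB) * B / C    # equals (1 - x) / x
    x = 1.0 / (1.0 + q)          # mole fraction of the AC end-member in the solid
    A = K_sB * (1.0 - x) / B     # [A] from the B end-member relation
    ```

    Note the trend the record implies: at equal aqueous [B] and [C], the less soluble end-member (smaller K_s) dominates the solid, which is why coprecipitation can scavenge trace elements so effectively.
    
    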

  19. Hardware and software for automating the process of studying high-speed gas flows in wind tunnels of short-term action

    Science.gov (United States)

    Yakovlev, V. V.; Shakirov, S. R.; Gilyov, V. M.; Shpak, S. I.

    2017-10-01

    In this paper, we propose a way of constructing automation systems for aerodynamic experiments on the basis of modern, domestically developed hardware and software. The structure of a universal control and data-acquisition system for performing experiments in wind tunnels of continuous, periodic or short-term action is proposed. The hardware and software tools developed at ICT SB RAS and ITAM SB RAS, as well as subsystems based on them, can be widely applied to any scientific and experimental installations, as well as to the automation of technological processes in production.

  20. Optimization of Reversed-Phase Peptide Liquid Chromatography Ultraviolet Mass Spectrometry Analyses Using an Automated Blending Methodology

    Science.gov (United States)

    Chakraborty, Asish B.; Berger, Scott J.

    2005-01-01

    The balance between chromatographic performance and mass spectrometric response has been evaluated using an automated series of experiments where separations are produced by the real-time automated blending of water with organic and acidic modifiers. In this work, the concentration effects of two acidic modifiers (formic acid and trifluoroacetic acid) were studied on the separation selectivity, ultraviolet, and mass spectrometry detector response, using a complex peptide mixture. Peptide retention selectivity differences were apparent between the two modifiers, and under the conditions studied, trifluoroacetic acid produced slightly narrower (more concentrated) peaks, but significantly higher electrospray mass spectrometry suppression. Trifluoroacetic acid suppression of electrospray signal and influence on peptide retention and selectivity was dominant when mixtures of the two modifiers were analyzed. Our experimental results indicate that in analyses where the analyzed components are roughly equimolar (e.g., a peptide map of a recombinant protein), the selectivity of peptide separations can be optimized by choice and concentration of acidic modifier, without compromising the ability to obtain effective sequence coverage of a protein. In some cases, these selectivity differences were explored further, and a rational basis for differentiating acidic modifier effects from the underlying peptide sequences is described. PMID:16522853

  1. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    International Nuclear Information System (INIS)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F.; Prasanna, P.G.S.

    2007-01-01

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and medical

  2. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    Energy Technology Data Exchange (ETDEWEB)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States); Prasanna, P.G.S. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States)], E-mail: prasanna@afrri.usuhs.mil

    2007-07-15

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and

  3. Robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming.

    Science.gov (United States)

    Baran, Richard; Northen, Trent R

    2013-10-15

    Untargeted metabolite profiling using liquid chromatography and mass spectrometry coupled via electrospray ionization is a powerful tool for the discovery of novel natural products, metabolic capabilities, and biomarkers. However, the elucidation of the identities of uncharacterized metabolites from spectral features remains challenging. A critical step in the metabolite identification workflow is the assignment of redundant spectral features (adducts, fragments, multimers) and calculation of the underlying chemical formula. Inspection of the data by experts using computational tools solving partial problems (e.g., chemical formula calculation for individual ions) can be performed to disambiguate alternative solutions and provide reliable results. However, manual curation is tedious and not readily scalable or standardized. Here we describe a procedure for robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming optimization (RAMSI). Chemical rules among related ions are expressed as linear constraints, and both the spectra interpretation and chemical formula calculation are performed in a single optimization step. This approach is unbiased in that it does not require predefined sets of neutral losses, and positive- and negative-polarity spectra can be combined in a single optimization. The procedure was evaluated with 30 experimental mass spectra and was found to effectively identify the protonated or deprotonated molecule ([M + H](+) or [M - H](-)) while being robust to the presence of background ions. RAMSI provides a much-needed standardized tool for interpreting ions for subsequent identification in untargeted metabolomics workflows.
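    The core constraint, that several observed ions must map onto one common neutral mass via known adduct offsets, can be illustrated with a tiny brute-force search standing in for the paper's MILP. The adduct masses are standard positive-mode values; the peak list and tolerance are illustrative:

    ```python
    from itertools import product

    # Adduct ion masses (Da), positive mode: proton and sodium cation.
    ADDUCTS = {"[M+H]+": 1.007276, "[M+Na]+": 22.989218}

    def assign(peaks, tol=0.005):
        """Label each peak with an adduct so all peaks imply the same neutral mass M."""
        for labels in product(ADDUCTS, repeat=len(peaks)):
            masses = [mz - ADDUCTS[lab] for mz, lab in zip(peaks, labels)]
            if max(masses) - min(masses) < tol:          # consistency constraint
                return dict(zip(peaks, labels)), sum(masses) / len(masses)
        return None

    # Two ions consistent with a glucose-like neutral mass M ≈ 180.063 Da.
    result = assign([181.0706, 203.0525])
    ```

    A MILP formulation expresses the same pairwise mass-difference rules as linear constraints over binary assignment variables, which scales to full spectra where brute force does not.
    
    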

  4. Highly Reproducible Automated Proteomics Sample Preparation Workflow for Quantitative Mass Spectrometry.

    Science.gov (United States)

    Fu, Qin; Kowalski, Michael P; Mastali, Mitra; Parker, Sarah J; Sobhani, Kimia; van den Broek, Irene; Hunter, Christie L; Van Eyk, Jennifer E

    2018-01-05

    Sample preparation for protein quantification by mass spectrometry requires multiple processing steps including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions where technical variations can be introduced and propagated. We established an automated sample preparation workflow with a total processing time for 96 samples of 5 h, including a 2 h incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, mean intraday and interday coefficients of variation (CVs) were assessed for 5 serum and 5 plasma samples over 5 days; samples repeated on 3 separate days had total CVs below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. An automated trypsin digestion workflow yields uniformly processed samples in less than 5 h. Reproducible quantification of peptides was observed across replicates, days, instruments, and laboratory sites, demonstrating the broad applicability of this approach.
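    Reproducibility here is summarized by the coefficient of variation (CV = standard deviation / mean × 100%), computed within and across days. A minimal sketch with hypothetical peptide peak areas:

    ```python
    from statistics import mean, stdev

    def cv_percent(values):
        """Coefficient of variation of replicate measurements, in percent."""
        return stdev(values) / mean(values) * 100.0

    # Hypothetical peak areas for one peptide: 5 replicates on each of 3 days.
    days = [
        [10.1, 10.3, 9.9, 10.0, 10.2],
        [10.4, 10.2, 10.5, 10.3, 10.1],
        [9.8, 10.0, 9.9, 10.1, 10.2],
    ]
    intraday = [cv_percent(day) for day in days]         # within-day CVs
    total = cv_percent([v for day in days for v in day]) # total CV across all days
    ```

    The paper's acceptance criterion corresponds to checking that these CVs stay below 20% for each targeted peptide.
    
    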

  5. On the network thermodynamics of mass action chemical reaction networks

    NARCIS (Netherlands)

    Schaft, A.J. van der; Rao, S.; Jayawardhana, B.

    In this paper we elaborate on the mathematical formulation of mass action chemical reaction networks as recently given in van der Schaft, Rao, Jayawardhana (2012). We show how the reference chemical potentials define a specific thermodynamical equilibrium, and we discuss the port-Hamiltonian

  6. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    Full Text Available The ongoing penetration of building automation by information technology is far from saturated. Today's systems must not only be reliable and fault-tolerant, they also have to address energy efficiency and flexibility in the overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing for energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information-processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy-optimal systems. A recently developed model for environment recognition and decision-making processes, based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly conflicting goals.

  7. Automating ActionScript Projects with Eclipse and Ant

    CERN Document Server

    Koning, Sidney

    2011-01-01

    Automating repetitive programming tasks is easier than many Flash/AS3 developers think. With the Ant build tool, the Eclipse IDE, and this concise guide, you can set up your own "ultimate development machine" to code, compile, debug, and deploy projects faster. You'll also get started with versioning systems, such as Subversion and Git. Create a consistent workflow for multiple machines, or even complete departments, with the help of extensive Ant code samples. If you want to work smarter and take your skills to a new level, this book will get you on the road to automation-with Ant. Set up y

  8. Variable elimination in chemical reaction networks with mass-action kinetics

    DEFF Research Database (Denmark)

    Feliu, Elisenda; Wiuf, C.

    2012-01-01

    We consider chemical reaction networks taken with mass-action kinetics. The steady states of such a system are solutions to a system of polynomial equations. Even for small systems the task of finding the solutions is daunting. We develop an algebraic framework and procedure for linear elimination...
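The abstract's central observation can be made concrete with a toy example: for the classic enzyme mechanism E + S ⇌ ES → E + P with mass-action rates, the steady-state condition for the intermediate ES is a polynomial equation that is linear in ES, so the intermediate can be eliminated symbolically. The sketch below (illustrative rate constants and function names, not taken from the paper) performs this elimination in plain Python and checks that the eliminated solution makes the net rate vanish.

```python
# Toy mass-action network: E + S <-> ES -> E + P
# d[ES]/dt = k1*E*S - (k_1 + k2)*ES, with conservation E = E_tot - ES.
# Setting d[ES]/dt = 0 gives a polynomial equation *linear* in ES,
# so the intermediate can be eliminated in closed form:
#   ES = E_tot*S / (Km + S),  Km = (k_1 + k2)/k1   (Michaelis-Menten form)

def eliminate_intermediate(k1, k_1, k2, e_tot, s):
    """Closed-form steady-state concentration of ES after linear elimination."""
    km = (k_1 + k2) / k1
    return e_tot * s / (km + s)

def d_es_dt(k1, k_1, k2, e_tot, s, es):
    """Mass-action rate equation for the intermediate ES."""
    e_free = e_tot - es
    return k1 * e_free * s - (k_1 + k2) * es

# Illustrative parameters (arbitrary units)
k1, k_1, k2, e_tot, s = 2.0, 1.0, 3.0, 1.5, 4.0
es_ss = eliminate_intermediate(k1, k_1, k2, e_tot, s)

# The eliminated solution is a steady state of the full rate equation
residual = d_es_dt(k1, k_1, k2, e_tot, s, es_ss)
print(es_ss, residual)
```

The same idea, applied systematically, is what lets the cited framework reduce large polynomial steady-state systems to a smaller core in the non-eliminated variables.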

  9. Mass deformed world-sheet action of semi local vortices

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yunguo [School of Space Science and Physics, Shandong University at Weihai,264209 Weihai (China); Shandong Provincial Key Laboratory of Optical Astronomy and Solar-Terrestrial Environment,264209 Weihai (China)

    2014-02-10

    The mass-deformed effective world-sheet theory of semi-local vortices was constructed via the field theoretical method. Using the Euler-Lagrange equations, the Ansätze for both the gauge field and the adjoint scalar were solved; this ensures that the zero modes of vortices are minimal excitations of the system. Up to the 1/g{sup 2} order, all profiles are solved. The mass-deformed effective action was obtained by integrating out the transverse plane of the vortex string. The effective theory interpolates between the local vortex and the lump. Respecting certain normalization conditions, the effective theory shows a Seiberg-like duality, which agrees with the result of the Kähler quotient construction.

  10. Automated detection of masses on whole breast volume ultrasound scanner: false positive reduction using deep convolutional neural network

    Science.gov (United States)

    Hiramatsu, Yuya; Muramatsu, Chisako; Kobayashi, Hironobu; Hara, Takeshi; Fujita, Hiroshi

    2017-03-01

    Breast cancer screening with mammography and ultrasonography is expected to improve sensitivity compared with mammography alone, especially for women with dense breasts. An automated breast volume scanner (ABVS) provides operator-independent whole-breast data which facilitate double reading and comparison with past exams, the contralateral breast, and multimodality images. However, large volumetric data in screening practice increase radiologists' workload. Therefore, our goal is to develop a computer-aided detection scheme of breast masses in ABVS data for assisting radiologists' diagnosis and comparison with mammographic findings. In this study, a false positive (FP) reduction scheme using a deep convolutional neural network (DCNN) was investigated. For training the DCNN, true positive and FP samples were obtained from the result of our initial mass detection scheme using the vector convergence filter. Regions of interest including the detected regions were extracted from the multiplanar reconstruction slices. We investigated methods to select effective FP samples for training the DCNN. Based on the free-response receiver operating characteristic analysis, simple random sampling from the entire candidates was most effective in this study. Using the DCNN, the number of FPs could be reduced by 60%, while retaining 90% of true masses. The result indicates the potential usefulness of DCNN for FP reduction in automated mass detection on ABVS images.

  11. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    Science.gov (United States)

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most possible protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
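The two corrections described above amount to simple ratio adjustments. The sketch below (hypothetical numbers and function names, not the MassAnalyzer implementation) illustrates the first type: a fully deuterated internal-standard peptide measures the fraction of label retained through digestion and LC in a given run, and each analyte peptide's measured uptake is rescaled by that run-specific recovery.

```python
def back_exchange_recovery(std_measured_uptake, std_max_uptake):
    """Fraction of deuterium retained by the internal standard in this run."""
    return std_measured_uptake / std_max_uptake

def correct_uptake(measured_uptake, recovery):
    """Rescale an analyte peptide's measured uptake for run-to-run back exchange."""
    return measured_uptake / recovery

# Hypothetical run: the standard can hold 8.0 Da of deuterium but only 6.0 Da
# survive the workup (75% recovery); an analyte peptide measured at 3.3 Da
# is corrected upward accordingly.
recovery = back_exchange_recovery(6.0, 8.0)
corrected = correct_uptake(3.3, recovery)
print(recovery, corrected)   # ~0.75 and ~4.4
```

The second correction in the abstract (the slow-exchanging PPPI standard for intrinsic-rate differences between solution conditions) would apply an analogous ratio on the intrinsic-rate side.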

  12. A semi-automated approach to derive elevation time-series and calculate glacier mass balance from historical aerial imagery

    Science.gov (United States)

    Whorton, E.; Headman, A.; Shean, D. E.; McCann, E.

    2017-12-01

    Understanding the implications of glacier recession on water resources in the western U.S. requires quantifying glacier mass change across large regions over several decades. Very few glaciers in North America have long-term continuous field measurements of glacier mass balance. However, systematic aerial photography campaigns began in 1957 on many glaciers in the western U.S. and Alaska. These historical, vertical aerial stereo-photographs documenting glacier evolution have recently become publicly available. Digital elevation models (DEMs) of the transient glacier surface preserved in each imagery timestamp can be derived, then differenced to calculate glacier volume and mass change and improve regional geodetic solutions of glacier mass balance. In order to batch process these data, we use Python-based algorithms and Agisoft Photoscan structure-from-motion (SfM) photogrammetry software to semi-automate DEM creation, and to orthorectify and co-register historical aerial imagery in a high-performance computing environment. Scanned photographs are rotated to reduce scaling issues, cropped to the same size to remove fiducials, and batch histogram equalization is applied to improve image quality and aid pixel-matching algorithms using the Python library OpenCV. Processed photographs are then passed to Photoscan through the Photoscan Python library to create DEMs and orthoimagery. To extend the period of record, the elevation products are co-registered to each other, to airborne LiDAR data, and to DEMs derived from sub-meter commercial satellite imagery. With the exception of the placement of ground control points, the process is entirely automated with Python. Current research is focused on (1) applying these algorithms to create geodetic mass balance time series for the 90 photographed glaciers in Washington State and (2) evaluating the minimal amount of positional information required in Photoscan to prevent distortion effects that cannot be addressed during co
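The batch histogram equalization step mentioned above is, in the authors' pipeline, an OpenCV call; the sketch below re-implements the same textbook transform in pure Python on a tiny synthetic "image" so the contrast-stretching effect is visible without any imaging libraries.

```python
from collections import Counter

def equalize_hist(image, levels=256):
    """Histogram-equalize an 8-bit grayscale image given as a list of rows of ints."""
    pixels = [v for row in image for v in row]
    n = len(pixels)
    hist = Counter(pixels)
    # Cumulative distribution over the gray levels actually present
    cdf, total = {}, 0
    for v in sorted(hist):
        total += hist[v]
        cdf[v] = total
    cdf_min = min(cdf.values())
    # Standard equalization lookup table
    lut = {v: round((cdf[v] - cdf_min) / (n - cdf_min) * (levels - 1)) for v in cdf}
    return [[lut[v] for v in row] for row in image]

# A low-contrast scan crammed into gray levels 100-103
image = [[100, 101, 102, 103],
         [100, 101, 102, 103],
         [100, 101, 102, 103],
         [100, 101, 102, 103]]
equalized = equalize_hist(image)
flat = [v for row in equalized for v in row]
print(min(flat), max(flat))   # the contrast is stretched to the full 0-255 range
```

In the described workflow this same stretch helps SfM pixel-matching find correspondences in faded archival photographs.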

  13. Automated generation of lattice QCD Feynman rules

    Energy Technology Data Exchange (ETDEWEB)

    Hart, A.; Mueller, E.H. [Edinburgh Univ. (United Kingdom). SUPA School of Physics and Astronomy; von Hippel, G.M. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Horgan, R.R. [Cambridge Univ. (United Kingdom). DAMTP, CMS

    2009-04-15

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)
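The automated differentiation mentioned in the abstract is, at its core, arithmetic on dual numbers: each value carries its derivative along, so derivatives of complicated expressions come out exactly, without finite differences. The sketch below is a minimal forward-mode example in Python, a generic illustration rather than the authors' actual tool chain.

```python
import math

class Dual:
    """Minimal forward-mode automatic differentiation via dual numbers."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule for an elementary function
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# Differentiate f(p) = p*sin(p) + 3*p at p = 2 exactly, in a single pass
p = Dual(2.0, 1.0)            # seed derivative dp/dp = 1
f = p * sin(p) + 3 * p
print(f.val, f.der)           # f'(p) = sin(p) + p*cos(p) + 3
```

Propagating such dual values through the vertex expressions is what makes exact derivatives of Feynman diagrams tractable to compute mechanically.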

  14. Automated generation of lattice QCD Feynman rules

    International Nuclear Information System (INIS)

    Hart, A.; Mueller, E.H.; Horgan, R.R.

    2009-04-01

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  15. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  16. Overlap valence quarks on a twisted mass sea. A case study for mixed action lattice QCD

    International Nuclear Information System (INIS)

    Cichy, Krzysztof; Herdoiza, Gregorio; UAM/CSIC Univ. Autonoma de Madrid

    2012-11-01

    We discuss a Lattice QCD mixed action investigation employing Wilson maximally twisted mass sea and overlap valence fermions. Using four values of the lattice spacing, we demonstrate that the overlap Dirac operator assumes a point-like locality in the continuum limit. We also show that by adopting suitable matching conditions for the sea and valence theories a consistent continuum limit for the pion decay constant and light baryon masses can be obtained. Finally, we confront results for sea-valence mixed meson masses and the valence scalar correlator with corresponding expressions of chiral perturbation theory. This allows us to extract low energy constants of mixed action chiral perturbation which characterize the strength of unitarity violations in our mixed action setup.

  17. Overlap valence quarks on a twisted mass sea. A case study for mixed action lattice QCD

    Energy Technology Data Exchange (ETDEWEB)

    Cichy, Krzysztof [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Poznan Univ. (Poland). Faculty of Physics; Drach, Vincent; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Garcia-Ramos, Elena [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Humboldt-Universitaet, Berlin (Germany); Herdoiza, Gregorio [UAM/CSIC Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; UAM/CSIC Univ. Autonoma de Madrid (Spain). Inst. de Fisica Teorica; Collaboration: European Twisted Mass Collaboration

    2012-11-15

    We discuss a Lattice QCD mixed action investigation employing Wilson maximally twisted mass sea and overlap valence fermions. Using four values of the lattice spacing, we demonstrate that the overlap Dirac operator assumes a point-like locality in the continuum limit. We also show that by adopting suitable matching conditions for the sea and valence theories a consistent continuum limit for the pion decay constant and light baryon masses can be obtained. Finally, we confront results for sea-valence mixed meson masses and the valence scalar correlator with corresponding expressions of chiral perturbation theory. This allows us to extract low energy constants of mixed action chiral perturbation which characterize the strength of unitarity violations in our mixed action setup.

  18. A totally automated data acquisition/reduction system for routine treatment of mass spectroscopic data by factor analysis

    International Nuclear Information System (INIS)

    Tway, P.C.; Love, L.J.C.; Woodruff, H.B.

    1980-01-01

    Target transformation factor analysis is applied to typical data from gas chromatography-mass spectrometry and solid-probe mass spectrometry to determine rapidly the number of components in unresolved or partially resolved peaks. This technique allows the detection of hidden impurities which often make interpretation or quantification impossible. The error theory of Malinowski is used to assess the reliability of the results. The totally automated system uses a commercially available g.c.-m.s. data system interfaced to the large computer, and the number of components under a peak can be determined routinely and rapidly. (Auth.)

  19. Estimation of heterogeneity in malaria transmission by stochastic modelling of apparent deviations from mass action kinetics

    Directory of Open Access Journals (Sweden)

    Smith Thomas A

    2008-01-01

    Background: Quantifying heterogeneity in malaria transmission is a prerequisite for accurate predictive mathematical models, but the variance in field measurements of exposure overestimates true micro-heterogeneity because it is inflated to an uncertain extent by sampling variation. Descriptions of field data also suggest that the rate of Plasmodium falciparum infection is not proportional to the intensity of challenge by infectious vectors. This appears to violate the principle of mass action that is implied by malaria biology. Micro-heterogeneity may be the reason for this anomaly. It is proposed that the level of micro-heterogeneity can be estimated from statistical models that estimate the amount of variation in transmission most compatible with a mass-action model for the relationship of infection to exposure. Methods: The relationship between the entomological inoculation rate (EIR) for falciparum malaria and infection risk was reanalysed using published data for cohorts of children in Saradidi (western Kenya). Infection risk was treated as binomially distributed, and measurement-error (Poisson and negative binomial) models were considered for the EIR. Models were fitted using Bayesian Markov chain Monte Carlo algorithms and model fit compared for models that assume either mass-action kinetics, facilitation, competition or saturation of the infection process with increasing EIR. Results: The proportion of inocula that resulted in infection in Saradidi was inversely related to the measured intensity of challenge. Models of facilitation showed, therefore, a poor fit to the data. When sampling error in the EIR was neglected, either competition or saturation needed to be incorporated in the model in order to give a good fit. Negative binomial models for the error in exposure could achieve a comparable fit while incorporating the more parsimonious and biologically plausible mass-action assumption.
Models that assume negative binomial micro
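The competing models in the abstract can be written down in a few lines. Under pure mass action, the infection risk over an exposure period with entomological inoculation rate E is p = 1 - exp(-b*E); if individual exposure varies with a gamma (negative-binomial-like) distribution of shape k, averaging over that heterogeneity gives p = 1 - (1 + b*E/k)**(-k), which saturates faster. The sketch below (illustrative b and k, not the fitted Saradidi values) shows that heterogeneity alone reproduces the observed pattern: the apparent per-inoculum efficacy falls as measured challenge rises.

```python
import math

def risk_mass_action(eir, b):
    """Infection risk when every inoculum acts independently (mass action)."""
    return 1.0 - math.exp(-b * eir)

def risk_heterogeneous(eir, b, k):
    """Risk averaged over gamma-distributed individual exposure (shape k)."""
    return 1.0 - (1.0 + b * eir / k) ** (-k)

b, k = 0.05, 0.5          # illustrative parameters only
eirs = [1, 5, 20, 80]

# Apparent per-inoculum efficacy declines with challenge under heterogeneity,
# even though each individual follows mass-action kinetics.
efficacy = [risk_heterogeneous(e, b, k) / e for e in eirs]
for e in eirs:
    print(e, risk_mass_action(e, b), risk_heterogeneous(e, b, k))
```

This is the sense in which a negative binomial exposure model can reconcile saturating field data with the biologically plausible mass-action assumption.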

  20. Automated gravimetric sample pretreatment using an industrial robot for the high-precision determination of plutonium by isotope dilution mass spectrometry

    International Nuclear Information System (INIS)

    Surugaya, Naoki; Hiyama, Toshiaki; Watahiki, Masaru

    2008-01-01

    A robotized sample-preparation method for the determination of Pu, which is recovered by extraction reprocessing of spent nuclear fuel, by isotope dilution mass spectrometry (IDMS) is described. The automated system uses a six-axis industrial robot, whose motion is very fast, accurate, and flexible, installed in a glove box. The automation of the weighing and dilution steps enables operator-unattended sample pretreatment for the high-precision analysis of Pu in aqueous solutions. Using the developed system, the Pu concentration in a HNO{sub 3} medium was successfully determined using a set of subsequent mass spectrometric measurements. The relative uncertainty in determining the Pu concentration by IDMS using this system was estimated to be less than 0.1% (k=2), which is equal to that expected of a skilled analyst. The operation time required was the same as that for a skilled operator. (author)
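The IDMS measurement behind this abstract rests on a single ratio equation: after spiking the sample with a known amount of a tracer isotope (for Pu, commonly a ratio such as {sup 242}Pu/{sup 239}Pu) and measuring the blended isotope ratio, the analyte amount follows without any quantitative-recovery requirement. The sketch below (illustrative amounts and ratios, not the paper's data) implements that inversion and checks it against a forward simulation of the blend.

```python
def idms_amount(n_spike_ref, r_sample, r_spike, r_mix):
    """
    Single isotope dilution: amount of the reference isotope in the sample.
    R = n(tracer isotope) / n(reference isotope), e.g. 242Pu/239Pu.
    """
    return n_spike_ref * (r_spike - r_mix) / (r_mix - r_sample)

# Forward-simulate a blend with known ground truth (illustrative numbers)
n_sample_ref = 5.0     # micromol of reference isotope actually in the sample
n_spike_ref = 1.0      # micromol of reference isotope carried by the spike
r_sample, r_spike = 0.02, 50.0
r_mix = (r_sample * n_sample_ref + r_spike * n_spike_ref) / (n_sample_ref + n_spike_ref)

recovered = idms_amount(n_spike_ref, r_sample, r_spike, r_mix)
print(r_mix, recovered)   # the inversion recovers the 5.0 ground truth
```

Because only ratios enter the equation, the gravimetric weighing steps that the robot automates are the dominant remaining uncertainty source, which is why automating them pays off in precision.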

  1. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot be directly infused into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h{sup −1}). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
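The exponential release model used to fit the extraction-MS profiles has the form C(t) = C_inf*(1 - exp(-k*t)); once the plateau C_inf is known, taking logarithms makes the fit linear, so k drops out of an ordinary least-squares slope. The sketch below (synthetic data generated with the reported k of 0.43 1/h, not the actual dataset) recovers the rate constant this way.

```python
import math

def fit_release_rate(times, concs, c_inf):
    """Least-squares (through the origin) fit of k in C(t) = c_inf*(1 - exp(-k*t))."""
    # ln(1 - C/c_inf) = -k*t, so the slope of y vs t gives -k
    y = [math.log(1.0 - c / c_inf) for c in concs]
    return -sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)

# Synthetic dissolution profile with k = 0.43 1/h and plateau c_inf = 1.0
k_true, c_inf = 0.43, 1.0
times = [1, 2, 3, 4, 5, 6, 7, 8]
concs = [c_inf * (1.0 - math.exp(-k_true * t)) for t in times]

k_fit = fit_release_rate(times, concs, c_inf)
print(k_fit)   # recovers 0.43 on noise-free data
```

With real, noisy extraction-MS points a nonlinear fit of both k and C_inf would be preferable, but the log-linear form shows why a single rate constant summarizes each ingredient's release.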

  2. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    International Nuclear Information System (INIS)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L.

    2015-01-01

    Most real samples cannot be directly infused into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h{sup −1}). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates the

  3. Automated work-flow for processing high-resolution direct infusion electrospray ionization mass spectral fingerprints

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    2007-01-01

    The use of mass spectrometry (MS) is pivotal in analyses of the metabolome and presents a major challenge for subsequent data processing. While the last few years have given new high-performance instruments, there has not been a comparable development in data processing. In this paper we discuss an automated data processing pipeline to compare large numbers of fingerprint spectra from direct infusion experiments analyzed by high-resolution MS. We describe some of the intriguing problems that have to be addressed, starting with the conversion and pre-processing of the raw data to the final data...

  4. Geena 2, improved automated analysis of MALDI/TOF mass spectra.

    Science.gov (United States)

    Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo

    2016-03-02

    Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most related elaborations can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging of multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows complete control over all parameters; the Bright Search Interface, which leaves the user the possibility to tune parameters for alignment of spectra; and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other for the identification of a predictor of
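Two of the pre-processing steps listed above, normalization against an internal standard and averaging replicate spectra from the same sample, reduce to simple arithmetic on (m/z, intensity) pairs. The sketch below is a generic illustration of those two steps with hypothetical peak lists; it is not Geena 2's heuristic algorithms, which additionally handle deisotoping, noise removal, and cross-sample alignment.

```python
def normalize(spectrum, standard_mz):
    """Scale a centroided spectrum so the internal-standard peak has intensity 1."""
    ref = spectrum[standard_mz]
    return {mz: inten / ref for mz, inten in spectrum.items()}

def average(spectra):
    """Average replicate spectra that share the same m/z grid."""
    n = len(spectra)
    return {mz: sum(s[mz] for s in spectra) / n for mz in spectra[0]}

# Two replicate MALDI/TOF peak lists (m/z -> intensity), standard at m/z 1570.6
rep1 = {1046.5: 200.0, 1570.6: 400.0, 2465.2: 100.0}
rep2 = {1046.5: 330.0, 1570.6: 600.0, 2465.2: 120.0}

avg = average([normalize(rep1, 1570.6), normalize(rep2, 1570.6)])
print(avg)
```

Normalizing before averaging is the key ordering choice: it prevents shot-to-shot intensity drift from dominating the replicate mean.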

  5. Unified mass-action theory for virus neutralization and radioimmunology

    International Nuclear Information System (INIS)

    Trautman, R.

    1976-01-01

    All ideas implicit in the papers since 1953 involved in applying mass-action thermodynamics to antibody-antigen reactions are unified by the use of: (a) the intermediary concept of extent of reaction; (b) the concept of intrinsic association constant; (c) a statistical analysis for probable complexes; and (d) identification of the complex or complexes that contribute to the bioassay. Several general theoretical examples are given that show the limitations of linear interpretations of equilibrium data. Two practical examples from the literature illustrate foot-and-mouth disease virus and influenza virus neutralization. (Auth.)

  6. Effects of Automation Types on Air Traffic Controller Situation Awareness and Performance

    Science.gov (United States)

    Sethumadhavan, A.

    2009-01-01

    The Joint Planning and Development Office has proposed the introduction of automated systems to help air traffic controllers handle the increasing volume of air traffic in the next two decades (JPDO, 2007). Because fully automated systems leave operators out of the decision-making loop (e.g., Billings, 1991), it is important to determine the right level and type of automation that will keep air traffic controllers in the loop. This study examined the differences in the situation awareness (SA) and collision detection performance of individuals when they worked with information acquisition, information analysis, decision and action selection, and action implementation automation to control air traffic (Parasuraman, Sheridan, & Wickens, 2000). When the automation was unreliable, the time taken to detect an upcoming collision was significantly longer for all the automation types compared with the information acquisition automation. This poor performance following automation failure was mediated by SA, with lower SA yielding poorer performance. Thus, the costs associated with automation failure are greater when automation is applied to higher-order stages of information processing. The results have practical implications for automation design and the development of SA training programs.

  7. 78 FR 75528 - Federal Government Participation in the Automated Clearing House

    Science.gov (United States)

    2013-12-12

    ... Participation in the Automated Clearing House AGENCY: Bureau of the Fiscal Service, Treasury. ACTION: Notice of... Service (Service) is proposing to amend its regulation governing the use of the Automated Clearing House... Automated Clearing House, Electronic funds transfer, Financial institutions, Fraud, and Incorporation by...

  8. Levels of automation and user control - evaluation of a turbine automation interface

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Jonas (Chalmers Univ. of Technology (Sweden))

    2008-10-15

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nyköping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was used to assess how the operators are affected by the use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty in dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operations lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty in following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semi-automatic step mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (author)

  9. Levels of automation and user control - evaluation of a turbine automation interface

    International Nuclear Information System (INIS)

    Andersson, Jonas

    2008-10-01

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nyköping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was used to assess how the operators are affected by the use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty in dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operations lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty in following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semi-automatic step mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (au)

  10. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  11. Automation and decision support in interactive consumer products.

    OpenAIRE

    Sauer, J.; Rüttinger, B.

    2007-01-01

    This article presents two empirical studies (n=30, n=48) that are concerned with different forms of automation in interactive consumer products. The goal of the studies was to evaluate the effectiveness of two types of automation: perceptual augmentation (i.e. supporting users' action selection and implementation). Furthermore, the effectiveness of non-product information (i.e. labels attached to product) in supporting automation design was evaluated. The findings suggested greater benefits f...

  12. An Automated, High-Throughput Method for Interpreting the Tandem Mass Spectra of Glycosaminoglycans

    Science.gov (United States)

    Duan, Jiana; Jonathan Amster, I.

    2018-05-01

    The biological interactions between glycosaminoglycans (GAGs) and other biomolecules are heavily influenced by structural features of the glycan. The structure of GAGs can be assigned using tandem mass spectrometry (MS2), but analysis of these data, to date, requires manual interpretation, a slow process that presents a bottleneck to the broader deployment of this approach to solving biologically relevant problems. Automated interpretation remains a challenge, as GAG biosynthesis is not template-driven, and therefore one cannot predict structures from genomic data, as is done with proteins. The lack of a structure database, a consequence of the non-template biosynthesis, requires a de novo approach to interpretation of the mass spectral data. We propose a model for rapid, high-throughput GAG analysis by using an approach in which candidate structures are scored for the likelihood that they would produce the features observed in the mass spectrum. To make this approach tractable, a genetic algorithm is used to greatly reduce the search space of isomeric structures that are considered. The time required for analysis is significantly reduced compared to an approach in which every possible isomer is considered and scored. The model is coded in a software package using the MATLAB environment. This approach was tested on tandem mass spectrometry data for long-chain, moderately sulfated chondroitin sulfate oligomers that were derived from the proteoglycan bikunin. The bikunin data were previously interpreted manually. Our approach examines glycosidic fragments to localize SO3 modifications to specific residues and yields the same structures reported in the literature, only much more quickly.
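    The candidate-scoring idea can be sketched in miniature. The following is a hedged toy, not the paper's MATLAB implementation: the residue and sulfate masses are placeholders, and a real GAG chain has more modification sites. A sulfation pattern is scored by how many observed glycosidic-fragment masses it reproduces, and a genetic algorithm searches the isomer space instead of enumerating it:

```python
import random

# Toy model: a hexamer with 6 residues, each carrying 0 or 1 sulfate.
# RES_MASS and SO3_MASS are placeholders, not real residue masses.
N_RES = 6
RES_MASS = 200.0
SO3_MASS = 80.0

def fragment_masses(pattern):
    # Simplified glycosidic-fragment masses: running prefix sums.
    masses, total = [], 0.0
    for sulfated in pattern:
        total += RES_MASS + (SO3_MASS if sulfated else 0.0)
        masses.append(round(total, 1))
    return masses

TRUE = (1, 0, 1, 1, 0, 0)              # "unknown" structure to recover
OBSERVED = set(fragment_masses(TRUE))  # stand-in for the MS2 peak list

def score(pattern):
    # Likelihood proxy: number of observed fragment masses reproduced.
    return len(OBSERVED & set(fragment_masses(pattern)))

def evolve(pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in range(N_RES))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[: pop_size // 2]          # keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_RES)       # single-point crossover
            child = list(a[:cut] + b[cut:])
            if rng.random() < 0.2:              # point mutation
                i = rng.randrange(N_RES)
                child[i] ^= 1
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=score)

best = evolve()
```

Because only a shrinking fraction of the 2^N isomer space is ever scored, the same idea scales to the much larger search spaces of real sulfation patterns.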

  13. Marketing automation processes as a way to improve contemporary marketing of a company

    Directory of Open Access Journals (Sweden)

    Witold Świeczak

    2013-09-01

    Full Text Available The main aim of this article is to identify the possibilities that the processes included in a marketing automation system give to contemporary companies. The publication deals with the key aspects of this issue: it shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. The article defines the factors and processes that influence an effective course of the actions taken as part of marketing automation. The concept of marketing automation is a completely new reality: it gives up communication based on mass distribution of uniform content in favor of personalized, individual and fully automated communication. It is a new kind of coexistence, in which the sales department and the marketing department cooperate closely to achieve the best result, and in which marketing can definitively confirm its contribution to the income generated by the company. Marketing automation also means huge analytical possibilities and a real increase in a company's value, the value added generated by the system: a source of information about clients and about all marketing and sales processes taking place in a company. Introducing a marketing automation system alters not only the current functioning of a marketing department but also marketers themselves. In fact, everything that a marketing automation system provides, primarily the accumulated, unique knowledge of the client, is a critical marketing asset of every modern enterprise.

  14. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Full Text Available Automated lines are widely applied in industry, especially for mass production with low product variety. Productivity is one of the most important criteria for an automated line, as for industry in general, because it directly determines output and profit. Productivity must be forecast accurately in order to meet customer demand, and the forecast is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this model cannot come close enough to actual productivity, because several loss parameters are not considered, the model must be enhanced by adding those parameters. This paper presents the productivity-loss parameters investigated using the DMAIC (Define, Measure, Analyze, Improve, Control) concept and the PACE prioritization matrix (Priority, Action, Consider, Eliminate). The investigated parameters are important for the further improvement of the mathematical model of productivity with availability into a robust productivity model for automated lines.
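    As a hedged illustration of how availability enters such a model (a deliberately simplified form, not the enhanced model the paper develops): the productivity of a serial line can be taken as the ideal cyclic rate reduced by the product of per-station availabilities, each MTBF/(MTBF + MTTR):

```python
# Simplified productivity-with-availability sketch for a serial automated
# line. The published model includes further loss parameters; here
# productivity is only the ideal cyclic rate times combined availability.

def station_availability(mtbf_h, mttr_h):
    """Uptime fraction of one station: MTBF / (MTBF + MTTR)."""
    return mtbf_h / (mtbf_h + mttr_h)

def line_productivity(cycle_time_s, stations):
    """Parts per hour for a serial line: every station must be up."""
    ideal_rate = 3600.0 / cycle_time_s      # parts/h with no failures
    availability = 1.0
    for mtbf, mttr in stations:             # serial line: multiply
        availability *= station_availability(mtbf, mttr)
    return ideal_rate * availability

# Three stations (MTBF, MTTR in hours; values are illustrative), 12 s cycle.
stations = [(100.0, 2.0), (80.0, 4.0), (120.0, 1.0)]
q = line_productivity(12.0, stations)       # roughly 278 parts/h vs 300 ideal
```

The gap between the ideal 300 parts/h and the computed rate is exactly the kind of loss the DMAIC investigation above aims to decompose further.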

  15. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing, and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  16. Automation and Control of an Imaging Internal Laser Desorption Fourier Transform Mass Spectrometer (I2LD-FTMS)

    Energy Technology Data Exchange (ETDEWEB)

    McJunkin, Timothy R; Tranter, Troy Joseph; Scott, Jill Rennee

    2002-06-01

    This paper describes the automation of an imaging internal source laser desorption Fourier transform mass spectrometer (I2LD-FTMS). The I2LD-FTMS consists of a laser-scanning device [Scott and Tremblay, Rev. Sci. Instrum. 2002, 73, 1108–1116] that has been integrated with a laboratory-built FTMS using a commercial data acquisition system (ThermoFinnigan FT/MS, Bremen, Germany). A new user interface has been developed in National Instruments' (Austin, Texas) graphical programming language LabVIEW to control the motors of the laser positioning system and the commercial FTMS data acquisition system. A feature of the FTMS software that allows the user to write macros in a scripting language is used to our advantage to create a mechanism for controlling the FTMS from outside its graphical user interface. The new user interface also allows the user to configure target locations. Automation of the data analysis along with data display using commercial graphing software is also described.

  17. Preclusion of switch behavior in reaction networks with mass-action kinetics

    DEFF Research Database (Denmark)

    Feliu, Elisenda; Wiuf, C.

    2012-01-01

    We study networks taken with mass-action kinetics and provide a Jacobian criterion that applies to an arbitrary network to preclude the existence of multiple positive steady states within any stoichiometric class for any choice of rate constants. We are concerned with the characterization...... precludes the existence of degenerate steady states. Further, we relate injectivity of a network to that of the network obtained by adding outflow, or degradation, reactions for all species....
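    The objects behind such a Jacobian criterion can be made concrete with a small sketch. This is not the paper's symbolic criterion, which must hold for all rate constants; it merely assembles mass-action ODEs from a stoichiometric description and evaluates their Jacobian numerically for one choice of constants:

```python
# Build mass-action right-hand sides from stoichiometry and evaluate the
# Jacobian by finite differences (illustrative only; the injectivity
# criterion in the record is a symbolic condition over all rate constants).

def make_rhs(reactions, n_species):
    # reactions: list of (reactant_counts, product_counts, rate_constant)
    def rhs(x):
        dx = [0.0] * n_species
        for react, prod, k in reactions:
            rate = k
            for i, m in enumerate(react):
                rate *= x[i] ** m               # mass-action monomial
            for i in range(n_species):
                dx[i] += (prod[i] - react[i]) * rate
        return dx
    return rhs

def jacobian(f, x, h=1e-6):
    n = len(x)
    fx = f(x)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        xp = list(x)
        xp[j] += h
        fxp = f(xp)
        for i in range(n):
            J[i][j] = (fxp[i] - fx[i]) / h
    return J

# Network: A + B -> C (k = 2), C -> A + B (k = 1); species order (A, B, C).
rxns = [([1, 1, 0], [0, 0, 1], 2.0),
        ([0, 0, 1], [1, 1, 0], 1.0)]
f = make_rhs(rxns, 3)
J = jacobian(f, [1.0, 1.0, 1.0])
```

At the point (1, 1, 1) the entry J[0][0] is -2 (consumption of A at rate 2·[A][B]) and J[0][2] is +1 (regeneration of A from C), matching the hand-computed partial derivatives.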

  18. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part B, Characterization; robotics/automation

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part B of Volume 3 and contains the Characterization and Robotics/Automation sections

  19. Automated thermochemolysis reactor for detection of Bacillus anthracis endospores by gas chromatography–mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dan [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602 (United States); Rands, Anthony D.; Losee, Scott C. [Torion Technologies, American Fork, UT 84003 (United States); Holt, Brian C. [Department of Statistics, Brigham Young University, Provo, UT 84602 (United States); Williams, John R. [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602 (United States); Lammert, Stephen A. [Torion Technologies, American Fork, UT 84003 (United States); Robison, Richard A. [Department of Microbiology and Molecular Biology, Brigham Young University, Provo, UT 84602 (United States); Tolley, H. Dennis [Department of Statistics, Brigham Young University, Provo, UT 84602 (United States); Lee, Milton L., E-mail: milton_lee@byu.edu [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602 (United States)

    2013-05-02

    Highlights: •An automated sample preparation system for Bacillus anthracis endospores was developed. •A thermochemolysis method was applied to produce and derivatize biomarkers for Bacillus anthracis detection. •The autoreactor controlled the precise delivery of reagents, and TCM reaction times and temperatures. •Solid phase microextraction was used to extract biomarkers, and GC–MS was used for final identification. •This autoreactor was successfully applied to the identification of Bacillus anthracis endospores. Abstract: An automated sample preparation system was developed and tested for the rapid detection of Bacillus anthracis endospores by gas chromatography–mass spectrometry (GC–MS) for eventual use in the field. This reactor is capable of automatically processing suspected bio-threat agents to release and derivatize unique chemical biomarkers by thermochemolysis (TCM). The system automatically controls the movement of sample vials from one position to another, crimping of septum caps onto the vials, precise delivery of reagents, and TCM reaction times and temperatures. The specific operations of introduction of sample vials, solid phase microextraction (SPME) sampling, injection into the GC–MS system, and ejection of used vials from the system were performed manually in this study, although they can be integrated into the automated system. Manual SPME sampling is performed by following visual and audible signal prompts for inserting the fiber into and retracting it from the sampling port. A rotating carousel design allows for simultaneous sample collection, reaction, biomarker extraction and analysis of sequential samples. Dipicolinic acid methyl ester (DPAME), 3-methyl-2-butenoic acid methyl ester (a fragment of anthrose) and two methylated sugars were used to compare the performance of the autoreactor with manual TCM. Statistical algorithms were used to construct reliable bacterial endospore signatures, and 24

  1. Application of mass spectrometric techniques to delineate the modes-of-action of anticancer metallodrugs

    NARCIS (Netherlands)

    Hartinger, Christian G.; Groessl, Michael; Meier, Samuel M.; Casini, Angela; Dyson, Paul J.

    2013-01-01

    Mass spectrometry (MS) has emerged as an important tool for studying anticancer metallodrugs in complex biological samples and for characterising their interactions with biomolecules and potential targets on a molecular level. The exact modes-of-action of these coordination compounds and especially

  2. On the Mathematical Structure of Balanced Chemical Reaction Networks Governed by Mass Action Kinetics

    NARCIS (Netherlands)

    Schaft, Arjan van der; Rao, Shodhan; Jayawardhana, Bayu

    2013-01-01

    Motivated by recent progress on the interplay between graph theory, dynamics, and systems theory, we revisit the analysis of chemical reaction networks described by mass action kinetics. For reaction networks possessing a thermodynamic equilibrium we derive a compact formulation exhibiting at the

  3. Balance between automation and human actions in NPP operation: Results of international co-operation

    International Nuclear Information System (INIS)

    Bastl, W.; Jenkinson, J.; Kossilov, A.; Olmstead, R.A.; Oudiz, A.; Sun, B.

    1991-01-01

    Automation is an essential feature of NPPs. The degree of automation can be seen to be increasing, owing to technical and social factors, but also as a result of advances in information technology. Deciding upon the appropriate level of automation, that is, the allocation of functions to the human, to the machine, or to a combination of both, may be one of the most critical aspects of NPP design. It is important that automation is carried out in a sufficiently systematic way, and there appears to be a need for additional guidance in this key area. In 1989 the International Atomic Energy Agency formed an advisory group to consider this problem. The group has proposed a methodology for allocating functions between man and machine. This methodology, which is described in the paper, takes account of the factors which influence the allocation process, considers viable approaches to automation and gives guidance on decision making. In addition, areas where future research may be justified are discussed. (author). 8 refs, 1 fig

  4. Automated multi-plug filtration cleanup for liquid chromatographic-tandem mass spectrometric pesticide multi-residue analysis in representative crop commodities.

    Science.gov (United States)

    Qin, Yuhong; Zhang, Jingru; Zhang, Yuan; Li, Fangbing; Han, Yongtao; Zou, Nan; Xu, Haowei; Qian, Meiyuan; Pan, Canping

    2016-09-02

    An automated multi-plug filtration cleanup (m-PFC) method for modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) extracts was developed. The automatic device was aimed at reducing the labor-intensive manual workload of the cleanup steps. It could accurately control the volume and speed of the pulling and pushing cycles. In this work, m-PFC was based on multi-walled carbon nanotubes (MWCNTs) mixed with other sorbents and anhydrous magnesium sulfate (MgSO4) in a packed tip for analysis of pesticide multi-residues in crop commodities, followed by liquid chromatography with tandem mass spectrometric (LC-MS/MS) detection. It was validated by analyzing 25 pesticides in six representative matrices spiked at two concentration levels of 10 and 100 μg/kg. Salts, sorbents, the m-PFC procedure, and the automated pulling and pushing volumes and speeds were optimized for each matrix. After optimization, two general automated m-PFC methods were introduced for relatively simple (apple, citrus fruit, peanut) and relatively complex (spinach, leek, green tea) matrices. Spike recoveries were between 83 and 108% with 1-14% RSD for most analytes in the tested matrices. Matrix-matched calibrations were performed with coefficients of determination >0.997 between concentration levels of 10 and 1000 μg/kg. The developed method was successfully applied to the determination of pesticide residues in market samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia; Lagani, Vincenzo; Heinze-Deml, Christina; Schmidt, Angelika; Tegner, Jesper; Tsamardinos, Ioannis

    2017-01-01

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagree with the KEGG pathway databases. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  9. Pep2Path: automated mass spectrometry-guided genome mining of peptidic natural products.

    Directory of Open Access Journals (Sweden)

    Marnix H Medema

    2014-09-01

    Full Text Available Nonribosomally and ribosomally synthesized bioactive peptides constitute a source of molecules of great biomedical importance, including antibiotics such as penicillin, immunosuppressants such as cyclosporine, and cytostatics such as bleomycin. Recently, an innovative mass-spectrometry-based strategy, peptidogenomics, has been pioneered to effectively mine microbial strains for novel peptidic metabolites. Even though mass-spectrometric peptide detection can be performed quite fast, true high-throughput natural product discovery approaches have still been limited by the inability to rapidly match the identified tandem mass spectra to the gene clusters responsible for the biosynthesis of the corresponding compounds. With Pep2Path, we introduce a software package to fully automate the peptidogenomics approach through the rapid Bayesian probabilistic matching of mass spectra to their corresponding biosynthetic gene clusters. Detailed benchmarking of the method shows that the approach is powerful enough to correctly identify gene clusters even in data sets that consist of hundreds of genomes, which also makes it possible to match compounds from unsequenced organisms to closely related biosynthetic gene clusters in other genomes. Applying Pep2Path to a data set of compounds without known biosynthesis routes, we were able to identify candidate gene clusters for the biosynthesis of five important compounds. Notably, one of these clusters was detected in a genome from a different subphylum of Proteobacteria than that in which the molecule had first been identified. All in all, our approach paves the way towards high-throughput discovery of novel peptidic natural products. Pep2Path is freely available from http://pep2path.sourceforge.net/, implemented in Python, licensed under the GNU General Public License v3 and supported on MS Windows, Linux and Mac OS X.
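    The probabilistic matching idea can be caricatured in a few lines. This is a hedged sketch only: the sequence tag, cluster names and specificity probabilities below are invented, and Pep2Path's actual Bayesian model is more elaborate. A tag read from tandem-MS mass shifts is scored against each gene cluster's predicted adenylation-domain specificities with a naive product of per-position probabilities, computed in log space to avoid underflow:

```python
import math

# Hypothetical mass-shift-derived sequence tag (three residues).
tag = ["Val", "Orn", "Leu"]

# Hypothetical per-domain specificity predictions P(residue | domain)
# for two candidate biosynthetic gene clusters.
clusters = {
    "cluster_A": [{"Val": 0.7, "Ile": 0.2, "Leu": 0.1},
                  {"Orn": 0.6, "Lys": 0.3, "Arg": 0.1},
                  {"Leu": 0.8, "Val": 0.1, "Ile": 0.1}],
    "cluster_B": [{"Ala": 0.5, "Gly": 0.4, "Val": 0.1},
                  {"Ser": 0.7, "Thr": 0.2, "Cys": 0.1},
                  {"Leu": 0.5, "Ile": 0.4, "Val": 0.1}],
}

FLOOR = 1e-3  # smoothing for residues a domain was never predicted to load

def log_score(tag, domains):
    # Naive Bayes: sum of log P(residue | domain) position by position.
    return sum(math.log(d.get(res, FLOOR)) for d, res in zip(domains, tag))

ranked = sorted(clusters, key=lambda c: log_score(tag, clusters[c]),
                reverse=True)
```

Ranking all clusters in all genomes by such a score is what makes it feasible to match a spectrum against hundreds of genomes at once.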

  10. Automated cell type discovery and classification through knowledge transfer

    Science.gov (United States)

    Lee, Hao-Chih; Kosoy, Roman; Becker, Christine E.

    2017-01-01

    Abstract Motivation: Recent advances in mass cytometry allow simultaneous measurements of up to 50 markers at single-cell resolution. However, the high dimensionality of mass cytometry data introduces computational challenges for automated data analysis and hinders translation of new biological understanding into clinical applications. Previous studies have applied machine learning to facilitate processing of mass cytometry data. However, manual inspection is still inevitable and is becoming the barrier to reliable large-scale analysis. Results: We present a new algorithm called Automated Cell-type Discovery and Classification (ACDC) that fully automates the classification of canonical cell populations and highlights novel cell types in mass cytometry data. Evaluations on real-world data show ACDC provides accurate and reliable estimations compared to manual gating results. Additionally, ACDC automatically classifies previously ambiguous cell types to facilitate discovery. Our findings suggest that ACDC substantially improves both reliability and interpretability of results obtained from high-dimensional mass cytometry profiling data. Availability and Implementation: A Python package (Python 3) and analysis scripts for reproducing the results are available on https://bitbucket.org/dudleylab/acdc. Contact: brian.kidd@mssm.edu or joel.dudley@mssm.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158442

  11. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob analysis. The proposed system detects any human head that appears in the side mirrors. The detected head is then tracked and recorded for further action.
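    The background-subtraction step mentioned above can be sketched without any imaging library (grayscale frames represented as nested lists; a production system would use real camera frames and a library such as OpenCV). Pixels differing from the background model by more than a threshold are marked foreground, and the resulting blob would feed the shape-detection stage:

```python
def subtract(background, frame, thresh=25):
    """Binary foreground mask: 1 where |frame - background| > thresh."""
    return [[1 if abs(f - b) > thresh else 0
             for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def bounding_box(mask):
    """Bounding box (top, left, bottom, right) of foreground pixels, or None."""
    pts = [(r, c) for r, row in enumerate(mask)
           for c, v in enumerate(row) if v]
    if not pts:
        return None
    rows = [p[0] for p in pts]
    cols = [p[1] for p in pts]
    return min(rows), min(cols), max(rows), max(cols)

# A flat background and a frame where a bright 2x2 "object" has appeared.
background = [[10] * 6 for _ in range(6)]
frame = [row[:] for row in background]
for r in range(2, 4):
    for c in range(3, 5):
        frame[r][c] = 200

mask = subtract(background, frame)
box = bounding_box(mask)   # (2, 3, 3, 4)
```

In the described system the blob inside such a box would then be tested against a chain-code shape model of a head.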

  12. An Automated High Performance Capillary Liquid Chromatography Fourier Transform Ion Cyclotron Resonance Mass Spectrometer for High-Throughput Proteomics

    International Nuclear Information System (INIS)

    Belov, Mikhail E.; Anderson, Gordon A.; Wingerd, Mark A.; Udseth, Harold R.; Tang, Keqi; Prior, David C.; Swanson, Kenneth R.; Buschbach, Michael A.; Strittmatter, Eric F.; Moore, Ronald J.; Smith, Richard D.

    2004-01-01

    We report on a fully automated 9.4 tesla Fourier transform ion cyclotron resonance (FTICR) mass spectrometer coupled to reverse-phase chromatography for high-throughput proteomic studies. Modifications made to the front-end of a commercial FTICR instrument--a dual-ESI-emitter ion source; dual-channel electrodynamic ion funnel; and collisional-cooling, selection and accumulation quadrupoles--significantly improved the sensitivity, dynamic range and mass measurement accuracy of the mass spectrometer. A high-pressure capillary liquid chromatography (LC) system was incorporated with an autosampler that enabled 24 h/day operation. A novel method for accumulating ions in the ICR cell was also developed. Unattended operation of the instrument revealed the exceptional reproducibility (1-5% deviation in elution times for peptides from a bacterial proteome), repeatability (10-20% deviation in detected abundances for peptides from the same aliquot analyzed a few weeks apart) and robustness (high-throughput operation for 5 months without downtime) of the LC/FTICR system. When combined with modulated-ion-energy gated trapping, the internal calibration of FTICR mass spectra decreased dispersion of mass measurement errors for peptide identifications in conjunction with high resolution capillary LC separations to < 5 ppm over a dynamic range for each spectrum of 10^3.
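    The sub-5-ppm mass accuracy quoted above uses the standard relative-error convention, error_ppm = (m_measured - m_theoretical) / m_theoretical * 1e6. As a quick worked example (the peptide masses below are illustrative, not from the paper):

```python
def ppm_error(measured, theoretical):
    """Relative mass measurement error in parts per million."""
    return (measured - theoretical) / theoretical * 1e6

# A peptide with theoretical monoisotopic mass 1296.6848 Da measured
# at 1296.6903 Da (illustrative values):
err = ppm_error(1296.6903, 1296.6848)   # about +4.2 ppm, within the 5 ppm window
```

At these masses a 5 ppm tolerance corresponds to only ~0.0065 Da, which is why internal calibration matters for confident peptide identification.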

  13. Power, speed & automation with Adobe Photoshop

    CERN Document Server

    Scott, Geoff

    2012-01-01

    This is a must for the serious Photoshop user! Power, Speed & Automation explores how to customize and automate Photoshop to increase your speed and productivity.  With numerous step-by-step instructions, the authors-two of Adobe's own software developers!- walk you through the steps to best tailor Photoshop's interface to your personal workflow; write and apply Actions; and use batching and scripts to process large numbers of images quickly and automatically.  You will learn how to build your own dialogs and panels to improve your production workflows in Photoshop, the secrets of changing

  14. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there are probably steric and cooperative influences on binding. An alternative, more flexible mathematical model, based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen, has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity is stressed.
Published methods for automated data reduction of Scatchard plots
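The third-order polynomial in the square root of concentration described above can be sketched as follows. The standards are invented values and the fit is a plain least-squares cubic, not the authors' exact procedure:

```python
# Sketch of the standard-curve approach described above: fit a third-order
# polynomial in sqrt(concentration) to radioassay standards, then read values
# off the fitted curve. All data values are invented for illustration.
conc  = [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0]        # standard concentrations
bound = [0.92, 0.85, 0.74, 0.60, 0.45, 0.32, 0.22]   # bound fraction

def polyfit3(xs, ys):
    """Least-squares cubic fit via the normal equations (Gaussian elimination)."""
    a = [[sum(x ** (i + j) for x in xs) for j in range(4)] for i in range(4)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(4)]
    for col in range(4):                      # forward elimination with pivoting
        piv = max(range(col, 4), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 4):
            f = a[r][col] / a[col][col]
            a[r] = [v - f * w for v, w in zip(a[r], a[col])]
            b[r] -= f * b[col]
    coef = [0.0] * 4
    for r in range(3, -1, -1):                # back substitution
        coef[r] = (b[r] - sum(a[r][c] * coef[c] for c in range(r + 1, 4))) / a[r][r]
    return coef                               # coef[i] multiplies x**i

coef = polyfit3([c ** 0.5 for c in conc], bound)

def fitted(c):
    """Bound fraction predicted by the cubic in sqrt(concentration)."""
    x = c ** 0.5
    return sum(k * x ** i for i, k in enumerate(coef))

# The fitted curve should pass close to each standard:
print(round(fitted(4.0), 2))
```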

  15. On the graph and systems analysis of reversible chemical reaction networks with mass action kinetics

    NARCIS (Netherlands)

    Rao, Shodhan; Jayawardhana, Bayu; Schaft, Arjan van der

    2012-01-01

    Motivated by recent progress on the interplay between graph theory and systems theory, we revisit the analysis of reversible chemical reaction networks described by mass action kinetics by reformulating it using the graph knowledge of the underlying networks. Based on this formulation, we
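The underlying mass action rate law (not the paper's graph-theoretic machinery, which the record only sketches) can be illustrated for a single reversible reaction A ⇌ B; the rate constants here are arbitrary:

```python
# Mass action kinetics for the reversible reaction A <=> B, integrated with
# forward Euler. Rate constants and initial concentrations are assumed values.
kf, kr = 2.0, 1.0          # forward and reverse rate constants
a, b = 1.0, 0.0            # initial concentrations of A and B
dt = 1e-3
for _ in range(20000):     # integrate to t = 20, long enough to equilibrate
    rate = kf * a - kr * b # net flux under the law of mass action
    a -= rate * dt
    b += rate * dt

# At equilibrium, mass action predicts the concentration ratio b/a = kf/kr = 2:
print(round(b / a, 3))  # → 2.0
```

Note that the total concentration a + b is conserved by construction, a discrete analogue of the conservation laws that the graph formulation makes explicit.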

  16. Automated technological equipment-robot complexes

    International Nuclear Information System (INIS)

    Zhitomirskii, S.V.; Samorodskikh, B.L.

    1984-01-01

    This paper surveys the types of automated technological equipment robot complexes. The principal elements of such complexes are described. Complexes are divided into two principal groups: those using simultaneously acting robots, and those using successively acting robots. The great variety of types of robots using successive action is then described

  17. Mass Action Stoichiometric Simulation for Cell Factory Design

    DEFF Research Database (Denmark)

    Matos, Marta R. A.

    …steady state, giving information only about the reactions' fluxes, while the latter take into account enzyme dynamics, which makes it possible to model substrate-level enzyme regulation and to get information about metabolite concentrations and reaction fluxes over time, although at the cost of introducing more parameters. Kinetic models have been plagued by the lack of kinetic data. The focus of this thesis is kinetic models of cell metabolism. In this work we start by developing a software package to create a model ensemble for individual enzymes in metabolism, where we decompose each reaction into elementary steps, using mass action kinetics to model each step. The resulting rate constants are then fitted to kinetic data (kcat, Km, Ki, etc.). We then use the package as the basis to build a system-level kinetic model. To do so, we take two different approaches, and in both we drop the assumption that χfree ≈ χtot…

  18. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  19. The mass-action law based algorithm for cost-effective approach for cancer drug discovery and development.

    Science.gov (United States)

    Chou, Ting-Chao

    2011-01-01

    The mass-action law based system analysis, via mathematical induction and deduction, leads to a generalized theory and algorithm that allows computerized simulation of dose-effect dynamics with small-size experiments using a small number of data points in vitro, in animals, and in humans. The median-effect equation of the mass-action law, deduced from over 300 mechanism-specific equations, has been shown to be the unified theory that serves as the common link for complicated biomedical systems. After using the median-effect principle as the common denominator, its applications are mechanism-independent, drug-unit-independent, and dynamic-order-independent, and can be used generally for single-drug analysis or for multiple drug combinations in constant or non-constant ratios. Since the "median" is the common link and universal reference point in biological systems, this generality enables computerized quantitative bio-informatics for econo-green bio-research in broad disciplines. Specific applications of the theory, especially those relevant to drug discovery, drug combination, and clinical trials, have been cited or illustrated in terms of algorithms, experimental design, and computerized simulation for data analysis. Lessons learned from cancer research during the past fifty years provide a valuable opportunity to reflect, to improve the conventional divergent approach, and to introduce a new convergent avenue, based on the mass-action law principle, for efficient cancer drug discovery and low-cost drug development.
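The median-effect equation cited above has the closed form fa/fu = (D/Dm)^m, where fa and fu are the fractions affected and unaffected, Dm is the median-effect dose, and m is the slope. A small sketch with illustrative parameter values:

```python
# Chou's median-effect equation: fa/fu = (D/Dm)^m. Solving for the fraction
# affected gives fa = r / (1 + r) with r = (D/Dm)^m. Parameters are illustrative.
def fraction_affected(dose, dm, m):
    """Fraction of the system affected at a given dose (median-effect equation)."""
    r = (dose / dm) ** m
    return r / (1.0 + r)

# At the median-effect dose, exactly half the system is affected:
print(fraction_affected(dose=10.0, dm=10.0, m=1.5))  # → 0.5
```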

  20. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
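Theoretical isotope distribution modelling, one of the functions listed above, can be illustrated with a carbon-only binomial model. This is a deliberate simplification (real implementations such as Decon2LS consider all elements, typically via an averagine approximation):

```python
from math import comb

# Simplified illustration of theoretical isotope distribution modelling:
# the probability of k heavy isotopes when only carbon-13 (natural abundance
# ~1.07%) is considered, for a molecule with n carbons.
P13C = 0.0107

def isotope_peak(n_carbons, k):
    """Relative intensity of the k-th isotopic peak (binomial model, carbon only)."""
    return comb(n_carbons, k) * P13C**k * (1 - P13C)**(n_carbons - k)

# For a 50-carbon peptide, the monoisotopic peak still dominates:
peaks = [round(isotope_peak(50, k), 3) for k in range(3)]
print(peaks)
```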

  1. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
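The decay heat bookkeeping that the Automator handles can be caricatured in a few lines: total pool heat load is a sum over nuclides, each decaying exponentially with its half-life. The nuclide data below are rough illustrative values, not ORIGEN output:

```python
from math import exp, log

# Illustrative decay-heat sum of the kind automated above. Watts-at-discharge
# values are invented; half-lives are approximate literature values.
nuclides = [
    # (name, watts per assembly at discharge, half-life in years)
    ("Cs-137", 50.0, 30.1),
    ("Sr-90",  40.0, 28.8),
]

def decay_heat(t_years):
    """Total decay heat after t years of pool residence (exponential decay)."""
    return sum(w * exp(-log(2) * t_years / t_half) for _, w, t_half in nuclides)

# Heat load decreases monotonically with pool residence time:
print(round(decay_heat(0.0), 1), ">", round(decay_heat(10.0), 1))
```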

  2. Murine Automated Urine Sampler (MAUS), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal outlines planned development for a low-power, low-mass automated urine sample collection and preservation system for small mammals, capable of...

  3. Technical note: CT-guided biopsy of lung masses using an automated guiding apparatus

    International Nuclear Information System (INIS)

    Chellathurai, Amarnath; Kanhirat, Saneej; Chokkappan, Kabilan; Swaminathan, Thiruchendur S; Kulasekaran, Nadhamuni

    2009-01-01

    Automated guiding apparatuses for CT-guided biopsies are now available. We report our experience with an indigenous system to guide lung biopsies. This system gave results similar to those obtained with the manual technique. Automated planning also appears to be technically easier: it requires fewer needle passes, consumes less time, and requires fewer check scans

  4. Hexicon 2: automated processing of hydrogen-deuterium exchange mass spectrometry data with improved deuteration distribution estimation.

    Science.gov (United States)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L; Hamprecht, Fred A; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge under http://hx2.mpimf-heidelberg.mpg.de.
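The "deuteration centroid" whose recovery is benchmarked above is the intensity-weighted mean mass of an isotopic envelope. A sketch with invented peak values (not Hexicon 2's actual estimator, which fits a full deuteration distribution):

```python
# Intensity-weighted centroid of an isotopic envelope, the quantity whose
# recovery is benchmarked above. All peak values are invented for illustration.
masses      = [800.40, 800.90, 801.40, 801.90]  # m/z of isotopic peaks
intensities = [0.10, 0.40, 0.35, 0.15]          # relative intensities

centroid = sum(m * i for m, i in zip(masses, intensities)) / sum(intensities)
print(round(centroid, 3))  # → 801.175
```

In an HDX experiment, the shift of this centroid between undeuterated and deuterated runs tracks the average deuterium uptake of the peptide.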

  6. 77 FR 63771 - Implementation of Full-Service Intelligent Mail Requirements for Automation Prices

    Science.gov (United States)

    2012-10-17

    ... large amount of additional data. Also, it requires each of my customers to have their own CRID. Today... Automation Prices AGENCY: Postal Service™. ACTION: Proposed rule. SUMMARY: The Postal Service is proposing...]) throughout various sections to modify eligibility requirements for mailers to obtain automation prices for...

  7. Pharmacokinetic Studies of Chinese Medicinal Herbs Using an Automated Blood Sampling System and Liquid Chromatography-mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Yu-Tse Wu

    2012-01-01

    The safety of herbal products is one of the major concerns for the modernization of traditional Chinese medicine, and pharmacokinetic data of medicinal herbs guide us in designing the rational use of herbal formulas. This article reviews the advantages of automated blood sampling (ABS) systems for pharmacokinetic studies. In addition, three commonly used sample preparation methods (protein precipitation, liquid-liquid extraction, and solid-phase extraction) are introduced. Furthermore, the definition, causes, and evaluation of matrix effects in liquid chromatography-mass spectrometry (LC/MS) analysis are demonstrated. Finally, we present our previous work as practical examples of the application of ABS systems and LC/MS to the pharmacokinetic studies of Chinese medicinal herbs.

  8. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    Science.gov (United States)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format, including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences TriVersa NanoMate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  9. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Background: Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results: With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles.
    Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  10. If this then that: an introduction to automated task services.

    Science.gov (United States)

    Hoy, Matthew B

    2015-01-01

    This article explores automated task services, a type of website that allows users to create rules that are triggered by activity on one website and perform a task on another site. The most well-known automated task service is If This Then That (IFTTT), but recently a large number of these services have sprung up. These services can be used to connect websites, apps, business services, and even devices such as phones and home automation equipment. This allows for millions of possible combinations of rules, triggers, and actions. Librarians can put these services to use in many ways, from automating social media postings to remembering to bring their umbrella when rain is in the forecast. A list of popular automated task services is included, as well as a number of ideas for using these services in libraries.
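The trigger-action rule pattern described above can be sketched in a few lines. The event shape and the `rule`/`dispatch` API here are invented for illustration, not any service's actual interface:

```python
# Toy sketch of the "if this then that" rule pattern: a rule pairs a trigger
# predicate with an action, and incoming events are dispatched against all rules.
rules = []
log = []

def rule(trigger, action):
    """Register a trigger/action pair (hypothetical API)."""
    rules.append((trigger, action))

def dispatch(event):
    """Run the action of every rule whose trigger matches the event."""
    for trigger, action in rules:
        if trigger(event):
            action(event)

# The umbrella example from the text, as a rule:
rule(trigger=lambda e: e["type"] == "rain_forecast",
     action=lambda e: log.append("Reminder: bring your umbrella"))

dispatch({"type": "rain_forecast"})
print(log)  # → ['Reminder: bring your umbrella']
```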

  11. Formularity: Software for Automated Formula Assignment of Natural and Other Organic Matter from Ultrahigh-Resolution Mass Spectra.

    Science.gov (United States)

    Tolić, Nikola; Liu, Yina; Liyu, Andrey; Shen, Yufeng; Tfaily, Malak M; Kujawinski, Elizabeth B; Longnecker, Krista; Kuo, Li-Jung; Robinson, Errol W; Paša-Tolić, Ljiljana; Hess, Nancy J

    2017-12-05

    Ultrahigh-resolution mass spectrometry, such as Fourier transform ion cyclotron resonance mass spectrometry (FT ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work, we describe Formularity, software with a user-friendly interface for the CIA function and a newly developed search function, the Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in natural as well as anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. Tap water and a HOC spike in Suwannee River NOM were used to assess HOC identification in complex environmental samples. Strategies for the reconciliation of CIA and IPA assignments are discussed. Software and sample databases with documentation are freely available.
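A toy version of elemental formula assignment of the kind described above: enumerate small CHNO compositions whose exact mass matches an observed mass within a ppm tolerance. This is not the CIA/IPA implementation; the search ranges are illustrative and only four elements are considered:

```python
# Brute-force formula assignment over small CHNO compositions. Monoisotopic
# masses are standard values; atom-count ranges are arbitrary illustrations.
MASS = {"C": 12.0, "H": 1.00782503, "N": 14.00307401, "O": 15.99491462}

def label(c, h, n, o):
    """Render a formula string, omitting elements with a zero count."""
    s = f"C{c}H{h}"
    if n:
        s += f"N{n}"
    if o:
        s += f"O{o}"
    return s

def assign_formulas(observed, ppm_tol=5.0):
    """Return all CHNO formulas within ppm_tol of the observed neutral mass."""
    hits = []
    for c in range(1, 40):
        for h in range(1, 80):
            for n in range(0, 5):
                for o in range(0, 10):
                    m = (c * MASS["C"] + h * MASS["H"]
                         + n * MASS["N"] + o * MASS["O"])
                    if abs(m - observed) / observed * 1e6 <= ppm_tol:
                        hits.append(label(c, h, n, o))
    return hits

# Glucose, C6H12O6, has a monoisotopic mass of ~180.06339 Da:
print(assign_formulas(180.06339))
```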

  12. Automated planning through abstractions in dynamic and stochastic environments

    OpenAIRE

    Martínez Muñoz, Moisés

    2016-01-01

    International Mention in the doctoral degree. Generating sequences of actions - plans - for an automatic system, like a robot, using Automated Planning is particularly difficult in stochastic and/or dynamic environments. These plans are composed of actions whose execution, in certain scenarios, might fail, which in turn prevents the execution of the rest of the actions in the plan. Also, in some environments, plans must be generated fast, both at the start of the execution and after every ex...

  13. Automating the Detection of Reflection-on-Action

    Science.gov (United States)

    Saucerman, Jenny; Ruis, A. R.; Shaffer, David Williamson

    2017-01-01

    Learning to solve "complex problems"--problems whose solutions require the application of more than basic facts and skills--is critical to meaningful participation in the economic, social, and cultural life of the digital age. In this paper, we use a theoretical understanding of how professionals use reflection-in-action to solve complex…

  14. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
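The STD-to-code translation described above can be miniaturized as a transition table driven by external events. The states, events, and actions here are invented; the real system compiles the diagram to C via the SNL intermediate language:

```python
# Toy event-driven state machine in the spirit of the STD model described
# above: each (state, event) pair maps to a next state and an action.
transitions = {
    # (state, event): (next_state, action)
    ("idle",    "start"): ("running", "open valve"),
    ("running", "stop"):  ("idle",    "close valve"),
}

state = "idle"
actions_taken = []
for event in ["start", "stop", "start"]:     # external events drive execution
    state, action = transitions[(state, event)]
    actions_taken.append(action)

print(state, actions_taken)  # → running ['open valve', 'close valve', 'open valve']
```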

  15. Cato Guldberg and Peter Waage, the history of the Law of Mass Action, and its relevance to clinical pharmacology.

    Science.gov (United States)

    Ferner, Robin E; Aronson, Jeffrey K

    2016-01-01

    We have traced the historical link between the Law of Mass Action and clinical pharmacology. The Law evolved from the work of the French chemist Claude Louis Berthollet, was first formulated by Cato Guldberg and Peter Waage in 1864, and was later clarified by the Dutch chemist Jacobus van 't Hoff in 1877. It has profoundly influenced our qualitative and quantitative understanding of a number of physiological and pharmacological phenomena. According to the Law of Mass Action, the velocity of a chemical reaction depends on the concentrations of the reactants. At equilibrium the concentrations of the chemicals involved bear a constant relation to each other, described by the equilibrium constant, K. The Law of Mass Action is relevant to various physiological and pharmacological concepts, including concentration-effect curves, dose-response curves, and ligand-receptor binding curves, all of which are important in describing the pharmacological actions of medications; the Langmuir adsorption isotherm, which describes the binding of medications to proteins; activation curves for transmembrane ion transport; enzyme inhibition; and the Henderson-Hasselbalch equation, which describes the relation between pH, as a measure of acidity, and the concentrations of the contributory acids and bases. Guldberg and Waage recognized the importance of dynamic equilibrium, while others failed to do so. Their ideas, over 150 years old, are embedded in and still relevant to clinical pharmacology. Here we explain the ideas and in a subsequent paper show how they are relevant to understanding adverse drug reactions. © 2015 The British Pharmacological Society.
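Two textbook consequences of the Law of Mass Action mentioned in the abstract, ligand-receptor occupancy and the Henderson-Hasselbalch equation, in a short sketch (parameter values are illustrative):

```python
from math import log10

# Fractional receptor occupancy follows from mass action at equilibrium:
# occupancy = [L] / ([L] + Kd), where Kd is the dissociation constant.
def occupancy(ligand, kd):
    return ligand / (ligand + kd)

# The Henderson-Hasselbalch equation: pH = pKa + log10([base]/[acid]).
def henderson_hasselbalch(pka, base, acid):
    return pka + log10(base / acid)

# When ligand concentration equals Kd, half the receptors are occupied:
print(occupancy(ligand=5e-9, kd=5e-9))        # → 0.5
# Equal acid and conjugate-base concentrations give pH = pKa:
print(henderson_hasselbalch(4.76, 0.1, 0.1))  # → 4.76
```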

  16. Automation of dimethylation after guanidination labeling chemistry and its compatibility with common buffers and surfactants for mass spectrometry-based shotgun quantitative proteome analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang, E-mail: Liang.Li@ualberta.ca

    2013-07-25

    Highlights: • Dimethylation after guanidination (2MEGA) uses inexpensive reagents for isotopic labeling of peptides. • 2MEGA can be optimized and automated for labeling peptides with high efficiency. • 2MEGA is compatible with several commonly used cell lysis and protein solubilization reagents. • The automated 2MEGA labeling method can be used to handle a variety of protein samples for relative proteome quantification. -- Abstract: Isotope labeling liquid chromatography–mass spectrometry (LC–MS) is a major analytical platform for quantitative proteome analysis. Incorporation of the isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase labeling reproducibility and reduce human intervention. We also evaluated the compatibility of this protocol with biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention, and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. It was found that better than 94% desired labeling could be obtained under all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation of the peptide amines. This work illustrates that the automated 2MEGA labeling process can be used to handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis.

  17. Automation Technology and Sense of Control: A Window on Human Agency

    Science.gov (United States)

    Berberian, Bruno; Sarrazin, Jean-Christophe; Le Blaye, Patrick; Haggard, Patrick

    2012-01-01

    Previous studies have shown that the perceived times of voluntary actions and their effects are perceived as shifted towards each other, so that the interval between action and outcome seems shortened. This has been referred to as ‘intentional binding’ (IB). However, the generality of this effect remains unclear. Here we demonstrate that Intentional Binding also occurs in complex control situations. Using an aircraft supervision task with different autopilot settings, our results first indicated a strong relation between measures of IB and different levels of system automation. Second, measures of IB were related to explicit agency judgement in this applied setting. We discuss the implications for the underlying mechanisms, and for sense of agency in automated environments. PMID:22479528

  19. MARC and the Library Service Center: Automation at Bargain Rates.

    Science.gov (United States)

    Pearson, Karl M.

    Despite recent research and development in the field of library automation, libraries have been unable to reap the benefits promised by technology due to the high cost of building and maintaining their own computer-based systems. Time-sharing and disc mass storage devices will bring automation costs, if spread over a number of users, within the…

  20. Autonomy, Automation, and Systems

    Science.gov (United States)

    Turner, Philip R.

    1987-02-01

    Aerospace industry interest in autonomy and automation, given fresh impetus by the national goal of establishing a Space Station, is becoming a major item of research and technology development. The promise of new technology arising from research in Artificial Intelligence (AI) has focused much attention on its potential in autonomy and automation. These technologies can improve performance in autonomous control functions that involve planning, scheduling, and fault diagnosis of complex systems. There are, however, many aspects of system and subsystem design in an autonomous system that impact AI applications, but do not directly involve AI technology. Development of a system control architecture, establishment of an operating system within the design, providing command and sensory data collection features appropriate to automated operation, and the use of design analysis tools to support system engineering are specific examples of major design issues. Aspects such as these must also receive attention and technology development support if we are to implement complex autonomous systems within the realistic limitations of mass, power, cost, and available flight-qualified technology that are all-important to a flight project.

  1. Gravity-Assist Trajectories to the Ice Giants: An Automated Method to Catalog Mass- or Time-Optimal Solutions

    Science.gov (United States)

    Hughes, Kyle M.; Knittel, Jeremy M.; Englander, Jacob A.

    2017-01-01

    This work presents an automated method of calculating mass (or time) optimal gravity-assist trajectories without a priori knowledge of the flyby-body combination. Since gravity assists are particularly crucial for reaching the outer Solar System, we use the Ice Giants, Uranus and Neptune, as example destinations for this work. Catalogs are also provided that list the most attractive trajectories found over launch dates ranging from 2024 to 2038. The tool developed to implement this method, called the Python EMTG Automated Trade Study Application (PEATSA), iteratively runs the Evolutionary Mission Trajectory Generator (EMTG), a NASA Goddard Space Flight Center in-house trajectory optimization tool. EMTG finds gravity-assist trajectories with impulsive maneuvers using a multiple-shooting structure along with stochastic methods (such as monotonic basin hopping) and may be run with or without an initial guess provided. PEATSA runs instances of EMTG in parallel over a grid of launch dates. After each set of runs completes, the best results within a neighborhood of launch dates are used to seed all other cases in that neighborhood---allowing the solutions across the range of launch dates to improve over each iteration. The results here are compared against trajectories found using a grid-search technique, and PEATSA is found to outperform the grid-search results for most launch years considered.
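The neighborhood-seeding loop that PEATSA applies across the launch-date grid can be sketched as follows. This is an illustrative reconstruction, not EMTG/PEATSA code: `toy_cost`, `optimize`, and the date grid are stand-ins for the real parallel trajectory optimizations.

```python
import random

def toy_cost(date, x):
    """Stand-in objective: for launch date d the optimum is x = d/2."""
    return (x - date * 0.5) ** 2

def optimize(launch_date, seed=None):
    """Stand-in for one EMTG run: a crude stochastic local search that
    returns (cost, solution); a seed solution biases where it starts."""
    best = seed if seed is not None else random.uniform(0, 10)
    for _ in range(50):
        cand = best + random.uniform(-1, 1)
        if toy_cost(launch_date, cand) < toy_cost(launch_date, best):
            best = cand
    return toy_cost(launch_date, best), best

def peatsa_iterate(dates, n_iters=3, radius=1):
    """Run one optimization per launch date, then repeatedly re-seed each
    date with the best solution found in its launch-date neighborhood."""
    results = {d: optimize(d) for d in dates}
    for _ in range(n_iters):
        improved = {}
        for i, d in enumerate(dates):
            hood = dates[max(0, i - radius): i + radius + 1]
            best_neighbor = min(results[h] for h in hood)[1]
            rerun = optimize(d, seed=best_neighbor)
            improved[d] = min(rerun, results[d])  # keep the better of old/new
        results = improved
    return results
```

The key idea survives the simplification: good solutions at one launch date seed the neighboring dates, so the whole grid improves with each sweep.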

  2. Automation of a thermal ionisation mass spectrometer

    International Nuclear Information System (INIS)

    Pamula, A.; Leuca, M.; Albert, S.; Benta, Adriana

    2001-01-01

    A thermal ionization mass spectrometer was upgraded in order to be monitored by a PC. A PC-LMP-16 National Instruments data acquisition board was used for the ion current channel and the Hall signal channel. A dedicated interface was built to allow commands from the computer to the current supply of the analyzing magnet and to the high voltage unit of the mass spectrometer. A software application was worked out to perform the adjustment of the spectrometer, magnetic scanning and mass spectra acquisition, data processing and isotope ratio determination. The apparatus is used for 235U/238U isotope ratio determination near the natural abundance. A peak jumping technique is applied to choose between the 235U and 238U signals, by switching the high voltage applied to the ion source between two preset values. This avoids the delay between the acquisition of the peaks of interest, a delay that would appear in the case of a 'pure' magnetic scanning. Corrections are applied for the mass discrimination effects and a statistical treatment of the data is achieved. (authors)
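The peak-jumping measurement described above reduces, in essence, to averaging interleaved ion-current readings and applying a mass-discrimination correction. A minimal sketch (the function name and the simple multiplicative correction are assumptions for illustration, not the instrument's actual software):

```python
def isotope_ratio(counts_235, counts_238, discrimination=1.0):
    """Ratio from interleaved peak-jump acquisitions: each pair of
    readings taken at the two high-voltage settings yields one ratio;
    the mean is scaled by a multiplicative mass-discrimination factor."""
    ratios = [a / b for a, b in zip(counts_235, counts_238)]
    n = len(ratios)
    mean = sum(ratios) / n
    # sample standard deviation as a simple statistical treatment
    stdev = (sum((r - mean) ** 2 for r in ratios) / (n - 1)) ** 0.5
    return mean * discrimination, stdev
```

For uranium near natural abundance the corrected mean should come out close to 0.0072.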

  3. Development of design principles for automated systems in transport control.

    Science.gov (United States)

    Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa

    2012-01-01

    This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.

  4. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  5. Impact of Automation on Drivers' Performance in Agricultural Semi-Autonomous Vehicles.

    Science.gov (United States)

    Bashiri, B; Mann, D D

    2015-04-01

    Drivers' inadequate mental workload has been reported as one of the negative effects of driving assistant systems and in-vehicle automation. The increasing trend of automation in agricultural vehicles raises some concerns about drivers' mental workload in such vehicles. Thus, a human factors perspective is needed to identify the consequences of such automated systems. In this simulator study, the effects of vehicle steering task automation (VSTA) and implement control and monitoring task automation (ICMTA) were investigated using a tractor-air seeder system as a case study. Two performance parameters (reaction time and accuracy of actions) were measured to assess drivers' perceived mental workload. Experiments were conducted using the tractor driving simulator (TDS) located in the Agricultural Ergonomics Laboratory at the University of Manitoba. Study participants were university students with tractor driving experience. According to the results, reaction time and number of errors made by drivers both decreased as the automation level increased. Correlations were found among performance parameters and subjective mental workload reported by the drivers.
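The reported link between the performance parameters and subjective workload comes down to a correlation computation. A plain Pearson coefficient, for instance, can be computed as below (a generic sketch, not the study's analysis code):

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples,
    e.g. mean reaction times vs. subjective workload ratings."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```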

  6. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would increase the characteristics of robustness, efficiency, flexibility, and advanced manoeuvrability.

  7. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via Internet. The system was used for 14C, 10Be, 26Al and 129I measurements
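An averaging-based search illustrates how a tuning algorithm can be made robust to noisy beam readings, as required for the automatic ion-optics tuning described above. This is a simplified stand-in under stated assumptions: `noisy_transmission` and the shrinking coordinate-search scheme are illustrative, not the VERA implementation.

```python
import random

def noisy_transmission(x, noise=0.05):
    """Stand-in beam transmission: peak at settings (1, -2), plus noise."""
    clean = 1.0 / (1.0 + (x[0] - 1) ** 2 + (x[1] + 2) ** 2)
    return clean + random.gauss(0, noise)

def robust_maximize(f, x0, step=0.5, repeats=5, iters=100):
    """Coordinate search in which each candidate setting is scored by
    averaging several noisy readings, so a single lucky reading
    cannot mislead the tuner."""
    score = lambda x: sum(f(x) for _ in range(repeats)) / repeats
    best, best_s = list(x0), score(x0)
    for _ in range(iters):
        for dim in range(len(best)):
            for delta in (+step, -step):
                cand = list(best)
                cand[dim] += delta
                s = score(cand)
                if s > best_s:
                    best, best_s = cand, s
        step *= 0.95  # shrink steps as the optimum is approached
    return best
```

Averaging `repeats` readings reduces the noise standard error by a factor of sqrt(repeats), which is the whole robustness trick here.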

  8. Approaches towards the automated interpretation and prediction of electrospray tandem mass spectra of non-peptidic combinatorial compounds.

    Science.gov (United States)

    Klagkou, Katerina; Pullen, Frank; Harrison, Mark; Organ, Andy; Firth, Alistair; Langley, G John

    2003-01-01

    Combinatorial chemistry is widely used within the pharmaceutical industry as a means of rapid identification of potential drugs. With the growth of combinatorial libraries, mass spectrometry (MS) became the key analytical technique because of its speed of analysis, sensitivity, accuracy and ability to be coupled with other analytical techniques. In the majority of cases, electrospray mass spectrometry (ES-MS) has become the default ionisation technique. However, due to the absence of fragment ions in the resulting spectra, tandem mass spectrometry (MS/MS) is required to provide structural information for the identification of an unknown analyte. This work discusses the first steps of an investigation into the fragmentation pathways taking place in electrospray tandem mass spectrometry. The ultimate goal for this project is to set general fragmentation rules for non-peptidic, pharmaceutical, combinatorial compounds. As an aid, an artificial intelligence (AI) software package is used to facilitate interpretation of the spectra. This initial study has focused on determining the fragmentation rules for some classes of compound types that fit the remit as outlined above. Based on studies carried out on several combinatorial libraries of these compounds, it was established that different classes of drug molecules follow unique fragmentation pathways. In addition to these general observations, the specific ionisation processes and the fragmentation pathways involved in the electrospray mass spectra of these systems were explored. The ultimate goal will be to incorporate our findings into the computer program and allow identification of an unknown, non-peptidic compound following insertion of its ES-MS/MS spectrum into the AI package. The work herein demonstrates the potential benefit of such an approach in addressing the issue of high-throughput, automated MS/MS data interpretation. Copyright 2003 John Wiley & Sons, Ltd.

  9. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Science.gov (United States)

    2013-01-11

    ... [Docket No. APHIS-2012-0041] Notification of Deletion of a System of Records; Automated Trust Funds Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security...

  10. Formularity: Software for Automated Formula Assignment of Natural and Other Organic Matter from Ultrahigh-Resolution Mass Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Tolic, Nikola; Liu, Yina; Liyu, Andrey V.; Shen, Yufeng; Tfaily, Malak M.; Kujawinski, Elizabeth B.; Longnecker, Krista; Kuo, Li-Jung; Robinson, Errol W.; Pasa Tolic, Ljiljana; Hess, Nancy J.

    2017-11-13

    Ultrahigh-resolution mass spectrometry, such as Fourier transform ion-cyclotron resonance mass spectrometry (FT-ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work we describe a user friendly interface for CIA, titled Formularity, which includes an additional functionality to perform search of formulas based on an Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in nature as well as anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. The HOC spike in NOM and tap water were used to assess HOC identification in natural and anthropogenic matrices. Strategies for reconciliation of CIA and IPA assignments are discussed. Software and sample databases with documentation are freely available from the PNNL OMICS software repository https://omics.pnl.gov/software/formularity.
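The core of CIA-style formula assignment is enumerating elemental compositions whose exact mass matches the measured ion within a ppm tolerance. A minimal sketch restricted to C/H/N/O (Formularity itself handles more elements, isotopic patterns via IPA, and chemical plausibility filtering):

```python
# Monoisotopic masses of the most abundant isotopes (u)
MASS = {'C': 12.0, 'H': 1.007825, 'N': 14.003074, 'O': 15.994915}

def assign_formulas(neutral_mass, tol_ppm=1.0, max_atoms=(40, 80, 5, 20)):
    """Enumerate C/H/N/O formulas whose neutral monoisotopic mass
    falls within tol_ppm of the measured value."""
    hits = []
    cmax, hmax, nmax, omax = max_atoms
    for c in range(1, cmax + 1):
        for n in range(nmax + 1):
            for o in range(omax + 1):
                # solve for the hydrogen count that best fits the remainder
                rem = neutral_mass - c * MASS['C'] - n * MASS['N'] - o * MASS['O']
                h = round(rem / MASS['H'])
                if 0 <= h <= hmax:
                    mass = (c * MASS['C'] + h * MASS['H']
                            + n * MASS['N'] + o * MASS['O'])
                    if abs(mass - neutral_mass) / neutral_mass * 1e6 <= tol_ppm:
                        hits.append(f"C{c}H{h}N{n}O{o}")
    return hits
```

For example, a measured neutral mass of 180.06339 u should recover the glucose composition C6H12O6.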

  11. Wireless Home Automation System using IoT

    Directory of Open Access Journals (Sweden)

    Alexandra MIHALACHE

    2017-01-01

    Full Text Available Nowadays, the chance of having an automated home is no longer a fancy luxury, but a reality accessible to a wide range of consumers, because smart home systems have replaced those that only automated the home in the past. More and more solutions based on IoT are being developed to transform homes into smart ones, but the problem is that the benefits of home automation are still not clear to everyone as they are not promoted enough, so we cannot talk about a broad mass of consumers already using integrated or DIY solutions to improve their lives. In this paper, I will present a home automation system using Arduino Uno integrated with relevant modules which are used to allow remote control of lights or fans, changes being made on the basis of different sensors data. The system is designed to be low cost and expandable, bringing accessibility, convenience and energy efficiency.

  12. Automated analysis of non-mass-enhancing lesions in breast MRI based on morphological, kinetic, and spatio-temporal moments and joint segmentation-motion compensation technique

    Science.gov (United States)

    Hoffmann, Sebastian; Shutler, Jamie D.; Lobbes, Marc; Burgeth, Bernhard; Meyer-Bäse, Anke

    2013-12-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) represents an established method for the detection and diagnosis of breast lesions. While mass-like enhancing lesions can be easily categorized according to the Breast Imaging Reporting and Data System (BI-RADS) MRI lexicon, a majority of diagnostically challenging lesions, the so-called non-mass-like enhancing lesions, remain both qualitatively as well as quantitatively difficult to analyze. Thus, the evaluation of kinetic and/or morphological characteristics of non-masses represents a challenging task for an automated analysis and is of crucial importance for advancing current computer-aided diagnosis (CAD) systems. Compared to the well-characterized mass-enhancing lesions, non-masses have ill-defined, blurred tumor borders and a kinetic behavior that is not easily generalized into features discriminative between malignant and benign non-masses. To overcome these difficulties and pave the way for novel CAD systems for non-masses, we will evaluate several kinetic and morphological descriptors separately and a novel technique, the Zernike velocity moments, to capture the joint spatio-temporal behavior of these lesions, and additionally consider the impact of non-rigid motion compensation on a correct diagnosis.

  13. Application Filters for TCP/IP Industrial Automation Protocols

    Science.gov (United States)

    Batista, Aguinaldo B.; Kobayashi, Tiago H.; Medeiros, João Paulo S.; Brito, Agostinho M.; Motta Pires, Paulo S.

    The use of firewalls is a common approach to securing Automation Technology (AT) networks from Information Technology (IT) networks. This work proposes a filtering system for TCP/IP-based automation networks in which only certain kinds of industrial traffic are permitted. All network traffic that does not conform to a proper industrial protocol pattern, or to specific rules for its actions, is considered abnormal and must be blocked. As a case study, we developed a seventh-layer firewall application with the ability to block spurious traffic, using an IP packet queueing engine and a regular expression library.
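A default-deny filter of the kind proposed can be illustrated for a Modbus/TCP-like frame: only traffic that parses as a well-formed header with a whitelisted function code passes. The header layout below follows the standard Modbus/TCP (MBAP) framing; the use of `struct` parsing rather than the authors' regular-expression engine, and the particular allowed function codes, are assumptions for the example.

```python
import struct

# Whitelisted Modbus function codes: 3 = read holding registers,
# 4 = read input registers (read-only operations).
ALLOWED_FUNCTIONS = {3, 4}

def allow_frame(payload: bytes) -> bool:
    """Default-deny layer-7 filter: a frame passes only if it parses as
    a well-formed Modbus/TCP header with a whitelisted function code."""
    if len(payload) < 8:
        return False
    # MBAP header: transaction id, protocol id, length, unit id, function
    tid, proto, length, unit, func = struct.unpack('>HHHBB', payload[:8])
    if proto != 0:                  # Modbus/TCP protocol id is always 0
        return False
    if length != len(payload) - 6:  # length field covers unit id onward
        return False
    return func in ALLOWED_FUNCTIONS
```

Anything that is not well-formed read-only Modbus traffic, including write commands, is dropped by construction.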

  14. Automated CD-SEM recipe creation technology for mass production using CAD data

    Science.gov (United States)

    Kawahara, Toshikazu; Yoshida, Masamichi; Tanaka, Masashi; Ido, Sanyu; Nakano, Hiroyuki; Adachi, Naokaka; Abe, Yuichi; Nagatomo, Wataru

    2011-03-01

    Critical Dimension Scanning Electron Microscope (CD-SEM) recipe creation requires preparing a sample for matching-pattern registration and then creating the recipe on the CD-SEM using that sample, which hinders the reduction of test production cost and time in semiconductor manufacturing factories. From the perspective of cost reduction and test production efficiency, automated CD-SEM recipe creation without sample preparation or manual operation has become important on production lines. For automated recipe creation we introduced RecipeDirector (RD), which builds recipes from Computer-Aided Design (CAD) data and text data containing measurement information. We developed a system that automatically creates the CAD data and text data necessary for recipe creation on RD; and, to eliminate manual operation, we enhanced RD so that all measurement information can be specified in the text data. As a result, we established an automated CD-SEM recipe creation system that requires neither sample preparation nor manual operation. For the introduction of the RD-based recipe creation system to the production lines, pattern matching accuracy was an issue: the matching templates created from CAD data differ visually in shape from the SEM images, so a robust pattern matching algorithm that accounts for this shape difference was needed. The addition of image processing of the matching templates and shape processing of the CAD patterns in the lower layer has enabled robust pattern matching. This paper describes the automated CD-SEM recipe creation technology for production lines, without sample preparation or manual operation, using RD as applied at Sony Semiconductor Kyusyu Corporation Kumamoto Technology Center (SCK Corporation Kumamoto TEC).

  15. Automation warning system against driver falling asleep in-traffic

    Directory of Open Access Journals (Sweden)

    Dymov I. S.

    2017-12-01

    Full Text Available The paper is devoted to the development of a new automated system for recognizing and warning against driver falling asleep in traffic. Monitoring the physical condition of professional drivers en route is considered both in terms of the efficiency and quality of its determination and in terms of improving overall road safety. Existing, widely used devices for detecting the onset of sleep in drivers are analyzed, and their advantages and disadvantages identified. The main factor preventing mass adoption of earlier warning systems is found to be the need to put on a monitoring device before starting to drive. The project proposes combined monitoring of the driver's physical and physiological condition as a new method of warning against falling asleep in traffic; the proposed algorithmic implementations can be used in long-distance trucks and passenger vehicles. Two versions of automatic monitoring of the driver's physical condition are considered. The first uses biometric sensors: pulse, body temperature, and hands-on-wheel pressure sensors. The second uses tracking cameras. In both versions, the control devices are installed inside the vehicle and exert no physical, and thus no irritating, action on the driver. A software approach for rejecting false triggering of the devices is developed. The paper presents flow diagrams of the automatic systems and the logical structure of analysis and decision-making, and proposes a set of stimuli intended to awaken the driver. Conclusions are drawn about the engineering prospects of the proposed automation systems.
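The decision logic fusing the biometric channels described in such a system can be sketched as a simple rule set; every threshold here is a hypothetical placeholder, not a value from the paper.

```python
def drowsiness_alert(pulse_bpm, skin_temp_c, grip_pressure, eyes_closed_s):
    """Fuse sensor channels into a single wake-up decision.
    Thresholds are illustrative, not clinically derived. Requiring two
    independent indicators rejects single-sensor false triggering."""
    indicators = [
        pulse_bpm < 55,        # heart rate dropping toward resting/sleep
        grip_pressure < 0.2,   # hands relaxing off the wheel
        eyes_closed_s > 2.0,   # prolonged eye closure (tracking camera)
        skin_temp_c > 37.2,    # drowsiness-related warming
    ]
    return sum(indicators) >= 2
```

The two-of-four vote is one simple form of the false-operation rejection the abstract mentions: no single noisy sensor can trigger the awakening stimuli on its own.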

  16. Automated digital magnetofluidics

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, J; Garcia, A A; Marquez, M [Harrington Department of Bioengineering Arizona State University, Tempe AZ 85287-9709 (United States)], E-mail: tony.garcia@asu.edu

    2008-08-15

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller which has a RISC-based, clock multiplying processor with DSP functions, accepting encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands at rates as fast as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  17. Automated digital magnetofluidics

    Science.gov (United States)

    Schneider, J.; Garcia, A. A.; Marquez, M.

    2008-08-01

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller which has a RISC-based, clock multiplying processor with DSP functions, accepting encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands at rates as fast as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  18. Structural Characterization of Laboratory Made Tholins by IRMPD Action Spectroscopy and Ultrahigh Resolution Mass Spectrometry

    Science.gov (United States)

    Thissen, R.; Somogyi, A.; Vuitton, V.; Bégué, D.; Lemaire, J.; Steinmetz, V.

    2011-10-01

    The complex organic material that is found on the surface and within the haze layer of Titan is attributed to chemistry occurring in its thick N2/CH4 atmosphere. Although several groups produce the so-called tholins in various laboratory settings, and these have been investigated using analytical methods including UV/Vis, fluorescence, IR, and MS [1-5], these very complex organic mixtures still hold many unanswered questions, especially related to their potential for prebiotic chemistry. In addition to tholins characterization and analysis, we recently investigated quantitatively the hydrolysis kinetics of tholins in pure and NH3-containing water at different temperatures [7,8]. Our groups at UJF (Grenoble) and at U of Arizona (Tucson) have been collaborating on mass spectral analyses of tholin samples for several years [9]. Here, we report our most recent results on the structural characterization of tholins by infrared multiphoton dissociation (IRMPD) action spectroscopy [10] and ultrahigh resolution MS. IRMPD action spectroscopy is a recently developed technique that uses IR photons of variable wavelengths to activate ions trapped inside an ion trap. When photons are absorbed at a given wavelength, the selected ion fragments, and this fragmentation is monitored as a function of wavelength, analogous to an absorption spectrum (which is impossible to record directly because of the much reduced ion density). This technique can, therefore, be used to determine IR spectra of ions in the gas phase, and provides very precise structural information. IRMPD action spectroscopy is often used to distinguish between structural isomers of isobaric ions. The drawback is that it requires high-power lasers: only two Free Electron Lasers (FEL) are available in the world that allow spectra to be recorded with reasonable resolution (20-25 cm-1). IRMPD action spectra of selected ions from tholins will be presented and discussed together with observed fragmentation processes that reveal structural

  19. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-05

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total 14C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method.

  20. Quantum back-action-evading measurement of motion in a negative mass reference frame

    Science.gov (United States)

    Møller, Christoffer B.; Thomas, Rodrigo A.; Vasilakis, Georgios; Zeuthen, Emil; Tsaturyan, Yeghishe; Balabas, Mikhail; Jensen, Kasper; Schliesser, Albert; Hammerer, Klemens; Polzik, Eugene S.

    2017-07-01

    Quantum mechanics dictates that a continuous measurement of the position of an object imposes a random quantum back-action (QBA) perturbation on its momentum. This randomness translates with time into position uncertainty, thus leading to the well known uncertainty on the measurement of motion. As a consequence of this randomness, and in accordance with the Heisenberg uncertainty principle, the QBA puts a limitation—the so-called standard quantum limit—on the precision of sensing of position, velocity and acceleration. Here we show that QBA on a macroscopic mechanical oscillator can be evaded if the measurement of motion is conducted in the reference frame of an atomic spin oscillator. The collective quantum measurement on this hybrid system of two distant and disparate oscillators is performed with light. The mechanical oscillator is a vibrational ‘drum’ mode of a millimetre-sized dielectric membrane, and the spin oscillator is an atomic ensemble in a magnetic field. The spin oriented along the field corresponds to an energetically inverted spin population and realizes a negative-effective-mass oscillator, while the opposite orientation corresponds to an oscillator with positive effective mass. The QBA is suppressed by -1.8 decibels in the negative-mass setting and enhanced by 2.4 decibels in the positive-mass case. This hybrid quantum system paves the way to entanglement generation and distant quantum communication between mechanical and spin systems and to sensing of force, motion and gravity beyond the standard quantum limit.

  2. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As its main result, this technique will provide the design engineer with decision tables for fault handling.
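
    The matrix formulation mentioned above can be illustrated with a toy example (the system, failure modes, and effects below are invented for illustration): failure propagation becomes a boolean matrix-vector product, chained level by level through the system.

```python
# Matrix-based FMEA sketch: M[i][j] = True means failure mode j at one level
# propagates to effect i at the next level.
def propagate(matrix, modes):
    """Boolean matrix-vector product: which effects follow from the active modes."""
    return [any(m and f for m, f in zip(row, modes)) for row in matrix]

# Two-level example: component failure modes -> subsystem effects -> system effects
M_sub = [  # rows: subsystem effects, cols: component failure modes
    [True,  False, False],   # 'loss of flow'   <- 'pump seized'
    [False, True,  True],    # 'signal invalid' <- 'sensor drift' or 'cable break'
]
M_sys = [  # rows: system effects, cols: subsystem effects
    [True,  True],           # 'shutdown required' <- either subsystem effect
]

modes = [True, False, True]  # 'pump seized' and 'cable break' active
print(propagate(M_sys, propagate(M_sub, modes)))
```

    Reading the chained products row by row yields exactly the kind of decision table for fault handling the abstract describes.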

  4. Automation in the clinical microbiology laboratory.

    Science.gov (United States)

    Novak, Susan M; Marlowe, Elizabeth M

    2013-09-01

    Imagine a clinical microbiology laboratory where a patient's specimens are placed on a conveyor belt and sent on an automation line for processing and plating. Technologists need only log onto a computer to visualize the images of a culture and send a colony to a mass spectrometer for identification. Once a pathogen is identified, the system knows to send the colony for susceptibility testing. This is the future of the clinical microbiology laboratory. This article outlines the operational and staffing challenges facing clinical microbiology laboratories and the evolution of automation that is shaping the way laboratory medicine will be practiced in the future. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Automated remedial assessment methodology software system

    International Nuclear Information System (INIS)

    Whiting, M.; Wilkins, M.; Stiles, D.

    1994-11-01

    The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies
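
    The mass-balance module can be sketched in miniature (the removal efficiencies and masses below are invented for illustration): each technology in a treatment train removes a fraction of the contaminant, and the removed mass becomes a secondary waste stream that is routed onward for further treatment.

```python
# Hypothetical ARAM-style mass balance: each technology removes a fraction of
# the contaminant; what it removes becomes a secondary waste stream.
def treatment_train(initial_mass_kg, efficiencies):
    remaining = initial_mass_kg
    secondary = []
    for eff in efficiencies:
        removed = remaining * eff
        secondary.append(removed)   # captured mass routed to further treatment
        remaining -= removed        # residual passed to the next technology
    return remaining, secondary

# e.g. a 90 %-effective step followed by a 50 %-effective step (invented values)
residual, waste = treatment_train(100.0, [0.90, 0.50])
print(round(residual, 1), [round(w, 1) for w in waste])
```

    Summing the residual and the secondary streams always recovers the initial mass, which is the invariant such a module must preserve.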

  6. Automated Online Solid-Phase Derivatization for Sensitive Quantification of Endogenous S-Nitrosoglutathione and Rapid Capture of Other Low-Molecular-Mass S-Nitrosothiols.

    Science.gov (United States)

    Wang, Xin; Garcia, Carlos T; Gong, Guanyu; Wishnok, John S; Tannenbaum, Steven R

    2018-02-06

    S-Nitrosothiols (RSNOs) constitute a circulating endogenous reservoir of nitric oxide and have important biological activities. In this study, an online coupling of solid-phase derivatization (SPD) with liquid chromatography-mass spectrometry (LC-MS) was developed and applied to the analysis of low-molecular-mass RSNOs. A derivatizing-reagent-modified polymer monolithic column was prepared and adapted for online SPD-LC-MS. Analytes from the LC autosampler flowed through the monolithic column for derivatization and then directly into the LC-MS for analysis. This integration of online derivatization, LC separation, and MS detection facilitated system automation, allowing rapid, labor-saving, and sensitive detection of RSNOs. S-Nitrosoglutathione (GSNO) was quantified using this automated online method with good linearity (R² = 0.9994); the limit of detection was 0.015 nM. The online SPD-LC-MS method was used to determine GSNO levels in mouse samples; 138 ± 13.2 nM of endogenous GSNO was detected in mouse plasma. In addition, the GSNO concentrations in liver (64.8 ± 11.3 pmol/mg protein), kidney (47.2 ± 6.1 pmol/mg protein), heart (8.9 ± 1.8 pmol/mg protein), muscle (1.9 ± 0.3 pmol/mg protein), hippocampus (5.3 ± 0.9 pmol/mg protein), striatum (6.7 ± 0.6 pmol/mg protein), cerebellum (31.4 ± 6.5 pmol/mg protein), and cortex (47.9 ± 4.6 pmol/mg protein) were successfully quantified. Because the derivatization is complete within 8 min before LC-MS detection, samples could be analyzed rapidly compared with the offline manual method. Other low-molecular-mass RSNOs, such as S-nitrosocysteine and S-nitrosocysteinylglycine, were captured by rapid precursor-ion scanning, showing that the proposed method is a potentially powerful tool for the capture, identification, and quantification of RSNOs in biological samples.
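
    Quantification against a calibration line of the kind behind the reported linearity can be sketched as follows; the standard concentrations and peak areas are hypothetical, not the paper's data.

```python
def fit_line(x, y):
    """Ordinary least squares fit of y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

conc = [1.0, 10.0, 50.0, 100.0, 200.0]     # calibration standards, nM (invented)
area = [2.1, 20.4, 101.0, 199.8, 400.5]    # corresponding peak areas (invented)
slope, intercept = fit_line(conc, area)

unknown_area = 277.0                       # a hypothetical plasma sample
print(round((unknown_area - intercept) / slope, 1))   # back-calculated conc., nM
```

    The back-calculated concentration for this invented sample comes out near the ~138 nM order reported for mouse plasma; real work would of course also report the fit's R² and residuals.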

  7. Trust in automation and meta-cognitive accuracy in NPP operating crews

    Energy Technology Data Exchange (ETDEWEB)

    Skraaning Jr, G.; Miberg Skjerve, A. B. [OECD Halden Reactor Project, PO Box 173, 1751 Halden (Norway)

    2006-07-01

    Nuclear power plant operators can over-trust or under-trust automation. Operator trust in automation is said to be mis-calibrated when the level of trust does not correspond to the actual level of automation reliability. A possible consequence of mis-calibrated trust is degraded meta-cognitive accuracy, i.e. the ability to correctly monitor the effectiveness of one's own performance while engaged in complex tasks. When operators misjudge their own performance, human control actions will be poorly regulated and safety and/or efficiency may suffer. An analysis of simulator data showed that meta-cognitive accuracy and trust in automation were highly correlated for knowledge-based scenarios, but uncorrelated for rule-based scenarios. In the knowledge-based scenarios, the operators overestimated their performance effectiveness under high levels of trust and underestimated it under low levels of trust, but showed realistic self-assessment under intermediate levels of trust in automation. The result was interpreted to suggest that trust in automation impacts the meta-cognitive accuracy of the operators. (authors)
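
    The correlation analysis described is a standard Pearson product-moment computation; a minimal sketch with invented crew scores (not the Halden simulator data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical crew scores: trust in automation vs. meta-cognitive accuracy
trust    = [2.0, 3.5, 5.0, 6.5, 8.0]
accuracy = [0.35, 0.50, 0.62, 0.71, 0.88]
print(round(pearson_r(trust, accuracy), 2))   # "highly correlated" pattern
```

    An r near 1 corresponds to the knowledge-based scenarios in the study; an r near 0 would correspond to the rule-based ones.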

  9. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
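
    The "preprogrammed to take action" idea can be sketched as a client-side dispatch table. The signal levels and shed actions below are illustrative assumptions, not the actual OpenADR data model.

```python
# Minimal sketch of a pre-programmed DR client: map a received price/reliability
# signal level to a load-shed action with no manual intervention.
SHED_ACTIONS = {
    "NORMAL":   {"hvac_setpoint_offset_c": 0, "lighting_level": 1.00},
    "MODERATE": {"hvac_setpoint_offset_c": 2, "lighting_level": 0.80},
    "HIGH":     {"hvac_setpoint_offset_c": 4, "lighting_level": 0.60},
}

def on_dr_event(signal):
    """Return the control action pre-programmed for a DR signal level."""
    level = signal.get("level", "NORMAL")
    return SHED_ACTIONS.get(level, SHED_ACTIONS["NORMAL"])

event = {"event_id": "dr-001", "level": "HIGH"}   # signal from the utility/ISO
print(on_dr_event(event))
```

    The point of the open specification is that the signal side of this table is standardized while each facility keeps its own right-hand side.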

  10. Automation of the dicentric chromosome assay and related assays

    International Nuclear Information System (INIS)

    Balajee, Adayabalam S.; Dainiak, Nicholas

    2016-01-01

    Dicentric Chromosome Assay (DCA) is considered to be the 'gold standard' for personalized dose assessment in humans after accidental or incidental radiation exposure. Although this technique is superior to other cytogenetic assays in terms of specificity and sensitivity, its potential application to radiation mass casualty scenarios is highly restricted because DCA is time consuming and labor intensive when performed manually. Therefore, it is imperative to develop high throughput automation techniques to make DCA suitable for radiological triage scenarios. At the Cytogenetic Biodosimetry Laboratory in Oak Ridge, efforts are underway to develop high throughput automation of DCA. Current status on development of various automated cytogenetic techniques in meeting the biodosimetry needs of radiological/nuclear incident(s) will be discussed

  11. Estimates of Radionuclide Loading to Cochiti Lake from Los Alamos Canyon Using Manual and Automated Sampling

    Energy Technology Data Exchange (ETDEWEB)

    McLean, Christopher T. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-07-01

    Los Alamos National Laboratory has a long-standing program of sampling storm water runoff inside the Laboratory boundaries. In 1995, the Laboratory started collecting the samples using automated storm water sampling stations; prior to this time the samples were collected manually. The Laboratory has also been periodically collecting sediment samples from Cochiti Lake. This paper presents the data for Pu-238 and Pu-239 bound to the sediments for Los Alamos Canyon storm water runoff and compares the sampling types by mass loading and as a percentage of the sediment deposition to Cochiti Lake. The data for both manual and automated sampling are used to calculate mass loads from Los Alamos Canyon on a yearly basis. The automated samples show mass loading 200-500 percent greater for Pu-238 and 300-700 percent greater for Pu-239 than the manual samples. Using the mean manual flow volume for mass loading calculations, the automated samples are over 900 percent greater for Pu-238 and over 1800 percent greater for Pu-239. Evaluating the Pu-238 and Pu-239 activities as a percentage of deposition to Cochiti Lake indicates that the automated samples are 700-1300 percent greater for Pu-238 and 200-500 percent greater for Pu-239. The variance was calculated by two methods: the first calculates the variance for each sample event; the second calculates the variances by the total volume of water discharged in Los Alamos Canyon for the year.
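
    The underlying mass-load arithmetic is simple; a sketch with invented concentrations and discharge volumes (the study's actual data are not reproduced here):

```python
# Yearly mass load = sum over storm events of (activity concentration x volume).
def yearly_load(events):
    """events: (concentration, volume) pairs -> total load in concentration*volume units."""
    return sum(conc * vol for conc, vol in events)

# Invented event data: (activity pCi/L, discharge volume L) per storm event
manual    = [(0.8, 1.0e6), (1.1, 2.0e6)]
automated = [(3.0, 1.0e6), (4.4, 2.0e6)]

m, a = yearly_load(manual), yearly_load(automated)
print(round(100 * a / m))   # automated load as a percent of the manual load
```

    A ratio like this is how an "automated samples show mass loading 300-700 percent greater" comparison is expressed.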

  12. Aircrew Discourse: Exploring Strategies of Information and Action Management

    Science.gov (United States)

    Irwin, Cheryl M.; Veinott, Elizabeth S.; Shafto, Michael G. (Technical Monitor)

    1995-01-01

    This paper explores methodology issues encountered in the analysis of flightcrew communications in aviation simulation research. Examples are provided by two recent studies which are compared on three issues: level of analysis, data definition, and interpretation of the results. The data discussed were collected in a study comparing two levels of aircraft automation. The first example is an investigation of how pilots' information transfer strategies differed as a function of automation during low and high-workload flight phases. The second study focuses on how crews managed actions in the two aircraft during a ten minute, high-workload flight segment. Results indicated that crews in the two aircraft differed in their strategies of information and action management. The differences are discussed in terms of their operational and research significance.

  13. Negative chemical ionization gas chromatography coupled to hybrid quadrupole time-of-flight mass spectrometry and automated accurate mass data processing for determination of pesticides in fruit and vegetables.

    Science.gov (United States)

    Besil, Natalia; Uclés, Samanta; Mezcúa, Milagros; Heinzen, Horacio; Fernández-Alba, Amadeo R

    2015-08-01

    Gas chromatography coupled to high-resolution hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS), operating in negative chemical ionization (NCI) mode and combining full-scan with MS/MS experiments using accurate mass analysis, has been explored for the automated determination of pesticide residues in fruit and vegetables. Seventy compounds were included in this approach, 50 % of which are not approved by EU legislation. Overall, 76 % of the analytes could be identified at 1 μg kg⁻¹. Recovery studies were performed at three concentration levels (1, 5, and 10 μg kg⁻¹). Seventy-seven percent of the pesticides detected at the lowest level yielded recoveries within the 70 %-120 % range, whereas 94 % could be quantified at 5 μg kg⁻¹ and 100 % at 10 μg kg⁻¹. Good repeatability, expressed as relative standard deviation (RSD), was obtained. A home-made database was developed and applied to automatic accurate-mass data processing. Measured mass accuracies of the generated ions were mainly less than 5 ppm for at least one diagnostic ion. When only one ion was obtained in the single-stage NCI-MS, a representative product ion from MS/MS experiments was used as the identification criterion. A total of 30 real samples were analyzed; 67 % of the samples were positive for 12 different pesticides in the range 1.0-1321.3 μg kg⁻¹.
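
    The <5 ppm identification criterion is simply a relative mass-error computation; the m/z values below are illustrative, not from the study.

```python
# Mass accuracy in parts per million relative to the theoretical exact mass.
def mass_error_ppm(measured, theoretical):
    return (measured - theoretical) / theoretical * 1e6

# e.g. a diagnostic ion with theoretical m/z 305.9987 measured at 305.9998
err = mass_error_ppm(305.9998, 305.9987)
print(abs(err) < 5.0)   # passes the 5 ppm identification criterion
```

    An automated screening workflow applies this test to every candidate ion against the database of theoretical masses.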

  14. Possibilities of the common research-development action in the field of automated logistical engines

    Directory of Open Access Journals (Sweden)

    Pap Lajos

    2003-12-01

    The paper briefly presents the R&D cooperation between the Department of Materials Handling and Logistics and the Departments of Automation, and introduces the main fields of cooperation. Different kinds of linear motor (LM) drives are being developed and tested for warehouse and rolling-conveyor systems. Modern control strategies using AI methods are being investigated and tested for automated guided vehicles. Wireless communication methods are being researched and developed for mobile material handling devices. Application possibilities of voice recognition and image processing are being tested for the control of material handling robots and devices. Process visualization programs are being developed and investigated, and a multi-level industrial communication system is being developed for the laboratories of the cooperating departments.

  15. Joint Facial Action Unit Detection and Feature Fusion: A Multi-Conditional Learning Approach

    NARCIS (Netherlands)

    Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja

    2016-01-01

    Automated analysis of facial expressions can benefit many domains, from marketing to clinical diagnosis of neurodevelopmental disorders. Facial expressions are typically encoded as a combination of facial muscle activations, i.e., action units. Depending on context, these action units co-occur in

  16. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  17. Balance between automation and human actions in nuclear power plant operation. Results of international cooperation

    Energy Technology Data Exchange (ETDEWEB)

    Sun, B [CEA Centre d' Etudes Nucleaires de Fontenay-aux-Roses, 92 (France). Dept. d' Analyse de Surete; Bastl, W [Gesellschaft fuer Reaktorsicherheit m.b.H. (GRS), Garching (Germany); Olmstead, R [Atomic Energy of Canada Ltd, Mississauga (Canada); Oudiz, A [Electric Power Research Inst., Palo Alto, CA (United States); Jenkinson, J [Nuclear Electric PLC, Gloucester (United Kingdom); Kossilov, A [International Atomic Energy Agency, Vienna (Austria)

    1990-07-01

    Automation has long been an established feature of power plants. In some applications, the use of automation has been the significant factor which has enabled plant technology to progress to its current state. Societal demands for increased levels of safety have led to greater use of redundancy and diversity and this, in turn, has increased levels of automation. However, possibly the greatest contributory factor in increased automation has resulted from improvements in information technology. Much recent attention has been focused on the concept of inherently safe reactors, which may simplify safety system requirements and information and control system complexity. The allocation of tasks between man and machine may be one of the most critical activities in the design of new nuclear plants and major retro-fits, and it therefore warrants a design approach which is commensurate in quality with the high levels of safety and production performance sought from nuclear plants. Facing this climate, in 1989 the International Atomic Energy Agency (IAEA) formed an advisory group from member countries with extensive experience in nuclear power plant automation. The task of this group was to advise on the appropriate balance between manual and automatic actions in plant operation. (author)

  18. Automation of Raw sugar Crystallizer tacho of Central Julio Antonio Mella

    Directory of Open Access Journals (Sweden)

    Mónica Mulet-Hing

    2016-04-01

    This paper analyzes the current situation of, and prospective solutions for, economical and efficient automation in the tacho (vacuum pan) area of the Julio Antonio Mella sugar company, as part of the "Supervisory Control Systems" for the first level of automation in this industry, with the proposed automation of the tacho area implemented with PLCs. The proposal arises from the need to improve the crystallization process of tacho 5 in that area, which currently has no automation at all, i.e. everything is done manually; automating it will undoubtedly improve the quality of the final product. The structure and the control-system variables are defined, demonstrating the feasibility of the proposed solution. The essential result of the work is an automation proposal comprising the structure of a control algorithm that takes into account the requirements, the technical resources for implementation, the variables that must be observed and processed, and the elements of final action; the corresponding field instrumentation is proposed so that the control can be performed satisfactorily with the minimum possible investment.
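
    Purely as an illustration of the kind of control algorithm proposed (the variable names, thresholds, and the control rule itself are hypothetical, not taken from the plant described):

```python
# Illustrative PLC-style scan logic for a vacuum pan (tacho): decide the feed
# valve state from the massecuite level and a supersaturation reading.
def scan_cycle(level_pct, supersaturation):
    if level_pct >= 95.0:
        return "valve_closed"   # pan full: stop feeding
    if supersaturation > 1.3:
        return "valve_open"     # feed liquor to bring supersaturation back down
    return "valve_hold"         # stay in the metastable zone

print(scan_cycle(60.0, 1.45))
```

    A real implementation would run as a ladder-logic or structured-text routine on the PLC, evaluated once per scan cycle against the field instrumentation.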

  19. Automated mass spectrum generation for new physics

    CERN Document Server

    Alloul, Adam; De Causmaecker, Karen; Fuks, Benjamin; Rausch de Traubenberg, Michel

    2013-01-01

    We describe an extension of the FeynRules package dedicated to the automatic generation of the mass spectrum associated with any Lagrangian-based quantum field theory. After introducing a simplified way to implement particle mixings, we present a new class of FeynRules functions allowing both for the analytical computation of all the model mass matrices and for the generation of a C++ package, dubbed ASperGe. This program can then be further employed for a numerical evaluation of the rotation matrices necessary to diagonalize the field basis. We illustrate these features in the context of the Two-Higgs-Doublet Model, the Minimal Left-Right Symmetric Standard Model and the Minimal Supersymmetric Standard Model.
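
    Numerically, the core of such a spectrum generator is the diagonalization of the model's mass matrices; for a 2x2 symmetric block (e.g. a two-Higgs-doublet CP-even sector) this reduces to a closed form. The matrix entries below are illustrative, not 2HDM benchmark values.

```python
import math

def diagonalize_2x2(a, b, c):
    """Mass eigenvalues and mixing angle of the symmetric matrix [[a, b], [b, c]]."""
    avg = (a + c) / 2.0
    r = math.hypot((a - c) / 2.0, b)
    theta = 0.5 * math.atan2(2.0 * b, a - c)   # tan(2*theta) = 2b / (a - c)
    return avg - r, avg + r, theta             # eigenvalues ordered m1 <= m2

# Illustrative 2x2 squared-mass matrix (units of GeV^2)
m1sq, m2sq, theta = diagonalize_2x2(4.0e4, 1.0e4, 9.0e4)
print(m1sq < m2sq)
```

    For larger mass matrices a tool like ASperGe does the same job with a general numerical eigensolver, returning both the eigenvalues and the rotation matrix that diagonalizes the field basis.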

  20. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  1. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    Science.gov (United States)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  2. Spatial and temporal analysis of mass movement using dendrochronology

    NARCIS (Netherlands)

    Braam, R.R.; Weiss, E.E.J.; Burrough, P.A.

    1987-01-01

    Tree growth and inclination on sloping land are affected by mass movement. Suitable analysis of tree growth and tree form can therefore provide considerable information on mass movement activity. This paper reports a new, automated method for studying the temporal and spatial aspects of mass movement.

  3. Hypervascular mediastinal masses: Action points for radiologists

    International Nuclear Information System (INIS)

    Cabral, Fernanda C.; Trotman-Dickenson, Beatrice; Madan, Rachna

    2015-01-01

    Highlights: •An algorithm combining clinical data and radiologic features of hypervascular mediastinal masses is proposed to determine further evaluation and subsequently guide treatment. •Characteristic features and known associations with syndromes and genetic mutations assist in achieving a diagnosis. •MRI and functional imaging can be very helpful in the evaluation of hypervascular mediastinal masses. •Identification of hypervascularity within a mediastinal mass should alert the radiologist and clinician; percutaneous CT-guided biopsies should preferably be avoided, with tissue sampling attempted surgically for better control of post-procedure hemorrhage. -- Abstract: Hypervascular mediastinal masses are a distinct group of rare diseases that include a subset of benign and malignant entities. Characteristic features and known associations with syndromes and genetic mutations assist in achieving a diagnosis. Imaging allows an understanding of the vascularity of the lesion and should alert the radiologist and clinician to potential hemorrhagic complications, arguing against percutaneous CT-guided biopsies. In such cases, pre-procedure embolization and surgical biopsy may be considered for better control of post-procedure hemorrhage. The purpose of this article is to describe and illustrate the clinical features and radiologic spectrum of hypervascular mediastinal masses, and to discuss the associated clinical and genetic syndromes. We present an imaging algorithm to determine further evaluation and subsequently guide treatment.

  4. Hypervascular mediastinal masses: Action points for radiologists

    Energy Technology Data Exchange (ETDEWEB)

    Cabral, Fernanda C.; Trotman-Dickenson, Beatrice; Madan, Rachna, E-mail: rmadan@partners.org

    2015-03-15

    Highlights: •An algorithm combining clinical data and radiologic features of hypervascular mediastinal masses is proposed to determine further evaluation and subsequently guide treatment. •Characteristic features and known associations with syndromes and genetic mutations assist in achieving a diagnosis. •MRI and functional imaging can be very helpful in the evaluation of hypervascular mediastinal masses. •Identification of hypervascularity within a mediastinal mass should alert the radiologist and clinician; percutaneous CT-guided biopsies should preferably be avoided in favor of surgical tissue sampling, which offers better control of post-procedure hemorrhage. -- Abstract: Hypervascular mediastinal masses are a distinct group of rare diseases that include a subset of benign and malignant entities. Characteristic features and known associations with syndromes and genetic mutations assist in achieving a diagnosis. Imaging allows an understanding of the vascularity of the lesion and should alert the radiologist and clinician to the potential hemorrhagic complications of percutaneous CT-guided biopsies. In such cases, pre-procedure embolization and surgical biopsy may be considered for better control of post-procedure hemorrhage. The purpose of this article is to describe and illustrate the clinical features and radiologic spectrum of hypervascular mediastinal masses, and to discuss the associated clinical and genetic syndromes. We present an imaging algorithm to determine further evaluation and subsequently guide treatment.

  5. Determination of thermodynamic potentials and the aggregation number for micelles with the mass-action model by isothermal titration calorimetry

    DEFF Research Database (Denmark)

    Olesen, Niels Erik; Westh, Peter; Holm, René

    2015-01-01

    of micelles with ITC were compared to a mass-action model (MAM) of reaction type: n⋅S⇌Mn. This analysis can provide guidelines for future ITC studies of systems behaving in accordance with this model such as micelles and proteins that undergo self-association to oligomers. Micelles with small aggregation...
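    The record above is truncated, but the mass-action model it names is simple enough to sketch. Below is a minimal, hypothetical Python illustration of the reaction type n·S ⇌ Mn: given an assumed association constant K and aggregation number n (both invented for illustration, not taken from the paper), the free-monomer concentration follows from the mass balance S_total = S + n·K·S^n, solved here by bisection.

```python
# Sketch of the mass-action model n*S <=> M_n for micelle formation.
# K and n below are assumed, illustrative values, not the paper's results.

def free_monomer(s_total, K, n, tol=1e-12):
    """Solve s + n*K*s**n = s_total for the free monomer concentration s
    by bisection (the left-hand side is monotonically increasing in s)."""
    lo, hi = 0.0, s_total
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid + n * K * mid**n > s_total:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

K, n = 1e86, 30  # assumed association constant and aggregation number
for s_tot in (1e-3, 2e-3, 5e-3):
    s = free_monomer(s_tot, K, n)
    print(f"total={s_tot:.0e}  free={s:.4e}  in micelles={s_tot - s:.4e}")
```

With these assumed constants the free-monomer concentration plateaus near a critical micelle concentration of about 1 mM: additional surfactant goes almost entirely into micelles, which is the qualitative behaviour the mass-action model is meant to capture.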

  6. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. 

  7. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: (i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; (ii) automated N-linked glycosylation pathway construction; and (iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme
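    The framework described in this record stores enzyme specificity in a machine-readable class and infers reactions automatically. The toy Python sketch below illustrates only the core idea, single-reaction inference plus breadth-first network expansion, with an invented linear-glycan encoding and made-up enzyme names; the actual framework uses a much richer specificity definition.

```python
# Hypothetical sketch of rule-based glycosylation network construction.
# Glycans are modeled as tuples of residue names; enzymes act on the
# terminal residue only. All names and rules are illustrative.

from collections import deque

class Enzyme:
    """An enzyme defined by the terminal residue it acts on and its effect:
    add a residue (glycosyltransferase) or trim one (glycosidase)."""
    def __init__(self, name, acts_on, adds=None, removes=False):
        self.name, self.acts_on, self.adds, self.removes = name, acts_on, adds, removes

    def apply(self, glycan):
        """Single-reaction inference: return the product glycan, or None
        if the enzyme's specificity does not match the terminal residue."""
        if not glycan or glycan[-1] != self.acts_on:
            return None
        return glycan[:-1] if self.removes else glycan + (self.adds,)

def build_network(seeds, enzymes):
    """Breadth-first expansion: apply every matching enzyme to every known
    glycan until no new species appear. Edges are (substrate, enzyme, product)."""
    species, edges, queue = set(seeds), [], deque(seeds)
    while queue:
        g = queue.popleft()
        for e in enzymes:
            p = e.apply(g)
            if p is not None:
                edges.append((g, e.name, p))
                if p not in species:
                    species.add(p)
                    queue.append(p)
    return species, edges

enzymes = [
    Enzyme("GalT", acts_on="GlcNAc", adds="Gal"),
    Enzyme("SiaT", acts_on="Gal", adds="Neu5Ac"),
    Enzyme("sialidase", acts_on="Neu5Ac", removes=True),
]
species, edges = build_network([("GlcNAc",)], enzymes)
```

Each glycan is dequeued once, so the expansion terminates even though the sialidase regenerates an already-known species; the resulting edge list is the reaction network graph.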

  8. Computer-aided detection system for masses in automated whole breast ultrasonography: development and evaluation of the effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeoung Hyun [Dept. of Radiology, Ewha Womans University Mokdong Hospital, Ewha Womans University School of Medicine, Seoul (Korea, Republic of); Cha, Joo Hee; Kim, Nam Kug; Chang, Young Jun; Kim, Hak Hee [Dept. of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Ko, Myung Su [Health Screening and Promotion Center, Asan Medical Center, Seoul (Korea, Republic of); Choi, Young Wook [Korea Electrotechnology Research Institute, Ansan (Korea, Republic of)

    2014-04-15

    The aim of this study was to evaluate the performance of a proposed computer-aided detection (CAD) system in automated breast ultrasonography (ABUS). Eighty-nine two-dimensional images (20 cysts, 42 benign lesions, and 27 malignant lesions) were obtained from 47 patients who underwent ABUS (ACUSON S2000). After boundary detection and removal, we detected mass candidates by using the proposed adjusted Otsu's threshold; the threshold was adaptive to the variations of pixel intensities in an image. Then, the detected candidates were segmented. Features of the segmented objects were extracted and used for training/testing in the classification. In our study, a support vector machine classifier was adopted. Eighteen features were used to determine whether the candidates were true lesions or not. A five-fold cross validation was repeated 20 times for the performance evaluation. The sensitivity and the false positive rate per image were calculated, and the classification accuracy was evaluated for each feature. In the classification step, the sensitivity of the proposed CAD system was 82.67% (SD, 0.02%). The false positive rate was 0.26 per image. In the detection/segmentation step, the sensitivities for benign and malignant mass detection were 90.47% (38/42) and 92.59% (25/27), respectively. In the five-fold cross-validation, the standard deviation of pixel intensities for the mass candidates was the most frequently selected feature, followed by the vertical position of the centroids. In the univariate analysis, each feature had 50% or higher accuracy. The proposed CAD system can be used for lesion detection in ABUS and may be useful in improving the screening efficiency.
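    The detection step above relies on an adjusted Otsu threshold. The sketch below implements only the standard Otsu method (choosing the threshold that maximizes between-class variance over the intensity histogram); the paper's adjustment for per-image intensity variation is not reproduced, and the toy "image" is invented.

```python
# Standard Otsu thresholding on a grayscale histogram (baseline only;
# the paper's adjusted variant adapts to per-image pixel statistics).

def otsu_threshold(pixels, levels=256):
    """Return the intensity t maximizing between-class variance;
    pixels with value <= t form the background class."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg, w_bg = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        m_bg = sum_bg / w_bg
        m_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (m_bg - m_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy "image": dark background near 30, bright mass candidate near 200
pixels = [30] * 900 + [35] * 50 + [200] * 40 + [210] * 10
t = otsu_threshold(pixels)
```

On this bimodal toy data the threshold lands between the two modes, separating the bright candidate pixels from the background.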

  9. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    Energy Technology Data Exchange (ETDEWEB)

    Milliard, Alex; Durand-Jezequel, Myriam [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada); Lariviere, Dominic, E-mail: dominic.lariviere@chm.ulaval.ca [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada)

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs), and experimental data show no evidence of cross-contamination of crucibles when LiBO{sub 2}/LiBr melts were used. The use of a M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg{sup -1} for 5-300 mg of sample.

  10. Using Automated Planning for Traffic Signals Control

    Directory of Open Access Journals (Sweden)

    Matija Gulić

    2016-08-01

    Full Text Available Solving traffic congestion represents a high-priority issue in many big cities. Traditional traffic control systems are mainly based on pre-programmed, reactive and local techniques. This paper presents an autonomic system that uses automated planning techniques instead. These techniques are easily configured and modified, and can reason about the future implications of actions that change the default traffic-light behaviour. The implemented system includes some autonomic properties: it monitors the current traffic state, detects whether the system is degrading in performance, sets up new sets of goals to be achieved by the planner, triggers the planner to generate plans with control actions, and executes the selected courses of action. The results obtained in several artificial and real-world data-based simulation scenarios show that the proposed system can efficiently solve traffic congestion.
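    The autonomic cycle described above (monitor, detect degradation, set goals, plan, execute) can be caricatured in a few lines. This Python sketch substitutes a trivial greedy "planner" for the automated planner used in the paper, and all queue lengths and action effects are invented.

```python
# Minimal monitor -> detect -> plan -> execute loop for traffic control.
# The real system delegates planning to an automated planner; here a
# greedy stand-in emits "extend_green" actions. Values are illustrative.

def monitor(state):
    """Performance metric: total queued vehicles across intersections."""
    return sum(state.values())

def plan(state, threshold):
    """Greedily emit control actions at the most congested intersections
    until the simulated total queue drops below the goal threshold."""
    actions, queues = [], dict(state)
    while sum(queues.values()) > threshold:
        worst = max(queues, key=queues.get)
        actions.append(("extend_green", worst))
        queues[worst] = max(0, queues[worst] - 10)  # assumed effect per action
    return actions

def execute(state, actions):
    for _, intersection in actions:
        state[intersection] = max(0, state[intersection] - 10)
    return state

state = {"A": 25, "B": 5, "C": 12}
if monitor(state) > 20:  # degradation detected against a performance goal
    state = execute(state, plan(state, threshold=20))
```

The point of the sketch is the control loop itself: a declarative goal (total queue below a threshold) drives plan generation, and the executed plan restores acceptable performance.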

  11. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  12. A new method for automated discontinuity trace mapping on rock mass 3D surface model

    Science.gov (United States)

    Li, Xiaojun; Chen, Jianqin; Zhu, Hehua

    2016-04-01

    This paper presents an automated discontinuity trace mapping method on a 3D surface model of rock mass. Feature points of discontinuity traces are first detected using the Normal Tensor Voting Theory, which is robust to noisy point cloud data. Discontinuity traces are then extracted from feature points in four steps: (1) trace feature point grouping, (2) trace segment growth, (3) trace segment connection, and (4) redundant trace segment removal. A sensitivity analysis is conducted to identify optimal values for the parameters used in the proposed method. The optimal triangular mesh element size is between 5 cm and 6 cm; the angle threshold in the trace segment growth step is between 70° and 90°; the angle threshold in the trace segment connection step is between 50° and 70°, and the distance threshold should be at least 15 times the mean triangular mesh element size. The method is applied to the excavation face trace mapping of a drill-and-blast tunnel. The results show that the proposed discontinuity trace mapping method is fast and effective and could be used as a supplement to traditional direct measurement of discontinuity traces.
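    Steps (3) and (4) above hinge on angle and distance thresholds. The following Python sketch shows one plausible form of the trace-segment connection test, using thresholds in the ranges recommended by the paper's sensitivity analysis; the vector geometry is an illustration, not the authors' code.

```python
# Sketch of a trace-segment connection test: join two segments when the
# angle between their directions is below a threshold and their nearest
# endpoints are within 15x the mean mesh element size (per the paper).

import math

def angle_deg(u, v):
    """Angle between two direction vectors, folded to [0, 90] degrees
    (a trace direction has no preferred orientation)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    ang = math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))
    return min(ang, 180.0 - ang)

def should_connect(seg_a, seg_b, mesh_size=0.055, angle_thresh=60.0):
    """seg_* are (start, end) point pairs; mesh_size is the mean triangular
    mesh element size in metres (assumed value within the 5-6 cm optimum)."""
    dir_a = tuple(e - s for s, e in zip(*seg_a))
    dir_b = tuple(e - s for s, e in zip(*seg_b))
    gap = min(math.dist(pa, pb) for pa in seg_a for pb in seg_b)
    return angle_deg(dir_a, dir_b) <= angle_thresh and gap <= 15 * mesh_size

seg1 = ((0.0, 0.0), (1.0, 0.0))
seg2 = ((1.2, 0.05), (2.2, 0.1))
connect = should_connect(seg1, seg2)  # nearly collinear and close: join
```

Folding the angle to [0, 90] degrees matters: segments grown from opposite ends of the same trace point in opposite directions but should still be joined.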

  13. Rules-based analysis with JBoss Drools: adding intelligence to automation

    International Nuclear Information System (INIS)

    Ley, E. de; Jacobs, D.

    2012-01-01

    Rule engines are specialized software systems for applying conditional actions (if/then rules) to data. They are also known as 'production rule systems'. Rule engines are less well known as a software technology than the traditional procedural, object-oriented, scripting or dynamic development languages. This is a pity, as their use may offer an important enrichment to a development toolbox. JBoss Drools is an open-source rules engine that can easily be embedded in any Java application. Through an integration in our Passerelle process automation suite, we have been able to provide advanced solutions for intelligent process automation, complex event processing, system monitoring and alarming, automated repair, etc. This platform has been proven over many years as an automated diagnosis and repair engine for Belgium's largest telecom provider, and it is being piloted at Synchrotron Soleil for device monitoring and alarming. After an introduction to rules engines in general and JBoss Drools in particular, we present its integration in a solution platform, some important principles and a practical use case. (authors)
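    The if/then evaluation that a production rule system performs can be illustrated with a toy forward-chaining loop. This Python sketch is only a conceptual stand-in for Drools (which is Java-based and far more capable); the facts and rules are invented monitoring/alarming examples.

```python
# Toy forward-chaining production rule engine: fire any rule whose
# condition matches working memory until a fixed point is reached.
# This illustrates the concept only, not the Drools API.

def run_rules(facts, rules, max_iters=100):
    """facts: set of strings (working memory). rules: list of
    (condition, action) pairs, where action returns facts to assert."""
    facts = set(facts)
    for _ in range(max_iters):
        fired = False
        for condition, action in rules:
            new = action(facts) if condition(facts) else set()
            if not new <= facts:  # only counts as firing if it adds facts
                facts |= new
                fired = True
        if not fired:
            break  # fixed point: no rule produces anything new
    return facts

# Invented monitoring/alarming rules, in the spirit of the use cases above
rules = [
    (lambda f: "temp_high" in f and "pump_on" in f,
     lambda f: {"alarm_overheat"}),
    (lambda f: "alarm_overheat" in f,
     lambda f: {"action_shutdown_pump"}),
]
result = run_rules({"temp_high", "pump_on"}, rules)
```

Note how the second rule fires only because the first one asserted a new fact: this chaining of conclusions is what distinguishes a rule engine from a flat list of if-statements.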

  14. Single-core magnetic markers in rotating magnetic field based homogeneous bioassays and the law of mass action

    Energy Technology Data Exchange (ETDEWEB)

    Dieckhoff, Jan, E-mail: j.dieckhoff@tu-bs.de [Institut fuer Elektrische Messtechnik und Grundlagen der Elektrotechnik, TU Braunschweig, Braunschweig (Germany); Schrittwieser, Stefan; Schotter, Joerg [Molecular Diagnostics, AIT Austrian Institute of Technology, Vienna (Austria); Remmer, Hilke; Schilling, Meinhard; Ludwig, Frank [Institut fuer Elektrische Messtechnik und Grundlagen der Elektrotechnik, TU Braunschweig, Braunschweig (Germany)

    2015-04-15

    In this work, we report on the effect of the magnetic nanoparticle (MNP) concentration on the quantitative detection of proteins in solution with a rotating magnetic field (RMF) based homogeneous bioassay. Here, the phase lag between 30 nm iron oxide single-core particles and the RMF is analyzed with a fluxgate-based measurement system. As a test analyte anti-human IgG is applied which binds to the protein G functionalized MNP shell and causes a change of the phase lag. The measured phase lag changes for a fixed MNP and a varying analyte concentration are modeled with logistic functions. A change of the MNP concentration results in a nonlinear shift of the logistic function with the analyte concentration. This effect results from the law of mass action. Furthermore, the bioassay results are used to determine the association constant of the binding reaction. - Highlights: • A rotating magnetic field based homogeneous bioassay concept was presented. • Here, single-core iron oxide nanoparticles are applied as markers. • The impact of the particle concentration on the bioassay results is investigated. • The relation between particle concentration and bioassay sensitivity is nonlinear. • This finding can be reasonably explained by the law of mass action.
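    The nonlinear shift described above follows directly from the law of mass action. The sketch below solves the simple one-site binding equilibrium P + A ⇌ PA for an assumed dissociation constant and shows how site occupancy at a fixed analyte level depends on the particle (binding-site) concentration; all numerical values are illustrative, not the paper's.

```python
# Law-of-mass-action sketch for P + A <=> PA (particle binding site P,
# analyte A). [PA] follows from the standard quadratic solution.

import math

def bound(p_total, a_total, kd):
    """Equilibrium [PA] for totals p_total, a_total and dissociation
    constant kd, from Kd = [P][A]/[PA] plus the two mass balances."""
    s = p_total + a_total + kd
    return 0.5 * (s - math.sqrt(s * s - 4.0 * p_total * a_total))

kd = 1e-9  # assumed dissociation constant, mol/L
for p_tot in (1e-10, 1e-9, 1e-8):
    occupancy = bound(p_tot, 1e-9, kd) / p_tot
    print(f"P_total={p_tot:.0e}  site occupancy={occupancy:.3f}")
```

Raising the particle concentration at fixed analyte level depletes the free analyte, so the occupancy (and hence the measured phase-lag change) shifts nonlinearly, which is the effect the record attributes to the law of mass action.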

  15. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective, a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the task of providing global monitoring and that of automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of storage-area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board, with both dedicated metrics and views. The resulting structure allows the status of storage resources to be monitored with fine time granularity and automatic actions to be taken in foreseen cases, such as automatic outage handling and notifications to sites. Hence, human actions are restricted to reporting and following up on problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.

  16. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  17. Working together on automated vehicle guidance AVG : preliminary business plan, abridged version.

    NARCIS (Netherlands)

    Awareness (ed.)

    1998-01-01

    This plan describes the questions which will have to be answered in the short term, and the actions which need to be taken in a phased and structured manner to gain insight into the potential of automated vehicle guidance (AVG).

  18. An improved, automated whole air sampler and gas chromatography mass spectrometry analysis system for volatile organic compounds in the atmosphere

    Science.gov (United States)

    Lerner, Brian M.; Gilman, Jessica B.; Aikin, Kenneth C.; Atlas, Elliot L.; Goldan, Paul D.; Graus, Martin; Hendershot, Roger; Isaacman-VanWertz, Gabriel A.; Koss, Abigail; Kuster, William C.; Lueb, Richard A.; McLaughlin, Richard J.; Peischl, Jeff; Sueper, Donna; Ryerson, Thomas B.; Tokarek, Travis W.; Warneke, Carsten; Yuan, Bin; de Gouw, Joost A.

    2017-01-01

    Volatile organic compounds were quantified during two aircraft-based field campaigns using highly automated, whole air samplers with expedited post-flight analysis via a new custom-built, field-deployable gas chromatography-mass spectrometry instrument. During flight, air samples were pressurized with a stainless steel bellows compressor into electropolished stainless steel canisters. The air samples were analyzed using a novel gas chromatograph system designed specifically for field use which eliminates the need for liquid nitrogen. Instead, a Stirling cooler is used for cryogenic sample pre-concentration at temperatures as low as -165 °C. The analysis system was fully automated on a 20 min cycle to allow for unattended processing of an entire flight of 72 sample canisters within 30 h, thereby reducing typical sample residence times in the canisters to less than 3 days. The new analytical system is capable of quantifying a wide suite of C2 to C10 organic compounds at part-per-trillion sensitivity. This paper describes the sampling and analysis systems, along with the data analysis procedures which include a new peak-fitting software package for rapid chromatographic data reduction. Instrument sensitivities, uncertainties and system artifacts are presented for 35 trace gas species in canister samples. Comparisons of reported mixing ratios from each field campaign with measurements from other instruments are also presented.

  19. A new thermal ionisation mass spectrometer

    International Nuclear Information System (INIS)

    Haines, C.; Merren, T.O.; Unsworth, W.D.

    1979-01-01

    The Isomass 54E, a new thermal ionisation mass spectrometer for precise measurement of isotopic composition, is described in detail. It combines the fruits of three development projects, viz. automation, energy filters and extended geometry, with existing Micromass expertise and experience. The hardware and software used for the automation, as well as the energy filter used, are explained. The 'extended geometry' ion-optical system adopted for better performance is discussed in detail. (K.B.)

  20. Improving mass measurement accuracy in mass spectrometry based proteomics by combining open source tools for chromatographic alignment and internal calibration.

    Science.gov (United States)

    Palmblad, Magnus; van der Burgt, Yuri E M; Dalebout, Hans; Derks, Rico J E; Schoenmaker, Bart; Deelder, André M

    2009-05-02

    Accurate mass determination enhances peptide identification in mass spectrometry based proteomics. We here describe the combination of two previously published open source software tools to improve mass measurement accuracy in Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS). The first program, msalign, aligns one MS/MS dataset with one FTICRMS dataset. The second software, recal2, uses peptides identified from the MS/MS data for automated internal calibration of the FTICR spectra, resulting in sub-ppm mass measurement errors.
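    The recal2 step above uses MS/MS-identified peptides as internal calibrants. As a hedged illustration, the sketch below fits a linear m/z correction to calibrant pairs by least squares; recal2's actual FTICR calibration functions are more specific, and the peptide masses and drift here are invented.

```python
# Sketch of internal recalibration: identified peptides give pairs of
# (measured m/z, theoretical m/z); a linear correction fitted to them is
# applied to the whole spectrum. A straight line is used purely for
# illustration of the idea behind tools like recal2.

def fit_linear(measured, theoretical):
    """Least-squares fit: theoretical ~= a * measured + b."""
    n = len(measured)
    sx, sy = sum(measured), sum(theoretical)
    sxx = sum(x * x for x in measured)
    sxy = sum(x * y for x, y in zip(measured, theoretical))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Invented calibrant peptides with a small systematic measurement error
theoretical = [800.40, 1200.60, 1600.80, 2000.95]
measured = [m * (1 + 3e-6) + 0.0005 for m in theoretical]  # ~3 ppm drift

a, b = fit_linear(measured, theoretical)
recalibrated = [a * m + b for m in measured]
max_err_ppm = max(abs(r - t) / t * 1e6
                  for r, t in zip(recalibrated, theoretical))
```

Because the simulated error is itself linear, the fit removes it almost exactly; on real spectra the residual after calibration is what determines the achievable sub-ppm mass accuracy.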

  1. 75 FR 43537 - Mortgagee Review Board: Administrative Actions

    Science.gov (United States)

    2010-07-26

    ... Mortgage Corp., Inc., Margate, FL. 365. Florida Business Finance Corp., Jacksonville, FL (Titles 1 & 2... accurately identify Academy as the owner of the Web site; and failed to register the fictitious business name... requirements. 5. Automated Finance Corporation, Calabasas, CA [Docket No. 09-9825-MR] Action: On October 30...

  2. Automated toxicological screening reports of modified Agilent MSD Chemstation combined with Microsoft Visual Basic application programs.

    Science.gov (United States)

    Choe, Sanggil; Kim, Suncheun; Choi, Hyeyoung; Choi, Hwakyoung; Chung, Heesun; Hwang, Bangyeon

    2010-06-15

    Agilent GC-MS MSD Chemstation offers automated library search reports for toxicological screening using the total ion chromatogram (TIC) and mass spectra in normal mode. Numerous peaks appear in the chromatogram of a biological specimen such as blood or urine, and often large migrating peaks obscure small target peaks; in addition, target peaks of low abundance regularly give wrong library search results or low matching scores. As a result, the retention time and mass spectrum of every peak in the chromatogram have to be checked for relevance. These repeated actions are very tedious and time-consuming for toxicologists. The MSD Chemstation software operates using a number of macro files which give commands and instructions on how to work on and extract data from the chromatogram and spectra. These macro files are compiled by the software's own compiler. All the original macro files can be modified, and new macro files can be added to the original software by users. To get more accurate results with a more convenient method, and to save time in data analysis, we developed new macro files for report generation and inserted new menus in the Enhanced Data Analysis program. Toxicological screening reports generated by these new macro files are in text mode or graphic mode, and can be generated with three different automated subtraction options. Text reports have a Brief mode and a Full mode, and graphic reports have the option of including the mass spectrum or not. Matched mass spectra and matching scores for detected compounds are printed in reports by modified library searching modules. We have also developed an independent application program named DrugMan. This program manages the drug groups, lists and parameters that are in use in MSD Chemstation. The incorporation of DrugMan with modified macro modules provides a powerful tool for toxicological screening and saves a lot of valuable time in toxicological work. (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  3. Cellular-automaton fluids: A model for flow in porous media

    International Nuclear Information System (INIS)

    Rothman, D.H.

    1987-01-01

    Because the intrinsic inhomogeneity of porous media makes the application of proper boundary conditions difficult, fluid flow through microgeometric models has typically been modeled with idealized arrays of geometrically simple pores, throats, and cracks. The author proposes here an attractive alternative, capable of freely and accurately modeling fluid flow in grossly irregular geometries. This new method numerically solves the Navier-Stokes equations using the cellular-automaton fluid model introduced by Frisch, Hasslacher, and Pomeau. The cellular-automaton fluid is extraordinarily simple - particles of unit mass traveling with unit velocity reside on a triangular lattice and obey elementary collision rules - but capable of modeling much of the rich complexity of real fluid flow. The author shows how cellular-automaton fluids are applied to the study of porous media. In particular, he discusses issues of scale on the cellular-automaton lattice and presents the results of 2-D simulations, including numerical estimation of permeability and verification of Darcy's law
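    The collision rules mentioned above can be made concrete. The Python sketch below implements only the two-particle head-on collision of the FHP-style lattice gas (with a deterministic rotation rather than the random choice of the full model, and without the propagation step between lattice nodes), to show that a collision conserves particle number and momentum.

```python
# Sketch of an FHP-style lattice-gas collision: each node of a triangular
# lattice holds up to six unit-velocity particles (directions 0..5).
# A head-on pair (i, i+3) scatters into a rotated head-on pair.
# Simplified: deterministic rotation, collision step only, no streaming.

import math

def velocity(d):
    """Unit velocity for lattice direction d (0..5, 60-degree spacing)."""
    ang = math.pi * d / 3.0
    return (math.cos(ang), math.sin(ang))

def momentum(node):
    """Net momentum of the particles at a node (unit masses)."""
    px = sum(velocity(d)[0] for d in node)
    py = sum(velocity(d)[1] for d in node)
    return (px, py)

def collide(node):
    """Two-particle head-on collision: rotate the pair by 60 degrees.
    All other configurations pass through unchanged (simplified rule set)."""
    node = set(node)
    for i in range(3):
        if node == {i, i + 3}:
            return {(i + 1) % 6, (i + 4) % 6}
    return node

before = {0, 3}          # head-on pair along direction 0
after = collide(before)  # rotated head-on pair
```

A head-on pair carries zero net momentum, and so does its rotated image, so the collision conserves both mass (two particles in, two out) and momentum, which is what lets the coarse-grained dynamics reproduce Navier-Stokes behavior.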

  4. ParticipACTION: A mass media campaign targeting parents of inactive children; knowledge, saliency, and trialing behaviours

    Directory of Open Access Journals (Sweden)

    Gauvin Lise

    2009-12-01

    Full Text Available Abstract. Background: In late 2007, Canada's ParticipACTION national physical activity mass media campaign was re-launched, with an initial campaign targeting parents of elementary school-aged children. The campaign informed them about the risks of physical inactivity for children and youth. The purpose of this study was to assess campaign awareness and understanding following the campaign, and to identify whether exposure to this campaign was likely associated with behaviour change. Methods: A convenience sample of 1,500 adults was recruited through an existing panel (n = 60,000) of Canadian adults to participate in online surveys. Initial campaign exposure included "prompted" and "unprompted" recall of specific physical activity messages from the 2007 ParticipACTION campaign, knowledge of the benefits of PA, saliency, and initial trial behaviours to help their children become more active. Results: One quarter of respondents showed unprompted recall of specific message content from the ParticipACTION campaign, and prompted recall was 57%. Message recall and understanding were associated with knowledge about physical activity, and that in turn was related to high saliency. Saliency was associated with each of the physical activity-related trial behaviours asked about. Conclusion: Campaign awareness and understanding were high following this ParticipACTION campaign, and were associated with intermediate campaign outcomes, including saliency and trial behaviours. This is relevant to campaign evaluations, as it suggests that an initial focus on influencing awareness and understanding is likely to lead to more substantial change in campaign endpoints.

  5. Dynamic adaptive policymaking for the sustainable city: The case of automated taxis

    Directory of Open Access Journals (Sweden)

    Warren E. Walker

    2017-06-01

    Full Text Available By 2050, about two-thirds of the world's people are expected to live in urban areas. But the economic viability and sustainability of city centers are threatened by problems related to transport, such as pollution, congestion, and parking. Much has been written about automated vehicles and demand responsive transport. The combination of these potentially disruptive developments could reduce these problems. However, implementation is held back by uncertainties, including public acceptance, liability, and privacy. So, their potential to reduce urban transport problems may not be fully realized. We propose an adaptive approach to implementation that takes some actions right away and creates a framework for future actions that allows for adaptations over time as knowledge about performance and acceptance of the new system (called 'automated taxis') accumulates and critical events for implementation take place. The adaptive approach is illustrated in the context of a hypothetical large city.

  6. Efficient Temporal Action Localization in Videos

    KAUST Repository

    Alwassel, Humam

    2018-04-17

    State-of-the-art temporal action detectors inefficiently search the entire video for specific actions. Despite the encouraging progress these methods achieve, it is crucial to design automated approaches that only explore the parts of the video that are most relevant to the actions being searched. To address this need, we propose the new problem of action spotting in videos, which we define as finding a specific action in a video while observing a small portion of that video. Inspired by the observation that humans are extremely efficient and accurate in spotting and finding action instances in a video, we propose Action Search, a novel Recurrent Neural Network approach that mimics the way humans spot actions. Moreover, to address the absence of data recording the behavior of human annotators, we put forward the Human Searches dataset, which compiles the search sequences employed by human annotators spotting actions in the AVA and THUMOS14 datasets. We consider temporal action localization as an application of the action spotting problem. Experiments on the THUMOS14 dataset reveal that our model is not only able to explore the video efficiently (observing on average 17.3% of the video) but also accurately finds human activities with 30.8% mAP (0.5 tIoU), outperforming state-of-the-art methods.
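    The 0.5 tIoU threshold used in the evaluation above is the temporal intersection-over-union between a predicted and a ground-truth action segment. A minimal sketch of that metric (the interval endpoints below are invented for illustration):

```python
def temporal_iou(pred, gt):
    """Temporal intersection-over-union of two (start, end) segments in seconds."""
    inter = max(0.0, min(pred[1], gt[1]) - max(pred[0], gt[0]))
    union = (pred[1] - pred[0]) + (gt[1] - gt[0]) - inter
    return inter / union if union > 0 else 0.0

# A prediction counts as correct at the 0.5 tIoU threshold if its overlap
# with a ground-truth instance is at least half of their combined extent:
print(temporal_iou((10.0, 20.0), (12.0, 22.0)))  # 8 / 12, i.e. a match at 0.5 tIoU
```

A detection is then scored as a true positive only when its tIoU with an unmatched ground-truth segment reaches the chosen threshold.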

  7. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Full Text Available Virtual screening is an effective tool for lead identification in drug discovery. However, only a limited number of crystal structures are available compared to the number of biological sequences, which makes Structure-Based Drug Discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening, followed by pharmacology-based prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding site identification, automated docking, ligand preparation, post-docking analysis, and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in the characterization of ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. A judicious combination of ligands binding different receptors can be used to inhibit selective biological pathways in a disease. This tool also allows the user to systematically investigate network-dependent effects of a drug or drug candidate.

  8. Comparison of retraction phenomenon and BI-RADS-US descriptors in differentiating benign and malignant breast masses using an automated breast volume scanner.

    Science.gov (United States)

    Zheng, Feng-Yang; Yan, Li-Xia; Huang, Bei-Jian; Xia, Han-Sheng; Wang, Xi; Lu, Qing; Li, Cui-Xian; Wang, Wen-Ping

    2015-11-01

    To compare the diagnostic values of retraction phenomenon in the coronal planes and descriptors in the Breast Imaging Reporting and Data System-Ultrasound (BI-RADS-US) lexicon in differentiating benign and malignant breast masses using an automated breast volume scanner (ABVS). Two hundred and eight female patients with 237 pathologically proven breast masses (120 benign and 117 malignant) were included in this study. ABVS was performed for each mass after preoperative localization by conventional ultrasonography (US). Multivariate logistic regression analysis was performed to assess independent variables for malignancy prediction. Diagnostic performance was evaluated through receiver operating characteristic (ROC) curve analysis. Retraction phenomenon (odds ratio [OR]: 76.70; 95% confidence interval [CI]: 12.55, 468.70; P<0.001) was the strongest independent predictor for malignant masses, followed by microlobulated margins (OR: 55.87; 95% CI: 12.56, 248.44; P<0.001), angular margins (OR: 36.44; 95% CI: 4.55, 292.06; P=0.001), calcifications (OR: 5.53; 95% CI: 1.34, 22.88; P=0.018) and patient age (OR: 1.10; 95% CI: 1.03, 1.17; P=0.004). Mass shape, orientation, echo pattern, indistinct margins, spiculated margins, and mass size were not significantly associated with breast malignancy. Area under the ROC curve (Az) for microlobulated margins and retraction phenomenon was higher than that for other significant independent predictors. Az, sensitivity, and specificity were 0.877 (95% CI: 0.829, 0.926) and 0.838 (95% CI: 0.783, 0.892), 82.9% and 70.1%, and 92.5% and 98.3%, respectively, for microlobulated margins and retraction phenomenon. Retraction phenomenon and microlobulated margins have high diagnostic values in the differentiation of benign and malignant breast masses using an ABVS. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
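    For a single binary predictor, the odds ratios and Wald confidence intervals reported in studies like the one above reduce to the classic 2x2-table computation. A minimal sketch (the counts below are invented for illustration, not taken from the paper):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = malignant with feature, b = malignant without feature,
    c = benign with feature,    d = benign without feature."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for a strongly malignancy-associated feature:
or_, lo, hi = odds_ratio_ci(82, 35, 2, 118)
print(or_, lo, hi)
```

Multivariate logistic regression adjusts these estimates for the other descriptors, which is why the published ORs differ from what raw 2x2 tables would give.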

  9. Effective action and electromagnetic response of topological superconductors and Majorana-mass Weyl fermions

    Science.gov (United States)

    Stone, Michael; Lopes, Pedro L. e. S.

    2016-05-01

    Motivated by an apparent paradox in [X.-L. Qi, E. Witten, and S.-C. Zhang, Phys. Rev. B 87, 134519 (2013), 10.1103/PhysRevB.87.134519], we use the method of gauged Wess-Zumino-Witten functionals to construct an effective action for a Weyl fermion with a Majorana mass that arises from coupling to a charged condensate. We obtain expressions for the current induced by an external gauge field and observe that the topological part of the current is only one-third of what might have been expected from the gauge anomaly. The anomaly is not changed by the induced mass gap, however. The topological current is supplemented by a conventional supercurrent that provides the remaining two-thirds of the anomaly once the equation of motion for the Goldstone mode is satisfied. We apply our formula for the current to resolve the apparent paradox and also to the chiral magnetic effect (CME), where it predicts a reduction of the CME current to one-third of its value for a free Weyl gas in thermal equilibrium. We attribute this reduction to a partial cancellation of the CME by a chiral vortical effect current arising from the persistent rotation of the fluid induced by the external magnetic field.

  10. Determination of Low Concentrations of Acetochlor in Water by Automated Solid-Phase Extraction and Gas Chromatography with Mass-Selective Detection

    Science.gov (United States)

    Lindley, C.E.; Stewart, J.T.; Sandstrom, M.W.

    1996-01-01

    A sensitive and reliable gas chromatographic/mass spectrometric (GC/MS) method for determining acetochlor in environmental water samples was developed. The method involves automated extraction of the herbicide from a filtered 1 L water sample through a C18 solid-phase extraction column, elution from the column with hexane-isopropyl alcohol (3 + 1), and concentration of the extract with nitrogen gas. The herbicide is quantitated by capillary-column GC/MS with selected-ion monitoring of 3 characteristic ions. The single-operator method detection limit for reagent water samples is 0.0015 µg/L. Mean recoveries ranged from about 92 to 115% for 3 water matrixes fortified at 0.05 and 0.5 µg/L. Average single-operator precision, over the course of 1 week, was better than 5%.
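    The recovery and precision figures quoted above come from straightforward statistics on fortified replicate measurements; a minimal sketch, using invented replicate concentrations for a hypothetical 0.05 µg/L fortification:

```python
import statistics

def percent_recovery(measured, spiked):
    """Mean recovery (%) of measured concentrations relative to the fortified level."""
    return 100.0 * statistics.mean(measured) / spiked

def rsd_percent(measured):
    """Relative standard deviation (%), the usual single-operator precision metric."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# Hypothetical replicate results (ug/L) for a 0.05 ug/L fortified sample:
reps = [0.047, 0.052, 0.049, 0.051, 0.048]
print(round(percent_recovery(reps, 0.05), 1))  # mean recovery near 100%
print(round(rsd_percent(reps), 1))             # precision better than 5%
```

Recoveries between roughly 90 and 115% and an RSD under 5%, as reported in the abstract, are typical acceptance criteria for trace-level water methods.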

  11. Automated gravity gradient tensor inversion for underwater object detection

    International Nuclear Information System (INIS)

    Wu, Lin; Tian, Jinwen

    2010-01-01

    Underwater abnormal object detection is a current need for the navigation security of autonomous underwater vehicles (AUVs). In this paper, an automated gravity gradient tensor inversion algorithm is proposed for the purpose of passive underwater object detection. Full-tensor gravity gradient anomalies induced by an object in the partial area can be measured with the technique of gravity gradiometry on an AUV. Then the automated algorithm utilizes the anomalies, using the inverse method to estimate the mass and barycentre location of the arbitrary-shaped object. A few tests on simple synthetic models will be illustrated, in order to evaluate the feasibility and accuracy of the new algorithm. Moreover, the method is applied to a complicated model of an abnormal object with gradiometer and AUV noise, and interference from a neighbouring illusive smaller object. In all cases tested, the estimated mass and barycentre location parameters are found to be in good agreement with the actual values
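    The forward model behind such an inversion is simple for a point mass: the gravity gradient tensor is Γ_ij = GM(3 x_i x_j - r²δ_ij)/r⁵, whose trace vanishes because the potential satisfies Laplace's equation. A minimal sketch of the forward computation (the mass and geometry below are invented; a real inversion would fit this model to measured anomalies):

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gradient_tensor(mass, src, obs):
    """Gravity gradient tensor (s^-2) of a point mass at src, observed at obs.
    Both positions are (x, y, z) tuples in metres."""
    dx = [o - s for o, s in zip(obs, src)]
    r2 = sum(d * d for d in dx)
    r5 = r2 ** 2.5
    return [[G * mass * (3 * dx[i] * dx[j] - (r2 if i == j else 0.0)) / r5
             for j in range(3)] for i in range(3)]

# A 5,000-tonne object 30 m below the sensor track, offset 10 m horizontally:
T = gradient_tensor(5.0e6, (0.0, 0.0, -30.0), (10.0, 0.0, 0.0))
trace = T[0][0] + T[1][1] + T[2][2]
print(abs(trace) < 1e-18)  # trace-free, as required by Laplace's equation
```

Inverting for the mass and barycentre then amounts to least-squares fitting of this forward model to the full-tensor anomalies measured along the AUV track.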

  12. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation)

    Science.gov (United States)

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-01-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA), through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated, but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory.

  13. De novo analysis of electron impact mass spectra using fragmentation trees

    International Nuclear Information System (INIS)

    Hufsky, Franziska; Rempt, Martin; Rasche, Florian; Pohnert, Georg; Böcker, Sebastian

    2012-01-01

    Highlights: ► We present a method for de novo analysis of accurate mass EI mass spectra of small molecules. ► This method identifies the molecular ion and thus the molecular formula where the molecular ion is present in the spectrum. ► Fragmentation trees are constructed by automated signal extraction and evaluation. ► These trees explain relevant fragmentation reactions. ► This method will be very helpful in the automated analysis of unknown metabolites. - Abstract: The automated fragmentation analysis of high resolution EI mass spectra based on a fragmentation tree algorithm is introduced. Fragmentation trees are constructed from EI spectra by automated signal extraction and evaluation. These trees explain relevant fragmentation reactions and assign molecular formulas to fragments. The method enables the identification of the molecular ion and the molecular formula of a metabolite if the molecular ion is present in the spectrum. These identifications are independent of existing library knowledge and, thus, support assignment and structural elucidation of unknown compounds. The method works even if the molecular ion is of very low abundance or hidden under contaminants with higher masses. We apply the algorithm to a selection of 50 derivatized and underivatized metabolites and demonstrate that in 78% of cases the molecular ion can be correctly assigned. The automatically constructed fragmentation trees correspond very well to published mechanisms and allow the assignment of specific relevant fragments and fragmentation pathways even in the most complex EI-spectra in our dataset. This method will be very helpful in the automated analysis of metabolites that are not included in common libraries and it thus has the potential to support the explorative character of metabolomics studies.
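    A core subproblem in the method above is assigning a molecular formula to an accurate fragment or molecular-ion mass. This can be sketched as a bounded search over element counts; the CHNO-only element set, atom bounds, and ppm tolerance below are illustrative, not the paper's actual parameters:

```python
# Monoisotopic masses (u) of common elements in small metabolites
MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

def candidate_formulas(target, ppm=5.0, max_atoms=20):
    """Enumerate CcHhNnOo formulas whose monoisotopic mass matches target within ppm."""
    tol = target * ppm / 1e6
    hits = []
    for c in range(max_atoms + 1):
        for h in range(max_atoms + 1):
            for n in range(max_atoms + 1):
                for o in range(max_atoms + 1):
                    m = (c * MASS["C"] + h * MASS["H"]
                         + n * MASS["N"] + o * MASS["O"])
                    if abs(m - target) <= tol:
                        hits.append(f"C{c}H{h}N{n}O{o}")
    return hits

# The accurate mass of CO2 (43.98983 u) yields a single CHNO candidate:
print(candidate_formulas(43.98983))
```

Fragmentation-tree construction then connects such fragment formulas by plausible neutral losses, which is what lets the algorithm pick out the true molecular ion among the peaks.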

  14. Enantioselective determination of methylphenidate and ritalinic acid in whole blood from forensic cases using automated solid-phase extraction and liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Thomsen, Ragnar; B. Rasmussen, Henrik; Linnet, Kristian

    2012-01-01

    A chiral liquid chromatography tandem mass spectrometry (LC–MS-MS) method was developed and validated for quantifying methylphenidate and its major metabolite ritalinic acid in blood from forensic cases. Blood samples were prepared in a fully automated system by protein precipitation followed...... methylphenidate was not determined to be related to the cause of death, the femoral blood concentration of d-methylphenidate ranged from 5 to 58 ng/g, and from undetected to 48 ng/g for l-methylphenidate (median d/l-ratio 5.9). Ritalinic acid was present at concentrations 10–20 times higher with roughly equal...

  15. Digital computers in action

    CERN Document Server

    Booth, A D

    1965-01-01

    Digital Computers in Action is an introduction to the basics of digital computers as well as their programming and various applications in fields such as mathematics, science, engineering, economics, medicine, and law. Other topics include engineering automation, process control, special purpose games-playing devices, machine translation and mechanized linguistics, and information retrieval. This book consists of 14 chapters and begins by discussing the history of computers, from the idea of performing complex arithmetical calculations to the emergence of a modern view of the structure of a ge

  16. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book describes how to build an automation plan and design automation facilities. It covers automation of chip-producing processes (basics of cutting, NC machining, and chip handling); automation units such as drilling, tapping, boring, milling, and slide units; applications of hydraulics, including characteristics and basic hydraulic circuits; applications of pneumatics; and kinds of automation applied to processes such as assembly, transportation, automatic machines, and factory automation.

  17. RADARS, a bioinformatics solution that automates proteome mass spectral analysis, optimises protein identification, and archives data in a relational database.

    Science.gov (United States)

    Field, Helen I; Fenyö, David; Beavis, Ronald C

    2002-01-01

    RADARS, a rapid, automated, data archiving and retrieval software system for high-throughput proteomic mass spectral data processing and storage, is described. The majority of mass spectrometer data files are compatible with RADARS, for consistent processing. The system automatically takes unprocessed data files, identifies proteins via in silico database searching, then stores the processed data and search results in a relational database suitable for customized reporting. The system is robust, used in 24/7 operation, accessible to multiple users of an intranet through a web browser, may be monitored by Virtual Private Network, and is secure. RADARS is scalable for use on one or many computers, and is suited to multiple processor systems. It can incorporate any local database in FASTA format, and can search protein and DNA databases online. A key feature is a suite of visualisation tools (many available gratis), allowing facile manipulation of spectra, hand annotation, reanalysis, and access to all procedures. We also describe the use of Sonar MS/MS, a novel, rapid search engine requiring 40 MB RAM per process for searches against a genomic or EST database translated in all six reading frames. RADARS reduces the cost of analysis by its efficient algorithms: Sonar MS/MS can identify proteins without accurate knowledge of the parent ion mass and without protein tags. Statistical scoring methods provide close-to-expert accuracy and bring robust data analysis to the non-expert user.

  18. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    Science.gov (United States)

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

    Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the analytes initially suggested by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
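    The data-reduction step described, selecting analytes whose peak areas differ meaningfully between pre- and post-treatment samples, can be sketched as a simple fold-change filter. The analyte names, areas, and thresholds below are invented for illustration; this is not OCTpy's actual code:

```python
def select_changed_analytes(pre, post, fold=5.0, floor=1000.0):
    """Keep analytes whose peak area grew or shrank by at least `fold`
    between comparative samples; areas below `floor` are treated as absent."""
    changed = {}
    for name in set(pre) | set(post):
        a = max(pre.get(name, 0.0), floor)
        b = max(post.get(name, 0.0), floor)
        ratio = b / a
        if ratio >= fold or ratio <= 1.0 / fold:
            changed[name] = ratio
    return changed

# Hypothetical peak areas for soil extracts pre- and post-bioremediation:
pre  = {"pyrene": 9.0e5, "phenanthrene": 4.0e5, "unknown_321": 1.2e3}
post = {"pyrene": 1.0e5, "phenanthrene": 3.5e5, "unknown_321": 8.0e4}
print(sorted(select_changed_analytes(pre, post)))  # one degraded, one newly formed
```

Such a filter shrinks the candidate list handed to the analyst, which is where the reported tenfold reduction in analysis time comes from.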

  19. Mass spectrometry for protein quantification in biomarker discovery.

    Science.gov (United States)

    Wang, Mu; You, Jinsam

    2012-01-01

    Major technological advances have made proteomics an extremely active field for biomarker discovery in recent years due primarily to the development of newer mass spectrometric technologies and the explosion in genomic and protein bioinformatics. This leads to an increased emphasis on larger scale, faster, and more efficient methods for detecting protein biomarkers in human tissues, cells, and biofluids. Most current proteomic methodologies for biomarker discovery, however, are not highly automated and are generally labor-intensive and expensive. More automation and improved software programs capable of handling a large amount of data are essential to reduce the cost of discovery and to increase throughput. In this chapter, we discuss and describe mass spectrometry-based proteomic methods for quantitative protein analysis.

  20. Drilling Automation Tests At A Lunar/Mars Analog Site

    Science.gov (United States)

    Glass, B.; Cannon, H.; Hanagud, S.; Lee, P.; Paulsen, G.

    2006-01-01

    Future in-situ lunar/martian resource utilization and characterization, as well as the scientific search for life on Mars, will require access to the subsurface and hence drilling. Drilling on Earth is hard - an art form more than an engineering discipline. The limited mass, energy and manpower in planetary drilling situations make application of terrestrial drilling techniques problematic. The Drilling Automation for Mars Exploration (DAME) project is developing drilling automation and robotics for projected use in missions to the Moon and Mars in the 2011-15 period. This has been tested recently, drilling in permafrost at a lunar/martian analog site (Haughton Crater, Devon Island, Canada).

  1. Decision Making In A High-Tech World: Automation Bias and Countermeasures

    Science.gov (United States)

    Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management system computers not only fly the aircraft, but also calculate fuel efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self reports, in studies using pilots in flight simulations, and in non-flight decision making contexts with student samples. Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and

  2. ParticipACTION: Overview and introduction of baseline research on the "new" ParticipACTION

    Directory of Open Access Journals (Sweden)

    Craig Cora L

    2009-12-01

    Full Text Available Abstract Background This paper provides a brief overview of the Canadian physical activity communications and social marketing organization "ParticipACTION"; introduces the "new" ParticipACTION; describes the research process leading to the collection of baseline data on the new ParticipACTION; and outlines the accompanying series of papers in the supplement presenting the detailed baseline data. Methods Information on ParticipACTION was gathered from close personal involvement with the organization, from interviews and meetings with key leaders of the organization, from published literature and from ParticipACTION archives. In 2001, after nearly 30 years of operation, ParticipACTION ceased operations because of inadequate funding. In February 2007 the organization was officially resurrected and the launch of the first mass media campaign of the "new" ParticipACTION occurred in October 2007. The six-year absence of ParticipACTION, or any equivalent substitute, provided a unique opportunity to examine the impact of a national physical activity social marketing organization on important individual and organizational level indicators of success. A rapid response research team was established in January 2007 to exploit this natural intervention research opportunity. Results The research team was successful in obtaining funding through the new Canadian Institutes of Health Research Intervention Research (Healthy Living and Chronic Disease Prevention) Funding Program. Data were collected on individuals and organizations prior to the complete implementation of the first mass media campaign of the new ParticipACTION. Conclusion Rapid response research and funding mechanisms facilitated the collection of baseline information on the new ParticipACTION. These data will allow for comprehensive assessments of future initiatives of ParticipACTION.

  3. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  4. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    Science.gov (United States)

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculations of spiking solutions and the matrix solutions preparation scheme, the actual spiking and matrix solutions preparations, as well as the flexible sample extraction procedures after incubation. In addition, the platform also automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process a whole class of assays with varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
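    The nonlinear regression step mentioned above typically fits a four-parameter logistic (4PL) model, y = bottom + (top - bottom) / (1 + (x/IC50)^hill). A minimal sketch that recovers the IC50 from synthetic data by grid search, a deliberately crude stand-in for a proper least-squares fit; all values below are invented:

```python
def four_pl(x, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response model."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

# Synthetic responses generated from a known curve (true IC50 = 50 nM):
doses = [1, 3, 10, 30, 100, 300, 1000, 3000]
obs = [four_pl(x, 0.0, 100.0, 50.0, 1.0) for x in doses]

# Grid-search the IC50 by minimising the sum of squared residuals,
# holding the other three parameters fixed for simplicity:
best = min(
    (sum((four_pl(x, 0.0, 100.0, c, 1.0) - y) ** 2 for x, y in zip(doses, obs)), c)
    for c in range(1, 1001)
)
print(best[1])  # recovered IC50 = 50
```

A production platform would instead fit all four parameters simultaneously with a nonlinear least-squares routine and report confidence intervals on the IC50.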

  5. Effective action and brane running

    International Nuclear Information System (INIS)

    Brevik, Iver; Ghoroku, Kazuo; Yahiro, Masanobu

    2004-01-01

    We address the renormalized effective action for a Randall-Sundrum brane running in 5D bulk space. The running behavior of the brane action is obtained by shifting the brane position without changing the background and fluctuations. After an appropriate renormalization, we obtain an effective, low energy brane world action, in which the effective 4D Planck mass is independent of the running position. We address some implications for this effective action

  6. Detection and identification of drugs and toxicants in human body fluids by liquid chromatography-tandem mass spectrometry under data-dependent acquisition control and automated database search.

    Science.gov (United States)

    Oberacher, Herbert; Schubert, Birthe; Libiseller, Kathrin; Schweissgut, Anna

    2013-04-03

    Systematic toxicological analysis (STA) is aimed at detecting and identifying all substances of toxicological relevance (i.e. drugs, drugs of abuse, poisons and/or their metabolites) in biological material. Particularly, gas chromatography-mass spectrometry (GC/MS) represents a competent and commonly applied screening and confirmation tool. Herein, we present an untargeted liquid chromatography-tandem mass spectrometry (LC/MS/MS) assay intended to complement existing GC/MS screening for the detection and identification of drugs in blood, plasma and urine samples. Solid-phase extraction was accomplished on mixed-mode cartridges. LC was based on gradient elution on a miniaturized C18 column. High resolution electrospray ionization-MS/MS in positive ion mode with data-dependent acquisition control was used to generate tandem mass spectral information that enabled compound identification via automated library search in the "Wiley Registry of Tandem Mass Spectral Data, MSforID". Fitness of the developed LC/MS/MS method for application in STA in terms of selectivity, detection capability and reliability of identification (sensitivity/specificity) was demonstrated with blank samples, certified reference materials, proficiency test samples, and authentic casework samples. Copyright © 2013 Elsevier B.V. All rights reserved.
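    Automated library search of tandem mass spectra commonly scores the measured spectrum against each library entry with a dot-product (cosine) similarity over matched peaks. A minimal sketch, with spectra as {m/z: intensity} dicts already binned to one decimal; the entries are invented and this is not the MSforID matching algorithm itself:

```python
import math

def cosine_similarity(spec_a, spec_b):
    """Dot-product similarity of two centroided spectra given as {mz: intensity}."""
    shared = set(spec_a) & set(spec_b)
    dot = sum(spec_a[mz] * spec_b[mz] for mz in shared)
    na = math.sqrt(sum(v * v for v in spec_a.values()))
    nb = math.sqrt(sum(v * v for v in spec_b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical query spectrum and a tiny two-entry library:
query   = {152.1: 100.0, 110.1: 35.0, 93.0: 12.0}
library = {"acetaminophen-like": {152.1: 95.0, 110.1: 40.0, 93.0: 10.0},
           "unrelated":          {201.2: 80.0, 57.0: 60.0}}
best = max(library, key=lambda name: cosine_similarity(query, library[name]))
print(best)
```

Real library searches add precursor-mass filtering, m/z tolerance windows when matching peaks, and intensity weighting, but the ranking principle is the same.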

  7. Comparison of manual and automated pretreatment methods for AMS radiocarbon dating of plant fossils

    Science.gov (United States)

    Bradley, L.A.; Stafford, Thomas W.

    1994-01-01

    A new automated pretreatment system for the preparation of materials submitted for accelerator mass spectrometry (AMS) analysis is less time-consuming and results in a higher sample yield. The new procedure was tested using two groups of plant fossils: one group was pretreated using the traditional method, and the second, using the automated pretreatment apparatus. The time it took to complete the procedure and the amount of sample material remaining were compared. The automated pretreatment apparatus proved to be more than three times faster and, in most cases, produced a higher yield. A darker discoloration of the KOH solutions was observed indicating that the automated system is more thorough in removing humates from the specimen compared to the manual method. -Authors

  8. Portable data collection terminal in the automated power consumption measurement system

    Science.gov (United States)

    Vologdin, S. V.; Shushkov, I. D.; Bysygin, E. K.

    2018-01-01

    Increasing the efficiency and automating the process of electric energy data collection and processing is very important at present. The high cost of classic electric energy billing systems prevents their mass application. The Udmurtenergo Branch of IDGC of Center and Volga Region developed an electronic automated system called “Mobile Energy Billing” based on data collection terminals. The system joins electronic components based on a service-oriented architecture (WCF services). At present, all parts of the Udmurtenergo Branch electric network are connected to the “Mobile Energy Billing” project. The system's capabilities can be expanded owing to its flexible architecture.

  9. DoD Actions Were Not Adequate to Reduce Improper Travel Payments

    Science.gov (United States)

    2016-03-10

    vouchers in near real time and identifies duplicate or incorrect payments. DoD Components developed corrective actions that did not include steps to...causes of improper payments. In addition, many of the payment errors were not preventable through real-time or post-payment automated validation checks. March 10, 2016. Report No. DODIG-2016-060, DoD Actions Were Not Adequate to Reduce Improper Travel Payments. Mission: Our mission is to provide

  10. Creatine Supplementation and Skeletal Muscle Metabolism for Building Muscle Mass- Review of the Potential Mechanisms of Action.

    Science.gov (United States)

    Farshidfar, Farnaz; Pinder, Mark A; Myrie, Semone B

    2017-01-01

    Creatine, a very popular supplement among athletic populations, is of growing interest for clinical applications. Since over 90% of creatine is stored in skeletal muscle, the effect of creatine supplementation on muscle metabolism is a widely studied area. While numerous studies over the past few decades have shown that creatine supplementation has many favorable effects on skeletal muscle physiology and metabolism, including enhancing muscle mass (growth/hypertrophy), the underlying mechanisms are poorly understood. This report reviews studies addressing the mechanisms of action of creatine supplementation on skeletal muscle growth/hypertrophy. Early research proposed that the osmotic effect of creatine supplementation serves as a cellular stressor (osmosensing) that acts as an anabolic stimulus for protein synthesis signal pathways. Other reports indicated that creatine directly affects muscle protein synthesis via modulations of components in the mammalian target of rapamycin (mTOR) pathway. Creatine may also directly affect the myogenic process (formation of muscle tissue) by altering secretions of myokines, such as myostatin and insulin-like growth factor-1, and expressions of myogenic regulatory factors, resulting in enhanced satellite cell mitotic activity and differentiation into myofibers. Overall, there is still no clear understanding of the mechanisms of action regarding how creatine affects muscle mass/growth, but current evidence suggests it may exert its effects through multiple approaches, with converging impacts on protein synthesis and myogenesis. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  11. Connecting imaging mass spectrometry and magnetic resonance imaging-based anatomical atlases for automated anatomical interpretation and differential analysis.

    Science.gov (United States)

    Verbeeck, Nico; Spraggins, Jeffrey M; Murphy, Monika J M; Wang, Hui-Dong; Deutch, Ariel Y; Caprioli, Richard M; Van de Plas, Raf

    2017-07-01

    Imaging mass spectrometry (IMS) is a molecular imaging technology that can measure thousands of biomolecules concurrently without prior tagging, making it particularly suitable for exploratory research. However, the data size and dimensionality often makes thorough extraction of relevant information impractical. To help guide and accelerate IMS data analysis, we recently developed a framework that integrates IMS measurements with anatomical atlases, opening up opportunities for anatomy-driven exploration of IMS data. One example is the automated anatomical interpretation of ion images, where empirically measured ion distributions are automatically decomposed into their underlying anatomical structures. While offering significant potential, IMS-atlas integration has thus far been restricted to the Allen Mouse Brain Atlas (AMBA) and mouse brain samples. Here, we expand the applicability of this framework by extending towards new animal species and a new set of anatomical atlases retrieved from the Scalable Brain Atlas (SBA). Furthermore, as many SBA atlases are based on magnetic resonance imaging (MRI) data, a new registration pipeline was developed that enables direct non-rigid IMS-to-MRI registration. These developments are demonstrated on protein-focused FTICR IMS measurements from coronal brain sections of a Parkinson's disease (PD) rat model. The measurements are integrated with an MRI-based rat brain atlas from the SBA. The new rat-focused IMS-atlas integration is used to perform automated anatomical interpretation and to find differential ions between healthy and diseased tissue. IMS-atlas integration can serve as an important accelerator in IMS data exploration, and with these new developments it can now be applied to a wider variety of animal species and modalities. This article is part of a Special Issue entitled: MALDI Imaging, edited by Dr. Corinna Henkel and Prof. Peter Hoffmann. Copyright © 2017. Published by Elsevier B.V.
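The automated anatomical interpretation described above, in which a measured ion distribution is decomposed into underlying anatomical structures, can be read as a constrained regression problem. The sketch below is an illustrative interpretation of that idea, not the authors' actual pipeline: `decompose_ion_image`, the toy masks, and the use of non-negative least squares are all assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def decompose_ion_image(ion_image, structure_masks):
    """Express a flattened ion image as a non-negative combination of
    anatomical structure masks (one column per structure)."""
    A = np.column_stack([m.ravel().astype(float) for m in structure_masks])
    y = ion_image.ravel().astype(float)
    coeffs, residual = nnls(A, y)  # non-negative least squares
    return coeffs, residual

# Two toy 2x2 "structures" and an ion image built from them
m1 = np.array([[1, 1], [0, 0]])
m2 = np.array([[0, 0], [1, 1]])
img = 3.0 * m1 + 0.5 * m2
coeffs, res = decompose_ion_image(img, [m1, m2])
print(coeffs)  # approximately [3.0, 0.5]
```

In a real application the masks would come from the registered atlas (here, the SBA rat brain atlas after MRI-based non-rigid registration), with one mask per annotated structure.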

  12. Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report: Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A)... Program Name: Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A). DoD Component: Air Force. Responsible Office: Program... APB dated March 9, 2015. DCAPES Inc 2A 2016 MAR UNCLASSIFIED 4. Program Description: Deliberate and Crisis Action Planning and Execution Segments

  13. Deliberate and Crisis Action Planning and Execution Segments Increment 2B (DCAPES Inc 2B)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report: Deliberate and Crisis Action Planning and Execution Segments Increment 2B (DCAPES Inc 2B)... Information: Program Name: Deliberate and Crisis Action Planning and Execution Segments Increment 2B (DCAPES Inc 2B). DoD Component: Air Force. Responsible Office... been established. DCAPES Inc 2B 2016 MAR UNCLASSIFIED 4. Program Description: Deliberate and Crisis Action Planning and Execution Segments (DCAPES) is

  14. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a designed and implemented real-time, software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system that detects the presence of unexpected behavior.

  15. Team play with a powerful and independent agent: operational experiences and automation surprises on the Airbus A-320

    Science.gov (United States)

    Sarter, N. B.; Woods, D. D.

    1997-01-01

    Research and operational experience have shown that one of the major problems with pilot-automation interaction is a lack of mode awareness (i.e., awareness of the current and future status and behavior of the automation). As a result, pilots sometimes experience so-called automation surprises when the automation takes an unexpected action or fails to behave as anticipated. A lack of mode awareness and automation surprises can be viewed as symptoms of a mismatch between human and machine properties and capabilities. Changes in automation design can therefore be expected to affect the likelihood and nature of problems encountered by pilots. Previous studies have focused exclusively on early generation "glass cockpit" aircraft that were designed based on a similar automation philosophy. To find out whether similar difficulties with maintaining mode awareness are encountered on more advanced aircraft, a corpus of automation surprises was gathered from pilots of the Airbus A-320, an aircraft characterized by high levels of autonomy, authority, and complexity. To understand the underlying reasons for reported breakdowns in human-automation coordination, we also asked pilots about their monitoring strategies and their experiences with and attitude toward the unique design of flight controls on this aircraft.

  16. 77 FR 30433 - Privacy Act of 1974: Implementation of Exemptions; Automated Targeting System

    Science.gov (United States)

    2012-05-23

    ... Border Protection, Mint Annex, 799 Ninth Street NW., Washington, DC 20229. For privacy issues please... Secretary 6 CFR Part 5 [Docket No. DHS-2012-0020] Privacy Act of 1974: Implementation of Exemptions; Automated Targeting System AGENCY: Privacy Office, DHS. ACTION: Notice of proposed rulemaking. SUMMARY: The...

  17. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
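The two proposed measures, a system automation rate based on the reduction in the number of operator tasks and a cognitive automation rate based on the reduction in cognitive task load, can be sketched as fractional reductions. The function names and the example numbers below are illustrative assumptions; the paper's exact formulation (including the Conant model used to quantify cognitive load) is not reproduced here.

```python
def system_automation_rate(n_tasks_manual, n_tasks_with_automation):
    """Fraction of operator tasks removed by automation (one reading of
    the 'reduction in the number of tasks' measure)."""
    return 1.0 - n_tasks_with_automation / n_tasks_manual

def cognitive_automation_rate(load_manual_bits, load_automated_bits):
    """Fraction of the operator's cognitive task load removed by automation,
    with loads quantified e.g. in bits by an information-theoretic task model."""
    return 1.0 - load_automated_bits / load_manual_bits

print(system_automation_rate(40, 25))          # 0.375
print(cognitive_automation_rate(120.0, 66.0))  # ~0.45
```

The point of the paper is precisely that the second measure, not the raw inclusion proportion of automated facilities, tracks the observed reduction in working time.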

  18. Supersymmetric Dirac-Born-Infeld action with self-dual mass term

    International Nuclear Information System (INIS)

    Nishino, Hitoshi; Rajpoot, Subhash; Reed, Kevin

    2005-01-01

    We introduce a Dirac-Born-Infeld action to a self-dual N = 1 supersymmetric vector multiplet in three dimensions. This action is based on the supersymmetric generalized self-duality in odd dimensions developed originally by Townsend, Pilch and van Nieuwenhuizen. Even though such a self-duality had been supposed to be very difficult to generalize to a supersymmetrically interacting system, we show that the Dirac-Born-Infeld action is actually compatible with supersymmetry and self-duality in three dimensions, even though the original self-duality receives corrections by the Dirac-Born-Infeld action. The interactions can be further generalized to arbitrary (non)polynomial interactions. As a by-product, we also show that a third-rank field strength leads to a more natural formulation of self-duality in 3D. We also show an interesting role played by the third-rank field strength leading to supersymmetry breaking, in addition to accommodating a Chern-Simons form

  19. Sloan Digital Sky Survey photometric telescope automation and observing software

    International Nuclear Information System (INIS)

    Eric H. Neilsen, Jr. (neilsen@fnal.gov)

    2002-01-01

    The photometric telescope (PT) provides observations necessary for the photometric calibration of the Sloan Digital Sky Survey (SDSS). Because the attention of the observing staff is occupied by the operation of the 2.5 meter telescope which takes the survey data proper, the PT must reliably take data with little supervision. In this paper we describe the PT's observing program, MOP, which automates most tasks necessary for observing. MOP's automated target selection is closely modeled on the actions a human observer might take, and is built upon a user interface that can be (and has been) used for manual operation. This results in an interface that makes it easy for an observer to track the activities of the automating procedures and intervene with minimum disturbance when necessary. MOP selects targets from the same list of standard star and calibration fields presented to the user, and chooses standard star fields covering ranges of airmass, color, and time necessary to monitor atmospheric extinction and produce a photometric solution. The software determines when additional standard star fields are unnecessary, and selects survey calibration fields according to availability and priority. Other automated features of MOP, such as maintaining the focus and keeping a night log, are also built around still functional manual interfaces, allowing the observer to be as active in observing as desired; MOP's automated features may be used as tools for manual observing, ignored entirely, or allowed to run the telescope with minimal supervision when taking routine data.
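A target-selection rule of the kind described, choosing a visible standard star field that fills a not-yet-covered airmass bin, might look like the toy selector below. The field names, the data layout, and the selection criterion are invented for illustration; MOP's real rules also account for color coverage, time, and priority.

```python
def pick_standard_field(fields, needed_airmass, now):
    """Choose the currently visible standard star field whose airmass is
    closest to an airmass bin still needed for the extinction solution."""
    visible = [f for f in fields if f["rise"] <= now <= f["set"]]
    return min(visible, key=lambda f: abs(f["airmass"] - needed_airmass))

# Hypothetical fields with visibility windows in arbitrary time units
fields = [
    {"name": "SA92", "airmass": 1.1, "rise": 0, "set": 10},
    {"name": "SA95", "airmass": 1.8, "rise": 2, "set": 12},
]
print(pick_standard_field(fields, 1.7, now=5)["name"])  # SA95
```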

  20. A simple automated solid-phase extraction procedure for measurement of 25-hydroxyvitamin D3 and D2 by liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Knox, Susan; Harris, John; Calton, Lisa; Wallace, A Michael

    2009-05-01

    Measurement of 25-hydroxyvitamin D(3) (25OHD(3)) and D(2) (25OHD(2)) is challenging. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) methods have been described but they are often complex and difficult to automate. We have developed a simplified procedure involving an automated solid-phase extraction (SPE). Internal standard (hexadeuterated 25-hydroxyvitamin D(3)) was added to serum or plasma followed by protein precipitation with methanol. Following centrifugation, a robotic instrument (CTC PAL [Presearch] for ITSP SPE [MicroLiter Analytical Supplies, Inc]) performed a six-step SPE procedure and the purified samples were injected into the LC-MS/MS. Quantification of 25OHD(3) and 25OHD(2) was by electrospray ionization MS/MS in the multiple-reaction monitoring mode. The lower limit of quantitation was 4.0 nmol/L for 25OHD(3) and 7.5 nmol/L for 25OHD(2). Within- and between-assay precision was below 10% over the concentration range of 22.5-120 nmol/L for D(3) and 17.5-70 nmol/L for D(2) (n = 10). The calibration was linear up to 2500 nmol/L (r = 0.99). Recoveries ranged between 89% and 104% for both metabolites and no ion suppression was observed. The results obtained compared well (r = 0.96) with the IDS-OCTEIA 25-hydroxyvitamin D enzyme immunoassay for samples containing less than 125 nmol/L, at higher concentrations the immunodiagnostic system (IDS) method showed positive bias. Our simplified sample preparation and automated SPE method is suitable for the measurement of 25OHD(3) and D(2) in a routine laboratory environment. The system can process up to 300 samples per day with no cumbersome solvent evaporation step and minimal operator intervention.
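A laboratory system consuming these results would typically apply range checks against the reported assay limits (lower limits of quantitation of 4.0 nmol/L for 25OHD3 and 7.5 nmol/L for 25OHD2, with linearity up to 2500 nmol/L). The function name and return strings below are illustrative, not part of the published method.

```python
# Quantitation limits taken from the abstract; linearity limit applies to both.
LLOQ_NMOL_L = {"25OHD3": 4.0, "25OHD2": 7.5}
LINEAR_MAX_NMOL_L = 2500.0

def flag_25ohd_result(analyte, value_nmol_l):
    """Flag a result that falls outside the assay's quantitation range."""
    if value_nmol_l < LLOQ_NMOL_L[analyte]:
        return "below LLOQ"
    if value_nmol_l > LINEAR_MAX_NMOL_L:
        return "above linear range"
    return "quantifiable"

print(flag_25ohd_result("25OHD2", 5.0))   # below LLOQ
print(flag_25ohd_result("25OHD3", 80.0))  # quantifiable
```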

  1. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  2. Automated Electrophysiology Makes the Pace for Cardiac Ion Channel Safety Screening

    Directory of Open Access Journals (Sweden)

    Clemens eMoeller

    2011-11-01

    The field of automated patch-clamp electrophysiology has emerged from the tension between the pharmaceutical industry’s need for high-throughput compound screening versus its need to be conservative due to regulatory requirements. On the one hand, hERG channel screening was increasingly requested for new chemical entities, as the correlation between blockade of the ion channel coded by hERG and Torsades de Pointes cardiac arrhythmia gained increasing attention. On the other hand, manual patch-clamping, typically quoted as the gold-standard for understanding ion channel function and modulation, was far too slow (and, consequently, too expensive for keeping pace with the numbers of compounds submitted for hERG channel investigations from pharmaceutical R&D departments. In consequence it became more common for some pharmaceutical companies to outsource safety pharmacological investigations, with a focus on hERG channel interactions. This outsourcing has allowed those pharmaceutical companies to build up operational flexibility and greater independence from internal resources, and allowed them to obtain access to the latest technological developments that emerged in automated patch-clamp electrophysiology – much of which arose in specialized biotech companies. Assays for nearly all major cardiac ion channels are now available by automated patch-clamping using heterologous expression systems, and recently, automated action potential recordings from stem-cell derived cardiomyocytes have been demonstrated. Today, most of the large pharmaceutical companies have acquired automated electrophysiology robots and have established various automated cardiac ion channel safety screening assays on these, in addition to outsourcing parts of their needs for safety screening.

  3. Testing the hierarchy of effects model: ParticipACTION's serial mass communication campaigns on physical activity in Canada.

    Science.gov (United States)

    Craig, C L; Bauman, A; Reger-Nash, B

    2010-03-01

    The hierarchy of effects (HOE) model is often used in planning mass-reach communication campaigns to promote health, but has rarely been empirically tested. This paper examines Canada's 30 year ParticipACTION campaign to promote physical activity (PA). A cohort from the nationally representative 1981 Canada Fitness Survey was followed up in 1988 and 2002-2004. Modelling of these data tested whether the mechanisms of campaign effects followed the theoretical framework proposed in the HOE. Campaign awareness was measured in 1981. Outcome expectancy, attitudes, decision balance and future intention were asked in 1988. PA was assessed at all time points. Logistic regression was used to sequentially test mediating and moderating variables adjusting for age, sex and education. No selection bias was observed; however, relatively fewer respondents than non-respondents smoked or were underweight at baseline. Among those inactive at baseline, campaign awareness predicted outcome expectancy which in turn predicted positive attitude to PA. Positive attitudes predicted high decision balance, which predicted future intention. Future intention mediated the relationship between decision balance and sufficient activity. Among those sufficiently active at baseline, awareness was unrelated to outcome expectancy and inversely related to positive attitude. These results lend support to the HOE model, in that the effects of ParticipACTION's serial mass media campaigns were consistent with the sequential rollout of its messages, which in turn was associated with achieving an active lifestyle among those initially insufficiently active. This provides support to an often-used theoretical framework for designing health promotion media campaigns.

  4. Analytical and clinical performance of the new Fujirebio 25-OH vitamin D assay, a comparison with liquid chromatography-tandem mass spectrometry (LC-MS/MS) and three other automated assays

    OpenAIRE

    Saleh, Lanja; Mueller, Daniel; von Eckardstein, Arnold

    2015-01-01

    BACKGROUND: We evaluated the analytical and clinical performance of the new Lumipulse® G 25-OH vitamin D assay from Fujirebio, and compared it to a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method and three other commercial automated assays. METHODS: Total 25 hydroxy vitamin D (25(OH)D) levels were measured in 100 selected serum samples from our routine analysis with Fujirebio 25(OH)D assay. The results were compared with those obtained with LC-MS/MS and three other automat...

  5. Automated Assessment of Left Ventricular Function and Mass Using Heart Deformation Analysis: Initial Experience in 160 Older Adults.

    Science.gov (United States)

    Lin, Kai; Collins, Jeremy D; Lloyd-Jones, Donald M; Jolly, Marie-Pierre; Li, Debiao; Markl, Michael; Carr, James C

    2016-03-01

    To assess the performance of automated quantification of left ventricular function and mass based on heart deformation analysis (HDA) in asymptomatic older adults. This study complied with Health Insurance Portability and Accountability Act regulations. Following the approval of the institutional review board, 160 asymptomatic older participants were recruited for cardiac magnetic resonance imaging including two-dimensional cine images covering the entire left ventricle in short-axis view. Data analysis included the calculation of left ventricular ejection fraction (LVEF), left ventricular mass (LVM), and cardiac output (CO) using HDA and standard global cardiac function analysis (delineation of end-systolic and end-diastolic left ventricle epi- and endocardial borders). The agreement between methods was evaluated using intraclass correlation coefficient (ICC) and coefficient of variation (CoV). HDA had a shorter processing time than the standard method (1.5 ± 0.3 min/case vs. 5.8 ± 1.4 min/case, P HDA. There was a systematic bias toward lower LVEF (62.8% ± 8.3% vs. 69.3% ± 6.7%, P HDA compared to the standard technique. Conversely, HDA overestimated LVM (114.8 ± 30.1 g vs. 100.2 ± 29.0 g, P HDA has the potential to measure LVEF, CO, and LVM without the need for user interaction based on standard cardiac two-dimensional cine images. Copyright © 2015 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
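Method-comparison summaries like the bias and coefficient of variation reported here can be computed from paired measurements. The sketch below uses made-up numbers, not the study's data, and one common convention (CoV of the paired differences relative to the grand mean); the study's exact statistical definitions may differ.

```python
import statistics

def bias_and_cov(method_a, method_b):
    """Mean bias (a - b) and coefficient of variation (%) of the paired
    differences, a simple summary of agreement between two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    means = [(a + b) / 2 for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    cov = 100 * statistics.stdev(diffs) / statistics.mean(means)
    return bias, cov

# Hypothetical paired LVEF readings (%): automated vs. standard analysis
hda = [62.0, 60.5, 65.0, 61.0]
std = [69.0, 67.5, 70.0, 68.5]
bias, cov = bias_and_cov(hda, std)
print(round(bias, 1))  # -6.6, i.e. the automated method reads lower
```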

  6. Technology assessment of automation trends in the modular home industry

    Science.gov (United States)

    Phil Mitchell; Robert Russell Hurst

    2009-01-01

    This report provides an assessment of technology used in manufacturing modular homes in the United States, and that used in the German prefabricated wooden home industry. It is the first step toward identifying the research needs in automation and manufacturing methods that will facilitate mass customization in the home manufacturing industry. Within the United States...

  7. The Automator: Intelligent control system monitoring

    International Nuclear Information System (INIS)

    M. Bickley; D.A. Bryan; K.S. White

    1999-01-01

    A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented.
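The rule-evaluation core of such a monitoring engine can be sketched in a few lines. This is a toy rendering of the proposed design, not the Automator's actual implementation; the channel names, rule format, and action strings are invented for illustration.

```python
def run_automator(rules, channels):
    """Evaluate user-configured monitoring rules against current
    control-point values and return the list of actions to take."""
    actions = []
    for rule in rules:
        value = channels.get(rule["channel"])
        if value is not None and rule["predicate"](value):
            actions.append(rule["action"])
    return actions

# Hypothetical rules: alarm on low beam current, open a valve on high temperature
rules = [
    {"channel": "beam_current", "predicate": lambda v: v < 10.0,
     "action": "trigger_alarm:low_beam"},
    {"channel": "magnet_temp", "predicate": lambda v: v > 80.0,
     "action": "set_point:cooling_valve=open"},
]
print(run_automator(rules, {"beam_current": 4.2, "magnet_temp": 55.0}))
# ['trigger_alarm:low_beam']
```

In the proposed tool the rule set would be loaded from a configuration file and evaluated continuously against live channel values served by the middleware.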

  8. Functional proteomics with new mass spectrometric and bioinformatics tools

    International Nuclear Information System (INIS)

    Kesners, P.W.A.

    2001-01-01

    A comprehensive range of mass spectrometric tools is required to investigate today's life science applications, with a strong focus on addressing the needs of functional proteomics. Application examples are given showing the streamlined process of protein identification from low-femtomole amounts of digests. Sample preparation is achieved with a convertible robot for automated 2D gel picking and MALDI target dispensing, followed by MALDI-TOF or ESI-MS subsequent to enzymatic digestion. A choice of mass spectrometers, including Q-q-TOF with multipass capability, MALDI-MS/MS with unsegmented PSD, Ion Trap and FT-MS, is discussed for their respective strengths and applications. Bioinformatics software that allows both database work and novel peptide mass spectra interpretation is reviewed. The automated database searching uses either entire-digest LC-MSn ESI Ion Trap data or MALDI MS and MS/MS spectra. It is shown how post-translational modifications are interactively uncovered and de novo sequencing of peptides is facilitated.
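Peptide-mass database searching of the kind automated here rests on matching measured masses against theoretical digest masses. A minimal, simplified in-silico tryptic digest is sketched below: the residue masses are standard monoisotopic values, but the cleavage rule is naive (cut after every K or R, ignoring proline suppression, missed cleavages, and modifications).

```python
# Monoisotopic residue masses (Da) for a subset of amino acids;
# one water mass is added per peptide.
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "L": 113.08406, "N": 114.04293,
    "D": 115.02694, "K": 128.09496, "E": 129.04259, "F": 147.06841,
    "R": 156.10111,
}
WATER = 18.01056

def tryptic_peptide_masses(sequence):
    """In-silico tryptic digest: theoretical peptide monoisotopic masses
    for peptide-mass-fingerprint database matching."""
    peptides, current = [], ""
    for aa in sequence:
        current += aa
        if aa in ("K", "R"):  # trypsin cleaves C-terminal to Lys/Arg
            peptides.append(current)
            current = ""
    if current:
        peptides.append(current)
    return {p: sum(RESIDUE_MASS[a] for a in p) + WATER for p in peptides}

print(tryptic_peptide_masses("GASPK"))  # one peptide of ~458.249 Da
```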

  9. Micronucleus test for radiation biodosimetry in mass casualty events: Evaluation of visual and automated scoring

    Energy Technology Data Exchange (ETDEWEB)

    Bolognesi, Claudia, E-mail: claudia.bolognesi@istge.i [Environmental Carcinogenesis Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Balia, Cristina; Roggieri, Paola [Environmental Carcinogenesis Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Cardinale, Francesco [Clinical Epidemiology Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Department of Health Sciences, University of Genoa, Genoa (Italy); Bruzzi, Paolo [Clinical Epidemiology Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Sorcinelli, Francesca [Environmental Carcinogenesis Unit, National Cancer Research Institute, Largo R. Benzi 10, 16132 Genoa (Italy); Laboratory of Genetics, Histology and Molecular Biology Section, Army Medical and Veterinary, Research Center, Via Santo Stefano Rotondo 4, 00184 Roma (Italy); Lista, Florigio [Laboratory of Genetics, Histology and Molecular Biology Section, Army Medical and Veterinary, Research Center, Via Santo Stefano Rotondo 4, 00184 Roma (Italy); D' Amelio, Raffaele [Sapienza, Universita di Roma II Facolta di Medicina e Chirurgia and Ministero della Difesa, Direzione Generale Sanita Militare (Italy); Righi, Enzo [Frascati National Laboratories, National Institute of Nuclear Physics, Via Enrico Fermi 40, 00044 Frascati, Rome (Italy)

    2011-02-15

    In the case of a large-scale nuclear or radiological incident a reliable estimate of dose is an essential tool for providing timely assessment of radiation exposure and for making life-saving medical decisions. Cytogenetics is considered the 'gold standard' for biodosimetry. The dicentric analysis (DA) represents the most specific cytogenetic bioassay. The micronucleus test (MN) applied in interphase in peripheral lymphocytes is an alternative and simpler approach. A dose-effect calibration curve for the MN frequency in peripheral lymphocytes from 27 adult donors was established after in vitro irradiation over a dose range of 0.15-8 Gy of ¹³⁷Cs gamma rays (dose rate 6 Gy min⁻¹). Dose prediction by visual scoring in a dose-blinded study (0.15-4.0 Gy) revealed a high level of accuracy (R = 0.89). The scoring of MN is time consuming and requires adequate skills and expertise. Automated image analysis is a feasible approach, making it possible to reduce the time and increase the accuracy of the dose estimation by decreasing the variability due to subjective evaluation. A good correlation (R = 0.705) between visual and automated scoring with visual correction was observed over the dose range 0-2 Gy. Almost perfect discrimination power for exposure to 1-2 Gy, and a satisfactory power for 0.6 Gy, were detected. This threshold level can be considered sufficient for identification of sublethally exposed individuals by automated CBMN assay.
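Dose estimation from a calibration curve of this kind typically means inverting a fitted dose-response relation. Cytogenetic dose-response data are often fitted with a linear-quadratic model, Y = c + aD + bD²; the coefficients below are illustrative placeholders, not the values fitted in this study.

```python
import math

# Hypothetical linear-quadratic calibration: Y = C + A*D + B*D^2,
# with Y the micronucleus frequency and D the absorbed dose in Gy.
C, A, B = 0.012, 0.035, 0.018

def estimate_dose(mn_frequency):
    """Invert the calibration curve for absorbed dose D (Gy), taking the
    positive root of B*D^2 + A*D + (C - Y) = 0."""
    if mn_frequency < C:
        raise ValueError("frequency below background level")
    disc = A * A + 4 * B * (mn_frequency - C)
    return (-A + math.sqrt(disc)) / (2 * B)

y = C + A * 2.0 + B * 4.0          # frequency the model predicts at 2 Gy
print(round(estimate_dose(y), 3))  # 2.0
```

Operationally one would also propagate the uncertainty of the fitted coefficients into confidence limits on the estimated dose, which the sketch omits.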

  10. The surveillance state of behavioral automation

    Science.gov (United States)

    Schaefer, Andreas T; Claridge-Chang, Adam

    2012-01-01

    Genetics’ demand for increased throughput is driving automatization of behavior analysis far beyond experimental workhorses like circadian monitors and the operant conditioning box. However, the new automation is not just faster: it is also allowing new kinds of experiments, many of which erase the boundaries of the traditional neuroscience disciplines (psychology, ethology and physiology) while producing insight into problems that were otherwise opaque. Ironically, a central theme of current automatization is to improve observation of animals in increasingly naturalistic environments. This is not just a return to 19th century priorities: the new observational methods provide unprecedented quantitation of actions and ever-closer integration with experimentation. PMID:22119142

  11. Mass screening in breast cancer

    International Nuclear Information System (INIS)

    Strax, P.

    1977-01-01

    Some questions about mass screening in breast cancer are answered, it being concluded that: 1. mass screening for the detection of early breast cancer is the only means with proven potential for lowering the death rate of the disease; 2. mammography is an important - if not the most important - modality in mass screening; 3. new film-screen combinations generally available are capable of producing mammograms of excellent quality with radiation doses down to 0.1 rad to the body of the breast. The risk of malignant changes from such dosage - even when given periodically - is negligible. New equipment, to be available shortly, will use the new film-screen combinations in an automated manner, which should reduce costs in time, film, personnel and processing by more than 50%. This would make mass screening more practical. (M.A.) [pt

  12. Intelligent control of liquid transfer for the automated synthesis of positron emitting radiopharmaceuticals

    International Nuclear Information System (INIS)

    Iwata, Ren; Ido, Tatsuo; Yamazaki, Shigeki

    1990-01-01

A method for the intelligent control of liquid transfer, developed for automated synthesis of 2-deoxy-2-[18F]fluoro-D-glucose from [18F]fluoride, is described. A thermal mass flow controller coupled to a personal computer is used to monitor conditions for transferring or passing liquid through a tube or a column. Using this sensor a computer can detect completion of liquid transfer, dispense a stock solution and check the setup conditions of the system. The present feedback control can be readily adapted to other automated syntheses of positron emitting radiopharmaceuticals. (author)

  13. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  14. An automated optofluidic biosensor platform combining interferometric sensors and injection moulded microfluidics.

    Science.gov (United States)

    Szydzik, C; Gavela, A F; Herranz, S; Roccisano, J; Knoerzer, M; Thurgood, P; Khoshmanesh, K; Mitchell, A; Lechuga, L M

    2017-08-08

    A primary limitation preventing practical implementation of photonic biosensors within point-of-care platforms is their integration with fluidic automation subsystems. For most diagnostic applications, photonic biosensors require complex fluid handling protocols; this is especially prominent in the case of competitive immunoassays, commonly used for detection of low-concentration, low-molecular weight biomarkers. For this reason, complex automated microfluidic systems are needed to realise the full point-of-care potential of photonic biosensors. To fulfil this requirement, we propose an on-chip valve-based microfluidic automation module, capable of automating such complex fluid handling. This module is realised through application of a PDMS injection moulding fabrication technique, recently described in our previous work, which enables practical fabrication of normally closed pneumatically actuated elastomeric valves. In this work, these valves are configured to achieve multiplexed reagent addressing for an on-chip diaphragm pump, providing the sample and reagent processing capabilities required for automation of cyclic competitive immunoassays. Application of this technique simplifies fabrication and introduces the potential for mass production, bringing point-of-care integration of complex automated microfluidics into the realm of practicality. This module is integrated with a highly sensitive, label-free bimodal waveguide photonic biosensor, and is demonstrated in the context of a proof-of-concept biosensing assay, detecting the low-molecular weight antibiotic tetracycline.

  15. Assessment selection in human-automation interaction studies: The Failure-GAM2E and review of assessment methods for highly automated driving.

    Science.gov (United States)

    Grane, Camilla

    2018-01-01

Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as in hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM2E model, whose purpose is to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM2E includes a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM2E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through feet movements and a proposed Optimal Risk Management Model. Failure-GAM2E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  17. LAIR: A Language for Automated Semantics-Aware Text Sanitization based on Frame Semantics

    DEFF Research Database (Denmark)

    Hedegaard, Steffen; Houen, Søren; Simonsen, Jakob Grue

    2009-01-01

    We present \\lair{}: A domain-specific language that enables users to specify actions to be taken upon meeting specific semantic frames in a text, in particular to rephrase and redact the textual content. While \\lair{} presupposes superficial knowledge of frames and frame semantics, it requires on...... with automated redaction of web pages for subjectively undesirable content; initial experiments suggest that using a small language based on semantic recognition of undesirable terms can be highly useful as a supplement to traditional methods of text sanitization.......We present \\lair{}: A domain-specific language that enables users to specify actions to be taken upon meeting specific semantic frames in a text, in particular to rephrase and redact the textual content. While \\lair{} presupposes superficial knowledge of frames and frame semantics, it requires only...... limited prior programming experience. It neither contain scripting or I/O primitives, nor does it contain general loop constructions and is not Turing-complete. We have implemented a \\lair{} compiler and integrated it in a pipeline for automated redaction of web pages. We detail our experience...

  18. An Engineer-To-Order Mass Customization Development Framework

    DEFF Research Database (Denmark)

    Bossen, Jacob; Hansson, Michael Natapon; Madsen, Ole

    2014-01-01

Engineer-To-Order companies may benefit from adopting Mass Customization concepts to improve competitiveness and revenue. As automated manufacturing systems tend to be software intensive, it becomes equally important to enable reusability for physical components and for software-related artefacts. In parallel to Mass Customization, Software Product Line Engineering has emerged as a way for software developers to manage variability and reusability. This paper seeks to combine the concepts of Mass Customization and Software Product Line Engineering by introducing a development framework applicable for Engineer-To-Order companies.

  19. An improved single-plaquette gauge action

    International Nuclear Information System (INIS)

Banerjee, D.; Bögli, M.; Holland, K.; Niedermayer, F.; Pepe, M.; Wenger, U.; Wiese, U.-J.

    2016-01-01

    We describe and test a nonperturbatively improved single-plaquette lattice action for 4-d SU(2) and SU(3) pure gauge theory, which suppresses large fluctuations of the plaquette, without requiring the naive continuum limit for smooth fields. We tune the action parameters based on torelon masses in moderate cubic physical volumes, and investigate the size of cut-off effects in other physical quantities, including torelon masses in asymmetric spatial volumes, the static quark potential, and gradient flow observables. In 2-d O(N) models similarly constructed nearest-neighbor actions have led to a drastic reduction of cut-off effects, down to the permille level, in a wide variety of physical quantities. In the gauge theories, we find significant reduction of lattice artifacts, and for some observables, the coarsest lattice result is very close to the continuum value. We estimate an improvement factor of 40 compared to using the Wilson gauge action to achieve the same statistical accuracy and suppression of cut-off effects.

  20. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.
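The extension for radionuclides with significant daughters amounts to folding the daughters' contributions into the parent's screening factor, so the daughters need not be tracked separately. A minimal sketch; the combination rule and every number here are illustrative assumptions, not the NCRP methodology or SRS data:

```python
# Sketch of folding significant daughters into a parent's screening factor.
# The additive rule and all values are illustrative assumptions only.

def effective_screening_factor(parent_factor, daughters):
    """Combine a parent's screening factor with its daughters' contributions.

    daughters: list of (branching_fraction, daughter_factor) pairs.
    """
    return parent_factor + sum(b * f for b, f in daughters)

# Hypothetical parent with a single 100%-branching daughter.
print(effective_screening_factor(0.25, [(1.0, 0.5)]))  # 0.75
```

A nuclide would then pass or fail screening on the combined factor alone, which is how the number of tracked radionuclides shrinks.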

  1. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    Science.gov (United States)

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

Automation systems are well established in industry and in life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and the medical and pharmaceutical fields, as well as in research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be controlled regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  2. Machine Learning for Mass Production and Industrial Engineering

    OpenAIRE

    Pfingsten, Jens Tobias

    2007-01-01

The analysis of data from simulations and experiments in the development phase and of measurements during mass production plays a crucial role in modern manufacturing: experiments and simulations are performed during the development phase to ensure the design's fitness for mass production, and during production a large number of measurements in the automated production line ensures stable quality. As the number of measurements grows, the conventional, largely manual data analysis approach...

  3. Systems Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model User's Manual.

    Science.gov (United States)

    1982-06-01

    In order to examine specific Automated Guideway Transit (AGT) developments and concepts, and to build a better knowledge base for future decision-making, the Urban Mass Transportation Administration (UMTA) undertook a new program of studies and techn...

  4. Automated Functional Testing based on the Navigation of Web Applications

    Directory of Open Access Journals (Sweden)

    Boni García

    2011-08-01

Full Text Available Web applications are becoming more and more complex. Testing such applications is an intricate, hard and time-consuming activity. Therefore, testing is often poorly performed or skipped by practitioners. Test automation can help to avoid this situation. Hence, this paper presents a novel approach to perform automated software testing for web applications based on their navigation. On the one hand, web navigation is the process of traversing a web application using a browser. On the other hand, functional requirements are actions that an application must perform. Therefore, the evaluation of the correct navigation of web applications results in the assessment of the specified functional requirements. The proposed method performs the automation at four levels: test case generation, test data derivation, test case execution, and test case reporting. This method is driven by three kinds of inputs: (i) UML models; (ii) Selenium scripts; (iii) XML files. We have implemented our approach in an open-source testing framework named Automatic Testing Platform. The validation of this work has been carried out by means of a case study, in which the target is a real invoice management system developed using a model-driven approach.
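The first of the four levels, test case generation from a navigation model, can be sketched as a breadth-first enumeration of paths through a page graph. The graph below is a hypothetical fragment of an invoice management system, invented for illustration; it is not the Automatic Testing Platform's actual input format:

```python
# Sketch of navigation-based test case generation: each path through the
# page graph, up to a depth bound, becomes one candidate test case.
# The page names and graph are hypothetical.
from collections import deque

def generate_test_cases(nav_graph, start, max_depth=3):
    """Enumerate navigation paths with up to max_depth transitions."""
    cases, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        cases.append(path)
        if len(path) <= max_depth:           # room for one more transition
            for nxt in nav_graph.get(path[-1], []):
                queue.append(path + [nxt])
    return cases

nav = {"login": ["invoice_list"],
       "invoice_list": ["new_invoice", "invoice_detail"]}
for case in generate_test_cases(nav, "login", max_depth=2):
    print(" -> ".join(case))
```

Each printed path would then be bound to test data and replayed (e.g. via Selenium) in the execution level.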

  5. Enhanced automation of ECCS at Temelin: the merging of Eastern and Western philosophies

    International Nuclear Information System (INIS)

    Burnett, T.; Sykora, M.

    1997-01-01

    After years of separate development, the Eastern and Western concept of safety is now being merged. A significant issue in this process is the philosophy of automation versus operator action for control of safety systems. The paper discusses some aspects of this issue and presents enhancements made at Temelin which merge the Eastern and Western philosophies. (author)

  6. Automated dental identification system: An aid to forensic odontology

    Directory of Open Access Journals (Sweden)

    Parvathi Devi

    2011-01-01

Full Text Available Automated dental identification system is computer-aided software for the postmortem identification of deceased individuals based on dental characteristics, specifically radiographs. This system is receiving increased attention because of the large number of victims encountered in mass disasters, and it is 90% more time-saving and accurate than conventional radiographic methods. The technique is based on the intensity of the overall region of the tooth image and therefore does not necessitate the presence of a sharp boundary between the teeth. It provides automated search and matching capabilities for digitized radiographs and photographic dental images and compares the teeth present in multiple digitized dental records in order to assess their similarity. This paper highlights the functionality of its components and the techniques used in realizing these components.

  7. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  8. Validation of an automated solid-phase extraction method for the analysis of 23 opioids, cocaine, and metabolites in urine with ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Ramírez Fernández, María del Mar; Van Durme, Filip; Wille, Sarah M R; di Fazio, Vincent; Kummer, Natalie; Samyn, Nele

    2014-06-01

The aim of this work was to automate a sample preparation procedure extracting morphine, hydromorphone, oxymorphone, norcodeine, codeine, dihydrocodeine, oxycodone, 6-monoacetyl-morphine, hydrocodone, ethylmorphine, benzoylecgonine, cocaine, cocaethylene, tramadol, meperidine, pentazocine, fentanyl, norfentanyl, buprenorphine, norbuprenorphine, propoxyphene, methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine from urine samples. Samples were extracted by solid-phase extraction (SPE) with cation exchange cartridges using a TECAN Freedom Evo 100 base robotic system, including a hydrolysis step prior to extraction when required. Block modules were carefully selected in order to use the same consumable material as in manual procedures to reduce cost and/or manual sample transfers. Moreover, the present configuration included pressure-monitoring pipetting, increasing pipetting accuracy and detecting sampling errors. The compounds were then separated in a chromatographic run of 9 min using a BEH Phenyl analytical column on an ultra-performance liquid chromatography-tandem mass spectrometry system. Optimization of the SPE was performed with different wash conditions and elution solvents. Intra- and inter-day relative standard deviations (RSDs) were within ±15% and bias was within ±15% for most of the compounds. Recovery was >69%, and no carry-over in the automated SPE system was observed. The extracted samples were stable for 72 h in the autosampler (4°C). This method was applied to authentic samples (from forensic and toxicology cases) and to proficiency testing schemes containing cocaine, heroin, buprenorphine and methadone, offering fast and reliable results. Automation resulted in improved precision and accuracy, and a minimum of operator intervention, leading to safer sample handling and less time-consuming procedures.

  9. [Latest development in mass spectrometry for clinical application].

    Science.gov (United States)

    Takino, Masahiko

    2013-09-01

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) has seen enormous growth in special clinical chemistry laboratories. It significantly increases the analytic potential in clinical chemistry, especially in the field of low molecular weight biomarker analysis. This review summarizes the state of the art in mass spectrometry and related techniques for clinical application with a main focus on recent developments in LC-MS. Current trends in ionization techniques, automated online sample preparation techniques coupled with LC-MS, and ion mobility spectrometry are discussed. Emerging mass spectrometric approaches complementary to LC-MS are discussed as well.

  10. Overview of the mass measurements

    International Nuclear Information System (INIS)

    Shull, L.M.

    1991-01-01

A three-day mass measurement workshop conference sponsored by the INMM was held April 22-24, 1991, in Atlanta, Georgia. DOE Order 5633.3 requires mass measurement control programs for the measurement of nuclear materials but provides little guidance on details for these programs. Measurement principles used for mass are often applicable to other physical property measurements. Westinghouse Savannah River Site (WSRS) personnel organized the workshop conference to facilitate the transfer of mass measurement technology and establish better communications between the calibration laboratories, manufacturers, regulators, and scale and balance users in the mass measurement community. Three different formats were used to present the information: a seminar, individual papers, and workshops. The seminar topic was the Process Measurement Assurance Program (PMAP), developed by EG and G Mound Applied Technologies, for determining and controlling measurement errors in manufacturing processes. Paper and workshop topics included: Mass Measurement Techniques and Programs, Selection of Equipment and Standards, Standards and Traceability, and Automation in Mass Measurement. The paper gives an overview of the workshop conference, including purpose, participants, and summaries of the seminar, papers, and workshops.

  11. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1% ammonia (25%) in water/0.1% ammonia (25%) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96% of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.
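Two of the validation metrics named above, precision (RSD) and trueness (bias), reduce to short calculations over replicate measurements. The replicate values and the nominal concentration below are invented for the example:

```python
# Illustrative computation of relative standard deviation (precision) and
# bias (trueness) from replicate QC measurements. All numbers are made up.
import statistics

def rsd_percent(values):
    """Sample RSD of replicate measurements, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def bias_percent(values, nominal):
    """Mean deviation from the nominal concentration, in percent."""
    return 100 * (statistics.mean(values) - nominal) / nominal

replicates = [98.0, 102.0, 101.0, 99.0]   # hypothetical QC replicates, ng/mL
print(round(rsd_percent(replicates), 1))          # 1.8
print(round(bias_percent(replicates, 100.0), 1))  # 0.0
```

Both values falling within ±15% is the acceptance criterion the study applies to most compounds.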

  12. An atomistic vision of the Mass Action Law: Prediction of carbon/oxygen defects in silicon

    Energy Technology Data Exchange (ETDEWEB)

    Brenet, G.; Timerkaeva, D.; Caliste, D.; Pochet, P. [CEA, INAC-SP2M, Atomistic Simulation Laboratory, F-38000 Grenoble (France); Univ. Grenoble Alpes, INAC-SP2M, L-Sim, F-38000 Grenoble (France); Sgourou, E. N.; Londos, C. A. [University of Athens, Solid State Physics Section, Panepistimiopolis Zografos, Athens 157 84 (Greece)

    2015-09-28

    We introduce an atomistic description of the kinetic Mass Action Law to predict concentrations of defects and complexes. We demonstrate in this paper that this approach accurately predicts carbon/oxygen related defect concentrations in silicon upon annealing. The model requires binding and migration energies of the impurities and complexes, here obtained from density functional theory (DFT) calculations. Vacancy-oxygen complex kinetics are studied as a model system during both isochronal and isothermal annealing. Results are in good agreement with experimental data, confirming the success of the methodology. More importantly, it gives access to the sequence of chain reactions by which oxygen and carbon related complexes are created in silicon. Beside the case of silicon, the understanding of such intricate reactions is a key to develop point defect engineering strategies to control defects and thus semiconductors properties.
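The kinetic Mass Action Law underlying the model can be sketched for a single capture reaction, V + O ⇌ VO (a vacancy capturing an oxygen atom). The rate constants and starting concentrations below are arbitrary illustrative numbers, not the DFT-derived binding and migration energies used in the paper:

```python
# Toy forward-Euler integration of mass-action kinetics for V + O <-> VO.
# Rate constants and concentrations are illustrative, in arbitrary units.

def integrate(kf, kr, v0, o0, vo0, dt=1e-3, steps=50_000):
    """Integrate d[VO]/dt = kf*[V]*[O] - kr*[VO] with explicit Euler."""
    v, o, vo = v0, o0, vo0
    for _ in range(steps):
        rate = kf * v * o - kr * vo   # net VO formation rate
        v -= rate * dt
        o -= rate * dt
        vo += rate * dt
    return v, o, vo

v, o, vo = integrate(kf=1.0, kr=0.1, v0=1.0, o0=2.0, vo0=0.0)
# At equilibrium the Mass Action Law requires [VO]/([V][O]) = kf/kr = 10.
print(round(vo / (v * o), 2))  # 10.0
```

A real defect model chains many such reactions, which is how the sequence of carbon/oxygen complex formation emerges during annealing.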

  13. Mass-rearing for sterile insect release

    International Nuclear Information System (INIS)

    Parker, A.G.

    2005-01-01

    As the sterile insect technique (SIT) relies upon released sterile male insects efficiently competing with wild males to mate with wild females, it follows that mass-rearing of insects is one of the principal steps in the process. Mass-rearing for the SIT presents both problems and opportunities due to the increased scale involved compared with rearing insects for most other purposes. This chapter discusses facility design, environmental concerns, strain management, quality control, automation, diet, sex separation, marking, and storage in relation to rearing for the SIT. (author)

  14. Statistics in action a Canadian outlook

    CERN Document Server

    Lawless, Jerald F

    2014-01-01

    Commissioned by the Statistical Society of Canada (SSC), Statistics in Action: A Canadian Outlook helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes.The first two c

  15. Robotic and Human-Tended Collaborative Drilling Automation for Subsurface Exploration

    Science.gov (United States)

    Glass, Brian; Cannon, Howard; Stoker, Carol; Davis, Kiel

    2005-01-01

Future in-situ lunar/martian resource utilization and characterization, as well as the scientific search for life on Mars, will require access to the subsurface and hence drilling. Drilling on Earth is hard - an art form more than an engineering discipline. Human operators listen and feel drill string vibrations coming from kilometers underground. Abundant mass and energy make it possible for terrestrial drilling to employ brute-force approaches to failure recovery and system performance issues. Space drilling will require intelligent and autonomous systems for robotic exploration and to support human exploration. Eventual in-situ resource utilization will require deep drilling with probable human-tended operation of large-bore drills, but initial lunar subsurface exploration and near-term ISRU will be accomplished with lightweight, rover-deployable or standalone drills capable of penetrating a few tens of meters in depth. These lightweight exploration drills have a direct counterpart in terrestrial prospecting and ore-body location, and will be designed to operate either human-tended or automated. NASA and industry are now acquiring experience in developing and building low-mass automated planetary prototype drills, with the aim of designing and building a pre-flight lunar prototype targeted for 2011-12 flight opportunities. A successful system will include development of drilling hardware and automated control software to operate it safely and effectively. This includes control of the drilling hardware, state estimation of the hardware, of the lithology being drilled and of the state of the hole, and potentially planning and scheduling software suitable for uncertain situations such as drilling. Given that humans on the Moon or Mars are unlikely to be able to spend protracted EVA periods at a drill site, both human-tended and robotic access to planetary subsurfaces will require some degree of standalone, autonomous drilling capability. Human-robotic coordination will be important.

  16. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  17. Analytical and clinical performance of the new Fujirebio 25-OH vitamin D assay, a comparison with liquid chromatography-tandem mass spectrometry (LC-MS/MS) and three other automated assays.

    Science.gov (United States)

    Saleh, Lanja; Mueller, Daniel; von Eckardstein, Arnold

    2016-04-01

We evaluated the analytical and clinical performance of the new Lumipulse® G 25-OH vitamin D assay from Fujirebio and compared it to a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method and three other commercial automated assays. Total 25-hydroxyvitamin D (25(OH)D) levels were measured in 100 selected serum samples from our routine analysis with the Fujirebio 25(OH)D assay. The results were compared with those obtained with LC-MS/MS and three other automated 25(OH)D assays (Abbott, Beckman, and Roche). The accuracy of each assay tested was evaluated against a Labquality reference serum panel for 25(OH)D (Ref!25OHD; University of Ghent). Intra- and inter-day imprecision of the Fujirebio 25(OH)D assay was within acceptable limits. The Lumipulse® G 25-OH vitamin D assay from Fujirebio demonstrated a good correlation with LC-MS/MS and some immunoassays. The performance of the assay is well suited for routine 25(OH)D measurement in clinical serum samples. A correction for the observed negative bias vs. LC-MS/MS could be considered.
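The bias correction suggested above can be illustrated as a simple linear recalibration of the immunoassay against the reference method. The paired values below are fabricated, and ordinary least squares stands in for whatever method-comparison regression (e.g. Passing-Bablok) a laboratory would actually choose:

```python
# Sketch of a linear bias correction: fit assay = slope*reference + intercept,
# then invert the line to recalibrate assay results. Data are fabricated.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

lcms  = [20.0, 40.0, 60.0, 80.0]   # reference 25(OH)D values, ng/mL
assay = [17.0, 35.0, 53.0, 71.0]   # immunoassay showing a negative bias
slope, intercept = fit_line(lcms, assay)
corrected = [(a - intercept) / slope for a in assay]
print(slope, intercept)            # 0.9 -1.0 for these fabricated numbers
```

Inverting the fitted line maps each immunoassay result back onto the LC-MS/MS scale.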

  18. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed that automation promises greater efficiency, lower workloads, and fewer operator errors through enhanced operator and system performance. However, excessive introduction of automation has deteriorated operator performance due to the side effects of automation, referred to as Out-of-the-Loop (OOTL), and this is a critical issue that must be resolved. Thus, in order to determine the optimal level of automation that assures the best human operator performance, a quantitative method of optimizing automation is proposed in this paper. To determine appropriate automation levels that enable the best human performance, the automation rate and the ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration was conducted to derive the shortest working time through the concept of Situation Awareness Recovery (SAR), which states that the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed through redesigning the original emergency operating procedure according to the introduced automation and ostracism levels.
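The shortest-working-time criterion can be illustrated with a toy model: manual effort shrinks as the automation rate rises, while an SAR penalty grows with it. The cost function below is an invented stand-in, not the paper's automation/ostracism rate estimation method:

```python
# Toy illustration of choosing an automation rate by minimizing predicted
# working time. The cost model is a made-up stand-in for the paper's method.

def working_time(rate, manual_time=100.0, sar_penalty=200.0):
    """Manual effort shrinks with automation; Situation Awareness Recovery
    (SAR) time grows with it, penalizing over-automation."""
    return manual_time * (1 - rate) + sar_penalty * rate ** 2

def optimal_rate(candidates):
    """Return the candidate automation rate with the shortest working time."""
    return min(candidates, key=working_time)

rates = [i / 100 for i in range(101)]   # candidate automation rates 0..1
print(optimal_rate(rates))              # 0.25 for this toy cost model
```

The interior minimum captures the paper's point: past some rate, OOTL side effects outweigh the time saved by automating.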

  19. Automated patch clamp on mESC-derived cardiomyocytes for cardiotoxicity prediction.

    Science.gov (United States)

    Stoelzle, Sonja; Haythornthwaite, Alison; Kettenhofen, Ralf; Kolossov, Eugen; Bohlen, Heribert; George, Michael; Brüggemann, Andrea; Fertig, Niels

    2011-09-01

    Cardiovascular side effects are critical in drug development and have frequently led to late-stage project terminations or even drug withdrawal from the market. Physiologically relevant and predictive assays for cardiotoxicity are hence strongly demanded by the pharmaceutical industry. To identify a potential impact of test compounds on ventricular repolarization, typically a variety of ion channels in diverse heterologously expressing cells have to be investigated. Similar to primary cells, in vitro-generated stem cell-derived cardiomyocytes simultaneously express cardiac ion channels. Thus, they more accurately represent the native situation compared with cell lines overexpressing only a single type of ion channel. The aim of this study was to determine if stem cell-derived cardiomyocytes are suited for use in an automated patch clamp system. The authors show recordings of cardiac ion currents as well as action potential recordings in readily available stem cell-derived cardiomyocytes. Besides monitoring inhibitory effects of reference compounds on typical cardiac ion currents, the authors revealed for the first time drug-induced modulation of cardiac action potentials in an automated patch clamp system. The combination of an in vitro cardiac cell model with higher throughput patch clamp screening technology allows for a cost-effective cardiotoxicity prediction in a physiologically relevant cell system.

  20. An Analysis of Automated Solutions for the Certification and Accreditation of Navy Medicine Information Assets

    National Research Council Canada - National Science Library

    Gonzales, Dominic V

    2005-01-01

    ... improve Navy Medicine's current C&A security posture. The primary research reviewed C&A policy and included a comparative analysis of two cutting-edge automated C&A tools, namely Xacta and eMASS...

  1. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amounts of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is to incorporate automation into the data analysis process. Specific advantages that automated data analysis can potentially provide include the ability to analyze data more quickly, consistently and accurately than can be achieved manually. Automated data analysis can also potentially perform the data analysis function with significantly smaller analyst staffing levels. Despite these clear potential advantages, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis at both the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided, including the signal processing methodologies used and, where available, the performance data obtained for each system. Also included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to assist ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case-based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  2. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems

  3. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  4. Automation of a Versatile Crane (the LSMS) for Lunar Outpost Construction, Maintenance and Inspection

    Science.gov (United States)

    Doggett, William R.; Roithmayr, Carlos M.; Dorsey, John T.; Jones, Thomas C.; Shen, Haijun; Seywald, Hans; King, Bruce D.; Mikulas, Martin M., Jr.

    2009-01-01

    Devices for lifting, translating and precisely placing payloads are critical for efficient Earth-based construction operations. Both recent and past studies have demonstrated that devices with similar functionality will be needed to support lunar outpost operations. Although several designs have been developed for Earth-based applications, these devices lack unique design characteristics necessary for transport to and use on the harsh lunar surface. These design characteristics include: a) lightweight components, b) compact packaging for launch, c) automated deployment, d) simple in-field reconfiguration and repair, and e) support for tele-operated or automated operations. Also, because the cost to transport mass to the lunar surface is very high, the number of devices that can be dedicated to surface operations will be limited. Thus, in contrast to Earth-based construction, where many single-purpose devices dominate a construction site, a lunar outpost will require a limited number of versatile devices that provide operational benefit from initial construction through sustained operations. The first-generation test-bed of a new high-performance device, the Lunar Surface Manipulation System (LSMS), has been designed, built and field tested. The LSMS has many unique features resulting in a mass-efficient solution to payload handling on the lunar surface. Typically, the LSMS device mass is estimated at approximately 3% of the mass of the heaviest payload lifted at the tip, or 1.8% of the mass of the heaviest payload lifted at the elbow or mid-span of the boom for a high-performance variant incorporating advanced structural components. Initial operational capabilities of the LSMS were successfully demonstrated during field tests at Moses Lake, Washington using a tele-operated approach. Joint angle sensors have been developed for the LSMS to improve operator situational awareness. These same sensors provide the necessary information to support fully automated operations.

  5. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  6. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow, i.e. virtual automation. Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction in preanalytical errors (from 11.7% to 0.4% of the tubes) and a better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).
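The "virtual automation" idea, software routing samples between stations instead of a physical transport belt, can be illustrated with a toy dispatcher. The station names, test-to-analyzer rules, and the `route` function are all invented for this example; the actual PSM software and its interfaces are not described in the record.

```python
# Toy sketch of software-directed sample workflow: given a sample's
# requested tests, compute the ordered list of stations it must visit.
# All station names and routing rules here are illustrative assumptions.

def route(sample):
    """Return the ordered, de-duplicated list of stations for a sample."""
    stations = []
    if sample.get("needs_centrifugation", True):
        stations.append("centrifuge")
    for test in sample["tests"]:
        if test in ("glucose", "urea", "creatinine"):
            stations.append("chemistry_analyzer")
        elif test in ("tsh", "ft4"):
            stations.append("immunoassay_analyzer")
    stations.append("archive")
    # Keep only the first visit to each station, preserving order.
    seen, ordered = set(), []
    for s in stations:
        if s not in seen:
            seen.add(s)
            ordered.append(s)
    return ordered

plan = route({"tests": ["glucose", "tsh"], "needs_centrifugation": True})
```

In the virtual-automation setup, a plan like this is what laboratory personnel execute by hand in place of a conveyor, while the software tracks each sample's position.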

  7. [Accuracy, precision and speed of parenteral nutrition admixture bags manufacturing: comparison between automated and manual methods].

    Science.gov (United States)

    Zegbeh, H; Pirot, F; Quessada, T; Durand, T; Vételé, F; Rose, A; Bréant, V; Aulagner, G

    2011-01-01

    Parenteral nutrition admixtures (PNA) are manufactured in hospital pharmacies by aseptic transfer (AT) or sterilizing filtration (SF). In the absence of a standard, the development of filling systems for PNA manufacturing requires an evaluation against the traditional SF methods. The filling accuracy of automated AT and of SF was evaluated by mass and physico-chemical tests under repeatability conditions (identical PNA composition; n = five bags) and reproducibility conditions (different PNA compositions; n = 57 bags). For each manufacturing method, the filling precision and the average manufacturing time per bag were evaluated starting from a PNA of identical composition and volume (n = five trials). The two manufacturing methods showed no significant difference in accuracy. The precision of both methods was below the limits generally admitted for the acceptability of mass and physico-chemical tests. However, the manufacturing time for SF was longer (five different binary admixtures in five bags) or shorter (one identical binary admixture in five bags) than the time recorded for automated AT. We show that serial manufacturing of PNA bags of identical composition by SF is faster than automated AT; conversely, automated AT is faster than SF when PNA compositions vary. The choice of manufacturing method will therefore be motivated by the nature (i.e., variable composition or not) of the manufactured bags. Copyright © 2010 Elsevier Masson SAS. All rights reserved.
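The mass-based accuracy and precision checks described above can be sketched in a few lines: accuracy as the percent deviation of the mean fill mass from target, precision as the coefficient of variation across replicate bags. The acceptance limits and example masses below are illustrative assumptions, not the paper's thresholds.

```python
# Sketch of a gravimetric accuracy/precision check for filled bags.
# Limits (5% bias, 5% CV) are invented for the example.
from statistics import mean, stdev

def gravimetric_check(masses_g, target_g, max_bias_pct=5.0, max_cv_pct=5.0):
    """Accuracy = bias of mean vs. target; precision = CV across bags."""
    bias_pct = 100.0 * (mean(masses_g) - target_g) / target_g
    cv_pct = 100.0 * stdev(masses_g) / mean(masses_g)
    return {"bias_pct": bias_pct, "cv_pct": cv_pct,
            "pass": abs(bias_pct) <= max_bias_pct and cv_pct <= max_cv_pct}

# Five replicate bags under repeatability conditions (illustrative data).
result = gravimetric_check([498.2, 501.1, 499.6, 500.4, 500.9], target_g=500.0)
```

The repeatability-condition evaluation in the record (identical composition, n = five bags) is exactly this kind of computation; the reproducibility evaluation repeats it across bags of different compositions.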

  8. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving the software development process by shortening the testing phase in the software development life cycle. In this project, an existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  9. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  10. A unique automation platform for measuring low level radioactivity in metabolite identification studies.

    Directory of Open Access Journals (Sweden)

    Joel Krauser

    Full Text Available Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, definition of metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using a 14C or 3H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time-intensive activity, particularly for preclinical in vivo or clinical studies, which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high-volume automation technologies, yet would still clearly benefit from automation. The use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification, is a very good example. The current lack of automation for measuring low-level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, an HPLC or UPLC system, a mass spectrometer and an automated fraction collector.

  11. A Unique Automation Platform for Measuring Low Level Radioactivity in Metabolite Identification Studies

    Science.gov (United States)

    Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet

    2012-01-01

    Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, definition of metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using a 14C or 3H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time-intensive activity, particularly for preclinical in vivo or clinical studies, which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high-volume automation technologies, yet would still clearly benefit from automation. The use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification, is a very good example. The current lack of automation for measuring low-level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, an HPLC or UPLC system, a mass spectrometer and an automated fraction collector. PMID:22723932
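Integrating a plate-handling robot, an (U)HPLC system, a mass spectrometer and a fraction collector is, at its core, a sequencing problem. The sketch below shows only that sequencing idea, using stand-in device drivers; real instrument control APIs are vendor-specific and are not part of this record.

```python
# Minimal orchestration sketch with stub drivers. Every device/method
# name here is a placeholder, not a real instrument API.

class StubDevice:
    """Stand-in for an instrument driver; records every call it receives."""
    def __init__(self):
        self.calls = []
    def __getattr__(self, name):
        def op(*args):
            self.calls.append((name, args))
        return op

def run_sample(sample_id, devices, n_fractions=96):
    """Drive one sample through the four stations in a fixed order."""
    devices["robot"].load(sample_id)               # plate-handling robot
    devices["hplc"].inject(sample_id)              # (U)HPLC separation
    devices["ms"].acquire(sample_id)               # mass spectrometer
    devices["collector"].collect(sample_id, n_fractions)  # fractionation
    return devices

devices = {name: StubDevice() for name in ("robot", "hplc", "ms", "collector")}
run_sample("sample-01", devices)
```

The value of such a layer, per the record, is that the sequencing and the waiting run unattended, freeing the scientist from per-fraction manual handling of low-level radioactive material.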

  12. Enabling Advanced Automation in Spacecraft Operations with the Spacecraft Emergency Response System

    Science.gov (United States)

    Breed, Julie; Fox, Jeffrey A.; Powers, Edward I. (Technical Monitor)

    2001-01-01

    secure distributed fault and resource management. The SERS incorporates the use of intelligent agents, threaded discussions, workflow, database connectivity, and links to a variety of communications devices (e.g., two-way pagers, PDAs, and Internet phones) via commercial gateways. When the SERS detects a problem, it notifies on-call team members, who can then remotely take any necessary actions to resolve the anomalies. The SERS goes well beyond a simple '911' system that sends out an error code to everyone with a pager. Instead, SERS' software agents send detailed data (i.e., notifications) to the most appropriate team members based on the type and severity of the anomaly and the skills of the on-call team members. The SERS also allows the team members to respond to the notifications from their wireless devices. This unique capability ensures rapid response, since the team members no longer have to go to a PC or the control center for every anomalous event. Most importantly, the SERS enables safe experimentation with various techniques for increasing levels of automation, leading to robust autonomy. For the MIDEX missions at NASA GSFC, the SERS is used to provide 'human-in-the-loop' automation. During lights-out operations, as greater control is given to the MIDEX automated systems, the SERS can be configured to page remote personnel and keep them informed regarding actions taking place in the control center. Remote off-duty operators can even be given the option of enabling or inhibiting a specific automated response in near real time via their two-way pagers. The SERS facilitates the insertion of new technology to increase automation while maintaining the safety and security of mission resources. This paper will focus on SERS' overall functionality and how SERS has been designed to handle the monitoring and emergency response for missions with varying levels of automation. The paper will also convey some of the key lessons learned from SERS' deployment across a variety

  13. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and the erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  14. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions, about 10% of laser-ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm × 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent-resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  15. Minima de L'intégrale D'action du Problème Newtonien de 4 Corps de Masses Égales Dans R^3: Orbites `Hip-Hop'

    Science.gov (United States)

    Chenciner, Alain; Venturelli, Andrea

    2000-09-01

    We consider the problem of 4 bodies of equal masses in R^3 for the Newtonian r^-1 potential. We address the question of the absolute minima of the action integral among (anti)symmetric loops of class H^1 whose period is fixed. It is the simplest case for which the results of [4] (corrected in [5]) do not apply: the minima cannot be the relative equilibria whose configuration is an absolute minimum of the potential among the configurations having a given moment of inertia with respect to their center of mass. This is because the regular tetrahedron cannot have a relative equilibrium motion in R^3 (see [2]). We show that the absolute minima of the action are not homographic motions. We also show that if we force the configuration to admit a certain type of symmetry of order 4, the absolute minimum is a collisionless orbit whose configuration 'hesitates' between the central configuration of the square and that of the tetrahedron. We call these orbits 'hip-hop'. A similar result holds in the case of a symmetry of order 3, where the central configuration of the equilateral triangle with a body at the center of mass replaces the square.
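The functional being minimized in records like this one can be written explicitly. A standard form of the fixed-period Newtonian action (in units with G = 1, with the sign convention under which minimizers among collision-free loops solve the equations of motion) is sketched below for orientation; it is the textbook functional, not a formula quoted from the paper:

```latex
% Action over T-periodic H^1 loops x = (x_1, \dots, x_4), x_i(t) \in \mathbb{R}^3:
\mathcal{A}(x) = \int_0^T \left[ \sum_{i=1}^{4} \tfrac{1}{2}\, m_i \,\lVert \dot{x}_i(t) \rVert^2
  \;+\; \sum_{1 \le i < j \le 4} \frac{m_i\, m_j}{\lVert x_i(t) - x_j(t) \rVert} \right] \mathrm{d}t
% Critical points of A among collision-free loops are T-periodic solutions
% of Newton's equations for the r^{-1} potential; the record concerns its
% absolute minima on (anti)symmetric loop classes with equal masses.
```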

  16. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  17. On the least action principle in cosmology

    NARCIS (Netherlands)

    Nusser, A; Branchini, E

    2000-01-01

    Given the present distribution of mass tracing objects in an expanding universe, we develop and test a fast method for recovering their past orbits using the least action principle. In this method, termed FAM for fast action minimization, the orbits are expanded in a set of orthogonal time basis

  18. Mixed meson masses with domain-wall valence and staggered sea fermions

    International Nuclear Information System (INIS)

    Orginos, Kostas; Walker-Loud, Andre

    2008-01-01

    Mixed action lattice calculations allow for an additive lattice-spacing-dependent mass renormalization of mesons composed of one sea and one valence quark, regardless of the type of fermion discretization methods used in the valence and sea sectors. The value of the mass renormalization depends upon the lattice actions used. This mixed meson mass shift is an important lattice artifact to determine for mixed action calculations; because it modifies the pion mass, it plays a central role in the low-energy dynamics of all hadronic correlation functions. We determine the leading order, O(a^2), and next-to-leading order, O(a^2 m_π^2), additive mass shift of valence-sea mesons for a mixed lattice action with domain-wall valence fermions and rooted staggered sea fermions, relevant to the majority of current large scale mixed action lattice efforts. We find that, on the asqtad-improved coarse MILC lattices, this additive mass shift is well parametrized in lattice units by Δ(am)^2 = 0.034(2) - 0.06(2)(am_π)^2, which in physical units, using a = 0.125 fm, corresponds to Δ(m)^2 = (291±8 MeV)^2 - 0.06(2) m_π^2. In terms of the mixed action effective field theory parameters, the corresponding mass shift is given by a^2 Δ_Mix = (316±4 MeV)^2 at leading order plus next-to-leading order corrections, including the necessary chiral logarithms for this mixed action calculation, determined in this work. Within the precision of our calculation, one cannot distinguish between the full next-to-leading order effective field theory analysis of this additive mixed meson mass shift and the parametrization given above.
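The quoted conversion from lattice to physical units can be checked directly: a dimensionless leading-order shift Δ(am)^2 = 0.034 on an a = 0.125 fm lattice corresponds to about 291 MeV via ħc ≈ 197.327 MeV·fm. A short worked check (the function name is ours):

```python
# Convert a dimensionless lattice mass-shift-squared to MeV:
# Delta(m) = sqrt(Delta(am)^2) / a  (in fm^-1), times hbar*c in MeV*fm.
import math

HBARC_MEV_FM = 197.327  # hbar * c

def shift_in_mev(delta_am_sq, a_fm):
    """Physical mass shift, in MeV, from a lattice-units shift-squared."""
    return math.sqrt(delta_am_sq) / a_fm * HBARC_MEV_FM

leading = shift_in_mev(0.034, 0.125)  # about 291 MeV, matching the abstract
```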

  19. Automated determination of aliphatic primary amines in wastewater by simultaneous derivatization and headspace solid-phase microextraction followed by gas chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Llop, Anna; Pocurull, Eva; Borrull, Francesc

    2010-01-22

    This paper presents a fully automated method for determining ten primary amines in wastewater at ng/L levels. The method is based on simultaneous derivatization with pentafluorobenzaldehyde (PFBAY) and headspace solid-phase microextraction (HS-SPME) followed by gas chromatography coupled to ion trap tandem mass spectrometry (GC-IT-MS-MS). The influence of the main factors on the efficiency of derivatization and of HS-SPME is described in detail and optimized by a central composite design. For all species, the highest enrichment factors were achieved using an 85 μm polyacrylate (PA) fiber exposed in the headspace of stirred water samples (750 rpm) at pH 12, containing 360 g/L of NaCl, at 40 °C for 15 min. Under optimized conditions, the proposed method achieved detection limits ranging from 10 to 100 ng/L (except for cyclohexylamine). The optimized method was then used to determine the presence of primary amines in various types of wastewater samples, such as influent and effluent wastewater from municipal and industrial wastewater treatment plants (WWTPs) and a potable water treatment plant. Although the analysis of these samples revealed the presence of up to 1500 μg/L of certain primary amines in influent industrial wastewater, the concentration of these compounds in the effluent and in municipal and potable water was substantially lower, at low μg/L levels. The new derivatization-HS-SPME-GC-IT-MS-MS method is suitable for the fast, reliable and inexpensive determination of primary amines in wastewater in an automated procedure. Copyright 2009 Elsevier B.V. All rights reserved.

  20. The implementation of the situational control concept of information security in automated training systems

    Directory of Open Access Journals (Sweden)

    A. M. Chernih

    2016-01-01

    subsystem elements of the automated learning system to meet changing conditions of operation. When an event associated with the emergence of an information security threat related to one of the elements of a situation, caused by a variety of destabilizing factors, is detected in the system, a base of alternative sets of situational control actions is formed, and then the sets of admissible solutions among the situational control options are formed. The best solution provides an extremum of the objective function of situational control of information security. Results. The main approaches to ensuring information security in automated learning systems are considered, the necessity of using situational control of security in automated learning systems is substantiated, a mathematical model and problem statement of situational control are offered, and a method of situational control of information protection is developed. Conclusion. The developed method of situational control of information security in automated learning systems involves the participation of the operator in the development and making of decisions (dialogue procedures: statement of the objectives of situational control, formation of the base of alternative sets of control actions, etc.). Another important feature of this technique is the necessity of using previously developed models (models of the decision-making situation, a model of coordination and planning of the operation of the control and information protection subsystem, models of processing information about the status of the subsystem, and models of analysis and evaluation of results) and a database built on the operating experience of information protection systems in automated learning systems. The implementation of the concept of situational control of information security ensures the timely adaptation of the algorithms and parameters of the information security system to changes in the external environment and the nature of tasks

  1. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly...) plan to both rename and modify the National Customs Automation Program (NCAP) test concerning the... data elements required to obtain release for cargo transported by air. The test will now be known as...

  2. An automation model of Effluent Treatment Plant

    Directory of Open Access Journals (Sweden)

    Luiz Alberto Oliveira Lima Roque

    2012-07-01

    Full Text Available Population growth and the intensification of industrial activities have increased the deterioration of natural resources. Industrial, hospital and residential wastes are dumped directly into landfills without processing, polluting soils. This action has later consequences, because the liquid resulting from the putrefaction of organic material seeps through the soil and reaches water bodies. Cities arise without planning; industrial and household wastes are discharged into rivers, lakes and oceans without proper treatment, affecting water resources. It is well known that in the next century there will be fierce competition for fresh water on the planet, probably due to its scarcity. Demographic expansion has occurred without proper sanitation planning, degrading oceans, lakes and rivers. Thus, a large percentage of the world population suffers from diseases related to water pollution. Accordingly, it can be concluded that sewage treatment is essential to human survival and to preserving rivers, lakes and oceans. An Effluent Treatment Plant (ETP) treats wastewater to reduce its pollution to acceptable levels before sending it to the oceans or rivers. To automate the operation of an ETP, motors, sensors, logic blocks, timers and counters are needed. These functions are achieved with programmable logic controllers (PLC) and supervisory systems. The Ladder language is used to program the controllers and is a pillar of Automation and Control Engineering. The supervisory systems allow process information to be monitored, while the PLC are responsible for control and data acquisition. In the age we live in, process automation is used on an increasing scale in order to provide higher quality, raise productivity and improve the proposed activities. Therefore, an automated ETP will improve performance and efficiency in handling large volumes of sewage. Considering the growing importance of environmental awareness with special emphasis
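A single rung of the Ladder logic mentioned in this record can be mimicked in ordinary code. The start/seal-in pattern below uses invented tag names and thresholds; it only illustrates the style of interlock a PLC in an ETP would run, not any specific plant's program.

```python
# Illustrative translation of one ladder rung into Python: run the
# transfer pump when the high-level switch trips, seal in (stay running)
# while the level remains above the low switch, and stop on E-stop.
# All tag names are hypothetical.

def pump_command(level_high, level_low, estop, running):
    """One scan of a start/stop rung with a seal-in (latch) contact."""
    if estop:
        return False          # emergency stop overrides everything
    if level_high:
        return True           # high-level switch starts the pump
    return running and level_low  # seal-in: keep running above low level

state = pump_command(level_high=True, level_low=True, estop=False, running=False)
# state is True: the high-level switch has started the pump
```

A PLC evaluates such rungs every scan cycle; calling `pump_command` once per cycle with fresh sensor readings reproduces that behavior.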

  3. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports human operators, but they cannot express the change in the operators' workload, i.e., whether it is increased or decreased. Before considering automation rates, whether the adopted automation is beneficial or detrimental should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the previously suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load
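The record above describes an index that equals 1 when automation leaves the operators' task load unchanged. As a purely illustrative sketch, the same behaviour can be expressed as a simple ratio of task load with automation to task load without it; note that this ratio is an assumption for illustration, not the formula published in the cited study.

```python
def task_load_index(load_with_automation: float, load_manual: float) -> float:
    """Hypothetical automation-aided task load index.

    Returns 1.0 when automation leaves the operators' task load
    unchanged, < 1.0 when it reduces the load, and > 1.0 when the
    added monitoring burden outweighs the support.
    NOTE: this ratio is an illustrative assumption, not the formula
    from the cited study.
    """
    if load_manual <= 0:
        raise ValueError("manual task load must be positive")
    return load_with_automation / load_manual

# With no automation the two loads coincide, so the index is 1:
print(task_load_index(10.0, 10.0))  # -> 1.0
# Automation that halves the task load yields an index below 1:
print(task_load_index(5.0, 10.0))   # -> 0.5
```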

  4. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    Science.gov (United States)

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep related or mainly task related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving with automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  5. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation in Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation is that the operators monitored the automated procedure execution conscientiously, which may have prevented the OOTL problems from degrading situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike of the full automation condition, though they felt automation with breaks could be suitable for some tasks. The main reason the operators did not like full automation was that they did not feel in control. A qualitative analysis addressing factors contributing to response time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Other factors, such as teamwork and operator tendencies, were also of importance. Several design implications were drawn

  6. Classification of terverticillate Penicillia by electrospray mass spectrometric profiling

    DEFF Research Database (Denmark)

    Smedsgaard, Jørn; Hansen, Michael Edberg; Frisvad, Jens Christian

    2004-01-01

    429 isolates of 58 species belonging to Penicillium subgenus Penicillium are classified from direct infusion electrospray mass spectrometry (diMS) analysis of crude extracts by automated data processing. The study shows that about 70% of the species can be classified correctly into species using...

  7. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) on a 0-5 scale based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on the reduction of benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cyst and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.

  8. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungmin; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-05-15

Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports human operators, but they cannot express the change in the operators' workload, i.e., whether it is increased or decreased. Before considering automation rates, whether the adopted automation is beneficial or detrimental should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the previously suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.

  9. High-throughput microfluidics automated cytogenetic processing for effectively lowering biological process time and aid triage during radiation accidents

    International Nuclear Information System (INIS)

    Ramakumar, Adarsh

    2016-01-01

Nuclear or radiation mass casualties require individual, rapid, and accurate dose-based triage of exposed subjects for cytokine therapy and supportive care, to save lives. Radiation mass casualties will demand high-throughput individual diagnostic dose assessment for the medical management of exposed subjects. Cytogenetic techniques are widely used for triage and definitive radiation biodosimetry. A prototype platform has been developed to demonstrate high-throughput microfluidic micro-incubation, supporting the logistics of transporting samples in miniaturized incubators from the accident site to analytical labs. Efforts have been made, both at the level of developing concepts and of advancing systems, to achieve higher throughput in sample processing and to implement better and more efficient logistics leading to lab-on-chip analyses. An automated high-throughput platform with automated feature extraction, storage, cross-platform data linkage, cross-platform validation and the inclusion of multi-parametric biomarker approaches will provide the first generation of high-throughput platform systems for effective medical management, particularly during radiation mass casualty events

  10. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

Automation systems under development for measuring Australian antigen by radioimmunoassay were discussed. Samples were processed as follows: blood serum was dispensed by an automated sampler into test tubes and incubated under controlled time and temperature; the first counting was omitted; labelled antibody was dispensed into the serum after washing; samples were incubated and then centrifuged; radioactivity in the precipitate was counted by an auto-well counter; and measurements were tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  11. Improving Quality and Occupational Safety on Automated Casting Lines

    Directory of Open Access Journals (Sweden)

    Kukla S.

    2017-09-01

Full Text Available The paper presents a practical example of improving quality and occupational safety on automated casting lines. Working conditions on a box-moulding line with a horizontal mould split were analysed owing to the low degree of automation at the stages of core and filter installation and of spheroidizing mortar dosing. A simulation analysis was carried out to examine the rationale for introducing an automatic mortar dispenser to the mould. For the research, a simulation model of the line was created in Arena, universal software for modelling and simulating manufacturing systems from Rockwell Software Inc. A simulation experiment was carried out on the model in order to determine the basic parameters of the working system. Organization and working conditions in other sections of the line were also analysed, paying particular attention to quality, ergonomics and occupational safety. An ergonomics analysis was carried out on the manual core installation and filter installation workplaces, and changes to these workplaces were suggested in order to eliminate actions that are unnecessary and onerous for employees.

  12. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... National Customs Automation Program (NCAP) test concerning the simplified entry functionality in the... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of...

  13. MASS APPRAISAL AND REAL ESTATE TAXATION

    Directory of Open Access Journals (Sweden)

    VORONIN V. О.

    2015-12-01

Full Text Available Raising of the problem. The government has fixed, at the legislative level, the definition of market value for tax purposes as mandatory in real estate transactions. In order to meet the requirements of objectivity, uniformity and consistency of the results obtained during evaluation procedures, as well as to minimize the influence of subjective factors, there is a need to develop a methodology for an automated procedure for determining the estimated value of a property based on its market value. To solve this problem, we use special techniques and methods of mass appraisal that incorporate computer-supported statistical analyses, such as multiple regression analysis and adaptive estimation procedures, for use in the valuation of property and property rights. Purpose. Realization of this goal involves the development of the concept of computer-assisted mass appraisal. The basis of this concept is adaptive hybrid models of market pricing in different market segments, incorporating software-based adaptive algorithms for determining market value by the three evaluation approaches using the results of a multi-level real estate market analysis. A utility automated valuation model was proposed for the implementation of computerized real estate valuation based on the developed software-based adaptive algorithms. Conclusion. To achieve this goal, the concepts underlying computer-assisted mass appraisal have been developed and used. The basis of this concept is adaptive hybrid pricing models in various segments of the real estate market of Ukraine. The problem is solved by applying the developed software-based adaptive algorithms for determining market value by the three evaluation approaches using the results of a multi-level analysis of the real estate market. A model of automated appraisal was proposed, according to which computerization of appraisal procedures was implemented on the
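The record above names multiple regression analysis as the statistical core of computer-assisted mass appraisal. As a minimal sketch of that idea only, the snippet below fits a price model to comparable sales by ordinary least squares and predicts the estimated value of a new property; the single predictor (area), the data, and all coefficients are invented for illustration and are not taken from the cited work.

```python
# Minimal sketch of the regression idea behind computer-assisted mass
# appraisal: fit price as a linear function of one attribute (area) by
# ordinary least squares, then predict the estimated value of a new
# property. Data and variable names are invented for illustration.

def fit_ols(xs, ys):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Comparable sales: area in m^2 and observed market price.
areas = [50.0, 60.0, 80.0, 100.0]
prices = [55_000.0, 66_000.0, 88_000.0, 110_000.0]  # exactly 1100 per m^2

b0, b1 = fit_ols(areas, prices)
print(round(b0, 2), round(b1, 2))  # -> 0.0 1100.0
print(b0 + b1 * 70.0)              # predicted value of a 70 m^2 property
```

A production mass-appraisal model would use many predictors (location, age, condition) and a multi-variable solver, but the fit-then-predict structure is the same.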

  14. Hybrid quadrupole-orbitrap mass spectrometry analysis with accurate-mass database and parallel reaction monitoring for high-throughput screening and quantification of multi-xenobiotics in honey.

    Science.gov (United States)

    Li, Yi; Zhang, Jinzhen; Jin, Yue; Wang, Lin; Zhao, Wen; Zhang, Wenwen; Zhai, Lifei; Zhang, Yaping; Zhang, Yongxin; Zhou, Jinhui

    2016-01-15

    This study reports a rapid, automated screening and quantification method for the determination of multi-xenobiotic residues in honey using ultra-high performance liquid chromatography-hybrid quadrupole-Orbitrap mass spectrometry (UHPLC-Q-Orbitrap) with a user-built accurate-mass database plus parallel reaction monitoring (PRM). The database contains multi-xenobiotic information including formulas, adduct types, theoretical exact mass and retention time, characteristic fragment ions, ion ratios, and mass accuracies. A simple sample preparation method was developed to reduce xenobiotic loss in the honey samples. The screening method was validated based on retention time deviation, mass accuracy via full scan-data-dependent MS/MS (full scan-ddMS2), multi-isotope ratio, characteristic ion ratio, sensitivity, and positive/negative switching performance between the spiked sample and corresponding standard solution. The quantification method based on the PRM mode is a promising new quantitative tool which we validated in terms of selectivity, linearity, recovery (accuracy), repeatability (precision), decision limit (CCα), detection capability (CCβ), matrix effects, and carry-over. The optimized methods proposed in this study enable the automated screening and quantification of 157 compounds in less than 15 min in honey. The results of this study, as they represent a convenient protocol for large-scale screening and quantification, also provide a research approach for analysis of various contaminants in other matrices. Copyright © 2015 Elsevier B.V. All rights reserved.
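The screening step in the record above matches measured accurate masses against a user-built database within a mass-accuracy tolerance. The sketch below shows the basic ppm-window lookup behind such screening; the two compounds and the 5 ppm tolerance are illustrative assumptions, not values from the cited study.

```python
# Sketch of an accurate-mass database lookup: a measured m/z matches a
# database entry when the relative deviation is within a ppm tolerance.
# Compound masses and the 5 ppm window are illustrative assumptions.

def ppm_error(measured: float, theoretical: float) -> float:
    """Relative mass deviation in parts per million."""
    return (measured - theoretical) / theoretical * 1e6

def screen(measured_mz, database, tol_ppm=5.0):
    """Return names of database compounds within tol_ppm of measured_mz."""
    return [name for name, mz in database.items()
            if abs(ppm_error(measured_mz, mz)) <= tol_ppm]

db = {
    "thiamethoxam [M+H]+": 292.02656,
    "imidacloprid [M+H]+": 256.05957,
}

print(screen(292.02700, db))  # ~1.5 ppm deviation -> hit
print(screen(292.03200, db))  # ~18.6 ppm deviation -> no hit
```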

  15. Novel fat-link fermion actions for lattice QCD

    International Nuclear Information System (INIS)

    Zanotti, J.; Bilson-Thompson, S.; Bonnet, F.; Leinweber, D.; Melnitchouk, W.; Williams, A.

    2000-01-01

    Full text: We are currently exploring new ideas for lattice fermion actions. Naive implementations of fermion actions encounter the well known fermion-doubling problem. In order to solve this problem, Wilson introduced an irrelevant (energy) dimension-five operator (the so-called Wilson term) which explicitly breaks chiral symmetry. The scaling properties of this Wilson action can be improved by introducing any number of irrelevant operators of increasing dimension which also vanish in the continuum limit. In this manner, one can improve fermion actions at finite 'a' by combining operators to eliminate O(a) and perhaps O(a²) errors, etc. A popular formulation of a lattice fermion action that achieves this is the Clover action, which removes the O(a) error introduced by the Wilson term by introducing an additional irrelevant dimension-five operator. The Clover action can be O(a) improved to all orders in the strong coupling 'g'. While the Clover action displays excellent scaling, it is responsible for revealing the exceptional configuration problem, where the quark propagator encounters singular behaviour, particularly as the quark mass becomes small. Moreover, its free dispersion relation between energy and momentum is unchanged from the standard Wilson action dispersion and shows continuum-like behaviour only for relatively small momenta [F. X. Lee and D. B. Leinweber, Phys. Rev. D59, 074504 (1999), hep-lat/9711044]. Finally, significant chiral symmetry breaking is apparent, as the renormalised quark mass differs significantly from the bare mass of the theory. Hence we propose a different approach to fermion action improvement, one in which the additive renormalisations become small while good chiral behaviour is retained. This can be achieved through the consideration of 'fat-link' fermion actions [T. DeGrand (MILC collaboration), Phys. Rev. D60, 094501 (1999)]. Fat links are created by averaging or smearing links on the lattice with their nearest neighbours in

  16. Automated high-speed video analysis of the bubble dynamics in subcooled flow boiling

    Energy Technology Data Exchange (ETDEWEB)

    Maurus, Reinhold; Ilchenko, Volodymyr; Sattelmayer, Thomas [Technische Univ. Muenchen, Lehrstuhl fuer Thermodynamik, Garching (Germany)

    2004-04-01

Subcooled flow boiling is a commonly applied technique for achieving efficient heat transfer. In the study, an experimental investigation in the nucleate boiling regime was performed for water circulating in a closed loop at atmospheric pressure. The test-section consists of a rectangular channel with a one side heated copper strip and a very good optical access. For the optical observation of the bubble behaviour the high-speed cinematography is used. Automated image processing and analysis algorithms developed by the authors were applied for a wide range of mass flow rates and heat fluxes in order to extract characteristic length and time scales of the bubbly layer during the boiling process. Using this methodology, a huge number of bubble cycles could be analysed. The structure of the developed algorithms for the detection of the bubble diameter, the bubble lifetime, the lifetime after the detachment process and the waiting time between two bubble cycles is described. Subsequently, the results from using these automated procedures are presented. A remarkable novelty is the presentation of all results as distribution functions. This is of physical importance because the commonly applied spatial and temporal averaging leads to a loss of information and, moreover, to an unjustified deterministic view of the boiling process, which exhibits in reality a very wide spread of bubble sizes and characteristic times. The results show that the mass flux dominates the temporal bubble behaviour. An increase of the liquid mass flux reveals a strong decrease of the bubble life- and waiting time. In contrast, the variation of the heat flux has a much smaller impact. It is shown in addition that the investigation of the bubble history using automated algorithms delivers novel information with respect to the bubble lift-off probability. (Author)
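The record above stresses reporting bubble statistics as distribution functions rather than single averages, since a mean alone hides the wide spread of bubble sizes and times. A minimal sketch of that idea: build an empirical cumulative distribution function (ECDF) from a set of bubble lifetimes. The lifetime values below are invented for illustration.

```python
# Build an empirical cumulative distribution function (ECDF) from
# measured bubble lifetimes, illustrating why a distribution carries
# more information than the mean alone. Lifetimes (ms) are invented.

def ecdf(samples):
    """Return sorted values and cumulative fractions F(x) = P(X <= x)."""
    xs = sorted(samples)
    n = len(xs)
    return xs, [(i + 1) / n for i in range(n)]

lifetimes_ms = [1.2, 0.8, 2.5, 1.9, 0.8, 3.1, 1.2, 2.5]
values, fractions = ecdf(lifetimes_ms)

# The mean alone hides the spread that the distribution reveals:
mean = sum(lifetimes_ms) / len(lifetimes_ms)
print(round(mean, 3))            # -> 1.75
print(values[0], fractions[-1])  # smallest lifetime; F reaches 1.0
```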

  17. Automated high-speed video analysis of the bubble dynamics in subcooled flow boiling

    International Nuclear Information System (INIS)

    Maurus, Reinhold; Ilchenko, Volodymyr; Sattelmayer, Thomas

    2004-01-01

    Subcooled flow boiling is a commonly applied technique for achieving efficient heat transfer. In the study, an experimental investigation in the nucleate boiling regime was performed for water circulating in a closed loop at atmospheric pressure. The test-section consists of a rectangular channel with a one side heated copper strip and a very good optical access. For the optical observation of the bubble behaviour the high-speed cinematography is used. Automated image processing and analysis algorithms developed by the authors were applied for a wide range of mass flow rates and heat fluxes in order to extract characteristic length and time scales of the bubbly layer during the boiling process. Using this methodology, a huge number of bubble cycles could be analysed. The structure of the developed algorithms for the detection of the bubble diameter, the bubble lifetime, the lifetime after the detachment process and the waiting time between two bubble cycles is described. Subsequently, the results from using these automated procedures are presented. A remarkable novelty is the presentation of all results as distribution functions. This is of physical importance because the commonly applied spatial and temporal averaging leads to a loss of information and, moreover, to an unjustified deterministic view of the boiling process, which exhibits in reality a very wide spread of bubble sizes and characteristic times. The results show that the mass flux dominates the temporal bubble behaviour. An increase of the liquid mass flux reveals a strong decrease of the bubble life- and waiting time. In contrast, the variation of the heat flux has a much smaller impact. It is shown in addition that the investigation of the bubble history using automated algorithms delivers novel information with respect to the bubble lift-off probability

  18. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Automated knowledge acquisition - an aid to understanding the actions of a skilled operator

    International Nuclear Information System (INIS)

    Wright, R.M.

    1990-01-01

    The operator of an accelerator experiment often appears to be a highly skilled magician. The diagnostics used, such as oscilloscope traces, may be only indirectly related to the state of the accelerator. The operator adjusts the knobs so that the experiment runs well, but is not always able to describe in words exactly what is happening. The tool described in this paper, acc-plot-tool, was developed as a supplement to note-taking while attempting to develop an expert system that might function as an apprentice operator. During an experiment, a historical record is made of all operator actions and the resulting changes in the state of the equipment, including digitized records of the oscilloscope traces used to make decisions. These records are transformed into a set of graphical objects in the knowledge engineering environment (KEE). A flexible set of KEE functions has been developed, allowing both graphical and numerical manipulation of the data. The acc-plot-tool is not only useful in helping to understand the operator's actions, but also in discovering correlations between operator actions and system-parameter changes, in detecting equipment failures and in the mathematical modeling of the system. Examples will be given from ion-source operation. (orig.)

  20. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for the success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  1. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are adopted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation should be considered at the same time to determine the appropriate level of automation to introduce. Thus, in this paper, we suggest an estimation method that considers the positive and negative effects of automation simultaneously when determining the appropriate introduction of automation. The previous concept is limited in that it does not consider the effects of automation on human operators; thus, a new estimation method for the automation rate was suggested to overcome this problem

  2. Proteomic analysis of Bacillus thuringiensis at different growth phases by using an automated online two-dimensional liquid chromatography-tandem mass spectrometry strategy.

    Science.gov (United States)

    Huang, Shaoya; Ding, Xuezhi; Sun, Yunjun; Yang, Qi; Xiao, Xiuqing; Cao, Zhenping; Xia, Liqiu

    2012-08-01

    The proteome of a new Bacillus thuringiensis subsp. kurstaki strain, 4.0718, from the middle vegetative (T(1)), early sporulation (T(2)), and late sporulation (T(3)) phases was analyzed using an integrated liquid chromatography (LC)-based protein identification system. The system comprised two-dimensional (2D) LC coupled with nanoscale electrospray ionization (ESI) tandem mass spectrometry (MS/MS) on a high-resolution hybrid mass spectrometer with an automated data analysis system. After deletion of redundant proteins from the different batches and B. thuringiensis subspecies, 918, 703, and 778 proteins were identified in the respective three phases. Their molecular masses ranged from 4.6 Da to 477.4 Da, and their isoelectric points ranged from 4.01 to 11.84. Function clustering revealed that most of the proteins in the three phases were functional metabolic proteins, followed by proteins participating in cell processes. Small molecular and macromolecular metabolic proteins were further classified according to the Kyoto Encyclopedia of Genes and Genome and BioCyc metabolic pathway database. Three protoxins (Cry2Aa, Cry1Aa, and Cry1Ac) as well as a series of potential intracellular active factors were detected. Many significant proteins related to spore and crystal formation, including sporulation proteins, help proteins, chaperones, and so on, were identified. The expression patterns of two identified proteins, CotJc and glutamine synthetase, were validated by Western blot analysis, which further confirmed the MS results. This study is the first to use shotgun technology to research the proteome of B. thuringiensis. Valuable experimental data are provided regarding the methodology of analyzing the B. thuringiensis proteome (which can be used to produce insecticidal crystal proteins) and have been added to the related protein database.

  3. Comparison of conventional and automated breast volume ultrasound in the description and characterization of solid breast masses based on BI-RADS features.

    Science.gov (United States)

    Kim, Hyunji; Cha, Joo Hee; Oh, Ha-Yeun; Kim, Hak Hee; Shin, Hee Jung; Chae, Eun Young

    2014-07-01

    To compare the performance of radiologists in the use of conventional ultrasound (US) and automated breast volume ultrasound (ABVU) for the characterization of benign and malignant solid breast masses based on breast imaging and reporting data system (BI-RADS) criteria. Conventional US and ABVU images were obtained in 87 patients with 106 solid breast masses (52 cancers, 54 benign lesions). Three experienced radiologists who were blinded to all examination results independently characterized the lesions and reported a BI-RADS assessment category and a level of suspicion of malignancy. The results were analyzed by calculation of Cohen's κ coefficient and by receiver operating characteristic (ROC) analysis. Assessment of the agreement of conventional US and ABVU indicated that the posterior echo feature was the most discordant of the seven features (κ = 0.371 ± 0.225) and that orientation had the greatest agreement (κ = 0.608 ± 0.210). The final assessment showed substantial agreement (κ = 0.773 ± 0.104). The areas under the ROC curves (Az) for conventional US and ABVU did not differ significantly for any individual reader, but the mean Az values of conventional US and ABVU by multi-reader multi-case analysis were significantly different (conventional US 0.991, ABVU 0.963; 95 % CI -0.0471 to -0.0097). The means for sensitivity, specificity, positive predictive value, and negative predictive value of conventional US and ABVU did not differ significantly. There was substantial inter-observer agreement in the final assessment of solid breast masses by conventional US and ABVU. ROC analysis comparing the performance of conventional US and ABVU indicated a marginally significant difference in mean Az, but not in mean sensitivity, specificity, positive predictive value, or negative predictive value.

  4. 78 FR 44142 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-07-23

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... (CBP's) plan to modify the National Customs Automation Program (NCAP) tests concerning document imaging... entry process by reducing the number of data elements required to obtain release for cargo transported...

  5. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
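
The modified DBSCAN mentioned above replaces the usual Euclidean distance with Dynamic Time Warping (DTW) so that activity sequences of different lengths can be compared. A minimal DTW sketch is shown below; this is a generic illustration of the distance metric, not CARL's actual implementation:

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = minimal cost of aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]

# A sequence and a time-stretched copy of it align at zero cost,
# which is exactly why DTW suits variable-length activity traces.
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # 0.0
```

Plugging such a function in as the distance callable of a DBSCAN implementation yields the modified clustering the dissertation describes.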

  6. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  7. Metabolite Identification Using Automated Comparison of High-Resolution Multistage Mass Spectral Trees

    NARCIS (Netherlands)

    Rojas-Cherto, M.; Peironcely, J.E.; Kasper, P.T.; Hooft, van der J.J.J.; Vos, de R.C.H.; Vreeken, R.; Hankemeier, T.; Reijmers, T.

    2012-01-01

    Multistage mass spectrometry (MSn) generating so-called spectral trees is a powerful tool in the annotation and structural elucidation of metabolites and is increasingly used in the area of accurate mass LC/MS-based metabolomics to identify unknown, but biologically relevant, compounds. As a

  8. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation system and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first presents evolution of automation architecture and their associated environment in the past few decades and then presents the concept of layered system architecture and the use of automation components to support the construction of a wide variety of automation system. It also highlights the role of standards and technology, which can be used in the development of automation components. We have attempted to adhere to open standards and technology for the development of automation component at a various layers. It also highlights the application of this concept in the development of an Operator Information System (OIS) for Advanced Heavy Water Reactor (AHWR). (author)

  9. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
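
Grammar-based test generation of the kind TAO performs can be sketched as a random derivation from a context-free grammar, with a depth budget to guarantee termination. The grammar and function below are hypothetical illustrations, not TAO's API:

```python
import random

# Toy grammar: nonterminals map to lists of alternatives; each
# alternative is a sequence of symbols; plain strings are terminals.
GRAMMAR = {
    "expr": [["num"], ["expr", "+", "expr"], ["(", "expr", ")"]],
    "num":  [["0"], ["1"], ["2"]],
}

def generate(symbol, depth=6):
    """Randomly expand `symbol`; once the depth budget runs out, always
    take the first alternative so every derivation terminates."""
    if symbol not in GRAMMAR:
        return symbol  # terminal
    alts = GRAMMAR[symbol]
    alt = alts[0] if depth <= 0 else random.choice(alts)
    return "".join(generate(s, depth - 1) for s in alt)

print(generate("expr"))  # e.g. "(1+2)" -- one random test input
```

A real tool would pair each generated input with an oracle derived from the grammar's semantics; the sketch covers only the input-generation half.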

  10. Automated Performance Characterization of DSN System Frequency Stability Using Spacecraft Tracking Data

    Science.gov (United States)

    Pham, Timothy T.; Machuzak, Richard J.; Bedrossian, Alina; Kelly, Richard M.; Liao, Jason C.

    2012-01-01

    This software provides an automated capability to measure and qualify the frequency stability performance of the Deep Space Network (DSN) ground system, using daily spacecraft tracking data. The results help to verify if the DSN performance is meeting its specification, therefore ensuring commitments to flight missions; in particular, the radio science investigations. The rich set of data also helps the DSN Operations and Maintenance team to identify the trends and patterns, allowing them to identify the antennas of lower performance and implement corrective action in a timely manner. Unlike the traditional approach where the performance can only be obtained from special calibration sessions that are both time-consuming and require manual setup, the new method taps into the daily spacecraft tracking data. This new approach significantly increases the amount of data available for analysis, roughly by two orders of magnitude, making it possible to conduct trend analysis with good confidence. The software is built with automation in mind for end-to-end processing. From the inputs gathering to computation analysis and later data visualization of the results, all steps are done automatically, making the data production at near zero cost. This allows the limited engineering resource to focus on high-level assessment and to follow up with the exceptions/deviations. To make it possible to process the continual stream of daily incoming data without much effort, and to understand the results quickly, the processing needs to be automated and the data summarized at a high level. Special attention needs to be given to data gathering, input validation, handling anomalous conditions, computation, and presenting the results in a visual form that makes it easy to spot items of exception/deviation so that further analysis can be directed and corrective actions followed.
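
Frequency stability of the kind assessed here is conventionally summarized with the Allan deviation over an averaging time tau. The sketch below is a generic non-overlapping Allan deviation, shown only to illustrate the statistic; it is not the DSN software:

```python
def allan_deviation(y, m=1):
    """Non-overlapping Allan deviation of fractional-frequency samples y
    at averaging factor m (tau = m * tau0)."""
    # Average the raw samples in blocks of m
    n = len(y) // m
    blocks = [sum(y[i * m:(i + 1) * m]) / m for i in range(n)]
    # Allan variance: half the mean squared difference of adjacent blocks
    diffs = [(blocks[i + 1] - blocks[i]) ** 2 for i in range(n - 1)]
    avar = sum(diffs) / (2 * (n - 1))
    return avar ** 0.5

# A constant frequency offset is perfectly stable: ADEV = 0.
print(allan_deviation([1e-12] * 100))  # 0.0
```

Running such a statistic over each day's Doppler residuals, per antenna, is one plausible way the trend data described above could be produced.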

  11. Automated protein identification by the combination of MALDI MS and MS/MS spectra from different instruments.

    Science.gov (United States)

    Levander, Fredrik; James, Peter

    2005-01-01

    The identification of proteins separated on two-dimensional gels is most commonly performed by trypsin digestion and subsequent matrix-assisted laser desorption ionization (MALDI) with time-of-flight (TOF). Recently, atmospheric pressure (AP) MALDI coupled to an ion trap (IT) has emerged as a convenient method to obtain tandem mass spectra (MS/MS) from samples on MALDI target plates. In the present work, we investigated the feasibility of using the two methodologies in line as a standard method for protein identification. In this setup, the high mass accuracy MALDI-TOF spectra are used to calibrate the peptide precursor masses in the lower mass accuracy AP-MALDI-IT MS/MS spectra. Several software tools were developed to automate the analysis process. Two sets of MALDI samples, consisting of 142 and 421 gel spots, respectively, were analyzed in a highly automated manner. In the first set, the protein identification rate increased from 61% for MALDI-TOF only to 85% for MALDI-TOF combined with AP-MALDI-IT. In the second data set the increase in protein identification rate was from 44% to 58%. AP-MALDI-IT MS/MS spectra were in general less effective than the MALDI-TOF spectra for protein identification, but the combination of the two methods clearly enhanced the confidence in protein identification.
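
The calibration step described, using the high-accuracy MALDI-TOF peptide masses to correct the lower-accuracy AP-MALDI-IT precursor masses, can be sketched as nearest-match pairing followed by a linear least-squares correction. The function name, the 0.5 Da tolerance, and the linear model are illustrative assumptions, not the authors' software:

```python
def recalibrate(trap_mz, tof_mz, tol=0.5):
    """Match each ion-trap precursor m/z to the nearest high-accuracy
    MALDI-TOF mass within `tol`, fit m_true = a*m_obs + b by least
    squares over the matched pairs, and apply the correction."""
    pairs = []
    for mz in trap_mz:
        best = min(tof_mz, key=lambda t: abs(t - mz))
        if abs(best - mz) <= tol:
            pairs.append((mz, best))
    # Ordinary least-squares fit of observed -> reference masses
    n = len(pairs)
    sx = sum(x for x, _ in pairs); sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs); sxy = sum(x * y for x, y in pairs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return [a * mz + b for mz in trap_mz]

# Trap precursors carrying a systematic +0.2 Da offset snap back to
# the TOF reference masses once the linear correction is applied.
corrected = recalibrate([500.2, 1000.2, 1500.2], [500.0, 1000.0, 1500.0])
```

Recalibrated precursor masses then allow tighter search tolerances, which is what raises identification confidence in the combined workflow.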

  12. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  13. Optimisation of mass ranging for atom probe microanalysis and application to the corrosion processes in Zr alloys

    International Nuclear Information System (INIS)

    Hudson, D.; Smith, G.D.W.; Gault, B.

    2011-01-01

    Atom probe tomography uses time-of-flight mass spectrometry to identify the chemical nature of atoms from their mass-to-charge-state ratios. Within a mass spectrum, ranges are defined so as to attribute a chemical identity to each peak. The accuracy of atom probe microanalysis relies on the definition of these ranges. Here we propose and compare several automated ranging techniques, tested against simulated mass spectra. The performance of these metrics compare favourably with a trial of users asked to manually range a simplified simulated dataset. The optimised automated ranging procedure was then used to precisely evaluate the very low iron concentration (0.003-0.018 at%) in a zirconium alloy to reveal its behaviour in the matrix during corrosion; oxygen is injected into solution and has the effect of increasing the local iron concentration near the oxide-metal interface, which in turn affects the corrosion properties of the metal substrate. -- Research Highlights: → Realistic simulated mass spectra were generated so as to reproduce experimental data with a perfectly determined composition. → Several metrics were tested against these simulated mass spectra to determine an optimal methodology for ranging mass peaks in atom probe tomography. Systematic automated ranging provides a significant reduction in the deviation between true and measured concentrations compared to manual ranging by multiple users on the same data. → Experimental datasets were subsequently investigated, and Fe has been shown to be distributed as a random solid solution within the matrix of 'as-received' recrystallised ZIRLO, a zirconium alloy.
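
One simple automated ranging criterion, extending a range outward from the peak apex until counts fall to the background level, can be sketched as follows. The paper compares several metrics; this particular width-at-background rule is an illustrative assumption, not necessarily one of theirs:

```python
def range_peak(counts, apex, floor):
    """Grow a mass range left and right from the apex bin until counts
    drop to the background level `floor`.  Wider ranges recover more
    signal but admit more background -- the trade-off automated
    ranging metrics try to optimize."""
    lo = hi = apex
    while lo > 0 and counts[lo - 1] > floor:
        lo -= 1
    while hi < len(counts) - 1 and counts[hi + 1] > floor:
        hi += 1
    return lo, hi

spectrum = [1, 1, 2, 8, 40, 90, 35, 6, 2, 1, 1]
print(range_peak(spectrum, apex=5, floor=2))  # (3, 7)
```

Applied systematically to every peak of a simulated spectrum with known composition, such a rule can be scored against the true answer, which is how the study compares automated ranging to manual ranging by users.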

  14. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    non-peer-reviewed In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  15. Discrimination between smiling faces: Human observers vs. automated face analysis.

    Science.gov (United States)

    Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo

    2018-05-11

    This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  17. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  18. Portable, remotely operated, computer-controlled, quadrupole mass spectrometer for field use

    International Nuclear Information System (INIS)

    Friesen, R.D.; Newton, J.C.; Smith, C.F.

    1982-04-01

    A portable, remote-controlled mass spectrometer was required at the Nevada Test Site to analyze prompt post-event gas from the nuclear cavity in support of the underground testing program. A Balzers QMG-511 quadrupole was chosen for its ability to be interfaced to a DEC LSI-11 computer and to withstand the ground movement caused by this field environment. The inlet system valves, the pumps, the pressure and temperature transducers, and the quadrupole mass spectrometer are controlled by a read-only-memory-based DEC LSI-11/2 with a high-speed microwave link to the control point which is typically 30 miles away. The computer at the control point is a DEC LSI-11/23 running the RSX-11 operating system. The instrument was automated as much as possible because the system is run by inexperienced operators at times. The mass spectrometer has been used on an initial field event with excellent performance. The gas analysis system is described, including automation by a novel computer control method which reduces operator errors and allows dynamic access to the system parameters

  19. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Science.gov (United States)

    2011-06-13

    ... Environment (ACE); Announcement of National Customs Automation Program Test of Automated Procedures for In... Customs Automation Program (NCAP) test relating to highway movements of commercial goods that are transported in-bond through the United States from one point in Canada to another point in Canada. The NCAP...

  20. Automated setup for characterization of intact histone tails in Suz12-/- stem cells

    DEFF Research Database (Denmark)

    Sidoli, Simone; Schwämmle, Veit; Hansen, Thomas Aarup

    Epigenetics is defined as the study of heritable changes that occur without modifying the DNA sequence. Histone proteins are crucial components of epigenetic mechanisms and regulation, since they are fundamental for chromatin structure. Mass spectrometry-based proteomics is already an integrated ... developed a high-resolving and automated LC-MS/MS setup to characterize intact histone tails (middle-down strategy)...

  1. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 °C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  2. Direct analysis by time-of-flight secondary ion mass spectrometry reveals action of bacterial laccase-mediator systems on both hardwood and softwood samples.

    Science.gov (United States)

    Goacher, Robyn E; Braham, Erick J; Michienzi, Courtney L; Flick, Robert M; Yakunin, Alexander F; Master, Emma R

    2017-12-29

    The modification and degradation of lignin play a vital role in carbon cycling as well as production of biofuels and bioproducts. The possibility of using bacterial laccases for the oxidation of lignin offers a route to utilize existing industrial protein expression techniques. However, bacterial laccases are most frequently studied on small model compounds that do not capture the complexity of lignocellulosic materials. This work studied the action of laccases from Bacillus subtilis and Salmonella typhimurium (EC 1.10.3.2) on ground wood samples from yellow birch (Betula alleghaniensis) and red spruce (Picea rubens). The ability of bacterial laccases to modify wood can be facilitated by small molecule mediators. Herein, 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS), gallic acid and sinapic acid mediators were tested. Direct analysis of the wood samples was achieved by time-of-flight secondary ion mass spectrometry (ToF-SIMS), a surface sensitive mass spectrometry technique that has characteristic peaks for H, G and S lignin. The action of the bacterial laccases on both wood samples was demonstrated and revealed a strong mediator influence. The ABTS mediator led to delignification, evident in an overall increase of polysaccharide peaks in the residual solid, along with equal loss of G and S-lignin peaks. The gallic acid mediator demonstrated minimal laccase activity. Meanwhile, the sinapic acid mediator altered the S/G peak ratio consistent with mediator attaching to the wood solids. The current investigation demonstrates the action of bacterial laccase-mediator systems directly on woody materials, and the potential of using ToF-SIMS to uncover the fundamental and applied role of bacterial enzymes in lignocellulose conversion. © 2017 Scandinavian Plant Physiology Society.
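
The S/G peak-ratio analysis described here reduces to summing intensities of characteristic syringyl and guaiacyl fragment peaks. The nominal m/z assignments below are commonly quoted for ToF-SIMS lignin analysis, but treat them, and the function itself, as assumptions of this sketch rather than the authors' exact peak list:

```python
# Characteristic ToF-SIMS lignin fragment peaks (nominal m/z) --
# assumed assignments for illustration only.
G_PEAKS = (137, 151)   # guaiacyl (G) lignin
S_PEAKS = (167, 181)   # syringyl (S) lignin

def sg_ratio(spectrum):
    """S/G intensity ratio from a {nominal m/z: counts} spectrum."""
    s = sum(spectrum.get(mz, 0) for mz in S_PEAKS)
    g = sum(spectrum.get(mz, 0) for mz in G_PEAKS)
    return s / g

print(sg_ratio({137: 300, 151: 200, 167: 600, 181: 400}))  # 2.0
```

A shift in this ratio between treated and untreated wood is the kind of evidence used above to infer selective lignin modification or mediator grafting.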

  3. Evaluation of comfort in bedridden older adults using an air-cell mattress with an automated turning function: measurement of parasympathetic activity during night sleep.

    Science.gov (United States)

    Futamura, Megumi; Sugama, Junko; Okuwa, Mayumi; Sanada, Hiromi; Tabata, Keiko

    2008-12-01

    This study objectively evaluated the degree of comfort in bedridden older adults using an air-cell mattress with an automated turning mechanism. The sample included 10 bedridden women with verbal communication difficulties. The high frequency (HF) components of heart rate variability, which reflect parasympathetic nervous activity, were compared for the manual and automated turning periods. No significant differences in the HF component were observed in 5 of the participants. Significant increases in the HF component associated with automated turning were observed in 3 participants; however, the two participants with the lowest body mass index values exhibited a significant reduction in the HF component during the automated turning period. The results revealed that comfort might not be disturbed during the automated turning period.
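
Extracting the HF component of heart rate variability amounts to computing band power (roughly 0.15-0.4 Hz) of the R-R interval series after resampling it onto a uniform time grid. The sketch below is a simplification (clinical analyses typically add detrending and Welch averaging); the function name and 4 Hz resampling rate are assumptions:

```python
import numpy as np

def hf_power(rr_ms, fs=4.0):
    """Power in the HF band (0.15-0.4 Hz) of heart-rate variability.
    rr_ms: successive R-R intervals in milliseconds."""
    t = np.cumsum(rr_ms) / 1000.0                # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)      # uniform time grid
    x = np.interp(grid, t, rr_ms)                # resampled tachogram
    x = x - x.mean()                             # remove DC component
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    p = np.abs(np.fft.rfft(x)) ** 2 / len(x)     # simple periodogram
    band = (f >= 0.15) & (f < 0.40)
    return p[band].sum()
```

A perfectly regular pulse yields zero HF power, while respiratory-frequency modulation of the R-R intervals (the parasympathetic signature the study tracks) produces a large value.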

  4. A LARGE LIFE INSURANCE COMPANY AUTOMATES. WORKFORCE IMPLICATIONS OF COMPUTER CONVERSION. AUTOMATION PROGRAM REPORT, NUMBER 3.

    Science.gov (United States)

    CIBARICH, AUGUST L.; AND OTHERS

    This was one of 20 demonstration projects initiated in 11 states in 1961-63 to gain experience with labor market problems arising from changing technology and mass layoffs. The fundamental aim was to combine action and research to demonstrate what the State Employment Service could do in areas where the labor market was rapidly changing.…

  5. Role of centre vortices in dynamical mass generation

    International Nuclear Information System (INIS)

    Leinweber, Derek B.; Bowman, Patrick O.; Heller, Urs M.; Kusterer, Daniel-Jens; Langfeld, Kurt; Williams, Anthony G.

    2006-01-01

    The mass and renormalization functions of the nonperturbative quark propagator are studied in SU(3) gauge field theory with a Symanzik-improved gluon action and the AsqTad fermion action. Centre vortices in the gauge field are identified by fixing to maximal centre gauge. The role of centre vortices in dynamical mass generation is explored by removing centre vortices from the gauge fields and studying the associated changes in the quark propagator. We find that dynamical mass generation survives in the vortex-removed SU(3) gauge field theory despite the vanishing of the string tension and suppression of the gluon propagator in the infrared suggesting the possibility of decoupling dynamical mass generation from confinement

  6. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade‐induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005‐2010. In particular, we establish a causal effect where firms that have specialized in product types for which the Chinese exports to the world market have risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in scale and scope of automation have faster productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation.

  7. Automated analysis of information processing, kinetic independence and modular architecture in biochemical networks using MIDIA.

    Science.gov (United States)

    Bowsher, Clive G

    2011-02-15

    Understanding the encoding and propagation of information by biochemical reaction networks and the relationship of such information processing properties to modular network structure is of fundamental importance in the study of cell signalling and regulation. However, a rigorous, automated approach for general biochemical networks has not been available, and high-throughput analysis has therefore been out of reach. Modularization Identification by Dynamic Independence Algorithms (MIDIA) is a user-friendly, extensible R package that performs automated analysis of how information is processed by biochemical networks. An important component is the algorithm's ability to identify exact network decompositions based on both the mass action kinetics and informational properties of the network. These modularizations are visualized using a tree structure from which important dynamic conditional independence properties can be directly read. Only partial stoichiometric information needs to be used as input to MIDIA, and neither simulations nor knowledge of rate parameters are required. When applied to a signalling network, for example, the method identifies the routes and species involved in the sequential propagation of information between its multiple inputs and outputs. These routes correspond to the relevant paths in the tree structure and may be further visualized using the Input-Output Path Matrix tool. MIDIA remains computationally feasible for the largest network reconstructions currently available and is straightforward to use with models written in Systems Biology Markup Language (SBML). The package is distributed under the GNU General Public License and is available, together with a link to browsable Supplementary Material, at http://code.google.com/p/midia. Further information is at www.maths.bris.ac.uk/~macgb/Software.html.
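
Mass-action kinetics, the kind of rate law MIDIA reads from an SBML model, assigns each reaction a rate proportional to the product of its reactant concentrations. A minimal sketch for a single reversible reaction A + B ⇌ C, using simple Euler integration (an illustration, not MIDIA code):

```python
def simulate(a, b, c, kf, kr, dt=0.001, steps=10000):
    """Euler integration of the mass-action system A + B <-> C,
    with net rate = kf*[A][B] - kr*[C]."""
    for _ in range(steps):
        rate = kf * a * b - kr * c
        a -= rate * dt   # A consumed by the forward reaction
        b -= rate * dt   # B consumed at the same rate
        c += rate * dt   # C produced
    return a, b, c

# At equilibrium the forward and reverse fluxes balance: kf*[A][B] = kr*[C]
a, b, c = simulate(1.0, 1.0, 0.0, kf=1.0, kr=1.0)
```

From the stoichiometry and such rate laws, tools like MIDIA derive which species' dynamics are conditionally independent of which others, without ever needing the rate constants themselves.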

  8. Inselect: Automating the Digitization of Natural History Collections.

    Directory of Open Access Journals (Sweden)

    Lawrence N Hudson

    The world's natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect, a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization.

  9. Factory automation for heavy electric equipment

    International Nuclear Information System (INIS)

    Rokutani, Takashi; Ninomiya, Iwao; Hatayama, Naokatsu; Kato, Hiroshi; Yano, Hideaki.

    1986-01-01

    The heightening of productivity in factories manufacturing heavy electric equipment has so far been advanced by the rationalization of direct work, such as NC machines and robots, and the adoption of FMS (flexible manufacturing systems). However, as CAD advances, the effective utilization of these data and the expansion to future CIM (computer-integrated manufacturing) have come to be demanded. In the Hitachi Works of Hitachi Ltd., it was decided to advance comprehensive rationalization by adopting an FA (factory automation) system. Steam turbine blades, piping for nuclear power plants, and motor coils were taken up as the objects since these are important parts, and the FA projects for these three sections were planned simultaneously for the purposes of drastically raising the QA level, improving accuracy and shortening processes, synchronizing with the field installation schedule, and creating a safe workplace environment. When the automation of non-mass-production factories is promoted, there are unmanned factories combining FMS lines for relatively many products, and factories characterized by FMC that shortens setup time for small-lot products; this is an example of the former. The system constitution for FA and the production management combined with it are described. The high reliability of the optical network was regarded as important. (Kako, I.)

  10. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
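The envisaged "systematic review as a computer program" can be sketched as a chain of task functions, each consuming the output of the previous stage. Every stage below is a hypothetical stand-in for the real retrieval, appraisal and synthesis components:

```python
# Illustrative pipeline sketch only; data and stage logic are invented.

def retrieve(query):
    # Stand-in for searching trial registries and databases.
    return [{"id": 1, "effect": 0.4, "bias": "low"},
            {"id": 2, "effect": 0.6, "bias": "high"},
            {"id": 3, "effect": 0.5, "bias": "low"}]

def appraise(trials):
    # Stand-in for risk-of-bias screening: keep only low-bias trials.
    return [t for t in trials if t["bias"] == "low"]

def synthesize(trials):
    # Stand-in for meta-analysis: naive unweighted mean of effect sizes.
    return sum(t["effect"] for t in trials) / len(trials)

def review(query):
    # The whole review as one composed program, as the abstract envisages.
    return synthesize(appraise(retrieve(query)))

print(review("drug X vs placebo"))  # 0.45
```

A real implementation would weight studies by precision and report heterogeneity; the point here is only the composability of the task stages.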

  11. Automated detection and classification of cryptographic algorithms in binary programs through machine learning

    OpenAIRE

    Hosfelt, Diane Duros

    2015-01-01

    Threats from the internet, particularly malicious software (i.e., malware) often use cryptographic algorithms to disguise their actions and even to take control of a victim's system (as in the case of ransomware). Malware and other threats proliferate too quickly for the time-consuming traditional methods of binary analysis to be effective. By automating detection and classification of cryptographic algorithms, we can speed program analysis and more efficiently combat malware. This thesis wil...
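One widely used static feature for detecting cryptographic code in binaries, which a machine-learning classifier like the one this thesis describes could consume, is the presence of well-known algorithm constants. The sketch below is a generic illustration of that heuristic, not the thesis's method:

```python
# Minimal signature-scanning sketch: look for well-known cryptographic
# constants in a binary blob. The two signatures are real published
# constants; everything else here is illustrative.

SIGNATURES = {
    "SHA-256": bytes.fromhex("6a09e667"),   # first SHA-256 initial hash word H0
    "AES":     bytes.fromhex("637c777b"),   # first four AES S-box entries
}

def detect_crypto(binary: bytes):
    """Return the sorted names of algorithms whose signature appears."""
    return sorted(name for name, sig in SIGNATURES.items() if sig in binary)

# A blob containing the first two SHA-256 initialisation words:
blob = b"\x00" * 16 + bytes.fromhex("6a09e667bb67ae85") + b"\x90" * 8
print(detect_crypto(blob))  # ['SHA-256']
```

In practice such signature hits would be just one feature among many (entropy, instruction mix, loop structure) fed to the learned classifier.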

  12. Structure, antihyperglycemic activity and cellular actions of a novel diglycated human insulin

    DEFF Research Database (Denmark)

    O'Harte, F P; Boyd, A C; McKillop, A M

    2000-01-01

    Human insulin was glycated under hyperglycemic reducing conditions and a novel diglycated form (M(r) 6135.1 Da) was purified by RP-HPLC. Endoproteinase Glu-C digestion combined with mass spectrometry and automated Edman degradation localized glycation to Gly(1) and Phe(1) of the insulin A- and B-...

  13. SAIDE: A Semi-Automated Interface for Hydrogen/Deuterium Exchange Mass Spectrometry.

    Science.gov (United States)

    Villar, Maria T; Miller, Danny E; Fenton, Aron W; Artigues, Antonio

    2010-01-01

    Deuterium/hydrogen exchange in combination with mass spectrometry (DH MS) is a sensitive technique for detection of changes in protein conformation and dynamics. Since temperature, pH and timing control are the key elements for reliable and efficient measurement of hydrogen/deuterium content in proteins and peptides, we have developed a small, semi-automatic interface for deuterium exchange that couples the HPLC pumps with a mass spectrometer. This interface is relatively inexpensive to build, and provides efficient temperature and timing control in all stages of enzyme digestion, HPLC separation and mass analysis of the resulting peptides. We have tested this system with a series of standard tryptic peptides reconstituted in a solvent containing increasing concentrations of deuterium. Our results demonstrate that the use of this interface results in minimal loss of deuterium due to back exchange during HPLC desalting and separation. For peptides reconstituted in a buffer containing 100% deuterium, and assuming that all amide linkages have exchanged hydrogen for deuterium, the maximum loss of deuterium content is only 17% of the label, indicating the loss of only one deuterium atom per peptide.
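The arithmetic behind such uptake figures is simple: a peptide with n exchangeable amide hydrogens gains up to n times the H/D mass difference at full labelling, so fractional uptake follows from the observed mass shift. This is a generic hedged sketch of that calculation, not code from the SAIDE system:

```python
# Fractional deuterium uptake from a measured mass shift.
# 1.00628 Da is the mass difference between deuterium (2H) and protium (1H).

MASS_DIFF_HD = 1.00628  # Da

def fractional_uptake(m_undeuterated, m_observed, n_exchangeable):
    """Observed mass shift divided by the maximum possible shift."""
    max_shift = n_exchangeable * MASS_DIFF_HD
    return (m_observed - m_undeuterated) / max_shift

# Example: a peptide with 10 exchangeable amides recovered with 83% of the
# label, i.e. ~17% lost to back exchange as reported in the abstract.
uptake = fractional_uptake(1000.0, 1000.0 + 8.3 * MASS_DIFF_HD, 10)
print(round(uptake, 2))  # 0.83
```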

  14. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Full Text Available Contemporary organizations face big data security challenges in the cyber environment due to modern threats and the current business working model, which relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Today's data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to strengthen the data security domain by classifying information based on its importance, conducting risk assessment plans and using the most cost-effective data obfuscation technique. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system has capabilities to learn from previous experiences, thus improving itself and reducing the risk of wrong data classification.
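The selection logic the paper automates (there with NLP and learning) can be illustrated by a deliberately simple rule-based stand-in. The field names, sensitivity classes, and class-to-technique mapping below are all invented for illustration:

```python
# Toy classify-then-protect sketch: map a field to a sensitivity class,
# then pick the cheapest obfuscation technique adequate for that class.

def classify(field_name):
    # Hypothetical lookup; a real system would infer this with NLP/ML.
    sensitivity = {"ssn": "restricted", "card_number": "restricted",
                   "email": "internal", "nickname": "public"}
    return sensitivity.get(field_name, "internal")  # default: internal

TECHNIQUE = {  # assumed cost-ordered mapping: strongest only where needed
    "restricted": "encryption",
    "internal":   "masking",
    "public":     "none",
}

def protect(field_name):
    return TECHNIQUE[classify(field_name)]

print(protect("card_number"))  # encryption
print(protect("nickname"))     # none
```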

  15. The Automation-by-Expertise-by-Training Interaction.

    Science.gov (United States)

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training, and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire the necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  16. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on a Distribution Automation (DA) system, which enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O&M requirements.

  17. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  18. WIRELESS HOME AUTOMATION SYSTEM BASED ON MICROCONTROLLER

    Directory of Open Access Journals (Sweden)

    MUNA H. SALEH

    2017-11-01

    Full Text Available This paper presents the development of a Global System for Mobile Communications (GSM)-based control of a home air-conditioner for a home automation system. The main aim of the prototype development is to reduce electricity wastage. A GSM module was used for receiving Short Message Service (SMS) messages from the user's mobile phone that automatically enable the controller to take further action, such as switching the home air-conditioner ON and OFF. The system controls the air-conditioner based on the temperature reading from the sensor. At regular intervals, the temperature sensor sends its reading to the Micro Controller Unit (MCU) through ZigBee. Based on the temperature reading, the MCU sends an ON or OFF signal to the switch. Additionally, the system allows the user to operate or shut down the air-conditioner remotely through SMS.
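The temperature-based ON/OFF behaviour described above amounts to a thermostat control loop. A minimal sketch follows; the set-point, thresholds and hysteresis band are assumptions (the hysteresis prevents rapid ON/OFF cycling around the set-point), not values from the paper:

```python
# Toy MCU-style control loop: compare each periodic temperature reading
# against a set-point and issue ON/OFF commands to the air-conditioner switch.

SET_POINT = 24.0   # degrees C, e.g. as requested by the user via SMS
HYSTERESIS = 1.0   # dead band to avoid rapid switching

def next_command(reading, currently_on):
    if reading > SET_POINT + HYSTERESIS:
        return "ON"
    if reading < SET_POINT - HYSTERESIS:
        return "OFF"
    return "ON" if currently_on else "OFF"  # inside the band: keep state

state = False
log = []
for t in [22.0, 24.5, 26.0, 25.5, 22.5]:  # simulated periodic sensor readings
    state = next_command(t, state) == "ON"
    log.append((t, "ON" if state else "OFF"))
print(log)  # [(22.0, 'OFF'), (24.5, 'OFF'), (26.0, 'ON'), (25.5, 'ON'), (22.5, 'OFF')]
```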

  19. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS.

    Science.gov (United States)

    Zarate, Erica; Boyle, Veronica; Rupprecht, Udo; Green, Saras; Villas-Boas, Silas G; Baker, Philip; Pinu, Farhana R

    2016-12-29

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the best of our knowledge, this is the first time that an automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  20. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS

    Directory of Open Access Journals (Sweden)

    Erica Zarate

    2016-12-01

    Full Text Available Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughputs. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  1. Doubly Self-Dual Actions in Various Dimensions

    CERN Document Server

    Ferrara, S.; Yeranyan, A.

    2015-05-11

    The self-duality of the N=1 supersymmetric Born-Infeld action implies a double self-duality of the tensor multiplet square-root action when the scalar and the antisymmetric tensor are interchanged via Poincaré duality. We show how this phenomenon extends to D space-time dimensions for non-linear actions involving pairs of forms of rank p and D-p-2. As a byproduct, we construct a new two-field generalization of the Born-Infeld action whose equations of motion are invariant under a U(1) duality. In these systems, the introduction of Green-Schwarz terms results in explicit non-linear mass-like terms for dual massive pairs.
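For orientation, the four-dimensional Born-Infeld Lagrangian whose self-duality the abstract builds on can be written as follows (standard textbook form with the non-linearity scale set to one; this is background knowledge, not taken from the record):

```latex
\mathcal{L}_{\mathrm{BI}}
  = 1 - \sqrt{-\det\!\left(\eta_{\mu\nu} + F_{\mu\nu}\right)}
  = 1 - \sqrt{1 + \tfrac{1}{2}\,F_{\mu\nu}F^{\mu\nu}
              - \tfrac{1}{16}\left(F_{\mu\nu}\tilde{F}^{\mu\nu}\right)^{2}} ,
```

where $\tilde{F}^{\mu\nu}$ is the dual field strength. The equations of motion derived from $\mathcal{L}_{\mathrm{BI}}$ are invariant under U(1) electric-magnetic duality rotations, the property generalized in the abstract to pairs of p-form and (D-p-2)-form fields.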

  2. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    The purpose of the programme activity on human-centred automation at the HRP is to develop knowledge (in the form of models and theories) and tools (in the form of techniques and simulators) to support the design of automation that ensures effective human performance and comprehension. This report presents the work done on both the analytical and experimental sides of this project. The analytical work has surveyed common definitions of automation and traditional design principles. A general finding is that human-centred automation usually is defined in terms of what it is not. This is partly due to a lack of adequate models of human-automation interaction. Another result is a clarification of the consequences of automation, in particular with regard to situation awareness and workload. The experimental work has taken place as an explorative experiment in HAMMLAB in collaboration with IPSN (France). The purpose of this experiment was to increase the understanding of how automation influences operator performance in NPP control rooms. Two different types of automation (extensive and limited) were considered in scenarios having two different degrees of complexity (high and low), and involving diagnostic and procedural tasks. Six licensed NPP crews from the NPP at Loviisa, Finland, participated in the experiment. The dependent variables applied were plant performance, operator performance, self-rated crew performance, situation awareness, workload, and operator trust in the automation. The results from the diagnostic scenarios indicated that operators' judgement of crew efficiency was related to their level of trust in the automation, and further that operators trusted automation least and rated crew performance lowest in situations where crew performance was efficient, and vice versa. The results from procedural scenarios indicated that extensive automation efficiently supported operators' performance, and further that operators' judgement of crew performance efficiency

  3. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  4. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King's College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  5. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  6. MetAlign: Interface-Driven, Versatile Metabolomics Tool for Hyphenated Full-Scan Mass Spectrometry Data Preprocessing

    NARCIS (Netherlands)

    Lommen, A.

    2009-01-01

    Hyphenated full-scan MS technology creates large amounts of data. A versatile, easy-to-handle automation tool aiding in the data analysis is very important in handling such a data stream. MetAlign software, as described in this manuscript, handles a broad range of accurate mass and nominal mass GC/MS

  7. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number; high- and low-magnification imaging and processing; elemental mapping and enhancement; and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished, along with automated size, shape, and composition analysis over a large relative area.

  8. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they do not always succeed in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  9. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  10. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
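The Bayesian Belief Network calculation at the core of such a model can be illustrated with a deliberately tiny network. The structure and all probabilities below are invented stand-ins for the SME-elicited values, shown only to make the "relative risk reduction" idea concrete:

```python
# Toy two-node belief-network sketch: P(automation-related error) given
# whether a mitigating AvSP-style technology is inserted. All numbers are
# illustrative placeholders, not data from the FLAP model.

P_COMPLACENCY = 0.3  # prior probability of pilot complacency
P_ERROR = {  # (complacent, technology_inserted) -> P(error)
    (True, False): 0.20, (True, True): 0.08,
    (False, False): 0.05, (False, True): 0.02,
}

def p_error(technology_inserted):
    # Marginalize over the complacency node.
    return (P_COMPLACENCY * P_ERROR[(True, technology_inserted)]
            + (1 - P_COMPLACENCY) * P_ERROR[(False, technology_inserted)])

baseline, mitigated = p_error(False), p_error(True)
print(round(baseline, 3), round(mitigated, 3))  # 0.095 0.038
print(round(1 - mitigated / baseline, 2))       # relative risk reduction: 0.6
```

The real model maps many latent and active causal factors this way and propagates the effect of inserting each technology through the network.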

  11. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic steps in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  12. Advanced accounting techniques in automated fuel fabrication facilities

    International Nuclear Information System (INIS)

    Carlson, R.L.; DeMerschman, A.W.; Engel, D.W.

    1977-01-01

    The accountability system being designed for automated fuel fabrication facilities will provide real-time information on all Special Nuclear Material (SNM) located in the facility. It will utilize a distributed network of microprocessors and minicomputers to monitor material movement and obtain nuclear materials measurements directly from remote, in-line Nondestructive Assay instrumentation. As SNM crosses an accounting boundary, the accountability computer will update the master files and generate audit-trail records. Mass-balance accounting techniques will be used around each unit process step, while item control will be used to account for encapsulated material and SNM in transit.
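The mass-balance accounting applied around each unit process step reduces to a small bookkeeping identity: book inventory (beginning inventory plus receipts minus removals) is compared with the measured ending inventory, and the difference is the material unaccounted for (MUF). The sketch below is a generic illustration of that identity, not the facility's software:

```python
# Material-balance bookkeeping around one unit process step.

def muf(beginning, receipts, removals, ending_measured):
    """Material unaccounted for: book inventory minus measured inventory."""
    book = beginning + receipts - removals
    return book - ending_measured

# Example (invented figures): 12.0 kg on hand, 3.0 kg received,
# 4.5 kg transferred out, 10.4 kg measured by the in-line NDA instruments.
print(round(muf(12.0, 3.0, 4.5, 10.4), 3))  # 0.1
```

In a real system each MUF value would be compared against measurement uncertainty to decide whether an alarm and audit-trail investigation is warranted.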

  13. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
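The two feature classes discussed above can be made concrete with a toy rule-based detector: one "physical model" feature (inter-query timing too fast for a human) and one "behavioural" feature (many queries with no clicks). All values and thresholds here are invented; the paper's actual system uses learned classifiers over many such features:

```python
# Toy session-level detector for automated search traffic.
# A session is (query_count, click_count, duration_in_seconds).

def is_automated(session):
    queries, clicks, duration_s = session
    rate = queries / max(duration_s, 1e-9)
    physically_implausible = rate > 1.0          # sustained >1 query/second
    behaviourally_odd = queries >= 50 and clicks == 0  # mass queries, no clicks
    return physically_implausible or behaviourally_odd

human = (12, 5, 600.0)    # 12 queries, 5 clicks over 10 minutes
bot   = (300, 0, 120.0)   # 300 queries, no clicks, in 2 minutes
print(is_automated(human), is_automated(bot))  # False True
```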

  14. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  15. Integration of biotechnology, visualisation technology and robot technology for automated mass propagation af elite trees

    DEFF Research Database (Denmark)

    Find, Jens

    for the production of Christmas trees and Sitka spruce has gained renewed interest as a fast-growing species for the production of biofuels. These species are used as model systems for the development of automated plant production based on robot and visualisation technology. The commercial aspect of the project aims at: 1) the market for cloned elite plants in the forestry sector, and 2) the market for robot technology in the production of plants for the forestry sector.

  16. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less-than-optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation, are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  17. A Taxonomy for Heavy-Duty Telemanipulation Tasks Using Elemental Actions

    Directory of Open Access Journals (Sweden)

    Alexander Owen-Hill

    2013-10-01

    Full Text Available In the maintenance of large scientific facilities, telemanipulation procedures can involve various subprocedures which in turn are made up of a sequence of subtasks. This work presents a taxonomy which describes a set of elemental actions for heavy-duty telemanipulation, along with an example of these actions in a standard maintenance subprocedure. As maintenance tasks often differ greatly at a high level, this generalized way of deconstructing tasks allows a highly adaptable approach to describing the sequence of any procedure, which can then be used for such applications as task monitoring, automation or detection of incomplete tasks. We describe in detail the properties of each elemental action and apply the taxonomy to an example subprocedure to show how the process can be generalized. An automatic state-machine creation stage is shown, which would be used at the task scheduling stage to simplify calculations carried out during the moment-by-moment execution of the task.
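    The automatic state-machine creation stage described above can be sketched as follows. The elemental action names are invented stand-ins for illustration, since the paper's actual taxonomy labels are not reproduced here.

    ```python
    # Build a linear state machine from a sequence of elemental actions
    # (action names are hypothetical stand-ins for the paper's taxonomy).

    def build_state_machine(actions):
        """Return {state: (action, next_state)}; the last action leads to 'done'."""
        machine = {}
        for i, action in enumerate(actions):
            nxt = i + 1 if i + 1 < len(actions) else "done"
            machine[i] = (action, nxt)
        return machine

    def run(machine, state=0):
        """Execute the machine, yielding actions until 'done' is reached."""
        while state != "done":
            action, state = machine[state]
            yield action

    subprocedure = ["approach", "grasp", "unscrew", "extract", "place"]
    sm = build_state_machine(subprocedure)
    print(list(run(sm)))  # replays the action sequence in order
    ```

    A monitoring application could compare the yielded actions against observed operator actions to flag an incomplete task, which is one of the uses the abstract mentions.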

  18. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  19. Man-machine interface versus full automation

    International Nuclear Information System (INIS)

    Hatton, V.

    1984-01-01

    As accelerators grow in size and complexity of operation there is an increasing economic as well as operational incentive for the controls and operations teams to use computers to improve the man-machine interface. At first the computer network replaced the traditional controls racks filled with knobs, buttons and digital displays of voltages and potentiometer readings. The computer system provided the operator with an extension of his hands and eyes. It was quickly found that much more could be achieved. Where previously it was necessary for the human operator to decide the order of the actions to be executed by the computer as a result of a visual indication of malfunctioning of the accelerator, the operation is now coming more and more under the direct control of the computer system. Expert knowledge is programmed into the system to help the non-specialist make decisions and to safeguard the equipment. Machine physics concepts have been incorporated, and critical machine parameters can be optimised easily by physicists or operators without any detailed knowledge of the intervening medium or of the equipment being controlled. As confidence grows and reliability improves, more and more automation can be added. How far can this process of automation replace the skilled operator? Can the accelerators of tomorrow be run like the ever-increasing robotic assembly plants of today? How is the role of the operator changing in this new environment?

  20. Selecting automation for the clinical chemistry laboratory.

    Science.gov (United States)

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  1. Automated detection of geomagnetic storms with heightened risk of GIC

    Science.gov (United States)

    Bailey, Rachel L.; Leonhardt, Roman

    2016-06-01

    Automated detection of geomagnetic storms is of growing importance to operators of technical infrastructure (e.g., power grids, satellites) that is susceptible to damage caused by the consequences of geomagnetic storms. In this study, we compare three methods for automated geomagnetic storm detection: a method analyzing the first derivative of the geomagnetic variations, another looking at the Akaike information criterion, and a third using multi-resolution analysis of the maximal overlap discrete wavelet transform of the variations. These detection methods are used in combination with an algorithm for the detection of coronal mass ejection shock fronts in ACE solar wind data prior to the storm arrival at Earth as an additional constraint for possible storm detection. The maximal overlap discrete wavelet transform is found to be the most accurate of the detection methods. The final storm detection software, implementing analysis of both satellite solar wind and geomagnetic ground data, detects 14 of the 15 more powerful geomagnetic storms over a period of 2 years.
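    The simplest of the three detection methods, thresholding the first derivative of the geomagnetic variations, can be illustrated with a toy sketch. The sampling interval, units, and threshold below are arbitrary assumptions, not values from the study.

    ```python
    # Toy first-derivative storm-onset detector. A sudden jump in the
    # field component produces a large |dB/dt|, which is the signature
    # this method thresholds. Units and threshold are arbitrary.

    def detect_onsets(b, dt=60.0, threshold=0.5):
        """Return sample indices where |dB/dt| (per dt seconds) exceeds threshold."""
        onsets = []
        for i in range(1, len(b)):
            if abs(b[i] - b[i - 1]) / dt > threshold:
                onsets.append(i)
        return onsets

    # Quiet field with one abrupt storm commencement at index 5:
    series = [100.0, 100.1, 99.9, 100.0, 100.2, 180.0, 175.0, 172.0]
    print(detect_onsets(series, dt=60.0, threshold=0.5))  # [5]
    ```

    The wavelet-based method the study favors serves the same end but separates the storm signature by frequency band rather than by a single pointwise derivative, which makes it less sensitive to isolated noise spikes.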

  2. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  3. Automated solid-phase extraction of herbicides from water for gas chromatographic-mass spectrometric analysis

    Science.gov (United States)

    Meyer, M.T.; Mills, M.S.; Thurman, E.M.

    1993-01-01

    An automated solid-phase extraction (SPE) method was developed for the pre-concentration of chloroacetanilide and triazine herbicides, and two triazine metabolites, from 100-ml water samples. Breakthrough experiments for the C18 SPE cartridge show that the two triazine metabolites are not fully retained and that increasing flow-rate decreases their retention. Standard curve r^2 values of 0.998-1.000 for each compound were consistently obtained, and a quantitation level of 0.05 µg/l was achieved for each compound tested. More than 10,000 surface and ground water samples have been analyzed by this method.
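    The standard-curve quantitation behind the quoted r^2 values amounts to an ordinary least-squares fit of detector response against standard concentration. A minimal sketch, with made-up calibration points (not the study's data), is:

    ```python
    # Least-squares calibration line and r^2 for a standard curve.
    # Concentrations (ug/L) and responses below are invented data.

    def linfit(x, y):
        """Return (slope, intercept, r_squared) for a least-squares line."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        intercept = my - slope * mx
        ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
        ss_tot = sum((yi - my) ** 2 for yi in y)
        return slope, intercept, 1.0 - ss_res / ss_tot

    conc = [0.05, 0.1, 0.5, 1.0, 5.0]                 # standards, ug/L
    resp = [520.0, 1015.0, 5040.0, 9980.0, 50100.0]   # detector response
    slope, intercept, r2 = linfit(conc, resp)
    unknown = (25500.0 - intercept) / slope           # quantify an unknown sample
    print(round(r2, 4), round(unknown, 2))
    ```

    In routine use the quantitation limit (0.05 µg/l here) is the lowest standard at which this inversion remains reliable.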

  4. Work Planning Automation at Mechanical Subdivision

    OpenAIRE

    Dzindzelėta, Vytautas

    2005-01-01

    Work planning automation, installation possibilities and future outlook at a mechanical subdivision. The aim is to study how work planning has changed before and after the automation process and to analyse the automation process methodology.

  5. Rapid Prototyping of a Cyclic Olefin Copolymer Microfluidic Device for Automated Oocyte Culturing.

    Science.gov (United States)

    Berenguel-Alonso, Miguel; Sabés-Alsina, Maria; Morató, Roser; Ymbern, Oriol; Rodríguez-Vázquez, Laura; Talló-Parra, Oriol; Alonso-Chamarro, Julián; Puyol, Mar; López-Béjar, Manel

    2017-10-01

    Assisted reproductive technology (ART) can benefit from the features of microfluidic technologies, such as the automation of time-consuming labor-intensive procedures, the possibility to mimic in vivo environments, and the miniaturization of the required equipment. To date, most of the proposed approaches are based on polydimethylsiloxane (PDMS) as platform substrate material due to its widespread use in academia, despite certain disadvantages, such as the elevated cost of mass production. Herein, we present a rapid fabrication process for a cyclic olefin copolymer (COC) monolithic microfluidic device combining hot embossing (with a low-temperature cofired ceramic (LTCC) master) and micromilling. The microfluidic device was suitable for trapping and maturation of bovine oocytes, which were further studied to determine their ability to be fertilized. Furthermore, another COC microfluidic device was fabricated to store sperm and assess its quality parameters over time. The study herein presented demonstrates a good biocompatibility of the COC when working with gametes, and it exhibits certain advantages, such as the nonabsorption of small molecules, gas impermeability, and low fabrication costs, all at the prototyping and mass production scale, thus taking a step further toward fully automated microfluidic devices in ART.

  6. Effects of prey type on specific dynamic action, growth, and mass conversion efficiencies in the horned frog, Ceratophrys cranwelli.

    Science.gov (United States)

    Grayson, Kristine L; Cook, Leslie W; Todd, M Jason; Pierce, D; Hopkins, William A; Gatten, Robert E; Dorcas, Michael E

    2005-07-01

    To be most energetically profitable, predators should ingest prey with the maximal nutritional benefit while minimizing the cost of processing. Therefore, when determining the quality of prey items, both the cost of processing and nutritional content must be considered. Specific dynamic action (SDA), the increase in metabolic rate associated with feeding in animals, is a significant processing cost that represents the total cost of digestion and assimilation of nutrients from prey. We examined the effects of an invertebrate diet (earthworms) and a vertebrate diet (newborn mice) on mass conversion efficiencies, growth, and SDA in the Chacoan horned frog, Ceratophrys cranwelli. We found the earthworm diet to be significantly lower in lipid, protein, and energy content when compared to the diet of newborn mice. Growth and mass conversion efficiencies were significantly higher in frogs fed newborn mice. However, mean SDA did not differ between frogs fed the two diets, a finding that contradicts many studies that indicate SDA increases with the protein content of the meal. Together, our results indicate that future studies evaluating the effect of meal type on bioenergetics of herpetofauna are warranted and may provide significant insight into the underlying factors driving SDA.

  7. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements.

    Science.gov (United States)

    Orton, Daniel J; Tfaily, Malak M; Moore, Ronald J; LaMarche, Brian L; Zheng, Xueyun; Fillmore, Thomas L; Chu, Rosalie K; Weitz, Karl K; Monroe, Matthew E; Kelly, Ryan T; Smith, Richard D; Baker, Erin S

    2018-01-02

    To better understand disease conditions and environmental perturbations, multiomic studies combining proteomic, lipidomic, and metabolomic analyses are vastly increasing in popularity. In a multiomic study, a single sample is typically extracted in multiple ways, and various analyses are performed using different instruments, most often based upon mass spectrometry (MS). Thus, one sample becomes many measurements, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injections. While some FIA systems have been created to address these challenges, many have limitations such as costly consumables, low pressure capabilities, limited pressure monitoring, and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at a range of flow rates (∼50 nL/min to 500 μL/min) to accommodate both low- and high-flow MS ionization sources. This system also functions at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system, and results showed a highly robust and reproducible platform capable of providing consistent performance over many days without carryover, as long as washing buffers specific to each molecular analysis were utilized.
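    The quoted throughput range translates directly into per-sample cycle times, which is a useful sanity check when matching the FIA system to an MS acquisition method. The arithmetic below assumes injections run back-to-back with no additional dead time, which real systems only approximate.

    ```python
    # Per-sample cycle time implied by a stated daily throughput,
    # assuming back-to-back injections with no additional dead time.

    def seconds_per_sample(samples_per_day):
        return 24 * 60 * 60 / samples_per_day

    print(seconds_per_sample(24))    # 3600.0 s -> one sample per hour
    print(seconds_per_sample(1200))  # 72.0 s per sample at 1200 samples/day
    ```

    At the high end of the stated range, every wash, load, and injection step must therefore fit inside roughly a one-minute window.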

  8. Qualitative and quantitative analysis of pharmaceutical compounds by MALDI-TOF mass spectrometry.

    NARCIS (Netherlands)

    Kampen, J.J. van; Burgers, P.C.; Groot, R. de; Luider, T.M.

    2006-01-01

    In this report, we discuss key issues for the successful application of MALDI-TOF mass spectrometry to quantify drugs. These include choice and preparation of matrix, nature of cationization agent, automation, and data analysis procedures. The high molecular weight matrix

  9. Analysis of sulfates on low molecular weight heparin using mass spectrometry: structural characterization of enoxaparin.

    Science.gov (United States)

    Gupta, Rohitesh; Ponnusamy, Moorthy P

    2018-05-21

    Structural characterization of Low Molecular Weight Heparin (LMWH) is critical to meet biosimilarity standards. In this context, the review focuses on structural analysis of labile sulfates attached to the side-groups of LMWH using mass spectrometry. A comprehensive review of this topic will help readers to identify key strategies for tackling the problem related to sulfate loss. At the same time, various mass spectrometry techniques are presented to facilitate compositional analysis of LMWH, mainly Enoxaparin. Areas covered: This review summarizes findings on mass spectrometry application for LMWH, including modulation of sulfates, using enzymology and sample preparation approaches. Furthermore, popular open-source software packages for automated spectral data interpretation are also discussed. Successful use of LC/MS can decipher structural composition for LMWH and help evaluate their sameness or biosimilarity with the innovator molecule. Overall, the literature has been searched using PubMed by typing various search queries such as "enoxaparin", "mass spectrometry", "low molecular weight heparin", "structural characterization", etc. Expert commentary: This section highlights clinically relevant areas that need improvement to achieve satisfactory commercialization of LMWHs. It also primarily emphasizes the advancements in instrumentation related to mass spectrometry, and discusses building automated software for data interpretation and analysis.

  10. Physiological Self-Regulation and Adaptive Automation

    Science.gov (United States)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, concerns have been voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode, which included a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
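    The engagement-index-driven cycling described above can be caricatured as simple threshold logic mapping the index to a task mode. The index scale and cutoffs below are invented for illustration and are not values from the study.

    ```python
    # Toy task-allocation logic driven by an EEG-derived engagement index.
    # The [0, 1] scale and thresholds are arbitrary illustrations.

    def allocate_mode(engagement_index, low=0.3, high=0.7):
        """Map an engagement index in [0, 1] to a tracking-task mode."""
        if engagement_index < low:
            return "manual"          # low engagement: hand the task back to re-engage
        if engagement_index > high:
            return "automatic"       # high engagement/workload: offload to automation
        return "adaptive_aiding"     # mid-range: share the task

    print([allocate_mode(x) for x in (0.1, 0.5, 0.9)])
    # ['manual', 'adaptive_aiding', 'automatic']
    ```

    Real adaptive-automation schemes add hysteresis and smoothing so the mode does not oscillate on a noisy index; this sketch omits that deliberately.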

  11. Test-retest reliability of automated whole body and compartmental muscle volume measurements on a wide bore 3T MR system

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Marianna S.; Newman, David; Kasmai, Bahman; Greenwood, Richard; Malcolm, Paul N. [Norfolk and Norwich University Hospital, Department of Radiology, Norwich (United Kingdom); Leinhard, Olof Dahlqvist [Linkoeping University, Center for Medical Image Science and Visualization, Linkoeping (Sweden); Linkoeping University, Department of Medical and Health Sciences, Linkoeping (Sweden); Karlsson, Anette; Borga, Magnus [Linkoeping University, Center for Medical Image Science and Visualization, Linkoeping (Sweden); Linkoeping University, Department of Biomedical Engineering, Linkoeping (Sweden); Rosander, Johannes [Advanced MR Analytics AB, Linkoeping (Sweden); Toms, Andoni P. [Norfolk and Norwich University Hospital, Department of Radiology, Norwich (United Kingdom); Radiology Academy, Cotman Centre, Norwich, Norfolk (United Kingdom)

    2014-09-15

    To measure the test-retest reproducibility of an automated system for quantifying whole body and compartmental muscle volumes using wide bore 3 T MRI. Thirty volunteers stratified by body mass index underwent whole body 3 T MRI, with two-point Dixon sequences, on two separate occasions. Water-fat separation was performed, with automated segmentation of whole body, torso, upper and lower leg volumes, and manually segmented lower leg muscle volumes. Mean automated total body muscle volume was 19.32 L (SD 9.1) and 19.28 L (SD 9.12) for the first and second acquisitions (intraclass correlation coefficient (ICC) = 1.0, 95 % limits of agreement -0.32 to 0.2 L). ICCs for all automated test-retest muscle volumes were almost perfect (0.99-1.0), with 95 % limits of agreement within 1.8-6.6 % of mean volume. Automated muscle volume measurements correlate closely with manual quantification (right lower leg: manual 1.68 L (2SD 0.6) compared to automated 1.64 L (2SD 0.6); left lower leg: manual 1.69 L (2SD 0.64) compared to automated 1.63 L (2SD 0.61); correlation coefficients for automated and manual segmentation were 0.94-0.96). Fully automated whole body and compartmental muscle volume quantification can be achieved rapidly on a 3 T wide bore system with very low margins of error, excellent test-retest reliability and excellent correlation to manual segmentation in the lower leg. (orig.)
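    The 95 % limits of agreement quoted above are Bland-Altman-style statistics on paired test-retest measurements: mean difference ± 1.96 standard deviations of the differences. A minimal sketch, with made-up volumes rather than the study's data, is:

    ```python
    # Bland-Altman 95% limits of agreement for paired test-retest data.
    # Volumes in litres below are invented, not the study's measurements.
    import math

    def limits_of_agreement(first, second):
        """Return (lower, upper) 95% limits of agreement for paired data."""
        diffs = [a - b for a, b in zip(first, second)]
        n = len(diffs)
        mean = sum(diffs) / n
        sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
        return mean - 1.96 * sd, mean + 1.96 * sd

    scan1 = [19.1, 18.7, 20.3, 19.8, 19.5]
    scan2 = [19.0, 18.9, 20.2, 19.9, 19.4]
    lower, upper = limits_of_agreement(scan1, scan2)
    print(round(lower, 3), round(upper, 3))
    ```

    A narrow interval around zero, as reported in the abstract, indicates that repeat acquisitions agree to within a small fraction of the measured volume.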

  12. Test-retest reliability of automated whole body and compartmental muscle volume measurements on a wide bore 3T MR system

    International Nuclear Information System (INIS)

    Thomas, Marianna S.; Newman, David; Kasmai, Bahman; Greenwood, Richard; Malcolm, Paul N.; Leinhard, Olof Dahlqvist; Karlsson, Anette; Borga, Magnus; Rosander, Johannes; Toms, Andoni P.

    2014-01-01

    To measure the test-retest reproducibility of an automated system for quantifying whole body and compartmental muscle volumes using wide bore 3 T MRI. Thirty volunteers stratified by body mass index underwent whole body 3 T MRI, with two-point Dixon sequences, on two separate occasions. Water-fat separation was performed, with automated segmentation of whole body, torso, upper and lower leg volumes, and manually segmented lower leg muscle volumes. Mean automated total body muscle volume was 19.32 L (SD 9.1) and 19.28 L (SD 9.12) for the first and second acquisitions (intraclass correlation coefficient (ICC) = 1.0, 95 % limits of agreement -0.32 to 0.2 L). ICCs for all automated test-retest muscle volumes were almost perfect (0.99-1.0), with 95 % limits of agreement within 1.8-6.6 % of mean volume. Automated muscle volume measurements correlate closely with manual quantification (right lower leg: manual 1.68 L (2SD 0.6) compared to automated 1.64 L (2SD 0.6); left lower leg: manual 1.69 L (2SD 0.64) compared to automated 1.63 L (2SD 0.61); correlation coefficients for automated and manual segmentation were 0.94-0.96). Fully automated whole body and compartmental muscle volume quantification can be achieved rapidly on a 3 T wide bore system with very low margins of error, excellent test-retest reliability and excellent correlation to manual segmentation in the lower leg. (orig.)

  13. Numerical experiments using deflation with the HISQ action

    Science.gov (United States)

    Davies, Christine; DeTar, Carleton; McNeile, Craig; Vaquero, Alejandro

    2018-03-01

    We report on numerical experiments using deflation to compute quark propagators for the highly improved staggered quark (HISQ) action. The method is tested on HISQ gauge configurations, generated by the MILC collaboration, with lattice spacings of 0.15 fm, with a range of volumes, and sea quark masses down to the physical quark mass.

  14. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchases system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  15. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, with stimulus size modulation during testing, based on stimulus intensity and conventional standard automated perimetry, with that of the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values between size modulation standard automated perimetry and conventional standard automated perimetry were moderately to strongly correlated (p 33.40, p modulation standard automated perimetry than with conventional standard automated perimetry, but the visual-field defect size was smaller (p modulation-standard automated perimetry than on conventional standard automated perimetry. The reliability indices, particularly the false-negative response, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry (p modulation standard automated perimetry than with conventional standard automated perimetry (p = 0.02). Global indices and the point-wise threshold value of the two testing modalities correlated well. However, the potential of a large stimulus presented at an area with a decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  16. Evaluation of automated analysis of 15N and total N in plant material and soil

    DEFF Research Database (Denmark)

    Jensen, E.S.

    1991-01-01

    Simultaneous determination of N-15 and total N using an automated nitrogen analyser interfaced to a continuous-flow isotope ratio mass spectrometer (ANA-MS method) was evaluated. The coefficient of variation (CV) of repeated analyses of homogeneous standards and samples at natural abundance...... was lower than 0.1%. The CV of repeated analyses of N-15-labelled plant material and soil samples varied between 0.3% and 1.1%. The reproducibility of repeated total N analyses using the automated method was comparable to results obtained with a semi-micro Kjeldahl procedure. However, the automated method...... analysis showed that the recovery of inorganic N in the NH3 trap was lower when the N was diffused from water than from 2 M KCl. The results also indicated that different proportions of the NO3- and the NH4+ in aqueous solution were recovered in the trap after combined diffusion. The method is most suited...

  17. Fourier transform ion cyclotron resonance mass spectrometry

    Science.gov (United States)

    Marshall, Alan G.

    1998-06-01

    As for Fourier transform infrared (FT-IR) interferometry and nuclear magnetic resonance (NMR) spectroscopy, the introduction of pulsed Fourier transform techniques revolutionized ion cyclotron resonance mass spectrometry: increased speed (factor of 10,000), increased sensitivity (factor of 100), increased mass resolution (factor of 10,000, an improvement not shared by the introduction of FT techniques to IR or NMR spectroscopy), increased mass range (factor of 500), and automated operation. FT-ICR mass spectrometry is the most versatile technique for unscrambling and quantifying ion-molecule reaction kinetics and equilibria in the absence of solvent (i.e., the gas phase). In addition, FT-ICR MS has the following analytically important features: speed (~1 second per spectrum); ultrahigh mass resolution and ultrahigh mass accuracy for analysis of mixtures and polymers; attomole sensitivity; MS^n with one spectrometer, including two-dimensional FT/FT-ICR/MS; positive and/or negative ions; multiple ion sources (especially MALDI and electrospray); biomolecular molecular weight and sequencing; LC/MS; and single-molecule detection up to 10^8 Dalton. Here, some basic features and recent developments of FT-ICR mass spectrometry are reviewed, with applications ranging from crude oil to molecular biology.
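    The measured quantity in ICR is the cyclotron frequency, f = qB/(2πm), so mass accuracy inherits the precision of a frequency measurement. A quick sketch of the frequencies involved (the 7 T field strength is an illustrative choice, not from the text):

    ```python
    # Unperturbed ion cyclotron frequency f = q*B / (2*pi*m).
    # B = 7 T is an illustrative magnet strength, not from the text.
    import math

    E_CHARGE = 1.602176634e-19    # elementary charge, C
    AMU = 1.66053906660e-27       # atomic mass constant, kg

    def cyclotron_freq_hz(mz, charge=1, b_tesla=7.0):
        """Frequency in Hz for an ion of mass-to-charge ratio mz (Da per charge)."""
        mass = mz * charge * AMU
        return charge * E_CHARGE * b_tesla / (2 * math.pi * mass)

    # Heavier ions orbit more slowly, which is why resolving power
    # falls off toward high m/z:
    for mz in (100, 1000, 10000):
        print(mz, round(cyclotron_freq_hz(mz) / 1e3, 1), "kHz")
    ```

    Note this is the unperturbed frequency; in a real trap, magnetron motion and space-charge effects shift the observed value slightly.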

  18. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  19. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  20. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, already applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 x 17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (Film Digital Radiography System), is moving toward 50 micron (approx. 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author) [pt

  1. Default mode contributions to automated information processing.

    Science.gov (United States)

    Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A

    2017-11-28

    Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.

  2. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  3. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system as per the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection system to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision and followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.

  4. Joint force protection advanced security system (JFPASS) "the future of force protection: integrate and automate"

    Science.gov (United States)

    Lama, Carlos E.; Fagan, Joe E.

    2009-09-01

    The United States Department of Defense (DoD) defines 'force protection' as "preventive measures taken to mitigate hostile actions against DoD personnel (to include family members), resources, facilities, and critical information." Advanced technologies enable significant improvements in automating and distributing situation awareness, optimizing operator time, and improving sustainability, which enhance protection and lower costs. The JFPASS Joint Capability Technology Demonstration (JCTD) demonstrates a force protection environment that combines physical security and Chemical, Biological, Radiological, Nuclear, and Explosive (CBRNE) defense through the application of integrated command and control and data fusion. The JFPASS JCTD provides a layered approach to force protection by integrating traditional sensors used in physical security, such as video cameras, battlefield surveillance radars, unmanned and unattended ground sensors. The optimization of human participation and automation of processes is achieved by employment of unmanned ground vehicles, along with remotely operated lethal and less-than-lethal weapon systems. These capabilities are integrated via a tailorable, user-defined common operational picture display through a data fusion engine operating in the background. The combined systems automate the screening of alarms, manage the information displays, and provide assessment and response measures. The data fusion engine links disparate sensors and systems, and applies tailored logic to focus the assessment of events. It enables timely responses by providing the user with automated and semi-automated decision support tools. The JFPASS JCTD uses standard communication/data exchange protocols, which allow the system to incorporate future sensor technologies or communication networks, while maintaining the ability to communicate with legacy or existing systems.

  5. Towards automated discrimination of lipids versus peptides from full scan mass spectra

    Directory of Open Access Journals (Sweden)

    Piotr Dittwald

    2014-09-01

    Full Text Available Although physicochemical fractionation techniques play a crucial role in the analysis of complex mixtures, they are not necessarily the best solution to separate specific molecular classes, such as lipids and peptides. Any physical fractionation step such as, for example, those based on liquid chromatography, will introduce its own variation and noise. In this paper we investigate to what extent the high sensitivity and resolution of contemporary mass spectrometers offers viable opportunities for computational separation of signals in full scan spectra. We introduce an automatic method that can discriminate peptide from lipid peaks in full scan mass spectra, based on their isotopic properties. We systematically evaluate which features maximally contribute to a peptide versus lipid classification. The selected features are subsequently used to build a random forest classifier that enables almost perfect separation between lipid and peptide signals without requiring ion fragmentation and classical tandem MS-based identification approaches. The classifier is trained on in silico data, but is also capable of discriminating signals in real world experiments. We evaluate the influence of typical data inaccuracies of common classes of mass spectrometry instruments on the optimal set of discriminant features. Finally, the method is successfully extended towards the classification of individual lipid classes from full scan mass spectral features, based on input data defined by the Lipid Maps Consortium.
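    The isotopic-property idea above can be illustrated with a minimal, stdlib-only sketch of one candidate discriminant feature: the mass defect of a peak normalized by its nominal mass (lipids, being hydrogen-rich, tend to carry a larger mass defect per unit mass than peptides). The function names and the threshold below are illustrative assumptions, not the paper's trained random forest over its full feature set.

    ```python
    import math

    # Toy sketch of a single discriminant feature: the fractional mass
    # defect of a peak divided by its nominal mass. The 6e-4 threshold is
    # illustrative only; the paper instead trains a random forest over
    # many isotopic features rather than using a single hand-set cut-off.
    def normalized_mass_defect(mz):
        nominal = math.floor(mz)
        return (mz - nominal) / nominal

    def crude_lipid_vs_peptide(mz, threshold=6.0e-4):
        """Toy rule: a larger normalized mass defect suggests a lipid."""
        return "lipid" if normalized_mass_defect(mz) > threshold else "peptide"

    # A lipid-like and a peptide-like peak sharing the same nominal mass
    print(crude_lipid_vs_peptide(760.5851))  # hydrogen-rich, large defect
    print(crude_lipid_vs_peptide(760.3762))  # peptide-typical defect
    ```

    A real classifier would combine many such features (isotope intensity ratios, spacing, charge state) exactly because any single cut-off misclassifies near the boundary.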

  6. In Vitro Mass Propagation of Cymbopogon citratus Stapf., a Medicinal Gramineae.

    Science.gov (United States)

    Quiala, Elisa; Barbón, Raúl; Capote, Alina; Pérez, Naivy; Jiménez, Elio

    2016-01-01

    Cymbopogon citratus (D.C.) Stapf. is a medicinal plant and the source of lemon grass oils, with multiple uses in the pharmaceutical and food industries. Conventional propagation in semisolid culture medium has become a fast tool for mass propagation of lemon grass, but the production cost must be lowered. A solution could be the application of in vitro propagation methods based on the advantages of liquid culture and automation. This chapter provides two efficient protocols for in vitro propagation of this medicinal plant via organogenesis and somatic embryogenesis. First, we report the production of shoots using a temporary immersion system (TIS). Second, a protocol for somatic embryogenesis is described, using semisolid culture for callus formation and multiplication, and liquid culture in a rotatory shaker and conventional bioreactors for the maintenance of the embryogenic culture. Well-developed plants can be achieved with both protocols. Here we provide a fast and efficient technology for mass propagation of this medicinal plant, taking advantage of liquid culture and automation.

  7. An Automated Platform for Assessment of Congenital and Drug-Induced Arrhythmia with hiPSC-Derived Cardiomyocytes

    Directory of Open Access Journals (Sweden)

    Wesley L. McKeithan

    2017-10-01

    Full Text Available The ability to produce unlimited numbers of human induced pluripotent stem cell derived cardiomyocytes (hiPSC-CMs harboring disease and patient-specific gene variants creates a new paradigm for modeling congenital heart diseases (CHDs and predicting proarrhythmic liabilities of drug candidates. However, a major roadblock to implementing hiPSC-CM technology in drug discovery is that conventional methods for monitoring action potential (AP kinetics and arrhythmia phenotypes in vitro have been too costly or technically challenging to execute in high throughput. Herein, we describe the first large-scale, fully automated and statistically robust analysis of AP kinetics and drug-induced proarrhythmia in hiPSC-CMs. The platform combines the optical recording of a small molecule fluorescent voltage sensing probe (VoltageFluor2.1.Cl, an automated high throughput microscope and automated image analysis to rapidly generate physiological measurements of cardiomyocytes (CMs. The technique can be readily adapted on any high content imager to study hiPSC-CM physiology and predict the proarrhythmic effects of drug candidates.

  8. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    Science.gov (United States)

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
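    As a generic sketch of the wavelet-based analysis described above (not the published software): one level of a 2D Haar discrete wavelet transform splits an image into a low-frequency approximation band and three detail bands, and the energy in the detail bands is one simple way such software could score fine phantom features such as microcalcification-like specks. All names here are illustrative assumptions.

    ```python
    import numpy as np

    def haar_dwt2_level1(img):
        """One level of a 2D Haar wavelet transform (even image dims assumed)."""
        a = (img[0::2, :] + img[1::2, :]) / 2.0   # row-wise average
        d = (img[0::2, :] - img[1::2, :]) / 2.0   # row-wise difference
        LL = (a[:, 0::2] + a[:, 1::2]) / 2.0      # approximation band
        LH = (a[:, 0::2] - a[:, 1::2]) / 2.0      # horizontal detail
        HL = (d[:, 0::2] + d[:, 1::2]) / 2.0      # vertical detail
        HH = (d[:, 0::2] - d[:, 1::2]) / 2.0      # diagonal detail
        return LL, LH, HL, HH

    def detail_energy(img):
        """Total energy in the detail bands; high for speck-like features."""
        _, LH, HL, HH = haar_dwt2_level1(img)
        return float((LH**2 + HL**2 + HH**2).sum())
    ```

    A flat phantom region yields zero detail energy, while a bright speck raises it; repeating the decomposition on the LL band gives the multiresolution analysis mentioned in the abstract.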

  9. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation.
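    The final merging step described above, combining candidate features through a linear discriminant to score malignancy, can be sketched in a few lines of numpy. The two-feature synthetic data and the plain Fisher formulation are assumptions for illustration; the paper's actual feature set and classifier configuration are not reproduced here.

    ```python
    import numpy as np

    def fisher_lda_direction(X_pos, X_neg):
        """Fisher linear discriminant direction for two classes
        (rows are nodule candidates, columns are computed features)."""
        m1, m0 = X_pos.mean(axis=0), X_neg.mean(axis=0)
        # within-class scatter: sum of the two class covariance matrices
        Sw = np.cov(X_pos, rowvar=False) + np.cov(X_neg, rowvar=False)
        return np.linalg.solve(Sw, m1 - m0)

    def lda_scores(X, w):
        """Project candidates onto the discriminant axis (higher = more malignant-like)."""
        return X @ w

    # Synthetic 'malignant' vs 'benign' feature vectors for illustration
    rng = np.random.default_rng(0)
    malignant = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(20, 2))
    benign = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(40, 2))
    w = fisher_lda_direction(malignant, benign)
    ```

    Sweeping a threshold over the projected scores and tallying true/false positive rates at each cut-off is what produces the ROC curve whose area the abstract reports.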

  10. Screening halogenated environmental contaminants in biota based on isotopic pattern and mass defect provided by high resolution mass spectrometry profiling

    International Nuclear Information System (INIS)

    Cariou, Ronan; Omer, Elsa; Léon, Alexis; Dervilly-Pinel, Gaud; Le Bizec, Bruno

    2016-01-01

    In the present work, we addressed the question of global seeking/screening organohalogenated compounds in a large panel of complex biological matrices, with a particular focus on unknown chemicals that may be considered as potential emerging hazards. A fishing strategy was developed based on untargeted profiling among full scan acquisition datasets provided by high resolution mass spectrometry. Since large datasets arise from such profiling, filtering useful information stands as a central question. In this way, we took advantage of the exact mass differences between Cl and Br isotopes. Indeed, our workflow involved an innovative Visual Basic for Applications script aiming at pairing features according to this mass difference, in order to point out potential organohalogenated clusters, preceded by an automated peak picking step based on the centWave function (xcms package of open access R programming environment). Then, H/Cl-scale mass defect plots were used to visualize the datasets before and after filtering. The filtering script was successfully applied to a dataset generated upon liquid chromatography coupled to ESI(−)-HRMS measurement from one eel muscle extract, allowing for realistic manual investigations of filtered clusters. Starting from 9789 initial obtained features, 1994 features were paired in 589 clusters. Hexabromocyclododecane, chlorinated paraffin series and various other compounds have been identified or tentatively identified, allowing thus broad screening of organohalogenated compounds in this extract. Although realistic, manual review of paired clusters remains time consuming and much effort should be devoted to automation. - Highlights: • We address the screening of halogenated compounds in large Full Scan HRMS datasets. • The workflow involves peak picking, pairing script and review of paired features. • The pairing script is based on exact mass differences between Cl and Br isotopes. • H/Cl scale mass defect plots are used to visualize the datasets before and after filtering.

  11. Screening halogenated environmental contaminants in biota based on isotopic pattern and mass defect provided by high resolution mass spectrometry profiling

    Energy Technology Data Exchange (ETDEWEB)

    Cariou, Ronan, E-mail: laberca@oniris-nantes.fr; Omer, Elsa; Léon, Alexis; Dervilly-Pinel, Gaud; Le Bizec, Bruno

    2016-09-14

    In the present work, we addressed the question of global seeking/screening organohalogenated compounds in a large panel of complex biological matrices, with a particular focus on unknown chemicals that may be considered as potential emerging hazards. A fishing strategy was developed based on untargeted profiling among full scan acquisition datasets provided by high resolution mass spectrometry. Since large datasets arise from such profiling, filtering useful information stands as a central question. In this way, we took advantage of the exact mass differences between Cl and Br isotopes. Indeed, our workflow involved an innovative Visual Basic for Applications script aiming at pairing features according to this mass difference, in order to point out potential organohalogenated clusters, preceded by an automated peak picking step based on the centWave function (xcms package of open access R programming environment). Then, H/Cl-scale mass defect plots were used to visualize the datasets before and after filtering. The filtering script was successfully applied to a dataset generated upon liquid chromatography coupled to ESI(−)-HRMS measurement from one eel muscle extract, allowing for realistic manual investigations of filtered clusters. Starting from 9789 initial obtained features, 1994 features were paired in 589 clusters. Hexabromocyclododecane, chlorinated paraffin series and various other compounds have been identified or tentatively identified, allowing thus broad screening of organohalogenated compounds in this extract. Although realistic, manual review of paired clusters remains time consuming and much effort should be devoted to automation. - Highlights: • We address the screening of halogenated compounds in large Full Scan HRMS datasets. • The workflow involves peak picking, pairing script and review of paired features. • The pairing script is based on exact mass differences between Cl and Br isotopes. • H/Cl scale mass defect plots are used to visualize the datasets before and after filtering.
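    The pairing idea at the heart of this workflow, matching picked peaks whose exact m/z difference equals the spacing between halogen isotopes, can be sketched as follows. This stdlib-only version pairs features on the 37Cl−35Cl exact mass difference; the function name and tolerance are illustrative assumptions, not taken from the authors' VBA script.

    ```python
    # Flag pairs of picked peaks whose m/z spacing matches the exact mass
    # difference between chlorine isotopes, a signature of organochlorines.
    CL_DELTA = 1.99705  # exact mass difference 37Cl - 35Cl, in u

    def pair_halogen_features(mz_values, tol=0.002, delta=CL_DELTA):
        """Return (lighter, heavier) m/z pairs separated by delta within tol."""
        mz = sorted(mz_values)
        pairs = []
        for i, a in enumerate(mz):
            for j in range(i + 1, len(mz)):
                diff = mz[j] - a
                if diff > delta + tol:     # sorted list: no further match possible
                    break
                if abs(diff - delta) <= tol:
                    pairs.append((a, mz[j]))
        return pairs
    ```

    Running the same pass with the 81Br−79Br spacing (about 1.99796 u) would flag brominated clusters; in practice the two deltas are close enough that the tolerance must be kept tight to tell them apart.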

  12. Automation of Electrical Cable Harnesses Testing

    Directory of Open Access Journals (Sweden)

    Zhuming Bi

    2017-12-01

    Full Text Available Traditional automated systems, such as industrial robots, are applied in well-structured environments, and many automated systems have limited adaptability to deal with complexity and uncertainty; therefore, the applications of industrial robots in small- and medium-sized enterprises (SMEs) are very limited. The majority of manual operations in SMEs are too complicated for automation. Rapidly developing information technologies (IT) have brought new opportunities for the automation of manufacturing and assembly processes in ill-structured environments. Note that an automation solution should be designed to meet the given requirements of the specified application, and it differs from one application to another. In this paper, we look into the feasibility of automated testing for electrical cable harnesses, and our focus is on generic strategies for improving the adaptability of automation solutions. In particular, the concept of modularization is adopted in developing hardware and software to maximize system adaptability in testing a wide scope of products. The proposed system has been implemented, and its performance has been evaluated by executing tests on actual products. The testing experiments have shown that the automated system greatly outperformed manual operations in terms of cost-saving, productivity and reliability. Due to its potential for increasing system adaptability and reducing cost, the presented work has theoretical and practical significance for extension to other automation solutions in SMEs.

  13. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Aim: Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. Methodology: We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Results: Serial sampling of mice was possible over the full 21-day study period, with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach produced data comparable to a previous study using a single mouse per time point with liquid samples, while reducing animal and compound requirements by 14-fold. Conclusion: Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mouse studies in discovery-stage studies of protein therapeutics.

  14. Varying Levels of Automation on UAS Operator Responses to Traffic Resolution Advisories in Civil Airspace

    Science.gov (United States)

    Kenny, Caitlin; Fern, Lisa

    2012-01-01

    Continuing demand for the use of Unmanned Aircraft Systems (UAS) has put increasing pressure on operations in civil airspace. The need to fly UAS in the National Airspace System (NAS) in order to perform missions vital to national security and defense, emergency management, and science is increasing at a rapid pace. In order to ensure safe operations in the NAS, operators of unmanned aircraft, like those of manned aircraft, may be required to maintain separation assurance and avoid loss of separation with other aircraft while performing their mission tasks. This experiment investigated the effects of varying levels of automation on UAS operator performance and workload while responding to conflict resolution instructions provided by the Traffic Alert and Collision Avoidance System II (TCAS II) during a UAS mission in high-density airspace. The purpose of this study was not to investigate the safety of using TCAS II on UAS, but rather to examine the effect of automation on the ability of operators to respond to traffic collision alerts. Six licensed pilots were recruited to act as UAS operators for this study. Operators were instructed to follow a specified mission flight path, while maintaining radio contact with Air Traffic Control and responding to TCAS II resolution advisories. Operators flew four 45-minute experimental missions with four different levels of automation: Manual, Knobs, Management by Exception, and Fully Automated. All missions included TCAS II Resolution Advisories (RAs) that required operator attention and rerouting. Operator compliance and reaction time to RAs were measured, and post-run NASA-TLX ratings were collected to measure workload. Results showed significantly higher compliance rates, faster responses to TCAS II alerts, as well as fewer preemptive operator actions when higher levels of automation were implemented. Physical and Temporal ratings of workload were significantly higher in the Manual condition than in the Management by Exception and Fully Automated conditions.

  15. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced MCRs, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to cognitive activities.

  16. The design of reconfigurable assembly stations for high variety and mass customisation manufacturing

    Directory of Open Access Journals (Sweden)

    Padayachee, Jared

    2013-11-01

    Full Text Available The economical production of mass customised and high variety goods is a challenge facing modern manufacturers. This challenge is being addressed, in part, by the on-going development of technologies that facilitate the manufacturing of these goods. Existing technologies require either excessive inbuilt flexibility or frequent changes to the machine set-up to provide the manufacturing functions required for the customisation process. This paper presents design principles for automated assembly stations within the scope of mass customisation. Design principles are presented that minimise the hardware and operating complexities of assembly stations, allowing stations to be easily automated for concurrent mixed-model assembly with a First In First Out (FIFO) scheduling policy. A reconfigurable assembly station is developed to demonstrate how the proposed design methods simplify the creation and operation of an assembly station for a product family of flashlights.

  17. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  18. Ask the experts: automation: part I.

    Science.gov (United States)

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  19. An Automation Survival Guide for Media Centers.

    Science.gov (United States)

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  20. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    DEFF Research Database (Denmark)

    Qiao, Jixin

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development...... and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples...... for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the developed methods in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium as demanded in different situations including......

  1. Demands on digital automation; Anforderungen an die Digitale Automation

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, P.

    1995-12-31

    In chapter 12 of the anthology on building control, the demands on digital automation are presented. The following aspects are discussed: the variety of company philosophies, the demands of customers/investors, the demands of building/room use, the operators' demands, and the point of view of the manufacturers of technical building installations. (BWI) [Deutsch] Kapitel 12 des Sammelbandes ueber Building Control stellt die Anforderungen an die Digitale Automation vor. In diesem Zusammenhang wird auf folgende Themenbereiche eingegangen: Spektrum der Firmenphilosophien, Forderungen der Auftraggeber/Investoren, der Gebaeude-/Raumnutzung, der Betreiber sowie Sicht der Ersteller betriebstechnischer Anlagen. (BWI)

  2. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  3. Automation's influence on nuclear power plants: a look at three accidents and how automation played a role.

    Science.gov (United States)

    Schmitt, Kara

    2012-01-01

    Nuclear power is one of the ways that we can design an efficient, sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPPs). Automation performs tasks such as assessing the status of the plant's operations as well as making real-time, life-critical, situation-specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper looks at the effects of automation within three NPP accidents and incidents and considers why automation failed to prevent these accidents from occurring. It also reviews the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.

  4. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  5. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  6. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes, often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  7. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  8. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  9. Novel heparan sulfate assay by using automated high-throughput mass spectrometry: Application to monitoring and screening for mucopolysaccharidoses.

    Science.gov (United States)

    Shimada, Tsutomu; Kelly, Joan; LaMarr, William A; van Vlies, Naomi; Yasuda, Eriko; Mason, Robert W; Mackenzie, William; Kubaski, Francyne; Giugliani, Roberto; Chinen, Yasutsugu; Yamaguchi, Seiji; Suzuki, Yasuyuki; Orii, Kenji E; Fukao, Toshiyuki; Orii, Tadao; Tomatsu, Shunji

    2014-01-01

    Mucopolysaccharidoses (MPS) are caused by deficiency of one of a group of specific lysosomal enzymes, resulting in excessive accumulation of glycosaminoglycans (GAGs). We previously developed GAG assay methods using liquid chromatography tandem mass spectrometry (LC-MS/MS); however, analysis takes 4-5 min per sample. For the large numbers of samples in a screening program, a more rapid process is desirable. The automated high-throughput mass spectrometry (HT-MS/MS) system (RapidFire) integrates a solid-phase extraction robot to concentrate and desalt samples prior to direct injection into the MS/MS without chromatographic separation, thereby allowing each sample to be processed within 10 s (enabling screening of more than one million samples per year). The aim of this study was to develop a higher-throughput system to assay heparan sulfate (HS) using HT-MS/MS, and to compare its reproducibility, sensitivity and specificity with conventional LC-MS/MS. HS levels were measured in the blood (plasma and serum) from control subjects and patients with MPS II, III, or IV and in dried blood spots (DBS) from newborn controls and patients with MPS I, II, or III. Results obtained from HT-MS/MS showed 1) that there was a strong correlation of levels of disaccharides derived from HS in the blood, between those calculated using conventional LC-MS/MS and HT-MS/MS, 2) that levels of HS in the blood were significantly elevated in patients with MPS II and III, but not in MPS IVA, 3) that the level of HS in patients with a severe form of MPS II was higher than that in an attenuated form, 4) that reduction of blood HS level was observed in MPS II patients treated with enzyme replacement therapy or hematopoietic stem cell transplantation, and 5) that levels of HS in newborn DBS were elevated in patients with MPS I, II or III, compared to those of control newborns. In conclusion, HT-MS/MS provides much higher throughput than LC-MS/MS-based methods with similar sensitivity and specificity.
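
    As a rough sanity check on the throughput claim, a back-of-envelope calculation (illustrative; it assumes continuous, uninterrupted operation with no maintenance downtime) reproduces the "more than one million samples per year" figure:

```python
# Annual sample capacity at a fixed per-sample cycle time.
# HT-MS/MS: ~10 s per sample; LC-MS/MS: ~4-5 min per sample (values from the abstract).
SECONDS_PER_YEAR = 365 * 24 * 3600

def annual_capacity(seconds_per_sample: float) -> int:
    """Samples processable per year, assuming round-the-clock operation."""
    return int(SECONDS_PER_YEAR / seconds_per_sample)

print(annual_capacity(10))        # HT-MS/MS: ~3.15 million samples/year
print(annual_capacity(4.5 * 60))  # LC-MS/MS: ~117 thousand samples/year
```

    Even with generous downtime allowances, the ~27x difference in cycle time keeps HT-MS/MS comfortably above the one-million-samples-per-year mark.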

  10. Future Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    This volume comprises a set of selected papers, extended and revised, from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered include wireless communications, advances in wireless video, wireless sensor networking, security in wireless networks, network measurement and management, hybrid and discrete-event systems, internet analytics and automation, robotic systems and applications, reconfigurable automation systems, and machine vision in automation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find them stimulating.

  11. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  12. Strange and charm baryon masses with two flavors of dynamical twisted mass fermions

    Energy Technology Data Exchange (ETDEWEB)

    Alexandrou, C. [Univ. of Cyprus, Nicosia (Cyprus). Dept. of Physics; Cyprus Institute, Nicosia (Cyprus). Computation-Based Science and Technology Research Center; Carbonell, J. [CEA-Saclay, Gif-sur-Yvette (France). IRFU/Service de Physique Nucleaire; Christaras, D.; Gravina, M. [Univ. of Cyprus, Nicosia (Cyprus). Dept. of Physics; Drach, V. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Papinutto, M. [UFJ/CNRS/IN2P3, Grenoble (France). Laboratoire de Physique Subatomique et Cosmologie; Universidad Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Universidad Autonoma de Madrid UAM/CSIC (Spain). Inst. de Fisica Teorica

    2012-10-15

    The masses of the low-lying strange and charm baryons are evaluated using two degenerate flavors of twisted mass sea quarks for pion masses in the range of about 260 MeV to 450 MeV. The strange and charm valence quark masses are tuned to reproduce the mass of the kaon and D-meson at the physical point. The tree-level Symanzik improved gauge action is employed. We use three values of the lattice spacing, corresponding to β=3.9, β=4.05 and β=4.2 with r_0/a=5.22(2), r_0/a=6.61(3) and r_0/a=8.31(5), respectively. We examine the dependence of the strange and charm baryons on the lattice spacing and the strange and charm quark masses. The pion mass dependence is studied and physical results are obtained using heavy baryon chiral perturbation theory to extrapolate to the physical point.

  13. Introduction matters: Manipulating trust in automation and reliance in automated driving.

    Science.gov (United States)

    Körber, Moritz; Baseler, Eva; Bengler, Klaus

    2018-01-01

    Trust in automation is a key determinant for the adoption of automated systems and their appropriate use. Therefore, it constitutes an essential research area for the introduction of automated vehicles to road traffic. In this study, we investigated the influence of trust-promoting (Trust promoted group) and trust-lowering (Trust lowered group) introductory information on reported trust, reliance behavior and take-over performance. Forty participants encountered three situations in a 17-min highway drive in a conditionally automated vehicle (SAE Level 3). Situation 1 and Situation 3 were non-critical situations where a take-over was optional. Situation 2 represented a critical situation where a take-over was necessary to avoid a collision. A non-driving-related task (NDRT) was presented between the situations to record the allocation of visual attention. Participants reporting a higher trust level spent less time looking at the road or instrument cluster and more time looking at the NDRT. The manipulation of introductory information resulted in medium differences in reported trust and influenced participants' reliance behavior. Participants of the Trust promoted group looked less at the road or instrument cluster and more at the NDRT. The odds of participants of the Trust promoted group overruling the automated driving system in the non-critical situations were 3.65 times (Situation 1) to 5 times (Situation 3) higher. In Situation 2, the Trust promoted group's mean take-over time was extended by 1154 ms and the mean minimum time-to-collision was 933 ms shorter. Six participants from the Trust promoted group, compared to no participant of the Trust lowered group, collided with the obstacle. The results demonstrate that the individual trust level influences how much drivers monitor the environment while performing an NDRT. Introductory information influences this trust level, reliance on an automated driving system, and whether a critical take-over situation can be managed successfully.

  14. On-shell improvement of the massive Wilson quark action.

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, S.; Kayaba, Y.; Kuramashi, Y.; Yamada, N.

    2005-04-01

    We review a relativistic approach to heavy quark physics in lattice QCD by applying a relativistic O(a) improvement to the massive Wilson quark action on the lattice. After explaining how power corrections of m_Q a can be avoided and the remaining uncertainties are reduced to order (aΛ_QCD)^2, we demonstrate a determination of four improvement coefficients in the action up to the one-loop level in a mass-dependent way. We also show a perturbative determination of mass-dependent renormalization factors and O(a) improvement coefficients for the vector and axial vector currents. Some preliminary results of numerical simulations are also presented.

  15. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    A new approach has been developed for the semiconductor industry; however, as with most new technologies, its applicability extends to many other areas as well, including environmental, pharmaceutical, clinical and industrial chemical processing. This instrumental system represents a fundamentally new approach. Sample preparation has been integrated as a key system element to enable automation of the instrument system. It has long been believed that an automated, fully integrated system was not feasible if a powerful MS system were included. This application demonstrates one of the first fully automated and integrated sample preparation and mass spectrometric analysis systems applied to practical use. The system is also a broad and ambitious mass-based analyzer, capable not only of elemental but also of direct speciated analysis. The complete analytical suite covering inorganic, organic, organo-metallic and speciated analytes is being applied for critical contamination control of semiconductor processes. As with any new paradigm technology, it will now extend from its current use into other applications needing real-time, fully automated multi-component analysis. Refs. 4 (author)

  16. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  17. GUI test automation for Qt application

    OpenAIRE

    Wang, Lei

    2015-01-01

    GUI test automation is a popular and interesting subject in the testing industry. Many companies plan to start test automation projects in order to implement efficient, less expensive software testing. However, there are challenges for the testing team who lack experience performing GUI tests automation. Many GUI test automation projects have ended in failure due to mistakes made during the early stages of the project. The major work of this thesis is to find a solution to the challenges of e...

  18. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and a careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies and post-processing of the eluate are discussed, along with the impact of automation on regulations. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered

  19. 76 FR 69755 - National Customs Automation Program Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2011-11-09

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation... announces U.S. Customs and Border Protection's (CBP's) plan to conduct a National Customs Automation Program... conveyance transporting the cargo to the United States. This data will fulfill merchandise entry requirements...

  20. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    Science.gov (United States)

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  1. Automation of the Distillation of Alcohol from the RUM Distillery UEB Santiago de Cuba

    Directory of Open Access Journals (Sweden)

    MSc. Mónica Mulet-Hing

    2016-02-01

    This paper analyzes the current situation and prospective solutions to the lack of automation in the plant belonging to the rum distillery UEB Santiago de Cuba (Cuba Ron), which limits the productivity of its process. It also surveys the plant's operational status and seeks to transform its instrumentation and control scheme. It defines the structure and the control system variables, demonstrating the feasibility of the proposed solution. The key result of the work is a proposed automation structure consisting of a control algorithm for the distillation process, taking into account the requirements, the technical means for its execution, the variables that must be observed and processed, and the final actuating elements, together with the respective field instrumentation and the purchase of a PLC to perform the control satisfactorily with the minimum possible investment.
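
    The abstract does not specify the control law; as an illustration of the kind of algorithm a PLC in such a plant might run, here is a minimal discrete PID loop driving a toy first-order temperature model (the gains, the plant model, and the setpoint are all hypothetical, not from the paper):

```python
# Minimal discrete PID controller sketch for a single distillation-column loop
# (e.g. column temperature -> steam valve). A real PLC implementation would add
# anti-windup, output clamping, and fail-safe logic.
class PID:
    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement):
        error = self.setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a crude first-order plant toward a 78.3 degC setpoint (ethanol boiling point).
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=78.3, dt=1.0)
temp = 60.0
for _ in range(200):
    u = pid.update(temp)
    temp += 0.05 * u - 0.01 * (temp - 25.0)  # toy heating + ambient heat-loss model
print(round(temp, 1))
```

    The integral term is what removes the steady-state offset caused by the constant heat loss; with proportional control alone the column would settle below the setpoint.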

  2. Test-retest reliability of automated whole body and compartmental muscle volume measurements on a wide bore 3T MR system.

    Science.gov (United States)

    Thomas, Marianna S; Newman, David; Leinhard, Olof Dahlqvist; Kasmai, Bahman; Greenwood, Richard; Malcolm, Paul N; Karlsson, Anette; Rosander, Johannes; Borga, Magnus; Toms, Andoni P

    2014-09-01

    To measure the test-retest reproducibility of an automated system for quantifying whole body and compartmental muscle volumes using wide bore 3 T MRI. Thirty volunteers stratified by body mass index underwent whole body 3 T MRI, two-point Dixon sequences, on two separate occasions. Water-fat separation was performed, with automated segmentation of whole body, torso, upper and lower leg volumes, and manually segmented lower leg muscle volumes. Mean automated total body muscle volume was 19.32 L (SD 9.1) and 19.28 L (SD 9.12) for the first and second acquisitions (intraclass correlation coefficient (ICC) = 1.0, 95% limits of agreement -0.32 to 0.2 L). ICCs for all automated test-retest muscle volumes were almost perfect (0.99-1.0), with 95% limits of agreement 1.8-6.6% of mean volume. Automated muscle volume measurements correlate closely with manual quantification (right lower leg: manual 1.68 L (2SD 0.6) compared to automated 1.64 L (2SD 0.6); left lower leg: manual 1.69 L (2SD 0.64) compared to automated 1.63 L (SD 0.61); correlation coefficients for automated and manual segmentation were 0.94-0.96). Fully automated whole body and compartmental muscle volume quantification can be achieved rapidly on a 3 T wide bore system with very low margins of error, excellent test-retest reliability and excellent correlation to manual segmentation in the lower leg. Sarcopaenia is an important reversible complication of a number of diseases. Manual quantification of muscle volume is time-consuming and expensive. Muscles can be imaged using in- and out-of-phase MRI. Automated atlas-based segmentation can identify muscle groups. Automated muscle volume segmentation is reproducible and can replace manual measurements.
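
    The test-retest comparison reported above is the classic Bland-Altman analysis: bias (mean of the paired differences) and 95% limits of agreement (bias ± 1.96 SD). A minimal sketch on hypothetical paired muscle volumes in litres (not the study's data) shows how those quantities are obtained:

```python
import statistics

# Hypothetical paired test-retest whole-body muscle volumes (litres).
scan1 = [19.4, 12.1, 25.3, 17.8, 30.2, 9.6]
scan2 = [19.2, 12.3, 25.1, 17.9, 30.0, 9.7]

diffs = [a - b for a, b in zip(scan1, scan2)]
bias = statistics.mean(diffs)            # systematic difference between sessions
sd = statistics.stdev(diffs)             # spread of the differences
loa_lower = bias - 1.96 * sd             # 95% limits of agreement
loa_upper = bias + 1.96 * sd
print(f"bias={bias:.3f} L, LoA=({loa_lower:.3f}, {loa_upper:.3f}) L")
```

    Narrow limits of agreement relative to the mean volume (1.8-6.6% in the study) are what justify the "almost perfect" reproducibility claim.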

  3. Computational mass spectrometry for small molecules

    Science.gov (United States)

    2013-01-01

    The identification of small molecules from mass spectrometry (MS) data remains a major challenge in the interpretation of MS data. This review covers the computational aspects of identifying small molecules, from the identification of a compound by searching a reference spectral library, to the structural elucidation of unknowns. In detail, we describe the basic principles and pitfalls of searching mass spectral reference libraries. Determining the molecular formula of the compound can serve as a basis for subsequent structural elucidation; consequently, we cover different methods for molecular formula identification, focussing on isotope pattern analysis. We then discuss automated methods to deal with mass spectra of compounds that are not present in spectral libraries, and provide an insight into de novo analysis of fragmentation spectra using fragmentation trees. In addition, this review briefly covers the reconstruction of metabolic networks using MS data. Finally, we list available software for different steps of the analysis pipeline. PMID:23453222
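
    The molecular formula identification step can be illustrated with a brute-force search: enumerate elemental compositions whose monoisotopic mass matches the measured mass within a ppm tolerance. The sketch below uses a CHNO alphabet and illustrative search bounds only; real tools additionally score isotope patterns and apply chemical plausibility filters. The monoisotopic masses and the glucose example are standard values.

```python
# Monoisotopic masses (u) of the most abundant isotopes.
MONO = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

def formula_candidates(measured: float, ppm: float = 5.0, max_atoms: int = 40):
    """Return (C, H, N, O) compositions within a ppm mass tolerance."""
    tol = measured * ppm / 1e6
    hits = []
    for c in range(max_atoms + 1):
        for n in range(max_atoms + 1):
            for o in range(max_atoms + 1):
                base = c * MONO["C"] + n * MONO["N"] + o * MONO["O"]
                if base > measured + tol:
                    break  # heavier o values only overshoot further
                # Only one hydrogen count can land within tolerance: the nearest.
                h = round((measured - base) / MONO["H"])
                if h < 0:
                    continue
                if abs(base + h * MONO["H"] - measured) <= tol:
                    hits.append((c, h, n, o))
    return hits

# Glucose, C6H12O6: monoisotopic mass 180.06339 u.
print(formula_candidates(180.06339))
```

    The combinatorial explosion with molecule size and element alphabet is exactly why the review emphasizes isotope pattern analysis: it prunes the candidate list far more effectively than mass accuracy alone.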

  4. Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  5. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part B, Remedial action, robotics/automation, waste management

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration (ER) and waste management (WM) problems at the Oak Ridge K-25 Site. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remediation, decontamination, and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This Vol. 3B provides the Technology Evaluation Data Sheets (TEDS) for ER/WM activities (Remedial Action, Robotics and Automation, Waste Management) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than each technology in Vol. 2. The TEDS are arranged alphanumerically by the TEDS code number in the upper right corner of each data sheet. Volume 3 can be used in two ways: (1) technologies that are identified from Vol. 2 can be referenced directly in Vol. 3 by using the TEDS codes, and (2) technologies and general technology areas (alternatives) can be located in the index in the front of this volume.

  6. Evaluation of an Automated Keywording System.

    Science.gov (United States)

    Malone, Linda C.; And Others

    1990-01-01

    Discussion of automated indexing techniques focuses on ways to statistically document improvements in the development of an automated keywording system over time. The system developed by the Joint Chiefs of Staff to automate the storage, categorization, and retrieval of information from military exercises is explained, and performance measures are…

  7. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 2, includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into six sessions on the basis of the classification of the manuscripts considered, listed as follows: Mathematical Modeling, Analysis and Computation, Control Engineering, Reliable Networks Design, Vehicular Communications and Networking, Automation and Mechatronics.

  8. Automation for a base station stability testing

    OpenAIRE

    Punnek, Elvis

    2016-01-01

    This Bachelor's thesis was commissioned by Oy LM Ericsson Ab Oulu. Its aim was to help investigate and create a test automation solution for the stability testing of the LTE base station. The main objective was to create test automation for a predefined test set. This test automation solution had to be created for specific environments and equipment. The work included creating the automation for the test cases and putting them into daily test automation jobs. The key factor...

  9. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  10. Effects of alcohol on automated and controlled driving performances.

    Science.gov (United States)

    Berthelon, Catherine; Gineyt, Guy

    2014-05-01

    Alcohol is the most frequently detected substance in fatal automobile crashes, but its precise mode of action is not always clear. The present study was designed to establish the influence of blood alcohol concentration as a function of the complexity of the scenarios. Road scenarios implying automatic or controlled driving performances were manipulated in order to identify which behavioral parameters were deteriorated. A single blind counterbalanced experiment was conducted on a driving simulator. Sixteen experienced drivers (25.3 ± 2.9 years old, 8 men and 8 women) were tested with 0, 0.3, 0.5, and 0.8 g/l of alcohol. Driving scenarios varied: road tracking, car following, and an urban scenario including events inspired by real accidents. Statistical analyses were performed on driving parameters as a function of alcohol level. Automated driving parameters such as standard deviation of lateral position measured with the road tracking and car following scenarios were impaired by alcohol, notably with the highest dose. More controlled parameters such as response time to braking and number of crashes when confronted with specific events (urban scenario) were less affected by the alcohol level. Performance decrement was greater with driving scenarios involving automated processes than with scenarios involving controlled processes.
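
    The "standard deviation of lateral position" (SDLP) used above as the automated-driving metric is simply the dispersion of the vehicle's lane position over time; larger values mean more weaving. A minimal sketch on synthetic traces (illustrative numbers, not the study's data):

```python
import statistics

def sdlp(lateral_positions_m):
    """Standard deviation of lateral position (m), a common weaving metric."""
    return statistics.stdev(lateral_positions_m)

# Synthetic lane-position samples (m from lane centre) for two conditions.
sober_trace =    [0.02, -0.03, 0.01, 0.00, -0.02, 0.03, -0.01, 0.02]
impaired_trace = [0.10, -0.12, 0.08, -0.05, 0.15, -0.09, 0.11, -0.14]

print(f"sober SDLP:    {sdlp(sober_trace):.3f} m")
print(f"impaired SDLP: {sdlp(impaired_trace):.3f} m")
```

    Because SDLP taps the continuous, automated component of lane keeping, it tends to degrade with alcohol dose earlier than event-driven measures such as braking response time, which matches the pattern the study reports.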

  11. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. During the past few decades automation systems have become more common and are used today in everything from big industrial solutions to the homes of private customers. With the growing need for ecological and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating etc. Auto...

  12. Guidelines for Automation Project Execution

    OpenAIRE

    Takkinen, Heidi

    2011-01-01

    The purpose of this Master's thesis was to create instructions for executing an automation project. Sarlin Oy Ab needed directions on how to execute an automation project. Sarlin is starting up a new business area offering total project solutions for customers. Sarlin focuses on small and minor automation projects in domestic markets. The thesis presents issues related to project execution, starting from the theory of the project to its kick-off and termination. Site work is one importan...

  13. Organizational changes and automation: Towards a customer-oriented automation: Part 3

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1994-01-01

    Automation offers great opportunities in the efforts of energy utilities in the Netherlands to reorganize into more customer-oriented businesses. However, automation in itself is not enough. First, the organizational structure has to be changed considerably. Various energy utilities have already started on it. The restructuring principle is the same everywhere, but the way it is implemented differs widely. In this article attention is paid to the necessity of realizing an integrated computerized system, which, however, is not feasible at the moment. The second-best alternative is to use various computerized systems capable of two-way data exchange. Two viable approaches are discussed: (1) one operating system on which all automated systems within a company should run, or (2) selective system linking on the basis of the required speed of information exchange. Option (2) offers more freedom in selecting systems. 2 figs

  14. You're a What? Automation Technician

    Science.gov (United States)

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  15. Does Automated Feedback Improve Writing Quality?

    Science.gov (United States)

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  16. Reciclagem de cobre proveniente de analisador automático de carbono e nitrogênio

    Directory of Open Access Journals (Sweden)

    Bendassolli José Albertino

    2002-01-01

Full Text Available Isotopic and elemental analysis of N, C and S in liquid and solid samples has been simplified with the advent of automated systems. The simplest method of automation for this kind of analysis involves an elemental analyzer interfaced directly to the ion source of an IRMS (Isotope Ratio Mass Spectrometer). In the analyzer's reduction system, a considerable amount of oxidized copper is generated as solid residue. This material is normally imported and very expensive. A methodology was proposed for recovering metallic copper in order to recycle this reagent in the reduction system of a GC-IRMS, using hydrogen gas in the vacuum line. Results show that about 95% of the initial metallic copper used in the reduction system can be recycled.
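
The recovery step above amounts to reducing CuO back to metallic copper with hydrogen (CuO + H2 → Cu + H2O). As a quick illustration of the mass balance involved (the 100 g batch size and the helper function are invented for this sketch; only the ~95% figure comes from the record):

```python
# Stoichiometric sketch: how much metallic Cu the H2 reduction step can
# return from a batch of oxidized copper, at ~95% recycling efficiency.
M_CU, M_O = 63.55, 16.00  # standard molar masses, g/mol

def copper_recovered(cuo_mass_g, efficiency=0.95):
    cu_fraction = M_CU / (M_CU + M_O)  # mass fraction of Cu in CuO
    return cuo_mass_g * cu_fraction * efficiency

print(round(copper_recovered(100.0), 1))  # ~75.9 g Cu from 100 g CuO
```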

  17. System reliability, performance and trust in adaptable automation.

    Science.gov (United States)

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  18. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Specific features and possible functions of equipment for the automated estimation of elongated continuity defects in samples with plane surfaces in magnetographic defectoscopy are discussed. Two models of automated magnetographic flaw detectors, one with a built-in microcomputer and one in the form of a computer attachment, are described. Directions for further research and development are discussed. 35 refs., 6 figs

  19. Non-perturbative renormalization in coordinate space for Nf=2 maximally twisted mass fermions with tree-level Symanzik improved gauge action

    International Nuclear Information System (INIS)

    Cichy, Krzysztof; Adam Mickiewicz Univ., Poznan; Jansen, Karl; Korcyl, Piotr; Jagiellonian Univ., Krakow

    2012-07-01

    We present results of a lattice QCD application of a coordinate space renormalization scheme for the extraction of renormalization constants for flavour non-singlet bilinear quark operators. The method consists in analysing the small-distance behaviour of correlation functions in Euclidean space and has several theoretical and practical advantages; in particular, it is gauge invariant, easy to implement and has relatively low computational cost. The values of renormalization constants in the X-space scheme can be converted to the MS-bar scheme via 4-loop continuum perturbative formulae. Our results for Nf=2 maximally twisted mass fermions with tree-level Symanzik improved gauge action are compared to the ones from the RI-MOM scheme and show full agreement with that method. (orig.)
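
As background, the renormalization condition underlying such a coordinate-space (X-space) scheme can be sketched as follows; this is the generic formulation, with notation assumed rather than taken from the record. The renormalized two-point function of the bilinear $O$ at a fixed short distance $x_0$ is matched to its free-field value:

```latex
Z_O^{X}(x_0,\mu)^{2}\,\langle O(x)\,\bar{O}(0)\rangle\Big|_{x^2=x_0^2}
  \;=\; \langle O(x)\,\bar{O}(0)\rangle^{\text{free}}\Big|_{x^2=x_0^2}
```

The gauge invariance and low cost noted in the abstract follow from this condition involving only a position-space correlator at a single short distance.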

  20. Ergonomics, automation and logistics: practical and effective combination of working methods, a case study of a baking company.

    Science.gov (United States)

    Quintana, Leonardo; Arias, Claudia; Cordoba, Jorge; Moroy, Magda; Pulido, Jean; Ramirez, Angela

    2012-01-01

    The aim of this study was to combine three analytical methods from three different disciplines to diagnose the ergonomic conditions, manufacturing and supply chain operation of a baking company. The study presents a comprehensive set of working methods that combines ergonomics, automation and logistics study methods in the diagnosis of working conditions and productivity. The participatory approach of this type of study, which draws on the perceptions and first-hand knowledge of the workers, is a determining factor in defining points of action and ergonomic interventions, as well as opportunities for automation in manufacturing and logistics, to meet the needs of the company. The study identified an ergonomic problem (a high prevalence of wrist-hand pain), and the combination of interdisciplinary techniques applied allowed this condition to be improved. This type of study provides a primary basis for the opportunities presented by combining specialized methods from different disciplines to define comprehensive action plans for a company. Additionally, it outlines opportunities for improvement and recommendations to mitigate the burden associated with occupational diseases and, ultimately, improve workers' quality of life and productivity.

  1. A HOME-BASED MASSED PRACTICE SYSTEM FOR PEDIATRIC NEUROREHABILITATION

    Directory of Open Access Journals (Sweden)

    Yi-Ning Wu

    2013-11-01

    Full Text Available The objective of this paper is to introduce a novel low-cost human-computer interface (HCI) system for home-based massed practice for children with upper limb impairment due to brain injury. Successful massed practice, a type of neurorehabilitation, may be of value for children with brain injury because it facilitates use of the impaired limb. Automated, home-based systems could provide a practical means for massed practice; however, the optimal strategy to deliver and monitor home-based massed practice is still unclear. We integrated motion sensor, video game, and HCI software technologies to create a home-based massed-practice system targeting specific joints. The system records joint angle and number of movements using a low-cost custom hand-held sensor, which acts as an input device to play video games. We demonstrated the system's functionality and provide preliminary observations on usage by children with brain injury, including joint motion and muscle activation.
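
One way such a system can turn raw joint-angle samples into a movement count is a two-threshold (hysteresis) counter, sketched below; the thresholds and session data are invented, not taken from the paper:

```python
# Hypothetical movement counter for a joint-angle sensor stream: a
# hysteresis (two-threshold) scheme, so sensor jitter near a single
# threshold is not counted as extra repetitions.
def count_repetitions(angles, up=40.0, down=10.0):
    """Count one rep each time the angle rises above `up` after having
    dropped below `down`. Thresholds are illustrative."""
    reps = 0
    armed = True  # ready to count the next upward crossing
    for a in angles:
        if armed and a >= up:
            reps += 1
            armed = False
        elif not armed and a <= down:
            armed = True
    return reps

# A simulated session: two full flexion-extension cycles plus jitter.
session = [0, 5, 20, 45, 42, 30, 8, 3, 15, 41, 44, 12, 9]
print(count_repetitions(session))  # two upward crossings -> 2
```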

  2. Automated CO2 extraction from air for clumped isotope analysis in the atmo- and biosphere

    Science.gov (United States)

    Hofmann, Magdalena; Ziegler, Martin; Pons, Thijs; Lourens, Lucas; Röckmann, Thomas

    2015-04-01

    The conventional stable isotope ratios 13C/12C and 18O/16O in atmospheric CO2 are a powerful tool for unraveling the global carbon cycle. In recent years, it has been suggested that the abundance of the very rare isotopologue 13C18O16O on m/z 47 might be a promising tracer to complement conventional stable isotope analysis of atmospheric CO2 [Affek and Eiler, 2006; Affek et al. 2007; Eiler and Schauble, 2004; Yeung et al., 2009]. Here we present an automated analytical system that is designed for clumped isotope analysis of atmo- and biospheric CO2. The carbon dioxide gas is quantitatively extracted from about 1.5 L of air (ATP). The automated stainless steel extraction and purification line consists of three main components: (i) a drying unit (a magnesium perchlorate unit and a cryogenic water trap), (ii) two CO2 traps cooled with liquid nitrogen [Werner et al., 2001] and (iii) a GC column packed with Porapak Q that can be cooled with liquid nitrogen to -30°C during purification and heated up to 230°C in-between two extraction runs. After CO2 extraction and purification, the CO2 is automatically transferred to the mass spectrometer. Mass spectrometric analysis of the 13C18O16O abundance is carried out in dual inlet mode on a MAT 253 mass spectrometer. Each analysis generally consists of 80 change-over-cycles. Three additional Faraday cups were added to the mass spectrometer for simultaneous analysis of the mass-to-charge ratios 44, 45, 46, 47, 48 and 49. The reproducibility for δ13C, δ18O and Δ47 for repeated CO2 extractions from air is in the range of 0.11‰ (SD), 0.18‰ (SD) and 0.02‰ (SD), respectively. This automated CO2 extraction and purification system will be used to analyse the clumped isotopic signature in atmospheric CO2 (tall tower, Cabauw, Netherlands) and to study the clumped isotopic fractionation during photosynthesis (leaf chamber experiments) and soil respiration. References Affek, H. P., Xu, X. & Eiler, J. M., Geochim. Cosmochim. Acta 71, 5033
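
The Δ47 quantity mentioned above is commonly defined as the per-mil enrichment of mass-47 CO2 relative to a stochastic isotopologue distribution (Affek and Eiler, 2006). A minimal sketch of that calculation, with illustrative ratio values:

```python
# Sketch of the commonly cited Delta-47 definition. r45/r46/r47 are the
# measured ion-current ratios m/z 45, 46, 47 over 44; the starred values
# are the stochastic ratios computed from bulk 13C and 18O content.
# All numeric values below are illustrative, not measured data.
def delta_47(r45, r46, r47, r45_star, r46_star, r47_star):
    return ((r47 / r47_star - 1)
            - (r46 / r46_star - 1)
            - (r45 / r45_star - 1)) * 1000.0  # per mil

# Sanity check: a perfectly stochastic gas has Delta-47 = 0 by construction.
print(delta_47(0.0118, 0.00412, 4.8e-5, 0.0118, 0.00412, 4.8e-5))  # 0.0
```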

  3. Update on Heavy-Meson Spectrum Tests of the Oktay--Kronfeld Action

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, Jon A. [Seoul Natl. U.; Jang, Yong-Chull [Seoul Natl. U.; Lee, Weonjong [Seoul Natl. U.; DeTar, Carleton [Utah U.; Kronfeld, Andreas S. [TUM-IAS, Munich; Oktay, Mehmet B. [Iowa U.

    2016-01-18

    We present updated results of a numerical improvement test, based on the heavy-meson spectrum, for the Oktay--Kronfeld (OK) action. The OK action is an extension of the Fermilab improvement program for massive Wilson fermions, including all dimension-six and some dimension-seven bilinear terms. Improvement terms are truncated by HQET power counting at $\mathrm{O}(\Lambda^3/m_Q^3)$ for heavy-light systems, and by NRQCD power counting at $\mathrm{O}(v^6)$ for quarkonium. They suffice for tree-level matching to QCD to the given order in the power-counting schemes. To assess the improvement, we generate new data with the OK and Fermilab actions that cover both the charm and bottom quark mass regions on a MILC coarse $(a \approx 0.12~\text{fm})$ $2+1$ flavor, asqtad-staggered ensemble. We update the analyses of the inconsistency quantity and the hyperfine splittings for the rest and kinetic masses. With one exception, the results clearly show that the OK action significantly reduces heavy-quark discretization effects in the meson spectrum. The exception is the hyperfine splitting of the heavy-light system near the $B_s$ meson mass, where statistics are too low to draw a firm conclusion, despite promising results.

  4. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  5. Automated detection of pain from facial expressions: a rule-based approach using AAM

    Science.gov (United States)

    Chen, Zhanli; Ansari, Rashid; Wilkie, Diana J.

    2012-02-01

    In this paper, we examine the problem of using video analysis to assess pain, an important problem especially for critically ill, non-communicative patients, and people with dementia. We propose and evaluate an automated method to detect the presence of pain manifested in patient videos using a unique and large collection of cancer patient videos captured in patient homes. The method is based on detecting pain-related facial action units defined in the Facial Action Coding System (FACS) that is widely used for objective assessment in pain analysis. In our research, a person-specific Active Appearance Model (AAM) based on Project-Out Inverse Compositional Method is trained for each patient individually for the modeling purpose. A flexible representation of the shape model is used in a rule-based method that is better suited than the more commonly used classifier-based methods for application to the cancer patient videos in which pain-related facial actions occur infrequently and more subtly. The rule-based method relies on the feature points that provide facial action cues and is extracted from the shape vertices of AAM, which have a natural correspondence to face muscular movement. In this paper, we investigate the detection of a commonly used set of pain-related action units in both the upper and lower face. Our detection results show good agreement with the results obtained by three trained FACS coders who independently reviewed and scored the action units in the cancer patient videos.
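
A rule of the kind described, operating on AAM shape vertices, might look like the following hypothetical check for AU4 (brow lowerer); all landmark coordinates, the baseline, and the tolerance are invented for illustration and are not the paper's actual rules:

```python
# Hypothetical rule-based action-unit check on AAM landmarks: flag AU4
# when the brow-to-eye distance, normalized by inter-ocular distance,
# drops below a person-specific neutral baseline by some tolerance.
def au4_active(brow_y, eye_y, interocular, baseline_ratio, tolerance=0.15):
    ratio = abs(brow_y - eye_y) / interocular
    return ratio < baseline_ratio * (1.0 - tolerance)

# Person-specific baseline from a neutral frame (invented coordinates).
neutral = abs(50.0 - 70.0) / 60.0             # baseline ratio ~0.333
print(au4_active(58.0, 70.0, 60.0, neutral))  # brows lowered -> True
print(au4_active(51.0, 70.0, 60.0, neutral))  # near neutral  -> False
```

Normalizing by inter-ocular distance makes such a rule insensitive to the subject's distance from the camera, which matters for home-recorded patient videos.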

  6. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

    Full Text Available Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought of a knowledgeable reader would probably be: why this old topic again? What is new to discuss? Yet everyone agrees that automated testing today is not what it was ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and hybrid frameworks that facilitate the testing of applications developed on various platforms and technologies. Automation has undoubtedly advanced, but so have the myths associated with it; the change in people's perspective and knowledge of automation has altered the terrain. This article reflects the author's point of view and experience on how the original myths have been transformed into new versions and how they are derived, and offers his thoughts on the new generation of myths.

  8. Proof-of-concept automation of propellant processing

    Science.gov (United States)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

    For space-based propellant production, automation of the process is needed; currently, all phases of terrestrial production involve some form of human interaction. A mixer was acquired to help perform the automation tasks. A heating system to be used with the mixer was designed, built, and installed, and tests performed on the heating system verified the design criteria. An IBM PS/2 personal computer was acquired for future automation work, and it is hoped that the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  9. Automated Radioanalytical Chemistry: Applications For The Laboratory And Industrial Process Monitoring

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Farawila, Anne F.; Grate, Jay W.

    2009-01-01

    The identification and quantification of targeted α- and β-emitting radionuclides via destructive analysis in complex radioactive liquid matrices is highly challenging. Analyses are typically accomplished at on- or off-site laboratories through laborious sample preparation steps and extensive chemical separations, followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, alpha energy spectroscopy, mass spectrometry). Analytical results may take days or weeks to report. When an industrial-scale plant requires periodic or continuous monitoring of radionuclides, for example as an indication of the composition of its feed stream, of diversion of safeguarded nuclides, or of plant operational conditions, radiochemical measurements should be rapid, but not at the expense of precision and accuracy. Scientists at Pacific Northwest National Laboratory have developed and characterized a host of automated radioanalytical systems designed to perform reproducible and rapid radioanalytical processes. Platforms have been assembled for (1) automating and accelerating sample analysis in the laboratory and (2) monitoring industrial-scale nuclear processes on-line with near-real-time results. These methods have been applied to the analysis of actinides and fission products at levels ranging from environmental samples to high-level nuclear process fluids. The systems have been designed to integrate a number of discrete sample handling steps, including sample pretreatment (e.g., digestion and valence state adjustment) and chemical separations, and have either utilized on-line analyte detection or collected the purified analyte fractions for off-line measurement. One PNNL system of particular note is a fully automated prototype on-line radioanalytical system designed for the Waste Treatment Plant at Hanford, WA, USA. This system demonstrated nearly continuous destructive analysis of the soft β-emitting radionuclide 99Tc in nuclear

  10. Solid-Phase Extraction Strategies to Surmount Body Fluid Sample Complexity in High-Throughput Mass Spectrometry-Based Proteomics

    Science.gov (United States)

    Bladergroen, Marco R.; van der Burgt, Yuri E. M.

    2015-01-01

    For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS-acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply for both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE) including affinity enrichment strategies have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071

  11. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
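
The item-selection side of ATA can be illustrated with a toy exhaustive search; real systems use mixed-integer programming solvers, and the item bank and content quotas below are invented:

```python
# Toy illustration of automated test assembly: choose a fixed-length
# form that maximizes total item information subject to content-area
# quotas. Exhaustive search only works for tiny banks; production ATA
# formulates this as a mixed-integer program.
from itertools import combinations

# (item id, content area, information at the target ability) - invented data
bank = [(1, "algebra", 0.9), (2, "algebra", 0.7), (3, "geometry", 0.8),
        (4, "geometry", 0.5), (5, "algebra", 0.6), (6, "geometry", 0.4)]

def assemble(bank, length=4, quota={"algebra": 2, "geometry": 2}):
    best, best_info = None, -1.0
    for form in combinations(bank, length):
        counts = {}
        for _, area, _info in form:
            counts[area] = counts.get(area, 0) + 1
        if counts != quota:
            continue  # violates the content specification
        info = sum(i for _, _, i in form)
        if info > best_info:
            best, best_info = form, info
    return best, best_info

form, info = assemble(bank)
print(sorted(item[0] for item in form), round(info, 2))  # [1, 2, 3, 4] 2.9
```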

  12. Context-Aware user interfaces in automation

    DEFF Research Database (Denmark)

    Olsen, Mikkel Holm

    2007-01-01

    Automation is deployed in a great range of different domains such as the chemical industry, the production of consumer goods, the production of energy (both in terms of power plants and in the petrochemical industry), transportation and several others. Through several decades the complexity of automation systems and the level of automation have been rising. This has caused problems regarding the operator's ability to comprehend the overall situation and state of the automation system, in particular in abnormal situations. The amount of data available to the operator results in information overload. Since context-aware applications have been developed in other research areas it seems natural to analyze the findings of this research and examine how this can be applied to the domain of automation systems. By evaluating existing architectures for the development of context-aware applications we find...

  13. Automated transit planning, operation, and applications

    CERN Document Server

    Liu, Rongfang

    2016-01-01

    This book analyzes the successful implementations of automated transit in various international locations, such as Paris, Toronto, London, and Kuala Lumpur, and investigates the apparent lack of automated transit applications in the urban environment in the United States. The book begins with a brief definition of automated transit and its historical development. After a thorough description of the technical specifications, the author highlights a few applications from each sub-group of the automated transit spectrum. International case studies display various technologies and their applications, and identify vital factors that affect each system and performance evaluations of existing applications. The book then discusses the planning and operation of automated transit applications at both macro and micro levels. Finally, the book covers a number of less successful concepts, as well as the lessons learned, allowing readers to gain a comprehensive understanding of the topic.

  14. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites that represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR that has demonstrated ability to provide a valuable DR resource for California.
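
The Auto-DR idea described above, a pre-programmed strategy triggered by an external signal with no human in the loop, can be sketched as follows; the signal names and setpoint offsets are invented, not taken from the project:

```python
# Minimal sketch of fully automated demand response: receipt of an
# external CPP price signal triggers a pre-programmed setpoint shed
# with no human intervention. Signal levels and offsets are invented.
def auto_dr_setpoint(base_setpoint_c, signal):
    """Return the cooling setpoint after applying the pre-programmed
    shed strategy for the received CPP signal level."""
    offsets = {"normal": 0.0, "moderate": 1.0, "high": 2.5}  # degrees C
    return base_setpoint_c + offsets.get(signal, 0.0)

print(auto_dr_setpoint(22.0, "normal"))  # 22.0
print(auto_dr_setpoint(22.0, "high"))    # 24.5 - deeper shed at peak price
```

Raising the cooling setpoint during the peak-price window is one of the simplest shed strategies a building control system can pre-program against such a signal.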

  15. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  16. Automated analysis of organic particles using cluster SIMS

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Greg; Zeissler, Cindy; Mahoney, Christine; Lindstrom, Abigail; Fletcher, Robert; Chi, Peter; Verkouteren, Jennifer; Bright, David; Lareau, Richard T.; Boldman, Mike

    2004-06-15

    Cluster primary ion bombardment combined with secondary ion imaging is used on an ion microscope secondary ion mass spectrometer for the spatially resolved analysis of organic particles on various surfaces. Compared to the use of monoatomic primary ion beam bombardment, the use of a cluster primary ion beam (SF₅⁺ or C₈⁻) provides significant improvement in molecular ion yields and a reduction in beam-induced degradation of the analyte molecules. These characteristics of cluster bombardment, along with automated sample stage control and custom image analysis software are utilized to rapidly characterize the spatial distribution of trace explosive particles, narcotics and inkjet-printed microarrays on a variety of surfaces.

  17. LIBRARY AUTOMATION IN NIGERAN UNIVERSITIES

    African Journals Online (AJOL)

    facilitate services and access to information in libraries is widely acceptable. ... Moreover, Ugah (2001) reports that the automation process at the Abubakar ... blueprint in 1987 and a turn-key system of automation was suggested for the library.

  18. AUTOMATED PROCESSING OF DAIRY PRODUCT MICROPHOTOS USING IMAGEJ AND STATISTICA

    Directory of Open Access Journals (Sweden)

    V. K. Bitiukov

    2014-01-01

    Full Text Available Summary. The article discusses the construction of algorithms for the automated processing of microphotos of dairy products. Automated processing of microphotos of dairy products is relevant in studying the degree of homogenization, since the microphotos contain information about the distribution of fat globules by mass fraction. Today there are software products offering image processing that relieve researchers from the routine operations of manual data processing, but they need to be adapted to processing microphotos of dairy products. In this paper we propose using the application package ImageJ to process image files taken with a digital microscope, and the software package Statistica to calculate the statistical characteristics of the recognition results. The processing algorithm consists of successive stages of conversion to gray scale, scaling, filtering, binarization, object recognition and statistical processing of the results of recognition. The result of the implemented data processing algorithms is the distribution function of the fat globules in terms of volume or mass fraction, as well as the statistical parameters of the distribution (the mathematical expectation, variance, and skewness and kurtosis coefficients). To inspect and debug the algorithm, experimental studies were carried out: farm milk was homogenized at different homogenization pressures, microphotos were taken of each sample, and image processing was carried out in accordance with the proposed algorithm. The studies have shown the effectiveness and feasibility of the proposed algorithm, implemented as a script for ImageJ that then sends the data to a file for the software package Statistica.
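
The binarization, object-recognition and statistics stages of such a pipeline can be sketched in a few lines; the threshold and the toy image below are invented, and a real run would operate on microscope images inside ImageJ:

```python
# Sketch of the pipeline stages named above on a toy grayscale image:
# threshold binarization, 4-connected component labeling (the "object
# recognition" step), then distribution statistics over object sizes.
def binarize(img, thresh=128):
    # Dark pixels (fat globules) become foreground.
    return [[1 if px < thresh else 0 for px in row] for row in img]

def label_objects(binary):
    """4-connected component labeling by flood fill; returns object sizes."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                stack, size = [(y, x)], 0
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy-1,cx),(cy+1,cx),(cy,cx-1),(cy,cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes

# Invented 5x7 "microphoto" with two dark globules.
img = [[200, 200, 200, 200, 200, 200, 200],
       [200,  50,  50, 200, 200, 200, 200],
       [200,  50,  50, 200, 200,  60, 200],
       [200, 200, 200, 200,  60,  60, 200],
       [200, 200, 200, 200, 200, 200, 200]]
sizes = label_objects(binarize(img))
mean = sum(sizes) / len(sizes)
var = sum((s - mean) ** 2 for s in sizes) / len(sizes)
print(sorted(sizes), mean, var)  # two "globules": [3, 4] mean 3.5 var 0.25
```

Skewness and kurtosis follow the same pattern as the variance line, using third and fourth central moments.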

  19. Automated 741 document preparation: Oak Ridge National Laboratory's Automated Safeguards Information System (OASIS)

    International Nuclear Information System (INIS)

    Austin, H.C.; Gray, L.M.

    1982-01-01

    OASIS has been providing for Oak Ridge National Laboratory's total safeguards needs since being placed on line in April 1980. The system supports near real-time nuclear materials safeguards and accountability control. The original design of OASIS called for an automated facsimile of a 741 document to be prepared as a functional by-product of updating the inventory. An attempt was made to utilize, intact, DOE-Albuquerque's automated 741 system to generate the facsimile; however, the five-page document produced proved too cumbersome. Albuquerque's programs were therefore modified to print an original 741 document utilizing standard DOE/NRC 741 forms. It is felt that the best features of both the automated and manually generated 741 documents have been incorporated. Automating the source data for 741 shipping documents produces greater efficiency while reducing possible errors. Through utilization of the standard DOE/NRC form, continuity within the NMMSS system is maintained, thus minimizing the confusion and redundancy associated with facsimiles. OASIS now fulfills the original concept of near real-time accountability by furnishing a viable 741 document as a function of updating the inventory.

  20. Individual differences in the calibration of trust in automation.

    Science.gov (United States)

    Pop, Vlad L; Shrewsbury, Alex; Durso, Francis T

    2015-06-01

    The objective was to determine whether operators with an expectancy that automation is trustworthy are better at calibrating their trust to changes in the capabilities of automation, and if so, why. Studies suggest that individual differences in automation expectancy may be able to account for why changes in the capabilities of automation lead to a substantial change in trust for some, yet only a small change for others. In a baggage screening task, 225 participants searched for weapons in 200 X-ray images of luggage. Participants were assisted by an automated decision aid exhibiting different levels of reliability. Measures of expectancy that automation is trustworthy were used in conjunction with subjective measures of trust and perceived reliability to identify individual differences in trust calibration. Operators with high expectancy that automation is trustworthy were more sensitive to changes (both increases and decreases) in automation reliability. This difference was eliminated by manipulating the causal attribution of automation errors. Attributing the cause of automation errors to factors external to the automation fosters an understanding of tasks and situations in which automation differs in reliability and may lead to more appropriate trust. The development of interventions can lead to calibrated trust in automation. © 2014, Human Factors and Ergonomics Society.

  1. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    Full Text Available The article focuses on the possibility of automating taxiing, the part of a flight which, under adverse weather conditions, greatly reduces the operational usability of an airport, and the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on information from visual perception. The article primarily deals with possible ways of obtaining navigational information and transferring it automatically to the controls. Currently available technologies useful for navigation, such as computer vision, Light Detection and Ranging and Global Navigation Satellite Systems, were analyzed and assessed, and their general implementation into an airplane was designed. Obstacles to the implementation were identified, too. The result is a proposed combination of systems, along with their installation into the airplane's systems, that makes automated taxiing possible.

  2. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

    The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view of safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) to the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to qualitative analysis and structural reliability modeling issues, the possibility of evaluating failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying them are discussed; a framework for applying expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)
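
    The fault tree analyses mentioned above combine basic-event failure probabilities through AND/OR gates. A minimal sketch of that quantification step, assuming independent basic events; the gate structure and probabilities below are hypothetical, not from the report:

```python
# Minimal fault-tree quantification sketch (illustrative values only).

def and_gate(*probs):
    """All inputs must fail (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Gate fails if any input fails (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Two redundant automation channels, each failing from hardware OR software faults:
p_channel = or_gate(1e-3, 5e-4)         # hypothetical hardware, software probabilities
p_top = and_gate(p_channel, p_channel)  # top event: both channels fail
```

    The difficulty the report raises is precisely that software failure probabilities such as the 5e-4 above cannot be measured directly, which is where expert judgement enters.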

  3. Automated replication of cone beam CT-guided treatments in the Pinnacle(3) treatment planning system for adaptive radiotherapy.

    Science.gov (United States)

    Hargrave, Catriona; Mason, Nicole; Guidi, Robyn; Miller, Julie-Anne; Becker, Jillian; Moores, Matthew; Mengersen, Kerrie; Poulsen, Michael; Harden, Fiona

    2016-03-01

    Time-consuming manual methods have been required to register cone-beam computed tomography (CBCT) images with plans in the Pinnacle(3) treatment planning system in order to replicate delivered treatments for adaptive radiotherapy. These methods rely on fiducial marker (FM) placement during CBCT acquisition or the image mid-point to localise the image isocentre. A quality assurance study was conducted to validate an automated CBCT-plan registration method utilising the Digital Imaging and Communications in Medicine (DICOM) Structure Set (RS) and Spatial Registration (RE) files created during online image-guided radiotherapy (IGRT). CBCTs of a phantom were acquired with FMs and predetermined setup errors using various online IGRT workflows. The CBCTs, DICOM RS and RE files were imported into Pinnacle(3) plans of the phantom and the resulting automated CBCT-plan registrations were compared to existing manual methods. A clinical protocol for the automated method was subsequently developed and tested retrospectively using CBCTs and plans for six bladder patients. The automated CBCT-plan registration method was successfully applied to thirty-four phantom CBCT images acquired with an online 0 mm action level workflow. Ten CBCTs acquired with other IGRT workflows required manual workarounds. This was addressed during the development and testing of the clinical protocol using twenty-eight patient CBCTs. The automated CBCT-plan registrations were instantaneous, replicating delivered treatments in Pinnacle(3) with errors of ±0.5 mm. These errors were comparable to mid-point-dependent manual registrations but superior to FM-dependent manual registrations. The automated CBCT-plan registration method quickly and reliably replicates delivered treatments in Pinnacle(3) for adaptive radiotherapy.
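
    A DICOM Spatial Registration (RE) object carries a homogeneous 4x4 transform between frames of reference; replicating a CBCT-plan registration amounts to applying that matrix to points such as the image isocentre. A minimal sketch of that step, with a hypothetical matrix rather than the study's data:

```python
# Sketch: apply a rigid 4x4 registration matrix (as stored in a DICOM RE
# object) to a 3D point in mm. Matrix values here are hypothetical.
import numpy as np

def apply_registration(matrix4x4, point_mm):
    """Apply a homogeneous 4x4 transform to a 3D point (mm)."""
    p = np.append(np.asarray(point_mm, dtype=float), 1.0)
    return (np.asarray(matrix4x4, dtype=float) @ p)[:3]

# A pure 2 mm shift along x (no rotation), e.g. a couch correction:
shift = np.array([[1, 0, 0, 2.0],
                  [0, 1, 0, 0.0],
                  [0, 0, 1, 0.0],
                  [0, 0, 0, 1.0]])
iso = apply_registration(shift, [0.0, 0.0, 0.0])
```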

  4. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system...... that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system......, (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  5. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used in the working face include the shearer and the self-advancing frame. The shearer has been changed from the radio-controlled model to a microcomputer-operated machine, with various functions automated. In addition, a system is being developed for comprehensively examining operating conditions and natural conditions in the working face for further automation. The self-advancing frame has been modified from the sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units and valves must be made smaller and more reliable. The system will be controlled from above ground in the future, provided that the machines in the working face are remote controlled at the gate while transmitting relevant data above ground. Thus, an automated working face will be realized. (2 figs, 1 photo)

  6. Controls and automation in the SPIRAL project

    International Nuclear Information System (INIS)

    Bothner, U.; Boulot, A.; Maherault, J.; Martial, L.

    1999-01-01

    Within the framework of the SPIRAL collaboration, the control and automation team of the Accelerator-Exotic Beam R and D Department has had the following tasks: 1. automation of the resonator high-frequency equipment of the CIME cyclotron; 2. automation of the vacuum equipment, i.e. the low energy line (TBE), the CIME cyclotron, and the low energy line (BE); 3. automation of load safety for power supply; 4. for each of these tasks, a circuitry file based on the SCHEMA software has been worked out. The programs required for the automation of load safety for power supply (STEP5, PROTOOL, DESIGNER 4.1) were developed and implemented on a PC

  7. MALDI-TOF mass spectrometry for rapid diagnosis of postoperative endophthalmitis.

    Science.gov (United States)

    Mailhac, Adriane; Durand, Harmonie; Boisset, Sandrine; Maubon, Danièle; Berger, Francois; Maurin, Max; Chiquet, Christophe; Bidart, Marie

    2017-01-30

    This study describes an innovative strategy for rapid detection and identification of bacteria causing endophthalmitis, combining the use of an automated blood culture system with MALDI-TOF mass spectrometry methodology. Using this protocol, we could identify 96% of 45 bacterial strains isolated from vitreous samples collected in acute post-operative endophthalmitis patients. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Automated controlled-potential coulometric determination of uranium

    International Nuclear Information System (INIS)

    Knight, C.H.; Clegg, D.E.; Wright, K.D.; Cassidy, R.M.

    1982-06-01

    A controlled-potential coulometer has been automated in our laboratory for the routine determination of uranium in solution. The CRNL-designed automated system controls degassing, prereduction, and reduction of the sample. The final result is displayed on a digital coulometer readout. Manual and automated modes of operation are compared to show the precision and accuracy of the automated system. Results are also shown for the coulometric titration of typical uranium-aluminum alloy samples
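
    Controlled-potential coulometry rests on Faraday's law: the integrated charge is proportional to the amount of uranium reduced. A hedged sketch of the underlying arithmetic, assuming the two-electron U(VI)→U(IV) reduction and 100% current efficiency; the charge value is illustrative:

```python
# Faraday's-law arithmetic behind coulometric uranium determination.

F = 96485.332   # Faraday constant, C/mol
M_U = 238.029   # molar mass of uranium, g/mol

def uranium_mass_from_charge(charge_coulombs, n_electrons=2):
    """Mass (g) of uranium corresponding to an integrated charge."""
    moles = charge_coulombs / (n_electrons * F)
    return moles * M_U

# 10 C of integrated charge at 100% current efficiency:
m = uranium_mass_from_charge(10.0)  # roughly 12.3 mg
```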

  9. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  10. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation, such as Manufacturing, to industries with low levels of automation, such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates, however, that ongoing technological advances are now driving labour automation i...

  11. Complex Automated Negotiations Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

    Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since many factors characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations as studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts: Part I: Agent-based Complex Automated Negotiations and Part II: Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of an ACAN 2011 paper, after peer review by three PC members. Part II covers ANAC 2011 (The Second Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers automatically negotiate in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  12. Robotics-assisted mass spectrometry assay platform enabled by open-source electronics.

    Science.gov (United States)

    Chiu, Shih-Hao; Urban, Pawel L

    2015-02-15

    Mass spectrometry (MS) is an important analytical technique with numerous applications in clinical analysis, biochemistry, environmental analysis, geology and physics. Its success builds on the ability of MS to determine molecular weights of analytes and elucidate their structures. However, sample handling prior to MS requires a lot of attention and labor. In this work we aimed to automate sample processing for MS so that analyses could be conducted without much supervision by experienced analysts. The goal of this study was to develop a robotics and information technology-oriented platform that could control the whole analysis process, including sample delivery, reaction-based assay, data acquisition, and interaction with the analyst. The proposed platform incorporates a robotic arm for handling sample vials delivered to the laboratory, and several auxiliary devices which facilitate and secure the analysis process. They include: a multi-relay board, infrared sensors, photo-interrupters, gyroscopes, force sensors, a fingerprint scanner, a barcode scanner, a touch screen panel, and an internet interface. The control of all the building blocks is achieved through implementation of open-source electronics (Arduino), enabled by custom-written programs in C. The advantages of the proposed system include low cost, simplicity, small size, and facile automation of sample delivery and processing without the intervention of the analyst. It is envisaged that this simple robotic system may be the forerunner of automated laboratories dedicated to mass spectrometric analysis of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Automation of a thermogravimetric equipment

    International Nuclear Information System (INIS)

    Mussio, L.; Castiglioni, J.; Diano, W.

    1987-01-01

    A low-cost automation of some instruments by means of simple electronic circuits and an Apple IIe-type microcomputer is discussed. The electronic circuits described are: a) a position detector including phototransistors connected as a differential amplifier; b) a current source that, using the error signal of the position detector, changes the current through the coil of an electromagnetic balance to restore its zero position; c) a proportional temperature controller, zero-volt switching, to drive a furnace to a desired temperature; d) an interface between the temperature regulator and the microcomputer to control the regulator by software; e) a multiplexer for an analog input of a commercial interface. These circuits are applied in thermogravimetric equipment also used for vapour adsorption. A program in block diagram form is included, able to record changes of mass, time and furnace temperature, and to drive the temperature regulator in order to obtain the heating rates or the temperature plateaux needed for the experiment. (author) [pt
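
    The program described above drives the furnace through heating rates and temperature plateaux. A minimal sketch of such a setpoint generator; the segment format is an assumption for illustration, not the paper's design:

```python
# Temperature-programme setpoint generator: ramps and plateaux.

def setpoint(t, segments, t0=25.0):
    """Temperature setpoint (deg C) at time t (min).

    segments: list of (duration_min, rate_C_per_min); rate 0 = plateau.
    """
    temp = t0
    for duration, rate in segments:
        if t <= duration:
            return temp + rate * t
        temp += rate * duration
        t -= duration
    return temp  # hold the final temperature

# Ramp 10 C/min for 30 min, hold 60 min, then ramp 5 C/min for 20 min:
program = [(30, 10.0), (60, 0.0), (20, 5.0)]
```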

  14. A Novel High-Molecular-Mass Bacteriocin Produced by Enterococcus faecium: Biochemical Features and Mode of Action.

    Science.gov (United States)

    Vasilchenko, A S; Vasilchenko, A V; Valyshev, A V; Rogozhin, E A

    2018-02-08

    The discovery of a novel bacteriocin is always an event in science, since the cultivation of most bacterial species is a general problem in microbiology. This is reflected by the fact that the number of known bacteriocins is tenfold smaller than that of known antimicrobial peptides. We cultivated Enterococcus faecium on a simplified medium to reduce the number of purification steps. This approach allowed us to purify a novel high-molecular-mass bacteriocin produced by E. faecium ICIS 7. The novelty of this bacteriocin, named enterocin-7, was confirmed by N-terminal sequencing and by comparing its structural-functional properties with available data. Purified enterocin-7 is characterized by a sequence of amino acid residues having no homology in the UniProt/SwissProt/TrEMBL databases: NH2 - Asp - Ala - His - Leu - Ser - Glu - Val - Ala - Glu - Arg - Phe - Glu - Asp - Leu - Gly. The isolated thermostable protein has a molecular mass of 65 kDa, which places it in class III of bacteriocin classification schemes. Enterocin-7 displayed a broad spectrum of activity against some Gram-positive and Gram-negative microorganisms. Fluorescent microscopy and spectroscopy showed the permeabilizing mechanism of action of enterocin-7, which is realized within a few minutes.

  15. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    . Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...

  16. Human-centered automation: Development of a philosophy

    Science.gov (United States)

    Graeber, Curtis; Billings, Charles E.

    1990-01-01

    Information on human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant the human management and control function in civil air transport.

  17. Antigravity: Spin-gravity coupling in action

    Science.gov (United States)

    Plyatsko, Roman; Fenyk, Mykola

    2016-08-01

    The typical motions of a spinning test particle in Schwarzschild's background which show the strong repulsive action of the highly relativistic spin-gravity coupling are considered using the exact Mathisson-Papapetrou equations. An approximate approach to choosing solutions of these equations that describe motions of the particle's proper center of mass is developed.

  18. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  19. International Conference Automation : Challenges in Automation, Robotics and Measurement Techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2016-01-01

    This book presents the set of papers accepted for presentation at the International Conference Automation, held in Warsaw, 2-4 March 2016. It contains research results presented by top experts in the fields of industrial automation, control, robotics and measurement techniques. Each chapter presents a thorough analysis of a specific technical problem, usually followed by numerical analysis, simulation, and a description of the results of implementing the solution to a real-world problem. The presented theoretical results, practical solutions and guidelines will be valuable both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  20. How the new optoelectronic design automation industry is taking advantage of preexisting EDA standards

    Science.gov (United States)

    Nesmith, Kevin A.; Carver, Susan

    2014-05-01

    With the advancements in design processes down to the sub 7nm levels, the Electronic Design Automation industry appears to be coming to an end of advancements, as the size of the silicon atom becomes the limiting factor. Or is it? The commercial viability of mass-producing silicon photonics is bringing about the Optoelectronic Design Automation (OEDA) industry. With the science of photonics in its infancy, adding these circuits to ever-increasing complex electronic designs, will allow for new generations of advancements. Learning from the past 50 years of the EDA industry's mistakes and missed opportunities, the photonics industry is starting with electronic standards and extending them to become photonically aware. Adapting the use of pre-existing standards into this relatively new industry will allow for easier integration into the present infrastructure and faster time to market.

  1. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    Technological advances in mass spectrometry and meticulous method development have produced several shotgun lipidomic approaches capable of characterizing lipid species by direct analysis of total lipid extracts. Shotgun lipidomics by hybrid quadrupole time-of-flight mass spectrometry allows...... the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review...... we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...
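
    Absolute quantification in shotgun platforms of this kind is typically done against spiked internal standards of known amount, with the species amount following from the intensity ratio. An illustrative sketch of that calculation; the lipid species and values are hypothetical:

```python
# Internal-standard quantification: amount from an intensity ratio.

def quantify(species_intensity, standard_intensity, standard_pmol):
    """Absolute amount (pmol) from the ratio to the internal standard."""
    return species_intensity / standard_intensity * standard_pmol

# A PC species at 2.5x the internal-standard intensity, 10 pmol of standard:
amount = quantify(2.5e6, 1.0e6, 10.0)  # -> 25.0 pmol
```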

  2. Mass spectrometry in epigenetic research

    DEFF Research Database (Denmark)

    Beck, Hans Christian

    2010-01-01

    cancers has gained tremendous interest in recent years, and many of these inhibitors are currently undergoing clinical trials. Despite intense research, however, the exact molecular mechanisms of action of these molecules remain, to a wide extent, unclear. The recent application of mass spectrometry...

  3. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents an analysis of modern tools for automated testing of various web-based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. The main goals of test automation are faster execution compared to manual testing and overall cost reduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  4. Wireless Android Based Home Automation System

    Directory of Open Access Journals (Sweden)

    Muhammad Tanveer Riaz

    2017-01-01

    Full Text Available This manuscript presents a prototype and design implementation of an advanced home automation system that uses Wi-Fi technology as the network infrastructure connecting its parts. The proposed system consists of two main components. The first is the server, which represents the system core and manages and controls the user’s home; users and the system administrator can manage and control the system locally (over the Local Area Network) or remotely (over the Internet). The second is the hardware interface module, which provides the appropriate interface to the sensors and actuators of the home automation system. Unlike most home automation systems available on the market, the proposed system is scalable in that one server can manage many hardware interface modules as long as they exist within network coverage. The system supports a wide range of home automation devices, such as appliances, power management components, and security components. The proposed system is better in terms of flexibility and scalability than commercially available home automation systems
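
    The server core described above routes user commands to registered hardware interface modules. A minimal dispatch sketch (not the authors' code; the module, device, and command names are hypothetical):

```python
# Sketch of a home-automation server core: command dispatch to modules.

class HardwareModule:
    """Stand-in for a hardware interface module driving sensors/actuators."""
    def __init__(self, name):
        self.name = name
        self.state = {}

    def handle(self, device, action):
        self.state[device] = action
        return f"{self.name}:{device}={action}"

class HomeServer:
    """Routes 'module/device/action' commands to registered modules."""
    def __init__(self):
        self.modules = {}

    def register(self, module):
        self.modules[module.name] = module

    def dispatch(self, command):
        module, device, action = command.split("/")
        if module not in self.modules:
            return "error:unknown-module"
        return self.modules[module].handle(device, action)

server = HomeServer()
server.register(HardwareModule("lights"))
reply = server.dispatch("lights/kitchen/on")
```

    Scalability here is just the dictionary of modules: adding another hardware interface module is one more `register` call, which mirrors the one-server/many-modules design the abstract describes.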

  5. One-Loop Effective Action in Orbifold Compactifications

    CERN Document Server

    Von Gersdorff, Gero

    2008-01-01

    We employ the covariant background formalism to derive generic expressions for the one-loop effective action in field theoretic orbifold compactifications. The contribution of each orbifold sector is given by the effective action of its fixed torus with a shifted mass matrix. We thus study in detail the computation of the heat kernel on tori. Our formalism manifestly separates UV sensitive (local) from UV-insensitive (nonlocal) renormalization. To exemplify our methods, we study the effective potential of 6d gauge theory as well as kinetic terms for gravitational moduli in 11d supergravity.

  6. Automated processing of endoscopic surgical instruments.

    Science.gov (United States)

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  7. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  8. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  9. Automation in airport security X-ray screening of cabin baggage: Examining benefits and possible implementations of automated explosives detection.

    Science.gov (United States)

    Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian

    2018-10-01

    Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with on-screen alarm resolution by the airport security officer (screener), or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios and a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB; EDSCB increased only their detection of bare explosives. In contrast, for screeners with less experience (shorter tenure), automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates at the human-machine system level, which would still be acceptable from an operational point of view. Results indicate that a wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
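
    Human-machine detection performance in studies of this kind is commonly summarized with signal-detection measures such as d', computed from hit and false-alarm rates. A sketch with made-up rates (not the study's results):

```python
# Signal-detection sensitivity d' from hit and false-alarm rates.
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Human alone vs. human + automated decision (illustrative numbers only):
human_only = d_prime(0.80, 0.10)
with_edscb = d_prime(0.90, 0.12)  # more hits, slightly more false alarms
```

    This captures the trade-off the abstract describes: a higher hit rate can outweigh a slightly higher false-alarm rate at the system level.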

  10. 21 CFR 864.5600 - Automated hematocrit instrument.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hematocrit instrument. 864.5600 Section 864.5600 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  11. 21 CFR 862.2900 - Automated urinalysis system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated urinalysis system. 862.2900 Section 862....2900 Automated urinalysis system. (a) Identification. An automated urinalysis system is a device... that duplicate manual urinalysis systems. This device is used in conjunction with certain materials to...

  12. Informed baseline subtraction of proteomic mass spectrometry data aided by a novel sliding window algorithm.

    Science.gov (United States)

    Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J

    2016-01-01

    Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of
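
    A naive sliding-window baseline estimate, the kind of approach the proposed 'continuous' line-segment algorithm is compared against, can be sketched as follows; the window size and spectrum are illustrative:

```python
# Naive sliding-window minimum baseline subtraction (O(n * window)).
import numpy as np

def baseline_subtract(intensity, half_window):
    """Estimate the baseline as a running minimum and subtract it."""
    n = len(intensity)
    baseline = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        baseline[i] = intensity[lo:hi].min()
    return intensity - baseline

# A flat baseline of 5 with one peak of height 10 on top:
spec = np.full(101, 5.0)
spec[50] = 15.0
corrected = baseline_subtract(spec, 10)
```

    The per-point window scan is exactly the computational cost the paper's line-segment algorithm avoids, and the fixed window width is what breaks down when peak width varies with m/z.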

  13. Resins production: batch plant automation

    International Nuclear Information System (INIS)

    Banti, M.; Mauri, G.

    1996-01-01

    Companies that pursue plant automation without external resources have at their disposal flexible, custom, and easy-to-use DCS that are open towards PLC. This article explains why Hoechst has followed this approach in automating new resin production plants

  14. Killing of targets by effector CD8 T cells in the mouse spleen follows the law of mass action

    Energy Technology Data Exchange (ETDEWEB)

    Ganusov, Vitaly V [Los Alamos National Laboratory

    2009-01-01

    In contrast with antibody-based vaccines, it has been difficult to measure the efficacy of T cell-based vaccines and to correlate the efficacy of CD8 T cell responses with protection against viral infections. In part, this difficulty is due to poor understanding of the in vivo efficacy of CD8 T cells produced by vaccination. Using a recently developed experimental method of in vivo cytotoxicity, we have investigated quantitative aspects of the killing of peptide-pulsed targets by effector and memory CD8 T cells, specific to three epitopes of lymphocytic choriomeningitis virus (LCMV), in the mouse spleen. By analyzing data on the killing of targets with varying numbers of epitope-specific effector and memory CD8 T cells, we find that killing of targets by effectors follows the law of mass action; that is, the death rate of peptide-pulsed targets is proportional to the frequency of CTLs in the spleen. In contrast, killing of targets by memory CD8 T cells does not follow the mass action law, because the death rate of targets saturates at high frequencies of memory CD8 T cells. For both effector and memory cells, we also find little support for a killing term in which the death rate of targets decreases with target cell density. Interestingly, our analysis suggests that at low CD8 T cell frequencies, memory CD8 T cells are, on a per capita basis, more efficient at killing peptide-pulsed targets than effectors, but at high frequencies, effectors are more efficient killers than memory T cells. Comparison of the estimated killing efficacy of effector T cells with the value predicted from theoretical physics, based on the motility of T cells in lymphoid tissues, suggests that the limiting step in the killing of peptide-pulsed targets is delivering the lethal hit and not finding the target. Our results thus form a basis for quantitative understanding of the process of killing of virus-infected cells by T cell responses in tissues and can be used to correlate the
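
    The distinction drawn above can be written down directly: under mass action the per-target death rate is proportional to the T-cell frequency E, while the saturating form caps the rate once E is large. A sketch with illustrative parameters (not the paper's estimates):

```python
# Mass-action vs. saturating killing of peptide-pulsed targets.
import numpy as np

def survival_mass_action(k, E, t):
    """Surviving target fraction when death rate = k * E (effectors)."""
    return np.exp(-k * E * t)

def survival_saturating(k, E, E50, t):
    """Death rate saturates at high T-cell frequency E (memory cells)."""
    return np.exp(-k * E / (E50 + E) * t)

# Doubling effector frequency doubles the mass-action kill rate...
s1 = survival_mass_action(1e-6, 1e5, 4.0)
s2 = survival_mass_action(1e-6, 2e5, 4.0)
# ...but barely changes killing once memory frequency far exceeds E50:
m1 = survival_saturating(2.0, 1e6, 1e4, 4.0)
m2 = survival_saturating(2.0, 2e6, 1e4, 4.0)
```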

  15. Small cities face greater impact from automation

    Science.gov (United States)

    Sun, Lijun; Cebrian, Manuel; Rahwan, Iyad

    2018-01-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. PMID:29436514

  16. Small cities face greater impact from automation.

    Science.gov (United States)

    Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2018-02-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.

  17. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  18. Automated selected reaction monitoring software for accurate label-free protein quantification.

    Science.gov (United States)

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical when large numbers of peptides are targeted and when high flexibility in peptide selection is desired. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.
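    The technical variability figure quoted above is conventionally reported as a coefficient of variation (CV) across replicate injections; a minimal sketch of that calculation, using hypothetical replicate peak areas rather than data from the study (this is not the Anubis peak-detection algorithm itself):

    ```python
    import statistics

    def technical_cv(replicate_areas):
        """Coefficient of variation (%) across replicate peak areas for
        one targeted peptide: sample standard deviation over the mean."""
        mean = statistics.mean(replicate_areas)
        sd = statistics.stdev(replicate_areas)
        return 100.0 * sd / mean

    # Hypothetical peak areas from four replicate injections
    areas = [10400.0, 11150.0, 9800.0, 10650.0]
    print(f"technical CV: {technical_cv(areas):.1f}%")
    ```

    In a full workflow this would be computed per peptide and summarized across the proteome abundance range, as in the 6.5-19.2% span reported above.
    
    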

  19. Intelligent viewing control for robotic and automation systems

    Science.gov (United States)

    Schenker, Paul S.; Peters, Stephen F.; Paljug, Eric D.; Kim, Won S.

    1994-10-01

    We present a new system for supervisory automated control of multiple remote cameras. Our primary purpose in developing this system has been to provide capability for knowledge-based, `hands-off' viewing during execution of teleoperation/telerobotic tasks. The reported technology has broader applicability to remote surveillance, telescience observation, automated manufacturing workcells, etc. We refer to this new capability as `Intelligent Viewing Control (IVC),' distinguishing it from simple programmed camera motion control. In the IVC system, camera viewing assignment, sequencing, positioning, panning, and parameter adjustment (zoom, focus, aperture, etc.) are invoked and interactively executed in real time by a knowledge-based controller, drawing on a priori known task models and constraints, including operator preferences. This multi-camera control is integrated with a real-time, high-fidelity 3D graphics simulation, which is correctly calibrated in perspective to the actual cameras and their platform kinematics (translation/pan-tilt). Such a merged graphics-with-video design allows the system user to preview and modify the planned (`choreographed') viewing sequences. Further, during actual task execution, the system operator has available both the resulting optimized video sequence and supplementary graphics views from arbitrary perspectives. IVC, including operator-interactive designation of robot task actions, is presented to the user as a well-integrated video-graphic single-screen user interface allowing easy access to all relevant telerobot communication/command/control resources. We describe and show pictorial results of a preliminary IVC system implementation for telerobotic servicing of a satellite.
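    One geometric building block of automated camera positioning of the kind described above is computing the pan/tilt angles that aim a camera at a 3D task point. A minimal sketch follows; the coordinate conventions (pan about the vertical axis, tilt as elevation from the horizontal plane) are illustrative assumptions, not the reported IVC system's kinematics:

    ```python
    import math

    def pan_tilt_to_target(cam_pos, target_pos):
        """Pan/tilt angles (radians) that point a camera at a 3D target.
        Assumes pan rotates about the camera's vertical (z) axis and tilt
        is elevation above the horizontal plane through the camera."""
        dx = target_pos[0] - cam_pos[0]
        dy = target_pos[1] - cam_pos[1]
        dz = target_pos[2] - cam_pos[2]
        pan = math.atan2(dy, dx)                    # bearing in the horizontal plane
        tilt = math.atan2(dz, math.hypot(dx, dy))   # elevation above horizontal
        return pan, tilt

    # Camera at height 1 m aiming at a target 2 m out in x and y at the same height
    pan, tilt = pan_tilt_to_target((0.0, 0.0, 1.0), (2.0, 2.0, 1.0))
    print(math.degrees(pan), math.degrees(tilt))  # 45.0 0.0
    ```

    A knowledge-based controller of the sort described would layer viewing assignment and sequencing on top of primitives like this, subject to task models and operator preferences.
    
    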

  20. 21 CFR 864.5620 - Automated hemoglobin system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...